Google AI Bots Will Be Powered by Google-made Computer Chips

'Deep Learning'

Google is set to equip its deep learning bots with ‘home-made’ computer chips.

On Wednesday, Google's chief executive Sundar Pichai announced at the I/O developer conference that the tech giant has developed an application-specific integrated circuit (ASIC) designed specifically to run deep neural networks, the state-of-the-art artificial intelligence technology that may soon reshape the way the Internet works.

Deep neural nets can teach themselves new skills by analyzing vast troves of data. Google currently uses the technology to identify people in photographs, automatically translate text, and understand voice commands on its Android-powered devices. The technology could also reinvent the way Google's web search engine works.

Google's own computer chip is dubbed the Tensor Processing Unit (TPU) because it is built to run TensorFlow, the software library that drives the company's deep learning work. TensorFlow is now open source, so any developer can access its code, use it, and customize it. The design of the TPU itself, however, was not made public; third parties can only tap Google's deep learning hardware indirectly, through the company's cloud services.
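Because TensorFlow's code is open, any developer can write and run the kind of computation the TPU is meant to accelerate. Below is a minimal sketch of a tiny neural-network layer expressed in TensorFlow; it assumes a recent TensorFlow release with eager execution enabled, and the shapes and values are purely illustrative rather than anything Google has published.

```python
# Minimal illustration of a single neural-network layer in TensorFlow.
# Assumes a recent TensorFlow release with eager execution (illustrative only).
import tensorflow as tf

# A batch of two 4-dimensional input vectors (made-up values).
x = tf.constant([[1.0, 2.0, 3.0, 4.0],
                 [0.5, 0.0, 1.5, 2.0]])

# Randomly initialized weights and biases for a layer with 3 outputs.
w = tf.Variable(tf.random.normal([4, 3]))
b = tf.Variable(tf.zeros([3]))

# The core operation hardware like the TPU accelerates:
# a matrix multiply followed by a nonlinearity.
y = tf.nn.relu(tf.matmul(x, w) + b)

print(y.numpy())  # shape (2, 3): one 3-value activation per input vector
```

Training and serving real models stack many such layers, but the underlying workload is the same pattern of large matrix multiplications, which is what specialized chips target.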

Other tech companies, including Microsoft, Facebook, and Twitter, are also working on deep learning technologies, but none of them has designed its own chips. Instead, they rely on off-the-shelf hardware from established chipmakers such as Nvidia.

Microsoft, for its part, is exploring field-programmable gate arrays (FPGAs) as an alternative to conventional chips. FPGAs can be reprogrammed after manufacturing to carry out specific tasks.

Google says a TPU board is both smaller and more powerful than most of the conventional hardware in its data centers: it fits into the slot of a hard drive and delivers a far better performance-per-watt ratio.

The TPU was designed specifically for deep learning workloads, which tolerate reduced computational precision, so it needs fewer transistors per operation. As a result, the silicon can run more operations per second, which matters for increasingly sophisticated machine learning models, and deep learning systems can process data and apply AI models more quickly.
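To see why reduced precision can translate into more work per transistor, consider the quantization trick commonly used in deep learning inference: weights and activations are stored as 8-bit integers instead of 32-bit floats, so the same silicon and memory bandwidth handle roughly four times as many values. The NumPy sketch below is a generic illustration of that idea, not a description of the TPU's actual circuitry; the scale factors and shapes are assumptions made for the example.

```python
# Generic illustration of 8-bit quantization, a common reduced-precision trick
# in deep learning inference (not a description of the TPU's internals).
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4)).astype(np.float32)   # activations
w = rng.standard_normal((4, 3)).astype(np.float32)   # weights

def quantize(a):
    """Map float32 values to int8 plus a per-tensor scale factor."""
    scale = np.abs(a).max() / 127.0
    return np.round(a / scale).astype(np.int8), scale

xq, xs = quantize(x)
wq, ws = quantize(w)

# Integer matrix multiply (accumulated in int32), then rescaled back to floats.
y_int = xq.astype(np.int32) @ wq.astype(np.int32)
y_approx = y_int * (xs * ws)

y_exact = x @ w
print(np.max(np.abs(y_exact - y_approx)))  # small error, 4x less storage per value
```

The approximation error is tiny for typical networks, which is why low-precision arithmetic is an attractive trade-off for dedicated machine learning hardware.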

It is safe to assume that, as the new technology matures, Google will gradually replace chips from conventional chipmakers with its own. The biggest loser would be the world's largest chipmaker, Intel, which currently powers most of Google's servers. And Intel's worst nightmare is that one day Google will design its own CPUs as well.

Image Source: Flickr
