Google is using machine learning to help design its next generation of microchips. While human engineers take months to find the best possible arrangement of a chip's components, the AI produces better results in just six hours of work.
For many years now, Google has been at the forefront of integrating AI into the vast majority of its applications and services. At its Google I/O 2021 conference, for example, the Mountain View firm presented an AI-powered dermatology assistance tool that will let users identify skin conditions from a few photos.
This time, Google researchers decided to use machine learning to design their next electronic chips, and more specifically their next TPU (Tensor Processing Unit). As a reminder, this is an integrated circuit specially designed by Google to accelerate artificial intelligence systems built on neural networks. In other words, Google is using AI to accelerate the development of AI.
As Google explains, finding the ideal design for a chip, one that combines performance and speed of execution, usually takes engineers months. This task, called "floorplanning", consists of finding the optimal layout of a chip's subsystems. Yet according to the researchers at the Californian giant, in only six hours of work the algorithms were able to produce chip layouts more efficient than those designed by humans.
FRAMING CHIP DESIGN AS A GAME FOR THE AI
To do this, Google researchers came up with the idea of presenting chip design to the AI as a game. Instead of a game board, you have a silicon die. Instead of game pieces, you have components such as CPUs and GPUs. The objective is to find the best possible arrangement of those components in order to achieve optimal computing efficiency.
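To make the analogy concrete, here is a minimal sketch of floorplanning framed as a sequential placement game: the "board" is a grid standing in for the silicon die, and each "move" places one component on a free cell. The class and policy below are invented for illustration; none of these names come from Google's actual system.

```python
import itertools

class ToyFloorplanGame:
    """Toy stand-in for floorplanning as a board game: the grid
    represents the silicon die, and each move places one component
    (CPU, GPU, memory block, ...) on a free cell."""

    def __init__(self, rows, cols, components):
        self.free = set(itertools.product(range(rows), range(cols)))
        self.to_place = list(components)
        self.placed = {}  # component name -> (row, col)

    def legal_moves(self):
        """All cells still available for the next component."""
        return sorted(self.free)

    def play(self, cell):
        """Place the next queued component on a free cell."""
        component = self.to_place.pop(0)
        self.placed[component] = cell
        self.free.remove(cell)

    def done(self):
        return not self.to_place


game = ToyFloorplanGame(3, 3, ["CPU", "GPU", "SRAM"])
while not game.done():
    game.play(game.legal_moves()[0])  # trivial policy: first free cell
print(game.placed)
```

A learned policy would replace the trivial "first free cell" rule, choosing moves that maximize the expected quality of the finished layout.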
The researchers then trained the neural networks on a dataset of 10,000 chip placements of varying quality, some of which were randomly generated. Each design was assigned a specific "reward" depending on its success in different areas, such as energy consumption. The algorithm thus learned to distinguish the most effective layouts from the worst, and could then generate plans of its own. Google has already adopted this system and intends to use it to reduce production costs and produce more efficient chips.
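The reward idea can be sketched in a few lines. The metrics and weights below are illustrative assumptions, not Google's actual reward function: each candidate placement is scored by a negative weighted sum of penalty metrics, so a design with lower wirelength, power, and congestion earns the higher reward.

```python
def placement_reward(wirelength, power, congestion,
                     weights=(0.5, 0.3, 0.2)):
    """Toy reward for a chip placement: negative weighted sum of
    penalty metrics, so smaller (better) metrics yield a reward
    closer to zero. Metric names and weights are illustrative."""
    w_wl, w_pw, w_cg = weights
    return -(w_wl * wirelength + w_pw * power + w_cg * congestion)


# A tighter, lower-power design outscores a sprawling one.
good = placement_reward(wirelength=10.0, power=2.0, congestion=1.0)
bad = placement_reward(wirelength=40.0, power=8.0, congestion=5.0)
assert good > bad
```

During training, rewards like this let the network rank the 10,000 example placements and learn which layout decisions lead to better scores.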