Leap into the future: why quantum computers are needed and what neuromorphic chips are
PwC forecasts that artificial intelligence could add $15.7 trillion to the global economy by 2030. AI is already at work in very different areas: metallurgical production, car autopilots, stock trading. The problem is that AI is still at the beginning of its development, largely because of imperfect hardware. The amount of information keeps growing, and today's hardware and neural networks will soon be unable to cope with the volume of incoming tasks. Humanity therefore needs to solve this problem in the coming decades. One promising direction in AI is neuromorphic systems. What are they, and why are they so hard to build?
What is the main problem with artificial intelligence today?
According to Moore's law, the computing power of processors doubles roughly every two years. But the compute required to train artificial intelligence doubles every three to four months; since 2012 it has grown more than 300,000-fold. Soon even the most powerful computers will lack the capacity to train AI, so fundamentally new solutions must be found. Artificial intelligence is waiting for its revolution. Today, one neural network means one task. Moreover, large artificial intelligence systems are too energy-hungry (and expensive). If scientists fail to address these challenges, the development and scaling of entire segments of the economy will be at risk.
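To feel the scale of this gap, here is a minimal back-of-the-envelope sketch in Python (the 3.4-month doubling period is OpenAI's published estimate; the calculation itself is simple arithmetic):

```python
import math

# How many doublings does a 300,000-fold increase represent,
# and how long does that take at each doubling rate?
growth = 300_000
doublings = math.log2(growth)      # ~18.2 doublings

years_moore = doublings * 24 / 12  # Moore's law: ~24 months per doubling
years_ai = doublings * 3.4 / 12    # training compute: ~3.4 months per doubling

print(f"{doublings:.1f} doublings")
print(f"hardware at Moore's-law pace: ~{years_moore:.0f} years")
print(f"AI training compute actually took: ~{years_ai:.0f} years")
```

At Moore's-law pace, a 300,000-fold increase would take about 36 years; training compute achieved it in roughly five.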
Why are neural networks expensive?
The more neurons in a network, the more electricity and computing power its training and operation require. For example, OpenAI spent about $4.5 million developing GPT-3, one of the largest language models in the world, and much of that amount went to electricity. The reason lies in how modern hardware works. Computers perform tasks using the von Neumann architecture: all data is stored in one place, while computations on it happen in another. A study by the Institute of Electrical and Electronics Engineers (IEEE) on data computation found that a single data-movement operation can require up to a thousand times more energy than a single computation. When modern neural networks run on von Neumann computers, the cost of data movement dominates. Accordingly, the more complex a neural network, the more energy it needs and the more expensive it is to train.
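For a sense of scale, here is a rough sketch of that imbalance for a single matrix-vector multiply, the basic operation of a neural network layer. The per-operation energies are order-of-magnitude assumptions, not measurements, roughly in line with published figures for older 45 nm chips:

```python
# Back-of-the-envelope energy model of one matrix-vector multiply on a
# von Neumann machine. Per-operation energies are illustrative assumptions.
PJ_PER_FLOP = 1.0          # ~1 pJ per 32-bit floating-point operation
PJ_PER_DRAM_WORD = 2000.0  # ~2 nJ to move one 32-bit word to/from DRAM

n = 4096  # an n x n weight matrix, as in one layer of a network

flops = 2 * n * n            # one multiply and one add per weight
dram_words = n * n + 2 * n   # weights plus the input and output vectors

compute_uj = flops * PJ_PER_FLOP / 1e6
movement_uj = dram_words * PJ_PER_DRAM_WORD / 1e6

print(f"compute:       {compute_uj:.0f} uJ")
print(f"data movement: {movement_uj:.0f} uJ "
      f"(~{movement_uj / compute_uj:.0f}x more)")
```

Even under these crude assumptions, moving the weights in and out of memory costs about a thousand times more energy than the arithmetic itself.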
What to do about it?
Throughout the history of artificial intelligence, the approaches and algorithms that won out have always been those with a suitable hardware platform behind them. It is therefore important to consider AI algorithms together with the hardware on which they run. One obvious direction in the development of AI is neuromorphism: applying the principles by which the human brain works to computing systems.
How does the human brain inspire AI researchers?
First, the brain contains 80 to 100 billion neurons, each working asynchronously: many different processes run simultaneously, and the connections between neurons are plastic, which lets a person adapt quickly to new conditions. Second, the share of neurons active at any moment is relatively small (less than 10%), which is very different from classical neural networks, where all neurons work at once. The brain uses energy far more economically, so AI researchers try to imitate it with "sparse computing", local learning, spike-based signaling, in-memory computing, analog circuits and other mechanisms borrowed from the brain (one of them, spike-based signaling, is sketched below).
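Spike-based signaling is easy to illustrate with a toy leaky integrate-and-fire neuron: it stays silent most of the time and emits a spike only when its membrane potential crosses a threshold. This is a minimal Python sketch; all constants are illustrative assumptions, and real neuromorphic hardware implements similar dynamics in silicon rather than in software:

```python
import numpy as np

# Toy leaky integrate-and-fire (LIF) neuron with illustrative constants.
TAU = 20.0      # membrane time constant, ms
V_THRESH = 1.0  # firing threshold
V_RESET = 0.0   # potential right after a spike
DT = 1.0        # simulation time step, ms

def lif_run(input_current, v=0.0):
    """Return a 0/1 spike train for a sequence of input currents."""
    spikes = []
    for i in input_current:
        v += DT * (i - v) / TAU  # leak toward rest while integrating input
        if v >= V_THRESH:
            spikes.append(1)
            v = V_RESET          # reset after firing
        else:
            spikes.append(0)
    return spikes

# Noisy input drives only occasional spikes; the silence between spikes
# is what makes spike-based systems so economical.
rng = np.random.default_rng(0)
current = rng.uniform(0.0, 2.5, size=200)  # 200 ms of random input
print(sum(lif_run(current)), "spikes in 200 ms")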
What is a neuromorphic chip for?
The most obvious application is smart sensors. They are what will make full car autopilot, highly precise observations by seismologists and cosmologists, and many other things affecting the lives of millions of people possible. Neuromorphic chips are also needed in robotics: they enable autonomous on-board AI, letting robots respond almost instantly to changes in the environment and decide on their next actions. In the future, neuromorphic chips may allow the creation of vastly larger neural networks, with neuron counts far exceeding the roughly 100 billion found in the human brain.
What is a quantum computer?
To solve the problems of enormous energy costs and long neural network training times, scientists are also considering quantum computers. These are a new class of computing devices that, by exploiting quantum effects, can solve problems out of reach even for the most powerful "classical" supercomputers.
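A toy classical simulation hints at where that power comes from: describing n qubits requires 2^n complex amplitudes, so even a tiny register encodes a very large state. The sketch below (plain numpy, illustrative only) puts three qubits into an equal superposition of all eight basis states; a real quantum computer manipulates such a state physically rather than storing every amplitude:

```python
import numpy as np

# Put n qubits into an equal superposition using Hadamard gates,
# simulated classically. The state vector has 2**n amplitudes, which is
# exactly why large quantum states are infeasible to simulate classically.
n = 3
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # one-qubit Hadamard gate

op = H
for _ in range(n - 1):
    op = np.kron(op, H)  # apply H to every qubit at once

state = np.zeros(2 ** n)
state[0] = 1.0           # start in |000>
state = op @ state

print(state)             # 2**n equal amplitudes of 1/sqrt(2**n)
print((state ** 2).sum())  # probabilities sum to 1
```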
Have quantum computers been built?
Yes, developments in this