A new modality of artificial intelligence, called analog deep machine learning, promises computation speeds far greater than those of other approaches, at a fraction of their energy consumption.
Programmable resistors are the key building blocks of this form of artificial intelligence, just as transistors are the key building blocks of conventional digital processors. By repeating arrays of programmable resistors in complex layers, it is possible to create a network of artificial analog "neurons" and "synapses" that performs computations like a digital neural network. Such a network can be trained to carry out complex tasks, such as image recognition and natural language processing.
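The computation these devices perform can be pictured with a short numerical sketch. The Python snippet below is a toy model, not code from the MIT work: it shows how a crossbar of programmable resistors computes the weighted sums at the heart of a neural-network layer, with each resistor's conductance acting as a weight and Ohm's and Kirchhoff's laws doing the multiply-accumulate in a single analog step. All values and array sizes are illustrative assumptions.

```python
import numpy as np

# Toy model of an analog crossbar: each programmable resistor stores a weight
# as a conductance G (in siemens). Applying input voltages V to the rows makes
# each device pass a current I = G * V (Ohm's law), and the currents summed
# along each column (Kirchhoff's current law) yield a weighted sum -- the core
# multiply-accumulate operation of a neural-network layer, computed in analog.

rng = np.random.default_rng(0)

n_inputs, n_outputs = 4, 3
conductances = rng.uniform(1e-6, 1e-5, size=(n_inputs, n_outputs))  # weights
input_voltages = rng.uniform(0.0, 0.5, size=n_inputs)               # activations

# Column currents = V @ G: one analog step replaces n_inputs multiply-adds
# per output, which is where the speed and energy advantage comes from.
column_currents = input_voltages @ conductances

print("output currents (A):", column_currents)
```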
Working in this field, a team from the Massachusetts Institute of Technology (MIT) in the United States that includes Jesús A. del Alamo, Ju Li and Bilge Yildiz has improved an analog synapse the team had previously developed. Del Alamo and his colleagues used an inorganic material in the manufacturing process that allows the devices to operate a million times faster than previous versions, which is also about a million times faster than the synapses in the human brain.
This inorganic material also makes the programmable resistors extremely energy efficient.
Unlike the materials used in the previous version of the device, the new material is compatible with silicon manufacturing techniques. This change has made it possible to fabricate devices at the nanometer scale and could pave the way for integrating this class of hardware into a wide range of computers, endowing them with the capacity for deep machine learning.
The device operates by electrochemically inserting the smallest ion, the proton, into an insulating oxide to modulate its electronic conductivity.
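How programming such a device might work can likewise be sketched in a few lines. In the simplified model below, each voltage pulse drives protons into or out of the oxide, nudging the device's conductance up or down between fixed bounds; the pulse step size, conductance range, and linear update rule are illustrative assumptions, not measured parameters of the MIT device.

```python
# Simplified illustrative model of a protonic programmable resistor: each
# programming pulse inserts or extracts protons from the oxide channel,
# shifting its conductance between a minimum and a maximum value. The linear
# update step and the bounds below are assumptions for illustration only.

G_MIN, G_MAX = 1e-6, 1e-5   # conductance bounds in siemens (assumed)
STEP = 5e-7                 # conductance change per pulse (assumed)

def apply_pulse(conductance: float, polarity: int) -> float:
    """Return the new conductance after one programming pulse.

    polarity = +1 inserts protons (raises conductance),
    polarity = -1 extracts them (lowers conductance).
    """
    return min(G_MAX, max(G_MIN, conductance + polarity * STEP))

g = G_MIN
for _ in range(5):          # five potentiation pulses raise the stored weight
    g = apply_pulse(g, +1)
print(f"conductance after 5 pulses: {g:.2e} S")
```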
Artist's impression of the new analog deep machine learning system based on programmable protonic resistors in operation. (Image: Ella Maru Studio, Murat Onen/MIT. CC BY-NC-ND 3.0)
Protonic programmable resistors greatly increase the speed at which a neural network is trained while drastically reducing the energy that training consumes. This could help scientists develop deep learning models much more quickly, for use in areas such as self-driving cars, counterfeit detection and medical image analysis. (Source: NCYT de Amazings)