The memory is a Conductive Bridging RAM (CBRAM), which typically operates by making and breaking a filament of copper ions to represent 1s and 0s (see Adesto tips CBRAM as automotive embedded memory).
The research team has shown that the CBRAM filament can be programmed into multiple analog states, allowing the cell to emulate a biological synapse. Such a synaptic device can perform in-memory computing for neural network training faster and with considerably higher energy efficiency than conventional digital systems, which repeatedly move data between memory and processors.
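The article does not say how many analog states the device supports or how they map to weights, but the basic idea of in-memory computing with a synaptic array can be sketched as follows: each weight is stored as one of several conductance levels, and the multiply-accumulate happens where the weights sit. The level count, conductance range, and array size below are illustrative assumptions, not the paper's values.

```python
import numpy as np

N_LEVELS = 8                     # assumed number of programmable analog states
G_MIN, G_MAX = 1e-6, 8e-6        # assumed conductance range (siemens)

def program_weight(w):
    """Quantise a normalised weight in [0, 1] to the nearest conductance level."""
    level = round(w * (N_LEVELS - 1))
    return G_MIN + level * (G_MAX - G_MIN) / (N_LEVELS - 1)

def in_memory_mac(voltages, conductances):
    """Column current = sum of V * G down each bitline (Ohm's and Kirchhoff's laws):
    the multiply-accumulate happens where the weights are stored."""
    return voltages @ conductances

rng = np.random.default_rng(0)
weights = rng.random((784, 10))                   # e.g. 784 inputs, 10 output neurons
g_array = np.vectorize(program_weight)(weights)   # conductances "programmed" into the array
inputs = rng.random(784)                          # input voltages (normalised)
currents = in_memory_mac(inputs, g_array)         # one output current per column
print(currents)
```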
The work is described in a paper published in Nature Communications entitled "Neuro-inspired Unsupervised Learning and Pruning with Sub-quantum CBRAM arrays."
The approach uses a spiking neural network to implement unsupervised learning in hardware. On top of that, the team applied a soft-pruning technique in software to make neural network training more energy efficient without sacrificing much accuracy.
Soft-pruning is a method that identifies weights that have already matured and asymptoted to a particular value during training and fixes them at a constant non-zero value. These weights then receive no further updates for the remainder of training, which reduces the computation required.
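As a rough illustration of that idea (not the paper's implementation), a weight can be frozen once its value stops changing between training steps and then skipped in all later updates. The convergence tolerance and the shrinking update schedule below are assumptions made for the sketch.

```python
import numpy as np

def soft_prune(weights, prev_weights, frozen, tol=1e-3):
    """Mark weights whose value has stopped changing (asymptoted) as frozen.
    Frozen weights keep a constant non-zero value. The tolerance is an assumption."""
    return frozen | (np.abs(weights - prev_weights) < tol)

def apply_update(weights, update, frozen, lr=0.1):
    """Apply the update only to weights that are not frozen, skipping the rest."""
    return np.where(frozen, weights, weights - lr * update)

rng = np.random.default_rng(1)
w = rng.random(100)
frozen = np.zeros(100, dtype=bool)
for step in range(1, 201):
    prev = w.copy()
    update = rng.normal(size=100) / step      # updates shrink as weights mature
    w = apply_update(w, update, frozen)
    frozen = soft_prune(w, prev, frozen)
print(f"{frozen.mean():.0%} of weights soft-pruned")
```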
The CBRAM was used as a synaptic array and trained to classify handwritten digits from the MNIST (Modified National Institute of Standards and Technology) database.
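The article does not spell out the learning rule, so the following is only a generic sketch of unsupervised, spike-timing-style training of a single-layer spiking network on MNIST-like data. The rate coding, threshold neurons, and all constants are illustrative assumptions; a real run would load MNIST digits rather than random stand-in data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 784, 10
weights = rng.random((n_in, n_out)) * 0.1       # would map to CBRAM conductances

def encode(image, t_steps=20):
    """Rate-code pixel intensities in [0, 1] into Poisson-like spike trains."""
    return rng.random((t_steps, image.size)) < image

def train_step(spikes, weights, lr=0.01, threshold=20.0):
    """Hebbian/STDP-style rule (assumed here): strengthen weights between co-active
    pre- and post-synaptic neurons, with a small depression term for active outputs."""
    for t in range(spikes.shape[0]):
        pre = spikes[t].astype(float)
        post = (pre @ weights > threshold).astype(float)
        weights += lr * (np.outer(pre, post) - 0.01 * post)
        np.clip(weights, 0.0, 1.0, out=weights)
    return weights

image = rng.random(n_in)          # stand-in for a normalised MNIST digit
weights = train_step(encode(image), weights)
```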
The network achieved 93 percent accuracy when up to 75 percent of the weights were soft-pruned. In terms of energy savings, the team estimates that their neuro-inspired hardware-software co-design approach can eventually cut energy use during neural network training by two to three orders of magnitude compared to the state of the art.
Team leader Professor Duygu Kuzum said that she and her team plan to work with memory technology companies to advance this work.
Related links and articles: