
University of Pennsylvania Researchers Develop Processorless Learning Circuitry


Researchers from the University of Pennsylvania have designed an electrical circuit, similar to a neural network, that can learn tasks such as nonlinear regression. The circuit operates at low power levels and can be trained without a computer.

The circuit is a network of transistors, which function as virtual variable resistors. The system learns by adjusting the resistor values using an approach called coupled learning. The researchers show that this circuit can learn nonlinear functions, including XOR and nonlinear regression, suggesting that larger networks could approximate arbitrary functions, just as deep learning networks can. The advantage of the learning circuitry approach is its scalability: because it is a physical system and all elements are updated in parallel, training step duration does not depend on the network size. The system also performs inference quickly and with low power needs. According to the research team:

The circuitry is robust to damage, retrainable in seconds, and performs learned tasks in microseconds while dissipating only picojoules of energy across each transistor. This suggests enormous potential for fast, low-power computing in edge systems like sensors, robotic controllers, and medical devices, as well as manufacturability at scale for performing and studying emergent learning.

The system consists of a coupled pair of identical MOSFET transistor networks; each pair of corresponding transistors in the two networks is connected to a single shared capacitor. During training, both networks are given identical voltages representing the training input. One of the networks, called the clamped network, also has its output set to the desired output value. The output of the other, the free network, is left unconstrained.

Learning Metamaterial Architecture. Source: Machine Learning without a Processor: Emergent Learning in a Nonlinear Electronic Metamaterial

The resulting difference in electrical states between the two networks updates the voltages on the capacitors between the transistors; these voltages correspond to the "weights" in a neural network. Once the system is trained, the voltages can be frozen, and the system performs inference by applying new input voltages and measuring the resulting output voltage.
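To make the training procedure concrete, the following is a minimal sketch of coupled learning in a simulated network of ideal linear resistors. This is an illustration of the published update rule, not the paper's circuit: the hardware uses transistors as nonlinear variable resistors, while here the edge conductances k stand in for the capacitor voltages, the network topology and node assignments are invented for the example, and the task (learn V_out = 0.6 * V_in) is hypothetical. The key property the sketch shows is that each edge updates from purely local voltage drops, so all elements can update in parallel, and inference is a single solve with the learned conductances frozen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy resistor network: a ring of N nodes plus random chords.
N = 8
edges = [(i, (i + 1) % N) for i in range(N)]
edges += [(i, j) for i in range(N) for j in range(i + 2, N)
          if rng.random() < 0.3]
k = rng.uniform(0.5, 1.5, size=len(edges))  # edge conductances ("weights")

GROUND, IN, OUT = 0, 1, 4  # hypothetical node assignments

def solve(k, fixed):
    """Node voltages from Kirchhoff's laws, with some nodes held fixed."""
    L = np.zeros((N, N))
    for (i, j), ke in zip(edges, k):
        L[i, i] += ke; L[j, j] += ke
        L[i, j] -= ke; L[j, i] -= ke
    free = [n for n in range(N) if n not in fixed]
    held = list(fixed)
    V = np.zeros(N)
    V[held] = list(fixed.values())
    # Solve L_ff V_f = -L_fh V_h for the unconstrained nodes.
    V[free] = np.linalg.solve(L[np.ix_(free, free)],
                              -L[np.ix_(free, held)] @ V[held])
    return V

def drops(V):
    """Voltage drop across every edge."""
    return np.array([V[i] - V[j] for (i, j) in edges])

eta, lr = 0.1, 0.05                  # nudge amplitude, learning rate
inputs = [1.0, 2.0, 3.0]
target = lambda v: 0.6 * v           # desired input-output relation

for step in range(200):
    for vin in inputs:
        # Free state: only ground and the input are imposed.
        Vf = solve(k, {GROUND: 0.0, IN: vin})
        # Clamped state: the output is nudged toward the desired value.
        vc = Vf[OUT] + eta * (target(vin) - Vf[OUT])
        Vc = solve(k, {GROUND: 0.0, IN: vin, OUT: vc})
        # Local coupled-learning rule: each edge compares its own voltage
        # drop in the two states -- no global gradient computation needed.
        k = np.clip(k + (lr / eta) * (drops(Vf) ** 2 - drops(Vc) ** 2),
                    0.01, None)      # conductances must stay positive

# Inference: freeze k and read the output, including for an unseen input.
for vin in inputs + [2.5]:
    vout = solve(k, {GROUND: 0.0, IN: vin})[OUT]
    print(f"V_in = {vin:.1f}  V_out = {vout:.3f}")
```

In the physical circuit no linear solve is needed: the electrons settle into the free and clamped states on their own, which is why a training step takes the same time regardless of network size.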

While this system does have potential advantages in speed and power consumption, the researchers point out several "interesting issues" that still need to be addressed. It is unclear what the optimal network topology is for a given task: the prototype uses a square lattice, which the team says is "likely too simple and sparse." The researchers also want to investigate how training time and power consumption grow as system size increases.

In a discussion about the work on Hacker News, one user wondered:

Once the training is complete, one thing I didn't see mentioned in the paper was how they maintain the charge on the gate capacitors, which is analogous to the weights in a traditional neural network if I'm understanding this correctly. Any practical implementation will need to have some practical way to refresh that on a continuous basis so that the weights don't drift. Was this perhaps mentioned somewhere and I missed it?

Another user suggested that MOS capacitors could be used for this, which would be analogous to a flash drive; this would, however, limit the amount of retraining the system could handle.
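As a rough back-of-the-envelope illustration of the drift concern (not from the paper; the component values below are invented for the example), a stored gate voltage on an otherwise isolated capacitor decays exponentially through its leakage resistance, and a DRAM-style refresh cycle would have to rewrite each capacitor faster than that decay:

```python
import math

# Hypothetical values, chosen only to make the arithmetic concrete:
C = 1e-12         # gate capacitance, farads (1 pF)
R_leak = 1e15     # leakage resistance, ohms
tau = R_leak * C  # RC time constant: 1e3 seconds

# A stored "weight" voltage decays as V(t) = V0 * exp(-t / tau).
V0 = 1.0
for t in [1, 10, 60, 600]:
    drift = V0 * (1 - math.exp(-t / tau))
    print(f"after {t:4d} s the stored voltage has drifted by {drift:.2e} V")

# A refresh controller would need to rewrite each capacitor before the
# drift exceeds whatever error the learned task can tolerate.
```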

Machine learning in hardware has a long history, dating all the way back to the original perceptron invented by Frank Rosenblatt in 1957. More recently, a research team at MIT developed an analog deep learning system, similar to the UPenn work, that uses programmable resistors to implement a fast, low-power neural network in hardware.
