The first physical system to learn nonlinear tasks without a traditional computer processor

Sam Dillavou, a postdoctoral fellow in the Durian Research Group in the School of Arts & Sciences, created the components of this contrastive local learning network, an analog system that is fast, low-power, scalable, and capable of learning nonlinear tasks. Credit: Erica Moser

Scientists encounter many trade-offs when trying to build and scale brain-like systems that can perform machine learning. For example, artificial neural networks are capable of learning complex language and vision tasks, but the process of training computers to perform these tasks is slow and requires a lot of energy.

Training machines to learn digitally but perform tasks analogically—meaning that the input varies with a physical quantity, such as voltage—can reduce time and power consumption, but small errors can compound quickly.

An electrical network previously designed by physicists and engineers at the University of Pennsylvania is more scalable, because errors do not compound in the same way as the system grows, but it is severely limited: it can learn only linear tasks, those with a simple relationship between input and output.

Now, researchers have created an analog system that is fast, low-power, scalable, and capable of learning more complex tasks, including exclusive or (XOR) relationships and nonlinear regression. It is called a contrastive local learning network: its components evolve on their own, based only on local rules and without knowledge of the larger structure.
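To see why the linear limitation matters, consider XOR, which outputs 1 when exactly one of two binary inputs is 1. A short numerical illustration (ours, not from the paper) shows that even the best possible affine fit to XOR's four input-output pairs predicts 0.5 for every input, no better than guessing:

```python
import numpy as np

# The four XOR input/output pairs, with a bias column so the fit is affine.
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# Best achievable linear (affine) fit, by least squares.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(X @ w)   # ~[0.5, 0.5, 0.5, 0.5]: a linear map cannot express XOR
```

Any system whose input-output map is linear hits the same wall, which is why adding nonlinearity to the network is what unlocks tasks like XOR and nonlinear regression.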

Physics professor Douglas J. Durian compares this to how neurons in the human brain don’t know what other neurons are doing, yet learning occurs.

“It can learn, in a machine learning sense, to do useful tasks, much like a computational neural network, but it’s a physical object,” says physicist Sam Dillavou, a postdoc in the Durian Research Group and first author of a paper about the system published in Proceedings of the National Academy of Sciences.

“One of the things that we’re really excited about is that because it doesn’t know the structure of the network, it’s very fault-tolerant, it’s very resilient to being built in different ways, and we think that opens up a lot of opportunities to extend these things,” says Marc Z. Miskin, a professor in Penn’s School of Engineering and Applied Science.

“I think it’s an ideal model system that we can study to gain insight into all kinds of problems, including biological problems,” says physics professor Andrea J. Liu. She also says it could be useful when connected to devices that collect data requiring processing, such as cameras and microphones.

The authors claim in the paper that their self-learning system “provides a unique opportunity to study emergent learning. Compared to biological systems, including the brain, our system relies on simpler, well-understood dynamics, is precisely trainable, and uses simple modular components.”

This research builds on the Coupled Learning framework that Liu and postdoc Menachem (Nachi) Stern devised and published in 2021. In that paradigm, a physical system that is not designed to perform a particular task adapts to applied inputs in order to learn the task, using local learning rules and no centralized processor.
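The flavor of Coupled Learning is easy to sketch in simulation, even though the paper's contribution is realizing it in analog hardware. The toy example below (our own simplified sketch, not the authors' circuit; the network layout, task, and parameter values are illustrative assumptions) trains a small linear resistor network: each edge compares the voltage drop across itself in a "free" state, with only the inputs applied, against a "clamped" state in which the output node is nudged toward the desired value, then adjusts its own conductance; no element ever sees the whole network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 6 nodes, fully connected, with positive edge conductances k.
N = 6
edges = [(i, j) for i in range(N) for j in range(i + 1, N)]
k = rng.uniform(0.5, 1.5, size=len(edges))

sources = [0, 1, 2]   # nodes with imposed voltages (two inputs plus a ground)
output = 3            # node whose voltage is the network's "answer"

def solve(k, fixed_nodes, fixed_vals):
    """Node voltages of a resistor network with some nodes held at fixed values."""
    L = np.zeros((N, N))                      # conductance-weighted graph Laplacian
    for (i, j), g in zip(edges, k):
        L[i, i] += g; L[j, j] += g
        L[i, j] -= g; L[j, i] -= g
    free = [n for n in range(N) if n not in fixed_nodes]
    V = np.zeros(N)
    V[fixed_nodes] = fixed_vals
    # Kirchhoff's current law at the unconstrained nodes: L_uu V_u = -L_uf V_f
    V[free] = np.linalg.solve(L[np.ix_(free, free)],
                              -L[np.ix_(free, fixed_nodes)] @ np.asarray(fixed_vals))
    return V

eta, gamma = 0.1, 0.05    # nudge amplitude and learning rate (our choices)
for step in range(2000):
    v1, v2 = rng.uniform(0, 1, 2)
    target = 0.3 * v1 + 0.2 * v2            # a task this linear network can express
    # Free state: only the inputs (and ground) are imposed.
    VF = solve(k, sources, [v1, v2, 0.0])
    # Clamped state: additionally nudge the output toward the target.
    clamp = VF[output] + eta * (target - VF[output])
    VC = solve(k, sources + [output], [v1, v2, 0.0, clamp])
    # Local contrastive update: each edge uses only its own two voltage drops.
    for e, (i, j) in enumerate(edges):
        dF, dC = VF[i] - VF[j], VC[i] - VC[j]
        k[e] = max(k[e] + (gamma / eta) * 0.5 * (dF**2 - dC**2), 1e-3)
    if step % 500 == 0:
        # The free-state error should drift downward on this easy task.
        print(f"step {step}: |error| = {abs(VF[output] - target):.4f}")
```

Because each update depends only on the voltage drops the edge itself experiences, the rule is local in exactly the sense described above; in the physical network, those updates are carried out by the circuitry itself rather than by a software loop.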

Dillavou says he came to Penn specifically for this project and worked to take the framework from simulation to its current physical design, which can be built with off-the-shelf circuit components.

“One of the craziest parts about this is that the thing actually teaches itself; we just kind of set it up,” Dillavou says. The researchers just apply a voltage as an input, and then the transistors that connect the nodes update their properties based on the Coupled Learning rule.

“Because the way it’s calculated and taught is based on physics, it’s much more interpretable,” says Miskin. “You can actually figure out what it’s trying to do because you have a good handle on the underlying mechanism. That’s kind of unique because a lot of other learning systems are black boxes where it’s much harder to figure out why the network did what it did.”

Durian says he hopes it’s “the beginning of a huge field,” noting that another postdoc in his lab, Lauren Altman, is building mechanical versions of contrastive local learning networks.

The researchers are currently working on scaling the design, and Liu says there are many questions about memory storage time, the effects of noise, the best architecture for the network, and whether there are better forms of nonlinearity.

“It’s not really clear what changes as we expand the learning system,” says Miskin.

“If you imagine a brain, there’s a huge gap between a worm with 300 neurons and a human being, and it’s not obvious where those abilities come in or how things change as you get bigger. Here you have a physical system that you can make bigger and bigger and bigger, and that’s an opportunity to actually study it.”

More information:
Sam Dillavou et al., Machine learning without a processor: Emergent learning in a nonlinear analog network, Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2319718121

Provided by the University of Pennsylvania

Citation: First physical system to learn nonlinear tasks without a traditional computer processor (2024, July 8), retrieved July 8, 2024 from https://techxplore.com/news/2024-07-physical-nonlinear-tasks-traditional-processor.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
