Sam Dillavou, Menachem Stern, Andrea J. Liu, and Douglas J. Durian
Phys. Rev. Applied 18, 014040 (2022). https://doi.org/10.1103/PhysRevApplied.18.014040

In typical artificial neural networks, neurons adjust according to global calculations of a central processor, but in the brain, neurons and synapses self-adjust based on local information. Contrastive learning algorithms have recently been proposed to train physical systems, such as fluidic, mechanical, or electrical networks, to perform machine-learning tasks from local evolution rules. To date, however, such systems have been implemented only in silico, due to the engineering challenge of creating elements that autonomously evolve based on their own response to two sets of global boundary conditions.

Here, we introduce and implement a physics-driven contrastive learning scheme for a network of variable resistors, using circuitry to locally compare the responses of two identical networks subjected to the two different sets of boundary conditions. Using this method, our system effectively trains itself, optimizing its resistance values without the use of a central processor or external information storage. Once the system is trained for a specified allostery, regression, or classification task, the task is subsequently performed rapidly and automatically by the physical imperative to minimize power dissipation in response to the given voltage inputs.

We demonstrate that, unlike typical computers, such learning systems are robust to extreme damage (and thus to manufacturing defects) due to their decentralized learning. Our twin-network approach is therefore readily scalable to extremely large or nonlinear networks, where its distributed nature will be an enormous advantage; a laboratory network of only 500 edges will already outpace its in silico counterpart.
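The local rule at the heart of this scheme admits a compact numerical illustration. The Python sketch below simulates contrastive training of a linear resistor network on a simple allostery task: each edge compares its own voltage drop in a "free" network (inputs applied) with that in a "clamped" twin (output nudged toward the target) and adjusts its conductance accordingly. The specific update rule follows the coupled-learning literature, and the topology, node assignments, nudge amplitude eta, and learning rate are illustrative assumptions, not the values of the paper's laboratory circuit.

```python
# Minimal in-silico sketch of twin-network contrastive ("coupled") learning
# on a linear resistor network. All parameters below are illustrative
# assumptions, not the paper's laboratory values.
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 8

# Ring of edges (guarantees a connected network) plus random chords.
edges = [(i, (i + 1) % n_nodes) for i in range(n_nodes)]
edges += [(i, j) for i in range(n_nodes) for j in range(i + 2, n_nodes)
          if j - i != n_nodes - 1 and rng.random() < 0.3]
k = np.ones(len(edges))  # edge conductances: the learning degrees of freedom


def solve(k, fixed):
    """Node voltages with some nodes clamped; free nodes obey Kirchhoff's laws.

    This linear solve plays the role of the physics: the free nodes settle
    at the voltages that minimize total dissipated power.
    """
    L = np.zeros((n_nodes, n_nodes))  # conductance-weighted graph Laplacian
    for (i, j), kij in zip(edges, k):
        L[i, i] += kij
        L[j, j] += kij
        L[i, j] -= kij
        L[j, i] -= kij
    clamped = sorted(fixed)
    free = [n for n in range(n_nodes) if n not in fixed]
    V = np.zeros(n_nodes)
    V[clamped] = [fixed[n] for n in clamped]
    # Zero net current at each free node: L_ff V_f = -L_fc V_c.
    V[free] = np.linalg.solve(L[np.ix_(free, free)],
                              -L[np.ix_(free, clamped)] @ V[clamped])
    return V


# Allostery-style task: with the input held at 1 V against ground, train the
# output node to settle at a chosen target voltage.
inp, gnd, out = 0, 1, n_nodes - 1
target, eta, lr = 0.3, 0.1, 0.05  # assumed target, nudge, and learning rate

for step in range(500):
    V_F = solve(k, {inp: 1.0, gnd: 0.0})               # free state
    nudged = V_F[out] + eta * (target - V_F[out])      # nudge output to target
    V_C = solve(k, {inp: 1.0, gnd: 0.0, out: nudged})  # clamped state
    for e, (i, j) in enumerate(edges):
        dF = V_F[i] - V_F[j]  # each edge sees only its own voltage drop...
        dC = V_C[i] - V_C[j]  # ...in each of the two twin networks
        # Descend the contrast between clamped and free dissipated power.
        k[e] = max(k[e] - (lr / eta) * (dC**2 - dF**2), 1e-3)

V_final = solve(k, {inp: 1.0, gnd: 0.0})
print(f"output voltage after training: {V_final[out]:.3f} (target {target})")
```

Note that each edge's update uses only that edge's own two voltage drops. It is this locality that lets the laboratory implementation dispense with a central processor, and that underlies the robustness to damaged or removed edges described above.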