Multiplexed gradient descent: Fast online training of modern datasets on hardware neural networks without backpropagation
Published
2023
Author(s)
Adam McCaughan, Bakhrom Oripov, Natesh Ganesh, Sae Woo Nam, Andrew Dienstfrey, Sonia Buckley
Abstract
We show that model-free perturbative methods can be used to efficiently train modern neural network architectures in a way that can be directly applied to emerging neuromorphic hardware. These methods were investigated for training VLSI neural networks beginning in the 1990s, and more recently on memristive crossbars and photonic hardware, but those demonstrations were very limited in scale, comprising small datasets and only a few neurons. We describe a framework for applying these techniques to existing neuromorphic hardware at much larger scales, with an emphasis on creating simple, highly localized circuits that could be implemented on-chip if desired. The framework is also extensible to training existing hardware systems via a chip-in-the-loop technique.
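The core idea the abstract describes, updating weights from measured changes in cost under small random perturbations rather than via backpropagation, can be sketched in a few lines. The following is a minimal illustrative weight-perturbation loop in NumPy; the toy network, cost function, and hyperparameters are assumptions chosen for demonstration, not the paper's implementation.

```python
# Minimal sketch of model-free perturbative training (weight-perturbation
# style), illustrating the class of methods the abstract describes.
# The network, cost, and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def forward(W, x):
    # Stand-in for a hardware network: only black-box outputs are needed.
    return np.tanh(W @ x)

def cost(W, x, y_target):
    # Scalar cost measured from the network output (here, mean squared error).
    return float(np.mean((forward(W, x) - y_target) ** 2))

# Toy problem: map a fixed input to a fixed target.
W = rng.normal(scale=0.5, size=(2, 3))
x = np.array([0.2, -0.7, 1.0])
y_target = np.array([0.5, -0.3])

amplitude = 1e-3  # perturbation size
lr = 0.05         # learning rate

for step in range(5000):
    c0 = cost(W, x, y_target)
    # Perturb all weights simultaneously with a small random sign pattern.
    dW = amplitude * rng.choice([-1.0, 1.0], size=W.shape)
    c1 = cost(W + dW, x, y_target)
    # (c1 - c0)/amplitude estimates the directional derivative along dW;
    # descending along dW scaled by that estimate approximates gradient
    # descent without backpropagating through the network.
    W -= lr * (c1 - c0) / amplitude**2 * dW

print(cost(W, x, y_target))  # final cost; should be far below its initial value
```

Because the update relies only on black-box cost evaluations, the same loop structure applies whether `forward` is a software model or a physical device measured chip-in-the-loop.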
McCaughan, A., Oripov, B., Ganesh, N., Nam, S., Dienstfrey, A. and Buckley, S. (2023), Multiplexed gradient descent: Fast online training of modern datasets on hardware neural networks without backpropagation, APL Machine Learning, [online], https://doi.org/10.1063/5.0157645, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=936062 (Accessed October 9, 2025)