Search Publications by: Advait Madhavan (Fed)

Displaying 1 - 25 of 36

Sampling from exponential distributions in the time domain with superparamagnetic tunnel junctions

April 22, 2025
Author(s)
Temitayo Adeyeye, Sidra Gibeault, Daniel Lathrop, Matthew Daniels, Mark Stiles, Jabez McClelland, William Borders, Jason Ryan, Philippe Talatchian, Ursula Ebels, Advait Madhavan
Though exponential distributions are ubiquitous in statistical physics and related computational models, sampling them from device behavior is rarely done. The superparamagnetic tunnel junction (SMTJ), a key device in probabilistic computing, shows…
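As a rough, generic illustration of the idea named in this title (not the SMTJ circuit reported in the paper), the Python sketch below draws waiting times from a memoryless two-state switcher and checks that their mean approaches 1/rate; the 1 MHz rate and 1 ns time step are arbitrary illustrative values.

    import numpy as np

    rng = np.random.default_rng(0)

    def dwell_time_sample(rate_hz, dt=1e-9):
        """One waiting time of a memoryless two-state switcher.

        In each step of length dt the device switches with probability
        rate_hz * dt (assumed << 1), so the waiting time is geometric
        and approaches an exponential with mean 1/rate_hz as dt -> 0.
        """
        return rng.geometric(rate_hz * dt) * dt

    rate = 1e6  # illustrative 1 MHz mean switching rate
    samples = np.array([dwell_time_sample(rate) for _ in range(10_000)])
    print(samples.mean(), 1 / rate)  # both close to 1 microsecond

In this toy picture, a time-domain sample is simply the time stamp of a random switching event rather than a digitized analog value.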

Layer ensemble averaging for fault tolerance in memristive neural networks

February 1, 2025
Author(s)
Osama Yousuf, Brian Hoskins, Karthick Ramu, Mitchell Fream, William Borders, Advait Madhavan, Matthew Daniels, Andrew Dienstfrey, Jabez McClelland, Martin Lueker-Boden, Gina Adam
Advancements in continual learning with artificial neural networks have been fueled in large part by scaling network dimensionalities. As this scaling continues, conventional computing systems are becoming increasingly inefficient due to the von Neumann…

Measurement-driven Langevin modeling of superparamagnetic tunnel junctions

July 23, 2024
Author(s)
Liam Pocher, Temitayo Adeyeye, Sidra Gibeault, Philippe Talatchian, Ursula Ebels, Daniel Lathrop, Jabez J. McClelland, Mark Stiles, Advait Madhavan, Matthew Daniels
Superparamagnetic tunnel junctions are important devices for a range of emerging technologies, but most existing compact models capture only their mean switching rates. Capturing qualitatively accurate analog dynamics of these devices will be important as…
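For readers unfamiliar with Langevin-type models, the following generic sketch (not the measurement-driven model of the paper; the potential, noise strength, and time step are arbitrary) integrates an overdamped Langevin equation in a double-well potential with the Euler-Maruyama method, producing the telegraph-like hopping that mean-rate compact models average away.

    import numpy as np

    rng = np.random.default_rng(1)

    def euler_maruyama_double_well(n_steps=100_000, dt=1e-3, noise=0.2):
        """Integrate dm/dt = -dU/dm + sqrt(2*noise)*xi(t), U(m) = (1 - m**2)**2 / 4.

        The state m hops stochastically between the wells at m = -1 and
        m = +1, a minimal analog of telegraph-like switching.
        """
        m = np.empty(n_steps)
        m[0] = 1.0
        for i in range(1, n_steps):
            drift = m[i - 1] * (1.0 - m[i - 1] ** 2)  # equals -dU/dm
            m[i] = m[i - 1] + drift * dt + np.sqrt(2 * noise * dt) * rng.standard_normal()
        return m

    trajectory = euler_maruyama_double_well()
    print("fraction of time in the m > 0 well:", np.mean(trajectory > 0))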

Measurement-driven neural-network training for integrated magnetic tunnel junction arrays

May 14, 2024
Author(s)
William Borders, Advait Madhavan, Matthew Daniels, Vasileia Georgiou, Martin Lueker-Boden, Tiffany Santos, Patrick Braganca, Mark Stiles, Jabez J. McClelland, Brian Hoskins
The increasing scale of neural networks needed to support more complex applications has led to an increasing requirement for area- and energy-efficient hardware. One route to meeting the budget for these applications is to circumvent the von Neumann…

Experimental demonstration of a robust training method for strongly defective neuromorphic hardware

December 11, 2023
Author(s)
William Borders, Advait Madhavan, Matthew Daniels, Vasileia Georgiou, Martin Lueker-Boden, Tiffany Santos, Patrick Braganca, Mark Stiles, Jabez J. McClelland, Brian Hoskins
Neural networks are increasing in scale and sophistication, catalyzing the need for efficient hardware. An inevitability when transferring neural networks to hardware is that non-idealities impact performance. Hardware-aware training, where non-idealities…
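Hardware-aware training comes in many flavors; as one minimal, hypothetical example (a toy linear regression with injected multiplicative weight noise, not the robust training method demonstrated in the paper), the sketch below updates stored weights using gradients computed through noisy copies of those weights, so the learned solution tolerates the perturbation.

    import numpy as np

    rng = np.random.default_rng(2)

    # Toy regression data (illustrative only, not from the paper).
    x = rng.standard_normal((100, 4))
    true_w = np.array([1.0, -2.0, 0.5, 3.0])
    y = x @ true_w

    w = np.zeros(4)
    lr, rel_sigma = 0.05, 0.1  # learning rate and assumed relative device noise

    for _ in range(200):
        # Forward pass through "devices": fresh multiplicative conductance
        # noise on every evaluation, a crude stand-in for cycle-to-cycle
        # and device-to-device variation.
        w_dev = w * (1.0 + rel_sigma * rng.standard_normal(w.shape))
        pred = x @ w_dev
        grad = 2 * x.T @ (pred - y) / len(y)
        # Update the stored (ideal) weights with the gradient seen through
        # the noisy devices, so training learns to tolerate the noise.
        w -= lr * grad

    print(np.round(w, 2))  # close to true_w despite the injected noise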

Neural networks three ways: unlocking novel computing schemes using magnetic tunnel junction stochasticity

September 28, 2023
Author(s)
Matthew Daniels, William Borders, Nitin Prasad, Advait Madhavan, Sidra Gibeault, Temitayo Adeyeye, Liam Pocher, Lei Wan, Michael Tran, Jordan Katine, Daniel Lathrop, Brian Hoskins, Tiffany Santos, Patrick Braganca, Mark Stiles, Jabez J. McClelland
Due to their interesting physical properties, myriad operational regimes, small size, and industrial fabrication maturity, magnetic tunnel junctions are uniquely suited for unlocking novel computing schemes for in-hardware neuromorphic computing. In this…

Magnetic tunnel junction-based crossbars: improving neural network performance by reducing the impact of non-idealities

July 13, 2023
Author(s)
William Borders, Nitin Prasad, Brian Hoskins, Advait Madhavan, Matthew Daniels, Vasileia Georgiou, Tiffany Santos, Patrick Braganca, Mark Stiles, Jabez J. McClelland
Increasingly higher demand in chip area and power consumption for more sophisticated artificial neural networks has catalyzed efforts to develop architectures, circuits, and devices that perform like the human brain. However, many novel device technologies…

Characterization of Noise in CMOS Ring Oscillators at Cryogenic Temperatures

July 12, 2023
Author(s)
Prashansa Mukim, Pragya Shrestha, Advait Madhavan, Nitin Prasad, Jason Campbell, Forrest Brewer, Mark Stiles, Jabez J. McClelland
Allan deviation provides a means to characterize the time-dependence of noise in oscillators and potentially identify the source characteristics. Measurements on a 130 nm, 7-stage ring oscillator show that the Allan deviation declines from 300 K to 150 K as…
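As a reference for the statistic quoted above, here is a minimal non-overlapping Allan-deviation estimator run on synthetic white frequency noise (illustrative data only, not the measurement pipeline used in the paper); for white frequency noise the result falls roughly as the square root of the averaging factor, whereas flicker noise would flatten out.

    import numpy as np

    def allan_deviation(y, m):
        """Non-overlapping Allan deviation at averaging factor m.

        y is a 1-D array of fractional-frequency samples taken at a fixed
        interval tau0; the averaging time is tau = m * tau0. Adjacent
        blocks of m samples are averaged, and the Allan variance is half
        the mean-squared difference of consecutive block averages.
        """
        n_blocks = len(y) // m
        blocks = y[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        return np.sqrt(0.5 * np.mean(np.diff(blocks) ** 2))

    rng = np.random.default_rng(3)
    white_fm = rng.standard_normal(1_000_000)   # synthetic white frequency noise
    for m in (1, 10, 100, 1000):
        print(m, allan_deviation(white_fm, m))  # falls roughly as 1/sqrt(m)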

Ultrafast ID-VG Technique for Reliable Cryogenic Device Characterization

March 21, 2023
Author(s)
Pragya Shrestha, Akin Akturk, Brian Hoskins, Advait Madhavan, Jason Campbell
An in-depth understanding of the transient operation of devices at cryogenic temperatures remains experimentally elusive. However, the impact of these transients has recently become important in efforts to develop both electronics to support quantum…

Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions

July 18, 2022
Author(s)
Jonathan Goodwill, Nitin Prasad, Brian Hoskins, Matthew Daniels, Advait Madhavan, Lei Wan, Tiffany Santos, Michael Tran, Jordan Katine, Patrick Braganca, Mark Stiles, Jabez J. McClelland
The increasing scale of neural networks and their growing application space have produced a demand for more energy and memory efficient artificial-intelligence-specific hardware. Avenues to mitigate the main issue, the von Neumann bottleneck, include in…

Mutual control of stochastic switching for two electrically coupled superparamagnetic tunnel junctions

August 19, 2021
Author(s)
Philippe Talatchian, Matthew Daniels, Advait Madhavan, Matthew Pufall, Emilie Jue, William Rippard, Jabez J. McClelland, Mark Stiles
Superparamagnetic tunnel junctions (SMTJs) are promising sources for the randomness required by some compact and energy-efficient computing schemes. Coupling them gives rise to collective behavior that could be useful for cognitive computing. We use a…

Impact ionization-induced bistability in CMOS transistors at cryogenic temperatures for capacitorless memory applications

July 29, 2021
Author(s)
Alexander Zaslavsky, Curt A. Richter, Pragya Shrestha, Brian Hoskins, Son Le, Advait Madhavan, Jabez J. McClelland
Cryogenic operation of complementary metal oxide semiconductor (CMOS) silicon transistors is crucial for quantum information science, but it brings deviations from standard transistor operation. Here we report on sharp current jumps and stable hysteretic…

A System for Validating Resistive Neural Network Prototypes

July 27, 2021
Author(s)
Brian Hoskins, Mitchell Fream, Matthew Daniels, Jonathan Goodwill, Advait Madhavan, Jabez J. McClelland, Osama Yousuf, Gina C. Adam, Wen Ma, Muqing Liu, Rasmus Madsen, Martin Lueker-Boden
Building prototypes of heterogeneous hardware systems based on emerging electronic, magnetic, and photonic devices is an increasingly important area of research. On the face of it, the novel implementation of these systems, especially for online learning…

Temporal Memory with Magnetic Racetracks

December 1, 2020
Author(s)
Hamed Vakili, Mohammed N. Sakib, Samiran Ganguly, Mircea Stan, Matthew Daniels, Advait Madhavan, Mark D. Stiles, Avik W. Ghosh
Race logic is a relative timing code that represents information in a wavefront of digital edges on a set of wires in order to accelerate dynamic programming and machine learning algorithms. Skyrmions, bubbles, and domain walls are mobile magnetic…
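To make the race-logic encoding concrete (a generic textbook-style sketch, not the magnetic racetrack implementation proposed in the paper), the snippet below represents values as edge arrival times, so MIN is first arrival, MAX is last arrival, and adding a constant is a fixed delay; one relaxation step of a shortest-path recurrence then follows directly. The times and weights are arbitrary.

    # Race logic encodes a value as the arrival time of a rising edge, so
    # MIN is "first edge wins", MAX is "last edge wins", and adding a
    # constant is a fixed delay. A tiny arrival-time simulation (times in
    # arbitrary clock ticks, values chosen for illustration):

    def race_min(*arrivals):    # first-arrival gate (OR in edge logic)
        return min(arrivals)

    def race_max(*arrivals):    # last-arrival gate (AND in edge logic)
        return max(arrivals)

    def delay(arrival, ticks):  # constant addition via a delay element
        return arrival + ticks

    # One relaxation step of a dynamic-programming / shortest-path update:
    # d(v) = min over predecessors u of (d(u) + w(u, v))
    d_u1, d_u2 = 3, 5           # edge arrival times encoding current distances
    w1, w2 = 4, 1               # edge weights realized as delays
    d_v = race_min(delay(d_u1, w1), delay(d_u2, w2))
    print(d_v)                  # 6: the earlier of 3 + 4 and 5 + 1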