We are aiding the development of hardware accelerators for broad classes of hard problems. For many hard computational problems, solutions can be found efficiently using approaches that incorporate randomness. Application-specific hardware accelerators based on thermally unstable devices can increase the speed and energy efficiency of these calculations. We are enabling the development of such accelerators by carrying out systematic measurements of prototype devices, developing models of novel hardware for use in circuit simulations, designing efficient coupling circuitry, and identifying the device properties needed by efficient algorithms.
Finding good solutions to many hard problems, such as combinatorial optimization problems like the traveling salesman problem, counterintuitively requires making the estimated solution worse before making it better. This situation arises because such problems have many candidate solutions that cannot be improved without first passing through worse ones. Using random processes to move in worse directions in a controlled way is therefore essential to solving them. These approaches are inspired by material growth processes in which the temperature is lowered slowly, so that thermal fluctuations continue to keep the material from getting stuck in undesirable metastable phases.
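As an illustration of this strategy, the sketch below is a minimal simulated-annealing loop in Python. The one-dimensional cost function, the move set, and the cooling schedule are hypothetical placeholders chosen only to show how controlled randomness lets the search escape local minima as the temperature is lowered; it is not the algorithm used in our accelerators.

```python
import math
import random

def anneal(energy, neighbor, x0, t_start=1.0, t_end=1e-3, steps=10000):
    """Slowly lower the temperature while occasionally accepting worse moves."""
    x, e = x0, energy(x0)
    for k in range(steps):
        # Geometric cooling schedule: the temperature falls every step.
        t = t_start * (t_end / t_start) ** (k / (steps - 1))
        x_new = neighbor(x)
        e_new = energy(x_new)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if e_new <= e or random.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
    return x, e

# Hypothetical example: a one-dimensional landscape with many local minima.
best_x, best_e = anneal(
    energy=lambda x: x * x + 10.0 * math.cos(3.0 * x),
    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
    x0=random.uniform(-5.0, 5.0),
)
print(best_x, best_e)
```

Accepting a worse move with probability exp(-ΔE/T) is what allows the search to climb out of a shallow minimum early on, while the slowly falling temperature makes such moves increasingly rare as the system settles.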
Magnetic tunnel junctions operating in the superparamagnetic regime are an attractive way to generate the random processes needed to move controllably in worse directions. Implementations based purely on complementary metal-oxide-semiconductor (CMOS) circuitry rely on pseudorandom numbers generated by complex circuits. Novel hardware like magnetic tunnel junctions, in which thermal fluctuations drive random transitions between resistance states, can be more efficient. Incorporating such devices into CMOS circuitry requires accurately modeling their behavior in time-dependent circuits. We are developing models that reproduce the measured behavior of experimental devices. New models are required because the measured behavior disagrees with that predicted by the simple models typically used for this purpose.
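For context, the simplest baseline model of such a device is a two-state random telegraph process with thermally activated (Néel-Arrhenius-style) mean dwell times, sketched below in Python. The dwell times and resistance values are illustrative assumptions only, and, as noted above, measured devices depart from this kind of simple model.

```python
import math
import random

def simulate_smtj(duration, dt, tau_p=1e-6, tau_ap=2e-6, r_p=1e3, r_ap=2e3):
    """Resistance-versus-time trace for an idealized two-state fluctuator.

    tau_p, tau_ap : assumed mean dwell times of the parallel / antiparallel states (s)
    r_p, r_ap     : assumed resistances of the two states (ohms)
    """
    state = 0  # 0 = parallel, 1 = antiparallel
    trace = []
    t = 0.0
    while t < duration:
        tau = tau_p if state == 0 else tau_ap
        # Per-step switching probability of a Poisson (random telegraph) process.
        if random.random() < 1.0 - math.exp(-dt / tau):
            state = 1 - state
        trace.append(r_p if state == 0 else r_ap)
        t += dt
    return trace

trace = simulate_smtj(duration=1e-3, dt=1e-8)
```

A circuit-simulation model for a real device would additionally make the dwell times depend on bias voltage, applied field, and temperature, which is where the measured deviations from simple models become important.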
The complex problems of interest can be formulated by defining interactions between many two-level systems, which we implement using the two resistance states of magnetic tunnel junctions operating in the superparamagnetic regime. The interactions are chosen so that the configuration of the two-level systems with the minimum energy is the solution. That solution is found by slowly lowering the effective temperature of the collection of two-level systems until it settles into the lowest-energy configuration. Existing demonstrations of this approach measure the state of each of the magnetic tunnel junctions, compute the resulting effect of each on all the others, and then apply that interaction to each. We have developed direct methods to implement these interactions and have demonstrated the ability to tune the effective temperature so that a simple model system settles into its ground state.
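A minimal software sketch of that measure-compute-apply loop, assuming an Ising-type formulation with illustrative couplings rather than our hardware implementation, is shown below: each two-level system is read, the field it feels from the others is computed, and its next state is sampled with a probability set by that field and the effective temperature.

```python
import math
import random

def sweep(s, J, h, beta):
    """One measure-compute-apply sweep over all two-level systems (states s[i] = +/-1)."""
    n = len(s)
    for i in range(n):
        # Compute the effect of all other systems on system i.
        local_field = h[i] + sum(J[i][j] * s[j] for j in range(n) if j != i)
        # Apply it: sample the new state with a temperature-dependent probability.
        p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * local_field))
        s[i] = 1 if random.random() < p_up else -1
    return s

# Hypothetical two-spin example: the coupling J favors aligned states.
J = [[0.0, 1.0], [1.0, 0.0]]
h = [0.0, 0.0]
s = [random.choice([-1, 1]) for _ in range(2)]
for beta in (0.1, 0.5, 1.0, 2.0, 5.0):  # increasing beta = lowering the effective temperature
    for _ in range(100):
        s = sweep(s, J, h, beta)
print(s)  # an aligned configuration, the minimum-energy state for this coupling
```

Lowering the effective temperature (raising beta) makes each two-level system follow its local field more deterministically, so the collection settles toward the minimum-energy configuration that encodes the solution.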
These same random processes can be used to implement neural networks in which random bitstreams represent the outputs of neurons and the synaptic weights, an approach referred to as stochastic computing. We have designed and simulated a neural network based on random bitstreams generated by superparamagnetic tunnel junctions and shown that it can be more energy efficient than approaches based purely on CMOS circuitry.
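To illustrate the encoding (not our specific network design), the Python sketch below represents values in [0, 1] as random bitstreams and multiplies them with a bitwise AND, the basic operation stochastic computing uses for synaptic weighting. The calls to random.random() stand in for bits that a superparamagnetic tunnel junction would supply; no device model is implied.

```python
import random

def to_bitstream(value, length):
    """Encode value in [0, 1] as a random bitstream with P(bit = 1) = value."""
    return [1 if random.random() < value else 0 for _ in range(length)]

def from_bitstream(bits):
    """Decode a bitstream back to a value by averaging."""
    return sum(bits) / len(bits)

length = 10000
a, b = 0.8, 0.5
stream_a = to_bitstream(a, length)
stream_b = to_bitstream(b, length)
# A bitwise AND of two independent streams multiplies the encoded values.
product = from_bitstream([x & y for x, y in zip(stream_a, stream_b)])
print(product)  # close to a * b = 0.4
```

Because a multiply reduces to a single AND gate acting on bits supplied directly by the fluctuating devices, the arithmetic hardware can be far simpler than the multipliers and pseudorandom-number generators required in a purely CMOS implementation.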