The standard training method, “backpropagation,” is difficult to implement in hardware and becomes untenable when the hardware is imperfect.
Parameter multiplexed gradient descent (PMGD) is a machine learning training method designed to easily train emerging non-volatile memory and neuromorphic hardware platforms.
PMGD enables online training of both analog and digital systems, making it useful for in-the-field training of custom machine learning and neuromorphic hardware.
With PMGD, hardware networks can be trained much faster, and with less energy, than with comparable existing techniques, and the method is robust to hardware imperfections and noise.
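The name suggests the core mechanism: all parameters are perturbed simultaneously with distinguishable (“multiplexed”) signals, and the resulting change in a single measured cost is demultiplexed into per-parameter gradient estimates. Below is a minimal Python sketch of that general idea using random ±1 perturbation codes; the names (cost, theta, amp, lr) are illustrative assumptions, not part of any reference PMGD implementation, and the actual hardware method may differ in detail.

```python
import numpy as np

def cost(theta):
    # Stand-in for a measured hardware cost: a simple quadratic bowl
    # with its minimum at theta = 1.
    return np.sum((theta - 1.0) ** 2)

rng = np.random.default_rng(0)
theta = rng.normal(size=8)  # parameters, e.g. programmable device weights
amp = 1e-3                  # perturbation amplitude
lr = 0.1                    # learning rate

for _ in range(500):
    # Perturb every parameter at once with an independent +/-1 code.
    p = rng.choice([-1.0, 1.0], size=theta.shape)
    # One global measurement: the change in cost caused by the combined nudge.
    dC = cost(theta + amp * p) - cost(theta)
    # Demultiplex: correlating the cost change with each parameter's code
    # gives an unbiased estimate of that parameter's partial derivative.
    grad_est = (dC / amp) * p
    theta -= lr * grad_est

print(cost(theta))  # approaches 0 as theta converges to 1
```

Because only a scalar cost needs to be measured, a perturbative scheme like this requires no analytic model of the hardware, which is what makes it tolerant of device imperfections and noise.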