AUTHOR=Liu Samuel, Xiao T. Patrick, Kwon Jaesuk, Debusschere Bert J., Agarwal Sapan, Incorvia Jean Anne C., Bennett Christopher H. TITLE=Bayesian neural networks using magnetic tunnel junction-based probabilistic in-memory computing JOURNAL=Frontiers in Nanotechnology VOLUME=4 YEAR=2022 URL=https://www.frontiersin.org/journals/nanotechnology/articles/10.3389/fnano.2022.1021943 DOI=10.3389/fnano.2022.1021943 ISSN=2673-3013 ABSTRACT=

Bayesian neural networks (BNNs) combine the generalizability of deep neural networks (DNNs) with a rigorous quantification of predictive uncertainty, which mitigates overfitting and makes them valuable for high-reliability or safety-critical applications. However, the probabilistic nature of BNNs makes them more computationally intensive on digital hardware and, so far, less directly amenable to acceleration by analog in-memory computing than DNNs. This work exploits a novel spintronic bit cell that efficiently and compactly implements Gaussian-distributed BNN values. Specifically, the bit cell combines a tunable stochastic magnetic tunnel junction (MTJ) encoding the trained standard deviation with a multi-bit domain-wall MTJ device independently encoding the trained mean. The two devices can be integrated within the same array, enabling highly efficient, fully analog, probabilistic matrix-vector multiplications. We use micromagnetics simulations as the basis of a system-level model of the spintronic BNN accelerator, demonstrating that our design yields accurate, well-calibrated uncertainty estimates for both classification and regression problems and matches software BNN performance. This result paves the way toward spintronic in-memory computing systems implementing trusted neural networks at a modest energy budget.
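
To make the probabilistic matrix-vector multiplication concrete, the sketch below is a minimal behavioral model of the bit cell described in the abstract: each array weight is drawn per pass from a Gaussian whose mean stands in for the multi-bit domain-wall MTJ state and whose standard deviation stands in for the tunable stochastic MTJ, and repeated stochastic passes yield a predictive mean and uncertainty. All names and the toy dimensions are hypothetical illustrations; this is not the authors' micromagnetics or system-level simulation code.

```python
# Behavioral sketch of a Gaussian probabilistic MVM, assuming each bit
# cell holds a trained mean (domain-wall MTJ) and a trained standard
# deviation (tunable stochastic MTJ). Hypothetical names throughout.
import numpy as np

rng = np.random.default_rng(0)

def probabilistic_mvm(x, mu, sigma, rng):
    """One stochastic forward pass: draw each weight w_ij from
    N(mu_ij, sigma_ij^2), then compute the analog dot products."""
    w = rng.normal(loc=mu, scale=sigma)  # fresh weight sample per pass
    return w @ x

# Toy layer: 4 inputs -> 3 outputs, with trained means and std devs.
mu = rng.normal(size=(3, 4))
sigma = 0.1 * np.abs(rng.normal(size=(3, 4)))
x = rng.normal(size=4)

# Monte Carlo over repeated passes gives a predictive distribution,
# from which a mean prediction and an uncertainty estimate follow.
samples = np.stack([probabilistic_mvm(x, mu, sigma, rng) for _ in range(1000)])
print("predictive mean:", samples.mean(axis=0))
print("predictive std: ", samples.std(axis=0))
```

In the hardware described by the abstract, the per-pass weight sampling happens physically in the array rather than in software, which is what makes the Monte Carlo loop inexpensive; the sketch only mirrors that sampling behavior numerically.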