AUTHOR=Georganas Evangelos, Kalamkar Dhiraj, Avancha Sasikanth, Adelman Menachem, Aggarwal Deepti, Anderson Cristina, Breuer Alexander, Bruestle Jeremy, Chaudhary Narendra, Kundu Abhisek, Kutnick Denise, Laub Frank, Md Vasimuddin, Misra Sanchit, Mohanty Ramanarayan, Pabst Hans, Retford Brian, Ziv Barukh, Heinecke Alexander TITLE=Tensor Processing Primitives: A Programming Abstraction for Efficiency and Portability in Deep Learning and HPC Workloads JOURNAL=Frontiers in Applied Mathematics and Statistics VOLUME=8 YEAR=2022 URL=https://www.frontiersin.org/journals/applied-mathematics-and-statistics/articles/10.3389/fams.2022.826269 DOI=10.3389/fams.2022.826269 ISSN=2297-4687 ABSTRACT=
During the past decade, novel Deep Learning (DL) algorithms, workloads, and hardware have been developed to tackle a wide range of problems. Despite the advances in the workload and hardware ecosystems, the programming methodology of DL systems has stagnated. DL workloads leverage either highly optimized, yet platform-specific and inflexible kernels from DL libraries, or, in the case of novel operators, reference implementations are built