ORIGINAL RESEARCH article

Front. Comput. Sci.
Sec. Computer Security
Volume 6 - 2024 | doi: 10.3389/fcomp.2024.1465352
This article is part of the Research Topic Reliable and Secure System Software in Emerging Cloud and Distributed Environment View all articles

FedNIC: Enhancing Privacy-Preserving Federated Learning via Homomorphic Encryption Offload on SmartNIC

Provisionally accepted
Sean Choi 1*, Disha Patel 1, Diman Zad Tootaghaj 2, Lianjie Cao 2, Faraz Ahmed 2, Puneet Sharma 2
  • 1 Santa Clara University, Santa Clara, United States
  • 2 Hewlett Packard Enterprise, Alpharetta, Georgia, United States

The final, formatted version of the article will be published soon.

    Federated Learning (FL) has emerged as a promising paradigm for secure distributed training of machine learning models across multiple clients or devices, enabling model training without having to share data across the clients. Yet recent studies have revealed that FL can be vulnerable to data leakage and reconstruction attacks even though the data itself is never shared with another client. To resolve this vulnerability and improve the privacy of all clients, a class of techniques called privacy-preserving FL incorporates encryption schemes, such as Homomorphic Encryption (HE), to encrypt model information and fully protect it from exposure to other parties. The downside of this approach is that encryption schemes like HE are highly compute-intensive, often causing inefficient and excessive use of client CPU resources that could otherwise serve other tasks.

    To alleviate this issue, this work introduces a novel approach that leverages Smart Network Interface Cards (SmartNICs) to offload the compute-intensive HE operations of privacy-preserving FL. By employing SmartNICs as hardware accelerators, we enable efficient computation of HE while saving CPU cycles and other server resources for more critical tasks. In addition, by offloading encryption from the host to a separate device, the details of encryption remain secure even if the host is compromised, ultimately improving the security of the entire FL system.

    Given these benefits, this paper presents an FL system named FedNIC that implements the above approach, with an in-depth description of its architecture, implementation, and performance evaluation. Our experimental results demonstrate a more secure FL system with no loss in model accuracy and up to a 25% reduction in host CPU cycles, at the cost of a roughly 46% increase in total training time, showing the feasibility and trade-offs of utilizing SmartNICs as encryption offload devices in federated learning scenarios. Finally, we outline promising future work and potential optimizations toward a more secure and privacy-preserving federated learning system.
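    The abstract's core idea rests on the additive homomorphism of schemes like Paillier: clients encrypt their model updates, the aggregator multiplies ciphertexts to add the underlying plaintexts, and only the key holder can decrypt the sum, so individual updates are never exposed. The following is a minimal pure-Python sketch of that property using a toy Paillier-style scheme with insecurely small demo primes; it is an illustration of the homomorphic aggregation principle only, not the paper's SmartNIC-offloaded implementation.

```python
# Toy Paillier-style additively homomorphic aggregation (demo-sized keys, NOT secure).
import math
import random

def keygen(p=313, q=317):
    """Generate a toy Paillier key pair from two small primes."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    n2 = n * n
    # With generator g = n + 1, mu = (L(g^lam mod n^2))^-1 mod n, L(x) = (x-1)//n.
    mu = pow((pow(n + 1, lam, n2) - 1) // n, -1, n)
    return n, lam, mu

def encrypt(n, m):
    """Encrypt integer m < n as c = g^m * r^n mod n^2."""
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(n, lam, mu, c):
    """Recover m = L(c^lam mod n^2) * mu mod n."""
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

n, lam, mu = keygen()
client_updates = [7, 11, 5]                 # stand-ins for per-client gradient values
ciphertexts = [encrypt(n, u) for u in client_updates]

# The aggregator multiplies ciphertexts, which adds the plaintexts underneath;
# it never sees any individual update in the clear.
aggregate = math.prod(ciphertexts) % (n * n)
print(decrypt(n, lam, mu, aggregate))       # prints 23, i.e. 7 + 11 + 5
```

    In FedNIC's setting, the `encrypt` step (the expensive modular exponentiations) is what would be offloaded from the host CPU to the SmartNIC.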

    Keywords: privacy-preserving machine learning, federated learning, homomorphic encryption, SmartNIC, network offload

    Received: 16 Jul 2024; Accepted: 27 Sep 2024.

    Copyright: © 2024 Choi, Patel, Zad Tootaghaj, Cao, Ahmed and Sharma. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Sean Choi, Santa Clara University, Santa Clara, United States

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.