Efficient privacy-preserving deep neural network training protocol in federated learning

University of New Brunswick


Machine learning is being used in large sectors such as healthcare and financial services, which raises privacy concerns about user data and model privacy. Federated learning (FL) was introduced in response: it enables users to combine their locally trained models through a centralized server. Because each user computes their model locally in FL, the user's input data is not directly exposed, but the risk of model misuse remains high. In this thesis, we propose a robust, efficient privacy-preserving deep neural network (DNN) training protocol, built with PrivFL as its foundation. Our private DNN training protocol consists of a secure and efficient local gradient computation protocol and a secure aggregation protocol. We develop an optimized two-party local gradient computation protocol using fully homomorphic encryption and garbled circuits. The essence of our secure multiparty aggregation is computing the global gradient of the DNN. We analyze the protocol's security against semi-honest adversaries and evaluate our implementation on real-world datasets.
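To illustrate the idea behind secure multiparty aggregation in general terms, the sketch below shows a standard pairwise-masking scheme: each party adds masks derived from seeds shared with every other party, chosen so that the masks cancel when the server sums the masked vectors, revealing only the global sum. This is a generic illustration under assumed names (`mask_gradients`, `aggregate`, the modulus, and the use of Python's `random` as a stand-in PRG), not the actual protocol developed in the thesis, which relies on fully homomorphic encryption and garbled circuits.

```python
import random

PRIME = 2**61 - 1  # illustrative modulus for masking integer-encoded gradients


def mask_gradients(grads, pair_seeds, my_id):
    """Mask a party's gradient vector with pairwise-cancelling masks.

    For each peer, both parties derive the same mask stream from a shared
    seed; the lower-id party adds it and the higher-id party subtracts it,
    so every mask cancels in the server's sum.
    """
    masked = list(grads)
    for other_id, seed in pair_seeds.items():
        rng = random.Random(seed)  # stand-in for a cryptographic PRG
        for k in range(len(masked)):
            m = rng.randrange(PRIME)
            if my_id < other_id:
                masked[k] = (masked[k] + m) % PRIME
            else:
                masked[k] = (masked[k] - m) % PRIME
    return masked


def aggregate(all_masked):
    """Server-side step: sum masked vectors; masks cancel, leaving the sum."""
    total = [0] * len(all_masked[0])
    for vec in all_masked:
        for k, v in enumerate(vec):
            total[k] = (total[k] + v) % PRIME
    return total
```

In a run with three parties sharing one seed per pair, the server's output equals the plain sum of the local gradients, while each individual masked vector is statistically hidden.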