Examining the Environmental Impact of Federated Learning

Xinchi Qiu, Titouan Parcollet, Javier Fernandez-Marques, Pedro P. B. Gusmao, Yan Gao, Daniel J. Beutel, Taner Topal, Akhil Mathur, Nicholas D. Lane; 24(129):1-23, 2023.

Abstract

Although deep learning-based technologies have shown impressive results, they also raise significant privacy and environmental concerns because training typically takes place in data centers. In response to these concerns, alternatives to centralized training, such as Federated Learning (FL), have emerged. FL is now being deployed on a global scale by companies that must comply with new legal requirements and policies focused on privacy protection. However, the potential environmental impact of FL remains uncertain and unexplored. This article presents the first systematic study of the carbon footprint of FL. We introduce a comprehensive model for quantifying the carbon footprint, enabling further investigation into the relationship between FL design and carbon emissions. Additionally, we compare the carbon footprint of FL to that of traditional centralized learning. Our findings reveal that, depending on the configuration, FL can produce up to two orders of magnitude more carbon emissions than centralized training. However, in certain scenarios, FL can be comparable to centralized learning due to the reduced energy consumption of embedded devices. Lastly, we discuss the implications of these results and highlight future challenges and trends in FL aimed at reducing its environmental impact, including algorithm efficiency, hardware capabilities, and increased industry transparency.
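To make the comparison concrete, the sketch below illustrates the general style of carbon accounting the abstract describes: total emissions are estimated as energy consumed (local training plus communication for FL, training scaled by data-center overhead for centralized learning) multiplied by the carbon intensity of the electricity grid. This is a minimal illustration of that accounting pattern, not the paper's actual model; all function names, parameters, and numbers are hypothetical.

```python
# Illustrative sketch only: estimates emissions as energy (kWh) times grid
# carbon intensity (kg CO2e per kWh). All names and values are hypothetical.

def fl_carbon_kg(num_rounds: int,
                 clients_per_round: int,
                 train_kwh_per_client: float,
                 comm_kwh_per_client: float,
                 grid_kg_co2_per_kwh: float) -> float:
    """FL estimate: per-client training + communication energy per round,
    scaled by rounds and participating clients, times grid carbon intensity."""
    kwh_per_round = clients_per_round * (train_kwh_per_client + comm_kwh_per_client)
    return num_rounds * kwh_per_round * grid_kg_co2_per_kwh

def centralized_carbon_kg(train_kwh: float,
                          pue: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """Centralized estimate: training energy scaled by the data center's
    power usage effectiveness (PUE), times grid carbon intensity."""
    return train_kwh * pue * grid_kg_co2_per_kwh

if __name__ == "__main__":
    # Hypothetical numbers chosen only to show the two estimates side by side.
    fl = fl_carbon_kg(num_rounds=1000, clients_per_round=10,
                      train_kwh_per_client=0.002, comm_kwh_per_client=0.0005,
                      grid_kg_co2_per_kwh=0.4)
    dc = centralized_carbon_kg(train_kwh=20.0, pue=1.1,
                               grid_kg_co2_per_kwh=0.4)
    print(f"FL estimate:          {fl:.2f} kg CO2e")
    print(f"Centralized estimate: {dc:.2f} kg CO2e")
```

Under this kind of model, the balance between the two settings depends on the configuration: many communication rounds and inefficient client hardware push FL's total upward, while the low power draw of embedded devices and the absence of data-center overhead can bring it close to, or below, the centralized figure.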
