Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model

By Alexandra Sasha Luccioni, Sylvain Viguier, Anne-Laure Ligozat; 24(253):1−15, 2023.

Abstract

Progress in machine learning (ML) comes at an environmental cost due to the resources, energy, and materials required for training ML models. This article aims to quantify the carbon footprint of BLOOM, a 176-billion-parameter language model, across its life cycle. The authors estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO2eq when considering only dynamic power consumption, and 50.5 tonnes when accounting for all processes from equipment manufacturing to energy-based operational consumption. They also present an empirical study measuring the energy requirements and carbon emissions of deploying BLOOM for real-time inference via an API endpoint. The article concludes with a discussion of the difficulties of precisely estimating the carbon footprint of ML models and proposes future research directions to improve carbon emissions reporting.
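The dynamic-versus-total distinction above follows the standard life-cycle accounting pattern: operational emissions are the measured energy consumption multiplied by the grid's carbon intensity, while embodied (manufacturing) emissions are amortized over the share of the hardware's service life that the training run consumes. The sketch below illustrates that arithmetic; the function names and all numeric inputs are hypothetical placeholders for illustration, not values or code from the paper.

```python
# Minimal sketch of a life-cycle carbon estimate for a model training run.
# All numeric values below are hypothetical placeholders, NOT figures from the paper.

def operational_emissions(energy_kwh: float, grid_intensity_kg_per_kwh: float) -> float:
    """Dynamic (operational) emissions: energy consumed x grid carbon intensity."""
    return energy_kwh * grid_intensity_kg_per_kwh

def embodied_emissions(n_gpus: int, kg_co2eq_per_gpu: float,
                       training_hours: float, gpu_lifetime_hours: float) -> float:
    """Manufacturing emissions amortized over the fraction of hardware lifetime used."""
    return n_gpus * kg_co2eq_per_gpu * (training_hours / gpu_lifetime_hours)

if __name__ == "__main__":
    # Hypothetical inputs: a large training run on a relatively low-carbon grid.
    dynamic = operational_emissions(energy_kwh=1_000_000,
                                    grid_intensity_kg_per_kwh=0.057)
    embodied = embodied_emissions(n_gpus=384, kg_co2eq_per_gpu=150.0,
                                  training_hours=2_500, gpu_lifetime_hours=35_000)
    total = dynamic + embodied
    print(f"dynamic: {dynamic / 1000:.1f} t CO2eq, total: {total / 1000:.1f} t CO2eq")
```

The gap between a dynamic-only estimate and a full life-cycle figure comes from exactly such additional terms, which is why the paper's two headline numbers (24.7 vs. 50.5 tonnes) differ by roughly a factor of two.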
