BLOOM (BigScience Large Open-science Open-access Multilingual Language Model) is a 176-billion-parameter transformer-based autoregressive large language model. The model, together with its code base and the data used to train it, is distributed under free licences. BLOOM was trained on approximately 366 billion tokens from March to July 2022.
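As a rough illustration of what open access means in practice, the sketch below assumes the Hugging Face transformers library and the publicly hosted bigscience/bloom checkpoint; it is not taken from the BigScience materials, and loading the full 176-billion-parameter model requires very substantial hardware (smaller released variants are typically used for experimentation).

```python
# Minimal sketch (an assumption, not from the source) of loading the released
# BLOOM checkpoint and generating text with the Hugging Face transformers library.
# The model ID "bigscience/bloom" and the prompt are illustrative choices.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom")
model = AutoModelForCausalLM.from_pretrained("bigscience/bloom")

# Autoregressive generation: the model predicts one token at a time,
# conditioning each new token on all tokens produced so far.
inputs = tokenizer("BLOOM is a multilingual language model that", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```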