
BLOOM: A 176B-Parameter Open-Access Multilingual Language Model

BigScience Workshop:
Teven Le Scao
Angela Fan
Christopher Akiki
Ellie Pavlick
Suzana Ilić
Daniel Hesslow
Roman Castagné
A. Luccioni
François Yvon
Matthias Gallé
J. Tow
Alexander M. Rush
Stella Biderman
Albert Webson
Pawan Sasanka Ammanamanchi
Thomas Wang
Benoît Sagot
Niklas Muennighoff
Albert Villanova del Moral
Olatunji Ruwase
Rachel Bawden
Stas Bekman
Angelina McMillan-Major
Iz Beltagy
Huu Nguyen
Lucile Saulnier
Samson Tan
Pedro Ortiz Suarez
Victor Sanh
Hugo Laurençon
Yacine Jernite
Julien Launay
Margaret Mitchell
Colin Raffel
Aaron Gokaslan
Adi Simhi
Aitor Soroa Etxabe
Alham Fikri Aji
Amit Alfassy
Anna Rogers
Ariel Kreisberg Nitzav
Canwen Xu
Chenghao Mou
Chris C. Emezue
Christopher Klamm
Colin Leong
Daniel Alexander van Strien
David Ifeoluwa Adelani
Dragomir R. Radev
E. G. Ponferrada
Efrat Levkovizh
Ethan Kim
Eyal Natan
F. Toni
Gérard Dupont
Germán Kruszewski
Giada Pistilli
Hady ElSahar
Hamza Benyamina
H. Tran
Ian Yu
Idris Abdulmumin
Isaac Johnson
Itziar Gonzalez-Dios
Javier de la Rosa
Jenny Chim
Jesse Dodge
Jian Zhu
Jonathan Chang
Jorg Frohberg
Josephine Tobing
J. Bhattacharjee
Khalid Almubarak
Kimbo Chen
Kyle Lo
Leandro von Werra
Leon Weber
Long Phan
Loubna Ben Allal
Ludovic Tanguy
Manan Dey
M. Muñoz
Maraim Masoud
María Grandury
Mario Šaško
Max Huang
Maximin Coavoux
Mayank Singh
Mike Tian-Jian Jiang
Minh Chien Vu
M. A. Jauhar
Mustafa Ghaleb
Nishant Subramani
Nora Kassner
Nurulaqilla Khamis
Olivier Nguyen
Omar Espejel
Ona de Gibert
Paulo Villegas
Peter Henderson
Pierre Colombo
Priscilla Amuok
Quentin Lhoest
Rheza Harliman
Rishi Bommasani
R. López
Rui Ribeiro
Salomey Osei
S. Pyysalo
Sebastian Nagel
Shamik Bose
Shamsuddeen Hassan Muhammad
Shanya Sharma
Shayne Longpre
Somaieh Nikpoor
S. Silberberg
S. Pai
S. Zink
Tiago Timponi Torrent
Timo Schick
Tristan Thrush
V. Danchev
Vassilina Nikoulina
Veronika Laippala
Violette Lepercq
V. Prabhu
Zaid Alyafeai
Zeerak Talat
Arun Raja
Benjamin Heinzerling
Chenglei Si
Davut Emre Taşar
Elizabeth Salesky
Sabrina J. Mielke
Wilson Y. Lee
Abheesht Sharma
Andrea Santilli
Antoine Chaffin
Arnaud Stiegler
Debajyoti Datta
Eliza Szczechla
Gunjan Chhablani
Han Wang
Harshit Pandey
Hendrik Strobelt
Jason Alan Fries
Jos Rozen
Leo Gao
Lintang Sutawika
M Saiful Bari
Maged S. Al-Shaibani
Matteo Manica
Nihal V. Nayak
Ryan Teehan
Samuel Albanie
Sheng Shen
Srulik Ben-David
Stephen H. Bach
Taewoon Kim
T. Bers
Thibault Févry
Trishala Neeraj
Urmish Thakker
Vikas Raunak
Xiang Tang
Zheng-Xin Yong
Zhiqing Sun
Shaked Brody
Y. Uri
Hadar Tojarieh
Adam Roberts
Hyung Won Chung
Jaesung Tae
Jason Phang
Ofir Press
Conglong Li
Deepak Narayanan
Hatim Bourfoune
Jared Casper
Jeff Rasley
Max Ryabinin
Mayank Mishra
Minjia Zhang
M. Shoeybi
Myriam Peyrounette
N. Patry
Nouamane Tazi
Omar Sanseviero
Patrick von Platen
Pierre Cornette
Pierre François Lavallée
Rémi Lacroix
Samyam Rajbhandari
Sanchit Gandhi
Shaden Smith
S. Requena
Suraj Patil
Tim Dettmers
Ahmed Baruwa
Amanpreet Singh
Anastasia Cheveleva
Anne-Laure Ligozat
Arjun Subramonian
Aurélie Névéol
Charles Lovering
Daniel H Garrette
D. Tunuguntla
Ehud Reiter
Ekaterina Taktasheva
E. Voloshina
Eli Bogdanov
Genta Indra Winata
Hailey Schoelkopf
Jan-Christoph Kalo
Jekaterina Novikova
Jessica Zosa Forde
Jungo Kasai
Ken Kawamura
Liam Hazan
Marine Carpuat
Miruna Clinciu
Najoung Kim
Newton Cheng
O. Serikov
Omer Antverg
Oskar van der Wal
Rui Zhang
Ruochen Zhang
Sebastian Gehrmann
Shachar Mirkin
S. Pais
Tatiana Shavrina
Thomas Scialom
Tian Yun
Tomasz Limisiewicz
Verena Rieser
Vitaly Protasov
Vladislav Mikhailov
Yada Pruksachatkun
Yonatan Belinkov
Zachary Bamberger
Zdeněk Kasner
Xiangru Tang
A. Pestana
A. Feizpour
Ammar Khan
Amy Faranak
A. Santos
Anthony Hevia
Antigona Unldreaj
Arash Aghagol
Arezoo Abdollahi
A. Tammour
A. HajiHosseini
Bahareh Behroozi
Benjamin Ayoade Ajibade
B. Saxena
Carlos Muñoz Ferrandis
Daniel McDuff
Danish Contractor
D. Lansky
Davis David
Douwe Kiela
D. A. Nguyen
Edward Tan
Emi Baylor
Ezinwanne Ozoani
F. Mirza
Frankline Ononiwu
Habib Rezanejad
H.A. Jones
Indrani Bhattacharya
Irene Solaiman
Irina Sedenko
I. Nejadgholi
J. Passmore
Joshua Seltzer
Julio Bonis Sanz
Lívia Dutra
Mairon Samagaio
Maraim Elbadri
Margot Mieskes
Marissa Gerchick
Martha Akinlolu
Michael McKenna
Mike Qiu
M. Ghauri
Mykola Burynok
Nafis Abrar
Nazneen Rajani
Nour Elkott
N. Fahmy
Olanrewaju Samuel
Ran An
R. Kromann
Ryan Hao
S. Alizadeh
Sarmad Shubber
Silas L. Wang
Sourav Roy
S. Viguier
Thanh-Cong Le
Tobi Oyebade
T. Le
Yoyo Yang
Zach Nguyen
Abhinav Ramesh Kashyap
Alfredo Palasciano
A. Callahan
Anima Shukla
Antonio Miranda-Escalada
A. Singh
Benjamin Beilharz
Bo Wang
C. Brito
Chenxi Zhou
Chirag Jain
Chuxin Xu
Clémentine Fourrier
Daniel León Perinán
Daniel Molano
Dian Yu
Enrique Manjavacas
Fabio Barth
Florian Fuhrimann
Gabriel Altay
Giyaseddin Bayrak
Gully Burns
Helena U. Vrabec
I. Bello
Isha Dash
J. Kang
John Giorgi
Jonas Golde
J. Posada
Karthi Sivaraman
Lokesh Bulchandani
Lu Liu
Luisa Shinzato
Madeleine Hahn de Bykhovetz
Maiko Takeuchi
Marc Pàmies
M. A. Castillo
Marianna Nezhurina
Mario Sanger
Matthias Samwald
Michael Cullan
Michael Weinberg
M. Wolf
Mina Mihaljcic
Minna Liu
M. Freidank
Myungsun Kang
Natasha Seelam
N. Dahlberg
N. Broad
N. Muellner
Pascale Fung
Patrick Haller
Patricia Haller
R. Eisenberg
Robert Martin
Rodrigo Canalli
Rosaline Su
Ruisi Su
Samuel Cahyawijaya
Samuele Garda
Shlok S Deshmukh
Shubhanshu Mishra
Sid Kiblawi
Simon Ott
Sinee Sang-aroonsiri
Srishti Kumar
Stefan Schweter
S. Bharati
Tanmay Laud
Théo Gigant
Tomoya Kainuma
Wojciech Kusa
Yanis Labrak
Yashasvi Bajaj
Y. Venkatraman
Yifan Xu
Ying Xu
Yu Xu
Z. Tan
Zhongli Xie
Zifan Ye
M. Bras
Younes Belkada
Thomas Wolf
Abstract

Large language models (LLMs) have been shown to be able to perform new tasks based on a few demonstrations or natural language instructions. While these capabilities have led to widespread adoption, most LLMs are developed by resource-rich organizations and are frequently kept from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built thanks to a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model that was trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.
