BLOOM

The BigScience Large Open-science Open-access Multilingual Language Model

About BLOOM

A group of over 1,000 AI researchers has created a multilingual large language model bigger than GPT-3, and they are giving it away for free.

Training started on March 11, 2022, although preparation of the corpus and datasets began much earlier; a model of this scale is not built overnight. Four months later, here it is.

And it hasn’t been easy:

  • Training ran on 384 GPUs with 80 GB of memory each on the Jean Zay supercomputer in France.
  • BLOOM has 176 billion parameters, one billion more than GPT-3: 70 layers, 112 attention heads per layer, a hidden dimensionality of 14336, and a 2048-token sequence length (see the configuration sketch after this list).
  • ALiBi positional embeddings and the GeLU activation function.
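
These hyperparameters can be checked directly from the model configuration published on the Hugging Face Hub. A minimal sketch, assuming the transformers library is installed and that the BLOOM config exposes the attribute names n_layer, n_head, and hidden_size:

    from transformers import AutoConfig

    # Downloads only the small config.json, not the 176B-parameter weights.
    config = AutoConfig.from_pretrained("bigscience/bloom")

    print(config.n_layer)      # 70 transformer layers
    print(config.n_head)       # 112 attention heads per layer
    print(config.hidden_size)  # hidden dimensionality of 14336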

The training has been open to everyone, and anyone has been able to follow its progress. BLOOM was trained on 46 natural languages (English, Spanish, French, and more) and 13 programming languages.
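
Since the full 176-billion-parameter model needs multiple 80 GB GPUs to run, a quick way to try the multilingual behaviour locally is one of the smaller checkpoints from the same BigScience release. A minimal sketch, assuming the transformers library and the bigscience/bloom-560m checkpoint:

    from transformers import pipeline

    # A small BLOOM checkpoint; the full 176B model is too large for most machines.
    generator = pipeline("text-generation", model="bigscience/bloom-560m")

    # The same model completes prompts in several of its training languages.
    print(generator("The capital of France is", max_new_tokens=10)[0]["generated_text"])
    print(generator("La capital de Francia es", max_new_tokens=10)[0]["generated_text"])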

Every resource is available and documented

BigScience

As they explain on their blog, BigScience is an open collaboration backed by Hugging Face, GENCI, and IDRIS. This research workshop brings together academic, industry, and independent researchers from many affiliations, whose interests span AI, NLP, social science, law, ethics, and public policy.

