Mistral AI

A Paris-based OpenAI rival

About Mistral AI

Mistral AI aims to develop new models of generative artificial intelligence for companies, combining scientific excellence, an open-source approach, and a socially responsible vision of technology.

Mixtral 8x7B is a powerful and fast model adaptable to many use cases. While being 6x faster, it matches or outperforms Llama 2 70B on all benchmarks, speaks several languages, and has natural coding abilities. It handles a 32k-token context. You can use it through our API or deploy it yourself (it's Apache 2.0 licensed). It is the strongest open-weight model with a permissive license and the best model overall in terms of cost/performance trade-offs. In particular, it matches or outperforms GPT-3.5 on most standard benchmarks.
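As a minimal sketch of the API route mentioned above, the snippet below builds a chat-completion request against Mistral's public endpoint. The model id `open-mixtral-8x7b`, the endpoint URL, and the payload shape follow Mistral's published API conventions, but treat them as assumptions and check the current API reference; the API key is a placeholder.

```python
import json
import urllib.request

# Assumed endpoint and model id for Mixtral 8x7B on Mistral's hosted API;
# verify both against the current Mistral API reference.
API_URL = "https://api.mistral.ai/v1/chat/completions"
API_KEY = "YOUR_API_KEY"  # placeholder, replace with a real key

# Standard chat-completions payload: a model id plus a list of messages.
payload = {
    "model": "open-mixtral-8x7b",
    "messages": [
        {"role": "user", "content": "Write a haiku about open-source AI."}
    ],
    "max_tokens": 64,
}

req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)

# Actually sending the request requires a valid key:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because Mixtral is Apache 2.0 licensed, the same prompt could instead be served by a self-hosted deployment exposing a compatible endpoint; only `API_URL` would change.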

Source: https://mistral.ai/news/mixtral-of-experts/?ref=upstract.com

Mistral AI Features

  • It gracefully handles a context of 32k tokens.
  • It handles English, French, Italian, German and Spanish.
  • It shows strong performance in code generation.
  • It can be fine-tuned into an instruction-following model that achieves a score of 8.3 on MT-Bench.

