
    Mixtral 8x7B has been launched by Mistral AI, a company focused on building open AI models and making them available to the developer community.

    Mixtral is a sparse mixture-of-experts (SMoE) model: each token is routed to only 2 of 8 expert networks per layer, which gives it roughly 6x faster inference than Llama 2 70B while matching or outperforming GPT-3.5 on most standard benchmarks, at a markedly better cost/performance point.
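To make "sparse" concrete: in a mixture-of-experts layer, a small router scores all experts for each token, but only the top-scoring few actually run. The following is a toy sketch of top-2-of-8 routing, not Mixtral's implementation; all shapes, names, and the linear "experts" are hypothetical and vastly simplified.

```python
import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS = 8   # Mixtral uses 8 experts per layer
TOP_K = 2       # only 2 experts are active per token
D_MODEL = 16    # toy hidden size for illustration

# Each "expert" is just a linear map in this sketch (the real experts are MLPs).
experts = [rng.standard_normal((D_MODEL, D_MODEL)) / np.sqrt(D_MODEL)
           for _ in range(N_EXPERTS)]
router = rng.standard_normal((D_MODEL, N_EXPERTS)) / np.sqrt(D_MODEL)

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route a single token vector x through its top-2 experts."""
    logits = x @ router                       # one score per expert
    top = np.argsort(logits)[-TOP_K:]         # indices of the 2 highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                  # softmax over the selected experts only
    # Only the selected experts run -- this is where the compute savings come from:
    # the layer holds 8 experts' worth of parameters but does 2 experts' worth of work.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(D_MODEL)
print(moe_layer(token).shape)  # (16,)
```

This is why a model with ~47B total parameters can run at the speed and cost of a much smaller dense model: per token, most parameters sit idle.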

    It handles contexts of up to 32k tokens and, having been trained on open web data, can recognise and process English, French, Italian, German and Spanish, as well as generate code.

    Two versions are available: the base Mixtral 8x7B, and Mixtral 8x7B Instruct, which has been fine-tuned to follow instructions and reaches a score of 8.3 on the MT-Bench benchmark.
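As a sketch of how one might prompt the instruct variant: its model card documents an `[INST] ... [/INST]` chat template that the fine-tuning used. A minimal single-turn prompt builder (the function name is hypothetical) could look like this; the resulting string is what you would feed to a text-generation backend hosting the model.

```python
def build_instruct_prompt(user_message: str) -> str:
    # Single-turn form of the [INST]/[/INST] template documented for
    # Mixtral 8x7B Instruct; <s> is the beginning-of-sequence token.
    return f"<s>[INST] {user_message.strip()} [/INST]"

prompt = build_instruct_prompt("Write a haiku about open models.")
print(prompt)  # <s>[INST] Write a haiku about open models. [/INST]
```

In practice, libraries such as Hugging Face Transformers can apply this template for you via the tokenizer's chat-template support, which is less error-prone than hand-building strings for multi-turn conversations.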

    Both models are licensed under Apache 2.0.