AI21 has released the Jamba 1.5 family of open models, including Mini and Large versions, under the Jamba Open Model License.
These models feature a 256K-token context window, the longest among open models at the time of release.
They are built on a hybrid architecture that combines Mamba (a state space model), Transformer layers, and Mixture-of-Experts (MoE), yielding strong speed, efficiency, and quality.
Jamba 1.5 supports multiple languages, including English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew.
Jamba natively supports structured JSON output and function calling.
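Because the model can emit well-formed JSON, a tool call can be parsed and dispatched with ordinary JSON handling on the client side. The sketch below is illustrative only: the tool schema follows the JSON-Schema style commonly used for function calling, and the tool name, arguments, and response shape are assumptions, not Jamba's actual wire format.

```python
import json

# Hypothetical tool definition in the JSON-Schema style commonly used
# for function calling. The exact schema Jamba expects may differ.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def dispatch_tool_call(raw: str) -> str:
    """Parse a model's JSON tool-call output and route it to local code."""
    call = json.loads(raw)  # structured JSON output parses directly
    if call["name"] == "get_weather":
        return f"Weather lookup for {call['arguments']['city']}"
    raise ValueError(f"unknown tool: {call['name']}")

# Example model output (illustrative, not a real Jamba response):
model_output = '{"name": "get_weather", "arguments": {"city": "Tel Aviv"}}'
result = dispatch_tool_call(model_output)
print(result)
```

The point of native structured output is exactly this: the response can be fed straight to `json.loads` without regex cleanup or retry loops.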
The models are available through various platforms, including AI21 Studio, Google Cloud Vertex AI, Hugging Face, and OpenRouter.
On benchmarks, Jamba 1.5 Mini outperforms larger models such as Mixtral 8x22B, while Jamba 1.5 Large surpasses Llama 3.1 70B and 405B.
Both models are designed with enterprise applications in mind.