
Mixtral 8x22B

by Mistral AI

Mistral AI's larger mixture-of-experts model, built from 8 experts of 22B parameters each. Only about 39B of its 141B total parameters are active per forward pass, yet it offers significantly improved capabilities over Mixtral 8x7B. Serious hardware is required.

chat · code · reasoning · multilingual
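
Sparse activation means a learned router picks only the top two experts for each token, so most of the model's parameters sit idle on any single forward pass. Below is a minimal sketch of top-2 routing in Python; the names, dimensions, and toy experts are illustrative assumptions, not Mixtral's actual implementation.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_forward(token, router_weights, experts, k=2):
    # The router scores every expert, but only the top-k are evaluated.
    scores = router_weights @ token      # shape: (num_experts,)
    top_k = np.argsort(scores)[-k:]      # indices of the k highest-scoring experts
    gates = softmax(scores[top_k])       # renormalised gating weights
    # Only the selected experts' parameters are touched for this token.
    return sum(g * experts[i](token) for g, i in zip(gates, top_k))

# Illustrative toy setup: 8 small linear "experts" over a 16-dim hidden state.
rng = np.random.default_rng(0)
experts = [lambda x, W=rng.normal(size=(16, 16)): W @ x for _ in range(8)]
router_weights = rng.normal(size=(8, 16))
output = moe_forward(rng.normal(size=16), router_weights, experts)

This is the standard top-k softmax gating used in sparse MoE layers; in Mixtral-style models the routed experts replace the feed-forward block inside each transformer layer.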

Pull the model with the CLI:

ollama pull mixtral:8x22b
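
Once pulled, the model can also be queried programmatically through Ollama's local HTTP API (default port 11434). A minimal Python sketch using the requests library follows; the prompt is just a placeholder.

import requests

# Assumes a local Ollama server is running on its default port.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mixtral:8x22b",
        "prompt": "Explain mixture-of-experts routing in one paragraph.",
        "stream": False,  # return a single JSON reply instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])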

Details

License: Apache 2.0
Released: April 2024
Context: 64K tokens
Downloads: 1.4M