Mixtral 8x22B
by Mistral AI
Mistral's larger MoE model with 8 experts of 22B parameters each. Only about 44B parameters are activated per forward pass, yet it offers significantly improved capabilities over 8x7B. Serious hardware required.
chat · code · reasoning · multilingual
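Why only a fraction of the weights run at once: Mixtral's router picks 2 of the 8 experts for each token. The snippet below is a minimal, toy sketch of top-2 mixture-of-experts routing, not Mixtral's actual implementation; the hidden size, router weights, and gating math are illustrative only.

```python
# Toy sketch of top-2 mixture-of-experts routing (illustrative, not Mixtral's code).
import numpy as np

NUM_EXPERTS = 8   # Mixtral 8x22B has 8 experts
TOP_K = 2         # only 2 experts run per token, so only their FFN weights are active

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def route(token_hidden, router_weights):
    """Pick the top-2 experts for one token and return their normalized gate weights."""
    logits = router_weights @ token_hidden   # one logit per expert
    top = np.argsort(logits)[-TOP_K:]        # indices of the 2 highest-scoring experts
    gates = softmax(logits[top])             # renormalize over the chosen experts
    return top, gates

# Toy usage: 16-dim hidden state, random router weights.
rng = np.random.default_rng(0)
hidden = rng.normal(size=16)
router = rng.normal(size=(NUM_EXPERTS, 16))
experts, gates = route(hidden, router)
print(experts, gates)  # e.g. [3 6] [0.45 0.55] -- only these 2 experts' FFNs run
```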
Download 8x22B in Saga
Don't have Saga? Download it first
Or use the CLI:
ollama pull mixtral:8x22b
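Once the pull finishes, the model can be queried through Ollama's local REST API. A minimal sketch, assuming Ollama is serving on its default port 11434; the prompt is illustrative:

```python
# Query the pulled model via Ollama's local /api/generate endpoint.
import json
import urllib.request

payload = json.dumps({
    "model": "mixtral:8x22b",
    "prompt": "Summarize the Apache 2.0 license in one sentence.",  # illustrative prompt
    "stream": False,  # return the full completion as a single JSON object
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```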
Details
- License: Apache 2.0
- Released: April 2024
- Context: 64K tokens
- Downloads: 1.4M