<p>
This open-source model was created by <a target="_blank" href="https://mistral.ai/">Mistral AI</a>.
You can find the release blog post <a target="_blank" href="https://mistral.ai/news/mixtral-8x22b/">here</a>.
The model is available on the Hugging Face Hub: <a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1">https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1</a>.
The model has 141B total and 39B active parameters, and supports a context length of up to 64K tokens.
</p>
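<p>
A minimal usage sketch for the Hub checkpoint linked above, assuming the Hugging Face <code>transformers</code> library (with <code>accelerate</code> for <code>device_map="auto"</code>) and enough GPU memory for the full 141B-parameter checkpoint:
</p>
<pre><code>from transformers import AutoModelForCausalLM, AutoTokenizer

# Hub repository referenced above.
model_id = "mistralai/Mixtral-8x22B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Assumption: multiple GPUs are available; device_map="auto" shards the model across them.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Build a chat-formatted prompt and generate a short reply.
messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</code></pre>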