This open-source model was created by Mistral AI.
A release blog post is available on Mistral AI's website.
The model is available on the Hugging Face Hub: https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1.
The model is a sparse mixture-of-experts (MoE) architecture with 141B total parameters, of which 39B are active per token. It supports a context window of up to 64K tokens.
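To see why the active parameter count is smaller than the total, here is a toy sketch of sparse top-2 expert routing, the general mechanism behind mixture-of-experts layers. All names, sizes, and the gating scheme below are illustrative assumptions, not Mixtral's actual implementation: each token is routed to only 2 of 8 experts, so only a fraction of the expert weights participate in any single forward pass.

```python
import numpy as np

# Toy sparse MoE layer: 8 experts, top-2 routing per token.
# Sizes and names are illustrative only (not Mixtral's real code).
NUM_EXPERTS, TOP_K, D = 8, 2, 16
rng = np.random.default_rng(0)
experts = [rng.standard_normal((D, D)) for _ in range(NUM_EXPERTS)]  # per-expert weights
router = rng.standard_normal((D, NUM_EXPERTS))                       # gating weights

def moe_forward(x):
    """Route token vector x to its top-2 experts and mix their outputs."""
    logits = x @ router
    top = np.argsort(logits)[-TOP_K:]                        # indices of chosen experts
    gates = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over chosen experts
    out = sum(g * (x @ experts[i]) for g, i in zip(gates, top))
    return out, top

y, chosen = moe_forward(rng.standard_normal(D))
# Only TOP_K of NUM_EXPERTS expert matrices touch each token, so the
# "active" parameter count per token is roughly TOP_K/NUM_EXPERTS of
# the expert-weight total -- analogous to 39B active out of 141B.
print(len(chosen), NUM_EXPERTS)
```

In this toy layer, 2 of the 8 expert matrices are used per token, which mirrors (at miniature scale) how the model's 39B active parameters relate to its 141B total.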