rebnej committed on
Commit 7ea4b77 · verified · 1 Parent(s): d490baf

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +48 -0
README.md ADDED
@@ -0,0 +1,48 @@
---
language:
- en
license: apache-2.0
tags:
- pretrained
pipeline_tag: text-generation
inference:
  parameters:
    temperature: 0.7

extra_gated_description: If you want to learn more about how we process your personal data, please read our <a href="https://mistral.ai/terms/">Privacy Policy</a>.
---

# Model Card for Mistral-7B-v0.1

The Mistral-7B-v0.1 Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters.
Mistral-7B-v0.1 outperforms Llama 2 13B on all benchmarks we tested.

For full details of this model, please read our [paper](https://arxiv.org/abs/2310.06825) and [release blog post](https://mistral.ai/news/announcing-mistral-7b/).

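A minimal generation sketch with the Transformers library (assuming `transformers` >= 4.34.0 and `accelerate` are installed, plus enough GPU memory for a 7B model; the prompt is only illustrative):

```python
# Minimal generation sketch for the base (non-instruct) model.
# Requires transformers >= 4.34.0; device_map="auto" needs the accelerate package.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base model, it will continue the prompt rather than follow chat-style instructions.
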
## Model Architecture

Mistral-7B-v0.1 is a transformer model with the following architecture choices:
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer

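These choices show up directly in the published configuration; the short sketch below reads them back, assuming the standard Transformers `MistralConfig` attribute names (`num_key_value_heads`, `sliding_window`):

```python
# Read architecture-related fields from the hub config (no weights are downloaded).
from transformers import AutoConfig

config = AutoConfig.from_pretrained("mistralai/Mistral-7B-v0.1")

print("query heads:     ", config.num_attention_heads)  # total attention (query) heads
print("key/value heads: ", config.num_key_value_heads)  # fewer KV heads => grouped-query attention
print("sliding window:  ", config.sliding_window)       # attention span, in tokens
print("vocab size:      ", config.vocab_size)           # byte-fallback BPE vocabulary
```

Having fewer key/value heads than query heads is what makes the attention grouped-query, which shrinks the KV cache at inference time.
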
## Troubleshooting

- If you see the following error:
  ```
  KeyError: 'mistral'
  ```
- Or:
  ```
  NotImplementedError: Cannot copy out of meta tensor; no data!
  ```

Ensure you are using a stable version of Transformers, 4.34.0 or newer.

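One quick way to verify the installed version, sketched with the `packaging` helper that ships as a Transformers dependency:

```python
# Check that the installed Transformers version is recent enough.
# If this fails, upgrade with: pip install -U transformers
from packaging import version
import transformers

assert version.parse(transformers.__version__) >= version.parse("4.34.0"), (
    f"Found transformers {transformers.__version__}; 4.34.0 or newer is required."
)
```
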
## Notice

Mistral 7B is a pretrained base model and therefore does not have any moderation mechanisms.

## The Mistral AI Team

Albert Jiang, Alexandre Sablayrolles, Arthur Mensch, Chris Bamford, Devendra Singh Chaplot, Diego de las Casas, Florian Bressand, Gianna Lengyel, Guillaume Lample, Lélio Renard Lavaud, Lucile Saulnier, Marie-Anne Lachaux, Pierre Stock, Teven Le Scao, Thibaut Lavril, Thomas Wang, Timothée Lacroix, William El Sayed.