Text Generation
Transformers
Safetensors
English
rwkv7
custom_code
Commit 9faeb25 (verified) · committed by ZhangRC and nielsr (HF staff) · 1 parent: 0477182

Improve model card: add paper link, project page link, and library name (#1)


- Improve model card: add paper link, project page link, and library name (0d7a3ad5ffdde7bcafcebbcb17f98be1c5d7307f)


Co-authored-by: Niels Rogge <[email protected]>

Files changed (1)
  1. README.md +8 -5
README.md CHANGED

@@ -1,14 +1,15 @@
 ---
-license: apache-2.0
+base_model:
+- BlinkDL/rwkv-7-pile
 datasets:
 - EleutherAI/the_pile_deduplicated
 language:
 - en
+license: apache-2.0
 metrics:
 - accuracy
-base_model:
-- BlinkDL/rwkv-7-pile
 pipeline_tag: text-generation
+library_name: transformers
 ---
 
 # rwkv7-421M-pile
@@ -37,7 +38,9 @@ This is RWKV-7 model under flash-linear attention format.
 <!-- Provide the basic links for the model. -->
 
 - **Repository:** https://github.com/fla-org/flash-linear-attention ; https://github.com/BlinkDL/RWKV-LM
-- **Paper:** https://arxiv.org/abs/2503.14456
+- **Paper:** [RWKV: Parallelizable RNN with Transformer-level LLM Performance](https://huggingface.co/papers/2503.14456)
+- **Project Page:** [RWKV](https://huggingface.co/RWKV)
+
 
 ## Uses
 
@@ -80,4 +83,4 @@ This model is trained on the Pile with a total of 332 billion tokens.
 ## FAQ
 Q: safetensors metadata is none.
 
-A: upgrade transformers to >=4.48.0: `pip install 'transformers>=4.48.0'`
+A: upgrade transformers to >=4.48.0: `pip install 'transformers>=4.48.0'`
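
The FAQ answer turns on whether the installed transformers version satisfies the `>=4.48.0` requirement. A minimal sketch of that version comparison — `parse_version` and `meets_requirement` are hypothetical helper names for illustration, not part of the transformers API:

```python
# Sketch: compare a dotted version string against the transformers >= 4.48.0
# requirement from the FAQ. Assumes plain numeric versions (no rc/dev suffixes).

def parse_version(v: str) -> tuple:
    """Split '4.48.0' into the comparable int tuple (4, 48, 0)."""
    return tuple(int(part) for part in v.split("."))

def meets_requirement(installed: str, minimum: str = "4.48.0") -> bool:
    """True if `installed` is at least `minimum`, via tuple comparison."""
    return parse_version(installed) >= parse_version(minimum)

print(meets_requirement("4.48.0"))  # True
print(meets_requirement("4.47.1"))  # False
```

Python compares tuples element by element, so `(4, 47, 1) < (4, 48, 0)` holds and string pitfalls like `"4.9" > "4.48"` are avoided.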