nielsr (HF staff) committed
Commit e00f6c5 · verified · 1 Parent(s): 1e44595

Add link to SDD paper and code


This PR adds a link to the paper and code repository for the Scale-Distribution Decoupling (SDD) method, which uses the OLMoE Mix dataset for its experiments. This clarifies the relationship between the dataset and the SDD method.
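For context, below is a minimal sketch of how one might peek at a few records of the mix with the `datasets` library. This is not part of the PR; the Hub repository id `allenai/OLMoE-mix-0924` and the `train` split name are assumptions based on the dataset's pretty name, not stated anywhere in this commit.

```python
# Minimal sketch (not part of this PR): stream a few documents from the mix.
# Assumptions: the dataset lives at "allenai/OLMoE-mix-0924" and has a "train" split.
from datasets import load_dataset

# Stream rather than download, since the full mix is very large.
mix = load_dataset("allenai/OLMoE-mix-0924", split="train", streaming=True)

for i, doc in enumerate(mix):
    print(doc)  # each record is one pretraining document
    if i >= 2:
        break
```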

Files changed (1)
  1. README.md +8 -8
README.md CHANGED
@@ -1,11 +1,11 @@
 ---
-task_categories:
-- text-generation
 language:
 - en
+license: odc-by
 size_categories:
 - 1B<n<10B
-license: odc-by
+task_categories:
+- text-generation
 pretty_name: OLMoE Mix (September 2024)
 dataset_info:
   features:
@@ -23,13 +23,13 @@ dataset_info:
 
 ## Dataset Description
 
-- **Repository:** https://github.com/allenai/OLMoE
-- **Paper:** [OLMoE: Open Mixture-of-Experts Language Models](https://arxiv.org/abs/2409.02060)
+This dataset was used to train OLMoE-1B-7B, a Mixture-of-Experts LLM with 1B active and 7B total parameters released in September 2024. The Scale-Distribution Decoupling (SDD) method, presented in [Scale-Distribution Decoupling: Enabling Stable and Effective Training of Large Language Models](https://huggingface.co/papers/2502.15499), utilized this dataset in its experiments. The SDD code is available at [https://github.com/kaihemo/SDD](https://github.com/kaihemo/SDD).
 
+- **Repository (OLMoE):** https://github.com/allenai/OLMoE
+- **Paper (OLMoE):** [OLMoE: Open Mixture-of-Experts Language Models](https://arxiv.org/abs/2409.02060)
 
-<img alt="OLMoE Mix Logo." src="olmoe-mix.png" width="250px">
 
-The following data mix was used to train OLMoE-1B-7B, a Mixture-of-Experts LLM with 1B active and 7B total parameters released in September 2024.
+<img alt="OLMoE Mix Logo." src="olmoe-mix.png" width="250px">
 
 The base version of OLMoE-1B-7B can be found at [this page](https://huggingface.co/allenai/OLMoE-1B-7B-0924), the SFT of OLMoE-1B-7B is available [here](https://huggingface.co/allenai/OLMoE-1B-7B-0924-SFT), and a version combining SFT and DPO is available following [this link](https://huggingface.co/allenai/OLMoE-1B-7B-0924-Instruct).
 
@@ -62,7 +62,7 @@ In addition of the above, Starcoder dataset was further processed by removing an
 
 This mix is licensed under [Open Data Commons Attribution License (ODC-By) v1.0](https://opendatacommons.org/licenses/by/1-0/). By using this dataset, you are bound to licenses and Terms of Services of underlying datasets, which you can access by clicking on the links in the table above.
 
-## Citation
+## Citation (OLMoE)
 
 ```bibtex
 @misc{muennighoff2024olmoeopenmixtureofexpertslanguage,