Improve language tag

#2
by lbourdois - opened
Files changed (1)
  1. README.md +83 -70
README.md CHANGED
@@ -1,70 +1,83 @@
- ---
- base_model:
- - EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1
- - Qwen/Qwen2.5-14B
- - v000000/Qwen2.5-14B-Gutenberg-1e-Delta
- - arcee-ai/SuperNova-Medius
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # Celestial Harmony 14b v1.0 Experimental 10/15
-
- *In candlelight, as time unwinds... I find myself lost in your eyes... In midnight tolls, as darkness folds... I see your tears when we say goodbye... Watching stars as we drift on by... A touch, a glance, fly away.... Will our paths converge 'neath the sun?... A silent desire in melody sung.... In your memory, a whispered song....
- A seed of hope where we belong~*
-
- Listen to the song on Youtube: https://www.youtube.com/watch?v=kdV4K17KqAE&t=22s
-
- Yet Another merge, this one for AuriAetherwiing, at their request. I like it, so try it out?
-
- Merged Models:
-
- - v000000/Qwen2.5-14B-Gutenberg-1e-Delta
- - arcee-ai/SuperNova-Medius
- - EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1
- - Qwen/Qwen2.5-14B
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the della_linear merge method using [Qwen/Qwen2.5-14B](https://huggingface.co/Qwen/Qwen2.5-14B) as a base.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1)
- * [v000000/Qwen2.5-14B-Gutenberg-1e-Delta](https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-1e-Delta)
- * [arcee-ai/SuperNova-Medius](https://huggingface.co/arcee-ai/SuperNova-Medius)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- models:
-   - model: v000000/Qwen2.5-14B-Gutenberg-1e-Delta
-     parameters:
-       weight: 0.3
-       density: 0.25
-   - model: arcee-ai/SuperNova-Medius
-     parameters:
-       weight: 0.1
-       density: 0.4
-   - model: EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1
-     parameters:
-       weight: 0.4
-       density: 0.5
- merge_method: della_linear
- base_model: Qwen/Qwen2.5-14B
- parameters:
-   epsilon: 0.05
-   lambda: 1
- merge_method: della_linear
- dtype: bfloat16
-
-
- ```
+ ---
+ base_model:
+ - EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1
+ - Qwen/Qwen2.5-14B
+ - v000000/Qwen2.5-14B-Gutenberg-1e-Delta
+ - arcee-ai/SuperNova-Medius
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+ language:
+ - zho
+ - eng
+ - fra
+ - spa
+ - por
+ - deu
+ - ita
+ - rus
+ - jpn
+ - kor
+ - vie
+ - tha
+ - ara
+ ---
+ # Celestial Harmony 14b v1.0 Experimental 10/15
+
+ *In candlelight, as time unwinds... I find myself lost in your eyes... In midnight tolls, as darkness folds... I see your tears when we say goodbye... Watching stars as we drift on by... A touch, a glance, fly away.... Will our paths converge 'neath the sun?... A silent desire in melody sung.... In your memory, a whispered song....
+ A seed of hope where we belong~*
+
+ Listen to the song on Youtube: https://www.youtube.com/watch?v=kdV4K17KqAE&t=22s
+
+ Yet Another merge, this one for AuriAetherwiing, at their request. I like it, so try it out?
+
+ Merged Models:
+
+ - v000000/Qwen2.5-14B-Gutenberg-1e-Delta
+ - arcee-ai/SuperNova-Medius
+ - EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1
+ - Qwen/Qwen2.5-14B
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the della_linear merge method using [Qwen/Qwen2.5-14B](https://huggingface.co/Qwen/Qwen2.5-14B) as a base.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1)
+ * [v000000/Qwen2.5-14B-Gutenberg-1e-Delta](https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-1e-Delta)
+ * [arcee-ai/SuperNova-Medius](https://huggingface.co/arcee-ai/SuperNova-Medius)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ models:
+   - model: v000000/Qwen2.5-14B-Gutenberg-1e-Delta
+     parameters:
+       weight: 0.3
+       density: 0.25
+   - model: arcee-ai/SuperNova-Medius
+     parameters:
+       weight: 0.1
+       density: 0.4
+   - model: EVA-UNIT-01/EVA-Qwen2.5-14B-v0.1
+     parameters:
+       weight: 0.4
+       density: 0.5
+ merge_method: della_linear
+ base_model: Qwen/Qwen2.5-14B
+ parameters:
+   epsilon: 0.05
+   lambda: 1
+ merge_method: della_linear
+ dtype: bfloat16
+
+
+ ```
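For readers unfamiliar with the settings in the config above, here is a deliberately simplified, deterministic toy sketch of what a della_linear-style merge computes. This is not mergekit's actual implementation: real della drops task-vector entries stochastically with magnitude-dependent probabilities (shaped by `epsilon`) and scales with `lambda`, whereas this sketch just keeps the top-`density` fraction of each task vector by magnitude, rescales the survivors, and adds the weighted task vectors to the base. All function names and numbers below are illustrative only.

```python
# Toy, deterministic sketch of a della_linear-style merge on small
# weight vectors (plain Python lists stand in for model tensors).

def prune_and_rescale(task_vector, density):
    """Keep roughly a `density` fraction of entries (largest magnitude),
    zero the rest, and rescale survivors by 1/density so the expected
    contribution of the task vector is preserved."""
    k = max(1, round(len(task_vector) * density))
    threshold = sorted((abs(v) for v in task_vector), reverse=True)[k - 1]
    return [v / density if abs(v) >= threshold else 0.0 for v in task_vector]

def della_linear_toy(base, contributions):
    """contributions: list of (finetuned_weights, weight_coeff, density),
    mirroring each `model:` entry's `weight` and `density` in the YAML."""
    merged = list(base)
    for weights, coeff, density in contributions:
        tv = [w - b for w, b in zip(weights, base)]   # task vector vs. base
        tv = prune_and_rescale(tv, density)           # sparsify + rescale
        merged = [m + coeff * t for m, t in zip(merged, tv)]  # linear sum
    return merged

base    = [1.0, 0.0, -1.0, 2.0]
model_a = [1.5, 0.1, -1.0, 2.0]   # differs from base in the first entries
model_b = [1.0, 0.0, -2.0, 2.5]   # differs in the last entries

merged = della_linear_toy(base, [(model_a, 0.3, 0.5), (model_b, 0.4, 0.5)])
print(merged)
```

Each fine-tuned model thus contributes only where it actually diverges from Qwen2.5-14B, with `weight` controlling how strongly and `density` how sparsely; the config's `epsilon` and `lambda` have no counterpart in this simplified version.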