ChayanM committed on
Commit 451f6fd · verified · 1 Parent(s): 667f155

Model save

Files changed (3):
  1. README.md +111 -0
  2. generation_config.json +5 -0
  3. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,111 @@
---
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: Image_Captioner
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Image_Captioner

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0975
- Rouge1: 24.6871
- Rouge2: 9.5762
- Rougel: 20.8694
- Rougelsum: 23.5961
- Gen Len: 18.9192

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

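The linear scheduler above decays the learning rate from 5e-05 toward zero over the course of training. A minimal sketch of that schedule (assuming no warmup, since none is listed; `linear_lr` is an illustrative helper, not a Trainer API):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 5e-05,
              warmup_steps: int = 0) -> float:
    """Linear schedule: optional warmup, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# 50 epochs x 558 steps/epoch = 27900 steps, per the results table below.
total = 27900
print(linear_lr(0, total))       # full base rate at the start
print(linear_lr(13950, total))   # half the base rate at the midpoint
```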
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 0.2412 | 1.0 | 558 | 0.1433 | 25.8229 | 12.3048 | 23.7573 | 25.4769 | 19.0 |
| 0.1504 | 2.0 | 1116 | 0.1213 | 25.6896 | 12.1913 | 23.6153 | 25.3363 | 19.0 |
| 0.132 | 3.0 | 1674 | 0.1099 | 26.1044 | 12.1738 | 23.7995 | 25.4389 | 19.0 |
| 0.1165 | 4.0 | 2232 | 0.1018 | 24.4958 | 11.6079 | 21.9255 | 24.1601 | 19.0 |
| 0.105 | 5.0 | 2790 | 0.0942 | 26.0341 | 12.2669 | 23.7171 | 25.6348 | 19.0 |
| 0.0942 | 6.0 | 3348 | 0.0874 | 25.3352 | 10.6409 | 22.3911 | 24.6749 | 19.0 |
| 0.0847 | 7.0 | 3906 | 0.0825 | 24.3455 | 11.1631 | 22.0193 | 23.8951 | 19.0 |
| 0.0764 | 8.0 | 4464 | 0.0782 | 25.4345 | 11.7359 | 23.0169 | 24.8774 | 19.0 |
| 0.0615 | 9.0 | 5022 | 0.0742 | 26.5655 | 12.5315 | 24.0201 | 26.0177 | 19.0 |
| 0.0546 | 10.0 | 5580 | 0.0714 | 26.984 | 10.977 | 23.3161 | 25.9544 | 19.0 |
| 0.0483 | 11.0 | 6138 | 0.0689 | 26.2815 | 11.5641 | 23.1829 | 25.4578 | 19.0 |
| 0.044 | 12.0 | 6696 | 0.0663 | 25.2328 | 11.3217 | 22.4545 | 24.653 | 19.0 |
| 0.0383 | 13.0 | 7254 | 0.0648 | 25.9672 | 10.9082 | 22.8064 | 25.1251 | 19.0 |
| 0.0351 | 14.0 | 7812 | 0.0660 | 26.0833 | 11.0382 | 22.6573 | 25.2428 | 19.0 |
| 0.0313 | 15.0 | 8370 | 0.0658 | 26.7009 | 10.9455 | 22.845 | 25.6707 | 19.0 |
| 0.0276 | 16.0 | 8928 | 0.0659 | 26.1769 | 10.6049 | 22.4903 | 25.321 | 19.0 |
| 0.0246 | 17.0 | 9486 | 0.0661 | 26.1478 | 10.3981 | 22.4809 | 25.1318 | 19.0 |
| 0.0197 | 18.0 | 10044 | 0.0682 | 25.3438 | 10.4852 | 22.1182 | 24.5048 | 19.0 |
| 0.0178 | 19.0 | 10602 | 0.0689 | 25.2217 | 9.6912 | 21.4433 | 24.1131 | 19.0 |
| 0.0159 | 20.0 | 11160 | 0.0707 | 24.6521 | 9.4214 | 21.0511 | 23.6559 | 18.9771 |
| 0.0148 | 21.0 | 11718 | 0.0725 | 24.6477 | 9.8393 | 21.2375 | 23.8722 | 18.9933 |
| 0.0133 | 22.0 | 12276 | 0.0723 | 25.2483 | 9.9924 | 21.6189 | 24.2124 | 18.9933 |
| 0.0121 | 23.0 | 12834 | 0.0741 | 24.3834 | 9.8801 | 21.0599 | 23.5916 | 18.9664 |
| 0.0114 | 24.0 | 13392 | 0.0757 | 25.0727 | 9.7857 | 21.3046 | 24.0167 | 18.9758 |
| 0.0103 | 25.0 | 13950 | 0.0774 | 25.1959 | 10.1108 | 21.5608 | 24.1292 | 18.9098 |
| 0.0089 | 26.0 | 14508 | 0.0783 | 25.5931 | 9.9812 | 21.6953 | 24.3561 | 18.9219 |
| 0.0083 | 27.0 | 15066 | 0.0793 | 24.8603 | 10.0231 | 21.2615 | 23.9145 | 18.9879 |
| 0.0076 | 28.0 | 15624 | 0.0802 | 24.741 | 9.6977 | 21.112 | 23.8097 | 18.9367 |
| 0.0074 | 29.0 | 16182 | 0.0812 | 24.0656 | 9.4335 | 20.6021 | 23.0172 | 18.8748 |
| 0.0067 | 30.0 | 16740 | 0.0838 | 24.9923 | 9.9583 | 21.2749 | 23.9427 | 18.9556 |
| 0.0063 | 31.0 | 17298 | 0.0844 | 24.8869 | 9.6309 | 21.0218 | 23.7523 | 18.8789 |
| 0.0058 | 32.0 | 17856 | 0.0870 | 24.8009 | 9.9887 | 21.0596 | 23.71 | 18.9139 |
| 0.0054 | 33.0 | 18414 | 0.0879 | 24.9076 | 9.663 | 21.0755 | 23.8641 | 18.9287 |
| 0.0052 | 34.0 | 18972 | 0.0902 | 25.0668 | 9.5739 | 21.2282 | 23.9928 | 18.9044 |
| 0.0044 | 35.0 | 19530 | 0.0908 | 25.0616 | 10.0034 | 21.6482 | 23.9978 | 18.9098 |
| 0.0041 | 36.0 | 20088 | 0.0912 | 25.0681 | 10.099 | 21.4527 | 24.0219 | 18.8573 |
| 0.0039 | 37.0 | 20646 | 0.0916 | 24.6263 | 9.7547 | 20.8695 | 23.5722 | 18.9367 |
| 0.0037 | 38.0 | 21204 | 0.0922 | 24.6973 | 9.6421 | 21.1171 | 23.733 | 18.9435 |
| 0.0034 | 39.0 | 21762 | 0.0929 | 25.3821 | 9.8435 | 21.4803 | 24.3296 | 18.8439 |
| 0.0032 | 40.0 | 22320 | 0.0944 | 25.2386 | 9.9245 | 21.4207 | 24.1773 | 18.9287 |
| 0.003 | 41.0 | 22878 | 0.0947 | 25.2413 | 10.0581 | 21.5136 | 24.146 | 18.9623 |
| 0.0028 | 42.0 | 23436 | 0.0958 | 25.1041 | 9.8452 | 21.2494 | 24.0197 | 18.9166 |
| 0.0027 | 43.0 | 23994 | 0.0960 | 24.8 | 9.932 | 21.1541 | 23.7546 | 18.9529 |
| 0.0024 | 44.0 | 24552 | 0.0965 | 25.1426 | 10.0351 | 21.3824 | 24.0439 | 18.9341 |
| 0.0023 | 45.0 | 25110 | 0.0964 | 24.837 | 9.7853 | 21.0592 | 23.7822 | 18.9421 |
| 0.0022 | 46.0 | 25668 | 0.0968 | 24.8325 | 9.8007 | 20.976 | 23.7022 | 18.9596 |
| 0.0021 | 47.0 | 26226 | 0.0972 | 24.7028 | 9.6921 | 21.0038 | 23.6658 | 18.9394 |
| 0.0021 | 48.0 | 26784 | 0.0974 | 24.7233 | 9.77 | 21.0449 | 23.6333 | 18.9260 |
| 0.002 | 49.0 | 27342 | 0.0977 | 24.7481 | 9.6823 | 20.9024 | 23.6522 | 18.9300 |
| 0.0019 | 50.0 | 27900 | 0.0975 | 24.6871 | 9.5762 | 20.8694 | 23.5961 | 18.9192 |

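The Rouge1 column above is, at its core, unigram-overlap F1 between a generated caption and its reference. A simplified pure-Python sketch of that score (ignoring the stemming and bootstrap aggregation the actual `rouge` metric applies, so values will not match the table exactly):

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap F1: a simplified ROUGE-1 without stemming."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("a cat on a mat", "a cat sat on the mat"))  # 8/11 ≈ 0.727
```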
### Framework versions

- Transformers 4.37.1
- Pytorch 1.13.1+cu117
- Datasets 2.15.0
- Tokenizers 0.15.1
generation_config.json ADDED
@@ -0,0 +1,5 @@
{
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.37.1"
}
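Token id 50256 is GPT-2's `<|endoftext|>` token, used here as both BOS and EOS, which suggests a GPT-2-style decoder (an inference from the ids, not stated in the card). A small sketch of reading and sanity-checking this file with the standard library:

```python
import json

# The generation defaults saved alongside the model, verbatim.
raw = '{"bos_token_id": 50256, "eos_token_id": 50256, "transformers_version": "4.37.1"}'
cfg = json.loads(raw)

# Both special tokens map to the same id, GPT-2's <|endoftext|>.
assert cfg["bos_token_id"] == cfg["eos_token_id"] == 50256
print(cfg["transformers_version"])  # 4.37.1
```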
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:a94d81852b15d8e152839d4538c1bb5fa8cf23bb7a87a27f14c9606eb0e14f6f
+oid sha256:096d80684b244d8287c6bdce6821476a28cae397573b3e45705b8bdf9103b69c
 size 956835520
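What the diff above changes is a Git LFS pointer file, not the weights themselves: only the sha256 oid of the tracked blob differs, while the size stays the same. A sketch of parsing such a pointer (`parse_lfs_pointer` is an illustrative helper):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs pointer file: one 'key value' pair per line."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:096d80684b244d8287c6bdce6821476a28cae397573b3e45705b8bdf9103b69c
size 956835520
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 956835520 bytes, ~0.9 GB of weights
```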