ayushexel committed
Commit 282edd8 (verified) · Parent: 37f7a02

Add new CrossEncoder model

README.md ADDED
@@ -0,0 +1,493 @@
1
+ ---
2
+ language:
3
+ - en
4
+ license: apache-2.0
5
+ tags:
6
+ - sentence-transformers
7
+ - cross-encoder
8
+ - generated_from_trainer
9
+ - dataset_size:2223773
10
+ - loss:BinaryCrossEntropyLoss
11
+ base_model: cross-encoder/ms-marco-MiniLM-L6-v2
12
+ pipeline_tag: text-ranking
13
+ library_name: sentence-transformers
14
+ metrics:
15
+ - map
16
+ - mrr@10
17
+ - ndcg@10
18
+ model-index:
19
+ - name: ms-marco-MiniLM-L6-v2 reranker trained on GooAQ
20
+ results:
21
+ - task:
22
+ type: cross-encoder-reranking
23
+ name: Cross Encoder Reranking
24
+ dataset:
25
+ name: gooaq dev
26
+ type: gooaq-dev
27
+ metrics:
28
+ - type: map
29
+ value: 0.638
30
+ name: Map
31
+ - type: mrr@10
32
+ value: 0.6361
33
+ name: Mrr@10
34
+ - type: ndcg@10
35
+ value: 0.6822
36
+ name: Ndcg@10
37
+ - task:
38
+ type: cross-encoder-reranking
39
+ name: Cross Encoder Reranking
40
+ dataset:
41
+ name: NanoMSMARCO R100
42
+ type: NanoMSMARCO_R100
43
+ metrics:
44
+ - type: map
45
+ value: 0.5437
46
+ name: Map
47
+ - type: mrr@10
48
+ value: 0.5348
49
+ name: Mrr@10
50
+ - type: ndcg@10
51
+ value: 0.606
52
+ name: Ndcg@10
53
+ - task:
54
+ type: cross-encoder-reranking
55
+ name: Cross Encoder Reranking
56
+ dataset:
57
+ name: NanoNFCorpus R100
58
+ type: NanoNFCorpus_R100
59
+ metrics:
60
+ - type: map
61
+ value: 0.3885
62
+ name: Map
63
+ - type: mrr@10
64
+ value: 0.563
65
+ name: Mrr@10
66
+ - type: ndcg@10
67
+ value: 0.4077
68
+ name: Ndcg@10
69
+ - task:
70
+ type: cross-encoder-reranking
71
+ name: Cross Encoder Reranking
72
+ dataset:
73
+ name: NanoNQ R100
74
+ type: NanoNQ_R100
75
+ metrics:
76
+ - type: map
77
+ value: 0.4626
78
+ name: Map
79
+ - type: mrr@10
80
+ value: 0.4628
81
+ name: Mrr@10
82
+ - type: ndcg@10
83
+ value: 0.5091
84
+ name: Ndcg@10
85
+ - task:
86
+ type: cross-encoder-nano-beir
87
+ name: Cross Encoder Nano BEIR
88
+ dataset:
89
+ name: NanoBEIR R100 mean
90
+ type: NanoBEIR_R100_mean
91
+ metrics:
92
+ - type: map
93
+ value: 0.4649
94
+ name: Map
95
+ - type: mrr@10
96
+ value: 0.5202
97
+ name: Mrr@10
98
+ - type: ndcg@10
99
+ value: 0.5076
100
+ name: Ndcg@10
101
+ ---
102
+
103
+ # ms-marco-MiniLM-L6-v2 reranker trained on GooAQ
104
+
105
+ This is a [Cross Encoder](https://www.sbert.net/docs/cross_encoder/usage/usage.html) model finetuned from [cross-encoder/ms-marco-MiniLM-L6-v2](https://huggingface.co/cross-encoder/ms-marco-MiniLM-L6-v2) using the [sentence-transformers](https://www.SBERT.net) library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.
106
+
107
+ ## Model Details
108
+
109
+ ### Model Description
110
+ - **Model Type:** Cross Encoder
111
+ - **Base model:** [cross-encoder/ms-marco-MiniLM-L6-v2](https://huggingface.co/cross-encoder/ms-marco-MiniLM-L6-v2) <!-- at revision fbf9045f293a58fa68636213c5e0cb8a2de5d45e -->
112
+ - **Maximum Sequence Length:** 512 tokens
113
+ - **Number of Output Labels:** 1 label
114
+ <!-- - **Training Dataset:** Unknown -->
115
+ - **Language:** en
116
+ - **License:** apache-2.0
117
+
118
+ ### Model Sources
119
+
120
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
121
+ - **Documentation:** [Cross Encoder Documentation](https://www.sbert.net/docs/cross_encoder/usage/usage.html)
122
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
123
+ - **Hugging Face:** [Cross Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=cross-encoder)
124
+
125
+ ## Usage
126
+
127
+ ### Direct Usage (Sentence Transformers)
128
+
129
+ First install the Sentence Transformers library:
130
+
131
+ ```bash
132
+ pip install -U sentence-transformers
133
+ ```
134
+
135
+ Then you can load this model and run inference.
136
+ ```python
137
+ from sentence_transformers import CrossEncoder
138
+
139
+ # Download from the 🤗 Hub
140
+ model = CrossEncoder("ayushexel/reranker-ms-marco-MiniLM-L6-v2-gooaq-bce")
141
+ # Get scores for pairs of texts
142
+ pairs = [
143
+ ['what does it mean when you get a sharp pain in your left arm?', 'Pain in the left arm A pain in your left arm could mean you have a bone or joint injury, a pinched nerve, or a problem with your heart. Read on to learn more about the causes of left arm pain and what symptoms could signal a serious problem.'],
144
+ ['what does it mean when you get a sharp pain in your left arm?', "In this Article Whether it's throbbing, aching, or sharp, everyone has been in pain. The uncomfortable sensation is a red flag. Pain in your armpit could mean that you've simply strained a muscle, which is eased with ice and rest. It could also be a sign of more serious conditions, like an infection or breast cancer."],
145
+ ['what does it mean when you get a sharp pain in your left arm?', 'Sharp: When you feel a sudden, intense spike of pain, that qualifies as “sharp.” Sharp pain may also fit the descriptors cutting and shooting. Stabbing: Like sharp pain, stabbing pain occurs suddenly and intensely. However, stabbing pain may fade and reoccur many times.'],
146
+ ['what does it mean when you get a sharp pain in your left arm?', 'Symptoms. A herniated disc in the neck can cause neck pain, radiating arm pain, shoulder pain, and numbness or tingling in the arm or hand. The quality and type of pain can vary from dull, aching, and difficult to localize to sharp, burning, and easy to pinpoint.'],
147
+ ['what does it mean when you get a sharp pain in your left arm?', 'Injuries or trauma to any part of the arm or shoulder, including bone fractures, joint dislocations, and muscle strains and sprains, are common causes of arm pain. Sometimes diseases that affect other organs in the body, like peripheral vascular disease or arthritis, can be the cause of pain in the arm.'],
148
+ ]
149
+ scores = model.predict(pairs)
150
+ print(scores.shape)
151
+ # (5,)
152
+
153
+ # Or rank different texts based on similarity to a single text
154
+ ranks = model.rank(
155
+ 'what does it mean when you get a sharp pain in your left arm?',
156
+ [
157
+ 'Pain in the left arm A pain in your left arm could mean you have a bone or joint injury, a pinched nerve, or a problem with your heart. Read on to learn more about the causes of left arm pain and what symptoms could signal a serious problem.',
158
+ "In this Article Whether it's throbbing, aching, or sharp, everyone has been in pain. The uncomfortable sensation is a red flag. Pain in your armpit could mean that you've simply strained a muscle, which is eased with ice and rest. It could also be a sign of more serious conditions, like an infection or breast cancer.",
159
+ 'Sharp: When you feel a sudden, intense spike of pain, that qualifies as “sharp.” Sharp pain may also fit the descriptors cutting and shooting. Stabbing: Like sharp pain, stabbing pain occurs suddenly and intensely. However, stabbing pain may fade and reoccur many times.',
160
+ 'Symptoms. A herniated disc in the neck can cause neck pain, radiating arm pain, shoulder pain, and numbness or tingling in the arm or hand. The quality and type of pain can vary from dull, aching, and difficult to localize to sharp, burning, and easy to pinpoint.',
161
+ 'Injuries or trauma to any part of the arm or shoulder, including bone fractures, joint dislocations, and muscle strains and sprains, are common causes of arm pain. Sometimes diseases that affect other organs in the body, like peripheral vascular disease or arthritis, can be the cause of pain in the arm.',
162
+ ]
163
+ )
164
+ # [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
165
+ ```
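+
+ For semantic search over a larger corpus, a cross-encoder is usually paired with a fast bi-encoder: the bi-encoder retrieves candidate passages and this model rescores them. The sketch below illustrates that retrieve-then-rerank pattern; the retriever model (`sentence-transformers/all-MiniLM-L6-v2`) and the tiny in-memory corpus are illustrative choices, not part of this model card.
+
+ ```python
+ from sentence_transformers import CrossEncoder, SentenceTransformer, util
+
+ # Illustrative mini-corpus; in practice this is your document collection.
+ corpus = [
+     "Pain in the left arm could mean a bone or joint injury, a pinched nerve, or a problem with your heart.",
+     "A herniated disc in the neck can cause neck pain, radiating arm pain, and numbness or tingling in the hand.",
+     "Stabbing pain occurs suddenly and intensely, but may fade and reoccur many times.",
+ ]
+ query = "what does it mean when you get a sharp pain in your left arm?"
+
+ # 1) Retrieve candidates with a bi-encoder (example model, swap in your own retriever).
+ retriever = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
+ corpus_emb = retriever.encode(corpus, convert_to_tensor=True)
+ query_emb = retriever.encode(query, convert_to_tensor=True)
+ hits = util.semantic_search(query_emb, corpus_emb, top_k=3)[0]
+
+ # 2) Rescore the retrieved candidates with this cross-encoder and sort by score.
+ reranker = CrossEncoder("ayushexel/reranker-ms-marco-MiniLM-L6-v2-gooaq-bce")
+ pairs = [(query, corpus[hit["corpus_id"]]) for hit in hits]
+ scores = reranker.predict(pairs)
+ for score, hit in sorted(zip(scores, hits), key=lambda x: x[0], reverse=True):
+     print(f"{score:.3f}  {corpus[hit['corpus_id']]}")
+ ```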
166
+
167
+ <!--
168
+ ### Direct Usage (Transformers)
169
+
170
+ <details><summary>Click to see the direct usage in Transformers</summary>
171
+
172
+ </details>
173
+ -->
174
+
175
+ <!--
176
+ ### Downstream Usage (Sentence Transformers)
177
+
178
+ You can finetune this model on your own dataset.
179
+
180
+ <details><summary>Click to expand</summary>
181
+
182
+ </details>
183
+ -->
184
+
185
+ <!--
186
+ ### Out-of-Scope Use
187
+
188
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
189
+ -->
190
+
191
+ ## Evaluation
192
+
193
+ ### Metrics
194
+
195
+ #### Cross Encoder Reranking
196
+
197
+ * Dataset: `gooaq-dev`
198
+ * Evaluated with [<code>CrossEncoderRerankingEvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderRerankingEvaluator) with these parameters:
199
+ ```json
200
+ {
201
+ "at_k": 10,
202
+ "always_rerank_positives": false
203
+ }
204
+ ```
205
+
206
+ | Metric | Value |
207
+ |:------------|:---------------------|
208
+ | map | 0.6380 (+0.2121) |
209
+ | mrr@10 | 0.6361 (+0.2199) |
210
+ | **ndcg@10** | **0.6822 (+0.2001)** |
211
+
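+ The snippet below is a minimal sketch of how an evaluation like this can be set up with `CrossEncoderRerankingEvaluator`; the sample data is invented for illustration, and the query/positive/negative sample format is assumed from the evaluator documentation linked above.
+
+ ```python
+ from sentence_transformers import CrossEncoder
+ from sentence_transformers.cross_encoder.evaluation import CrossEncoderRerankingEvaluator
+
+ model = CrossEncoder("ayushexel/reranker-ms-marco-MiniLM-L6-v2-gooaq-bce")
+
+ # Each sample pairs a query with its known positive passages and candidate negatives (toy data).
+ samples = [
+     {
+         "query": "what does it mean when you get a sharp pain in your left arm?",
+         "positive": ["Pain in the left arm could mean a bone or joint injury, a pinched nerve, or a heart problem."],
+         "negative": ["Stabbing pain occurs suddenly and intensely, but may fade and reoccur many times."],
+     },
+ ]
+
+ evaluator = CrossEncoderRerankingEvaluator(
+     samples=samples,
+     at_k=10,
+     always_rerank_positives=False,
+     name="gooaq-dev",
+ )
+ results = evaluator(model)  # dict of reranking metrics (map, mrr@10, ndcg@10)
+ print(results)
+ ```
+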
212
+ #### Cross Encoder Reranking
213
+
214
+ * Datasets: `NanoMSMARCO_R100`, `NanoNFCorpus_R100` and `NanoNQ_R100`
215
+ * Evaluated with [<code>CrossEncoderRerankingEvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderRerankingEvaluator) with these parameters:
216
+ ```json
217
+ {
218
+ "at_k": 10,
219
+ "always_rerank_positives": true
220
+ }
221
+ ```
222
+
223
+ | Metric | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100 |
224
+ |:------------|:---------------------|:---------------------|:---------------------|
225
+ | map | 0.5437 (+0.0541) | 0.3885 (+0.1275) | 0.4626 (+0.0430) |
226
+ | mrr@10 | 0.5348 (+0.0573) | 0.5630 (+0.0632) | 0.4628 (+0.0361) |
227
+ | **ndcg@10** | **0.6060 (+0.0655)** | **0.4077 (+0.0827)** | **0.5091 (+0.0084)** |
228
+
229
+ #### Cross Encoder Nano BEIR
230
+
231
+ * Dataset: `NanoBEIR_R100_mean`
232
+ * Evaluated with [<code>CrossEncoderNanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderNanoBEIREvaluator) with these parameters:
233
+ ```json
234
+ {
235
+ "dataset_names": [
236
+ "msmarco",
237
+ "nfcorpus",
238
+ "nq"
239
+ ],
240
+ "rerank_k": 100,
241
+ "at_k": 10,
242
+ "always_rerank_positives": true
243
+ }
244
+ ```
245
+
246
+ | Metric | Value |
247
+ |:------------|:---------------------|
248
+ | map | 0.4649 (+0.0749) |
249
+ | mrr@10 | 0.5202 (+0.0522) |
250
+ | **ndcg@10** | **0.5076 (+0.0522)** |
251
+
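+ The NanoBEIR numbers above can be approximated with `CrossEncoderNanoBEIREvaluator`, configured with the parameters shown in the JSON block. Treat this as a rough sketch rather than the exact evaluation script used for this card:
+
+ ```python
+ from sentence_transformers import CrossEncoder
+ from sentence_transformers.cross_encoder.evaluation import CrossEncoderNanoBEIREvaluator
+
+ model = CrossEncoder("ayushexel/reranker-ms-marco-MiniLM-L6-v2-gooaq-bce")
+
+ # Reranks the top `rerank_k` candidates on the small NanoBEIR subsets listed in the card.
+ evaluator = CrossEncoderNanoBEIREvaluator(
+     dataset_names=["msmarco", "nfcorpus", "nq"],
+     rerank_k=100,
+     at_k=10,
+     always_rerank_positives=True,
+ )
+ results = evaluator(model)
+ print(results)  # per-dataset metrics plus the NanoBEIR mean reported above
+ ```
+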
252
+ <!--
253
+ ## Bias, Risks and Limitations
254
+
255
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
256
+ -->
257
+
258
+ <!--
259
+ ### Recommendations
260
+
261
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
262
+ -->
263
+
264
+ ## Training Details
265
+
266
+ ### Training Dataset
267
+
268
+ #### Unnamed Dataset (GooAQ question–answer pairs)
269
+
270
+ * Size: 2,223,773 training samples
271
+ * Columns: <code>question</code>, <code>answer</code>, and <code>label</code>
272
+ * Approximate statistics based on the first 1000 samples:
273
+ | | question | answer | label |
274
+ |:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:------------------------------------------------|
275
+ | type | string | string | int |
276
+ | details | <ul><li>min: 19 characters</li><li>mean: 45.87 characters</li><li>max: 88 characters</li></ul> | <ul><li>min: 61 characters</li><li>mean: 253.13 characters</li><li>max: 374 characters</li></ul> | <ul><li>0: ~86.70%</li><li>1: ~13.30%</li></ul> |
277
+ * Samples:
278
+ | question | answer | label |
279
+ |:---------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
280
+ | <code>what does it mean when you get a sharp pain in your left arm?</code> | <code>Pain in the left arm A pain in your left arm could mean you have a bone or joint injury, a pinched nerve, or a problem with your heart. Read on to learn more about the causes of left arm pain and what symptoms could signal a serious problem.</code> | <code>1</code> |
281
+ | <code>what does it mean when you get a sharp pain in your left arm?</code> | <code>In this Article Whether it's throbbing, aching, or sharp, everyone has been in pain. The uncomfortable sensation is a red flag. Pain in your armpit could mean that you've simply strained a muscle, which is eased with ice and rest. It could also be a sign of more serious conditions, like an infection or breast cancer.</code> | <code>0</code> |
282
+ | <code>what does it mean when you get a sharp pain in your left arm?</code> | <code>Sharp: When you feel a sudden, intense spike of pain, that qualifies as “sharp.” Sharp pain may also fit the descriptors cutting and shooting. Stabbing: Like sharp pain, stabbing pain occurs suddenly and intensely. However, stabbing pain may fade and reoccur many times.</code> | <code>0</code> |
283
+ * Loss: [<code>BinaryCrossEntropyLoss</code>](https://sbert.net/docs/package_reference/cross_encoder/losses.html#binarycrossentropyloss) with these parameters:
284
+ ```json
285
+ {
286
+ "activation_fn": "torch.nn.modules.linear.Identity",
287
+ "pos_weight": 7
288
+ }
289
+ ```
290
+
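+ As a rough sketch of how a reranker like this is trained with `BinaryCrossEntropyLoss` (`pos_weight: 7`), the snippet below sets up a `CrossEncoderTrainer` on toy `question`/`answer`/`label` rows. The toy dataset and output path are stand-ins for the 2.2M-pair training set; the actual hyperparameters are listed in the next subsection.
+
+ ```python
+ import torch
+ from datasets import Dataset
+ from sentence_transformers.cross_encoder import CrossEncoder, CrossEncoderTrainer, CrossEncoderTrainingArguments
+ from sentence_transformers.cross_encoder.losses import BinaryCrossEntropyLoss
+
+ # Toy (question, answer, label) rows standing in for the full GooAQ-style training set.
+ train_dataset = Dataset.from_dict({
+     "question": ["what does it mean when you get a sharp pain in your left arm?"] * 2,
+     "answer": [
+         "Pain in the left arm could mean a bone or joint injury, a pinched nerve, or a heart problem.",
+         "Stabbing pain occurs suddenly and intensely, but may fade and reoccur many times.",
+     ],
+     "label": [1, 0],
+ })
+
+ model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L6-v2")
+ loss = BinaryCrossEntropyLoss(model, pos_weight=torch.tensor(7.0))
+
+ args = CrossEncoderTrainingArguments(
+     output_dir="reranker-ms-marco-MiniLM-L6-v2-gooaq-bce",  # illustrative output path
+     num_train_epochs=3,
+     per_device_train_batch_size=2048,  # from the card; scale down for local experiments
+     learning_rate=2e-5,
+     warmup_ratio=0.1,
+     bf16=True,
+     seed=12,
+ )
+ trainer = CrossEncoderTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss)
+ trainer.train()
+ ```
+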
291
+ ### Training Hyperparameters
292
+ #### Non-Default Hyperparameters
293
+
294
+ - `eval_strategy`: steps
295
+ - `per_device_train_batch_size`: 2048
296
+ - `per_device_eval_batch_size`: 2048
297
+ - `learning_rate`: 2e-05
298
+ - `warmup_ratio`: 0.1
299
+ - `seed`: 12
300
+ - `bf16`: True
301
+ - `dataloader_num_workers`: 12
302
+ - `load_best_model_at_end`: True
303
+
304
+ #### All Hyperparameters
305
+ <details><summary>Click to expand</summary>
306
+
307
+ - `overwrite_output_dir`: False
308
+ - `do_predict`: False
309
+ - `eval_strategy`: steps
310
+ - `prediction_loss_only`: True
311
+ - `per_device_train_batch_size`: 2048
312
+ - `per_device_eval_batch_size`: 2048
313
+ - `per_gpu_train_batch_size`: None
314
+ - `per_gpu_eval_batch_size`: None
315
+ - `gradient_accumulation_steps`: 1
316
+ - `eval_accumulation_steps`: None
317
+ - `torch_empty_cache_steps`: None
318
+ - `learning_rate`: 2e-05
319
+ - `weight_decay`: 0.0
320
+ - `adam_beta1`: 0.9
321
+ - `adam_beta2`: 0.999
322
+ - `adam_epsilon`: 1e-08
323
+ - `max_grad_norm`: 1.0
324
+ - `num_train_epochs`: 3
325
+ - `max_steps`: -1
326
+ - `lr_scheduler_type`: linear
327
+ - `lr_scheduler_kwargs`: {}
328
+ - `warmup_ratio`: 0.1
329
+ - `warmup_steps`: 0
330
+ - `log_level`: passive
331
+ - `log_level_replica`: warning
332
+ - `log_on_each_node`: True
333
+ - `logging_nan_inf_filter`: True
334
+ - `save_safetensors`: True
335
+ - `save_on_each_node`: False
336
+ - `save_only_model`: False
337
+ - `restore_callback_states_from_checkpoint`: False
338
+ - `no_cuda`: False
339
+ - `use_cpu`: False
340
+ - `use_mps_device`: False
341
+ - `seed`: 12
342
+ - `data_seed`: None
343
+ - `jit_mode_eval`: False
344
+ - `use_ipex`: False
345
+ - `bf16`: True
346
+ - `fp16`: False
347
+ - `fp16_opt_level`: O1
348
+ - `half_precision_backend`: auto
349
+ - `bf16_full_eval`: False
350
+ - `fp16_full_eval`: False
351
+ - `tf32`: None
352
+ - `local_rank`: 0
353
+ - `ddp_backend`: None
354
+ - `tpu_num_cores`: None
355
+ - `tpu_metrics_debug`: False
356
+ - `debug`: []
357
+ - `dataloader_drop_last`: False
358
+ - `dataloader_num_workers`: 12
359
+ - `dataloader_prefetch_factor`: None
360
+ - `past_index`: -1
361
+ - `disable_tqdm`: False
362
+ - `remove_unused_columns`: True
363
+ - `label_names`: None
364
+ - `load_best_model_at_end`: True
365
+ - `ignore_data_skip`: False
366
+ - `fsdp`: []
367
+ - `fsdp_min_num_params`: 0
368
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
369
+ - `tp_size`: 0
370
+ - `fsdp_transformer_layer_cls_to_wrap`: None
371
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
372
+ - `deepspeed`: None
373
+ - `label_smoothing_factor`: 0.0
374
+ - `optim`: adamw_torch
375
+ - `optim_args`: None
376
+ - `adafactor`: False
377
+ - `group_by_length`: False
378
+ - `length_column_name`: length
379
+ - `ddp_find_unused_parameters`: None
380
+ - `ddp_bucket_cap_mb`: None
381
+ - `ddp_broadcast_buffers`: False
382
+ - `dataloader_pin_memory`: True
383
+ - `dataloader_persistent_workers`: False
384
+ - `skip_memory_metrics`: True
385
+ - `use_legacy_prediction_loop`: False
386
+ - `push_to_hub`: False
387
+ - `resume_from_checkpoint`: None
388
+ - `hub_model_id`: None
389
+ - `hub_strategy`: every_save
390
+ - `hub_private_repo`: None
391
+ - `hub_always_push`: False
392
+ - `gradient_checkpointing`: False
393
+ - `gradient_checkpointing_kwargs`: None
394
+ - `include_inputs_for_metrics`: False
395
+ - `include_for_metrics`: []
396
+ - `eval_do_concat_batches`: True
397
+ - `fp16_backend`: auto
398
+ - `push_to_hub_model_id`: None
399
+ - `push_to_hub_organization`: None
400
+ - `mp_parameters`:
401
+ - `auto_find_batch_size`: False
402
+ - `full_determinism`: False
403
+ - `torchdynamo`: None
404
+ - `ray_scope`: last
405
+ - `ddp_timeout`: 1800
406
+ - `torch_compile`: False
407
+ - `torch_compile_backend`: None
408
+ - `torch_compile_mode`: None
409
+ - `dispatch_batches`: None
410
+ - `split_batches`: None
411
+ - `include_tokens_per_second`: False
412
+ - `include_num_input_tokens_seen`: False
413
+ - `neftune_noise_alpha`: None
414
+ - `optim_target_modules`: None
415
+ - `batch_eval_metrics`: False
416
+ - `eval_on_start`: False
417
+ - `use_liger_kernel`: False
418
+ - `eval_use_gather_object`: False
419
+ - `average_tokens_across_devices`: False
420
+ - `prompts`: None
421
+ - `batch_sampler`: batch_sampler
422
+ - `multi_dataset_batch_sampler`: proportional
423
+
424
+ </details>
425
+
426
+ ### Training Logs
427
+ | Epoch | Step | Training Loss | gooaq-dev_ndcg@10 | NanoMSMARCO_R100_ndcg@10 | NanoNFCorpus_R100_ndcg@10 | NanoNQ_R100_ndcg@10 | NanoBEIR_R100_mean_ndcg@10 |
428
+ |:----------:|:--------:|:-------------:|:--------------------:|:------------------------:|:-------------------------:|:--------------------:|:--------------------------:|
429
+ | -1 | -1 | - | 0.6371 (+0.1550) | 0.6686 (+0.1282) | 0.3930 (+0.0680) | 0.7599 (+0.2592) | 0.6072 (+0.1518) |
430
+ | 0.0009 | 1 | 2.1175 | - | - | - | - | - |
431
+ | 0.1842 | 200 | 1.1892 | - | - | - | - | - |
432
+ | 0.3683 | 400 | 0.676 | - | - | - | - | - |
433
+ | 0.5525 | 600 | 0.6268 | - | - | - | - | - |
434
+ | 0.7366 | 800 | 0.606 | - | - | - | - | - |
435
+ | 0.9208 | 1000 | 0.5933 | 0.6731 (+0.1910) | 0.6038 (+0.0634) | 0.4572 (+0.1321) | 0.5220 (+0.0213) | 0.5277 (+0.0723) |
436
+ | 1.1050 | 1200 | 0.5756 | - | - | - | - | - |
437
+ | 1.2891 | 1400 | 0.5625 | - | - | - | - | - |
438
+ | 1.4733 | 1600 | 0.5575 | - | - | - | - | - |
439
+ | 1.6575 | 1800 | 0.549 | - | - | - | - | - |
440
+ | 1.8416 | 2000 | 0.5475 | 0.6799 (+0.1977) | 0.6072 (+0.0667) | 0.4278 (+0.1028) | 0.5031 (+0.0024) | 0.5127 (+0.0573) |
441
+ | 2.0258 | 2200 | 0.5391 | - | - | - | - | - |
442
+ | 2.2099 | 2400 | 0.5276 | - | - | - | - | - |
443
+ | 2.3941 | 2600 | 0.5271 | - | - | - | - | - |
444
+ | 2.5783 | 2800 | 0.5264 | - | - | - | - | - |
445
+ | **2.7624** | **3000** | **0.5244** | **0.6822 (+0.2001)** | **0.6060 (+0.0655)** | **0.4077 (+0.0827)** | **0.5091 (+0.0084)** | **0.5076 (+0.0522)** |
446
+ | 2.9466 | 3200 | 0.5235 | - | - | - | - | - |
447
+ | -1 | -1 | - | 0.6822 (+0.2001) | 0.6060 (+0.0655) | 0.4077 (+0.0827) | 0.5091 (+0.0084) | 0.5076 (+0.0522) |
448
+
449
+ * The bold row denotes the saved checkpoint.
450
+
451
+ ### Framework Versions
452
+ - Python: 3.11.0
453
+ - Sentence Transformers: 4.0.1
454
+ - Transformers: 4.50.3
455
+ - PyTorch: 2.6.0+cu124
456
+ - Accelerate: 1.5.2
457
+ - Datasets: 3.5.0
458
+ - Tokenizers: 0.21.1
459
+
460
+ ## Citation
461
+
462
+ ### BibTeX
463
+
464
+ #### Sentence Transformers
465
+ ```bibtex
466
+ @inproceedings{reimers-2019-sentence-bert,
467
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
468
+ author = "Reimers, Nils and Gurevych, Iryna",
469
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
470
+ month = "11",
471
+ year = "2019",
472
+ publisher = "Association for Computational Linguistics",
473
+ url = "https://arxiv.org/abs/1908.10084",
474
+ }
475
+ ```
476
+
477
+ <!--
478
+ ## Glossary
479
+
480
+ *Clearly define terms in order to be accessible across audiences.*
481
+ -->
482
+
483
+ <!--
484
+ ## Model Card Authors
485
+
486
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
487
+ -->
488
+
489
+ <!--
490
+ ## Model Card Contact
491
+
492
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
493
+ -->
config.json ADDED
@@ -0,0 +1,35 @@
1
+ {
2
+ "architectures": [
3
+ "BertForSequenceClassification"
4
+ ],
5
+ "attention_probs_dropout_prob": 0.1,
6
+ "classifier_dropout": null,
7
+ "gradient_checkpointing": false,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.1,
10
+ "hidden_size": 384,
11
+ "id2label": {
12
+ "0": "LABEL_0"
13
+ },
14
+ "initializer_range": 0.02,
15
+ "intermediate_size": 1536,
16
+ "label2id": {
17
+ "LABEL_0": 0
18
+ },
19
+ "layer_norm_eps": 1e-12,
20
+ "max_position_embeddings": 512,
21
+ "model_type": "bert",
22
+ "num_attention_heads": 12,
23
+ "num_hidden_layers": 6,
24
+ "pad_token_id": 0,
25
+ "position_embedding_type": "absolute",
26
+ "sentence_transformers": {
27
+ "activation_fn": "torch.nn.modules.linear.Identity",
28
+ "version": "4.0.1"
29
+ },
30
+ "torch_dtype": "float32",
31
+ "transformers_version": "4.50.3",
32
+ "type_vocab_size": 2,
33
+ "use_cache": true,
34
+ "vocab_size": 30522
35
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:45ab940315e00147cc4d993886eb469cda5b4c69cad5d38dc1738504eefe0e6d
3
+ size 90866412
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
1
+ {
2
+ "cls_token": "[CLS]",
3
+ "mask_token": "[MASK]",
4
+ "pad_token": "[PAD]",
5
+ "sep_token": "[SEP]",
6
+ "unk_token": "[UNK]"
7
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,58 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "[PAD]",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "100": {
12
+ "content": "[UNK]",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "101": {
20
+ "content": "[CLS]",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "102": {
28
+ "content": "[SEP]",
29
+ "lstrip": false,
30
+ "normalized": false,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "103": {
36
+ "content": "[MASK]",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ }
43
+ },
44
+ "clean_up_tokenization_spaces": true,
45
+ "cls_token": "[CLS]",
46
+ "do_basic_tokenize": true,
47
+ "do_lower_case": true,
48
+ "extra_special_tokens": {},
49
+ "mask_token": "[MASK]",
50
+ "model_max_length": 512,
51
+ "never_split": null,
52
+ "pad_token": "[PAD]",
53
+ "sep_token": "[SEP]",
54
+ "strip_accents": null,
55
+ "tokenize_chinese_chars": true,
56
+ "tokenizer_class": "BertTokenizer",
57
+ "unk_token": "[UNK]"
58
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff