CrossEncoder based on microsoft/MiniLM-L12-H384-uncased

This is a Cross Encoder model finetuned from microsoft/MiniLM-L12-H384-uncased on the ms_marco dataset using the sentence-transformers library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.

Model Details

Model Description

  • Model Type: Cross Encoder (reranker)
  • Base model: microsoft/MiniLM-L12-H384-uncased
  • Maximum Sequence Length: 512 tokens
  • Number of Output Labels: 1 label
  • Number of Parameters: 33.4M
  • Training Dataset: ms_marco
  • Language: en

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: https://huggingface.co/yjoonjang/reranker-msmarco-v1.1-MiniLM-L12-H384-uncased-plistmle

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import CrossEncoder

# Download from the 🤗 Hub
model = CrossEncoder("yjoonjang/reranker-msmarco-v1.1-MiniLM-L12-H384-uncased-plistmle")
# Get scores for pairs of texts
pairs = [
    ['How many calories in an egg', 'There are on average between 55 and 80 calories in an egg depending on its size.'],
    ['How many calories in an egg', 'Egg whites are very low in calories, have no fat, no cholesterol, and are loaded with protein.'],
    ['How many calories in an egg', 'Most of the calories in an egg come from the yellow yolk in the center.'],
]
scores = model.predict(pairs)
print(scores.shape)
# (3,)

# Or rank different texts based on similarity to a single text
ranks = model.rank(
    'How many calories in an egg',
    [
        'There are on average between 55 and 80 calories in an egg depending on its size.',
        'Egg whites are very low in calories, have no fat, no cholesterol, and are loaded with protein.',
        'Most of the calories in an egg come from the yellow yolk in the center.',
    ]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
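
For reranking the output of a first-stage retriever, model.rank can also return the document text and keep only the best hits. The snippet below is a small sketch assuming the top_k and return_documents arguments of CrossEncoder.rank and the 'text' key in its output; adjust to your installed Sentence Transformers version.

from sentence_transformers import CrossEncoder

model = CrossEncoder("yjoonjang/reranker-msmarco-v1.1-MiniLM-L12-H384-uncased-plistmle")

query = "How many calories in an egg"
candidates = [
    "There are on average between 55 and 80 calories in an egg depending on its size.",
    "Egg whites are very low in calories, have no fat, no cholesterol, and are loaded with protein.",
    "Most of the calories in an egg come from the yellow yolk in the center.",
]

# Keep only the two best-scoring candidates and include their text in the result
results = model.rank(query, candidates, top_k=2, return_documents=True)
for entry in results:
    print(f"{entry['score']:.4f}\t{entry['text']}")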

Evaluation

Metrics

Cross Encoder Reranking

  • Datasets: NanoMSMARCO_R100, NanoNFCorpus_R100 and NanoNQ_R100
  • Evaluated with CrossEncoderRerankingEvaluator with these parameters:
    {
        "at_k": 10,
        "always_rerank_positives": true
    }
    
| Metric  | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100      |
|---------|------------------|-------------------|------------------|
| map     | 0.5190 (+0.0295) | 0.3333 (+0.0723)  | 0.5948 (+0.1752) |
| mrr@10  | 0.5072 (+0.0297) | 0.5492 (+0.0493)  | 0.5977 (+0.1710) |
| ndcg@10 | 0.5754 (+0.0350) | 0.3530 (+0.0280)  | 0.6497 (+0.1491) |
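
To run the same kind of evaluation on your own reranking data, the evaluator can be constructed directly. The following is a minimal sketch; the query/positive/documents sample format and the import path follow the Sentence Transformers cross-encoder documentation and may differ slightly across versions.

from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.evaluation import CrossEncoderRerankingEvaluator

model = CrossEncoder("yjoonjang/reranker-msmarco-v1.1-MiniLM-L12-H384-uncased-plistmle")

# Hypothetical evaluation samples: each entry has a query, its known-relevant texts,
# and the candidate documents returned by a first-stage retriever
samples = [
    {
        "query": "How many calories in an egg",
        "positive": ["There are on average between 55 and 80 calories in an egg depending on its size."],
        "documents": [
            "Most of the calories in an egg come from the yellow yolk in the center.",
            "There are on average between 55 and 80 calories in an egg depending on its size.",
            "Egg whites are very low in calories, have no fat, no cholesterol, and are loaded with protein.",
        ],
    },
]

evaluator = CrossEncoderRerankingEvaluator(
    samples=samples,
    at_k=10,
    always_rerank_positives=True,
)
print(evaluator(model))  # MAP, MRR@10 and NDCG@10 over the reranked candidates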

Cross Encoder Nano BEIR

  • Dataset: NanoBEIR_R100_mean
  • Evaluated with CrossEncoderNanoBEIREvaluator with these parameters:
    {
        "dataset_names": [
            "msmarco",
            "nfcorpus",
            "nq"
        ],
        "rerank_k": 100,
        "at_k": 10,
        "always_rerank_positives": true
    }
    
| Metric  | Value            |
|---------|------------------|
| map     | 0.4824 (+0.0923) |
| mrr@10  | 0.5513 (+0.0833) |
| ndcg@10 | 0.5260 (+0.0707) |
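
The NanoBEIR mean above can be reproduced with the evaluator and the exact parameters listed. A short sketch, assuming the evaluator is importable from sentence_transformers.cross_encoder.evaluation as in recent Sentence Transformers releases (the Nano BEIR datasets are downloaded on first use):

from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.evaluation import CrossEncoderNanoBEIREvaluator

model = CrossEncoder("yjoonjang/reranker-msmarco-v1.1-MiniLM-L12-H384-uncased-plistmle")

evaluator = CrossEncoderNanoBEIREvaluator(
    dataset_names=["msmarco", "nfcorpus", "nq"],
    rerank_k=100,
    at_k=10,
    always_rerank_positives=True,
)
results = evaluator(model)
print(results)  # per-dataset metrics plus the NanoBEIR_R100_mean aggregates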

Training Details

Training Dataset

ms_marco

  • Dataset: ms_marco at a47ee7a
  • Size: 78,704 training samples
  • Columns: query, docs, and labels
  • Approximate statistics based on the first 1000 samples:
    |         | query                                                            | docs                                                   | labels                                                 |
    |---------|------------------------------------------------------------------|--------------------------------------------------------|--------------------------------------------------------|
    | type    | string                                                           | list                                                   | list                                                   |
    | details | min: 11 characters, mean: 33.97 characters, max: 100 characters  | min: 3 elements, mean: 6.50 elements, max: 10 elements | min: 3 elements, mean: 6.50 elements, max: 10 elements |
  • Samples:
    query docs labels
    ampullae of lorenzini definition ['Definition of AMPULLA OF LORENZINI. : any of the pores on the snouts of marine sharks and rays that contain receptors highly sensitive to weak electric fields. ADVERTISEMENT. Stefano Lorenzini fl 1678 Italian physician. First Known Use: 1898.', 'Definition of AMPULLA. 1. : a glass or earthenware flask with a globular body and two handles used especially by the ancient Romans to hold ointment, perfume, or wine. 2. : a saccular anatomical swelling or pouch. — am·pul·la·ry \am-ˈpu̇-lər-ē, ˈam-pyə-ˌler-ē\ adjective.', 'These sensory organs help fish to sense electric fields in the water. Each ampulla consists of a jelly-filled canal opening to the surface by a pore in the skin and ending blindly in a cluster of small pockets full of special jelly.', 'Wiktionary (5.00 / 1 vote) Rate this definition: ampulla of Lorenzini (Noun). An electroreceptor found mainly in cartilaginous fish such as sharks and rays, forming a network of jelly-filled canals. Origin: After Stephano Lorenzini, who first described them.', 'The ampullae of Lorenzini are special sensing organs called electroreceptors, forming a network of jelly-filled pores. They are mostly discussed as being found in cartilaginous fish (sharks, rays, and chimaeras); however, they are also reported to be found in Chondrostei such as reedfish and sturgeon.'] [1, 0, 0, 0, 0]
    pulmonary function tests are conducted by respiratory therapists ['Respiratory Care. Our Respiratory Care Department offers a full range of inpatient therapeutic and diagnostic services, including a full range of pulmonary function testing. Our therapists also provide pulmonary education such as Living with COPD and the Asthma Awareness Program.. ', 'Spirometry. Spirometry is the first and most commonly done lung function test. It measures how much and how quickly you can move air out of your lungs. For this test, you breathe into a mouthpiece attached to a recording device (spirometer). Lung Function Tests. Guide. Lung function tests (also called pulmonary function tests, or PFTs) check how well your lungs work. The tests determine how much air your lungs can hold, how quickly you can move air in and out of your lungs, and how well your lungs put oxygen into and remove carbon dioxide from your blood.', 'They provide your physician needed information to help diagnose disease, measure the severity of lung problems, recommend treatments, and follow yo... [1, 0, 0, 0, 0, ...]
    organization of American states definition ["The Organization of American States, or the OAS, is a continental organization founded on 30 April 1948 for the purposes of regional solidarity and cooperation among its member states. Headquartered in Washington, D.C., United States, the OAS's members are the 35 independent states of the Americas. ", 'More videos ». The Organization of American States is the premier regional forum for political discussion, policy analysis and decision-making in Western Hemisphere affairs. The OAS brings together leaders from nations across the Americas to address hemispheric issues and opportunities. The Coordinating Office of the Offices in the Member States invites you to visit their site. You will be able to receive updates, find out who they are and learn out about projects, programs, internships, and scholarships in each office.', "That adherence by any member of the Organization of American States to Marxism-Leninism is incompatible with the inter-American system and the alignment of such a go... [1, 0, 0, 0, 0, ...]
  • Loss: ListMLELoss with these parameters:
    {
        "lambda_weight": "sentence_transformers.cross_encoder.losses.ListMLELoss.ListMLELambdaWeight",
        "activation_fct": "torch.nn.modules.linear.Identity",
        "mini_batch_size": 16,
        "respect_input_order": true
    }
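
As a hedged illustration of what this loss expects, the sketch below builds a tiny in-memory dataset with the same query/docs/labels columns and instantiates the loss. The mini_batch_size and respect_input_order values mirror the dump above; the other listed parameters are left at their defaults, and exact keyword names may vary across Sentence Transformers versions.

from datasets import Dataset
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.losses import ListMLELoss

# Base model to finetune; num_labels=1 yields one relevance score per (query, doc) pair
model = CrossEncoder("microsoft/MiniLM-L12-H384-uncased", num_labels=1)

# Tiny stand-in for the listwise ms_marco data: a query, its candidate docs, and binary labels
train_dataset = Dataset.from_dict({
    "query": ["how many calories in an egg"],
    "docs": [[
        "There are on average between 55 and 80 calories in an egg depending on its size.",
        "Egg whites are very low in calories, have no fat, no cholesterol, and are loaded with protein.",
    ]],
    "labels": [[1, 0]],
})

# Position-aware ListMLE loss computed over each (query, docs, labels) list
loss = ListMLELoss(model, mini_batch_size=16, respect_input_order=True)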
    

Evaluation Dataset

ms_marco

  • Dataset: ms_marco at a47ee7a
  • Size: 1,000 evaluation samples
  • Columns: query, docs, and labels
  • Approximate statistics based on the first 1000 samples:
    |         | query                                                           | docs                                                   | labels                                                 |
    |---------|-----------------------------------------------------------------|--------------------------------------------------------|--------------------------------------------------------|
    | type    | string                                                          | list                                                   | list                                                   |
    | details | min: 9 characters, mean: 33.83 characters, max: 101 characters  | min: 2 elements, mean: 6.00 elements, max: 10 elements | min: 2 elements, mean: 6.00 elements, max: 10 elements |
  • Samples:
    query docs labels
    what is tidal flow ['Noun. 1. tidal flow-the water current caused by the tides. tidal current. tide-the periodic rise and fall of the sea level under the gravitational pull of the moon. aegir, eager, eagre, tidal bore, bore-a high wave (often dangerous) caused by tidal flow (as by colliding tidal currents or in a narrow estuary). ', 'Tidal energy is a form of hydropower that converts the energy of the tides into electricity or other useful forms of power. The tide is created by the gravitational effect of the sun and the moon on the earth causing cyclical movement of the seas. Tidal Stream. Tidal Stream is the flow of water as the tide ebbs and floods, and manifests itself as tidal current. Tidal Stream devices seek to extract energy from this kinetic movement of water, much as wind turbines extract energy from the movement of air.', 'A horizontal movement of water often accompanies the rising and falling of the tide. This is called the tidal current. The incoming tide along the coast and into the bays a... [1, 0, 0, 0, 0, ...]
    what is matelasse ['The French word, matelasse matelassé “means,” “quilted,” padded “or,” cushioned and in usage with, fabric refers to hand quilted. Textiles it is meant to mimic the style of-hand Stitched marseilles type quilts made In, Provence. france Matelasse matelassé fabric is used on upholstery for slip covers and throw, pillows and in, bedding for, coverlets duvet covers and pillow. Shams it is also used in crib bedding and’children s bedding. sets', 'Matelasse (matelassé-mat-LA) say is a weaving or stitching technique yielding a pattern that appears quilted or. Padded matelasse matelassé may be achieved, by hand on a, jacquard loom or a. Quilting machine it is meant to mimic the style-of hand stitched quilts Made, In. marseilles france Matelasse matelassé may be achieved by, hand on a jacquard, loom or a quilting. Machine it is meant to mimic the style of-hand stitched quilts made In, Marseilles. france', "Save. Matelasse is type of double-woven fabric that first gained popularity in the 18th... [1, 1, 0, 0, 0, ...]
    what does atp mean ['Conversion from ATP to ADP. Adenosine triphosphate (ATP) is the energy currency of life and it provides that energy for most biological processes by being converted to ADP (adenosine diphosphate). Since the basic reaction involves a water molecule, this reaction is commonly referred to as the hydrolysis of ATP. Free Energy from Hydrolysis of ATP. Adenosine triphosphate (ATP) is the energy currency of life and it provides that energy for most biological processes by being converted to ADP (adenosine diphosphate). Since the basic reaction involves a water molecule, this reaction is commonly referred to as the hydrolysis of ATP.', 'ATP is a nucleotide that contains a large amount of chemical energy stored in its high-energy phosphate bonds. It releases energy when it is broken down (hydrolyzed) into ADP (or Adenosine Diphosphate). The energy is used for many metabolic processes. ', '• ATP (noun). The noun ATP has 1 sense: 1. a nucleotide derived from adenosine that occurs in muscle tiss... [1, 0, 0, 0, 0, ...]
  • Loss: ListMLELoss with these parameters:
    {
        "lambda_weight": "sentence_transformers.cross_encoder.losses.ListMLELoss.ListMLELambdaWeight",
        "activation_fct": "torch.nn.modules.linear.Identity",
        "mini_batch_size": 16,
        "respect_input_order": true
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • seed: 12
  • bf16: True
  • load_best_model_at_end: True
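
As a rough sketch (not the exact training script), these non-default values map onto CrossEncoderTrainingArguments as shown below; the output path and the tiny in-memory splits are placeholders, and the loss is the one described under Training Dataset.

from datasets import Dataset
from sentence_transformers.cross_encoder import (
    CrossEncoder,
    CrossEncoderTrainer,
    CrossEncoderTrainingArguments,
)
from sentence_transformers.cross_encoder.losses import ListMLELoss

model = CrossEncoder("microsoft/MiniLM-L12-H384-uncased", num_labels=1)
loss = ListMLELoss(model, mini_batch_size=16, respect_input_order=True)

# Placeholder splits; the real run used the listwise ms_marco splits described above
train_dataset = Dataset.from_dict({
    "query": ["how many calories in an egg"],
    "docs": [["There are on average between 55 and 80 calories in an egg.",
              "Egg whites are very low in calories."]],
    "labels": [[1, 0]],
})
eval_dataset = train_dataset

args = CrossEncoderTrainingArguments(
    output_dir="models/reranker-msmarco-MiniLM-L12-plistmle",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,  # assumes bf16-capable hardware
    load_best_model_at_end=True,
)

trainer = CrossEncoderTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()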

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 12
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

| Epoch      | Step     | Training Loss | Validation Loss | NanoMSMARCO_R100_ndcg@10 | NanoNFCorpus_R100_ndcg@10 | NanoNQ_R100_ndcg@10  | NanoBEIR_R100_mean_ndcg@10 |
|------------|----------|---------------|-----------------|--------------------------|---------------------------|----------------------|----------------------------|
| -1         | -1       | -             | -               | 0.0301 (-0.5103)         | 0.2693 (-0.0557)          | 0.0549 (-0.4457)     | 0.1181 (-0.3372)           |
| 0.0002     | 1        | 909.2226      | -               | -                        | -                         | -                    | -                          |
| 0.0508     | 250      | 918.5451      | -               | -                        | -                         | -                    | -                          |
| 0.1016     | 500      | 883.3122      | 876.4382        | 0.2066 (-0.3338)         | 0.2445 (-0.0805)          | 0.3186 (-0.1821)     | 0.2566 (-0.1988)           |
| 0.1525     | 750      | 859.0346      | -               | -                        | -                         | -                    | -                          |
| 0.2033     | 1000     | 864.3308      | 850.8157        | 0.4610 (-0.0794)         | 0.3138 (-0.0112)          | 0.6074 (+0.1068)     | 0.4607 (+0.0054)           |
| 0.2541     | 1250     | 851.3652      | -               | -                        | -                         | -                    | -                          |
| 0.3049     | 1500     | 838.7614      | 838.7972        | 0.5708 (+0.0304)         | 0.3423 (+0.0173)          | 0.6056 (+0.1050)     | 0.5063 (+0.0509)           |
| 0.3558     | 1750     | 853.0997      | -               | -                        | -                         | -                    | -                          |
| 0.4066     | 2000     | 837.1816      | 834.6595        | 0.4936 (-0.0469)         | 0.3460 (+0.0209)          | 0.5778 (+0.0771)     | 0.4724 (+0.0171)           |
| 0.4574     | 2250     | 820.9718      | -               | -                        | -                         | -                    | -                          |
| **0.5082** | **2500** | **829.679**   | **832.1774**    | **0.5754 (+0.0350)**     | **0.3530 (+0.0280)**      | **0.6497 (+0.1491)** | **0.5260 (+0.0707)**       |
| 0.5591     | 2750     | 816.8598      | -               | -                        | -                         | -                    | -                          |
| 0.6099     | 3000     | 841.9976      | 830.9660        | 0.5351 (-0.0054)         | 0.3651 (+0.0401)          | 0.6357 (+0.1351)     | 0.5120 (+0.0566)           |
| 0.6607     | 3250     | 820.7183      | -               | -                        | -                         | -                    | -                          |
| 0.7115     | 3500     | 812.7813      | 825.5827        | 0.5444 (+0.0040)         | 0.3803 (+0.0552)          | 0.6208 (+0.1201)     | 0.5152 (+0.0598)           |
| 0.7624     | 3750     | 852.4021      | -               | -                        | -                         | -                    | -                          |
| 0.8132     | 4000     | 830.3532      | 824.7762        | 0.5760 (+0.0355)         | 0.3600 (+0.0350)          | 0.6315 (+0.1309)     | 0.5225 (+0.0671)           |
| 0.8640     | 4250     | 834.5426      | -               | -                        | -                         | -                    | -                          |
| 0.9148     | 4500     | 828.2203      | 822.1611        | 0.5711 (+0.0307)         | 0.3682 (+0.0432)          | 0.6303 (+0.1296)     | 0.5232 (+0.0678)           |
| 0.9656     | 4750     | 842.7682      | -               | -                        | -                         | -                    | -                          |
| -1         | -1       | -             | -               | 0.5754 (+0.0350)         | 0.3530 (+0.0280)          | 0.6497 (+0.1491)     | 0.5260 (+0.0707)           |
  • The bold row denotes the saved checkpoint.

Framework Versions

  • Python: 3.11.11
  • Sentence Transformers: 3.5.0.dev0
  • Transformers: 4.49.0
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.5.2
  • Datasets: 3.4.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

ListMLELoss

@inproceedings{lan2013position,
    title={Position-aware ListMLE: a sequential learning process for ranking},
    author={Lan, Yanyan and Guo, Jiafeng and Cheng, Xueqi and Liu, Tie-Yan},
    booktitle={Proceedings of the Twenty-Ninth Conference on Uncertainty in Artificial Intelligence},
    pages={333--342},
    year={2013}
}