MS-MARCO Embeddings

This model is part of the MS-MARCO Embeddings collection: simple embedding models for MS-MARCO, intended for retrieval-augmented generation (RAG).
This is a sentence-transformers model finetuned from jinaai/jina-embeddings-v3. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
It was finetuned from jinaai/jina-embeddings-v3 on the msmarco-v3 dataset.
Full model architecture:

```
SentenceTransformer(
  (transformer): Transformer(
    (auto_model): XLMRobertaLoRA(
      (roberta): XLMRobertaModel(
        (embeddings): XLMRobertaEmbeddings(
          (word_embeddings): ParametrizedEmbedding(
            250002, 1024, padding_idx=1
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
          (token_type_embeddings): ParametrizedEmbedding(
            1, 1024
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
        )
        (emb_drop): Dropout(p=0.1, inplace=False)
        (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        (encoder): XLMRobertaEncoder(
          (layers): ModuleList(
            (0-23): 24 x Block(
              (mixer): MHA(
                (rotary_emb): RotaryEmbedding()
                (Wqkv): ParametrizedLinearResidual(
                  in_features=1024, out_features=3072, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
                (inner_attn): FlashSelfAttention(
                  (drop): Dropout(p=0.1, inplace=False)
                )
                (inner_cross_attn): FlashCrossAttention(
                  (drop): Dropout(p=0.1, inplace=False)
                )
                (out_proj): ParametrizedLinear(
                  in_features=1024, out_features=1024, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
              )
              (dropout1): Dropout(p=0.1, inplace=False)
              (drop_path1): StochasticDepth(p=0.0, mode=row)
              (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
              (mlp): Mlp(
                (fc1): ParametrizedLinear(
                  in_features=1024, out_features=4096, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
                (fc2): ParametrizedLinear(
                  in_features=4096, out_features=1024, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
              )
              (dropout2): Dropout(p=0.1, inplace=False)
              (drop_path2): StochasticDepth(p=0.0, mode=row)
              (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
            )
          )
        )
        (pooler): XLMRobertaPooler(
          (dense): ParametrizedLinear(
            in_features=1024, out_features=1024, bias=True
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
          (activation): Tanh()
        )
      )
    )
  )
  (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (normalizer): Normalize()
)
```
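The tree above is simply the model's printed repr, so you can reproduce it after loading the checkpoint (installation instructions follow below). One assumption in this sketch: the custom XLMRobertaLoRA modules ship as remote code with the jina-embeddings-v3 family, so trust_remote_code=True may be required in your environment.

```python
from sentence_transformers import SentenceTransformer

# trust_remote_code is an assumption: the custom XLMRobertaLoRA modules
# come from the jina-embeddings-v3 remote code.
model = SentenceTransformer("BlackBeenie/jina-embeddings-v3-msmarco-v3-bpr", trust_remote_code=True)
print(model)  # prints the module tree shown above
```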
First install the Sentence Transformers library:

```
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("BlackBeenie/jina-embeddings-v3-msmarco-v3-bpr")
# Run inference
sentences = [
    'what is a fermentation lock used for',
    'The fermentation lock or airlock is a device used in beer brewing and wine making that allows carbon dioxide released by the beer to escape the fermenter, while not allowing air to enter the fermenter, thus avoiding oxidation. There are two main designs for the fermentation lock, or airlock.',
    'Remember, fermentation is a method of preserving food. Leaving it on your counter gives it more time for the LAB activity to increase — which, in turn, lowers pH — and prevents spoilage. As long as your jar can keep out the oxygen, you shouldn’t be worried. Which leads me to….',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
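Since semantic search is among the listed uses, here is a minimal retrieval sketch built on the same `encode` and `similarity` calls used above; the corpus and query strings are illustrative placeholders, not from the card.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BlackBeenie/jina-embeddings-v3-msmarco-v3-bpr")

# Illustrative corpus and query (placeholders, not from the card)
corpus = [
    "The fermentation lock allows CO2 to escape while keeping air out.",
    "Corporate tax returns are due on the 15th day of the third month.",
    "Disaccharides form when two monosaccharides join via a glycosidic bond.",
]
query = "when are corporate taxes due"

doc_emb = model.encode(corpus)        # (3, 1024)
query_emb = model.encode([query])     # (1, 1024)

scores = model.similarity(query_emb, doc_emb)  # (1, len(corpus))
best = int(scores.argmax())
print(corpus[best])  # the highest-scoring passage
```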
The training dataset has three columns: sentence_0, sentence_1, and sentence_2.

| | sentence_0 | sentence_1 | sentence_2 |
|---|---|---|---|
| type | string | string | string |
Samples (see the packing sketch after this table):

sentence_0 | sentence_1 | sentence_2 |
---|---|---|
how much does it cost to paint a interior house | Interior House Painting Cost Factors. Generally, it will take a minimum of two gallons of paint to cover a room. At the highest end, paint will cost anywhere between $30 and $60 per gallon and come in three different finishes: flat, semi-gloss or high-gloss.Flat finishes are the least shiny and are best suited for areas requiring frequent cleaning.rovide a few details about your project and receive competitive quotes from local pros. The average national cost to paint a home interior is $1,671, with most homeowners spending between $966 and $2,426. | Question DetailsAsked on 3/12/2014. Guest_... How much does it cost per square foot to paint the interior of a house? We just bought roughly a 1500 sg ft townhouse and want to get the entire house, including ceilings painted (including a roughly 400 sq ft finished basement not included in square footage). |
when is s corp taxes due | If you form a corporate entity for your small business, regardless of whether it's taxed as a C or S corporation, a tax return must be filed with the Internal Revenue Service on its due date each year. Corporate tax returns are always due on the 15th day of the third month following the close of the tax year. The actual day that the tax return filing deadline falls on, however, isn't the same for every corporation. | Before Jan. 1, 2026 After Dec. 31, 2025 Starting with 2016 tax returns, all. other C corps besides Dec. 31 and. June 30 year-ends (including those with. other fiscal year-ends) will be due on. the 15th of the 4th month after the. |
what are disaccharides | Disaccharides are formed when two monosaccharides are joined together and a molecule of water is removed, a process known as dehydration reaction. For example; milk sugar (lactose) is made from glucose and galactose whereas the sugar from sugar cane and sugar beets (sucrose) is made from glucose and fructose.altose, another notable disaccharide, is made up of two glucose molecules. The two monosaccharides are bonded via a dehydration reaction (also called a condensation reaction or dehydration synthesis) that leads to the loss of a molecule of water and formation of a glycosidic bond. | Disaccharides- Another type of carbohydrate. How many sugar units are disaccharides composed of?_____ What elements make up disaccharides? _____ How does the body use disaccharides? _____ There is no chemical test for disaccharides. Table sugar (white granulated sugar) is an example of a disaccharide. List some foods that contain a lot of disaccharides: _____ |
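The three columns appear to behave as (query, relevant passage, hard negative) triples. As a minimal sketch of how such rows can be packed for training (assuming the Hugging Face datasets library; the row below is a truncated placeholder from the samples table):

```python
from datasets import Dataset

# Truncated placeholder row taken from the samples table above.
train_dataset = Dataset.from_dict({
    "sentence_0": ["when is s corp taxes due"],
    "sentence_1": ["If you form a corporate entity for your small business..."],
    "sentence_2": ["Before Jan. 1, 2026 After Dec. 31, 2025..."],
})
print(train_dataset)  # features: ['sentence_0', 'sentence_1', 'sentence_2'], num_rows: 1
```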
Loss: beir.losses.bpr_loss.BPRLoss
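BPR (Binary Passage Retrieval, after Yamada et al., 2021) combines a cross-entropy term over in-batch negatives on dense scores with a ranking term on binarized embeddings. The following is an illustrative sketch of that idea only; the exact beir.losses.bpr_loss.BPRLoss implementation may differ, and the margin value and straight-through binarization here are assumptions.

```python
import torch
import torch.nn.functional as F

def ste_sign(x: torch.Tensor) -> torch.Tensor:
    # sign() with a straight-through gradient estimator
    return x + (torch.sign(x) - x).detach()

def bpr_style_loss(q: torch.Tensor, p: torch.Tensor, margin: float = 0.1) -> torch.Tensor:
    """q: (B, D) query embeddings; p: (B, D) positive passage embeddings.
    Each query treats the other in-batch passages as negatives."""
    batch = q.size(0)
    labels = torch.arange(batch, device=q.device)

    # (1) Dense scores -> cross-entropy (reranking-style term)
    dense_scores = q @ p.T                      # (B, B)
    ce = F.cross_entropy(dense_scores, labels)

    # (2) Binarized scores -> hinge ranking (candidate-generation term)
    hash_scores = ste_sign(q) @ ste_sign(p).T   # (B, B)
    pos = hash_scores.diagonal().unsqueeze(1)   # (B, 1)
    hinge = F.relu(margin - pos + hash_scores)
    diag = torch.eye(batch, dtype=torch.bool, device=q.device)
    rank = hinge.masked_fill(diag, 0.0).sum() / (batch * (batch - 1))

    return ce + rank

# Quick smoke test with random embeddings
q, p = torch.randn(4, 1024), torch.randn(4, 1024)
print(bpr_style_loss(q, p))
```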
Training hyperparameters.

Non-default hyperparameters:
- eval_strategy: steps
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- num_train_epochs: 8
- multi_dataset_batch_sampler: round_robin

All hyperparameters (a runnable sketch wiring the key values into a trainer follows the training-log table below):
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 8
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- eval_use_gather_object: False
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin

Training logs:

Epoch | Step | Training Loss |
---|---|---|
0.0321 | 500 | 1.7204 |
0.0641 | 1000 | 0.6847 |
0.0962 | 1500 | 0.4782 |
0.1283 | 2000 | 0.4001 |
0.1603 | 2500 | 0.3773 |
0.1924 | 3000 | 0.3538 |
0.2245 | 3500 | 0.3424 |
0.2565 | 4000 | 0.3375 |
0.2886 | 4500 | 0.3286 |
0.3207 | 5000 | 0.3289 |
0.3527 | 5500 | 0.3266 |
0.3848 | 6000 | 0.3226 |
0.4169 | 6500 | 0.3266 |
0.4489 | 7000 | 0.3262 |
0.4810 | 7500 | 0.3241 |
0.5131 | 8000 | 0.3216 |
0.5451 | 8500 | 0.3232 |
0.5772 | 9000 | 0.3186 |
0.6092 | 9500 | 0.3194 |
0.6413 | 10000 | 0.314 |
0.6734 | 10500 | 0.3217 |
0.7054 | 11000 | 0.3156 |
0.7375 | 11500 | 0.3244 |
0.7696 | 12000 | 0.3189 |
0.8016 | 12500 | 0.3235 |
0.8337 | 13000 | 0.3305 |
0.8658 | 13500 | 0.3284 |
0.8978 | 14000 | 0.3213 |
0.9299 | 14500 | 0.3283 |
0.9620 | 15000 | 0.3219 |
0.9940 | 15500 | 0.3247 |
1.0 | 15593 | - |
1.0261 | 16000 | 0.3287 |
1.0582 | 16500 | 0.3346 |
1.0902 | 17000 | 0.3245 |
1.1223 | 17500 | 0.3202 |
1.1544 | 18000 | 0.332 |
1.1864 | 18500 | 0.3298 |
1.2185 | 19000 | 0.332 |
1.2506 | 19500 | 0.3258 |
1.2826 | 20000 | 0.3291 |
1.3147 | 20500 | 0.334 |
1.3468 | 21000 | 0.3328 |
1.3788 | 21500 | 0.3362 |
1.4109 | 22000 | 0.3348 |
1.4430 | 22500 | 0.3402 |
1.4750 | 23000 | 0.3346 |
1.5071 | 23500 | 0.339 |
1.5392 | 24000 | 0.3406 |
1.5712 | 24500 | 0.3239 |
1.6033 | 25000 | 0.3275 |
1.6353 | 25500 | 0.3287 |
1.6674 | 26000 | 0.3271 |
1.6995 | 26500 | 0.3337 |
1.7315 | 27000 | 0.3352 |
1.7636 | 27500 | 0.3244 |
1.7957 | 28000 | 0.3418 |
1.8277 | 28500 | 0.349 |
1.8598 | 29000 | 0.3395 |
1.8919 | 29500 | 0.3386 |
1.9239 | 30000 | 0.3379 |
1.9560 | 30500 | 0.3412 |
1.9881 | 31000 | 0.3364 |
2.0 | 31186 | - |
2.0201 | 31500 | 0.3386 |
2.0522 | 32000 | 0.3417 |
2.0843 | 32500 | 0.3362 |
2.1163 | 33000 | 0.3251 |
2.1484 | 33500 | 0.3563 |
2.1805 | 34000 | 0.3341 |
2.2125 | 34500 | 0.3478 |
2.2446 | 35000 | 0.3389 |
2.2767 | 35500 | 0.342 |
2.3087 | 36000 | 0.3467 |
2.3408 | 36500 | 0.3419 |
2.3729 | 37000 | 0.3513 |
2.4049 | 37500 | 0.3441 |
2.4370 | 38000 | 0.3484 |
2.4691 | 38500 | 0.3457 |
2.5011 | 39000 | 0.3503 |
2.5332 | 39500 | 0.3446 |
2.5653 | 40000 | 0.3461 |
2.5973 | 40500 | 0.3399 |
2.6294 | 41000 | 0.3405 |
2.6615 | 41500 | 0.3382 |
2.6935 | 42000 | 0.3388 |
2.7256 | 42500 | 0.3378 |
2.7576 | 43000 | 0.336 |
2.7897 | 43500 | 0.3471 |
2.8218 | 44000 | 0.3563 |
2.8538 | 44500 | 0.3465 |
2.8859 | 45000 | 0.3501 |
2.9180 | 45500 | 0.3439 |
2.9500 | 46000 | 0.3546 |
2.9821 | 46500 | 0.3414 |
3.0 | 46779 | - |
3.0142 | 47000 | 0.3498 |
3.0462 | 47500 | 0.3484 |
3.0783 | 48000 | 0.3496 |
3.1104 | 48500 | 0.3392 |
3.1424 | 49000 | 0.3583 |
3.1745 | 49500 | 0.3505 |
3.2066 | 50000 | 0.3547 |
3.2386 | 50500 | 0.3469 |
3.2707 | 51000 | 0.3489 |
3.3028 | 51500 | 0.3473 |
3.3348 | 52000 | 0.3579 |
3.3669 | 52500 | 0.3523 |
3.3990 | 53000 | 0.3427 |
3.4310 | 53500 | 0.3685 |
3.4631 | 54000 | 0.3479 |
3.4952 | 54500 | 0.355 |
3.5272 | 55000 | 0.3464 |
3.5593 | 55500 | 0.3473 |
3.5914 | 56000 | 0.348 |
3.6234 | 56500 | 0.3426 |
3.6555 | 57000 | 0.3394 |
3.6876 | 57500 | 0.3454 |
3.7196 | 58000 | 0.345 |
3.7517 | 58500 | 0.3411 |
3.7837 | 59000 | 0.3557 |
3.8158 | 59500 | 0.3505 |
3.8479 | 60000 | 0.3605 |
3.8799 | 60500 | 0.3554 |
3.9120 | 61000 | 0.349 |
3.9441 | 61500 | 0.3629 |
3.9761 | 62000 | 0.3456 |
4.0 | 62372 | - |
4.0082 | 62500 | 0.3562 |
4.0403 | 63000 | 0.3531 |
4.0723 | 63500 | 0.3569 |
4.1044 | 64000 | 0.3494 |
4.1365 | 64500 | 0.3513 |
4.1685 | 65000 | 0.3599 |
4.2006 | 65500 | 0.3487 |
4.2327 | 66000 | 0.3561 |
4.2647 | 66500 | 0.3583 |
4.2968 | 67000 | 0.3539 |
4.3289 | 67500 | 0.3614 |
4.3609 | 68000 | 0.3558 |
4.3930 | 68500 | 0.3485 |
4.4251 | 69000 | 0.3715 |
4.4571 | 69500 | 0.3585 |
4.4892 | 70000 | 0.3571 |
4.5213 | 70500 | 0.3498 |
4.5533 | 71000 | 0.3576 |
4.5854 | 71500 | 0.3498 |
4.6175 | 72000 | 0.3507 |
4.6495 | 72500 | 0.3436 |
4.6816 | 73000 | 0.3461 |
4.7137 | 73500 | 0.3451 |
4.7457 | 74000 | 0.3554 |
4.7778 | 74500 | 0.354 |
4.8099 | 75000 | 0.3514 |
4.8419 | 75500 | 0.3688 |
4.8740 | 76000 | 0.3573 |
4.9060 | 76500 | 0.3557 |
4.9381 | 77000 | 0.3607 |
4.9702 | 77500 | 0.3488 |
5.0 | 77965 | - |
5.0022 | 78000 | 0.3555 |
5.0343 | 78500 | 0.3596 |
5.0664 | 79000 | 0.3572 |
5.0984 | 79500 | 0.355 |
5.1305 | 80000 | 0.3427 |
5.1626 | 80500 | 0.3669 |
5.1946 | 81000 | 0.3578 |
5.2267 | 81500 | 0.3589 |
5.2588 | 82000 | 0.3586 |
5.2908 | 82500 | 0.3581 |
5.3229 | 83000 | 0.3607 |
5.3550 | 83500 | 0.3563 |
5.3870 | 84000 | 0.3597 |
5.4191 | 84500 | 0.3712 |
5.4512 | 85000 | 0.3574 |
5.4832 | 85500 | 0.359 |
5.5153 | 86000 | 0.3598 |
5.5474 | 86500 | 0.3604 |
5.5794 | 87000 | 0.3535 |
5.6115 | 87500 | 0.3606 |
5.6436 | 88000 | 0.3469 |
5.6756 | 88500 | 0.3568 |
5.7077 | 89000 | 0.3497 |
5.7398 | 89500 | 0.3597 |
5.7718 | 90000 | 0.3582 |
5.8039 | 90500 | 0.3556 |
5.8360 | 91000 | 0.3716 |
5.8680 | 91500 | 0.3615 |
5.9001 | 92000 | 0.3532 |
5.9321 | 92500 | 0.3747 |
5.9642 | 93000 | 0.3521 |
5.9963 | 93500 | 0.362 |
6.0 | 93558 | - |
6.0283 | 94000 | 0.3701 |
6.0604 | 94500 | 0.3636 |
6.0925 | 95000 | 0.3556 |
6.1245 | 95500 | 0.3508 |
6.1566 | 96000 | 0.3626 |
6.1887 | 96500 | 0.3618 |
6.2207 | 97000 | 0.3683 |
6.2528 | 97500 | 0.362 |
6.2849 | 98000 | 0.3534 |
6.3169 | 98500 | 0.3643 |
6.3490 | 99000 | 0.36 |
6.3811 | 99500 | 0.3592 |
6.4131 | 100000 | 0.3606 |
6.4452 | 100500 | 0.369 |
6.4773 | 101000 | 0.3607 |
6.5093 | 101500 | 0.3683 |
6.5414 | 102000 | 0.3648 |
6.5735 | 102500 | 0.3481 |
6.6055 | 103000 | 0.3565 |
6.6376 | 103500 | 0.3555 |
6.6697 | 104000 | 0.347 |
6.7017 | 104500 | 0.3585 |
6.7338 | 105000 | 0.3553 |
6.7659 | 105500 | 0.3539 |
6.7979 | 106000 | 0.3638 |
6.8300 | 106500 | 0.3674 |
6.8621 | 107000 | 0.3674 |
6.8941 | 107500 | 0.3617 |
6.9262 | 108000 | 0.3655 |
6.9583 | 108500 | 0.3593 |
6.9903 | 109000 | 0.3603 |
7.0 | 109151 | - |
7.0224 | 109500 | 0.3614 |
7.0544 | 110000 | 0.3655 |
7.0865 | 110500 | 0.3597 |
7.1186 | 111000 | 0.3443 |
7.1506 | 111500 | 0.3781 |
7.1827 | 112000 | 0.3587 |
7.2148 | 112500 | 0.3676 |
7.2468 | 113000 | 0.357 |
7.2789 | 113500 | 0.3639 |
7.3110 | 114000 | 0.3691 |
7.3430 | 114500 | 0.3606 |
7.3751 | 115000 | 0.3679 |
7.4072 | 115500 | 0.3697 |
7.4392 | 116000 | 0.3726 |
7.4713 | 116500 | 0.3603 |
7.5034 | 117000 | 0.3655 |
7.5354 | 117500 | 0.3639 |
7.5675 | 118000 | 0.3557 |
7.5996 | 118500 | 0.358 |
7.6316 | 119000 | 0.3526 |
7.6637 | 119500 | 0.3579 |
7.6958 | 120000 | 0.3584 |
7.7278 | 120500 | 0.3507 |
7.7599 | 121000 | 0.3472 |
7.7920 | 121500 | 0.3757 |
7.8240 | 122000 | 0.3717 |
7.8561 | 122500 | 0.3646 |
7.8882 | 123000 | 0.3662 |
7.9202 | 123500 | 0.3668 |
7.9523 | 124000 | 0.3677 |
7.9844 | 124500 | 0.3588 |
8.0 | 124744 | - |
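As referenced above, here is a minimal sketch of how the listed hyperparameters could be wired into a Sentence Transformers training run. The output_dir, the placeholder rows, the trust_remote_code flag, and the stand-in loss are all assumptions: the card trains with beir.losses.bpr_loss.BPRLoss, which is not bundled with sentence-transformers, so MultipleNegativesRankingLoss is substituted only to make the sketch run end to end.

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import (
    MultiDatasetBatchSamplers,
    SentenceTransformerTrainingArguments,
)

# Base model per the card; trust_remote_code is assumed to be needed
# for the custom jina modules.
model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)

# Placeholder triple in the card's (sentence_0, sentence_1, sentence_2) layout.
train_dataset = Dataset.from_dict({
    "sentence_0": ["what is a fermentation lock used for"],
    "sentence_1": ["The fermentation lock or airlock is a device used in beer brewing..."],
    "sentence_2": ["Remember, fermentation is a method of preserving food..."],
})

# Stand-in loss (see lead-in): the card's actual loss is beir's BPRLoss.
loss = MultipleNegativesRankingLoss(model)

# Values below are taken from the non-default hyperparameter list above.
args = SentenceTransformerTrainingArguments(
    output_dir="jina-embeddings-v3-msmarco-v3-bpr",  # assumption
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=8,
    learning_rate=5e-5,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # placeholder: eval_strategy="steps" requires one
    loss=loss,
)
trainer.train()
```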
Citation (Sentence Transformers):

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```