SentenceTransformer based on jinaai/jina-embeddings-v3
This is a sentence-transformers model finetuned from jinaai/jina-embeddings-v3 on the hard_negative_merged dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: jinaai/jina-embeddings-v3
- Maximum Sequence Length: 2048 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: hard_negative_merged
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
  (transformer): Transformer(
    (auto_model): XLMRobertaLoRA(
      (roberta): XLMRobertaModel(
        (embeddings): XLMRobertaEmbeddings(
          (word_embeddings): ParametrizedEmbedding(
            250002, 1024, padding_idx=1
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
          (token_type_embeddings): ParametrizedEmbedding(
            1, 1024
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
        )
        (emb_drop): Dropout(p=0.1, inplace=False)
        (emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
        (encoder): XLMRobertaEncoder(
          (layers): ModuleList(
            (0-23): 24 x Block(
              (mixer): MHA(
                (rotary_emb): RotaryEmbedding()
                (Wqkv): ParametrizedLinearResidual(
                  in_features=1024, out_features=3072, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
                (inner_attn): FlashSelfAttention(
                  (drop): Dropout(p=0.1, inplace=False)
                )
                (inner_cross_attn): FlashCrossAttention(
                  (drop): Dropout(p=0.1, inplace=False)
                )
                (out_proj): ParametrizedLinear(
                  in_features=1024, out_features=1024, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
              )
              (dropout1): Dropout(p=0.1, inplace=False)
              (drop_path1): StochasticDepth(p=0.0, mode=row)
              (norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
              (mlp): Mlp(
                (fc1): ParametrizedLinear(
                  in_features=1024, out_features=4096, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
                (fc2): ParametrizedLinear(
                  in_features=4096, out_features=1024, bias=True
                  (parametrizations): ModuleDict(
                    (weight): ParametrizationList(
                      (0): LoRAParametrization()
                    )
                  )
                )
              )
              (dropout2): Dropout(p=0.1, inplace=False)
              (drop_path2): StochasticDepth(p=0.0, mode=row)
              (norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
            )
          )
        )
        (pooler): XLMRobertaPooler(
          (dense): ParametrizedLinear(
            in_features=1024, out_features=1024, bias=True
            (parametrizations): ModuleDict(
              (weight): ParametrizationList(
                (0): LoRAParametrization()
              )
            )
          )
          (activation): Tanh()
        )
      )
    )
  )
  (pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (normalizer): Normalize()
)
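Note that the Pooling module mean-pools the token embeddings and the trailing Normalize() module L2-normalizes the result, so encoded vectors come out unit-length and cosine similarity coincides with the dot product. A minimal sanity check of this (a sketch: loading may require trust_remote_code=True, since the jina-embeddings-v3 base ships custom modeling code):

```python
from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("Jrinky/jina_final_temp", trust_remote_code=True)
emb = model.encode(["a short test sentence"])

# The Normalize() module should make every embedding unit-length.
print(np.linalg.norm(emb[0]))  # expected: ~1.0
```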
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Jrinky/jina_final_temp")
# Run inference
sentences = [
    'What is the description of the Myrmecoleon and what are its two interpretations',
    'The stone lies at the bottom of the sea and comes to life early in the morning. When it rises from its resting-place to the surface of the sea, it opens its mouth and takes in some heavenly dew, and the rays of the sun shine around it; thus there grows within the stone a most precious, shining pearl indeed, conceived from the heavenly dew and given lustre by the rays of the sun." Interpretations\n\nThere are two interpretations of what a Myrmecoleon is. In one version, the antlion is so called because it is the "lion of ants", a large ant or small animal that hides in the dust and kills ants. In the other version, it is a beast that is the result of a mating between a lion and an ant. It has the face of a lion and the body of an ant, with each part having its appropriate nature. Because the lion part will only eat meat and the ant part can only digest grain, the ant-lion starves.',
    'It is found in Medieval bestiaries such as the Hortus Sanitatis of Jacob Meydenbach. It is also referenced in some sources as a Formicaleon (Antlion), Formicaleun or Mirmicioleon.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
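The embeddings also plug directly into retrieval. A minimal semantic-search sketch using sentence_transformers.util.semantic_search; the corpus and query below are illustrative, not from the training data:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Jrinky/jina_final_temp", trust_remote_code=True)

# Illustrative corpus; replace with your own documents.
corpus = [
    "The antlion hides in the dust and preys on ants.",
    "These binoculars are backed by a 25-year no-fault warranty.",
]
query = "What warranty do the binoculars come with?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank corpus entries by cosine similarity and keep the best matches.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")
```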
Training Details
Training Dataset
hard_negative_merged
- Dataset: hard_negative_merged
- Size: 589,508 training samples
- Columns: anchor, positive, negative_1, negative_2, and negative_3
- Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative_1 | negative_2 | negative_3 |
|---|---|---|---|---|---|
| type | string | string | string | string | string |
| min | 6 tokens | 5 tokens | 5 tokens | 5 tokens | 5 tokens |
| mean | 17.37 tokens | 122.81 tokens | 128.36 tokens | 110.47 tokens | 103.93 tokens |
| max | 37 tokens | 2048 tokens | 2048 tokens | 1920 tokens | 2048 tokens |
- Samples:
  - Sample 1:
    - anchor: What does the plot of the story revolve around
    - positive: Respawn points are created when the player accumulates enough blood collected from slain enemies or in-level blood pickups, and idles a certain distance away from immediate level hazards. Plot The plot follows the events of an unnamed young girl's arrival at the Lafcadio Academy for Troubled Young Ladies.
    - negative_1: An really interesting idea behind the story and one that had me unable to put it down some nights! View all my reviews
    - negative_2: And everything has such meaning and depth behind it. Nothing is just said casually, and it is all so thoughfully laced with emotion and words to draw you in to the story itself.
    - negative_3: It has a terribly implication that this flashback may be lasting more than a chapter. It's not as if we aren't learning anything of importance. I'm just not curious where this is going. I'm wondering when it'll finally be over. Not something you want from your audience as a story teller. In no simple terms.
  - Sample 2:
    - anchor: What type of warranty is offered with the Zhumell Signature 10x42 binoculars
    - positive: The Signature is also backed by Zhumell's full, 25-year, no-fault warranty, ensuring a lifetime of worry-free viewing. The Zhumell Signature 10x42 binoculars will give you plenty of power - whenever you need it, for as long as you need it!
    - negative_1: This item is backed by a Limited Lifetime Warranty. In the event this item should fail due to manufacturing defects during intended use, we will exchange the part free of charge (excludes shipping charges) for the original purchaser.
    - negative_2: if you have different ideas or better suggestion ,be free to leave message . Warranty and terms: -Warranty year is 1 year under normal use,the warranty period is a year from the date of original purchase.
    - negative_3: We have more than 55 years of experience designing, manufacturing and refining custom optical lenses for use in a range of industries. Our production staff follows strict ISO 9001 standards and uses state-of-the-art metrology equipment to test finished lenses for quality and performance.
  - Sample 3:
    - anchor: When did he announce his retirement from all professional rugby
    - positive: He was named in the Pro12 Dream Teams at the end of the 2014/15 and 2016/17 seasons. In April 2021 he announced his retirement from all professional rugby. International career
    - negative_1: Qualifying to play internationally for Scotland through his Glasgow-born mother, on 24 October 2012 he was named in the full Scottish national team for the 2012 end-of-year rugby union tests.
    - negative_2: After retiring from full-time professional football, he worked as a production controller before becoming a sales administrator for International Computers Limited. He lived in Southampton for the rest of his life and died on 28 January 2014.
    - negative_3: On December 15 2018, it was announced that he had left WWE voluntarily. Professional boxing record {class="wikitable" style="text-align:center;"
- Loss: cachedselfloss2.CachedInfonce with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
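cachedselfloss2.CachedInfonce is a custom loss module that is not part of the sentence-transformers release; its name and parameters (scale=20.0, cos_sim) match the cached InfoNCE family behind sentence-transformers' CachedMultipleNegativesRankingLoss. As a reference point, here is a minimal, non-cached sketch of that objective for one anchor column, one positive, and explicit hard-negative columns (the function name and shapes are illustrative):

```python
import torch
import torch.nn.functional as F

def infonce_loss(anchor_emb, positive_emb, negative_embs, scale=20.0):
    """InfoNCE over in-batch positives plus explicit hard negatives.

    anchor_emb:    (batch, dim) anchor embeddings
    positive_emb:  (batch, dim) positive embeddings
    negative_embs: list of (batch, dim) hard-negative embeddings
    """
    # Candidate pool: all positives and all hard negatives in the batch.
    candidates = torch.cat([positive_emb, *negative_embs], dim=0)

    # Scaled cosine similarity (similarity_fct="cos_sim", scale=20.0).
    scores = scale * F.cosine_similarity(
        anchor_emb.unsqueeze(1), candidates.unsqueeze(0), dim=-1
    )

    # The correct candidate for anchor i is positive i.
    labels = torch.arange(anchor_emb.size(0), device=anchor_emb.device)
    return F.cross_entropy(scores, labels)
```

The cached variant computes the same loss but streams the similarity matrix through gradient caching (Gao et al., 2021, cited below), which is what makes batch sizes like the 500 used here fit in memory.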
Evaluation Dataset
hard_negative_merged
- Dataset: hard_negative_merged
- Size: 589,508 evaluation samples
- Columns: anchor, positive, negative_1, negative_2, and negative_3
- Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative_1 | negative_2 | negative_3 |
|---|---|---|---|---|---|
| type | string | string | string | string | string |
| min | 4 tokens | 6 tokens | 6 tokens | 5 tokens | 5 tokens |
| mean | 17.27 tokens | 120.45 tokens | 123.54 tokens | 114.85 tokens | 115.74 tokens |
| max | 39 tokens | 2031 tokens | 2018 tokens | 1860 tokens | 1605 tokens |
- Samples:
  - Sample 1:
    - anchor: What could the term 'Golia' refer to
    - positive: Golia may refer to: Golia (surname); Golia, Ganjam; Golia Monastery; 1226 Golia
    - negative_1: Gouka may refer to: 9708 Gouka, a main-belt asteroid after the Dutch astronomer Adriaan Gouka; Eric Gouka (born 1970), Dutch cricketer; Gouka, Benin, a town and arrondissement
    - negative_2: Gottschelia is a genus of liverworts belonging to the family Cephaloziellaceae.
    - negative_3: Agila may refer to: Agila I (died 554), Visigothic king; Agila II (died 714), Visigothic king; Agila 2, the first Filipino satellite; Agila (album), a 1996 album by Spanish rock band Extremoduro; Agila (film), a 1980 Philippine film directed by Eddie Romero; Agila (TV series), a 1987 Philippine teledrama series; Agila Town, Benue State, Nigeria; Opel Agila or Vauxhall Agila, a city car. See also: Agila division, the 10th Infantry Division of the Philippine Army; Aguila (disambiguation)
  - Sample 2:
    - anchor: What is the timeframe in which Itera plans to potentially make an agreement with a financial institution
    - positive: As Itera's President Igor Makarov reported at today's meeting of the Russian Gas Society in Moscow, the gas company could make an agreement with a financial institution, which would make the most profitable and optimum offer, in the next two to three months. According to him, they are currently holding negotiations with several financial enterprises, which specialize in introducing companies to the financial market.
    - negative_1: The process from receipt of the funding proposal to completion of due diligence is incredibly quick, with a goal of 30 days. After initial evaluation of their proposals, a selected number of start-ups, usually 6 to 8, are asked to make preliminary presentations to the steering committee.
    - negative_2: Coinexchange, Cryptopia, YoBit, HitBtc, Binance, Bittrex Q1 2018 : Partners announced (Debit card & Merchants) We are currently in negotiation with major payment providers to offer you a worldwide usable card. Q1/2 2018 : ETHX Beta Wallet release (Android, Windows, iOS) and debit cart pre-order Q3 201 : More partnerships Wider range of companies accepting ETHX. First targets are the biggest e-commerce websites. We will release a beta application to collect user reviews and answer to the community. The app is expected to come out in Q1 2018 on Android and later on iOS. We are very sensitive about our community welfare, so we try to do our best to keep our members informed about the latest news. The app will also help us to inform and get suggestions. Ethereum X is community driven. If you are also a cryptography and distributed ledger tech-nology enthusiast and want to support the project, please feel free to contact us. Additional developers as well as community managers for our social...
    - negative_3: The project will be floated in the market for solicitation of expression of interest from the potential investors in June 2017. The land slots will be awarded to the successful bidders based on evaluation by the end of August, 2017. The Monitoring and Evaluation (M&E) of forest sites, awarded to successful bidders, will be done in collaboration with the Forestry, Wildlife & Fisheries Department, Government of the Punjab, as per the provisions of PPP Act, 2014, and The Punjab Forest (Amendment) Act, 2016. Revenue sharing will be done in this initiative. The Company in order to effectively reach out to the business community is organizing seminars in collaboration with various Chambers of Commerce & Industry to sensitize business groups to invest in the opportunity.
  - Sample 3:
    - anchor: What role does File History play in the issue being discussed
    - positive: What has File History got to do with the problem I don't know but maybe someone at DC does I post the question..... get lots of ideas and methods to remove the naughty files, but I still don't know why deleting file history worked unless the file history is tacked onto the file somehow Since then I've been checking more of the "includes folders" for more over-long files and trying to figure what to do with them. The files are easy to find once you start paying attention Open a folder and if it contains extra long files a scroll bar appears at the bottom of the page Found some more files and started playing.
    - negative_1: Newspapers feature stories about lost computers and memory sticks but a more common and longstanding problem is about staff accessing records that they have no right to see. It has always been possible for staff to look at paper records, and in most cases, there is no track of record.
    - negative_2: In data vault it is referred to as the record source. Background The need to identify systems of record can become acute in organizations where management information systems have been built by taking output data from multiple source systems, re-processing this data, and then re-presenting the result for a new business use.
    - negative_3: The idea of preservation, in the sense of both immortalization and protection is addressed. How do we decide what to remember from history, and what do we leave out
- Loss: cachedselfloss2.CachedInfonce with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 500
- per_device_eval_batch_size: 500
- learning_rate: 2e-05
- num_train_epochs: 10
- warmup_ratio: 0.1
- bf16: True
- batch_sampler: no_duplicates
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 500
- per_device_eval_batch_size: 500
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 10
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: True
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- tp_size: 0
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
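For orientation, here is a sketch of how the non-default values above map onto the sentence-transformers 3.x training API. The custom cachedselfloss2.CachedInfonce module is not public, so CachedMultipleNegativesRankingLoss is shown as a stand-in with the same scale and similarity function, and the one-row dataset is purely illustrative:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.util import cos_sim

model = SentenceTransformer("jinaai/jina-embeddings-v3", trust_remote_code=True)

# Stand-in for cachedselfloss2.CachedInfonce: same scale and similarity_fct.
loss = CachedMultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=cos_sim)

# Illustrative one-row dataset with the hard_negative_merged column layout.
train_dataset = Dataset.from_dict({
    "anchor": ["What warranty is offered with the binoculars"],
    "positive": ["The binoculars are backed by a 25-year no-fault warranty."],
    "negative_1": ["This item carries a limited lifetime warranty on defects."],
    "negative_2": ["The warranty period is one year from the date of purchase."],
    "negative_3": ["We have decades of experience manufacturing optical lenses."],
})

args = SentenceTransformerTrainingArguments(
    output_dir="jina_final_temp",
    eval_strategy="steps",
    per_device_train_batch_size=500,
    per_device_eval_batch_size=500,
    learning_rate=2e-05,
    num_train_epochs=10,
    warmup_ratio=0.1,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # illustrative; use a held-out split in practice
    loss=loss,
)
trainer.train()
```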
Training Logs
Epoch | Step | Training Loss | Validation Loss |
---|---|---|---|
0.1786 | 40 | 8.7768 | 8.5959 |
0.3571 | 80 | 8.8187 | 8.5129 |
0.5357 | 120 | 8.6175 | 8.2742 |
0.7143 | 160 | 8.0868 | 7.8954 |
0.8929 | 200 | 7.5681 | 7.3531 |
1.0714 | 240 | 7.0288 | 6.5431 |
1.25 | 280 | 6.2266 | 5.8462 |
1.4286 | 320 | 5.4682 | 5.2924 |
1.6071 | 360 | 5.0398 | 4.8148 |
1.7857 | 400 | 4.5158 | 4.4110 |
1.9643 | 440 | 4.184 | 4.0419 |
2.1429 | 480 | 3.7868 | 3.7165 |
2.3214 | 520 | 3.6258 | 3.4216 |
2.5 | 560 | 3.2262 | 3.1530 |
2.6786 | 600 | 3.0175 | 2.9128 |
2.8571 | 640 | 2.75 | 2.6999 |
3.0357 | 680 | 2.4915 | 2.5085 |
Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.4.1
- Transformers: 4.50.0
- PyTorch: 2.3.1+cu121
- Accelerate: 1.5.2
- Datasets: 3.4.1
- Tokenizers: 0.21.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
CachedInfonce
@misc{gao2021scaling,
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
year={2021},
eprint={2101.06983},
archivePrefix={arXiv},
primaryClass={cs.LG}
}