---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:5822
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: nomic-ai/nomic-embed-text-v1.5
widget:
- source_sentence: "submitted to the CIA for each year.” Id. at 1–2. On July 22,\
\ 2010, the CIA responded to this \nrequest, stating “[w]e . . . have determined\
\ that our record systems are not configured in a way \nthat would allow us to\
\ perform a search reasonably calculated to lead to the responsive record \nwithout\
\ an unreasonable effort.” First Lutz Decl. Ex. L at 1, No. 11-444, ECF No. 20-3.\
\ As a"
sentences:
- How many instances of individual's names does the plaintiff point to?
- What date did the CIA respond to the request?
- What phrase does the Bar propose to delete references to in the Preamble to Chapter
4?
- source_sentence: "City Department of Education, the self-represented plaintiff \n\
submitted a filing containing hallucinations. No. 24-cv-04232, \n \n20 \n2024\
\ WL 3460049, at *7 (S.D.N.Y. July 18, 2024) (unpublished \nopinion). The court\
\ noted that “[s]anctions may be imposed for \nsubmitting false and nonexistent\
\ legal authority to the [c]ourt.” Id. \nHowever, the court declined to impose\
\ sanctions due to the"
sentences:
- In which sections of their opposition does the plaintiff discuss the deliberative-process
privilege?
- Who submitted the filing containing hallucinations?
- When did the plaintiff file a motion?
- source_sentence: "§ 424 and Exemption 3; Exemption 5; and/or Exemption 6. See Second\
\ Williams Decl. Ex. A. \n120 \n \nTherefore, the Court need not decide whether\
\ the DIA has the independent authority to invoke \nthe National Security Act\
\ as an Exemption 3 withholding statute. \n3. \nODNI \nFinally, the plaintiff\
\ challenges the ODNI’s decision to withhold certain portions of e-"
sentences:
- How many counts did EPIC bring related to the APA?
- Which organization's decision is being challenged by the plaintiff?
- Does the Government agree with EPIC's claim about their Answer?
- source_sentence: "confidentiality agreement/order, that remain following those discussions.\
\ This is a \nfinal report and notice of exceptions shall be filed within three\
\ days of the date of \nthis report, pursuant to Court of Chancery Rule 144(d)(2),\
\ given the expedited and \nsummary nature of Section 220 proceedings. \n \n\
\ \n \n \n \n \n \nRespectfully, \n \n \n \n \n \n \n \n \n/s/ Patricia W. Griffin"
sentences:
- Who signed this document?
- Did Mr. Mooney allege that the video was altered or tampered with?
- Did the plaintiff report the defendant at that time?
- source_sentence: "such an argument, and she does not offer any case law, cites to\
\ secondary sources, dictionaries \nor grammatical texts, arguments by analogy,\
\ or other citations, except for the mere assertion \nthat defendant failed to\
\ move in a timely fashion after he was “on notice” of the ex parte order. \n\
A reviewing court is entitled to have issues clearly defined with relevant authority\
\ cited."
sentences:
- What page is Cross-MJAR's emphasis mentioned on?
- What mere assertion does she make?
- On what dates did the Commission meet in 2019?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: nomic-embed-text-v1.5
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.5486862442040186
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5965996908809892
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7017001545595054
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7697063369397218
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.5486862442040186
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.5239567233384853
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.40989180834621336
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.24142194744976814
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.19049459041731065
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5101751674394642
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6503091190108191
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7595311695002576
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6615339195276682
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.6004440519123668
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6427552042140723
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.5409582689335394
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.58887171561051
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6924265842349304
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7743431221020093
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.5409582689335394
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.5172591447707368
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.4034003091190108
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.24188562596599691
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.18740340030911898
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5054095826893354
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6411643482740855
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7622359608449253
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6576404555647709
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5934416476533937
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6355153178607286
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.508500772797527
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5564142194744977
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.6707882534775889
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7449768160741885
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.508500772797527
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.4873776403915508
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.38639876352395675
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.23122102009273574
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.17671303451828954
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.47707367336424517
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.6141164348274084
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.7257856774858321
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6257588263652936
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.562961531856431
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.6091899586876254
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.45131375579598143
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5054095826893354
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.58887171561051
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6862442040185471
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.45131375579598143
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.437403400309119
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.3415765069551777
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.21298299845440496
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.15700669757856775
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4282586295723854
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5426326635754766
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6720762493560021
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5679548352076085
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.503881160913618
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5511797935827811
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.35239567233384855
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.3894899536321484
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.47295208655332305
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.5641421947449768
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.35239567233384855
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.33900051519835134
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.26955177743431225
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.1723338485316847
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.12171561051004637
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.33217413704276144
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.4310922205048943
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.5446934569809376
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.45200452556542003
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.39659662422413555
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.44614347894124107
name: Cosine Map@100
---
# nomic-embed-text-v1.5
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [nomic-ai/nomic-embed-text-v1.5](https://huggingface.co/nomic-ai/nomic-embed-text-v1.5) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [nomic-ai/nomic-embed-text-v1.5](https://huggingface.co/nomic-ai/nomic-embed-text-v1.5)
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NomicBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Thejina/nomic-embed-text-finetuned")
# Run inference
sentences = [
    'such an argument, and she does not offer any case law, cites to secondary sources, dictionaries \nor grammatical texts, arguments by analogy, or other citations, except for the mere assertion \nthat defendant failed to move in a timely fashion after he was “on notice” of the ex parte order. \nA reviewing court is entitled to have issues clearly defined with relevant authority cited.',
    'What mere assertion does she make?',
    "What page is Cross-MJAR's emphasis mentioned on?",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
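Because the model was trained with `MatryoshkaLoss`, its embeddings can also be truncated to the smaller dimensions evaluated below (512, 256, 128, or 64) at a moderate cost in retrieval quality. A minimal sketch, assuming a sentence-transformers version that supports the `truncate_dim` argument; `trust_remote_code=True` may be needed for the custom NomicBERT code:
```python
from sentence_transformers import SentenceTransformer

# Keep only the first 256 dimensions of each embedding (one of the Matryoshka sizes).
model = SentenceTransformer(
    "Thejina/nomic-embed-text-finetuned",
    truncate_dim=256,
    trust_remote_code=True,  # may be required for the custom NomicBertModel code
)

embeddings = model.encode([
    "Who signed this document?",
    "What date did the CIA respond to the request?",
])
print(embeddings.shape)
# (2, 256)
```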
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `dim_768`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 768
}
```
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.5487 |
| cosine_accuracy@3 | 0.5966 |
| cosine_accuracy@5 | 0.7017 |
| cosine_accuracy@10 | 0.7697 |
| cosine_precision@1 | 0.5487 |
| cosine_precision@3 | 0.524 |
| cosine_precision@5 | 0.4099 |
| cosine_precision@10 | 0.2414 |
| cosine_recall@1 | 0.1905 |
| cosine_recall@3 | 0.5102 |
| cosine_recall@5 | 0.6503 |
| cosine_recall@10 | 0.7595 |
| **cosine_ndcg@10** | **0.6615** |
| cosine_mrr@10 | 0.6004 |
| cosine_map@100 | 0.6428 |
#### Information Retrieval
* Dataset: `dim_512`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 512
}
```
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.541 |
| cosine_accuracy@3 | 0.5889 |
| cosine_accuracy@5 | 0.6924 |
| cosine_accuracy@10 | 0.7743 |
| cosine_precision@1 | 0.541 |
| cosine_precision@3 | 0.5173 |
| cosine_precision@5 | 0.4034 |
| cosine_precision@10 | 0.2419 |
| cosine_recall@1 | 0.1874 |
| cosine_recall@3 | 0.5054 |
| cosine_recall@5 | 0.6412 |
| cosine_recall@10 | 0.7622 |
| **cosine_ndcg@10** | **0.6576** |
| cosine_mrr@10 | 0.5934 |
| cosine_map@100 | 0.6355 |
#### Information Retrieval
* Dataset: `dim_256`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 256
}
```
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.5085 |
| cosine_accuracy@3 | 0.5564 |
| cosine_accuracy@5 | 0.6708 |
| cosine_accuracy@10 | 0.745 |
| cosine_precision@1 | 0.5085 |
| cosine_precision@3 | 0.4874 |
| cosine_precision@5 | 0.3864 |
| cosine_precision@10 | 0.2312 |
| cosine_recall@1 | 0.1767 |
| cosine_recall@3 | 0.4771 |
| cosine_recall@5 | 0.6141 |
| cosine_recall@10 | 0.7258 |
| **cosine_ndcg@10** | **0.6258** |
| cosine_mrr@10 | 0.563 |
| cosine_map@100 | 0.6092 |
#### Information Retrieval
* Dataset: `dim_128`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 128
}
```
| Metric | Value |
|:--------------------|:----------|
| cosine_accuracy@1 | 0.4513 |
| cosine_accuracy@3 | 0.5054 |
| cosine_accuracy@5 | 0.5889 |
| cosine_accuracy@10 | 0.6862 |
| cosine_precision@1 | 0.4513 |
| cosine_precision@3 | 0.4374 |
| cosine_precision@5 | 0.3416 |
| cosine_precision@10 | 0.213 |
| cosine_recall@1 | 0.157 |
| cosine_recall@3 | 0.4283 |
| cosine_recall@5 | 0.5426 |
| cosine_recall@10 | 0.6721 |
| **cosine_ndcg@10** | **0.568** |
| cosine_mrr@10 | 0.5039 |
| cosine_map@100 | 0.5512 |
#### Information Retrieval
* Dataset: `dim_64`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
```json
{
"truncate_dim": 64
}
```
| Metric | Value |
|:--------------------|:----------|
| cosine_accuracy@1 | 0.3524 |
| cosine_accuracy@3 | 0.3895 |
| cosine_accuracy@5 | 0.473 |
| cosine_accuracy@10 | 0.5641 |
| cosine_precision@1 | 0.3524 |
| cosine_precision@3 | 0.339 |
| cosine_precision@5 | 0.2696 |
| cosine_precision@10 | 0.1723 |
| cosine_recall@1 | 0.1217 |
| cosine_recall@3 | 0.3322 |
| cosine_recall@5 | 0.4311 |
| cosine_recall@10 | 0.5447 |
| **cosine_ndcg@10** | **0.452** |
| cosine_mrr@10 | 0.3966 |
| cosine_map@100 | 0.4461 |
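The per-dimension results above are obtained by truncating the embeddings before scoring. A minimal sketch of how such an evaluation could be reproduced with `InformationRetrievalEvaluator`; the `queries`, `corpus`, and `relevant_docs` dictionaries here are hypothetical placeholders, not the actual evaluation split:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("Thejina/nomic-embed-text-finetuned", trust_remote_code=True)

# Hypothetical toy data; the reported numbers use the held-out split of the json dataset.
queries = {"q1": "Who signed this document?"}
corpus = {
    "d1": "... Respectfully, /s/ Patricia W. Griffin",
    "d2": "On July 22, 2010, the CIA responded to this request ...",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    truncate_dim=256,  # score at one of the Matryoshka dimensions
    name="dim_256",
)
print(evaluator(model))  # dict of accuracy@k, precision@k, recall@k, NDCG, MRR, MAP
```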
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 5,822 training samples
* Columns: `positive` and `anchor`
* Approximate statistics based on the first 1000 samples:
  |      | positive | anchor |
  |:-----|:---------|:-------|
  | type | string   | string |
* Samples:
  | positive | anchor |
  |:---------|:-------|
  | <code>functional test, too. Id. at 89–90. Still, the Court made clear that this functional test was “not relevant.” Id. at 90. So, just as in Energy Research, its application of the functional test was dicta. And because this discussion relied on the dicta from Energy Research, this was dicta upon dicta. The Government is thus imprecise when it asserts as the “law of the case” that the</code> | <code>What page is the functional test mentioned as 'not relevant'?</code> |
  | <code>authenticated through his testimony under Maryland Rule 5-901(b)(1) as a witness with personal knowledge of the events. - 6 - The part of the video depicting the shooting was properly authenticated through circumstantial evidence under Maryland Rule 5-901(b)(4), as there was sufficient circumstantial evidence from which a reasonable juror could have inferred that the video</code> | <code>Which part of the video was authenticated?</code> |
  | <code>KLAN202300916 9 Los derechos morales, a su vez, están fundamentalmente protegidos por la legislación estatal. Esta reconoce los derechos de los autores como exclusivos de estos y los protege no solo en beneficio propio, sino también de la sociedad por la contribución social y cultural que históricamente se le ha reconocido a la</code> | <code>¿En beneficio de quién se protegen los derechos de los autores?</code> |
* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
    "loss": "MultipleNegativesRankingLoss",
    "matryoshka_dims": [
        768,
        512,
        256,
        128,
        64
    ],
    "matryoshka_weights": [
        1,
        1,
        1,
        1,
        1
    ],
    "n_dims_per_step": -1
}
```
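In code, these parameters correspond roughly to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss` (a minimal sketch; `model` here is the base model being fine-tuned):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)

# Inner loss: in-batch negatives ranking over (anchor, positive) pairs.
inner_loss = MultipleNegativesRankingLoss(model)

# Outer loss: apply the inner loss at each truncated embedding size, equally weighted.
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
)
```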
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
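
These settings map onto `SentenceTransformerTrainingArguments` roughly as follows (a minimal sketch; `output_dir` and `save_strategy` are assumptions, since only non-default values are listed above):
```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="nomic-embed-text-finetuned",  # assumption: output path not stated above
    num_train_epochs=4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    eval_strategy="epoch",
    save_strategy="epoch",  # assumption: must match eval_strategy for load_best_model_at_end
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoids duplicate texts within a batch
)
```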
#### All Hyperparameters