---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:152913
- loss:BatchAllTripletLoss
base_model: Almawave/Velvet-2B
widget:
- source_sentence: La crisi non tocca il mercato del “luxury food”, che continua a
crescere
sentences:
- Avviso aggiornamento (1.55)
- Il fotoritocco sui social per fingere di essere al Mic
- Turchia, Biberovic giocherà da passaportato nella finestra FIBA
- source_sentence: 'Miller stagione finita, Pagani: «Rimpiazzo? Valutiamo il mercato
europeo»'
sentences:
- Beautiful, le trame della settimana dal 2 al 7 dicembre
- 'Corte dei conti: concorso per 8 funzionari a tempo indeterminato'
- 'Leonardo, AD: incontrato stamani numero uno Airbus su alleanza satellitare'
- source_sentence: 'Il segreto di Jalen Hurts in una foto sullo smartphone: così ha
vinto e sconfitto gli scettici'
sentences:
- Scarcerato il boss Ernesto Fazzalari, era il latitante più ricercato dopo Messina
Denaro
- 'UniVdA: ecco il nuovo master in Psicologia dello sport'
- 'Gran Turismo 7: arrivano quattro nuove auto con l’update 1.55 [VIDEO]'
- source_sentence: 'San Vito al Torre, recuperato un antico monumento funerario romano
dal fiume: la scoperta'
sentences:
- Charlotte Casiraghi in pubblico dopo le voci di divorzio
- Mr. Bezos, la sua non è imparzialità ma viltà
- 'Elisa Di Francisca a La Talpa: “Sono troppo vera per tenere segreti”'
- source_sentence: 'NBA, dopo l’addio a Schroeder i Nets promuovono Simmons in quintetto:
l''idea è correre'
sentences:
- 'Picasso a Milano: al Mudec la mostra sulle metamorfosi del maestro'
- Italia Viva e +Europa non parteciperanno alle elezioni regionali in Liguria
- 'Achille Costacurta rivela: «Sono stato rinchiuso per un anno e sette mesi in
un centro penale'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- average_precision
- f1
- precision
- recall
- threshold
model-index:
- name: SentenceTransformer based on Almawave/Velvet-2B
results:
- task:
type: paraphrase-mining
name: Paraphrase Mining
dataset:
name: Unknown
type: unknown
metrics:
- type: average_precision
value: 0.5283532795784699
name: Average Precision
- type: f1
value: 0.5502357974952371
name: F1
- type: precision
value: 0.5567564151181899
name: Precision
- type: recall
value: 0.5438661480521084
name: Recall
- type: threshold
value: 0.9310455322265625
name: Threshold
---
# SentenceTransformer based on Almawave/Velvet-2B
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Almawave/Velvet-2B](https://huggingface.co/Almawave/Velvet-2B). It maps sentences & paragraphs to a 2048-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Almawave/Velvet-2B](https://huggingface.co/Almawave/Velvet-2B)
- **Maximum Sequence Length:** 32768 tokens
- **Output Dimensionality:** 2048 dimensions
- **Similarity Function:** Cosine Similarity
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 32768, 'do_lower_case': False}) with Transformer model: MistralModel
(1): Pooling({'word_embedding_dimension': 2048, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': True, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("ancc/Velvet-2B-embedding-news")
# Run inference
sentences = [
"NBA, dopo l’addio a Schroeder i Nets promuovono Simmons in quintetto: l'idea è correre",
'Italia Viva e +Europa non parteciperanno alle elezioni regionali in Liguria',
'Achille Costacurta rivela: «Sono stato rinchiuso per un anno e sette mesi in un centro penale',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 2048]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
## Evaluation
### Metrics
#### Paraphrase Mining
* Evaluated with [ParaphraseMiningEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.ParaphraseMiningEvaluator)
| Metric | Value |
|:----------------------|:-----------|
| **average_precision** | **0.5284** |
| f1 | 0.5502 |
| precision | 0.5568 |
| recall | 0.5439 |
| threshold | 0.931 |
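The metrics above are threshold-based: candidate pairs whose cosine similarity exceeds the tuned threshold (~0.931) are predicted to be paraphrases, and precision/recall are computed against the gold pairs. A minimal sketch of that decision rule on pre-computed embeddings, using plain NumPy (the toy vectors below are illustrative, not outputs of this model):

```python
import numpy as np

def mine_paraphrases(embeddings, threshold=0.931):
    """Return (i, j, score) for index pairs whose cosine similarity
    exceeds `threshold`."""
    # Normalize rows so the dot product equals cosine similarity
    # (this model already L2-normalizes outputs via its Normalize() module).
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = unit @ unit.T
    pairs = []
    n = len(embeddings)
    for i in range(n):
        for j in range(i + 1, n):
            if sims[i, j] > threshold:
                pairs.append((i, j, float(sims[i, j])))
    return pairs

# Toy vectors: the first two are nearly identical, the third is orthogonal.
toy = np.array([[1.0, 0.0], [0.99, 0.14], [0.0, 1.0]])
print(mine_paraphrases(toy))  # only the (0, 1) pair clears the threshold
```

In practice `sentence_transformers.util.paraphrase_mining` does this at scale with chunked top-k search instead of the full O(n²) similarity matrix.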
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 152,913 training samples
* Columns: `sentence` and `label`
* Approximate statistics based on the first 1000 samples:
  | | sentence | label |
  |:--------|:---------|:------|
  | type | string | int |
* Samples:
  | sentence | label |
  |:------------------------------------------------------------------------------|:-----|
  | MERCATO LBA - Treviso, Giofrè: "Mercato in continua osservazione, vedremo..." | 0 |
  | Ky Bowman: Non sono soddisfatto delle mie performance | 0 |
  | LBA - Treviso, Giofrè: "Sabato la Reggiana, dobbiamo vincere. Punto" | 0 |
* Loss: [BatchAllTripletLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#batchalltripletloss)
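BatchAllTripletLoss forms every valid (anchor, positive, negative) triplet inside a batch — positives share the anchor's label, negatives do not — and averages the hinge losses `max(0, d(a, p) - d(a, n) + margin)` over the triplets that remain positive. A hedged NumPy sketch of that computation (the margin default of 5 mirrors the sentence-transformers default, but treat this as an approximation, not the library's exact implementation):

```python
import numpy as np

def batch_all_triplet_loss(embeddings, labels, margin=5.0):
    """Average hinge loss over all valid (anchor, positive, negative)
    triplets in the batch, averaging only over triplets with loss > 0."""
    # Pairwise Euclidean distance matrix.
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    n = len(labels)
    losses = []
    for a in range(n):
        for p in range(n):
            for neg in range(n):
                # Skip degenerate or invalid triplets.
                if a == p or labels[a] != labels[p] or labels[a] == labels[neg]:
                    continue
                loss = dist[a, p] - dist[a, neg] + margin
                if loss > 0:
                    losses.append(loss)
    return float(np.mean(losses)) if losses else 0.0

# Two close same-label points plus one distant point of another label
# yield a small but non-zero loss.
emb = np.array([[0.0, 0.0], [0.1, 0.0], [3.0, 4.0]])
print(batch_all_triplet_loss(emb, labels=[0, 0, 1]))
```

The library version vectorizes this with boolean masks on the distance matrix rather than three Python loops.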
### Evaluation Dataset
#### Unnamed Dataset
* Size: 9,310 evaluation samples
* Columns: `sentence` and `label`
* Approximate statistics based on the first 1000 samples:
  | | sentence | label |
  |:--------|:---------|:------|
  | type | string | int |
* Samples:
  | sentence | label |
  |:---------|:-----|
  | Supplenze: come funzionano i contratti fino al 31 dicembre 2024 e il calcolo del punteggio? | 0 |
  | Docente non abilitato assunto a tempo determinato da concorso PNRR1: in quale scuola "andrò a finire" se nella mia si perde un posto? | 0 |
  | Docenti non abilitati nominati dopo il 31 agosto da graduatorie pubblicate prima: otterranno sede di titolarità all’esito delle operazioni di mobilità [Chiarimenti] | 0 |
* Loss: [BatchAllTripletLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#batchalltripletloss)
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 32
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.25
- `seed`: 17
- `data_seed`: 17
- `bf16`: True
- `batch_sampler`: group_by_label
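The `group_by_label` batch sampler matters for this setup: BatchAllTripletLoss can only form triplets when a batch contains at least two samples with the same label. A rough sketch of the idea (not the library's exact `GroupByLabelBatchSampler`, which also handles labels with too few samples; the seed of 17 matches the hyperparameters above):

```python
import random
from collections import defaultdict

def group_by_label_batches(labels, batch_size, seed=17):
    """Yield index batches where samples sharing a label stay contiguous,
    so each batch tends to contain the positives a triplet loss needs."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for idx, label in enumerate(labels):
        groups[label].append(idx)
    buckets = list(groups.values())
    rng.shuffle(buckets)  # shuffle label groups, not individual samples
    flat = [idx for bucket in buckets for idx in bucket]
    return [flat[i:i + batch_size] for i in range(0, len(flat), batch_size)]

batches = group_by_label_batches([0, 0, 1, 1, 2, 2], batch_size=4)
print(batches)
```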
#### All Hyperparameters