SentenceTransformer based on intfloat/e5-base-v2

This is a sentence-transformers model fine-tuned from intfloat/e5-base-v2 on the quati and msmarco datasets. It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: intfloat/e5-base-v2
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Datasets:
    • quati
    • msmarco
  • Language: pt

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
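The Pooling and Normalize modules above are simple to reproduce by hand. The following is a minimal sketch, assuming dummy tensors stand in for the BertModel token embeddings: mean pooling over non-padding tokens, then L2 normalization.

```python
import torch

# Minimal sketch of the Pooling (mean over tokens) + Normalize() steps,
# applied to dummy token embeddings (batch=2, seq_len=4, dim=768).
def mean_pool(token_embeddings, attention_mask):
    # Zero out padding positions, then average the remaining token vectors.
    mask = attention_mask.unsqueeze(-1).float()
    return (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

torch.manual_seed(0)
token_embeddings = torch.randn(2, 4, 768)     # stand-in for BertModel output
attention_mask = torch.tensor([[1, 1, 1, 0],  # second sequence is shorter
                               [1, 1, 0, 0]])

pooled = mean_pool(token_embeddings, attention_mask)
embeddings = torch.nn.functional.normalize(pooled, p=2, dim=1)  # Normalize() module
print(embeddings.shape)  # torch.Size([2, 768])
```

Because of the final normalization step, every output vector has unit length, which is why cosine similarity and dot product are interchangeable for this model.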

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("nskwal/rayumi-5epochs")
# Run inference
sentences = [
    'para que serve a azitromicina',
    'Infecções bacterianas. A azitromicina é mais comumente usada para tratar as seguintes infecções: 1 Infecções respiratórias, como bronquite. 2 Infecções de ouvido (otite média). 3 infecções sinusais (sinusite). 4 Pneumonia. 5 Infecções da garganta (amigdalite / faringite). 6 Infecções da pele, como celulite, foliculite ou impetigo.',
    'A azitromicina também pode ser usada para tratar várias outras infecções bacterianas mais incomuns. A azitromicina não é eficaz contra nenhuma infecção causada por um vírus, como gripe, gastroenterite ou resfriado comum.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
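Since the model's Normalize() module makes every embedding unit-length, cosine similarity reduces to a dot product, so ranking passages against a query is a single matrix product. A minimal sketch with dummy embeddings standing in for model.encode() output:

```python
import numpy as np

# Rank passages for a query by cosine similarity. With unit-length
# embeddings, cosine similarity is just a dot product.
rng = np.random.default_rng(0)

def l2_normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

query_emb = l2_normalize(rng.normal(size=(768,)))      # stand-in for an encoded query
passage_embs = l2_normalize(rng.normal(size=(5, 768))) # stand-in for encoded passages

scores = passage_embs @ query_emb    # cosine similarities in [-1, 1]
ranking = np.argsort(-scores)        # indices of passages, best match first
print(ranking, scores[ranking])
```

In practice you would replace the dummy arrays with `model.encode(queries)` and `model.encode(passages)`; the ranking step is unchanged.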

Training Details

Training Datasets

quati

  • Dataset: quati
  • Size: 1,415 training samples
  • Columns: query and passage
  • Approximate statistics based on the first 1000 samples:
    • query (string): min 10, mean 16.57, max 24 tokens
    • passage (string): min 68, mean 379.86, max 512 tokens
  • Samples:
    • query: O que são os celulares "mid-range"?
      passage: Câmeras traseiras: 64 MP quad-pixel + 12 MP (ultra-wide) + 5 MP (macro) + 5 MP (sensor de profundidade) Filma em: 4K Câmera frontal: 32 MP Bateria: 4.500 mAh com carregamento turbo de 25W Tem conexão 3G e 4G Pontos positivos: Tela grande com resolução Full HD 128 GB de armazenamento é um bom espaço Câmera de 64 MP que filma em 4K Câmera frontal também filma em 4K Processador potente para uso no dia a dia Pontos negativos: Bateria com tamanho abaixo dos concorrentes Sem proteção contra água Melhor Preço Conclusões Como dito no começo da matéria o mercado de celulares está crescendo exponencialmente e isso faz com que estejam disponíveis vários modelos no mercado, para os mais diferentes gostos. Nem todo mundo precisa ou está disposto a pagar pelos melhores celulares e é onde entram os modelos citados nesta lista: Um bom celular por um preço mediano. Para um uso comum estes modelos atendem muito bem. Se você sentiu falta de alguma opção nesta lista deixe ai nos comentários. Vale lembrar ...
    • query: O que são os celulares "mid-range"?
      passage: Smartphone Motorola Moto G8 Plus Imagem Celular Intermediário Detalhes Smartphone Xiaomi Redmi Note 8 Pro Melhor celular intermediário, processador rápido Smartphone Xiaomi Redmi Note 8 Melhor celular intermediário custo benefício, câmera quádrupla Smartphone Motorola One Action Sensor exclusivo para vídeo Smartphone Huawei P30 Lite Diversas tecnologias diferenciadas Smartphone Samsung Galaxy A50 Câmera frontal de 25 MP Smartphone Samsung Galaxy A30s Leitor de impressão digital embutido na tela Smartphone Motorola Moto G8 Plus Design moderno e bonito Hoje em dia os smartphones são verdadeiros aliados. Apenas com eles é possível executar uma grande quantidade de tarefas como ligações, mensagens, acesso a e-mail e redes sociais e muito mais. Mas para conseguir isso é importante ter em mãos um aparelho que reúna componentes de qualidade, tal como, boa câmera, ótimo espaço de armazenamento e processador ágil. Pensando nisso, selecionamos os modelos de celular intermediário que englobam as ...
    • query: O que são os celulares "mid-range"?
      passage: Os monócitos, eosinófilos, basófilos e seus progenitores circulam no sangue em pequenas quantidades, no entanto, essas células são muitas vezes combinados em um grupo que é designado como MXD ou MID. Este grupo pode ser expressa como uma percentagem do número total de leucócitos (MXD%), ou um número absoluto (MXD #, # MID). Estes tipos de células do sangue e as células brancas do sangue e são funções importantes (a luta contra parasitas, bactérias, reacções alérgicas, etc.). Absoluta e percentagem deste valor aumenta se o aumento do número de um dos tipos de células na sua composição. Para determinar a natureza da alteração geralmente é estudar a percentagem de cada tipo de célula (monócitos, eosinófilos, basófilos e os seus precursores). Requisitos: eosinófilos reduzidos e aumento no sangue # MID (MID, MXD #) 0,2-0,8 x 109 / l MID% (MXD%) 5 - 10% O número de granulócitos (GRA, GRAN) Granulócitos - são leucócitos que contêm grânulos (leucócitos granulares). Granulócitos 3 tipos de célu...
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

msmarco

  • Dataset: msmarco
  • Size: 39,780,811 training samples
  • Columns: query, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    • query (string): min 5, mean 14.68, max 47 tokens
    • positive (string): min 17, mean 136.04, max 443 tokens
    • negative (string): min 30, mean 123.94, max 499 tokens
  • Samples:
    • query: é um pouco de cafeína ok durante a gravidez
      positive: Não sabemos muito sobre os efeitos da cafeína durante a gravidez sobre você e seu bebê. Portanto, é melhor limitar a quantidade que você recebe a cada dia. Se você estiver grávida, limite a cafeína a 200 miligramas por dia. Isso é aproximadamente a quantidade em 1 x 8 onças de café ou uma xícara de 12 onças de café.
      negative: Em geral, é seguro para mulheres grávidas comer chocolate porque estudos demonstraram alguns benefícios de comer chocolate durante a gravidez. No entanto, as mulheres grávidas devem garantir que a ingestão de cafeína seja inferior a 200 mg por dia.
    • query: que fruta é nativa da Austrália
      positive: Passiflora herbertiana. Um raro maracujá nativo da Austrália. Os frutos são de casca verde, polpa branca, com uma classificação comestível desconhecida. Algumas fontes listam as frutas como comestíveis, doces e saborosas, enquanto outras listam as frutas como sendo amargas e não comestíveis.assiflora herbertiana. Um raro maracujá nativo da Austrália. Os frutos são de casca verde, polpa branca, com uma classificação comestível desconhecida. Algumas fontes listam as frutas como comestíveis, doces e saborosas, enquanto outras listam as frutas como amargas e não comestíveis.
      negative: A noz de cola é o fruto da árvore da cola, um gênero (Cola) de árvores que são nativas das florestas tropicais da África.
    • query: quão grande é o exército canadense
      positive: As Forças Armadas canadenses. 1 A primeira missão de manutenção da paz canadense em grande escala começou no Egito em 24 de novembro de 1956. 2 Há aproximadamente 65.000 membros da Força Regular e 25.000 membros reservistas nas forças armadas canadenses. 3 No Canadá, o dia 9 de agosto é designado como Dia Nacional dos Pacificadores.
      negative: O Canadian Physician Health Institute (CPHI) é um programa nacional criado em 2012 como uma colaboração entre a Canadian Medical Association (CMA), a Canadian Medical Foundation (CMF) e as Provincial and Territorial Medical Associations (PTMAs).
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

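Both datasets are trained with MultipleNegativesRankingLoss, which treats each query's paired passage as the positive and every other passage in the batch as an in-batch negative. A minimal sketch of this loss with the scale=20.0 and cos_sim settings listed above, on dummy embeddings (not the library's implementation):

```python
import torch
import torch.nn.functional as F

# MultipleNegativesRankingLoss sketch: cosine similarities between all
# query/passage pairs in the batch, scaled, then cross-entropy with the
# diagonal (matching pairs) as the target labels.
def mnr_loss(query_embs, passage_embs, scale=20.0):
    q = F.normalize(query_embs, dim=-1)
    p = F.normalize(passage_embs, dim=-1)
    scores = scale * (q @ p.T)                 # (batch, batch) cos_sim * scale
    labels = torch.arange(scores.size(0))      # positive pair sits on the diagonal
    return F.cross_entropy(scores, labels)

torch.manual_seed(0)
queries = torch.randn(8, 768)    # dummy query embeddings
passages = torch.randn(8, 768)   # dummy positive-passage embeddings
loss = mnr_loss(queries, passages)
print(loss.item())
```

This also explains the `no_duplicates` batch sampler below: a duplicate passage in the batch would act as a false negative for its own query.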
Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • gradient_accumulation_steps: 2
  • learning_rate: 2e-05
  • weight_decay: 0.01
  • num_train_epochs: 5
  • warmup_ratio: 0.05
  • fp16: True
  • batch_sampler: no_duplicates
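As a hedged sketch, the non-default values above map onto the sentence-transformers v4 training API roughly as follows; `output_dir` is assumed, and argument names should be verified against your installed version.

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

# Hypothetical reconstruction of the non-default hyperparameters above.
args = SentenceTransformerTrainingArguments(
    output_dir="rayumi-5epochs",                 # assumed output path
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=2,               # effective batch of 256 per device
    learning_rate=2e-5,
    weight_decay=0.01,
    num_train_epochs=5,
    warmup_ratio=0.05,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,   # no repeated texts within a batch
)
```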

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 2
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.01
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.05
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss
0.0051 100 7.873
0.0103 200 6.1067
0.0154 300 3.2422
0.0206 400 1.4932
0.0257 500 1.0253
0.0309 600 0.8821
0.0360 700 0.8116
0.0412 800 0.7522
0.0463 900 0.713
0.0515 1000 0.674
0.0566 1100 0.6472
0.0618 1200 0.6078
0.0669 1300 0.5876
0.0721 1400 0.5541
0.0772 1500 0.5403
0.0824 1600 0.5201
0.0875 1700 0.4952
0.0927 1800 0.4759
0.0978 1900 0.4533
0.1030 2000 0.4435
0.1081 2100 0.4214
0.1133 2200 0.41
0.1184 2300 0.398
0.1236 2400 0.3816
0.1287 2500 0.37
0.1339 2600 0.3566
0.1390 2700 0.3424
0.1441 2800 0.3259
0.1493 2900 0.3204
0.1544 3000 0.3128
0.1596 3100 0.2974
0.1647 3200 0.292
0.1699 3300 0.2861
0.1750 3400 0.2732
0.1802 3500 0.2666
0.1853 3600 0.2565
0.1905 3700 0.2485
0.1956 3800 0.244
0.2008 3900 0.2351
0.2059 4000 0.2297
0.2111 4100 0.2229
0.2162 4200 0.2164
0.2214 4300 0.2114
0.2265 4400 0.2053
0.2317 4500 0.1999
0.2368 4600 0.1953
0.2420 4700 0.1919
0.2471 4800 0.1838
0.2523 4900 0.179
0.2574 5000 0.1775
0.2626 5100 0.1713
0.2677 5200 0.1701
0.2729 5300 0.1659
0.2780 5400 0.1615
0.2831 5500 0.1579
0.2883 5600 0.1524
0.2934 5700 0.1501
0.2986 5800 0.1469
0.3037 5900 0.1455
0.3089 6000 0.143
0.3140 6100 0.1432
0.3192 6200 0.1397
0.3243 6300 0.1356
0.3295 6400 0.1348
0.3346 6500 0.131
0.3398 6600 0.1283
0.3449 6700 0.129
0.3501 6800 0.1247
0.3552 6900 0.1225
0.3604 7000 0.122
0.3655 7100 0.1203
0.3707 7200 0.1209
0.3758 7300 0.1179
0.3810 7400 0.1148
0.3861 7500 0.1129
0.3913 7600 0.113
0.3964 7700 0.1105
0.4016 7800 0.1085
0.4067 7900 0.1077
0.4119 8000 0.1091
0.4170 8100 0.1061
0.4221 8200 0.106
0.4273 8300 0.1034
0.4324 8400 0.103
0.4376 8500 0.1011
0.4427 8600 0.0993
0.4479 8700 0.0992
0.4530 8800 0.0965
0.4582 8900 0.0951
0.4633 9000 0.0966
0.4685 9100 0.0948
0.4736 9200 0.0944
0.4788 9300 0.0926
0.4839 9400 0.0921
0.4891 9500 0.0914
0.4942 9600 0.0901
0.4994 9700 0.0899
0.5045 9800 0.0895
0.5097 9900 0.0868
0.5148 10000 0.088
0.5200 10100 0.087
0.5251 10200 0.0865
0.5303 10300 0.0836
0.5354 10400 0.0832
0.5406 10500 0.0825
0.5457 10600 0.0816
0.5509 10700 0.0829
0.5560 10800 0.0803
0.5611 10900 0.0811
0.5663 11000 0.0812
0.5714 11100 0.0802
0.5766 11200 0.0785
0.5817 11300 0.0801
0.5869 11400 0.0773
0.5920 11500 0.0763
0.5972 11600 0.0778
0.6023 11700 0.0756
0.6075 11800 0.0743
0.6126 11900 0.075
0.6178 12000 0.0751
0.6229 12100 0.0735
0.6281 12200 0.0733
0.6332 12300 0.0706
0.6384 12400 0.0725
0.6435 12500 0.0717
0.6487 12600 0.0722
0.6538 12700 0.0709
0.6590 12800 0.07
0.6641 12900 0.0688
0.6693 13000 0.0693
0.6744 13100 0.0698
0.6796 13200 0.0674
0.6847 13300 0.067
0.6899 13400 0.067
0.6950 13500 0.0664
0.7001 13600 0.0658
0.7053 13700 0.0649
0.7104 13800 0.066
0.7156 13900 0.0659
0.7207 14000 0.0652
0.7259 14100 0.065
0.7310 14200 0.0629
0.7362 14300 0.063
0.7413 14400 0.0629
0.7465 14500 0.0631
0.7516 14600 0.0629
0.7568 14700 0.0638
0.7619 14800 0.0629
0.7671 14900 0.0623
0.7722 15000 0.0614
0.7774 15100 0.0609
0.7825 15200 0.0604
0.7877 15300 0.0592
0.7928 15400 0.059
0.7980 15500 0.0597
0.8031 15600 0.0587
0.8083 15700 0.0584
0.8134 15800 0.0577
0.8186 15900 0.058
0.8237 16000 0.0575
0.8289 16100 0.058
0.8340 16200 0.0571
0.8391 16300 0.0582
0.8443 16400 0.0572
0.8494 16500 0.056
0.8546 16600 0.0558
0.8597 16700 0.0568
0.8649 16800 0.0554
0.8700 16900 0.0543
0.8752 17000 0.0555
0.8803 17100 0.0542
0.8855 17200 0.0547
0.8906 17300 0.0547
0.8958 17400 0.0531
0.9009 17500 0.0537
0.9061 17600 0.0536
0.9112 17700 0.0521
0.9164 17800 0.0516
0.9215 17900 0.0521
0.9267 18000 0.0512
0.9318 18100 0.0512
0.9370 18200 0.0525
0.9421 18300 0.0514
0.9473 18400 0.0507
0.9524 18500 0.0503
0.9576 18600 0.0509
0.9627 18700 0.0503
0.9678 18800 0.0489
0.9730 18900 0.0494
0.9781 19000 0.0497
0.9833 19100 0.0491
0.9884 19200 0.0492
0.9936 19300 0.0493
0.9987 19400 0.0497
1.0039 19500 0.0482
1.0091 19600 0.0496
1.0142 19700 0.0485
1.0194 19800 0.0485
1.0245 19900 0.048
1.0297 20000 0.0485
1.0348 20100 0.0482
1.0399 20200 0.0473
1.0451 20300 0.0476
1.0502 20400 0.0459
1.0554 20500 0.0459
1.0605 20600 0.0465
1.0657 20700 0.0462
1.0708 20800 0.0459
1.0760 20900 0.0456
1.0811 21000 0.0467
1.0863 21100 0.0454
1.0914 21200 0.0451
1.0966 21300 0.0435
1.1017 21400 0.0456
1.1069 21500 0.0445
1.1120 21600 0.0443
1.1172 21700 0.0432
1.1223 21800 0.044
1.1275 21900 0.0429
1.1326 22000 0.0423
1.1378 22100 0.0433
1.1429 22200 0.0432
1.1481 22300 0.0412
1.1532 22400 0.0423
1.1584 22500 0.0415
1.1635 22600 0.0409
1.1687 22700 0.0417
1.1738 22800 0.0412
1.1789 22900 0.0417
1.1841 23000 0.0416
1.1892 23100 0.0412
1.1944 23200 0.0411
1.1995 23300 0.0403
1.2047 23400 0.0408
1.2098 23500 0.0398
1.2150 23600 0.0402
1.2201 23700 0.0396
1.2253 23800 0.0394
1.2304 23900 0.0385
1.2356 24000 0.0386
1.2407 24100 0.0389
1.2459 24200 0.0382
1.2510 24300 0.0378
1.2562 24400 0.0387
1.2613 24500 0.0375
1.2665 24600 0.038
1.2716 24700 0.0384
1.2768 24800 0.0372
1.2819 24900 0.0378
1.2871 25000 0.0363
1.2922 25100 0.0369
1.2974 25200 0.0365
1.3025 25300 0.0363
1.3077 25400 0.0369
1.3128 25500 0.037
1.3179 25600 0.0366
1.3231 25700 0.0369
1.3282 25800 0.0361
1.3334 25900 0.0351
1.3385 26000 0.035
1.3437 26100 0.037
1.3488 26200 0.0353
1.3540 26300 0.0349
1.3591 26400 0.0346
1.3643 26500 0.035
1.3694 26600 0.0352
1.3746 26700 0.0351
1.3797 26800 0.0346
1.3849 26900 0.0342
1.3900 27000 0.0345
1.3952 27100 0.0347
1.4003 27200 0.0337
1.4055 27300 0.0337
1.4106 27400 0.0345
1.4158 27500 0.0341
1.4209 27600 0.034
1.4261 27700 0.0336
1.4312 27800 0.0326
1.4364 27900 0.033
1.4415 28000 0.0329
1.4467 28100 0.033
1.4518 28200 0.0317
1.4569 28300 0.0321
1.4621 28400 0.0324
1.4672 28500 0.0327
1.4724 28600 0.0319
1.4775 28700 0.0326
1.4827 28800 0.0325
1.4878 28900 0.0314
1.4930 29000 0.0316
1.4981 29100 0.0319
1.5033 29200 0.0318
1.5084 29300 0.0315
1.5136 29400 0.0323
1.5187 29500 0.0317
1.5239 29600 0.0315
1.5290 29700 0.0314
1.5342 29800 0.0312
1.5393 29900 0.0308
1.5445 30000 0.0299
1.5496 30100 0.031
1.5548 30200 0.0314
1.5599 30300 0.0306
1.5651 30400 0.0306
1.5702 30500 0.0313
1.5754 30600 0.0295
1.5805 30700 0.0312
1.5857 30800 0.0303
1.5908 30900 0.0301
1.5959 31000 0.0304
1.6011 31100 0.0303
1.6062 31200 0.0292
1.6114 31300 0.0298
1.6165 31400 0.0304
1.6217 31500 0.03
1.6268 31600 0.0296
1.6320 31700 0.0291
1.6371 31800 0.029
1.6423 31900 0.0284
1.6474 32000 0.0289
1.6526 32100 0.0289
1.6577 32200 0.0285
1.6629 32300 0.0283
1.6680 32400 0.0288
1.6732 32500 0.0287
1.6783 32600 0.0282
1.6835 32700 0.0281
1.6886 32800 0.0278
1.6938 32900 0.028
1.6989 33000 0.0279
1.7041 33100 0.028
1.7092 33200 0.0281
1.7144 33300 0.0277
1.7195 33400 0.028
1.7247 33500 0.0278
1.7298 33600 0.0276
1.7349 33700 0.0271
1.7401 33800 0.028
1.7452 33900 0.0276
1.7504 34000 0.027
1.7555 34100 0.0287
1.7607 34200 0.0278
1.7658 34300 0.0277
1.7710 34400 0.0277
1.7761 34500 0.0275
1.7813 34600 0.027
1.7864 34700 0.0268
1.7916 34800 0.027
1.7967 34900 0.0269
1.8019 35000 0.0262
1.8070 35100 0.0268
1.8122 35200 0.026
1.8173 35300 0.0261
1.8225 35400 0.0265
1.8276 35500 0.0267
1.8328 35600 0.0263
1.8379 35700 0.0273
1.8431 35800 0.0264
1.8482 35900 0.0261
1.8534 36000 0.0257
1.8585 36100 0.0259
1.8637 36200 0.0266
1.8688 36300 0.0258
1.8739 36400 0.0266
1.8791 36500 0.0254
1.8842 36600 0.0262
1.8894 36700 0.0264
1.8945 36800 0.0251
1.8997 36900 0.0254
1.9048 37000 0.0255
1.9100 37100 0.0246
1.9151 37200 0.0252
1.9203 37300 0.0249
1.9254 37400 0.0245
1.9306 37500 0.0248
1.9357 37600 0.0246
1.9409 37700 0.0257
1.9460 37800 0.0252
1.9512 37900 0.0249
1.9563 38000 0.0253
1.9615 38100 0.0253
1.9666 38200 0.0241
1.9718 38300 0.0245
1.9769 38400 0.0242
1.9821 38500 0.0244
1.9872 38600 0.0243
1.9924 38700 0.025
1.9975 38800 0.0248
2.0027 38900 0.0245
2.0078 39000 0.0244
2.0130 39100 0.0245
2.0181 39200 0.0244
2.0233 39300 0.0235
2.0284 39400 0.0245
2.0336 39500 0.0249
2.0387 39600 0.0239
2.0439 39700 0.0246
2.0490 39800 0.0234
2.0542 39900 0.0234
2.0593 40000 0.0236
2.0645 40100 0.0239
2.0696 40200 0.0241
2.0748 40300 0.0232
2.0799 40400 0.024
2.0850 40500 0.0237
2.0902 40600 0.0232
2.0953 40700 0.0226
2.1005 40800 0.0236
2.1056 40900 0.0234
2.1108 41000 0.0231
2.1159 41100 0.0233
2.1211 41200 0.0234
2.1262 41300 0.0234
2.1314 41400 0.0223
2.1365 41500 0.0227
2.1417 41600 0.0229
2.1468 41700 0.0218
2.1520 41800 0.0226
2.1571 41900 0.0224
2.1623 42000 0.0221
2.1674 42100 0.0226
2.1726 42200 0.0222
2.1777 42300 0.0225
2.1829 42400 0.0221
2.1880 42500 0.0218
2.1932 42600 0.022
2.1983 42700 0.0218
2.2035 42800 0.0223
2.2086 42900 0.0216
2.2138 43000 0.022
2.2189 43100 0.0214
2.2240 43200 0.0218
2.2292 43300 0.0206
2.2343 43400 0.0211
2.2395 43500 0.0214
2.2446 43600 0.0205
2.2498 43700 0.021
2.2549 43800 0.0214
2.2601 43900 0.0209
2.2652 44000 0.0207
2.2704 44100 0.0212
2.2755 44200 0.0206
2.2807 44300 0.021
2.2858 44400 0.0202
2.2910 44500 0.0205
2.2961 44600 0.0198
2.3013 44700 0.0203
2.3064 44800 0.0208
2.3116 44900 0.021
2.3167 45000 0.0205
2.3219 45100 0.0205
2.3270 45200 0.0197
2.3322 45300 0.0198
2.3373 45400 0.0196
2.3425 45500 0.0211
2.3476 45600 0.0199
2.3528 45700 0.0198
2.3579 45800 0.0199
2.3630 45900 0.02
2.3682 46000 0.0204
2.3733 46100 0.0197
2.3785 46200 0.0203
2.3836 46300 0.0199
2.3888 46400 0.0196
2.3939 46500 0.0193
2.3991 46600 0.0193
2.4042 46700 0.0198
2.4094 46800 0.0194
2.4145 46900 0.02
2.4197 47000 0.0191
2.4248 47100 0.0195
2.4300 47200 0.0189
2.4351 47300 0.0192
2.4403 47400 0.0192
2.4454 47500 0.0186
2.4506 47600 0.0185
2.4557 47700 0.0184
2.4609 47800 0.0187
2.4660 47900 0.0191
2.4712 48000 0.019
2.4763 48100 0.0187
2.4815 48200 0.0195
2.4866 48300 0.0189
2.4918 48400 0.019
2.4969 48500 0.0193
2.5020 48600 0.0188
2.5072 48700 0.0182
2.5123 48800 0.0186
2.5175 48900 0.0188
2.5226 49000 0.0178
2.5278 49100 0.0183
2.5329 49200 0.0181
2.5381 49300 0.0182
2.5432 49400 0.0177
2.5484 49500 0.0182
2.5535 49600 0.0187
2.5587 49700 0.0179
2.5638 49800 0.0181
2.5690 49900 0.0187
2.5741 50000 0.018
2.5793 50100 0.0181
2.5844 50200 0.0183
2.5896 50300 0.0182
2.5947 50400 0.0181
2.5999 50500 0.0178
2.6050 50600 0.0177
2.6102 50700 0.0174
2.6153 50800 0.0183
2.6205 50900 0.0176
2.6256 51000 0.0178
2.6307 51100 0.0172
2.6359 51200 0.018
2.6410 51300 0.0173
2.6462 51400 0.0179
2.6513 51500 0.0173
2.6565 51600 0.0173
2.6616 51700 0.0171
2.6668 51800 0.0176
2.6719 51900 0.0172
2.6771 52000 0.0174
2.6822 52100 0.0169
2.6874 52200 0.0174
2.6925 52300 0.0172
2.6977 52400 0.0172
2.7028 52500 0.0171
2.7080 52600 0.0169
2.7131 52700 0.0171
2.7183 52800 0.0175
2.7234 52900 0.0168
2.7286 53000 0.0167
2.7337 53100 0.0165
2.7389 53200 0.0171
2.7440 53300 0.0172
2.7492 53400 0.0165
2.7543 53500 0.0177
2.7595 53600 0.0175
2.7646 53700 0.0174
2.7697 53800 0.0173
2.7749 53900 0.0171
2.7800 54000 0.0168
2.7852 54100 0.0166
2.7903 54200 0.0166
2.7955 54300 0.0165
2.8006 54400 0.0171
2.8058 54500 0.0168
2.8109 54600 0.0166
2.8161 54700 0.0165
2.8212 54800 0.0165
2.8264 54900 0.0167
2.8315 55000 0.0169
2.8367 55100 0.0167
2.8418 55200 0.017
2.8470 55300 0.0166
2.8521 55400 0.0162
2.8573 55500 0.0164
2.8624 55600 0.017
2.8676 55700 0.0165
2.8727 55800 0.0164
2.8779 55900 0.0158
2.8830 56000 0.016
2.8882 56100 0.0163
2.8933 56200 0.0165
2.8985 56300 0.0161
2.9036 56400 0.016
2.9087 56500 0.0154
2.9139 56600 0.0159
2.9190 56700 0.0158
2.9242 56800 0.0161
2.9293 56900 0.016
2.9345 57000 0.0155
2.9396 57100 0.0161
2.9448 57200 0.0155
2.9499 57300 0.0158
2.9551 57400 0.0157
2.9602 57500 0.0159
2.9654 57600 0.0155
2.9705 57700 0.0158
2.9757 57800 0.0158
2.9808 57900 0.0161
2.9860 58000 0.0154
2.9911 58100 0.0159
2.9963 58200 0.0159
3.0014 58300 0.0162
3.0066 58400 0.0158
3.0117 58500 0.0159
3.0169 58600 0.0165
3.0220 58700 0.0153
3.0272 58800 0.0158
3.0323 58900 0.0159
3.0375 59000 0.0153
3.0426 59100 0.0159
3.0478 59200 0.0153
3.0529 59300 0.015
3.0581 59400 0.0152
3.0632 59500 0.0154
3.0684 59600 0.0154
3.0735 59700 0.0149
3.0787 59800 0.0153
3.0838 59900 0.0156
3.0890 60000 0.0154
3.0941 60100 0.0152
3.0993 60200 0.0148
3.1044 60300 0.0156
3.1096 60400 0.0154
3.1147 60500 0.0151
3.1198 60600 0.0151
3.1250 60700 0.0152
3.1301 60800 0.0148
3.1353 60900 0.0149
3.1404 61000 0.0148
3.1456 61100 0.0143
3.1507 61200 0.0148
3.1559 61300 0.0149
3.1610 61400 0.0145
3.1662 61500 0.0144
3.1713 61600 0.0144
3.1765 61700 0.0148
3.1816 61800 0.0148
3.1868 61900 0.0146
3.1919 62000 0.0145
3.1971 62100 0.0145
3.2022 62200 0.0143
3.2074 62300 0.0147
3.2125 62400 0.0145
3.2177 62500 0.0142
3.2228 62600 0.0142
3.2280 62700 0.0138
3.2331 62800 0.0146
3.2383 62900 0.0142
3.2434 63000 0.0143
3.2486 63100 0.0138
3.2537 63200 0.0142
3.2588 63300 0.0139
3.2640 63400 0.0141
3.2691 63500 0.0142
3.2743 63600 0.0139
3.2794 63700 0.0137
3.2846 63800 0.0142
3.2897 63900 0.0135
3.2949 64000 0.0137
3.3000 64100 0.0141
3.3052 64200 0.0137
3.3103 64300 0.0139
3.3155 64400 0.0141
3.3206 64500 0.014
3.3258 64600 0.0139
3.3309 64700 0.0136
3.3361 64800 0.0134
3.3412 64900 0.0137
3.3464 65000 0.0138
3.3515 65100 0.0135
3.3567 65200 0.0136
3.3618 65300 0.0134
3.3670 65400 0.0138
3.3721 65500 0.0133
3.3773 65600 0.0138
3.3824 65700 0.0136
3.3876 65800 0.0133
3.3927 65900 0.0135
3.3978 66000 0.0132
3.4030 66100 0.0134
3.4081 66200 0.0134
3.4133 66300 0.0136
3.4184 66400 0.0132
3.4236 66500 0.0132
3.4287 66600 0.0127
3.4339 66700 0.0132
3.4390 66800 0.0131
3.4442 66900 0.0127
3.4493 67000 0.0127
3.4545 67100 0.013
3.4596 67200 0.0131
3.4648 67300 0.0134
3.4699 67400 0.0131
3.4751 67500 0.0132
3.4802 67600 0.0132
3.4854 67700 0.0136
3.4905 67800 0.0126
3.4957 67900 0.0134
3.5008 68000 0.0129
3.5060 68100 0.013
3.5111 68200 0.0128
3.5163 68300 0.0127
3.5214 68400 0.0128
3.5266 68500 0.0127
3.5317 68600 0.0127
3.5368 68700 0.0125
3.5420 68800 0.0127
3.5471 68900 0.0126
3.5523 69000 0.013
3.5574 69100 0.0126
3.5626 69200 0.0122
3.5677 69300 0.0131
3.5729 69400 0.0128
3.5780 69500 0.0128
3.5832 69600 0.013
3.5883 69700 0.0129
3.5935 69800 0.0127
3.5986 69900 0.0127
3.6038 70000 0.0125
3.6089 70100 0.0121
3.6141 70200 0.0125
3.6192 70300 0.0128
3.6244 70400 0.0126
3.6295 70500 0.0125
3.6347 70600 0.0122
3.6398 70700 0.0123
3.6450 70800 0.0127
3.6501 70900 0.0126
3.6553 71000 0.012
3.6604 71100 0.012
3.6656 71200 0.0123
3.6707 71300 0.0123
3.6758 71400 0.0122
3.6810 71500 0.0118
3.6861 71600 0.0125
3.6913 71700 0.0122
3.6964 71800 0.0121
3.7016 71900 0.0119
3.7067 72000 0.0121
3.7119 72100 0.0121
3.7170 72200 0.0123
3.7222 72300 0.0123
3.7273 72400 0.0124
3.7325 72500 0.0117
3.7376 72600 0.0121
3.7428 72700 0.012
3.7479 72800 0.012
3.7531 72900 0.0123
3.7582 73000 0.0122
3.7634 73100 0.0123
3.7685 73200 0.0122
3.7737 73300 0.0127
3.7788 73400 0.012
3.7840 73500 0.0122
3.7891 73600 0.0116
3.7943 73700 0.012
3.7994 73800 0.0121
3.8046 73900 0.0118
3.8097 74000 0.012
3.8148 74100 0.0118
3.8200 74200 0.0122
3.8251 74300 0.0121
3.8303 74400 0.0118
3.8354 74500 0.0119
3.8406 74600 0.0121
3.8457 74700 0.0119
3.8509 74800 0.0116
3.8560 74900 0.0116
3.8612 75000 0.012
3.8663 75100 0.0116
3.8715 75200 0.0119
3.8766 75300 0.0118
3.8818 75400 0.0118
3.8869 75500 0.0119
3.8921 75600 0.0117
3.8972 75700 0.0115
3.9024 75800 0.0116
3.9075 75900 0.0117
3.9127 76000 0.0114
3.9178 76100 0.0115
3.9230 76200 0.0116
3.9281 76300 0.0118
3.9333 76400 0.0114
3.9384 76500 0.0116
3.9436 76600 0.0114
3.9487 76700 0.0115
3.9538 76800 0.0114
3.9590 76900 0.0117
3.9641 77000 0.0115
3.9693 77100 0.0118
3.9744 77200 0.0114
3.9796 77300 0.0117
3.9847 77400 0.0112
3.9899 77500 0.0111
3.9950 77600 0.0116
4.0002 77700 0.0117
4.0054 77800 0.0113
4.0105 77900 0.0115
4.0157 78000 0.0117
4.0208 78100 0.0114
4.0259 78200 0.0116
4.0311 78300 0.0114
4.0362 78400 0.0115
4.0414 78500 0.0116
4.0465 78600 0.0111
4.0517 78700 0.0111
4.0568 78800 0.0113
4.0620 78900 0.0112
4.0671 79000 0.0115
4.0723 79100 0.0113
4.0774 79200 0.0113
4.0826 79300 0.0114
4.0877 79400 0.0115
4.0929 79500 0.0114
4.0980 79600 0.0109
4.1032 79700 0.0115
4.1083 79800 0.0119
4.1135 79900 0.0112
4.1186 80000 0.0114
4.1238 80100 0.011
4.1289 80200 0.0115
4.1341 80300 0.011
4.1392 80400 0.0112
4.1444 80500 0.0104
4.1495 80600 0.0109
4.1547 80700 0.0108
4.1598 80800 0.0107
4.1649 80900 0.011
4.1701 81000 0.011
4.1752 81100 0.011
4.1804 81200 0.0111
4.1855 81300 0.0109
4.1907 81400 0.0108
4.1958 81500 0.0109
4.2010 81600 0.0107
4.2061 81700 0.0112
4.2113 81800 0.011
4.2164 81900 0.0109
4.2216 82000 0.0108
4.2267 82100 0.0104
4.2319 82200 0.0105
4.2370 82300 0.0108
4.2422 82400 0.0108
4.2473 82500 0.0106
4.2525 82600 0.0106
4.2576 82700 0.0109
4.2628 82800 0.0102
4.2679 82900 0.0112
4.2731 83000 0.0109
4.2782 83100 0.0106
4.2834 83200 0.0105
4.2885 83300 0.0104
4.2936 83400 0.0106
4.2988 83500 0.0103
4.3039 83600 0.0104
4.3091 83700 0.0107
4.3142 83800 0.0107
4.3194 83900 0.0106
4.3245 84000 0.0104
4.3297 84100 0.0102
4.3348 84200 0.0102
4.3400 84300 0.01
4.3451 84400 0.0109
4.3503 84500 0.0101
4.3554 84600 0.0103
4.3606 84700 0.0106
4.3657 84800 0.0104
4.3709 84900 0.0105
4.3760 85000 0.0106
4.3812 85100 0.0105
4.3863 85200 0.0101
4.3915 85300 0.0103
4.3966 85400 0.0103
4.4018 85500 0.0101
4.4069 85600 0.0102
4.4121 85700 0.0107
4.4172 85800 0.0101
4.4224 85900 0.0102
4.4275 86000 0.01
4.4326 86100 0.0099
4.4378 86200 0.0098
4.4429 86300 0.0103
4.4481 86400 0.0101
4.4532 86500 0.0099
4.4584 86600 0.0101
4.4635 86700 0.0106
4.4687 86800 0.0099
4.4738 86900 0.0102
4.4790 87000 0.0104
4.4841 87100 0.0104
4.4893 87200 0.0098
4.4944 87300 0.0104
4.4996 87400 0.0101
4.5047 87500 0.0101
4.5099 87600 0.0098
4.5150 87700 0.0101
4.5202 87800 0.0103
4.5253 87900 0.0098
4.5305 88000 0.0098
4.5356 88100 0.0098
4.5408 88200 0.0103
4.5459 88300 0.0099
4.5511 88400 0.01
4.5562 88500 0.01
4.5614 88600 0.0097
4.5665 88700 0.0101
4.5716 88800 0.0099
4.5768 88900 0.0096
4.5819 89000 0.0103
4.5871 89100 0.01
4.5922 89200 0.0098
4.5974 89300 0.0099
4.6025 89400 0.0097
4.6077 89500 0.0096
4.6128 89600 0.0098
4.6180 89700 0.0098
4.6231 89800 0.0098
4.6283 89900 0.0102
4.6334 90000 0.0094
4.6386 90100 0.0099
4.6437 90200 0.0098
4.6489 90300 0.0099
4.6540 90400 0.0098
4.6592 90500 0.0097
4.6643 90600 0.0096
4.6695 90700 0.01
4.6746 90800 0.0097
4.6798 90900 0.0095
4.6849 91000 0.0099
4.6901 91100 0.0098
4.6952 91200 0.0098
4.7004 91300 0.0098
4.7055 91400 0.0097
4.7106 91500 0.0099
4.7158 91600 0.0096
4.7209 91700 0.0098
4.7261 91800 0.0097
4.7312 91900 0.0092
4.7364 92000 0.0097
4.7415 92100 0.0096
4.7467 92200 0.0098
4.7518 92300 0.0099
4.7570 92400 0.0099
4.7621 92500 0.0099
4.7673 92600 0.0099
4.7724 92700 0.0098
4.7776 92800 0.0097
4.7827 92900 0.0096
4.7879 93000 0.0094
4.7930 93100 0.0096
4.7982 93200 0.0096
4.8033 93300 0.0098
4.8085 93400 0.0097
4.8136 93500 0.0095
4.8188 93600 0.0097
4.8239 93700 0.0096
4.8291 93800 0.0097
4.8342 93900 0.0098
4.8394 94000 0.0096
4.8445 94100 0.0098
4.8496 94200 0.0099
4.8548 94300 0.0096
4.8599 94400 0.0096
4.8651 94500 0.0097
4.8702 94600 0.0092
4.8754 94700 0.0101
4.8805 94800 0.0097
4.8857 94900 0.0098
4.8908 95000 0.0098
4.8960 95100 0.0093
4.9011 95200 0.0097
4.9063 95300 0.0097
4.9114 95400 0.0093
4.9166 95500 0.0096
4.9217 95600 0.0097
4.9269 95700 0.0092
4.9320 95800 0.0094
4.9372 95900 0.0097
4.9423 96000 0.0097
4.9475 96100 0.0096
4.9526 96200 0.0096
4.9578 96300 0.0097
4.9629 96400 0.0096
4.9681 96500 0.0097
4.9732 96600 0.0094
4.9784 96700 0.0095
4.9835 96800 0.0093
4.9886 96900 0.0095
4.9938 97000 0.0096
4.9989 97100 0.0097

Framework Versions

  • Python: 3.11.11
  • Sentence Transformers: 4.1.0
  • Transformers: 4.51.3
  • PyTorch: 2.6.0+cu126
  • Accelerate: 1.6.0
  • Datasets: 3.5.0
  • Tokenizers: 0.21.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
Model size: 109M params (F32, Safetensors)