1. Overview
S-BERT model for the Oh-LoRA 👱‍♀️ (오로라) LLM memory, part of the Oh-LoRA project.
- This S-BERT model is a fine-tuned version of klue/roberta-base (see the usage sketch below).
- Detailed info (in Korean)
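Below is a minimal usage sketch, assuming the sentence-transformers package: it loads the model by its Hub id and ranks stored memory entries against a user utterance by cosine similarity. The sentences and the retrieval flow are illustrative only, not the project's actual memory logic.

```python
# Minimal sketch: similarity scoring with the fine-tuned S-BERT model.
# The sentences below are made-up examples, not project data.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("daebakgazua/250408_OhLoRA_LLM_SBERT")

query = "내일 시험이 있어서 걱정돼"        # "I'm worried about tomorrow's exam"
memories = [
    "사용자는 내일 시험을 본다",           # "The user has an exam tomorrow"
    "사용자는 고양이를 키운다",            # "The user has a cat"
]

# Encode the query and the memories, then rank memories by cosine similarity.
query_emb = model.encode(query, convert_to_tensor=True)
memory_embs = model.encode(memories, convert_to_tensor=True)
scores = util.cos_sim(query_emb, memory_embs)[0]

best = int(scores.argmax())
print(f"best match: {memories[best]} (score = {scores[best].item():.4f})")
```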
2. Save Path
Save the downloaded files in the directory 2025_04_08_OhLoRA/llm/models/memory_sbert/trained_sbert_model, laid out as below (a download sketch follows the listing):
memory_sbert
- trained_sbert_model
  - 1_Pooling
    - config.json
  - eval
    - similarity_evaluation_valid_evaluator_results.csv
  - config.json
  - config_sentence_transformers.json
  - model.safetensors
  - modules.json
  - README.md
  - sentence_bert_config.json
  - special_tokens_map.json
  - tokenizer.json
  - tokenizer_config.json
  - vocab.txt
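A hedged sketch of one way to populate that directory, assuming the huggingface_hub package; the repo id is this model's Hub id and the target path is the one given above:

```python
# Sketch: download the model files into the directory expected by the Oh-LoRA project,
# then reload them locally as a sanity check.
from huggingface_hub import snapshot_download
from sentence_transformers import SentenceTransformer

LOCAL_DIR = "2025_04_08_OhLoRA/llm/models/memory_sbert/trained_sbert_model"

# Download every file of the repository into LOCAL_DIR.
snapshot_download(
    repo_id="daebakgazua/250408_OhLoRA_LLM_SBERT",
    local_dir=LOCAL_DIR,
)

# Loading from the local directory confirms the files landed in the right place.
model = SentenceTransformer(LOCAL_DIR)
print(model)
```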