Kaichengalex committed on
Commit 3d52600 · verified · Parent(s): cca7ba6

Update README.md

Files changed (1): README.md +0 -6
README.md CHANGED
@@ -30,12 +30,6 @@ Yingda Chen,</span>
 <img src="figures/fig1.png">
 </p>
 
-
-## 🎺 News
-- [2025/04/24]: ✨We release the evaluation and demo code.
-- [2025/04/24]: ✨The paper of UniME is submitted to Arxiv.
-- [2025/04/22]: ✨We release the model weight of UniME in [🤗 Huggingface](https://huggingface.co/collections/DeepGlint-AI/unime-6805fa16ab0071a96bef29d2)
-
 ## 💡 Highlights
 To enhance the MLLM's embedding capability, we propose textual discriminative knowledge distillation. The training process involves decoupling the MLLM's LLM component and processing text with the prompt "Summarize the above sentences in one word.", followed by aligning the student (MLLM) and teacher (NV-Embed V2) embeddings via KL divergence on batch-wise similarity distributions. **Notably, only the LLM component is fine-tuned during this process, while all other parameters remain frozen**.
 
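The distillation objective described in the Highlights context above can be sketched in a few lines of PyTorch. This is a minimal illustration, not the repository's actual training code: the function name, temperature value, and normalization details are assumptions; the source only states that student (MLLM) and teacher (NV-Embed V2) embeddings are aligned via KL divergence over batch-wise similarity distributions.

```python
import torch
import torch.nn.functional as F


def distillation_loss(student_emb: torch.Tensor,
                      teacher_emb: torch.Tensor,
                      temperature: float = 0.05) -> torch.Tensor:
    """Sketch of textual discriminative knowledge distillation.

    Aligns the student's batch-wise similarity distribution with the
    teacher's via KL divergence. Hypothetical helper: the name,
    temperature, and use of cosine similarity are illustrative choices.
    """
    # L2-normalize so dot products become cosine similarities
    s = F.normalize(student_emb, dim=-1)
    t = F.normalize(teacher_emb, dim=-1)
    # Batch-wise similarity matrices (B x B), temperature-scaled
    s_sim = s @ s.T / temperature
    t_sim = t @ t.T / temperature
    # KL(teacher || student) per row, averaged over the batch;
    # kl_div expects log-probabilities for the first argument
    return F.kl_div(F.log_softmax(s_sim, dim=-1),
                    F.softmax(t_sim, dim=-1),
                    reduction="batchmean")
```

In a training loop this loss would only update the student's LLM parameters (with gradients on the other modules disabled), matching the paper's note that everything else stays frozen.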
 
 