Roberta Zinc Compression Head

This model is trained to compress embeddings produced by the roberta_zinc_480m model from their native dimension of 768 down to 512, 256, 128, 64, or 32 dimensions.

The compressed embeddings are trained to preserve the pairwise cosine similarities computed on the native embeddings.
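To make the training objective concrete, the sketch below measures how well a compressed space preserves the pairwise cosine similarities of a native 768-d space. The random projection is only a stand-in for the trained compression head, and the random vectors are stand-ins for real roberta_zinc_480m embeddings; both are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine_sim_matrix(x):
    # Row-normalize, then take all-pairs dot products.
    x = x / np.linalg.norm(x, axis=1, keepdims=True)
    return x @ x.T

# Stand-in for native roberta_zinc_480m embeddings (batch of 100).
native = rng.standard_normal((100, 768))

# Stand-in for the trained 64-d compression head: a random linear projection.
projection = rng.standard_normal((768, 64)) / np.sqrt(64)
compressed = native @ projection

sim_native = cosine_sim_matrix(native)
sim_compressed = cosine_sim_matrix(compressed)

# Correlation between the off-diagonal entries of the two similarity
# matrices: higher means the compressed space better preserves the
# native cosine-similarity structure.
mask = ~np.eye(100, dtype=bool)
corr = np.corrcoef(sim_native[mask], sim_compressed[mask])[0, 1]
print(f"similarity correlation at 64 dims: {corr:.3f}")
```

A trained head would score far higher than this untrained projection; the evaluation itself, comparing similarity matrices before and after compression, is the point of the sketch.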


license: mit

Model size: 2.64M params (F32, Safetensors)

Model: entropy/roberta_zinc_compression_head