Roberta Zinc Compression Head
This model compresses embeddings produced by the roberta_zinc_480m model from their native dimension of 768 down to 512, 256, 128, 64, or 32.
The compressed embeddings are trained to preserve the pairwise cosine similarities of the native embeddings.
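The training objective described above can be sketched as follows. This is a minimal illustration, not the published implementation: the head architecture (a single linear projection here) and the loss (MSE between pairwise cosine-similarity matrices) are assumptions, and the random tensor stands in for actual roberta_zinc_480m embeddings.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CompressionHead(nn.Module):
    """Hypothetical head projecting 768-d native embeddings to a smaller size."""
    def __init__(self, d_in: int = 768, d_out: int = 128):
        super().__init__()
        self.proj = nn.Linear(d_in, d_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(x)

def cosine_sim_matrix(x: torch.Tensor) -> torch.Tensor:
    # Pairwise cosine similarities between all rows of x.
    x = F.normalize(x, dim=-1)
    return x @ x.T

# Toy batch standing in for native roberta_zinc_480m embeddings.
native = torch.randn(8, 768)
head = CompressionHead(d_in=768, d_out=128)
compressed = head(native)

# Train the head so compressed pairwise similarities match the native ones.
loss = F.mse_loss(cosine_sim_matrix(compressed), cosine_sim_matrix(native))
```

In practice one such head would be trained per target size (512, 256, 128, 64, 32), minimizing this loss over batches of native embeddings.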
license: mit
Base model: entropy/roberta_zinc_480m