# sail-clip
This model is a fine-tuned version of [openai/clip-vit-large-patch14](https://huggingface.co/openai/clip-vit-large-patch14) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.0815
## Model description
More information needed
## Intended uses & limitations
More information needed
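No usage example is published for this checkpoint. As a minimal sketch, assuming the model keeps the standard CLIP layout of its base model (the image path and prompt texts below are placeholders):

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Assumption: the checkpoint keeps the standard CLIP architecture of its
# base model, so the generic CLIP classes apply.
model = CLIPModel.from_pretrained("cringgaard/sail-clip")
processor = CLIPProcessor.from_pretrained("cringgaard/sail-clip")

image = Image.open("example.jpg")  # placeholder image path
texts = ["a photo of a sailboat", "a photo of a harbor"]  # placeholder prompts

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds image-text similarity scores; softmax over the text
# axis gives zero-shot classification probabilities.
probs = outputs.logits_per_image.softmax(dim=-1)
print(probs)
```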
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 1
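For reference, a sketch of how the settings above map onto `transformers.TrainingArguments`; this is a reconstruction, not the author's published script, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Reconstruction of the listed hyperparameters; the original training script
# is not published.
training_args = TrainingArguments(
    output_dir="sail-clip",            # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",               # PyTorch AdamW; betas=(0.9, 0.999), eps=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```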
### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 4.8985        | 0.0552 | 10   | 4.0903          |
| 4.1607        | 0.1105 | 20   | 3.9977          |
| 3.9999        | 0.1657 | 30   | 3.8481          |
| 3.7439        | 0.2210 | 40   | 3.6745          |
| 3.6873        | 0.2762 | 50   | 3.5240          |
| 3.4241        | 0.3315 | 60   | 3.2912          |
| 3.2521        | 0.3867 | 70   | 3.1707          |
| 3.0498        | 0.4420 | 80   | 2.9794          |
| 2.9275        | 0.4972 | 90   | 2.8728          |
| 2.8409        | 0.5525 | 100  | 2.6969          |
| 2.6954        | 0.6077 | 110  | 2.6175          |
| 2.5344        | 0.6630 | 120  | 2.5060          |
| 2.5042        | 0.7182 | 130  | 2.4477          |
| 2.2965        | 0.7735 | 140  | 2.3057          |
| 2.3179        | 0.8287 | 150  | 2.2107          |
| 2.2797        | 0.8840 | 160  | 2.1689          |
| 2.0838        | 0.9392 | 170  | 2.1016          |
| 1.9926        | 0.9945 | 180  | 2.0815          |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1+cu118
- Datasets 3.2.0
- Tokenizers 0.21.0
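As a quick sanity check when reproducing results, the installed versions can be compared against this list:

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions per the list above.
print(transformers.__version__)  # 4.49.0
print(torch.__version__)         # 2.5.1+cu118
print(datasets.__version__)      # 3.2.0
print(tokenizers.__version__)    # 0.21.0
```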