Model Trained Using AutoTrain

  • Problem type: Text Classification
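
Since this is a text-classification checkpoint, it can be loaded with the standard `transformers` pipeline. A minimal sketch (the label names returned depend on how the training data was labelled, which is not documented on this card):

```python
# Minimal inference sketch; requires the `transformers` library.
from transformers import pipeline

# Repository id taken from this card.
classifier = pipeline(
    "text-classification",
    model="Hsawa/ARBERT20250211HunIshHubOther15wordsplit1300Epoch3",
)

# Placeholder Arabic input (the base model, UBC-NLP/ARBERTv2, is an Arabic encoder).
print(classifier("هذا مثال لنص عربي"))
# -> e.g. [{'label': '...', 'score': 0.97}]
```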

Validation Metrics

  • loss: 0.529706597328186
  • f1_macro: 0.8691819285713329
  • f1_micro: 0.8734939759036144
  • f1_weighted: 0.8718912072241768
  • precision_macro: 0.8717301983910195
  • precision_micro: 0.8734939759036144
  • precision_weighted: 0.8772879415765209
  • recall_macro: 0.8730474882260596
  • recall_micro: 0.8734939759036144
  • recall_weighted: 0.8734939759036144
  • accuracy: 0.8734939759036144
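
The macro, micro, and weighted variants above follow the usual scikit-learn averaging conventions: macro averages per-class scores, micro pools all decisions, and weighted averages per-class scores by support. A small illustrative sketch with hypothetical labels (not the validation data behind the numbers reported here):

```python
# Illustrative only: y_true / y_pred are made-up labels.
from sklearn.metrics import accuracy_score, f1_score

y_true = [0, 1, 2, 1, 0, 2]  # hypothetical gold labels
y_pred = [0, 1, 1, 1, 0, 2]  # hypothetical model predictions

print("accuracy:   ", accuracy_score(y_true, y_pred))
print("f1_macro:   ", f1_score(y_true, y_pred, average="macro"))
print("f1_micro:   ", f1_score(y_true, y_pred, average="micro"))
print("f1_weighted:", f1_score(y_true, y_pred, average="weighted"))
```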

Model size: 163M parameters (F32, Safetensors)

Model tree for Hsawa/ARBERT20250211HunIshHubOther15wordsplit1300Epoch3

  • Base model: UBC-NLP/ARBERTv2 (this model is a fine-tuned version of it)
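
To build a comparable fine-tune from the same base checkpoint, the base model can be loaded with a classification head. A sketch, assuming a hypothetical number of labels (the actual class count for this model is not stated on the card):

```python
# Sketch of loading the base checkpoint for sequence classification.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("UBC-NLP/ARBERTv2")
model = AutoModelForSequenceClassification.from_pretrained(
    "UBC-NLP/ARBERTv2",
    num_labels=3,  # assumption: set this to the real number of classes
)
```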
