jfforero/distilbert-base-uncased-BERT-POET4

This model is a fine-tuned version of distilbert-base-uncased for sentiment analysis.

Model description

This model is trained for sentiment classification with three labels: positive, neutral, and negative. It is based on DistilBERT, a smaller and faster version of BERT produced by knowledge distillation.
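As a minimal sketch of how a three-way classifier head maps scores to these labels, the snippet below converts raw logits to probabilities and picks a label. The label order here is an assumption; the authoritative mapping is the `id2label` field in the model's `config.json`.

```python
import math

# Assumed label order -- check the model's config.json (id2label) for the real mapping.
ID2LABEL = {0: "negative", 1: "neutral", 2: "positive"}

def softmax(logits):
    """Turn raw classifier logits into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_label(logits):
    """Pick the sentiment label with the highest probability."""
    probs = softmax(logits)
    return ID2LABEL[probs.index(max(probs))]
```

In practice the `transformers` pipeline handles this step for you, e.g. `pipeline("text-classification", model="jfforero/distilbert-base-uncased-BERT-POET4")`.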

Intended uses & limitations

Intended Use

  • Sentiment classification of short texts, such as product reviews or social media posts.
  • Designed for English-language input.

Limitations

  • May struggle with sarcasm or complex irony.
  • Performance depends on the quality and domain of the fine-tuning data; text from other domains may be classified poorly.

Training and evaluation data

  • Fine-tuned on [Dataset Name] (if available, add link).
  • Contains X training samples and Y validation samples.

Training procedure

Training hyperparameters

  • Optimizer: AdamW
  • Learning Rate: 2e-5
  • Batch Size: X
  • Epochs: Y
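To illustrate what the optimizer choice means, here is a pure-Python sketch of a single AdamW update for one scalar parameter. Only the learning rate of 2e-5 comes from this card; the remaining defaults are common Hugging Face settings, shown here as assumptions.

```python
import math

def adamw_step(param, grad, m, v, t, lr=2e-5, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=0.01):
    """One AdamW update for a single scalar parameter.

    AdamW decouples weight decay from the gradient-based update,
    which is why it is commonly preferred over plain Adam when
    fine-tuning transformer models. Only lr=2e-5 is taken from this
    card; the other hyperparameters are typical defaults.
    """
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (variance) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * (m_hat / (math.sqrt(v_hat) + eps) + weight_decay * param)
    return param, m, v

# One step on a parameter of 1.0 with gradient 0.5:
p, m, v = adamw_step(1.0, grad=0.5, m=0.0, v=0.0, t=1)
```

In real training this is applied per-parameter by the framework (here, TensorFlow/Keras via `transformers`), not by hand.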

Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| X.XXXX     | Y.YYYY          | Z     |

Framework versions

  • Transformers 4.38.2
  • TensorFlow 2.15.0
  • Datasets 2.19.0
  • Tokenizers 0.15.2