roberta-large-ner-uk
A transformer-based NER model for Ukrainian, trained on a combination of human-annotated data (NER-UK 2.0) and high-quality silver-standard annotations (UberText-NER-Silver). Based on roberta-large-NER, this model achieves state-of-the-art performance on a wide range of named entities in Ukrainian.
Model Details
- Model type: Transformer-based encoder (spaCy pipeline)
- Language (NLP): Ukrainian
- License: Apache 2.0
- Finetuned from model: 51la5/roberta-large-NER
- Entity Types (13): PERS, ORG, LOC, DATE, TIME, JOB, MON, PCT, PERIOD, DOC, QUANT, ART, MISC
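The tag names are compact; the glosses below are my interpretation of what each tag likely covers, not definitions taken from the model card:

```python
# Assumed expansions for the 13 entity tags. These glosses are an
# interpretation of the tag names, not stated in the model card.
ENTITY_GLOSSES = {
    "PERS": "person",
    "ORG": "organization",
    "LOC": "location",
    "DATE": "date",
    "TIME": "time",
    "JOB": "job title or position",
    "MON": "monetary amount",
    "PCT": "percentage",
    "PERIOD": "time period",
    "DOC": "document",
    "QUANT": "quantity",
    "ART": "artifact",
    "MISC": "miscellaneous",
}
assert len(ENTITY_GLOSSES) == 13  # matches the advertised tag count
```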
Usage
```python
import spacy

nlp = spacy.load("roberta-large-ner-uk")
# "The President of Ukraine Volodymyr Zelenskyy spoke in Brussels."
doc = nlp("Президент України Володимир Зеленський виступив у Брюсселі.")
print([(ent.text, ent.label_) for ent in doc.ents])
```
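The print call above yields a list of (text, label) tuples. A minimal sketch of post-processing such output, grouping entities by type (the tuples here are illustrative placeholders, not actual model output):

```python
from collections import defaultdict

# Example (text, label) tuples in the shape produced by
# [(ent.text, ent.label_) for ent in doc.ents].
# The values are illustrative, not actual model output.
ents = [
    ("Володимир Зеленський", "PERS"),
    ("України", "LOC"),
    ("Брюсселі", "LOC"),
]

def group_by_label(ents):
    """Collect entity texts under their label, e.g. {'LOC': [...]}."""
    grouped = defaultdict(list)
    for text, label in ents:
        grouped[label].append(text)
    return dict(grouped)

print(group_by_label(ents))
```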
Authors
Evaluation results
- NER Precision (self-reported): 0.947
- NER Recall (self-reported): 0.942
- NER F1 (self-reported): 0.944
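The reported F1 is internally consistent with the precision and recall figures, as a quick check of the harmonic mean confirms:

```python
# F1 is the harmonic mean of precision and recall.
p, r = 0.947, 0.942
f1 = 2 * p * r / (p + r)
print(round(f1, 3))  # 0.944, matching the self-reported F1
```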