roberta-large-ner-uk

A transformer-based NER model for Ukrainian, trained on a combination of human-annotated data (NER-UK 2.0) and high-quality silver-standard annotations (UberText-NER-Silver). Based on roberta-large-NER, this model achieves state-of-the-art performance on a wide range of named entities in Ukrainian.

Model Details

  • Model type: Transformer-based encoder (spaCy pipeline)
  • Language (NLP): Ukrainian
  • License: Apache 2.0
  • Finetuned from model: 51la5/roberta-large-NER
  • Entity Types (13): PERS, ORG, LOC, DATE, TIME, JOB, MON, PCT, PERIOD, DOC, QUANT, ART, MISC
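
For reference, the 13 labels above can be mapped to human-readable glosses. The label set comes from the model card; the descriptions below are assumptions based on common NER conventions, not official definitions:

```python
# Glosses for the 13 entity types. The labels are from the model card;
# the descriptions are assumed from common NER conventions.
ENTITY_GLOSSES = {
    "PERS": "person name",
    "ORG": "organization",
    "LOC": "location",
    "DATE": "calendar date",
    "TIME": "time of day",
    "JOB": "job title or position",
    "MON": "monetary amount",
    "PCT": "percentage",
    "PERIOD": "time period or duration",
    "DOC": "document reference",
    "QUANT": "quantity",
    "ART": "work of art or artifact",
    "MISC": "miscellaneous entity",
}
```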

Usage

import spacy

# Load the installed spaCy pipeline (the model package must be installed first)
nlp = spacy.load("roberta-large-ner-uk")

# "The President of Ukraine, Volodymyr Zelenskyy, spoke in Brussels."
doc = nlp("Президент України Володимир Зеленський виступив у Брюсселі.")
print([(ent.text, ent.label_) for ent in doc.ents])
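
A follow-up sketch for filtering the extracted entities by type. `filter_entities` is a hypothetical helper, not part of spaCy or this model; it works on any object exposing `.ents` with `.text`/`.label_` attributes, so the demo uses lightweight stand-ins rather than the installed pipeline:

```python
from collections import namedtuple

def filter_entities(doc, labels):
    """Return (text, label) pairs for entities whose label is in `labels`."""
    wanted = set(labels)
    return [(ent.text, ent.label_) for ent in doc.ents if ent.label_ in wanted]

# Lightweight stand-ins so the sketch runs without the installed model;
# a real spaCy Doc exposes the same .ents / .text / .label_ attributes.
Ent = namedtuple("Ent", ["text", "label_"])
Doc = namedtuple("Doc", ["ents"])

doc = Doc(ents=[Ent("Володимир Зеленський", "PERS"), Ent("Брюсселі", "LOC")])
print(filter_entities(doc, {"PERS"}))  # [('Володимир Зеленський', 'PERS')]
```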

Authors

Vladyslav Radchenko, Nazarii Drushchak
