mt5-base Reranker ZH mMARCO/v2 Native Queries tokenised with Anserini

This is a variant of Unicamp's mt5-base reranker, initially fine-tuned on mMARCO/v2.

The queries were tokenised with pyterrier_anserini.

The model was used in the SIGIR 2025 short paper: Lost in Transliteration: Bridging the Script Gap in Neural IR.
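The Unicamp mt5-base rerankers follow the monoT5 scoring scheme: each query-document pair is formatted into a prompt and the model's probability of generating "yes" (vs. "no") is used as the relevance score. The sketch below illustrates that scheme with plain Python; the prompt template and the softmax-over-two-logits scoring are assumptions based on the monoT5 family, not details taken from this card, and no actual model weights are loaded.

```python
import math

def build_rerank_input(query: str, document: str) -> str:
    # monoT5-style prompt (assumed template): the model is asked to
    # generate "yes" or "no" after "Relevant:".
    return f"Query: {query} Document: {document} Relevant:"

def relevance_score(yes_logit: float, no_logit: float) -> float:
    # Softmax over the "yes"/"no" token logits; the probability of
    # "yes" is used as the reranking score.
    e_yes, e_no = math.exp(yes_logit), math.exp(no_logit)
    return e_yes / (e_yes + e_no)
```

Candidate documents for a query are then sorted by `relevance_score` in descending order to produce the reranked list.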

Model size: 582M parameters (Safetensors, F32)

Model: andreaschari/mt5-ZH_MMARCO_NATIVE_ANSERINI