WMT25: Multilingual Instruction

The task focuses on evaluating and advancing models that can follow instructions across multiple languages and diverse task types. One of its objectives is to establish a comprehensive evaluation framework that assesses multilingual models on a range of instruction-following capabilities.

Official website: https://www2.statmt.org/wmt25/multilingual-instruction.html


Leaderboard of WMT25: Multilingual Instruction

wmt25mist test set (*-*)

 #  Name                        BLEU  chrF  Date
 1  Anonymous submission #103   ---   ---   July 4, 2025, 11:50 a.m.
 2  Anonymous submission #64    ---   ---   July 4, 2025, 7:10 a.m.
 3  Anonymous submission #50    ---   ---   July 4, 2025, 1:04 a.m.
 4  Anonymous submission #6     ---   ---   June 28, 2025, 7:11 p.m.
 5  Anonymous submission #4     ---   ---   June 24, 2025, 10:33 p.m.
 6  Anonymous submission #3     ---   ---   June 24, 2025, 10:27 p.m.
BLEU and chrF are sacreBLEU scores. Only the top-10 submissions per language pair are displayed. Submission validation errors are denoted by a score of -1.0.
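
For reference, scores of this kind can be computed with the sacrebleu Python package. The sketch below is purely illustrative: the hypothesis and reference sentences are placeholders and are not taken from the task data.

import sacrebleu

# Placeholder system outputs and references (one reference stream).
hypotheses = ["The cat sat on the mat."]
references = [["The cat is sitting on the mat."]]

# Corpus-level BLEU and chrF, as reported by sacreBLEU.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
chrf = sacrebleu.corpus_chrf(hypotheses, references)

print(f"BLEU: {bleu.score:.2f}")
print(f"chrF: {chrf.score:.2f}")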
