Scored Systems

BLEU, BLEU-cased, and BEER 2.0: higher is better. TER and CharacTER: lower is better. "failed" marks metric runs that did not complete.

| System | Submitter | System Notes | Constraint | Run Notes | BLEU | BLEU-cased | TER | BEER 2.0 | CharacTER |
|---|---|---|---|---|---|---|---|---|---|
| uedin-nmt-ensemble | rsennrich (University of Edinburgh) | BPE neural MT system with monolingual training data (back-translated); ensemble of 4 L2R and 4 R2L models | yes | | 36.5 | 35.1 | 0.524 | 0.620 | 0.489 |
| KIT primary | eunah.cho (KIT) | | yes | | 36.1 | 34.6 | failed | 0.618 | 0.492 |
| LIUM-NMTPY-Ensemble of 6 systems | fbougares (LIUM, Le Mans University) | Ensemble of 6 different NMT systems (post-deadline) | yes | | 35.2 | 33.9 | 0.538 | 0.614 | 0.500 |
| uedin-nmt-single | rsennrich (University of Edinburgh) | BPE neural MT system with monolingual training data (back-translated); single model (contrastive) | yes | | 34.7 | 33.5 | 0.537 | 0.611 | 0.501 |
| UniMelb-NMT-Transformer-BT | vhoang (The University of Melbourne, Australia) | NMT with Transformer architecture (medium size, 4 heads + 4 encoder/decoder layers), enhanced with WMT'16 back-translation data; decoding with single best system | yes | | 34.4 | 33.2 | 0.530 | 0.610 | 0.509 |
| SYSTRAN-single | jmcrego (SYSTRAN) | OpenNMT + BPE + back-translated monolingual data + hyper-specialization | yes | | 34.3 | 33.2 | 0.529 | 0.612 | 0.505 |
| RWTH NMT | jtp (RWTH Aachen University) | Ensemble of 4 using back-translated data and BPE | yes | | 34.5 | 33.1 | 0.542 | failed | failed |
| Fascha | Fascha (University of Regensburg) | | yes | | 33.7 | 32.5 | 0.539 | 0.609 | 0.515 |
| RWTH NMT single | jtp (RWTH Aachen University) | NMT using back-translated data and BPE | yes | | 33.7 | 32.2 | 0.548 | 0.605 | 0.511 |
| uedin-nmt-2016 | rsennrich (University of Edinburgh) | Single system of WMT16 (uedin-nmt-single); contrastive | yes | | 32.4 | 31.1 | 0.556 | 0.599 | 0.520 |
| RWTH Phrasal JTR Decoder w/ attention-based NMT | guta (RWTH Aachen University) | Phrasal decoder using a count-based LM, wcLM, and JTR model; NMT system used in rescoring | yes | | 32.4 | 31.0 | 0.576 | 0.596 | 0.526 |
| RWTH Phrasal JTR Decoder w/ attention-based and alignment-based NMT | guta (RWTH Aachen University) | Phrasal decoder using a count-based LM and JTR model; NMT system and alignment-based neural models used in rescoring | yes | | 32.3 | 30.9 | 0.577 | 0.597 | 0.526 |
| lium-nmt-ensemble | fbougares (LIUM, Le Mans University) | Ensemble of 2 NMT models without back-translation | yes | | 31.5 | 30.1 | 0.575 | failed | failed |
| C-3MA | mphi (University of Tartu) | Nematus + filtered monolingual back-translated data + NE forcing + n-gram deduplication | yes | NeuralMonkey + filtered monolingual back-translated data + NE forcing + n-gram deduplication | 30.2 | 29.0 | 0.575 | failed | failed |
| NJUNMT | ZhaoChengqi (Nanjing University) | Single layer, cGRU, baseline | yes | | 29.6 | 28.4 | 0.597 | 0.578 | 0.557 |
| TALP-UPC | cescolano (TALP-UPC) | Character-to-character NMT system with additional monolingual training data (back-translated) | yes | Character-to-character NMT system with extra bilingual corpus | 29.3 | 28.1 | 0.582 | 0.586 | 0.540 |
| BaseNematusDeEn | m4t1ss (Tilde) | | yes | This should be right | 28.4 | 27.2 | 0.611 | 0.571 | 0.567 |
| ParFDA | bicici | de-en ParFDA Moses phrase-based SMT system | yes | de-en (after the deadline) | 27.3 | 26.1 | 0.634 | 0.567 | 0.595 |
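As a quick sanity check, the ordering above (by cased BLEU, descending) can be re-derived with a short script. The dictionary below is illustrative, not part of the matrix; the scores are copied from the BLEU-cased column.

```python
# Re-derive the leaderboard order from the BLEU-cased column.
# Scores copied from the table above; this helper is illustrative only.
bleu_cased = {
    "uedin-nmt-ensemble": 35.1,
    "KIT primary": 34.6,
    "LIUM-NMTPY-Ensemble of 6 systems": 33.9,
    "uedin-nmt-single": 33.5,
    "UniMelb-NMT-Transformer-BT": 33.2,
    "SYSTRAN-single": 33.2,
    "RWTH NMT": 33.1,
    "Fascha": 32.5,
    "RWTH NMT single": 32.2,
    "uedin-nmt-2016": 31.1,
    "RWTH Phrasal JTR Decoder w/ attention-based NMT": 31.0,
    "RWTH Phrasal JTR Decoder w/ attention-based and alignment-based NMT": 30.9,
    "lium-nmt-ensemble": 30.1,
    "C-3MA": 29.0,
    "NJUNMT": 28.4,
    "TALP-UPC": 28.1,
    "BaseNematusDeEn": 27.2,
    "ParFDA": 26.1,
}

# Sort descending by score, as the matrix does when ranking runs.
ranking = sorted(bleu_cased.items(), key=lambda kv: -kv[1])
for rank, (system, score) in enumerate(ranking, start=1):
    print(f"{rank:2d}. {system}: {score:.1f}")
```

Note that BLEU alone does not determine a total order: SYSTRAN-single and UniMelb-NMT-Transformer-BT tie at 33.2 cased BLEU, and ties are broken arbitrarily by the sort.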