Scored Systems

| System | Submitter | System Notes | Constraint | Run Notes | BLEU | BLEU-cased | TER | BEER 2.0 | CharacTER |
|---|---|---|---|---|---|---|---|---|---|
| Fascha | Fascha (University of Regensburg) | | yes | | 41.5 | 40.2 | 0.461 | 0.655 | 0.433 |
| UniMelb-NMT-Transformer-BT | vhoang (The University of Melbourne, Australia) | NMT with Transformer architecture (medium size, 4 heads + 4 encoder/decoder layers), enhanced with WMT'16 back-translation data; decoding with single best system | yes | | 40.3 | 39.0 | 0.471 | 0.643 | 0.449 |
| uedin-nmt-ensemble | rsennrich (University of Edinburgh) | BPE neural MT system with monolingual training data (back-translated); ensemble of 4, reranked with right-to-left model | yes | | 39.9 | 38.6 | 0.481 | 0.643 | 0.446 |
| uedin-nmt-single | rsennrich (University of Edinburgh) | BPE neural MT system with monolingual training data (back-translated); single model (contrastive) | yes | | 37.5 | 36.2 | 0.503 | 0.628 | 0.469 |
| uedin-pbt-wmt16-de-en | Matthias Huck (University of Edinburgh) | Phrase-based Moses | yes | with LDC Gigaword | 36.4 | 35.1 | 0.525 | 0.626 | 0.489 |
| Moses Phrase-Based | jhu-smt (Johns Hopkins University) | Phrase-based model, word clusters for all model components (LM, OSM, LR, sparse features), neural network joint model, large cc LM | yes | [25-8] | 35.8 | 34.5 | 0.531 | 0.623 | 0.487 |
| uedin-syntax | uedin-maria (University of Edinburgh) | GHKM, cc-monolingual | yes | | 35.9 | 34.4 | 0.533 | 0.624 | 0.487 |
| KIT primary | eunah.cho (KIT) | | yes | Phrase-based MT with rescoring using NN models | 35.3 | 33.9 | 0.529 | 0.623 | 0.488 |
| uedin-pbt-wmt16-de-en-contrastive | Matthias Huck (University of Edinburgh) | Phrase-based Moses (contrastive, 2015 system) | yes | with LDC Gigaword | 35.1 | 33.8 | 0.535 | 0.619 | 0.496 |
| jhu-syntax | sding (Johns Hopkins University) | GHKM, cc-monolingual | yes | | 32.3 | 31.0 | 0.571 | 0.598 | 0.524 |
| ParFDA | bicici | de-en ParFDA Moses phrase-based SMT system | yes | de-en (after the deadline) | 31.4 | 30.1 | 0.575 | 0.596 | 0.546 |

BLEU, BLEU-cased, and BEER 2.0 are quality scores where higher is better; TER and CharacTER are edit-rate metrics where lower is better.
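
For readers who want to compute comparable corpus-level scores for their own system output, the sketch below uses the sacreBLEU Python package to produce BLEU, case-insensitive BLEU, and TER. This is an illustration only: the scores in the table above come from the official evaluation pipeline, so a sacreBLEU run may differ slightly, and the file names here are hypothetical placeholders.

```python
# Minimal sketch (assumes: pip install sacrebleu); not the official WMT scoring pipeline.
import sacrebleu

# One detokenized segment per line, hypotheses and references in the same order.
# File names below are hypothetical placeholders.
with open("my_system_output.txt", encoding="utf-8") as f:
    hyps = [line.strip() for line in f]
with open("newstest_reference.en", encoding="utf-8") as f:
    refs = [line.strip() for line in f]

bleu_cased = sacrebleu.corpus_bleu(hyps, [refs])                   # cased BLEU
bleu_lower = sacrebleu.corpus_bleu(hyps, [refs], lowercase=True)   # uncased BLEU
ter = sacrebleu.corpus_ter(hyps, [refs])                           # TER: lower is better

print(f"BLEU        {bleu_lower.score:.1f}")
print(f"BLEU-cased  {bleu_cased.score:.1f}")
print(f"TER         {ter.score}")  # the table reports TER as a fraction; rescale if needed
```

BEER 2.0 and CharacTER are not part of sacreBLEU; they are computed with their own reference implementations.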