2006 Conference on Empirical Methods in Natural Language Processing (EMNLP 2006), July 22-23, 2006, Sydney, Australia
We discuss different strategies for smoothing the phrasetable in Statistical MT, and give results over a range of translation settings. We show that any form of smoothing outperforms the relative-frequency estimates that are commonly used. The best smoothing techniques yield consistent gains of approximately 1% (absolute) according to the BLEU metric.
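To make the baseline concrete: the relative-frequency estimate assigns each target phrase a conditional probability proportional to its co-occurrence count with the source phrase, which overfits rare phrase pairs. The sketch below contrasts that estimate with a simple additive (add-alpha) smoother. This is an illustrative example only, not one of the specific smoothing methods evaluated in the paper; the function names, the toy phrase pairs, and the choice of add-alpha smoothing are all assumptions for demonstration.

```python
from collections import Counter

def relative_frequency(pairs):
    """Relative-frequency estimate p(t|s) = c(s, t) / c(s) over observed pairs."""
    joint = Counter(pairs)
    marginal = Counter(s for s, _ in pairs)
    return {(s, t): c / marginal[s] for (s, t), c in joint.items()}

def add_alpha(pairs, alpha=0.5):
    """Additive smoothing: p(t|s) = (c(s, t) + alpha) / (c(s) + alpha * V),
    where V is the number of distinct target phrases observed."""
    joint = Counter(pairs)
    marginal = Counter(s for s, _ in pairs)
    targets = {t for _, t in pairs}
    V = len(targets)
    return {(s, t): (joint[(s, t)] + alpha) / (marginal[s] + alpha * V)
            for s in marginal for t in targets}

# Toy phrase-pair counts (hypothetical data for illustration).
pairs = [("la maison", "the house"),
         ("la maison", "the house"),
         ("la maison", "house")]

rf = relative_frequency(pairs)
sm = add_alpha(pairs)
```

With these counts, the relative-frequency estimate gives p("the house" | "la maison") = 2/3, while the smoothed estimate pulls it toward the uniform distribution, reserving some probability mass for less-observed translations; the paper's point is that this kind of redistribution, done well, consistently improves BLEU over the raw estimates.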