Low-Resource NLP: Self-Supervised Neural Machine Translation

Welcome to the last session in Low-Resource NLP: Multilinguality and Machine Translation!

In this final iteration of the series, we discuss Self-Supervised Neural Machine Translation (SS-NMT) in detail. We begin with an overview of SS-NMT, covering its training procedure and algorithm, among other topics.

We then cover evaluation methods commonly used in SS-NMT, including automatic evaluation metrics. Pre-trained models such as BERT, BART, mBART and GPT have achieved outstanding results in recent years, and we discuss their role as well. Towards the end, we also discuss SS-NMT in the low-resource setting.
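As an illustration of what automatic evaluation means in practice, the most common metric family scores a candidate translation by its n-gram overlap with a reference. Below is a minimal single-sentence sketch of a BLEU-style score (illustrative only; real evaluations use corpus-level BLEU implementations such as sacreBLEU, and the function name and details here are our own simplification):

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def toy_bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of modified
    n-gram precisions (n = 1..max_n) times a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0  # no smoothing in this toy version
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: punish candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_avg)
```

A perfect match scores 1.0, while a candidate sharing no words with the reference scores 0.0; production metrics add smoothing and corpus-level aggregation on top of this idea.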

The YouTube video can be found below, while the PowerPoint slides can be found in this link.
