Syllabus
Introduction – Machine Translation Overview – Language Models – Rule-based Machine Translation – Statistical Machine Translation – Encoder-decoder models – Attention mechanism – Neural Machine Translation – Phrase-based models – Tree-based models – Subword-level models – Transformer networks – Evaluation metrics
Objectives and Outcomes
Course Objectives
- The main objective of the course is to provide a basic understanding of, and implementation skills for, modern machine translation methods
- This course introduces different approaches to building machine translation systems
- This course helps students understand the evaluation metrics used to assess the performance of a machine translation model
- This course introduces the deep learning architectures used to implement machine translation systems.
Course Outcomes
After completing this course, students will be able to
| CO | Course Outcome |
| --- | --- |
| CO1 | Implement a statistical machine translation system |
| CO2 | Implement a neural machine translation system using an RNN-based encoder-decoder architecture |
| CO3 | Implement a neural machine translation system using a transformer-based encoder-decoder architecture |
| CO4 | Evaluate the performance of machine translation models |
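As an illustrative sketch of the kind of evaluation targeted by CO4, the snippet below scores a few made-up system translations against reference translations using corpus-level BLEU. It assumes the third-party sacrebleu Python package is installed; the sentences are hypothetical examples, not course data.

```python
# Minimal sketch of BLEU-based MT evaluation (assumes: pip install sacrebleu).
# Hypothesis and reference sentences below are made-up examples.
import sacrebleu

# System outputs, one translation per source sentence.
hypotheses = [
    "the cat sits on the mat",
    "machine translation is useful",
]

# sacrebleu expects a list of reference streams; each stream holds
# one reference translation per hypothesis.
references = [
    [
        "the cat is sitting on the mat",
        "machine translation is very useful",
    ]
]

# Corpus-level BLEU over all sentence pairs.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```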
CO-PO Mapping
| CO | PO1 | PO2 | PO3 | PO4 | PO5 | PO6 | PO7 | PO8 | PO9 | PO10 | PO11 | PO12 | PSO1 | PSO2 | PSO3 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CO1 | 2 | 2 | 3 | 2 | 3 | – | – | – | 1 | – | 1 | 1 | 3 | 1 | 1 |
| CO2 | 2 | 2 | 3 | 2 | 3 | – | – | – | 1 | – | 1 | 1 | 3 | 1 | 1 |
| CO3 | 2 | 2 | 3 | 2 | 3 | – | – | – | 1 | – | 1 | 1 | 3 | 1 | 1 |
| CO4 | – | – | 1 | – | 1 | – | – | – | – | – | – | 1 | – | 3 | – |
Text Books / References
Daniel Jurafsky and James H. Martin, Speech and Language Processing, 3rd edition draft, 2018 [cited 2020 June 1]. Available from: https://web.stanford.edu/~jurafsky/slp3
Philipp Koehn, Statistical Machine Translation. Cambridge University Press, 2009.
Philipp Koehn, Neural Machine Translation. Cambridge University Press, 2020.