Publication Type : Conference Paper
Publisher : IEEE
Source : 7th International Conference on Computing Methodologies and Communication (ICCMC 2023), IEEE. DOI: 10.1109/ICCMC56507.2023.10084034
Url : https://ieeexplore.ieee.org/document/10084034
Campus : Chennai
School : School of Computing
Year : 2023
Abstract : Natural Language Processing is among the emerging fields in machine learning and deep learning. Neural Machine Translation (NMT) is a subfield of Natural Language Processing that focuses on language translation. This paper discusses the different NMT methods along with their architectures. It begins with traditional NMT techniques, which perform poorly on long sentences and suffer from vocabulary-related problems. Attention-based NMT handles long sentences better, but the vocabulary problem remains; it can be addressed by combining attention-based NMT with sub-word segmentation. Some of the essential models developed in recent times are also discussed. In this work, an Adam-based Bi-directional GAN is employed to optimize the training process and to stabilize the GAN. The model is evaluated using BLEU scores and compared with existing models.
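As a rough illustration of the Adam-optimized GAN training mentioned in the abstract, the sketch below attaches Adam optimizers to a placeholder generator and discriminator and alternates their updates. The network sizes, learning rate, batch data, and loss setup are illustrative assumptions only and do not reflect the authors' actual bidirectional NMT model.

```python
# Illustrative sketch only (not the authors' code): Adam-optimized GAN training loop.
# The toy generator/discriminator stand in for the paper's NMT GAN components.
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 64))
discriminator = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 1))

# Adam optimizers are used to speed up and stabilize GAN training (hyperparameters assumed)
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4, betas=(0.5, 0.999))
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4, betas=(0.5, 0.999))
bce = nn.BCEWithLogitsLoss()

for step in range(100):
    real = torch.randn(32, 64)    # placeholder for features of real translations
    noise = torch.randn(32, 64)   # placeholder input to the generator
    fake = generator(noise)

    # Discriminator update: push real samples toward 1, generated samples toward 0
    d_opt.zero_grad()
    d_loss = (bce(discriminator(real), torch.ones(32, 1))
              + bce(discriminator(fake.detach()), torch.zeros(32, 1)))
    d_loss.backward()
    d_opt.step()

    # Generator update: try to make the discriminator label generated samples as real
    g_opt.zero_grad()
    g_loss = bce(discriminator(fake), torch.ones(32, 1))
    g_loss.backward()
    g_opt.step()
```

For the evaluation step described in the abstract, translation quality is scored with BLEU; libraries such as sacrebleu or NLTK's corpus_bleu are commonly used for this kind of comparison against existing models.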
Cite this Research Publication : R. Prasanna Kumar and Ippatapu Venkata Srisurya, "Neural Machine Translation Using Adam Optimised Generative Adversarial Network," 7th International Conference on Computing Methodologies and Communication (ICCMC 2023), IEEE, 2023. DOI: 10.1109/ICCMC56507.2023.10084034