Hyperparameter Optimization on Neural Machine Translation

Date
2019-01-01
Authors
Agnihotri, Souparni
Major Professor
Chinmay Hegde
Department
Electrical and Computer Engineering
Abstract

With the growth of deep learning in recent years, many models have been created to tackle real-world tasks autonomously. One such task is the automatic translation of text from one language to another, commonly known as Neural Machine Translation (NMT). NMT has proved to be a significant challenge, given the fluidity of human language. Most NMT models rely on Recurrent Neural Networks (RNNs), and in particular on deep Long Short-Term Memory (LSTM) networks. In this study, we explore the Sequence to Sequence Learning with Neural Networks model (Sutskever et al. (5)) and perform an ablation study of the model on two data sets: the English-Vietnamese parallel corpus from the IWSLT Evaluation Campaign and the German-English parallel corpus from the WMT Evaluation Campaign.
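
For concreteness, the sketch below shows the general shape of such an encoder-decoder model. It is a minimal sketch assuming PyTorch, not the exact architecture or configuration evaluated in this study; the vocabulary sizes, embedding width, hidden size, and layer count are illustrative assumptions, and they are precisely the kind of hyperparameters an ablation study would vary.

# Minimal LSTM encoder-decoder sketch (illustrative; sizes and names
# below are assumptions, not the configuration studied in this work).
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hidden=512, layers=2):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        # Deep LSTMs, in the spirit of Sutskever et al.'s seq2seq model.
        self.encoder = nn.LSTM(emb_dim, hidden, layers, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden, layers, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, tgt):
        # Encode the source sentence; the final (h, c) state summarizes it.
        _, state = self.encoder(self.src_emb(src))
        # Condition the decoder on that state and score target tokens.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec_out)

model = Seq2Seq(src_vocab=10000, tgt_vocab=10000)
src = torch.randint(0, 10000, (8, 20))  # batch of 8 source sentences
tgt = torch.randint(0, 10000, (8, 22))  # shifted target sentences
logits = model(src, tgt)                # shape: (8, 22, 10000)

An ablation in this setting would vary settings such as emb_dim, hidden, and layers (along with training hyperparameters like the learning rate) and re-measure translation quality on each corpus.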

Copyright
2019