bdavis9725/MSc_Machine_translation_en-zh

Machine Translation EN->ZH -- MSc Dissertation

Title: Document-Level Translation and Evaluation of Parallel Bilingual Texts using Deep Learning Techniques

Research Questions

  1. How can we effectively use existing Deep Learning techniques to improve the translation quality of bilingual parallel corpora?
  2. How can we make use of qualitative techniques to verify Neural Machine Translation model performance?

Abstract

Neural Machine Translation is one of the most active areas of research, with the main focus being translation from a source language to a target language. The aim of this paper is to understand what impact the adjustment of hyper-parameters during the training phase of model development has on the validity and accuracy of translations. Using a dataset of up to 100,000 sentence pairs, model development shows that hyper-parameter adjustment is not the only aspect that needs to be considered: whilst these adjustments produce small differences in training accuracy, automated evaluation metrics alone do not reveal the full extent of model quality. To understand model strengths and weaknesses in more detail, a questionnaire was created to gather human feedback. It shows that for a model that looks strong under automated scoring, native speakers of the target language do not agree that context is maintained in its translations. This confirms that human evaluation, although time-consuming, remains a necessity. The project developed a translation model scoring 53% by Word Error Rate, demonstrating that a model with a relatively low training time can still achieve satisfactory accuracy.
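The Word Error Rate used for automated scoring above can be reproduced with a short word-level edit-distance routine. The sketch below is an illustration rather than the dissertation's actual evaluation code, and it assumes whitespace-tokenised text — Chinese output would first need word segmentation before this metric is meaningful.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER: word-level Levenshtein distance divided by reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # dp[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # delete all remaining reference words
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # insert all hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            deletion = dp[i - 1][j] + 1
            insertion = dp[i][j - 1] + 1
            dp[i][j] = min(substitution, deletion, insertion)
    return dp[len(ref)][len(hyp)] / len(ref)


# One substitution and one deletion against a 4-word reference: WER = 2/4
print(word_error_rate("a b c d", "a x c"))  # → 0.5
```

Note that WER can exceed 1.0 when the hypothesis is much longer than the reference, so a "53%" score should be read alongside the reference lengths used.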
