SURF

Evelyn Li

Advancing Neural Machine Translation in Speed and Accuracy

Language is a crowning achievement of human civilization and a defining characteristic of human intelligence. There are over 5000 languages spoken in the world. Most people speak only a handful of languages, so translation is critical to overcoming barriers in communication. Unfortunately, translation is laborious and challenging, and cannot be done manually at scale. As a result, machine translation (MT) has been an active area of research in artificial intelligence.
This research focuses on methods to improve MT. It builds on neural machine translation, which trains a neural network to map text from one language to another. Typically, the model is trained on a parallel corpus: sentences in a source language paired with human translations in the target language.
Though powerful, MT still has unsolved problems. One is inherent in the training process: there are often multiple equally valid ways to translate the same sentence, but the system typically has access to only one reference translation and treats every other valid translation as wrong. I will be investigating ways to address this problem by building new models and evaluating them.
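The single-reference problem above can be illustrated with a minimal sketch. Here a toy next-token distribution (all sentences and probabilities are hypothetical, purely for illustration) stands in for a trained model that saw only the reference "I am happy" during training; computing negative log-likelihood shows that an equally valid paraphrase, "I am glad", incurs a much higher loss simply because it was not the reference.

```python
import math

# Hypothetical next-token distributions for the source "Je suis content.".
# Training rewarded only the single reference "I am happy", so most of
# the probability mass sits on that continuation.
model_probs = {
    ("<s>",): {"I": 0.9, "Glad": 0.05, "Happy": 0.05},
    ("<s>", "I"): {"am": 0.95, "feel": 0.05},
    ("<s>", "I", "am"): {"happy": 0.9, "glad": 0.1},
}

def sentence_nll(tokens):
    """Negative log-likelihood of a token sequence under the toy model."""
    nll, prefix = 0.0, ("<s>",)
    for tok in tokens:
        step = model_probs[prefix]
        nll -= math.log(step.get(tok, 1e-9))  # unseen tokens get near-zero mass
        prefix = prefix + (tok,)
    return nll

reference = ["I", "am", "happy"]    # the one translation seen in training
paraphrase = ["I", "am", "glad"]    # equally valid, but penalized
print(sentence_nll(reference))      # low loss
print(sentence_nll(paraphrase))     # noticeably higher loss
```

Under standard cross-entropy training, the model is pushed toward the single reference, so correct alternatives like the paraphrase are implicitly punished; this is the gap the research aims to close.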

Message to Sponsor

Hi Chen, thanks again for funding this wonderful opportunity! I have definitely learned a lot over the summer with my mentor and professors, and I really enjoyed challenging myself. I am trying my best to finish the paper before some major conference deadlines so that it has the potential to be published. I will definitely let you know once I finish it, and I hope to see you again! Thank you very much again for this summer!
  • Major: Computer Science
  • Sponsor: Chen Fund
  • Mentor: Jitendra Malik