SURF

Evelyn Li

Advancing Neural Machine Translation in Speed and Accuracy

Language is a crowning achievement of human civilization and a defining characteristic of human intelligence. More than 5,000 languages are spoken in the world, yet most people speak only a handful of them, so translation is critical to overcoming language barriers. Unfortunately, translation is laborious and challenging, and cannot be done manually at scale. As a result, machine translation (MT) has been an active area of research in artificial intelligence. This research focuses on methods to improve MT, building on neural machine translation, which uses a neural network to process languages. Typically, the network is trained on a database of text in a source language paired with manual translations into a target language. Though powerful, MT still has unsolved problems. An inherent one in its training process is that there are often multiple equally valid ways to translate the same sentence, yet the system has access to only one reference translation and treats any other valid translation as wrong. I want to investigate ways to address this problem by building new models and validating them.
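The single-reference problem described above can be sketched with a toy example. The code below is a minimal illustration, not part of the actual research: the sentences, token probabilities, and the simple log-likelihood function are all hypothetical, chosen only to show how standard cross-entropy training rewards one reference translation while treating an equally valid alternative as error.

```python
import math

# Hypothetical per-token probabilities a trained model might assign.
# Suppose the source sentence has two equally valid translations:
#   "the cat sat"  and  "the cat was sitting"
# Standard training compares the model's output to ONE reference only.
model_probs = {"the": 0.9, "cat": 0.8, "sat": 0.3, "was": 0.3, "sitting": 0.25}

def sentence_log_likelihood(probs, sentence):
    """Sum of log-probabilities the model assigns to each token."""
    return sum(math.log(probs[tok]) for tok in sentence)

reference = ["the", "cat", "sat"]
alternative = ["the", "cat", "was", "sitting"]

# Cross-entropy training minimizes the negative log-likelihood of the
# reference alone; probability mass on the alternative counts as error,
# even though a human would accept either translation.
loss_reference = -sentence_log_likelihood(model_probs, reference)
loss_alternative = -sentence_log_likelihood(model_probs, alternative)
print(f"loss vs. reference:   {loss_reference:.3f}")
print(f"loss vs. alternative: {loss_alternative:.3f}")
```

Under this objective, the model is pushed to concentrate probability on the single reference; any mass it places on the alternative raises the training loss, which is exactly the mismatch the research aims to address.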

Message to Sponsor

I would like to extend my deepest gratitude to the Chen Fund for making my research possible this summer. My experience as a SURF fellow will be a milestone in my academic development and will play an important role in my future career and life. I hope my research will succeed and bring meaningful progress to the field. Thanks again to the Chen Fund for believing in and supporting me.
  • Major: Computer Science
  • Sponsor: Ken Chen L&S
  • Mentor: Jitendra Malik