I am a Research Scientist at Lilt. My research has spanned cross-lingual, semi-supervised, and unsupervised learning for neural machine translation. Currently, I focus on personalized adaptation of translation models and efficient computer-assisted workflows for human translators.
I have a long-held dream of building an agent that understands and speaks the languages of any country or planet, in any style or tone. I want the agent to be compassionate and thoughtful, with a good understanding of the situation, like Gerty.
Nov 6, 2021: Check out this blog post on my work and life at Lilt!
Jul 19, 2021: The new homepage is launched!
When and Why is Unsupervised Neural Machine Translation Useless? In Proceedings of the 22nd Annual Conference of the European Association for Machine Translation (EAMT 2020).
Pivot-based Transfer Learning for Neural Machine Translation between Non-English Languages. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP 2019).
When and Why is Document-level Context Useful in Neural Machine Translation? In Proceedings of the Fourth Workshop on Discourse in Machine Translation (DiscoMT 2019).
Effective Cross-lingual Transfer of Neural Machine Translation Models without Shared Vocabularies. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019).