My research aims to generate long-form text that exhibits high-level plot structure, conforms to domain knowledge, and connects to real-world events while maintaining consistency and continuity. To this end, I develop deep generative models, a toolkit that allows us to explicitly model the dependency structures among variables of interest and to learn these dependencies from partially observed data.
I am also interested in open-source projects that make my research more readily available to developers and researchers.
Current Research Area
- Deep generative models for probabilistic text generation.
Cascaded Text Generation with Markov Transformers
Yuntian Deng, Alexander M. Rush.
Residual Energy-Based Models for Text Generation
Yuntian Deng, Anton Bakhtin, Myle Ott, Arthur Szlam, Marc'Aurelio Ranzato.
Latent Alignment and Variational Attention
Yuntian Deng*, Yoon Kim*, Justin Chiu, Demi Guo, Alexander M. Rush.
Image-to-Markup Generation with Coarse-to-Fine Attention
Yuntian Deng, Anssi Kanervisto, Jeffrey Ling, Alexander M. Rush.
Neural Linguistic Steganography
Zachary Ziegler*, Yuntian Deng*, Alexander Rush.
EMNLP 2019 (oral)
OpenNMT: Open-Source Toolkit for Neural Machine Translation
Guillaume Klein, Yoon Kim, Yuntian Deng, Jean Senellart, Alexander M. Rush.
ACL Demo 2017 (Best Demo Runner-up)
dengyuntian at seas.harvard.edu
Harvard SEC Room 5.443, Cambridge, MA