Yuntian Deng

Ph.D. Student
Harvard University

My research aims to generate long-form text that exhibits high-level plot structure, conforms to domain knowledge, and connects to real-world events, all while maintaining consistency and continuity. To this end, I develop deep generative models, a toolkit that allows us to explicitly model the dependency structures among variables of interest and to learn those dependencies from partially observed data.

I am also interested in open-source projects that make my research more readily available to developers and researchers.

Current Research Area

  • Deep generative models for probabilistic text generation.

Selected Papers

Cascaded Text Generation with Markov Transformers
Yuntian Deng, Alexander M. Rush.
NeurIPS 2020

Residual Energy-Based Models for Text Generation
Yuntian Deng, Anton Bakhtin, Myle Ott, Arthur Szlam, Marc'Aurelio Ranzato.
ICLR 2020

Latent Alignment and Variational Attention
Yuntian Deng*, Yoon Kim*, Justin Chiu, Demi Guo, Alexander M. Rush.
NIPS 2018

Image-to-Markup Generation with Coarse-to-Fine Attention
Yuntian Deng, Anssi Kanervisto, Jeffrey Ling, Alexander M. Rush.
ICML 2017

Neural Linguistic Steganography
Zachary Ziegler*, Yuntian Deng*, Alexander Rush.
EMNLP 2019 (oral)

OpenNMT: Open-Source Toolkit for Neural Machine Translation
Guillaume Klein, Yoon Kim, Yuntian Deng, Jean Senellart, Alexander M. Rush.
ACL Demo 2017 (Best Demo Runner-up)


Yuntian Deng
dengyuntian at
Harvard SEC Room 5.443, Cambridge, MA