Yuntian Deng

Ph.D. Student
Harvard University
[CV] [Google Scholar] [Email]

My research aims to enable ubiquitous long-form text generation, where anyone can produce a document on any topic with just a few clicks. Toward this goal, I focus on three directions, combining advances in deep learning with probabilistic modeling:

  • Long-Form Coherence to generate coherent documents with self-consistency and clear transitions.
  • Transparent Generation to enable users to understand, debug, and control document generation.
  • Efficient Systems to handle the scale and real-time requirements of long-form text generation.

I also work on open-source projects such as OpenNMT, Im2LaTeX, LaTeX2Im, and Steganography to make my research more readily available to developers and researchers.

Statements

[Research Statement] [Teaching Statement] [Interdisciplinary Statement] [Diversity Statement]

Selected Papers

Markup-to-Image Diffusion Models with Scheduled Sampling
Yuntian Deng, Noriyuki Kojima, Alexander M. Rush.
ICLR 2023

Model Criticism for Long-Form Text Generation
Yuntian Deng, Volodymyr Kuleshov, Alexander M. Rush.
EMNLP 2022

Cascaded Text Generation with Markov Transformers
Yuntian Deng, Alexander M. Rush.
NeurIPS 2020

Residual Energy-Based Models for Text Generation
Yuntian Deng, Anton Bakhtin, Myle Ott, Arthur Szlam, Marc'Aurelio Ranzato.
ICLR 2020

Latent Alignment and Variational Attention
Yuntian Deng*, Yoon Kim*, Justin Chiu, Demi Guo, Alexander M. Rush.
NeurIPS 2018

Image-to-Markup Generation with Coarse-to-Fine Attention
Yuntian Deng, Anssi Kanervisto, Jeffrey Ling, Alexander M. Rush.
ICML 2017

Neural Linguistic Steganography
Zachary Ziegler*, Yuntian Deng*, Alexander M. Rush.
EMNLP 2019 (Oral)

OpenNMT: Open-Source Toolkit for Neural Machine Translation
Guillaume Klein, Yoon Kim, Yuntian Deng, Jean Senellart, Alexander M. Rush.
ACL Demo 2017 (Best Demo Runner-up)

Contact

Science and Engineering Complex 5.443
150 Western Avenue
Boston, MA 02134
USA