Yuntian Deng, Ph.D., Harvard University. [CV] [Google Scholar] [Email]
My research aims to enable ubiquitous long-form text generation, where anyone can produce a document on any topic with just a few clicks. To achieve this goal, I combine advances in deep learning with probabilistic modeling, focusing on three directions:
- Long-Form Coherence to generate coherent documents with self-consistency and clear transitions.
- Transparent Generation to enable users to understand, debug, and control document generation.
- Efficient Systems to handle the scale and real-time requirements of long-form text generation.
I also work on open-source projects such as OpenNMT, Im2LaTeX, LaTeX2Im, and Steganography to make my research more readily accessible to developers and researchers.
Statements
[Research Statement] [Teaching Statement] [Interdisciplinary Statement] [Diversity Statement]
News
- Mar 29, 2023: OpenAIWatch.com is launched! It tracks GPT-4's nondeterministic behavior in unicorn illustrations, even under greedy decoding (a minimal reproduction sketch appears after this list). 🦄
- Mar 29, 2023: Our GPT-4 chatbot, based on Yuvraj Sharma's code, is now live! It provides free access to GPT-4 with the aim of collecting dialogue data for research purposes.
- Oct 18, 2022: Our latest paper, Model Criticism for Long-Form Text Generation, is now publicly available! It uses model criticism in latent space to quantify various notions of high-level coherence in long-form text generation (a toy illustration follows this list).
- Oct 12, 2022: Markup-to-Image Diffusion Models demo is now live! This project uses a diffusion model to learn to render various types of markup, including LaTeX.
- Jun 2, 2020: Our latest paper, Cascaded Text Generation with Markov Transformers, is available! It enables parallel, fast, and accurate autoregressive text generation using a high-order Markov model (a toy coarse-to-fine decoding sketch appears after this list).
- Apr 26, 2020: Introducing Residual Energy-Based Models for Text Generation, a globally normalized approach to text generation! A global discriminator guides the traditional locally normalized language model toward text that is harder to distinguish from human-written text (a minimal sampling sketch appears after this list).
- Sep 5, 2019: Neural Linguistic Steganography demo is now live! This project lets you hide secret messages in natural language using arithmetic coding (a simplified toy variant is sketched after this list).
- Dec 19, 2016: Excited to introduce OpenNMT, an open-source neural machine translation toolkit developed for industrial and academic use.
- Sep 19, 2016: Excited to announce that we've provided a solution to OpenAI's requests-for-research im2latex challenge using neural sequence-to-sequence learning! Check out the visualizations here.
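
The OpenAIWatch check above can be reproduced in a few lines. This is a minimal sketch assuming the official `openai` Python package (v1+) and an `OPENAI_API_KEY` in the environment; the unicorn prompt here is illustrative, not the site's exact one.

```python
# Minimal sketch: request the same GPT-4 completion twice with temperature=0
# (greedy decoding) and check whether the outputs match.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
prompt = "Draw a unicorn in TikZ."  # illustrative prompt, not the site's exact one

def greedy_completion() -> str:
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # greedy decoding: always pick the most likely token
    )
    return resp.choices[0].message.content

a, b = greedy_completion(), greedy_completion()
print("identical" if a == b else "nondeterministic")
```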
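The latent-space model criticism in the EMNLP 2022 paper can be caricatured as a posterior predictive check. The sketch below is my illustration, not the paper's exact procedure: `latent_stat` and the random "latents" are placeholders for statistics computed on encoder representations of real versus model-generated documents.

```python
# Toy sketch of model criticism in a latent space: compare a high-level
# statistic on latents of real documents against its distribution under
# model-generated documents.
import numpy as np

rng = np.random.default_rng(0)

def latent_stat(z: np.ndarray) -> float:
    # Example statistic: mean cosine similarity between consecutive
    # section embeddings (a crude coherence proxy).
    z = z / np.linalg.norm(z, axis=-1, keepdims=True)
    return float((z[:-1] * z[1:]).sum(-1).mean())

# Placeholder latents; in practice these would come from an encoder applied
# to real vs. model-generated long-form documents.
real = rng.normal(size=(20, 64))
fake_draws = [rng.normal(size=(20, 64)) for _ in range(500)]

obs = latent_stat(real)
null = np.array([latent_stat(z) for z in fake_draws])
p_value = (null >= obs).mean()  # one-sided predictive p-value
print(f"observed stat={obs:.3f}, p={p_value:.3f}")
```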
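The Markov-transformer cascade can be illustrated with a two-stage toy version of coarse-to-fine decoding (random scores stand in for trained models; the real method cascades through several Markov orders): a cheap per-position model prunes candidates in parallel, and a bigram model is then decoded exactly over the small surviving lattice.

```python
# Toy coarse-to-fine cascade: unigram pruning, then exact Viterbi over the
# pruned lattice. Scores are random placeholders for a Markov transformer.
import numpy as np

rng = np.random.default_rng(0)
T, V, K = 6, 50, 5  # sequence length, vocab size, candidates kept per position

# Stage 1: order-0 (unigram) scores, pruned independently (hence in parallel)
# at every position to the top-K tokens.
unigram = rng.normal(size=(T, V))
cands = np.argsort(-unigram, axis=1)[:, :K]  # (T, K) surviving token ids

# Stage 2: order-1 (bigram) scores over surviving transitions only:
# K*K edges per position instead of V*V.
bigram = rng.normal(size=(T - 1, K, K))  # score for cands[t, i] -> cands[t+1, j]

# Exact Viterbi over the pruned lattice.
best = np.zeros((T, K))
back = np.zeros((T, K), dtype=int)
best[0] = unigram[0, cands[0]]
for t in range(1, T):
    scores = best[t - 1][:, None] + bigram[t - 1] + unigram[t, cands[t]][None, :]
    back[t] = scores.argmax(axis=0)
    best[t] = scores.max(axis=0)

# Backtrace the highest-scoring path and map back to token ids.
path = [int(best[-1].argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(back[t, path[-1]]))
path.reverse()
print([int(cands[t, i]) for t, i in enumerate(path)])
```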
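For the residual EBM item, generation can be sketched as importance resampling from p(x) ∝ p_LM(x)·exp(−E(x)): draw candidates from the base language model, then resample with weights exp(−E(x)). `sample_from_lm` and `energy` below are toy placeholders, not the paper's trained models.

```python
# Minimal sketch of residual EBM generation: candidates from a locally
# normalized LM, resampled with weights exp(-E(x)) so low-energy (more
# human-like) continuations are preferred.
import math
import random

random.seed(0)

def sample_from_lm() -> str:
    # Placeholder for sampling a full continuation from the base LM.
    return random.choice(["candidate A", "candidate B", "candidate C"])

def energy(x: str) -> float:
    # Placeholder for the learned global energy: lower = more human-like.
    return {"candidate A": 1.5, "candidate B": 0.2, "candidate C": 2.0}[x]

candidates = [sample_from_lm() for _ in range(100)]
weights = [math.exp(-energy(x)) for x in candidates]
chosen = random.choices(candidates, weights=weights, k=1)[0]
print(chosen)
```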
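Finally, the core idea behind the steganography demo: a shared language model turns next-token choices into a channel for secret bits. The sketch below uses a simplified top-2 bit-embedding scheme rather than the paper's arithmetic coder, with a deterministic toy "language model" so the round trip is checkable.

```python
# Toy LM steganography (simplified top-2 scheme, NOT arithmetic coding):
# the sender picks among high-probability next tokens according to secret
# bits; anyone with the same model can read the bits back off the choices.
import zlib
import numpy as np

VOCAB = [f"tok{i}" for i in range(16)]

def next_dist(prefix: tuple) -> np.ndarray:
    # Stand-in for a real LM: a deterministic pseudo-random distribution
    # over the toy vocabulary, conditioned on the prefix.
    seed = zlib.crc32(" ".join(prefix).encode())
    local = np.random.default_rng(seed)
    p = local.random(len(VOCAB))
    return p / p.sum()

def encode(bits: list[int]) -> list[str]:
    prefix, out = (), []
    for b in bits:
        top2 = np.argsort(-next_dist(prefix))[:2]  # two most likely tokens
        tok = VOCAB[top2[b]]                       # the bit selects which one
        out.append(tok)
        prefix += (tok,)
    return out

def decode(tokens: list[str]) -> list[int]:
    prefix, bits = (), []
    for tok in tokens:
        top2 = np.argsort(-next_dist(prefix))[:2]
        bits.append(0 if VOCAB[top2[0]] == tok else 1)
        prefix += (tok,)
    return bits

msg = [1, 0, 0, 1, 1, 0]
assert decode(encode(msg)) == msg  # round trip recovers the secret bits
print(encode(msg))
```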
Selected Papers
Markup-to-Image Diffusion Models with Scheduled Sampling
Yuntian Deng, Noriyuki Kojima, Alexander M. Rush. ICLR 2023
Model Criticism for Long-Form Text Generation
Yuntian Deng, Volodymyr Kuleshov, Alexander M. Rush. EMNLP 2022
Cascaded Text Generation with Markov Transformers
Yuntian Deng, Alexander M. Rush. NeurIPS 2020
Residual Energy-Based Models for Text Generation
Yuntian Deng, Anton Bakhtin, Myle Ott, Arthur Szlam, Marc'Aurelio Ranzato. ICLR 2020
Latent Alignment and Variational Attention
Yuntian Deng*, Yoon Kim*, Justin Chiu, Demi Guo, Alexander M. Rush. NeurIPS 2018
Image-to-Markup Generation with Coarse-to-Fine Attention
Yuntian Deng, Anssi Kanervisto, Jeffrey Ling, Alexander M. Rush. ICML 2017
Neural Linguistic Steganography
Zachary Ziegler*, Yuntian Deng*, Alexander M. Rush. EMNLP 2019 (Oral)
OpenNMT: Open-Source Toolkit for Neural Machine Translation
Guillaume Klein, Yoon Kim, Yuntian Deng, Jean Senellart, Alexander M. Rush. ACL Demo 2017 (Best Demo Runner-up)
Contact
Science and Engineering Complex 5.443
150 Western Avenue
Boston, MA 02134
USA