Yuntian Deng

Assistant Professor, UWaterloo
Associate, Harvard SEAS
Faculty Affiliate, Vector Institute
PhD in CS, Harvard
[CV] [Google Scholar] [Twitter]

I am an assistant professor at the University of Waterloo. My research interests are Natural Language Processing and Machine Learning. I also enjoy building demos such as WildVis, WildChat, Multiplication Predictor w/o CoT, Grade School Math Solver w/o CoT, OpenAI Watch, Linguistic Steganography, AKSelectionPredictor, OpenNMT, Markup-to-Image Diffusion, and Image-to-Markup. I received my PhD from Harvard University, where I was advised by Prof. Alexander Rush and Prof. Stuart Shieber. I did a postdoc under the supervision of Prof. Yejin Choi.


Representative Works

These are some of my representative works. For a full list of my papers, please see here or my Google Scholar.

From Explicit CoT to Implicit CoT: Learning to Internalize CoT Step by Step
Yuntian Deng, Yejin Choi, Stuart Shieber.
In submission

Implicit Chain of Thought Reasoning via Knowledge Distillation
Yuntian Deng, Kiran Prasad, Roland Fernandez, Paul Smolensky, Vishrav Chaudhary, Stuart Shieber.
In submission

WildChat: 1M ChatGPT Interaction Logs in the Wild
Wenting Zhao, Xiang Ren, Jack Hessel, Claire Cardie, Yejin Choi, Yuntian Deng.
ICLR 2024 Spotlight
Featured in the Washington Post
Used in OpenAI's o1 for safety evaluation
Used in Anthropic's Claude 3 for evaluating refusals

WildVis: Open Source Visualizer for Million-Scale Chat Logs in the Wild
Yuntian Deng, Wenting Zhao, Jack Hessel, Xiang Ren, Claire Cardie, Yejin Choi.
EMNLP 2024 Demo

Tree Prompting: Efficient Task Adaptation without Fine-Tuning
John Xavier Morris*, Chandan Singh*, Alexander M. Rush, Jianfeng Gao, Yuntian Deng.
EMNLP 2023

Markup-to-Image Diffusion Models with Scheduled Sampling
Yuntian Deng, Noriyuki Kojima, Alexander M. Rush.
ICLR 2023

Model Criticism for Long-Form Text Generation
Yuntian Deng, Volodymyr Kuleshov, Alexander M Rush.
EMNLP 2022

Cascaded Text Generation with Markov Transformers
Yuntian Deng, Alexander M. Rush.
NeurIPS 2020

Residual Energy-Based Models for Text Generation
Yuntian Deng, Anton Bakhtin, Myle Ott, Arthur Szlam, Marc'Aurelio Ranzato.
ICLR 2020
Referenced by Meta's Llama 2
Referenced by the DDPM paper (Denoising Diffusion Probabilistic Models)

Bottom-Up Abstractive Summarization
Sebastian Gehrmann, Yuntian Deng, Alexander Rush.
EMNLP 2018

Latent Alignment and Variational Attention
Yuntian Deng*, Yoon Kim*, Justin Chiu, Demi Guo, Alexander M. Rush.
NeurIPS 2018

Image-to-Markup Generation with Coarse-to-Fine Attention
Yuntian Deng, Anssi Kanervisto, Jeffrey Ling, and Alexander M. Rush.
ICML 2017

Neural Linguistic Steganography
Zachary Ziegler*, Yuntian Deng*, Alexander Rush.
EMNLP 2019 (Oral)

OpenNMT: Open-Source Toolkit for Neural Machine Translation
Guillaume Klein, Yoon Kim, Yuntian Deng, Jean Senellart, Alexander M. Rush.
ACL 2017 Demo (Best Demo Runner-up)


Prospective Students

I plan to take new PhD students this year. Special consideration will be given to those who can solve the following challenge by January 2025: in the GPT-2 model trained to directly solve 20-by-20 multiplication without using intermediate steps (implicit chain of thought, learned by internalizing CoT step by step), what internal algorithm does the trained model use to solve multiplication directly? (This is not about the training procedure, but about the model's internal reasoning process after training.) Please see our demo and our paper for more details; a minimal probing sketch is included below.
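
For concreteness, here is a minimal Python sketch (using Hugging Face Transformers) of one way to start examining the model's internals: run a multiplication prompt through a GPT-2-style model, collect hidden states, and probe them for intermediate quantities. The checkpoint name, prompt format, and probing targets below are illustrative assumptions rather than the exact setup from the paper.

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# "gpt2" is a stand-in so this sketch runs out of the box; swap in the
# implicit-CoT multiplication checkpoint from the demo/paper to probe
# the actual trained model.
model_name = "gpt2"
tokenizer = GPT2TokenizerFast.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)
model.eval()

# The exact input format (digit order, separators) follows the paper's
# data preprocessing and may differ from this plain example.
prompt = "12345678901234567890 * 98765432109876543210 ="
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# outputs.hidden_states is a tuple of (num_layers + 1) tensors, each of
# shape (batch, seq_len, hidden_dim). A natural next step is to train
# linear probes on these activations to test whether intermediate
# quantities (e.g., digits of partial products or carries) are linearly
# decodable at particular layers and token positions.
hidden_states = outputs.hidden_states
print(len(hidden_states), hidden_states[-1].shape)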

Important Note: I kindly request that prospective students apply directly through the University of Waterloo's application system. Please refrain from sending me emails without a specific question. To ensure that your email receives the attention it deserves, please read my papers or address the challenge above before writing.