Yuntian Deng
Harvard University
Research

Yuntian works on natural language generation using deep generative models. In particular, his research targets two challenges of scaling text generation models to long-form generation: first, reducing the computational complexity of neural text generation models through probabilistic formulations that enable fast inference; and second, maintaining the global consistency and high-level organization of the generated text by introducing explicit latent structures into the model. To make his research readily available, Yuntian also contributes to open-source libraries such as OpenNMT and image-to-LaTeX.

Bio

Yuntian Deng is a Ph.D. candidate in Computer Science at Harvard University, advised by Prof. Alexander Rush and Prof. Stuart Shieber. He is interested in natural language generation using deep generative models. Yuntian received a B.E. in Automation from Tsinghua University and an M.S. in Language Technologies from Carnegie Mellon University.

Hometown
Shenyang, China