Yejin Choi is Senior Director of Large Language Model (LLM) Research at NVIDIA and the Wissner-Slivka Professor at the Paul G. Allen School of Computer Science & Engineering at the University of Washington. Her current research interests include synthetic data for LLM training, test-time reasoning algorithms, word-based world models, symbolic knowledge distillation, and pluralistic alignment. She is a MacArthur Fellow (class of 2022), was named to the TIME100 Most Influential People in AI list in 2023, and is a co-recipient of two Test-of-Time Awards (ACL 2021 and CVPR 2021) and eight Best and Outstanding Paper Awards at ACL, EMNLP, NAACL, ICML, NeurIPS, and AAAI. She was a main-stage speaker at TED 2023 and has given keynotes at conferences across several disciplines, including MLSys, CVPR, VLDB, ACL, ICLR, WebConf, and AAAI. She is currently serving as a General Chair for the inaugural Conference on Language Modeling (CoLM).