I am a third-year Ph.D. student in Computer Science at Stanford University, where I am advised by Stefano Ermon and affiliated with the SAIL and StatML groups. My research interests lie in probabilistic machine learning: in particular, I am interested in developing techniques for better adaptation in generative models and robust representation learning, along with applications of these methods.

My research is supported by the NSF GRFP, the Stanford Graduate Fellowship, and the Qualcomm Innovation Fellowship. I completed my undergraduate studies in computer science and statistics at Columbia, where I worked on problems in computational biology as part of the Pe'er lab.

I previously interned at Google Brain in 2019 as part of the Magenta project. In my free time, I'm an avid tennis player, runner, and food enthusiast!

Publications

Encoding Musical Style with Transformer Autoencoders.
Kristy Choi, Curtis Hawthorne, Ian Simon, Monica Dinculescu, Jesse Engel
International Conference on Machine Learning (ICML), 2020.
[arXiv] [code]
Fair Generative Modeling via Weak Supervision.
Kristy Choi*, Aditya Grover*, Trisha Singh, Rui Shu, Stefano Ermon
International Conference on Machine Learning (ICML), 2020.
[arXiv] [code]
Meta-Amortized Variational Inference and Learning.
Mike Wu*, Kristy Choi*, Noah Goodman, Stefano Ermon
AAAI Conference on Artificial Intelligence (AAAI), 2020.
[arXiv]
Neural Joint-Source Channel Coding.
Kristy Choi, Kedar Tatwawadi, Aditya Grover, Tsachy Weissman, Stefano Ermon
International Conference on Machine Learning (ICML), 2019. Long oral.
[pdf] [code]
Wishbone identifies bifurcating developmental trajectories from single-cell data.
Manu Setty, Michelle Tadmor, Shlomit Reich-Zeliger, Omer Angel, Tomer Salame, Pooja Kathail, Kristy Choi, Sean Bendall, Nir Friedman, Dana Pe'er
Nature Biotechnology, 34(6), 637-645, 2016.
[pdf]

Workshop Papers

Tensor Decomposition for Single-cell RNA-seq Data.
Kristy Choi*, Ambrose J. Carr*, Sandhya Prabhakaran, Dana Pe'er
Practical Bayesian Nonparametrics Workshop, NeurIPS 2016.
[pdf]

Teaching

Fall 2019: Head Teaching Assistant for CS236: Deep Generative Models at Stanford

Fall 2018: Teaching Assistant for CS236: Deep Generative Models at Stanford

Spring 2017: Head Teaching Assistant for COMS4117: Machine Learning at Columbia

Service

Reviewer: NeurIPS 2020, ICML 2020, UAI 2020, ICLR 2020, AAAI 2020, NeurIPS 2019, ICML 2019, ICLR 2019.

Workshop Co-Organizer: Information Theory & Machine Learning (NeurIPS 2019)