This fall I began work as a Predoctoral Young Investigator (PYI) on the OLMo team at AI2 in Seattle, working on a fully open ecosystem of language models!
I'm interested in understanding language model behavior. This includes fine-grained evaluation of LLM generation [1, 2], using measures of behavior to improve text generation [3], and interpreting language model behavior with existing theories of human cognition [4].
Previously, I completed my undergrad at Georgia Tech, where I was fortunate to be advised by Prof. Wei Xu and work with Yao Dou and Dr. Mounica Maddela. I've also spent a few summers as an intern at AWS and Patientco, a healthcare / fintech startup. I enjoy reading, hiking, and making homebrew nitrogen cold brew. ☕️ ⛰️
Improving Minimum Bayes Risk Decoding with Multi-Prompt [code]
Towards a Path Dependent Account of Category Fluency [code]
Thresh 🌾: Unified, Customizable and Deployable Fine-Grained Text Evaluation [thresh.tools]
Edit-level Simplification Evaluation using SALSA 💃 [code/data, metric]
LENS: A Learnable Evaluation Metric for Text Simplification
* = equal contribution
A few corners of the internet I find worth checking out!
... to flip through
Games, Puzzles, and Computation by Erik Demaine
The Age of Surveillance Capitalism by Shoshana Zuboff
Society Must Be Defended by Michel Foucault
Gödel, Escher, Bach by Douglas Hofstadter
I also enjoy trying new coffee shops. Here are 50+ recommendations for spots across Atlanta that I visited during my undergrad.