Clara Na

Hello! I am a second-year PhD student at Carnegie Mellon University’s Language Technologies Institute, where I am advised by Emma Strubell and supported by an NSF Graduate Research Fellowship.

Before coming to CMU, I earned a BA in Computer Science and Mathematics at the University of Virginia. I began my research journey at UVA looking for “subtractive” design in patents with Katelyn Stenger and Leidy Klotz. My NLP origin story involves my half-baked bilingualism, a data science internship at The Washington Post, and some generous mentorship from Yangfeng Ji.

I am (very) broadly interested in language, information, and the impacts and applications of language technologies.

Misc: I was born and raised in Northern Virginia (NoVA). My middle name is 선우 (Seon-Woo); I am a second-generation Korean American. I have a younger brother who is an undergrad here at CMU. In my spare time I like playing piano (especially with other people) and running.


news

Oct 2022 Our paper, Train Flat, Then Compress, was accepted to EMNLP Findings ’22!
Aug 2022 Received the Best Novel Work award at the LTI Student Research Symposium :)
Aug 2021 Started grad school!
Mar 2021 Awarded an NSF Graduate Research Fellowship!

selected publications

  1. Virtual Task Selection in Meta-Learning for Domain Generalization in Semantic Parsing
     In submission, 2022
  2. Train Flat, Then Compress: Sharpness-Aware Minimization Learns More Compressible Models
     To appear in Findings of EMNLP 2022