Seyeon Lee

seyeonle@usc.edu

Hello! I’m a second-year master’s student in Computer Science at the University of Southern California. I work in the INK (Intelligence and Knowledge Discovery) Lab, advised by Professor Xiang Ren. I’m interested in analysis of pre-trained language models, language model pre-training, and commonsense reasoning.

You can find my CV here.

Outside of research, I like baking, working out, and watching Netflix. XD!

Publications

Birds have four legs?! NumerSense: Probing Numerical Commonsense Knowledge of Pre-trained Language Models
In Empirical Methods in Natural Language Processing (EMNLP) 2020
Bill Yuchen Lin, Seyeon Lee, Rahul Khanna, and Xiang Ren
Paper | Project Website

Pre-training Text-to-Text Transformers for Concept-centric CommonSense
In the International Conference on Learning Representations (ICLR) 2021 (previous version in SSL @ NeurIPS 2020)
Wangchunshu Zhou*, Dong-Ho Lee*, Ravi Kiran Selvam, Seyeon Lee, Bill Yuchen Lin, and Xiang Ren
Preprint

LEAN-LIFE: A Label-Efficient Annotation Framework Towards Learning from Explanation
In the Annual Meeting of the Association for Computational Linguistics (ACL) 2020 (system demo)
Dong-Ho Lee*, Rahul Khanna*, Bill Yuchen Lin, Jamin Chen, Seyeon Lee, Qinyuan Ye, Elizabeth Boschee, Leonardo Neves, and Xiang Ren
Paper | Project Website