Seyeon Lee
seyeonle@usc.edu
Hello! I’m a second-year master’s student in Computer Science at the University of Southern California. I’m working in the INK (Intelligence and Knowledge Discovery) Lab, advised by Professor Xiang Ren. I’m interested in the analysis of pre-trained language models, language model pre-training, and commonsense reasoning.
You can find my CV here.
Outside of research, I like baking, working out, playing the ukulele, and watching Netflix. XD!
Preprint
RICA: Evaluating Robust Inference Capabilities Based on Commonsense Axioms
Pei Zhou, Rahul Khanna, Seyeon Lee, Bill Yuchen Lin, Daniel Ho, Jay Pujara, Xiang Ren
Paper
Publications
Common Sense Beyond English: Evaluating and Improving Multilingual Language Models for Common Sense Reasoning
in the Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP) 2021
Bill Yuchen Lin, Seyeon Lee, Xiaoyang Qiao and Xiang Ren
Birds have four legs?! NumerSense: Probing Numerical Commonsense Knowledge of Pre-trained Language Models
in Empirical Methods in Natural Language Processing (EMNLP) 2020
Bill Yuchen Lin, Seyeon Lee, Rahul Khanna and Xiang Ren
Paper · Project Website
Pre-training Text-to-Text Transformers for Concept-centric Common Sense
in the International Conference on Learning Representations (ICLR) 2021 (previous version in SSL@NeurIPS 2020)
Wangchunshu Zhou*, Dong-Ho Lee*, Ravi Kiran Selvam, Seyeon Lee, Bill Yuchen Lin and Xiang Ren
Preprint
LEAN-LIFE: A Label-Efficient Annotation Framework Towards Learning from Explanation
in the Annual Meeting of the Association for Computational Linguistics (ACL) 2020 (system demo)
Dong-Ho Lee*, Rahul Khanna*, Bill Yuchen Lin, Jamin Chen, Seyeon Lee, Qinyuan Ye, Elizabeth Boschee, Leonardo Neves and Xiang Ren
Paper · Project Website