Hi, I am a fourth-year Ph.D. student in Computer Science at UCSB and a member of the UCSB NLP Group. During my Ph.D., I have done summer internships at AWS AI Labs, Apple MLR, and Microsoft Research. Before joining UCSB, I worked as a research scientist at NAVER and earned both my B.S. and M.S. from Seoul National University.
My research focuses on improving the efficiency and reliability of language models by developing algorithms for retrieval (DensePhrases-UL-HN and PKM-augmented PLMs), adaptive computation (Length-Adaptive Transformer and Dynamic-TinyBERT), data augmentation (SSMix and VAT-D), and generation (Subword LM QAC and SOM-DST). I have also worked on multi-modal learning (Textual KD SLU and ST-BERT) and membership inference attacks on LLMs (EM-MIA). Recently, I have been interested in efficient LLM inference (e.g., KV cache compression and speculative decoding), long-context modeling, inference scaling, and retrieval-augmented generation.
M.S. in Electrical and Computer Engineering, 2017
Seoul National University
B.S. in Electrical and Computer Engineering / Mathematical Science (double major), 2014
Seoul National University
High School (early graduation), 2010
Seoul Science High School
At NAVER, I worked on:
Fundamental Research for Big LMs
Language Representation by Clova (LaRva)
Context Center AI (CCAI)
Korean Grammatical Error Correction
Language Model based Query Auto-Completion
LINE Sticker Reply Recommendation
Community Question Answering based on Query Similarity
$*$ denotes equal contribution