Publications

Preprints (arXiv)

[P2] Subgraph-Aware Training of Text-based Methods for Knowledge Graph Completion

[P1] Investigating the Influence of Prompt-Specific Shortcuts in AI Generated Text Detection

International Conferences

2024

[C22] Adaptive Contrastive Decoding in Retrieval-Augmented Generation for Handling Noisy Contexts

[C21] Revisiting the Impact of Pursuing Modularity for Code Generation

[C20] Aligning Language Models to Explicitly Handle Ambiguity

[C19] Hyper-CL: Conditioning Sentence Representations with Hypernetworks

[C18] Analysis of Multi-Source Language Training in Cross-Lingual Transfer

[C17] BlendX: Complex Multi-Intent Detection with Blended Patterns 

2023

[C16] X-SNS: Cross-Lingual Transfer Prediction through Sub-Network Similarity

[C15] Universal Domain Adaptation for Robust Handling of Distributional Shifts in NLP

[C14] Prompt-Augmented Linear Probing: Scaling Beyond the Limit of Few-shot In-Context Learners

2022

[C13] Ground-Truth Labels Matter: A Deeper Look into Input-Label Demonstrations

[C12] Enhancing Out-of-Distribution Detection in Natural Language Understanding via Implicit Layer Ensemble

[C11] Revisiting the Practical Effectiveness of Constituency Parse Extraction from Pre-trained Language Models

Before 2022

[C10] Multilingual Chart-based Constituency Parse Extraction from Pre-trained Language Models

[C9] Self-Guided Contrastive Learning for BERT Sentence Representations

[C8] Heads-up! Unsupervised Constituency Parsing via Self-Attention Heads

[C7] Are Pre-trained Language Models Aware of Phrases? Simple but Strong Baselines for Grammar Induction

[C6] Cell-aware Stacked LSTMs for Modeling Sentences

[C5] Don't Just Scratch the Surface: Enhancing Word Representations for Korean with Hanja

[C4] Intrinsic Evaluation of Grammatical Information within Word Embeddings

[C3] A Cross-Sentence Latent Variable Model for Semi-Supervised Text Sequence Matching

[C2] Dynamic Compositionality in Recursive Neural Networks with Structure-aware Tag Representations

[C1] Element-wise Bilinear Interaction for Sentence Matching

International Workshops

[W6] Self-Generated In-Context Learning: Leveraging Auto-regressive Language Models as a Demonstration Generator

[W5] HYU at SemEval-2022 Task 2: Effective Idiomaticity Detection with Consideration at Different Levels of Contextualization

[W4] IDS at SemEval-2020 Task 10: Does Pre-trained Language Model Know What to Emphasize?

[W3] Summary Level Training of Sentence Rewriting for Abstractive Summarization

[W2] SNU_IDS at SemEval-2018 Task 12: Sentence Encoder with Contextualized Vectors for Argument Reasoning Comprehension

[W1] A Syllable-based Technique for Word Embeddings of Korean Words

Domestic Conferences

[D11] 한국어 표 설명 능력 향상을 위한 전처리 및 학습 방법론 탐구
(Exploring Preprocessing and Learning Strategies to Improve Table Explanation in Korean)

[D10] KR-HumanEval을 활용한 언어 모델의 한국어 프로그램 합성 성능 분석
(Analysis of Language Models in Korean Program Synthesis Based on the KR-HumanEval Benchmark)

[D9] 한국어 발화의 다중 의도 감지 연구
(Multi-Intent Detection for Korean Spoken Language)

[D8] 다중 레이블 분류를 위한 프롬프팅 고도화
(Enhanced Prompting for Multi-Label Classification)

[D7] 생략과 상호참조를 보강한 다중 의도 데이터셋
(A Multi-Intent Dataset Enhanced with Ellipsis and Coreference)

[D6] 기계 독해를 활용한 한국어 의미역 결정
(Korean Semantic Role Labeling with Machine Reading Comprehension)

[D5] 대조학습 기반 문장표현 방법론 개선을 위한 공통 오류 분석 및 앙상블 기법
(Enhancing Sentence Representations with Common Error Analysis and Ensemble Techniques in Contrastive Learning)

[D4] 효과적인 한국어 교차언어 전송을 위한 특성 연구
(Research on Features for Effective Cross-Lingual Transfer in Korean)

[D3] MAdapter: 효율적인 중간 층 도입을 통한 Adapter 구조 개선
(MAdapter: Improving the Adapter Architecture by Introducing Efficient Intermediate Layers)

[D2] 원천 언어 다각화를 통한 교차 언어 전이 성능 향상
(Enhanced Zero-Shot Cross-Lingual Transfer with the Diversification of Source Languages)

[D1] 한국어 문장 표현을 위한 비지도 대조 학습 방법론의 비교 및 분석
(Comparison and Analysis of Unsupervised Contrastive Learning Approaches for Korean Sentence Representations)