Sanghwan Bae
NAVER CLOVA
Verified email at navercorp.com
Title
Cited by
Year
Summary level training of sentence rewriting for abstractive summarization
S Bae, T Kim, J Kim, S Lee
EMNLP 2019 Workshop, 10-20, 2019
96 | 2019
Aligning Large Language Models through Synthetic Feedback
S Kim, S Bae, J Shin, S Kang, D Kwak, KM Yoo, M Seo
EMNLP 2023, 2023
66 | 2023
Keep Me Updated! Memory Management in Long-term Conversations
S Bae, D Kwak, S Kang, MY Lee, S Kim, Y Jeong, H Kim, SW Lee, W Park, ...
EMNLP 2022 Findings, 2022
59 | 2022
Building a Role Specified Open-Domain Dialogue System Leveraging Large-Scale Language Models
S Bae, D Kwak, S Kim, D Ham, S Kang, SW Lee, W Park
NAACL 2022, 2022
45 | 2022
Dynamic compositionality in recursive neural networks with structure-aware tag representations
T Kim, J Choi, D Edmiston, S Bae, S Lee
AAAI 2019 33 (01), 6594-6601, 2019
28 | 2019
SNU_IDS at SemEval-2019 task 3: Addressing training-test class distribution mismatch in conversational classification
S Bae, J Choi, S Lee
NAACL 2019 Workshop, 2019
11 | 2019
HyperCLOVA X Technical Report
KM Yoo, J Han, S In, H Jeon, J Jeong, J Kang, H Kim, KM Kim, M Kim, ...
arXiv preprint arXiv:2404.01954, 2024
6 | 2024
Revealing user familiarity bias in task-oriented dialogue via interactive evaluation
T Kim, J Shin, YH Kim, S Bae, S Kim
arXiv preprint arXiv:2305.13857, 2023
1 | 2023
Syntactic analysis apparatus and method for the same
SS Park, CW Chun, CI Park, SH Park, JK Lee, HT Kim, SG Lee, KM Yoo, ...
US Patent 11,714,960, 2023
— | 2023
Articles 1–9