Sanghwan Bae
NAVER CLOVA
Verified email at navercorp.com
Title
Cited by
Year
Summary level training of sentence rewriting for abstractive summarization
S Bae, T Kim, J Kim, S Lee
EMNLP 2019 Workshop, 10-20, 2019
77 · 2019
Keep Me Updated! Memory Management in Long-term Conversations
S Bae, D Kwak, S Kang, MY Lee, S Kim, Y Jeong, H Kim, SW Lee, W Park, ...
EMNLP 2022 Findings, 2022
31 · 2022
Dynamic compositionality in recursive neural networks with structure-aware tag representations
T Kim, J Choi, D Edmiston, S Bae, S Lee
AAAI 2019 33 (01), 6594-6601, 2019
28 · 2019
Building a Role Specified Open-Domain Dialogue System Leveraging Large-Scale Language Models
S Bae, D Kwak, S Kim, D Ham, S Kang, SW Lee, W Park
NAACL 2022, 2022
26 · 2022
Aligning Large Language Models through Synthetic Feedback
S Kim, S Bae, J Shin, S Kang, D Kwak, KM Yoo, M Seo
EMNLP 2023, 2023
24 · 2023
SNU_IDS at SemEval-2019 task 3: Addressing training-test class distribution mismatch in conversational classification
S Bae, J Choi, S Lee
NAACL 2019 Workshop, 2019
12 · 2019
HyperCLOVA X Technical Report
KM Yoo, J Han, S In, H Jeon, J Jeong, J Kang, H Kim, KM Kim, M Kim, ...
arXiv preprint arXiv:2404.01954, 2024
2024
Syntactic analysis apparatus and method for the same
SS Park, CW Chun, CI Park, SH Park, JK Lee, HT Kim, SG Lee, KM Yoo, ...
US Patent 11,714,960, 2023
2023
Revealing User Familiarity Bias in Task-Oriented Dialogue via Interactive Evaluation
T Kim, J Shin, YH Kim, S Bae, S Kim
arXiv preprint arXiv:2305.13857, 2023
2023