Yujia Qin
ByteDance
Verified email at bytedance.com - Homepage
Title
Cited by
Year
Parameter-efficient fine-tuning of large-scale pre-trained language models
N Ding, Y Qin, G Yang, F Wei, Z Yang, Y Su, S Hu, Y Chen, CM Chan, ...
Nature Machine Intelligence 5 (3), 220-235, 2023
749* · 2023
ToolLLM: Facilitating large language models to master 16000+ real-world APIs
Y Qin, S Liang, Y Ye, K Zhu, L Yan, Y Lu, Y Lin, X Cong, X Tang, B Qian, ...
ICLR 2024 spotlight, 2023
424 · 2023
Enhancing chat language models by scaling high-quality instructional conversations
N Ding, Y Chen, B Xu, Y Qin, Z Zheng, S Hu, Z Liu, M Sun, B Zhou
arXiv preprint arXiv:2305.14233, 2023
293 · 2023
Tool learning with foundation models
Y Qin, S Hu, Y Lin, W Chen, N Ding, G Cui, Z Zeng, Y Huang, C Xiao, ...
arXiv preprint arXiv:2304.08354, 2023
243* · 2023
AgentVerse: Facilitating multi-agent collaboration and exploring emergent behaviors
W Chen, Y Su, J Zuo, C Yang, C Yuan, CM Chan, H Yu, Y Lu, YH Hung, ...
The Twelfth International Conference on Learning Representations, 2023
211* · 2023
On Transferability of Prompt Tuning for Natural Language Understanding
Y Su, X Wang, Y Qin, CM Chan, Y Lin, Z Liu, P Li, J Li, L Hou, M Sun, ...
NAACL 2022, 2021
140* · 2021
ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning
Y Qin, Y Lin, R Takanobu, Z Liu, P Li, H Ji, M Huang, M Sun, J Zhou
ACL 2021, 2020
128 · 2020
CPM: A large-scale generative Chinese pre-trained language model
Z Zhang, X Han, H Zhou, P Ke, Y Gu, D Ye, Y Qin, Y Su, H Ji, J Guan, F Qi, ...
AI Open 2, 93-99, 2021
120 · 2021
bert2BERT: Towards Reusable Pretrained Language Models
C Chen, Y Yin, L Shang, X Jiang, Y Qin, F Wang, Z Wang, X Chen, Z Liu, ...
ACL 2022, 2021
69 · 2021
WebCPM: Interactive web search for Chinese long-form question answering
Y Qin, Z Cai, D Jin, L Yan, S Liang, K Zhu, Y Lin, X Han, N Ding, H Wang, ...
arXiv preprint arXiv:2305.06849, 2023
62 · 2023
CREATOR: Tool creation for disentangling abstract and concrete reasoning of large language models
C Qian, C Han, YR Fung, Y Qin, Z Liu, H Ji
arXiv preprint arXiv:2305.14318, 2023
61* · 2023
Exploring Universal Intrinsic Task Subspace for Few-Shot Learning via Prompt Tuning
Y Qin, X Wang, Y Su, Y Lin, N Ding, J Yi, W Chen, Z Liu, J Li, L Hou, P Li, ...
IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2024
57* · 2024
ELLE: Efficient Lifelong Pre-training for Emerging Data
Y Qin, J Zhang, Y Lin, Z Liu, P Li, M Sun, J Zhou
Findings of ACL 2022, 2022
57 · 2022
Knowledge inheritance for pre-trained language models
Y Qin, Y Lin, J Yi, J Zhang, X Han, Z Zhang, Y Su, Z Liu, P Li, M Sun, ...
NAACL 2022, 2021
55 · 2021
Learning from Explanations with Neural Execution Tree
Z Wang, Y Qin, W Zhou, J Yan, Q Ye, L Neves, Z Liu, X Ren
ICLR 2020, 2019
45* · 2019
ProQA: Structural Prompt-based Pre-training for Unified Question Answering
W Zhong, Y Gao, N Ding, Y Qin, Z Liu, M Zhou, J Wang, J Yin, N Duan
NAACL 2022, 2022
34 · 2022
DebugBench: Evaluating debugging capability of large language models
R Tian, Y Ye, Y Qin, X Cong, Y Lin, Z Liu, M Sun
ACL 2024, 2024
33 · 2024
Moderate-fitting as a Natural Backdoor Defender for Pre-trained Language Models
B Zhu, Y Qin, G Cui, Y Chen, W Zhao, C Fu, Y Deng, Z Liu, J Wang, W Wu, ...
NeurIPS 2022, 2022
25 · 2022
Tell me more! towards implicit user intention understanding of language model driven agents
C Qian, B He, Z Zhuang, J Deng, Y Qin, X Cong, Y Lin, Z Zhang, Z Liu, ...
ACL 2024, 2024
22* · 2024
Exploring Mode Connectivity for Pre-trained Language Models
Y Qin, C Qian, J Yi, W Chen, Y Lin, X Han, Z Liu, M Sun, J Zhou
EMNLP 2022, 2022
22 · 2022
Articles 1–20