Peiyu Liu (刘沛羽)
Verified email at ruc.edu.cn · Homepage
Title · Cited by · Year
A survey of large language models
WX Zhao, K Zhou, J Li, T Tang, X Wang, Y Hou, Y Min, B Zhang, J Zhang, ...
arXiv preprint arXiv:2303.18223, 2023
Cited by 3441* · 2023
WenLan: Bridging vision and language by large-scale multi-modal pre-training
Y Huo, M Zhang, G Liu, H Lu, Y Gao, G Yang, J Wen, H Zhang, B Xu, ...
arXiv preprint arXiv:2103.06561, 2021
Cited by 139 · 2021
Parameter-efficient mixture-of-experts architecture for pre-trained language models
ZF Gao, P Liu, WX Zhao, ZY Lu, JR Wen
arXiv preprint arXiv:2203.01104, 2022
Cited by 37 · 2022
Enabling lightweight fine-tuning for pre-trained language model compression based on matrix product operators
P Liu, ZF Gao, WX Zhao, ZY Xie, ZY Lu, JR Wen
arXiv preprint arXiv:2106.02205, 2021
Cited by 28 · 2021
Do emergent abilities exist in quantized large language models: An empirical study
P Liu, Z Liu, ZF Gao, D Gao, WX Zhao, Y Li, B Ding, JR Wen
arXiv preprint arXiv:2307.08072, 2023
Cited by 23 · 2023
Small pre-trained language models can be fine-tuned as large models via over-parameterization
ZF Gao, K Zhou, P Liu, WX Zhao, JR Wen
Proceedings of the 61st Annual Meeting of the Association for Computational …, 2023
Cited by 11 · 2023
TikTalk: A Video-Based Dialogue Dataset for Multi-Modal Chitchat in Real World
H Lin, L Ruan, W Xia, P Liu, J Wen, Y Xu, D Hu, R Song, WX Zhao, Q Jin, ...
Proceedings of the 31st ACM International Conference on Multimedia, 1303-1313, 2023
Cited by 6 · 2023
WuDaoMM: A large-scale Multi-Modal Dataset for Pre-training models
S Yuan, S Zhao, J Leng, Z Xue, H Zhao, P Liu, Z Gong, WX Zhao, J Li, ...
arXiv preprint arXiv:2203.11480, 2022
Cited by 6 · 2022
Enhancing scalability of pre-trained language models via efficient parameter sharing
P Liu, ZF Gao, Y Chen, WX Zhao, JR Wen
Findings of the Association for Computational Linguistics: EMNLP 2023, 13771 …, 2023
Cited by 4 · 2023
TikTalk: A Multi-Modal Dialogue Dataset for Real-World Chitchat
H Lin, L Ruan, W Xia, P Liu, J Wen, Y Xu, D Hu, R Song, WX Zhao, Q Jin
arXiv preprint arXiv:2301, 2023
Cited by 2 · 2023
Scaling pre-trained language models to deeper via parameter-efficient architecture
P Liu, ZF Gao, Y Chen, WX Zhao, JR Wen
arXiv preprint arXiv:2303.16753, 2023
Cited by 1 · 2023
Compression Image Dataset Based on Multiple Matrix Product States
ZF Gao, P Liu, WX Zhao, ZY Xie, JR Wen, ZY Lu
Future of Information and Communication Conference, 621-638, 2024
2024
Image Dataset Compression Based on Matrix Product States
ZF Gao, P Liu, XH Zhang, X Zhao, ZY Xie, ZY Lu, JR Wen
Articles 1–13