Chao Lou
Verified email at shanghaitech.edu.cn
Title · Cited by · Year
Nested named entity recognition as latent lexicalized constituency parsing
C Lou, S Yang, K Tu
arXiv preprint arXiv:2203.04665, 2022
44 · 2022
Seqgpt: An out-of-the-box large language model for open domain sequence understanding
T Yu, C Jiang, C Lou, S Huang, X Wang, W Liu, J Cai, Y Li, Y Li, K Tu, ...
Proceedings of the AAAI Conference on Artificial Intelligence 38 (17), 19458 …, 2024
14 · 2024
Unsupervised vision-language parsing: Seamlessly bridging visual scene graphs with language structures via dependency relationships
C Lou, W Han, Y Lin, Z Zheng
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2022
12 · 2022
Sparser is faster and less is more: Efficient sparse attention for long-range transformers
C Lou, Z Jia, Z Zheng, K Tu
arXiv preprint arXiv:2406.16747, 2024
9 · 2024
Effective Demonstration Annotation for In-Context Learning via Language Model-Based Determinantal Point Process
P Wang, X Wang, C Lou, S Mao, P Xie, Y Jiang
arXiv preprint arXiv:2408.02103, 2024
1 · 2024
AMR Parsing with Causal Hierarchical Attention and Pointers
C Lou, K Tu
arXiv preprint arXiv:2310.11964, 2023
1 · 2023
Improving Grammar-based Sequence-to-Sequence Modeling with Decomposition and Constraints
C Lou, K Tu
arXiv preprint arXiv:2306.02671, 2023
1 · 2023
Dependency Transformer Grammars: Integrating Dependency Structures into Transformer Language Models
Y Zhao, C Lou, K Tu
arXiv preprint arXiv:2407.17406, 2024
2024
Spa: On the Sparsity of Virtual Adversarial Training for Dependency Parsing
C Lou, W Han, K Tu
Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022 …, 2022
2022