Yin-Wen Chang
Verified email at google.com
Title · Cited by · Year
Training and testing low-degree polynomial data mappings via linear SVM.
YW Chang, CJ Hsieh, KW Chang, M Ringgaard, CJ Lin
Journal of Machine Learning Research 11 (4), 2010
Cited by 750 · 2010
Feature ranking using linear SVM
YW Chang, CJ Lin
Causation and prediction challenge, 53-64, 2008
Cited by 390 · 2008
Pre-training tasks for embedding-based large-scale retrieval
WC Chang, FX Yu, YW Chang, Y Yang, S Kumar
arXiv preprint arXiv:2002.03932, 2020
Cited by 321 · 2020
O(n) connections are expressive enough: Universal approximability of sparse transformers
C Yun, YW Chang, S Bhojanapalli, AS Rawat, S Reddi, S Kumar
Advances in Neural Information Processing Systems 33, 13783-13794, 2020
Cited by 74 · 2020
A simple and effective positional encoding for transformers
PC Chen, H Tsai, S Bhojanapalli, HW Chung, YW Chang, CS Ferng
arXiv preprint arXiv:2104.08698, 2021
Cited by 67 · 2021
Exact decoding of phrase-based translation models through Lagrangian relaxation
YW Chang
Massachusetts Institute of Technology, 2012
Cited by 59 · 2012
Optimal beam search for machine translation
AM Rush, YW Chang, M Collins
Proceedings of the 2013 Conference on Empirical Methods in Natural Language …, 2013
Cited by 37 · 2013
Leveraging redundancy in attention with reuse transformers
S Bhojanapalli, A Chakrabarti, A Veit, M Lukasik, H Jain, F Liu, YW Chang, ...
arXiv preprint arXiv:2110.06821, 2021
Cited by 21 · 2021
A constrained viterbi relaxation for bidirectional word alignment
YW Chang, AM Rush, J DeNero, M Collins
Proceedings of the 52nd Annual Meeting of the Association for Computational …, 2014
Cited by 14 · 2014
Demystifying the better performance of position encoding variants for transformer
PC Chen, H Tsai, S Bhojanapalli, HW Chung, YW Chang, CS Ferng
arXiv preprint arXiv:2104.08698 3 (7), 2021
Cited by 7 · 2021
A polynomial-time dynamic programming algorithm for phrase-based decoding with a fixed distortion limit
YW Chang, M Collins
Transactions of the Association for Computational Linguistics 5, 59-71, 2017
Cited by 5 · 2017
Low-degree polynomial mapping of data for SVM
Y Chang, C Hsieh, K Chang, M Ringgaard, C Lin
Journal of Machine Learning Research 11, 1-21, 2010
Cited by 5 · 2010
BEVOLO, AJ, 176
JC BOIVIN, WE BROWN, C CARCALY, A CHANG, R CHEVREL, ...
Journal of Solid State Chemistry 35, 407-408, 1980
Cited by 1 · 1980
Leveraging Redundancy in Attention with Reuse Transformers
VS Bhojanapalli, A Veit, A Chakrabarti, F Liu, H Jain, M Lukasik, S Kumar, ...
US Patent App. 17/960,380, 2023
2023
O(n) Connections are Expressive Enough: Universal Approximability of Sparse Transformers
AS Rawat, C Yun, S Kumar, S Reddi, S Bhojanapalli, YW Chang
2020
Source-Side Left-to-Right or Target-Side Left-to-Right? An Empirical Comparison of Two Phrase-Based Decoding Algorithms
YW Chang, M Collins
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
2017
Exact Decoding of Phrase-Based Translation Models through Lagrangian Relaxation: Supplementary Material
YW Chang, M Collins
Articles 1–17