Gemini Team, R Anil, S Borgeaud, JB Alayrac, J Yu, R Soricut, J Schalkwyk, et al. "Gemini: A family of highly capable multimodal models." arXiv preprint arXiv:2312.11805, 2023. Cited by 3417.
R Anil, AM Dai, O Firat, M Johnson, D Lepikhin, A Passos, S Shakeri, et al. "PaLM 2 technical report." arXiv preprint arXiv:2305.10403, 2023. Cited by 1619.
Gemini Team, P Georgiev, VI Lei, R Burnell, L Bai, A Gulati, G Tanzer, et al. "Gemini 1.5: Unlocking multimodal understanding across millions of tokens of context." arXiv preprint arXiv:2403.05530, 2024. Cited by 1305.
M Iyyer, J Wieting, K Gimpel, L Zettlemoyer. "Adversarial example generation with syntactically controlled paraphrase networks." arXiv preprint arXiv:1804.06059, 2018. Cited by 827.
J Wieting, M Bansal, K Gimpel, K Livescu. "Towards universal paraphrastic sentence embeddings." arXiv preprint arXiv:1511.08198, 2015. Cited by 756.
J Wieting, K Gimpel. "ParaNMT-50M: Pushing the limits of paraphrastic sentence embeddings with millions of machine translations." arXiv preprint arXiv:1711.05732, 2017. Cited by 396.
J Wieting, M Bansal, K Gimpel, K Livescu. "From paraphrase database to compositional paraphrase model and back." Transactions of the Association for Computational Linguistics 3, 345-358, 2015. Cited by 328.
K Krishna, Y Song, M Karpinska, J Wieting, M Iyyer. "Paraphrasing evades detectors of AI-generated text, but retrieval is an effective defense." Advances in Neural Information Processing Systems 36, 27469-27500, 2023. Cited by 300.
K Krishna, J Wieting, M Iyyer. "Reformulating unsupervised style transfer as paraphrase generation." arXiv preprint arXiv:2010.05700, 2020. Cited by 282.
J Wieting, M Bansal, K Gimpel, K Livescu. "Charagram: Embedding words and sentences via character n-grams." arXiv preprint arXiv:1607.02789, 2016. Cited by 256.
JH Clark, D Garrette, I Turc, J Wieting. "Canine: Pre-training an Efficient Tokenization-Free Encoder for Language Representation." Transactions of the Association for Computational Linguistics 10, 73-91, 2022. Cited by 242.
J Wieting, T Berg-Kirkpatrick, K Gimpel, G Neubig. "Beyond BLEU: Training neural machine translation with semantic similarity." arXiv preprint arXiv:1909.06694, 2019. Cited by 182.
G Neubig, ZY Dou, J Hu, P Michel, D Pruthi, X Wang, J Wieting. "compare-mt: A tool for holistic comparison of language generation systems." arXiv preprint arXiv:1903.07926, 2019. Cited by 136.
J Wieting, D Kiela. "No training required: Exploring random encoders for sentence classification." arXiv preprint arXiv:1901.10444, 2019. Cited by 131.
J Wieting, J Mallinson, K Gimpel. "Learning paraphrastic sentence embeddings from back-translated bitext." arXiv preprint arXiv:1706.01847, 2017. Cited by 121.
J Wieting, K Gimpel. "Revisiting recurrent networks for paraphrastic sentence embeddings." arXiv preprint arXiv:1705.00364, 2017. Cited by 105.
K Krishna, Y Chang, J Wieting, M Iyyer. "RankGen: Improving text generation with large ranking models." arXiv preprint arXiv:2205.09726, 2022. Cited by 65.
J Sun, Y Tian, W Zhou, N Xu, Q Hu, R Gupta, JF Wieting, N Peng, X Ma. "Evaluating large language models on controlled generation tasks." arXiv preprint arXiv:2310.14542, 2023. Cited by 55.
J Wieting, K Gimpel, G Neubig, T Berg-Kirkpatrick. "Simple and effective paraphrastic similarity from parallel translations." arXiv preprint arXiv:1909.13872, 2019. Cited by 50.
K Thai, M Karpinska, K Krishna, B Ray, M Inghilleri, J Wieting, M Iyyer. "Exploring document-level literary machine translation with parallel paragraphs from world literature." arXiv preprint arXiv:2210.14250, 2022. Cited by 46.