Program synthesis with large language models J Austin, A Odena, M Nye, M Bosma, H Michalewski, D Dohan, E Jiang, ... arXiv preprint arXiv:2108.07732, 2021 | 1221 | 2021 |
Show your work: Scratchpads for intermediate computation with language models M Nye, AJ Andreassen, G Gur-Ari, H Michalewski, J Austin, D Bieber, ... arXiv preprint arXiv:2112.00114, 2021 | 542 | 2021 |
DreamCoder: growing generalizable, interpretable knowledge with wake–sleep Bayesian program learning K Ellis, L Wong, M Nye, M Sable-Meyer, L Cary, L Anaya Pozo, L Hewitt, ... Philosophical Transactions of the Royal Society A 381 (2251), 20220050, 2023 | 232 | 2023 |
DreamCoder: Bootstrapping inductive program synthesis with wake-sleep library learning K Ellis, C Wong, M Nye, M Sablé-Meyer, L Morales, L Hewitt, L Cary, ... Proceedings of the 42nd ACM SIGPLAN International Conference on Programming …, 2021 | 193 | 2021 |
Write, execute, assess: Program synthesis with a repl K Ellis, M Nye, Y Pu, F Sosa, J Tenenbaum, A Solar-Lezama Advances in Neural Information Processing Systems 32, 2019 | 166 | 2019 |
Implicit representations of meaning in neural language models BZ Li, M Nye, J Andreas arXiv preprint arXiv:2106.00737, 2021 | 153 | 2021 |
Learning compositional rules via neural program synthesis M Nye, A Solar-Lezama, J Tenenbaum, BM Lake Advances in Neural Information Processing Systems 33, 10832-10842, 2020 | 125 | 2020 |
Learning to infer program sketches M Nye, L Hewitt, J Tenenbaum, A Solar-Lezama International Conference on Machine Learning, 4861-4870, 2019 | 124 | 2019 |
Improving coherence and consistency in neural sequence models with dual-system, neuro-symbolic reasoning M Nye, M Tessler, J Tenenbaum, BM Lake Advances in Neural Information Processing Systems 34, 25192-25204, 2021 | 118 | 2021 |
The variational homoencoder: Learning to learn high capacity generative models from few examples LB Hewitt, MI Nye, A Gane, T Jaakkola, JB Tenenbaum arXiv preprint arXiv:1807.08919, 2018 | 76 | 2018 |
Communicating natural programs to humans and machines S Acquaviva, Y Pu, M Kryven, T Sechopoulos, C Wong, G Ecanow, M Nye, ... Advances in Neural Information Processing Systems 35, 3731-3743, 2022 | 61 | 2022 |
Introducing our multimodal models, 2023 R Bavishi, E Elsen, C Hawthorne, M Nye, A Odena, A Somani, S Taşırlar URL https://www.adept.ai/blog/fuyu-8b | 60 | 2023 |
Representing partial programs with blended abstract semantics M Nye, Y Pu, M Bowers, J Andreas, JB Tenenbaum, A Solar-Lezama arXiv preprint arXiv:2012.12964, 2020 | 25 | 2020 |
A large-scale benchmark for few-shot program induction and synthesis F Alet, J Lopez-Contreras, J Koppel, M Nye, A Solar-Lezama, ... International Conference on Machine Learning, 175-186, 2021 | 23 | 2021 |
Are efficient deep representations learnable? M Nye, A Saxe arXiv preprint arXiv:1807.06399, 2018 | 22 | 2018 |
Language modeling with latent situations BZ Li, M Nye, J Andreas arXiv preprint arXiv:2212.10012, 2022 | 8 | 2022 |