Theodor Misiakiewicz
Assistant Professor, Yale University
Verified email at yale.edu - Homepage
Title | Cited by | Year
Mean-field theory of two-layers neural networks: dimension-free bounds and kernel limit
S Mei, T Misiakiewicz, A Montanari
Conference on learning theory, 2388-2464, 2019
319 | 2019
Linearized two-layers neural networks in high dimension
B Ghorbani, S Mei, T Misiakiewicz, A Montanari
271 | 2021
When do neural networks outperform kernel methods?
B Ghorbani, S Mei, T Misiakiewicz, A Montanari
Advances in Neural Information Processing Systems 33, 14820-14830, 2020
211 | 2020
Limitations of lazy training of two-layers neural network
B Ghorbani, S Mei, T Misiakiewicz, A Montanari
Advances in Neural Information Processing Systems 32, 2019
158 | 2019
Generalization error of random feature and kernel methods: hypercontractivity and kernel matrix concentration
S Mei, T Misiakiewicz, A Montanari
Applied and Computational Harmonic Analysis 59, 3-84, 2022
145 | 2022
The merged-staircase property: a necessary and nearly sufficient condition for SGD learning of sparse functions on two-layer neural networks
E Abbe, EB Adsera, T Misiakiewicz
Conference on Learning Theory, 4782-4887, 2022
122 | 2022
Learning with invariances in random features and kernel models
S Mei, T Misiakiewicz, A Montanari
Conference on Learning Theory, 3351-3418, 2021
86 | 2021
SGD learning on neural networks: leap complexity and saddle-to-saddle dynamics
E Abbe, EB Adsera, T Misiakiewicz
The Thirty Sixth Annual Conference on Learning Theory, 2552-2623, 2023
81 | 2023
Solving SDPs for synchronization and MaxCut problems via the Grothendieck inequality
S Mei, T Misiakiewicz, A Montanari, RI Oliveira
Conference on learning theory, 1476-1515, 2017
80 | 2017
Precise Learning Curves and Higher-Order Scaling Limits for Dot Product Kernel Regression
L Xiao, H Hu, T Misiakiewicz, YM Lu, J Pennington
Journal of Statistical Mechanics: Theory and Experiment 2023 (11), 114005, 2023
52* | 2023
Spectrum of inner-product kernel matrices in the polynomial regime and multiple descent phenomenon in kernel ridge regression
T Misiakiewicz
arXiv preprint arXiv:2204.10425, 2022
45 | 2022
Learning with convolution and pooling operations in kernel methods
T Misiakiewicz, S Mei
Advances in Neural Information Processing Systems 35, 29014-29025, 2022
25 | 2022
Asymptotics of random feature regression beyond the linear scaling regime
H Hu, YM Lu, T Misiakiewicz
arXiv preprint arXiv:2403.08160, 2024
13 | 2024
Six lectures on linearized neural networks
T Misiakiewicz, A Montanari
Journal of Statistical Mechanics: Theory and Experiment 2024 (10), 104006, 2024
11 | 2024
Discussion of: "Nonparametric regression using deep neural networks with ReLU activation function"
B Ghorbani, S Mei, T Misiakiewicz, A Montanari
11 | 2020
Efficient reconstruction of transmission probabilities in a spreading process from partial observations
AY Lokhov, T Misiakiewicz
arXiv preprint arXiv:1509.06893, 2015
9 | 2015
Minimum complexity interpolation in random features models
M Celentano, T Misiakiewicz, A Montanari
arXiv preprint arXiv:2103.15996, 2021
8 | 2021
A non-asymptotic theory of Kernel Ridge Regression: deterministic equivalents, test error, and GCV estimator
T Misiakiewicz, B Saeed
arXiv preprint arXiv:2403.08938, 2024
7 | 2024
Dimension-free deterministic equivalents for random feature regression
L Defilippis, B Loureiro, T Misiakiewicz
arXiv preprint arXiv:2405.15699, 2024
5 | 2024
On the complexity of learning sparse functions with statistical and gradient queries
N Joshi, T Misiakiewicz, N Srebro
arXiv preprint arXiv:2407.05622, 2024
2 | 2024
Articles 1–20