Sushrut Karmalkar
University of Wisconsin-Madison
Verified email at cs.utexas.edu - Homepage
Title
Cited by
Year
List-decodable linear regression
S Karmalkar, A Klivans, P Kothari
Advances in neural information processing systems 32, 2019
94 · 2019
Superpolynomial lower bounds for learning one-layer neural networks using gradient descent
S Goel, A Gollakota, Z Jin, S Karmalkar, A Klivans
International Conference on Machine Learning, 3587-3596, 2020
85 · 2020
Approximation schemes for ReLU regression
I Diakonikolas, S Goel, S Karmalkar, AR Klivans, M Soltanolkotabi
Conference on Learning Theory, 1452-1485, 2020
62 · 2020
Time/accuracy tradeoffs for learning a ReLU with respect to Gaussian marginals
S Goel, S Karmalkar, A Klivans
Advances in neural information processing systems 32, 2019
60 · 2019
Robustly learning any clusterable mixture of Gaussians
I Diakonikolas, SB Hopkins, D Kane, S Karmalkar
arXiv preprint arXiv:2005.06417, 2020
56 · 2020
Fairness for image generation with uncertain sensitive attributes
A Jalal, S Karmalkar, J Hoffmann, A Dimakis, E Price
International Conference on Machine Learning, 4721-4732, 2021
53 · 2021
Instance-optimal compressed sensing via posterior sampling
A Jalal, S Karmalkar, AG Dimakis, E Price
arXiv preprint arXiv:2106.11438, 2021
49 · 2021
Outlier-robust high-dimensional sparse estimation via iterative filtering
I Diakonikolas, D Kane, S Karmalkar, E Price, A Stewart
Advances in Neural Information Processing Systems 32, 2019
45 · 2019
Compressed sensing with adversarial sparse noise via L1 regression
S Karmalkar, E Price
arXiv preprint arXiv:1809.08055, 2018
40 · 2018
Outlier-robust clustering of Gaussians and other non-spherical mixtures
A Bakshi, I Diakonikolas, SB Hopkins, D Kane, S Karmalkar, PK Kothari
2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS …, 2020
37 · 2020
Robust sparse mean estimation via sum of squares
I Diakonikolas, DM Kane, S Karmalkar, A Pensia, T Pittas
Conference on Learning Theory, 4703-4763, 2022
28 · 2022
On the power of compressed sensing with generative models
A Kamath, E Price, S Karmalkar
International Conference on Machine Learning, 5101-5109, 2020
19 · 2020
Robust polynomial regression up to the information theoretic limit
D Kane, S Karmalkar, E Price
2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS …, 2017
19 · 2017
Lower bounds for compressed sensing with generative models
A Kamath, S Karmalkar, E Price
arXiv preprint arXiv:1912.02938, 2019
16 · 2019
Multi-model 3D registration: Finding multiple moving objects in cluttered point clouds
D Jin, S Karmalkar, H Zhang, L Carlone
2024 IEEE International Conference on Robotics and Automation (ICRA), 4990-4997, 2024
13 · 2024
List-decodable sparse mean estimation via difference-of-pairs filtering
I Diakonikolas, D Kane, S Karmalkar, A Pensia, T Pittas
Advances in Neural Information Processing Systems 35, 13947-13960, 2022
13 · 2022
Fourier entropy-influence conjecture for random linear threshold functions
S Chakraborty, S Karmalkar, S Kundu, SV Lokam, N Saurabh
LATIN 2018: Theoretical Informatics: 13th Latin American Symposium, Buenos …, 2018
5 · 2018
Compressed sensing with approximate priors via conditional resampling
A Jalal, S Karmalkar, A Dimakis, E Price
NeurIPS 2020 Workshop on Deep Learning and Inverse Problems, 2020
4 · 2020
Sum-of-squares lower bounds for non-Gaussian component analysis
I Diakonikolas, S Karmalkar, S Pang, A Potechin
2024 IEEE 65th Annual Symposium on Foundations of Computer Science (FOCS …, 2024
2 · 2024
Distribution-independent regression for generalized linear models with oblivious corruptions
I Diakonikolas, S Karmalkar, JH Park, C Tzamos
The Thirty Sixth Annual Conference on Learning Theory, 5453-5475, 2023
2 · 2023
Articles 1–20