Mert Pilanci
Title · Cited by · Year
Newton sketch: A near linear-time optimization algorithm with linear-quadratic convergence
M Pilanci, MJ Wainwright
SIAM Journal on Optimization 27 (1), 205-245, 2017
Cited by: 249 · 2017
Iterative Hessian sketch: Fast and accurate solution approximation for constrained least-squares
M Pilanci, MJ Wainwright
The Journal of Machine Learning Research 17 (1), 1842-1879, 2016
Cited by: 203 · 2016
Randomized sketches of convex programs with sharp guarantees
M Pilanci, MJ Wainwright
IEEE Transactions on Information Theory 61 (9), 5096-5115, 2015
Cited by: 175 · 2015
Randomized sketches for kernels: Fast and optimal nonparametric regression
Y Yang, M Pilanci, MJ Wainwright
The Annals of Statistics 45 (3), 991-1023, 2017
Cited by: 149 · 2017
Sparse learning via Boolean relaxations
M Pilanci, MJ Wainwright, L El Ghaoui
Mathematical Programming 151 (1), 63-87, 2015
Cited by: 66 · 2015
Recovery of sparse probability measures via convex programming
M Pilanci, L El Ghaoui, V Chandrasekaran
Advances in Neural Information Processing Systems 25, 2012
Cited by: 63 · 2012
Neural networks are convex regularizers: Exact polynomial-time convex optimization formulations for two-layer networks
M Pilanci, T Ergen
International Conference on Machine Learning, 7695-7705, 2020
Cited by: 46 · 2020
Randomized sketches for kernels: Fast and optimal non-parametric regression
Y Yang, M Pilanci, MJ Wainwright
arXiv preprint arXiv:1501.06195, 2015
Cited by: 35 · 2015
Revealing the structure of deep neural networks via convex duality
T Ergen, M Pilanci
International Conference on Machine Learning, 3004-3014, 2021
Cited by: 32 · 2021
Structured least squares problems and robust estimators
M Pilanci, O Arikan, MC Pinar
IEEE Transactions on Signal Processing 58 (5), 2453-2465, 2010
Cited by: 28 · 2010
Implicit convex regularizers of CNN architectures: Convex optimization of two- and three-layer networks in polynomial time
T Ergen, M Pilanci
arXiv preprint arXiv:2006.14798, 2020
Cited by: 26 · 2020
Vector-output ReLU neural network problems are copositive programs: Convex analysis of two-layer networks and polynomial-time algorithms
A Sahiner, T Ergen, J Pauly, M Pilanci
arXiv preprint arXiv:2012.13329, 2020
Cited by: 24 · 2020
Optimal randomized first-order methods for least-squares problems
J Lacotte, M Pilanci
International Conference on Machine Learning, 5587-5597, 2020
Cited by: 24 · 2020
Convex geometry and duality of over-parameterized neural networks
T Ergen, M Pilanci
Journal of Machine Learning Research, 2021
Cited by: 23 · 2021
Convex geometry of two-layer ReLU networks: Implicit autoencoding and interpretable models
T Ergen, M Pilanci
International Conference on Artificial Intelligence and Statistics, 4024-4033, 2020
Cited by: 21 · 2020
Effective dimension adaptive sketching methods for faster regularized least-squares optimization
J Lacotte, M Pilanci
Advances in Neural Information Processing Systems 33, 19377-19387, 2020
Cited by: 21 · 2020
Debiasing distributed second order optimization with surrogate sketching and scaled regularization
M Derezinski, B Bartan, M Pilanci, MW Mahoney
Advances in Neural Information Processing Systems 33, 6684-6695, 2020
Cited by: 19 · 2020
All local minima are global for two-layer ReLU neural networks: The hidden convex optimization landscape
J Lacotte, M Pilanci
arXiv preprint arXiv:2006.05900, 2020
Cited by: 18 · 2020
Faster least squares optimization
J Lacotte, M Pilanci
arXiv preprint arXiv:1911.02675, 2019
Cited by: 18 · 2019
Expectation maximization based matching pursuit
AC Gurbuz, M Pilanci, O Arikan
2012 IEEE International Conference on Acoustics, Speech and Signal …, 2012
Cited by: 18 · 2012
Articles 1–20