Albert S. Berahas
Assistant Professor, University of Michigan
Verified email at umich.edu - Homepage
Title
Cited by
Year
A multi-batch L-BFGS method for machine learning
AS Berahas, J Nocedal, M Takáč
Advances in Neural Information Processing Systems 29, 2016
140 · 2016
An investigation of Newton-sketch and subsampled Newton methods
AS Berahas, R Bollapragada, J Nocedal
Optimization Methods and Software 35 (4), 661-680, 2020
112 · 2020
A theoretical and empirical comparison of gradient approximations in derivative-free optimization
AS Berahas, L Cao, K Choromanski, K Scheinberg
Foundations of Computational Mathematics 22 (2), 507-560, 2022
109 · 2022
Balancing communication and computation in distributed optimization
AS Berahas, R Bollapragada, NS Keskar, E Wei
IEEE Transactions on Automatic Control 64 (8), 3141-3155, 2018
104 · 2018
Derivative-free optimization of noisy functions via quasi-Newton methods
AS Berahas, RH Byrd, J Nocedal
SIAM Journal on Optimization 29 (2), 965-993, 2019
81 · 2019
Quasi-Newton methods for machine learning: forget the past, just sample
AS Berahas, M Jahani, P Richtárik, M Takáč
Optimization Methods and Software 37 (5), 1668-1704, 2022
72* · 2022
Global convergence rate analysis of a generic line search algorithm with noise
AS Berahas, L Cao, K Scheinberg
SIAM Journal on Optimization 31 (2), 1489-1518, 2021
55 · 2021
adaQN: An adaptive quasi-Newton algorithm for training RNNs
NS Keskar, AS Berahas
Machine Learning and Knowledge Discovery in Databases: European Conference …, 2016
46 · 2016
Sequential quadratic optimization for nonlinear equality constrained stochastic optimization
AS Berahas, FE Curtis, D Robinson, B Zhou
SIAM Journal on Optimization 31 (2), 1352-1379, 2021
38 · 2021
A stochastic sequential quadratic optimization algorithm for nonlinear equality constrained optimization with rank-deficient Jacobians
AS Berahas, FE Curtis, MJ O'Neill, DP Robinson
arXiv preprint arXiv:2106.13015, 2021
21 · 2021
Scaling up quasi-Newton algorithms: Communication efficient distributed SR1
M Jahani, M Nazari, S Rusakov, AS Berahas, M Takáč
Machine Learning, Optimization, and Data Science: 6th International …, 2020
18 · 2020
On the convergence of nested decentralized gradient methods with multiple consensus and gradient steps
AS Berahas, R Bollapragada, E Wei
IEEE Transactions on Signal Processing 69, 4192-4203, 2021
15 · 2021
Linear interpolation gives better gradients than Gaussian smoothing in derivative-free optimization
AS Berahas, L Cao, K Choromanski, K Scheinberg
arXiv preprint arXiv:1905.13043, 2019
15 · 2019
Nested distributed gradient methods with adaptive quantized communication
AS Berahas, C Iakovidou, E Wei
2019 IEEE 58th Conference on Decision and Control (CDC), 1519-1525, 2019
14 · 2019
Sparse representation and least squares-based classification in face recognition
M Iliadis, L Spinoulas, AS Berahas, H Wang, AK Katsaggelos
2014 22nd European Signal Processing Conference (EUSIPCO), 526-530, 2014
13 · 2014
SONIA: a symmetric blockwise truncated optimization algorithm
M Jahani, M Nazari, R Tappenden, A Berahas, M Takáč
International Conference on Artificial Intelligence and Statistics, 487-495, 2021
12 · 2021
Accelerating stochastic sequential quadratic programming for equality constrained optimization using predictive variance reduction
AS Berahas, J Shi, Z Yi, B Zhou
Computational Optimization and Applications, 1-38, 2023
10 · 2023
Modeling and predicting heavy-duty vehicle engine-out and tailpipe nitrogen oxide (NOx) emissions using deep learning
R Pillai, V Triantopoulos, AS Berahas, M Brusstar, R Sun, T Nevius, ...
Frontiers in Mechanical Engineering 8, 11, 2022
10 · 2022
Limited-memory BFGS with displacement aggregation
AS Berahas, FE Curtis, B Zhou
Mathematical Programming 194 (1-2), 121-157, 2022
9 · 2022
Finite difference neural networks: Fast prediction of partial differential equations
Z Shi, NS Gulgec, AS Berahas, SN Pakzad, M Takáč
2020 19th IEEE International Conference on Machine Learning and Applications …, 2020
8 · 2020
Articles 1–20