Jaehoon Lee
Google Brain
Verified email at google.com - Homepage
Title
Cited by
Year
Deep Neural Networks as Gaussian Processes
J Lee*, Y Bahri*, R Novak, SS Schoenholz, J Pennington, ...
International Conference on Learning Representations (ICLR), 2018
Cited by 1162 · 2018
Wide neural networks of any depth evolve as linear models under gradient descent
J Lee*, L Xiao*, SS Schoenholz, Y Bahri, J Sohl-Dickstein, J Pennington
Neural Information Processing Systems (NeurIPS), 2019
Cited by 989 · 2019
Beyond the imitation game: Quantifying and extrapolating the capabilities of language models
A Srivastava, A Rastogi, A Rao, AAM Shoeb, A Abid, A Fisch, AR Brown, ...
TMLR 2023, 2022
Cited by 727 · 2022
Measuring the effects of data parallelism on neural network training
CJ Shallue*, J Lee*, J Antognini, J Sohl-Dickstein, R Frostig, GE Dahl
Journal of Machine Learning Research 20, 1-49, 2019
Cited by 416 · 2019
Bayesian Deep Convolutional Neural Networks with Many Channels are Gaussian Processes
R Novak*, L Xiao*, J Lee, Y Bahri, G Yang, D Abolafia, J Pennington, ...
International Conference on Learning Representations (ICLR), 2019
Cited by 339* · 2019
On empirical comparisons of optimizers for deep learning
D Choi, CJ Shallue, Z Nado, J Lee, CJ Maddison, GE Dahl
arXiv preprint arXiv:1910.05446, 2019
Cited by 323 · 2019
Neural tangents: Fast and easy infinite neural networks in python
R Novak*, L Xiao*, J Hron, J Lee, AA Alemi, J Sohl-Dickstein, ...
International Conference on Learning Representations (ICLR), Spotlight, 2020
Cited by 235 · 2020
Finite versus infinite neural networks: an empirical study
J Lee, SS Schoenholz, J Pennington, B Adlam, L Xiao, R Novak, ...
Neural Information Processing Systems (NeurIPS), Spotlight, 2020
Cited by 182 · 2020
Dataset Distillation with Infinitely Wide Convolutional Networks
T Nguyen, R Novak, L Xiao, J Lee
Neural Information Processing Systems (NeurIPS), 2021
Cited by 179 · 2021
Dataset Meta-Learning from Kernel Ridge-Regression
T Nguyen, Z Chen, J Lee
International Conference on Learning Representations (ICLR), 2021
Cited by 172 · 2021
The superconformal bootstrap in three dimensions
SM Chester, J Lee, SS Pufu, R Yacoby
Journal of High Energy Physics 2014 (9), 1-59, 2014
Cited by 167 · 2014
Exact correlators of BPS operators from the 3d superconformal bootstrap
SM Chester, J Lee, SS Pufu, R Yacoby
Journal of High Energy Physics 2015 (3), 1-55, 2015
Cited by 149 · 2015
Explaining neural scaling laws
Y Bahri*, E Dyer*, J Kaplan*, J Lee*, U Sharma*
arXiv preprint arXiv:2102.06701, 2021
Cited by 126 · 2021
On the infinite width limit of neural networks with a standard parameterization
J Sohl-Dickstein, R Novak, SS Schoenholz, J Lee
arXiv preprint arXiv:2001.07301, 2020
Cited by 48 · 2020
Towards NNGP-guided Neural Architecture Search
DS Park*, J Lee*, D Peng, Y Cao, J Sohl-Dickstein
arXiv preprint arXiv:2011.06006, 2020
Cited by 34 · 2020
3d minimal SCFTs from wrapped M5-branes
JB Bae, D Gang, J Lee
Journal of High Energy Physics 2017 (8), 118, 2017
Cited by 32 · 2017
Algebra of Majorana doubling
J Lee, F Wilczek
Physical Review Letters 111 (22), 226402, 2013
Cited by 32 · 2013
GLSMs for non-Kähler geometries
A Adams, E Dyer, J Lee
Journal of High Energy Physics 2013 (1), 1-39, 2013
Cited by 31 · 2013
Entanglement entropy from one-point functions in holographic states
MJS Beach, J Lee, C Rabideau, M Van Raamsdonk
Journal of High Energy Physics 2016 (6), 1-29, 2016
Cited by 29 · 2016
Exploring the Uncertainty Properties of Neural Networks’ Implicit Priors in the Infinite-Width Limit
B Adlam*, J Lee*, L Xiao*, J Pennington, J Snoek
International Conference on Learning Representations (ICLR), 2021
Cited by 17 · 2021