Journal article
Mean field analysis of neural networks: a central limit theorem
- Abstract:
- We rigorously prove a central limit theorem for neural network models with a single hidden layer. The central limit theorem is proven in the asymptotic regime of simultaneously (A) large numbers of hidden units and (B) large numbers of stochastic gradient descent training iterations. Our result describes the neural network’s fluctuations around its mean-field limit. The fluctuations have a Gaussian distribution and satisfy a stochastic partial differential equation. The proof relies upon weak...
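To make the abstract's setting concrete, here is a hypothetical minimal sketch (not the authors' code) of a single-hidden-layer network in mean-field scaling, f(x) = (1/N) Σᵢ cᵢ σ(wᵢ·x), trained by SGD on a toy regression task. The 1/N output normalization, the N-scaled learning rate, and the toy target are assumptions of this illustration; the paper's central limit theorem concerns the regime where the number of hidden units N and the number of SGD steps both grow large.

```python
import numpy as np

rng = np.random.default_rng(0)

N, d = 1000, 3                       # hidden units, input dimension
c = rng.normal(size=N)               # output weights c_i
W = rng.normal(size=(N, d))          # hidden-layer weights w_i

def forward(x):
    # Mean-field normalization: the 1/N average, rather than the
    # 1/sqrt(N) scaling typical of standard initializations.
    return np.mean(c * np.tanh(W @ x))

def sgd_step(x, y, lr=0.1):
    """One SGD step on the squared loss (f(x) - y)^2 / 2.

    With the 1/N output scaling, each per-unit gradient is O(1/N), so
    the learning rate is taken to absorb a factor of N (an assumption
    of this sketch) and each unit then moves at O(1) speed.
    """
    global c, W
    h = np.tanh(W @ x)
    err = forward(x) - y
    grad_c = err * h                              # N * dLoss/dc_i
    grad_W = err * (c * (1 - h**2))[:, None] * x  # N * dLoss/dw_i
    c = c - lr * grad_c
    W = W - lr * grad_W
    return err**2

# Toy target (an assumption of this sketch): learn y = sin(x_0)
# from Gaussian samples, recording the per-step squared error.
losses = []
for _ in range(2000):
    x = rng.normal(size=d)
    losses.append(sgd_step(x, np.sin(x[0])))
```

The interesting object in the paper is not this single trajectory but the fluctuation of the empirical measure of the parameters (cᵢ, wᵢ) around its mean-field limit, which the abstract states is Gaussian and satisfies a stochastic partial differential equation.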
- Publication status:
- Published
- Peer review status:
- Peer reviewed
Bibliographic Details
- Publisher:
- Elsevier
- Journal:
- Stochastic Processes and their Applications
- Volume:
- 130
- Issue:
- 3
- Pages:
- 1820-1852
- Publication date:
- 2019-06-12
- Acceptance date:
- 2019-06-03
- DOI:
- 10.1016/j.spa.2019.06.003
- ISSN:
- 0304-4149
Item Description
- Language:
- English
- Pubs id:
- 1124852
- Local pid:
- pubs:1124852
- Deposit date:
- 2020-08-10
Terms of use
- Copyright holder:
- Elsevier B.V.
- Copyright date:
- 2019
- Rights statement:
- © 2019 Elsevier B.V. All rights reserved.
- Notes:
- This is the accepted manuscript version of the article. The final version is available from Elsevier at https://doi.org/10.1016/j.spa.2019.06.003