Conference item
Stable ResNet
- Abstract:
- Deep ResNet architectures have achieved state-of-the-art performance on many tasks. While they solve the vanishing gradient problem, they might suffer from exploding gradients as the depth becomes large. Moreover, recent results have shown that ResNet might lose expressivity as the depth goes to infinity [Yang and Schoenholz, 2017, Hayou et al., 2019a]. To resolve these issues, we introduce a new class of ResNet architectures, called Stable ResNet, that have the property of stabilizing the gradient while ensuring expressivity in the infinite depth limit.
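- Illustration (not from the record): the abstract's stabilization idea can be sketched by scaling each residual branch. A minimal numpy toy, assuming a per-block scaling of 1/sqrt(L) (one common choice consistent with the Stable ResNet idea; the block function, width, and depth below are illustrative assumptions, not the paper's experiments):

```python
import numpy as np

# Toy residual network: x <- x + scale * tanh(W @ x) repeated over L blocks.
# scale = 1.0 mimics a plain ResNet; scale = 1/sqrt(L) mimics the scaled
# ("stable") variant, which keeps activation norms from blowing up with depth.

rng = np.random.default_rng(0)

def resnet_forward(x, weights, scale):
    """Forward pass through a stack of scaled residual blocks."""
    for W in weights:
        x = x + scale * np.tanh(W @ x)
    return x

L, d = 200, 64  # illustrative depth and width
weights = [rng.normal(0.0, 1.0 / np.sqrt(d), size=(d, d)) for _ in range(L)]
x0 = rng.normal(size=d)

unscaled = resnet_forward(x0, weights, scale=1.0)
stable = resnet_forward(x0, weights, scale=1.0 / np.sqrt(L))

# The scaled network's output norm stays close to the input norm, while the
# unscaled one grows substantially with depth.
print(np.linalg.norm(unscaled) / np.linalg.norm(x0))
print(np.linalg.norm(stable) / np.linalg.norm(x0))
```

This only demonstrates the forward-pass effect of branch scaling; the paper's actual contribution concerns gradient stability and expressivity in the infinite-depth limit.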
- Publication status:
- Published
- Peer review status:
- Peer reviewed
- Publisher:
- Journal of Machine Learning Research
- Host title:
- Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
- Pages:
- 1324-1332
- Series:
- Proceedings of Machine Learning Research
- Series number:
- 130
- Publication date:
- 2021-03-29
- Acceptance date:
- 2021-01-22
- Event title:
- 24th International Conference on Artificial Intelligence and Statistics (AISTATS 2021)
- Event location:
- Virtual event
- Event website:
- https://aistats.org/aistats2021/
- Event start date:
- 2021-04-13
- Event end date:
- 2021-04-15
- ISSN:
- 2640-3498
- Language:
- English
- Keywords:
- Pubs id:
- 1192441
- Local pid:
- pubs:1192441
- Deposit date:
- 2022-03-10
Terms of use
- Copyright holder:
- Hayou et al.
- Copyright date:
- 2021
- Rights statement:
- © The Author(s) 2021.
- Notes:
- This is the accepted version of the conference paper. The final version is available from the Proceedings of Machine Learning Research at http://proceedings.mlr.press/v130/