Conference item
Data parallelism in training sparse neural networks
- Abstract:
- Network pruning is an effective methodology for compressing large neural networks, and the sparse networks obtained by pruning benefit from reduced memory and computational costs when deployed. Notably, recent advances have shown that it is possible to find a trainable sparse neural network even at random initialization, prior to training, so that only the resulting sparse network needs to be trained. While this approach of pruning at initialization has proved highly effective, little has...
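For reference, the sketch below illustrates pruning at initialization in general terms: weights are scored by a simple gradient-based saliency computed on one mini-batch, and all but the top-scoring fraction are masked out before training begins. This is a hypothetical PyTorch illustration under assumed names (`model`, `x_batch`, `y_batch`, `sparsity`), not the specific criterion, data-parallel scheme, or experimental setup of this paper.

```python
# Hypothetical sketch of pruning at initialization (generic saliency criterion,
# not necessarily the one studied in this paper).
import torch
import torch.nn.functional as F

def prune_at_init(model, inputs, targets, sparsity=0.9):
    """Keep the (1 - sparsity) fraction of weights with the largest
    |weight * gradient| saliency, computed on a single mini-batch."""
    weights = [p for p in model.parameters() if p.dim() > 1]  # prunable tensors
    loss = F.cross_entropy(model(inputs), targets)
    grads = torch.autograd.grad(loss, weights)

    # Score every weight and find the global threshold for the kept fraction.
    saliency = torch.cat([(w * g).abs().flatten() for w, g in zip(weights, grads)])
    k = max(int((1.0 - sparsity) * saliency.numel()), 1)
    threshold = torch.topk(saliency, k).values.min()

    # Build binary masks and zero out pruned weights before training starts.
    masks = [((w * g).abs() >= threshold).float() for w, g in zip(weights, grads)]
    with torch.no_grad():
        for w, m in zip(weights, masks):
            w.mul_(m)
    return masks  # re-apply these masks after every optimizer step during training

# Example usage (assumed objects):
# masks = prune_at_init(model, x_batch, y_batch, sparsity=0.9)
```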
- Publication status:
- Published
- Peer review status:
- Peer reviewed
Access Document
- Files:
- Accepted manuscript (791.9KB)
- Publication website:
- https://pml4dc.github.io/iclr2020/program/pml4dc_22.html
Authors
Bibliographic Details
- Publisher:
- ICLR
- Publication date:
- 2020-04-26
- Event title:
- ICLR 2020 Workshop: Practical ML for Developing Countries: learning under limited/low resource scenarios
- Event location:
- Addis Ababa, Ethiopia
- Event website:
- https://pml4dc.github.io/iclr2020/
- Event start date:
- 2020-04-26
- Event end date:
- 2020-04-26
Item Description
- Language:
- English
- Pubs id:
- 1147407
- Local pid:
- pubs:1147407
- Deposit date:
- 2020-12-01
Terms of use
- Copyright holder:
- Lee et al.
- Copyright date:
- 2020
- Notes:
- This is the accepted manuscript version of the paper, available online at: https://pml4dc.github.io/iclr2020/program/pml4dc_22.html