Journal article

Dropout distillation for efficiently estimating model confidence

Abstract:

We propose an efficient way to output better calibrated uncertainty scores from neural networks. The Distilled Dropout Network (DDN) makes standard (non-Bayesian) neural networks more introspective by adding a new training loss which prevents them from being overconfident. Our method is more efficient than Bayesian neural networks or model ensembles which, despite providing more reliable uncertainty scores, are more cumbersome to train and slower to test. We evaluate DDN on the task of im...
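The abstract describes distilling the predictive behaviour of a dropout network into a single deterministic forward pass, so that calibrated confidences can be obtained without Monte Carlo sampling at test time. The record does not give the paper's exact loss, so the sketch below is only a minimal, illustrative reading: it assumes the student (dropout-off) output is trained to match the average of several dropout-on forward passes through the same network via a KL term added to the usual cross-entropy. The architecture, function names, and hyperparameters (n_samples, alpha) are assumptions, not the authors' implementation.

    # Illustrative sketch of dropout distillation (not the paper's exact method):
    # a "teacher" predictive distribution is formed by averaging Monte Carlo
    # dropout passes, and the deterministic "student" pass is regularised
    # towards it with a KL term on top of the standard cross-entropy.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SmallNet(nn.Module):
        def __init__(self, in_dim=784, hidden=256, classes=10, p=0.5):
            super().__init__()
            self.fc1 = nn.Linear(in_dim, hidden)
            self.p = p
            self.fc2 = nn.Linear(hidden, classes)

        def forward(self, x, use_dropout: bool):
            h = F.relu(self.fc1(x))
            if use_dropout:
                # Monte Carlo dropout: keep dropout active regardless of train/eval mode.
                h = F.dropout(h, p=self.p, training=True)
            return self.fc2(h)

    def distillation_loss(model, x, y, n_samples=8, alpha=0.5):
        """Cross-entropy on the labels plus KL(teacher || student), where the
        teacher is the mean of n_samples stochastic (dropout-on) passes."""
        with torch.no_grad():
            teacher_probs = torch.stack(
                [F.softmax(model(x, use_dropout=True), dim=1) for _ in range(n_samples)]
            ).mean(dim=0)
        student_logits = model(x, use_dropout=False)
        ce = F.cross_entropy(student_logits, y)
        kl = F.kl_div(F.log_softmax(student_logits, dim=1), teacher_probs,
                      reduction="batchmean")
        return ce + alpha * kl

    # Illustrative usage with random data.
    model = SmallNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
    loss = distillation_loss(model, x, y)
    loss.backward()
    opt.step()

Under this reading, the extra KL term is what discourages overconfident point predictions: the student is pulled towards the smoother averaged dropout distribution, while at test time only the single deterministic pass is needed.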

Publication status:
Accepted
Peer review status:
Peer reviewed

Authors


Institution:
University of Oxford
Division:
MPLS Division
Department:
Engineering Science
Oxford college:
Pembroke College
Role:
Author
Journal:
arXiv
Publication date:
2018-01-01
Acceptance date:
2018-09-27
Pubs id:
pubs:942840
URN:
uri:98f841eb-6f53-45d4-a58b-eb41d82c2f4b
UUID:
uuid:98f841eb-6f53-45d4-a58b-eb41d82c2f4b
Local pid:
pubs:942840
