Research Repository

Selective dropout for deep neural networks.

Barrow, Erik; Eastwood, Mark; Jayne, Chrisina

Authors

Erik Barrow

Mark Eastwood

Chrisina Jayne



Contributors

Akira Hirose
Editor

Seiichi Ozawa
Editor

Kenji Doya
Editor

Kazushi Ikeda
Editor

Minho Lee
Editor

Derong Liu
Editor

Abstract

Dropout has been proven to be an effective method for reducing overfitting in deep artificial neural networks. We present three new alternative methods for performing dropout on a deep neural network, each of which improves the effectiveness of the dropout method over the same training period. These methods select neurons to be dropped using statistics calculated from a neuron's change in weight, the average size of a neuron's weights, and the output variance of a neuron. We found that increasing the probability of dropping neurons with smaller values of these statistics, and decreasing the probability for those with larger values, gave an improved result in training over 10,000 epochs. The most effective of these was the Output Variance method, giving an average improvement of 1.17% accuracy over traditional dropout methods.
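
The abstract describes the selection rule (neurons with smaller statistic values are dropped with higher probability) but not the exact mapping from statistic to probability. The sketch below is a minimal, illustrative take on the output-variance variant, assuming a linear mapping from variance rank to dropout probability centred on the usual dropout rate; the function and parameter names (selective_dropout_mask, base_rate) are hypothetical and not taken from the paper.

```python
# Illustrative sketch of variance-guided selective dropout (not the paper's
# exact formulation): low-variance neurons are dropped more often.
import numpy as np

def selective_dropout_mask(activations, base_rate=0.5, rng=None):
    """Build a dropout mask where low-variance neurons are dropped more often.

    activations : (batch, n_neurons) array of a hidden layer's outputs.
    base_rate   : average fraction of neurons to drop (as in standard dropout).
    """
    rng = np.random.default_rng() if rng is None else rng
    stat = activations.var(axis=0)        # per-neuron output variance
    ranks = stat.argsort().argsort()      # rank 0 = smallest variance
    n = stat.size
    # Spread drop probabilities linearly around base_rate: the smallest-variance
    # neuron gets the highest probability, the largest-variance the lowest.
    spread = np.linspace(base_rate * 1.5, base_rate * 0.5, n)
    drop_prob = np.clip(spread[ranks], 0.0, 1.0)
    keep = rng.random(n) >= drop_prob     # True = neuron kept this pass
    # Rescale kept neurons so the layer's expected output is unchanged.
    scale = 1.0 / np.maximum(1.0 - drop_prob, 1e-8)
    return keep.astype(activations.dtype) * scale

# Usage: apply the mask to a layer's activations during a training step.
h = np.random.randn(128, 256)             # fake batch of hidden activations
h_dropped = h * selective_dropout_mask(h)
```

The rank-based linear spread keeps the average dropout rate near base_rate while still biasing removal toward low-variance neurons; other monotone mappings of the statistic would serve the same purpose.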

Citation

BARROW, E., EASTWOOD, M. and JAYNE, C. 2016. Selective dropout for deep neural networks. In Hirose, A., Ozawa, S., Doya, K., Ikeda, K., Lee, M. and Liu, D. (eds.) Neural information processing: Proceedings of the 23rd International conference on neural information processing (ICONIP 2016), 16-21 October 2016, Kyoto, Japan. Lecture notes in computer science, 9949. Cham: Springer [online], pages 519-528. Available from: https://doi.org/10.1007/978-3-319-46675-0_57

Conference Name 23rd International conference on neural information processing (ICONIP 2016)
Conference Location Kyoto, Japan
Start Date Oct 16, 2016
End Date Oct 21, 2016
Acceptance Date Jun 4, 2016
Online Publication Date Sep 29, 2016
Publication Date Sep 29, 2016
Deposit Date Feb 10, 2017
Publicly Available Date Sep 30, 2017
Print ISSN 0302-9743
Publisher Springer
Pages 519-528
Series Title Lecture notes in computer science
Series Number 9949
Series ISSN 0302-9743
ISBN 9783319466743
DOI https://doi.org/10.1007/978-3-319-46675-0_57
Keywords MNIST; Artificial neural network; Deep learning; Dropout network; Nonrandom dropout; Selective dropout
Public URL http://hdl.handle.net/10059/2165
