File Download

There are no files associated with this item.

Related Researcher

김광인 (Kim, Kwang In)
Machine Learning and Vision Lab.

Detailed Information


Full metadata record

dc.citation.conferencePlace: US
dc.citation.title: IEEE Conference on Computer Vision and Pattern Recognition
dc.contributor.author: Mehta, Dushyant
dc.contributor.author: Kim, Kwang In
dc.contributor.author: Theobalt, Christian
dc.date.accessioned: 2024-02-01T00:08:33Z
dc.date.available: 2024-02-01T00:08:33Z
dc.date.created: 2019-11-30
dc.date.issued: 2019-06-18
dc.description.abstract: We investigate filter level sparsity that emerges in convolutional neural networks (CNNs) which employ Batch Normalization and ReLU activation, and are trained with adaptive gradient descent techniques and L2 regularization or weight decay. We conduct an extensive experimental study, casting our initial findings into hypotheses and conclusions about the mechanisms underlying the emergent filter level sparsity. This study offers new insight into the performance gap observed between adaptive and non-adaptive gradient descent methods in practice. Further, analysis of the effect of training strategies and hyperparameters on the sparsity leads to practical suggestions for designing CNN training strategies, enabling us to explore the tradeoffs between feature selectivity, network capacity, and generalization performance. Lastly, we show that the implicit sparsity can be harnessed for neural network speedup on par with or better than explicit sparsification/pruning approaches, with no modifications to the typical training pipeline required.
dc.identifier.bibliographicCitation: IEEE Conference on Computer Vision and Pattern Recognition
dc.identifier.doi: 10.1109/CVPR.2019.00061
dc.identifier.scopusid: 2-s2.0-85071971310
dc.identifier.uri: https://scholarworks.unist.ac.kr/handle/201301/79655
dc.publisher: IEEE
dc.title: On implicit filter level sparsity in convolutional neural networks
dc.type: Conference Paper
dc.date.conferenceDate: 2019-06-16
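
The abstract describes filter level sparsity that emerges implicitly when CNNs with Batch Normalization and ReLU are trained with adaptive optimizers and L2 regularization or weight decay. One common way to surface this kind of sparsity is to inspect the learned BatchNorm scale (gamma) of each channel: a gamma near zero means the channel passes essentially no signal through the following ReLU, so the corresponding convolution filter is effectively inactive. Below is a minimal PyTorch sketch along those lines; the function name count_near_zero_filters, the 1e-3 cutoff, and the toy model are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn

def count_near_zero_filters(model: nn.Module, threshold: float = 1e-3):
    """Count BatchNorm channels whose learned scale (gamma) is near zero.

    A channel whose gamma has collapsed toward zero passes almost no
    signal through the following ReLU, so its conv filter is effectively
    inactive. The 1e-3 cutoff is a hypothetical choice for illustration.
    """
    sparse, total = 0, 0
    for module in model.modules():
        if isinstance(module, nn.BatchNorm2d):
            gamma = module.weight.detach().abs()
            total += gamma.numel()
            sparse += int((gamma < threshold).sum())
    return sparse, total

# A freshly initialized network has gamma = 1 everywhere, so this reports
# zero sparsity; run it on a network trained with an adaptive optimizer
# (e.g. Adam) plus L2 regularization or weight decay to observe the
# emergent filter level sparsity the abstract refers to.
model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(),
)
sparse, total = count_near_zero_filters(model)
print(f"{sparse}/{total} BatchNorm channels below threshold")

Channels flagged this way are the natural candidates for the speedup-by-pruning the abstract mentions; the paper's point is that they appear without any explicit sparsification step in the training pipeline.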


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.