Related Researcher

황성주

Hwang, Sung Ju


Full metadata record

DC Field Value Language
dc.citation.conferencePlace AU -
dc.citation.conferencePlace Sydney -
dc.citation.endPage 6039 -
dc.citation.startPage 6031 -
dc.citation.title 34th International Conference on Machine Learning, ICML 2017 -
dc.contributor.author Yoon, J -
dc.contributor.author Hwang, Sung Ju -
dc.date.accessioned 2023-12-19T18:37:08Z -
dc.date.available 2023-12-19T18:37:08Z -
dc.date.created 2019-03-21 -
dc.date.issued 2017-08-06 -
dc.description.abstract The number of parameters in a deep neural network is usually very large, which helps with its learning capacity but also hinders its scalability and practicality due to memory/time inefficiency and overfitting. To resolve this issue, we propose a sparsity regularization method that exploits both positive and negative correlations among the features to enforce the network to be sparse, and at the same time remove any redundancies among the features to fully utilize the capacity of the network. Specifically, we propose to use an exclusive sparsity regularization based on (1,2)-norm, which promotes competition for features between different weights, thus enforcing them to fit to disjoint sets of features. We further combine the exclusive sparsity with the group sparsity based on (2,1)-norm, to promote both sharing and competition for features in training of a deep neural network. We validate our method on multiple public datasets, and the results show that our method can obtain more compact and efficient networks while also improving the performance over the base networks with full weights, as opposed to existing sparsity regularizations that often obtain efficiency at the expense of prediction accuracy. -
dc.identifier.bibliographicCitation 34th International Conference on Machine Learning, ICML 2017, pp.6031 - 6039 -
dc.identifier.issn 0000-0000 -
dc.identifier.scopusid 2-s2.0-85048580203 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/35110 -
dc.language English -
dc.publisher International Machine Learning Society (IMLS) -
dc.title Combined group and exclusive sparsity for deep neural networks -
dc.type Conference Paper -
dc.date.conferenceDate 2017-08-06 -
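The abstract above combines a (2,1)-norm group sparsity term (which prunes whole groups of weights) with a squared (1,2)-norm exclusive sparsity term (which makes weights within a group compete for features). A minimal NumPy sketch of such a combined penalty is shown below; treating each row of the weight matrix as one group, and the coefficient names `mu` and `gamma`, are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def combined_sparsity_penalty(W, mu=1e-4, gamma=1e-4):
    """Sketch of a combined group + exclusive sparsity regularizer.

    W: 2-D weight matrix; each row is treated as one group
       (e.g. the incoming weights of one output unit -- an assumption).
    Group sparsity ((2,1)-norm):      sum_g ||w_g||_2
    Exclusive sparsity ((1,2)-norm):  0.5 * sum_g ||w_g||_1 ** 2
    """
    # (2,1)-norm: L2 norm within each group, summed across groups.
    group = np.sum(np.sqrt(np.sum(W ** 2, axis=1)))
    # Squared (1,2)-norm: L1 norm within each group, squared, then summed.
    exclusive = 0.5 * np.sum(np.sum(np.abs(W), axis=1) ** 2)
    return mu * group + gamma * exclusive

# Example: W = [[3, 4], [0, 0]] gives group = 5, exclusive = 24.5,
# so with mu = gamma = 1 the penalty is 29.5.
W = np.array([[3.0, 4.0], [0.0, 0.0]])
print(combined_sparsity_penalty(W, mu=1.0, gamma=1.0))
```

In training, this penalty would be added to the task loss; the relative weights `mu` and `gamma` trade off feature sharing (group term) against feature competition (exclusive term).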


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.