Full metadata record
DC Field | Value | Language |
---|---|---|
dc.citation.endPage | 131 | - |
dc.citation.startPage | 118 | - |
dc.citation.title | Neural Networks | - |
dc.citation.volume | 126 | - |
dc.contributor.author | Jang, Jinhyeok | - |
dc.contributor.author | Cho, Hyunjoong | - |
dc.contributor.author | Kim, Jaehong | - |
dc.contributor.author | Lee, Jaeyeon | - |
dc.contributor.author | Yang, Seungjoon | - |
dc.date.accessioned | 2023-12-21T17:37:01Z | - |
dc.date.available | 2023-12-21T17:37:01Z | - |
dc.date.created | 2020-03-23 | - |
dc.date.issued | 2020-06 | - |
dc.description.abstract | In this study, we present deep neural networks with a set of node-wise varying activation functions. The feature-learning abilities of the nodes are affected by the selected activation functions, where the nodes with smaller indices become increasingly more sensitive during training. As a result, the features learned by the nodes are sorted by the node indices in order of their importance, such that more sensitive nodes are related to more important features. The proposed networks learn not only input features but also the importance of those features. Nodes with lower importance in the proposed networks can be pruned to reduce the complexity of the networks, and the pruned networks can be retrained without incurring performance losses. We validated the feature-sorting property of the proposed method using both shallow and deep networks as well as deep networks transferred from existing networks. (c) 2020 Elsevier Ltd. All rights reserved. | - |
dc.identifier.bibliographicCitation | Neural Networks, v.126, pp.118 - 131 | - |
dc.identifier.doi | 10.1016/j.neunet.2020.03.004 | - |
dc.identifier.issn | 0893-6080 | - |
dc.identifier.scopusid | 2-s2.0-85082121663 | - |
dc.identifier.uri | https://scholarworks.unist.ac.kr/handle/201301/31660 | - |
dc.identifier.url | https://www.sciencedirect.com/science/article/pii/S0893608020300812 | - |
dc.identifier.wosid | 000536450900011 | - |
dc.language | English | - |
dc.publisher | PERGAMON-ELSEVIER SCIENCE LTD | - |
dc.title | Deep neural networks with a set of node-wise varying activation functions | - |
dc.type | Article | - |
dc.description.isOpenAccess | FALSE | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence; Neurosciences | - |
dc.relation.journalResearchArea | Computer Science; Neurosciences & Neurology | - |
dc.type.docType | Article | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordAuthor | Deep network | - |
dc.subject.keywordAuthor | Principal component analysis | - |
dc.subject.keywordAuthor | Pruning | - |
dc.subject.keywordAuthor | Varying activation | - |
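The abstract's core idea (per-node activation functions whose sensitivity decreases with node index, so that pruning the highest-index nodes removes the least important features) can be illustrated with a minimal NumPy sketch. The linearly decreasing slope schedule, the scaled-`tanh` activation, and all function names below are illustrative assumptions for exposition, not the paper's exact formulation:

```python
import numpy as np

def node_wise_slopes(n_nodes, s_max=2.0, s_min=0.25):
    # Hypothetical schedule: node 0 gets the steepest (most sensitive)
    # activation, later nodes get progressively flatter ones.
    return np.linspace(s_max, s_min, n_nodes)

def forward(x, W, b, slopes):
    # Each hidden node j applies its own activation tanh(slopes[j] * z_j),
    # so sensitivity varies node by node rather than layer-wide.
    z = x @ W + b
    return np.tanh(slopes * z)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))        # batch of 4 inputs, 8 features
W = rng.standard_normal((8, 6)) * 0.1  # one hidden layer with 6 nodes
b = np.zeros(6)
slopes = node_wise_slopes(6)

h = forward(x, W, b, slopes)
# Pruning sketch: since features end up sorted by importance, dropping
# the k highest-index (least sensitive) nodes removes the least
# important features.
k = 2
h_pruned = h[:, : h.shape[1] - k]
print(h.shape, h_pruned.shape)  # (4, 6) (4, 4)
```

In this sketch the sorting emerges from the slope schedule alone; in the paper, training under the varying activations is what drives important features toward the sensitive low-index nodes.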