File Download

There are no files associated with this item.

Related Researcher

양승준

Yang, Seungjoon
Signal Processing Lab.


Full metadata record

DC Field Value Language
dc.citation.endPage 131 -
dc.citation.startPage 118 -
dc.citation.title Neural Networks -
dc.citation.volume 126 -
dc.contributor.author Jang, Jinhyeok -
dc.contributor.author Cho, Hyunjoong -
dc.contributor.author Kim, Jaehong -
dc.contributor.author Lee, Jaeyeon -
dc.contributor.author Yang, Seungjoon -
dc.date.accessioned 2023-12-21T17:37:01Z -
dc.date.available 2023-12-21T17:37:01Z -
dc.date.created 2020-03-23 -
dc.date.issued 2020-06 -
dc.description.abstract In this study, we present deep neural networks with a set of node-wise varying activation functions. The feature-learning abilities of the nodes are affected by the selected activation functions, where the nodes with smaller indices become increasingly more sensitive during training. As a result, the features learned by the nodes are sorted by the node indices in order of their importance, such that more sensitive nodes are related to more important features. The proposed networks learn not only input features but also the importance of those features. Nodes with lower importance can be pruned to reduce the complexity of the networks, and the pruned networks can be retrained without incurring performance losses. We validated the feature-sorting property of the proposed method using both shallow and deep networks as well as deep networks transferred from existing networks. (c) 2020 Elsevier Ltd. All rights reserved. -
dc.identifier.bibliographicCitation Neural Networks, v.126, pp.118 - 131 -
dc.identifier.doi 10.1016/j.neunet.2020.03.004 -
dc.identifier.issn 0893-6080 -
dc.identifier.scopusid 2-s2.0-85082121663 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/31660 -
dc.identifier.url https://www.sciencedirect.com/science/article/pii/S0893608020300812 -
dc.identifier.wosid 000536450900011 -
dc.language English -
dc.publisher PERGAMON-ELSEVIER SCIENCE LTD -
dc.title Deep neural networks with a set of node-wise varying activation functions -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.relation.journalWebOfScienceCategory Computer Science, Artificial Intelligence; Neurosciences -
dc.relation.journalResearchArea Computer Science; Neurosciences & Neurology -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor Deep network -
dc.subject.keywordAuthor Principal component analysis -
dc.subject.keywordAuthor Pruning -
dc.subject.keywordAuthor Varying activation -
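The abstract above describes activation functions that vary by node index, so that lower-index nodes are more sensitive and the learned features end up sorted by importance, allowing the tail nodes to be pruned. A minimal, hypothetical sketch of that idea follows; the tanh activation and the linearly decreasing slope schedule are illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def nodewise_activation(z, slopes):
    # Each node applies tanh with its own slope; steeper slopes make a
    # node respond more sharply to its pre-activation (assumed form).
    return np.tanh(slopes * z)

rng = np.random.default_rng(0)
n_in, n_hidden = 8, 16

# One dense layer; slopes decrease with node index, so lower-index
# nodes are the more sensitive (more "important") ones.
W = rng.standard_normal((n_hidden, n_in)) * 0.1
slopes = np.linspace(2.0, 0.5, n_hidden)

x = rng.standard_normal(n_in)
h = nodewise_activation(W @ x, slopes)

# Pruning: because features are sorted by index, dropping the layer's
# tail nodes keeps the most important features intact.
k = 8
W_pruned, slopes_pruned = W[:k], slopes[:k]
h_pruned = nodewise_activation(W_pruned @ x, slopes_pruned)

# The surviving outputs are unchanged by pruning the tail.
assert np.allclose(h_pruned, h[:k])
```

Since each node's output depends only on its own row of `W` and its own slope, removing the last `n_hidden - k` nodes leaves the first `k` outputs exactly as they were, which is what makes this kind of index-ordered pruning cheap.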


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.