Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.citation.startPage | 106362 | - |
| dc.citation.title | NEURAL NETWORKS | - |
| dc.citation.volume | 176 | - |
| dc.contributor.author | Lee, Hyunwoo | - |
| dc.contributor.author | Kim, Yunho | - |
| dc.contributor.author | Yang, Seung Yeop | - |
| dc.contributor.author | Choi, Hayoung | - |
| dc.date.accessioned | 2024-05-20T12:05:08Z | - |
| dc.date.available | 2024-05-20T12:05:08Z | - |
| dc.date.created | 2024-05-16 | - |
| dc.date.issued | 2024-08 | - |
| dc.description.abstract | Appropriate weight initialization settings, along with the ReLU activation function, have become cornerstones of modern deep learning, enabling the training and deployment of highly effective and efficient neural network models across diverse areas of artificial intelligence. The problem of “dying ReLU,” where ReLU neurons become inactive and yield zero output, presents a significant challenge in the training of deep neural networks with the ReLU activation function. Theoretical research and various methods have been introduced to address the problem. However, even with these methods and research, training remains challenging for extremely deep and narrow feedforward networks with the ReLU activation function. In this paper, we propose a novel weight initialization method to address this issue. We establish several properties of our initial weight matrix and demonstrate how these properties enable the effective propagation of signal vectors. Through a series of experiments and comparisons with existing methods, we demonstrate the effectiveness of the novel initialization method. © 2024 The Authors | - |
| dc.identifier.bibliographicCitation | NEURAL NETWORKS, v.176, pp.106362 | - |
| dc.identifier.doi | 10.1016/j.neunet.2024.106362 | - |
| dc.identifier.issn | 0893-6080 | - |
| dc.identifier.scopusid | 2-s2.0-85192499052 | - |
| dc.identifier.uri | https://scholarworks.unist.ac.kr/handle/201301/82649 | - |
| dc.identifier.wosid | 001265952000001 | - |
| dc.language | English | - |
| dc.publisher | Elsevier Ltd | - |
| dc.title | Improved weight initialization for deep and narrow feedforward neural network | - |
| dc.type | Article | - |
| dc.description.isOpenAccess | TRUE | - |
| dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence;Neurosciences | - |
| dc.relation.journalResearchArea | Computer Science;Neurosciences & Neurology | - |
| dc.type.docType | Article | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.subject.keywordAuthor | Feedforward neural networks | - |
| dc.subject.keywordAuthor | ReLU activation function | - |
| dc.subject.keywordAuthor | Weight initialization | - |
| dc.subject.keywordAuthor | Initial weight matrix | - |
| dc.subject.keywordAuthor | Deep learning | - |
| dc.subject.keywordPlus | APPROXIMATION | - |
| dc.subject.keywordPlus | ERROR | - |
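The abstract describes the "dying ReLU" problem: in extremely deep and narrow feedforward networks, neurons can become permanently inactive and the signal collapses to zero. The short NumPy sketch below illustrates that phenomenon only; it is not the paper's proposed initialization. It uses standard He-normal initialization, and the depth, width, and trial counts are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def signal_dies(depth=64, width=4, dim_in=8):
    """Propagate one random input through a deep, narrow ReLU net
    with He-normal initialization; return True if the hidden signal
    collapses to the all-zero vector at any layer."""
    h = rng.normal(size=dim_in)
    for _ in range(depth):
        fan_in = h.shape[0]
        # He-normal init: std = sqrt(2 / fan_in)
        W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(width, fan_in))
        h = np.maximum(W @ h, 0.0)  # ReLU; once h is all-zero it stays zero
        if not h.any():
            return True
    return False

trials = 100
dead = sum(signal_dies() for _ in range(trials))
print(f"{dead}/{trials} random deep narrow ReLU nets produced an all-zero output")
```

With a narrow width, each layer has a non-trivial chance that every pre-activation is negative, so over many layers the signal dies in most trials; this is the failure mode the paper's initialization method is designed to avoid.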