Related Researcher
Kim, Yunho (김윤호), Mathematical Imaging Analysis Lab.

Improved weight initialization for deep and narrow feedforward neural network

Author(s)
Lee, Hyunwoo; Kim, Yunho; Yang, Seung Yeop; Choi, Hayoung
Issued Date
2024-08
DOI
10.1016/j.neunet.2024.106362
URI
https://scholarworks.unist.ac.kr/handle/201301/82649
Citation
NEURAL NETWORKS, v.176, art. no. 106362
Abstract
Appropriate weight initialization settings, together with the ReLU activation function, have become cornerstones of modern deep learning, enabling the training and deployment of highly effective and efficient neural network models across diverse areas of artificial intelligence. The “dying ReLU” problem, in which ReLU neurons become inactive and yield zero output, presents a significant challenge in the training of deep neural networks with the ReLU activation function. Theoretical analyses and various methods have been introduced to address the problem, yet training remains challenging for extremely deep and narrow feedforward networks with the ReLU activation function. In this paper, we propose a novel weight initialization method to address this issue. We establish several properties of our initial weight matrix and show how these properties enable the effective propagation of signal vectors. Through a series of experiments and comparisons with existing methods, we demonstrate the effectiveness of the novel initialization method. © 2024 The Authors
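The record does not reproduce the proposed initialization itself, but the failure mode the abstract describes can be illustrated numerically. The sketch below is an assumption-laden illustration, not the paper's method: it propagates a random input through a stack of He-initialized ReLU layers (the function name, widths, depth, and trial count are all arbitrary choices for demonstration) and reports how often the signal dies entirely. Narrow networks should show a far larger dead fraction than wide ones.

```python
import numpy as np

rng = np.random.default_rng(0)

def dead_layer_fraction(width, depth=200, trials=50):
    """He-initialize `depth` square weight matrices, push a random
    input through the corresponding ReLU layers, and return the
    fraction of (layer, trial) pairs whose output is identically
    zero. Illustrative baseline only; not the paper's method."""
    dead = 0
    for _ in range(trials):
        h = rng.standard_normal(width)
        for _ in range(depth):
            # He (Kaiming) initialization: variance 2 / fan_in
            W = rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
            h = np.maximum(W @ h, 0.0)  # ReLU
            dead += np.all(h == 0.0)
    return dead / (depth * trials)

for width in (4, 16, 64):
    print(f"width {width:3d}: dead-layer fraction = "
          f"{dead_layer_fraction(width):.3f}")
```

Note that once every unit in a layer outputs zero, ReLU(W @ 0) = 0 keeps the signal dead for all remaining layers, which is why the dead fraction grows rapidly with depth at small widths.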
Publisher
Elsevier Ltd
ISSN
0893-6080
Keyword (Author)
Deep learning; Feedforward neural networks; ReLU activation function; Weight initialization; Initial weight matrix
