
Improved Activation Functions of Deep Convolutional Neural Networks for Image Classification

Author(s)
Lee, Dae-Sik
Advisor
Sim, Jae-Young
Issued Date
2017-02
URI
https://scholarworks.unist.ac.kr/handle/201301/72115
http://unist.dcollection.net/jsp/common/DcLoOrgPer.jsp?sItemId=000002332729
Abstract
In this thesis, we investigate the performance of various activation functions in deep convolutional neural networks (DCNNs) and propose new activation functions. First, we propose a twofold parametric ReLU. We observed that the computational cost of S-shaped ReLU is relatively high due to the computation required in the forward and backward passes. We therefore remove the translation parameters of S-shaped ReLU and design a twofold parametric ReLU. Second, inspired by the just-noticeable difference of Weber's law, we exploit the property that subjective sensation is proportional to the logarithm of image intensity. We formulate an activation function by modifying the logarithm function, and apply it only to the first layer of DCNNs. Experimental results show that the proposed activation functions outperform existing activation functions.
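The abstract above only names the two proposed functions without giving their formulas. A minimal sketch of what they might look like, under stated assumptions: the "twofold parametric" ReLU is taken to be a piecewise-linear function with a learnable slope on each side of zero (S-shaped ReLU with its translation parameters removed), and the Weber's-law activation is taken to be a sign-preserving logarithm of the input magnitude. The parameter names `a_neg` and `a_pos` are hypothetical, not from the thesis.

```python
import numpy as np

def twofold_parametric_relu(x, a_neg=0.25, a_pos=1.0):
    """Hypothetical twofold parametric ReLU: two learnable slopes,
    no translation parameters (unlike S-shaped ReLU)."""
    # Slope a_pos applies to non-negative inputs, a_neg to negative inputs.
    return np.where(x >= 0, a_pos * x, a_neg * x)

def weber_log_activation(x):
    """Hypothetical first-layer activation reflecting Weber's law:
    response grows with the logarithm of input magnitude."""
    # log1p keeps the function finite at 0; sign() preserves polarity.
    return np.sign(x) * np.log1p(np.abs(x))

# Example: negative inputs are scaled down, positive passed through;
# the log activation compresses large intensities.
x = np.array([-2.0, 0.0, 3.0])
print(twofold_parametric_relu(x))   # [-0.5  0.   3. ]
print(weber_log_activation(x))
```

In a real DCNN the slopes would be trained by backpropagation alongside the weights; fixed values are used here only to keep the sketch self-contained.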
Publisher
Ulsan National Institute of Science and Technology (UNIST)
Degree
Master
Major
Department of Electrical Engineering


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.