File Download

  • Find it @ UNIST can give you direct access to the published full text of this thesis. (UNISTARs only)


Full metadata record

dc.contributor.advisor: Sim, Jae-Young
dc.contributor.author: Lee, Dae-Sik
dc.date.accessioned: 2024-01-25T13:57:23Z
dc.date.available: 2024-01-25T13:57:23Z
dc.date.issued: 2017-02
dc.description.abstract: In this thesis, we investigate the performance of various activation functions in deep convolutional neural networks (DCNNs) and propose new activation functions. First, we propose the twofold parametric ReLU. We observed that the time complexity of the S-shaped ReLU is relatively high due to the computation required for forward- and backward-pass propagation, so we remove the translation parameters of the S-shaped ReLU and design the twofold parametric ReLU. Second, inspired by the just-noticeable difference of Weber's law, we reflect the property that subjective sensation is proportional to the logarithm of image intensity: we formulate an activation function by modifying the logarithm function, and this function is used only on the first layer of the DCNN. Experimental results show that the proposed activation functions outperform existing activation functions. (An illustrative sketch of both functions follows this record.)
dc.description.degree: Master
dc.description: Department of Electrical Engineering
dc.identifier.uri: https://scholarworks.unist.ac.kr/handle/201301/72115
dc.identifier.uri: http://unist.dcollection.net/jsp/common/DcLoOrgPer.jsp?sItemId=000002332729
dc.language: eng
dc.publisher: Ulsan National Institute of Science and Technology (UNIST)
dc.rights.embargoReleaseDate: 9999-12-31
dc.rights.embargoReleaseTerms: 9999-12-31
dc.title: Improved Activation Functions of Deep Convolutional Neural Networks for Image Classification
dc.type: Thesis
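The abstract above describes the two proposed activation functions only at a high level. The Python (PyTorch) sketch below is one plausible reading, not the thesis's actual implementation: the twofold parametric ReLU is assumed to be the S-shaped ReLU with its translation parameters fixed at zero, leaving one learnable slope on each side of the origin, and the Weber's-law first-layer activation is modeled as a logarithm of the input magnitude. All class names, parameter names, and initial values are assumptions made for illustration.

    import torch
    import torch.nn as nn

    class TwofoldParametricReLU(nn.Module):
        # Assumed form: S-shaped ReLU with its translation parameters removed,
        # i.e. f(x) = a_right * x for x >= 0 and f(x) = a_left * x for x < 0,
        # with both slopes learned during training. Initial values are guesses.
        def __init__(self, a_left: float = 0.25, a_right: float = 1.0):
            super().__init__()
            self.a_left = nn.Parameter(torch.tensor(a_left))
            self.a_right = nn.Parameter(torch.tensor(a_right))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return torch.where(x >= 0, self.a_right * x, self.a_left * x)

    class FirstLayerLogActivation(nn.Module):
        # Weber's-law-inspired activation applied only on the first layer:
        # the response is proportional to the logarithm of image intensity.
        # The abstract does not specify the exact modification of the
        # logarithm; sign(x) * log(1 + |x|) is one plausible choice that is
        # monotone, zero at zero, and defined for all real inputs.
        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return torch.sign(x) * torch.log1p(torch.abs(x))

    # Minimal usage: the log activation on a batch of input images, the
    # twofold parametric ReLU on a later layer's pre-activations.
    x = torch.randn(4, 3, 32, 32)
    y = FirstLayerLogActivation()(x)
    z = TwofoldParametricReLU()(torch.randn(4, 64, 16, 16))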

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.