In this thesis, we investigate the performance of various activation functions for deep convolutional neural networks (DCNNs) and propose new activation functions. First, we propose the twofold parametric ReLU. We observed that the time complexity of the S-shaped ReLU is relatively high owing to the computation required in its forward and backward passes. We therefore removed the translation parameters of the S-shaped ReLU and designed the twofold parametric ReLU. Second, inspired by the just-noticeable difference of Weber's law, we exploit the property that subjective sensation is proportional to the logarithm of image intensity. We formulate an activation function by modifying the logarithm function, and apply it only to the first layer of the DCNN. Experimental results show that the proposed activation functions outperform existing activation functions.
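The two proposed activations can be sketched as follows. This is a minimal illustration, not the thesis's exact formulation: the twofold parametric ReLU is assumed to keep only the two learnable slope parameters of the S-shaped ReLU (one per sign of the input, translation parameters removed), and the Weber-law-inspired activation is assumed to be a sign-preserving logarithm of intensity; the function names and parameter values are hypothetical.

```python
import numpy as np

def twofold_prelu(x, a_neg=0.25, a_pos=1.0):
    """Assumed form of the twofold parametric ReLU: two learnable
    slopes (a_neg for x < 0, a_pos for x >= 0), with the translation
    parameters of the S-shaped ReLU removed."""
    return np.where(x >= 0, a_pos * x, a_neg * x)

def log_activation(x, eps=1.0):
    """Assumed Weber-law-inspired activation for the first layer:
    output grows with the logarithm of the input magnitude,
    preserving sign and passing through zero."""
    return np.sign(x) * np.log1p(np.abs(x) / eps)
```

In this sketch `a_neg` and `a_pos` would be trained by backpropagation like the slopes of a standard PReLU, while `log_activation` is applied once to the raw image intensities entering the first convolutional layer.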
Publisher
Ulsan National Institute of Science and Technology (UNIST)