Related Researcher

Na, Seung-Hoon (나승훈)
Natural Language Processing Lab


Revisiting Dropout: Escaping Pressure for Training Neural Networks with Multiple Costs

Author(s)
Woo, Sangmin; Kim, Kangil; Noh, Junhyug; Shin, Jong-Hun; Na, Seung-Hoon
Issued Date
2021-05
DOI
10.3390/electronics10090989
URI
https://scholarworks.unist.ac.kr/handle/201301/86790
Citation
ELECTRONICS, v.10, no.9, pp.989
Abstract
A common approach to jointly learning multiple tasks with a shared structure is to optimize the model over a combined landscape of multiple sub-costs. However, gradients derived from the individual sub-costs often conflict on cost plateaus, resulting in a subpar optimum. In this work, we shed light on this gradient-conflict challenge and propose a solution named Cost-Out, which randomly drops sub-costs at each iteration. We provide theoretical and empirical evidence for the escaping pressure induced by the Cost-Out mechanism. Although simple, the proposed method empirically improves performance on multi-task learning problems, including two-digit image classification sampled from the MNIST dataset and machine translation between English and French, Spanish, and German on the WMT14 datasets.
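
The abstract describes Cost-Out as randomly dropping sub-costs at each training iteration. The following is a minimal sketch of one way such a step could look, in plain Python; the function name, the per-cost keep probability, and the rule of keeping at least one cost are illustrative assumptions, not the paper's specification.

    import random

    def cost_out(sub_costs, keep_prob=0.5):
        # Illustrative sketch: keep each sub-cost independently with
        # probability keep_prob for this iteration (assumed scheme).
        kept = [c for c in sub_costs if random.random() < keep_prob]
        # Guarantee at least one sub-cost survives the iteration.
        if not kept:
            kept = [random.choice(sub_costs)]
        return sum(kept)

Here, sub_costs would be the per-task scalar losses (e.g., PyTorch loss tensors), and the returned combined cost is what would be backpropagated for that iteration.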
Publisher
MDPI
ISSN
2079-9292
Keyword (Author)
multitask learning; gradient conflict; Cost-Out; escaping pressure; dropout

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.