Related Researcher

나승훈

Na, Seung-Hoon
Natural Language Processing Lab
Detailed Information

Full metadata record

DC Field Value Language
dc.citation.number 9 -
dc.citation.startPage 989 -
dc.citation.title ELECTRONICS -
dc.citation.volume 10 -
dc.contributor.author Woo, Sangmin -
dc.contributor.author Kim, Kangil -
dc.contributor.author Noh, Junhyug -
dc.contributor.author Shin, Jong-Hun -
dc.contributor.author Na, Seung-Hoon -
dc.date.accessioned 2025-04-25T15:11:37Z -
dc.date.available 2025-04-25T15:11:37Z -
dc.date.created 2025-04-08 -
dc.date.issued 2021-05 -
dc.description.abstract A common approach to jointly learning multiple tasks with a shared structure is to optimize the model with a combined landscape of multiple sub-costs. However, gradients derived from each sub-cost often conflict on cost plateaus, resulting in a subpar optimum. In this work, we shed light on such gradient conflict challenges and suggest a solution named Cost-Out, which randomly drops the sub-costs at each iteration. We provide theoretical and empirical evidence of the escaping pressure induced by the Cost-Out mechanism. While simple, the empirical results indicate that the proposed method can enhance the performance of multi-task learning problems, including two-digit image classification sampled from the MNIST dataset and machine translation tasks between English and French, Spanish, and German on WMT14 datasets. -
dc.identifier.bibliographicCitation ELECTRONICS, v.10, no.9, pp.989 -
dc.identifier.doi 10.3390/electronics10090989 -
dc.identifier.issn 2079-9292 -
dc.identifier.scopusid 2-s2.0-85104382384 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/86790 -
dc.identifier.wosid 000649987400001 -
dc.language English -
dc.publisher MDPI -
dc.title Revisiting Dropout: Escaping Pressure for Training Neural Networks with Multiple Costs -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.relation.journalWebOfScienceCategory Computer Science, Information Systems; Engineering, Electrical & Electronic; Physics, Applied -
dc.relation.journalResearchArea Computer Science; Engineering; Physics -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor multitask learning -
dc.subject.keywordAuthor gradient conflict -
dc.subject.keywordAuthor Cost-Out -
dc.subject.keywordAuthor escaping pressure -
dc.subject.keywordAuthor dropout -
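The abstract describes the core Cost-Out mechanism: instead of always summing every task's sub-cost, each sub-cost is randomly dropped at each training iteration. A minimal Python sketch of that idea follows; the function name, the keep probability, and the keep-at-least-one fallback are illustrative assumptions, not the paper's exact formulation.

```python
import random


def cost_out(sub_costs, keep_prob=0.5, rng=None):
    """Combine per-task losses, randomly dropping sub-costs (Cost-Out sketch).

    Each loss in `sub_costs` is kept independently with probability
    `keep_prob`. If every sub-cost happens to be dropped, one is kept at
    random so the combined loss is never empty (an assumed fallback rule).
    """
    rng = rng or random.Random()
    kept = [c for c in sub_costs if rng.random() < keep_prob]
    if not kept:  # ensure at least one sub-cost survives this iteration
        kept = [rng.choice(sub_costs)]
    return sum(kept)


# Example: combine three task losses for one training step.
losses = [0.8, 1.2, 0.5]
combined = cost_out(losses, keep_prob=0.5, rng=random.Random(0))
```

In an actual training loop, `combined` would be a differentiable tensor and the gradient of the surviving sub-costs alone would drive the update, which is the source of the escaping pressure the abstract refers to.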


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.