Detailed Information

Asymmetric Multi-task Learning Based on Task Relatedness and Loss

Author(s)
Giwoong Lee; Eunho Yang; Sung Ju Hwang
Issued Date
2016-06-22
URI
https://scholarworks.unist.ac.kr/handle/201301/35402
Fulltext
http://jmlr.org/proceedings/papers/v48/leeb16.html
Citation
33rd International Conference on Machine Learning, ICML 2016, pp. 374-382
Abstract
We propose a novel multi-task learning method that can minimize the effect of negative transfer by allowing asymmetric transfer between the tasks based on task relatedness as well as the amount of individual task losses, which we refer to as Asymmetric Multi-task Learning (AMTL). To tackle this problem, we couple multiple tasks via a sparse, directed regularization graph that enforces each task's parameters to be reconstructed as a sparse combination of other tasks' parameters, where the contributing tasks are selected based on the task-wise loss. We present two different algorithms to solve this joint learning of the task predictors and the regularization graph. The first algorithm solves the original learning objective using alternating optimization, and the second solves an approximation of it using a curriculum learning strategy that learns one task at a time. We perform experiments on multiple datasets for classification and regression, on which we obtain significant improvements in performance over single-task learning and symmetric multi-task learning baselines.
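
The abstract describes the AMTL formulation only in words. The sketch below is a minimal, hypothetical rendering of that idea in Python/NumPy, assuming a squared loss and simple (proximal) gradient steps for the alternating optimization; the objective form, the function name amtl_sketch, and the hyperparameters (mu, lam, lr) are illustrative assumptions, not the authors' released code. The loss-weighted L1 penalty on each task's outgoing row of the transfer matrix B is what makes the transfer asymmetric: tasks that fit poorly are discouraged from influencing others.

```python
import numpy as np

def soft_threshold(Z, tau):
    """Element-wise soft-thresholding: the proximal operator of the L1 norm."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def amtl_sketch(Xs, ys, mu=0.1, lam=1.0, lr=1e-2, n_iters=500):
    """Alternating-optimization sketch of an AMTL-style objective:

        sum_t loss_t(w_t) * (1 + mu * ||B[t, :]||_1) + lam * ||W - W @ B||_F^2,
        subject to B[t, t] = 0,

    where column t of W holds task t's parameters and row t of B holds its
    outgoing transfer weights. Squared loss is assumed for simplicity.
    """
    T = len(Xs)
    d = Xs[0].shape[1]
    W = np.zeros((d, T))   # task predictors, one column per task
    B = np.zeros((T, T))   # sparse, directed regularization graph

    for _ in range(n_iters):
        losses = np.array([np.mean((Xs[t] @ W[:, t] - ys[t]) ** 2)
                           for t in range(T)])

        # W-step: gradient descent on the smooth objective with B fixed.
        R = W - W @ B                              # reconstruction residual
        grad_rec = 2.0 * lam * (R - R @ B.T)       # d/dW of lam*||W - W@B||_F^2
        for t in range(T):
            scale = 1.0 + mu * np.abs(B[t]).sum()  # loss term's outgoing weight
            resid = Xs[t] @ W[:, t] - ys[t]
            grad_data = scale * (2.0 / len(ys[t])) * (Xs[t].T @ resid)
            W[:, t] -= lr * (grad_data + grad_rec[:, t])

        # B-step: proximal gradient with W fixed. The L1 weight on row t is
        # mu * loss_t, so high-loss (unreliable) tasks transfer out less.
        R = W - W @ B
        grad_B = -2.0 * lam * (W.T @ R)            # d/dB of the smooth part
        B = soft_threshold(B - lr * grad_B, lr * mu * losses[:, None])
        np.fill_diagonal(B, 0.0)                   # no self-transfer
    return W, B

# Toy usage: three related regression tasks sharing a common direction.
rng = np.random.default_rng(0)
Xs = [rng.normal(size=(50, 10)) for _ in range(3)]
w_true = rng.normal(size=10)
ys = [X @ w_true + 0.1 * rng.normal(size=50) for X in Xs]
W, B = amtl_sketch(Xs, ys)
```

The curriculum variant mentioned in the abstract would instead fit the tasks one at a time, with each new task reconstructed only from previously learned ones; it is omitted here for brevity.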
Publisher
33rd International Conference on Machine Learning, ICML 2016

