Related Researcher

Yoon, Sung Whan
Machine Intelligence and Information Learning Lab.


XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage

Author(s)
Lee, Jae-Jun; Yoon, Sung Whan
Issued Date
2024-05-02
URI
https://scholarworks.unist.ac.kr/handle/201301/68204
Citation
International Conference on Artificial Intelligence and Statistics
Abstract
Meta-learning, which pursues an effective initialization model, has emerged as a promising approach to handling unseen tasks. However, a limitation remains evident when a meta-learner tries to encompass a wide range of task distributions, e.g., learning across distinct datasets or domains. Recently, a group of works has attempted to employ multiple model initializations to cover widely-ranging tasks, but they are limited in adaptively expanding the initializations. We introduce XB-MAML, which learns expandable basis parameters that are linearly combined to form an effective initialization for a given task. XB-MAML observes the discrepancy between the vector space spanned by the basis and the fine-tuned parameters to decide whether to expand the basis. Our method surpasses existing works on multi-domain meta-learning benchmarks and opens up new possibilities for meta-learning: obtaining diverse inductive biases that can be combined to stretch toward effective initializations for diverse unseen tasks.
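The abstract describes two mechanisms: forming a task initialization as a linear combination of basis parameters, and expanding the basis when fine-tuned parameters fall outside the span of the current basis. The following is a minimal NumPy sketch of that idea only; the function names, the least-squares projection, and the fixed residual threshold are illustrative assumptions, not the paper's exact procedure (which learns the combination weights and expansion criterion during meta-training).

```python
import numpy as np

def combine(bases, weights):
    """Form a task initialization as a linear combination of basis parameters.

    bases: list of k parameter vectors, each of shape (d,)
    weights: array of shape (k,)
    """
    return np.tensordot(weights, np.stack(bases), axes=1)

def span_residual(bases, theta):
    """Norm of the component of theta lying outside span(bases).

    Uses a least-squares projection of theta onto the basis vectors.
    """
    B = np.stack(bases)                              # (k, d)
    coef, *_ = np.linalg.lstsq(B.T, theta, rcond=None)
    return np.linalg.norm(theta - B.T @ coef)

def maybe_expand(bases, theta, tol=1e-3):
    """Append theta as a new basis vector if the current span covers it poorly."""
    if span_residual(bases, theta) > tol:
        bases.append(theta.copy())
        return True
    return False

# Toy usage: a fine-tuned parameter inside the span is absorbed,
# one outside the span triggers an expansion.
bases = [np.array([1.0, 0.0, 0.0])]
maybe_expand(bases, np.array([2.0, 0.0, 0.0]))   # in span: basis unchanged
maybe_expand(bases, np.array([0.0, 1.0, 0.0]))   # out of span: basis grows
```

The projection residual stands in for the paper's span-discrepancy measure: a small residual means the fine-tuned parameters are already representable by the existing basis, so no new initialization is needed.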
Publisher
Society for Artificial Intelligence and Statistics
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.