dc.citation.conferencePlace |
Palacio de Congresos de València, València, Spain |
- |
dc.citation.title |
International Conference on Artificial Intelligence and Statistics |
- |
dc.contributor.author |
Lee, Jae-Jun |
- |
dc.contributor.author |
Yoon, Sung Whan |
- |
dc.date.accessioned |
2024-01-22T17:05:08Z |
- |
dc.date.available |
2024-01-22T17:05:08Z |
- |
dc.date.created |
2024-01-22 |
- |
dc.date.issued |
2024-05-02 |
- |
dc.description.abstract |
Meta-learning, which pursues an effective initialization model, has emerged as a promising approach to handling unseen tasks. However, a clear limitation remains when a meta-learner tries to encompass a wide range of task distributions, e.g., learning across distinctive datasets or domains. Recently, a group of works has attempted to employ multiple model initializations to cover widely-ranging tasks, but they are limited in adaptively expanding the initializations. We introduce XB-MAML, which learns expandable basis parameters that are linearly combined to form an effective initialization for a given task. XB-MAML observes the discrepancy between the vector space spanned by the basis and the fine-tuned parameters to decide whether to expand the basis. Our method surpasses existing works on multi-domain meta-learning benchmarks and opens up new possibilities for meta-learning to obtain diverse inductive biases that can be combined to reach an effective initialization for diverse unseen tasks. |
- |
dc.identifier.bibliographicCitation |
International Conference on Artificial Intelligence and Statistics |
- |
dc.identifier.uri |
https://scholarworks.unist.ac.kr/handle/201301/68204 |
- |
dc.language |
English |
- |
dc.publisher |
Society for Artificial Intelligence and Statistics |
- |
dc.title |
XB-MAML: Learning Expandable Basis Parameters for Effective Meta-Learning with Wide Task Coverage |
- |
dc.type |
Conference Paper |
- |
dc.date.conferenceDate |
2024-05-02 |
- |