Related Researcher

Yu, Hyeonwoo (유현우), Lab. of AI and Robotics

Zero-shot Learning via Simultaneous Generating and Learning

Author(s)
Yu, Hyeonwoo; Lee, Beomhee
Issued Date
2019-12-08
URI
https://scholarworks.unist.ac.kr/handle/201301/78704
Citation
Neural Information Processing Systems
Abstract
To overcome the absence of training data for unseen classes, conventional zero-shot learning approaches mainly train their models on seen datapoints and leverage semantic descriptions of both seen and unseen classes. Going beyond exploiting relations between seen and unseen classes, we present a deep generative model that provides the model with experience of both seen and unseen classes. Based on a variational auto-encoder with a class-specific multi-modal prior, the proposed method learns the conditional distributions of seen and unseen classes. To circumvent the need for samples of unseen classes, we treat the non-existing data as missing examples. That is, our network finds optimal unseen datapoints and model parameters by iteratively following a generating-and-learning strategy. Since we obtain a conditional generative model for both seen and unseen classes, classification as well as generation can be performed directly, without any off-the-shelf classifiers. Experimental results demonstrate that the proposed generating-and-learning strategy outperforms a model trained only on the seen classes, as well as several state-of-the-art methods.
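The iterative generating-and-learning idea from the abstract can be illustrated with a minimal sketch. This is a hypothetical toy stand-in, not the authors' VAE: each class is a Gaussian whose mean is predicted from a semantic attribute vector by a linear map, seen classes have real data, the unseen class has attributes only, and we alternate between generating pseudo-samples for the unseen class (treating its data as missing) and refitting the map on seen plus generated data. All names (`fit_map`, `classify`, the attribute vectors) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_map(attrs, mus):
    # Least-squares map W from class attribute vectors to class means.
    W, *_ = np.linalg.lstsq(attrs, mus, rcond=None)
    return W

# Seen classes: attribute vectors and observed 2-D datapoints (toy data).
attrs_seen = np.array([[1.0, 0.0], [0.0, 1.0]])
data_seen = [rng.normal([5.0, 0.0], 1.0, (50, 2)),
             rng.normal([0.0, 5.0], 1.0, (50, 2))]
attr_unseen = np.array([1.0, 1.0])  # unseen class: attributes only, no data

mus_seen = np.stack([d.mean(axis=0) for d in data_seen])
W = fit_map(attrs_seen, mus_seen)

# Generating-and-learning loop: alternately (1) generate pseudo-samples for
# the unseen class from the current model, (2) refit on seen + generated data.
for _ in range(5):
    mu_unseen = attr_unseen @ W                    # generate step
    pseudo = rng.normal(mu_unseen, 1.0, (50, 2))
    attrs_all = np.vstack([attrs_seen, attr_unseen])
    mus_all = np.vstack([mus_seen, pseudo.mean(axis=0)])
    W = fit_map(attrs_all, mus_all)                # learning step

def classify(x, W, attrs):
    # Direct classification: nearest class mean under the learned map,
    # with no separate off-the-shelf classifier.
    mus = attrs @ W
    return int(np.argmin(((mus - x) ** 2).sum(axis=1)))

attrs_all = np.vstack([attrs_seen, attr_unseen])
print(classify(np.array([5.0, 5.0]), W, attrs_all))  # index 2 = unseen class
```

The paper's method replaces the linear map and Gaussians with a VAE whose class-specific multi-modal prior plays the role of the class means here; the alternation between generating missing unseen-class data and updating parameters is the part this sketch mirrors.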
Publisher
NEURAL INFORMATION PROCESSING SYSTEMS (NIPS)
ISSN
1049-5258

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.