Related Researcher

Baek, Seungryul (백승렬)
UNIST Vision and Learning Lab.

Detailed Information

Sampling Strategies for GAN Synthetic Data

Author(s)
Bhattarai, Binod; Baek, Seungryul; Bodur, Rumeysa; Kim, Tae-Kyun
Issued Date
2020-05-04
URI
https://scholarworks.unist.ac.kr/handle/201301/78543
Fulltext
https://arxiv.org/abs/1909.04689
Citation
IEEE International Conference on Acoustics, Speech and Signal Processing
Abstract
Generative Adversarial Networks (GANs) have been widely used to generate large volumes of synthetic data, which is then combined with real examples to augment the training of deep Convolutional Neural Networks (CNNs). Studies have shown, however, that the generated examples lack the realism and diversity needed to train deep CNNs effectively. Unlike previous studies that randomly augment real data with synthetic data, we present simple, effective, and easy-to-implement sampling methods for synthetic data that train deep CNNs more efficiently and accurately. To this end, we propose to maximally exploit the parameters learned during training of the GAN itself: the discriminator's realism confidence score and the confidence on the target label of the synthetic data. In addition, we explore reinforcement learning (RL) to automatically search for a subset of meaningful synthetic examples from a large pool of GAN synthetic data. We evaluate our method on two challenging face attribute classification data sets, AffectNet and CelebA. Our extensive experiments clearly demonstrate the need to sample synthetic data before augmentation, which also improves the performance of one of the state-of-the-art deep CNNs in vitro.
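
Illustrative sketch (not part of the item record): the abstract describes scoring GAN-generated samples with the discriminator's realism confidence and a classifier's confidence on the intended target label. The short Python sketch below filters a synthetic pool by these two scores; the function name, array shapes, and thresholds are assumptions for illustration only, not the authors' implementation.

import numpy as np

def select_synthetic_indices(disc_scores, label_probs, target_labels,
                             realism_thresh=0.5, label_conf_thresh=0.8):
    """Illustrative filter: keep synthetic samples that (a) the GAN
    discriminator rates as sufficiently realistic and (b) an auxiliary
    classifier assigns to the intended target label with high confidence.
    Thresholds are hypothetical, not taken from the paper."""
    disc_scores = np.asarray(disc_scores)        # (N,) realism scores in [0, 1]
    label_probs = np.asarray(label_probs)        # (N, C) class probabilities
    target_labels = np.asarray(target_labels)    # (N,) intended label per sample

    conf_on_target = label_probs[np.arange(len(target_labels)), target_labels]
    keep = (disc_scores >= realism_thresh) & (conf_on_target >= label_conf_thresh)
    return np.flatnonzero(keep)                  # indices to keep for augmentation

# Example with random scores for 5 synthetic samples and 3 classes.
rng = np.random.default_rng(0)
print(select_synthetic_indices(rng.random(5),
                               rng.dirichlet(np.ones(3), size=5),
                               rng.integers(0, 3, size=5)))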
Publisher
Institute of Electrical and Electronics Engineers Inc.

