Related Researcher

Yoo, Jaejun (유재준), Lab. of Advanced Imaging Technology

Detailed Information

SimUSR: A simple but strong baseline for unsupervised image super-resolution

Author(s)
Ahn, Namhyuk; Yoo, Jaejun; Sohn, Kyung-Ah
Issued Date
2020-06
DOI
10.1109/CVPRW50498.2020.00245
URI
https://scholarworks.unist.ac.kr/handle/201301/78511
Citation
IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 1953-1961
Abstract
In this paper, we tackle a fully unsupervised super-resolution problem, i.e., one with neither paired images nor ground-truth HR images. We assume that low-resolution (LR) images are relatively easy to collect compared to high-resolution (HR) images. By allowing multiple LR images, we build a set of pseudo pairs by denoising and downsampling the LR images, casting the original unsupervised problem into a supervised learning problem one level lower. Although this line of study is easy to conceive and thus should have been investigated before any complicated unsupervised methods, surprisingly, there are currently none. Moreover, we show that this simple method outperforms the state-of-the-art unsupervised method with dramatically shorter latency at runtime, and significantly reduces the gap to supervised HR models. We submitted our method to the NTIRE 2020 super-resolution challenge and placed 1st in PSNR, 2nd in SSIM, and 13th in LPIPS. This simple method should be used as the baseline to beat in the future, especially when multiple LR images are allowed during the training phase. However, even in the zero-shot condition, we argue that this method can serve as a useful baseline for gauging the gap between supervised and unsupervised frameworks. © 2020 IEEE.
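
The pseudo-pair construction described in the abstract can be sketched in a few lines. The Python snippet below is a minimal illustration under assumptions of our own, not the authors' released pipeline: it uses PIL-style RGB images, a simple median filter as a stand-in denoiser, bicubic downsampling, and an assumed scale factor of 2; the function name make_pseudo_pair is hypothetical.

```python
# Minimal sketch of the pseudo-pair idea: denoise each LR image to obtain a
# clean pseudo "HR" target, then downsample it to obtain the paired input.
from PIL import Image, ImageFilter

def make_pseudo_pair(lr_path, scale=2):
    """Build one (input, target) training pair from a single LR image."""
    lr = Image.open(lr_path).convert("RGB")
    # 1) Denoise the LR image so it can serve as the pseudo ground-truth target.
    #    (A median filter is only a placeholder for whatever denoiser is used.)
    target = lr.filter(ImageFilter.MedianFilter(size=3))
    # 2) Downsample the denoised image by the SR scale to get the pseudo LR input.
    w, h = target.size
    inp = target.resize((w // scale, h // scale), Image.BICUBIC)
    return inp, target
```

Any off-the-shelf supervised SR network trained on such (input, target) pairs can then be applied directly to the original LR images at test time, which is how the unsupervised problem is recast as a supervised one at a lower scale.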
Publisher
IEEE Computer Society
ISSN
2160-7508

