Related Researcher

Lee, Jongeun (이종은)
Intelligent Computing and Codesign Lab.


Full metadata record

dc.citation.conferencePlace: US
dc.citation.conferencePlace: Nashville, TN, USA
dc.citation.title: IEEE Conference on Computer Vision and Pattern Recognition
dc.contributor.author: Oh, Sangyun
dc.contributor.author: Sim, Hyeonuk
dc.contributor.author: Lee, Sugil
dc.contributor.author: Lee, Jongeun
dc.date.accessioned: 2024-01-31T21:40:00Z
dc.date.available: 2024-01-31T21:40:00Z
dc.date.created: 2022-01-06
dc.date.issued: 2021-06-20
dc.description.abstract: Quantization plays an important role in deep neural network (DNN) hardware. In particular, logarithmic quantization has multiple advantages for DNN hardware implementations, and its weakness of lower performance at high precision compared with linear quantization has recently been remedied by what we call selective two-word logarithmic quantization (STLQ). However, there is a lack of training methods designed for STLQ, or even for logarithmic quantization in general. In this paper we propose a novel STLQ-aware training method, which significantly outperforms the previous state-of-the-art training method for STLQ. Moreover, our training results demonstrate that with our new training method, STLQ applied to the weight parameters of ResNet-18 can achieve the same level of performance as the state-of-the-art quantization method APoT at 3-bit precision. We also apply our method to various DNNs in image enhancement and semantic segmentation, showing competitive results.
dc.identifier.bibliographicCitation: IEEE Conference on Computer Vision and Pattern Recognition
dc.identifier.doi: 10.1109/cvpr46437.2021.00080
dc.identifier.scopusid: 2-s2.0-85123197094
dc.identifier.uri: https://scholarworks.unist.ac.kr/handle/201301/77268
dc.identifier.wosid: 000739917300071
dc.publisher: IEEE
dc.title: Automated Log-Scale Quantization for Low-Cost Deep Neural Networks
dc.type: Conference Paper
dc.date.conferenceDate: 2021-06-20
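The abstract describes selective two-word logarithmic quantization (STLQ), in which most weights are represented as a single signed power of two and only selected weights receive a second power-of-two term. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual scheme: the residual-based selection rule, the threshold value, and the absence of exponent clipping to a bit budget are all simplifying assumptions.

```python
import math

def quantize_log(w: float) -> float:
    """One-word log quantization: round |w| to the nearest power of two.
    (A real implementation would also clip the exponent to the available
    bit width; that is omitted here for brevity.)"""
    if w == 0.0:
        return 0.0
    sign = 1.0 if w > 0 else -1.0
    exponent = round(math.log2(abs(w)))
    return sign * 2.0 ** exponent

def quantize_stlq(w: float, rel_err_threshold: float = 0.1) -> float:
    """Illustrative two-word variant: when the one-word relative error
    exceeds a threshold, spend a second power-of-two term on the residual.
    This selection rule is an assumption for illustration only."""
    q1 = quantize_log(w)
    residual = w - q1
    if w != 0.0 and abs(residual) / abs(w) > rel_err_threshold:
        return q1 + quantize_log(residual)  # second word refines the value
    return q1
```

For example, `quantize_log(0.3)` yields 0.25, while `quantize_stlq(0.3)` yields 0.3125, cutting the quantization error. The appeal for hardware is that power-of-two weights turn multiplications into shifts; the two-word form costs at most one extra shift-and-add for the selected weights.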


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.