File Download

There are no files associated with this item.

  • Find it @ UNIST can give you direct access to the published full text of this article. (UNISTARs only)
Related Researcher

이종은

Lee, Jongeun
Intelligent Computing and Codesign Lab.


Full metadata record

DC Field Value Language
dc.citation.conferencePlace UK -
dc.citation.title British Machine Vision Conference -
dc.contributor.author Asim, Faaiz -
dc.contributor.author Park, Jaewoo -
dc.contributor.author Azamat, Azat -
dc.contributor.author Lee, Jongeun -
dc.date.accessioned 2024-01-31T19:36:41Z -
dc.date.available 2024-01-31T19:36:41Z -
dc.date.created 2022-11-28 -
dc.date.issued 2022-11-21 -
dc.description.abstract Recent advances in quantized neural networks (QNNs) are closing the performance gap with full-precision neural networks. However, at very low precision (i.e., 2–3 bits), QNNs often still suffer significant performance degradation. The conventional uniform symmetric quantization scheme allocates unequal numbers of positive and negative quantization levels. We show that this asymmetry in the number of positive and negative quantization levels can result in significant quantization error and performance degradation at low precision. We propose and analyze a quantizer called centered symmetric quantizer (CSQ), which preserves the symmetry of the latent distribution by providing equal representations to the negative and positive sides of the distribution. We also propose a novel method to efficiently map CSQ to binarized neural network hardware using bitwise operations. Our analyses and experimental results using state-of-the-art quantization methods on ImageNet and CIFAR-10 show the importance of using CSQ for weights in place of the conventional quantization scheme at extremely low-bit precision (2–3 bits). -
dc.identifier.bibliographicCitation British Machine Vision Conference -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/75046 -
dc.publisher British Machine Vision Association (BMVA) -
dc.title Centered Symmetric Quantization for Hardware-Efficient Low-Bit Neural Networks -
dc.type Conference Paper -
dc.date.conferenceDate 2022-11-21 -
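
The abstract above contrasts conventional uniform symmetric quantization, which allocates one more negative integer level than positive levels, with the proposed centered symmetric quantizer (CSQ), which gives the negative and positive sides equal representation. The short Python sketch below only illustrates that counting difference, under the assumption that CSQ corresponds to a half-step-shifted integer grid with no zero level; the function names and the NumPy formulation are illustrative and not taken from the paper.

    import numpy as np

    def uniform_symmetric_levels(bits):
        # Conventional signed uniform symmetric quantization grid:
        # integers -2^(b-1) .. 2^(b-1)-1, i.e. one more negative level
        # than positive levels (plus a zero level).
        return np.arange(-(2 ** (bits - 1)), 2 ** (bits - 1))

    def centered_symmetric_levels(bits):
        # Assumed reading of CSQ: the same number of levels, shifted by
        # half a step so the grid is symmetric about zero, giving equal
        # counts of negative and positive levels (and no zero level).
        return np.arange(-(2 ** (bits - 1)), 2 ** (bits - 1)) + 0.5

    if __name__ == "__main__":
        for b in (2, 3):
            print(f"{b}-bit uniform symmetric : {uniform_symmetric_levels(b).tolist()}")
            print(f"{b}-bit centered symmetric: {centered_symmetric_levels(b).tolist()}")
        # 2-bit uniform symmetric : [-2, -1, 0, 1]
        # 2-bit centered symmetric: [-1.5, -0.5, 0.5, 1.5]

For 2 bits, the conventional grid is {-2, -1, 0, 1} while the centered grid is {-1.5, -0.5, 0.5, 1.5} (before scaling), which is the asymmetry-versus-symmetry distinction the abstract describes.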


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.