Related Researcher

Lee, Jongeun
Intelligent Computing and Codesign Lab.


Full metadata record

DC Field Value Language
dc.citation.startPage 543472 -
dc.citation.title FRONTIERS IN NEUROSCIENCE -
dc.citation.volume 14 -
dc.contributor.author Sim, Hyeonuk -
dc.contributor.author Lee, Jongeun -
dc.date.accessioned 2023-12-21T16:38:39Z -
dc.date.available 2023-12-21T16:38:39Z -
dc.date.created 2020-12-30 -
dc.date.issued 2020-12 -
dc.description.abstract While convolutional neural networks (CNNs) continue to renew state-of-the-art performance across many fields of machine learning, their hardware implementations tend to be very costly and inflexible. Neuromorphic hardware, on the other hand, targets higher efficiency, but its inference accuracy lags far behind that of CNNs. To bridge the gap between deep learning and neuromorphic computing, we present the bitstream-based neural network, which is efficient and accurate as well as flexible in terms of arithmetic precision and hardware size. Our bitstream-based neural network (called SC-CNN) is built on top of CNN but inspired by stochastic computing (SC), which uses bitstreams to represent numbers. Being based on CNN, our SC-CNN can be trained with backpropagation, ensuring very high inference accuracy. At the same time, our SC-CNN is deterministic, hence repeatable, and is highly accurate and scalable even to large networks. Our experimental results demonstrate that our SC-CNN is highly accurate up to ImageNet-targeting CNNs, and improves efficiency over conventional digital designs by 50-100% in operations per area depending on the CNN and the application scenario, while losing less than 1% in recognition accuracy. In addition, our SC-CNN implementations can be much more fault-tolerant than conventional digital implementations. -
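To illustrate the stochastic-computing idea the abstract refers to, the sketch below shows conventional unipolar SC, where a value in [0, 1] is encoded as the fraction of 1s in a bitstream and multiplication reduces to a bitwise AND of independent streams. This is a minimal randomized sketch of classic SC for intuition only; the paper's SC-CNN uses a deterministic bitstream scheme, and all function names here are illustrative, not from the paper.

```python
import random


def to_bitstream(x, length, rng):
    # Unipolar SC encoding: each bit is 1 with probability x,
    # so the mean of the stream approximates x.
    return [1 if rng.random() < x else 0 for _ in range(length)]


def value(stream):
    # Decode a unipolar bitstream back to a real value in [0, 1].
    return sum(stream) / len(stream)


def sc_multiply(a, b):
    # Bitwise AND of two independent unipolar streams:
    # P(a_i = 1 and b_i = 1) = P(a_i = 1) * P(b_i = 1),
    # so the result stream encodes (approximately) the product.
    return [x & y for x, y in zip(a, b)]


rng = random.Random(0)
N = 4096  # longer streams give lower approximation error
s1 = to_bitstream(0.5, N, rng)
s2 = to_bitstream(0.25, N, rng)
prod = value(sc_multiply(s1, s2))  # close to 0.5 * 0.25 = 0.125
```

A single AND gate thus replaces a full digital multiplier, which is the source of SC's area efficiency; the accuracy/latency trade-off is set by the stream length, which is what makes the precision scalable.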
dc.identifier.bibliographicCitation FRONTIERS IN NEUROSCIENCE, v.14, pp.543472 -
dc.identifier.doi 10.3389/fnins.2020.543472 -
dc.identifier.issn 1662-4548 -
dc.identifier.scopusid 2-s2.0-85099064051 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/49282 -
dc.identifier.url https://www.frontiersin.org/articles/10.3389/fnins.2020.543472/full -
dc.identifier.wosid 000605959100001 -
dc.language English -
dc.publisher Frontiers Media S.A. -
dc.title Bitstream-Based Neural Network for Scalable, Efficient, and Accurate Deep Learning Hardware -
dc.type Article -
dc.description.isOpenAccess TRUE -
dc.relation.journalWebOfScienceCategory Neurosciences -
dc.relation.journalResearchArea Neurosciences & Neurology -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor bitstream-based neural network -
dc.subject.keywordAuthor neuromorphic computing -
dc.subject.keywordAuthor stochastic computing -
dc.subject.keywordAuthor deep learning hardware -
dc.subject.keywordAuthor dynamic precision scaling -
dc.subject.keywordAuthor SC-CNN -
dc.subject.keywordAuthor variable precision -


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.