Related Researcher

Choi, Jaesik (최재식)


Full metadata record

DC Field | Value | Language
dc.citation.number | 10 | -
dc.citation.startPage | 976 | -
dc.citation.title | ENTROPY | -
dc.citation.volume | 21 | -
dc.contributor.author | Nguyen, Thanh Tang | -
dc.contributor.author | Choi, Jaesik | -
dc.date.accessioned | 2023-12-21T19:45:39Z | -
dc.date.available | 2023-12-21T19:45:39Z | -
dc.date.created | 2019-11-20 | -
dc.date.issued | 2019 | -
dc.description.abstract | While rate distortion theory compresses data under a distortion constraint, the information bottleneck (IB) generalizes rate distortion theory to learning problems by replacing the distortion constraint with a constraint on relevant information. In this work, we further extend IB to multiple Markov bottlenecks (i.e., latent variables that form a Markov chain), namely the Markov information bottleneck (MIB), which fits the context of stochastic neural networks (SNNs) better than the original IB. We show that Markov bottlenecks cannot simultaneously achieve their information optimality in a non-collapse MIB, and thus devise an optimality compromise. With MIB, we take the novel perspective that each layer of an SNN is a bottleneck whose learning goal is to encode relevant information from the data in a compressed form. The inference from a hidden layer to the output layer is then interpreted as a variational approximation to the layer's decoding of relevant information in the MIB. As a consequence of this perspective, the maximum likelihood estimate (MLE) principle in the context of SNNs becomes a special case of the variational MIB. We show that, compared to MLE, the variational MIB can encourage better information flow in SNNs in both principle and practice, and empirically improves performance in classification, adversarial robustness, and multi-modal learning on MNIST. | -
dc.identifier.bibliographicCitation | ENTROPY, v.21, no.10, pp.976 | -
dc.identifier.doi | 10.3390/e21100976 | -
dc.identifier.issn | 1099-4300 | -
dc.identifier.scopusid | 2-s2.0-85074007561 | -
dc.identifier.uri | https://scholarworks.unist.ac.kr/handle/201301/30443 | -
dc.identifier.url | https://www.mdpi.com/1099-4300/21/10/976 | -
dc.identifier.wosid | 000495094000058 | -
dc.language | English | -
dc.publisher | MDPI AG | -
dc.title | Markov information bottleneck to improve information flow in stochastic neural networks | -
dc.type | Article | -
dc.description.isOpenAccess | FALSE | -
dc.relation.journalWebOfScienceCategory | Physics, Multidisciplinary | -
dc.relation.journalResearchArea | Physics | -
dc.type.docType | Article | -
dc.description.journalRegisteredClass | scie | -
dc.description.journalRegisteredClass | scopus | -
dc.subject.keywordAuthor | Information bottleneck | -
dc.subject.keywordAuthor | Machine learning | -
dc.subject.keywordAuthor | Stochastic neural networks | -
dc.subject.keywordAuthor | Variational inference | -
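
The abstract describes a per-layer trade-off between compressing the input (the mutual information I(X;T)) and preserving relevant information (I(T;Y)), with MLE emerging as a special case of the variational objective. As a minimal sketch only, the code below implements a standard single-bottleneck variational IB loss in the spirit of deep variational information bottleneck methods, not the paper's exact multi-layer MIB objective; the names vib_loss and reparameterize, the diagonal-Gaussian encoder, and the default beta are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def vib_loss(mu, logvar, logits, targets, beta=1e-3):
        # Variational lower bound on I(T;Y): the decoder's cross-entropy.
        ce = F.cross_entropy(logits, targets)
        # Variational upper bound on I(X;T): KL(q(t|x) || N(0, I)) for a
        # diagonal-Gaussian encoder, averaged over the batch.
        kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1.0).sum(dim=1).mean()
        # As beta -> 0 this reduces to plain cross-entropy (MLE) training,
        # echoing the abstract's remark that MLE is a special case.
        return ce + beta * kl

    def reparameterize(mu, logvar):
        # Sample t ~ N(mu, sigma^2) via the reparameterization trick.
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    # Illustrative usage: batch of 8, 32-dim bottleneck, 10 classes.
    mu, logvar = torch.zeros(8, 32), torch.zeros(8, 32)
    t = reparameterize(mu, logvar)      # stochastic layer output
    logits = torch.randn(8, 10)         # stand-in decoder output
    loss = vib_loss(mu, logvar, logits, torch.randint(0, 10, (8,)))

The paper's MIB applies an analogous compression/relevance trade-off at every stochastic layer of the Markov chain; the single-layer version above is only meant to make that trade-off concrete.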

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.