PARAMETRIC INFORMATION BOTTLENECK TO OPTIMIZE STOCHASTIC NEURAL NETWORKS
- Thanh Nguyen Tang
- Choi, Jaesik
- Graduate School of UNIST
- In this thesis, we present a layer-wise learning framework for Stochastic Neural Networks (SNNs) from an information-theoretic perspective. In each layer of an SNN, compression and relevance are defined to quantify the amount of information the layer contains about the input space and the target space, respectively. We jointly optimize the compression and the relevance of all parameters in an SNN to better exploit the neural network's representation. The Information Bottleneck (IB) principle extracts the information in an input that is relevant to a target variable. Here, we propose the Parametric Information Bottleneck (PIB) for a neural network, which uses only the network's model parameters to explicitly approximate the compression and the relevance. We show that the PIB framework can be considered an extension of the Maximum Likelihood Estimate (MLE) principle to every layer. We also show that, compared to the MLE principle, PIB: (i) improves the generalization of neural networks in classification tasks, (ii) generates better samples in multi-modal prediction, and (iii) exploits a neural network's representation more efficiently, pushing it closer to the optimal information-theoretic representation in a faster manner. Our PIB framework therefore shows great potential, from an information-theoretic perspective, for exploiting the representational power of neural networks that has not yet been fully utilized.
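For reference, the layer-wise compression/relevance trade-off described above is conventionally expressed through the classical Information Bottleneck Lagrangian. The sketch below is the standard IB formulation applied per layer; the symbols \(T_\ell\) (the stochastic representation of layer \(\ell\)) and the trade-off weight \(\beta\) are illustrative notation, not necessarily the thesis's own:

```latex
% Classical IB objective, written per layer (illustrative notation):
% minimize compression I(X; T_ell) while preserving relevance I(T_ell; Y)
\min_{p(t_\ell \mid x)} \; \mathcal{L}_\ell
  \;=\; I(X; T_\ell) \;-\; \beta \, I(T_\ell; Y)
```

Here \(I(X; T_\ell)\) quantifies how much the layer's representation remembers about the input (compression), \(I(T_\ell; Y)\) quantifies how much it retains about the target (relevance), and \(\beta > 0\) sets the trade-off between the two.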
- Department of Computer Science and Engineering
- Files in This Item:
PARAMETRIC INFORMATION BOTTLENECK TO OPTIMIZE STOCHASTIC NEURAL NETWORKS.pdf
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.