An Expectation Maximization Method to Learn the Group Structure of Deep Neural Network
- Yi, Subin
- Choi, Jaesik
- Graduate School of UNIST
- Analyzing multivariate time series data is important for many applications such as automated control, sensor fault diagnosis, and financial data analysis. One of the key challenges is to learn latent features automatically from dynamically changing multivariate input. Convolutional neural networks (CNNs) have been successful in learning generalized feature extractors with shared parameters over the spatial domain in visual recognition tasks. For high-dimensional multivariate time series, designing an appropriate CNN model structure is challenging because the kernels may need to extend through the full depth of the input volume. To address this issue, we propose an Expectation Maximization (EM) method that learns the group structure of deep neural networks so that multiple high-dimensional kernels can be processed efficiently. The algorithm groups the kernels for each channel using the EM method and partitions the kernel matrix into a block matrix. The EM method assumes a Gaussian Mixture Model (GMM), and the parameters of the GMM are updated together with the parameters of the deep neural network by end-to-end backpropagation.
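The abstract's grouping step can be illustrated with a minimal sketch: fit a GMM to per-channel kernel features with EM, then use the resulting hard assignments to build a block mask over the kernel matrix. This is an assumption-laden illustration (spherical covariances, channel features built by flattening each input channel's weights), not the thesis' exact formulation, which trains the GMM parameters jointly with the network by backpropagation.

```python
import numpy as np

def em_group_channels(kernel, n_groups=2, n_iters=50, seed=0):
    """Group the input channels of a conv kernel with a GMM fitted by EM.
    `kernel` has shape (out_channels, in_channels, k); each input channel's
    weights, flattened across output channels, act as its feature vector.
    Spherical covariances are assumed for simplicity."""
    rng = np.random.default_rng(seed)
    feats = kernel.transpose(1, 0, 2).reshape(kernel.shape[1], -1)  # (C_in, D)
    n, d = feats.shape
    mu = feats[rng.choice(n, n_groups, replace=False)]   # init means from data
    var = np.full(n_groups, feats.var() + 1e-6)
    pi = np.full(n_groups, 1.0 / n_groups)
    for _ in range(n_iters):
        # E-step: responsibilities under spherical Gaussians
        sq = ((feats[:, None, :] - mu[None]) ** 2).sum(-1)        # (n, K)
        log_p = np.log(pi) - 0.5 * (d * np.log(2 * np.pi * var) + sq / var)
        log_p -= log_p.max(axis=1, keepdims=True)                 # stabilize
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update mixture weights, means, and variances
        nk = resp.sum(axis=0) + 1e-10
        pi = nk / n
        mu = (resp.T @ feats) / nk[:, None]
        sq = ((feats[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (resp * sq).sum(axis=0) / (nk * d) + 1e-6
    return resp.argmax(axis=1)  # hard group assignment per input channel

def block_mask(out_groups, in_groups):
    """Partition the kernel matrix into blocks: keep only weights whose
    input and output channels fall in the same group."""
    return (in_groups[None, :] == out_groups[:, None]).astype(float)
```

Multiplying the kernel matrix elementwise by `block_mask(...)` zeroes the cross-group connections, which is one way to realize the block structure the abstract describes.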
- Department of Computer Science and Engineering
- Files in This Item:
An Expectation Maximization Method to Learn the Group Structure of Deep Neural Network.pdf
Direct access to the published full text of this article is available to UNISTARs only.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.