An Expectation Maximization Method to Learn the Group Structure of Deep Neural Network

DC Field Value Language
dc.contributor.advisor Choi, Jaesik -
dc.contributor.author Yi, Subin -
dc.date.accessioned 2018-02-13T06:45:16Z -
dc.date.available 2018-02-13T06:45:16Z -
dc.date.issued 2018-02 -
dc.identifier.other 200000011609 -
dc.identifier.uri en_US
dc.identifier.uri -
dc.description Department of Computer Science and Engineering -
dc.description.abstract Analyzing multivariate time series data is important for many applications such as automated control, sensor fault diagnosis, and financial data analysis. One of the key challenges is to learn latent features automatically from dynamically changing multivariate input. Convolutional neural networks (CNNs) have been successful in learning generalized feature extractors with shared parameters over the spatial domain in visual recognition tasks. For high-dimensional multivariate time series, designing an appropriate CNN model structure is challenging because the kernels may need to be extended through the full dimension of the input volume. To address this issue, we propose an Expectation Maximization (EM) method to learn the group structure of deep neural networks so that we can process the multiple high-dimensional kernels efficiently. The algorithm groups the kernels for each channel using the EM method and partitions the kernel matrix into a block matrix. The EM method assumes a Gaussian Mixture Model (GMM), and the parameters of the GMM are updated together with the parameters of the deep neural network by end-to-end backpropagation learning. -
dc.description.statementofresponsibility open -
dc.language ENG -
dc.publisher Graduate School of UNIST -
dc.title An Expectation Maximization Method to Learn the Group Structure of Deep Neural Network -
dc.type Master's thesis -
dc.administration.regnum 200000011609 -
Appears in Collections:
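The grouping step described in the abstract (EM over a GMM to assign kernels/channels to groups, then a block partition of the kernel matrix) can be sketched roughly as follows. This is a minimal NumPy illustration using plain EM iterations; the thesis instead updates the GMM parameters jointly with the network by end-to-end backpropagation. The function names, the channel descriptor features, and the farthest-point initialization are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def em_group_channels(features, n_groups=2, n_iters=20):
    """Soft-assign channels to groups with a shared-variance (spherical) GMM.

    `features` is (n_channels, d): one descriptor vector per kernel/channel.
    Returns a hard group label per channel.
    """
    n, d = features.shape
    # Deterministic farthest-point initialization of the means (an assumption
    # made here for reproducibility; the thesis does not prescribe this).
    means = [features[0]]
    for _ in range(n_groups - 1):
        d2 = np.min([((features - m) ** 2).sum(1) for m in means], axis=0)
        means.append(features[int(d2.argmax())])
    means = np.asarray(means)
    pis = np.full(n_groups, 1.0 / n_groups)  # mixing weights
    var = 1.0                                # shared isotropic variance
    for _ in range(n_iters):
        # E-step: responsibilities r[i, k] proportional to
        # pi_k * N(x_i | mu_k, var * I)
        d2 = ((features[:, None, :] - means[None]) ** 2).sum(-1)
        logp = np.log(pis)[None] - 0.5 * d2 / var
        logp -= logp.max(axis=1, keepdims=True)  # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters from the responsibilities
        nk = r.sum(axis=0) + 1e-8
        means = (r.T @ features) / nk[:, None]
        pis = nk / n
        var = (r * d2).sum() / (n * d) + 1e-8
    return r.argmax(axis=1)

def block_mask(labels):
    """Keep entry (i, j) only when channels i and j share a group; applying
    this mask to a dense kernel matrix yields a block-partitioned matrix."""
    return (labels[:, None] == labels[None, :]).astype(float)
```

In the thesis's end-to-end setting the responsibilities would come from GMM parameters trained with the network rather than from a standalone EM loop, but the block-mask construction illustrates the same idea of restricting each kernel to its group's channels.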


