Discovering Latent Covariance Structures for Multiple Time Series

Author(s)
Tong, Anh; Choi, Jaesik
Issued Date
2019-06-12
URI
https://scholarworks.unist.ac.kr/handle/201301/79687
Fulltext
http://proceedings.mlr.press/v97/tong19a.html
Citation
International Conference on Machine Learning (ICML), pp. 6285 - 6294
Abstract
Analyzing multivariate time series data is important for predicting future events and changes of complex systems in finance, manufacturing, and administrative decisions. The expressive power of Gaussian Process (GP) regression methods has been significantly improved by compositional covariance structures. In this paper, we present a new GP model which naturally handles multiple time series by placing an Indian Buffet Process (IBP) prior on the presence of shared kernels. Our selective covariance structure decomposition allows exploiting shared parameters over a set of multiple, selected time series. We also investigate the well-definedness of the models when infinite latent components are introduced. We present a pragmatic search algorithm which explores a larger structure space efficiently. Experiments conducted on five real-world data sets demonstrate that our new model outperforms existing methods in terms of structure discovery and predictive performance.
Publisher
International Machine Learning Society
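
Illustrative note
The abstract above describes the idea only at a high level. For orientation, the following is a minimal Python sketch, not the authors' implementation: each series composes its covariance from a shared pool of base kernels, and a fixed binary matrix Z stands in for an Indian Buffet Process draw over which shared kernels each series uses. The kernel choices, hyperparameters, and Z itself are illustrative assumptions.

# Toy "selective" compositional covariance over multiple time series.
# A binary matrix Z (stand-in for an IBP draw) selects which shared base
# kernels each series uses; base-kernel parameters are shared across series.
import numpy as np

def rbf(x, y, ls=1.0):
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / ls ** 2)

def periodic(x, y, period=1.0, ls=1.0):
    d = np.pi * np.abs(x[:, None] - y[None, :]) / period
    return np.exp(-2.0 * np.sin(d) ** 2 / ls ** 2)

def linear(x, y, var=0.1):
    return var * x[:, None] * y[None, :]

BASE_KERNELS = [rbf, periodic, linear]   # shared component pool
Z = np.array([[1, 1, 0],                 # series 0: RBF + periodic
              [1, 0, 1],                 # series 1: RBF + linear
              [0, 1, 1]])                # series 2: periodic + linear

def series_cov(x, y, z_row):
    # Sum the base kernels selected for one series (shared parameters).
    return sum(k(x, y) for k, on in zip(BASE_KERNELS, z_row) if on)

def gp_predict(x_train, y_train, x_test, z_row, noise=1e-2):
    # Standard GP posterior mean for one series under its composed kernel.
    K = series_cov(x_train, x_train, z_row) + noise * np.eye(len(x_train))
    Ks = series_cov(x_test, x_train, z_row)
    return Ks @ np.linalg.solve(K, y_train)

# Usage on three toy series observed over a common grid.
x = np.linspace(0, 5, 50)
ys = [np.sin(2 * np.pi * x) + 0.1 * np.random.randn(50) for _ in range(3)]
x_new = np.linspace(0, 6, 20)
preds = [gp_predict(x, ys[i], x_new, Z[i]) for i in range(3)]
print([p.shape for p in preds])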
