Probabilistic Model Discovery, Relational Learning and Scalable Inference

Author(s)
Tong, Anh
Advisor
Kim, Kwang In
Issued Date
2021-02
URI
https://scholarworks.unist.ac.kr/handle/201301/82417
http://unist.dcollection.net/common/orgView/200000371291
Abstract
This thesis studies problems in compositionality for machine learning models in several settings, including relational learning, scalability, and deep models. Compositionality refers to the process of building complex objects from simple ones. Bringing this concept into machine learning is important because it appears at many scales, from atomic to planetary structures. The machine learning models in this thesis center around Gaussian processes whose covariance functions are compositionally constructed. The proposed approach develops methods that explore the compositional model space automatically and efficiently, while also striving to make the obtained models interpretable.
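For illustration, the following is a minimal sketch of such a compositionally constructed covariance, written with scikit-learn's kernel algebra rather than the code accompanying the thesis; the specific base kernels, hyperparameters, and toy signal are assumptions chosen only for this example.

# Minimal sketch (not the thesis's implementation): composing a Gaussian
# process covariance from base kernels using scikit-learn's kernel algebra.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

# Base kernels combine with + (sum) and * (product): here a long-term smooth
# trend, a locally periodic component, and observation noise.
kernel = (RBF(length_scale=50.0)
          + RBF(length_scale=5.0) * ExpSineSquared(length_scale=1.0, periodicity=12.0)
          + WhiteKernel(noise_level=0.1))

# Toy monthly-style signal: linear trend plus a yearly cycle plus noise.
X = np.arange(120, dtype=float).reshape(-1, 1)
y = 0.05 * X.ravel() + np.sin(2 * np.pi * X.ravel() / 12.0) + 0.1 * np.random.randn(120)

gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
print(gp.kernel_)                          # fitted composite structure
print(gp.log_marginal_likelihood_value_)   # evidence used to compare structures

Comparing the log marginal likelihood of alternative composite structures is what turns this kernel algebra into a model discovery problem.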

The aforementioned problems are both important and challenging. Multivariate, or relational, learning is the de facto standard in time-series analysis across many domains. However, existing compositional learning methods do not extend to this setting, since the explosion of the model space makes them infeasible. Learning compositional structures is already time-consuming, and although approximation methods exist, they do not work well for compositional covariances; this makes it even harder to design a scalable approach without sacrificing model performance. Finally, analyzing hierarchical deep Gaussian processes is notoriously difficult, especially when incorporating different covariance functions. Previous work focuses on a single covariance function and is difficult to generalize to other cases.

The goal of this thesis is to propose solutions to these problems. The first contribution is a general framework for modeling multiple time series that provides descriptive relations between them. Second, the thesis presents efficient probabilistic approaches to the model search problem, which was previously addressed by exhaustive enumeration and evaluation. Furthermore, a scalable inference method for Gaussian processes is proposed, providing accurate approximations with guaranteed error bounds. Last but not least, to address the existing issues in deep Gaussian processes, the thesis presents a unified theoretical framework that explains their pathologies, with rates of convergence and error bounds for various kernels that improve on existing work.
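For context on the model search problem, the sketch below illustrates a greedy enumerate-and-score search over compositional kernels of the kind the proposed probabilistic approaches are meant to replace; the grammar of sums and products, the base kernel set, and scoring by log marginal likelihood are assumptions of this illustration, not a reproduction of the thesis's algorithm.

# Hedged sketch of a greedy compositional kernel search (a baseline style of
# search; the thesis's probabilistic search procedure is not reproduced here).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, RationalQuadratic

BASE_KERNELS = [RBF(), ExpSineSquared(), RationalQuadratic()]

def expand(kernel):
    """Grammar step: combine the current structure with each base kernel."""
    return [kernel + b for b in BASE_KERNELS] + [kernel * b for b in BASE_KERNELS]

def score(kernel, X, y):
    """Evidence (log marginal likelihood) of one candidate structure."""
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    return gp.log_marginal_likelihood_value_

def greedy_search(X, y, depth=2):
    """Expand only the best structure at each level; every candidate is refit."""
    best, best_score = None, -np.inf
    frontier = list(BASE_KERNELS)
    for _ in range(depth):
        scored = [(score(k, X, y), k) for k in frontier]
        s, k = max(scored, key=lambda t: t[0])
        if s > best_score:
            best, best_score = k, s
        frontier = expand(k)
    return best, best_score

Even this greedy variant refits a Gaussian process for every candidate at every level, which is why the thesis targets both the search itself and scalable inference for the underlying models.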
Publisher
Ulsan National Institute of Science and Technology (UNIST)
Degree
Doctor
Major
Department of Computer Science and Engineering
