Related Researcher

Na, Seung-Hoon (나승훈), Natural Language Processing Lab

Detailed Information


Memory-restricted latent semantic analysis to accumulate term-document co-occurrence events

Author(s)
Na, Seung-Hoon; Lee, Jong-Hyeok
Issued Date
2012-09
DOI
10.1016/j.patrec.2012.05.002
URI
https://scholarworks.unist.ac.kr/handle/201301/86831
Citation
PATTERN RECOGNITION LETTERS, v.33, no.12, pp.1623 - 1631
Abstract
This paper addresses a novel adaptive problem of obtaining a new type of term-document weight. In our problem, an input is given by a long sequence of co-occurrence events between terms and documents, namely, a stream of term-document co-occurrence events. Given a stream of term-document co-occurrences, we learn unknown latent vectors of terms and documents such that their inner product adaptively approximates the target query-based term-document weights resulting from accumulating co-occurrence events. To this end, we propose a new incremental dimensionality reduction algorithm for adaptively learning a latent semantic index of terms and documents over a collection. The core of our algorithm is its partial updating style, where only a small number of latent vectors are modified for each term-document co-occurrence, while most other latent vectors remain unchanged. Experimental results on small and large standard test collections demonstrate that the proposed algorithm can stably learn the latent semantic index of terms and documents, showing an improvement in the retrieval performance over the baseline method. (C) 2012 Elsevier B.V. All rights reserved.
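The partial-update idea from the abstract can be illustrated with a minimal sketch: each incoming term-document co-occurrence event touches only the latent vectors of the term and document involved, nudging their inner product toward an accumulated co-occurrence weight while every other vector stays unchanged. The class name, the log-count target weight, and the simple SGD-style update rule below are all illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

class IncrementalLSI:
    """Hypothetical sketch of partial-update incremental latent indexing."""

    def __init__(self, n_terms, n_docs, dim=8, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.T = rng.normal(scale=0.1, size=(n_terms, dim))  # term latent vectors
        self.D = rng.normal(scale=0.1, size=(n_docs, dim))   # document latent vectors
        self.counts = {}                                     # accumulated co-occurrence counts
        self.lr = lr

    def observe(self, t, d):
        """Process one term-document co-occurrence event from the stream."""
        self.counts[(t, d)] = self.counts.get((t, d), 0) + 1
        target = np.log1p(self.counts[(t, d)])   # illustrative accumulated weight
        err = target - self.T[t] @ self.D[d]     # residual of the current approximation
        # Partial update: only row t of T and row d of D change;
        # all other latent vectors remain untouched.
        grad_t = err * self.D[d]
        grad_d = err * self.T[t]
        self.T[t] += self.lr * grad_t
        self.D[d] += self.lr * grad_d

    def weight(self, t, d):
        """Approximate the term-document weight via the inner product."""
        return float(self.T[t] @ self.D[d])
```

Because each event modifies just two rows, the per-event cost is O(dim) regardless of vocabulary or collection size, which is what makes this style of update attractive for long co-occurrence streams.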
Publisher
ELSEVIER
ISSN
0167-8655
Keyword (Author)
Partial-update algorithm; Latent semantic analysis; Co-occurrence; Dimensionality reduction


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.