Early identification of emerging technologies: A machine learning approach using multiple patent indicators

Author(s)
Lee, Changyong; Kwon, Ohjin; Kim, Myeongjung; Kwon, Daeil
Issued Date
2018-02
DOI
10.1016/j.techfore.2017.10.002
URI
https://scholarworks.unist.ac.kr/handle/201301/22914
Fulltext
https://www.sciencedirect.com/science/article/pii/S0040162517304778
Citation
TECHNOLOGICAL FORECASTING AND SOCIAL CHANGE, v.127, pp.291-303
Abstract
Patent citation analysis is considered a useful tool for identifying emerging technologies. However, the outcomes of previous methods are likely to reveal no more than current key technologies, since such analyses can only be performed at later stages of technology development, owing to the time required for patents to be cited (or to fail to be cited). This study proposes a machine learning approach to identifying emerging technologies at early stages, using multiple patent indicators that can be defined immediately after the relevant patents are issued. First, a total of 18 input and 3 output indicators are extracted from the United States Patent and Trademark Office database. Second, a feed-forward multilayer neural network is employed to capture the complex nonlinear relationships between the input and output indicators in a time period of interest. Finally, two quantitative indicators are developed to identify trends in a technology's emergingness over time. Based on this, we also provide practical guidelines for implementing the proposed approach. The case of pharmaceutical technology shows that our approach can facilitate responsive technology forecasting and planning.
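
The modeling step described in the abstract lends itself to a short sketch. The snippet below is a minimal illustration, assuming scikit-learn: it trains a feed-forward multilayer neural network mapping 18 input indicators to 3 output indicators, matching the counts reported in the abstract. The synthetic data, hidden-layer sizes, and training settings are illustrative placeholders, not the authors' configuration.

```python
# Minimal sketch of the abstract's modeling step, assuming scikit-learn.
# The 18-input / 3-output dimensions come from the abstract; everything
# else (data, network size, hyperparameters) is a placeholder assumption.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical patent data: one row per patent, 18 input indicators
# observable at issue time, and 3 output indicators.
X = rng.random((500, 18))   # placeholder input indicators
y = rng.random((500, 3))    # placeholder output indicators

# Scale inputs so the network trains stably.
X_scaled = StandardScaler().fit_transform(X)

# Feed-forward multilayer neural network capturing nonlinear
# relationships between input and output indicators.
model = MLPRegressor(hidden_layer_sizes=(32, 16), activation="relu",
                     max_iter=2000, random_state=0)
model.fit(X_scaled, y)

# Predicted output indicators for newly issued patents would then feed
# the paper's two emergingness trend indicators (not reproduced here).
predictions = model.predict(X_scaled[:5])
print(predictions.shape)  # (5, 3)
```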
Publisher
ELSEVIER SCIENCE INC
ISSN
0040-1625
