
Full metadata record

DC Field Value Language
dc.citation.conferencePlace KO -
dc.citation.endPage 895 -
dc.citation.startPage 892 -
dc.citation.title 한국자동제어학회 -
dc.contributor.author Kim, EuiSeok -
dc.contributor.author Jung, Mooyoung -
dc.date.accessioned 2023-12-20T07:08:18Z -
dc.date.available 2023-12-20T07:08:18Z -
dc.date.created 2014-12-23 -
dc.date.issued 1992-10-19 -
dc.description.abstract This paper presents a methodology for automatic feature extraction used in the vision system of an FMC (Flexible Manufacturing Cell). To implement a robot vision system, it is important to build a feature database for object recognition, location, and orientation. For industrial applications, it is necessary to extract feature information from the CAD database, since detailed information about an object is described in the CAD data. In general, a CAD description is three-dimensional, whereas a single camera image is two-dimensional, and this dimensional difference raises many problems. Our primary concern in this study is to convert three-dimensional data into two-dimensional data, extract features from them, and store those features in the feature database. Our secondary concern is to construct a feature selection system that can be used for part recognition within a given set of objects. -
dc.identifier.bibliographicCitation 한국자동제어학회, pp.892 - 895 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/52447 -
dc.publisher 한국자동제어학회 -
dc.title.alternative Feature Extraction for the Part Recognition System of FMC -
dc.title FMC의 부품인식을 위한 형상 정보 추출에 관한 연구 -
dc.type Conference Paper -
dc.date.conferenceDate 1992-10-19 -
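The methodology described in the abstract, reducing three-dimensional CAD geometry to two-dimensional image-plane data and deriving shape features for a part-recognition database, can be sketched roughly as follows. This is a hypothetical illustration, not the paper's implementation: it assumes an orthographic projection along the camera's z-axis and uses simple bounding-box features (centroid, extent, aspect ratio) as stand-ins for whatever feature set the authors actually stored.

```python
# Hypothetical sketch of the 3D-to-2D feature extraction idea from the
# abstract. Assumptions (not from the paper): orthographic projection along
# the z-axis, and simple bounding-box features standing in for the real
# feature database entries.

def project_to_2d(vertices_3d):
    """Orthographic projection onto the image plane: drop the z coordinate."""
    return [(x, y) for x, y, _ in vertices_3d]

def extract_features(points_2d):
    """Derive simple 2D shape features usable for part recognition."""
    xs = [p[0] for p in points_2d]
    ys = [p[1] for p in points_2d]
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    return {
        "centroid": (sum(xs) / len(xs), sum(ys) / len(ys)),
        "bbox": (min(xs), min(ys), max(xs), max(ys)),
        "aspect_ratio": width / height if height else float("inf"),
    }

# Example: the eight vertices of a unit cube project to a unit square.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
features = extract_features(project_to_2d(cube))
print(features["aspect_ratio"])  # 1.0 for a square silhouette
```

A feature-selection stage, the abstract's secondary concern, would then compare such feature vectors across the given set of parts and keep only the features that discriminate between them.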

