Vision-based real-time layer error quantification for additive manufacturing

Author(s)
Jeong, Haedong; Kim, Minsub; Park, Bumsoo; Lee, Seungchul
Issued Date
2017-06-04
DOI
10.1115/MSEC2017-2991
URI
https://scholarworks.unist.ac.kr/handle/201301/35318
Fulltext
http://proceedings.asmedigitalcollection.asme.org/proceeding.aspx?articleid=2646245
Citation
ASME 2017 12th International Manufacturing Science and Engineering Conference collocated with the JSME/ASME 2017 6th International Conference on Materials and Processing
Abstract
Quality assurance of Additive Manufacturing (AM) products has become an important issue as AM technology extends its application throughout industry. However, because there is no definitive measure to quantify product error and monitor the manufacturing process, many attempts have been made to propose an effective monitoring system for the quality assurance of AM products. In this research, a novel approach for quantifying error in real time is presented through a closed-loop vision-based tracking method. As conventional AM processes are open loop, we focus on implementing real-time error quantification of the products through a closed-loop process. Three test models are designed for the experiment, and the tracking data from the camera are compared with the G-code of the product to evaluate the geometric errors. The results obtained from the camera analysis are then validated by comparison with results obtained from a 3D scanner.
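The abstract describes comparing camera tracking data against the product's G-code to evaluate geometric error. The paper itself does not publish an implementation; as a minimal sketch of that kind of comparison (all function names and the simplified G-code handling here are assumptions for illustration), one could extract the commanded XY path from a layer's G-code moves and measure each tracked point's deviation from that path:

```python
import math
import re

def parse_gcode_xy(lines):
    """Extract the commanded XY path from G0/G1 moves in one G-code layer.

    Simplified assumption: absolute coordinates, XY only; a missing X or Y
    word retains the previous value, as in standard G-code modal behavior.
    """
    path, x, y = [], 0.0, 0.0
    for line in lines:
        if not line.startswith(("G0", "G1")):
            continue
        mx = re.search(r"X(-?\d+\.?\d*)", line)
        my = re.search(r"Y(-?\d+\.?\d*)", line)
        if mx:
            x = float(mx.group(1))
        if my:
            y = float(my.group(1))
        path.append((x, y))
    return path

def point_segment_distance(p, a, b):
    """Shortest distance from point p to the line segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to [0, 1] along its length.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def layer_error(tracked_points, gcode_path):
    """Per-point deviation of the vision-tracked path from the G-code path."""
    return [
        min(point_segment_distance(p, gcode_path[i], gcode_path[i + 1])
            for i in range(len(gcode_path) - 1))
        for p in tracked_points
    ]
```

For example, against a commanded straight segment `["G1 X0 Y0", "G1 X10 Y0"]`, a tracked point at (5.0, 0.3) yields a 0.3 mm deviation. A real system would add camera calibration, coordinate registration between image and machine frames, and per-layer aggregation of these errors.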
Publisher
ASME 2017 12th International Manufacturing Science and Engineering Conference collocated with the JSME/ASME 2017 6th International Conference on Materials and Processing
