Related Researcher
Chung, Jibum (정지범), Risk Management Policy and Safety Design Lab.

Leveraging Large Language Models for Enhanced Back-Translation: Techniques and Applications

Author(s)
Chung, Jibum; Kim, Taehyun
Issued Date
2025-04
DOI
10.1109/ACCESS.2025.3557014
URI
https://scholarworks.unist.ac.kr/handle/201301/87503
Citation
IEEE ACCESS, v.13, pp.61322 - 61328
Abstract
Cross-cultural studies are prevalent in academia, yet linguistic and cultural disparities make objective research difficult to conduct. Rigorous international comparative research requires questionnaires that can be used in all participating countries, so translation becomes a critically important process. Brislin's back-translation method is widely recognized, but it usually requires many skilled bilingual translators and is both time-consuming and expensive. This study aims to overcome these limitations by using Large Language Model (LLM) AI technology. We utilized the Application Programming Interfaces (APIs) of well-known commercial LLMs, including ChatGPT-3.5, ChatGPT-4o, Google Gemini, and Anthropic Claude 3. The entire program was built in the Python programming language, with a user interface built on the Streamlit library. This pilot study's results confirm the feasibility of LLM-assisted back-translation, particularly for complex topics such as carbon footprint reduction planning. This represents a significant advance over traditional back-translation methods, offering substantial time and cost savings.
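The abstract describes a Brislin-style back-translation loop driven by commercial LLM APIs. The following is a minimal sketch of that loop in Python, not the authors' program: the `back_translate` helper and the `translate` callable are hypothetical names introduced here for illustration. In practice, `translate` would wrap an LLM chat API (the paper reports using ChatGPT-3.5, ChatGPT-4o, Google Gemini, and Anthropic Claude 3, with a Streamlit front end).

```python
# Illustrative sketch of LLM-assisted back-translation (not the authors' code).
# `translate` is assumed to be a wrapper around a commercial LLM API that
# translates `text` from `src_lang` to `tgt_lang` and returns a string.

from typing import Callable

Translator = Callable[[str, str, str], str]


def back_translate(text: str,
                   translate: Translator,
                   src_lang: str = "English",
                   tgt_lang: str = "Korean") -> dict:
    """Brislin-style back-translation: source -> target -> source.

    Returns the forward translation and the back-translation so a
    researcher can compare the back-translated text against the
    original questionnaire item and flag semantic drift.
    """
    forward = translate(text, src_lang, tgt_lang)   # e.g. English -> Korean
    backward = translate(forward, tgt_lang, src_lang)  # Korean -> English
    return {"original": text, "forward": forward, "back": backward}
```

In the paper's workflow, the returned pair (original item, back-translated item) would then be reviewed, by a human or by a further LLM comparison step, to decide whether the forward translation preserved the item's meaning.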
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
ISSN
2169-3536
Keyword (Author)
Translation; Instruments


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.