<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns="http://purl.org/rss/1.0/" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel rdf:about="https://scholarworks.unist.ac.kr/handle/201301/146">
    <title>Repository Collection:</title>
    <link>https://scholarworks.unist.ac.kr/handle/201301/146</link>
    <description />
    <items>
      <rdf:Seq>
        <rdf:li rdf:resource="https://scholarworks.unist.ac.kr/handle/201301/91185" />
        <rdf:li rdf:resource="https://scholarworks.unist.ac.kr/handle/201301/91184" />
        <rdf:li rdf:resource="https://scholarworks.unist.ac.kr/handle/201301/91183" />
        <rdf:li rdf:resource="https://scholarworks.unist.ac.kr/handle/201301/91182" />
      </rdf:Seq>
    </items>
    <dc:date>2026-04-04T11:26:30Z</dc:date>
  </channel>
  <item rdf:about="https://scholarworks.unist.ac.kr/handle/201301/91185">
    <title>RealityBrush: an AR authoring system that captures and utilizes kinetic properties of everyday objects</title>
    <link>https://scholarworks.unist.ac.kr/handle/201301/91185</link>
    <description>Title: RealityBrush: an AR authoring system that captures and utilizes kinetic properties of everyday objects
Author(s): Kim, Hyunju; Hong, Sanghwa; Kim, Junki; Jang, Taesoo; Woo, Woontaek; Heo, Seongkook; Lee, Byungjoo
Abstract: This study introduces RealityBrush, a novel augmented reality (AR) authoring system that allows designers to quickly and easily create realistic virtual objects by capturing and utilizing the kinetic properties of everyday physical objects in the early stages of design. The RealityBrush system consists of a handheld device, a data analysis module, and an AR feedback module. The handheld device, shaped like a rod, is equipped with a depth camera and a force sensor at the tip. When a user holds the device and pokes a physical object, the local force applied to the object and the resulting deformations of the object are measured simultaneously. By analyzing the relationship between the measured force and the deformations, the RealityBrush system can identify two kinetic properties of the poked object: stiffness and motion resistance. The user can then use the handheld device as a 3D brush to create a virtual object in the air and assign the measured kinetic properties to the created virtual object. Finally, the system's physics engine allows the user to interact with the created object by using the device to poke or push it. A technical evaluation showed that the system can successfully extract the stiffness and motion resistance of everyday objects. We also report initial user feedback on AR authoring with the RealityBrush system.</description>
    <dc:date>2021-07-31T15:00:00Z</dc:date>
  </item>
  <item rdf:about="https://scholarworks.unist.ac.kr/handle/201301/91184">
    <title>MaterialSense: Estimating and utilizing material properties of contact objects in multi-touch interaction</title>
    <link>https://scholarworks.unist.ac.kr/handle/201301/91184</link>
    <description>Title: MaterialSense: Estimating and utilizing material properties of contact objects in multi-touch interaction
Author(s): Hong, Sanghwa; Heo, Seongkook; Lee, Byungjoo
Abstract: In today's digital art applications, users can play virtual musical instruments or paint on a virtual canvas using multi-touch input. Existing touch input devices, however, do not account for the fact that the result of artistic expression changes depending on the material properties of the tool that comes into contact with the art medium. This paper proposes a novel method called MaterialSense that estimates the material properties of objects in contact with a touch input surface. The technique uses a commercial touchpad with six load cells attached and, when up to two objects are in contact, solves the force equilibrium equations to recover the 3-axis force at each touchpoint. By analyzing the time-series data of the measured 3-axis force and normal vector for each touchpoint, it estimates the stiffness and kinetic friction coefficient of each contacted object. The estimated material properties enable novel and realistic artistic expressions in touch-based digital art applications, such as changes in the tone of virtual instruments or the effects of different painting brushes. We present two application scenarios using MaterialSense, along with an in-depth technical evaluation verifying the accuracy and precision of its estimates.</description>
    <dc:date>2023-03-31T15:00:00Z</dc:date>
  </item>
  <item rdf:about="https://scholarworks.unist.ac.kr/handle/201301/91183">
    <title>SequenceSense: A Tool for Designing Usable Foot-Based Gestures Using a Sequence-Based Gesture Recognizer</title>
    <link>https://scholarworks.unist.ac.kr/handle/201301/91183</link>
    <description>Title: SequenceSense: A Tool for Designing Usable Foot-Based Gestures Using a Sequence-Based Gesture Recognizer
Author(s): Azim, Md Aashikur Rahman; Rahman, Adil; Heo, Seongkook
Abstract: Foot-based gestures enable people to interact with mobile and wearable devices when their hands are unavailable for interaction. For foot gestures to be truly usable, they should be recognizable by the system without being confused with daily activities, while still being easy to perform. However, designing such gestures often requires multiple iterations of gesture design, model training, and evaluation. In this paper, we present SequenceSense, a tool that helps designers efficiently design a usable set of inertial-sensor-based gestures. By supporting gesture modification through the sequencing of atomic actions and instant false-positive analysis, SequenceSense requires only an initial collection of gesture samples and eliminates the need for repeated data collection studies to evaluate a gesture set's usability. Unlike gesture recognizers that use complete gestures to train a model, SequenceSense segments each gesture into a sequence of atomic actions. For example, a foot tap to the right may consist of (1) lifting the foot, (2) moving the foot to the right, and (3) landing the foot. SequenceSense also compares each gesture sequence with a sequence database built from daily activities to identify possible conflicts. This allows gesture designers to build usable foot-based gestures without having to repeatedly collect and evaluate gesture data. We validated SequenceSense's efficacy in designing usable gestures with low false-positive rates through a user study with nine gesture designers.</description>
    <dc:date>2023-07-31T15:00:00Z</dc:date>
  </item>
  <item rdf:about="https://scholarworks.unist.ac.kr/handle/201301/91182">
    <title>Frappé: An Ultra Lightweight Mobile UI Framework for Rapid API-based Prototyping and Environmental Deployment</title>
    <link>https://scholarworks.unist.ac.kr/handle/201301/91182</link>
    <description>Title: Frappé: An Ultra Lightweight Mobile UI Framework for Rapid API-based Prototyping and Environmental Deployment
Author(s): Rahman, Adil; Heo, Seongkook
Abstract: QR codes have been used as an inexpensive means to connect users to digital platforms such as websites and mobile applications. However, despite their ubiquity, QR codes are limited in purpose: they can only redirect users to the URL contained within them, which makes their use heavily network-dependent and can be unsuitable for ephemeral scenarios and areas with limited connectivity. In this paper, we introduce Frappé, a framework capable of deploying ultra-lightweight UIs to mobile devices directly through QR codes, without requiring any network connectivity. This is achieved by decomposing the UI into metadata and storing it inside the QR code, while offloading the UI functionality to API calls. We also introduce enFrappé, a WYSIWYG tool for building Frappé UIs. We demonstrate the lightweight nature of our framework through a technical evaluation and the usability of our UI builder tool through a user study.</description>
    <dc:date>2023-08-31T15:00:00Z</dc:date>
  </item>
</rdf:RDF>