| dc.description.abstract |
Non-humanoid domestic robots rely heavily on physical motion to communicate intent and affect, yet most existing frameworks and tools either assume humanoid morphologies or fragment responsibility between designers and engineers. This thesis proposes a three-stage framework for designing contextually appropriate physical actions in non-humanoid robots and demonstrates it on Lemmy, an indoor elderly care robot with a differential drive base and tilting body. In the first stage, Designing Robot Actions, designer interviews and a participatory workshop produce a hierarchical action catalogue that separates functional and expressive behaviors and decomposes locomotion into three motion components: speed, path, and posture. Two user studies then establish component preferences and validate that actions assembled from preferred components are rated more favorably than non-preferred or randomly composed actions, including in an elderly cohort. In the second stage, Technical Integration and Engineering Export, a custom Blender add-on, True RoboAnimator, links chassis animation to differential drive kinematics, automatically checks non-holonomic feasibility, corrects invalid paths, and exports wheel-level trajectories and speed profiles that can be used directly by robotics and AR pipelines. A usability study with external 3D artists indicates that the add-on fits existing animation workflows while exposing engineering constraints in an accessible way. The third stage, User Testing in Augmented Reality, develops an iOS and visionOS application that plays back validated actions as life-size animations in users’ own environments, enabling remote evaluation of action variants and expressive motions before hardware deployment. Finally, expert interviews with designers, engineers, and XR practitioners assess the clarity, usefulness, and transferability of the framework.
Together, these contributions demonstrate a concrete, repeatable process that connects component-level preferences, kinematically grounded animation tooling, and AR-based evaluation, offering a scalable alternative to ad hoc motion design for non-humanoid service robots. |
- |