Abstract:
Augmented Reality (AR) holds significant potential to support users in performing manual tasks. To provide effective support, however, we need to understand how movement instructions shown in AR affect how well people can reproduce those movements in real life. In this paper, we examine the degree to which users can synchronize the speed of their movements with speed cues presented in an AR environment. Specifically, we investigate the effects of timing in AR visual guidance. We assess performance using a highly realistic Mixed Reality (MR) welding simulation; welding is a task that demands very precise timing and control of hand and arm motion. Our results show that upfront visual guidance (presented before manual task execution) alone often fails to convey the intended speeds, especially at higher target speeds. Live guidance (presented during manual task execution) yields more accurate speeds but typically induces an overshoot at the start. Optimal outcomes occur when visual guidance appears upfront and continues during the activity for users to follow.