Posts tagged: HCI
The EICS 2014 doctoral consortium
Suit up!: Enabling eyes-free interactions on jacket buttons
We present a new interaction space for wearables by integrating interactive elements, in the form of buttons, into outdoor clothing, specifically jackets and coats. Interactive buttons, or "iButtons", allow users to perform specific tasks using subtle, inconspicuous gestures. They are intended for outdoor settings, where reaching for a mobile phone or another device may not be convenient or appropriate. Different types of buttons serve dedicated functions, and appropriate placement of these buttons makes them easily accessible without requiring visual contact. By adding context sensitivity, these buttons can also be repurposed to fit other functions. By linking multiple buttons, it is possible to create workflows for specific tasks. We provide a description of an initial iButton design space and highlight some scenarios to illustrate the envisioned usage of interactive buttons.
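The context-sensitive repurposing described above can be pictured as a simple dispatch table. This is a hypothetical sketch, not the paper's implementation: the button names, gestures, and contexts below are illustrative assumptions.

```python
# Hypothetical iButton dispatch: the same physical button maps to different
# actions depending on the detected context. All names here are assumptions
# chosen for illustration, not taken from the paper.
BINDINGS = {
    ("collar_button", "double_press", "music"):   "next_track",
    ("collar_button", "double_press", "call"):    "hang_up",
    ("cuff_button",   "rotate",       "music"):   "volume",
    ("cuff_button",   "rotate",       "cycling"): "zoom_map",
}

def dispatch(button, gesture, context):
    """Resolve a button gesture to an action for the current context."""
    return BINDINGS.get((button, gesture, context), "ignored")
```

A lookup keyed on (button, gesture, context) keeps the mapping declarative, so repurposing a button for a new context is a data change rather than a code change.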
Paddle: Highly deformable mobile devices with physical controls
Touch screens have been widely adopted in mobile devices. Although touch input is flexible enough to support a wide variety of applications, touch screens do not provide physical affordances, encourage eyes-free use, or exploit the full dexterity of our hands, due to the lack of physical controls. On the other hand, physical controls are often tailored to the task at hand, making them less flexible and therefore less suitable for general-purpose use in mobile settings. In this paper, we show how to combine the flexibility of touch screens with the physical qualities that real-world controls provide in a mobile context. We do so using a deformable device that can be transformed into various special-purpose physical controls. We present Paddle, a highly deformable device that can be transformed into different shapes. Paddle bridges the gap between the differently sized mobile devices available today, such as phones and tablets. Additionally, Paddle demonstrates a novel opportunity for deformable devices to transform into differently shaped physical controls that provide clear physical affordances for the task at hand. Physical controls have the advantage of exploiting people's innate abilities for manipulating physical objects in the real world. We designed and implemented a prototype system whose engineering principles are based on the design of Rubik's Magic, a folding plate puzzle. Additionally, we explore the interaction techniques enabled by this concept and conduct an in-depth study to evaluate our transformable physical controls. Our findings show that these physical controls provide several benefits over the traditional touch interaction techniques commonly used on mobile devices.
Multi-viewer gesture-based interaction for omni-directional video
Omni-directional video (ODV) is a novel medium that offers viewers a 360° panoramic recording. This type of content will become more common within our living rooms in the near future, seeing that immersive displaying technologies such as 3D television are on the rise. However, little attention has been given to how to interact with ODV content. We present a gesture elicitation study in which we asked users to perform mid-air gestures that they consider to be appropriate for ODV interaction, both for individual as well as collocated settings. We are interested in the gesture variations and adaptations that come forth from individual and collocated usage. To this end, we gathered quantitative and qualitative data by means of observations, motion capture, questionnaires and interviews. This data resulted in a user-defined gesture set for ODV, alongside an in-depth analysis of the variation in gestures we observed during the study.
Investigating the effects of using biofeedback as visual stress indicator during video-mediated collaboration
During remote video-mediated assistance, instructors often guide workers through problems and instruct them to perform unfamiliar or complex operations. However, the workers' performance might deteriorate due to stress. We argue that presenting the workers' biofeedback to the instructor can improve communication and lead to lower stress. This paper presents a thorough investigation of the mental workload and stress perceived by twenty participants, paired up in an instructor-worker scenario, performing remote video-mediated tasks. The interface conditions differ in task, facial, and biofeedback communication. Two self-report measures are used to assess mental workload and stress. Results show that pairs reported lower mental workload and stress when instructors used the biofeedback interface compared to interfaces with a facial view. Significant correlations were found between task performance and reduced stress (i.e., increased task engagement and decreased worry) for instructors, and between task performance and declining mental workload (i.e., increased performance) for workers. Our findings provide insights to advance video-mediated interfaces for remote collaborative work.
Game of tones: Learning to play songs on a piano using projected instructions and games
Learning to play a musical instrument such as the piano requires a substantial amount of practice and perseverance in learning to read and play from sheet music. Our interactivity demo allows people to learn to play songs without requiring sheet music reading skills. We project a graphical notation on top of a piano that indicates what key(s) need to be pressed and create a feedback loop that monitors the player's performance. We implemented The Augmented Piano (TAP), which is a straightforward combination of a physical piano with our alternative notation projected on top. Piano Attack (PAT) extends TAP with a shooting game that continuously provides game-based incentives for learning to play the piano.
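The feedback loop described above can be sketched in a few lines: compare the keys the projected notation currently highlights with the keys actually pressed (e.g. from MIDI note-on events), and decide whether the projection should advance. This is a minimal sketch under assumed data shapes, not the TAP/PAT implementation itself.

```python
# Hypothetical sketch of a projected-notation feedback loop. Expected and
# pressed keys are modeled as sets of MIDI note numbers; the function names
# and the "advance"/"hold" cues are assumptions for illustration.

def score_step(expected, pressed):
    """Classify one time step of playing.

    expected -- set of MIDI note numbers the projection highlights
    pressed  -- set of MIDI note numbers currently held down
    Returns (hits, missed, wrong) as sets.
    """
    hits = expected & pressed      # highlighted keys that were played
    missed = expected - pressed    # highlighted keys not yet played
    wrong = pressed - expected     # played keys that were not highlighted
    return hits, missed, wrong

def step_feedback(expected, pressed):
    """Map one step to a simple projection cue."""
    hits, missed, wrong = score_step(expected, pressed)
    if not missed and not wrong:
        return "advance"   # whole chord correct: move the notation forward
    return "hold"          # keep highlighting until the chord is right
```

In a game variant like PAT, the same (hits, missed, wrong) classification could drive scoring instead of just advancing the notation.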
A domain-specific textual language for rapid prototyping of multimodal interactive systems
Timisto: A technique to extract usage sequences from storyboards
Storyboarding is a technique that is often used for the conception of new interactive systems. A storyboard illustrates graphically how a system is used by its users and what a typical context of usage is. Although the informal notation of a storyboard stimulates creativity and makes storyboards easy for everyone to understand, it is more difficult to integrate them into further steps of the engineering process. We present an approach, "Time In Storyboards" (Timisto), to extract valuable information on how various interactions with the system are positioned in time with respect to each other. Timisto does not interfere with the creative process of storyboarding, but maximizes the structured information about time that can be deduced from a storyboard.
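One way to picture the kind of temporal information such an approach can extract: if storyboard panels are read in drawing order, interactions in an earlier panel happen before interactions in a later one. The sketch below is a hypothetical illustration of that idea, not the Timisto technique; the panel representation is an assumption.

```python
# Hypothetical extraction of "happens-before" pairs from a storyboard.
# panels -- list of panels in storyboard order, each a list of the
# interaction labels that panel depicts (an assumed representation).

def happens_before(panels):
    """Return the set of (earlier, later) interaction pairs implied
    by the order of the panels they appear in."""
    pairs = set()
    for i, earlier in enumerate(panels):
        for later in panels[i + 1:]:
            for a in earlier:
                for b in later:
                    pairs.add((a, b))
    return pairs
```

The resulting partial order is the sort of structured timing information that later engineering steps (e.g. task models) could consume without touching the drawings themselves.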