Posts tagged: HCI

Hidden in plain sight: An exploration of a visual language for near-eye out-of-focus displays in the peripheral view

In this paper, we set out to determine what constitutes an appropriate visual language for information presented on near-eye out-of-focus displays. These displays are positioned in a user's peripheral view, very close to the eyes, for example on the inside of the temples of a pair of glasses. We explored the usable display area, the role of spatial and retinal variables, and the influence of motion and interaction on such a language. Our findings show that a usable visual language can be achieved by limiting the set of possible shapes and by making clever use of orientation and meaningful motion. Motion in particular proved important for improving perception and comprehension of what is displayed on near-eye out-of-focus displays, and perception improves further when direct interaction with the content is allowed.

Read more →

Calculating and visualising energy expenditure to monitor physical activity in tele-rehabilitation

We have developed an approach that presents patients with an intelligible, user-friendly yet accurate visualisation to check progress and verify adherence to the prescribed physical exercise program. Integrated into a comprehensive mobile self-monitoring app, this patient-centric approach helps keep patients motivated and engaged while rehabilitating remotely.
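The summary above does not spell out how energy expenditure is computed. As a rough illustration of the kind of calculation such an app might rely on, the C# sketch below applies the common MET formula (kcal ≈ MET × body weight in kg × duration in hours); the activity names and MET values are assumptions for the example, not figures from the paper.

```csharp
// Minimal sketch of MET-based energy expenditure estimation (kcal ≈ MET × kg × hours).
// The activity-to-MET mapping below is an illustrative assumption, not taken from the paper.
using System;
using System.Collections.Generic;

class EnergyExpenditureSketch
{
    // Approximate MET values per activity type.
    static readonly Dictionary<string, double> MetByActivity = new Dictionary<string, double>
    {
        { "resting", 1.0 },
        { "walking", 3.5 },
        { "cycling_moderate", 6.8 }
    };

    // kcal ≈ MET × body weight (kg) × duration (hours)
    static double EstimateKcal(string activity, double weightKg, double minutes)
    {
        return MetByActivity[activity] * weightKg * (minutes / 60.0);
    }

    static void Main()
    {
        double kcal = EstimateKcal("cycling_moderate", 80, 30);
        Console.WriteLine($"Estimated energy expenditure: {kcal:F0} kcal");
    }
}
```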

Back on bike: The BoB mobile cycling app for secondary prevention in cardiac patients

Persons who have suffered from cardiac disease are often advised to integrate a sufficient level of physical exercise into their daily lives. Initially, cardiac rehabilitation takes place in a closely monitored setting in a hospital or a rehabilitation center. Sustaining the effort once the patient has left this supervised, ambulatory environment is a challenge, and drop-out rates are high. Emerging approaches such as telemonitoring and telerehabilitation have shown the potential to support cardiac patients in adhering to the advised physical exercise. However, most telerehabilitation solutions only support a limited range of physical exercise, such as step counting during walking. We propose BoB (Back on Bike), a mobile application that guides cardiac patients while cycling. Design choices are explained according to three pillars: ease of use, fear reduction, and direct and indirect motivation. In this paper, we report the results of a field study with cardiac patients.

Read more →

A grounded approach for applying behavior change techniques in mobile cardiac tele-rehabilitation

In mobile tele-rehabilitation applications for Coronary Artery Disease (CAD) patients, behavior change plays a central role in promoting better therapy adherence and preventing disease recurrence. However, creating sustainable behavior change that holds a beneficial impact over a prolonged period of time remains an important challenge. In this paper, we discuss various models and frameworks related to persuasion and behavior change, and investigate how to incorporate them into a multidisciplinary user-centered design approach for creating a mobile tele-rehabilitation application. By implementing different concepts that contribute to behavior change and applying a set of distinct persuasive design patterns, we were able to translate the high-level goals of behavior theory into a mobile application that explicitly incorporates behavior change techniques while offering a good overall user experience. We evaluated our system, HeartHab, in a lab setting and show that our approach leads to high user acceptance and willingness to use the system in daily activities.

Read more →

PaperPulse: An integrated approach for embedding electronics in paper designs

We present PaperPulse, a design and fabrication approach that enables designers without a technical background to produce standalone interactive paper artifacts by augmenting them with electronics. With PaperPulse, designers overlay pre-designed visual elements with widgets available in our design tool. PaperPulse provides three families of widgets designed for smooth integration with paper, for a total of 20 different interactive components. We also contribute a logic demonstration and recording approach, Pulsation, that allows designers to specify functional relationships between widgets. Using the final design and the recorded Pulsation logic, PaperPulse generates layered electronic circuit designs, along with code that can be deployed on a microcontroller. By following automatically generated assembly instructions, designers can seamlessly integrate the microcontroller and widgets into the final paper artifact.
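Purely as a hypothetical illustration of what a recorded Pulsation relationship between two widgets could boil down to, the C# sketch below models a button widget toggling an LED widget; the widget classes and rule structure are assumptions for the example, not the code PaperPulse actually generates.

```csharp
// Hypothetical reduction of a recorded widget relationship: a button toggles an LED.
// Widget classes and the rule representation are illustrative assumptions, not PaperPulse output.
using System;

abstract class Widget { public string Id = ""; }
class ButtonWidget : Widget { public bool Pressed; }
class LedWidget : Widget { public bool On; }

// A recorded relationship between a source widget and a target widget.
class PulsationRule
{
    readonly ButtonWidget source;
    readonly LedWidget target;

    public PulsationRule(ButtonWidget source, LedWidget target)
    {
        this.source = source;
        this.target = target;
    }

    // When the source button is pressed, toggle the target LED.
    public void Apply()
    {
        if (source.Pressed) target.On = !target.On;
    }
}

class Demo
{
    static void Main()
    {
        var play = new ButtonWidget { Id = "play", Pressed = true };
        var status = new LedWidget { Id = "status" };
        new PulsationRule(play, status).Apply();
        Console.WriteLine($"LED '{status.Id}' is {(status.On ? "on" : "off")}");
    }
}
```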

Read more →

Gestu-Wan: An intelligible mid-air gesture guidance system for walk-up-and-use displays

We present Gestu-Wan, an intelligible gesture guidance system designed to support mid-air gesture-based interaction for walk-up-and-use displays. Although gesture-based interfaces have become more prevalent, there is currently very little uniformity with regard to gesture sets and the way gestures can be executed. This leads to confusion, poor user experiences, and users who would rather avoid mid-air gesture interaction than engage in it. Our approach improves the visibility of gesture-based interfaces and facilitates the execution of mid-air gestures without prior training. We compare Gestu-Wan with a static gesture guide and show that it helps users both perform complex gestures and understand how the gesture recognizer works.

Read more →

Empirical study: Comparing Hasselt with C# to describe multimodal dialogs

Previous research has proposed guidelines for creating domain-specific languages for modeling human-machine multimodal dialogs. One of these guidelines suggests using multiple levels of abstraction so that descriptions of multimodal events can be separated from the human-machine dialog model. In line with this guideline, we implemented Hasselt, a domain-specific language that combines textual and visual models, each aiming to describe different aspects of the intended dialog system. We conducted a user study to measure whether the proposed language provides benefits over equivalent event-callback code. During the user study, participants had to modify both the Hasselt models and the equivalent C# code. Completion times were on average shorter for C#, although the difference was not statistically significant. Subjective responses were collected through standardized questionnaires and an interview, both of which indicated that participants saw value in the proposed models. We provide possible explanations for these results and discuss lessons learned regarding the design of the empirical study.
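For readers unfamiliar with the event-callback style that Hasselt was compared against, the C# sketch below shows what such code might look like for a simple speech-plus-pointing command; the event names, fusion logic and simulation helpers are illustrative assumptions, not the material used in the study.

```csharp
// Hypothetical event-callback fragment of a multimodal "put that there" style dialog.
// Event sources, handler names and fusion logic are illustrative assumptions,
// not the actual study code.
using System;

class SpeechEventArgs : EventArgs { public string Command = ""; }
class PointEventArgs : EventArgs { public double X; public double Y; }

class MultimodalDialog
{
    public event EventHandler<SpeechEventArgs> SpeechRecognized;
    public event EventHandler<PointEventArgs> PointerDown;

    // Dialog state ends up scattered across fields and callbacks.
    string pendingCommand;
    double? pointX, pointY;

    public MultimodalDialog()
    {
        SpeechRecognized += (s, e) => { pendingCommand = e.Command; TryFuse(); };
        PointerDown += (s, e) => { pointX = e.X; pointY = e.Y; TryFuse(); };
    }

    // Fire the action only once both modalities have arrived.
    void TryFuse()
    {
        if (pendingCommand == "put that" && pointX.HasValue && pointY.HasValue)
        {
            Console.WriteLine($"Placing object at ({pointX}, {pointY})");
            pendingCommand = null;
            pointX = pointY = null;
        }
    }

    // Helpers so the example can be run without real recognizers.
    public void SimulateSpeech(string cmd) =>
        SpeechRecognized?.Invoke(this, new SpeechEventArgs { Command = cmd });
    public void SimulatePoint(double x, double y) =>
        PointerDown?.Invoke(this, new PointEventArgs { X = x, Y = y });
}

class Program
{
    static void Main()
    {
        var dialog = new MultimodalDialog();
        dialog.SimulateSpeech("put that");
        dialog.SimulatePoint(120, 45); // prints: Placing object at (120, 45)
    }
}
```

The fragment also hints at why such code can be harder to modify: dialog state and fusion logic are spread over fields and callbacks rather than expressed in a single dialog model.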

Read more →

Augmenting social interactions: Realtime behavioural feedback using social signal processing techniques

Nonverbal and unconscious behaviour is an important component of daily human-human interaction. This is especially true in situations such as public speaking, job interviews or information-sensitive conversations, where researchers have shown that an increased awareness of one's behaviour can improve the outcome of the interaction. With wearable technology such as Google Glass, we now have the opportunity to augment social interactions and provide realtime feedback on one's behaviour in an unobtrusive way. In this paper we present Logue, a system that provides realtime feedback on a presenter's openness, body energy and speech rate during public speaking. The system analyses the user's nonverbal behaviour using social signal processing techniques and gives visual feedback on a head-mounted display. We conducted two user studies, with a staged and a real presentation scenario, which showed that Logue's feedback was perceived as helpful and had a positive impact on the speakers' performance.
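Logue's exact feature extraction is not detailed in this summary; as one example of the kind of measure involved, the C# sketch below estimates speech rate over a sliding window of recognized word timestamps. The window length and the feedback threshold are assumptions for the example, not parameters from the paper.

```csharp
// Illustrative sliding-window estimate of speech rate in words per minute.
// The 10-second window and the 160 wpm threshold are assumptions, not values from the paper.
using System;
using System.Collections.Generic;

class SpeechRateMonitor
{
    static readonly TimeSpan Window = TimeSpan.FromSeconds(10);
    readonly Queue<DateTime> wordTimes = new Queue<DateTime>();

    // Call this whenever the recognizer reports a new word; returns the current rate.
    public double OnWordDetected(DateTime timestamp)
    {
        wordTimes.Enqueue(timestamp);
        while (wordTimes.Count > 0 && timestamp - wordTimes.Peek() > Window)
            wordTimes.Dequeue();

        // Scale the number of words seen in the window up to words per minute.
        return wordTimes.Count * (60.0 / Window.TotalSeconds);
    }

    static void Main()
    {
        var monitor = new SpeechRateMonitor();
        var start = DateTime.Now;
        double wpm = 0;
        for (int i = 0; i < 30; i++)              // 30 words, one every 300 ms
            wpm = monitor.OnWordDetected(start.AddMilliseconds(i * 300));

        Console.WriteLine(wpm > 160 ? $"Slow down ({wpm:F0} wpm)" : $"Pace OK ({wpm:F0} wpm)");
    }
}
```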

Read more →