Haptipedia: Accelerating Haptic Device Discovery for Designers

Creating physical experiences often entails selecting, modifying, or inventing specialized haptic hardware. However, experience designers are rarely engineers, and lack of knowledge about haptic hardware is often a major barrier to non-experts designing novel physical interactions.

We developed Haptipedia: an online taxonomy, database, and visualization of 105+ haptic devices invented since 1992. Haptipedia's taxonomy unifies expert knowledge on haptics and makes it accessible to HCI researchers. The database and visualization give users a flexible way to explore major developments in the field, examine device trade-offs, and repurpose existing designs into novel devices and interactions for their projects.

Interact with Haptipedia

Get involved and contribute!

Characterizing Users' Sense-Making Schemas for Vibrations

How do people perceive haptic signals? How can we design a "snoring" sensation?

In this project, we examined haptic facets as a means to describe and analyze users' cognitive frameworks for making sense of the qualitative and affective characteristics of haptic sensations (e.g., synthetic vibrations). Specifically, we proposed five facets based on how people describe vibrations (physical, sensory, emotional, metaphoric, and usage examples) and investigated the underlying dimensions and cross-linkages among these descriptions based on participants' perception of a 120-item vibration library. We provide concrete guidelines for designing and evaluating expressive vibrotactile signals.

[Video][IJHCS Journal Paper]

Crowdsourcing Haptic Evaluation on Amazon Mechanical Turk

Crowdsourcing can gather rapid feedback at scale, but how do we crowdsource a haptic prototype? Unlike visual and auditory studies, haptic research often requires specialized hardware that crowd workers do not have, so haptic evaluation is commonly limited to small-scale lab-based studies. In this project, we investigated "proxy modalities" as a means to crowdsource haptic evaluation on online platforms such as Amazon's Mechanical Turk. We designed three sets of proxies for vibrations (two visual proxies and a low-fidelity vibration proxy) and ran lab-based and Mechanical Turk studies to validate our approach.

[Video][CHI'16 Paper]

VibViz: Organizing and Visualizing a Vibrotactile Library for Ordinary Users

How can we provide simple and effective access to haptic libraries? How can we make the navigation and selection process engaging?

We designed VibViz, an interactive visualization that provides easy and highly navigable access to large, diverse sets of vibrotactile stimuli. The interface presents our 120-item library in three connected views: a Physical view, a Sensory and Emotional view, and a Metaphor and Usage Example view. Users can freely browse the vibrations or narrow them down to a subset using sliders and adjective filters.
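
The real VibViz is an interactive web visualization (linked below, with code on GitHub); the toy Python sketch that follows only illustrates the underlying browse-and-filter idea, i.e., narrowing a tagged vibration library with a slider range and adjective filters. The attribute and tag names here are made-up placeholders, not the actual library metadata.

```python
# Toy illustration of the browse-and-filter idea behind VibViz: narrow a
# tagged vibration library with a numeric range (slider) and adjective tags.
# Attribute and tag names are hypothetical placeholders.
library = [
    {"name": "V001", "duration_s": 0.4, "energy": 0.8, "tags": {"urgent", "rough"}},
    {"name": "V002", "duration_s": 1.2, "energy": 0.3, "tags": {"calm", "smooth"}},
    {"name": "V003", "duration_s": 0.7, "energy": 0.6, "tags": {"playful", "rough"}},
]

def filter_vibrations(items, energy_range=(0.0, 1.0), required_tags=frozenset()):
    """Keep vibrations whose energy falls in the slider range and whose tags include all required adjectives."""
    lo, hi = energy_range
    return [v for v in items
            if lo <= v["energy"] <= hi and required_tags <= v["tags"]]

print(filter_vibrations(library, energy_range=(0.5, 1.0), required_tags={"rough"}))
```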

[Video][WHC'15 Paper]

Interact with VibViz
Check VibViz code on GitHub

To reference VibViz, please cite the following paper: Seifi, Hasti, Kailun Zhang, and Karon E. MacLean. "VibViz: Organizing, visualizing and navigating vibration libraries." 2015 IEEE World Haptics Conference (WHC). IEEE, 2015.

Vibrotactile Personalization Tools For Ordinary Users

What characteristics will make a haptic personalization tool usable and effective for naive users?

Based on a review of existing tools in haptics and other domains, we proposed five design parameters for haptic personalization tools and three distinct personalization mechanisms: 1) choosing: users select from a list of pre-designed vibrations; 2) tuning: users adjust high-level characteristics of a vibration using a control or slider; 3) chaining: users combine short pre-designed tactile building blocks (e.g., by sequencing them) into a new vibration sensation. Results from a controlled Wizard of Oz study suggested that choosing and tuning are the most practical and preferred mechanisms for personalization tools.
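
As a purely illustrative sketch of the chaining mechanism (not tooling from the study), the Python snippet below concatenates short pre-designed building blocks, represented here as toy amplitude envelopes at an assumed 1 kHz sample rate, into a new, longer vibration.

```python
import numpy as np

# Toy "chaining": sequence short pre-designed tactile building blocks
# (amplitude envelopes) into one longer vibration. Block shapes, names,
# and the 1 kHz sample rate are illustrative assumptions.
FS = 1000  # samples per second (assumed)

def ramp_up(duration_s=0.2):
    return np.linspace(0.0, 1.0, int(FS * duration_s))

def pulse(duration_s=0.1, level=1.0):
    return np.full(int(FS * duration_s), level)

def pause(duration_s=0.15):
    return np.zeros(int(FS * duration_s))

def chain(*blocks):
    """Sequence building blocks into a single amplitude envelope."""
    return np.concatenate(blocks)

# Example: a "double knock" followed by a soft ramp.
envelope = chain(pulse(0.08), pause(0.05), pulse(0.08), pause(0.2), ramp_up(0.3))
print(f"{len(envelope) / FS:.2f} s envelope, peak amplitude {envelope.max():.1f}")
```

Swapping the order or durations of the blocks yields a different sensation, which is the essence of the chaining mechanism.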

[Video][HAPTICS'14 Paper]

Personalized Training Feedback in Laparoscopic Surgery

Traditional means of training novice surgeons in laparoscopic (minimally invasive) surgical procedures require extensive monitoring and guidance from expert surgeons. Studies in recent years show strong links between a trainee's tool motions and their skill level. Building on these studies, we aim to assess trainee performance and inefficiency during dry-lab practice using data from haptic sensors attached to their surgical tools, and to develop personalized haptic and multimodal feedback that corrects a trainee's hand movements.

We are collaborating with the Children's Hospital of Philadelphia (CHOP) in the USA to collect practice data from novice surgeons, annotate them with 6-9 activities and mistakes, and analyze the signals with machine learning techniques.

Design Space of Interactive Haptic Applications for Smartwatches

Despite the rapid growth of wearable and mobile devices and the associated availability of vibrotactile actuators, the design space of vibrotactile applications has remained limited to event-based vibration notifications. While useful in some cases, these notifications can be disruptive and cognitively demanding. In this project, we extended the design space of vibrotactile applications by combining users' touch input (e.g., 2D location and path) with vibration sensations for smartwatch interactions. We proposed a design space for vibrotactile (VT) interactions and investigated user performance (error rate, completion time, perceived demand) for touch-based VT interactions in a user study. Finally, we populated the design space with five example applications that run on off-the-shelf smartwatches.
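
The sketch below is not the paper's design space or one of its applications; it only illustrates, under assumed screen dimensions and parameter ranges, the general kind of touch-to-vibration mapping such interactions can use: a 2D touch location on a hypothetical smartwatch face drives vibration amplitude and pulse rate.

```python
# Illustrative only: map a 2D touch location on a hypothetical smartwatch
# face to vibration parameters. The mapping, ranges, and parameter names
# are assumptions, not the design space or applications from the paper.
WATCH_W, WATCH_H = 320, 320   # assumed touchscreen resolution in pixels

def touch_to_vibration(x, y):
    """Map a touch position to (amplitude 0..1, pulse interval in ms)."""
    amplitude = min(max(x / WATCH_W, 0.0), 1.0)                 # left -> right = weak -> strong
    interval_ms = 40 + min(max(y / WATCH_H, 0.0), 1.0) * 260    # top = shorter interval = faster pulses
    return amplitude, interval_ms

print(touch_to_vibration(160, 80))   # e.g., mid amplitude, fairly fast pulses
```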

[Video][ISWC'16 Paper]

Qualitative Examination of Supervisor-Student Collaborations

Supervisory meetings are a crucial aspect of graduate studies and have a strong impact on the success of research and on supervisor-student relations, yet there is little research on supporting this relationship, and even less on understanding the nature of the collaboration and its user requirements. We therefore conducted an exploratory study of the choice and success of the tools and practices that supervisors and students use for meetings, with the goal of making informed design recommendations. We analyzed data from five focus groups and three individual interviews with faculty members and students using thematic analysis, which resulted in three themes and two implications for design.

[GI'14 Paper]

Evaluating Emotional Impact of Painterly Animation Sequences

Animation and gaming industries seek new techniques to enhance the emotional expressiveness of their products. In my Master's thesis, I conducted a series of user studies on the effect of painterly rendering on users' perception of facial expression sequences. I proposed a knowledge-based painterly rendering technique for facial sequences that uses two sources of information (the emotion content of each keyframe and painterly parameters for each keyframe) to guide the painting process, and described its integration into a facial animation authoring tool. The results of our studies showed a significant impact of rendering style and parameters on users' perception. Later, we adapted our technique and results for animation sequences at a pre-visualization company (TwentyOneInc) in Vancouver.

[CAe'12 Paper][IJCGT Journal Paper]

Mathematical Modeling of Liquor Bar Density

I worked with a mathematician and three criminologists to mathematically model the social impact of liquor establishments on their city neighborhoods. We developed a cellular automata model of the phenomenon in MATLAB and tested it using liquor establishment and crime data from the City of Vancouver in British Columbia. The project proposed cellular automata as a new method for criminology research, and the validated model provides a powerful, low-cost tool for testing various scenarios.
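
The published model, parameters, and data are described in the CEUS article; the toy Python sketch below only illustrates the cellular-automata idea itself: a grid of neighborhood cells whose "risk" state is updated each step from its neighbors and from cells containing a liquor establishment. The grid size, update rule, and coefficients are made-up placeholders, not the published model.

```python
import numpy as np

# Toy cellular-automata sketch of neighborhood-level effects around liquor
# establishments. Grid size, update rule, and coefficients are illustrative
# placeholders (toroidal boundary via np.roll, for simplicity).
rng = np.random.default_rng(0)
SIZE = 20
bars = rng.random((SIZE, SIZE)) < 0.05          # cells containing an establishment
risk = np.zeros((SIZE, SIZE))                   # per-cell "risk" state

def step(risk, bars, spread=0.2, source=1.0, decay=0.9):
    """One CA update: local decay + diffusion from the 4-neighborhood + sources."""
    neighbors = (np.roll(risk, 1, 0) + np.roll(risk, -1, 0) +
                 np.roll(risk, 1, 1) + np.roll(risk, -1, 1)) / 4.0
    return decay * risk + spread * neighbors + source * bars

for _ in range(50):
    risk = step(risk, bars)

print("mean risk near establishments:", risk[bars].mean())
print("mean risk elsewhere:", risk[~bars].mean())
```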

[CEUS Journal Paper]

Identifying Individuals' Characteristics Using Physiological (Bio) Signals

The ability to identify an individual's characteristics (e.g., demographics and emotions) from sensor data is beneficial in a wide range of HCI applications. In this project, I classified an individual's gender and emotions from their physiological sensor data (e.g., skin conductance). In one component of the project, I extracted shape features from the physiological signals and achieved 90% accuracy in classifying gender using the KNN algorithm. In a second component, I used a Hidden Markov Model to detect users' valence and arousal with 58% accuracy.
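
As an illustration of the general pipeline only (the project's actual features and data are not reproduced here), the Python sketch below extracts a few simple shape descriptors from synthetic skin-conductance-like traces and classifies them with KNN via scikit-learn; the feature choices and the synthetic signals are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Illustrative pipeline: simple shape features from a skin-conductance-like
# trace, classified with KNN. Features and synthetic data are placeholders,
# not the project's actual signals or feature set.
rng = np.random.default_rng(1)

def features(signal):
    diff = np.diff(signal)
    return [signal.mean(), signal.std(), signal.max() - signal.min(),
            diff.mean(), (diff > 0).mean()]   # crude "shape" descriptors

def synthetic_trace(label, n=500):
    base = np.cumsum(rng.normal(0.01 * (label + 1), 0.1, n))  # drifting toy signal
    return base + rng.normal(0, 0.05, n)

X = np.array([features(synthetic_trace(lbl)) for lbl in (0, 1) for _ in range(50)])
y = np.array([lbl for lbl in (0, 1) for _ in range(50)])

clf = KNeighborsClassifier(n_neighbors=5)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```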