Socially assistive robot (SAR) companions for children with autism could better understand and respond to a child's needs if they used tactile sensing.
Based on interviews with 11 experienced autism specialists, we devised seven qualitative requirements and corresponding quantitative specifications for tactile sensing in SARs for autism. These specifications cover robustness and maintainability, sensing range, feel, gesture identification, and the spatial, temporal, and adaptation attributes of the companion robot's touch perception system.
Designing multisensory applications is a complex process, requiring knowledge of hardware, programming, and human perception. To inform tools and guidelines for novice designers, we conducted a nine-week longitudinal study during the Student Innovation Challenge at the IEEE World Haptics Conference (WHC17), the largest meeting on haptics research. Based on qualitative analysis of surveys, interviews, and team blogs, we outlined the novices' design activities and process, 10 design choices that novices tended to ignore, and three success strategies for developing interpretable multisensory applications.
Creating physical experiences often entails selecting, modifying, or inventing specialized haptic hardware. However, experience designers are rarely engineers, and their lack of knowledge about haptic hardware is often a major barrier to designing novel physical interactions.
We developed Haptipedia: an online taxonomy, database, and visualization of 105+ haptic devices invented since 1992. Haptipedia's taxonomy unifies expert knowledge on haptics and makes it accessible to HCI researchers. The database and visualization give users a flexible way to explore major developments in the field, examine device trade-offs, and repurpose existing designs into novel devices and interactions for their projects.
How do people perceive haptic signals? How would one design a "snoring" sensation?
In this project, we examined haptic facets as a means to describe and analyze users' cognitive frameworks for making sense of the qualitative and affective characteristics of haptic sensations (e.g., synthetic vibrations). Specifically, we proposed five facets based on how people describe vibrations (physical, sensory, emotional, metaphoric, and usage examples) and investigated the underlying dimensions and cross-linkages among these descriptions based on participants' perception of a 120-item vibration library. We provide concrete guidelines for designing and evaluating expressive vibrotactile signals.
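As a rough illustration of the cross-linkage analysis (not the study's actual pipeline), one could correlate ratings across facets; the single-scale ratings and data below are invented for the sketch.

```python
import numpy as np
import pandas as pd

# Hypothetical data: 120 vibrations, each rated on one scale per facet
# (the real study collected richer descriptions; this is a schematic).
rng = np.random.default_rng(0)
facets = ["physical", "sensory", "emotional", "metaphoric", "usage"]
ratings = pd.DataFrame(rng.random((120, 5)), columns=facets)

# Cross-linkages: pairwise correlations between facet ratings.
print(ratings.corr().round(2))
```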
Crowdsourcing can gather rapid feedback at scale, but how do we crowdsource a haptic prototype?
Unlike visual and auditory studies, haptic research often requires specialized hardware not available to the crowds.
Thus, haptic evaluation is commonly limited to small-scale lab-based studies.
In this project, we investigated "proxy modalities" as a means to crowdsource haptic evaluation on online platforms such as Amazon's Mechanical Turk.
We designed three sets of proxies for vibrations (two visual proxies and a low-fidelity vibration proxy) and ran lab-based and Mechanical Turk studies to validate our approach.
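A minimal sketch of one way such a validation can be framed, assuming per-stimulus mean ratings from the lab (real vibrations) and from Mechanical Turk (a proxy); the variable names and values below are invented.

```python
from scipy.stats import pearsonr

# Hypothetical mean ratings (e.g., perceived urgency) for the same six
# stimuli, once felt as real vibrations in the lab and once experienced
# as a visual proxy on Mechanical Turk.
lab_ratings = [3.1, 4.5, 2.2, 5.0, 3.8, 1.9]
mturk_ratings = [2.9, 4.7, 2.5, 4.8, 3.5, 2.1]

# A high correlation suggests the proxy preserves the rank ordering
# of the haptic stimuli.
r, p = pearsonr(lab_ratings, mturk_ratings)
print(f"r = {r:.2f}, p = {p:.3f}")
```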
How can we provide simple and effective access to haptic libraries? How can we make the navigation and selection process engaging?
What characteristics will make a haptic personalization tool usable and effective for naive users?
Based on a review of existing tools in haptics and other domains, we proposed five design parameters for haptic personalization tools and three distinct personalization mechanisms: 1) choosing: users select from a list of pre-designed vibrations, 2) tuning: users adjust high-level characteristics of a vibration using a control or slider, 3) chaining: users combine short pre-designed tactile building blocks (e.g., by sequencing them) to create a new vibration sensation. Results from a controlled Wizard of Oz study suggested that choosing and tuning are the most practical and preferred mechanisms for personalization tools.
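To make the three mechanisms concrete, here is a minimal Python sketch over a toy vibration representation (amplitude envelopes as lists); this is our own illustration, not the study's implementation.

```python
from dataclasses import dataclass

@dataclass
class Vibration:
    name: str
    envelope: list[float]  # amplitude samples in [0, 1]

LIBRARY = [
    Vibration("heartbeat", [0.9, 0.2, 0.7, 0.0]),
    Vibration("ramp", [0.1, 0.4, 0.7, 1.0]),
]

def choose(library, name):
    """Choosing: pick a pre-designed vibration from a list."""
    return next(v for v in library if v.name == name)

def tune(vibration, intensity):
    """Tuning: scale one high-level characteristic (here, intensity)."""
    scaled = [min(1.0, a * intensity) for a in vibration.envelope]
    return Vibration(f"{vibration.name}-tuned", scaled)

def chain(*vibrations):
    """Chaining: sequence short building blocks into a new sensation."""
    envelope = [a for v in vibrations for a in v.envelope]
    return Vibration("+".join(v.name for v in vibrations), envelope)

base = choose(LIBRARY, "heartbeat")
print(chain(tune(base, 0.5), choose(LIBRARY, "ramp")))
```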
Traditional means of training novice surgeons in laparoscopic (minimally invasive) surgical procedures require extensive monitoring and guidance from expert surgeons. Recent studies show strong links between trainees' tool motions and their skill. Building on this work, we aim to assess trainee performance and inefficiency during dry-lab practice from data captured by haptic sensors attached to their surgical tools, and to develop personalized haptic and multimodal feedback that corrects a trainee's hand movements.
We are collaborating with the Children's Hospital of Philadelphia (CHOP) in the USA to collect practice data from novice surgeons, annotate them with 6-9 activities and mistakes, and analyze the signals with machine learning techniques.
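A schematic of the analysis direction, assuming windowed tool-motion signals with per-window labels; the features, window size, classifier, and data below are placeholders rather than the project's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def window_features(signal, width=50):
    """Split a 1-D motion signal into fixed-width windows and compute
    simple statistics per window (mean, std, peak-to-peak)."""
    windows = signal[: len(signal) // width * width].reshape(-1, width)
    return np.c_[windows.mean(1), windows.std(1), np.ptp(windows, 1)]

# Stand-in for a recorded tool-motion signal; labels are placeholders
# for annotated activities/mistakes.
signal = rng.normal(size=5000)
X = window_features(signal)
y = rng.integers(0, 2, size=len(X))

scores = cross_val_score(RandomForestClassifier(), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```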
Despite the rapid growth of wearable and mobile devices and associated availability of vibrotactile actuators, the design space of vibrotactile applications has remained limited to event-based vibration notifications.
While useful in some cases, these notifications can be disruptive and cognitively demanding.
In this project, we extended the design space of vibrotactile applications by utilizing users' touch input (e.g., 2D location or path) together with vibration sensations for smartwatch interactions.
We proposed a design space for vibrotactile (VT) interactions and investigated user performance (error rate, completion time, perceived demand) for touch-based VT interactions in a user study.
Finally, we populated the design space with five example applications that run on off-the-shelf smartwatches.
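One way to picture the coupling of touch input and vibration output is a mapping from 2D touch location to vibration parameters; the function and parameter ranges below are hypothetical, not the applications' actual mappings.

```python
def vibration_for_touch(x, y, screen=(1.0, 1.0)):
    """Map a 2D touch location on the watch face to vibration
    parameters: horizontal position sets frequency, vertical
    position sets amplitude. The ranges are purely illustrative."""
    freq_hz = 80 + 170 * (x / screen[0])   # sweep 80-250 Hz
    amplitude = max(0.1, y / screen[1])    # floor to stay perceptible
    return freq_hz, amplitude

# Dragging a finger across the face sweeps the sensation.
for x in (0.0, 0.5, 1.0):
    print(vibration_for_touch(x, 0.8))
```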
Supervisory meetings are a crucial part of graduate studies and strongly affect the success of research and supervisor-student relations. Yet there is little research on supporting this relationship, and even less on understanding the nature of this collaboration and its user requirements.
Thus, we conducted an exploratory study on the choice and success of tools and practices used by supervisors and students for meetings, with the goal of making informed design recommendations.
We analyzed data from five focus groups and three individual interviews with faculty members and students using thematic analysis, which resulted in three themes and two implications for design.
Animation and gaming industries seek new techniques to enhance emotional expressiveness of their products.
In my Master's thesis, I conducted a series of user studies on the effect of painterly rendering on users' perceptions of facial expression sequences.
I proposed a knowledge-based painterly rendering technique for facial sequences that uses two sources of information (the emotion content and the painterly parameters of each keyframe) to guide the painting process, and described its integration into a facial animation authoring tool (a minimal sketch of the keyframe-driven idea appears below).
The results of our studies showed a significant impact of rendering style and parameters on users' perception.
Later, we adapted our technique and results into animation sequences for a pre-visualization company (TwentyOneInc) in Vancouver.
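The keyframe-driven idea can be pictured with a small interpolation sketch: painterly parameters are specified at keyframes and interpolated in between. The keyframe values and the brush-size parameter are hypothetical, not the thesis's actual algorithm.

```python
def lerp(a, b, t):
    return a + (b - a) * t

# Hypothetical keyframes: (frame index, emotion intensity, brush size).
keyframes = [(0, 0.2, 4.0), (30, 0.9, 12.0), (60, 0.5, 7.0)]

def painterly_params(frame):
    """Interpolate painterly parameters between the enclosing keyframes,
    so stroke size tracks the emotion content of the animation."""
    for (f0, e0, b0), (f1, e1, b1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return lerp(e0, e1, t), lerp(b0, b1, t)
    raise ValueError("frame outside keyframe range")

print(painterly_params(15))  # (emotion intensity, brush size) mid-way
```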
I worked with a mathematician and three criminologists to mathematically model the social impact of liquor establishments on their city neighborhoods.
We developed a cellular automaton model of this phenomenon in Matlab and tested it using liquor-establishment and crime data from the City of Vancouver in British Columbia (a schematic re-expression of the model is sketched below). The project was innovative in that it introduced cellular automata as a new method for criminology research. The validated mathematical model provided a powerful tool for testing various scenarios at very little cost.
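A minimal NumPy re-expression of the cellular-automaton idea (the original model was built in Matlab): establishments act as fixed sources whose impact diffuses to neighboring cells each step. The grid size, rates, and update rule are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)
grid = np.zeros((50, 50))                    # neighborhood cells
sources = rng.integers(0, 50, size=(5, 2))   # liquor establishment cells

for _ in range(100):
    grid[sources[:, 0], sources[:, 1]] += 1.0  # constant local impact
    # Diffuse impact to the four nearest neighbors, then decay.
    # (np.roll wraps at the borders; acceptable for a schematic.)
    neighbors = (np.roll(grid, 1, 0) + np.roll(grid, -1, 0)
                 + np.roll(grid, 1, 1) + np.roll(grid, -1, 1))
    grid = 0.9 * (0.8 * grid + 0.05 * neighbors)

print(f"peak impact: {grid.max():.2f}, mean impact: {grid.mean():.3f}")
```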
The ability to identify an individual's characteristics (e.g., demographics and emotions) from sensor data is beneficial in a wide range of HCI applications. In this project, I classified an individual's gender and emotions from their biological sensor data (e.g., skin conductance). In one component of the project, I extracted shape features from biological signals and achieved 90% accuracy in classifying gender using the KNN algorithm. In a second component, I used a hidden Markov model to detect users' valence and arousal with 58% accuracy.
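A sketch of the KNN component under invented data: shape features extracted from skin-conductance segments feed a k-nearest-neighbors classifier. The feature names, values, and labels below are placeholders, not the project's dataset.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Placeholder "shape features" per skin-conductance segment:
# [peak count, mean rise time (s), mean peak amplitude].
X = rng.normal(size=(200, 3))
y = rng.integers(0, 2, size=200)   # placeholder gender labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print(f"held-out accuracy: {knn.score(X_test, y_test):.2f}")
```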