Current Projects

SpokeIt: An Interactive Game for Cleft Speech

Orofacial cleft is a birth defect in which tissues of the face, mouth, or lip do not fuse together correctly. Affected children undergo surgical procedures at a very young age and typically need long-term speech therapy afterward. SpokeIt is an interdisciplinary project created by UC Santa Cruz's Computational Media and Psychology departments in conjunction with the medical team at UC Davis. SpokeIt is a game created with the goal of making speech practice fun and effective. It benefits children with cleft speech by making practice seamless, giving speech therapists access to their patients' progress, and assigning words and phrases that are suitable for the child to exercise. SpokeIt employs a dynamic curriculum that adjusts its difficulty as the child plays.

Researchers: Jared Duval, Zak Rubin, Sri Kurniawan

Project Butterfly: Immersive Virtual Reality for Physically and Emotionally Intelligent Healthcare Experiences

The goal of Project Butterfly (PBF) is to create a controlled immersive media environment for adaptable and translatable therapeutic movement, incorporating runtime data feedback on player movement performance and behavioral analysis. It aims to bridge the gap between therapists and at-home users undergoing repetitive exercise and physical therapy by mapping movement through "on-the-fly" motion capture for gamified scenarios such as protecting a virtual butterfly and catching crystals. Pilot work explored how Mirror Visual Feedback Therapy could be translated into an immersive virtual reality environment by having users protect a virtual butterfly with head-mounted display systems and wearable soft robotic exosuits. We are actively expanding upon this work to explore long-term feasibility, physical intelligence through personalized wearable robotics, and emotional intelligence through custom biofeedback sensing pipelines. Our ultimate goal is to explore how this cybernetic physical-virtual experience could augment future healthcare, adapting to each user's individual needs and responses.

Here is a video of the first iteration prototype for PBF:

Researchers: Aviv Elor, Michael Powell (DANSER Labs), Evanjelin Mahmoodi, Mircea Teodorescu (DANSER Labs), Sri Kurniawan

Project Star Catcher: Translating Physical Therapy into Immersive Virtual Reality

Immersive virtual reality gaming has the potential to motivate individuals to perform intensive repetitive task-based therapy, and it can be combined with motion capture to track therapy compliance and progress. This project explores the design and evaluation of an immersive virtual reality experience, titled "Project Star Catcher" (PSC), for those with weakness on one side of their upper bodies. Our game mechanics were adapted from constraint-induced movement therapy, an established therapy method in which users are asked to use the weaker arm by physically binding the stronger arm. Our adaptation moves from physical to psychological binding by providing a dynamic reward system that promotes the use of the weaker arm. Players score points by performing a rehabilitative motion to catch falling stars in an immersive, cosmic virtual reality. Initial results indicate that immersive games like PSC provide a powerful medium for physical exercise, with an increase of over 40% in exercise compliance for adults of mixed ability. We are performing further studies to systematically compare VR devices with PSC and to apply affective computing techniques to understand users' emotional responses. This modular system provides a flexible behavioral playground for studying VR therapy, physical task-based analysis, and runtime adaptive stimuli.
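The psychological binding described above can be illustrated as a simple asymmetric scoring rule. This is a hypothetical sketch with made-up point values, not PSC's actual implementation:

```python
# Illustrative sketch of psychological binding via asymmetric rewards:
# catches made with the weaker arm score more than catches made with
# the stronger arm. Point values here are invented for illustration.

def star_score(arm_used: str, weaker_arm: str,
               base_points: int = 10, weak_bonus: float = 3.0) -> int:
    """Return points for catching a star.

    arm_used / weaker_arm: "left" or "right".
    weak_bonus: multiplier applied when the weaker arm is used,
    nudging players toward the rehabilitative motion.
    """
    if arm_used == weaker_arm:
        return int(base_points * weak_bonus)
    return base_points

# A player whose weaker arm is the left one:
# catching with the left arm yields 30 points; with the right, only 10.
```

The asymmetry replaces the physical mitt of constraint-induced movement therapy: the stronger arm is never restrained, but the reward gradient consistently favors the weaker one.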

Here is a video overview of the pilot PSC goals:

Researchers: Aviv Elor, Evanjelin Mahmoodi, Nico Hawthorne, Michael Powell (DANSER Labs), Mircea Teodorescu (DANSER Labs), Sri Kurniawan

Dynamic Online Computerized Neuropsychological Testing System

Traditional cognitive testing for detecting cognitive impairment can be inaccessible, expensive, and time-consuming. This project aims to develop an automated online computerized neuropsychological testing (CNT) system for rapidly tracking an individual's cognitive performance throughout the user's daily or weekly schedule in an unobtrusive way. By utilizing the microsensors embedded in tablet devices, the proposed context-aware system will capture ambient and behavioral data pertinent to the real-world contexts and times of testing to complement psychometric results, providing insight into the contextual factors relevant to the user's testing efficacy and performance.

Our primary objectives for the project are to:

  • Develop an accessible, dynamic, online CNT system capable of capturing contextual data during testing and tracking temporal variations in the user’s test performance.
  • Develop an information fusion system capable of capturing and analyzing ambient data from different sources during testing.
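As a rough sketch of the fusion objective, each psychometric result might be paired with the ambient readings captured during that test session. Field names and values below are hypothetical, not the project's actual schema:

```python
# Hypothetical sketch of pairing a CNT test result with ambient context
# captured from tablet microsensors. The field names and sensor keys are
# illustrative only, not the project's real data schema.
import time

def fuse(test_result: dict, ambient: dict) -> dict:
    """Attach a timestamp and ambient context to a psychometric result."""
    return {
        "timestamp": time.time(),          # when the result was recorded
        "score": test_result["score"],
        "reaction_ms": test_result["reaction_ms"],
        "context": {
            "ambient_light_lux": ambient.get("light_lux"),
            "noise_db": ambient.get("noise_db"),
            "device_motion": ambient.get("accel_rms"),
        },
    }

# One fused record for a single test session:
record = fuse({"score": 27, "reaction_ms": 412},
              {"light_lux": 180, "noise_db": 42.5, "accel_rms": 0.03})
```

Stored over weeks of sessions, records like this would let the system ask whether, say, noisy or dim environments correlate with slower reaction times for a given user.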

Researchers: Sean Smith, Breanna Baltaxe, Alex Cabral, Brookelyn Mcjunkin, Bronte Brillantes, Joshua Lopez, Trevor Parker

Document Layout and Formatting Helper for Blind Authors

This project, funded through an NSF CAREER award and the NSF GRFP, aims to facilitate independence for blind authors in producing documents that meet the presentation 'standards' expected by sighted readers. This includes gathering design guidelines for tools that help blind people format their documents independently by: 1) developing an impact-weighted taxonomy of common document presentation errors produced by blind authors; 2) exploring blind persons' mental models and strategies for learning and coping with document formatting, and how these models and strategies contribute to the success of independent document formatting and layout activities; and 3) iteratively developing and evaluating prototype tools based on the previous findings.

Here is a video overview of our project (i.e. our motivation, objectives, and methods):

Researchers: Lourdes Morales-Villaverde, Sri Kurniawan, Sonia Arteaga
Collaborators: Priya Bhattacharjee, Tiffany Thang, Peter Cottrell, Dustin Adams

Past Projects

Speech Therapy Game for Children with Cleft Palate

We are developing a video game to aid young children undergoing speech therapy after cleft palate surgery. These children must first unlearn the compensations and omissions they use to work around syllables they cannot produce. Unlearning involves significant amounts of practice at home, and parents often have great difficulty motivating children to practice. Working directly with children who are undergoing speech therapy, we are designing a simple game using a novel speech recognition engine to help motivate them to perform their therapy exercises. We aim to accelerate the rate of recovery and give therapists more tools to use with children. We also intend to advance research in educational games.

Here is a video overview of our prototype game "Speech Adventure":

Researchers: Zachary Rubin, Sri Kurniawan. In collaboration with UC Davis Medical Center.

Blind Photography

Photography is a visual way to capture a moment in time. Photos can be used for artistic expression and to remember significant events. Because photo taking, organizing, and sharing traditionally require visual information, those with no or limited sight often have problems with these activities. Previous work has made photo capturing without sight easier; however, little work has made photo browsing and sharing blind-accessible. This dissertation research aims to facilitate independence for blind persons in taking, organizing, and sharing photos through user-centered development of a smartphone application that can be used without sight. The work starts with an investigation of blind persons' current practices in these activities, continues with a review of existing applications, and concludes with the design and long-term evaluation of the application.

The overarching needs that this smartphone application aims to meet are the following:

  • Photo Taking: Aiming, focusing, positioning, and framing; Easy way to get sighted help; Accessible device; Improving photo quality.
  • Photo Organizing and Editing: Identifying what’s in the picture; Labeling the pictures; Manipulating pictures.
  • Photo Sharing: Easy way to get sighted help; Accessible photo sharing method.

Here is a video of the proposed system, Phodio, a blind accessible iPhone app designed to help blind people take and organize their photos for easier retrieval and sharing:

Researchers: Dustin Adams, Sri Kurniawan. In collaboration with Harada, Sato, Asakawa, and Takagi of IBM Research - Tokyo.

Online Learning System To Help People with Developmental Disabilities Reinforce Basic Skills

We are working on developing and testing a new method of providing activities that help individuals with developmental disabilities (DD) of all ages learn or reinforce basic skills such as adding money, identifying US currency, numbers, or letters. The goal is to make the process of learning and reviewing those skills more enjoyable and manageable for people with DD and their caregivers or guardians.

Our methods include working in collaboration with Imagine! and Hope Services, two not-for-profit organizations that provide support services to people with DD, to gather system requirements and to develop and evaluate a prototype: an online application that works primarily as a web app on the iPad and includes activities to teach individuals with DD of all ages about numbers, letters, colors, and currency. To check out the prototype, please visit: eLearning Basic Skills.

Here is a video of a user testing the activity on recognizing money in our system: 

Video produced by Imagine!

Researchers: Lourdes Morales-Villaverde, Taylor Gotfrid, Kariina Caro, Luke Buschmann, Sri Kurniawan. In collaboration with Imagine! and Hope Services.

Virtual Hemiparesis Rehabilitation Game for Stroke Survivors

Stroke can leave survivors with some form of hemiparesis (weakness of one side of the body). Virtual rehabilitation in the past decade has shown higher success than traditional rehabilitation: it improves the patient’s experience and can allow for clinical rehab without a clinician present. Research in non-virtual rehabilitation shows that constraint-induced movement therapy is very effective in treating hemiplegia. In this therapy, the patient’s strong side is physically constrained using a mitt or hand splint on the non-affected limb, forcing the patient to utilize the weaker limb for daily activity.

The goal of this research is to evaluate the behavior of stroke survivors with hemiparesis in virtual rehabilitation where a constraint is induced virtually via varying incentives during gameplay.

Examining users’ behavior during their virtual rehab sessions, we will look at a variety of elements. We will examine each user’s preference of game; each of the four mini games uses a different upper limb movement, and our hypothesis is that a user’s preferred game may relate to their physical condition. We will also examine the user’s preference of side (weak or strong) when playing the games; we may be able to find the point at which the incentive to use the weak side reliably motivates users. We will compare users’ compliance with the constraint against currently published research, and we will compare users’ range-of-motion measurements before and after the study to assess the effectiveness of the system.

The image below shows the four sample mini games. In the top left game, the user moves their hand side to side to control a bucket and catch eggs as they fall from the sky. In the top right game, the user moves their hand up and down to catch stars as they fly across the sky. In the bottom left game, the user moves their forearm to control a bat as balls are thrown toward the screen. In the bottom right game, the user controls an egg pan with their forearm to catch fried eggs as they fall from the sky.

The following is a video overview of our project:

Researchers: Luke Buschmann, Sri Kurniawan. In collaboration with Cabrillo College's Stroke and Disability Learning Center.

Vibrotactile Guidance for Wayfinding of Blind Walkers

Our project's goal was to test the feasibility of a novel vibrotactile guidance system, in the form of a belt, for helping blind walkers with wayfinding by enabling them to receive haptic directional instructions without negatively impacting their ability to listen and/or perceive the environment. We evaluated the belt interface in a controlled study with 10 blind individuals and compared it to an audio guidance system. The experiments were videotaped and the participants’ behaviors and comments were content analyzed. Completion times and deviations from ideal paths were also collected and statistically analyzed.
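To illustrate how a directional haptic instruction might be rendered on such a belt, here is a hypothetical sketch that maps a desired heading to the nearest vibration motor. The motor count and layout are assumptions for illustration; the actual belt's hardware and protocol are not described here:

```python
# Hypothetical sketch: map a desired walking heading to the nearest
# vibration motor on a belt with n_motors evenly spaced around the waist.
# Motor 0 is assumed to sit at the front (straight ahead); indices
# increase clockwise. Both assumptions are illustrative only.

def motor_for_heading(heading_deg: float, n_motors: int = 8) -> int:
    """Return the index of the motor closest to the given heading.

    heading_deg: desired direction in degrees, 0 = straight ahead,
    90 = turn right, 270 = turn left.
    """
    sector = 360.0 / n_motors          # angular width covered by one motor
    # Shift by half a sector so each motor "owns" the arc centered on it.
    return int((heading_deg % 360.0 + sector / 2) // sector) % n_motors

# With 8 motors: straight ahead fires motor 0, a right turn fires motor 2.
```

A mapping like this keeps the audio channel free, which is the point of the belt: the walker can still listen to the environment while receiving turn cues through the skin.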

Here is a video overview of our project and clips from our user evaluation:

Researchers from UCSC: Sri Kurniawan, German Flores, Roberto Manduchi, Lourdes Morales-Villaverde.

Researchers from Toyota ITC: Erich Martinson, Akin Sisbot

Brain-Training Software for Stroke Survivors

We investigated the feasibility of using web-based brain-training software to help stroke survivors and, more generally, individuals with cognitive impairments. For this purpose we observed and interviewed stroke survivors to better understand the technologies they find helpful, as well as to examine the effectiveness and limitations of such technologies. From this, we compiled an informal set of design guidelines for rehabilitation software aimed at helping stroke survivors improve their cognitive skills. To validate the guidelines and see if new ones emerged, we developed a low-fidelity prototype of web-based brain-training software and tested it with five participants to assess its feasibility as a cognitive rehabilitation solution.

Here is a video overview of the prototype for our proposed system:

Researchers: Lourdes Morales-Villaverde, Sean Smith, Sri Kurniawan. In collaboration with Cabrillo College's Stroke and Disability Learning Center.

LASSIE (Live-in ASSistance for Independence and Eldercare) 

LASSIE is designed as a low-cost monitoring assistive-living robot that can be steered over the Internet by a family member to remotely monitor and help assess the well-being of an older relative living alone. A commercially available iRobot Create platform was used as a starting point, though the system could potentially run on another robotic base. Our system (i) takes advantage of commercially available components to reduce development cost and effort, (ii) acts as a video and audio communication tool between older persons and their family members or caregivers, and (iii) can analyze the video feed to detect heart rate and breathing rate. The proposed system could be integrated with in-home monitoring sensors (Smart Grid, Professor Patrick Mantey), which could trigger an alert to family members that an out-of-the-ordinary event is occurring; the robot could then be woken up and sent out to act as a watchful eye over the Internet.
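Camera-based vital-sign detection generally works by extracting a faint periodic brightness signal from skin pixels and finding its dominant frequency. The sketch below shows that last step on a synthetic trace; it is an illustrative single-bin DFT scan, not LASSIE's actual pipeline:

```python
# Illustrative sketch (not LASSIE's actual pipeline): estimate heart rate
# from a periodic brightness signal, as in camera-based pulse detection.
# We scan candidate rates and keep the one whose sinusoid correlates best
# with the signal (a brute-force single-bin DFT per candidate).
import math

def estimate_bpm(signal, fps, lo_bpm=40, hi_bpm=180):
    """Return the candidate heart rate (beats/min) with the most
    spectral power in the mean-centered signal sampled at fps frames/s."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]
    best_bpm, best_power = lo_bpm, -1.0
    for bpm in range(lo_bpm, hi_bpm + 1):
        f = bpm / 60.0  # candidate frequency in Hz
        re = sum(c * math.cos(2 * math.pi * f * i / fps)
                 for i, c in enumerate(centered))
        im = sum(c * math.sin(2 * math.pi * f * i / fps)
                 for i, c in enumerate(centered))
        power = re * re + im * im
        if power > best_power:
            best_power, best_bpm = power, bpm
    return best_bpm

# Synthetic 10-second "skin brightness" trace at 30 fps with a 72 bpm pulse:
fps = 30
trace = [math.sin(2 * math.pi * (72 / 60.0) * i / fps) for i in range(300)]
```

Breathing rate could be estimated the same way over a lower frequency band (roughly 8 to 30 breaths per minute); a real pipeline would also need face/skin tracking and noise filtering before this step.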

Left: LASSIE's size in relation to user. Right: LASSIE's components

Researchers: Peter Cottrell, Sebastian Hening

Digital Birth

Labor and childbirth is a multidimensional experience for which it is difficult to prepare (without having done it before). It has been shown that emotional, informational, and physical support is key in shortening labor, increasing maternal happiness, and decreasing the need for unnecessary interventions during labor. This mobile (iPhone) video game training tool aims to train birth partners about the stages of labor and ways to support a first-time mom.

Here is a video overview of our prototype training tool:

Researchers: Alexandra Holloway, Sri Kurniawan. In collaboration with Shaw-Battista, School of Nursing, UCSF and Moodie, Anthropology Dept, UCSC. Research funded by CITRIS.
