Tangible Technology

Reshaping Communication for the DHH & DeafBlind Communities

CSE 440 Staff
Nov 18, 2020

Contributors: Nora Morsi, Micah Witthaus, Maryam Saleh, Kaytlin Melvin

An Overview

As students in CSE 440: Introduction to Human-Computer Interaction at the University of Washington, we're tasked with coming up with a design to benefit a Seattle community different from our own. Our team chose to explore the problems faced by the Deaf community in Seattle. We began by seeking a new and inventive way to bridge the communication gap between the DHH (Deaf and Hard-of-Hearing) community and hearing people, but after unsuccessful attempts to connect with the general DHH community in Seattle, we shifted focus. Our team is now exploring how to support remote communication between members of the DeafBlind community, who normally rely on touch as an essential part of ProTactile, a touch-based modification of American Sign Language. With touch now considered risky due to COVID-19, and with general social-distancing measures limiting human connection, we've focused on the new challenge of making remote communication within the DeafBlind community resemble what it once was in person.

Our Research Goals, Stakeholders, and Participants

When we were originally attempting to learn more about the general DHH community, we conducted directed storytelling to better understand the day-to-day lives of DHH individuals, including the common problems they face when communicating with hearing people. We also incorporated a graffiti wall gallery as a way to spot similarities and shared sentiments across research participants. Unfortunately, we were only able to connect with two individuals, so we relied on academic literature in addition to their responses to inform our decisions moving forward. Our respondents included one deaf person and one hard-of-hearing person, which gave us some variety in our findings. After conducting our initial research, we connected with Professor Ladner, a UW CSE faculty member who was raised by deaf parents, researches accessibility, and remains connected to the DHH community in Seattle. We proposed our ideas to him and realized we had been focusing too much on sound awareness, a problem already explored with existing solutions, so our concept wasn't unique. He instead pointed us in two possible directions. The first was to design for the DeafBlind community in Seattle, whose core language, ProTactile, is still relatively new, with few accommodations available. The second was to design a high-fidelity translator from ASL to audio that could also provide speaker-labeled speech-to-text for DHH individuals. This would be more novel than our original sound-awareness app idea, as it handles ASL-to-audio translation and lets DHH individuals track group conversations far more easily.

Design Research Results and Themes

Although the vast majority of our research was conducted before we discovered ProTactile, our results still gave us useful insight into the Deaf community and the struggles it faces. The primary theme we noticed in our participants' responses is that DHH people frequently have difficulty holding group conversations with hearing people. When more than one person is talking at once, lip reading becomes difficult, and even using an interpreter doesn't always give the DHH person a full understanding of what's being said. We found that this communication issue is even worse for DHH individuals during virtual meetings. Technologies like Google Meet do have auto-captioning features, but these captions are not always accurate and are no substitute for a hired captioner or interpreter. Additionally, while captions may help a DHH person understand a hearing person, they are less effective at helping a hearing person understand a DHH person. One respondent pointed out that it can be difficult to understand a Deaf accent over a video call, so they often resort to typing in the chat instead. Our initial product design, which we've since changed, relied heavily on these research findings; since Professor Ladner's proposed transcription tool would address the same problems, we're continuing to draw on the research we did before we shifted focus.

Our research on the Deaf community essentially ended where our research on the DeafBlind community began. After our conversation with Professor Ladner, in which we discovered that our proposed design was not nearly as unique as we thought, we began looking into other possible solutions for the DeafBlind community in Seattle. We familiarized ourselves with the ProTactile language by watching YouTube videos about it, which helped us understand the basic functionality our design would need to support. This is the extent of the additional research we've completed so far; we're still working on getting in touch with potential research participants from the DeafBlind community in Seattle. For the time being, though, we're learning as much as we can about ProTactile and finding ways to incorporate our previous findings into our new design.

Our Proposed Design

With the current state of virtual work and learning, remote communication creates yet another barrier for the DeafBlind and DHH communities. This has pointed our design ideation toward a more tactile, hands-on approach. From our original ideas, we recognized that robotics and physical components could play an essential role. This transitions well into the physical-contact component of ProTactile and broadens the range of wearable mediums to which we could apply our design. Additionally, our research helped us discern which aspects of technology, both current and traditional, each community has interacted with. We plan to combine new and old design approaches to iterate on a virtual-assistant concept that lets these communities interact with each other and/or with individuals outside their community more easily.

Currently, we are designing the two solutions Professor Ladner suggested to us. The high-fidelity translator would provide fluid conversation between DHH individuals who prefer ASL and hearing individuals who don't know ASL. It would also label the speaker in a group setting so that a DHH person can track group conversations more easily. This translator relies on computer vision to detect and translate what is being signed through a camera, combined with speech processing to identify the current speaker and convert what they are saying into text. The design to help the DeafBlind community communicate with others remotely comes in two pieces, as shown below in our rough ideation sketches (Figure 1).
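Before turning to the sketches, here is a rough thought-experiment of how the translator pipeline might fit together. Everything in it is hypothetical: the SignRecognizer and SpeakerLabeler classes are invented stand-ins for trained models, not real libraries or our final design.

```python
# A minimal, hypothetical sketch of the translator pipeline.
# SignRecognizer and SpeakerLabeler are placeholder stand-ins for
# trained models; they are NOT real libraries.
from dataclasses import dataclass

@dataclass
class Caption:
    speaker: str  # who produced the message, so DHH users can track the group
    text: str     # what was signed or said

class SignRecognizer:
    """Stand-in for a computer-vision model mapping video frames of ASL to English text."""
    def translate(self, frames: list) -> str:
        return "hello everyone"  # stub output for illustration

class SpeakerLabeler:
    """Stand-in for a speech model that identifies the current speaker and transcribes them."""
    def transcribe(self, audio_chunk: bytes) -> Caption:
        return Caption(speaker="Speaker 2", text="good morning")  # stub output

def run_pipeline(frames: list, audio_chunk: bytes, signer_name: str = "Speaker 1") -> list:
    """One step of the loop: translate signing for hearing users, label speech for DHH users."""
    signed_text = SignRecognizer().translate(frames)           # ASL -> text (then synthesized to audio)
    labeled_speech = SpeakerLabeler().transcribe(audio_chunk)  # speech -> speaker-labeled caption
    return [Caption(signer_name, signed_text), labeled_speech]

if __name__ == "__main__":
    for caption in run_pipeline(frames=[], audio_chunk=b""):
        print(f"{caption.speaker}: {caption.text}")
```

The key design choice this sketch illustrates is that every caption keeps its speaker label attached, which is what would let a DHH user follow who said what in a group call.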

Figure 1: A sketch of one of our proposed designs.

The first piece is a robotic torso covered with sensors, which reads the ProTactile communication being sent. The second piece is a haptic jacket worn by the receiver of the ProTactile communication. Essentially, when someone communicates with the robotic torso using ProTactile, it relays the touch sensation to the haptic jacket the user on the receiving end is wearing. This lets the receiver feel everything being communicated, including back-channeling, a way of communicating visual and verbal listening cues, such as laughing and nodding, through touch. To send a message back, this design can work in a few different ways. If the communication is between two DeafBlind individuals, then each user needs both parts of the design so they can send and receive messages. Otherwise, the robotic torso used to send the ProTactile message would translate and relay the message as either visual text or audio. Figure 1, above, shows our rough sketch of the ProTactile robot design for visual context.
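To make the relay idea concrete, below is a rough, hypothetical sketch of how a touch event read by the torso's sensors might be serialized and replayed on the jacket. The TorsoSensorGrid and HapticJacket classes, and the wire format, are invented placeholders for real hardware drivers and networking, not a finished implementation.

```python
# A rough, hypothetical sketch of the torso-to-jacket relay.
# TorsoSensorGrid and HapticJacket are invented placeholders for
# real hardware drivers; the JSON wire format is an assumption.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TouchEvent:
    region: str      # where the touch landed on the torso, e.g. "left_shoulder"
    pressure: float  # normalized 0.0-1.0, so a light back-channel tap stays light
    timestamp: float

class TorsoSensorGrid:
    """Stand-in for the robotic torso's pressure sensors."""
    def poll(self) -> list:
        # Stub: pretend the sender tapped the shoulder (a common back-channel cue).
        return [TouchEvent("left_shoulder", 0.4, time.time())]

class HapticJacket:
    """Stand-in for the receiver's jacket of vibration actuators."""
    def actuate(self, event: TouchEvent) -> None:
        print(f"buzz {event.region} at intensity {event.pressure:.1f}")

def relay_once(sensors: TorsoSensorGrid, jacket: HapticJacket) -> None:
    """Read touch events, serialize them as they'd travel over the network, and replay them."""
    for event in sensors.poll():
        wire_message = json.dumps(asdict(event))                # torso side: encode the touch
        jacket.actuate(TouchEvent(**json.loads(wire_message)))  # jacket side: reproduce it

if __name__ == "__main__":
    relay_once(TorsoSensorGrid(), HapticJacket())
```

Recording pressure alongside location matters here: back-channeling cues like a light tap carry meaning precisely because they feel different from a firm sign, so the relay has to preserve intensity, not just position.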

Moving Forward

We plan to explore these two proposals more thoroughly as we dive deeper into the academic literature Professor Ladner gave us to better understand the DHH and DeafBlind experiences. Prototyping and evaluation of our designs come next; before then, we plan to finalize which proposal we're moving forward with. Either way, our team can already say we've learned a lot about the design process so far, and even more about empathy and understanding a community different from our own.
