Improving the Famous White Cane: Getting to the Store with Only Sound and Touch

CSE 440 Staff
Feb 18, 2022

The Team

Alex Karulin, task-oriented research and design

Grace Callahan, UI design and usability research

Maxwell Campbell, hardware and UI design

Noah Ponto, user research and design feedback

Improving accessibility for the blind or visually impaired

Our group went into this project with a desire to improve the accessibility aids and technology available to people who are blind or visually impaired. Through our research, we found that the area with the most room for improvement right now is navigation, especially in unfamiliar indoor settings like a grocery or hardware store. Even existing outdoor navigation tools like Google Maps often provide distracting and unhelpful information, such as the distance to the next turn in miles, when information like remaining steps or time would be much more helpful for visually impaired users. Our group is focused on providing a solution that improves on existing UI, requires low technological literacy, and mostly relies on hardware the user already owns and knows how to use, such as their smartphone. Overall, our solution gives visually impaired users another option for autonomous indoor navigation, making necessary daily errands like grocery shopping much easier and more accessible.

The methodology of finding a problem from scratch

As a team, we started our research from square one: none of us had any familiarity with the problems that members of the blind and low-vision community experience. We decided the first step had to be an interview, where we could gain a high-level understanding of our user group and their experiences. We then combined the interview with a personal inventory, allowing us to explore both the general problems faced by our user group and the nuances of the currently available accessibility aids.

After reaching out to several organizations for the blind and visually impaired, we only got a response from one, so we didn't have much choice over our participants. However, it may have been the best source we could have asked for. We had the chance to chat with an assistive technology specialist who has been visually impaired since birth, and who we've given the pseudonym A in this blog post. Through his interview and personal inventory, he provided insight into a broad range of current assistive aids for the visually impaired. At the end of the call, we also got the opportunity to conduct a personal inventory with one of A's coworkers, B, who has been legally blind since birth. Both A and B work in a Seattle office, so they provided great insight into users living in an urban environment. They shared really useful information about difficulties navigating both indoors and outdoors, and pointed out many areas where existing technologies and aids can improve.

After the interviews, we sent out a survey to confirm our findings. We used the higher-level information from the interviews to create a survey targeted at learning more about indoor and outdoor navigation difficulties. We shared the survey link in Facebook groups for blind and visually impaired people and received many more responses than we did interview offers, giving us a wider demographic of users. The survey responses echoed the sentiments expressed by A and B, which helped solidify our finding that there is room for improvement in the space of blind and visually impaired navigational aids.

Research Takeaways

As mentioned before, our first line of inquiry was to determine which problem we should be solving. Many designers, not just in the field of accessibility aids, skim over this step and end up with less than helpful designs that fail to solve a real issue. So, we asked our research participants to give us their two cents.

The main sentiment expressed by our research participants was that ease of use is critical in any new design. A stressed that the technology to effectively assist the blind already exists, and there is no shortage of high-tech solutions aimed at alleviating the problems faced by this community, including wearable augmented reality devices, vests equipped with haptic feedback for navigation, and obstacle-detecting white canes. The issue with many of these products, however, is that they are simply too complicated to be an effective day-to-day assistive aid: they are hard to use, provide unclear feedback, and are generally unappealing to use. This showed us that the technology for helpful aids is already there, and that the real bottleneck is creating an intuitive interface. This was rather surprising to us, since it is easy to assume that a bad design stems from a lack of innovative technology.

The second major point that A and B kept coming back to was the availability of excellent smartphone apps that help with functions such as navigation, reading written text aloud, and scanning labels. While conducting the interviews, we noted that both A and B immediately mentioned that they bring their smartphones everywhere because they are such useful accessibility aids. A mentioned that many new designs simply aim to duplicate functionality already provided by a phone app, which does little to advance accessibility aids overall. However, both A and B noted that there is still lots of room for improvement within these apps, as many of them aren't built specifically for blind and low-vision people. As an example, many navigation apps convey direction using cardinal directions, which A said is unhelpful for a blind person.

We also learned that there is a significant lack of indoor navigation options. To access basic necessities like groceries or the pharmacy, someone who is blind or has low vision often has no better option than asking a store employee for help. However, as emphasized in our survey responses, stores are sometimes unable to provide that assistance, which can force the user to return another day. Additionally, alternatives like shopping services are often prohibitively expensive or result in lower-quality goods. Overall, this indicated to us that there is significant room for improvement in how blind or low-vision people access public indoor spaces, especially stores.

Finding the right design

Throughout our research we found multiple areas where our solution could improve on existing technologies. First, since we found that users had very little interest in buying or learning to use a new device, our solution must primarily revolve around an application that users can download onto their smartphone. We realized we could also improve on the classic white cane, a device many users already own. Since many users carry the cane while navigating, the cane's grip is the perfect place to add haptic feedback as an additional interface, with minimal hassle and cost. Overall, the goal of improving existing devices is to reduce disruption for the user, the number of distinct devices they must carry, cost, and the required level of technological literacy. Shown below in Figure 1 is an early sketch of our white cane design, which shows some initial ideas like haptic vibration, cameras, and voice output.

Sketch of design for improved white cane, which includes buttons to pair with a navigation device, speakers, ergonomic grip, and vibration features
Figure 1: White Cane Improvement Sketch

In addition to a disdain for new devices, users shared a common frustration with the user interfaces of existing navigation aids such as Google Maps, so our solution will provide navigation directions that are more helpful for the visually impaired. It will give literal step-by-step navigation and express remaining distance in terms of time or steps rather than miles. One of the most significant improvements we will make over current accessibility options is providing indoor navigation as well. In our research, we found that some stores already have methods of locating the aisle for a given item, but this does not necessarily help a blind person navigate to that aisle, or to the section of the aisle the item is in. By combining this publicly available information with crowd-sourced mapping or partnerships with stores, our solution aims to extend our improved style of navigation into indoor spaces, allowing visually impaired people to navigate stores and other unfamiliar indoor areas in the same way.
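
To make the units concrete, here is a minimal sketch of the kind of conversion we have in mind; the stride length and walking pace below are illustrative assumptions, not measured values.

```python
# Minimal sketch: convert a remaining distance into steps and time,
# the units our participants said they actually find useful.
# The stride length and walking pace are illustrative assumptions.

def distance_to_guidance(distance_m: float,
                         stride_m: float = 0.7,
                         pace_m_per_s: float = 1.2) -> str:
    steps = round(distance_m / stride_m)
    seconds = distance_m / pace_m_per_s
    if seconds < 90:
        eta = f"about {round(seconds)} seconds"
    else:
        eta = f"about {round(seconds / 60)} minutes"
    return f"Next turn in {steps} steps, {eta}"

print(distance_to_guidance(50))  # "Next turn in 71 steps, about 42 seconds"
```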

To combine all of these features into a single design, the user's phone will act as the "brain" of the operation, processing navigation and other complex information. The phone can interface with a device on the handle of a classic white cane, providing the user with additional haptic feedback at their fingertips. The phone will present a UI designed for blind and low-vision users, meaning it is highly compatible with a screen reader and has configurable-size text and buttons. By leaning on existing voice recognition solutions, we can let the user easily input more complex information, such as the destination grocery store or the item they wish to locate within it. Our solution will also allow users to input their entire shopping list and will automatically route them through the grocery store to locate all of their items; a routing sketch follows the storyboard below. Shown below in Figure 2 is a storyboard of someone using our app to navigate through a grocery store.

Storyboard of John, a hypothetical user, entering grocery items into the app and visiting the grocery store to find all of the indicated items with navigation
Figure 2: Storyboard of Navigating a Grocery Store
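
To make the shopping-list routing concrete, here is a minimal sketch of how the app might order a list by aisle and section; the data structures and store layout below are hypothetical, and a real planner would draw on store data or crowd-sourced maps.

```python
# Minimal sketch of shopping-list routing: given each item's aisle and
# position along that aisle, visit aisles in order and walk each aisle
# front to back. All names and positions here are hypothetical.

from dataclasses import dataclass

@dataclass
class Item:
    name: str
    aisle: int     # which aisle the item is in
    section: int   # position along the aisle (0 = front of the aisle)

def plan_route(shopping_list: list[Item]) -> list[Item]:
    # Sort by aisle, then by position within the aisle, so the user
    # never has to double back for an item they already passed.
    return sorted(shopping_list, key=lambda item: (item.aisle, item.section))

groceries = [
    Item("oat milk", aisle=7, section=3),
    Item("bread", aisle=2, section=1),
    Item("rice", aisle=7, section=0),
]

for stop in plan_route(groceries):
    print(f"Aisle {stop.aisle}, section {stop.section}: {stop.name}")
```

A production route planner would also alternate walking direction between adjacent aisles and account for entrances and checkout lanes, but even this simple ordering avoids unnecessary backtracking.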

Again, the main goal of our device is to enhance the existing capabilities of screen readers by using haptic feedback on the cane to convey additional information, especially in situations where auditory feedback is difficult, such as in loud areas when the user isn't carrying earbuds, or simply when the user prefers it. In line with our original goal of making our solution intuitive and simple, we intend to encode very straightforward vibrations to indicate information like proximity to the destination, alerts about the nearby area, or other important notifications that should be delivered independently of the cell phone. We also believe that, with an array of cameras on the cane, we can supplement poor GPS signal using computer vision and collaborations with store owners to provide precise navigation, hopefully down to a specific section of an aisle. Finally, this camera array lets us improve on the most basic functionality of a typical white cane by alerting the user when the tip has passed underneath a chest-high object that might otherwise go undetected.
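
As a sketch of what "straightforward" vibrations could look like, here is one possible pattern vocabulary; the specific patterns, event names, and the idea of streaming them to the cane over Bluetooth are assumptions rather than a finished specification.

```python
# Minimal sketch of a vibration vocabulary for the cane handle.
# Each pattern is a list of (on_ms, off_ms) pulses; the mapping is
# illustrative only, and the Bluetooth link to the cane is left abstract.

VIBRATION_PATTERNS = {
    "turn_left":        [(200, 100)],               # one short pulse
    "turn_right":       [(200, 100), (200, 100)],   # two short pulses
    "destination_near": [(600, 200)],               # one long pulse
    "obstacle_ahead":   [(100, 50)] * 4,            # rapid warning burst
}

def send_pattern(event: str) -> None:
    for on_ms, off_ms in VIBRATION_PATTERNS[event]:
        # A real implementation would write these timings to the cane's
        # haptic motor over a Bluetooth connection instead of printing.
        print(f"vibrate {on_ms} ms, pause {off_ms} ms")

send_pattern("obstacle_ahead")
```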

The improved white cane and accompanying app should help improve the navigational ability of blind and low-vision users. Specifically, they will be able to travel and shop without depending on unreliable applications or on help from busy store employees, which is often unavailable. With our design, users can more easily and independently navigate both indoor and outdoor environments with confidence.

Written by CSE 440 Staff

University of Washington Computer Science, Intro to Human Computer Interaction
