Improving the Famous White Cane: Providing New Opportunities for Blind and Low-Vision People to Navigate Public Indoor Spaces

CSE 440 Staff
Mar 11, 2022


The Team

Alex Karulin, task-oriented research and design

Grace Callahan, UI design and usability research

Maxwell Campbell, hardware and UI design

Noah Ponto, user research and gathering design feedback

Problem Overview

Our group began this project with a desire to improve accessibility aids and technology for people who are blind or visually impaired. During our user research, we identified a clear lack of solutions for indoor navigation, especially in the context of shopping. Through personal inventories, we found that people primarily rely on their white canes and smartphones and would rather not carry anything else. Many users already rely on their smartphones for outdoor navigation apps like Google Maps, but these existing solutions often depend on visual landmarks and provide minimal (if any) support for indoor spaces. Within a busy indoor area like a grocery store, the main option is finding an available store employee, which can prove difficult and is far from an optimal solution.

Solution

To fulfill users’ desire to avoid carrying additional gadgets, our final design combines a smart white cane handle with a phone application. Users interact with our design much as they might use Google Maps outdoors, but it provides accessible step-by-step directions designed for indoor navigation. We focus on directions designed specifically for blind and low-vision users, for instance by expressing distances as step counts calibrated to each user’s gait. The cane provides directional vibrational feedback to nudge the user in the right direction, so they can feel confident in their stride. Once in a store, users can simply pull out their phones and use their familiar screen reader to search our app. They’ll choose a specific product, then, at the press of a button, quickly and confidently find what they’re looking for and get on with their day.
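The idea of step-based directions can be sketched in a few lines. This is an illustrative sketch, not our implementation: the function names are ours, and it assumes the user does a short calibration walk so the app can learn their average stride length.

```python
# Hypothetical sketch: converting a route leg's distance into step-based
# directions using a per-user stride length learned from a calibration walk.

def calibrate_stride(distance_walked_m: float, steps_taken: int) -> float:
    """Estimate the user's average stride length (meters per step)."""
    return distance_walked_m / steps_taken

def distance_to_steps(distance_m: float, stride_m: float) -> int:
    """Round a leg's distance to a whole number of the user's steps."""
    return round(distance_m / stride_m)

def leg_instruction(direction: str, distance_m: float, stride_m: float) -> str:
    """Render one leg of the route as a screen-reader-friendly instruction."""
    steps = distance_to_steps(distance_m, stride_m)
    return f"Walk {steps} steps, then turn {direction}."

stride = calibrate_stride(distance_walked_m=7.0, steps_taken=10)  # 0.7 m/step
print(leg_instruction("left", 10.5, stride))  # "Walk 15 steps, then turn left."
```

The same route leg would thus read as a different step count for each user, which is the point of the per-user calibration.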

Paper Prototype, Testing Process, and Results

In our first paper prototype, we explored how to build a mobile application for navigating. We took heavy inspiration from modern app design tropes that we thought fit well with our goals. For example, we borrowed the idea of dividing up our locations of interest based on category (recents, popular locations, grocery stores, and so on) from apps like Google Maps and Uber Eats. Additionally, we briefly tested a paper prototype of our white cane, although it was not a primary focus during usability testing.

Our usability testing produced several key takeaways and refinements to our paper prototype that eventually made it into the final version. Several users suggested we improve the flow of searching for items by adding the ability to browse by categories like recent items or by aisle. However, to keep the design usable with screen readers, we also minimized excess information on the screen, instead opting to have each category expand into a full list.

Digital Mockup

In our design process, we chose to focus on two primary tasks our users should be able to accomplish with the help of our product. Feel free to explore our digital mockup in Figma:

https://www.figma.com/proto/JSiTMwcL8SZXNfgyPeRNqi/Final-Digital-Mockup?node-id=7%3A1221&scaling=scale-down&page-id=0%3A1&starting-point-node-id=7%3A1221

Task #1:

The first task is for a user to find navigation directions to a specific item using our app. The design we created supports this task by first allowing the user to choose a store (1), then view options for exploring the store’s items (2), then search for the desired item (3), and finally get to the item page (4), which includes a large “Navigate” button.

At any point, the user can click on “Shopping List” in the bottom right-hand corner to reach a screen that also has a “Navigate” button, which directs the user along the shortest route between all desired items. We deliberately made “Navigate” reachable from several different screens, following Nielsen’s flexibility-and-efficiency-of-use heuristic: the user can navigate to a single item right away, or begin directions only once they have added everything they need to the shopping list.
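One way to think about the multi-item route is as an ordering problem over the shopping list. The sketch below (illustrative only; it assumes each item has a known (x, y) position on the store’s floor map) uses a greedy nearest-neighbor ordering, whereas a production version would route over the actual aisle graph:

```python
# Hypothetical sketch of the multi-item "Navigate" route: greedily visit
# the closest unvisited shopping-list item next. Item positions are
# assumed to come from the store's floor map.

from math import dist

def order_items(start, items):
    """Return item names in greedy nearest-neighbor visiting order.

    start: (x, y) starting position; items: {name: (x, y)}.
    """
    route, here, remaining = [], start, dict(items)
    while remaining:
        nearest = min(remaining, key=lambda name: dist(here, remaining[name]))
        route.append(nearest)
        here = remaining.pop(nearest)
    return route

items = {"milk": (12, 3), "bread": (2, 1), "apples": (5, 8)}
print(order_items((0, 0), items))  # ['bread', 'apples', 'milk']
```

Greedy ordering is not guaranteed to be the true shortest tour, but it keeps the route computation simple and fast enough to rerun whenever the user edits their list.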

Task #2:

The second task we focus on is using a combination of the app and the white cane to effectively follow navigation directions and arrive at a destination. We do this by combining the smartphone’s visual and auditory instructions with vibrational feedback from the white cane. We made sure to comply with the Web Content Accessibility Guidelines’ AAA standard for contrast, so low-vision users can easily distinguish different buttons and directions in our app. Our digital mockup demonstrates a user accomplishing task #2 with phone notifications from Smart Cane.
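The AAA contrast requirement is checkable mechanically: WCAG 2.1 defines a relative-luminance formula and requires a ratio of at least 7:1 for normal text at level AAA. A sketch of the kind of check we could run on our palette (the helper names are ours; the formula follows the WCAG definition for sRGB colors):

```python
# Sketch of a WCAG 2.1 contrast check. AAA requires >= 7:1 for normal text.
# Colors are sRGB tuples in 0-255.

def _linear(c: int) -> float:
    """Linearize one sRGB channel per the WCAG relative-luminance formula."""
    s = c / 255
    return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4

def luminance(rgb) -> float:
    """Relative luminance of an sRGB color."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """(L_lighter + 0.05) / (L_darker + 0.05), in the range 1..21."""
    hi, lo = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

def passes_aaa(fg, bg) -> bool:
    return contrast_ratio(fg, bg) >= 7.0

print(passes_aaa((0, 0, 0), (255, 255, 255)))  # True: black on white is 21:1
```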

A key focus when designing for this task was adhering to Nielsen’s minimalist design heuristic, as our user research warned us not to get carried away with complex vibration encodings or convoluted navigational directions.
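In that spirit, the cane’s vibration vocabulary could stay as small as three cues. The sketch below is purely illustrative (the event names, the (on_ms, off_ms) pulse format, and the side labels are assumptions, not our firmware’s actual interface):

```python
# Hypothetical minimalist vibration vocabulary for the cane handle:
# three cues only, each a side label plus a list of (on_ms, off_ms) pulses.

PATTERNS = {
    "turn_left":  ("left",  [(200, 100)]),      # one short pulse, left side
    "turn_right": ("right", [(200, 100)]),      # one short pulse, right side
    "arrived":    ("both",  [(400, 150)] * 3),  # three long pulses, both sides
}

def cue_for(event: str):
    """Look up the (side, pulses) cue for a navigation event."""
    return PATTERNS[event]

print(cue_for("turn_left"))  # ('left', [(200, 100)])
```

Keeping the vocabulary this small means users never need to memorize an encoding scheme; the side of the vibration itself carries the direction.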

Summary

Throughout this project, we discovered a clear need for more convenient ways for blind and low-vision people to navigate unfamiliar indoor spaces. One story from our early surveys rang clear: navigating a store with a vision impairment can be incredibly frustrating due to unintuitive store layouts, impossible-to-read labels, or unhelpful store employees. With our app, we provide a reliable and easily digestible source of information and directions for blind and low-vision users, and our optional smart cane handle adds a new way to receive alerts and directions tailored to each user’s preferences. We believe our solution will take the frustration out of shopping, allowing our users to enjoy shopping at any store, regardless of circumstance.


CSE 440 Staff

University of Washington Computer Science, Intro to Human Computer Interaction