Giving visually impaired runners some room to move.

Project Overview

Working with: Catherine Getchell

My role: project manager, UX lead

Timeline: 3 months

The team: Jonathan Loeb, Jason Zhu, Patrick Carrington
Running Blind



Blind runners currently need to be tethered to a sighted guide by a rope that is only 6-10 inches long. The tether forces the runner and guide to synchronize their movements so that they can traverse terrain and avoid obstacles at the same time.

Our Target Users

We aimed to create a generalizable solution for any blind or visually impaired runner, accompanied by a sighted guide, on consistently flat terrain.

For this project, we worked with Catherine Getchell, the head of Disability Services at CMU. Catherine, who has been blind for a long time, helped us quickly test concepts and offered her experience with prior technological solutions.

Blind athlete running, tethered to a sighted guide. Photo credit: Tim Hipps,


Breaking Ties

As you might imagine, running while attached to another person is not comfortable. Though blind runners face many challenges, we focused our efforts on designing a system that obviates the need for the tether. Since a sighted guide will still be present, we did not need to solve for obstacles or terrain. This project aimed to determine the guide's distance and direction, and then communicate that information to the runner.

Visual sketch of CV programming.


To put our design into context, imagine that, instead of the runner and guide running side by side, the runner is several feet behind the guide.


To get the guide’s position, our system employs computer vision using a smartphone’s camera and processing power. The phone, mounted to the runner’s chest using a harness, tracks an AR marker placed on the sighted guide’s back. Our program then processes the video and calculates the distance and direction based on the relative size of the marker and its position on screen.
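The geometry behind this calculation can be sketched with a simple pinhole-camera model: the marker's apparent size shrinks linearly with distance, and its horizontal offset from the image center gives the bearing. The constants below are illustrative assumptions, not our actual calibration values.

```python
import math

# Assumed example values, not the project's real calibration.
FOCAL_PX = 1000.0        # camera focal length in pixels (assumed)
MARKER_SIZE_M = 0.20     # physical marker width in metres (assumed)
IMAGE_CENTER_X = 640.0   # optical centre of a 1280 px wide frame (assumed)

def guide_position(marker_px_width, marker_center_x):
    """Estimate the guide's distance (m) and bearing (deg, + = right)."""
    # Similar triangles: apparent width shrinks linearly with distance.
    distance = FOCAL_PX * MARKER_SIZE_M / marker_px_width
    # Horizontal offset from the image centre maps to a bearing angle.
    bearing = math.degrees(math.atan2(marker_center_x - IMAGE_CENTER_X, FOCAL_PX))
    return distance, bearing
```

A marker that fills 100 px dead-center would read as 2 m straight ahead under these assumed constants; half the pixel width means twice the distance.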

To communicate the guide's position to the runner, we started developing a haptic belt, originally proposed by Catherine. This belt would consist of five motors distributed around the front of the waist. Based on the intensity and position of the motors that activate, the runner can discern where the sighted guide is in real time.

Computer Vision recognizing the ArUco marker.


Competitive Analysis

Before exploring ideas of our own, we researched existing products, looking at their functionality, mode of communication, and compatibility with other devices. When we spoke to Catherine, who had tried some of these other products, it was clear that many of them do not offer useful features, and the ones that do are too imprecise.

Testing sound at different distances, recorded on 2 laptops to analyze the sound waves.

Measuring Relative Position

To get distance and direction, we explored the following technologies.

  • GPS

  • Bluetooth

  • Radio Frequency

  • Sound

  • Computer Vision


We quickly eliminated the first four options, as each proved unreliable in one way or another: GPS has a margin of error of about 4 meters, Bluetooth signal strength fluctuates constantly, radio frequency is extremely complex to implement, and sound waves yielded inconsistent results.

Thus, we determined that computer vision was the best direction to pursue.


Our first instinct was to use auditory feedback; however, for those who are blind, the sense of hearing is everything. It is how they “see” the world, and obstructing it was not an option. Even bone-conduction headphones, which do not block the ear canals, interfere too much with the sense of hearing.

With both sight and sound ruled out, we looked at communicating through touch. Some ideas included a haptic vest, belt, wristbands, and even gauntlets; however, we found that the most effective, economical, and comfortable form was the belt.

Ideation sketches for haptic feedback.


The method of communication is one thing, but we also needed to know exactly what information we should communicate. Therefore, we took to the track with Catherine, and tested out different levels of precision for distance and direction.

In short, we found that the more precise and succinct the feedback, the better.


Catherine felt significantly more comfortable knowing exactly how far she was from the sighted guide (in feet), and the angle from her center (in degrees).

Usability study with Catherine, Jason, and me at CMU's track.
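For illustration, turning a raw metric estimate into that kind of succinct feet-and-degrees callout might look like the hypothetical helper below; it is a sketch, not the project's code.

```python
FEET_PER_METRE = 3.28084

def callout(distance_m, bearing_deg):
    """Round a position estimate into a short feet-and-degrees phrase."""
    feet = round(distance_m * FEET_PER_METRE)
    degrees = round(bearing_deg)
    if degrees == 0:
        return f"{feet} feet, straight ahead"
    side = "left" if degrees < 0 else "right"
    return f"{feet} feet, {abs(degrees)} degrees {side}"

# e.g. callout(3.0, 0.0) -> "10 feet, straight ahead"
```

Rounding to whole feet and degrees keeps the message short, which matched what we observed on the track.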


Our solution consists of four main components:

  • Computer vision software

  • A smartphone mounted on a harness

  • An attachable ArUco marker

  • A haptic belt with a microcontroller

Computer vision tracking a solid color and activating LEDs based on its position.

Computer Vision

Using OpenCV and Processing, we created a program that could run on Android phones as a Processing sketch.

Computer vision recognizing the ArUco marker.


The smartphone’s camera captures video, and our program processes each frame with an algorithm that tracks a predefined marker.

To counteract the turbulence caused by running, we used a fisheye lens and slow-motion video capture.

We stabilized the video with a fisheye lens and slow motion. Pixel 2 image credit: Google

ArUco Marker

To track the sighted guide, we needed a unique symbol. We considered tracking a solid color; however, colors shift under different lighting conditions, so we instead used pattern recognition with an ArUco marker.

The marker we used in testing.

Haptic Belt

Our design uses a microcontroller such as an Arduino to activate the motors embedded within the belt. The intensity and position of the motors that vibrate indicate the distance and direction, respectively. With more time, we would refine this prototype into a polished wearable and further test the haptic feedback.

Using Arduino, we converted the output from computer vision to haptic feedback.
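The vision-to-belt mapping can be sketched as below: the bearing selects which of the five motors vibrates, and the distance sets the intensity. The motor angles, the farther-equals-stronger rule, and the 10-metre cap are all assumptions for illustration, not the tested design.

```python
# Hypothetical vision-to-belt mapping (assumed values, not the tested design).
MOTOR_ANGLES = [-60, -30, 0, 30, 60]   # assumed motor bearings across the waist
MAX_DISTANCE_M = 10.0                  # assumed cap for full-strength vibration

def belt_command(distance_m, bearing_deg):
    """Return (motor_index, pwm_0_to_255) for a given guide position."""
    # Pick the motor whose bearing is closest to the guide's direction.
    motor = min(range(len(MOTOR_ANGLES)),
                key=lambda i: abs(MOTOR_ANGLES[i] - bearing_deg))
    # Scale intensity with distance, capped at MAX_DISTANCE_M.
    pwm = round(255 * min(distance_m, MAX_DISTANCE_M) / MAX_DISTANCE_M)
    return motor, pwm
```

On an Arduino, the PWM value would be sent to the chosen motor with `analogWrite`, which accepts exactly this 0-255 range.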


Next Steps

Though this is a great start toward a truly useful technological solution, there is more that can be done.

Further testing is needed to properly calibrate the calculations for distance and direction based on the marker's relative size and position.

With a software engineer, our prototype could serve as the foundation for a fully featured app that lets users set up and control the system.

It would also be valuable to incorporate a more accessible control scheme for blind users, such as a conversational interface.

Finally, we would bring in an industrial designer to build higher grade physical components. The motors and microcontroller would be fastened onto a belt, and the harness would benefit from shock absorbing materials to reduce the turbulence caused by running.
