Detecting eyeball defects

The EyeCheck app lets optometrists assess vision and eye health with the snap of a photo. Version 1, the MVP, was intended for rural India, where low-income residents rely on free pop-up optometry clinics for their eye health. These clinics are often severely understaffed and cannot keep up with the hundreds of people who attend, leading to long wait times and rushed optometrists. EyeCheck helps optometrists triage patients quickly, meaning shorter waits and better care.


My Contributions

  • Wireframes
  • Design (Lead)
The MVP of the EyeCheck app was designed for use in rural, low-income India. Photo by siddashi.

Project Goals

The EyeCheck team developed an algorithm that could detect eyeball abnormalities (like cataracts) and estimate a prescription from a photograph of the eye. My role was to develop a proof-of-concept app to guide users through the picture-taking process and present the results.

The International Sign for Put-Face-Here

Although the first version would be used in India, the app was envisioned for an international audience: the interface needed to translate easily. Colour, text, and layout had to be assessed through a global lens. Ideally, the less text used, the better.

When it came to helping the user take the photograph, this presented an interesting problem. For the app to accurately diagnose abnormalities and estimate a prescription, the eyes needed to be precisely positioned in the photograph. How might we wordlessly guide users to the correct eye placement?

Some of the (mildly terrifying) iterations I explored before landing on the glasses concept.

I explored several concepts before landing on what now seems obvious: glasses. The Take Photo UI became a sort of Pin-the-Tail-on-the-Donkey scenario: the photographer lines up their subject's face with the glasses, and when the face is properly aligned, the Camera icon activates, allowing them to snap the shot.
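The gating behaviour described above can be sketched as a simple overlap check: the shutter stays disabled until the detected face region matches the glasses overlay closely enough. All names, shapes, and thresholds here are illustrative, not the app's actual implementation.

```typescript
// A minimal sketch of shutter gating, assuming a face detector that
// returns a bounding box in the same coordinate space as the overlay.
interface Box {
  x: number;      // left edge, in screen points
  y: number;      // top edge
  width: number;
  height: number;
}

// Intersection-over-union: 1.0 is a perfect match, 0 is no overlap.
function iou(a: Box, b: Box): number {
  const left = Math.max(a.x, b.x);
  const top = Math.max(a.y, b.y);
  const right = Math.min(a.x + a.width, b.x + b.width);
  const bottom = Math.min(a.y + a.height, b.y + b.height);
  if (right <= left || bottom <= top) return 0;
  const inter = (right - left) * (bottom - top);
  const union = a.width * a.height + b.width * b.height - inter;
  return inter / union;
}

// Enable the Camera icon only when the face fills the glasses overlay
// closely enough (threshold is a hypothetical tuning value).
function shutterEnabled(face: Box, overlay: Box, threshold = 0.8): boolean {
  return iou(face, overlay) >= threshold;
}
```

A tolerance below 1.0 matters in practice: requiring a pixel-perfect match would make the shutter flicker as the subject breathes or the photographer's hands shake.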

The solution tested well with users. Most understood the functionality instantly; the rest grasped it within seconds.

Into the darkness

The algorithm depended on pupil size to produce an accurate diagnosis. In order to naturally enlarge the pupils, the photographs needed to be taken in pitch-black rooms: any light might interfere and cause the pupils to shrink.

This constraint informed the app’s palette. I researched how our eyes respond to various kinds of light and chose colours in the warm spectrum to limit pupil contraction as much as possible. Black and a deep brown-gray served as the foundation, with light colours reserved for key information and actions.
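A palette along these lines might look like the sketch below. The hex values are my own illustrations of the described direction (dark warm foundation, warm accents for key actions), not the shipped colours; the `isWarm` check simply encodes the rule of thumb that warm tones carry more red than blue.

```typescript
// Illustrative dark-adapted palette (values are hypothetical).
const palette = {
  background: "#0d0a08",  // near-black with a warm cast
  surface: "#2a211c",     // deep brown-gray for panels and sheets
  textPrimary: "#e8c9a0", // warm light tone, reserved for key information
  accent: "#d98e3a",      // amber for primary actions, e.g. the shutter
};

// Rough warmth check: a warm colour has at least as much red as blue.
function isWarm(hex: string): boolean {
  const r = parseInt(hex.slice(1, 3), 16);
  const b = parseInt(hex.slice(5, 7), 16);
  return r >= b;
}
```

A check like `isWarm` could guard the palette in a style-guide test, so a later contributor doesn't accidentally introduce a bright blue that would contract pupils in the dark room.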

The app flow: the user takes a picture, the app analyzes the photograph and outputs a result. The patient is then triaged according to the result.
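The flow in the caption above can be modelled as a short pipeline from the algorithm's finding to a triage priority. The finding and priority names are hypothetical placeholders; the source only mentions cataracts and prescription as examples of what the algorithm detects.

```typescript
// A sketch of the capture → analyze → triage flow, with illustrative
// finding and priority names.
type Finding = "normal" | "refractive-error" | "cataract";

interface Analysis {
  finding: Finding;
}

// Map the algorithm's output to a triage priority (hypothetical rules).
function triage(a: Analysis): "routine" | "prescription" | "specialist" {
  switch (a.finding) {
    case "normal":
      return "routine";           // no follow-up needed
    case "refractive-error":
      return "prescription";      // route to prescription fitting
    case "cataract":
      return "specialist";        // escalate to the optometrist
  }
}
```

Keeping the triage rules in one small function like this would let the clinic adjust priorities without touching the capture or analysis code.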

Many parts of rural India lack consistent access to electricity. To operate well in these conditions, the app needed to consume as little energy as possible. Every interaction, sound, and animation was examined for the GPU (graphics processing unit) work, and therefore energy, it would require. The result was a minimalist app, with animation used only when communicatively necessary.

Read Next

Reducing night terrors in children (2016). iOS app with hardware. Research, UX Lead.