The EyeCheck app lets optometrists assess vision and eye health with the snap of a photo. Version 1, the MVP, was intended for rural India, where low-income residents rely on free pop-up optometry clinics for their eye health. These clinics are often severely understaffed and cannot keep up with the hundreds of people who attend, meaning long wait times and rushed optometrists. EyeCheck helps optometrists triage quickly, meaning shorter waits for patients and better care.
The EyeCheck team developed an algorithm that could detect eye abnormalities (like cataracts) and estimate a prescription from a photograph of the eye. My role was to develop a proof-of-concept app to guide users through the picture-taking process and present the results.
Although the first version would be used in India, the vision for the app was an international audience: the interface needed to be easily translated. Colour, text and layout needed to be assessed through a globalized lens. Ideally, the less text used, the better.
Helping the user take the photograph presented an interesting problem. For the app to accurately diagnose abnormalities and estimate a prescription, the eyes needed to be precisely positioned in the photograph. How might we wordlessly guide users to the correct eye placement?
I explored several concepts before landing on what now seems obvious: glasses. The Take Photo UI became a sort of Pin-the-Tail-on-the-Donkey scenario: the photographer lines up their subject's face with the glasses, and when the face is properly aligned, the Camera icon activates and allows them to snap the shot.
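The alignment-gated shutter described above can be sketched roughly as follows. This is a minimal illustration, not the actual EyeCheck implementation: the target coordinates, tolerance value, and function names are all hypothetical, and it assumes some face-detection layer supplies pupil positions in normalized screen coordinates.

```typescript
// Hypothetical sketch: enable the shutter only when both detected
// pupils fall inside the glasses-overlay target zones.

interface Point { x: number; y: number; }

// Target pupil positions defined by the glasses overlay (normalized
// 0–1 screen coordinates), with a tolerance radius for "close enough".
// These values are illustrative assumptions, not EyeCheck's real ones.
const LEFT_TARGET: Point = { x: 0.38, y: 0.45 };
const RIGHT_TARGET: Point = { x: 0.62, y: 0.45 };
const TOLERANCE = 0.04;

function withinTolerance(detected: Point, target: Point): boolean {
  const dx = detected.x - target.x;
  const dy = detected.y - target.y;
  return Math.hypot(dx, dy) <= TOLERANCE;
}

// Called on each camera frame with the detected pupil positions;
// the Camera icon is enabled only while this returns true.
function shutterEnabled(leftPupil: Point, rightPupil: Point): boolean {
  return withinTolerance(leftPupil, LEFT_TARGET) &&
         withinTolerance(rightPupil, RIGHT_TARGET);
}
```

Gating the shutter this way means the photographer never has to judge alignment themselves: the activated icon is the wordless signal that the shot is ready to take.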
The solution tested well with users. Most understood the functionality instantly, and the rest caught on within seconds.
The algorithm depended on pupil size to produce an accurate diagnosis. In order to naturally enlarge the pupils, the photographs needed to be taken in pitch-black rooms: any light might interfere and cause the pupils to shrink.
This constraint informed the app’s palette. I researched how our eyes respond to various kinds of light and chose colours in the warm spectrum to limit pupil contraction as much as possible. Black and a deep brown-gray served as the foundation, with light colours reserved for key information and actions.
Many parts of rural India lack consistent access to electricity. To operate well in these conditions, the app needed to consume as little energy as possible. Every interaction, sound and animation was examined with respect to the power it consumed.