For this project, I took on the challenge of creating an app that breaks down accessibility barriers for individuals with visual impairments. The app features a voice-activated AI assistant, "Venus," which I programmed to guide blind users through a simple, easy-to-use interface with voice commands, ensuring an inclusive and seamless fitness experience for all users.
Traditional fitness apps predominantly rely on visual cues, making them inaccessible and confusing for users with visual impairments. The lack of accessibility features in these apps alienates a significant portion of the population from engaging in fitness routines and tracking their progress effectively.
To address the accessibility gap, I conceptualized and designed VizFit, a user-centric app that prioritizes inclusivity. The key solution is the integration of a voice-activated AI assistant, "Venus." Through ProtoPie programming, blind users can easily call upon Venus to guide them through workouts, choose exercises, and track their progress using intuitive voice commands.
Accessible Design
UX Researcher, UX Designer, Visual Designer, Logo Designer
Figma, ProtoPie, Illustrator
Recruiting blind and visually impaired research participants proved difficult within my limited timeline, so I put myself in the users' shoes by using fitness apps with my smartphone's VoiceOver feature enabled in the accessibility settings. I also joined several Facebook groups for blind and visually impaired users, where members shared their first-hand experiences of using mobile fitness apps.
I also conducted secondary research via Reddit forums and online articles to better understand my target market, empathize with their frustrations, and learn how they use smartphones to tackle daily challenges.
My preliminary research showed that users consistently reported similar challenges when using fitness apps.
Inaccessible buttons, menus, and content result in a frustrating user experience.
Users may struggle to find and access essential features or content within the app.
Users with visual impairments may find it challenging to control and navigate the app without voice input.
Based on my learnings from the research, I created two personas as my target users. My first persona is partially visually impaired and my second persona is blind; I chose to put them into two groups because they have very different needs and considerations.
Once I understood users' pain points and identified key problems, I immediately began brainstorming solutions to answer "how might we" questions. I wanted to reduce the friction of getting started with fitness apps and minimize the steps users must take to make the app accessible to them.
The main issue was that the UI was too complex for screen readers and users with low vision. I came up with VizFit, an app that lets users access fitness routines on their phones without relying on manual processes or confusing UI that interrupts their workouts or makes navigation on a mobile phone too difficult.
To advance the project, I needed to find software capable of programming a voice assistant. Unfortunately, I couldn't find the necessary features among Figma's various plugins.
Enter ProtoPie: an advanced prototyping tool enabling dynamic interactions. I promptly delved into ProtoPie's Voice Assistants masterclass and initiated the programming of my own voice assistant shortly after. While the process posed significant challenges and extended the project's duration, the end result was a seamlessly functioning voice assistant.
After gathering several different ideas, I landed on voice assistants such as Siri or Alexa as an answer to several user problems. I created Venus: The Voice Activated Fitness Coach. Venus acts as a voice assistant that onboards new users, listens to user feedback, and helps users navigate a simple app geared toward reaching their fitness goals.
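The onboard-listen-navigate behavior described above can be sketched as a simple intent router: a spoken phrase is matched against known commands and mapped to an app action, with a fallback re-prompt when nothing matches. This is a minimal illustrative sketch only; the actual prototype was built visually in ProtoPie, and all intent names and phrases below are hypothetical.

```python
# Hypothetical sketch of the command-routing logic Venus models.
# Intent names and trigger phrases are illustrative, not from the prototype.

INTENTS = {
    "start_workout": ["start workout", "begin workout", "let's go"],
    "choose_exercise": ["choose exercise", "pick exercise", "next exercise"],
    "track_progress": ["show progress", "track progress", "how am i doing"],
}

def route_command(utterance: str) -> str:
    """Map a spoken phrase to an app intent, falling back to a help prompt."""
    text = utterance.lower().strip()
    for intent, phrases in INTENTS.items():
        # Substring match keeps the flow forgiving of filler words.
        if any(phrase in text for phrase in phrases):
            return intent
    return "help"  # Venus re-prompts when no intent matches

print(route_command("Hey Venus, start workout"))  # start_workout
print(route_command("what can you do?"))          # help
```

Keeping the command vocabulary small and forgiving of filler words mirrors the design goal of a simple, low-friction user flow.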
Because the purpose of VizFit is specifically to give access to fitness routines via voice commands, it was important to keep the user flow very simple and easy. Considering that the users have difficulties with vision, I had to avoid making the app more complex than it needed to be.
This initial phase involved sketching out the key components, such as the main interface, workout selection, and voice command interactions.
By aligning the initial wireframes with my research findings, I ensured that accessibility goals remained at the forefront of my design process.
Transitioning from paper wireframes, I translated the concepts into a low-fidelity digital prototype. This phase focused on refining the user flow and interactions, incorporating insights from research. Using tools like Figma and Adobe Illustrator, I created a simplified version of the app, emphasizing functionality over aesthetics.
With the foundation established through the low-fidelity prototype, the next step involved elevating the design to a high-fidelity level. Using design tools like Figma and ProtoPie, I incorporated visual elements, color schemes, and detailed UI components. The high-fidelity prototype aimed to provide a realistic representation of the final product, considering not only functionality but also the visual appeal and brand identity of VizFit.
Turn on the volume on your computer to experience the app walkthrough with VoiceOver simulation.
Due to time constraints and a user group that is challenging to access on my own, conducting prototype tests was not feasible. Should time permit, I would enlist participants from student and community groups serving the blind and visually impaired.
Throughout the testing phase, I would evaluate the overall usability of the app, closely observing user interactions. My focus would be on gathering feedback about any overlooked features, identifying areas for enhanced accessibility, and pinpointing unclear sections.
One of the biggest learnings from this project is knowing how unaware I was of accessible design and the needs of people with disabilities. To better empathize with them, I used voice control on my phone to get a better understanding of their fitness app journey from beginning to end. By putting myself in their shoes (to a certain extent), I was able to uncover pain points in their experience.