Next-Gen Accessible Self-Checkout
Redefining retail checkout with inclusive design
Accessibility
Inclusive Design
Design Strategy
Usability Testing
[2024]
Overview:
Shoppers with low or no vision currently face barriers when using self-checkout, which relies heavily on visual touchscreens. To meet the European Accessibility Act (EAA) requirements taking effect June 28, 2025, we designed an audio-guided experience using uNav hardware with a headphone jack and a tactile controller featuring directional arrows and a center button. Alongside the audio experience, we introduced a magnifier, text resize mode, and reach mode to make the entire process easier for people with different types of disabilities.
This solution enables non-visual navigation across both Next Generation and Legacy self-checkout platforms.
Company: NCR Voyix
Timeframe: 9 months
Industry: Retail checkout, Accessibility
Role: Product Designer
Patents filed: 4
Why did we do it?
We did this to help NCR Voyix meet EU accessibility requirements for self-checkouts, which are essential for selling Next Gen self-checkouts and retaining Legacy customers in Europe. Compliance includes features such as text-to-speech, magnification, and text resizing, ensuring that all users can navigate the experience independently.
Project goals:
Our goal was to design a non-visual, audio-first experience that delivers clear, efficient guidance. We focused on optimizing the quality, amount, and order of information to reduce cognitive load and help users navigate self-checkout with confidence.
What did I do?
I worked alongside a teammate for 9 months to design this experience from the ground up. Together, we ideated the navigation flow, crafted audio scripts, and tested with 59 real users to ensure the experience met their needs.
I collaborated closely with the engineering team through numerous working sessions to ensure accurate implementation, and partnered with a third-party auditor to obtain a VPAT for accessibility compliance.
Our design process
Research
Competitive Analysis
We began by researching how other companies approach accessibility in self-service. As part of this, we reviewed audio navigation experiences at the Atlanta airport to understand current standards in assistive technology.
We focused on navigation logic, scripting style, and available controls to gather insights that shaped our own design decisions.
Overview
True Personas: We were able to test our designs with a diverse participant pool – ranging in levels of visual impairment and technology use in both in-person and online settings.
Design, Test, Iterate, Repeat: Over the course of seven rounds of testing, we used a range of methodologies to test our designs. This included discovery surveys, A/B testing, time-on-task usability testing, and validation testing.
Comprehensive Flows: To cover all unique interactions across both platforms, we tested a wide variety of workflows with participants, including both common happy paths as well as error flows.
Solution
Section-Based Layout
Each page is divided into sections, with audio cues guiding users step-by-step to create a structured, intuitive experience.
Why? A section-based layout simplifies navigation for users with low or no vision, making the experience more accessible and easier to follow.
Breaking down a page
The audio experience design starts with a given page, which is broken into sections, and each section into its contents.
A page is a set of contained sections from which users initiate or complete workflows.
Navigation
Up / Down Arrow keys: Move between sections, allowing users to navigate through different parts of the interface.
Left / Right Arrow Keys: Move within a section, helping users explore available options within a specific area.
Center Key: Confirms a selection, enabling users to make choices as they navigate.
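Under these key mappings, the controller's focus model can be sketched as a small state machine. This is a hypothetical Python sketch, not the shipped implementation; the section and option names are illustrative:

```python
class AudioNav:
    """Minimal sketch of section/option focus handling for the tactile controller."""

    def __init__(self, sections):
        # sections: list of (section_name, [option_labels])
        self.sections = sections
        self.section = 0   # index of the focused section
        self.option = -1   # -1 means the section header itself is focused

    def press(self, key):
        _, options = self.sections[self.section]
        if key == "down" and self.section < len(self.sections) - 1:
            self.section += 1
            self.option = -1
            return self.sections[self.section][0]     # announce new section
        if key == "up" and self.section > 0:
            self.section -= 1
            self.option = -1
            return self.sections[self.section][0]
        if key == "right" and self.option < len(options) - 1:
            self.option += 1
            return options[self.option]               # announce next option
        if key == "left" and self.option > 0:
            self.option -= 1
            return options[self.option]
        if key == "center" and self.option >= 0:
            return f"{options[self.option]} button selected"
        return None  # no navigation happened; play the negative sound effect
```

Returning `None` for a key press that cannot move focus is what would trigger the "no interaction" feedback sound described later.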
Legacy Self-Checkout Prototype
How did we pitch this to engineering?
Step 1: Define the section
The first step is to define the sections in the UI for each page and determine their order. This establishes the structure and flow, ensuring users can navigate the page intuitively and effectively.
Step 2: Name the section
The next step is naming each section. Clear and descriptive names help ensure everyone understands the purpose of each section, making the navigation and implementation process straightforward.
- Configurable by Customers
- Adaptable Translation
Step 3: Define the section type
We categorize each section by its type, which directly impacts the instructions provided. Each section type comes with its own set of instructions, ensuring users receive the right guidance based on the section they’re in.
- Start of Page
- Informational Element
- 1 / 2+ options
- Items
Step 4: Add instructions
Page start: “[Page name] page. [Additional page text if applicable]. This page has [#] sections. Press the down arrow key to go to the first section.”
Informational Element: “[Section name]. [Additional section text if applicable]. Press the down arrow key to go to the next section.”
1 Option: “[Button label] button. Press the center key to select. Press the down arrow key to go to the next section.”
2+ Options: “[Section Name]. Press the right arrow key to go to your [#] options. Press the down arrow key to go to the next section.”
1+ Items: “[Section Name]. Press the right arrow key to go to your [#] item(s). Press the down arrow key to go to the next section.”
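The templates above are mechanical enough to be generated from section metadata. A hedged sketch of such a generator, assuming the section type is known at scripting time (function and parameter names are my own, not the production API):

```python
# Shared suffix used by every non-final template above.
NEXT = "Press the down arrow key to go to the next section."

def section_script(kind, name, count=0, extra=""):
    """Build the audio instruction string for a section of the given type."""
    extra = f" {extra}" if extra else ""
    if kind == "page_start":
        return (f"{name} page.{extra} This page has {count} sections. "
                "Press the down arrow key to go to the first section.")
    if kind == "info":
        return f"{name}.{extra} {NEXT}"
    if kind == "one_option":
        return f"{name} button. Press the center key to select. {NEXT}"
    if kind == "options":
        return f"{name}. Press the right arrow key to go to your {count} options. {NEXT}"
    if kind == "items":
        return f"{name}. Press the right arrow key to go to your {count} item(s). {NEXT}"
    raise ValueError(f"unknown section type: {kind}")
```

Centralizing the wording this way keeps the scripts consistent across pages and makes them straightforward to translate, which matters given the adaptable-translation requirement above.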
Step 5: Define Items & Options in section
Items: Cart, Quick add items, Picklist items
Options: All other selectable buttons.
Selectable vs Non-selectable: Sections can contain lists of non-selectable text elements that should still be navigable and part of a list scripting structure.
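The item/option/non-selectable distinction can be captured in a small data model, with positional scripting ("1 of 4") derived from an entry's place in its list. This is an illustrative sketch; the `Entry` type and `entry_script` helper are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Entry:
    label: str
    selectable: bool = True  # non-selectable text is still navigable and scripted

def entry_script(entries, i):
    """Audio script for the i-th entry in a section's option or item list."""
    e = entries[i]
    parts = [f"{e.label} button." if e.selectable else f"{e.label}.",
             f"{i + 1} of {len(entries)}."]
    if e.selectable:
        parts.append("Press the center key to select.")
    parts.append("Press the right arrow key to go to the next option "
                 "or the down arrow key to go to the next section.")
    return " ".join(parts)
```

Non-selectable entries simply skip the "Press the center key to select" clause while keeping their place in the "X of N" sequence.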
Step 6: Selection Feedback
To ensure the user knows their selection was made, each button gives audio feedback after the center key is pressed:
“[Button label] button selected”
followed by any new instructions based on the current navigation focus.
Example
Section contents: Global functions
Section name: “Help and Settings”
Section type: 2+ Options
Section instructions: “[Help and Settings]. Press the right arrow key to go to your [4] options. Press the down arrow key to go to the next section.”
Options: “[Assistance] button. 1 of 4. Press the center key to select. Press the right arrow key to go to the next option or the down arrow key to go to the next section.”
Feedback: “[Assistance] button selected.”
Controls & Indicators
Playback Speed
Adjustable audio playback speeds provide users with personalized control.
Repeat Last Audio
After 5 seconds of inactivity, the system repeats the last audio instruction.
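The repeat behavior can be sketched as an inactivity timer that re-arms on every key press. A hypothetical Python sketch (the `play` callback stands in for whatever speaks a script on the real hardware):

```python
import threading

class RepeatTimer:
    """Sketch: replay the last audio prompt after a period of inactivity."""

    def __init__(self, play, delay=5.0):
        self.play = play        # callback that speaks a script aloud
        self.delay = delay      # seconds of inactivity before repeating
        self.last = None
        self._timer = None

    def announce(self, script):
        self.last = script
        self.play(script)
        self._rearm()

    def key_pressed(self):
        self._rearm()  # any key press resets the inactivity window

    def _rearm(self):
        if self._timer:
            self._timer.cancel()
        if self.last:
            self._timer = threading.Timer(self.delay, self.play, [self.last])
            self._timer.daemon = True
            self._timer.start()
```

This sketch repeats the prompt once per quiet period; whether the production system repeats once or continuously is not specified here.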

No Interaction Feedback
A negative sound effect will play when a key press doesn’t trigger navigation.

Blind shopper using text-to-speech on Next Gen Self-Checkout
Additional Accessibility Features
Reach Mode designed for people with mobility disabilities
Magnifier designed for people with low vision
Text resize designed for people with low vision
Takeaways
This project gave me a deep appreciation for the role of accessibility in UX, something I hadn’t fully explored at SCAD. Working closely with real users helped me understand their needs and frustrations with digital kiosks, reinforcing the importance of designing inclusively.
I learned to prioritize frequent testing, gained insight into the architecture of self-checkout systems, and collaborated with engineers to build a complex experience from scratch. I also learned how to interpret accessibility requirements and work with a third-party auditor to complete a VPAT.
Most importantly, this project taught me to always consider accessibility in my design process, no matter what product I am working on.