Figuring out how to make an accessible component in your design system isn’t always straightforward. There’s more to it than creating a design and coding it. Regarding accessibility, teams need to ensure that the user experience translates accurately to the user through assistive technology (AT). AT is any adaptive device or tool that helps people accomplish tasks they couldn’t otherwise do with typical devices or tools. Some examples of assistive technology include screen readers, joystick mouse controllers, and large-print keyboards.
Ideally, when creating an accessible component, the designer knows code and the developer knows UX, so each role has a good sense of how to build the component. Since that’s rare, a more realistic option is to pair designers and developers together to develop accessible components.
Coming up with a solution can feel intimidating. But working toward a common goal can be a lot of fun and feels rewarding. In this article, I’ll cover an approach for how your designer and developer teammates can collaborate on accessible solutions.
Pair design and development
Once the design for a component is ready for implementation, the designer and a developer can pair together. Similar to pair designing or pair programming, they can work in-person or remotely in real-time to figure out an accessible solution.
In this case, each role can lean on the expertise of the other and together find an answer. The designer can provide the perspective of the intended user experience, and the developer can leverage knowledge or interpret code documentation to find an appropriate solution.
Discuss the intent
Since components are the basic building blocks of any UI, we can take their intention for granted. For example, a toggle switch is a clickable control to turn something on or off. Sighted people can take in many visual cues at once, which helps them connect the toggle UI with what it controls.
Regarding accessibility, we need to ensure that the code conveys that meaning through any assistive technology. We want to make sure that a screen reader user understands the information presented, whether any actions are available, and the status of the item. That means we need to think more thoroughly about the intent of the toggle switch. The easiest way to start is to break things down into simple statements. The user should know the following:
- There’s a toggle
- What the toggle controls
- The current state of the toggle
- The action that they can take on the toggle
When thinking about the intent, consider answering these questions:
- Would this component appear in a group? Does the user need to know the total number of items or which item they are on? (Think of a dropdown menu or a bulleted list.)
- What are the states of the component? (e.g., active, loading, disabled, checked, unchecked)
- What actions are available to the user? (e.g., navigate to, select, unselect, etc.)
- What is the purpose of the component? (e.g., Does it inform? Does it allow the user to take action? etc.)
You can use these answers as your accessibility checklist to ensure the coded component can inform the user of these things.
Start brainstorming solutions
With the intent established, the developer can look at possible solutions. As a first check, we recommend seeing if any solutions work from the existing code. (If you both know the current code isn’t reliable, you’ll need to start from scratch.)
We recommend checking whether there’s existing semantic HTML you can leverage. Screen readers can interpret semantic HTML elements more easily than custom-coded items. (Using custom-coded items also means adding custom code to make them accessible. If you can, avoid making things harder on yourself.) Developers have a good handle on this and can quickly figure out some options.
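As a quick illustration of the difference, here’s a sketch comparing a native button with a custom-coded one (the element name and class are made up for this example):

```html
<!-- Native semantic element: keyboard focus, the "button" role, and
     Enter/Space activation all come built in. -->
<button type="button">Save</button>

<!-- Custom element: assistive technology sees only a generic container,
     so you'd have to add the role, tabindex, and key handlers yourself. -->
<div class="save-button" role="button" tabindex="0">Save</div>
```

Even with the added attributes, the custom version still needs JavaScript to handle Enter and Space, which the native button provides for free.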
If you can’t find what you’re looking for, you can dig into the Web Accessibility Initiative - Accessible Rich Internet Applications (WAI-ARIA) work. WAI-ARIA creates the standards that ensure web content and applications are accessible to more people, bringing more equitable access to the internet’s dynamic and interactive experiences. Some good references include the MDN Web Docs and the W3C WAI-ARIA guidelines.
I typically use the MDN Web Docs as a starting point because they’re more digestible. We’ll look around and see if there seem to be viable options. Sometimes I’ll dig deeper into the W3C WAI-ARIA guidelines, but the detail overwhelms me as a designer. Usually, the developer will poke around that site to see what options are available and help make sense of the content.
By abstracting the intent from the toggle’s physical appearance and looking around the ARIA docs, the developer recognized that our toggle behaved more like a checkbox interaction. So when the toggle is on, that option is selected; when the toggle is off, the option is unselected. By collaborating and understanding the intent, he could arrive at this conclusion. It was a relief that we could leverage something that already existed!
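As a rough sketch, the checkbox approach could look like the following. (This is one possible implementation, not necessarily the exact markup our team shipped; the styling that makes the checkbox look like a switch is omitted.)

```html
<!-- A toggle backed by native checkbox semantics. The <label> supplies
     the accessible name, and the checked attribute conveys the state. -->
<label class="toggle">
  <input type="checkbox" checked>
  Send notifications
</label>
```

Because it’s a real checkbox, a screen reader can announce its label, role, and state together (the exact wording varies by screen reader), and the Space key toggles it without any extra JavaScript.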
Try googling for solutions for trickier scenarios or if you aren’t finding anything suitable.
Code and test the component
Once you know what to do, the developer can start coding the component. (Note: before coding, the designer should have tested the component design for sufficient color contrast.) If you’re the designer, you can set up the screen reader while you wait. Depending on your audience’s tools, this might mean using VoiceOver on macOS, or JAWS, NVDA, or Narrator on Windows. If you’re working on mobile devices, it might mean using VoiceOver on iOS or TalkBack on Android.
A few things to note when getting ready to test with a screen reader:
- Test with actual equipment. If your users use a Windows computer, test on a Windows device. Avoid using emulators or virtual machines (e.g., setting up a Windows emulator on a MacBook and using VoiceOver to read the content) because they create unrealistic scenarios that might yield inaccurate results.
- Know how to turn the screen reader off before you turn it on. Nothing is more frustrating than a screen reader that won’t stop talking when you can’t figure out how to turn it off. (If all else fails, hit the “mute” button.)
- Familiarize yourself with screen readers. Screen reader users are comfortable using them, which means they can easily navigate apps and screens and know all the shortcut keys. While you don’t have to be an expert, understand the basics of navigating to make testing more realistic.
Test the component with the screen reader to see how accurately it conveys the information. For the toggle, I listened to hear whether what the screen reader said made sense. I wanted to confirm it didn’t say “checkbox” and then later, out of context, read the label “send notifications.” Instead, I wanted to ensure it conveyed that the option to send notifications was checked and that it’s a control the user can interact with.
Sometimes, it works the first time, which is great! But sometimes, we need to make tweaks. This is where the designer and developer can collaborate some more. The designer can mention what seems off about the user experience, and the developer can adjust the code so the screen reader can read things more accurately. You might have to go back and forth a few times, which can be frustrating and fun. If it gets too frustrating, take a break or seek another opinion. It could also mean that you need to try another solution.
Once you have the basics, keep testing different scenarios. For our toggle component, there are times when the toggle is disabled (i.e., the user can’t interact with it), but they need to know what setting the toggle represents and if it’s on or off. This was trickier to figure out, but with a lot of trial and error, we came up with a solution that conveyed the correct information.
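For the disabled scenario, one option (again assuming the checkbox approach, and not necessarily the exact solution we landed on) is to keep the native checked and disabled attributes so the state still reaches assistive technology:

```html
<!-- The user can't interact with the control, but a screen reader can
     still report its name, its checked state, and that it's disabled. -->
<label class="toggle">
  <input type="checkbox" checked disabled>
  Send notifications
</label>
```

Note that a disabled control is removed from the Tab order; if users need to reach it with the keyboard, `aria-disabled="true"` is a common alternative, though then you must block interaction yourself.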
You can also evaluate keyboard navigation as you test with the screen reader. When trying with a keyboard, double-check that you do not have to do anything unconventional to interact with the component. The more natural the interaction is, the better.
What about dev tools and automated checkers?
I think dev tools and automated checkers are great for validating things from the perspective of meeting standards and best practices. But tools like that can’t validate the human side of the user experience. So manually experimenting and testing can help ensure that you’re providing the best possible user experience for people.
What about collaborating with accessibility specialists or doing usability testing?
If you have accessibility specialists at your organization, then yes, definitely collaborate with them! They are a great resource and can give you more details about how your organization’s products accommodate accessibility. I still encourage developers and designers to work through solutions together. This helps everyone get more familiar with designing and developing for accessibility.
If you can, we recommend usability testing components with people who rely on AT to navigate the Internet. Companies like Fable provide accessibility testing by people with disabilities.
Document the accessibility
Once you’re happy with the accessibility of the component, take a minute to document how the accessibility works in your design system documentation site. From the developer's angle, make notes about what attributes to use, change, etc.
From the design perspective, talk about the experience conveyed by the screen reader, but keep it high-level. For example, rather than list out every single toggle scenario and what the screen reader says, I would note, “The screen reader reads the label for the control and announces its current state.”
For keyboard-only interactions, it’s helpful to document which key(s) users need to press to navigate or select items.
The benefit of collaboration
The overall benefit of working this way is that the duo can likely create an accurate and accessible solution quickly. Aside from that, it’s a noteworthy occasion for designers and developers to collaborate on something so closely. We rarely get to interact in real-time to solve a problem together. So in some ways, it’s a great team-building exercise—especially if several design/developer duos are involved. It can help create better working relationships, and it feels like an accomplishment when you both succeed together. Additionally, we learn more about each other’s worlds, which builds empathy and allows us to communicate more effectively.
Come chat with us
There’s more than one way to collaborate, so we’d love to hear how you and your teammates have worked together to create accessible components. Do you use other resources? Say hello on zheroes, our Slack community, and let us know.