When we talk about web accessibility, it’s easy to think in checklists: alt text, color contrast, keyboard focus. But behind those checklists are real people with different ways of seeing, hearing, moving, and interacting with the world.
As a web accessibility expert, I work every day to make the digital world more inclusive. I’m not a person with a disability myself, but I’ve learned from – and alongside – people who are. Their experiences have reshaped how I think about design, technology, and equity. Accessibility isn’t just about compliance. It’s about dignity, autonomy, and equal access to information and opportunity. And in a world where more and more of life happens online, this matters more than ever.
In this article, I want to share how people who are blind or visually impaired, deaf or hard of hearing, or living with motor impairments navigate the web – and how you can use this knowledge to make your digital products more accessible.
Blind and visually impaired users
They don’t see your site. They listen to it.
Blind users rely on screen readers — software that turns text into speech — and navigate by keyboard, voice, or gestures. To them, a well-coded site becomes a mental map made of headings, labels, regions, and landmarks.
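To make that mental map concrete, here is a hypothetical page skeleton using semantic landmarks and a heading outline that screen readers can announce and jump between (the page content and `id`s are placeholders):

```html
<!-- Hypothetical skeleton: each landmark (<header>, <nav>, <main>, <footer>)
     is announced by screen readers and can be jumped to directly,
     e.g. with NVDA's "D" key or VoiceOver's rotor. -->
<header>
  <nav aria-label="Main">
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/products">Products</a></li>
    </ul>
  </nav>
</header>
<main>
  <h1>One page title: the root of the heading outline</h1>
  <section aria-labelledby="pricing-heading">
    <h2 id="pricing-heading">Pricing</h2>
    <p>…</p>
  </section>
</main>
<footer>
  <p>Contact and legal links</p>
</footer>
```

Skipping heading levels or building navigation out of generic `<div>`s erases this map, even if the page looks identical to sighted users.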
What can surprise you:
- Blind people have been using the web since the 1990s, thanks to early screen readers.
- Experienced users can “listen” to a website at up to 600 words per minute – faster than most of us can read.
- Most don’t use a mouse at all, just the keyboard or gestures.
- And many leave websites quickly if they’re too hard to use.
Devices blind people often use to interact with the web:
- Screen readers (NVDA, JAWS, VoiceOver, TalkBack)
- Refreshable Braille displays (e.g. Focus 40 Blue, Brailliant BI, Orbit Reader)
- Screen reader-compatible smartphones and tablets
- Keyboard-only navigation tools and shortcuts
Want to understand how a blind person experiences your site? Try these tools:
- NVDA (Windows) – a popular free screen reader
- VoiceOver (Mac/iOS) – built into every Apple device
- Fangs (Firefox) – simulates what screen readers read
- SA11Y – shows how content is structured for screen readers
Deaf and hard of hearing users
They see your site, but may not hear it.
For users who are deaf or hard of hearing, audio content can become invisible. Podcasts, video voiceovers, or sound effects without captions or transcripts leave major gaps in understanding. Visual clarity and text-based alternatives are essential.
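One widely supported way to provide those text-based alternatives is the native `<track>` element plus a linked transcript; the file names below are illustrative:

```html
<!-- Captions via a WebVTT file: kind="captions" covers speech plus
     relevant sound effects, unlike kind="subtitles" (dialogue only).
     A transcript link makes the same content available as plain text. -->
<video controls>
  <source src="product-tour.mp4" type="video/mp4">
  <track kind="captions" src="product-tour.en.vtt"
         srclang="en" label="English" default>
</video>
<p><a href="product-tour-transcript.html">Read the full transcript</a></p>
```

For podcasts or other audio-only content, the transcript link alone is the difference between access and a dead end.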
What can surprise you:
- For many deaf users, written language is a second language – sign language is their first – so clear, well-paced captions matter more than you might think.
- Auto-captions are often wrong. Misinterpreted captions don’t just confuse – they exclude.
- Audio-only content (like podcasts or spoken instructions) becomes a dead end without transcripts or visual cues.
Devices and tools deaf users often rely on:
- Captioned videos and events (live or recorded)
- Subtitles in multiple languages
- Speech-to-text tools and apps
- Vibrating or flashing alerts in place of audio notifications
Want to understand how a deaf person experiences your site? Try these tools:
- Amara – free collaborative captioning tool
- Web Captioner – real-time captions for events and videos
- Starkey Hearing Loss Simulator – simulates various types of hearing loss so you can hear how audio is filtered
Quick test you can try: watch your site’s video content on mute. Does it still make sense? Could a deaf user get the same value from it?
Users with motor disabilities
They may not use a mouse — or even their hands.
For users with limited mobility — due to conditions like cerebral palsy, spinal cord injuries, multiple sclerosis, or Parkinson’s — interacting with a website may depend entirely on a keyboard, voice commands, switch devices, or eye-tracking systems. Precision gestures like drag-and-drop or small clickable areas can become major barriers.
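A common version of this barrier is an action wired to a clickable `<div>`, which keyboard, switch, and voice users often cannot reach at all. A minimal fix (class names and the `addToCart` handler are illustrative) is a real button with a generous target:

```html
<!-- Inaccessible: no keyboard focus, no role, tiny hit area -->
<div class="cart-icon" onclick="addToCart()">🛒</div>

<!-- Accessible: focusable by default, activates with Enter/Space,
     and named so voice-control users can say "click Add to cart" -->
<button type="button" class="add-to-cart" onclick="addToCart()">
  Add to cart
</button>
<style>
  /* WCAG 2.2 (2.5.8) asks for targets of at least 24×24 CSS pixels;
     44×44 is a more comfortable size for users with tremors. */
  .add-to-cart { min-width: 44px; min-height: 44px; }
</style>
```

The same principle applies to drag-and-drop: always offer a single-click or keyboard-operable alternative for the same action.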
What can surprise you:
- Eye-tracking users can click with a blink – and type entire documents using on-screen keyboards and gaze tracking.
- Some power users operate computers entirely by voice or eye movements – writing emails, coding, or designing – all without touching a keyboard.
- Some users rely on keyboard navigation and may need to press Tab dozens of times just to reach one button.
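The dozens-of-Tab-presses problem is usually eased with a skip link – a small, standard pattern along these lines (the `main-content` id is illustrative):

```html
<!-- First focusable element on the page: visually hidden until it
     receives focus, then lets keyboard users jump past the navigation. -->
<a class="skip-link" href="#main-content">Skip to main content</a>
<style>
  .skip-link { position: absolute; left: -9999px; }
  .skip-link:focus { position: absolute; left: 1rem; top: 1rem; }
</style>
<!-- …header and navigation… -->
<main id="main-content" tabindex="-1">…</main>
```

The `tabindex="-1"` on `<main>` lets the skip link move focus there programmatically without adding the region to the normal Tab order.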
Devices and assistive tech often used:
- Switch controls (e.g. sip-and-puff systems, head wands)
- Eye-tracking systems (e.g. Tobii Dynavox, EyeTech)
- Voice recognition software (e.g. Dragon NaturallySpeaking, Voice Control on iOS/macOS)
- Adaptive keyboards, trackballs, or joysticks
Want to understand how a motor-impaired user experiences your site? Try this:
- Unplug your mouse and navigate your site using just the keyboard (Tab, Shift+Tab, Enter, Arrow keys). Can you reach every interactive element?
- Atalhos Acessíveis (Portuguese for "Accessible Shortcuts") – enables the creation of accessible keyboard shortcuts.
- Try using Voice Control (macOS/iOS) or Talon Voice (Windows) to simulate hands-free navigation.
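While running that keyboard test, also check that you can always see where you are. A minimal sketch of a visible focus style (the color is just an example):

```html
<style>
  /* Never remove outlines wholesale (outline: none) without a replacement.
     :focus-visible shows the ring for keyboard navigation without
     flashing it on every mouse click. */
  :focus-visible {
    outline: 3px solid #1a73e8;
    outline-offset: 2px;
  }
</style>
```

If you ever lose track of the focus indicator during the unplugged-mouse test, neither can a user who depends on it.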
Making the web accessible isn’t about perfection – it’s about intention. When we understand how people with different disabilities navigate digital spaces, we start to see design differently: not just as visual or functional, but as inclusive or exclusive. Every choice, from adding alt text to structuring headings or enabling keyboard navigation, can open up access or quietly shut someone out. Accessibility isn’t extra work; it’s essential work that reflects the kind of digital world we want to create: one where everyone belongs.