IT Brief US - Technology news for CIOs & IT decision-makers
AI accessibility tools miss major app barriers, study finds

Fri, 15th May 2026
Joseph Gabriel Lagonsin, News Editor

Applause has published research showing that more than half of assistive technology users have been blocked by inaccessible apps this year, even as most organisations surveyed said they are using artificial intelligence to improve accessibility.

The study drew on responses from more than 500 development and quality assurance professionals and more than 1,000 people who use assistive technology, including screen readers, captioning, font magnification and alternative navigation tools.

It found a gap between corporate use of AI tools and the experience of people who rely on accessibility features. While 78% of organisations said they use AI to improve digital accessibility in websites and applications, 56% of assistive technology users said they had regularly encountered inaccessible apps since the start of the year.

Those barriers often prevented users from completing basic tasks. The data showed that 28% faced such issues monthly and 17% weekly.

Poor accessibility also appears to carry a commercial cost, with 44% of assistive technology users saying they were highly likely to abandon apps with poor accessibility.

At the same time, accessible services may strengthen customer retention. The report found that 97% of assistive technology users were loyal to brands that offer accessible experiences, with 62% describing themselves as extremely loyal.

AI adoption

Organisations reported using AI across several stages of digital development. The research found that 60% use AI coding tools to address accessibility issues, while 58% use coding agents to generate accessible code for new features.

Another 56% deliver AI-based features, 47% use AI to scan sites or apps for accessibility issues, and 45% use it to generate captions or subtitles for audio or video.

Confidence in those tools remains mixed. Only 22% of respondents said their AI-driven auditing tools accurately identify 75% or more of accessibility issues.

More than half of those using AI accessibility scanning tools raised concerns about accuracy: 24% said their tools flag issues that turn out not to be real problems, while 13% said the tools miss genuine issues altogether.

Human review still plays a major role in testing. Just 10% of organisations said they rely on AI accessibility tools alone, while 90% validate automated results with some form of manual testing.

Human oversight

The report pointed to a continuing need for direct input from people with disabilities rather than relying only on automation. Without involving the disability community in reviews, it said, some problems remain undetected.

That reflects a broader issue in accessibility testing. Automated checks can identify coding errors and missing labels, but they may not capture how an app works in practice for someone using assistive technology. Real-world interaction, especially across different devices and use cases, can expose issues that automated scans miss.
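As an illustration (not drawn from the study itself), the "missing labels" category of defect that automated scans catch mechanically might look like this minimal sketch: a checker that flags image tags with no alt attribute. A scan of this kind can confirm the attribute exists, but it cannot judge whether the text is actually meaningful to a screen-reader user, which is where human review comes in.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Illustrative only: flag <img> tags that have no alt attribute.

    This is the mechanically discoverable kind of issue an automated
    scan can find; it cannot tell whether alt text that IS present
    actually describes the image usefully.
    """
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            src = attr_map.get("src")
            self.issues.append(f"<img src={src!r}> has no alt attribute")

checker = AltTextChecker()
checker.feed('<img src="chart.png"><img src="logo.png" alt="Company logo">')
print(checker.issues)  # only the first image is flagged; the second passes
```

Note that the second image passes this check even if "Company logo" were unhelpful or wrong for its context, which is precisely the gap the report says testing by people with disabilities helps close.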

Bob Farrell, Vice President of Solutions Delivery & Accessibility at Applause, said manual testing remains essential alongside AI tools.

"More teams are incorporating AI-powered accessibility testing tools into the development process, even at the coding stage," Farrell said. "However, these tools miss up to 80% of meaningful accessibility issues that are not machine discoverable. The majority of organizations incorporate some form of manual testing to complement AI-powered accessibility checks. What could make these checks more effective is having users with disabilities involved, and generally, testers with expertise in accessibility and inclusive design. That expertise includes knowledge of the latest WCAG and EAA requirements, and more."

The findings suggest accessibility is moving beyond compliance into product design, customer experience and retention. Businesses are investing in AI tools to address problems earlier in development, but the data indicates those systems do not remove the need for specialist testing and user feedback.

ShareFile, which works with Applause on accessibility testing, said the approach had reduced defects and delivered commercial benefits.

"Working with the Applause team - and its global community of real users, including people with disabilities - has helped us decrease accessibility defects by more than 60% year over year," said Ivan Ereiz, Senior Director of Product Design & Research at ShareFile.

A second executive linked accessibility work to customer retention and new business.

"Being able to demonstrate that we're prioritizing accessibility has helped us both retain existing customers and attract new ones, as compliance with regulations can be a crucial factor in the procurement process," said John McCartney, Senior Manager of User Experience at ShareFile. "As we continue to work with Applause to shift left and move toward a culture of inclusive design, we expect to see even greater impact on the business."