Creating accessible new code and content is a priority for ASU developers and editors. But testing existing content is equally valuable--even the most knowledgeable of us make mistakes.
One of the most effective things you can do to improve the accessibility of your websites and applications is to introduce a mandatory QA stage in your team's development and/or publication processes to catch--and fix!--problems before going live.
At ASU, all new and redesigned web-based applications and sites should comply with the Web Content Accessibility Guidelines (WCAG) 2.1, level AA. See the Best Practices for more on how to comply with WCAG requirements.
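One concrete example of a WCAG 2.1 level AA requirement is color contrast: normal text must have a contrast ratio of at least 4.5:1 against its background (3:1 for large text). The sketch below, a minimal illustration using only the Python standard library, implements the relative-luminance and contrast-ratio formulas defined in WCAG 2.1 so you can check a color pair programmatically.

```python
# Sketch of the WCAG 2.1 contrast-ratio calculation behind the level AA
# requirement (4.5:1 for normal text, 3:1 for large text). The formulas
# follow the WCAG 2.1 definitions of "relative luminance" and
# "contrast ratio"; colors are (r, g, b) tuples in the 0-255 range.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as (r, g, b) in 0-255."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, always >= 1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa(fg, bg, large_text=False):
    """True if the color pair meets WCAG 2.1 AA for its text size."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white: 21.0
print(passes_aa((119, 119, 119), (255, 255, 255)))  # mid-gray #777 narrowly fails AA
```

Checks like this are useful in design reviews, but remember that contrast is only one of many WCAG success criteria an audit must cover.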
When auditing a website or application for accessibility, you don't need to test every page or screen; a representative sample is usually sufficient.
Setting up your website or application to be continuously monitored by a scanning service such as Siteimprove will offer the best indication of how accessible it is and when it might be time to perform more stringent testing.
If continuous scanning is not available, plan to test your entire website or application once a year. Perform additional tests after any redesigns or large content changes.
Evaluating websites and applications for accessibility can (and often should) involve multiple levels of testing:
Automated testing tools such as WAVE and Siteimprove can give a rough idea of the overall accessibility of a site or application and are a valuable first step in accessibility evaluations. However, automated tools can identify only 20-30% of accessibility issues and should not be relied upon alone.
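To make the scope of automated testing concrete, here is a minimal sketch, using only the Python standard library, of one rule such tools apply: flagging `<img>` elements that have no `alt` attribute. Real tools like WAVE run hundreds of rules; this illustrates just one, which is why automated scans alone cover only a fraction of the issues.

```python
# Minimal example of one automated accessibility check: report <img>
# tags that lack an alt attribute, using only the standard library.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects a message for each <img> tag that has no alt attribute."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "img" and "alt" not in dict(attrs):
            line, col = self.getpos()
            self.issues.append(f"line {line}, col {col}: <img> missing alt text")

checker = MissingAltChecker()
checker.feed('<p><img src="logo.png" alt="ASU logo"><img src="decor.png"></p>')
for issue in checker.issues:
    print(issue)  # only the second image is flagged
```

Note that a check like this cannot tell whether existing alt text is actually meaningful; that judgment requires the manual testing described next.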
In addition to the two listed above, many other accessibility browser extensions and tools exist.
The most accurate accessibility evaluations use a combination of automated and manual testing. The ASU Web Accessibility Audit walks you step-by-step through a full accessibility audit and covers most of the basics.
Testing with assistive technologies such as screen readers is standard practice in full accessibility evaluations. User testing can identify barriers that other tests are incapable of finding, and it is particularly useful when people with disabilities are included.
The desktop screen readers used most often in testing are JAWS and NVDA. VoiceOver is also frequently used, particularly with responsive sites that will be viewed on iPhones.
The UTO UX Research team offers user testing that includes people with disabilities. Their services help improve user experiences by analyzing how real users interact with products.
Some third-party suppliers can evaluate websites and applications for accessibility and even remediate issues. Contact the ASU Web Accessibility team for suggestions on reputable vendors.
The ASU Web Accessibility team also performs a limited number of accessibility evaluations for university projects.
In addition to the tools described above, developers need special tools that they can rely on during development. Some of the best include: