Make the process open to all
Editor’s note: Bill MacElroy is president of Socratic Technologies, Inc., a San Francisco research firm.
Everywhere you go in the U.S., buildings, stores and public places have changed significantly in the past 15 years. Since 1990, when the Americans with Disabilities Act was enacted, entryways, staircases, elevators and many other “barriers to access” have been redesigned and rebuilt to give people with various types of mobility limitations easier entry into mainstream life. Lately the same goal is being extended to previously unconsidered areas where disabilities may limit access to information, commerce and educational opportunities. One such area is the Web. Researchers who use the Web for data collection should begin to understand how changes in usability regulations may soon affect how we deliver surveys online.
In 1998, Congress amended the Rehabilitation Act to require federal agencies to make their electronic and information technology accessible to people with disabilities. Inaccessible technology interferes with an individual’s ability to obtain and use information quickly and easily. Section 508 was enacted to eliminate barriers in information technology, to make available new opportunities for people with disabilities, and to encourage development of technologies that will help achieve these goals.
The law applies to all federal agencies when they develop, procure, maintain or use electronic and information technology. Under Section 508, agencies must give disabled employees and members of the public access to information that is comparable to the access available to others. And in a number of cases, those who do business with and/or have contracts to provide services to government employees are being required to provide the same sort of access.
Although the law limits mandatory compliance to those Web services provided under contract to the federal government, lawsuits targeting large, “deep pocket” entities are prompting companies to consider voluntary compliance. As more companies accept the challenge of creating sites that are open and easy to use for everyone regardless of ability, such compliance will likely become the norm.
In the world of online research, these requirements are surfacing more and more frequently - particularly when respondents are recruited from a corporate Web site, or when the survey adopts the look and feel of the sponsor’s site. But unlike systems that mostly deliver content and information, many modern online survey applications are designed to collect information dynamically. Most Web research applications do not use static HTML, in which every possible page is programmed ahead of time; instead they create custom pages based on each respondent’s individual answers. Making such dynamic systems comply with Section 508 therefore requires an additional layer of programming.
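As a rough sketch of what that additional layer involves, a dynamically built question page can attach text alternatives at the moment the page is assembled, so the branching logic and the accessibility annotations travel together. The function and field names below are hypothetical, purely for illustration:

```python
# Hypothetical sketch: rendering a dynamically chosen survey question
# with text alternatives attached, so the page remains usable by
# screen readers even though no static version of it exists.
import html

def render_question(question, prior_answers):
    """Build one survey page based on a respondent's earlier answers."""
    # Dynamic branching: which exhibit is shown depends on a prior answer.
    brand = prior_answers.get("brand")
    exhibit = question["exhibits"].get(brand, question["exhibits"]["default"])
    # Every image carries an alt description so the question's context
    # survives translation into speech or braille.
    img = '<img src="{src}" alt="{alt}">'.format(
        src=html.escape(exhibit["src"], quote=True),
        alt=html.escape(exhibit["alt"], quote=True),
    )
    # The input is tied to its question text with a label, which screen
    # readers announce when the field receives keyboard focus.
    label = '<label for="q{id}">{text}</label>'.format(
        id=question["id"], text=html.escape(question["text"])
    )
    field = '<input type="text" id="q{id}" name="q{id}">'.format(id=question["id"])
    return "\n".join([img, label, field])

question = {
    "id": 7,
    "text": "What does this package design suggest to you?",
    "exhibits": {
        "Acme": {"src": "acme_box.png", "alt": "Acme cereal box, red logo on white"},
        "default": {"src": "generic_box.png", "alt": "Unbranded cereal box"},
    },
}
page = render_question(question, {"brand": "Acme"})
```

The point of the sketch is that the descriptions live in the survey’s data, not in hand-edited pages - the only place they can live when pages are generated on the fly.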
The accessibility that is required falls into several categories associated with the usability of a Web-based system (and, by extension, Web surveys being conducted on or recruited from the site). Most of the relevant provisions for online applications pertain to accommodations for people with impaired vision. Two primary areas of concern are “keyboard substitution for mouse navigation” (for people with vision problems who cannot use a mouse or other on-screen tracking and pointing devices) and “information display and delivery” methods.
Marketing researchers who work with visual exhibits and animation are familiar with the requirement that certain physical abilities be assessed before a respondent is “qualified” for a survey. People who are color blind, for example, would usually be screened out of a study on packaging design. But the requirements of the ADA are designed to move away from “screening out” people with various types of disabilities and toward allowing them to take part in mainstream decision-influencing activities (such as surveys).
Most people with visual impairments use a class of software called screen readers, which translate the visible text on a screen into spoken words. Others use physical devices which render the text portions of the screen into braille or other mechanical feedback. In each of these cases, the application producing the screen content must be compatible with these devices.
This means that many of the graphics used in standard survey work must carry readable tags that allow the displayed text to be read aloud in a recognizable fashion and/or describe the graphics so that the context of the question is not lost. From a research standpoint, “describing a graphic element” may not produce the same test result as feedback from a sighted individual, but it does offer a partial means of including those whose opinions would otherwise be ignored.
If you stop and visualize a typical online survey from the standpoint of someone with a vision or other physical disability, you will soon see that many of the items used to create engagement, ease of navigation or an appealing look and feel are simply not readable in their native form. Therefore we need to “open up the hood” and add descriptions of what is going on.
Taking an inventory of graphic elements that are used to convey meaning (e.g., color-coded sections of questions; boxes and lines that convey task flow; icons or thumbnails that recall previously presented information; or logos or pictures used as exhibits) identifies sight-centric communications that need to be “annotated” beneath the visual layer. This is usually done with alternative-text (“alt”) tags: text-based descriptions of what’s going on within the page that do not rely on the graphic cues. This additional information doesn’t show up on the screen, but is spoken or printed for the sight-impaired.
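That inventory can be partly automated. As a minimal sketch, the stdlib parser below walks a page and flags images that carry no text annotation at all (an empty annotation is the accepted way to mark a purely decorative graphic, so it is not flagged here); the sample markup is invented for illustration:

```python
# Hedged sketch: auditing survey pages for graphics that lack any
# text annotation (alt attribute), using only Python's stdlib parser.
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect <img> tags that have no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # No alt attribute: a screen reader has nothing to say.
            # (An empty alt="" is deliberate - it marks decoration.)
            if "alt" not in attrs:
                self.missing.append(attrs.get("src", "(no src)"))

page = """
<img src="logo.png" alt="Sponsor logo">
<img src="flow_arrow.png">
<img src="section_icon.png" alt="">
"""
audit = AltAudit()
audit.feed(page)
print(audit.missing)  # the images a screen reader would skip or mangle
```

A real audit would also look at image maps, form controls and scripted widgets, but even this narrow pass catches the most common omission.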
A similar set of standards takes into account that screen colors and layouts must be accessible not just to the blind but also to those with lesser degrees of vision impairment. This means that colors used on screen must have sufficient contrast to be legible, and that faint colors are not used for key text. Animated elements must either move slowly enough for study and interpretation or must repeat (automatically or on command). Although these may sound like burdensome requirements, in the broader sense they are probably just good design usability.
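“Sufficient contrast” is not a matter of taste - it can be computed. The sketch below uses the WCAG 2.0 relative-luminance formula (the same notion of contrast that accessibility guidance points to); the 4.5:1 threshold is WCAG’s minimum for normal-size text:

```python
# A sketch of checking text/background contrast with the WCAG 2.0
# relative-luminance formula. Colors are (R, G, B) tuples, 0-255.

def _linear(channel):
    """Undo gamma encoding for one color channel (WCAG 2.0 formula)."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    """Relative luminance: weighted sum of the linearized channels."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on white passes easily; pale gray on white does not.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))     # 21.0
print(contrast_ratio((200, 200, 200), (255, 255, 255)) >= 4.5)  # False
```

Running every text/background pair in a survey’s style sheet through a check like this turns a vague mandate into a pass/fail test.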
Don’t wait
My suggestion to researchers is to begin thinking about this challenge now, rather than waiting for it to become a regulatory or legislative mandate. Many of these requirements are not small fixes, and it will take time, investment and training to bring production up to this standard. On the other hand, expertise gained in alternative information delivery may well apply to other forms of future technology. (Consider that a survey combining spoken and graphic content delivery might be better suited to devices with limited screen space - like cell phones and PDAs.)
Recognizing that this is an important trend, which could affect the entire online research industry, the Council for Marketing and Opinion Research (CMOR), in conjunction with the Marketing Research Association and the Interactive Marketing Research Organization, is working now to understand the practical implications of regulations and to create information sources for those interested in studying compliance. At the same time, CMOR will continue to represent the industry’s interests before the relevant committees of Congress, and wherever possible, to distinguish the unique characteristics of survey research from simple site-content communication.