Differences do matter
Editor’s note: Efrain Ribeiro is global head of access panels at the Owings Mills, Md., office of research firm TNS.
Having built TNS’ first test Internet panel in 1996, I have witnessed the development and rapid evolution of online access panel best practices over the last 10 years. I am often surprised at how little attention some researchers pay to what I consider critical quality components of online data collection via access panels. Users of these valuable sample sources need to understand the important differences between well-run, well-maintained panels and glorified e-mail address lists with few quality controls and low response rates.
During a recent visit to a client company, where I met with its analytic research team, I was asked to share some of my knowledge of how online access panels operate. The request was precipitated by problems the research team was having with its study data. Having used a number of different online panel suppliers for a variety of projects, the client had begun to encounter inconsistencies and was concerned about the validity of the research findings. The client had assumed that all online panels were actively managed using similar techniques and best practices and had therefore concluded that these sample sources were interchangeable.
I was understandably surprised to discover that these experienced researchers, who depend on the quality of online access panels and ultimately help make multimillion-dollar decisions based on their data, had underestimated the critical factors involved in ensuring the integrity of the panel they used for research.
We reviewed the many important components involved in developing, managing and maintaining a quality online access panel. When we were finished, they fully understood that not all online sample sources are created equal. In their previous work they had interchanged, mid-process, sample from one supplier with single-digit return rates with sample from another that was able to deliver everything they required in a single 12-hour period. At the time it was a convenient move and it helped them get the results to their brand manager on schedule. However, it also put at risk the quality of the data obtained and, therefore, the usability of the survey results.
Increasingly, industry experts are speaking out about these risks. Interviewed about trends he saw for the year 2006, Doug Rivers, head of Palo Alto, Calif.-based research firm Polimetrix, said: “The quality of the information we collect ultimately depends upon the goodwill and cooperation of those who take our surveys. Unfortunately, some online panels are deluging respondents with multiple invitations per week or even per day. The cost of collecting this data this way is very low, but so is the quality. Clients are often not asking their suppliers the right questions: ‘Where are your panelists recruited? How often are they surveyed and for how long? How many respond?’ The Internet can deliver high-quality data, but it makes a difference how a panel is managed.” (Research Business Report, December 2005)
These are just some of the considerations an informed user of online access panels needs to take into account. This article addresses several additional factors that are key to ensuring that your respondents truly represent the audience clients want to reach.
Although there are the usual obvious and important elements that most market researchers attend to regardless of sample source (such as age/sex demographics, geography and race/ethnicity), this article focuses on the often overlooked but very important and interrelated factors related to panel management. These factors impact return rates and, in turn, have the potential to adversely affect representativeness and data quality.
• Panelist relationship management
Effective panelist relationship management is the foundation of well-managed access panels. It involves choreographing the entire panelist life cycle and being mindful of every aspect of each panelist’s contact with you - from prospect identification to recruitment to membership termination.
The primary goals of panelist relationship management are to increase the tenure and cooperation of panelists and to promote the ongoing collection of valid information. The means of accomplishing these goals are consistent, positive communication that respects the panelists and addresses their concerns in a timely fashion, combined with rewards (both tangible and intangible) for their contributions. A well-managed access panel will collect information at multiple points and monitor that information and its relationship to other panel management factors.
Figure 1 provides a schematic of the primary components of the relationship that TNS maintains with the members of its panel and the key elements of panelist relationship management. TNS has a worldwide network of access panels. In the United States, members participate in studies via the TNS NFO panels.
• Cooperation rates
As with any method of research, respondent cooperation rates with access panels are critical. In fact, an original advantage of traditional (mail and phone) access panels was their superior cooperation rates versus other methods. These provided cost benefits and, equally important, helped reduce potential non-response error and its impact on data quality. An actively managed online access panel should be able to achieve cooperation rates in the 40 to 50 percent range through a number of critical strategies. These include: identification and purging of non-active members; a reasonable and fair incentive program; and panelist rapport initiatives, including prompt support and assistance when issues arise.
As most readers know, today many panel suppliers are achieving single-digit cooperation rates.
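To make the bookkeeping behind these strategies concrete, here is a minimal sketch in Python of the two calculations at their heart: a per-wave cooperation rate and a flag for non-active members who are candidates for purging or re-engagement. The record layouts, field names and 180-day inactivity window are hypothetical, not any supplier’s actual system.

```python
from datetime import date, timedelta

# Hypothetical invitation log for one survey wave:
# (panelist_id, invited_on, completed_survey)
invitations = [
    ("p001", date(2006, 3, 1), True),
    ("p002", date(2006, 3, 1), False),
    ("p003", date(2006, 3, 1), True),
]

# Hypothetical roster: date of each panelist's most recent completed survey.
last_completed = {"p001": date(2006, 3, 2), "p002": date(2005, 6, 10)}

def cooperation_rate(invites):
    """Completed surveys divided by invitations sent for the wave."""
    completes = sum(1 for _, _, done in invites if done)
    return completes / len(invites) if invites else 0.0

def inactive_members(roster, as_of, window_days=180):
    """Panelists with no completed survey inside the window:
    candidates for a re-engagement mailing or removal from the panel."""
    cutoff = as_of - timedelta(days=window_days)
    return [pid for pid, last in roster.items() if last < cutoff]

print(f"Wave cooperation rate: {cooperation_rate(invitations):.0%}")
print("Inactive members:", inactive_members(last_completed, date(2006, 4, 1)))
```

Run against the toy data above, the sketch reports a 67 percent cooperation rate and flags "p002" as inactive; the point is simply that both measures require routinely captured invitation and completion history, which is exactly what poorly managed lists lack.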
• Keeping panelist information current
Profiling information enables managed access panels to select samples that are appropriate for the research being conducted. Typically, comprehensive profiling information is collected at or near the time of first registration. However, changes to ownership, health and household composition are quite common, and proactive panel management therefore includes routine collection of updated information. For example, TNS NFO panelists are asked to update most profile information three months after registering and annually thereafter. The three-month initial update allows us to collect information that panelists may have been reluctant to give at first but would be more willing to provide after a trusted relationship has been established. Panelists are also encouraged via our routine communications to notify us of any changes to household composition on an as-needed basis.
It is also vital to keep panelist contact information current, especially e-mail addresses, which are apt to change multiple times during the year. A recent TNS study inquiring about the number of changes to primary e-mail address in the past year indicates that over 40 percent of respondents made one or more changes (see Figure 2). Because e-mail is the primary means of contact for most online panels, it is essential to provide fast and easy ways for panelists to update their e-mail addresses (such as an online information update form) and to encourage panel members to do so. Asking panel members to provide a secondary e-mail address to be used if the primary one becomes invalid is another strategy for ensuring continued communication with your panelists.
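A brief sketch may help illustrate the cadence and fallback logic described above. The class and field names below are hypothetical rather than TNS’s actual data model, and the 90-day and 365-day intervals simply mirror the three-month and annual updates mentioned earlier.

```python
from datetime import date, timedelta

class Panelist:
    """Hypothetical panelist record used only for this illustration."""

    def __init__(self, registered_on, primary_email, secondary_email=None):
        self.registered_on = registered_on
        self.primary_email = primary_email
        self.secondary_email = secondary_email
        self.primary_bounced = False          # set when the primary address hard-bounces
        self.last_profile_update = None       # date of most recent profile update, if any

    def next_profile_update_due(self):
        """Three months after registration, then annually thereafter."""
        if self.last_profile_update is None:
            return self.registered_on + timedelta(days=90)
        return self.last_profile_update + timedelta(days=365)

    def contact_address(self):
        """Prefer the primary address; fall back to the secondary
        one if the primary is known to be invalid."""
        if self.primary_bounced and self.secondary_email:
            return self.secondary_email
        return self.primary_email

p = Panelist(date(2006, 1, 15), "pat@example.com", "pat.backup@example.com")
print(p.next_profile_update_due())   # 2006-04-15: the initial three-month update
p.primary_bounced = True
print(p.contact_address())           # pat.backup@example.com
```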
• Consistent e-mail delivery
You’ve built your online panel and taken steps to build rapport, keep cooperation rates high and keep information current, including current e-mail addresses. Now all you have to do is select the sample, e-mail the survey invitations and you’re home free, right? Not exactly.
Unfortunately, online mail is not the same as postal mail in terms of guaranteed delivery. With few exceptions, the post office will deliver all mail given an adequate address and sufficient postage. In the online world, as part of the continuing effort to effectively filter and reduce spam, Internet service providers (ISPs) have the first (and often final) say about whether or not your e-mails reach their intended destination or end up in the trash.
Recent data indicate that over 20 percent of wanted e-mail never reaches the inbox. Return Path Inc. (a company that provides e-mail monitoring and delivery services) conducted an analysis of 117,761 e-mail campaigns sent between July and December 2005, monitoring delivery rates at the top 28 ISPs and the three most-used corporate filtering systems. It found that non-delivery rates for permission-based e-mails averaged 20.5 percent, with large variations by ISP - from over 40 percent with Excite and Gmail to less than 10 percent through USA.net, CompuServe, Mac.com and Earthlink (see Figure 3). Non-delivered e-mail is defined as e-mail that is either delivered to the junk mail folder or not delivered at all.
According to a 2005 consumer survey (also conducted by Return Path), 73.4 percent of respondents reported that e-mail they wanted to receive had ended up in their junk folder or never arrived at all.
What do delivery rates have to do with sample integrity beyond the potential impact on overall cooperation rates? Consider this example: According to Hotmail, its nearly 31 million unique U.S. users comprise 19.9 percent of all U.S. Internet users. Suppose you send out a survey today that seeks to measure the U.S. Internet population and Hotmail routes all your survey invitations to the junk folder or fails to deliver them at all. You’ve just lost most of your ability to speak to and hear from almost 20 percent of your desired audience, thereby introducing a nontrivial source of response bias.
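The arithmetic behind that example can be made explicit with a short sketch. The Hotmail audience share comes from the figures above; the non-delivery rates are assumed purely for illustration.

```python
# Share of the target population reachable at each ISP. The Hotmail
# share is from the example above; "other_isps" and both non-delivery
# rates are assumed for illustration only.
audience_share = {"hotmail": 0.199, "other_isps": 0.801}
non_delivery = {"hotmail": 1.00, "other_isps": 0.10}

reached = sum(share * (1 - non_delivery[isp])
              for isp, share in audience_share.items())
print(f"Share of intended audience actually reached: {reached:.1%}")
# Prints 72.1% - and none of it from Hotmail users, so the sample that
# does respond systematically excludes a fifth of the target population.
```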
Many access panel companies may not proactively monitor e-mail delivery and would therefore be unaware of potential bias problems related to it. Being blind to e-mail delivery issues also means that one cannot judge the efficacy of steps taken to prevent problems or improve delivery rates.
Proactive panel management to ensure quality data includes ongoing attention to several aspects of survey invitation delivery:
— Controlling factors that are known to adversely affect e-mail delivery, including but not limited to e-mail send volume, routine bounce processing, message and subject line content and type of e-mail format. Although attending to these factors takes time and effort, they are completely within the control of the sender.
— Continuous monitoring to identify and resolve problems quickly. Recommended types of monitoring include blacklist monitoring (are your IP addresses on any major blacklists?); delivery monitoring (do seeded e-mail addresses reach their destination?); open-and-click rate monitoring (do open-and-click rate trends indicate that your survey invitations are being delivered?); and complaint monitoring (are your e-mails being marked as spam or junk by recipients?). A simple sketch of such a health check appears after this list.
— Keeping current with, and adhering to, industry standards that promote delivery, such as implementing e-mail authentication methods, segmenting IP addresses depending on the type of e-mail sent and obtaining double opt-in permission from your panel members.
— Actively working with ISPs to enhance deliverability, such as subscribing to whitelists, feedback loops or other certification services that enhance e-mail delivery.
— Engaging in an ongoing dialogue with panelists about e-mail delivery to encourage them to add your domain to their trusted-sender list, provide alternate e-mail addresses and report e-mail address changes promptly.
— Seeking out and implementing non-e-mail methods of survey notification, such as offering downloadable notification software and encouraging panelists to visit the survey Web site periodically to check whether a survey is available.
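As a companion to the monitoring item above, here is a minimal sketch of what a daily delivery-health check might look like. The function, thresholds and inputs are hypothetical; a real implementation would pull blacklist hits, seed-mailbox results and complaint counts from external lookup services and ISP feedback loops.

```python
# A minimal daily delivery-health check. The inputs are assumed to be
# collected elsewhere (blacklist lookups, seeded mailbox checks, ISP
# feedback loops); the alert thresholds are illustrative, not industry
# standards.

def delivery_health(blacklist_hits, seeds_delivered, seeds_total,
                    complaints, emails_sent):
    """Return a list of warnings worth investigating today."""
    warnings = []
    if blacklist_hits:
        warnings.append(f"IP(s) listed on blacklists: {blacklist_hits}")
    seed_rate = seeds_delivered / seeds_total if seeds_total else 0.0
    if seed_rate < 0.90:          # assumed alert threshold
        warnings.append(f"Seed delivery rate low: {seed_rate:.0%}")
    complaint_rate = complaints / emails_sent if emails_sent else 0.0
    if complaint_rate > 0.001:    # assumed alert threshold (0.1%)
        warnings.append(f"Complaint rate high: {complaint_rate:.2%}")
    return warnings

print(delivery_health(blacklist_hits=["192.0.2.10"],
                      seeds_delivered=42, seeds_total=50,
                      complaints=120, emails_sent=100000))
```

The value of a check like this is not the code itself but the discipline it represents: each input corresponds to one of the monitoring types listed above, so a silent delivery failure surfaces as a daily warning rather than as unexplained inconsistencies in survey data weeks later.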
Ensure results
In principle, all managed access panels attend to the often-invisible details associated with panel management. Being effective in practice, however, requires that market researchers implement programs to ensure representative results and high-quality data. Because none of the strategies reviewed in this article are one-time efforts and because new challenges present themselves daily, dedicated staff must be allocated to operate and manage these programs and to be on the alert for other factors that may impact data quality. Over the past five years, e-mail delivery issues and solutions have changed markedly from one year to the next. What works today may not work tomorrow.
Even with proactive panel management, some problems cannot be prevented. Therefore, building problem identification and monitoring into your programs as a second-tier strategy ensures that issues are spotted promptly and their impact on survey results can be determined.
You may ask if it’s worth the effort. My experience is that the payoff warrants the time and resources expended. Having systems and resources in place to manage and monitor these critical factors puts you several steps ahead of panel companies that are less conscientious in their approach. As with many aspects of market research, the devil is in the details, but attention to detail pays off in terms of valid results and satisfied clients.