Editor’s note: Bill MacElroy is president of Socratic/Modalis, a San Francisco research firm.
Determining response rates for on-line surveys (as opposed to traditional phone or mail studies) has been a recent topic at several research conferences and forums. Although response rates are easily calculated for some on-line recruitment techniques, others are more difficult to monitor accurately.
In general, there are four main (ethical) recruitment techniques being used in Internet-based research: off-line recruiting, pre-recruited panels, site intercept and customer database sample. I qualify these as ethical to distinguish them from two forms that are generally considered (techno-culturally) rude.
The first “bad” recruitment technique is broadcast, random e-mail and other forms of spam, in which certain researchers have attempted to replicate the random digit dialing technique by purchasing e-mail lists. Not only is this considered rude, but it can also land you in deep trouble with your Internet service provider (most will disconnect your Internet access if people complain about you . . . and they will). If that weren’t enough disincentive, the simple fact is that spam recruitment just doesn’t work. Response rates to spam are reportedly extremely low (e.g., 1 percent or less, even when used in combination with a drawing), and among those who do respond, obvious and deliberate response sabotage is common.
The other “bad” sample acquisition/recruitment technique is the use of automated e-mail detection technologies (e.g., sniffers, Web crawlers or smart bots) to collect e-mail addresses surreptitiously from Web sites, bulletin boards and Usenet areas. Like spam, any form of recruitment that isn’t preceded by some type of relationship- and/or permission-based contact won’t work and will bring you more grief than it’s worth.
Worst to best
So what are the response rates (number attempting out of the number invited) of the acceptable recruitment techniques? Here are our estimates, ranging from the techniques with the worst response rates to the best.
The first category with the lowest overall response rate is off-line recruitment. This consists of using real-world techniques to direct people to a survey. Examples include putting ads in the newspaper, calling people on the phone, sending requests through the mail, etc. In general, we have found these to be the least effective methods of recruitment because of what we’ve come to refer to as the digital/analog divide. This is where it is more difficult to get people to do things on-line when using off-line techniques to drive them to the Web. The converse is also true; it’s hard to get people to carry out off-line tasks (like keeping a paper-based diary) when using on-line means of communication.
If you are attempting to mail or advertise for a Web survey, expect a response rate of 1 to 3 percent of total contacts made. For telephone pre-recruit, expect a 50 percent to 60 percent on-line response from those who have agreed to participate and given a valid e-mail address (for a survey on a personally relevant and interesting topic for which they will receive an incentive). Also note that telephone pre-recruitment can cost up to 70 percent of the cost of doing the entire interview over the phone, so there has to be a very compelling reason to use the Internet if you must call ahead of time.
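The cost trade-off above can be made concrete with a little arithmetic. The sketch below combines the article’s figures (pre-recruitment at up to 70 percent of a full phone interview’s cost, and a 50 percent to 60 percent on-line response from pre-recruits); the $30 phone interview cost is purely an illustrative assumption, not a figure from the article.

```python
# Rough cost-per-complete comparison: telephone pre-recruit to a Web survey
# vs. simply completing the interview by phone.
PHONE_INTERVIEW_COST = 30.00  # assumed cost of one full phone interview (illustrative)
PRERECRUIT_COST = 0.70 * PHONE_INTERVIEW_COST  # up to 70% of the phone cost
ONLINE_RESPONSE_RATE = 0.55  # midpoint of the 50-60% on-line response range

# Each on-line complete requires, on average, 1 / response_rate pre-recruited contacts.
cost_per_complete = PRERECRUIT_COST / ONLINE_RESPONSE_RATE
print(f"Pre-recruit cost per on-line complete: ${cost_per_complete:.2f}")
print(f"Pure phone cost per complete:          ${PHONE_INTERVIEW_COST:.2f}")
```

Under these assumptions the pre-recruited on-line complete actually costs more than the phone interview it replaces, which is exactly why a compelling non-cost reason is needed to justify the approach.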
On the subject of incentives, most people find that cash (or on-line currency) is king. Three to five dollars is usually enough for most less-than-15-minute surveys. Prizes and drawings are able to elicit some response, but have to be fairly rich (e.g., more than $1,200) to reach a decent return. For technical audiences (without very strong affiliation with the survey sponsor), a per capita cash incentive of $25 is a minimum threshold for participation. Another thing to keep in mind is that any incentive payment that is more than $500 must be declared as income, and subsequently taxed. Some researchers have reported phenomenal response rates with no incentives - I’ve rarely witnessed it.
A technique analogous to mall intercept is site intercept. As the name implies, this technique seeks to get potential respondents’ attention through the use of banners, buttons, badges, hypertext links and other Web site elements. This method is the most difficult to associate with a specific response rate, because the contact rate is hard to track. When we have used banner-tracking software to assess the number of “exposures” to a banner, the actual response rate is low. Sites with a lot of traffic (over 5,000 hits per day), however, can still get a good number of completes in a relatively short period of time (i.e., 200 to 300 in less than a week).
An alternative method to passive intercept is the interstitial window recruit. These are the pop-up or “daughter” windows that open a new browser window and command attention. Using this technique, the intercept rate can be paced to display the invitation to every nth visitor. This cuts down somewhat on self-selection bias, but has also been reported to be somewhat annoying. The response rate from this more aggressive intercept is much higher (15 percent to 30 percent) than for banners and badges.
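The every-nth-visitor pacing described above is simple to implement server-side. The following is a minimal sketch of the idea; the function name and the stream of visitor IDs are illustrative assumptions, not part of any particular intercept product.

```python
def nth_visitor_invites(visitor_ids, n):
    """Yield every nth visitor from a stream of visitors.

    This is the pacing rule used by interstitial (pop-up) recruitment
    to reduce self-selection bias: rather than waiting for visitors to
    notice a banner, every nth arrival is shown the invitation.
    """
    for count, visitor in enumerate(visitor_ids, start=1):
        if count % n == 0:
            yield visitor

# Example: invite every 5th of the first 20 visitors.
invited = list(nth_visitor_invites(range(1, 21), 5))
print(invited)  # [5, 10, 15, 20]
```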
The next-best response category is a sample drawn from a customer registration database. These should be pre-collected e-mail addresses of people who have opted in to future contact. Using these addresses, you can invite people to take part in a survey and link them directly to it. Response rates (number attempting out of number invited) range anywhere from 20 percent to 50 percent depending, once again, on relevance and interest in the topic. Beware, however, of purported opt-in lists that may be sold by some unscrupulous sources and represented as self-invited potential respondents.
Consider mid-terms
As we attempt to define on-line response rates precisely, we also need to consider mid-terms - people who drop out midway through the survey. Without a human interviewer urging the respondent to continue, on-line research tends to suffer more partial completes than phone (e.g., 20 percent to 30 percent). If you can track respondents who have paused during a survey, however, the conversion rate of incompletes to completes is much higher than the rate for simply inviting new potential respondents. This is particularly true if they can pick up where they left off. Expect to convert up to 50 percent of on-line mid-terms to completes if you have an automatic pickup option.
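To see what mid-term recovery is worth, the figures above can be combined in a quick yield calculation. The 1,000 survey starts below are an illustrative assumption; the 20 to 30 percent drop-out range and the up-to-50-percent pickup conversion come from the text.

```python
# Illustrative completes calculation with and without an automatic pickup option.
starts = 1000              # assumed number of respondents who begin the survey
midterm_rate = 0.25        # midpoint of the 20-30% drop-out range
pickup_conversion = 0.50   # up to 50% of mid-terms convert to completes

first_pass_completes = starts * (1 - midterm_rate)      # finish in one sitting
recovered = starts * midterm_rate * pickup_conversion   # mid-terms who resume
total_completes = first_pass_completes + recovered

print(f"Completes without pickup: {first_pass_completes:.0f}")  # 750
print(f"Recovered mid-terms:      {recovered:.0f}")             # 125
print(f"Total completes:          {total_completes:.0f}")       # 875
```

In this sketch the pickup option lifts the completion yield from 75 percent to 87.5 percent of starts, which is why it is worth tracking paused respondents at all.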
The best response rates are from pre-recruited panels (either syndicated or custom-built). The pre-recruitment, which is done using all of the above-mentioned techniques, yields a ready-to-use database from which a random sample can be selected and invited. Although response rates will vary by topic and level of incentive, 40 percent to 50 percent participation levels are not uncommon. National panels (e.g., Greenfield Online, NFO, NPD, Harris Interactive, Socratic Forum) can be useful for obtaining non-customer respondents. A custom panel built from within a company’s own client base creates affinity and boosts participation rates even higher.
Reality barometer
Finally, keep a reality barometer in mind. If you suspect that the sampling methodology could have biased sample selection in any way, take a look at known parameters about the populations you are researching. If a target audience that you are trying to reach is known to consist of 70 percent women and 30 percent men, and your survey results skew toward males, you might want to investigate the effect of sampling on your survey audience.