Know thine enemy well
Editor’s note: Brett Watkins is president of L&E Research, Raleigh, N.C.
The debate over professional respondents, how to deal with them and the related question of improving respondent cooperation rates (i.e., eliminating professional respondents without reducing cooperation rates), has been a heavily discussed subject: nearly every market research industry journal has published authors addressing this issue, from a variety of positions, over the years.
However, there has been no analysis of what effect taking more punitive measures to weed out professional respondents would have on the industry, or of what alternatives exist to improve facility databases without further reducing cooperation rates. Namely, what are the impacts of implementing more stringent measures to deny cheaters access to studies, and what are the cost impacts given continually declining cooperation/response rates?
This article will present a cost-benefit analysis of databases, examine the cost of implementing more restrictive measures on those databases and show how improved database technologies can improve qualitative research quality while holding down costs. Specifically, with improved database technologies and better communication, focus group facilities/field agencies can improve participant quality and reduce professionals while increasing their value to clients.
Analysis of the cost
Typically, discussions about professional respondents look at the impact on the quality of the research versus the cost of the research. While everyone recognizes the damage to qualitative research caused by professional respondents, few have done an actual analysis of the cost of taking on more draconian measures to prevent professionals. So let’s study this more closely.
Most everyone recognizes that recruiting from non-database sources is more difficult than utilizing a facility’s database. The person answering the phone for database calls knows the facility only calls regarding paid market research studies; hence, they typically answer their phone and participate in the call. For list calls, response rates are dramatically lower. A recent analysis of plunging response rates to telephone surveys, when factoring in access to the total population and declining response to such calls due to Caller ID technologies and other factors, put non-response at nearly 85 percent.1 While telephone surveys and qualitative research calls are not the same, the impact is similar. The general population still has a significant learning curve with qualitative research: contrary to popular opinion, the majority of society, when actually probed, does not know what a focus group is, much less how it works. Hence, stating that a call is for a focus group/paid research is not likely to significantly alter response rates, as most consumers still believe it’s “too good to be true.”
While no studies I am familiar with have measured response rates to database calls, my own 16 years of experience would put that number at 85 percent, minimally. But for the sake of argument, let’s reduce that number to 80 percent and increase the response to blind calls to customer or purchased lists to 20 percent (higher than the 15 percent quoted in the aforementioned analysis). As such, we are still looking at databases having four-times-greater response rates than lists provided by clients. This is the first fundamental step in evaluating call-center productivity: list quality. I recently did this math for a client, to help him explain to his end client why list recruiting is more expensive:
Database calling: 80 percent list quality (good contact information) x 80 percent response rate (the right person answers the phone) x 80 percent cooperation rate (they agree to take your call) = 51.2 percent.
List calling: 80 percent list quality (dubious, as all researchers know) x 20 percent response rate x 30 percent cooperation rate (both numbers higher than my experience) = 4.8 percent.
My experience has been that the percentages used for database calling are conservative (i.e., we’ve seen better response rates than this) and those for list calling liberal (i.e., we’ve seen worse rates than this), so list calling is, minimally, more than 10 times more difficult (51.2 percent versus 4.8 percent). Incidence is the final factor to be included. Client/purchased lists are many times argued to be 80 percent incidence or better (again, I’ve rarely seen it), and a database only 20 percent (which I would argue is many times low, as a client’s study incidence, if they know it, is tied to the general population, not the stratified populations a field facility’s database can target using demographics and other collected information). Even so, a list study still comes out roughly three times more difficult. This clearly shows why the industry continues to utilize database resources rather than return to cold-call methodologies: the industry cannot withstand a tripling or more of costs, not to mention longer study timelines.
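For readers who want to rerun the arithmetic, the yield model above can be sketched in a few lines of Python. The percentages are the illustrative figures from the text, not measured constants:

```python
def yield_rate(list_quality, response, cooperation):
    """Fraction of dialed records that yield a completed recruiting contact:
    list quality x response rate x cooperation rate."""
    return list_quality * response * cooperation

# Figures from the article's example.
database = yield_rate(0.80, 0.80, 0.80)   # 0.512, i.e., 51.2 percent
list_call = yield_rate(0.80, 0.20, 0.30)  # 0.048, i.e., 4.8 percent

print(f"database yield:   {database:.1%}")              # 51.2%
print(f"list yield:       {list_call:.1%}")             # 4.8%
print(f"difficulty ratio: {database / list_call:.1f}x") # ~10.7x

# Folding in incidence (20 percent for a database, 80 percent for a list):
db_final = database * 0.20     # 0.1024
list_final = list_call * 0.80  # 0.0384
print(f"ratio with incidence: {db_final / list_final:.1f}x")  # ~2.7x
```

Even granting the list the generous 80 percent incidence, the database route still completes roughly three recruits for every one the list produces per dial.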
Improve their quality
With the above math substantiating the cost-effectiveness of databases, the next important step is determining how best to improve their quality to ensure they meet the qualitative research industry’s needs. As it relates to cheats, some have advocated that facilities should become more stringent in identifying cheaters and implement policies to eliminate them from their databases. These steps include taking pictures of participants; creating a “wall of shame” of pictures of cheats who have been caught and banished; seeking criminal prosecution of cheaters; refusing payment to anyone who does not re-screen; making examples of participants who are cheaters in the holding area and on Web sites; and more.
Looking past the conflicts some of these behaviors would create for clients, as well as the legal exposure for facility owners (not to mention that law enforcement agencies do not engage in criminal prosecution of such cases), a primary flaw in such actions is their impact on legitimate respondents. Recall the multiplier effect on costs of other recruiting methods shown above; if we implement measures that place greater scrutiny on database members and make database membership more onerous, we hamper efforts to obtain new database registrants (which facility owners will already tell you is not as simple as it sounds; again, the majority of people do not know what qualitative research is or how it works, so convincing them to give contact and personal information is not easy).
Further, employing more militant behavior against our current database members - treating them as commodities versus valued partners in the research process - produces more dissatisfied members (who tell their friends, making recruiting even harder). Increasing the difficulty of finding new database members drives up recruiting costs for qualitative research, returning us to the same issue as with the older methodologies: getting good recruits, but at a cost beyond our clients’ budgets. Instead, we need to find better solutions to identify professional respondents (and keep them away from our studies).
Can certainly attract
Clearly, there are technologies that hamper quality recruiting when not utilized properly. While actual study details should be disguised when recruiting via Web postings, e-mail blasts and other publicly shared networks like Facebook, Twitter and Craigslist, the reality is that any public attempt to seek study participants can attract professional respondents. At L&E Research, we have marked over 2,000 people as “do not call” for a variety of reasons; many are the professionals who surf our site and social networks in search of an easy $75.
However, the advanced database technologies we have developed internally have also helped us add over 50,000 new database members in the last five years. These technologies can be quite useful in attracting new members to a database and identifying cheats, if the facility actively monitors its member registrations. As with all technologies, the quality of the output is only as good as the input. Implementing procedures that require validation of data, and using database technologies that seek out duplicative data and include search tools to find respondents looking to game the system (and then flagging them to ensure they are not called/recruited), is critical to creating a fresh database of engaged members. And the beauty is, the cheater never knows he’s been caught; the flag on his account is invisible to him.
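A minimal sketch of how such duplicate-seeking logic can work is below. The field names, normalization rules and flagging policy are my illustrative assumptions, not a description of L&E’s actual system:

```python
import re

def normalize_phone(phone):
    """Keep digits only, so (919) 555-0100 and 919.555.0100 collide."""
    return re.sub(r"\D", "", phone)

def normalize_email(email):
    """Lowercase and strip plus-tags, so jo+g2@x.com and jo@x.com collide."""
    local, _, domain = email.lower().partition("@")
    return f"{local.split('+', 1)[0]}@{domain}"

def find_duplicates(members):
    """Flag registrations sharing a normalized phone or e-mail.

    `members` is a list of dicts with 'id', 'phone' and 'email' keys
    (an assumed schema). Returns the set of flagged member ids.
    """
    seen = {}       # normalized key -> first member id seen with it
    flagged = set()
    for m in members:
        for key in (normalize_phone(m["phone"]), normalize_email(m["email"])):
            if key in seen and seen[key] != m["id"]:
                flagged.add(m["id"])
                flagged.add(seen[key])
            else:
                seen.setdefault(key, m["id"])
    return flagged

members = [
    {"id": 1, "phone": "(919) 555-0100", "email": "pat@example.com"},
    {"id": 2, "phone": "919.555.0100",   "email": "other@example.com"},
    {"id": 3, "phone": "919-555-0199",   "email": "pat+grp@example.com"},
    {"id": 4, "phone": "919-555-0142",   "email": "unique@example.com"},
]
print(find_duplicates(members))  # ids 1, 2 and 3 are flagged
```

Real systems typically layer fuzzy name matching and address normalization on top of exact-key collisions, but even this simple normalization catches the common trick of re-registering with a reformatted phone number or a plus-tagged e-mail address.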
Hence, I believe our systems do not need to become more restrictive, bottlenecking registrations by requiring human validation (100 percent human validation is argued by some to be necessary, whereas I would argue it is not only cost-prohibitive but unnecessary). Instead, they need to make registration into our databases easier and capture more information so we can more easily identify cheats. This accomplishes two things: cheats are more easily identified and flagged to prevent participation in studies; and the registration process is simplified and improved, making database member referrals easier (and easy to track), hence growing one’s database and adding more virgin participants.
Better means of validating
Nearly all qualitative researchers have advocated better security of databases and better means of validating participant information. Our clients seek harder-to-reach respondents as marketing becomes more one-to-one, but they need field researchers to identify ways to deliver those participants - the right participants - without paying exorbitant costs.
Clearly we cannot go back to the old ways of doing business. Returning to cold-call lists or other lower-cooperation/response resources cannot meet the needs of the industry without dramatically increasing costs and time-to-completion. Qualitative research is getting harder; incidence rates are dropping; clients are seeking participants through questions that aren’t traditionally captured with database questionnaires; and yet the timeliness of the research is still critical.
It is time we look through the prism differently and recognize that technology is our friend, helping us recruit faster while recruiting better. Facilities and field agencies that invest in their databases and the technologies that manage them will make a significant quality difference in our profession and quiet the noise that our industry is saddled with cheaters, repeaters and professionals.
If we take the time to develop better database systems, we can learn more about our members, including those who are simply looking for an easy $75. By making our systems easier on the front end, while employing advanced database technologies on the back end, we improve the quality of the primary service we provide the qualitative research industry: good recruiting. In doing so, we begin providing solutions to clients that make us valuable again. Costs then become a secondary element in the equation, not the primary deciding factor, as our costs are competitive but our quality and speed-of-delivery are superior.
References
1Curtin, Richard, Stanley Presser, and Eleanor Singer. 2005. “Changes in Telephone Survey Nonresponse over the Past Quarter Century.” Public Opinion Quarterly 69:87-98.