Editor’s note: John Wulff is vice president, business development, at market research firm The Logit Group, Toronto. This is an edited version of a post that originally appeared under the title, “Trust, but verify: using online panels for B2B research.”
Conducting B2B research using online panels is an increasingly attractive option. Its more efficient cost model translates to roughly 30% of the price of running the same project by telephone. Incentives are lower online and you’re able to cast a wider net to accomplish goals more quickly.
While this may sound great, serious questions can arise over how respondents are recruited. How do you ensure those answering your surveys are in fact qualified to do so (and are who they say they are)?
Panelists are recruited with the understanding that the survey-taking experience is completely anonymous. As such, respondents aren't recruited with a phone number and can't be validated through telephone verification. This means that when choosing a panel company, you need a certain level of trust in the firm, and that trust needs to be cemented in a history of successful projects and the strength of its name within the industry.
While trust is important, so is a little common sense. Regardless of past performance, there are steps to take across each survey to help ensure the content of the report comes from qualified B2B online respondents.
Choosing the right partner
Client relationships are carefully constructed; they need care, attention and acknowledgement that years of hard work have taken place prior to the partnership. It's important to pick a partner that not only respects this philosophy but also has the experience and courage to share possible pitfalls, preparing you for the reality of the project at hand. Drawing on that experience and asking the right clarifying questions lets you predict the end result more reasonably and gives everyone confidence and a platform from which to build.
In data collection, an account manager will often scope the feasibility and costs of a project but then pass it along to a project team for execution. Effective B2B research is accomplished when the account manager is tethered to the project from start to finish and can frame expectations, ensure the team is on target and work with the client on the fly to adjust and implement backup plans. B2B research can be nuanced and fraught with challenges that require foresight, experience and the ability to jump in, correct and sometimes change direction.
Prescreening B2B panelists
“You’re only as good as your last book” is a smart adage to adopt when working with panel sources.
Panels are expected to adhere to the ESOMAR/GRBN Guideline on Online Sample Quality, which sets best practices in:
- research participant validation, to ensure the respondent falls within the description of the research sample;
- survey fraud prevention, to ensure the same person doesn’t try to receive more incentives by completing a survey more than once;
- survey engagement, to ensure that the respondent is paying sufficient attention;
- category and other types of exclusions, to ensure the sample does not include respondents who might bias the results; and
- sampling (including sample selection, sample blending, weighting, survey routers, profiling and screening) to provide transparency.
While these are the cornerstones of panel sampling businesses, it's important to verify that your panel partners actually follow them and to acknowledge that respondent profiling isn't as advanced as it needs to be in B2B sampling.
B2B profilers are sent out, of course, but completion rates are low, and panel companies will often instead direct consumer respondents whom they know to be employed within a general business sector.
Most proprietary panel companies have partner sources they introduce. Although vetted appropriately, new sources in the mix can increase the probability of error, depending on each source's ability to control the fraudulent behavior that appears from time to time. Some of these partner sources can also skew results, producing answers well off the expected norms or off what other sources show in aggregate.
To mitigate this, pre-screening becomes very important, even among panel sources that have sufficient B2B profiling in place. Screening questions that the targeted respondent must answer before entering your survey are ideal for ensuring he or she is truly qualified to participate.
About half of incoming panel traffic fails for one reason or another, but pre-screening is still an important piece to put in place to ensure that those entering the survey are in fact the respondents you need.
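For illustration only, the sketch below shows how basic pre-screening logic might look in code. The roles, industries and field names are hypothetical; in practice the criteria come from the study's own screener and are programmed in the survey platform.

```python
# Hypothetical B2B prescreener: a respondent must hold a qualifying role,
# work in a targeted industry and be a decision maker. All criteria below
# are illustrative assumptions, not from the article.

QUALIFYING_ROLES = {"it director", "it manager", "cio", "vp of it"}
TARGET_INDUSTRIES = {"manufacturing", "healthcare", "financial services"}

def prescreen(respondent: dict) -> bool:
    """Return True if the respondent may enter the main survey."""
    role = respondent.get("job_title", "").strip().lower()
    industry = respondent.get("industry", "").strip().lower()
    decision_maker = respondent.get("decision_maker", False)
    return role in QUALIFYING_ROLES and industry in TARGET_INDUSTRIES and decision_maker

# Roughly half of incoming panel traffic may fail checks like these.
incoming = [
    {"job_title": "IT Director", "industry": "Healthcare", "decision_maker": True},
    {"job_title": "Student", "industry": "Retail", "decision_maker": False},
]
qualified = [r for r in incoming if prescreen(r)]
print(f"{len(qualified)} of {len(incoming)} respondents qualified")
```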
Trust, but verify
"Trust, but verify" is a useful way to describe how to manage and monitor a B2B market research panel project and ensure a high-quality data set.
Given the absence of exact profiling, multiple panel sources often need to be blended together to accomplish ambitious goals or to find a subsection of respondents within a certain industry.
Whether or not there has been that additional layer of pre-screening, it is critical to embed security conditions (e.g., time to complete, straight-lining) and to pepper red herring questions into the survey. (These can be monitored in your daily field disposition, with fails tied to the panel source.) Reviewing verbatims for gibberish is another measure for discarding cases that don't meet quality criteria.
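To make these checks concrete, here is a minimal sketch of how speeder, straight-lining, red-herring and verbatim checks might be flagged for a daily field disposition. The thresholds, question IDs and record layout are assumptions for illustration, not prescribed values.

```python
import re

# Illustrative thresholds and question IDs -- tune them to the actual survey.
MIN_SECONDS = 240                                        # faster completes are speeders
GRID_QUESTIONS = ["q5_1", "q5_2", "q5_3", "q5_4", "q5_5"]  # hypothetical grid items
RED_HERRING = ("q9", "none of the above")                # hypothetical trap question/answer

def quality_flags(record: dict) -> list[str]:
    """Return the list of quality checks a completed interview fails."""
    flags = []

    if record["seconds_to_complete"] < MIN_SECONDS:
        flags.append("speeder")

    # Straight-lining: every item in the grid given the identical answer.
    if len({record[q] for q in GRID_QUESTIONS}) == 1:
        flags.append("straight_liner")

    question, expected = RED_HERRING
    if record[question].strip().lower() != expected:
        flags.append("red_herring_fail")

    # Gibberish verbatim: too short or no recognizable words.
    verbatim = record.get("open_end", "")
    if len(verbatim) < 5 or not re.search(r"[a-zA-Z]{3,}", verbatim):
        flags.append("gibberish_verbatim")

    return flags

# Example: a fast, straight-lined complete with an empty open end.
example = {"seconds_to_complete": 90, "q5_1": 3, "q5_2": 3, "q5_3": 3,
           "q5_4": 3, "q5_5": 3, "q9": "none of the above", "open_end": ""}
print(quality_flags(example))
```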
When blending multiple panel sources, it is important to measure the sources against each other, focusing on the quality fails that arise from the security conditions, the red herrings and the verbatim review to arrive at a pass-back rate percentage by panel. Additionally, you should compare responses across panels to identify blips and skews in the data. If any are present, they should be isolated, removed from the data set and passed back to the panel for replacement at no charge. Further, after a pre-test of 10% of the quota is completed, any panel source showing a pass-back rate higher than 30 to 40% should be investigated for legitimacy. If necessary, it should be removed from the sample plan going forward and its completes removed from the data set.
While all these quality review metrics are important, they must be reasonable: typical pass-back rates on security fails in the industry range between 10 and 20%. (With a pre-screener employed, the rate tends to be much lower.) When it is above 20%, either there is a source-quality issue or the criteria are overly stringent and the project at hand may not be appropriate for the online methodology. It is important to investigate both possibilities.
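As a rough illustration of rolling fails up into the pass-back rates described above, the sketch below tallies quality fails by panel source and flags sources against the 20% in-field and 30 to 40% pre-test thresholds mentioned here. The data and source names are hypothetical.

```python
from collections import Counter

# Hypothetical daily field disposition: (panel_source, failed_any_quality_check)
completes = [
    ("panel_a", False), ("panel_a", False), ("panel_a", True), ("panel_a", False),
    ("panel_b", True),  ("panel_b", True),  ("panel_b", False),
]

IN_FIELD_CEILING = 0.20  # typical pass-back rates run 10-20%; above that, dig in
PRETEST_CEILING = 0.35   # at the 10% pre-test, 30-40%+ warrants a legitimacy check

totals = Counter(source for source, _ in completes)
fails = Counter(source for source, failed in completes if failed)

for source in sorted(totals):
    rate = fails[source] / totals[source]
    if rate > PRETEST_CEILING:
        status = "investigate legitimacy / consider removal"
    elif rate > IN_FIELD_CEILING:
        status = "check source quality or survey stringency"
    else:
        status = "within the typical range"
    print(f"{source}: pass-back rate {rate:.0%} -- {status}")
```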
Avoiding issues
Human beings are creatures of comfort, and we prefer to put a lot of faith and trust in proven panel providers. While I think trust is key, it is also important to be vigilant and to employ your own reasonable security metrics. You also need to understand that with panel sources, issues with respondent quality can arise and fraudulent sources (e.g., bots) can break through. With these extra steps and an experienced partner, you can avoid issues and ensure that your report is based purely on respondents who belong.