Shorten the screeners, lose the grids

Editor's note: Scott Worthge is research director at Quest Mindshare. He can be reached at sworthge@questmindshare.com.

As any researcher can tell you, response rates for online research, particularly surveys, have been declining for decades. I’ve been involved in “panel and sample” for over 30 years, and the issue became acute a few years ago with a “sample shortage” the market research industry hadn’t seen previously. Potential participants weren’t responding as they once had, for a host of reasons I could expound on at length. But the upshot was that fewer people were taking the online surveys researchers needed them to complete, affecting suppliers and clients across the market research landscape.

What started a few years ago continues today. The issue affects B2B surveys more than B2C consumer surveys, given the smaller and more specific populations available for participation. One research industry exec nailed it, saying, “If B2C is drawing from an Olympic-sized swimming pool of potential respondents, business surveys are working with a kiddie pool – and sometimes a fishbowl – to find the people critical for participating in research and generating the data for generating insights.” Meanwhile, demand for business survey participants keeps growing, especially for senior decision-makers and specialist target audiences such as information technology decision-makers (ITDMs).

I have a vested interest in understanding and motivating B2B respondents, given Quest’s focus on providing such audiences for our partners’ research projects. In researching the decline in responsiveness, I found plenty of articles, blogs and even webinars offering advice about participation in B2B surveys. Many researchers weighed in with opinions about what makes for a good B2B survey, from the incentives necessary and appropriate for gaining cooperation from potential respondents, to survey length, question types and so on.

What I didn’t find was research-on-research asking actual business audiences what would create more motivation and better engagement for them – what they as B2B survey-takers like and don’t like, want and need in order to participate more frequently, willingly and authentically. Lots of opinions about these business professionals? Absolutely. Actual data from someone surveying them directly? Nowhere to be found.

Looking for answers, I designed Quest’s own investigation among several highly in-demand B2B audiences (based on the hundreds of B2B surveys we field each month). After all, who better to comment on what will increase B2B survey motivation and participation than the people who are asked to take those surveys? What we built was a survey of five B2B groups:

  • small business owners and general managers (companies with fewer than 50 employees);
  • mid-level managers and directors;
  • higher-level management – AVP, VP, SVP;
  • tech specialists – ITDMs, IT influencers;
  • non-tech, non-management “regular workers.”

We surveyed 80 people in each group, 400 total, in Q1 2024, with quotas across the groups for company size (small, mid-sized, enterprise) and geographic balance by U.S. Census region. We didn’t impose quotas for line of business, figuring that would be too granular for our exploratory work and the per-group sample size we selected. Respondents were a mix of B2B online panel members, expert network members and non-members of either, reached through intercepts.

What we were really after was identifying pain points – the places where a typical B2B survey experience caused respondents to wince (figuratively or perhaps literally), complain, even consider dropping out of a survey or actually do so. We asked about their de-motivators, distractions, dissatisfaction and even disgust. While we were careful not to bias the results negatively with how we asked, we explored the entire process of taking a B2B survey, from the invitation to the wrap-up and incentives provided.

What did we learn from our exploration? Much that we expected, but with several distinct surprises. I’ve been involved in online surveys since the methodology emerged as the predominant data collection approach in the late 1990s, increasingly supplanting telephone interviews. Yet on several topics, our respondents’ clear answers weren’t intuitive to me. I was chagrined, if not embarrassed, that I, a longtime researcher, didn’t know better what my survey-takers wanted!

Let’s dig into the results, a summary of what was presented at the Quirk’s Dallas conference in February 2024. While the survey produced a volume of data, we’ll highlight the more notable themes for each of our major areas of inquiry.

“What are the top motivators for you to take a B2B survey?”

“Making my opinion known/heard” was the top result. This was particularly strong for smaller companies and lower-level employees.

A close second was “compensation” – a direct incentive paid in cash or a cash equivalent such as a gift card or code.

The tech crowd was particularly motivated by “exposure to new ideas” and “gaining information helpful to my role,” but the other groups generally weren’t very interested. I had thought this appeal might be stronger across the board.

The clear losers for motivation were “non-cash incentives” and “competitive info to help my company.”

“What pain points have you had in business surveys?”

The big answer here was “screening and qualifying questions.” It surprised me how strongly all our respondents felt that the screening experiences they’d had were too long, too general or not relevant enough – that they just plain “missed the mark.” Comments such as “ask me what I know and what my responsibilities are” and “stop relying on my title” were typical.

The time required for screening and qualifying was a particular sore point. Respondents felt very strongly that screeners need to “get to the point quicker,” asking more direct questions and fewer of them.

This pain point was three times more important than anything else – certainly an aspect of B2B survey-taking that merits scrutiny and more attention from the sponsors of such surveys.

Very close behind as a pain point was “confusing or bad survey design.” This issue comes up often when we talk to clients about data quality – what we as researchers ask, and how we ask it, needs review and evolution. I won’t get on my soapbox about training in survey design but this was a consistent deal-breaker. “I can tell when the questions are written by someone who doesn’t know my industry” sums up the sentiment, and it was a significant driver for those who said they would drop out of a survey that showed naivete about the topic under discussion. Top execs and tech audiences were particularly sensitive to this.

Interestingly, “survey invitations” didn’t come through as a pain point – 85% of our sample basically said “no problems here.” I had hypothesized that how an invitation was worded – how it portrayed a survey opportunity to a potential respondent – would be more of an issue, but evidently not, at least in our sample.

“What kind of questions during a survey bother you?”

I was expecting “open-ends, hate those” and we did get that sentiment. Aren’t open-end questions universally disliked? How many is too many is still up for debate – and a question we have in mind for future investigation.

But the dislike of open-ends wasn’t nearly as strong as the outright hatred of “big grids, with forced responses” – this jumped into first place among our sample, with comments such as “Why ask for so much detail?” and “Why can’t you narrow down a long list to what’s relevant to me, then ask all your questions?” I’m rethinking how I present grids to my clients, based on the strength of what I heard here.

Another of the most objectionable question types was what I call “work PII” – asking for personally identifiable information about the respondent’s specific job and company. Often enough in B2B surveys I see questions asking for the respondent’s work e-mail address or LinkedIn profile, or their company’s name and/or specific location. This is a deal-breaker for the respondents we asked, especially middle managers at larger companies. They felt that if they provided such information, their responses wouldn’t be anonymized and likely would be used for purposes other than research. Comments like “Why would a survey need to know my LinkedIn?” and “No way will I tell who my employer is by name. Then my comments could be traced back to me easily” exemplify the tone.

I can understand such questions being asked to validate respondents, especially for more specific B2B audiences. But respondents felt very strongly that any questions needed to establish their qualification for a survey should come upfront and be based on their knowledge and experience, not their personal and company identification. “Why care about who I work for? Shouldn’t surveys be asking if I sit in the right chair and make decisions they want to ask about?” sums it up.

I was also interested in what information our respondents felt they could share – what they considered non-confidential. This included “customer makeup,” such as the share of business that is wholesale vs. retail, or domestic vs. international. They also indicated a willingness to share purchase stage and general budget for types of products – they didn’t object to being asked where they were in the consideration-to-purchase sales cycle for, say, HR management software, and what they were budgeting for that purchase overall.

“How long is too long for a business survey?”

The age-old question of length-of-interview, right? As a researcher I rarely push back too hard on clients who tell me they have a 20- or 25-minute B2B survey. But I’ve never had data on what happens at those survey lengths that would let me say something concrete and advise differently.

Our respondents were clear: Their cooperation peaked at 15 minutes. Up to that point, only 20% of our respondents would consider dropping out of a typical B2B survey and most said they would almost always “stick with” what they started. But after those 15 minutes, cooperation fell off a cliff: at 20 minutes, almost 60% said they would be ready to drop most surveys, and at 25 minutes that rose to 75%. Three-quarters of a typical B2B survey audience is disengaging at 25 minutes?! Ouch. Small business owners felt this most strongly, saying, “The time I spend in a survey is the time I’m not running my business. I’ll give only so much.”

This bears looking into and will be an aspect of B2B surveys Quest investigates further in 2024 and likely into 2025. A couple of years ago we did a similar B2C investigation into what we called data degradation – our name for which types of data suffered which declines in quality and reliability at which points in a typical consumer survey. We want a more definitive answer here than “don’t go past 15 minutes,” so we’ll be digging into this in future research-on-research.

Next up: incentives

I haven’t addressed here several very important questions about incentives for B2B surveys. We asked specific questions about expectations versus reality for incentives our respondents had been offered in past B2B surveys, and we received a lot of important information on what they deem appropriate types and amounts of compensation. I’ll bring that information to you in a future article, as it’s a focused discussion unto itself that deserves more than a few paragraphs here.

Quest and I hope that sharing the results of our recent survey of U.S. business professionals will provide insights and advice for researchers of all types. Our purpose in developing this information is to help all of us gain a new understanding of B2B audiences and reconsider how we approach creating and conducting B2B surveys. We welcome comments and collaboration as we dig into B2B best practices and share what we find.