DIY on a new high?
Editor's note: Tim Davidson is president of Prevision Surveys, Marshfield, Mass. He can be reached at tdavidson@previsionsurveys.com. Interested readers can receive a copy of the most recent study by e-mailing the author. Prevision is also seeking respondents for the current study, who will receive a copy of the respondent report when it is tabulated later this year.
For the past 14 years my firm, Prevision Surveys, has conducted research-on-research studies of buyers of market research regarding their satisfaction with the suppliers they used in the prior year. These are structured telephone interviews of research buyers in all major non-government industries in the United States.
Each year we tabulate the 290-300 interviews and produce average scores for each provider mentioned. These averages are computed for seven research categories (e.g., early-stage concept screening, later-stage concept/product tests, A&U studies, ad copy testing) and for seven attributes (overall satisfaction, data quality, analytical skills, communication skills, on-time delivery, other customer service attributes and value-for-money).
During the interviews, respondents often mentioned that some research was conducted by their own staff rather than by hiring an outside researcher in the prior year. Prevision then asked the respondents to put on their “objective hat” and rate their departments’ do-it-yourself (DIY) work in the same manner as they rated outside contractors.
Bolster internal decision-making
Motivated by the success of the mid-20th-century survey sampling and polling techniques of George Gallup, creator of the Gallup Poll, the marketing departments of prominent CPG companies first conducted marketing research with their own staff, gaining insight from their customers/consumers to bolster internal decision-making. Soon specialty groups within these marketing research departments (MRDs) were created to take on this labor-intensive work. Then new specialized outside firms emerged to do the time-consuming fieldwork and tabulation of research results.
Eventually, independent custom quantitative market research providers expanded their services into six tasks once done exclusively by in-house staff: assisting in research design; developing the questionnaire; fielding the survey; coding and tabulating the findings; reporting the tabulated results; and developing relevant insights around the marketing/sales decisions to be made.
The most recent Prevision quality/value study, covering 2018, found that there are well over 250 medium-to-large MR providers operating in the U.S. that provide virtually all of the above quantitative research tasks. A decade ago, most of the custom quantitative and qualitative research needs of U.S. industry were met by market research firms and few DIY projects were performed. Gradually, more and more client-side firms resumed some MR project work with their own staff (employing specialized survey software), without the help and expense of outside providers. This in-house activity is referred to as do-it-yourself research.
Several years ago, when MR department budgets were under new economic pressures, many respondent firms increased their use of DIY for certain research projects by doing all six research tasks discussed above with in-house staff only, without any help from custom quantitative or qualitative MR providers.
Many DIY-using firms purchased lists of respondent e-mails and employed online survey software. More recently, wider-ranging online research platforms (equipped with tools like survey software, questionnaire templates, tabulation/reporting aids and respondent panels) have become very popular for assistance with DIY tasks.
Prevision found that in 2018, 80% of respondents used DIY approaches to some extent, up from 71% just two years prior. The rest used no DIY and relied solely on outside help by directly hiring MR providers or marketing research consultants.
Of those respondents that used DIY for some of their projects in 2018, 20% used DIY for up to 5% of all their custom quantitative and qualitative research projects. In contrast, 16% of respondents (up from 7% in 2016) used DIY for more than 70% of their research projects.
A sample of 95 DIY-using respondents claimed that their departments conducted more than 1,400 DIY projects in 2018. Forty-two percent of these firms expected the number of DIY projects to increase in 2019 and an equal number said it would stay the same. Only 5% expected the DIY project count to decrease.
Bragging rights
How does DIY compare to professional researchers for research quality and value? And how do these DIY attribute ratings compare with DIY ratings in the prior study year? The 2018 study found that if DIY were an independent MR provider, it would have the bragging rights regarding the seven study attributes shown below.
Among the 230 MR buyers rating DIY custom quantitative research projects in 2018:
- For the overall satisfaction study attribute, DIY would rank 10th (vs. sixth in 2017). (That is, in the 2018 study, nine professional MR firms had higher overall satisfaction average scores than the average score for DIY research. Further, DIY’s overall satisfaction ratings are slipping from year to year: four more professional MR firms outscored DIY on overall satisfaction in 2018 than in 2017.)
- For data quality, DIY would rank 11th (vs. 12th in 2017).
- For analytical skills, DIY would be in eighth place (vs. ninth in the 2017 study).
- For written and verbal communication skills, DIY would rank seventh (unchanged from its 2017 ranking).
- For on-time delivery, DIY would rank eighth, also unchanged from 2017.
- For other customer service attributes, DIY would rank fifth (vs. sixth in 2017).
- For value-for-money, DIY would be in first place, unchanged from 2017 but statistically tied with two independent research providers.
Why does DIY seem to be the best value? Many respondents considered only the out-of-pocket costs of DIY research (e.g., software fees, platform and panel charges) without counting the salary and other employee-related expenses of the in-house staff performing the DIY work. Prevision found, however, that some respondent firms hire new MRD staff exclusively to carry out DIY projects. Others expect existing staff to spend a substantial part of their time doing DIY work. Clearly, if the cost of labor and overhead were included in DIY research project costs, DIY would likely lose its first-place position on the value-for-money attribute.
High and growing
The quality and value of DIY is clearly high and growing in the eyes of survey respondents for certain types of market research studies. The 2018 study found that if DIY were an independent MR provider, it would have the following ranks regarding the overall satisfaction attribute:
- For early-stage concept screening, DIY would rank fourth in overall satisfaction, up from seventh in 2017.
- For later-stage concept and product testing, DIY would be fifth, up from sixth in 2017.
- For attitude and usage studies, DIY would be fifth, down from third in 2017.
- For customer satisfaction studies, DIY would be third, up from fifth in 2017.
- For ad copy testing, ad/brand tracking studies and brand equity/market structure work, DIY was rarely used.
DIY is commonly used for concept screening and concept testing, where confidentiality is essential, and for attitude and usage testing and customer satisfaction/loyalty measurement when the project objectives and research design are simple. DIY is not thought to be appropriate for certain more complex types of research studies (e.g., brand equity/market structure, segmentation and other modeling studies) or where specialist MR providers are common and inexpensive (e.g., ad copy testing and long-term ad- or brand-tracking studies).
‘Partial DIY’
Another strategy to reduce out-of-pocket research costs is employed by 12% of respondent MRDs. These “partial DIY” respondent firms outsource the fielding of the survey and the tabulation of the survey responses, leaving the four remaining research tasks to in-house staff. The study found that most MR consultants use some MR suppliers as field-and-tab shops because consultants are normally expected to develop the questionnaire and deliver the insights as part of their own services to their clients. (MR consultants represent less than 10% of the survey respondents.) DIY and partial DIY using field-and-tab shops may also be employed when the subject of the research must be kept as confidential as possible, as in new-product concept screening and testing.
Since the advent of internet-based research and easy access to specialized survey software applications, many more market research departments have been motivated to do their own studies. Firms that do not manage their own panelist/customer lists often buy e-mail lists of potential survey respondents from panel suppliers for online DIY survey projects. Further, some firms hire panel suppliers to manage the firms’ own proprietary customer/prospect lists as well.
In the 2018 study, 78 respondents mentioned and rated 24 panelist/platform suppliers. Dynata (formerly Research Now SSI) led with 23% of the ratings, followed by Qualtrics with 11% of ratings. Of the two leaders, Dynata had higher attribute scores. These two firms, plus ZappiStore, C Space and Vision Critical, represented over 50% of the mentions. Honorable mention goes to ZappiStore for its high overall satisfaction and value-for-money scores.
By far the most frequently mentioned software products to assist respondents with DIY online surveys are Qualtrics and SurveyMonkey. Together they represent 56% of all survey software products mentioned and rated. Sparq (from Vision Critical), SPSS and SurveyGizmo had substantially fewer mentions. Survey respondents also identified 12 more software products, five of which were mentioned only once.
Of the two most popular survey software products, Qualtrics had the higher score in overall satisfaction (4.20/5). SurveyMonkey received the highest value-for-money score (4.67/5) in 2018. (These same software products may be available for use on a web-survey platform as well as on the personal computers used by DIY researchers.)
Clearly on the rise
DIY research use is still clearly on the rise, having been reported more in 2018 than in any earlier study year. The number of DIY observations was 73% higher than in the prior year (288 vs. 166). Prevision’s Gold Index (GI) is the sum of the average attribute scores indexed to 1,000. Only five major providers of custom quantitative market research in the U.S. have higher Gold Index scores than DIY; an equal number have lower GI scores. The Gold Index for DIY (847/1,000) is five points higher than the average for all providers and is similar to the previous year’s figure. DIY scores lower than the all-provider averages, however, on four attributes: overall satisfaction, data quality, analytical skills and on-time delivery. Understandably, DIY’s value-for-money score tops that of any other research provider because survey respondents do not often factor in the salary cost of the in-house staff used.
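The exact Gold Index arithmetic is not spelled out here, but as a rough, hypothetical illustration: if the index simply rescales the sum of the seven attribute averages (each on the five-point scale used elsewhere in the study) to a 1,000-point maximum, DIY’s 847 would correspond to attribute averages summing to about 29.65, or roughly 4.2 per attribute:

```latex
% Hypothetical rescaling, assuming seven attributes each rated on a 1-5 scale
GI = \frac{\sum_{i=1}^{7} \bar{s}_i}{7 \times 5} \times 1000,
\qquad \text{e.g., } \frac{29.65}{35} \times 1000 \approx 847
```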