Still struggling with technology
Editor’s note: Tim Macer is managing director, and Sheila Wilson is an associate, at meaning ltd., the U.K.-based research software consultancy which carried out, on behalf of Confirmit, the study on which this article is based.
Now in its fifth year, the annual Confirmit Market Research Software Survey, as usual, throws up some surprises, confirms some hunches and reveals a few long-term trends in what the industry thinks about and does with the computer technology it uses.
The results of the 2008 study show us that market researchers are grappling with efficiency problems and are struggling to get the performance they need from some of the tools in place. Market research-specific autodialers have been on the market for well over a decade, yet, as this study shows, the uptake of these cost-saving devices has been relatively modest. Conversely, there are innovative companies that are incorporating a range of Web 2.0 approaches into their surveys, and in some cases finding this quite hard to do with the current range of software. It is interesting that while some parts of the industry are embracing these edgy new technologies so swiftly, other parts are lagging behind.
In each previous year, the study has found a large number of companies wishing to change their software, and this year is no exception, with some companies even looking to make changes across the board. However, our survey was in the field in autumn 2008, at an early point in the understanding of the global economic downturn. We would now expect the time line for such changes to be extended considerably, and some of the other estimates of growth to have been suppressed by economic events.
The study was conducted by meaning ltd during 2008 among a balanced sample of 215 research companies of all sizes across the globe. It was carried out as an online survey in four global languages. We are most grateful to Confirmit for sponsoring this research and also for allowing us to publish these results, and of course, to the 215 respondents who willingly shared their insights with us.
Online dominates
Figure 1 shows the volume of surveys being carried out by each research mode, expressed by each respondent as a proportion. Clearly, online research dominates, and has now risen to nearly half of all research modes by volume among the companies in our survey (though ESOMAR’s 2008 Global Estimates, based on revenues, put this at 30 percent). From the buzz in the industry, it would be easy to assume that this growth has been at the expense of CATI. Our research shows that CATI is holding up well, and it is paper-based research which is showing a decline.
The onward march of Web research has been relentless throughout the lifetime of this annual study. When we first conducted it in 2004, we discovered that 81 percent of market research companies offered online research, and it has now reached near ubiquity, standing at 94 percent of research companies. Certainly, in just a decade, the industry’s skepticism of the method has been replaced by near universal acceptance. Apart from Web and CATI, all the other data collection modes either seem to be in decline or have not yet taken off.
Still looking at penetration, rather than volume, there has been a 12-percentage-point drop in the number of companies conducting paper-based research - down from 63 percent of firms in 2007 to around half (51 percent) in 2008. Laptop CAPI appears to be gradually declining too, with 32 percent of companies offering it in 2007 and 24 percent in 2008, although revenues have only dropped slightly in that time. With new and exciting modes such as mobile CAPI and mixed-mode maturing, we wonder if we are approaching the day when the paper-based survey becomes a niche service in the professional research company.
Figure 2 shows the responses when we asked researchers to predict changes they anticipated in the balance of interviewing work over the next three years using a four-point scale, where 2 represents major growth, 1 modest growth, 0 no change, and -1 a modest decline.
Given the findings shown in Figure 1, it is not surprising that the Web is predicted to be the main growth area, though we suspect this trajectory will begin to level out, given the high proportion it has already reached. Growth in mixed-mode CATI/Web and mCAPI also seems inevitable: these technologies are maturing, making the task easier, and respondents are proving harder to reach through a single channel than through a combination of channels. Even so, our respondents’ expectations are modest for the other modes, mCAPI included. It is noticeable that the North Americans and Europeans expect mixed-mode CATI/Web to grow, whereas those in Asia-Pacific anticipate virtually no change.
The decline in paper seems inevitable, as it is cumbersome and inefficient compared with Web and mCAPI surveys. The expectation that CATI will decline slightly in all regions is at odds with our findings in Figure 1, where CATI has not declined; it appears to be more resilient than its practitioners anticipate. This sentiment may be linked to the observation that one in every four respondents said (in answer to another question in this study) that falling response rates are the principal challenge the industry faces.
The question illustrated in Figure 3 looks at telephone dialing methods and tells us the proportion of CATI work using each method both at the time of the study and as anticipated by respondents one year later.
Given the huge efficiency gains that can be made with autodialers, we are surprised that as much as two-fifths of all CATI work is still handled by manual dialing. While it is true that not every project is suited to autodialing - for example in B2B, where the phone is almost always answered by someone, but not necessarily by the respondent - the technology appears to be underutilized.
Those firms that are not using autodialers are working at a major cost disadvantage to those that are, since autodialers give huge savings on staff - the main expense in a CATI operation. The persistence of manual dialing may be a rational response to the predicted decline of CATI in Figure 2, though it is less rational if that decline fails to materialize.
Predictive dialing can offer even greater economy, its proponents claim. However, the industry is expecting only a gradual shift away from manual dialing towards predictive. We asked those who do not use predictive dialing why they don’t. The top reason, for 42 percent of respondents, was the belief that they would realize no cost savings, due to the types of research they conduct. Nearly as many expressed ethical qualms, with concerns over nuisance calls. A further 22 percent felt that it would be too costly to install and operate. It seems that a proportion of the industry is unconvinced when it comes to autodialers, or perhaps unwilling to invest in something that is considered to be old hat.
What went wrong
Figure 4 summarizes responses to a new question for 2008, in which we asked the respondents about the technology challenges they were facing. “Automating repetitive tasks” came out on top. If we had received this answer in 1984, we would have found the insight impressive. A quarter of a century later, we are wondering what went wrong, as this is what computing at its most basic is supposed to achieve. It’s a finding that does not speak well of the quality of the software that the industry is using and it may go some way to explaining some of the other findings we report later: many companies wish to change their software (Figure 6) and many market research companies are using bespoke tools (Figure 8).
Similarly, keeping up to date with technology developments and recruiting staff could be interpreted as reactive concerns, not ones where those responsible for technology are taking the lead.
Since the industry is preoccupied by problems at the more fundamental level, it is little wonder that some of the complex tasks further up the tree, like handling trackers or moving data or questionnaires to different platforms, are of secondary concern.
Innovators and traditionalists
For the first time, we asked respondents about their use of six emerging Web 2.0-style technologies in their research (Figure 5). The profession seems to divide between innovators and traditionalists. Although a significant minority of traditionalists felt these technologies were not required, there were also an impressive number of innovators who are already using blogging and co-creation in their studies, or who are making their questionnaires more high-tech by, for example, presenting questions in video format. There are not many market research-specific tools that incorporate Web 2.0, so those already employing these approaches have clearly applied considerable effort and a great deal of imagination. Some items seem harder to do than others - analyzing unstructured text stands out in particular - but co-creation also has more users struggling to do it than actually doing it.
Given that Figure 4 revealed a profession perhaps running to catch up with computer technology, the latest gizmos may also not be a top priority for many practitioners.
Increased every year
Since we first asked respondents in 2006 whether they were planning to change their software over the coming two years, the number of those who said yes has increased every year - from 26 percent in 2006, to 34 percent in 2007 and 40 percent in 2008.
In the 2008 study, we asked those who said they planned to change their software which types of applications they wished to replace (Figure 6). The high proportion registered for each software type seems to indicate that some companies want to overhaul their software across the spectrum, rather than simply tinker at the edges. The Europeans appear to have slightly less radical ideas, as their noticeably shorter columns in Figure 6 attest. It’s a difference we are unable to explain.
It is to be expected that far fewer companies wish to change their panel management software since many companies have not been using these tools for long enough to be thinking of changing them.
In a follow-up question, we asked the sample for the main reasons for changing each type of software. The results were similar for each software type, with “seeking more functionality” coming out on top, followed by “achieve efficiency improvements” and “move to a more modern platform.” There is, in any case, a large amount of legacy software in use throughout market research, and the observations here, combined with the earlier observations on unmet requirements for efficiency, can be viewed together as evidence of the difficulties many users are having with software that is long in the tooth and not well integrated with other modern software tools.
Mixed-mode is important
As Figure 7 illustrates, we asked: “If you were choosing new software, or reviewing your current solution, how much importance would you place on the tool’s ability to mix and combine different data collection modes?” Nearly all respondents think mixed-mode is important in new software. Given the ability of mixed-mode research to improve response rates - which respondents stated (in another question) was the main challenge the industry faces - it seems logical that the industry should wish to follow the path of mixed-mode. It is therefore a little surprising, or even contradictory, that in Figure 2 the expected growth of mixed-mode is so modest. However, many research companies appear to be unconvinced of the merits of multichannel research projects, and instead wish to adopt mixed-mode research platforms for the internal efficiencies they bring. In a separate question, we asked what level of functionality was required: only 21 percent said they needed the functionality to support mixed-mode projects where interviews can switch from one channel to another, 51 percent sought the ability to run mixed-mode samples in parallel and 22 percent were only seeking the benefits of one common platform for authoring and deployment.
Revenues are modest but not insignificant. In another question we found that 6 percent of revenues are now attributable to mixed-mode studies. Despite its apparent promise, it seems that market researchers are not anticipating any rapid gains in the area of mixed-mode research.
Never cease to amaze
Figure 8 shows the proportion of companies which use packaged versus bespoke software. We publish the latest results every year because they never cease to amaze us.
Given the costs and risks associated with developing custom software, and given the huge number of off-the-shelf applications on the market, one would think that tailor-made technology would be a rarity, but far from it. In fact, in the case of Web and analysis, our measurements detect that the use of own-developed software is actually increasing. In 2007, 17 percent used own-developed Web software only, whereas in 2008 this rose to 25 percent. Similarly, with analysis, the corresponding figures are 10 percent for 2007 and 15 percent for 2008.
In fact, with Web (and indeed CAPI, though on a very small base) the use of packaged software has also increased, with 72 percent using packaged Web software in 2007 against 80 percent in 2008. As we saw in Figure 1, the number of companies offering Web research as a service is clearly growing, but a proportion of the resulting spending on new technology is staying in-house and being put toward developing home-grown solutions.
The popularity of custom CATI software may go part of the way to explaining why so many companies are still not using autodialers. Integrating autodialer hardware and software with a CATI application is no simple matter for a software developer, so it seems unlikely that many bespoke CATI tools would support autodialers.
We also find it surprising that analysis is the software type least likely to be customized, since that is the part that produces the deliverable to the client - and one would think that this is where the market research firm has the most scope for differentiation and building added value.
Gathered pace
But what of 2009? Since fielding the study in early fall 2008, the economic downturn has gathered pace, alongside a greater understanding of its extent, and we would now expect companies to be much more pessimistic about the future. There are perhaps one or two early-warning signs already in the figures. For example, we noticed that in 2008, the big companies had become less certain about changing their technology than previously, and the use of a wide range of sources of online sample, such as access panels, which had been growing steadily over the years, had leveled out in 2008.
As research businesses focus on survival in 2009, it seems unlikely that many will have the appetite or the budget for the major changes in technology predicted in the 2008 study, unless a bankable return on investment can be assured within the same fiscal year. But there is nothing like a recession for driving out inefficiency as well as cost, and, anecdotally, we have heard of several of the newer, more modern and lower-cost software suppliers actually having quite a good year so far.
We will be conducting the survey again in 2009 and expect it to reveal a very different landscape, probably a change of priorities for research companies and no doubt some unexpected triumphs in adversity.