Sponsored content
Editor's note: Kalyan Raman is chief technology officer, Research Now/SSI.
Today’s market is overcrowded: too many competitors, too many products and services, too many media channels. All of them are competing for the interest of potential buyers, whose attention spans continue to shrink and whose choices are driven by diverse tastes and trends that can change overnight. In this world, research-based insights that help organizations understand and respond to shifts in demand and buyers’ preferences are truly invaluable.
By extension, technology that assists in the analysis of research data to uncover those insights is also invaluable. The analytical context in which technology can contribute revolves around three essential questions that almost every organization is seeking to answer:
- Who is my next customer? Or, what specific groups of consumers should I target to grow my business and my market share?
- How can I fuel that growth? Or, what should I do to gain share among those specific targeted prospect groups?
- How can I reverse a decline? Or, why am I losing share among a specific group of customers and what should I do to change it?
There are innumerable variations on these core queries and many possible answers to each of them, which leads to the notion of combinatorial scale. This issue of scale – the need to identify relevant combinations among billions of possibilities – defines the nature of the underlying problem that technology is used to solve.
Market researchers have created methodologies and sophisticated scientific approaches grounded in data to tackle that combinatorial challenge and address the questions above. To get to the best answers – and to arrive at them quickly – researchers use the same kind of deductive process any good investigator does, adapted to the specific requirements of market research.
- Collect all the evidence via consumer or buyer research.
- Connect the dots, or the relevant data points, to reveal patterns in the research results that can help shed light on the dynamics of demand.
- Contextualize the patterns to gain insights, based on knowledge of your organization, your customers and the marketplace in which you compete.
- Communicate the findings, recommendations and action plan to the decision makers in your organization who can consider the demand insights you’ve found, and make the decisions that will achieve the results your organization seeks.
As an example of this process, consider a product or a service that is losing sales or growth among men 25-35 who live in a particular group of postal codes and own a specific brand of car. Researchers gather data from that segment and analyze the results to identify commonalities that more precisely define the customers being lost. They then communicate findings and recommendations – which may involve anything from different approaches to marketing, advertising, pricing, targeting or promotional strategies to changes in the product or service itself. This core process can be repeated to ensure that the actions taken are producing the desired results, without unintended consequences.
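To make the segmentation step concrete, here is a minimal sketch in Python using the pandas library; the file name, column names and segment values are hypothetical stand-ins for real buyer research data, not a description of any particular study.

```python
# A minimal sketch, assuming a hypothetical survey file and column names.
import pandas as pd

df = pd.read_csv("survey_responses.csv")  # hypothetical buyer research data

# Isolate the segment that is losing sales: men 25-35 in a particular group of
# postal codes who own a specific car brand (all values illustrative).
segment = df[
    (df["gender"] == "male")
    & df["age"].between(25, 35)
    & df["postal_code"].isin(["10001", "10002", "10003"])
    & (df["car_brand"] == "BrandX")
]

# Compare lapsed vs. retained customers within the segment to surface commonalities.
commonalities = (
    segment.groupby("lapsed")[["price_sensitivity", "ad_recall", "satisfaction"]].mean()
)
print(commonalities)
```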
Technology has long played a central role in the first step of this process: collecting evidence. Because consumer preferences vary at the individual level, it is important to understand preferences and design strategies for individual targets rather than broad groups of consumers – and to have confidence in how representative those individuals are. This is why online research panels sit at the heart of all our offerings.
Now technology is enabling tools that can support and enhance the remaining steps of the process – connecting the dots, visualizing contextual insights and communicating them to decision makers to drive activation. While these tools are relatively new, and not universally adopted, they can provide important advantages.
Researchers are increasingly tapping large, well-curated data sets, integrated using state-of-the-art data management techniques, such as those provided by a consumer data platform. Technology ranging from machine learning and deep learning to visual discovery can act on these large data sets to address the combinatorial scale problem, reducing the number of combinations by applying a relevancy filter at every stage using multi-step computational algorithms. These techniques can bring into focus new and different insights that are relevant to the context and may not have been evident using more traditional methods.
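As a rough illustration of how a relevancy filter can tame combinatorial scale, the sketch below grows attribute combinations one stage at a time and keeps only the most promising candidates, rather than enumerating every possibility. The attribute names and the placeholder relevance score are illustrative assumptions, not a description of any particular product.

```python
# A simplified sketch of staged pruning; attributes and the placeholder
# relevance score are illustrative assumptions.
attributes = ["age_band", "region", "car_brand", "channel", "price_tier", "loyalty"]

def relevance(combo):
    # Placeholder: in practice this could be a lift, mutual-information or
    # model-importance score computed from the research data.
    return len(set(combo))

def staged_search(attributes, max_size=3, keep_top=10):
    """Grow attribute combinations one step at a time, keeping only the most
    relevant candidates at each stage instead of enumerating every combination."""
    frontier = [(a,) for a in attributes]
    for _ in range(max_size - 1):
        frontier = sorted(frontier, key=relevance, reverse=True)[:keep_top]
        frontier = [c + (a,) for c in frontier for a in attributes if a not in c]
    return sorted(frontier, key=relevance, reverse=True)[:keep_top]

print(staged_search(attributes))
```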
In addition, the sheer quantity of evidentiary data that researchers can collect today can overwhelm traditional analytical methods. Technology enables more powerful analytical modeling methods, such as multi-touch attribution or path-to-purchase modeling, along with visual discovery tools, to better cope with these large amounts of data.
Essential characteristics
The accelerated pace of competition and the unexpected disruptions that affect every market make speed and agility increasingly essential characteristics for every business. Technology can help analysts shorten the time to insights, particularly when dealing with high volumes of data, and bring the communication of results to near-real time.
Consider an example of the first of these values of technology – bringing new and different insights into focus: the application of machine learning to data analysis.
Traditional analysis starts with a hypothesis – what researchers or others believe is influencing consumer demand. The researcher then gathers data and analyzes the results to see if the hypothesis is supported or rejected in favor of a different conclusion.
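A minimal sketch of that hypothesis-first workflow, using synthetic purchase-intent scores and a standard t-test from SciPy; the groups and numbers are purely illustrative.

```python
# A minimal sketch of the hypothesis-first approach; the groups and scores
# are synthetic, purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
exposed = rng.normal(6.2, 1.5, 200)   # purchase-intent scores, saw the campaign
control = rng.normal(5.9, 1.5, 200)   # purchase-intent scores, did not see it

# Hypothesis: exposure to the campaign raises purchase intent.
t_stat, p_value = stats.ttest_ind(exposed, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# The hypothesis is either supported or rejected; questions the researcher
# never thought to ask remain invisible.
```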
The inherent limitation of this approach is that the hypotheses are based on expectations of what will be found, making it more difficult to uncover truly unexpected results. Hypothesis-based research tends to support incremental gains in understanding rather than truly surprising results that can lead to disruptive innovations.
By contrast, the method enabled by machine learning and large data sets introduces a discovery process that requires no starting hypothesis and is therefore independent of expectations. In this exploratory approach, the computer identifies and learns from clusters of data points as well as the broader patterns they create, zeroing in on possible findings based only on the data itself.
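A minimal sketch of that discovery process, assuming hypothetical respondent features and using scikit-learn’s k-means clustering as a stand-in for the machine-learning step:

```python
# A minimal sketch of hypothesis-free discovery; the respondent features are
# synthetic, and scikit-learn's k-means stands in for the clustering step.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))  # hypothetical features: age, spend, recall, satisfaction

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=5, n_init=10, random_state=1).fit_predict(X_scaled)

# Each cluster is a candidate pattern discovered from the data alone; researchers
# then judge which ones are meaningful in context.
for label in np.unique(labels):
    print(label, X[labels == label].mean(axis=0).round(2))
```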
Many, perhaps most, of these findings will ultimately prove irrelevant, because they are generated without any contextual understanding. Creating actionable insights from any findings requires the crucial contextual knowledge of an organization and its goals in the marketplace that machines do not possess.
However, this kind of analysis can lead to insights that no one would have predicted or expected. They, in turn, can lead to breakthrough ideas with the potential to truly revolutionize a company and a market.
Raised concerns
Another category of technology-enabled analytical tools – more familiar to most researchers – is dashboards and data visualizations. For some, these tools have raised concerns that visual displays may provide too-easy access to oversimplified results, particularly for research clients who are not trained in the finer points of data analysis.
It’s true that these visual displays act as translators, serving as an accessible interface that simplifies and improves the interpretation of data and allows for a far more immediate grasp of dense research data than a table of numbers. Therein lies their value, and as the data involved in any research study grows in volume and complexity – such as analyses based on machine learning – the value of visualizations grows correspondingly. We are highly visual learners, after all, and data visualizations unlock the potential for visual discovery.
Technology-driven visualizations can add power to the art of analytics – helping analysts identify interesting and potentially important outliers in data, for example. Outliers have gained recognition for their potential to enrich research results with insights that can have a major impact on a brand’s success, such as pinpointing the “superconsumers” who are a brand’s true evangelists, described by Eddie Yoon in his book Superconsumers: A Simple, Speedy and Sustainable Path to Superior Growth.
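One simple way to surface such outliers, sketched below with synthetic purchase data, is to flag customers whose spend sits far above the mean; the three-standard-deviation threshold and column names are illustrative assumptions.

```python
# A minimal sketch of outlier flagging with synthetic purchase data; the
# threshold and column names are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
purchases = pd.DataFrame({
    "customer_id": np.arange(1000),
    "annual_spend": rng.gamma(2.0, 150.0, 1000),
    "purchase_frequency": rng.poisson(4, 1000),
})

# Flag customers whose spend sits far above the mean – candidate superconsumers.
z = (purchases["annual_spend"] - purchases["annual_spend"].mean()) / purchases["annual_spend"].std()
superconsumers = purchases[z > 3]
print(superconsumers.sort_values("annual_spend", ascending=False).head())
```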
Dashboard visualizations displaying real-time data can greatly simplify previously time-consuming analytical tasks. For example, our cross-media campaign effectiveness solution incorporates an interactive dashboard visualization allowing flexible data recombinations for virtually instantaneous comparative views. The tool makes it much easier to see and understand complex results from multiple media channels.
This dynamic linkage of real-time data with flexible visualizations offers users the additional advantages of speed and agility. Powerful tools such as these can help decision makers readily grasp important story lines in research findings, enabling them to make informed choices at the speed of the modern marketplace.
Need to pay attention
It’s important to note, however, that not all analytical tools are created equal. Many of us have encountered online language translators that fell short. Since data visualization tools also function as translators, researchers need to pay attention to the way they are built.
The ability of visualizations to simplify data interpretation through visual discovery is grounded in the idea of progressive disclosure. Under this principle, when a researcher shifts the view and focus of the visualization – from a high-level overview to a detailed segment, for example – the tool presents only the relevant, necessary data, not an ocean of data that would overwhelm the user.
Consequently, the tool must be designed with a thorough understanding of how research analysts work, and what they are trying to accomplish in any analytical task, to ensure the incorporation of the right data. That means not only providing the data the analyst needs at any given level of analysis but also masking the data that’s not required – until the analyst reaches the point that it becomes relevant.
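As a rough sketch of progressive disclosure, the example below exposes only the columns and rows defined for each drill level; the drill levels, column names and filters are hypothetical.

```python
# A rough sketch of progressive disclosure; the drill levels, column names and
# filters are hypothetical.
import pandas as pd

VIEWS = {
    "overview": {"groupby": ["region"], "metrics": ["awareness"]},
    "segment": {"groupby": ["region", "age_band"], "metrics": ["awareness", "intent"]},
    "detail": {"groupby": ["region", "age_band", "channel"],
               "metrics": ["awareness", "intent", "recall"]},
}

def disclose(results: pd.DataFrame, level: str, **filters) -> pd.DataFrame:
    """Return only the data relevant to the requested drill level, masking
    everything else until the analyst asks for it."""
    view = VIEWS[level]
    for column, value in filters.items():
        results = results[results[column] == value]
    return results.groupby(view["groupby"])[view["metrics"]].mean()

# Example usage: start broad, then drill into one region.
# disclose(results, "overview")
# disclose(results, "segment", region="Northeast")
```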
To fulfill these requirements, the tool must maintain dynamic, robust connectivity with the underlying research data. Otherwise it cannot deliver the flexible functionality that will ultimately provide a clearer path to insights.
Dashboards and data visualizations should also integrate with your communication and presentation tools, such as PowerPoint, as linked objects. That allows the display to update automatically based on changes in the underlying data, ensuring your presentation graphics remain up-to-date.
These analytical tools can play an important, valuable role as components in the deductive process. They can help researchers process raw data to uncover new and potentially more powerful insights into the shifting attitudes and behaviors that drive demand, while shortening the time it takes to reach them.
Of course, any analytical tool is only as good as the data that is fed into it, and technology-driven tools are no exception. They require robust, accurate, trusted data as the input. To deliver on their promised potential, their design and functionalities must have been guided by an understanding of the needs of marketers and researchers. Their output must be informed and evaluated by the judgment of thoughtful professionals to identify valuable findings and prioritize them for decision makers to act on.
When those conditions are achieved, technology in the analytical process extends and expands the ability of researchers to identify and uncover critical demand insights by collecting data, connecting the dots to understand its implications and communicating findings to drive decisions. In an overcrowded, oversupplied market, these fundamental benefits are simply too important to ignore.