Structural problems driving data quality concerns in market research
Editor’s note: Bob Fawson is founder and CEO of Data Quality Co-Op and vice chairman of the board at SampleCon.
We have a data quality problem in the market research industry. This is nothing new. Fraud, disengagement and declining respondent trust have turned sample sourcing into an arms race of better bots fighting better bot detection. The question isn’t whether data quality is a challenge – it’s how we got here and, more importantly, how we get out.
The answer may come not just from better AI or new detection models but from history.
Lessons from 19th-century market failures
Tyler Brough, a fintech professor at Utah State University, said, “Everyone has a data supply chain. Could you imagine in a mature business, like a manufacturing business, if you were throwing away 30-to-40% of your raw materials? You’d either go out of business or get sued.”
And yet, that’s exactly what’s happening in market research. Researchers routinely discard large portions of survey responses due to fraud, inattentiveness or low engagement, treating waste as an unavoidable cost of doing business.
Other industries have faced similar challenges and found solutions. Take financial markets. In the 19th century, commodity futures trading was a chaotic mess. Buyers and sellers had no standardized contracts, no way to verify product quality and no way to enforce agreements. As a result, fraud was rampant, speculation was dangerous and trust was low.
Then something changed. Markets evolved institutional frameworks – clearinghouses, standardized contracts and shared pricing data – to mitigate risk, align incentives and create transparency. Over time, these structures transformed volatile, unreliable markets into stable, trusted ecosystems.
Data quality is a human problem AI can’t solve
Modern AI-powered fraud detection plays an important role in improving data quality, but it can’t fix the underlying structural problem. Brough told me, “The interesting thing is that there probably is no complete solution without the ability to design, draft, understand and enforce contracts.”
The problem isn’t just fraud – it’s misaligned incentives. Sample suppliers are incentivized to deliver as many completes as possible, while researchers want high-quality, thoughtful responses. Meanwhile, participants are often left out of the equation entirely, treated as data sources rather than valued contributors. The market structure encourages cutting corners – lowballing incentives, over-surveying engaged participants and relying on technical fixes rather than systemic reform.
Brough sees a strong analogy between today’s data quality crisis and the development of futures markets. In the 1840s, upstate New York grain traders were struggling with frequent contract defaults. Then, something remarkable happened: market participants voluntarily established standardized contracts and self-regulated through shared transparency mechanisms. They created an environment where cooperation was more profitable than deception – without needing courts or regulators to force their hand.
Market research needs its own version of that evolution: a data quality clearinghouse that provides shared visibility into quality metrics across the supply chain.
A path forward: Institutionalizing trust
So, what does this look like in practice? For one, it means moving past the short-term “cat and mouse” game of fraud detection and into a long-term structural approach.
- Data transparency: Buyers and suppliers need shared visibility into data quality indicators, just as credit markets rely on bureaus like Experian and financial markets depend on rating agencies. (A rough sketch of what such a shared record might look like follows this list.)
- Standardization: Quality expectations must be codified into clear, enforceable agreements, preventing a race to the bottom on price and engagement.
- Participant incentives that work: Participants should be compensated fairly and offered experiences that encourage genuine responses, not just fast completions.
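For readers who think in code, here is one way to picture the kind of record a clearinghouse might traffic in. It is a minimal sketch only, written in Python; the names (QualityRecord, usable_rate, the flag fields) are hypothetical illustrations, not an existing standard or any vendor's actual design. The point is simply that buyers and suppliers would compute the same quality metric from the same shared records.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class QualityRecord:
    """One interview's quality disposition, as a supplier might report it to a shared clearinghouse."""
    supplier_id: str           # who sourced the respondent
    buyer_id: str              # who fielded the study
    respondent_hash: str       # anonymized respondent identifier
    completed: bool            # did the respondent finish the survey
    flagged_fraud: bool        # failed fraud or bot checks
    flagged_inattentive: bool  # failed attention or speeding checks
    reported_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def usable_rate(records: List[QualityRecord]) -> float:
    """Share of completes that were neither fraudulent nor inattentive,
    the kind of metric both sides of the supply chain could see."""
    completes = [r for r in records if r.completed]
    if not completes:
        return 0.0
    usable = [r for r in completes if not (r.flagged_fraud or r.flagged_inattentive)]
    return len(usable) / len(completes)


if __name__ == "__main__":
    sample = [
        QualityRecord("supplier_a", "buyer_x", "r1", completed=True, flagged_fraud=False, flagged_inattentive=False),
        QualityRecord("supplier_a", "buyer_x", "r2", completed=True, flagged_fraud=True, flagged_inattentive=False),
        QualityRecord("supplier_a", "buyer_x", "r3", completed=True, flagged_fraud=False, flagged_inattentive=True),
    ]
    print(f"Usable rate: {usable_rate(sample):.0%}")
```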
This is about intelligent infrastructure, not just better technology. As Brough puts it, “AI is just machine learning, and machine learning is just traditional statistics amped up with algorithms and lots of data. But what’s missing is the institutional layer – the economic and social frameworks that guide how we interact with data.”
The market research industry has been here before, and other industries have already found ways to solve these exact challenges. The question isn’t whether change is coming – it’s whether we will shape that change or let the market continue to erode.
It’s time for a new model – one built on cooperation, transparency and a little inspiration from history.