There has been a lot of emphasis in the marketing research industry recently on mining social media and other Web sites and combining the data with text analytics to get a true understanding of consumers’ views. Many in the industry are selling this as more accurate and less biased than traditional qualitative and quantitative methods. But I have to admit I’m a bit of a skeptic. I’ve always felt these new tools can supplement traditional research but not replace it: the sample is heavily self-selected, and I simply don’t trust many of the comments I read in online product or service reviews posted by other consumers. To me, the real value of these new techniques is in helping researchers add shape and color to data gleaned from their more traditional approaches.
I was not surprised to learn earlier this year that Yelp! has had a problem with fake reviews and that TripAdvisor is under investigation in the U.K. over fake reviews. Apparently the problem with TripAdvisor is so rampant that many feel the site (which is designed to be a review site) is worthless. As more and more companies pursue social media marketing strategies, the problem is likely to get worse. And it isn’t just fake positive reviews: competitors have been known to post fake negative reviews as well.
So what’s a researcher to do?
Luckily, some new tools seem to be coming to the rescue. Researchers at Cornell University have developed software designed to spot fraudulent reviews (humans are not very good at this because we suffer from a truth bias). In an initial test, the software analyzed 800 Chicago hotel reviews and picked out the fake ones 90 percent of the time. (Apparently fake reviews use more verbs than legitimate ones.) Until review sites start using this type of software to screen their consumer-generated reviews, it makes sense to be cautious. You don’t want to stake your reputation as a researcher on false data.
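To make the verb-frequency idea concrete, here is a toy sketch of how a verb-density check on a review might look. To be clear, this is purely illustrative: the verb list, the `0.18` threshold, and the function names are my own assumptions, not the Cornell team’s actual system, which was a statistical model trained on labeled reviews rather than a hand-written rule.

```python
# Toy illustration of the "fake reviews use more verbs" signal.
# The verb list and threshold below are arbitrary assumptions for
# demonstration only -- not the Cornell researchers' trained model.

# A tiny hand-picked set of common English verbs (an assumption;
# a real system would use a part-of-speech tagger).
COMMON_VERBS = {
    "is", "was", "were", "be", "had", "have", "went", "stayed",
    "loved", "felt", "recommend", "visited", "enjoyed", "want", "go",
}

def verb_density(text: str) -> float:
    """Fraction of words in `text` that appear in COMMON_VERBS."""
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    return sum(1 for w in words if w in COMMON_VERBS) / len(words)

def looks_suspect(text: str, threshold: float = 0.18) -> bool:
    """Flag a review whose verb density exceeds an (assumed) threshold."""
    return verb_density(text) >= threshold

print(verb_density("The room was clean"))                      # 1 verb in 4 words
print(looks_suspect("I loved it and want to go again"))        # verb-heavy
print(looks_suspect("Spacious room, great view, friendly staff"))  # noun-heavy
```

A heuristic like this would of course misfire constantly on its own; the point is only that word-class frequencies are machine-countable features, which is why software can catch deceptive language that human readers, with our truth bias, tend to miss.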