Editor’s note: In conjunction with his BOSS Academy Radio podcast, Paul Kirch, CEO of Actus Sales Intelligence, a Fort Worth, Texas, business and sales consulting agency, is interviewing authors, marketers and marketing researchers on a wide range of topics. By special arrangement, we’ll periodically feature edited recaps in the e-newsletter, including portions of the conversations that touch on research-related topics.
As regulations surrounding outbound cold-calling increase and researchers continue to look for a fresh platform – outside of online panels – for gathering survey participants, Scott Richards, chairman of Dial800 and CEO of Reconnect Research, is turning to misdialed inbound calls as one marketing research survey sampling solution.
In an interview with Paul Kirch for BOSS Academy Radio, Richards discussed how marketing researchers can leverage misdialed, incomplete or disconnected inbound calls. When one of these inbound calls comes in, the caller is screened and routed to a marketing research or polling survey via IVR, text and live transfer. In addition to sharing the process behind implementing this method and citing several case studies, Richards touched on the strengths and weaknesses found when using inbound calls, as well as his hopes for its future in marketing research.
Paul Kirch: You were talking to me offline about MIDI calls, which is an acronym for misdialed, incomplete, disconnected inbound calls. These calls are basically getting thrown away and not being leveraged, and you’ve found a way to capture data from them and use it for marketing research purposes. Talk a little bit about this process …
Scott Richards: It really started with misdialed toll-free calls. If somebody accidentally dialed a two instead of a one in the 10 digits they were trying to dial, then instead of getting their plumber or phone company they would get one of our [Dial800] numbers. We have about half a million numbers, so when callers reached one of them we would play messages. We started testing this with market research.
We simply said, "Hey, would you do a market research survey?" and a certain percentage of people said they would. People are already on the phone, so as opposed to throwing [callers] away … offer to do a market research study or to answer a political poll and a certain percentage of them will.
You've done some tests, but is the concept, "Hey, we're going to have multiple clients and multiple surveys, and we could be screening at any given time and sending people to the appropriate survey," or is it going to be more specifically tailored to an individual project? How are you envisioning this or how are you currently working on it?
Ideally [researchers would] have a simple screener. Let's say I have five projects for women [and] five projects for men, so if they ask for a man or a woman at the top, we know which pipeline we can start sending them to. We use the IVR to screen people.
The screener can screen for the demographic profile, the geographic profile, the time of day, the day of week – any part that we want to screen for – and, ideally, have multiple low-incidence-type surveys at the top and very high-incidence surveys at [the] bottom.
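To make that routing idea concrete, here is a minimal, hypothetical sketch – not Dial800’s or Reconnect Research’s actual system – of how a screener might match a caller’s answers against a list of open surveys, checking harder-to-fill, low-incidence projects before broad ones. All project names, fields and quotas are illustrative assumptions.

```python
# Hypothetical sketch of an IVR-style screener router; all project names,
# fields and quotas are illustrative assumptions, not an actual system.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Survey:
    name: str
    gender: Optional[str] = None   # None = any gender qualifies
    states: Optional[set] = None   # None = any geography qualifies
    quota_remaining: int = 0

    def matches(self, caller: dict) -> bool:
        """Check quota, then each screening criterion the survey specifies."""
        if self.quota_remaining <= 0:
            return False
        if self.gender and caller.get("gender") != self.gender:
            return False
        if self.states and caller.get("state") not in self.states:
            return False
        return True

# Low-incidence (hard-to-fill) projects listed first, broad ones last.
open_surveys = [
    Survey("rural_women_health", gender="F", states={"MT", "WY"}, quota_remaining=25),
    Survey("mens_auto_study", gender="M", quota_remaining=200),
    Survey("general_political_poll", quota_remaining=5000),
]

def route_caller(caller: dict) -> Optional[Survey]:
    """Return the first open survey the caller qualifies for, or None."""
    for survey in open_surveys:
        if survey.matches(caller):
            survey.quota_remaining -= 1
            return survey
    return None

# Example: a male caller from Texas falls through to the men's study.
print(route_caller({"gender": "M", "state": "TX"}).name)
```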
We work in an industry that has historically done a great job of exhausting panelists and sample pulls. This would allow researchers to have a fresh perspective, fresh pull. I'm assuming that there's an opportunity to recruit active panelists or members for further research.
In order to keep people after they have done [a survey], when [participants who have been offered] the reward go to claim it, [they are presented with] an opportunity to push a button that says, "Would you like to do future surveys?" Our hope is that a certain percentage of people will decide they would like to do that and we can start building a panel.
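As a rough illustration of that opt-in step, the following sketch assumes a hypothetical keypress capture at the reward-claim stage; the keypress value, caller record and storage are assumptions, not a described implementation.

```python
# Hypothetical sketch of capturing the post-reward opt-in keypress to seed
# a panel list; the keypress value, caller record and storage are assumed.
panel_members = []  # stand-in for a real panel database

def handle_reward_claim(caller: dict, keypress: str) -> None:
    """After the reward is claimed, pressing '1' opts the caller into future surveys."""
    if keypress == "1":
        caller["opted_in"] = True
        panel_members.append(caller)

handle_reward_claim({"phone": "+15555550123"}, keypress="1")
print(len(panel_members))  # 1
```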
Are you seeing some areas that work better than others or that you think are going to work better than others?
We were asked by a particular research firm to find people who lived in just a few zip codes within an area of the country. The incidence rate was the proverbial needle in the haystack. We said, "No, we're not going to do this" – the chance of finding [participants] was very small.
The adoption curve is always, "This is risky," and then, "This is new," and then, "Oh wow, this really works." Then it becomes industry standard. It’s exactly the same thing that happened with the Internet. Our job right now is to prove that this is something that can be relied on.
You can listen to the entire interview at www.bossacademy.com.