Mailed it
Editor's note: Joe Hopper is president of Versta Research, Chicago.
Who would think that old-fashioned paper surveys, mailed through the U.S. postal service, are a good way to conduct customer satisfaction research? Not us – until our firm faced a rather difficult need for research and we could think of no better way.
To our happy surprise, it worked! And not only did it work but it easily outperformed other modes of research typically used these days. Our traditional paper-based mail survey got a 25% response rate – with no additional attempts, outreach or reminders.
A seemingly super-simple customer satisfaction survey is now one of the most interesting and memorable research efforts of my career.
Here is the story of that survey – why we did it on paper, the problems we solved, the steps we took to conduct it and why it succeeded beyond what we had hoped.
We hope it provides some valuable insights for you about how to design and execute surveys when facing situations you may have never encountered before.
True melting pot
Our client was Devon Bank, a Chicago community bank located in an urban neighborhood that continually transforms as new waves of immigrants settle into the city, assimilate and make way for new populations from other countries. Its retail customers speak over 30 native languages besides English. They come from countries in Eastern Europe, the Middle East, Asia and Latin America – a true melting pot of cultures, languages and religions from around the world.
The bank did not have good e-mail addresses or phone numbers for most customers and, even if it did, we worried about a research firm reaching out to them in today’s political climate. What the bank did have, of course, were postal addresses being used every month for sending account statements.
On top of that, Devon Bank is small. With only 4,000 customers in the specific market we wanted to survey, there was no opportunity to think about sampling or potential stratification. Research industry response rates are exceedingly low these days, in the neighborhood of 1 to 3 percent. Our goal for a minimum sample size is generally 300. That would mean reaching out to every customer and achieving a 7.5% response rate. Yikes.
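For anyone who wants to check that math, the back-of-the-envelope arithmetic, using only the figures above, is simply:

$$
\text{required response rate} \;=\; \frac{\text{minimum sample size}}{\text{customers available}} \;=\; \frac{300}{4{,}000} \;=\; 7.5\%
$$

That is several times the 1-to-3 percent norm – which is why it gave us pause.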
To make that happen (if it could happen) we thought through every last detail of making the survey easy, attractive and trustworthy. We ended up designing a research approach quite specific to this unique population and challenge.
Paper survey. With no e-mail addresses or phone numbers, our best option seemed to be a paper survey sent through the mail. We considered in-person surveys, or distributing surveys at bank branches, but the sporadic nature of in-person visits made that impractical. Regular U.S. mail from the bank would be expected, welcome and probably opened, so that was our choice. We decided to make this mailing special by sending it separate from the monthly statements, along with a postage-paid return envelope. We hoped to grab attention with an excellent layout, warm invitation and generous incentive.
Tested design. The last time I designed and conducted a paper survey was back in graduate school while working for the university’s office of policy and planning. That was a long time ago. But there are still people who know a lot about designing and rigorously testing and refining excellent surveys on paper: the U.S. Census Bureau. So we printed out a copy of the American Community Survey and it became our blueprint for how to ask questions on paper. Then we gave it to our graphic design team and asked them to create the exact same look and feel for this unique mode of administration.
Sincere invitation. The survey came with a letter directly from the bank president describing why the bank was doing the research and how it would be valuable to customers (not to the bank). It also offered to pay respondents for their time. My favorite sentence was this: “An open dialogue between you and the Bank is an essential part of building and maintaining a strong relationship.” This approach is the opposite of what we often see in survey invitations, like the obnoxious one I received just last week: “Your Help Is Key to Our Success.”
Good incentive. Customers were offered $5 for completing the survey, which the bank would deposit directly into their accounts. That seemed like an attractive amount for a non-affluent population and for a simple five-minute task. Looking back, perhaps we could have paid less (maybe $2) and easily hit our target of n=300. But we also knew there were no second chances. Unlike e-mail, which is nearly instantaneous and allows for easy testing, we could not recalibrate and adjust once we launched. We all agreed that paying a good incentive was a commitment worth making upfront.
Super-short. Devon Bank had never surveyed its customers and it wanted to know a lot. But filling out surveys is a burden, and given all the resistance we had to overcome – with no second chance if we failed – we argued for a very short survey. In the end, we asked 16 questions, which took respondents roughly five minutes to answer, laid out on two pages (the inside-facing pages of a 17x11-inch folded sheet). It was short enough to keep respondents engaged, yet we still got detailed data for a rich analysis of satisfaction, importance of services, age differences, banking with competitors and much more.
Multiple languages. Our survey ultimately documented 30 different languages among the bank’s customers – and that was with only one-quarter of customers responding. Translating into all of those languages was not financially feasible, nor did we know up front exactly what all the languages might be. So bank staff estimated which languages were most common and we focused on the top five: English, Russian, Arabic, Spanish and Hindi. The mailed invitation included prominent call-outs in each of these languages explaining the survey (and the $5 incentive) and providing instructions on how to access the non-English versions.
Online option. We had to offer an online version for one big reason: the survey existed in five languages and there was no way of knowing which version should be sent to whom. Nor was it practical to mail every version to everyone, because the mailing needed to be clean and inviting. So we translated and programmed the full survey online, accessible directly from the bank’s website. Each mailed survey included a unique five-digit access code, which we intentionally did not call a PIN in order to avoid any confusion with banking-related PINs.
Research company invisibility. Sometimes highlighting the involvement of an outside firm enhances the credibility of a survey. It can offer a reassuring promise that even negative feedback is welcome and useful. For this population, however, we expected sensitivity around a third party collecting data and therefore decided that all communications should come from, and return to, the bank. The invitation and survey were on bank letterhead and the postage-paid envelope went back to the bank as well. We did not promise respondents anonymity but the bank agreed to let us manage the data and strip out personally identifiable information in the process.
Nervous for weeks
We were nervous for weeks after the mail drop. One or two came back each day. We fretted, knowing that there was no way to remediate if our plan didn’t work – no easy way to send reminders, boost the incentive, reach out to more sample or make more phone calls. For three weeks it seemed we might be looking at failure.
Then our contact at Devon Bank called: “We just got 500 surveys!” No, we did not believe it. As good researchers, we searched for disconfirming evidence by considering all the alternative explanations of what had happened. Perhaps the 500 surveys were returned as bad addresses (but surely we must have a good list – the bank mails statements every month!). Perhaps the printer mailed back the overage and these surveys were blank (but we confirmed the printer had only a handful and they did not ship them back).
Well, it was true: they really did get 500 surveys, and more than 500 more arrived after that – roughly 1,000 completed surveys from 4,000 customers. Our old-fashioned paper-based mail survey got a 25% response rate, with no additional attempts, outreach or reminders.
When I shared the final research report with the bank’s executive committee, the CFO jumped in quickly to ask why they got such a great and unexpected response rate. I answered by segueing into the results of the survey: they had strong relationships with customers; their customers really liked them; they had not poisoned the well by nagging customers with a survey after every transaction.
I should have taken some credit as well. We knew the challenges we faced. We thought through every possible approach and addressed every point of resistance we could anticipate. We brought up-to-date knowledge and expertise and applied it to some old-fashioned techniques in novel ways.
Yes, old-fashioned paper surveys are still a viable option. In some cases they may be the only option. If you know what you are doing, you can make them as successful as online surveys – and maybe even beat typical response rates by 22 percentage points.