Editor’s note: Ken Green is an assistant professor at Henderson State University, Arkadelphia, Ark. Bobby Medlin is associate professor at University of Arkansas at Monticello. Dwayne Whitten is an assistant professor at Ouachita Baptist University, Arkadelphia.
Survey instruments serve as the primary means of data collection for organization researchers. Data collection using survey instruments is both costly and time-consuming, and the phenomena under observation by organization researchers change rapidly, making it desirable to shorten the research publication process. Dickson and Maclachlan (1996) compared the productivity of fax and mail survey methodologies. They found that data collected using the fax survey method was consistent with data collected using the more traditional mail survey method and that fax responses were received more quickly and in greater numbers. The purpose of this study is to identify and investigate the use of a third alternative: Internet surveys.
Internet and mail survey methodologies are compared on the basis of data consistency, speed of response, response rate, and cost. Advantages and disadvantages of the Internet and mail alternatives are identified and discussed.
Researchers using survey instruments to collect data seek to 1) improve response rates, 2) shorten the time required for data collection, and 3) reduce the cost of data collection. The use of an Internet surveying methodology offers possibilities for improvement in all three areas. Investigation of the efficacy of Internet surveying is necessary to determine if, in fact, response rates can be improved, time can be saved, and costs can be reduced.
The general purpose of this study is to provide an empirical evaluation of the use of Internet surveying methodology compared to a mail-based methodology. The objectives of this study are to compare the data consistency between Internet and mail survey methodologies and to compare response rates, response times, and costs associated with the two methodologies. The comparisons are necessary to identify the efficacy of the Internet survey methodology.
The study provides researchers with information to assist in the selection of a survey-based data collection methodology. Dickson and Maclachlan's (1996) results support the use of a fax survey methodology as a viable alternative to mail surveying. This study provides information relating to a third methodology, Internet surveying.
Literature review
Yu and Cooper (1983) conducted a comprehensive literature review of techniques used to increase response rates to questionnaires. In general, they found that, as personal contact, the use of incentives, and the application of follow-up measures increase, response rates increase. While personal contact, incentives, and follow-up increase response rates, they also increase costs. Their review included 389 mail surveys with a weighted average response rate of 47.3 percent and a standard deviation of 19.6 percent. Dickson and Maclachlan (1996) sought to compare fax and mail survey methodologies. They found that the data collected was consistent across methods and that the fax methodology yielded improvements in response rate and response time. Dickson and Maclachlan (1996) contend that surveying by fax is less costly than mail surveying, but they offer no specific analysis to support their contention.
Technology supporting the Internet has advanced rapidly, making it possible to collect data using an Internet survey methodology. This newer methodology remains untested. This study compares mail and Internet survey methodologies. It is hypothesized that 1) data quality will be consistent across methods, 2) Internet survey methodology will yield a significantly higher response rate than mail surveys, 3) Internet survey methodology will yield a significantly faster response time than the mail methodology, and 4) Internet survey methodology will cost significantly less than the mail methodology.
Methodology
A questionnaire was developed for the purpose of collecting data related to the use of peer evaluations by AACSB-accredited schools of business. The sample frame was systematically divided into two groups. The questionnaire was administered to one group using a traditional mail survey methodology; the second group was asked to respond electronically through the Internet. MANOVA was used to ascertain data consistency. Response rates and times and costs were assessed and compared.
The questionnaire was constructed to collect data on the use of peer evaluations as part of the business faculty evaluation process; the peer evaluation data will be analyzed in a subsequent study. A sample frame of approximately 350 Association to Advance Collegiate Schools of Business-accredited business schools was identified using the AACSB 98/99 Membership Directory. The sample frame was systematically divided into two groups. Deans of business schools in the first group were mailed a peer review questionnaire; deans of schools in the second group were e-mailed a request to respond to the questionnaire at a specified Internet address.
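The systematic split described above can be sketched as follows. The frame size and school labels are placeholders, not the actual AACSB directory entries:

```python
# Hypothetical sketch of the systematic division: every other entry in the
# sample frame goes to the mail group, the rest to the Internet group.
frame = [f"School {i}" for i in range(1, 351)]  # ~350 accredited schools

mail_group = frame[0::2]      # 1st, 3rd, 5th, ... entries
internet_group = frame[1::2]  # 2nd, 4th, 6th, ... entries

print(len(mail_group), len(internet_group))  # 175 175
```

Alternating assignment keeps the two groups the same size and preserves whatever ordering the directory uses, so any ordering effects fall evenly on both groups.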
Paper questionnaires were mailed on a Friday afternoon, and e-mail messages were sent the following Tuesday morning in an attempt to approximately match receipt times. The paper questionnaire was headed by an appeal to respond, and each was accompanied by a self-addressed, stamped return envelope. The e-mail message sent to deans in the Internet survey group included an appeal to respond to the electronic questionnaire posted at a specified Internet address.
Results
Some support for Hypothesis 1 (data quality will be consistent) was found. The SAS MANOVA procedure was used to compare data from the mail survey to data from the Internet survey. The Wilks' Lambda, Pillai's Trace, Hotelling-Lawley Trace, and Roy's Greatest Root tests all returned a Pr > F value of .5117, indicating no significant difference between the mail and Internet data. This comparison of data sets used only data from questionnaires with yes responses to Question 1, which asked whether peer evaluations were included as part of faculty evaluations. Fifty-eight percent of the Internet respondents and 49 percent of the mail respondents answered yes to Question 1.
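For a two-group comparison such as this one, the four MANOVA statistics are functions of the same quantity, and Wilks' Lambda takes a simple form: the ratio det(E)/det(E + H) of the within-group to the total sum-of-squares-and-cross-products matrices. A minimal sketch with made-up two-variable toy data (the actual survey responses are not reproduced in the article):

```python
def mean_vec(rows):
    """Column means of a list of equal-length observation tuples."""
    n = len(rows)
    return [sum(r[j] for r in rows) / n for j in range(len(rows[0]))]

def sscp(rows, center):
    """Sum-of-squares-and-cross-products matrix of deviations from `center`."""
    p = len(center)
    m = [[0.0] * p for _ in range(p)]
    for r in rows:
        d = [r[j] - center[j] for j in range(p)]
        for i in range(p):
            for j in range(p):
                m[i][j] += d[i] * d[j]
    return m

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def wilks_lambda(group_a, group_b):
    """Wilks' Lambda = det(E) / det(E + H) = det(E) / det(T) for two groups."""
    p = len(group_a[0])
    e = [[0.0] * p for _ in range(p)]
    for g in (group_a, group_b):
        w = sscp(g, mean_vec(g))           # within-group SSCP
        for i in range(p):
            for j in range(p):
                e[i][j] += w[i][j]
    t = sscp(group_a + group_b, mean_vec(group_a + group_b))  # total SSCP
    return det2(e) / det2(t)

# Toy data: two response variables per school, one tuple per respondent.
mail = [(4, 3), (5, 4), (3, 3), (4, 5)]
web = [(4, 4), (5, 3), (3, 4), (5, 4)]
print(round(wilks_lambda(mail, web), 3))  # 0.974 -- near 1, groups barely differ
```

A Lambda near 1 means the group means explain almost none of the multivariate variation, which is the pattern consistent with the nonsignificant Pr > F the study reports.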
Support for Hypothesis 2 (higher response rate for Internet methodology) was not found. The response rate for Internet surveys was 24.54 percent; the rate for mail surveys was 30.11 percent. While both response rates are reasonable, the Internet methodology did not surpass the mail methodology.
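Whether the gap between the two response rates is itself significant can be checked with a pooled two-proportion z test. The counts below are illustrative assumptions chosen to roughly match the reported rates; the article does not give the exact group sizes:

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Illustrative counts only -- they reproduce rates near the reported
# 24.54 percent (Internet) and 30.11 percent (mail), but the true
# denominators are assumptions.
z = two_proportion_z(40, 163, 53, 176)
print(round(z, 2))  # about -1.15
```

At these assumed counts, |z| falls well short of the 1.96 cutoff, so the mail advantage in response rate would not be statistically significant at the .05 level.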
Support for Hypothesis 3 (faster response time for the Internet methodology) was found. The average response time for Internet respondents was 2.45 days; the average for mail respondents was 11.85 days. Figures 1 and 2 illustrate the return patterns for the alternate survey methodologies.
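The significance of a mean-difference result like this one is typically assessed with a two-sample t test. A sketch using Welch's unequal-variance form and toy response times chosen to echo the reported means (the per-respondent times are not published):

```python
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)  # sample variance of b
    return (ma - mb) / sqrt(va / na + vb / nb)

# Toy response times in days, assumed for illustration; their means (2.5
# and about 11.8) sit near the reported 2.45 and 11.85 days.
internet_days = [1, 2, 2, 3, 3, 4]
mail_days = [8, 10, 12, 12, 13, 16]
t = welch_t(internet_days, mail_days)
print(round(t, 2))  # strongly negative: Internet responses arrive much sooner
```

Even with these tiny illustrative samples the statistic is far past any conventional critical value, which matches the study's finding of a significantly faster Internet response time.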
Support for Hypothesis 4 (lower cost for Internet methodology) was found. Each mail-out required a stamped and addressed envelope, a three-page questionnaire, and a self-addressed, stamped return envelope. None of these costs were incurred during the electronic surveying process. Preparation costs for the two methods were considered to be approximately equal.
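The cost argument is simple arithmetic: every per-packet expense in the mail condition multiplies across the group, while the Internet condition has none of them. A back-of-the-envelope sketch in which every unit price and the group size are hypothetical placeholders (the article reports no dollar figures):

```python
# All figures below are assumptions for illustration only.
n_mailed = 175                  # assumed group size (~350 frame split in two)
postage_out = 0.33              # outbound stamp
postage_return = 0.33           # stamp on the self-addressed return envelope
envelopes = 0.10                # two envelopes per mail-out
printing = 0.15                 # three-page questionnaire

per_packet = postage_out + postage_return + envelopes + printing
mail_cost = n_mailed * per_packet
internet_cost = 0.0             # no per-recipient materials or postage

print(f"mail: ${mail_cost:.2f}, Internet: ${internet_cost:.2f}")
```

Whatever the actual unit prices, the materials-and-postage term scales linearly with the number of mail-outs and simply drops out of the Internet condition, which is the substance of the hypothesis.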
Conclusions
Results indicate that the Internet survey methodology may be a viable alternative to the more traditional mail survey methodology. Data collected was found to be consistent across the two methods. While the mail response rate exceeded the Internet rate, both rates were above 20 percent. The average response time was significantly shorter for Internet respondents, and the Internet process was significantly less costly overall. Certainly, additional investigation into the efficacy of an Internet survey methodology is warranted. Replication of this study is necessary, and the Internet methodology should be revised in an attempt to garner higher response rates.
References
Armstrong, J. S. & Overton, T. S. (1977). Estimating Nonresponse Bias in Mail Surveys, Journal of Marketing Research, 14, 396-402.
Dickson, J. P. & Maclachlan, D. L. (1996). Fax Surveys: Return Patterns and Comparison With Mail Surveys, Journal of Marketing Research, 33, 108-113.
Lane, D. & Maxfield, R. (1996). Strategy under Complexity: Fostering Generative Relationships, Long Range Planning, 29, 215-231.
Yu, J. & Cooper, H. (1983). A Quantitative Review of Research Design Effects on Response Rates to Questionnaires, Journal of Marketing Research, 20, 36-44.