A slight change in the route

Editor’s note: Lori Laflin is senior market research analyst at the Minnesota Department of Transportation, St. Paul. Michele Hanson is research manager at Readex Research, Stillwater, Minn.

Created by the state Legislature in 1976, the Minnesota Department of Transportation (Mn/DOT) develops and implements policies, plans and programs for highways, railroads, commercial waterways, aeronautics, public transit and motor carriers throughout the state. The vision is to develop a transportation system that meets the needs of Minnesotans while considering available resources and local guidelines.

Understanding customer needs, perceptions and expectations - and creating performance measures to track progress against them - is vital to the department’s tactical and strategic decision-making.

That’s where the Mn/DOT market research function comes in. To provide this information, it conducts a wide variety of ad hoc and tracking studies, including the maintenance business planning (MBP) study. Conducted for Mn/DOT’s Office of Maintenance, the study examines highway maintenance from the public’s perspective.

In 2005, after it was determined that a change needed to be made to the maintenance business planning study, Mn/DOT contracted with Stillwater, Minn.-based Readex Research to collaborate on the redesign and deliver the new information the Office of Maintenance needed.

The Office of Maintenance is responsible for maintaining and preserving Minnesota highways so they are safe, structurally sound, convenient to use and aesthetically pleasing. Ongoing informational objectives of the business planning study are:

  • to maintain trend data on the importance of maintenance products and services, perceived performance on those products and services, and how the public would allocate funding among individual services; and
  • to measure any new changes in public perception about maintenance products and services.

The study asks customers to rate importance and satisfaction overall and with these 15 specific maintenance activities: clearing roads of debris; clearing roads of ice and snow; keeping roads in similar condition statewide; ensuring road shoulders are in good condition; ensuring road surfaces are smooth and comfortable to drive on; eliminating weeds on the roadside; keeping the plants, grasses and flowers by the roadside looking good; removing litter and trash by the roadside; ensuring stoplights and stop signs are clearly visible and working; ensuring highway signs are clearly readable; ensuring guardrails are in working condition (undented and whole); ensuring road stripes and markings are clearly visible; ensuring roadway lighting works; keeping rest areas safe, clean and attractive; providing current information on unplanned or emergency highway conditions.

Fund allocation

The MBP also includes a fund allocation section, which serves as another way to understand the importance of maintenance service areas. Respondents are asked to make some decisions about how they would allocate $100 among three general types of road maintenance:

  • maintaining the road surface (snow and ice removal, keeping pavement smooth and keeping road stripes clearly visible);
  • providing motorist services (road signs and traffic signals, upkeep for rest areas and providing current information on unplanned or emergency road conditions);
  • maintaining roadsides (keeping plants and grasses neat and attractive, removing any trash or litter and eliminating weeds from the roadside).

With this information, the department was able to develop a quadrant chart showing areas for increased emphasis (the convergence of low satisfaction and high importance) and decreased emphasis (the convergence of high satisfaction and low importance).
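A quadrant classification of this kind can be sketched as follows. The cut points, service names and ratings below are invented for illustration; they are not Mn/DOT’s actual values:

```python
# Hypothetical sketch of the importance/satisfaction quadrant logic.
# Service names, ratings and cut points are invented, not Mn/DOT data.

def quadrant(importance, satisfaction, imp_cut, sat_cut):
    """Classify a service area relative to chosen midpoints on the rating scale."""
    if importance >= imp_cut and satisfaction < sat_cut:
        return "increase emphasis"   # high importance, low satisfaction
    if importance < imp_cut and satisfaction >= sat_cut:
        return "decrease emphasis"   # low importance, high satisfaction
    if importance >= imp_cut:
        return "maintain"            # high importance, high satisfaction
    return "monitor"                 # low importance, low satisfaction

# Example with made-up mean ratings on a 10-point scale:
ratings = {
    "clearing ice and snow": (9.2, 8.1),
    "removing roadside litter": (5.4, 7.8),
}
for area, (imp, sat) in ratings.items():
    print(area, "->", quadrant(imp, sat, 7.0, 7.0))
```

In practice the cut points are often the sample means or medians of the importance and satisfaction ratings rather than fixed values.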

What level of service?

The quadrant chart was very helpful but it left a key dimension unanswered: What level of service does the public require? To help allocate resources in a time of tight budgets, Mn/DOT decided it would be very helpful to understand what the public believes are acceptable levels of service.

Similar to the performance and importance questions, this new question would ask respondents to use a 10-point scale to rate the level of performance that would fully meet their needs, given that Mn/DOT must work within a budget while still ensuring that customers’ needs are met. (A score of 10 would mean Mn/DOT needs to do an extremely good job in the area, 1 would mean it could do an extremely poor job in the area and still meet the respondent’s needs, and 5 would mean it would need to do an average job to meet the respondent’s needs.)

Telephone interviews

A random sample of Minnesota telephone numbers was used to collect 1,001 telephone interviews, conducted by Minneapolis-based Market Solutions Group, between February 24 and March 7, 2005. Average interview length was 16 minutes; the incidence for qualified respondents was 89 percent, and the refusal rate was 23 percent. The margin of error for percentages based on the 1,001 total tabulated responses is ±3.5 percent at the 95 percent confidence level.
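For reference, the textbook simple-random-sampling margin of error at 95 percent confidence for n = 1,001 works out to about ±3.1 percentage points in the worst case (p = 0.5); the slightly larger ±3.5 percent reported above is consistent with an upward adjustment for the design effect of stratification and weighting, though that is our assumption. A minimal sketch:

```python
import math

# Worst-case (p = 0.5) margin of error at 95 percent confidence for n = 1,001.
n = 1001
z = 1.96  # normal critical value for 95 percent confidence
moe = z * math.sqrt(0.5 * 0.5 / n)
print(round(moe * 100, 1))  # prints 3.1 (percentage points)
```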

The sample was stratified by district (there are eight districts which cover the state of Minnesota) to optimize statistical precision for anticipated segment-level analyses. Responses were weighted in tabulation to accurately reflect true population proportions.
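The weighting step can be sketched as a simple post-stratification adjustment: each district's weight is its population share divided by its share of completed interviews. The district names and proportions below are invented for illustration; they are not Mn/DOT's actual figures:

```python
# Hypothetical post-stratification weights by district.
# Shares are invented for illustration, not Mn/DOT data.

sample_share = {"Metro": 0.30, "Duluth": 0.10}      # share of completed interviews
population_share = {"Metro": 0.55, "Duluth": 0.05}  # share of state population

# weight = population share / sample share, per district
weights = {d: population_share[d] / sample_share[d] for d in sample_share}
# A Metro respondent counts for about 1.83 respondents; a Duluth respondent for 0.5.
```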

Readex Research managed the design and fielding, analyzed the data and prepared the written report. Previous quantitative waves were conducted in 1994, 1996 and 2000 by other research firms.

Apprehensions

Even though there were apprehensions about changing a tracking questionnaire, the need for this information outweighed the possible impact on tracking.

Inserting the new question series after the satisfaction ratings seemed logical. The new questionnaire order would be:

1. importance

2. satisfaction

3. NEW - acceptable level

4. fund allocation

Yet we were concerned that if the trended fund allocation data changed, we would be unable to determine whether the differences were due to shifts in public opinion or simply to the questionnaire order. Given this, we considered placing the new question at the end of the revised questionnaire:

1. importance

2. satisfaction

3. fund allocation

4. NEW - acceptable level

Because our preference was to place the acceptable-level question immediately after the satisfaction ratings in future studies for better flow, we decided to test whether inserting this new question after the satisfaction ratings and before the fund allocation section would affect how respondents allocated funds. To do so, respondents were randomly divided into two halves (balanced on key demographics) and the final questionnaire order was determined to be:

1. importance

2. satisfaction

3. NEW - acceptable level (half of the respondents)

4. fund allocation

5. NEW - acceptable level (half of the respondents)
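The random split can be sketched as below. This is our simplified illustration - a bare random shuffle - and ignores the demographic balancing the study actually performed:

```python
import random

# Simplified sketch of split-sample assignment (our illustration, not
# Mn/DOT's actual procedure, which also balanced on key demographics).
random.seed(7)  # fixed seed so the split is reproducible
respondents = list(range(12))
random.shuffle(respondents)
half_a = set(respondents[:6])  # asked acceptable level before fund allocation
half_b = set(respondents[6:])  # asked acceptable level after fund allocation
```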

No significant differences

It was with great relief that we discovered that there were no statistically significant differences in fund allocation based on the question location, as shown in Table 1.
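A comparison of the kind behind Table 1 can be sketched with Welch’s two-sample t statistic on the dollar amounts each half allocated to a category. The figures below are invented for illustration, not survey data:

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variance of a
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)  # sample variance of b
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Made-up dollar allocations to "maintaining the road surface" from each half:
before = [60, 55, 70, 65, 50, 62]   # rated acceptable levels first
after  = [58, 57, 68, 63, 52, 60]   # allocated funds first
t = welch_t(before, after)
# |t| well below ~2 suggests no significant difference at the 5 percent level.
```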

We did, however, find something completely unexpected - a difference in acceptable level ratings. Average acceptable levels for most of the service areas were lower for those who were asked to allocate funds before rating the acceptable levels than for those who were rating acceptable levels before allocating funds.

Table 2 shows the average results for those who rated acceptable levels before allocating funds versus those who allocated funds before rating acceptable levels. Statistically significant differences are highlighted in yellow.

Affected the outcome

Based on these unexpected differences, it seems logical to conclude that question order affected the results: those who had first allocated funds (and were therefore considering budgetary constraints) were more willing to accept lower service levels than those who allocated funds after rating acceptable service levels.

And, with the help of the new acceptable level question, the results indicated that, statewide, customers believe Mn/DOT should provide additional attention to three key areas:

1. clearing roads of ice and snow;

2. keeping road surfaces smooth and comfortable to drive on;

3. making road stripes and markings clearly visible.

These service areas had the highest levels of acceptable performance along with the largest gaps between acceptable and actual performance.
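The gap logic described above can be sketched as follows; all scores are invented illustrations on the study’s 10-point scale, not actual survey results:

```python
# Hypothetical gap analysis: acceptable minus actual performance.
# Scores are invented illustrations, not Mn/DOT survey results.

scores = {
    "clearing roads of ice and snow": {"acceptable": 9.0, "actual": 7.8},
    "smooth road surfaces":           {"acceptable": 8.6, "actual": 7.5},
    "visible stripes and markings":   {"acceptable": 8.4, "actual": 7.4},
    "removing roadside litter":       {"acceptable": 6.0, "actual": 6.5},
}
gaps = {k: v["acceptable"] - v["actual"] for k, v in scores.items()}
# Rank by gap size; positive gaps mark areas falling short of expectations.
ranked = sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)
```

Areas combining a high acceptable level with a large positive gap are the candidates for additional attention.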

Without the added understanding about acceptable performance levels, results of a quadrant analysis alone would have shown a slightly different scenario. For example, clearing roads of ice and snow would not have been considered an area for improvement but rather a strength because it was rated highly for both importance and satisfaction. However, by comparing the acceptable levels to actual performance, it’s clear that even though Mn/DOT is currently doing a satisfactory job in this area, improvements could still be made to meet the minimum acceptable level of performance for the average customer.

Another dimension

As hoped, this new question brought another dimension to the analysis by deepening our understanding of the minimum expectations of Mn/DOT’s customers. It also brought the unexpected benefit of concrete evidence of the importance of strategic question placement and its relationship to potential positional bias.