A strong connection
Editor’s note: Robert Brass is president and co-founder of Development II, Inc., a Woodbury, Conn., research firm.
George Kerns is the senior executive in charge of Network Operations at GTE Internetworking (GTEI), Cambridge, Mass. His organization is responsible for the installation, maintenance and operation of the Internet services that GTE provides for business customers. These consist primarily of three major ISP (Internet service provider) offerings: Internet Advantage, Site Patrol and Web Advantage. Internet Advantage provides customers with an Internet connection via a router and a dedicated line. Site Patrol is a monitored Internet security service that watches for and intercepts unauthorized external access. Web Advantage is a Web hosting service for customers’ Web sites.
Kerns has a passion for quality. He has focused his organization’s efforts on continually raising quality and customer satisfaction. While many companies have an integrated process for increasing customer satisfaction and improving the quality of their product or service, few have attained the level of success that GTEI has reached in less than two years.
The ongoing improvement is achieved through a twofold process: first, an external customer satisfaction survey and analysis technique that acts as the "meter" for the activity and, second, an internal process that uses this information in a highly disciplined way to provide transactional and systemic corrective actions.
To illustrate, we’ll look at GTEI’s Internet Advantage service. The coordinating department throughout the installation and operation of the Internet connection provided through Internet Advantage is the Network Operations Center (NOC), part of the Network Operations organization. Director of Customer Provisioning Steve Zajac’s group is responsible for the initial installation of the service, while Director of Customer Care Jim McLaughlin’s group operates and maintains the network connection.
The customer satisfaction survey
While there are internal measures and standards that define operational expectations, the process is primarily driven by customer reaction. The basic source of customer opinion is a customer satisfaction and quality survey. A customer is initially surveyed 30 days after service begins to evaluate the quality and timeliness of the installation, plus the initial reaction to the service. In addition, several other measurements assess the perception of customer support, the effectiveness of communications and other key attributes the customer deems important.
Ninety days after the installation, a second survey is conducted, focused primarily on quality and performance. From that time on, each site (installation) is surveyed once per year to assess the customer perceptions of GTEI’s service, quality and support. This ongoing survey is a complete census of the customer base, not just a representative sample. The survey is conducted by telephone on a continuous and daily basis.
The scale and measurement metric of the survey are representative of the focus upon quality. The scale is limited to four levels: totally satisfied, somewhat satisfied, somewhat dissatisfied and totally dissatisfied. Additionally, the respondent is offered the option of indicating that they have insufficient information to provide an answer. The measurement metric, against which progress is tracked, is the percentage of customers who are totally satisfied. A 70 percent totally satisfied customer base is considered GTEI’s minimum acceptable level.
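As a rough illustration of that metric, the short Python sketch below computes the percentage of totally satisfied answers while leaving respondents who had insufficient information out of the base. It is hypothetical; the data and function name are invented and this is not GTEI’s or Development II’s actual scoring code.

# Hypothetical sketch: computing the "totally satisfied" metric.
# The response labels and the 70 percent floor come from the article;
# the data and names below are invented for illustration.

SCOREABLE = {"totally satisfied", "somewhat satisfied",
             "somewhat dissatisfied", "totally dissatisfied"}

def percent_totally_satisfied(answers):
    """Share of scoreable answers that are 'totally satisfied'.

    Answers of "insufficient information" are excluded from the base,
    mirroring the survey's opt-out option.
    """
    scoreable = [a for a in answers if a in SCOREABLE]
    if not scoreable:
        return None
    return 100.0 * scoreable.count("totally satisfied") / len(scoreable)

sample = ["totally satisfied", "somewhat satisfied", "totally satisfied",
          "insufficient information", "totally satisfied"]
score = percent_totally_satisfied(sample)  # 75.0 for this sample
print(score, "meets the 70 percent floor" if score >= 70 else "below the floor")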
The structure of the survey is based upon approximately 10 high-level satisfaction categories, such as "professionalism of the NOC personnel" and "quality of the service." If the respondent does not indicate that they are totally satisfied, an additional series of "drill-down" questions seeks out the specific reasons for the lack of total satisfaction. This is augmented by open-ended queries that further pinpoint those reasons. A unique element of this survey is that it is modified monthly depending upon the customer-identified areas currently targeted for improvement. The survey is represented to the respondent as a communication process designed to help them relay their issues and concerns directly to GTEI. It is not positioned as, nor intended to be, anonymous.
The process
The core of the program is the process that integrates the survey results and the analysis of the survey data with the operational activities of the NOC. Here is where the fundamental difference between most customer satisfaction programs and GTEI’s approach becomes most apparent.
At the beginning of each month, representatives of Development II, the Woodbury, Conn., firm conducting the survey, meet with Kerns, Zajac and McLaughlin to discuss the previous month’s survey results for Internet Advantage. The statistics and comments are reviewed by category, by question and by individual. Trends are assessed and causal factors discussed. The result is often a modification in the survey question set, primarily in the drill-down questions or by the addition of new questions. The monthly report that is distributed to management and operations is sometimes modified slightly to increase its communication effectiveness.
This monthly discussion is only the launching point of the improvement process. Both Zajac and McLaughlin take the reports with the results of the reviews and meet with their staff to formulate and evaluate actions for improvement. Goals are set and tracked as an ongoing process.
A final element in this system is the "Hot Line." If any individual survey discloses one totally dissatisfied answer or three somewhat dissatisfied answers, a Hot Line is triggered and transmitted directly to the coordinator of the program, Paul Rondina, manager of customer/quality programs. He evaluates each one, creates a "ticket," assigns it to a specific individual and enters the details into an on-line monitoring system. Periodic summary reports assessing the status and disposition of each ticket go to McLaughlin, Zajac and Kerns.
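The trigger rule itself is simple enough to express in a few lines. The following Python sketch shows one way the Hot Line test described above could be encoded; the function name and sample data are hypothetical and are not drawn from GTEI’s actual monitoring system.

# Hypothetical sketch of the Hot Line rule: one "totally dissatisfied"
# answer, or three "somewhat dissatisfied" answers, flags the survey.

def needs_hot_line(answers):
    """Return True if a completed survey should generate a Hot Line ticket."""
    return (answers.count("totally dissatisfied") >= 1
            or answers.count("somewhat dissatisfied") >= 3)

survey = ["totally satisfied"] * 7 + ["somewhat dissatisfied"] * 3
if needs_hot_line(survey):
    print("Open a ticket, assign an owner and log it for follow-up")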
Identifying causal factors
One of the keys to focusing the improvement effort is quantifying the impact of each measured attribute upon overall satisfaction. An importance ranking is not sought from the respondent, as a customer’s sense of priority tends to be relatively inconsistent. Nor does a low percentage of customer satisfaction necessarily identify an area that deserves significant attention. The approach used by Development II is to analyze the results using neural networks. This provides not only a ranking of the drivers of satisfaction, but a quantitative relationship that characterizes the impact that improvement in a specific attribute will have upon the customer’s overall satisfaction. This causal analysis is completed approximately twice a year. The following example illustrates the process and the results that were obtained.
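To make the idea concrete, here is a minimal sketch of one way such a driver analysis can be run, using scikit-learn’s MLPRegressor and permutation importance on synthetic attribute ratings. It illustrates the general technique only, not Development II’s proprietary model; the attribute names and data are invented.

# Minimal sketch of a neural-network driver analysis (illustrative only).
# A small network predicts overall satisfaction from attribute ratings,
# and permutation importance ranks the attributes as "drivers."
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
attributes = ["time_to_install", "noc_communication", "service_quality"]

# Synthetic 1-4 ratings; in this made-up data set, overall satisfaction
# is driven mostly by communication with the NOC.
X = rng.integers(1, 5, size=(300, len(attributes))).astype(float)
y = 0.2 * X[:, 0] + 0.6 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.2, 300)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

result = permutation_importance(model, X, y, n_repeats=20, random_state=0)
for name, score in sorted(zip(attributes, result.importances_mean),
                          key=lambda pair: -pair[1]):
    print(f"{name}: {score:.3f}")  # higher score = stronger driver

In this synthetic example the communication attribute dominates the ranking, which mirrors the kind of finding described in the installation example that follows.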
As part of the Internet Advantage service, GTEI provides the equipment for the site as well as the hub connection, but it depends upon the local telephone company to install the dedicated line, typically a 56K, frame relay, T1 or T3 connection. While installation can happen within 30 days or less, it occasionally extends to 45 days and beyond.
The results of the initial customer satisfaction surveys for installation indicated that only 48 percent of customers were totally satisfied with the installation. This was unacceptably low. Clues to the problem abounded. Satisfaction with the time to install was relatively low, a fact echoed by customer comments. One would guess that reducing installation time would solve the problem. The neural network analysis, however, pointed to a different culprit: communication with the NOC. At first this seemed a curious result, but during the regular monthly customer satisfaction meeting the issue was discussed, and a hypothesis emerged that expectations for installation time established during the sales process were unrealistic. The survey was therefore modified with additional questions to probe the customer’s expected installation time versus the actual time it took.
At the next monthly meeting the results were quite clear. In many cases there was a gap between expected and actual installation times, and that gap contributed to dissatisfaction. The fix was relatively straightforward. Little could be done about decreasing the installation time of the dedicated line, since that was controlled by the local telephone company, but expectations could be reset by the Network Operations Center based upon its most current knowledge of circuit installation intervals.
After Zajac discussed this conclusion with his staff, the initial conversation with the customer was modified to provide a realistic installation timeline. Evidence of this change appeared in the survey’s satisfaction levels within three months.
Several other areas were addressed by the same process during the year. The result was a dramatic change in overall satisfaction from 48 percent to 83 percent in less than 12 months (see Figure 1). It is interesting to note that the satisfaction with the "Time for Installation" changed very little and today still remains at the lower end of all of the satisfaction ratings.
The improvement in overall satisfaction with the ongoing Internet connectivity service followed a similar upward pattern (see Figure 2). This improvement was less dramatic only because the initial starting point was much higher. What is impressive is that overall satisfaction is now in excess of 90 percent.
Critical ingredient
The highly focused feedback process discussed in this article is only part of the reason for the success of the improvement program. The clear message that comes from Kerns downward through the organization is that increasing customer satisfaction is his highest priority. That commitment is a critical ingredient of any successful process. The results speak for themselves.
Improving overall satisfaction with installation from 48 percent totally satisfied to 83 percent in less than one year is very unusual. Typically, movements in the level of satisfaction tend to be much more limited, even in highly focused, customer-oriented environments. A 10 percent increase in most cases would have been quite laudable.
Having over 90 percent of the customers totally satisfied with any service or product is similarly rare. Usually satisfaction measurements that reach these heights require that the percentage of totally and somewhat satisfied customers be combined.
Equally impressive are some of the other customer evaluations. A selection from the most recent report shows 99.7 percent of the customers totally satisfied with the "courtesy and professionalism of the NOC personnel" and 98.4 percent believing the service "met their quality standards during the past two months."
These exceptional results indicate what can be done with a highly focused process, dedicated senior management and an effective customer sensing program. Perhaps the greatest compliment came recently when a group from GTE corporate attended one of Kerns’ monthly meetings to see for themselves if the satisfaction numbers were objective and real. They left as believers.