QUALITY OF SERVICE OF INCUMBENT LOCAL EXCHANGE CARRIERS NOVEMBER 2005 Industry Analysis and Technology Division Wireline Competition Bureau Federal Communications Commission This report was authored by Jonathan M. Kraushaar of the Industry Analysis and Technology Division of the FCC’s Wireline Competition Bureau. The author can be reached at (202) 418-0947; e-mail address: jonathan.kraushaar@fcc.gov; TTY: (202) 418-0484. This report is available for reference in the FCC's Reference Information Center, Courtyard Level, 445 12th Street, S.W. Copies may be purchased by calling Best Copy and Printing, Inc. at (202) 488-5300. The report can be downloaded from the Wireline Competition Bureau Statistical Reports Internet site at http://www.fcc.gov/wcb/stats. 1 Quality of Service of Incumbent Local Exchange Carriers 1. Executive Summary 1.1 Overview This report summarizes the Automated Reporting Management Information System (ARMIS) service quality data filed by the regional Bell companies, Sprint and other price-cap regulated incumbent local exchange carriers (ILECs) for calendar year 2004. 1 The data track the quality of service provided to both retail customers (business and residential) and access customers (interexchange carriers). The Federal Communications Commission (FCC or Commission) does not impose service quality standards on communications common carriers. Rather, the Commission monitors quality of service data submitted by incumbent local exchange carriers that are regulated as price-cap carriers. The Commission summarizes these data and publishes a report on quality of service trends annually. 2 The tables of this report present comparative data on key company performance indicators. These data include several objective indicators of installation, maintenance, switch outage and trunk blocking performance for each reporting company. The tables also present data on customer perception of service and the level of consumer complaints. A number of indicators are charted over time to present a multi- year view. In addition, the Commission uses statistical methods to analyze the data for long term trends and to establish patterns of industry performance. The results of these analyses are also contained in this report. 1.2 Key Findings for 2004 The quality of service report charts industry performance over time on eight key quality of service indicators. Since our last report, there have been only small changes in the values of most of these indicators. However, our statistical analysis, which incorporates performance data from the most recent six years, indicates the presence of statistically significant long term trends in most 1 See Revision of ARMIS Annual Summary Report (FCC Report 43-01), ARMIS USOA Report (FCC Report 43- 02), ARMIS Joint Cost Report (FCC Report 43-03), ARMIS Access Report (FCC Report 43-04), ARMIS Service Quality Report (FCC Report 43-05), ARMIS Customer Satisfaction Report (FCC Report 43-06), ARMIS Infrastructure Report (FCC Report 43-07), ARMIS Operating Data Report (FCC Report 43-08), ARMIS Forecast of Investment Usage Report (FCC Report 495A), and ARMIS Actual Usage of Investment Report (FCC Report 495B) for Certain Class A and Tier 1 Telephone Companies, CC Docket No. 86-182, Order, 20 FCC Rcd 1048 (2004). 2 The last report, which included data for 2003, was released in December 2004. 
See Industry Analysis and Technology Division, Wireline Competition Bureau, Federal Communications Commission, Quality of Service of Incumbent Local Exchange Carriers (December 2004). That report can be found on the Commission’s website at www.fcc.gov/wcb/stats under the file name QUAL03.ZIP. Source data used to prepare this report may be useful for further investigation and can be extracted from the ARMIS 43-05 and 43-06 tables on the online database maintained on the FCC website at www.fcc.gov/wcb/eafs. 2 industry indicators (i.e., in the data for large and small companies combined) and in the data for large companies alone. 3 Most of these trends are indicative of long-term improvement. Statistically significant differences in performance were also noted across most companies. Our findings are summarized below:
• Statistically significant trends were identified in six indicators of industry-wide performance. These indicators and their expected annual downward (-) or upward (+) trend (i.e., percentage decline or increase in the value of the indicator) are average complaints per million lines (-8.1%), lengths of installation intervals (-7.3%), lengths of repair intervals (+4.2%), trouble reports per 1000 lines (-2.0%), percent installation dissatisfaction (-5.4%), and percent of switches with outages (-16.1%).
• Statistically significant long-term trends were also identified in seven indicators of performance when the data were restricted to the larger reporting companies. 4 The “large company” indicators and their expected annual trend are average complaints per million lines (-15.8%), lengths of installation intervals (-12.4%), lengths of repair intervals (+3.4%), percent of installation commitments met (+0.2%), trouble reports per 1000 lines (-3.6%), percent installation dissatisfaction (-5.4%), and percent of switches with outages (-20.1%).
• The performance of small companies was found to differ significantly from that of large companies on all indicators of small company performance tracked by the quality of service report, except for the length of repair intervals and the trouble reports per 1000 lines. No significant trends were identified in small company performance, except for length of repair intervals (+10.4%). Data on percent installation dissatisfaction and percent repair dissatisfaction were not collected for small companies.
• There were significant differences across companies on all indicators of performance, except for percent of installation commitments met. Trends were also found to vary widely across companies except for trouble reports per 1000 lines and percent of installation commitments met.
• Relative to their performance in 2003, average complaints per million lines for the larger companies changed very little in 2004. In addition, length of installation intervals and associated customer satisfaction levels remained near their previous levels for all but one of the larger companies. However, length of repair intervals and initial trouble reports increased for at least three of the larger companies, while customer dissatisfaction associated with repairs increased for only one of those carriers.
3 Essentially, we have identified trends in the data and have demonstrated statistically that the probability these trends occurred by chance was small. For most trends, this probability was less than 0.001, i.e., there was less than one chance in one thousand that the trend occurred as a result of random fluctuations in the data.
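To make these expected annual trend figures concrete, the short sketch below (not part of the report) shows how a constant annual percentage trend compounds over several years; the starting value and horizon are hypothetical.

```python
# Illustrative sketch only: project an indicator forward under a constant
# expected annual trend, e.g. complaints per million lines falling about
# 8.1% per year. Starting value and horizon are hypothetical.

def project(start_value, annual_trend, years):
    """Return the projected values for year 0 through `years`."""
    return [start_value * (1 + annual_trend) ** t for t in range(years + 1)]

print(project(100.0, -0.081, 5))
# roughly [100.0, 91.9, 84.5, 77.6, 71.3, 65.6]
```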
4 For a list of large and small companies, see footnotes 22 and 23. 3 2. Report History At the end of 1983, anticipating AT&T's imminent divestiture of its local operating companies, the Commission directed the Common Carrier Bureau 5 to establish a monitoring program that would provide a basis for detecting adverse trends in Bell operating company network service quality. The Bureau subsequently worked with industry to refine the reporting requirements, ensuring that the data were provided in a uniform format. Initially, the data were filed twice yearly. The data collected for 1989 and 1990 formed the basis for FCC service quality reports published in June 1990 and July 1991, respectively. These reports highlighted five basic service quality measurements collected at that time. 6 With the implementation of price-cap regulation for certain local exchange carriers, the Commission made several major changes to the service quality monitoring program. These changes first affected data filed for calendar year 1991. First, the Commission expanded the class of companies required to file quality of service data to include non-Bell carriers that elected to be subject to price-cap regulation. 7 These carriers are known collectively as non-mandatory price-cap carriers, and most of them are much smaller than the Bell operating companies. Second, the Commission included service quality reporting in the ARMIS data collection system. 8 Finally, the Commission ordered significant changes to the kinds of data carriers had to report. 9 Following these developments, the Commission released service quality reports in February 1993, March 1994, and March 1996. In 1996, pursuant to requirements in the Telecommunications Act of 1996, 10 the Commission reduced the frequency of ARMIS data reporting to annual submissions, and in May 1997, clarified 5 As the result of a reorganization in March 2002, the Wireline Competition Bureau now performs Common Carrier Bureau functions described in this report. In this report, references to the Common Carrier Bureau apply to activities prior to the above date. 6 These were customer satisfaction level, dial tone delay, transmission quality, on time service orders, and percentage of call blocking due to equipment failure. 7 Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Second Report and Order, 5 FCC Rcd 6786, 6827-31 (1990) (LEC Price-Cap Order) (establishing the current service quality monitoring program and incorporating the service quality reports into the ARMIS program), Erratum, 5 FCC Rcd 7664 (1990), modified on recon., 6 FCC Rcd 2637 (1991), aff'd sub nom., Nat'l Rural Telecom Ass'n v. FCC, 988 F.2d 174 (D.C. Cir. 1993). The incumbent local exchange carriers that are rate-of-return regulated are not subject to federal service quality reporting requirements. 8 LEC Price-Cap Order, 5 FCC Rcd at 6827-30. The ARMIS database includes a variety of mechanized company financial and infrastructure reports in addition to the quality-of-service reports. Most data are available disaggregated to a study area level which generally represents operations within a given state. 9 Id.; Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 6 FCC Rcd 2974 (1991) (Service Quality Order), recon., 6 FCC Rcd 7482 (1991). Previously the Common Carrier Bureau had collected data on five basic service quality measurements from the Bell operating companies, described earlier. 10 Telecommunications Act of 1996, Pub. 
L. No. 104-104, 110 Stat. 56. 4 relevant definitions. 11 The raw data are now filed in April of each year. The Commission summarizes these data and publishes the quality of service report annually. 12 3. The Data 3.1 Tables The data presented in this report summarize the most recent ARMIS 43-05 and 43-06 carrier reports. 13 Included are data from the regional Bell companies, Sprint and all other reporting incumbent local exchange carriers. 14 Tables 1(a) through 1(f) cover data from the regional Bell companies, or mandatory price-cap companies. Tables 2(a) through 2(c) cover data from the smaller non-mandatory price-cap companies. These companies report quality of service data at a study area level which generally represents operations within a given state. Although reporting companies provide selected company aggregate data, the tables of this report contain summary data that have been recalculated by FCC staff as the composite aggregate of all study areas for each listed entity. This report also includes a 11 Orders implementing filing frequency and other reporting requirement changes associated with implementation of the Telecommunications Act of 1996 are as follows: Implementation of the Telecommunications Act of 1996: Reform of Filing Requirements and Carrier Classifications, CC Docket No. 96-193, Order and Notice of Proposed Rulemaking, 11 FCC Rcd 11716 (1996); Revision of ARMIS Quarterly Report (FCC Report 43-01) et al., CC Docket No. 96-193, Order, 11 FCC Rcd 22508 (1996); Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 12 FCC Rcd 8115 (1997); Revision of ARMIS Annual Summary Report (FCC Report 43-01) et al., AAD No. 95- 91, Order, 12 FCC Rcd 21831 (1997). 12 The Commission released quality of service reports in September 1998, December 1999, December 2001, January 2003, February 2004 and December 2004, in addition to those listed earlier in this report. These reports have included data from the mandatory price-cap companies and the largest non-mandatory carriers, GTE and Sprint. GTE is now a part of Verizon, a mandatory price-cap carrier. Beginning with the December 2004 report, the following smaller non-mandatory price-cap companies that file ARMIS 43-05 data are included: Alltel Corp., Century Tel., Cincinnati Bell, Citizens, Citizens Frontier, Iowa Telecom, and Valor Telecommunications. Non-mandatory carriers are not required to file customer satisfaction data that appear in the ARMIS 43-06 report. 13 Source data used in preparing this report may be useful for further investigation and can be extracted from the ARMIS 43-05 and 43-06 tables on the online database maintained on the FCC website at www.fcc.gov/wcb/eafs. The data are also available from Best Copy and Printing, Inc at (202) 488-5300. A number of prior-year data summary reports are available through the FCC’s Reference Information Center (Courtyard Level) at 445 12th Street, S.W., Washington, D.C. 20554, and the Wireline Competition Bureau Statistical Reports website at www.fcc.gov/wcb/stats. 14 In February 1992, United Telecommunications Inc. became Sprint Corporation (Local Division); and in March 1993, Sprint Corporation acquired Centel Corporation. Bell Atlantic and NYNEX merged in August 1997, and then merged with GTE in 2000. Verizon Communications is shown separately for GTE, Verizon North (the former NYNEX companies), and Verizon South (the former Bell Atlantic Companies). 
SBC, Pacific Telesis, Ameritech, and SNET are shown separately despite the merger of SBC and Pacific Telesis in April 1997, SBC and SNET in October 1998, and SBC and Ameritech in October 1999. 5 fairly extensive summary of data about individual switching outages, including outage durations and numbers of lines affected, for which no company calculated aggregates are provided. Switch outage data have also been aggregated to the company level for inclusion in the tables. The company-level quality of service data included in Tables 1(a)-1(f) and Tables 2(a)-2(c) are derived by calculating sums or weighted averages of data reported at the study area level. In particular, where companies report study area information in terms of percentages or average time intervals, this report presents company composites that are calculated by weighting the percentage or time interval figures from all study areas within that company. For example, we weight the percent of commitments met by the corresponding number of orders provided in the filed data. 15 In the case of outage data summarized in Tables 1(b), 1(c), 2(b), and 2(c), we calculate a number of useful statistics from raw data records for individual switches with outages lasting more than two minutes. These statistics include the total number of events lasting more than two minutes, the average outage duration, the average number of outages per hundred switches, the average number of outages per million access lines, and the average outage line-minutes per thousand access lines and per event. Outage line-minutes is a measure that combines both duration and number of lines affected in a single parameter. We derive this parameter from the raw data by multiplying the number of lines involved in each outage by the duration of the outage and summing the resulting values. We then divide the resulting sum by the total number of thousands of access lines or of events to obtain average outage line-minutes per access line and average outage line minutes per event respectively. The tables contained in this report cover data for 2004. Tables 1(a) and 2(a) provide installation, maintenance and customer complaint data. The installation and maintenance data are presented separately for local services provided to end users and access services provided to interexchange carriers. Tables 1(b) and 2(b) show switch downtime and trunk servicing data. Tables 1(c) and 2(c) show outage data by cause. Table 1(d) presents the percentages of residential, small business and large business customers indicating dissatisfaction with BOC installations, repairs and business offices, as determined by BOC customer perception surveys. 16 Table 1(e) shows the underlying survey sample sizes. 3.2 Charts This report displays data elements that have remained roughly comparable over the past few years. Such data are useful in identifying and assessing trends. In addition to the tables, this report 15 Although companies file their own company composites, we have recalculated a number of them from study area data for presentation in the tables to assure that company averages are calculated in a consistent manner. We weight data involving percentages or time intervals in order to arrive at consistent composite data shown in the tables. Parameters used for weighting in this report were appropriate for the composite being calculated and were based on the raw data filed by the carriers but are not necessarily shown in the tables. 
For example, we calculate composite installation interval data by multiplying the average installation interval at the individual study area level by the number of orders in that study area, summing the results for all study areas, and then dividing that sum by the total number of orders. 16 Customer satisfaction data collected in the 43-06 report and summarized in Tables 1(d) and 1(e) are required to be reported only by the mandatory price-cap carriers. 6 contains charts that highlight company trends for the last six years. Unlike the tables, for which the company composites are recalculated, the data presented in the charts are taken or derived from company-provided rollup or composite data. 17 Charts 1 through 7 graphically illustrate trends in complaint levels, initial trouble reports, residential installation dissatisfaction, percent of residential installation commitments met, residential installation intervals, residential repair dissatisfaction, and residential initial out-of-service repair intervals, respectively. Chart 8 displays trends among the larger price-cap carriers in the percentage of switches with outages. Data for Sprint, the largest non-mandatory price-cap company, are included only in those charts displaying ARMIS 43-05 data that it is required to file. This report charts the performance of the smaller price-cap carriers only on selected quality of service indicators, including numbers of trouble reports, repair intervals, and installation intervals. These indicators were selected for charting because they are generally less volatile than the others, thus allowing better comparison with similar trended data from the larger companies. (In the cases where we chart both large and small company performance, the larger companies are tracked on the chart with an ‘A’ designation, e.g., Chart 7A, while the smaller companies are tracked on the chart with a ‘B’ designation, e.g., Chart 7B.) Filed data are available only for the past one or two years for several of the smaller companies, which accounts for the truncated trend lines in some of the charts. 3.3 For More Information about the Data More detailed information about the raw data from which this report has been developed may be found on the Commission’s ARMIS web page cited earlier. Descriptions of the raw ARMIS 43-05 source data items from which Tables 1(a), 1(b), 1(c), 2(a), 2(b), and 2(c) were prepared can be found in Appendix A of this report. Tables 1(d) and 1(e) were prepared from data filed only by the Bell operating companies in the ARMIS 43-06 report. The statistics presented in Tables 1(d) and 1(e) are straightforward and reflect the data in the format filed. Complete data descriptions are available in several Commission orders. 18 4. Qualifications Overall, we caution readers to be aware of potential inconsistencies in the service quality data and methodological shortcomings affecting both the collection and interpretation of the data. Some common sources of issues are described below. 4.1 Data Re-filings Commission staff generally screen company-filed service quality data for irregularities and provide feedback to reporting companies on suspected problems. The reporting companies are then 17 Calculations to normalize data and derive percentages in charts 1, 2A, 2B, and 8 in this year’s report were performed directly on company-provided composite data rather than from recalculated composites in the attached tables. Other charts contain data that were taken directly from company-provided composite data.
18 See supra note 11. 7 given an opportunity to re-file. Re-filed data appear in this report if they are received in time to be included in the Commission’s recalculation of holding company totals and other data aggregates described in Section 3.1 prior to publication. However, it is expected that the process of data correction continues beyond the date of publication of this report, as new problems are identified. Reporting companies frequently re-file data, not only for the current reporting period, but also occasionally for previous reporting periods. Hence, users of the quality of service report data may find some inconsistencies with data extracted from the ARMIS database at a later or earlier date. 4.2 Commission Recalculation of Holding Company Aggregate Statistics Commission staff do not typically delete or adjust company-filed data for presentation in the quality of service report, except for recalculating holding company totals and other data aggregates as described in Section 3.1. Recalculated aggregates appear in the tables of the quality of service report. These may not match corresponding company-filed totals and composites. 19 Such inconsistencies are due primarily to differences in the way we and the reporting company derive the data element, for example, in the use of percentages or average intervals that require weighting in the calculations. 4.3 Company-specific Variations Users conducting further analysis of the data should be aware that variations in service quality measurements may occur among companies and even within the same company over time for reasons other than differences in company performance. For example, data definitions must be properly and consistently interpreted. The Commission has, on occasion, provided clarifications when it became apparent that reporting companies had interpreted reporting requirements inconsistently. 20 Changes in a company’s internal data collection procedures or measurement technology may also result in fluctuations in its service quality measurements over time. In some cases, procedural changes in the data measurement and collection process may be subtle enough so that they are not immediately noticeable in the data. However, significant changes in company data collection procedures usually result in noticeable and abrupt changes in the data. 21 It appears that at least some of these changes have 19 Data presented in the charts are company-filed composites, except where noted. 20 For example, because of data problems resulting from the various classifications of trouble reports, the Commission addressed problems relating to subtleties in the definitions associated with the terms “initial” and “repeat” trouble reports. See Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87- 313, Memorandum Opinion and Order, 12 FCC Rcd 8115, 8133, para. 40 (1997); Policy and Rules Concerning Rates for Dominant Carriers, AAD No. 92-47, Memorandum Opinion and Order, 8 FCC Rcd 7474, 7478, para. 26, 7487-7549, Attachment (1993); Revision of ARMIS Annual Summary Report (FCC Report 43-01) et al., AAD 95-91, Order, 12 FCC Rcd 21831, 21835, para. 10 (1997) (introducing reporting of “subsequent” troubles). This issue was discussed at greater length in a prior summary report. See Industry Analysis Division, Common Carrier Bureau, Federal Communications Commission, Quality of Service for the Local Operating Companies Aggregated to the Holding Company Level (March 1996). 
21 For example, SBC reported changes for 2003 in its complaint data that were designed to normalize disparate reporting methodologies in its Ameritech region. The resulting declines in complaint levels are at least partially attributable to these changes, which involved the elimination of several complaint data reporting subcategories previously included by Ameritech. At our request, the company restated 2002 data for Ameritech to conform to the new procedures that were introduced for the 2003 data collection and reporting. The restated Ameritech data were not formally filed as a revision but would have shown 43.9 residential complaints per million residential lines and 15.9 business complaints per million business lines. This would have resulted in an average of 29.9 complaints per million lines instead of the 213.4 complaints per million lines shown for 2002 in Chart 1. Although improvement in 2003 is still indicated, the improvement appears more modest if we assume that SBC's procedural change took place in 2002 instead of 2003.
8 not been reported to the Commission. These factors tend to limit the number of years of reliable data available to track service quality trends. Although the Commission has made considerable efforts to standardize data reporting requirements over the years, given the number of changes to the reporting regimes and predictable future changes, one should not assume exact comparability on all measurements for data sets as they are presented year by year. In spite of all of the foregoing, deteriorating or improving service quality trends that persist for more than a year or two usually become obvious and can provide a critical record for state regulators. 4.4 Trend Analysis and Data Volatility Because measurements of any particular quality of service indicator may fluctuate over time, trend analysis can be an effective tool in helping to evaluate longer-term company and industry performance. Consideration of trends may also provide insight into typical lead times that might be needed to correct certain problems once they have been identified. In addition, adverse trends in complaint levels of significant duration, when identified, can serve as warning indicators of problems not captured by the more specific objective measurements. For this reason, we recommend the use of trend analysis of service quality and complaint data along with pattern analysis to get a holistic assessment of a company's overall performance. With respect to individual measures of company performance, it is our experience that service reliability data and, to a lesser extent, customer satisfaction data are, by their nature, subject to greater volatility than other types of company data. For these measures, in particular, data interpretation must consider longer term trends and take into consideration filing intervals and lag times in data preparation and filing. 4.5 Interpretation of Outage Statistics Outage statistics should be considered in context. For example, a statistic representing the average number of lines affected per event would tend to favor a company with a larger number of smaller or remote switches with lower line counts per switch, while a statistic representing the average outage duration might favor a company with larger switches. Thus, using the average number of lines per event measurement, one 25,000-line switch that is out of service for five minutes would appear to have a greater service impact than ten 2,500-line switches that are each out of service for five minutes. To provide a basis for comparing the performance of companies having different switch size characteristics, we present a grouping of outage statistics that includes outage line-minutes per event and per 1,000 access lines.
9 4.6 External Factors We note that external factors, including economic conditions and natural disasters, the level of competitive activity, and changes in regulation, have the potential to affect the quality of service available in specific regions of the country or in the industry as a whole, and these effects may be manifested in the quality of service data. The Commission does not currently consider these effects in its analysis. 5. Observations and Statistical Analysis 5.1 Observations from the Current Year Summary Data Charts 1 to 9 visually display some of the key characteristics of the data. These charts, which track summary data for the large and small price-cap carriers on key quality of service parameters, generally reveal small changes from the patterns observed last year. In general, repair performance has declined over the past few years, while other parameters have not changed significantly or have improved in recent years. This year’s data show weighted average complaint levels very close to those seen last year. The data on installation intervals and associated customer satisfaction levels also exhibited little change over the past couple of years for the larger price-cap companies, and installation intervals show improvement this year for the smaller companies. However, the data for residential repair intervals again showed declining performance for all but one of the larger price-cap companies. Increased repair intervals appeared in conjunction with increases in the number of reported initial trouble reports for at least three of the charted larger price-cap carriers. Nonetheless, residential customer dissatisfaction associated with repairs increased for only one of those carriers. 5.2 Statistical Analysis The FCC’s quality of service report has presented graphical analysis of several key indicators of industry and company performance since the December 2001 report. The graphs have typically presented the data for the most recent five- or six-year period. The indicators currently tracked are complaints per million lines, length of installation intervals, length of repair intervals, percent of installation commitments met, trouble reports per thousand lines, percent installation dissatisfaction, percent of repair dissatisfaction, and percent of switches with outages. With this year’s report we present the results of a statistical analysis of these indicators from raw data samples received from the companies. The overall goals of our statistical analysis were to:
• Determine if there were any discernible trends in performance as tracked by these indicators across the years,
• Determine if reporting companies performed differently from each other,
• Determine whether the large reporting companies performed differently or had different trend behavior from small reporting companies, and
• Develop models of trends in performance that could be used to predict next year’s performance.
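As a concrete illustration of the second goal, the sketch below applies a simple one-way analysis of variance to the residential out-of-service repair intervals shown in Chart 7A for three of the larger carriers. This is only a simplified stand-in for the Commission's actual procedure, which, as described below, used ANOVA, ANCOVA, and regression techniques on the underlying company data.

```python
# Simplified sketch: do repair intervals differ across companies?
# Inputs are the 1999-2004 residential out-of-service repair intervals
# (hours) from Chart 7A; the report's actual tests were run on the
# underlying study-area data, not these composites.
from scipy import stats

bellsouth = [24.3, 23.1, 20.8, 20.0, 21.5, 33.5]
qwest = [25.3, 19.0, 14.1, 13.6, 14.7, 16.3]
sbc_pacific = [37.7, 42.1, 26.8, 25.9, 25.8, 28.8]

f_stat, p_value = stats.f_oneway(bellsouth, qwest, sbc_pacific)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value (for example, below 0.01) would suggest that at least
# one company's mean repair interval differs from the others'.
```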
10 For the purpose of our analysis, we classified companies as “large” or “small.” This classification is largely the same as that used earlier in creating the charts (i.e., the larger companies 22 are tracked on the charts with an ‘A’ designation (e.g., chart 2A), and the smaller companies 23 are tracked on the charts with a ‘B’ designation (e.g., chart 2B). However, even though Iowa Telecom was classified as a small company in the charts, it was included as a large company for the statistical analysis, since its performance was very close to that of the larger companies. We used several types of statistical techniques in analyzing the data. These included ANOVA (Analysis of Variance), ANCOVA (Analysis of Covariance) and simple linear regression. They allowed us to analyze small-versus-large company effects, individual company effects, and year effects (i.e., does performance vary from year-to-year) in the performance data for each of the key indicators. We tested for the existence of overall trends, 24 trends for only the large companies, and trends for only the small companies. If a trend existed, we then determined its direction and magnitude. In addition, the statistical testing allowed us to determine if the trends varied widely across companies, if there were performance differences across companies, and if large company performance differed from small company performance. The following table summarizes the results of our statistical analysis on data filed by reporting companies since 1999, representing the most recent six-year reporting period. (Note that smaller non-mandatory price cap carriers are not required to file data on all performance indicators. These are designated as “NA” in the table.) The rows of the table contain the key indicators of company performance tracked by this report. The columns contain the effects described above. A “Yes” entry in the table means that we have concluded with a high level of statistical confidence that the effect for which we have tested is indeed present. A “No” entry means that the data did not support such a conclusion. For example, we tested to determine whether large company performance differs from small company performance on the average complaints per million lines indicator, and we concluded with a high degree of statistical confidence that large company performance does differ from small company performance on this indicator. We included the direction and magnitude of a trend in the table if our statistical testing indicated that there was a low probability the trend occurred as a result of random fluctuations in the data. Almost all trends were found significant at less than the 0.001 level, meaning there was less than one chance in 1000 that these trends occurred as a result of random data fluctuations. However, asterisked trends were found significant at less than the 0.01 level, meaning that there was a greater probability--less than one chance in a hundred--that these trends happened by chance. The word “No” appearing in any of the first three columns of the table indicates that a trend could not be established at the 0.01 level 22 The larger companies in the charts of this report are BellSouth, Qwest, SBC Ameritech, SBC Pacific, SBC Southwestern, SBC SNET, Verizon GTE, Verizon North, Verizon South, and Sprint. 23 The smaller companies in the charts of this report are Alltel Corp., Cincinnati Bell, Citizens, Citizens Frontier, Century Tel., Iowa Telecom, and Valor. 
24 A trend is the expected annual change in the value of the performance indicator. For example, a negative trend of -5.2% means that every year the value of the indicator is expected to decrease by approximately 5.2%. A positive trend (for example, +6.3%) means that every year the value of the indicator is expected to increase by 6.3%. The magnitude and direction of the trend for a particular performance indicator are estimated by fitting a linear regression model to the logarithms of the values of that performance indicator for the past six years.
11 of significance. In the last three columns of the table, the word “Yes” indicates that significant statistical differences were found between companies or groups of companies, and the word “No” indicates that such statistical differences were not detected. The term “barely” in the last column of the table indicates that large companies are statistically different from small companies at a significance level of nearly 0.01.

Results of Statistical Testing of Key Industry Performance Indicators
(Columns: Trend Over All Companies | Trend For Large Companies | Trend For Small Companies | Trends Vary Widely Across Companies | Performance Differences Across Companies | Large Company Performance Differs From Small)
Average complaints per million lines: -8.1% | -15.8% | No | Yes | Yes | Yes
Installation intervals: -7.3% | -12.4% | No | Yes | Yes | Yes
Repair intervals: +4.2% | *+3.4% | +10.4% | Yes | Yes | No
Percent commitments met: No | *+0.2% | No | No | No (except 1 company) | Barely
Trouble report rate per 1000 lines: *-2.0% | -3.6% | No | No | Yes | No
Percent installation dissatisfaction: -5.4% | -5.4% | NA | Yes | Yes | NA
Percent repair dissatisfaction: No | No | NA | Yes | Yes | NA
Percent switches with outages: -16.1% | -20.1% | No | Yes | Yes | Yes
All results are significant at less than the 0.001 level except as noted below or in the text:
* Indicates a trend that was significant at less than the 0.01 level.

As noted earlier, a trend represents the expected or average change in the value of the performance indicator from year to year. Considering columns 1 through 3, we note that our analysis has allowed us to conclude with a high degree of confidence that statistically significant trends do exist in the data for many indicators of performance. Factors other than random data variability are likely to be responsible for these trends. However, what those factors are cannot be determined from our data alone. We also note that recent observed annual performance changes may not necessarily be in a direction consistent with calculated trends of the previous five years. 25 This may occur, for example, when significant underlying events or changes take place. 25 For example, in Chart 2A covering trouble reports per thousand lines for large companies, the current year’s data show an increase in the trouble report rate, while there is a longer-term trend toward declining trouble report rates.
12 Considering column 4, we find that trends vary widely across companies, except for the “percentage of commitments met” indicator (where no trends were identified in any of the large, small, and combined company data) and the “trouble reports per 1000 lines” indicator (where only large and combined company data showed evidence of small trends). Column 5 shows that there are significant statistical differences across companies in all performance measures except for the “percent of installation commitments met,” where only one company was statistically different from the others.
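Footnote 24 describes the trend estimates as a linear regression fit to the logarithms of six annual values. The sketch below shows that calculation on hypothetical data; the report’s published percentages were estimated from the underlying company-level filings, so this simplified example is illustrative only.

```python
# Minimal sketch of the trend estimate described in footnote 24: fit a
# straight line to the logarithms of six annual indicator values, then
# convert the fitted slope into an expected annual percentage change.
# The values below are hypothetical, not taken from the ARMIS filings.
import numpy as np

years = np.arange(1999, 2005)
values = np.array([120.0, 111.0, 104.0, 95.0, 90.0, 84.0])

slope, _intercept = np.polyfit(years, np.log(values), 1)
annual_trend = np.exp(slope) - 1.0
print(f"Expected annual change: {annual_trend:+.1%}")
# A negative result (here, about -7%) corresponds to a downward trend of
# roughly that percentage per year.
```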
Finally, column 6 shows that there is virtually no statistical difference between large and small companies in the categories of “trouble reports per 1000 lines” and “repair intervals.” Overall, our analysis shows that there are statistically significant trends for most of the performance measures (i.e., in the data for large and small companies combined and in the data for large companies alone). These trends are typically indicative of long-term improvement. However, the overall upward trend in the length of repair intervals (with all companies included in the analysis) provides evidence of longer-term declining performance in this area. While reasons for the declining performance in repair intervals cannot be determined from these data alone, we note that reported complaint levels exhibit a higher correlation with installation intervals than with repair intervals. In addition, there appear to be no corresponding statistically significant trends in customer dissatisfaction with repairs as there are for customer dissatisfaction with installations. The reasons for these unexpected inconsistencies could not be established from tests performed on the data. In closing, we note that although the highlighted trends reflect longer term patterns in company performance than simply looking at year over year changes, their direction in the future may change as companies respond or fail to respond to quality of service issues. ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 BellSouth 192.9 241.6 192.7 131.5 128.0 131.4 Qwest 722.1 379.2 203.4 149.2 103.5 89.1 SBC Ameritech 178.4 613.2 382.8 213.4 13.2 11.2 SBC Pacific 35.8 39.2 19.6 12.5 10.6 10.4 SBC Southwestern 28.5 28.0 23.9 17.0 13.4 21.9 SBC SNET 323.0 326.3 231.6 186.6 87.1 88.5 Verizon GTE 86.1 106.7 80.1 60.3 79.1 104.8 Verizon North (Combined with Verizon South) Verizon South 223.5 299.4 197.3 151.8 190.7 184.7 Sprint 183.8 287.9 136.4 75.3 78.9 43.3 Weighted BOC/Sprint Composite* 204.4 259.1 167.0 113.6 94.7 93.9 *Weighted composite is calculated using access line counts. Chart 1 Average of Residential and Business Complaints per Million Access Lines (Calculated Using Data from Company Provided Composites) Relative Complaint Levels Large Price-Cap Carriers 0.0 100.0 200.0 300.0 400.0 500.0 600.0 700.0 800.0 1999 2000 2001 2002 2003 2004 Years Co mpla ints per Millio n Lines Weighted BOC/Sprint Composite* Weighted Verizon Avg. BellSouth Weighted SBC Avg. Qwest Sprint 13 ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 BellSouth 287.8 290.9 300.1 285.0 278.5 298.2 Qwest 202.2 163.0 131.3 111.4 113.4 117.6 SBC Ameritech 208.3 177.5 200.4 171.4 149.7 146.2 SBC Pacific 146.7 157.7 146.8 129.0 119.4 116.1 SBC Southwestern 205.1 212.8 222.0 197.8 175.4 190.5 SBC SNET 195.9 194.0 195.6 173.2 180.3 165.8 Verizon GTE 173.7 177.1 164.5 146.4 153.0 167.2 Verizon North (Combined with Verizon South) Verizon South 168.2 168.3 160.6 151.5 169.4 157.8 Sprint 235.8 223.7 206.3 165.6 192.2 216.1 Weighted BOC/Sprint Composite* 199.4 194.2 190.8 172.1 172.2 175.8 * Weighted composite is calculated using access line counts. Initial Total Trouble Reports per Thousand Lines (Residence + Business) (Calculated Using Data from Company Provided Composites) Chart 2A Initial Trouble Reports per Thousand Lines Large Price-Cap Carriers 0.0 50.0 100.0 150.0 200.0 250.0 300.0 350.0 1999 2000 2001 2002 2003 2004 Years Number o f Repo rts Weighted Verizon Avg. BellSouth Weighted SBC Avg. 
Qwest Sprint Weighted BOC/Sprint Composite* 14 ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 Alltel Corp. 233.5 193.1 Cincinnati Bell 122.3 136.6 136.0 118.7 114.6 113.6 Citizens 265.3 313.3 286.0 264.0 260.2 296.0 Citizens (Frontier) 280.5 305.6 252.6 345.8 266.6 257.2 Century Tel. 266.9 265.0 Iowa Telecom 135.9 132.6 157.2 Valor 397.7 368.0 422.6 Weighted BOC/Sprint Composite* 199.4 194.2 190.8 172.1 172.2 175.8 Weighted Small Co.Composite* 227.0 258.2 231.3 319.2 318.4 244.0 * Weighted composite is calculated using access line counts. Initial Total Trouble Reports per Thousand Lines (Residence + Business) (Calculated Using Data from Company Provided Composites) Chart 2B Initial Trouble Reports per Thousand Lines Small Price-Cap Carriers 0.0 50.0 100.0 150.0 200.0 250.0 300.0 350.0 400.0 450.0 1999 2000 2001 2002 2003 2004 Years Number o f Repo rts Alltel Corp. Cincinnati Bell Citizens Citizens (Frontier) Weighted Small Co.Composite* Weighted BOC/Sprint Composite* Century Tel. Iowa Telecom Valor 15 ARMIS 43-06 Report 1999 2000 2001 2002 2003 2004 BellSouth 9.2 12.8 11.2 10.3 6.7 6.4 Qwest 7.3 7.4 6.4 7.0 5.5 3.9 SBC Ameritech 7.7 16.4 15.5 10.7 8.1 7.6 SBC Pacific 10.8 13.5 8.8 6.4 6.1 6.1 SBC Southwestern 5.7 6.8 8.0 8.1 7.9 8.4 SBC SNET 11.6 8.3 7.3 7.6 8.6 Verizon GTE 7.4 4.4 4.8 4.1 3.5 5.3 Verizon North (Combined with Verizon South) Verizon South 5.3 5.2 4.8 5.2 6.2 6.4 Weighted BOC Composite* 7.3 9.2 8.2 7.2 6.3 6.4 *Weighted composite is calculated using access line counts. Chart 3 Percent Dissatisfied --BOC Residential Installations (Using Company Provided Composites) Residential Installation Dissatisfaction BOCs 0.0 2.0 4.0 6.0 8.0 10.0 12.0 14.0 1999 2000 2001 2002 2003 2004 Years Percent Dissa t isfied Weighted Verizon Avg BellSouth Weighted SBC Avg. Qwest Weighted BOC Composite* 16 ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 BellSouth 97.8 100.0 100.0 100.0 98.2 98.7 Qwest 98.5 98.9 99.3 99.5 99.7 99.7 SBC Ameritech 99.0 98.9 98.8 99.1 98.9 98.6 SBC Pacific 99.0 99.1 99.5 99.6 99.6 99.4 SBC Southwestern 98.6 98.8 98.8 98.9 99.1 99.0 SBC SNET 96.7 98.9 100.0 100.0 99.5 99.6 Verizon GTE 95.6 96.2 95.5 98.5 98.3 98.4 Verizon North (Combined with Verizon South) Verizon South 98.4 98.5 98.9 98.7 98.7 98.8 Sprint 98.0 97.7 98.8 98.2 97.5 96.8 Weighted BOC/Sprint Composite* 98.1 98.6 98.8 99.1 98.8 98.8 *Weighted composite is calculated using access line counts. Percent Installation Commitments Met -- Residential Services (Using Company Provided Composites) Chart 4 Percent Residential Installation Commitments Met Large Price-Cap Carriers 97.0 97.5 98.0 98.5 99.0 99.5 100.0 1999 2000 2001 2002 2003 2004 Years Percent o f Co mmitments M e t Weighted BOC/Sprint Composite* Weighted Verizon Avg BellSouth Weighted SBC Avg. Qwest 17 ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 BellSouth 1.3 1.3 1.2 1.1 1.1 1.1 Qwest 1.1 1.0 0.6 0.5 0.4 0.3 SBC Ameritech 2.1 2.1 2.0 2.1 1.5 1.4 SBC Pacific 1.5 1.8 1.3 1.2 1.5 1.6 SBC Southwestern 0.8 0.8 2.2 1.8 1.9 2.0 SBC SNET 2.1 2.2 1.8 1.0 1.0 1.0 Verizon GTE 1.4 1.0 0.8 0.6 0.6 0.6 Verizon North (Combined with Verizon South) Verizon South 1.5 1.5 1.1 1.0 1.1 1.1 Sprint 4.5 3.9 3.2 1.5 1.4 1.7 Weighted BOC/Sprint Composite* 1.6 1.5 1.4 1.2 1.2 1.2 * Weighted composite is calculated using access line counts. 
Chart 5A Average BOC Residential Installation Interval in Days (Using Company Provided Composites) Residential Installation Intervals Large Price-Cap Carriers 0.0 0.5 1.0 1.5 2.0 2.5 3.0 3.5 4.0 4.5 5.0 1999 2000 2001 2002 2003 2004 Years In t e r val i n D ays Weighted BOC/Sprint Composite* Weighted Verizon Avg BellSouth Weighted SBC Avg. Qwest Sprint 18 ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 Alltel Corp. 1.8 1.6 Cincinnati Bell 2.3 1.9 2.3 1.7 4.5 1.7 Citizens 4.6 4.8 4.6 4.8 5.3 4.1 Citizens (Frontier) 6.1 5.6 3.5 5.3 4.8 5.1 Century Tel. 3.3 1.6 Iowa Telecom 2.1 1.8 1.9 Valor 3.0 2.0 1.6 Weighted BOC/Sprint Composite* 1.6 1.5 1.4 1.2 1.2 1.2 Weighted Small Co.Composite* 4.4 4.2 3.6 4.7 5.2 2.9 * Weighted composite is calculated using access line counts. Chart 5B Average BOC Residential Installation Interval in Days (Using Company Provided Composites) Residential Installation Intervals Small Price-Cap Carriers 0.0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 1999 2000 2001 2002 2003 2004 Years In t e r val i n D ays Weighted BOC/Sprint Composite* Alltel Corp. Cincinnati Bell Citizens Citizens (Frontier) Weighted Small Co.Composite* Century Tel. Iowa Telecom Valor 19 Percent Dissatisfied -- BOC Residential Repairs (Using Company Provided Composites) ARMIS 43-06 Report 1999 2000 2001 2002 2003 2004 BellSouth 15.1 18.8 17.6 14.6 10.1 10.0 Qwest 13.9 8.0 10.0 9.3 6.5 5.9 SBC Ameritech 15.4 26.5 19.2 14.6 11.4 11.0 SBC Pacific 15.8 23.6 10.0 7.3 7.6 7.4 SBC Southwestern 7.9 9.6 11.7 9.6 9.9 10.4 SBC SNET 18.7 14.2 14.5 11.9 11.6 Verizon GTE 11.6 9.4 10.1 11.9 11.2 14.0 Verizon North (Combined with Verizon South) Verizon South 14.8 15.0 13.4 15.3 20.8 19.0 Weighted BOC Composite* 13.6 16.2 13.5 12.6 12.6 12.3 * Weighted composite is calculated using access line counts. Chart 6 Residential Repair Dissatisfaction BOCs 0.0 5.0 10.0 15.0 20.0 25.0 1999 2000 2001 2002 2003 2004 Years P ercen t Di ssati sfi e d Weighted BOC Composite* Weighted Verizon Avg BellSouth Weighted SBC Avg. Qwest 20 Average Initial Out-of-Service Repair Interval in Hours -- Residential Services (Using Company Provided Composites) ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 BellSouth 24.3 23.1 20.8 20.0 21.5 33.5 Qwest 25.3 19.0 14.1 13.6 14.7 16.3 SBC Ameritech 21.7 49.0 22.7 18.9 16.8 17.2 SBC Pacific 37.7 42.1 26.8 25.9 25.8 28.8 SBC Southwestern 20.9 23.2 24.9 21.0 22.1 29.0 SBC SNET 39.2 38.2 27.2 27.4 26.7 27.2 Verizon GTE 14.1 13.0 13.5 15.5 15.7 28.9 Verizon North (Combined with Verizon South) Verizon South 24.0 27.0 22.0 24.1 34.5 29.2 Sprint 18.9 16.3 13.9 15.2 17.3 22.6 Weighted BOC/Sprint Composite* 24.0 27.7 20.7 20.4 23.3 26.7 * Weighted composite is calculated using access line counts. Chart 7A Residential Initial Out-of-Service Repair Intervals Large Price-Cap Carriers 0.0 5.0 10.0 15.0 20.0 25.0 30.0 35.0 40.0 45.0 1999 2000 2001 2002 2003 2004 Years Int erval in H o urs Weighted BOC/Sprint Composite* Weighted Verizon Avg BellSouth Weighted SBC Avg. Qwest Sprint 21 Average Initial Out-of-Service Repair Interval in Hours -- Residential Services (Using Company Provided Composites) ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 Alltel Corp. 25.9 15.4 Cincinnati Bell 31.5 36.7 49.3 36.1 37.5 28.2 Citizens 14.2 14.3 14.7 14.4 16.3 16.7 Citizens (Frontier) 16.9 20.7 16.4 17.7 28.1 22.3 Century Tel. 
14.9 13.9 Iowa Telecom 11.3 10.1 11.1 Valor 21.8 16.8 17.3 Weighted BOC/Sprint Composite* 24.0 27.7 20.7 20.4 23.3 26.7 Weighted Small Co.Composite* 18.7 21.2 23.7 24.4 31.2 19.0 * Weighted composite is calculated using access line counts. Chart 7B Residential Initial Out-of-Service Repair Intervals Small Price-Cap Carriers 0.0 10.0 20.0 30.0 40.0 50.0 60.0 1999 2000 2001 2002 2003 2004 Years Int erval in H o urs Weighted BOC/Sprint Composite* Alltel Corp. Cincinnati Bell Citizens Century Tel. Weighted Small Co.Composite* Iowa Telecom Valor 22 Percentage of Switches with Downtime (Calculated Using Data from Company Provided Composites) ARMIS 43-05 Report 1999 2000 2001 2002 2003 2004 BellSouth 8.3% 6.4% 5.9% 4.2% 2.5% 1.6% Qwest 77.5% 42.1% 36.0% 18.8% 11.1% 20.0% SBC Ameritech 23.3% 3.7% 3.4% 4.5% 1.5% 1.0% SBC Pacific 16.2% 10.1% 15.4% 2.3% 3.3% 3.7% SBC Southwestern 17.2% 12.0% 10.3% 4.3% 3.9% 1.5% SBC SNET 7.9% 28.8% 42.3% 4.4% 0.6% 6.2% Verizon GTE 3.4% 2.9% 1.6% 1.3% 2.7% 1.5% Verizon North (Combined with Verizon South) Verizon South 4.7% 8.6% 5.6% 2.4% 4.4% 0.9% Sprint 12.6% 10.2% 8.8% 10.2% 3.5% 7.5% Weighted BOC/Sprint Composite* 17.1% 11.1% 10.0% 4.9% 3.9% 3.7% *Weighted composite is calculated using access line counts. Chart 8 Percentage of Switches with Downtime Large Price-Cap Carriers 0.0% 10.0% 20.0% 30.0% 40.0% 50.0% 60.0% 70.0% 80.0% 90.0% 1999 2000 2001 2002 2003 2004 Years Percent Weighted BOC/Sprint Composite* Weighted Verizon Avg BellSouth Weighted SBC Avg. Qwest Sprint 23 Table 1(a): Installation, Maintenance, & Customer Complaints Mandatory Price-Cap Company Comparison -- 2004 BellSouth Qwest SBC SBC SBC SBC Verizon Verizon Verizon Ameritech Pacific Southwestern SNET North South GTE Access Services Provided to Carriers-- Switched Access Percent Installation Commitments Met 100.0 99.8 94.4 95.1 84.9 74.4 99.9 99.8 92.4 Average Installation Interval (days) 19.2 15.0 31.8 20.3 27.6 28.1 28.5 20.1 26.6 Average Repair Interval (hours) 0.6 1.3 4.4 9.6 3.6 0.4 15.0 4.0 9.5 Access Services Provided to Carriers -- Special Access Percent Installation Commitments Met 99.8 97.9 95.5 98.2 99.1 98.5 91.8 92.8 91.0 Average Installation Interval (days) 14.0 9.0 17.9 16.2 17.4 20.7 21.3 16.5 20.2 Average Repair Interval (hours) 3.3 2.8 4.7 4.3 3.7 3.7 5.4 3.8 17.3 Local Services Provided to Res. and Business Customers Percent Installation Commitments Met 97.7 99.6 98.6 99.4 98.9 99.5 98.8 98.7 98.1 Residence 98.7 99.7 98.6 99.4 98.9 99.6 98.8 98.8 98.4 Business 90.6 98.8 98.3 99.1 98.5 99.0 98.0 97.5 95.5 Average Installation Interval (days) 1.5 0.3 1.4 1.8 2.1 1.2 0.9 1.4 0.7 Residence 1.2 0.3 1.4 1.6 2.0 1.0 0.8 1.3 0.6 Business 1.9 1.2 1.4 3.0 2.4 2.9 1.5 2.0 2.3 Avg. Out of Svc. Repair Interval (hours) 31.3 16.0 16.7 26.9 27.7 26.7 25.7 29.0 26.4 Total Residence 33.5 16.3 17.2 28.8 28.9 27.2 27.1 31.7 28.9 Total Business 19.9 14.9 14.2 17.6 20.7 23.9 20.1 15.1 13.8 Initial Trouble Reports per Thousand Lines 298.2 117.6 146.2 116.1 190.5 165.8 181.6 139.2 167.2 Total MSA 289.7 133.8 145.9 115.5 182.7 164.3 175.6 133.4 157.8 Total Non MSA 348.9 39.3 148.9 136.2 228.6 181.4 245.3 219.3 205.4 Total Residence 344.8 137.1 206.8 162.6 247.2 211.0 221.9 184.7 192.1 Total Business 177.1 74.5 58.9 45.7 82.2 70.2 104.8 62.7 108.6 Troubles Found per Thousand Lines 197.3 89.2 107.8 98.0 147.6 91.8 137.8 102.3 140.0 Repeat Troubles as a Pct. of Trouble Rpts. 18.6% 20.3% 16.1% 10.2% 16.6% 16.7% 20.6% 21.0% 14.6% Residential Complaints per Million Res. 
Access Lines 212.4 130.8 16.2 17.5 35.8 128.8 100.1 496.5 161.1 Business Complaints per Million Business Access Lines 50.3 47.4 6.2 3.3 8.0 48.2 38.2 60.3 49.9 * Please refer to text for notes and data qualifications. Table 1(b): Switch Downtime & Trunk Blocking Mandatory Price-Cap Company Comparison -- 2004 BellSouth Qwest SBC SBC SBC SBC Verizon Verizon Verizon Ameritech Pacific Southwestern SNET North South GTE Total Access Lines in Thousands 20,938 13,425 17,287 16,156 13,912 2,069 15,829 20,276 15,785 Total Trunk Groups 3,230 1,523 962 1,227 734 88 736 900 1,564 Total Switches 1,625 1,323 1,436 778 1,654 161 1,290 1,349 3,180 Switches with Downtime Number of Switches 26 264 14 29 25 10 14 10 49 As a percentage of Total Switches 1.6% 20.0% 1.0% 3.7% 1.5% 6.2% 1.1% 0.7% 1.5% Average Switch Downtime in seconds per Switch* For All Events (including events over 2 minutes) 19.8 104.4 5.0 0.2 4.4 20.9 47.7 10.9 212.6 For Unscheduled Events Over 2 Minutes 19.8 90.4 2.1 NA 3.7 17.9 43.6 10.7 NA For Unscheduled Downtime More than 2 Minutes Number of Occurrences or Events 15 29 5 0 5 2 10 9 0 Events per Hundred Switches 1 2 0 0 0 1 1 1 0 Events per Million Access Lines 1 2 0 0 0 1 1 0 0 Average Outage Duration in Minutes 36 69 10 NA 20 24 94 27 NA Average Lines Affected per Event in Thousands 15.6 6.5 37.1 NA 22.2 22.3 21.8 15.0 NA Outage Line-Minutes per Event in Thousands 282.8 243.7 365.7 NA 430.0 78.4 516.7 1,024.6 NA Outage Line-Minutes per 1,000 Access Lines 202.6 526.4 105.8 0.0 154.5 75.7 326.4 454.8 0.0 For Scheduled Downtime More than 2 Minutes Number of Occurrences or Events 5 28 1 0 2 0 3 1 0 Events per Hundred Switches 0.3 2.1 0.1 0 0.1 0 0.2 0.1 0 Events per Million Access Lines 0.24 2.09 0.06 0 0.14 0 0.19 0.05 0 Average Outage Duration in Minutes 8.1 5.5 66.0 NA 5.5 NA 28.5 2.3 NA Avg. Lines Affected per Event in Thousands 1.4 16.1 21.0 NA 19.3 NA 22.9 21.8 NA Outage Line-Minutes per Event in Thousands 11.4 66.5 1,388.1 NA 77.5 NA 142.9 50.6 NA Outage Line-Minutes per 1,000 Access Lines 2.7 138.8 80.3 0.0 11.1 0.0 27.1 2.5 0.0 % Trunk Grps. Exceeding Blocking Objectives 1.30% 9.72% 0.10% 0.49% 0.68% 0.00% 3.80% 2.11% 0.19% * Aggregate downtime divided by total number of company switches. Please refer to text for notes and data qualifications. ` Table 1(c): Switch Downtime Causes -- Outages more than 2 Minutes in Duration Mandatory Price-Cap Company Comparison -- 2004 BellSouth Qwest SBC SBC SBC SBC Verizon Verizon Verizon Ameritech Pacific Southwestern SNET North South GTE Total Number of Outages 1. Scheduled 5 28 1 0 2 0 3 1 0 2. Proced. Errors -- Telco. (Inst./Maint.) 0 0 1 0 0 0 2 2 0 3. Proced. Errors -- Telco. (Other) 0 0 0 0 0 0 0 0 0 4. Procedural Errors -- System Vendors 2 10 0 0 0 0 0 0 0 5. Procedural Errors -- Other Vendors 0 2 0 0 1 0 0 0 0 6. Software Design 1 0 1 0 0 1 2 4 0 7. Hardware design 0 0 0 0 0 0 1 1 0 8. Hardware Failure 5 9 3 0 4 1 3 1 0 9. Natural Causes 3 0 0 0 0 0 1 0 0 10. Traffic Overload 0 0 0 0 0 0 0 0 0 11. Environmental 0 0 0 0 0 0 0 1 0 12. External Power Failure 1 2 0 0 0 0 1 0 0 13. Massive Line Outage 0 0 0 0 0 0 0 0 0 14. Remote 5 28 1 0 2 0 3 1 0 15. Other/Unknown 1 0 0 0 0 0 0 0 0 Total Outage Line-Minutes per Thousand Access Lines 1. Scheduled 2.7 138.8 80.3 0.0 11.1 0.0 27.1 2.5 0.0 2. Proced. Errors -- Telco. (Inst./Maint.) 0.0 0.0 10.4 0.0 0.0 0.0 44.2 24.0 0.0 3. Proced. Errors -- Telco. (Other) 7.9 56.2 0.0 0.0 0.0 0.0 0.0 0.0 0.0 4. Procedural Errors -- System Vendors 2.0 32.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 5. 
Procedural Errors -- Other Vendors 0.0 73.1 0.0 0.0 128.3 0.0 0.0 0.0 0.0 6. Software Design 49 0 10 0 0 66 4 8 0 7. Hardware design 0.0 0.0 0.0 0.0 0.0 0.0 5.2 3.7 0.0 8. Hardware Failure 115.1 295.5 85.0 0.0 26.2 9.6 226.2 20.1 0.0 9. Natural Causes 19.4 0.0 0.0 0.0 0.0 0.0 9.0 0.0 0.0 10. Traffic Overload 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 11. Environmental 0 0 0 0 0 0 0 399 0 12. External Power Failure 4.6 69.2 0.0 0.0 0.0 0.0 37.6 0.0 0.0 13. Massive Line Outage 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 14. Remote 0.0 0.5 0.0 0.0 0.0 0.0 0.0 0.0 0.0 15. Other/Unknown 4.7 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 * Please refer to text for notes and data qualifications. ` Table 1(d): Company Comparision -- 2004 Customer Perception Surveys Mandatory Price-Cap Companies: BellSouth Qwest SBC SBC SBC SBC Verizon Verizon Verizon Ameritech Pacific Southwestern SNET North South GTE Percentage of Customers Dissatisfied Installations: Residential 6.38% 3.90% 7.57% 6.06% 8.38% 8.62% 6.04% 6.86% 5.33% Small Business 9.23% 9.18% 11.75% 7.99% 9.48% 6.94% 10.25% 12.92% 10.22% Large Business 6.38% NA 10.28% 5.74% 8.76% NA 5.68% 5.87% 7.34% Repairs: Residential 9.95% 5.91% 11.01% 7.44% 10.40% 11.61% 17.73% 20.74% 14.01% Small Business 7.26% 7.47% 10.71% 4.92% 7.64% 9.56% 12.73% 10.95% 10.22% Large Business 5.50% NA 9.30% 4.68% 7.45% NA 9.98% 5.53% 7.39% Business Office: Residential 8.12% 1.62% 8.15% 5.10% 8.83% 6.88% 6.21% 6.58% 6.61% Small Business 10.94% 3.30% 9.10% 6.52% 6.96% 8.64% 7.08% 8.44% 7.76% Large Business 6.13% NA 10.72% 3.01% 4.97% NA 10.14% 13.51% 8.50% * Please refer to text for notes and data qualifications Table 1(e): Company Comparision -- 2004 Customer Perception Surveys Mandatory Price-Cap Companies: BellSouth Qwest SBC SBC SBC SBC Verizon Verizon Verizon Ameritech Pacific Southwestern SNET North South GTE Sample Sizes -- Customer Perception Surveys Installations: Residential 45,000 49,832 10,708 10,841 10,311 4,558 20,301 15,346 19,762 Small Business 45,310 31,801 10,484 11,256 10,227 2,148 19,446 15,246 19,891 Large Business 9,801 0 2,287 3,480 2,272 0 599 733 436 Repairs: Residential 30,680 35,055 10,505 12,348 10,537 2,368 20,255 15,224 20,023 Small Business 46,100 24,993 10,724 11,071 10,665 1,193 20,151 15,052 20,176 Large Business 7,340 0 2,376 3,689 2,390 0 551 651 433 Business Office: Residential 42,609 46,008 21,038 22,934 20,887 2,876 11,057 9,783 12,966 Small Business 10,900 31,757 20,346 22,954 21,281 1,111 3,418 3,625 3,402 Large Business 457 0 2,883 2,853 2,534 0 503 592 341 * Please refer to text for notes and data qualifications Table 2(a): Installation, Maintenance, & Customer Complaints Non-Mandatory Price-Cap Company Comparison -- 2003 Alltel Century Cincinnati Citizens Citizens Iowa Sprint Valor Tel. Frontier Access Services Provided to Carriers-- Switched Access Percent Installation Commitments Met 98.6 80.0 100.0 92.7 99.2 64.2 92.6 89.1 Average Installation Interval (days) 13.5 18.9 22.2 12.8 25.6 20.5 11.9 31.7 Average Repair Interval (hours) 3.1 12.9 NA 11.8 2.6 26.1 2.6 2.8 Access Services Provided to Carriers -- Special Access Percent Installation Commitments Met 90.6 83.0 100.0 89.2 93.5 63.7 94.0 94.7 Average Installation Interval (days) 11.8 20.7 17.0 12.0 21.1 9.6 10.0 23.0 Average Repair Interval (hours) 3.2 12.6 3.2 14.8 23.0 18.3 3.6 3.1 Local Services Provided to Res. 
  Percent Installation Commitments Met 97.4 95.9 99.8 95.1 97.6 98.4 96.5 98.2
    Residence 97.7 96.0 99.9 95.3 98.1 98.4 96.8 98.2
    Business 94.3 95.5 99.5 94.3 95.2 97.4 94.5 98.1
  Average Installation Interval (days) 1.8 1.9 2.2 4.0 5.3 2.1 1.8 1.6
    Residence 1.6 1.6 1.7 4.1 5.2 2.0 1.7 1.6
    Business 3.0 1.7 4.7 3.7 6.4 3.4 2.7 1.6
  Avg. Out of Svc. Repair Interval (hours) 15.1 15.2 26.7 16.9 22.0 10.8 22.2 17.1
    Total Residence 15.4 18.4 28.2 16.7 22.3 11.1 22.5 17.3
    Total Business 13.6 13.8 16.2 17.8 19.8 8.6 20.1 15.4
  Initial Trouble Reports per Thousand Lines 193.1 230.5 113.6 296.0 257.2 157.2 216.1 422.6
    Total MSA 170.9 215.4 113.6 NA 260.7 155.3 195.5 328.2
    Total Non MSA 214.8 244.6 NA 296.0 253.5 157.7 261.6 505.0
    Total Residence 243.2 68.1 141.7 336.2 315.7 179.0 260.2 504.0
    Total Business 76.6 729.0 48.7 170.1 127.3 82.1 101.2 198.3
  Troubles Found per Thousand Lines 149.4 189.6 106.1 236.4 210.3 141.8 144.4 404.1
  Repeat Troubles as a Pct. of Trouble Rpts. 20.3% 42.9% 12.2% 16.0% 11.6% 17.1% 21.9% 7.6%
  Residential Complaints per Million Res. Access Lines 189.2 714.1 664.6 687.5 727.8 21.0 61.8 154.0
  Business Complaints per Million Bus. Access Lines 70.3 408.8 83.5 137.4 109.7 0.0 24.8 36.6
* Please refer to text for notes and data qualifications.

Table 2(b): Switch Downtime & Trunk Blocking
Non-Mandatory Price-Cap Company Comparison -- 2003
(Columns, left to right: Alltel, Century Tel., Cincinnati, Citizens Frontier, Citizens, Iowa, Sprint, Valor)
Total Access Lines in Thousands 756 607 953 1,293 940 246 7,546 513
Total Trunk Groups 90 244 61 247 92 170 541 246
Total Switches 243 187 86 643 201 273 1,344 292
Switches with Downtime
  Number of Switches 57 0 17 55 12 20 101 31
  As a percentage of Total Switches 23.5% 0.0% 19.8% 8.6% 6.0% 7.3% 7.5% 10.6%
Average Switch Downtime in seconds per Switch*
  For All Events (including events over 2 minutes) 3,580.8 0.0 44.8 1,011.5 829.9 254.2 2,978.0 770.1
  For Unscheduled Events Over 2 Minutes 3,233.7 NA NA 946.3 857.0 197.5 2,797.6 770.1
For Unscheduled Downtime More than 2 Minutes
  Number of Occurrences or Events 29 0 0 63 16 18 77 31
  Events per Hundred Switches 11.9 0.0 0.0 9.8 8.0 6.6 5.7 10.6
  Events per Million Access Lines 38.36 0.00 0.00 48.71 17.02 73.05 10.20 60.40
  Average Outage Duration in Minutes 451.6 NA NA 161.0 179.4 49.9 813.9 120.9
  Average Lines Affected per Event in Thousands 2.2 NA NA 0.8 1.5 0.7 3.2 1.7
  Outage Line-Minutes per Event in Thousands 565.7 NA NA 368.6 188.5 38.2 6,803.4 349.8
  Outage Line-Minutes per 1,000 Access Lines 21,697.3 0.0 0.0 17,956.8 3,207.5 2,790.2 69,423.5 21,130.4
For Scheduled Downtime More than 2 Minutes
  Number of Occurrences or Events 2 0 0 16 0 1 24 0
  Events per Hundred Switches 0.8 0.0 0.0 2.5 0.0 0.4 1.8 0.0
  Events per Million Access Lines 2.65 0.00 0.00 12.37 0.00 4.06 3.18 0.00
  Average Outage Duration in Minutes 703.0 NA NA 34.2 NA 258.4 168.3 NA
  Avg. Lines Affected per Event in Thousands 1.4 NA NA 1.2 NA 0.7 4.8 NA
  Outage Line-Minutes per Event in Thousands 731.1 NA NA 28.6 NA 187.3 831.2 NA
  Outage Line-Minutes per 1,000 Access Lines 1,933.9 0.0 0.0 354.4 0.0 760.3 2,643.5 0.0
% Trunk Grps. Exceeding Blocking Objectives 1.11% 5.33% 24.59% 0.00% 0.00% 0.00% 2.96% 0.81%
* Aggregate downtime divided by total number of company switches. Please refer to text for notes and data qualifications.
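The normalized statistics in Table 2(b) follow directly from each company's reported totals (see the table footnote and Appendix A, items 9-11). The following is a minimal reader-side sketch, using the Alltel column of Table 2(b) above as an example; the variable names are illustrative only, and this is not the code used to produce the report.

```python
# Illustrative check of the normalized outage statistics in Table 2(b),
# using the Alltel column as reported above (figures copied from the table).

access_lines = 756_000        # "Total Access Lines in Thousands" = 756
total_switches = 243          # "Total Switches"
switches_with_downtime = 57   # "Switches with Downtime -- Number of Switches"
unscheduled_events = 29       # unscheduled downtime events over 2 minutes

pct_switches_with_downtime = 100 * switches_with_downtime / total_switches
events_per_hundred_switches = 100 * unscheduled_events / total_switches
events_per_million_lines = unscheduled_events / (access_lines / 1_000_000)

print(f"{pct_switches_with_downtime:.1f}% of switches had downtime")      # ~23.5%
print(f"{events_per_hundred_switches:.1f} events per hundred switches")   # ~11.9
print(f"{events_per_million_lines:.2f} events per million access lines")  # ~38.36
```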
Table 2(c): Switch Downtime Causes -- Outages More than 2 Minutes in Duration
Non-Mandatory Price-Cap Company Comparison -- 2003
(Columns, left to right: Alltel, Century Tel., Cincinnati, Citizens Frontier, Citizens, Iowa, Sprint, Valor)
Total Number of Outages
  1. Scheduled 2 0 0 16 0 1 24 0
  2. Proced. Errors -- Telco. (Inst./Maint.) 3 0 0 0 0 2 16 15
  3. Proced. Errors -- Telco. (Other) 0 0 0 0 0 0 0 0
  4. Procedural Errors -- System Vendors 1 0 0 0 0 0 0 0
  5. Procedural Errors -- Other Vendors 0 0 0 0 0 0 1 3
  6. Software Design 4 0 0 16 2 2 2 0
  7. Hardware Design 0 0 0 0 0 2 0 1
  8. Hardware Failure 6 0 0 21 2 9 9 3
  9. Natural Causes 3 0 0 4 2 0 8 1
  10. Traffic Overload 0 0 0 0 0 0 0 0
  11. Environmental 0 0 0 0 0 0 0 0
  12. External Power Failure 2 0 0 12 5 2 19 0
  13. Massive Line Outage 7 0 0 0 0 1 1 0
  14. Remote 2 0 0 16 0 1 24 0
  15. Other/Unknown 2 0 0 0 0 0 2 0
Total Outage Line-Minutes per Thousand Access Lines
  1. Scheduled 1,933.9 0.0 0.0 354.4 0.0 760.3 2,643.5 0.0
  2. Proced. Errors -- Telco. (Inst./Maint.) 278.1 0.0 0.0 0.0 0.0 361.1 257.4 5,829.5
  3. Proced. Errors -- Telco. (Other) 53.8 0.0 0.0 0.0 0.0 0.0 0.0 620.0
  4. Procedural Errors -- System Vendors 56.4 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  5. Procedural Errors -- Other Vendors 0.0 0.0 0.0 0.0 0.0 0.0 4.5 1,168.1
  6. Software Design 1327 0 0 1089 258 1318 10 0
  7. Hardware Design 0.0 0.0 0.0 0.0 0.0 112.6 0.0 6.4
  8. Hardware Failure 1,665.3 0.0 0.0 1,148.2 58.7 810.4 373.6 10,750.3
  9. Natural Causes 3,167.7 0.0 0.0 13,379.7 951.6 0.0 65,706.0 2,756.1
  10. Traffic Overload 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  11. Environmental 0 0 0 0 0 0 0 0
  12. External Power Failure 216.1 0.0 0.0 1,276.6 1,313.3 120.8 2,783.1 0.0
  13. Massive Line Outage 14,903.1 0.0 0.0 0.0 0.0 67.8 168.5 0.0
  14. Remote 0.0 0.0 0.0 1,063.2 626.1 0.0 113.1 0.0
  15. Other/Unknown 29.3 0.0 0.0 0.0 0.0 0.0 7.6 0.0
* Please refer to text for notes and data qualifications.
Appendix A – Description of Key Terminology in the Tables

This Appendix contains descriptions of key terms that appear in the tables and charts of the Quality of Service Report. The data elements in the tables are derived from raw source data for individual study areas submitted by carriers in the ARMIS 43-05 reports. A detailed specification of each element used in the tables of this summary report follows this general description. Data in the charts are derived from composite data provided by the companies.

1. Percent of Installation Commitments Met
This term represents the percent of installations that were completed by the date the company promised to the customer. The associated data are presented separately for residential and business customers' local service in the tables. These data are also summarized in the accompanying charts.

2. Average Installation Interval (in days)
This term represents the average interval (in days) between the installation service order and completion of the installation. The associated ARMIS 43-05 report data are highlighted in the accompanying charts along with customer installation dissatisfaction data from the ARMIS 43-06 report.

3. Average Repair Interval (in hours)
This term represents the average time (in hours) for the company to repair access lines, with service subcategories for switched access, high-speed special access, and all special access. Repair interval data are also highlighted in the accompanying charts along with results from company-conducted surveys relating to customer repair dissatisfaction. This customer feedback is extracted from the ARMIS 43-06 report.

4. Initial Trouble Reports per Thousand Access Lines
This term is calculated as the total count of trouble reports reported as "initial trouble reports," divided by the number of access lines in thousands. (Note that multiple calls within a 30-day period associated with the same problem are counted as a single initial trouble, and the number of access lines used in the calculation is the total number of access lines divided by 1,000.)

5. Found or Verified Troubles per Thousand Access Lines
This term is calculated as 1,000 times the number of verified troubles divided by the number of access lines. Only those trouble reports for which the company identified a problem are included.

6. Repeat Troubles as a Percent of Initial Trouble Reports
This term is calculated as the number of initial trouble reports cleared by the company that recur, or remain unresolved, within 30 days of the initial trouble report, divided by the number of initial trouble reports as described above.

7. Complaints per Million Access Lines
This term is calculated as 1 million times the number of residential and business customer complaints reported to state or federal regulatory bodies during the reporting period, divided by the number of access lines. (A worked sketch of this and the other per-line indicators appears after item 13 below.)

8. Number of Access Lines, Trunk Groups and Switches
These terms represent the numbers of in-service access lines, trunk groups, and switches, respectively, as shown in the ARMIS 43-05 report. Trunk groups only include common trunk groups between Incumbent Local Exchange Carrier (ILEC) access tandems and ILEC end offices. When comparing current data herein with data in prior reports, the reader should note that access lines were reported in thousands in pre-1997 data submissions. Starting with 1997 data submissions, access line information in the raw carrier data filings has been reported in whole numbers.

9. Switches with Downtime
This term represents the number of network switches experiencing downtime and the percentage of the total number of company network switches experiencing downtime.

10. Average Switch Downtime in Seconds per Switch
This term includes (1) the total switch downtime divided by the total number of company network switches and (2) the total switch downtime for outages longer than 2 minutes divided by the total number of switches. Results for average overall switch downtime are shown in seconds per switch.

11. Unscheduled Downtime Over 2 Minutes per Occurrence
This term presents several summary statistics, including (1) the number of unscheduled occurrences of more than 2 minutes in duration, (2) the number of occurrences per million access lines, (3) the average number of minutes per occurrence, (4) the average number of lines affected per occurrence, (5) the average number of line-minutes per occurrence in thousands, and (6) the outage line-minutes per 1,000 access lines. For each outage, the number of lines affected was multiplied by the duration of the outage to provide the line-minutes of outage. The resulting sums of these data represent total outage line-minutes. This total was divided by the number of access lines in thousands to provide line-minutes per 1,000 access lines, and by the number of occurrences to provide line-minutes per occurrence. This characterizes the normalized magnitude of the outages in two ways and provides a realistic means to compare the impact of such outages across companies. Data are presented for each company showing the number of outages and outage line-minutes by cause.

12. Scheduled Downtime Over 2 Minutes per Occurrence
This term is determined as in item 11, above, except that it consists of scheduled occurrences.
13. Percent of Trunk Groups Meeting Design Objectives
This term relates to the percentage of trunk groups exceeding the design blocking objectives (typically 0.5 percent for trunk groups that include feature group D and 1.0 percent for other trunk groups) for three or more consecutive months. The trunk groups measured and reported are interexchange access facilities. These represent only a small portion of the total trunk groups in service.
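The per-line indicators defined in items 4 through 7 are simple normalizations of raw counts. The sketch below applies those definitions to hypothetical study-area counts; the figures and variable names are invented for illustration and are not drawn from any carrier's ARMIS filing.

```python
# A minimal sketch of the per-line indicators defined in items 4-7 above,
# applied to hypothetical study-area counts (invented for illustration).

access_lines = 1_250_000            # in-service access lines
initial_trouble_reports = 270_000   # initial trouble reports in the period
found_troubles = 210_000            # reports where a problem was actually identified
repeat_troubles = 43_000            # troubles recurring/unresolved within 30 days
complaints = 820                    # complaints reported to state or federal regulators

trouble_reports_per_1000_lines = 1000 * initial_trouble_reports / access_lines
found_troubles_per_1000_lines = 1000 * found_troubles / access_lines
repeat_trouble_pct = 100 * repeat_troubles / initial_trouble_reports
complaints_per_million_lines = 1_000_000 * complaints / access_lines

print(f"{trouble_reports_per_1000_lines:.1f} initial trouble reports per 1,000 lines")
print(f"{found_troubles_per_1000_lines:.1f} found troubles per 1,000 lines")
print(f"{repeat_trouble_pct:.1f}% repeat troubles")
print(f"{complaints_per_million_lines:.1f} complaints per million lines")
```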
Appendix A – Detailed Quality of Service Report Table Specifications
Report Tables 1(a) and 2(a) (ARMIS 43-05 data)

Access Services Provided to Carriers -- Switched Access
  Percent Installation Commitments Met: row 112 weighted by row 110 (column aa)
  Average Installation Interval (days): row 114 weighted by row 110 (column aa)
  Average Repair Interval (hours): row 121 weighted by row 120 (column aa)
Access Services Provided to Carriers -- Special Access
  Percent Installation Commitments Met: row 112 weighted by row 110 (column ac)
  Average Installation Interval (days): row 114 weighted by row 110 (column ac)
  Average Repair Interval (hours): row 121 weighted by row 120 (column ac)
Local Services Provided to Res. and Business Customers
  Percent Installation Commitments Met: row 132 weighted by row 130 (column aj)
    Residence: row 132 weighted by row 130 (column af)
    Business: row 132 weighted by row 130 (column ai)
  Average Installation Interval (days): row 134 weighted by row 130 (column aj)
    Residence: row 134 weighted by row 130 (column af)
    Business: row 134 weighted by row 130 (column ai)
  Avg. Out of Svc. Repair Interval (hours): row 145 weighted by row 144 (column aj)
    Total Residence: row 145 weighted by row 144 (column af)
    Total Business: row 145 weighted by row 144 (column ai)
  Initial Trouble Reports per Thousand Lines: 1000 * row 141 column aj / row 140 column aj
    Total MSA: 1000 * (row 141 column ad + column ag) / (row 140 column ad + column ag)
    Total Non MSA: 1000 * (row 141 column ae + column ah) / (row 140 column ae + column ah)
    Total Residence: 1000 * (row 141 column af) / (row 140 column af)
    Total Business: 1000 * (row 141 column ai) / (row 140 column ai)
  Troubles Found per Thousand Lines: 1000 * (row 141 column aj - row 143 column aj) / row 140 column aj
  Repeat Troubles as a Pct. of Trouble Rpts.: (row 142 column aj) / (row 141 column aj)
  Residential Complaints per Million Res. Access Lines: (row 331 column da + row 332 column da) / (row 330 column da)
  Business Complaints per Million Bus. Access Lines: (row 321 column da + row 322 column da) / (row 320 column da)

Appendix A – Detailed Quality of Service Report Table Specifications
Report Tables 1(b) and 2(b) (ARMIS 43-05 data)

Total Access Lines in Thousands: row 140 column aj
Total Trunk Groups: row 180 column ak
Total Switches: row 200 column an + row 201 column an
Switches with Downtime: row 200 column ao + row 201 column ao
  Number of Switches: row 200 column ao + row 201 column ao
  As a percentage of Total Switches: (row 200 column ao + row 201 column ao) / (row 200 column an + row 201 column an)
Average Switch Downtime in seconds per Switch*
  For All Events (including events over 2 minutes): 60 * (row 200 column ap + row 201 column ap) / (row 200 column an + row 201 column an)
  For Unscheduled Events Over 2 Minutes: 60 * (unscheduled events * average duration in min.) / (row 200 column an + row 201 column an)
For Unscheduled Downtime More than 2 Minutes: items in rows 220 to 500 where column t > 1
  Number of Occurrences or Events: E = number of records in row 220 to row 500, excluding rows 320, 321, 322, 330, 331 and 332
  Events per Hundred Switches: 100 * E / (row 200 column an + row 201 column an)
  Events per Million Access Lines: E / 1,000,000
  Average Outage Duration in Minutes: (sum of rows 220 to 500 column x) / E
  Average Lines Affected per Event in Thousands: (sum of rows 220 to 500 column v) / E
  Outage Line-Minutes per Event in Thousands: (sum of rows 220 to 500 column x * column v) / E
  Outage Line-Minutes per 1,000 Access Lines: 1000 * (sum of rows 220 to 500 column x * column v) / (row 140 column aj)
For Scheduled Downtime More than 2 Minutes: items in rows 220 to 500 where column t = 1
  Number of Occurrences or Events: E = number of records in row 220 to row 500, excluding rows 320, 321, 322, 330, 331 and 332
  Events per Hundred Switches: 100 * E / (row 200 column an + row 201 column an)
  Events per Million Access Lines: E / 1,000,000
  Average Outage Duration in Minutes: (sum of rows 220 to 500 column x) / E
  Avg. Lines Affected per Event in Thousands: (sum of rows 220 to 500 column v) / E
  Outage Line-Minutes per Event in Thousands: (sum of rows 220 to 500 column x * column v) / E
  Outage Line-Minutes per 1,000 Access Lines: 1000 * (sum of rows 220 to 500 column x * column v) / (row 140 column aj)
% Trunk Grps. Exceeding Blocking Objectives: (row 189 column ak + row 190 column ak) / (row 180 column ak)

Notes:
ARMIS 43-05 database rows 110-121 are contained in database table I
ARMIS 43-05 database rows 130-170 are contained in database table II
ARMIS 43-05 database rows 180-190 are contained in database table III
ARMIS 43-05 database rows 200-214 are contained in database table IV
ARMIS 43-05 database rows 220-319 are contained in database table IVa
ARMIS 43-05 database rows 320-332 are contained in database table V
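The downtime statistics specified above are aggregates over the individual outage records in ARMIS 43-05 rows 220 through 500, keyed by the cause code in column t (1 for scheduled outages, values greater than 1 for unscheduled causes). The sketch below shows the general shape of that aggregation; the record layout, sample values, and units are simplified assumptions made for illustration and do not reproduce the ARMIS column conventions exactly.

```python
# A sketch of the aggregation described in the specification above, assuming each
# outage record carries a cause code (column t), lines affected (column v), and
# duration in minutes (column x). The sample records below are invented.

outages = [
    # (cause_code_t, lines_affected_v, duration_minutes_x)
    (1, 2_000, 35.0),    # cause code 1 = scheduled
    (8, 15_000, 60.0),   # cause codes > 1 = unscheduled (8 = hardware failure)
    (12, 4_500, 180.0),  # 12 = external power failure
]
access_lines = 950_000  # total in-service access lines for the study area

def summarize(records, access_lines):
    """Aggregate outage records into the normalized statistics used in the tables."""
    events = len(records)
    line_minutes = sum(v * x for _, v, x in records)  # lines affected * duration
    return {
        "events": events,
        "avg_duration_min": sum(x for _, _, x in records) / events if events else None,
        "line_minutes_per_event_thousands": line_minutes / events / 1000 if events else None,
        "line_minutes_per_1000_lines": 1000 * line_minutes / access_lines,
    }

scheduled = summarize([r for r in outages if r[0] == 1], access_lines)
unscheduled = summarize([r for r in outages if r[0] > 1], access_lines)
print("scheduled:", scheduled)
print("unscheduled:", unscheduled)
```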
Appendix A – Detailed Quality of Service Report Table Specifications
Report Tables 1(c) and 2(c) (ARMIS 43-05 data)

Total Number of Outages: number of rows between 220 and 500 for each value of column t
  1. Scheduled
  2. Proced. Errors -- Telco. (Inst./Maint.)
  3. Proced. Errors -- Telco. (Other)
  4. Procedural Errors -- System Vendors
  5. Procedural Errors -- Other Vendors
  6. Software Design
  7. Hardware Design
  8. Hardware Failure
  9. Natural Causes
  10. Traffic Overload
  11. Environmental
  12. External Power Failure
  13. Massive Line Outage
  14. Remote
  15. Other/Unknown
Total Outage Line-Minutes per Thousand Access Lines: (sum of rows 220 to 500 column v * column x for each value of column t) / row 140 column aj
  1. Scheduled
  2. Proced. Errors -- Telco. (Inst./Maint.)
  3. Proced. Errors -- Telco. (Other)
  4. Procedural Errors -- System Vendors
  5. Procedural Errors -- Other Vendors
  6. Software Design
  7. Hardware Design
  8. Hardware Failure
  9. Natural Causes
  10. Traffic Overload
  11. Environmental
  12. External Power Failure
  13. Massive Line Outage
  14. Remote
  15. Other/Unknown

Notes:
ARMIS 43-05 database rows 110-121 are contained in database table I
ARMIS 43-05 database rows 130-170 are contained in database table II
ARMIS 43-05 database rows 180-190 are contained in database table III
ARMIS 43-05 database rows 200-214 are contained in database table IV
ARMIS 43-05 database rows 220-319 are contained in database table IVa
ARMIS 43-05 database rows 320-332 are contained in database table V

Appendix A – Detailed Quality of Service Report Table Specifications
Report Table 1(d) (ARMIS 43-06 data)

Percentage of Customers Dissatisfied
Installations:
  Residential: row 40 column ac weighted by column ab
  Small Business: row 40 column ae weighted by column ad
  Large Business: row 40 column ag weighted by column af
Repairs:
  Residential: row 60 column ac weighted by column ab
  Small Business: row 60 column ae weighted by column ad
  Large Business: row 60 column ag weighted by column af
Business Office:
  Residential: row 80 column ac weighted by column ab
  Small Business: row 80 column ae weighted by column ad
  Large Business: row 80 column ag weighted by column af
Note: ARMIS 43-06 database rows 40-80 are contained in database table I

Appendix A – Detailed Quality of Service Report Table Specifications
Report Table 1(e) (ARMIS 43-06 data)

Sample Sizes -- Customer Perception Surveys
Installations:
  Residential: sum of row 40 column ab
  Small Business: sum of row 40 column ad
  Large Business: sum of row 40 column af
Repairs:
  Residential: sum of row 60 column ab
  Small Business: sum of row 60 column ad
  Large Business: sum of row 60 column af
Business Office:
  Residential: sum of row 80 column ab
  Small Business: sum of row 80 column ad
  Large Business: sum of row 80 column af
Note: ARMIS 43-06 database rows 40-80 are contained in database table I

Customer Response
Publication: Quality of Service of Incumbent Local Exchange Carriers

You can help us provide the best possible information to the public by completing this form and returning it to the Industry Analysis and Technology Division of the FCC's Wireline Competition Bureau.

1. Please check the category that best describes you:
____ press
____ current telecommunications carrier
____ potential telecommunications carrier
____ business customer evaluating vendors/service options
____ consultant, law firm, lobbyist
____ other business customer
____ academic/student
____ residential customer
____ FCC employee
____ other federal government employee
____ state or local government employee
____ Other (please specify)

2. Please rate the report (Excellent / Good / Satisfactory / Poor / No opinion):
Data accuracy (_) (_) (_) (_) (_)
Data presentation (_) (_) (_) (_) (_)
Timeliness of data (_) (_) (_) (_) (_)
Completeness of data (_) (_) (_) (_) (_)
Text clarity (_) (_) (_) (_) (_)
Completeness of text (_) (_) (_) (_) (_)

3. Overall, how do you rate this report (Excellent / Good / Satisfactory / Poor / No opinion)?
(_) (_) (_) (_) (_)

4. How can this report be improved?

5. May we contact you to discuss possible improvements?
Name:
Telephone #:

To discuss this report, contact Jonathan Kraushaar at 202-418-0947.
Fax this response to 202-418-0520, or mail it to FCC/WCB/IATD, Washington, DC 20554.