QUALITY OF SERVICE OF INCUMBENT LOCAL EXCHANGE CARRIERS

FEBRUARY 2007

Industry Analysis and Technology Division
Wireline Competition Bureau
Federal Communications Commission

This report was authored by Jonathan M. Kraushaar of the Industry Analysis and Technology Division of the FCC's Wireline Competition Bureau. The author can be reached at (202) 418-0947; e-mail address: jonathan.kraushaar@fcc.gov; TTY: (202) 418-0484. This report is available for reference in the FCC's Reference Information Center, Courtyard Level, 445 12th Street, S.W. Copies may be purchased by calling Best Copy and Printing, Inc. at (202) 488-5300. The report can be downloaded from the Wireline Competition Bureau Statistical Reports Internet site at http://www.fcc.gov/wcb/stats.

Quality of Service of Incumbent Local Exchange Carriers

1. Executive Summary

1.1 Overview

This report summarizes the Automated Reporting Management Information System (ARMIS) service quality data filed by the regional Bell companies, Sprint [1] and other price-cap regulated incumbent local exchange carriers (ILECs) for calendar year 2005. [2] The data track the quality of service provided to both retail customers (business and residential) and access customers (interexchange carriers). The Federal Communications Commission (FCC or Commission) does not impose service quality standards on communications common carriers. Rather, the Commission monitors quality of service data submitted by incumbent local exchange carriers that are regulated as price-cap carriers. The Commission summarizes these data and publishes a report on quality of service trends annually. [3]

The tables of this report present comparative data on key company performance indicators. These data include several objective indicators of installation, maintenance, switch outage and trunk blocking performance for each reporting company. The tables also present data on customer perception of service and the level of consumer complaints. A number of indicators are charted over time to present a multi-year view. In addition, the Commission uses statistical methods to analyze the data for long term trends and to establish patterns of industry performance. The results of these analyses are also contained in this report.

1.2 Key Findings for 2005

The quality of service report tracks industry performance over time on eight key quality of service indicators: average complaints per million lines, percent of installation commitments met, lengths of installation intervals, lengths of repair intervals, percent of switches with outages, trouble report rate per thousand access lines, percent dissatisfied with installation, and percent dissatisfied with repair.

Since our last report, there have been only small changes in the values of most of these indicators. However, our analysis, which incorporated performance data from the most recent six years, identified statistically significant long term upward or downward trends [4] in a number of industry performance indicators (i.e., with data for large and small companies combined) and in a number of large and small company performance indicators, when these data were analyzed separately. [5] Most of these trends are indicative of long-term improvement. [6] In particular, since 2000:

• the average number of complaints per million lines is decreasing on average 5.1% per year for the industry as a whole, and 16.1% per year for large companies;

• the length of installation intervals is also decreasing on average 4.2% annually for the industry, and 7.4% for large companies;

• the trouble report rate per thousand lines is declining on average 3.7% annually for the large companies; and

• the percent of switches with outages is declining on average 10.9% per year for the industry as a whole, and 13.2% for the large companies.

We also found a few statistically significant long-term trends toward declining service quality. Notably, since 2000:

• the trouble report rate per thousand access lines is increasing on average 6.8% per year for the smaller companies; and

• the length of repair intervals is increasing on average 5.1% per year for the industry as a whole, 4.6% per year for the large companies, and 7.5% per year for the smaller companies.

In spite of the noted significant trends toward decreasing lengths of installation intervals and increasing lengths of repair intervals, we found no significant long-term trends toward increasing or decreasing customer dissatisfaction with installations or repairs for the large companies. (Small companies are not required to report data on customer dissatisfaction with repairs and installations.) Nonetheless, large-company customer dissatisfaction with repairs did increase in 2005 after three relatively stable years. [7]

[1] In May 2006, Sprint spun off its Local Telecommunications Division as an independent entity under the new name Embarq. Sprint filed its 2005 ARMIS data prior to the effective date of that change. In this report, we use the name Sprint to refer to the company prior to the spin-off, and our findings are for that entity, unless otherwise stated.

[2] See Revision of ARMIS Annual Summary Report (FCC Report 43-01), ARMIS USOA Report (FCC Report 43-02), ARMIS Joint Cost Report (FCC Report 43-03), ARMIS Access Report (FCC Report 43-04), ARMIS Service Quality Report (FCC Report 43-05), ARMIS Customer Satisfaction Report (FCC Report 43-06), ARMIS Infrastructure Report (FCC Report 43-07), ARMIS Operating Data Report (FCC Report 43-08), ARMIS Forecast of Investment Usage Report (FCC Report 495A), and ARMIS Actual Usage of Investment Report (FCC Report 495B) for Certain Class A and Tier 1 Telephone Companies, CC Docket No. 86-182, Order, 20 FCC Rcd 19377 (2005).

[3] The last report, which included data for 2004, was released in November 2005. See Industry Analysis and Technology Division, Wireline Competition Bureau, Federal Communications Commission, Quality of Service of Incumbent Local Exchange Carriers (November 2005). That report can be found on the Commission's website at www.fcc.gov/wcb/stats under the file name QUAL04.ZIP. Source data used to prepare this report may be useful for further investigation and can be extracted from the ARMIS 43-05 and 43-06 tables on the online database maintained on the FCC website at www.fcc.gov/wcb/eafs.

[4] A trend is the average (or expected) annual percentage decline or increase in the value of the indicator. Our statistical analysis shows that, for many indicators, the probability that these trends occurred by chance is very small (i.e., less than one chance in one thousand for some indicators, and less than one chance in one hundred for others). In these cases, we say the trend is statistically significant. For further discussion of the statistical techniques employed in this report and detailed results, see infra Section 5.2.
Independent of trend behavior, which considers changes in performance over time, and considering performance alone, we found that, as in 2004, large company performance continued to differ significantly from small company performance on all indicators, except for the length of repair intervals and the trouble report rate per thousand lines. In these two areas, the performance of large and small companies was statistically indistinguishable.

We found that the average length of repair intervals rose from 2004 to 2005, and is now at the highest level in the six-year period covered by this report. In addition, when data were aggregated to the holding company level, the average length of repair intervals increased in 2005 for every large holding company and for most of the seven smaller companies included in this report. Weather-related problems were of particular note in 2005 and may have been a factor in the length of repair intervals for that year. However, the observed increases in average length of repair intervals for 2005 are also directionally consistent with longer term statistically significant upward trends in this indicator, which were identified by our analysis and discussed earlier in this section.

[5] For a list of large and small companies, see infra note 27 and note 28.

[6] Many factors in addition to actual company performance may impact these data. See infra discussion in Section 4. Thus, the causes for trends and other statistical results reported in this document must be considered in a broader context, and cannot be determined by examination of these data alone.

[7] Customer dissatisfaction data are based on company-designed survey methodologies and procedures. These data are collected in the ARMIS 43-06 reports filed only by the larger incumbent local exchange carriers.

2. Report History

At the end of 1983, anticipating AT&T's imminent divestiture of its local operating companies, the Commission directed the Common Carrier Bureau [8] to establish a monitoring program that would provide a basis for detecting adverse trends in Bell operating company network service quality. The Bureau subsequently worked with industry to refine the reporting requirements, ensuring that the data were provided in a uniform format. Initially, the data were filed twice yearly. The data collected for 1989 and 1990 formed the basis for FCC service quality reports published in June 1990 and July 1991, respectively. These reports highlighted five basic service quality measurements collected at that time. [9]

With the implementation of price-cap regulation for certain local exchange carriers, the Commission made several major changes to the service quality monitoring program. These changes first affected data filed for calendar year 1991. First, the Commission expanded the class of companies required to file quality of service data to include non-Bell carriers that elected to be subject to price-cap regulation. [10] These carriers are known collectively as non-mandatory price-cap carriers, and most of them are much smaller than the Bell operating companies. Second, the Commission included service quality reporting in the ARMIS data collection system. [11] Finally, the Commission ordered significant changes to the kinds of data carriers had to report. [12]

Following these developments, the Commission released service quality reports in February 1993, March 1994, and March 1996. In 1996, pursuant to requirements in the Telecommunications Act of 1996, [13] the Commission reduced the frequency of ARMIS data reporting to annual submissions, and in May 1997, clarified relevant definitions. [14] The raw data are now filed in April of each year. The Commission summarizes these data and publishes the quality of service report annually. [15]

[8] As the result of a reorganization in March 2002, the Wireline Competition Bureau now performs Common Carrier Bureau functions described in this report. In this report, references to the Common Carrier Bureau apply to activities prior to the above date.

[9] These were customer satisfaction level, dial tone delay, transmission quality, on time service orders, and percentage of call blocking due to equipment failure.

[10] Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Second Report and Order, 5 FCC Rcd 6786, 6827-31 (1990) (LEC Price-Cap Order) (establishing the current service quality monitoring program and incorporating the service quality reports into the ARMIS program), Erratum, 5 FCC Rcd 7664 (1990), modified on recon., 6 FCC Rcd 2637 (1991), aff'd sub nom., Nat'l Rural Telecom Ass'n v. FCC, 988 F.2d 174 (D.C. Cir. 1993). The incumbent local exchange carriers that are rate-of-return regulated are not subject to federal service quality reporting requirements.

[11] LEC Price-Cap Order, 5 FCC Rcd at 6827-30. The ARMIS database includes a variety of mechanized company financial and infrastructure reports in addition to the quality-of-service reports. Most data are available disaggregated to a study area level which generally represents operations within a given state.

[12] Id.; Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 6 FCC Rcd 2974 (1991) (Service Quality Order), recon., 6 FCC Rcd 7482 (1991). Previously the Common Carrier Bureau had collected data on five basic service quality measurements from the Bell operating companies, described earlier.

[13] Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56.

[14] Orders implementing filing frequency and other reporting requirement changes associated with implementation of the Telecommunications Act of 1996 are as follows: Implementation of the Telecommunications Act of 1996: Reform of Filing Requirements and Carrier Classifications, CC Docket No. 96-193, Order and Notice of Proposed Rulemaking, 11 FCC Rcd 11716 (1996); Revision of ARMIS Quarterly Report (FCC Report 43-01) et al., CC Docket No. 96-193, Order, 11 FCC Rcd 22508 (1996); Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 12 FCC Rcd 8115 (1997); Revision of ARMIS Annual Summary Report (FCC Report 43-01) et al., AAD No. 95-91, Order, 12 FCC Rcd 21831 (1997).

[15] These quality of service reports have included data from the mandatory price-cap companies and the largest non-mandatory carriers, GTE and Sprint. GTE is now a part of Verizon, a mandatory price-cap carrier. Beginning with the December 2004 report, the following smaller non-mandatory price-cap companies that file ARMIS 43-05 data are included: Alltel Corp., Century Tel., Cincinnati Bell, Citizens, Citizens Frontier, Iowa Telecom, and Valor Telecommunications. Non-mandatory carriers are not required to file customer satisfaction data that appear in the ARMIS 43-06 report.

3. The Data

3.1 Tables

The data presented in this report summarize the most recent ARMIS 43-05 and 43-06 carrier reports. [16] Included are data from the regional Bell companies, Sprint and all other reporting incumbent local exchange carriers. [17] Tables 1(a) through 1(e) cover data from the regional Bell companies, or mandatory price-cap companies.
Tables 2(a) through 2(c) cover data from the smaller non-mandatory price-cap companies. These companies report quality of service data at a study area level which generally represents operations within a given state. Although reporting companies provide selected company aggregate data, the tables of this report contain summary data that have been recalculated by FCC staff as the composite aggregate of all study areas for each listed entity. This report also includes a fairly extensive summary of data about individual switching outages, including outage durations and numbers of lines affected, for which no company calculated aggregates are provided. Switch outage data have also been aggregated to the company level for inclusion in the tables.

The company-level quality of service data included in Tables 1(a)-1(e) and Tables 2(a)-2(c) are derived by calculating sums or weighted averages of data reported at the study area level. In particular, where companies report study area information in terms of percentages or average time intervals, this report presents company composites that are calculated by weighting the percentage or time interval figures from all study areas within that company. For example, we weight the percent of commitments met by the corresponding number of orders provided in the filed data. [18]

In the case of outage data summarized in Tables 1(b), 1(c), 2(b), and 2(c), we calculate a number of useful statistics from raw data records for individual switches with outages lasting more than two minutes. These statistics include the total number of events lasting more than two minutes, the average outage duration, the average number of outages per hundred switches, the average number of outages per million access lines, and the average outage line-minutes per thousand access lines and per event. Outage line-minutes is a measure that combines both duration and number of lines affected in a single parameter. We derive this parameter from the raw data by multiplying the number of lines involved in each outage by the duration of the outage and summing the resulting values. We then divide the resulting sum by the total number of thousands of access lines, or by the total number of events, to obtain average outage line-minutes per thousand access lines and average outage line-minutes per event, respectively. The tables contained in this report cover data for 2005.

[16] Source data used in preparing this report may be useful for further investigation and can be extracted from the ARMIS 43-05 and 43-06 tables on the online database maintained on the FCC website at www.fcc.gov/wcb/eafs. The data are also available from Best Copy and Printing, Inc. at (202) 488-5300. A number of prior-year data summary reports are available through the FCC's Reference Information Center (Courtyard Level) at 445 12th Street, S.W., Washington, D.C. 20554, and the Wireline Competition Bureau Statistical Reports website at www.fcc.gov/wcb/stats.

[17] In February 1992, United Telecommunications Inc. became Sprint Corporation (Local Division); and in March 1993, Sprint Corporation acquired Centel Corporation. Bell Atlantic and NYNEX merged in August 1997, and then merged with GTE in 2000. Verizon Communications is shown separately for GTE, Verizon North (the former NYNEX companies), and Verizon South (the former Bell Atlantic Companies). Similarly, SBC and Pacific Telesis merged in April 1997, SBC and SNET merged in October 1998, and SBC and Ameritech merged in October 1999. SBC and AT&T then merged at the end of 2005 and the merged company retained the name AT&T. Data from the entities originally known as SBC Southwestern, Ameritech, Pacific Telesis and SNET are shown separately in the charts and tables with the AT&T company name. This report reflects data filed prior to the merger of BellSouth and AT&T.

[18] Although companies file their own company composites, we have recalculated a number of them from study area data for presentation in the tables to assure that company averages are calculated in a consistent manner. We weight data involving percentages or time intervals in order to arrive at consistent composite data shown in the tables. Parameters used for weighting in this report were appropriate for the composite being calculated and were based on the raw data filed by the carriers but are not necessarily shown in the tables. For example, we calculate composite installation interval data by multiplying the average installation interval at the individual study area level by the number of orders in that study area, summing the results for all study areas, and then dividing that sum by the total number of orders.
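To make the weighted-composite and outage line-minute arithmetic described above concrete, the following minimal sketch shows the calculations in Python. It is an illustration only, not the Commission's processing code; the function names and study-area figures are invented for the example.

```python
# Minimal sketch of the weighted-composite and outage line-minute
# calculations described in Section 3.1. All figures below are invented
# sample inputs, not filed ARMIS data.

def weighted_composite(values, weights):
    """Company composite of study-area percentages or intervals,
    weighted by an appropriate volume parameter (e.g., orders)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Two hypothetical study areas: installation intervals weighted by orders.
intervals = [1.2, 2.0]        # average installation interval, in days
orders = [90_000, 10_000]     # orders per study area (the weighting parameter)
print(weighted_composite(intervals, orders))   # 1.28, not the plain mean of 1.6

def outage_line_minutes(outages, access_lines_in_thousands):
    """Outage line-minutes per 1,000 access lines and per event.
    Each outage record is a (lines_affected, duration_in_minutes) pair."""
    total = sum(lines * minutes for lines, minutes in outages)
    return total / access_lines_in_thousands, total / len(outages)

# One 25,000-line switch out for five minutes and ten 2,500-line switches
# out for five minutes each both total 125,000 outage line-minutes, so the
# per-1,000-access-line figure is identical; line-minutes per event still
# reflects the size of the individual events (cf. Section 4.5).
print(outage_line_minutes([(25_000, 5)], 1_000))      # (125.0, 125000.0)
print(outage_line_minutes([(2_500, 5)] * 10, 1_000))  # (125.0, 12500.0)
```

The weighting step matters because a plain average would give a small study area the same influence on the company composite as a large one.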
Tables 1(a) and 2(a) provide installation, maintenance and customer complaint data. The installation and maintenance data are presented separately for local services provided to end users and access services provided to interexchange carriers. Tables 1(b) and 2(b) show switch downtime and trunk servicing data. Tables 1(c) and 2(c) show outage data by cause. Table 1(d) presents the percentages of residential, small business and large business customers indicating dissatisfaction with BOC installations, repairs and business offices, as determined by BOC customer perception surveys. [19] Table 1(e) shows the underlying survey sample sizes.

[19] Customer satisfaction data collected in the 43-06 report and summarized in Tables 1(d) and 1(e) are required to be reported only by the mandatory price-cap carriers.

3.2 Charts

This report displays data elements that have remained roughly comparable over the past few years. Such data are useful in identifying and assessing trends. In addition to the tables, this report contains charts that highlight company trends for the last six years. Unlike the tables, for which the company composites are recalculated, the charts present data taken or derived from company-provided rollup or composite data. [20] Charts 1 through 7 graphically illustrate trends in complaint levels, initial trouble reports, residential installation dissatisfaction, percent of residential installation commitments met, residential installation intervals, residential repair dissatisfaction, and residential initial out-of-service repair intervals, respectively. Chart 8 displays trends among the larger price-cap carriers in the percentage of switches with outages. Data for Sprint, the largest non-mandatory price-cap company at the time it filed its 2005 ARMIS data, are included only in those charts displaying ARMIS 43-05 data that it is required to file. This report charts the performance of the smaller price-cap carriers only on selected quality of service indicators including the trouble report rate per thousand lines, lengths of repair intervals and lengths of installation intervals. These indicators were selected for charting because they are generally less volatile than the others, thus allowing better comparison with similar trended data from the larger companies.
(In the cases where we chart both large and small company performance, the larger companies are tracked on the chart with an 'A' designation, e.g., Chart 7A, while the smaller companies are tracked on the chart with a 'B' designation, e.g., Chart 7B.) Filed data are not available for all of the past six years for several of the smaller companies, which accounts for the truncated trend lines in some of the charts. Since the most current access line counts are used as weighting factors in the calculation of industry composites in the charts, small changes in these composites from year to year may be accounted for by changes in the relative number of company access lines. For example, this accounted for a reduction in composite complaint levels in 2004 of less than one percent.

[20] Calculations to normalize data and derive percentages in charts 1, 2A, 2B and 8 in this year's report were performed directly on company-provided composite data rather than from recalculated composites in the attached tables. Other charts contain data that were taken directly from company-provided composite data.

3.3 For More Information about the Data

More detailed information about the raw data from which this report has been developed may be found on the Commission's ARMIS web page cited earlier. Descriptions of the raw ARMIS 43-05 source data items from which Tables 1(a), 1(b), 1(c), 2(a), 2(b), and 2(c) were prepared can be found in Appendix A of this report. Tables 1(d) and 1(e) were prepared from data filed only by the Bell operating companies in the ARMIS 43-06 report. The statistics presented in Tables 1(d) and 1(e) are straightforward and reflect the data in the format filed. Complete data descriptions are available in several Commission orders. [21]

[21] See supra note 14.

4. Qualifications

Overall, we caution readers to be aware of potential inconsistencies in the service quality data and of methodological shortcomings affecting both the collection and interpretation of the data. Some common sources of issues are described below.

4.1 Data Re-filings

Commission staff generally screen company-filed service quality data for irregularities and provide feedback to reporting companies on suspected problems. The reporting companies are then given an opportunity to re-file. Re-filed data appear in this report if they are received in time to be included in the Commission's recalculation of holding company totals and other data aggregates described in Section 3.1 prior to publication. However, the process of data correction is expected to continue beyond the date of publication of this report, as new problems are identified. Reporting companies frequently re-file data, not only for the current reporting period, but also occasionally for previous reporting periods. Hence, users of the quality of service report data may find some inconsistencies with data extracted from the ARMIS database at a later or earlier date.

4.2 Commission Recalculation of Holding Company Aggregate Statistics

Commission staff do not typically delete or adjust company-filed data for presentation in the quality of service report, except for recalculating holding company totals and other data aggregates as described in Section 3.1. Recalculated aggregates appear in the tables of the quality of service report. These may not match corresponding company-filed totals and composites. [22]
Such inconsistencies are due primarily to differences in the way we and the reporting company derive the data element, for example, in the use of percentages or average intervals that require weighting in the calculations.

[22] Data presented in the charts are company-filed composites, except where noted.

4.3 Company-specific Variations

Users conducting further analysis of the data should be aware that variations in service quality measurements may occur among companies, and even within the same company over time, for reasons other than differences in company performance. For example, data definitions must be properly and consistently interpreted. The Commission has, on occasion, provided clarifications when it became apparent that reporting companies had interpreted reporting requirements inconsistently. [23] Changes in a company's internal data collection procedures or measurement technology may also result in fluctuations in its service quality measurements over time. In some cases, procedural changes in the data measurement and collection process may be subtle enough so that they are not immediately noticeable in the data. However, significant changes in company data collection procedures usually result in noticeable and abrupt changes in the data. [24] It appears that at least some of these changes have not been reported to the Commission. These factors tend to limit the number of years of reliable data available to track service quality trends. Although the Commission has made considerable efforts to standardize data reporting requirements over the years, given the number of changes to the reporting regimes and predictable future changes, one should not assume exact comparability on all measurements for data sets as they are presented year by year. In spite of all of the foregoing, deteriorating or improving service quality trends that persist for more than a year or two usually become obvious and can provide a critical record for state regulators and others.

[23] For example, because of data problems resulting from the various classifications of trouble reports, the Commission addressed problems relating to subtleties in the definitions associated with the terms "initial" and "repeat" trouble reports. See Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 12 FCC Rcd 8115, 8133, para. 40 (1997); Policy and Rules Concerning Rates for Dominant Carriers, AAD No. 92-47, Memorandum Opinion and Order, 8 FCC Rcd 7474, 7478, para. 26, 7487-7549, Attachment (1993); Revision of ARMIS Annual Summary Report (FCC Report 43-01) et al., AAD 95-91, Order, 12 FCC Rcd 21831, 21835, para. 10 (1997) (introducing reporting of "subsequent" troubles). This issue was discussed at greater length in a prior summary report. See Industry Analysis Division, Common Carrier Bureau, Federal Communications Commission, Quality of Service for the Local Operating Companies Aggregated to the Holding Company Level (March 1996).

4.4 Trend Analysis and Data Volatility

Because measurements of any particular quality of service indicator may fluctuate over time, trend analysis can be an effective tool in helping to evaluate longer-term company and industry performance. Consideration of trends may also provide insight into typical lead times that might be needed to correct certain problems once they have been identified.
In addition, adverse trends in complaint levels of significant duration, when identified, can serve as warning indicators of problems not captured by the more specific objective measurements. For these reasons we identify statistically significant trends in the data. Identification of such trends assists in evaluating the significance of year-to-year changes in the data. With respect to individual measures of company performance, it is our experience that service reliability data and, to a lesser extent, customer satisfaction data are, by their nature, subject to greater volatility than other types of company data. For these measures in particular, data interpretation must consider longer term trends and account for filing intervals and lag times in data preparation and filing.

4.5 Interpretation of Outage Statistics

Statistics describing the impact of outages should be considered in context. For example, a statistic representing the average number of lines affected per event would tend to favor a company with a larger number of smaller or remote switches and lower line counts per switch, while a statistic representing the average outage duration might favor a company with a few larger switches. Thus, using the average number of lines per event measurement, one 25,000 line switch that is out of service for five minutes would appear to have a greater service impact than ten 2,500 line switches that are each out of service for five minutes. To provide a consistent basis for comparison of performance of companies having different switch size characteristics, we present a grouping of outage statistics that can capture the impact of both the number of lines affected and the duration of the outage. These statistics include outage line-minutes per event and per 1,000 access lines. (Measured in total outage line-minutes, the two scenarios above are equivalent: each amounts to 125,000 line-minutes.)

[24] For example, SBC (now AT&T) reported changes for 2003 in its complaint data which were designed to normalize disparate reporting methodologies in its Ameritech region. Resulting declines in complaint levels are at least partially attributable to these changes, which involved elimination of several complaint data reporting subcategories previously included by Ameritech. At our request, the company restated 2002 data for Ameritech to conform to new procedures that were introduced for the 2003 data collection and reporting. The restated Ameritech data were not formally filed as a revision but would have shown 43.9 residential complaints per million residential lines and 15.9 business complaints per million business lines. This would have resulted in an average of 29.9 complaints per million lines instead of the 213.4 complaints per million lines shown for the year 2002 in Chart 1. Although improvement in 2003 is still indicated, the improvement appears to be more modest if we assume that the procedural change took place in 2002 instead of 2003.

4.6 External Factors

We note that external factors, including economic conditions, natural disasters, the level of competitive activity, and changes in regulation, have the potential to affect the quality of service available in specific regions of the country or in the industry as a whole, and these effects may be manifested in the quality of service data. [25] The Commission does not currently consider these effects in its analysis.

[25] For example, the outage statistics for BellSouth, found in Table 1(b), which are significantly higher than those of the other companies, appear to illustrate the impact of Hurricane Katrina in 2005. Also, footnotes contained in the filed 2005 data suggest that actions of the California Public Utilities Commission to clear a complaint backlog may have affected complaint levels in that state.

5. Observations and Statistical Analysis

5.1 Observations from the Current Year Summary Data

Charts 1 to 8 track service quality summary data for the large and small price-cap carriers for the last six years.
In 2005, repair intervals increased for every large holding company, with the average initial out-of-service repair interval climbing from 26.7 hours in 2004 to 31.3 hours in 2005. This is the highest level in the six-year period and continues an overall upward trend. [26] Concurrently, the weighted average residential customer dissatisfaction associated with repairs by large companies increased slightly from 12.3 to 13.0 percent dissatisfied. This follows three years with very little change. The length of repair intervals also increased for most of the seven smaller companies, with the average interval increasing to 19.2 hours in 2005. By way of contrast, the average length of the residential installation interval remained near its 2004 level for both large and small carriers, and dissatisfaction with large company residential installations also remained near its 2004 level. Small increases were observed in the number of initial trouble reports per thousand lines for both small and large price-cap carriers from 2004 to 2005. In addition, after several years of decline, the weighted average number of complaints per million access lines among the large price-cap carriers increased slightly from 93.4 in 2004 to 102.1 in 2005. However, this number remains well below 258.6 complaints per million access lines, the high point for the six-year period, observed in 2000.

[26] Severe weather-related problems may in fact account for some of the increase in average repair interval lengths in 2005. However, our analysis also indicates the presence of a statistically significant six-year trend toward declining performance for this indicator. See infra Section 5.2.

5.2 Statistical Analysis

The FCC's quality of service report presents graphical analysis of several key indicators of industry and company performance. The graphs in the current report contain data for the most recent six-year period. The indicators currently tracked are complaints per million lines, length of installation intervals, length of repair intervals, percent of installation commitments met, trouble reports per thousand lines, percent installation dissatisfaction, percent repair dissatisfaction and percent of switches with outages. In this year's report we update the results of the statistical analysis of these indicators from raw data samples received from reporting companies. The overall goals of our statistical analysis are to:

• determine if there were any discernible trends in performance as tracked by these indicators across the years,

• determine if reporting companies performed differently from each other,

• determine whether the large reporting companies performed differently or had different trend behavior from small reporting companies, and

• develop models of trends in performance that could be used to predict next year's performance.
For the purpose of our analysis, we classified companies as "large" or "small." This classification is largely the same as that used earlier in creating the charts: the larger companies [27] are tracked on the charts with an 'A' designation (e.g., Chart 2A), and the smaller companies [28] are tracked on the charts with a 'B' designation (e.g., Chart 2B). However, even though Iowa Telecom was classified as a small company in the charts, it was included as a large company for the statistical analysis, since its performance was very close to that of the larger companies.

We used several types of statistical techniques in analyzing the data. These included ANOVA (Analysis of Variance), ANCOVA (Analysis of Covariance) and simple linear regression. They allowed us to analyze small-versus-large company effects, individual company effects, and year effects (i.e., does performance vary from year to year) in the performance data for each of the key indicators. We tested for the existence of overall trends, [29] trends for only the large companies, and trends for only the small companies. If a trend existed, we then determined its direction and magnitude. In addition, the statistical testing allowed us to determine if the trends varied widely across companies, if there were performance differences across companies, and if large company performance differed from small company performance.

The following table summarizes the results of our statistical analysis on data filed by reporting companies since the year 2000, representing the most recent six-year reporting period. [30] (Note that smaller non-mandatory price-cap carriers are not required to file data on all performance indicators. These are designated as "NA" in the table.)

[27] The larger companies in the charts of this report are BellSouth, Qwest, AT&T Ameritech, AT&T Pacific, AT&T Southwestern, AT&T SNET, Verizon GTE, Verizon North, Verizon South and Sprint.

[28] The smaller companies in the charts of this report are Alltel Corp., Cincinnati Bell, Citizens, Citizens Frontier, Century Tel., Iowa Telecom and Valor.

[29] A trend is the expected annual change in the value of the performance indicator. For example, a negative trend of -5.2% means that every year the value of the indicator is expected to decrease by 5.2%. A positive trend (e.g., +6.3%) means that every year the value of the indicator is expected to increase by 6.3%. The magnitude and direction of the trend for a particular performance indicator is estimated by fitting a linear regression model to the logarithms of the values of that performance indicator for the past six years.

[30] The table is based on individual raw study area samples from the ARMIS database which have not been weighted. The trends calculated from these samples may therefore differ from composite trends calculated as weighted company totals.

The rows of the table contain the key indicators of company performance tracked by this report. The columns contain the effects described above. A "Yes" entry in the table means that we have concluded with a high level of statistical confidence that the effect for which we have tested is indeed present. A "No" entry means that the data did not support such a conclusion. For example, we tested to determine whether large company performance differs from small company performance on the average complaints per million lines indicator, and we concluded with a high degree of statistical confidence that large company performance does differ from small company performance on this indicator.
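Footnote 29's trend estimate can be illustrated with a short regression sketch. This is not the Commission's analysis code: it simply fits a log-linear model to six annual values, using the weighted composite complaint figures from Chart 1 purely as sample input (the table below is based on unweighted study-area samples, so its estimates differ), and converts the fitted slope into an expected annual percentage change with an accompanying p-value.

```python
# Minimal sketch of the log-linear trend estimate described in footnote 29.
# Sample input: weighted BOC/Sprint composite complaints per million lines
# from Chart 1; the report's own table uses unweighted study-area samples.
import math
from scipy import stats

years = [2000, 2001, 2002, 2003, 2004, 2005]
values = [258.6, 166.5, 113.3, 94.3, 93.4, 102.1]

# Fit log(value) = intercept + slope * year; a slope b corresponds to an
# expected annual change of exp(b) - 1.
fit = stats.linregress(years, [math.log(v) for v in values])
annual_change_pct = (math.exp(fit.slope) - 1.0) * 100.0

print(f"estimated trend: {annual_change_pct:+.1f}% per year")  # roughly -17% here
# A two-sided p-value below 0.01 (or 0.001) would mark the trend as
# statistically significant at that level, in the sense used by the table.
print(f"p-value: {fit.pvalue:.4f}")
```

Working on logarithms makes a constant percentage change per year appear as a straight line, which is why the fitted slope translates directly into the annual trend percentages shown in the table.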
We included the direction and magnitude of a trend in the table if our statistical testing indicated that there was a low probability that the trend occurred as a result of random fluctuations in the data. A number of the trends were found significant at less than the 0.001 level, meaning there was less than one chance in 1,000 that these trends occurred as a result of random data fluctuations. However, asterisked trends were found significant at less than the 0.01 level, but not at the 0.001 level, meaning that there was a greater probability (between one chance in 100 and one chance in 1,000) that these trends happened by chance. The word "No" appearing in any of the first three columns of the table indicates that a trend could not be established at the 0.01 level of significance. In the last three columns of the table, the word "Yes" indicates that statistically significant differences were found between companies or groups of companies, and the word "No" indicates that such differences could not be established statistically.

Results of Statistical Testing of Key Industry Performance Indicators

                                        (1) Trend   (2) Trend   (3) Trend   (4) Trends    (5) Performance  (6) Large Co.
                                        Over All    For Large   For Small   Vary Widely   Differences      Performance
Indicator                               Companies   Companies   Companies   Across Cos.   Across Cos.      Differs From Small
Average complaints per million lines    -5.1%*      -16.1%      No          Yes           Yes              Yes
Installation intervals                  -4.2%*      -7.4%       No          Yes           Yes              Yes
Repair intervals                        +5.1%       +4.6%       +7.5%       Yes           Yes              No*
Percent commitments met                 No          No          No          No            No*              Yes
Trouble report rate per 1000 lines      No          -3.7%*      +6.8%       No            No               No
Percent installation dissatisfaction    No          No          NA          Yes           Yes              NA
Percent repair dissatisfaction          No          No          NA          Yes           Yes              NA
Percent switches with outages           -10.9%*     -13.2%      No          Yes           Yes              Yes

All results are significant at less than the 0.001 level except as noted below.
* Indicates a result which was significant at less than the 0.01 level.

As noted earlier, a trend represents the expected or average change in the value of the performance indicator from year to year. Considering columns 1 through 3, we note that our analysis has allowed us to conclude with a high degree of confidence that statistically significant trends do exist in the data for many indicators of performance. Factors other than random data variability are likely to be responsible for these trends. However, what those factors are cannot be determined from our data alone. (Section 4 of this report discusses factors that may impact the data in addition to company performance.) Also, recent observed annual performance changes may not necessarily be in a direction or magnitude consistent with calculated trends of the previous five years. This may occur, for example, when significant underlying events or changes occur. [31]

Considering column 4, we find that trends vary widely across companies, except for the "percent of installation commitments met" indicator and the "trouble reports per 1000 lines" indicator. Column 5 shows that there are significant statistical differences across companies in all performance measures except for the "percent of installation commitments met" and "trouble reports per thousand lines" indicators. Finally, column 6 shows that there is virtually no statistical difference between large and small company performance in the "length of repair interval" and "trouble reports per 1000 lines" entries.

Overall, our analysis shows that there are statistically significant trends for several of the performance measures. These trends are typically indicative of longer-term improvement.
However, the overall upward trend in the length of repair intervals (present when all companies are included in the analysis and when large and small companies are considered separately) and the trend toward higher trouble report rates for the smaller companies provide evidence of longer-term declining performance in these areas. Nonetheless, our analysis finds no statistically significant trends in customer repair dissatisfaction levels, a somewhat surprising result. While reasons for declines or improvements in performance cannot be determined from these data alone, we have found that reported complaint levels exhibit a higher correlation with installation intervals than with repair intervals. In closing, we note that although the highlighted trends reflect longer term patterns in company performance than simply looking at year-over-year changes, their direction in the future may change as companies respond or fail to respond to quality of service issues.

[31] For example, in Chart 2A covering trouble reports per thousand lines for large companies, the data show an increase in the trouble report rate for the last two years, while there is a downward trend in the trouble report rate for the entire period of observation. In another example, although the average number of complaints per million lines indicator shows a statistically significant downward trend since 2000, the average number of complaints rose from 93.4 in 2004 to 102.1 in 2005. This is the first annual increase in the average complaints indicator after four consecutive years of decline.

Chart 1: Relative Complaint Levels -- Large Price-Cap Carriers
Average of Residential and Business Complaints per Million Access Lines (Calculated Using Data from Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
BellSouth                       241.6   192.7   131.5   128.0   131.4   137.7
Qwest                           379.2   203.4   149.2   103.5    89.1    80.8
AT&T Ameritech                  613.2   382.8   213.4    13.2    11.2    12.0
AT&T Pacific                     39.2    19.6    12.5    10.6    10.4    23.3
AT&T Southwestern                28.0    23.9    17.0    13.4    21.9    21.9
AT&T SNET                       326.3   231.6   186.6    87.1    88.5    20.4
Verizon GTE                     106.7    80.1    60.3    79.1   104.8   161.0
Verizon North                   (combined with Verizon South)
Verizon South                   299.4   197.3   151.8   190.7   184.7   191.9
Sprint                          287.9   136.4    75.3    78.9    43.3    46.0
Weighted BOC/Sprint Composite*  258.6   166.5   113.3    94.3    93.4   102.1

*Weighted composite is calculated using access line counts.
[Chart 1 is a line graph of the figures above: complaints per million lines by year, 2000-2005.]

Chart 2A: Initial Trouble Reports per Thousand Lines -- Large Price-Cap Carriers
Total Initial Trouble Reports per Thousand Lines (Residence + Business) (Calculated Using Data from Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
BellSouth                       290.9   300.1   285.0   278.5   298.2   307.3
Qwest                           163.0   131.3   111.4   113.4   117.6   112.6
AT&T Ameritech                  177.5   200.4   171.4   149.7   146.2   144.3
AT&T Pacific                    157.7   146.8   129.0   119.4   116.1   129.4
AT&T Southwestern               212.8   222.0   197.8   175.4   190.5   173.3
AT&T SNET                       194.0   195.6   173.2   180.3   165.8   184.9
Verizon GTE                     177.1   164.5   146.4   153.0   167.2   191.7
Verizon North                   (combined with Verizon South)
Verizon South                   168.3   160.6   151.5   169.4   157.8   164.1
Sprint                          223.7   206.3   165.6   192.2   216.1   221.1
Weighted BOC/Sprint Composite*  194.2   190.8   172.1   172.1   175.7   180.9

*Weighted composite is calculated using access line counts.
[Chart 2A is a line graph of the figures above: trouble reports per thousand lines by year, 2000-2005.]

Chart 2B: Initial Trouble Reports per Thousand Lines -- Small Price-Cap Carriers
Total Initial Trouble Reports per Thousand Lines (Residence + Business) (Calculated Using Data from Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
Alltel Corp.                                            233.5   193.1   128.2
Cincinnati Bell                 136.6   136.0   118.7   114.6   113.6   131.4
Citizens                        313.3   286.0   264.0   260.2   296.0   325.1
Citizens (Frontier)             305.6   252.6   345.8   266.6   257.2   252.4
Century Tel.                                            266.9   265.0   231.1
Iowa Telecom                                    135.9   132.6   157.2   155.4
Valor                                           397.7   368.0   422.6   479.8
Weighted BOC/Sprint Composite*  194.2   190.8   172.1   172.1   175.7   180.9
Weighted Small Co. Composite*   259.6   232.5   258.9   237.6   244.6   245.5

*Weighted composite is calculated using access line counts. (Blank cells indicate years for which filed data are not available; see Section 3.2.)
[Chart 2B is a line graph of the figures above: trouble reports per thousand lines by year, 2000-2005.]

Chart 3: Residential Installation Dissatisfaction -- BOCs
Percent Dissatisfied -- BOC Residential Installations (Using Company Provided Composites)

ARMIS 43-06 Report               2000    2001    2002    2003    2004    2005
BellSouth                        12.8    11.2    10.3     6.7     6.4     5.7
Qwest                             7.4     6.4     7.0     5.5     3.9     3.7
AT&T Ameritech                   16.4    15.5    10.7     8.1     7.6     6.7
AT&T Pacific                     13.5     8.8     6.4     6.1     6.1     6.4
AT&T Southwestern                 6.8     8.0     8.1     7.9     8.4     7.1
AT&T SNET                        18.7     8.3     7.3     7.6     8.6     8.4
Verizon GTE                       4.4     4.8     4.1     3.5     5.3     6.9
Verizon North                   (combined with Verizon South)
Verizon South                     5.2     4.8     5.2     6.2     6.4     6.2
Weighted BOC Composite*           9.3     8.2     7.2     6.3     6.4     6.2

*Weighted composite is calculated using access line counts.
[Chart 3 is a line graph of the figures above: percent dissatisfied by year, 2000-2005.]

Chart 4: Percent Residential Installation Commitments Met -- Large Price-Cap Carriers
Percent Installation Commitments Met -- Residential Services (Using Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
BellSouth                       100.0   100.0   100.0    98.2    98.7    98.7
Qwest                            98.9    99.3    99.5    99.7    99.7    99.6
AT&T Ameritech                   98.9    98.8    99.1    98.9    98.6    98.6
AT&T Pacific                     99.1    99.5    99.6    99.6    99.4    99.2
AT&T Southwestern                98.8    98.8    98.9    99.1    99.0    99.1
AT&T SNET                        98.9   100.0   100.0    99.5    99.6    99.6
Verizon GTE                      96.2    95.5    98.5    98.3    98.4    98.0
Verizon North                   (combined with Verizon South)
Verizon South                    98.5    98.9    98.7    98.7    98.8    98.9
Sprint                           97.7    98.8    98.2    97.5    96.8    97.2
Weighted BOC/Sprint Composite*   98.6    98.8    99.1    98.8    98.8    98.8

*Weighted composite is calculated using access line counts.
[Chart 4 is a line graph of the figures above: percent of commitments met by year, 2000-2005.]
Chart 5A: Residential Installation Intervals -- Large Price-Cap Carriers
Average BOC Residential Installation Interval in Days (Using Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
BellSouth                         1.3     1.2     1.1     1.1     1.1     1.3
Qwest                             1.0     0.6     0.5     0.4     0.3     0.3
AT&T Ameritech                    2.1     2.0     2.1     1.5     1.4     1.4
AT&T Pacific                      1.8     1.3     1.2     1.5     1.6     1.5
AT&T Southwestern                 0.8     2.2     1.8     1.9     2.0     2.1
AT&T SNET                         2.2     1.8     1.0     1.0     1.0     1.0
Verizon GTE                       1.0     0.8     0.6     0.6     0.6     0.9
Verizon North                   (combined with Verizon South)
Verizon South                     1.5     1.1     1.0     1.1     1.1     1.0
Sprint                            3.9     3.2     1.5     1.4     1.7     1.7
Weighted BOC/Sprint Composite*    1.5     1.4     1.2     1.2     1.2     1.2

*Weighted composite is calculated using access line counts.
[Chart 5A is a line graph of the figures above: installation interval in days by year, 2000-2005.]

Chart 5B: Residential Installation Intervals -- Small Price-Cap Carriers
Average Residential Installation Interval in Days (Using Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
Alltel Corp.                                              1.8     1.6     2.6
Cincinnati Bell                   1.9     2.3     1.7     4.5     1.7     2.1
Citizens                          4.8     4.6     4.8     5.3     4.1     2.7
Citizens (Frontier)               5.6     3.5     5.3     4.8     5.1     5.0
Century Tel.                                              3.3     1.6     1.3
Iowa Telecom                                      2.1     1.8     1.9     1.7
Valor                                             3.0     2.0     1.6     2.2
Weighted BOC/Sprint Composite*    1.5     1.4     1.2     1.2     1.2     1.2
Weighted Small Co. Composite*     4.2     3.6     3.8     3.8     2.8     2.7

*Weighted composite is calculated using access line counts. (Blank cells indicate years for which filed data are not available; see Section 3.2.)
[Chart 5B is a line graph of the figures above: installation interval in days by year, 2000-2005.]

Chart 6: Residential Repair Dissatisfaction -- BOCs
Percent Dissatisfied -- BOC Residential Repairs (Using Company Provided Composites)

ARMIS 43-06 Report               2000    2001    2002    2003    2004    2005
BellSouth                        18.8    17.6    14.6    10.1    10.0    10.1
Qwest                             8.0    10.0     9.3     6.5     5.9     6.2
AT&T Ameritech                   26.5    19.2    14.6    11.4    11.0    11.1
AT&T Pacific                     23.6    10.0     7.3     7.6     7.4     8.9
AT&T Southwestern                 9.6    11.7     9.6     9.9    10.4     9.2
AT&T SNET                        18.7    14.2    14.5    11.9    11.6    11.2
Verizon GTE                       9.4    10.1    11.9    11.2    14.0    16.1
Verizon North                   (combined with Verizon South)
Verizon South                    15.0    13.4    15.3    20.8    19.0    20.4
Weighted BOC Composite*          16.3    13.5    12.5    12.6    12.3    13.0

*Weighted composite is calculated using access line counts.
[Chart 6 is a line graph of the figures above: percent dissatisfied by year, 2000-2005.]

Chart 7A: Residential Initial Out-of-Service Repair Intervals -- Large Price-Cap Carriers
Average Initial Out-of-Service Repair Interval in Hours -- Residential Services (Using Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
BellSouth                        23.1    20.8    20.0    21.5    33.5    44.8
Qwest                            19.0    14.1    13.6    14.7    16.3    18.8
AT&T Ameritech                   49.0    22.7    18.9    16.8    17.2    16.3
AT&T Pacific                     42.1    26.8    25.9    25.8    28.8    45.2
AT&T Southwestern                23.2    24.9    21.0    22.1    29.0    24.6
AT&T SNET                        38.2    27.2    27.4    26.7    27.2    30.6
Verizon GTE                      13.0    13.5    15.5    15.7    28.9    28.5
Verizon North                   (combined with Verizon South)
Verizon South                    27.0    22.0    24.1    34.5    29.2    34.3
Sprint                           16.3    13.9    15.2    17.3    22.6    23.8
Weighted BOC/Sprint Composite*   27.8    20.7    20.4    23.3    26.7    31.3

*Weighted composite is calculated using access line counts.
[Chart 7A is a line graph of the figures above: repair interval in hours by year, 2000-2005.]

Chart 7B: Residential Initial Out-of-Service Repair Intervals -- Small Price-Cap Carriers
Average Initial Out-of-Service Repair Interval in Hours -- Residential Services (Using Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
Alltel Corp.                                             25.9    15.4    13.6
Cincinnati Bell                  36.7    49.3    36.1    37.5    28.2    30.3
Citizens                         14.3    14.7    14.4    16.3    16.7    18.1
Citizens (Frontier)              20.7    16.4    17.7    28.1    22.3    17.6
Century Tel.                                             14.9    13.9    16.4
Iowa Telecom                                     11.3    10.1    11.1    11.3
Valor                                            21.8    16.8    17.3    21.1
Weighted BOC/Sprint Composite*   27.8    20.7    20.4    23.3    26.7    31.3
Weighted Small Co. Composite*    22.7    25.3    21.0    23.1    18.9    19.2

*Weighted composite is calculated using access line counts. (Blank cells indicate years for which filed data are not available; see Section 3.2.)
[Chart 7B is a line graph of the figures above: repair interval in hours by year, 2000-2005.]

Chart 8: Percentage of Switches with Downtime -- Large Price-Cap Carriers
Percentage of Switches with Downtime (Calculated Using Data from Company Provided Composites)

ARMIS 43-05 Report               2000    2001    2002    2003    2004    2005
BellSouth                         6.4     5.9     4.2     2.5     1.6     2.3
Qwest                            42.1    36.0    18.8    11.1    20.0    13.7
AT&T Ameritech                    3.7     3.4     4.5     1.5     1.0     0.3
AT&T Pacific                     10.1    15.4     2.3     3.3     3.7     2.3
AT&T Southwestern                12.0    10.3     4.3     3.9     1.5     1.2
AT&T SNET                        28.8    42.3     4.4     0.6     6.2     1.3
Verizon GTE                       2.9     1.6     1.3     2.7     1.5     1.5
Verizon North                   (combined with Verizon South)
Verizon South                     8.6     5.6     2.4     4.4     0.9     0.8
Sprint                           10.2     8.8    10.2     3.5     7.5    13.8
Weighted BOC/Sprint Composite*   11.2    10.2     5.0     3.9     3.7     3.2

*Weighted composite is calculated using access line counts.
[Chart 8 is a line graph of the figures above: percent of switches with downtime by year, 2000-2005.]

Table 1(a): Installation, Maintenance, & Customer Complaints
Mandatory Price-Cap Company Comparison -- 2005
(Columns, in order: BellSouth, Qwest, SBC Ameritech, SBC Pacific, SBC Southwestern, SBC SNET, Verizon North, Verizon South, Verizon GTE)

Access Services Provided to Carriers -- Switched Access
  Percent Installation Commitments Met: 100.0 99.1 99.3 99.1 96.2 98.0 99.9 99.7 91.9
  Average Installation Interval (days): 18.8 14.9 24.9 29.5 22.5 26.4 37.7 16.9 26.7
  Average Repair Interval (hours): 0.6 1.4 8.7 7.0 3.5 2.6 13.6 5.6 14.7
Access Services Provided to Carriers -- Special Access
  Percent Installation Commitments Met: 99.8 97.2 96.3 97.0 97.7 99.3 91.7 91.7 91.3
  Average Installation Interval (days): 15.1 9.5 18.7 17.0 17.8 15.8 22.6 16.7 19.8
  Average Repair Interval (hours): 3.1 3.4 4.3 5.7 3.7 3.6 5.2 3.7 12.1
Local Services Provided to Res. and Business Customers
  Percent Installation Commitments Met: 97.4 99.5 98.5 99.1 99.1 99.6 99.1 98.5 97.7
    Residence: 98.7 99.6 98.6 99.2 99.1 99.6 99.1 98.7 98.0
    Business: 88.5 98.9 98.5 98.3 98.8 99.3 98.5 97.0 94.7
  Average Installation Interval (days): 1.7 0.3 1.4 1.6 2.2 1.5 0.8 1.3 0.9
    Residence: 1.3 0.3 1.4 1.5 2.1 1.0 0.7 1.2 0.7
    Business: 2.0 0.7 1.3 2.2 2.4 3.8 1.2 2.0 2.3
Repair Interval (hours) 42.2 18.2 15.9 43.4 23.5 29.7 29.0 34.1 26.0 Total Residence 44.8 18.8 16.3 45.3 24.5 30.6 31.1 37.8 28.5 Total Business 29.2 16.0 13.6 33.9 17.3 23.6 20.7 15.2 12.9 Initial Trouble Reports per Thousand Lines 307.3 112.6 144.3 129.4 173.3 184.9 189.1 145.2 191.7 Total MSA 302.2 127.8 143.9 128.9 166.7 183.5 181.3 139.9 182.4 Total Non MSA 336.8 39.3 149.0 145.3 206.8 199.4 267.8 216.3 231.4 Total Residence 358.8 132.3 203.5 182.7 227.2 238.8 235.4 196.7 225.5 Total Business 182.6 70.1 58.6 49.6 74.8 73.4 106.3 63.8 111.4 Troubles Found per Thousand Lines 208.4 87.9 109.2 106.2 139.1 110.4 147.5 109.7 152.6 Repeat Troubles as a Pct. of Trouble Rpts. 18.8% 20.2% 15.0% 11.4% 21.2% 17.4% 20.9% 21.8% 16.5% Residential Complaints per Million Res. Access Lines 209.1 124.1 16.6 38.6 36.1 31.3 111.9 514.2 252.0 Business Complaints per Million Business Access Lines 66.1 37.5 7.3 8.0 7.8 9.5 33.0 56.2 70.1 * Please refer to text for notes and data qualifications. Table 1(a): Installation, Maintenance, & Customer Complaints Mandatory Price-Cap Company Comparison -- 2005 BellSouth Qwest SBC SBC SBC SBC Verizon Verizon Verizon Ameritech Pacific Southwestern SNET North South GTE Total Access Lines in Thousands 19,625 12,817 16,050 15,589 13,034 1,941 14,370 19,150 14,131 Total Trunk Groups 2,538 1,531 890 1,204 687 89 741 940 1,539 Total Switches 1,614 1,318 1,439 779 1,639 160 1,285 1,352 2,378 Switches with Downtime Number of Switches 37 181 5 18 19 2 13 7 37 As a percentage of Total Switches 2.3% 13.7% 0.3% 2.3% 1.2% 1.3% 1.0% 0.5% 1.6% Average Switch Downtime in seconds per Switch* For All Events (including events over 2 minutes) 17,195.5 79.2 4.3 0.3 646.7 0.2 39.6 20.0 128.1 For Unscheduled Events Over 2 Minutes 17,193.2 75.1 4.3 NA 646.5 NA 39.6 19.8 127.6 For Unscheduled Downtime More than 2 Minutes Number of Occurrences or Events 29 37 6 0 18 0 13 6 31 Events per Hundred Switches 2 3 0 0 1 0 1 0 1 Events per Million Access Lines 1 3 0 0 1 0 1 0 2 Average Outage Duration in Minutes 15,948 45 17 NA 981 NA 65 74 163.1 Average Lines Affected per Event in Thousands 12.4 7.7 13.6 NA 17.7 NA 8.1 14.7 6.0 Outage Line-Minutes per Event in Thousands 219,618.1 125.5 171.9 NA 678.5 NA 247.6 334.9 486.9 Outage Line-Minutes per 1,000 Access Lines 324,525.7 362.3 64.3 0.0 936.9 0.0 224.0 104.9 1,068.2 For Scheduled Downtime More than 2 Minutes Number of Occurrences or Events 2 8 0 0 0 0 0 0 0 Events per Hundred Switches 0.1 0.6 0.0 0 0.0 0 0.0 0.0 0 Events per Million Access Lines 0.10 0.62 0.00 0 0.00 0 0.00 0.00 0 Average Outage Duration in Minutes 3.5 4.0 NA NA NA NA NA NA NA Avg. Lines Affected per Event in Thousands 29.7 15.1 NA NA NA NA NA NA NA Outage Line-Minutes per Event in Thousands 102.8 58.6 NA NA NA NA NA NA NA Outage Line-Minutes per 1,000 Access Lines 10.5 36.6 0.0 0.0 0.0 0.0 0.0 0.0 0.0 % Trunk Grps. Exceeding Blocking Objectives 2.17% 5.94% 0.22% 4.98% 0.58% 0.00% 1.89% 8.19% 0.00% * Aggregate downtime divided by total number of company switches. Please refer to text for notes and data qualifications. Table 1(b): Switch Downtime & Trunk Blocking Mandatory Price-Cap Company Comparison -- 2005 BellSouth Qwest SBC SBC SBC SBC Verizon Verizon Verizon Ameritech Pacific Southwestern SNET North South GTE Total Number of Outages 1. Scheduled 2 8 0 0 0 0 0 0 0 2. Proced. Errors -- Telco. (Inst./Maint.) 0 0 1 0 0 0 1 0 0 3. Proced. Errors -- Telco. (Other) 0 0 0 0 0 0 0 0 0 4. Procedural Errors -- System Vendors 0 0 0 0 0 0 0 0 1 5. 
Table 1(c): Switch Downtime Causes -- Outages More than 2 Minutes in Duration
Mandatory Price-Cap Company Comparison -- 2005*
(Columns: BellSouth, Qwest, SBC Ameritech, SBC Pacific, SBC Southwestern, SBC SNET, Verizon North, Verizon South, Verizon GTE)

Total Number of Outages
  1. Scheduled:  2  8  0  0  0  0  0  0  0
  2. Proced. Errors -- Telco. (Inst./Maint.):  0  0  1  0  0  0  1  0  0
  3. Proced. Errors -- Telco. (Other):  0  0  0  0  0  0  0  0  0
  4. Procedural Errors -- System Vendors:  0  0  0  0  0  0  0  0  1
  5. Procedural Errors -- Other Vendors:  0  1  0  0  1  0  0  0  0
  6. Software Design:  1  0  1  0  3  0  4  0  8
  7. Hardware Design:  2  0  0  0  0  0  1  1  1
  8. Hardware Failure:  5  30  4  0  13  0  5  3  4
  9. Natural Causes:  14  1  0  0  1  0  0  0  4
  10. Traffic Overload:  0  0  0  0  0  0  0  0  0
  11. Environmental:  0  0  0  0  0  0  0  0  0
  12. External Power Failure:  2  1  0  0  0  0  0  0  7
  13. Massive Line Outage:  0  0  0  0  0  0  0  0  2
  14. Remote:  2  8  0  0  0  0  0  0  0
  15. Other/Unknown:  0  0  0  0  0  0  2  1  0

Total Outage Line-Minutes per Thousand Access Lines
  1. Scheduled:  10.5  36.6  0.0  0.0  0.0  0.0  0.0  0.0  0.0
  2. Proced. Errors -- Telco. (Inst./Maint.):  0.0  0.0  5.1  0.0  0.0  0.0  7.2  0.0  0.0
  3. Proced. Errors -- Telco. (Other):  4.7  12.6  0.0  0.0  0.0  0.0  0.0  0.0  227.9
  4. Procedural Errors -- System Vendors:  0.0  0.0  0.0  0.0  0.0  0.0  0.0  0.0  21.0
  5. Procedural Errors -- Other Vendors:  0.0  29.6  0.0  0.0  0.3  0.0  0.0  0.0  0.0
  6. Software Design:  1  0  16  0  8  0  20  0  323
  7. Hardware Design:  628.9  0.0  0.0  0.0  0.0  0.0  77.9  2.9  3.8
  8. Hardware Failure:  13.5  301.9  43.1  0.0  108.4  0.0  19.2  98.2  36.4
  9. Natural Causes:  323,295.6  13.1  0.0  0.0  820.3  0.0  0.0  0.0  261.9
  10. Traffic Overload:  0.0  0.0  0.0  0.0  0.0  0.0  0.0  0.0  0.0
  11. Environmental:  0  0  0  0  0  0  0  0  0
  12. External Power Failure:  580.4  5.1  0.0  0.0  0.0  0.0  0.0  0.0  171.1
  13. Massive Line Outage:  0.0  0.0  0.0  0.0  0.0  0.0  0.0  0.0  22.8
  14. Remote:  2.1  0.0  0.0  0.0  0.0  0.0  0.0  0.5  0.1
  15. Other/Unknown:  0.0  0.0  0.0  0.0  0.0  0.0  99.5  3.3  0.0

* Please refer to text for notes and data qualifications.

Table 1(d): Company Comparison -- 2005 Customer Perception Surveys*
Mandatory Price-Cap Companies
(Columns: BellSouth, Qwest, SBC Ameritech, SBC Pacific, SBC Southwestern, SBC SNET, Verizon North, Verizon South, Verizon GTE)

Percentage of Customers Dissatisfied
  Installations:
    Residential:  5.74%  3.73%  6.68%  6.44%  7.14%  8.41%  5.49%  7.08%  6.86%
    Small Business:  8.14%  7.13%  9.23%  7.36%  7.27%  8.87%  9.45%  12.07%  11.39%
    Large Business:  5.78%  NA  NA  NA  NA  NA  8.88%  6.62%  7.58%
  Repairs:
    Residential:  10.10%  6.25%  11.10%  8.89%  9.18%  11.16%  18.85%  22.48%  16.14%
    Small Business:  6.91%  6.90%  10.49%  7.19%  7.17%  8.02%  11.78%  12.02%  11.19%
    Large Business:  5.76%  NA  NA  NA  NA  NA  10.92%  8.29%  5.96%
  Business Office:
    Residential:  7.13%  1.62%  7.38%  4.89%  8.23%  8.49%  5.54%  6.64%  8.22%
    Small Business:  9.97%  2.82%  7.20%  4.90%  6.21%  8.41%  5.99%  7.69%  9.13%
    Large Business:  9.52%  NA  NA  NA  NA  NA  15.98%  14.86%  11.65%

* Please refer to text for notes and data qualifications.

Table 1(e): Company Comparison -- 2005 Customer Perception Surveys*
Mandatory Price-Cap Companies
(Columns: BellSouth, Qwest, SBC Ameritech, SBC Pacific, SBC Southwestern, SBC SNET, Verizon North, Verizon South, Verizon GTE)

Sample Sizes -- Customer Perception Surveys
  Installations:
    Residential:  45,440  48,800  10,744  10,760  10,586  4,779  20,399  15,909  17,164
    Small Business:  45,051  24,160  12,672  13,163  12,460  2,265  19,680  15,448  17,566
    Large Business:  9,360  0  0  0  0  0  428  559  396
  Repairs:
    Residential:  30,923  38,335  10,793  11,827  10,693  2,402  20,399  15,375  17,719
    Small Business:  44,335  28,642  13,088  12,945  12,988  1,783  20,151  15,137  17,842
    Large Business:  6,963  0  0  0  0  0  421  507  386
  Business Office:
    Residential:  42,117  45,601  21,453  21,626  21,403  2,955  3,701  9,250  11,472
    Small Business:  10,249  23,926  21,133  20,645  21,300  1,082  1,268  3,447  2,640
    Large Business:  557  0  0  0  0  0  169  471  309

* Please refer to text for notes and data qualifications.
Table 2(a): Installation, Maintenance, & Customer Complaints
Non-Mandatory Price-Cap Company Comparison -- 2005*
(Columns: Alltel, Century Tel., Cincinnati Bell, Citizens, Citizens (Frontier), Iowa Telecom, Sprint, Valor)

Access Services Provided to Carriers -- Switched Access
  Percent Installation Commitments Met:  99.8  91.5  99.9  76.8  98.7  67.2  89.6  85.8
  Average Installation Interval (days):  4.6  16.1  16.0  22.1  24.0  14.3  13.2  25.3
  Average Repair Interval (hours):  3.4  21.2  NA  7.4  3.8  27.6  2.3  3.0

Access Services Provided to Carriers -- Special Access
  Percent Installation Commitments Met:  96.6  91.9  98.2  79.2  89.7  75.7  93.6  89.5
  Average Installation Interval (days):  7.5  17.1  17.4  13.3  20.6  3.9  11.3  17.5
  Average Repair Interval (hours):  3.0  24.6  3.4  15.4  13.6  22.9  4.8  3.1

Local Services Provided to Res. and Business Customers
  Percent Installation Commitments Met:  98.5  94.9  99.5  93.8  99.2  97.8  97.0  97.6
    Residence:  98.8  94.8  99.6  93.8  99.2  97.8  97.2  97.6
    Business:  94.3  95.3  99.0  94.1  99.1  96.5  95.3  97.6
  Average Installation Interval (days):  2.7  1.4  2.4  2.9  5.3  1.8  1.8  2.2
    Residence:  2.6  1.4  2.0  2.7  5.0  1.7  1.6  2.2
    Business:  3.6  1.4  5.0  3.8  6.9  2.6  2.7  2.2
  Avg. Out of Svc. Repair Interval (hours):  13.4  16.4  28.7  18.0  17.3  11.1  23.6  20.4
    Total Residence:  13.6  16.4  29.3  18.1  17.6  11.3  23.8  21.1
    Total Business:  11.3  15.6  18.1  17.4  15.5  8.8  22.2  15.4
  Initial Trouble Reports per Thousand Lines:  128.2  231.1  131.4  325.1  252.4  155.4  221.1  479.8
    Total MSA:  115.6  217.2  131.4  NA  235.8  159.1  200.1  447.2
    Total Non MSA:  140.2  243.8  NA  325.1  269.0  154.4  268.4  506.3
    Total Residence:  181.6  272.5  166.1  358.2  309.2  177.7  270.2  570.8
    Total Business:  41.9  110.5  55.0  223.0  129.4  79.2  99.3  239.6
  Troubles Found per Thousand Lines:  106.9  194.4  122.8  272.8  202.2  140.3  149.2  455.8
  Repeat Troubles as a Pct. of Trouble Rpts.:  15.3%  31.3%  12.0%  20.3%  11.5%  18.1%  22.7%  6.4%
  Residential Complaints per Million Res. Access Lines:  186.6  721.5  292.6  896.5  621.1  16.6  74.8  223.4
  Business Complaints per Million Bus. Access Lines:  33.5  316.0  54.9  179.7  53.9  0.0  17.3  81.1

* Please refer to text for notes and data qualifications.

Table 2(b): Switch Downtime & Trunk Blocking
Non-Mandatory Price-Cap Company Comparison -- 2005
(Columns: Alltel, Century Tel., Cincinnati Bell, Citizens, Citizens (Frontier), Iowa Telecom, Sprint, Valor)

  Total Access Lines in Thousands:  780  583  875  1,249  880  234  7,226  494
  Total Trunk Groups:  93  278  44  248  96  68  499  238
  Total Switches:  243  188  86  205  72  272  1,345  291

Switches with Downtime
  Number of Switches:  53  0  10  13  7  16  185  21
  As a Percentage of Total Switches:  21.8%  0.0%  11.6%  6.3%  9.7%  5.9%  13.8%  7.2%

Average Switch Downtime in Seconds per Switch*
  For All Events (including events over 2 minutes):  2,500.6  0.0  30.1  901.8  1,660.8  483.1  3,299.1  8,919.4
  For Unscheduled Events Over 2 Minutes:  2,500.6  NA  NA  607.0  115.8  483.1  3,201.5  8,919.4

For Unscheduled Downtime More than 2 Minutes
  Number of Occurrences or Events:  24  0  0  14  3  15  179  29
  Events per Hundred Switches:  9.9  0.0  0.0  6.8  4.2  5.5  13.3  10.0
  Events per Million Access Lines:  30.75  0.00  0.00  11.21  3.41  64.16  24.77  58.74
  Average Outage Duration in Minutes:  422.0  NA  NA  148.1  46.3  146.0  400.9  1,491.7
  Average Lines Affected per Event in Thousands:  4.6  NA  NA  2.8  2.8  0.4  6.0  1.3
  Outage Line-Minutes per Event in Thousands:  1,557.9  NA  NA  320.2  71.9  67.7  2,147.6  907.8
  Outage Line-Minutes per 1,000 Access Lines:  47,907.2  0.0  0.0  3,590.0  245.0  4,342.4  53,197.5  53,325.3

For Scheduled Downtime More than 2 Minutes
  Number of Occurrences or Events:  0  0  0  1  0  0  6  0
  Events per Hundred Switches:  0.0  0.0  0.0  0.5  0.0  0.0  0.4  0.0
  Events per Million Access Lines:  0.00  0.00  0.00  0.80  0.00  0.00  0.83  0.00
  Average Outage Duration in Minutes:  NA  NA  NA  6.0  NA  NA  364.7  NA
  Avg. Lines Affected per Event in Thousands:  NA  NA  NA  20.9  NA  NA  7.0  NA
  Outage Line-Minutes per Event in Thousands:  NA  NA  NA  125.6  NA  NA  3,390.8  NA
  Outage Line-Minutes per 1,000 Access Lines:  0.0  0.0  0.0  100.6  0.0  0.0  2,815.4  0.0

  % Trunk Grps. Exceeding Blocking Objectives:  1.08%  22.66%  9.09%  0.00%  0.00%  0.00%  4.01%  0.00%

* Aggregate downtime divided by total number of company switches. Please refer to text for notes and data qualifications.
Table 2(c): Switch Downtime Causes -- Outages More than 2 Minutes in Duration
Non-Mandatory Price-Cap Company Comparison -- 2005*
(Columns: Alltel, Century Tel., Cincinnati Bell, Citizens, Citizens (Frontier), Iowa Telecom, Sprint, Valor)

Total Number of Outages
  1. Scheduled:  0  0  0  1  0  0  6  0
  2. Proced. Errors -- Telco. (Inst./Maint.):  0  0  0  0  0  0  14  3
  3. Proced. Errors -- Telco. (Other):  0  0  0  0  0  0  0  0
  4. Procedural Errors -- System Vendors:  0  0  0  1  0  0  2  0
  5. Procedural Errors -- Other Vendors:  1  0  0  0  0  5  6  0
  6. Software Design:  2  0  0  2  0  0  11  0
  7. Hardware Design:  0  0  0  0  0  0  2  0
  8. Hardware Failure:  10  0  0  3  3  4  35  3
  9. Natural Causes:  1  0  0  0  0  0  16  11
  10. Traffic Overload:  0  0  0  0  0  0  0  0
  11. Environmental:  0  0  0  0  0  0  2  0
  12. External Power Failure:  2  0  0  6  0  0  8  0
  13. Massive Line Outage:  5  0  0  0  0  6  59  0
  14. Remote:  0  0  0  1  0  0  6  0
  15. Other/Unknown:  0  0  0  0  0  0  7  0

Total Outage Line-Minutes per Thousand Access Lines
  1. Scheduled:  0.0  0.0  0.0  100.6  0.0  0.0  2,815.4  0.0
  2. Proced. Errors -- Telco. (Inst./Maint.):  0.0  0.0  0.0  0.0  0.0  0.0  4,855.5  754.4
  3. Proced. Errors -- Telco. (Other):  0.0  0.0  0.0  199.6  0.0  0.0  1,507.2  2,437.8
  4. Procedural Errors -- System Vendors:  0.0  0.0  0.0  142.1  0.0  0.0  2,848.1  0.0
  5. Procedural Errors -- Other Vendors:  42.3  0.0  0.0  0.0  0.0  1,497.2  273.1  0.0
  6. Software Design:  23,714  0  0  62  0  0  3,697  0
  7. Hardware Design:  0.0  0.0  0.0  0.0  0.0  0.0  1,175.1  0.0
  8. Hardware Failure:  10,343.7  0.0  0.0  394.2  245.0  2,200.2  12,691.1  3,638.7
  9. Natural Causes:  244.6  0.0  0.0  0.0  0.0  0.0  4,466.9  46,494.4
  10. Traffic Overload:  0.0  0.0  0.0  0.0  0.0  0.0  0.0  0.0
  11. Environmental:  0  0  0  0  0  0  195  0
  12. External Power Failure:  4,774.2  0.0  0.0  2,791.8  0.0  0.0  1,484.9  0.0
  13. Massive Line Outage:  3,845.4  0.0  0.0  0.0  0.0  645.0  15,643.8  0.0
  14. Remote:  4,943.4  0.0  0.0  0.0  0.0  0.0  984.5  0.0
  15. Other/Unknown:  0.0  0.0  0.0  0.0  0.0  0.0  3,374.8  0.0

* Please refer to text for notes and data qualifications.

Appendix A – Description of Key Terminology in the Tables

This Appendix contains descriptions of key terms that appear in the tables and charts of the Quality of Service Report. The data elements in the tables are derived from raw source data for individual study areas submitted by carriers in the ARMIS 43-05 reports. A detailed specification of each element used in the tables of this summary report follows this general description. Data in the charts are derived from composite data provided by the companies.

1. Percent of Installation Commitments Met

This term represents the percent of installations completed by the date the company promised to the customer. The associated data are presented separately for residential and business customers' local service in the tables. These data are also summarized in the accompanying charts.
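Most entries in Tables 1 and 2, and the composite rows marked with an asterisk in the tables above, are volume-weighted averages of per-study-area or per-company values (see the detailed specifications later in this appendix). The following minimal Python sketch illustrates that aggregation; the function and variable names are illustrative only and are not part of the ARMIS filings.

    # Access-line-weighted average, as used for the composite rows
    # flagged with an asterisk and for the "weighted by" entries in
    # the detailed table specifications. Names are illustrative.
    def weighted_average(values, weights):
        """Average per-company values, each weighted by its
        access-line (or order) count."""
        return sum(v * w for v, w in zip(values, weights)) / sum(weights)

    # Example using 2005 figures from the tables above: residential
    # out-of-service repair intervals (hours) for Cincinnati Bell,
    # Citizens, and Citizens (Frontier), weighted by access lines in
    # thousands from Table 2(b).
    intervals = [30.3, 18.1, 17.6]
    lines = [875, 1249, 880]
    print(round(weighted_average(intervals, lines), 1))  # 21.5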
2. Average Installation Interval (in days)

This term represents the average interval (in days) between the installation service order and the completion of the installation. The associated ARMIS 43-05 report data are highlighted in the accompanying charts along with customer installation dissatisfaction data from the ARMIS 43-06 report.

3. Average Repair Interval (in hours)

This term represents the average time (in hours) for the company to complete repairs of access lines, with service subcategories for switched access, high-speed special access, and all special access. Repair interval data are also highlighted in the accompanying charts along with results from company-conducted surveys relating to customer repair dissatisfaction. This customer feedback is extracted from the ARMIS 43-06 report.

4. Initial Trouble Reports per Thousand Access Lines

This term is calculated as the total count of trouble reports reported as "initial trouble reports," divided by the number of access lines in thousands. (Note that multiple calls within a 30-day period associated with the same problem are counted as a single initial trouble, and that the access line count used in the calculation is the total number of access lines divided by 1,000.)

5. Found or Verified Troubles per Thousand Access Lines

This term is calculated as 1,000 times the number of verified troubles divided by the number of access lines. Only those trouble reports for which the company identified a problem are included.

6. Repeat Troubles as a Percent of Initial Trouble Reports

This term is calculated as the number of initial trouble reports cleared by the company that recur, or remain unresolved, within 30 days of the initial trouble report, divided by the number of initial trouble reports described above.

7. Complaints per Million Access Lines

This term is calculated as 1 million times the number of residential and business customer complaints reported to state or federal regulatory bodies during the reporting period, divided by the number of access lines.

8. Number of Access Lines, Trunk Groups and Switches

These terms represent the numbers of in-service access lines, trunk groups, and switches, respectively, as shown in the ARMIS 43-05 report. Trunk groups include only common trunk groups between Incumbent Local Exchange Carrier (ILEC) access tandems and ILEC end offices. When comparing current data herein with data in prior reports, the reader should note that access lines were reported in thousands in pre-1997 data submissions. Starting with 1997 data submissions, access line information in the raw carrier data filings has been reported in whole numbers.

9. Switches with Downtime

This term represents the number of network switches experiencing downtime and the percentage of the total number of company network switches experiencing downtime.

10. Average Switch Downtime in Seconds per Switch

This term includes (1) the total switch downtime divided by the total number of company network switches and (2) the total switch downtime for outages longer than 2 minutes divided by the total number of switches. Results for average overall switch downtime are shown in seconds per switch.
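As a worked illustration of item 10, the sketch below computes both downtime figures from a list of per-event outage durations. The record layout and names are assumed for illustration and are not the ARMIS schema.

    # Item 10 sketch: total switch downtime in seconds per switch,
    # for all events and for events longer than 2 minutes.
    def avg_downtime_sec_per_switch(event_minutes, total_switches,
                                    min_minutes=0.0):
        """Sum the downtime (in minutes) of events longer than
        min_minutes, convert to seconds, and divide by the total
        number of company switches."""
        total = sum(m for m in event_minutes if m > min_minutes)
        return 60.0 * total / total_switches

    durations = [1.5, 4.0, 45.0]  # hypothetical outage durations (minutes)
    print(avg_downtime_sec_per_switch(durations, 200))       # all events: 15.15
    print(avg_downtime_sec_per_switch(durations, 200, 2.0))  # over 2 minutes: 14.7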
11. Unscheduled Downtime Over 2 Minutes per Occurrence

This term presents several summary statistics, including (1) the number of unscheduled occurrences of more than 2 minutes in duration, (2) the number of occurrences per million access lines, (3) the average number of minutes per occurrence, (4) the average number of lines affected per occurrence, (5) the average number of line-minutes per occurrence in thousands, and (6) the outage line-minutes per access line.

For each outage, the number of lines affected was multiplied by the duration of the outage to provide the line-minutes of outage. The sum of these data represents total outage line-minutes. This total was divided by the total number of access lines to provide line-minutes per access line, and by the number of occurrences to provide line-minutes per occurrence. This normalizes the magnitude of the outages in two ways and provides a realistic means of comparing the impact of such outages across companies. Data are presented for each company showing the number of outages and outage line-minutes by cause.

12. Scheduled Downtime Over 2 Minutes per Occurrence

This term is determined as in item 11, above, except that it covers scheduled occurrences.

13. Percent of Trunk Groups Exceeding Blocking Objectives

This term represents the percentage of trunk groups exceeding the design blocking objectives (typically 0.5 percent for trunk groups that include feature group D and 1.0 percent for other trunk groups) for three or more consecutive months. The trunk groups measured and reported are interexchange access facilities. These represent only a small portion of the total trunk groups in service.
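The consecutive-month screen in item 13 can be made concrete with a short sketch; the data layout here is assumed for illustration only.

    # Item 13 sketch: a trunk group counts against the measure when its
    # monthly blocking exceeds the design objective for three or more
    # consecutive months.
    def exceeds_objective(monthly_blocking_pct, objective_pct, run=3):
        streak = 0
        for blocking in monthly_blocking_pct:
            streak = streak + 1 if blocking > objective_pct else 0
            if streak >= run:
                return True
        return False

    # Feature group D trunk groups carry a 0.5 percent objective:
    print(exceeds_objective([0.7, 0.6, 0.8, 0.2], 0.5))  # True
    # Other groups carry a 1.0 percent objective; no three-month run here:
    print(exceeds_objective([1.2, 0.4, 1.3, 1.1], 1.0))  # False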
Appendix A – Detailed Quality of Service Report Table Specifications

Report Tables 1(a) and 2(a) (ARMIS 43-05 data)

Access Services Provided to Carriers -- Switched Access
  Percent Installation Commitments Met: row 112 weighted by row 110 (column aa)
  Average Installation Interval (days): row 114 weighted by row 110 (column aa)
  Average Repair Interval (hours): row 121 weighted by row 120 (column aa)

Access Services Provided to Carriers -- Special Access
  Percent Installation Commitments Met: row 112 weighted by row 110 (column ac)
  Average Installation Interval (days): row 114 weighted by row 110 (column ac)
  Average Repair Interval (hours): row 121 weighted by row 120 (column ac)

Local Services Provided to Res. and Business Customers
  Percent Installation Commitments Met: row 132 weighted by row 130 (column aj)
    Residence: row 132 weighted by row 130 (column af)
    Business: row 132 weighted by row 130 (column ai)
  Average Installation Interval (days): row 134 weighted by row 130 (column aj)
    Residence: row 134 weighted by row 130 (column af)
    Business: row 134 weighted by row 130 (column ai)
  Avg. Out of Svc. Repair Interval (hours): row 145 weighted by row 144 (column aj)
    Total Residence: row 145 weighted by row 144 (column af)
    Total Business: row 145 weighted by row 144 (column ai)
  Initial Trouble Reports per Thousand Lines: 1000 * (row 141 column aj) / (row 140 column aj)
    Total MSA: 1000 * (row 141 column ad + column ag) / (row 140 column ad + column ag)
    Total Non MSA: 1000 * (row 141 column ae + column ah) / (row 140 column ae + column ah)
    Total Residence: 1000 * (row 141 column af) / (row 140 column af)
    Total Business: 1000 * (row 141 column ai) / (row 140 column ai)
  Troubles Found per Thousand Lines: 1000 * (row 141 column aj - row 143 column aj) / (row 140 column aj)
  Repeat Troubles as a Pct. of Trouble Rpts.: (row 142 column aj) / (row 141 column aj)
  Residential Complaints per Million Res. Access Lines: 1,000,000 * (row 331 column da + row 332 column da) / (row 330 column da)
  Business Complaints per Million Bus. Access Lines: 1,000,000 * (row 321 column da + row 322 column da) / (row 320 column da)

Report Tables 1(b) and 2(b) (ARMIS 43-05 data)

  Total Access Lines in Thousands: (row 140 column aj) / 1000
  Total Trunk Groups: row 180 column ak
  Total Switches: row 200 column an + row 201 column an

Switches with Downtime
  Number of Switches: row 200 column ao + row 201 column ao
  As a Percentage of Total Switches: (row 200 column ao + row 201 column ao) / (row 200 column an + row 201 column an)

Average Switch Downtime in Seconds per Switch
  For All Events (including events over 2 minutes): 60 * (row 200 column ap + row 201 column ap) / (row 200 column an + row 201 column an)
  For Unscheduled Events Over 2 Minutes: 60 * (unscheduled events * average duration in min.) / (row 200 column an + row 201 column an)

For Unscheduled Downtime More than 2 Minutes (records in rows 220 to 500 where column t > 1)
  Number of Occurrences or Events: E = number of records in rows 220 to 500, excluding rows 320, 321, 322, 330, 331 and 332
  Events per Hundred Switches: 100 * E / (row 200 column an + row 201 column an)
  Events per Million Access Lines: 1,000,000 * E / (row 140 column aj)
  Average Outage Duration in Minutes: (sum of rows 220 to 500 column x) / E
  Average Lines Affected per Event in Thousands: (sum of rows 220 to 500 column v) / E
  Outage Line-Minutes per Event in Thousands: (sum of rows 220 to 500 column x * column v) / E
  Outage Line-Minutes per 1,000 Access Lines: 1000 * (sum of rows 220 to 500 column x * column v) / (row 140 column aj)

For Scheduled Downtime More than 2 Minutes (records in rows 220 to 500 where column t = 1)
  Number of Occurrences or Events: E = number of records in rows 220 to 500, excluding rows 320, 321, 322, 330, 331 and 332
  Events per Hundred Switches: 100 * E / (row 200 column an + row 201 column an)
  Events per Million Access Lines: 1,000,000 * E / (row 140 column aj)
  Average Outage Duration in Minutes: (sum of rows 220 to 500 column x) / E
  Avg. Lines Affected per Event in Thousands: (sum of rows 220 to 500 column v) / E
  Outage Line-Minutes per Event in Thousands: (sum of rows 220 to 500 column x * column v) / E
  Outage Line-Minutes per 1,000 Access Lines: 1000 * (sum of rows 220 to 500 column x * column v) / (row 140 column aj)

  % Trunk Grps. Exceeding Blocking Objectives: (row 189 column ak + row 190 column ak) / (row 180 column ak)

Notes: ARMIS 43-05 database rows 110-121 are contained in database table I; rows 130-170 in table II; rows 180-190 in table III; rows 200-214 in table IV; rows 220-319 in table IVa; rows 320-332 in table V.
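The outage normalizations specified above for Tables 1(b) and 2(b), where column v is lines affected and column x is duration in minutes, reduce to the following minimal sketch. The record layout and units are illustrative, not the ARMIS schema.

    # Outage line-minute normalizations from the Table 1(b)/2(b)
    # specifications: line-minutes per event and per 1,000 access lines.
    def outage_summary(events, access_lines):
        """events: list of (lines_affected, duration_minutes) pairs."""
        total_line_minutes = sum(lines * minutes for lines, minutes in events)
        return {
            "line_minutes_per_event": total_line_minutes / len(events),
            "line_minutes_per_1000_lines": 1000.0 * total_line_minutes / access_lines,
        }

    # Two hypothetical outages in a 500,000-line study area:
    print(outage_summary([(12000, 30), (4000, 75)], 500000))
    # {'line_minutes_per_event': 330000.0, 'line_minutes_per_1000_lines': 1320.0}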
Report Tables 1(c) and 2(c) (ARMIS 43-05 data)

Total Number of Outages: number of records in rows 220 to 500 for each value of column t
  1. Scheduled
  2. Proced. Errors -- Telco. (Inst./Maint.)
  3. Proced. Errors -- Telco. (Other)
  4. Procedural Errors -- System Vendors
  5. Procedural Errors -- Other Vendors
  6. Software Design
  7. Hardware Design
  8. Hardware Failure
  9. Natural Causes
  10. Traffic Overload
  11. Environmental
  12. External Power Failure
  13. Massive Line Outage
  14. Remote
  15. Other/Unknown

Total Outage Line-Minutes per Thousand Access Lines: 1000 * (sum of rows 220 to 500 column v * column x, for each value of column t) / (row 140 column aj)
  (same cause categories 1 through 15 as above)

Notes: ARMIS 43-05 database rows 110-121 are contained in database table I; rows 130-170 in table II; rows 180-190 in table III; rows 200-214 in table IV; rows 220-319 in table IVa; rows 320-332 in table V.

Report Table 1(d) (ARMIS 43-06 data)

Percentage of Customers Dissatisfied
  Installations:
    Residential: row 40 column ac weighted by column ab
    Small Business: row 40 column ae weighted by column ad
    Large Business: row 40 column ag weighted by column af
  Repairs:
    Residential: row 60 column ac weighted by column ab
    Small Business: row 60 column ae weighted by column ad
    Large Business: row 60 column ag weighted by column af
  Business Office:
    Residential: row 80 column ac weighted by column ab
    Small Business: row 80 column ae weighted by column ad
    Large Business: row 80 column ag weighted by column af

Note: ARMIS 43-06 database rows 40-80 are contained in database table I.

Report Table 1(e) (ARMIS 43-06 data)

Sample Sizes -- Customer Perception Surveys
  Installations:
    Residential: sum of row 40 column ab
    Small Business: sum of row 40 column ad
    Large Business: sum of row 40 column af
  Repairs:
    Residential: sum of row 60 column ab
    Small Business: sum of row 60 column ad
    Large Business: sum of row 60 column af
  Business Office:
    Residential: sum of row 80 column ab
    Small Business: sum of row 80 column ad
    Large Business: sum of row 80 column af

Note: ARMIS 43-06 database rows 40-80 are contained in database table I.

Customer Response

Publication: Quality of Service of Incumbent Local Exchange Carriers Report (February 2007)

You can help us provide the best possible information to the public by completing this form and returning it to the Industry Analysis and Technology Division of the FCC's Wireline Competition Bureau.

1. Please check the category that best describes you:
   ____ press
   ____ current telecommunications carrier
   ____ potential telecommunications carrier
   ____ business customer evaluating vendors/service options
   ____ consultant, law firm, lobbyist
   ____ other business customer
   ____ academic/student
   ____ residential customer
   ____ FCC employee
   ____ other federal government employee
   ____ state or local government employee
   ____ Other (please specify)

2. Please rate the report (Excellent / Good / Satisfactory / Poor / No opinion):
   Data accuracy         (_) (_) (_) (_) (_)
   Data presentation     (_) (_) (_) (_) (_)
   Timeliness of data    (_) (_) (_) (_) (_)
   Completeness of data  (_) (_) (_) (_) (_)
   Text clarity          (_) (_) (_) (_) (_)
   Completeness of text  (_) (_) (_) (_) (_)

3. Overall, how do you rate this report (Excellent / Good / Satisfactory / Poor / No opinion)?
   (_) (_) (_) (_) (_)

4. How can this report be improved?

5. May we contact you to discuss possible improvements?
   Name:
   Telephone #:

To discuss this report, contact Jonathan Kraushaar at 202-418-0947.
Fax this response to 202-418-0520, or mail it to FCC/WCB/IATD, Washington, DC 20554.