QUALITY OF SERVICE OF INCUMBENT LOCAL EXCHANGE CARRIERS

FEBRUARY 2008

Industry Analysis and Technology Division
Wireline Competition Bureau
Federal Communications Commission

This report was authored by Jonathan M. Kraushaar of the Industry Analysis and Technology Division of the FCC's Wireline Competition Bureau. The author can be reached at (202) 418-0947; e-mail address: jonathan.kraushaar@fcc.gov; TTY: (202) 418-0484. This report is available for reference in the FCC's Reference Information Center, Courtyard Level, 445 12th Street, S.W. Copies may be purchased by calling Best Copy and Printing, Inc. at (202) 488-5300. The report can be downloaded from the Wireline Competition Bureau Statistical Reports Internet site at http://www.fcc.gov/wcb/stats.

Quality of Service of Incumbent Local Exchange Carriers

1. Executive Summary

1.1 Overview

This report summarizes the Automated Reporting Management Information System (ARMIS) service quality data filed by the regional Bell companies,1 Embarq2 and other price-cap regulated incumbent local exchange carriers for calendar year 2006.3 The data track the quality of service provided to both retail customers (business and residential) and access customers (interexchange carriers).

The Federal Communications Commission (FCC or Commission) does not impose service quality standards on communications common carriers. Rather, the Commission monitors quality of service data submitted by incumbent local exchange carriers that are regulated as price-cap carriers. The Commission summarizes these data and publishes a report on quality of service trends annually.4

The tables of this report present comparative data on key company performance indicators. These data include several objective indicators of installation, maintenance, switch outage and trunk blocking performance for each reporting company. The tables also present data on customer perception of service and the level of consumer complaints. A number of indicators are charted over time to present a multi-year view. In addition, the Commission uses statistical methods to analyze the data for long-term trends and to establish patterns of industry performance. The results of these analyses are also contained in this report.

1 BellSouth merged with AT&T in December 2006. The charts and tables in this report continue to track BellSouth and other regional Bell companies that have recently merged with AT&T as separate entities. This has been done mainly to capture performance differences that may still exist across the former regional Bell companies. This report identifies these entities by placing an "AT&T" in front of the regional company name (e.g., AT&T BellSouth).

2 In May 2006, Sprint spun off its Local Telecommunications Division as an independent entity under the name Embarq. Embarq data are included in the tables and charts in this report.

3 See Revision of ARMIS Annual Summary Report (FCC Report 43-01), ARMIS USOA Report (FCC Report 43-02), ARMIS Joint Cost Report (FCC Report 43-03), ARMIS Access Report (FCC Report 43-04), ARMIS Service Quality Report (FCC Report 43-05), ARMIS Customer Satisfaction Report (FCC Report 43-06), ARMIS Infrastructure Report (FCC Report 43-07), ARMIS Operating Data Report (FCC Report 43-08), ARMIS Forecast of Investment Usage Report (FCC Report 495A), and ARMIS Actual Usage of Investment Report (FCC Report 495B) for Certain Class A and Tier 1 Telephone Companies, CC Docket No. 86-182, Order, 20 FCC Rcd 19377 (2005).
4 The last report, which included data for 2005, was released in February 2007. See Industry Analysis and Technology Division, Wireline Competition Bureau, Federal Communications Commission, Quality of Service of Incumbent Local Exchange Carriers (February 2007). That report can be found on the Commission's website at www.fcc.gov/wcb/stats under the file name QUAL05.ZIP. Source data used to prepare this report may be useful for further investigation and can be extracted from the ARMIS 43-05 and 43-06 tables on the online database maintained on the FCC website at www.fcc.gov/wcb/eafs.

1.2 Key Findings for 2006

The quality of service report tracks industry performance over time on eight key quality of service indicators: average complaints per million lines, percent of installation commitments met, lengths of installation intervals, lengths of repair intervals, percent of switches with outages, trouble report rate per thousand access lines, percent dissatisfied with installation, and percent dissatisfied with repair. Since our last report, there have been only small changes in the values of most of these indicators. However, our analysis, which incorporated service quality data from the most recent six years, identified the presence of statistically significant long-term upward or downward trends in a number of the indicators of industry-wide performance (i.e., with data for large and small companies combined) and in indicators of large and small company performance, when these data were analyzed separately.5 These trends are identified below:

• Trouble reports per thousand lines are increasing on average 2.1% annually for the industry overall and 6.9% annually for the smaller companies.
• Repair intervals are increasing on average 5.5% annually for the industry overall, 6.7% annually for the larger companies, and 4.7% annually for the smaller companies.
• Percentage of switches with outages is decreasing on average 11.1% per year for the industry as a whole and 11.2% per year for the larger companies.
• Percentage of installation commitments met is increasing for small companies on average 0.8% annually.

No statistically significant long-term upward or downward trends were observed in any of the other indicators of large-company, small-company or industry-wide performance. We note, however, that the absence of an industry trend does not exclude the possibility that individual companies have significant performance trends. Indeed, our statistical analysis shows that both trends and performance differ significantly across companies for all the tracked indicators. Charts 1-8 of this report illustrate graphically how individual companies have performed over the last six years relative to other companies in the same size class. In particular, Chart 1, covering the average of business and residential complaints per million access lines, provides a good illustration of apparent long-term differences in performance among the charted companies.6

Considering recent changes, we note that despite the continued statistically significant long-term, industry-wide trend toward increasing repair intervals, the industry average repair interval actually decreased in 2006 for the first time since 2002. However, not all companies reported repair-interval improvements.7 We also found that the indicator that tracks the percent of customers dissatisfied with large-company repairs increased for the second consecutive year,8 and that most of the large companies exhibited increases in this indicator last year. However, our statistical analysis shows no long-term trend toward increasing customer dissatisfaction with large-company repairs. Thus, these recent increases have not persisted long enough or been of sufficient magnitude to have impacted the longer-term statistics.

5 A trend is the average (or expected) annual percentage decline or increase in the value of the indicator. Our statistical analysis shows that, for many indicators, the probability that these trends occurred by chance is very small (i.e., less than one chance in one thousand for some indicators, and less than one chance in one hundred for others). In these cases, we say the trend is statistically significant. For further discussion of the statistical techniques employed in this report and detailed results, see infra Section 5.2. For a list of large and small companies, see infra note 28 and note 29.

6 Current complaint data are also provided with separate business and residential categories in Tables 1(a) and 2(a).

7 See Charts 7A and 7B.

8 The smaller companies covered in this report are not required to file data on customer dissatisfaction with repairs. Customer dissatisfaction data are based on company-designed survey methodologies and procedures. The data are collected in the ARMIS 43-06 reports, filed only by the larger incumbent local exchange carriers.

2. Report History

At the end of 1983, anticipating AT&T's imminent divestiture of its local operating companies, the Commission directed the Common Carrier Bureau9 to establish a monitoring program that would provide a basis for detecting adverse trends in Bell operating company network service quality. The Bureau subsequently worked with industry to refine the reporting requirements, ensuring that the data were provided in a uniform format. Initially, the data were filed twice yearly. The data collected for 1989 and 1990 formed the basis for FCC service quality reports published in June 1990 and July 1991, respectively. These reports highlighted five basic service quality measurements collected at that time.10

With the implementation of price-cap regulation for certain local exchange carriers, the Commission made several major changes to the service quality monitoring program. These changes first affected data filed for calendar year 1991. First, the Commission expanded the class of companies required to file quality of service data to include non-Bell carriers that elected to be subject to price-cap regulation.11 These carriers are known collectively as non-mandatory price-cap carriers, and most of them are much smaller than the Bell operating companies. Second, the Commission included service quality reporting in the ARMIS data collection system.12 Finally, the Commission ordered significant changes to the kinds of data carriers had to report.13 Following these developments, the Commission released service quality reports in February 1993, March 1994, and March 1996.

In 1996, pursuant to requirements in the Telecommunications Act of 1996,14 the Commission reduced the frequency of ARMIS data reporting to annual submissions, and in May 1997, clarified relevant definitions.15 The raw data are now filed in April of each year. The Commission summarizes these data and publishes the quality of service report annually.16

9 As the result of a reorganization in March 2002, the Wireline Competition Bureau now performs the Common Carrier Bureau functions described in this report. In this report, references to the Common Carrier Bureau apply to activities prior to that date.

10 These were customer satisfaction level, dial tone delay, transmission quality, on-time service orders, and percentage of call blocking due to equipment failure.

11 Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Second Report and Order, 5 FCC Rcd 6786, 6827-31 (1990) (LEC Price-Cap Order) (establishing the current service quality monitoring program and incorporating the service quality reports into the ARMIS program), Erratum, 5 FCC Rcd 7664 (1990), modified on recon., 6 FCC Rcd 2637 (1991), aff'd sub nom. Nat'l Rural Telecom Ass'n v. FCC, 988 F.2d 174 (D.C. Cir. 1993). The incumbent local exchange carriers that are rate-of-return regulated are not subject to federal service quality reporting requirements.

12 LEC Price-Cap Order, 5 FCC Rcd at 6827-30. The ARMIS database includes a variety of mechanized company financial and infrastructure reports in addition to the quality-of-service reports. Most data are available disaggregated to a study area level, which generally represents operations within a given state.

13 Id.; Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 6 FCC Rcd 2974 (1991) (Service Quality Order), recon., 6 FCC Rcd 7482 (1991). Previously the Common Carrier Bureau had collected data on five basic service quality measurements from the Bell operating companies, described earlier.

14 Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56.

15 Orders implementing filing frequency and other reporting requirement changes associated with implementation of the Telecommunications Act of 1996 are as follows: Implementation of the Telecommunications Act of 1996: Reform of Filing Requirements and Carrier Classifications, CC Docket No. 96-193, Order and Notice of Proposed Rulemaking, 11 FCC Rcd 11716 (1996); Revision of ARMIS Quarterly Report (FCC Report 43-01) et al., CC Docket No. 96-193, Order, 11 FCC Rcd 22508 (1996); Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 12 FCC Rcd 8115 (1997); Revision of ARMIS Annual Summary Report (FCC Report 43-01) et al., AAD No. 95-91, Order, 12 FCC Rcd 21831 (1997).

16 In the past, the quality of service reports have included data from the mandatory price-cap companies and the largest non-mandatory carriers, GTE and Sprint (now Embarq). GTE is now a part of Verizon, a mandatory price-cap carrier. Beginning with the December 2004 report, the following smaller non-mandatory price-cap companies that file ARMIS 43-05 data are included: Alltel Corp., Century Tel., Cincinnati Bell, Citizens, Citizens Frontier, Iowa Telecom, and Valor Telecommunications. Alltel and Valor are now owned by Windstream Corp. Non-mandatory carriers are not required to file the customer satisfaction data that appear in the ARMIS 43-06 report.

3. The Data

3.1 Tables

The data presented in this report summarize the most recent ARMIS 43-05 and 43-06 carrier reports.17 Included are data from the regional Bell companies, Embarq and all other reporting incumbent local exchange carriers.18 Tables 1(a) through 1(e) cover data from the regional Bell companies, or mandatory price-cap companies. Tables 2(a) through 2(c) cover data from the smaller non-mandatory price-cap companies. These companies report quality of service data at a study area level, which generally represents operations within a given state. Although reporting companies provide selected company aggregate data, the tables of this report contain summary data that have been recalculated by FCC staff as the composite aggregate of all study areas for each listed entity. This report also includes a fairly extensive summary of data about individual switching outages, including outage durations and numbers of lines affected, for which no company-calculated aggregates are provided. Switch outage data have also been aggregated to the company level for inclusion in the tables.

The tables contained in this report cover data for 2006. Tables 1(a) and 2(a) provide installation, maintenance and customer complaint data. The installation and maintenance data are presented separately for local services provided to end users and access services provided to interexchange carriers. Tables 1(b) and 2(b) show switch downtime and trunk servicing data. Tables 1(c) and 2(c) show outage data by cause. Table 1(d) presents the percentages of residential, small business and large business customers indicating dissatisfaction with BOC installations, repairs and business offices, as determined by BOC customer perception surveys.19 Table 1(e) shows the underlying survey sample sizes.

The company-level quality of service data included in Tables 1(a)-1(e) and Tables 2(a)-2(c) are derived by calculating sums or weighted averages of data reported at the study area level. In particular, where companies report study area information in terms of percentages or average time intervals, this report presents company composites that are calculated by weighting the percentage or time interval figures from all study areas within that company. For example, we weight the percent of commitments met by the corresponding number of orders provided in the filed data.20

17 Source data used in preparing this report may be useful for further investigation and can be extracted from the ARMIS 43-05 and 43-06 tables on the online database maintained on the FCC website at www.fcc.gov/wcb/eafs. The data are also available from Best Copy and Printing, Inc. at (202) 488-5300. A number of prior-year data summary reports are available through the FCC's Reference Information Center (Courtyard Level) at 445 12th Street, S.W., Washington, D.C. 20554 and the Wireline Competition Bureau Statistical Reports website at www.fcc.gov/wcb/stats.

18 In February 1992, United Telecommunications Inc. became Sprint Corporation (Local Division); and in March 1993, Sprint Corporation acquired Centel Corporation. Sprint recently spun off its local telephone division as a new entity, Embarq, and that name is now used in the charts and tables in this report. Bell Atlantic and NYNEX merged in August 1997, and then merged with GTE in 2000. Verizon Communications is shown separately for GTE, Verizon North (the former NYNEX companies), and Verizon South (the former Bell Atlantic companies). Similarly, SBC and Pacific Telesis merged in April 1997, SBC and SNET merged in October 1998, and SBC and Ameritech merged in October 1999. SBC and AT&T then merged at the end of 2005 and the merged company retained the name AT&T. In 2006 BellSouth merged with AT&T and again retained the AT&T name. Data from the entities originally known as SBC Southwestern, Ameritech, Pacific Telesis and SNET, as well as BellSouth, are shown separately in the charts and tables with the AT&T company name. In the summaries of smaller companies, Windstream Corp. has acquired Alltel Corp. and Valor Telecommunications. Data for acquired entities are still shown separately in this report, where possible.

19 Customer satisfaction data collected in the 43-06 report and summarized in Tables 1(d) and 1(e) are required to be reported only by the mandatory price-cap carriers.

20 Although companies file their own company composites, we have recalculated a number of them from study area data for presentation in the tables to assure that company averages are calculated in a consistent manner. We weight data involving percentages or time intervals in order to arrive at consistent composite data shown in the tables. Parameters used for weighting in this report were appropriate for the composite being calculated and were based on the raw data filed by the carriers but are not necessarily shown in the tables. For example, we calculate composite installation interval data by multiplying the average installation interval at the individual study area level by the number of orders in that study area, summing the results for all study areas, and then dividing that sum by the total number of orders.
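To make the weighting procedure concrete, the following is a minimal sketch, in Python, of the composite calculation described above and in footnote 20. The study-area figures and order counts are hypothetical illustrations, not filed ARMIS values.

```python
# Minimal sketch of the composite calculation described in Section 3.1 and
# footnote 20. The study-area figures below are hypothetical, not filed data.

def weighted_composite(values, weights):
    """Return the weighted average of study-area values (percentages or
    average intervals), weighted by, e.g., the number of orders."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical study areas for one company.
avg_install_interval_days = [1.5, 0.9, 2.1]      # per study area
pct_commitments_met       = [98.6, 99.2, 97.4]   # per study area
orders                    = [120_000, 45_000, 60_000]

# Multiply each study-area figure by its orders, sum, divide by total orders.
print(round(weighted_composite(avg_install_interval_days, orders), 1))  # 1.5
print(round(weighted_composite(pct_commitments_met, orders), 1))        # 98.4
```

Note that the composite (98.4) differs from the simple average of the three study-area percentages (98.4 here happens to be close, but in general the weighting matters whenever study areas differ in size).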
In the case of outage data summarized in Tables 1(b), 1(c), 2(b), and 2(c), we calculate a number of useful statistics from raw data records for individual switches with outages lasting more than two minutes. These statistics include the total number of events lasting more than two minutes, the average outage duration, the average number of outages per hundred switches, the average number of outages per million access lines, and the average outage line-minutes per thousand access lines and per event. Outage line-minutes is a measure that combines both duration and number of lines affected in a single parameter. We derive this parameter from the raw data by multiplying the number of lines involved in each outage by the duration of the outage and summing the resulting values over all outages. We then divide the resulting sum by the total number of thousands of access lines or of events to obtain average outage line-minutes per access line and average outage line-minutes per event, respectively.
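As an illustration of the line-minute arithmetic just described, here is a short sketch, again with hypothetical outage records rather than filed data:

```python
# Sketch of the outage line-minutes derivation described above. Each record
# holds (lines_affected, duration_minutes) for one outage lasting more than
# two minutes; the records and access-line count are hypothetical.

outages = [(25_000, 5.0), (2_500, 45.0), (11_000, 12.5)]
total_access_lines = 1_500_000

# Multiply lines by duration for each outage and sum over all outages.
total_line_minutes = sum(lines * minutes for lines, minutes in outages)

line_minutes_per_1000_lines = total_line_minutes / (total_access_lines / 1_000)
line_minutes_per_event = total_line_minutes / len(outages)

print(f"{line_minutes_per_1000_lines:.1f} outage line-minutes per 1,000 lines")
print(f"{line_minutes_per_event:.1f} outage line-minutes per event")
```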
3.2 Charts

This report displays data elements that have remained roughly comparable over the past few years. Such data are useful in identifying and assessing trends. In addition to the tables, this report contains charts that highlight company trends for the last six years. Unlike the tables, for which the company composites are recalculated, the data in the charts are presented or derived from company-provided roll-up or composite data.21 Charts 1 through 7 graphically illustrate trends in complaint levels, initial trouble reports, residential installation dissatisfaction, percent of residential installation commitments met, residential installation intervals, residential repair dissatisfaction, and residential initial out-of-service repair intervals, respectively. Chart 8 displays trends among the larger price-cap carriers in the percentage of switches with outages. Data for Embarq (formerly Sprint Local Division, the largest non-mandatory price-cap company) are included only in those charts displaying ARMIS 43-05 data that it is required to file.

This report charts the performance of the smaller price-cap carriers only on selected quality of service indicators, including the trouble report rate per thousand lines, lengths of repair intervals and lengths of installation intervals. These indicators were selected for charting because they are generally less volatile than the others, thus allowing better comparison with similar trended data from the larger companies. (In the cases where we chart both large and small company performance, the larger companies are tracked on the chart with an 'A' designation, e.g., Chart 7A, while the smaller companies are tracked on the chart with a 'B' designation, e.g., Chart 7B.) Filed data are not available for all of the past six years for several of the smaller companies, which accounts for the truncated trend lines in some of the charts.

Since the most current access line counts are used as weighting factors in the calculation of industry composites in the charts, small changes in these composites from year to year may be accounted for by changes in the relative number of company access lines. For example, this accounted for a reduction in composite complaint levels in 2004 of less than one percent.

3.3 For More Information about the Data

More detailed information about the raw data from which this report has been developed may be found on the Commission's ARMIS web page cited earlier. Descriptions of the raw ARMIS 43-05 source data items from which Tables 1(a), 1(b), 1(c), 2(a), 2(b), and 2(c) were prepared can be found in Appendix A of this report. Tables 1(d) and 1(e) were prepared from data filed only by the Bell operating companies in the ARMIS 43-06 report. The statistics presented in Tables 1(d) and 1(e) are straightforward and reflect the data in the format filed. Complete data descriptions are available in several Commission orders.22

21 Calculations to normalize data and derive percentages in Charts 1, 2A, 2B and 8 in this year's report were performed directly on company-provided composite data rather than from recalculated composites in the attached tables. Other charts contain data that were taken directly from company-provided composite data. Graphed composite AT&T data in the charts do not include data for BellSouth (which merged with AT&T at the very end of 2006). BellSouth data are shown separately to facilitate comparisons with prior-year data.

22 See supra note 15.

4. Qualifications

Overall, we caution readers to be aware of potential inconsistencies in the service quality data and methodological shortcomings affecting both the collection and interpretation of the data. Some common sources of issues are described below.

4.1 Data Re-filings

Commission staff generally screen company-filed service quality data for irregularities and provide feedback to reporting companies on suspected problems. The reporting companies are then given an opportunity to re-file. Re-filed data appear in this report if they are received in time to be included in the Commission's recalculation of holding company totals and other data aggregates described in Section 3.1. However, it is expected that the process of data correction continues beyond the date of publication of this report, as new problems are identified. Reporting companies frequently re-file data, not only for the current reporting period, but also occasionally for previous reporting periods. Hence, users of the quality of service report data may find some inconsistencies with data extracted from the ARMIS database at a later or earlier date.
4.2 Commission Recalculation of Holding Company Aggregate Statistics

Commission staff do not typically delete or adjust company-filed data for presentation in the quality of service report, except for recalculating holding company totals and other data aggregates as described in Section 3.1. Recalculated aggregates appear in the tables of the quality of service report. These may not match corresponding company-filed totals and composites.23 Such inconsistencies are due primarily to differences in the way we and the reporting company derive the data element, for example, in the use of percentages or average intervals that require weighting in the calculations.

4.3 Company-specific Variations

Users conducting further analysis of the data should be aware that variations in service quality measurements may occur among companies, and even within the same company over time, for reasons other than differences in company performance. For example, data definitions must be properly and consistently interpreted. The Commission has, on occasion, provided clarifications when it became apparent that reporting companies had interpreted reporting requirements inconsistently.24 Changes in a company's internal data collection procedures or measurement technology may also result in fluctuations in its service quality measurements over time. In some cases, procedural changes in the data measurement and collection process may be subtle enough that they are not immediately noticeable in the data. However, significant changes in company data collection procedures usually result in noticeable and abrupt changes in the data.25 It appears that at least some of these changes have not been reported to the Commission. These factors tend to limit the number of years of reliable data available to track service quality trends. Although the Commission has made considerable efforts to standardize data reporting requirements over the years, given the number of changes to the reporting regimes and predictable future changes, one should not assume exact comparability on all measurements for data sets as they are presented year by year. In spite of all of the foregoing, deteriorating or improving service quality trends that persist for more than a year or two usually become obvious and can provide a critical record for state regulators and others.

23 Data presented in the charts are company-filed composites, except where noted.

24 For example, because of data problems resulting from the various classifications of trouble reports, the Commission addressed problems relating to subtleties in the definitions associated with the terms "initial" and "repeat" trouble reports. See Policy and Rules Concerning Rates for Dominant Carriers, CC Docket No. 87-313, Memorandum Opinion and Order, 12 FCC Rcd 8115, 8133, para. 40 (1997); Policy and Rules Concerning Rates for Dominant Carriers, AAD No. 92-47, Memorandum Opinion and Order, 8 FCC Rcd 7474, 7478, para. 26, 7487-7549, Attachment (1993); Revision of ARMIS Annual Summary Report (FCC Report 43-01) et al., AAD 95-91, Order, 12 FCC Rcd 21831, 21835, para. 10 (1997) (introducing reporting of "subsequent" troubles). This issue was discussed at greater length in a prior summary report. See Industry Analysis Division, Common Carrier Bureau, Federal Communications Commission, Quality of Service for the Local Operating Companies Aggregated to the Holding Company Level (March 1996).
4.4 Trend Analysis and Data Volatility

Because measurements of any particular quality of service indicator may fluctuate over time, trend analysis can be an effective tool in helping to evaluate longer-term company and industry performance. Consideration of trends may also provide insight into typical lead times that might be needed to correct certain problems once they have been identified. In addition, adverse trends in complaint levels of significant duration, when identified, can serve as warning indicators of problems not included in the more specific objective measurements. For these reasons we identify statistically significant trends in the data. Identification of such trends assists in evaluating the significance of year-to-year changes in the data. With respect to individual measures of company performance, it is our experience that in evaluating customer satisfaction data one must consider longer-term trends and take into account the effects of filing intervals and lag times in data preparation and filing.

4.5 Interpretation of Outage Statistics

Statistics describing the impact of outages should be considered in context. For example, a statistic representing the average number of lines affected per event would tend to favor a company with a larger number of smaller or remote switches and lower line counts per switch, while a statistic representing the average outage duration might favor a company with a few larger switches. Thus, using the average number of lines per event measurement, one 25,000-line switch that is out of service for five minutes would appear to have a greater service impact than ten 2,500-line switches that are each out of service for five minutes, even though the aggregate customer impact is the same (see the sketch at the end of this section). To provide a consistent basis for comparison of performance of companies having different switch size characteristics, we present a grouping of outage statistics that can capture the impact of both the number of lines affected and the duration of the outage. These statistics include outage line-minutes per event and per 1,000 access lines.

4.6 External Factors

We note that external factors, including economic conditions and natural disasters, the level of competitive activity, and changes in regulation have the potential to affect the quality of service available in specific regions of the country or in the industry as a whole, and these effects may be manifested in the quality of service data.26 The Commission does not currently consider these effects in its analysis.

25 For example, SBC (now AT&T) reported changes for 2003 in its complaint data which were designed to normalize disparate reporting methodologies in its Ameritech region. Resulting declines in complaint levels are at least partially attributable to these changes, which involved elimination of several complaint data reporting subcategories previously included by Ameritech. At our request, the company restated 2002 data for Ameritech to conform to new procedures that were introduced for the 2003 data collection and reporting. The restated Ameritech data were not formally filed as a revision but would have shown 43.9 residential complaints per million residential lines and 15.9 business complaints per million business lines. This would have resulted in an average of 29.9 complaints per million lines instead of the 213.4 complaints per million lines shown for the year 2002 in Chart 1. Although improvement in 2003 is still indicated, the improvement appears to be more modest if we assume that the procedural change took place in 2002 instead of 2003.

26 For example, the actions of the California Public Utilities Commission to clear a complaint backlog in 2005 may have affected complaint levels in that state.
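The following sketch works through the switch-size scenario from Section 4.5 in Python; the two outage records correspond to the hypothetical switches described in the text.

```python
# Worked version of the Section 4.5 example: one 25,000-line switch out of
# service for five minutes versus ten 2,500-line switches each out for five
# minutes.

one_large = [(25_000, 5.0)]        # (lines affected, duration in minutes)
ten_small = [(2_500, 5.0)] * 10

def avg_lines_per_event(outages):
    return sum(lines for lines, _ in outages) / len(outages)

def total_line_minutes(outages):
    return sum(lines * minutes for lines, minutes in outages)

# Average lines per event makes the single large outage look ten times worse:
print(avg_lines_per_event(one_large), avg_lines_per_event(ten_small))
# 25000.0 2500.0

# Outage line-minutes shows the aggregate customer impact is identical:
print(total_line_minutes(one_large), total_line_minutes(ten_small))
# 125000.0 125000.0
```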
5. Observations and Statistical Analysis

5.1 Observations from the Current Year Summary Data

Charts 1 to 8 track service quality summary data for the large and small price-cap carriers for the last six years. In 2006, the average large-company repair interval decreased for the first time since 2002, and stood at 29.3 hours at the end of the year. The average repair interval for the smaller companies also decreased, from 19.2 hours in 2005 to 17.0 hours in 2006.27 However, the weighted average residential customer dissatisfaction associated with repairs by large companies increased from 13.0 to 13.6 percent dissatisfied. This follows a similar increase in 2005. By way of contrast, the average length of the residential installation interval has not changed since 2002 for the larger companies. After a number of years of decline, the weighted average number of complaints per million access lines among the large price-cap carriers increased for the second consecutive year, from 92.9 in 2004, to 101.6 in 2005, to 119.1 in 2006. However, this number remains well below 166.1 complaints per million access lines, the high point for the six-year period, observed in 2001.

5.2 Statistical Analysis

The FCC's quality of service report tracks several key indicators of industry and company performance. The indicators currently tracked are complaints per million lines, length of installation intervals, length of repair intervals, percent of installation commitments met, trouble reports per thousand lines, percent installation dissatisfaction, percent repair dissatisfaction and percent of switches with outages. In this year's report we update the results of the statistical analysis of these indicators using raw data samples received from reporting companies. The overall goals of our statistical analysis are to:

• determine if there were any discernible trends in performance as tracked by these indicators across the years,
• determine if reporting companies performed differently from each other,
• determine whether the large reporting companies performed differently or had different trend behavior from small reporting companies, and
• develop models of trends in performance that could be used to predict next year's performance.

For the purpose of our analysis, we classified companies as "large" or "small." This classification is largely the same as that used earlier in creating the charts -- the larger companies28 are tracked on the charts with an 'A' designation (e.g., Chart 2A), and the smaller companies29 are tracked on the charts with a 'B' designation (e.g., Chart 2B). However, even though Iowa Telecom was classified as a small company in the charts, it was included as a large company for the statistical analysis, since its performance was very close to that of the larger companies.

We used several types of statistical techniques in analyzing the data. These included ANOVA (Analysis of Variance), ANCOVA (Analysis of Covariance) and simple linear regression. They allowed us to analyze small-versus-large company effects, individual company effects, and year effects (i.e., does performance vary from year to year) in the performance data for each of the key indicators.

27 Although the length of the average repair interval decreased in 2006, a number of reporting entities exhibited larger repair intervals in 2006 than in 2005. In addition, our analysis continues to indicate the presence of statistically significant six-year trends toward increasing repair intervals for both large and small companies. See infra Section 5.2.
We tested for the existence of overall trends,30 trends for only the large companies, and trends for only the small companies. If a trend existed, we then determined its direction and magnitude. In addition, the statistical testing allowed us to determine if the trends varied widely across companies, if there were performance differences across companies, and if large company performance differed from small company performance.

The following table summarizes the results of our statistical analysis on data filed by reporting companies since the year 2001, representing the most recent six-year reporting period.31 (Note that smaller non-mandatory price-cap carriers are not required to file data on all performance indicators. These are designated as "NA" in the table.) The rows of the table contain the key indicators of company performance tracked by this report. The columns contain the effects described above. A "Yes" entry in the table means that we have concluded with a high level of statistical confidence that the effect for which we have tested is indeed present. A "No" entry means that the data did not support such a conclusion. For example, we tested to determine whether large company performance differs from small company performance on the average complaints per million lines indicator, and we concluded with statistical confidence that large company performance does differ from small company performance on this indicator.

We included the direction and magnitude of a trend in the table if our statistical testing indicated that there was a low probability the trend occurred as a result of random fluctuations in the data, i.e., was statistically significant. A number of the trends were found significant at less than the 0.001 level, meaning there was less than one chance in 1,000 that these trends occurred as a result of random data fluctuations. However, asterisked trends were found significant at less than the 0.01 level, but not at the 0.001 level, meaning that there was a greater probability -- between one chance in 100 and one chance in 1,000 -- that these trends happened by chance.

28 The larger companies in the charts of this report are AT&T Ameritech, AT&T BellSouth, AT&T Pacific, AT&T SNET, AT&T Southwestern, Embarq, Qwest, Verizon GTE, Verizon North, and Verizon South.

29 The smaller companies in the charts of this report are Alltel Corp., Cincinnati Bell, Citizens, Citizens Frontier, Century Tel., Iowa Telecom and Valor. Alltel and Valor are now owned by Windstream Corp.

30 A trend is the expected annual change in the value of the performance indicator. For example, a negative trend of -5.2% means that every year the value of the indicator is expected to decrease by 5.2%. A positive trend (e.g., +6.3%) means that every year the value of the indicator is expected to increase by 6.3%. The magnitude and direction of the trend for a particular performance indicator is estimated by fitting a linear regression model to the logarithms of the values of that performance indicator for the past six years.

31 The table is based on individual raw study area samples from the ARMIS database which have not been weighted. The trends calculated from these samples may therefore differ from composite trends calculated as weighted company totals.
The word "No" appearing in any of the first three columns of the table indicates that a trend could not be established at the 0.01 level of significance. In the last three columns of the table, the word "Yes" indicates that statistically significant differences were found between companies or groups of companies, and the word "No" indicates that such differences could not be established statistically.

Results of Statistical Testing of Key Industry Performance Indicators

                                       Trend Over  Trend For   Trend For   Trends Vary     Performance     Large Company
                                       All         Large       Small       Widely Across   Differences     Performance
                                       Companies   Companies   Companies   Companies       Across Cos.     Differs From Small
Average complaints per million lines   No          No          No          Yes             Yes             Yes
Installation intervals                 No          No          No          Yes             Yes             Yes
Repair intervals                       +5.5%       +6.7%       +4.7%*      Yes             Yes             No
Percent commitments met                No          No          +0.8%*      Yes             Yes             Yes
Trouble report rate per 1000 lines     +2.1%*      No          +6.9%       Yes             Yes             No
Percent installation dissatisfaction   No          No          NA          Yes             Yes             NA
Percent repair dissatisfaction         No          No          NA          Yes             Yes             NA
Percent switches with outages          -11.1%      -11.2%      No*         Yes             Yes             Yes

All results are significant at less than the 0.001 level except as noted below.
* Indicates a result which was significant at less than the 0.01 level.

As noted earlier, a trend represents the expected or average change in the value of the performance indicator from year to year. Considering columns 1 through 3, we note our analysis has allowed us to conclude with a high degree of confidence that statistically significant trends do exist in the data for some indicators of performance. Factors other than random data variability are likely to be responsible for these trends. However, what those factors are cannot be determined from our data alone. (Section 4 of this report discusses factors that may impact the data in addition to company performance.) Also, recent observed annual performance changes may not necessarily be in a direction or magnitude consistent with calculated trends of the previous five years. This may occur, for example, when significant underlying events or changes occur.32

Considering column 4 in the above table, we find that trends vary widely across companies. Column 5 shows that there are statistically significant differences in performance across companies in all performance areas tracked in this report. Finally, column 6 shows that our analysis of disaggregated study area data found that large company performance is statistically indistinguishable from small company performance in the lengths of their repair intervals and in the number of trouble reports per thousand lines.

Overall, our analysis shows that there are statistically significant trends over the most recent six-year period for some of the performance measures. Trends in switching outage data continue to exhibit long-term improvement. However, the upward trends in the length of repair intervals and in the number of initial trouble reports per thousand lines provide evidence of longer-term declining performance in these areas. Our analysis finds no statistically significant trends in customer repair dissatisfaction levels. This could change if repair dissatisfaction continues to increase as it has over the past two years. Similarly, customer complaint levels have risen for two years, and this more recent data has replaced declining complaint levels from 1999 and 2000 (shown in prior reports) in our trend calculations. The net effect has been to end the long-term trend toward declining complaint levels observed in our most recent previous report. In closing, we note that although the highlighted trends reflect longer-term patterns in company performance than simply looking at year-over-year changes, their direction in the future may change as companies respond or fail to respond to quality of service issues.

32 For example, Chart 7A, covering repair interval data, shows the average repair interval increasing every year between 2002 and 2005, but in 2006 the average repair interval decreased. Our statistical analysis shows differences among companies and a significant long-term trend toward increasing repair intervals in spite of the data point from 2006. The statistically significant upward repair interval trend result will only change if this new trend continues.
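To illustrate how the trend figures in the table are produced, here is a minimal sketch of the log-linear regression described in footnote 30. For illustration it fits the large-company composite repair intervals from Chart 7A; the report's own estimates use unweighted study-area samples (see footnote 31), so the resulting trend will differ from the +5.5% industry figure shown in the table.

```python
# Sketch of the trend estimation described in footnote 30: regress the
# logarithms of six annual values on the year, convert the slope to an
# average annual percentage change, and compare the p-value against the
# 0.01 and 0.001 significance levels used in the table above.

import math
from scipy.stats import linregress

years = [2001, 2002, 2003, 2004, 2005, 2006]
repair_interval_hours = [20.7, 20.4, 23.3, 26.7, 31.3, 29.3]  # Chart 7A composite

fit = linregress(years, [math.log(v) for v in repair_interval_hours])
annual_pct_change = (math.exp(fit.slope) - 1) * 100

print(f"trend: {annual_pct_change:+.1f}% per year (p = {fit.pvalue:.4f})")
if fit.pvalue < 0.001:
    print("statistically significant at the 0.001 level")
elif fit.pvalue < 0.01:
    print("significant at the 0.01 level (asterisked in the table)")
else:
    print("no statistically significant trend at the levels used here")
```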
Similarly, customer complaint levels have risen for two years, and this more recent data has replaced declining complaint levels from 1999 and 2000 (shown in prior reports) in our trend calculations. The net effect has been to end the long-term trend toward declining complaint levels observed in our most recent previous report. In closing, we note that although the highlighted trends reflect longer term patterns in company performance than simply looking at year over year changes, their direction in the future may change as companies respond or fail to respond to quality of service issues. between 2002 and 2005, but in 2006 the average repair interval decreased. Our statistical analysis shows differences among companies and a significant long-term trend toward increasing repair intervals in spite of the data point from 2006. The statistically significant upward repair interval trend result will only change if this new trend continues. ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 AT&T Ameritech 382.8 213.4 13.2 11.2 12.0 8.3 AT&T BellSouth 192.7 131.5 128.0 131.4 137.7 119.1 AT&T Pacific 19.6 12.5 10.6 10.4 23.3 42.1 AT&T Southwestern 23.9 17.0 13.4 21.9 21.9 14.9 AT&T SNET 231.6 186.6 87.1 88.5 20.4 21.1 Qwest 203.4 149.2 103.5 89.1 80.8 69.3 Verizon GTE 80.1 60.3 79.1 104.8 161.0 171.2 Verizon North (Combined with Verizon South) Verizon South 197.3 151.8 190.7 184.7 191.9 266.7 Embarq (formerly Sprint) 136.4 75.3 78.9 43.3 46.0 60.6 Weighted BOC/Embarq Composite* 166.1 112.9 93.9 92.9 101.6 119.1 *Weighted composite is calculated using access line counts. Chart 1 Average of Residential and Business Complaints per Million Access Lines (Calculated Using Data from Company Provided Composites) Relative Complaint Levels Large Price-Cap Carriers 0.0 50.0 100.0 150.0 200.0 250.0 2001 2002 2003 2004 2005 2006 Years Co mpla ints per Millio n Lines Weighted BOC/Embarq Composite* Weighted Verizon Avg. BellSouth Weighted AT&T Avg. (excluding BellSouth) Qwest Embarq (formerly Sprint) 13 ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 AT&T Ameritech 200.4 171.4 149.7 146.2 144.3 153.8 AT&T BellSouth 300.1 285.0 278.5 298.2 307.3 265.8 AT&T Pacific 146.8 129.0 119.4 116.1 129.4 101.7 AT&T Southwestern 222.0 197.8 175.4 190.5 173.3 179.8 AT&T SNET 195.6 173.2 180.3 165.8 184.9 176.1 Qwest 131.3 111.4 113.4 117.6 112.6 111.3 Verizon GTE 164.5 146.4 153.0 167.2 191.7 176.7 Verizon North (Combined with Verizon South) Verizon South 160.6 151.5 169.4 157.8 164.1 167.4 Embarq (formerly Sprint) 206.3 165.6 192.2 216.1 221.1 220.1 Weighted BOC/Embarq Composite* 190.9 172.1 172.1 175.8 181.0 172.3 * Weighted composite is calculated using access line counts. Total Initial Trouble Reports per Thousand Lines (Residence + Business) (Calculated Using Data from Company Provided Composites) Chart 2A Initial Trouble Reports per Thousand Lines Large Price-Cap Carriers 0.0 50.0 100.0 150.0 200.0 250.0 300.0 350.0 2001 2002 2003 2004 2005 2006 Years Number o f Repo rts Weighted Verizon Avg. AT&T BellSouth Weighted AT&T Avg.(excluding Bellsouth) Qwest Embarq (formerly Sprint) Weighted BOC/Embarq Composite* 14 ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 Century Tel. 
266.9 265.0 231.1 213.3 Cincinnati Bell 136.0 118.7 114.6 113.6 131.4 119.5 Citizens 286.0 264.0 260.2 296.0 325.1 270.3 Citizens (Frontier) 252.6 345.8 266.6 257.2 252.4 242.7 Iowa Telecom 135.9 132.6 157.2 155.4 161.1 Windstream -- Alltel 233.5 193.1 128.2 206.2 Windstream --Valor 397.7 368.0 422.6 479.8 506.4 Weighted BOC/Embarq Composite* 190.9 172.1 172.1 175.8 181.0 172.3 Weighted Small Co.Composite* 232.8 258.6 237.6 244.7 245.6 241.1 * Weighted composite is calculated using access line counts. Total Initial Trouble Reports per Thousand Lines (Residence + Business) (Calculated Using Data from Company Provided Composites) Chart 2B Initial Trouble Reports per Thousand Lines Small Price-Cap Carriers 0.0 100.0 200.0 300.0 400.0 500.0 600.0 2001 2002 2003 2004 2005 2006 Years Number o f Repo rts Windstream -- Alltel Cincinnati Bell Citizens Citizens (Frontier) Weighted Small Co.Composite* Weighted BOC/Embarq Composite* Century Tel. Iowa Telecom Windstream -- Valor 15 ARMIS 43-06 Report 2001 2002 2003 2004 2005 2006 AT&T Ameritech 15.5 10.7 8.1 7.6 6.7 7.4 AT&T BellSouth 11.2 10.3 6.7 6.4 5.7 6.2 AT&T Pacific 8.8 6.4 6.1 6.1 6.4 6.9 AT&T Southwestern 8.0 8.1 7.9 8.4 7.1 6.6 AT&T SNET 8.3 7.3 7.6 8.6 8.4 8.3 Qwest 6.4 7.0 5.5 3.9 3.7 3.8 Verizon GTE 4.8 4.1 3.5 5.3 6.9 7.3 Verizon North (Combined with Verizon South) Verizon South 4.8 5.2 6.2 6.4 6.2 6.5 Weighted BOC Composite* 8.2 7.2 6.3 6.4 6.2 6.5 *Weighted composite is calculated using access line counts. Chart 3 Percent Dissatisfied --BOC Residential Installations (Using Company Provided Composites) Residential Installation Dissatisfaction BOCs 0.0 2.0 4.0 6.0 8.0 10.0 12.0 2001 2002 2003 2004 2005 2006 Years Percent Dissa tisfied Weighted Verizon Avg AT&T BellSouth Weighted AT&T Avg. (excluding Bellsouth) Qwest Weighted BOC Composite* 16 ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 AT&T Ameritech 98.8 99.1 98.9 98.6 98.6 98.6 AT&T BellSouth 100.0 100.0 98.2 98.7 98.7 98.2 AT&T Pacific 99.5 99.6 99.6 99.4 99.2 99.3 AT&T Southwestern 98.8 98.9 99.1 99.0 99.1 99.3 AT&T SNET 100.0 100.0 99.5 99.6 99.6 99.7 Qwest 99.3 99.5 99.7 99.7 99.6 99.6 Verizon GTE 95.5 98.5 98.3 98.4 98.0 97.9 Verizon North (Combined with Verizon South) Verizon South 98.9 98.7 98.7 98.8 98.9 98.7 Embarq (formerly Sprint) 98.8 98.2 97.5 96.8 97.2 97.0 Weighted BOC/Embarq Composite* 98.8 99.1 98.8 98.8 98.8 98.7 *Weighted composite is calculated using access line counts. Percent Installation Commitments Met -- Residential Services (Using Company Provided Composites) Chart 4 Percent Residential Installation Commitments Met Large Price-Cap Carriers 97.0 97.5 98.0 98.5 99.0 99.5 100.0 2001 2002 2003 2004 2005 2006 Years Percent o f Co mmitments M e t Weighted BOC/Embarq Composite* Weighted Verizon Avg BellSouth Weighted AT&T Avg. (excluding BellSouth) Qwest 17 ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 AT&T Ameritech 2.0 2.1 1.5 1.4 1.4 1.5 AT&T BellSouth 1.2 1.1 1.1 1.1 1.3 1.3 AT&T Pacific 1.3 1.2 1.5 1.6 1.5 1.6 AT&T Southwestern 2.2 1.8 1.9 2.0 2.1 1.1 AT&T SNET 1.8 1.0 1.0 1.0 1.0 1.1 Qwest 0.6 0.5 0.4 0.3 0.3 0.2 Verizon GTE 0.8 0.6 0.6 0.6 0.9 0.9 Verizon North (Combined with Verizon South) Verizon South 1.1 1.0 1.1 1.1 1.0 1.3 Embarq (formerly Sprint) 3.2 1.5 1.4 1.7 1.7 1.8 Weighted BOC/Embarq Composite* 1.4 1.2 1.2 1.2 1.2 1.2 * Weighted composite is calculated using access line counts. 
Chart 5A Average Residential Installation Interval in Days (Using Company Provided Composites) Residential Installation Intervals Large Price-Cap Carriers 0.0 0.5 1.0 1.5 2.0 2.5 3.0 3.5 2001 2002 2003 2004 2005 2006 Years In t e r val i n D ays Weighted BOC/Embarq Composite* Weighted Verizon Avg BellSouth Weighted AT&T Avg. (excluding BellSouth) Qwest Embarq (formerly Sprint) 18 ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 Century Tel. 3.3 1.6 1.3 0.8 Cincinnati Bell 2.3 1.7 4.5 1.7 2.1 2.0 Citizens 4.6 4.8 5.3 4.1 2.7 4.2 Citizens (Frontier) 3.5 5.3 4.8 5.1 5.0 4.1 Iowa Telecom 2.1 1.8 1.9 1.7 1.4 Windstream -- Alltel 1.8 1.6 2.6 2.5 Windstream --Valor 3.0 2.0 1.6 2.2 2.8 Weighted BOC/Embarq Composite* 1.4 1.2 1.2 1.2 1.2 1.2 Weighted Small Co.Composite* 3.6 3.8 3.8 2.8 2.7 2.9 * Weighted composite is calculated using access line counts. Chart 5B Average Residential Installation Interval in Days (Using Company Provided Composites) Residential Installation Intervals Small Price-Cap Carriers 0.0 1.0 2.0 3.0 4.0 5.0 6.0 2001 2002 2003 2004 2005 2006 Years In t e r val i n D ays Weighted BOC/Embarq Composite* Windstream -- Alltel Cincinnati Bell Citizens Citizens (Frontier) Weighted Small Co.Composite* Century Tel. Iowa Telecom Windstream -- Valor 19 Percent Dissatisfied -- BOC Residential Repairs (Using Company Provided Composites) ARMIS 43-06 Report 2001 2002 2003 2004 2005 2006 AT&T Ameritech 19.2 14.6 11.4 11.0 11.1 9.5 AT&T BellSouth 17.6 14.6 10.1 10.0 10.1 9.0 AT&T Pacific 10.0 7.3 7.6 7.4 8.9 10.9 AT&T Southwestern 11.7 9.6 9.9 10.4 9.2 9.5 AT&T SNET 14.2 14.5 11.9 11.6 11.2 13.8 Qwest 10.0 9.3 6.5 5.9 6.2 6.4 Verizon GTE 10.1 11.9 11.2 14.0 16.1 16.4 Verizon North (Combined with Verizon South) Verizon South 13.4 15.3 20.8 19.0 20.4 22.7 Weighted BOC Composite* 13.5 12.5 12.5 12.2 13.0 13.6 * Weighted composite is calculated using access line counts. Chart 6 Residential Repair Dissatisfaction BOCs 0.0 5.0 10.0 15.0 20.0 25.0 2001 2002 2003 2004 2005 2006 Years P ercen t Di ssati sfi e d Weighted BOC Composite* Weighted Verizon Avg AT&T BellSouth Weighted AT&T Avg. (excluding BellSouth) Qwest 20 Average Initial Out-of-Service Repair Interval in Hours -- Residential Services (Using Company Provided Composites) ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 AT&T Ameritech 22.7 18.9 16.8 17.2 16.3 17.3 AT&T BellSouth 20.8 20.0 21.5 33.5 44.8 20.6 AT&T Pacific 26.8 25.9 25.8 28.8 45.2 52.6 AT&T Southwestern 24.9 21.0 22.1 29.0 24.6 22.4 AT&T SNET 27.2 27.4 26.7 27.2 30.6 34.4 Qwest 14.1 13.6 14.7 16.3 18.8 18.3 Verizon GTE 13.5 15.5 15.7 28.9 28.5 24.2 Verizon North (Combined with Verizon South) Verizon South 22.0 24.1 34.5 29.2 34.3 40.5 Embarq (formerly Sprint) 13.9 15.2 17.3 22.6 23.8 18.8 Weighted BOC/Embarq Composite* 20.7 20.4 23.3 26.7 31.3 29.3 * Weighted composite is calculated using access line counts. Chart 7A Residential Initial Out-of-Service Repair Intervals Large Price-Cap Carriers 0.0 5.0 10.0 15.0 20.0 25.0 30.0 35.0 40.0 45.0 50.0 2001 2002 2003 2004 2005 2006 Years Int erval in H o urs Weighted BOC/Embarq Composite* Weighted Verizon Avg AT&T BellSouth Weighted AT&T Avg. (excluding BellSouth) Qwest Embarq (formerly Sprint) 21 Average Initial Out-of-Service Repair Interval in Hours -- Residential Services (Using Company Provided Composites) ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 Century Tel. 
14.9 13.9 16.4 9.5 Cincinnati Bell 49.3 36.1 37.5 28.2 30.3 21.6 Citizens 14.7 14.4 16.3 16.7 18.1 17.7 Citizens (Frontier) 16.4 17.7 28.1 22.3 17.6 17.0 Iowa Telecom 11.3 10.1 11.1 11.3 12.2 Windstream -- Alltel 25.9 15.4 13.6 14.6 Windstream --Valor 21.8 16.8 17.3 21.1 21.9 Weighted BOC/Embarq Composite* 20.7 20.4 23.3 26.7 31.3 29.3 Weighted Small Co.Composite* 25.2 21.0 23.0 18.9 19.2 17.0 * Weighted composite is calculated using access line counts. Chart 7B Residential Initial Out-of-Service Repair Intervals Small Price-Cap Carriers 0.0 10.0 20.0 30.0 40.0 50.0 60.0 2001 2002 2003 2004 2005 2006 Years Int erval in H o urs Weighted BOC/Embarq Composite* Windstream -- Alltel Cincinnati Bell Citizens Century Tel. Weighted Small Co.Composite* Iowa Telecom Windstream -- Valor 22 Percentage of Switches with Downtime (Calculated Using Data from Company Provided Composites) ARMIS 43-05 Report 2001 2002 2003 2004 2005 2006 AT&T Ameritech 3.4 4.5 1.5 1.0 0.3 0.4 AT&T BellSouth 5.9 4.2 2.5 1.6 2.3 0.9 AT&T Pacific 15.4 2.3 3.3 3.7 2.3 1.9 AT&T Southwestern 10.3 4.3 3.9 1.5 1.2 1.5 AT&T SNET 42.3 4.4 0.6 6.2 1.3 9.0 Qwest 36.0 18.8 11.1 20.0 13.7 10.0 Verizon GTE 1.6 1.3 2.7 1.5 1.5 3.2 Verizon North (Combined with Verizon South) Verizon South 5.6 2.4 4.4 0.9 0.8 0.8 Embarq (formerly Sprint) 8.8 10.2 3.5 7.5 13.8 8.3 Weighted BOC/Embarq Composite* 10.2 5.0 4.0 3.8 3.2 2.7 *Weighted composite is calculated using access line counts. Chart 8 Percentage of Switches with Downtime Large Price-Cap Carriers 0.0 5.0 10.0 15.0 20.0 25.0 30.0 35.0 40.0 2001 2002 2003 2004 2005 2006 Years Percent Weighted BOC/Embarq Composite* Weighted Verizon Avg AT&T BellSouth Weighted AT&T Avg. (excluding BellSouth) Qwest Embarq 23 AT&T AT&T AT&T AT&T AT&T Qwest Verizon Verizon Verizon Ameritech BellSouth Pacific Southwestern SNET North South GTE Access Services Provided to Carriers-- Switched Access Percent Installation Commitments Met 99.9 100.0 99.0 98.2 100.0 100.0 99.9 99.8 96.9 Average Installation Interval (days) 23.7 19.1 24.2 25.6 19.4 14.5 29.7 18.6 20.6 Average Repair Interval (hours) 5.8 0.5 7.6 3.4 2.2 1.6 5.1 7.6 8.4 Access Services Provided to Carriers -- Special Access Percent Installation Commitments Met 94.7 99.7 95.1 98.1 99.9 96.8 93.1 94.7 92.3 Average Installation Interval (days) 18.1 14.5 15.4 16.0 18.5 6.0 12.8 12.3 9.0 Average Repair Interval (hours) 5.5 3.1 5.9 4.3 3.6 3.2 4.7 3.7 3.9 Local Services Provided to Res. and Business Customers Percent Installation Commitments Met 98.6 96.9 99.3 99.2 99.7 99.5 98.6 98.6 97.6 Residence 98.6 98.2 99.3 99.3 99.7 99.6 98.7 98.8 97.9 Business 98.6 87.6 99.0 99.0 99.5 98.9 97.9 97.2 95.0 Average Installation Interval (days) 1.5 1.4 1.7 1.2 1.5 0.2 1.2 1.5 0.9 Residence 1.5 1.4 1.6 1.1 1.1 0.2 1.1 1.4 0.9 Business 1.4 1.4 2.2 2.2 2.9 0.4 2.0 2.1 2.1 Avg. Out of Svc. Repair Interval (hours) 16.9 19.4 49.9 21.6 32.1 17.6 33.4 40.5 22.2 Total Residence 17.3 20.6 52.7 22.4 34.4 18.3 36.9 45.6 24.2 Total Business 14.7 13.9 35.0 17.5 18.1 14.5 19.1 14.5 12.3 Initial Trouble Reports per Thousand Lines 153.8 265.8 101.7 179.8 176.1 111.3 188.8 151.8 176.7 Total MSA 153.4 259.2 100.1 176.4 173.9 125.3 189.9 145.2 170.7 Total Non MSA 157.5 303.8 146.7 195.8 198.7 44.7 178.8 237.3 201.9 Total Residence 217.7 309.4 144.9 218.1 230.5 134.6 241.8 211.1 210.2 Total Business 63.3 166.7 39.1 93.2 70.9 63.7 101.4 64.6 101.3 Troubles Found per Thousand Lines 122.6 183.9 81.1 143.5 120.4 90.0 149.1 117.3 141.5 Repeat Troubles as a Pct. of Trouble Rpts. 
14.6% 17.7% 9.5% 14.3% 13.8% 19.8% 21.1% 22.2% 16.2% Residential Complaints per Million Res. Access Lines 13.4 168.1 75.1 24.2 35.7 100.9 155.5 734.5 280.6 Business Complaints per Million Business Access Lines 3.3 70.1 9.1 5.6 6.6 37.7 33.3 60.1 61.7 * Please refer to text for notes and data qualifications. Table 1(a): Installation, Maintenance, & Customer Complaints Mandatory Price-Cap Company Comparison -- 2006 AT&T AT&T AT&T AT&T AT&T Qwest Verizon Verizon Verizon Ameritech BellSouth Pacific Southwestern SNET North South GTE Total Access Lines in Thousands 14,820 18,429 14,767 12,199 1,787 12,082 12,958 17,776 13,187 Total Trunk Groups 818 2,462 1,159 636 90 2,462 782 980 1,511 Total Switches 1,438 1,615 778 1,635 167 1,310 1,298 1,351 2,409 Switches with Downtime Number of Switches 6 15 15 24 15 131 7 15 78 As a percentage of Total Switches 0.4% 0.9% 1.9% 1.5% 9.0% 10.0% 0.5% 1.1% 3.2% Average Switch Downtime in seconds per Switch* For All Events (including events over 2 minutes) 1.7 11.9 3.4 4.2 59.6 87.7 91.4 5.0 469.8 For Unscheduled Events Over 2 Minutes 1.6 11.5 2.5 3.7 58.3 81.9 91.4 4.8 462.1 For Unscheduled Downtime More than 2 Minutes Number of Occurrences or Events 7.0 8.0 6.0 3.0 2.0 19.0 6.0 9.0 71.0 Events per Hundred Switches 0.5 0.5 0.8 0.2 1.2 1.5 0.5 0.7 2.9 Events per Million Access Lines 0.5 0.4 0.4 0.2 1.1 1.6 0.5 0.5 5.4 Average Outage Duration in Minutes 5.4 38.7 5.3 33.7 81.1 94.2 329.6 11.9 261.3 Average Lines Affected per Event in Thousands 16.4 14.5 39.0 5.3 6.9 1.7 9.4 10.8 1.7 Outage Line-Minutes per Event in Thousands 73.4 87.7 215.0 40.9 550.8 81.8 779.8 165.6 309.9 Outage Line-Minutes per 1,000 Access Lines 34.7 38.1 87.3 10.1 616.6 128.6 361.1 83.9 1668.7 For Scheduled Downtime More than 2 Minutes Number of Occurrences or Events 0.0 1.0 1.0 0.0 0.0 3.0 0.0 0.0 1.0 Events per Hundred Switches 0.0 0.1 0.1 0.0 0.0 0.2 0.0 0.0 0.0 Events per Million Access Lines 0.0 0.1 0.1 0.0 0.0 0.2 0.0 0.0 0.1 Average Outage Duration in Minutes NA 5.0 2.2 NA NA 4.3 NA NA 3.4 Avg. Lines Affected per Event in Thousands NA 5.7 83.7 NA NA 8.5 NA NA 7.2 Outage Line-Minutes per Event in Thousands NA 28.4 184.0 NA NA 30.0 NA NA 24.3 Outage Line-Minutes per 1,000 Access Lines 0.0 1.5 12.5 0.0 0.0 7.4 0.0 0.0 1.8 % Trunk Grps. Exceeding Blocking Objectives 0.0% 4.8% 1.8% 0.3% 1.1% 4.8% 2.4% 3.1% 0.9% * Aggregate downtime divided by total number of company switches. Please refer to text for notes and data qualifications. Table 1(b): Switch Downtime & Trunk Blocking Mandatory Price-Cap Company Comparison -- 2006 AT&T AT&T AT&T AT&T AT&T Qwest Verizon Verizon Verizon Ameritech BellSouth Pacific Southwestern SNET North South GTE Total Number of Outages 1. Scheduled 0 1 1 0 0 3 0 0 1 2. Proced. Errors -- Telco. (Inst./Maint.) 0 0 3 0 0 0 0 1 0 3. Proced. Errors -- Telco. (Other) 0 0 0 0 0 0 0 0 0 4. Procedural Errors -- System Vendors 0 0 0 0 0 0 0 0 0 5. Procedural Errors -- Other Vendors 0 0 0 1 2 2 1 0 0 6. Software Design 1 2 0 0 0 0 2 4 2 7. Hardware design 1 0 0 0 0 0 0 0 0 8. Hardware Failure 5 0 2 2 0 11 3 2 14 9. Natural Causes 0 0 1 0 0 0 0 0 13 10. Traffic Overload 0 0 0 0 0 0 0 0 0 11. Environmental 0 0 0 0 0 0 0 0 0 12. External Power Failure 0 2 0 0 0 4 0 0 19 13. Massive Line Outage 0 0 0 0 0 1 0 0 0 14. Remote 0 1 1 0 0 3 0 0 1 15. Other/Unknown 0 0 0 0 0 0 0 0 3 Total Outage Line-Minutes per Thousand Access Lines 1. Scheduled 0.0 1.5 12.5 0.0 0.0 7.4 0.0 0.0 1.8 2. Proced. Errors -- Telco. (Inst./Maint.) 0.0 0.0 26.2 0.0 0.0 0.0 0.0 58.8 0.0 3. Proced. 
Table 1(c): Switch Downtime Causes -- Outages More than 2 Minutes in Duration
Mandatory Price-Cap Company Comparison -- 2006
Columns, left to right: AT&T Ameritech, AT&T BellSouth, AT&T Pacific, AT&T Southwestern, AT&T SNET, Qwest, Verizon North, Verizon South, Verizon GTE.

Total Number of Outages
  1. Scheduled  0 1 1 0 0 3 0 0 1
  2. Proced. Errors -- Telco. (Inst./Maint.)  0 0 3 0 0 0 0 1 0
  3. Proced. Errors -- Telco. (Other)  0 0 0 0 0 0 0 0 0
  4. Procedural Errors -- System Vendors  0 0 0 0 0 0 0 0 0
  5. Procedural Errors -- Other Vendors  0 0 0 1 2 2 1 0 0
  6. Software Design  1 2 0 0 0 0 2 4 2
  7. Hardware Design  1 0 0 0 0 0 0 0 0
  8. Hardware Failure  5 0 2 2 0 11 3 2 14
  9. Natural Causes  0 0 1 0 0 0 0 0 13
  10. Traffic Overload  0 0 0 0 0 0 0 0 0
  11. Environmental  0 0 0 0 0 0 0 0 0
  12. External Power Failure  0 2 0 0 0 4 0 0 19
  13. Massive Line Outage  0 0 0 0 0 1 0 0 0
  14. Remote  0 1 1 0 0 3 0 0 1
  15. Other/Unknown  0 0 0 0 0 0 0 0 3
Total Outage Line-Minutes per Thousand Access Lines
  1. Scheduled  0.0 1.5 12.5 0.0 0.0 7.4 0.0 0.0 1.8
  2. Proced. Errors -- Telco. (Inst./Maint.)  0.0 0.0 26.2 0.0 0.0 0.0 0.0 58.8 0.0
  3. Proced. Errors -- Telco. (Other)  0.0 10.9 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  4. Procedural Errors -- System Vendors  0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  5. Procedural Errors -- Other Vendors  0.0 0.0 0.0 0.8 616.6 7.4 181.9 0.0 0.0
  6. Software Design  9 11 0 0 0 0 10 8 1
  7. Hardware Design  7.6 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  8. Hardware Failure  17.8 0.0 55.9 9.2 0.0 43.8 168.8 16.4 425.3
  9. Natural Causes  0.0 0.0 5.3 0.0 0.0 0.0 0.0 0.0 311.6
  10. Traffic Overload  0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  11. Environmental  0 0 0 0 0 0 0 0 0
  12. External Power Failure  0.0 16.6 0.0 0.0 0.0 65.8 0.0 0.0 789.2
  13. Massive Line Outage  0.0 0.0 0.0 0.0 0.0 10.1 0.0 0.0 0.0
  14. Remote  0.0 0.0 0.0 0.0 0.0 1.4 0.0 1.0 109.5
  15. Other/Unknown  0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 31.9
* Please refer to text for notes and data qualifications.

Table 1(d): Company Comparison -- 2006 Customer Perception Surveys
Percentage of Customers Dissatisfied
Mandatory Price-Cap Companies (columns, left to right): AT&T Ameritech, AT&T BellSouth, AT&T Pacific, AT&T Southwestern, AT&T SNET, Qwest, Verizon North, Verizon South, Verizon GTE.

Installations:
  Residential  7.38% 6.25% 6.93% 6.59% 8.29% 3.81% 5.65% 7.60% 7.32%
  Small Business  8.88% 10.03% 7.34% 7.63% 8.22% 5.69% 9.31% 12.95% 11.75%
  Large Business  NA 5.26% NA NA NA NA 26.09% 19.23% 13.48%
Repairs:
  Residential  9.53% 8.96% 10.89% 9.52% 13.75% 6.41% 20.06% 26.15% 16.43%
  Small Business  8.23% 7.47% 8.29% 7.28% 8.55% 6.75% 11.77% 11.54% 9.97%
  Large Business  NA 6.22% NA NA NA NA 15.75% 17.52% 13.29%
Business Office:
  Residential  8.34% 7.03% 5.84% 7.34% 9.43% 1.60% 6.24% 7.20% 10.79%
  Small Business  6.93% 10.43% 5.57% 6.70% 9.08% 2.29% 6.51% 8.47% 10.09%
  Large Business  NA 7.12% NA NA 12.09% NA 38.62% 37.02% 26.92%
* Please refer to text for notes and data qualifications.

Table 1(e): Company Comparison -- 2006 Customer Perception Surveys
Sample Sizes -- Customer Perception Surveys
Mandatory Price-Cap Companies (columns, left to right): AT&T Ameritech, AT&T BellSouth, AT&T Pacific, AT&T Southwestern, AT&T SNET, Qwest, Verizon North, Verizon South, Verizon GTE.

Installations:
  Residential  10,832 39,188 11,518 10,837 4,828 35,917 20,698 16,304 16,427
  Small Business  12,979 41,396 13,313 12,677 2,166 16,769 19,938 15,768 16,737
  Large Business  0 6,992 0 0 0 0 161 234 141
Repairs:
  Residential  10,801 27,833 12,002 10,672 2,422 30,966 20,691 15,931 16,809
  Small Business  13,212 40,882 12,962 13,096 1,754 26,134 20,223 15,910 16,780
  Large Business  0 6,628 0 0 0 0 165 234 143
Business Office:
  Residential  21,546 36,938 21,179 21,505 2,896 33,158 9,209 8,822 11,358
  Small Business  21,513 10,531 19,143 20,738 1,112 16,629 3,857 3,201 2,954
  Large Business  0 590 0 0 93 0 145 208 130
* Please refer to text for notes and data qualifications.
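The dissatisfaction percentages in Table 1(d) are company-level composites of study-area survey results; per the specifications in Appendix A, each study area is weighted by its survey sample size, which Table 1(e) reports. A minimal sketch of that weighting, using hypothetical survey figures rather than actual ARMIS 43-06 filings:

```python
# Minimal sketch (hypothetical survey figures) of the sample-size
# weighting behind the Table 1(d) composites: each study area's
# percent dissatisfied is weighted by its survey sample size.

def weighted_pct(area_pcts, area_samples):
    """Combine study-area percentages using sample sizes as weights."""
    return sum(p * n for p, n in zip(area_pcts, area_samples)) / sum(area_samples)

pct_dissatisfied = [7.4, 6.1, 8.0]      # percent, per study area (hypothetical)
sample_sizes = [10_832, 4_828, 11_518]  # respondents, per study area (hypothetical)
print(round(weighted_pct(pct_dissatisfied, sample_sizes), 2))
```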
Table 2(a): Installation, Maintenance, & Customer Complaints
Non-Mandatory Price-Cap Company Comparison -- 2006
Columns, left to right: Century Tel., Cincinnati Bell, Citizens, Citizens (Frontier), Embarq, Iowa Telecom, Windstream -- Alltel, Windstream -- Valor.

Access Services Provided to Carriers -- Switched Access
  Percent Installation Commitments Met  93.5 100.0 89.2 0.8 90.5 62.9 93.0 88.4
  Average Installation Interval (days)  18.6 15.8 23.1 11.3 12.8 17.4 5.3 10.2
  Average Repair Interval (hours)  35.3 NA 8.8 98.5 1.8 16.9 4.6 3.1
Access Services Provided to Carriers -- Special Access
  Percent Installation Commitments Met  91.2 97.6 86.9 16.3 93.0 86.9 93.5 86.7
  Average Installation Interval (days)  14.0 17.2 14.4 42.3 10.8 1.6 7.5 10.2
  Average Repair Interval (hours)  44.1 4.1 12.5 94.9 3.6 22.9 3.6 5.9
Local Services Provided to Res. and Business Customers
  Percent Installation Commitments Met  96.7 99.5 93.7 99.2 96.7 97.2 95.9 95.7
    Residence  97.1 99.6 93.7 99.2 97.0 97.4 96.3 95.8
    Business  95.3 98.8 94.1 98.9 94.5 95.9 91.5 94.8
  Average Installation Interval (days)  1.0 2.4 4.9 4.8 1.9 1.4 2.6 2.8
    Residence  0.9 2.0 4.2 4.1 1.8 1.4 2.5 2.8
    Business  1.5 4.9 6.7 7.8 2.5 1.4 3.1 3.5
  Avg. Out of Svc. Repair Interval (hours)  9.6 22.0 17.4 16.7 18.5 12.1 14.0 21.0
    Total Residence  9.5 21.8 17.7 17.0 18.7 12.2 14.6 21.9
    Total Business  9.8 17.3 15.6 15.4 16.6 10.6 11.7 18.6
  Initial Trouble Reports per Thousand Lines  213.3 119.5 270.3 242.7 220.1 161.1 143.2 426.3
    Total MSA  197.9 119.5 NA 216.0 193.5 163.8 126.5 401.4
    Total Non MSA  226.9 NA 270.3 269.5 279.1 160.3 157.8 446.1
    Total Residence  255.0 154.1 300.1 300.5 272.0 184.7 206.2 506.4
    Total Business  93.4 48.6 182.1 124.3 98.6 83.3 47.7 211.8
  Troubles Found per Thousand Lines  176.6 111.6 233.0 197.2 153.6 145.8 118.4 392.0
  Repeat Troubles as a Pct. of Trouble Rpts.  27.9% 12.3% 17.4% 10.7% 23.3% 16.7% 15.8% 7.9%
  Residential Complaints per Million Res. Access Lines  961.3 270.5 860.3 75.6 95.5 0.0 136.4 378.0
  Business Complaints per Million Bus. Access Lines  83.5 88.9 228.1 569.0 25.7 0.0 40.7 150.2
* Please refer to text for notes and data qualifications.

Table 2(b): Switch Downtime & Trunk Blocking
Non-Mandatory Price-Cap Company Comparison -- 2006
Columns, left to right: Century Tel., Cincinnati Bell, Citizens, Citizens (Frontier), Embarq, Iowa Telecom, Windstream -- Alltel, Windstream -- Valor.

Total Access Lines in Thousands  557 824 1,196 821 6,744 222 742 465
Total Trunk Groups  242 44 247 95 467 0 92 258
Total Switches  187 91 206 67 1,321 267 27 266
Switches with Downtime
  Number of Switches  0 6 5 2 110 8 12 30
  As a Percentage of Total Switches  0.0% 6.6% 2.4% 3.0% 8.3% 3.0% 44.4% 11.3%
Average Switch Downtime in Seconds per Switch*
  For All Events (including events over 2 minutes)  0.0 95.5 826.9 149.6 1,951.1 562.4 27,471.1 3,334.5
  For Unscheduled Events Over 2 Minutes  NA NA 826.9 149.6 2,172.2 563.1 47,174.7 3,180.5
For Unscheduled Downtime More than 2 Minutes
  Number of Occurrences or Events  0.0 0.0 7.0 2.0 107.0 8.0 68.0 48.0
  Events per Hundred Switches  0.0 0.0 3.4 3.0 8.1 3.0 251.9 18.0
  Events per Million Access Lines  0.0 0.0 5.9 2.4 15.9 36.0 91.6 103.2
  Average Outage Duration in Minutes  NA NA 405.6 83.5 447.0 313.2 312.2 293.8
  Average Lines Affected per Event in Thousands  NA NA 3.0 3.0 8.5 0.4 1.3 0.5
  Outage Line-Minutes per Event in Thousands  NA NA 989.6 267.9 3,731.0 109.7 902.3 117.1
  Outage Line-Minutes per 1,000 Access Lines  0.0 0.0 5,789.9 652.8 59,192.0 3,948.8 82,675.6 12,087.4
For Scheduled Downtime More than 2 Minutes
  Number of Occurrences or Events  0.0 0.0 0.0 0.0 3.0 0.0 4.0 6.0
  Events per Hundred Switches  0.0 0.0 0.0 0.0 0.2 0.0 14.8 2.3
  Events per Million Access Lines  0.0 0.0 0.0 0.0 0.4 0.0 5.4 12.9
  Average Outage Duration in Minutes  NA NA NA NA 17.3 NA 124.3 113.8
  Avg. Lines Affected per Event in Thousands  NA NA NA NA 3.1 NA 1.6 0.6
  Outage Line-Minutes per Event in Thousands  NA NA NA NA 62.0 NA 220.4 60.3
  Outage Line-Minutes per 1,000 Access Lines  0.0 0.0 0.0 0.0 27.6 0.0 1,187.9 777.5
% Trunk Grps. Exceeding Blocking Objectives  25.2% 4.5% 0.0% 0.0% 5.6% NA 0.0% 0.0%
* Aggregate downtime divided by total number of company switches. Iowa Telecom reported no measured trunk groups, so its trunk blocking percentage is shown as NA. Please refer to text for notes and data qualifications.
Table 2(c): Switch Downtime Causes -- Outages More than 2 Minutes in Duration
Non-Mandatory Price-Cap Company Comparison -- 2006
Columns, left to right: Century Tel., Cincinnati Bell, Citizens, Citizens (Frontier), Embarq, Iowa Telecom, Windstream -- Alltel, Windstream -- Valor.

Total Number of Outages
  1. Scheduled  0 0 0 0 3 0 4 6
  2. Proced. Errors -- Telco. (Inst./Maint.)  0 0 0 0 11 0 2 0
  3. Proced. Errors -- Telco. (Other)  0 0 0 0 0 0 0 0
  4. Procedural Errors -- System Vendors  0 0 0 0 0 0 0 0
  5. Procedural Errors -- Other Vendors  0 0 0 0 2 2 3 0
  6. Software Design  0 0 0 0 8 2 4 3
  7. Hardware Design  0 0 0 0 0 0 1 0
  8. Hardware Failure  0 0 3 1 18 0 13 5
  9. Natural Causes  0 0 0 0 12 0 9 2
  10. Traffic Overload  0 0 0 0 0 0 0 0
  11. Environmental  0 0 0 0 1 0 0 0
  12. External Power Failure  0 0 4 1 4 3 6 10
  13. Massive Line Outage  0 0 0 0 38 1 11 7
  14. Remote  0 0 0 0 3 0 4 6
  15. Other/Unknown  0 0 0 0 1 0 3 7
Total Outage Line-Minutes per Thousand Access Lines
  1. Scheduled  0.0 0.0 0.0 0.0 27.6 0.0 1,187.9 777.5
  2. Proced. Errors -- Telco. (Inst./Maint.)  0.0 0.0 0.0 0.0 2,153.5 0.0 266.6 0.0
  3. Proced. Errors -- Telco. (Other)  0.0 0.0 0.0 0.0 5,281.3 0.0 0.0 0.0
  4. Procedural Errors -- System Vendors  0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  5. Procedural Errors -- Other Vendors  0.0 0.0 0.0 0.0 2,513.8 352.9 526.1 0.0
  6. Software Design  0 0 0 0 2008 3104 1840 48
  7. Hardware Design  0.0 0.0 0.0 0.0 0.0 0.0 67.4 0.0
  8. Hardware Failure  0.0 0.0 5,065.3 26.6 9,907.0 0.0 2,456.5 1,378.9
  9. Natural Causes  0.0 0.0 0.0 0.0 8,923.7 0.0 70,107.3 603.8
  10. Traffic Overload  0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  11. Environmental  0 0 0 0 298 0 0 0
  12. External Power Failure  0.0 0.0 724.6 626.3 3,042.9 399.2 441.9 2,447.9
  13. Massive Line Outage  0.0 0.0 0.0 0.0 25,030.5 92.4 6,238.8 3,460.8
  14. Remote  0.0 0.0 0.0 0.0 24.1 0.0 590.1 3,938.1
  15. Other/Unknown  0.0 0.0 0.0 0.0 9.1 0.0 140.3 210.1
* Please refer to text for notes and data qualifications.

Appendix A -- Description of Key Terminology in the Tables

This Appendix contains descriptions of key terms that appear in the tables and charts of the Quality of Service Report. The data elements in the tables are derived from raw source data for individual study areas submitted by carriers in the ARMIS 43-05 reports. A detailed specification of each element used in the tables of this summary report follows this general description. Data in the charts are derived from composite data provided by the companies.

1. Percent of Installation Commitments Met
This term represents the percent of installations completed by the date the company promised the customer. The associated data are presented separately for residential and business customers' local service in the tables and are also summarized in the accompanying charts.

2. Average Installation Interval (in days)
This term represents the average interval, in days, between the installation service order and the completion of installation. The associated ARMIS 43-05 report data are highlighted in the accompanying charts along with customer installation dissatisfaction data from the ARMIS 43-06 report.

3. Average Repair Interval (in hours)
This term represents the average time, in hours, for the company to repair access lines, with service subcategories for switched access, high-speed special access, and all special access. Repair interval data are also highlighted in the accompanying charts along with results from company-conducted surveys of customer repair dissatisfaction. This customer feedback is extracted from the ARMIS 43-06 report.
4. Initial Trouble Reports per Thousand Access Lines
This term is calculated as the total count of trouble reports reported as "initial trouble reports," divided by the number of access lines in thousands. (Multiple calls within a 30-day period associated with the same problem are counted as a single initial trouble report, and the denominator is the total number of access lines divided by 1,000.)

5. Found or Verified Troubles per Thousand Access Lines
This term is calculated as 1,000 times the number of verified troubles divided by the number of access lines. Only those trouble reports for which the company identified a problem are included.

6. Repeat Troubles as a Percent of Initial Trouble Reports
This term is calculated as the number of initial trouble reports cleared by the company that recur, or remain unresolved, within 30 days of the initial trouble report, divided by the number of initial trouble reports as described above.

7. Complaints per Million Access Lines
This term is calculated as one million times the number of residential and business customer complaints reported to state or federal regulatory bodies during the reporting period, divided by the number of access lines.

8. Number of Access Lines, Trunk Groups and Switches
These terms represent the numbers of in-service access lines, trunk groups, and switches, respectively, as shown in the ARMIS 43-05 report. Trunk groups include only common trunk groups between Incumbent Local Exchange Carrier (ILEC) access tandems and ILEC end offices. When comparing current data herein with data in prior reports, the reader should note that access lines were reported in thousands in pre-1997 data submissions; starting with the 1997 data submissions, access line information in the raw carrier data filings has been reported in whole numbers.

9. Switches with Downtime
This term represents the number of network switches experiencing downtime and that number as a percentage of the company's total network switches.

10. Average Switch Downtime in Seconds per Switch
This term includes (1) the total switch downtime divided by the total number of company network switches and (2) the total switch downtime for outages longer than 2 minutes divided by the total number of switches. Results for average overall switch downtime are shown in seconds per switch.

11. Unscheduled Downtime Over 2 Minutes per Occurrence
This term presents several summary statistics: (1) the number of unscheduled occurrences of more than 2 minutes in duration, (2) the number of occurrences per million access lines, (3) the average number of minutes per occurrence, (4) the average number of lines affected per occurrence, (5) the average number of line-minutes per occurrence in thousands, and (6) the outage line-minutes per access line. For each outage, the number of lines affected is multiplied by the duration of the outage to give the line-minutes of that outage, and the sum over all outages gives total outage line-minutes. That sum is divided by the total number of access lines to provide line-minutes per access line, and by the number of occurrences to provide line-minutes per occurrence. This normalizes the magnitude of outages in two ways and provides a realistic means to compare the impact of such outages between companies. Data are presented for each company showing the number of outages and outage line-minutes by cause. (A computational sketch of these normalizations follows item 13 below.)
12. Scheduled Downtime Over 2 Minutes per Occurrence
This term is determined as in item 11, above, except that it covers scheduled occurrences.

13. Percent of Trunk Groups Exceeding Blocking Objectives
This term reports the percentage of trunk groups exceeding the design blocking objectives (typically 0.5 percent blocking for trunk groups that include Feature Group D and 1.0 percent for other trunk groups) for three or more consecutive months. The trunk groups measured and reported are interexchange access facilities; they represent only a small portion of the total trunk groups in service.
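As referenced in item 11, the following minimal sketch (in Python, using hypothetical outage records rather than actual carrier filings) illustrates how the line-minute normalizations are computed:

```python
# Hypothetical outage records: (lines_affected, duration_in_minutes).
# These values are illustrative only, not actual ARMIS data.
outages = [(16_400, 5.4), (5_300, 33.7), (39_000, 5.3)]
access_lines = 14_820_000  # hypothetical company total

# Line-minutes for each outage: lines affected times outage duration.
line_minutes = [lines * minutes for lines, minutes in outages]
total = sum(line_minutes)

events = len(outages)
per_event_thousands = total / events / 1_000   # line-minutes per event (thousands)
per_1000_lines = 1_000 * total / access_lines  # line-minutes per 1,000 access lines
events_per_million = 1_000_000 * events / access_lines

print(round(per_event_thousands, 1), round(per_1000_lines, 1),
      round(events_per_million, 1))
```

Normalizing by both events and access lines, as above, separates how severe a typical outage is from how much total outage exposure a company's customers face.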
Appendix A -- Detailed Quality of Service Report Table Specifications

Report Tables 1(a) and 2(a) (ARMIS 43-05 data)

Access Services Provided to Carriers -- Switched Access
  Percent Installation Commitments Met: row 112 weighted by row 110 (column aa)
  Average Installation Interval (days): row 114 weighted by row 110 (column aa)
  Average Repair Interval (hours): row 121 weighted by row 120 (column aa)
Access Services Provided to Carriers -- Special Access
  Percent Installation Commitments Met: row 112 weighted by row 110 (column ac)
  Average Installation Interval (days): row 114 weighted by row 110 (column ac)
  Average Repair Interval (hours): row 121 weighted by row 120 (column ac)
Local Services Provided to Res. and Business Customers
  Percent Installation Commitments Met: row 132 weighted by row 130 (column aj)
    Residence: row 132 weighted by row 130 (column af)
    Business: row 132 weighted by row 130 (column ai)
  Average Installation Interval (days): row 134 weighted by row 130 (column aj)
    Residence: row 134 weighted by row 130 (column af)
    Business: row 134 weighted by row 130 (column ai)
  Avg. Out of Svc. Repair Interval (hours): row 145 weighted by row 144 (column aj)
    Total Residence: row 145 weighted by row 144 (column af)
    Total Business: row 145 weighted by row 144 (column ai)
  Initial Trouble Reports per Thousand Lines: 1000 * row 141 column aj / row 140 column aj
    Total MSA: 1000 * (row 141 column ad + column ag) / (row 140 column ad + column ag)
    Total Non MSA: 1000 * (row 141 column ae + column ah) / (row 140 column ae + column ah)
    Total Residence: 1000 * (row 141 column af) / (row 140 column af)
    Total Business: 1000 * (row 141 column ai) / (row 140 column ai)
  Troubles Found per Thousand Lines: 1000 * (row 141 column aj - row 143 column aj) / row 140 column aj
  Repeat Troubles as a Pct. of Trouble Rpts.: (row 142 column aj) / (row 141 column aj)
  Residential Complaints per Million Res. Access Lines: (row 331 column da + row 332 column da) / (row 330 column da)
  Business Complaints per Million Bus. Access Lines: (row 321 column da + row 322 column da) / (row 320 column da)

Report Tables 1(b) and 2(b) (ARMIS 43-05 data)

Total Access Lines in Thousands: row 140 column aj
Total Trunk Groups: row 180 column ak
Total Switches: row 200 column an + row 201 column an
Switches with Downtime
  Number of Switches: row 200 column ao + row 201 column ao
  As a percentage of Total Switches: (row 200 column ao + row 201 column ao) / (row 200 column an + row 201 column an)
Average Switch Downtime in seconds per Switch
  For All Events (including events over 2 minutes): 60 * (row 200 column ap + row 201 column ap) / (row 200 column an + row 201 column an)
  For Unscheduled Events Over 2 Minutes: 60 * (unscheduled events * average duration in min.) / (row 200 column an + row 201 column an)
For Unscheduled Downtime More than 2 Minutes (items where rows 220 to 500, column t > 1)
  Number of Occurrences or Events: E = number of records in rows 220 to 500, excluding rows 320, 321, 322, 330, 331 and 332
  Events per Hundred Switches: 100 * E / (row 200 column an + row 201 column an)
  Events per Million Access Lines: 1,000,000 * E / (row 140 column aj)
  Average Outage Duration in Minutes: (sum of rows 220 to 500, column x) / E
  Average Lines Affected per Event in Thousands: (sum of rows 220 to 500, column v) / E
  Outage Line-Minutes per Event in Thousands: (sum of rows 220 to 500, column x * column v) / E
  Outage Line-Minutes per 1,000 Access Lines: 1000 * (sum of rows 220 to 500, column x * column v) / (row 140 column aj)
For Scheduled Downtime More than 2 Minutes (items where rows 220 to 500, column t = 1)
  Number of Occurrences or Events: E = number of records in rows 220 to 500, excluding rows 320, 321, 322, 330, 331 and 332
  Events per Hundred Switches: 100 * E / (row 200 column an + row 201 column an)
  Events per Million Access Lines: 1,000,000 * E / (row 140 column aj)
  Average Outage Duration in Minutes: (sum of rows 220 to 500, column x) / E
  Avg. Lines Affected per Event in Thousands: (sum of rows 220 to 500, column v) / E
  Outage Line-Minutes per Event in Thousands: (sum of rows 220 to 500, column x * column v) / E
  Outage Line-Minutes per 1,000 Access Lines: 1000 * (sum of rows 220 to 500, column x * column v) / (row 140 column aj)
% Trunk Grps. Exceeding Blocking Objectives: (row 189 column ak + row 190 column ak) / (row 180 column ak)

Notes:
  ARMIS 43-05 database rows 110-121 are contained in database table I
  ARMIS 43-05 database rows 130-170 are contained in database table II
  ARMIS 43-05 database rows 180-190 are contained in database table III
  ARMIS 43-05 database rows 200-214 are contained in database table IV
  ARMIS 43-05 database rows 220-319 are contained in database table IVa
  ARMIS 43-05 database rows 320-332 are contained in database table V
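To make the row-and-column arithmetic above concrete, the following minimal sketch (hypothetical study-area records, not the actual ARMIS database schema) shows the two recurring operations: a weighted average such as "row 112 weighted by row 110," and the Table 1(c)/2(c) grouping of outage line-minutes by the cause code carried in column t:

```python
# Sketch of two computations from the table specifications, applied to
# hypothetical study-area records rather than the actual ARMIS database.
from collections import defaultdict

# (1) "row 112 weighted by row 110": percent commitments met per study
#     area, weighted by the number of orders in that study area.
study_areas = [(99.9, 1_200), (98.2, 800)]  # (percent_met, orders), hypothetical
pct_met = sum(p * n for p, n in study_areas) / sum(n for _, n in study_areas)

# (2) Tables 1(c)/2(c): outage records as (cause_code_t, lines_v, minutes_x),
#     grouped by cause and normalized per thousand access lines.
records = [(8, 5_000, 12.0), (12, 2_000, 45.0), (8, 1_000, 6.0)]  # hypothetical
access_lines = 1_196_000                                           # hypothetical

by_cause = defaultdict(float)
for t, v, x in records:
    by_cause[t] += v * x  # outage line-minutes for this cause code

per_thousand = {t: 1_000 * lm / access_lines for t, lm in by_cause.items()}
print(round(pct_met, 1), {t: round(m, 1) for t, m in per_thousand.items()})
```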
Report Tables 1(c) and 2(c) (ARMIS 43-05 data)

Total Number of Outages: number of records in rows 220 to 500 for each value of column t
  1. Scheduled
  2. Proced. Errors -- Telco. (Inst./Maint.)
  3. Proced. Errors -- Telco. (Other)
  4. Procedural Errors -- System Vendors
  5. Procedural Errors -- Other Vendors
  6. Software Design
  7. Hardware Design
  8. Hardware Failure
  9. Natural Causes
  10. Traffic Overload
  11. Environmental
  12. External Power Failure
  13. Massive Line Outage
  14. Remote
  15. Other/Unknown
Total Outage Line-Minutes per Thousand Access Lines: 1000 * (sum of rows 220 to 500, column v * column x, for each value of column t) / row 140 column aj, reported for the same cause categories 1 through 15 listed above.

Report Table 1(d) (ARMIS 43-06 data)

Percentage of Customers Dissatisfied
Installations:
  Residential: row 40 column ac weighted by column ab
  Small Business: row 40 column ae weighted by column ad
  Large Business: row 40 column ag weighted by column af
Repairs:
  Residential: row 60 column ac weighted by column ab
  Small Business: row 60 column ae weighted by column ad
  Large Business: row 60 column ag weighted by column af
Business Office:
  Residential: row 80 column ac weighted by column ab
  Small Business: row 80 column ae weighted by column ad
  Large Business: row 80 column ag weighted by column af
Note: ARMIS 43-06 database rows 40-80 are contained in database table I.

Report Table 1(e) (ARMIS 43-06 data)

Sample Sizes -- Customer Perception Surveys
Installations:
  Residential: sum of row 40 column ab
  Small Business: sum of row 40 column ad
  Large Business: sum of row 40 column af
Repairs:
  Residential: sum of row 60 column ab
  Small Business: sum of row 60 column ad
  Large Business: sum of row 60 column af
Business Office:
  Residential: sum of row 80 column ab
  Small Business: sum of row 80 column ad
  Large Business: sum of row 80 column af
Note: ARMIS 43-06 database rows 40-80 are contained in database table I.

Customer Response

Publication: Quality of Service of Incumbent Local Exchange Carriers Report (February 2008)

You can help us provide the best possible information to the public by completing this form and returning it to the Industry Analysis and Technology Division of the FCC's Wireline Competition Bureau.

1. Please check the category that best describes you:
   ____ press
   ____ current telecommunications carrier
   ____ potential telecommunications carrier
   ____ business customer evaluating vendors/service options
   ____ consultant, law firm, lobbyist
   ____ other business customer
   ____ academic/student
   ____ residential customer
   ____ FCC employee
   ____ other federal government employee
   ____ state or local government employee
   ____ other (please specify)

2. Please rate the report:
                         Excellent  Good  Satisfactory  Poor  No opinion
   Data accuracy            (_)      (_)      (_)        (_)     (_)
   Data presentation        (_)      (_)      (_)        (_)     (_)
   Timeliness of data       (_)      (_)      (_)        (_)     (_)
   Completeness of data     (_)      (_)      (_)        (_)     (_)
   Text clarity             (_)      (_)      (_)        (_)     (_)
   Completeness of text     (_)      (_)      (_)        (_)     (_)

3. Overall, how do you rate this report?
                         Excellent  Good  Satisfactory  Poor  No opinion
                            (_)      (_)      (_)        (_)     (_)

4. How can this report be improved?

5. May we contact you to discuss possible improvements?
   Name:
   Telephone #:

To discuss this report, contact Jonathan Kraushaar at 202-418-0947.
Fax this response to 202-418-0520, or mail it to: FCC/WCB/IATD, Washington, DC 20554.