Federal Communications Commission FCC 19-104 Before the FEDERAL COMMUNICATIONS COMMISSION WASHINGTON, D.C. 20554 In the Matter of Connect America Fund ) ) ) ) ) WC Docket No. 10-90 ORDER ON RECONSIDERATION Adopted: October 25, 2019 Released: October 31, 2019 By the Commission: Chairman Pai and Commissioners O’Rielly, Carr, Rosenworcel, and Starks issuing separate statements. TABLE OF CONTENTS Para. I. INTRODUCTION 1 II. BACKGROUND 4 III. DISCUSSION 11 A. End Points for Testing 12 B. Daily Test Period 20 C. Specific Speed Test Requirements 24 D. Specific Latency Test Requirements 27 E. Number of Test Locations 39 F. Quarterly Testing 50 G. Flexibility in Choosing Testing Methods 53 H. Standards for Full Compliance 56 I. Remedies for Non-Compliance 65 J. Schedule to Commence Testing 76 K. Requirements for Certain Alaska Plan Carriers 92 IV. PROCEDURAL MATTERS 95 V. ORDERING CLAUSES 98 APPENDIX A – FINAL RULES APPENDIX B – QUALIFYING INTERNET AUTONOMOUS SYSTEMS APPENDIX C – SUPPLEMENTAL FINAL REGULATORY FLEXIBILITY ANALYSIS I. INTRODUCTION 1. The Federal Communications Commission has long recognized that “[a]ll Americans [should] have access to broadband that is capable of enabling the kinds of key applications that drive our efforts to achieve universal broadband, including education (e.g., distance/online learning), health care (e.g., remote health monitoring), and person-to-person communications (e.g., VoIP or online video chat with loved ones serving overseas).” See Connect America Fund et al., WC Docket No. 10-90 et al., Report and Order and Further Notice of Proposed Rulemaking, 26 FCC Rcd 17663, 17694, paras. 86-87 (2011) (USF/ICC Transformation Order), aff’d sub nom. In re FCC 11-161, 753 F.3d 1015 (10th Cir. 2014) (citations omitted). To that end, the Commission has invested significant Universal Service Fund support for the deployment of broadband-capable networks in high cost, rural areas. 2. But only fast and responsive networks will allow Americans to fully realize the benefits of connectivity. That is why the Commission requires recipients of universal service support in high cost areas to deploy broadband networks capable of meeting minimum service standards. These standards protect taxpayers’ investment and ensure that carriers receiving this support deploy networks that meet the performance standards they promised to deliver to rural consumers. At the same time, the Commission recognizes that each carrier faces unique circumstances, and that one set of prescriptive rules may not make sense for every one of them. To accommodate this practical reality, the Commission’s rules provide flexibility, taking into account the operational, technical, and size differences among providers when establishing minimum standards, to ensure that even the smallest rural carriers can meet testing requirements without facing excessive burdens. 3. In this Order on Reconsideration, we review performance measures established by the Wireline Competition Bureau (WCB), the Wireless Telecommunications Bureau, and the Office of Engineering and Technology (collectively the Bureaus) for recipients of Connect America Fund (CAF) high-cost universal service support to ensure that those standards strike the right balance between ensuring effective use of universal service funds while granting the flexibility providers need given the practicalities of network deployment in varied circumstances. Connect America Fund, WC Docket No. 10-90, Order, 33 FCC Rcd 6509 (WCB/WTB/OET 2018) (Performance Measures Order or Order). Several petitions for reconsideration and applications for review of the Performance Measures Order propose changes to these performance measures. Here, we reject these proposed changes where we find that the Bureaus’ approach strikes the right balance. Where we find that the Bureaus’ approach does not—for example, where we conclude that greater flexibility is warranted than was offered under the Bureaus’ original methodology—we adjust our rules accordingly. Finally, we clarify the Bureaus’ approach where doing so will help resolve stakeholder confusion. II. BACKGROUND 4. The USF/ICC Transformation Order requires eligible telecommunications carriers (ETCs) receiving high-cost universal service support to provide broadband service in their supported areas that meets certain basic performance requirements. See USF/ICC Transformation Order, 26 FCC Rcd at 17705-06, para. 109. As in the USF/ICC Transformation Order, we use the term high-cost support or high-cost funding to include all existing high-cost universal service mechanisms, as well as CAF. See id. at 17695 n.126. ETCs must offer broadband with latency suitable for real-time applications, such as Voice over Internet Protocol (VoIP), and meet specific minimum speed standards depending upon the program from which they receive support. See, e.g., Connect America Fund; ETC Annual Reports and Certifications, Report and Order, 29 FCC Rcd 15644, 15649, para. 15 (2014) (December 2014 CAF Phase II Order) (requiring speeds of 10/1 Mbps); Connect America Fund; ETC Annual Reports and Certifications; Rural Broadband Experiments, Report and Order and Further Notice of Proposed Rulemaking, 31 FCC Rcd 5949, 5957, para. 15 (2016) (CAF Phase II Auction Order) (allowing bids of different performance tiers with speeds of 1 Gbps/500 Mbps, 100/20 Mbps, 25/3 Mbps, and 10/1 Mbps); Connect America Fund; ETC Annual Reports and Certifications, Report and Order and Further Notice of Proposed Rulemaking, 29 FCC Rcd 8769, 8779-80, paras. 24-29 (2014) (Rural Broadband Experiments Order) (making available support through the rural broadband experiments for services providing speeds of 100/25 Mbps, 25/5 Mbps, and 10/1 Mbps). To ensure that these services are meeting the required standards, recipients of high-cost support must test their broadband networks for compliance with the appropriate speed and latency metrics and certify and report the results to the Universal Service Administrative Company (USAC) and the relevant state or Tribal government on an annual basis. Results are subject to verification. USF/ICC Transformation Order, 26 FCC Rcd at 17705-06, para. 109. 5. After multiple rounds of comments, Id. at 18045-46, paras. 1013-1017; Wireline Competition Bureau, Wireless Telecommunications Bureau, and the Office of Engineering and Technology Seek Comment on Proposed Methodology for Connect America High-Cost Universal Service Support Recipients to Measure and Report Speed and Latency Performance to Fixed Locations, Public Notice, 29 FCC Rcd 12623 (WCB 2014) (2014 Broadband Measurement and Reporting Public Notice); Comment Sought on Performance Measures for Connect America High-Cost Universal Service Support Recipients, Public Notice, 32 FCC Rcd 9321 (WCB 2017) (2017 Performance Measures Public Notice). the Bureaus adopted performance requirements that established a uniform framework for measuring the speed and latency performance for recipients of CAF support to serve fixed locations. See generally Performance Measures Order, 33 FCC Rcd 6509. The 2011 USF/ICC Transformation FNPRM had directed the Bureaus to work together to refine the performance standards for implementation. USF/ICC Transformation Order, 26 FCC Rcd at 17680, 17708, paras. 48, 112 (“We delegate authority to the Bureaus to finalize performance measures as appropriate consistent with the goals we adopt today.”); 47 CFR § 54.313(a)(6). Prior to the adoption of the Performance Measures Order, the Wireline Competition Bureau in 2013 had addressed certain requirements for price cap carriers accepting CAF Phase II support, specifying that they must certify that 95% or more of all peak period measurements (also referred to as observations) of network round trip latency are at or below 100 milliseconds (ms) between the customer premises and an FCC-designated IXP. Connect America Fund, Report and Order, 28 FCC Rcd 15060, 15070-71, para. 23 (WCB 2013) (CAF Phase II Price Cap Service Obligation Order). As part of the Order, the Bureaus adopted a series of testing parameters and requirements to ensure that carriers of all sizes would be able to comply with performance testing requirements cost effectively. 6. Notably, the Bureaus required ETCs to perform speed and latency tests from the customer premises of an active subscriber to a remote test server located at or reached by passing through an FCC-designated Internet Exchange Point (IXP) and set a daily test period (requiring carriers to conduct tests between 6:00 p.m. and 12:00 a.m. local time) for such tests. The Bureaus required a specified number of speed and latency tests during each testing window: (1) for speed testing, the Order required a minimum of one download test and one upload test per testing hour at each subscriber test location; and (2) for latency testing, the Order required carriers to conduct a minimum of one discrete test per minute at each subscriber test location. The Bureaus required that carriers test a maximum of 50 subscriber locations per required service tier offering per state, with accommodations based on the number of subscribers a carrier has in a state, and that carriers conduct such testing on a quarterly basis (i.e., one week of testing in each quarter of the calendar year). 7. To recognize the varying circumstance different carriers faced, the Bureaus adopted three alternative methodologies carriers could use to demonstrate their compliance with network performance requirements: (1) testing infrastructure from the Measuring Broadband America (MBA) initiative, in which a number of providers already participate; (2) existing network management systems and tools (off-the-shelf testing); or (3) provider-developed self-testing configurations (provider-developed self-testing or self-testing). 8. To achieve full compliance with the latency and speed standards, the Order required that 95% of latency measurements during testing windows fall below 100 milliseconds round-trip time, and that 80% of speed measurements be at 80% of the required network speed. In addition, the Order established a framework of support reductions in the event that a carrier’s performance testing did not demonstrate compliance with the speed and latency standards to which each carrier is subject. 9. Finally, the Bureaus specified the scope of ETCs subject to the standards specified in the Performance Measures Order. In particular, as the Bureaus made clear, the testing regime and standards apply to recipients of several CAF high-cost universal service support programs, including price cap carriers receiving CAF Phase II model based support, rate-of-return carriers, rural broadband experiment (RBE) support recipients, Alaska Plan carriers, and CAF Phase II auction winners. Performance Measures Order, 33 FCC Rcd at 6509, para. 1. We also note that entities receiving support through New York’s New NY Broadband Program are subject to performance testing. See Connect America Fund; ETC Annual Reports and Certifications, Order, 32 FCC Rcd 968, 993, para. 70 (2017) (stating “like all ETCs receiving Connect America support, [NY Broadband Program] recipients will be required to submit annual reports pursuant to section 54.313 of the Commission’s rules”) (citations omitted). And the Bureaus established a deadline of July 1, 2020 for carriers subject to the Order to report the results of testing. 10. Several providers and associations petitioned the Bureaus to reconsider these adopted requirements, and others applied to the Commission for review. Hughes Network Systems, LLC, Petition for Clarification, or in the Alternative, Reconsideration, WC Docket No. 10-90 (Sept. 19, 2018); Micronesian Telecommunications Corporation, Petition for Partial Reconsideration, WC Docket No. 10-90 (Sept. 19, 2018) (MTC PFR); Viasat, Inc., Petition for Reconsideration of Viasat, Inc., WC Docket No. 10-90 (Sept. 19, 2018); USTelecom – The Broadband Association, ITTA – The Voice of America’s Broadband Providers, and the Wireless Internet Service Providers Association, Petition for Reconsideration and Clarification, WC Docket No. 10-90 (Sept. 19, 2018) (USTelecom/ITTA/WISPA PFR); NTCA – The Rural Broadband Association, Application for Review and Request for Clarification of NTCA – The Rural Broadband Association, WC Docket No. 10-90 (Sept. 19, 2018) (NTCA AFR); WTA – Advocates for Broadband, Application for Review, WC Docket No. 10-90 (Sept. 19, 2018) (WTA AFR). Because both petitions for reconsideration and applications for review were filed that address many of the same issues, the Bureaus referred the petitions to the Commission for consideration, and the Commission will address both the petitions and the applications in this Order, with the exception the issues related to the Mean Opinion Score (MOS) testing. See 47 CFR § 1.429(a) (“Where action was taken by a staff official under delegated authority, the petition may be acted on by the staff official or referred to the Commission for action.”). Those issues have been addressed by the Wireline Competition Bureau. See Connect America Fund, Order on Reconsideration, WC Docket No. 10-90, DA 19-911 (WCB rel. Sept. 12, 2019). III. DISCUSSION 11. In this Order on Reconsideration, we reexamine each of the above-described performance measure requirements. As a result, we adopt several modifications. We believe these changes will alleviate concerns expressed by carriers by increasing the time for carriers to meet certain deadlines and further minimizing the costs associated with compliance, yet still ensure that carriers meet their performance obligations. In short, the refinements to our approach adopted in this Order will further the overarching goal of the Performance Measures Order; namely, to ensure that carriers deliver broadband services with the speed and latency required while providing flexibility to enable carriers of all sizes to choose how to conduct the required performance testing in the manner most appropriate for each individual carrier. A. End Points for Testing 12. Under the Performance Measures Order, all high-cost support recipients serving fixed locations must perform speed and latency tests from the customer premises of an active subscriber to a remote test server located at or reached by passing through an FCC-designated IXP. Performance Measures Order, 33 FCC Rcd at 6516, para. 18. In the USF/ICC Transformation Order, the Commission decided that speed and latency should be measured on each ETC’s access network from the end-user interface to the nearest Internet access point, i.e., the Internet gateway, which is the closest peering point between the broadband provider and the public Internet for a given consumer connection. USF/ICC Transformation Order, 26 FCC Rcd at 17706, para. 111. Subsequently, in the CAF Phase II Price Cap Service Obligation Order, the Wireline Competition Bureau stated that latency should be tested to an IXP, defined as occurring in any of ten different U.S. locations, almost all of which are locations used in the MBA program because they are geographically distributed major peering locations. CAF Phase II Price Cap Service Obligation Order, 28 FCC Rcd at 15071, para. 23 n.63. WCB’s list of locations included New York City, NY; Washington, DC; Atlanta, GA; Miami, FL; Chicago, IL; Dallas-Fort Worth, TX; Los Angeles, CA; San Francisco, CA; Seattle, WA; and Denver, CO. The Bureaus expanded the list to permit testing to six additional metropolitan areas to ensure that most mainland U.S. locations are within 300 miles of an FCC-designated IXP and that all are within approximately 500 air miles of one. Performance Measures Order, 33 FCC Rcd at 6516, para. 20. The expanded list added Salt Lake City, UT; St. Paul, MN; Helena, MT; Kansas City, MO; Phoenix, AZ; and Boston, MA. Further, the Bureaus permitted providers to use any FCC-designated IXP for testing purposes, rather than limiting testing to the provider’s nearest IXP. Id. Providers serving non-contiguous areas greater than 500 air miles from an FCC-designated IXP were also permitted to conduct testing between the customer premises and the point at which traffic is aggregated for transport to the continental U.S. Id. at 6517, para. 21. 13. We agree with the Bureaus that the speed and latency of networks of carriers receiving support through the various high-cost support mechanisms should be tested between the customer premise of an active subscriber and an FCC-designated IXP. See id. at 6516, para. 18. This approach is consistent with the Commission’s determination in the USF/ICC Transformation Order that “actual speed and latency [must] be measured on each ETC’s access network from the end-user interface to the nearest Internet access point.” See USF/ICC Transformation Order, 26 FCC at 17706, para. 111. Measuring the performance of a consumer’s connection to an IXP better reflects the performance that a carrier’s customers experience. As we observed when we first adopted performance measures for CAF Phase II model-based support recipients, “[t]esting . . . on only a portion of the network connecting a consumer to the Internet core will not show whether that customer is able to enjoy high-quality real-time applications because it is network performance from the customer’s location to the destination that determines the quality of the service from the customer’s perspective.” CAF Phase II Price Cap Service Obligation Order, 28 FCC Rcd at 15073-74, para. 31. 14. We therefore disagree with those commenters arguing that we should require testing over a shorter span. For example, NTCA seeks modification of the testing requirements to account for performance only on “portions of the network owned by the USF recipient and the next-tier ISP from which that USF recipient procures capacity directly.” NTCA AFR at 6. See Letter from Joshua Seidemann, Vice President of Policy, NTCA, to Marlene H. Dortch, Secretary, FCC, at 2 (Oct. 10, 2019) (reiterating NTCA’s previous arguments about performance measures testing endpoints). NTCA argues that requiring testing to an FCC-designated IXP imposes liability on a carrier for conditions beyond its control and violates the Act by applying obligations to parts of the network that are not supported by USF funding. Id. at 2-6. Alternatively, NTCA requests that the Commission provide a “safe harbor” to protect a carrier from off-network issues that affect its test measurements. Id. at 6-8. WTA similarly contends that testing to an FCC-designated IXP makes carriers responsible for portions of the connection over which they have no control. WTA AFR at 15-16. See also Letter from Gerard J. Duffy, WTA Regulatory Counsel, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 1 (Oct. 17, 2019) (WTA Oct. 17, 2019 Ex Parte). WTA instead proposes a two-tiered framework consisting of a network-only test for purposes of high-cost compliance and customer-to-IXP testing to respond to customer complaints, with unresolved network-only problems being subject to non-compliance support reductions. Letter from Gerard J. Duffy, Counsel, WTA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 2 (Apr. 4, 2019) (WTA Apr. 4, 2019 Ex Parte); Letter from Gerard J. Duffy, Counsel, WTA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 1-2, 4 (May 6, 2019) (WTA May 6, 2019 Ex Parte). Finally, Vantage Point seeks clarity on the initiation point for performance testing within the customer premises, and contends that the endpoint for testing should be at or reached by passing through a carrier’s next tier ISP. Letter from Larry D. Thompson, CEO, Vantage Point, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 2-3 (Aug. 28, 2018) (Vantage Point Aug. 28, 2018 Ex Parte). 15. We disagree with petitioners that testing to an FCC-designated IXP, rather than the edge of a carrier’s network, makes a carrier responsible for network elements it does not control, and we reject testing only on a carrier’s own network as inadequate. As the Bureaus explained, carriers—even smaller ones—do have some influence and control over the type and quality of Internet transport they purchase. Performance Measures Order, 33 FCC Rcd at 6516, para. 19. We expect a carrier to purchase transport of a sufficient quality that enables it to provide the requisite level of service expected by consumers and required by the Commission’s rules. However, in the event a carrier fails to meet its performance obligations because the only transport available would demonstrably degrade the measured performance of the carrier’s network, the carrier can seek a waiver of the performance measures requirements. See 47 CFR § 54.313(a)(6). WTA states that its members “have little confidence that waivers can effectively and efficiently address” their concerns regarding middle mile transport. WTA Oct. 17, 2019 Ex Parte at 1. However, by providing a gradual schedule for implementation of testing and a pre-testing regime, we expect that there will be ample time for both carriers and the Commission to address any issues that become apparent as test results are collected. And we believe that waivers can serve as an effective backstop to address any outlier issues that may remain. We are similarly unpersuaded by WTA’s two-tiered testing proposal. Adopting WTA’s proposal to conduct its required tests over only half of the full testing span would only provide us with insight into the customer experience on half of the network between the customer and the IXP. Given that our aim is to ensure that customers are able to enjoy high-quality real-time applications, we decline to adopt WTA’s proposed approach. 16. Finally, we provide additional clarity on both the initiation point and endpoint for testing. As we have noted above, one of the chief purposes for implementing performance requirements is to ensure that customers are receiving the expected levels of service that carriers have committed to providing. Testing from any place other than the customer side of any carrier network equipment used in providing a customer’s connection may skew the testing results and not provide an accurate reflection of the customer’s broadband experience. As Vantage Point notes, testing in this manner would make it “difficult to ensure that the test was being performed on the network path actually used by the customer.” Vantage Point Aug. 28, 2018 Ex Parte at 2. Thus, we clarify that testing should be conducted from the customer side of any network equipment that is being used. We note that although carriers may use a device placed on customer side of any network equipment, such as a Whitebox, for customer testing, there are now multiple software options available such that testing can be done via the customer’s modem.  If the customer’s modem is too old to be updated to accommodate such software, the carrier can instead provide the customer with a new modem.  Modems can often be self-installed by the customer, removing any need for a truck roll to the customer’s premises. 17. Definition of FCC-designated Internet Exchange Point. Given our commitment to testing the performance of connections between consumers and FCC-designated IXPs, we also take this opportunity to clarify which facilities qualify as FCC-designated IXPs for purposes of performance testing. 18. USTelecom, ITTA, and WISPA request clarification that ETCs are permitted to use “the nearest Internet access point,” as specified in the USF/ICC Transformation Order, which may not necessarily be a location specified in the Order. USTelecom/ITTA/WISPA PFR at 19-21. See also Comments of AT&T, WC Docket No. 10-90, at 2-6 (Nov. 7, 2018) (AT&T PFR Comments). They also seek clarification that ETCs may test to servers that are within the provider’s own network (i.e., on-net servers). USTelecom/ITTA/WISPA PFR at 21; AT&T PFR Comments at 4-5; Comments of Midcontinent Communications in Support of USTelecom and WISPA’s Petition for Reconsideration and Clarification, WC Docket No. 10-90, at 1-4 (Nov. 7, 2018) (Midcontinent PFR Comments); Reply of USTelecom, ITTA, and WISPA to Opposition to Petition for Reconsideration and Clarification, WC Docket No. 10-90, at 3-4 (Nov. 19, 2018) (USTelecom/ITTA/WISPA PFR Reply). In subsequent filings, the petitioners suggest that there should be a criteria-based approach to defining the testing endpoint. See, e.g., Letter from Michael J. Jacobs, Vice President, Regulatory Affairs, ITTA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90 (May 9, 2019) (ITTA/USTelecom/WISPA May 9, 2019 Ex Parte); Letter from Stephen E. Coran, Lerman Senter PLLC, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90 (May 13, 2019) (ITTA/USTelecom/WISPA May 13, 2019 Ex Parte); Letter from Michael J. Jacobs, Vice President, Regulatory Affairs, ITTA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90 (June 6, 2019) (ITTA/USTelecom/WISPA June 6, 2019 Ex Parte). Specifically, they propose that testing occur “from the end-user interface to the first public Internet gateway in the path of the CAF-supported customer that connects through a transitive Internet Autonomous System,” (ASN) and “that the Commission establish a safe harbor where the transitive Internet AS which the gateway hosts includes one or more router(s) that advertise(s) [ASN] organizations that are listed on the Center for Applied Internet Data Analysis (CAIDA) ‘AS Organization Rank List.’” See ITTA/USTelecom/WISPA June 6, 2019 Ex Parte at 2. The petitioners propose that testing occurring through a “safe harbor” ASN “would be considered valid without further inquiry.” Id. at 2. 19. We conclude that the Order’s designation of certain metropolitan areas as qualifying IXPs is too ambiguous. It is not clear where the boundaries of a designated IXP metropolitan area begin and end. Thus, drawing on the petitioners’ proposal, we now provide a revised definition of FCC-designated IXP that is more specific and better designed to account for the way Internet traffic is routed. For testing purposes, we define an FCC-designated IXP as any building, facility, or location housing a public Internet gateway that has an active interface to a qualifying ASN. Opposing this revised definition, NTCA asserts that “discussions with technical experts and engineers has not yielded a consistent understanding of this definition.” Letter from Joshua Seidemann, Vice President of Policy, NTCA, to Marlene H. Dortch, Secretary, FCC, at 1-3 (Oct. 17, 2019) (NTCA Oct. 17, 2019 Ex Parte). Nonetheless, NTCA does not propose an alternate definition that would be consistent with the Commission’s prior determination that “actual speed and latency [must] be measured on each ETC’s access network from the end-user interface to the nearest Internet access point.” USF/ICC Transformation Order, 26 FCC at 17706, para. 111. Further, we note that this definition was proposed by USTelecom, ITTA, and WISPA. Such a building, facility, or location could be either within the provider’s own network or outside of it. We use the term “qualifying ASN” to ensure that the ASN can properly be considered a connection to the public Internet. We note that in the USF/ICC Transformation Order, the Commission found that the Internet gateway is the “peering point between the broadband provider and the public Internet” and that public Internet content is “hosted by multiple service providers, content providers and other entities in a geographically diverse (worldwide) manner.” USF/ICC Transformation Order, 26 FCC Rcd at 17706, para. 111. The criteria we use to determine FCC-designated IXPs are designed to ensure that the peering point is sufficiently robust such that it can be considered a connection to the public Internet and not simply another intervening connection point. We designate 44 major North American ASNs using CAIDA’s ranking of Autonomous Systems and other publicly available resources as “safe harbors.” Appendix B provides a list of qualifying ASNs. An ASN is determined to be qualifying if it appears on the CAIDA AS Rank List and meets the following criteria: (1) it is a “transit/access” ASN; (2) it is flagged in the United States, Canada, or Mexico; (3) it has a Transit Degree of 100 or greater; (4) it peers with two or more of the top 300 USA-flagged ASNs on CAIDA’s AS Rank List; and (5) at least one of these peered ASNs is ranked in the top 100. We do exclude 4 ASNs that meet these criteria where analysis of PeeringDB and its website indicates that the ASNs do not peer with a significant portion of the public Internet. They are: Akamai (32787); Florida International University (20080); WoodyNet (Packet Clearing House) (42); and eBay (backbone for eBay Inc.) (62955). See AS Rank, Center for Applied Internet Data Analysis (Mar. 1, 2019), http://as-rank.caida.org. We direct the Bureaus to update this list of ASNs periodically using the CAIDA ranking of ASNs, PeeringDB, PeeringDB contains peering and peering related information, including interconnection data for networks, clouds, services, and enterprise, as well as interconnection facilities that are developing at the edge of the Internet.” See https://www.peeringdb.com/. and other publicly available resources. Providers may test to a test server located at or reached by passing through any building, facility, or location housing a public Internet gateway that has an active interface to one of these qualifying ASNs or may petition the Bureaus to add additional ASNs to the list. We note that we are not requiring the carrier’s network to interconnect with the qualifying ASN at the building, facility, or location. Further, although we expect the list of ASNs to change periodically, adding an ASN will not necessarily mean that one will be removed. If an ASN is removed from the list, a carrier that is using such an ASN for conducting required testing may continue using the ASN through the end of the subsequent calendar quarter. The Bureaus will determine whether any ASN included in a carrier petition is sufficiently similar to qualifying ASNs that it should be added to the list of qualifying ASNs. B. Daily Test Period 20. The Bureaus also established a daily testing period for speed and latency tests, requiring carriers to conduct tests between 6:00 p.m. and 12:00 a.m. local time, including weekends. Performance Measures Order, 33 FCC Rcd at 6520, para. 30. The testing window the Bureaus adopted reflects a slight expansion of the testing window used for the MBA. Id. The Bureaus reasoned that MBA data indicated a peak period of Internet usage every evening but noted that they would revisit this requirement periodically “to determine whether peak Internet usage times have changed substantially.” Id. 21. Petitioners and commenters urge the Commission to reconsider the daily test period requirement to account for the usage patterns of rural consumers, as well as the conditions and characteristics of rural areas. WTA notes that the MBA data cited by the Bureaus likely reflect the usage patterns of urban consumers, rather than consumers in rural areas that “are typically making personal and business use of their household Internet connections throughout the day.” WTA AFR at 9. WTA contends that there is likely to be increased congestion on rural networks during the time period adopted by the Bureaus, potentially resulting in an inaccurate or unrepresentative testing of the carrier’s service. Id. WTA also argues that mandating testing during evening hours and weekends requires rural carriers to adjust their regular daytime schedule, creating staffing and financial hardships and potentially preventing them from responding to other customer service issues. Id. at 10. ITTA supports this point, noting that “evening and weekend test hours require RLECs to re-schedule one or more technicians from their regular daytime maintenance and installation duties and pay them premium or overtime wages.” Comments of ITTA – The Voice of America’s Broadband Providers, WC Docket No. 10-90, at 7 (Nov. 7, 2018) (ITTA PFR Comments). ITTA also challenges the expansion of the daily test period from 7 p.m. to 11 p.m. to 6 p.m. to 12 a.m., Id. at 7-8. See also Letter from Michael J. Jacobs, VP, Regulatory Affairs, ITTA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 2 (Feb. 11, 2019). and requests flexibility as to the specific hours that testing may be conducted. USTelecom/ITTA/WISPA PFR at 23-24; USTelecom/ITTA/WISPA PFR Reply at 5. 22. We decline to revisit the daily testing period at this time. WTA provides no data to support its claim that rural consumers are more active users of broadband service during daytime hours than urban consumers. Moreover, our review of MBA data from more rural areas indicates that these areas have similar peak periods to urban areas. FCC staff analyzed MBA usage data from August 2018 through February 2019 and compared the data from more urban areas to that of more rural areas. The usage patterns were similar when comparing urban and rural areas, and all showed higher usage during the same peak times. Staff analysis categorized subscriber locations as urban or more rural based in part upon the state in which the subscriber was located, but more importantly the distance from the subscriber location to a city with a sizeable population. See Raw Data - Measuring Broadband America - Eighth Report, FCC, https://www.fcc.gov/reports-research/reports/measuring-broadband-america/raw-data-measuring-broadband-america-eighth#block-menu-block-4. As we have stated many times, a primary goal for universal service is to ensure that customers in rural areas receive the same level of service as those in urban areas. See, e.g., USF/ICC Transformation Order, 26 FCC Rcd at 17679, paras. 43-45. See also 47 U.S.C. § 254(b)(3). By establishing the same testing window for urban and rural areas, we can confirm that consumers in rural areas are not receiving substandard service as compared to consumers in urban areas during the same time periods. Additionally, WTA’s concern that testing during the peak period may degrade a consumer’s broadband experience is unfounded. WTA AFR at 9. As we previously observed, the small amount of data required for speed testing will have no noticeable effect on network congestion. Performance Measures Order, 33 FCC Rcd at 6520-21, para. 32. We remind carriers that we provide them the flexibility to choose whether to stagger their tests over the course of the testing period, so long as they do not violate any other testing requirements. Id. at 6521, para. 33. 23. We also disagree with WTA and ITTA that the current daily testing period will require rural carriers to devote additional personnel hours to implement the Commission’s performance testing requirements. Once the testing regime is implemented and carriers have installed the necessary technology and software to test the speed and latency of their networks on a routine basis, we do not anticipate that extensive staffing will be required to monitor the testing process. Because the technological testing options that we have allowed carriers to use are all relatively automated, carriers should not have to adjust schedules to ensure staffing during evenings and weekends. Additionally, we note that the Bureaus expanded the testing period from 7 p.m. to 11 p.m. to 6 p.m. to 12 a.m. based on several comments from parties that requested a longer testing period. See, e.g. CAF II Performance Certification – AT&T Proposal, Hany Fahmy, Ph.D., Assistant Vice President – Global Public Policy, External and Legislative Affairs, AT&T, WC Docket No. 10-90, Advantages of 18-hour measurements vs. 7-11 pm (June 16, 2016). Adding one additional hour on both the front and back end of the testing period allows a carrier’s testing to capture the ramp up and ramp down periods before and after peak time, providing a more accurate picture of whether customers are receiving the required level of service. We also remind parties that the Bureaus committed to revisiting periodically the daily testing window to ensure that the established hours continue to reflect the usage habits of consumers. Performance Measures Order, 33 FCC Rcd at 6520, para. 30. C. Specific Speed Test Requirements 24. The Bureaus required a specified number of speed tests during each testing window. In particular, the Order required a minimum of one download test and one upload test per testing hour at each subscriber test location. Id. at 6519, para. 28. Providers were required to start separate download and upload speed tests at the beginning of each test hour window, and, after deferring a test due to cross-talk (e.g., traffic to and from the consumer’s location that could impact performance testing), providers were required to reevaluate whether the consumer load exceeds the cross-talk threshold every minute until the speed test can be run or the one-hour test window ends. Id. at 6519-20, paras. 28-29. 25. In their Petition for Reconsideration, USTelecom, ITTA, and WISPA request clarification that recipients are afforded flexibility in commencing hourly tests. They argue that “[i]t is not clear from the Order . . . whether ‘the beginning’ of a test hour window requires a recipient to commence testing at the top of the hour, or whether testing must commence for all test subscribers at exactly the same time.” USTelecom/ITTA/WISPA PFR at 23. The petitioners state that carriers should only be required to complete the test within the hour, and they should be able to retry tests as frequently as their systems allow until a successful test is administered, rather than retrying deferred tests every minute. Id. at 24. Noting that “there should be no practical difference as to whether testing occurs at the top, middle, or closer to [the] end of a testing window,” NTCA, NRECA, and UTC support the petitioners’ request that “the Commission reconsider the discrete and specific times at which testing is to be conducted within each hour.” NTCA/NRECA/UTC Opposition at 18. Opposition of NTCA–The Rural Broadband Association, National Rural Electric Cooperative Association (NRECA) and Utilities Technology Council to Petitions for Reconsideration, WC Docket No. 10-90 at 18 (Nov. 7, 2018) (NTCA/NRECA/UTC Opposition). Vantage Point likewise proposes that the Commission permit carriers to distribute speed tests within testing hours in a way that minimizes network impact; otherwise, Vantage Point asserts, requiring all speed testing to start at the beginning of each hour would significantly burden test servers such that test results would not be representative of customers’ normal experience. Vantage Point Aug. 28, 2018 Ex Parte at 3-4. 26. We clarify that providers do not have to begin speed tests at the beginning of each test hour, as petitioners suggest. In particular, we agree with Vantage Point that providing greater flexibility in this regard will further minimize the impact of any potential burden on the test servers during speed testing. See id. at 4. However, to ensure that there is enough data on carriers’ speed performance, providers must still conduct and report at least one download test and one upload speed test per testing hour at each subscriber test location, with one exception. A carrier that begins attempting speed tests within the first fifteen minutes of a testing hour, and repeatedly retries and defers the test at one-minute intervals due to consumer load meeting the adopted cross-talk thresholds (i.e., 64 Kbps for download tests or 32 Kbps for upload tests), Performance Measures Order, 33 FCC Rcd at 6519-20, para. 28. may report that no test was successfully completed during the test hour because of cross-talk. As noted in the Performance Measures Order, this outcome is unlikely: “a significant majority of MBA speed tests are completed within their designated 1-hour window despite consumer load.” Id., 33 FCC Rcd at 6519 n.86. Although carriers will not be required to submit information on all speed testing attempts made during an hour, carriers must retain and be able to produce this information upon request in the event of an audit. A provider that does not attempt a speed test within the first 15 minutes of the hour and/or chooses to retry tests in greater than one-minute intervals must, however, conduct and report a successful speed test for the testing hour regardless of cross-talk. WTA “questions why the relief provided in instances where there is more than 64 kbps of cross-talk encompasses only the first 15 minutes of a testing hour.” WTA Oct. 17, 2019 Ex Parte at 2. We clarify that carriers may delay a test because of cross-talk throughout the entire hour as long as the carrier begins attempting the hourly speed test within the first 15 minutes and within one-minute intervals throughout the hour. It is only if the carrier does not initiate testing within the first 15 minutes of the hour that the carrier must complete a speed test regardless of cross-talk. Although this approach continues to differ slightly from MBA practice, See Performance Measures Order, 33 FCC Rcd at 6519 n.86. we believe that it minimizes the possibility of network congestion at the beginning of the testing hour while ensuring that the Commission will have access to sufficient testing data. D. Specific Latency Test Requirements 27. The Order established specific test intervals within the daily test period for latency testing, requiring carriers to conduct “a minimum of one discrete test per minute, i.e., 60 tests per hour, for each of the testing hours, at each subscriber test location, with the results of each discrete test recorded separately.” See id. at 6519, para. 27. Recognizing that cross-talk could negatively affect the test results, the Bureaus provided flexibility for carriers to postpone a latency test in the event that the consumer load exceeded 64 Kbps downstream and to reevaluate the consumer load before attempting the next test. Id. 28. Several parties express concern with these requirements and request reconsideration of the latency testing framework. USTelecom, ITTA, and WISPA jointly contend that the Bureaus failed to provide adequate notice for the frequency of latency testing and did not justify departing from the MBA practice of combining speed and latency testing under a unified framework. USTelecom/ITTA/WISPA PFR at 5-8. These parties further argue that requiring latency testing once per minute will be administratively burdensome for carriers by preventing them from combining the instructions for testing into a single process and potentially overloading and disrupting some testing methods. Id. at 9. Instead, USTelecom, ITTA, and WISPA propose that the number of latency tests should be reduced to match the frequency of speed testing. Id. at 6-8. See also Letter from Kevin G. Rupy, Vice President, Law & Policy, USTelecom, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 1-2 (July 31, 2018) (USTelecom July 31, 2018 Ex Parte). Midcontinent also supports aligning the frequency of speed and latency testing requirements. Midcontinent PFR Comments at 1-4. 29. AT&T contends that testing once per minute “is unnecessary and arbitrary and capricious” and likewise argues that the Commission should permit carriers to test latency only once per hour. AT&T PFR Comments at 6-8. AT&T supports its proposal by providing internal data purporting to demonstrate no material difference between testing latency once per minute versus testing once per hour. Letter from Cathy Carpino, Assistant Vice President – Senior Legal Counsel, AT&T Services, Inc., to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90 (Apr. 12, 2019) (public version) (AT&T Apr. 12, 2019 Ex Parte); Letter from Cathy Carpino, Assistant Vice President – Senior Legal Counsel, AT&T Services, Inc., to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 2 (May 21, 2019) (public version) (AT&T May 21, 2019 Ex Parte). AT&T provided staff with summary results of its latency testing of almost 100 subscribers in its Connect America Fund Phase II eligible areas that obtain broadband service via wireline and fixed wireless technologies. As a result, AT&T proposes that the Commission require a minimum of one latency test per hour, but provide flexibility to allow carriers to test more frequently if they desire. See AT&T May 21, 2019 Ex Parte. ITTA concurs with AT&T’s proposed approach. Letter from Mary L. Henze, Assistant Vice President Federal Regulatory, AT&T Services, Inc., to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90 (Mar. 15, 2019); Letter from Michael J. Jacobs, VP, Regulatory Affairs, ITTA – The Voice of America’s Broadband Providers; Mike Saperstein, VP Law and Policy, USTelecom; and Claude Aiken, President & CEO, WISPA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 4-9 (Apr. 10, 2019) (ITTA/USTelecom/WISPA Apr. 10, 2019 Ex Parte). 30. Conversely, NTCA, NRECA, and UTC support the latency testing framework adopted by the Bureaus. These parties observe that aligning the frequency of speed and latency tests would “risk undermining the Commission’s statutory mandate to ensure reasonably comparable services in rural and urban areas” because speed does not require as frequent testing as latency in order to demonstrate compliance. NTCA/NRECA/UTC Opposition at 4-5. In response, USTelecom, ITTA, and WISPA again argue that the Bureaus failed to adequately address the Administrative Procedure Act’s notice obligations or present any legal or factual basis for requiring substantially more latency tests than speed tests. USTelecom/ITTA/WISPA PFR Reply at 5-9. 31. We decline to revise the determination of the Bureaus that carriers must conduct latency testing once per minute. We also decline to provide an exception for carriers participating in certain high-cost programs to test latency only once per hour rather than once per minute. See Letter from Mike Saperstein, Vice President, Policy & Advocacy, USTelecom, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 3-4 (Oct. 15, 2019) (USTelecom Oct. 15, 2019 Ex Parte) . As explained herein, conducting 60 latency tests per hour instead of just one test per hour provides much more useful information on a network’s performance and is similar to the approach we have taken with the MBA. Moreover, given readily available third-party equipment or software to conduct the required testing and the lack of supporting evidence regarding the cost of upgrades, we are not convinced that the purported cost of $1 million to upgrade test systems is realistic. See id. Regarding parties’ procedural arguments, we note that, in the two Public Notices seeking comment on the performance measures, the Bureaus specifically explained that adopting MBA testing was under consideration. 2014 Broadband Measurement and Reporting Public Notice, 29 FCC Rcd at 12626-27, paras. 15-19; 2017 Performance Measures Public Notice, 32 FCC Rcd at 9325-27, para. 9. Indeed, many of the performance testing requirements were derived from or influenced by the Commission’s experience with MBA testing. As such, parties had ample notice that the testing regime adopted by the Bureaus, which is a less burdensome variation of the MBA testing, was a potential option. Any argument to the contrary is unfounded. 32. Complaints that the frequency of latency testing will affect network performance also are speculative. The latency testing frequency framework ultimately adopted by the Bureaus is substantially less extensive than the MBA program testing. For example, MBA testing sends approximately 2,000 User Datagram Protocol (UDP) packets per hour, and these 2,000 individual results are summarized as a single reporting record that reflects all 2,000 tests. To be clear, MBA requires latency to be tested 2,000 times per hour, with results summarized into one record. See, e.g., 2016 Measuring Broadband America Fixed Broadband Report, Federal Communications Commission Office of Engineering and Technology and Office of Strategic Planning & Policy Analysis at 27, https://www.fcc.gov/reports-research/reports/measuring-broadband-america/measuring-fixed-broadband-report-2016 (2016 MBA Report). Conversely, the Bureaus adopted testing of 60 UDP packets per hour that consists of approximately 3% of the typical MBA load. Performance Measures Order, 33 FCC Rcd at 6520-21, para. 32. The more intensive MBA test frequency has not been found to pose any technical or other difficulties, so there is no reason to believe that the vastly lower frequency of latency testing adopted by the Bureaus will cause concerns. Requiring 60 UDP packets per hour rather than 2,000 balances the need for sufficient testing while minimizing the burden of testing on carriers. 33. We also agree with the Bureaus that the disparity in testing frequency between speed and latency reflects the different type of testing necessary to determine whether carriers are meeting the required benchmarks. The purpose of speed testing is to determine if the network is properly provisioned to furnish the required speed and whether the network provides sufficient throughput to handle uploads and downloads at particular speeds and times. Because of the burden that such testing puts on a carrier’s network, the Bureaus adopted the minimum number of tests necessary to ensure that consumers are receiving broadband service at required speed levels. On the other hand, latency testing indicates whether there is sufficient capacity in the network to handle the level of traffic, which is of particular importance when the network is experiencing high traffic load. In this respect, latency is similar to a pulse rate and can vary substantially as a result of several factors. Even if all these factors are unknown, frequently monitoring latency determines the ability of the network to handle various circumstances and factors that are affecting it. As NTCA, NRECA, and UTC explain: [T]here is logic in a protocol that tests for latency more frequently than speed. The impact of latency is measured in and discernible by milliseconds: the frequency of testing aims to illuminate whether variables that perforate performance are present. In contrast, speed contemplates a steadier aspect of the network facility, and therefore does not require as frequent testing to demonstrate compliance. Therefore, in as much as latency-sensitive services and applications (including but not limited to voice) are affected by millisecond variables, NTCA, NRECA and UTC urge the Commission to maintain its rigorous standards for latency testing. NTCA/NRECA/UTC Opposition at 5. And, in any event, conducting more tests for latency is to the carrier’s benefit, because of the variability of latency and resulting greater likelihood that outlier failures will not affect the overall rate. 34. We appreciate AT&T’s willingness to share its internal data and analysis. However, AT&T’s data reflect only the capabilities of its own network and consisted of a very small sample set—18 customers for one peak period in one instance and “almost” 100 subscribers for one peak period in the other. See AT&T April 12, 2019 Ex Parte at 2; AT&T May 21, 2019 Ex Parte at 2. We also note that even AT&T’s data demonstrated a substantial variation between testing once per hour and once per minute. For example, in its testing, AT&T found that per minute latency testing of customers served by varying technologies showed that 1.17% of tests were higher than 100 ms but once per hour testing showed that 3.04% of tests showed a latency of higher than 100 ms. See AT&T May 21, 2019 Ex Parte at 2. A difference of 2% when the latency standard is 5% is substantial. 35. Analysis undertaken by Commission staff confirms the importance of more frequent testing to account for the variability associated with latency. Commission staff compared the conclusions that AT&T—and supported by ITTA—drew from its data to what the much larger MBA data demonstrate. This analysis indicates that the risk of false positives and false negatives (i.e., sample test results indicate that a carrier fails, when given overall network performance, it should have passed, or that a carrier passes, when given overall network performance, it should have failed) varies significantly based on the number of measurements per hour. Because the Commission’s performance standard for latency requires 95% of the latency measurements to be less than or equal to 100 ms, a carrier would fail the standard if more than 5% of its latency measurements are greater than 100 ms.  In general, staff’s analysis found that a greater number of measurements reduces the impact of data outliers and makes false positives and false negatives less likely. For example, a single 200 ms data outlier among a sample of 10 latency measurements that otherwise are all under 100 ms would result in the carrier’s failing to meet the 95% threshold (i.e., only 9 out of 10 or 90% of the measurements would be at or under 100 ms). However, a single data outlier of 200 ms in a sample of 100 latency measurements would not, in the absence of at least five other measurements exceeding 100 ms, cause the carrier to fail (i.e., 99 out of 100 or 99% of the measurements would be at or under 100 ms). 36. Additionally, staff analysis of MBA data indicated that the distribution of latency among carriers varies widely even within the same minute. For example, the mean of latency from February to March 2016 ranged from 0 to 3,000 ms, the standard deviation ranged from 0 to 1,500 ms, and the coefficient of variation ranged from 0 to 8. Even among AT&T’s own MBA measurements, the latency distributions have a mean ranging between 0 and 2,000 ms with a coefficient of variation ranged from 0 to 5. Staff observed a similar or higher coefficient of variation using MBA per hour data from both October 2017 and October 2018.  This means that latency varies significantly depending upon the traffic on the network at any given time and does not vary in the same way for each carrier or even within each day for each carrier. Because of the countless number of distributions observed among carriers reflected by the MBA data, we conclude that a smaller number of observations would not yield reliable testing results. Thus, more testing provides the Commission with greater ability to detect bad performance in cases where a carrier’s latency is consistently high. In other words, since the likelihood of failing or passing the Commission’s latency standard depends, to some degree, on random noise, the more measurements taken by a carrier, the less likely that random factors would cause it to fail the standard.   37. The figure below demonstrates staff’s analysis of the estimated probability of failure and associated risk of false positive or false negative results with different numbers of measurements from a range of latency distributions observed in the MBA data. To conduct this analysis, FCC staff used MBA data from February to March 2016 (per minute data) and October 2017 and October 2018 (per hour data). Each box (bar) represents the estimated probability of failure for a given latency distribution.  The difference in the probability of failure between N number of measurements and N=2000 is the estimated risk of a false positive (the test result indicates that a carrier fails when it should have passed) and a false negative (the test result indicates that a carrier passes when it should have failed). As demonstrated, there is a much higher risk of a false positive or false negative under AT&T’s proposed once per hour latency measurement as compared to a moderate risk from 60 measurements per hour. Thus, staff’s analysis shows that, given the high variability of latency, one of two things would occur if we required only one measurement per hour: either a few extreme measurements would cause a carrier to fail the standard when, in fact, it should pass given its overall performance, or the Commission would be unable to capture consistent poor performance by a carrier that should fail based on the overall performance of its network. As a result, a moderate-risk approach of 60 measurements per hour strikes a balance between the burden of testing on carriers and the risk of failure by carriers caused by uncertainty. 38. Finally, we note that some parties may misunderstand what exactly constitutes a latency test for purposes of the performance measures. Specifically, USTelecom states that, “[t]esting every minute may also overload some testing methods and cause testing to be disrupted,” implying that a carrier must start and stop a latency test every minute within a test-hour. See USTelecom July 31, 2018 Ex Parte at 2. While we do not believe this interpretation is consistent with the intent of the Order, we provide greater clarity here on what is considered a sufficient latency test to assuage concerns about the number of latency tests per hour. As the Bureaus described in the Order, a “test” constitutes a “single, discrete observation or measurement of speed or latency.” Performance Measures Order, 33 FCC Rcd at 6519, n.83. While carriers may choose to continuously start and stop latency testing every minute and record the specific result, we clarify that there is no requirement to conduct latency testing in this manner. Instead, carriers may continuously run the latency testing software over the course of a test-hour and record an observation or measurement every minute of that test-hour. If a carrier transmits one packet at a time for a one-minute measurement, the carrier should report the result of that packet as one observation. However, some applications, such as ping, commonly send three packets and only report summarized results for the minimum, mean, and maximum packet round trip time and not individual packet round trip time. If this is the case, the carrier should report the mean as the result of this observation. If the carrier sends more than one packet and the testing application allows for individual round trip time results to be reported for each packet, then the carrier must report all individual measurements for each packet. Such an approach plainly fits within the definition of “test” adopted by the Bureaus in the Order and does not require constant starting and stopping of the latency testing software. In sum, carriers have the flexibility to choose how to conduct their latency testing, so long as one separate, discrete observation or measurement is recorded each minute of the specific test-hour. E. Number of Test Locations 39. The Bureaus required that carriers test a maximum of 50 subscriber locations per required service tier offering per state, depending on the number of subscribers a carrier has in a state, randomly selected every two years. Id. at 6522, 6524, paras. 36, 40. The Order included scaled requirements permitting smaller carriers (i.e., carriers with fewer than 500 subscribers in a state and particular service tier) to test 10% of the total subscribers in the state and service tier, except for the smallest carriers (i.e., carriers with 50 or fewer subscribers), which must test five subscriber locations. The Bureaus also recognized that, in certain situations, a carrier serving 50 or fewer subscribers in a state and service tier may not be able to test even five active subscribers; the Bureaus permitted such carriers to test a random sample of existing, non-CAF-supported active subscriber locations within the same state and service tier to satisfy the testing requirement. Id. at 6522-23, para. 36. In situations where a subscriber at a test location stops subscribing to the service provider within 12 months after the location was selected, the Bureaus required that the carrier test another randomly selected active subscriber location. Id. at 6524, para. 40. Finally, the Bureaus explained that carriers may use inducements to encourage subscribers to participate in testing, which may be particularly useful in cases where support is tied to a particular performance level for the network, but the provider does not have enough subscribers to higher performance service tiers to test to comply with the testing sample sizes. Id. 40. Petitioners and applicants raise various concerns regarding the required number of subscriber test locations. Micronesian Telecommunications Corporation (MTC), for example, argues that it and similar carriers that may have fewer than 50 subscribers in a particular state and speed service tier will be unable to comply with the test locations requirement. MTC PFR at 1-2. MTC claims that it will be difficult to find even five customers to test, particularly in higher service tiers. Id. Asking that the Commission “provide a safety valve” for similar small carriers, MTC proposes that such a provider should “test no more than 10 percent of its customers in any given service tier, with a minimum of one test customer per service tier with customers.” Id. at 1-3. See also MTC Reply to Comments on Petition for Partial Reconsideration, WC Docket No. 10-90, at 2 (Nov. 19, 2019). NTCA argues that testing 10% of subscribers may be excessive; instead, NTCA proposes that carriers should test the lesser of 50 locations per state or 5% of active subscribers. NTCA AFR at 13-18. See also ITTA PFR Comments at 2-4. NTCA expresses some concern that customers will be suspicious of testing and will not consent to testing even with carriers’ inducements. See Letter from Joshua Seidemann, VP of Policy – Industry Affairs and Business Development, NTCA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at (May 7, 2019) (NTCA May 7, 2019 Ex Parte). Further, NTCA argues that carriers should not be required to upgrade the speed or customer premises equipment for individual locations even temporarily to conduct speed tests. NTCA AFR at 13-18; NTCA May 7, 2019 Ex Parte at 3. See also WTA May 6, 2019 Ex Parte at 2; Letter from Michael J. Jacobs, VP, Regulatory Affairs, ITTA – The Voice of America’s Broadband Providers, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 1-2 (Aug. 6, 2019) (ITTA Aug. 6, 2019 Ex Parte) (arguing that carriers should be able to use their own randomization tools to develop test samples and should not be required to upgrade test locations if other locations ordering such speeds exist); Letter from Louis Peraertz, Vice President of Policy, WISPA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 1-3 (Aug. 13, 2019) (WISPA Aug. 13, 2019 Ex Parte) (stating that requiring carriers to upgrade customers speeds would be required to do a truck roll to provide the customer with higher speed service and could deter applicants in future auctions from bidding at higher speed tiers); USTelecom Oct. 15, 2019 Ex Parte at 1-2. WTA suggests that, at least for rural carriers, the number of test locations should be much lower than adopted in the Order. Smaller carriers must test larger percentages of their customers compared to larger carriers; WTA AFR at 7. accordingly, WTA argues, the Commission should permit testing of just 10-15 locations or 2-3% of subscribers in each CAF-required service tier. Id. at 6-9. See also ITTA PFR Comments at 2-4. WTA also asks that the Commission resolve uncertainties around the testing, including whether the “testing and reporting is covered or not covered by Commission rules or policies concerning Customer Proprietary Network Information.” See WTA May 6, 2019 Ex Parte at 3. 41. NTCA, as well as USTelecom, ITTA, and WISPA, also ask that the Commission clarify that carriers may use the same locations for testing both speed and latency. See USTelecom/ITTA/WISPA PFR at 21-23; NTCA AFR at 22. See also AT&T PFR Comments at 8-9; Midcontinent PFR Comments at 7; Opposition of NTCA – The Rural Broadband Association, National Rural Electric Cooperative Association, and Utilities Technology Council to Petitions for Reconsideration, WC Docket No. 10-90, at 17-18 (Nov. 7, 2018) (NRECA/UTC PFR Comments); USTelecom/ITTA/WISPA PFR Reply at 4-5. USTelecom, ITTA, and WISPA explain that, if carriers must conduct speed and latency testing at different locations, the number of subscribers that must be tested would be unnecessarily doubled, which “would be particularly troublesome for smaller recipients, many of whom will be drawing test locations from a small group of subscribers.” USTelecom/ITTA/WISPA PFR at 21. Similarly, the petitioners explain, the requirement regarding the number of test locations should be clarified to be exactly the same for both speed and latency. Id. at 22-23. These clarification proposals drew broad support from commenters. See AT&T PFR Comments at 8-9; USTelecom July 31, 2018 Ex Parte at 3; Midcontinent PFR Comments at 7; NTCA/NRECA/UTC Opposition at 17-18; USTelecom/ITTA/WISPA PFR Reply at 4-5. For example, comments submitted jointly by NTCA, NRECA, and UTC assert that the clarifications would help providers “avoid unnecessary costs and excessive administrative burden,” NTCA/NRECA/UTC Opposition at 18. while Midcontinent Communications notes that using “the same panelists for speed and latency testing for CAF purposes would align with [its] internal testing practices.” Midcontinent PFR Comments at 7. 42. A few parties offer suggestions regarding the parameters for the random selection process. In particular, WTA asks that locations should be tested for five years, instead of two years, before a new random sample of test locations is chosen. WTA AFR at 6-9. See also ITTA PFR Comments at 4-5. WTA also proposes that twice the required random number of testing locations be provided to carriers so that carriers can replace locations where residents refuse to participate or have incompatible CPE. Letter from Derrick B. Owens, Senior Vice President, Government & Industry Affairs, and Gerard J. Duffy, Counsel, WTA – Advocates for Rural Broadband, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90 at 3 (Apr. 17, 2019) (WTA Apr. 17, 2019 Ex Parte). Frontier, in an ex parte filing, proposes that carriers be allowed to test only new customer locations; it argues that installing the necessary testing equipment at older locations requires more time than is available with the adopted testing schedule. Letter from A.J. Burton, Vice President, Federal Regulatory, Frontier Communications, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 1-2 (Dec. 18, 2018) (Frontier Ex Parte). 43. We decline to modify the adopted sample sizes for testing speed and latency. To minimize the burdens of testing, the Bureaus have used a “trip-wire” approach in determining the required sample sizes. In other words, the adopted sample sizes produce estimates with a high margin of error but can show where further inquiry may be helpful; our target estimation precision is a 90% confidence level with an 11.5% margin of error. For the largest carriers, i.e., those with over 500 subscribers in a given state and speed service tier, this requires a sample size of 50 subscriber locations. For the smallest carriers, the Bureaus adopted small sample sizes that result in less precision, with the margin of error reaching 34.9%, to reduce the testing burden on smaller providers. Reducing the sample sizes for smaller carriers even more would further reduce the resulting estimation precision—making the test data even less likely to be representative of the actual speed and latency consumers experience on CAF-supported networks. We therefore do not modify the required numbers of subscriber locations carriers must test. 44. Nonetheless, we recognize that a few carriers facing unique circumstances may find it extraordinarily difficult to find a sufficient number of subscriber locations to test. See MTC PFR at 1-3. Although we decline to modify the adopted sample sizes, the Commission appreciates that special circumstances occasionally demand exceptions to a general rule. We note that some entities may have misunderstood the testing requirements. MTC explains that it “is regulated by the Commission as a price cap ILEC, [and] offers several speed tiers for Internet subscribers in CNMI. Due to the challenging business environment, however, certain service tiers have very few customers even when counting non-CAF-supported areas.” MTC PFR at 2. As a price cap carrier, MTC is receiving CAF Phase II support, which requires build out of 10/1 Mbps. Thus, MTC is only required to test in one speed tier, 10/1 Mbps. The fact that MTC offers additional speed packages to its customers does not increase the number of test locations required. MTC will be required to test a random sample of up to 50 locations drawn from the number of CAF-supported locations at which there is a subscriber. The Commission’s rules may be waived for good cause shown. 47 CFR § 1.3. Waiver of the Commission’s rules is appropriate only if both: (1) special circumstances warrant a deviation from the general rule, and (2) such deviation will serve the public interest. See Northeast Cellular Tel. Co. v. FCC, 897 F.2d 1164, 1166 (D.C. Cir. 1990) (citing WAIT Radio v. FCC, 418 F.2d 1153, 1157-59 (D.C. Cir. 1969), cert. denied, 93 S.Ct. 461 (1972)) (Northeast Cellular). 45. For carriers that cannot find even five CAF-supported locations to test, we also reconsider the Bureaus’ decision to permit testing of non-CAF-supported active subscriber locations within the same state and service tier. See Performance Measures Order, 33 FCC Rcd at 6522-23, para. 36. Testing and reporting speed and latency for non-CAF-supported locations adds unnecessary complexity to our requirements. Accordingly, we require that any non-compliant carrier testing fewer than five CAF-supported subscriber locations because more are not available would be subject to verification that more customers are not available, rather than requiring that all carriers testing fewer than five CAF-supported subscriber locations find non-CAF-supported locations to test. 46. Additionally, we recognize that, as several parties have noted, obtaining customer consent for testing which requires placement of testing equipment on customer premises may prove difficult. See, e.g., NTCA May 7, 2019 Ex Parte; NTCA Oct. 17, 2019 Ex Parte at 4-5. We believe that our revised testing implementation schedule (discussed below) will help alleviate this concern, particularly for smaller carriers. Numerous vendors are developing software solutions that will allow providers to test the service at customer locations without requiring any additional hardware at the customer’s premises. Further, we direct WCB to publish information on the Commission’s website explaining the nature and purpose of the required testing—to ensure that carriers are living up to the obligations associated with CAF support—and urging the public’s participation. We expect that providing such information in an easy-to-understand format will help alleviate subscribers’ potential concerns. Moreover, we emphasize that no customer proprietary network information is involved in the required testing or reporting, other than information for which the carrier likely would already have obtained customer consent; See, e.g., Letter from Gerard J. Duffy, Regulatory Counsel WTA – Advocates for Rural Broadband, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 3 (May 9, 2019). carriers routinely perform network testing of speed and latency and the performance measures testing we are requiring is of a similar nature. 47. We agree with comments recommending that the same sample sizes adopted for speed should also apply to latency, and that the same subscriber locations should be used for both speed and latency tests. As some parties have noted, requiring testing of two separate sets of subscriber locations for speed and latency, rather than the same group of locations for both, is unnecessarily burdensome. By requiring speed and latency tests at the same subscriber locations, we reduce the amount of equipment, coordination, and effort that may otherwise be involved in setting up testing. Therefore, carriers will test all of the locations in the random sample for both speed and latency. We note that because the Commission is adopting different implementation dates for testing of different broadband deployment programs, a carrier will receive a separate random sample of testing locations for each program for which it must do performance testing. In the Performance Measures Order, the Bureaus stated that, “[a] carrier with 2,000 customers subscribed to 10/1 Mbps in one state through CAF Phase II funding and 500 RBE customers subscribed to 10/1 Mbps in the same state, and no other high-cost support with deployment obligations, must test a total of 50 locations in that state for the 10/1 Mbps service tier.” Performance Measures Order, 33 FCC Rcd at 6524, para. 39 (citations omitted). But because CAF Phase II and RBE have different implementation dates for testing, the carrier in this example must test 50 locations for its CAF Phase II obligations and 50 locations for its RBE obligations. Because programs will have different testing implementation and ending dates, and non-compliance will be calculated for each program individually, a carrier will be required to test a different sample for each program. Similarly, because we now require carriers to use the same sample for both speed and latency, we reconsider the requirement that carriers replace latency testing locations that are no longer actively subscribed after 12 months with another actively subscribed location. See Performance Measures Order, 33 FCC Rcd at 6524, para. 40. The Bureaus did not make clear if this provision applied to both speed and latency test locations. To avoid confusion, we clarify that the same replacement requirements should apply to both speed and latency. Therefore, we now require that carriers replace non-actively subscribed locations with another actively subscribed location by the next calendar quarter testing. We note that if a carrier chooses to test more than the required number of locations, it will be required to replace any locations to maintain the number the carrier originally chose to test, even if that number is greater than the number of testing locations required. Although we do not believe it is necessary for carriers to obtain a random list of twice the number of required testing locations at the outset, See WTA Apr. 17, 2019 Ex Parte at 3. carriers should be able to obtain additional randomly selected subscriber locations as necessary for these kinds of situations. 48. We reconsider the Bureaus’ requirement that carriers meet and test to their CAF obligation speed(s) regardless of whether their subscribers purchase Internet service offerings with speeds matching the CAF-required speeds for those CAF-eligible locations. See Performance Measures Order, 33 FCC Rcd at 6528, para. 51. Specifically, in situations where subscribers purchase Internet service offerings with speeds lower than the CAF-required speeds for those locations, carriers are not required to upgrade individual subscriber locations to conduct speed testing unless there are no other available subscriber locations at the CAF-required speeds within the same state or relevant service area. We recognize that there may be significant burdens associated with upgrading an individual location, particularly when physically replacing equipment at the customer premises is necessary. See USTelecom Oct. 15, 2019 Ex Parte at 1. See also NTCA May 7, 2019 Ex Parte; WISPA Aug. 13, 2019 Ex Parte at 2. Some carriers may still find it necessary to upgrade individual subscriber locations, at least temporarily, to conduct speed testing. Carriers may not charge customers for any upgrades, new modems, or other testing equipment required by the carrier to comply with its testing obligations. Performance testing is a requirement of receiving CAF support and is the responsibility of the carrier, not the customer. See USF/ICC Transformation Order, 26 FCC Rcd at 17705-06, para. 109. We do not allow carriers to select their own randomization procedures and make only those customers that have subscribed to the required speed level eligible to be part of the sample. This would not produce a random sample, particularly for latency testing. See ITTA Aug. 6, 2019 Ex Parte at 1-2. In the event locations must be upgraded for testing, USAC will use its randomization procedures to determine which locations must be upgraded. We do not believe that requiring temporary upgrades of service of testing locations in these instances will discourage bidding in future auctions. See id. Carriers participating in auctions should be prepared to provide the required speeds at all of the locations in the relevant service area and should anticipate that over time more and more customers in the service area will be purchasing the higher-speed offerings. 49. Finally, we reject proposals to require testing only of newly deployed subscriber locations See Frontier Ex Parte at 1-2. and to maintain the same sample for more than two years. See WTA AFR at 6-9. WTA also argues that, because many customers will refuse to consent to carriers’ testing at customer locations, twice the number of required test locations should be selected and provided to carriers, and carriers should be able to self-certify the reasons for not being able to test specific customer locations. See WTA Apr. 17, 2019 Ex Parte. The Commission is working with USAC to implement the random selection process and will ensure that carriers will be able to obtain additional randomly selected subscribers as needed. If we were to permit testing of only new locations, carriers’ speed and latency test data would not reflect their previous CAF-supported deployments, for which carriers also have ongoing speed and latency obligations. Moreover, although the Bureaus adopted the Order in 2018, carriers have been certifying that their CAF-supported deployments meet the relevant speed and latency obligations for several years. Requiring testing of older locations should not prove a problem for carriers that have been certifying that their deployments properly satisfy their CAF obligations. In any case, further shrinking the required sample to include only more recent deployments would compromise the effectiveness of the “trip-wire” sample; the Commission would not be able to identify potential problems with many older CAF-supported deployments. Maintaining the same sample beyond two years would present the opposite problem. By excluding newer deployments, the Commission’s understanding of carriers’ networks would be outdated; the Bureaus’ decision to require testing a different set of subscriber locations every two years struck the correct balance between overburdening carriers and maintaining a current, relevant sample for testing. F. Quarterly Testing 50. The Bureaus required quarterly testing for speed and latency. In particular, to capture any seasonal effects and differing conditions throughout the year that can affect a carrier’s broadband performance, the Bureaus required carriers subject to the performance measures to conduct one week of speed and latency testing in each quarter of the calendar year. Performance Measures Order, 33 FCC Rcd at 6520, para. 29. 51. WTA argues that spreading testing across the year imposes a substantial burden, particularly on rural carriers, without producing more accurate information than a single week of testing. WTA AFR at 4. WTA also contends that obtaining consent from customers to allow testing for four weeks a year “is going to be extremely difficult and likely to become a customer relations nightmare.” Id. at 10. Instead, WTA argues that testing for a single week in late spring or early fall would be more representative of typical Internet usage. Id. at 5. WTA cites these claimed difficulties as a reason for reducing the number of weeks of annual testing, reducing the numbers of locations to be tested, allowing more flexible selection of customer locations, and using the test locations for longer periods. Id. at 15. 52. We decline to adjust the quarterly testing requirement as proposed by WTA. As the Bureaus acknowledged when they adopted the quarterly requirement, different conditions exist throughout the year that can affect service quality, including changes in foliage, weather, and customer usage patterns, school schedules, holiday shopping, increased or decreased customer use because of travel and sporting events, and business cycles. Performance Measures Order, 33 FCC Rcd at 6520, para. 29 (“we expect test results to reflect a carrier’s performance throughout the year, including during times of the year in which there is a seasonal increase or decrease in network usage”). The goal of the testing requirements is to ensure that consumers across the country experience consistent, quality broadband service throughout the year, not at only one defined point during the year. Additionally, we believe WTA’s concerns regarding customer consent are unfounded. We expect that once the requisite technology and software to conduct the required testing has been installed, testing the performance of the network for one week per quarter will not impose any additional significant burden on carriers or customers. Moreover, the tests themselves use so little bandwidth that we do not believe customers will even notice that testing is occurring. See id. at 6520-21, para. 32. Indeed, as the Bureaus explained, quarterly testing “strikes a better balance of accounting for seasonal changes in broadband usage and minimizing the burden on consumers who may participate in testing.” Id. G. Flexibility in Choosing Testing Methods 53. We confirm that carriers may use any of the three methodologies outlined in the Performance Measures Order to demonstrate their compliance with network performance requirements. The Commission has previously determined that it should provide carriers subject to performance testing with flexibility in determining the best means of conducting tests. In 2013, WCB had determined that price cap carriers generally may use “existing network management systems, ping tests, or other commonly available network measurement tools,” as well as results from the MBA program, to demonstrate compliance with latency obligations associated with CAF Phase II model-based support. CAF Phase II Price Cap Service Obligation Order, 28 FCC Rcd at 15071, para. 23. Thus, the Bureaus concluded that ETCs subject to fixed broadband performance obligations would be permitted to conduct testing by employing either: (1) MBA testing infrastructure (MBA testing), (2) existing network management systems and tools (off-the-shelf testing), or (3) provider-developed self-testing configurations (provider-developed self-testing or self-testing). Performance Measures Order, 33 FCC Rcd at 6513, para. 9. The Bureaus reasoned that the flexibility afforded by three different options offered “a cost-effective method for conducting testing for providers of different sizes and technological sophistication.” Id. at 6513, para. 10. 54. NTCA requests clarification about language in the Order stating that “MBA testing must occur in areas and for the locations supported by CAF, e.g., in CAF Phase II eligible areas for price cap carriers and for specific built-out locations for RBE, Alternative Connect America Cost Model (A-CAM), and legacy rate-of-return support recipients.” NTCA AFR at 22-23 (quoting Performance Measures Order, 33 FCC Rcd at 6513, para. 9). NTCA contends that this language refers to previously-promulgated MBA testing requirements and that the Commission should clarify that ETCs subject to fixed broadband performance obligations should be permitted to use any of three testing options outlined by the Bureaus. Id. at 23. We note that legacy rate-of-return support recipients are those carriers that receive Connect America Fund Broadband Loop Support (CAF-BLS). 55. The language highlighted by NTCA applies only to carriers choosing the MBA testing option; the Bureaus set out additional, separate requirements for carriers choosing to use off-the-shelf or provider-developed testing options. Performance Measures Order, 33 FCC Rcd at 6513, para. 9. As the Order explained, in the event that a carrier opts to use the MBA testing methodology to collect performance data, it must ensure boxes are placed at the appropriate randomly selected locations in the CAF-funded areas, as required for the CAF testing program. Id. If, on the other hand, a carrier opts for either off-the-shelf testing tools or its own self-testing, it must use the testing procedures specific to the providers’ respective chosen methodology. Id. H. Standards for Full Compliance 56. To achieve full compliance with the latency and speed standards, the Order required that 95% of latency measurements during testing windows fall below 100 ms round-trip time, and that 80% of speed measurements be at 80% of the required network speed. Based on the standard adopted by the Commission in 2011, WCB used ITU calculations and reported core latencies in the contiguous United States in 2013 to determine that a latency of 100 ms or below was appropriate for real-time applications like VoIP. See CAF Phase II Price Cap Service Obligation Order, 28 FCC Rcd at 15068-70, paras. 19-22. WCB thus required price cap carriers receiving CAF Phase II model-based support to test and certify that 95% of testing hours latency measurements are at or below 100 ms (the latency standard). See id. at 15068-74, paras. 19-32. Later, WCB sought comment on extending the same testing methodologies to other high-cost support recipients serving fixed locations, See 2014 Broadband Measurement and Reporting Public Notice, 29 FCC Rcd at 12625-26, paras. 5-14. and in multiple orders, the Commission extended the same latency standard to RBE participants, rate-of-return carriers electing the voluntary path to model support, CAF Phase II competitive bidders not submitting high-latency bids, and Alaska Plan carriers. See Rural Broadband Experiments Order, 29 FCC Rcd at 8795, para. 78; Connect America Fund, WC Docket No. 10-90, Report and Order, Order and Order on Reconsideration, and Further Notice of Proposed Rulemaking, 31 FCC Rcd 3087, 3099, para. 28 (2016) (2016 Rate-of-Return Reform Order); CAF Phase II Auction Order, 31 FCC Rcd at 5960, para. 31 FCC Rcd at29; Connect America Fund, WC Docket No. 10-90, Report and Order and Further Notice of Proposed Rulemaking, 31 FCC Rcd at10139, 10146-47, paras. 19-20 (2016) (Alaska Plan Order). 57. The Bureaus ultimately reaffirmed and further extended the latency standard to all high-cost support recipients serving fixed locations, except those carriers submitting high-latency bids in the CAF Phase II auction. Performance Measures Order, 33 FCC Rcd at 6527-28, para. 50. In doing so, the Bureaus noted that the data on round-trip latency in the United States had not markedly changed since the 2013 CAF Phase II Price Cap Service Obligation Order, and that no parties challenged the Commission’s reasoning for the existing 100 ms standard. Id. (citing 2016 Rate-of-Return Reform Order, 31 FCC Rcd at 3099, para. 28 (noting that no parties objected to extending the latency standard already adopted for price cap carriers to rate-of-return carriers)). More recently, the Bureaus refreshed the record, seeking comment on USTelecom’s proposal that certifying “full” compliance means that 95 to 100% of all of an ETC’s measurements during the test period meet the required speed. See 2017 Performance Measures Public Notice, 32 FCC Rcd at 9324-26, paras. 8-9 (citing Letter from Kevin Rupy, Vice President, Law & Policy, USTelecom, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 4-6 (May 23, 2017) (USTelecom May 23, 2017 Ex Parte)). The Bureaus then adopted a standard requiring that 80% of a carrier’s download and upload measurements be at or above 80% of the CAF-required speed (i.e., an 80/80 standard). Id. at 6528, para. 51. The Bureaus explained that this speed standard best meets the Commission’s statutory requirement to ensure that high-cost-supported broadband deployments provide reasonably comparable service as those available in urban areas. Id. at 6528-29, paras. 51-52. The Bureaus also noted that they would exclude from certification calculations certain speed measurements above a certain threshold to ensure that outlying observations do not unreasonably affect results. See id. at 6528 n.145. 58. In their Petition, USTelecom, ITTA, and WISPA complain that “[t]here is . . . a significant disparity in compliance thresholds for speed and latency,” and ask that the Bureaus require ETCs’ latency measurements to meet 175 ms at least 95% of the time. USTelecom/ITTA/WISPA PFR at 10-12. The petitioners argue that, before accepting CAF Phase II model-based support, carriers could not have fully understood whether the latency standard adopted in 2013 was appropriate, apparently because it was adopted “almost two full years before price cap carriers accepted CAF Phase II support,” and other “reasonable” requirements were adopted later. Id. at 10. Further, the petitioners argue, the same ITU analysis that WCB relied on in 2013 to adopt the latency standard “found that consumers continue to be ‘satisfied’ with speech quality at a one-way mouth-to-ear latency of 275 ms or a provider round-trip latency of 175 ms,” so “treating a latency result that is even one millisecond above 100 ms as a violation . . . penaliz[es] recipients for providing users with voice quality with which they are fully satisfied.” Id. at 11. Changing the standard to require latency measurements of 175 ms or better 95% of the time, petitioners assert, would better align the latency standard with the speed standard, which is designed to ensure that high-cost-supported broadband deployments are reasonably comparable to those in urban areas. Id. at 11-12. See also USTelecom July 31, 2018 Ex Parte at 2-3; AT&T PFR Comments at 9-11; Midcontinent PFR Comments at 3. 59. NTCA, NRECA, and UTC oppose the petitioners’ request to “align” the latency standard with the speed standard. Defending the 95% threshold adopted by the Bureaus, these parties explain that low latency is necessary to support achieving a “reasonably comparable” level of service, and the 95% compliance benchmark for latency is a “reasonable” standard for that. NTCA/NRECA/UTC Opposition at 12. NTCA, NRECA, and UTC also point out that voice telephony is a required component of CAF deployment obligations, and that reducing latency standards can compromise the quality of voice service and other vital broadband applications reliant on low latency. See id. at 3-4. Moreover, speeds may vary up to 20% because of “networking protocols, interference and other variances that affect all providers and whose accommodation is technology neutral,” but such factors do not affect latency. Id. at 13. Thus, they say, the record supports the adopted latency standard. 60. Multiple parties seek clarifications regarding implementation of the 80/80 speed standard adopted in the Order. In particular, carriers expressed concern that compliance will be measured against advertised speeds, rather than the speeds carriers are obligated to provide in exchange for CAF support. See NTCA AFR at 18-20; NTCA/NRECA/UTC Opposition at 16-17; Comments of Alaska Communications Systems, WC Docket No. 10-90, at 2-5 (Nov. 7, 2018) (ACS Comments); USTelecom/ITTA/WISPA PFR Reply at 2-3. In addition, USTelecom, ITTA, and WISPA, among others, See, e.g., ACS Comments at 2-5; AT&T PFR Comments at 13-14; Midcontinent PFR Comments at 4-7. challenge the Bureaus’ finding that speed test results greater than 150% of advertised speeds are likely invalid and ask that the Bureaus reconsider automatically excluding those measurements from compliance calculations. USTelecom/ITTA/WISPA PFR at 15-19. Instead, Vantage Point suggests, the Commission should consider excluding data points beyond a defined number of standard deviations, rather than setting a 150% cutoff for measurements. Vantage Point Aug. 28, 2018 Ex Parte at 4. 61. We decline to modify the longstanding latency standard requiring that 95% of round-trip measurements be at or below 100 ms. As petitioners acknowledge, the standard was initially adopted in 2013, before carriers accepted CAF Phase II model-based support. See USTelecom/ITTA/WISPA PFR at 10. Petitioners claim that, as a result, “no future recipient could have been expected to assess the appropriateness of this prematurely adopted requirement,” Id. but, in fact, carriers accepted CAF Phase II support conditioned on the requirement that they certify to the adopted latency standard. In other words, carriers assessed the appropriateness of the standard and decided that they would be able to certify meeting the standard—or, at the very least, accepted that they would risk losing CAF Phase II support if they were unable to meet the standard. Moreover, no parties sought reconsideration when the standard was originally adopted, and the Commission later extended the same standard to other high-cost support recipients in the years following. See Rural Broadband Experiments Order, 29 FCC Rcd at 8795, para. 78; 2016 Rate-of-Return Reform Order, 31 FCC Rcd at 3099, para. 28; CAF Phase II Auction Order, 31 FCC Rcd at 5960, para. 29; Alaska Plan Order, 31 FCC Rcd at 10146-47, paras. 19-20. 62. We also note that latency is fundamentally different from speed and therefore requires a different standard to ensure that CAF-supported broadband Internet service is reasonably comparable to service in urban areas. See NTCA/NRECA/UTC Opposition at 12. The 100 ms standard, which is more lenient than the 60 ms standard originally proposed, See Wireline Competition Bureau Seeks Further Comment on Issues Regarding Service Obligations for Connect America Phase II and Determining Who Is an Unsubsidized Competitor, WC Docket No. 10-90, Public Notice, 28 FCC Rcd 1517, 1524, para. 25-26 (WCB 2013). In proposing the 60 ms standard, WCB pointed to recent testing results that “show[ed] that the average peak period round trip UDP latency for all wireline terrestrial technologies is less than 60 ms.” Id. ensures that subscribers of CAF-supported Internet service can use real-time applications like VoIP. CAF Phase II Price Cap Service Obligation Order, 28 FCC Rcd at 15073, para. 28. If we were to require 95% of latency measurements to be only 175 ms or lower, we would be relaxing the standard considerably—permitting CAF-supported Internet service to have 75% higher latency than permitted by the existing standard adopted by the Commission. Further, lowering the existing standard would not decrease burdens on carriers and provide “a more efficient compliance and enforcement process,” as the petitioners suggest. The carriers need only to conduct tests, which can be automated, and provide the data; USAC will complete the necessary calculations to determine compliance. To the extent that parties argue that the 100 ms standard is overly strict and that consumers may be satisfied with higher latencies, that standard was adopted in prior Commission orders and thus is not properly addressed in this proceeding, which is to determine the appropriate methodology for measuring whether high-cost support recipients’ networks meet established performance levels. 63. We clarify, however, that carriers are not required to provide speeds beyond what they are already obligated to deploy as a condition of their receipt of high-cost support. Thus, for a location where a carrier is obligated to provide 10/1 Mbps service, we only require testing to ensure that the location provides 10/1 Mbps service, even if the customer there has ordered and is receiving 25/3 Mbps service. 64. Regarding the trimming of data in calculating compliance with the speed standard, we reconsider the Bureaus’ decision to exclude from compliance calculations any speed test results with values over 150% of the advertised speed for the location. Instead of trimming the data at the outset as the Bureaus had required, we direct the Bureaus to study data collected from carriers’ pre-testing and testing and determine how best to implement a more sophisticated procedure using multiple statistical analyses to exclude outlying data points from the test results. We anticipate that the Bureaus will develop such a procedure for USAC to implement for each carrier’s test results in each speed tier in each state or study area and may involve determining whether multiple methods (e.g., the interquartile range, median absolute deviation, Cook’s distance, Isolation Forest, or extreme value analysis) flag a particular data point as an anomaly. I. Remedies for Non-Compliance 65. The Performance Measures Order also established a framework of support reductions that carriers would face in the event that their performance testing did not demonstrate compliance with speed and latency standards to which each carrier is subject. The Bureaus considered numerous approaches to address non-compliance with the required speed and latency standards. See, e.g., USTelecom May 23, 2017 Ex Parte at 4-6, Exhibit A; Comments of NTCA – The Rural Broadband Association, WC Docket No. 10-90, at 15-17 (Dec. 6, 2017); Comments of the Wireless Internet Service Providers Association, WC Docket No. 10-90, at 7-10 (Dec. 6, 2017); Comments of WTA – Advocates for Rural Broadband, WC Docket No. 10-90, at 11 (Dec. 6, 2017). They adopted a “four-level framework that sets forth particular obligations and automatic triggers based on an ETC’s degree of compliance with our latency, speed, and, if applicable, MOS testing standards in each state and high-cost support program.” Performance Measures Order, 33 FCC Rcd at 6531-32, para. 60. Under this scheme, compliance for each standard is separately determined, with the percentage of a carrier’s measurements meeting the relevant standard divided by the required percentage of measurements to be in full compliance. Id. The Bureaus noted that the framework “appropriately encourages carriers to come into full compliance and offer, in areas requiring high-cost support, broadband service meeting standards consistent with what consumers typically experience.” Id. at 6533, para. 65. 66. Broadly, our goal in establishing a performance testing regime is to ensure that consumers receive broadband at the speed and latency to which carriers have committed, and for which they are receiving support. Our compliance regime is designed to encourage them to provide high quality broadband, not to punish carriers for failing to perform. That is why the Bureaus adopted an interim schedule for withholding support for failing to meet the required performance, but to return such support as the carrier comes into compliance. This is consistent with the Commission’s approach to construction of network facilities, i.e. support is withheld if carriers do not meet their build-out milestones, but as the carrier improves its performance, withheld support is returned. There is no correlation in either case between the interim percentages of support withheld and the total per-location support; rather, these interim withholdings are designed solely to encourage the carrier to meet its obligations and ensure that progress is continuing. We note that carriers have their entire support term to improve their networks and come into compliance. Even at the end of the support term, our rules provide for a one-year period before any support is permanently withheld, during which the carrier can show that it has fixed the problems with its network. Further, as explained below, we add an opportunity for carriers to request a larger, statistically valid sample if the carrier believes that the small sample size is the cause of the failure to perform. We therefore anticipate few instances of non-compliance with our performance measures. 67. Several parties urge the Commission to adjust the adopted framework for non-compliance. USTelecom, ITTA, and WISPA jointly argue that non-compliance with the speed and latency requirements is subject to support withholding under the established framework that is “more severe[] than non-compliance with build-out milestones.” USTelecom/ITTA/WISPA PFR at 12. For example, they observe that a carrier with a compliance gap of less than six percent would lose 5% of its high-cost support, while only being subject to quarterly reporting obligations for missing its required build out by up to 14.9%. Id. at 12-13. USTelecom, ITTA, and WISPA instead propose mirroring the precedent established for the deployment milestone framework, with non-compliance with the speed and latency requirements of 5% or less resulting only in a quarterly reporting obligation and non-compliance of 5% to 15% resulting in 5% of funding being withheld. Id. at 14. See also USTelecom July 31, 2018 Ex Parte at 4. Additionally, they request clarification that a carrier not complying with both its performance measurement requirements and deployment requirements will be subject only to a reduction in support equal to the greater of the two amounts, rather than the combined percentage of the two amounts. USTelecom/ITTA/WISPA PFR at 14. AT&T concurs with petitioners that support reductions for failing to comply with performance standards should not be more serious than failure to deploy. AT&T PFR Comments at 11-13. NTCA, NRECA, and UTC jointly contend that “non-compliance (especially if relatively minor in degree) should impose upon the provider the burden of proof to demonstrate a justifiable reason for non-compliance and an avenue toward remediation; it should not eliminate automatically support upon which the provider relies for deployment and operation.” NTCA/NRECA/UTC Opposition at 15-16. WTA proposes that rural carriers not in full compliance be given a six-month grace period “to locate and correct the problem without reduction or withholding of the monthly high-cost support needed to finance the repair, upgrade and operation of [their] networks.” WTA AFR 5-6. See also WTA Apr. 17, 2019 Ex Parte. WTA also reiterates that rural local exchange carriers (LECs) should not lose high-cost support due to the shortcomings of facilities or circumstances over which they have no control and are not able to repair or upgrade. WTA AFR at 6. Finally, Peñasco Valley Telephone Cooperative argues that a 100% success requirement for full compliance does not take into account factors outside the carrier’s control and instead proposes a high percentage benchmark, but less than 100%, to account for these variables. Letter from Salvatore Taillefer, Jr., Counsel to Peñasco Valley Telephone Cooperative, Inc., to Marlene H. Dortch, Secretary, FCC, WC Docket Nol. 10-90, at 2 (Oct. 9, 2018). 68. Except as discussed below, we generally decline to revise the compliance and certification frameworks adopted by the Bureaus. We disagree that the consequences for failure to meet our performance measures are greater than that for failure to meet deployment obligations. As opposed to the deployment obligations that many parties use for comparison, the speed and latency standards adopted by the Bureaus include a margin for error and do not require carriers to meet the established standards in every instance. For example, carriers are required to meet the 100 ms standard for latency only 95% of the time, rather than 100% as suggested by some parties. Similarly, we allow carriers to be in compliance with our speed standards if they provide 80% of the required speed 80% of the time. Moreover, we establish pre-testing periods in which no support reductions for failing to meet standards will occur to allow carriers to adjust to the new regime. This opportunity for pre-testing will ensure that carriers are familiar with the required testing and how to properly measure the speed and latency of their networks. Because carriers will be aware of which locations are being tested, they will be able to monitor their networks prior to beginning the required testing to make sure the network is performing properly. Further, once a location is certified in USAC's High Cost Universal Broadband (HUBB) portal, the carrier has certified that it meets the required standards, so the performance of the network should not be a surprise to the carrier. 69. Some parties have expressed concern about the performance requirements and the non-compliance support reductions. For example, USTelecom, ITTA, and WISPA argue that certain aspects of the compliance framework “penalize non-compliance with broadband speed and latency requirements more severely than non-compliance with build-out milestones.” USTelecom/ITTA/WISPA PFR at 12. They also assert that the compliance framework is “is too stringent and could impede—rather than advance—broadband deployment in rural CAF-supported areas.” Id. We disagree. As a condition of receiving high-cost support, carriers must commit not only to building out broadband-capable networks to a certain number of locations, but also to providing those locations with a specific, defined level of service. Building infrastructure is insufficient to meet a carrier’s obligation if the customers do not receive the required level of service. If a carrier fails to meet its deployment requirements, it will face certain support reductions, and if it likewise fails to meet its performance requirements for locations to which it claims it has deployed, it has failed to fully fulfill its obligations. The compliance framework established by the Bureaus is essential to ensuring that consumers are receiving the appropriate level of service that the carrier has committed to provide. 70. We emphasize that at the conclusion of a carrier’s build-out term, any failure to meet the speed and latency requirements is a failure to deploy because the carrier is not delivering the service it has committed to deliver. As noted above, the purpose of the interim measures for withholding support for network build-out and performance compliance is to ensure continued progress toward the goal of full deployment of service at the required speed and latency. At the end of the term, the Commission’s interest turns to recovering support associated with those locations to which the carrier did not deploy facilities or locations at which the service does not perform at the committed levels. As the Bureaus stated in referring to the compliance levels thresholds, “[w]e emphasize that the goal of this compliance framework is to provide incentives, rather than penalize.” Performance Measures Order, 33 FCC Rcd at 6533, para. 65. The Bureaus also noted that at Level 4 non-compliance for performance measures, the carrier would be referred to USAC to see if that carrier was on a path to meeting its deployment obligations. Id. at 6532, para. 64. A failure to comply with all performance measure requirements will result in the Commission determining that the carrier has not fully satisfied its broadband deployment obligations at the end of its build-out term and subjecting the carrier to the appropriate broadband deployment non-compliance support reductions. We do not consider a carrier to have completed deployment of a universal service funded broadband-capable network simply by entering the required number of locations to which it has built into the HUBB; customers at those locations also must be able to receive service at the specific speed and latency to which the carrier has committed. Simply put, consumers must receive the required level of service before a network can be considered to have been fully deployed. Otherwise, a carrier would not be meeting the conditions on which it receives support to deploy broadband. 71. Several parties argue that there is insufficient notice for clarifying that “any failure to meet the speed and latency requirements will be considered a failure to deploy.” See USTelecom Oct. 15, 2019 Ex Parte at 2-3. We disagree. When establishing the CAF in 2011, the Commission noted that it “will require recipients of funding to test their broadband networks for compliance with speed and latency metrics,” and each recipient of high-cost support with defined build-out obligations must deploy broadband service with available speeds as required by the Commission. USF/ICC Transformation Order, 26 FCC Rcd at 17705, para. 109. Indeed, the Commission found that verifiable test results would allow the Commission “to ensure that ETCs that receive universal service funding are providing at least the minimum broadband speeds, and thereby using support for its intended purpose as required by section 254(e)”; if the support is not used to provide the required level of service, it is not being used for its intended purpose under section 254(e). Id. at 17706, para. 110. Carriers do not receive high-cost support to just install any network; they must deploy a broadband-capable network actually meeting the required speed and latency metrics. See, e.g., December 2014 CAF Phase II Order, 29 FCC Rcd at 15649-56, paras. 15-29 (adopting what was then “a new minimum speed standard of 10 Mbps downstream and 1 Mbps upstream” for CAF Phase II broadband deployment). Indeed, section 54.320(d)(1) of the Commission’s rules provides that “[f]or purposes of determining whether a default has occurred, a carrier must be offering service meeting the requisite performance obligations.” 47 CFR § 54.320(d)(1). Several parties also argue that using carriers’ compliance with the performance measures standards to determine whether a carrier has actually deployed broadband may make it more difficult to obtain a letter of credit as required in the CAF Phase II auction or other future programs.  These parties assert that “[b]anks will be very reluctant to lend given the possibility of . . . penalties 10 years down the line when all prior indications showed that the network was appropriately deployed.”  USTelecom Oct. 15, 2019 Ex Parte at 2-3.  However, these concerns reflect a misunderstanding of the CAF Phase II auction letter of credit requirements.  CAF Phase II auction recipients in particular must complete build-out to all funded locations by the end of the sixth year of support, after which the letter of credit is terminated, and continue receiving support for four years after their final build-out milestone.  See CAF Phase II Auction Order, 31 FCC Rcd at 6017, para. 191.  Any withholding at the end of a carrier’s ten-year support term due to performance measures non-compliance will be long after the carrier is required to complete its deployment to all locations and its letter of credit is terminated, so banks will not be concerned about support withholding at the end of carriers’ support terms.  And in any event, no banks raised this concerns in the record, so carriers’ concerns in this regard are purely speculative. 72. We use the testing data to determine the level of compliance for the carrier’s network, as defined by the Bureaus in the Performance Measures Order. See Performance Measures Order, 33 FCC Rcd at 6531-65, paras. 60-65. Thus, at the end of a carrier’s build-out term, if a carrier has deployed to 100% of its required locations, but its overall performance compliance percentage is 90%, USAC will recover the percentage of the carrier’s support equal to 1.89 times the average amount of support per location received in the state for that carrier over the term of support for the relevant performance non-compliance percentage (i.e., 10%), plus 10 percent of the carrier’s total relevant high-cost support over the support term for that state. See 47 CFR § 54.320(d)(2). For instance, Carrier X deployed to 100 locations, but there was an overall performance compliance percentage of 90%. If the average support per line in the state was $100 over the support term, Carrier X would be required to refund $11,340 ($100 x (10% of 100 locations) x 6 years x 1.89) in support. Similarly, if a carrier deploys to only 90% of the locations to which it is required to build, and of those locations, the performance compliance percentage is 90%, the carrier will be required to forfeit support equal to 1.89 times the average amount of support per location received in the state for that carrier over the term of support for both the 10% of locations lacking deployment and an additional 9% of locations (reflecting a non-compliance percentage of 10% for the 90% deployed locations), plus 10 percent of the carrier’s total relevant high-cost support over the support term for that state. For instance, Carrier X was required to deploy to 100 locations, but only deployed to 90 total locations, a shortfall of 10 locations. Of those 90 locations, there was an overall performance compliance percentage of 90% . If the average support per line in the state was $100, Carrier X would be required to refund $21,546 ($100 x (10 locations+ 10% of 90 locations) x 6 years x 1.89) in support. However, carriers are permitted up to one year to address any shortcomings in their deployment obligations, including ensuring that their performance measurements are 100% in compliance, before these support reductions will take effect. See 47 CFR 54.320(d)(2). NTCA expresses concern whether “a location that fails a test by any margin [is] considered unbuilt into perpetuity” and whether there is an opportunity to cure after a failure of the final test.  NTCA Oct. 17, 2019 Ex Parte at 4 (emphasis in original).  As we clarify herein, non-compliance is determined based on the methodology outlined in the Performance Measures Order, not on a location-by-location basis.  Thus, even if a location fails a test in one particular year, the carrier can still show that it has improved its performance by future testing regardless of its performance at that one specific location.  Moreover, section 54.320(d)(2) provides an additional one-year period after the end of the support term in which the carrier can retest and show that it has come into compliance with the performance standards. 73. To provide certainty to carriers and to take into account that carriers may be in compliance with performance obligations during their testing periods, but for whatever reason may not be in compliance at the end of the support term, we more narrowly tailor our end-of-term non-compliance provisions to recognize past compliance. Accordingly, we will withhold support where a carrier is unable to demonstrate compliance at the end of the support term only for the amount of time since the carrier’s network performance was last fully compliant. Periods when the carrier’s network were not being tested are not considered to be in compliance with testing for purposes of determining the percentage of compliance if the carrier never came into compliance at any time. Specifically, we modify the support recovery required by section 54.320(d) that is related to compliance with performance measures by multiplying it by the percentage of time since a carrier was last able to show full compliance with required performance testing requirements prior to the end of the support term on a quarterly basis. As determined in the Performance Measures Order, compliance with the performance measures standards will be determined based on one year, four consecutive quarters, of testing. Quarterly determinations of compliance related to performance measures will only be used for purposes of determining withholding under section 54.320(d). For example, if a carrier’s failure to meet end-of-term performance measures under section 54.320(d) resulted in it having to repay support associated with 10% of locations to which it was obligated to deploy (and not including any support related to a failure to build and install the network as determined by USAC verifications) and the carrier’s performance testing had not been in compliance with our requirements for the 15 preceding quarters of testing, out of a total of 20 annual quarters in which it received support, the amount of support to be recovered would be multiplied by 15/20 or 3/4. If a carrier was not in compliance with our performance measures for 5 quarters of testing but comes into compliance before or during end-of-term testing, USAC will not recover any support. However, because carriers have an affirmative duty to demonstrate compliance with network performance measures—as they have with respect to physical build-out milestones—a carrier that has never been in compliance with performance testing requirements at any time during the testing period will have the appropriate amount of support withheld at the end of the support term for the entire term. We believe that this approach more narrowly ties the non-compliance consequences to the period of time in which a carrier fails to comply with performance requirements. We also amend section 54.320(d) to reflect that the milestones are calculated based on (1) the term of support for the carrier and program at issue, which is not 10 years in all cases as the rule currently states, and (2) over the support area, which can be by state, study area, or other area. 74. In response to commenters’ concerns regarding the fairness of potentially reducing carriers’ support amounts for both lack of deployment and non-compliance with speed and latency standards, See USTelecom/ITTA/WISPA PFR at 14; AT&T PFR Comments at 11-13. we clarify that at the end of the support term when USAC has performed the calculation to determine the total lack of deployment based on the numbers of locations to which the carrier has built out facilities and the number of locations that are in compliance with the performance measures, USAC will ensure that the total amount of support withheld from the carrier because of failure to meet deployment milestones and performance requirements does not exceed the requirements of section 54.320(d)(2). To facilitate this calculation, we reconsider the decision allowing carriers to recover only the support withheld for non-compliance for 12 months or less. Performance Measures Order, 33 FCC Rcd at 6532, para. 63. When a non-compliant carrier comes into a higher level of compliance, USAC will now return the withheld support up to an amount reflecting the difference between the levels’ required withholding. By returning all the support USAC may have withheld from a carrier for non-compliance, the non-compliance framework will continue to provide an incentive to carriers to return to full compliance with the speed and latency standards. 75. Finally, we provide additional flexibility at the conclusion of a carrier’s build-out term for any carrier that has failed to meet its performance requirements and believes that its failure to do so is the result of a small sample size. As noted above, to minimize the burdens of testing, the Bureaus have used a “trip-wire” approach in determining the required sample sizes; while these sample sizes are useful for demonstrating where further inquiry may be helpful, they are subject to a high margin of error. Thus, if at the end of its term, a carrier is shown not to have met its deployment obligations due to a failure in meeting the speed and latency requirements, the carrier can submit a request to the Bureaus for an increased size of random samples that will produce an estimate with a margin of error of 5% or less and conduct further testing during the additional 12-month period provided in section 54.320(d)(2) to show that the carrier is compliance with the Commission’s performance requirements. WTA in particular expresses concern about withholding support using “extrapolations from the performance testing survey results of the estimated number of locations that may not be receiving the applicable broadband speeds and latency.” WTA Oct. 17, 2019 Ex Parte at 2. See also NTCA Oct. 17, 2019 Ex Parte at 3-4 (presenting a similar argument about withholding support based on performance test results at the end of a carrier’s build-out term). Again, we welcome carriers to voluntarily increase the size of their random samples so that their test results are more representative of the overall quality of their broadband deployments. To mitigate the potential costs and burdens of testing, however, we refrain from mandating that carriers test more locations than the Bureaus determined in the Performance Measures Order. If, after this further testing, the carrier is able to demonstrate that it fully complies with the required speed and latency benchmarks, then the carrier will be considered to have met the deployment obligations. J. Schedule to Commence Testing 76. We are persuaded by the record here to modify the specific schedule to commence speed and latency tests established in the Performance Measures Order. The Performance Measures Order established a deadline of July 1, 2020 for carriers subject to the Order to report the results of testing, with an accompanying certification, for the third and fourth quarters of 2019. Performance Measures Order, 33 FCC Rcd at 6533, para. 67. We now adopt a modified approach to enable better individualization to the specific circumstances of a given provider. 77. We conclude that it is appropriate under the circumstances to modify the scheduled start of performance testing to link speed and latency testing to the deployment obligations for carriers receiving support from each of the various high-cost support mechanisms. We believe this solution best balances the Commission’s responsibility to ensure that consumers are receiving the promised levels of service in a timely manner with the ability of all carriers to undertake the required performance testing. This approach also allows larger price cap carriers that are further along in their deployments and are more able, at this point, to begin testing to do so without additional delay. Moreover, the rolling testing schedule we adopt will be less administratively burdensome for Commission staff by allowing for more individualized review and evaluation of testing results over time. Pushing back testing will have the added benefit of allowing additional time for the marketplace to further develop solutions for carriers to undertake the required testing. We are aware of numerous vendors offering solutions and believe they will be ready in time for the revised testing deadlines. See, e.g., https://www.vantagepnt.com/2018/10/02/introducing-betti-box-network-performance-testing-solution/ (“Vantage Point Solutions Introduces BETTI Box for Network Performance Testing”); https://www.axiros.com/news-and-announcements/speed-and-latency-test-program-for-caf-recipients (“Axiros Announces FCC Compliant Speed And Latency Test Program For Connect America Fund (CAF) Phase II Recipients”); https://finepoint.com/products/ (“We have also developed a complete project plan for addressing the FCC’s testing requirement that broadly includes 8 steps.”); https://www.calix.com/press-release/2019/02--february-/calix-launches-connect-america-fund-performance-testing-solution.html (“Calix Launches Connect America Fund Performance Testing Solution for the Calix Smart Home Platform Leveraging Speedtest® by Ookla®”); https://samknows.com/blog/connect-america-fund (“SamKnows has an ISP solution for both [testing] of the options the FCC put forward.”). 78. We also implement a pre-testing period that will occur prior to the commencement of each carrier’s testing start date. As with the testing period, this pre-testing period will be aligned with a carrier’s deployment obligations for the specific high-cost mechanism under which it receives support and will require the filing of data regarding pre-testing results. Pre-testing will require carriers to conduct testing according to the Commission’s requirements using a USAC-determined random sample of subscribers, and results must be submitted to USAC within one week of the end of each quarter (i.e., by April 7 for the first quarter, July 7 for the second quarter, etc.). We note that USAC’s interface for accepting performance testing results may not be complete when CAF Phase II carriers need to submit their pre-testing data. USAC will provide notice of alternative submission methods and/or the date when such interface will be available prior to the end of the first quarter of 2020. 79. However, no support reductions will be assessed during the pre-testing period, as long as carriers actually undertake the pre-testing and report their results. Carriers that fail to conduct pre-testing and submit results in a timely fashion will be considered to be at Level 1 non-compliance. See Performance Measures Order, 33 FCC Rcd at 6532-33, paras. 64-67. USAC will withhold 5% of a carrier’s monthly support payments while Level 1 non-compliant. The random sample for pre-testing can be used by the carrier for a total of two years, meaning that carriers will need to obtain a new random sample after two years of pre-testing/testing. Thus, for example, if a carrier does one year of pre-testing and then one year of testing, it will need to obtain a new random sample prior to beginning the second year of testing. While there will be no support reductions during the pre-testing period (as long as the carrier undertakes the testing and reports results), the filing will allow Commission staff to evaluate the pre-testing data and determine if any adjustments to the testing regime are needed to ensure that the testing period is successful. In addition, pre-testing will give carriers an opportunity to see how their networks and testing software and hardware perform and make any changes necessary. We direct the Bureaus to amend the performance measures as appropriate based on the information learned and experience gained from the pre-testing period. We do not modify our rules to permit parties to file untimely petitions for reconsideration, nor do we direct the Bureaus to reexamine the testing requirements with a “fresh look” after collecting test data from pre-testing.  See WTA Oct. 17, 2019 Ex Parte at 3.  The Bureaus have ample authority and direction to make revisions if necessary.  Regarding the availability of pre-testing data, we will make it publicly available in the most granular format possible as soon as feasible.  Doing so will allow a full opportunity for all stakeholders to assess the pre-testing results.  To protect potentially competitively sensitive information and to give carriers a full opportunity to see how testing hardware and software perform on their networks during the pre-testing period, however, we further direct WCB to aggregate and/or anonymize pre-testing data as appropriate. 80. Several industry associations support the approach we adopt to tie speed and latency testing to a carrier’s deployment obligations for the specific high-cost program under which it receives support. We believe the changes adopted to the testing initiation dates for small carriers also will allow sufficient time for any issues regarding the HUBB to be addressed. See NTCA May 7, 2019 Ex Parte at 1-2. Specifically, ITTA, USTelecom, and WISPA advocate aligning a carrier’s performance obligations with its deployment obligations, as well as designating the first two quarters of testing as “transitional and not subject to non-compliance measures for any performance deficiencies” to allow carriers to become familiar with the testing process. ITTA/USTelecom/WISPA Apr. 10, 2019 Ex Parte at 5. In addition, both NTCA and WTA support linking testing obligations to deployment obligations and allowing carriers to have a period of advanced testing before the mandated testing period. Letter from Joshua Seidemann, VP of Policy, NTCA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 2-3 (Mar. 21, 2019) (“the rules should not be effective until vendors have sufficient time to create generally available solutions following decisions on final protocols and standards, and only after ISPs then have sufficient time to “bench test” the solutions”; stating that there should be a “ʻtest the testing’” period”); WTA Apr. 4, 2019 Ex Parte at 1-2, 4 (rural LECs should be given “a period of six months or more to engage in informal practice testing to discover and resolve potential equipment and procedural problems before being required to initiate formal performance testing and reporting”); WTA Apr. 17, 2019 Ex Parte at 4; WTA May 6, 2019 Ex Parte at 1-2, 4 (“some WTA members want to do some informal or limited pre-testing in advance of their quarterly testing week”). We agree with those commenters suggesting that a period to “test the testing” will help ensure that all carriers become familiar with testing methodologies and equipment, as well as prevent or reduce future administrative issues with the testing process. 81. Accordingly, we adopt the schedule below for pre-testing and testing obligations specific to the carriers receiving high-cost universal service support: Schedule for Pre-Testing and Testing Program Pre-testing start date Testing start date CAF Phase II (Price-cap carrier funding) January 1, 2020 July 1, 2020 RBE January 1, 2021 January 1, 2022 Alaska Plan January 1, 2021 January 1, 2022 A-CAM I January 1, 2021 January 1, 2022 A-CAM I Revised January 1, 2021 January 1, 2022 ACAM II January 1, 2022 January 1, 2023 Legacy Rate of Return January 1, 2022 January 1, 2023 CAF II Auction January 1, 2022 January 1, 2023 New NY Broadband Program January 1, 2022 January 1, 2023 82. Because we establish pre-testing and testing periods to coincide with a carrier’s specific deployment obligations under its respective high-cost mechanism, recipients of CAF Phase II model-based support will be the first to undertake the pre-testing period on January 1, 2020. These carriers are required to build out to 80% of their supported locations by December 31, 2019. December 2014 CAF Phase II Order, 29 FCC Rcd at 15657-58, para. 36, Table 1. Recipients of CAF Phase II model-based support are primarily larger carriers that are better positioned to begin testing sooner due to the availability of testing equipment and solutions already in the marketplace for these carriers. During the six-month pre-testing period, these carriers will be required to test the speed and latency of their networks for a weeklong period once per quarter (first and second quarters of 2020) and submit the results to the Commission within one week of the end of each quarter of pre-testing. USAC is in the process of developing a mechanism for carriers to submit testing data electronically. The testing period for CAF Phase II model-based support recipients will commence on July 1, 2020, with speed and latency tests occurring for weeklong periods in both the third and fourth quarters of 2020 and results of that testing submitted by July 2021. 83. RBE support recipients, See Rural Broadband Experiments Order, 29 FCC Rcd at 8794, paras. 74-75. RBE recipients receive a ten-year term of support but must deploy to at least 85% of locations by the end of the third year of support and 100% by the end of year five. Because RBE support was awarded on a rolling basis, recipients have staggered deadlines for meeting the required deployment milestones. However, all RBE support recipients should be fully deployed by the end of 2021, prior to the commencement of testing. as well as rate-of-return carriers receiving model-based support under both the A-CAM I and the revised A-CAM I, will follow a similar, but slightly extended schedule. The pre-testing period for these carriers will commence on January 1, 2021 and will last one full year to ensure that the predominantly smaller carriers receiving support under these mechanisms have adequate time to implement and test their technology and software solutions to meet the Commission’s performance testing requirements. We believe that a longer pre-testing period than the one we adopt for CAF Phase II model-based support recipients is warranted to ensure that any concerns or issues with the testing process are addressed prior to these carriers being subject to support reductions. During this one-year pre-testing period, this group of carriers will be required to test the speed and latency of their networks quarterly for a weeklong period and submit the results to the Commission within one week of the end of each quarter of pre-testing. The testing period for these carriers will begin on January 1, 2022, and results will be submitted to the Commission by July 2023. 84. We also adopt a one-year pre-testing period for recipients of support from the CAF Phase II auction and A-CAM II, as well as legacy rate-of-return support recipients. However, we delay commencement of the pre-testing period for these carriers to account for certain timing considerations. For example, we are in the process of authorizing CAF Phase II auction winners to receive support, See e.g., Connect America Fund Auction Support Authorized for 459 Winning Bids, AU Docket No. 17-182, WC Docket No. 10-90, Public Notice, DA 19-414 (WCB rel. May 14, 2019). and recently authorized rate-of-return carriers electing the A-CAM II offer to receive support. See Wireline Competition Bureau Authorizes 171 Rate-Of-Return Companies To Receive $491 Million Annually in Alternative Connect America Cost Model II Support To Expand Rural Broadband, WC Docket No. 10-90, Public Notice, DA 19-808 (WCB rel. Aug. 22, 2019). Additionally, to increase administrative efficiency, we put legacy rate-of-return carriers on the same schedule as A-CAM II support recipients in light of the fact that their deployment requirements started at approximately the same time. Thus, to allow time for carriers receiving support under these mechanisms not only to be authorized, but also to deploy in a timely manner, we institute a one-year pre-testing period beginning January 1, 2022. The required testing period for these carriers will commence on January 1, 2023. We anticipate that these support recipients will have deployed to at least 40% of their required locations by the end of 2022. Connect America Fund, et al., WC Docket No. 10-90, Report and Order, Further Notice of Proposed Rulemaking, and Order on Reconsideration, 33 FCC Rcd 11893, 11914, para. 67 (2018) (December 2018 Rate-of-Return Order). These carriers will be subject to the same testing and reporting requirements, for both pre-testing and testing, as the other categories of carriers described above, except that these carriers will have a one-year pre-test period rather than a six-month pre-test period. 85. We disagree with those petitioners urging the Commission to adopt a blanket delay of implementation of the testing requirements. NTCA contends that the equipment necessary for the most cost-effective method of testing is not yet fully developed or widely available, particularly in rural markets. NTCA instead proposes that any obligations be suspended or waived until a later time—at least 12 months—following the widespread availability of modems with built-in testing capability to the rural market. NTCA AFR at 9-12; see also Comments of NTCA-The Rural Broadband Association on Applications for Review and Requests for Clarification, WC Docket No. 10-90, at 2 (Oct. 15, 2018). WTA agrees that the necessary testing equipment is unavailable at this time and thus proposes that the Commission postpone testing for rural LECs for at least two years. WTA AFR at 10-12. WTA also proposes to delay support reductions for non-compliance to coincide with build-out milestones. WTA Apr. 17, 2019 Ex Parte at 3-4. WISPA, ITTA, and NTTA support proposals to postpone testing for a time in order to permit equipment to become more available and affordable. See Comments of the Wireless Internet Service Providers Association on Applications for Review, WC Docket No. 10-90, at 1-3 (Oct. 4, 2018); ITTA PFR Comments at 5-7; Letter from Godfrey Enjady, President, National Tribal Telecommunications Association, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 10-90, at 1 (Apr. 14, 2019). 86. We are not convinced that a blanket delay for all carriers subject to the Commission’s performance measure requirements is necessary. As petitioners and commenters observe, large carriers and carriers serving more urban markets are differently situated than smaller carriers serving more rural communities, and these carriers may already be positioned to begin testing. Though a minor delay for all carriers is warranted to allow USAC time to develop and implement specific IT solutions, additional time beyond that for the marketplace to develop technical solutions is necessary only for a certain subset of carriers. As WTA observes, “Whiteboxes for MBA testing are being used by large carriers, but thus far [its members] have generally been unable to obtain Whitebox pricing estimates for their likely levels of demand.” WTA AFR at 11. Similarly, NTCA explains that larger carriers are able to purchase modems and routers at scale or can develop their own proprietary devices, but smaller carriers oftentimes must purchase “off the rack” technology solutions and may have already deployed equipment that cannot be easily retrofitted to accommodate performance testing. NTCA AFR at 9. 87. We agree that a one-size-fits-all approach does not reflect the realities of the marketplace. However, the tiered implementation schedule we adopt strikes a better balance between the interests of carriers in cost-effectively testing their networks’ performance and the Commission’s need to ensure that those networks are performing at the level promised. We further note that WCB has already announced a delay in the requirement to begin testing and reporting of speed and latency results until the first quarter of 2020. See Delay in Initiation of Performance Measures Testing Requirements Until First Quarter of 2020, Public Notice, DA 19-490 (WCB rel. May 30, 2019). 88. Given the changes to the testing framework we adopt, we likewise decline WTA’s suggestion to delay support reductions for non-compliant carriers until they are given an opportunity to address any deficiencies in their networks. WTA AFR at 5. The pre-testing period we adopt will provide carriers with ample opportunity to identify any issues within their network infrastructure that may impact testing results and to rectify those problems prior to undertaking the required testing. As a result, carriers should have minimal, if any, technological or software challenges that prevent them from meeting the Commission’s performance requirements and would require an opportunity to cure. Moreover, because carriers will be testing only those locations that the carrier has certified are deployed with the requisite speed, we do not see a compelling reason to delay support reductions for non-compliance. 89. We likewise decline to further delay testing and reporting obligations for Alaska Communications Systems (ACS). Because carriers serving certain non-contiguous areas of the United States face different operating conditions and challenges from those faced by carriers in the contiguous 48 states, USF/ICC Transformation Order, 26 FCC Rcd at 17737-38, para. 193. the Commission concluded that it was appropriate to adopt tailored service obligations for each non-contiguous carrier that elected to continue to receive frozen support amounts for Phase II in lieu of the offer of model-based support. December 2014 CAF Phase II Order, 29 FCC Rcd at 15662, para. 46. For ACS, the Commission adopted a 10-year term of support to provide a minimum of 10/1 Mbps broadband service with a roundtrip provider network latency requirement of 100 ms or less to a minimum of 31,571 locations. Connect America Fund, Order, 31 FCC Rcd 12086, 12089, 12092-93, paras. 9, 12, 22, 27 (2016). 90. ITTA, USTelecom, and WISPA propose that testing and reporting obligations for ACS be delayed for one year from the date on which they begin for other CAF Phase II model-based support recipients. See ITTA/USTelecom/WISPA Apr. 10, 2019 Ex Parte at 7. These parties contend that ACS should be given more time because it is still in the process of planning its CAF II deployment and has not identified or reported the specific customer locations that it intends to serve. See id. ITTA, USTelecom, and WISPA also argue that additional time also is necessary for ACS to identify one or more suitable points at which traffic can be aggregated for transport to the continental U.S. See id. 91. Because we are instituting a pre-testing period and delaying the start of the required testing period for CAF Phase II model-based support recipients until July 1, 2020, we anticipate that ACS will have had ample time to finalize deployment plans and identify a suitable aggregation point or points. Thus, we are unconvinced by the argument advanced by ITTA, USTelecom, and WISPA that these issues warrant further delay for ACS. Moreover, we note that ACS already has passed its first deployment milestone and certified to locations in the HUBB. Thus, ACS should be fully prepared to commence testing on the same schedule as other CAF Phase II support recipients. K. Requirements for Certain Alaska Plan Carriers 92. NTCA requests clarification that the Order applies only to high-cost recipients with mandatory build-out obligations. NTCA AFR at 20-22. Though some Alaskan rate-of-return carriers are subject to defined build-out obligations, NTCA observes that if a carrier has “no mandated build-out obligation, there is neither a clear speed threshold to which a carrier can be required to test nor a specified number of locations at which the test can be conducted.” Id. at 21. NTCA argues that additional proper notice-and-comment rulemaking procedures would be needed to subject carriers without mandatory build-out obligations to any required performance measures. Id. at 21-22. 93. Absent any specific deployment requirements, the Commission lacks a standard for determining whether a carrier’s deployment meets the required performance measures. As a result, consistent with NTCA’s request, we clarify that only carriers subject to defined build-out requirements are required to test the speed and latency of their networks in accord with Commission rules. Previously, carriers receiving CAF-BLS with 80% or greater deployment of 10/1 Mbps broadband service in their entire study areas did not have specific build-out obligations as a condition of receiving CAF-BLS support. Connect America Fund; ETC Annual Reports and Certifications; Developing a Unified Intercarrier Compensation Regime, WC Docket No. 10-90 et al., Report and Order, Order on Reconsideration, and Further Notice of Proposed Rulemaking, 31 FCC Rcd 3087, 3152, para. 173 (2016). To the extent it was unclear whether these recipients of CAF-BLS were required to conduct required performance measures testing, this question is now moot because those carriers now also have mandatory deployment obligations. See December 2018 Rate-of-Return Order, 33 FCC Rcd at 11927, paras. 111-112. Alaskan rate-of-return carriers that have committed to maintaining existing service levels therefore are not subject to the performance measures adopted by the Bureaus and modified herein. 94. Alaskan rate-of-return carriers that have committed to defined build-out obligations, however, must conduct speed and latency testing of their networks. That said, we recognize that many of these carriers lack the ability to obtain terrestrial backhaul such as fiber, microwave, We expect locations with microwave backhaul to be able to satisfy the speed and latency requirements and thus subject them to the same testing standards as other locations. or other technologies and instead must rely exclusively on satellite backhaul. Consistent with the standards we adopted for high-latency service providers in the CAF Phase II auction, we require Alaska Plan carriers using satellite or satellite backhaul to certify that 95% or more of all testing hour measurements of network round trip latency are at or below 750 ms for any locations using satellite technology. See CAF Phase II Auction Order, 31 FCC Rcd at 5960-61, para. 30. Alaska Plan carriers are not required to meet the second part of the “two-part standard” for high-latency providers, i.e., demonstrating a required mean opinion score, which applies only to CAF Phase II auction winners and New York CAF winners. See id. We also reaffirm that these carriers must certify annually that no terrestrial backhaul options exist, and that they are unable to satisfy the standard performance measures due to the limited functionality of the available satellite backhaul facilities. USF/ICC Transformation Order, 26 FCC Rcd at 17699-17700, para. 101. See also 47 CFR § 54.313(g). To the extent that new terrestrial backhaul facilities are constructed, or existing facilities improve sufficiently to meet the public interest obligations, we have required funding recipients to meet the standard performance measures within twelve months of the new backhaul facilities becoming commercially available. USF/ICC Transformation Order, 26 FCC Rcd at 17699-17700, para. 101. IV. PROCEDURAL MATTERS 95. Paperwork Reduction Act Analysis. This document contains new information collection requirements subject to the Paperwork Reduction Act of 1995 (PRA), Public Law 104-13. It will be submitted to the Office of Management and Budget (OMB) for review under Section 3507(d) of the PRA. OMB, the general public, and other Federal agencies will be invited to comment on the new or modified information collection requirements contained in this proceeding. In addition, we note that pursuant to the Small Business Paperwork Relief Act of 2002, Public Law 107-198, see 44 U.S.C. 3506(c)(4), we previously sought specific comment on how the Commission might further reduce the information collection burden for small business concerns with fewer than 25 employees. 96. Congressional Review Act. The Commission has determined, and the Administrator of the Office of Information and Regulatory Affairs, Office of Management and Budget, concurs that these rules are non-major under the Congressional Review Act, 5 U.S.C. § 804(2). The Commission will send a copy of this Order on Reconsideration to Congress and the Government Accountability Office pursuant to 5 U.S.C. § 801(a)(1)(A). See 5 U.S.C. § 801(a)(1)(A). 97. Supplemental Final Regulatory Flexibility Analysis. The Supplemental Final Regulatory Flexibility Analysis, pursuant to the Regulatory Flexibility Act, See 5 U.S.C. § 604. is contained in Appendix C. V. ORDERING CLAUSES 98. Accordingly, IT IS ORDERED that, pursuant to the authority contained in sections 1-4, 5, 201-206, 214, 218-220, 251, 252, 254, 256, 303(r), 332, 403, and 405 of the Communications Act of 1934, as amended, and section 706 of the Telecommunications Act of 1996, 47 U.S.C. §§ 151-155, 201-206, 214, 218-220, 251, 256, 254, 256, 303(r), 403 and 405, this Order on Reconsideration IS ADOPTED, effective thirty (30) days after publication of the text or summary thereof in the Federal Register, except to the extent any rules and requirements therein contain new or modified information collection requirements subject to Paperwork Reduction Act review. Those rules and requirements containing such new or modified information collection requirements shall become effective immediately upon announcement in the Federal Register of OMB approval. It is our intention in adopting these rules that if any of the rules that we retain, modify, or adopt herein, or the application thereof to any person or circumstance, are held to be unlawful, the remaining portions of the rules not deemed unlawful, and the application of such rules to other persons or circumstances, shall remain in effect to the fullest extent permitted by law. 99. IT IS FURTHER ORDERED that, pursuant to the authority contained in section 405 of the Communications Act of 1934, as amended, 47 U.S.C. § 405, and sections 0.331 and 1.429 of the Commission’s rules, 47 CFR § 0.331 and 47 CFR § 1.429, the Petition for Reconsideration and Clarification filed by USTELECOM – THE BROADBAND ASSOCIATION, ITTA – THE VOICE OF AMERICA’S BROADBAND PROVIDERS, and the WIRELESS INTERNET SERVICE PROVIDERS ASSOCIATION on September 19, 2018 IS GRANTED IN PART and DENIED IN PART to the extent described herein, and the Petition for Partial Reconsideration filed by MICRONESIAN TELECOMMUNICATIONS CORPORATION on September 19, 2018 IS DENIED. 100. IT IS FURTHER ORDERED that, pursuant to the authority contained in section 5(c)(5) of the Communications Act of 1934, as amended, 47 U.S.C. § 155(c)(5), and section 1.115(g) of the Commission’s rules, 47 CFR § 1.115(g), the Application for Review and Request for Clarification filed by NTCA – THE RURAL BROADBAND ASSOCIATION on September 19, 2018 and the Application for Review filed by WTA – ADVOCATES FOR BROADBAND on September 19, 2018, ARE GRANTED IN PART and DENIED IN PART to the extent described herein. 47 U.S.C. § 155(c)(5); 47 CFR § 1.115(g). 101. IT IS FURTHER ORDERED that the Commission’s Consumer and Governmental Affairs Bureau, Reference Information Center, SHALL SEND a copy of this Order on Reconsideration, including a Supplemental Final Regulatory Flexibility Analysis, to the Chief Counsel for Advocacy of the Small Business Administration. 102. IT IS FURTHER ORDERED that the Commission SHALL SEND a copy of this Order on Reconsideration to Congress and the Government Accountability Office pursuant to the Congressional Review Act, see 5 U.S.C. § 801(a)(1)(A). FEDERAL COMMUNICATIONS COMMISSION Marlene H. Dortch Secretary APPENDIX A Final Rules For the reasons discussed in the preamble, the Federal Communications Commission amends 47 CFR part 54 as follows: 1. Amend § 54.320(d) by revising paragraphs (1) and (2) to read as follows: §54.320  Compliance and recordkeeping for the high-cost program. * * * * * (d) * * * (1) * * * * * * * * (ii) Tier 2. If an eligible telecommunications carrier has a compliance gap of at least 15 percent but less than 25 percent of the number of locations that the eligible telecommunications carrier is required to have built out to or, in the case of Alaska Plan mobile-carrier participants, population covered by the specified technology, middle mile, and speed of service in the carrier’s approved performance plan, by the interim milestone, USAC will withhold 15 percent of the eligible telecommunications carrier’s monthly support for that support area and the eligible telecommunications carrier will be required to file quarterly reports. Once the eligible telecommunications carrier has reported that it has reduced the compliance gap to less than 15 percent of the required number of locations (or population, if applicable) for that interim milestone for that support area, the Wireline Competition Bureau or Wireless Telecommunications Bureau will issue a letter to that effect, USAC will stop withholding support, and the eligible telecommunications carrier will receive all of the support that had been withheld. The eligible telecommunications carrier will then move to Tier 1 status. (iii) Tier 3. If an eligible telecommunications carrier has a compliance gap of at least 25 percent but less than 50 percent of the number of locations that the eligible telecommunications carrier is required to have built out to by the interim milestone, or, in the case of Alaska Plan mobile-carrier participants, population covered by the specified technology, middle mile, and speed of service in the carrier’s approved performance plan, USAC will withhold 25 percent of the eligible telecommunications carrier’s monthly support for that support area and the eligible telecommunications carrier will be required to file quarterly reports. Once the eligible telecommunications carrier has reported that it has reduced the compliance gap to less than 25 percent of the required number of locations (or population, if applicable) for that interim milestone for that support area, the Wireline Competition Bureau or Wireless Telecommunications Bureau will issue a letter to that effect, the eligible telecommunications carrier will move to Tier 2 status. (iv) * * * (A) USAC will withhold 50 percent of the eligible telecommunications carrier’s monthly support for that support area, and the eligible telecommunications carrier will be required to file quarterly reports. * * * * * * * * (2)  Final milestone. Upon notification that the eligible telecommunications carrier has not met a final milestone, the eligible telecommunications carrier will have twelve months from the date of the final milestone deadline to come into full compliance with this milestone. If the eligible telecommunications carrier does not report that it has come into full compliance with this milestone within twelve months, the Wireline Competition Bureau - or Wireless Telecommunications Bureau in the case of mobile carrier participants - will issue a letter to this effect. In the case of Alaska Plan mobile carrier participants, USAC will then recover the percentage of support that is equal to 1.89 times the average amount of support per location received by that carrier over the support term for the relevant percentage of population. For other recipients of high-cost support, USAC will then recover the percentage of support that is equal to 1.89 times the average amount of support per location received in the support area for that carrier over the term of support for the relevant number of locations plus 10 percent of the eligible telecommunications carrier’s total relevant high-cost support over the support term for that support area. Where a recipient is unable to demonstrate compliance with a final performance testing milestone, USAC will recover the percentage of support that is equal to 1.89 times the average amount of support per location received in the support area for that carrier plus 10 percent of the eligible telecommunications carrier’s total relevant high cost-support over the support term for that support area multiplied by the percentage of time since the carrier was last able to demonstrate compliance based on performance testing, on a quarterly basis. In the event that a recipient fails to meet a final milestone both for failure to build facilities and failure of performance, USAC will recover the total of (1) the percentage of support that is equal to 1.89 times the average amount of support per location received by that carrier over the support term for the relevant percentage of population for those locations to which the carrier failed to build facilities; (2) the percentage of support that is equal to 1.89 times the average amount of support per location received in the support area for that carrier multiplied by the percentage of time since the carrier was last able to demonstrate compliance based on performance testing; and (3) 10 percent of the eligible telecommunications carrier’s total relevant high-cost support over the support term for that support area. APPENDIX B Qualifying Internet Autonomous Systems AS Rank AS Number Organization Flag Transit ASN Degree Peer w/ United States Top USA 100 Peer w/ Top United States 300 Class 1 3356 Level 3 Parent, LLC USA 5,177 many many Transit/Access 3 174 Cogent Communications USA 5,761 many many Transit/Access 4 2914 NTT America, Inc. USA 1,769 many many Transit/Access 7 6939 Hurricane Electric LLC USA 7,566 many many Transit/Access 8 6453 TATA COMMUNICATIONS (AMERICA) INC USA 740 many many Transit/Access 9 3491 PCCW Global, Inc. USA 640 many many Transit/Access 10 6461 Zayo Bandwidth USA 1,896 many many Transit/Access 12 3549 Level 3 Parent, LLC USA 2,390 many many Transit/Access 16 209 CenturyLink Communications, LLC USA 1,862 many many Transit/Access 21 701 MCI Communications Services, Inc. d/b/a Verizon Business USA 1,310 many many Transit/Access 22 7018 AT&T Services, Inc. USA 2,289 many many Transit/Access 23 7922 Comcast Cable Communications, LLC USA 234 many many Transit/Access 27 2828 MCI Communications Services, Inc. d/b/a Verizon Business USA 902 many many Transit/Access 36 1239 Sprint USA 349 many many Transit/Access 62 11164 Internet2 USA 132 7 14 Transit/Access 68 7029 Windstream Communications LLC USA 466 2 4 Transit/Access 71 22773 Cox Communications Inc. USA 536 4 6 Transit/Access 81 577 Bell Canada CANADA 319 many 101 Transit/Access 87 6128 Cablevision Systems Corp. USA 322 4 6 Transit/Access 90 20115 Charter Communications USA 413 5 7 Transit/Access 122 11404 vanoppen.biz LLC USA 322 3 5 Transit/Access 123 6327 Shaw Communications Inc. CANADA 233 6 11 Transit/Access 140 702 MCI Communications Services, Inc. d/b/a Verizon Business USA 274 7 7 Transit/Access 169 7385 Integra Telecom, Inc. USA 216 2 3 Transit/Access 175 852 TELUS Communications Inc. CANADA 234 12 23 Transit/Access 179 5650 Frontier Communications of America, Inc. USA 210 9 15 Transit/Access 205 812 Rogers Communications Canada Inc. CANADA 175 6 8 Transit/Access 224 13768 Cogeco Peer 1 CANADA 241 1 5 Transit/Access 276 2381 WiscNet USA 216 4 8 Transit/Access 290 29791 Internap Corporation USA 180 4 7 Transit/Access 308 26554 US Signal Company, L.L.C. USA 115 1 2 Transit/Access 311 703 MCI Communications Services, Inc. d/b/a Verizon Business USA 107 4 5 Transit/Access 319 2686 AT&T Global Network Services, LLC USA 181 7 7 Transit/Access 400 13760 Southern Light, LLC USA 244 2 11 Transit/Access 404 6079 RCN USA 179 6 15 Transit/Access 438 19151 WV FIBER USA 347 8 20 Transit/Access 474 7342 VeriSign Infrastructure & Operations USA 219 4 8 Transit/Access 520 4181 TDS TELECOM USA 100 7 18 Transit/Access 613 29838 Atlantic Metro Communications, LLC USA 154 1 2 Transit/Access 643 11096 FloridaNet USA 102 2 4 Transit/Access 930 293 ESnet USA 112 9 18 Transit/Access 1566 14537 Continent 8 LLC USA 132 1 5 Transit/Access 1725 23473 PAVLOV MEDIA INC USA 172 1 6 Transit/Access 2228 40805 JMF Solutions, Inc USA 636 1 2 Transit/Access 41 Federal Communications Commission FCC 19-104 APPENDIX C Supplemental Final Regulatory Flexibility Analysis 1. As required by the Regulatory Flexibility Act of 1980 (RFA), 5 U.S.C. § 603. The RFA, 5 U.S.C. §§ 601-612 has been amended by the Contract With America Advancement Act of 1996, Public Law No. 104-121, 110 Stat. 847 (1996) (CWAAA). Title II of the CWAAA is the Small Business Regulatory Enforcement Fairness Act of 1996 (SBREFA). as amended, an Initial Regulatory Flexibility Analysis (IRFA) was incorporated in the USF/ICC Transformation FNPRM. See USF/ICC Transformation Order, 26 FCC Rcd at 18364-95. The Commission sought written public comment on the proposals in the USF/ICC Transformation FNPRM, including comment on the IRFA. The Wireline Competition Bureau, Wireless Telecommunications Bureau, and Office of Engineering and Technology (the Bureaus) included a Final Regulatory Flexibility Analysis (FRFA) in connection with the Performance Measures Order. Performance Measures Order, 33 FCC Rcd at 3538-46, Appx. B, Final Regulatory Flexibility Analysis. This Supplemental Final Regulatory Flexibility Analysis (Supplemental FRFA) supplements the FRFA in the Performance Measures Order to reflect the actions taken in the Order on Reconsideration and conforms to the RFA. See 5 U.S.C. § 604. A. Need for, and Objective of, the Order 2. The Order on Reconsideration addresses issues raised by parties in petitions for reconsideration and applications for review of the Performance Measures Order. See Performance Measures Order. In the Performance Measures Order, the Bureaus established how recipients of Connect America Fund (CAF) support must test their broadband networks for compliance with speed and latency metrics and certify and report those results. See generally id. In doing so, the Bureaus adopted a flexible framework to minimize the burden on small entities—for example, by permitting carriers to choose from one of three methodologies to conduct the required testing. 3. The Order on Reconsideration affirms certain key components of the Performance Measures Order while making several modifications to the requirements. Specifically, in the Order, we maintain the choice between three testing methodologies for carriers to conduct required testing; tie the implementation of speed and latency testing to a carrier’s deployment obligations for the specific high-cost program under which it receives support; adopt a pre-testing regime to give both carriers and the Commission the opportunity to ensure that carriers are familiar with the testing regime and minimize any administrative issues; maintain the previously-adopted testing sample sizes but clarify that carriers must use the same locations for testing both latency and speed; adopt a revised definition of FCC-designated Internet Exchange Point (IXP); confirm that end-points for testing are from the customer’s side of any network being used to an FCC-designated IXP; maintain the existing daily testing time period and quarterly testing requirement; allow further flexibility for the timing of speed tests but maintain the same frequency of latency testing; and reaffirm the compliance standards and associated support reductions for non-compliance. B. Summary of Significant Issues Raised by Public Comments in Response to the IRFA 4. There were no comments raised that specifically addressed how broadband service should be measured, as presented in the USF/ICC Transformation FNPRM IRFA. See USF/ICC Transformation FNPRM, 26 FCC Rcd at 18364, para. 3. Nonetheless, the Commission has considered the potential impact of the rules proposed in the IRFA on small entities and reduced the compliance burden for all small entities in order to reduce the economic impact of the rules enacted herein on such entities. C. Response to Comments by the Chief Counsel for Advocacy of the Small Business Administration 5. Pursuant to the Small Business Jobs Act of 2010, 5 U.S.C. § 604(a)(3). which amended the RFA, the Commission is required to respond to any comments filed by the Chief Counsel of the Small Business Administration (SBA), and to provide a detailed statement of any change made to the proposed rule(s) as a result of those comments. 6. The Chief Counsel did not file any comments in response to the proposed rule(s) in this proceeding. D. Description and Estimate of the Number of Small Entities to Which the Rules Would Apply 7. The RFA directs agencies to provide a description of, and where feasible, an estimate of the number of small entities that may be affected by the proposed rules, if adopted. See 5 U.S.C. § 603(b)(3). The RFA generally defines the term “small entity” as having the same meaning as the terms “small business,” “small organization,” and “small governmental jurisdiction.” See 5 U.S.C. § 601(6). In addition, the term “small business” has the same meaning as the term “small-business concern” under the Small Business Act. See 5 U.S.C. § 601(3) (incorporating by reference the definition of “small-business concern” in the Small Business Act, 15 U.S.C. § 632). Pursuant to 5 U.S.C. § 601(3), the statutory definition of a small business applies “unless an agency, after consultation with the Office of Advocacy of the Small Business Administration and after opportunity for public comment, establishes one or more definitions of such term which are appropriate to the activities of the agency and publishes such definition(s) in the Federal Register.” A small-business concern” is one which: (1) is independently owned and operated; (2) is not dominant in its field of operation; and (3) satisfies any additional criteria established by the Small Business Administration (SBA). See 15 U.S.C. § 632. 8. As noted above, the Performance Measures Order included a FRFA. In that analysis, the Bureaus described in detail the small entities that might be significantly affected. Accordingly, in this Supplemental FRFA, we hereby incorporate by reference the descriptions and estimates of the number of small entities from the previous FRFA in the Performance Measures Order. See Performance Measures Order, 33 FCC Rcd at 6539-44, Appx. B, paras. 8-24. E. Description of Projected Reporting, Recordkeeping, and Other Compliance Requirements for Small Entities 9. We expect the amended requirements in the Order on Reconsideration will not impose any new or additional reporting or recordkeeping or other compliance obligations on small entities and, as described below, will reduce their costs. F. Steps Taken to Minimize the Significant Economic Impact on Small Entities, and Significant Alternatives Considered 10. The RFA requires an agency to describe any significant alternatives that it has considered in reaching its proposed approach, which may include (among others) the following four alternatives: (1) the establishment of differing compliance or reporting requirements or timetables that take into account the resources available to small entities; (2) the clarification, consolidation, or simplification of compliance or reporting requirements under the rule for small entities; (3) the use of performance, rather than design, standards; and (4) an exemption from coverage of the rule, or any part thereof, for small entities. 5 U.S.C. § 603(c). 11. The Commission has taken further steps which will minimize the economic impact on small entities. In the Order on Reconsideration, we adopt a delayed schedule providing for a period of “pre-testing” for all carriers and later start dates for carriers that do not receive CAF Phase II model-based support. Thus, CAF Phase II model-based support recipients, which include only large carriers, must begin pre-testing and testing in 2020, whereas legacy rate-of-return carriers, many of which are smaller entities, must begin pre-testing in 2022 and testing in 2023, and small carriers receiving A-CAM I model support do not begin pre-testing until 2021 and testing in 2022. Pre-testing will give carriers time to correct any issues with their networks or with their testing infrastructure without being subject to support reductions, and the delayed schedule for non-CAF Phase II carriers will permit smaller entities even more time to prepare to meet our testing requirements. 12. We also now permit greater flexibility for carriers to conduct speed tests within an hour. In the Order on Reconsideration, we clarify that carriers may not necessarily start testing speed at the very beginning of each test hour. Instead, a carrier must simply report a successful speed test for each hour, except a carrier that begins attempting a speed test within the first 15 minutes of an hour and checks for cross-talk in one-minute intervals (using the cross-talk thresholds of 64 Kbps for download and 32 Kbps for upload) may record that no test was successful during that test hour. 13. Finally, we clarify that carriers may use the same subscriber locations for testing both speed and latency, halving the potential burdens for carriers that may have otherwise believed it necessary to test separate subscriber locations for speed and latency. This clarification is most significant for the smallest carriers, which may use less automated means of testing than larger carriers. 14. Report to Congress: The Commission will send a copy of the Order, including this FRFA, in a report to be sent to Congress and the Government Accountability Office pursuant to the Small Business Regulatory Enforcement Fairness Act of 1996. 5 U.S.C. § 801(a)(1)(A). In addition, the Commission will send a copy of the Order, including the FRFA, to the Chief Counsel for Advocacy of the Small Business Administration. A copy of the Order and FRFA (or summaries thereof) will also be published in the Federal Register. See id. § 604(b). 44 Federal Communications Commission "FCC XX-XXX" STATEMENT OF CHAIRMAN AJIT PAI Re: Connect America Fund, WC Docket No. 10-90. To help achieve the Commission’s number one priority, closing the digital divide, we provide billions of dollars each year in high-cost universal service support to carriers to deploy modern, high-speed broadband networks to unserved Americans living in rural areas. These networks are essential to bringing digital opportunity to every American. I’ve seen firsthand the impact of these universal service-funded rural networks across the country. Just a few weeks ago, in Mandan, North Dakota, I met with a consumer getting a fiber broadband connection for the first time. I saw his sense of wonder as we talked about how he and his family would use this new outlet to the world. And I met two of his neighbors who had recently been connected: One was a software engineer who could finally work more regularly from home, letting him spend more time with his family, and another had started a successful baby clothing business in her basement because she had that vital broadband connection. But Americans like these only get the full benefits of connectivity if the carriers receiving universal service support follow through—if they actually deliver the speed and responsiveness that they committed to provide. Most such carriers must build out their networks to specific numbers of homes and businesses. But that’s only half of the equation. Deploying a broadband network means providing the network speed and latency that consumers and the Commission expect, to ensure that rural Americans are not relegated to second-class service. President Reagan was fond of the old Russian proverb, “Trust, but verify.” And today’s Order is about the verify part of the equation. Specifically, we must verify that carriers are not only building the infrastructure, but also supplying the service quality required by our rules. The testing methodologies we adopt today are rigorous because we must ensure that both American taxpayers (who contribute to the Universal Service Fund) and rural consumers are getting their money’s worth. But these methodologies are also workable for all carriers. We recognize that carriers of different sizes and technical and financial capabilities have different needs. So today, we decide to closely review the existing testing methodologies and make targeted changes that will provide flexibility and eliminate unnecessary burdens on carriers, while still ensuring that carriers are accountable to consumers, taxpayers, and the Commission. Whether it is a two-way video chat between a doctor and her patient, a student collaborating with classmates in real-time on a school project, or just a family streaming their favorite movies and television, rural Americans must have a broadband connection that will consistently deliver the performance that modern applications require. And today’s Order will help ensure that’s the case. I would like to thank the staff who contributed to this item, including Mikelle Bonan, Cheryl Callahan, Justin Faulb, Ian Forbes, Sue McNeil, Alex Minard, Kris Monteith, Ryan Palmer, Gilbert Smith, Joseph Sorresso, Stephen Wang, and Suzanne Yelen of the Wireline Competition Bureau; Joseph Calascione, Jonathan Campbell, Cha-Chi Fan, Alec MacDonell, Giulia McHenry, and Cathy Zima of the Office of Economics and Analytics; Michael Carlson, William Dever, Tom Johnson, David Konczal, Rick Mallen, and Linda Oliver of the Office of General Counsel; and Martin Doczkat, Padma Krishnaswamy, and Paul Murray of the Office of Engineering and Technology. 2 STATEMENT OF COMMISSIONER MICHAEL O’RIELLY Re: Connect America Fund, WC Docket No. 10-90. The item before us provides some needed flexibility on timing and other testing components while reaffirming the Commission’s overall goal of ensuring sufficient, quality broadband from one of our consumer-paid subsidy programs. It has my support. At the same time, I was dismayed by a certain narrative, put forth not by the Chairman or his good team or people within the agency, but by folks on the outside, that the groups who are seeking some relief and clarification within our broadband testing regime were attempting to harm rural America and subject it to inferior broadband service. I have worked with the petitioner groups for many years, and while Lord knows we have agreed and disagreed on various issues, this type of messaging was unfair and unfounded. These are reputable organizations, representing hardworking companies seeking to bring broadband service to some of the hardest-to-serve regions of our nation. I would urge those casting accusations to take their rhetoric down a notch or two before engaging in further criticism of some of the dedicated people trying to bring digital access to their neighbors, friends, and communities. STATEMENT OF COMMISSIONER BRENDAN CARR Re: Connect America Fund, WC Docket No. 10-90. When Americans spend their hard-earned dollars on fast Internet connections, they expect to get what they’re paying for. And rightly so. They expect the speeds they’ve been promised so their kids can do their homework at night. They expect the quality they’ve been promised so they can run their home businesses. And they expect the responsiveness they’ve been promised so their family can connect on a video chat. Their expectations should be highest when the Internet infrastructure that carriers build to their homes is constructed with universal service dollars. After all, consumers contribute billions of dollars each year to support these infrastructure builds, and carriers have committed to meeting performance benchmarks. Yet for decades, while Commission after Commission has allocated these billions of dollars worth of funds, the agency has never required or held carriers accountable to these types of specific performance requirements. And that’s a shame because, as I visit rural communities in this job, I hear from Americans that often express doubt that they’re getting what they’ve paid for. With this Order today, the full Commission votes for the first time on these uniform performance metrics and requirements. In doing so, we also respect the privacy of homeowners by clarifying that carriers need not put Whitebox monitoring devices inside a customer’s house. From my time on the road, I doubt it would go uniformly well if an official knocked on a door and told the homeowner that the government wanted them to install a device to monitor their Internet usage. So today’s decision now recognizes that carriers can meet their obligations without that type of intrusive measure. I am also glad that we reject requests today that would have limited the performance test to only a portion of the network. I understand that carriers are not directly in control of intermediate networks, but I also know that these carriers have promised the Commission that they would offer a level of service in exchange for universal service funding. I know, perhaps even more importantly, that they’ve promised their customers a certain level of service. So they need to use their funding wisely to ensure that they have sufficient transport to give their customers what they rightly expect. I am proud that this item will help ensure that Americans get what they pay for. And I want to thank the Wireline Competition Bureau for its work on the item. It has my support. STATEMENT OF COMMISSIONER JESSICA ROSENWORCEL Re: Connect America Fund, WC Docket No. 10-90. Providing broadband to the most remote and rural areas of this country is not for the faint of heart. Consumers and businesses can be spread few and far between. Terrain can be rough and the deployment season can be brutally short. The economics are hard and the business case is not always easy. But it is still a fact that we are stronger when we are connected to one another. So over the last decade, the Federal Communications Commission set out the modernize its universal service program to assist with the effort to connect all. The agency has taken steps to support a mix of phone and broadband services in rural communities across the country. As a result, the FCC now commits over four and a half billion dollars a year to broadband deployment efforts in these areas. It is by far and away the largest of the agency’s universal service efforts. That’s why today’s decision is important. Going forward, carriers that accept universal service support to provide broadband will be required to test that their networks actually offer the service they have committed to deliver. This is about accountability. It’s important. The FCC must make sure there are measures in place to demonstrate that universal service funding is being used to extend the reach of high-speed service to all. In other words, we have promises to keep. We also still have work to do. Two years ago, Representative Frank Pallone called attention to the fact that the FCC’s own Inspector General stated that the agency does not have the dedicated resources it needs to police the universal service high cost program. That’s a problem. And just one week ago the Inspector General reminded us that this program does not comply with the Improper Payments Elimination and Recovery Improvement Act. We need to address these problems—stat. We need confidence in this program. We need to ensure it truly delivers. Last week, I joined Senator Joe Manchin in West Virginia. We crisscrossed the Mountain State at the peak of its Fall glory. The towns we stopped in were all small, all proud of their history, and all grappling with their futures. Everyone we met expressed concern about how reliable broadband had not yet reached the homes and businesses in their community. They spoke of the connections they were not able to forge, the economic opportunities that had been lost, and the students who struggled with the homework gap. Their frustration was real. Many of them were aggrieved that this agency’s maps suggest they have service when they know on the ground, at home and at work, they simply do not. The trip was a reminder that we have big broadband challenges in this country. We have work to do. But back to the here and now. This decision is a modest step forward. It brings a new level of accountability to our funding for the high-cost universal service system. It has my support. Federal Communications Commission "FCC XX-XXX" STATEMENT OF COMMISSIONER GEOFFREY STARKS Re: Connect America Fund, WC Docket No. 10-90. As I have said before, affordable broadband is a necessity for all Americans. And the broadband deployment catalyzed by the Connect America Fund is a crucial step towards alleviating internet inequality. It will help empower those living on the outskirts of today’s digital society to share in the benefits of telemedicine and distance learning, among others. But deployment is only half the battle. If the performance of these networks fails to reach even minimum standards of speed and latency, then the people they serve will be unable to fully realize the benefits of connectivity. Today’s Order improves our ability hold carriers to account as they deliver on the promise of broadband. It helps safeguard precious Universal Service dollars. And it offers needed clarity to CAF participants on the requirements they face to prove they have met their obligations. The addition of pre-testing periods to our regime affords both carriers and the Commission time to evaluate how deployments are performing, and I am glad to see that the Commission has mandated public access to the pre-testing data generated by the carriers. I look forward to seeing the numbers. My thanks to the staff of the Wireline Competition Bureau for your commitment to getting all Americans connected to broadband, and for your work on this item. 2