Federal Communications Commission DA 22-241

Before the Federal Communications Commission
Washington, D.C. 20554

In the Matter of Establishing the Digital Opportunity Data Collection, WC Docket No. 19-195

ORDER

Adopted: March 9, 2022 Released: March 9, 2022

By the Chiefs, Wireless Telecommunications Bureau, Office of Economics and Analytics, Office of Engineering and Technology:

TABLE OF CONTENTS

Heading Paragraph #
I. INTRODUCTION 1
II. BACKGROUND 3
III. DISCUSSION 8
  A. Mobile Service Challenge Process 8
    1. Creating a Challenge/Cognizable Challenges 13
    2. Challenge Responses 60
      a. Rebutting Challenges with On-the-Ground Data 64
      b. Rebutting Challenges with Infrastructure Data 71
      c. Other Data 83
  B. Collecting Verification Information from Mobile Providers 86
    1. Area Subject to Verification 87
    2. Sampling Methodology 94
    3. On-the-Ground Test Data 97
    4. Infrastructure Information 102
    5. Transmitter Monitoring Information 113
  C. Collecting Verified Broadband Data from Government Entities and Third Parties 114
  D. Crowdsourced Data 117
    1. Tools to Submit Crowdsourced Data 119
    2. Crowdsourced Data Submitted in the Online Portal 124
    3. When Crowdsourced Filings Reach a “Critical Mass” 127
    4. Public Availability of Crowdsourced Data 136
  E. Other Matters 138
IV. PROCEDURAL MATTERS 147
V. ORDERING CLAUSES 150
APPENDIX A: TECHNICAL APPENDIX
APPENDIX B: FINAL RULES
APPENDIX C: SUPPLEMENTAL FINAL REGULATORY FLEXIBILITY ANALYSIS

I. INTRODUCTION

1. In this Order, the Wireless Telecommunications Bureau (WTB), Office of Economics and Analytics (OEA), and Office of Engineering and Technology (OET) (collectively, the Bureau and Offices) adopt the technical requirements to implement the mobile challenge, verification, and crowdsourcing processes set forth in Appendix A (Technical Appendix) and required by the Broadband DATA Act as part of the FCC’s ongoing Broadband Data Collection (BDC) effort to improve the Commission’s broadband availability data. Broadband Deployment Accuracy and Technological Availability Act, Pub. L. No. 116-130, 134 Stat. 228 (2020) (codified at 47 U.S.C. §§ 641-646) (Broadband DATA Act); see, e.g., 47 U.S.C. §§ 642(a)(1)(B)(i), (iii), (iv), (a)(2), (b)(4), (b)(5), 644(b); see also Comment Sought on Technical Requirements for the Mobile Challenge, Verification, and Crowdsource Processes Required under the Broadband DATA Act, WC Docket No. 19-195, Public Notice, DA 21-853, 2021 WL 3057378 (WTB/OEA/OET July 16, 2021) (BDC Mobile Technical Requirements Public Notice).

2. Specifically, we adopt the proposed processes and methodology set forth in the BDC Mobile Technical Requirements Public Notice for collecting challenge process data and for determining when the threshold to create a cognizable challenge has been met. The Commission referred to a “cognizable challenge” as one requiring a provider response and directed OEA, in consultation with WTB, to establish the methodology for determining this threshold. Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Third Report and Order, 36 FCC Rcd 1126, 1168, para. 105 (2021) (Third Order). Additionally, we adopt detailed processes for mobile providers to respond to challenges, for the Commission to initiate a verification request to a service provider, and for providers to respond to verification requests to confirm broadband coverage in areas they claim have service.
We adopt the parameters and metrics set forth in the Technical Appendix that must be collected both for on-the-ground test data to support challenge submissions, rebuttals to cognizable challenges, and responses to verification requests, and for infrastructure information to support challenge rebuttals and responses to verification requests. We require government entities and third parties to submit verified broadband data using the same data specifications we require of mobile service providers. Finally, we find the Commission’s speed test app to be a reliable and efficient method for entities to use in submitting crowdsourced mobile coverage data to the Commission and describe the methodology staff will use in determining when a “critical mass” of crowdsourced filings suggests that a provider has submitted inaccurate or incomplete data. The measures that we adopt in this Order to implement the mobile challenge, verification, and crowdsourcing processes will enable the Commission, Congress, other federal and state policy makers, Tribal entities, consumers, and other third parties to verify and supplement the data collected by the Commission on the status of mobile broadband availability throughout the United States. II. BACKGROUND 3. Congress and the Commission have taken several steps in recent years to improve the broadband availability data that the Commission collects and publishes on its coverage maps. In March 2020, Congress passed the Broadband DATA Act, which requires the Commission to collect more granular and consistent data from broadband internet access service providers on the availability and quality of broadband service, as well as to establish a challenge process; verify the accuracy and reliability of the broadband coverage data that providers are required to submit; and improve data accuracy through a crowdsourcing process. See, e.g., 47 U.S.C. §§ 642(a)(1)(B)(i), (iii), (iv), (b)(4), (b)(5), 644(b). The Broadband DATA Act also requires the Commission to develop “a process through which it can collect verified data for use in the coverage maps from: (1) [s]tate, local, and Tribal governmental entities that are primarily responsible for mapping or tracking broadband internet access service coverage for a [s]tate, unit of local government, or Indian Tribe, as applicable; (2) third parties . . . ; and (3) other Federal agencies.” 47 U.S.C. § 642(a)(2). These tools are designed to provide a more accurate data collection that is informed by input from consumers, other federal agencies, state and local governments, Tribal entities, providers, and other entities. 4. In its Second Order in this proceeding, the Commission implemented a number of the Broadband DATA Act’s requirements for collecting and reporting broadband data from providers, developed the framework for the BDC, established a process for verifying the broadband data it receives from providers in their BDC filings, and adopted a basic framework for collecting crowdsourced information. Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Second Report and Order and Third Further Notice of Proposed Rulemaking, 35 FCC Rcd 7460 (2020) (Second Order and Third Further Notice).
While the challenge process, crowdsourced data, and other Commission efforts will all serve to validate the data submitted by providers, for purposes of this Order, “verification” or “verification process” refers to the internal process the Commission sought comment on in Section IV.D. of the Third Further Notice and adopted in Section III.E. of the Third Order. Id. at 7503-06, paras. 104-09; Third Order, 36 FCC Rcd at 1146-51, paras. 47-60; see also 47 U.S.C. § 642(b)(4) (instructing the Commission to “verify the accuracy and reliability of the information in accordance with measures established by the Commission”). In the Third Order, the Commission adopted additional requirements for collecting and verifying provider-submitted data and established the challenge process. Third Order, 36 FCC Rcd at 1146-51, 1164-75, paras. 47-60, 97-125. 5. In the Third Order the Commission determined that it should aggregate speed test results received from multiple consumer challenges in the same general area in order to resolve challenges in an efficient manner, mitigate the time and expense involved, and ensure that the mobile coverage maps are reliable and useful. Id. at 1167-68, para. 105. The Commission found that, when the aggregated results reach an appropriate threshold, they will constitute a cognizable challenge requiring a provider response. Id. at 1168, para. 105. The Commission acknowledged that consumers are likely to submit challenges in distinct, localized areas and recognized that providers should not be subject to the undue cost of responding to a large number of challenges in very small areas. Id. at 1167-68, para. 105. The Commission directed OEA, in consultation with WTB, to determine the threshold number of mobile consumer challenges within a specified area that will constitute a cognizable challenge triggering a provider’s obligation to respond. Id. at 1167, para. 105; 47 CFR § 1.7006(e)(2). In connection with that determination, the Commission also directed OEA, in consultation with WTB, to establish: (1) the methodology for determining this threshold; Third Order, 36 FCC Rcd at 1168, para. 105; see 47 CFR § 1.7006(e)(2). The Commission stated that, “[i]n developing this methodology, OEA should consider, inter alia, the number, location, and timing of the tests, variability in test results, and whether the tests were conducted in urban or rural areas.” Third Order, 36 FCC Rcd at 1168, para. 105. and (2) the methodology for determining the boundaries of a geographic area where the threshold for a cognizable challenge has been met. Third Order, 36 FCC Rcd at 1168, para. 106; see 47 CFR § 1.7006(e)(2). 6. Consistent with the approach it adopted for consumer challenges, the Commission stated that it would also aggregate speed test evidence received from multiple government and third-party challengers in the same general area. Third Order, 36 FCC Rcd at 1173, para. 120. The Commission directed OEA, in consultation with WTB, to determine the threshold number of such challenger speed tests within the same general area that constitute a cognizable challenge and require a provider response. Id. at 1168, 1173, paras. 105-06, 120. Similar to the consumer challenges, the Commission directed OEA, in consultation with WTB, to establish the methodology for determining this threshold and the boundaries of an area where the threshold has been met. Id. 
The Commission also delegated to the Bureau and Offices certain responsibilities to refine the verification process, and to establish a way for governments to submit verified mobile coverage data, among other things. Id. at 1150, 1154, paras. 59, 68. 7. In July 2021, the Bureau and Offices released the BDC Mobile Technical Requirements Public Notice seeking comment on proposed technical requirements for the mobile challenge, verification, and crowdsourcing processes required under the Broadband DATA Act. BDC Mobile Technical Requirements Public Notice. This Order uses Westlaw *pagination for the BDC Mobile Technical Requirements Public Notice. The Public Notice included proposals to implement processes delegated to the Bureau and Offices, including establishing: thresholds for a cognizable challenge to mobile wireless broadband availability data; a process for mobile providers to respond to challenges; a process for collecting verified on-the-ground and infrastructure data from mobile providers in response to a verification inquiry from the Commission; a collection of verified broadband data from government and third-party entities; and processes for the Commission to collect and use crowdsourced data. BDC Mobile Technical Requirements Public Notice at *4-6, paras. 8-14, *7-11, paras. 15-25, *11-17, paras. 26-42, *17, paras. 43-45, *18-19, paras. 46-50, *19-22, paras. 51-59. Sixteen parties filed comments and/or replies, including service providers, trade associations, state governments, technology providers, and public interest organizations. Comments were filed by: Competitive Carriers Association (CCA), CTIA, Enablers, Inc. (Enablers), Kimberly J. Lippi (filed on behalf of California Public Utilities Commission (CPUC)), Rural Wireless Association (RWA), Precision Agriculture Connectivity and Accuracy Stakeholder Alliance (PAgCASA), Ookla, T-Mobile, Vermont Department of Public Service (Vermont DPS), and Verizon. Reply comments were filed by: AT&T, Comniscient Technologies, Inc. (Comniscient), CCA, CTIA, Garland T. McCoy (filed on behalf of PAgCASA), Next Century Cities, Ookla, Opensignal, Inc., New America Open Technology Institute and Public Knowledge (Public Knowledge/New America), RWA, T-Mobile, and Vermont DPS. III. DISCUSSION A. Mobile Service Challenge Process 8. In this Order, the Bureau and Offices adopt the proposals for the mobile challenge process set forth in the BDC Mobile Technical Requirements Public Notice, with certain modifications described below. 9. The Broadband DATA Act requires that the Commission “establish a user-friendly challenge process through which consumers, [s]tate, local, and Tribal governmental entities, and other entities or individuals may submit coverage data to the Commission to challenge the accuracy of – (i) the coverage maps; (ii) any information submitted by a provider regarding the availability of broadband internet access service; or (iii) the information included in the [Broadband Serviceable Location] Fabric.” 47 U.S.C. § 642(b)(5)(A); see id. § 642(a)(1)(B)(iii). “Fabric” is defined as the “Broadband Serviceable Location Fabric” established under section 642(b)(1)(B). 47 U.S.C. § 641(6). The general requirements and framework for the mobile challenge process predate the BDC Mobile Technical Requirements Public Notice, and were set forth in either the Broadband DATA Act or prior Commission orders. 
We note that, to the extent commenters ask the Bureau and Offices to eliminate, modify, or otherwise revisit particular requirements established in either the Broadband DATA Act or prior Commission-level orders, we lack the legal authority to do so. In the Second Order and Third Further Notice, the Commission proposed a challenge process that “encourages participation to maximize the accuracy of the maps, while also accounting for the variable nature of wireless service.” Second Order and Third Further Notice, 35 FCC Rcd at 7515, para. 141. In the Third Order, the Commission adopted its proposals from the Second Order and Third Further Notice, and established a framework for consumers, state, local, and Tribal governments, and other entities to submit data to challenge the mobile broadband coverage maps. See Third Order, 36 FCC Rcd at 1164-68, 1171-73, paras. 98-106, 113-20. 10. The Commission determined that it should enable stakeholders to challenge mobile coverage data based on both a lack of service and poor service quality (such as slow delivered user speeds). Id. at 1164-65, 1171, paras. 98, 113. Challenges must be based upon on-the-ground speed test data taken outdoors (i.e., from an in-vehicle mobile or outdoor stationary environment). Id. at 1164, 1165, 1166, 1171, 1172, paras. 99, 102, 116, 118. The Commission adopted a requirement that consumers use a speed test application (either developed by the FCC or a third-party app approved by OET for use in the challenge process) that automatically collects information and metrics associated with each speed test and allows for submission of information directly to the Commission from a mobile device. Id. at 1166-67, paras. 103-04. Consumers will be required to submit certain identifying information to deter frivolous filings. Id. at 1166, para. 101. Government and other third-party entity challengers (including competing mobile service providers) may use their own software or hardware to collect data for the challenge process so long as the data contain metrics that are substantially the same as those collected by approved speed test applications. Id. at 1172, 1173, paras. 117, 119. Moreover, government and other entity challengers are required to conduct on-the-ground tests using a device advertised by the challenged provider as compatible with its network. Id. at 1172, para. 118. 11. The Commission adopted a requirement for providers to either submit a rebuttal to the challenge or concede the challenge within 60 days of being notified of the challenge. Id. at 1168, 1173, paras. 107, 121. Rebuttals must consist of either on-the-ground test data or infrastructure data. Id. at 1168-69, 1173, paras. 108, 121. A challenge respondent may also submit supplemental data in support of its rebuttal, either voluntarily or in response to a request for additional information from OEA. Id. at 1168-70, 1173-74, paras. 108-10, 121-22. The Commission directed OEA to develop a methodology and mechanism to determine if the data submitted by a provider constitute a successful rebuttal to all or some of the challenged service area and to establish procedures to notify challengers and providers of the results of a challenge. Id. at 1170, para. 111. Further, the Commission adopted a requirement that providers that concede or lose a challenge file new coverage data within 30 days depicting the challenged area that has been shown to lack service. Id. at 1170, 1174, paras. 112, 124. 12. 
The requirements that we adopt in this Order will enable the Commission to collect sufficient measurements to ensure that the challenge process is statistically valid while remaining “user-friendly.” 47 U.S.C. § 642(b)(5)(A). In particular, we establish a methodology for determining the threshold number of mobile speed tests within a specified area that will constitute a cognizable challenge, as well as the geographic boundaries of the challenged area. Under this methodology, a challenge is created by associating the locations of validated speed tests with geographic hexagons defined by the accessible, open-source H3 geospatial indexing system and then analyzing the speed tests that fall within each hexagon. Isaac Brodsky, H3: Uber’s Hexagonal Hierarchical Spatial Index (June 27, 2018), https://eng.uber.com/h3/. We also adopt the parameters and metrics that speed tests must meet to be validated and used to meet the challenge thresholds. Importantly, as the Commission specified in the Third Order, the challenge process will remain user-friendly because all of the information a consumer needs to create a challenge will be collected and submitted by the FCC Speed Test app and any third-party mobile speed test apps approved by OET. Governmental and other entity challengers may use these apps or their own software or hardware to collect data for the challenge process. See Third Order, 36 FCC Rcd at 1172, para. 117. Additionally, we implement the Commission’s decision to aggregate speed tests to resolve challenges “in an efficient manner, mitigate the time and expense involved, and ensure that the mobile coverage maps are as reliable and useful as possible” by adopting our proposal to combine speed tests conducted by consumers, governmental agencies, and other entities to determine whether the thresholds for a cognizable challenge have been met. Id. at 1168, para. 105. These requirements strike the appropriate balance between ensuring that consumers, state, local, and Tribal governments, and other entities can participate in the challenge process, on the one hand, and protecting providers from being burdened by having to respond to challenges that do not meet the cognizable challenge standard, on the other hand. 1. Creating a Challenge/Cognizable Challenges 13. On-the-Ground Speed Test Data Parameters and Metrics. Challenges must be supported by on-the-ground test data. We have therefore established the required testing parameters and data metrics for speed test submissions. At the outset, we will require the FCC Speed Test app and approved third-party apps to collect the name and email address of the end user and mobile phone number of the device on which the speed test was conducted, to the extent technically feasible. As discussed in further detail below, iOS devices will not automatically transmit the mobile phone number associated with the device that runs a speed test. We will therefore require testers submitting tests for use in the challenge process to manually submit, through the speed test app, the phone number associated with the device on which the speed test was conducted. The Commission’s rules state that consumer challengers must include “name and contact information (e.g., address, phone number, and/or email address)” in their data submissions. 47 CFR § 1.7006(e)(1)(i).
We amend these rules to require that app users also submit their email address so that the Commission can notify testers of the status of their speed test(s) and any resulting challenge(s), and we also amend the rules to require app users to submit the mobile phone number of the device on which the speed test was conducted so that we may, if necessary, share this information with mobile broadband providers for use when responding to challenges. We anticipate we will only share the phone number of the device on which the speed test was conducted with mobile broadband providers in situations where a challenged provider is unable to identify a subscriber by using the timestamp that test measurement data were transmitted to the app developer’s servers, as well as the source IP address and port of the device, as measured by the server, which we will also require to be included in challenge data submitted by the app, as discussed below. See infra para. 18. We will not collect the address of an end user for use in the mobile challenge process at this time in order to minimize the amount of personally identifiable information we require from end users, and because a mobile user’s physical address is not currently helpful either to the Bureau and Offices when considering challenges or to providers when responding to challenges. In addition to the testing metrics adopted by the Commission in the Third Order, we adopt the testing parameters and updated metrics for challenge speed test data proposed in the BDC Mobile Technical Requirements Public Notice, with the modifications described below. See Third Order, 36 FCC Rcd at 1166-67, 1172, paras. 103, 117; see also 47 CFR § 1.7006(e)(1)(i)-(v); BDC Mobile Technical Requirements Public Notice at *6, para. 14. The Commission’s rules state that consumer challengers must include in their data submissions “name and contact information (e.g., address, phone number, and/or email address).” 47 CFR § 1.7006(e)(1)(i). We will require the FCC Speed Test app and approved third-party apps to collect the consumer’s name, email address, and phone number of the device on which the speed test was conducted to the extent technically feasible. With the exception of different considerations pertaining to the submission of speed test data taken on iOS devices and the submission of IP address, source port, and timestamp measured by an app developer’s servers by government entities and service providers in some scenarios,
these parameters and metrics will apply across all testing mechanisms, not only in the challenge process but also for on-the-ground data submitted in response to verification inquiries. The information we will use in the challenge process that can be collected from Android devices, but not iOS devices, includes the signal strength, signal quality, unique identifier, and other RF metrics of each serving cell, as well as the spectrum bands used for the test and other network characteristics (e.g., whether the device was roaming, as well as the identity of the provider for the connected network). As discussed in greater detail below, we will allow government and other third-party entities to alternatively submit the International Mobile Equipment Identity (IMEI) of the device used to conduct a speed test for use in the challenge process rather than provide the source IP address, source port, and timestamp measured by an app developer’s servers. We will also not require a service provider to submit either the device IMEI or the combination of source IP address, source port, and timestamp measured by an app developer’s servers when submitting speed tests either in response to a challenge or in response to a verification inquiry. See infra para. 18; Section III.B.3, Collecting Verification Information from Mobile Providers, On-the-Ground Test Data. Individual consumer challengers must collect on-the-ground speed test data using mobile devices running either a Commission-developed app (e.g., the FCC Speed Test app) or another speed test app approved by OET for the submission of challenges. BDC Mobile Technical Requirements Public Notice at *6, para. 14. The Bureau and Offices will announce the process and procedures for third-party app providers to seek approval for a speed test app to be used in submitting data for use in the challenge process. Third-party and governmental entities may, as specified in the Third Order, collect data using either one of these speed test apps or their own software and hardware that collects broadband availability data, consistent with the parameters and metrics set forth herein. See Third Order, 36 FCC Rcd at 1172, para. 117; see also id. (noting that the metrics the Commission adopted for government and other entity challenge data “are substantially the same as the metrics [the Commission] require[s] approved speed test applications to collect for consumer challenges.”). We include “hardware” to capture professional tools, such as laptops, hard drives, or other hardware devices, used to collect on-the-ground data. The Third Order provided that government and other entity challengers submit a complete description of the methodologies used to collect the data. Id.; 47 CFR § 1.7006(f)(1)(ii). The Bureau and Offices will issue a public notice announcing the process and procedures for such parties to submit the necessary documentation. 14. In the Third Order, the Commission required consumer challengers to use a speed test app approved by OET for use in the challenge process and provided the metrics that approved apps must collect for each speed test. Third Order, 36 FCC Rcd at 1166-67, paras. 103-04. The Commission directed OET, in consultation with OEA and WTB, to update the FCC Speed Test app as necessary or develop a new speed test app to collect the designated metrics, so that challengers may use it in the challenge process. Id. at 1167, para. 104. For government and third-party entity challengers, the Commission did not require the use of a Commission-approved speed test app but instead set forth the information that all submitted government and third-party challenger speed test data must contain and directed OEA, WTB, and OET to adopt additional testing requirements if they determine it is necessary to do so. Id. at 1172, paras. 117-18. Our BDC Mobile Technical Requirements Public Notice proposed certain testing parameters and metrics to standardize the on-the-ground test data submitted in the challenge process and to assure more reliable challenges, BDC Mobile Technical Requirements Public Notice at *6, para. 14, and a number of parties agree that such consistency among the apps used for challenges and rebuttals is important. See, e.g., Ookla Comments at 1 (calling the collection of on-the-ground data a “positive step”).
This set of standardized parameters and metrics will also ensure that we can make a meaningful comparison of tests run by different entities using different methods (e.g., tests run on a speed test app versus a government’s own hardware and software), and will enable us to easily combine and evaluate speed test data used in the challenge process. Accordingly, we will require that such data meet the following testing parameters set forth in the BDC Mobile Technical Requirements Public Notice: (1) a minimum test length of 5 seconds and a maximum test length of 30 seconds; To avoid requiring excessive data usage for tests on particularly fast networks (e.g., 5G-NR using high-band spectrum), we will relax the minimum test duration requirement once a download or upload test measurement has transferred at least 1,000 megabytes of data. See BDC Mobile Technical Requirements Public Notice at *6, para. 14; see also Ookla Comments at 10-11 (“strong incentives exist to accurately complete a test as quickly and easily as possible.”). Specifically, when a speed test transfers at least 1,000 megabytes of data, we will validate the test if it has a duration value of greater than 0 seconds and less than or equal to 30 seconds. Otherwise, a speed test must have a duration value of greater than or equal to 5 seconds and less than or equal to 30 seconds to be valid. (2) test measurement results that have been averaged over the duration of the test (i.e., total bits received divided by total test time); and (3) a restriction that tests must be conducted between the hours of 6:00 a.m. and 10:00 p.m. local time. BDC Mobile Technical Requirements Public Notice at *6, para. 14. 15. We clarify that the minimum and maximum test length parameters will apply individually to download speed, upload speed, and round-trip latency measurements, and will not include ramp-up time. See Ookla Comments at 4 (requesting clarification for whether the proposed test length combines latency, download, and upload measurements into one test or considers them individually, and advocating for a minimum of 5 measurements and maximum of 20 measurements for the latency metric); Ookla Reply at 10-11; Vermont DPS Comments at 5-6 (recommending that the Commission specify a test sequence of 15 seconds’ duration with five seconds for each of the individual component test measurements (upload, download, and latency)). We disagree with CCA, Public Knowledge/New America, and Vermont DPS that imposing a maximum test length places an arbitrary or inferior limitation on testing. CCA Comments at 3-6; Public Knowledge/New America Reply at 5-6; Vermont DPS Reply at 4-5. These timing requirements balance representative measurement over a stable Transmission Control Protocol (TCP) connection, on the one hand, against data usage considerations, on the other, especially for consumers who may have limited data plans. The FCC Speed Test app, for example, first initiates a test server selection process, which typically takes two seconds (and a maximum of 10 seconds if servers fail to respond), and then runs the download and upload tests individually, each for a maximum of eight seconds (including a warm-up time), establishing three concurrent TCP connections and summing the three resulting data rates for each test.
FCC Office of Engineering and Technology, 2021 FCC Speed Test App Technical Description at 6 (2021), https://www.fcc.gov/sites/default/files/2021_fcc_speed_test_app_technical_description.pdf (2021 FCC Speed Test App Technical Description) (the 2021 FCC Speed Test App Technical Description is accessible via the FCC’s Measuring Broadband America Mobile Data webpage under the heading, “Data set scripts and descriptions,” available at https://www.fcc.gov/reports-research/reports/measuring-broadband-america/measuring-broadband-america-mobile-data). In addition, the round-trip latency testing runs for a fixed five seconds to transmit up to 200 UDP (User Datagram Protocol) packets (i.e., datagrams) to calculate the average latency of those datagrams. 2021 FCC Speed Test App Technical Description at 6. Hence, a typical test cycle takes approximately 23 seconds to complete, and a maximum of 31 seconds to complete. 16. We also decline to adopt CCA’s request to exempt continuous network monitoring from the maximum test length. See CCA Comments at 3-6. Continuous network monitoring software can monitor active users’ speeds at the cell sites and other network parameters over extended periods of time. See, e.g., Netscout, TruCall, https://www.netscout.com/sites/default/files/2019-01/SPDS_003_EN-1901%20-%20TrueCall.pdf (last visited Feb. 3, 2022) (discussing how network operators can continuously measure and collect performance metrics directly from their RAN and Core networks). We are not persuaded that deviating from the uniform 30-second per test component maximum testing standard to accommodate continuous network monitoring will yield equal or more accurate test results. We found in the Mobility Fund Phase II challenge process that continuous network monitoring speed tests recorded significant variability within the same area and across a short time span, in some cases recording strong network performance well exceeding the minimum requirement interspersed with short seconds-long drops in performance that may have been the result of normal network conditions (e.g., sector handover or network scheduling). See Rural Broadband Auctions Task Force Staff Report, Mobility Fund Phase II Coverage Maps Investigation Staff Report at 61-62, Appx. B, para. 6 (2019), https://docs.fcc.gov/public/attachments/DOC-361165A1.pdf (Mobility Fund Phase II Investigation Staff Report) (noting that staff analysis “indicates that a large portion of challenger data include speed tests both above and below 5 Mbps within the same general area” and concluding that the MF-II challenge process algorithm was “less reliable for data where a challenger conducted dozens of continuously recorded drive tests along roads within a grid cell”); id. at 62, Appx. B, para. 6, n.9 (suggesting “a more appropriate framework for processing a large number of speed tests recorded in a short time period over a limited area could be the use of statistical calculations (e.g., 90th percentile) to mitigate noise in the data due to the variability of wireless networks”). The Commission is obligated by statute to consider lessons learned in MF-II when creating the challenge process. 47 U.S.C. § 642(b)(5)(B)(i)(V). 
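To illustrate how the duration, data-volume, and time-of-day validity rules adopted in paragraph 14 interact, the following is a minimal Python sketch that applies them to a single download or upload measurement; the constant and function names are illustrative assumptions, not part of the BDC specification:

```python
from datetime import datetime

# Validity thresholds from paragraph 14: a measurement normally must run
# 5-30 seconds, but the 5-second minimum is relaxed once at least
# 1,000 megabytes have been transferred. Tests must be conducted between
# 6:00 a.m. and 10:00 p.m. local time. (This sketch assumes 1 MB = 10^6 bytes.)
MIN_DURATION_S = 5.0
MAX_DURATION_S = 30.0
LARGE_TRANSFER_BYTES = 1_000 * 10**6
EARLIEST_HOUR = 6    # 6:00 a.m. local time
LATEST_HOUR = 22     # 10:00 p.m. local time

def is_valid_measurement(duration_s: float, bytes_transferred: int,
                         local_start: datetime) -> bool:
    """Check one download or upload measurement individually, since the
    minimum and maximum test lengths apply per component rather than to
    the combined test sequence."""
    # Reject tests outside the 6:00 a.m.-10:00 p.m. window (the Order does
    # not say whether a test starting exactly at 10:00 p.m. counts; this
    # sketch treats the window as half-open).
    if not (EARLIEST_HOUR <= local_start.hour < LATEST_HOUR):
        return False
    # Large transfers need only a positive duration of at most 30 seconds.
    if bytes_transferred >= LARGE_TRANSFER_BYTES:
        return 0.0 < duration_s <= MAX_DURATION_S
    # Otherwise the standard 5-to-30-second window applies.
    return MIN_DURATION_S <= duration_s <= MAX_DURATION_S
```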
The overall performance in the areas examined in the MF-II investigation indicated that coverage was adequate (i.e., with the average of tests in the same area over 15-20 seconds exceeding the minimum requirement), but because the test results were so variable, we are concerned that allowing the reporting of continuous speed tests could result in inaccurate results that do not reflect the typical on-the-ground customer experience, which, as the results showed, may be adequate when averaged but may not deliver consistent speeds to consumers. Mobility Fund Phase II Investigation Staff Report at 61-62, Appx. B, para. 6. To the extent challengers choose to use continuous network monitoring to record challenge data, results of the speed tests should report the average speeds over a uniform time period consistent with the minimum and maximum test lengths we adopt above (i.e., a minimum of 5 seconds and a maximum of 30 seconds). 17. We share Ookla’s concern that averaging the number of bits received over the entire duration of a throughput test may negatively affect the accuracy of any calculation, as that may not exclude an internet connection’s known and expected “ramp-up time.” Ookla Comments at 6 (defining “ramp-up time” as “the time period in which a congestion control algorithm – such as Transmission Control Protocol (“TCP”) slow start – gradually increases the amount of data transmitted over a connection until the algorithm finds the network’s maximum carrying capacity”). To account for this, we will apply the following formula: (total bits received − ramp-up bits) / (total test time − ramp-up time). We consider “ramp-up bits” to be the bits received during the initial warm-up time. We find that this approach will sufficiently account for ramp-up time and fully satisfy Ookla’s concern, especially in light of the clarification above that the test time limits apply individually to tests’ upload and download measurements. 18. We require on-the-ground speed test data to include a standardized set of metrics. Each on-the-ground speed test must include the following metrics that were previously adopted by the Commission, Third Order, 36 FCC Rcd at 1166-67, 1172, paras. 101, 103, 117, as modified by the updates proposed in the BDC Mobile Technical Requirements Public Notice, BDC Mobile Technical Requirements Public Notice at *6, para. 14: (1) the timestamp and duration of each test metric; (2) geographic coordinates (i.e., latitude/longitude) measured at the start and end of each test metric with typical Global Positioning System (GPS) Standard Positioning Service accuracy or better, along with the location accuracy; “Location accuracy” refers to a metric that GPS-enabled smartphones are able to report on the horizontal accuracy of the geographic coordinates of the location reported. (3) the consumer-grade device type(s), brand/model, and operating system used for the test; (4) the name and identity of the service provider being tested; (5) location (e.g., hostname or IP address) of the test server; (6) signal strength, signal quality, unique identifier, and other radiofrequency (RF) metrics of each serving cell, where available; (7) download speed; (8) upload speed; (9) round-trip latency; (10) for an in-vehicle test, the speed the vehicle was traveling when the test was taken, where available. All on-the-ground speed tests must also include the following metrics previously adopted by the Commission: Third Order, 36 FCC Rcd at 1172, para. 117.
(11) whether the test was taken in an in-vehicle mobile or outdoor pedestrian stationary environment; Government and other third-party entities must also indicate whether an in-vehicle mobile test was conducted with the antenna outside of the vehicle. (12) an indication of whether the test failed to establish a connection with a mobile network at the time and location it was initiated; and (13) the network technology (e.g., 4G LTE, 5G-NR) and spectrum bands used for the test. Both para. 103 and para. 117 of the Third Order required speed tests to include the network technology (e.g., 4G LTE or 5G-NR) and spectrum bands used for the test. We adopt an additional metric that was proposed in the BDC Mobile Technical Requirements Public Notice: (14) the app name and version. BDC Mobile Technical Requirements Public Notice at *6, para. 14. We will also require all speed tests to include: (15) the timestamp that test measurement data were transmitted to the app developer’s servers, as well as the source IP address and port of the device, as measured by the server. Given concerns that challengers may conduct tests after exceeding data limits, we will collect the timestamp that test measurement data were transmitted to the app developer’s servers, as well as the source IP address and port of the device, as measured by the server, so that a service provider may determine if a challenger’s device is subject to reduced speeds or otherwise lacks full network performance. See Verizon Comments at 11. The source port of the device is an available network port over which the device communicates with the server and is unique to a particular network connection or transmission. The IP address and source port associated with the device used in testing are obtainable from both iOS and Android devices. For the same reasons, we will allow government and other third-party entities to alternatively submit the IMEI of the device used to conduct the test rather than provide the source IP address, source port, and timestamp measured by an app developer’s servers, since such entities are allowed to use their own hardware or software to conduct speed tests. The purpose of collecting either type of data is to allow the challenged provider to identify characteristics of the device or service plan used to conduct the test, such as whether the device was roaming or was subjected to slower service due to the subscriber’s data plan. See Verizon Comments at 11. Accordingly, we will not require a service provider to submit either the device IMEI or the combination of source IP address, source port, and timestamp when submitting speed tests (either in response to a challenge or in response to a verification inquiry), as these fields are relevant only for data submitted by challengers. Finally, we require on-the-ground challenge test data to include all other metrics required per the most recent specification for mobile test data adopted by OEA and WTB in accordance with 5 U.S.C. § 553. Concurrent with release of this Order, we are publishing the full technical and data specifications for mobile speed test data. Broadband Data Task Force and Office of Economics and Analytics Publish Additional Data Specifications for the Submission of Mobile Speed Test and Infrastructure Data into the Broadband Data Collection, Public Notice, DA 22-242 (BDTF/OEA 2022).
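To give a concrete picture of how the high-level metrics enumerated above might travel together in a single submission, here is a minimal Python sketch; the field and function names are illustrative assumptions rather than the published data specification (DA 22-242 contains the authoritative fields), and the helper implements the ramp-up-adjusted throughput formula from paragraph 17:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SpeedTestRecord:
    """Illustrative container for the high-level metrics (1)-(15) above;
    these field names are assumptions, not the official BDC schema."""
    timestamp: str                        # (1) timestamp of the test metric
    duration_s: float                     # (1) duration of the test metric
    start_lat: float                      # (2) coordinates at start of test
    start_lon: float
    end_lat: float                        # (2) coordinates at end of test
    end_lon: float
    location_accuracy_m: float            # (2) reported horizontal accuracy
    device_model: str                     # (3) device brand/model
    operating_system: str                 # (3) device operating system
    provider_name: str                    # (4) provider being tested
    test_server: str                      # (5) hostname or IP of test server
    signal_strength_dbm: Optional[float]  # (6) RF metrics, where available
    download_mbps: float                  # (7)
    upload_mbps: float                    # (8)
    latency_ms: float                     # (9)
    vehicle_speed_mph: Optional[float]    # (10) for in-vehicle tests
    environment: str                      # (11) "in-vehicle" or "stationary"
    connection_failed: bool               # (12) no connection established
    network_technology: str               # (13) e.g., "4G LTE", "5G-NR"
    app_name_version: str                 # (14)
    server_timestamp: str                 # (15) as measured by the server
    source_ip: str                        # (15)
    source_port: int                      # (15)

def average_throughput_bps(total_bits: int, total_time_s: float,
                           ramp_up_bits: int, ramp_up_time_s: float) -> float:
    """Paragraph 17's formula, which excludes the TCP ramp-up period:
    (total bits received - ramp-up bits) / (total test time - ramp-up time).
    Assumes the test ran longer than the ramp-up period."""
    return (total_bits - ramp_up_bits) / (total_time_s - ramp_up_time_s)
```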
The specification for speed test data includes additional fields derived from the high-level metrics defined herein, as well as other identifiers to facilitate management of the submission of such data. These fields include: a unique device installation ID; a unique test ID; the device Type Allocation Code (TAC); the Mobile Country Code (MCC) and Mobile Network Code (MNC) values measured from the network and from the device’s SIM card; flags indicating whether the network is connected, is available, and/or is roaming; total bytes transferred and calculated bytes per second for download and upload tests; jitter and packets sent and received for latency tests; for each connected cell, the measured cell ID, Physical Cell Identity (PCI), cell connection status, Received Signal Strength Indication (RSSI), Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), Signal to Interference and Noise Ratio (SINR), Channel Quality Indicator (CQI), spectrum band and bandwidth, and Absolute Radio-Frequency Channel Number (ARFCN); and the horizontal accuracy of GPS coordinates and speed accuracy of measured velocity for each location measurement. Third-party app developers and government or other third parties that use their own hardware or software to conduct speed tests will be required to update their processes in accordance with any updates to these specifications, including, as stated in the BDC Mobile Technical Requirements Public Notice, revised specifications for mobile test data adopted by the Bureau and Offices in accordance with 5 U.S.C. § 553. The modified set of parameters and metrics we adopt aligns more closely with those already required of government and third-party challengers. 47 CFR § 1.7006(f); Third Order, 36 FCC Rcd at 1172, para. 117. The Commission delegated authority to the Bureau and Offices to adopt additional testing requirements for government and third-party challengers. Third Order, 36 FCC Rcd at 1172, para. 118. We therefore add certain metrics to those listed in paragraph 117 of the Third Order and § 1.7006(f) of the Commission’s rules and make clear that all challengers must collect these metrics, with the exception that consumers need not indicate whether an in-vehicle mobile test was conducted with the antenna outside of the vehicle. 19. We recognize the concerns raised by Vermont DPS, Enablers, and Public Knowledge/New America about excessive data requirements and the burdens placed on consumers, governments, and other third-party challengers to ensure that their data align with these standards, Vermont DPS Comments at 10 (arguing that these data metrics will require unreasonable processing to produce and record an exorbitant number of coordinates in any of the proposed app options, which will result in significant extra data and exponential possibility for data errors and will be administratively burdensome on government entities to adapt their apps to include six sets of coordinates for each test sequence); Enablers Comments at 6 (arguing that the testing parameters “amount to an exceedingly high burden of proof for consumers and other parties challenging mobile operator broadband coverage maps” and “[a]s a result, contrary to the Broadband DATA Act and its own policy goals, the Commission is at risk of overseeing a largely ineffective and meaningless mobile broadband mapping challenge process.”); Public Knowledge/New America Reply at 3 (agreeing with Enablers).
but we believe that such parameters and metrics are necessary to provide the Commission with complete and reliable challenge data that accurately reflect on-the-ground conditions in the challenged area and provide the additional context necessary to efficiently and fully adjudicate challenges and thereby assure that more accurate and reliable coverage maps are made available. These data metrics are also substantially similar to those adopted by the Commission in the Third Order, see 47 CFR § 1.7006(e)(1)-(2), (f); Third Order, 36 FCC Rcd at 1166-67, 1172, paras. 103, 117, and therefore we do not anticipate that they will create any new burdens on consumers or governmental entities and third parties beyond those resulting from the previously adopted requirements. Further, the challenge process will remain user-friendly because any challenger can use a readily downloadable mobile app to collect and submit data (including the FCC Speed Test app, which the FCC makes available for download at no cost), and government and third-party entities also have the flexibility to use their own software or hardware. Therefore, government and other third parties will only need to modify their software once, to the extent necessary to conform to the required testing parameters and metrics we discuss above (and subject to our adopting any new metrics in the future). The Commission will also provide technical assistance to consumers and state, local, and Tribal governmental entities with respect to the challenge process, Third Order, 36 FCC Rcd at 1185-86, para. 155 (directing OEA and the Consumer and Governmental Affairs Bureau to make detailed webinars available to explain the challenge process and make available the names and contact information of Commission staff who are available to assist consumers, state, local, and Tribal governments with the challenge process); see also 47 U.S.C. § 644(e) (requiring the Commission to “provide technical assistance to consumers and state, local, and Tribal governmental entities with respect to the challenge process . . ., which shall include . . . detailed tutorials and webinars [and] the provision of staff of the Commission to provide assistance, as needed, throughout the entirety of the challenge process”). which will be a resource for government entities that do not understand some of our data collection requirements. The Bureau and Offices will ensure that the FCC Speed Test app and other apps approved for use in the challenge process collect this information, and government and other third-party challengers will be able to submit challenge data to the Commission through such apps under the procedures adopted for consumer challenges. 20. We understand that certain technical network information and RF metrics that we would otherwise require are not currently available on Apple iOS devices. The information we will use in the challenge process that can be collected from Android devices, but not iOS devices, includes the signal strength, signal quality, unique identifier, and other RF metrics of each serving cell, as well as the spectrum bands used for the test and other network characteristics (e.g., whether the device was roaming, as well as the identity of the provider for the connected network).
Therefore, until such time as such information and metrics are available on iOS devices, and the Bureau and Offices indicate they will collect such information from iOS devices, government and third-party entity challengers must use a device that is able to interface with drive test software and/or runs the Android operating system. See BDC Mobile Technical Requirements Public Notice at *6, para. 14. The iOS operating system, which supports iPhone and iPad hardware devices, does not disclose certain technical network information and RF metrics that are essential to the Commission’s challenge and crowdsource processes. This limits the conclusions that we can draw from on-the-ground tests conducted using such devices. OET will update its guidance if future iOS software versions are released that disclose this technical network information and/or RF metrics. However, to ensure that the challenge process remains user-friendly and to encourage public participation, including by consumers who use a device running the iOS operating system, we will not extend this restriction to challenges submitted by consumers, and we will still consider speed test data submitted using an iOS device toward challenges. Although iOS software does not report the complete metrics we require in this Order (e.g., certain technical network information and RF metrics), the Bureau and Offices will nevertheless use the remaining on-the-ground data we receive from consumers using iOS software in the challenge process. Although we may receive limited data from tests run on iOS devices, we do not anticipate that such tests will significantly impede the creation of challenges because, as mentioned, the Commission will aggregate speed tests to create cognizable challenges. iOS speed tests will be considered in combination with other speed tests that fall within the same resolution 8 hexagon. We therefore anticipate that data submitted by government and other entities, as well as consumer tests run on Android devices, will help fill in any gaps in information about the on-the-ground quality and availability of broadband coverage that may result from the limited nature of the data we receive from speed tests run on iOS devices. Our approach preserves balance and flexibility for both types of challengers, while also ensuring that the Commission gathers adequate data to adjudicate challenges. On the one hand, government and other third-party entities, who can be expected to submit large amounts of speed test data, may not use iOS devices but have the flexibility to use their own hardware and software. On the other hand, consumers who use iOS devices and would face a prohibitive burden if required to use a non-iOS device to submit a challenge may submit speed tests conducted using an iOS device but do not have the same flexibility as government and other entities to use non-approved software. 21. Third-party app developers and government or other entities that use their own hardware or software to conduct speed tests will be required to update their processes in accordance with updates to the full technical and data specifications for mobile speed test data, including, as stated in the BDC Mobile Technical Requirements Public Notice, revised specifications for mobile test data adopted by the Bureau and Offices. BDC Mobile Technical Requirements Public Notice at *6, para. 14.
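As an illustration of the hexagon aggregation described above, the following minimal Python sketch groups validated speed tests into resolution-8 cells using the open-source h3 library (this sketch assumes the v3 API); the threshold value and function names are hypothetical placeholders, since the actual thresholds and methodology are established in the Technical Appendix:

```python
from collections import defaultdict
import h3  # open-source H3 geospatial indexing bindings (v3 API assumed)

def group_tests_by_hexagon(tests):
    """Group validated tests into resolution-8 H3 cells.
    tests: iterable of (lat, lon, passed) tuples, where `passed` records
    whether the test met the provider's reported speeds."""
    cells = defaultdict(list)
    for lat, lon, passed in tests:
        cell = h3.geo_to_h3(lat, lon, 8)  # resolution-8 cell index
        cells[cell].append(passed)
    return cells

def cells_with_enough_negative_tests(tests, threshold=20):
    """Return the cells whose accumulated negative tests reach a
    hypothetical threshold; the governing thresholds and the treatment
    of test variability are set out in the Technical Appendix."""
    return [cell
            for cell, results in group_tests_by_hexagon(tests).items()
            if sum(1 for passed in results if not passed) >= threshold]
```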
RWA asserts that adopting the proposed data metrics and parameters, including “all other metrics required per the most-recent specification for mobile test data released by OEA and WTB,” Id., would be an improper incorporation by reference that violates the Federal Register regulations and the Administrative Procedure Act (APA). RWA Comments at 17-19 (discussing the metrics that speed test data and infrastructure data must include, which we adopt in supra para. 18 and infra Section III.B.4. Collecting Verification Information from Mobile Providers, Infrastructure Information). We disagree with RWA that this is an improper incorporation by reference that violates Federal Register regulations and the APA. The metrics we require are substantially similar to those already adopted by the Commission in the Third Order, Third Order, 36 FCC Rcd at 1166-67, 1172, paras. 101, 102 & n. 315, 103, 117. and have been adopted after notice and comment in accordance with the APA’s rulemaking requirements. See, e.g., BDC Mobile Technical Requirements Public Notice at *6, para. 14 (seeking comment on the proposed parameters and metrics); see also 5 U.S.C. § 553. Furthermore, we note that certain changes to the specifications that apply to the submission of on-the-ground test data, including, for example, changing the file type to be submitted, are not substantive changes and may be adopted without notice and comment. See 5 U.S.C. § 553(b)(B). The Bureau and Offices have been delegated authority to adopt such procedural changes pursuant to section 1.7010 of the Commission’s rules. See 47 CFR § 1.7010 (stating that the Bureau and Offices “may update the specific format of data to be submitted pursuant to the [BDC] to reflect changes over time in Geographical Information Systems (GIS) and other data storage and processing functionalities and may implement any technical improvements or other clarifications to the filing mechanism and forms”). To the extent that we may wish to make any substantive changes to testing parameters or metrics, we clarify that we would make such changes in accordance with 5 U.S.C. § 553. Any future changes we make to the testing parameters or metrics will also be consistent with the Commission’s Orders implementing the Broadband DATA Act. Finally, the adoption of these rules will not result in an improper incorporation by reference because we will comply with the requirements of any applicable federal statutes and regulations governing the publication of these test parameters and metrics in the Federal Register and the Code of Federal Regulations. See, e.g., 5 U.S.C. § 552(a)(1); 1 CFR pts. 5, 51. 22. Speed Test Applications. Pursuant to the Commission’s directive in the Third Order, Third Order, 36 FCC Rcd at 1167, para. 104, OET is currently developing updates to the FCC Speed Test app to incorporate additional functionalities that will allow for its use in submitting speed test data as part of the BDC mobile challenge and crowdsource processes. See FCC, Justification for Other than Full and Open Competition – BDC Speed Test App, https://sam.gov/opp/cbd05b3514244c62996ef84ce734ea37/view (last visited Jan. 28, 2022). OET recently released a technical description of the metrics and methodologies used in the current version of the FCC Speed Test app. See generally 2021 FCC Speed Test App Technical Description. The revised technical description document includes updated technical standards and additional modifications. Id.
While this document does not illustrate future user experience design changes to the FCC Speed Test app that will be made to implement the challenge and crowdsource functionalities, we anticipate that the fundamental measurement methodologies reflected in the recently updated technical description document will not be affected by these design updates. See id. We note that the description includes the following about the test system architecture: “The measurement servers, each supporting a 100 Gbps capacity, used for mobile broadband measurement are hosted by StackPath and are distributed nationally to enable a measurement client to select the host server with the least latency.” Id. at 6. The technical description includes data dictionaries for both Android and iOS versions of the app, but these dictionaries define data fields and formats for the current version of the app (and not the updated version of the app). See id. To provide third-party app developers and other stakeholders with information and guidance as early in the process as possible, the Bureau and Offices have made available, contemporaneous with the release of this Order, a current draft of the data specification the FCC Speed Test app will use once updated to include challenge and crowdsource data functionalities. Broadband Data Task Force and Office of Economics and Analytics Publish Additional Data Specifications for the Submission of Mobile Speed Test and Infrastructure Data into the Broadband Data Collection, Public Notice, DA 22-242 (BDTF/OEA 2022). The updated data specification aligns with the test metrics adopted in this Order. The updated FCC Speed Test app with those functionalities will be available on the FCC’s website and in iOS and Android app stores prior to the opening of the challenge and crowdsource process. 23. We decline to provide a further opportunity for comment on the FCC Speed Test app. Although some parties request an opportunity for public comment on both the FCC Speed Test app and third-party apps before we allow them to be used in the challenge process, CTIA Comments at 4; T-Mobile Comments at 15; AT&T Reply at 10-12; Ookla Comments at 9-10. we note that the Commission already sought comment on the use of the FCC Speed Test app in the challenge process as part of this rulemaking proceeding. See Second Order and Third Further Notice, 35 FCC Rcd at 7515, para. 143. The Commission also provided other opportunities to comment on the FCC Speed Test app because (1) the app was initially developed in coordination with the major wireless providers and trade associations several years ago; and (2) information on the data collected by the app has been publicly available on the Commission’s website and has been available for comment in a rulemaking docket for several years. See generally GN Docket No. 12-264; FCC, Measuring Mobile Broadband Methodology—Technical Summary, https://www.fcc.gov/general/measuring-mobile-broadband-methodology-technical-summary (last visited Feb. 1, 2022). Additionally, the Commission specified in the Third Order that the challenge process use an FCC app, and, unlike some newer third-party speed test apps, the FCC Speed Test app has been in use for several years and the updates that are underway will merely implement the data specifications and requirements proposed in the BDC Mobile Technical Requirements Public Notice and adopted by this Order. Third Order, 36 FCC Rcd at 1166-67, paras. 103-04. 
For these reasons, we do not believe it is necessary to seek comment on the use of the FCC Speed Test app for challenge and verification purposes. 24. CCA and RWA assert that it is unclear how the FCC Speed Test app will operate when there is inadequate connectivity to upload data or record a test. CCA Comments at 6-7; RWA Comments at 17 (asking the Bureau and Offices to clarify that a “failed” test (i.e., a zero service test) constitutes a negative test); Public Knowledge/New America Reply at 4. Several commenters likewise misunderstand how the FCC Speed Test app reports “failed” tests or tests where mobile service is unavailable. See CCA Comments at 6-7; RWA Comments at 17; Public Knowledge/New America Reply at 4. As set forth in the 2021 technical description of the FCC Speed Test app, “test[ ] results are transferred depending on the available connectivity at the conclusion of the test and can be stored and forwarded when connectivity is immediately unavailable.” 2021 FCC Speed Test App Technical Description at 11, n.8. The FCC Speed Test app is therefore designed to record and store measurements conducted in areas without internet connectivity and to transmit such failed tests automatically, for inclusion in the relevant dataset, once the app is opened when the device next has a broadband connection. Moreover, third-party apps will be required to function in a similar way to be granted approval for use in the challenge process. The upload and download components of a failed test will be recorded as negative if they fail to meet the minimum speeds that the mobile service provider reports as available where the test occurred. For example, if a failed test records speeds of 0 Mbps upload and 0 Mbps download, both components of the test will be recorded as negative. 25. At a later date, OET will release a public notice outlining the process for collecting, reviewing, and approving applications for third-party speed test apps. In their applications, app developers will be required to describe their performance-centric speed test methodologies and how their app complies with the data collection requirements set forth in this Order. Applicants will not be required to disclose to the Commission any proprietary or confidential information that is too sensitive for public inspection, such as source code, and we therefore decline to adopt T-Mobile’s request that we require developers to submit their source code for public review. See T-Mobile Comments at 15 (urging the Commission to require third-party apps to disclose app source code for public review); Ookla Reply at 7-9 (asking the Commission to reject T-Mobile’s proposal). The OET public notice also will describe procedures for interested parties to submit comments and replies in response to the proposals, and OET will publish on the Commission’s website a list of approved third-party apps and any available data specifications for third-party apps. 26. We agree with commenters who recommend holding the FCC Speed Test app and third-party apps to the same technical standards. See, e.g., CTIA Comments at 4; AT&T Reply at 10-12; see also CCA Comments at 7-10 (recommending that we authorize third-party speed test apps and authorize the FCC Speed Test app at the same time).
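To illustrate the store-and-forward behavior described in paragraph 24, which approved third-party apps will need to replicate, the following is a minimal sketch; the class and field names are our own hypothetical illustration and are not part of any FCC or BDC specification.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SpeedTestResult:
    timestamp: str        # local time the test was conducted
    download_mbps: float  # 0.0 when no connection could be established
    upload_mbps: float    # 0.0 when no connection could be established

def upload_to_server(result: SpeedTestResult) -> None:
    """Placeholder for the actual transfer to the app developer's server."""
    print(f"uploaded: {result}")

@dataclass
class TestStore:
    """Hypothetical client-side store-and-forward queue."""
    pending: List[SpeedTestResult] = field(default_factory=list)

    def record(self, result: SpeedTestResult, connected: bool) -> None:
        # Every result, including a failed (0/0) test, is stored locally
        # first; if connectivity is available at the conclusion of the
        # test, stored results are forwarded immediately.
        self.pending.append(result)
        if connected:
            self.flush()

    def flush(self) -> None:
        # Called when the app next opens with a broadband connection, so
        # failed tests conducted offline still reach the relevant dataset.
        while self.pending:
            upload_to_server(self.pending.pop(0))

store = TestStore()
store.record(SpeedTestResult("2022-03-09 09:30", 0.0, 0.0), connected=False)
store.flush()  # later, once the device regains broadband connectivity
```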
Both the FCC Speed Test app and third-party apps, as well as software used by state and local governments and other third parties, must comply with the data collection requirements set forth in this Order. We also agree with commenters who recommend requiring speed test apps to use multiple servers that are geographically diverse. CTIA Comments at 16 & n.47; T-Mobile Comments at 15-16; Ookla Comments at 8. Contra RWA Reply at 8 (expressing concern that a perceived lack of testing servers will be used to discount rural challenges and disagreeing that the FCC Speed Test app needs more servers due to latency concerns). As to this point, CCA asserts that Ookla’s speed test app is more accurate than the FCC Speed Test app due in major part to its many geographically distributed servers (with 41 servers in the U.S. and 15,019 testing servers globally), which allow users to run a test against a server that is located physically close to them, reducing the likelihood that inaccurate latency measurements or artificial increases in latency will distort the download and upload speeds. CCA Comments at 10-11. As described in the most recent technical description for the FCC Speed Test app, the app currently carries out measurements against 13 servers spread across 10 locations throughout the United States and initiates each test sequence by running a latency test to identify the optimal server, i.e., the one with the lowest round-trip latency, for performing subsequent tests. 2021 FCC Speed Test App Technical Description at 11-12. We believe that the current distribution of FCC Speed Test app servers, combined with this measurement server selection process, provides sufficient diversity to meet this geographic-diversity criterion. We also note that the number of servers used by a speed test is of less concern than the ratio of concurrent consumers conducting tests to the total capacity of the test server hosting those tests (i.e., the server utilization rate). The FCC Speed Test app’s test servers are overprovisioned based upon utilization-rate and usage-pattern statistics, which are monitored automatically to maintain high system availability and optimal connectivity. A utilization rate of 80% or more is classified as a critical state and triggers the provisioning of new servers to stabilize load across the platform. Accordingly, although the FCC Speed Test app’s server footprint is not as geographically diverse as Ookla’s, we believe it provides sufficient capacity to support the app’s user base and sufficient geographic diversity to meet the needs of the test system architecture. Id. That architecture also employs multiple redundant, meshed servers, each providing 100 gigabit per second (Gbps) connectivity capacity, with load balancing that allows traffic to fail over to other servers, targeting maximum availability of the test platform. We also observe that latency is the principal concern raised by commenters. See, e.g., Ookla Comments at 8 (asserting that tests should rely on servers located in the U.S.
that are within 2,000 kilometers of the device running the test to prevent a negative effect on latency measurements); CCA Comments at 10 (arguing that inaccurate latency measurements could be caused by large distances between users and servers); CTIA Reply (agreeing with CCA); but see RWA Comments at 8 (arguing that requiring apps to use additional servers would be a logistical burden that is not outweighed by any specific benefits). In this regard, we note that Commission rules require measurement of round-trip latency. As implemented in the FCC Speed Test app, the variability of latency is not solely a function of geographic distance to the test server; it is also a function of network congestion. At a minimum, therefore, servers should be distributed nationally in consideration of the user base, population density, and server utilization rate, with multiple candidate servers examined before test server selection, and should be located reasonably close to Internet eXchange Points (IXPs) so that tests accurately reflect unbiased real-world conditions. See 2021 FCC Speed Test App Technical Description at 4-5 (stating “[a]s a result, the metrics measure performance along a specific path within each mobile broadband provider’s network, through the point of interconnection between the mobile broadband provider’s network, and the network on which the chosen measurement server resides . . . to accurately measure the performance of mobile broadband connections based on real-world usage”). The FCC Speed Test app takes these effects into account and thereby helps reduce round-trip latency. 27. Validating Speed Tests. As proposed in the BDC Mobile Technical Requirements Public Notice, we will validate submitted speed tests and exclude those that: (i) are outside the scope of the challenge process, (ii) do not conform to the data specifications, or (iii) do not otherwise present reliable evidence. BDC Mobile Technical Requirements Public Notice at *4, para. 9. We will accept as valid speed tests only those tests conducted between the hours of 6:00 a.m. and 10:00 p.m. local time. Id. at *4, para. 9 & n.23 (stating that speed tests should be reflective of the hours that consumers typically use mobile broadband networks and explaining that we propose a departure from the Second Order proposal to require that speed tests be taken between the hours of 6:00 a.m. and 12:00 a.m. because “tests conducted after 10:00 p.m. may likely record network performance that is materially different than tests conducted earlier in the day due to reduced cell loading”); see also Second Order and Third Further Notice, 35 FCC Rcd at 7518-19, para. 153 (proposing that each speed test be taken between 6:00 a.m. and 12:00 a.m. (midnight) local time). Commenters do not raise concerns with our adopting a window for purposes of validating speed tests. See infra para. 53 (discussing the requirements for speed tests to meet the temporal threshold). We will compare speed tests for a particular network technology (e.g., 3G, 4G LTE, or 5G-NR) to the coverage maps for the corresponding technology or a higher-generation technology, to the extent the service provider claims coverage for more than one technology in the tested location. BDC Mobile Technical Requirements Public Notice at *4, para. 9 (proposing to compare speed tests for a particular network technology (e.g., 3G, 4G LTE, or 5G-NR) to the coverage maps for the corresponding technology).
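To make these validations concrete, the sketch below checks the time-of-day window and identifies which coverage maps a test may be evaluated against; it is our own simplified illustration, and the function names, data shapes, and generation ordering are assumptions rather than the BDC system’s actual implementation.

```python
from datetime import time

# Relative ordering of the mobile broadband technologies subject to challenge.
GENERATION = {"3G": 1, "4G LTE": 2, "5G-NR": 3}

def within_testing_window(local_time: time) -> bool:
    """Only tests conducted between 6:00 a.m. and 10:00 p.m. local time
    are accepted as valid."""
    return time(6, 0) <= local_time <= time(22, 0)

def maps_to_evaluate(test_tech, claimed_techs):
    """Return the maps a test is compared against: the map for the tested
    technology plus any higher-generation maps the provider claims at the
    tested location."""
    eligible = [t for t in claimed_techs
                if GENERATION[t] >= GENERATION[test_tech]]
    return sorted(eligible, key=GENERATION.get)

print(within_testing_window(time(22, 30)))       # False: after 10:00 p.m.
print(maps_to_evaluate("3G", {"3G", "4G LTE"}))  # ['3G', '4G LTE']
```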
We implement these changes so that testers are able to submit tests to be used to challenge a higher-generation technology map in situations when a mobile service provider claims multiple technologies at a location but the tester’s device only connects to a lower-generation technology. We agree with Vermont DPS that our original proposal did not adequately address those situations where a device that is unable to connect to a network using a particular technology “falls back” to a lower-generation technology (e.g., 4G LTE to 3G), which could make it impossible to challenge the higher-generation technology. Vermont DPS Comments at 3-4. Vermont DPS warns that a device intended to run a 4G LTE speed test could “fall back” to 4G and 3G technology as the signal deteriorates in an area, and therefore run three tests (one taken with 4G LTE, one taken with 4G, and one with 3G). Id. We note that there is no meaningful distinction between “4G LTE” and “4G” for mobile networks operated in the U.S., and in Vermont DPS’s example only one test would have “fallen back” as the signal deteriorated (i.e., from 4G LTE to 3G). Nevertheless, we agree that a device that is capable of connecting to the network using a higher-generation technology but that instead connects to the network using a lower-generation technology due to poor network quality is indicative of a problem. We will allow, therefore, a speed test conducted using a device capable of connecting to a higher-generation technology, but that only connects to a lower-generation technology, to count as a test for the higher-generation technology. To be a valid test for the higher-generation technology, the consumer submitting the challenge must also subscribe to a service plan that is capable of connecting to the provider’s network using the higher-generation technology. To prevent gaming, and as discussed further below, we will allow challenged providers to invalidate challenger speed tests with specific evidence that the challenger’s device was not capable of connecting using a higher-generation technology or that the service plan to which the challenger subscribes does not allow use of the higher-generation technology. For example, a test conducted with a 4G LTE-capable device in a location where the service provider claims 4G LTE but where the challenger can only connect via the 3G network could count as both a 3G test when compared to the provider’s 3G coverage map as well as a negative 4G LTE test when compared to its 4G LTE coverage map if the test did not meet the 5/1 Mbps minimum speeds; alternatively, it could count as a positive 4G test if the test met or surpassed the 5/1 Mbps minimum speeds reported for the 4G LTE map. Note that, under this approach, the 3G test may count towards the 4G LTE coverage map regardless of whether the provider claims 3G coverage at the location. This modified approach would resolve Vermont DPS’s hypothetical concern that, under the proposal set forth in the BDC Mobile Technical Requirements Public Notice, a test result that “fell back” to a lower-generation technology would not be “preserved.” Vermont DPS Comments at 4. As discussed, such tests will be preserved and used to challenge a higher technology’s maps if a service provider offers a higher-generation service in that area and the tester subscribes to a service plan that is capable of connecting to the provider’s network using the higher-generation technology. 28. 
Similarly, if a challenger conducts a test but fails to connect to any network, we will treat that as a failed test against the provider’s coverage maps for each technology to which the device is capable of connecting. These small changes to our original proposal will help prevent the scenario raised by Vermont DPS and enable more meaningful challenges in areas with marginal coverage where a device “falls back” to a lower-generation technology. Our updated approach also accounts for situations in which a device could alternate between, or utilize both, 4G LTE and 5G-NR over the course of a single test. 5G-NR dynamic spectrum sharing and dual connectivity features would allow 4G LTE and 5G-NR to seamlessly share resources to enhance the user experience during different phases of 5G-NR deployments. See Samsung, Technical White Paper: 5G Standalone Architecture at 5-8, 13-15 (2021), https://images.samsung.com/is/content/samsung/p5/global/business/networks/insights/white-papers/0107_5g-standalone-architecture/5G_SA_Architecture_Technical_White_Paper_Public.pdf. Verizon agrees with the Bureau and Offices’ initial proposal to compare each speed test against the relevant coverage map, and argues that “only speed tests conducted on 3G networks should be used to challenge 3G coverage, only speed tests conducted on 4G LTE networks should be used to challenge 4G LTE coverage, and only speed tests conducted on 5G-NR networks should be used to challenge 5G-NR coverage.” Verizon Comments at 11 (agreeing with the proposal in paragraph 9 of the BDC Mobile Technical Requirements Public Notice). However, we are persuaded that the proposal we sought comment on in the BDC Mobile Technical Requirements Public Notice could allow for a scenario in which a tester seeking to support a challenge to a provider’s 5G coverage would be prevented from submitting evidence because their phone fell back to the 4G network. Under our original proposal, in areas where a provider claims coverage for multiple technologies, a lower-generation technology could have prevented the higher-generation technology from being challenged, which in turn could isolate higher-generation technologies from legitimate challenges. 29. We will also compare speed tests conducted in a particular environment—outdoor stationary or in-vehicle mobile—to the provider’s maps that report coverage for the corresponding environment (i.e., outdoor stationary or in-vehicle mobile), as discussed in greater detail below. BDC Mobile Technical Requirements Public Notice at *4, para. 9. Additionally, we will treat as invalid and exclude from the challenge process any speed tests that fall outside the boundaries of the provider’s most recent coverage data for all claimed technologies and environments. This differs from our original proposal in the BDC Mobile Technical Requirements Public Notice Id. in that the system will preserve all tests in a geographic area where a provider claims coverage by any technology. We believe our modified approach will result in more reliable evidence for challenges because tests that may otherwise have been excluded for falling outside a provider’s coverage for a specific technology under the proposed methodology in the BDC Mobile Technical Requirements Public Notice may now be counted as challenge data. Id.
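A minimal sketch of this fall-back accounting follows; it is our own illustration with hypothetical names, not the BDC system’s implementation.

```python
GENERATION = {"3G": 1, "4G LTE": 2, "5G-NR": 3}

def challengeable_maps(connected_tech, claimed_techs, device_techs, plan_techs):
    """Determine which coverage maps a speed test counts against.

    connected_tech -- technology the device connected with, or None if the
                      test failed to connect to any network
    claimed_techs  -- technologies the provider claims at the test location
    device_techs   -- technologies the device is capable of using
    plan_techs     -- technologies the subscriber's service plan allows
    """
    if connected_tech is None:
        # A test that connects to no network is a failed test against the
        # provider's map for each technology the device can connect to.
        eligible = [t for t in claimed_techs if t in device_techs]
    else:
        # A test counts toward every claimed map at or above the connected
        # generation, provided both the device and the plan support it.
        floor = GENERATION[connected_tech]
        eligible = [t for t in claimed_techs
                    if GENERATION[t] >= floor
                    and t in device_techs and t in plan_techs]
    return sorted(eligible, key=GENERATION.get)

# A 4G LTE-capable device on a 4G LTE plan that only connects via 3G where
# the provider claims both technologies yields evidence against both maps.
print(challengeable_maps("3G", {"3G", "4G LTE"},
                         {"3G", "4G LTE"}, {"3G", "4G LTE"}))
```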
This change will allow for the scenarios discussed above, in which a test conducted using a lower-generation technology could be used to challenge a provider’s map for a higher-generation technology if the provider claims both types of coverage (e.g., 4G LTE and 5G-NR), but a challenger’s device is not connected to the higher-generation technology. 30. In response to Verizon’s concerns that tests may be throttled, Verizon Comments at 11. we will not validate, for purposes of the challenge process, speed tests conducted by customers of mobile virtual network operators (MVNOs) or tests conducted while roaming on another carrier’s network, so as to avoid biasing the challenge process with speed tests that may not reflect typical network performance. MVNOs do not own any network facilities. Instead, they purchase mobile wireless service wholesale from facilities-based service providers and resell these services to consumers. Because the agreements between a facilities-based provider and MVNOs or roaming partners often include limitations on the technology and speed available to, or the network prioritization of, devices used by consumers of the MVNO or roaming partner, we conclude that speed tests from such devices are not reliable evidence about the performance of the facilities-based provider’s network. See, e.g., Mint Mobile, Terms and Conditions (Dec. 6, 2021), https://www.mintmobile.com/plan-terms-and-conditions/ (“We also manage our network to facilitate the proper functioning of services that require consistent high speeds, such as video calling, which may, particularly at times and in areas of network congestion, result in reduced speeds for other services.”); Google Fi, Terms, https://fi.google.com/about/tos/ (last visited Jan. 20, 2022) (“Limits on the amount of minutes, number of texts, and amount or speed of data services used while roaming may be applied. Certain Services or features may not be available in roaming coverage areas, and call quality may be lower while roaming.”). While we anticipate that the majority of tests conducted by an MVNO subscriber or while roaming will fail our automated validations, there may be circumstances where the BDC system is unable to automatically identify these tests (e.g., identifying whether an iOS device is roaming is not currently possible). We anticipate that a provider may identify whether a specific device used in the testing was roaming at the time, was an MVNO customer, or was subject to deprioritized or otherwise limited service because, as discussed, on-the-ground speed tests submitted in the challenge process will include the timestamp that test measurement data were transmitted to the app developer’s servers, as well as the source IP address and port of the device, as measured by the server. Because we will not validate any tests conducted while a subscriber is roaming, we also disagree with Vermont DPS’s assertion that pre-paid tests in rural areas will be less accurate than speed tests run by subscribers of a typical service provider on the ground that pre-paid services exclude roaming in rural areas. Vermont DPS Comments at 12.
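A simplified sketch of this screening follows; it is illustrative only, since, as noted, the BDC system cannot always detect roaming or MVNO status automatically, and the function and field names are hypothetical.

```python
def route_speed_test(is_roaming, is_mvno_customer):
    """Route a submitted speed test based on subscriber status.

    Tests run while roaming or by MVNO customers are excluded from the
    challenge process, though they may still be considered as crowdsourced
    data. A flag of None means the status could not be determined
    automatically (e.g., roaming detection is not currently possible on
    iOS); such tests pass automated validation, but a provider may rebut
    them using the test's timestamp, source IP address, and port.
    """
    if is_roaming or is_mvno_customer:
        return "excluded from challenge; may be considered as crowdsourced data"
    if is_roaming is None or is_mvno_customer is None:
        return "challenge-eligible, subject to provider rebuttal"
    return "challenge-eligible"

print(route_speed_test(is_roaming=True, is_mvno_customer=False))
print(route_speed_test(is_roaming=None, is_mvno_customer=False))
```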
While we will allow a service provider’s pre-paid customers to submit speed tests for use in the challenge process, a service provider will be able to use the timestamp that test measurement data were transmitted to the app developer’s servers, as well as the source IP address and port of the device, as measured by the server, to determine whether a specific speed test was run by a pre-paid subscriber that experienced limited service, and use that information when responding to a challenge. Given that these consumers are likely to be subject to deprioritization or otherwise limited service, and that the BDC system will be unable to detect whether or not a limitation in mobile service exists, we are unable to establish a reliable method for validating MVNO or roaming tests and, thus, these tests will be excluded from the challenge process. As discussed later, however, we may consider speed tests conducted by consumers of MVNOs and consumers roaming on other providers’ networks when evaluating crowdsourced data. See infra Section III.D.1. Crowdsourced Data. 31. Aggregating Valid Speed Tests. The Bureau and Offices will combine and collectively evaluate—according to the testing environment (i.e., outdoor stationary or in-vehicle mobile) and technology type—valid speed tests submitted by consumer, governmental, and third-party challengers. See BDC Mobile Technical Requirements Public Notice at *4, para. 8. Speed tests, including those collected through an approved speed test app and the data collected by government and other third-party entities through their own software and hardware, will be combined and collectively evaluated according to their tested environment and technology type. For example, as discussed in greater detail below, in-vehicle tests will generally be evaluated against a carrier’s in-vehicle maps, and stationary tests will generally be compared against a carrier’s stationary maps. We expect that in-vehicle and stationary tests will have substantially different results, such that they would not provide an equal comparison; aggregating these tests would be problematic because fundamental characteristics of the two environments are expected to cause noticeable signal losses in the in-vehicle mobile environment. See Verizon Comments at 7-8; CTIA Comments at 11; T-Mobile Comments at 10-11; AT&T Reply at 8; CTIA Feb. Ex Parte at 4. As noted above, we do not expect iOS and Android devices to pose a similar problem. While we will receive a more complete set of datapoints from Android tests than iOS tests, we do not expect the two to have substantially different results when, for example, tests using both types of devices are conducted in a pedestrian stationary environment; the fact that iOS provides fewer datapoints than Android does not render a test run on iOS any less accurate than a test run on the Android operating system. Similarly, tests conducted with an external antenna will be considered in-vehicle, and while subtle differences between test results from those antenna placements may occur, overall those differences are considerably less significant than the differences between the stationary and in-vehicle mobile environments more broadly. We will combine such speed test evidence and apply a single methodology to determine whether the thresholds for a cognizable challenge (described in greater detail below) have been met and the boundaries of the challenged area. See id.
at *4, para. 8. Several commenters express support for aggregating speed tests from multiple challengers, CTIA Comments at 14-15; T-Mobile Comments at 12-13; RWA Comments at 9; AT&T Reply at 4. and we find that doing so will result in more accurate challenges and will further the Commission’s goals of resolving challenges in an efficient manner, mitigating time and expense, and ensuring that maps are as reliable and useful as possible. Id. See Third Order, 36 FCC Rcd at 1168, para. 105. We disagree with CPUC’s assertion that combining speed test data will not reduce costs or complexity in the challenge process. CPUC Comments at 16-17. In fact, combining speed tests could ease the potential burden on an individual challenger of conducting multiple speed tests to meet the challenge thresholds. CTIA Comments at 15 (“Aggregating test data of the same coverage maps helps achieve the goal of a user-friendly challenge process and reduce[s] burdens on providers and the Commission.”). Our approach ensures that a smaller number of speed tests by one person or entity may nevertheless contribute to a challenge because the tests will be combined with other validated speed tests to meet the testing, temporal, and geographic thresholds. As a result, in many cases, no single challenger—whether a consumer, a government agency, or other entity—will be required to individually shoulder the burden of creating a challenge. AT&T Reply at 4 (“It is also a user-friendly approach because it reduces the burden on individual consumers who do not have to take on the responsibility of challenging coverage by themselves.”). While in places with low population density an individual challenger may be the only entity to submit speed tests to create a cognizable challenge, in many other cases, challengers will be able to combine efforts to submit speed tests in an area. Speed tests will be combined and used collectively—according to testing environment (i.e., outdoor stationary or in-vehicle mobile) and technology type—to meet the thresholds set forth below. 32. We will evaluate tests for a given technology against each provider map independently (one reporting stationary and one reporting in-vehicle mobile coverage) when determining whether to establish a cognizable challenge. BDC Mobile Technical Requirements Public Notice at *6, para. 13. Pursuant to the Third Order, tests taken on bicycles and motorcycles will be considered tests from in-vehicle mobile environments. Third Order, 36 FCC Rcd at 1166, para. 102 & n.315. We will consider in-motion tests taken in similar environments, such as on a snowmobile or all-terrain vehicle, to be tests from in-vehicle mobile environments. By contrast, consistent with the Third Order, tests taken from stationary positions and tests taken at pedestrian walking speeds (such as on horseback) will be considered tests taken in outdoor pedestrian environments. See id. We decline to exclude tests taken on other vehicles as T-Mobile requests. T-Mobile Comments at 10. The Commission did not give the Bureau and Offices authority to change this accommodation; we anticipate that challengers may take speed tests on vehicles other than cars in areas with difficult-to-reach terrain. Additionally, we will exclude stationary tests that occur outside a provider’s stationary coverage map and in-vehicle mobile tests that occur outside a provider’s in-vehicle mobile coverage map. BDC Mobile Technical Requirements Public Notice at *6, para. 13.
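A minimal sketch of this pooling, with hypothetical field names (the one-way treatment of stationary and in-vehicle challenges is discussed in the paragraphs that follow):

```python
from collections import defaultdict

def pool_valid_tests(valid_tests):
    """Group valid speed tests by (environment, technology).

    Each pool is evaluated against the provider map for the same
    environment and technology; stationary and in-vehicle mobile tests
    are never pooled together.
    """
    pools = defaultdict(list)
    for t in valid_tests:
        pools[(t["environment"], t["technology"])].append(t)
    return pools

tests = [
    {"environment": "stationary", "technology": "4G LTE", "down_mbps": 6.2},
    {"environment": "in_vehicle", "technology": "4G LTE", "down_mbps": 1.1},
    {"environment": "stationary", "technology": "5G-NR", "down_mbps": 48.0},
]
for key, pool in sorted(pool_valid_tests(tests).items()):
    print(key, len(pool))
```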
Our approach differs from that which we proposed in the BDC Mobile Technical Requirements Public Notice in that we will no longer aggregate in-vehicle and stationary maps together. We find that the approach we adopt will result in more accurate challenges. T-Mobile Comments at 11 (“the Commission should not sacrifice accuracy and sound methodology for the sake of user friendliness. . . . [i]f the Commission persists in requiring [providers to submit both in-vehicle and stationary maps], it should acknowledge they are separate coverage maps and must have separate challenge processes for each map.”). To ensure that the challenge process also remains user-friendly, and because we expect performance to be better for stationary tests than for in-vehicle tests, In-vehicle users’ data speeds are adversely affected by lower signal strength and quality at the end-user device, typically because of additional vehicle signal penetration losses and sudden variations in signal quality compared to the stationary scenario. See also Verizon Comments at 7-8; CTIA Comments at 11; T-Mobile Comments at 10-11; AT&T Reply at 8; CTIA Feb. Ex Parte at 4. stationary speed test results that create a cognizable challenge to an area on the stationary map will also create a cognizable challenge to the same area on the in-vehicle map if the area has overlapping coverage on both maps. On the other hand, the reverse will not be permitted, meaning that we will not permit a challenge to an area on the in-vehicle map to automatically create a challenge to the same area on the stationary map if the area has coverage on both maps. If, however, in an area that has coverage on both maps, we find that large portions of a provider’s in-vehicle mobile map have been successfully challenged, but there are very few speed tests conducted in a stationary environment, then we may use this as evidence upon which to form a credible basis for initiating a verification inquiry of a provider’s stationary coverage in that area. Similarly, a provider refuting a challenge to a geographic area on the in-vehicle map would also refute a challenge to the same area on the stationary map if that challenge exists. 33. Several providers express concern about the proposal to aggregate in-vehicle mobile and outdoor stationary tests and compare them collectively against both coverage maps.
Verizon Comments at 7-8 (asking the Bureau and Offices to not adopt its proposal to evaluate in-vehicle tests against stationary maps because losses from the vehicle reduce the measured speed and therefore do not provide a valid measurement of the coverage that would be experienced by an outdoor pedestrian); CTIA Comments at 11 (arguing that aggregating outdoor stationary and in-vehicle maps would run counter to the Commission’s efforts to deter frivolous filings and would be inconsistent with our proposal to only compare speed tests of a generation of technology to the maps for that generation of technology); T-Mobile Comments at 10-11 (asking the Bureau and Offices, if they require providers to submit in-vehicle maps, to treat the mapped environments differently due to the variability of in-vehicle testing and have a separate challenge process for both stationary and in-vehicle coverage maps because the proposal to aggregate them together could result in a false number of failed tests if a challenger submits an in-vehicle test when outdoor stationary coverage meets or exceeds the minimum speed available); AT&T Reply at 8 (arguing that aggregating in-vehicle and stationary challenges would lead to inaccurate challenges and false negative tests because speed tests taken in-vehicle reflect vehicle signal losses); CTIA Feb. Ex Parte at 4 (arguing that traveling in a moving vehicle has a measurable impact on throughput speeds due to Doppler shift and spread, as well as dB loss as signal penetrates the vehicle, and, as such, in-vehicle tests would not accurately portray coverage availability in an outdoor stationary environment, which would make it difficult to evaluate the validity of a challenge). As described above, we will not aggregate all stationary and in-vehicle mobile tests for comparison against both maps but will evaluate stationary tests against the stationary map and in-vehicle mobile tests against the in-vehicle map. Rather than aggregating all tests, we will allow cognizable challenges to the stationary map to also create a challenge for the same area on the in-vehicle map and successful provider responses to the in-vehicle map to also refute a cognizable challenge of the same area on the stationary map. We find that this approach adequately addresses providers’ concerns about comparing tests from different modeled environments, and promotes consistency between the maps. We thus decline to adopt Vermont DPS’s recommendation to allow challengers to submit in-motion tests to challenge stationary coverage, Vermont DPS Comments at 8-9. because we do not expect in-vehicle tests to achieve the same performance as tests conducted in a stationary environment. If we did not allow for challenge or response comparison to both maps in the limited circumstances we adopt above, it would be easier for one map in an area to show a lack of coverage while the other map shows robust coverage—solely because of a lack of testing. See BDC Mobile Technical Requirements Public Notice at *7, para. 13, n.36 (explaining that “it may be significantly more difficult to establish a challenge to certain coverage data” were we to “only evaluate stationary tests against stationary maps and separately evaluate in-vehicle mobile tests against in-vehicle mobile maps”). Alternatively, were we to only count certain positive or negative tests towards the other map, such an approach could also create unintended anomalies.
For example, if only negative but not positive stationary tests counted against maps reporting in-vehicle mobile coverage, a challenger could submit 10 negative stationary tests and 1000 positive stationary tests and still successfully challenge the in-vehicle map. Such test results would indicate good coverage for the stationary map yet could still result in a challenge to the in-vehicle map because the system would only count the negative tests despite strong evidence showing good overall coverage. We conclude that the approach we adopt strikes the best balance between the competing goals of making the challenge process user-friendly and ensuring robust, accurate evidence, while largely mitigating unintended outcomes. 34. Data from speed tests taken after the “as-of” date of the initial BDC data collection will be considered as part of the challenge process upon confirmation that they meet the validation criteria See supra paras. 27-30. set forth herein. Accordingly, once the Commission has generated maps of the data collected from providers, the BDC system will analyze all previously submitted tests to determine whether they were taken after the “as-of” date of the maps and to perform the data validations discussed further below, including whether they were taken within the published coverage area claimed by the applicable provider. Speed tests submitted as part of the challenge process that do not meet these qualifications will be considered crowdsourced data. BDC Mobile Technical Requirements Public Notice at *19, para. 52. Validated speed test results will be reconsidered on a monthly basis, in conjunction with any newly validated speed test filings, to determine whether the data meet the geographic, temporal, and testing thresholds to create a cognizable challenge to an area. Id. at Appx. A – Technical Appendix § 3.1. Such speed tests will be considered for up to one year to determine whether the data for a location subsequently meet the thresholds to be considered a cognizable challenge, and if so, the tests will be used collectively to challenge the maps that are published at that time. Id. 35. Once the maps have been published, the BDC system will analyze all submitted tests to determine whether speed tests fall within the geographic area depicted in a provider’s published coverage area. Id. at Appx. A – Technical Appendix § 2. Speed tests submitted after the “as of” date but prior to publication of the map, as well as those submitted after the publication of the maps, will be used to challenge the maps that are published at that time, subject to the restriction that speed tests are considered valid evidence for one year from the date the test was taken. During the one-year period that they remain valid evidence, speed tests may initially be excluded from consideration in the challenge process because the speed tests fell outside of the provider’s reported coverage maps, but may later be included when the system reconsiders the challenge data each month, due to subsequent publication of maps reporting coverage where such tests are located. For example, if a challenger submits otherwise valid speed tests that were conducted in July in an area reported by the provider to not have coverage in its maps that are “as of” the previous December 31, such tests would be initially excluded.
If the coverage maps submitted by the provider “as of” June 30 and published in September of that year do report such areas as covered, however, the tests taken in July would be considered as valid evidence in favor of a challenge to the June 30 maps. Parties submitting speed tests to be used in the challenge process will be notified when their test has been submitted and that the test submitted may be used to create a challenge if such data meet the validation requirements. BDC Mobile Technical Requirements Public Notice at Appx. A – Technical Appendix § 3.1. Thereafter, parties that have submitted speed tests to be used in the challenge process will be notified of the status of their submitted speed tests, which will include information on whether their speed test is used in the creation of a cognizable challenge. Id. at *7, para. 15, Appx. A – Technical Appendix § 3.1. 36. Maps That Can Be Challenged. We clarify that speed test data will only be used to create challenges in areas where a provider reports that it has broadband service availability. Id. at Appx. A – Technical Appendix § 2; see T-Mobile Comments at 15-16 (asking the Commission to require apps to have sufficient functionality so that a challenger cannot submit test data from a location that a provider does not report as being served). We will, however, permit challenges to 3G, 4G LTE, and 5G-NR coverage maps. Some commenters suggest that we defer consideration of challenges to 3G maps, For example, CTIA, AT&T, and Verizon argue that the Broadband DATA Act does not require a challenge process for 3G maps and, since 3G maps will not be used by universal service programs like the 5G Fund, they should not be available to challenge. See CTIA Comments at 12-13; CTIA Reply at 5-6; Verizon Comments at 4-5; AT&T Reply at 10. T-Mobile adds that many providers are in the process of retiring 3G networks. See T-Mobile Comments at 4-5. but the Commission has classified 3G as a mobile broadband technology in previous BDC orders and has determined to allow challenges to the accuracy of mobile broadband coverage maps. 47 CFR § 1.7006(e), (f); Third Order, 36 FCC Rcd at 1141, para. 36; Second Order and Third Further Notice, 35 FCC Rcd at 7476, para. 38. We reiterate that mobile providers are required to submit propagation maps for their 3G networks in all areas where they provide 3G service even if they also provide 4G LTE and/or 5G-NR service in the same area. 47 CFR § 1.7004(c)(3)(i). Since the Commission did not delegate to the Bureau and Offices the authority to limit challenges to certain technologies, we lack the discretion to limit challenges to only 4G LTE and 5G-NR maps. Moreover, doing so could exclude certain consumers from the challenge process. For example, consumers rely on 3G in areas where 4G LTE and 5G-NR are not offered by the provider or are otherwise unreliable, and subscribers in rural areas continue to use 3G at higher concentrations than in other parts of the country. See Second Order, 35 FCC Rcd at 7481, para. 47 & n.131. See generally Form 477 data, https://www.fcc.gov/mobile-deployment-form-477-data. We note that, when a provider retires a given mobile broadband technology such as 3G, that service would not be included on its updated coverage maps and therefore would not be available for challenges. However, until providers retire a particular broadband network technology, they will be obligated to respond to challenges to their claims of coverage for that technology. 37.
Based on the record and the goals underlying the Broadband DATA Act, we adopt our proposal to exclude voice maps from the challenge process. BDC Mobile Technical Requirements Public Notice at *4, para. 9. The Broadband DATA Act requires the Commission to establish a process for challenging the accuracy of broadband coverage data, which, for mobile services, is defined as “the coverage maps” (i.e., the broadband maps discussed in 47 U.S.C. § 642(c)(1)) and “any information submitted by a provider regarding the availability of broadband internet access service.” See 47 U.S.C. § 642(b)(5)(A), (c)(1). Additionally, the Commission has decided that the mobile challenge process applies only to broadband (and not voice) coverage maps. 47 CFR § 1.7006(e) (“Consumers may submit data to challenge the accuracy of mobile broadband coverage maps.” (emphasis added)); id. § 1.7006(f) (“State, local, and Tribal governmental entities and other entities or individuals may submit data to challenge accuracy of mobile broadband coverage maps.” (emphasis added)). We also note that commenters raise concerns with using “speed test” data to verify voice coverage maps. See, e.g., CTIA Comments at 12; CTIA Reply at 6; T-Mobile Comments at 7; T-Mobile Reply at 4. Vermont DPS disagrees, proposing that the Bureau and Offices should set parameters for voice maps, including defining threshold upload and download speeds and a signal level that would indicate voice service is available in an area. Vermont DPS Comments at 4-5; Vermont DPS Reply at 5. We reject the Vermont DPS proposal. Vermont DPS was the only commenter to proffer minimum throughput parameters (i.e., download and upload speeds) or signal strength values necessary to support a voice call, but these values did not receive any additional record support. Although Vermont DPS recommends that the Bureau and Offices determine threshold parameters that “would be indicative of no mobile service,” it does not propose specific parameters, noting only that zero would be indicative of no service and that 256 kbps download, 64 kbps upload, or a signal strength of less than -105 decibel-milliwatts (dBm) would indicate that service is likely insufficient. Vermont DPS Comments at 4-5. We therefore decline to include voice maps as part of the mobile challenge process at this time. 38. Additionally, we reject commenters’ requests to allow challenges only to outdoor stationary coverage maps. CTIA, Verizon, T-Mobile, and AT&T ask the Commission to reconsider the in-vehicle mapping requirement altogether. CTIA Comments at 3-4, 6-11; CTIA Reply at 3-5; T-Mobile Comments at 9; AT&T Reply at 7-9; Letter from Sarah K. Leggin, Director, Regulatory Affairs, CTIA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 19-195, 11-10, at 2 (filed Feb. 4, 2022) (CTIA Feb. Ex Parte). Verizon requests that the Commission require challengers to instead conduct speed tests outside of a vehicle or building. Verizon Comments at 7-9. CTIA, Verizon, T-Mobile, and AT&T argue that the Commission should focus first on challenges to outdoor stationary maps, and defer consideration of any challenges to in-vehicle maps until after it has ruled on CTIA’s petition for reconsideration to eliminate in-vehicle coverage maps. CTIA Comments at 3, 7-8; CTIA Reply at 3-4; T-Mobile Comments at 9; Verizon Comments at 3; AT&T Reply at 7-8; CTIA Feb. Ex Parte at 3.
The Commission’s Third Order clearly directed that we collect both sets of maps, and we will not eliminate or delay the challenge process for in-vehicle maps given the importance of making the challenge process available for consumers and other entities that use mobile services in vehicles, unless the Commission determines that such maps are not necessary. CTIA, Verizon, T-Mobile, and AT&T also argue that in-vehicle maps should be excluded from the challenge process because the Commission has not established parameters for mapping in-vehicle coverage or evaluating in-vehicle challenges. CTIA Comments at 9-10; T-Mobile Comments at 9-10; Verizon Comments at 4; AT&T Reply at 8; CTIA Feb. Ex Parte at 3 (asking the Commission to seek further comment on adopting a targeted set of parameters for in-vehicle mapping). Limiting the challenge process to outdoor stationary tests and maps could reduce the utility and accuracy of the challenge process, given that many consumers use mobile services in vehicles and in motion. We recognize that many states ban handset use while driving and many vehicle operators do not have passengers. We do not intend to contravene state bans on handset use while driving, nor do we advocate for consumers to run speed tests on a personal handset while operating a vehicle. Such a limitation also would ignore a significant number of speed tests, especially on highways and in areas where it is not safe or convenient to conduct stationary speed tests. Moreover, the Commission has established sufficient parameters for mapping in-vehicle coverage and evaluating in-vehicle challenges. The Commission has allowed consumers to conduct speed tests in an in-vehicle mobile environment, but declined to adopt detailed testing requirements for in-vehicle consumer tests, See Third Order, 36 FCC Rcd at 1166, para. 102 & n.315 (declining to adopt a requirement that consumers stop a vehicle and place the testing device outside of the vehicle or connect it to an external antenna because such a requirement “would add complexity to the speed test rules we adopt for consumer challengers that would be inconsistent with the Commission’s obligation under the Broadband DATA Act to adopt a user-friendly approach that encourages participation in the challenge process”). whereas it required government and third-party challengers to submit more detailed information on tests run in in-vehicle mobile environments. Id. at 1172, para. 118 & n.343 (“Given the more complex nature of government and other entity data gathering programs, we require government and other entity challengers to submit more detail regarding speed tests that were taken in an in-vehicle mobile environment than we require for consumer challengers.”). We reiterate that all challengers must report whether the test was taken in an in-vehicle mobile or outdoor pedestrian environment; for in-vehicle tests, the speed at which the vehicle was traveling when the tests were taken (where available); and, for government and other third-party challengers conducting in-vehicle tests, whether the test was conducted with an antenna located outside of the vehicle. Id. at 1166, 1172, paras. 102 & n.315, 117 & n.343. 39. Finally, we decline to adopt Vermont DPS’s request to change the thresholds for in-vehicle tests “to account for the slight difference in performance of stationary and mobile tests” Vermont DPS Comments at 8-9. because, as discussed, we will not use in-vehicle test data to form the basis of a challenge of stationary maps. See supra para. 32.
Moreover, Vermont DPS has not given us any objective metric by which to adjust tests upward or downward for purposes of meeting the threshold when comparing the test against the other environment (i.e., Vermont DPS does not suggest any formula to accurately estimate actual performance based upon, e.g., signal strength, and thus there is no way we could translate signal strength into actual speeds). 40. We also reject suggestions that we permit challenges only in rural areas. Verizon Comments at 5 (asserting that restricting challenges to rural areas, at least initially, would “reduce the burdens on challengers, providers, and the Commission and focus the challenges on the coverage that matters for universal service purposes”). The Broadband DATA Act envisions a broad challenge process, and there is nothing in the Act that authorizes the Commission, or by extension, the Bureau and Offices, to limit the challenge process to rural areas. See 47 U.S.C. § 642(b)(5) (requiring the Commission to establish a challenge process through which consumers and other entities may challenge the accuracy of, among other things, “the coverage maps . . . [and] any information submitted by a provider regarding the availability of broadband internet access service”). 41. Grouping Valid Speed Tests by Location. After excluding speed tests that fail our validations, we will associate the location of each valid speed test with a particular underlying hexagonal cell geography based on the H3 geospatial indexing system. See BDC Mobile Technical Requirements Public Notice at *4, para. 10. H3 is an open-source project developed by Uber Technologies, Inc. that overlays the globe with hexagonal cells of different sizes at various resolutions, from zero to 15. Isaac Brodsky, H3: Uber’s Hexagonal Hierarchical Spatial Index (June 27, 2018), https://eng.uber.com/h3/. The lower the resolution, the larger the area of the hexagonal cell. BDC Mobile Technical Requirements Public Notice at *4, para. 10; see also Isaac Brodsky, H3: Uber’s Hexagonal Hierarchical Spatial Index (June 27, 2018), https://eng.uber.com/h3/. The H3 system is designed with a nested structure wherein a lower-resolution cell (the “parent” hexagon) contains approximately seven hexagonal cells at the next higher resolution (its “children,” each a “child” hexagon), which approximately fit within the “parent” hexagon. Isaac Brodsky, H3: Uber’s Hexagonal Hierarchical Spatial Index (June 27, 2018), https://eng.uber.com/h3/ (“H3 supports sixteen resolutions. Each finer resolution has cells with one seventh the area of the coarser resolution. Hexagons cannot be perfectly subdivided into seven hexagons, so the finer cells [i.e., the ‘children’] are approximately contained within a parent cell. The identifiers for these child cells can be easily truncated to find their ancestor cell at a coarser resolution, enabling efficient indexing.”). Because of this nested structure, using the H3 system to group speed tests allows for challenges at multiple levels of granularity, which, as discussed below, enables challengers in rural areas where broadband coverage may be more sporadic to contest larger areas if aggregated speed test data demonstrate a lack of coverage within a sufficient number of child hexagons. See infra para. 49.
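For concreteness, the sketch below shows how test locations can be grouped with the open-source H3 library (here the Python bindings, assuming the v4 API); the resolution 8 grouping and the four-of-seven roll-up reflect the thresholds adopted in this Order and discussed below.

```python
import h3  # open-source H3 bindings; the v4 Python API is assumed here

def index_tests(test_points, resolution=8):
    """Count speed tests per H3 cell at the given resolution."""
    counts = {}
    for lat, lng in test_points:
        cell = h3.latlng_to_cell(lat, lng, resolution)
        counts[cell] = counts.get(cell, 0) + 1
    return counts

def challenged_parent_hexes(challenged_res8_cells):
    """Roll challenged resolution 8 cells up to their resolution 7 parents.

    A parent hexagon is treated as challenged when at least four of its
    approximately seven child hexagons are themselves challenged.
    """
    children_by_parent = {}
    for cell in challenged_res8_cells:
        parent = h3.cell_to_parent(cell, 7)
        children_by_parent.setdefault(parent, set()).add(cell)
    return {p for p, kids in children_by_parent.items() if len(kids) >= 4}

# Example: index a single test taken near Washington, D.C.
print(index_tests([(38.8838, -77.0230)]))
```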
The nested structure includes 16 total H3 resolutions of hexagons ranging in average area size from approximately 4.25 million square kilometers to 0.9 square meters. H3, Table of Cell Areas for H3 Resolutions, https://h3geo.org/docs/core-library/restable/ (last visited Jan. 27, 2022). As proposed, the smallest cognizable challenge will be to a single resolution 8 hexagonal cell, which has an area of approximately 0.7 square kilometers. BDC Mobile Technical Requirements Public Notice at *4, para. 10; see also H3, Table of Cell Areas for H3 Resolutions, https://h3geo.org/docs/core-library/restable/ (last visited Jan. 27, 2022). 42. Some commenters support the use of hexagons to evaluate challenges but recommend basing challenges on a different hexagonal cell size. AT&T Reply at 5 (“AT&T generally supports the proposed hexagonal cell area process.”); CTIA Comments at 3-4 (“CTIA generally supports the use of the H3 system . . . .”); T-Mobile Comments at 3 (“[T]he H3 indexing system is a reasonable solution overall because it is an open-source platform that allows stakeholders to use a common framework for assessing the accuracy of coverage data.”); Vermont DPS Comments at 5 (“VTDPS supports the proposed use of the H3 geospatial indexing system with certain necessary additions.”). While Vermont DPS generally supports the proposed use of H3 indexing, it argues that the system is not intuitive to use and asks the Commission to create and share GIS layers for the H3 hexagons at all resolutions it intends to employ in the coverage analysis, which we have already done. Vermont DPS Comments at 5. We have posted GIS layers for the H3 hexagons proposed for grouping speed tests in the BDC mobile challenge and verification processes on our website. Broadband Data Collection Resources, https://www.fcc.gov/BroadbandData/resources#data-reports (last visited Feb. 7, 2022). CTIA, T-Mobile, and AT&T urge us to use smaller resolution 10 hexagons instead of resolution 8, contending that hexagons at resolution 10 better match the 100-meter resolution providers must use when submitting their coverage maps. CTIA Comments at 13-14; CTIA Reply at 7-8; T-Mobile Comments at 4-6, 13-14 (asking the Commission to use resolution 10 hexagons in the challenge process, including the geographic threshold); T-Mobile Reply at 5; AT&T Reply at 4-5; see Second Order and Third Further Notice, 35 FCC Rcd at 7477-78, para. 40. RWA and Vermont DPS, meanwhile, recommend allowing challenges to resolution 6 and 7 hexagons in rural areas, which RWA notes are often difficult to test because of a lack of accessible roads. RWA Comments at 6-12; RWA Reply at 2; Vermont DPS Reply at 3; Letter from Carri Bennet, General Counsel, RWA and Alex Espinoza, Regulatory Counsel, RWA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 19-195, at 2 (filed Nov. 19, 2021) (RWA November 2021 Letter); but see CTIA Reply at 2 (asking the Bureau and Offices to reject RWA’s suggestion to use resolution 6 and 7 hexagons). 43. We find that resolution 8 strikes an appropriate balance as the smallest resolution for a cognizable challenge. Smaller areas (e.g., resolution 9 or 10) could result in many disparate challenges that may require excessive testing by providers and, in the case of resolution 10 hexagons, may exceed the granularity of propagation maps that were not designed to provide such precision. Coverage maps must be submitted at a resolution of 100 meters (i.e., 0.1 km) or better.
See 47 CFR § 1.7004(c)(3)(iii). Therefore, allowing for challenges to an area as small as a resolution 10 hexagonal cell, which is smaller than the 100-meter map resolution, may instead reflect inaccuracies due to the resolution at which the provider generated its maps. Larger areas (e.g., resolution 6 or 7 hexagons), on the other hand, would require significantly more testing for challengers and make it difficult to verify coverage in distinct local areas. See Appx. A – Technical Appendix §§ 2.1, 3.1. For example, a resolution 7 hexagon would require four to seven times as many tests as a resolution 8 hexagon to create a successful challenge. The Commission directed staff to determine the requisite number of tests and define the geographic boundaries of cognizable challenges while satisfying the goals of both “encourag[ing] consumers to participate in the challenge process and assuring that providers are not subject to the undue cost of responding to a large number of challenges to very small areas.” Third Order, 36 FCC Rcd at 1167-68, paras. 105-06. We are not persuaded that allowing challenges to areas smaller than the 100-meter resolution requirement (i.e., areas as small as a resolution 10 hexagon) would adequately meet these goals. Using areas smaller than a resolution 8 hexagon would additionally make it difficult for consumers to reach the threshold of cognizable challenges. See Appx. A – Technical Appendix §§ 2.1, 3.1. A challenger would need to take many more tests in the smaller hexagons to achieve the statistical significance required. Use of particularly small areas also would likely make in-motion testing for both challengers and providers impossible. A car traveling at 35 mph without stops passes through a resolution 8 hexagon in roughly 50 seconds or less and through a resolution 9 hexagon in roughly 20 seconds or less (and in half that amount of time if starting from the center of the hexagon). As noted above, a typical FCC speed test cycle takes 23-31 seconds to complete. In the future, we may consider using hexagonal cells at a higher resolution if it becomes necessary to correct coverage errors at a more granular level. 44. RWA and Verizon assert that the use of the H3 geospatial indexing system would present implementation issues. Verizon Comments at 12; RWA Comments at 13. RWA cautions that third-party network maps, which providers may use to supplement the data used to rebut challenges, may not be compatible with the H3 geospatial indexing system. RWA Comments at 13. Verizon also raises concerns that providers would need to develop new tools and systems for managing speed tests and evaluating data in an H3-based environment and notes that tracking and evaluation may be complicated because child cells will not nest precisely into their parent cell. Verizon Comments at 12. These concerns do not warrant deviations from our proposal since parties seeking to rebut challenges do not need to conform their tools or data to the H3 indexing system. The BDC system itself will overlay submitted speed test points with the H3 hexagons; providers need only submit their speed test data and the BDC system will appropriately index them (so long as the data otherwise meet the specifications and test requirements to qualify as valid on-the-ground speed test data). Moreover, H3 is an open-source indexing system, and therefore we do not anticipate it being overly expensive or burdensome for providers to access.
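Parenthetically, the traversal-time figures in paragraph 43 can be checked with a short computation; this is our own arithmetic, assuming a regular hexagon with the approximately 0.7 square kilometer area of a resolution 8 cell:

```python
import math

AREA_M2 = 0.7e6            # approximate area of a resolution 8 hexagon
SPEED_MPS = 35 * 0.44704   # 35 mph expressed in meters per second

# For a regular hexagon of area A, the edge length is sqrt(2A / (3*sqrt(3))).
edge_m = math.sqrt(2 * AREA_M2 / (3 * math.sqrt(3)))
flat_to_flat_m = edge_m * math.sqrt(3)  # crossing between opposite edges
vertex_to_vertex_m = 2 * edge_m         # longest possible crossing

print(f"flat-to-flat crossing:     {flat_to_flat_m / SPEED_MPS:.0f} s")      # ~57 s
print(f"vertex-to-vertex crossing: {vertex_to_vertex_m / SPEED_MPS:.0f} s")  # ~66 s
# Most chords are shorter than a full central crossing, consistent with
# the "roughly 50 seconds or less" figure above.
```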
Finally, in response to Verizon’s argument that the tracking and evaluation of speed test data will be complicated because child cells will not nest precisely into their parent cell, we note that speed tests will be evaluated based on the resolution 8 hexagon within which a test falls. 45. CPUC and Public Knowledge/New America assert that submitting speed test data under the H3 system using resolution 8 hexagons would be more burdensome and expensive, and would result in fewer challenges, because challengers would need to gather statewide measurements in each resolution 8 hexagon. CPUC Comments at 14-16; Public Knowledge/New America Reply at 4. We disagree. First, challengers will not need to submit speed tests in every resolution 8 hexagon in a state because challenge data cannot form the basis of a cognizable challenge in areas where a provider does not claim coverage. Challengers will be aware of the areas in which a provider does not claim coverage from the publicly available mobile broadband map and can avoid the burden and expense of conducting speed tests in those areas. Second, as discussed, we will combine, according to the tested environment, valid speed tests conducted by consumers, state, local, and Tribal governments, and other entities. This likely will reduce the number of speed tests that any one challenger needs to submit to create a challenge. The testing thresholds reflect the total number of speed tests needed to create a cognizable challenge, not necessarily the number of speed tests that must be submitted by an individual consumer or entity. Third, CPUC’s concerns ignore our decision to allow testers to challenge larger geographic areas, such as resolution 7 hexagons or resolution 6 hexagons, when at least four of the seven child hexagons of the parent hexagon are challenged. See infra para. 49; BDC Mobile Technical Requirements Public Notice at *5, para. 12. Testers will be able to see which areas have been challenged and if, for example, four or more of the seven child resolution 8 hexagons in a resolution 7 hexagon are challenged, then the entire resolution 7 hexagon will be considered challenged. In this case, there would be no need to conduct further speed tests in the remaining resolution 8 hexagons in order to create a challenge to the resolution 7 hexagon. Finally, H3 indexing will not burden testers because it will serve as an “under the hood” way for the Commission to group and analyze speed tests submitted by testers at various times and places. 46. We will evaluate all valid challenger speed tests that present evidence about the service of a given technology and environment within each hexagon to determine whether to create a cognizable challenge to the coverage in that area. BDC Mobile Technical Requirements Public Notice at *5, para. 11. We did not receive any comments on this proposal. We also adopt the alternative approach proposed in the BDC Mobile Technical Requirements Public Notice to evaluate the download and upload components of each speed test individually rather than evaluating them jointly. See id. (proposing to categorize speed tests as “positive” or “negative” based upon whether a speed test meets or does not meet the minimum predicted download or upload speed that the mobile service provider reports as available at the location where the test occurred and proposing an alternative to evaluate download and upload speed test components independently).
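To make the component-level evaluation concrete, the following Python sketch (illustrative only) classifies each download and upload component against the minimum speeds in section 1.7004(c)(3) of the Commission’s rules and assigns each component to a resolution 8 hexagon using the start/end midpoint rule adopted in paragraph 47 below. The sketch assumes the h3-py version 3 API (geo_to_h3; the equivalent function is named latlng_to_cell in version 4) and a simple arithmetic midpoint, which is adequate over speed-test distances:

import h3

# Minimum speeds for 4G LTE under 47 CFR 1.7004(c)(3): 5 Mbps down, 1 Mbps up.
MIN_MBPS = {"4G LTE": {"download": 5.0, "upload": 1.0}}

def classify_component(measured_mbps, technology, direction):
    # Label one component "positive" or "negative" against the minimum speed
    # the provider reports as available where the test occurred.
    return "positive" if measured_mbps >= MIN_MBPS[technology][direction] else "negative"

def component_cell(start_latlng, end_latlng, resolution=8):
    # Associate a component with the hexagon containing the midpoint of its
    # reported start and end coordinates; download and upload components are
    # located independently and may fall in different point-hexes.
    mid_lat = (start_latlng[0] + end_latlng[0]) / 2.0
    mid_lng = (start_latlng[1] + end_latlng[1]) / 2.0
    return h3.geo_to_h3(mid_lat, mid_lng, resolution)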
Under this approach, each component will be categorized as either “positive” or “negative” based on whether the component is consistent with the provider’s modeled coverage (i.e., the coverage assumptions in providers’ BDC propagation maps). Id. A positive component is one that records speeds meeting or exceeding the minimum speeds that the mobile service provider reports as available where the test occurred. Id. For example, a positive download component for a 4G LTE speed test would show speeds of at least 5 Mbps, and a positive upload component for a 4G LTE speed test would show speeds of at least 1 Mbps. 47 CFR § 1.7004(c)(3)(i)-(v). A negative component is one that records speeds that fail to meet the minimum speeds that the mobile service provider reports as available where the test occurred. Id. For each speed test, the download component will be either positive or negative, and the upload component will be either positive or negative. The coverage map will then be evaluated for all download tests and separately for all upload tests. If a resolution 8 hexagon meets the thresholds for either upload or download tests, a challenge would be triggered. In order to rebut a challenge, a provider would need to meet the thresholds for both the upload components and download components. Speed test apps typically measure download and upload components sequentially and not simultaneously, so evaluating these components independently will better account for geographic and/or temporal variability. Id. 47. In the case where the starting and ending locations of a test are in different hexagons (e.g., because the testing device was in motion), we will associate the test with the hexagon containing the midpoint of the reported start and end coordinates for each test component. We also will use the midpoint to determine whether the test component falls within the applicable provider’s coverage map. Each test component will be point-hex dependent. Therefore, a download test could be associated with a different point-hex than an upload test, and in such cases, the two tests would be treated independently. We disagree with Ookla that we should use the start location as the single point value of a test rather than associating two locations for each data point. Ookla Comments at 6-7. We also disagree with Vermont DPS that we should use a single set of geographic coordinates at the start of each on-the-ground sequence, but we do agree with its alternative recommendation and will capture the timestamp and duration of each test component, as well as the geographic coordinates measured at the start and end of each test component with typical GPS Standard Positioning Service accuracy or better. Vermont DPS Comments at 9-10. Having start and end coordinates for each test will facilitate our verification of stationary maps versus mobile maps because it will enable us to capture the precise locations of drive tests. 48. We decline Verizon’s request to adopt additional device- and plan-specific requirements. For example, Verizon suggests requiring challengers to conduct speed tests with newer devices, requiring non-consumer challengers to use a service plan that allows for speed tests of full network performance, and excluding consumer and non-consumer speed tests from challenge calculations if the provider can show that a particular device used by the challenger was subject to reduced speeds. 
Verizon Comments at 10-11; T-Mobile Comments at 16 (arguing that providers should be able to rebut challenges by showing that the devices used for testing were not compatible with the technology relevant to the provider’s map or the provider’s spectrum). Contra RWA Reply at 8 (asking the Commission to not permit providers to rebut or discount tests conducted from “incompatible devices”). We recognize that some devices have limitations (e.g., an older device may not connect to all spectrum bands), but find that restricting the types of devices that can be used to conduct speed tests would make the challenge process less user-friendly and less accessible to consumers and non-consumers alike. At the same time, a challenger must disclose the manufacturer and model of its device so that providers will have this information when rebutting challenges and can seek to invalidate tests from devices that are not compatible with a specific network or band. Third Order, 36 FCC Rcd at 1167, para. 103. We will also allow mobile service providers to respond to a challenge with infrastructure information in situations where a mobile device used in the testing accessed the network over a data plan that could result in slower service. See infra Section III.A.2.b. Mobile Service Challenge Process, Challenge Responses, Rebutting Challenges with Infrastructure Data. Finally, the methodology we adopt for aggregating speed tests and requiring challenges to meet the thresholds described below will ensure that challenges are temporally and geographically diverse and therefore reflect a robust and representative sample of user experience, regardless of device type or subscriber plan. 49. Challenges to Larger, Lower-Resolution Hexagons. We adopt our proposal for a “parent” or “grandparent” hexagon (i.e., a hexagon at resolution 7 or 6) to be considered challenged if at least four of its child hexagons are challenged. BDC Mobile Technical Requirements Public Notice at *5, para. 12. CCA supports this proposal, CCA Reply at 7-9. while T-Mobile and Verizon argue that it could allow for challenges to very large areas even though significant portions of them have not been tested. T-Mobile Comments at 6; Verizon Comments at 13-14 (claiming that “the Commission’s proposal could permit a cognizable challenge for an entire 36 square kilometer hex-6 cell even if less than one-third of the hex-8 cells (16 out of 49) within the hex-6 cell have actually been tested”). We disagree with T-Mobile and Verizon and find that this approach will allow for the effective challenge of larger areas where an abundance of geographically diverse tests indicate a pervasive problem. Under it, a resolution 7 or 6 “parent” hexagon will be considered challenged only if more than half (i.e., at least four of seven) of its “child” hexagons are challenged. The threshold can therefore be met without testing each resolution 8 hexagon, including ones that may be practically inaccessible. See RWA Comments at 7-8 (asserting that “[o]nly 38% of all hex 8 cells in the state of Montana are road accessible”). While we decline to set the minimum size of a cognizable challenge at either resolution 7 or resolution 6 hexagons as requested by RWA, see supra paras. 42-43, we believe that the approach we adopt herein will allow for challenges covering a significant portion of otherwise inaccessible resolution 8 hexagons. 
So long as challengers submit tests meeting the thresholds in at least four of the seven resolution 8 hexagons for a “parent” resolution 7 hexagon, the remaining hexagons would be effectively covered by the challenge to the “parent,” even if these resolution 8 hexagons are inaccessible. See infra para. 51 (defining an “accessible” point-hex as one in which a provider reports coverage for at least 50% of the area of the point-hex in its reported coverage data and through which at least one road traverses). But each “child” hexagon must still meet the geographic threshold described below, which means that any challenges to larger “parent” hexagons will reflect that negative tests are persistent throughout the geographic area. We conclude that this strikes an appropriate balance between reducing the burden on challengers and ensuring that robust evidence of a problem exists before requiring a provider to respond. 50. Required Thresholds. A resolution 8 hexagon will, as proposed, be challenged when tests submitted within the hexagon meet three thresholds: geographic, temporal, and testing. BDC Mobile Technical Requirements Public Notice at *5, para. 12. We adopt the proposed geographic threshold, modified to account for our approach to evaluate each test component (i.e., download and upload) separately. Id.; see CTIA Reply at 9 (calling the geographic and numerical (i.e., testing) thresholds “necessary”); cf. T-Mobile Reply at 9 (stating that challenges “must ensure that the challenges meet the temporal, geographic, and numerical thresholds so as not to waste resources,” despite asking the Commission to use resolution 10 hexagons for the geographic thresholds in its initial comments). If the tests for a given technology in a resolution 8 hexagon meet all three thresholds, we will consider that map’s coverage to be challenged in that area. BDC Mobile Technical Requirements Public Notice at *6, para. 13. To satisfy the geographic threshold for a challenge, in general, at least four child hexagons (i.e., “point-hexes”) within the resolution 8 hexagon must each contain at least two test components of the same type (download or upload), at least one of which is negative. See id. at *5, para. 12, Appx. A – Technical Appendix § 3.1.1. The threshold must be met entirely with one component type, meaning that a challenge may rest on either two upload components per point-hex, one of which is negative, or two download components per point-hex, one of which is negative. Requiring at least four out of seven point-hexes to include two of the same test components and at least one negative test will ensure that more than half of the point-hexes within a resolution 8 hexagon show inadequate coverage. Requiring at least one negative test in multiple locations within the geographic area of a resolution 8 hexagon will demonstrate that negative tests are persistent throughout the hexagon. 51. Consistent with the Commission’s direction to consider (among other factors) “whether the tests were conducted in urban or rural areas” when setting the methodology for aggregating speed test results, we will adjust the geographic thresholds to allow challenges that account for differences in areas. Third Order, 36 FCC Rcd at 1168, para. 105. Specifically, we adopt a different geographic threshold depending on the road density of each resolution 8 hexagon.
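For illustration only, the geographic threshold for a single resolution 8 hexagon and a single component type can be sketched in Python. The required-count calculation folds in the sliding scale for hexagons with fewer than four accessible point-hexes described in the next paragraph, and the input data structures are assumptions of the sketch:

def meets_geographic_threshold(labels_by_point_hex, accessible_count):
    # labels_by_point_hex: for one component type (download or upload), maps
    # each resolution 9 point-hex to its list of "positive"/"negative" labels.
    # accessible_count: the number of "accessible" point-hexes in the hexagon.
    required = min(4, accessible_count)  # sliding scale for low road density
    qualifying = sum(
        1 for labels in labels_by_point_hex.values()
        if len(labels) >= 2 and "negative" in labels
    )
    # When no point-hex is accessible the geographic threshold is vacuous,
    # though the temporal and testing thresholds must still be met.
    return qualifying >= required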
We will relax the geographic threshold to require tests in fewer than four point-hexes when fewer than four of the point-hexes of a resolution 8 hexagon are “accessible.” We define an “accessible” point-hex as one in which the provider reports coverage for at least 50% of the area of the point-hex in its reported coverage data and through which at least one road traverses. BDC Mobile Technical Requirements Public Notice at *5, para. 12 & n.31; Appx. A – Technical Appendix § 3.1.1 and tbl. 2; see also CCA Comments at 9 (“Adopting the methodology for addressing inaccessible areas proposed in the Public Notice will ensure equity, reduce costs, and increase accuracy.”). Using the most recent U.S. Census Bureau roadway data, a point-hex would contain a road if it overlaps any primary, secondary, or local road, which are defined as MAF/TIGER Feature Class Codes S1100, S1200, or S1400, respectively. U.S. Census Bureau, 2020 TIGER/Line Shapefiles: Roads, https://www.census.gov/cgi-bin/geo/shapefiles/index.php?year=2020&layergroup=Roads (last visited Feb. 1, 2022). In order to account for road width, we will apply a small buffer around the U.S. Census Bureau road line data. No entities commented on this definition. We choose 50% of the area of the point-hex to be within the provider’s reported coverage because we want challengers to have a high likelihood of being within the coverage map when they test. We note that challengers can still test within a point-hex that is not “accessible” so long as the test falls within the provider’s reported coverage. We settle on this definition of “accessible” because without a road it becomes significantly more difficult for parties to run speed tests in a point-hex. We find that the existence of at least one road gives parties a way to access a hexagon and run speed tests. We anticipate that this approach will make it easier for challengers to establish a challenge in less densely populated areas because challengers will be permitted to show less geographic diversity among tests if there are fewer accessible point-hexes in a resolution 8 hexagon. 52. We decline to adopt Vermont DPS’s proposal to eliminate the requirement that four of the seven point-hexes within a resolution 8 hexagon meet the geographic threshold. Vermont DPS Comments at 6-7 (stating that the proposed geographic threshold requirements would effectively make it impossible to conduct drive testing within each resolution 9 hexagon when driving at a reasonable speed). Requiring a challenge to meet the geographic threshold in four of seven point-hexes ensures geographic diversity of tests and will help identify potential coverage gaps over a sufficiently wide area. Vermont DPS does not propose any alternative geographic threshold, and the record supports our conclusion that the geographic threshold is necessary to minimize the chance of anomalous results. CTIA Reply at 9 (“[The proposed geographic and numerical thresholds] ensure that a challenge is cognizable only if the data submitted is sufficiently robust to minimize the chances of anomalous results.”); see Vermont DPS Comments at 7 (stating that “VTDPS proposes to completely do away with the requirement for tests within four individual resolution 9 hexagons in order to challenge a resolution 8 hexagon,” but offering no alternative threshold). We also reject RWA’s proposal to reduce the geographic threshold for inaccessible resolution 7 hexagons or allow for a resolution 7 hexagon with low road density to automatically trigger a challenge. 
Letter from Carri Bennet, General Counsel, RWA and Alex Espinoza, Regulatory Counsel, RWA, to Marlene H. Dortch, Secretary, FCC, WC Docket No. 19-195 at 2 (filed Oct. 26, 2021) (RWA Hexagon Size Letter). See also CCA Reply at 7-9 (arguing that RWA’s concerns are “misplaced,” and agreeing that the proposal to consider resolution 6 and 7 hexagons challenged when four of the seven child resolution 8 hexagons are challenged addresses RWA’s concerns). Essentially, RWA’s proposal could result in a more lenient threshold for resolution 7 hexagons to be challenged than that which we propose for inaccessible resolution 8 hexagons, because under its proposal, “in instances where fewer than four hex cells are road accessible, wireless providers should conversely be allowed to rebut the presumption that the entire parent cell is challengeable.” RWA Hexagon Size Letter at 5. The proposal we adopt for inaccessible resolution 8 hexagons, by contrast, implements a sliding scale: when there are fewer than four accessible point-hexes in a resolution 8 hexagon, the number of point-hexes that must meet the geographic threshold equals the number of accessible point-hexes in that hexagon. See supra para. 51 and infra Appx. A – Technical Appendix § 3.1.1, tbl. 2. RWA does not appear to suggest such a sliding scale, and therefore under its proposal, if a resolution 7 hexagon has fewer than four accessible point-hexes, then the entire hexagon could be challenged using data from one or zero point-hexes. RWA would, in turn, also make it more difficult for providers to rebut challenges in such areas, because, under its proposal, if an entire resolution 7 hexagon is challenged using data from only one accessible point-hex, a provider would be required to test the remaining six non-accessible point-hexes to rebut the challenge (i.e., a provider would have to run tests in five more point-hexes than challengers, including all of those that are inaccessible by road). RWA Hexagon Size Letter at 5. By comparison, under the proposal we adopt for inaccessible resolution 8 hexagons, “the number of point-hexes for which the challenged provider would be required to make this showing will be the same as that required of the challenger if there are fewer than four ‘accessible’ point-hexes . . . within the challenged hex-8 cell.” Appx. A – Technical Appendix § 3.1.1. We believe the two proposals we adopt—1) to reduce the geographic threshold for resolution 8 hexagons with low road density, See supra para. 51. and 2) to allow a “parent” or “grandparent” hexagon (i.e., a hexagon at resolution 7 or 6) to be challenged if at least four of its child hexagons are considered challenged See supra para. 49. —adequately address RWA’s concerns. See CCA Reply at 7-9 (agreeing that the proposals to reduce the geographic threshold for inaccessible resolution 8 hexagons, combined with the proposal to consider resolution 6 and 7 hexagons challenged when four of the seven child resolution 8 hexagons are challenged, address RWA’s concerns). For example, a resolution 7 hexagon that does not contain any roads comprises seven resolution 8 hexagons that also do not contain roads. A challenger therefore would not need to meet the geographic threshold in any of the resolution 8 hexagons if none of the point-hexes contain roads.
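For illustration only, the parent- and grandparent-hexagon rule can be expressed recursively, treating a resolution 6 hexagon as challenged when at least four of its resolution 7 children are themselves considered challenged under the same rule. The sketch assumes the h3-py version 3 API (h3_to_children; named cell_to_children in version 4):

import h3

def is_challenged(cell, resolution, challenged_res8):
    # challenged_res8: set of resolution 8 cells meeting all three thresholds.
    # A resolution 7 or 6 hexagon is considered challenged when at least four
    # of its seven children are themselves considered challenged.
    if resolution == 8:
        return cell in challenged_res8
    children = h3.h3_to_children(cell, resolution + 1)
    return sum(1 for c in children
               if is_challenged(c, resolution + 1, challenged_res8)) >= 4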
Moreover, if a challenger runs tests meeting the temporal and testing thresholds in four resolution 8 hexagons and such tests show inadequate coverage sufficient to create a challenge, then the entire resolution 7 hexagon will be considered challenged. Thus, while our proposal does require challengers to meet the temporal and testing thresholds in a resolution 8 hexagon that has no accessible point-hexes, the tests do not need to be geographically diverse within each resolution 8 hexagon. We believe such a trade-off is reasonable to challenge a large geographic area. 53. We also adopt a modified version of our proposed temporal threshold. To meet the temporal threshold under the approach we adopt, each resolution 8 hexagon cell must include a set of two negative components of the same type (upload or download) whose time-of-day differs by at least four hours from that of two other negative components of the same type, regardless of the date of the tests. In other words, if the negative tests within the hexagon were ordered chronologically, regardless of the day of the tests, the difference in time between the first two tests and the last two tests must be at least four hours. The temporal threshold is evaluated across all tests within the resolution 8 hexagon and need not be met for each point-hex within the hexagon. That is, the earliest two negative tests and the latest two negative tests can be recorded in different point-hexes and still meet the temporal threshold so long as the difference in time between the two pairs of tests is at least four hours. Accordingly, because the geographic threshold for a fully-accessible resolution 8 hexagon requires at least eight tests (i.e., two each in four of the hexagon’s point-hexes, at least one of which in each point-hex must be negative) whereas the temporal threshold could be met using only four of those tests (located in any of the point-hexes), the temporal threshold would not necessarily require the challenger(s) to conduct additional testing. This threshold is different from that which we proposed in the BDC Mobile Technical Requirements Public Notice in that we now require two sets of negative tests to be temporally diverse, rather than one negative test being temporally diverse from one other test. See BDC Mobile Technical Requirements Public Notice at *5, para. 12. T-Mobile supports the adoption of the temporal threshold proposed in the BDC Mobile Technical Requirements Public Notice, and we believe our modified approach is consistent with the concepts for which T-Mobile expresses support. T-Mobile Comments at 13 (supporting the temporal threshold proposed in the BDC Mobile Technical Requirements Public Notice); T-Mobile Reply at 8-9 (arguing that a uniform temporal threshold will help filter out challenges that are based on anomalous, one-off testing, and supporting the requirement that challenges meet the temporal, geographic, and numerical thresholds so as not to waste resources). T-Mobile expresses support for the initial proposal because it would require “at least two speed tests in a given hex be conducted at different times of day and to require that these tests be taken at least four hours apart.” T-Mobile Comments at 13. The modified approach still requires that at least two tests be separated by a four-hour period, because it now requires that two sets of tests—with each set consisting of two tests—be separated by at least four hours. Verizon and AT&T generally support a temporal threshold, Verizon Comments at 12 (“[The Bureau and Offices] should . . .
adopt its proposal to require that speed tests meet “geographic” and “temporal” thresholds.”); AT&T Reply at 3-4 (supporting the establishment of specific thresholds for valid speed tests, including a temporal threshold). and agree with our determination that temporal diversity is important, but we decline to adopt their proposal to categorize tests into specific four-hour ranges. Verizon Comments at 13 (“To better reflect the variability of cell loading and ensure that any lack of coverage is persistent, the Commission should categorize tests into four-hour ranges (e.g., 6 to 10 a.m., 10 a.m. to 2 p.m.) and require that more than one temporal range have a meaningful percentage of the negative tests.”); AT&T Reply at 6-7 (arguing the Commission should categorize tests into four-hour ranges to account for time-of-day concerns such as environmental issues (humidity, fog, etc.) and cell loading factors). We disagree that categorizing tests into specific time ranges would ensure temporal diversity. For example, Verizon and AT&T’s proposal could allow a challenger to satisfy the temporal threshold with tests that have been conducted within a very short timeframe. Under Verizon’s approach, for instance, the temporal ranges that have a “meaningful percentage of the negative tests” could be 1:00 p.m. – 5:00 p.m. and 5:00 p.m. – 9:00 p.m. In such a scenario, all of the negative tests could have occurred within a very short time period (e.g., between 4:30 p.m. and 5:30 p.m.). See Verizon Comments at 13. While additional rules could be added to prevent this type of gaming (such as requiring that more than two four-hour timeframes have tests), such rules would likely increase the complexity of the challenge process, which in turn would negate any simplification that Verizon and AT&T’s approach provided and could potentially increase the burden on challengers and providers in creating and responding to challenges. However, in light of Verizon’s concerns with our initial proposal, we find that multiple tests separated by four hours, rather than one test at each end of a minimum four-hour period, are needed to show temporal diversity, and thus modify our approach to ensure temporal diversity across several tests. See BDC Mobile Technical Requirements Public Notice at *5, para. 12, Appx. A – Technical Appendix § 3.1.2; Verizon Comments at 13 (“[The temporal threshold proposed in the BDC Mobile Technical Requirements Public Notice] would establish a cognizable challenge even if substantially all of the negative tests are at the same time of day, as long as just one of the negative tests is separated from the other negative tests by at least four hours. This minimal level of temporal diversity falls short of the Commission’s goal of requiring challengers to ‘demonstrate the lack of coverage is persistent rather than temporary.’”). 54. We are also unpersuaded by Vermont DPS’s argument that we should not adopt the temporal threshold because it would require a challenger to drive test a road twice, Vermont DPS Comments at 7. and by CPUC’s argument that the temporal threshold would significantly increase costs on challengers. CPUC Comments at 16; Public Knowledge/New America Reply at 4-5 (agreeing with CPUC).
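For illustration only, the modified temporal threshold adopted in paragraph 53 reduces to a simple ordering check over the times of day of the negative components of one type within a resolution 8 hexagon; representing times as fractional hours is an assumption of this sketch:

def meets_temporal_threshold(negative_times_of_day):
    # negative_times_of_day: time of day of each negative component of one
    # type, in fractional hours (13.5 = 1:30 p.m.), with dates ignored.
    # Ordered by time of day, the earliest pair and the latest pair of
    # negative components must be at least four hours apart.
    t = sorted(negative_times_of_day)
    return len(t) >= 4 and (t[-2] - t[1]) >= 4.0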
We believe that the effort required to achieve the temporal threshold is outweighed by the need to collect a representative sample of a mobile service provider’s coverage, particularly since our decision to combine challenge data from consumers, governments, and other entities in a given area will help minimize burdens on challengers and limit the number of drive tests any one challenger will need to conduct. We conclude that our approach is a reasonable solution that will ensure challengers demonstrate persistent inadequate coverage while accounting for the temporal variability of mobile networks, such as variability due to cell loading. 55. Finally, we adopt a modified version of the proposed testing threshold to require that there must be at least five negative test components of the same type (upload or download) within the resolution 8 hexagon when 20 or fewer total challenge test components of that type have been submitted. See BDC Mobile Technical Requirements Public Notice at *5, para. 12. Consistent with the approach originally proposed, when challengers have submitted more than 20 test components of the same type in a hexagon, we will require that a certain minimum percentage of the total number of test components of the same type in that hexagon be negative, ranging from at least 24% negative when challengers have submitted between 21 and 29 total tests, to at least 16% negative when challengers have submitted 100 or more tests. Appx. A – Technical Appendix § 3.1.3. Once the percentage of negative test components of the same type submitted meets the minimum negative percentage required (for example, for a sample of fewer than 21 tests, once there are at least five negative tests submitted), we will not require additional tests so long as both the geographic and temporal thresholds for a resolution 8 hexagon have been met. Id. The failure rates we adopt were chosen to demonstrate that coverage does not reach a 90% probability threshold. We find that this 90% threshold is reasonable to use because most speed tests will be taken within the provider’s cell (rather than solely at the edge of the cell) where the cell area probability should be greater than the modeled cell edge probability of 90%, See Mobility Fund Phase II Investigation Staff Report at 22, para. 54 (explaining that “a set of speed tests taken uniformly throughout the cell area should achieve the required download speeds 92% of the time, whereas tests taken exclusively around the cell edge should achieve such speeds 80% of the time”); Connect America Fund; Universal Service Reform – Mobility Fund, WC Docket No. 10-90, WT Docket No. 10-208, Order on Reconsideration and Second Report and Order, 32 FCC Rcd 6282, 6300, para. 36 (2017); see also Christophe Chevallier et al., WCDMA (UMTS) Deployment Handbook: Planning and Optimization Aspects 33 Figure 2.6 (1st ed. 2006); D. O. Reudink, Microwave Mobile Communications 126-28 Figure 2.5-1 (William C. Jakes 2d ed. 1974). and to simplify the process, we will use the 90% threshold for tests conducted anywhere in the cell. 
To avoid the risk that the testing threshold would be skewed by a disproportionate number of tests occurring in one location within a resolution 8 hexagon, however, we adopt a modified approach such that if the number of test components of the same type in a single point-hex represents more than 50% of the total test components in the resolution 8 hexagon (where there are four or more accessible point-hexes in the hexagon), the test components in that point-hex will count only toward meeting 50% of the testing threshold. In a resolution 8 hexagon where there are only three accessible point-hexes, if the number of test components in one point-hex represents more than 75% of the total test components in the hexagon where the geographic threshold is otherwise satisfied, the test components in that point-hex will count only toward 75% of the testing threshold. If fewer than three point-hexes are accessible, we will not apply a maximum percentage of total test components for a single point-hex as the risk that testing would be skewed by a disproportionate number of tests occurring in a single location is reduced. We believe that these changes mitigate the potential bias resulting from a disproportionate number of tests occurring in one point-hex, and that this revised testing threshold will result in a greater variety of tests within each resolution 8 hexagon. See Verizon Comments at 12-13 (arguing that the proposed geographic and testing threshold could result in too many negative tests being consolidated in one point-hex). 56. Verizon, CTIA, and T-Mobile generally support the adoption of a testing threshold. Id. at 12; CTIA Reply at 9 (“The Commission’s proposed geographic and numerical thresholds—requiring challengers to submit multiple tests conducted within the hexagonal cell being challenged—are both necessary.”); T-Mobile Reply at 9 (“[T]he Commission must ensure that the challenges meet the temporal, geographic, and numerical thresholds so as not to waste resources.”). Verizon supports our evaluating challenges based on the percentage of tests in a cell that are below the relevant speed threshold, but expresses concern that the Commission’s geographic threshold “would allow cognizable challenges even if substantially all of the negative tests are in a single point-hex.” Verizon Comments at 12. The modified approach we adopt mitigates the potential problems Verizon raises because the Commission would adjust the testing threshold when a disproportionate number of tests occur in the same point-hex. T-Mobile contends that staff should adjudicate challenges based on a threshold number and percentage of “negative” tests, with a minimum of five tests for each resolution 10 hex cell and at least 50% of those negative. T-Mobile Comments at 12-13; T-Mobile Reply at 6. We decline to adopt T-Mobile’s alternative proposal because, as discussed above, we believe resolution 10 hexagons are too small for the challenge process. See supra paras. 42-43. We also find that T-Mobile’s proposal to require that 50% of tests be negative, regardless of the number of tests run, would place a high burden on challengers, and could diminish legitimate indications that coverage is unavailable in particular areas. In contrast, the thresholds for the percentage of negative tests we adopt are based on the statistical significance necessary to demonstrate lack of coverage.
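For illustration only, the challenger-side testing threshold can be sketched as follows. Only the endpoints of the negative-percentage schedule stated in this Order are shown (the full schedule appears in Technical Appendix § 3.1.3), the counts are assumed to have already been adjusted by the single point-hex down-weighting described above, and the binomial check at the end is an interpretive gloss on the 90% probability rationale rather than the Commission’s stated derivation:

from scipy.stats import binom

def meets_testing_threshold(n_total, n_negative):
    # Challenger testing threshold for one component type in one hexagon;
    # n_total and n_negative are post-down-weighting counts.
    if n_total <= 20:
        return n_negative >= 5
    if n_total <= 29:
        return n_negative / n_total >= 0.24
    if n_total >= 100:
        return n_negative / n_total >= 0.16
    raise NotImplementedError("30-99 tests: see Technical Appendix sec. 3.1.3")

# Sanity check on the 5-of-20 floor: if coverage truly held at a 90% cell
# coverage probability, five or more negative results among twenty tests
# would arise only about 4.3% of the time: binom.sf(4, 20, 0.1) ~= 0.043.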
We also decline to adopt Vermont DPS’s proposal to allow a single test, or a maximum of two tests, to be used to show inadequate coverage at multiple locations within a resolution 8 hexagon. Vermont DPS Comments at 6-7. Vermont DPS’s argument that the geographic and testing thresholds effectively prevent drive testing assumes that a challenger should be able to run all of the tests necessary to meet each threshold on a single drive through a resolution 8 hexagon. If challengers find that they would have to drive at a slow pace to complete an in-vehicle test within a resolution 9 hexagon, they may instead periodically stop to run tests in a stationary manner before moving on to the next resolution 8 hexagon. We anticipate that government and other third-party testers can use software that overlays the H3 indexing system and/or providers’ published maps on a drive test map and may therefore know whether they are keeping within a hex or moving into another one while doing a test. We note, however, that this may not be necessary since we will be combining challenges from consumers, governments, and other entities in a given area, which would lessen the number of drive tests any one challenger will need to conduct. For this same reason, we disagree with the CPUC that the testing threshold will be extremely expensive and require complicated coordination of efforts. CPUC Comments at 17-18. As discussed, we will aggregate challenges from multiple sources and no one entity will be required to conduct all tests needed to challenge a particular geographic area. 57. User-Friendly Challenge Process. AT&T concurs with our assessment that the challenge process we proposed is reasonable and user-friendly and supports the overall framework, including the use of the H3 geospatial indexing system. AT&T Reply at 16 (stating that “[o]verall, the proposed challenge process is reasonable and user-friendly”); id. at 4-6 (agreeing that use of the H3 system is user-friendly but asking the Commission to base challenges on resolution 10 hexagons rather than resolution 8 hexagons). AT&T also agrees with the Bureau and Offices’ proposal to consider multiple successful challenges in a higher resolution area (e.g., resolution 8 hexagons) as a challenge to the larger parent hexagon (e.g., resolution 7 hexagon). AT&T Reply at 6, n.12 (finding the proposal to be “both reasonable and user-friendly”). In addition, CTIA, T-Mobile, and AT&T agree that the proposal to combine test data from consumers, governments, and other entities is user-friendly and reduces burdens on challengers, who will not be required to collect and submit every drive test needed to sustain a challenge on their own. Id. at 4; CTIA Comments at 15; T-Mobile Comments at 12. Although Public Knowledge/New America raise concerns about whether the challenge process is sufficiently user-friendly, they share our belief that the challenge process should be as streamlined and burden-free as possible for consumers and other entities; we note that our implementation of the consumer challenge process is consistent with the Third Order’s determination that challengers will collect and submit all speed test data needed to support a challenge, including the new speed test metrics and parameters we adopt, through the FCC Speed Test app or another app approved by OET to collect and submit challenge data to the Commission.
See Letter from Michael Calabrese, Amir Nasr, New America Open Technology Institute, Greg Guice, Jenna Leventoff, and Kathleen Burke, Public Knowledge, to Marlene Dortch, Secretary, FCC, WC Docket Nos. 19-195, 21-31, 21-93, at 2-3 (filed Aug. 25, 2021) (Public Knowledge/New America August Letter) (“It should not be necessary to hire expensive consultants to file a challenge to the maps. . . . The Commission should take into account that [low-income and rural Americans] will not necessarily have easy access to high-capacity fixed broadband connections and computers needed to navigate a multi-step process to challenge . . . a mobile carrier’s faulty maps.”). See Third Order, 36 FCC Rcd at 1166-67, para. 103 (explaining that the FCC Speed Test app or other apps approved by OET for use in the challenge process will automatically collect metrics associated with each speed test to ensure the challenge submission format is user-friendly). 58. We disagree with commenters that argue that our challenge process is not “user-friendly.” See, e.g., CPUC Comments at 16-17; PAgCASA Reply at 2; Public Knowledge/New America Reply at 3-4. RWA argues that the testing process is not “user-friendly” because consumers can test only the networks their handsets are authorized to use. RWA Comments at 16 (“For rural areas where coverage is lacking, RWA members—and all consumers, government entities or other third parties—must therefore purchase handsets from a particular carrier to challenge that carrier’s maps, an incredible burden that will render the Commission’s BDC challenge process moot under the Commission’s proposal.”).   It recommends requiring providers to allow tests by other networks’ subscribers. Id. at 16-17. RWA argues that “mobile user[s] with poor coverage should have the ability to shop around for a better provider by testing any other providers’ existing coverage in advance, before he or she purchases a costly and lengthy mobile phone contract with poor coverage.” RWA Comments at 16. We reject RWA’s proposal for the reason stated in this paragraph, and because the public-facing maps, not the challenge process, exist to give consumers a complete picture of where all providers offer network coverage. Public Knowledge/New America also argue that consumers should be able to use the challenge process as a way to compare coverage availability among providers, and ask the Commission to make the challenge process more “consumer-friendly” by making challenges publicly available “in an easy-to-search format by geography (e.g., by state, locality, zip code) as well as by provider. Customers paying for mobile service should be able to easily check if others in their area believe they are experiencing degraded service compared to what their carrier is reporting to the Commission and join in such complaints by filing a challenge.” Public Knowledge/New America August Letter at 3. Pursuant to the Third Order, “the Commission will make public the information about the location that is the subject of the challenge (e.g., street address and/or coordinates (latitude/longitude)), the name of the provider, and any relevant details concerning the basis of the challenge.” Third Order, 36 FCC Rcd at 1174, para. 125. All other challenge information, such as individual contact information, will be kept private. Id. at 1175, para. 125. 
Between the publicly available information on challenges and the public-facing maps that show mobile broadband providers’ predicted coverage, we believe that consumers will have sufficient data to use to compare mobile providers’ broadband availability. The Commission has already determined that consumer challengers must submit certain identifying information, including that they are a subscriber or authorized user of the provider being challenged, to deter frivolous filings, Third Order, 36 FCC Rcd at 1166, para. 101; 47 CFR § 1.7006(e)(1)(iv). and the Bureau and Offices were not delegated authority to change this requirement. Similarly, Vermont DPS recommends requiring providers to temporarily provide approved devices with post-paid service at no or reduced cost to governmental entities wishing to engage in a challenge. Vermont DPS Comments at 11-12. We decline to adopt Vermont DPS’s request because we lack the authority to subsidize government challenges and believe it would be too burdensome to require providers to establish and bear the costs of such programs. Enablers argues (and Public Knowledge/New America agree) that “‘testing parameters that amount to an exceedingly high burden of proof for consumers and other parties’ run ‘contrary to the Broadband DATA Act and [the Commission’s] own policy goals.’” Public Knowledge/New America Reply at 3 (quoting Enablers Comments at 6); see also Enablers Comments at 5-6 (asking the Commission to adopt its own sampling service to be used to challenge and verify providers’ coverage maps). Public Knowledge/New America accordingly encourage the Bureau and Offices to consider “allow[ing] the option to use other trusted sources to challenge providers’ claims.” Public Knowledge/New America August Letter at 2. The Precision Ag Connectivity & Accuracy Stakeholder Alliance (PAgCASA) similarly claims that the proposed challenge process “delineates a series of technical and non-technical steps [m]obile customers must initiate and successfully navigate when conducting their [c]hallenge process that . . . falls well short of being easy to use from a customer’s perspective.” PAgCASA Reply at 2-3. These commenters also raise many issues that were already decided in the Third Order (e.g., subscriber certifications and testing methodology and metrics) See 47 CFR § 1.7006(e)(1)(iv) (requiring a consumer to certify that the challenger is a subscriber or authorized user of the provider being challenged); Third Order, 36 FCC Rcd at 1166, para. 101; 1164-65, paras. 98-99 (adopting a proposal for the challenge process to rely on speed testing); 1166-67, para. 103 (specifying information that apps used in the challenge process will collect); see generally Second Order, 35 FCC Rcd at 7474-83, paras. 32-51 (establishing the processes and parameters by which providers would submit propagation maps to the Commission); Third Order, 36 FCC Rcd at 1164-75, paras. 97-125 (establishing a general framework for consumers and government and other entities to submit on-the-ground test data to challenge providers’ propagation maps).
and are not delegated to the Bureau and Offices, or urge the Bureau and Offices to ignore the instructions given by the Commission, See, e.g., CPUC Comments at 3-13 (“The FCC’s proposed approach begins with a presumption that the coverage maps providers submit will be correct, unless successfully challenged by test data submitted by consumers or government entities.”), 14 (suggesting that, until provider coverage maps are proven to be statistically accurate, the FCC lower the burden of accuracy for on-the-ground challenge data, and instead, should shift the burden back to providers to conduct their own on-the-ground testing), 18 (urging the Commission to allow interpolations of test data submitted by government entities to constitute cognizable challenges); Public Knowledge/New America Reply at 2 (arguing there is a “clear need for the Commission to proactively validate availability information rather than outsourcing the task to the public”); RWA Comments at 15-16 (asking the Bureau and Offices to require providers to allow consumers to submit speed tests on all networks, including those of which they are not subscribers). See infra Section III.E. Other Matters (e.g., rejecting calls to introduce new mapping requirements on providers, and rejecting requests to allow for interpolations to be used in place of on-the-ground speed tests in the challenge and verification processes). and would have been more appropriately filed as a petition for reconsideration of the Third Order. We reject the arguments of these commenters as untimely because they should have been filed as petitions for reconsideration to the extent that they raise issues already decided by the full Commission. Under section 405(a) of the Communications Act of 1934, as amended, any party in a proceeding may file a petition for reconsideration within thirty days of public notice of the decision. See 47 U.S.C. § 405(a) (“A petition for reconsideration must be filed within thirty days from the date upon which public notice is given of the order, decision, report, or action complained of.”); 47 CFR § 1.429(d) (specifying that a petition for reconsideration of final orders in rulemaking proceedings should be filed within 30 days from the date of public notice of such action). These commenters raise issues that were decided by the Commission in the Third Order, which was published in the Federal Register on April 7, 2021. See 86 Fed. Reg. 18124 (Apr. 7, 2021). This publication date means that the deadline for filing a petition for reconsideration of the Third Order was May 7, 2021. Because these commenters did not file their comments until September 2021, the Bureau and Offices find that the arguments are untimely and would have been more appropriately filed as petitions for reconsideration. See 47 CFR § 1.429(l)(9). 59. In conclusion, while the challenge processes and methodologies we adopt are by necessity detailed and technical, so as to assure that accurate and rigorous measurements are supplied to challenge providers’ claimed broadband coverage, As discussed, the Bureau and Offices were instructed to implement a number of complex and complicated tasks, among them, developing thresholds for determining when a cognizable challenge has been met, a procedure for resolving challenges, and adopting additional testing requirements if necessary. Third Order, 36 FCC Rcd at 1167-68, 1170, 1172-73, paras. 105-06, 110-11, 118, 120.
These obligations were delegated by the Commission within the context of the Broadband DATA Act, which requires the Commission to consider user-friendly challenge submission formats, reducing the time and expense burdens on consumers submitting challenges and providers responding to them, while at the same time considering lessons learned from the challenge process established under Mobility Fund Phase II, and the costs to consumers and providers resulting from a misallocation of funds because of a reliance on outdated and inaccurate maps. 47 U.S.C. § 642 (a)(5)(B)(i)(I)-(VI). Indeed, financial assistance for underserved areas may, in the future, be based on updated Commission maps. RWA November 2021 Letter at 3. Therefore, we find that the processes we adopt strike an appropriate balance, within the authority delegated to us by the Commission, to ensure the challenge process is easy to use and accessible for consumers and government and other entities and also results in high-quality challenges that will accurately correct any errors associated with providers’ reported coverage maps. the Commission and Bureau and Offices have minimized the burdens placed on challengers by providing a user-friendly means for challengers to run speed tests using their mobile devices and submit all data via either the FCC Speed Test app or another OET-approved third-party app. 2. Challenge Responses 60. Notification of Challenges. We adopt the BDC Mobile Technical Requirements Public Notice’s proposed procedures for notifying service providers of cognizable challenges filed against them and for notifying challengers and providers of results of challenges. BDC Mobile Technical Requirements Public Notice at *7, paras. 15-16. The BDC Mobile Technical Requirements Public Notice proposed that challenged mobile service providers would be notified via the online portal at the end of each calendar month of the hexagons that are subject to cognizable challenges. Id. at *7, para. 16. CTIA and T-Mobile express support for our proposal. CTIA Comments at 3, 15; T-Mobile Comments at 14. We find this approach will help create a manageable process for providers by providing them with a standard set of deadlines rather than an erratic and potentially unpredictable set of innumerable deadlines for rebuttals that begin as soon as any given discrete area becomes challenged. We also adopt our proposal for mobile service providers and challengers to be notified monthly of the status of challenged areas, and parties will be able to see a map of the challenged area, and a notification about whether or not a challenge has been successfully rebutted, whether a challenge was successful, and if a challenged area was restored based on insufficient evidence to sustain a challenge. BDC Mobile Technical Requirements Public Notice at *7, para. 15. In the Third Order, the Commission directed that challenge and crowdsource data other than the location that is the subject of the challenge, the name of the provider, and details concerning the basis for the challenge must be kept private to protect challengers’ privacy interests. Third Order, 36 FCC Rcd at 1174, para. 125.  Accordingly, before a service provider receives access to crowdsourced or challenge data, it will be required, within the BDC system, to acknowledge that it will use personally identifiable information that it receives for the sole purpose of responding to the challenge and that it will protect and keep private all such personally identifiable information. 
Such personally identifiable information may include challenger contact information, device information, and network information, as well as other personally identifiable information included in addition to evidence that a challenger submits. 61. Timeframe for Responding to Challenges. In the Third Order, the Commission determined that providers must either submit a rebuttal to a challenge or concede a challenge within 60 days of being notified of the challenge. Third Order, 36 FCC Rcd at 1168, 1173-74, paras. 107, 121; 47 CFR § 1.7006(e)(3), (f)(4). Consistent with the Third Order, if the challenged provider concedes or fails to submit data sufficient to overturn the challenge within 60 days of notification, it must revise its coverage maps to reflect the lack of coverage in the successfully challenged areas. Third Order, 36 FCC Rcd at 1170, para. 112 (“[I]n cases where a mobile service provider concedes or loses a challenge, the provider must file, within 30 days, geospatial data depicting the challenged area that has been shown to lack service. Such data will constitute a correction layer to the provider’s original propagation model-based coverage map, and Commission staff will use this layer to update the broadband coverage map.”); accord 47 CFR § 1.7006(e)(6), (f)(7); Third Order, 36 FCC Rcd at 1174, para. 124. 62. In comments on the BDC Mobile Technical Requirements Public Notice, CCA argues that the Bureau and Offices should allow providers to seek a waiver of the 60-day deadline if the provider needs additional time to submit on-the-ground data due to unforeseen events or weather. CCA Comments at 13. Verizon contends that providers should be able to choose to seek either: (1) a waiver of rules that limit the permitted uses of infrastructure data or transmitter monitoring software in lieu of speed tests; or (2) a waiver of the 60-day deadline if the provider will rebut with speed test data. Verizon Comments at 17-18. The Commission adopted the requirement that providers submit a rebuttal or concede a challenge in the Third Order based on its determination that permitting 60 days to respond to a challenge would make the challenge process more manageable for providers, while also providing for speedy resolution of challenges consistent with the requirements of the Broadband DATA Act. Third Order, 36 FCC Rcd at 1168, para. 107. The Bureau and Offices do not have authority to change the required timeframe for provider responses. To the extent that a provider may wish to seek a waiver of the 60-day deadline for responding to a challenge in any individual case, it may do so under the Commission’s generally applicable waiver rules. 47 CFR § 1.3. 63. Future Challenges in Successfully Rebutted Areas. We adopt our proposal to make any areas where a provider has demonstrated sufficient coverage in a challenged area ineligible for subsequent challenge until the next biannual broadband availability data filing at least six months after the later of either the end of the 60-day response period or the resolution of the challenge. BDC Mobile Technical Requirements Public Notice at *8, para. 18. This ineligibility applies only with respect to the particular network technology and modeled environment for which the provider has demonstrated sufficient coverage. We deny Verizon and AT&T’s request that the Bureau and Offices make successfully rebutted areas exempt from future challenges for a period of three years. 
Verizon Comments at 18 (stating that “it is highly unlikely that coverage will be reduced due to such changes, and even less likely that coverage will be reduced in less than a year” and arguing that “[t]he burden imposed on a provider that must repeatedly rebut challenges to the same area far outweighs the minimal chance that coverage will deteriorate after a provider rebuts a challenge”); AT&T Reply at 12 (agreeing with Verizon that “exempting an area from a subsequent challenge for a period longer than six months is appropriate” and that “it is highly unlikely that the coverage in future years and map filings will be reduced, and even less likely that it will be reduced in less than a year” ). We find that preventing future subsequent challenges for a period as long as three years could result in less accurate maps due to changes over time in technology and coverage. We find instead that limiting subsequent challenges for at least six months after the resolution of the challenge strikes an appropriate balance between avoiding a requirement that providers repeatedly confirm the same areas while ensuring that challengers have the opportunity to submit data regarding changed conditions. Although commenters assert that it is unlikely that coverage will be reduced in an area that was subject to challenge, Verizon Comments at 18. an area that is subject to repeated cognizable challenges may highlight that significant technical issues continue to affect the availability of broadband service in that area. Permitting a subsequent challenge in these areas will help ensure that the Commission receives the most accurate and up-to-date coverage data reflecting consumers’ on-the-ground experience. In any area in which a provider does not overturn the challenge but which is otherwise no longer considered challenged (e.g., where, as a result of data submitted by the provider there is no longer sufficient evidence to sustain the challenge to that area but the provider’s data fall short of confirming coverage in the area), the coverage area will be restored to its pre-challenge status and will be eligible for future challenges against it. BDC Mobile Technical Requirements Public Notice at *7, para. 15. a. Rebutting Challenges with On-the-Ground Data 64. We adopt our proposal from the BDC Mobile Technical Requirements Public Notice that, when a challenged mobile service provider submits on-the-ground speed test data to rebut a challenge, the provider will be required to meet analogous thresholds to those required of challengers, adjusted to reflect the burden on providers to demonstrate that sufficient coverage exists at least 90% of the time in the challenged hexagon(s). Id. at *7, para. 17. Consistent with our proposal, the on-the-ground test data that providers submit must meet the same three thresholds required of challenger tests for both the upload and download components: (1) a geographic threshold; (2) a temporal threshold; and (3) a testing threshold, albeit with different values (i.e., the number of tests and percentages) for test data for each threshold. Id. at *8, para. 18. 65. For the geographic threshold, the provider will need to meet the same geographic threshold required of challengers, but with positive test components rather than negative test components. 
At least four point-hexes of a resolution 8 hexagon must include two download test components taken within them, at least one of which must be positive, and at least four point-hexes of a resolution 8 hexagon must include two upload test components taken within them, at least one of which must be positive, to demonstrate that adequate coverage occurs at multiple locations within the resolution 8 hexagon. Id. Fewer point-hexes may be tested when fewer than four of the point-hexes of a resolution 8 hexagon are “accessible.” As mentioned supra, a point-hex is “accessible” when “the provider reports coverage for at least 50% of the area of the point-hex in its reported coverage data and through which at least one road traverses.” See infra Appx. A – Technical Appendix at tbl. 2. CCA expresses support for this approach. CCA Reply at 8-9 (noting that “parties serving inaccessible areas would not have to meet the same high threshold numbers of tests as in accessible areas”). In order to rebut a challenge, a provider would need to meet the thresholds for both the upload and download components of each speed test. We adopt a modified version of our proposed temporal threshold. To meet the temporal threshold under the approach we adopt, each resolution 8 hexagon will need to include a set of five positive download components with a time-of-day difference of at least four hours from another set of five positive download components, regardless of the date of the test, and a set of five positive upload components with a time-of-day difference of at least four hours from another set of five positive upload components, regardless of the date of the test. We modify the threshold proposed in the BDC Mobile Technical Requirements Public Notice because we find that requiring more tests to be separated in time will help ensure that there is more consistent temporal diversity across several tests. For the testing threshold, we adopt our proposal that challenged providers must demonstrate statistically significant evidence that coverage is adequate to overturn a challenge using on-the-ground speed tests, based on the same statistical significance analysis used for determining challenges for both upload and download components. Specifically, in order for the testing threshold for a resolution 8 hexagon to be met, we require that at least 17 positive test components of the same type have been taken in the hexagon when the provider has submitted 20 or fewer test components of that type. BDC Mobile Technical Requirements Public Notice at *8, para. 18. When the provider has submitted more than 20 test components of the same type, we require that a certain minimum percentage of the total number of test components of that type in the hexagon be positive, ranging from at least 82% positive, when providers have submitted between 21 and 34 total test components of the same type, to at least 88% positive, when providers have submitted 100 or more test components of the same type. Id. The positive test rates we adopt were chosen to demonstrate that coverage does reach a 90% probability threshold, as opposed to the requirement that challengers demonstrate coverage does not reach a 90% probability threshold.
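To illustrate how these three provider-side thresholds interact, the following sketch (in Python) evaluates them for a single resolution 8 hexagon and a single component type. It is a minimal sketch, not the Commission's implementation: the TestComponent type and helper names are assumptions, the percentage schedule between the stated 82% and 88% endpoints is a placeholder (the exact schedule appears in the Technical Appendix), and the point-hex down-weighting adjustment described in the next paragraph is omitted for brevity.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TestComponent:
    timestamp: datetime  # when the component was measured
    positive: bool       # True if the component met the modeled speed

def meets_geographic_threshold(by_point_hex: dict[str, list[TestComponent]],
                               accessible_count: int) -> bool:
    """At least four point-hexes (or all of them, when fewer than four are
    accessible) must each contain two or more components of the type being
    evaluated, at least one of which is positive."""
    qualifying = sum(
        1 for comps in by_point_hex.values()
        if len(comps) >= 2 and any(c.positive for c in comps))
    return qualifying >= min(4, accessible_count)

def meets_temporal_threshold(components: list[TestComponent]) -> bool:
    """A set of five positive components separated by at least four hours
    (time of day, regardless of date) from another set of five."""
    times = sorted(c.timestamp.hour * 60 + c.timestamp.minute
                   for c in components if c.positive)
    if len(times) < 10:
        return False
    # Widest achievable gap: latest time in the earliest five versus
    # earliest time in the latest five.
    return times[-5] - times[4] >= 240

def meets_testing_threshold(components: list[TestComponent]) -> bool:
    """At least 17 positive components out of 20 or fewer; otherwise a
    minimum positive rate rising from 82% (21-34 components) to 88%
    (100 or more components)."""
    n = len(components)
    positives = sum(c.positive for c in components)
    if n <= 20:
        return positives >= 17
    if n <= 34:
        required = 0.82
    elif n >= 100:
        required = 0.88
    else:
        # Placeholder linear ramp; the adopted schedule for 35-99
        # components is set out in the Technical Appendix.
        required = 0.82 + 0.06 * (n - 34) / (100 - 34)
    return positives / n >= required
```

Under this reading, a provider's rebuttal would need all three checks to pass for both the download and upload component sets of a challenged hexagon.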
Additionally, in line with the modification we adopt for challengers, if more than 50% of the test components of the same type are within a single point-hex where four or more point-hexes in the resolution 8 hexagon are accessible, the test components in that point-hex will be down-weighted to account for only 50% of the total test components when evaluating the testing threshold. If more than 75% of the tests are within one point-hex where there are three accessible point-hexes in the resolution 8 hexagon, the tests in that point-hex will be down-weighted to account for only 75% of the total tests when evaluating the testing threshold. By limiting the percentage of test components within any one point-hex that may contribute to a challenge response, this requirement will help ensure that there is sufficient diversity in the test data that a challenged provider submits. A provider may also demonstrate sufficient coverage in a resolution 8 hexagon that was not challenged in order to rebut a challenge to a lower-resolution hexagon containing the non-challenged resolution 8 hexagon (i.e., the “parent” resolution 7 hexagon or “grandparent” resolution 6 hexagon). BDC Mobile Technical Requirements Public Notice at *8, para. 18. As discussed more fully in section 3.2.4 of the Technical Appendix, for challenged hexagons at resolution 7 or 6, if the provider submits response data sufficient to demonstrate coverage in the hexagon’s child hexagons such that fewer than four child hexagons would still be challenged, then the resolution 7 or 6 hexagon would no longer be challenged even if sufficient data were not submitted to rebut a challenge for the remaining child hexagons. See infra Appx. A – Technical Appendix § 3.2.4. In analyzing challenges, staff may consider other relevant data submitted by providers, request additional information from the challenged provider, and take other actions as may be necessary to ensure the reliability and accuracy of rebuttal data. These actions may include rejecting speed tests or requiring additional testing.
66. In the BDC Mobile Technical Requirements Public Notice, we proposed to require providers to collect on-the-ground test data using mobile devices running either a Commission-developed app (e.g., the FCC Speed Test app), another speed test app approved by OET to submit challenges, or other software if approved by staff. BDC Mobile Technical Requirements Public Notice at *7, para. 17. T-Mobile urges the Bureau and Offices to allow providers to use their own software tools to rebut challenges without seeking prior staff approval. T-Mobile Comments at 19. T-Mobile also asks the Commission to “ensure the process for submitting and responding to challengers is user friendly” by making the challenge portal “compatible with widely used database software like Salesforce.” T-Mobile Comments at 16. We decline to adopt a requirement that the portal be compatible with specific types of software. However, we take other steps to provide flexibility for providers in responding to challenges, including, as described in more detail below, allowing them to use their own software tools to gather on-the-ground test data. We also anticipate that service providers and other entities will be able to build their own tools and integrate their own software and databases with the BDC system using a modern web-based Application Programming Interface (API). If approval is needed, T-Mobile argues, then OET should commit to approve or reject such tools within 90 days of submission. Id. at 19.
Our proposal to require approval of testing software used by providers was based on the Third Order’s direction to the Bureau and Offices to approve the equipment that providers may use to conduct on-the-ground testing to respond to verification inquiries, combined with the Commission’s determination that providers rebutting challenges with on-the-ground test data would be subject to the same requirements and specifications that apply to providers submitting data in response to a Commission verification request. Third Order, 36 FCC Rcd at 1150, 1169, paras. 59, 109. 67. While we continue to read these provisions as requiring the Bureau and Offices to approve any software tools providers may use to gather on-the-ground test data, we clarify that, to the extent that a provider chooses to use software other than the FCC Speed Test app or another speed test app approved by OET for use in the challenge process, we will consider such software approved for use in rebutting challenges provided that the software incorporates the test methodology and collects the metrics that approved apps must gather for consumer challenges and that government and third-party entity challenger speed test data must contain. We understand that certain technical network information and RF metrics that we would otherwise require are not currently available on Apple iOS devices. Therefore, until such time as such information and metrics are available on iOS devices, and the Bureau and Offices indicate that they will collect such information from iOS devices, providers must collect all of the required technical network information and RF metrics using a device that is able to interface with drive test software and/or runs the Android operating system. See supra Section III.A.1. Creating a Challenge/Cognizable Challenges (discussion of requirement for government and third-party entity challenges to use a device that is able to interface with drive test software and/or runs the Android operating system.) We also require providers conducting in-vehicle mobile tests (i.e., drive tests) to conduct such tests with the antenna located inside the vehicle. We disagree with Verizon that providers should be able to choose whether or not to use an external antenna when conducting speed tests. Verizon Comments at 15-16. Because most consumers will take in-vehicle tests using an antenna inside the vehicle, adopting this requirement for providers will help minimize discrepancies and ensure more consistent comparisons between on-the-ground test data supplied by challengers and data supplied by providers. 68. In order to inform our approval process and consistent with the requirement that applies to government and other entity challengers who choose to use their own software when submitting challenges, we require providers who choose to use their own software to submit a complete description of the methodologies used to collect their data and to substantiate their data through the certification of a qualified engineer or official. Permitting providers to use their own tools is consistent with the approach the Commission adopted for government and other entity challengers in collecting challenge data and it is preferable to requiring prior approval for providers wishing to use their own software tools because it will help streamline the challenge process by reducing the potential for any delays that might be caused by requiring prior review of specific software tools that providers may wish to use. 
It also will provide greater flexibility and reduce burdens on providers by allowing them to more easily use the software tools they may already be using in the ordinary course of their business.
69. We recognize that this approach is different from the approach we have adopted for third-party speed test apps, where we require OET approval before such apps may be used in the challenge process. We find, however, that the difference in treatment is justified and warranted. Mobile broadband service providers routinely test and monitor network performance as they develop their networks, and their software has been engineered specifically to obtain detailed speed test measurement data. Providers’ software is unlikely to be constrained by limitations in the categories of data that can be collected; in contrast, and as discussed above, consumer-facing third-party apps (particularly apps run over iOS) cannot provide certain categories of information. We require approval for third-party speed test apps because we want to ensure that the apps measure coverage as accurately as possible and report information into the BDC system with the required certifications and in a useable format. In addition, requiring approval is necessary to hold the third-party app developers accountable for the accuracy and reliability of their tools and to allow us to inform consumers of the available third-party apps that meet our requirements and are approved for use in the challenge and crowdsource processes. In contrast, the Commission has greater jurisdiction over service providers, as providers are required under the Broadband DATA Act to ensure the accuracy of the coverage information they submit to the Commission. 47 U.S.C. § 643. Permitting providers to use these existing performance measurement tools without individualized review and approval will help increase efficiency while continuing to ensure that the Commission receives high-quality data that will allow an apples-to-apples comparison between challenge data submitted by consumers and other entities and data supplied by providers using their own software. While we expect that this approach will benefit our administration of the challenge process, we retain the discretion to require prior approval of providers’ software or to make changes to the required metrics via notice and comment at a later time. We also retain discretion to revoke the automatic grant of approval in instances where a provider’s software is found to be unreliable or otherwise inconsistent with our objective of ensuring accurate mapping data.
70. We decline T-Mobile’s request that we “adopt a 90-day ‘expiration’ date for challenge data” and instead adopt our proposal to make on-the-ground test data valid for one year from the test date. T-Mobile Comments at 20 (stating that “T-Mobile is constantly updating its highly dynamic network to keep pace with network demand and data usage patterns. Because of these regular and consistent changes to the network, speed-test results can easily become stale within a matter of weeks—or even days—depending on the circumstances”); BDC Mobile Technical Requirements Public Notice at *29, Appx. A – Technical Appendix § 2. The process we adopt for submission of challenges ensures that providers have sufficient details to respond to challenges, including dates and times of speed tests. See supra Section III.A.1. Creating a Challenge/Cognizable Challenges.
Moreover, to the extent a provider improves its network coverage in an area, it can either remove the area from its current data and add it back in with its next biannual submission or rebut a challenge by submitting on-the-ground test data demonstrating network performance in the recently deployed area. We find that these alternatives strike a better balance between facilitating robust participation in the challenge process and ensuring high-quality data than would curtailing the lifespan of valid challenge data.
b. Rebutting Challenges with Infrastructure Data
71. Under the rules adopted in the Third Order, providers may respond to challenges with infrastructure data rather than (or in addition to) on-the-ground speed test data. Third Order, 36 FCC Rcd at 1168, 1173, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5). In cases where a challenged mobile service provider chooses to submit infrastructure data to respond to a challenge, we adopt our proposal to require the provider to submit the same data as required when a mobile provider submits infrastructure information in response to a Commission verification request, including information on the cell sites and antennas used to provide service in the challenged area. BDC Mobile Technical Requirements Public Notice at *9, para. 20; see infra Section III.B.4. Collecting Verification Information from Mobile Providers, Infrastructure Information. In the Third Order, the Commission directed OEA and WTB to provide guidance on the types of data that will likely be more probative in validating broadband availability data submitted by mobile service providers in different circumstances, Third Order, 36 FCC Rcd at 1146, para. 48, and in the BDC Mobile Technical Requirements Public Notice, we proposed to use infrastructure data, on their own, to adjudicate challenges in a limited set of circumstances. BDC Mobile Technical Requirements Public Notice at *9, para. 20; see also BDC Mobile Technical Requirements Public Notice at *18, para. 46. Specifically, we proposed that a challenged provider may use infrastructure data to identify tests within challenger speed test data that the provider claims are invalid or non-representative of network performance and proposed four circumstances under which a provider could claim a speed test was invalid or non-representative. BDC Mobile Technical Requirements Public Notice at *9, para. 20. In response, CCA argues that providers should not be permitted to respond to a challenge with only infrastructure data because such data are predictive and are not as reliable as on-the-ground test data. CCA Comments at 12 (stating that “a rebuttal of only infrastructure information is effectively a response with another prediction, which is inconsistent with the Commission’s (correct) emphasis on the value of on-the-ground testing. Placing a priority on field measurements would require that in rebuttals to challenges, providers must supplement predicted infrastructure data with at least some of their own on-the-ground test data”). CTIA and Verizon, by contrast, argue that the Bureau and Offices lack delegated authority to impose any limitation on providers’ ability to submit infrastructure data to respond to challenges. CTIA Reply at 14; Verizon Comments at 16.
72. We find that our proposed approach strikes the best balance between providing flexibility for providers and ensuring that they respond to challenges with probative data.
We continue to view data that reflect actual on-the-ground tests, as opposed to infrastructure data, as generally reflecting user experience more accurately and therefore as being of more probative value in most, but not all, circumstances. We disagree with CTIA and Verizon’s argument that the Commission’s decision to permit providers to respond with infrastructure data precludes us from adopting rules governing the circumstances under which such data can be used, on their own, to respond to challenges. While the Commission directed providers to “submit to the Commission either on-the-ground test data or infrastructure data, so that Commission staff can examine the provider’s coverage in the challenged area and resolve the challenge,” Third Order, 36 FCC Rcd at 1168, para. 108; accord id. at 1173, para. 121, it also “directed OEA and WTB to develop the specific requirements and methodologies that providers must use in conducting on-the-ground testing and in providing infrastructure data” and “direct[ed] OEA and WTB to provide guidance about what types of data will likely be more probative in different circumstances.” Id. at 1146, 1173, paras. 48, 121. The Commission also found that “if needed to ensure adequate review, OEA may also require that the provider submit other data in addition to the data initially submitted, including but not limited to, either infrastructure or on-the-ground testing data (to the extent not the option initially chosen by the provider).” Third Order, 36 FCC Rcd at 1169, para. 108. Defining the circumstances under which infrastructure data, on their own, may be used to rebut a challenge is consistent with these delegations of authority and offers guidance to providers about when the Commission will find infrastructure data to be as probative as on-the-ground test data, as well as when such data are likely to be sufficient to resolve a challenge.
73. We also disagree with Verizon that requiring a challenged provider to submit infrastructure data in cases where there may be other forms of evidence that can rebut a challenge is “unnecessarily burdensome.” Verizon Comments at 14-15 (stating that “[b]ecause detailed information about the network is not necessary to show that (1) tests at a given time were affected by a network outage; (2) the challenger’s device did not support all relevant spectrum bands; (3) the challenger’s account was subject to reduced speeds for exceeding a usage limit; or (4) the challenger conducted speed tests indoors, the Commission should give providers the flexibility to provide other forms of evidence demonstrating that specific speed tests were invalid. The Commission should also permit providers to demonstrate that, based on a non-consumer challenger’s description of its testing methodology, the challenger’s methodology could not have produced sufficiently reliable speed test data” (footnote omitted)). In the Third Order, the Commission determined that providers may rebut a challenge by submitting to the Commission on-the-ground test data and/or infrastructure data, so that Commission staff can examine the provider’s coverage in the challenged area and resolve the challenge, and may optionally include additional data or information in support of a response. Third Order, 36 FCC Rcd at 1168-69, 1173-74, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5). The Bureau and Offices do not have the authority to change the Commission’s decision or permit challenge responses that do not include on-the-ground test data and/or infrastructure data.
74. While we adopt our proposal to use infrastructure data, on their own, to resolve challenges in a limited set of circumstances, we agree with commenters that providing additional flexibility will help providers submit responses efficiently. See, e.g., CTIA Comments at 18; T-Mobile Comments at 19. Therefore, we add to the list of circumstances where we will accept infrastructure data, on their own, to respond to a challenge. In the circumstances listed below, we find that infrastructure information will likely be as probative as on-the-ground test data, and a provider may therefore submit infrastructure data, on their own, in response to a challenge in order to invalidate speed tests submitted by challengers. We disagree with CCA that the circumstances for submitting infrastructure data are not defined sufficiently and risk increasing burdens on challengers. See CCA Comments at 12 (contending that the proposed circumstances where infrastructure data could be used were “loosely defined” and could have the effect of requiring challengers to demonstrate that “no special events, no outages, and no anomalous loading occurred on the challenged network”). We expect the circumstances outlined below to occur rarely, and providers, not challengers, must demonstrate that one of these circumstances exists when responding to a challenge solely with infrastructure data.
75. First, we find that infrastructure information will likely be of comparable probative value when extenuating circumstances at the time and location of a given test (e.g., maintenance or a temporary outage at the cell site) caused service to be abnormal. BDC Mobile Technical Requirements Public Notice at *9, *18, paras. 20, 47. In such cases, we adopt our proposal for providers to submit coverage or footprint data for the site or sectors that were affected and information about the outage, such as the bands affected, its duration, and whether the outage was reported to the FCC’s Network Outage Reporting System (NORS), along with a certification about the submission’s accuracy. Id. at *18, para. 47. We will then remove measurements in the reported footprint in the relevant band(s) made during the outage and, as appropriate, recalculate the statistics. Id.
76. Second, we find that infrastructure data will likely be of comparable probative value when the mobile device(s) with which the challenger(s) conducted their speed tests are not capable of using or connecting to the radio technology or spectrum band(s) that the provider models as required for service in the challenged area. Id. at *9, *18, paras. 20, 48. In such cases, we adopt our proposal for providers to submit band-specific coverage footprints and information about which specific challengers’ device(s) lack the band or technology. Id. at *18, para. 48. We will then remove measurements from the listed devices in the relevant coverage footprint and recalculate the statistics. Id.
77. Third, we find that infrastructure data will likely be of comparable probative value when speed tests were taken during an uncommon special event (e.g., a professional sporting event or concert) that increased traffic on the network. Id. at *9, *18, paras. 20, 49.
As we previously stated, we recognize that in such cases mobile service providers would not have the same throughput they would in normal circumstances given the high volume of traffic on networks during these types of uncommon special events, so demonstrating the existence of coverage in the area by submitting infrastructure information would persuasively explain why speed tests were negative in such a scenario. Id. at *18, para. 49.
78. Fourth, we find that infrastructure data will likely be of comparable probative value when speed tests were taken during a period when cell loading was abnormally higher than the modeled cell loading factor. Id. at *9, *18, paras. 20, 50. Speed tests taken during a period when cell loading is higher than usual can result in negative speed tests, and we thus anticipate that infrastructure information will be useful to remove the tests and recalculate the statistics for challenges in this situation. Id. In such cases, we adopt our proposal to require providers to corroborate their claims by submitting cell loading data, and we clarify that these data must both (a) establish that the cell loading for the primary cell(s) at the time of the tests was abnormally higher than modeled, To meet this threshold, infrastructure data reporting cell loading at the time of the tests would need to show that actual loading was both higher than the modeled cell loading factor (e.g., 50%) and higher than the 75th percentile of the 15-minute-interval weekly cell loading data submitted as a cell loading baseline. Adopting the 75th percentile requirement ensures that loading at the time of the tests was abnormally high, because it would have been higher than the loading during the four busiest hours of each day within the 6:00 AM to 10:00 PM daily window for challenge speed tests during the baseline period. and (b) include cell loading data for a one-week period before and/or after the provider was notified of the challenge showing as a baseline that the median cell loading for the primary cell(s) was not greater than the modeled value (e.g., 50%). See BDC Mobile Technical Requirements Public Notice at *10, para. 20. See CCA Comments at 12 (arguing that providers should be required to submit corroborating evidence when using infrastructure data to respond to challenges). These clarifications should help address concerns about the utility of infrastructure data by ensuring that we receive robust evidence, based upon actual cell loading measurements, that higher-than-modeled cell loading at the time of the test is an abnormal occurrence. See CCA Comments at 12 (arguing that proposed circumstances for allowing infrastructure data to be used, on their own, to respond to challenges were “open-ended and loosely defined”). We also adopt our proposal that, if a high number of challenges show persistent over-loading, staff may initiate a verification inquiry to investigate whether mobile providers have submitted coverage maps based on an accurate assumption of cell loading in a particular area. See BDC Mobile Technical Requirements Public Notice at *10, para. 20, n.67.
79. Fifth, in response to the record, we find that infrastructure data will likely be of comparable probative value when a mobile device used in testing used a data plan that could result in slower service. CTIA Comments at 18 (arguing that providers should be able to “explain that a challenge may have failed due to reasons outside of the provider’s control.
For example, a challenge may be invalid due to issues with the test device(s) used or the use of data plans that could result in slower service”); T-Mobile Comments at 19 (stating that “[p]roviders should also be able to rebut a challenge on the grounds that test devices used data plans that were not capable of receiving the advertised speeds in the relevant area at the time of test”). In such cases, providers must submit information about which specific device(s) used in the testing were using a data plan that would have resulted in slower service and information showing that the provider’s network did, in fact, slow the device at the time of the test.
80. Sixth, and also in response to the record, we find that infrastructure data will likely be of comparable probative value when a mobile device used in the testing was either roaming or was used by the customer of an MVNO. Verizon Comments at 11 (arguing that the Commission should “exclude consumer and non-consumer speed tests from the challenge calculations if the provider can show that the particular device that a challenger used to conduct speed tests was subject to reduced speeds”). As adopted above, we will not permit speed tests submitted by customers of an MVNO, or by users whose devices were roaming on another provider’s network, to be counted as valid tests against the facilities-based provider’s network on which the speed test was conducted. As stated above, because the agreements between a facilities-based provider and MVNOs or roaming partners often include limitations on the technology and speed available to, or the network prioritization of, devices used by consumers of the MVNO or roaming partner, we conclude that speed tests from such devices are not reliable evidence about the performance of the facilities-based provider’s network. See Section III.A.1 Creating a Challenge/Cognizable Challenges. While we anticipate that the majority of such tests will fail our automated validations, there may be circumstances where the BDC system is unable to automatically identify these tests (e.g., identifying whether an iOS device is roaming is not currently possible). See, e.g., supra Section III.A.1 Creating a Challenge/Cognizable Challenges. In such circumstances, providers must identify, based upon their records, which specific device(s) used in the testing were either roaming at the time or used by the customer of an MVNO.
81. After the provider identifies the speed tests it seeks to invalidate pursuant to one of the six circumstances we adopt above and submits all required infrastructure data in support of this contention, we will remove any invalidated speed tests and recalculate the statistics for the challenged hexagons. Any challenged hexagons that no longer meet the thresholds required for a challenge will be restored to their status before the cognizable challenge was created. BDC Mobile Technical Requirements Public Notice at *10, para. 20. We note that where a provider rebuts a challenge using this process, the challenged hexagons that have been restored to their status before the cognizable challenge was created would continue to be eligible for subsequent challenges.
82. Where a challenged provider does not claim that a challenger’s speed tests were invalid based upon one of the six circumstances listed above, Commission staff will consider any additional information submitted by the challenged provider or request additional information from the challenged provider. Id. at *10, para. 21.
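To make the fourth of these circumstances concrete, the following sketch (in Python) checks the two-part cell-loading showing described in paragraph 78 for a single primary cell. The function name, the data shapes, and the fractional units are illustrative assumptions, not part of the Commission's systems.

```python
import statistics

def cell_loading_rebuttal_supported(loading_at_test: float,
                                    baseline_samples: list[float],
                                    modeled_factor: float = 0.50) -> bool:
    """Sketch of the two-part showing in paragraph 78. `baseline_samples`
    holds one week of 15-minute-interval loading measurements for the
    primary cell(s), expressed as fractions (0.0-1.0); 0.50 mirrors the
    50% modeled loading factor used as the example in the text."""
    # (a) Loading at the time of the challenged tests must be abnormally
    #     high: above the modeled factor and above the 75th percentile
    #     of the one-week baseline.
    q75 = statistics.quantiles(baseline_samples, n=4)[2]
    abnormally_high = loading_at_test > modeled_factor and loading_at_test > q75
    # (b) The baseline itself must be consistent with the model: its
    #     median must not exceed the modeled loading factor.
    baseline_consistent = statistics.median(baseline_samples) <= modeled_factor
    return abnormally_high and baseline_consistent
```

Only speed tests taken while both prongs hold would be removed before the statistics for the challenged hexagons are recalculated.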
Such information must include on-the-ground speed test data and may also include other types of data, as specified in the Third Order. Third Order, 36 FCC Rcd at 1168-69, para. 108. Staff will use this information to complete its adjudication of the challenge. BDC Mobile Technical Requirements Public Notice at *10, para. 21; see Third Order, 36 FCC Rcd at 1168-69, para. 108. Although we adopt the foregoing approach for considering infrastructure information in response to challenges, we note that we may make changes to this approach over time as we gain experience with administering the challenge process.
c. Other Data
83. In the Third Order, the Commission determined that providers may rebut a challenge by submitting to the Commission on-the-ground test data and/or infrastructure data, and may optionally include additional data or information in support of a response, including drive testing data collected in the ordinary course of business, third-party testing data (such as speed test data from Ookla or another speed test app), and/or tower transmitter data collected from transmitter monitoring software. Third Order, 36 FCC Rcd at 1168, 1170, 1173-74, paras. 108, 110, 121. Consistent with the Commission’s direction in the Third Order, OEA staff will review such data when voluntarily submitted by providers in response to challenges, and, if any of the data sources are found to be sufficiently reliable, staff will specify appropriate standards and specifications for each type of data and issue a public notice adding the data source to the alternatives available to providers to rebut a consumer challenge. Id. at 1170, 1173-74, paras. 110, 121.
84. In the BDC Mobile Technical Requirements Public Notice, the Bureau and Offices sought comment regarding the conditions under which a provider’s transmitter monitoring software can be relied upon by staff in resolving challenges. BDC Mobile Technical Requirements Public Notice at *11, para. 25. Commenters did not discuss specific conditions under which transmitter monitoring software should be relied upon, instead expressing general support for the use of such data and encouraging the Commission to develop standards for when such data would be sufficient for rebutting a challenge. CTIA Comments at 5 (urging the Bureau and Offices to “adopt standards specifying when these alternative types of data are sufficient to rebut a challenge”); Verizon Comments at 17 (arguing that transmitter monitoring software provides a “comprehensive picture of network performance” because it “provides data for all sessions, from all devices and at all times of the day, thus largely avoiding different forms of error and bias that could affect a speed test sample”). Based on the record, we find that there is insufficient evidence to determine, at this time, the conditions under which we may rely on transmitter monitoring software data to resolve challenges.
Accordingly, we will review such data when voluntarily submitted by providers in response to challenges and, in doing so, we will consider, among other things, the extent to which the transmitter monitoring software data augment or reinforce the probative value of infrastructure or other data to rebut challenger speed test data, how such systems measure the geographic coordinates (longitude and latitude) of the end-user devices, how the data compare to the information collected from on-the-ground testing, and whether such software records instances of end-user devices not being able to connect to the network at all.
85. Several providers filed comments requesting additional flexibility in responding to challenges. CTIA Reply at 13; T-Mobile Comments at 18-19; Verizon Comments at 14-15. They argue that, rather than only being permitted to voluntarily submit other types of data, such as data from field tests conducted in the ordinary course of business or third-party data, in addition to either on-the-ground test data or infrastructure data, providers should be able to submit such data on their own as a response to challenges. T-Mobile Comments at 18-19 (arguing that providers should be permitted to submit “data from coverage tests conducted in the ordinary course of business or any other federal or state required coverage tests” and “any appropriate data showing that the challenge test results are invalid due to issues with the test devices used” to respond to challenges); CTIA Reply at 13 (arguing that providers should be permitted to “submit various types of data to respond to a challenge, including test data collected in the ordinary course of business, third-party test data, and data collected from transmitter monitoring software”); Verizon Comments at 14-15 (contending that “the Public Notice’s proposal to require challenged providers to submit the entire list of infrastructure data to rebut challenges based on speed tests that are invalid or non-representative of network performance is unnecessarily burdensome” and arguing that providers should be permitted to respond to challenges with “other forms of evidence demonstrating that specific speed tests were invalid”). We note that the specifications for on-the-ground test data require that speed tests be taken outdoors. Data that do not meet the specifications will not be validated or included in challenges. The Commission has already addressed requests for additional flexibility in responding to challenges, and the Bureau and Offices do not have authority to change the Commission’s determinations. In the Third Order, the Commission considered arguments that providers should have additional flexibility to submit other types of data in responding to challenges, including, among others, drive testing data collected in the ordinary course of business. Third Order, 36 FCC Rcd at 1168-69, 1170, paras. 108, 110; see id. at 1173-74, para. 121. The Commission recognized the need for flexibility in provider responses, determining that providers may voluntarily submit other types of data beyond the on-the-ground testing data or infrastructure data they are required to submit to rebut a challenge, but found that the record did not support a finding that such data were sufficient to serve as a complete substitute for either on-the-ground testing or infrastructure data. Id. at 1170, para. 110; see id. at 1173-74, para. 121. The Bureau and Offices do not have the discretion to change the Commission’s decision.
Although OEA has the delegated authority to adopt new alternatives as a substitute for on-the-ground data or infrastructure data, it can exercise such authority only after reviewing such data submissions, determining that they are sufficiently reliable, and specifying the appropriate standards and specifications for each type of data. Id. at 1170, 1174, paras. 110 & n.333, 121 & n.356. B. Collecting Verification Information from Mobile Providers 86. The Broadband DATA Act requires the Commission to “verify the accuracy and reliability” of the broadband Internet access service data providers submit in their biannual BDC filings in accordance with measures established by the Commission. 47 U.S.C. § 642(b)(4)(B). The Commission determined in the Third Order that OEA and WTB “may request and collect verification data from a provider on a case-by-case basis where staff have a credible basis for verifying the provider’s coverage data.” Third Order, 36 FCC Rcd at 1146, para. 47; 47 CFR § 1.7006(c). In response to such an inquiry, the provider must submit either on-the-ground test data or infrastructure information for the specified area(s). Third Order, 36 FCC Rcd at 1146, para. 50; 47 CFR § 1.7006(c). The provider may also submit additional data, including but not limited to, on-the-ground test data or infrastructure data (to the extent such data are not the primary option chosen by the provider), or other types of data that the provider believes support its reported coverage. Third Order, 36 FCC Rcd at 1147, para. 50. A mobile service provider has 60 days from the time of the request by OEA and WTB to submit, at the provider’s option, on-the-ground or infrastructure data, as well as any additional data that the provider chooses to submit to support its coverage. Id. OEA and WTB may require submission of additional data if such data are needed to complete the verification inquiry. Id. The Commission directed OEA and WTB “to implement this data collection and to adopt the methodologies, data specifications, and formatting requirements that providers shall follow when collecting and reporting [these] data.” Id. at 1146, para. 48. The BDC Mobile Technical Requirements Public Notice sought comment on processes and methodologies for determining areas subject to verification (i.e., areas where Commission staff have a credible basis for verifying a mobile provider’s coverage data in an area) and for the collection of on-the-ground test data and infrastructure information, as well as information from transmitter monitoring systems and other data. BDC Mobile Technical Requirements Public Notice at *12-17, paras. 26-42; BDC Mobile Technical Requirements Public Notice, Appx. A – Technical Appendix § 4. Below we discuss and expand on when a credible basis exists for initiating a verification inquiry. Additionally, we adopt approaches for submitting data in response to a verification request and discuss our efforts to balance the needs of this proceeding with the burdens placed on providers in verifying coverage. 1. Area Subject to Verification 87. To identify the portion(s) of a mobile provider’s coverage map for which we will require verification data—referred to as the targeted area(s)—we will rely upon all available evidence, including submitted speed test data, infrastructure data, crowdsourced and other third-party data, as well as staff evaluation and knowledge of submitted coverage data (including maps, link budget parameters, and other credible information). Appx. A – Technical Appendix § 4. 
We find this approach allows for needed flexibility while accounting for the relevant data at hand when selecting a targeted area. The adopted approach to the mobile verification process differs from the challenge process and the verification process proposed in the BDC Mobile Technical Requirements Public Notice by removing the testing and geographic threshold requirements of the challenge process. This reduces the burden on providers while still allowing for an accurate verification process and is discussed further below. See, e.g., Appx. A – Technical Appendix § 4.5. 88. A Credible Basis to Verify a Provider’s Coverage Data. We will conduct verification inquiries in areas where we find there is a “credible basis” for such an inquiry, and we will use an evidence-based analysis to determine whether a credible basis exists. The factors we will consider in this analysis include, but are not limited to, the geographic size of the area, the number of tests taken, the reliability of the tests, the parameters of the RF link budgets, infrastructure data accuracy, backhaul, and cell loading factor requirements. As discussed below, staff may also adjust the fade margins of the RF link budgets to calculate new “core coverage” areas using a standard propagation model, which would have a higher probability of coverage. See infra Section III.B.4. Collecting Verification Information from Mobile Providers, Infrastructure Information. For example, if testing data in an area exhibit an aberration compared to nearby areas and make that area appear as an outlier, this could constitute a credible basis to initiate a verification inquiry for that area. For example, assume an area is within a provider’s 3G and 4G LTE coverage maps and there are many speed tests in the area on 3G but no tests recorded using 4G LTE from devices that are technologically capable of connecting to a 4G LTE network. This absence of tests on a superior technology would be considered an aberration in an area with many tests. Similarly, if speed tests submitted as challenges are sufficient to create many small, disparate challenges across a much larger area, these may be indicative of a pervasive problem, which could give staff a credible basis for conducting a verification inquiry. Another example where a credible basis could exist is an area where a significant number of speed tests have been submitted as challenges but do not meet the thresholds to create cognizable challenges. A credible basis could also be established for an area without cognizable challenge data but where other available data, such as the results of staff’s statistical analysis of crowdsourced data (including, e.g., Kriging spatial-interpolation analysis), indicate that coverage data may be incorrect. Kriging is a spatial interpolation technique that predicts values at unknown points based on measured values at known points. ESRI and ArcGIS, How Kriging works, https://desktop.arcgis.com/en/arcmap/latest/tools/spatial-analyst-toolbox/how-kriging-works.htm (last visited Feb. 7, 2022). Kriging determines the statistical correlation among measured values and fits a mathematical function to make a prediction at a location. Id. Additionally, Kriging provides variance maps that measure the certainty or accuracy of prediction maps. Kriging is most appropriate when the measured values behave as a normal distribution. Id. 
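As one illustration of the kind of spatial-interpolation analysis described above, the sketch below uses PyKrige, a publicly available Python implementation of ordinary Kriging. The library choice, the sample coordinates, and the speed values are assumptions for illustration only and do not reflect the Commission's actual tooling or data.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # pip install pykrige

# Hypothetical crowdsourced download-speed samples: longitude, latitude, Mbps.
lons = np.array([-104.99, -104.97, -104.95, -104.93, -104.91])
lats = np.array([39.72, 39.74, 39.71, 39.75, 39.73])
mbps = np.array([6.1, 0.8, 4.9, 0.5, 5.6])

ok = OrdinaryKriging(lons, lats, mbps, variogram_model="spherical")

# Predict speeds over a small grid. The second return value is the
# kriging variance, which plays the role of the "variance map" that
# measures the certainty of the prediction.
grid_lons = np.linspace(-105.00, -104.90, 25)
grid_lats = np.linspace(39.70, 39.76, 25)
predicted, variance = ok.execute("grid", grid_lons, grid_lats)

# Grid cells where predicted speeds fall below a modeled service
# threshold while the variance is low are the sort of statistical
# signal that could suggest reported coverage is inaccurate there.
low_confidence_gap = (predicted < 5.0) & (variance < variance.mean())
```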
Additionally, as discussed further below, once we determine that a “critical mass” of crowdsourced filings indicates that a provider’s coverage map may be inaccurate, Commission staff have a credible basis for verifying the provider’s coverage data in that area. See infra Section III.D.3. Crowdsourced Data, When Crowdsourced Filings Reach a “Critical Mass”; see also Appx. A – Technical Appendix § 5. Notwithstanding any of the foregoing, we note that the Commission also retains the right to perform audits of provider submissions at random, even without the existence of a credible basis necessary to trigger a verification inquiry. See 47 U.S.C. § 644(a); Second Order and Third Further Notice, 35 FCC Rcd at 7486, para. 60.
89. We believe that the aforementioned examples of the information we will consider, as well as the standards and types of analysis we intend to apply, when deciding where to initiate a verification inquiry provide sufficient guidance on this topic, and we therefore find it unnecessary to adopt additional restraints, as advocated by T-Mobile. T-Mobile Comments at 21. T-Mobile argues that “[s]etting clear standards to limit the scope and frequency of such requests will ensure that the requests for verification data are made on a case-by-case basis and only when there is an actual need for additional data” and that while certain information is identified for staff review, no framework for making the “credible basis” determination is given. Id. Because the Broadband DATA Act gives the Commission the responsibility to “verify the accuracy and reliability of [service providers’ biannual coverage data],” 47 U.S.C. § 642(b)(4)(B), it is important that staff have enough discretion to consider whether coverage data are accurate based on a range of factors, including geographic size, on-the-ground tests taken, and the reliability of those tests, according to the particular circumstances of the data that are presented to us. At the same time, the case-by-case nature of the data received from providers, the challenge process, and the crowdsourced data is sufficient to limit verification requests to areas where a reason exists to view the area as problematic. We believe the approach described here is the most reasonable and effective way to pursue the goals of this proceeding and the Broadband DATA Act. We do not seek to require superfluous information from providers, but if circumstances indicate that additional data or other information are necessary to verify coverage in an area where evidence suggests the coverage is problematic, we have an obligation to verify the data, and, in many cases, additional information will be necessary to verify the area’s coverage and carry out the Commission’s obligations under the Broadband DATA Act. See 47 U.S.C. § 642(b)(4)(B).
90. Multiple commenters express a strong general desire to reduce or minimize the burden placed on providers as a result of the verification process. See, e.g., Verizon Comments at 18 (noting the proposed requirements would impose unnecessary burdens); T-Mobile Comments at 25 (advocating providers should have more time to respond to requests); CTIA Comments at 19 (supporting a Commission determination to avoid imposing excessive burdens); AT&T Reply at 12-14 (suggesting the Commission tailor requests for verification data in accord with a Commission direction to reduce the burden on providers).
For instance, Verizon claims that the methods proposed for determining an area subject to verification would create verification areas that are too large. Verizon Comments at 20-21. Verizon references a webinar conducted by staff to provide an example of a targeted area for the verification process and says this targeted area is too large given the timeframe to respond. Id. This webinar was meant to provide an example for illustrative purposes only and was not meant to represent an actual use scenario or necessarily reflect what would happen in actual verifications. See FCC, Broadband Data Task Force Webinar on Proposals for BDC Mobile Challenge, Verification, and Crowdsource Technical Requirements (August 12, 2021), https://www.fcc.gov/news-events/events/2021/08/broadband-data-task-force-webinar-proposals-bdc-mobile-challenge. Verizon recommends initially testing the verification process on a smaller scale, such as in rural areas. It also recommends that the Bureau and Offices limit verification requests to one per map submission (and up to two per year) and limit the areas to be sampled in the verification process to three contiguous resolution 6 hexagons. Verizon Comments at 20. T-Mobile supports focusing verification requests in rural areas. T-Mobile Comments at 24. T-Mobile similarly requests that the Bureau and Offices limit verification requests, recommending that such requests cover an area of no more than 10,000 square miles in a given year. Id. at 25. T-Mobile also recommends giving providers a period of six months to respond to a verification request. Id.
91. We decline to adopt any specific limitations on the basis for initiating verification inquiries or the areas subject to verification, including instances where a provider is already required to conduct drive testing for other reasons. See, e.g., id. at 24-25 (requesting an exemption from verification requests if the provider is already required to do drive testing for another reason, such as a merger condition). We likewise decline to adopt a limit on the number of verification inquiries that we initiate for a particular provider within a given timeframe. We also decline to limit the verification process to a smaller scale initially, or to focus verification requests in rural areas. See id. at 24; Verizon Comments at 20-21. The Broadband DATA Act envisions that the Commission will assess the accuracy and reliability of broadband availability data, and we find it inappropriate to limit staff’s ability to carry out its tasks to further the goals of both the Act and this proceeding. Although we decline to set a maximum size for the target area, we consider any target area with a size less than 50 resolution 8 hexagons to be de minimis and more appropriate for the mobile challenge process than the mobile verification process. See Appx. A – Technical Appendix § 3.1.
92. However, we are mindful of the burden that a large area subject to verification can pose for providers. For this reason, we will rely on a sampling method for verification inquiries. The sampling method we adopt, described more fully in the Technical Appendix, is a somewhat modified version of the proposed approach. Appx. A – Technical Appendix § 4. It relaxes the burden on providers (for example, we reduce the burden on providers by replacing the proposed geographic threshold for on-the-ground test data in the verification process with spatial random sampling, as described below, see infra Section III.B.3.
Collecting Verification Information from Mobile Providers, On-the-Ground Test Data; see also Appx. A – Technical Appendix §§ 4.5-4.6) in nearly all cases (Appx. A – Technical Appendix §§ 4.4-4.5) and is generally more streamlined, but still falls well within the bounds of accepted statistical methodologies. The stratified random sampling design is the same as proposed in the BDC Mobile Technical Requirements Public Notice. The frame construction step and stratification method are the same. The sample size formula is algebraically the same (although modified for clarity in the Order). The estimation method for overall broadband availability in the area subject to verification is the same. Nothing in the relevant parts of the proposed verification process that would materially affect the statistical validity of the final estimate of broadband availability has been changed in the adopted approach. Therefore, results under the adopted approach will still be statistically valid, as they were under the approach originally proposed. See, e.g., Appx. A – Technical Appendix § 4.4; BDC Mobile Technical Requirements Public Notice, Appx. A – Technical Appendix § 4.4. The adopted formula for sample selection underwent a purely cosmetic change from the proposed formula that allows for easier explanation, although the result of both formulas is the same.
93. In its comments, Verizon requests that the Bureau and Offices allow providers at least 15 days to review and respond to a verification request before a request is officially made and starts the 60-day clock. Verizon Comments at 19-20. We decline to adopt Verizon’s request. We view this request as tantamount to requesting an amendment of the 60-day term stipulated in the Third Order, and such an amendment would be beyond the Bureau and Offices’ delegated authority. Third Order, 36 FCC Rcd at 1147, para. 50. Further, we find that allowing a pre-review period could cause delays in the verification process that would adversely affect the provision of accurate broadband coverage information to the public. Additionally, as verification requests are triggered only when there is a credible basis, there is already reason to view the relevant area with concern, and we do not believe that the benefit of such a delay would outweigh the need to verify the data.
2. Sampling Methodology
94. Gathering Statistically Valid Samples of Verification Data. As proposed in the BDC Mobile Technical Requirements Public Notice, we require a mobile service provider subject to a verification inquiry to provide data for a statistically valid sample of areas within the targeted area. See Appx. A – Technical Appendix § 4.1; see also BDC Mobile Technical Requirements Public Notice at *12, para. 28. Commenters do not address the proposals for gathering statistically valid samples of verification data. We will determine the statistically valid sample size by dividing the targeted area into hexagonal units based on the H3 indexing system at resolution 8; the aggregation of these hexagonal units comprises “the frame.” Appx. A – Technical Appendix § 4.2. We use the H3 indexing system to define the frame for consistency with the mobile challenge process. We will then categorize the hexagonal units that comprise the frame into non-overlapping, mutually exclusive groups (one “stratum” or multiple “strata”). Id. at § 4.3. Each stratum will be based upon one or more variables that are correlated with a particular mobile broadband availability characteristic.
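The frame-and-strata workflow described in paragraph 94 can be sketched with Uber's open-source h3-py library (v4 API), which implements the H3 indexing system. The coordinates, the stratification variable, and the sample-size formula below are illustrative stand-ins; in particular, the formula is a textbook finite-population calculation, not necessarily the formula adopted in the Technical Appendix.

```python
import random
from collections import defaultdict
import h3  # h3-py v4 (pip install h3)

# Build the frame: resolution 8 cells covering the targeted area, here
# represented (for simplicity) by a handful of hypothetical coordinates.
targeted_points = [(39.72, -104.99), (39.74, -104.97), (39.71, -104.95)]
frame = {h3.latlng_to_cell(lat, lng, 8) for lat, lng in targeted_points}

def stratum_of(cell: str) -> str:
    """Assign each frame cell to a stratum. A real implementation would
    use availability-correlated variables; this stand-in uses the parent
    resolution 6 hexagon as a crude spatial grouping."""
    return h3.cell_to_parent(cell, 6)

strata = defaultdict(list)
for cell in frame:
    strata[stratum_of(cell)].append(cell)

def sample_size(N: int, z: float = 1.645, p: float = 0.5, e: float = 0.1) -> int:
    """Standard finite-population sample size for a proportion (90%
    confidence, 10% margin); the Order's adopted formula may differ."""
    n0 = z**2 * p * (1 - p) / e**2
    return max(1, min(N, round(n0 / (1 + (n0 - 1) / N))))

# Select a simple random sample of hexagons within each stratum.
sample = {s: random.sample(cells, sample_size(len(cells)))
          for s, cells in strata.items()}
```

In practice, the strata would be defined by the availability-correlated variables discussed next, and hexagons without road access would be dropped from the frame before sampling.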
These variables could include core/non-core coverage area (if available, and as explained further below), See infra Section III.B.4. Collecting Verification Information from Mobile Providers, Infrastructure Information. signal strength (from a provider’s reported “heat map” or staff-performed propagation modeling), population, urban/rural status, road miles, clutter, and/or variation in terrain. Appx. A – Technical Appendix § 4.3. For example, terrain variation is correlated with broadband availability due to the characteristics of radiofrequency propagation. Hexagons that are not accessible by roads will be excluded from all strata. Hexagons inaccessible by public roads present too high a burden to expect a provider to conduct large-scale on-the-ground testing, and we thus have excluded such inaccessible hexagons. Previously, we noted that if an area were unable to be sampled because there are too few hexagons accessible by road, we would include the minimum number of non-accessible hexagons within the strata as necessary to create a sufficient sample. See BDC Mobile Technical Requirements Public Notice at *12, para. 28. That is no longer necessary because we have decided not to adopt the proposed geographic threshold for verifications. At least one accessible point-hex (i.e., a “child” resolution 9 hexagon) is necessary to conduct on-the-ground testing. By design, all resolution 8 hexagons in all strata have at least one such accessible point-hex. Therefore, all hexagons in all strata can be included in the sample and all of them can be tested. Resolution 8 hexagons that are not accessible by roads cannot be included in stratification because, under the sampling design we adopt, stratification applies only to hexagons that can be selected for on-the-ground testing. We will then select a random sample of hexagons within each stratum for which service providers must conduct on-the-ground testing. To the extent mobile service providers receive personally identifiable information through the verification process by way of receiving crowdsourced data, providers may only use such information for the purpose of responding to a verification inquiry, and must protect and keep private all such personally identifiable information. As an alternative to on-the-ground testing, a provider can respond with infrastructure information covering the targeted area. Appx. A – Technical Appendix § 4.4.
95. We find this sampling approach minimizes the cost and burden placed on service providers while ensuring that staff have sufficient data to verify coverage in a reliable way. Without such sampling, providers would need to submit substantially more data to verify their broadband availability, whereas requiring providers to submit speed test results for only a stratified random sample of units within a targeted area will minimize the time and resources associated with responding to verification requests. This approach is also more efficient and less burdensome than having providers perform annual drive tests, regularly submit infrastructure information, or submit data for their entire network coverage area. See, e.g., Vermont DPS Reply at 4-5 (noting that carriers in Vermont conduct annual drive testing and the information collected therefrom, along with carriers’ continuous performance assessments, should be required in submissions to the Commission).
The stratification methodology will also ensure that variation in broadband availability will be as small as possible within hexagons in the same stratum. In the BDC Mobile Technical Requirements Public Notice, the Bureau and Offices sought comment on other variables which correlate with broadband availability and upon which stratification should be based, and on the tradeoffs of setting a higher or lower confidence level for this verification process than the thresholds established for the challenge process. See BDC Mobile Technical Requirements Public Notice at *14, para. 32. No commenters address these specific questions. We anticipate this methodology will reduce the sample size and the cost of data collection. 96. Failing to Verify Coverage in a Targeted Area. If the provider fails to verify its coverage data, it will be required, within 30 days, to submit revised coverage maps that reflect the lack of coverage in the targeted areas that failed the verification. See Appx. A – Technical Appendix §§ 4.5-4.6; see also BDC Mobile Technical Requirements Public Notice at *14, para. 32; Third Order, 36 FCC Rcd at 1182, paras. 145-46; 47 CFR § 1.7009(d). When a provider submits such revised coverage data, we will re-evaluate the data submitted by the provider during the verification process by comparing it with the revised coverage data for the targeted area using the same methodology. See Appx. A – Technical Appendix §§ 4.5-4.6. If the targeted area still cannot be successfully verified, we will require that the provider submit additional verification data, such as additional on-the-ground tests, or that it further revise its coverage maps until the targeted area is successfully verified. See id. We note, however, that at any point after the initial 30-day deadline has elapsed, we may treat any targeted areas that still fail verification as a failure to file required data in a timely manner, and the Commission may make modifications to the data presented on the broadband map (i.e., by removing some or all of the targeted area from the provider’s coverage maps). See, e.g., Third Order, 36 FCC Rcd at 1160, para. 85; id. at 1182, para. 144; 47 CFR § 1.7009(b). Staff do not make changes to the availability data submitted by providers in the BDC system (providers must submit and certify their own data), but staff can make changes as to what is presented on the public map. Cases where a provider fails to respond in a timely manner may also lead to enforcement action. Third Order, 36 FCC Rcd at 1182, para. 144; 47 CFR § 1.7009(b). 3. On-the-Ground Test Data 97. The approach we adopt for providers to respond to verification requests using on-the-ground test data is a modified version of what was proposed in the BDC Mobile Technical Requirements Public Notice. BDC Mobile Technical Requirements Public Notice at *14, *37, para. 34, Appx. A § 4.5; see also id. at *7-*9, paras. 17-19. The approach is outlined in the Technical Appendix. See Appx. A – Technical Appendix § 4.5. Commenters did not address proposed metrics for on-the-ground test data in the verification process. As noted above, mobile service providers responding to a verification inquiry with on-the-ground test data do not need to provide the Commission with the timestamp, source IP address, and source port of the device measured by an app developer’s servers, or IMEI value of the device that submits the test measurement data. See supra note 72.
As requested by providers in the record, our modified approach is intended to lessen the burden on providers. See, e.g., Verizon Comments at 18; T-Mobile Comments at 25; CTIA Comments at 19; AT&T Reply at 12-14. Concerns in the record over a provider’s burden in the verification process are also discussed above. See supra Section III.B.1. Collecting Verification Information from Mobile Providers, Area Subject to Verification. These modified thresholds will still provide the Commission with sufficient data to evaluate a provider’s coverage but aim to reduce the testing burden on the providers. First, rather than requiring tests to meet a geographic threshold, we adopt a revised requirement wherein staff will randomly select a single point-hex (i.e., a child resolution 9 hexagon) within the resolution 8 hexagon selected for the sample where the provider must conduct its tests. To provide additional flexibility to the provider, where no tests were recorded inside the coverage area, we will also treat as valid any tests within the sampled accessible point-hex that fall outside the coverage area. See Appx. A – Technical Appendix § 4.5. Given the size of point-hexes and the limits of GPS Standard Positioning Service accuracy at that scale, we may also consider tests that fall slightly outside the required point-hex but within the typical GPS average user range error as valid when no tests are recorded within the point-hex. See Department of Defense, GPS SPS Performance Standard at tbl. 3.4-1 (5th ed. 2020), available at https://www.gps.gov/technical/ps/2020-SPS-performance-standard.pdf. Any point-hexes that still do not contain any valid tests will be considered to have the minimum number of required tests and those tests will all be considered negative tests. Given the geographic size of the resolution 9 hexagon and the verification thresholds described herein, we anticipate that some tests, particularly in-motion drive tests, could unintentionally result in an incomplete sample. This approach accounts for the incomplete sample and describes how we would proceed in this scenario rather than having to address the matter on a case-by-case basis. Unlike in the challenge process, geographic variation in the on-the-ground test data submitted for the verification process is guaranteed by the spatial random sampling approach; thus, the geographic threshold used in the challenge process is unnecessary here. The goal of the geographic threshold is not to ensure geographic variation (e.g., terrain). The geographic threshold is intended to ensure that the tester cannot selectively find an optimal, single point within a resolution 8 hexagonal area with good coverage. The tester cannot select an optimal, single point within an area with good coverage from which to test because it is Commission staff—not the tester—that selects the sample. The tester must show that good coverage exists at several points within the resolution 8 hexagonal area. Requiring testing in only one point-hex (versus four or more, as originally proposed) reduces provider burden because there will be less travel involved. See BDC Mobile Technical Requirements Public Notice, Appx. A – Technical Appendix §§ 4.5-4.6. Less travel means less mileage, and less mileage equates to reduced testing time and cost. Second, the specific testing threshold requirements that apply to challenges are not as relevant to verifications. See Appx. A – Technical Appendix §§ 4.5-4.6.
Similar to the geographic threshold, the goals of the testing threshold are achieved slightly differently in the verification process than in the challenge process, as discussed in the Technical Appendix. Id. In the verification process, geographic variation in the on-the-ground test data is guaranteed by spatial random sampling. Id. Pass or fail adjudication for verifications relies on the totality of the submitted valid on-the-ground test data taken from the randomly selected hex cells. Id. For these reasons, specific testing threshold requirements similar to those that apply to challenges are not materially relevant to verifications. Id. Accordingly, the temporal threshold is the only relevant threshold from the challenge process necessary to ensure statistically valid results when submitting on-the-ground test data for the verification process. See id. Third, we adopt a slight modification to the temporal threshold for verification responses. The temporal threshold proposed in the BDC Mobile Technical Requirements Public Notice requires the provider to record at least two tests within each of the randomly selected hexagons where the times of the tests are at least four hours apart, irrespective of date. BDC Mobile Technical Requirements Public Notice, Appx. A – Technical Appendix § 4.5. We adopt the proposed temporal threshold for the verification process Id. with a slight modification in certain circumstances. Specifically, we relax this threshold from what was proposed by requiring only a single test in a sampled hexagon if the provider establishes that any significant variance in performance was unlikely to be due to cell loading. Appx. A – Technical Appendix § 4.5. The provider can establish this by submitting with its speed test data actual cell loading data for the cell(s) covering the hexagon sufficient to establish that median loading, measured in 15-minute intervals, did not exceed the modeled loading factor (e.g., 50%) for the one-week period prior to the verification inquiry. We find that this modification will reduce the burden on providers without sacrificing statistical robustness because the temporal threshold exists to mitigate the likelihood that the speed measured in test data is unrepresentative of the speed when measured at different times of day, with different cell loading utilization that may exceed the provider’s modeled loading assumptions. 98. We will evaluate the entire set of speed test results to determine the probability that the targeted area has been successfully verified. Appx. A – Technical Appendix § 4.6. The upload and download components of a test will be evaluated jointly in the verification process (rather than separately, as in the challenge process). In contrast to the challenge process, the location of the tests is prescribed in the verification process and, therefore, the upload and download components of a test will be in the same point-hex. This allows for a single classification of the test as positive (if both upload and download meet or exceed required speeds) or as negative (if upload and/or download fail to meet required speeds). We will treat any resolution 8 hexagons in the sample where the provider fails to submit the required speed tests in the randomly selected point-hex as containing negative tests in place of the missing tests when performing this calculation.
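To make this evaluation concrete, the following sketch classifies each test jointly on its upload and download components, substitutes negative results for any missing tests, and then checks a one-sided 95% lower confidence bound on the proportion of positive tests against the 90% coverage requirement described immediately below. The Clopper-Pearson bound and the 5/1 Mbps speed thresholds are illustrative assumptions offered for explanation only; the authoritative adjudication procedure is the one specified in the Technical Appendix.

```python
from scipy.stats import beta

def classify(test, req_dl, req_ul):
    """A test is positive only if both components meet the required speeds."""
    return test["download_mbps"] >= req_dl and test["upload_mbps"] >= req_ul

def lower_bound_95(positives, total):
    """One-sided 95% Clopper-Pearson lower confidence bound on the
    proportion of positive tests (0 when there are no positives)."""
    if positives == 0:
        return 0.0
    return beta.ppf(0.05, positives, total - positives + 1)

def verify_area(tests, n_required, req_dl=5.0, req_ul=1.0):
    """Evaluate the full set of tests; missing tests count as negative.
    The 5/1 Mbps defaults are placeholders for the reported service tier."""
    observed = [classify(t, req_dl, req_ul) for t in tests]
    missing = max(0, n_required - len(observed))
    results = observed + [False] * missing
    return lower_bound_95(sum(results), len(results)) >= 0.90
```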
Providers must verify coverage of a sampled area using the H3 geospatial indexing system at resolution 8. Appx. A – Technical Appendix § 4.2. The tests will be evaluated to confirm, using a one-sided 95% statistical confidence interval, that the cell coverage is 90% or higher. Appx. A – Technical Appendix § 4.6. If the provider can show sufficient coverage in the selected resolution 8 hexagons, the provider will have successfully demonstrated coverage to satisfy the verification request in the targeted area. Id. Sampling allows us to identify where to test and to draw statistically meaningful results about the performance in areas that are not sampled. We believe the specific thresholds and confidence interval that we adopt balance the costs to providers of verifying maps with the Commission’s need to acquire a sample sufficient to accurately verify mobile broadband availability. 99. As proposed in the BDC Mobile Technical Requirements Public Notice, we require that mobile providers conduct on-the-ground tests consistent with the testing parameters and test metrics that we require for provider-submitted test data in the challenge process. BDC Mobile Technical Requirements Public Notice at *14, para. 34. As required in the challenge process for in-vehicle mobile tests, providers must conduct in-vehicle mobile tests in the verification process with the antenna located inside the vehicle. See supra Section III.A.2.a. Mobile Service Challenge Process, Challenge Responses, Rebutting Challenges with On-the-Ground Data. As noted above, because most consumers will take in-vehicle tests using an antenna inside the vehicle, adopting that requirement for providers will help minimize discrepancies and ensure more equivalent comparisons between on-the-ground test data supplied by consumers and data supplied by providers. See supra id. (discussing rationale for requiring providers to conduct tests with the antenna located inside the vehicle contrary to Verizon’s request). 100. We decline to ask for on-the-ground test data from mobile providers on a continuous or quarterly basis as part of the verification process, as proposed by Enablers. Enablers Comments at 6. As noted above, we are mindful of the burden placed on provider resources and find a continuous or quarterly rolling submission requirement unnecessarily burdensome. 101. Commission staff may also leverage spatial interpolation techniques, such as Kriging, to evaluate and verify the accuracy of coverage maps based on on-the-ground data. BDC Mobile Technical Requirements Public Notice at *15, para. 38. Spatial interpolation techniques can be an alternative or complementary approach to specifying an exact testing threshold, since spatial interpolation techniques require fewer data to compare with predictions using propagation models. 4. Infrastructure Information 102. In the BDC Mobile Technical Requirements Public Notice, we noted that the Commission found that infrastructure information can provide an important means to fulfill its obligation to independently verify the accuracy of provider coverage maps. Id. at *15, para. 36 (citing Third Order, 36 FCC Rcd at 1147-48, para. 52). We also reiterated the Commission’s conclusion that collecting infrastructure data from mobile service providers will enable the Commission to verify the accuracy and reliability of submitted coverage data as required under the Broadband DATA Act. Id. (citing Third Order, 36 FCC Rcd at 1148, para. 53); see also 47 U.S.C. § 642(b)(4)(B). 103.
In determining how best to utilize infrastructure data to verify a provider’s coverage, the Bureau and Offices proposed that Commission staff evaluate whether a provider has demonstrated sufficient coverage for each selected hexagon using standardized propagation modeling. BDC Mobile Technical Requirements Public Notice at *15, para. 37. Under that proposed approach, staff engineers would generate their own predicted coverage maps using the infrastructure data submitted by the provider (including link budget parameters, cell-site infrastructure data, and the information provided by service providers about the details of the propagation models they used). Id. The proposed approach did not anticipate that staff would exactly duplicate a provider’s own coverage maps for dispositive effect, as exact duplication by staff would likely be impossible. Using those staff-generated maps, the proposed approach anticipated that Commission staff would evaluate whether each selected hexagon has predicted coverage with speeds at or above the minimum values reported in the provider’s submitted coverage data. Id. The Bureau and Offices sought comment on this proposed approach to verifying coverage using standardized propagation modeling, as well as on other ways more generally that infrastructure data could be used to evaluate the sufficiency of coverage in the proposed verification process. Id. at *15, paras. 37, 38. In the BDC Mobile Technical Requirements Public Notice, we noted that staff may also consider other relevant data submitted by providers during the verification process, may request additional information from the provider (including on-the-ground speed test data, if necessary), and may take steps to ensure the accuracy of the verification process. Id. at *15, para. 37. Alternatively, we sought comment on other ways to use the submitted infrastructure and link budget data to perform initial verification of the claimed coverage within the selected hexagons using standard propagation models as well as appropriate terrain and clutter data. Id. at *15, para. 38. We stated that we could evaluate the provider’s link budgets and infrastructure data for accuracy against other available data, such as Antenna Structure Registration and spectrum licensing data. Id. This alternative approach would include using a staff projection of speeds, available crowdsourced data at the challenged locations, and any other information submitted by or requested from a provider in order to verify coverage. Id. The Bureau and Offices further discussed leveraging spatial interpolation techniques to evaluate and verify the accuracy of coverage maps based on available crowdsourcing and on-the-ground data. Id. We sought comment on both the original and alternative approaches and invited comment on any other ways that infrastructure data and staff propagation modeling could be used to verify a provider’s coverage in a targeted area. Id. 104. We adopt the BDC Mobile Technical Requirements Public Notice’s proposal that, if a provider chooses to submit infrastructure information in response to a verification request, it must provide such data for all cell sites and antennas that serve or affect coverage in the targeted area. Id. at *15, para. 37.
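For context, a submission under this requirement might be represented by a per-cell record along the following lines. The field names are hypothetical, patterned on the Third Order's examples of infrastructure data and the additional parameters adopted in paragraph 109 below; the published BDC data specification, not this sketch, controls the actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CellRecord:
    """Hypothetical per-cell infrastructure record; field names are
    illustrative only."""
    site_id: str
    cell_id: str                  # ties speed measurements to a specific cell
    latitude_deg: float           # GPS Standard Positioning accuracy or better
    longitude_deg: float
    ground_elevation_amsl_m: float
    antenna_height_agl_m: float
    azimuth_deg: float
    downtilt_deg: float
    frequency_bands_mhz: List[float]   # channel bandwidth per band, in MHz
    radio_technologies: List[str]      # e.g., ["LTE", "NR"]
    backhaul_type: str
    backhaul_capacity_mbps: float
    sectors: int
    eirp_dbm: float
    cell_loading_15min: List[float] = field(default_factory=list)  # 15-min samples
```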
As set forth in that notice, staff may use these infrastructure data—in conjunction with link-budget data from the provider, standard sets of clutter and terrain data, other factors, and standardized propagation modeling—to inform our decision about whether the provider has verified its claimed coverage. See id. However, we agree with several commenters that it would be difficult for staff to account for the intricacies of a provider’s dynamic network configuration and replicate provider models with staff’s own propagation models, and that the proposed approach is not necessary to accomplish the Commission’s goals with respect to the verification process. AT&T Reply at 15; CTIA Comments at 21-22; T-Mobile Comments at 22-24; Verizon Comments at 21. Verizon notes its alternative suggestion that the Commission use a statistically based propagation model with provider information to make a reasonable coverage estimate. See Verizon Comments at 21. Rather than attempt to replicate the results of providers’ modeling, we expect that staff will rely on a more flexible approach to its analysis. For example, in appropriate cases staff may choose to estimate a “core coverage area,” in which coverage at the modeled throughput is highly likely to exist, As further discussed below, commenters express concern about staff propagation modeling, calling this modeling expensive and of little use. See, e.g., CTIA Comments at 19, 21-23; T-Mobile Comments at 21-24. and would focus its verification efforts instead on areas outside of that “core coverage area”—but within the service provider’s claimed coverage area (i.e., close to the cell edge)—and may consider other data that could be relevant (e.g., cell loading or signal strength measurements) to determine whether to seek additional information in furtherance of a verification inquiry for areas within the core coverage area. 105. While each analysis will turn on the relevant facts and circumstances, we offer one possible example of the approach in an effort to provide guidance about how the staff’s analysis might work. In this scenario, Commission engineers would first confirm that the backhaul, technology, and other network resources reported for the base station(s) that serve(s) the targeted area are sufficient to meet or exceed the required speed thresholds. Second, staff could use propagation modeling to estimate the provider’s core coverage area within the targeted area using more conservative parameters (including a higher cell edge probability) than required of the propagation modeling the provider used to generate its coverage data. Third, staff could analyze downlink and uplink cell loading data submitted by the provider as part of its infrastructure data to confirm that the median cell loading values are less than or equal to the cell loading factor modeled by the provider (e.g., 50%). Fourth, staff could then evaluate the signal strength information from all available speed test measurements – including those submitted as challenges, crowdsourced data, or on-the-ground data in response to a verification inquiry. For a verification inquiry, the system would evaluate whether any portion of the targeted area falls outside of the staff-determined core coverage area. If the targeted area falls within the core coverage area, then we would consider other relevant evidence (if any) to determine whether further inquiry is necessary or appropriate.
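A minimal sketch of this illustrative sequence follows, assuming the backhaul check, core-coverage modeling, and loading data have already been reduced to simple inputs; the function and parameter names and the 50% default are placeholders, and actual reviews will turn on the facts of each inquiry.

```python
from statistics import median

def review_verification_inquiry(backhaul_ok: bool,
                                target_outside_core: bool,
                                loading_15min_samples: list,
                                modeled_loading_factor: float = 0.50) -> str:
    """Illustrative rendering of the staff review described above; inputs
    are assumed precomputed from infrastructure data and staff modeling."""
    # Step 1: backhaul, technology, and other network resources must be
    # sufficient for the claimed speed tier.
    if not backhaul_ok:
        return "request additional information"
    # Step 3: median cell loading (15-minute intervals, prior week) must
    # not exceed the loading factor the provider modeled.
    if median(loading_15min_samples) > modeled_loading_factor:
        return "request additional information"
    # Steps 2 and 4: conservative staff modeling delimits a core coverage
    # area; verification effort concentrates on portions outside it.
    if target_outside_core:
        return "focus verification on the non-core portion of the area"
    return "weigh other relevant evidence before any further inquiry"
```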
In instances where some or all of the targeted area has been the subject of an earlier verification inquiry that was dismissed after the provider submitted only infrastructure data, we anticipate that staff is unlikely to dismiss the subsequent verification inquiry without on-the-ground speed test data. Moreover, because we consider on-the-ground test data to be generally more probative than infrastructure data alone, see supra Section III.A.2.b. Rebutting Challenges with Infrastructure Data, we retain discretion not to dismiss a verification inquiry under this framework where there is additional evidence calling into question the reliability of the provider’s infrastructure data or propagation model or where staff engineers, in their expert judgment, determine that on-the-ground speed tests are necessary to verify the targeted area and fulfill the Commission’s obligations under the Broadband DATA Act. See 47 U.S.C. § 642(b)(4)(B). 106. In cases where staff’s analysis indicates that infrastructure data alone would be insufficient to resolve the verification inquiry, staff may determine to sample a new set of areas and in appropriate cases may also take into account additional infrastructure data and information on the core coverage areas, where staff expect adequate coverage is highly likely. In the BDC Mobile Technical Requirements Public Notice, we sought comment on “other variables which correlate with broadband availability and upon which stratification should be based,” as well as “the tradeoffs of setting a higher or lower confidence level for this verification process than the thresholds established for the challenge process.” See BDC Mobile Technical Requirements Public Notice at *14, para. 32. Due to the high assumed pass rate in core coverage areas, we would ordinarily expect that most of the sampling would be targeted at non-core coverage areas. Staff could then request additional information, such as on-the-ground data, to complete the verification process. BDC Mobile Technical Requirements Public Notice at *13, *15, paras. 29, 37. Staff may also consider infrastructure data independently and review them for anomalies. 107. Several commenters argue that Commission staff should not generate propagation models with the submitted infrastructure information or do so only in limited cases. AT&T Reply at 12, 14-16; CTIA Comments at 19, 21-23; T-Mobile Comments at 21-24; Verizon Comments at 21. For example, Verizon urges Commission staff to limit predictive studies to localized examinations of the reasonableness of a service provider’s map and clarify that successful speed test data would preclude staff propagation modeling or outweigh countervailing staff propagation modeling results. Verizon Comments at 21-22 (arguing that if a provider responds to a verification request with speed test data meeting the statistical tests, then staff may not also request infrastructure data or, at least, speed test data should carry more evidentiary weight than any staff propagation modeling). We clarify that where a provider submits valid speed test data in sample-selected areas, staff propagation studies based on infrastructure data should not be necessary. We also clarify that while staff has the option to create predictive maps based on providers’ infrastructure data, we are not required to do so.
However, the option to conduct staff propagation studies is a tool we must retain for the analysis of collected infrastructure data and the fulfillment of our obligations under the Broadband DATA Act. 108. Initial Verification of Claimed Coverage. We adopt our proposal to perform initial verification of claimed coverage as an alternative way to use infrastructure data to assess providers’ coverage data. BDC Mobile Technical Requirements Public Notice at *15, para. 38. We will compare the provider’s link budget and infrastructure data with other available data for accuracy, such as Antenna Structure Registration and spectrum licensing data. Id. If staff believe, after making these comparisons, that there is a technical flaw in a provider’s maps (e.g., a model was run with the wrong parameters), we will then determine whether this flaw would result in a significant difference in coverage. If staff’s estimation of speed (e.g., resulting from staff-performed propagation modeling or other related calculations), along with the available crowdsourced data at the challenged locations, does not predict speeds at or above the minimum values reported in the provider’s submitted coverage data, Commission staff will consider any additional information submitted by the provider or request other data from the provider; other data may include on-the-ground data. Id. No commenters addressed this alternative to perform initial verification of claimed coverage. 109. Additional Required Infrastructure Information. We adopt the proposal to expand the categories of infrastructure information that providers must submit. Id. at *16, para. 39. As anticipated, Id. we find that such information is necessary to analyze verification inquiries adequately. In addition to the types of infrastructure information listed as examples in the Third Order, Third Order, 36 FCC Rcd at 1147-48, para. 52 (listing examples of infrastructure data to include: “(1) the latitude and longitude of cell sites; (2) the site ID number for each cell site; (3) the ground elevation above mean sea level (AMSL) of the site (in meters); (4) frequency band(s) used to provide service for each site being mapped including channel bandwidth (in megahertz); (5) the radio technologies used on each band for each site; (6) the capacity (Mbps) and type of backhaul used at each cell site; (7) the number of sectors at each cell site; and (8) the Effective Isotropic Radiated Power (EIRP, in dBm) of the sector at the time the mobile provider creates its map of the coverage data”) (footnotes omitted). We add the cell ID number for each cell site to the requirements. As with the site ID number, the cell ID number is needed to associate mobile speed measurements with a particular cell site for our performance verification analyses. We slightly modify the metrics for collecting geographic coordinates to require that coordinates be measured with typical GPS Standard Positioning Service accuracy or better. While we find that typical GPS Standard Positioning Service would provide a minimum level of information on the collected cell site positions, we allow for the submission of more precise location data as well.
providers must submit the following parameters: (1) geographic coordinates of each transmitter measured with typical GPS Standard Positioning Service accuracy or better; (2) per-site classification (e.g., urban, suburban, or rural); (3) elevation above ground level for each base station antenna and other transmit antenna specifications (i.e., the make and model, beamwidth (in degrees), radiation pattern, Radiation pattern was not included in the BDC Mobile Technical Requirements Public Notice. However, this parameter serves the same purpose as the proposed parameters. The radiation pattern of the transmitting antenna is necessary to perform or verify coverage studies. Every commercial antenna has radiation pattern data, which can be derived from the antenna model. This information allows the verification process to be streamlined because the data are readily available and were used by providers to perform their coverage studies. and orientation (azimuth and any electrical and/or mechanical down-tilt in degrees) at each cell site); (4) operating transmit power of the radio equipment at each cell site; (5) throughput and associated required signal strength and signal-to-noise ratio; (6) cell loading distribution; To facilitate staff confirmation of cell loading assumptions pursuant to the core coverage area methodology we discuss, we will require providers to submit information on the actual loading for each cell site that serves the targeted area, including, for example, the average number of active radio resource control channel users and average bandwidth carrying user traffic for both the downlink and uplink carriers measured in 15-minute intervals for the one-week period before the provider received the verification inquiry. (7) areas enabled with carrier aggregation and a list of band combinations; Originally, we proposed to collect data on the percentage of handset population capable of using this band combination. See BDC Mobile Technical Requirements Public Notice at *16, para. 39. Commenters argued against collecting additional infrastructure information that was not necessary. See, e.g., Verizon Comments at 16-17, 21. We do not include the percentage of handset population capable of using this band combination because we do not view this information as necessary to our analysis and requiring this would unnecessarily burden a provider. and (8) any additional parameters and fields that are listed in the most-recent specifications for wireless infrastructure data adopted by OEA and WTB in accordance with 5 U.S.C. § 553. Concurrent with release of this Order, we have published the full data specifications for mobile infrastructure data in advance of the initial biannual filing window. Broadband Data Task Force and Office of Economics and Analytics Publish Additional Data Specifications for the Submission of Mobile Speed Test and Infrastructure Data into the Broadband Data Collection, Public Notice, DA 22-242 (BDTF/OEA 2022). The specifications for infrastructure data include additional fields derived from the high-level metrics defined herein, as well as other identifiers to facilitate management of the submission of such data. 110. Some commenters argue that the Commission should not require infrastructure data fields beyond what was required in the Third Order. See, e.g., CTIA Comments at 19; AT&T Reply at 13-14; Verizon Comments at 16. Verizon advocates deleting proposed fields it called unnecessary, unclear, or unable to be readily provided.
Verizon Comments at 16-17, 21 (advocating deleting the proposed fields of (1) throughput and associated required signal strength and signal-to-noise ratio; (2) cell loading distribution; and (3) areas enabled with carrier aggregation and a list of band combinations). CTIA says the “Bureaus should not second-guess a provider’s cell-loading factor if the data indicates higher than average cell loading in a given area at a given time.” CTIA Comments at 21. CTIA also urges the Commission not to collect additional infrastructure information due to its sensitive and confidential nature and the burdens this collection would impose; CTIA contends this collection would be inconsistent with the Broadband DATA Act, and staff should rather tailor its requests to specific issues after discussion with the provider. Id. at 19-20. 111. The data fields we adopt here are necessary to help predict more precisely the users’ speeds, and the potential burdens of providing these data are outweighed by the necessity of the information. To elaborate, required signal strengths and signal-to-noise ratio (SNR) data are critical factors that enable or impede the speed at which users may connect and are thus required to estimate the users’ speeds. Cell loading distribution comprises the measured cell loading observed for each cell over time (e.g., every 15 minutes or less for each cell on the day of interest). Cell loading distribution is also necessary to calculate the final users’ speeds and analyze challenges, as evidenced by the inclusion of a minimum 50% cell loading specification in the Broadband DATA Act. 47 U.S.C. § 642(b)(2)(B)(ii)(I)(bb). A provider’s measured cell loading factor is the best way to verify actual cell loading; the cell loading factor is not being second-guessed. In areas with carrier aggregation, a list of spectrum band combinations used for carrier aggregation is necessary to analyze the capacity of the cell, and will be used in conjunction with cell loading data to evaluate more precisely the disputed areas of the coverage map. More detailed infrastructure data specifications are listed in § 1.7006(c)(2) of the Final Rules Appendix. Infra Appx. B – Final Rules § 1.7006(c)(2). 112. While we do not prioritize one information source over another, we noted above that where providers’ responses to verification inquiries include valid speed test data for each sampled area, staff propagation studies based on infrastructure data should not be necessary. See supra para. 107. As previously noted, we are sensitive to confidentiality and security concerns in the collection of mobile infrastructure information, and infrastructure information submitted by providers at the request of staff will be treated as presumptively confidential. Third Order, 36 FCC Rcd at 1148-49, para. 55. We are also sensitive to not imposing undue burden on providers and have therefore not mandated the submission of infrastructure data in response to every verification inquiry. Id. at 1149, para. 56. We may engage in discussions with a provider when necessary, after which we can request specific areas in which to collect the data. When staff find that infrastructure data are necessary to verify coverage consistent with the Broadband DATA Act, the infrastructure data fields enumerated herein are necessary for staff to carry out that obligation. 5. Transmitter Monitoring Information 113.
The Commission directed OEA and WTB to review transmitter monitoring information submitted voluntarily by providers in addition to on-the-ground and infrastructure information. Id. at 1146, para. 47, n.157. T-Mobile asserts that providers should be allowed to submit data from alternative sources, including transmitter monitoring information, to satisfy verification requests. T-Mobile Comments at 25-26. Verizon states that transmitter monitoring information “provides a comprehensive picture of network performance.” Verizon Comments at 17. We agree that these data could be helpful to the extent that they substantiate potential reasons for service disruptions during the time interval in which measurements were performed. Therefore, we will consider transmitter monitoring information voluntarily submitted by a provider in addition to on-the-ground testing or infrastructure data in response to a verification inquiry. BDC Mobile Technical Requirements Public Notice at *16, para. 41. We do not believe, however, that the record supports a finding that such data constitute a sufficient substitute for the on-the-ground testing or infrastructure data required by the Third Order to respond to a verification inquiry. Third Order, 36 FCC Rcd at 1146-51, paras. 50-60. C. Collecting Verified Broadband Data from Government Entities and Third Parties 114. We adopt our proposal for governmental entities and third parties to submit verified on-the-ground test data using the same metrics and testing parameters that mobile providers must use when submitting on-the-ground test data in response to a verification request. See supra Section III.B.3. Collecting Verification Information from Mobile Providers, On-the-Ground Test Data; see also BDC Mobile Technical Requirements Public Notice at *17, para. 44. We also note, as set forth in the Third Order, government and other third-party entities that submit verified broadband availability data must file their broadband availability data in the same portal and under the same parameters as providers. Third Order, 36 FCC Rcd at 1152, para. 63. This includes a certification by a certified professional engineer that he or she is employed by the government or other third-party entity submitting verified broadband availability data and has direct knowledge of, or responsibility for, the generation of the government or other entity’s Broadband Data Collection coverage maps. Third Order, 36 FCC Rcd at 1144-45, para. 43; 47 CFR § 1.7004(d). We find that assigning consistent, standardized procedures for governmental entities and third parties to submit on-the-ground data is necessary to ensure that the Commission receives consistent, reliable data and that the broadband availability maps are as accurate and precise as possible. The record supports this approach. T-Mobile Comments at 26-27. Next Century Cities advocates that the Commission develop outreach and explanatory materials to encourage participation from state and local leaders, Next Century Cities Reply at 2. and we will make such materials available to state, local, and Tribal government entities filing verified data. We are mindful of Precision Ag Connectivity & Accuracy Stakeholder Alliance’s concerns that imposing these standards will not result in the submission of verified data from governmental entities and third parties.
PAgCASA Reply at 2, 7 (expressing concern that the Commission’s methodology will not solicit verified data appropriately, and thus will neither result in the submission of verified data from governmental and third-party entities nor provide a user-friendly process). We believe, however, that this approach is the most efficient and effective way for providers and staff to review verified data from governmental entities and third parties. This approach minimizes variables between different datasets and thus helps ensure that staff and other parties may more efficiently and effectively evaluate competing data (e.g., verified on-the-ground tests submitted by a governmental entity versus on-the-ground tests conducted by the provider) with an apples-to-apples comparison to determine the source of any data discrepancies. Assigning consistent, standardized procedures for governmental entities and third parties to submit verified on-the-ground data is appropriate and necessary to ensure the broadband availability maps are as accurate and precise as possible. 115. We also adopt our proposal that, to the extent the Commission is in receipt of verified on-the-ground data submitted by governmental entities and third parties, such data may be used when the Commission conducts analyses as part of the verification processes and will be treated as crowdsourced data. BDC Mobile Technical Requirements Public Notice at *17, para. 45. Governmental entities and third parties may also choose to use these data to submit a challenge, provided they meet the requirements for submission of a challenge under the Commission’s rules. Id. 116. Enablers advocates that the Commission create a “strong active testing-based verification layer with sampling of nationwide coverage” and revisit the decision to require propagation maps instead of continuous drive testing. Enablers Comments at 6-7. To that end, Enablers notes that its solution allows for cost-effective, continuous active testing by third parties to better produce statistically valid samples and advocates that its approach be adopted. Id. at 7-8. To the extent that government entities and third parties choose to submit verified data, we note that the Commission requires them to submit their data under the same parameters as providers. Third Order, 36 FCC Rcd at 1152, para. 63. The Bureau and Offices lack the authority to override decisions by the full Commission. We note, however, that if Enablers or other parties submit crowdsourced data consistent with the specifications outlined below, we will treat those data as such. See infra Section III.D. Crowdsourced Data; see also Appx. A – Technical Appendix § 5. D. Crowdsourced Data 117. The Broadband DATA Act requires the Commission to “develop a process through which entities or individuals . . . may submit specific information about the deployment and availability of broadband internet access service . . . on an ongoing basis . . . to verify and supplement information provided by providers.” 47 U.S.C. § 644(b). In the Second Order, the Commission adopted a crowdsourcing process to allow individuals and entities to submit such information. Second Order, 35 FCC Rcd at 7487, para. 64.
The Commission required that crowdsourced data filings contain: the contact information of the filer, the location that is the subject of the filing (including the street address and/or GPS coordinates of the location), the name of the provider, and any relevant details about the deployment and availability of broadband Internet access service at the location. 47 CFR § 1.7006(b)(1)(i)-(iv); see also Second Order, 35 FCC Rcd at 7489, para. 69. The Commission also required that crowdsourced data filers certify that, “to the best of the filer’s actual knowledge, information, and belief, all statements in the filing are true and correct.” 47 CFR § 1.7006(b)(1)(v); see also Second Order, 35 FCC Rcd at 7489, para. 70. As the Commission has clarified, the Bureau and Offices, together with the Wireline Competition Bureau (WCB), will use crowdsourced data to “identify[] trends,” and “individual instances or patterns of potentially inaccurate or incomplete deployment or availability data that warrant further investigation or review.” Second Order, 35 FCC Rcd at 7490, para. 72. Crowdsourced information is intended to “verify and supplement information submitted by providers for potential inclusion in the coverage maps.” See 47 CFR § 1.7006(b). Notably, the Commission also expressly reserved the right to investigate provider filings in instances that warrant further investigation based on the specific circumstances presented by crowdsourced data. Second Order, 35 FCC Rcd at 7491, para. 74. 118. We provide further guidance and adopt rules regarding the crowdsourced data process as described below. We provide additional information about updates we are making to the FCC Speed Test app’s technical standards and requirements to configure the app for submission of mobile challenge and crowdsourced data. We also outline the procedures OET will follow for approving third-party speed test apps for these purposes. We establish requirements for consumers and other entities to submit any crowdsourced data to the online portal using the same parameters and metrics providers would use when submitting on-the-ground data in response to a Commission verification request, with some simplifications, as described above. See BDC Mobile Technical Requirements Public Notice at *21, para. 55. Finally, we provide guidance on our methodology for evaluating mobile crowdsourced data through an automated process – a process that will assist us in establishing when crowdsourced data filings reach a “critical mass” sufficient to merit further inquiry. See id. at *21-*22, paras. 56-57. Once the automated process identifies areas where verification may be warranted, Commission staff will conduct an evaluation based upon available evidence such as speed test data, infrastructure data, crowdsourced and other third-party data, as well as staff’s review of submitted coverage data (including maps, link budget parameters, and other credible information) to determine whether a credible basis for conducting a verification inquiry has been established using the standards outlined in greater detail below. 1. Tools to Submit Crowdsourced Data 119. 
In the BDC Mobile Technical Requirements Public Notice, the Bureau and Offices proposed a process for consideration of crowdsourced data submitted through data collection apps used by consumers and other entities, including methods to prioritize the consideration of crowdsourced data submitted through apps that are determined to be “highly reliable” and that “have proven methodologies for determining network coverage and network performance.” Id. at *19-20, paras. 52-53; see also Second Order, 35 FCC Rcd at 7488, para. 66 (quoting 47 U.S.C. § 644(b)(2)(A)). We noted that the Commission directed the Bureau and Offices (along with WCB) to consider “(1) whether the application uses metrics and methods that comply with current Bureau and Office requirements for submitting network coverage and speed test data in the ordinary course; (2) whether the speed test app used has enough users that it produces a dataset to provide statistically significant results for a particular provider in a given area; and (3) whether the application is designed so as not to introduce bias into test results.” BDC Mobile Technical Requirements Public Notice at *19, para. 52; see also Second Order, 35 FCC Rcd at 7488, para. 66. The Bureau and Offices noted that “data submitted by consumers and other entities that do not follow any specific metrics and methodologies may be less likely to yield effective analysis and review . . . of providers’ mobile broadband availability.” BDC Mobile Technical Requirements Public Notice at *21, para. 55. Commenters did not provide any suggestions or recommendations on how to prioritize consideration of crowdsourced data. 120. We find that the FCC Speed Test app is a reliable and efficient tool for users to submit crowdsourced mobile coverage data to the Commission. See id. at *19, para. 52. The FCC Speed Test app allows users to submit specific information about the availability of mobile broadband service and its performance and meets the requirements outlined in the Commission’s Second Order. Id.; see also Second Order, 35 FCC Rcd at 7488, para. 66. We also make clear that we will include both stationary and mobile in-vehicle speed test results in crowdsourced data. See Ookla Comments at 11-12 (requesting clarification that crowdsourced data will also include stationary observations). Specifically, we find the FCC Speed Test app sufficiently meets the considerations that the Commission set forth. First, we find the FCC Speed Test app uses metrics and methods that comply with current requirements for submitting network coverage and speed test data in the ordinary course. These include upload speed, download speed, latency and other network performance metrics. See 2021 FCC Speed Test App Technical Description (“The App performs the following active tests of mobile broadband performance: Download speed . . . Upload speed . . . Latency . . . Packet loss . . . Jitter . . . .”). These metrics are consistent with the network performance metrics required to be collected by the Commission under the 2020 Broadband DATA Act and the 2008 Broadband Data Improvement Act. 47 U.S.C. §§ 642(b), 1303(c) (requiring the Commission generally to collect download speed, upload speed, latency, “other sources of broadband service capability which consumers regularly use or on which they rely,” and “any other information the Commission deems appropriate for such purpose”). Next, we find that the FCC Speed Test app is designed to minimize bias in test results. 
See 2021 FCC Speed Test App Technical Description at 5-11 (describing generally how the FCC Speed Test app selects its measurement server and carefully conducts its performance measurements over a predetermined time duration to “minimize[] extraneous factors that could degrade a statistically accurate measure of mobile broadband performance.”) The FCC Speed Test app’s test system architecture implements dedicated off-net servers hosted by a Content Delivery Network (CDN) to provide robust and reproducible test results for effective representation of network performance. The test servers are deployed at major Tier 1 peering/transit locations to minimize bias, a practical approach to measuring network performance. With regard to whether the FCC Speed Test app produces a dataset sufficient to provide statistically significant results for a particular provider in a given area as it pertains to crowdsourced data, we note that we will not be analyzing speed test results from the FCC Speed Test app in isolation. Rather, we will aggregate and/or cluster all speed tests conducted with the FCC Speed Test app—along with those conducted with an authorized third-party speed test app and those conducted by government or other entities using their own hardware or software—for a particular provider in a particular area during our analysis, as described further below. We anticipate that this aggregation and/or clustering process will lead to statistically valid results by provider and geographic area. We therefore find that the FCC Speed Test app meets the required criteria and is a reliable, efficient method for interested parties to use when submitting crowdsourced mobile coverage data to the Commission. See also Section III.A.1. Creating a Challenge/Cognizable Challenges (discussion of authorized speed test apps and use of data submitted using them). 121. As discussed, OET maintains a technical description that describes the metrics and methodologies used in the existing FCC Speed Test app. See generally 2021 FCC Speed Test App Technical Description. We note that RWA requests that the FCC Speed Test app display whether users are roaming and, if so, identify the roaming network. RWA Comments at 14-15. The FCC Speed Test app currently has the ability to provide network roaming information via the app’s local data export feature for download and upload speed tests and latency tests; however, this capability is not available for Apple iOS devices because certain technical network information and RF metrics are currently not available on those devices. See 2021 FCC Speed Test App Technical Description; see also BDC Mobile Technical Requirements Public Notice at *6, para. 14. For example, an Android device should report into the BDC system both the subscriber’s mobile service provider and the identity of the roaming service provider operating the mobile network in that particular area (plus display a flag indicating whether the device is roaming). See, e.g., 2021 FCC Speed Test App Technical Description at Table 1 – Android Data Dictionary (showing the data fields that are collected during the tests for Android devices). In contrast, an iOS device in that same area would only report into the BDC system the identity of the subscribed mobile service provider rather than the roaming service provider operating the network in that particular area, and there would be no flag indicating whether the device is roaming.
See, e.g., 2021 FCC Speed Test App Technical Description at Table 2 – iOS Data Dictionary (showing the data fields that are collected during the tests for iOS devices). In order to ensure ample public participation in the crowdsourcing process, we clarify that consumers wishing to submit crowdsourced data may use a device running either the iOS or Android operating system to collect speed test data and submit it as crowdsourced information; for the same reasons discussed above, however, we require government, other third-party, and provider entities to collect all of the required technical network information and RF metrics using a device that can interface with drive test software and/or runs the Android operating system. See supra Section III.A.1. Creating a Challenge/Cognizable Challenges (discussion of requirement for government, provider, and third-party entities to use a device that is able to interface with drive test software and/or runs the Android operating system). We also clarify, as discussed earlier, that speed tests conducted by a customer of an MVNO will be considered and evaluated as crowdsourced data. See id. (discussion of speed tests conducted and submitted by consumers who are subscribers roaming on another provider’s network or subscribers of an MVNO will be considered and evaluated as crowdsourced data). 122. Regarding third-party speed test apps used to collect challenge and crowdsourced data on mobile wireless broadband availability, the BDC system will accept challenge and crowdsourced data from third-party applications approved by OET that collect the required data set forth in the relevant data specification for mobile challenge and crowdsourced data (e.g., contact information, geographic coordinates, and required certifications) and in a format that comports with the application programming interface (API) for the backend of the BDC system. To the extent that consumers and other entities choose to submit on-the-ground crowdsourced mobile speed test data, such data will be collected using a similar measurement methodology as the FCC Speed Test app and submitted in a similar format to that which challengers and providers will use when submitting speed tests. See BDC Mobile Technical Requirements Public Notice at *20, *21, paras. 53, 55. We will thus only find third-party apps to be “highly reliable” and to “have proven methodologies for determining network coverage and network performance” if OET has approved them based upon the processes and procedures we will adopt for review of third-party apps for use in the mobile challenge process, and we will only allow for submission of crowdsourced data from such approved apps. As noted above, OET will release a public notice announcing the process for approving third-party apps for use in the mobile challenge process, inviting third-party app proposals, and seeking comment on third-party apps being evaluated. As previously mentioned, OET will announce and publish a webpage to maintain a list of approved third-party apps and any available data specifications for third-party apps. See supra Section III.A.1. Creating a Challenge/Cognizable Challenges (discussion of technical requirements of FCC Speed Test app and approval of third-party speed test apps). We also will consider as crowdsourced data speed tests taken with an authorized app that do not meet the criteria needed to create a cognizable challenge or are otherwise not intended to be used to challenge the accuracy of a mobile service provider’s map. See id. 
(discussion of authorized speed test apps and use of data submitted using them). 123. Finally, we recognize that changes in technology and other considerations may require us to periodically reevaluate these initial determinations in order to satisfy the Act’s provisions for submitting crowdsourced data. The Bureau and Offices will modify the process for collecting mobile crowdsourced data over time, as experience dictates is necessary and appropriate, to improve our procedures and ensure that the maps we make are as reliable and accurate as possible. BDC Mobile Technical Requirements Public Notice at *21, para. 55 (citing Second Order, 35 FCC Rcd at 7488-89, para. 68). 2. Crowdsourced Data Submitted in the Online Portal 124. We will use crowdsourced data to “identify individual instances, or patterns of potentially inaccurate or incomplete deployment or availability data that warrant further investigation or review.” Second Order, 35 FCC Rcd at 7490, para. 72. In light of this purpose, we believe it is reasonable to provide those collecting crowdsourced data with increased flexibility to facilitate making the process more user-friendly. Specifically, on-the-ground crowdsourced data must include the same parameters and metrics as required for on-the-ground speed test data submitted through the mobile service challenge process, except that we will allow on-the-ground crowdsourced data to include any combination of download speed and upload speed (rather than both). See infra Appx. B, para. 2 (47 CFR § 1.7006(b)(2)). Crowdsourced data should include valid on-the-ground speed tests, which will be categorized and evaluated as “positive” or “negative” tests based on their upload and download components, similar to speed tests in the challenge process. See supra Section III.A. Mobile Service Challenge Process. In the BDC Mobile Technical Requirements Public Notice, the Bureau and Offices noted that the Commission directed them, together with WCB, to establish and use an online portal for crowdsourced data filings and to use the same portal for challenge filings. BDC Mobile Technical Requirements Public Notice at *21, para. 55; see also infra Appx. B, para. 2 (47 CFR § 1.7006(b)(3)). The Bureau and Offices will release additional guidance on how consumers and other entities can use the online portal to submit crowdsourced data once the portal is available. 125. Staff will validate submitted crowdsourced speed test data and exclude tests that are, for example, anomalous, For example, “anomalous” speed tests may include test results with actual data errors, device errors, or evident mistaken readings or internally inconsistent results. See also Mobility Fund Phase II Coverage Maps Investigation Staff Report, GN Docket No. 19-367, at 56-57, Appx. B, https://www.fcc.gov/document/mf-ii-coverage-maps-investigation-staff-report, for a more complete description of anomalous and problematic speed test data that Commission staff found during that investigation that nevertheless passed the automated system validations. that do not conform to the data specifications, or that do not otherwise present reliable evidence, and will then evaluate the crowdsourced data as described further below to determine whether a critical mass of crowdsourced filings suggests that a provider has submitted inaccurate or incomplete information.
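As a rough illustration of this validate-then-evaluate flow, the sketch below screens out invalid tests and then counts the remaining negative tests by resolution 8 hexagon as a crude stand-in for the clustering step described in the next subsection. The validity checks and the minimum-count threshold are assumptions for illustration only; the adopted validations and critical-mass methodology are those described in this Order.

```python
from collections import Counter

def is_valid(test):
    """Basic validity screen; the staff validations described above are
    broader (e.g., device errors, internally inconsistent results)."""
    dl, ul = test.get("download_mbps"), test.get("upload_mbps")
    reported = [v for v in (dl, ul) if v is not None]
    plausible = all(0 <= v < 10_000 for v in reported)
    return bool(reported) and plausible and bool(test.get("h3_res8"))

def is_negative(test, req_dl, req_ul):
    """Crowdsourced tests may report download, upload, or both; a test is
    negative if any reported component falls short of the required speed."""
    dl, ul = test.get("download_mbps"), test.get("upload_mbps")
    return (dl is not None and dl < req_dl) or (ul is not None and ul < req_ul)

def candidate_areas(tests, req_dl, req_ul, min_negatives=20):
    """Count negative tests per resolution 8 hexagon and flag groupings
    large enough to merit staff review; min_negatives is a placeholder,
    not an adopted threshold."""
    counts = Counter(t["h3_res8"] for t in tests
                     if is_valid(t) and is_negative(t, req_dl, req_ul))
    return [hex_id for hex_id, n in counts.items() if n >= min_negatives]
```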
This approach helps ensure that the crowdsourced data staff analyzes are valid and reliable while also affording consumers some added flexibility by allowing on-the-ground crowdsourced data to include any combination of download speed and/or upload speed rather than both. Similarly, mobile providers will be notified of a crowdsource filing but will not be required to respond to crowdsource filings unless and until Commission staff request that they do so, based on the procedures outlined below. See infra Appx. B, para. 2 (47 CFR § 1.7006(b)(3)). We believe this process is an efficient and effective way for staff to analyze and review a provider's mobile broadband availability using crowdsourced data. 126. T-Mobile supports making certain speed test metrics optional for crowdsourced data and not requiring providers to respond automatically to crowdsourced data filings, stating that these measures are appropriately tailored and will serve to limit burdens on providers without compromising the need for the Commission to ensure that it receives verified and reliable data. T-Mobile Comments at 26. We agree that making certain test metrics optional for the crowdsourced data filings and also not requiring providers to respond to crowdsourced data filings (absent a Commission inquiry) However, the crowdsourced data portal will alert providers when crowdsourced filings are made concerning their data and providers may voluntarily choose to respond to crowdsourced data filings. See Second Order, 35 FCC Rcd 7490, para. 71. serves to limit the burdens on filers and providers without compromising the reliability of the crowdsourced data, with the goal of obtaining crowdsourced data that are as broad and robust as possible. 3. When Crowdsourced Filings Reach a "Critical Mass" 127. In the Second Order, the Commission directed staff to initiate inquiries when a "critical mass" of crowdsourced filings suggests that a provider has submitted inaccurate or incomplete information and directed us to provide guidance on when crowdsourced filings reach such a critical mass. Second Order, 35 FCC Rcd at 7491, para. 74 & n.211. We sought comment in the BDC Mobile Technical Requirements Public Notice on when inquiries based on a critical mass of crowdsourced filings could be initiated. BDC Mobile Technical Requirements Public Notice at *21-*22, paras. 56-57. Specifically, we proposed to evaluate crowdsourced data in the first instance with an automated process to identify areas that would trigger further review. Id. at *21, para. 56. 128. Establishing Critical Mass. We adopt our proposal and will evaluate mobile crowdsourced data through a combination of automated processing and further review by Commission staff. As described in more detail below, the automated process will identify areas for further review by first excluding or "culling" any anomalous or otherwise unusable speed test information and then using data clustering to identify groupings of potential targeted areas, where a provider's coverage map may be inaccurate, that would trigger further review. Staff will then review the identified potential targeted areas and any other relevant data to confirm whether each cluster presents a credible basis to warrant verification. Id.; see also Third Order, 36 FCC Rcd at 1146, para. 47. Under this approach, areas identified from crowdsourced data using this methodology would be subject to a verification inquiry consistent with the mobile verification process adopted herein. BDC Mobile Technical Requirements Public Notice at *21, para.
56; see also supra Section III.B. Collecting Verification Information from Mobile Providers. 129. We note that commenters generally support our proposals regarding when crowdsourced data should trigger an inquiry about the accuracy of a provider's broadband mapping information. Verizon, for example, finds reasonable our proposals regarding which crowdsourced information to consider. Verizon Comments at 23. Specifically, Verizon states that it is reasonable for the Commission to accept as crowdsourced information speed tests taken with an authorized app that do not meet the criteria needed to create a cognizable challenge or are otherwise not intended to be used to challenge the accuracy of a mobile service provider's map. Id. Additionally, Verizon states the Commission should adopt the proposal to permit consumers and other entities to submit crowdsourced data collected using either the FCC Speed Test app or other speed test apps approved by OET. Id. Furthermore, T-Mobile supports our proposal to initiate an inquiry when crowdsourced data suggest that a provider has submitted inaccurate or incomplete coverage data. T-Mobile Comments at 26. Ookla agrees, pointing out that "crowdsourcing allows for the rapid, cost-effective collection of actionable, accurate broadband data." Ookla Reply at 1, 3-4; see also Next Century Cities Reply at 1 (stating crowdsourced data can promote accurate results because its inclusion can increase accuracy and reliability by reducing the Commission's reliance on self-reported service provider data). 130. We expect that the minimum data standards and structured vetting process we adopt for evaluating crowdsourced data, described below, will address concerns about any bias in, and the reliability of, the crowdsourced data collected. For example, because the automated process we describe below will filter out anomalies or other unusable speed test information, we believe this filtering process sufficiently addresses Verizon's concerns about including inaccurate speed test information in any crowdsourced dataset due to possible varying test conditions. Verizon Comments at 23 (noting that inaccuracies may arise due to there being little control over the testing conditions such that the crowdsourced dataset may include tests conducted indoors, with old or defective devices, or subject to reduced speeds due to plan limits). Further, because the process will also employ a clustering methodology to identify trends or patterns suggesting persistent coverage issues over time, we believe the crowdsourced data will be an efficient and effective means of informing, but not deciding, our evaluation of a provider's claimed deployment and availability of broadband Internet access service, and will thereby be an important part of the Commission's available data verification options. See, e.g., Second Order, 35 FCC Rcd at 7487-92, paras. 64-76. 131. Other commenters offer different views regarding our proposal to evaluate crowdsourced data. RWA requests more clarity, suggesting that we define what the "critical mass" is to trigger an inquiry in rural and urban areas. RWA Comments at 4. Public Knowledge/New America, seeking to bolster the usefulness and value of crowdsourced information, opposes our proposal to initiate a verification inquiry only when there is a "critical mass of" crowdsourced data.
Public Knowledge/New America Reply at 6; see also Public Knowledge/New America August Letter (stating "[f]acilitating the crowdsourcing of speed tests to inform the challenge process is likewise essential to ensure that the process is well-informed and not limited to the challenges and data provided by rival providers with the financial self-interest, resources and expertise to engage in the process"). Instead, they argue that staff should make it easier for crowdsourced data to inform our verification inquiries. Public Knowledge/New America Reply at 6. We find that the requirement we adopt to initiate an inquiry in response to crowdsourced data when a critical mass of these data suggest that a provider has submitted incomplete or inaccurate information strikes the best balance. This approach allows the crowdsourcing process to highlight problems with the accuracy of a provider's mobile broadband coverage maps and is an important tool in the Commission's verification process. As Ookla observes, "crowdsourcing uses large numbers of samples to identify useful conclusions." Ookla Comments at 4. The crowdsourcing process we adopt gives interested filers a user-friendly, cost-effective way to provide crowdsourced data to the Commission without requiring providers to respond automatically to such filings. Because the process is user-friendly, we also believe it will incentivize greater participation in the crowdsourced data gathering process. We believe this strikes the right balance and helps us ensure more reliable mobile broadband coverage data. 132. Automated Process. We will evaluate mobile crowdsourced data first through an automated process to identify potential areas that warrant further review and evaluation by Commission staff. See BDC Mobile Technical Requirements Public Notice at *21, para. 56. Specifically, we adopt a modified version of our proposal in the BDC Mobile Technical Requirements Public Notice regarding the automated process and will evaluate crowdsourced filings using a two-step process, first excluding any anomalous or otherwise unusable tests submitted as crowdsourced data and then using data clustering (an industry-standard technique for grouping GIS data) to identify potential targeted areas where crowdsourced tests indicate a provider's coverage map is inaccurate. Id. at *21-*22, paras. 56-57. Areas identified by the automated process would then be subject to further review and evaluation by Commission staff, who will consider available evidence (such as speed test data, infrastructure data, crowdsourced and other third-party data, and submitted coverage data, including maps, link budget parameters, and other credible information) to determine whether a credible basis for conducting a verification inquiry has been established and whether a verification request is appropriate. 133. More particularly, the automated process will involve an analysis at the end of each month Evaluating this crowdsourced data on a monthly basis mirrors the challenge process because we are also evaluating challenge data on a monthly basis and notifying providers about challenged hexagons at the end of each month. See supra Section III.A.1. Creating a Challenge/Cognizable Challenges. that will include aggregating the crowdsourced data into H3 hexagons at resolution 8 and categorizing each hexagon for purposes of further analysis. Next, we will apply a clustering algorithm to spatially cluster these hexagons. Hahsler, M., Piekenbrock, M., Doran, D.
(2019). "dbscan: Fast Density-Based Clustering with R." Journal of Statistical Software, 91(1), 1–30. doi: 10.18637/jss. We will track the growth of these clusters of hexagons over time and, if the level of negative speed tests persists for three consecutive months, will determine whether the crowdsourced data have reached a "critical mass" warranting verification. This process is described in more detail in the Technical Appendix. See Appx. A – Technical Appendix § 5. We note that the Density Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm we will employ is one of the 10 default tools for clustering GIS data in the industry standard ESRI ArcGIS software and is one of the most commonly used methods for performing this type of data clustering analysis. See Mitchell, Andy (2021). The ESRI Guide to GIS Analysis, Volume 2: Spatial Measurements and Statistics, second edition. ISBN: 9781589486089, eISBN: 9781589486096; see also Appx. A – Technical Appendix § 5. 134. Verizon opposes the use of an automated process to analyze crowdsourced data as well as the use of data clustering to identify potential targeted areas where crowdsourced tests indicate that a provider's coverage map is inaccurate, and asks that, should we adopt these proposals, we provide more detail about their mechanics and seek further comment on the proposed algorithm, data sources, and criteria the processes will use for identifying potential targeted areas for further review and evaluation. See Verizon Comments at 23-24. We proposed to use an automated process to identify potential areas that would trigger further review using a methodology similar to the mobile verification process, with certain simplifications. See BDC Mobile Technical Requirements Public Notice, at *21, para. 56. More specifically, we proposed to use data clustering to identify potential targeted areas where crowdsourced tests suggest that a provider's coverage map is inaccurate and also sought comment on any alternative methods for determining when a critical mass of crowdsourced filings suggests a provider may have submitted inaccurate or incomplete information. See id. We did not receive any comments suggesting alternative methods for the critical mass determination. We adopt a modified version of our proposal as described above. Employing the modified automated process we adopt is a reasonable approach to analyzing crowdsourced data given the anticipated volumes of data. Using data clustering to identify potential targeted areas for further Commission staff review and evaluation is also a reasonable way to group crowdsourced data together for a particular area within a coverage map. In this regard, we note that a data clustering approach for the identification of clusters of concern will reduce the amount of staff work and ensure that an unbiased analysis has provided evidence that specific areas warrant further review by Commission staff. See Appx. A – Technical Appendix § 5 (describing a statistical clustering approach for the identification of clusters of concern).
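To make the two-step automated process concrete, the following minimal sketch, which is illustrative only and not the BDC implementation, aggregates validated speed-test components into H3 resolution-8 hexagons and then spatially clusters the problem hexagons with DBSCAN. It assumes the open-source h3-py (v4 API) and scikit-learn packages; the flagging rule (negative components outnumbering positive ones) and the eps_km and min_samples parameters are placeholder assumptions rather than the values staff will apply.

```python
# Illustrative sketch only, not the Commission's implementation.
# Assumes the open-source h3-py (v4 API) and scikit-learn packages.
import h3
import numpy as np
from sklearn.cluster import DBSCAN

def find_clusters(tests, eps_km=2.0, min_samples=3):
    """tests: iterable of (lat, lon, is_negative) speed-test components
    remaining after anomalous or unusable tests have been culled."""
    # Step 1: aggregate components into H3 resolution-8 hexagons and flag
    # cells where negative components outnumber positive ones (a stand-in
    # for the categorization rules in Technical Appendix section 5).
    cells = {}
    for lat, lon, is_negative in tests:
        cell = h3.latlng_to_cell(lat, lon, 8)
        pos, neg = cells.get(cell, (0, 0))
        cells[cell] = (pos + (not is_negative), neg + is_negative)
    flagged = [c for c, (pos, neg) in cells.items() if neg > pos]
    if not flagged:
        return {}

    # Step 2: spatially cluster the flagged cells with DBSCAN on their
    # centroids, using haversine distance (inputs in radians, eps scaled
    # by the Earth's radius). A label of -1 marks unclustered noise.
    centroids = np.radians([h3.cell_to_latlng(c) for c in flagged])
    labels = DBSCAN(eps=eps_km / 6371.0, min_samples=min_samples,
                    metric="haversine").fit_predict(centroids)
    return dict(zip(flagged, labels))
```

In DBSCAN terms, a cluster emerges only where enough flagged hexagons lie within eps of one another, which mirrors the goal of identifying spatial groupings of potential problem areas rather than isolated negative tests.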
We believe the modified version of the automated process we adopt, including the use of data clustering, is sufficiently detailed and, taken together with the added safeguard of subsequent staff evaluation, addresses Verizon's request for more information about the automated process itself and the data clustering and other criteria the process will use, as described below, to identify potential areas for further review and evaluation. 135. Staff Evaluation. As noted above, the data identified in this process will inform, but not decide, our evaluation of a provider's claimed deployment and availability of broadband Internet access service and will thereby be an important part of the Commission's available verification options. See, e.g., Second Order, 35 FCC Rcd at 7487-92, paras. 64-76. If the automated process suggests that an area has persistent coverage issues, Commission staff will evaluate the data and make a final determination as to whether clusters of hexagons identified in this manner for three consecutive months have, indeed, reached "critical mass." Staff may consider other relevant data submitted by providers, consumers and/or third parties; may request additional information; and may take other actions as may be necessary to ensure the reliability and accuracy of the provider's coverage data and any applicable crowdsourced data. See BDC Mobile Technical Requirements Public Notice, at *22, para. 57. Should automated processing that establishes a "critical mass" of crowdsourced filings, combined with staff evaluation, suggest that a provider's coverage map is inaccurate, Commission staff will have a "credible basis" for verifying the provider's coverage data. See 47 CFR § 1.7006(b)(4). Under this approach, areas identified from crowdsourced data using this methodology would be subject to a verification inquiry consistent with the mobile verification process adopted herein. See supra Section III.B. Collecting Verification Information from Mobile Providers; see also 47 CFR § 1.7006(b)(4). Finally, we reiterate that we may initiate an inquiry, in the absence of a critical mass of crowdsourced filings, to collect and request verification data from a provider where there is a credible basis for doing so based upon a holistic review of all data available to staff (including crowdsourced data, data associated with challenges, verified data from government or third-party entities, or broadband availability data included in the provider's initial filing). See supra Section III.B. Collecting Verification Information from Mobile Providers. On a case-by-case basis, staff may thus have a credible basis for initiating a verification inquiry if warranted by the specific circumstances of a crowdsourced data filing in the context of all other data available to staff. See Second Order, 35 FCC Rcd 7491, para. 74 (reserving the right to investigate if warranted by specific circumstances of crowdsourced data filing); see also BDC Mobile Technical Requirements Public Notice, at *15, para. 35 (proposing that staff may consider other relevant data submitted by providers, request additional information from the provider, and may take other actions as necessary); see also Appx. B, para. 2 (47 CFR § 1.7006(b)(4)). 4. Public Availability of Crowdsourced Data 136. The Commission determined in the Second Order that all information submitted as part of the crowdsourcing process will be made public, except for personally identifiable information (PII) and data required to be confidential under section 0.457 of its rules. Second Order, 35 FCC Rcd at 7492, para.
76 (citing 47 CFR § 0.457); see also 47 CFR § 1.7006(b)(4). The Commission also directed OEA to make crowdsourced data publicly available as soon as practicable after submission and to establish an appropriate method for doing so. Second Order, 35 FCC Rcd at 7492, para. 76. No commenters addressed, or provided any alternatives to, our proposal in the BDC Mobile Technical Requirements Public Notice to make crowdsourced data filings available to the public or offered any suggestions about specific ways to protect PII or other sensitive information. 137. We therefore adopt our proposal to make crowdsourced data available via the Commission's public-facing website. BDC Mobile Technical Requirements Public Notice at *22, para. 58. This will include data collected via designated third-party apps. This publicly available information will depict coverage data and other associated information but will not include any PII or other data required to be confidential under section 0.457. Since designated third-party apps will be collecting data on behalf of the Commission, we expect similar handling of PII or other confidential information by third-party designees. We also adopt a modified version of our proposal and will update the public crowdsourced data at least biannually in order to make available the most up-to-date data. Id. (proposing to update the public crowdsourced data biannually). This is consistent with the Commission's requirement to update the Fabric every six months to ensure the most up-to-date information is available for all of the locations identified in the common dataset and will ensure the crowdsourced data provided are also current, reliable, and robust. E. Other Matters 138. Additional Mapping Information. We reject calls to require providers, at this time, to submit additional information with their maps. Next Century Cities and Public Knowledge/New America recommend that providers be required to include other performance and affordability information, such as the throughput speeds experienced by broadband consumers, signal strength, and pricing information. Next Century Cities Reply at 1-2; Public Knowledge/New America Reply at 2. The Commission declined to adopt pricing and throughput data filing requirements for fixed services in the Third Order, Third Order, 36 FCC Rcd at 1137, para. 25. and did not delegate authority to the Bureau and Offices to add such requirements for mobile services. The Broadband DATA Act prescribes standardized propagation modeling at defined throughput speeds for 4G LTE coverage. 47 U.S.C. § 642(b)(2)(B). The Commission followed Congress's approach and required mobile broadband providers to model broadband coverage, including 3G and 5G-NR services, based on standardized propagation modeling. See Second Order and Third Further Notice, 35 FCC Rcd at 7476-81, paras. 38-47. We thus decline to require providers to model actual mobile throughput. Even if we had the delegated authority to adopt a rule requiring the modeling of mobile throughput, we note that such modeling would be a computationally difficult, if not impossible, task for mobile broadband providers. Instead, we will use on-the-ground data collected through the challenge and crowdsource processes to improve the accuracy of the coverage maps. The Commission did specifically consider whether to standardize signal strength for mobile propagation maps, Letter from Matthew Gerst, Vice President, Regulatory Affairs, CTIA, to Marlene H.
Dortch, Secretary, FCC, WC Docket Nos. 11-10, 19-195, at 2-3 (filed May 29, 2020) (arguing that the Commission should not standardize signal strength for the propagation maps). and instead adopted a requirement for providers to submit “heat maps.” Third Order, 36 FCC Rcd at 1142, para. 37. Mobile providers are therefore already required to submit maps showing Reference Signal Received Power (RSRP) or Received Signal Strength Indicator (RSSI) signal levels for each technology. 47 CFR § 1.7004(c)(3)(v). Additionally, in adopting rules to implement the Broadband DATA Act, the Commission focused on ensuring that the public has access to more precise coverage maps, but did not delegate to the Bureau and Offices the authority to adopt new mapping requirements such as requiring providers to include affordability or pricing data for their broadband services. We also find it would be inconsistent with the Commission’s reasoning to adopt these types of pricing requirements for mobile maps, but not fixed maps. 139. Expanding the Types of Data That Can Be Used to Challenge Maps. CPUC, Public Knowledge/New America, and Vermont DPS recommend allowing interpolation techniques to be used for challenging provider-submitted maps. CPUC Comments at 18-19; Public Knowledge/New America Reply at 2-5; Vermont DPS Reply at 6. CPUC appears to suggest that the Bureau and Offices sought comment on the benefits of using spatial interpolation techniques in lieu of the proposed testing thresholds in the challenge process. CPUC Comments at 18 (citing BDC Mobile Technical Requirements Public Notice at *14, para. 33). We clarify that we sought comment on the use of spatial interpolation techniques when the Commission initiates a verification inquiry of providers’ maps, not in the challenge process. BDC Mobile Technical Requirements Public Notice at *14, para. 33 (seeking comment in Section III.B of the Public Notice, titled “Collecting Verification Information from Mobile Providers,” on “the costs and benefits of using spatial interpolation techniques either in addition to or as an alternative to the testing thresholds proposed . . . for verifying the accuracy of coverage maps” (emphasis added)); see also BDC Mobile Technical Requirements Public Notice at *2, para. 3 & n.6 (specifying that, for purposes of the Public Notice, “verification” and “verification process” refer to a different process than the challenge process). The Commission explicitly adopted a requirement that consumers and government and other entities submit speed test data to support their mobile coverage challenges, and did not grant the Bureau and Offices authority to accept data other than on-the-ground speed tests to challenge coverage. Third Order, 36 FCC Rcd at 1165, para. 98 (stating that “we do not believe we could reasonably collect challenges to mobile coverage without relying on speed testing”); id. at 1171, para. 116 (stating that “[f]or mobile broadband coverage challenges, we require government and third-party entities to submit speed test data”); id. at 1165, 1166, paras. 99, 102. We therefore lack delegated authority to accept interpolations or statistical sampling as challenge data in lieu of actual, valid speed tests. 140. Expanding the Types of Data That Can Be Used for Verified Data. CPUC and Vermont DPS likewise recommend allowing interpolations of speed test results by government entities to identify areas requiring validation. CPUC Comments at 18-19; Vermont DPS Reply at 6. 
Such spatial interpolation techniques could include the Kriging technique discussed in the BDC Mobile Technical Requirements Public Notice. BDC Mobile Technical Requirements Public Notice at *14, para. 33 & n.87. In contrast, T-Mobile states that the Commission must reject any proposal premised on interpolation. T-Mobile Reply at 9. To the extent governments or other entities submit on-the-ground speed test data through our crowdsource process, we agree with CPUC and Vermont DPS that the results of spatial interpolation analyses would be useful additional information in determining whether there is a credible basis for verifying a provider's coverage data. However, the Commission directed that verified mobile on-the-ground data be submitted "through a process similar to the one established for providers making their semiannual [BDC] filings," Third Order, 36 FCC Rcd at 1154, para. 68. and the Bureau and Offices do not have discretion to change that approach. Because interpolation is a projection, it does not meet the requirements established for "verified" broadband availability data under the Broadband DATA Act. See 47 U.S.C. § 642(a)(2); see also Third Order, 36 FCC Rcd at 1151-52, paras. 62-63. Therefore, while we may use interpolation in our analysis of on-the-ground data submitted either as part of the challenge process or as crowdsourced data when conducting a holistic review to ensure the accuracy of coverage data (e.g., when evaluating whether there is a credible basis for conducting a verification inquiry), we are unconvinced that accepting interpolated data on their own would give us the necessary understanding of on-the-ground performance consistent with our obligations under the Broadband DATA Act and Commission Orders. 141. Declining to Require Providers to Offer Challenge Incentives. We will not, as urged by some commenters, require that providers offer subscribers incentives to conduct speed tests or submit voluntary challenges. RWA Comments at 12-13 (recommending that the Commission seek comment on requiring carriers to advertise to subscribers how to challenge maps and mandate that carriers provide challenge incentives for subscribers, such as rebates for successful challenges); Public Knowledge/New America Reply at 5, 8 (agreeing with RWA that the Commission should consider imposing requirements for carriers to create voluntary challenge incentives for their subscribers, such as rebates for successful challenges, and urging the FCC to take more action to ensure consumers are aware of the challenge process and encourage consumers to provide more crowdsourcing mobile broadband data). Once we implement the challenge process, we believe that consumers and third parties will be motivated to provide us with data where they believe providers' coverage maps are inaccurate or incomplete. Relatedly, the Commission noted in the Third Order that speed test results submitted by consumer challengers that do not reach the threshold of a cognizable challenge will nevertheless be incorporated in the analysis of crowdsourced data, Third Order, 36 FCC Rcd at 1168, para. 106. and similarly that on-the-ground test data submitted by governmental and third-party entities that do not reach the threshold of a cognizable challenge also will be considered in the analysis of crowdsourced data. Id. at 1173, para. 120.
We believe that combining these speed test results with other available data, including other available crowdsourced data, will provide us with a robust and accurate dataset, thereby obviating the need for provider-offered incentives to spur consumers and third parties into submitting challenges or collecting crowdsourced data to submit to us. The user-friendly challenge process we implement should make it easy for consumers and other entities alike to submit challenges and crowdsourced mobile coverage data. As one commenter observes, "[d]ue to known shortcomings in mobile coverage maps[,] . . . the Commission needs a good challenge process" and should "allow the use of crowd-sourced data to challenge providers' claims." Comniscient Comments at 5. We agree, and believe that we have put efficient and effective challenge and crowdsource processes and procedures in place. 142. Pre-Publication Commission Review of Maps. We decline to establish an additional period of review for the Commission to perform a "quick look" at the data that service providers submit before publishing maps rendering the data. CCA suggests an "initial review and sampling process," which "could be automated, although there is likely no complete substitute for some degree of manual review and sampling," to identify "significant and overt errors"; CCA cites the Commission's initial review of spectrum license transfer applications prior to placing them on public notice as a potential framework for a similar initial review process. CCA Comments at 13-15 & n.44. It also recommends that staff conduct random sampling or statistical analysis and comparison of the data provided by each provider to detect clear errors, and then quickly review maps for errors such as failure to account for terrain and clutter, excessive signal propagation at co-located sites, failure to use the required resolution, understated/overstated service in populated areas, depicted service ceasing at artificial boundaries, and failure to match the coverage maps on providers' websites. CCA Comments at 13-15; CCA Reply at 2-7. CTIA and Public Knowledge/New America agree that such a process could be helpful, reasoning that a Commission-led initial review would eliminate a costly and open-ended burden on challengers who, they argue, will expend time and energy identifying overt errors that carriers never should have submitted. CCA Comments at 13-14; CCA Reply at 2-6; CTIA Reply at 12-13; Public Knowledge/New America Reply at 8. CCA further argues that "[f]orcing challengers to bear the burden of challenging maps that contain obvious problems . . . likely would result in a need for extensions of time, which would lengthen the challenge process." CCA Comments at 13. 143. While we recognize the theoretical benefits of a "quick look" at provider-submitted maps before they are made available to the public to challenge, we find that these benefits are outweighed by the significant delay that such a review would introduce into the challenge process. Requiring the Commission to independently analyze provider submissions or conduct field surveys would significantly delay when this information is made available for the public to challenge.
It also would be difficult to operationalize meaningful and practical standards to be applied in a "quick look." The Commission will be collecting data and rendering multiple maps for scores of mobile and fixed providers, and it would be wholly impracticable for staff to review every map of every provider before making them available to the public and to other federal, state, and local government agencies, Tribal entities, and other third parties. In order to build a process to undertake this type of review, we would need to decide, for example, which maps to review; how much time to spend reviewing them; and what kinds of "significant and overt" errors to look for. Commenters who support this pre-screening of provider data offer virtually no input on these fundamental implementation challenges, Although CCA suggests some types of information that the Commission could look for, it does not suggest which maps to review, a time period for doing so, or a depth of review. CCA Reply at 2-6. and we note that adopting CCA's suggested "quick look" approach in the absence of a more complete record on issues like these would likely require additional notice and comment. Additionally, the Broadband DATA Act created a framework whereby mobile service providers submit propagation maps based on a standardized set of propagation model details; 47 U.S.C. § 642(b)(2)(B). in turn, the Commission is required to publish the data mobile service providers submit, and outside stakeholders are permitted to challenge mobile service providers' broadband coverage assumptions or submit crowdsource information to help us further refine and validate mobile service providers' propagation maps. 47 U.S.C. §§ 642(a)(1)(A)(ii), 642(b)(5), 644(b). Creating a "quick look" process could interfere with Congress's intent that we leverage public input to improve the maps over time. S. Rep. No. 116-174, at 13 (2019) ("The Committee therefore urges the Commission to use the challenge process not just as a way to collect information from the public, but also to provide timely opportunities for input and evidence-based corrections to, or updates of, the coverage maps as the agency prepares to distribute funding for broadband based upon the availability of broadband internet access service. Similarly, the Committee intends for the crowdsourcing process created in section 5 of the bill, as well as the Commission's own verification obligations pursuant to the bill, to serve as important ongoing checks on the accuracy of the information being submitted to the agency and used in the coverage maps."). 144. That said, we have already planned to undertake certain data validations as part of the BDC submission process to preempt or remediate any overt errors. The BDC system will perform dozens of data validations and automatic processing steps on uploaded data and will alert the provider when any of the data fail one of these steps. These validations and processing steps will—for the first time—allow for the Commission's systems to automatically detect many of the GIS data and mapping issues that have historically been found in data submitted by providers after a time-consuming and largely manual review by staff for each Form 477 filing round. The new validations and automatic processing will flag a number of factors that would undermine the accuracy of a provider's data, including geometric errors in maps and overt errors in providers' assumptions.
Moreover—and also for the first time—the BDC system will require providers to review and correct maps rendered from their data and to confirm that they uploaded the correct data and that any changes made as a result of data validations (e.g., automatic repairs of invalid geometries and incorrect map projections) are correct, all prior to certifying their submissions. We anticipate that these additional validations and processing steps will significantly improve the data submission process and, by preventing a provider from completing its submission until its data have successfully undergone these validations, will prevent the lengthy back-and-forth between filers and FCC staff that has typically occurred after the submission of Form 477 data. We believe that the new validations and automatic processing will help correct many, if not all, of the problems CCA discusses. See CCA Reply at 3, 5-6. The Bureau and Offices will maintain discretion to develop additional tools in the future to provide automatic feedback to carriers as we receive more data. 145. Use of BDC Data. RWA requests that the Bureau and Offices clarify when the data collection, Fabric, and coverage maps will be "complete" for the purposes of awarding broadband deployment funds. RWA Comments at 13. We note that decisions regarding specific programs and how to use BDC data to determine areas of eligibility are outside the scope of this proceeding. 146. Non-substantive Changes. Finally, we make two non-substantive changes. First, we correct the numbering of 47 CFR § 1.7006(e)(1). See 47 CFR § 0.271(i). In particular, we redesignate the first paragraph (iv) as paragraph (iii). Section 1.7006(e)(1)(iii) is corrected to read as follows: (iii) A certification that the challenger is a subscriber or authorized user of the provider being challenged; Second, in the second sentence in the paragraph that appears at 47 CFR § 1.7006(f), we change the first instance of the word "or" to "of," to read as follows: (f) Mobile service challenge process for State, local, and Tribal governmental entities; and other entities or individuals. State, local, and Tribal governmental entities and other entities or individuals may submit data to challenge accuracy of mobile broadband coverage maps. They may challenge mobile coverage data based on lack of service or poor service quality such as slow delivered user speed. IV. PROCEDURAL MATTERS 147. Regulatory Flexibility Act. The Regulatory Flexibility Act of 1980, as amended (RFA), 5 U.S.C. §§ 601–612. The RFA has been amended by the Small Business Regulatory Enforcement Fairness Act of 1996 (SBREFA), Pub. L. No. 104-121, Title II, 110 Stat. 857 (1996). requires that an agency prepare a regulatory flexibility analysis for notice and comment rulemakings, unless the agency certifies that "the rule will not, if promulgated, have a significant economic impact on a substantial number of small entities." 5 U.S.C. § 605(b). Accordingly, we have prepared a Supplemental Final Regulatory Flexibility Analysis (Supplemental FRFA) concerning the possible impact of the rule changes contained in this Order on small entities. The Supplemental FRFA is set forth in Appendix C. 148. Congressional Review Act. The Commission has determined, and the Administrator of the Office of Information and Regulatory Affairs, Office of Management and Budget, concurs, that these rules are "non-major" under the Congressional Review Act, 5 U.S.C. § 804(2).
The Commission will send a copy of this Order to Congress and the Government Accountability Office pursuant to 5 U.S.C. § 801(a)(1)(A). 149. Paperwork Reduction Act. This document does not contain new or modified information collection(s) subject to the Paperwork Reduction Act of 1995 (PRA), Public Law 104-13, as the requirements adopted in this Order are statutorily exempted from the requirements of the PRA. See Infrastructure Investment and Jobs Act § 60102(h)(2)(E)(ii) (amending 47 U.S.C. § 646(b)). As a result, the Order will not be submitted to OMB for review under section 3507(d) of the PRA. V. ORDERING CLAUSES 150. Accordingly, IT IS ORDERED that, pursuant to sections 1-4, 7, 201, 254, 301, 303, 319, 332, and 641-646 of the Communications Act of 1934, as amended, 47 U.S.C. §§ 151-154, 157, 201, 254, 301, 303, 319, 332, 641-646, this Order IS ADOPTED. 151. IT IS FURTHER ORDERED that Part 1 of the Commission's rules IS AMENDED as set forth in Appendix B. 152. IT IS FURTHER ORDERED that the Order SHALL BE effective 30 days after publication in the Federal Register. 153. IT IS FURTHER ORDERED that the Office of the Managing Director, Performance Evaluation and Records Management, SHALL SEND a copy of this Order in a report to be sent to Congress and the Government Accountability Office pursuant to the Congressional Review Act, 5 U.S.C. § 801(a)(1)(A). 154. IT IS FURTHER ORDERED that the Commission's Consumer & Governmental Affairs Bureau, Reference Information Center, SHALL SEND a copy of this Order, including the Final Regulatory Flexibility Analysis, to the Chief Counsel for Advocacy of the Small Business Administration. FEDERAL COMMUNICATIONS COMMISSION Joel Taubenblatt Acting Chief Wireless Telecommunications Bureau Giulia McHenry Chief Office of Economics and Analytics Ronald T. Repasi Acting Chief Office of Engineering and Technology APPENDIX A Technical Appendix 1 Introduction This technical appendix provides additional information about the mobile challenge, mobile verification, and mobile crowdsourcing processes for the Broadband Data Collection (BDC), including how mobile speed tests will be validated and evaluated to determine whether mobile broadband service is available in an area. This Technical Appendix also summarizes rules and methodologies that are discussed in light of the record in this proceeding and adopted in the Order. The analytic procedures and calculations set forth herein provide statistically valid methodologies for ensuring that mobile service providers' coverage maps, which are based on predictive propagation modeling, reflect the on-the-ground experience of consumers. See 47 U.S.C. § 642(b)(2)(B) (requiring the Commission to adopt rules by which mobile providers would submit propagation models and propagation model details that indicate current 4G LTE broadband coverage); Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Second Report and Order and Third Further Notice of Proposed Rulemaking, 35 FCC Rcd 7460, 7476-83, 7503-06, paras. 32-51, 104-09 (2020) (Second Order and Third Further Notice) (adopting rules for mobile providers to submit propagation maps and propagation model details of 3G, 4G LTE, and 5G-NR coverage and seeking comment on how providers may submit a statistically valid sample of on-the-ground data to verify mobile providers' coverage maps); Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos.
19-195, 11-10, Third Report and Order, 36 FCC Rcd 1126, 1150-51, para. 59 (2021) (Third Order) (directing "OEA, WTB, and OET to develop and administer the specific requirements and methodologies that providers must use in conducting on-the-ground-tests . . . so that the tested areas satisfy the requirements of a statistically valid and unbiased sample of the provider's network"). The Commission has defined the parameters that service providers must use when modeling whether broadband is available using technology-specific minimum download and upload speeds with a cell edge probability of at least 90% and assuming minimum 50% cell loading. Second Order and Third Further Notice, 35 FCC Rcd at 7477, 7479-81, paras. 39, 44-47. Mobile service providers are required to submit data modeled with different minimum speed threshold values. For 3G coverage, these values are 200 kbps download and 50 kbps upload (i.e., 0.2/0.05 Mbps). 47 CFR § 1.7004(c)(3)(i). For 4G LTE coverage, these values are 5 Mbps download and 1 Mbps upload (i.e., 5/1 Mbps). Id. For 5G-NR coverage, there are two sets of minimum speed threshold values, 7 Mbps download and 1 Mbps upload (i.e., 7/1 Mbps), and 35 Mbps download and 3 Mbps upload (i.e., 35/3 Mbps). Id. These speed thresholds are applied to determine whether an individual mobile speed test submitted as part of the challenge, verification, and/or crowdsourcing process is "positive" or "negative" – that is, whether each test meets the minimum predicted upload or download speeds or fails to do so – while the cell edge probability sets the overall expected rate of positive or negative speed tests for challengers and providers. On-the-ground speed testing via mobile device apps, such as the FCC Speed Test app, measures network performance quality through several metrics, including download speed and upload speed. Third Order, 36 FCC Rcd at 1166-67, para. 103. These apps can be used to measure on-the-ground consumer experience and to evaluate the accuracy of propagation model-based coverage maps. Id. at 1166-67, paras. 102-04. We acknowledge that many real-world factors, such as the randomness or variability of localized terrain and clutter such as foliage, vehicular traffic, device performance, cell loading, and weather, may affect the speed test results and may not be fully accounted for in the model-based coverage maps, and that testing may not be conducted in a truly random (i.e., unbiased) manner. We have therefore taken several steps in the processes we adopt to create a robust and representative sample. We specify the minimum sample size needed to create a challenge as well as geographic and temporal requirements for conducting the speed tests. We have also used the principles of statistical design to guide other aspects of the challenge and verification processes. The methodologies we adopt balance the need to minimize burdens on challengers and providers with the need to ensure a statistically sound methodology upon which to rely for making decisions involving the availability of broadband coverage. Consequently, we impose several requirements for on-the-ground speed tests to be used to challenge a service provider's coverage map, rebut such a challenge, or verify coverage for the smallest challengeable or verifiable geographic area. These requirements seek to ensure that the set of aggregated tests approximates an unbiased sample of the location (geographically and temporally) that is large enough to draw meaningful conclusions from the data.
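As an illustration of how these technology-specific minimum speeds classify a test, the following minimal sketch (illustrative only; the labels and function names are ours, not the BDC data specification) evaluates the download and upload components of a single speed test separately:

```python
# Technology-specific minimum speeds in Mbps (download, upload), as
# described above; the dictionary labels are illustrative shorthand.
MIN_SPEEDS_MBPS = {
    "3G": (0.2, 0.05),
    "4G LTE": (5.0, 1.0),
    "5G-NR (7/1)": (7.0, 1.0),
    "5G-NR (35/3)": (35.0, 3.0),
}

def evaluate_components(technology, download_mbps, upload_mbps):
    """Classify each component of one speed test as positive or negative."""
    min_dl, min_ul = MIN_SPEEDS_MBPS[technology]
    return {
        "download": "positive" if download_mbps >= min_dl else "negative",
        "upload": "positive" if upload_mbps >= min_ul else "negative",
    }

# Example consistent with section 3.1 below: a 4G LTE test recording
# 4/2 Mbps has a negative download component and a positive upload one.
assert evaluate_components("4G LTE", 4.0, 2.0) == {
    "download": "negative", "upload": "positive"}
```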
We acknowledge that requiring testers, which could include consumers, governmental entities and other third parties, and service providers, to conduct a truly random sample would increase the statistical confidence of conclusions drawn from the data, but believe that such a requirement would impose an unreasonable and unnecessary burden on both challengers and providers, especially as speed tests submitted as challenges may come from individual consumers and/or from unaffiliated challengers. 2 On-the-Ground Speed Test Validations and Certifications We will validate, automatically upon submission, on-the-ground speed tests submitted by challengers or challenged providers (collectively, "testers") as part of the mobile challenge process and by providers as part of the verification process. Consumer challengers must conduct speed tests using the FCC Speed Test app or another OET-approved speed test app and configuration, and all submitted tests must have a complete set of mandatory fields. Id. at 1166-67, 1172, paras. 103, 117. For challenges, any test with a location The upload and download components of the speed test will each have an associated location, as determined by the midpoint of the starting and ending locations of the component. It is possible that only one of these midpoints falls within the coverage area if the test was taken in motion. In this situation, the component that is within the coverage area will be included and the component outside the coverage area will be excluded. outside the coverage area of all technologies claimed by the applicable service provider will be considered invalid and excluded from further analysis. A speed test may switch from invalid to valid if a subsequently posted submission places the speed test within the respective coverage area, as long as the coverage date is before the date of the speed test. Each speed test will need to conform to the most recent specifications for mobile test data adopted by OEA and WTB in accordance with 5 U.S.C. § 553, See id. at 1146, para. 48 (instructing OEA and WTB to "adopt the methodologies, data specifications, and formatting requirements that providers shall follow when collecting and reporting mobile infrastructure and on-the-ground test data to the Commission"). with fields having either a set or range of acceptable values (e.g., positive download speed), and any test that does not include acceptable values for all fields per the data specifications will likewise be excluded from further analysis. We require that testers conduct all speed tests between the hours of 6:00 a.m. and 10:00 p.m. local time. Tests will be valid for one year from the test date to help ensure the tests are representative of the current state of the provider's network. As explained in the Order, we reject requests to curtail the lifespan of speed test data. The process we adopt for submission of challenges ensures that providers have sufficient details to respond to challenges, including dates and times of speed tests. To the extent that a provider improves its network coverage in an area, it can either remove the area from its current data and add it back in with its next biannual submission or rebut a challenge by submitting on-the-ground test data demonstrating network performance in the recently deployed area. We find that these alternatives strike a better balance in facilitating robust participation in the challenge process and ensuring high-quality data than further limiting the lifespan of valid challenge data.
Therefore, speed tests may be re-evaluated against new coverage maps when the service provider submits its biannual coverage data. Prior to conducting a speed test, testers must certify that: 1) their handset and speed test app are in ordinary working order to the best of the tester’s actual knowledge, information, and belief; 2) all submitted tests were taken in either an in-vehicle mobile or stationary outdoor environment; and 3) the tester is a subscriber or an authorized user of the provider being challenged. Id. at 1167, para. 104. Government and third-party challengers must also substantiate their data through the certification of a qualified engineer or official. Id. at 1172, para. 117. Under the approach we adopt, we will require testers to submit all conducted tests. The requirements that testers certify tests prior to conducting them and that all conducted tests are submitted will help ensure that testers do not submit a selected (i.e., biased) sample of tests. 2.1 Nested Hexagon Grid System Geospatial datasets allow the Earth to be divided into unique cells using various shapes such as squares, rectangles, triangles, circles, and hexagons. We will use hexagons as the geographic area for grouping speed tests, identifying challenged areas and evaluating challenges, evaluating verification data, and evaluating crowdsourced data. Hexagons can be arranged to form an evenly spaced grid allowing for less distortion than squares or rectangles due to the curvature of the Earth. See ESRI, Why Hexagons?, https://pro.arcgis.com/en/pro-app/latest/tool-reference/spatial-statistics/h-whyhexagons.htm (last visited Feb. 1, 2022). Specifically, we will use the H3 standardized, open-source geospatial indexing system developed by Uber Technologies, Inc. Isaac Brodsky, H3: Uber’s Hexagonal Hierarchical Spatial Index, (June 27, 2018) https://eng.uber.com/h3/. This system overlays the Earth with hexagonal cells of different sizes or resolutions. Id. The smallest hexagonal cells are at resolution 15, in which the average hexagonal cell has an area of approximately 0.9 square meters, and the largest are at resolution 0, in which the average hexagonal cell has an area of approximately 4.25 million square kilometers. Id. Table 1 provides the H3 geospatial indexing system resolutions that are relevant to the challenge and verification process. For ease of explanation, we refer to the hexagonal cells across different resolutions as a “hex-n” cell, where n is the resolution (e.g., “hex-15” for the smallest size hexagonal cell). The H3 geospatial indexing system employs a nested cell structure wherein a lower resolution hexagonal cell (the “parent”) contains approximately seven hexagonal cells at the next highest resolution (its “children,” and each a “child”). Id. Each finer resolution has nested cells with approximately one seventh the area of the coarser resolution. Id. However, because a hexagon cannot perfectly subdivide into seven child hexagons, the finer resolution cells are not perfectly contained within their parent cells. Id. Regardless, a cell is considered to have a single parent, specifically, the hexagon at the next lower resolution containing the centroid of the cell. That is, a hex-1 cell is the “parent” of seven hex-2 cells, each hex-2 cell is the parent of seven hex-3 cells, and so on. 
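The parent and child relationships described above correspond directly to lookups in the open-source h3-py bindings (v4 API assumed); a brief, hypothetical illustration using an arbitrary coordinate:

```python
import h3  # open-source H3 bindings; v4 API assumed

# An arbitrary example point; any non-pentagon cell behaves the same way.
cell8 = h3.latlng_to_cell(38.8977, -77.0365, 8)  # hex-8 cell containing the point
parent7 = h3.cell_to_parent(cell8, 7)            # its hex-7 "parent"
children9 = h3.cell_to_children(cell8, 9)        # its seven hex-9 "children"

# Each child's parent (the cell containing its centroid) is cell8.
assert all(h3.cell_to_parent(c, 8) == cell8 for c in children9)
print(len(children9))  # 7
```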
H3 Resolution | Average Hexagon Area (square km) | Average Hexagon Edge Length (km) | Number of Unique Indexes
6 | 36.129 | 3.229 | 14,117,882
7 | 5.161 | 1.221 | 98,825,162
8 | 0.737 | 0.461 | 691,776,122
9 | 0.105 | 0.174 | 4,842,432,842
Table 1: Excerpted table of H3 geospatial indexing system resolutions, with numbers rounded for readability. H3, Table of Cell Areas for H3 Resolutions, https://h3geo.org/docs/core-library/restable (last visited Feb. 1, 2022). The grid structure of H3 allows parent-child or nested relationships to translate easily across multiple indices of various resolution sizes. Figure 1: H3 geospatial indexing system at resolution 7, resolutions 7 and 8, and resolutions 7, 8, and 9. Isaac Brodsky, H3: Uber's Hexagonal Hierarchical Spatial Index, (June 27, 2018), https://eng.uber.com/h3/. For example, the leftmost image of Figure 1 shows the approximately five square kilometer area of one complete hexagonal cell at resolution 7 (a hex-7 cell). The center image shows the same hex-7 cell with seven finer resolution 8 hexagonal cells (hex-8 cells) nested in the hex-7 cell. The rightmost image shows the original hex-7 cell with seven finer resolution hex-8 cells, and 49 finer resolution 9 hexagonal cells (hex-9 cells). Because the finer resolution child cells are approximately contained in the coarser resolution ancestor cells, H3 allows for efficient indexing of the area with minimal shape distortion that would occur only at the cell boundaries. 3 Methodology for Creating and Evaluating Mobile Challenges 3.1 Creating a Challenge For the challenge process, the smallest cognizable challenge will be to a hex-8 cell, which has an average area that is approximately 0.7 square kilometers. This H3 resolution has hexagonal cells that are closest to the one square kilometer de minimis threshold for challenges and one square kilometer uniform grid system adopted for the Mobility Fund Phase II challenge process, along with the same one square kilometer uniform grid system adopted by the Commission for use when evaluating on-the-ground speed tests submitted by 5G Fund support recipients. See Connect America Fund; Universal Service Reform – Mobility Fund, WC Docket No. 10-90, WT Docket No. 10-208, Order on Reconsideration and Second Report and Order, 32 FCC Rcd 6282, 6305-06, para. 46 (2017); Procedures for the Mobility Fund Phase II Challenge Process, WC Docket No. 10-90, WT Docket No. 10-208, Public Notice, 33 FCC Rcd 1985, 1989-90, para. 9 (WTB/WCB 2018); Establishing a 5G Fund for Rural America, GN Docket No. 20-32, Report and Order, 35 FCC Rcd 12174, 12232, para. 140 (2020). Coverage maps must be submitted at a resolution of 100 meters (i.e., 0.1 km) or better. 47 CFR § 1.7004(c)(3)(iii). Therefore, a challenge to an area smaller than a hex-8 cell may reflect inaccuracies attributable to the resolution at which the provider generated its maps rather than actual coverage gaps. Allowing challenges for a smaller area (at a higher resolution) may thus require excessive testing by providers whose propagation maps were not designed to provide such precision. Conversely, allowing challenges only for a larger area (at a lower resolution) would require significantly more testing by the challenger and may hamper a challenger's ability to demonstrate local coverage gaps. Roughly 70% of hexagonal cells at resolution 8 in the United States, excluding Alaska, intersect with at least one road, using U.S. Census Bureau roadway data. This means that most hex-8 cells should be easily accessible for drive testing.
If a hex-8 cell has only limited road access, the thresholds to create a challenge are reduced to reflect the difficulty of testing there. The adopted proposal contains a process to create challenges at lower resolutions if sufficient evidence exists, as described in section 3.1.4. As adopted, speed tests will be required at a variety of locations within the hex-8 cell and at more than one time of day. The download and upload components of a speed test will be evaluated separately. A speed test component will be categorized as a "positive" test—that is, a test that meets or exceeds the minimum download or upload speeds respectively associated with the coverage area of a mobile provider's technology being tested (e.g., 5/1 Mbps for 4G LTE), or a "negative" test—that is, a test that does not satisfy the minimum speeds associated with the coverage area. For example, for a 4G LTE speed test to be positive, the download speed must be at least 5 Mbps and the upload speed must be at least 1 Mbps (5/1). If a test records speeds that are 4/2 Mbps, the download component will be considered a negative test and the upload component will be considered a positive test. If a test records speeds that are 25/0.5 Mbps, the download component will be considered a positive test and the upload component will be considered a negative test. We will also allow a speed test conducted on a device that is capable of connecting to a higher-generation technology but that is only able to connect to a lower-generation technology to count as a failed test for the higher-generation technology where the test does not meet the minimum speed reported on the higher-generation technology's coverage map. For speed tests conducted on a 5G-NR network and submitted as challenges, tests will be evaluated against the highest minimum speeds reported in the mobile service provider's coverage data. As discussed, providers are required to submit coverage data showing where their models predict 5G-NR coverage with minimum speeds of 35/3 Mbps in addition to where their models predict 5G-NR coverage with minimum speeds of 7/1 Mbps. Consequently, for a 5G-NR speed test's download component to be considered positive within the area that the provider reports minimum speeds of 35/3 Mbps, the download speed must be at least 35 Mbps. If a test records speeds of 10/3 Mbps, 35/1 Mbps, or even 7/1 Mbps, the test will be considered negative because none of these results meets both minimum speed thresholds. However, all three of these 5G-NR speed tests will be considered positive if conducted outside of the area that the provider reports minimum speeds of 35/3 Mbps but within the area that it reports minimum speeds of 7/1 Mbps. If a test is unable to connect to any network, it will be treated as recording 0 Mbps for every technology the device is capable of using on that provider's network at that location. Challengers' speed tests will be evaluated collectively, so that the tests of multiple challengers may be used to create a challenge, and speed tests will be evaluated cumulatively, with all tests remaining valid for one year. Each challenger will be notified when an area for which it provided tests included in a challenge has been classified as challenged. A hex-8 cell will be classified as challenged if the following three thresholds are met in the cell for either the download or upload components:
1. Geographic Threshold
2. Temporal Threshold
3. Testing Threshold
These three separate conditions will help ensure that tests are geographically and temporally diverse and therefore approximate a random sampling of the area. At the same time, these thresholds are meant not to be overly burdensome to challengers: the typical distance needed to capture measurements in four different hex-9 cells is less than a mile. For example, challengers could reasonably satisfy these criteria while taking a morning and evening walk. As challengers submit speed test results, the system will identify the hex-8 cell in which each test component occurred, periodically evaluate whether the thresholds have been met for each cell, and make this information available to challengers. When the test results for a given cell meet all of the thresholds to successfully create a challenge, that cell will be considered “tentatively challenged” until the end of the calendar month in which the criteria were met. This will allow challengers to review which cells have been tentatively challenged and, if desired, provide additional speed test data in cells where submitted speed test data were insufficient to establish a challenge. At the end of each month, any cell that is tentatively challenged will be considered “challenged,” and the challenged service provider will be notified of any challenged cells. The provider will then have 60 days to respond to the challenges for any newly challenged cells.
3.1.1 Geographic Threshold
We require a geographic threshold for the challenge process so that challengers must demonstrate that a lack of coverage exists over a sufficiently large area and is not concentrated in one small area. To accurately measure the coverage in a hex-8 cell, we require speed tests to be conducted in multiple locations within the geographic area of the cell, including negative tests recorded at multiple different locations. These requirements ensure geographic diversity of tests and identify potential coverage gaps over a sufficiently wide area. The methodology will group speed test components that fall within the same hex-9 cell (which is a child of a hex-8 cell), or “point-hex,” and that are within the reported coverage of the tested provider. Speed tests conducted within the same point-hex would be on average within 350 meters of each other, or approximately three city blocks, which is less than the 400-meter buffer radius used in the Mobility Fund Phase II challenge process. Connect America Fund; Universal Service Reform—Mobility Fund, WC Docket No. 10-90, WT Docket No. 10-208, Order on Reconsideration, 33 FCC Rcd 4440-41, paras. 1, 4 (WTB/WCB 2018). Because there are seven child hex-9 cells within a hex-8 cell, test components within a hex-8 cell will fall within one of seven point-hexes. Because the child cells do not perfectly nest within the parent cell, some tests may fall within the hex-8 cell but not within any child point-hex. These tests will not count toward satisfying the geographic threshold but will be included when evaluating the temporal and testing thresholds. The system will count the number of point-hexes that contain: (a) at least two test components of the same type (either upload or download); and (b) at least one negative component of that type. Such point-hexes will be considered to have inadequate coverage—that is, there would be prima facie evidence that the provider’s submitted coverage data may be inaccurate for these point-hexes.
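For illustration only, the point-hex counting just described might be sketched as follows. The code assumes the Python h3-py bindings (v4 API) and a hypothetical list of test components; it is not part of the adopted methodology.

```python
# Illustrative sketch of the geographic-threshold count described above.
# Assumes h3-py (v4) and a list of (lat, lng, is_negative) components for one
# provider and one component type; the data structure is hypothetical.
from collections import defaultdict
import h3

def qualifying_point_hexes(components, hex8):
    """Count point-hexes with >= 2 components, at least one of them negative."""
    by_hex9 = defaultdict(list)
    for lat, lng, is_negative in components:
        hex9 = h3.latlng_to_cell(lat, lng, 9)
        # Tests that do not index to a child point-hex of this hex-8 cell do
        # not count toward the geographic threshold (imperfect nesting).
        if h3.cell_to_parent(hex9, 8) == hex8:
            by_hex9[hex9].append(is_negative)
    return sum(
        1 for tests in by_hex9.values()
        if len(tests) >= 2 and any(tests)  # criteria (a) and (b)
    )
```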
To satisfy the geographic threshold for a challenge, a hex-8 cell will generally need to contain at least four point-hexes that meet both criteria (a) and (b) above. This requirement will assure that more than 50% of the point-hexes show inadequate coverage.
Figure 2: Speed tests within a hex-8 cell (outlined in black), with negative (red) and positive (green) tests shown.
Figure 2 illustrates the geographic threshold requirement. The graphic on the left shows four child point-hexes (outlined in red) that satisfy the testing requirements to be counted toward the geographic threshold, because all four point-hexes include two or more tests, at least one of which is negative. The graphic on the right shows that no child point-hexes satisfy the requirements to be counted toward the geographic threshold because, in each case, the point-hex contains fewer than two tests, the tests do not fall within a point-hex, or the point-hex does not include a negative test. If a provider’s coverage map only partially covers a hex-8 cell, or if a portion of the hex-8 cell does not contain roads, then the geographic threshold for that cell will be reduced from the four-out-of-seven point-hex requirement described above. (Using the most recent U.S. Census Bureau roadway data, a point-hex contains a road if it overlaps any primary, secondary, or local road, which are defined as MAF/TIGER Feature Class Codes S1100, S1200, and S1400, respectively. U.S. Census Bureau, 2020 TIGER/Line Shapefiles: Roads, https://www.census.gov/cgi-bin/geo/shapefiles/index.php?year=2020&layergroup=Roads (last visited Feb. 1, 2022). In order to account for road width, we will apply a small buffer around the U.S. Census Bureau road line data.) We consider point-hexes to be “accessible” where at least 50% of the point-hex overlaps with the provider’s reported coverage data and a road runs through the point-hex. Where fewer than four point-hexes in a hex-8 cell are accessible, the number of point-hexes necessary to satisfy the geographic threshold for the hex-8 cell will equal the number of accessible point-hexes in that cell (see Table 2 below). For example, if there are only two accessible point-hexes in a hex-8 cell, then only two of the point-hexes would need to contain multiple speed tests and at least one negative test in order to satisfy the geographic threshold. If a point-hex does not have a road but is still within the provider’s coverage, tests conducted in that point-hex count toward satisfying the geographic threshold for a challenge in a hex-8 cell (assuming that multiple speed tests, with at least one negative speed test, were recorded in that point-hex). For example, if a challenge contains two tests, at least one of which is negative, within a point-hex that does not have a road but was reached via a hiking trail, those tests would count toward satisfying the geographic threshold. If there are no accessible point-hexes within a hex-8 cell, the geographic threshold does not need to be met; only the temporal and testing thresholds need to be met in order for that hex-8 cell to be considered challenged.
Number of Accessible Point-Hexes in Hex-8 Cell | Minimum Number of Accessible Point-Hexes Required for Challenge
4 – 7 | 4
3 | 3
2 | 2
1 | 1
0 | 0 (hex-8 cell must contain tests that satisfy the temporal threshold and testing threshold)
Table 2: Relationship of Accessible Point-Hexes to Geographic Threshold.
Figure 3: Example of accessible child point-hexes within a hex-8 cell.
The example in Figure 3 illustrates the impact of accessible point-hexes within a hex-8 cell on the geographic threshold. In this graphic, the provider’s reported coverage is shown in blue, with the underlying roads shown. Only two point-hexes both contain a road and have greater than 50% overlap with the provider’s coverage. Therefore, only those two point-hexes are accessible, and the geographic threshold would be two point-hexes for this hex-8 cell rather than the typical requirement of four.
3.1.2 Temporal Threshold
We require a temporal threshold for the challenge process so that challengers must demonstrate that a lack of coverage is persistent rather than temporary. To meet this requirement, a hex-8 cell will need to include one set of two negative components of the same type (upload or download) with a time-of-day difference of at least four hours from another set of two negative components of the same type, regardless of the date of the tests. For example, if a challenge’s negative test components occurred at 10:00 a.m. and 10:15 a.m., and on another day additional negative test components occurred at 3:30 p.m. and 5:00 p.m., this temporal requirement would be satisfied because the difference between the morning tests (10:00 a.m. and 10:15 a.m.) and the afternoon tests (3:30 p.m. and 5:00 p.m.) is greater than four hours. If a challenge recorded only four negative components between 9:00 a.m. and 10:00 a.m. on the same day, or even negative components at 9:00 a.m. and 9:10 a.m. on one day and negative components at 10:00 a.m. and 10:15 a.m. on the following day, those tests would not satisfy this temporal requirement because the time-of-day difference between 9:10 a.m. and 10:00 a.m. is only 50 minutes.
3.1.3 Testing Threshold
We require a testing threshold for the challenge process so that challengers must demonstrate statistically significant evidence that coverage is inadequate. Specifically, in order for the testing threshold for a hex-8 cell to be met, at least five negative test components of the same type must have been taken within the cell when challengers have submitted 20 or fewer test components. When challengers have submitted more than 20 test components, we require that a certain minimum percentage of the total number of test components in the cell be negative, ranging from at least 24% negative, when challengers have submitted between 21 and 29 total test components, to at least 16% negative, when challengers have submitted 100 or more test components (see Table 3 below). Ignoring the costs to challengers and providers, a greater sample size only improves the statistical certainty associated with such a test. A general rule of thumb in statistics is that a sample size should be at least 30, but in this context we have considered the burden on challengers and providers of reaching such a sample size. Marco Taboga, Lectures on Probability Theory and Mathematical Statistics ch. 71 (3d ed. 2017). We also note that a sample size any lower than our suggested values would be of questionable statistical validity. For example, if 60 download components were recorded in a hex-8 cell, at least 12 of these tests must be negative to meet this requirement.
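For illustration only, the challenger testing threshold can be encoded directly from the schedule summarized above (and set out in Table 3 below); the function and names are illustrative and not part of the adopted rules.

```python
# Illustrative check of the challenger testing threshold (see Table 3 below).
# For 20 or fewer components a flat count of 5 negative components applies;
# above 20, a minimum negative share applies by total-test bracket.
CHALLENGER_SCHEDULE = [  # (minimum total tests, required fraction negative)
    (100, 0.16), (71, 0.17), (61, 0.18), (46, 0.20), (30, 0.22), (21, 0.24),
]

def meets_testing_threshold(total: int, negative: int) -> bool:
    if total <= 20:
        return negative >= 5
    for floor, fraction in CHALLENGER_SCHEDULE:
        if total >= floor:
            return negative / total >= fraction
    return False  # unreachable for total >= 21

# e.g., 60 download components require at least 12 negatives (20% of 60):
assert meets_testing_threshold(60, 12) and not meets_testing_threshold(60, 11)
```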
Once the percentage of negative components recorded meets the minimum negative percentage required (or, for a sample of fewer than 21 test components, once there are at least five negative test components submitted), we will not require additional tests, so long as both the geographic and temporal thresholds for a hex-8 cell have been met for the same component type. The thresholds for the percentage of negative tests are based on the statistical significance necessary to demonstrate lack of coverage.
Total Number of Tests | Count or Percent of Negative Tests
20 or fewer | 5 tests
21-29 | 24%
30-45 | 22%
46-60 | 20%
61-70 | 18%
71-99 | 17%
100+ | 16%
Table 3: Challenger testing threshold with required number of tests and negative test counts or percentages.
To avoid the testing threshold being skewed by a disproportionate number of components of the same type occurring in one location in a hex-8 cell with four or more accessible point-hexes, if the number of components of the same type in one point-hex represents more than 50% of the total test components of that type in the hex-8 cell but still satisfies the geographic threshold, the components in that point-hex will count only toward 50% of the threshold. For example, if three point-hexes each had 10 download components with 1 negative test (a 10% failure rate) and one point-hex had 80 download components with 30 negative tests (a 37.5% failure rate), the testing threshold would be calculated as if 60 total tests were taken: 30 in the one point-hex with disproportionate tests and 30 outside of this point-hex. Therefore, the percent of negative tests would be 3 × 10% × (10/60) + 37.5% × (30/60) = 23.75%, which meets the threshold of 20% for 60 tests. If three point-hexes each had 2 tests with 1 negative test (a 50% failure rate) and one point-hex had 10 tests with 2 negative tests (a 20% failure rate), the testing threshold would be calculated as if 12 total tests were taken: 6 in the one point-hex with disproportionate tests and 6 outside of this point-hex. Therefore, the count of negative tests (because the adjusted number of tests is 20 or fewer) would be 3 × 50% × 2 + 20% × 6 = 4.2 tests, which does not meet the threshold of at least 5 negative tests. In a hex-8 cell where there are only three accessible point-hexes, if the number of components of the same type in one point-hex represents more than 75% of the total test components of that type in the hex-8 cell but still satisfies the geographic threshold, the components in that point-hex will count only toward 75% of the threshold. If fewer than three point-hexes are accessible, we will not apply a maximum percentage of total test components for a single point-hex, as the risk that testing would be skewed by a disproportionate number of tests occurring in a single location is reduced.
3.1.3.1 Statistical Analysis
Given the variable nature of wireless signal propagation and network load, occasional negative tests are possible within areas that should otherwise have adequate coverage, assuming a cell edge probability of 90% with 50% cell loading. We have therefore defined the testing thresholds for the minimum number of tests and the percentage of negative tests per hex-8 cell based on a statistical test called a one-sided confidence interval. We refer to “proportion” as the percentage of positive/negative tests relative to the total number of tests. We refer to “probability” as the chance that any one test will be positive/negative.
In other words, the term “proportion” reflects a summary of observations, and the term “probability” reflects a hypothetical chance. As part of this method, we first defined the significance level for hypothesis testing: the null hypothesis is that the probability of adequate coverage is at least 90%, and thus the alternative hypothesis is that the probability of coverage is less than 90%. We have chosen 90% for the null hypothesis because the coverage maps are modeled to reflect a 90% probability of coverage at the cell edge. While the probability of coverage at any point within the cell (i.e., not at the cell edge) is greater than 90%, we chose a conservative probability for simplicity. In other words, to reject the null hypothesis and successfully establish a challenge to a provider’s coverage data, a challenge must provide sufficient evidence that the probability of coverage is less than 90% over a geographically and temporally diverse sample. Next, we chose a desired level of statistical significance. This value is commonly referred to as alpha (α) and represents the Type I error that we are willing to tolerate (the probability of a “false positive”). Taboga, supra note 29, ch. 78. For instance, a 95% confidence interval has a 5% significance level, or a 5% probability of rejecting the null hypothesis when the null hypothesis is true. Several tradeoffs exist when choosing the significance level. Choosing a very low significance level would reduce the chance of a “false positive” result—that is, successfully establishing a challenge when there is, in fact, adequate coverage in an area. To reduce the chance of false positives, however, more samples are required to demonstrate that coverage meeting the minimum speeds does not exist. At the same time, decreasing the significance level would increase the probability of a “false negative” result. A “false negative” is commonly referred to by the value beta (β) and represents the Type II error. Id. We therefore chose a significance level of 5%, which means we would calculate a 95% confidence interval and accept a 5% probability of finding inadequate coverage when there is actually adequate coverage. In many fields, common significance levels are 1% (99% confidence interval), 5% (95% confidence interval), and 10% (90% confidence interval). After choosing the significance level, we are able to calculate the minimum number of negative tests that a challenge would need in order to provide sufficient statistical evidence that coverage meeting the minimum speeds does not exist in a hex-8 cell. Specifically, we have applied the Clopper-Pearson method for calculating an exact binomial confidence interval. This “exact” method is more complicated than the traditional normal approximation of a binomial confidence interval, which is inaccurate when sample sizes are small or the proportion is very high or low. Given that we seek to minimize the burden on challengers and providers in the form of small sample sizes and expect the sample proportions to be very low (close to 0) or very high (close to 1), we have concluded that the traditional approach is not appropriate. A common rule of thumb is that the traditional method of a normal approximation should only be used if n × p ≥ 5 and n × (1 − p) ≥ 5, where n is the sample size and p is the sample proportion (number of positive tests divided by the sample size).
This rule of thumb can be interpreted to mean that the sample must contain at least 5 positive and 5 negative tests for the traditional approach to be used; with a sample proportion near 0.9, more than 50 tests would therefore be required. To reject the null hypothesis, the upper bound of the confidence interval must be less than 90%:
P_ub = 1 − β⁻¹(α; n − k, k + 1) < 0.9
where β⁻¹ is the inverse of the cumulative distribution function (CDF) of the beta distribution, used here to calculate a binomial CDF; α is the chosen significance level; n is the sample size (number of tests); and k is the number of positive tests. We distinguish here between the inverse CDF of the beta distribution and the scalar value of beta, which are not related. The beta distribution is a probability distribution defined by two shape parameters, with probability density function (PDF) f(x; α, β) = [1/B(α, β)] × x^(α−1) × (1 − x)^(β−1), where B(·) is the beta function, which supplies a normalization constant ensuring the total area under the PDF equals 1, with α > 0, β > 0, and 0 ≤ x ≤ 1. Note that the α in the inverse beta distribution corresponds to the significance level, but the α in the beta distribution is a shape parameter. If the challenger chooses to conduct more than the minimum number of tests, the required percentage of negative tests decreases, since the challenger has provided a larger sample size and there is more certainty that the sample proportion reflects the true probability of coverage meeting the minimum speeds. Using this formula, five or more failures in 20 tests demonstrates a likely lack of adequate coverage. We calculated the percentage of failures required at each number of tests. For simplicity, we then grouped the numbers of tests and chose failure percentages such that the number of failures will always meet or exceed the exact number of failures required.
3.1.4 Challenges to Larger, Lower-Resolution Hexagons
If multiple nearby hex-8 cells meet the three thresholds described above, it may point to a more systemic lack of coverage across a larger area. Rather than require that challengers meet these thresholds in every hex-8 cell near a group of challenged hex-8 cells, we will use the nested structure of H3 to establish challenges across larger areas. If four or more of the seven child hex-8 cells in a hex-7 cell (which has an average area of 5.2 square kilometers) are challenged, then the entire hex-7 cell also will be considered challenged (see Figure 4 below). Similarly, if four or more of the seven child hex-7 cells in a hex-6 cell (which has an average area of 36.1 square kilometers) are challenged, then the entire hex-6 cell also will be considered challenged. Hexagonal cells at a resolution lower than resolution 6 (i.e., those larger than a hex-6 cell) cannot be challenged in the challenge process. Instead, we rely upon the verification process to address cases where sufficient evidence suggests a more systemic problem with a provider’s coverage data, for example, across an area larger than a hex-6 cell.
Figure 4: Process by which challenges to hex-8 cells can challenge a larger hex-7 cell.
This process is illustrated in Figure 4 above. The leftmost graphic shows the challenges to hex-8 cells that would be determined using the geographic, temporal, and testing thresholds. The center graphic shows the hex-7 cells containing any challenged hex-8 cells that would then be identified.
Finally, the rightmost graphic shows that the system would consider any hex-7 cell that contains four or more challenged hex-8 cells to also be challenged.
3.1.5 Stationary vs. In-Vehicle Mobile Challenges
Mobile service providers are required to submit two sets of predictive propagation modeling data for a given technology: a map that predicts coverage assuming the device is outdoors and stationary, and a map that predicts coverage assuming the device is in-vehicle and mobile. Second Order and Third Further Notice, 35 FCC Rcd at 7481-82, para. 48; 47 CFR § 1.7004(c)(5). Similarly, challengers and providers are required to indicate, for each speed test, whether the speed test was conducted in an outdoor stationary or in-vehicle mobile environment. Third Order, 36 FCC Rcd at 1150-51, 1166, paras. 57, 59, 102 & n.315; 47 CFR § 1.7006(e)(1)(iii), (f)(1)(i)(G). We will first filter speed tests so as to exclude any outdoor stationary tests that fall outside of the provider’s outdoor stationary coverage map and any in-vehicle mobile tests that fall outside of the provider’s in-vehicle mobile coverage map. Because the two coverage maps may differ, especially at the edge of a provider’s network, speed tests submitted as challenges against the same provider within the same hex-8 cell may be sufficient to create a challenge against one of the provider’s coverage maps but insufficient to create a challenge against the other. For example, the aggregated speed tests might meet the testing, geographic, and temporal threshold requirements for a hex-8 cell when evaluated against a provider’s stationary 4G LTE coverage map but fail to meet the testing threshold when evaluated against the provider’s in-vehicle mobile 4G LTE coverage map, because a handful of stationary speed tests fell outside of the provider’s in-vehicle mobile coverage. We will mitigate these potential discrepancies by assuming that coverage is generally superior for stationary use compared to in-vehicle use. Because we expect performance to be better for stationary tests than in-vehicle tests, a challenge to an area on the stationary map will also create a challenge to the same area on the in-vehicle map if the provider reports coverage on both maps for that area. Similarly, a provider refuting a challenge to a geographic area on the in-vehicle map would also refute a challenge to the same area on the stationary map if challenges exist on both maps.
3.2 Provider Notification and Response Process
We will notify mobile service providers after the end of each calendar month of all hexagonal cells for which cognizable challenges were generated during that month. Upon notification, challenged providers will have 60 days to respond to a challenge by either conceding or disputing it. Third Order, 36 FCC Rcd at 1168, 1173, paras. 107, 121; 47 CFR § 1.7006(e)(3), (f)(4). Where the challenged provider disputes a challenge, it must submit either infrastructure data or on-the-ground speed test data in response. Third Order, 36 FCC Rcd at 1168-69, 1173-74, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5). If the challenged provider provides infrastructure data, it must provide the data specified in section 1.7006, and it may optionally provide any other data that would be helpful to the Commission in adjudicating challenges. Third Order, 36 FCC Rcd at 1168-69, 1173-74, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5).
If the provider submits on-the-ground speed test data in response to a challenge, we will evaluate the provider’s speed tests independently of the challenger data. A challenged provider may provide evidence of coverage in response to a challenge by submitting on-the-ground speed test data for any hex-8 cell within the challenged area: that is, any challenged hex-8 cell, as well as any hex-8 cell that is a child of a challenged hex-7 cell or a grandchild of a challenged hex-6 cell. If a challenged provider submits positive speed tests meeting the thresholds for challenged or non-challenged hex-8 cells that have a challenged parent or grandparent, such tests could overturn challenges in the larger hex-7 or hex-6 cells. The provider may only submit speed tests conducted during the previous 12 months as evidence in response to a challenge. The criteria that a provider must meet to overturn a challenge by showing evidence of coverage in a hex-8 cell are similar to the criteria for creating a challenge, except that the provider must satisfy the criteria for both download and upload components. Challenged providers will be required to meet the following three thresholds:
1. Geographic Threshold
2. Temporal Threshold
3. Testing Threshold
A hex-8 cell for which a challenge is successfully rebutted (i.e., coverage is confirmed) will not be subject to subsequent challenge until the first biannual BDC coverage data filing six months after the later of either the end of the 60-day response period or the resolution of the challenge (the “confirmed period”). A challenged provider may “restore” a challenged hex-7 or hex-6 cell to an unchallenged state if, as a result of data submitted by the provider, there is no longer sufficient evidence to sustain the challenge to that hexagon (see the example in section 3.2.4) or, as discussed below, if the provider has submitted evidence invalidating challenger speed tests such that the remaining valid challenger speed tests are no longer sufficient to challenge the hex-8 cell. Unlike cells for which a challenge was rebutted, a restored cell in which coverage has not been confirmed is subject to challenge at any time in the future as challengers submit new speed test data.
3.2.1 Geographic Threshold
We require a geographic threshold so that challenged providers must demonstrate that coverage exists over a sufficiently large area in the challenged cell to overturn a challenge. To overturn a challenge by showing adequate coverage in the challenged area, the provider will need to meet the same geographic threshold required of challengers, but with positive rather than negative test components. These requirements will ensure geographic diversity of tests and demonstrate that coverage is consistent over a sufficiently wide area. The system will count the number of point-hexes that contain, for both download and upload components: (a) at least two speed test components of the same type (download or upload), whether positive or negative; and (b) at least one positive component of that type. Such point-hexes will be considered to have some evidence of adequate coverage. To satisfy the geographic threshold for a challenge response, a hex-8 cell will generally need to contain at least four point-hexes that meet both criteria (a) and (b) above for both download and upload.
This requirement will assure that more than 50% of the point-hexes show adequate coverage. As with the geographic threshold required of challengers, if there are fewer than four “accessible” point-hexes (as defined in section 3.1.1) within the challenged hex-8 cell, the number of point-hexes for which the challenged provider must make this showing will be the same as the number required of the challenger.
3.2.2 Temporal Threshold
We require a temporal threshold so that challenged providers must demonstrate that the existence of coverage is persistent in order to overturn a challenge using on-the-ground speed tests, analogous to the temporal threshold required of challengers. To meet this requirement, a hex-8 cell will need to include a set of five positive download test components with a time-of-day difference of at least four hours from another set of five positive download test components, regardless of the date of the tests, as well as a set of five positive upload test components with a time-of-day difference of at least four hours from another set of five positive upload test components, likewise regardless of date. We require more tests to be separated in time than initially proposed to ensure sufficient diversity in the test data.
3.2.3 Testing Threshold
We require a testing threshold for challenged providers so that providers must demonstrate statistically significant evidence that coverage is adequate in order to overturn a challenge using on-the-ground speed tests, based on the same statistical significance analysis used for determining challenges, applied to both download and upload components. Using the same statistical analysis as detailed in section 3.1, the provider needs tests such that P_ub ≥ 0.9, demonstrating that the 95% confidence interval contains or exceeds 90%. Specifically, in order for the testing threshold for a hex-8 cell to be met, we require that at least 17 positive components of the same type have been taken in the cell when the provider has submitted 20 or fewer test components. When the provider has submitted more than 20 components, we require that a certain minimum percentage of the total number of test components in the cell be positive, ranging from at least 82% positive, when providers have submitted between 21 and 34 total components, to at least 88% positive, when providers have submitted 100 or more components (see Table 4 below). As more tests are taken, the confidence interval shrinks and, therefore, the percentage of positive tests required to demonstrate coverage increases. For example, if 50 test components were recorded in a hex-8 cell, at least 43 of these components must be positive to meet this requirement. Once there are at least 17 positive components submitted or the percentage of positive components meets the minimum percentage required, as appropriate, we do not require additional components so long as both the geographic and temporal thresholds for the hex-8 cell have been met for each type of component. These thresholds demonstrate that the required speeds can be obtained in the cell 90% of the time.
Total Number of Tests | Count or Percent of Positive Tests
20 or fewer | 17 tests
21-34 | 82%
35-49 | 84%
50-70 | 86%
71-99 | 87%
100+ | 88%
Table 4: Provider testing threshold with required number of tests and positive test counts or percentages.
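For illustration only, the provider-side threshold in Table 4 can be encoded the same way as the challenger sketch following section 3.1.3, with the schedule inverted to require positive rather than negative components; the names are illustrative.

```python
# Illustrative provider-side counterpart to the challenger threshold sketch:
# Table 4 requires a minimum share of positive components (a flat count of
# 17 when 20 or fewer components have been submitted).
PROVIDER_SCHEDULE = [  # (minimum total tests, required fraction positive)
    (100, 0.88), (71, 0.87), (50, 0.86), (35, 0.84), (21, 0.82),
]

def provider_meets_testing_threshold(total: int, positive: int) -> bool:
    if total <= 20:
        return positive >= 17
    for floor, fraction in PROVIDER_SCHEDULE:
        if total >= floor:
            return positive / total >= fraction
    return False  # unreachable for total >= 21

# e.g., 50 components require at least 43 positives (86% of 50):
assert provider_meets_testing_threshold(50, 43)
```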
As with challenges, to avoid the testing threshold being skewed by a disproportionate number of components occurring in one location in a hex-8 cell with four or more accessible point-hexes, if the number of components in one point-hex represents more than 50% of the total test components in the hex-8 cell but still satisfies the geographic threshold, then the components in that point-hex will count only toward 50% of the threshold. For example, if three point-hexes each had 10 download components with 9 positive tests (a 90% pass rate) and one point-hex had 80 download components with 70 positive tests (an 87.5% pass rate), the testing threshold would be calculated as if 60 total tests were taken: 30 in the one point-hex with disproportionate tests and 30 outside of this point-hex. Therefore, the percent of positive tests would be 3 × 90% × (10/60) + 87.5% × (30/60) = 88.75%, which meets the threshold of 86% for 60 tests. If three point-hexes each had 2 tests with 2 positive tests (a 100% pass rate) and one point-hex had 18 tests with 16 positive tests (an 88.9% pass rate), the testing threshold would be calculated as if 12 total tests were taken: 6 in the one point-hex with disproportionate tests and 6 outside of this point-hex. Therefore, the count of positive tests (because the adjusted number of tests is 20 or fewer) would be 3 × 100% × 2 + 88.9% × 6 ≈ 11.3 tests, which does not meet the threshold of at least 17 positive tests. In a hex-8 cell where there are only three accessible point-hexes, if the number of components in one point-hex represents more than 75% of the total test components in the hex-8 cell but still satisfies the geographic threshold, then the components in that point-hex will count only toward 75% of the threshold.
3.2.4 Responding to Lower Resolution Challenges
If a hex-7 or hex-6 cell is challenged, the provider can overturn the challenge by submitting positive speed test results sufficient to overturn or revert enough challenges that fewer than four of the child cells remain challenged. If a provider does not submit sufficient speed test data to overturn a challenged hex-8, hex-7, or hex-6 cell, and is not able to overturn or revert the challenge using other evidence, then it is required to remove from its coverage map the area overlapping any portion of the successfully challenged hexagon. However, a challenged provider does not need to remove from its coverage map the area overlapping any hex-8 cells in which the provider was able to provide sufficient on-the-ground test evidence to overturn the challenge.
Figure 5: Process by which provider responses to challenged hex-8 cells can revert a challenged hex-7 cell.
This process is illustrated in Figure 5 above. The leftmost graphic shows a hex-7 cell that would be challenged because four child hex-8 cells were challenged. The graphic in the center shows that the provider overturned the challenges to three hex-8 cells (in green; one was directly challenged, and the other two were challenged through the challenge to the hex-7 cell), leaving only three hex-8 cells challenged (in red), so the previously challenged parent hex-7 cell would be restored and no longer challenged. The rightmost graphic shows, in the alternative, that the provider overturned the challenges to only two hex-8 cells (in green; both challenged through the challenge to the hex-7 cell), leaving four hex-8 cells challenged (in dark red); therefore the challenge to the parent hex-7 cell would remain.
The provider would be required to remove from its coverage map all areas successfully challenged but would not need to remove the areas overlapping the two hex-8 cells for which the challenges were overturned.
3.3 Post-Challenge Review
After the challenged provider submits all responses, any areas that remain challenged will be adjudicated in favor of the challenger and must be removed from the provider’s coverage map. Third Order, 36 FCC Rcd at 1170, 1174, paras. 112, 124; 47 CFR § 1.7006(e)(3), (f)(4). The provider will have 30 days to submit new maps reflecting these updates. Third Order, 36 FCC Rcd at 1170, 1174, paras. 112, 124; 47 CFR § 1.7006(e)(6), (f)(7). Speed tests submitted for the challenges in these areas will thus no longer be valid, because the tests would no longer fall within the provider’s coverage map. Any areas where the provider successfully rebutted the challenge by confirming coverage will be marked as such and will be ineligible for challenge for the duration of the confirmed period. Any hex cells that were challenged and restored (but where coverage was not confirmed) will remain eligible to be challenged. Any valid speed test in these cells may still be used for a future challenge (up to a year from the date the test was conducted). It is also possible that sufficient proof was provided to confirm that a hex cell had coverage while some of its children were neither challenged nor confirmed to have coverage. These individual child hex cells can still be challenged in the future, but the confirmed parent hex cell cannot be challenged during the confirmed period described previously. If the provider makes modifications to its infrastructure such that it now models coverage in an area that had previously been successfully challenged, the provider may include this area in its coverage map data, but it must submit, before certifying its BDC coverage data, additional information or data detailing what modifications have been made that now provide this coverage. Third Order, 36 FCC Rcd at 1170, 1174, paras. 111, 123; 47 CFR § 1.7006(e)(5), (f)(6). Providers would be required to submit such information and data as part of their biannual BDC coverage data submissions. Otherwise, areas that have been successfully challenged must not be included in future submissions of the coverage map. Third Order, 36 FCC Rcd at 1170, 1174, paras. 112, 124; 47 CFR § 1.7006(e)(6), (f)(7).
4 Methodology for Verifying Coverage Data
4.1 Introduction
In the Third Order, the Commission called for the development and administration of “specific requirements and methodologies that providers [subject to verification] must use in conducting on-the-ground-tests, including the geographic areas that must be subject to the on-the-ground testing so that the tested areas satisfy the requirements of a statistically valid and unbiased sample of the provider’s network.” Third Order, 36 FCC Rcd at 1150, para. 59. In this section, we provide the technical details for a stratified random sample design that addresses these requirements.
In particular, these technical details guide both the Commission and the provider subject to verification in determining:
· where, within the geographic boundaries of the relevant portion of its coverage map, a provider should conduct on-the-ground testing;
· in how many locations a provider must conduct on-the-ground speed test measurements;
· which speed test measurements will be accepted for staff analysis by the Commission; and
· how Commission staff will evaluate the test data and adjudicate whether the provider has passed or failed verification.
The sample design is applicable when the mobile service provider chooses to submit on-the-ground test data as evidence to support its claim of mobile broadband availability in the portion of its coverage map subject to verification.
4.2 Targeted Area, Frame, Units, and Sample
At the heart of any sampling exercise are one or more research questions about a population of interest. Such questions typically involve estimates of population totals, averages, proportions, and similar statistics. Data are then collected from a randomly selected subset of the population, called a “sample,” and subsequently analyzed to produce the desired estimates. Sampling starts with the division of the target population into unique components called “units.” The list of units is called the “frame.” A sample of units is then randomly selected from the frame. In the context of BDC mobile verifications, the population of interest, or “targeted area,” is the portion of a provider’s coverage map that is subject to verification. Staff will define the boundaries of the targeted area based upon the mechanism for triggering verification, i.e., defining the area where staff has a “credible basis” to request verification from the provider. The goal is to determine whether the service provider makes available adequate coverage meeting the minimum speeds required for the modeled technology throughout the targeted area and at different periods of the day. Id. at 1146-47, para. 50; 47 U.S.C. § 642(a)(1)(B)(i), (b)(2)(B). As a well-defined geographic area, the targeted area can be processed into layers of hexagons consistent with the system used in the challenge process (the H3 geospatial indexing system at resolution 8). Each tessellating hex cell inside the target is a sampling unit, and the set of all such hexagons forms the frame. Figures 6(a) and (b) below illustrate these concepts:
Figure 6: Defining the Target Area, Frame, Units, and Sample for the Mobile Verification Process.
4.3 Stratified Random Sampling
Stratified random sampling occurs when: (1) a frame is divided into non-overlapping, mutually exclusive groups such that every unit is in exactly one group, called a “stratum” (plural: strata) (this process is called stratification); and (2) units are selected at random in each stratum and independently across strata (i.e., the selection of units in one stratum does not affect the selection of units in another stratum). When properly implemented, a stratified design can simultaneously increase the precision of the desired estimates and decrease the total number of units in the sample (the sample size) required to meet this precision. William G. Cochran, Sampling Techniques ch. 5 (3d ed. 1977). Ideally, stratification is accomplished by using prior knowledge about the quantity of interest. However, it is usually the case (especially for a new sampling exercise) that no such prior knowledge is available.
In this case, one or more quantities directly related to (i.e., correlated with) the quantity of interest are used as stratification variables. As described in the verification design, stratification will begin by first identifying which hex cells in the frame can be drive tested to easily produce on-the-ground speed tests, dividing the frame into drive-testable and non-drive-testable hex cells, as illustrated in Figure 6(c). More specifically, we will include as drive-testable any hex-8 cell within the frame where at least one of its seven point-hexes is accessible, using the same definition for an “accessible” point-hex as we use in evaluating the geographic threshold during the mobile challenge process. Supra § 3.1.1 (defining a point-hex as “accessible” where at least 50% of the point-hex overlaps with the provider’s reported coverage data and a road runs through the point-hex, using U.S. Census Bureau roadway data). We will use road data from the U.S. Census Bureau to make this determination. U.S. Census Bureau, 2020 TIGER/Line Shapefiles: Roads, https://www.census.gov/cgi-bin/geo/shapefiles/index.php?year=2020&layergroup=Roads (last visited Feb. 1, 2022). Next, in the drive-testable hex cells, we may use terrain variation, denoted as X and measured in meters, as one of the stratification variables. See Office of Economics and Analytics and Wireline Competition Bureau Adopt Adjustment Factor Values for the 5G Fund, GN Docket No. 20-32, Public Notice, 35 FCC Rcd 12975, 12976, paras. 4-5 (OEA/WCB 2020). Terrain variation is a viable stratification variable because it is correlated with broadband availability owing to the characteristics of radiofrequency propagation. We would calculate a terrain variation value for every hex cell in the frame. Additional stratification variables may also be used to stratify the drive-testable hex cells (for example, signal strength data from a provider’s “heat map” or staff-performed propagation modeling, clutter, population, or urban/rural status). When the stratification variable is continuous, we may construct the strata using the cumulative square root of the frequency rule, a standard stratification method that creates stratum boundaries at equal intervals not on the X scale itself, but on the cumulative square root of the count (frequency) of drive-testable hex cells computed over equal intervals along the X scale. Cochran, supra note 54, at 127-31. Figure 6(d) shows an example of stratification of the drive-testable hex cells.
4.4 Sample Selection
In any sampling exercise, the question of what sample size, n, to use is of primary importance. In the context of the mobile verification process, staff will decide the value of n based on a set of assumptions. The first assumption is that the margin of error, which is the acceptable difference between an estimated proportion of positive speed tests for the modeled technology and the population proportion, is a specified value, d. The second assumption is that the cost of drive testing is approximately the same in every drive-testable hex cell selected in the sample. Under these assumptions, and at a specified confidence level, a theoretical value for the sample size n can be calculated as detailed below. Id. at 110.
n = z²_{α/2} [ Σ_{h=1}^{L} W_h √(p_h(1 − p_h)) ]² / { d² + (z²_{α/2} / N) Σ_{h=1}^{L} W_h p_h(1 − p_h) }
where
· n is the sample size;
· L is the number of strata;
· N is the number of hex cells in the frame;
· W_h is the weight of stratum h, calculated as the ratio of the number of hex cells in the stratum, N_h, to N: W_h = N_h / N;
· p_h is the assumed value of the population proportion of positive tests in the h-th stratum. By default, we will set this value to 0.5 for all strata, since this ensures an adequate sample size when no additional information is known about the stratum, but we will adjust this value for the different strata when we have relevant information. For example, we may use crowdsourced data to calculate a stratum’s proportion of positive tests, with the option to weight tests or subset the data to ensure reliability;
· z_{α/2} is the (1 − α/2)th percentile of the standard normal distribution; and
· d is the margin of error, a pre-specified acceptable difference between an estimated proportion p̂ and the population proportion p₀: P(|p̂ − p₀| ≤ d) = 1 − α, where p̂ = Σ_{h=1}^{L} W_h p_h.
Once determined, n will be allocated among the different strata. Specifically, if n_h is the number of sample hexagons allocated to stratum h, then:
n_h = n × W_h √(p_h(1 − p_h)) / Σ_{h=1}^{L} W_h √(p_h(1 − p_h)) = n × N_h √(p_h(1 − p_h)) / Σ_{h=1}^{L} N_h √(p_h(1 − p_h))
This method of apportioning the sample among the various strata is called Neyman allocation. Cochran, supra note 54, at 99. Note that n = Σ_{h=1}^{L} n_h. A minimum sample size of 20 hex cells for each stratum will be applied to ensure that strata with a low number of hex cells are sufficiently tested. Guided by this allocation scheme, staff will apply spatial random sampling without replacement to select the hex cells in each stratum (using Fibonacci lattice-based spatial sampling; see Alvaro González, Measurement of Areas on a Sphere Using Fibonacci and Latitude-Longitude Lattices, 42 Mathematical Geosciences 49 (2010)), as illustrated in Figure 6(e). Within each selected hex cell, an accessible child hex-9 cell will then be randomly selected, in which testing must occur. The provider subject to verification will then be notified of the sample hex cells and the specific child hex-9 cells in which it will be required to conduct on-the-ground speed tests.
4.5 Valid On-The-Ground Test Measurements
On-the-ground speed test measurement data submitted in response to a verification request must be taken at various stationary points or along mobile drive paths entirely within each of the n randomly selected point-hexes. Moreover, providers will be required to collect these data using mobile devices running the FCC Speed Test app, a third-party speed test app approved for use in the challenge process, or other software and hardware approved by staff. To the extent that a provider chooses to use software other than the FCC Speed Test app or another speed test app approved by OET for use in the challenge process, we will consider such software approved for use in rebutting challenges provided that the software adopts the test methodology and collects the metrics that approved apps must collect for consumer challenges and that government and third-party entity challenger speed test data must contain. Unlike in the BDC challenge process, geographic variation in the on-the-ground test data for the verification process is guaranteed by spatial random sampling. Moreover, as explained in the next section, pass or fail adjudication for verifications relies on the totality of the submitted valid on-the-ground test data taken from the randomly selected hex cells.
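For illustration only, the sample-size and Neyman allocation formulas in section 4.4 can be computed directly. The sketch below assumes the default p_h = 0.5, a 95% confidence level (z = 1.96), and hypothetical stratum sizes; it is not part of the adopted methodology.

```python
# Sketch of the section 4.4 sample-size and Neyman-allocation formulas.
# Inputs are hypothetical; by default p_h = 0.5 for every stratum.
import math

def stratified_sample_size(N_h, p_h, d=0.05, z=1.96):
    """Return (n, [n_h]) for stratum sizes N_h and assumed proportions p_h."""
    N = sum(N_h)
    W = [Nh / N for Nh in N_h]
    s = [math.sqrt(p * (1 - p)) for p in p_h]  # per-stratum std. deviation
    numerator = z**2 * sum(w * sh for w, sh in zip(W, s)) ** 2
    denominator = d**2 + (z**2 / N) * sum(w * p * (1 - p) for w, p in zip(W, p_h))
    n = math.ceil(numerator / denominator)
    # Neyman allocation, with the 20-cell-per-stratum floor described above.
    total = sum(w * sh for w, sh in zip(W, s))
    n_h = [max(20, math.ceil(n * w * sh / total)) for w, sh in zip(W, s)]
    return n, n_h

# e.g., three strata of 500, 300, and 200 drive-testable hex-8 cells:
print(stratified_sample_size([500, 300, 200], [0.5, 0.5, 0.5]))
```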
Because geographic variation is guaranteed by the sampling design, and because adjudication relies on the totality of the submitted tests, specific testing threshold requirements similar to those that apply to challenges are not materially relevant to verifications. Thus, only the temporal threshold is borrowed from the challenge process and applied in the context of verifications. To ensure temporal variation, the provider will generally need to record at least two tests within each of the randomly selected hexagons, where the times of day of the tests are at least four hours apart, irrespective of date. We will use only the first two tests in each hex-8 cell, based on time and date, with at least a four-hour time-of-day difference, when calculating the stratum-level proportion. For each sampled hex-8 cell, we will order the tests sequentially by time and date and choose the first test in that sequence, which we define as the primary test. If two tests are required to meet the temporal requirement, we will then choose the next test in the sequence with at least a four-hour time-of-day difference from the first test chosen, regardless of the date of the tests. If no test has a four-hour time-of-day difference from the primary test, the next test in the sequence after the current primary test will be selected as the new primary test, and the process will be repeated until two tests with at least a four-hour time-of-day difference are selected (i.e., the primary and secondary tests). Providers have the option of conducting as many speed tests as they would like within the sampled hex-8 cells and are required to submit all tests conducted in response to a verification inquiry. Our approach of using only one or two tests from each sampled hex-8 cell will ensure that the stratum-level sample proportion is not distorted by many tests in a small number of the sampled hex-8 cells within the stratum and is designed to eliminate any incentive to take more tests in the sampled hex-8 cells with the best coverage. As adopted, we will relax the temporal threshold to require only a single test in a sampled hexagon where the provider establishes that significant variance in performance due to cell loading is unlikely, based upon the submission of actual cell loading data that support this assumption. Specifically, we will relax the temporal threshold for any sampled hexagon where the cell loading data for the cell(s) covering the hexagon show that median loading, measured in 15-minute intervals, did not exceed the modeled loading factor for the one-week period prior to the verification inquiry. In cases where the provider needs to conduct only one test in the sampled point-hex (i.e., a child hex-9 cell), we will treat the one test as two identical tests when calculating the stratum’s proportion, to ensure that each hex-8 cell has equal weighting despite the different numbers of tests per hex-8 cell. In other words, in the hex-8 cells that the provider has sufficiently shown to be “non-congested” via cell loading data, a positive test will count as two positive tests and a negative test as two negative tests when calculating the stratum-level proportion. In effect, we assume that, had the provider been required to meet the temporal threshold by taking a second test in the sampled hex-8 cell, that test would have had the same result as the first, and we set the secondary test as equivalent to the primary test for the outcome of adjudication.
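For illustration only, the primary/secondary selection rule just described might be implemented as follows. The sketch uses a simple absolute time-of-day difference (it does not wrap around midnight, an assumption the Order does not address) and hypothetical data structures.

```python
# Sketch of the primary/secondary selection rule described above: order tests
# by date and time, take the earliest as primary, then take the next test in
# the sequence with a time-of-day difference of at least four hours; if none
# exists, advance the primary and repeat. Tests are (datetime, result) pairs.
from datetime import datetime

def tod_gap_hours(a: datetime, b: datetime) -> float:
    """Time-of-day difference in hours, ignoring the date."""
    return abs((a.hour * 60 + a.minute) - (b.hour * 60 + b.minute)) / 60.0

def select_primary_secondary(tests):
    tests = sorted(tests, key=lambda t: t[0])  # sequence by date and time
    for i, primary in enumerate(tests):
        for secondary in tests[i + 1:]:
            if tod_gap_hours(primary[0], secondary[0]) >= 4:
                return primary, secondary
    return None  # the temporal requirement cannot be met with these tests
```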
Given the small geographic size of a hex-9 cell and the verification thresholds, we anticipate that some tests, particularly in-motion tests, could unintentionally lie outside the hex-9 boundary, resulting in an incomplete sample. In the case that a provider does not submit all required tests, we implement the following rules to allow the Commission to adjudicate a final decision without a complete sample. We will treat any tests within the sampled accessible point-hex that are outside the coverage area as valid in the case where tests were not recorded within the coverage area. If a required sampled point-hex continues to have missing tests, we will also consider tests that fall slightly outside the required point-hex, but within the typical GPS average user range error, as valid when no tests are recorded within the point-hex. If the sampled point-hex still has missing tests, we will set those missing required speed tests as negative tests when performing the final adjudication. These rules will provide additional flexibility to the provider testing in a small geographic area and allow the provider simply to concede missing tests as negative tests. For instance, a provider may decide to concede missing speed tests as negative tests when the collection of additional tests, and the resulting burden of additional drive testing, would not have changed the outcome of the adjudication.
4.6 Calculation of the Overall Estimate of Broadband Availability and Adjudication of the Outcome of Verification
The question underlying the statistical sampling exercise for BDC mobile verifications is whether the provider provides adequate coverage meeting the minimum speeds throughout the targeted area. This section describes how we will determine an overall estimate of adequate broadband availability based upon valid speed test data submitted by the provider in response to a verification request. Let m_{h,i} be the count of the selected primary and secondary speed test measurements in the i-th sample hexagon of stratum h, where i = 1, …, n_h. See supra note 64 for an explanation of primary and secondary tests. Of these, let k_{h,i} be the count of positive test measurements in the i-th sample hexagon of stratum h; that is, measurements where both the download and upload speed values meet or exceed the minimum values the provider reports that it provides for a particular technology throughout the target area. Let m_h = Σ_{i=1}^{n_h} m_{h,i} and k_h = Σ_{i=1}^{n_h} k_{h,i}. Finally, let:
p_h = k_h / m_h,  h = 1, …, L
As appropriate for stratified random sampling, Cochran, supra note 54, at 107, an estimate of the overall proportion of positive measurements in the target is given by:
p̂ = Σ_{h=1}^{L} (N_h p_h) / N = Σ_{h=1}^{L} W_h p_h
and an estimate of its variance is:
V(p̂) = Σ_{h=1}^{L} W_h² × (p_h(1 − p_h) / n_h) × (1 − n_h / N_h)
Using these estimates, we will then construct a one-sided 95% confidence interval for the population proportion, with the upper limit calculated as p̂ + 1.645 × √V(p̂). This one-sided 95% confidence interval calculation assumes that the sample size is large enough to use the normal distribution. If this computed value is greater than or equal to 0.9, which is the threshold proportion the Commission considers as delineating broadband availability for the purpose of verifications, then we will determine that the provider has passed verification.
5 Critical Mass of Crowdsourced Data
Crowdsourced mobile speed test data (“crowdsourced data”) are another mechanism through which the Commission may obtain data with which to evaluate the accuracy of a provider’s coverage map.
Crowdsourced data are any data that have been submitted through the portal and were not part of an area already challenged or part of a verification process. As discussed in the Order, once Commission staff determines that a “critical mass” of crowdsourced data submissions indicates that a provider’s coverage map may be inaccurate, Commission staff has a “credible basis” for verifying the provider’s coverage data in that area. In this section, we clarify what it means for crowdsourced data collected through the BDC data submission portal (“portal”) to have reached “critical mass.” A statistical clustering approach to the identification of clusters of concern will (a) reduce the amount of staff work and (b) assure that an unbiased analysis has provided evidence that specific areas warrant review by Commission staff. When crowdsourced data are received through the portal, speed tests will be required to meet the same standards as required within the mobile challenge and mobile verification processes, with the exception that we will accept a test with only a download or only an upload component. Tests taken outside of the provider’s coverage area will not be accepted as part of the crowdsourced data. Similar to the challenge process, we will evaluate download speeds and upload speeds separately. An area identified by either component will be referred for review to determine whether a verification is required. We present the process as applied to the download component, but the same process applies to the upload component as well. The clustering process described below will be used to determine the locations that are most likely to have shown problems with coverage in areas where a provider indicated coverage on a given technology map. Those identified locations will be provided to Commission staff for further evaluation. For each hex-8 cell with at least 30 valid tests for a specific provider and a specific technology map (e.g., in-vehicle mobile 5G), we calculate the following statistics:
· n, the number of valid test data points in the hexagon; and
· p, the proportion of test data points in the hexagon whose download (or upload) speed value is less than the minimum value the provider claims in its coverage map for that technology.
We identify all hex-8 cells where p ≥ 0.22. A failure rate of 0.22 (22%) is consistent with the minimum failure rate for 30 tests as described previously in the challenge process. See supra Section 3.1.3. These hex-8 cells are considered for further aggregation into regional clusters. To create these larger clusters, we use density-based spatial clustering of applications with noise (DBSCAN), a well-established clustering algorithm that, given a set of points in space (in this case, the centroids of the hex-8 cells), groups together points that are geographically “close.” Jörg Sander, Generalized Density-Based Clustering for Spatial Data Mining (Herbert Utz Verlag 1998), ISBN 3-89675-469-6. We will use DBSCAN as the data clustering algorithm because we find it most appropriate for clustering the type of geospatial data that we expect in the BDC mobile verification process, where we cannot know in advance the complete set of data we will receive. Unlike other data clustering algorithms that we considered and that have been used with geospatial data (for example, K-Means), the DBSCAN algorithm allows us to determine the number of clusters dynamically, and it provides results with relatively compact clusters.
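For illustration only, the clustering step might resemble the following sketch, which applies scikit-learn's DBSCAN to the centroids of flagged hex-8 cells. The eps and min_samples values are placeholders; as discussed below, the actual parameter values will be determined from the data received.

```python
# Illustrative clustering of flagged hex-8 centroids (n >= 30 tests and
# p >= 0.22) with scikit-learn's DBSCAN. eps_km and min_cells are
# placeholders; the Order leaves the actual parameter values to staff.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_flagged_hexes(centroids_deg, eps_km=2.0, min_cells=3):
    """centroids_deg: array of (lat, lng) pairs for the flagged hex-8 cells."""
    earth_radius_km = 6371.0
    coords = np.radians(np.asarray(centroids_deg))  # haversine needs radians
    labels = DBSCAN(
        eps=eps_km / earth_radius_km,  # convert km to radians on the sphere
        min_samples=min_cells,
        metric="haversine",
        algorithm="ball_tree",
    ).fit_predict(coords)
    return labels  # -1 marks noise; other labels identify regional clusters
```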
APPENDIX B

Final Rules

1. Amend section 1.7001 by adding paragraph (a)(20) to read as follows:

§ 1.7001 Scope and Content of Filed Reports

(a) * * *

(1) * * *

(20) H3 standardized geospatial indexing system. A system developed by Uber Technologies, Inc., that overlays the Earth with hexagonal cells of different sizes at various resolutions. The smallest hexagonal cells are at resolution 15, in which the average hexagonal cell has an area of approximately 0.9 square meters, and the largest are at resolution 0, in which the average hexagonal cell has an area of approximately 4.25 million square kilometers. A hexagonal cell at resolution n is referred to as a "hex-n" cell (e.g., "hex-15" for the smallest size of hexagonal cell). The H3 standardized geospatial indexing system employs a nested cell structure wherein a lower-resolution hexagonal cell (the "parent") contains approximately seven hexagonal cells at the next highest resolution (its "children"). That is, a hex-1 cell is the "parent" of seven hex-2 cells, each hex-2 cell is the parent of seven hex-3 cells, and so on.
2. Amend section 1.7006 by adding paragraph (b)(2), redesignating existing paragraph (b)(2) as (b)(3), revising paragraphs (b)(3) and (b)(4) and redesignating them as paragraphs (b)(4)-(b)(5), revising paragraph (c), adding paragraphs (c)(1)-(c)(2), revising paragraph (e)(1)(i), removing paragraph (e)(1)(ii), revising paragraph (e)(1)(iii) and redesignating it as paragraph (e)(1)(ii), redesignating the first paragraph designated as (e)(1)(iv) as (e)(1)(iii), revising paragraphs (e)(2), (e)(4), and (e)(6), adding paragraph (e)(7), revising paragraphs (f) and (f)(1)(i), adding paragraph (f)(1)(iv), and revising paragraphs (f)(2), (f)(3), and (f)(5) to read as follows:

§ 1.7006 Data Verification.

* * * * *

(b) * * *

(2) On-the-ground crowdsourced data must include the metrics and meet the testing parameters described in paragraphs (c)(1)(i)-(ii) of this section, except that the data may include any combination of download speed and upload speed rather than both.

(3) The online portal shall notify a provider of a crowdsourced data filing against it, but a provider is not required to respond to a crowdsourced data filing.

(4) If, as a result of crowdsourced data and/or other available data, the Commission determines that a provider's coverage information is likely not accurate, then the provider shall be subject to a verification inquiry consistent with the mobile verification process described in paragraph (c) of this section.

(5) All information submitted as part of the crowdsourcing process shall be made public via the Commission's website, with the exception of personally identifiable information and any data required to be confidential under § 0.457 of this chapter.

(c) Mobile service verification process for mobile providers. Mobile service providers must submit either infrastructure information or on-the-ground test data in response to a request by Commission staff as part of its inquiry to independently verify the accuracy of the mobile provider's coverage propagation models and maps. In addition to submitting either on-the-ground data or infrastructure data, a provider may also submit data collected from transmitter monitoring software. The Office of Economics and Analytics and the Wireless Telecommunications Bureau may require the submission of additional data when necessary to complete a verification inquiry. A provider must submit its data, in the case of both infrastructure information and on-the-ground data, within 60 days of receiving a Commission staff request. Regarding on-the-ground data, a provider must submit evidence of network performance based on a sample of on-the-ground tests that is statistically appropriate for the area tested. A provider must verify coverage of a sampled area using the H3 geospatial indexing system at resolution 8. The on-the-ground tests will be evaluated to confirm, using a one-sided 95% statistical confidence interval, that the cell coverage is 90% or higher.
In submitting data in response to a verification request, a provider must record at least two tests within each of the randomly selected hexagons where the times of the tests are at least four hours apart, irrespective of date, unless, for any sampled hexagon, the provider has and submits alongside its speed tests actual cell loading data for the cell(s) covering the hexagon sufficient to establish that median loading, measured in 15-minute intervals, did not exceed the modeled loading factor for the one-week period prior to the verification inquiry, in which case the provider is required to submit only a single test for the sampled hexagon. We will treat any tests within the sampled accessible point-hex that fall outside the coverage area as valid in the case where tests were not recorded within the coverage area. If the required sampled point-hexes continue to have missing tests, we will also consider tests that fall slightly outside the required point-hex but within the typical GPS average user range error as valid when no tests are recorded within the point-hex. If a sampled point-hex still has missing tests, we will set those missing required speed tests as negative tests when performing the final adjudication. For in-vehicle mobile tests, providers must conduct tests with the antenna located inside the vehicle.

(1) When a mobile service provider chooses to demonstrate mobile broadband coverage availability by submitting on-the-ground data, the mobile service provider must provide valid on-the-ground tests within a Commission-identified statistically valid and unbiased sample of its network.

(i) On-the-ground test data must meet the following testing parameters:

(A) A minimum test length of 5 seconds and a maximum test length of 30 seconds. These test length parameters apply individually to download speed, upload speed, and round-trip latency measurements, and do not include ramp-up time. The minimum test duration requirement will be relaxed once a download or upload test measurement has transferred at least 1,000 megabytes of data;

(B) Reporting test measurement results that have been averaged over the duration of the test (i.e., total bits received divided by total test time);

(C) Conducted outdoors between the hours of 6:00 a.m. and 10:00 p.m. local time;
(ii) On-the-ground test data must include the following metrics for each test:

(A) Testing app name and version;

(B) Timestamp and duration of each test metric;

(C) Geographic coordinates (i.e., latitude/longitude) measured at the start and end of each test metric with typical Global Positioning System (GPS) Standard Positioning Service accuracy or better, along with the location accuracy;

(D) Consumer-grade device type(s), brand/model, and operating system used for the test;

(E) Name and identity of the service provider being tested;

(F) Location of test server (e.g., hostname or IP address);

(G) Signal strength, signal quality, unique identifier, and radiofrequency metrics of each serving cell, where available;

(H) Download speed;

(I) Upload speed;

(J) Round-trip latency;

(K) Whether the test was taken in an in-vehicle mobile or outdoor, pedestrian stationary environment;

(L) For an in-vehicle test, the speed the vehicle was traveling when the test was taken, where available;

(M) An indication of whether the test failed to establish a connection with a mobile network at the time and place it was initiated;

(N) The network technology (e.g., 4G LTE, 5G-NR) and spectrum bands used for the test; and

(O) All other metrics required per the most recent specification for mobile test data adopted by the Office of Economics and Analytics and the Wireless Telecommunications Bureau in accordance with 5 U.S.C. § 553.

(2) When a mobile service provider chooses to demonstrate mobile broadband coverage availability by submitting infrastructure data, the mobile service provider must submit such data for all cell sites and antennas that serve or interfere with the targeted area.

(i) Infrastructure data must include the following information for each cell site that the provider uses to provide service for the area subject to the verification inquiry:

(A) The latitude and longitude of the cell site measured with typical GPS Standard Positioning Service accuracy or better;

(B) The cell and site ID number for each cell site;

(C) The ground elevation above mean sea level (AMSL) of the site (in meters);

(D) Frequency band(s) used to provide service for each site being mapped, including channel bandwidth (in megahertz);

(E) Radio technologies used on each band for each site;

(F) Capacity (in Mbps) and type of backhaul used at each cell site;

(G) Number of sectors at each cell site;

(H) Effective Isotropic Radiated Power (EIRP, in dBm) of the sector at the time the mobile provider creates its map of the coverage data;

(I) Geographic coordinates of each transmitter site measured with typical GPS Standard Positioning Service accuracy or better;

(J) Per-site classification (e.g., urban, suburban, or rural);

(K) Elevation above ground level for each base station antenna and other transmit antenna specifications (i.e., the make and model, beamwidth (in degrees), radiation pattern, and orientation (azimuth and any electrical and/or mechanical down-tilt, in degrees) at each cell site);

(L) Operating transmit power of the radio equipment at each cell site;

(M) Throughput and associated required signal strength and signal-to-noise ratio;

(N) Cell loading distribution;

(O) Areas enabled with carrier aggregation and a list of band combinations; and

(P) Any additional parameters and fields that are listed in the most recent specifications for wireless infrastructure data released by the Office of Economics and Analytics and the Wireless Telecommunications Bureau in accordance with 5 U.S.C. § 553.
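For illustration only, a filer might screen test records against the metric list in paragraph (c)(1)(ii) along the lines sketched below; the field names are assumptions for the sake of the example, not a prescribed BDC schema.

```python
# Illustrative sketch: field names are assumptions, not a prescribed BDC schema.
REQUIRED_TEST_METRICS = {
    "app_name_version",            # (A) testing app name and version
    "timestamp", "duration_s",     # (B) timestamp and duration of each metric
    "start_lat_lon", "end_lat_lon", "location_accuracy_m",  # (C)
    "device_type", "device_brand_model", "device_os",       # (D)
    "provider_name",               # (E) provider being tested
    "test_server",                 # (F) hostname or IP address
    "serving_cell_rf_metrics",     # (G) signal strength/quality, cell ID, etc.
    "download_mbps",               # (H)
    "upload_mbps",                 # (I)
    "latency_ms",                  # (J)
    "environment",                 # (K) in-vehicle mobile or outdoor pedestrian
    "vehicle_speed_kmh",           # (L) for in-vehicle tests, where available
    "connection_failed",           # (M) whether no connection was established
    "network_technology", "spectrum_bands",                 # (N)
}

def missing_metrics(record: dict) -> set:
    """Return the names of required metric fields absent from a test record."""
    return REQUIRED_TEST_METRICS - record.keys()

# Example: a record lacking latency, environment, etc. would be flagged.
sample = {"app_name_version": "FCC Speed Test 2.1", "download_mbps": 7.3}
print(sorted(missing_metrics(sample)))
```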
* * * * *

(e) * * *

(1) * * *

(i) Name and email address of the challenger, and the mobile phone number of the device on which the speed test was conducted;

(ii) Speed test data. Consumers must use a speed test app that has been designated by the Office of Engineering and Technology, in consultation with the Office of Economics and Analytics and the Wireless Telecommunications Bureau, for use in the challenge process. Consumer challenges must include on-the-ground test data that meet the requirements in paragraphs (c)(1)(i)-(ii) of this section, and must also report the timestamp at which test measurement data were transmitted to the app developer's servers, as well as the source IP address and port of the device, as measured by the server;

(iii) A certification that the challenger is a subscriber or authorized user of the provider being challenged;

(iv) * * *

(v) * * *

(2) Consumer speed tests will be used to create a cognizable challenge based on the following criteria:

(i) The smallest challengeable hexagonal cell is a hexagon at resolution 8 from the H3 standardized geospatial indexing system.

(ii) The download and upload components of a speed test will be evaluated separately.

(iii) A "positive" component is one that records speeds meeting or exceeding the minimum speeds that the mobile service provider reports as available where the test occurred (e.g., a positive download component would show speeds of at least 5 Mbps for 4G LTE, and a positive upload component would show speeds of at least 1 Mbps for 4G LTE). A "negative" component is one that records speeds that fail to meet the minimum speeds that the mobile service provider reports as available where the test occurred;

(iv) A point-hex shall be defined as one of the seven hex-9s from the H3 standardized geospatial indexing system nested within a hex-8;

(v) A point-hex shall be defined as accessible where at least 50% of the area of the point-hex overlaps with the provider's reported coverage data and the point-hex overlaps with any primary, secondary, or local road in the U.S. Census Bureau's TIGER/Line Shapefiles; and

(vi) A hex-8 from the H3 standardized geospatial indexing system shall be classified as challenged if the following three thresholds are met in the hex-8 for either the download or upload component.

(A) Geographic Threshold. When there are at least four accessible point-hexes within the hex-8, each must contain two of the same test components (download or upload), one of which is a negative test. The threshold must be met for one component entirely, meaning that a challenge may contain either two upload components per point-hex, one of which is negative, or two download components per point-hex, one of which is negative. The minimum number of point-hexes in which tests must be recorded must be equal to the number of accessible point-hexes or four, whichever number is lower. If there are no accessible point-hexes within a hex-8, the geographic threshold shall not need to be met.

(B) Temporal Threshold. A hex-8 cell must include a set of two negative test components of the same type with a time-of-day difference of at least four hours from another set of two negative test components of the same type, regardless of the date of the tests; and

(C) Testing Threshold. At least five speed test components of the same type within a hex-8 cell are negative when a challenger has submitted 20 or fewer test components of that type.
(1) When challengers have submitted more than 20 test components of the same type, the following minimum percentage of the total number of test components of that type in the cell must be negative: (i) when challengers have submitted 21-29 test components, at least 24% must be negative; (ii) when challengers have submitted 30-45 test components, at least 22% must be negative; (iii) when challengers have submitted 46-60 test components, at least 20% must be negative; (iv) when challengers have submitted 61-70 test components, at least 18% must be negative; (v) when challengers have submitted 71-99 test components, at least 17% must be negative; and (vi) when challengers have submitted 100 or more test components, at least 16% must be negative;

(2) In a hex-8 with four or more accessible point-hexes, if the number of test components of the same type in one point-hex represents more than 50% of the total test components of that type in the hex-8 but still satisfies the geographic threshold, the components in that point-hex will count only toward 50% of the threshold. In a hex-8 where there are only three accessible point-hexes, if the number of test components of the same type in one point-hex represents more than 75% of the total test components of that type in the hex-8 but still satisfies the geographic threshold, the components in that point-hex will count only toward 75% of the threshold.

(3) Once the percentage of negative components of the same type recorded meets the minimum negative percentage required (or, for a sample of fewer than 21 components, once there are at least five negative components submitted), no additional tests are required so long as both the geographic and temporal thresholds for a hex-8 have been met.

(vii) A larger, "parent" hexagon (at resolution 7 or 6) shall be considered challenged if at least four of the child hexagons within such a "parent" hexagon are considered challenged.

(viii) Mobile service providers shall be notified of all cognizable challenges to their mobile broadband coverage maps at the end of each month. Challengers shall be notified when a mobile provider responds to the challenge. Mobile service providers and challengers both shall be notified monthly of the status of challenged areas, and parties will be able to see a map of the challenged area and a notification about whether or not a challenge has been successfully rebutted, whether a challenge was successful, and whether a challenged area was restored based on insufficient evidence to sustain a challenge.

(3) * * *

(4) To dispute a challenge, a mobile service provider must submit on-the-ground test data that meet the requirements in paragraphs (c)(1)(i)-(ii) of this section (for in-vehicle mobile tests, providers must conduct tests with the antenna located inside the vehicle), or infrastructure data that meet the requirements in paragraph (c)(2)(i), to verify its coverage map(s) in the challenged area. To the extent that a mobile service provider believes it would be helpful to the Commission in resolving a challenge, it may choose to submit other data in addition to the data initially required, including but not limited to either infrastructure or on-the-ground testing data (to the extent such data are not the primary option chosen by the provider) or other types of data, such as data collected from network transmitter monitoring systems or software, or spectrum band-specific coverage maps.
Such other data must be submitted at the same time as the primary on-the-ground testing or infrastructure rebuttal data submitted by the provider. If needed to ensure an adequate review, the Office of Economics and Analytics may also require that the provider submit other data in addition to the data initially submitted, including but not limited to either infrastructure or on-the-ground testing data (to the extent not the option initially chosen by the provider) or data collected from network transmitter monitoring systems or software (to the extent available in the provider's network). If a mobile provider is not able to demonstrate sufficient coverage in a challenged hexagon, the mobile provider must revise its coverage maps to reflect the lack of coverage in such areas.

(i) A "positive" component is one that records speeds meeting or exceeding the minimum speeds that the mobile service provider reports as available where the test occurred (e.g., a positive download component would show speeds of at least 5 Mbps for 4G LTE, and a positive upload component would show speeds of at least 1 Mbps for 4G LTE). A "negative" component is one that records speeds that fail to meet the minimum speeds that the mobile service provider reports as available where the test occurred;

(ii) A point-hex shall be defined as one of the seven nested hexagons at resolution 9 from the H3 standardized geospatial indexing system within a resolution 8 hexagon;

(iii) A point-hex shall be defined as accessible where at least 50% of the area of the point-hex overlaps with the provider's reported coverage data and the point-hex overlaps with any primary, secondary, or local road in the U.S. Census Bureau's TIGER/Line Shapefiles; and

(iv) A mobile service provider that chooses to rebut a challenge to its mobile broadband coverage maps with on-the-ground speed test data must confirm that a challenged area has sufficient coverage using speed tests that were conducted during the 12 months prior to submitting a rebuttal. A provider may confirm coverage in any hex-8 cell within the challenged area. This includes any hex-8 cell that is challenged, as well as any non-challenged hex-8 cell that is a child of a challenged hex-7 or hex-6 cell; confirming non-challenged hex-8 cells can be used to confirm the challenged hex-7 or hex-6 cell. To confirm a hex-8 cell, a provider must submit on-the-ground speed test data that meet the following criteria for both upload and download components:

(A) Geographic Threshold. Two download components, at least one of which is a positive test, and two upload components, at least one of which is a positive test, are recorded within a minimum number of point-hexes within the challenged area, where the minimum number of point-hexes in which tests must be recorded must be equal to the number of accessible point-hexes or four, whichever number is lower. If there are no accessible point-hexes within a hex-8, the geographic threshold shall not need to be met.

(B) Temporal Threshold. A hex-8 cell must include a set of five positive test components of the same type with a time-of-day difference of at least four hours from another set of five positive test components of the same type, regardless of the date of the test; and

(C) Testing Threshold. At least 17 positive test components of the same type within a hex-8 cell in the challenged area when the provider has submitted 20 or fewer test components of that type.
When the provider has submitted more than 20 test components of the same type, a certain minimum percentage of the total number of test components of that type in the cell must be positive: (1) when a provider has submitted 21-34 test components, at least 82% must be positive; (2) when a provider has submitted 35-49 test components, at least 84% must be positive; (3) when a provider has submitted 50-70 test components, at least 86% must be positive; (4) when a provider has submitted 71-99 test components, at least 87% must be positive; and (5) when a provider has submitted 100 or more test components, at least 88% must be positive.

(6) In a hex-8 with four or more accessible point-hexes, if the number of test components of the same type in one point-hex represents more than 50% of the total test components of that type in the hex-8 but still satisfies the geographic threshold, the components in that point-hex will count only toward 50% of the threshold. In a hex-8 where there are only three accessible point-hexes, if the number of test components of the same type in one point-hex represents more than 75% of the total test components of that type in the hex-8 but still satisfies the geographic threshold, the components in that point-hex will count only toward 75% of the threshold.

(D) Using a mobile device running either a Commission-developed app (e.g., the FCC Speed Test app), another speed test app approved by OET to submit challenges, or other software, provided that the software adopts the test methodology that approved apps must perform for consumer challenges and collects the metrics that government and third-party entity challenger speed test data must contain (for in-vehicle mobile tests, providers must conduct tests with the antenna located inside the vehicle);

(1) Providers must submit a complete description of the methodologies used to collect their data; and

(2) Providers must substantiate their data through the certification of a qualified engineer or official.

(E) Using a device that is able to interface with drive test software and/or runs on the Android operating system.

(v) A mobile service provider that chooses to rebut a challenge to its mobile broadband coverage maps with infrastructure data alone may do so only to identify invalid, or non-representative, speed tests within the challenger speed test data. The mobile service provider must submit the same data as required when a mobile provider submits infrastructure information in response to a Commission verification request, including information on the cell sites and antennas used to provide service in the challenged area. A provider may submit only infrastructure data to rebut a challenge if:

(A) Extenuating circumstances at the time and location of a given test (e.g., maintenance or a temporary outage at the cell site) caused service to be abnormal. In such cases, a provider must submit coverage or footprint data for the site or sectors that were affected and information about the outage, such as the bands affected, its duration, and whether the outage was reported to the FCC's Network Outage Reporting System (NORS), along with a certification as to the submission's accuracy;

(B) The mobile device(s) with which the challenger(s) conducted their speed tests are not capable of using or connecting to the radio technology or spectrum band(s) that the provider models for service in the challenged area.
In such cases, a provider must submit band-specific coverage footprints and information about which specific device(s) lack the technology or band;

(C) The challenge speed tests were taken during an uncommon special event (e.g., a professional sporting event) that increased traffic on the network;

(D) The challenge speed tests were taken during a period when cell loading was abnormally higher than the modeled cell loading factor. In such cases, providers must submit cell loading data that both (a) establish that the cell loading for the primary cell(s) at the time of the test was abnormally higher than modeled and (b) include cell loading data for a one-week period before and/or after the provider was notified of the challenge showing, as a baseline, that the median loading for the primary cell(s) was not greater than the modeled value. If a high number of challenges show persistent overloading, staff may initiate a verification inquiry to investigate whether mobile providers have submitted coverage maps based on an accurate assumption of cell loading in a particular area;

(E) The mobile device(s) with which the challenger(s) conducted their speed tests used a data plan that could result in slower service. In such cases, a provider must submit information about which specific device(s) used in the testing were using such a data plan and information showing that the provider's network did, in fact, slow the device at the time of the test; or

(F) The mobile device(s) with which the challenger(s) conducted their speed tests were either roaming or used by the customer of a mobile virtual network operator. In such circumstances, providers must identify, based upon their records, which specific device(s) used in the testing were either roaming at the time or used by the customer of a mobile virtual network operator.

(vi) If the Commission determines, based on the infrastructure data submitted by providers, that challenge speed tests are invalid, such challenge speed tests shall be ruled void, and the Commission shall recalculate the challenged hexagons after removing any invalidated challenger speed tests and consider any challenged hexagons that no longer meet the challenge creation threshold to be restored to their status before the challenge was submitted.

(5) * * *

(6) After a challenged provider submits all responses and Commission staff determines the result of a challenge and any subsequent rebuttal:

(i) In cases where a mobile service provider successfully rebuts a challenge, the area confirmed to have coverage shall be ineligible for challenge until the next biannual broadband availability data filing six months after the later of either the end of the 60-day response period or the resolution of the challenge.

(ii) A challenged area may be restored to an unchallenged state if, as a result of data submitted by the provider, there is no longer sufficient evidence to sustain the challenge to that area, but the provider's data fall short of confirming the area. A restored hexagon would be subject to challenge at any time in the future as challengers submit new speed test data.

(iii) In cases where a mobile service provider concedes or loses a challenge, the provider must file, within 30 days, geospatial data depicting the challenged area that has been shown to lack sufficient service.
Such data will constitute a correction layer to the provider's original propagation model-based coverage map, and Commission staff will use this layer to update the broadband coverage map. In addition, to the extent that a provider does not later improve coverage for the relevant technology in an area where it conceded or lost a challenge, it must include this correction layer in its subsequent filings to indicate the areas shown to lack service.

(7) Commission staff are permitted to consider other relevant data to support a mobile service provider's rebuttal of challenges, including on-the-ground data or infrastructure data (to the extent such data are not the primary rebuttal option submitted by the mobile service provider). The Office of Economics and Analytics will review such data when voluntarily submitted by providers in response to challenges, and, if it concludes that any of the data sources are sufficiently reliable, it will specify appropriate standards and specifications for each type of data and will issue a public notice adding the data source to the alternatives available to providers to rebut a consumer challenge.

(f) Mobile service challenge process for State, local, and Tribal governmental entities, and other entities or individuals. State, local, and Tribal governmental entities and other entities or individuals may submit data to challenge the accuracy of mobile broadband coverage maps. They may challenge mobile coverage data based on a lack of service or poor service quality, such as slow delivered user speeds.

(1) * * *

(i) Government and other entity challengers may use their own software and hardware to collect data for the challenge process. When they submit their data, the data must meet the requirements in paragraphs (c)(1)(i)-(ii) of this section, except that government and other entity challengers may submit the International Mobile Equipment Identity (IMEI) of the device used to conduct a speed test for use in the challenge process instead of the timestamp at which test measurement data were transmitted to the app developer's servers and the source IP address and port of the device, as measured by the server;

(ii) * * *

(iii) * * *

(iv) If the test was taken in an in-vehicle mobile environment, whether the test was conducted with the antenna outside of the vehicle;

(2) Challengers must conduct speed tests using a device advertised by the challenged service provider as compatible with its network and must take all speed tests outdoors. Challengers must also use a device that is able to interface with drive test software and/or runs on the Android operating system.

(3) For a challenge to be considered a cognizable challenge, thus requiring a mobile service provider response, the challenge must meet the same thresholds specified in paragraph (e)(2) of this section.

(4) * * *

(5) To dispute a challenge, a mobile service provider must submit on-the-ground test data or infrastructure data to verify its coverage map(s) in the challenged area based on the methodology set forth in paragraph (e)(4) of this section.
To the extent that a service provider believes it would be helpful to the Commission in resolving a challenge, it may choose to submit other data in addition to the data initially required, including but not limited to either infrastructure or on-the-ground testing data (to the extent such data are not the primary option chosen by the provider) or other types of data, such as data collected from network transmitter monitoring systems or software or spectrum band-specific coverage maps. Such other data must be submitted at the same time as the primary on-the-ground testing or infrastructure rebuttal data submitted by the provider. If needed to ensure an adequate review, the Office of Economics and Analytics may also require that the provider submit other data in addition to the data initially submitted, including but not limited to either infrastructure or on-the-ground testing data (to the extent not the option initially chosen by the provider) or data collected from network transmitter monitoring systems or software (to the extent available in the provider's network).

(6) * * *

(7) * * *

3. Amend section 1.7008 by revising paragraph (d)(2) to read as follows:

§ 1.7008 Creation of broadband internet access service coverage maps.

* * * * *

(d)(1) * * *

(2) To the extent government entities or third parties choose to file verified data, they must follow the same filing process as providers submitting their broadband internet access service data in the data portal. Government entities and third parties that file on-the-ground test data must submit such data using the same metrics and testing parameters the Commission requires of mobile service providers when responding to a Commission request to verify mobile providers' broadband network coverage with on-the-ground data (see 47 CFR § 1.7006(c)(1)).

(3) * * *

APPENDIX C

Supplemental Final Regulatory Flexibility Analysis

1. As required by the Regulatory Flexibility Act of 1980, as amended (RFA) (see 5 U.S.C. §§ 601-612; the RFA was amended by the Small Business Regulatory Enforcement Fairness Act of 1996 (SBREFA), Pub. L. No. 104-121, Title II, 110 Stat. 857 (1996)), a Supplemental Initial Regulatory Flexibility Analysis (Supplemental IRFA) was incorporated in the BDC Mobile Technical Requirements Public Notice released in July 2021 in this proceeding. See Comment Sought on Technical Requirements for the Mobile Challenge, Verification, and Crowdsource Processes Required under the Broadband DATA Act, Public Notice, WC Docket No. 19-195, DA 21-853 (WTB/OEA/OET July 16, 2021) (BDC Mobile Technical Requirements Public Notice). The Commission prepared Initial and Final Regulatory Flexibility Analyses in connection with the Digital Opportunity Data Collection Order and Second Further Notice, the Second Order and Third Further Notice, and the Third Order (collectively, Broadband Data Act Proceedings). Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195 and 11-10, Report and Order and Second Further Notice of Proposed Rulemaking, 34 FCC Rcd 7505, 7566-86, 7587-608, Appendices B and C (2019) (Digital Opportunity Data Collection Order and Second Further Notice); Second Order and Third Further Notice, 35 FCC Rcd at 7542-60, 7561-81, Appendices C and D; Third Order, 36 FCC Rcd at 1199-222, Appx. B. Written public comments were requested on the IRFAs prepared for the Further Notices of Proposed Rulemaking that are part of the Broadband Data Act Proceedings.
Digital Opportunity Data Collection Order and Second Further Notice, 34 FCC Rcd at 7587, Appx. C, Initial Regulatory Flexibility Analysis (2019); Second Order and Third Further Notice, 35 FCC Rcd at 7561, Appx. D, Initial Regulatory Flexibility Analysis. Additionally, the Commission sought written public comment on the proposals, including comments on the Supplemental IRFA, in the BDC Mobile Technical Requirements Public Notice. No comments were filed addressing the Supplemental IRFA or the IRFAs incorporated in the Broadband Data Act Proceedings. This Supplemental Final Regulatory Flexibility Analysis (Supplemental FRFA) supplements the Final Regulatory Flexibility Analyses (FRFAs) in the Broadband Data Act Proceedings (Digital Opportunity Data Collection Order and Second Further Notice, 34 FCC Rcd at 7566-86, Appx. B, Final Regulatory Flexibility Analysis (2019); Second Order and Third Further Notice, 35 FCC Rcd at 7542-60, Appx. C, Final Regulatory Flexibility Analysis; Third Order, 36 FCC Rcd at 1199-222, Appx. B, Final Regulatory Flexibility Analysis) to reflect actions taken in the Order and conforms to the RFA. See 5 U.S.C. § 604.

A. Need for, and Objectives of, the Order

2. The Broadband DATA Act requires the Commission to collect granular data from providers on the availability and quality of broadband internet access service and to verify the accuracy and reliability of the broadband coverage data submitted by providers. Broadband Deployment Accuracy and Technological Availability Act, Pub. L. No. 116-130, 134 Stat. 228 (2020) (codified at 47 U.S.C. §§ 641-646) (Broadband DATA Act). In its Second Order and Third Further Notice and Third Order, the Commission adopted some of the Broadband DATA Act's requirements, developed the framework for the Broadband Data Collection (BDC) (formerly called the Digital Opportunity Data Collection, or DODC), established processes for verifying providers' broadband data submissions, and established a data challenge process. See generally Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Second Report and Order and Third Further Notice of Proposed Rulemaking, 35 FCC Rcd 7460 (2020) (Second Order and Third Further Notice); Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Third Report and Order, 36 FCC Rcd 1126 (2021) (Third Order); see also Third Order, 36 FCC Rcd at 1154-1174, paras. 70-124. The Commission delegated authority to the Wireless Telecommunications Bureau (WTB), the Office of Economics and Analytics (OEA), and the Office of Engineering and Technology (OET) (collectively, "the Bureau and Offices") to design and construct the new mapping system, which includes setting forth the specifications and requirements for the challenge, verification, and crowdsourcing processes. See id. at 1146, 1166-68, paras. 47-48, 103-06. Following the December 27, 2020, Congressional appropriation of funding for the implementation of the Broadband DATA Act, the Commission began to implement the challenge, verification, and crowdsourcing processes involving broadband data coverage submissions. Consolidated Appropriations Act, 2021, Pub. L. No. 116-260, H.R. 133, Div. E, Tit. V, Div. N, Tit. V, § 906(1) (Dec. 27, 2020) (Consolidated Appropriations Act of 2021).
3. In the Order adopted today, pursuant to their delegated authority, the Bureau and Offices take the next steps toward obtaining better coverage data and implementing the requirements of the Broadband DATA Act. More specifically, the Bureau and Offices take action to carry out their responsibility to develop technical requirements for verifying service providers' coverage data, a challenge process that will enable consumers and other third parties to dispute service providers' coverage data, and a process for consumers and other entities to submit crowdsourced data on mobile broadband availability. These measures will help the Commission, Congress, other federal and state policy makers, Tribal entities, consumers, and other third parties better evaluate the status of broadband deployment throughout the United States.

4. The Order discusses the technical requirements to implement the mobile challenge, verification, and crowdsourcing processes required by the Broadband DATA Act, such as parameters and metrics for on-the-ground test data and a methodology for determining the threshold for what constitutes a cognizable challenge requiring a provider response. It also provides guidance on what types of data will likely be more probative in different circumstances. Additionally, the Order discusses detailed processes and metrics for providers to follow when responding to a Commission verification request, for government entities and other third parties to follow when submitting verified broadband coverage data, and for challengers to follow when contesting providers' broadband coverage availability. We believe this level of detail is necessary to formulate the processes and procedures that will enable better evaluation of the status of broadband deployment throughout the United States and to meet the Commission's obligations under the Broadband DATA Act.

B. Summary of Significant Issues Raised by Public Comments in Response to the IRFA

5. There were no comments filed that specifically addressed the proposed rules and policies presented in the Supplemental IRFA.

C. Response to Comments by the Chief Counsel for Advocacy of the Small Business Administration

6. Pursuant to the Small Business Jobs Act of 2010, which amended the RFA, the Commission is required to respond to any comments filed by the Chief Counsel for Advocacy of the Small Business Administration (SBA) and to provide a detailed statement of any change made to the proposed rules as a result of those comments. 5 U.S.C. § 604(a)(3).

7. The Chief Counsel did not file comments in response to the proposed rules in this proceeding.

D. Description and Estimate of the Number of Small Entities to Which the Rules Will Apply

8. The RFA directs agencies to provide a description of and, where feasible, an estimate of the number of small entities that may be affected by the rules adopted herein. 5 U.S.C. § 604(a)(4). The RFA generally defines the term "small entity" as having the same meaning as the terms "small business," "small organization," and "small governmental jurisdiction." 5 U.S.C. § 601(6). In addition, the term "small business" has the same meaning as the term "small-business concern" under the Small Business Act. 5 U.S.C. § 601(3) (incorporating by reference the definition of "small-business concern" in the Small Business Act, 15 U.S.C. § 632). Pursuant to 5 U.S.C.
§ 601(3), the statutory definition of a small business applies "unless an agency, after consultation with the Office of Advocacy of the Small Business Administration and after opportunity for public comment, establishes one or more definitions of such term which are appropriate to the activities of the agency and publishes such definition(s) in the Federal Register." A "small-business concern" is one which: (1) is independently owned and operated; (2) is not dominant in its field of operation; and (3) satisfies any additional criteria established by the SBA. 15 U.S.C. § 632.

9. As noted above, Regulatory Flexibility Analyses were incorporated into the Broadband Data Act Proceedings and the BDC Mobile Technical Requirements Public Notice. Digital Opportunity Data Collection Order and Second Further Notice, 34 FCC Rcd at 7567-84, 7587-7605, Appx. B at paras. 8-53 and Appx. C at paras. 4-49; Second Order and Third Further Notice, 35 FCC Rcd at 7542-59, Appx. C at paras. 6-51; Third Order, 36 FCC Rcd at 1200-19, Appx. B at paras. 8-54; BDC Mobile Technical Requirements Public Notice at *22-27, paras. 60-75. More specifically, the FRFAs incorporated in the Broadband Data Act Proceedings described in detail the small entities that might be significantly affected in those proceedings. Accordingly, in this Supplemental FRFA, we hereby incorporate by reference from the FRFAs in the Broadband Data Act Proceedings the descriptions and estimates of the number of small entities that might be significantly affected, as well as the associated analyses, set forth therein. Digital Opportunity Data Collection Order and Second Further Notice, 34 FCC Rcd at 7567-84, 7587-7605, Appx. B at paras. 8-53 and Appx. C at paras. 4-49; Second Order and Third Further Notice, 35 FCC Rcd at 7542-59, Appx. C at paras. 6-51; Third Order, 36 FCC Rcd at 1200-19, Appx. B at paras. 8-54; BDC Mobile Technical Requirements Public Notice at *22-27, paras. 60-75.

E. Description of Projected Reporting, Recordkeeping, and Other Compliance Requirements for Small Entities

10. We expect that the granular data collection for the challenge and verification processes in the Order will impose some new reporting, recordkeeping, or other compliance requirements on some small entities. Specifically, as part of the challenge process, challenged mobile service providers are notified via the online portal of challenged hexagons at the end of each calendar month. Mobile providers of broadband internet access service must submit a rebuttal (consisting of either on-the-ground test data or infrastructure data) to the challenge, or concede the challenge, within 60 days of being notified of the challenge. A challenge respondent may submit supplemental data in support of its rebuttal, either voluntarily or, in some cases, in response to a request from OEA. When rebutting a challenge with on-the-ground data, the provider must meet thresholds (geographic, temporal, and testing) analogous to those required of challengers, adjusted to reflect the burden on providers to demonstrate that sufficient coverage exists at least 90% of the time in the challenged hexagons. When a provider submits only infrastructure data to rebut a challenge, the provider must submit the same data as required when a mobile provider submits infrastructure information in response to a Commission verification request.
11. As part of the verification process, mobile providers of broadband internet access service must submit coverage data, in the form of on-the-ground test data or infrastructure information, on a case-by-case basis in response to a Commission request to verify mobile broadband providers' biannual BDC data submissions in a targeted area. For on-the-ground test data, we adopted an approach for providers replying to verification requests that requires mobile providers to submit data using the H3 geospatial indexing system at resolution 8. The tests will be evaluated to confirm, using a one-sided 95% statistical confidence interval, that the cell coverage is 90% or higher. Providers must also meet a temporal threshold in verification inquiry submissions that may be relaxed from that required in the challenge process. Additionally, consistent with our proposal in the BDC Mobile Technical Requirements Public Notice, state, local, and Tribal government entities, as well as other third parties who voluntarily submit on-the-ground test data as verified data, must use the same metrics and testing parameters that mobile providers must use when submitting on-the-ground test data, to ensure the consistency and accuracy of the broadband availability maps.

12. The Order allows providers to submit infrastructure information in response to a verification request, as proposed in the BDC Mobile Technical Requirements Public Notice. If a provider chooses to submit infrastructure information in response to a verification request, it must provide such data for all cell sites and antennas that serve or affect coverage in the targeted area. To the extent that the infrastructure information submitted by a provider in response to a verification request standing alone is not sufficient to demonstrate adequate coverage, the Commission may request that additional information be submitted by the provider to complete the verification process. The Order expands the categories of infrastructure information that providers must submit when collecting and reporting mobile infrastructure data by adopting the eight additional data categories proposed in the BDC Mobile Technical Requirements Public Notice, which will enable a more precise evaluation of the challenged area of a provider's coverage map. Further, recognizing the need to allow flexibility for responding providers, the Order also allows providers to submit other types of data to supplement on-the-ground or infrastructure information, such as transmitter monitoring information, data from their own field tests conducted in the ordinary course of business, and data collected using their own software tools.

13. With regard to the reporting or submission of crowdsourced data, the Bureau and Offices were directed by the Commission to establish and use an online portal for crowdsourced data filings and to use the same portal for challenge filings. As proposed in the BDC Mobile Technical Requirements Public Notice, to the extent state, local, and Tribal government entities, other entities, or consumers choose to submit on-the-ground crowdsourced mobile speed test data in the online portal, the data submission must use a measurement methodology similar to that of the FCC's speed test app and be submitted in a format similar to that which challengers and providers are required to use when submitting speed tests.
Likewise, if state, local, and Tribal government entities, other entities, or consumers choose to submit preliminary on-the-ground crowdsourced mobile speed test information prior to the availability of the online portal, the data collection requirements require use of a measurement methodology similar to that of the FCC's speed test app and submission in a format similar to the one used for speed tests.

14. The requirements we adopt in the Order continue the Commission's actions to implement the Broadband DATA Act and develop more accurate, more useful, and more granular broadband availability data to advance our statutory obligations and continue our efforts to close the digital divide. We conclude that it is necessary to adopt these rules to produce broadband deployment maps that will allow the Commission to precisely target scarce universal service dollars to where broadband service is lacking. We are cognizant of the need to ensure that the benefits resulting from use of the data outweigh the reporting burdens imposed on small entities. The Commission believes, however, that any additional burdens imposed by our revised reporting approach for providers and state, local, and Tribal government entities are outweighed by the significant benefit to be gained from producing more accurate broadband deployment data and maps. We are likewise cognizant that small entities will incur costs and may have to hire attorneys, engineers, consultants, or other professionals to comply with the Order. Moreover, although the Commission cannot quantify the cost of compliance with the requirements in the Order, we believe that the reporting and other requirements we have adopted are necessary to comply with the Broadband DATA Act and to ensure that the Commission obtains complete and accurate broadband coverage maps.

F. Steps Taken to Minimize the Significant Economic Impact on Small Entities, and Significant Alternatives Considered

15. The RFA requires an agency to describe any significant alternatives, specifically for small businesses, that it has considered in reaching its approach, which may include the following four alternatives (among others): "(1) the establishment of differing compliance or reporting requirements or timetables that take into account the resources available to small entities; (2) the clarification, consolidation, or simplification of compliance and reporting requirements under the rule for such small entities; (3) the use of performance rather than design standards; and (4) an exemption from coverage of the rule, or any part thereof, for such small entities." 5 U.S.C. § 604(a)(6).

16. The requirements adopted in the Order balance the need for the Commission to generate more precise and granular mobile broadband availability maps with the associated costs and burdens on mobile broadband providers and other entities participating in the BDC process. The Commission has considered the comments in the record and is mindful that some small entities will have to expend resources and will incur costs to comply with the requirements in the Order. In reaching the requirements adopted in the Order, the Commission considered various approaches and alternatives that it did not adopt, which we discuss below; declining these alternatives will prevent small entities from incurring additional burdens and will minimize the economic impact of compliance.
17. The mobile challenge process requirements adopted by the Commission will facilitate the collection of sufficient measurement information to ensure the mobile challenge process is statistically valid while, at the same time, meeting the Commission's statutory obligation to keep the challenge process "user-friendly." The adopted requirements strike a balance between ensuring that small entities, including but not limited to state, local, and Tribal governments, as well as consumers and other third-party challengers, can use the challenge process, and ensuring that providers, including small providers, are not unreasonably burdened by responding to every speed test that shows a lack of coverage. The mobile challenge process we have adopted includes a process to determine whether there is a cognizable challenge to which a provider is required to respond, rather than requiring a provider to respond to any and all submitted challenges. This will minimize the economic impact for small providers to the extent they are subject to challenges. For challengers, the mobile challenge process allows drive test data meeting specific testing parameters to be submitted via a mobile app (the data must be collected using mobile devices running either a Commission-developed app, i.e., the FCC Speed Test app, or another speed test app approved by OET) and allows governmental entities and other third-party challengers to use their own software and hardware, which contributes to the "user-friendly" nature of the challenge process. Additionally, the speed test data from state, local, and Tribal governments, consumers, and other third-party challengers will be aggregated as part of the mobile challenge process to ensure that one challenger is not required to submit all of the speed test data needed to create a challenge, thereby lessening the load, as well as the costs and resources required, for small entities and others who participate in the mobile challenge process to create a cognizable challenge.

18. The notification process adopted in the Order to inform service providers of cognizable challenges filed against them and to inform challengers and service providers of the status and results of challenges will operate on a monthly basis via the online portal. This approach should be more manageable, more administratively efficient, and thereby less costly for small entities and other providers by providing them with a standard set of deadlines rather than a rolling set of multiple deadlines, while also ensuring that challengers have the opportunity to submit additional evidence in support of their challenge submissions if desired. Providers and challengers will have access to all relevant information through the online portal, including a map of the challenged area(s), notification of whether or not a challenge has been successfully rebutted, whether a challenge was successful, and whether a challenged area was restored based on insufficient evidence to sustain a challenge.

19. The mobile service challenge process metrics for mobile providers to follow when responding to a Commission verification request seek to balance the need for the Commission to establish valuable methods for verifying coverage data with the need to reduce the costs and burdens associated with requiring mobile providers to submit on-the-ground test data and infrastructure information.
For example, in order to ensure the challenge process is user-friendly for challengers and workable for mobile providers to respond to and rebut challenges, challenged mobile service providers who choose to submit on-the-ground speed test data are required to meet thresholds analogous to those required of challengers to demonstrate that the challenged areas have sufficient coverage. Providers are required to submit on-the-ground data to demonstrate that sufficient coverage exists at least 90% of the time and to meet the same three threshold tests as challengers. We considered but declined a proposal to define a challenged area based on the test data submitted by challengers, based on our belief that the adopted approach is both user-friendly and supported by sufficient data, targets a more precise geographic area where broadband coverage is disputed, and limits the burden on providers in responding to challenges.

20. We also declined to adopt several recommendations from commenters that would have expanded the scope of requirements for the challenge process and increased costs for small and other providers. More specifically, we declined to include voice maps in the challenge process, noting that the Broadband DATA Act makes no mention of allowing challenges to voice maps and that the Commission decided that the mobile challenge process applies only to broadband (i.e., not voice) coverage maps. Further, we declined to require providers to provide additional performance and affordability information, such as throughput speeds experienced by consumers, signal strength, and pricing information, with their maps. In the Third Order, the Commission specifically declined to adopt pricing and throughput data on fixed services, and we do not believe the Bureau and Offices have discretion to add such requirements in the Order.

21. For small entities and other providers who use on-the-ground test data to rebut challenges, we provide greater flexibility in the collection of on-the-ground test data and reduce burdens on providers by allowing them to use the software tools they may already be using. To the extent that a provider chooses to use software other than the FCC Speed Test app or another speed test app approved by OET for use in the challenge process, we will consider such software approved for use in rebutting challenges provided that the software collects the metrics that approved apps must collect for consumer challenges and that governmental and third-party challengers' speed test data must contain. This approach will help minimize costs for small and other providers and increase efficiency, while continuing to ensure that the Commission receives high-quality data that will allow an equivalent comparison between challenge data submitted by consumers and other entities and data created by providers using their own software. We note, however, that we retain the discretion to require prior approval of providers' software tools or to make changes to the required metrics via notice and comment at a later time. Similarly, we provide small and other providers flexibility to rebut challenges by allowing the use of infrastructure data alone to adjudicate challenges in a limited set of circumstances.

22. In our adoption of parameters for the collection of verification information, we recognize that it may be more costly for small providers to obtain on-the-ground test data. We take steps to address this issue by adopting a targeted and more inclusive approach.
22. In adopting parameters for the collection of verification information, we recognize that it may be more costly for small providers to obtain on-the-ground test data. We take steps to address this issue by adopting a targeted and more inclusive approach. Specifically, we will identify the portion of a provider’s coverage map (the targeted area) that may require verification data and will base our determination on all available evidence. Such evidence includes speed test data, infrastructure data, crowdsourced and other third-party data, as well as staff evaluation and knowledge of submitted coverage data (including maps, link budget parameters, and other credible information). Thus, rather than imposing a one-size-fits-all requirement, this approach will allow Commission staff to evaluate whether a verification request is warranted and will allow providers to respond with the type of data that most cost-effectively supports their coverage calculations. To further minimize the costs and burdens placed on small and other service providers, while ensuring that Commission staff have access to sufficient data to demonstrate coverage, we will sample the targeted area and require service providers to submit verification data covering a statistically valid sample of the areas for which sufficient coverage must be demonstrated to satisfy the verification request. By using a sampling plan to demonstrate broadband availability, we decrease the data submission requirements, allowing small and other providers to avoid the costs that would have been associated with submitting considerably more data. Additionally, we declined a request to require providers to submit actual on-the-ground test data on a continuous or quarterly basis, as such a requirement would be unnecessarily burdensome.
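The sketch below illustrates the general idea behind sampling a targeted area, assuming the area has been divided into uniform grid cells. The sample-size calculation is a standard finite-population formula at a 95% confidence level and 5% margin of error, used here only as an assumption for illustration; the Order’s actual sampling methodology is defined in the Technical Appendix.

```python
import math
import random

def sample_size(population: int, z: float = 1.96,
                margin: float = 0.05, p: float = 0.5) -> int:
    """Standard sample-size calculation with a finite-population
    correction; shown for illustration only."""
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def select_cells(targeted_cells: list[str], seed: int = 0) -> list[str]:
    """Randomly select a statistically valid sample of grid cells in the
    targeted area for which verification data would be submitted."""
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible
    n = sample_size(len(targeted_cells))
    return rng.sample(targeted_cells, min(n, len(targeted_cells)))
```

Under this approach, a provider collects on-the-ground data only for the sampled cells rather than for the entire targeted area, which is what drives the cost savings described above.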
23. To ensure consistency, reliability, comparability, and verifiability of the data the Commission receives, in the Order we require state, local, and Tribal government entities and other third parties, including small entities that fall within these categories, to comply with the challenge process requirements applicable to providers. Consistent with our approach for providers, which does not carve out different or lower standards for smaller providers, requiring state, local, and Tribal government entities and third parties to submit on-the-ground test data under thresholds analogous to those we adopted for mobile providers will ensure that the Commission implements a standardized process resulting in broadband availability maps that are as accurate and precise as possible. We are cognizant, however, that on-the-ground test data can be more costly to obtain and can impose burdens on small entities. Therefore, our consideration of appropriate verification data sources took into account both the usefulness and the costs of on-the-ground test data, and the fact that this type of data may not be necessary in every situation, particularly where infrastructure information is available, which, based on our analysis, will likely be of comparable probative value to on-the-ground test data in certain situations.

24. Finally, in the Second Order, the Commission adopted a crowdsourcing process to allow individuals and entities to submit information about the deployment and availability of broadband internet access service. Second Order and Third Further Notice, 35 FCC Rcd at 7487, para. 64. Consistent with the data collection and submission requirements adopted in the Order for the mobile challenge and verification processes, governmental entities and other third parties, including small entities that fall within these categories, can submit on-the-ground crowdsourced mobile speed test data using the online portal that providers will use for the challenge and verification processes. As mentioned above in Section E, crowdsourced data will be collected using a similar measurement methodology and submitted in a format similar to that used by challengers and providers to submit speed test data. This consistent approach to crowdsourced data will minimize the cost and administrative burdens for small entities and further ensure the uniformity, dependability, comparability, and verifiability of the data the Commission receives in the mobile challenge, verification, and crowdsourcing processes.

Report to Congress

25. The Commission will include a copy of the Order, including this Supplemental FRFA, in a report to be sent to Congress pursuant to the Congressional Review Act. See 5 U.S.C. § 801(a)(1)(A). In addition, the Commission will send a copy of the Order, including this Supplemental FRFA, to the Chief Counsel for Advocacy of the SBA. A copy of the Order and Supplemental FRFA (or summaries thereof) will also be published in the Federal Register. See 5 U.S.C. § 604(b).