Federal Communications Commission
DA 21-853
Released: July 16, 2021
COMMENT SOUGHT ON TECHNICAL REQUIREMENTS FOR THE MOBILE CHALLENGE, VERIFICATION, AND CROWDSOURCE PROCESSES REQUIRED UNDER THE BROADBAND DATA ACT
WC Docket No. 19-195
Comments Due: [30 days after publication in the Federal Register]
Reply Comments Due: [45 days after publication in the Federal Register]
TABLE OF CONTENTS
Heading Paragraph #
I. INTRODUCTION 1
II. BACKGROUND 3
III. DISCUSSION 6
 A. Mobile Service Challenge Process 6
  1. Cognizable Challenges 8
  2. Challenge Responses 15
   a. Rebutting Challenges with On-the-Ground Data 17
   b. Rebutting Challenges with Infrastructure Data 20
   c. Other Data 24
 B. Collecting Verification Information from Mobile Providers 26
  1. Area Subject to Verification 27
  2. On-the-Ground Test Data 34
  3. Infrastructure Information 36
  4. Additional Data 40
 C. Collecting Verified Broadband Data from Governmental Entities and Third Parties 47
 D. Probative Value 47
 E. Crowdsourced Data 51
IV. PROCEDURAL MATTERS 60
 A. Supplemental Initial Regulatory Flexibility Analysis 60
 B. Deadlines and Filing Procedures 77
 C. Paperwork Reduction Act 80
APPENDIX A—Technical Appendix
APPENDIX B—Proposed Rules
I. INTRODUCTION
1. With this Public Notice, the Wireless Telecommunications Bureau (WTB), the Office of Economics and Analytics (OEA), and the Office of Engineering and Technology (OET) (collectively, the Bureau and Offices) take the next step in implementing the requirements of the Broadband DATA Act and improving the Commission’s data on broadband availability as part of the Broadband Data Collection (BDC). Broadband Deployment Accuracy and Technological Availability Act, Pub. L. No. 116-130, 134 Stat. 228 (2020) (codified at 47 U.S.C. §§ 641-646) (Broadband DATA Act). The Broadband Data Collection was formerly known as the Digital Opportunity Data Collection, or DODC. To implement the Broadband DATA Act’s requirements and obtain better mobile broadband availability data, the Commission delegated to the Bureau and Offices the obligation to develop: (1) technical requirements for a challenge process that will enable consumers and other third parties to dispute service providers’ coverage data; (2) a process to verify service providers’ coverage data; and (3) a process to accept crowdsourced information from third parties. These measures will enable the Commission, Congress, other federal and state policy makers, Tribal entities, consumers, and other third parties to verify and supplement the data collected by the Commission on the status of broadband availability throughout the United States.
2. This Public Notice seeks comment on proposed technical requirements to implement the mobile challenge, verification, and crowdsourcing processes required by the Broadband DATA Act. These requirements include the metrics to be collected for on-the-ground test data and a methodology for determining the threshold for what constitutes a cognizable challenge requiring a provider response. The Public Notice also provides tentative views and seeks comment on the types of data that likely will be probative in different circumstances for validating broadband availability data submitted by mobile service providers. This Public Notice and the detailed Technical Appendix, See Appendix A-Technical Appendix § 3-4.
attached hereto as Appendix A, propose detailed processes and metrics for challengers to use to contest providers’ broadband coverage availability, for providers to follow when responding to a Commission verification request, and for state, local, and Tribal governmental entities For purposes of this Public Notice, we generally refer to state, local, and Tribal entities as “government entities” or “governmental entities.” and other third parties to follow when submitting verified broadband coverage data. The Public Notice seeks comment on the technical requirements for these complex issues to assure that the broadband availability data collected in the challenge and other data verification and crowdsource processes serve the important broadband data verification purposes envisioned in the Broadband DATA Act.
II. BACKGROUND
3. The Broadband DATA Act requires the Commission to collect granular data from broadband internet access service providers on the availability and quality of broadband service and also to establish a challenge process, verify the accuracy and reliability of the broadband coverage data that providers are required to submit in their BDC filings, and improve data accuracy through a crowdsourcing process. See, e.g., 47 U.S.C. §§ 642(a)(1)(B)(i), (iii), (iv), (b)(4), (b)(5), 644(b). The Broadband DATA Act also requires the Commission to develop “a process through which it can collect verified data for use in the coverage maps from: (1) [s]tate, local, and Tribal governmental entities that are primarily responsible for mapping or tracking broadband internet access service coverage for a [s]tate, unit of local government, or Indian Tribe, as applicable; (2) third parties . . . ; and (3) other Federal agencies.” 47 U.S.C. § 642(a)(2). In its Second Order and Third Further Notice, the Commission adopted some of the Broadband DATA Act’s requirements for collecting and reporting broadband data from providers, developed the framework for the BDC, established a process for verifying the broadband data it receives from providers in their BDC filings, and adopted a basic framework for collecting crowdsourced information. See generally Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Second Report and Order and Third Further Notice of Proposed Rulemaking, 35 FCC Rcd 7460 (2020) (Second Order and Third Further Notice). While the challenge process, crowdsource data, and other FCC efforts will all serve to validate the data submitted by providers, for purposes of this Public Notice, “verification” or “verification process” refers to the internal process the Commission sought comment on in section IV.D. of the Third Further Notice and adopted in section III.E. of the Third Order. Id. at 7503-05, paras. 104-09; Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Third Report and Order, 36 FCC Rcd 1126, 1146-51, paras. 47-60 (2021) (Third Order); see also 47 U.S.C. § 642(b)(4) (instructing the Commission to “verify the accuracy and reliability of the information in accordance with measures established by the Commission.”). In the Third Order, the Commission adopted additional requirements for collecting and verifying provider-submitted data and established the challenge process. See generally Third Order, 36 FCC Rcd at 1126.
The Commission directed the Bureau and Offices to design and develop the new BDC platform for mapping broadband availability, and to set forth the specifications and requirements for the mobile challenge, verification, and crowdsourcing processes. See, e.g., id. at 1146, 1150-51, 1166-68, 1170, 1173-74, paras. 47-48, 59, 103-06, 110-11, 120-21; Second Order and Third Further Notice, 35 FCC Rcd at 7488, 7491-92, 7494, paras. 66-67, 74, 76, 82; 47 CFR § 1.7006(e)(2), (f)(3). The Commission was able to begin development of the BDC systems and the proposed technical requirements to implement these processes after funding to implement the Act was appropriated in December 2020. Consolidated Appropriations Act, 2021, Pub. L. No. 116-260, H.R. 133, Div. E, Tit. V, Div. N, Tit. V, § 906(1) (Dec. 27, 2020) (Consolidated Appropriations Act of 2021). FCC, Broadband Data Collection Resources, https://www.fcc.gov/BroadbandData/resources (last visited June 22, 2021) (announcing the Data Architect and Design Services contract was awarded to Emprata, LLC on Feb. 23, 2021).
4. In the Third Order, the Commission determined that it should aggregate speed test results received from multiple consumer challenges in the same general area in order to resolve challenges in an efficient manner, mitigate the time and expense involved, and ensure that the mobile coverage maps are reliable and useful. Third Order, 36 FCC Rcd at 1167-68, para. 105. When these aggregated results reach an appropriate threshold, they will constitute a cognizable challenge requiring a provider response. Id. at 1168, para. 105. While the Commission acknowledged that consumers are likely to submit challenges in distinct, localized areas instead of expending the time and resources to test in a broader area or for extended periods, it also recognized that providers should not be subject to the undue cost of responding to a large number of challenges in very small areas. Id. at 1167-68, para. 105. In response to the Second Order and Third Further Notice, providers argued that a requirement to respond to every consumer challenge would be a substantial burden. See, e.g., CTIA Comments and Petition for Reconsideration at 19-20. The Commission directed OEA, in consultation with WTB, to determine the threshold number of mobile consumer challenges within a specified area that will constitute a cognizable challenge triggering a provider’s obligation to respond. Third Order, 36 FCC Rcd at 1167, para. 105; 47 CFR § 1.7006(e)(2). In connection with that determination, the Commission also directed OEA, in consultation with WTB, to establish: (1) the methodology for determining this threshold; Third Order, 36 FCC Rcd at 1168, para. 105; see 47 CFR § 1.7006(e)(2). The Commission stated that, “[i]n developing this methodology, OEA should consider, inter alia, the number, location, and timing of the tests, variability in test results, and whether the tests were conducted in urban or rural areas.” Third Order, 36 FCC Rcd at 1167-68, para. 105. and (2) the methodology for determining the boundaries of a geographic area where the threshold for a cognizable challenge has been met. Third Order, 36 FCC Rcd at 1168, para. 106; see 47 CFR § 1.7006(e)(2).
5. Consistent with the approach it adopted for consumer challenges, the Commission stated that it would also aggregate speed test evidence received from multiple government and third-party challengers in the same general area. Third Order, 36 FCC Rcd at 1173, para. 120.
The Commission directed OEA to determine the threshold number of mobile governmental and third-party challenges within the same general area that will constitute a cognizable challenge that requires a provider response. Id. Similar to the consumer challenges, the Commission directed OEA, in consultation with WTB, to establish the methodology for this threshold and the methodology for determining the boundaries of an area where the threshold has been met. Id.
III. DISCUSSION
A. Mobile Service Challenge Process
6. The Broadband DATA Act requires the Commission to “establish a user-friendly challenge process through which consumers, [s]tate, local, and Tribal governmental entities, and other entities or individuals may submit coverage data to the Commission to challenge the accuracy of– (i) the coverage maps; (ii) any information submitted by a provider regarding the availability of broadband internet access service; or (iii) the information included in the Fabric.” 47 U.S.C. § 642(b)(5); see id. § 642(a)(1)(B)(iii). The Commission established requirements for challenges to mobile service coverage reporting in the Third Order and directed the Bureau and Offices to adopt additional implementation details. Third Order, 36 FCC Rcd at 1164-75, paras. 97-125.
7. At the outset, we note that coverage maps generated using propagation modeling are probabilistic due to the variability of mobile wireless service. See 47 U.S.C. § 641(3) (defining the term “cell edge probability” to mean “the likelihood that the minimum threshold download and upload speeds with respect to broadband internet access service will be met or exceeded at a distance from a base station that is intended to indicate the ultimate edge of the coverage area of a cell”). The BDC coverage maps will be based on specifications adopted by the Commission to reflect where a mobile service provider’s models predict a device has at least a 90% probability of achieving certain minimum speeds at the cell edge for the parameters and assumptions used in the modeling. Second Order and Third Further Notice, 35 FCC Rcd at 7477, para. 39. But an individual speed test conducted in an area where a provider’s propagation model predicts adequate coverage may not, by itself, be sufficient to establish the on-the-ground reality of service in that area. Throughout this Public Notice we use the term “adequate coverage” to refer to coverage where a device should achieve upload and download speeds meeting or exceeding the minimum values associated with the provider’s map for a given technology. We have therefore designed the mobile challenge process to evaluate the on-the-ground truth of whether devices are able to achieve particular minimum speeds at least 90% of the time, measured at any point within the covered area and at any time during typical usage hours. This approach strives to collect sufficient measurements to ensure the process is statistically valid, while at the same time meeting the statutory obligation to keep the challenge process “user-friendly.” We acknowledge that on-the-ground service can be measured and analyzed in ways other than the approach set forth herein, but we believe that our approach has the benefit of being both straightforward and consistent with the framework adopted by the Commission.
1. Cognizable Challenges
8. To implement the Commission’s directives, the Bureau and Offices propose to evaluate the speed tests submitted by consumers in combination with the speed tests submitted by governmental and third-party challengers in the challenge process. Under this approach, we would combine such speed test evidence and apply a single methodology to determine whether the threshold for a cognizable challenge has been met and to establish the boundaries of the challenged area. Since we propose to require all entities submitting challenges to meet the same thresholds and follow similar procedures for submitting challenge data, we see little functional difference between consumer and governmental or third-party challenges. As such, we believe combining all challenges will result in more robust and accurate challenges.
9. In addition to combining consumer speed tests and governmental and third-party speed tests, we propose to validate each submitted speed test and exclude tests that are outside the scope of the challenge process, do not conform to the data specifications, or do not otherwise present reliable evidence. We propose to accept as valid speed tests only those tests conducted between the hours of 6:00 a.m. and 10:00 p.m. local time, so that speed tests are reflective of the hours that consumers typically use mobile broadband networks. See Second Order and Third Further Notice, 35 FCC Rcd at 7518-19, para. 153. We acknowledge that our proposal departs slightly from the time range proposed by the Commission, which would allow for tests to be conducted between 6:00 a.m. and 12:00 a.m. (midnight) local time. However, we believe that tests conducted after 10:00 p.m. are likely to record network performance that is materially different from tests conducted earlier in the day due to reduced cell loading. We seek comment on this proposal and our assumptions about network traffic patterns. We also propose to compare each speed test against the relevant coverage map. Specifically, we propose to compare speed tests for a particular network technology (e.g., 3G, 4G LTE, or 5G) to the coverage maps for the corresponding technology, to compare the environment of the speed test—stationary or in-vehicle mobile—to the coverage map of the corresponding modeled environment, and to treat as invalid and exclude any speed tests that fall outside the boundaries of the provider’s most recent coverage data for the relevant technology and modeled environment. Additionally, because we do not believe there is a reliable way to evaluate mobile voice coverage using the speed test data which the Commission requires for submitting challenges, See Third Order, 36 FCC Rcd at 1166, para. 102 (adopting a requirement that consumers “submit speed test data to support their mobile challenges”). we propose not to permit challenges to the voice coverage maps submitted by mobile service providers. We seek comment on these proposals.
10. After excluding any speed tests that fail the validations proposed above, we propose to associate the location of each validated speed test with a particular underlying geography depicted as a specific hexagonal cell area based upon the H3 geospatial indexing system. Isaac Brodsky, H3: Uber’s Hexagonal Hierarchical Spatial Index (June 27, 2018), https://eng.uber.com/h3/. H3 is an open-source project developed by Uber Technologies, Inc. that overlays the globe with hexagonal cells of different sizes at various resolutions, from zero to 15. The lower the resolution, the larger the area of the hexagonal cell.
See id. (“H3 supports sixteen resolutions. Each finer resolution has cells with one seventh the area of the coarser resolution. Hexagons cannot be perfectly subdivided into seven hexagons, so the finer cells [i.e., the ‘children’] are approximately contained within a parent cell. The identifiers for these child cells can be easily truncated to find their ancestor cell at a coarser resolution, enabling efficient indexing.”). The H3 system is designed with a nested structure in which each hexagonal cell can be further subdivided into seven “child” hexagons at the next higher (i.e., finer) resolution that approximately fit within the “parent” hexagon. Because of this nested structure, using the H3 system to group speed tests allows for challenges at multiple levels of granularity. See id. The nested structure includes 16 total H3 resolutions of hexagons ranging in average area size from approximately 4.25 million square kilometers to 0.9 square meters. H3, Table of Cell Areas for H3 Resolutions, https://h3geo.org/docs/core-library/restable/ (last visited June 22, 2021). In the case where a test reports more than one pair of distinct geographic coordinates (e.g., because the device was in motion), we propose to associate the test with the midpoint of the reported coordinates. We propose to use a system based upon hexagonal shapes instead of squares or rectangles because hexagons better enable us to evaluate challenges across multiple levels of granularity, which can cover a significant area. See ESRI, Why Hexagons?, https://pro.arcgis.com/en/pro-app/latest/tool-reference/spatial-statistics/h-whyhexagons.htm (last visited June 22, 2021) (“Hexagons reduce sampling bias due to edge effects of the grid shape, this is related to the low perimeter-to-area ratio of the shape of the hexagon. A circle has the lowest ratio but cannot tessellate to form a continuous grid. Hexagons are the most circular-shaped polygon that can tessellate to form an evenly spaced grid.”). Tessellations (i.e., the arrangement of shapes closely fitted together, such as polygons in a repeated pattern without gaps or overlapping) with large hexagons distort less due to the curvature of the Earth. See id. The orientation of a hexagon matters less compared with squares and rectangles. Id. We further propose that the smallest cognizable challenge would be to a single resolution 8 hexagonal cell, which has an area of approximately 0.7 square kilometers. H3, Table of Cell Areas for H3 Resolutions, https://h3geo.org/docs/core-library/restable/ (last visited June 22, 2021). The Commission previously adopted a de minimis requirement for the smallest cognizable challenge in the Mobility Fund Phase II challenge process of one square kilometer. See Connect America Fund; Universal Service Reform – Mobility Fund, WC Docket No. 10-90, WT Docket No. 10-208, Order on Reconsideration and Second Report and Order, 32 FCC Rcd 6282, 6305-06, para. 46 (2017); Procedures for the Mobility Fund Phase II Challenge Process, WC Docket No. 10-90, WT Docket No. 10-208, Public Notice, 33 FCC Rcd 1985, 1989-90, para. 9 (2018); see also Establishing a 5G Fund for Rural America, GN Docket No. 20-32, Report and Order, 35 FCC Rcd 12174, 12232, para. 140 (2020). We seek comment on this choice of geographical area, including our proposal to use the H3 geospatial indexing system, as well as the ideal resolution or minimum size of the area to consider a cognizable challenge.
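For illustration only, the following sketch shows how a speed test could be assigned to a resolution 8 hexagon and related to its parent and child cells using the open-source h3-py library (version 3 API; later releases rename these functions, e.g., latlng_to_cell). The coordinates, helper function, and simple coordinate averaging are our own illustrative assumptions and are not part of the proposed rules.

```python
# Illustrative sketch (not part of the proposed rules): indexing a speed
# test into the H3 grid with the open-source h3-py library (v3 API).
import h3

RESOLUTION = 8  # proposed smallest cognizable challenge area (~0.7 sq km)

def locate_test(start, end, resolution=RESOLUTION):
    """Assign a speed test to a resolution 8 hexagonal cell.

    Per the proposal, a test that reports more than one pair of distinct
    geographic coordinates (e.g., a moving device) is associated with the
    midpoint of the reported coordinates; plain averaging is adequate over
    the short distances of a single test.
    """
    lat = (start[0] + end[0]) / 2
    lng = (start[1] + end[1]) / 2
    return h3.geo_to_h3(lat, lng, resolution)

cell = locate_test((38.8977, -77.0365), (38.8990, -77.0352))  # hypothetical test
parent = h3.h3_to_parent(cell, 7)         # "parent" hexagon at resolution 7
point_hexes = h3.h3_to_children(cell, 9)  # resolution 9 "point-hexes"
print(cell, parent, len(point_hexes))     # seven point-hexes per hexagonal cell
```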
11. As part of the proposed methodology, we would evaluate all valid challenger speed tests for a given technology within each hexagon to determine whether to create a cognizable challenge to the coverage in that area. In so doing, we propose to categorize each speed test as either a “positive” test or a “negative” test based upon whether the test is consistent or inconsistent with the provider’s modeled coverage. We would consider a negative test to be a speed test that does not meet the minimum predicted download or upload speed based on the provider-reported technology-specific minimum speeds with the cell edge probability and cell loading factors modeled by the provider. We would consider a positive test to be a speed test that records speeds meeting or exceeding the minimum download and upload speeds the mobile service provider reports as available at the location where the test occurred. We seek comment on this proposal. Alternatively, rather than considering a speed test as “negative” when either the recorded download or upload speed fails to meet the minimum predicted speeds for that area, should we evaluate the download and upload portions of each test independently? We note that speed test applications (apps) typically measure download, upload, and latency metrics sequentially and not simultaneously, and thus evaluating these metrics independently may better account for geographic and/or temporal variability at the expense of adding complexity to our proposed approach. We seek comment on this alternative and also on whether we should consider any other methodologies to address the probabilistic nature of mobile wireless coverage and the potential for test results “at the margins” (either on the download speed or the upload speed) to either overrepresent or underrepresent coverage. Commenters proposing any alternative methodologies should explain how their proposals are consistent with the requirements and standardized reporting parameters set forth by the Commission and in the Broadband DATA Act. By aggregating speed tests and requiring challenges to meet the thresholds described below, we tentatively conclude that the methodology we propose above would ensure that challenges are temporally and geographically diverse, and therefore reflect a robust and representative sample of user experience. As such, we anticipate that situations in which a mobile service provider has throttled speeds of consumers that exceed data limits will have little, if any, effect on the challenge process. We seek comment on our assumptions, tentative conclusions, and whether there are other ways to address the issue of throttling.
12. We propose to consider a provider’s coverage for a given technology in a resolution 8 hexagon to be challenged when the set of valid speed tests meets three thresholds: (1) a geographic threshold, (2) a temporal threshold, and (3) a testing threshold. For the geographic threshold, we propose to require that at least four child hexagons (or “point-hexes”) We define a point-hex as a resolution 9 child hexagon for a given resolution 8 hexagon. See Appendix A-Technical Appendix § 3.1.1. A resolution 9 hexagon has an area of approximately 0.1 square kilometers. See id. at Table 1. within the resolution 8 hexagon include two or more tests taken within each point-hex, and that at least one of the tests in each point-hex be negative.
We propose to require fewer than four point-hexes to include tests when there are fewer than four of the seven point-hexes of a resolution 8 hexagon that are “accessible” – that is, where at least 50% of the point-hex overlaps with the provider’s reported coverage data and a road runs through the point-hex. See Appendix A-Technical Appendix at Table 2. Setting these dual requirements will help to demonstrate that inadequate coverage occurs at multiple locations within the resolution 8 hexagon. For the temporal threshold, we propose to require at least two negative tests be conducted at different times of day, separated by at least four hours, to demonstrate persistent inadequate coverage. For the testing threshold, we propose to require at least five negative tests within the resolution 8 hexagon when 20 or fewer total challenge tests have been submitted within the hexagon. When more than 20 challenge tests have been submitted within the hexagon, we propose to require that the percentage of negative tests within the resolution 8 hexagon statistically demonstrate, using a 0.95 statistical confidence level, that the probability of a test achieving the minimum speeds reported for the provider’s coverage is less than 90% and therefore warrants a challenge. The required percentage of negative tests would thus vary, from at least 24% when between 21 and 30 challenge tests have been submitted within the hexagon, to 16% when 100 or more tests have been submitted. We also propose that a larger, “parent” hexagon (at resolutions 7 or 6) be considered challenged if at least four of its child hexagons are considered challenged. Consistent with the Commission’s direction to consider “whether the tests were conducted in urban or rural areas,” we propose to allow challenges that account for differences in areas. Third Order, 36 FCC Rcd at 1167-68, para. 105. The proposal sets forth a different geographic threshold depending on the road density of each resolution 8 hexagon which we anticipate will make it easier for challengers to establish a challenge in less densely populated areas. Additionally, the proposal includes a process to trigger challenges to a parent or grandparent hexagon (at resolutions 7 and 6, respectively) that likewise takes into account this different geographic threshold, thus more easily allowing for challenges over large rural areas. We seek comment on this proposed methodology and the associated thresholds. Specifically, we seek comment on whether these thresholds are sufficient to adequately reflect the actual coverage in an area while maintaining a user-friendly challenge process. Should additional tests and testing at additional times of day be required in order to overcome typical variability in mobile wireless coverage? Alternatively, instead of our proposed temporal threshold, should we categorize tests into different temporal ranges (e.g., 6:00 to 10:00 a.m., 10:00 a.m. to 2:00 p.m., 2:00 to 6:00 p.m., and 6:00 to 10:00 p.m.) and require tests in different time ranges to account for the temporal variability of mobile networks, such as variability due to cell loading? Should we consider other metrics that correlate with the availability of mobile broadband (e.g., signal strength or other radiofrequency metrics) or that provide an indication of real-world conditions that impact throughput, such as cell loading, when determining the temporal or testing thresholds, and if so, how should we adjust these thresholds in relation to such metrics? 
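To make the interplay of the three proposed thresholds concrete, the sketch below checks each one for a single resolution 8 hexagon. The statistical step uses a one-sided binomial test, which approximately reproduces the percentages quoted above (e.g., roughly 16 negative tests out of 100); the exact values would be governed by the methodology in Appendix A, and the test-record structure is hypothetical.

```python
# Illustrative sketch (not part of the proposed rules): checking the three
# proposed thresholds for a cognizable challenge in a resolution 8 hexagon.
# The binomial formulation approximates the quoted percentages; the exact
# values are controlled by Appendix A.
from scipy.stats import binom

def testing_threshold_met(n_tests: int, n_negative: int) -> bool:
    if n_tests <= 20:
        return n_negative >= 5  # fixed minimum for small samples
    # One-sided test of H0: P(test meets minimum speeds) >= 0.90. At the
    # boundary, negatives ~ Binomial(n, 0.10); reject H0 at the 0.95
    # confidence level when P(X >= n_negative) <= 0.05.
    return binom.sf(n_negative - 1, n_tests, 0.10) <= 0.05

def geographic_threshold_met(tests_by_point_hex: dict, n_accessible: int) -> bool:
    # Relaxed when fewer than four point-hexes are "accessible."
    required = min(4, n_accessible)
    qualifying = [
        tests for tests in tests_by_point_hex.values()
        if len(tests) >= 2 and any(t["negative"] for t in tests)
    ]
    return len(qualifying) >= required

def temporal_threshold_met(negative_test_times: list) -> bool:
    # At least two negative tests separated by at least four hours.
    times = sorted(negative_test_times)
    return len(times) >= 2 and (times[-1] - times[0]).total_seconds() >= 4 * 3600
```

Under this sketch, a hexagon would be treated as challenged only when all three checks pass for the valid tests associated with it.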
Once the challenge process has been implemented, we anticipate that we may revisit and modify these thresholds, after notice and comment, if they are not sufficient to provide a clear determination of actual coverage conditions. See id. at 1168, para. 106. Appendix A provides a more detailed technical description of these proposed thresholds.
13. Because mobile service providers are required to submit two sets of coverage data for a given technology—one map modeled to assume a device is in a stationary environment and one map modeled to assume a device is in-vehicle and in a mobile environment—we propose to evaluate all tests for a given technology against each map independently when determining whether to establish a cognizable challenge. Second Order and Third Further Notice, 35 FCC Rcd at 7481-82, para. 48; 47 CFR § 1.7004(c)(5). That is, we would filter speed tests to exclude any stationary tests that fall outside of the provider’s stationary coverage map and exclude any in-vehicle mobile tests that fall outside of the provider’s in-vehicle mobile coverage map. We would then aggregate all of the remaining stationary and in-vehicle mobile tests and compare these tests against the coverage data for a given technology and modeled environment. If the aggregated tests in a resolution 8 hexagon meet all three thresholds proposed above, we would consider that map’s coverage to be challenged for that hexagon. Because the two sets of coverage data may differ (especially at the edge of a provider’s network), tests submitted as challenges against the same provider within the same hexagon may be sufficient to create a challenge against one of the maps and insufficient to create a challenge against the other. We seek comment on this proposed approach to evaluating challenges against stationary and in-vehicle mobile maps. We acknowledge that stationary tests and in-vehicle mobile tests may not be entirely homogeneous measurements of an on-the-ground experience. However, we believe that aggregating such tests when evaluating challenges aligns more closely with the Broadband DATA Act requirement to develop a “user-friendly” challenge process, a benefit that we believe outweighs any cost to accuracy from treating such tests as homogeneous. In the alternative, if we were to not aggregate such tests and only evaluate stationary tests against stationary maps and separately evaluate in-vehicle mobile tests against in-vehicle mobile maps, we anticipate that it may be significantly more difficult to establish a challenge to certain coverage data. For example, if most consumers conduct stationary tests while most government and third-party entities conduct in-vehicle mobile tests (i.e., drive tests), segregating such tests when evaluating challenges would likely result in tests meeting all three proposed thresholds in fewer resolution 8 hexagons. Moreover, there is a higher likelihood that, after adjudicating the challenges, portions of a provider’s coverage data may show a lack of coverage for one type of map, due to successful challenges, yet still show robust coverage for the other type of map due solely to an absence of one type of test and in ways that are inconsistent with mobile wireless propagation. We seek comment on this view and on any alternatives to reconciling challenges to these two sets of coverage data.
14. In the Third Order, the Commission required consumer challengers to use a speed test app approved by OET for use in the challenge process and provided the metrics that approved apps must collect for each speed test. Third Order, 36 FCC Rcd at 1166-67, paras. 103-104. The Commission directed OET, in consultation with OEA and WTB, to update the FCC Speed Test app as necessary or develop a new speed test app to collect the designated metrics, so that challengers may use it in the challenge process. Id. at 1167, para. 104. For government and third-party entity challengers, the Commission did not require the use of a Commission-approved speed test app but instead set forth the information that all submitted government and third-party challenger speed test data must contain and directed OEA, WTB, and OET to adopt additional testing requirements if they determine it is necessary to do so. Id. at 1172, paras. 117-18. We propose to update the metrics that approved apps must collect for consumer challenges and that government and third-party entity challenger speed test data must contain. Specifically, we propose that on-the-ground test data submitted by challengers meet the following testing parameters: (1) a minimum test length of 5 seconds and a maximum test length of 30 seconds; (2) test measurement results that have been averaged over the duration of the test (i.e., total bits received divided by total test time); and (3) a restriction that tests must be conducted between the hours of 6:00 a.m. and 10:00 p.m. local time. We also propose that on-the-ground challenge test data shall include the following metrics for each test: (1) app name and version; (2) timestamp and duration of each test metric; (3) geographic coordinates measured at the start and end of each test metric with typical Global Positioning System (GPS) Standard Positioning Service accuracy or better; (4) device make and model; (5) cellular operator name; (6) location (e.g., hostname or IP address) of server; (7) signal strength, signal quality, unique identifier, and radiofrequency (RF) metrics of each serving cell, if available; (8) download speed; (9) upload speed; (10) round-trip latency; (11) the velocity of the vehicle, if available, for in-vehicle tests; and (12) all other metrics required per the most-recent specification for mobile test data released by OEA and WTB. We propose to require challengers to collect these data using mobile devices running either a Commission-developed app (e.g., the FCC Speed Test app) or another speed test app approved by OET to submit challenges. See infra Section III.E (proposing that OET issue a public notice inviting proposals for designation of third-party speed test data collection apps as acceptable for use for submission of crowdsourced and challenge data). For government and third-party entity challengers, we would also allow these data to be collected using other software and hardware. Third Order, 36 FCC Rcd at 1172, para. 117 (stating that the Commission “will not require government and other entity challengers to use a Commission-approved speed test application, but rather will allow them to use their own software to collect data for the challenge process”). We anticipate that updating these parameters will provide the Commission with reliable challenges, while assuring a user-friendly challenge process by allowing consumers to use a readily downloadable mobile app and preserving flexibility for government and third-party entities to use their own software and hardware.
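For illustration, the proposed metrics map naturally onto a structured record, as in the sketch below, which also checks the proposed 5-to-30-second duration and 6:00 a.m. to 10:00 p.m. testing window. The field names are our own; the controlling format would be the most recent mobile test data specification released by OEA and WTB.

```python
# Illustrative sketch (not part of the proposed rules): one possible record
# structure for the proposed on-the-ground speed test metrics. Field names
# are hypothetical; the OEA/WTB data specification controls.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SpeedTest:
    app_name: str
    app_version: str
    started_at: datetime                  # local time of the test
    duration_sec: float                   # proposed bounds: 5 to 30 seconds
    start_lat: float
    start_lng: float
    end_lat: float
    end_lng: float
    device_make: str
    device_model: str
    operator_name: str
    server_location: str                  # hostname or IP address of the server
    download_mbps: float                  # averaged over the duration of the test
    upload_mbps: float
    latency_ms: float                     # round-trip latency
    signal_strength_dbm: Optional[float] = None  # serving-cell RF metrics, if available
    vehicle_speed_mps: Optional[float] = None    # for in-vehicle tests, if available

def within_testing_window(t: datetime) -> bool:
    # Proposed restriction: tests conducted between 6:00 a.m. and 10:00 p.m.
    # local time.
    return 6 <= t.hour < 22

def valid_duration(seconds: float) -> bool:
    return 5.0 <= seconds <= 30.0
```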
We note, however, that certain technical network information and RF metrics are not currently available on Apple iOS devices, thus limiting the conclusions that we can draw from on-the-ground tests conducted using such devices. We therefore propose to require that, until such time as such information and metrics are available on iOS devices, government and third-party entity challengers must use a device that is able to interface with drive test software and/or runs the Android operating system. See, e.g., Mobility Fund Phase II Challenge Process Handsets and Access Procedures for the Challenge Process Portal, WC Docket No. 10-90, WT Docket No. 10-208, Public Notice, 33 FCC Rcd 10372, 10373-74, para. 5 (2017) (MF-II Handset and USAC Portal Access Public Notice) (adopting requirements for the devices challengers may use in the Mobility Fund Phase II challenge process). However, we do not propose this same restriction for challenges submitted by consumers, to ensure that the challenge process remains user-friendly and to encourage public participation, including by consumers who may use a device running the iOS operating system. We seek comment on these proposals.
2. Challenge Responses
15. Providers must either submit a rebuttal to the challenge or concede the challenge within 60 days of being notified of the challenge. Third Order, 36 FCC Rcd at 1168, 1173, paras. 107, 121; 47 CFR § 1.7006(e)(3), (f)(4). Providers may rebut a challenge by submitting to the Commission on-the-ground test data, infrastructure data, or both, so that Commission staff can examine the provider’s coverage in the challenged area and resolve the challenge, Third Order, 36 FCC Rcd at 1168-69, 1173, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5). Some providers acknowledged that submitting on-the-ground data may serve as an effective means to rebut specific challenges. See, e.g., T-Mobile Reply at 9 (“Although on-the-ground testing may be useful to respond to a particular challenge, that does not mean regular on-the-ground testing is necessary or that it strikes an appropriate balance between promoting accuracy and minimizing the burdens on providers.”); CTIA Comments and Petition for Reconsideration at 20-21 (“[I]n some situations, a drive test or similar measures may be warranted to respond to a challenge.”); Letter from Mary L. Henze, Asst. Vice Pres. Federal Regulatory, AT&T to Marlene Dortch, Secretary, FCC, WC Docket No. 19-195, at 4 (filed Oct. 9, 2020) (AT&T Oct. 9, 2020 Ex Parte Letter) (“If a provider elects to collect data to disprove a challenge, AT&T proposes that providers have flexibility in using various sources of data to respond but should be encouraged to respond with on-the-ground drive test data.”). Several commenters also acknowledged that infrastructure information may be useful in rebutting challenges. See, e.g., T-Mobile Comments at 12 (“To the extent an issue with coverage maps is identified in the challenge process or by other verification tools, the Commission has the authority to request targeted infrastructure information, such as cell site location, as necessary.”); CTIA Comments and Petition for Reconsideration at 13 (“If the other verification processes identify a specific issue regarding a provider’s coverage data, the Commission could request targeted cell site location information from a provider specific to the service area in question in order to facilitate verification or resolution of the challenge process.”).
and may optionally include additional data or information in support of a response. Providers have the option to include additional data or information in support of their challenge responses. Third Order, 36 FCC Rcd at 1168-69, 1173-74, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5). When a mobile provider responds to a consumer challenge, the challengers who submitted the challenge data would be notified individually by the Bureau or Offices via the online portal and would be able to view the provider’s response. Third Order, 36 FCC Rcd at 1170, para. 111. The Commission directed OEA to “develop a methodology and mechanism to determine if the data submitted by a provider constitute a successful rebuttal to all or some of the challenged service area and to establish procedures to notify challengers and providers of the results of the challenge.” Id. at 1170, 1173-74, paras. 111, 121. The Commission “adopt[ed] the same challenge response process for government and third-party entities as [it] do[es] for consumer challenges in the mobile context”; therefore, we infer the notification process will occur in the same way for challenges made by governmental and other entities as it does for challenges made by consumers. Id. at 1173-74, para. 121. We propose that mobile service providers and challengers be notified monthly of the status of challenged areas. Parties would be able to see a map of the challenged area and a notification indicating whether a challenge has been successfully rebutted, whether a challenge was successful, and whether a challenged area was restored because there was insufficient evidence to sustain the challenge. We also propose that, for any area in which the provider does not overturn the challenge but that is otherwise no longer challenged (e.g., because some challenger tests were subsequently considered to be invalid or unreliable evidence), the coverage area would be restored to its pre-challenge status and would be eligible for challenges against it in the future. See Appendix A-Technical Appendix at § 3.2.4. We propose that any valid speed test in a hexagon that was challenged and then restored (but where the provider did not overturn the challenge by demonstrating adequate coverage) may still be used for a future challenge (up to a year from the date the test was conducted). We seek comment on these proposals.
16. The Commission also directed OEA, in consultation with WTB, to establish procedures for notifying service providers of cognizable challenges filed against them. Third Order, 36 FCC Rcd at 1168, para. 106; see 47 CFR § 1.7006(e)(3). Accordingly, we propose that the challenged mobile service provider would be notified by the Bureau or Offices via the online portal of the challenged hexagons at the end of each calendar month. We seek comment on this proposal and note that this approach would allow challengers to submit additional evidence if desired and grant providers a standard set of deadlines rather than a rolling set of multiple deadlines. If the challenged provider concedes or fails to submit data sufficient to overturn the challenge within 60 days of notification, it must revise its coverage maps to reflect the lack of coverage in the successfully challenged areas. Third Order, 36 FCC Rcd at 1170, para. 112 (“[I]n cases where a mobile service provider concedes or loses a challenge, the provider must file, within 30 days, geospatial data depicting the challenged area that has been shown to lack service.
Such data will constitute a correction layer to the provider’s original propagation model-based coverage map, and Commission staff will use this layer to update the broadband coverage map.”); accord 47 CFR § 1.7006(e)(6), (f)(7); Third Order, 36 FCC Rcd at 1174, para. 124.
a. Rebutting Challenges with On-the-Ground Data
17. The Commission directed OEA to resolve challenges based on a “preponderance of the evidence” standard with the burden on the provider to verify its coverage maps in the challenged areas. Third Order, 36 FCC Rcd at 1170, para. 111. When the challenged mobile service provider chooses to submit on-the-ground speed test data to rebut a challenge, we propose to require the provider to meet analogous thresholds to those required of challengers, adjusted to reflect the burden on providers to demonstrate that sufficient coverage exists at least 90% of the time in the challenged hexagons. We also propose that mobile providers submit on-the-ground data consistent with the specific testing parameters and methodologies outlined above that we propose challengers use when submitting speed test data. See supra Section III.A.1. We propose to require providers to collect these data using mobile devices running either a Commission-developed app (e.g., the FCC Speed Test app), another speed test app approved by OET to submit challenges, or other software and hardware if approved by staff. The Commission instructed the Bureau and Offices to “approve the equipment that providers may use [to conduct on-the-ground testing for verifying maps], including the handsets and any other special equipment necessary for the testing and other parameters necessary to obtain a statistical sample of the network.” Third Order, 36 FCC Rcd at 1150-51, para. 59. The Commission instructed that providers rebutting challenges with on-the-ground test data will be subject to the same requirements and specifications that apply to providers submitting data in response to a Commission verification request. Id. at 1169-70, para. 109. Per the Third Order, OEA may also approve appropriate standards and specifications (including appropriate hardware and software) for other types of data it may approve to be used to rebut a consumer challenge. Id. at 1170, para. 110. As noted above, certain technical network information and RF metrics are not currently available on Apple iOS devices. Accordingly, until such time as these data are available on iOS devices, we propose to require providers to use a device that is able to interface with drive test software and/or runs the Android operating system. See MF-II Handset and USAC Portal Access Public Notice, 33 FCC Rcd at 10373-74, para. 5. We seek comment on our proposals.
18. We propose that the test data that providers submit meet the same three thresholds required of challenger tests: (1) a geographic threshold; (2) a temporal threshold; and (3) a testing threshold. However, we propose somewhat different values (i.e., the number of tests and percentages) for test data for each threshold. For the geographic threshold, we propose to require at least four point-hexes of a resolution 8 hexagon to include two tests taken within them, at least one of which must be positive, to demonstrate that adequate coverage occurs at multiple locations within the resolution 8 hexagon. Fewer point-hexes may be tested when some of the seven point-hexes of a resolution 8 hexagon are not within the coverage area or do not contain at least one road. See Appendix A-Technical Appendix Table 2.
For the temporal threshold, we also propose to require at least two positive tests be taken at times of day separated by at least four hours to demonstrate persistent adequate coverage. For the testing threshold, we propose to require at least 17 positive tests within the resolution 8 hexagon when 20 or fewer total response tests have been submitted within the hexagon. When more than 20 response tests have been submitted within the hexagon, we propose to require that the percentage of positive tests within the resolution 8 hexagon statistically demonstrate, using a 0.95 statistical confidence level, that the probability of a test achieving the minimum speeds reported in the provider’s coverage is 90% or greater and therefore the area has adequate coverage. The required percentage of positive tests would thus vary, from at least 82% when between 21 and 34 response tests have been submitted within the hexagon, to 88% when 100 or more tests have been submitted. As with the thresholds proposed for challengers, we seek comment on whether these thresholds are sufficient to adequately demonstrate the on-the-ground reality of coverage in an area while maintaining a user-friendly challenge process. We expect any future modifications to these thresholds would apply to both challengers and providers. We also propose that a provider may demonstrate sufficient coverage in a resolution 8 hexagon that was not challenged if that hexagon is the child of a lower resolution challenged hexagon. As discussed more fully in section 3.2.4 of the Technical Appendix, for challenged hexagons at resolution 7 or 6, if the provider submits response data sufficient to demonstrate coverage in the hexagon’s child hexagons such that fewer than four child hexagons would still be challenged, then the resolution 7 or 6 hexagon would no longer be challenged even if sufficient data were not submitted to rebut a challenge for the remaining child hexagons. If the provider can demonstrate sufficient coverage in a challenged hexagon, the provider would have successfully rebutted the challenge to that hexagon, and the challenge would be overturned. Conversely, if the provider is not able to demonstrate sufficient coverage in a challenged hexagon, the provider would be required to revise its coverage maps to reflect the lack of coverage in such areas. If the provider demonstrates sufficient coverage in some but not all child hexagons and the parent (or grandparent) hexagon remains challenged, we propose that a provider would not be required to remove from its coverage map the portions of the challenged parent (or grandparent) hexagon where the provider demonstrated sufficient coverage in the child hexagons. However, the provider would be required to remove the remaining portion of the challenged parent (or grandparent) hexagon where it did not demonstrate sufficient coverage. We propose that any areas where the provider has demonstrated sufficient coverage would be ineligible for subsequent challenge until the first biannual BDC coverage data filing six months after the later of either the end of the 60-day response period or the resolution of the challenge. This approach avoids requiring a provider to repeatedly confirm the same area while acknowledging that coverage may change over time due to changes in technology and infrastructure. We seek comment generally on this approach and as to whether this time period is too short or too long.
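For illustration, the parent-hexagon logic described above (and in section 3.2.4 of the Technical Appendix) could be expressed as in the sketch below, again using the h3-py library (v3 API); the function and variable names are hypothetical.

```python
# Illustrative sketch (not part of the proposed rules): a resolution 7 or 6
# hexagon remains challenged only while at least four of its child hexagons
# remain challenged, so rebutting enough children clears the parent.
import h3

def parent_still_challenged(parent_cell: str,
                            challenged_children: set,
                            rebutted_children: set) -> bool:
    parent_res = h3.h3_get_resolution(parent_cell)
    children = h3.h3_to_children(parent_cell, parent_res + 1)
    remaining = (challenged_children & children) - rebutted_children
    return len(remaining) >= 4
```

Under this rule, a provider that demonstrates coverage in enough child hexagons would clear the parent hexagon even without submitting sufficient data for the remaining children, consistent with the discussion above.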
19. We seek comment on this methodology and invite commenters to propose alternative approaches that would allow for staff to adjudicate most challenges through an automated process. Commenters should refer to the methodology set forth in Appendix A-Technical Appendix. AT&T submitted a preliminary proposal for defining a challenge area based on the test data submitted by the challenger(s), and we considered this proposal while developing the proposed methodology. See Third Order, 36 FCC Rcd at 1168, para. 106 (directing OEA, in consultation with WTB, to consider AT&T’s proposal as they developed the methodology that will be used); AT&T Oct. 9, 2020 Ex Parte Letter at 2. We tentatively conclude that our proposed methodology is preferable to that submitted by AT&T, AT&T Oct. 9, 2020 Ex Parte Letter at 2-3. because it ensures the challenge process is both user-friendly and supported by sufficient data, while also targeting a more precise geographic area where broadband coverage is disputed and limiting the burden on providers in responding to challenges. AT&T recommends the Bureau and Offices adopt an approach in which the geographic location of speed tests would determine the size and shape of a polygon that would serve as the challenged area. Id. Moreover, AT&T proposes the Commission adopt a tiered structure in which challenges are filed and adjudicated in a manner proportional to their likelihood of success based on a percentage of valid speed tests in a polygon. Id. at 2-4. AT&T’s approach could lead to large challenged areas with few or no speed tests. Our approach differs in that challenged areas would be based on the H3 hexagonal indexing system. Under our proposed process, individual speed tests would be aggregated and evaluated collectively, and a hexagon would be classified as challenged once the aggregated speed tests have met geographic, temporal, and testing thresholds in that particular area. In addition to the on-the-ground data or infrastructure information submitted by mobile service providers, staff could also consider other relevant data submitted by challenged providers, request additional information from the challenged provider (including infrastructure data, if necessary), and take such other actions as may be necessary to ensure the reliability and accuracy of the rebuttal data. Third Order, 36 FCC Rcd at 1168-69, 1170, 1173-74, paras. 108, 110, 111, 121-22 (allowing OEA to request additional data (other than the data initially submitted by providers to rebut a challenge), instructing OEA to review additional data when voluntarily submitted by providers in response to a consumer challenge, and directing OEA to develop a methodology and mechanism to determine if data submitted by a provider constitutes a successful rebuttal to all or some of the challenged area using a “preponderance of the evidence” standard for resolving challenges with the burden on the provider to verify coverage maps in the challenged area); 47 CFR § 1.7006(e)(4), (f)(5). We propose that such steps could include rejecting speed tests or requiring additional testing. We seek comment on these proposals.
b. Rebutting Challenges with Infrastructure Data
20. Providers may respond to challenges with infrastructure data rather than (or in addition to) on-the-ground speed test data. See infra Section III.B.3 (proposing examples of the types of infrastructure information that staff would collect); Third Order, 36 FCC Rcd at 1168, 1173, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5).
In cases where a challenged mobile service provider chooses to submit infrastructure data to rebut a challenge, we propose that the mobile service provider submit the same data as required when a mobile provider submits infrastructure information in response to a Commission verification request, which would include information on the cell sites and antennas used to provide service in the challenged area. See Third Order, 36 FCC Rcd at 1147-48, para. 52; see infra Section III.B.3. Based on our tentative conclusion below that such data may not be as probative in certain circumstances as on-the-ground speed tests, See infra Section III.D. we propose to use these data, on their own, to adjudicate challenges in only a limited set of circumstances. Specifically, a challenged provider may use infrastructure data to identify tests within a challenger’s speed test data that the provider claims are invalid or non-representative of network performance. Under our proposal, a provider could claim a speed test was invalid, or non-representative, based on the following reasons: (1) extenuating circumstances at the time and location of a given test (e.g., maintenance or temporary outage at the cell site) caused service to be abnormal; (2) the mobile device(s) with which the challenger(s) conducted their speed tests do not use or connect to the spectrum band(s) that the provider uses to serve the challenged area; (3) speed tests were taken during an uncommon special event (e.g., a professional sporting event) that increased traffic on the network; or (4) speed tests were taken during a period when cell loading exceeded the modeled cell loading factor. While providers may use infrastructure information with hourly cell loading data to rebut a challenge in this scenario to show sporadic or abnormally high cell loading, in the event a high number of challenges indicates persistent over-loading, we propose that staff may initiate a verification inquiry to investigate whether mobile providers have submitted coverage maps based on an accurate assumption of cell loading in a particular area. We propose to require that mobile providers respond to such a verification inquiry with on-the-ground data. Using this proposed approach, we would recalculate the challenged hexagons after removing any invalidated challenger speed tests and would restore any challenged hexagons that no longer meet the thresholds required for a challenge to their status before the challenge was submitted. Challenged providers may also demonstrate sufficient coverage for any areas that remain challenged by submitting on-the-ground speed test data. We seek comment on this approach, including on whether there are other reasons or circumstances under which we should use infrastructure data alone to determine the outcome of a challenge.
21. We seek comment generally on other ways that infrastructure data could be used to automatically evaluate or rebut speed test data submitted by challengers. Where a challenged provider’s submitted infrastructure data do not meet one of the processing rules proposed above, we propose that Commission staff consider any additional information submitted by the challenged provider or request additional information from the challenged provider. Such information would include on-the-ground speed test data, as specified in the Third Order, and staff would use this information to complete its adjudication of the challenge.
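For illustration, the test-invalidation and recalculation step described in paragraph 20 might be structured as in the sketch below; the reason codes mirror the four circumstances listed above, and all names are hypothetical.

```python
# Illustrative sketch (not part of the proposed rules): removing disputed
# challenger tests and re-running the challenge evaluation for a hexagon.
from enum import Enum, auto

class InvalidationReason(Enum):
    EXTENUATING_CIRCUMSTANCES = auto()  # e.g., maintenance or temporary outage
    INCOMPATIBLE_DEVICE = auto()        # device cannot use the serving band(s)
    SPECIAL_EVENT = auto()              # uncommon event increased network traffic
    EXCESSIVE_LOADING = auto()          # loading exceeded the modeled factor

def recalculate_challenge(tests, accepted_disputes, evaluate_hexagon):
    """Drop tests whose dispute was accepted, then re-evaluate the hexagon.

    `accepted_disputes` maps test IDs to an InvalidationReason accepted by
    staff; `evaluate_hexagon` re-applies the geographic, temporal, and
    testing thresholds. If it returns False, the hexagon would be restored
    to its pre-challenge status.
    """
    remaining = [t for t in tests if t["test_id"] not in accepted_disputes]
    return evaluate_hexagon(remaining)
```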
We acknowledge there may be some scenarios in which a provider may not be able to respond to a challenge with on-the-ground test data due, for example, to the inability to collect on-the-ground data during certain months of the year or other unforeseen circumstances. We seek comment on the best approach to handle such situations. One approach would be to allow for providers to seek a waiver of the 60-day response deadline until the provider can make on-the-ground measurements, or a waiver of the requirement to submit either infrastructure or on-the-ground speed test data in response to a challenge. Another approach would be to allow providers to submit infrastructure data, even if none of the four instances of particular probative value set forth above applies, with supplemental data that explain their inability to make on-the-ground measurements at that time. In such cases, the Commission could request that the on-the-ground test data be submitted at a time when such measurements would be more feasible, or that a possible substitute for such data—such as transmitter monitoring software data or third-party speed test data—be submitted instead. Commission staff could also use infrastructure data to conduct its own propagation modeling and generate its own predicted coverage maps, using the data submitted by the provider (including link budget parameters, cell-site infrastructure data, and the information provided by service providers about the types of propagation models they used) along with standard terrain and clutter data and standard propagation models, to determine whether the provider should be required to update its maps. We seek comment on other approaches we should take where on-the-ground testing is temporarily infeasible.
22. In instances where the Commission staff uses its own propagation modeling to adjudicate challenges, we seek comment on how staff should conduct such propagation modeling. What model or models should staff use in different conditions (e.g., for what combinations of spectrum band and terrain)? What inputs and parameters should staff use beyond those supplied by providers (e.g., what specific sources of terrain and clutter data in what areas)? What assumptions should the Commission make regarding carrier aggregation? How should staff calculate the throughput in a given area given propagation-model calculations for signal strength? Finally, how should the Commission calibrate its models or ensure their accuracy?
23. We also seek comment about how staff should adjudicate instances where the on-the-ground test data and infrastructure data disagree or where the provider-filed coverage and Commission-modeled coverage differ. Under what conditions should staff determine that a given hexagon has network coverage? Would the results of the Commission propagation modeling always be dispositive? For example, should we always find that an area has network coverage if so indicated by the Commission propagation model, despite any number of on-the-ground tests that indicated a lack of service at the required speeds? Should we incorporate other, related metrics, such as signal strength or cell loading data, when considering how to treat infrastructure data in the adjudication of challenges? And should staff always require providers to update their filings or submit additional data if the Commission’s propagation modeling indicates a lack of network coverage?
If the Commission propagation model indicates network coverage over part of a hexagon, how should staff adjudicate that area? Should the specific location of on-the-ground test measurements within a challenged hexagon, relative to the Commission-predicted coverage, matter? Are there other scenarios in which we should consider adjudicating challenges with only infrastructure data? c. Other Data 24. In the Third Order, the Commission sought to adopt a flexible approach for providers to respond to challenges. Third Order, 36 FCC Rcd at 1169-70, para. 109. Several commenters argued that the Commission should grant providers additional flexibility in responding to challenges, including allowing providers to respond with drive testing data collected in the ordinary course of business, third-party testing data (such as speed test data from Ookla or other speed test apps), and/or tower transmitter data collected from transmitter monitoring software. Verizon Comments at 27; CTIA Comments at 20-21. As discussed in the Third Order, providers may voluntarily submit these or other types of data to support their rebuttals, but they may not be used in lieu of on-the-ground testing or infrastructure data. Third Order, 36 FCC Rcd at 1170, para. 110; 47 CFR § 1.7006(e)(5). Consistent with the Commission's direction, OEA staff will review such data when voluntarily submitted by providers in response to consumer challenges, and if any of the data sources are found to be sufficiently reliable, we will, via public notice, specify appropriate standards and specifications for each type of data and add them to the alternatives available to providers to rebut a consumer challenge. Third Order, 36 FCC Rcd at 1170, para. 110. 25. We also seek comment on the conditions under which staff can rely on a provider's transmitter monitoring software in resolving challenges. For example, in what ways would transmitter monitoring software data augment or reinforce the probative value of infrastructure or other data to rebut challenger speed test data? How precisely do such systems measure the geographic coordinates (longitude and latitude) of the end-user devices, and how does that precision compare to the information collected from on-the-ground testing? Would such software record instances of end-user devices not being able to connect to the network at all? If not, would that exclusion make the data less reliable and probative in the rebuttal process? What other information would staff need to determine how to make use of such data in the challenge process? B. Collecting Verification Information from Mobile Providers 26. The Broadband DATA Act requires the Commission to "verify the accuracy and reliability of the [broadband internet access service data that providers submit in their biannual BDC filings] in accordance with measures established by the Commission." 47 U.S.C. § 642(b)(4)(B). In the Third Order, the Commission determined that OEA and WTB may request and collect verification data from a provider on a case-by-case basis where staff have a credible basis for verifying the provider's coverage data. Third Order, 36 FCC Rcd at 1146, para. 47; 47 CFR § 1.7006(c). The Third Order specifies that, in response to an OEA and WTB inquiry to verify a mobile service provider's coverage data, the provider must submit either infrastructure information or on-the-ground test data for the specified area(s). Third Order, 36 FCC Rcd at 1146, para. 50; see 47 CFR § 1.7006(c).
A mobile provider has the option of submitting additional data, including but not limited to on-the-ground test data or infrastructure data (to the extent such data are not the primary option chosen by the provider), or other types of data that the provider believes support its reported coverage. Third Order, 36 FCC Rcd at 1147, para. 50. A mobile service provider has 60 days from the time of the request by OEA and WTB to submit, at the provider's option, infrastructure or on-the-ground data and any additional data that the provider chooses to submit to support its coverage. Id. OEA and WTB may require submission of additional data if such data are needed to complete their verification inquiry. Id. The Commission further directed OEA and WTB to implement this data collection and adopt the methodologies, data specifications, and formatting requirements that providers must follow when collecting and reporting such data. Id. at 1146, para. 48. Below, we propose processes and methodologies for determining areas subject to verification and for the collection of on-the-ground test data and infrastructure information, as well as information from transmitter monitoring systems and other data. We seek comment on each of these proposals, including the additional details and specifications set forth in the Technical Appendix. 1. Area Subject to Verification 27. We propose to identify the portion(s) of a mobile provider's coverage map for which we would require verification data—referred to as the targeted area(s)—based upon all available evidence, including submitted speed test data, infrastructure data, crowdsourced and other third-party data, as well as staff evaluation and knowledge of submitted coverage data (including maps, link budget parameters, and other credible information). We seek comment on this proposal and on any alternative methodologies for determining where staff have a credible basis for verifying a mobile provider's coverage data. 28. Within the targeted area, we propose to require verification data covering a statistically valid sample of areas for which the mobile service provider must demonstrate sufficient coverage in order to satisfy the verification request. We propose to start the sampling with the division of the targeted area into unique components called "units." Illustrative maps are provided in the Technical Appendix. See Appendix A-Technical Appendix, Figure 6. The complete list of units within the targeted area is called the "frame." We propose to first subdivide the targeted area into units based upon the same hexagonal geography we propose to use for grouping challenger speed tests (i.e., the H3 geospatial indexing system at resolution 8). To create the frame, we propose to include all resolution 8 hexagons that are entirely within the targeted area, as well as any resolution 8 hexagons that are only partially within the boundary of the targeted area but whose centroids fall within or on the boundary of the targeted area. We next propose to group the hexagonal units that comprise the frame into non-overlapping, mutually exclusive groups (one "stratum" or multiple "strata"). We propose to define each stratum based upon one or more variables that are correlated with a particular mobile broadband availability characteristic, such as population, road miles, and/or variation in terrain, and seek comment on what variables we should consider. Appendix A-Technical Appendix § 4.2-3. We propose to exclude from the strata any hexagons that are not accessible by roads.
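A minimal sketch of the proposed frame construction and stratification, assuming the open-source h3-py library (v3 API), appears below; h3.polyfill applies the same centroid-containment rule described above, and the single covariate and cutoff used for stratification are hypothetical.

```python
import h3  # open-source h3-py library, v3 API assumed


def build_frame(targeted_area_geojson, resolution=8):
    """Return the frame: every resolution 8 hexagon whose centroid falls
    within the targeted area. h3.polyfill uses centroid containment,
    mirroring the inclusion rule proposed above (treatment of centroids
    exactly on the boundary is an implementation detail)."""
    return h3.polyfill(targeted_area_geojson, resolution, geo_json_conformant=True)


def stratify(frame, covariate_for, cutoff=1000):
    """Group the frame into two illustrative, mutually exclusive strata
    using a single hypothetical covariate (e.g., population per hexagon);
    actual strata might combine population, road miles, and terrain."""
    strata = {"low": set(), "high": set()}
    for cell in frame:
        strata["high" if covariate_for(cell) >= cutoff else "low"].add(cell)
    return strata
```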
If an area is unable to be sampled because there are too few hexagons accessible by road, we propose to include the minimum number of non-accessible hexagons within the strata as necessary to create a sufficient sample. We seek comment on these proposals, and on other methods that can be used to verify the part of the targeted area that cannot be drive tested. 29. Next, we propose to select a random sample For a full methodology of sample selection, see Appendix A-Technical Appendix § 4.4. of hexagons independently within each stratum and to require that a service provider conduct on-the-ground testing within these randomly selected hexagons or else submit infrastructure data sufficient for staff to reproduce coverage for those hexagons. When evaluating on-the-ground test data, we propose that a sample meet two of the three thresholds proposed for evaluating tests in a challenged hexagon in the challenge process, specifically the geographic and temporal thresholds. We also propose to require a minimum of five speed tests in each selected hexagon. We would then evaluate the entire set of speed tests to determine the probability that the targeted area has been successfully verified. Under our proposal, for the targeted area to be successfully verified, the probability of adequate coverage must be greater than or equal to 0.9, assessed using a one-sided 95% confidence interval. When evaluating infrastructure data, we propose that staff review all available data, together with staff propagation modeling, and that adequate coverage be demonstrated for all hexagonal units in the sample for the targeted area to be successfully verified. Where the data submitted by the provider in response to a verification request are not by themselves sufficient to demonstrate adequate coverage, we may request additional information to complete the verification process. We seek comment on these proposals. 30. Several commenters supported our proposal in the Second Order and Third Further Notice to verify broadband availability data by requiring providers to submit tests and information on sampled areas, and agreed that it would be an efficient and less burdensome approach than having providers perform annual drive tests or regularly submit infrastructure information. CCA Reply at 2; Connected Nations Comments at 4; Vermont Department of Public Safety (VTDPS) Comments at 3; NRECA Comments at 5-6; see also Second Order and Third Further Notice, 35 FCC Rcd at 7503-05, paras. 104, 106, 108. We agree that sampling will require lower costs and fewer resources than collecting data from a provider's entire network coverage area. CCA Comments at 6-7. In particular, the proposed approach for sampling the targeted area is designed to minimize the cost and burden placed on service providers while ensuring staff have access to sufficient data to verify coverage in a reliable way. Second Order and Third Further Notice, 35 FCC Rcd at 7503-05, paras.
104, 106, 108 (recognizing that requiring providers to test their entire networks would be prohibitively expensive and seeking comment on the costs of requiring mobile providers to submit a statistically valid sample of on-the-ground data to verify their network coverage); CCA Comments at 6-7 (asserting that requiring carriers to verify the accuracy of mobile broadband availability data through a statistically valid sample of on-the-ground data may require lower costs and fewer resources than collecting data from a provider's entire network); CCA Reply at 2 (acknowledging that the Commission can achieve its verification goals by requiring carriers to drive test a statistically significant sample of their networks). Without such a sampling plan, providers would need to submit substantially more data to demonstrate broadband availability. 31. In response to the Second Order and Third Further Notice, some providers expressed concerns that sampling would not mitigate the costs associated with performing testing and would still be a burden on providers, as it would require a minimum number of tests at different locations. CTIA Comments and Petition for Reconsideration at 14; Coalition of Rural Wireless Carriers Reply at 4-5; T-Mobile Comments at 9; Verizon Comments at 13; Letter from Brendan Haggerty, AT&T, to Marlene H. Dortch, Secretary, FCC, WC Docket Nos. 19-195 and 11-10 at 2 (filed Aug. 18, 2020) (asserting that drive testing even a small percentage of its network would cost it millions of dollars, but acknowledging "there is no indication of how an 'area' would be defined which makes it difficult to assess the feasibility of developing a sample."). However, compared to requiring providers to regularly drive test their networks or submit large amounts of infrastructure data in response to a verification request, we anticipate that our proposal to require providers to submit speed test results or infrastructure information on a case-by-case basis would minimize the time and resources associated with responding to the Commission's verification requests. See Verizon Comments at 14 ("Speed test data and infrastructure data should be used for case-by-case verification in small areas, when other verification methods have identified a potential issue."). The proposed stratification methodology would ensure that variation in broadband availability is as small as possible within hexagons in the same stratum. We anticipate this methodology would reduce the sample size (e.g., the number of test locations), the cost of data collection, and the variance of the estimate for the variable of interest (the percentage, P-hat, of positive tests indicating broadband availability), and, in turn, would increase the precision of the final estimate. We seek comment on this proposed methodology. 32. In addition, we seek comment on other variables which correlate with broadband availability and upon which stratification should be based. We also seek comment on the tradeoffs of setting a higher or lower confidence level for this verification process than the thresholds established for the challenge process. Under our proposed methodology, if the provider fails to verify its coverage data, the provider would be required to submit revised coverage maps that reflect the lack of coverage in targeted areas failing the verification. Third Order, 36 FCC Rcd at 1182, para. 145; 47 CFR § 1.7009(d).
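The variance-reduction rationale can be illustrated with the standard stratified estimator; in the Python sketch below, the stratum inputs are hypothetical and the finite-population correction is omitted for brevity.

```python
def stratified_phat(strata):
    """Stratified estimate of P-hat, the share of positive tests, and its
    variance. `strata` maps stratum name -> (N_h, n_h, p_h): total
    hexagons, sampled hexagons, and observed positive-test share. The
    finite-population correction is omitted for brevity."""
    N = sum(N_h for N_h, _, _ in strata.values())
    p_hat = sum((N_h / N) * p_h for N_h, _, p_h in strata.values())
    var = sum((N_h / N) ** 2 * p_h * (1 - p_h) / n_h
              for N_h, n_h, p_h in strata.values())
    return p_hat, var


# Example: when availability is relatively homogeneous within each stratum,
# the stratified estimate is more precise than one from pooled sampling.
print(stratified_phat({"rural": (500, 20, 0.80), "urban": (1500, 30, 0.95)}))
```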
Where a provider fails to verify its coverage and submits revised coverage data, we propose to re-evaluate the data submitted by the provider during the verification process against its revised coverage data for the targeted area. If the targeted area still cannot be successfully verified, we propose to require the provider to submit additional verification data or further revise its coverage maps until the targeted area is successfully verified. We seek comment on this proposal and invite commenters to propose alternative methodologies for generating a statistically valid sample of areas for which the mobile service provider must demonstrate sufficient coverage in response to a verification request. 33. Alternatively, we seek comment on the use of available spatial interpolation techniques, such as Kriging, Kriging is a spatial interpolation technique that predicts values at unknown points based on measured values at known points. ESRI and ArcGIS, How Kriging works, https://desktop.arcgis.com/en/arcmap/10.3/tools/3d-analyst-toolbox/how-kriging-works.htm (last visited June 22, 2020). Kriging determines the statistical correlation among measured values and fits a mathematical function to make a prediction at a location. Additionally, Kriging provides variance maps that measure the certainty or accuracy of prediction maps. Kriging is most appropriate when the measured values behave as a normal distribution. Id. that could be used to evaluate and verify the accuracy of coverage maps based on available measurements. Spatial interpolation techniques can be an alternative or complementary approach to specifying an exact testing threshold since spatial interpolation techniques require fewer data to compare with predictions using propagation models. Although spatial interpolation techniques can readily verify whether or not a hexagonal cell has coverage with speeds at or above the minimum values reported in the provider's submitted coverage data, the incremental benefit over testing thresholds may be minimal because spatial interpolation techniques provide better results as more data is collected. We seek comment on the costs and benefits of using spatial interpolation techniques either in addition to or as an alternative to the testing thresholds proposed above for verifying the accuracy of coverage maps. 2. On-the-Ground Test Data 34. To submit on-the-ground test data in response to a verification inquiry, we propose to require that mobile providers conduct on-the-ground tests consistent with the testing parameters and test metrics that we propose to require for provider-submitted test data in the challenge process. See supra Sections III.A.1, A.2.a. As described above, we propose to require verification data covering a statistically valid sample of areas for which the mobile service provider must demonstrate sufficient coverage in order to satisfy the verification request. To verify coverage with on-the-ground speed test data, we propose that the provider submit on-the-ground speed tests within a hexagonal area based upon the H3 geospatial indexing system at resolution 8. We would require that these tests meet a threshold percentage of positive tests (i.e., those recording download and upload speeds at or above the minimum speeds the provider reports in its BDC submission as available at the location where the test occurred). The tests would be evaluated to confirm, using a 95% statistical confidence interval, that the cell coverage percentage is 0.9 or higher. 
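One plausible implementation of this statistical check, sketched in Python using a Wilson score bound for the one-sided 95% confidence interval (our assumption for illustration; the Technical Appendix governs the actual procedure), is:

```python
import math


def verifies_coverage(positive, total, target=0.9, z=1.645):
    """Return True when the one-sided 95% lower confidence bound on the
    share of positive tests (Wilson score bound; z = 1.645) is at least
    the 0.9 target."""
    if total == 0:
        return False
    p = positive / total
    center = p + z ** 2 / (2 * total)
    margin = z * math.sqrt(p * (1 - p) / total + z ** 2 / (4 * total ** 2))
    lower_bound = (center - margin) / (1 + z ** 2 / total)
    return lower_bound >= target


# Example: 97 positive tests out of 100 clears the 0.9 target; 92 does not.
print(verifies_coverage(97, 100), verifies_coverage(92, 100))
```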
In addition, we propose to require that tests meet the same geographic, temporal, and testing thresholds as proposed for evaluating provider rebuttals to challenges. We envision that the specific thresholds and the confidence interval proposed would strike a balance between the costs to providers associated with verifying maps and the need for the Commission to acquire a sufficiently large sample to accurately verify mobile broadband availability. We seek input from commenters on the costs and benefits associated with these proposed threshold numbers and confidence intervals. 35. We propose that if the service provider is able to show sufficient coverage in the selected resolution 8 hexagon, the provider would have successfully demonstrated coverage to satisfy the verification request in that hexagon. We seek comment on this proposed methodology and invite commenters to propose alternative approaches that would allow staff to automatically adjudicate speed test data submitted during the verification process. Staff may consider other relevant data submitted by providers, may request additional information from the provider (including infrastructure data, if necessary), and may take other actions as necessary to ensure the reliability and accuracy of the verification process. We seek comment on these proposals. 3. Infrastructure Information 36. In the Third Order, the Commission found that infrastructure information can provide an important means for the Commission to fulfill its obligation to independently verify the accuracy of provider coverage propagation models and maps, and provided examples of the infrastructure information that mobile providers may be required to submit as part of a verification inquiry. Third Order, 36 FCC Rcd at 1147-48, para. 52. The Third Order listed the following examples of infrastructure information that mobile providers may be required to submit: (1) the latitude and longitude of cell sites; (2) the site ID number for each cell site; (3) the ground elevation above mean sea level (AMSL) of the site; (4) frequency band(s) used to provide service for each site being mapped, including channel bandwidth; (5) the radio technologies used on each band for each site (for example, 802.11ac-derived OFDM, proprietary OFDM, LTE Release 13, and NR Release 15); (6) the capacity and type of backhaul used at each cell site; (7) the number of sectors at each cell site; and (8) the Effective Isotropic Radiated Power (EIRP) of the sector at the time the mobile provider creates its map of the coverage data. Id. at 1147-48, para. 52. The Commission further concluded that collecting such data will enable the Commission to satisfy the Broadband DATA Act's requirement that the Commission verify the accuracy and reliability of submitted coverage data. Id. at 1148, para. 53; see also 47 U.S.C. § 642(b)(4)(B). 37. If a mobile service provider chooses to submit infrastructure data in response to a verification request, we propose to require the provider to submit such data for all cell sites and antennas that provide service to the targeted area. We propose that Commission staff then evaluate whether the provider has demonstrated sufficient coverage for each selected hexagon using standardized propagation modeling.
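For illustration only, the eight example fields listed in the Third Order might be organized as a record along the following lines (Python; all field names and types are hypothetical rather than a prescribed submission schema):

```python
from dataclasses import dataclass


@dataclass
class CellSiteRecord:
    """Illustrative record mirroring the eight example fields listed in
    the Third Order; field names and types are hypothetical, not a
    prescribed submission schema."""
    latitude: float                  # (1) cell site coordinates
    longitude: float
    site_id: str                     # (2) site ID number
    ground_elevation_amsl_m: float   # (3) ground elevation AMSL
    frequency_bands: list[str]       # (4) bands used, with channel bandwidth
    radio_technologies: list[str]    # (5) e.g., "LTE Release 13", "NR Release 15"
    backhaul_type: str               # (6) backhaul type and capacity
    backhaul_capacity_mbps: float
    sector_count: int                # (7) number of sectors
    sector_eirp_dbm: list[float]     # (8) EIRP per sector at map creation
```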
Under this approach, staff engineers would generate their own predicted coverage maps using the data submitted by the provider (including link budget parameters, cell-site infrastructure data, and the information provided by service providers about the types of propagation models they used). Using these staff-generated maps, we would evaluate whether each selected hexagon has predicted coverage with speeds at or above the minimum values reported in the provider's submitted coverage data. In generating our own coverage maps, we propose to use certain standard sets of clutter and terrain data. We seek comment on this proposal and seek comment generally on other ways that infrastructure data could be used to evaluate the sufficiency of coverage in our proposed verification process. Staff may also consider other relevant data submitted by providers during the verification process, may request additional information from the provider (including on-the-ground speed test data, if necessary), and may take steps to ensure the accuracy of the verification process. We seek comment on these proposals. 38. Alternatively, we could use the submitted infrastructure and link budget data, along with available crowdsourced data, to perform initial verification of the claimed coverage within the selected hexagons using standard propagation models as well as appropriate terrain and clutter data. See Second Order and Third Further Notice, 35 FCC Rcd at 7502-03, paras. 101-02; Third Order, 36 FCC Rcd at 1147-48, para. 52. We could evaluate the provider's link budgets and infrastructure data for accuracy against other available data, such as Antenna Structure Registration and spectrum licensing data. Under this approach, if our projection of speeds, along with the available crowdsourced data at the challenged locations, does not predict speeds at or above the minimum values reported in the provider's submitted coverage data, we propose that Commission staff would consider any additional information submitted by the provider or request additional information from the provider. Such information would include on-the-ground speed test data, and staff would use this information to complete its verification of the targeted area. The Commission could also leverage spatial interpolation techniques to evaluate and verify the accuracy of coverage maps based on available crowdsourced and on-the-ground data. See supra Section III.B.1. We seek comment on this approach and other ways that infrastructure data could be used to verify a provider's coverage in the targeted area. 39. Consistent with the authority the Commission delegated to OEA and WTB in the Third Order to "adopt the methodologies, data specifications, and formatting requirements" that providers must follow when collecting and reporting mobile infrastructure data, and to help ensure that infrastructure information submissions are useful, we seek comment on adding additional input fields to the list of infrastructure information providers should include when responding to a verification request. Third Order, 36 FCC Rcd at 1146, para. 48 (directing OEA and WTB to implement this data collection and adopt the methodologies, data specifications, and formatting requirements). Providers that use carrier aggregation would also be required to disclose the percentage of the handset population capable of using each band combination. In addition to the types of infrastructure information listed as examples in the Third Order, See id. at 1147-48, para. 52.
we propose that providers submit the following additional parameters and fields: (1) geographic coordinates of each transmitter; (2) per-site classification (e.g., urban, suburban, or rural); (3) elevation above ground level for each base station antenna and other transmit antenna specifications, including the make and model, beamwidth, and orientation (i.e., azimuth and any electrical and/or mechanical down-tilt) at each cell site; (4) operating transmit power of the radio equipment at each cell site; (5) throughput and the associated required signal strength and signal-to-noise ratio; (6) cell loading distribution; (7) areas enabled with carrier aggregation and a list of band combinations (including the percentage of the handset population capable of using each band combination); and (8) all other metrics required per the most-recent specification for infrastructure data released by OEA and WTB. We anticipate that Commission engineers will need all of this infrastructure information as inputs to generate their own predicted coverage maps. While we recognize that several commenters recommended limiting the scope of infrastructure data in response to the Second Order and Third Further Notice, See, e.g., Letter from Matthew Gerst, Vice President, Regulatory Affairs, CTIA, to Marlene H. Dortch, Secretary, FCC, WC Docket Nos. 19-195 and 11-10, at 5 (filed July 6, 2020); AT&T Comments at 9-10; Letter from Mary L. Henze, Assistant Vice President, Federal Regulatory, AT&T, to Marlene H. Dortch, Secretary, FCC, WC Docket Nos. 19-195 and 11-10, at 1-2 (filed July 8, 2020). we anticipate that collecting additional infrastructure data based on the data specifications listed above will be necessary in order for such data to be useful in verifying providers' biannual data submissions. We seek comment on these proposals and tentative conclusions. 4. Additional Data 40. Mobile service providers may supplement their submission of infrastructure information or on-the-ground test data required by a verification inquiry with "other types of data that the provider believes support its coverage." Third Order, 36 FCC Rcd at 1146-47, para. 50. In addition, OEA and WTB may require the submission of additional data when necessary to complete a verification inquiry. Id. at 1147, para. 50. We seek comment on what types of other data, besides infrastructure information and on-the-ground test data, will be useful in verifying mobile service providers' coverage data and whether such data should be submitted in a specific format. 41. For example, in the Third Order, the Commission stated that it will allow mobile broadband service providers to supplement their submission of either infrastructure information or on-the-ground test data with additional data that the provider believes support its coverage, such as data collected from its transmitter monitoring systems and software. Id. at 1146, para. 47; see 47 CFR § 1.7006(c). The Commission found that such data currently have not been shown to be a sufficient substitute for either on-the-ground testing or infrastructure data in response to a verification investigation. Third Order, 36 FCC Rcd at 1146, para. 47 & n.157; see 47 CFR § 1.7006(c). However, the Commission directed OEA and WTB to accept and review transmitter data to the extent they are voluntarily submitted by providers in response to verification requests from staff. Third Order, 36 FCC Rcd at 1146, para. 47 & n.157.
These data could be especially helpful to the extent that they support potential reasons for service disruptions during the time interval in which measurements were performed or describe remedial improvements to network quality. To that end, the Commission delegated authority to OEA and WTB to specify appropriate standards and specifications for such data and add them to the alternatives available to providers to respond to verification requests if staff concludes that such methods are sufficiently reliable. See supra Section III.A.2.c (seeking comment on the use of transmitter monitoring data); Third Order, 36 FCC Rcd at 1146, para. 47 & n.157. We note that the Commission specifically recognized that OEA and WTB's analysis of transmitter data "may lead it to expand the options available to providers for responses with respect to verification investigations but not do so for other purposes, including responses to consumer challenges and/or governmental or other entity challenges." Id. 42. In the absence of any experience with this process, it is premature to propose specifications and standards to receive voluntary data collected from a provider's transmitter monitoring systems and software. However, mobile service providers may submit transmitter data in addition to the infrastructure or on-the-ground data they submit in response to a verification investigation. We propose that OEA and WTB analyze transmitter data submitted by mobile service providers to determine whether such data accurately depict coverage by a mobile service provider. We seek comment on this proposal. C. Collecting Verified Broadband Data from Governmental Entities and Third Parties 43. The Broadband DATA Act requires the Commission to develop a process through which it can collect verified data for use in the coverage maps from: (1) state, local, and Tribal government entities primarily responsible for mapping or tracking broadband internet access service coverage in their areas; (2) third parties, if the Commission determines it is in the public interest to use their data in the development of the coverage maps or in the verification of data submitted by providers; and (3) other federal agencies. 47 U.S.C. § 642(a)(2). In the Third Order, the Commission directed OEA to collect verified mobile on-the-ground data from governmental entities and third parties through a process similar to that established for providers making their semiannual Broadband Data Collection filings. Third Order, 36 FCC Rcd at 1153-54, para. 68; see 47 CFR § 1.7008(d)(2). Governmental and third-party entities must include a description of the methodologies, specifications, and other relevant details that the Commission should consider in reviewing these verified mobile data. Third Order, 36 FCC Rcd at 1153-54, para. 68. They also must certify that the information they submit is true and accurate to the best of their actual knowledge, information, and belief. Id. 44. In accordance with the Commission's direction in the Third Order and to ensure the Commission receives verified and reliable data, we propose that governmental entities and third parties should submit on-the-ground test data using the same metrics and testing parameters as we propose above for mobile providers to use in submitting on-the-ground test data. Id. at 1152-54, paras.
63-64 & n.202, 68; see also Verizon Comments at 16 (asserting that "allowing only third parties that employ sound and reliable data collection and verification methodologies to contribute data to the Fabric can also improve the overall accuracy of the underlying data"); VTDPS Comments at 4-5 (arguing that the Commission should adopt a uniform testing methodology for verifying coverage and conducting the challenge process that reflects typical end-user experience). While the Massachusetts Department of Telecommunications and Cable asks the Commission to adopt a "minimum standard" and avoid "strict submission methodology guidelines" on data submissions by states and other third parties, we do not propose standards that are lower than, or that differ from, those we propose for mobile providers. Massachusetts Department of Telecommunications and Cable Comments at 3-4. As discussed, these data can be used to verify service providers' coverage maps, similar to the data submitted by mobile providers. We therefore anticipate that adopting consistent, standardized procedures for governmental entities and third parties to submit on-the-ground data will be both appropriate and necessary to ensure the broadband availability maps are as accurate and precise as possible. 45. We also propose that, to the extent the Commission has verified on-the-ground data submitted by governmental entities and third parties, such data may be used when the Commission conducts analyses as part of the verification processes and would be treated as crowdsourced data. The Third Order states that a provider does not need to include government or third party-submitted data as part of its broadband data submissions if it does not agree with the data submitted by the government entity or third party. Third Order, 36 FCC Rcd at 1153, para. 66. In such cases, if a government entity or third party does not agree with the provider's treatment of the data, it has the option of filing the data as part of a challenge to the provider's availability data via the challenge portal, and such challenges will be addressed through the respective fixed and mobile challenge process procedures. Id. To the extent the Commission has already verified a government or third party's data, there will be no need to submit new data for challenge purposes. See id. Governmental entities and third parties may also choose to use these data to submit a challenge, provided the submission meets the requirements for a challenge under the Commission's rules. We invite comment on both of these proposals and also on whether stakeholders would benefit from additional guidance regarding when the Commission will consider data from government entities and third parties. See id. at 1151-52, para. 62. D. Probative Value 46. The Commission directed OEA and WTB to provide guidance on the types of data that will likely be more probative in validating broadband availability data submitted by mobile service providers in different circumstances. Id. at 1146, para. 48. We believe that on-the-ground test data, which reflect actual tests rather than predictive modeling and other techniques, will generally reflect user experience more accurately and thus be more probative than infrastructure data or other sources of information in most, but not all, circumstances.
We recognize that on-the-ground test data can be more costly to obtain and may not be necessary in every instance, and therefore describe below at least four circumstances where we tentatively conclude that infrastructure information will likely be of probative value comparable to on-the-ground data. We seek comment on these conclusions and whether there are any other circumstances where we can draw such a conclusion. We further seek comment on the probative value of potentially less burdensome testing techniques using aerial drones or other technologies for collecting test data. Second Order and Third Further Notice, 35 FCC Rcd at 7509, para. 121. 47. First, we propose to find that infrastructure information will be of comparable probative value when extenuating circumstances at the time and location of a given test (e.g., maintenance or temporary outage at the cell site) caused service to be abnormal. In such cases, we propose that providers submit coverage or footprint data for the site or sectors that were affected and information about the outage, such as bands affected, duration, and whether the outage was reported to the Network Outage Reporting System (NORS), along with a certification about the submission's accuracy. We would then remove measurements in the reported footprint in the relevant band(s) made during the outage and, as appropriate, recalculate the statistics. 48. Second, we propose to find that infrastructure or other information will be of comparable probative value when measurements that led to the verification request or challenge rely on devices that lack a band that the provider uses to make coverage available in the area in question. In such cases, we propose that providers submit band-specific coverage footprints and information about which specific device(s) lack the band. We would then remove measurements from the listed devices in the relevant footprint and recalculate the statistics. 49. Third, we propose to find that infrastructure information will be of comparable probative value when speed tests were taken during an uncommon special event (e.g., a professional sporting event) that increased traffic on the network. We recognize that mobile service providers would not have the same throughput they would in normal circumstances given the high volume of traffic on networks during these types of events, so demonstrating the existence of coverage in the area by submitting infrastructure information would be persuasive in explaining why speed tests were negative in such a scenario. 50. Fourth, we propose to find that infrastructure information will be of comparable probative value when challenger speed tests were taken during a period when cell loading exceeded the modeled cell loading factor. We recognize that speed tests taken during a period when cell loading is higher than usual can produce negative results. However, as discussed, we anticipate infrastructure information will be useful to rebut challenges in this situation, but if a high number of challenges show persistent over-loading, we propose that staff may initiate a verification inquiry to investigate whether mobile providers have submitted coverage maps based on an accurate assumption of cell loading in a particular area, and mobile providers should respond to such a verification request with on-the-ground data in order to assess the experience of users in that area. E. Crowdsourced Data 51. The Broadband DATA Act requires the Commission to "develop a process through which entities or individuals . . .
may submit specific information about the deployment and availability of broadband internet access service . . . on an ongoing basis . . . to verify and supplement information provided by providers." 47 U.S.C. § 644(b). In the Second Order, the Commission adopted a crowdsourcing process to allow individuals and entities to submit such information. Second Order and Third Further Notice, 35 FCC Rcd at 7487, para. 64. 52. The Commission instructed OET, OEA, WTB, and the Wireline Competition Bureau (WCB) to develop a process to prioritize the consideration of crowdsourced data submitted through data collection apps used by consumers and other entities that are determined to be "highly reliable" and that "have proven methodologies for determining network coverage and network performance." Id. at 7488, para. 66 (quoting 47 U.S.C. § 644(b)(2)(A)). The Commission further directed OET, OEA, WCB, and WTB to consider "(1) whether the application uses metrics and methods that comply with current Bureau and Office requirements for submitting network coverage and speed data in the ordinary course; (2) whether the speed application has enough users that it produces a dataset to provide statistically significant results for a particular provider in a given area; and (3) whether the application is designed so as not to introduce bias into test results." Id. at 7488, para. 66. We propose to find that the Commission's speed test app is a reliable and efficient method for entities to use in submitting crowdsourced mobile coverage data to the Commission. The Commission's speed test app allows users to submit specific information about the deployment and availability of mobile broadband service and meets the requirements outlined in the Commission's Second Order. Id. To the extent that OET, in consultation with OEA and WTB, determines that other apps used by consumers or other entities are "highly reliable" and "have proven methodologies for determining mobile broadband network coverage and network performance," See Third Order, 36 FCC Rcd at 1167, para. 104 (delegating authority to OET to approve additional third-party speed test apps for the challenge process). we propose to allow consumers and other entities to use such an app to submit crowdsourced information. We also propose to consider as crowdsourced information speed tests taken with an authorized app that do not meet the criteria needed to create a cognizable challenge or are otherwise not intended to be used to challenge the accuracy of a mobile service provider's map. 53. To the extent consumers and governmental or other entities choose to submit on-the-ground crowdsourced mobile speed test data in the online portal, we propose that such data be collected using a measurement methodology similar to that of the Commission's speed test app and submitted in a format similar to that which we propose for challengers and providers to use when submitting speed tests. See supra Sections III.A.1, A.2.a. However, because crowdsourced data will not automatically require a response from a provider, Second Order, 35 FCC Rcd at 7490, para. 71. and Commission staff will use crowdsourced data to identify individual instances or patterns of potentially inaccurate or incomplete deployment or availability data that warrant further review and will only initiate an inquiry when a "critical mass of" crowdsourced filings suggests that a provider has submitted inaccurate or incomplete data, we propose to make some speed test metrics optional. Id. at 7490-91, paras. 72, 74.
For example, we propose to allow entities submitting crowdsourced data to submit tests that include any combination of the download speed, upload speed, or round-trip latency test metrics, rather than requiring all three as with challenge data. We seek comment on our proposal. Should the Bureau and Offices adopt a more or less stringent standard for consumers and other entities to submit crowdsourced data? If so, what metrics and methods should consumers and other entities be required to meet when submitting crowdsourced data? How should the Bureau and Offices ensure that a speed app has enough users to provide statistically significant results for a mobile provider in a specific geographic area? How should the Bureau and Offices ensure apps do not introduce bias into test results? 54. In the Third Order, the Commission directed OET, in consultation with OEA and WTB, to update the FCC Speed Test app as necessary or develop a new speed test app to collect the metrics and include the requisite functionalities so that challengers may use it in the challenge process. Third Order, 36 FCC Rcd at 1167, para. 104. The Commission also directed OET to approve additional third-party speed test apps that collect all necessary data and include these required functionalities for use in the challenge process. Id. We propose that OET issue a public notice inviting proposals for designation of third-party speed test data collection apps as acceptable for use in submitting crowdsourced and challenge data. In submitting proposals, parties would be required to include information indicating how the app complies with the crowdsourced data collection and challenge data collection requirements set forth in applicable Commission orders. Second Order and Third Further Notice, 35 FCC Rcd at 7488, para. 66; Third Order, 36 FCC Rcd at 1166-68, paras. 103-06. OET would provide an opportunity for comments and replies regarding the proposals. OET would then review all of the proposals, comments, and replies, and evaluate the functionalities before designating apps as acceptable for use in submitting crowdsourced and challenge data. We also propose that OET conduct periodic reviews of designated third-party apps and offer guidance to ensure continued compliance with all technical and program requirements. We seek comment on our proposed process. 55. The Commission found it appropriate to establish and use an online portal for crowdsourced data filings and to use the same portal for challenge filings. Second Order and Third Further Notice, 35 FCC Rcd at 7488-89, para. 68; 47 CFR § 1.7006(b). In adopting this approach, the Commission directed the Bureaus and Offices to implement the crowdsourced data collection and create a portal for the receipt of crowdsourced data. Second Order and Third Further Notice, 35 FCC Rcd at 7488-89, para. 68. The Commission also directed OET, OEA, WCB, and WTB to "issue specific rules by which [the Commission] will prioritize the consideration of crowdsourced data in advance of the time that the online portal is available." Id. at 7488, para. 66. We seek comment on ways to implement this directive. Specifically, we ask commenters to recommend methodologies for submitting mobile crowdsourced data prior to the creation of the online portal that are efficient for consumers and other entities, protect consumers' privacy, and are feasible for the Bureaus and Offices to implement.
For example, data that consumers and other entities submit without following any specific metrics or methodologies may be less useful for effective Commission analysis and review of providers' mobile broadband availability. Therefore, we propose to require consumers and other entities to submit any preliminary crowdsourced data using the same metrics that providers would use when submitting on-the-ground data in response to a Commission verification request. Do commenters agree? 56. As discussed in the Second Order, the Commission declined to establish specific thresholds to use when deciding whether to evaluate providers' filings where crowdsourced data suggest potential inaccuracies. Id. at 7491, para. 74. Instead, the Commission found that staff should initiate inquiries when a "critical mass of" crowdsourced filings suggests that a provider has submitted inaccurate or incomplete information. Id. at 7491, para. 74 & n.211. The Commission directed OET, OEA, WCB, and WTB to provide guidance to providers on when inquiries based on crowdsourced filings could be initiated. Id. at 7491, para. 74. Commenters generally agreed that the crowdsourcing process could be used to highlight problems with the coverage maps' accuracy and trigger further review by the Commission. Verizon Comments at 14; NTCA Comments at 11; NRECA Comments at 5; see also ACA Connects Comments, WC Docket Nos. 19-195 and 11-10, at 12-14 (rec. Sept. 23, 2019); NCTA Comments, WC Docket Nos. 19-195 and 11-10, at 15-16 (rec. Sept. 24, 2019); Verizon Comments, WC Docket Nos. 19-195 and 11-10, at 6 (rec. Sept. 23, 2019); CTIA Reply, WC Docket Nos. 19-195 and 11-10, at 6 (rec. Oct. 7, 2019). We propose to evaluate mobile crowdsourced data through an automated process to identify potential areas that would trigger further review, using a methodology similar to the mobile verification process proposed above, with certain simplifications. Second Order and Third Further Notice, 35 FCC Rcd at 7491, para. 74. We propose that the outcome of this methodology may provide staff with a credible basis for verifying a provider's coverage data. See Third Order, 36 FCC Rcd at 1146, para. 47. We therefore propose that areas identified from crowdsourced data using this methodology would be subject to a verification inquiry consistent with the proposed mobile verification process. Second Order and Third Further Notice, 35 FCC Rcd at 7491, para. 74. We seek comment on this proposed framework for evaluating crowdsourced data. 57. More specifically, the methodology we propose would first exclude any anomalous or otherwise unusable tests submitted as crowdsourced data, and we seek comment generally on how to identify such tests. From the remaining crowdsourced tests, we propose to use data clustering to identify potential targeted areas where crowdsourced tests indicate a provider's coverage map is inaccurate. We seek comment on our proposal and on any alternative methods for determining when a "critical mass" of crowdsourced filings suggests a provider has submitted inaccurate or incomplete information. 58. In the Second Order, the Commission determined that all information submitted as part of the crowdsourcing process will be made public, with the exception of personally identifiable information and any data required to be confidential under section 0.457 of our rules, Id. at 7492, para. 76 (citing 47 CFR § 0.457); see 47 CFR § 1.7006(b)(4).
and directed OEA to make crowdsourced data publicly available as soon as practicable after submission and to establish an appropriate method for doing so. Second Order and Third Further Notice, 35 FCC Rcd at 7492, para. 76. Accordingly, we propose to make all crowdsourced data available via the Commission's public-facing website. The published data will include coverage data and other associated information but will not include any personally identifiable information. We propose to update the public crowdsourced data biannually. We seek comment on our proposals and on any alternative methods for making crowdsourced data available to the public. We also seek comment on ways to ensure personally identifiable and other sensitive information is kept secure and private. 59. Finally, the Commission directed OET, OEA, WCB, and WTB to modify the process for the collection of fixed and mobile crowdsourced data over time as determined to be necessary by the Bureaus and Offices. Id. at 7488, para. 67. The Bureaus and Offices seek comment on the proposals herein and will modify the process for collecting mobile crowdsourced data in the future as necessary. IV. PROCEDURAL MATTERS A. Supplemental Initial Regulatory Flexibility Analysis 60. Supplemental Initial Regulatory Flexibility Analysis. As required by the Regulatory Flexibility Act of 1980, as amended (RFA), 5 U.S.C. § 603. The RFA, 5 U.S.C. §§ 601-612, has been amended by the Small Business Regulatory Enforcement Fairness Act of 1996, Pub. L. No. 104-121, 110 Stat. 857 (1996). we have prepared this Supplemental Initial Regulatory Flexibility Analysis (Supplemental IRFA) of the possible significant economic impact on a substantial number of small entities of the proposed rules and policies contained in this Public Notice, to supplement the Commission's Initial and Final Regulatory Flexibility Analyses completed in the Digital Opportunity Data Collection Report and Order and Further Notice of Proposed Rulemaking, Second Order and Third Further Notice, and Third Order. Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195 and 11-10, Report and Order and Second Further Notice of Proposed Rulemaking, 34 FCC Rcd 7505, 7566-86, 7587-608, Appendices B and C (2019); Second Order and Third Further Notice, 35 FCC Rcd at 7542-60, 7561-81, Appendices C, D; Third Order, 36 FCC Rcd at 1199-222, Appx. B. Written public comments are requested on this Supplemental IRFA. Comments must be identified as responses to the Supplemental IRFA and must be filed by the same deadline for comments specified on the first page of this Public Notice. The Commission will send a copy of this Public Notice, including this Supplemental IRFA, to the Chief Counsel for Advocacy of the Small Business Administration (SBA). See 5 U.S.C. § 603(a). In addition, this Public Notice and Supplemental IRFA (or summaries thereof) will be published in the Federal Register. See id. 61. Need for, and Objectives of, the Proposed Rules. In this Public Notice, WTB, OEA, and OET take the next step to obtain better coverage data and implement the requirements of the Broadband DATA Act, which tasks the Commission with collecting granular data from providers on the availability and quality of broadband internet access service and with verifying the accuracy and reliability of the broadband coverage data submitted by providers.
Following the December 27, 2020, Congressional appropriation of funding for the implementation of the Broadband DATA Act, the Commission began to implement challenge, verification, and crowdsourcing processes involving broadband data coverage submissions. 62. The Commission has delegated to its staff the responsibility to develop technical requirements for verifying service providers' coverage data, a challenge process that will enable consumers and other third parties to dispute service providers' coverage data, and a process for third parties and other entities to submit crowdsourced data on mobile broadband availability. These measures will help the Commission, Congress, federal and state policy makers, and consumers to evaluate the status of broadband deployment throughout the United States. The Public Notice proposes and seeks comment on technical requirements to implement the mobile challenge, verification, and crowdsourcing processes required by the Broadband DATA Act, such as metrics for on-the-ground test data and a methodology for determining the threshold for what constitutes a cognizable challenge requiring a provider response. It also provides initial guidance and seeks comment on what types of data will likely be more probative in different circumstances. The Bureau and Offices propose detailed processes and metrics for providers to follow when responding to a Commission verification request, for government entities and other third parties to follow when submitting verified broadband coverage data, and for challengers to follow when contesting providers' broadband coverage availability. We believe this level of detail is necessary to provide providers, consumers, and other third parties with robust opportunities to comment, provide input, and help formulate the processes and procedures that will enable better evaluation of the status of broadband deployment throughout the United States. 63. Legal Basis. The proposed action is authorized pursuant to sections 1-5, 201-206, 214, 218-220, 251, 252, 254, 256, 303(r), 332, 403, and 641-646 of the Communications Act of 1934, as amended, 47 U.S.C. §§ 151-155, 201-206, 214, 218-220, 251, 252, 254, 256, 303(r), 332, 403, 641-646. 64. Description and Estimate of the Number of Small Entities to Which the Proposed Rules Will Apply. The RFA directs agencies to provide a description of, and, where feasible, an estimate of the number of small entities that may be affected by the proposed rules and policies, if adopted. 5 U.S.C. § 603(b)(3). The RFA generally defines the term "small entity" as having the same meaning as the terms "small business," "small organization," and "small governmental jurisdiction." Id. § 601(6). In addition, the term "small business" has the same meaning as the term "small business concern" under the Small Business Act. Id. § 601(3) (incorporating by reference the definition of "small business concern" in the Small Business Act, 15 U.S.C. § 632). Pursuant to 5 U.S.C. § 601(3), the statutory definition of a small business applies "unless an agency, after consultation with the Office of Advocacy of the Small Business Administration and after opportunity for public comment, establishes one or more definitions of such term which are appropriate to the activities of the agency and publishes such definition(s) in the Federal Register." A "small business concern" is one which: (1) is independently owned and operated; (2) is not dominant in its field of operation; and (3) satisfies any additional criteria established by the SBA.
15 U.S.C. § 632. 65. As noted above, Regulatory Flexibility Analyses were incorporated into the Digital Opportunity Data Collection Report and Order and Further Notice of Proposed Rulemaking, Second Order and Third Further Notice, and Third Order. In those analyses, we described in detail the small entities that might be affected. In this Public Notice, for the Supplemental IRFA, we hereby incorporate by reference the descriptions and estimates of the number of small entities from the previous Regulatory Flexibility Analyses in the Digital Opportunity Data Collection Report and Order and Further Notice of Proposed Rulemaking, Second Order and Third Further Notice, and Third Order. First Order and Second Further Notice, 34 FCC Rcd at 7567-84, 7587-7605, Appx. B at paras. 8-53 and Appx. C at paras. 4-49; Second Order and Third Further Notice, 35 FCC Rcd at 7542-59, Appx. C at paras. 6-51; Third Order, 36 FCC Rcd at 1200-19, Appx. B at paras. 8-54. 66. Description of Projected Reporting, Recordkeeping, and Other Compliance Requirements for Small Entities. The granular data collection for the challenge and verification processes proposed in the Public Notice would, if adopted, impose some new reporting, recordkeeping, or other compliance requirements on some small entities. Specifically, we propose that mobile providers of broadband internet access service submit coverage data in the form of on-the-ground test data or infrastructure information on a case-by-case basis in response to a Commission request to verify mobile broadband providers' biannual BDC data submissions. Additionally, we propose a methodology for state, local, and Tribal government entities and third parties to follow when submitting verified mobile on-the-ground data to the Commission for use in the coverage maps. We also propose a methodology for mobile broadband providers to follow when responding to or rebutting consumer challenges of broadband availability. We also seek comment on other types of data that will likely have more probative value when used to either verify coverage maps or respond to a consumer challenge. Finally, we propose details and seek comment on how third parties and other entities may submit crowdsourced data and how this information may be put to best use. If adopted, any of these requirements could impose additional reporting, recordkeeping, or other compliance obligations on small entities. 67. The challenge and verification process proposals and issues raised for consideration and comment in the Public Notice may require small entities to hire attorneys, engineers, consultants, or other professionals. At this time, however, the Commission cannot quantify the cost of compliance with any potential rule changes and compliance obligations for small entities that may result from the Public Notice. We expect that our requests for information on the potential burdens, costs, cost-minimization approaches, and alternatives associated with matters raised in the Public Notice will provide us with information to assist in evaluating the cost, for small entities, of complying with any reporting, recordkeeping, or other compliance requirements we adopt. 68. Steps Taken to Minimize the Significant Economic Impact on Small Entities and Significant Alternatives Considered.
The RFA requires an agency to describe any significant alternatives, specifically alternatives applicable to small businesses, that it has considered in reaching its proposed approach, which may include the following four alternatives (among others): “(1) the establishment of differing compliance or reporting requirements or timetables that take into account the resources available to small entities; (2) the clarification, consolidation, or simplification of compliance and reporting requirements under the rule for such small entities; (3) the use of performance rather than design standards; and (4) an exemption from coverage of the rule, or any part thereof, for such small entities.” 5 U.S.C. § 603(c)(1)-(4). 69. We anticipate the proposals set forth in the Public Notice will balance the need for the Commission to generate more precise and granular mobile broadband availability maps with any associated costs and burdens on mobile broadband providers. In implementing the requirements of the Broadband DATA Act in orders preceding this Public Notice, the Commission sought comment on the burdens associated with the potential requirements discussed for collecting broadband internet access service data and on how such burdens can be minimized for small entities. For example, in the Second Order and Third Further Notice, the Commission sought comment on the potential burdens on small providers associated with: (1) requiring providers to submit on-the-ground data to validate mobile broadband coverage; and (2) encouraging small providers to participate in the challenge process. Second Order and Third Further Notice, 35 FCC Rcd at 7505, 7513, 7519-20, paras. 108, 136, 154, 158. In part, the comments received in response to the Second Order and Third Further Notice helped shape the proposals, approaches, and steps taken in this Public Notice. 70. Consistent with the Commission’s recognition in the Third Order that providers should not be subject to the undue cost of responding to a large number of challenges to very small areas, for the mobile service challenge process, we have proposed in this Public Notice to jointly evaluate speed tests submitted by consumers and governmental and third-party challengers. We have also proposed data specifications that all submitted challenger speed test data must meet. After combining consumer speed tests and governmental and third-party speed tests, we propose to validate each speed test and exclude tests that do not present reliable evidence. Under our proposed approach, we would combine such speed test evidence and apply a single methodology to determine whether the threshold for a cognizable challenge has been met and to establish the boundaries of the challenged area. After determining the full set of combined, valid challenger speed tests, we would then associate each speed test with the proposed standardized geographical area discussed in the Public Notice. For each area that includes valid challenger speed tests, we would then evaluate whether several thresholds have been met in order to determine whether the challenger evidence demonstrates a cognizable challenge requiring a provider response. Adopting a process to determine whether there is a cognizable challenge to which a provider is required to respond, rather than requiring a provider to respond to any and all submitted challenges, will minimize the economic impact on small providers to the extent they are subject to challenges. 71.
The proposed mobile service challenge process metrics for mobile providers to follow when responding to a Commission verification request seek to balance the need for the Commission to establish valuable methods for verifying coverage data with the need to reduce the costs and burdens associated with requiring mobile providers to submit on-the-ground test data and infrastructure information. For example, in order to ensure the challenge process is user-friendly for challengers and workable for mobile providers to respond to and rebut challenges, we have proposed that challenged mobile service providers who choose to submit on-the-ground speed test data will be held to the same standard as the challengers to demonstrate that the challenged areas have sufficient coverage. Providers would be required to submit on-the-ground data consistent with the metrics we propose for verifying coverage with on-the-ground data and meet the same three threshold tests as the challengers. We considered but declined to propose defining the challenge area based on the test data submitted by the challengers, based on our belief that our proposal is both user-friendly and supported by sufficient data, while also targeting a more precise geographic area where broadband coverage is disputed and limiting the burden on providers in responding to challenges. The Public Notice seeks comment on the specifics of our proposed methodology and invites commenters to propose alternative approaches that would allow staff to automatically adjudicate most challenges. 72. Our proposals for the collection of verification information recognize that some types of test data, such as on-the-ground test data, can be more costly for small entities and others to obtain. We have therefore proposed to identify the portion of a provider's coverage map (target area) for which we would require verification data based upon all available evidence, including submitted speed test data, infrastructure data, crowdsourced and other third-party data, as well as staff evaluation and knowledge of submitted coverage data (including maps, link budget parameters, and other credible information). Using all available evidence will enable providers to choose options in line with their specific economic situations. Further, to minimize the cost and burden placed on service providers, while ensuring Commission staff have access to sufficient data to demonstrate coverage, we have proposed to use sampling of the target area. Mobile service providers would be required to provide verification data that covers a statistically valid sampling of areas for which sufficient coverage must be demonstrated to satisfy the verification request. The sample would also be required to meet the same thresholds for adequate coverage as defined in the challenge process, using either infrastructure data or on-the-ground speed tests, for the targeted area to be successfully verified. The proposed use of a sampling plan to demonstrate broadband availability will allow small and other providers to avoid submission of considerably more data and the associated costs. 73. In crafting the challenge and verification process proposals in the Public Notice, the Bureau and Offices also considered the appropriate verification data requirements for government entities and third parties and the probative value of other types of data.
To ensure consistency, reliability, comparability, and verifiability of the data the Commission receives, we declined to propose different or lower standards than those that would be applicable to providers. Requiring government entities and third parties to submit on-the-ground test data using the same metrics and testing parameters proposed for mobile providers will ensure that the Commission implements a standardized process resulting in broadband availability maps that are as accurate and precise as possible. Our consideration of appropriate verification data sources weighed the usefulness and costs of on-the-ground test data, which can be more costly to obtain and may not be needed in every situation, against the use of infrastructure information. Based on our analysis, we propose to find that infrastructure information will likely be of comparable probative value to on-the-ground test data in situations when cell sites or sectors had a temporary malfunction during measurements, when measurements that led to a verification request or challenge rely on devices that lack a band that the provider uses to make coverage available in the area in question, when speed tests were taken during an uncommon special event (e.g., a professional sporting event) that increased traffic on the network, or when challenger speed tests were taken during a period where cell loading exceeded the modeled cell loading factor. The Public Notice seeks comment on this proposal, on whether there are any other circumstances where infrastructure data will be of greater, equal, or comparable probative value to on-the-ground data, and on whether there are other types of data that will be probative in other circumstances. 74. To assist in the further evaluation of the economic impact on small entities of proposals in this Public Notice, and to identify any additional options and alternatives for such entities that the Commission can pursue while also achieving its objectives of improving the accuracy and reliability of its data collections, the Bureau and Offices have sought comment on these matters. Before reaching any final conclusions and taking final action in this proceeding, the Bureau and Offices expect to review the comments filed in response to the Public Notice and more fully consider the economic impact on small entities and how any impact can be minimized. 75. Federal Rules that May Duplicate, Overlap, or Conflict with the Proposed Rules. None.
B. Deadlines and Filing Procedures
76. Filing Requirements. Pursuant to sections 1.415 and 1.419 of the Commission's rules, 47 CFR §§ 1.415, 1.419, interested parties may file comments and reply comments on or before the dates indicated above and must reference WC Docket No. 19-195. Comments may be filed using the Commission’s Electronic Comment Filing System (ECFS) or by filing paper copies. See Electronic Filing of Documents in Rulemaking Proceedings, GC Docket No. 97-113, Report and Order, 13 FCC Rcd 11322 (1998); 63 FR 24121 (1998).
· Electronic Filers: Comments may be filed electronically using the Internet by accessing the ECFS: https://www.fcc.gov/ecfs.
· Paper Filers: Parties who choose to file by paper must file an original and one copy of each filing.
· Filings can be sent by commercial overnight courier, or by first-class or overnight U.S. Postal Service mail. All filings must be addressed to the Commission’s Secretary, Office of the Secretary, Federal Communications Commission.
o Commercial overnight mail (other than U.S.
Postal Service Express Mail and Priority Mail) must be sent to 9050 Junction Drive, Annapolis Junction, MD 20701. U.S. Postal Service first-class, Express, and Priority mail must be addressed to 45 L Street NE, Washington, DC 20554.
o Effective March 19, 2020, and until further notice, the Commission no longer accepts any hand or messenger delivered filings. This is a temporary measure taken to help protect the health and safety of individuals, and to mitigate the transmission of COVID-19. See FCC Announces Closure of FCC Headquarters Open Window and Change in Hand-Delivery Policy, Public Notice, 35 FCC Rcd 2788 (OMD 2020), https://www.fcc.gov/document/fcc-closes-headquarters-open-window-and-changes-hand-delivery-policy.
77. People with Disabilities: To request materials in accessible formats for people with disabilities (braille, large print, electronic files, audio format), send an e-mail to fcc504@fcc.gov or call the Consumer & Governmental Affairs Bureau at 202-418-0530 (voice) or 202-418-0432 (TTY). 78. Ex Parte Rules. This proceeding shall be treated as a “permit-but-disclose” proceeding in accordance with the Commission’s ex parte rules. See 47 CFR § 1.1200 et seq. Persons making ex parte presentations must file a copy of any written presentation or a memorandum summarizing any oral presentation within two business days after the presentation (unless a different deadline applicable to the Sunshine period applies). Persons making oral ex parte presentations are reminded that memoranda summarizing the presentation must: (1) list all persons attending or otherwise participating in the meeting at which the ex parte presentation was made; and (2) summarize all data presented and arguments made during the presentation. If the presentation consisted in whole or in part of the presentation of data or arguments already reflected in the presenter’s written comments, memoranda, or other filings in the proceeding, the presenter may provide citations to such data or arguments in his or her prior comments, memoranda, or other filings (specifying the relevant page and/or paragraph numbers where such data or arguments can be found) in lieu of summarizing them in the memorandum. Documents shown or given to Commission staff during ex parte meetings are deemed to be written ex parte presentations and must be filed consistent with section 1.1206(b) of the Commission’s rules. In proceedings governed by section 1.49(f) of the rules or for which the Commission has made available a method of electronic filing, written ex parte presentations and memoranda summarizing oral ex parte presentations, and all attachments thereto, must be filed through the electronic comment filing system available for that proceeding, and must be filed in their native format (e.g., .doc, .xml, .ppt, searchable .pdf). Participants in this proceeding should familiarize themselves with the Commission’s ex parte rules. 79. Additional Information. For further information regarding this Public Notice, please contact William Holloway at William.Holloway@fcc.gov (WTB), Jonathan McCormack at Jonathan.McCormack@fcc.gov (OEA), or Martin Doczkat at Martin.Doczkat@fcc.gov (OET).
C. Paperwork Reduction Act
80. The rulemaking required under section 802(a)(1) of the Broadband DATA Act is exempt from review by OMB and from the requirements of the Paperwork Reduction Act of 1995 (PRA), Public Law 104-13. 47 U.S.C. § 646(b). As a result, the Public Notice will not be submitted to OMB for review under section 3507(d) of the PRA.
APPENDIX A
Technical Appendix
1 Introduction
This technical appendix provides additional information about the mobile challenge process and the mobile verification process proposed for the Broadband Data Collection (BDC), including how mobile speed tests would be validated and evaluated to determine whether mobile broadband is available in an area. The proposed processes seek to provide statistically valid methodologies for ensuring that mobile service providers’ coverage maps, based on predictive propagation modeling, reflect the on-the-ground reality of consumers’ experience. See 47 U.S.C. § 642(b)(2)(B) (requiring the Commission to adopt rules by which mobile providers would submit propagation models and propagation model details that indicate current 4G LTE broadband coverage); Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Second Report and Order and Third Further Notice of Proposed Rulemaking, 35 FCC Rcd 7460, 7476-83, 7503-06, paras. 32-51, 104-09 (2020) (Second Order and Third Further Notice) (adopting rules for mobile providers to submit propagation maps and propagation model details of 3G, 4G LTE, and 5G-NR coverage and seeking comment on how providers may submit a statistically valid sample of on-the-ground data to verify mobile providers’ coverage maps); Establishing the Digital Opportunity Data Collection; Modernizing the FCC Form 477 Data Program, WC Docket Nos. 19-195, 11-10, Third Report and Order, 36 FCC Rcd 1126, 1150-51, para. 59 (2021) (Third Order) (directing “OEA, WTB, and OET to develop and administer the specific requirements and methodologies that providers must use in conducting on-the-ground-tests . . . so that the tested areas satisfy the requirements of a statistically valid and unbiased sample of the provider’s network.”). The Commission has defined the parameters that service providers must use when modeling whether broadband is available using technology-specific minimum download and upload speeds with a cell edge probability of at least 90% and assuming minimum 50% cell loading. Second Order and Third Further Notice, 35 FCC Rcd at 7477, 7479-81, paras. 39, 44-47. Mobile service providers are required to submit data modeled with different minimum speed threshold values. For 3G coverage, these values are 200 kbps download and 50 kbps upload (i.e., 0.2/0.05 Mbps). 47 CFR § 1.7004(c)(3)(i). For 4G LTE coverage, these values are 5 Mbps download and 1 Mbps upload (i.e., 5/1 Mbps). Id. For 5G-NR coverage, there are two sets of minimum speed threshold values, 7 Mbps download and 1 Mbps upload (i.e., 7/1 Mbps), and 35 Mbps download and 3 Mbps upload (i.e., 35/3 Mbps). Id. These speed thresholds determine whether an individual mobile speed test is “positive” or “negative” – that is, whether the test meets the minimum predicted speeds or fails to do so – while the cell edge probability sets the overall expected rate of positive or negative speed tests for challengers and providers. On-the-ground speed testing via mobile device apps, such as the FCC Speed Test app, measures network performance quality through several metrics, including download speed and upload speed. Third Order, 36 FCC Rcd at 1166-67, para. 103. These apps can be used to measure on-the-ground consumer experience and to evaluate the accuracy of propagation model-based coverage maps. Id. at 1166-67, paras. 102-04.
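To illustrate how these technology-specific thresholds translate into the positive/negative determination, the following sketch applies the minimum speed values listed above to a single measurement. The code is ours and purely illustrative; only the threshold values come from the adopted rules, and all names are hypothetical.

```python
# Minimal sketch of the positive/negative determination described above.
# The (download, upload) minimums come from 47 CFR 1.7004(c)(3); the names
# and structure are illustrative only.
MIN_SPEEDS_MBPS = {
    "3G": (0.2, 0.05),
    "4G LTE": (5.0, 1.0),
    "5G-NR (7/1)": (7.0, 1.0),
    "5G-NR (35/3)": (35.0, 3.0),
}

def is_positive(technology: str, download_mbps: float, upload_mbps: float) -> bool:
    """A test is positive only if it meets BOTH minimum speeds for the map tested."""
    dl_min, ul_min = MIN_SPEEDS_MBPS[technology]
    return download_mbps >= dl_min and upload_mbps >= ul_min

# A 4G LTE test at 4/2 Mbps fails the 5 Mbps download floor, so it is negative.
assert not is_positive("4G LTE", 4.0, 2.0)
assert is_positive("4G LTE", 5.0, 1.0)
```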
We acknowledge that many real-world factors, such as the randomness or variability of localized terrain and clutter (e.g., foliage), vehicular traffic, device performance, cell loading, and weather, may affect speed test results in ways that are not accounted for in the model-based coverage maps, and that testing may not be conducted in a truly random (i.e., unbiased) manner. In this proposal, we have therefore taken several steps to create a robust and representative sample. We specify the minimum sample size needed to create a challenge as well as geographic and temporal requirements when conducting the speed tests. We have also used the principles of statistical design to guide other aspects of the challenge and verification process proposals. The proposed methodologies balance the need to minimize burdens on challengers and providers with the need to ensure a statistically sound methodology upon which to rely for making decisions involving the availability of broadband coverage. Consequently, we propose several requirements for on-the-ground speed tests to be used to challenge a service provider’s coverage map, rebut such a challenge, or verify coverage for the smallest challengeable or verifiable geographic area. These requirements seek to ensure that the set of submitted tests approximates an unbiased sample of the location (geographically and temporally) that is large enough to draw meaningful conclusions from the data. While requiring testers, which could include service providers, consumers, governmental entities, and other third parties, to conduct a truly random sample would increase the statistical confidence of conclusions drawn from the data, this requirement would add significant burden on both challengers and providers, especially as speed tests submitted as challenges may come from individual consumers and/or from unaffiliated challengers.
2 On-the-Ground Speed Test Validations and Certifications
We propose to validate, automatically upon submission, on-the-ground speed tests submitted as part of the mobile challenge process by challengers or challenged providers (collectively, “testers”) and as part of the verification process by providers. Consumer challengers must conduct speed tests using the FCC Speed Test app or another OET-approved speed test app and configuration, and all submitted tests must have a complete set of mandatory fields. Id. at 1166-67, 1172, paras. 103, 117. Any test with a location outside the respective coverage area would be considered invalid and excluded from further analysis. Each speed test would likewise need to conform to the most recent specification for mobile test data released by OEA and WTB, See id. at 1146, para. 48 (instructing OEA and WTB to “adopt the methodologies, data specifications, and formatting requirements that providers shall follow when collecting and reporting mobile infrastructure and on-the-ground test data to the Commission.”). with particular fields having a set or range of acceptable values (e.g., positive download speed), and any test that does not include entirely acceptable values per the data specification would likewise be excluded from further analysis. We propose that testers would also be required to conduct all speed tests between the hours of 6:00 a.m. and 10:00 p.m. local time and that tests would be valid for one year from the test date to ensure the tests are representative of the current state of the provider’s network.
Therefore, speed tests may be re-evaluated against new coverage maps when the service provider submits its biannual coverage data. Prior to conducting a speed test, testers must certify that: (1) their handset and speed test app are in ordinary working order to the best of the tester’s actual knowledge, information, and belief; (2) all submitted tests were taken in either an in-vehicle mobile or stationary environment; and (3) the tester is a customer (for challengers) or an authorized user (for providers) of the provider being challenged. Id. at 1167, para. 104. Government and third-party challengers must also substantiate their data through the certification of a qualified engineer or official. Id. at 1172, para. 117. Under our proposal, we would require testers to submit all conducted tests. The requirements that testers certify tests prior to conducting them and that all conducted tests are submitted would help ensure that testers do not submit a selected (i.e., biased) sample of tests.
2.1 Nested Hexagon Grid System
Geospatial datasets allow the Earth to be divided into unique cells using various shapes such as squares, rectangles, triangles, circles, and hexagons. We propose to use hexagons as the geographic area for grouping speed tests, identifying challenged areas and evaluating challenges, and evaluating verification data. Hexagons can be arranged to form an evenly spaced grid allowing for less distortion than squares or rectangles due to the curvature of the Earth. See ESRI, Why Hexagons?, https://pro.arcgis.com/en/pro-app/latest/tool-reference/spatial-statistics/h-whyhexagons.htm (last visited June 22, 2021). Specifically, we would use the H3 standardized, open-source geospatial indexing system developed by Uber Technologies, Inc. Isaac Brodsky, H3: Uber’s Hexagonal Hierarchical Spatial Index (Jun. 27, 2018), https://eng.uber.com/h3/. This system overlays the Earth with hexagonal cells of different sizes or resolutions. Id. The smallest hexagonal cells are at resolution 15, in which the average hexagonal cell has an area of approximately 0.9 square meters, and the largest are at resolution 0, in which the average hexagonal cell has an area of approximately 4.25 million square kilometers. Id. Table 1 provides the H3 geospatial indexing system resolutions that are relevant to the challenge and verification process. For ease of explanation, we refer to the hexagonal cells across different resolutions as a “hex-n” cell, where n is the resolution (e.g., “hex-15” for the smallest size hexagonal cell). The H3 geospatial indexing system employs a nested cell structure wherein a lower resolution hexagonal cell (the “parent”) contains approximately seven hexagonal cells at the next highest resolution (its “children,” and each a “child”). Id. Each finer resolution has nested cells with approximately one-seventh the area of the coarser resolution. Id. However, because a hexagon cannot perfectly subdivide into seven child hexagons, the finer resolution cells are not perfectly contained within their parent cells. Id. Regardless, a cell is considered to have a single parent, specifically, the hexagon at the next lower resolution containing the centroid of the cell. That is, a hex-1 cell is the “parent” of seven hex-2 cells, each hex-2 cell is the parent of seven hex-3 cells, and so on.
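The short sketch below (ours; it assumes the open-source h3-py v3 Python bindings and uses arbitrary coordinates) illustrates this parent-child navigation. Table 1 then lists the resolutions relevant to the challenge and verification process.

```python
# Illustration of nested H3 relationships, assuming the h3-py v3 API.
import h3

lat, lng = 38.8977, -77.0365              # an arbitrary test location
hex8 = h3.geo_to_h3(lat, lng, 8)          # hex-8 cell containing the point
hex9 = h3.geo_to_h3(lat, lng, 9)          # finer hex-9 cell containing the point

# Each hex-9 cell has a single hex-8 parent (the cell containing its centroid).
assert h3.h3_to_parent(hex9, 8) == hex8

# A hex-8 cell has (approximately) seven hex-9 children.
children = h3.h3_to_children(hex8, 9)
print(hex8, len(children))                # -> <hex-8 index> 7
```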
H3 Resolution    Average Hexagon Area (square km)    Average Hexagon Edge Length (km)    Number of Unique Indexes
5                252.903                             8.544                               2,016,842
6                36.129                              3.229                               14,117,882
7                5.161                               1.221                               98,825,162
8                0.737                               0.461                               691,776,122
9                0.105                               0.174                               4,842,432,842
Table 1: Excerpted table of H3 geospatial indexing system resolutions, with numbers rounded for readability. H3, Table of Cell Areas for H3 Resolutions, https://h3geo.org/docs/core-library/restable (last visited June 22, 2021). The grid structure of H3 allows for parent-child or nested relationships to easily translate across multiple indices of various resolution sizes.
Figure 1: H3 geospatial indexing system at resolution 7, resolutions 7 and 8, and resolutions 7, 8, and 9. Isaac Brodsky, H3: Uber’s Hexagonal Hierarchical Spatial Index (Jun. 27, 2018), https://eng.uber.com/h3/.
For example, the leftmost image of Figure 1 shows the approximately five square kilometer area of one complete hexagonal cell at resolution 7 (a hex-7 cell). The center image shows the same hex-7 cell with seven finer resolution 8 hexagonal cells (hex-8 cells) nested in the hex-7 cell. The rightmost image shows the original hex-7 cell with seven finer resolution hex-8 cells, and 49 finer resolution 9 hexagonal cells (hex-9 cells). Because the finer resolution child cells are approximately contained in the coarser resolution ancestor cells, H3 allows for efficient indexing of the area with minimal shape distortion that would occur only at the cell boundaries.
3 Methodology for Creating and Evaluating Mobile Challenges
3.1 Creating a Challenge
For the challenge process, we propose that the smallest cognizable challenge would be to a hex-8 cell, which has an average area of approximately 0.7 square kilometers. This H3 resolution has hexagonal cells that are closest to the one square kilometer de minimis threshold for challenges and the one square kilometer uniform grid system adopted for the Mobility Fund Phase II challenge process, along with the same one square kilometer uniform grid system adopted by the Commission for use when evaluating on-the-ground speed tests submitted by 5G Fund support recipients. See Connect America Fund; Universal Service Reform – Mobility Fund, WC Docket No. 10-90, WT Docket No. 10-208, Order on Reconsideration and Second Report and Order, 32 FCC Rcd 6282, 6305-06, para. 46 (2017); Procedures for the Mobility Fund Phase II Challenge Process, WC Docket No. 10-90, WT Docket No. 10-208, Public Notice, 33 FCC Rcd 1985, 1989-90, para. 9 (WTB/WCB 2018); Establishing a 5G Fund for Rural America, GN Docket No. 20-32, Report and Order, 35 FCC Rcd 12174, 12232, para. 140 (2020). Coverage maps must be submitted at a resolution of 100 meters (i.e., 0.1 km) or better. 47 CFR § 1.7004(c)(3)(iii). Challenges to an area smaller than a hex-8 cell may therefore reflect inaccuracies attributable to the resolution at which the provider generated its maps rather than genuine coverage gaps. Allowing challenges for a smaller area (at a higher resolution) may thus require excessive testing by providers whose propagation maps were not designed to provide such precision. Conversely, allowing challenges only for a larger area (at a lower resolution) would require significantly more testing for the challenger and may hamper a challenger’s ability to demonstrate local coverage gaps. Roughly 70% of hexagonal cells at resolution 8 in the United States, excluding Alaska, intersect with at least one road, using U.S. Census Bureau roadway data.
This means that most hex-8 cells should be easily accessible for drive testing. The current proposal contains a process to create challenges at lower resolutions if sufficient evidence exists, as described in section 3.1.4. Under the staff proposal, speed tests would be required at a variety of locations within the hex-8 cell and at more than one time of day. A speed test would be categorized as a “positive” test—that is, a test that meets or exceeds the minimum download and upload speeds associated with the coverage area of a mobile provider’s technology being tested (e.g., 5/1 Mbps for 4G LTE), or a “negative” test—that is, a test that does not satisfy the minimum speeds associated with the coverage area. For example, for a 4G LTE speed test to be positive, the download speed must be at least 5 Mbps and the upload speed must be at least 1 Mbps (5/1). If a test records speeds of 4/2 Mbps or 25/0.5 Mbps, the test would be considered negative because neither result meets both minimum speed thresholds. Only speed tests conducted on 3G networks would be used to challenge 3G coverage, only speed tests conducted on 4G LTE networks would be used to challenge 4G LTE coverage, and only speed tests conducted on 5G-NR networks would be used to challenge 5G-NR coverage. For speed tests conducted on a 5G-NR network and submitted as challenges, under the staff proposal, tests would be evaluated against the highest minimum speeds reported in the mobile service provider’s coverage data. As discussed, providers are required to submit coverage data showing where their models predict 5G-NR coverage with minimum speeds of 35/3 Mbps in addition to where their models predict 5G-NR coverage with minimum speeds of 7/1 Mbps. Consequently, for a 5G-NR speed test to be considered positive within the area that the provider reports minimum speeds of 35/3 Mbps, the download speed must be at least 35 Mbps and the upload speed must be at least 3 Mbps. If the test’s speeds were 10/3 Mbps, 35/1 Mbps, or even 7/1 Mbps, these tests would be considered negative because none meets both minimum speed thresholds. However, all three of these 5G-NR speed tests would be considered positive if conducted outside of the area that the provider reports minimum speeds of 35/3 Mbps but within the area that it reports minimum speeds of 7/1 Mbps (see the sketch following this discussion). Challengers’ speed tests would be evaluated collectively, so that the tests of multiple challengers may be used to create a challenge, and speed tests would be evaluated cumulatively, with all tests remaining valid for one year. Each challenger would be notified that an area where it provided tests has been classified as challenged. A hex-8 cell would be classified as challenged if the following three thresholds were met in the cell:
1. Geographic Threshold
2. Temporal Threshold
3. Testing Threshold
These three separate conditions would help ensure that tests are geographically and temporally diverse, and therefore approximate a random sampling of the area. At the same time, these proposals are meant not to be overly burdensome to challengers – the typical distance to capture measurements at four different hex-9s is less than a mile. For example, challengers could reasonably satisfy these criteria while taking their dog for a morning and evening walk through their neighborhood.
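The two-tier evaluation of 5G-NR tests described above can be summarized in a few lines. The sketch below is ours and purely illustrative; determining whether a test location falls inside the reported 35/3 Mbps area is assumed to happen elsewhere (e.g., via a point-in-polygon check against the provider's coverage data).

```python
# Sketch of the proposed 5G-NR evaluation: a test is judged against the highest
# minimum speeds the provider reports for the test location (35/3 Mbps where
# reported, otherwise 7/1 Mbps). Names are illustrative.
def classify_5g_test(download_mbps: float, upload_mbps: float,
                     in_35_3_area: bool) -> str:
    dl_min, ul_min = (35.0, 3.0) if in_35_3_area else (7.0, 1.0)
    meets_both = download_mbps >= dl_min and upload_mbps >= ul_min
    return "positive" if meets_both else "negative"

# The 10/3, 35/1, and 7/1 Mbps examples from the text: each is negative inside
# the reported 35/3 Mbps area but positive where only 7/1 Mbps is reported.
for dl, ul in [(10, 3), (35, 1), (7, 1)]:
    assert classify_5g_test(dl, ul, in_35_3_area=True) == "negative"
    assert classify_5g_test(dl, ul, in_35_3_area=False) == "positive"
```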
As challengers submit speed test results, the system would identify in which hex-8 cell each test occurred, continually evaluating whether the thresholds have been met for each cell, and would make this information available to challengers. As proposed, when the test results for a given cell meet all of the thresholds to successfully create a challenge, that cell would be considered “tentatively challenged” until the end of the calendar month in which the criteria were met. This would allow challengers to review which cells have been tentatively challenged and, if desired, provide additional speed test data in cells where submitted speed test data were insufficient to establish a challenge. At the end of each month, any cell that is tentatively challenged would then be considered “challenged” and the challenged service provider would be notified of any challenged cells. The provider would then have 60 days to respond to the challenges for any newly challenged cells.
3.1.1 Geographic Threshold
We propose a geographic threshold for the challenge process so that challengers must demonstrate that a lack of coverage exists over a sufficiently large area and is not concentrated in one small area. To accurately measure the coverage in a hex-8 cell, we would require speed tests to be conducted in multiple locations within the geographic area of the cell, including negative tests recorded at multiple different locations. These requirements would ensure geographic diversity of tests and identify potential coverage gaps over a sufficiently wide area. The proposed methodology would group speed tests that fall within the same hex-9 cell (which is a child of a hex-8 cell), or “point-hex,” and that are within the reported coverage of the tested provider. Speed tests conducted within the same point-hex would be on average within 350 meters of each other, or approximately 3 city blocks, which is less than the 400-meter buffer radius used in the Mobility Fund Phase II challenge process. Connect America Fund; Universal Service Reform—Mobility Fund, WC Docket No. 10-90, WT Docket No. 10-208, Order on Reconsideration, 33 FCC Rcd 4440-41, paras. 1, 4 (WTB/WCB 2018). Because there are seven child hex-9 cells within a hex-8 cell, tests within a hex-8 cell will fall within one of seven point-hexes. Because the child cells do not perfectly nest within the parent cell, some tests may fall within the hex-8 cell but not within any child point-hex. These tests will not count towards satisfying the geographic threshold but will be included when evaluating the temporal and testing thresholds. The system would count the number of point-hexes that contain the following: (a) at least two tests (either positive or negative); and (b) at least one negative test. Such point-hexes would be considered to have inadequate coverage—that is, there would be prima facie evidence that the provider’s submitted coverage data may be inaccurate for these point-hexes. To satisfy the geographic threshold for a challenge, a hex-8 cell would generally need to contain at least four point-hexes that meet both criteria (a) and (b) above. This requirement would assure that more than 50% of the point-hexes show inadequate coverage.
Figure 2: Speed tests within a hex-8 cell (outlined in black), with negative (red) and positive (green) tests shown.
Figure 2 illustrates the geographic threshold requirement.
The graphic on the left shows that four child point-hexes (outlined in red) satisfy the testing requirements to be counted towards the geographic threshold because all four point-hexes include two or more tests, at least one of which is a negative test. The graphic on the right shows that no child point-hexes satisfy the testing requirements to be counted towards the geographic threshold because, in each case, there are fewer than two tests in the point-hex, the test does not fall within a point-hex, or the point-hex does not include a negative test. If a provider’s coverage map only partially covers a hex-8 cell, or if a portion of the hex-8 cell does not contain roads, then the geographic threshold for that cell would be reduced from the four-out-of-seven point-hex requirement described above. Using the most recent U.S. Census Bureau roadway data, a point-hex would contain a road if it overlaps any primary, secondary, or local road, which are defined as MAF/TIGER Feature Class Codes S1100, S1200, or S1400, respectively. U.S. Census Bureau, 2020 TIGER/Line Shapefiles: Roads, https://www.census.gov/cgi-bin/geo/shapefiles/index.php?year=2020&layergroup=Roads (last visited June 22, 2021). In order to account for road width, we would apply a small buffer around the U.S. Census Bureau road line data. We would consider point-hexes to be “accessible” where at least 50% of the point-hex overlaps with the provider’s reported coverage data and a road runs through the point-hex. Where fewer than four point-hexes in a hex-8 cell are accessible, the number of point-hexes necessary to satisfy the geographic threshold for the hex-8 cell would equal the number of accessible point-hexes in that cell (see Table 2 below). For example, if there are only two accessible point-hexes in a hex-8 cell, then only two of the point-hexes would need to contain multiple speed tests and at least one negative test in order to satisfy the geographic threshold. If a point-hex does not have a road but is still within the provider’s coverage, tests conducted in that point-hex would count towards satisfying the geographic threshold for a challenge in a hex-8 cell (assuming that multiple speed tests with at least one negative speed test were recorded in that point-hex). For example, if a challenge contains two tests, at least one of which is negative, within a point-hex that does not have a road but was reached via a hiking trail, those tests would count towards satisfying the geographic threshold. If there are no accessible point-hexes within a hex-8 cell, the geographic threshold would not need to be met; only the temporal and testing thresholds would need to be met in order for that hex-8 cell to be considered challenged.
Number of Accessible Point-Hexes in Hex-8 Cell    Minimum Number of Accessible Point-Hexes Required for Challenge
4 – 7                                             4
3                                                 3
2                                                 2
1                                                 1
0                                                 0 (hex-8 cell must contain tests that satisfy the temporal threshold and testing threshold)
Table 2: Relationship of Accessible Point-Hexes to Geographic Threshold.
Figure 3: Example of accessible child point-hexes within a hex-8 cell.
The example in Figure 3 illustrates the impact of accessible point-hexes within a hex-8 cell on the geographic threshold. In this graphic, the provider’s reported coverage is shown in blue, with the underlying roads shown. Only two point-hexes both contain a road and have greater than 50% overlap with the provider’s coverage.
Therefore, only those two point-hexes are accessible, and the geographic threshold would be only two point-hexes for this hex-8 cell rather than the typical requirement of four point-hexes.
3.1.2 Temporal Threshold
We propose a temporal threshold for the challenge process so that challengers must demonstrate that the lack of coverage is persistent rather than temporary. To meet this requirement, a hex-8 cell would need to include two negative tests with a time-of-day difference of at least four hours between the negative tests, regardless of the date of the test. For example, if a challenger’s first negative test occurred at 10:00 a.m. and, on the same day, the challenger recorded an additional negative test at 6:30 p.m., then this temporal requirement would be satisfied because the difference between 10:00 a.m. and 6:30 p.m. is greater than four hours. If a challenger only recorded negative tests between 9:00 a.m. and 10:00 a.m., whether on the same day or with one negative test at 9:00 a.m. on one day and another negative test at 10:00 a.m. on the following day, those tests would not satisfy this temporal requirement because the time-of-day difference between 9:00 a.m. and 10:00 a.m. is only one hour.
3.1.3 Testing Threshold
We propose a testing threshold for the challenge process so that challengers must demonstrate statistically significant evidence that coverage is inadequate. Specifically, in order for the testing threshold for a hex-8 cell to be met, we propose that at least five negative tests have been taken within the cell when challengers have submitted 20 or fewer tests. When challengers have submitted more than 20 tests, we propose that a certain minimum percentage of the total number of tests in the cell must be negative, ranging from at least 24% negative, when challengers have submitted between 21 and 29 total tests, to at least 16% negative, when challengers have submitted 100 or more tests (see Table 3 below). Ignoring the costs to the challengers and providers, a greater sample size only improves the statistical certainty associated with such a statistical test. A general rule of thumb in statistics is that a sample size should be at least 30, but in this context, we have considered the burden on challengers and providers of reaching such a sample size. Marco Taboga, Lectures on Probability Theory and Mathematical Statistics ch. 71 (3d ed. 2017). We also note that a sample size any lower than our suggested values would be of questionable statistical validity. For example, if 60 tests were recorded in a hex-8 cell, at least 12 of these tests must be negative to meet this requirement. Once the percentage of negative tests recorded meets the minimum negative percentage required, or, for a sample of fewer than 21 tests, once there are at least five negative tests submitted, we would not require additional tests so long as both the geographic and temporal thresholds for a hex-8 cell have been met. The proposed thresholds for the percentage of negative tests are based on the statistical significance necessary to demonstrate lack of coverage.
Total Number of Tests    Count or Percent of Negative Tests
20 or fewer              5 tests
21-29                    24%
30-45                    22%
46-60                    20%
61-70                    18%
71-99                    17%
100+                     16%
Table 3: Challenger testing threshold with required number of tests and negative test counts or percentages.
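The statistical basis for these counts is derived in section 3.1.3.1 below. As a rough cross-check, the following sketch (ours, assuming the scipy library is available) computes the exact one-sided Clopper-Pearson bound described there and recovers the minimum number of negative tests for a given sample size.

```python
# Sketch of the exact test underlying Table 3: a challenge is statistically
# sufficient when the one-sided 95% Clopper-Pearson upper bound on the
# probability of a positive test falls below the modeled 90% probability.
from scipy.stats import beta

def coverage_upper_bound(n_tests: int, n_negative: int, alpha: float = 0.05) -> float:
    """Upper confidence bound on the probability that a test is positive."""
    k = n_tests - n_negative          # number of positive tests
    if n_negative == 0:
        return 1.0                    # no failures: the bound is 1
    # P_ub = 1 - BetaInv(alpha; n - k, k + 1), per section 3.1.3.1
    return 1.0 - beta.ppf(alpha, n_tests - k, k + 1)

def min_negatives_required(n_tests: int) -> int:
    """Smallest number of negative tests that rejects 'coverage >= 90%'."""
    for neg in range(n_tests + 1):
        if coverage_upper_bound(n_tests, neg) < 0.9:
            return neg
    return n_tests + 1                # not achievable at this sample size

assert min_negatives_required(20) == 5    # "five or more failures in 20 tests"
assert min_negatives_required(100) <= 16  # consistent with the 16% row of Table 3
```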
3.1.3.1 Statistical Analysis
Given the variable nature of wireless signal propagation and network load, occasional negative tests are possible within areas that should otherwise have adequate coverage, assuming a cell edge probability of 90% with 50% cell loading. We have therefore defined the testing thresholds for the minimum number of tests and the percentage of negative tests per hex-8 cell based on a statistical test using a one-sided confidence interval. We refer to “proportion” as the percentage of positive/negative tests relative to the total number of tests. We refer to “probability” as the chance that any one test will be positive/negative. In other words, the term “proportion” reflects a summary of observations and the term “probability” reflects a hypothetical chance. As part of this method, we first defined the hypotheses for a hypothesis test: the null hypothesis is that the probability of adequate coverage is at least 90%, and thus, the alternate hypothesis is that the probability of coverage is less than 90%. We have chosen 90% for the null hypothesis because the coverage maps are modeled to reflect a 90% probability of coverage at the cell edge. While the probability of coverage at any point within the cell (i.e., not at the cell edge) is greater than 90%, we chose a conservative probability for simplicity. In other words, to reject the null hypothesis and successfully establish a challenge to a provider’s coverage data, a challenge must provide sufficient evidence that the probability of coverage is less than 90% over a geographically and temporally diverse sample. Next, we chose a desired level of statistical significance. This value is commonly referred to as alpha (α) and represents the Type I error that we are willing to tolerate (the probability of a “false positive”). Taboga, supra note 25 at ch. 78. For instance, a 95% confidence interval has a 5% significance level, or a 5% probability of rejecting the null hypothesis when the null hypothesis is true. Several tradeoffs exist when choosing the significance level. Choosing a very low significance level would reduce the chance of a “false positive” result—that is, successfully establishing a challenge when there is, in fact, adequate coverage in an area. To reduce the chance of false positives, more samples are required to demonstrate that coverage meeting the minimum speeds does not exist. At the same time, decreasing the significance level would increase the probability of a “false negative” result. The probability of a “false negative” is commonly referred to as beta (β) and represents the Type II error. Id. Therefore, for the staff proposal we chose a statistical significance value of 5%, which means we would calculate a 95% confidence interval and accept a 5% probability of finding inadequate coverage when there is actually adequate coverage. In many fields, common values of significance levels are 1% (99% confidence interval), 5% (95% confidence interval), and 10% (90% confidence interval). After choosing the significance level, we are able to calculate the minimum number of negative tests that a challenge would need in order to provide sufficient statistical evidence that coverage meeting the minimum speeds does not exist in a hex-8 cell. Specifically, for the staff proposal we have applied the Clopper-Pearson method for calculating an exact binomial confidence interval.
This “exact” method is more complicated than the traditional normal approximation of a binomial confidence interval, which is inaccurate when sample sizes are small or the proportion is very high or low. Given that we seek to minimize the burden on challengers and providers in the form of small sample sizes and expect the sample proportions to be very low (close to 0) or very high (close to 1), we have concluded that the traditional approach is not appropriate. A common rule of thumb is that the traditional method of a normal approximation should only be used if $n \times p \ge 5$ and $n \times (1-p) \ge 5$, where $n$ is the sample size and $p$ is the sample proportion (number of positive tests divided by the sample size). This rule of thumb can be interpreted to mean that the sample must include at least five positive and five negative tests for the traditional approach to be used; because the expected proportion of positive tests is close to 0.9, more than 50 tests would be required. To reject the null hypothesis, the upper bound of the 95% confidence interval must be less than 90%:

$$P_{ub} = 1 - B^{-1}(\alpha;\ n-k,\ k+1) < 0.9$$

where $B^{-1}$ is the inverse of the cumulative distribution function of the beta distribution, which is used to invert the binomial cumulative distribution function (CDF), $\alpha$ is the chosen significance level, $n$ is the sample size (number of tests), and $k$ is the number of positive tests. We distinguish here between the inverse cumulative distribution function of the beta distribution and the scalar value beta ($\beta$), which are not related. The beta distribution is a probability distribution defined by two shape parameters, with probability density function (PDF)

$$f(x;\ \alpha, \beta) = \frac{1}{B(\alpha, \beta)}\, x^{\alpha-1} (1-x)^{\beta-1}$$

where $B(\cdot)$ is the beta function, which defines a normalization constant ensuring that the total area under the PDF equals 1, $\alpha > 0$, $\beta > 0$, and $0 \le x \le 1$. Note that the $\alpha$ in the inverse beta distribution corresponds to the significance level, but the $\alpha$ in the beta distribution is a shape parameter. If the challenger chooses to conduct more than the minimum number of tests, the required percentage of negative tests would decrease, since the challenger has provided a larger sample size and there is more certainty that the sample proportion reflects the true probability of coverage meeting the minimum speeds. Using this formula, five or more failures in 20 tests demonstrates a likely lack of adequate coverage. We calculated the percentage of failures required at each number of tests. For simplicity, we then grouped the number of tests and chose failure percentages such that the number of failures will always meet or exceed the exact number of failures required.
3.1.4 Challenges to Larger, Lower-Resolution Hexagons
If multiple nearby hex-8 cells meet the three proposed thresholds described above, it may point to a more systemic lack of coverage across a larger area. Rather than require that challengers meet these thresholds in every hex-8 cell near a group of challenged hex-8 cells, we propose to use the nested structure of H3 to establish challenges across larger areas. We propose that if four or more of the seven child hex-8 cells in a hex-7 cell (which has an average area of 5.2 square kilometers) are challenged, then the entire hex-7 cell also will be considered to be challenged (see Fig. 4 below). Similarly, we propose that if four or more of the seven child hex-7 cells in a hex-6 cell (which has an average area of 36.1 square kilometers) are challenged, then the entire hex-6 cell also will be considered to be challenged.
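A compact sketch of this roll-up rule follows (ours and illustrative only; it again assumes the h3-py v3 bindings).

```python
# Sketch of the proposed roll-up: a coarser cell is treated as challenged when
# four or more of its (approximately seven) children are challenged.
from collections import Counter
import h3

def rolled_up(challenged_cells: set, parent_resolution: int) -> set:
    """Challenged parent cells implied by a set of challenged child cells."""
    counts = Counter(h3.h3_to_parent(cell, parent_resolution)
                     for cell in challenged_cells)
    return {parent for parent, n in counts.items() if n >= 4}

# Challenged hex-8 cells roll up into hex-7 cells, which in turn roll up
# into hex-6 cells under the same four-of-seven rule.
def all_challenged(hex8_challenged: set) -> tuple:
    hex7 = rolled_up(hex8_challenged, 7)
    hex6 = rolled_up(hex7, 6)
    return hex7, hex6
```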
Hexagonal cells at a resolution lower than resolution 6 (i.e., those larger than a hex-6 cell) could not be challenged in the challenge process. Instead, we would rely upon the verification process to address cases where sufficient evidence suggests that there is a more systemic problem with a provider’s coverage data, for example, across an area larger than a hex-6 cell.
Figure 4: Process by which challenges to hex-8 cells can challenge a larger hex-7 cell.
This process is illustrated in Figure 4 above. The leftmost graphic shows the challenges to hex-8 cells that would be determined using the geographic, temporal, and testing thresholds. The center graphic shows the hex-7 cells containing any challenged hex-8 cells that would then be identified. Finally, the rightmost graphic shows that the system would consider any hex-7 cell that contains four or more challenged hex-8 cells to also be challenged.
3.1.5 Stationary vs. In-Vehicle Mobile Challenges
Mobile service providers are required to submit two sets of data for a given technology: one map that predicts coverage assuming the device is stationary and one map that predicts coverage assuming the device is in-vehicle and mobile. Second Order and Third Further Notice, 35 FCC Rcd at 7481-82, para. 48; 47 CFR § 1.7004(c)(5). Similarly, challengers and providers are required to indicate, for each speed test, whether the speed test was conducted in an outdoor stationary or in-vehicle mobile environment. Third Order, 36 FCC Rcd at 1150-51, 1166, paras. 57, 59, 102 & n.315; 47 CFR § 1.7006(e)(1)(iii), (f)(1)(i)(G). We propose first to filter the speed tests so as to exclude any stationary tests that fall outside of the provider’s stationary coverage map and any in-vehicle mobile tests that fall outside of the provider’s in-vehicle mobile coverage map. The resulting speed tests would then be aggregated and independently compared against each coverage map, and any aggregated speed test that falls outside of the particular coverage map would be similarly excluded. Because the two coverage maps may differ, especially at the edge of a provider’s network, speed tests submitted as challenges against the same provider within the same hex-8 cell may be sufficient to create a challenge against one of the provider’s coverage maps and insufficient to create a challenge against the other. For example, the aggregated speed tests might meet the testing, geographic, and temporal threshold requirements for a hex-8 cell when evaluated against a provider’s stationary 4G LTE coverage map but fail to meet the testing threshold when evaluated against the provider’s in-vehicle mobile 4G LTE coverage map because a handful of stationary speed tests fell outside of the provider’s in-vehicle mobile coverage.
3.2 Provider Notification and Response Process
We propose to notify mobile service providers after the end of each calendar month of all hexagonal cells for which cognizable challenges were triggered during that month. Upon notification, challenged providers would have 60 days to respond to a challenge by either conceding or disputing a particular challenge. Third Order, 36 FCC Rcd at 1168, 1173, paras. 107, 121; 47 CFR § 1.7006(e)(3), (f)(4). Where the challenged provider disputes a challenge, it must submit either infrastructure data or on-the-ground speed test data to respond to the challenge. Third Order, 36 FCC Rcd at 1168-69, 1173-74, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5).
The challenged provider may optionally provide additional data that would be helpful to the Commission in adjudicating challenges. Third Order, 36 FCC Rcd at 1168-69, 1173-74, paras. 108, 121; 47 CFR § 1.7006(e)(4), (f)(5). If the provider submits on-the-ground speed test data in response to a challenge, we propose to evaluate the provider’s speed tests independently of the challengers’ data. A challenged provider may provide evidence of coverage in response to a challenge by submitting on-the-ground speed test data for any challenged hex-8 cell or any hex-8 cell that is the child or grandchild of a challenged hex-7 or hex-6 cell. To successfully overcome a challenge, a challenged provider may submit positive speed tests for any hex-8 cell within the challenged area; in addition to any challenged hex-8 cell, this includes any hex-8 cell that is a child of a challenged hex-7 cell or a grandchild of a challenged hex-6 cell. If a challenged provider submits positive speed tests meeting the thresholds for challenged or non-challenged hex-8 cells that have a challenged parent or grandparent, such tests could overturn challenges in the larger hex-7 or hex-6 cells. We propose that the provider may only submit speed tests conducted during the previous 12 months as evidence in response to a challenge. The proposed criteria that a provider must meet to overturn a challenge by showing evidence of coverage in a hex-8 cell would be similar to the criteria for creating a challenge. Challenged providers would be required to meet the following three thresholds:
1. Geographic Threshold
2. Temporal Threshold
3. Testing Threshold
As proposed, a hex-8 cell for which a challenge is successfully overturned would not be subject to subsequent challenge until the first biannual BDC coverage data filing six months after the later of either the end of the 60-day response period or the resolution of the challenge (the “confirmed period”). A challenged provider may “restore” a challenged hex-7 or hex-6 cell to an unchallenged state if, as a result of data submitted by the provider, there is no longer sufficient evidence to sustain the challenge to that hexagon (see the example in section 3.1.4), or, as discussed below, if the provider submitted evidence invalidating challenger speed tests such that the remaining valid challenger speed tests are no longer sufficient to challenge the hex-8 cell. Unlike cells for which a challenge was overturned, a restored cell would be subject to challenge at any time in the future as challengers submit new speed test data.
3.2.1 Geographic Threshold
We propose a geographic threshold so that challenged providers must demonstrate that coverage exists over a sufficiently large area in the challenged cell to overturn a challenge. To overturn a challenge by showing adequate coverage in the challenged area, the provider would need to meet the same geographic threshold required of challengers, but with positive tests rather than negative tests required. These requirements would ensure geographic diversity of tests and demonstrate that coverage is consistent over a sufficiently wide area. The system would count the number of point-hexes that contain the following: (a) at least two speed tests (either positive or negative); and (b) at least one positive test. Such point-hexes would be considered to have some evidence of adequate coverage.
To satisfy the geographic threshold for a challenge response, a hex-8 cell would generally need to contain at least four point-hexes that meet both criteria (a) and (b) above. This requirement would assure that more than 50% of the point-hexes show adequate coverage. As with the geographic threshold required of challengers, if there are fewer than four “accessible” point-hexes (as defined in section 3.1.1) within the challenged hex-8 cell, the number of point-hexes for which the challenged provider would be required to make this showing would be the same as that required of the challenger.
3.2.2 Temporal Threshold
We propose a temporal threshold so that challenged providers must demonstrate that the existence of coverage is persistent in order to overturn a challenge using on-the-ground speed tests, analogous to the temporal threshold required of challengers. To meet this requirement, a hex-8 cell would need to include two positive tests with a time-of-day difference of at least four hours between the positive tests, regardless of the date of the test.
3.2.3 Testing Threshold
We propose a testing threshold for challenged providers so that providers must demonstrate statistically significant evidence that coverage is adequate to overturn a challenge using on-the-ground speed tests, based on the same statistical significance analysis used for determining challenges. Using the same statistical analysis as detailed in section 3.1.3.1, the provider needs tests such that $P_{ub} \ge 0.9$, demonstrating that the 95% confidence interval either contains or exceeds 90%. Specifically, in order for the testing threshold for a hex-8 cell to be met, we propose that at least 17 positive tests have been taken in the cell when the provider has submitted 20 or fewer tests. When the provider has submitted more than 20 tests, we propose that a certain minimum percentage of the total number of tests in the cell must be positive, ranging from at least 82% positive, when providers have submitted between 21 and 34 total tests, to at least 88% positive, when providers have submitted 100 or more tests (see Table 4 below). As more tests are taken, the confidence interval shrinks and, therefore, the percentage of positive tests required to demonstrate coverage increases. For example, if 50 tests were recorded in a hex-8 cell, at least 43 of these tests must be positive to meet this requirement. Once there are at least 17 positive tests submitted or the percentage of positive tests meets the minimum percent required, as appropriate, we would not require additional tests so long as both the geographic and temporal thresholds for the hex-8 cell have been met. These proposed thresholds are meant to demonstrate that the required speeds can be obtained in the cell 90% of the time.
Total Number of Tests    Count or Percent of Positive Tests
20 or fewer              17 tests
21-34                    82%
35-49                    84%
50-70                    86%
71-99                    87%
100+                     88%
Table 4: Provider testing threshold with required number of tests and positive test percentages.
3.2.4 Responding to Lower Resolution Challenges
If a hex-7 or hex-6 cell is challenged, the provider can overturn the challenge by submitting positive speed test results sufficient to overturn or revert challenges such that fewer than four of the child hex-8 cells remain challenged.
If a provider does not submit sufficient speed test data to overturn a challenged hex-8, hex-7, or hex-6 cell, and is not able to overturn or revert the challenge using other evidence, then it would be required to remove from its coverage map the area overlapping any portion of the successfully challenged hexagon. However, a challenged provider would not need to remove from its coverage map the area overlapping any hex-8 cells in which the provider was able to provide sufficient on-the-ground test evidence to overturn the challenge.
Figure 5: Process by which provider responses to challenged hex-8 cells can revert a challenged hex-7 cell.
This process is illustrated in Figure 5 above. The leftmost graphic shows a hex-7 cell that would be challenged because four child hex-8 cells were challenged. The graphic in the center shows that the provider overturned the challenge to three hex-8 cells (in green; one was directly challenged, and the other two were challenged only as part of the hex-7 cell), leaving only three hex-8 cells challenged (in red), so the previously challenged parent hex-7 cell would be restored and no longer challenged. The rightmost graphic shows, in the alternative, that the provider overturned the challenge to only two hex-8 cells (in green, both challenged through the hex-7 cell), leaving four hex-8 cells challenged (in dark red); the challenge to the parent hex-7 cell would therefore remain. The provider would be required to remove from its coverage map all areas successfully challenged but would not need to remove areas overlapping the two hex-8 cells for which the challenges were overturned.
3.3 Post Challenge Review
After the challenged provider submits all responses, any areas that remain challenged must be removed from the provider's coverage map. Third Order, 36 FCC Rcd at 1170, 1174, paras. 112, 124; 47 CFR § 1.7006(e)(3), (f)(4). The provider will have 30 days to submit new maps reflecting these updates. Third Order, 36 FCC Rcd at 1170, 1174, paras. 112, 124; 47 CFR § 1.7006(e)(6), (f)(7). Speed tests submitted for the challenges in these areas would thus no longer be valid because the tests would no longer fall within the provider's coverage map. Any areas where the provider overturned the challenge would be marked as such and would be ineligible for challenge for the duration of the confirmed period. Any hex cells that were challenged and restored (but where coverage was not confirmed) would remain eligible to be challenged, and any valid speed test in these cells may still be used for a future challenge (up to a year from the date the test was conducted). It is also possible that sufficient proof was provided to confirm that a hex cell had coverage even though some of its children were neither challenged nor confirmed to have coverage. These individual child cells can still be challenged in the future, but the parent hex cell would remain ineligible for challenge during the confirmed period described above. If the provider makes modifications to its infrastructure such that it now models coverage in an area that had previously been successfully challenged, the provider may include this area in its coverage map data, but it would be required to submit additional information or data, detailing the modifications that now provide this coverage, before certifying its BDC coverage data. Third Order, 36 FCC Rcd at 1170, 1174, paras. 111, 123; 47 CFR § 1.7006(e)(5), (f)(6).
Providers would be required to submit such information and data as part of their biannual BDC coverage data submissions. Otherwise, areas that have been successfully challenged must not be included in future submissions of the coverage map. Third Order, 36 FCC Rcd at 1170, 1174, paras. 112, 124; 47 CFR § 1.7006(e)(6), (f)(7).
4 Methodology for Verifying Coverage Data
4.1 Introduction
In the Third Order, the Commission called for the development and administration of "specific requirements and methodologies that providers [subject to verification] must use in conducting on-the-ground tests, including the geographic areas that must be subject to the on-the-ground testing so that the tested areas satisfy the requirements of a statistically valid and unbiased sample of the provider's network." Third Order, 36 FCC Rcd at 1150, para. 59. In this section, we provide the technical details of the staff proposal for a stratified random sample design that addresses these requirements. In particular, these technical details guide both the Commission and the provider subject to verification in determining:
· where, within the geographic boundaries of the portion of its coverage map subject to verification, a provider should conduct on-the-ground testing;
· in how many locations a provider must conduct on-the-ground speed test measurements;
· what speed test measurements will be accepted for staff analysis by the Commission; and
· how Commission staff will evaluate the test data and adjudicate whether the provider has passed or failed verification.
The proposed design is applicable when the mobile service provider chooses to submit on-the-ground test data as evidence to support its claim of broadband availability in the portion of its coverage map subject to verification. This proposed design does not, however, address the mechanism or thresholds that trigger verification.
4.2 Targeted Area, Frame, Units, and Sample
At the heart of any sampling exercise are one or more research questions about a population of interest. Such questions typically involve estimates of population totals, averages, proportions, and similar statistics. Data are then collected from a randomly selected subset of the population, called a "sample," and subsequently analyzed to produce the desired estimates. Sampling starts with the division of the target population into unique components called "units." The list of units is called the "frame." A sample of units is then randomly selected from the frame according to some carefully designed plan. In the context of BDC mobile verifications, the population of interest, called the "targeted area," is the portion of a provider's coverage map subject to verification. Under our proposed approach, staff would define the boundaries of the target based upon the mechanism for triggering verification, i.e., the area where staff has a "credible basis" to request verification from the provider. The research question is whether the provider offers adequate coverage meeting the minimum speeds required for the modeled technology throughout the target and at different periods of the day. Id. at 1146-47, para. 50; 47 U.S.C. § 642(a)(1)(B)(i), (b)(2)(B). As a well-defined geographic area, the targeted area can be processed into layers of hexagons consistent with the system used in the challenge process (the H3 geospatial indexing system at resolution 8). Each tessellating hex cell inside the target is a sampling unit, and the set of all such hexagons forms the frame.
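For illustration, the frame of hex-8 sampling units could be enumerated with Uber's open-source H3 library. The sketch below (Python, assuming the v3 h3-py API and a hypothetical GeoJSON target polygon) shows one way to do so:

```python
import h3  # Uber's H3 geospatial indexing library (h3-py, v3 API assumed)

def build_frame(target_geojson, resolution=8):
    """Return the frame: the hex-8 sampling units tiling the targeted area."""
    # polyfill() returns the IDs of H3 cells whose centers fall inside the
    # polygon; geo_json_conformant=True treats coordinates as (lng, lat).
    return h3.polyfill(target_geojson, resolution, geo_json_conformant=True)

# Hypothetical targeted area expressed as a GeoJSON polygon (lng/lat pairs).
target = {
    "type": "Polygon",
    "coordinates": [[[-104.99, 39.74], [-104.95, 39.74],
                     [-104.95, 39.77], [-104.99, 39.77],
                     [-104.99, 39.74]]],
}
frame = build_frame(target)
print(f"{len(frame)} hex-8 sampling units in the frame")
```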
Figure 6(a) and (b) below illustrate these concepts:
Figure 6: Defining the Target Area, Frame, Units, and Sample for the Mobile Verification Process.
4.3 Stratified Random Sampling
Stratified random sampling occurs when: (1) a frame is divided into non-overlapping, mutually exclusive groups, such that every unit is in exactly one group, called a "stratum" (plural: strata) (this process is called stratification); and (2) units are independently selected at random within each stratum (i.e., the selection of units in one stratum does not affect the selection of units in another stratum). When properly implemented, a stratified design can simultaneously increase the precision of the desired estimates and decrease the total number of units in the sample (the sample size) required to meet this precision. William G. Cochran, Sampling Techniques ch. 5 (3rd ed. 1977). Ideally, stratification is accomplished by using prior knowledge about the quantity of interest. However, it is usually the case, especially for a new sampling exercise, that no such prior knowledge is available. In this case, one or more quantities directly related to (i.e., correlated with) the quantity of interest are used as stratification variables. Under the staff proposal, stratification would begin by first identifying which hex cells in the frame can be drive-tested to easily produce on-the-ground speed tests and then dividing the frame into drive-testable versus non-drive-testable hex cells, as illustrated in Figure 6(c). We would use the latest road data from the U.S. Census Bureau to make this determination. U.S. Census Bureau, 2020 TIGER/Line Shapefiles: Roads, https://www.census.gov/cgi-bin/geo/shapefiles/index.php?year=2020&layergroup=Roads (last visited June 22, 2021). In the stratum of drive-testable hex cells, the staff proposal would use terrain variation, denoted as X and measured in meters, as the second-level stratification variable. See Office of Economics and Analytics and Wireline Competition Bureau Adopt Adjustment Factor Values for the 5G Fund, GN Docket No. 20-32, Public Notice, 35 FCC Rcd 12975, 12976, paras. 4-5 (OEA/WCB 2020). Terrain variation is a viable stratification variable because it is correlated with broadband availability due to the characteristics of radiofrequency propagation. We would calculate a terrain variation value for every hex cell in the frame; additional stratification variables (for example, clutter or population) may also be used to stratify the drive-testable hex cells. We would use between 5 and 20 strata (depending on the number of drive-testable hex cells) and construct the strata using the cumulative square root of frequency rule, a standard stratification method that sets stratum boundaries at equal intervals not along the X scale itself, but along the cumulated square root of the count (frequency) of drive-testable hex cells tabulated over equal intervals of the X scale. Cochran, supra note 47 at 127-31. Figure 6(d) shows an example of stratification of the drive-testable hex cells.
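A minimal numerical sketch of the cumulative square root of frequency rule follows (Python with NumPy; the interval count, stratum count, and sample data are illustrative assumptions, not part of the proposal):

```python
import numpy as np

def cum_sqrt_f_strata(x, n_strata, n_intervals=100):
    """Assign each drive-testable hex cell to a stratum using the cumulative
    square root of frequency rule: tabulate counts of cells over equal
    intervals of the stratification variable X (terrain variation, meters),
    accumulate sqrt(count), and cut that cumulative scale into n_strata
    equal segments."""
    counts, edges = np.histogram(x, bins=n_intervals)
    csf = np.cumsum(np.sqrt(counts))
    # Stratum boundaries fall at equal steps along the cumulative sqrt(f) scale.
    targets = csf[-1] * np.arange(1, n_strata) / n_strata
    boundary_bins = np.searchsorted(csf, targets)
    boundaries = edges[boundary_bins + 1]  # convert bin indices to X values
    return np.digitize(x, boundaries)      # stratum label 0..n_strata-1 per cell

# Hypothetical terrain-variation values (meters) for 5,000 drive-testable cells.
rng = np.random.default_rng(0)
terrain_variation = rng.gamma(shape=2.0, scale=15.0, size=5000)
strata = cum_sqrt_f_strata(terrain_variation, n_strata=8)
print(np.bincount(strata))  # cells per stratum
```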
4.4 Sample Selection
In any sampling exercise, the question of what sample size, n, to use is of primary importance. In the context of the mobile verification proposal, we propose that staff would decide the value of n based on a set of statistical and logistical assumptions. The statistical assumption would be that the variance of the average speeds for the modeled technology cannot exceed a specified value, V. The logistical assumption would be that the cost of drive testing is constant in every drive-testable hex cell selected in the sample. Under these assumptions, a theoretical value for the sample size n can be calculated as detailed below. Let L denote the number of strata in the drive-testable hex cells and let the index h distinguish these L strata. Further, define the following quantities:
· Number of hex cells in stratum h: N_h (thus, N = Σ_{h=1}^L N_h)
· Weight of stratum h: W_h = N_h / N
· Mean of X in stratum h: X̄_h = (1/N_h) Σ_{i=1}^{N_h} X_{h,i}, where X_{h,i} is the value of the terrain variation X in the ith hexagon of stratum h
· Variance of X in stratum h: V(X)_h = Σ_{i=1}^{N_h} (X_{h,i} − X̄_h)² / (N_h − 1)
Under our proposal, the theoretical minimum sample size is given by: Id. at 98.
n = ( Σ_{h=1}^L W_h √V(X)_h )² / ( V + (1/N) Σ_{h=1}^L W_h V(X)_h )
Once determined, n would be allocated among the different strata. Specifically, if n_h is the number of sample hexagons allocated to stratum h, then:
n_h = n · W_h √V(X)_h / Σ_{h=1}^L W_h √V(X)_h = n · N_h √V(X)_h / Σ_{h=1}^L N_h √V(X)_h
This method of apportioning the sample among the various strata is called Neyman allocation. Id. at 99. Note that n = Σ_{h=1}^L n_h. Guided by this allocation scheme, staff would use geographic information systems (GIS) tools to independently select the sample hex cells in each stratum, ESRI, Create Random Points, https://desktop.arcgis.com/en/arcmap/latest/tools/data-management-toolbox/create-random-points.htm (last visited June 22, 2021). as illustrated in Figure 6(e). The provider subject to verification would then be notified of the sample hex cells in which it would be required to conduct on-the-ground speed tests.
4.5 Valid On-The-Ground Test Measurements
Under the staff proposal, providers subject to verification would be required to submit on-the-ground speed test measurement data taken at (stationary) points or along (mobile) drive paths entirely within each of the n randomly selected hex cells. Moreover, providers would be required to collect these data using mobile devices running either the FCC Speed Test app, another speed test app approved by OET for submitting challenges, or other software and hardware approved by staff. To ensure both geographic and temporal variation in the on-the-ground test data, the provider would need to follow requirements similar to those established in the BDC challenge process. Specifically, the provider would be required to conduct two tests in the same number of point-hexes as required of challengers for each hex-8 cell. Additionally, the provider would need to record at least two tests within the hex cell at times of day at least four hours apart. However, the minimum number of tests would be established for the stratum rather than the hex cell. There would therefore be no minimum number of positive tests for the hex cell, but if fewer than three accessible point-hexes would be required, a provider would need to conduct a minimum of five tests within the hex cell. Only after the provider has submitted on-the-ground speed test data satisfying all of the above validity requirements would we proceed with the calculation of the final estimate of broadband availability in the targeted area.
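Before turning to adjudication, the following worked sketch (Python with NumPy; the stratum summaries and the variance ceiling V are illustrative assumptions, not actual data) applies the section 4.4 sample-size formula and Neyman allocation:

```python
import numpy as np

# Illustrative stratum summaries: cell counts N_h and terrain-variation
# variances V(X)_h for L = 4 strata of drive-testable hex cells.
N_h = np.array([1200, 800, 500, 300])
VX_h = np.array([40.0, 110.0, 260.0, 590.0])  # variance of X within stratum
V = 0.5  # assumed ceiling on the variance of the estimate (illustrative)

N = N_h.sum()
W_h = N_h / N
S_h = np.sqrt(VX_h)  # stratum standard deviations

# Theoretical minimum sample size:
#   n = (sum W_h*S_h)^2 / (V + (1/N) * sum W_h*S_h^2)
n = (W_h @ S_h) ** 2 / (V + (W_h @ VX_h) / N)

# Neyman allocation: n_h proportional to N_h * S_h.
n_h = np.ceil(n * (N_h * S_h) / (N_h * S_h).sum()).astype(int)
print(int(np.ceil(n)), n_h)
```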
4.6 Calculation of the Overall Estimate of Broadband Availability and Adjudication of the Outcome of Verification
The question underlying the statistical sampling exercise for BDC mobile verifications is whether the provider offers adequate coverage meeting the minimum speeds throughout the targeted area. This section describes how we would determine an overall estimate of adequate broadband availability based upon the valid speed test data submitted by the provider in response to a verification request. Let m_{h,i} be the total count of speed test measurements in the ith sample hexagon of stratum h, where i = 1, …, n_h. Of these, let k_{h,i} be the number of positive test measurements, that is, measurements where both the download and upload speed values meet or exceed the minimum values the provider reports that it provides for a particular technology throughout the target. Let m_h = Σ_{i=1}^{n_h} m_{h,i} and k_h = Σ_{i=1}^{n_h} k_{h,i}. Finally, let:
p_h = k_h / m_h, h = 1, …, L
Appropriate for stratified random sampling, an estimate of the overall proportion of positive measurements in the target is given by: Cochran, supra note 47 at 107.
p̂ = Σ_{h=1}^L N_h p_h / N = Σ_{h=1}^L W_h p_h
and an estimate of its variance is:
V(p̂) = Σ_{h=1}^L W_h² [ p_h (1 − p_h) / n_h ] (1 − n_h/N_h)
Using these estimates, under the staff proposal we would then construct a one-sided 95% confidence interval for the true proportion of positive measurements, with upper limit calculated as p̂ + 1.645·√V(p̂). This one-sided 95% confidence interval is based on the assumption that the sample size is large enough to use the normal distribution. If this computed upper limit is greater than or equal to 0.9, which is the threshold proportion the Commission considers as delineating broadband availability for purposes of verification, we would determine that the provider has passed verification.
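The adjudication arithmetic above can be summarized in a short sketch (Python with NumPy; the per-stratum counts are illustrative assumptions, not actual test data):

```python
import numpy as np

# Illustrative per-stratum data: total cells N_h, sampled cells n_h, total
# valid tests m_h, and positive tests k_h aggregated across the sampled
# hexagons in each stratum.
N_h = np.array([1200, 800, 500, 300])
n_h = np.array([60, 55, 60, 50])
m_h = np.array([640, 580, 615, 505])
k_h = np.array([601, 540, 560, 466])

W_h = N_h / N_h.sum()
p_h = k_h / m_h

# Stratified estimate of the overall proportion of positive tests and its
# variance (with finite population correction 1 - n_h/N_h).
p_hat = W_h @ p_h
var_p = np.sum(W_h**2 * p_h * (1 - p_h) / n_h * (1 - n_h / N_h))

# One-sided 95% confidence interval; verification passes if the upper
# limit reaches the 90% availability threshold.
upper = p_hat + 1.645 * np.sqrt(var_p)
print(f"p_hat={p_hat:.3f}, upper limit={upper:.3f}, pass={upper >= 0.9}")
```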
APPENDIX B
Proposed Rules
We propose the following rule changes, subject to comment in the Public Notice:
1. Amend section 1.7001 by adding paragraph (a)(20) to read as follows:
§ 1.7001 Scope and Content of Filed Reports
(a) * * *
(1) * * *
(20) H3 standardized geospatial indexing system. A system developed by Uber that overlays the Earth with hexagonal cells of different sizes at various resolutions. The smallest hexagonal cells are at resolution 15, in which the average hexagonal cell has an area of approximately 0.9 square meters, and the largest are at resolution 0, in which the average hexagonal cell has an area of approximately 4.3 million square kilometers. Hexagonal cells at different resolutions are referred to as "hex-n" cells, where n is the resolution (e.g., "hex-15" for the smallest size hexagonal cell). The H3 geospatial indexing system employs a nested cell structure wherein a lower resolution hexagonal cell (the "parent") contains approximately seven hexagonal cells at the next highest resolution (its "children"). That is, a hex-1 cell is the "parent" of seven hex-2 cells, each hex-2 cell is the parent of seven hex-3 cells, and so on.
2. Amend section 1.7006 by revising paragraphs (b)(2), (b)(3), and (b)(4) and redesignating existing paragraphs (b)(2)-(b)(4) as paragraphs (b)(3)-(b)(5), revising paragraph (c), adding paragraphs (c)(1)-(c)(2), adding paragraphs (e)(2)(i)-(e)(2)(iii), revising paragraphs (e)(1)(iii), (e)(4), and (e)(6), adding paragraph (e)(7), and revising paragraphs (f)(1), (f)(2), (f)(3), and (f)(5) to read as follows:
§ 1.7006 Data Verification.
* * * * *
(b) * * *
(2) On-the-ground crowdsourced data shall include the same metrics described in paragraph (c)(1) of this section.
(3) The online portal shall notify a provider of a crowdsourced data filing against it, but a provider is not required to respond to a crowdsourced data filing.
(4) If, as a result of crowdsourced data, the Commission determines that a provider's coverage information is not accurate, then the provider shall be subject to a verification inquiry consistent with the mobile verification process described in paragraph (c)(1) of this section.
(5) All information submitted as part of the crowdsourcing process shall be made public via the Commission's website, with the exception of personally identifiable information and any data required to be confidential under § 0.457 of this chapter.
(c) Mobile service verification process for mobile providers. Mobile service providers shall submit either infrastructure information or on-the-ground test data in response to a request by Commission staff as part of its inquiry to independently verify the accuracy of the mobile provider's coverage propagation models and maps. In addition to submitting either on-the-ground data or infrastructure data, a provider may also submit data collected from transmitter monitoring software. The Office of Economics and Analytics and the Wireless Telecommunications Bureau may require the submission of additional data when necessary to complete a verification inquiry. A provider must submit its data, in the case of both infrastructure information and on-the-ground data, within 60 days of receiving a Commission staff request. For on-the-ground data, a provider must submit evidence of network performance based on a sample of on-the-ground tests that is statistically appropriate for the area tested.
(1) When a mobile service provider chooses to demonstrate mobile broadband coverage availability by submitting on-the-ground data, the mobile service provider shall provide valid on-the-ground tests within a Commission-identified statistically valid and unbiased sample of its network, and shall demonstrate that the sampled area meets a threshold percentage of positive tests, which are defined as tests that show speeds that meet or exceed the minimum download and upload speeds the mobile service provider reports as available at the location where the test occurred.
(i) On-the-ground test data shall meet the following testing parameters:
(A) A minimum test length of 5 seconds and a maximum test length of 30 seconds;
(B) Measurement results averaged over the duration of the test (i.e., total bits received divided by total test time); and
(C) Tests conducted outdoors between the hours of 6:00 a.m. and 10:00 p.m. local time.
(ii) On-the-ground test data shall include the following metrics for each test:
(A) Testing app name and version;
(B) Timestamp and duration of each test metric;
(C) Geographic coordinates at the start and end of each test metric, measured with typical Global Positioning System (GPS) Standard Positioning Service accuracy or better;
(D) Velocity of the vehicle, if applicable and available, for in-vehicle tests;
(E) Device make and model;
(F) Cellular operator name;
(G) Location of the server (e.g., hostname or IP address);
(H) Available signal strength, signal quality, and radiofrequency metrics of each serving cell;
(I) Download speed;
(J) Upload speed;
(K) Round-trip latency; and
(L) All other metrics required per the most recent specification for mobile test data released by the Office of Economics and Analytics and the Wireless Telecommunications Bureau.
(2) When a mobile service provider chooses to demonstrate mobile broadband coverage availability by submitting infrastructure data, the mobile service provider must submit such data for all cell sites that provide service for the targeted area.
(i) Infrastructure data shall include the following information for each cell site that the provider uses to provide service for the area subject to the verification inquiry:
(A) Geographic coordinates of the site, measured with typical GPS Standard Positioning Service accuracy or better;
(B) A unique site ID for the site;
(C) The ground elevation above mean sea level of the site;
(D) Frequency band(s) used to provide service for each site being mapped, including channel bandwidth (in megahertz);
(E) Radio technologies used on each band for each site;
(F) Capacity (Mbps) and type of backhaul used at each cell site;
(G) Number of sectors at each cell site;
(H) Effective Isotropic Radiated Power (EIRP);
(I) Geographic coordinates of each transmitter;
(J) Per-site classification (e.g., urban, suburban, or rural);
(K) Elevation above ground level for each base station antenna and other transmit antenna specifications (i.e., the make and model, beamwidth (in degrees), and orientation (azimuth and any electrical and/or mechanical down-tilt, in degrees) at each cell site);
(L) Operating transmit power of the radio equipment at each cell site;
(M) Throughput and associated required signal strength and signal-to-noise ratio;
(N) Cell loading distribution;
(O) Areas enabled with carrier aggregation and a list of band combinations (including the percentage of the handset population capable of using each band combination); and
(P) Any additional parameters and fields that are listed in the most recent specifications for wireless infrastructure data released by the Office of Economics and Analytics and the Wireless Telecommunications Bureau.
* * * * *
(e) * * *
(1) * * *
(iii) Speed test data. Consumer challenges shall include the test metrics described in paragraph (c)(1) of this section, and shall:
(A) Be performed outdoors;
(B) Indicate whether each test was taken in an in-vehicle mobile or outdoor pedestrian environment; and
(C) Be conducted using a speed test app that has been designated by the Office of Engineering and Technology, in consultation with the Office of Economics and Analytics and the Wireless Telecommunications Bureau, for use in the challenge process.
(2) * * *
(i) A hexagon at resolution 8 from the H3 standardized geospatial indexing system shall be classified as challenged if it satisfies the following criteria:
(A) Geographic Threshold. At least two valid speed tests, at least one of which is a "negative" test, are recorded in a minimum number of "point-hexes" of the resolution 8 hexagon, where:
(1) A test shall be defined as negative when the test does not meet the minimum predicted speeds based on the highest technology-specific minimum download and upload speeds reported for that area by the provider in its most recent coverage data;
(2) A point-hex shall be defined as one of the seven nested hexagons at resolution 9 from the H3 standardized geospatial indexing system within a resolution 8 hexagon;
(3) A point-hex shall be defined as accessible where at least 50% of the point-hex overlaps with the provider's reported coverage data and the point-hex overlaps with any primary, secondary, or local road from the most recent U.S. Census Bureau road data; and
(4) The minimum number of point-hexes in which tests must be recorded shall be equal to the number of accessible point-hexes or four, whichever is lower. If there are no accessible point-hexes within a resolution 8 hexagon, the geographic threshold shall not need to be met.
(B) Temporal Threshold. The difference in time of day between two negative tests is at least four hours, irrespective of calendar day; and
(C) Testing Threshold. At least five speed tests within a hex-8 cell are negative when a challenger has submitted 20 or fewer tests. When a challenger has submitted more than 20 tests, a certain minimum percentage of the total number of tests in the cell must be negative:
(1) When a challenger has submitted 21-29 tests, at least 24% must be negative;
(2) When a challenger has submitted 30-45 tests, at least 22% must be negative;
(3) When a challenger has submitted 46-60 tests, at least 20% must be negative;
(4) When a challenger has submitted 61-70 tests, at least 18% must be negative;
(5) When a challenger has submitted 71-99 tests, at least 17% must be negative; and
(6) When a challenger has submitted 100 or more tests, at least 16% must be negative.
(ii) In addition, a larger "parent" hexagon (at resolution 7 or 6) shall be considered challenged if at least four of its child hexagons are considered challenged. The smallest challengeable hexagonal cell is a hexagon at resolution 8 from the H3 standardized geospatial indexing system.
(iii) Mobile service providers shall be notified of all cognizable challenges to their mobile broadband coverage maps at the end of each month. Challengers shall be notified when a mobile provider responds to a challenge. Mobile service providers and challengers shall both be notified monthly of the status of challenged areas.
(3) * * *
(4) To dispute a challenge, a mobile service provider must submit on-the-ground test data, consistent with the metrics and methods described in paragraph (c)(1) of this section, or infrastructure data to verify its coverage map(s) in the challenged area. To the extent that a mobile service provider believes it would be helpful to the Commission in resolving a challenge, it may choose to submit other data in addition to the data initially required, including but not limited to infrastructure or on-the-ground testing data (to the extent such data are not the primary option chosen by the provider) or other types of data, such as data collected from network transmitter monitoring systems or software, or spectrum band-specific coverage maps. Such other data must be submitted at the same time as the primary on-the-ground testing or infrastructure rebuttal data submitted by the provider. If needed to ensure an adequate review, the Office of Economics and Analytics may also require that the provider submit other data in addition to the data initially submitted, including but not limited to infrastructure or on-the-ground testing data (to the extent not the option initially chosen by the provider) or data collected from network transmitter monitoring systems or software (to the extent available in the provider's network). If a mobile provider is not able to demonstrate sufficient coverage in a challenged hexagon, the mobile provider shall revise its coverage maps to reflect the lack of coverage in such areas.
(i) A mobile service provider that chooses to rebut a challenge to its mobile broadband coverage maps with on-the-ground speed test data shall confirm that a challenged area has sufficient coverage using speed tests conducted during the 12 months prior to submitting a rebuttal. A provider may confirm coverage in any hex-8 cell within the challenged area. This includes any hex-8 cell that is challenged, as well as any non-challenged hex-8 cell that is a child of a challenged hex-7, hex-6, or hex-5 cell. Confirming non-challenged hex-8 cells can be used to confirm the challenged hex-7, hex-6, or hex-5 cell. To confirm a hex-8 cell, a provider must submit on-the-ground speed test data that meets the following criteria:
(A) Geographic Threshold. Two speed tests, at least one of which is a positive test, are recorded within a minimum number of point-hexes within the challenged area, where:
(1) A test shall be defined as positive when the test meets both the minimum download and upload speeds predicted based on the highest technology-specific speeds reported for that area by the provider in its most recent coverage data;
(2) A point-hex shall be defined as one of the seven nested hexagons at resolution 9 from the H3 standardized geospatial indexing system within a resolution 8 hexagon;
(3) A point-hex shall be defined as accessible where at least 50% of the point-hex overlaps with the provider's reported coverage data and the point-hex overlaps with any primary, secondary, or local road from the most recent U.S. Census Bureau road data; and
(4) The minimum number of point-hexes in which tests must be recorded shall be equal to the number of accessible point-hexes or four, whichever is lower. If there are no accessible point-hexes within a resolution 8 hexagon, the geographic threshold shall not need to be met.
(B) Temporal Threshold. The difference in time of day between at least two positive tests is at least four hours, irrespective of calendar day; and
(C) Testing Threshold. At least 17 positive tests are recorded within a hex-8 cell in the challenged area when the provider has submitted 20 or fewer tests. When the provider has submitted more than 20 tests, a certain minimum percentage of the total number of tests in the cell must be positive:
(1) When a provider has submitted 21-34 tests, at least 82% must be positive;
(2) When a provider has submitted 35-49 tests, at least 84% must be positive;
(3) When a provider has submitted 50-70 tests, at least 86% must be positive;
(4) When a provider has submitted 71-99 tests, at least 87% must be positive; and
(5) When a provider has submitted 100 or more tests, at least 88% must be positive.
(D) Tests shall be conducted using a mobile device running either a Commission-developed app (e.g., the FCC Speed Test app), another speed test app approved by OET to submit challenges, or other software and hardware approved by staff; and
(E) Tests shall be conducted using a device that is engineering-capable and able to interface with drive test software and/or runs on the Android operating system.
(ii) A mobile service provider that chooses to rebut a challenge to its mobile broadband coverage maps with infrastructure data may only do so in order to identify invalid, or non-representative, speed tests within the challenged speed test data.
A provider may claim that challenge speed tests were invalid, or non-representative, if:
(A) Extenuating circumstances at the time and location of a given test (e.g., maintenance or a temporary outage at the cell site) caused service to be abnormal;
(B) The mobile device(s) with which the challenger(s) conducted their speed tests do not use or connect to the spectrum band(s) that the provider uses to serve the challenged area;
(C) The challenge speed tests were taken during an uncommon special event (e.g., a professional sporting event) that increased traffic on the network; or
(D) The challenge speed tests were taken during a period when cell loading exceeded the modeled cell loading factor.
(iii) If the Commission determines, based on the infrastructure data submitted by providers, that challenge speed tests are invalid, such challenge speed tests shall be ruled void, and the Commission shall recalculate the challenged hexagons after removing any invalidated challenger speed tests and consider any challenged hexagons that no longer meet the challenge creation threshold to be restored to their status before the challenge was submitted.
(iv) Aside from the scenarios described in paragraphs (e)(4)(ii)(A)-(D) of this section, the Commission shall use infrastructure data on its own to adjudicate a challenge only upon a showing by the provider that collecting on-the-ground or other data (beyond infrastructure information) would be infeasible or unlikely to provide an accurate depiction of network coverage. In such a situation, the Commission shall evaluate infrastructure data using the same process the Commission uses to verify providers' coverage maps.
(5) * * *
(6) After a challenged provider submits all responses and Commission staff have determined the result of the challenge and any subsequent rebuttal:
(i) In cases where a mobile service provider successfully rebuts a challenge, the area confirmed to have coverage shall be ineligible for challenge until the first time the mobile service provider files its biannual filing information six months after the end of the 60-day response period.
(ii) A challenged area may be restored to an unchallenged state if, as a result of data submitted by the provider, there is no longer sufficient evidence to sustain the challenge to that area, but the provider's data fall short of confirming the area. A restored hexagon would be subject to challenge at any time in the future as challengers submit new speed test data.
(iii) In cases where a mobile service provider concedes or loses a challenge, the provider must file, within 30 days, geospatial data depicting the challenged area that has been shown to lack sufficient service. Such data will constitute a correction layer to the provider's original propagation model-based coverage map, and Commission staff will use this layer to update the broadband coverage map. In addition, to the extent that a provider does not later improve coverage for the relevant technology in an area where it conceded or lost a challenge, it must include this correction layer in its subsequent filings to indicate the areas shown to lack service.
(7) Commission staff are permitted to consider other relevant data to support a mobile service provider's rebuttal of challenges, including on-the-ground data or infrastructure data, to the extent it was not previously submitted by a mobile service provider.
The Office of Economics and Analytics will review such data when voluntarily submitted by providers in response to consumer challenges and, if it concludes that any of the data sources are sufficiently reliable, will specify appropriate standards and specifications for each such type of data and add it to the alternatives available to providers to rebut a consumer challenge.
(f) * * *
(1) * * *
(i) Government and other entity challengers may use their own software to collect data for the challenge process. When they submit their data, the data must meet the test metrics described in paragraphs (c)(1)(i)-(ii) of this section. Additionally, their data must contain the following metrics for each test:
(2) Challengers must conduct speed tests using a device advertised by the challenged service provider as compatible with its network and must take all speed tests outdoors. Challengers must also use a device that is engineering-capable and able to interface with drive test software and/or runs on the Android operating system.
(3) For a challenge to be considered a cognizable challenge, thus requiring a mobile service provider response, the challenge must meet the same thresholds specified in paragraph (e)(2)(i) of this section.
(4) * * *
(5) To dispute a challenge, a mobile service provider must submit on-the-ground test data or infrastructure data to verify its coverage map(s) in the challenged area, based on the methodology set forth in paragraph (e)(4) of this section. To the extent that a service provider believes it would be helpful to the Commission in resolving a challenge, it may choose to submit other data in addition to the data initially required, including but not limited to infrastructure or on-the-ground testing data (to the extent such data are not the primary option chosen by the provider) or other types of data, such as data collected from network transmitter monitoring systems or software or spectrum band-specific coverage maps. Such other data must be submitted at the same time as the primary on-the-ground testing or infrastructure rebuttal data submitted by the provider. If needed to ensure an adequate review, the Office of Economics and Analytics may also require that the provider submit other data in addition to the data initially submitted, including but not limited to infrastructure or on-the-ground testing data (to the extent not the option initially chosen by the provider) or data collected from network transmitter monitoring systems or software (to the extent available in the provider's network).
(6) * * *
(7) * * *
3. Amend section 1.7008 by revising paragraph (d)(2) to read as follows:
§ 1.7008 Creation of broadband internet access service coverage maps.
* * * * *
(d)(1) * * *
(2) To the extent government entities or third parties choose to file verified data, they shall follow the same filing process as providers submitting their broadband internet access service data in the data portal. Government entities and third parties that file on-the-ground test data shall submit such data using the same metrics and testing parameters that the Commission requires of mobile service providers when responding to a Commission request to verify mobile providers' broadband network coverage with on-the-ground data (see 47 CFR § 1.7006(c)(1)).
(3) * * *
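For illustration only (this is not proposed rule text), the challenger-side testing threshold in proposed § 1.7006(e)(2)(i)(C) could be sketched as follows, with tier boundaries mirroring the rule:

```python
import math

# Challenger testing-threshold tiers from proposed § 1.7006(e)(2)(i)(C):
# (maximum total tests, required share of negative tests).
CHALLENGE_TIERS = [(29, 0.24), (45, 0.22), (60, 0.20), (70, 0.18), (99, 0.17)]

def challenge_testing_threshold_met(total_tests, negative_tests):
    """Check whether a hex-8 cell's speed tests satisfy the challenger
    testing threshold for a cognizable challenge."""
    if total_tests <= 20:
        # With 20 or fewer tests, at least five negative tests are required.
        return negative_tests >= 5
    for max_total, share in CHALLENGE_TIERS:
        if total_tests <= max_total:
            return negative_tests >= math.ceil(share * total_tests)
    return negative_tests >= math.ceil(0.16 * total_tests)  # 100 or more tests
```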