Federal Communications Commission
Washington, D.C. 20554

Geoffrey Starks
Commissioner

March 6, 2023

Alan Davidson
Assistant Secretary of Commerce for Communications and Information
Administrator, National Telecommunications and Information Administration
U.S. Department of Commerce
1401 Constitution Avenue NW
Washington, DC 20230

Re: Privacy, Equity, and Civil Rights Request for Comment, NTIA-2023-0001

Dear Assistant Secretary Davidson:

Privacy issues affect all Americans. Where such critical issues cut across multiple government agencies' purviews, it is vital that we work together for the benefit of the public. I want to commend you and the National Telecommunications and Information Administration ("NTIA") for doing just that by seeking comment to inform your upcoming report on whether and how commercial data practices may lead to outsized harms for marginalized or disadvantaged communities. You and I have worked together closely on a number of issues, and I appreciate the opportunity to lend my voice to this effort.

As a commissioner on the Federal Communications Commission ("FCC"), I have seen technology play an ever-larger part in our lives. Technological advancements can and do yield tremendous benefits, but as regulators, we must be attuned to their attendant costs and remain vigilant that these developments do not harm the most vulnerable members of our society. I am particularly focused on issues of equity, and on combating practices like algorithmic bias that can lead to unequal access to, or quality of, communications services.

I want to share three examples with you. Each falls within a core area of FCC authority and expertise but, as technology grows and shifts, has also come to involve commercial data practices with the potential to negatively affect minority and underserved communities. My hope is that you find these examples illustrative as you examine the impact of these practices more broadly.

Digital discrimination in broadband internet access. One of my primary goals as a commissioner is to ensure equal access to high-quality, affordable broadband internet access service for all Americans. As the last few years have highlighted, the internet touches nearly every aspect of daily living, from school, to work, to health, to the fabric of our social connections. Recognizing the necessity of internet access in today's world, Congress, in the Infrastructure Investment and Jobs Act, charged the FCC with promulgating rules to prevent and eliminate "digital discrimination of access based on income level, race, ethnicity, color, religion, or national origin."[1] The FCC has long used its pre-existing authority to ensure universal service and reasonable access to communications technology, and we have a proceeding currently underway to adopt rules in response to Congress's latest directive. In that proceeding, we asked what policies and practices we should review in order to combat and eradicate digital discrimination.
A common thread ran through many of the practices commenters raised: they are animated by data collection and usage.[2] For example, commenters stated that bias in algorithms used to determine where to deploy internet service, and what level of service to deploy, could result in zip codes populated by minority communities being disproportionately underserved.[3] Commenters also argued that the data underlying network upgrades and maintenance, advertising and marketing, subscription pricing models, and privacy and security may contribute to the digital divide.[4] Of course, the same data usage practices, and others like them, may implicate other services and may likewise widen gaps in access to technology and other socioeconomic disparities. Thus, while the FCC examines these practices from the perspective of prohibiting digital discrimination of access, I suggest you also consider how they may contribute to the current digital divide and other disparities.

Automated speech recognition and facial recognition. The FCC is charged with ensuring that those who are deaf, hard of hearing, deafblind, or who have speech disabilities can communicate in a manner that is functionally equivalent to those without such disabilities.[5] Accordingly, we regulate and fund telecommunications relay services ("TRS"), including internet protocol captioned telephone service ("IP CTS"), which allows a user to simultaneously listen to the other party in a telephone conversation and read captions of what the other party is saying. This captioning service can be provided by an individual, called a communications assistant, or on a fully automated basis by automated speech recognition ("ASR"). But while we are barreling ahead toward a more automated world, it is still unclear how accurate, and therefore how reliable, ASR is.
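To make concrete what "accuracy" means here: ASR systems are conventionally scored by word error rate ("WER"), the fraction of words a system gets wrong relative to a human reference transcript. The short Python sketch below is purely illustrative; the transcripts are invented for this example and are not drawn from any FCC proceeding or study. It shows how a WER gap between two speaker groups can be measured.

```python
# Illustrative only: word error rate (WER) is the standard ASR accuracy
# metric. All transcripts below are hypothetical.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / reference length,
    computed with a standard Levenshtein edit distance over words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# Hypothetical per-group samples: (reference transcript, ASR output).
samples = {
    "group_a": [("please call me back this afternoon",
                 "please call me back this afternoon")],
    "group_b": [("please call me back this afternoon",
                 "please fall me back after noon")],
}

for group, pairs in samples.items():
    rates = [word_error_rate(ref, hyp) for ref, hyp in pairs]
    print(group, f"average WER: {sum(rates) / len(rates):.0%}")
```

A systematic gap in average WER between demographic groups, like the 35% versus 19% disparity reported in the study cited below, is what "algorithmic bias in ASR" refers to in this context.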
ASR needs to be both accurate and reliable: those who rely on IP CTS do so in every context, from staying in touch with loved ones, to talking to employers and prospective employers, to seeking help when they need it, including by dialing 988 and 911. In particular, and as I have raised in multiple FCC proceedings, I remain concerned about potential algorithmic bias in ASR.[6] Studies have shown that speech recognition systems make far more errors when transcribing the speech of Black speakers than the speech of white speakers.[7] This mirrors the troubling bias that facial recognition programs often exhibit. Of course, TRS is hardly the only service that employs ASR, and bias in facial recognition, though related to bias in ASR, is not presently an issue in TRS programs. While the FCC examines the use of ASR in the TRS we regulate, I suggest you consider how algorithmic bias in both speech and facial recognition systems may lead to discriminatory effects in the various services in which they are embedded.

Advertisements to protected groups. Finally, as data collection increasingly powers advertising-supported services, I am focused on the impact of advertising technology on protected groups. Certain groups have historically had legal protection when it comes to advertising. For example, at the FCC, we regulate the amount and content of advertising that can be aired during children's programming, pursuant to the Children's Television Act.[8] I continually raise this point when speaking about the new broadcast television transmission standard, ATSC 3.0. The FCC is currently overseeing the broadcast industry's transition to this new, IP-based standard, which promises multiple benefits, including more free, over-the-air content with higher-quality picture and audio. However, the new standard will also enable broadcasters to collect much more viewer data than they currently do, which could be used to deliver targeted advertising, among other purposes.[9]

I have made this one of my key issues because of the unique opportunity we have at this moment: while this technology is still developing, we can make sure that broadcasters are good actors in the targeted advertising market from the start.
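To illustrate the kind of capability at stake, the sketch below shows how even simple per-device viewing logs, of the sort an IP-based broadcast standard could make available, can be turned into audience segments for ad targeting. The log schema and segmentation rule are hypothetical, invented for this illustration; they do not describe any actual broadcaster's or ATSC 3.0 system's behavior.

```python
# Hypothetical illustration: deriving ad-targeting segments from per-device
# viewing logs. The fields and rules are invented for this sketch.
from collections import defaultdict

# Each log entry: (device_id, program_genre, minutes_watched).
viewing_log = [
    ("device-1", "childrens", 95),
    ("device-1", "news", 10),
    ("device-2", "news", 120),
    ("device-2", "sports", 45),
]

# Aggregate minutes watched per device per genre.
profiles = defaultdict(lambda: defaultdict(int))
for device, genre, minutes in viewing_log:
    profiles[device][genre] += minutes

# A trivial segmentation rule: tag each device with its dominant genre.
# A device that mostly watches children's programming is plausibly a
# household with children -- exactly the kind of inference that raises
# the protected-group concerns discussed above.
segments = {
    device: max(genres, key=genres.get)
    for device, genres in profiles.items()
}
print(segments)  # {'device-1': 'childrens', 'device-2': 'news'}
```

The point of the sketch is that the inference step, not merely the collection step, is where targeting of protected groups such as children can enter.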
Getting ahead of the problem in this way stands in stark contrast to the challenges regulators typically face, where they are racing to unwind harms that have already occurred in the advertising market. While the FCC evaluates opportunities to protect consumers in connection with the transition to ATSC 3.0, I suggest you also find ways to get in front of technological and business model evolutions to attack privacy problems from the start.[10] Advertising technology more broadly may be an ideal candidate: while digital ads have been around for many years, we are already seeing signs that the digital ad marketplace is on the cusp of a significant evolution, due to changes in consumer preferences, policy shifts, and the operation of consumer devices.

* * *

When it comes to critical issues like these, government must work together to achieve the digital future all Americans deserve. As I continue to work on these issues at the FCC, I urge you, as NTIA crafts its upcoming report, to consider how other agencies within the federal government can use their respective authorities to fill in the regulatory gaps on data practices and privacy to protect marginalized communities.

Respectfully submitted,

Commissioner Geoffrey Starks
Federal Communications Commission
45 L Street, NE
Washington, DC 20554

_______________

[1] Infrastructure Investment and Jobs Act, Pub. L. No. 117-58, § 60506(b), 135 Stat. 429 (2021). In addition to this directive to the FCC, the Act sets forth a broad statement of policy: to the extent technically and economically feasible, "subscribers should benefit from equal access to broadband internet access service within the service area of a provider" and the "Commission should take steps to ensure that all people of the United States benefit from equal access to broadband internet access service." The Act defines equal access as "the equal opportunity to subscribe to an offered service that provides comparable speeds, capacities, latency, and other quality of service metrics in a given area, for comparable terms and conditions." Id. § 60506(a).

[2] See Implementing the Infrastructure Investment and Jobs Act: Prevention and Elimination of Digital Discrimination, Notice of Proposed Rulemaking, GN Docket No. 22-69, at ¶¶ 31-32 (2022).

[3] See, e.g., Comments of Free Press, GN Docket No. 22-69, at pp. 8-9 (filed May 16, 2022); Comments of The Utility Reform Network (TURN), GN Docket No. 22-69, at pp. 18-19 (filed May 16, 2022).

[4] See, e.g., Comments of Lawyers' Committee for Civil Rights Under Law, GN Docket No. 22-69, at pp. 18-19 (filed May 16, 2022); Comments of Leadership Conference on Civil and Human Rights, GN Docket No. 22-69, at p. 5 (filed May 16, 2022); Comments of Multicultural Media, Telecom and Internet Council, GN Docket No. 22-69, at p. 10 (filed May 16, 2022); Comments of National Urban League, GN Docket No. 22-69, at p. 4 (filed May 16, 2022).

[5] See 47 U.S.C. § 225.

[6] See Internet Protocol Captioned Telephone Service Compensation et al., Notice of Proposed Rulemaking and Order on Reconsideration, CG Docket No. 22-408 et al., at ¶ 16, Statement of Commissioner Geoffrey Starks (2022); Misuse of Internet Protocol (IP) Captioned Telephone Service et al., Report and Order, Order on Reconsideration, and Further Notice of Proposed Rulemaking, CG Docket No. 13-24 et al., Statement of Commissioner Geoffrey Starks (2020).

[7] See Allison Koenecke et al., "Racial disparities in automated speech recognition," Proceedings of the National Academy of Sciences, Vol. 117, No. 14 (2020), https://www.pnas.org/doi/10.1073/pnas.1915768117 (Stanford University study of "state-of-the-art ASR systems" developed by five major tech companies, finding an average word error rate of 35% for Black speakers compared to 19% for white speakers).

[8] See 47 U.S.C. § 303a; 47 C.F.R. § 73.670.

[9] See Commissioner Geoffrey Starks, Remarks on the Future of Broadcast Television at the University of Pennsylvania Carey Law School Center for Technology, Innovation & Competition (Oct. 18, 2022), https://www.fcc.gov/document/starks-remarks-future-broadcast-television.

[10] The FCC has wide-ranging regulatory authority over broadcasters. See 47 U.S.C. § 309(a).