Federal Communications Commission                                                FCC 23-101

STATEMENT OF CHAIRWOMAN JESSICA ROSENWORCEL

Re: Implications of Artificial Intelligence Technologies on Protecting Consumers from Unwanted Robocalls and Robotexts, CG Docket No. 23-362, Notice of Inquiry (November 15, 2023)

If Tom Hanks called, I would pick up the phone. If he spoke in a familiar way during that call, I would definitely listen. To be clear, the star of Big and Saving Private Ryan is not dialing me anytime soon. But a video using his voice is on the internet hawking dental plans. None of this is happening with his permission. It is happening because scam artists are playing with artificial intelligence and testing our ability to separate vocal fact from fiction in order to commit fraud.

Now imagine instead a call from a friend or family member. Of course you pick up. But maybe that voice sounds off and something feels wrong. Maybe it is because the individual you think is on the other end of the line is telling you about an imminent emergency and pleading with you to send money. Like the hard sell from Tom Hanks, it is also a scam. That is because you are not actually talking to who you think you are. You are speaking with a con artist using artificial intelligence to clone the voice of someone you know.

If this future sounds far off, think again. We see on the internet how fraudsters are already playing with this technology. We know that scam artists want to explore ways to use this technology over the phone. I recently had the opportunity to sit down with AARP and talk about what the combination of unwanted robocalls and robotexts and artificial intelligence will mean for consumers. I learned how voice cloning scams are growing and how they can cause special harm for older adults. Imagine, for instance, a grandparent answering what they believe is a call from their grandchild, only to learn it was a fraudster on the other end of the line, preying on their willingness to send money to family.

The anxiety about these technology developments is real. Rightfully so. But I think we make a mistake if we focus only on the potential for harm. We need to focus equally on how artificial intelligence can radically improve the tools we have today to block unwanted robocalls and robotexts. We are talking about technology that can see patterns in our network traffic unlike anything we have today. This can lead to the development of analytic tools that are exponentially better at finding fraud before it ever reaches us at home. Used at scale, we can not only stop this junk, we can help restore trust in our networks.

That is why today we are launching an inquiry to ask how artificial intelligence is being used right now to recognize patterns in our network traffic and how it could be used in the future. We know the risks that this technology involves, but we also want to harness the benefits, just as the recently released Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence recommends.

That is not to say this will be easy. As Tom Hanks said playing the ragtag coach Jimmy Dugan in A League of Their Own, “the hard . . . is what makes it great.” We have work to do to harness artificial intelligence for good. But I am an optimist and I believe this is possible. So let’s get to it. Let’s see how we can use artificial intelligence to get this junk off the line.
I want to thank the staff responsible for our efforts today, including Jerusha Burnett, Zac Champ, Aaron Garza, Josh Mendelsohn, Michael Scott, Suzy Rosen Singleton, Richard Smith, Mark Stone, Kristi Thornton, and George Phelan from the Consumer and Governmental Affairs Bureau; Kristi Thompson from the Enforcement Bureau; Richard Mallen, Marcus Maher, Michele Ellison, Jeff Steinberg, Royce Sherlock, and Wade Lindsay from the Office of General Counsel; Michelle Schaefer and Andrew Wise from the Office of Economics and Analytics; Martin Doczkat and Dana Shaffer from the Office of Engineering and Technology; Michael Antonino, Maureen Bizhko, Kenneth Carlberg, Shawn Cochran, Gerald English, John Evanoff, David Furth, David Sieradzki, Austin Randazzo, and James Wiley from the Public Safety and Homeland Security Bureau; Jonathan Lechter from the Wireline Competition Bureau; and Arpan Sura and Paul Powell from the Wireless Telecommunications Bureau.