| Speaker | Time | Text |
|---|---|---|
| unidentified | | Constitution Center President and CEO Jeffrey Rosen leads a discussion of presidential historians on inaugural addresses and how they shape a president's legacy. |
| Exploring the American story, watch American History TV every weekend and find a full schedule on your program guide or watch online anytime at c-span.org/history. | ||
| C-SPAN. Democracy Unfiltered. | ||
| We're funded by these television companies and more, including Comcast. | ||
| You think this is just a community center? | ||
| No. | ||
| It's way more than that. | ||
| Comcast is partnering with a thousand community centers to create Wi-Fi-enabled Lift Zones so students from low-income families can get the tools they need to be ready for anything. | ||
| Comcast supports C-SPAN as a public service, along with these other television providers, giving you a front-row seat to democracy. | ||
| The Senate Judiciary Committee held a hearing on strengthening online safety protections for children, featuring lawmakers, advocates, and legal experts. | ||
| During their remarks, witnesses fielded questions on federal regulations, age restrictions for online content, and artificial intelligence. | ||
| This runs two hours and ten minutes. | ||
| Good morning, everybody. | ||
| In today's digital era, our young people face risks that previous generations couldn't even have imagined. | ||
| Even though technology brings amazing opportunities for education and growth, it also opens doors to new dangers that we must confront. | ||
| This isn't the first hearing we've had on this issue, and unfortunately, probably won't be the last. | ||
| We held a hearing on this same subject roughly a year ago when we brought CEOs from some of the largest social media companies to discuss safety issues on their platforms. | ||
| And we held a similar hearing a year before that. | ||
| On the one hand, this is alarming because the problems are getting worse. | ||
| In 2023, for instance, the NCMEC CyberTipline received 36.2 million reports of suspected online child sexual exploitation, a 12% increase over 2022. | ||
| And even though the numbers haven't been published for 2024, it seems that they're expected to go up. | ||
| Additionally alarming are the new technologies that are being used by bad actors to exploit children online. | ||
| Predators can use generative AI for instance to take normal images of children and manipulate them to create novel forms of CSAM. | ||
| In 2024 alone, NCMEC reported almost 61,000 instances of generative artificial intelligence CSAM. | ||
| Despite this, so far, Congress has enacted no significant legislation to address these dangers against children. | ||
| And tech platforms have been unhelpful in our legislative efforts. | ||
| Big tech promises to collaborate, but they're noticeably silent in supporting legislation that would effect meaningful change. | ||
| In fact, big tech's lobbyists swarm this hill armed with red herrings and scare tactics suggesting that we'll somehow break the internet if we implement even these very modest reforms. | ||
| Meanwhile, these tech platforms generate revenues that dwarf the economies of most nations. | ||
| So how do they make so much money? | ||
| They do it by compromising our data and privacy and keeping our children's eyes glued to the screens through addictive algorithms. | ||
| Indeed, in one recent study, 46% of teens reported that they're online, quote, almost constantly, end quote. | ||
| This has had severe mental consequences for adolescents. | ||
| It has also led to a rise in sexual exploitation as some algorithms have actually connected victims to their abusers. | ||
| Should such tech platforms be allowed to profit at the expense of our children's privacy, our children's safety, and our children's health? | ||
| Should they be allowed to contribute to a toxic digital ecosystem without being held accountable? | ||
| I believe the answer is very clear to everybody. | ||
| When these platforms fail to implement adequate safety measures, they're complicit in harms that follow, and they should be held accountable. | ||
| That said, there are some signs of encouragement. | ||
| Just as new technologies are being developed that exacerbate harm to children online, so too are technologies being developed to combat exploitation. | ||
| As one example, with AI rapidly evolving, open source safety tools are being developed to recognize and report CSAM. | ||
| Some of the witnesses here today will speak to the effectiveness of these tools. | ||
| Additionally, on a committee with some of the most diverse viewpoints in the United States Senate, we have actually advanced bipartisan legislation that addresses legal gaps in our current framework, especially those related to the blanket immunity that Section 230 provides. | ||
| Last Congress, for example, we reported several bills, online safety bills, out of committee with overwhelming bipartisan support. | ||
| And there are a number of bills that are being considered and refined this Congress, which we'll give attention to in due course. | ||
| That being said, we can't come up with a wise and effective legislative solution without first understanding the nature and scope of the problem. | ||
| And so that's why we're having this hearing today. | ||
| Our witnesses come from various backgrounds and represent very diverse perspectives, all of which point to the need for our committee to improve legislation and continue our work to keep kids safe. | ||
| So with that, I'll open things up to Ranking Member Durbin to give opening remarks. | ||
| After that, we'll hear from Senators Blackburn and Klobuchar. | ||
| Then I'll introduce the witnesses and swear them in. | ||
| Go ahead. | ||
| I want to personally thank you, Senator Grassley. | ||
| This is unusual, a change in leadership in this committee, and yet an issue which we took up very seriously in the last few years on a bipartisan basis has survived the change. | ||
| And in fact, this hearing is evidence of the determination of the chairman. | ||
| I'd like to join him in that assurance that we're taking this issue very seriously. | ||
| It was almost exactly two years ago this committee held a similar hearing. | ||
| We heard from six witnesses about the harm social media does to our kids and grandkids. | ||
| A mom whose son took his own life after he was bullied online. | ||
| A young woman whose mental and physical health suffered as she chased the unattainable lifestyle depicted on Instagram and other apps. | ||
| Experts who told us how big tech designs their platforms to be addictive, keeping users online for longer and longer and longer time so they can be fed more targeted ads. | ||
| Individuals combating the tidal wave of child sexual abuse material or CSAM flowing across the internet. | ||
| At the end of that hearing, I told the witnesses and the many parents and young people in the audience I was going to roll up my sleeves, get to work, and pass legislation to protect kids from online harms. | ||
| That spring, the committee reported five bills that helped protect kids online and included my Stop CSAM Act. | ||
| I want to thank Senator Hawley for joining me in that effort, which we hope to renew soon, along with bipartisan bills from Senators Graham, Blumenthal, Klobuchar, Cornyn, Blackburn, and Ossoff. | ||
| These bills were reported out of this committee unanimously. | ||
| For anyone who's a newcomer to Capitol Hill or to this committee, you have the American political spectrum from one end to the other on this committee. | ||
| And for us to do anything unanimously is nothing short of a political miracle. | ||
| We did it. | ||
| The Senate Judiciary Committee contains members across the spectrum, the most conservative Republican to the most progressive Democrat. | ||
| It's almost unheard of to pass a bill unanimously, yet we did it five times. | ||
| One of these bills, the Report Act, later signed into law by President Biden, strengthened the CyberTipline run by the National Center for Missing and Exploited Children. | ||
| As for the rest, different story. | ||
| Big Tech opened up a $61.5 million lobbying war chest to make sure these bills never became law. | ||
| Now, let's be clear. | ||
| None of these bills are the silver bullet that would make the internet completely safe for our kids. | ||
| But they would be significant steps towards finally holding tech companies accountable for the harms that they caused, the damage that they caused, the death that they caused. | ||
| And that's why the tech companies opposed them as strongly as they did. | ||
| They didn't do it publicly. | ||
| Publicly, oh, it's such a great idea. | ||
| But privately, they just beat the hell out of us. | ||
| So just over a year ago, I called in the CEOs of the five major tech platforms. | ||
| Some I had to issue subpoenas to demand answers on the record under oath. | ||
| And that hearing produced some results. | ||
| Several companies implemented child safety improvements just days before their CEOs came to testify. | ||
| And Meta CEO Mark Zuckerberg, under pressure from Senator Hawley's artful questioning, gave a long overdue apology to the parents his platform had hurt. | ||
| But apologies and too little, too late reforms are simply not enough. | ||
| The dozens of parents and survivors in that room, the thousands impacted across the country, demand more. | ||
| And I, for one, plan to follow through. | ||
| In the coming weeks, Senator Hawley and I will reintroduce the Stop CSAM Act. | ||
| This bill will finally open the courthouse door to families whose children have been victimized due to big tech's failure to safeguard their online platforms. | ||
| I hope Senator Grassley will help me schedule a timely markup on that bill. | ||
| And this week, I'll join Senators Graham, Whitehouse, Hawley, Klobuchar, and Blackburn to introduce a bill to sunset Section 230 of the Communications Decency Act in two years. | ||
| This is long overdue. | ||
| Section 230 and the legal immunity it provides to big tech has been on the books since 1996, long before social media was part of our lives. | ||
| To the extent this protection was ever needed, its usefulness for this so-called fledgling industry has long since passed. | ||
| I'm under no illusion that it'll be easy to pass legislation to protect kids online and finally make the tech industry legally accountable for the damage that they're causing. | ||
| But they ought to face the same liability as every other industry in America. | ||
| Just last year, big tech and its allies in the House killed a bill, the Kids Online Safety and Privacy Act, a bill introduced, I believe, by Senators Blumenthal and Blackburn. | ||
| That would have imposed a basic duty of care on tech platforms. | ||
| It passed the Senate 91 to 3, but big tech killed it in the House. | ||
| It couldn't even get up for a vote. | ||
| The National Center for Missing and Exploited Children receives 100,000 reports to its CyberTipline every single day. | ||
| That's not just a statistic. | ||
| Each of these reports involves a victim. | ||
| It could be anything from images of a toddler being raped to a teenager being coerced, extorted, groomed, and encouraged to commit suicide. | ||
| 100,000 reports in the United States every single day. | ||
| I hope everyone keeps that in mind as we hold this hearing. | ||
| We cannot wait. | ||
| We have to move. | ||
| I hope it drives the public to demand Congress finally do something. | ||
| Senator Blackburn. | ||
| Thank you, Mr. Chairman. | ||
| And I want to say thank you to our witnesses for being here today. | ||
| And Mr. Guffey, we appreciate that you are here and sharing your story. | ||
| And Mr. Chairman, you mentioned that it was over a year ago that we had tech execs in front of us and that nothing much has changed. | ||
| That is the tragic part of this situation, that nothing much has changed. | ||
| There's been window dressing. | ||
| There have been ads that have been run saying, look at us, look at what we're doing. | ||
| But unfortunately, there is no enforcement to this. | ||
| That is why it is still dangerous for kids to be online. | ||
| They're still facing online threats, exposure, sexual exploitation, drug trafficking, promotion of suicide, eating disorders. | ||
| And the thing that is so interesting is in the physical world, there are laws against this. | ||
| It is only in the virtual space that it remains the Wild West. | ||
| And our children can be attacked every single day, non-stop, 24-7, 365. | ||
| It is long overdue. | ||
| And the Kids Online Safety Act that Senator Blumenthal and I have worked on for years now has been mentioned already this morning. | ||
| And there is such a broad bipartisan coalition, whether it's parents, principals, teachers, pediatricians, child psychologists, even teens themselves, have come to us and have said something needs to be done about this. | ||
| We have had companies like Microsoft, X, SNAP, who have supported this bill. | ||
| Unfortunately, kids are still being harmed online. | ||
| I talked to a mom recently whose child died. | ||
| They met somebody online who sold them what was supposedly a Xanax. | ||
| They met them on SNAP. | ||
| They took what they thought was a Xanax and they died. | ||
| It was fentanyl. | ||
| So these are the dangers that are there. | ||
| And while there is broad bipartisan support, Senator Grassley mentioned the lobbying efforts of some of the big tech firms and how they went with distortions and lies to the House. | ||
| And this bill did not get through. | ||
| So it is time to stop this and get it passed. | ||
| Now, Senator Durbin mentioned the bills we sent out of committee here last year. | ||
| There was one that got signed into law and it was the bill that Senator Ossoff and I did, the Report Act. | ||
| And this deals with NCMEC's CyberTipline and increases the time that evidence submitted to NCMEC has to be preserved, and it gives law enforcement more time to investigate, to get these criminals into court, and then get them locked up. | ||
| And we still have so much work to do. | ||
| Now Senator Klobuchar and I are going to lead the Privacy, Technology, and the Law Subcommittee, and these issues will be coming before us. | ||
| We've got plenty of work to do. | ||
| We're looking forward, Mr. Chairman. | ||
| I look forward to convening this committee and working to make certain that we push this legislation to protect our children in the virtual space. | ||
| Thank you, Mr. Chairman. | ||
| Well, thank you so much, Mr. Chairman, and I am truly looking forward to working with Senator Blackburn on this important subcommittee. | ||
| As many of you know, Senator Lee and I chaired the antitrust subcommittee for a long time, but I actually think this situation right now with the possibility of moving on these bills is going to be a very positive development. | ||
| As Senator Blackburn just pointed out, despite the strong support that we have had from Senator Durbin and Senator Grassley and Senator Graham when he chaired this committee or was the ranking on this committee, we've just continued to run into roadblocks to passing these laws and it's getting absolutely absurd. | ||
| Senator Grassley is well aware of the antitrust tech bill that he and I lead, that hundreds and hundreds of millions of dollars are spent against it in TV ads. | ||
| And despite the fact that the companies, FANG, as we call them, have agreed in other countries to some of these consumer protections, that did not happen in America. | ||
| And I think that this piece of it, whether it's Instagram's promotion of content that encourages eating disorders, frightening rise of non-consensual AI-generated pornographic deepfakes, or the tragic stories of kids losing their lives to fentanyl-laced pills, will most likely be leading the way as we continue to push our antitrust and privacy and news bills. | ||
| Just this month, this committee heard from Bridget Noring of Hastings, Minnesota. | ||
| Her son, Devin, was struggling with migraines and bought what he thought was a Percocet over Snapchat to deal with the pain, but it really wasn't a Percocet. | ||
| It was a fake pill laced with fentanyl. | ||
| And with that one pill, as we say, one pill kills, he died at age 19. | ||
| For too long, the companies have turned a blind eye when young children joined their platforms, used algorithms that pushed harmful content. | ||
| They have done that and provided a venue for dealers to sell deadly drugs like fentanyl. | ||
| We know that social media also increases the risk of mental illness, addiction, exploitation, and even suicide among kids. | ||
| I will never forget the testimony of the FBI director telling us that in just one year, I believe it was 2023, over 20 kids had committed suicide just because of the pornography and the images that had been put out there when they were innocently sending a picture to who they thought was a girlfriend or a boyfriend. | ||
| That's why this committee has taken this on on a bipartisan basis. | ||
| And I'm hopeful that this hearing will be the beginning of actually passing these bills into law. | ||
| Representative Guffey, you and I met through Senator Cruz and the bill that he and I have, the Take It Down Act. | ||
| We have an additional bill that Senator Cornyn and I have that's really important that's passed through this committee, the SHIELD Act. | ||
| And as you know all too well, the threat of dissemination alone can be tragic, especially for kids. | ||
| We need to enact the Kids Online Safety Act, which, thanks to Senators Blumenthal and Blackburn, passed the Senate on a 91 to 3 vote. | ||
| As we know, some of these are stalled out in the House. | ||
| We need to get the federal rules of the road in place for safeguarding our data. | ||
| According to a recent study, social media platforms generated $11 billion in revenue in 2022 from advertising directed at kids and teenagers, including $2 billion in ad profits derived from users age 12 and under. | ||
| I am supportive, as was mentioned by Senator Durbin, of the legislation that he and Senators Graham and Hawley and many others have introduced to open the courtroom doors to those harmed by social media by making those reforms to Section 230. | ||
| Section 230 was enacted long before any of this was going on. | ||
| And somehow, with respect to other industries, we've been able to make smart decisions to put more safety rules in place. | ||
| Just ask those passengers on that flight that flipped upside down in Toronto, sitting in seats that were the result of safety rules that were put in place. | ||
| And yet, when it comes to this, we just put up our hands and say, no, they're lobbying against us, or they have too much money, or we like some of the people that work there. | ||
| And we do nothing. | ||
| And by doing nothing, instead of reaching some reasonable accommodations of settlements or things we can do on legislation, we just let them run wild at the expense of our kids' lives. | ||
| Thank you. | ||
| When you consider five bills got out of this committee last Congress, and over the last few years, Congress has only been in session about two and a half days a week. | ||
| It's supposed to be a new regime. | ||
| I'm not sure that it is. | ||
| And I would hope that some of you folks on the Democrat side would push Republicans to make sure we keep the Senate in session more than two and a half days a week so we can get some of this done. | ||
| Because we had hardly any important legislation the last two years. | ||
| We were basically just a confirming body. | ||
| Take that. | ||
| I hope you enjoy doing that like I enjoyed complaining because we were only meeting two and a half days a week when the Democrats controlled the Senate. | ||
| I'm going to introduce our guest today. | ||
| Our first witness is Mr. Brandon Guffey. | ||
| He now serves in the South Carolina House of Representatives. | ||
| Following the tragic loss of his son, Gavin, Mr. Guffey became an advocate for mental health awareness and combating online crimes. | ||
| And as Senator Blackburn said, we're sorry for the loss of your son, Mr. Guffey. | ||
| It's probably hard for you to be here to talk about it, but thank you for being here. | ||
| Next, we have Ms. Carrie Goldberg, plaintiff's attorney and founder of the law firm C.A. Goldberg, PLLC. | ||
| She specializes in representing victims of sexual abuse, child exploitation, online harassment, and other forms of digital abuse. | ||
| Professor Mary Leary, a former federal prosecutor and current law professor at Catholic University of America, directs the law school's modern prosecution program, and her scholarship focuses on the exploitation of women and children. | ||
| Professor Leary has an upcoming article that dives deeply into the history of Section 230 and its role in facilitating child sexual abuse material. | ||
| Mr. John Pizzuro is CEO of Raven. | ||
| He is a former law enforcement officer. | ||
| Raven gathers subject matter experts across multiple disciplines to help protect children from online exploitation. | ||
| Mr. Pizzuro is a former commander of the New Jersey Internet Crimes Against Children Task Force program. | ||
| Mr. Stephen Balkam, CEO and founder of the Family Online Safety Institute. | ||
| This international nonprofit is dedicated to making the internet safer for kids. | ||
| Before founding the Institute in 2007, Mr. Balkam spent 30 years as a leader in the nonprofit sector championing online safety. | ||
| His work at the Institute brings together those in government, industry, and the nonprofit sector to create a culture of responsibility. | ||
| Now I'd like to ask you to stand and be sworn. | ||
| Raise your right hand. | ||
| Do you swear or affirm that the testimony you're about to give before this committee will be the truth, the whole truth, and nothing but the truth? | ||
| So help you, God. | ||
| They have all answered affirmatively. | ||
| Mr. Guffey, we'll start with you and go from my left to my right. | ||
| unidentified | | Thank you, Mr. Chairman, distinguished senators. |
| Thank you for the opportunity to testify today. | ||
| My name is Representative Brandon Guffey, and I'm here to share why protecting youth from online dangers and holding big tech companies responsible is now my life's mission. | ||
| Sometimes God sends you down a path that you never thought you would be on. | ||
| In July of 2022, I lost my oldest son, Gavin Guffey, to suicide. | ||
| On Gavin's last Instagram post, a week prior to his death, he said, this week helped me look up to where my head should have been years ago. | ||
| Jesus and his word has given me a high that no other can compare to his love. | ||
| He ended that post with a "less than three" sign. | ||
| On July 27th, Gavin would send out the "less than three" sign again on a black screen to his friends and his younger brother, Cohen Guffey, who's with me here today. | ||
| At 1.40 a.m., Gavin took his life. | ||
| We quickly learned that Gavin was contacted on Instagram around midnight; he told his friends that he would jump off the game to chat with her. | ||
| In just one hour and 40 minutes, my son was gone. | ||
| The predator that contacted Gavin was recently extradited to the U.S. two weeks ago from Lagos, Nigeria. | ||
| The predator not only attacked my son Gavin, who was 17, but also began to extort my 16-year-old son, my 14-year-old cousin, and then myself. | ||
| One of the messages I received read, did I tell you that your son begged for his life? | ||
| You might ask, how is this possible? | ||
| It's possible because Instagram removed the profile that attacked Gavin, but left up the additional profiles that predators used. | ||
| One of those is the account that began to attack my family after Meta was fully aware of this predator. | ||
| I vowed from that moment that I would make it my life's mission to protect children online and would not stop. | ||
| I was shortly elected to the South Carolina House and within four months of taking office successfully passed what is now known as Gavin's Law. | ||
| Sextortion education is now mandated throughout the state of South Carolina, so every kid at least has some awareness and doesn't feel alone like my son did that night. | ||
| I've worked with many states on similar legislation. | ||
| I started a nonprofit speaking to teens about mental health and the dangers from big tech. | ||
| I filed a lawsuit against Meta in January 2024, sold my businesses, and went to work for a tech company that provides tools to protect children. | ||
| I've also become an advocate on the Hill, urging members to see this as what it is, and that is the greatest threat to the next generation. | ||
| In the two years of my advocacy, I've seen big tech's lobby fight us every inch and Congress cave instead of listening to we the people. | ||
| I witnessed COSA pass the Senate 91 to 3 and go to the House, where the Speaker refused to let it be heard. | ||
| Senators Graham and Durbin have the Defiance Act. | ||
| Senators Blackburn and Coons, the No Fakes Act. | ||
| These are great bills. | ||
| I've even taken notes and reintroduced them on a state level in South Carolina, like the Elvis Act in Tennessee. | ||
| Senators Cruz and Klobuchar led the Take It Down Act, which has already passed the Senate. | ||
| I'd like nothing more than to be proven wrong about the inefficiency of Congress by having the House pass the Take It Down Act soon. | ||
| Speaker Johnson, Chairman Guthrie, the ball is in your court with a bill to protect American lives. | ||
| Please don't let us down again. | ||
| I've witnessed over 40 teens take their lives since Gavin just due to sex tortion, while we as lawmakers fight amongst ourselves. | ||
| Will it take one of your own children or grandchildren to finally get fed up enough to move? | ||
| Sextortion is only one of the many harms done to our children due to big tech's lack of accountability. | ||
| Big tech is the big tobacco of this generation. | ||
| We see groups such as NCMEC and NCOSE give statistics over and over. | ||
| We see parent survivors knock on your doors daily, and Section 230 will go down as one of the greatest disasters, allowing big tech to run rampant without repercussions. | ||
| We watch companies spend millions lobbying, fighting us in court, and continuously absolving themselves of responsibility. | ||
| In this very chamber last January, I stood holding a photo of Gavin while Mark Zuckerberg offered a forced, pathetic apology. | ||
| Where I'm from, we have a saying that says, don't talk about it, be about it. | ||
| And until these companies can be held responsible for the billions they make off of advertising to our children, big tech will simply never be about it. | ||
| I use this as an example. | ||
| Meta pulled down 63,000 accounts in one day in one country, just from Lagos, Nigeria, and just off of Instagram. | ||
| Now ask yourself, did they pull those down to actually help our children? | ||
| And if so, why haven't they done more since? | ||
| Or did they pull it down for a PR stunt? | ||
| I beg to say that that is nothing more than a PR stunt so they can get that pat on the back as if they are doing something good, but have done nothing since. | ||
| I got way offline, but I want to focus on my main message to big tech as lawmakers. | ||
| I think we have to say either get in line or get offline. | ||
| And right now, we have too many politicians making decisions based on their next election and not enough leaders making decisions based on the next generation. | ||
| Are we politicians or are we leaders? | ||
| We can't just talk about it. | ||
| We have to be about it. | ||
| And if we can't protect our next generation, then what are we even fighting for? | ||
| Tomorrow needs you and our children need you now. | ||
| Ms. Goldberg. | ||
| unidentified | | Chair Grassley, Ranking Member Durbin, and distinguished members of the Senate Committee on the Judiciary. |
| My name is Carrie Goldberg, and I'm a lawyer who represents families catastrophically injured by big tech. | ||
| I want to tell you about a few of the cases I've been working on for the past decade. | ||
| I'm the originating attorney in a case against SNAP where our clients' children were matched with drug dealers and sold counterfeit fentanyl-laced pills that killed them. | ||
| The case now has 90 families in it from all over the country, including families that you heard from last week. | ||
| I'm also joined by my client, Amy Neville, the mother of 14-year-old Alexander Neville. | ||
| In another case against SNAP, criminals are exploiting a known security vulnerability to access CSAM and blackmail and extort kids with it. | ||
| Yesterday, the Ninth Circuit dismissed one of my cases representing a severely autistic boy who at age 15 was funneled into Grindr's marketing campaign and was recommended to four different pedophiles who raped him over four consecutive days. | ||
| In court, Grindr's lawyers said that they had no duty to restrict children's access to their hookup app. | ||
| In another case of mine, a 13-year-old, L.S., was lured on the site BandLab, another site with no age restrictions. | ||
| She thought she was meeting a 17-year-old boy, but it turned out to be 40-year-old Noah Madrano from Portland, Oregon. | ||
| He posted openly on this music sharing platform songs about her, one called Pedophile in A Minor. | ||
| On June 24th, 2022, Madrano drove 15 hours to her home, abducted her on the way to school, stuffed her in the trunk of his car, and raped and abused her for eight days. | ||
| Despite there being a national manhunt, BandLab refused to provide law enforcement with key information that could have led to her faster rescue. | ||
| They wanted to respect Madrano's privacy, they said. | ||
| Finally, I represent the family of 16-year-old Aiden Walden from Colorado, who in July 2020 discovered a website that glorifies suicide and learned on that website about a product that he could buy from Amazon, have Prime-delivered to him, and use to end his life. | ||
| Two months later, his grieving mother exchanged 57 messages with Amazon, telling them about their product being amplified on a suicide message board. | ||
| And yet Amazon, despite knowing there was no household use for this product besides suicide, continued to promote, sell, and deliver it for 26 more months. | ||
| I now represent 27 other families who bought it after Amazon sold it to Aiden Walden and heard from his mother. | ||
| In all of my cases, tech has two main defenses, Section 230 and that they didn't know. | ||
| Now I was here a year ago with my clients, including Amy, when this committee so powerfully told the CEOs of Meta, Twitter, Discord, Snap, and TikTok that you were done with discussions and you wanted solutions. | ||
| The most important thing I can say is that families want legislation like COSA, Sunsetting Section 230, the Defiance Act, SHIELD. | ||
| They want laws that increase accountability, that create protection boards at the FTC and the CyberTipline, and that create procedures to contest a platform's failure to remove CSAM. | ||
| They want injunctive relief. | ||
| And families want civil remedies against platforms when they've increased the risk of harm. | ||
| Now, take for example my case representing AM, one of the first cases to overcome Section 230 on trafficking and product liability. | ||
| At age 11, AM was living a normal life in a normal town in Michigan when she went to a sleepover and discovered a website called Omegle. | ||
| It matches strangers for private live streaming. | ||
| Omegle matched her with a man who made her his online sex slave for three years, extorting her and keeping her at the beck and call of him and his friends to perform for them, sometimes interrupting her at the dinner table or at school, even forcing her to go back on Omegle to recruit more kids. | ||
| The abuse eventually ended when his home, which he shared with his wife's daycare, was raided and images of AM and other young girls were found. | ||
| In that case, Omegle did not intend my clients' injuries. | ||
| I could not claim that they knew who she or the offender was. | ||
| Instead, I pointed to the mountain of evidence that Omegle had knowledge of how prevalent the harm was on its platform. | ||
| I pointed to criminal cases, articles, exposés, academic journals. | ||
| I just want to say two more things. | ||
| As a result of how we pled the case, we advanced into discovery and acquired 60,000 documents exposing the extent of injured children. | ||
| And that led to them agreeing to shutter Omegle forever on November 8th, 2023. | ||
| Now, we are at a consensus today. | ||
| We are all here to not repeat history. | ||
| Section 230 was supposed to incentivize responsible content moderation. | ||
| Instead, it did the opposite. | ||
| And as we look into the future, on behalf of the victims I represent, we are here to support laws that pressure platforms to know about the harms and to fix them. | ||
| Thank you, and I look forward to questions. | ||
| Thank you, Ms. Goldberg. | ||
| Now, Professor Leary. | ||
|
unidentified
|
Thank you, Chair Grassley, Ranking Member Durbin, and all the members of this committee. | |
| As has been mentioned, I'm really grateful for all the work this committee has done on this issue. | ||
| The experience our children are having in the digital space is one fraught with danger for them. | ||
| And one might want to ask, why do you have to work so hard? | ||
| Why do you have to keep passing these laws? | ||
| Why aren't the laws that Congress has had on the books regarding exploitation crimes working? | ||
| And there are lots of answers to that, to be sure, but the common thread through this morning so far is Section 230 of the Communications Decency Act, which has been transformed into what I label a de facto near-absolute immunity regime. | ||
| And what I mean by that is exactly what Ms. Goldberg just said. | ||
| This was a law that was designed to incentivize platforms for protection, and instead it has incentivized them to harm. | ||
| I want to make about five points that I think will help frame our discussion about Section 230. | ||
| The first two are what I call framing principles. | ||
| When one reviews the text, the history, and the structure of Section 230 of the Communications Decency Act, it is clear that this is not a standalone law protecting freedom of the internet, as tech and its surrogates will try to argue. | ||
| It is a law that is born out of a landscape of child protection. | ||
| When you go back to the legislative history, there is no question the Senate with the Communications Decency Act and the House with the Internet Freedom and Family Empowerment Act were wrestling with the same question. | ||
| As you look to the Telecommunications Act, how could you as Congress have a safer internet and other media for youth? | ||
| Not whether, but how. | ||
| First point. | ||
| Second point, Section 230 of the Communications Decency Act must be regarded as an experiment. | ||
| And I say that because when you look at the promises tech made back in 1996, and when you look at the supporters of the IFFE in the House, what you see is they represented to you and to America that this would be a way in which we could protect our children. | ||
| That was their claim. | ||
| That was the promise. | ||
| Point number three, the experiment has failed. | ||
| The experiment has failed for all the reasons that have been said already. | ||
| And why has it failed? | ||
| And I would say to you, in addition to the reference to what happens here on Capitol Hill with regard to tech, the transformation of Section 230 of the Communications Decency Act into a law that incentivizes harm was not by accident. | ||
| It wasn't sort of something that just emerged from the internet. | ||
| It was a systematic effort by tech and its surrogates to litigate that throughout this country. | ||
| And they went across the country over 30 years arguing not for the narrow, limited protection for Good Samaritans that the Act states, but rather broad immunity. | ||
| Interestingly, immunity is nowhere in Section 230 of the Communications Decency Act, as a side note. | ||
| And that result has had human consequences, which we've heard today. | ||
| And I think to highlight one that's been said: 99,000 to 100,000 reports will be made today on the CyberTipline. | ||
| But it also has important effects in the courtroom that Ms. Goldberg alluded to, and I want to highlight a couple of them. | ||
| One is, keep in mind, this has become an immunity, not a defense, and that is essential for two important reasons. | ||
| First, as an immunity, these cases are thrown out at a motion to dismiss. | ||
| So there's no access to discovery. | ||
| So when we say that victims, survivors, and state attorneys general are shut out of the courtroom, we don't mean it's very hard to win these cases. | ||
| We mean they are shut out of the courtroom, that they do not have their day in court, notwithstanding the harm that they've experienced. | ||
| And I label this reality the dual danger of de facto near absolute immunity. | ||
| First, that shield has allowed platforms to engage in a list of criminal activities having nothing to do with publishing, and has allowed this industry to grow to a massive scale where one individual or one small company can cause massive harm, as we've heard. | ||
| But the other part of that dual danger is because it is an immunity, there is no access to discovery. | ||
| There is no way to look under the hood of this incredibly dangerous industry. | ||
| There's no guardrails. | ||
| And as Senator Klobuchar pointed out, what that tells us, as I wrap up, is that there are no guardrails against the harm that these folks will experience. | ||
| So I offer some suggestions of reform in my papers, but I think the key thing here is to keep the Good Samaritan protections that Section 230 has, but to get rid of the C1 protections that have so distorted this incentivization for harm. | ||
| And I would just encourage the Senate to listen to the words of Justice Thomas, where he has lamented about the reality of Section 230 and stated, make no mistake about it, there is danger in delay. | ||
| And as for that danger, we can do the math. | ||
| If we accept that 99,000 reports to NCMEC will be made today, that means that 12,375 reports will come in during this hearing. | ||
| And in the last five minutes I've spoken, there have been 344 reports. | ||
| And if that's not a reason to act enough, I don't know what is. | ||
| Thank you, Chair. | ||
| Thank you, Professor. | ||
| Now, Mr. Pizzuro. | ||
|
unidentified
|
Chairman Grassley, Ranking Member Durbin, and distinguished members of the Senate Judiciary Committee, thank you for the opportunity to testify today. | |
| As the CEO of Raven, an organization dedicated to transforming the nation's response to child exploitation, I am here to urge decisive legislative action. | ||
| Despite multiple testimonies before Congress, progress has been slow, hindered by special interest groups and financial incentives that favor the status quo. | ||
| We must prioritize our children's safety and support those who protect them above all else. | ||
| New threats continue to emerge while old ones remain unaddressed. | ||
| Artificial intelligence now enables offenders to manipulate regular images of children into explicit content, create images of children who do not even exist, and groom children en masse. | ||
| Offenders now increasingly exploit children for financial gain in addition to their depraved sexual gratification. | ||
| Yet legislative inaction allows this crisis to persist. | ||
| The tech industry has not meaningfully reduced online victimization. | ||
| Their voluntary cooperation with law enforcement is minimal, allowing offenders to continue exploiting children with impunity. | ||
| In 2023, for example, there were 36 million cyber tips, yet Apple, holding a 57% market share in the U.S., only reported 275. | ||
| According to investigators in the field, Discord notifies users of legal process such as subpoenas, enabling offenders to erase evidence before law enforcement can act and to continue to target our children. | ||
| Electronic service providers permit offenders to rejoin platforms under new aliases with the same IP address, while failing to block foreign IP addresses used for sextortion. | ||
| This lack of enforcement emboldens criminals and leaves our children unprotected. | ||
| Poor moderation, the lack of parental controls in relation to age identification, and inadequate safety measures further expose children to these dangers. | ||
| As mentioned a lot today, social media algorithms push harmful content, enabling predators to reach victims globally. | ||
| AI-powered grooming will allow offenders to manipulate children at scale, mimicking their language and behaviors to establish trust. | ||
| Troublingly, even ChatGPT-like tools can provide information on grooming tactics when prompts are framed in seemingly innocent ways. | ||
| These dangers extend beyond child exploitation to drug access with platforms facilitating the sale of fentanyl and illicit substances. | ||
| Law enforcement is overwhelmed and under-resourced. | ||
| Undercover operations have been highly successful in apprehending offenders, but the increasing volume of cyber tips has made proactive investigations nearly impossible. | ||
| In the U.S., there are 229,000 IP addresses right now trading peer-to-peer images of known child sexual abuse material, yet only 923 are actually being worked. | ||
| Studies indicate that over 50% of those individuals are hands-on offenders with 8 to 13 victims each. | ||
| The mental toll on those who investigate these crimes is severe. | ||
| Prosecutors, child advocates, and law enforcement officers are exposed to daily horrific content leading to burnout and PTSD. | ||
| We must provide them with adequate wellness resources to ensure they can continue their critical work. | ||
| As a retired New Jersey State Police Commander, I have seen firsthand what can happen. | ||
| Despite its critical role, the Internet Crimes Against Children (ICAC) program has been chronically underfunded, even though it is responsible for most child exploitation investigations in the U.S. While it was authorized for $60 million in 2008, only $31.9 million has been appropriated. | ||
| That's $522,000 per task force per year to investigate child exploitation. | ||
| That's why I urge everyone here to co-sponsor the Protect Our Children Reauthorization Act of 2025. | ||
| Children are our most valuable resource, and their victimization has lasting consequences on society. | ||
| Raven stands ready to collaborate with members of the Senate, House, Trump administration, and the CEOs of Big Tech to develop effective solutions. | ||
| Quite frankly, the phrase, talk is cheap, is 100% accurate. | ||
| Action is the only remedy. | ||
| How many of our children and those who protect them will be impacted as a result of our inaction and debate? | ||
| Make no mistake, right now offenders are winning, children are suffering. | ||
| Those fighting to protect them are left to struggle without the support they need to rescue victims, hold offenders accountable, and bolster their own mental health in the process. | ||
| Legislative action is overdue. | ||
| The solutions are within your power. | ||
| Our children are counting on you, and I'm counting on you. | ||
| Thank you so much. | ||
| Mr. Balkam. | ||
|
unidentified
|
Good morning, Chairman Grassley, Ranking Member Durbin, and distinguished members of the committee. | |
| Thank you very much for the opportunity to speak with you today. | ||
| My name is Stephen Balkam and I'm the founder and CEO of the Family Online Safety Institute. | ||
| For nearly two decades, FOSI has worked with industry, government, academia, and the nonprofit sector to create a safer digital world for children and families. | ||
| I'm also here as a father and a newly minted grandfather. | ||
| Chairman, this is my third time testifying before this committee, having first appeared in July of 1995 at a committee hearing called Cyberporn and Children. | ||
| While much has changed, our mission remains the same. | ||
| We believe in a three-pronged approach to online safety, enlightened public policy, industry best practices, and good digital parenting. | ||
| Our goal is to create protections for kids as well as empower young people to navigate digital spaces safely and responsibly. | ||
| We want to protect kids on the internet, not from it. | ||
| Parents of younger children should have the strongest protections possible, including easy-to-find and easy-to-use parental controls. | ||
| But as kids grow, our role as parents shifts from being helicopter parents to co-pilots, guiding them as they build digital resilience. | ||
| Research shows that teens value online safety tools like blocking, muting, reporting, and privacy settings. | ||
| Teaching them to use these effectively fosters independence and self-regulation. | ||
| We have found that empowerment is often the best form of protection. | ||
| We must prepare young people to engage safely and thoughtfully with the digital world, equipping them with digital literacy and an understanding of their rights and responsibilities. | ||
| Now, recently there have been calls to ban young people from social media and other online spaces. | ||
| Blanket bans deprive children of any positive experiences they may have, are difficult to enforce, and open up too many possible unintended consequences. | ||
| After all, children have rights, including the right to safely access the web, to information, free expression, and to connect with others. | ||
| Instead of blanket bans, we need thoughtful restrictions that include input from young people and that account for children's evolving maturity. | ||
| While technical solutions such as age assurance are improving, there is no universally approved system as yet. | ||
| It is challenging to get the balance between safety, privacy, and effectiveness right. | ||
| And as I said recently at our annual conference in front of 350 industry leaders, quote, you can and must do better to create easy-to-find and easy-to-use controls for parents and online safety tools for teens and young people. | ||
| You can and must do better to publicize and promote those controls and tools. | ||
| And you can and must do better to collaborate with each other to harmonize your tools across the ecosystem so that parents and teens are not overwhelmed with the task of setting and managing controls across countless apps, games, websites, and social media platforms. | ||
| Unquote. | ||
| In the meantime, Congress has taken some important steps in this space, passing COPPA 27 years ago and the CAMRA Act three years ago, which funds essential research on children's development and well-being. | ||
| But there's still much more work to be done. | ||
| Federal action is critical because states are now beginning to fill the gaps with their own online safety laws. | ||
| Unfortunately, even the most well-intentioned laws often face legal challenges and create a fragmented regulatory landscape. | ||
| A strong federal framework would provide clarity while allowing states to build upon this. | ||
| So, Congress has the opportunity to lead with balanced and thoughtful policies, including passing a comprehensive data privacy law, funding ongoing research to inform evidence-based policymaking, prioritizing specific targeted bills like the Take It Down Act and the Kids Off Social Media Act, encouraging industry cooperation to simplify parental controls and online safety tools, | ||
| rejecting blanket bans in favor of thoughtful restrictions that include young people's input, and critically, supporting digital literacy programs to build resilience in young users. | ||
| So, to conclude, let us challenge ourselves to reimagine what online safety can look like, not just as a range of restrictions, but as a foundation for resilience, confidence, and opportunity. | ||
| Thank you, and I look forward to your questions. | ||
| Thank you all for your testimony. | ||
| We'll have a five-minute round of questions. | ||
| I'm going to start with Mr. Pizzuro. | ||
| AI has opened up new possibilities for bad actors to generate novel forms of CSAM. | ||
| In fact, one recent report found that over 3,500 AI-generated CSAM images were posted in a single dark web forum over a nine-month period. | ||
| Could you explain the challenges AI-generated CSAM poses for law enforcement and tech companies? | ||
|
unidentified
|
Well, there's a couple things. | |
| One, right now, it's going to be hard to tell the difference, especially without forensic software, what is AI and what isn't. | ||
| Secondly, I could just take my phone now, take a picture of a senator, and then I can age regress them. | ||
| For example, you can be a 40-year-old male. | ||
| I can now make you a 21-year-old female, and now I can make you a 10-year-old girl. | ||
| And with that, with AI and within these apps, I can actually then nudify those images. | ||
| So I think the challenge comes with this and AI, especially from sextortion. | ||
| I don't even need to groom someone right now. | ||
| I can just get an image off the clear web in order to do that. | ||
| So that's going to be the complexity, and the challenge is going to be how we determine who is a real victim and who's not in a lot of instances. | ||
| Thank you. | ||
| Professor Leary, Section 230. | ||
| Well, first of all, I heard your five points, so I'm not asking you to repeat any of them, but how would you advise reforming Section 230 in light of the current online ecosystem? | ||
|
unidentified
|
Thank you, Senator. | |
| Well, first, as I say, when we talk about Section 230 and the provisions that tech points to, there's, as this committee well knows, C1 and C2. | ||
| C2 is the Good Samaritan provision, and I would recommend that that stay in place. | ||
| That incentivizes a platform to be able to remove harmful material from their platforms without being sued. | ||
| The C1 part of the statute should be removed. | ||
| As has been pointed out by so many of you in your opening statements, it serves no purpose, if it ever did. | ||
| Now, a myth has been created about it that it somehow created the internet and somehow the internet will break without it, and that's just simply not true. | ||
| And if it ever was true, this is no longer a fledgling business that needs that kind of support. | ||
| Instead, it needs to be treated like every other business. | ||
| Another important thing I would encourage the Senate to do with Section 230 is to listen to the National Association of Attorneys General, which has repeatedly written and asked Congress to include in it the ability for them to enforce their state laws, something courts have ruled they cannot do when tech has argued for an expansive interpretation of Section 230. | ||
| In my mind, that is another courthouse door that is closed. | ||
| It's a state's rights issue, and the entire architecture of combating exploitation of our children involves prosecution, protection, and prevention, and within that involves multiple pressure points, including civil litigation, state prosecution, and federal prosecution. | ||
| And that I think would be an important amendment. | ||
| Mr. Pizzuro, obviously it has taken a long time, and may take longer still, for Section 230 to be reformed, and putting more things in can slow the process up. | ||
| Beyond that reform and liability for big tech, what steps could companies take to protect children online today? | ||
| Well, one of the things that they know, for example, as I mentioned in my testimony: Discord, they notify users. | ||
|
unidentified
|
They notify users when they get legal process. | |
| You know, those are certain internal policies. | ||
| They know what IP addresses there are, because if I get banned, I just create a new username. | ||
| So those are associated IP addresses. | ||
| There are also IP addresses, you know, beyond the borders of the U.S., from which children are targeted here. | ||
| So these are things that the companies actually know and can do something about. | ||
| Professor Leary, the bills reported out of committee last year would impose liability on platforms for knowingly promoting CSAM, and others for recklessly promoting it. | ||
| As we continue workshopping bills in this committee, do you believe we should pursue a recklessness standard or a knowing standard, and what are the pros and cons of each? | ||
|
unidentified
|
Thank you. | |
| I absolutely believe a reckless standard is superior to a knowing standard. | ||
| And again, you know, Senator Durbin referred to the red herrings. | ||
| One red herring that's out there is that recklessness is some very low standard that will somehow expose these businesses to an onslaught of litigation. | ||
| A couple of comments on that. | ||
| First, most businesses function having to act responsibly, and they often face a negligence standard. | ||
| Anybody who says that recklessness is an easy standard to meet, I invite you, please come to my criminal law class and meet my criminal law students, who will be able to tell you the definition of recklessness, and they will tell you that it is challenging. | ||
| And specifically, it is a conscious disregard of not just a risk, but a substantial and unjustifiable risk. | ||
| That is the definition of recklessness in the criminal context, and it can be used in other contexts as well. | ||
| That requires not just an objective measure, but a level of subjectivity. | ||
| It's referred to sometimes as risk creation. | ||
| So that kind of standard is hardly a day in the park for litigants. | ||
| It is still quite challenging, and that's why it is a far better standard than knowingly, in my opinion. | ||
| Senator Durbin. | ||
| Representative Guffey, thank you for coming back. | ||
| I'm sorry for the circumstances which bring you, but it shows real courage. | ||
| And I know your family and friends have joined you in coming here today. | ||
| I recall the first time we met after a hearing a year or so ago before this committee. | ||
| So thank you very much. | ||
|
unidentified
|
Thank you. | |
| Mr. Chairman, was it a week or two weeks ago we had a hearing on fentanyl? | ||
| Yeah. | ||
| Yeah. | ||
| Last week. | ||
| And we had another parent of a victim who ordered what he thought was a Percocet, turned out to be laced with fentanyl and took his life. | ||
| So this is a life or death proposition that we're dealing with here, and you've lived it and living it still. | ||
| I think we ought to keep it in that context. | ||
| Professor Leary, I'm struck by one of your statements to the committee, that treating 230 as an immunity as opposed to a defense precludes evidence being gathered and discovery taking place. | ||
| And you say in your remarks to the committee that that diminishes our knowledge of the actual goings on at these tech companies and what they're doing and gathering. | ||
| I recall what Representative Guffey said in his opening remarks. | ||
| This is bigger than big tobacco. | ||
| I know that issue. | ||
| Over 30 years ago in the House, I introduced a little bill to ban smoking on airplanes. | ||
| It passed because Congress is the biggest frequent flyer club in the world and we were sick of it. | ||
| And it triggered a conversation and a discovery process and AGs from across the country gathered together and did something significant with this industry. | ||
| So I'd like you to expound a bit, if you will, as to how this standard precludes our knowledge of what's actually going on in big tech in their response to this challenge. | ||
| I think that the gathering of that information for the tobacco companies, the demonstration of their lying to the public about the safety of their product, for example, really led to their downfall. | ||
| I think the same could be true here. | ||
|
unidentified
|
Thank you, Senator. | |
| I think that you are 100% correct on that. | ||
| When a defendant has a defense, as I know the committee knows, but to be responsive to the question, there's a period of discovery beforehand. | ||
| It's contributory negligence. | ||
| Exactly, exactly, or things of that nature. | ||
| There's a period of discovery where the plaintiffs who've made a good faith claim can get information to build on their case, and the defendants can also provide information which may exculpate them. | ||
| The way that Section 230 has been interpreted, it is an immunity. | ||
| And so prior to discovery is when these platforms are coming into court and saying, Judge, we don't have to defend ourselves. | ||
| We don't even have to litigate this case. | ||
| You should dismiss it now. | ||
| Motions to dismiss prior to discovery. | ||
| The only way that I would say the public has learned a lot of the information about big tech, for example, that I believe led to KOSA and some of the other duty-of-care bills, has been through what? | ||
| Congressional investigations. | ||
| I'm reminded of the Backpage congressional investigation, which was a two-year investigation, or whistleblowers and hearings. | ||
| That's how we are learning this information. | ||
| And only by getting this information can we then make informed choices about what's the appropriate legislative text. | ||
| If I could just say quickly, Justice Thomas commented on this, and he has underscored this when he said, look, let's keep in mind if we fix Section 230, that's not exactly what he said, but after that, he said, quote, it would simply give plaintiffs a chance to raise their claims in the first place. | ||
| Plaintiffs must still prove the merits of their case, and some claims will undoubtedly fail. | ||
| But states and the federal government will be able to update their liability laws to be more appropriate for an internet-driven society. | ||
| If I can make one final point in the closing seconds here, going back to my analogy, smoking on airplanes and ultimately dealing with the tobacco issue in a much larger context, the initial bill that I introduced and passed in the House banned smoking on airplanes and flights of two hours or less. | ||
| People said, what are you talking about? | ||
| If it's dangerous, it's dangerous regardless of the duration of the flight. | ||
| The reason was I had a Minnesota congressman who was a chain smoker who was holding up my bill, and I went to him. | ||
| He's passed. | ||
| It's so timely that I'm not. | ||
| I went to him and I said, Marty, how long can you go without a cigarette? | ||
| And he said, two hours. | ||
| So I put that in the bill, and he didn't object to it, and it moved forward. | ||
| There are things that we're dealing with in some of these bills, which are compromises to try to move the issue forward to make progress toward our goal. | ||
| So don't assume that any language is final. | ||
| It is all in flux and subject to negotiation. | ||
| But thank you for joining us. | ||
| Senator Lee. | ||
| Thank you, Mr. Chairman. | ||
| First, I'd like to thank all the witnesses for being here and for testifying on this important issue. | ||
| These are not easy issues to talk about and not easy in particular because of the tragic circumstances that have regrettably brought you here. | ||
| Representative Guffey, I want to express my sympathy to you for the loss of your son, Gavin. | ||
| No parent should ever have to go through that. | ||
| And I want to commend you on your courage and the strength that you've shown as you continue to fight to protect all children. | ||
| Thank you. | ||
| And Ms. Goldberg, with what you've gone through, likewise, my heart goes out to you and to anyone else who has experienced the things that you're describing. | ||
| For the past several years, I've strongly advocated for reforming Section 230 of the Communications Decency Act. | ||
| And this is due to increasing concerns about how social media platforms are operating and how they're utilizing Section 230. | ||
| The platforms have enabled child sexual exploitation and promoted harmful challenges to children and facilitated drug trafficking in many cases to minors. | ||
| Now, first, I introduced the PROTECT Act on this point, which mandates stricter safeguards on websites hosting pornographic content. | ||
| Victims of online exploitation have faced an uphill battle for years, struggling to get online platforms to remove images that were non-consensually obtained. | ||
| The bill would require platforms to verify the age and also obtain verified consent forms from individuals uploading and appearing in content. | ||
| And the bill would require tech companies to take stronger measures to prevent the exploitation occurring on their platforms and force immediate removal of child sexually explicit material and revenge porn upon receiving notice that the content in question was uploaded without the legally required consent. | ||
| Second, I introduced another bill as a complement to the Protect Act called the Screen Act. | ||
| The Screen Act would require all commercial pornographic websites to adopt age verification technology to ensure children can't access the site's pornographic content. | ||
| In the 20 years since the Supreme Court last examined this issue in earnest, technological advances have demonstrated that prior methods of restricting minors' access to pornography online were ineffective. | ||
| Nearly 80% of teenagers between the ages of 12 and 17 have been exposed to pornography. | ||
| This is especially alarming given the unique physiological effects that pornography has on minors, effects that are much better understood and to a much more alarming degree today than they were 20 years ago. | ||
| Finally, I introduced a third bill called the App Store Accountability Act, which would prevent underage users from downloading apps with pornography, extreme violence, and other harmful content, while making it easier for parents to sue the gatekeepers of the content in question. | ||
| Technology has advanced significantly over the last two decades. | ||
| Modern age verification technology is now the least restrictive, least intrusive, and most effective means to which Congress has ready access to protect our children from exposure to online pornography. | ||
| Ms. Goldberg, if it's okay, I'd like to start with you. | ||
| In your view, should app stores such as the Google Play Store and Apple's App Store be held legally accountable for allowing minors access to harmful content? | ||
|
unidentified
|
100% app stores should have a duty. | |
| They are just a seller in this situation. | ||
| And as we've said in our cases against Amazon, there's standards of seller negligence. | ||
| So if you know that you are selling an unreasonably dangerous product, then there's liability. | ||
| Liability there would be just as if you sold a tangible physical object unsuitable for minors to someone with knowledge of or reckless disregard for their age. | ||
| Professor Leary, what do you think of requiring pornographic websites to adopt age verification technology for visitors and for all people featured on those websites in pornographic images, while imposing serious consequences for uploading and hosting non-consensual pornographic content? | ||
| Do you think these are things that would help children? | ||
|
unidentified
|
I do think age verification can help. Obviously, with any piece of legislation the words matter, but anything that will create friction between children and their exposure to pornography is important, and age verification can be one of those things. | |
| I think the danger here is what I see as tech often directing attention away from itself, as if there were one solution, when we have to have a multi-tiered, multi-level approach. | ||
| And that's why a combination of all of the acts you've talked about, the SHIELD Act, the DEFIANCE Act, the Take It Down Act, the No Fakes Act, all together really provides much more protection than one or two approaches. | ||
| Thank you. | ||
| Thank you, Chairman Grassley. | ||
| It is wonderful to be here and to hear your incredible testimony. | ||
| I got through three of you, I think. | ||
| And I first want to lead with you, Representative Guffey. | ||
| Watching your family behind you, I can imagine how difficult this must be and how heartfelt your testimony was. | ||
| I don't know how anyone can listen to you and not want to get something done here. | ||
| So I want to thank you for that. | ||
| You've described how victims of these crimes often suffer from mental health trauma. | ||
| Can you quickly elaborate on why even the threat of the non-consensual distribution of explicit images can be tragic? | ||
|
unidentified
|
The threat is the most dangerous part of it. | |
| Not even the sharing of the images themselves is as bad as the threat, because you are taking your deepest, darkest shame or your most private moment, and there is the threat of sending it out to complete strangers. | ||
| It's complete vulnerability. | ||
| And I believe that in this country we've lost grace and we have too often kicked people for the mistakes that they make. | ||
| And we tell our kids that everything you do online will stay with you forever. | ||
| Well, imagine if you just took your darkest moment and just posted it online. | ||
| Exactly. | ||
| Why should this be federal? | ||
|
unidentified
|
Well, on a state level, I can tell you from passing and submitting legislation. I have submitted things such as the PROTECT Act and the App Store Accountability Act. | |
| You know, we need help on a federal level because Section 230 is causing states to go at this in 20 different directions. | ||
| And until Congress fixes Section 230, the states are fed up with how ineffective Congress has been, and we're going to continue to try to go at it any and every way we can. | ||
| But it would be a whole lot nicer to have uniform code across the country instead of just protecting children in one state. | ||
| Like you might do with the airplane seat rules that I just brought up. | ||
|
unidentified
|
Yes. | |
| You don't have those state by state. | ||
| That would be very difficult to get any results. | ||
| Mr. Pizuro, could you talk about why it's important that Congress pass these bills to give federal law enforcement tools? | ||
| As you know, Senator Cornyn and I have the SHIELD Act, which is really important and ahead of its time. | ||
| And then the Take It Down Act requires the platforms to take these down immediately, the non-consensual images, but also make sure that there is criminal liability for those that are posting it. | ||
| Could you talk about why that helps federal law enforcement? | ||
|
unidentified
|
Sure. | |
| The challenge comes in investigating between state and federal jurisdictions. | ||
| There are a lot of gaps. | ||
| So as an investigator, there are areas where I can't successfully prosecute or don't have the actual law in order to facilitate things. | ||
| Especially if you go to rural areas where, from a state perspective, there aren't really good laws, you're going to need that federal law. | ||
| So what SHIELD does is fill that legislative gap so that we can actually do our jobs effectively. | ||
| Good point. | ||
| And another question on fentanyl and drug trafficking. | ||
| The DEA recently found that one-third of fentanyl cases they investigated had direct ties to social media. | ||
| Others, like the National Crime Prevention Council, estimate that 80% of teen and young adult fentanyl deaths can be traced back to social media. | ||
| These aren't just statistics; they're actual lives lost. | ||
| How does the design of algorithmic recommendations by online platforms contribute to the facilitation of drug sales? | ||
|
unidentified
|
I can tell you this even going back to when I first started, not aging myself, but when there were cloned pagers and we were working cartels. This is just the advent of technology. | |
| And with these tech companies and the AI algorithms, what they push, that's what they're going to see. | ||
| So it doesn't matter. | ||
| There is no transparency. You know, one of the things I asked Meta, I asked Snap, I asked a lot of these companies: can you explain your algorithms? | ||
| No one can and no one will because again, it's about business. | ||
| It's about pushing that content and that's what children are seeing. | ||
| So that's why they're at risk. | ||
| Thank you. | ||
| My last question is for you, Ms. Goldberg. | ||
| You have represented over a thousand victims of revenge porn. | ||
| Just to give people a sense of those numbers, of course, there are tens of thousands out there who never were represented. | ||
| Can you discuss the challenges you face in getting justice for your clients and why passage of federal laws like the SHIELD Act and the Take It Down Act would make a difference? | ||
|
unidentified
|
Sure. | |
| When I started representing victims of revenge porn 10 years ago, there were three states that had laws, and everyone wanted to blame the victims and said you shouldn't have taken that picture in the first place. | ||
| And it wasn't until we were testifying about it and actually making people realize that the liability needs to be in the hands of the offenders. | ||
| There's a responsibility in being the recipient of it. | ||
| But the bigger problem though was that the platforms were the ones that were distributing the content at scale. | ||
| So back in the old days, revenge porn could be photocopied and put on a car windshield. | ||
| But now with Snapchat and Google and Meta, one picture can be seen by millions and millions of people. | ||
| And we need the uniformity, like Mr. Pizuro was saying. | ||
| Thank you. | ||
| And I know my colleagues ask about Section 230, which I feel very strongly about, so I'll let that go. | ||
| Thank you. | ||
| Senator Klobuchar, thank you. | ||
| Thank you, Chairman Grassley. | ||
| Senator Hawley. | ||
| Thank you, Mr. Chairman. | ||
| Thank you for calling this hearing. | ||
| Thanks to the witnesses for being here. | ||
| Mr. Pizuro, let me just start with you. | ||
| You've been working in the anti-exploitation space for a long time, both inside and outside government, if I have that right. | ||
|
unidentified
|
That's correct. | |
| And so you know the trends about what we're facing online, what kids are facing online probably as well or better than anybody. | ||
| Is that fair to say? | ||
|
unidentified
|
I would say pretty much so. | |
| Would you say that CSAM, child sexual abuse material, would you say that there's getting to be more of it or less of it? | ||
|
unidentified
|
Oh, 100% more, like hundreds of thousands more. | |
| I mean, I can't even put a percentage on it. | ||
| Yeah, enormous amounts, right? | ||
| Here's a measure of it. | ||
| In 2023, there were 104 million images and videos of suspected child sexual abuse material uploaded onto the internet, compared to 450,000 in 2004. | ||
| So from 450,000 in 2004 to 104 million in the last full year for which we have data. | ||
| Here's another statistic. | ||
| According to the National Center for Missing and Exploited Children, the number of reports of child exploitation material went from 1 million in 2014 to 36.2 million in 2023. | ||
| So in other words, it's just an enormous explosion. | ||
| It's absolutely everywhere. | ||
| So let me ask you about some of the remedies for this. | ||
| If you are a parent, and I'm the parent of three young children, three little kids, if you're a parent of a victim of child sexual abuse material and your child's image has been used, they've been exploited. | ||
| It's been used online. | ||
| And you've got companies who have hosted that content recklessly or intentionally or negligently. | ||
| If you're a parent, can I sue them and get them to take it down? | ||
|
unidentified
|
Right now, no. | |
| You can sue them, but I don't know how successful you're going to be. | ||
| So if I went into court, if my kid is abused, their content is up online, we know the abuser, but we've got these companies that are hosting the content and making money on it by distributing it. | ||
| And I go to the company, let's say I go to the company and I say, this sexual abuse material, this is my kid. | ||
| This is online. | ||
| I'm reporting it to you. | ||
| I want you to take it down. | ||
| Let's say they don't take it down. | ||
| You're telling me I can't go into court and sue them? | ||
|
unidentified
|
You're going to probably end up losing. | |
| I mean, and I think that's part of the problem. | ||
| It's a huge problem, isn't it? | ||
| You're exactly correct. | ||
| You're exactly correct. | ||
| The state of the law is I cannot go into court and hold these companies accountable. | ||
| In fact, we had testimony just a few weeks ago of somebody sitting right where you're sitting, a parent whose child was sold drugs in this case over one of these platforms over Snapchat. | ||
| This parent went in, reported it to Snapchat. | ||
| Snapchat said, oh, well, you know, we'll do our best. | ||
| They did nothing. | ||
| The parent said, I'm going to sue you. | ||
| And the Snapchat executives laughed in her face and they said, no, no, you're not. | ||
| You're not going to sue us because federal law prohibits you from suing us. | ||
| Let me just ask you this. | ||
| In 2019, Facebook was fined by the FTC $5 billion, $5 billion with a B, and their stock price went up. | ||
| Now, what does that tell you about what these companies fear? | ||
| Do you think they fear these government regulatory agencies that almost never bring suits and almost never bring enforcement? | ||
|
unidentified
|
Oh, absolutely not. | |
| Do you think that they fear lawsuits from parents who might get into court and get a billion dollar or a $10 billion judgment? | ||
|
unidentified
|
For them, it's the cost of doing business, right? | |
| Yeah, exactly. | ||
| And they're willing to pay it. | ||
| Facebook paid that $5 billion. | ||
| Their stock price went up. | ||
| They went right on doing what they were doing. | ||
| But I tell you what they do fear, what they're absolutely terrified of: a parent coming into court, getting in front of a jury, and holding them accountable. | ||
| And that is why it is high time. | ||
| It is past time that this Congress gave parents the ability to do that. | ||
| And I will just say again, for the approximately three millionth time in this committee, until Congress gives parents the ability to sue, nothing will change. | ||
| These companies don't care about fines. | ||
| They don't care about the regulations. | ||
| In fact, the companies regularly come and sit here and offer to write the regulations. | ||
| They say, oh, we're great public citizens. | ||
| We'd love to help you write the regulations, Congress, and we promise to comply. | ||
| We'll write them and then we'll comply. | ||
| They won't comply. | ||
| They buy off the regulators. | ||
| What they fear are juries. | ||
| And this is why what Senator Durbin has done with his bill that we worked on together to give parents the right to get into court and have their day in court is absolutely vital. | ||
| And I'm proud to be working with him on this. | ||
| It passed unanimously out of this committee last year when he was the chairman. | ||
| And I look forward to reintroducing it. | ||
| We make it even stronger, even better this year. | ||
| But I just say again, there is nothing more important this Congress can do to stop this than to give parents and victims the right to get into court and to hold these companies accountable. | ||
| Thank you, Mr. Chairman. | ||
| Thank you. | ||
| Senator Hirono. | ||
| Thank you, Mr. Chairman. | ||
| Thank you all for testifying. | ||
| And Representative Guffey, our hearts go out to you. | ||
| We have been here many times already. | ||
| Yes, I agree that we have to do something about Section 230. | ||
| But one of the things that Professor Leary mentioned, and before I get to that, by the way, the enforcement is really important. | ||
| And I just want to note that last week when I was questioning Mr. Blanche, who is President Trump's nominee for Deputy Attorney General, I noted that protecting children online is an issue that unifies the members of this committee, as you can see. | ||
| That is why I was disappointed that he has not answered one of my questions. | ||
| I explained that if we want to protect children, the last thing we should do is fire prosecutors who fight child exploitation and impose a hiring freeze that stops them from filling these vacancies, but that's exactly what's happening. | ||
| So I think we should note the environment in which we are having this hearing. | ||
| Moreover, there was a funding freeze briefly that cut off funding to Internet Crimes Against Children task forces that fight child exploitation in every state. | ||
| So child exploitation is a multifaceted issue. | ||
| And I want to get back to Professor Leary, who said that the states ought to have the right to go after child exploitation in court and that they are not able to do so because of Section 230. | ||
| Does that cover both criminal as well as civil prosecutions by states? | ||
|
unidentified
|
It has been interpreted that way. | |
| The way it's been interpreted is that there's language in Section 230 of the Communications Decency Act saying that no state may enforce a state law that is inconsistent with Section 230. | ||
| So courts have interpreted that as, oh, that means you can't enforce your state criminal laws, which is what happened in the Dart cases. | ||
| And I assume it would happen in civil cases under the C1 provisions of the statute. | ||
| Well, so you would support legislation at the federal level that would allow the states to enforce their own child protection laws. | ||
|
unidentified
|
100%. | |
| And I believe that my written testimony has a quote from the letters from the state attorneys general, laying this out again for the third time. | ||
| And again, speaking of unanimity, I don't know exactly how you get over 50 attorneys general. | ||
| I believe it includes the territories as well, all in agreement on this point. | ||
| Representative Guffey and Ms. Goldberg, you would agree that we need to do something that would enable the states to support their own laws. | ||
|
unidentified
|
I would certainly agree with that. | |
| I think that's one of the tools in the tool belt. | ||
| But yes, states need to be able to have the tools. | ||
| For Ms. Goldberg, you noted very briefly the Ninth Circuit and their decision in a case that you were involved with. | ||
| Could you provide some background on the case and how Section 230 was involved and what you think it demonstrates, this case demonstrates the state of the law around Section 230? | ||
|
unidentified
|
Yes. | |
| So that case is called Doe v. Grindr, and it accuses the dating app Grindr of advertising to children using Instagram and TikTok, with child models in school settings, and of luring them onto the dating app. | ||
| And as I said in my complaint, there are statistics that 50% of gay kids who are sexually active have their first sexual experience with an adult they meet on Grindr. | ||
| Now, Grindr has no age verification and just absolutely turns a blind eye to the fact that there are so many kids who use their product and inevitably are recommended to adults. | ||
| I claimed that this was a defective product, and that Grindr, because they knew about the problem, as I stated in my lawsuit, and were refusing to institute any sort of age verification, was also condoning trafficking. | ||
| And the case got thrown out by the district court and that was affirmed yesterday by the Ninth Circuit. | ||
| So I never got to discovery. | ||
| It was thrown out because of Section 230 immunity. | ||
|
unidentified
|
Because of Section 230, and because of the incredibly high standard of actual knowledge that they imposed on the trafficking claim, which they didn't have to impose. | |
| I support the general proposition, thank you, Mr. Chairman, that anyone who gets injured by someone else's actions ought to be able to pursue legal remedies. | ||
| Therefore, you know, I agree that we need to remove Section 230 immunity somehow, while still paying attention to the various unintended consequences that may flow from that kind of change. But where we are is not where we ought to be, because this is a growing problem. | ||
| Thank you, Mr. Chairman. | ||
| Thank you, Senator. | ||
| I believe I'm next. | ||
| Representative, I'm sorry. | ||
| But your boy's proud of you. | ||
| You're doing good work. | ||
|
unidentified
|
Thank you. | |
| Now, my late father used to tell me that you'll never know love till you know the love of a child, and I didn't believe him, but I do now. | ||
| I don't know what I'd do if something happened to my boy. | ||
| I'm just so sorry. | ||
|
unidentified
|
Thank you so much. | |
| Mr. Pizuro. | ||
|
unidentified
|
Sir. | |
| Social media is now a big part of childhood, isn't it? | ||
| Can we agree that big parts of social media have just become cesspools of snark? | ||
|
unidentified
|
I can probably attest to that, yes. | |
| Can we agree that social media has lowered the cost of being an a-hole? | ||
|
unidentified
|
Yeah. | |
| Can we agree that big parts of social media have become cesspools of sexual exploitation? | ||
|
unidentified
|
For sure. | |
| And I assume you'd agree with me if I said that social media has lowered the cost of being a pedophile, hasn't it? | ||
|
unidentified
|
Absolutely. | |
| It made it easy access. | ||
| Yeah. | ||
| You're familiar with the National Center for Missing and Exploited Children's Cyber Tip Line? | ||
| Yes. | ||
| Are the social media companies required to report instances of child sexual exploitation to the National Center? | ||
|
unidentified
|
Of what they see. | |
| Okay. | ||
| So the law says the social media companies have got to report these instances of sexual exploitation to the National Center. | ||
| First, they have to look, don't they? | ||
|
unidentified
|
Yes. | |
| Do they make any money when they look? | ||
|
unidentified
|
No. If you just look at the Apple statistics I gave before: out of what, 36 million reports, 275 came from Apple. | |
| But they're not paid to look. | ||
|
unidentified
|
No. | |
| Okay. | ||
| In fact, they want people coming to their social media platform. | ||
|
unidentified
|
More users, more money. | |
| Yeah. | ||
| They want eyeballs so they can sell them advertising. | ||
| So for them to look is inconsistent with their economic interest, isn't it? | ||
|
unidentified
|
Correct. | |
| All right. | ||
| Now, once they look and they find it, then they have to report it to the national center, is that right? | ||
|
unidentified
|
That's correct. | |
| Are they paid to report it to the national center? | ||
|
unidentified
|
Absolutely not. | |
| Okay. | ||
| How many instances, I know this is a difficult question. | ||
| How many instances of sexual exploitation of children do you think are occurring and not being looked for and/or reported by the social media companies? | ||
|
unidentified
|
Well, I don't have NCMEC statistics, but I can tell you that there are a lot of ESPs that don't actually even report. | |
| So some overreport, some don't report at all. | ||
| So that's part of the challenge. | ||
| And then secondarily, it's voluntary, right? | ||
| So whatever they give them, there's no uniformity in data as well. | ||
| Yeah, what happens if they don't look and/or they don't report? | ||
| Are they punished? | ||
|
unidentified
|
No. | |
| Okay. | ||
| You're familiar with the SAFER program? | ||
|
unidentified
|
A little bit. | |
| Okay. | ||
| It's a tool. | ||
| They use AI to scan conversations and look for patterns that might be sexual exploitation of children. | ||
| It's not the only algorithm out there. | ||
| Do social media platforms all use that? | ||
|
unidentified
|
I don't know how many do, or whether they're using that technology, but they should. | |
| Are they required to use it? | ||
|
unidentified
|
Nope. | |
| We've got to do something. | ||
| This is my last question. | ||
| Do you find it ironic that all of these people in big tech who dreamed about and talked about creating a utopia have managed to generate more hate and more harm than anyone could ever have possibly imagined? | ||
| All to make money. | ||
|
unidentified
|
And lots of money they made. | |
| You find that ironic? | ||
|
unidentified
|
Very. | |
| Thank you all for being here. | ||
| Senator Blumenthal, he's not only next, he's the only one left. | ||
| But it's nice to see him. | ||
| Am I recognized, Mr. Chairman, Mr. Ranking Member, or are you? | ||
| I'm the Chairman, Senator Blumenthal. | ||
| That may be the reason I'm the only one left. | ||
| Could be. | ||
| I'm looking forward to your questioning, and I'm going to turn the gavel over to Senator Blackburn. | ||
| Thank you, Chairman Blackburn. | ||
| Representative Guffey, thank you for being here today. | ||
| And I think our hearts go out to you. | ||
| I know I'm not the first to have said it, but your courage and strength makes an enormous difference. | ||
| I know how strongly you supported the Kids Online Safety Act, and I am deeply grateful to you for your support and your activism in going to Louisiana, for example, seeking to talk to Representative Scalise and Speaker Johnson on behalf of that bill. | ||
| You did an article that I would like to have entered into the record, if there's no objection. | ||
| And there seems to be none. | ||
| When you went to see Representatives Scalise and Johnson, were you given an opportunity to talk to them? | ||
|
unidentified
|
No, sir. | |
| The other parents and I did meet with Representative Scalise's staff, which of course was in district at that time, but even coming up here to the Hill, we were unable to meet with either one of the representatives. | ||
| Would you like to meet with them? | ||
|
unidentified
|
I would love to. | |
| Well, we'll try to arrange it for you. | ||
|
unidentified
|
Thank you, sir. | |
| And they'll hear from you. | ||
| I'm hoping they'll support the bill this time. | ||
| You agree? | ||
|
unidentified
|
I 1,000% agree. | |
| Why don't you tell us as a parent, but also as an advocate and the author of that article, why you think some of the arguments made against KOSA based on a supposed free speech thesis are incorrect? | ||
|
unidentified
|
I believe it's all follow the money. | |
| If you look at big tech and their lobby, and you look at the narratives that get put out there, and you look at the representatives that fight against it, and you follow the money and where it ends up, I believe that it's fear. And as an elected official, you know, I see it myself. | ||
| You're often worried about what this will look like. | ||
| And that's one of the reasons whenever I was presenting, I used the phrase that we have too many politicians worried about their next election instead of leaders worried about the next generation. | ||
| I believe that it's a false narrative that has been put out there. | ||
| The argument has been had over and over, and people will agree with you, and then they will turn right around and share a false narrative. | ||
| I think the United States Senate has recognized that it's a false narrative through a strong bipartisan vote here, 91 to 3 in the last session. | ||
| I'm hoping that we'll have that same kind of support again. | ||
| And I thank my Republican colleagues, particularly Senator Blackburn, who has been such a steadfast partner in this effort. | ||
| I'd like to turn to Professor Leary. | ||
| I think I misattributed the article to Representative Guffey, but maybe you can expand on his response on that free speech false narrative. | ||
|
unidentified
|
Sure. | |
| Thank you, Senator. | ||
| So first thing about free speech. | ||
| Well, first, as you know, I believe the article you're referring to is the op-ed that other scholars and I wrote. | ||
| And it really dispelled these arguments about KOSA, arguments that we see again and again. | ||
| In fact, it's interesting to look back in history at what some of Big Tech has said over the years; I can go back to 2014. | ||
| They were making this argument. | ||
| We can go back actually before that to 1996. | ||
| They told us the Communications Decency Act was going to ruin free speech. | ||
| Then they said it about the SAVE Act. | ||
| Then they said it about SESTA-FOSTA. | ||
| And lo and behold, we still have plenty of free speech. | ||
| The thing to keep in mind with free speech is that the First Amendment is an amendment designed to help inform us on how to handle these sticky issues. | ||
| It is not a reason not to engage in legislation, and yet it's being used in that manner. | ||
| There's a distinction between speech and conduct. | ||
| And specifically with KOSA, KOSA addressed conduct, not content. | ||
| And so the speech argument was particularly misplaced with regard to that piece of legislation. | ||
| Thank you. | ||
| In fact, KOSA affects the conduct involved in product design. | ||
| There's no more limitation on free speech than there is when the federal government regulates the safety of the design of an automobile or a toaster or a washing machine. | ||
| If they explode, there is liability for it. | ||
| It's not free speech to design a defective and harmful product. | ||
| It's conduct. | ||
| And there is no censorship, no blocking of content, in KOSA. | ||
| Thank you all for your testimony today. | ||
| Thank you, Madam Chair. | ||
| Hi, thank you. | ||
| And Professor Leary, I'm going to stay right with you for my question. | ||
| I appreciated so much that op-ed that you had put together and the difference that you're making there, that it was not a free speech infringement. | ||
| This is, as Senator Blumenthal said, product design, as you said, conduct. | ||
| But we know the reason that Meta and Google and these groups spent millions of dollars lobbying against this is because they have assigned a dollar value to each and every kid. | ||
| And I think the dollar value is $270. | ||
| And so our kids are the product. | ||
| And it is so unseemly. | ||
| To me, it is absolutely disgusting that they devalue the lives of young people in this manner. | ||
| Mr. Pizuro, I want to come to you. | ||
| Senator Klobuchar and I have the National Human Trafficking Database Act, which would establish a database at DOJ's Office for Victims of Crime and incentivize states to collect, enter, and share their data. | ||
| What we're trying to do is get a full picture of what is happening in each of the 50 states when it comes to human trafficking. | ||
| And we have really had a tough time doing this and finding those people that are behind these human trafficking rings. | ||
| I know your organization, Raven, has been supportive of the bill. | ||
| I'd like for you to talk for just a minute about why having a national database is so vitally important to breaking this modern-day slavery apart. | ||
|
unidentified
|
Well, the more data we have, the more we're able to understand and see and react to. | |
| And I think that's part of the challenge: as states, we're so fragmented, so we're getting data from just certain areas. | ||
| So the more data we're able to actually collect, the more likely we are able to put a comprehensive plan and understand how to go after certain trafficking. | ||
| And I want you to touch for just a moment on the use of AI-generated CSAM, because what we hear from law enforcement is they're having to sift through so many images to figure out what is AI-generated and what is actual. | ||
|
unidentified
|
That's a challenge. | |
| So right now, as a detective investigating something, I can't tell the difference between what is a real image and what is not a real image. | ||
| Technology exists. | ||
| Now I can make those images into whoever I want. | ||
| I can make a child from an adult. | ||
| And the challenge really becomes that now I could take your images off the clearnet, off of social media, off of open profiles, and then turn that person into a child or, worse yet, create sexually explicit images. | ||
| The challenge is going to be we can't see it unless we have the software capabilities in order to actually do that, which again, we don't have. | ||
| I appreciate that. | ||
| And Senators Coons, Klobuchar, Tillis, and I introduced the No Fakes Act to deal with AI-generated voice and visual likenesses of individuals. | ||
| And we think that this will play an important role in a remedy for AI-generated CSAM. | ||
| Representative Guffey, I'd love to get your thoughts on that. | ||
|
unidentified
|
On the No Fakes Act, I personally love it. | |
| I've actually resubmitted a very similar bill within the state. | ||
| But I love the idea of using the name, image, and likeness. | ||
| I think that is a very easy thing to hit. | ||
| And as we talk about AI-generated pornography, one of the problems that we have is the argument that this is not a real person, therefore is it really a crime? | ||
| Bills based on name, image, and likeness protect our citizens, as opposed to focusing solely on what the image is. | ||
| It protects the citizens. | ||
| Let me ask you this, and congratulations on getting Gavin's Law passed. | ||
| Thank you. | ||
| Is there a way you can amend provisions of No Fakes onto Gavin's Law and begin to expand the protections there at the state level? | ||
|
unidentified
|
In South Carolina, unfortunately, no. | |
| So it's going to be two separate bills. | ||
| So you'll have to have a group of bills that will do this. | ||
| Okay. | ||
| And I am over time. | ||
| I am going to recognize Senator Schiff and turn the gavel to Ms. Moody. | ||
| Thank you, Madam Chair. | ||
| Thank you all for being here. | ||
| And Mr. Guffey, I appreciate your advocacy and want to express my condolences for the loss of your son. | ||
| I can't imagine the trauma that you and your family have been through. | ||
| But I appreciate your taking that trauma and using it to protect other families. | ||
| I have not had a chance as a new member of the Senate to really study the multiple approaches of the various bills, although some I've supported in the House. | ||
| But I wanted to ask you, Professor, we established Section 230 for the reasons I think you implied, which is it was a nascent industry. | ||
| They urged us to do so so that we would not stifle innovation. | ||
| They also made the argument that without 230 they would not moderate content, because they would be sued if they did, and that 230 would encourage them to moderate content. | ||
| Well, there may have been a time where they moderated content, but those days seem to be over. | ||
| It certainly wasn't enacted because it was believed necessary for the First Amendment. | ||
| First Amendment stands on its own two feet. | ||
| In the absence of 230, companies could still plead a First Amendment defense to any case. | ||
| What is your preferred approach? | ||
| That is, is it a repeal of 230? | ||
| Is it changing it from an immunity to some form of defense? | ||
| Is it to cabin 230 in some way by narrowing the scope? | ||
| What are the merits of the various approaches? | ||
|
unidentified
|
Thank you, Senator. | |
| I would say that it's important to note a couple of things. | ||
| I would say that there was discussion in the deep background about this free internet nascent industry. | ||
| And when we look at the policies and the findings at the beginning of Section 230 of the Communications Decency Act, there is language to that effect. | ||
| But I would repeat the overwhelming background and discussion was about the child protection piece. | ||
| And therefore, I think that, better than repealing the entire thing, is to keep the (c)(2) language, which gives cover, gives protection to a platform if they do moderate, and specifically if they remove anything they consider to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable. | ||
| That will protect them. | ||
| That's all they need. | ||
| They do not need (c)(1), which is what has been turned into this de facto near-absolute immunity. | ||
| I think also adding the state's ability to proceed is an important thing. | ||
| I think, outside of 230, there is holding them liable when they host this material. | ||
| That has been discussed at length as well. | ||
| So I think that those and some of the other things that I've listed, but I don't want to use up too much of your time, all work together to respond to the complex crime. | ||
| And Counsel, in your representation of clients in this area, what do you believe would be most helpful in terms of making sure that you can get the discovery you need and that we have established the right protections and the right burdens in terms of the platforms and the pipelines? | ||
|
unidentified
|
Thank you. | |
| I agree. | ||
| We are long overdue to just abolish Section 230. | ||
| But what's important is that clients need to get into discovery so that they can actually know the extent of the problem. | ||
| And the only way we can do that is if the standard is reasonable for parents to plead. | ||
| If we have to show that the company knew about that picture, that exact victim, that exact perpetrator, there's no way a client is going to be able to overcome a motion to dismiss and get into discovery. | ||
| So we need to actually have standards like negligence, which the law already affords in almost all causes of action. | ||
| So let me ask this question. | ||
| I don't think there's any doubt that if the companies devoted their technological capability to trying to solve this problem, that they could make enormous gains. | ||
| They wouldn't be able to eliminate the problem altogether, but nonetheless, they could prove very effective. | ||
| What would you propose the new standard be then? | ||
| That is, if it's not going to be possible to completely do away with this, the standard can't be perfection. | ||
| How would you define the standard of care that you would expect the industry to follow, given that it hasn't had to follow any standard with the protections of 230? | ||
|
unidentified
|
And if you could just quickly answer that. | |
| Sure. | ||
| Well, these are products. | ||
| So strict liability should apply here. | ||
| If these companies have created a defective product, then all users should be able to sue them without having to even prove a duty if the product injured them. | ||
| Thank you. | ||
|
unidentified
|
Thank you, Senator. | |
| Appreciate you being here today. | ||
| I have been so impressed with this committee. | ||
| I'm a senator all of four weeks, so get ready. | ||
| I bring to it an array of passions I have acquired, not just as an attorney general, but as a mother of a teenager right now. | ||
| And I'm so impressed with the topics that we have focused on, and specifically this one. | ||
| It was shocking to me that the Senate was able to move forward pretty unanimously on some protections for children, and they ran right into the House that did not go along with some of those things. | ||
|
unidentified
|
And I'm hoping that we can change that. | |
| As Attorney General, obviously, I fought in court against many of the platforms. | ||
|
unidentified
|
I investigated platforms for harms to children. | |
| I am the mother dealing with this now. | ||
| In fact, I tell people all the time it is really hard to be one of the first generations of parents trying to parent children, and we don't understand what we're doing because we don't understand the technology to the degree they do. | ||
| In fact, when I'm going through some of the controls, I often have to ask my kid what that means, which seems to defeat the purpose. | ||
|
unidentified
|
But here we are. | |
| And I can break down what we're addressing today into privacy concerns of children and, certainly, harm to children, whether that is mental health effects, addiction, or materials that they never would have been exposed to in the past but now have ready access to. | ||
|
unidentified
|
One of the things I want to talk about quickly is the access to our children by predators and bad actors. | |
| I think this is the third lane that we read about repeatedly in the paper every single day in my state; from doctors to predators to you name it, they are getting access to our children. | ||
| Parents in the past could lock their children's bedroom doors and know they were safe at night, but that is not the reality anymore. | ||
|
unidentified
|
In fact, in my own child's school, there were five teenage boys. | |
| A woman was arrested for posing as one and luring them and molesting them online, I think, using Snapchat and TikTok. | ||
| And of course, when you engage with the platforms, they will often deny that this is happening. | ||
|
unidentified
|
But it is happening. | |
| And the best people that can represent that are the parents where it's happening to their children in their homes while they thought they were safe. | ||
|
unidentified
|
And so I really commend you as a fellow parent, Mr. Guffey, for taking your pain, your frustration, and your anger, and channeling it, because that is what is going to get the attention of lawmakers. | |
| I mean, we've tried to get the attention of the platforms. | ||
| We've talked a lot about what needs to be done to force some restrictions and them acting on their own. | ||
| But we need to talk about what needs to be done through laws. | ||
|
unidentified
|
And what I'm specifically concerned about, and I would open this up to whoever wants to answer this question. | |
| What is the thing that we can do as lawmakers right now to stop predators from getting access to our children? | ||
| I can go. | ||
| I'll go ahead. | ||
| Let me start down there and then we'll come back. | ||
| I want to use this as an example, because the comment was made that we essentially have different laws for the outside world than we do for the online world. | ||
| If I had a storage facility and I stored only guns in there, and someone was breaking the law, and I said to you as Attorney General, okay, the majority are law-abiding citizens, but we also have terrorists, and we're going to store guns for criminals. | ||
| And if I told you that you had to have a digital ID to get into that locker, you'd think that's ludicrous. | ||
| But that's exactly the way that we treat CSAM. | ||
| I mean, to me, I believe that if you're housing CSAM, you should be held responsible. | ||
| But nothing is going to change until we open up civil liability. | ||
| These are the world's richest companies since the inception of man, and yet they are immune. | ||
| And I'd say that if you have designed a product where you are exposing children to predators and you can't stop that from happening, then it's a defective product. | ||
| And all a parent or a victim should have to do to be able to sue you is just to show that you know about the problem and the extent of it. | ||
| And in your experience, and I understand that this has happened, when parents have shown that this harmful material is online and demanded that they take it down, they have now told them about it. | ||
|
unidentified
|
They know about it, and there have been refusals to take it down. | |
| Absolutely. | ||
| And those cases get thrown out of court because the online platform says, well, I didn't know about that specific incident. | ||
| You know, of course they're not going to know about that specific incident. | ||
| Or they're going to say, I didn't intend to harm that exact child. | ||
| And Mr. Pizzuro, I know you have law enforcement experience, and I'm grateful for that. | ||
| Thank you. | ||
| My husband is career law enforcement. | ||
| Understanding that predators can now get to our children through online platforms and online, what is the number one thing you would recommend to prevent that that we can do as Congress? | ||
| Device-based age verification. | ||
| You mentioned parental controls. | ||
| If there was a framework for a parent just to shut the spigot off and make it easy rather than go through 25 different apps, the companies have this. | ||
| We can stop it at the device level. | ||
| That's where we prevent children from getting to some of these images and offenders from getting access to those children. | ||
| Thank you. | ||
| And since I am the acting chair, I don't want to exceed the boundaries of time. | ||
| So I will turn it over to Senator Whitehouse. | ||
| Thank you. | ||
| And I understand that I've been given permission to close out the hearing at the end of my questioning. | ||
| So I know you have another place to be, so don't hesitate to go where you need to be. | ||
| First of all, Ms. Goldberg, you said that repeal of 230 was long overdue. | ||
| I'm hoping that that day is coming fairly soon and that a bipartisan bill to do just that will be filed by a group of members from this committee before very long. | ||
| As you also pointed out, there are standards by which to evaluate the conduct or misconduct of these big platforms that the law already affords. | ||
| And everybody else has to abide by those same standards. | ||
| If you're a radio station, if you're a newspaper, if you're a manufacturer, if you're an individual, some of them go back to the English common law that came over with the first settlers. | ||
| And the idea that these institutions, which Representative Guffey described, I think quite well, as the richest since the inception of man, shouldn't be bound by the law. | ||
| I adore Ron Wyden. | ||
| I think he's a wonderful senator. | ||
| He put Section 230 in when these platforms were in people's garages. | ||
| And they've gone from that to being the richest companies since the inception of man. | ||
| I'm not going to forget that phrase of yours, Representative Guffey. | ||
| I like it. | ||
| With no change in Congress's response to the original rationale for having that Section 230 protection and also repeated grotesque failure by these entities to police themselves. | ||
| It's not as if we're dealing with an array of platforms that have a demonstrated record of meeting the public interest in the safety of their product. | ||
| Not at all. | ||
| Lawyer to lawyer, Ms. Goldberg, talk a little bit about when the Section 230 defense first kicks in and what that means in terms of you and your clients actually being able to get discovery, to take a deposition, to find out the truth of what actually transpired. | ||
|
unidentified
|
What happens is that I file a lawsuit with all my facts, with everything that I can know, even though there's so much asymmetry of knowledge. | |
| Like, I don't know the extent to which the platform knows about the exact problem or the overall problem. | ||
| I can just base it on what's happened to my client, if they're alive. | ||
| Otherwise, I have to go through their parents. | ||
| So immediately, within like 30 days, they file a motion to dismiss saying, we're just a publishing platform. | ||
| We're not a product. | ||
| This is just speech. | ||
| And then they attempt to get it dismissed. | ||
| Oftentimes, judges will do it without even oral argument. | ||
| And then we never get into discovery. | ||
| So we never get the opportunity to even show or know exactly the extent to which the platform has been tolerating and making money off of this exact harm. | ||
| We don't have any information about other similar incidents, nothing. | ||
| So it's a vehicle not only for evading responsibility for bad acts, but it's a vehicle also for covering up what actually took place. | ||
| It would be slightly different if the Section 230 dismissal motion was something that you made at trial, for instance. | ||
|
unidentified
|
Yes, then I would have a full chance, but we don't even get that. | |
| I also believe that even more terrifying to tech than facing a jury eye to eye is the discovery. | ||
| It was the discovery that made Omegle shut down. | ||
| I had 60,000 documents showing all these other similar incidents of child sexual abuse, and they just, they shuttered their platform because they had no defense. | ||
| Discovery is a beautiful thing. | ||
| Senator Padilla. | ||
|
unidentified
|
Thank you, Mr. Chair. | |
| Mr. Guffey, I just want to begin with you and let you know that my heart goes out to you for you and your family's experience. | ||
| And I really appreciate your willingness to be here today and to share your testimony. | ||
| Thank you. | ||
| I want to draw my colleagues' attention to the threat presented to minors by a relatively new consumer product, character-based AI chatbot apps. | ||
| Many of these services have been flooded with age-inappropriate chatbots, which may cause young users to be exposed to sexual or suggestive AI-generated imagery or conversations. | ||
| As a father of three school-aged children, this is personal. | ||
| Further conversations with these chatbots can end tragically, as we've heard reports. | ||
| Since 2023, at least two individuals have died by suicide following extensive conversations with AI chatbots. | ||
| So the threat, colleagues, the risk is real. | ||
| Mr. Guffey, how would you recommend that this committee begin to think about or think through the risk posed by this emerging consumer product category? | ||
|
unidentified
|
Whenever it comes to AI, I would have to lean more on some of the other panelists up here on their expertise when addressing chatbots. | |
| Chatbots are something new that I've just started really looking into, but on the legal side, I'm not an attorney. | ||
| So I'm an angry parent that tries to throw it against the wall, whereas the attorneys are the ones who have to say, this is what will hold up in court, this is what will not. | ||
| Well, we've had to figure out the legalese, but I do think you have the most important voice here given your experience. | ||
| I mean, the chatbot piece is just the next iteration of this technology. | ||
| We know what technology was when you and I were much younger. | ||
| Given everything children have to contend with today, we can only imagine what's coming. | ||
| Well, and that's the exact problem. | ||
| It's not just the problem. | ||
| That's the problem of today, but as tech is evolving, our laws don't move fast enough to keep up. | ||
| And I believe that having that liability and being able to hold these companies responsible for what they are presenting, if we, instead of taking online services and treating it as a service, if we can simply treat it as a product, then we can hold them to consumer protection laws. | ||
| Senator, if I may, I think we should think about the international context in which this is playing out, because the recent AI summit in Paris was called the AI Security Summit rather than the AI Safety Summit, which had taken place in the UK and, I believe, in Korea. | ||
| And there's been a shift away from the prevailing thought that we must make these products safe. | ||
| And instead, and particularly this administration, is urging the vast and quick expansion of these tools. | ||
| And I think you have a role and your colleagues have a role to bring that focus back. | ||
| And I dearly hope you do. | ||
| Ms. Goldberg, you seem to have a thought on this. | ||
| I do. | ||
| One of my close friends, Matthew Bergman, is actually litigating a case against Character.AI, where the bot encouraged addictive behavior and ultimately led the child to die by suicide. | ||
| And I think what we'll find is that there's a possibility that courts will perceive this speech as the corporation's own speech, and in that case, Character.AI being owned by Google, that it won't overcome a Section 230 challenge. | ||
| Okay. | ||
| Very good point, actually. | ||
| So what I would do, just in the interest of time, is invite all of you to respond to the same question after the hearing as part of our questions for the record, because I do want to get to at least one more topic. | ||
| And I understand Senator Graham is on his way back as well. | ||
| Last Congress, we had a hearing very similar to this, but instead of you five sitting in front of us testifying, it was actually the CEOs of the five largest social media companies testifying to the committee. | ||
|
unidentified
|
And I had the opportunity then to ask them each about the parental tools that they offer or didn't, but I think all of them have offered some sort of parental tool to help parents help minors safely navigate the use of their respective services. | |
| I asked them to describe what those tools were and more specifically what the adoption and use rates of those tools were. | ||
| Because you can have tools and protections out there and you can debate whether it's efficient or not. | ||
| But if people aren't even utilizing them, then kind of what's the point? | ||
| And sadly, they either didn't share how widely used these tools were, didn't provide the data, and they're all very big on data, or what data they did share demonstrated to us that the usage rates were actually very low. | ||
| So the conclusion, unavoidable, undeniable, is that the industry isn't doing enough to let parents know what resources are available and isn't investing enough into understanding why these so-called protections aren't being adopted at greater rates. | ||
| Mr. Balkam, in your testimony, you observed that these controls would better serve minors and their guardians if they were standardized, interoperable, and unified between apps, devices, and brands. | ||
| How do you think we can make that a reality? | ||
| Well, I often use the example of the automobile industry. | ||
| Back in the 50s and 60s, if you got out of one car and into another, you wouldn't necessarily know where the blinkers or the light switches were. | ||
| Even the logos for those were different in different car makes. | ||
| Well, laws came into place in the 60s, and in fact, now when you get into a new rental car, you know exactly where the indicators are, you know exactly where the lights are, and the symbols are all the same. | ||
| Well, I'd like to see the industry come together, ideally voluntarily, but if not, perhaps with some coercion, to standardize the parental controls and online safety tools, which, by the way, are the ones that teens and young people use to stay private, to report, and to block, oftentimes without even their parents' knowledge. | ||
| In other words, let's have a standardized way of keeping our kids safe, and that teens can keep themselves safe, that is not as confusing as what we have at the moment. | ||
| Yeah, industry standards. | ||
| It's not a new concept, and it tends to happen one of two ways. | ||
| It either gets imposed by some level of government, and then industry comes along kicking and screaming, or they can actually live up to their responsibility and come together as an industry and put forward a model that is transparent and that either works, or at least we can measure it and hold them accountable when and where it doesn't. | ||
| I know it's been a long morning for all of you. | ||
| Very, very much appreciate your participation in today's hearing and the work that you do and the perspectives that you've offered. Senator Graham is not coming after all. | ||
| And so it falls upon me to not just thank all of our witnesses, but remind folks that the hearing record will remain open for one week for statements to be submitted into the record. | ||
| Questions for the record may be submitted by senators by 5 p.m. on Wednesday, February 26th. | ||
| Unless there's anything further from the nameplates, then this hearing is adjourned. | ||