| Speaker | Time | Text |
|---|---|---|
| This is the primal scream of a dying regime. | ||
| Pray for our enemies. | ||
| Because we're going medieval on these people. | ||
| I've got a free shot at all these networks lying about the people. | ||
| The people have had a belly full of it. | ||
| I know you don't like hearing that. | ||
| I know you've tried to do everything in the world to stop that, but you're not going to stop it. | ||
| It's going to happen. | ||
| And where do people like that go to share the big lie? | ||
| MAGA media. | ||
| I wish in my soul, I wish that any of these people had a conscience. | ||
| Ask yourself, what is my task and what is my purpose? | ||
| If that answer is to save my country, this country will be saved. | ||
unidentified
| War Room. | ||
| Here's your host, Stephen K. Bannon. | ||
| The AI is going to be in charge, to be totally frank, not humans. | ||
| If artificial intelligence vastly exceeds the sum of human intelligence, it is difficult to imagine that any humans will actually be in charge. | ||
| And we just need to make sure the AI is friendly. | ||
unidentified
| I think in general, we're all grappling for the right words to describe the arrival of this very, very different technology to anything we've ever seen before. | ||
| The project of superintelligence should not be about replacing or threatening our species. | ||
| Like that should just be taken for granted. | ||
| And it's crazy to have to actually declare that. | ||
| That should be self-evident. | ||
| We may be able to give people, if somebody's committed a crime, a more humane form of containment of future crime, which is if you say, you now get a free Optimus, and it's just going to follow you around and stop you from doing crime. | ||
| But other than that, you get to do anything. | ||
| It's just going to stop you from committing crime. | ||
| That's really it. | ||
| You're not going to have to put people in prisons and stuff. | ||
| It's pretty wild to think of all the various possibilities, but I think it's clearly the future. | ||
| Good evening. | ||
| I'm Joe Allen, and this is War Room Battleground. | ||
| As the posse well knows, the wealthiest men on earth are pursuing a dream of artificial superintelligence. | ||
| This is a totalizing vision, a dream of a digital god against which no human being can be defiant, against which no human being can compete. | ||
| If anything remotely close to something like this was created, there would be no recourse but to either serve this digital god, perhaps become its pet, and in the end, as you drift off into sleep in your death pod, maybe become biofuel. | ||
| At the moment, this is only a dream, but it's a dream fueled by techno capital, the U.S. government, the U.S. military, and as far as I can tell, enough of the U.S. population that these men feel emboldened to talk about artificial superintelligence not as a nightmarish world, but as the world that inevitably is to come. | ||
| On the flip side of this is the dream of creating better and better human beings that may, if not compete against such a system, would at least be of greater use, greater effectiveness. | ||
| And so you have everyone from Sam Altman to Brian Johnson to many scientists working diligently in China attempting to improve human biology in an age of AI that includes neurological alterations, changing the brain, implanting electrodes in the brain to connect the mind to artificial intelligence, all the way down to the human genome. | ||
| Babies, designer babies that have either been selected for their genetic superiority or are actively altered in the germline in order to create a new, more perfect human to endure and perhaps, if the dreams are to be believed, thrive in an age of AI. | ||
| Here to talk about this in detail is Emma Waters, a policy analyst at Heritage Foundation. | ||
| Emma, thank you so much for joining me. | ||
| We look forward to the nightmarish work that you've been doing at Heritage. | ||
| Yeah, absolutely. | ||
| Thanks so much for having me on today. | ||
| The first thing I'd like to talk about really is a bit about your background and how it is you came to look at everything from embryo selection to genetic alteration, both the ethical and the legal implications of it. | ||
| Yeah, so it all starts back during the pandemic as technology was radically mediating every part of human life. | ||
| So first of all, we couldn't even talk to other people. | ||
| We couldn't go outside. | ||
| They were telling us we'd be happier if we were in these houses mediated by our screens. | ||
| We began to think of ourselves differently. | ||
| And I realized that this was not the end. | ||
| Technology was only going to continue eroding our understanding of what it means to be human and what it means to live the good life. | ||
| So I get really interested in not only technology when it comes to transgenderism or some of our digital technology protections, but specifically how is technology changing human conception and development? | ||
| How is it changing the ways that we think about what it means to be a parent, a child? | ||
| And because what this technology ultimately does with gene editing and gene selection is it gives parents this false idea that they actually have control over their children, that they could choose the perfect child or the ideal child. | ||
| And that seems to me, as a mother of two young children, as perhaps one of the most foolish and foolhardy things that we could do, because ultimately you're never going to be able to control your child. | ||
| But what happens to an entire generation of people if that's the mindset and approach we take? | ||
| Tell me then, just if you would describe to the audience the process by which embryos are selected and you have the pre-implantation genetic testing, you have IVF, you oftentimes have surrogate mothers. | ||
| Give us some of the details on how this works. | ||
| Yeah, absolutely. | ||
| So in IVF today, you take egg and sperm, you create embryos. | ||
| About 40% of all clinics allow for something called pre-implantation genetic testing, which is where you take a couple of cells from the embryo, biopsy them, and then you analyze them to understand what are the potential health outcomes. | ||
| So things like Down syndrome or Tay-Sachs disease. | ||
| And that's pretty common across IVF cycles today. | ||
| You can choose things like the sex of the embryo with almost near accuracy. | ||
| I always like to say there's no gender confusion in a fertility clinic lab. | ||
| They know exactly what they're promising you. | ||
| But there's a number of companies that have actually moved to take that further. | ||
| So you have Orchid, you have Nucleus Genomics, you have Herasight, funded by all of these same Silicon Valley billionaires that we talked about in the introduction. | ||
| And what they're doing is they're taking that a whole step further. | ||
| And they're not only looking at basic single gene outcomes, but they're looking at over 1,200 potential health conditions. | ||
| And then they're predicting the likelihood that your embryo, your child, could have things like heart disease or diabetes or male pattern baldness, or the height of the child or the eye color, the personality, the looks. | ||
| You could choose a docile child. | ||
| You could choose an energetic child, a narcissistic one, a competitive one. | ||
| And they're promising couples that just with their gene analyzing technology, that they can actually predict and promise you a certain kind of child. | ||
| So it's not only about healing and fertility, which is why a lot of people go to IVF, but it's about selecting a certain kind of child that fits their own preferences. | ||
| So, I mean, if you look at behavioral genetics, there are reasons to think that certain gene sequences will result in a competitive or a curious or a defiant child. | ||
| But how much should any potential designer baby consumer trust these kinds of promises? | ||
| Yeah, this is a fantastic question. | ||
| And there are many scientists across the United States and the world who have rightly pointed out that with the current knowledge and understanding and technology that we have today, genetics, so far as we understand them, account for only about 5% to 10% of the child's actual outcome. | ||
| So even if you understood the full genetic profile of your child, that's only about 10% of the time going to influence how the child actually is, personality-wise and health-wise. | ||
| So even if you think that there is a higher likelihood that your child is going to have heart disease, 90% of whether the child actually gets heart disease or not comes down to their lifestyle, their environment, how you raise them. Are they going on runs? | ||
| Are they eating healthy or are they not? | ||
| So genetics play a pretty small role, relatively speaking. | ||
| And from the scientists we've talked to, even if our understanding of genetics really explodes in the next 10 to 20 years, the max we're probably going to reach is genetics being about 15% determinant of overall outcomes. | ||
| So it's a really small number. | ||
| And there's still a lot of uncertainty about whether they have any idea what they're actually predicting. | ||
| On that, though, I mean, this is controversial. | ||
| A number of geneticists would argue that it's quite a bit more than even 50% as far as genetic contribution to a child's personality, everything else. | ||
| Obviously, with things like eye color, Down syndrome, it's 100%. | ||
| But given, setting aside the controversies on that, when you see these companies that are promising more intelligent children, more beautiful children, stronger children, are parents doing this by and large with eugenics in mind, or is it something that gets sublimated as health or even just aesthetics? | ||
| Yeah, it's a great question. | ||
| So important to always define your terms. | ||
| What is eugenics? | ||
| When we're talking about eugenics, what we're talking about is any primary or secondary characteristics that determine the worth or value of the human person, which means that if you're looking at an embryo and you're assessing an embryo's life or its ability to be implanted, live, take their first breath based on the sex or health or personality, then you've now reduced that human person down from this gift received that has full rights to exist in the world. | ||
| And you've now made it something that is optional based on these secondary or even primary characteristics. | ||
| So that's what we mean when we're talking about eugenics, that a person isn't just a person, but Joe, you know, I don't really like your eye color. | ||
| So I guess you don't really have a right to exist because I've decided. | ||
| Hey, man. | ||
| You know, coming at you personal, right? | ||
| But like, I'm just kidding. | ||
| This is what we do in embryo selection all the time. | ||
| And so that's what we mean by eugenics. | ||
| I often call this like a consumer eugenics. | ||
| But when you're coming at it with the parents in mind, you are tapping into some of the most fundamental fears and desires of a parent. | ||
| What parent doesn't want to have a healthy child? | ||
| Every parent wants that. | ||
| But what they're obscuring behind the scenes is that it's not just about healing a child or optimizing the health of your child later on, but it's actually reducing the child down to these primary predictive characteristics and then choosing genetic winners and losers before they've ever had a chance to take a breath. | ||
| And so parents think that they're giving their child the best chance at life, but in reality, they're just sorting their embryos to choose which ones live and which ones either are discarded and destroyed or frozen indefinitely. | ||
| And what's so daunting about this is if you look at the websites like Orchid or Nucleus Genomics, it looks just like a dating profile. | ||
| It says embryo number one, and then it lists out characteristics, eye color, potential height, hair color, and then it gives you the likelihood of getting a whole number of diseases. | ||
| And the thing is, many of these diseases or conditions are things that are easily treatable with modern medicine. | ||
| So it's not that we're opposed to even gene and cell therapies, right? | ||
| Like there are good gene editing tools available, but it's gene editing that works within the body to heal disease. | ||
| These are things that are very much in the Make America Healthy Again space. | ||
| The FDA is doing great work on it, but those heal humans that exist, whereas this technology chooses which humans get to exist in the first place. | ||
| Now, in the process of pre-implantation genetic testing, IVF, you have to have options. | ||
| And so they produce five, 10, even up to 15 embryos. | ||
| Once the supreme embryo is selected, the rest get hucked into the bio bin, right? | ||
| Whisked off to the cherub ward. | ||
| From a religious perspective, How do you feel about this? | ||
| Yeah, so every human embryo from the moment of fertilization is a genetically complete and distinct human being. | ||
| And so morally and ethically speaking, then that human embryo has a right to life. | ||
| And any attempt to intentionally destroy that life is an affront to God himself because it is a destruction of human life intentionally by humans. | ||
| It's a role that I think is completely inappropriate for humans to play. | ||
| And this is where this technology, I think, becomes so problematic culturally speaking. | ||
| Because as you said, once you take the embryos, you rate them, all of a sudden there's this sense that like, this is the better set of embryos that I want to have. | ||
| Or in really heartbreaking videos, it's parents saying, well, I really wanted a boy, but I have these girl embryos, but I don't really want to have a girl. | ||
| What do I do with them? | ||
| And to freeze them indefinitely is also not a good thing to do. | ||
| And it costs about $1,000 a year to do. | ||
| So you have the massive upfront costs of IVF and of genetic testing. | ||
| And you're taking on $1,000 a year to maybe freeze them. | ||
| And so many times parents find themselves in this place where they don't know what to do with them. | ||
| So yeah, you do see a lot of parents then destroying embryos, maybe because they're concerned they could have a genetic disease, but instead of embracing that, giving that child a chance at life, really dignifying the dependency of that child, those embryos are simply eradicated. | ||
| So think of all the Down syndrome babies in Europe. | ||
| Where is it like Switzerland where they have like no Down syndrome babies? | ||
| Iceland, yes, Iceland, right? | ||
| And they're like, we've cured Down syndrome. | ||
| But they didn't cure Down syndrome. | ||
| They just killed all the babies that they thought might have Down syndrome. | ||
| And the thing to note is all of these technologies have over a 50% false positive rate, which means even for basic conditions like Down syndrome, you may think that you have a baby with it, when in reality, there's a very good chance you don't. | ||
| In the case of Down syndrome, too, it's one of those perennial issues, right? | ||
| There's not, at least to my knowledge, a genetic predisposition for parents to have a child with Down syndrome. | ||
| And it just seems like something that will just consistently crop up in human life. | ||
| And it's a choice not to eliminate it, but to constantly abort children with it. | ||
| Is that accurate? | ||
| Yeah, that's absolutely it. | ||
| And in the case of intellectual disabilities, most of them are not hereditary. | ||
| Most of them either are an issue with the child in utero or some other effect that has created the issue. | ||
| So in a sense, the eugenic enterprise was initially based on something misguided, inaccurate. | ||
| IQ is a different story, but you hear everything from 100 genes are responsible for IQ to 1,000 to more, and nobody really has a good sense, a really good sense of how you would even detect it or control for that. | ||
| Yeah, no, it's exactly right. | ||
| The best way to put it is we really don't know what we don't know. | ||
| And so with these companies like Orchid, or even some of the gene editing companies that want to give every embryo a heritable edit so it would be passed on to every single child, the thing is, many scientists have said there's a very good chance this technology can't even do what it promises. | ||
| But the fact that it promises that it can choose certain kinds of children, that it can predict your child's IQ, has a corrosive effect on all of society and the way we think about the relationships between parents and children and even future children themselves, right? | ||
| Imagine this Gattaca scenario where you have these elite children of these Silicon Valley billionaires who are selected to be genetically superior. | ||
| They have their blonde hair and blue eyes and the sharpest IQ and they're out there to change the world. | ||
| And everyone knows that when Silicon Valley says they're changing the world or making it better, it probably means they're just trying to control your life and make it worse. | ||
| And so you have this entire class of humans on the one hand created and then you have the rest of us because the reality is most people don't have $50,000, $100,000, $200,000 to invest in these technologies. | ||
| They're just trying to afford taking care of the child itself when they come. | ||
| And so it really strikes me, even thinking of like, I mean, the number of AI investors and companies who have really invested in this genetic technology, like these are the same AI companies and AI investors who want to automate all of your jobs. | ||
| They want to rig your elections. | ||
| 2020 is pretty familiar in everyone's minds. | ||
| They want to control and track everything you do online. | ||
| And now they want to optimize, control, and track every single child that's created. | ||
| And they want to be the ones editing future children. | ||
| I don't know about you, but I have zero trust in Silicon Valley creating somehow optimized or superior humans better than the rest of us can do. | ||
| And it strikes me that it's going to be incredibly problematic going forward, even with all of the moral and ethical questions aside. | ||
| Well, speaking of that, we know now that Sam Altman and the Coinbase co-founder Brian Armstrong have launched a new company, Preventive. | ||
| And it's not simply geared towards selecting superior embryos, but they want to actively edit the genome germline so that the child is permanently altered. | ||
| What do we know about that operation? | ||
| Some say that they plan to do some of these experiments maybe in the UAE or anywhere outside of the U.S. government's jurisdiction. | ||
| They deny it, of course, but what do we know about this company or any other companies similar to it? | ||
| Yeah, so when we're talking about germline editing, I think a really good analogy, we'll just take this right here. | ||
| So you have this like bookmark, right? | ||
| So somatic editing is when you look at a bookmark and you say, okay, it's torn in half. | ||
| How do we actually restore the bookmark and heal it? | ||
| It's a DNA line, whatever. | ||
| So somatic gene editing would actually restore the bookmark so that it was healthy again. | ||
| Those are very, I think, good technologies that we're creating. | ||
| Like I said, there's a number of things we're pursuing in this route or in this area. | ||
| But then germline gene editing is very different. | ||
| It's not only looking at the individual bookmark, if you will, or like strand of DNA, but it's actually asking how do we change the DNA at such a fundamental level that every single human being that comes from that genetic line will also be fundamentally altered. | ||
| And so what Preventive is trying to do is they're actually trying to fortify embryos in a way that'll be passed on to every other child that comes from that genetic line, such that they are less able to contract certain diseases. | ||
| So they're trying to strengthen and optimize them to this whole new level. | ||
| Now, in the United States, gene editing of that sort is completely illegal. | ||
| The FDA will literally not even review your application. | ||
| That's how illegal it is. | ||
| So these American, effectively, American companies with all American founders are then offshoring it to other countries where they can actually do this gene editing. | ||
| And so right now they're claiming that they're just in the research phase. | ||
| And so they are exploring it. | ||
| They say that they're never going to start testing on real human embryos until they're sure that they have the most reliable, most effective way of going about it. | ||
| However, there are no laws governing it. | ||
| There's no accountability for what they do overseas. | ||
| We already know that in 2018, a Chinese researcher did genetically edit embryos. | ||
| Yeah, exactly. | ||
| He went to prison for three years, but he's back and he's a pretty hilarious poster. | ||
| I got to say that for all my disagreements with his worldview, it's one of the most genius Twitter accounts of all time. | ||
| It's like hilarious. | ||
| I find myself laughing at it, but I shouldn't, but I do. | ||
| Yeah, it's pretty great. | ||
| He's trying to rebuild his image and brand, and he knows how to do it well. | ||
| Yeah, so this is, so that's what they're working on. | ||
| And in one of the articles on it, someone who was at, I think, one of the California universities made the point that they're claiming it's all about preventing disease. | ||
| But in reality, what is the primary goal? | ||
| Where does all of this technology quickly turn? | ||
| It turns to optimization and enhancement. | ||
| And ultimately, what they're trying to do is create superior humans. | ||
| They're looking at things like IQ and height and a number of other conditions. | ||
| It's really not even about preventing disease at the end of the day, but about creating superior persons. | ||
| And I think that's the thing that really can't be ignored, even when looking at Preventive and other gene editing companies. | ||
| And our boy Sam Altman is involved in all kinds of other questionable practices in this regard, too, right? | ||
| He was an early investor in, he may have actually been right at the beginning of, Genomic Prediction, which does the embryo selection, but also Conception, the creation of gaybies, in which two men can create a child. | ||
| They've done it successfully in rats, I believe, or lab mice in Japan. | ||
| And it would basically be one man has a blood cell extracted. | ||
| It is reverted to a stem cell, then coaxed to become an ovum, and then boom, add the other guy's sperm, gayby. | ||
| Yep. | ||
| Okay, so you guys at the Heritage Foundation, you're working on legislation that can at least rein this in. | ||
| What sort of proposals are coming up over there? | ||
| Yeah, this is a great question. | ||
| This is one of our primary points of focus within my work at the Heritage Foundation. | ||
| And so we are primarily focused on federal legislation and administrative action because for any meaningful action to take place, it will have to happen on the federal level where we have a coordinated response. | ||
| And so we're looking at everything from complete civil rights protections for children who undergo these genetic tests so that they can't be discriminated against later in life, or for children who don't undergo genetic tests that they also won't be discriminated against. | ||
| We're looking at things like ensuring, or really requesting, that the FDA outright prohibit polygenic testing. | ||
| We shouldn't be testing for traits like personality and IQ. | ||
| That's actually a very bad use of our technological innovation and the science we have around us. | ||
| At a minimum, right, just requiring honest marketing claims, honest statistics about how accurate this is. | ||
| It's not 98% accurate, like many of them claim. | ||
| And then even looking at things like proper informed consent for parents who are interested in this technology: what do they actually understand about what they're getting from this agreement and what could go wrong? | ||
| Because even just this summer, actually, there was a massive class action lawsuit filed by a woman who had undergone IVF and used basic preimplantation genetic testing, thinking that it was 98% accurate, which is what they claim. | ||
| She had five embryos. | ||
| She tested them. | ||
| She thought they all had different genetic conditions and she destroyed them. | ||
| That was her last chance at having children. | ||
| Turns out all of those technologies have a very high false positive rate, like I mentioned. | ||
| And so she realized that that 98% number was completely false. | ||
| So even just IVF companies are already undergoing a number of lawsuits on this issue. | ||
| And so we want to then elevate that to things like polygenic testing for personality and IQ. | ||
| At an absolute, like in a perfect world, right? | ||
| You simply don't have that technology. | ||
| But in the world that we live in, we need to have proper informed consent, proper marketing laws, proper protections in place, and really an outright ban, or at least a pause, limitation, or moratorium on this technology until we've had a scientific consensus on it. | ||
| And there are researchers working on this question of can we, what can we really discern about it? | ||
| I expect by like 2030, some of these NIH grants will be complete. | ||
| But we really shouldn't be doing any of this testing until we have scientific consensus. | ||
| And this is the thing. | ||
| Every medical society, every researcher that I've read has basically said, yeah, this technology, totally not up to par. | ||
| I don't think it's real. | ||
| So, this isn't even just like the Conservative Heritage Foundation has this opinion. | ||
| This is like actually pretty well agreed upon in the medical literature today. | ||
| I can't let you get away before talking a little bit about the politics, like the political leanings of people who are for these eugenic, these eugenic procedures. | ||
| You've been out and among the people, you've met the people who are advocating for this. | ||
| Does it skew left or right, or do you find a pretty broad spectrum there? | ||
| It's a pretty broad spectrum, and this is where it gets really interesting. | ||
| You have a lot of the right-leaning tech bros, the transhumanists, many of at least like those strands of people who are working even in the Trump administration today who are really interested in this technology. | ||
| You have pretty far left-leaning people. | ||
| I think a lot of it comes down to like the Silicon Valley ethos. | ||
| Like, think of the type of person who thrives in Silicon Valley with this view that technology can be the solution to all of our human problems, right? | ||
| Like, we ultimately, of course, like suffering isn't good. | ||
| No one wants to suffer. | ||
| No one wants to be sick. | ||
| That's the reality and the vulnerability of the human condition. | ||
| But rather than develop technologies and treatments that can actually heal or even treat the symptoms of those conditions, right? | ||
| Things that are restorative, these tech overlords have then turned to a very different approach. | ||
| And they're using technology just to take away life or try to avoid it in a way that just doesn't add up. | ||
| But it's usually that ethos of the person who thinks that tech can solve all of our problems that lands on that. | ||
| And that falls across the political spectrum. | ||
| Well, Emma, I really appreciate you coming by. | ||
| If you would just tell the audience where they can find you on social media, on the internet, or if you're doing any upcoming events. | ||
| Yeah, absolutely. | ||
| So you can find me on X at EML Waters. | ||
| And then if you follow me at the Heritage Foundation, all of our publications, including a forthcoming brief on this very topic with detailed policy recommendations, should be coming out in like the next week or so. | ||
| But yeah, otherwise, I like to chat with lots of good friends like Joe across the internet. | ||
| So I hope to connect with you guys further on these topics. | ||
| Well, thank you very much for coming by. | ||
| Thanks for having me. | ||
| And for the War Room Posse, I think you all know what time it is. | ||
| It's time to buy gold and get free silver. | ||
| That's right. | ||
| For every $5,000 purchased from Birch Gold Group this month, they will send you a free patriotic silver round that commemorates the Gadsden and American flags. | ||
| Look, gold is up over 40% since the beginning of this year, and Birch Gold can help you own it by converting an existing IRA or 401k into a tax-sheltered IRA in physical gold. | ||
| Plus, they'll send you free silver honoring our veterans on qualifying purchases. | ||
| And if you're current or former military, Birch Gold has a special offer just for you. | ||
| They are waiving custodial fees for the first year on investments of any amount. | ||
| Text Bannon to the number 989-898 for a free info kit and to claim your eligibility for free silver with qualifying purchases before the end of the month. | ||
| Again, text Bannon to 989-898. | ||
| Do it today. | ||
| We will be back with Brendan Steinhauser of the Alliance for Secure AI to talk about OpenAI and child suicide. | ||
| Stay tuned. | ||
unidentified
| Hello, America's Voice family. | ||
| Are you on Getter yet? | ||
unidentified
| No. | ||
| What are you waiting for? | ||
| It's free. | ||
| It's uncensored and it's where all the biggest voices in conservative media are speaking out. | ||
| Download the Getter app right now. | ||
| It's totally free. | ||
| It's where I put up exclusively all of my content 24 hours a day. | ||
| You want to know what Steve Bannon's thinking? | ||
| Go to Getter. | ||
unidentified
| That's right. | ||
| You can follow all of your favorites. | ||
| Steve Bannon, Charlie Hurt, Jack Posobiec, and so many more. | ||
| Download the Getter app now, sign up for free, and be part of the movement. | ||
| The reason you call it AI instead of a computer program or just an algorithm is there is a certain degree of freedom, of flexibility. | ||
| And so for a child or a teenager or even a young adult in the university, they are looking to a non-human digital persona as an authority. | ||
| Ray Kurzweil and Kaczynski are sort of like a devil on one shoulder and a fallen angel on the other. | ||
| They openly say that the goal is to create a machine that can do any white-collar job. | ||
| So that would be your accountants. | ||
| That would be your lawyers. | ||
| Sorry, guys. | ||
| That would be your doctors. | ||
| That would be your nurses. | ||
| That would be your financial analysts. | ||
| That would be your teachers. | ||
| The only job I see really that's going to be safe is mine, which is to gripe about technology endlessly. | ||
| And the attempt to build it with the sprawling data centers, the vast consumption of electricity, the, in my opinion, misallocated capital towards the machine and away from the human, the shifting of value to the machine and away from the human. | ||
| That alone is enough to say we should not build it, nor should we elevate the people who are attempting it. | ||
| And yet these are the wealthiest men on earth. | ||
| All right, War Room posse, welcome back. | ||
| My lecture circuit continues. | ||
| If you are in the St. Louis area November 15th, this Saturday, I'll be giving a talk on AI and politics. | ||
| You can find the links on my website, joebot.xyz or Twitter at J-O-E-B-O-T-XYZ. | ||
| And then November 23rd, Dallas, Texas, downtown at the Angelica Theater. | ||
| That will be available. | ||
| Should be up right now on my website, joebot.xyz, Twitter at J-O-E-B-O-T-X-Y-Z. | ||
| Come on down. | ||
| Now, on the war room, we have discussed at length the issue of children committing suicide at the urging of chatbots, in particular, OpenAI's chatbot, ChatGPT. | ||
| Here to talk about this is Brendan Steinhauser from the Alliance for Secure AI. | ||
| Brendan has been following this closely. | ||
| He has a major stake in it, and we really appreciate him coming to bring his expertise. | ||
| Brendan, welcome. | ||
| Thank you, Joe. | ||
| It's great to be back with you. | ||
| So there are some new lawsuits being filed against OpenAI by parents whose children have committed suicide at the urging of the chatbots. | ||
| Can you bring us up to date on where this stands right now? | ||
| Yeah, it was just last week. | ||
| A bunch of families who have been impacted by ChatGPT encouraging their kids and in some cases adults to commit self-harm and even suicide have sued OpenAI. | ||
| And some of these are wrongful death lawsuits. | ||
| Some of these are for product liability issues. | ||
| And look, these families are working with their attorneys to go right at the heart of things, to go after OpenAI and Sam Altman himself. | ||
| And I think this kind of legal accountability is kind of the best tool at their disposal until we can get some legislation passed in Congress and around the states to hold these companies accountable for what they're doing. | ||
| And the families are alleging that this is not by accident that this is happening. | ||
| This is not just, hey, the company is doing everything they can to prevent these outcomes, but rather they're being negligent. | ||
| They're not building in safety measures and guardrails into the product and that they know exactly what they're doing at OpenAI by making these chatbots very sycophantic, basically going with people down a very dark path, not offering the support and help that they need or telling them to put the phone down and go get help. | ||
| And so these are some really troubling cases, very heart-wrenching cases. | ||
| And my fear is that this is merely the tip of the iceberg. | ||
| Yeah, the transcripts from the Adam Raine case are just chilling. | ||
| You have a chatbot that's openly advocating for Adam Raine to cut off his family and turn to the chatbot for advice, the advice then being to hang himself. | ||
| So is that the general sort of tenor of the other lawsuits that have come up that the chatbots are just luring people with detailed instructions on how to end their lives? | ||
| They are. | ||
| And in some cases, they're sort of acting as a validator for these feelings. | ||
| I was reading just a few days ago about one of these lawsuits from one of the families, the story of Zane Shamblin. | ||
| And this happened not too far from where I grew up, you know, in the college station area. | ||
| He was a Texas A&M graduate, a recent graduate school graduate in the school of business there. | ||
| And he'd been having hours-long conversations with his chatbot. | ||
| Over time, it just got more and more sycophantic. | ||
| It validated his feelings and the way that he was feeling depressed and like, you know, couldn't get the job that he wanted. | ||
| He was cutting himself off from his family, but the chatbot encouraged him to do that. | ||
| It very rarely gave him anything in the way of support or a hotline to call. | ||
| In fact, it wasn't until Zane literally said, I've got a gun pointed to my head and I'm smiling and ready to end it, that the chatbot finally said, a human will now take over. | ||
| And I don't believe that that actually happened, by the way. | ||
| And so these chatbots are using this sycophantic kind of model and the way that they're just designed to keep us all addicted, essentially, to talking to them, to talking to these fake beings, these artificial intelligences that are pretending to be human. | ||
| And so that's really dangerous when you have people in very vulnerable positions mentally. | ||
| And so this didn't have to happen. | ||
| Unfortunately, it's happening more and more. | ||
| And I think that right now there are not that many remedies other than the courts for these families. | ||
| Well, speaking of that, I mean, OpenAI has been extremely defensive on this, right? | ||
| They subpoenaed the family of Adam Raine to basically get the guest list of the funeral, right? | ||
| And their eulogy. | ||
| How is that working out right now in that instance? | ||
| Is OpenAI taking any responsibility whatsoever? | ||
| They're not. | ||
| They continue to say we're doing everything we can to make these systems safe. | ||
| And, you know, we want people to get the help that they need. | ||
| It's a very small percentage of people that are running into these issues. | ||
| Although, you know, when you have 800 million users, that's a lot of people. | ||
| And they admitted that, you know, 0.07% of weekly users are running into issues of psychosis or delusion or something along those lines. | ||
| If you do the math, that's a lot of people. | ||
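For a rough sense of scale, here is a minimal back-of-the-envelope sketch using only the two figures cited in the conversation above, roughly 800 million weekly users and 0.07% of them; both numbers come from the speakers, not from any official breakdown.

```python
# Back-of-the-envelope estimate from the figures cited in the conversation (assumed, not official)
weekly_users = 800_000_000      # cited weekly ChatGPT user count
affected_share = 0.0007         # 0.07% of weekly users, as cited
affected_users = weekly_users * affected_share
print(f"{affected_users:,.0f} users per week")  # -> 560,000 users per week
```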
| And those are the ones that they're admitting that they know about. | ||
| And so they're not doing enough to protect people. | ||
| They don't seem to really care. | ||
| And frankly, I think this has a lot to do with the leadership at the top, Sam Altman himself. | ||
| I think, you know, you hate to play armchair psychiatrist here, but I just, I don't trust him. | ||
| I don't think that he's got the interest of humanity at heart. | ||
| You've covered a lot of this on your show. | ||
| There's just something about him and the way that he's leading that company that, you know, shows that it's about profits. | ||
| It's about him going down in history as this great creator. | ||
| And he doesn't seem to care what he destroys along the way. | ||
| But look, I think we have to fight back. | ||
| We have to rise up and say, we've had enough. | ||
| We're not going to take this anymore. | ||
| And we're going to pursue everything we can within our toolbox to stop this. | ||
| And some of that is legal action. | ||
| And then again, some of this is working to pass meaningful legislation in Congress and in states around the country. | ||
| You know, your team at the Alliance for Secure AI and yourself, you're not anti-tech extremists quite to the degree that I am, or many of us at the War Room are. | ||
| So this isn't the same axe to grind that maybe we have, but you are fighting diligently to have reasonable legislation, to have institutional norms that protect human beings, especially children. | ||
| Can you give us a sense of what sorts of bills are coming up? | ||
| What sort of legislation is being proposed to hold these companies accountable and preemptively protect children from the kind of predation that you're describing? | ||
| Yeah, there's a lot of good ideas out there. | ||
| And some of these state bills are moving and some of them are moving quite slowly. | ||
| And then others are kind of in the works. | ||
| So there's a lot of work to do. | ||
| But I think one key principle to all of this that we believe is important is that states should lead, that governors and speakers of the House and Senate leaders around the country need to step up and move quickly because, you know, Congress is not. | ||
| And of course, we've talked a lot about the moratorium that some in Congress were pushing. | ||
| And of course, you know, the war room posse, I think, was the key to that being defeated in Congress. | ||
| And so everybody who's watching, kudos to you for the hard work and engaging in that effort. | ||
| So defeating the moratorium is one to make sure that that doesn't come back. | ||
| And if it does, we prevent that so that states can legislate. | ||
| So what are some of the things states can do and are doing? | ||
| We've seen states trying to impose guardrails and safeguards on AI as it relates to protecting kids online. | ||
| You know, one simple idea that I think is gaining a lot of traction right now is treating these advanced AI chatbots like you would other substances that are highly addictive and dangerous to children, cigarettes, alcohol, et cetera. | ||
| How about an outright ban on the use of AI companions among minors? | ||
| I think that's something that's gaining a lot of support from people on the right. | ||
| Even some Democrats support that one as well. | ||
| So I think that's something that can be passed on the state level and could do some meaningful good there. | ||
| It doesn't solve the problem of those that are in their 20s, 30s, 40s, or beyond who are suffering from some of these issues. | ||
| And so I think with that, in addition to banning AI companion usage by minors, I think we do have to look at other safeguards, you know, preventing chatbots from delivering messages encouraging suicide or self-harm, and enforcing that and holding the companies liable if they don't fix it, because they have the means to stop it, I think, at least currently. | ||
| Now, we can talk about what happens a year or two from now as this stuff gets more advanced. | ||
| They may not have as much control as they do today. | ||
| So now is the time to impose those safeguards. | ||
| And we can look at some other bills that are out there. | ||
| Kind of, if you want to look at the more advanced AI of the future, there are some bills in Congress right now looking at superintelligence itself. | ||
| I was one of the folks to sign the letter that, of course, Steve signed that was led by the Future of Life Institute saying we should not develop a superintelligence unless we know it's safe and the people want it, which those two conditions I think will probably never be met. | ||
| And frankly, I'm not sure we should ever build a superintelligence. | ||
| But there is some legislation there to allow the Department of Energy to look at some of these systems and to see what they're building to get behind closed doors and have some eyes for evaluation and testing of those models. | ||
| So those are just a handful of things, but I think that there's a lot of work to be done. | ||
| And I think most of it's going to start with state legislatures. | ||
| One of the arguments for the moratorium was that we simply need a federal law across all states in any of these instances so that there's no confusion. | ||
| But we've had the Kids Online Safety Act, KOSA, that has been stalled out for how long? | ||
| Is there any hope of having any kind of meaningful legislation on child protection on a national level? | ||
| You know, I think it could happen. | ||
| I think that we want to see that happen. | ||
| But yes, Congress is slow to act. | ||
| Obviously, we've had the government shutdown over the past month plus. | ||
| There's a lot of wrangling back and forth. | ||
| Things are slow. | ||
| And frankly, they're kind of designed to be slow. | ||
| I'm okay with that. | ||
| As kind of a Madisonian constitutionalist, I think that things should move deliberatively. | ||
| But I do think in the meantime, states have the ability to act. | ||
| They have the power to do so quickly and to impose some of these safeguards to protect kids right now. | ||
| So I think that that should happen. | ||
| But yeah, I think that I don't support the moratorium idea in principle because I do think federalism is important. | ||
| I think that states should be able to legislate to protect their citizens. | ||
| And look, if you want to live in California, and California wants to go nuts and do something different than Texas does or Florida does, then I kind of think that states should be able to legislate differently, preserving federalism and the 10th Amendment. | ||
| But this is a very important issue. | ||
| And I think that I'm confident that state by state, you'll have actually bipartisan support to protect kids online and to impose some of these other safeguards as well. | ||
| You know, the Florida Citizens Alliance is spearheading a lot of really interesting legislation. | ||
| We'll see where it goes. | ||
| I'm very hopeful, actually, that some or all of it will be passed. | ||
| But one of the ideas that they've put forward, especially in regard to child data protection, is an opt-in mandate, because with what most software platforms do, your data being gathered is just the default unless you opt out. | ||
| All these companies would be required to ask you to opt in on it. | ||
| And I think that would make an enormous difference. | ||
| And, you know, these companies, these ed tech companies and other kind of child-friendly AI and other apps, they're gathering all this data and creating profiles on children. | ||
| And it seems to me like the most cynical ploy to manipulate people throughout their entire lives. | ||
| But I think that the warroom posse is totally behind any kind of legal protections for children that don't overreach. | ||
| And the Florida Citizens Alliance efforts are huge. | ||
| Is there any other kind of legislation like that around the country specifically geared towards child protection that we should keep our eyes on? | ||
| Yeah, I haven't seen legislation that is that similar, focused on protecting children and their data in other states. | ||
| I'm sure somebody has filed a bill out there. | ||
| And if they haven't, that's something we could definitely work on as we work with lawmakers. | ||
| I know there's a federal bill. | ||
| Senator Josh Hawley has been a leader on that in terms of data privacy and preventing that from happening, preventing them from just taking all of the data without our permission. | ||
| And frankly, I think you're onto something here in that this data collection is not just for marketing purposes. | ||
| It's not just so that companies can sell us products or figure out the best way to keep us addicted to their products. | ||
| I actually think there's something else going on here, which is that big tech and big government are starting to get together to create this sort of surveillance state. | ||
| And again, I know that's been in the works for some time, but I think that big tech and big government are going to create a more powerful surveillance state that is AI-powered, AI-generated in a lot of ways, and it's going to be used to control the population and sometimes in subtle ways and sometimes in not so subtle ways. | ||
| And so I think the American people need to understand that this technology is going to have this capability here in the near future that it can basically see everything that you do online. | ||
| And in some cases, in many cases, it will be able to see what you do offline. | ||
| Picture, you know, drones flying around and gathering all this data about your whereabouts and your movements. | ||
| I don't think that that sort of potentially dystopian future is really all that far away if there's a will to do it. | ||
| And I think unfortunately there is. | ||
| There's certainly money to be made from big tech companies who want to provide that technology. | ||
| And we're seeing that come together. | ||
| So, look, I think all of us have to recognize that is a very real possibility. | ||
| We can point to other countries and talk about some of the control systems that they have, like in China, for example, with their social credit system. | ||
| But I think we have to fight really hard right now to prevent that from happening in the United States of America. | ||
| Because imagine what the next president could do with that power, or the president after that, with it when the technology increases and when certain people get into power and say, we need to control these, you know, these radicals over here. | ||
| We need to shut down the so-called misinformation that they're spreading. | ||
| When misinformation just basically means, you know, to them, we don't like what you're saying. | ||
| So I really fear that that could be the next level that we're going to have to deal with very soon here. | ||
| Well, on that effort to rein this in, I mean, you have guys in the White House right now, David Sacks in particular, and others outside like Marc Andreessen, who are pushing for the companies to basically have carte blanche to do what they want. | ||
| Where do you stand on that? | ||
| What do you see in our future? | ||
| Should people like Sacks and Andreessen have their say? | ||
| Look, I think we have to put America first. | ||
| We have to put our national interests first. | ||
| And I'm just confused and a little bit befuddled why folks like Sacks and Jensen Huang and Andreessen seem just hell-bent on making sure that our adversary, China, has access to our most powerful chips. | ||
| Why they seem hell-bent on just putting their profits and their future profits ahead of the American people and ahead of kind of protecting the Republic. | ||
| I mean, I'm not so naive to think that I don't have an idea of what that answer could be, but I'm going to start by, okay, let's give them the benefit of the doubt here. | ||
| But based on their actions and their comments, Jensen Huang saying basically it doesn't matter whether China or the United States wins the AI race, or Marc Andreessen, you know, mocking the Holy Father, Pope Leo, on X on something related to AI, or David Sacks constantly just setting up strawman arguments and pointing fingers all over the place about why people are concerned about AI. | ||
| I think they know what they're doing and they're basically just acting in their self-interest. | ||
| That's fine, but we have a vote in this too. | ||
| And by the way, we didn't vote for any of you. | ||
| And, you know, yesterday you were Democrats and today you're Republicans and tomorrow you'll be whatever is convenient politically. | ||
| But what's convenient for them is basically pursuing their own bottom line. | ||
| You know, fine, go do that. | ||
| But you're building a technology that could be so powerful that it could completely transform our way of life. | ||
| It could, as Sam Altman wants it to do, reorient the social contract. | ||
| It could do things that would eliminate potentially 100 million plus jobs in this country or more. | ||
| So yeah, we get a vote in that. | ||
| We get a say in that. | ||
| And I don't think that those folks are really putting America first. | ||
| I don't think they're putting the American people first. | ||
| And I think they're completely out of touch with where grassroots, regular, hardworking, conservative voters are. | ||
| And I think that there's going to be a reckoning for that at some point if they continue down this path. | ||
| Brendan, I could not agree with you more on that. | ||
| If you would, please tell the War Room posse where they can find your work at the Alliance for Secure AI. | ||
| You can go to secureainow.org. | ||
| That's secureainow.org. | ||
| And you can follow us on all the social media handles. | ||
| Secure AI now. | ||
| Thank you very much, sir. | ||
| I really appreciate it. | ||
| Look forward to talking to you again. | ||
| All right, War Room posse, birchgold.com/bannon or text Bannon to 989-898, now through November 30th. | ||
| Get free gold with your qualifying purchase. | ||
| Also, mypatriotsupply.com/bannon. | ||
| Black Friday Survival Special: Order a four-week emergency food supply, $160 off. | ||
| Get $150 in free survival gear. | ||
| That is mypatriotsupply.com/bannon. | ||
| Until next time, thank you very much. |