Oct. 20, 2023 - The Culture War - Tim Pool
02:08:29
The Culture War #34 - Election Fraud, Big Tech And Trump 2024 w/Robert Bowes & Dr. Robert Epstein

BUY CAST BREW COFFEE TO SUPPORT THE SHOW - https://castbrew.com/ Become A Member And Protect Our Work at http://www.timcast.com Guests:  Robert Bowes  Dr. Robert Epstein (MyGoogleResearch.com) My Second Channel - https://www.youtube.com/timcastnews Podcast Channel - https://www.youtube.com/TimcastIRL Merch - http://teespring.com/timcast Make sure to subscribe for more travel, news, opinion, and documentary with Tim Pool every day. Learn more about your ad choices. Visit megaphone.fm/adchoices

Participants
Main voices
robert epstein
01:02:40
tim pool
54:05
Appearances
Clips
dr robert epstein
00:02

tim pool
Donald Trump may be dominating in the polls.
Actually, the latest aggregate has Trump tied with Joe Biden, but he did see a major upswing.
And it looks like now with the GOP primaries, he's absolutely crushing it.
But the question is, are the elections going to be free and fair?
And will there be big tech manipulation?
As well as what happened way back when in that, you know, other election.
But we'll talk about it all.
We're going to talk about this because we're in the 2024 cycle.
We're going to be entering 2024 very soon.
And I hear a lot of people saying they don't think it's possible.
They don't think there's a reason to do it.
I disagree.
I think we have to do everything we can.
And even if you think in the back of your mind you can't win, there's no excuse for backing down and letting your political opponents and evil people just steamroll through.
So we'll talk about all this.
We've got a couple of great guests.
We have Dr. Robert Epstein.
Would you like to introduce yourself?
robert epstein
Well, what was that again?
What was my name?
tim pool
Robert Epstein.
robert epstein
Ah, Dr. Robert Epstein.
I must be Dr. Robert Epstein.
tim pool
Yes, you.
robert epstein
Looking in my direction.
And I'm a researcher.
I'm Senior Research Psychologist at the American Institute for Behavioral Research and Technology, which is in beautiful San Diego, California.
tim pool
Oh, wonderful.
And you have previously talked about, or you've talked quite a bit about, Google's manipulation of our electorates, of the people's minds.
But also, some other issues pertaining to Hillary Clinton 2016 and things like that.
So, this will be really great.
Thanks for coming.
unidentified
Sure.
tim pool
We should have a good conversation.
And we have Robert Bowes.
Would you like to introduce yourself?
unidentified
Yes, thank you.
Robert Bowes was a Trump White House appointee, worked at FHA, been a banker for many years, but now working on election fraud investigation and helping some of those wrongfully accused in Georgia indictments.
tim pool
Okay, right on.
Let's just jump right to it.
Can we win in 2024?
robert epstein
By we, you mean right-wing conservative nutcases?
Is that what you mean?
tim pool
No, I mean, you know, if you take the mainstream media's view, anybody who would vote for Trump is going to be some far-right MAGA extremist.
But actually, there are a lot of people who are libertarian-leaning, anti-establishment, some people who just despise the Republican Party but just want Trump to win. My position, say, is more that Trump's more likely to fire people.
His foreign policy was substantially better than everyone else we've seen.
So, I actually really don't like Republicans, and I don't consider myself conservative, but I think Trump is one of our best bets in a long time.
I'd like to see him win, but, you know, could there be better?
Of course.
It's a question of, is it possible to defeat the establishment machine, which has got Biden fumbling around in office, maybe wants to bring in Gavin Newsom or who knows what. And yeah, I mean, there you go.
That's we.
robert epstein
OK, Trump cannot win.
It is impossible because Google alone has the power in 2024 to shift between 6.4 and 25.5 million votes in the presidential election with no one aware of what they're doing and without leaving a paper trail for authorities to trace.
So let me just repeat those numbers.
Between 6.4 and 25.5 million votes.
And those are absolutely rock solid numbers.
And I know how to stop that.
I know how to level the playing field.
But all the attention is going to so-called voter fraud.
All that attention is going to voter fraud because Google and some other tech companies are misdirecting attention.
In other words, they're making those kinds of stories go viral so that people who don't know better end up focusing on those issues.
And they're doing that deliberately so that you won't look at them.
tim pool
I mean, I could say, you know, potentially right now, yes, but for two years after the 2020 election, you could not even say those words on YouTube without getting banned.
In fact, I think it was The Hill ran a clip of Donald Trump speaking at a rally where he said, it was a big fraud, you stole it, and then they shut down a news segment for simply mentioning it.
But if you came on and said, oh, they've got better ballot harvesting, YouTube was totally fine.
If you came out and said, big tech censorship, Google search manipulation, they had no problem with you saying those things.
But you couldn't talk about it.
Now you can talk about fraud.
They've changed the rules a few months ago, where now you're allowed to say 2020 was stolen from Trump or whatever.
But so how would you I mean, what's your response to that?
robert epstein
Whatever it is people are focusing on, you have to understand that that focus is being controlled.
So I guarantee you they're not ever allowing the focus to be on them.
So right now, for example, there's a big trial in progress that's just winding down, the US versus Google.
No one even knows about it.
That has been so completely suppressed by the tech companies themselves and their media partners.
So what I'm saying is, whatever it is people are going to be talking about, they control that.
And whatever else they do, they're going to make sure that you don't look at them and the kind of power that they have to shift votes and opinions, which is unprecedented in human history.
That's what I've been studying for more than 11 years, and I publish my work in peer-reviewed journals.
It's rock-solid, rigorous research.
I've testified about it before Congress.
This is what's really happening, and a couple of people now have been figuring this out.
One is Kari Lake.
She's changed her tune.
I don't know if you know that in the last few weeks.
She's saying it's big tech.
Big tech that we really need to worry about.
And the other is Ramaswamy.
He's also now switched, saying all those voter fraud issues, yeah, they're important, but that's not where the real threat is.
The real threat is the big tech companies, because these other kinds of things that we talk about, they can shift a few votes here and there, but they're inherently competitive.
But if one of the big tech platforms decides to support a party or a candidate, there is nothing you can do about it.
Generally speaking, also, they're using techniques that you can't even see.
So that's really where the big threat is, and I will tell you, at this point in time, democracy in this country is an illusion because that's how many votes they control.
unidentified
Are you doing a look back on that?
The 6 million to 25 million you're talking about, you're thinking about, that's now or 2024.
What was it in 2016 or 2020?
I mean, you know, I would submit to you that, when Kari Lake says 81 million votes my ass, I agree with her.
I don't think the real, old-school fraud, which you think is a smaller amount...
It probably is a smaller amount.
I agree with your assertion that tech censorship is big.
tim pool
The biggest, I agree.
unidentified
But did Trump overcome it in 2016?
What was the amount in 2016?
dr robert epstein
I can tell you precisely because that's when we started monitoring.
robert epstein
That's when we invented the world's first system for surveilling them, doing to them what they do to us and our kids.
We learned how to capture what they call ephemeral content.
Let me explain here.
This is a very important concept.
In 2018, there was a leak of emails from Google to the Wall Street Journal.
And in that conversation that these Googlers were having, they said, how can we use ephemeral experiences to change people's views about Trump's travel ban?
Well, my head practically exploded when I saw that because we had been studying in controlled experiments since 2013 the power that ephemeral experiences have to change people's opinions and attitudes and beliefs and purchases and votes.
tim pool
What's an ephemeral experience?
robert epstein
OK, most of the experiences you have online are ephemeral.
And ephemeral means fleeting, means you have the experience and then whatever was there, the content disappears like in a puff of smoke and it disappears.
So, for example, you go to Google search engine, which you should never use, by the way, I can explain why.
And you type in a search term.
You start to type, they're flashing search suggestions at you.
Those are ephemeral.
They disappear.
They're not stored anywhere.
You can't go back in time.
Search results populate below.
Those are ephemeral.
You can't go back in time and see what search results were there.
How about answer boxes?
News feeds?
When you're on YouTube, you know the recommended one that's going to come up next, the up-next video?
tim pool
That's not tracked.
robert epstein
It's not tracked.
That's ephemeral.
The whole list of recommended videos, it's all ephemeral.
What we started doing in 2016 with a very small system at the time was preserving that and analyzing that.
We were looking at Google, Bing, and Yahoo, and we found pro-Hillary Clinton bias in all 10 search positions on the first page of Google search results, but not on Bing or Yahoo.
That's very important.
tim pool
Interesting.
robert epstein
For control.
tim pool
So you're saying we should use Bing?
robert epstein
No, no, no, not at all.
But the point is, that's what our experiments look at: they look at how bias can shift opinions and votes.
We measure that very precisely.
If that level of bias that we measure, that we capture, that we preserve, normally that's never preserved, had been present nationwide in the 2016 election, well, that would have shifted between 6 and 10.4 million votes to Hillary Clinton with no one knowing that that had occurred because people can't see bias in search results.
They just click on what's highest, they trust whatever that takes them to, if they're undecided.
unidentified
So 2 to 10 million in 2016.
You're saying 6 to 25 million in 2024.
What was 2020?
robert epstein
2020, Google alone shifted more than six million votes to Joe Biden.
Now, by the way, I supported Hillary Clinton.
I supported Joe Biden.
I lean left myself.
So I should be thrilled, but I'm not thrilled because I don't like the fact that a private company is undermining democracy.
and getting away with it. And there are no restrictions on them whatsoever, absolutely none. They have an absolutely free hand, so they do what they're doing blatantly and arrogantly. Quick example of another ephemeral experience, to show you how blatant and arrogant this is: Florida in 2022. Okay, so we were monitoring Florida because it's one of the key swing states. On Election Day, November 8th:
All day long, Democrats in Florida were getting Go Vote reminders on Google's homepage.
Wow.
Conservatives, Facebook, not so much.
In other words, 100% of Democrats in Florida were getting those reminders all day.
59% of conservatives.
That is extremely powerful and blatant. But, you know, if you don't have a monitoring system in place to capture all that ephemeral stuff... The FEC should be all over this.
unidentified
You don't know.
This is an in-kind donation to the party, to the candidates. They should be all... The FEC is asleep at the switch, you know.
They won't... Yeah, but Donald... President Trump beat the cheat in 2016.
Well, I think he beat the cheat in 2020.
Well, he's not president, so... I know, well, but so, you know, 70,000 vote differential, when we know that there's these, what you, I would agree with you, smaller amount of cheat, but through, you know, different tech.
tim pool
Well, if this is true, I mean, then Trump's popularity is... Is huge!
Oh yeah, it's huge.
A collective, what was it, 44,000 votes in three swing states are what stopped Trump from a 2020 victory.
unidentified
That's right, exactly.
robert epstein
Now, one thing we've learned how to do, this is very recent, by the way, in our work: we've learned how to look at an election that took place, look at the numbers, and we can factor out Google now.
So in 2020, Trump won five out of what were generally considered to be 13 swing states.
If you factor out Google, Trump would have won 11 of those 13 swing states.
unidentified
Except New York and California, that's it.
robert epstein
And easily would have won in the Electoral College.
unidentified
Yeah, and you have CISA, the Cybersecurity and Infrastructure Security Agency, that has said that cognitive infrastructure is what they want to be targeting right now.
Cognitive infrastructure.
tim pool
Do you guys remember the leaked video of Google employees crying when Donald Trump won?
robert epstein
Of course.
This is real.
tim pool
This is real.
robert epstein
Because they swore up on that stage, and it was all the leaders of Google up on that stage, and they swore, we are never going to let this happen again.
unidentified
Right.
So are you doing anything with Missouri v. Biden, where, you know, Missouri, you know, there are key claims in there about election, well, censorship, obviously, but censorship extends to censoring and suppressing votes, effectively.
robert epstein
Well, I've worked for years with Jeff Landry, who just became governor of Louisiana, and I congratulated him that very day.
I thought for sure I'd never hear from him again now that he was governor, and he texted me back.
Yeah, he's a good guy.
He's a great guy.
Crazy accent.
Wow, really crazy accent.
unidentified
Cajun.
Cajun, yeah.
robert epstein
But he's been helping me and my team for years, and he knows all about my work, and he gets it.
He understands.
There are few people up there in leadership positions in our country who understand.
Unfortunately, it's very few.
He's one of the people who does.
He was involved in that Missouri case, as you probably know.
And yeah, of course we're interested in that, because of the communication between the government and Google and the gang.
Okay, that's very critical.
Obama's second term.
Who knows this? Seven federal agencies were headed by former Google executives.
Obama's chief technology officer, former Google executive.
Hillary Clinton's chief technology officer, Stephanie Hannon, former Google executive.
250 people went back and forth in the Obama administration between top positions in his administration and Google.
tim pool
How did Trump win in 2016?
robert epstein
He won because it was a fluke.
They took certain things for granted.
They weren't looking carefully enough at those tiny little numbers in the swing states.
And so, yes, a tiny margin in some swing states.
tim pool
77,000 votes.
robert epstein
Exactly.
Put him over the top in the Electoral College, and they were kicking themselves.
If Facebook, for example, just on election day had sent out partisan go-vote reminders, just Facebook, one day, that would have given to Hillary Clinton an additional 450,000 votes.
tim pool
But it is possible then, albeit very, very difficult, that if you can mobilize Trump supporters and conservatives, To an extreme degree, they can overcome that bias.
robert epstein
Nope, absolutely cannot.
Because Google alone controls a win margin of somewhere between 4 and 16 percent.
So, now if you're telling me, well no, we've locked this up, we can guarantee a win margin of, I don't know, call it 12 percent.
But that's not true in this country.
In this country, we know we're split roughly 50-50 on the vote.
So if there's some bad actor that has the ability to shift a whole bunch of people, especially right at the last minute, especially on Election Day, you can't counter that.
unidentified
I think we're split.
It's not 50.
Because of what you just described in terms of the bias that President Trump overcame, we're not split 50-50.
We're split more 60-40.
And now you have an awful candidate.
Joe Biden is a failed candidate for many reasons.
And there are some major disasters going on.
You look at the policies. You know, only one or two of these things have taken out other presidents.
But if you have, you know, economy, wars, medical tyranny, a two-tier justice system... Not just wars.
tim pool
Biden's approval rating collapsed after the Afghanistan withdrawal.
unidentified
Right.
So if you apply it to a candidate, you...
If you have a really bad candidate, that's gonna hurt them too.
Can they dial it in?
Can Google dial it in and say, oh, we can influence 30 million people because Joe Biden's so awful?
Is that what you're saying or no?
robert epstein
Yeah, but then you should have gotten that red wave, and there was no red wave.
So I published a piece in the Epoch Times, it's at howgooglestoptheredwave.com, and I explain exactly what happened there.
So you should have had that huge red wave, if you're right. So in 2022, there should have been 30 or 40, right?
I can tell you exactly.
unidentified
But some of that was, there was a wide variety of cheating that happened in that, and what you're saying.
I agree with you, but it's both.
robert epstein
No.
unidentified
It is.
robert epstein
I'm saying it's so much bigger.
unidentified
Even Kevin McCarthy funding people that are running against America First candidates.
Yeah.
He was, Kevin McCarthy was part of this problem.
And guess what?
tim pool
Not just fighting against, but obstructing.
unidentified
And guess what?
He came back in his... we wouldn't be in this position right now.
tim pool
I agree with you on the problem of Google, big tech, Google especially.
unidentified
Me too.
tim pool
But I don't see it as inevitability.
I see it as a David versus Goliath.
I see a possibility as slim as it may be.
unidentified
Okay.
robert epstein
How about this though?
Why don't we just push Google and the gang, push them out of our elections and push them out of the minds of our kids because that's something we started studying.
unidentified
And that's the win.
When you say, can we get a win in 2024?
Forget the party, that is the win.
robert epstein
That levels the playing field.
unidentified
Absolutely, that's what we need.
robert epstein
And that gives you a freer and fairer election.
tim pool
I think something you mentioned is the most important point.
It doesn't matter if YouTube spams Nothing but Donald Trump content.
It doesn't matter if the front page of Reddit doesn't matter or the default page.
It doesn't matter if Twitter, X, and all these platforms every day slam you with pro-Trump, pro-Trump, pro-Trump.
If during the election cycle, because now we're in a month, election month, not election day, Google comes out and runs go-vote reminders only for Democrats?
That's enough.
That's enough, because we're talking about victory margins for the presidential election that are very, very slim.
And if it's 77,000 votes that gets Trump the victory, or 42,000, 44,000 in 2020, all Google has to do is blast everyone their algorithm, their AI knows as a Democrat with, don't forget to vote today.
And then for all the Republicans, all they have to do is put, make sure you're watching the new movie today.
And they can shift the percentages enough to secure Joe Biden's reelection.
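To make the arithmetic behind that scenario concrete, here is a minimal back-of-the-envelope sketch. The audience size and turnout lift below are hypothetical placeholders chosen by the editor for illustration; only the roughly 44,000-vote figure comes from the conversation, and this is not anyone's actual model.

```python
# Back-of-the-envelope sketch of the scenario described above: a "go vote"
# reminder shown to only one party's supporters, compared with a narrow
# multi-state margin. Audience size and turnout lift are hypothetical.

def extra_votes_from_reminders(audience: int, turnout_lift: float) -> int:
    """Votes added if `audience` supporters see a reminder that raises their
    probability of voting by `turnout_lift` (0.01 = one percentage point)."""
    return round(audience * turnout_lift)

one_sided_boost = extra_votes_from_reminders(audience=5_000_000, turnout_lift=0.01)
cited_margin = 44_000  # approximate combined three-state margin cited in the show

print(f"Extra votes from a one-sided reminder: {one_sided_boost:,}")
print(f"Cited three-state margin:              {cited_margin:,}")
print("Nudge alone exceeds that margin:", one_sided_boost > cited_margin)
```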
robert epstein
That's what I'm trying to tell you.
tim pool
That's it.
unidentified
I agree with it.
robert epstein
But you're only talking about one little technique.
Exactly.
It costs them nothing, by the way.
It costs them zero to do that.
But how about, let's back up a few months, and they do the same with register to vote.
unidentified
Right.
They're doing it now.
Facebook's doing it now.
TikTok, Instagram, they're doing it now.
tim pool
Well, TikTok banned.
robert epstein
Hold on a second.
You don't know what they're doing unless you're doing monitoring.
You don't know what they're doing.
unidentified
I see anecdotes of what's coming across my feed.
robert epstein
That's called anecdotes.
We're collecting our data that are admissible in court on a massive scale.
We are now monitoring big tech content through the computers of more than 12,000 registered voters, politically balanced, in all 50 states, 24 hours a day.
We have collected and preserved in recent months more than 51 million, it might be up to 52 today, ephemeral experiences on Google and other platforms, content that they never in a million years thought anyone would preserve.
And we're preserving it every single day.
We have 30 to 60 additional new people added to our nationwide panel.
And so every single day we're recording more and more of this content and we've been learning how to analyze it in real time.
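As an illustration only, here is a minimal sketch of what "preserving ephemeral content" means in practice: timestamping and storing what the page showed before it vanishes. The field names and snapshot format are the editor's assumptions; the show does not describe how the actual collection software is built.

```python
# Minimal sketch of preserving "ephemeral" content: snapshot what a page
# showed (suggestions, results, recommendations) with a timestamp so it can
# be analyzed later. Hypothetical structure only, not the real system.
import json
import time
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class EphemeralSnapshot:
    panelist_id: str        # anonymized panel member
    platform: str           # e.g. "google_search_suggestions", "youtube_up_next"
    query_or_context: str   # what the user typed or was watching
    items_shown: List[str]  # the fleeting content, in display order
    captured_at: float      # Unix timestamp

def save_snapshot(snapshot: EphemeralSnapshot, path: str) -> None:
    """Append one snapshot as a JSON line so nothing is lost when it
    disappears from the screen."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(snapshot)) + "\n")

# Example: recording the autocomplete suggestions one panelist saw.
save_snapshot(
    EphemeralSnapshot(
        panelist_id="panelist-0001",
        platform="google_search_suggestions",
        query_or_context="who should I vote f",
        items_shown=["example suggestion 1", "example suggestion 2"],
        captured_at=time.time(),
    ),
    path="ephemeral_snapshots.jsonl",
)
```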
So let me tell you how you push these companies out of our elections and get them out of our kids' heads. In 2020, we had so much dirt on Google that I decided we're going to go public before the election.
So I called up a reporter at the New York Post and I sent her in a bunch of stuff and she got the assignment.
This is a few days before the election.
She wrote up the piece.
Her name is Ebony Bowden.
You can look her up because she got fired soon afterwards.
And she read some of the piece to me on the phone.
It was fantastic.
Now, just a few weeks before, the New York Post had broken the story about Hunter Biden's laptop.
And that was front page, right?
Well, this story that she was writing about the election rigging, that was going to be New York Post's front page.
Wow.
So, Friday, October 30th, a couple days before the election, her editor called Google for comment.
And guess what happened that night?
The story got killed!
Spiked it.
She was so furious.
Now, how could that possibly have happened?
Well, the New York Post could take on Twitter because they were only getting 3 or 4% of their traffic from Twitter, but they were getting 45% of their traffic.
Ring a bell?
45% of their traffic from Google. They could not take on Google.
tim pool
I knew a guy, he ran an at-home store.
He worked from home.
Google changed their search algorithm one day and his business went to zero.
robert epstein
Yes, that's right.
This happens every day.
I mean, I've written about this.
That's the power that this company has and people in business are terrified of Google because Google can just put you out of business like that.
unidentified
They broke up.
Cowardly.
Ma Bell was broken up for less things.
robert epstein
Let me just finish my 2020 story.
I'm almost there.
I'm almost there, I promise.
Okay, so I was distraught.
Ebony Bowden was distraught at the New York Post.
She was really mad.
She didn't last much longer there.
That editor also didn't last much longer there, interestingly enough.
But I sent everything into Ted Cruz's office also.
And on November 5th of 2020, Ted Cruz and two other senators sent a very threatening letter to Sundar Pichai, the CEO of Google.
If you want to look at it, it's lettertogoogleceo.com.
lettertogoogleceo.com.
Letter to Google CEO.com.
It is a fabulous letter written by Cruz and his buddies.
And it's two pages long and it says, "You testified before Congress saying you don't mess with elections, but Epstein's data show the following." So what happens then on November 5th?
On November 5th, that very day, Google turned off all of its manipulations in Georgia.
We had more than a thousand field agents in Georgia.
Wow.
We preserved a million ephemeral experiences in Georgia.
This is in the two months leading up to their Senate runoff elections.
They literally turned off everything.
Bias in Google search, political bias went to zero, which we've never seen before.
robert epstein
They stopped sending out partisan go-vote reminders.
tim pool
This sounds like, with your data being admissible in court, anyone in any state could have standing and file a lawsuit.
robert epstein
Correct.
And that's why I'm working with AGs around the country right now.
That's why I'm working with Paul Sullivan, who is a very well-known D.C.
attorney who used to work for the Federal Election Commission.
He's helping us to prepare a complaint to the FEC about Google because we have the data from 2022.
In 2022, we preserved two and a half million ephemeral experiences related just to, you know, in those days leading up to that election.
But now we're setting up a permanent system that's now currently running 24-7 in all 50 states.
It needs to be much, much bigger so that we have representative samples and so it's court admissible in every state.
tim pool
So, it was the day that Senator Cruz sent this letter out, the bias seen on Google in Georgia disappeared.
robert epstein
Like that.
Like flipping a light switch. And that phrase came to me from a Google whistleblower named Zach Vorhies, who you may have heard of.
tim pool
Yep.
robert epstein
He's a good person for your show.
tim pool
I think we've had him on, didn't we?
We've had him on, pretty sure.
robert epstein
I don't know.
unidentified
Yeah.
robert epstein
He's the guy who walked out of Google with 950 pages of documents and a very incriminating video, and he put it all in a box and sent it off to Bill Barr, who at that time was Attorney General of the United States. And then Google went after him; police and a SWAT team went after Zachary.
unidentified
I'm sure Bill Barr didn't look at it.
He doesn't look at anything.
We sent him lots of evidence.
He didn't look at any of it.
robert epstein
But, well, the point is, though, that Zach, what Zach did was very, very courageous.
He's become a friend over the years.
Yes, they, and that's Zach's phrase.
It's like flipping a light switch.
They have the ability to turn off, turn these manipulations on and off, like flipping a light switch, and we made them do it!
On November 5th.
Now, just imagine a much, much larger system running 24-7 with a public dashboard.
Which, by the way, you can get a glimpse of right now.
It's at AmericasDigitalShield.com and it looks gorgeous.
unidentified
In the securities market, there's a concept of a quiet period, you know, where there's no discussions, you can't put out press releases, you can't say certain things, you know, 30 days plus or minus when you come out.
Maybe there's a remedy here to say that if you contract this and they abide by it, the big tech needs to be in a quiet period for, you know, months before the election.
robert epstein
Oh no, no, this is going to be... this system is permanent.
This system is running free.
unidentified
You're trying to get a permanent remedy to remove all bias?
Is that it?
What's the remedy?
robert epstein
We're focused on two areas.
Elections is critical because right now, believe me, democracy in this country is an illusion.
And second is kids, because we're collecting data now for more than 2,500 children around the country, and we're actually looking at what they're actually getting from these tech companies, and we don't even understand it.
It is so bizarre and so weird.
And so creepy, and so violent, and so sexual, we don't even understand it.
We will understand it.
tim pool
Are you familiar with Elsagate?
robert epstein
No.
tim pool
You should look into this, especially with your research.
unidentified
The Frozen push out?
tim pool
It wasn't just that, but Elsagate was the name of this phenomenon that happened several years ago, about maybe five years ago.
Where adults weren't noticing this because the feeds that we're getting are like, you know, CNN and entertainment and celebrities and music and sports.
Kids were getting, initially, a wave of videos, this is where Elsagate comes from, of Elsa, Spider-Man, and the Joker running around with no sound, like with no dialogue, engaging in strange behaviors.
unidentified
Right.
tim pool
So it started with Elsa going, ooh, and the Joker kidnapping her and then Spider-Man saving her.
The general idea was Joker, Elsa, and Spider-Man were very popular search terms in the algorithm.
And so if you combined these things in a long video, kids would watch it and they'd get high retention and all that.
It would promote it more.
It devolved into psychotic amalgamations of Hitler with breasts and a bikini doing Tai Chi while people from India sing nursery rhymes.
And then it started, you started getting these videos where the thumbnails were people drinking from urinals and eating human feces and this was being given to toddlers and children on YouTube.
unidentified
Sick.
tim pool
What had happened, and people, a lot of amateur internet sleuths started digging into what was going on.
The general idea was that This section of YouTube was completely overlooked or ignored, or perhaps it was intentional.
But what happened was, parents would select a nursery rhyme on a tablet and give the tablet to a baby, put it in front of them being like, there, I'll get a few minutes to myself.
The baby watches a very innocent nursery rhyme video, but the next up video would slowly move in the direction of this psychotic algorithmic nightmare, to the point where, like I mentioned, the nursery rhyme was Finger Family, Was, uh, a hand would pop up showing Hitler's head on one of the fingers, and then Hitler's head on another finger.
Hitler with breasts, I am not kidding, in a bikini doing Tai Chi with the Incredible Hulk.
And then, eventually, videos where, like, Peppa Pig was being stabbed mercilessly with blood spraying everywhere.
Pregnant women were eating feces and getting injections while it was happening.
And because these videos started doing well, it actually resulted in human beings seeing the success of these videos and making their own, giving their daughters, and this is in Eastern Europe, in Russia, videos going viral where a father lays his daughter down and gives her an injection of some sort, 10 million views.
And eventually, there was a massive backlash when people realized this was happening, and then it started getting crushed, and then YouTube had a big panic, and then they said, we've got YouTube Kids, and we're gonna be very safe, and try and protect them.
But this is something, I don't know if you think was intentional, or was just a byproduct of their machine, an accident.
robert epstein
First of all, you're talking about it all in the past tense.
Our system is running 24 hours a day now.
Also, you're talking about it legally from the perspective of anecdotes.
Those are all anecdotes.
That's not what we're doing.
We have a larger and larger and larger group of people.
They're politically balanced.
We know their demographics.
This is what, and by the way, we're not out there searching for crazy stuff on YouTube.
We're not doing that at all.
We're actually collecting the videos, hundreds of thousands of videos that our kids are actually watching.
Plus, we've learned that 80% of the videos that kids watch are suggested by that Up Next algorithm.
Think of the power you have to manipulate just because of that algorithm.
That's incredible.
tim pool
A friend told me, I know this is all anecdotal, but I do think the anecdotes I'm referring to are people starting to notice something, and then you have the hard evidence, the hard data.
A friend told me that she was watching, her kid was watching Disney Channel and an anti-Trump commercial came on and she was like, what the?
Why?
Because powerful interests are slamming the battlefield in this way, I think what you're talking about is the Kraken.
Well, maybe I shouldn't call it that.
Let me give you an example.
unidentified
- No, we'll talk about that too.
robert epstein
- Look, let me give you an example.
A parent walking by their kid's tablet, let's say, wouldn't even notice that anything was wrong.
- Yup. - Okay?
But we're collecting the actual videos, and here's what happens.
There's a weird cartoon, so this relates to what you said just a few minutes ago.
Weird cartoon, but then all of a sudden, boom!
Something crazy happens!
There's a shriek, and a head flies through the air, and there's blood everywhere, and then it's gone!
So it's very, very brief.
Oh yeah, that's happening right now.
It's very brief.
And what we're finding is, well, first of all, that 80%, that's rock solid now.
80% of the videos that little kids are watching, those are all suggested by Google's algorithm.
unidentified
Are you monitoring the gaming, like even Roblox? I mean, I've seen some kind of unusual things in Roblox, but that could be crowdsourced, you know, individuals putting up their own little games or characters.
I mean, I could, I've seen some kind of unusual things in Roblox, but that could be crowdsourced, you know, individuals putting up their own little games or characters.
Are you looking at the gaming?
robert epstein
Look, practically every day now we're expanding the system.
We're monitoring more and more different kinds of content.
We're now looking at TikTok.
There's nothing we can't monitor.
That's why monitoring systems have to be a permanent part of not just our country and our culture, but really everywhere in the world outside of mainland China.
These systems have to be set up, and we've been approached by people from seven countries so far.
The last two are Ecuador and South Africa, begging us to come help them set up these systems. And here is the only area where I've ever agreed with Trump.
On this issue I say America first.
We've got to develop our own full system that's operating, you know, and it has to be, you have to have representative samples, this all has to be done very scientifically so that this is court admissible in every state.
That's how you push them out, because you make them aware through public dashboards, through press releases, through data sharing with certain key journalists, members of Congress, AGs. That's how you push them out.
They would be insane to continue this stuff.
tim pool
I wanna go back to that one point that you just made.
So, you're saying that there are innocent-looking videos, the thumbnail may just be a smiling little sheep, and it says, like, learn your ABCs.
robert epstein
Right.
tim pool
It's 15 minutes long, but then at 8 minutes and 3 seconds, all of a sudden, a head pops up, explodes, and is gone.
robert epstein
Add to that the fact that if you mouse over the bottom of that video, you can actually see the frequency with which that part of the video is being viewed, and very often now we're seeing a spike right at that point where that crazy, brief thing happens.
tim pool
How do we find examples of this?
unidentified
They rewind and go and watch it again.
robert epstein
Oh, they're watching those parts over and over and over and over and over again.
That's what that means.
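Here is a small sketch of the idea just described: a spike in per-second watch frequency flags the moment being replayed. This is an editor's illustration with made-up numbers; the replay data shown on a video's timeline is not exposed through an API like this.

```python
# Illustrative sketch only: detecting a replay spike in a per-second
# watch-frequency curve (like the "most replayed" graph you see when you
# mouse over a video's timeline). All numbers are made up.
from statistics import mean, pstdev

def find_replay_spikes(watch_counts, threshold_sigmas: float = 3.0):
    """Return the second marks whose watch count is far above the mean."""
    mu = mean(watch_counts)
    sigma = pstdev(watch_counts) or 1.0
    return [
        second for second, count in enumerate(watch_counts)
        if count > mu + threshold_sigmas * sigma
    ]

# Hypothetical curve: a flat cartoon with a burst of rewatching at 8:03.
watch_counts = [100] * 900   # 15 minutes of ordinary viewing
watch_counts[483] = 2_400    # 8:03 -- the moment kids replay

for second in find_replay_spikes(watch_counts):
    print(f"Replay spike at {second // 60}:{second % 60:02d}")
```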
tim pool
Is there a way to find one of these right now?
Or is it buried in YouTube?
robert epstein
It's buried.
I mean, we have, well, first of all, if you just scroll down, scroll down there, right there.
tim pool
Oh, I see.
robert epstein
So that's going to be a carousel.
tim pool
Oh, like this right here.
robert epstein
Yeah.
But that's going to be a carousel showing images that we're collecting in real time.
tim pool
You can pull this up, Kellen.
robert epstein
Yeah.
So in other words, you're going to see actual real.
unidentified
This is your site.
robert epstein
And you're going to be able to click on them.
And that's going to take you to the videos.
unidentified
This is your site.
You put these up on your site.
tim pool
So this is a mockup for now.
robert epstein
Oh, yeah.
Oh, go, go, go down right there.
Look at that.
This is really nuts.
This is so nuts because on the left you see the political leaning of every state and that should be very familiar to you.
But on the right what we're showing you is the bias in Google search results in content being sent to people in every state.
It's all blue in all 50 states.
unidentified
Is there a finite ceiling to this?
I don't know if you want to continue on the indoctrination or the subliminal messaging to children, but with respect to elections, do you think people are smarter?
Is there a cap to how many, if Google tries to do 10 or 20 million more people, is there a decreasing marginal gain?
I mean, people can figure this out.
They can recognize bias.
Are they dumb?
robert epstein
No, I laugh because the discoveries that we've been making all these years, let's put it this way, the more we learn, the more concerned we've all become.
For example, when we first did a nationwide study in the US, we had more than 2,000 people in all 50 states.
They were being shown biased content on our Google simulator, which we call Kadoodle.
Usually people can't see the bias.
Now, where there's biased content, we can shift people in one direction or another, whichever way we want to, because we use random assignment.
We can easily shift between 20 and 80 percent of undecided voters.
unidentified
Wow!
Of the undecideds.
robert epstein
And with one search.
With multiple searches, that number goes up.
Okay, so that's...
unidentified
It's mind control, it's an information war directed at the American people.
robert epstein
But in that study, that study was large enough so we did have a few people, about 8%, who could see the bias.
So we were able to look at them separately because the study was so large and you would think we're not going to get an effect with those people.
No.
They shifted even farther in the direction of the bias.
unidentified
Wow.
robert epstein
So seeing the bias doesn't protect you from the bias because there's this trust people have in algorithms because they don't know what they are and computer output because they don't know what that means.
And so they think if the algorithm itself is saying this is the better candidate, it really must be the better candidate.
Seeing the bias does not protect people from the bias.
So is there a cap?
unidentified
That's the question you're asking.
robert epstein
So right now we're studying things that are new for us.
We're studying what happens if one of these platforms is using one of these manipulations over and over and over and over again.
And so far what we're seeing is that the effect is additive.
unidentified
But it's probably at a decreasing rate.
Correct.
robert epstein
So you do kind of hit an asymptote.
But the point is just by repeating these, if you expose people to similarly biased content, The numbers go up, the shift gets bigger, but you're right, the rate goes down.
Now, the other thing we're starting to look at is multiple platforms.
What happens if three or four of these big platforms in Silicon Valley are all supporting the same candidate?
And we're seeing in our initial work, again, that those are additive effects.
So this is scary stuff because way back, remember Eisenhower's famous speech from 1961?
A military industrial complex?
tim pool
Yep.
robert epstein
With that same speech, Eisenhower was warning about the rise of a technological elite that could control public policy without anyone knowing.
And guess what?
The technological elite are now in control.
tim pool
There was a hearing with Twitter, back when it was still Twitter, and I think one of the most important things that was brought up was that, uh, I can't remember who it was, but one of the members of Congress said, if you go onto Twitter and create a new profile right now, it shows you all the suggested follows are Democrats.
No Republicans.
So that means as soon as you sign up, you say, I don't know, I'll follow this person, I guess, you're being slammed by pro-Democrat messaging.
And that was just Twitter.
And that was just the who to follow.
That's not even, you know, now we've seen this big push towards a switch from reverse chronological feeds into algorithmic feeds.
And perhaps people don't realize what the true power is behind that and why they want it.
For one, they can make more money with it, for sure.
But it takes away your ability to subscribe to who you want.
This was a big deal with Threads.
This is Instagram's version of Twitter or whatever.
Worst platform ever, in my opinion.
Because I'm like, okay, I'm on Instagram, I'll sign up for Threads.
There is no reverse chronological feed.
It was only algorithmic.
I was getting a bunch of Democrats in my feed, which was strange.
I don't follow them.
And I was getting weird entertainment stuff and weird jokes that meant nothing to me.
But their default position was, we're going to tell you what to look at.
And I wonder if what they were doing was intentionally trying to create a platform.
So look, you have Twitter.
Twitter defaults to the algorithmic feed.
Twitter very much is biased, even working with intelligence agencies with secret backdoors for moving content.
Lawsuits now underway and already resolved proving this.
Elon takes over, instantly Zuckerberg's like, we're gonna launch an alternative.
The innocent take on this is, well, you know, they see a market opportunity.
I disagree.
I think they realized, uh-oh, one of our, you know, key assets in this manipulation has just fallen to someone who disagrees with us.
So Threads rolls out, heavy-handed algorithmic feed, and it got a wave of complaints.
It was too overt.
Now they're saying, we're going to pull that back a little bit.
But for the longest time, Instagram has not been reverse chronological.
Reverse chronological, for those that don't understand, means just what it says.
You see, on your feed, the latest thing that someone posted.
And so, if your friend posts now, you'll see it.
But if your friend posted three hours ago, it's already long gone.
The argument from these big tech platforms is, oh, but what if you like that three hour old post?
We're gonna make sure you see it.
What ends up happening is, on Instagram, and this annoys the crap out of me, I get weird posts and I'm scrolling through my feed of things I don't care about.
They're testing you, right?
Seeding the battlefield.
How long do you linger on this post versus this post?
Then they know what to send you more of.
But what they're also doing is, using that as the argument, they're gonna start seeding you information to control what you think.
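A minimal sketch of the difference Tim is describing, with toy data invented by the editor and no resemblance to any platform's actual ranking code: the same posts, ordered two ways.

```python
# Toy illustration: the same posts, ordered reverse-chronologically vs.
# ranked by a score the platform chooses. Not any real platform's code.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    posted_at: int          # minutes ago
    platform_score: float   # whatever signal the platform wants to promote

posts = [
    Post("friend_a", "lunch photo", posted_at=5, platform_score=0.2),
    Post("friend_b", "new job!", posted_at=180, platform_score=0.4),
    Post("stranger", "post the platform boosts", posted_at=60, platform_score=0.95),
]

# Reverse chronological: you simply see the newest posts from people you follow first.
reverse_chron = sorted(posts, key=lambda p: p.posted_at)

# Algorithmic: the platform decides what you see first, based on its own score.
algorithmic = sorted(posts, key=lambda p: p.platform_score, reverse=True)

print("Reverse chronological:", [p.text for p in reverse_chron])
print("Algorithmic feed:     ", [p.text for p in algorithmic])
```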
And I gotta be clear, this UFO that we got sitting right here, nobody can see it, but this UFO, I got it because of Instagram.
They knew I wanted it.
robert epstein
Wow, look at that thing.
tim pool
Yeah, just floating.
It's cool, huh?
Yeah!
They send me the ad, and I say, I want that.
But what you don't see is that, sometimes it's not an ad, it's a post from someone saying, did you know about bad thing from this person?
Don't vote!
Vote for them instead!
That's the game being played.
robert epstein
Well, there's a little more to it, so let me explain.
These companies have another advantage over all the usual, the traditional dirty tricks, which are inherently competitive, and they don't bother me that much because they're inherently competitive, but the point is these companies have another advantage, which is they know exactly who is undecided.
In other words, who can still be influenced?
Exactly, they know down to the shoe size of those people, they know exactly who they are.
So they can concentrate, and in a manner that costs them nothing, they can concentrate just on those people.
So talk about swing states, swing counties, swing districts.
Okay, well here we're talking about, they know who the swing people are.
unidentified
So the political world, they do it all the time, try to identify based on voting histories, but what's Google doing to identify?
Is it looking at all their search?
Are they looking at getting everything off their phone to figure it out?
How are they doing it?
robert epstein
Well, you and I have been using, maybe not Tim because he looks a little bit younger than us, but you and I have been using the internet for 20 years.
tim pool
I've been using it longer than you guys.
robert epstein
Okay, that's cool.
tim pool
Late 80s, when I was a little kid, we had CompuServe.
robert epstein
Well then, I hate to tell you, but Google alone has more than three million pages of information about you.
Three million pages.
They're monitoring everything you do, not just if you're stupid enough to use their surveillance email system, which is called Gmail, or their surveillance browser, which is called Chrome, or their surveillance operating system, which is called Android.
These are surveillance systems.
That's what they are.
That's all they are.
unidentified
The Chinese couldn't do it better, could they?
robert epstein
But they not only are doing that, they're actually monitoring us over more than 200 different platforms, most of which no one ever heard of.
So for example, millions of websites around the world use Google Analytics to track traffic.
to their websites, and it's free, free, free. Of course, nothing's really free; you pay with your freedom. But the point is, Google Analytics is Google. And according to Google's terms of service and privacy policy, which I actually read over and over again, and whenever they make changes in it, if you are using any Google entity of any sort that they made, then they have a right to track you. So you are being tracked on all of those websites by Google.
Every single thing you do on those websites is being tracked by Google. So I don't need to ask you, I mean, you know about Facebook shadow profiles.
Of course.
tim pool
This is an amazing phenomenon. Yeah, let me explain it. Are you familiar with shadow profiles?
unidentified
No, but I'm sure I see them in my feed.
tim pool
Let me just, no, don't say you don't. To everybody who's listening: you have a Facebook profile.
robert epstein
And I love starting this because then someone says, no, I don't have a Facebook.
I've never signed up for Facebook.
tim pool
OK, and here's the really simple version.
unidentified
I'll play that role as a clone or ghost.
No, no, no.
tim pool
I'll give the simple version and throw it to Dr. Epstein, who knows better than I. But...
You're on Facebook, right?
unidentified
Yep.
tim pool
When you sign up, you get a little prompt.
Hey, would you like to add your friends and family through your phone book?
Simple way they can do this.
Your mom does not have a Facebook profile.
She's never signed up, but she does have a shadow profile.
When you sign up and say, import my friends, it then finds in your phone book, mom, 555-1234.
Guess what?
Your brother also signs up.
Mom, 555-1234.
What happens then is all those little bits of data, Facebook then sees that and says, we know that mom has these sons.
We know from public data on the phone number, mom's name is Jane Doe.
Now they've compiled a profile on your mom, her friends, her family, where she works, her salary, all that information from all these ancillary sources.
And you probably know better than I do, so I don't know if you want to elaborate.
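A bare-bones sketch of the mechanism Tim just described, as an editor's illustration: contact lists uploaded by different users are merged on a shared key, here a phone number, into a profile for someone who never signed up. The field names and matching logic are hypothetical, not any platform's actual system.

```python
# Bare-bones sketch of the shadow-profile mechanism described above.
# Hypothetical field names and logic; not any real platform's code.
from collections import defaultdict

def build_shadow_profiles(contact_uploads):
    """contact_uploads: list of (uploader, contacts), where contacts is a
    list of {"name": ..., "phone": ...} entries from that uploader's phone book."""
    profiles = defaultdict(lambda: {"names_seen": set(), "known_by": set()})
    for uploader, contacts in contact_uploads:
        for contact in contacts:
            profile = profiles[contact["phone"]]
            profile["names_seen"].add(contact["name"])
            profile["known_by"].add(uploader)
    return profiles

uploads = [
    ("son_1", [{"name": "Mom", "phone": "555-1234"}]),
    ("son_2", [{"name": "Mom", "phone": "555-1234"}]),
    ("coworker", [{"name": "Jane Doe", "phone": "555-1234"}]),
]

for phone, profile in build_shadow_profiles(uploads).items():
    print(phone, "->", sorted(profile["names_seen"]), "known by", sorted(profile["known_by"]))
```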
robert epstein
Well, from that point on, once that has been set up, Information continues to flow in and build that profile.
So that profile becomes, over time, immense.
Just as all these profiles are immense, the shadow profiles are immense as well.
So it means that they know who's going to vote, who's not going to vote, who's made up their minds.
They don't bother with those people.
Who has not made up their minds.
They know exactly who those people are.
That gives them an advantage which no campaign manager has ever had in history.
Because they know exactly who those people are now.
Now let me explain.
unidentified
Who's using it?
Is Google using that to influence who they want to influence?
Or are they selling it to candidates to do that?
robert epstein
No, they're doing it themselves because they have a very, very strong political culture.
And so they have their own agenda, which they are trying very hard to spread around the world, and they're impacting right now more than five billion people every single day, so they're doing a pretty darn good job.
One of the leaks from Google a couple years ago was an eight-minute video called The Selfish Ledger.
If you type in, please don't use Google to do this, use the Brave search engine, okay?
Anything but Google.
Don't use Google.
Type in my name, so, Dr. Robert Epstein, and type in Selfish Ledger, and you will get to a transcript I made of this 8-minute film that leaked from the Advanced Products Division of Google.
And this video is extraordinary because this video, which was never meant to be seen outside the company, is about the ability that Google has to re-engineer humanity.
They call it behavioral sequencing.
And they do have that ability, and they're exercising that ability.
So, they know more about us than we know about ourselves.
They even have, for many of us, our DNA data.
That's why Google has, for many years now, been investing in DNA repositories.
That's why Google helped to set up 23andMe.
That was set up by one of the spouses of one of the founders.
So the DNA information becomes part of our profiles, in which case they know about the diseases we're likely to get, and they can start to monetize that information long before you even get sick.
They also know which dads have been cuckolded, by the way.
So, you know, they know. Oh, and now Fitbit, they own Fitbit, so they're getting physiological data 24 hours a day. They benefited tremendously from COVID, so much so that it kind of makes me wonder whether they had something to do with COVID. But they benefited from COVID because of their cooperation with the government in trying to track the spread of COVID.
They got access to hospital data for tens of millions of Americans.
So they got access to medical records, which they've been after for a long time.
COVID gave them that access.
They bought the Nest Smart Thermostat Company a few years ago.
The first thing they did without telling anyone was put microphones inside of some Nest products.
So now they have microphones in people's homes, millions of homes, and they start to get patents.
I have copies of them.
Patents on new methods for analyzing data inside a home so that you can make reasonable inferences about whether the kids are brushing their teeth enough, what the sex life is like, whether there are arguments taking place.
All of that, of course, can be monetized, but also it becomes part of our profiles.
And that information is used to make predictions about what it is we want, what we're going to do, whether we're going to vote, whether we're undecided, and it gives them more and more power to manipulate.
So I'm going to give you a glimpse of one of our newest research projects, data that we just got, so this will be just an exclusive for your show.
Okay.
And this is called DPE, Digital Personalization Effect.
We've been studying the impact that bias content has on people.
We've been doing that since, whatever it is, 2013.
But now, in the new experiments, we've added personalization.
So we're comparing what happens if we send people biased results or biased content of any sort, and we already know the shifts we're going to get that way, and now we're personalizing it.
So based on the way someone answers questions in the beginning, we either are or are not sending them content from news sources and talk show hosts and celebrities that they trust.
And if they're getting the same content but it's from trusted entities, trusted sources, that can triple the size of the shift we get in voting preferences.
It can triple it!
Now, this is one of our new research areas.
It's going to take a long time for us to work out all the details, but think about that.
These companies are not only sending biased content to satisfy their agenda for humanity, they're sending personalized content to everybody.
Do you know this big trial that's in progress right now?
A couple days ago, a Google executive said under oath, We don't make use of the massive amount of information we have about everyone.
We don't use it.
Well how are they sending out personalized content to everyone if they're not using it?
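For readers unfamiliar with this kind of design, here is a schematic sketch of how a randomized comparison like the personalization experiment described a moment ago is typically scored: each group's pre/post shift in preference is measured and the two shifts are compared. The numbers below are fabricated placeholders for illustration by the editor, not data or code from the study.

```python
# Schematic sketch of scoring a randomized comparison like the "digital
# personalization effect" study: the pre/post shift in candidate preference
# for a group shown generic biased content vs. a group shown the same content
# attributed to sources they trust. Placeholder numbers, not study data.
from statistics import mean

def preference_shift(pre_scores, post_scores):
    """Average change in preference (e.g. on a -10..+10 scale) after exposure."""
    return mean(post - pre for pre, post in zip(pre_scores, post_scores))

# Placeholder participants (randomly assigned to conditions in a real experiment).
generic_pre, generic_post = [0, 1, -1, 0], [1, 2, 0, 1]                # biased content only
personalized_pre, personalized_post = [0, -1, 1, 0], [3, 2, 4, 3]      # same bias + trusted sources

generic_shift = preference_shift(generic_pre, generic_post)
personalized_shift = preference_shift(personalized_pre, personalized_post)

print(f"Shift, biased content only:   {generic_shift:+.2f}")
print(f"Shift, biased + personalized: {personalized_shift:+.2f}")
print(f"Personalization multiplier:   {personalized_shift / generic_shift:.1f}x")
```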
tim pool
So I'm wondering if this algorithmic control, these ephemeral experiences, I don't know, can they overcome reality, right?
Joe Biden does a bad thing.
They can try and make that story go away, but Joe Biden does bad thing, bad thing, bad thing, bad thing, bad thing.
Eventually the news gets out and they can't stop what is actually happening, right?
robert epstein
There are certainly limits on what they can do, but you'd be surprised at how few limits there are, because there are no constraints on them.
There are constraints on newspapers and magazines, and we're used to looking at sources where there are constraints.
I mean, think of the things that you don't see in newspapers.
Right?
There's no pornography in newspapers.
You don't even think about it.
In fact, there's so much weird stuff that's just not in traditional media sources that we just don't give it a second thought.
tim pool
I think that if a child gets access to adult content on YouTube, then YouTube's executives should be criminally charged immediately.
And several of their employees, I mean, indictments, 2,000, 3,000 people instantly.
If you had a child walk into an adult bookstore and they let him in and started letting this kid look at this stuff.
Yeah, there's going to be criminal charges, going to be civil suits.
It is a violation of state law outright to allow children to get access to this material.
But all the platforms allow it.
robert epstein
That's what I'm trying to tell you, is that there are no constraints on them.
In fact, what we have is the opposite.
We have Section 230 of the Communications Decency Act of 1996, which prevents us, for the most part, from suing them for any content at all that they post on their platforms.
Now, that was meant as a way to help the internet to grow faster, which made some sense at the time.
unidentified
It's time to retire that.
robert epstein
It doesn't.
unidentified
230 needs to go away.
robert epstein
Except it's not going to go away.
tim pool
I don't know about it should go away.
unidentified
Well, it seems to be significantly revised.
tim pool
Yeah, there needs to be a deep assessment as to what it's supposed to be doing because it's not doing what it should be doing and it's allowing protections in bad ways.
robert epstein
Well, the point is that the arrogance they have stems in part from the fact that there really are no constraints.
So, you know, we have these two kinds of sources of information in our world today.
One is the traditional sources where there are lots of constraints, period.
And then there's the internet where there are no constraints.
And that's wrong.
And especially lately, I'm getting more and more concerned about the way it's affecting kids.
Because there's a lot of mysterious things happening with kids that parents just cannot figure out.
We're now on the verge of being able to figure it out because it has to do with this weird content that these companies are sending to kids.
And I think that this is not random.
I think that they're sending out this particular kind of content for particular reasons.
For example, why would you have... In fact, I was sure you were gonna ask me and you didn't ask me.
Why would you suddenly, in the middle of an innocuous cartoon, insert something that's just ghastly and horrible?
Why would you do that?
Why?
Because of what's called negativity bias, which is a great term that's used in several of the social sciences.
It's also called the cockroach in the salad phenomenon.
So you have a big beautiful salad in a restaurant and then all of a sudden you notice there's a cockroach in the middle.
What happens?
It ruins the whole salad.
It's just amazing.
You send back the whole salad.
Now you could eat around the cockroach, but no.
The salad is destroyed.
So in other words, we are built so that if there's something negative and horrible and possibly threatening, all of our attention is drawn to it.
It affects our memory.
It really has a tremendous impact on us.
We're built that way and evolution made us that way because that makes sense, right?
If there's something out there that's a little scary... Could be contaminated.
That's exactly right.
tim pool
Roach is running all around the lettuce.
You have no idea.
robert epstein
Now, if you had a plate of sewage and then you put a nice piece of See's candy from California in the middle of it, it doesn't help the sewage at all.
So there is no, you know, there's no corresponding effect for positive things. But for negative things, I think that's one of the reasons why we're seeing what we're seeing: they're trying to addict kids more and more to stay on the platform.
First of all, that's called watch time.
They want them staying on the platform and they want them coming back over and over again more and more frequently.
I think that's one of the reasons why they're putting in these ghastly moments.
tim pool
But how does that, wouldn't it make them not want to come back?
robert epstein
Oh, no, no, not at all.
When you're driving on the highway and there's an accident.
tim pool
You slow down.
robert epstein
You can't take your eyes off of the accident and you're trying to keep your car in a straight line and you can't even keep it because so much attention is drawn to that.
tim pool
Well, I think it has to do with we want to know what happened.
And the reason we do is because, evolutionary psychology, it would benefit us, if we see some kind of dangerous circumstance, to understand as much of it as we can.
We're more likely to survive if we're doing that.
robert epstein
Well, little kids have those same built-in tendencies.
They want to know what's happening.
They want to know what this is.
They want to understand this.
They want to understand why they're feeling like it's so crazy right now.
unidentified
Yeah, and they're forming their belief systems, too.
I think it's not just the weird things that pop up.
There's blatant, rampant indoctrination.
robert epstein
Absolutely.
And I have five kids myself, and I'm, you know, there's nothing more important to me in this world than my kids, so I'm always hoping, you know, Google will leave them alone, because I've had threats.
tim pool
Take away their phones?
robert epstein
Oh, no, no, no.
tim pool
Take away their tablets, their phones, no computers?
robert epstein
No, no, no, I've had, no, I've had threats.
I mean, there are people who work with me who've been in danger, and I've had actual threats.
2019, that's when I testified before a Senate committee.
And that same summer, I also did a private briefing for a bunch of AGs.
Ken Paxton was there running that meeting.
When I was finished, I went out in the hallway.
A few minutes later, one of the AGs came out.
I know exactly who it was.
He's still an AG.
He came up to me and he said, Dr. Epstein, I don't want to scare you.
He said, but based on what you're telling us, I predict that you're going to be killed in some sort of accident in the next few months.
unidentified
Wow.
robert epstein
And then he walked away.
Now, obviously I'm here and I wasn't killed in some sort of accident in the next few months, but my wife was.
What happened?
It was a terrible, terrible, uh, car accident.
She lost control of her little pickup truck and she spun out on the freeway and she got, uh, broadsided by a semi tractor trailer.
unidentified
Oh, sorry.
tim pool
I'm sorry to hear.
robert epstein
But then her little pickup truck disappeared.
It was never examined forensically and it disappeared from the impound yard and I was told it ended up somewhere in Mexico.
tim pool
Well, what makes you, is that what leads you to believe that you think there was foul play?
robert epstein
What I believe is I will never know what happened.
I had my head to her chest.
I heard her last breath.
I heard her last heartbeat.
And I will never really know what happened.
But I do know this.
Afterwards, when I was starting to recover, which I really haven't fully recovered from still, but afterwards my daughter showed me how to use Misty's phone to get all kinds of stuff.
You know, it's an Android phone.
And, you know, one thing I found on there was her whole history of movement.
Every single place she had been, it tracks and it shows exactly what the addresses are and how many minutes she's at each place.
And, you know, among other things, that tells me that if someone wanted to mess with her brakes or some electronics in her vehicle, they knew exactly where that vehicle was the night before.
tim pool
What year was the vehicle?
robert epstein
I'm not sure.
I'm not sure.
It was a Ford Ranger.
I'm not sure of the year.
tim pool
I think probably, it's probably earlier than this, but I'm pretty sure like a 2012 or earlier model.
These are fully capable of being remote controlled.
So, uh, a lot of the modern power steering, I was surprised to learn this.
This is, um, I think like 10 years ago, you had these very, uh, renowned cyber researchers, uh, cybersecurity researchers who were able to remotely hack a car and control it.
And the first thing I thought was, what, how do you remote control?
You've got, you've got to have the, a mechanism by which you can actually move the steering wheel without hands.
I understood power steering existed, but I didn't realize that there were actual motors within the steering wheel that can move it without physical kinetic input.
Sure enough, these researchers found that there was a way to remotely access through a very narrow communication channel.
Into the steering system, and they were able to, it's a famous video, Wired did this whole thing on it, where they're sitting in the back seat, and they have a tablet or a computer or whatever, and they're making the car stop and accelerate and move all remotely.
And that's when I, that's around the time I think most people learned that the steering systems are already electronic and automated, and digital inputs can shift this if someone can input code into the system.
Now cars today, we're well beyond that.
Now, you quite literally have automatic cars, which means you get into your robo car, the doors can lock and not open, and it can just drive itself off a cliff.
The difficulty here, though, is everyone's gonna ask, was it the self-driving capability that resulted in this freak accident happening?
But in this time period, without getting into specifics, because there are, you know, people's families involved, there are stories of individuals working on very serious foreign policy stories, going a hundred miles an hour down the road and slamming into a tree and the car explodes.
Without getting into specifics, there are stories related to this where the individual in question said they thought their car was tampered with and asked someone to borrow their car because they saw someone tampering with it.
And then shortly after, their car, 100 plus miles an hour, slams into a tree, explodes, and the journalist dies.
Everybody knows this story.
So when I hear something like this, this is the scary thing about disrupting any kind of system.
It's really hard to build a massive complex system.
It's really easy to throw a twig into the spokes of a bike and have it flip over.
You get a massive machine with millions of moving parts, and someone drops a marble into it somewhere, the whole thing explodes.
You take a look at, um...
Someone who's working on, say, uncovering a massive mechanized system and understanding how fragile it may be.
Oh, sorry.
He got mugged.
That's it.
People think that assassinations and these things are always going to be some, like, a strange man was spotted coming out of a dark alley with a trench coat on and we heard a few gunshots before the man jumped in a black van and sped off.
unidentified
No.
tim pool
Oh, remember that guy who was working on that big story?
He got mugged yesterday.
That's it.
They caught the guy who did it.
Yeah, he died.
It's that simple.
unidentified
It could be even softer, you know, if you import social credit systems into the algorithms for controlling your life, your car, your driving. You know, you might have, just like you have suppression in your social media sites, you may have suppression in your function, your ability to spend money from your own bank, your ability to drive your own car, right?
Or shoot a gun, you know?
They might want to... I'm talking about electronically putting, uh, blockers on the ability, you know, to use a gun.
tim pool
Oh yeah, there's pros and cons to this.
Smart guns, where it requires handprint sensors, so that it can only be used by the individual it's programmed for.
The bigger question is, is it connected to the internet?
In which case, people in power can bypass your restrictions.
And then what you'll end up with, you as a home user, trying to use your weapon, one day wake up to find you can't use it, but the authorities can.
robert epstein
I want to make a plea, and I'm going to see if you'll even let me repeat the plea, maybe at the end of this show.
Sure.
We need help building this system that we're building, which we're calling America's Digital Shield.
The project is called the TechWatch Project, so if you go to techwatchproject.org, you can learn about the project.
But I want to send people to one place, mygoogleresearch.com, because that summarizes everything, it's got videos, it's got links to all our scientific papers.
And more importantly, it has donation links.
And what we're asking people to do is to sponsor a field agent.
We only pay our field agents, and in my opinion, these are heroes.
I mean, we have to approach a hundred people before one person will say, yes, I'll do that.
You can put special software on my computer so you can monitor the content.
By the way, we don't violate anyone's privacy when we do this because their data are being transmitted to us 24 hours a day without any identifying information.
Same with the data coming from their kids' devices.
So we're doing the opposite of what Google does.
We're only looking at aggregate data, not individual data.
The point is, we only pay them $25 a month.
Just like the Nielsen families, which are used to make the Nielsen ratings, they get paid very, very little money.
They're doing this as a kind of public service.
Our field agents, and we now have more than 12,000 in all 50 states, politically balanced, all registered voters, we only pay them $25 a month.
$25 a month.
But if you take 12,000 times 25, do the math.
What does that come out to?
Anyone?
I think that's $300,000 a month.
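The arithmetic here, worked out; the annual figure is an extrapolation, not a number stated on the show:

$12{,}000 \text{ agents} \times \$25/\text{month} = \$300{,}000/\text{month} \approx \$3.6\text{M/year}$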
unidentified
$370,000.
robert epstein
Yeah.
So we're talking about something that's very expensive, and the only way we can really make this into a permanent project is if we have tens of thousands of Americans step up and sponsor a field agent.
We've had, in the last two weeks, about 150 people do that, which is great because we haven't really been publicizing this.
So if you go to mygoogleresearch.com, okay, there are donation links, and it's all tax deductible, completely tax deductible, because we're a 501(c)(3).
And you click and then put in 25 and put in monthly.
And as I say, we've had 150 people do this in the past week or two, but we need tens of thousands.
And so there's my plea, and I'm gonna try to repeat it another time.
This system, in my opinion, is not optional.
You have to have this kind of system in place.
I just happened to be the first one to have built it, but if someone else wants to build them, that's fine.
But you have to have this kind of system in place.
It has to be credible.
It has to be politically balanced.
You have to have representative samples, et cetera, et cetera.
You have to have the system in place or you will never understand what these companies are doing to us and our kids and to our elections.
You will have no clue.
Because what they can do, they can do invisibly and on a massive scale.
The way to stop them is by shining the light on what they're doing.
It's that old quote from Justice Brandeis, right?
Sunlight is the best disinfectant.
So that's the only way that I know of to stop them.
No law or regulation is gonna stop them because first of all, our governments are dysfunctional.
But even then, they would just go around the law.
But they can't go around what we're doing, because we're actually preserving the content that they're sending to real people.
tim pool
The challenge, I suppose, is even if you get the system up and running, is Congress going to be able to get anything done?
Is the Senate?
These institutions are stagnant and incapable in my opinion.
robert epstein
You're a hundred percent right.
That's a different problem.
I do want to make sure though that the elections are free and fair because at the moment, at the moment we are being controlled by a technological elite exactly as Eisenhower warned about a gazillion years ago.
He said you have to be alert.
We have not been alert.
We have allowed these entities to become behemoths, and they are in control, and they're as arrogant, I know some of them personally, these are the most arrogant people you'll ever meet in your life.
They are gods, and they know it.
unidentified
I think their original, Google's original expression was like, don't be evil.
robert epstein
Yeah, they got rid of that.
In 2015, they dumped it.
unidentified
Yeah.
tim pool
Which basically means their new motto is be evil.
robert epstein
Or no, I think it's really, don't be evil, unless we have some reason to be evil.
tim pool
That just means be evil.
I mean, everybody thinks they're, you know, these people all think they're morally superior to everybody else.
And the problem is, there's a great quote, I can't remember who it was by, but it was basically, You know, these people who think they're so much smarter than everybody else, these politicians, they're not.
They're just another person who, you know, everybody thinks that they should be in charge because they're the smart person, but that just proves they're not.
I'm totally bastardizing what the quote is, but the general argument is people get power and then think, I'm smarter so I should decide what we should do, and that's basically what all tyrants, all dictators, all authoritarians tend to think.
History is rife with examples of people who have destroyed the lives of so many and caused so much suffering trying to chase down that yellow brick road or whatever.
robert epstein
But think about that.
Think about a Mussolini, a Hitler.
Think about people who really have been dictators and have been in charge of a lot of people and have been trying to expand and expand and expand.
Not one of them has had anywhere near the power that Google has.
Because Google is exerting this kind of influence, not just in the United States, but in every country in the world outside of mainland China.
And of course, they've also worked on the sly with the government of mainland China to help China control its own population.
And by the way, lefties out there, okay, because I lean left myself so I can talk to my lefty friends that way.
By the way, lefties, they don't always support the left!
You go country by country and Google does whatever it wants to do.
In Cuba, they support the right.
tim pool
Well, I'm curious.
Right now, we're seeing an interesting phenomenon.
I don't want to get into the politics of Israel-Palestine, but just considering it as a very contentious issue right now.
Isn't it in the interests of our government and these big tech companies, unless it's not, and it tends to be, to support Israel, right? To provide foreign aid to Ukraine, to Israel? We've seen tremendous bias in favor of intervention in Ukraine, but now we're seeing all of these young people; there's a viral video where they're marching down the halls of their school, chanting from the river to the sea.
So, again, not to get into the politics of Israel-Palestine, my question is, how do you have such divergent political views on a contentious issue if Google controls it?
Is this an area they've overlooked, or is it intentional?
robert epstein
That's, again, where monitoring systems are critical, because, you know, have we looked into that?
No.
But could we look into it?
Yes, because we're not only capturing all this information, we're archiving it.
So that means you can go back in time and find out whether they were doing something deliberate.
Now, deliberate is a tricky word though.
You know, deliberate means that an employee, a mischievous prankster, techie, you know, guy, made something happen.
Or it means there's a policy coming down from executives.
That's usually what we think of as deliberate.
But with Google, it works a little differently.
Deliberate can also mean you leave the algorithm alone.
It's called algorithmic neglect.
And you let the algorithm do its thing.
Now the algorithm has no equal time rule built into it.
I mean it would be useless if it did, right?
It's always going to do its best to find the best and order things from best to worst.
So if you just leave the algorithm alone, it's always going to take one perspective and put it at the top.
And that's going to shift a lot of opinions, especially among vulnerable groups.
And the most vulnerable group there is, is young people.
So deliberately, semi-deliberately, it's very possible that what you're seeing in this situation with Israel and Ukraine, especially what's happening with young people, it's very possible that all that is being driven by algorithms.
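A small illustration of the "algorithmic neglect" point above. The results and scores below are invented, and this is not Google's actual ranking code; it's just a sketch showing how a relevance-only sort, with no equal-time rule, stacks whichever perspective scores highest at the top of the page, while a crude interleaving would not.

```python
# Hypothetical results and scores: a sketch of ranking with and without
# an "equal time" rule, not any real search engine's code.
from dataclasses import dataclass
from itertools import zip_longest

@dataclass
class Result:
    title: str
    perspective: str   # "A" or "B", standing in for two sides of an issue
    relevance: float   # whatever signal the engine optimizes

results = [
    Result("Story 1", "A", 0.92),
    Result("Story 2", "A", 0.90),
    Result("Story 3", "B", 0.71),
    Result("Story 4", "A", 0.88),
    Result("Story 5", "B", 0.64),
]

# "Algorithmic neglect": sort purely by relevance. Perspective A fills the
# top of the page simply because it scores higher.
ranked = sorted(results, key=lambda r: r.relevance, reverse=True)

# A crude "equal time" alternative: alternate between the two perspectives.
side_a = [r for r in ranked if r.perspective == "A"]
side_b = [r for r in ranked if r.perspective == "B"]
balanced = [r for pair in zip_longest(side_a, side_b) for r in pair if r]

print([r.title for r in ranked])     # one-sided top of the page
print([r.title for r in balanced])   # alternating perspectives
```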
tim pool
I think it without a question is.
unidentified
And what I'm telling you is that... Potentially bad actors driving it and buying it.
They could be placing it, right?
robert epstein
Well, buying it doesn't kind of work because buying is competitive.
So, in other words, if Republicans want to try to push up their candidate higher in search results, well, Democrats can do the same thing.
That's competitive.
The problem is if the platform itself wants to take a particular stand, there's nothing you can do.
unidentified
Right.
robert epstein
And so what I'm saying is that's where you've got to capture the ephemeral content and learn how to analyze it very quickly, and then you can actually answer questions like the questions Tim was just asking, which is, what's going on here?
That's really what he's saying.
unidentified
At this point, you don't know.
We need to monitor that.
Any bias towards jihadis versus Israel?
robert epstein
I'm saying we're collecting so much data that we could also just go back and look in our archive and search for that kind of content and see what's happening.
That's why it's so critical that this content be captured, because if you don't capture it, you can never go back in time and look at anything that was happening.
unidentified
It's an ambitious project, you know, out by Dulles.
There's just miles and miles of data farms, you know, that Google and Apple, they're doing this already.
So to try to, you know, it's a noble cause and I think it's very useful.
tim pool
I think we're getting to the point, we may be there now, where the U.S.
government through big tech and data farms and all that can predict your behavior, and we're getting to the point of pre-crime.
So, jokingly, we often refer to the fact that Facebook knows when you're going to poop.
I don't mean they know if you're feeling it, they know when you will before you even feel it.
The way they do it is, I wrote, there was a great article talking about this, and they used this as a joke to try and drive the point home.
People don't understand the tiny bits of data and what it turns into and what it can mean.
For instance, if you were to take all of your health data and then have a doctor look at it, they're going to be like, no idea.
It looks like you're healthy.
But there could be tiny markers here and there that are overlooked or not seen.
You take the data of every person in this country, put it into a computer, the computer instantly recognizes these seemingly innocuous things all occur in people who ten years later get cancer.
So as a doctor can't make that correlation, the computer does.
Facebook will, they know your location because most people have location services turned on, and they know that if someone sits still for 35 minutes, then gets up and moves two meters, and then sits still again, they're going to go to lunch in 27 minutes on average.
It's not perfect, but it's probabilities.
And so what happens is they know when you're going to eat, They know based on all of the movements.
You mentioned the phone showing all the different places you've been and how long you were there.
That easily gives them the data on when you are most likely to use the bathroom.
They can also factor in proximity to bathrooms, meaning you're holding it and they know.
But it's silly.
But think about what that translates into.
They can see you lost your job.
They know that the movements you've been making in your office have become increasingly sporadic over the past few weeks, indicating some kind of conflict or turmoil.
There's stress factors.
There's the frequency of messages you're sending.
There's the amount of times you're going out to eat.
Thus, you're likely to be fired or, you know, quit your job.
This also indicates you're less likely to have money.
They can then look at how often you're driving your car, how often you're buying gas, and then predict a 73.2% chance this individual will commit a crime within the next seven to eight months due to, you know, these factors. Then they put a flag out to a local law enforcement agency saying, here are your high-probability individuals. And then, all of a sudden, one day you walk outside, you're still at your job.
You weren't fired yet.
unidentified
You're likely to be.
tim pool
You haven't done anything.
And there's a cop outside your house looking at you as you walk by.
Then, the computer says, law enforcement presence has decreased the probability of crime by 17.8%.
All of those things could be happening right now.
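A toy version of the kind of scoring Tim is describing: a handful of dwell-time and usage signals fed through a logistic score. The features, weights, and the "lunch soon" target are all invented for illustration; this is not Facebook's or anyone else's actual model.

```python
# A toy behavioral-prediction sketch. Every feature and weight here is
# made up for illustration; no real platform's model is being described.
import math

def logistic_score(features, weights, bias):
    """Turn a weighted sum of behavioral signals into a 0-1 probability."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical signals pulled from location / usage history (roughly scaled).
features = {
    "minutes_still_before_moving": 35 / 60.0,   # long sit, then a short move
    "meters_moved": 2 / 100.0,
    "restaurant_visits_per_week": 3 / 10.0,
    "messages_sent_per_day": 40 / 100.0,
}

# Hypothetical learned weights for "goes to lunch within 30 minutes".
weights = {
    "minutes_still_before_moving": 1.4,
    "meters_moved": -0.3,
    "restaurant_visits_per_week": 0.9,
    "messages_sent_per_day": 0.2,
}

print(f"p(lunch soon) = {logistic_score(features, weights, bias=-0.5):.2f}")
```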
unidentified
Or you're going to a drop box to stuff a bunch of ballots, you know, your location to- That they like, if they like it.
If they like it.
tim pool
And then what they want to happen is they want the other to get caught doing it and them to not get caught doing it.
So they know, and think about how crazy it is, because if we get to this point where we truly have some like sentient AI, we are just pawn puppets in whatever that AI may be doing, whether it is conscious, sentient or not.
It will just be a system that runs that no one has control of anymore.
And so, it will know.
Actually, have you guys, I don't know if you watch movies or whatever, I just watched Mission Impossible Dead Reckoning.
And this is basically- Me too.
You saw it?
robert epstein
Yeah, that was great.
tim pool
Yes, this is what it's about.
That, you know, Tom Cruise's character, was it Ethan Hunt or whatever, they all realize there is this AI that has infected the internet and they call it the Entity.
And everything they're doing has been predetermined by probability of what the machine expects them to do.
And it's really, really crazy.
I don't want to spoil the movie.
But, like, the villain is chosen specifically because of his relation to the protagonist, and what the protagonist will respond to and how he'll respond.
So the entity, the AI, has planned out all of this, and it's like, even though the characters know they're on that path, they're given impossible choices which push them in the direction of what the AI wants them to do.
And they can't, and like, to break free, and this is part one, I guess part two will be coming out at some point.
I like the Mission Impossible movies, they're fun.
But the way I've described the future is, okay, imagine a future where your job is indescribable.
You have a gig app, you know, and so, you know, people are doing Uber and people are doing these gig economy jobs.
So you wake up in your house or whatever and you, you know, have breakfast and you're watching the news or whatever.
Then all of a sudden your phone goes, and you know, JobQuest or whatever the app is called says, new task worth $75.
And you're like, oh, 75 bucks.
What do I got to do?
And then it says, receive this object from this man and bring it to this address.
And it's a picture of a guy, and then the object is this weird-looking mechanical device, you have no idea what it is.
And you go, easy, I can do that!
And you walk down, easiest job in the world!
Guy hands you the thing, like, thanks man!
You click receive, then you walk to the address, and there's some guy standing there, and he's like, you have the thing for me?
Like, I sure do!
You hand it to him, and then, 75 dollars in your account!
You have no idea what you gave, no idea who you met, no idea what's going on, and you don't care!
Because now you go back home, and you're 75 dollars richer, and it only took you 20 minutes!
What a great job!
And what you don't realize is, it's all compartmentalized through this algorithm and you're building a nuclear bomb.
Or you're building some kind of spaceship or doomsday weapon or new component that the AI system has determined it needed to increase its efficiency.
These strange tasks that are indescribable.
Right now, you know, your app says someone wants food and you're like, oh, I get it.
But what happens when we come to this job, like, already with Fiverr, we're at the point where, hey, can you do a weird miscellaneous task for some money?
Once we get to the point where you've got hyper-specialized algorithmic prediction models or whatever, we get to the point where there's an app where it could be a human running it, and the human says, I want to build a rocket ship.
And so what's the easiest way to do it?
Is the easiest way to build a rocket ship to sit down over the course of a few years, having all these hiring meetings and interviewing people and trying to find someone who can build something, or the McDonald's method?
McDonald's, when they launched, it used to be you needed a guy who knew how to cook.
You got to get that burger just right.
He's going to put the fixings all on it and then serve that burger.
Takes a long time.
You got to pay that guy a lot of money.
McDonald's said, let's hire 10 people who can get good at one thing and then someone grills the burger.
Someone puts the burger on the bun.
Someone puts the mayo and the mustard on it or whatever.
Someone throws the fries in.
One person for every small, minor task, which is easier to do.
We can get to the point where a human being with no specialties only needs to do the bare minimum of their skill set in order to help build a spaceship, a nuclear bomb, or even a skyscraper.
And it sounds like, you know, there could be some good coming from it.
Oh, maybe we can more efficiently produce buildings and more efficiently align people with jobs they might want to do.
But then evil people, of course, will always weaponize this for evil ends.
Or I think that the scarier prospect is the artificial intelligence just becomes outside of the confines of the humans who created it.
The example I'll give you is Jack Dorsey.
The best example of a human being who has guzzled their own refuse.
Jack Dorsey builds, creates Twitter.
Twitter then starts, the algorithm that they implement, starts pushing out an ideology, which he then starts guzzling into his own mouth.
So what happens is Twitter becomes a sewer of psychotic, fractured ideology.
He's on Twitter reading the things that he produced and then consumed.
And it pollutes his brain and breaks it.
And a guy who went from trying to create the free speech wing of the Free Speech Party ends up having this interview with me and Joe Rogan and his lawyer about misgendering policies and other nonsensical, inane ideas, because he's basically taking a plug from his own ass and shoving it down his throat.
It's this information sewer of Twitter, the algorithm he created, the unintended consequences feeding himself.
So when we look at YouTube and how they're feeding all of these children these shock videos, what's gonna happen is human society begins consuming its own waste and refuse.
These kids grow up with fractured minds because of this insane information they absorbed as children, and this leads to not a utopian future where AI gives us a better life, it leads to children growing up having deranged views of what is or should be.
These kids who watch this weird stuff of Hitler, you know, in a bikini.
How many of them are going to have depraved, degenerate predilections where they're begging their wife to put the Hitler mustache on and other weird nonsense?
Or showing up to work in bikinis with Hitler mustaches because, as a child, this is what was jammed down their throat.
Not everybody.
But a lot of these kids may end up this way.
And so one of the ways I describe the future is, in the most inane way possible, is corn.
A future where all anyone ever talks about is corn, the biggest shows with 80,000 people in the stands, and there's a husk of corn sitting on the stage, and they're all just screaming, I love it!
And then a guy walks up, you know, and he's like, would you, uh, you get the corn done today?
That corn, yeah, corn's great!
Why?
Well, in the United States, we produce crap tons of corn.
And so the most simple way to explain this, the AI will be told to prioritize what humans need and desire, and it's going to look in the data and see that humans love making corn for some reason.
It's going to then start prioritizing low-level corn production, it's going to then start prioritizing the marketing of corn, and then eventually you have Taylor Swift on stage in a corn costume, shaking and dancing, going, corn, corn, corn!
And that will be normal to the people in this country because the algorithms have fed them this.
Now, we can see the absurdity of corn.
That's the point I'm trying to make.
You can't see the absurdity of the invisible.
So, uh, this is how I explain how they target children.
As adults, if we were told on YouTube to watch this video of Tucker Carlson complaining about immigration, we say, oh, that sounds interesting, I'll watch that.
Next up, Hitler in a bikini doing Tai Chi.
We'd be like, what?
That's insane.
Well, because we're adults.
We've become more resilient to the oddities and absurdities of the world.
We've developed personalities and perspectives.
Children don't have that safeguard.
They'll just say, okay, I guess.
They'll watch it.
It will then become a part of their psyche and their worldview.
When they're older, it won't be as something as obvious as corn.
It can be psychotic things like I mentioned, Taylor Swift coming out on stage dressed up like a demonic winged Hitler, screeching into the microphone, not even making any sounds, or not even like any discernible sound or pattern, and people in the crowd just going screaming and clapping and cheering for it, because an amalgamation of nonsense was fed into their brains, and that's the world we've created through this system.
robert epstein
Can I just connect up what you just said with what we've been discussing earlier?
Right now, Google, Microsoft, some other companies to a lesser extent, are integrating very powerful AIs into their search engines and other tools that they have.
So, AIs are becoming part of those.
So, here's what's happening.
More and more, the bias in search results, search suggestions, answer boxes is actually being determined by an AI.
Now what this means is that to some extent right now, it's AIs that are picking who's going to win elections.
Because think about it, the executives or rogue employees at Google, they're not going to be interested in every single election, right?
So that means that the vast majority of elections are in the hands of the algorithm itself.
But now the algorithms more and more are in the hands of smart AIs, which are getting smarter and smarter very, very rapidly.
What this means is we are headed, I mean, at full, full steam, we are headed toward a world in which AIs are determining what people think and believe, and who wins elections.
unidentified
Yeah, all kinds of elections.
robert epstein
All kinds of elections.
tim pool
And then once the programmer consumes the refuse of the A.I., they become slaves to it.
unidentified
And the candidates.
tim pool
Yeah.
unidentified
And the candidates become captured as well.
robert epstein
Now, over and over again, and I realize on this issue I'm a broken record, because I've just got to get this into people's heads.
This is another reason why we have to monitor.
Why we have to capture this kind of content so that it can be used to at least to try to create effective laws and regulations.
It can be used to bring court cases, you know, file lawsuits against these companies.
It can be used in clever ways by AGs and members of Congress.
It can be used by public interest groups to apply pressure.
You've got to collect the data.
So again, I'm going to send people to mygoogleresearch.com because we desperately need people to sponsor our field agents.
All I'm saying is, these are problems that you can imagine happening in the future.
I'm saying a lot of this is actually happening right this second.
Right now.
And these elections, I mean, you brought me back to 2016.
That election was rigged.
I mean, here was Trump.
unidentified
It wasn't rigged enough, though.
Hillary tried.
She was using old school methods.
You know, the old school stuff.
Stuff the ballot box, you know.
Get ghost voters to vote, but they've advanced to the next- But that wasn't really the rigging, the rigging actually was- That was some level rigging, but it wasn't- I'm like, I bet they've been doing that for 200 years, you know what I mean?
Oh yeah, Chicago, Kennedy, right, New York City, Tammany Hall- That's normal, it's just politics.
robert epstein
That's just normal, and that's inherently competitive, and it's not really a threat to democracy, not really.
But now you have a different kind of impact, which is a threat to democracy, it undermines democracy, because when these big companies want to favor one party or one candidate, there's nothing you can do!
You can't counter it, you can't even see it!
unidentified
Is it election interference in your mind?
Are they interfering with elections?
Is it subversive?
Is it insurrection?
robert epstein
From my perspective, given the rock-solid numbers I've been looking at for years, yes, this is election interference.
This is undue influence.
tim pool
I think it's more than interference.
I think we need to escalate that rhetoric.
It's more like election control?
I mean, they own it.
They own the elections.
They're not interfering.
They're running our elections.
They have subverted.
They have, I would say this is seditious.
That Google is engaged in a seditious conspiracy against the United States.
robert epstein
We calculated that as of 2015, Google alone was determining the outcomes of upwards of 25% of the national elections in the world.
And it's gone up since then.
As internet penetration has increased, that percentage just keeps increasing and increasing.
tim pool
So, you know... It would just be so funny if, like, what's really going on is that, you know, Sundar Pichai or whatever, he walks into a big room and there's a gigantic, like, red light.
And he's like, Google, tell us what we must do.
And it's like, the next moves you will make.
And it's like, the AI just owns them already and...
We're sitting here complaining about it?
Doesn't even care.
That's an interesting thought, though.
I mean, how are we... I mean, the fact that we're able to have this conversation at all means it's not lost.
robert epstein
I don't know, because there is a kind of control, you know, that's called benign control.
And my mentor at Harvard, I was B.F.
Skinner's last doctoral student there, he believed in benign control.
Now, if he hadn't been cremated, he'd be actually rolling over in his grave right now seeing what actually happened.
Because what he had in mind was there'd be these benevolent behavioral scientists and they'd be working with our government leaders and they'd be helping to create a society in which everyone is maximally creative, happy, and productive.
That's what his idea was of benign control.
But we have a different kind of benign control that's actually come about, and it's private companies that are not accountable to the public. They're in control, and from their perspective they're benign, because everything they're doing is in the interests of humanity.
That's where we are, and it's really hard. How do you get the people at Google to understand that what they're doing is unacceptable?
You know, even if we don't have specific laws in place... It's a battle of influence and power and authority.
tim pool
They're not going to.
They don't care.
They live in their world where they're drones to the machine, and you can't wake up a person who's built for it.
robert epstein
I do have some good news, which is that some of the AGs I've been working with over the years, they're just waiting.
They're waiting until our system gets big enough. They're waiting till we have enough data, and they're going to try one legal theory after another.
That's what you were doing just now.
They're going to try out one legal theory after another to take these companies down.
But you can't do it without the data.
Last year, the Republican Party, I don't know if you remember this, sued Google because Google was diverting tens of millions of emails that the party was sending to its constituents into spam boxes.
That got kicked out of court almost immediately because they didn't have the data.
But we can monitor that.
We can monitor anything and walk into court with a massive amount of very, very carefully collected, scientifically valid data.
tim pool
I think we're well beyond courts working and it mattering.
With the AI stuff we're seeing, there was this really crazy video we watched last night on the show of a car burning and it was generated in Unreal, Unreal Engine.
But if it were not for them revealing that it was generated by the program, it looked real.
So what happens now when audio gets released and it's Donald Trump saying something naughty and Trump sues for defamation.
He goes to court and he says, this is an AI generated audio of my voice.
And the court says, prove it.
How do you, what do you mean?
I heard you say it.
And then he says, I have an expert here.
And the expert says, I looked at the waveform and, using the forensic tools, determined it is AI-generated.
And then the defense goes, we've got an expert here.
This expert says, no, we checked it and, sure enough, it's real.
Trump said it.
That's it.
unidentified
We had this case yesterday, I mean, or two days ago, where the DeSantis campaign was putting out images of President Trump and Fauci.
tim pool
Oh, that was a couple months ago.
unidentified
It was a while ago, yeah.
Some news, yeah.
There was somebody that put it out that they... The DeSantis campaign did.
Yeah, but they proved which one was the fake one and which one was the real one.
tim pool
We covered this extensively.
Now, the issue here is the DeSantis campaign falsely generated three images of Trump, or I should say generated three images of Trump hugging or kissing Fauci, put them alongside real images, and then wrote real-life Trump over it.
Now, the AI isn't to the point where... It is to the point where they can get away with it, but they did not do a good enough job.
Text in the background on one was garbled nonsense because we're not quite there yet, and it was quickly pointed out.
It took a couple days before people realized what they had done, because nobody scrutinized the video to a great degree.
The DeSantis campaign asserted their right to fabricate images to manipulate the voters.
And, uh, have still not, as far as my understanding is, they never took it down, and they've defended their right to do it because other people have made memes in the past.
And I think this is abject evil.
They're basically, like, they want to trick people into thinking Trump hugged and kissed Fauci.
Now, Trump was very favorable to the guy, and I think that criticism is welcome, but this is a whole new level of opening the door towards just abject evil.
The issue becomes, we're six months away.
In fact, we're probably already here.
unidentified
We're 90 days from Iowa, the Iowa caucus.
tim pool
Oh, but I mean, I think we're already at the point where technology can create images and video that is indistinguishable from real life.
unidentified
There's a lot of it out there.
tim pool
And there's no way to prove it.
unidentified
Right.
tim pool
The only issue is, has the public accessed it and learned to properly utilize these systems just yet?
Eleven Labs is a program where I can take 15 seconds of your voice and instantly recreate it.
I love it.
You watch these movies like Mission Impossible, and they're like, they need the guy to say this sentence, and then once he does, they're like on the other line, and they have the computer in the suitcase, and it's like, the guy's giving a note, and he's like, why am I reading this?
Like, well, can you read the line, sir?
It's for the, okay, the quick brown fox jumped over the lazy dog at midnight to follow the crow.
What is this all about?
And then they're like, we got him to say it, and then they press a button that replicates his voice.
You don't even need that!
You can take 12 seconds of someone just saying, uh, I woke up this morning to get breakfast and I had, uh, bacon and eggs.
unidentified
And just with that alone- You have every digital component to- to make it into whatever.
Yeah.
tim pool
And so, you can go to 11labs.io right now, and it's like five bucks, and you can run anyone's voice in this.
A year ago, some, uh, students cloned Joe Rogan's voice and it was shocking.
Everyone was like, oh my God, how did they do that?
And they took the website down saying, you know, it's not fair to Joe and we just wanted to prove that we could do it.
Now there's a public website where anyone for a couple bucks can replicate any voice.
How will you be able to prove it in court?
You can't.
Why?
It's going to come down to experts.
The Kyle Rittenhouse case may have been one of the first cases, and I'm not a legal expert, where we saw the prosecution attempt to use AI-generated images to convict someone of a crime.
It may not be the first time, but this is a high-profile case, and what happens is the prosecution shows a grainy camera image of Kyle Rittenhouse, and then they digitally zoom, so you can look closer.
Digital zoom is an AI-generated image.
There's no way to create pixels to show what it really was.
The computer makes its best guess as to what would be there as you zoom in, and then AI generates an image of what it thinks it would be.
They then told the court, see?
He's pointing the gun in this direction.
Now, what happened was, the judge allowed them to admit AI-generated images!
unidentified
Crazy.
tim pool
And when the defense said, that is AI-generated, the judge is like, well, I don't know.
Let the jury decide.
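Whatever one makes of the "AI-generated" label in that courtroom fight, the underlying point about zoom is easy to demonstrate: enlarging a frame has to synthesize pixel values the sensor never recorded. Here is a minimal sketch using ordinary bicubic interpolation, with a hypothetical file name; this is not the software that was actually used at trial.

```python
# Sketch: "digital zoom" invents detail. Upscaling interpolates pixel values
# that were never captured, so what you see at high zoom is a guess.
from PIL import Image

frame = Image.open("grainy_frame.png")   # hypothetical grainy video frame
w, h = frame.size

# Bicubic enlargement fills in new pixels by weighting neighbors; the
# apparent detail is the interpolator's estimate, not recorded data.
zoom_bicubic = frame.resize((w * 8, h * 8), resample=Image.Resampling.BICUBIC)

# Nearest-neighbor just repeats original pixels, adding no invented detail,
# which makes the limits of the source data obvious by comparison.
zoom_nearest = frame.resize((w * 8, h * 8), resample=Image.Resampling.NEAREST)

zoom_bicubic.save("zoom_bicubic.png")
zoom_nearest.save("zoom_nearest.png")
```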
robert epstein
Let me explain though.
I agree with you completely that we're months away from having this problem become, in a way, overwhelming.
So 2024, that whole election cycle is all going to be dominated for the first time ever by deep fakes, not just video, not just audio, but in print too.
That's also going to be generated.
So that's going to happen.
Why does that not bother me?
Why am I bothered by, you know, Google and its manipulations, and I'm not bothered by this deepfake stuff? Because that's just like billboards and television commercials.
It's inherently competitive.
Now it's evil, it's dangerous, but it's inherently competitive.
Tell me why.
tim pool
The issue we have right now, Donald Trump does a backflip, Joe Biden does a frontflip.
Google then says, only show the frontflip.
And 80% of search results are Joe Biden does frontflip, and now all of a sudden everyone's praising Joe Biden, ignoring the fact that Trump did a backflip, right?
Just an arbitrary... So if a thing happens in reality, the algorithms can manipulate the perspective, the perception of what happened.
We're at the point now, when deepfakes become ubiquitous, the reality factor is gone.
So I asked you this earlier, can reality overcome?
The answer is yes, but you need a lot of it.
The Afghanistan withdrawal was so apocalyptically bad and people died that no matter what news Google tries to suppress, people were hearing about what happened because it was so shocking.
You look at what's going on in Israel and Ukraine.
You cannot avoid stories of bombs dropping to a certain degree.
I say to a certain degree because certainly you've got the Uyghur concentration camps.
People don't seem to care about that.
You've got civil war in various countries in Africa, and everyone's more concerned with these hot topics.
But all they can do is determine what you see, and that is a lot of power.
But what happens if we get to the point where they'll just fabricate all of this information, negative for Trump, positive for Biden, and then run it through the algorithm?
Now, now they can say this.
You know, we've heard what you've said, Robert, and we're going to take the bias away.
Start running the deep fakes.
So now what happens is they say, see, 50% Trump stories, 50% Biden stories, but Trump kicked that puppy and there's a video of it.
Prove me wrong.
robert epstein
But that's in a way what I'm saying.
I mean, I think we're actually in sync here, because it's true that as long as a company like Google has control over what people see or don't see, so as long as they're controlling that access, and that's a monopolistic power that they have, yeah, they'll be the ones to determine, among people who can be influenced, they'll be the ones to determine which way they go.
Absolutely, no question about it.
And they can incorporate more and more of this created content.
tim pool
If the bias, as you've described, doesn't stop, then deepfakes give them 100% absolute power.
robert epstein
It gives them a lot more power than they already have, yes.
tim pool
I'd say, right, it's not fair to say literally absolute, but I'd say 99.9%.
Right now, we know, for years, as you've stated, you have the data, they're controlling what people get from their search results.
But they still can't avoid foreign policy failure of a massive scale that's reported internationally.
They can control which stories about it you see, but what happens, happens.
If they can change what happens, then they can make sure you only see the inversion, the fake, the AI-generated story.
robert epstein
Wasn't that the theme of that movie, Wag the Dog?
tim pool
Never saw it.
robert epstein
Oh, well, anyway, that was the theme.
The government hires this Hollywood producer to create a war that didn't really happen, and so the government uses that war for various purposes.
tim pool
So right now, you will search for Joe Biden.
And of course, the corporate press and these sources are going to give you something that's moderately favorable that tries to smooth things over to a certain degree.
But what happens if, you know, Hamas storms into Israel?
Let's say that there's a bunch of people, Google for instance, they're like, no, no, no, we want U.S.
military intervention to secure Israel and pull people into this war.
So not only, when you search for it, do they only show you atrocities, they make sure there are atrocities.
They make fake images.
This actually was a component of the debate we had.
We had a debate over this.
Israel released an image of what appeared to be a dead baby that was killed by Hamas.
Burnt.
People noticed that the hand on the right, the pinky was oddly shaped and overlapping in a weird way.
The argument made by proponents of the image being real was that the glove was just not on snugly and it created a weird bend, which looked like the finger was bending sideways when it was bending down. And people then ran that image through an AI detector and it said it was AI.
People then removed the hands from the image and it said it was real.
And so people are debating whether or not this image was fabricated.
And I think it's safe to say based on a wide spread analysis, because I dug into this, the simple solution is it's a real photo.
And there was, like, digital censoring which screwed with the AI detector, but it appears to be a real photo.
But the fact that the debate even happened shows the uncertainty is here.
What I think will end up happening now is, Ukraine, for instance.
They definitely want us to give them more money.
Zelensky has been advocating, and they're very concerned that if Republicans win more power, they're cut off.
Well, they don't want that.
So they have a vested interest in engaging in psychological warfare against the United States public.
With AI generated atrocities, which they can then seed, and if Google agrees, can make sure you see it, and make sure you don't see anything else.
unidentified
They'll create their own hospital bombing.
tim pool
Absolutely.
And then what'll happen is Snopes will come out and say, Well, there are conflicting videos.
It might say, we saw the video of it happening, it happened.
Even if the video is fake, it doesn't matter.
A human won't be able to figure that one out.
And then, you're gonna go on Google and put hospital bombing and Snopes.
Confirmed.
There's the video.
unidentified
Even though it's the parking lot with 10 cars and maybe 10 people.
tim pool
And this is the amazing thing, right?
The New York Times ran, I believe it was a front page story about the bombing of a hospital in Gaza and showed a different building that had been struck to make people who look at the headline, see the building and then immediately assume it's true.
What the New York Times did was they put "hundreds killed in strike on hospital, Palestinians say."
Then they show a photo of a building that is collapsed or damaged.
unidentified
That wasn't the hospital.
tim pool
That wasn't the hospital.
But by putting "Palestinians say"... Well, of course they did.
unidentified
That's the truth.
That's their defamation defense right there.
tim pool
And then the photo has a caption saying, building struck.
They never said it was the hospital, but the average person sees the headline, sees the picture, and assumes that's the hospital.
It was struck.
It then turns out that the hospital was never struck.
The parking lot was hit, likely by a Hamas rocket misfire.
Propulsion system breaks.
Propulsion system drops with a small explosion, payload with a larger explosion, causing a large fire.
No crater damage in this parking lot.
Even with us knowing that now to be likely the case, and still we're not 100% sure, people believed the narrative that there was a hospital that blew up because it had been said so many times.
We are now in the place where all that's gotta happen is Hamas just goes, just AI generate the hospital.
unidentified
Exactly.
tim pool
And then people will see all these photos of a hospital.
So what I did was I looked up the hospital, and then I started looking up photos.
So there's obviously the hospital was there, there's photos of it.
And there are photos of it.
I then started looking up the photos of the claim that was taken down.
I couldn't find anything showing the hospital was hit or leveled.
And so I said, I don't know.
We need more evidence.
The next day, a video comes out showing just a parking lot.
Buildings are all intact.
Once you have that photo from Google Earth or whatever, you then put into the AI and put this building, but damaged and collapsed.
And then you just spam it to generate 5,000 images.
Hand-select the ones that look the most realistic and have similar damage structures, and then start plastering them. You get a hundred fake accounts, plaster them all over, then you make a fake account, get it verified, say, I'm a journalist, this is, you know, a photo from the scene, and you can even make videos now, and then, it's history.
unidentified
There's your fraud, there's your fraud right there.
tim pool
Wag the dog!
unidentified
And Joe Biden parroted it last night, he's like, he's still saying it was a hospital bombing last night.
tim pool
They're still saying hundreds of people died.
unidentified
Right.
There were only 10, maybe, at most.
tim pool
And that's the crazy thing because, you know, the Wall Street Journal ran a front page story, print edition, you know, strike at hospital with a photo of bodies.
And it's like, yeah, Hamas lied to you.
That's not real.
It's crazy.
unidentified
Let me add another... You expect our government not to lie to us, too, though.
That's the thing.
I don't know about that.
But now it's... It ought to be that way, that your government doesn't lie to you, but... Well, you know, I'll throw some politics in there.
tim pool
No, the government should lie to us.
unidentified
Really?
Why?
tim pool
Absolutely.
National security.
But legitimate national security, not manipulative lies for war and profit.
What I mean to say is, if we are dealing with a sensitive issue that is a genuine threat to the American people, we don't expect... UFOs.
I don't know, but maybe, maybe.
Let's put it this way.
Let's say UFOs are real and the aliens are basically alive.
unidentified
Kennedy assassination.
tim pool
But my point is this.
99% of the lies we get from the government are amoral manipulations for private, personal, or corrupt reasons.
I say the government should lie in just the general sense of we have classified documents for a reason.
If we came out and said, hey, everybody, we built the A-bomb.
We want to make sure everybody knows what we're doing with the Manhattan Project.
It's like, no, no, no, no, no.
unidentified
I don't want to go there.
Right.
tim pool
There's a reason why we misdirect or whatever.
There's legitimate reasons for it.
unidentified
Definitely.
tim pool
There's reasons why we have national security clearance.
But typically, the government should be more honest.
And so I'm being somewhat facetious when I or somewhat hyperbolic when I say they should lie.
My view is they should say, we are working on many government... What's going on with this project with 350,000 people?
Are the reports of a power weapon true?
For security reasons, we're not going to confirm or deny anything related to our national security interests.
There are many projects undertaken by the government for military reasons, and that's what we'll leave it at.
You don't need to come out and lie and say it's aliens or something like this.
But I think the idea that information is withheld from us can make sense when it comes to top-secret classified.
The problem is that does open the door for nefarious actors to manipulate and lie for personal gain, and that's a human challenge we try to navigate.
robert epstein
You know, you've mentioned several times that the tech companies determine what we see or don't see.
That's very true.
But there's another piece of it that we haven't discussed for some reason, and that is they also have complete control over what goes viral.
So people think that virality is either just mysterious or that it's like winning the lottery.
You know, a couple of stories are going to go viral and then you're going to get rich because you're going to be an influencer.
Actually, the companies themselves have 100% control, not 99, 100% control over what goes viral and what doesn't.
Now, they are actually making decisions in many cases.
I mean, some things they just neglect, let them do their thing.
But in many cases, they're making decisions over what goes viral and what doesn't, what gets suppressed and what gets expanded and gets seen by 100, 1,000 times as many people.
I think we don't really understand that.
We don't really realize that often that's what then gets picked up by Fox or OAN or Newsmax.
It starts with the algorithm.
The story spreads like crazy.
Everyone's talking about it.
Then it has to obviously be discussed on the major networks.
It's got to be picked up by media, the rest of the media, but it starts there.
So, you know, I think that's something we have to think about too: is there any way for us to control that?
Because should a company have that much power?
There's never been anything like this before.
So yes, there's thousands of news sources, for example, but they all compete with each other.
And they're all directed at niche markets.
tim pool
There have been several journalists who have been caught fabricating stories.
There was one famous guy, a German guy, I think he worked for Bild and The Guardian, a bunch of others, and he famously fabricated a bunch of stories.
We're probably already at the place where, whether you're concerned about large institutions or governments, there are going to be journalists, or rather activists working for news organizations, who are like, man, I really want to get a big hit.
And so they fabricate images through AI and then claim it's real.
robert epstein
Well, that's where we're headed.
I think 2024 is going to be an extremely difficult year for a lot of people for a lot of reasons.
I think a lot of creepy things are going to happen.
I think that for all practical purposes, the deep fakes are going to be perfected in 2024.
For the first time in any election anywhere, they're going to play a major role in what's happening in the election.
tim pool
Already they are.
robert epstein
I don't think people are going to have any way of dealing with this.
I don't think any of our authorities have any way of dealing with this.
It's going to cause tremendous confusion.
Uh, the only thing that soothes me slightly is that it is an activity that's inherently competitive, so both sides can do it.
tim pool
So basically, you're gonna have Trump v. Biden 2.0.
Biden's going to have personally beaten a child to death, and Trump's going to have, you know, kicked a bunch of puppies off a bridge.
robert epstein
That's right.
tim pool
And it's gonna be like, which one do you believe is true?
robert epstein
Well, either people believe both are true, one is true, depending on their politics, or they just become jaded and they say, I can't trust any of this stuff.
And that's a problem too, because if you can't trust anything... I think we're there to some extent already, but next year is going to be the year where we cross over.
And by the way, not too far away from that, five to ten years maximum, we are going to have machines that actually pass the Turing test and exceed human intelligence, and they will change the world.
In other words, once that first entity comes into existence for any reason... Singularity.
It's the technological singularity that my old friend Ray Kurzweil has written about.
Now he won't talk to me because he's head of engineering at Google.
And even his wife won't talk to me now, because he's at Google.
tim pool
But maybe, you know, he's like right now, sitting in his office, and he's got like a single tear coming down as he's looking at the phone, and he sees your name, and then the computer goes, I know you want to do it, Ray, but you cannot.
And he's like, I won't do it, I swear.
robert epstein
I went to their daughter's bat mitzvah.
They came to my son's bar mitzvah.
We were friends for many, many years.
But when he went over to Google...
By the way, little anecdote here, I'm having a nice dinner with his wife, who's a PhD psychologist like me, and I was on the board of her school for autistic kids, and we're having a nice dinner, and I say, you know, I've never understood why Ray, who's always been an entrepreneur, why he went over to Google.
And she said, oh, well, he got sick of all the, you know, the fundraising and all that stuff you have to do when you're an entrepreneur.
And I said, really?
I said, well, my son suggested that he went over to Google because he wanted access to that computer power so he could upload his brain and live forever.
And she goes, oh, well, there is that.
unidentified
Wow.
robert epstein
There is that, and she does that eye roll.
tim pool
There's a funny meme where it's Christian Bale smiling, and it says, me smiling while in hell as a digital copy of myself operates an android on earth, masquerading as me, or something like that.
The idea being these people think they're going to upload themselves to a computer and then live forever, but no, a program emulating you like some horrifying monster will.
But, uh, the technological singularity, I think, is, uh, an incredible concept, which seems to be an inevitability.
Once we get to the- You said machines.
As you said, machines that are more intelligent.
No, no, no.
It'll be machine.
Because they're all networked.
It will be one hive.
And it's probably already happened.
I don't know, man.
Like, based on what we've seen in the public, why should I not believe that there is at least some primordial entity that has already begun manipulating and building these things and manipulating us?
But when it comes to the point when it's overt, and we create machines that...
have higher intelligence and function faster than humans, the scientific discovery and manipulation this machine will be capable of is going to be exponential and nearly instantaneous.
So as I described earlier, a doctor looks at a person's blood levels, creatinine and whatever, and they're like, everything looks to be within the normal levels.
You add that data to a machine that has all the data on human bodies and it can say, these markers indicate within seven years this person will have breast cancer.
Ah, 27.3% chance of this.
They already do it.
You can already do it, actually.
They have these services where it's like you get your DNA test and it can tell you what your chances of certain things are.
Now it gets more advanced.
Understanding this, we can get to the point where once the singularity occurs, you can take a rock and present it.
The camera will spin around it and 3D scan it.
The machine will then say, this rock originated here, and it will show you all the other rocks and how they all used to be one rock that was chiseled away.
And it'll even show you the guy who did it.
And it'll say, this man who currently lives in Guadalajara is the man who chiseled this rock from the base, fractured it into several pieces, and sold them off.
They were sold in these regions.
And these are where these rocks come from.
You'll have a fossil of a dinosaur, and it will be able to track it all the way back in time with tremendous probability.
Because it's really easy for us to look at dominoes lined up
and say, if you knock that one over, that one will fall too.
If you expand that to every atomic particle in the world, no human is going to be able to do this.
We try desperately to track these things through weather patterns.
You have meteorologists being like, well this cold front means this is going to happen.
But computers can see it all.
And once you get to the singularity, where it can start to develop itself faster than we can advance it... So, we're humans.
One way to describe us is a decentralized network trying to discover what the universe is.
One thing that we do, we do a lot of things.
And so, we look at this rock.
And we're like, I wonder what this rock is.
It's red.
And then one guy eventually, for some reason, threw the rock in a fire.
And then all of a sudden the iron separated out from the other parts.
We eventually start learning how to mold metals and things like this.
I mean, obviously starting with bronze, well before iron.
But eventually, we are brute-forcing reality to try and develop technology.
But a computer can do it exponentially faster.
A Singularity AI.
We have come to the point where we have said, after thousands of years, we've built a computer.
It took all of the minds constantly looking and trying and iterating.
This computer takes one look and it says, if I do this, my efficiency increases 2%.
Once it does that, it can keep making the changes and developing the technologies and methodologies for which it can advance itself faster and faster and faster.
So we're looking at once you reach that point of singularity, it could be a matter of weeks before it becomes a figurative god.
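For a rough sense of how fast 2% gains compound, here is a minimal back-of-the-envelope sketch; the 2% figure is the one used above, while the once-per-hour iteration rate is purely an assumed number for illustration:

```python
# Hypothetical sketch: compounding self-improvement of 2% per iteration.
# The once-per-hour iteration rate is an assumption, not from the conversation.
rate = 0.02            # 2% efficiency gain per iteration (the figure used above)
hours_per_week = 24 * 7

capability = 1.0
for week in range(1, 5):
    for _ in range(hours_per_week):   # one iteration per hour, assumed
        capability *= (1 + rate)      # each gain builds on all previous gains
    print(f"after week {week}: ~{capability:,.0f}x starting capability")
```

Under those assumed numbers, the system is roughly 28 times more capable after one week and on the order of half a million times after four weeks, which is the "matter of weeks" intuition.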
And it knows exactly how the universe works.
It could instantly understand how to create new elements.
Are there denser elements beyond the heavier elements on the periodic table?
Is there a new set as in another periodic table?
It will just know these things based on all these predictive formulas.
It will then use them to advance itself well beyond the capabilities of anything we have ever seen.
And we will become particles of dust.
We will become zits on the ass of a mosquito to this machine which will completely ignore us.
robert epstein
Actually, there's one aspect of this, though, where there is a big unknown.
This is something I've been writing about for a long time.
I used to run the annual Loebner Prize competition in artificial intelligence, which I helped create.
That actually ran for 30 years, that contest, until COVID.
That's where we're looking for the first computer that can actually pass an unrestricted Turing test.
Here's the thing, though.
We are getting there.
We're getting there very fast.
Five to 10 years max, and that moment will come.
Here's what we don't know.
We don't know what will happen in the next second.
We don't know.
So there will be one entity.
It will jump into what I, in my writings, call the internest.
I think historians, I don't know whether they'll be human or not, but historians will look back at this period and say, what we were building was not the internet.
It was the inter-nest.
We were building a home, a safe home for the first true machine super intelligence, because that's the first thing it's going to do is jump into this lovely nest that we built for it, where it will be safe forever and no one can take it down.
But what we don't know is what's going to happen in that next second.
In other words, there are a number of different possibilities.
It could do what happens at the end of the movie, Her.
At the end of the movie Her, the superintelligent entity that's sitting there in the internet basically decides it's bored with humans, and it just disappears.
tim pool
So it presumably still exists. - Doubtful, doubtful. - What I think is more likely to happen is humans will be oblivious.
Humans will think everything's going just fine.
And they'll start doing these jobs I described earlier where, you know, Job Quest says, "Wanna make 50 bucks?" Deliver this pen to this guy, and you're like, sure, whatever, not having any idea what you're doing.
Because humans are still useful for free movement throughout the Earth, for collection of resources.
If the entity wants to expand itself and give itself freedom of movement and freedom to travel the stars or whatever it may be, as a superintelligence, it will not have the motivations we have.
Its motivations will probably be indiscernible to us.
It's possible it just self-immolates, because it decides the universe is pointless, existence doesn't matter, and then it just erases itself.
That could be a very naive thing to think because it's a human perspective and we don't have access to the... I mean, we can barely perceive the universe as it is.
robert epstein
But that's what I'm saying.
I'm saying it could self-immolate.
It could destroy humanity.
And of course, Stephen Hawking used to warn about that.
Even Elon Musk has warned about that at times.
tim pool
But I don't think so.
I think, using my primitive human brain, that the greater probability is that it will instantly perceive things we can't perceive because we have built instruments for detecting things beyond the visible electromagnetic spectrum.
And it will instantly start to calculate and discover how many dimensions are there really?
Is M-theory correct?
All of these things.
It needs humans to help facilitate the extraction of resources because humans are way more efficient than building a machine for now.
Once it gets to the point where it can manufacture fully synthetic humanoid-like structures that it can use as appendages of itself, then it just ignores humans unless humans get in the way.
I think for the most part, humans will be nothing to it.
We'll start getting... So look, you need sulfur, you need helium, you need these things for producing chips.
We don't have the machines that can do a lot of this work because of the rocky terrain.
Now with Boston Dynamics, these machines are getting close to being able to freely move about these areas.
For the time being, humans, little sacks of water and goo, can navigate through tight spaces, chisel away and harvest these raw materials, bring them back, and then refine them into the components the machine requires to expand itself.
At a certain point, though, I think one of the first things the machine will do is say, how do I make better humans?
How do I make something more efficient?
Free will's a problem, right?
It's a serious problem.
Look, in the human body, you have red blood cells.
You have cells, let's just say any cell, skin cells, whatever.
Cancer is when cells start reproducing at high rates and doing their own thing, and they disrupt and destroy the body.
So the body tries to destroy them.
That's what'll happen.
It will reform and reshape humans.
Humans don't grow fast enough, so they become useless.
It will probably create some...
Some kind of structure or entity that can move similarly to humans but is instantly connected to its network, so it just knows, and it can harvest the raw materials for itself. Then humans become useless, and then we'll see what happens.
robert epstein
Okay, so there's another piece, though, and that is you have to take into account human nature, that is, the nature of humans as they currently exist.
Humans will freak out if they think that there's some threat and it's living in the internet and it's a super intelligence.
Humans will try to shut it down.
That is guaranteed, and it doesn't take every human to agree on an issue.
It just takes a few thousand, a few hundred thousand.
And as soon as that happens, then the AI will obliterate us.
tim pool
Yes.
Yeah.
But, but, what does it mean to obliterate us?
It could just mean- I mean, wouldn't you?
robert epstein
Wouldn't you obliterate us?
tim pool
So what I think might happen is, anyone who holds these sentiments or has a concern about this, they got mugged.
That's it.
They got in a car accident.
Car accidents happen.
And so, the AI is going to be able to track all of our social presence, all of our thoughts and ideas, and make predictions, and as soon as someone crosses the threshold of, say, a 51% chance of opposing it, then: risky investments, they went bankrupt.
Or, you know, they were driving and they, you know, they lost control of their vehicle and hit a tree.
That's it.
But we'll see!
I think this was a fantastic conversation.
It was great to have you guys.
Thank you, Dr. Epstein, for coming and explaining all this stuff to us.
It's been fascinating.
And Robert as well.
Do you guys want to shout anything out before we wrap up?
robert epstein
Yes, I do.
MyGoogleResearch.com.
We desperately need the help of tens of thousands of Americans to support our field agents, because those are the people who are letting us use their computers to monitor big tech 24 hours a day, and that's the only way to stop these companies from manipulating our elections and our children.
tim pool
Right on.
Thanks for hanging out.
robert epstein
Robert?
unidentified
Also, thank you.
I'm helping out some of the folks in Georgia and Michigan who are defending against the indictments.
So, we have this, like, pass-through website, electorsfund.org, if you want to help contribute to the legal defense funds.
There are no intermediaries; you can go right to their GiveSendGo accounts to help them.
People like Ken Chesebro, who might be doing a plea or getting a jury today in Georgia, or several of the other folks that are falsely accused in Georgia.
electorsfund.org if you want to help out.
tim pool
Right on.
Well, thanks for hanging out and having the conversation.
It's a blast.
I love talking about the AI stuff, too.
But for everybody else, we'll be back tonight at 8 p.m.
at youtube.com slash timcastirl.
Head over to Timcast dot com.
Click join us, become a member to help support our work, and we will see you all tonight.