Sept. 28, 2023 - Stay Free - Russell Brand
45:03
Google Are Doing THIS To Manipulate You! With Dr Robert Epstein

Russell chats to Dr Robert Epstein, a former Harvard psychology professor, author of 15 books, and current director of the American Institute for Behavioral Research and Technology, who has investigated alleged electoral manipulation by Google and other tech giants. Together they talk about Dr Robert's claims that Google's manipulations directed 6 million extra votes to Joe Biden in the 2020 election, Google's attempt to overturn a $2.6 billion EU antitrust fine, and the future of A.I. technology.
WATCH THE FULL EPISODE ON RUMBLE
Check out Dr Robert Epstein's research: https://mygoogleresearch.com/
Become an Awakened Wonder: https://russellbrand.locals.com/


Hello there, you Awakening Wonders.
Welcome to Stay Free with Russell Brand.
What extraordinary times we're living in where reality appears to be curated to an enormous degree.
How do you manage reality?
How do you manage perception?
How do you manage information?
Joining me now is Dr. Robert Epstein, a former Harvard psychology professor, author of 15 books and current director of the American Institute for Behavioral Research and Technology.
Thank you very much for joining us today, Doctor.
My pleasure.
One of the claims that you have made that is most astonishing, difficult almost to believe, is that Google are essentially able to curate and control reality.
Google, that we all use as an ordinary tool in most people's lives, you claim can be used to drive and direct an agenda, that it can be used as a political tool and even a weapon. In particular, I'd like to ask you about your claim that Google was able to direct six million extra votes to Joe Biden.
And obviously, that's an incredibly contentious claim, because talking about electoral fraud and electoral meddling seems to be one of the subjects that's most difficult to discuss and has to be discussed with incredible caution.
So can you tell me exactly what it is you mean by Google directing six million extra votes to a presumably preferred presidential candidate and how on earth Google would be able to do that?
Well, I've been doing very rigorous scientific research on this topic for more than 11 years.
And what should really shock you here is that people's preoccupation with election fraud and ballot stuffing and all that, that preoccupation, that obsession is actually engineered by Google and to a lesser extent, other tech companies.
There's nothing really there, and that's what they do.
They redirect attention like magicians do, so that you won't look at them.
That's exactly what they're doing.
So they're directing us to look at things that are very trivial, that are competitive,
that have little net effect on elections, because they don't want you looking at them
because they in fact have the power and use the power to shift millions of votes in elections,
not just in the US election in 2020, where they did indeed shift more than 6 million votes
to Joe Biden, but in elections around the world.
By the year 2015, Google alone was determining the outcomes of upwards of 25%
of the national elections in the world.
How do we know this?
Well, in 2020, for example, we had 1,735 field agents in four swing states in the U.S.
That's where the action is.
What does that mean?
That means that we had recruited registered voters, equipped them with special software, so that we could look over their shoulders as they're getting content from Google and other tech companies.
And we recorded all that content.
In other words, we were seeing the real content that they're sending to real voters during the days leading up to an election.
And then we measured the bias in that content.
We found extreme political bias favoring Joe Biden, whom I actually supported, although I no longer do.
The point is we found extreme political bias and we know from randomized controlled experiments we've been conducting since 2013 that that level of bias shifted at least six million votes to Biden in that election.
In 2022 we had 2,742 field agents in 10 swing states.
So in other words, we're monitoring real content sent to real voters by these companies,
recording it in real time and analyzing it in real time.
In 2022, they shifted millions of votes in hundreds of midterm elections throughout the US.
We know they did this for Brexit, by the way, in the UK.
And again, they're very good at redirecting attention.
What we're doing now is much, much bigger.
We decided to build a permanent monitoring system in all 50 US states.
At this moment in time, we have 11,638 field agents in all 50 states, which means 24 hours a day, we are monitoring and preserving and archiving ephemeral content.
That's what they use to manipulate us.
Ephemeral content.
Through the computers of more than 11,000 registered voters in the U.S., 24 hours a day, we're on the verge of setting up a permanent system like this that will keep these companies away from our elections and from our kids permanently.
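To make the idea of preserving ephemeral content a little more concrete, here is a minimal, hypothetical sketch in Python of how a consenting panelist's fleeting impressions might be timestamped, hashed and archived for later analysis. The database schema, field names and example values are invented for illustration; this is not the monitoring software described in the conversation.

```python
# Hypothetical sketch: archiving "ephemeral" content with a timestamp so it
# can be inspected later. Names and schema are illustrative only.
import hashlib
import json
import sqlite3
import time


def open_archive(path="ephemeral_archive.db"):
    """Create (or open) a small SQLite archive for captured content."""
    db = sqlite3.connect(path)
    db.execute(
        """CREATE TABLE IF NOT EXISTS captures (
               captured_at REAL,      -- Unix timestamp of the capture
               panelist_id TEXT,      -- anonymised ID of the consenting field agent
               platform TEXT,         -- e.g. 'search', 'news_feed', 'suggestions'
               query TEXT,            -- what the panelist searched for, if anything
               content TEXT,          -- the rendered content that was shown
               content_sha256 TEXT    -- hash so later tampering can be detected
           )"""
    )
    return db


def archive_capture(db, panelist_id, platform, query, content):
    """Store one fleeting impression before it disappears from the screen."""
    record = (
        time.time(),
        panelist_id,
        platform,
        query,
        content,
        hashlib.sha256(content.encode("utf-8")).hexdigest(),
    )
    db.execute("INSERT INTO captures VALUES (?, ?, ?, ?, ?, ?)", record)
    db.commit()


if __name__ == "__main__":
    db = open_archive()
    # A made-up example of a single captured impression.
    archive_capture(
        db,
        panelist_id="agent-0001",
        platform="search",
        query="example query",
        content=json.dumps(["result A", "result B", "result C"]),
    )
    print(db.execute("SELECT COUNT(*) FROM captures").fetchone()[0], "captures stored")
```

The point such a sketch is meant to illustrate is simply that content which would otherwise vanish gets written down somewhere, with a record of when it appeared and to whom it was shown.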
Whilst I understand that you're able, with these agents that you described, to monitor the information that Google is publishing, promoting and directing, it does seem, given the sort of literally global scale of the endeavour that Google are undertaking, to be a relatively small sample size.
I will add, of course, that I understand that there are significant contracts that are explicit between Google and the government in areas like data, security, military-industrial complex, defence.
There are explicit financial ties as well as donations and lobbying money, as well as numerous people in Congress and the Senate owning significant shares in companies, big tech companies in particular in this instance, that they are supposed to regulate.
So the possibility and opportunity for corruption is plainly there.
But I do wonder how you're able, with that sample size, to deduce such a significant number, specifically six million.
And also the other figure that I've heard in association with your work: that a 50/50 split among undecided voters,
you know, I know we're talking about swing states anyway, can turn into a 90/10 split.
How do you map these relatively small figures onto like, you know, such a global number?
And also you suggested that part of your work going forward is to regulate and oppose this trend and tendency.
How would you do that?
You're shocking me here because you sound skeptical and yet you have been victimized
by exactly these kinds of manipulations and are being victimized now.
You've been victimized because you have been suppressed.
Your content has been suppressed.
You've been demonetized.
These companies have enormous power to determine what people see and what people don't see.
And what we measure in our experiments is how that impacts people's opinions and people's votes, their voting preferences.
That's what we measure in controlled experiments.
We present at scientific meetings.
We publish in peer-reviewed journals.
Our work follows the very highest standards of scientific integrity.
And this issue of sample size, you've got that backwards.
These are enormous sample sizes for statistical and analytical purposes.
These are very, very large samples.
And so the effects that we keep replicating over and over again, other teams have now replicated,
those are significant, for those of you who know any stats here, at the 0.001 level,
meaning the probability that we're making mistakes is less than 1 in 1,000.
We're highly confident about what we've been finding.
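As a rough illustration of what "significant at the 0.001 level" means for a comparison of this kind, the sketch below runs a standard two-proportion z-test on made-up numbers. These are not the study's data, and the actual published analyses may use different tests; the example only shows the sort of calculation behind such a claim.

```python
# Illustrative only: a two-proportion z-test on invented counts, to show what
# a result "significant at the 0.001 level" looks like numerically.
import math


def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value under the normal approximation: the chance of seeing a
    # difference at least this large if the two groups really behaved the same.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value


if __name__ == "__main__":
    # Hypothetical: 620 of 1,000 participants shown slanted content prefer the
    # favored candidate, versus 500 of 1,000 shown neutral content.
    z, p = two_proportion_z_test(620, 1000, 500, 1000)
    print(f"z = {z:.2f}, p = {p:.2g}")
    print("significant at the 0.001 level" if p < 0.001 else "not significant at 0.001")
```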
And the problem here is that we're up against...
the most powerful mind control machine that's ever been developed by humankind, and it's operating in every country in the world except mainland China, and it impacts how people see those companies.
They're impacting not just our elections, they're not just indoctrinating our kids, they're literally altering the way we perceive them as a company.
That's extremely dangerous.
And most of these manipulations that they have access to now, that they control exclusively because they're a monopoly, most of these manipulations cannot be seen by the people who are being manipulated.
That makes it even more dangerous.
So your ability to observe them and to track them, it operates against what type of control?
If you're able to say that people are being sent this information that's highly biased, what would unbiased information look like?
I'm open, of course, to the possibility that this unprecedented and fully immersive technology would be used by people that have an appetite to control information, and it seems quite plain to me
that that does happen. But because it's so extraordinary and revelatory, because it's so significant, and because, if it were able to be opposed, it could be so seismic in our ability to have true democracy and a public sphere worthy of the name, where dissent and conversation could take place freely,
I feel that it's important that I understand exactly how that works, not exactly, probably because of the limitations of my ability to understand, but as precisely as I might, the way that you're able to say,
"Look, this would constitute neutral information.
Look at what you're actually getting."
Because I feel that it's very important.
Again, you're shocking me because you're being the skeptic here,
but you know, good scientists are also skeptics, and there's no one more skeptical
about the research I do than me.
So, let me give you an example, and I'll just show you exactly how this works.
In 2020, where we had collected a massive amount of data, we had preserved more than 1.5 million ephemeral experiences on Google and other platforms, and you're asking, Ephemeral experiences?
What are those?
Those are those fleeting experiences that we all have online when we're shown search suggestions or answer boxes or search results or news feeds.
They appear, they impact you, they disappear, they're stored nowhere, so no one can go back in time and see what was being done.
That's what we've learned to preserve over the years.
So, here we go.
2020: we find, again, massive, overwhelming evidence of extreme bias; we preserve 1.5 million ephemeral experiences.
And I sent the data in to the office of Senator Ted Cruz.
He and two other senators sent a very threatening letter to the CEO of Google.
This was November 5th, 2020, two days after the presidential election.
And lo and behold, that same day, Google turned off all the bias in the state of Georgia, which was gearing up for two Senate runoff elections in January.
We saw them turn the bias off.
It was literally like flipping a light switch, as I was told by a Google whistleblower, literally like flipping a light switch.
We had more than a thousand field agents in Georgia.
So we saw the extreme bias that was being shown.
We saw them turn it off.
Among other things, they stopped sending partisan go vote reminders.
In other words, they were sending go vote reminders mainly to members of one party.
But on that day in Georgia, no one got go-vote reminders from Google anymore.
So believe me, they have this power.
They exercise this power.
This is now being confirmed by multiple leaks from the company.
For example, emails that were leaked to the Wall Street Journal in which Google employees were discussing how can we use, and I put this in quotes, "ephemeral experiences" to change people's views about Trump's travel ban.
This has been confirmed by multiple whistleblowers, leaks of documents, leaks of videos of a PowerPoint presentation.
This is how the company operates.
They literally know that they have the power to re-engineer humanity.
That's a leak of a video called The Selfish Ledger from Google.
Literally, that's what the video is all about.
And that's what we're tracking.
In other words, we're doing to them what they do to us and our kids 24 hours a day.
We have learned how to surveil them and to preserve that very,
very powerful ephemeral content, which normally is never preserved.
And they never in a million years imagined that anyone would be sophisticated enough,
competent enough, audacious enough to preserve that content.
And that's what we are doing.
And as of this moment in time, we have preserved in recent months more than 44 million
ephemeral experiences on Google and other platforms.
We have the data.
We have the evidence and it's court admissible.
Wow.
So that's fascinating.
So presumably there are relationships and an agenda where interests converge to the degree where there is an established and undemocratic consensus about the nature of this reality that's being formulated, i.e.
this is the data that is promoted, this is the information that's amplified, this is the information that's censored, this is the information that people just don't get to see.
I wonder if when you presumably began to garner your expertise and education in behaviouralism, tools of this magnitude didn't exist and were not available.
Throughout the pandemic period there was a lot of talk about nudge units, certainly in our country there were, about how behavioural nudges could be offered, and sort of B.F. Skinner-type nomenclature about how behaviour can be controlled, how certain traits can be amplified, certain impressions can be projected and promoted and others maligned, ignored.
I wonder how your expertise and background in behaviouralism, Robert, maps onto this new reality and what advantages they now have having this kind of utility.
How does this, what do I want to say, how does this marry to your conventional understanding of behaviouralism in a normal propagandist state, like in the last century, where there were print media and TV media?
And can you tell us what techniques of observation and measurement are preserved and have sustained what must be an epochal shift?
I was the last doctoral student at Harvard University of B.F. Skinner, the man who some would say helped to create behavioral psychology.
And Skinner himself did not anticipate what has actually happened.
He would be shocked.
If he hadn't been cremated, I would say he'd be rolling over in his grave right now.
Because what is happening is astonishing.
It's just, it's unprecedented.
Companies like Google, and there are others too, but they're the worst offender.
Companies like Google now have access, because of the internet, to new types of manipulations.
These aren't nudges.
These are massive manipulations.
I mean, when we started doing experiments, controlled experiments, on these new techniques, which we had to discover, we had to name, and then we had to learn how to quantify them, I didn't believe our data.
In the first experiment we ran in 2013, I thought by showing people biased search results, I could shift their voting preferences by two or three percent, which I thought would be, you know, important possibly in a close election.
The first shift we got was 43 percent, which I thought was incorrect.
So we repeated the experiment.
These are not with college sophomores, by the way.
These are with a representative sample of U.S. voters.
And the fact is, we repeated that experiment.
We got a shift of 66 percent.
We continued to replicate.
Other teams have replicated this effect.
We did a national survey in the U.S.
We did research in India, research in the U.K.
This has been going on now for more than 11 years.
This is rock solid research and Skinner himself would be flabbergasted.
Because what we're seeing now are techniques for shifting people's thinking and behavior without their knowledge on a massive scale to an extent that has never been possible before in human history.
That's what the internet has made available.
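For readers who want a sense of how a percentage "shift" of the kind mentioned above (43 percent, 66 percent) can be expressed, here is a simplified sketch using invented numbers. The exact metric used in the published experiments may be defined differently; this only illustrates one straightforward way of computing a relative shift in preferences.

```python
# Simplified illustration, with made-up numbers, of a percentage "shift" in
# voting preferences measured before and after exposure to slanted rankings.

def preference_shift(pre_favor: int, post_favor: int, n: int) -> float:
    """Relative change in the share of participants favoring the boosted candidate."""
    pre_rate = pre_favor / n
    post_rate = post_favor / n
    return (post_rate - pre_rate) / pre_rate  # e.g. 0.43 means a 43% relative shift


if __name__ == "__main__":
    # Hypothetical group of 300 undecided participants shown rankings slanted
    # toward one candidate: 100 favored that candidate before, 143 after.
    print(f"shift = {preference_shift(pre_favor=100, post_favor=143, n=300):.0%}")
```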
Now, this wouldn't necessarily be that much of a threat.
Except for the fact that the way the internet has evolved, which no one anticipated, is it's controlled mainly by two big monopolies, to a lesser extent by a couple of other monopolies.
And because they're monopolies, it means that these techniques of control, we can't counter.
If, in an election, you support a candidate and you buy a billboard, I can buy another billboard.
You buy a TV commercial, I can buy two TV commercials.
But if one of these big platforms, like Google, if they want to support a candidate, or they want to support a Brexit vote, or they want to support a political party, there is nothing you can do to counter what they're doing.
What we've developed are systems to surveil them, to preserve the evidence, preserve the data.
That's the only way I know of to stop them: by gathering the evidence in a way that is, again, scientifically valid, so that the data are admissible in court, and that is what we're doing right now.
If people want to know the details, they can go to mygoogleresearch.com, they can go to techwatchproject.org. MyGoogleResearch.com will give them lots and lots of links to lots of published papers, lots of talks I've given.
This is serious work and what's happening here, again our attention is being misdirected away from what they're doing, but what's happening here, what they're really doing is extremely dangerous and very scary.
It undermines democracy.
It makes democracy into a kind of a joke. And since you haven't interrupted me yet, thank God, I want to just tell you that President Dwight D. Eisenhower, who was head of Allied forces in World War II, I mean, he was an insider.
In the last speech he gave as president in 1961, some people are aware that he talked about the rise of a military-industrial complex, and, you know, in that same speech, he warned about the rise of a technological elite.
This was 1961.
He warned about the rise of a technological elite that could someday control public policy without people's knowledge.
And that is what has happened.
The technological elite are now in control.
Oh my God, it's terrifying.
One of the things that you covered there was, I suppose, the monopolization, or at best,
duopolization of the public space.
Sometimes when I have something of this scale described to me, Robert, I find it inconceivable
to envisage that it could ever be opposed.
And yet there's something oddly traditional about the dynamics suggested by this.
We once believed that, in a sense, it was the function of the evolved state to preserve
and protect the interests of the public against corporate behemoths and corporate gigantism.
Now we have a gigantism that's unprecedented, way beyond the instantiations of a previous century, where it would have been steel and minerals and resources.
But attention and consciousness itself is the faculty, the object, of this monopolization.
It's extraordinary to hear how effective they are at managing and manipulating to 46% or 66%.
These numbers are sort of astonishing to hear.
I wonder what you think about Google's attempt to overturn that $2.6 billion EU antitrust fine.
I wonder what you think about, for example, we know we're on Rumble, that when Rumble covered the Republican primaries, it was apparently very difficult to find on Google.
And I wonder, perhaps most of all, about whether or not, given that it appears that there is
a political bias built into the system's current modality, whether or not an alliance with
the alternative political party is a possibility in order to regulate and break up these monopolies,
because that would seem to be the only way that it could be challenged. And that's the
sort of traditional component that I'm referring to, unless you have some kind of like, other
than the state or an incredibly mobilized population, even with the information that
you are curating and compiling, how do you ever challenge something of this scale?
It can be challenged, but the antitrust actions that are currently being used in the EU and
also in the United States were actually designed by Google's legal team.
They're absolute shams, complete shams.
It makes it look like our public officials are doing something to protect us.
They're not.
Google works closely with governments around the world, even with the government of mainland China.
And works closely with intelligence agencies around the world.
The people at Google know that no one can ever break them up because you can't break up the search engine.
That's their main tool.
If you broke up the search engine, it wouldn't work.
Facebook knows this, too.
You can't break up their basic social media platform.
That would be like putting a Berlin Wall through every family in the world.
Are there ways to stop them?
Yes, but antitrust actions aren't going to do much.
What could be done, though, and this is very light-touch regulation,
there's precedent for it in law,
there's precedent for it in Google's business practices,
is that you could declare the index, the database they use to generate search results,
you could declare that to be a public commons.
The EU could do it.
In other words, you would allow other parties, other companies, high school students,
you'd allow them to build their own search engine with access to Google's index.
You'd end up with thousands of search engines, all competing for our attention,
all trying to attract niche audiences exactly like the news media domain.
That's exactly what happens in news media.
That could be done simply by giving everyone access to Google's index.
Google would fight it in court, of course, and we'd see what happens, but that's one way.
But the only sure way that I know of to stop these companies, because they're affecting not just our elections, but our thinking, what we focus on...
They're in control of what content we see, such as your content, and what content we don't see, such as your content.
The only way to really stop them is through monitoring, because by monitoring, what happens is we preserve their manipulations.
We preserve them.
We can make them public 24 hours a day.
We can share the findings with public officials, both in the US and other countries, and give people, give organizations, give government agencies, give political campaigns the power they need to bring effective legal action against Google, because we're talking about massive amounts of data collected in a scientifically rigorous way.
I'll give you one quick example of how hard it is to find them if you don't have the data.
Last year, the Republican National Committee sued Google because Google was diverting tens of millions of emails that the Republican Party was sending to Republicans, and Google was diverting all those emails into spam boxes.
So the Republican Party sued them.
That case got thrown out of court.
Why?
They didn't have sufficient data to prove their claim.
Now, Google was really doing this and we were not monitoring that.
We are now.
The point is we can monitor what they're doing, preserve the data on a very large scale that can be used in the courts and that can be used with various government agencies.
And will they stop what they're doing?
Yes.
How do we know that?
Because in 2020, when we shared our data with some US senators, they sent a threatening letter to the CEO of Google and Google stopped.
They'll have to stop.
If they know that they're being monitored on a massive scale, 24 hours a day, worldwide eventually, by the way, we've already been approached by five other countries asking us to help set up monitoring systems.
If these tech execs know that their data are being captured, that we're doing to them what they do to us and our kids 24 hours a day,
that we're monitoring them,
that we're preserving data they thought could never be preserved,
they will stop, because you know why?
They can still make billions of dollars.
They don't have to at the same time be messing with our thinking, be messing with our elections, and especially be messing with our kids.
One of the new areas of research that we've started is looking at data coming onto the devices of more than 2,000 children throughout the U.S.
We're just beginning to look at that and our heads are spinning because what these companies are sending to kids is just unbelievable and parents are unaware.
There's a kind of social engineering occurring here on a massive scale that people are unaware of.
You can see it in leaks from Google.
You can see this:
the intention of some of the top people at that company is to make a better world according to, quote unquote, company values.
That's actually in a video that leaked from the company that was about the power the company has to reengineer humanity.
Literally, they're using the phrase according to company values.
We can stop them.
The first step is to be aware of what it is they're doing.
It sounds like the kind of...
banalized dystopia described both by Huxley and, to a degree, by David Foster Wallace in Infinite Jest, a sort of corporatized cultural space where the ideologies are masked in the kind of language of convenience, safety, no real moral spikes, no real ideological thrusts, you know, until there are, but mostly it's kind of
present in normalcy.
I suppose that in order to significantly change society, you have to change the parameters of
what people regard as normal significantly. Now, one of the things that you've talked about is the
possibility of dissent and the likelihood of dissent being closed down in such a space.
What do you think is the role of independent media within this space?
How can independent media succeed in such a highly controlled and curated space?
And what do we have to do to ensure that independent voices are able to be heard in a space like this?
And I'm very encouraged, by the way, by what you say about the monitoring; the effectiveness of monitoring does seem to, you know, somewhat slow and curtail the proclivities of this organisation in particular.
And the possibility for sharing that tech and, you know, making Google search stuff open source.
That does seem like an amazing way of dissolving that power.
But what do we do in particular about the sort of news media organizations like this one that necessarily exist within a space that's controlled to that degree?
Well, at the moment, you're in grave danger.
I mean, that's the bottom line.
At the moment, independent media of any sort are in grave danger.
One of the most remarkable pieces ever written about this problem, long before, by the way, he ever became aware of my research, was written by the head of the EU's largest publishing conglomerate, the German publisher Axel Springer.
And he published a long letter in English and in German called Fear of Google.
And it was about how his company, they're in constant fear of Google and every decision they make, every business decision they make, they have to make in such a way as to not offend Google.
Because when Google decides to suppress content, for example, to demote you in their search results or delete you, there's nothing you can do, there's no recourse at all, and you are now out of business.
And that's the environment in which we live.
So no matter what content you want to contribute to the world, and I'm speaking of you personally here, it's a whim on their part.
You're literally under the influence of whims at that company about whether you can continue to get your message out.
They've done this repeatedly with independent news sources.
They have reduced their traffic to 10% of what it was.
They can do that with the flip of a switch.
And by the way, that was confirmed to me by one of the whistleblowers from Google.
I'm in touch with a lot of the whistleblowers.
I'm in touch with people at Google who haven't even blown the whistle yet.
So I know way, way, way too much about what's going on there.
But they, yes, at the moment, they have that power.
They decide what more than 5 billion people around the world can see and cannot see.
And at the moment, there is no way to counteract what they're doing.
In the U.S., the courts have said over and over again, when they have, for example, shut down hundreds of websites belonging to one particular company, yes, they have that ability.
They can block websites.
They block millions of websites every day.
March 31st, 2009, they blocked access to the entire internet for 40 minutes.
That was reported by The Guardian and that was never denied by the company.
I eventually figured out, by the way, why they chose those particular 40 minutes to shut down the internet.
The point is they have this incredible power.
They use this incredible power.
The courts in the U.S. have said they have every right to do that because they're a private company.
And see, that's the problem here.
In other words, even though I agree with a lot of their values because I lean left myself politically, I don't like the idea of a private company that's not accountable to us, to any public, having this kind of power.
That's the problem here.
The problem is not necessarily their values.
The problem is the power that they have and that they're utilizing without any accountability.
To us, to any population, any group of people around the world, they're simply not accountable.
I hope some of your viewers find that to be objectionable.
I hope some of your viewers will go to mygoogleresearch.com because this big national monitoring system that we started setting up last year, I had raised about $3 million to get us going on it.
It's going extremely well.
We've preserved now more than 44 million ephemeral experiences.
We have a panel nationwide of more than 11,000 field agents in all 50 U.S. states, because we've got to get the system going here fully before we start helping other countries.
But the fact is that $3 million is now almost gone.
I need access to other major funding.
One of our advisors is trying to get us in touch with people in Switzerland who he feels might be very interested.
Are there people in Europe or in the UK who could help us?
Because this system has to exist.
This is not optional for humanity.
If we don't monitor them, we will never know how they're influencing elections or kids or human autonomy with no system in place like that.
I'll make a specific statement.
If this system is not fully running next year in the United States, with all of our data being shared with authorities and with the public every single day, if this system is not there, Google alone will be able to shift between 6.4 and 25.5 million votes in the presidential election of 2024 without anyone knowing what they're doing, without anyone being able to go back in time and look at all that ephemeral content.
That's what we're up against here.
That's why we must have systems like this, monitoring systems in place, that catch the data that they thought could never be caught.
That's what we've learned how to do.
I need your help and your audience help in making this happen.
This horrifying power that you described, already present, already active, already operating according to your data, is as yet un-augmented by a fully capable AI technology.
What are your thoughts on how the AI component will advance these capacities?
And what do you feel about, for example, the sort of chatbot story and the talk of sentience and, you know, the sacking of software engineer Blake Lemoine or Lemien or whatever his name was.
What do you feel about that, Doc?
AI is part of the story, obviously.
It is also potentially dangerous in its own right.
It will make these capabilities that they have even more powerful.
For example, we just finished, in fact, I've not announced this publicly, this will be my first announcement, but we've just finished our first exploration of what we call DPE, the Digital Personalization Effect.
And what we've shown is that if we show people biased content, we can produce shifts easily of 20% or more in their voting preferences.
If we personalize the content, which of course Google is famous for doing, if we personalize it based on some things we know about those people and what kinds of media sources they trust and news sources and celebrities, if we personalize the content so it's coming from sources they personally trust, that shift goes up to over 70 percent, from a 20 percent shift to a 70 percent
shift. That's just by personalizing, and AI, of course, makes it much, much easier and smoother to
personalize content. That's one of the main dangers here.
So the fact that these companies have always relied on AI to some extent, and now are relying on it more and more, makes them more powerful and far more dangerous.
All the more reason why we have to capture the ephemeral content that they use to manipulate people.
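The personalization idea described above can be illustrated with a small, entirely hypothetical sketch: the same message is simply attributed to whichever source a given user already trusts most. The source names and trust scores here are invented and imply nothing about how any real system works.

```python
# Hypothetical sketch of "personalized" content delivery: attach a message to
# the source each user trusts most. All names and scores are invented.

TRUST_PROFILES = {
    "user_1": {"Outlet A": 0.9, "Outlet B": 0.2, "Celebrity C": 0.6},
    "user_2": {"Outlet A": 0.1, "Outlet B": 0.8, "Celebrity C": 0.7},
}


def personalize(user_id: str, message: str) -> str:
    """Attribute the message to the source this user is most likely to trust."""
    profile = TRUST_PROFILES[user_id]
    best_source = max(profile, key=profile.get)
    return f"{best_source}: {message}"


if __name__ == "__main__":
    for uid in TRUST_PROFILES:
        print(personalize(uid, "Candidate X announces new policy"))
```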
And I'm going to say it a third time, MyGoogleResearch.com, because we are desperately in need of help.
I'm just being honest with you.
I mean, we desperately need help.
We can't do this ourselves.
I have a team of almost 50 people working on this day and night.
A lot of them are volunteers.
It's very, very difficult what we're doing.
It's never been done before, but we're doing it and we're doing it well.
And we need people's help to make sure that this can be done on a larger scale.
For those of you out there who care about such things, all donations are going to a 501(c)(3) public charity.
They're all fully tax deductible.
I'm so sorry that I have to keep interrupting with this begging for money, but that's the reality.
What we're doing is expensive, it's new, and it's important.
It's extremely important.
It sounds important in a way that's almost difficult to conceive of.
When you were talking before about the impact of personalised data, it made me realise that we're simply not evolved to live in a world where information can be curated in that manner.
I imagine, I imagine that the roots of behaviouralism must have, you know, a component that's anthropological and ethnographic, in how we are evolved to relate to one another and how we're evolved to trust sources of information.
How a consensus between a group is established and to have tools that can wallpaper your reality like a kind of chrome sphere surrounding your mind is...
It's in a sense beyond our, it's beyond sugar.
It's beyond sugar in terms of an agent of interruption, stimulation and control.
So I recognize how important what you're doing is.
I can hear that you're, you know, necessarily evangelical about continuing the work because it's seismic and pertains to sort of cornerstones of our as-yet-still-called civilization, like democracy, like the judiciary, like the ability to have open conversations, like important principles around which we presumed society was being built, but which for a while we have suspected are, in a sense, simply gestures that are put in place while real power does what real power wants to do.
And that kind of power with this kind of utility is truly terrifying.
Can you speak for a moment about the aspect of, from a behaviouralist perspective, how we are not, you know, because in a sense, right, I'm a person, obviously, and I imagine that I'd be able to go, oh, well, I'm getting very biased information here from Google.
How is it that we simply are not able to discern, tackle, remain objective,
keep some kind of distance from this experience?
Why is it so powerful from a, almost from an anthropological and behavioral perspective?
A couple of issues there.
First of all, most people can't see bias in the content that's being presented to them.
So those people are very easy to shift.
And in some demographic groups, you can easily shift upwards of 80% of voting preferences.
Some people are just very, very vulnerable to this kind of manipulation because they trust companies like Google, they trust algorithmic output because they have no idea what algorithms are.
They trust computers because they think computers are inherently objective.
So, you've got all that working against you.
And then there's another factor, which is really, really scary.
In some of the big studies that we've done, there's always a small group of people who do see the bias.
Now, it's a small group, but with a big enough study, you know, that group is large enough for us to look at them separately.
And here's the thing.
The people who see the bias, they shift even farther in the direction of the bias.
Now, how can that be?
Why would that be?
Well, because, presumably, they're thinking, well, I can see there's bias here, and of course, Google is objective, or computer output is objective, or algorithms are objective, and it's clearly preferring that candidate over this candidate.
So that candidate really must be the best.
And those shifts, the shifts among the people who can see the bias are larger than the shifts among the people who can't see the bias.
So, you know, there are no protections here.
This is a whole new world of influence and manipulation.
The only protection that I know of for sure that works is by doing to them what they do to us, by surveilling them, capturing the data so that it can be looked at carefully by authorities and courts.
You know, I'll tell you something, the UK and the EU, as you know, have been far more aggressive against Google in particular than any government agency in the US.
Because, you know, it's a US company.
So the EU has fined Google over and over again, more than 10 billion euros in fines, also big fines in the UK.
You know what?
It has no impact on these companies whatsoever.
They've ordered Google to do this and that.
Google has completely ignored them.
What is lacking in the EU and the UK is a monitoring system to measure compliance with whatever the new laws and regulations are; there are no monitoring systems in the EU and the UK, and Google knows that.
They completely ignore all of these various agreements and orders because no one is monitoring.
No monitoring means you can't measure compliance.
Well, I imagine we're going to have to be pretty clear about how to find your work because I don't imagine it comes up very easily on a Google search.
Dr Robert Epstein, thank you so much for joining us today.
Thank you for conveying this complex information in such an easy-to-understand way, in spite of the vastness of the task and the scale of the challenge.
Thanks for giving us some suggestions of what a solution might look like and making it clear this is something that's happening right now and how difficult it is to detect. And yet there is a way to oppose it, and I would recommend that all of you learn more about Dr. Robert's work by going to drrobertepstein.com and mygoogleresearch.com.
Doctor, thank you so much for joining us.
I'm sure we'll be talking again, although these conversations might be difficult to find online.
Thank you.
Thank you.
That shows you the necessity of supporting us by clicking the red Awaken button and joining and supporting our community.
Without a direct connection to you, it's going to become increasingly difficult to communicate with you in a curated and controlled cultural space.
On the show tomorrow, we have Glenn Greenwald.
Imagine the information that he's going to be able to convey on this subject, as well as war, the pandemic, legacy media, corruption.
If you do become an Awakened Wonder and join our community, and I urge you to do that, you've just heard what Dr. Robert Epstein has described, it's almost a necessity to do that, you'll get access to guided meditations, readings, questions and answers.
And I want to thank you that have recently become new annual supporters like Truthfulergave, Barloo, Lucky Lou, Magic Peace, Love Ray, Pardon, Snuffle Dog, The Kennedys, Freddie, Flintstone, and so many more.
Thank you for joining us.
We really need you now more than ever.
Join us tomorrow, not for more of the same.
We'd never insult you with that, but for more of the different.