Nov. 23, 2020 - Epoch Times
36:43
Google Vote Reminders Only Went to Liberals, Not Conservatives for at Least 4 Days—Dr Robert Epstein | American Thought Leaders

Those vote reminders only went to people who call themselves liberal, and not one of our conservative field agents received a vote reminder.
During his 2020 election monitoring project, Dr. Robert Epstein found stark data showing Google election bias, he says, and he's only just started to pore through all the data that was collected.
In this episode, Dr. Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology, explains what he's found so far, and why he believes the public has the ability to stop big tech election bias, even without legislative action.
Dr. Robert Epstein, such a pleasure to have you back on American Thought Leaders.
It's my pleasure, of course.
Dr. Epstein, you have some pretty significant, I guess I could say, allegations of activity from Google that basically suggest that Google was disproportionately targeting, basically, Democratic voters with messaging about getting out the vote.
And I'm hoping you can actually speak to that to start us off.
And of course, this is all part of a large monitoring project that you've been involved in for many years now.
Yes, well, first of all, let me say I don't have any allegations.
I have some data that my team and I have collected.
I should also say, I think it's relevant to this issue, I was contacted by a pretty prominent Washington, D.C. attorney a few days ago who thinks I should go into hiding.
But I have no allegations.
I do have some data, and I think they're possibly quite important data.
So tell me about these data, please.
Well, you need a little background.
We've set up systems since 2016 to try to determine what the big tech companies were showing people in the days leading up to the 2016, 2018, and now the 2020 elections.
So to do that we recruit people we call field agents.
We had 95 of them back in 2016, and we equip them with special software which allows us, in effect, to look over their shoulders as they're doing election-related activities on the Internet.
For example, doing searches on Google, Bing, and Yahoo.
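To make the phrase "looking over their shoulders" concrete, here is a minimal sketch of how ephemeral content might be archived. This is not Epstein's actual software, whose internals are not public; it assumes a Selenium-driven Chrome browser, and the function name, file layout, and URL are purely illustrative.

```python
from datetime import datetime, timezone
from selenium import webdriver

def snapshot(url: str, agent_id: str) -> None:
    # Launch a real browser session; assumes a local Chrome/chromedriver setup.
    driver = webdriver.Chrome()
    try:
        driver.get(url)
        stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
        base = f"{agent_id}_{stamp}"
        # Save the rendered HTML so the fleeting content can be re-analyzed later.
        with open(f"{base}.html", "w", encoding="utf-8") as f:
            f.write(driver.page_source)
        # Save a visual record of exactly what the field agent saw.
        driver.save_screenshot(f"{base}.png")
    finally:
        driver.quit()

snapshot("https://www.google.com", agent_id="agent-001")
```

Whatever the real implementation looks like, the essential design point is the same: content that would normally vanish is written to durable storage with an agent identifier and a timestamp, so it can be analyzed after the fact.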
And this year, what we did was far more ambitious.
In 2016, we were able to preserve 13,000 of what Google calls ephemeral experiences.
In this case, searches on Google, Bing, and Yahoo.
Ephemeral experiences.
That's a very important phrase.
It means those experiences we have online that involve very fleeting content: it impacts us, disappears, and is gone forever, so it can't be analyzed.
That's the whole point.
We know from leaks from Google that ephemeral experiences are used quite deliberately and strategically to shift people's thinking and behavior.
That's beyond any doubt at this point.
So the point is that this year we recruited not 95 field agents but 733, deliberately in three very critical battleground states.
Those are Arizona, Florida, and North Carolina.
We wanted to go where the action was, where we would be most likely to find some evidence of bias or manipulation.
And we have indeed found evidence of bias and we've also found what some people might want to call a smoking gun.
We found that during the week of October 26, that's quite close to the election, only our liberal field agents were getting vote reminders on Google's homepage.
I deliberately began to go public with this information on Thursday the 29th.
I shared a lot of information with a reporter from the New York Post and that was quite strategic on my part because I knew that all New York Post emails are shared with Google.
I wrote about that in an article. I knew that all the information I was giving to this reporter would be seen by Google.
With my name in there, I assumed an algorithm would immediately boot it upstairs to real people.
And then real people would look at all this information.
The point is that two things happened that night.
That was Thursday night, October 29th.
Number one, the article, which I gather was about to go to press, got pulled from the New York Post.
So think about that.
That's a fairly conservative news outlet.
And in effect, I got censored by this news outlet.
Relevant there, perhaps, is the fact that The New York Post gets 32% of its traffic from Google.
Maybe someone reminded someone else at the New York Post that they're quite vulnerable to being harmed by Google.
In other words, Google has the power literally to put them out of business almost overnight.
The other thing that happened was that the targeted messaging on Google's homepage disappeared.
It was like someone flipped a light switch and it just disappeared.
So starting at midnight on October 29th, just days before the election, all of our field agents began to receive that vote reminder on Google's homepage.
And that continued until the very end of Election Day on November 3rd.
So that's...
It was certainly one of the interesting findings that we've detected so far, but then it took a step in another direction because at that point I was receiving calls from various members of Congress and some AGs that I work with, and I explained to them what we had found.
Next thing I knew, this was November the 5th, I believe, so really right after the election, three U.S. senators sent a letter to the CEO of Google talking about some of my findings and basically accusing him of lying before Congress,
which is a felony, when he said that we never, ever, ever tilt any of our content toward one political party or another.
And so they're going after him based on my data, and that could explain why that Washington, D.C. attorney told me I should go into hiding.
Dr. Epstein, is there any chance that this data was somehow wrong?
And the other question is, did you just start monitoring this from the 26th, or did the results as you described them just start on the 26th?
The 26th is the day we decided that we were fully operational.
At that point, we had over 500 field agents.
You know, it takes a while to get a system like this going.
There's a lot of recruitment involved.
There are a lot of technical challenges.
We were being attacked digitally more so than ever before in previous monitoring projects.
One DDoS attack, that's a distributed denial-of-service attack, was pretty serious.
So we had some outages, but basically, on Monday, the 26th, we decided we were fully operational, and so we've just been focusing our analysis on data we received from that point forward.
Over time, of course, we're going to look at everything we have, but at the moment, that's our focus.
Basically, about a week before the election, in other words, to see what was going on.
Keep in mind that the most dramatic kinds of manipulations are going to occur very close to the election, because that's when you want to do three things, if you can.
If you're supporting one candidate, of course, you want to mobilize the base.
In other words, you want to get those voters off of their sofas if they haven't yet voted by mail.
Secondly, you want to discourage supporters of the candidate you oppose from voting.
So you want to keep those people home.
But most important of all, and that's where our monitoring project is very important, you want to impact the people who are still undecided.
And it's those last few days, those are absolutely critical.
That's where you're going to apply the most pressure you can in every way you can to try to nudge those undecided voters in one direction or the other.
Those are the people who end up deciding. Normally, in a close election, those people decide who wins.
Is there any chance that your data is wrong?
I mean, this is pretty stark.
You're basically saying that for three days, at least, voting reminders were going to purely one, I guess, ideological group.
Well, it would be four days, the 26th through the 29th.
I don't know how to answer the question.
I mean, I was startled when I saw the numbers appear on my screen.
And I can't say for sure what people around the country were getting, but I can say that we had recruited a diverse group of 733 registered voters, Republicans, Democrats, and Independents.
And I can say that during that period of time, our field agents who identified themselves as liberal all got this vote reminder on Google's homepage.
I can say that among those who called themselves conservative, not a single person saw that reminder on the homepage.
Now, that's a strange thing to see on a computer screen when you're looking at data, because you have 100% in one group and 0% in the other group.
One doesn't see that very often.
Let's put it this way.
You don't need to do a statistical analysis to see if there's any difference between the two groups.
A hundred percent versus zero percent?
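His point, that a 100% versus 0% split needs no formal test, is easy to verify. Here is a minimal sketch using Fisher's exact test; the cohort sizes are assumptions, since exact counts were not released (he says later in the interview that roughly a third of the 733 agents identified as liberal and roughly a third as conservative).

```python
from scipy.stats import fisher_exact

# Assumed cohort sizes (~1/3 of 733 agents per label, per the interview).
# Columns: saw the vote reminder / did not see it.
table = [[240, 0],    # liberal field agents
         [0, 240]]    # conservative field agents
odds_ratio, p_value = fisher_exact(table)
print(p_value)  # vanishingly small: such a split is not plausibly chance
```

With any plausible cohort sizes in the dozens or hundreds, the p-value is effectively zero, which is just the statistical restatement of his remark.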
Let's think about it another way.
Let's say that Google was monitoring us, and if I were Google, I would have been monitoring us.
They wouldn't alter our data to have that pattern.
It just doesn't make any sense.
We also have an enormous number of security precautions in place when we do these projects.
I won't go into detail, but the point is that although we can be attacked and our systems frozen for short periods of time, there's no way for anyone really to change the data.
We're seeing what the field agents are seeing.
In effect, we are looking over their shoulders using software, and this is with their permission.
We're looking over their shoulders, and we're seeing what they see on their screens.
And what we saw during that period was that, again, those vote reminders only went to people who call themselves liberal, and not one of our conservative field agents received a vote reminder.
Again, I'm not making any allegations.
I'm simply reporting what we found.
And of course, we also found a bias in Google search results and other things.
And we've preserved so much data, it's going to take us really months to understand what we have.
And the main point of this project, of course, is to preserve content that is normally lost forever.
This is all ephemeral content.
And the folks at Google know this.
In a leak of emails to the Wall Street Journal in 2018, one Google employee says to others, "How can we use ephemeral experiences to change people's views about Trump's travel ban?"
Now, that's from inside Google.
I mean, they know that ephemeral experiences like search results or reminders on the home page or search suggestions, news feeds, they're ephemeral.
They appear in front of your eyes, they impact you, they disappear, they're gone forever.
No one can go back in time and see what these companies were showing people or saying to people on their personal assistant devices.
That's what's quite unique about what we've done: we have preserved, well, in 2016, 13,000 ephemeral experiences, which are normally lost forever.
And this time, we preserved more than half a million, of all sorts, by the way.
We preserved Google Home Pages, Google search results, and, of course, Bing and Yahoo search results, but also YouTube sequences, thousands of YouTube sequences.
We preserved Facebook homepages.
What kinds of messaging was Facebook sending to its members, its users?
We've only taken a quick stab at that.
We're not really sure yet.
And that's because, of course, the Facebook homepage is quite complex, whereas the Google homepage is quite simple.
So we will, at some point, have an answer, and we'll know whether there was any kind of targeted messaging on Facebook's homepage as well.
If there was, I mean we're talking about the possibility that maybe a number of Silicon Valley companies were all brazenly pushing votes in just one direction.
And I calculated months ago that if all the Silicon Valley companies, the most powerful two being Google and Facebook, if they're all pushing in the same direction, that could easily shift, in this election, up to 15 million votes, which means they, in effect, decide who the next president is going to be.
Now, I lean left myself.
I think it's great that they're pushing causes and candidates that I like and that my family likes.
But, you know, I put democracy, I put the free and fair election, I put our country ahead of any personal preferences I might have for a candidate or a party.
The fact is, if we allow companies like Google to control the outcome of our elections, then we have no democracy.
There is no free and fair election.
All of that is illusory.
In effect, we have, behind the scenes, technology lords, technology masters, who are actually running the show.
And to me, that is unacceptable, even if they're supporting candidates and causes that I like.
That is unacceptable.
In my opinion, that should be unacceptable to all Americans, no matter what your party is.
And if you're out there, my Democrat friends and family, if you're out there listening and you don't like what I'm saying, then shame on you.
Shame on you.
Because you don't know tomorrow who these companies are going to be favoring.
What they do is not transparent.
And they're not accountable to the American public, unlike our elected officials.
They are not accountable.
What they do is highly secretive.
You don't know what they're going to be doing next.
You don't know what they're doing in other countries, for example.
We have evidence that in Cuba, for example, they don't support the left.
They support the right in Cuba because the left are in power and the left doesn't like companies like Google.
So they support the right.
In China, mainland China, Google has worked with the Chinese government to help spy on and control the Chinese people.
There was a tremendous revelation just over a year ago of a secret Google project called Dragonfly, in which Google was basically going back into China, working with the Chinese government to help them control their population.
So, again, I'm speaking now to my Democrat friends and family.
I love you, but if you're mad at me for what I'm saying, then shame on you.
This is hard to take in in its entirety, I imagine, for myself and for other viewers.
I just want to ask you one more quick question about what you call the smoking gun data.
How many of these 733 were conservative and liberal, according to your metrics?
I just want to get a sense of the sample size we're talking about.
I don't have those numbers right in front of me at this point, but we had a very diverse group.
Very, very roughly, we had about one-third liberals, about one-third conservatives.
Then we had a smaller group that called themselves moderates and a smaller group still that called themselves other.
I don't have the exact numbers here, but eventually, of course, we'll write all of this up and we'll release lots and lots of details.
In this world, there are good problems and bad problems.
We have a good problem.
We have so much data.
What we accomplished in a very short time, it dwarfs what we've done in previous monitoring projects.
This project tells me basically two things.
It tells me, yes, it is possible to do large-scale monitoring of these companies to do to them what they do to us 24 hours a day.
We can be monitoring year-round on a very, very large scale pretty much everything that they are showing to users, that they're telling users through their personal assistant devices, and we can be looking 24 hours a day.
We can be looking for manipulations, for bias, for shenanigans of all sorts. Of course, this project also tells me that if we are detecting any kinds of irregularities and then we're exposing those irregularities, we can get these companies to back down.
We can get them to stop.
This is without laws or regulations.
This is just using tech, good tech, to fight bad tech.
You know, laws and regulations move very, very slowly.
Tech moves literally at the speed of light.
How are you going to, you know, trying to think ahead to the future, not just Google, but to the next Google and the one after that, how are you going to protect humanity, democracy, free speech from companies like Google, whether they're well-intentioned or whether they're truly evil?
How are you going to protect humanity?
You do it with monitoring systems, because monitoring is tech, and it can keep up with whatever new technologies are emerging.
It can keep up with those technologies on an ongoing basis, and I mean 24 hours a day, and it can protect us from manipulation, from, in effect, the undermining of democracy and, over time, even the undermining of human autonomy.
One of the disturbing things that leaked out of Google in 2018 was an eight-minute video called The Selfish Ledger.
This was never meant to leave that company.
It talks about the power that Google has to re-engineer humanity.
To re-engineer humanity.
And this video also includes the phrase company values.
To re-engineer humanity in a way that reflects company values.
In effect, they're acknowledging that the kind of thing we're finding is real, it's deliberate, it's strategic.
They're acknowledging a kind of utopian thinking.
Maybe they know better.
But how would the rest of the people in the world, the people outside of Google, how would they...
How would they weigh in on this kind of possibility or this kind of plan, these kinds of actions?
How would the rest of the world weigh in?
I really don't think people around the world would welcome a private company in the United States exercising the power it has to rig elections, to re-engineer humanity according to company values.
I mean, I think it's outrageous.
You know, we need transparency.
Obviously, there are all kinds of ways we might get more transparency.
Various countries have tried and so far failed.
But most important, we need to be protected from these manipulations and I know now for sure, because of what's happened with our current monitoring system, I know we can be protected by having large-scale permanent monitoring systems in place, not just in the U.S., but in countries around the world.
I know that not only that we can do it, I know that we must do it.
I don't think this is optional anymore for humanity.
I think that the technological elite is now in control, whether you know it or not.
They're now in control of our elections.
I know of a way to stop them.
I did stop them.
At least with one manipulation, it appears that I did stop them.
By the way, a whistleblower, a leak from the company, subpoenaed documents, court discovery, any number of different methods could someday confirm that what happened on the night of Thursday, October 29, really happened the way I am speculating.
We might actually get confirmation of that at some point, that Google became aware of the monitoring, they became concerned, and they turned off a blatant manipulation, which is, I'm told, probably a violation of campaign finance laws, a felony that can be punished with fines or even prison time.
Of course, it's also possible that there was some sort of heroic actor within Google who was partisan who would have been involved in something like this.
I don't presume to speculate.
But I'm sure a lot of our viewers right now are very curious. Do you have any estimate for that particular observation, this homepage voting encouragement?
How many votes could that have shifted based on your experience or your estimates?
I haven't done those calculations yet, but I can tell you a couple of things.
Number one, that homepage is seen in the United States 500 million times a day.
If that kind of reminder was being used systematically over a period of time, it affected more than who voted on Election Day.
It affected who sent in mail-in votes.
It affected who registered to vote.
Imagine that kind of targeted messaging continuing for a long period.
It can shift a lot of votes.
You know, directly by getting someone to vote and then indirectly by getting more and more people with one particular political orientation to register to vote.
So, you know, those are calculations that we know how to do.
At some point I will do them.
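Since he says the calculation hasn't been done yet, the following is only an illustration of the kind of arithmetic involved. The 500-million-impressions figure is from the interview; every other number is a hypothetical assumption, and the result moves linearly with each one.

```python
# Back-of-envelope sketch; not Epstein's calculation, which he says is pending.
daily_impressions = 500_000_000   # cited: US Google homepage views per day
unique_reach_rate = 0.30          # assumed: share of impressions that are
                                  # distinct eligible voters over Oct 26-29
per_voter_effect = 0.002          # assumed: +0.2 percentage point turnout
                                  # lift per reminded voter

voters_reached = daily_impressions * unique_reach_rate
votes_shifted = voters_reached * per_voter_effect
print(f"{votes_shifted:,.0f} additional votes under these assumptions")
# -> 300,000; halve or double any assumed input and the estimate follows
```

The point of the sketch is not the output but the structure: reach times effect size, compounded across registration, mail-in voting, and Election Day turnout, as he describes.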
I certainly did not expect to find this targeted messaging on Google's homepage.
It might also be on Facebook's homepage.
We've preserved thousands of those homepages.
We know who they went to.
We know the demographics of the people that they went to.
So we'll be able to look into that as well.
But the point is I wasn't expecting this.
We have so much data.
We have an incredible wealth of data that we're going to find all kinds of things that, you know, who knew?
Things that we just didn't expect.
YouTube sequences.
Imagine the power that YouTube sequences have over opinions and votes, because 70% of the videos that people watch on YouTube around the world are now suggested by YouTube's Up Next algorithm.
People sit there letting YouTube, which is part of Google, feed them videos.
Imagine the power that a sequence of videos has to shift the thinking of someone who's vulnerable, of someone who's undecided, someone who's trying to make up their mind about something.
This is a way to take people down rabbit holes.
And there are documented cases now of individuals who have turned to right-wing extremism or Islamic extremism because of a sequence of videos they saw on YouTube.
Well, we've done something no one's ever done.
We have recorded, in the days before a political election, we have recorded thousands of YouTube sequences.
And again, we know the demographics of the people that were watching these sequences.
Just to show you that we're not dreaming here.
One of the videos that leaked from Google in 2018 was a two-minute video in which the CEO of YouTube, her name is Susan Wojcicki, is talking to her staff, or so it appears,
and she's explaining that they're just not going to let any of this fake news affect their YouTube users anymore, so they're modifying the Up Next algorithm to elevate, to push up, content that they think is valid and to demote content that they think is not.
So we're talking about, again, deliberate efforts within the company to engineer the kind of content that people can see and that people cannot see.
And that second category, content people cannot see, that's very, very dangerous.
Some people would call that censorship.
But, you know, what's truly dangerous about it is you don't know what you don't know.
You don't know what's being suppressed because you can't see it.
So we know that there have been deliberate efforts within the company to alter the YouTube Up Next algorithm.
Obviously, according to, once again, that phrase: company values.
I mean, how would outsiders have made judgments about which videos people should be able to see or not see?
How would people of different political orientations have made those decisions?
What exactly was the process they used to make those decisions?
See, this is unacceptable.
We cannot have private companies that are not accountable to the public deciding what three billion people around the world can and cannot see.
I mean, we are in absolutely unprecedented times, of course, in terms of the raw ability of, as you said, private corporations to control information.
Absolutely.
How soon can we expect to get some results from your work?
That's the first question.
And the second question is, I imagine there's viewers that are watching who might want to help you do some of that processing if you're looking for people to do that.
Because, as you said, who knows what could be found in these data?
Unfortunately, we can't really accept volunteer helpers.
We would love to, because we get people offering every single day, and I'm sure they're wonderful people and they're very sincere in wanting to help us.
But the problem is that if someone volunteers, we don't know whether they have some connection with the big tech companies.
Because Google not only employs 100,000 people, it also employs, that we know of, more than 10,000 outside contractors, just regular old people, for all kinds of purposes.
If I were Google and Dr. Epstein was looking for volunteers to analyze data, I'd be very generous.
I would send him dozens and dozens of volunteers.
You see, we have no way of distinguishing a good volunteer from a bad volunteer.
We have to take all kinds of precautions.
We find our people through networking, and we're extremely, extremely cautious.
We do background checks.
We have people sign very strict NDAs.
We're very cautious.
I saw Dershowitz give a talk once in which he said there's a fine line between paranoia, which is a bad thing, and caution, which is a good thing.
We're very, very cautious.
You asked when we're going to release lots and lots of results.
I can't even give you a date on that because, frankly, we are overwhelmed in a good way.
We're overwhelmed with how much information we were able to preserve.
We have to invent new ways of analyzing this kind of data.
Literally, just as we've had to invent new ways to study online manipulations since 2013, we had to invent how to set up a monitoring system that allows us to look over the shoulders of real computer users.
That took us about 10 months back in 2016.
We started in January, and we're still, you know, figuring that out.
Obviously, we're getting better.
We're getting better at doing what we're doing.
But in my mind, all of these projects and the numbers, they're not going to really solve any problems.
I don't think so.
What we have done and the real value in what we've done is to show what's possible.
You can think of these projects as proofs of concept.
We are showing that it is possible to look very, very carefully at what these companies are showing people, and that it can be done on a very, very large scale.
It could be set up permanently in all 50 states.
It could be collecting massive amounts of data, and the data could be analyzed.
It's possible.
It can be done in real time using algorithms.
You see, in other words, we can do the same kind of thing that Google does.
Again, using good tech to fight bad tech.
We can do large-scale monitoring.
It could be set up so that it is permanent and constantly looking for irregularities and then reporting them, exposing them.
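As a minimal sketch of what "permanent and constantly looking for irregularities" could mean in practice: on a fixed interval, compare how often two cohorts of field agents are shown a given message, and flag lopsided exposure for human review. The cohort labels, the collect_exposures hook, and the alert threshold are all assumptions, not details from the interview.

```python
import time
from scipy.stats import fisher_exact

def collect_exposures(cohort: str) -> list[bool]:
    """Hypothetical hook into the capture layer: one entry per agent,
    True if that agent's latest snapshot contained the message."""
    raise NotImplementedError  # supplied by the real monitoring system

while True:
    a = collect_exposures("liberal")
    b = collect_exposures("conservative")
    # 2x2 table per cohort: saw the message / did not see it
    table = [[sum(a), len(a) - sum(a)],
             [sum(b), len(b) - sum(b)]]
    _, p = fisher_exact(table)
    if p < 0.001:          # assumed alert threshold
        print("exposure disparity flagged for human review")
    time.sleep(3600)       # re-check hourly
```

The design choice worth noting is that detection is automatic but judgment is not: the system only surfaces statistical anomalies, and humans decide whether they amount to bias.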
And I think that when this happens, I think this is not an option for humanity.
I think it's a necessity.
It's a requirement.
It must be done.
And when this happens, think about it.
Think about what these companies, how they'll react.
I mean, they wouldn't dare to try rigging an election.
They wouldn't dare to try re-engineering humanity.
This is the way to protect ourselves.
It's by setting up monitoring systems that detect and expose threats to our democracy.
Dr. Robert Epstein, such a pleasure to speak with you again.