May 11, 2022 - Health Ranger - Mike Adams
50:40
Facebook censorship whistleblower Ryan Hartwig interviewed by Mike Adams

Welcome to Brighteon Conversations.
I'm Mike Adams, the founder of brighteon.com, a free speech platform.
And today we have a very special guest, the author of a book called Behind the Mask of Facebook.
In fact, he was a Facebook content moderator turned whistleblower, and he understands exactly what we're dealing with when we talk about censorship.
I mean, I was deplatformed from Facebook, I think, in 2015. I had 3 million followers at that time.
Imagine what it would be today.
But then again, that's why we build our own platform, so we can have these conversations.
Anyway, our guest is Ryan Hartwig, and thank you so much for joining us today, Ryan.
It's great to have you on.
We love what you're doing.
Just welcome to the show.
Yeah, thank you, Mike.
So it's an honor to be here.
Well, tell us, I mean, give us the overview of sort of how you ended up being the author of this book.
Yeah, so it's funny because growing up, I'm a big fan of science fiction.
So I always thought if I wrote a book, it would be sci-fi.
But it turns out my first book is nonfiction.
So taking you back, I mean, March of 2018, I started a job working for Cognizant.
And I didn't know what the job was.
And it said, social media content moderator, bilingual.
So I like the fact that I was using my Spanish.
And so I started this, applied for this job, got the job.
And only after I got hired did they tell me it was Facebook, that their client was Facebook.
So I was reviewing content for Facebook and Instagram.
We would make decisions as to whether to take it down or leave it up.
But we would see, you know, some of the most filthy things on the internet.
I mean, we'd see bestiality, cartel violence, beheadings, suicides.
So all of that we saw, child pornography, just horrible stuff.
So that was my job for two years.
Well, let me stop you right there.
I mean, that must have been traumatizing just by itself, I would think.
Yeah, I mean, the job itself, before they gave you the job, they would tell you in the interviews. They'd give you examples of what you'd see and ask if you were okay with that.
And you can't really skip jobs.
If something pops up on your screen that's abhorrent, you have to take action on it.
So the mental health aspect of that job is very fascinating.
We had counselors on site, licensed counselors who were available anytime if we needed to talk about something we saw.
So yeah, it was difficult in that regard.
Before we continue, I know we're going to talk about censorship and viewpoint discrimination and so on, but I'm just curious, has this altered your view of humanity?
It really has.
I mean, you think about humanity as...
I mean, there's a lot of good things, and I grew up very religious, so I tend to have a more positive perspective on humanity.
But seeing all the stuff, I mean, the thing that really gets to me is, like, you know, I saw posts about, like, sex trafficking of children, and that's something that really got to me.
But, yeah, it makes you really think about how much evil there is in the world.
There's a lot of good, don't get me wrong, but...
It's shocking to see how much evil is out there and people being exploited.
It gave me a really new perspective about life and humanity in general.
I'll bet, especially being a religious person yourself, because people can pretend to be good in public, but then they get behind the perceived anonymity of a keyboard and maybe a VPN, and then suddenly all this insanity and demonism comes out of a lot of people.
That's crazy.
Are you aware that the Texas Supreme Court, in fact, just recently ruled that Facebook can be held liable for sex trafficking on its platform?
Did you hear about that?
I did see that.
And it's fascinating to see how things go through the courts.
And we know about five years ago there were some court issues, which might still be ongoing, with Backpage.com, because they were basically aiding sex trafficking by allowing those ads.
Yeah, I think it's a step in the right direction.
I mean, I have an example, I have multiple examples of, you know, Facebook making exceptions and allowing like child nudity, for example.
So I think it's good because Facebook needs to be held accountable.
There's a lot more they can do to fight these kinds of things.
Okay.
All right.
Fair enough.
So what happened next after you were taking down drug cartel beheadings and things like that?
What happened next?
Yeah, so my first year there, I learned the ropes of how to do the job, and I started on the Spanish queue, so I was monitoring content in Latin America, but I also saw posts and received training on North America and later transferred to North America.
But that first year, I saw some exceptions where it kind of raised my eyebrows a little bit, and I saw there was a viral post about a Trump supporter being attacked, and Facebook told us we had to delete the video, even though there was really no violation.
And so after a year, like in May of 2019, I made a list of some of the examples I saw of bias, where they were censoring conservatives. And I wrote a letter to a couple of congressmen and senators and didn't really hear back, and I didn't really want to wait on it or sit on it for very long.
So that's when I reached out in May or June of 2019.
I reached out to a couple of journalists, and one of them referred me to Project Veritas, and that's how it all started.
Oh, wow.
Okay, yeah.
James O'Keefe.
Now, there's a true American hero as far as I'm concerned.
They're doing amazing work.
So when did your Project Veritas sort of whistleblowing come out?
I forgot when that was.
I remember seeing it, but when was that?
Yeah, it actually came out just about a year ago, actually June 25th of 2020.
So yeah, so the contract I had, I was working with Cognizant, the contract ended in February of 2020.
And so I was filming continuously.
I filmed total for about nine months straight.
And so the contract ended, so we all got laid off.
So wait, this was undercover filming?
Yeah, yeah.
So I was working and it's interesting because we had a really strict policy at work where you couldn't even bring cell phones onto the production floor and no paper.
So I was wearing a camera and filming with that camera for many months.
Were you scared during that time that you might be caught or threatened or anything like that?
Yeah, there is a certain level of paranoia when you're filming.
The first couple of times was stressful, but then for the last few months, I was filming almost on a daily basis.
It sounds strange, but you get to a certain point where you almost forget that you're wearing a camera.
But yeah, there was the fear of getting fired.
I mean, it was a good job for me, and my wife and I got health benefits.
I mean, Cognizant's a Fortune 200 company.
So it was a sacrifice to, you know, be filming and run that risk.
And technically, I was never fired, but I mean, I could have transferred to another project after that project ended and chose not to.
So it basically, you know, it was the end of my career with that company.
Yeah.
So what were the key assertions then that you revealed as a whistleblower and that Project Veritas covered when this all went public?
Yeah.
So just to give you, just so you understand like the overview of it, there was a lot of content that I filmed that was not in the Project Veritas release.
So when I went public with Project Veritas, we focused on a few things that were crucial.
So for example, Facebook, and it's interesting because we're in Pride Month right now, when I first started there, Facebook gave an exception to allow attacks against straight white males.
And so you could say, you're going to attack a straight white male and call them filth for not supporting LGBT rights.
So that's an exception that Facebook gave in opposition to their own policies and rules to allow hate speech.
And it's okay to attack, you know, hate speech is allowed if you're attacking a straight white male.
So that seems crazy right away.
So if someone posts a video attacking a transgender, that would be taken down, violence against transgenders.
But someone attacks like a Christian heterosexual male, that's allowed?
Or it was allowed?
Yeah, in the context of attacking them for not supporting LGBT, it's allowed.
So they carved out that exception, yeah.
So, I mean, it's a complete betrayal of their own rules and guidance.
And they always talk about how they're preventing real-world violence or hate speech, but they give exceptions for certain classes.
Didn't they also have a policy at one point where...
You couldn't post anything about Alex Jones unless you were insulting Alex Jones.
Wasn't that a policy?
Yeah, that's part of the...
I was there actually when they banned Alex Jones, and the post that they gave us, and I have a screenshot of the post...
When they banned Alex Jones, and all three companies banned him the same day, right?
What a surprise that they're coordinating that.
But they literally, word for word, told us it was an emergency update to ban Alex Jones.
Alex created an emergency in Facebook.
That's funny.
So what you're referring to, yeah, that rule about not speaking positively about him applies to him, and it applies to Tommy Robinson.
So the phrase we would use, the internal terminology we would use was an acronym, PSR, which stands for Praise, Support, Represent.
So you can't PSR, speak positively about, you know, Alex Jones or Tommy Robinson or anybody else on the hate figure list, like Gavin McInnes. So Gavin McInnes and Tommy Robinson specifically were on the hate figure list.
So they're on the same list, literally on the same list as Adolf Hitler.
Wow.
I mean, did you know at the time, did you know anything about Alex Jones?
Because, you know, I've been a fill-in host of his show from time to time.
And every time I've heard Alex speak, it's always about, hey, we welcome people of any color, any race, any national origin.
He's never, that I can think of, spewed, you know, bigotry rhetoric.
So did you know anything about that at the time?
Yeah, I didn't know too much.
I'd heard about him a little bit.
And just to be clear, Alex Jones was not...
He might be on another list at Facebook that I didn't have access to, but he wasn't on the hate figure list that I had access to.
So I think the reason they deleted him was because of conspiracy theories, or what they call conspiracy theories.
And while I was there, they created a new part of their bullying policy that doesn't allow anyone to claim that anybody's a crisis actor.
So we had the Parkland shooting, and so they created a new policy saying, you know, you can't claim that someone's a crisis actor.
You can't claim that a tragedy is a hoax.
Oh, okay.
Okay, yeah, that's interesting because what, well, just to add this to the interview, what we know is, yeah, there were children who died and people who were killed, and it was augmented by some crisis actors.
I mean, that's actually come out.
But I hear what you're saying.
At the time, that's, or maybe that's still a policy that you can't claim anybody's a crisis actor ever.
Yeah.
Yeah, and I understand.
There are real tragedies, and my heart goes out to the victims of these tragedies.
Obviously, I'm not saying that nobody died or anything like that, but I can't even theorize or suggest that maybe some of these people were crisis actors at all.
So that was a new policy that they implemented, and it protected people.
So it's just kind of interesting to see Facebook's perspective on what is fact, what is fiction.
Was there any pushback from people within the company?
Did you ever have conversations with other people who were questioning some of these decisions the way that you are?
Yeah, there was a good mixture of conservatives there.
I mean, there was a fair amount of conservatives.
There was also people on the left.
So, I talked to some of my coworkers about it, and they were kind of of the same mind in some regards to the policy and some of the exceptions that were given.
I mean, we had, for example...
We had an exception given for child abuse.
So there was a far-right senator from Australia named Fraser Anning, and he got attacked by a kid.
A kid walked up behind him when he was giving an interview, and he cracked an egg on the back of his head.
And then the senator turned around and slapped the kid in the face a couple times, right?
And that action of slapping a kid, a minor, multiple times qualifies, meets the definition of child abuse.
So we had guidance from Facebook, and it said, look, we know this video violates our child abuse policy because he's smacking him in the head multiple times, but we're making a newsworthy exception to allow that video of child abuse.
Wow.
So, I mean, just goes to show like, I think that's the overarching message here for a lot of my book is, you know, Facebook can pick and choose what rules to follow and they can make exceptions anytime they like about their own rules.
Right, because, you know, genital mutilation of children with transgenderism indoctrination, that clearly meets the definition of child abuse, but Facebook isn't taking any of that down that I'm aware of.
Don't they allow that kind of content?
Now, it's pretty nuanced, I know, but there are exceptions given.
Just to give another similar example: normally, female nipples are not allowed on Facebook and Instagram, that's nudity, but if you claim that you're non-binary and you post female nipples, then they're no longer female nipples and that's allowed.
Wait a second.
Wait.
Okay.
I'm trying to figure this one out.
So if you're a man and you post a picture of your penis but you say it's not a penis, you say you're a trans elephant and that's a trunk.
Then that would be allowed?
I mean, it specifically applies to nipples and non-binary.
So yeah, you could post a pornographic image of a porn star with her nipples, her breasts showing, and you could put in the caption, she is non-binary, and that would be allowed.
Wow.
So it would no longer be taken down.
Wow.
We also saw that any of the topless protests for LGBT parades, a lot of them have bare female nipples in the imagery, and those were allowed as well.
As far as the transgender mutilation of children, I wasn't aware of any exceptions given for that kind of content.
If there was any kind of nudity where there's visible genitalia of a child, we would immediately delete that.
But if it's partially nude or if the genitals are covered, then sometimes it is allowed.
Well, this is really fascinating because what you're describing is, as you say, it's this bizarre maze difficult to navigate where they're making exceptions based on what?
Their own personal politics or belief systems?
Where are these decisions coming from, if you have an idea about that?
Yeah, so what I do know is that it's a global policy.
So the policy decisions they make are for the entire world, for whatever market it is that they're in, every country in the world.
I did speak to one of my managers at one point, who said that she would get guidance from the policy team at Facebook.
I do know the decisions came from the top.
There's a team of about six people who make those decisions on a global basis.
One of the biggest exceptions I saw was when they protected Greta Thunberg and any attacks on her were not allowed.
People were calling her retarded, or "Gretarded," kind of a play on words.
And yeah, it's not cool to make fun of someone who's a minor.
But as far as meeting the hate speech criteria, it did not meet Facebook's rules.
So you can call another public minor like JoJo Siwa, who's actually now 18.
But JoJo Siwa is a very famous minor public figure.
And you can call her stupid.
You can call them names.
You just can't make any sexual jokes about them.
But what Facebook did is they made a newsworthy exception to their rules and they said, we're not going to allow any attacks on Greta Thunberg where they're calling her retarded.
So they literally had their AI scour Facebook and pick up any use of the phrase "retarded."
So they picked up that phrase and they had us dump those in our queue.
Normally, retarded would not be a violation, but Facebook made it an exception to the rule.
And then they dumped all those jobs into our queues and had us delete those attacks for an entire week.
So Greta here had some kind of special protected status that appears to be because of her political viewpoint, which is that climate change is going to kill us all.
But if there were a young person, another minor, who were, let's say, a Trump supporter or a Second Amendment supporter, as you're saying, those same attacks would be allowed against that minor.
That's correct.
So like Nick Sandman, for example, when he was a minor, I think he was 17, there was no exception given for him at any time.
And I wasn't aware of any other exception given for any other minor.
So it was fairly rare for them to give this blanket exception.
Yeah.
Yeah, and we know what Greta's stance is on politics and her leftist views on climate change.
So time and time again, I saw these exceptions given.
Don Lemon saying on air, white males are terror threats.
And Facebook saying, look, we know that breaks our rules, but we're making a newsworthy exception.
So time and time again, it was always exceptions and policy changes that favored the left and silenced the right.
Wow.
So, if there's so many exceptions, then there's really no rule.
There's no consistent, high-integrity application of principles at work here.
It's just whatever they wanted.
And it was to silence conservatives, as you say.
Is that an accurate assessment?
Yeah, that's accurate.
I have a list of at least 30 examples, and I go into these in the book.
And that's one of the reasons why I decided to write the book is because Project Veritas did a great job with the video expose, and they gave some examples.
But I had so much content, so much material, and more proof.
So that's why I put it into book format.
I started writing back in August.
It's a long book.
I wrote about 70,000 words.
Wow.
My co-writer says it's a lot of analysis.
I stick to the facts.
I used to be a security guard, so I write analytical-type reports.
It sticks to the bare-bone facts of these policy exceptions.
That's great.
The book has been out for just a little while.
It's available now in booksellers everywhere?
Yeah, it's available for pre-order.
It's on Amazon.
It's also on bookshop.org and, I think, IndieBound, and a few other websites.
Okay, so it's pre-order only right now?
Yeah, right now it's available for pre-order.
It'll launch on August 17th, so just in a couple months here.
Oh, okay, great.
Well, I'm going to pre-order it for sure and, of course, spread the word about it.
This would be great.
I think your book is really important for this discussion, this national discussion about Section 230, about viewpoint discrimination, about the First Amendment.
Now, a couple of developments that have happened, I think, since you probably worked on this book: it has come out that Facebook, in several elections, was really acting as a censorship proxy on behalf of government.
This happened in Massachusetts with Dr. Shiva, and it happened also, I believe, in California.
There's another lawsuit making that claim there.
So what have you learned since writing this?
What have you learned about Section 230 and any efforts to resolve this viewpoint discrimination assault on our First Amendment?
Yeah, so there's a couple different angles.
They're trying to approach it from a legal perspective.
And we saw that Judicial Watch released those documents, those FOIA requests in California, I believe Massachusetts possibly.
In California, the state government was communicating directly with Facebook and making requests as to what content they wanted censored or taken down.
That's right.
And these are people who are protesting.
There's a gentleman I know named Siaka Masako, and he's a former Hollywood actor, and he was protesting Gavin Newsom out there in the street protesting, and they were trying to censor his content.
Yeah, as far as Section 230 goes, I mean, and this is the law from 1996 designed to protect children on the Internet, which we know that Internet's not safe for children at all at this point.
But...
Yeah.
As far as Section 230, I know there's a guy named Jason Fick who filed a lawsuit against Facebook a couple years ago.
His case was almost heard by the Supreme Court this past January.
They declined to hear it.
But really, Section 230 needs to be reinterpreted by the Supreme Court.
It's never been interpreted by the Supreme Court.
And it's been misinterpreted by the Ninth Circuit Court, of course.
And it's given Facebook additional protection.
So there's a question of how you define "the publisher" versus "a publisher" in the law.
So that's one legal argument.
I think Dr. Shiva's case can be good as well, but it really comes down to the constitutionality of the law.
Because basically what the Congress did in 1996, they delegated authority to these companies to exercise what should be only something that Congress does or a government agency.
So Facebook is acting as a de facto government agency when they're censoring your posts.
And yes, they do have the right per the law to censor your posts and the First Amendment does not apply, but they have to do it in good faith.
And that's why the interpretation is wonky, because it's not being applied correctly.
But isn't it clear that if they're making content decisions based on viewpoints, then doesn't that transform them into a publisher, which means they should not enjoy Section 230 legal protections?
Yeah, I think that's a fair argument.
And that's another reason really why I think this book is important.
I hope that legal analysts and scholars can use my book as evidence in future lawsuits.
I know I helped with an FEC complaint in November for a U.S. Senate race.
So I was mentioning an FEC complaint against Democrat Gary Peters, because Facebook fact-checked the Republican candidate's ad about transgender sports.
And so I gave evidence showing that Facebook has a slanted viewpoint or stance on transgenderism and they were protecting, you know, people who are transgender more than other people.
So, yeah, I really think that...
Yeah, there's a lot there.
I think Facebook, obviously they're not neutral, and that's what my book shows.
They're clearly picking sides and picking winners and losers.
In fact, I'm curious, have you been asked to testify in congressional hearings or Senate hearings of any kind?
I have not yet.
Not yet.
Not yet.
There was the FEC complaint.
I know Congressman Matt Gaetz, shortly after I went public last June, Congressman Matt Gaetz filed a criminal referral to the DOJ for Mark Zuckerberg.
I haven't testified yet.
I look forward to it.
I can also recommend a few people, a few former employees at Facebook who I think should testify and share their knowledge of it.
It is difficult to comprehend for the average congressman or senator.
It's a very complex, nuanced policy that Facebook has.
Yeah, absolutely.
Are you doing other things now?
Aside from your book coming out, do you have a blog or a podcast or a website or anything else like that that you're working on?
Yeah, so I have my website.
It's ryanhartwig.org, which is my first and last name, dot org.
I'm also – I have a nonprofit.
We're waiting to hear back for our 501c3 status.
It's the Hartwig Foundation for Free Speech.
And so just advocating for free speech on the Internet and talking to other whistleblowers like Zach Voorhees, the Google whistleblower who's also got a book coming out, Google Leaks.
And just connecting with people, because I think a lot of, you know, the average person doesn't want to be censored.
And I think, you know, states need to take a more active role.
And I'm glad to see states like Texas and Florida passing legislation against big tech and their encroachments on our free speech and our privacy.
So those are a few things I'm working on.
I mean, the last year I've done at least 100 interviews.
I do interviews in Spanish.
I'm fluent in Spanish.
And I went to Brazil last September.
So I think it really is an international issue.
I mean, you look at we're suffering a lot in the United States because of censorship.
And but globally, people are being censored by Facebook on a global basis.
Well, that's the truth.
And South American nations don't have a First Amendment.
Like we do.
So, you know, as I understand it, in many nations, the regime in power just tells Facebook, hey, censor our political opponents.
And Facebook does.
And I think that they did the same thing in America as well with the 2020 election.
Yeah, and that's something that's really important because, and I've, you know, there was another whistleblower named Sophie Zhang that came out, and she was published in BuzzFeed News, and she basically said the same thing.
She was a data scientist, and she was talking about how Facebook didn't really care what was happening in these third world countries with these dictators using their bots and AI to manipulate the election.
But she did say that Facebook did prioritize the US elections and I saw that firsthand as well.
So we had election training decks.
2018 during the midterms was our big ramp up training for elections in general.
So any content that was talking about when to vote or how to vote, we flagged it with a certain tag.
So we'd mark it with the initials VI and then that would be escalated to the Facebook team for review.
And so we were enforcing election law.
Like if someone was saying, Republicans vote on Tuesday, Democrats on Wednesday, we would delete that comment.
So we'd be enforcing election law in a way.
And I saw training decks for Canada, for Taiwan, the UK, pretty much most countries in the world, I saw training decks for those countries.
Wow.
Well, now one of the biggest issues of our time is, of course, the COVID outbreak and vaccines and vaccine policy.
And I think this is the reason why Facebook deplatformed me many years ago, because I was one of the leading voices of vaccine skeptics saying, hey, you know, there might be might be dangers here.
Look at the ingredients.
But now Facebook is enforcing the oppression of therapeutics that many qualified doctors say can be very helpful and save many lives, such as ivermectin, hydroxychloroquine, or vitamin D, zinc, and so on.
So Facebook has gone from political censorship now to...
I will quote some other people I've interviewed saying that Facebook is complicit in murder because they're censoring information that could have saved lives if people only had access to that knowledge.
Do you think that's a fair assessment of what's going on or how would you describe it?
Yeah, I think it's a fair assessment because Facebook definitely has that potential.
When I was there, well, I left in February of 2020, and I wish I could have stayed on longer or found another company to contract with that was doing the same thing.
But I do know we had a policy called regulated goods.
So we would regulate the sale of marijuana on the platform.
So brick-and-mortar marijuana dispensaries could advertise, but obviously person-to-person transactions were not allowed.
And we regulated the sale of non-regulated goods, like, you know, meth or just street drugs.
We would delete those, obviously.
So I didn't see, when I was there, they hadn't yet created that new policy.
So it just goes to show they can, you know, they can adjust their policies.
But more importantly, you know, they can classify it however they want.
They can use their AI to detect these keywords.
The example I have of content I would delete is kind of those spam clickbait ads where it's like, you know, lose 40 pounds in two weeks.
Like, okay, that's unreasonable.
So Facebook took the stance of deleting that because it was considered spam.
So they could be doing a similar thing to any mention of hydroxychloroquine.
And we know just recently from Morgan Kahn, another whistleblower at Facebook, that they have something called vaccine hesitancy.
Right.
But yeah, based on my knowledge of the policy, it's very possible that they have, in my experience, being there for two years, they often added new sections of the policy, modified their policy, and it's a very overarching and expansive policy.
Because one of the things that happened recently was a U.S. senator was censored, I believe, on Facebook for talking about ivermectin and quoting scientific studies.
And of course, Senator Rand Paul speaks frequently about these kinds of issues.
And it seems like Facebook has no hesitation to censor sitting U.S. senators.
Do you find that shocking?
I mean, because I do.
I mean, it's like they won't even let a senator talk about this.
Yeah, it's really despicable to see what Facebook is doing.
This is information that should be in the public forum.
People can debate it.
People are intelligent.
They can make decisions on their own, informed decisions.
You know, I'm not surprised because when I was there, you know, Trump gave his State of the Union speech and Facebook literally told us to look for hate speech coming from the State of the Union.
I mean, the bias cannot be more evident.
And so for them to censor a senator, I mean, it's really, it's, yeah, when you're censoring Trump and you're used to censoring the president for four years, I mean, I think censoring a sitting U.S. senator is no big deal for Facebook.
Well, it makes me wonder if any of these senators are beginning to wake up and notice what's going on, because from what I've seen in terms of the judicial and legislative push against Facebook mostly, so far it's only focused on monopoly marketplace types of laws.
I still haven't seen any real effort to reform Section 230, even when senators are being censored.
Yeah.
I mean, the antitrust avenue, there was recently the case, the antitrust lawsuits, I think, were filed initially back in October, I believe, or November.
But they just had a major setback because a judge ruled in Washington that there wasn't enough evidence that they're a monopoly.
So you can go the antitrust route or you can just say, hey, these companies are not following Section 230.
Can we rewrite Section 230?
I think something needs to be done.
But at the same time, it's dangerous because if you rewrite the law right now, I mean, Facebook has tons of lobbyists there and tons of influence.
So I think if they were to redo the law right now, it would favor Facebook in a lot of ways.
But yeah, it's unacceptable, and I think, hopefully, there are Democrat senators that are waking up to it.
But essentially, what you just said is that it's already too late.
Facebook is more powerful than government.
Yeah, not to sound too pessimistic, but to be honest, Facebook has more power than the United States government, and they have a global reach, and they have metadata on billions of people throughout the world.
So yeah, I think that's a safe assumption.
Well, and Edward Snowden revealed that Facebook is really a data collection portal for the NSA. Not just Facebook, but pretty much Google and Twitter and YouTube and all kinds of other services.
So really, it's almost like Facebook would have protection by a lot of the most powerful sectors of the intelligence community because it's such a useful intelligence gathering tool.
Have you written about that or talked about that?
That's something I need to explore a little bit further.
I know that we had relationships with law enforcement, so if there was an act of suicide going on on a live video, we could escalate that to law enforcement.
So we did have a button.
I had a button where I could click and escalate it all the way up the chain.
It would be sent to someone at Facebook who would then escalate it to law enforcement.
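The escalation chain Ryan describes here, moderator to Facebook to law enforcement, could be sketched as follows; this is a minimal illustration of the process as described in the interview, and the stage names and function are hypothetical, not Facebook's actual tooling:

```python
# A minimal sketch of the escalation chain described above (moderator ->
# Facebook -> law enforcement). Stage names are hypothetical illustrations
# of the process, not Facebook's actual internal tooling.

ESCALATION_CHAIN = ["content_moderator", "facebook_escalations_team", "law_enforcement"]

def escalate(report: dict, from_stage: str) -> str:
    """Pass a report one step up the chain; return the next handler."""
    idx = ESCALATION_CHAIN.index(from_stage)
    if idx + 1 >= len(ESCALATION_CHAIN):
        raise ValueError("already at the top of the chain")
    return ESCALATION_CHAIN[idx + 1]

# A live-video suicide report moves from the moderator to Facebook,
# then on to law enforcement.
stage = "content_moderator"
stage = escalate({"type": "suicide_live_video"}, stage)
stage = escalate({"type": "suicide_live_video"}, stage)
print(stage)  # law_enforcement
```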
There has to be communication between these government agencies and Facebook.
It would make sense.
If you're a data collection agency and you have a social media platform with billions of users and have all their data, it would be a logical step to use that data.
Yeah, so I think that's pretty clear that they have this buddy-buddy relationship with, at least we know there's a buddy-buddy relationship with the Democrats, and Facebook is a wealth of information for user data.
I'm sure the NSA would love access to all that data.
Yeah, no question about that.
So, overall, are you more optimistic or pessimistic about a future where freedom of speech might be restored, viewpoint discrimination might be eliminated?
I mean, do you think there's any hope of this, or is Facebook forever going to be this slanted, anti-free speech, anti-conservative platform?
I think there's a glimmer of hope.
I think we've got our work cut out for us.
I mean, when the Supreme Court decides not to take these very important cases, that's huge in my mind.
That's not a good sign.
But we do have some Supreme Court justices that would decide favorably, I think, and make the right decision if they decided to hear the case.
So I'm cautiously optimistic.
I think the states need to, you know, there's always that continual struggle to have more power given to the states.
So I think the states need to step up at this point because the Supreme Court is clearly showing that they don't have the gumption or the balls, so to speak, to take these cases and make the important decisions.
Yeah, definitely.
Okay.
Pop quiz item for you here.
Is Mark Zuckerberg human?
What is he?
That's a good...
So Mark Zuckerberg...
I mean, I've seen some of his interviews and he's clearly a smart guy.
I don't know if you...
But like AI smart or human smart?
What are you saying?
Right?
Because if he's the one designing the AI, then he has to be as smart or smarter than the AI. Yeah.
Well, maybe he just copied like an AI subroutine out of his cyborg brain and put it into the system.
I mean, okay, we're joking, but he does come across to a lot of people as very odd, for sure.
Yeah, and as you say that, I'm going to take a sip of water like he did in his testimony.
Okay, yeah.
Well, he's just trying to prove that he can swallow liquids, I think, is what he was doing.
There's a functioning digestive tract.
But getting back on a serious note, how many people does Facebook employ as contractors through other companies globally to do this kind of moderation?
It's got to be tens of thousands.
Yeah, it's tens of thousands.
And what's fascinating is before 2016, there were hardly any U.S.-based content moderators.
So they really ramped up after 2016.
And so the contract with Cognizant, my company, started in 2017.
And I was talking to one of my bosses at work, and he said they brought all these jobs over to the U.S. to prevent Russian interference in the election.
Really?
So, in other words, they wanted to run the interference themselves instead of the Russians.
Yeah.
Because the overseas people they were hiring before didn't have enough market knowledge about local politics to understand all the nuances and to monitor for election trends.
Right.
But yeah, as far as estimates, at my workplace in Phoenix, we had between 1,000 and 1,500 content moderators.
Right.
Now, I knew of at least a couple of companies, a couple more companies.
Accenture was one, a contractor in the United States.
And so there was probably at least 5,000 to 10,000 in the U.S.
Wow.
And I know there's one in Germany.
And there's been different articles about the content moderation business, like the mental health effects for those employees throughout the world.
I think in his testimony, Zuckerberg said that he was spending about $10 billion on content moderation in the U.S. Did your work colleagues understand that ultimately what Zuckerberg wants to do is replace as many humans as possible with algorithms and machine learning and so on?
Was that understood among all of you who were working there?
Yeah, we were actually training the AI. So part of our job to train the AI was something called continuous enforcement.
So we talked earlier about how female nipples are allowed sometimes for LGBT marches.
But for every piece of content we saw, if we saw cleavage, it wasn't violating, so we wouldn't delete it for cleavage, obviously.
But we would mark it for CE. So we would mark CE cleavage.
Or if someone was in a swimsuit, CE swimsuit. So anything that was somewhat sexual.
Now, they were telling us we were training the AI so that someone could choose what their filters are in the future.
So if you're on Facebook and you don't want to see, you know, an underwear advertisement, then you can make that choice.
So we were training the AI to be able to filter out those kinds of sexual images, I guess.
Wow.
Just with tagging.
Yeah, so tag, yeah, so you're looking at the screen, you got an image on your screen, let's say you got like Melania Trump modeling, you know, and so you've got her and then you say, okay, well, it's not violating, there's no nudity, but I'm going to mark it, I'm going to tag it a certain way because she's wearing, you know, lingerie.
So it would, we were training, they told us we were training the AI so that they could filter out certain imagery.
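The continuous-enforcement tagging workflow Ryan describes could be sketched in code like this; all field names, tag strings, and the schema here are hypothetical illustrations of the idea, not Facebook's actual system:

```python
# A minimal sketch of the "continuous enforcement" (CE) tagging workflow
# described in the interview: a moderator's decision on a non-violating post
# still carries tags that could train a filtering model. All names and the
# schema are hypothetical illustrations, not Facebook's actual tooling.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ModerationDecision:
    post_id: str
    violating: bool                                    # whether the post was deleted
    ce_tags: List[str] = field(default_factory=list)   # non-violating attributes, e.g. "cleavage"

def to_training_example(decision: ModerationDecision) -> dict:
    """Turn a moderator's decision into a labeled example for a filter model."""
    return {
        "post_id": decision.post_id,
        "label": "violating" if decision.violating else "benign",
        "filter_attributes": decision.ce_tags,  # could later drive opt-in user filters
    }

# A post with cleavage: left up (not violating), but tagged for the AI.
decision = ModerationDecision(post_id="p123", violating=False, ce_tags=["cleavage"])
example = to_training_example(decision)
print(example["label"], example["filter_attributes"])  # benign ['cleavage']
```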
So ultimately, the people that work there are training their AI replacements.
Yeah.
And, you know, there were some decisions that, you know, if there was like a gray area with the policy, we would escalate it to a supervisor and then get guidance from Facebook.
But yeah, the AI could track a lot of those keywords. And, for example, we had an edge case that I actually raised, and because of me, Facebook made it the official policy.
So when Eric Ciaramella, the alleged Ukraine whistleblower, came out, you couldn't say his name at all, right?
Yeah.
So I got the post initially, I raised it up locally, and they said, okay, we're going to delete Eric Ciaramella because people are revealing his status as, like, undercover law enforcement.
I'm like, what?
Eric Ciaramella is undercover law enforcement?
Supposedly he worked for like the NSA or something, but he wasn't law enforcement.
He wasn't like FBI or anything.
And then, so that policy, that guidance was active for like six hours.
And then Facebook came back and said, oh, no, no.
Keep deleting it, but delete it for coordinating harm, other.
Some generic piece of the policy.
And so it didn't really match the policy.
There's no policy rationale for deleting him.
But even photos of him, even Alex Soros, like there's a famous photo of Alex Soros taking a photo with Pelosi or something.
And people were confusing Alex Soros with Eric Ciaramella.
And I told my manager, hey, this isn't Ciaramella.
Should I still delete it?
He said, yeah, still delete it.
Wow.
Just someone that the public's confusing with someone who's not allowed to appear.
Yeah.
So that's just another example of, you know, who was Eric Ciaramella?
Why was he important?
Well, he was the biggest Republican talking point, and he played a crucial role in the impeachment, and now Republicans can't talk about him on social media, right?
Wild.
Wild.
So your book then, as I see it, it's kind of like Adventures Inside the Ministry of Truth.
Yeah.
It pretty much is.
I mean, you have...
Yeah, we were...
You know, you kind of felt like gods in a way.
I mean, we would talk about the Facebook ban hammer.
We would joke about it amongst our coworkers.
And, you know, you have...
We would see the memes as well.
People joke about how the moderator laughs at the meme before he deletes it.
Right.
So they find it funny, but they still delete it anyway because that's the job.
Yeah.
Yeah.
But yeah, I mean, you look at the impact.
So you're like, okay, he's deleting a few memes here and there.
No.
Like, look at the numbers.
So 1,000 content moderators, we were reviewing about 200 posts a day.
So if I'm deleting 200 posts a day times 1,000, right?
Yeah.
200,000 or whatever a day just from that site.
And then extend that to weekly or monthly, I mean, that's millions of posts a week that are getting deleted or censored.
So this has huge implications for free speech.
So yeah, Ministry of Truth, I think that's a good comparison.
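The back-of-the-envelope volume math above can be checked directly; the inputs, roughly 1,000 moderators at the Phoenix site each reviewing about 200 posts a day, are the figures from the interview, and everything else is simple arithmetic:

```python
# Back-of-the-envelope check of the moderation volume discussed above.
# The inputs (1,000 moderators at one site, ~200 posts reviewed per day each)
# are the interview's figures; the rest is arithmetic.

moderators = 1_000
posts_per_moderator_per_day = 200

daily = moderators * posts_per_moderator_per_day  # posts reviewed per day at one site
weekly = daily * 7                                # posts reviewed per week

print(daily)   # 200000 posts/day
print(weekly)  # 1400000 posts/week -- i.e. "millions of posts a week"
```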
I don't understand how Facebook can even fund all this moderation effort.
It doesn't seem like the business model would work unless they're getting paid by the NSA or somebody in exchange for all the spy data.
I mean, if I launched something like Facebook and had to hire tens of thousands of moderators and just ran ads on the sites, I don't think the numbers would work.
Yeah, I wonder. We know Zuckerberg's flush with cash, which he gets from selling everybody's data, right?
But, yeah, as far as that goes, I know we were getting underpaid.
I was literally making $15 an hour as a content moderator.
And there were a lot of complaints about that because there is the mental toll that it takes.
And some of my co-workers developed PTSD symptoms and had to see counselors or get on medication.
So we were definitely underpaid.
I don't know where their money comes from, how they can afford it.
It is very costly.
I mean, just in general, hiring U.S. workers is costly.
That's why before 2016, they were hiring people overseas like in India because it's much, much cheaper.
Right, right.
It's an extraordinary amount of money to put into this process when a user out there could just post every 10 seconds and create this moderation cost.
What's the process by which they ended up in your moderation queue?
I mean, how does Facebook know whether to moderate posts?
Yeah, so at Cognizant, we would get a certain number of jobs dumped in our queue, and then based on that was the invoicing of how they would pay us as a company.
And we had service level agreements, so we had to keep our quality scores at a certain level.
So, yeah, they would dump things in our queue.
I mean, we had different workflows.
And so as we grew with the client, we did pretty well.
We were making Facebook happy.
And so they gave us different lines of business.
So there was a separate hate speech queue where people were just reviewing hate speech.
And then there was one for, you know, profile review, Instagram profile review.
We started with just videos, and they expanded to give us, you know, groups to review, pages, posts, comments.
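The outsourced-moderation arrangement Ryan describes, jobs dumped into per-workflow queues, payment invoiced by volume, quality held to a service-level agreement, could be sketched like this; the class, method names, and the 0.95 quality threshold are all hypothetical illustrations, not actual contract terms:

```python
# A minimal sketch of the vendor workflow described above: the client dumps
# jobs into per-workflow queues, the vendor is paid by review volume, and a
# quality-score SLA must be maintained. All names and the 0.95 threshold are
# hypothetical illustrations, not actual contract terms.

from collections import defaultdict, deque

class VendorQueues:
    def __init__(self, sla_quality: float = 0.95):
        self.queues = defaultdict(deque)  # workflow name -> pending jobs
        self.sla_quality = sla_quality
        self.reviewed = 0                 # total reviews, drives invoicing
        self.correct = 0                  # correct decisions, drives quality score

    def dump_jobs(self, workflow: str, jobs) -> None:
        self.queues[workflow].extend(jobs)

    def review(self, workflow: str, decision_correct: bool = True):
        job = self.queues[workflow].popleft()
        self.reviewed += 1
        self.correct += decision_correct
        return job

    def quality_ok(self) -> bool:
        return self.reviewed == 0 or self.correct / self.reviewed >= self.sla_quality

q = VendorQueues()
q.dump_jobs("hate_speech", ["post-1", "post-2"])
q.dump_jobs("instagram_profiles", ["profile-9"])
q.review("hate_speech")
print(q.reviewed, q.quality_ok())  # 1 True
```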
Yeah.
What about the Dr. Seuss queue?
All the Dr. Seuss censorship.
Surely that's got to have its own department.
I'm sure Dr. Seuss has his own queue, his own department.
You know, because Cat in the Hat is so evil.
When you went public with this, were you threatened with legal action or anything that you can talk about?
I don't know if you've ever talked about this publicly.
Anything like that happen?
So surprisingly, I didn't receive any kind of legal action or even letters or hints of legal action from Facebook or Cognizant.
So when I went public, Project Veritas, they have my back.
And I think if anything were to come out, they would help me out.
But I think the PR backlash from that would be greater than the reward.
Oh, yeah.
Right.
But nobody threw a brick through your front window with a note attached to it.
Like, stop talking or else.
Yeah.
There's always those fears, I mean, of something happening.
And I think, you know, that's part of the reason why I went public is you're usually safer if you are public.
Yeah.
And now you have your book, which I think makes you even more public on this.
Yeah.
And...
Down the road, let's say some lawsuits use my book to sue Facebook.
Facebook stands to lose a considerable amount of money, possibly.
It does, in a way, put a target on my back.
Luckily, I haven't had any direct threats, but it's always in the back of your mind.
Yeah.
Well, I'm glad that you're safe, and I really appreciate your courage, and I appreciate you joining me to come out and talk about this.
You've done a really important thing, and I think it's a game changer.
And I think when your book comes out, it will be cited by many members of Congress, although they're mostly getting paid by Facebook.
But maybe someone will say, you know what, enough is enough.
We've got to restore freedom of speech in America.
So any final thoughts before we wrap this up?
No, I just felt like it was my civic duty as a patriot to come forward, and I took time off of life to work on the book, and I really think you'll enjoy it.
It's a fascinating view behind the scenes of what's going on at Facebook.
Yeah, that will be fascinating.
I can't wait to get it.
And let's keep in touch.
If you've got some other breaking news coming out, love to have you back on again.
Or maybe we'll have some news about Section 230 one of these days.
I'd like to get a comment from you on that.
Would that be okay?
Yeah, that'd be great.
Okay.
All right, Ryan.
Well, thank you so much for joining us today.
And folks, check out Ryan's book, Behind the Mask of Facebook, and also his website, ryanhartwig.org.
And of course, I'm Mike Adams, the founder of brighteon.com, where you can find videos and interviews like this that are banned on Facebook.
Thank you for watching.
And that's why I've recorded a new nine-hour audiobook.
It's called The Global Reset Survival Guide.
You can download it for free by subscribing to the naturalnews.com email newsletter, which is also free.
I'll describe how the monetary system fails.
I also cover emergency medicine and first aid and what to buy to help you avoid infections.
So download this guide.
It's free.