All Episodes Plain Text
Sept. 24, 2020 - Behind the Bastards
01:21:40
Part Two: Mark Zuckerberg Should Be On Trial For Crimes Against Humanity

Robert Evans and Jamie Loftus argue Mark Zuckerberg warrants a trial for crimes against humanity, comparing Facebook's growth-at-all-costs model to the Mayan collapse. They detail how Joel Kaplan, a former Bush advisor, blocked polarization-reducing measures like "Eat Your Veggies" to protect right-wing allies such as Breitbart and Ian Miles Cheong. Despite internal warnings that algorithms fueled 64% of extremist joins and QAnon's rise, Kaplan's paternalistic stance allowed Russian bots and disinformation to dominate. This strategy sparked employee walkouts after Zuckerberg refused to fact-check Trump, ultimately prioritizing conservative content over safety while banning militias yet retaining anarchists. Ultimately, the hosts conclude that Facebook's engineered chaos represents a systemic failure of corporate ethics with dangerous societal consequences. [Automatically generated summary]

Transcriber: nvidia/parakeet-tdt-0.6b-v2, sat-12l-sm, and large-v3-turbo

Financial Literacy Month Kickoff 00:04:38
This is an iHeart podcast.
Guaranteed human.
It's financial literacy month, and the podcast Eating While Broke is bringing real conversations about money, growth, and building your future.
This month, hear from top streamer Zoe Spencer and venture capitalist Lakeisha Landrum Pierre as they share their journeys from starting out to leveling up.
There's an economic component to communities thriving.
If there's not enough money and entrepreneurship happening in communities, they've failed.
Listen to Eating While Broke from the Black Effect Podcast Network on the iHeart Radio app, Apple Podcasts, or wherever you get your podcast.
Saturday, May 2nd, country's biggest stars will be in Austin, Texas.
At our 2026 iHeart Country Festival presented by Capital One.
See Kane Brown.
Parker McCollum.
The man who you need.
Riley Green.
This girl, Shaboozey.
Dylan Scott.
Russell Dickerson.
Benam.
Gretchen Wilson.
Chase Matthew.
Lauren Alaina.
Tickets are on sale now.
Get yours before they sell out at Ticketmaster.com.
Hey there, folks.
Amy Robach and TJ Holmes here.
And we know there is a lot of news coming at you these days from the war with Iran to the ongoing Epstein fallout, government shutdowns, high-profile trials, and what the hell is that Blake Lively thing about anyway?
We are on it every day, all day.
Follow us, Amy and TJ, for news updates throughout the day.
Listen to Amy and TJ on the iHeart Radio app, Apple Podcasts, or wherever you listen to podcasts.
On a recent episode of the podcast, Money and Wealth with John Hope Bryant, I sit down with Tiffany "The Budgetnista" Aliche to talk about what it really takes to take control of your money.
What would that look like in our families if everyone was able to pass on wealth to the people when they're no longer here?
We break down budgeting, financial discipline, and how to build real wealth, starting with the mindset shifts too many of us were never, ever taught.
If you've ever felt you didn't get the memo on money, this conversation is for you to hear more.
Listen to Money and Wealth with John Hope Bryant from the Black Effect Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcast.
Wrecked him.
Damn near killed him.
That was the punchline to a joke without the rest of the joke, because I think we can all put things together now as adults.
I'm Robert Evans.
Yeah.
Hi, Jamie.
This is Behind the Bastards.
We talk about bad people on this podcast.
And that was a little, that was a little bit of levity at the start of it before we get into depressing shit again.
Some abstract levity.
Some abstract levity.
Yeah.
Pieces of levity that one can assemble into comedy.
Yeah.
That's a very nice.
Yeah.
I mean, comedy, you know, it is just a series of things that you put in the correct order.
It's like a deconstruction of comedy, like when people take apart a sandwich and then serve it on a plate in a fancy restaurant.
I got to be honest.
Anytime someone says it's a deconstruction of comedy, it's the least funniest shit you'll ever hear in your entire life.
They're like, he's deconstructing the medium, and it's usually just like some, some guy.
It's not just some guy.
Yeah, it's never any good.
But you know what is good, Jamie?
What?
Facebook's peanut butter back.
I was like, this can't be a transition to Mark Zuckerberg.
No, because nothing about him.
Robert just talked about this delicious lunch he had as I eat a peanut butter.
I had a great lunch.
And I'm eating a packet of peanut butter.
And you know what?
I'm content.
Sophie is eating peanut butter.
I ate a delicious lunch.
Oh, I'm going to go.
There were fried eggs involved.
I love that for you.
I had a couple of chips.
And most important, most important, we're all going to get back to my favorite thing to do with my good friend Jamie Loftus, which is talk about the extensive crimes of Mark Zuckerberg.
Oh, yes, I changed shirts between episodes, Robert.
So now I have a little Marky with me.
Yeah, you do.
You've got your Marky shirt.
Yeah, my favorite Mark quote that you can be unethical and still be legal.
That's the way I live my life.
Ha ha.
It is amazing.
And he really, I mean, there's been a lot said about him, but the man sticks to his guns.
He lives by this credo to this very day.
Yeah, yeah, yeah.
You know who else sticks to their guns, Jamie?
The Religion of Growth 00:14:59
Whom?
The death squads of the various dictatorial political candidates who use Facebook to crush opposition and incite race riots.
That was a transition.
That was a transition.
So, Jamie.
All right.
I'm going to start.
It's time to start the episode.
And I'm going to start with a little bit of a little bit of an essay here.
So once upon a decade or so ago, I had the fortune to visit the ruins of a vast Mayan city in Guatemala, Tikal.
And the scale of the architecture there was astonishing.
If you ever get the chance to visit one of these cities, you know, in Guatemala or Mexico or whatever, really worth the experience.
Just the, again, the size of everything you see, the precision of the stonework.
It's just amazing.
And one of the things that was most kind of stirring about it was the fact that everything that surrounded it was just hundreds and hundreds of miles of dense, howling jungle.
So I spent like an afternoon there and I got to sit on top of one of these giant temple pyramids, drinking a one-liter bottle of Gallo beer and staring out over the jungle canopy and just kind of marveling at the weight of human ingenuity and dedication necessary to build a place like this.
Sounds metaphorical.
And while I was there, Jamie, I thought about what had killed this great city and the empire that built it.
Because a couple of years earlier, really not all that long before I visited, theories had started to circulate within the academic community that the Mayans had, in the words of a NASA article on the subject in 2009, killed themselves through a combination of mass deforestation and human-induced climate change.
A year after my visit, the first study on the matter was published in Proceedings of the National Academy of Sciences.
I'm going to quote here from the Smithsonian magazine.
Researchers from Arizona State University analyzed archaeological data from across the Yucatan to reach a better understanding of the environmental conditions when the area was abandoned.
Around this time, they found severe reductions in rainfall were coupled with a rapid rate of deforestation as the Mayans burned and chopped down more and more forests to clear land for agriculture.
Interestingly, they also required massive amounts of wood to fuel the fires that cooked the lime plaster for their elaborate constructions.
Experts estimate it would have taken 20 trees to produce a single square meter of cityscape.
So in other words, the Mayans grew themselves to death, turning the forests that fed them into deserts, all in the pursuit of expansion.
It's a story that brings to mind a quote from the great historian Tacitus writing about Augustus Caesar and men like him.
Solitudinem faciunt, pacem appellant.
They make a desert and call it peace.
That's what he's saying about Augustus Caesar and the emperors like him.
They make a desert and call it peace.
That sounds like one of those sundial phrases.
I think a more accurate summation of the 200 years of peace that Augustus Caesar created than what Mark put out.
They make a desert and call it peace.
Now, I read that quote for the first time as a Latin student in high school, and I saw it referenced in relation to Mark Zuckerberg in a Guardian article covering that New Yorker piece we quoted from last episode.
And the title of that New Yorker article was, Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?
So, democracy, a free and open society where numerous viewpoints are tolerated, cultural experimentation is possible, and evolution is encouraged.
These are the things that have made Facebook's success possible.
It could not have come about without them and outside of a culture that embodies those values.
And now that Facebook's member count is closing in on 3 billion, the social network is doing what all empires do.
It's turning the fertile soil that birthed it into a desert.
And as it was with the Mayans, all of this is being done in the name of growth.
Katherine Losse was an early Facebook employee and, for a time, Mark Zuckerberg's speechwriter.
In her memoir, The Boy Kings, which is what you call it.
I'm triggered.
I don't know.
That's a good title.
She lays out what she saw as the engineering ideology of Facebook.
Quote, scaling and growth are everything.
Individuals and their experiences are secondary to what is necessary to maximize the system.
Mark Zuckerberg and thus Facebook have held a very consistent line since day one of the company operating as an actual business.
And that line is that Facebook's goal is to connect people.
But this was and always has been a lie.
The goal is growth, growth at any cost.
In 2007, Facebook's growth leveled off at around 50 million users.
At the time, this was not unusual for social networks, and it seemed to be something of a built-in natural growth limit.
Like maybe 50 million is about as much as a social network can get unless you really start juking the results.
And 50 million users, that's a very successful business.
You can be a very rich person operating a business.
You could call it a day.
That's a great thing to accomplish.
MySpace Tom was thrilled with that.
Yeah, God bless Tom.
MySpace Tom is now.
Tom, who hasn't done a goddamn problematic thing.
Traveling the world now, taking photographs of the world that Mark Zuckerberg is destroying.
He might be the only person worth hundreds of millions of dollars that I'm fine with not taking the money back.
Tom, you're fine.
Like, go keep doing your thing.
Use the money to be boring.
It seems like he is just...
Just use his fortune to be boring.
Do you not remember?
It took like five months for me to get my MySpace deleted.
I don't remember anything about MySpace.
That's my favorite thing about MySpace is I've forgotten everything about it but the name MySpace.
Oh, for sure.
MySpace was not good, but did I learn about a lot of goth music on it?
Yes.
Now I have some middle school angst from not being in somebody's top eight, but oh my God.
PC4PC, I did a lot of PC4PC.
As did I, my friend.
You are so pretty.
PC4PC.
And you know what no one used MySpace for?
Genocide.
Organizing militias to show up at the site of protests and shoot Black Lives Matter activists.
That is very true.
Was not done with MySpace.
And I suspect Tom would have had an issue if it had been.
I think so.
Well, I don't know about Tom's politics.
I don't know the man.
But the fact that he's kept his fucking mouth shut since getting rich and going off to do whatever he does makes me suspect that he's a reasonable man.
So Facebook hits this growth limit and it kind of levels off a bit.
And again, it's a very successful business in 2007, but it's not an empire.
And that's what Mark wanted.
That's the only thing Mark has ever wanted in his entire life.
And so he ordered the creation of what he called a growth team dedicated to putting Facebook back on the path to expansion, no matter what it took.
So the growth team quickly came to include the very best minds in the company who started applying their intellect and ingenuity to this task.
One solution they found was to expand Facebook to users who spoke other languages.
And this is when what we talked about last episode began: the company's heedless growth into foreign markets.
Obviously, at no point did anyone care or even consider what impact Facebook might have on those places.
Neat.
Yeah.
I'm going to quote from the New Yorker again.
Alex Schultz, a founding member of the growth team, said that he and his colleagues were fanatical in their pursuit of expansion.
You will fight for that inch, Alex said.
You will die for that inch.
Facebook left no opportunity untapped.
In 2011, the company asked the Federal Election Commission for an exemption to rules requiring the source of funding for political ads to be disclosed.
In filings, a Facebook lawyer argued that the agency should not stand in the way of innovation.
Oh, okay.
See, it doesn't seem like an innovation to me.
Innovation.
You know, I get this to an extent.
So the other day, I was drunk driving my forerunner and I was shooting at some targets I'd set up in the trees.
And the people in the neighborhood I was doing this in said, oh, for the love of God, please, you're endangering all of our lives.
And I said, you're standing in the way of innovation because I was innovating what you can do drunk in a forerunner with a Kalashnikov.
And they got in the way of that.
Yeah.
I understand, Mark.
I like to really heat up a pan and put it on someone's face just to innovate the art of what you can do.
Yeah, you innovate their skin by burning.
You innovated your face.
Yeah.
And really people standing in the way of that innovation.
How am I supposed to make progress in hurting people's faces?
Yeah, I'm a fan of how Pol Pot innovated the capital city of Cambodia by forcing everyone out of it and then killing hundreds of thousands of them.
It's just innovative.
The way all of this is just horrific and like, I don't know.
Like we're just talking about something else.
But it's like the language of Silicon Valley applied to these genocidal situations is just so... it makes my fucking... I just peeled all my skin off.
I mean, you have to agree that Hitler was an innovator.
He innovated so many things.
He really did change the game.
Yeah, he absolutely changed the narrative from there not being a war in Europe to there being a war in Europe.
That's called disrupting, honey.
He did disrupt it.
He disrupted the shit out of the Polish government.
Oh my gosh.
Oh, fuck.
Why are you doing this, Jimmy Rock?
So Sandy Parakilas, who joined Facebook in 2011 as an operations manager, paraphrased the message of the orientation session he received as, we believe in the religion of growth.
The religion of growth is what he was told when he joined the company.
That's what it was called to him.
Not only horrifying, but, like, what could make you sound like more of a sniveling loser than saying the phrase "religion of growth"?
Okay.
Yeah.
Just shut up.
He said, quote, the growth team was the coolest.
Other teams would even try to call subgroups within their teams the growth X or the growth Y to try and get people excited.
And in the end, Facebook's finest minds decided that the best way they could...
I know.
Let the light.
Yeah.
I'm excited.
I'm horny.
I'm ready to go.
I mean, with that kind of narrative, I love to hear it.
I love to, you do love to hear it.
So in the end, Facebook's finest minds decided that the best way they could further the great god of growth was for Facebook to become a platform for outside developers.
That this was the way to really, really get things going again.
And you all remember the start of this period when Facebook made this change.
This is when what had once been a pretty straightforward service for keeping up with your friends from college was suddenly flooded with games like FarmVille and a bunch of, like, personality tests and shit.
The moms fucking drone-struck Facebook, coming down with FarmVille, sending you 5 trillion invitations.
Yeah.
Leaving your like high school choir concert to go harvest strawberries.
Yeah, I know.
And making a bunch of fucking money for Facebook.
Whatever appearance these apps took, their main purpose was the same, which was to hoover up all of your personal data and sell it for profit.
Yeah.
Some mom would have given her social security number to FarmVille, and that's just a fact.
Yeah.
And the only person you should give your social security number to is me.
I do encourage all of our listeners to find my email and just email me your social.
Yeah, just to just to your tips line.
Yeah.
Yeah.
It's like, it's like that thing from that documentary about Keith Raniere, who we also did episodes on, The Vow: it's your collateral.
Send me your social security number.
This is your collateral.
So I'll know that you really care.
Yeah.
So Facebook's employees kind of realized very quickly after this change was made and these developers start flooding the service with all of their shit that the company's new partners were engaged in some really shady behavior.
One worker who was put in charge of a team ordered to make sure developers weren't abusing user data immediately found out that they were.
And I'm going to quote again from the New Yorker here.
Some games were siphoning off users' messages and photographs.
In one case, he said, a developer was harvesting user information, including that of children, to create unauthorized profiles on its own website.
Facebook had given away data before it had a system to check for abuse.
Parakilas suggested that there be an audit to uncover the scale of the problem.
But according to Parakilas, an executive rejected that idea, telling him, do you really want to see what you'll find?
No.
Which, look, I can identify with that too.
I recently had an issue where I left a bag of potatoes in the top counter of my kitchen for, I don't know, somewhere between four and seven months.
And when I took them, I didn't want to, I knew something was wrong.
I knew something was wrong up there because of the flies and the strange smell, but I didn't want to look into it because I didn't want to see the extent of the problem.
And when I finally did, I regretted learning what an issue I had made for myself and my home.
Robert, you are really, since we last spoke, you've become very prone to a metaphor.
I am a living metaphor, Jamie.
You are living out.
I'm innovative.
I innovated those potatoes.
You disrupted those potatoes with a family of maggots.
Okay, we don't need to.
We don't need to talk about what a problem my life has become.
Parakilas told me, "me" being the New Yorker reporter, quote, it was very difficult to get the kind of resources that you needed to do a good job of ensuring real compliance.
Meanwhile, you looked at the growth team and they had engineers coming out of their ears.
All the smartest minds are focused on doing whatever they can do to get those growth numbers up.
Now, Jamie, Jamie Loftus, I happened to read this quote while I was struggling to work in the midst of unprecedented wildfires that devastated a huge amount of the state of Oregon and made our air quality the worst in the world for a while.
On the very day I read that article, four of my friends and journalistic colleagues were held and threatened at gunpoint by militiamen who had taken to the streets of a town very near Portland in the middle of an evacuation because viral Facebook memes convinced them that Antifa was starting the fires.
Viral Stories and Dangerous Choices 00:04:59
Around the same time that this was happening, that my buddies were getting held at gunpoint because they were not white people and a militia thought that was suspicious.
Around that same time, a tweet went viral from a Portland resident and a former Facebook employee named Bo Ren.
She posted a picture of the city blotted out by thick, acrid clouds of smoke and wrote, My dad sent me this view from my childhood room in Portland.
It hit me that we have been wasting our collective intelligence in tech optimizing for profits and ad clicks.
Huh.
Glad you got on the on that page, Bo.
Well, glad we, I mean, sometimes it just takes something to put it all in perspective, wouldn't you say?
Like your home burning down.
Yeah, sometimes the militias being 20 minutes from your door.
Yes.
Unfucking believable.
The militias that organize on Facebook.
Yeah, I mean, the story went pretty viral.
Yeah, as I said, they weren't harmed.
Yes.
Yeah, the two people I knew best who were there were Sergio Olmos and Justin Yau, who are both wonderful reporters.
But yeah, it was not lost on me that I think of the four people who were there, three of them were not white people, and that some of the white reporters had a much easier time.
Interesting things about militias, you learn.
Anyway, it makes you think.
Now, I thought that quote was interesting.
Anyway, opening.
Interesting.
Disruptive.
Disruptive.
Thought-provoking.
Like the fires.
And like militias.
The fires made me think when I could think.
Yeah.
I like how Facebook.
I threw up in my N95 mask walking down the street.
Awesome.
Yeah.
I've been jogging and doing pull-ups in a gas mask.
Just half naked in a gas mask in my front lawn, like a normal person.
You're the only person I know who would have seen this as a possible outcome.
And for that, I thank you.
And I curse you.
Yeah.
So anyway, opening Facebook up to developers made a shitload of money and membership grew.
And from Mark's point of view, everything was going great.
But Katherine Losse, his speechwriter, saw a lot of the same problems Parakilas had seen.
And in her memoir, she writes, The idea of providing developers with a massive platform for application promotion didn't exactly accord, I thought, with the site's stated mission of connecting people.
To me, connection with another person required intention.
They have to personally signal that they want to talk to me and vice versa.
Platform developers, though, went at human connection from a more automated angle.
They churned out applications that promised to tell you who had a crush on you if you would just send an invitation to the application to all of your friends.
Oh, I know.
The idea was that after the application had a list of your contacts, it would begin the automated work of inquiring about people's interests and matching people who were interested in each other.
Soon, developers didn't even ask you if you wanted to send invitations to your friends.
Simply adding the application would automatically notify all of your Facebook friends that you had added it and invite them to add it too, using each user as a vessel through which invitations would flow virally without the user's consent.
In this way, users' needs for friendship and connection became a powerful engine of spam, as it already was with email and on the internet long before Facebook.
The same "we'll tell you who has a crush on you if you just send this email to your address book" ploys were familiar to me from Hopkins, when spammers would blanket the entire email server with emails in a matter of hours, spread virally by students gullibly entering the names of their crushes and their crushes' email addresses.
So this was the start of Facebook making choices for its users, choices that were based on what would be best for the social network, which was keeping people on the site for as long as possible.
The growth team saw that proactively connecting people to each other worked out really well for Facebook's bottom line, even though sometimes, for example, people who had been horribly abused and raped by their spouses were reconnected to those spouses who they were hiding from and had their personal data exposed to them.
A thing that happened repeatedly and still happens repeatedly.
You know, but that's a small price to pay for growth.
In 2010, for that inch, Robert.
For that inch.
You got to fight for that inch.
And sometimes fighting for that inch means connecting abused women to the men who horribly injured them.
That's like Mark talking to Priscilla when they're trying to conceive a child.
He's just like, you got to fight for my inch, honey.
You got to fight for it.
Oh, Mark Zuckerberg is incapable of talking during sex.
He lets out a high-pitched hum that is only audible to crickets.
Fighting for the Inch 00:04:20
He doesn't.
Yeah, he's sort of got a Kendall situation going on, whereas he just has a sex lump that gets really hot.
Yeah, yeah, she has to actually withdraw the semen from inside his sacks using a needle.
I think if we have to go to an ad where she has to actually put in the little, hold on, hold on, hold on.
Okay.
On the subject.
I'm holding my vomit.
Put in a syringe and then suck.
And then she just has it.
And then she just has it.
And if you want to have the emotional equivalent of Mark Zuckerberg's semen, no, that's not.
Oh, that's a bad way to.
That's not fair to the products or the services.
It's not.
Anyway, here they are.
Vomit.
On a recent episode of the podcast Money and Wealth with John Hope Bryant, I sit down with Tiffany "The Budgetnista" Aliche to talk about what it really takes to take control of your money.
What would that look like in our families if everyone was able to pass on wealth to the people when they're no longer here?
We break down budgeting, financial discipline, and how to build real wealth, starting with the mindset shifts too many of us were never ever taught.
Financial education is not always about like, I'm going to get rich.
That's great.
It's about creating an atmosphere for you to be able to take care of yourself and leave a strong financial legacy for your family.
If you've ever felt you didn't get the memo on money, this conversation is for you to hear more.
Listen to Money and Wealth with John Hope Bryant from the Black Effect Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcast.
Hey, Ernest, what's up?
Look, money is something we all deal with, but financial literacy is what helps turn income into real wealth.
On each episode of the podcast Earn Your Leisure, we break down the conversations you need to understand money, investing, and entrepreneurship.
From stocks and real estate to credit, business, and generational wealth, we translate complex financial topics into real conversations everyone can understand.
Because the truth is, most people were never taught how money really works.
But once you understand the system, you can start to build within it.
That means ownership, smarter investing, and creating opportunities not just for yourself, but for the next generation.
If you want to learn how to build wealth, understand the market, and think like an owner, Earn Your Leisure is the podcast for you.
Listen to Earn Your Leisure on the iHeartRadio app, Apple Podcast, or wherever you get your podcast.
I'm Iris Palmer, and my new podcast is called Against All Odds.
And that's exactly what the show is about.
Doing whatever it takes to beat the odds.
Get ready to hear from some of your favorite entrepreneurs and entertainers as they share stories about defying expectations, overcoming barriers, and breaking generational patterns.
I'm talking to people like award-winning actress, producer, and director, Eva Longoria.
I think I had like $200 in my savings account, and my mom goes, what are you going to do?
And I was like, oh, I'll figure it out.
We had a one-bedroom apartment for like $400 a month and we all could not afford.
Like, I was like, how am I going to make $100 a month?
I'm opening up like I've never before.
For those of you who think you know me from what you've seen on social media, get ready to see a whole new side of me.
Listen to Against All Odds with Iris Palmer as part of the My Cultura Podcast Network, available on the iHeartRadio app, Apple Podcast, or wherever you get your podcast.
Hello, gorgeous.
It's Lala Kent, host of Untraditionally Lala.
My days of filling up cups at SUR may be over, but I'm still loving life in the valley.
Life on the other side of the hill is giving grown-up vibes.
But over here on my podcast, Untraditionally Lala, I'm still that Lala you either love or love to hate.
I've been full on oversharing with fans, family, and former frenemies like Tom Schwartz.
I had a little bone to pick with Schwartzy when he came on the pod.
You don't feel bad that you told me I was a bootleg housewife?
I must've flipped a pizza in your lap.
Oh God, I literally forgot about that until just now.
Sorry, I don't want to blame all of that.
I got to blame that one on the alcohol.
This is about laughing and learning when life just keeps on laughing.
Because I make mistakes so that you guys don't have to.
We're growing, we're thriving, and yes, sometimes we're barely surviving, but we do it all with love.
Profitable Spreading of Fascism 00:15:22
It's unruly, it's unafraid, it's untraditionally Lala.
Listen to Untraditionally Lala on the iHeartRadio app, Apple Podcasts, or wherever you get your podcast.
We're back.
Okay.
So in 2010, Facebook launched Facebook Groups, which would allow just about anyone to create a private walled-off community to discuss just about anything, including fascism, white genocide, or the need to gather a militia together and use it to kill their political enemies.
If you're a regular listener of my show, you know the next part of this story.
From about 2010 to 2016, the United States saw an astonishing leap in the number of active hate groups.
For some perspective, just from 2015 to 2020, the SPLC estimates there was a 30% increase in the number of hate groups nationwide.
All of this growth was mostly spurred on by social media, and Facebook was one of the main culprits.
And they knew they were too.
They didn't admit it openly, but internally they were talking about it from pretty early on.
And I'm going to quote now from a report in the Wall Street Journal.
A 2016 presentation that names as author, a Facebook researcher and sociologist Monica Lee, found extremist content thriving in more than one-third of large German political groups on the platform.
Swamped with racist, conspiracy-minded, and pro-Russian content, the groups were disproportionately influenced by a subset of hyperactive users, the presentation notes.
Most of them were private or secret.
The high number of extremist groups was concerning, the presentation says.
Worse was Facebook's realization that its algorithms were responsible for their growth.
The 2016 presentation says that 64% of all extremist group joins are due to our recommendation tools.
No.
Yeah.
And that most of the activity came from the platform's "Groups You Should Join" and "Discover" algorithms.
Quote from the presentation: our recommendation systems grow the problem.
Oh, okay.
Well, I mean, as long as the word grow is in the sentence, I think that that's good enough.
Growth is in there.
You're good.
Yeah.
Growth is in there.
So, really, where we're growing and what the consequences are, not really worried about it.
Yeah, it's just like when I'm in my 4Runner, drunk as shit on mezcal and firing a Kalashnikov.
All that matters is forward movement.
It doesn't matter if that forward movement is driving through the trailer that a family lives in.
What matters is that I'm moving forward and shooting and drunk.
You're trash.
Thank you.
Wow.
A judgmental statement.
I'm innovating homeownership.
I mean, this is another example of just, you know, Facebook innovating people's interests.
Like, hey, do you, do you enjoy this?
I'm trying to think of the old Facebook groups that you used to be able to join like 10 years ago, where it would be like, science is my boyfriend.
And it's like, do you enjoy science?
I want to fuck the Smithsonian Institute.
Or be like school groups.
It'd be like class of 2012.
Stuff like that.
The same with like early, I mean, it's like, I mean, obviously, very much in the same line of algorithmic thinking as YouTube, where it's like, oh, did you enjoy this like collage of Gerard Butler images?
How about a man sitting in his 4Runner whispering conspiracy theories for three hours on end?
Yeah.
That's just growth.
I love growth.
I love growth almost as much as I love everything that I do with the Toyota 4Runner.
Yeah, I was referencing it.
Well hammered in a trailer park.
Yeah.
The real innovation is innovating the trailer parks near my house with a Toyota and a rifle.
Just kind of changing the narrative around it.
Changing the narrative around it to screaming mainly.
So, yeah, throughout, right.
So throughout 2016, and particularly in the wake of the election, a lot of Facebook employees began to increasingly express their concerns that the social network they were pouring their lives into might be tearing the world apart.
Because again, most of these are very nice and intelligent people who don't want to live on a planet dominated by nightmarish dictatorships and a complete collapse in the understanding of truth, the kind of collapse that allows, for example, viral pandemics to spread long after they should have stopped spreading, because people don't have any sort of common conception of basic reality as a result of the influence of social media.
Weird example.
They don't like that.
Like the people who work at Facebook got kind of bummed out about contributing to that.
One observer at the time reported to the Wall Street Journal: there was this soul-searching period after 2016 that seemed to me this period of really sincere, oh man, what if we really did mess up the world in 2016?
Yeah.
Yeah.
I love that we're going from, in the 40s, the scientist who does the same thing going, now I am become death, the destroyer of worlds.
An appropriate comment for the thing that he'd done.
And then something honestly equivalent in its destructive potential.
But the response this time, because everything is tacky now, oh, what if we messed up the world?
Oh, we might have fucked this up.
God.
Like, yeah, LOL.
Yeah.
Starting to think we've severely fucked up the planet.
Never mind.
Like, Jesus Christ.
Yeah.
This is why Aaron Sorkin is still working.
It's because people are saying shitty stuff in shitty ways.
Yeah, I don't.
We should cut that out.
No, no, we should never cut out criticizing history's real villain, Aaron Sorkin.
I agree.
Who I call the Pol Pot of cable television.
He's like, yeah, he is.
Yeah.
That was evil.
This soul searching did not extend to Mark Zuckerberg, who after the election gave the order to pour even more resources into Facebook groups, marking that feature out as emblematic of what he saw as the future of his site.
He wrote a 6,000-word manifesto in 2017, which admitted to playing some role in the disinformation and bigotry flooding the body politic.
So he's like, yeah, we did, we had something to do with it.
He also claimed that Facebook was going to start fighting against this by fostering safe and meaningful communities.
From CNBC, quote, Zuckerberg noted that more than 100 million users were members of very meaningful Facebook groups, but he said that most people don't seek out groups on their own.
There is a real opportunity to connect more of us with groups that will be meaningful social infrastructure in our lives, Zuckerberg wrote at the time.
If we can improve our suggestions and help connect 1 billion people with meaningful communities, that can strengthen our social fabric.
Again, fascinating use of the word meaningful.
Meaningful.
Yeah.
Meaningful.
Meaningful.
What happened next was terrible and predictable and meaningful, Jamie.
Very meaningful.
A flood of new users got introduced and even pushed into extremist groups on Facebook.
The changes Mark insisted upon have been critical to the growth of QAnon, which was able to break containment from the weird parts of the internet and start infecting the minds of our aunts and uncles, thanks mostly to Facebook, which took no action against it until like a month or two ago.
Within two years, Facebook hosted thousands of QAnon pages with tens of millions of collective members.
I'm going to quote now from an NBC News investigation on the matter.
Facebook has been key to QAnon's growth, in large part due to the platform's groups feature, which has also seen a significant uptick in use since the social network began emphasizing it in 2017.
There are tens of millions of active groups, a Facebook spokesperson told NBC News in 2019, a number that has probably grown since the company began serving up group posts in the users' main feeds.
While most groups are dedicated to innocuous content, extremists from QAnon conspiracy theorists to anti-vaccination advocates have also used the groups feature to grow their audiences and spread misinformation.
Facebook aided that growth with its recommendations feature, powered by a secret algorithm that suggests groups to users seemingly based on interest and existing group membership.
And growth.
And growth.
Yeah, it's funny.
One of the things I like about this NBC report, which is partly authored by Brandy Zadrozny, who's done a lot of great work on this subject, is they kind of talk about how profitable spreading dangerous fascist content is for Facebook.
Quote, a small team working across several of Facebook's departments found 185 ads that the company had accepted praising, supporting, or representing QAnon, according to an internal post shared among more than 400 employees.
The ads generated about $12,000 for Facebook and 4 million impressions in the last 30 days.
Well, you have to imagine, if they're doing the math on it, I mean, it has to be financially profitable, because it has to offset the cost of the PR hits that they know they're going to eventually take for this shit.
So they're in the depths.
But again, it's just assigning, yeah, assigning a price to lives and brains.
Yeah, which is a good thing to do.
$12,000 seems reasonable.
Yeah, it seems fair to me.
Yeah.
So, yeah, outside Facebook, the only people who really noticed what was happening initially were a handful of researchers that studied extremist groups.
And I wasn't really one of them. It wasn't until like 2019 that I realized Facebook groups specifically were a problem.
It was obvious that Facebook was the issue.
But for me it wasn't until a Facebook group kept threatening to kill me for two years.
Yeah, that did happen to you, huh?
That did happen to me.
Yeah.
And if you haven't listened to my My Year in Mensa podcast, what are you doing?
Thankfully, the people threatening to kill you were just members of Mensa who I trust are not competent enough to pull off an assassination.
I mean, don't challenge them, but let's hope so.
No, this, I'm throwing down the gauntlets.
You're like, no, no, no.
I don't think they could do it.
Yeah.
Yeah.
Sorry.
So, um, yeah, I didn't really grasp the scale of the problem with Facebook groups in specific until 2019 when I started really looking into the Boogaloo movement.
And it was kind of camouflaged because there was just so much fascist content everywhere on Facebook that the fact that groups in specific were driving a lot of the expansion of fascism in this country kind of got lost in the noise.
But there were other researchers who started to realize this early on.
And workers inside Facebook realized what was happening right away.
In 2018, they held a meeting for Mark and other senior leadership members to reveal their troubling findings.
From the Wall Street Journal, quote, a Facebook team had a blunt message for senior executives.
The company's algorithms weren't bringing people together.
They were driving people apart.
Quote, our algorithms exploit the human brain's attraction to divisiveness, read a slide from a 2018 presentation.
If left unchecked, it warned, Facebook would feed users more and more divisive content in an effort to gain user attention and increase time on platform.
So that presentation went to the heart of a question dogging Facebook almost since its founding.
Does its platform aggravate polarization and tribal behavior?
The answer it found in some cases was yes.
In some cases. I mean, I guess that's technically accurate, in some cases.
Yeah.
Yeah.
So Facebook, in response to this meeting, starts like a massive internal effort to try to figure out like how its platform might be harming people.
And Mark Zuckerberg in public and private around this time started talking about his concern that sensationalism and polarization were being enabled by Facebook.
And to Mark's credit, he made his employees do something about it.
I hate that phrase.
Yeah, a little bit to his credit.
It's okay.
We'll take away the credit in just a second.
So, quote, fixing the polarization problem would be difficult, requiring Facebook to rethink some of its core products.
Most notably, the project forced Facebook to consider how it prioritized user engagement, a metric involving time spent, likes, shares, and comments that for years had been the lodestar of its system.
Championed by Chris Cox, Facebook's chief product officer at the time and a top deputy to Mr. Zuckerberg, the work was carried out over much of 2017 and 2018 by engineers and researchers assigned to a cross-jurisdictional task force dubbed Common Ground, and by employees in newly created Integrity Teams embedded around the company.
Integrity teams sounds good to me.
It sounds reliable.
It sounds like they made sure that integrity was accomplished via teamwork.
Yeah.
Yeah.
So the Common Ground team proposed a number of solutions.
And to my ears, some of them were actually pretty good.
One proposal was basically to kind of try to take conversations that were derailing groups, like conversations over hot button political issues, and excise them from those groups.
So basically, if a couple of members of a Facebook group started fighting about vaccinations in, like, a group based around parenting, the moderators would be able to make a temporary subgroup for the argument to exist in so that other people would...
It's like a Zoom breakout room.
Yeah, so that other people wouldn't be exposed.
Which I don't know if that's a great idea, but it was something.
Another idea that I do think was better was to tweak recommendation algorithms to give people a wider range of Facebook group suggestions.
Yeah, but it was kind of determined that doing these things would probably help with polarization, but would come at the cost of lower user engagement and less time spent on site, which the Common Ground team warned about in a 2018 document.
They described some of their own proposals as anti-growth and requiring Facebook to take a moral stance.
You can guess how that all went.
Yeah.
Mark Zuckerberg almost immediately lost interest.
Some of this, a lot of this was probably due to the fact that it would harm Facebook's growth.
But another culprit that like employees who talk to the Wall Street Journal and other publications repeatedly mention is the fact that he was all butthurt about how journalists were reporting on Facebook.
Because after the Cambridge Analytica scandal, they kept writing mean things about him.
No.
Yeah.
Well, Mr. Mark always has to ask himself, what would Bad Haircut Emperor do?
And Bad Haircut Emperor wouldn't, you know, wouldn't slow down on this shit.
Absolutely not.
One person familiar with the situation told the Wall Street Journal, the internal pendulum swung really hard to the media hates us no matter what we do.
So let's just batten down the hatches.
By January of 2020, Mark's feelings had hardened enough that he announced he would stand up, quote, against those who say that new types of communities forming on social media are dividing us.
According to the Wall Street Journal, people who have heard him speak privately say he argues social media bears little responsibility for polarization.
Now, there may be an additional explanation for Mark's shifting opinions on the matter that go beyond being just greedy and angry about bad press.
Batten Down the Hatches 00:15:19
And that explanation is a fella named Joel Kaplan.
Do you know Joel Kaplan?
You ever heard of this dude?
I don't know this Joel Kaplan character.
Well, in short, he's the goddamn devil.
In long, he's the guy that Facebook hired to head up U.S. public policy in 2011, and he became the VP of Global Public Policy in 2014.
And Joel was picked for these jobs because unlike most Facebook employees, he is a registered Republican with decades of experience in government.
This made him the perfect person to help the social network deal with allegations of anti-conservative bias.
With as little empathy as possible, I'm sure.
Yeah.
In 2016, there's all these rumors that Facebook is like censoring conservative content that are proven to be untrue, but the rumors go viral on the right.
And so everyone on the right forever assumes that they were true.
And basically, Joel becomes increasingly influential after this point because he's Mark Zuckerberg's best way out of angering the right wing, which you actually can't not do because they're always angry and will just yell about everything until they get to kill everyone who isn't them because that's what they're doing.
Yeah, life finds a way.
Life always finds a way for them.
So Joel was a policy advisor for George W. Bush's 2000 campaign and a participant in the Brooks Brothers riot, which is the thing that was orchestrated by Roger Stone to help hide a bunch of ballots in Florida that swung the election for W.
He was a part of that.
What the fuck?
Yeah.
That's the guy who's basically running Facebook's response to partisanship right now.
I had a physical reaction to that.
That's awesome.
That's so upsetting.
He worked in the White House for basically the whole Bush administration.
And in 2006, he took over Karl Rove's job.
So if you want to visualize Joel Kaplan, he's the guy you get when you can't get Karl Rove anymore.
He's Mr. Karl Rove wasn't available.
That's the thing.
The worst person in the world is like, I can't do this job anymore.
Joe Kaplan's like, I got you.
I got you, famous monster.
Look, I'm trying to drop some shit over here.
Wow.
Infamous piece of shit, Karl Rove.
Don't worry.
I will continue your good work.
I am Joel Kaplan, and now I basically run Facebook.
And if you Google him, Google has him listed as American Advocate.
Yeah, he is an advocate of things.
I was like, again, I guess don't get any more specific.
It's so untrue.
Don't enjoy his face just to put it out there.
Joel is presently one of the most influential voices in Mark Zuckerberg's world, and he was one of the most influential voices in the entire company when the Common Ground team came back with their suggestions for reducing partisanship.
As policy chief, he had a lot of power to approve these new changes, and he argued against all of them.
His main point was that the proposed changes were, in his words, paternalistic.
Basically, that it was babying people.
He also said that these changes would disproportionately...
This can't be a daddy story.
This can't be a daddy story.
I can't handle any more daddy stories that end in a genocide, Robert.
Oh, God.
If it makes you feel any better, all the genocides that this is going to lead to haven't happened yet.
Oh, okay.
Well, there you go.
Yeah.
So Joel also said that these changes would disproportionately impact conservative content because it tends to be bigoted and divisive.
Since the Trump administration was at this point regularly tossing threats at Facebook, this had some weight.
Quote from Wall Street Journal.
Mr. Kaplan said in a recent interview that he and other executives had approved certain changes meant to improve civic discussion.
In other cases where proposals were blocked, he said he was trying to instill some discipline, rigor, and responsibility into the process as he vetted the effectiveness and potential unintended consequences of changes to how the platform operated.
Internally, the vetting process earned a nickname, Eat Your Veggies.
No!
Which sounds paternalistic to me, actually.
Sounds like the beginning of a daddy story that ends in a genocide.
Wow.
Okay.
Eat your veggies.
We'll get back to Joe Kaplan in a little bit.
For now, we need to talk some more about how the problem of violent extremism in Facebook groups got completely out of control.
So after this summer, which was marked by constant militia rallies, the explosive growth of the Boogaloo movement, and numerous deaths as a result of violent far-right actors showing up at protests with guns, Facebook finally took action in late September and banned militias from using their service. Because, you know, they have to be balanced.
They also banned anarchists from Facebook at the same time, even though anarchists have not been tied to any acts of fatal terrorism in recent memory.
Because, you know, you got to placate the right wing because they're the only ones who matter.
So let's ban the anarchists who have been spending the last four years trying to lay out the individual actors and groups who are members of these militias that are doing stuff like taking over checkpoints and holding my friends at gunpoint.
We wouldn't want the folks who are keeping track of them to be able to use Facebook.
That's the wrong kind of disruptive.
You see, that's the wrong kind of disruptive and advocating.
You know, that's very similar to what the dude in that trailer said when I was driving my 4Runner through his trailer and shooting towards his children, not at them.
And I'll tell you what I told him.
What did you say?
I'm an innovator.
So is Mark.
I don't know.
That didn't really tie into this.
It worked for me.
I could see it.
I could see it in kind of an Ozark-y kind of way.
I could see it.
Yeah.
Yeah.
So, Mark, by the way, is on record declaring that Facebook is a guardian of free speech, which is one of the things he cited when he noted that he was refusing to fact-check political ads in 2020.
So, anarchists who want to talk about operating a communal garden or share details about dangerous militias are the same as militiamen baying for the blood of protesters.
But political candidates spreading malicious lies about protesters who are being assaulted and killed based on those lies, that is fine.
That's fine.
So, back to Facebook's integrity.
I don't think it's a growth or anything like that.
No, let's get back to Facebook's integrity teams and their doomed quests to stop their boss from destroying democracy.
So, the engineers and data scientists on these teams, mainly the guys who were working on the News Feed, according to the Wall Street Journal, arrived at the polarization problem indirectly.
Asked to combat fake news, spam, clickbait, and inauthentic users, the employees looked for ways to diminish the reach of such ills.
One early discovery: bad behavior came disproportionately from a small pool of hyper-partisan users.
Now, another finding was that the U.S. saw a larger infrastructure of accounts and publishers that met this definition on the far right than the far left.
And outside observers documented the same phenomenon.
The gap meant that seemingly apolitical actions, such as reducing the spread of clickbait headlines along the lines of "You Won't Believe What Happened Next," affected conservative speech more than liberal speech.
Now, yeah, and obviously this pissed off conservatives.
The way that Facebook works means that users who post and engage with the site more have more influence.
The algorithm sees if you're posting a thousand times a week instead of 50.
It likes that engagement because engagement means money, and so it prioritizes your content over the content of someone who posts less often.
This means that a bunch of networks of Russian bots and hyperactive users, like Ian Miles Cheong, who's a fascist troll who lives in fucking Malaysia and tweets about how, like, everybody needs to have a gun that they can use to shoot Democrats,
even though guns are illegal in his country. Anyway, total piece of shit. These pieces of shit, who are actively attempting to urge violence, and who have urged violence and caused deadly mobs in other countries,
it means that these people, because they're just shotgunning out hundreds of posts per day, will always be more influential than local journalists and reporters who are trying to bring out factually based information.
Because it's better for Facebook for a stream of lies to spread on their platform than a smaller amount of truth.
Yeah.
And it also lends itself to releasing content so quickly that you couldn't possibly disprove or fact-check things fast enough, because it's just a bullshit machine.
Yeah.
And, you know, Facebook's teams found that most of these hyperactive accounts were way more partisan than normal Facebook users and were more likely to appear suspicious, like to engage in suspicious behavior that suggested either a bunch of people were working in shifts or they were bots.
So these teams, these integrity teams did like the thing that has integrity, which was they suggested their company fix the algorithm to not reward this kind of behavior.
Now, this would lose the company a significant amount of money.
And since most of these hyperactive accounts were right-wing in nature, it would piss off conservatives.
So you can imagine how this idea went over with Joel Kaplan.
Since Mark was terrified of right-wing anger, he tended to listen to Joel about these sort of things.
The eat your veggies.
Joel's daddy.
Let's not forget.
Yeah, Joel's daddy.
And the eat your veggies policy review process stymied and killed any movement on halting this problem.
So, well, how do we feel about that?
We feel great.
We feel great.
Glad Dad's in charge.
Glad everyone's eating their veggies.
I mean, even just the dystopian nature of, like, mobilizing these teams to be like, hey, I've ruined the world.
Do you think you could stop it before it blows up?
Because this is going to be a real PR issue.
Why would you do that?
Best of luck to the team.
It was another case where, like, basically the only way to combat this stuff is to have another person Mark Zuckerberg respects, or is at least scared of, yelling at him.
Or, you know, talking politely to him.
The daddies of the world.
Yeah.
The opposite of whatever Joel Kaplan is saying.
And there thankfully was someone like that in Facebook.
In 2017, they hired Carlos Gomez Uribe, the former head of Netflix's recommendation system, which has obviously made a lot of money for Netflix.
So this guy, Carlos Uribe, is a big, important get for Facebook.
So he gets on staff and he immediately is like, oh, this looks like we might be destroying the world.
And so he starts pushing to reduce the impact that hyperactive users had on Facebook.
And one of the proposals that his team championed was called Sparing Sharing, which would have reduced the spread of content that was favored by these hyperactive users.
And this would obviously have had the most impact on content favored by far right and far left users.
And number one, there's more far-right users on Facebook than far left.
So that was going to disproportionately impact them.
But the people who mainly would have gained influence were political moderates.
Mr. Uribe called it the happy face.
That's what he called this plan.
And Facebook's data scientists thought that it might actually like help fight the kind of spam efforts that Russia was doing in 2016.
But Joel Kaplan and other Facebook executives pushed back because...
Okay, yeah.
Yeah.
And they didn't want to say that, because, you know, Mr. Uribe was someone you had to be careful arguing with.
So instead of saying, this will be bad for money or it'll make the right angry at us, Joel Kaplan invented a hypothetical Girl Scout troop.
And he asked, what would happen if the girls became Facebook super sharers as part of a cookie selling program?
Robert, that sounds like a metaphor you would do at the beginning of an episode.
Yeah, he was like, basically, what if these Girl Scouts made a super successful account to sell their cookies?
Like, we would be unfairly hurting them if we stopped these people who are baying for the deaths of their fellow citizens and gathering militias to their banner.
They're like, okay, okay, militia.
I hear you.
What about fictional girl scouts?
Girl Scouts.
Fake Girl Scouts.
Yeah.
It's awesome.
So the debate between Mr. Uribe and Joel Kaplan eventually did make it to Mark Zuckerberg.
He had to make a call on this one because both of them were kind of big names in the company.
Mark listened to both sides and he took the coward's way out.
He approved Uribe's plan, but he also said they had to cut the weighting by 80%, which mitigated most of the positive benefits of the plan.
Yeah.
After this, Mark, according to the Wall Street Journal, quote, signaled he was losing interest in the effort to recalibrate the platform in the name of social good, asking that they not bring him something like that again.
Neat.
200 years of peace, Mark.
That has big 200 years of peace energy about it.
Big 200 years of peace energy.
Yeah.
In 2019, Mark announced that Facebook would start taking down content that violated specific standards, but would take a hands-off approach to policing material that didn't clearly violate its standards.
In a speech to Georgetown that October, he said, you can't impose tolerance top down.
It has to come from people opening up, sharing experiences, and developing a shared story for society that we all feel we're a part of.
That's how we make progress together.
So, you know, it's like, that is just such a wild way of saying, like, I don't feel I am accountable for this.
And once again, I'm going to delegate this to the users of the people whose brains I'm actively ruining.
You know what makes progress harder, in my opinion, Jamie?
Products and services?
No, no.
When fascists are allowed to spread lies about disadvantaged and endangered groups to tens of millions of angry and armed people because your company decided sites like the Daily Caller and Breitbart are equivalent to The Washington Post.
This is something Facebook did when, at Joel Kaplan's behest, it made both companies Facebook news partners.
These are the folks that Facebook trusts to help them determine what stories are true.
They get money from Facebook.
They get an elevated position in the newsfeed.
Yeah.
On an unrelated note, earlier this year, Breitbart News shared a video that promoted bogus coronavirus treatments and told people that masks couldn't prevent the spread of the virus.
This video was watched 14 million times in six hours before it was removed from Breitbart's page.
They removed it, presumably, because it violated Facebook policy.
And Facebook has a two-strike policy for its news partners sharing misinformation within a 90-day period.
When Mark was asked why Breitbart got to be a Facebook trusted partner while spreading misinformation about an active plague that was killing hundreds of thousands of Americans, Mark held up the two-strike policy as a shield.
Quote, this was certainly one strike against them for misinformation.
But they don't have others in the last 90 days.
So by the policies we have, which by the way, I think are generally pretty reasonable on this, it doesn't make sense to remove them.
No!
That's pretty great, Jamie.
That's pretty awesome.
But you know what's even better about this?
Unethical But Still Legal 00:06:05
Unethical, but still legal.
Ha ha.
What's even better about this is that Breitbart absolutely violated Facebook policies more than two times in 90 days, and it was covered up.
That's what's even better.
Yeah.
You have to imagine Breitbart is violating Facebook policies multiple times a day.
Like, Kaplan helps them hide it.
Yeah.
That is such, I mean, it's awesome.
It's awesome.
I'm going to tell that entire tale by citing an incredible report by BuzzFeed, who, by the way, all credit to BuzzFeed.
I've cited a number of great articles in these episodes, including that one from the Wall Street Journal, which is really important.
BuzzFeed has probably been, of all of the different media companies, the most dedicated and like hounding Facebook like a fucking dog with a groin fetish.
I don't know how to, I'm very proud of BuzzFeed's reporting on Facebook.
Thank you for keeping on this one, y'all.
Good work.
Now I have to remove that image from my head, but yes.
Yeah, I'm going to quote from this report on the fact that Facebook fraudulently hid the fact that one of their information partners was violating their own policies and spreading disinformation about an active plague.
And then you need to take an ad break just so you know.
Oh, I'll take an ad break now.
We'll get to this afterwards because if there's one thing that prepares me to hear about how democracies, both in the nation I live and around the world, are being actively murdered for the profit of a man who's already a billionaire.
If there's one thing that makes that easier to take, it's products and services.
It's the sweet lullaby of a product or a service.
Oh, nothing, nothing keeps me going, gets me intellectually hard, like a product or a service.
I want to be surrounded.
I want to die surrounded by my most beloved products and services.
I have a feeling that you will because there's a good chance that a horrible wildfire will sweep through the city you live in.
And sorry, that's getting too dark.
Mine too, maybe.
Yeah.
Yeah, as I was going to say, I'm like, hey, as long as we're on the same page there, that's great.
And it's okay.
If we make it out of that fire, Facebook will ensure there's lots of armed and misinformed militias waving guns wildly in the areas we attempt to evacuate through.
Well, as long as my death will have been completely in vain.
Yes, that's what Facebook promises for all of us.
And that's what products and services promise for all of us.
Here we go.
On a recent episode of the podcast Money and Wealth with John Hope Bryant, I sit down with Tiffany "The Budgetnista" Aliche to talk about what it really takes to take control of your money.
What would that look like in our families if everyone was able to pass on wealth to the people when they're no longer here?
We break down budgeting, financial discipline, and how to build real wealth, starting with the mindset shifts too many of us were never ever taught.
Financial education is not always about like, I'm going to get rich.
That's great.
It's about creating an atmosphere for you to be able to take care of yourself and leave a strong financial legacy for your family.
If you've ever felt you didn't get the memo on money, this conversation is for you.
To hear more, listen to Money and Wealth with John Hope Bryant from the Black Effect Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hey, Ernest, what's up?
Look, money is something we all deal with, but financial literacy is what helps turn income into real wealth.
On each episode of the podcast, Earn Your Leisure, we break down the conversations you need to understand money, investing, and entrepreneurship.
From stocks and real estate to credit, business, and generational wealth, we translate complex financial topics into real conversations everyone can understand.
Because the truth is, most people were never taught how money really works.
But once you understand the system, you can start to build within it.
That means ownership, smarter investing, and creating opportunities not just for yourself, but for the next generation.
If you want to learn how to build wealth, understand the markets, and think like an owner, Earn Your Leisure is the podcast for you.
Listen to Earn Your Leisure on the iHeartRadio app, Apple Podcast, or wherever you get your podcast.
I'm Iris Palmer, and my new podcast is called Against All Odds.
And that's exactly what the show is about, doing whatever it takes to beat the odds.
Get ready to hear from some of your favorite entrepreneurs and entertainers as they share stories about defying expectations, overcoming barriers, and breaking generational patterns.
I'm talking to people like award-winning actress, producer, and director Eva Longoria.
I think I had like $200 in my savings account, and my mom goes, what are you going to do?
And I was like, oh, I'll figure it out.
We had a one-bedroom apartment for like $400 a month that we almost could not afford.
Like, I was like, how am I going to make $100 a month?
I'm opening up like I've never before.
For those of you who think you know me from what you've seen on social media, get ready to see a whole new side of me.
Listen to Against All Odds with Iris Palmer, part of the My Cultura podcast network, available on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hello, gorgeous.
It's Lala Kent, host of Untraditionally Lala.
My days of filling up cups at SUR may be over, but I'm still loving life in the valley.
Life on the other side of the hill is giving grown-up vibes.
But over here on my podcast, Untraditionally Lala, I'm still that Lala you either love or love to hate.
I've been full on over sharing with fans, family, and former frenemies like Tom Schwartz.
I had a little bone to pick with Schwartzy when he came on the pod.
You don't feel bad that you told me I was a bootleg housewife?
I must've flipped a pizza in your lap.
Oh God, I literally forgot about that until just now.
Sorry, I don't want to blame all of that.
I got to blame that one on the alcohol.
This is about laughing and learning when life just keeps on laughing because I make mistakes so that you guys don't have to.
We're growing, we're thriving, and yes, sometimes we're barely surviving, but we do it all with love.
Clickable as Fuck Decisions 00:12:32
It's unruly.
It's unafraid.
It's untraditionally Lala.
Listen to Untraditionally Lala on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
All right, we're back.
So we're talking about how Facebook covered up the fact that Breitbart was repeatedly spreading disinformation that should have gotten them removed as a trusted partner.
Quote from BuzzFeed: Some of Facebook's own employees gathered evidence they say shows Breitbart, along with other right-wing outlets and figures, including Turning Point USA founder Charlie Kirk, Trump supporters Diamond and Silk, and conservative video production nonprofit Prager University, has received special treatment that helped it avoid running afoul of company policy.
They see it as part of a pattern of preferential treatment for right-wing publishers and pages, many of which have alleged that the social network is biased against conservatives.
On July 22nd, a Facebook employee posted a message to the company's internal misinformation policy group, noting that some misinformation strikes against Breitbart had been cleared by someone at Facebook seemingly acting on the publication's behalf.
A Breitbart escalation marked urgent end of day was resolved on the same day, with all misinformation strikes against Breitbart's page and against their domain cleared without explanation.
The employee wrote.
The same employee said a partly false rating applied to an Instagram post from Charlie Kirk was flagged for a priority escalation by Joel Kaplan, the company's vice president of global public policy.
Now, the whole article itself details just a ton of other instances in this, and it's all incredibly shady.
I'm not going to go into all of it in tremendous detail because we are running out of time.
But if you read the article, it's extremely clear that Joel Kaplan is directing Facebook to actively violate the company's own policies in order to keep right-wing bullshit peddlers spreading lies on the platform for profit.
Kaplan has faced no punishment for this, although his behavior did provoke outrage from employees in Facebook's internal chat system.
The rules don't apply to daddy.
That's how it goes.
Facebook employees have been getting angrier and angrier at this sort of thing throughout the year.
Remember back in May when President Trump posted this message to Twitter and Facebook?
Quote, there is no way, zero, that mail-in ballots will be anything less than substantially fraudulent.
Mailboxes will be robbed.
Ballots will be forged and even illegally printed out and fraudulently signed.
The governor of California is sending ballots to millions of people.
Anyone living in the state, no matter who they are or how they got there, will get one.
That will be followed up with professionals telling all of these people, many of whom have never even thought of voting before, how and for whom to vote.
This will be a rigged election.
No way.
I do remember that, Robert.
I do remember that.
Twitter, and again, this is the mildest level of credit I could possibly give someone: Twitter fact-checked the president's tweet, which was not nothing, and that's all I'll say about it.
Unfortunately, that does not qualify as nothing.
Again, that qualifies as the most responsible action a social media CEO took.
Mark, on the other hand, refused to let his employees do anything similar, allowing the president's flagrant misinformation to circulate on his network.
This enraged employees, and they got angrier when his "when the looting starts, the shooting starts" post was left up.
They created a group in Workplace, their internal chat app, called "Let's Fix Facebook (the Company)."
It now has about 10,000 members.
One employee started a poll asking colleagues whether they agreed, quote, with our leadership's decisions this week regarding voting misinformation and posts that may be considered to be inciting violence.
A thousand respondents said the company had made the wrong decision on both posts, which is more than 20 times the number of respondents who said otherwise.
So Facebook employees after this staged a digital walkout.
And they like changed their workplace avatars to a black and white fist and called out sick en masse among hundreds of them and stuff.
And, you know, I'm going to quote from BuzzFeed again here.
As Facebook grappled with yet another public relations crisis, employee morale plunged.
Worker satisfaction metrics, measured by micro poll surveys that are taken by hundreds of employees every week, fell sharply after the ruling on Trump's looting post, according to data obtained by BuzzFeed.
On June 1st, the day of the walkout, about 45% of employees said they agreed with the statement that Facebook was making the world better, down 25 percentage points from the week before.
That same day, Facebook's internal survey showed that around 44% of employees were confident in Facebook leadership leading the company in the right direction, a 30 percentage point drop from May 25th.
Responses to that question have stayed around the lower mark as of earlier this month.
So pretty significant drop in faith in the company from its employees.
And yeah, Zuckerberg, the ultimate decision maker, according to Facebook's head of communications, initially defended his decision to leave Trump's looting post up without even hiding it behind a warning the way Twitter did.
Mark stated, Unlike Twitter, we do not have a policy of putting a warning in front of posts that may incite violence because we believe that if a post incites violence, it should be removed regardless of whether or not it's newsworthy, even if it comes from a politician.
Oh, so you have to wait for violence to actually be incited and then be like, oh, it turns out that post was actually really bad and we should take it down.
Again, the amount of bodies that he needs attached to do a single thing is staggering.
Four days later, Mark backtracked. From BuzzFeed:
Quote: In comments at a company-wide meeting on June 2nd that were first reported by Recode, Facebook's founder said the company was considering adding labels to posts from world leaders that incite violence.
He followed that up with a Facebook post three days later, in which he declared Black Lives Matter and made promises that the company would review policies on content discussing excessive use of police or state force.
What material effect does any of this have?
One employee asked in workplace, openly challenging the CEO.
Commitments to review offer nothing material.
Has anything changed for you in a meaningful way?
Are you at all willing to be wrong here?
Mark didn't respond to this, but on June 26th, nearly a month later, he posted a clarification to his remarks, noting that any post that is determined to be inciting violence will be taken down.
Employee dissatisfaction has continued to swell over the course of the summer.
One senior employee, Max Wang, even recorded a 24-minute long video for his colleagues.
And BuzzFeed, in another article, has all the audio for this.
It's worth listening to.
In the video, Max outlines why he can't morally justify working for Facebook anymore.
And he's a pretty early employee, I think.
His video quotes at length from books on totalitarianism by Hannah Arendt, who is one of like the great scholars of the Holocaust.
Yeah.
Yeah.
He shared the video on workplace with a note that started, I think Facebook is hurting people at scale.
Yes.
Yeah.
Yes, it is.
Absolutely.
Yeah.
All right.
Yeah.
Like Emperor Augustus, who had members of his own family killed for disobedience, Mark did not like being questioned and, gasp, disapproved of by his own employees.
On June 11th, he hosted a live Q&A where he delivered a message to employees who were angry at his enabling of hideously violent fascist rhetoric.
A lot of this was, I think, in response to the killings and such.
I've been very worried about the level of disrespect and in some cases of vitriol that a lot of people in our internal community are directing towards each other as part of these debates.
If you're bullying your fellow colleagues into taking a position on something, then we will fire you.
Well, good.
You know, the amount of consistency, I mean, you got to appreciate it.
I'm really glad that that employee, I mean, just spoke directly about, because it's like, at what point, truly, what do you have to lose?
I guess except for your life, depending on how Mark Zuckerberg wants to go about it.
But I mean, it's, I don't know.
No, it is so frustrating, even though it's like, I don't know what else to do other than, you know, whatever, some, some shit in Minecraft.
But, but just people are continually waiting for this person and this company to act in the best interest.
It's like, it's not, when has it ever happened?
Name a time.
Even in the face of like the most brutal public disapproval, there's too much.
It's amazing.
It's amazing.
As you were saying all this, and as I just finished what I was saying, a Bloomberg story just dropped.
Like as we were recording this episode, I'm just going to read you.
I haven't read the story.
I'm just going to read you the title.
Facebook accused of watching Instagram users through cameras.
No!
Oh, man.
Oh, it fucking rules.
Oh, my God.
It's so good.
Have we talked about that before, though?
Because I have, I've had that issue with Instagram before where I'll close out Instagram and then you'll see the little section of your iPhone in the top left where it indicates that you're being recorded.
It goes on, it turns on, like, when I... hmm.
Listeners, let me know if you've had a similar issue.
Sometimes when I close Instagram, it looks like my phone just stopped recording, but it goes away really quickly.
It's like a millisecond that it's up.
It happens all the time.
So that is not shocking at all.
Yeepery do.
Woo.
I don't know what I'm going to actually title this.
I don't know what I'm actually going to title this episode.
The working title that I started this under was Mark Zuckerberg Needs to Be Tried in the Hague and Hung in Public Until Dead.
But I don't think legal's going to let me go with that title.
Okay.
I think it's clickable.
Strong proposal.
I think it's clickable as fuck.
I think you'd get great engagement on that.
I mean, it's what he'd want.
Yeah, we may have to go with a different title.
I mean, I'm not urging illegal behavior.
I'm urging that he be tried in the International Criminal Court and then, once convicted, hanged by the neck until dead for his crimes.
Which is what you do when a world leader commits genocide.
Right.
Right.
Yeah.
That is true.
But I probably won't title the episode that.
I don't know.
I mean, I'm glad you put it out there, though.
Let's not take it out of the running.
Yeah, there's a number of other options.
I mean, Mark Zuckerberg continues to disrupt.
Yeah.
200 years of peace.
There's so many options.
I can't wait for the 200 years of peace that only involved dozens of wars.
Yeah, I mean, let's say if the 200 years of peace began in 2004, imagine how much peace we have to look forward to.
I think it's similar to the amount of peace I brought to that trailer park.
You're a little sicko.
You're a little sicko.
I know it.
I know it.
Well, we're all gonna be fine.
Fine.
We're all gonna be great.
Yeah.
You want to plug some shit?
Yeah.
Yeah.
Thank you for disrupting my life and inner sense of peace.
That's what I do. I always be disrupting, baby.
I've always been a disrupter.
You've always been a huge disrupter.
And yeah, you can follow me on Twitter or Instagram, which is watching me right now.
And then if you want to contribute to a candidate that I love, Fatima Iqbal-Zubair, we're doing a live read of the Twilight script this Friday evening, 5 p.m. Pacific.
That sounds very exciting.
It is something to do to distract yourself from the void.
Team Edward.
See you there.
See you there.
And I am going to be doing the thing that I normally do, which is staring into the abyss and going, hey, hey, quit being an abyss.
You're really bumming us all out, Abyss.
You and the Abyss have great chemistry.
We do.
We do.
And the Abyss has made me a lot of money, a lot of money, which is something that I feel very, very conflicted about.
Staring Into The Abyss 00:03:22
The Abyss is rich.
That's the thing that Nietzsche missed is sometimes when you stare into the Abyss, you get a six-figure salary because it's incredibly profitable to talk about the Abyss on a podcast.
Yeah, The Abyss has facial recognition software and it's pretending to be me elsewhere.
Yeah.
Jamie the Abyss Loftus.
That's what I've been called.
It's been said.
You can follow us on Twitter and Instagram where you're probably being watched at Bastards Pod.
You can follow Robert on Twitter at iWriteOK.
You can buy stuff from our TeePublic store and also the Bechdel Cast TeePublic store, where Jamie designs all the artwork, and it's amazing.
I think I covered everything.
Wash your hands, wear a mask, yell into the abyss.
Yeah.
Oh, wait, Robert, did I show you my bedazzled bolt cutters?
I'll send you a picture of them.
Bye, bolt cutters.
No, I would love to see your bedazzled bolt cutters.
Yeah, I have a pair of bolt cutters that are still usable, but also mostly covered in rhinestones.
I'll send it to you.
Yay!
Yay!
That's the episode.
Hell yeah.
Hey there, folks.
Amy Robach and T.J. Holmes here.
And we know there is a lot of news coming at you these days from the war with Iran to the ongoing Epstein fallout, government shutdowns, high-profile trials, and what the hell is that Blake Lively thing about anyway?
We are on it every day, all day.
Follow us, Amy and TJ, for news updates throughout the day.
Listen to Amy and TJ on the iHeart Radio app, Apple Podcasts, or wherever you listen to podcasts.