Mark Zuckerberg and Facebook's internal documents reveal a corporate culture prioritizing engagement over safety, where algorithms amplified anger to profit while ignoring data showing 32% of teen girls feel worse about their bodies. Despite internal recommendations to reduce hate speech by 5%, leadership rejected interventions that could have limited violence, instead whitelisting 5.8 million VIPs from enforcement rules while lying to the board. This negligence, spanning from the January 6th riots to ethnic violence in Myanmar, exposes a colonial-style exploitation of user data, suggesting that without strict regulation, platforms will continue trading ethical responsibility for maximum engagement and profit. [Automatically generated summary]
Transcriber: nvidia/parakeet-tdt-0.6b-v2, sat-12l-sm, and large-v3-turbo
Roald Dahl Spy Secrets 00:02:16
This is an iHeart podcast.
Guaranteed human.
You know the famous author Roald Dahl.
He thought up Willy Wonka and the BFG.
But did you know he was a spy?
Neither did I. You can hear all about his wild life story in the podcast, The Secret World of Roald Dahl.
All episodes are out now.
Was this before he wrote his stories?
It must have been.
What?
Okay, I don't think that's true.
I'm telling you, I was a spy.
Binge all 10 episodes of The Secret World of Roald Dahl.
Now on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Readers, Katie's finalists, publicists.
We have an incredible new episode this week for you guys.
We have our girl Hilary Duff in here, and we can't wait for you to hear this episode.
They put on Lizzie McGuire at 2 a.m. video on demand.
This guy's playing.
2 a.m.
2 a.m.
Whatever time it is.
Lizzie McGuire, and I'm like, in a bad way.
It was like a first closet moment for me where I was like, I don't feel like she's hot like the rest of them.
No, no, no.
I was like, she's beautiful.
I'm appreciating her in a different way than these boys are.
I'm not like...
Listen to Las Culturistas on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
This is Saigon, the story of my family and of the country that shaped us.
From iHeart Podcasts, Saigon.
You're gonna think I'm serious about a free Vietnam?
One city, a divided country, and the war that tore America apart.
For Vietnam.
Freedom for Vietnam.
There's a fire coming to this country and it's going to burn out everything.
Listen to Saigon, starting on April 22nd on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
On the Ceno Show podcast, each episode invites you into a raw, unfiltered conversation about recovery, resilience, and redemption.
On a recent episode, I sit down with actor and cultural icon Danny Trejo to talk about addiction, transformation, and the power of second chances.
The entire season two is now available to binge, featuring powerful conversations with guests like Tiffany Haddish, Johnny Knoxville, and more.
I'm an alcohol without this probe.
I'm a guy.
Listen to Ceno's show on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Twitter Radicalization Crisis 00:14:45
Let's do it.
Let's start the podcast.
Let's start the podcast.
All right.
Well, let's have that be what starts the podcast, what we just said.
Let's start the podcast.
Well, I'm Robert Evans.
Yep.
I'm Sophie Lichterman.
I never introduced myself.
I'm Jamie.
Who are you?
Is there Jamie Loftus?
Yes.
Anything more we need to say?
Are we done with the episode?
Anderson's here.
No, I think that, yeah.
Yeah.
Anderson is here.
Sure.
Well, you know what's happening in the world?
No.
Facebook is happening to the world, and it's bad.
That's unfortunate.
It's not great, Jamie.
It's not great, Sophie.
Not a fan of the Facebook.
We left off having gone through some of the Facebook papers, particularly employees attacking their bosses after Jan 6th when it became clear that the company they were working for was completely morally indefensible.
I wouldn't call it an attack.
They already knew.
Yeah, they already knew.
I wouldn't call it attacking either.
I would call it attacking.
They were pretty.
I mean, there's a guy, like the quote there, there was the guy who was like, history won't judge us kindly.
The guy who was like, yeah, when we didn't ban Trump in 2015, that's what caused the Capitol riot.
I mean, facts are facts.
Is that really attacking?
If you're just like, well, I think, yeah, I think stating facts can be an attack.
Whoa.
Okay, put it on a t-shirt.
I mean, for people like this, you know, yeah, I think stating facts can be an attack.
And we ended part one by sharing some of the blistering criticisms of Facebook employees against, you know, management and the service itself.
So as we start part two, it's only proper that we cover how Facebook responded to all of this internal criticism.
As I stated last episode, Facebook is in the midst of a years-long drought of capable engineers and other technical employees.
They are having a lot of trouble hiring all of the people that they need for all of the things they're trying to do.
So one of the things is, for a lot of these employees, when they say things that are deeply critical, Facebook can't just dismiss their concerns outright, because if they were to do that, these people would get angry, and they need them, right?
Facebook's not in the strongest position.
When it comes to the people who are good engineers, they have to walk a little bit of a tightrope.
However, if they were to actually do anything about the actual meat of the concerns, it would reduce profitability and in some cases destroy Facebook as it currently exists.
So they're not going to do anything, which has meant that they've had to get kind of creative with how they respond.
So Mark and his fellow bosses pivoted and argued that the damning critique, like.
We're calling him Mark now.
Yeah, old Zucky Zuck.
So when this all comes out and people are like, boy, it sure seems like all of your employees know that they're working for the fucking Death Star.
Zuckerberg and his mouthpieces made a statement that all of these damning critiques from people inside the company were actually evidence of the very open culture inside Facebook, which encouraged workers to share their opinions with management.
That's exactly what a company spokesperson told The Atlantic when they asked about comments like, history will not judge us kindly.
The fact that they're saying we'll be damned by historians means that we really have a healthy office culture.
Hashtag Death Star Proud.
Yeah.
Death Star Proud, everybody.
Yeah, yeah.
It's like the fact that...
Well, remove the stigma of working for the devil, right?
I mean, come on.
The devil, I would be proud to work for because he's done some cool stuff.
Like, have you ever been to Vegas?
Nice town.
Oh, I've been to Vegas.
I saw the Backstreet Boys in Vegas right before two of them were revealed to be in QAnon.
So really caught the end of that locomotive.
Oh, wow.
I did not realize that a sizable percentage of the Backstreet Boys had gotten into QAnon.
That makes total sense.
The Backstreet Boys are from Florida.
They're ultimately five men from Florida.
So what can you do?
As the author of that Atlantic article noted, this stance allows Facebook to claim transparency while ignoring the substance of the complaints.
And the implication of the complaints, that many of Facebook's employees believe their company operates without a moral compass.
All over America, people used Facebook to organize convoys to D.C. and to fill the buses they rented for their trips.
And this was indeed done in groups like the Lebanon Maine Truth Seekers, where Kyle Fitzsimons posted the following, quote, this election was stolen and we were being slow-walked towards Chinese ownership by an establishment that is treasonous and all too willing to gaslight the public into believing the theft was somehow the will of the people.
Would there be an interest locally in organizing a caravan to Washington, D.C. for the Electoral College vote count on January 6th, 2021?
Yeah, and Kyle recently pled not guilty to eight federal charges, including assault on a police officer.
Mark Zuckerberg would argue that Facebook didn't play a significant role in organizing January 6th and couldn't have played a significant role in radicalizing this guy and many other people.
But the reality is that part of what led Kyle Fitzsimons to go assault people on January 6th was the fact that he had been radicalized by a social network that for years made the conscious choice to amplify angry content and encourage anger, because it kept people on the site more, right?
Like all of the anger that boiled up at January 6th, that came from a number of places, but one of those places was social media because social media profited and specifically Facebook knowingly profited from making people angry.
That was the business.
And of course it blew up in the real world.
I have a question just out of your own experience and observation, which is how do you, like if you're doing a side-by-side case study of how Facebook responded to events like this versus how like YouTube slash Google responded to radicalization, are there like significant differences?
Did anyone do better or different things?
Yes, Twitter has done better than probably most of them.
I mean, and again, I'm not trying to say that Twitter's done well or that YouTube has done well, but they've both done, particularly with coronavirus disinformation, a bit better than Facebook.
And not YouTube so much, but Twitter has definitely been the most responsible of the social networks around this stuff.
It did seem like for a while the various networks were kind of like duking it out to see who could do the absolute worst and damage people's lives.
And it seems like Facebook won.
I would say Facebook.
And again, Twitter chose to do a lot of the same toxic things Facebook did.
So did YouTube.
And they did it all for profit.
A number of the things we've criticized Facebook for, you can critique YouTube and Twitter for.
I would argue Twitter certainly has done more and more effectively than Facebook.
Not enough that they're not being irresponsible, because I would argue that Twitter has actually been extremely irresponsible and knowingly so.
But I think Facebook, in my analysis, Facebook has been the worst.
Although I haven't studied TikTok as much yet, so we'll see.
But my analysis is...
You got to get on TikTok.
I need to pivot out of podcasting and into TikTok dances.
Yeah, I mean, it's not the dances that concern me on TikTok.
It's the minute-long conspiracy theory videos that have convinced a number of people that the Kardashians are Armenian witches and had something to do with the deaths at Astroworld.
My concern there is the dances that go over those conspiracy videos and really marry the worst of both worlds.
Yeah, I'm sure.
Because I have seen dancing on TikTok.
I have seen conspiracy videos that involve dancing and skincare routine.
Have you ever seen a conspiracy video where someone's also doing their skincare routine?
Because that is a thriving subject.
I'm sure that's...
Yeah, so well, all of them.
I was like, that is just a thing that is on many platforms.
I will say all media companies are willfully bad at stopping radicalization because making people angry and frightened is good for all of their bottom lines.
So they all knowingly participate in this.
I think Facebook has been the least responsible about it, but that shouldn't be taken as praise of anybody.
Like saying Twitter has done the best is saying, well, we were all drunk driving, but John could actually walk most of a straight line before vomiting.
So he was the least irresponsible of us who drunk drove that night.
Just to put it in terms that I understand, it sounds like Twitter is the backstreet boy that's like, look, I don't believe in QAnon, but I see their points.
That's kind of the vibe I'm getting.
Fair enough.
So when deciding which posts should show up more often in the feeds of other users, Facebook's algorithm weighs a number of factors.
The end goal is always the same, to get the most people to spend the most time interacting with the site.
For years, this was done by calculating the different reactions a post got and weighing it based on what responses people had to it.
And again, for years, the reaction that carried the most weight was anger, the little snarling, smiley face icon you could click under a post.
It was at one point being weighted five times more than just like a like.
Really?
Like, again, when I'm saying this was all intentional, they were like people who respond angrily to posts.
That keeps them on the site more.
They spend the most time engaging with things that make them angry.
So when it comes to determining how we choose to have the algorithm present people with posts, the posts that are making people angriest are the posts our algorithm will send to the most people.
That's a conscious choice.
That's a conscious, yeah.
It's so funny how, I mean, not funny, it's tragic and upsetting, but just how specific the Facebook audience is, that it's like you would have to be the kind of person who would be like, I better react angry, I'd better be as specific as possible in my feedback to this post.
Which is FarmVille moms and all of that ilk.
It's boomers.
Yeah.
And yeah, they just kind of knowingly set off a bomb in a lot of people's fucking brains.
They're addicted to telling on themselves for no reason.
Yeah.
Why?
Why?
Anyways.
Facebook has something called the integrity department.
And these are the people with the unenviable task of trying to fight misinformation and radicalization on the platform.
They noted in July 2020.
It's so embarrassing.
Imagine going on a first date.
Yeah.
Just going on a first date and be like, I work for the Facebook integrity department.
Yeah, good fucking luck.
Yeah, I work for the anger.
My job is to go door to door and apologize to people after we bomb them.
We have gift baskets for the survivors.
That's the gig, really.
Yeah, I send edible arrangements to people who have been drone striked.
Like, oh, Jesus, awful.
There's one of my favorite follows on Twitter is Brooke Binkowski, who used to work for Facebook and was like one of the people early on who was trying to warn them about disinformation and radicalization on the platform years ago and left because it was clear they didn't actually give a shit.
And a lot of the integrity department people are actually really good people who are a little bit optimistic and kind of young and come in like, okay, it's my job to make this huge and important thing a lot safer.
And these people get chewed up and spit out very, very quickly.
And members of the integrity team were kind of analyzing the impact of weighing angry content so much.
And some of them noted in July 2020 that the extra weight given to the anger reaction was a huge problem.
They recommended the company stop weighing it extra in order to stop the spread of harmful content.
Their own tests showed that dialing the weight of anger back to zero, so it was no more influential than any other reaction, would stop rage-inducing content from being shared and spread nearly as widely.
This led to a 5% reduction in hate speech, misinformation, bullying, and posts with violent threats.
And when you consider how many billions of Facebook posts there are, that's a lot less nasty shit, some of which is going to translate into real world violence.
And again, this was kind of a limited study, so who knows how it would have actually affected things in the long run.
So Facebook made this less money, question mark.
Yeah, this actually was kind of a win for them.
Facebook did make this change.
They pushed it out in September of 2020.
And the employees responsible deserve real credit.
Again, there's people within Facebook who did things that really actually were good.
Like changing this probably like made the world a bit healthier.
That said, the fact that it had been weighted this way for years, you don't undo that just by dialing it back now.
For one thing, anger has become such an aspect of the culture of Facebook that even without weighting the anger emoji, most of the content that goes viral is still stuff that pisses people off, because that's just become what Facebook is, because that's what they selected for for years.
Like it also, like, who knows, like if they'd done this years ago, if they'd never weighted anger more, it might be a very different platform with like a very different impact on the brains of, for example, our aunts and uncles.
I think that's really interesting too, because that timeline lines up pretty exactly with where it feels like a lot of younger people were leaving that platform and the platforms became associated with older people.
Because I feel like I don't think I was using Facebook consistently after 2017, I want to say, was maybe my last Facebook year.
Yeah, I stopped.
I mean, I stopped visiting it super regularly a while back.
Yeah, maybe around 2017.
Right.
So in April of 2020, Facebook employees came up with another recommendation.
And this one wouldn't be as successful as the change to the weighting of the angry reaction.
Spurred by the lockdown and the sudden surge of QAnon, Boogaloo, and anti-lockdown groups urging real world violence, it was suggested by internal employees that the newsfeed algorithm deprioritize the posting of content based on the behavior of people's Facebook friends.
So the basic idea is this.
The way you'd think it would work, right, is that your friends post something and you see that in your newsfeed, right?
Like the posts of the people that you've chosen to follow and say are your friends, right?
That's how you would want it to work.
That's how it worked at one point.
They made a change a few years back where they started sending you things not because someone you followed had said something, but because they'd liked a thing, or they'd commented, like not necessarily even commented, just liked a thing.
Like if they'd reacted to a thing, engaged with it in any way, you would get that sent to your newsfeed.
Facebook Newsfeed Mechanics 00:04:17
And members of the integrity team start to recognize like this has some problems in it.
For one thing, it results in a lot of people getting exposed to dangerous bullshit.
So they start looking into like the impact of this and how just sharing the kind of things your friends are reacting to influences what you see and what that does to you on Facebook.
The integrity team experimented with how changing this might work.
And their early experiments found that fixing this would reduce the spread of violence inciting content.
For one thing, what they found is that normally, if you saw a post about something that was maybe violent or aggressive or conspiratorial, like a flat earth post or a post urging the execution of an elected leader, but you hadn't seen anyone that you knew react to that post, you wouldn't comment on it or share it.
But they found that like if you just saw that a friend had liked it, you were more likely to share it, which increases exponentially the spread of this kind of violent content.
And it's the same idea as how, at a certain point, people stopped being as afraid to be openly racist as they had been earlier.
And it led to this surge in real world violence.
It is kind of the same thing.
People felt by seeing their friends react to this, they felt permission to react to it too.
In a way, maybe they would have liked, well, I don't want to like, maybe I'm interested in flat earth shit, but I'm just going to ignore this because I don't want to seem like a kook.
That is so fucking upsetting and fascinating in the way that it affects your mind is, yeah, there was a time where you would, if you were, you know, racist, misogynist, homophobic, whatever you were, but you just didn't talk about it.
But then all of a sudden there's this confirmation that like, hey, this person you know and see all the time feels the same fucking way you do.
So why be quiet about it?
Yeah.
Let's discuss.
Like it's just, that's so dark.
It's really dark.
And so the integrity team sees this and they're like, we should change this.
We shouldn't be showing people content just because of the reactions their friends have had to it, because it seems to be bad for everybody.
And they do find in some of their, you know, because when they experiment, they're like, we'll take this country or this city and we'll roll this change out in this limited geographical location to like try and see how it might affect its scale.
And they do this and they see that like, oh, changing this significantly reduces the spread of specifically violence inciting content.
So they're like, hey, we should roll this out service-wide.
Zuckerberg himself steps in, according to Frances Haugen, the whistleblower, and quote, rejected this intervention that could have reduced the risk of violence in the 2020 election.
From The Atlantic, quote, an internal message characterizing Zuckerberg's reasoning says he wanted to avoid new features that would get in the way of meaningful social interactions.
But according to Facebook's definition, its employees say engagement is considered meaningful even if it entails bullying, hate speech, and reshares of harmful content.
The episode, like Facebook's response to the incitement that proliferated between the election and January 6th, reveals a fundamental problem with the platform.
Facebook's mega scale allows the company to influence the speech and thought patterns of billions of people.
What the world is seeing now through the window provided by reams of internal documents is that Facebook catalogs and studies the harm it inflicts on people, and then it keeps harming people anyway.
See, that's always so interesting to hear.
And by interesting, I mean, you know, psychologically harmful.
Yeah.
Because it's like, yes, that is a fundamental flaw of the platform, but that's also very entrenched into like what the DNA of the platform always was, which was based on harshly judging other people.
Like that's why Mark Zuckerberg created Facebook, was to harshly judge women in his community.
So it's like, I know that it is, you know, on a bajillion scale at this point, but I'm always kind of stunned at how people are like, oh, it's so weird that this went, you know, the way that it did.
It's like, well, to an extent, it was always like that.
And maybe it was like cosplaying as not being like that.
And for certain people, there were eras in Facebook where your user experience wouldn't be like that.
But it, you know, this goes back almost 20 years at this point of this being in the DNA of this shit show.
Mega Bus Adventures 00:02:31
Yeah.
And it's really bleak.
It's just really bleak.
And it also goes to show like the one of the things Zuckerberg will say repeatedly when he talks about when he does admit, like, yes, there are problems and there have been like negatives associated with the site and we're aware of that.
They're humbling.
But like, you know, you also have to include all the good that we're doing, all of the meaning.
And the way he always phrases this is like all of the meaningful social interactions that wouldn't have happened otherwise.
And then you realize when every time he says that.
Name five meaningful social interactions that have taken place on Facebook.
When he says that, as these internal documents show, he's including bullying and people making death threats and talking about their desire to murder people.
Like that's a meaningful interaction.
People getting angry and trying to incite violence together is a meaningful social interaction, which I guess, yes.
Hate is, I mean, not meaningless. It does have meaning.
Yeah, Klan meetings were meaningful social interactions, you know?
You got to give the KKK that.
The Nuremberg rally was a meaningful interaction.
The last meaningful interaction I had on Twitter led to like a rebound I was dating, coming to my grandma's funeral blackout drunk.
So, you know, it's all just like, oh man.
God, it's been too long since I've shown up at a funeral just too drunk to stand.
It is still one of my favorite memories with my family to this day.
They're like, who is this guy?
And I'm like, I don't really know.
He's drunk as shit, though.
He came on the mega bus.
Hell yeah, he did.
Hell yeah.
He's fucked up on the mega bus.
Getting drunk from a camelback on a mega bus.
Yeah.
That would be when I used to do a lot of bus trips, like when I was traveling and stuff, that would be one of the tactics is you fill like a thermos or a camelback with like 40% cranberry juice, 60% liquor, and just get it.
No, it's awesome.
I'm not above getting fucked up on a mega bus, but on your way to my grandma's funeral, that was a move.
Me and my friends got like wasted in San Francisco one day, just like going shopping in broad daylight with a camelback where we would get a bottle of orange-flavored Trader Joe's Patron tequila, and we would get a half dozen lime popsicles, and you just throw the popsicles in with the Patron in the camelback, and throughout the day it melts, and you just have constant cold margarita.
It's actually fucking amazing.
That fucking rocks.
I wish I knew that when I was 22.
San Francisco Night Out 00:03:26
Oh, yeah, I recommend it heavily.
You will get trashed and people don't notice.
Dude, walking around with a fucking camelback in San Francisco, nobody gives a shit.
Oh my God, you're basically camouflaged.
Yeah.
You know who else is camouflaged?
Who?
The products and services that support this podcast.
Camouflaged to be more likable to you by being wrapped in a package of the three of us.
That's how ads work.
I thought you were saying that you were taking ads from the U.S. Army Recruitment Center again.
I mean, it's entirely possible.
But at the moment, we're just camouflaging, I don't know, whoever comes on next, whoever comes on next, you'll feel more positively about because of our presence here.
Wow.
That's how ads work.
It's good stuff.
On a recent episode of the podcast Money and Wealth with John Hope Bryant, I sit down with Tiffany "The Budgetnista" Aliche to talk about what it really takes to take control of your money.
What would that look like in our families if everyone was able to pass on wealth to the people when they're no longer here?
We break down budgeting, financial discipline, and how to build real wealth, starting with the mindset shifts too many of us were never, ever taught.
Financial education is not always about like, I'm going to get rich.
That's great.
It's about creating an atmosphere for you to be able to take care of yourself and leave a strong financial legacy for your family.
If you've ever felt you didn't get the memo on money, this conversation is for you to hear more.
Listen to Money and Wealth with John Hope Bryant from the Black Effect Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hey, Ernest, what's up?
Look, money is something we all deal with, but financial literacy is what helps turn income into real wealth.
On each episode of the podcast, Earn Your Leisure, we break down the conversations you need to understand money, investing, and entrepreneurship.
From stocks and real estate to credit, business, and generational wealth, we translate complex financial topics into real conversations everyone can understand.
Because the truth is, most people were never taught how money really works.
But once you understand the system, you can start to build within it.
That means ownership, smarter investing, and creating opportunities not just for yourself, but for the next generation.
If you want to learn how to build wealth, understand the markets, and think like an owner, Earn Your Leisure is the podcast for you.
Listen to Earn Your Leisure on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Will Ferrell's Big Money Players and iHeart Podcasts present Soccer Moms.
So I'm Leanne.
This is my best friend Janet.
Hey.
And we have been joined at the hip since high school.
Absolutely.
Now a redacted amount of years later, we're still joined at the hip.
Just a little bit bigger hips, wider.
This is a podcast.
We're recording it as we tailgate our youth soccer games in the back of my Honda Odyssey with all the snacks and drinks.
Sidebar.
Why did you get hard seltzer instead of beer?
Oh, they had a BOGO.
Well, then you got it.
You had White Claws up here.
Just what are y'all doing?
Microphones?
Are you making a rap album?
I would buy it.
Cuts through the defense like a hot knife through sponge cake.
That sounds delicious.
Oh, you're lucky.
I'm not a drug addict.
You're lucky.
I'm not an alcoholic.
Instagram Suicide Risks 00:15:55
You're lucky I'm not a killer.
I love this team and I'm really trying to be a figure in their lives that they can rely on.
Oh.
Listen to Soccer Moms on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hello, gorgeous.
It's Lala Kent, host of Untraditionally Lala.
My days of filling up cups at SUR may be over, but I'm still loving life in the valley.
Life on the other side of the hill is giving grown-up vibes.
But over here on my podcast, Untraditionally Lala, I'm still that Lala you either love or love to hate.
I've been full-on oversharing with fans, family, and former frenemies like Tom Schwartz.
I had a little bone to pick with Schwartzy when he came on the pod.
You don't feel bad that you told me I was a bootleg housewife?
I almost flipped a pizza in your lap.
Oh my god, I literally forgot about that until just now.
Sorry, I don't want to blame all of that.
I got to blame that one on the alcohol.
This is about laughing and learning when life just keeps on laughing.
Because I make mistakes so that you guys don't have to.
We're growing, we're thriving, and yes, sometimes we're barely surviving, but we do it all with love.
Listen to Untraditionally Lala on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Oh, we're back.
My goodness.
What a good time we're all having today.
How are you doing, Jamie?
You make it okay.
You made it sound sarcastic.
I am having a good time.
Well, I'm glad.
I'm happy that you're having a good time.
That's my only goal for this show and for you as a good time.
See, now you're doubling down on it and I'm getting insecure.
I'm doubling down and I'm also talking more and more like an NPR talking head as I get quieter by the bit.
Now I'm going to start having a panic attack.
I've never heard you talk this.
I know.
This is how I talk to my cats when I'm angry at them.
Rar, honestly, I feel like we do have that dynamic.
I feel like I'm a cat that you get angry at sometimes.
Yeah, because you jump on my desk and knock over my Zevia.
It's infuriating.
It's just for attention.
I know.
It's just for attention.
But I've got to work to keep you in expensive cat food.
I only feed my cats the nice wet food.
I would rather have your attention than really nice food.
Okay.
No, that's not what my cats say.
So there's just a shitload to say about how Facebook negatively impacts the increasingly violent political discourse in the United States and how they helped make January 6th happen.
But I think the way I'd like to illustrate the harm of Facebook next is a bit less political.
It also occurs on a different Facebook product.
I'm talking about Facebook, the company generally when I refer to Facebook, but now we're going to talk about Instagram.
In part one, I mentioned that young people felt that removing likes from Instagram temporarily corresponded with a decrease in social anxiety.
The impact of Instagram specifically on the mental health of kids and teens can be incredibly significant.
One of the other Facebook internal studies that was released as part of the Facebook papers was conducted by researchers on Instagram.
The study, which again almost certainly would never have seen the light of day if a whistleblower hadn't released it, found that 32% of teen girls reported Instagram made them feel worse about their body.
22 million teenagers in the United States log on to Instagram on like a daily basis.
So that's millions of teen girls feeling worse about their body because of Instagram.
I've never been less surprised at learning a thing.
It's barely a revelation.
Well, good news, it gets worse.
Like, no fucking kidding.
So good.
These researchers released their findings internally in March of 2020, noting that comparisons on Instagram can change how young women view and describe themselves.
Again, not surprising.
So company researchers have been investigating the way that Instagram works, though, for quite a while, years.
About three years that they've been doing this seriously.
And their previous findings all back up the same central issues.
Photo sharing in particular is harmful to teen girls.
One 2019 report concluded, we make body image issues worse for one in three teen girls.
Its findings included this damning line.
Teens blame Instagram for increases in anxiety and depression.
This reaction was unprompted and consistent across all groups.
So, like, they almost always mention that this app specifically makes them feel worse about their body, and we don't have to prompt them at all.
Like, this just comes up when they talk about Instagram.
I mean, that's that truly it's so, Sophie.
I don't know how you feel.
I mean, I truly think that, like, because I've been on Instagram since what, like 2014, some shit?
No, earlier.
I think I put that on earlier.
It was around when we were in high school.
I truly think my life and my relationship to my body would be very different.
100% had not been on that app for the better part of a decade.
Yeah, I mean, especially when they introduced filters.
Yeah, we're about to talk about that.
So here's the kicker.
And by kicker, I mean the bleakest part.
In teens who reported suicidal thoughts, 13% of teens in the UK and 6% of teens in the United States claimed their desire to kill themselves started on Instagram.
That's fucking disgusting and terrible.
That's pretty bleak.
More than 40% of Instagram.
I just like, I wish I were more surprised.
I'm just good to have this data.
The data shows that more than 40% of Instagram users are less than 22 years old, which means you've got 22 million teens logging onto the service in the U.S. every day.
6% of those people becoming suicidal as the result of Instagram is 1.32 million children who started wanting to kill themselves while using Instagram.
Hey, everybody, Robert Evans here, and I actually screwed up the math that I just cited, which is often the case when I do math.
So anytime I do math of my own in an episode, you're right to question me.
I was calculating 6% of 22 million, basically.
But as the study noted, it's 6% of kids who are suicidal say that their suicidal feelings started on Instagram.
So I wanted to recalculate that.
About 72 to 76 percent of American teens, kind of depending on the source, use Instagram.
There are about 42 million teenagers in the United States.
So I calculated from that, and about 18 to 19 percent of high school students, of teenagers, seriously considered attempting suicide.
So if we're just counting serious attempts or people who seriously considered attempting suicide, that's 5,745,600 teens who seriously considered suicide.
6% of those, if 6% of those kids had their suicidal feelings start on Instagram, that's 344,736 children in the United States whose suicidal feelings started on Instagram.
And I furthermore found that about 9% of kids who seriously attempt suicide or seriously consider suicide attempt it.
So of that 344,736 American teens whose suicidal feelings started on Instagram, about 31,026 kids attempt suicide.
So about 31,000 kids in the United States on an annual basis attempt suicide because of suicidal feelings that started on Instagram.
So that is the more accurate look at the data.
And I apologize, as always, for the error.
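For what it's worth, the corrected chain of percentages described above does check out. As a rough sketch, using only the approximate figures stated in the episode (42 million US teens, ~76% on Instagram, ~18% seriously considering suicide, 6% of those tracing the feelings to Instagram, ~9% of those attempting):

```python
# Recomputing the corrected estimates cited above, using the episode's
# approximate inputs. round() guards against floating-point drift.
us_teens = 42_000_000          # ~42 million teenagers in the United States
instagram_share = 0.76         # ~72-76% of US teens use Instagram
considered_share = 0.18        # ~18-19% seriously considered attempting suicide
started_on_ig_share = 0.06     # 6% of those say the feelings started on Instagram
attempt_share = 0.09           # ~9% of those who seriously consider it attempt it

seriously_considered = round(us_teens * instagram_share * considered_share)
started_on_instagram = round(seriously_considered * started_on_ig_share)
attempts = round(started_on_instagram * attempt_share)

print(seriously_considered)    # 5745600
print(started_on_instagram)    # 344736
print(attempts)                # 31026
```

These match the 5,745,600, 344,736, and roughly 31,000 figures given in the correction; bear in mind this stacks several rounded survey percentages on top of each other, so it is an order-of-magnitude estimate, not a precise count.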
But what's interesting is that these studies document like Facebook is as physically harmful at scale as like a wide variety of narcotics.
Like most narcotics probably are less harmful at scale physically than Instagram.
I think weed certainly is.
Oh my God, if every teenager was smoking weed instead of doom scrolling on Instagram, the world would just be so fucking good.
If they were chain smoking cigars instead of being on Instagram, we might be better off.
It's so weird because I think about like how I don't know, whatever.
Like I'm in my late 20s, so I feel like I have like a little bit of memory of like what life was like before you were constantly being encouraged to compare yourself to every single person you've ever met in your life, regardless of whether you know who they are, how they are, whatever.
And I just may call me nostalgic, but I liked how I felt better.
Yeah.
Like it's so absurd how much I know about people I don't give a shit about and how bad it makes me feel to know about the curated lives of people that I don't give a shit about and how I let that actively affect my daily life.
And it's just, yeah, it's just fucking miserable.
It is.
It's horrible.
It's horrible.
That said, I like flirting on the application.
So, you know, sure.
Now, here's why, despite the documented harm that Instagram does, nothing's ever going to change.
As I stated, 22 million U.S. teens use the Gram daily.
Only 5 million log on to Facebook.
So Instagram is almost five times as popular among teenagers as Facebook, where kids are leaving in droves.
So Facebook, Mark Zuckerberg's invention, is now definitively just the terrain of the olds.
And Facebook knows it.
Kids are never going to come back because that's not how being a kid works.
Like you don't get them back.
They're going to continue to do new shit.
Eventually, they'll leave Instagram for something else.
That's just the way it fucking goes.
Unless the 30-year nostalgic cycle is like Facebook is actually back now.
It's actually cool.
I just don't think it gave anybody a good experience enough to have that.
It's not the fucking teenage mutant ninja turtles.
Yeah, no one's getting dopamine hits.
That's a good thing.
It's not like flaming hot cheetos.
Nobody's thinking fondly back to scrolling Facebook when they were seven.
They're thinking back to, I don't know, SpongeBob SquarePants.
Oh, they should.
And as well, they should.
But at the moment, Instagram is very popular with teens.
And Facebook knows that if they're going to continue to grow and maintain their cultural dominance, they have to keep bringing in the teens.
They have to keep Instagram as profitable and as addictive as it currently is.
And that's why they bought Instagram in the first place.
They only paid like a billion dollars for it.
It was an incredible investment.
And they spend 50% more time on Instagram.
It's only a lemonade in a day.
Yeah, that's cheap as hell for something as influential and huge as Instagram.
That is not real.
I wonder, do you know what it's worth now?
I would guess significantly more than a billion dollars.
But I don't entirely know what its value is.
But Facebook's like a trillion-dollar company now.
That's really good.
And Facebook fucking sucks.
And it's fucking.
Well, but that includes Instagram, you know?
Oh, okay.
So the entire.
Yeah, and among, you know, teens are one of the most valuable demographics to have for advertisers, and Instagram is where the fucking teens go.
Do you want the number?
Its estimated value is $102 billion.
So yeah, I would say that.
That's a good investment.
That's a good investment.
That's a fucking good investment if money was made.
Yeah.
So the fact that so much is at stake with Instagram, the fact that it's such a central part of the company having any kind of future is part of why Mark and company have been so compelled to lie about it.
None of this stuff that we've been talking about was released when Facebook researchers got it.
Of course not.
They wouldn't want anyone to know this shit.
In March of 2021, Mark went before Congress, where he was criticized for his plans to create a new Instagram service for children under 13.
He was asked if he'd studied how Instagram affects children.
And he said, I believe the answer is yes.
So not yes.
I think we've studied that.
He told them then, the research we've seen is that using social apps to connect with other people can have positive mental health benefits.
And I'm sure there's something that he's gotten paid researchers to come up with that he can make that case off of.
I'm sure in certain situations, it may even be true.
There are ways you can use social media that are good.
I've legitimately smiled or had my heart warmed by things that happen on social media.
It doesn't not happen.
And I do think that there is a case for, like, I mean, and it's, you can't credit Mark Zuckerberg with it, but just, I mean, going back to fucking like live journal days of just like friendships that have deepened as a result of social media.
That's definitely a thing, but the costs outweigh the benefits there by quite a bit.
Yeah.
It's great.
So Mark goes on to say, you know, I think we've got research that shows it can have positive mental health effects.
You know, I think we've studied whether or not how it affects children.
But he doesn't talk about it, he leaves out all the stuff that I, all the statistics, like about all the kids whose suicidal ideation starts on Instagram.
They had that data when he went before Congress.
He just didn't mention it.
They had covered that shit up.
Like he didn't say a goddamn word about it.
Yeah.
He was like, yeah, I think we've looked into it.
And, you know, there's some ways in which it can be healthy.
Not, and also 1.3 million American kids became suicidal because of our app.
Like, he did not throw that info out.
Did he throw that?
I mean, truly, I'm like up in the air of like, did he not say that because he didn't want people to know?
Or did he just say that because he heard it and he didn't care and he forgot?
Like, you just don't know what that guy is so fucking evil.
Wow.
Yeah.
It's pretty great.
And we'll talk more about that later.
In May of 2021, Instagram boss Adam Mosseri told reporters that he thought any impact on teen well-being by Instagram was likely, quote, quite small based on the internal research he'd seen.
Again, they haven't released this research.
He's saying, oh, we have research and it says that any kind of impact on well-being is pretty small.
And again, the actual research by this point showed 13% of kids in the UK and 6% of kids in the United States were moved to thoughts of suicide by Instagram, which I would not call small.
I would not call, I wouldn't necessarily say it's huge, but that is not a small impact.
No, that is like thousands and thousands and thousands and possibly millions of kids.
The Wall Street Journal caught up with Mosseri after the Facebook papers leaked.
So they were able to like drill him on this a bit.
And he said a bit more.
Quote, in no way do I mean to diminish these issues.
Some of the issues mentioned in this story aren't necessarily widespread, but their impact on people may be huge, which is like, again, a perfect non-statement.
That's right.
They're like, but what about the thing we couldn't possibly gauge at all versus the thing that we did and we're actively distancing ourselves from?
I mean, those statistics, that's like at least one kid in every classroom.
Like that is gigantic.
And when you read the responses of guys like Mosseri and compare them to the responses of people like Mark Zuckerberg and official corporate spokespeople, it's very clear that they're working from the same playbook, that they're very disciplined in their responses.
Because Mosseri does try to tell the journal that he thinks Facebook was late to realizing there were drawbacks in connecting people in such large numbers.
But then he says, I've been pushing very hard for us to embrace our responsibilities more broadly, which again says nothing.
He then pivots from that to stating that he's actually really proud of the research they've done in the mental health effects on teens, which again, they didn't share with anybody and I would argue lied about by omission in front of Congress.
He's proud of this because he says it shows Facebook employees are asking tough questions about the platform.
Quote, for me, this isn't dirty laundry.
I'm actually very proud of this research, which is the same thing Zuckerberg said about his own employees damning the service after January 6.
Right.
I was going to say, that's the same exact thing as the, like, actually, you know, talking about how working for the Death Star is bad is like evidence of, oh, the Death Star actually has a really open work culture.
Like, no, I don't know.
I feel like there are a few, there are not many CEOs that are good at flipping a narrative, but Mark Zuckerberg is particularly bad at it.
Yeah.
And it's, I mean, part of why they can be bad at it is it doesn't really matter.
Or at least it hasn't fucking so far.
Sure.
Internal Research Confessions 00:09:29
But the patterns.
I mean, not enough to get a better figurehead.
No.
Yeah.
The pattern's pretty clear here.
When a scandal comes out, deny it until the information that can't be denied leaks out.
And then claim that whatever is happening at the site, whatever information you had about how harmful it is, is a positive because it means that you were trying to do stuff about it, even if you actually rejected taking action based on the data you had and refused to share it with anybody else.
Mosseri and Zuckerberg were also careful to reiterate that any harms from Instagram had to be weighed against its benefits, which I haven't found a ton of documentation on.
In fact, as the Wall Street Journal writes, in five presentations over 18 months to this spring, the researchers, Facebook researchers, conducted what they called a teen mental health deep dive and follow-up studies.
They came to the conclusion that some of the problems were specific to Instagram and not social media more broadly.
This is especially true concerning so-called social comparison, which is when people assess their own value in relation to the attractiveness, wealth, and success of others.
Social comparison is worse on Instagram, states Facebook's deep dive into teen girl body image issues in 2020, noting that TikTok, a short video app, is grounded in performance, while users on Snapchat, a rival photo and video sharing app, are sheltered by jokey features that keep the focus on the face.
In contrast, Instagram focuses more heavily on the body and lifestyle.
March 2020 internal research warns that the Explore page, which serves users photos and videos curated by an algorithm, can send users deep into content that can be harmful.
Aspects of Instagram exacerbate each other to create a perfect storm, the research states.
Yeah, I mean, again, not a shocking revelation over here.
It is, I mean, and I do think that that lets TikTok and Snapchat get off of it easy there.
Like they assert there is absolutely toxic body image culture on there.
And I feel like ThinSpo will thrive on any platform it fucking gloms itself onto.
But Instagram is particularly bad because it's like where so many lifestyle people have launched.
And there's so many headless women on Instagram.
It's, it is shocking.
There's so many, like, not like you macheted my head off, but like you're not encouraged to show your head by the algorithm, which sounds weird, but it is true.
The less, like, it is just very focused on how you physically look.
And then there's also this tendency to like tear people apart if they have edited their body to look a certain way when it's like, well, that the algorithm rewards editing your body to look a certain way and to do all this.
And it's, oh, you do bring up a good point where it's like going.
It's frustrating that it's important to critique Facebook in relation to its competitors, like TikTok and Snapchat.
That can lead to the uncomfortable situation of like seeming to praise them when they haven't done a good job.
They just haven't been as irresponsible.
It's kind of like attacking, like, Chevron.
If you look at all of the overall harms, including like their impact in covering up climate change, they're maybe the worst of the big oil and gas companies.
I don't know.
It's debatable.
But it's like if you're criticizing Chevron specifically, it doesn't mean BP is great.
You're just being like, well, these are the guys specifically that did this bad thing and they were the leaders in this specific terrible thing.
Other bad things are going on, but we can't, like, the episode can't be about how bad everyone is.
We're talking about Facebook right now.
We have these documents from inside Facebook.
I'm sure versions of this are happening everywhere else.
Listeners, in your everyday life, just don't use Facebook as a yardstick for morality.
You know what, Jamie?
You'll just end up letting a lot of people off for a lot of fucked up stuff.
I would say in your regular life, don't use Facebook is all the sentence we needed there.
Wow.
Wow.
So you asked, you were talking earlier about like, because Mark went up in front of Congress and was like, yeah, I think we've got research on this.
And I've definitely seen research that says it's good for kids.
We know everything I just stated, that quote I just read, everything like that's in those internal studies.
We know that Mark saw this.
We know that it was viewed by top Facebook leaders because it was mentioned in a 2020 presentation that was given to Mark Zuckerberg himself.
We know that in August of 2021, Senators Richard Blumenthal and Marsha Blackburn sent a letter to Mark Zuckerberg asking him to release his internal research on how his platforms impact child mental health.
We know that he sent back a six-page letter that included none of the studies we've just mentioned.
Instead, the letter said that it was hard to conduct research on Instagram and that there was no consensus about how much screen time is too much.
Meanwhile, their own data showed that 40% of Instagram users who reported feeling unattractive said that the feeling began while they were on Instagram.
Facebook's own internal reports showed that their users reported wanting to spend less time on Instagram, but couldn't make themselves.
And here's a quote that makes it sound like heroin.
Teens told us they don't like the amount of time they spend on the app, but feel like they have to be present.
They often feel addicted and know that what they're seeing is bad for their mental health, but feel unable to stop themselves.
That's Facebook writing about Instagram.
Like that's their own people saying this.
Like this is not some activist getting in here, you know?
That's so, I mean, it's, I guess, good on them, regardless of the level of self-awareness going on there.
That's, I mean, and what I was thinking about earlier when it comes to anytime Zuckerberg is in front of Congress or in front of political officials, I feel like for a lot of people, the takeaway and the thing that gets trending is how little political officials and members of Congress understand about how the internet works.
And that's like the funny story is like, oh, Mark Zuckerberg talked about an algorithm.
And they, and, you know, like, this is, this comes up all the time.
It comes up on Veep.
It came up on Succession, of just like how not internet literate the majority of people who decide how the internet works are.
And it's like, it almost becomes like a he, hee, ha, ha, old guy doesn't know how algorithm works.
But it's like, well, the consequence of that is that it ends up making Mark Zuckerberg look way cooler than he is.
And it also doesn't address the problem at all of like, no, Mark Zuckerberg is omitting something gigantic here.
And the majority of our, you know, lawmakers in Congress don't have the fucking, you know, cultural vocabulary to even understand that.
And that is like, and I guess it like makes for a couple of memes, but it's just like, no, this is bad.
Jamie, can you hear me?
Can you commit to cancel Finsta?
Do you remember that horror?
That was sad.
Oh, God.
That made me feel like, right, cancel Finsta.
I mean, I think that was the most recent one where it's like, okay, yeah, that is, you know, objectively funny.
But, but, like, the consequence of that is, I mean, that's ultimately a win for Instagram.
And that's a win for Facebook because it makes them look like they're operating on a level that the fucking government doesn't understand.
And meanwhile, you know, one kid in every classroom is suicidal as a result of the inability of law, like lawmaking officials to understand the effect that this has.
It's just, it makes me real mad, Robert.
And one of the most maddening things about this is that while these lawmakers don't understand and sound like idiots talking to Mark Zuckerberg, his own employees, these researchers who are part of the integrity team, these researchers studying the impact of Instagram on teens, know exactly how harmful it is.
And they are grappling in real time with like the damage their product is doing to children.
Members of these teams reported frustration at the fact that their colleagues often refuse to take their findings seriously.
One former researcher told the Wall Street Journal that we're standing directly between people and their bonuses when they try to reduce the harmful aspects of Instagram because like anything that reduces the harm is going to reduce its popularity.
It's going to cut down on time on site.
It's going to cut down on users.
And so everyone else at Facebook, but the integrity team gets paid by how much they increase engagement on the site.
That's where the bonuses come from.
So if you're trying to make Facebook less dangerous, a huge number of the most influential employees have a financially, a vested financial interest in you failing.
So that's what that boils down to is like there is a financial incentive for you convincing a 13-year-old girl that she should get fucking fillers in her face.
Like, what the fuck?
It's just it.
Now, all I'm going to try to sell the teens because it's ad break time, Jamie.
And I would like to.
Oh, oh, okay.
Okay.
If you're a teenager, if you're a teenager, you don't need Instagram.
That shit's bad for you.
You certainly don't need Facebook.
That's where the olds are.
But you know what's come back around to being cool and what I think your generation needs to get on board with?
Smoking huge comical cigars.
I'm talking the ones that are like nine inches long and like three inches wide.
Massive teens listening right now, go out to your local cigar shop.
Do not bribe a sad adult.
Robert.
The worst looking adult you find outside the cigar shop.
Pay him for huge cheap cigars.
Talk to me.
And chain smoke.
That's the new Instagram chain smoking horrible cigars.
No, no, no.
This is what's going to bring.
Stop buying teenagers.
Look, Zoomers.
I did, we did.
Zoomers, the air is not getting any cleaner, right?
You're all going to, we're all going to choke to death on wildfire smoke.
You might as well burn down a big fat Macanudo.
Robert, can I tell you, I bought a teenager a white claw the other day.
I'm so proud of you.
Ethnic Cleansing Metaphors 00:15:32
Good for you.
Thank you.
I felt good.
I felt like I did a public service.
Yeah, teens, go buy those big, fat, ridiculous lunatic cigars.
Bribe a savvy girl for it.
You're just preparing yourself for climate change.
All right.
Here's the other L. White Claw teens.
Okay.
I mean, white claw goes great with a huge shitty cigar, Jamie.
No, it doesn't.
It absolutely does.
Smoking is bad for you.
And you have a White Claw.
You're smoking a cigar.
You puff it so it's healthy.
You, you.
All right.
Here's some ads.
On a recent episode of the podcast Money and Wealth with John Hope Bryant, I sit down with Tiffany the Budgetnista Aliche to talk about what it really takes to take control of your money.
What would that look like in our families if everyone was able to pass on wealth to the people when they're no longer here?
We break down budgeting, financial discipline, and how to build real wealth, starting with the mindset shifts too many of us were never, ever taught.
Financial education is not always about like, I'm going to get rich.
That's great.
It's about creating an atmosphere for you to be able to take care of yourself and leave a strong financial legacy for your family.
If you've ever felt you didn't get the memo on money, this conversation is for you to hear more.
Listen to Money and Wealth with John Hope Bryant from the Black Effect Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hey, Ernest, what's up?
Look, money is something we all deal with, but financial literacy is what helps turn income into real wealth.
On each episode of the podcast, Earn Your Leisure, we break down the conversations you need to understand money, investing, and entrepreneurship.
From stocks and real estate to credit, business, and generational wealth, we translate complex financial topics into real conversations everyone can understand.
Because the truth is, most people were never taught how money really works.
But once you understand the system, you can start to build within it.
That means ownership, smarter investing, and creating opportunities not just for yourself, but for the next generation.
If you want to learn how to build wealth, understand the market, and think like an owner, Earn Your Leisure is the podcast for you.
Listen to Earn Your Leisure on the iHeartRadio app, Apple Podcast, or wherever you get your podcast.
Will Farrell's Big Money Players and iHeart Podcast presents soccer moms.
So I'm Leanne.
Yeah.
This is my best friend Janet.
Hey.
And we have been joined at the hip since high school.
Absolutely.
Now a redacted amount of years later, we're still joined at the hip, just a little bit bigger hips, wider.
This is a podcast.
We're recording it as we tailgate our youth soccer games in the back of my Honda Odyssey with all the snacks and drinks.
Sidebar.
Why did you get hard seltzer instead of beer?
Oh, they had a BOGO.
Well, then you go.
You had a White Claw sub here.
Just hit.
What are y'all doing?
Microphones?
Are you making a rap album?
I would buy it.
Cuts through the defense like a hot knife through sponge cake.
That sounds delicious.
Oh, you're lucky.
I'm not a drug addict.
You're lucky.
I'm not an alcoholic.
You're lucky.
I'm not a killer.
I love this team and I'm really trying to be a figure in their lives that they can rely on.
Oh.
Listen to soccer moms on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hello, gorgeous.
It's Lala Kent, host of Untraditionally Lala.
My days of filling up cups at SUR may be over, but I'm still loving life in the valley.
Life on the other side of the hill is giving grown-up vibes.
But over here on my podcast, Untraditionally Lala, I'm still that Lala you either love or love to hate.
I've been full-on oversharing with fans, family, and former frenemies like Tom Schwartz.
I had a little bone to pick with Schwartzy when he came on the pod.
You don't feel bad that you told me I was a bootleg housewife?
I must have flipped a pizza in your lap.
Oh my god, I literally forgot about that until just now.
Sorry, I don't want to blame all of that.
I got to blame that one on the alcohol.
This is about laughing and learning when life just keeps on laughing.
Because I make mistakes so that you guys don't have to.
We're growing, we're thriving, and yes, sometimes we're barely surviving, but we do it all with love.
Listen to Untraditionally Lala on the iHeartRadio app, Apple Podcasts, or wherever you get your podcast.
All right, we're back.
We are?
We all just enjoyed a couple of really comically large cigars.
We did not have one of those ridiculous long asylum cigars.
It was great.
Why are you fixated on this?
What is happening?
Because I find that sketch from I Think You Should Leave where the little girls are talking about smoking five Macanudos to unwind at the end of the day actually quite funny.
I mean, yeah, but like, why?
So I'm wondering.
I love when you're, you reveal yourself to be a basic bitch.
I am a basic bitch.
I was watching Netflix.
Yeah, Netflix.
That's why I'm thinking about cigars.
I love that.
I love that we're in the middle of a podcast and you can't get off that.
Well, I also think making children do things that's bad for them is funny, but not this way.
Not the way Facebook is.
Just send them to Dan Flashes.
Send them to Dan Flash as well.
I mean, they've already, I think the teens are rejecting NFTs pretty widely, Jamie.
So when Facebook does try to make the case that their products are benign, they like to bring up studies from the Oxford Internet Institute, which is a project of Oxford University, which show minimal or no correlation between social media use and depression.
The Wall Street Journal actually reached out to the Oxford researcher responsible for some of these studies, who right away was like, wasn't like, oh, yes, they're right.
Everything's fine.
He was like, actually, Facebook needs to be much more open with the research that they're doing because they have better data than we can get, than researchers can get.
And so our actual information that they're citing is hampered by the fact that they're not sharing what they're finding.
And who knows how things could change and our conclusions could change if we had access to all of that data.
He even told the Wall Street Journal, people talk about Instagram like it's a drug, but we can't study the active ingredient.
Which you'll notice is not him saying it's fine.
It's him being like, yeah, I really wish we could actually study this better.
It's difficult, right?
And also he's referring to it like drugs, which is the comparable scale of how it manifests.
Okay, yeah, cool, cool.
Yeah, he's certainly not being like, everything's fine.
I think that's clear.
He's truly like constantly, Mr. Policeman, I gave you all the clues in this situation, and just no one gives a shit.
It is very funny.
And like that, that movie.
And that's what I was trying to say.
I said it's hilarious.
Yeah.
So we focused a lot on these episodes about how Facebook has harmed people and institutions in the United States.
But as we've covered in past episodes, the social network has been responsible for helping to incite ethnic cleansings and mass racial violence in places like Myanmar and India.
Mob violence against Muslims in India, incited by viral Facebook misinformation, led one researcher in February of 2019 to create yet another fake account to try and experience social media as a person in Kerala, India, might.
From the New York Times, quote, for the next three weeks, the account operated by a simple rule.
Follow all the recommendations generated by Facebook's algorithm to join groups, watch videos, and explore new pages on the site.
The result was an inundation of hate speech, misinformation, and celebrations of violence, which were documented in an internal Facebook report published later that month.
And this is from the Facebook researcher.
Following this test user's news feed, I've seen more images of dead people in the past three weeks than I've seen in my entire life total.
What a great site, Mark built.
Facebook's new tagline, The Place for Corpses.
Oh, my goodness.
I mean, and it's like, I know that we have discussed Facebook's role in supercharging ethnic cleansings, but that is just, that is so, yeah, it's not great, Jamie.
Someone wrote that down, Robert.
Someone wrote that down and hit publish.
And it's even worse because India is Facebook's biggest customer.
340 million Indians use one or more Facebook products.
That's a shitload of people.
Yeah.
340 million.
That is something that I think is important to remember and something that I lose sight of sometimes is like Facebook is not a super popular platform for people of all ages in North America, but that's not the case everywhere.
Yeah, and it is just, it is the internet for a lot of these people.
Like that is the way they, that is the whole of how they consume the internet in a lot of cases.
I mean, maybe with like YouTube or something mixed in, but they're probably getting a lot of their YouTube links from their Facebook feed.
Now, the fact that India is the number one customer in terms of like number of people for Facebook, I'm sure the United States is still more profitable just because of like differences in income and whatnot.
But this is a huge part of their business.
But despite that fact, they have failed to invest very much in terms of meaningful resources into having employees who speak the language, or as is more the problem, the languages of India.
See, India, super mixed country, right, in terms of different like ethnic groups and religious groups.
They have 22 officially recognized languages in the country.
And there's way more languages than that in India that significant numbers of people speak.
There's 22 officially recognized languages.
Anyone who can travel there, and I've spent a lot of time in India, can tell you that being able to effectively say hello and ask basic questions of people can require a lot of research if you're traveling a decent amount.
But Facebook aren't 20-something tourists on the prowl for good tandoori and bhang lassis.
They have effectively taken control of the primary method of communication and information distribution for hundreds of millions of people.
And they failed to hire folks who might know if some of those people are deliberately inciting genocide against other people in the country.
87% of Facebook's global budget for identifying misinformation is spent on the United States.
The rest of the planet shares 13% of their misinformation budget.
You want to guess what percentage of Facebook users North Americans make up?
No.
10%.
87% of their budget goes on 10% of their users.
Of dealing with disinformation.
That sounds like a larger metaphor for something else.
Of dealing with disinformation specifically.
Now, when this leaked out, Facebook's response was that the information cited was incomplete and did not include third-party fact-checkers.
They're like, well, this doesn't include all of the people, the third-party companies we hire, except for the data they show suggests that the majority of the effort and money spent on third-party fact-checkers is for fact-checking stuff in the United States.
And of course, they did not elaborate on how including this information might have changed the overall numbers.
So my guess is not by much, if at all.
Internal documents do show that Facebook attempted to create changes to their platform to stop the spread of the disinformation during the November election in Myanmar.
Those changes, which also halted the spread of disinformation put out by the military, which was a big, like it was the military inciting ethnic cleansings and like trying to incite violence in order to like lock down political power ahead of this election.
So they cut this significantly prior to the election.
They see it as a problem.
They institute changes similar to the changes they'd talked about putting up in the U.S. if things went badly with the election.
And these worked.
It dropped dramatically.
Yeah.
And again, that is good.
I'm glad that was done.
But they only responded to the pressure.
Give me a second, Jamie.
Exclusively.
Give me a second, Jamie.
Because prior to the election, they institute these changes, which are significant.
It reduces the number of inflammatory posts by 25.1% and reduces the spread of photo posts containing disinformation by 48.5%.
This is huge.
That's really significant.
As soon as the election was done, Facebook reversed those changes, presumably because they were bad for money.
Three months after the election, the Myanmar military launched a vicious coup.
Violence there continues to this moment.
In response, Facebook created a special policy to stop people from praising violence in the country, one which presumably reduces the spread of content by freedom fighters resisting the military as much as it reduces content spread by the military.
It's obviously too much to say that Facebook caused a coup in Myanmar.
Shit's been, I mean, there's a lot going on there.
I'm not pretending that this is like, it's all just Facebook.
But it was a major contributing factor, and that isn't insignificant.
And the fact that they knew how much their policies were helping and reversed them after the election, reversing this effect and leading to an increase in inflammatory content because it profited them more, is damning, right?
That's the thing that's damning.
Around the world, Facebook's contribution to violence may be greatest in places where the company has huge reach but pays little attention.
In Sri Lanka, people were able to automatically add hundreds of thousands of users to Facebook groups that spread violent content.
In Ethiopia, a nationalist militia coordinated calls for violence openly on the app.
The company claims that it has reduced the amount of hate speech people see globally by half this year.
But even if that is true, how much hate was spread during the years where they ignored the rest of the world?
How many killings?
How many militant groups seeded with new recruits?
How many pieces of exterminationist propaganda spread while Facebook just wasn't paying attention?
The actual answer is likely incalculable.
But here's the New York Times again reporting on that test account in Kerala, India.
Perfect turn of phrase.
Yeah, yeah, yeah.
10 days after the researcher opened the fake account to study misinformation, a suicide bombing in the disputed border region of Kashmir set off a round of violence and a spike in accusations, misinformation, and conspiracies between Indian and Pakistani nationals.
After the attack, anti-Pakistan content began to circulate in the Facebook recommendation groups that the researcher had joined.
Many of the groups, she noted, had tens of thousands of followers.
A different report by Facebook, published in December 2019, found Indian Facebook users tended to join large groups, with the company's median group size at 140,000 members.
In a separate report produced after the elections, Facebook found that over 40% of top views or impressions in the Indian state of West Bengal were fake or inauthentic.
One inauthentic account had amassed more than 30 million impressions.
A report in March 2021 showed that many of the problems cited during the 2019 elections persisted.
In the internal document called Adversarial Harmful Networks, India Case Study, a Facebook researcher wrote that there were groups and pages replete with inflammatory and misleading anti-Muslim content on Facebook.
The report said that there were a number of dehumanizing posts comparing Muslims to pigs and dogs, and misinformation claimed that the Quran, the holy book of Islam, calls for men to rape their female family members.
So that's significant.
Like the scale at which this shit spreads is huge.
And I mean, I don't even, I mean, I feel like I know the answer if the hate is existing on that scale unmitigated.
But who is working to like, how many people does Facebook have working on, is there an integrity team for this region?
Like technically, yes.
The question is, how many of them and how many of the languages there are represented by the team?
And it's not many.
Exactly.
It's not many.
You can't have a global company and not have global representation or shit like this is going to happen.
Like it's just.
It's actually, you know what it kind of reminds me of, Jamie?
I was looking at this and I was thinking about the East India Trading Company.
Global Polarization Flaws 00:15:11
When the East India Company took over large chunks of India, they took it over from a regime, the government, the monarchical government that had been in charge in that area prior, was not a good government, right?
Because number one, they lost that war, but like they weren't a very good government.
They were a government.
So they did do things like provide aid in famines and disasters, and have people whose job it was to handle stuff like that and, like, make sure that stuff was getting where it needed to go during calamities and whatnot.
And doing things specifically that helped people but were not, wouldn't it were not profitable because a big chunk of what a government does isn't directly profitable.
It's just helping to like keep people alive and keep the roads open and whatnot, right?
Yeah.
Sustain humanity.
Yeah.
When the East India Company took over, they were governing and in control of this region.
And this is actually Bengal, I think is their first place.
But they don't have any responsibility.
They don't have teams who are dedicated to making sure people aren't starving.
They don't have people who are dedicated to actually keeping the roads open in any way that isn't necessary for directly the trade that profits them.
They don't do those things because, while they're effectively governing, they're not a government.
And there's been a lot of talk about how Facebook is effectively like a nation, a digital nation of like 3 billion people.
And Mark Zuckerberg has the power of a dictator.
And one of the problems with that is that for all of their faults, governments have a responsibility to do things for people that are like necessary to stop them, like to deal with calamities and whatnot.
Facebook has no such responsibility.
And so when people were not paying attention to Sri Lanka, to West Bengal, to Myanmar, they didn't do anything.
And as we know, in a region where there are millions and millions of people, 40% of the views were fake, inauthentic content.
Because they don't give a shit what's spreading because they don't have to, because they don't have to deal with the consequences unless it pisses people off, as opposed to a government where it's like, well, yeah, we are made up of the people who live here.
And if things go badly enough, it can't not affect us.
I'm not trying to be, again, not like with TikTok.
I'm not trying to praise the concept of governance, but it is better than what Facebook's doing.
Right, right.
I think that that is, like, a very... I'd never considered looking at it that way, but viewing it as this kind of digital dictatorship, a colonial dictatorship.
It's colonized people's information, like information streams.
It's colonized the way people communicate, but it has no responsibility to them if they aren't white and wealthy.
Well, yeah, and marginalize people in the same ways that actual dictatorships do in terms of how much attention is being given.
How are people being hired to support and represent this area?
And of course, the answer is no.
And of course, the result of that is extreme human consequence and harm.
And it's so, and it like is just so striking to me that it still feels like in terms of the laws that exist that control, I mean, that even attempt to address the amount of influence and control that a gigantic digital network like Facebook has.
You know, Facebook, I mean, unless people are yelling at them and unless their bottom line is threatened, they're never going to respond to stuff like this.
Like that's been made clear for decades at this point.
Yeah.
It's great.
I love it.
So.
Well, I'm all worked up.
Yeah.
A great deal of the disinformation that goes throughout India on Facebook comes from the RSS, which is an Indian fascist organization closely tied to the BJP, which is the current ruling right-wing party.
And when I say fascist, I mean like some of the founders of the RSS were actual like friends with Nazis and they were heavily influenced by that shit in like the 30s.
Both organizations are profoundly anti-Muslim and the RSS's propaganda has been tied to numerous acts of violence.
Facebook refuses to designate them a dangerous organization because of quote political sensitivities that might harm their ability to make money in India.
Facebook is the best friend many far-right and fascist political parties have ever had.
Take the Polish Confederation Party.
They're your standard right-wing extremists, anti-immigrant, anti-lockdown, anti-vaccine, anti-LGBT.
The head of their social media team, Tomas Garbichek, sorry, Tomaz, told the Washington Post that Facebook's hate algorithm, in his words, had been a huge boon to their digital efforts.
Like, he calls it a hate algorithm and says this is great for us, expanding on that with, like, I think we're good with emotional messages, and thus their shit spreads well on Facebook.
Quote from the Washington Post.
In one April 2019 document detailing a research trip to the European Union, a Facebook team reported feedback from European politicians that an algorithm change the previous year, billed by Facebook chief executive Mark Zuckerberg as an effort to foster more meaningful interactions on the platform, had changed politics for the worst.
This change, Mark claimed, was meant to make interactions more meaningful, but it was really just a tweak to the algorithm that made comments that provoked anger and argument even more viral.
And I'm going to quote from the post again here.
In 2018, Facebook made a big change to that formula to promote meaningful social interactions.
These changes were billed as designed to make the news feed more focused on posts from family and friends and less from brands, businesses, and the media.
The process weighted the probability that a post would produce an interaction, such as a like, emoji, or comment, more heavily than other factors.
But that appeared to backfire.
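To make the mechanics concrete, here is a minimal sketch of what ranking by predicted interactions looks like. All weights and signal names here are invented for illustration; this is not Facebook's actual formula, just the general shape of an engagement-weighted ranker like the one described above:

```python
# Hypothetical sketch of an engagement-weighted feed ranker.
# All weights and field names are invented for illustration;
# this is NOT Facebook's actual "meaningful social interactions" formula.

def rank_posts(posts):
    # Predicted interactions (comments, reactions) are weighted far more
    # heavily than passive signals, mirroring the change described above.
    weights = {
        "p_like": 1.0,      # predicted probability of a like
        "p_reaction": 5.0,  # emoji reactions weighted more heavily
        "p_comment": 15.0,  # comments (often arguments) weighted most
        "p_click": 0.5,     # passive clicks barely count
    }

    def score(post):
        return sum(w * post.get(signal, 0.0) for signal, w in weights.items())

    return sorted(posts, key=score, reverse=True)

feed = rank_posts([
    {"id": "calm",  "p_like": 0.30, "p_comment": 0.01},
    {"id": "angry", "p_like": 0.10, "p_comment": 0.20},
])
# Under these weights, the post predicted to provoke comments ranks first.
```

The point the weighting illustrates: once a comment is worth an order of magnitude more than a like, content that provokes arguments reliably outranks content that merely pleases people, regardless of anyone's intent to "amplify anger."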
Haugen, who this week took her campaign against her former employer to Europe, voiced a concern that Facebook's algorithm amplifies the extreme.
Anger and hate is the easiest way to grow on Facebook, she told British lawmakers, many of whom have their jobs because of how easy it is to make people's shit go viral.
I was about to say, anger and hate... like, I mean, show me a media network or system of power that that's not true for.
Yes.
Again, we're focusing on Facebook here in part because I do think it's more severe in a lot of ways there, but also just because like they're the ones who had a big leak.
And so we have this data.
So we're not just saying, yeah, look at Facebook, obviously, hate spreading there.
We're saying, no, no, no, we have numbers.
We have their numbers about how fucking bad the problem is.
I guess that that is the difference.
Yeah.
Yeah.
And we have evidence that the system is well aware of the numbers.
Yeah, I would love to be talking about Twitter, too.
It's just, and maybe Twitter just never bothered to get those kind of numbers.
Who knows?
This caused what experts describe as a social civil war in Poland, like this change.
One internal report concluded, we can choose to be idle and keep feeding users fast food, but that only works for so long.
Many have already caught on to the fact that fast food is linked to obesity and therefore its short-term value is not worth the long-term cost.
So he's being like, we're poisoning people and it's addictive, like, you know, McDonald's, but like people are going to give it up in the same way that McDonald's started to suffer a couple of years back because like they don't, they don't like the way this makes them feel, actually.
It's fun for a moment.
We just got to get a Morgan Spurlock for Facebook, baby.
We just got to get, where's the Super Size Me for Facebook?
Our entire society is the Morgan Spurlock for Facebook.
January 6th this morning.
I was going to say, I was like, I feel like it's, I mean, whatever, not to say that McDonald's isn't a hell of a drug, but like this is not the same.
I mean, it's stronger because it's your fucking brain and self-image and the view of yourself.
And I feel like that is the most strong manipulation that any given system, person, whatever can have on you is controlling the way that you see yourself.
It's not the same in terms of, like, being involuntary.
I feel like it's something that you very much participate in.
Yeah.
It's bad.
Yeah.
It's good.
I think it's good.
That's what I think, Jamie.
Oh.
Facebook has been aggressive.
You called me today to say good, actually.
To read all this and then say, so that's fine.
Let's never talk of it again.
Anyway, Facebook has been aggressive at rebutting the allegations that their product leads to polarization.
Their spokeswoman brought up a study which she said shows that academic research doesn't support the idea that Facebook or social media more generally is the primary cause of polarization.
Now, ignore for the moment that not the primary cause doesn't mean isn't a significant cause.
And let's look at this study.
The spokeswoman was referencing Cross-Country Trends in Affective Polarization, an August 2021 study from researchers at Stanford and Brown University.
This study opens by noting it includes data for only 12 countries and that all but Britain and Germany exhibited a positive trend towards more polarization.
So right off the bat, there's some things to question about this study, which is, number one, they're saying that like, oh, Britain hasn't gotten more polarized, which is like, have you been there?
Have you talked to about...
I... yeah, I don't live there, but that's not what I've been hearing.
And my friends that do.
Here's the thing.
When you look at how Facebook is basically using this, citing this as like evidence that like, look, we're fine.
Social media is not, this study from this very credible study says that we're not the cause of polarization, so everything's good.
The study doesn't quite back them up on this.
Right off the bat, one of the authors notes this, and this is from a write-up on the study by one of its authors on a website called techpolicy.com, where he's talking about the study and what it says.
A flat or declining trend over the 40 years of our sample does not rule out the possibility that countries have seen rising polarization in the most recent years.
Britain, for example, shows a slight overall decline, but a clear increasing trend post-2000 and post-Brexit.
So he's saying that we don't have as much data from more recent polarization.
And that may be a reason why this study is less accurate and why some of our statements do not conform with what people have observed.
He goes on to note, the data do not provide much support for the hypothesis that digital technology is the central driver of affective polarization.
The internet has diffused widely in all the countries we looked at.
And under simple stories where this is the key driver, we would have expected polarization to have risen everywhere as well.
In our data, neither diffusion of internet nor penetration of digital news are significantly correlated with increasing polarization.
Similarly, we found little association with changes in inequality or trade.
One explanatory factor that looks more promising is increasing racial diversity.
The non-white share of the population has increased faster in the U.S. than in almost any other country in our sample.
And other countries like New Zealand and Canada, where it has risen sharply, have seen rising polarization as well.
So I have some significant arguments with him here, including the fact that, as he notes here, his study only looks at Western nations.
With the exception of Japan, all of the nations in the study are European or the United States and Canada.
And so they have all had prior to 2000 higher penetrations of the internet and non-internet mass media.
Like, outside of this, if you're trying to determine the impact of social media, elements of what social media has done were present in places like Fox News in the United States years before Facebook ever existed.
And that was not the case in places like Myanmar and India, which are not a part of this study.
So right off the bat, it's problematic to try and study the impact of social media on polarization only in countries that already had robust mass media before social media came into effect, which is not to say that I agree with their conclusion, because I think there's other flaws with this study.
But one of the flaws is just that hundreds of millions of their users exist in countries where this study was not done, places they were not looking at, which is a flaw.
Which is just, yeah, dependent on most readers conflating, you know, North America and Europe with the center of the fucking world.
And again, I have issues, like, okay, well, you're saying that racial diversity is more of a factor, but, like, where is the propaganda, where is the hate speech about racial diversity spreading?
Is it spreading on social media?
Like, yes, it is.
I can say that as an expert.
It's also just like, again, not that this study is even bad or not useful.
It is one study.
And again, we have internal Facebook studies that make claims that I would say throw some of this into question.
But again, this is just how a corporation is going to react.
They're going to find a study that they can simplify in such a way that they can claim there's not a problem.
Because nobody, none of the people who they're going to be arguing with on Capitol Hill and precious few of the journalists are going to actually drill into this and then talk to other experts or even reach out to members of that study and be like, how fair is this phrasing?
How does it gel with this information and this information?
As we saw earlier with the last study, when the Wall Street Journal, to their credit, reached out to that scientist, he was like, well, actually, they have better data than me, and I'd love to see it because maybe that'll change our conclusions.
Anyway, yeah.
Mark Zuckerberg has been consistent in his argument that deliberately pushing divisive and violent content would be bad for Facebook.
Quote, we make money from ads and advertisers consistently tell us they don't want their ads next to harmful or angry content.
While I was writing this article, I browsed over to one of my test Facebook accounts.
The third ad on my feed was for a device to illegally turn a Glock handgun into a fully automatic weapon.
Just as a heads up, yeah.
One of my, I have a couple of test feeds, and it was like, hey, this button will turn your Glock automatic, which is so many felonies, Jamie.
If you even have that thing in a Glock in your home, the FBI can put you away forever.
I have to laugh.
I have to laugh because that is really, really scary.
But yeah, it is like Mark being like, look, no advertiser wants this to be a violent place.
Buy a machine gun on Facebook, you know?
Next to ads that are like t-shirts about killing liberals and stuff.
A machine gun advertiser maybe would be one that wouldn't take issue with that.
Holy shit.
Fucking hang the media shirts advertised to me on Facebook.
Like, my God, go to go, like, fuck you, Mark.
I used to, but when I quit Facebook a couple of years ago, I was getting normie advertisements.
I was getting those really scary ones, like those custom t-shirts that say, it's a Jamie Loftus thing.
You wouldn't understand.
I wouldn't.
Like, why?
And you wouldn't.
I would not.
No, and, you know, the only time Facebook I can think of recently actually anticipated something I wanted is they keep showing me on all of the accounts that I've used.
videos of hydraulic presses crushing things.
And I do love those videos.
Those are pretty fun.
And that's the meaningful social interactions that Mr. Mark Zuckerberg was talking about was the hydraulic press videos.
And those are very comforting.
On the good old internet, which also wasn't all that great, but on the old internet, it was a lot more fun, though.
There would have just been a whole website that was just like, here's all the videos of hydraulic presses crushing things.
Come watch this shit.
There wouldn't have been any algorithm necessary.
You could just scroll through videos.
There's no friend function.
It's just hydraulic press shit.
Yeah, that's all I need, baby.
That's all I need.
Politically Protected Accounts 00:04:11
So back to the point, it is undeniable that any service on the scale of Facebook, again, like 3 billion users, is going to face some tough choices when it comes to the problem of regulating the speech of political movements and thinkers.
As one employee wrote in an internal message, I am not comfortable making judgments about some parties being less good for society and less worthy of distribution based on where they fall in the ideological spectrum.
That's true.
This is, again, part of the problem of not regulating them like a media company, like a newspaper or something.
Because by not making any choices, they're making an editorial choice, which is to allow this stuff to spread.
Presumably, actually, like if you were actually being held to some kind of legal standard that, again, most of our media isn't anymore, you would at least have to be like, well, let's evaluate the truthfulness of some of these basic statements before pressing.
And I would say that's where the judgment should come in on, but that's expensive.
If Facebook is saying, we won't judge based on politics, but we will judge based on whether or not something is counterfactual, that I think is morally defensible, but that's expensive as shit.
And they're never going to do that.
Look, moral decisions are famously not cheap.
And that is a lot of the reason why people do not do them.
Yeah.
It is true that.
Bad morals is not a profitable venture.
Yeah.
No, of course not.
And the other thing that's true is that Facebook already makes a lot of decisions about which politicians and parties are worthy of speech.
And they make that decision based mostly on whether or not said public figures get a lot of engagement.
Midway through last year, they deleted all of the different anarchist media groups that had, and a lot of anti-fascist groups that had accounts on Facebook just across the board.
They deleted, like, CrimethInc., and they kicked off It's Going Down, and, like, a rapper I know, Sole.
Like, yeah, I mean, nobody ever complains when bad shit happens to anarchists except for anarchists.
But yeah, they nuked a bunch of anarchist content, just kind of blanket saying it was dangerous.
And I think it was because they just nuked the proud boys and they had to be shown to be fair.
But it has now come out that they have a whole program called XCheck or CrossCheck, which is where they decide which political figures get to spread violent and false content without getting banned.
Based on engagement.
Yeah, based on engagement.
They've claimed for years that everybody's accountable to the site rules.
But again, the Facebook Papers have revealed that that's explicitly a lie.
And it's a lie Facebook has told other people at high levels of Facebook.
And I'm going to quote from the Wall Street Journal here.
The program, known as cross check or XCheck, was initially intended as a quality control measure for actions taken against high-profile accounts, including celebrities, politicians, and journalists.
Today, it shields millions of VIP users from the company's normal enforcement process.
The documents show.
Some users are whitelisted, rendered immune from enforcement actions, while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.
At times, the documents show, XCheck has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users.
In 2019, it allowed international soccer star Neymar to show nude photos of a woman who had accused him of rape to tens of millions of his fans before the content was removed by Facebook.
Whitelisted accounts shared inflammatory claims that Facebook's fact checkers deemed false, including that vaccines are deadly, that Hillary Clinton had covered up pedophile rings, and that then-President Donald Trump had called all refugees seeking asylum animals.
According to the documents, a 2019 review of Facebook's whitelisting procedures, marked attorney-client privileged, found favoritism to those users to be both widespread and not publicly defensible.
We are not actually doing what we say we do publicly, said the confidential review.
It called the company's actions a breach of trust and added, unlike the rest of our community, these people violate our standards without any consequence.
And they lied to, like, their board members about this. Well, they didn't lie about whether or not it was a thing.
They said it was very small.
And I think the initial claim was, like, we have to have something like this in place for people like President Trump, but it's a tiny number of people.
And it's because they occupy some political position where we can't just as easily delete their account because it creates other problems.
They're writing it off because they're not as fringe as they need to be for this conduct to be acceptable.
So that was their justification.
Small Number Justification 00:04:36
One sec, Jamie.
That was their justification.
On the level of you can be unethical and still be legal.
It's still true.
Well, here's the thing.
They told their board they only did this for a small number of users.
You want to guess how small what that small number was?
Oh, I love when Facebook says there's a small number.
What is this?
What is a small number?
5.8 million.
That's so many.
Yeah.
Oh, dear.
Yeah.
Okay.
It's very funny.
It's very funny.
It's all good.
I that is so, I mean, yeah, that I, they're just, they're just.
Yep.
Robert, can I say something controversial?
Please.
I don't like this company one bit.
You don't?
Well, I feel like that's going a bit, that's going a bit far.
I'm sorry.
And I'm famously, you know, I don't like making harsh judgments on others, but I'm starting to think that they might be doing some bad stuff over there.
Yeah, I would, you know, I don't like these people.
I don't like these people at all.
You know what I do like, Jamie?
What do you like?
Ending podcast episodes.
Oh, I actually do like that too.
Yeah, that's the thing I'm best at.
Do you want to plug your pluggables?
Yeah.
Passion.
Yeah, sure.
You can.
I'm going to just open by plugging my Instagram account, a famously healthy platform that I'm addicted to, and I don't really have any concerns about it.
I don't really think it's affecting my mental health at all.
So I'm over there, and that's at JamieChristSuperstar.
I'm also on Twitter, which Robert can't stop saying is the healthiest of the platforms.
It is of all of the people who are drunk driving through intersections filled with children.
Twitter has the least amount of human blood and gore underneath the grill of the car.
Okay, so Robert's saying, for all you Backstreet Boys heads, he's saying that Twitter is the Kevin Richardson of social media in there as well.
I'm saying the drunk driving Twitter car made it a full 15 feet further than the Facebook car before the sheer amount of blood being churned up into the engine flooded the engine air intakes.
But at the end of the day, we're all fucked.
I'm on Twitter as well at Jamie Locketell.
You can listen to my podcast.
Yeah.
You know, you can listen to my podcast, The Bechdel Cast.
You can listen to Aack Cast.
That's about the Cathy comics.
You can listen to My Year in Mensa.
You can listen to the Lolita Podcast.
You can listen to nothing.
You know what never led to a genocide in any country, as far as I'm aware, Jamie.
Uh-huh.
The Cathy comics.
Well, see, then you haven't listened to the whole series.
Oh, really?
Is it?
Oh, you know what?
No, I'm kidding.
Yeah, that's why the last episode is your live report from Sarajevo in 1994.
Episode 11.
Yeah, Irving really, his politics were not good.
Yeah, he was like weirdly into the Serbian nationalism.
Irving is like, for the Cathy comics...
He's like, okay, I'm about to make a wild parallel, but Irving is like the Barefoot Contessa's husband in that he looks so innocent.
But then when you Google him, you're like, wait a second, this man is running on dark money.
This guy is like, was on Wall Street in the 80s.
This is a bad man.
He's basically like Jeffrey, the Barefoot Contessa's husband.
The Barefoot Contessa is run on dark money.
I know people don't like to hear it.
They love her, but it's just true.
It's objectively true.
And that's what I would like to say at the end of the episode.
I've never heard of the Barefoot Contessa, and I don't know what you're talking about.
Yes, I am not even 1% surprised to know that, but that's okay.
But you know what I do know about?
What?
I know about podcasts, and this one is done.
Great ending.
You know the famous author Roald Dahl.
He thought up Willy Wonka and the BFG.
But did you know he was a spy?
Neither did I. You can hear all about his wild life story in the podcast, The Secret World of Roald Dahl.
All episodes are out now.
Was this before he wrote his stories?
It must have been.
What?
Okay, I don't think that's true.
I'm telling you, the guy was a spy.
Binge all 10 episodes of The Secret World of Roald Dahl.
Now on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Readers, Katie's finalists, publicists.
We have an incredible new episode this week for you guys.
We have our girl Hilary Duff in here, and we can't wait for you to hear this episode.
They put on Lizzie McGuire at 2 a.m. video on demand.
This guy's playing.
2 a.m.
2 a.m.
Whatever time it is.
Lizzie McGuire and I'm like, wild, in a bad way.
It was like a first closet moment for me where I was like, I don't feel like she's hot like the rest of them.
No, no, no.
I was like, she's beautiful, but I'm appreciating her in a different way than these boys are.
I'm not like...
Listen to Las Culturistas on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
This is Saigon, the story of my family and of the country that shaped us.
From iHeart Podcasts, Saigon.
You're going to think I'm serious about a free Vietnam?
One city, a divided country, and the war that tore America apart.
For Vietnam.
Freedom for Vietnam.
There's a fire coming to this country and it's going to burn out everything.
Listen to Saigon, starting on April 22nd on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
On the Ceno Show podcast, each episode invites you into a raw, unfiltered conversation about recovery, resilience, and redemption.
On a recent episode, I sit down with actor and cultural icon Danny Trejo to talk about addiction, transformation, and the power of second chances.
The entire season two is now available to binge, featuring powerful conversations with guests like Tiffany Haddish, Johnny Knoxville, and more.
I'm an alcoholic.
Without this program, I'm going to die.
Listen to the Ceno Show on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.