May 18, 2023 - Conspirituality
01:14:03
154: The Truth Wars (w/Renée DiResta)

In the post-truth world, journalists who report facts are disparaged as perpetuating the narrative while candidates who hold the appropriate qualifications are smeared as deep state operatives. Likewise, a career spent studying terrorism, online conspiracy theories, and digital propaganda becomes "evidence" of opposing free speech and the American way. Our guest today became the "main character" on Twitter in April, subject to information requests from Congress and labeled the "leader of the Censorship Industrial Complex" from her perch at the center of a conspiracy web in which Big Tech, government intelligence agencies, and woke university think tanks secretly silenced free speech online. Her name is Renée DiResta, and Julian talks to her about her extensive study of online propaganda. She tells us about the unfolding digital information crisis, of which the Twitter Files is just the most recent example. Learn more about your ad choices. Visit megaphone.fm/adchoices


Hey everyone, welcome to Conspirituality.
I'm Derek Beres.
I'm Julian Walker.
If you want to keep up with us on social media, our Instagram account is @conspiritualitypod.
And in addition to these weekly Thursday episodes, we also regularly drop briefs on Saturdays, where each of us has an opportunity to explore a topic that we want to take on. This past weekend, I created one that followed on last week's anti-sunscreen episode, where we had a whole segment discussing Sarah Aniano's work on antisemitism in the anti-sunscreen movement.
So, on Saturday, I got to talk to the Banter founder, Ben Cohen, about his experiences as a political reporter covering antisemitism, as well as what it's like to be a Jew in America coming from the UK, and about his experiences with antisemitism here.
For those inspired to support our work, we also have Monday premium bonus episodes on both Patreon and Apple subscriptions.
Patreon supporters do also get all of our episodes ad-free and can even choose to access our behind-the-scenes videos and live streams.
For this past Monday's bonus, Matthew took a quite nuanced look at the controversy from last month that he referred to as the Dalai Lama spectacle.
Conspirituality 154: The Truth Wars, with Renée DiResta. In the post-truth world, journalists who report facts are
disparaged as perpetuating the narrative, while candidates who actually hold the appropriate
qualifications to be in government are often smeared as deep state operatives.
Likewise, a career spent studying terrorism, online conspiracy theories, and digital propaganda translates through the looking glass as obvious evidence of opposing free speech and the American way.
This past April, our guest today became the main character on Twitter, subject to information requests from Congress, labeled the leader of the censorship industrial complex from her supposed perch at the center of a conspiracy web in which big tech, government intelligence agencies, and woke university think tanks secretly silenced free speech online.
Her name is Renée DiResta, and we talked about what her extensive study of online propaganda,
starting under the Obama administration, tells us about the unfolding digital information crisis,
of which the Twitter Files may well just be the most recent example.
So Derek, you know I couldn't wait to talk to Renee.
We actually had postponed our interview because the week that it was scheduled for turned out to be the week that Michael Schellenberger and Jim Jordan decided she was public enemy number one.
So for context, we should back up a little and mention the series of bonus episodes you and I did as the Twitter Files were unfolding, because that's all prelude to where we find ourselves today.
And by the way, it's also informed by free speech warrior Elon Musk this past week being perfectly willing to censor Erdogan's opponents on Twitter to keep generating ad impressions in Turkey and lick the boots of a guy who has jailed comedians for making fun of him.
Yeah, he's a great guy.
He's going after Soros now, of course, again, but really digging in.
I saw some people on Twitter tag your bonus episode, or the brief actually, about reframing, or framing, George Soros, right?
So maybe Elon.
I'm sure Elon's going to listen and take a nuanced look after that.
But for us, we had the pleasure of listening to too many hours of Russell Brand as he appointed himself the moderator of all things Twitter files.
He had Matt Taibbi on, Bari Weiss, and Alex Berenson.
They were all on his Rumble show over the last few months.
He also later had Michael Schellenberger on, but we didn't get to that yet.
Maybe we'll have to go back to that one.
But I have to say, it was really hard-hitting journalism.
And by that I mean he conceded every point each person made as part of the grand conspiracy of the Biden administration and other deep state operatives colluding to stop free speech in America, by which I mean that it was all small amounts of actual reporting under a deluge of Russell Brand's trademark bullshit.
That's right.
We started with Russell's breathless coverage, first with his interview of Matt Taibbi.
And Russell, ensconced in his new headquarters at Rumble, because, you know, having 5 million followers on YouTube is evidence of really unfair censorship.
And then poor old journalist-in-exile Matt Taibbi, who's raking in around $500,000 a year from his Substack, according to the data.
That Substack ended up being, of course, a source of tension between Matt Taibbi and Elon, because these are all, you know, relationships of convenience and of narcissistic self-interest.
And so Elon had blocked tweets from including links to Substack pages, which prompted Matt to say he was leaving.
And then he started posting quite frequently on Truth Social.
He seems to post on Twitter still.
You know, Julian, we are a podcast about misinformation.
Russell has six million YouTube followers.
Oops.
So I would appreciate it if you paid attention to him more often, on a daily basis.
Yeah, the censorship is really hurting him.
It's really hurting him.
Now, I don't want to give away anything from your wonderful interview with Renee.
I'm a big fan.
I've been following her for a while, so it's great to hear and have her on.
But at one point, she says something to the effect that she presented all of their requested information to Taibbi and Berenson before the congressional hearing, and yet they ignored it and moved ahead with their conspiratorial fear-mongering in front of Congress anyway.
And that really drives home the point that Matt's journalism is just a charade.
And it truly is sad.
I've read his books in the past.
He's always been super opinionated.
He was at Rolling Stone for a long time, and they're famous for their gonzo journalism.
He, for a while, was considered to be taking up that mantle of the modern gonzo journalist.
And Rolling Stone is the organization that basically invented the style, right?
But I don't know.
What character would Matt actually be these days?
Would he be the Swedish chef?
Because he's not Gonzo anymore.
Gerbity, gerbity, gerbity.
He's definitely not Hunter S. Thompson.
I mean, that's basically what I hear every time he tries to mumble his way through his lies, like when Mehdi Hasan took him to task recently.
Oh, that was a thing of beauty.
Yeah, incredible.
Incredible.
Next up was Bari Weiss, whose Substack, I should point out, by the way, is drawing in $800,000.
And she's kind of turning it into a company where she has other people working for her that she's paying, it would appear.
She lasted even less time, though, in Elon's good graces.
He bumped her from the Twitter Files dream team after she criticized the free speech absolutist for banning an account he had explicitly said, just a week before, that he would never ban.
Our third episode in that series, as you pointed out, followed Russell then bringing on Alex Berenson, who by the time he appeared with Russell had already been kicked out of Twitter Files coverage by Elon for his own breach of etiquette around whether you put things on Substack first or on Twitter first, and how the links go out.
You know, there was almost a year of my life where I was laid up in bed because of a broken femur.
This was in 1986, and my grandmother came over to take care of me most days because she was retired and my parents were not.
And during that time, I got very into soap operas because I would watch whatever my grandmother was watching.
Even at the age of 11, I was able to distinguish between reality and soap operas, but I don't think the culture always can.
I hear stories like this, like Elon banning someone he said he wouldn't.
I mean, it is just a soap opera.
And Bari is definitely another opportunist in this entire equation.
And just like Taibbi, I liked her work previously, even if I didn't agree with her.
But now she's relegated to spreading anti-trans rhetoric on Substack and not correcting her errors when they're pointed out, which happened recently.
And she's working on her fake university in Austin.
It seems like every industry she wants to reform, she just ends up spinning more of the same, but worse.
And as for Berenson, that dude has always been manipulative.
Anyone who writes a book falsifying cannabis studies to try to frame it in legit 1980s "marijuana will fry your brain" rhetoric is going to be suspect.
Yeah, Reefer Madness.
And he also has found significant acclaim and success as a writer of spy novels.
He's got a spy novel series and it just appears that that also bleeds over.
He maybe can't tell the difference either.
I love that you went to like the old school soap operas because I'm thinking of that classic trope where the character wakes up and is looking in the mirror and the weird harp music is playing like a whole tone scale.
That was Christiane Northrup, yeah.
And they have amnesia because all of these players have amnesia about anything that happened before they got the tap from Elon.
They're like, oh, what's going on?
Everyone's being censored.
It's terrible.
We're going to save America for democracy by platforming a bunch.
And the thing about this, too, is they also have amnesia about the real process of journalism that would have gotten them to where they were before all of this.
Right.
And some of that, you can only imagine, included... I think we need a Conspirituality version of Yacht Rock for our cohort of people.
If we had more time and budget, even though Yacht Rock was very low budget, I think we could pull it off very well.
That could be something.
So these are the players.
We've not said much about the content of the Twitter files because mostly it was really unremarkable.
But by the time we got to Taibbi and Schellenberger showing up in front of Jim Jordan's House Select Subcommittee on the Weaponization of the Federal Government, the talking points had shifted from promises of Hunter Biden revelations about election interference that never really came, and then moaning about how anti-vax and COVID-denying doctors like Jay Bhattacharya had been shadow banned, to how the supposed pseudo-academic scam of disinformation studies and this ominous-sounding collusion between university think tanks, US intelligence agencies, and big tech had hidden the truth about the pandemic, silenced right-wing voices, and boosted the mainstream narrative on behalf of the left.
So, this then found Renee DiResta and her colleagues being named as central to this vast conspiracy, labeled, and I think they coined the term on that day in front of Jim Jordan, the Censorship Industrial Complex.
And this prompted our episode 147, titled The Censorship Megaphone, in which we showed how conspiritualists have actually been laughing all the way to the bank, as their claims of being unfairly censored seem to function as a kind of skeleton key or password onto bigger and bigger platforms, and therefore more exposure and more revenue.
Nothing brings in revenue like telling your millions of followers that you're being censored on every social media platform that exists.
That said, great interview with Renee, Julian.
Again, I really enjoyed it.
And you mentioned this during the talk, but there are a lot of important layers here.
So I want to say this to listeners.
I recommend not listening to this if you're multitasking because it's really important information.
Renee puts out a compelling argument about the spread of misinformation and disinformation here, including why she doesn't like those terms and how they operate in our society.
And it really does inform so much of what we've been trying to tackle for three years now on this podcast.
We'll turn now to our interview.
Renee DiResta's academic background is in computer science and political science.
She's the technical research manager at the Stanford Internet Observatory.
At the behest of the SSCI, the Senate Select Committee on Intelligence, Renée led investigations into the Russian Internet Research Agency's multi-year effort to manipulate American society, and she presented public testimony before the Senate.
She led an additional investigation into Russian hack and leak operations in the 2016 election as well.
Renée studies online pseudoscience and conspiracies, terrorist activity, and state-sponsored information warfare.
She has advised Congress and the State Department on the topic.
She also writes for Wired and The Atlantic and has been featured on too many media outlets to list here.
Amongst her many distinguished fellowships and consulting roles,
she is a term member of the Council on Foreign Relations and a Truman National Security Fellow.
Renée DiResta, welcome to Conspirituality.
Thank you for having me.
We have so much to discuss.
First of all, how are you doing?
I'm good.
I mean, it's been a very interesting month, or two months now, I think.
But, you know, it's not the first go-around with people on the Internet being mad at me.
So, occupational hazard, it turns out.
Yeah, I would imagine not.
All right.
So I want to hit on several topics across your already really noteworthy career.
But I really think, as you just indicated, we have to start with what's been going on in the last six weeks or two months.
To set the scene, as part of the hoopla around the so-called Twitter files, now independent Substack journalist Matt Taibbi and noted climate denialism writer Michael Schellenberger go before the House Select Subcommittee on Weaponization of the Federal Government.
This is on March 9th.
And there they either coin or popularize this now somewhat ubiquitous term, the censorship industrial complex.
Their online writing on this topic and testimony to the committee allege that they've uncovered a plot between government intelligence agencies, big tech platforms, university research facilities, and think tanks to censor right-wing voices under the guise of protecting Americans from what they term so-called misinformation or disinformation.
Then a couple of weeks later, ProPublica reported that the head of that committee, Jim Jordan, had issued sweeping official information requests (at first they reported them as subpoenas, but later corrected that) for documents from three universities and one think tank, and you are named in that article alongside Alex Stamos and Nina Jankowicz.
So, first of all, is that about right?
Did I miss anything?
And then, second, what's happened since then with regard to media coverage, online activity, and those information requests?
How has this all affected you and your colleagues?
So, yeah, that's all accurate.
The Committee on the Weaponization of the Government, this is chaired by Congressman Jim Jordan,
it was created in part as Congressman McCarthy was trying to solidify his leadership role.
And so this committee was created in part to extensively look at abuses by the federal government
in a variety of different areas, particularly areas like Hunter Biden's laptop
and these other sort of shibboleths that really dominate the way that conservatives think
about how they are treated online, how they are treated by the government
and whether there is a collusion between big tech and government.
So that's the kind of environment under which the committee is operating.
One of the first hearings of the Judiciary Committee actually brought in executives from Twitter
and asked them to comment on these things.
And that was, you know, that was a big, big hearing.
And then the, what we might call the Twitter Files hearing came a little bit after that.
And shortly after that, we received this letter.
And we are one of, I think, four that ProPublica got to confirm on the record, but it's actually many more than that.
So this is really quite a broad, expansive effort to try to get emails between academics and the government, or academics and the tech companies, in an effort to find this theoretical collusion.
What is interesting about this is that if you actually watch the hearings, or if you go and read Mr. Schellenberger's testimony, which is something like 60 pages long, his written testimony, what they are articulating is this allegation of a vast collusion effort to censor, in their words, I believe, 22 million tweets. Sometimes it's in the millions; sometimes, like when Schellenberger went on Rogan, it jumps all of a sudden into the billions.
This argument is actually not found in, nor supported by, anything in the actual Twitter Files.
It comes from a group called the Foundation for Freedom Online, which I believe is actually just one guy, a man named Mike Benz, though it calls itself an organization.
Mr. Benz was a Trump appointee in the State Department for a period of, I think, just a few months, but has refashioned himself as a whistleblower.
And since approximately August of last year, he had been writing these stories, these very, very sensational claims, misinterpreting research that we had published nearly two years prior.
And that is where these claims about 22 million tweets and a lot of the other things that they say come from.
So the Twitter Files authors connected with this individual in a Twitter Spaces. Taibbi connected with Benz in a Twitter Spaces, and Benz said, you know, oh, I have the keys to the kingdom, I can uncover the full conspiracy for you.
You can only see bits and pieces of it in the Twitter Files, but let me tell you the rest.
And in this remarkable exchange...
Taibbi is extremely excited about this, invites him to come on, you know, come on, be part of the team.
We're going to dismantle this censorship industrial complex, this media collusion effort with tech and, you know, the sort of usual things that they write.
And that's where that connection was made.
And so at the time, I had been speaking with Mr. Schellenberger for a period of a couple months.
And so I was rather blindsided to wake up to this written testimony that just regurgitated all of these claims that we had actually responded to months prior, but all of a sudden there they were in the congressional record.
So those claims were not actually found in the Twitter files.
There are no internal emails within Twitter that suggest that 22 million tweets were censored, or billions of tweets, or millions of Americans, because it didn't happen.
And so these two communities really kind of came together in that hearing. And, you know, just as a person who studies how these things happen, it's very fascinating to see it when you become the main character of it.
But again, this is how this laundering tends to occur, and the complexity, even in my three-minute explanation here, has probably lost half the listeners, because it requires understanding a whole lot of different pieces and narratives that have been in motion for the better part of six months.
Yeah, I mean, that's possible, but we're going to keep unpacking and contextualizing everything.
That is fascinating, to get some more detail on that.
I noticed with some pleasure the irony that you replied to Michael Schellenberger's statements and characterizations in some depth on March 31st by publishing three articles to your own Substack.
These consisted of the entire text of a long, unpublished interview you did with him, as well as his abridged version, which I understand he quoted from in his congressional testimony.
Yes.
Despite it not already being in the public record, right?
No, he never published it.
Yeah.
And then you also revealed extensive text messages between the two of you spanning back several months, seeming like a sort of friendly interview process.
He has since fired back with a Substack piece, on April 3rd, calling you, in the title, the leader: why you are the leader of the censorship industry.
So, I mean, how do you make sense of all of these interactions, of how he's now represented your work, of the way that this, because you said you were surprised.
This is bizarre.
Well, I think one of the easiest explanations is, of course, the obvious, which is that many of those Substack posts are behind a paywall.
And whenever you have a conspiracy theory, you need some main characters.
You need some organizations.
You need some people.
I was selected to be that person.
And so I'm fairly public.
I've written about topics of content moderation for about seven years now.
I've published pretty extensively on it.
I've never been shy about sharing my opinions on it.
Again, this was why I was engaging with him for a period of several months, because I thought, okay, well, we're going to have a conversation.
Reasonable people can disagree about certain aspects of content moderation and let's find out where those disagreements are and let's go through them.
And the question that I ask over and over and over again in my texts is, what do you think should happen?
What do you believe should happen here?
And we can table that for a second just to go a little bit more into the dynamics of what's happened with this characterization of me.
Again, anytime you have a narrative with a lot of different moving parts,
creating one person and turning them into the mastermind is a fairly established trope, right?
The evil genius behind the curtain.
And you can see just the hallmarks of a basic smear campaign.
You know, Schellenberger, his background's in PR.
He knows how to do this.
He knows how to tell a story.
He knows how to create characters.
And so you see things like "CIA fellow Renée DiResta," I believe, is how he refers to me.
And this is, you know, so I was an intern for the agency when I was an undergrad.
And this has never been a secret.
Everyone who knows me knows it.
Can I just say right here that the gotcha of, like, posting a link to a YouTube video that's been up for, I don't know how many years.
A throwaway comment.
You're being introduced as a speaker, and there's this throwaway comment that you had done some work for the CIA, and now it's, like, on your Wikipedia page.
Yeah, well, it was sort of a joke.
I mean, it was never a secret, because it was such an early internship experience, again, 20 years ago. When Alex says this, keep in mind, as you know, it was a room full of people, a recorded, live-streamed event.
If it was something that was a secret, we sure did a bad job of keeping it, you know?
And so his joke, oh, Renee used to work for the CIA, is exactly playing on what Schellenberger plays on,
which is this idea of the shady, secretive, spooky organization.
It's very mysterious.
And so you can either be framed as an international woman of mystery
in this sort of funny intro that he did, or you can turn it into
the unsavory deep state association.
And that's what Schellenberger does.
And it's again, it's a very, you know, it's a very established tactic.
He was a PR flack for Hugo Chavez, right?
And so you could go with "Hugo Chavez PR flack Michael Schellenberger," which is about as accurate as "CIA fellow Renée DiResta."
These are things that are both true.
But he, I believe, was a flack for Hugo Chavez 20 years ago.
And so one might argue that that might not be necessarily his motivating force or his driving force today.
But you can create that insinuation.
You can use that innuendo by taking a thing that is true, a grain of truth, and just taking it out of context, or presenting it to an audience that's not familiar with the specifics. And you can do that very dishonestly. And, you know, as a PR flack, he happens to be good at it.
Yeah. I mean, it's clearly a part of the conspiracy theorist playbook to make insinuations of shadowy unknown implications that we can then stitch together into proving some narrative that we're pushing. I want to back up a little now, because as you just said, the reason that you've become the target of what I see as an opportunistic and sensationalist slander that drives, you know, Substack subscriptions is that, in fact, you are one of the world's foremost experts on digital disinformation.
And I want our listeners to really benefit from your expertise.
Let me set the scene again in terms of kind of both of our origin stories here at Conspiratuality.
We start from how during the pandemic, the wellness space became a vector of right-wing conspiracism.
And one of the topics we've covered is how as 2020 unfolded, wellness entrepreneurs already had a well-oiled influence machine at the ready to monetize misinformation and pseudoscience.
That readiness, of course, relied on how e-commerce tools like having a website, an email list, online books or video courses, and a facility with social media branding and marketing gave their initial outsider commentaries on the crisis immediate access to trusting, established audiences, many of whom were now stuck at home under quarantine.
And so the viral appeal of contrarianism and fear-mongering would then deliver the lucrative metrics of likes and shares and views and comments and ballooning follower counts.
And these incentives got the escalation cycle churning.
And what we've tracked is how that typically moved a little further to the political right with each cycle, all the while these conspiritualist influencers, as we name them, cashed in on the crisis.
You've studied and written extensively on the underlying conditions for situations like this and how, as the number of social media users and posts exploded over time, the sorting algorithms that prioritized and recommended both content and who to follow shaped platform patrons into affinity groups and target demographics ripe not only for commercial but also for ideological persuasion.
So, that might be a very condensed summary of things I've heard you talk about and things that we cover.
Tell us then, if we go back a little in terms of your history, how you saw these dynamics playing out during the Obama administration, as you interacted with anti-vax propaganda about SB 277 here in California, and then into studying how ISIS was using social media as a tool of radicalization.
That's a great question.
As you were talking, I was thinking about the early days of the pandemic.
And the first thing that I did in early 2020, the thing I got very interested in as it became clear that this was going to be a pandemic and not going to stay confined to China, was I actually started paying attention again.
I went back and I started looking at a lot of the old anti-vaccine networks that I had paid attention to in 2015.
Because as you note, there are a few different things in play.
First, there is the way that information moves today: it happens on particular structures, right?
It happens on social media platforms.
But more specific than that, the information kind of cascades across networks of users.
And you have influencers, who as you kind of articulate, sometimes have economic motivations, sometimes are true believers themselves.
Then you have crowds of fans, right?
So you have the influencer and the crowd.
And those two distinct types of user work together to propel messages across networks.
And the anti-vaccine movement was very, very, very good at this, even dating back to 2015.
And one of the reasons for that was certain of their claims, and again, long pre-COVID, this was mostly the MMR conspiracy theories about autism, and then the sadder ones in which they believed that vaccines caused SIDS.
These were the narratives that they believed they had to convey to the world in 2015.
And they did this through networked communication.
So everybody was actually really moving in lockstep, putting out the same kind of content and boosting each other's content.
It's a very networked communication dynamic.
One of the things that we noticed as we paid attention to the way that narratives moved during SB 277, which was a law to eliminate personal belief exemptions from the measles vaccine requirement for kindergartners in public schools in California.
This followed very closely after the Disneyland measles outbreak.
So that was the environment in which I started paying attention.
At the time, I was a mom who believed that children should be vaccinated for measles to go to school.
And so I thought, okay, well, let me try to understand how this network works.
What was very interesting about it was that as we tried to map the conversation around SB 277, just looking at that hashtag, what we started to notice was that the public health communicators were almost entirely outside of it.
So, these were complete parallel universes.
You can see this laid out very beautifully on Twitter, and I wrote an article about it for Wired.
We articulated... So, that's the kind of structure piece, right?
How information moves.
And then you have the substance, which is what are they saying, right?
And again, in the anti-vaccine community, you have storytelling, you have first-person lived experiences, you have a mom turning on her computer, you know, sitting in her kitchen, telling a story about why she believes her child is autistic following an MMR vaccine.
And that is an incredibly powerful thing, right?
That is a very... It's an emotional thing.
It really hits you.
And then you have, okay, what does the CDC say about this?
Well, they put out a fact sheet.
And so at the time, what I was arguing was that institutional communicators, public health entities, did not understand that the way people communicated had changed.
And they did not understand, you know, other people were saying this too, you know, Tara Smith, Anna Kata, there were a bunch of academic papers on it at the time, and I was just, you know, sort of an activist on the outside saying this in places like WIRED.
But I was beginning to say, hey, the way that people understand things is changing, and you guys have to adapt.
You have to begin to understand the power of storytelling and the way that the network moves the information, the way that the network selects the most resonant information and chooses what to propagate because people become the sharers.
So fast forward, we can go back, we can revisit that in the context of COVID.
But you asked about ISIS.
Around the same time, what was very interesting, and I was only kind of nominally paying attention to it, was the ISIS fanboys on Twitter.
So platforms had begun to realize that this terrorist organization was trying to create what they called the virtual caliphate.
And they were using social media platforms to do it.
And again, you had the structure, right?
You had the amplifier networks, you had the people who were the core creators, the sort of ISIS influencers, if you will.
And these two groups, again, kind of worked together, using the tools that the platforms had given everybody and anybody, to spread this very particular type of propaganda, which was the glorification propaganda, the propaganda of a brand new state, a brand new caliphate emerging, and all the trappings of propaganda: the iconography, the videos, the recruitment videos.
I don't mean the beheading videos.
I don't mean the gore.
That kind of stuff usually came down quite quickly.
But I mean, there was a video called No Respite.
It was one of their recruitment videos and it looked like a video game.
It was, again, the soaring cadence, you must come here, you know, fight the good fight.
And just the way in which propaganda had moved into this environment was happening there as well.
And so everybody had all of a sudden been given these tools to create influential figures and associated crowds, relying on algorithms and affordances for amplification.
And all of a sudden, anybody could do it.
And so I wound up getting asked to kind of weigh in on that in part just because I was doing, I think, some of the more detailed documentation of how it was working in this other space.
And I said, Hey, I don't know anything about terrorism or ISIS.
And they said, Well, that's okay.
We're more interested in hearing from people who understand the internet.
And that was how that connection was made.
And really, for me, it still stayed largely a thing that I was just doing at night.
I had a tech startup in logistics at the time, but it became something of a passion.
How do we understand who has influence, what that looks like, how that works, and how opinions are shaped today?
So that was...
This is a question I know you get asked a lot, and I've heard you give some really concise explanations along the lines of what you were just saying, about how misinformation and disinformation can move across different platforms or different types of media.
So I guess I want to ask you, let's just define disinformation versus misinformation.
And then the understanding I've extracted is that perhaps they start to shade into each other as those distribution cycles translate across different types of media, or, you know, whether it's a website or a social media platform, et cetera.
Tell us about that.
So, misinformation is a word that's generally used to mean things that are inadvertently wrong.
So, the person who is sharing them doesn't realize that they're wrong.
Usually, there's a highly altruistic motivation.
They want to inform their community.
They want to give them, you know, what they believe to be good information.
Disinformation is a term that kind of dates to the Cold War, a kind of KGB-era Soviet term that refers to the idea that you can manipulate the discourse, right?
Information with an intent to deceive.
There is something inherently manipulative about it.
Quite often, it's categorized as false.
That is only sometimes true.
And that's because you should think about it... I actually don't really love these two terms.
I've had a kind of a...
I have mixed feelings about them for a few years now, in part because misinformation is kind of a standalone thing.
People make mistakes, they say the wrong thing.
It is what it is.
But disinformation really is a very particular class of propaganda.
And so if you think about it in that regard, a lot of what it is, is what used to be called black propaganda.
Propaganda where the origin is obscured.
It is a message that doesn't necessarily have to be true, but there is something that is
inauthentic about it.
It is coming through a channel that is inaccurate.
It takes a grain of truth and manipulates it.
And so disinformation, in how I tend to think about it, how we think about it at SIO, much more falls in the realm of that deliberate intent to deceive, that deliberate intent to be manipulative, sometimes through the use of fake accounts now, or front media properties in the olden days and now.
Sometimes it's trying to manipulate algorithms for certain types of enhanced distribution.
Sometimes it is the content. Sometimes the content is false.
And now as we move into the realm of generative AI, you know, we have this whole new world of unreality that we're going to experience together.
So there is this, but ultimately, I think, the real differentiator is that idea of intent.
It seems like part of the structure of these kinds of campaigns can sometimes be that disinformation is deliberately created and shared in a particular way, and then that gets picked up and translates across sort of areas of distribution or modes of distribution, and perhaps gets to the point where it's being shared by people who sincerely believe that this is an important perspective, right?
Right. And that has always been the goal, right? And that was where, if you looked at some of the
canonical old cases, what was called Operation Denver during the Cold War, the attempt to
make people in the United States believe that the CIA had created AIDS. You see what's
called, you know, it's called narrative laundering. It's where a doctored document or a misleading
claim appears in a front media or paid media, or the author of the media is somehow
compromised. And so that initial seed is planted. And what you start to see is other outlets
begin to quote that outlet.
And then this claim begins to appear.
And in the days of print media, that takes, at times, weeks to months for that transition to happen.
In the age of social media, that's changed in two ways.
First, it's accelerated in speed, so that kind of information cascade, that narrative cascade, happens very, very fast.
And second, it's often even better if it doesn't appear in media, because, particularly for audiences that are very distrustful of media, having it move through the peer-to-peer distribution chain is actually far faster.
It's far more persuasive because it's coming from someone who's just like you.
And so you're in a different mindset when you receive it.
And it is conveyed through people who hear something that they're inclined to believe and then go on to propagate it.
And so this is where, with state actors like Russia, who have used disinformation going back decades, even centuries, what we're seeing is an adaptation to a new system, a new environment, and a new way in which people persuade each other much more directly, as opposed to having to go through the slow process of laundering a narrative through a bunch of media properties.
You know, as an aside here, I just can't help but comment on how we seem to be in this very strange situation where, in the US, half of the population maybe thinks that there is a disinformation crisis in the way that you describe.
And the other half thinks that legacy media, and people like yourselves and like us, are actually the ones who are the patsies, who are sort of, you know, continuing to perpetuate the mainstream narrative, and that that's the real problem: we're not skeptical enough about how CNN, MSNBC, The New York Times, NPR are just, like, manipulating the hell out of us and spreading lies.
It's such a strange predicament.
What would you say to someone who says, oh, that's all very well and good that you're skeptical in this way, but you haven't gone far enough.
Here's how you've been fooled.
It's a really interesting question.
First of all, media gets things wrong.
One of the things that you want to see from an outlet that is trustworthy is that they correct themselves when they get things wrong.
And that doesn't always happen as quickly as it could.
Sometimes it doesn't happen at all.
I think that one of the reasons why this has become such a topic of conversation, or such a focus, is that people are very attuned to the extent of the distrust.
Everybody kind of thinks that it's the other side that is distrustful or too trustful.
But that central crisis of trust, I think, is really one of the things that is driving this and propelling it.
And it is.
I think we over-focus, particularly in academia, on what we might call the supply of disinformation.
Where does it come from?
Who is producing it?
When the receptivity piece really is more a question of demand, right?
Why do people believe what they believe?
Why do they trust who they trust?
And how does that happen?
And one of the things that I've been spending a lot of time on, actually, is the old books about rumors.
I don't know if you've ever read any of them, but there is a book that I've become kind of entranced with, written, I think it was in the late 70s, maybe early 80s, by a man, his last name is Kapferer.
He describes the phenomenon of the rumor mill.
And it's absolutely fascinating, because you read it and you think, like, why are we talking about misinformation as if the problem is falsifiable facts?
It is not, right?
It is that people don't necessarily trust the information they're hearing.
They also are now accustomed to getting information very, very quickly.
And so journalism, the entire kind of premise of journalism, was around the idea that there would be some friction created through a process of fact-checking, of investigations and research, that would come up with an understanding of the world. But that takes time. You saw this very acutely during COVID, when we were all, you know, doom-scrolling, checking our phones constantly, trying to get the latest piece of information
about this thing that is so important to us.
It is, in fact, in the early days, a matter of life and death.
And so everybody is very attuned to their devices.
They want information then.
But that's not the speed at which we actually arrive at facts.
So in this book about rumors, what he describes is this process by which you have these people who are really inherently good people, again that same kind of misinformation dynamic, the altruistic motivation, but they don't necessarily know if they trust what the media is saying.
And so what they do, you see this characterization of rumors as like an alternative media.
It's a peer media, if you will, where people say, hey, I heard this thing about this person and you should know, or I heard this thing about this politician and you need to know, or I heard this thing about this restaurant where people are getting sick and you need to know.
And so all of these pieces of information, they're not necessarily verified, but they seem very important.
And so people share them.
And so a lot of what I think about when I look at social media today is how this is actually, I think, a much more appropriate description of the kind of environment that we all find ourselves in.
People don't know what to trust.
They trust their friends or their peers.
Their friends or their peers are the people that they hang out with on social media now, not necessarily people they even know in the real world.
And that's where they're getting their community from.
And so, if you trust the influencer and you trust the other members of the crowd, you're going to continue to share it on.
And that's the...
That's kind of the real, I think, driver.
It's this sort of... Everybody has a kind of alternative back-channel chatter to whatever the official narrative is.
And it's not the same in different communities, but the phenomenon, I think, is the same.
All right, let's move to your work on understanding Russian online interference during the 2016 election.
I know this is covering territory you've gone over quite a bit in public, but I thought we should revisit it because one pervasive theme I've noticed around the Twitter files is that government agencies, in concert with big tech, have supposedly been censoring right-wing voices by using this bogus cover of trying to combat disinformation.
I've even heard disinformation studies described as a pseudo-academic scam.
This is then often dovetailed with the assertion that the Russia probe was nothing but a misguided left-wing conspiracy theory.
So if the caricature that gets labeled as the Russia hoax boils down to something like, well, the Russians just spent a paltry hundred grand on Facebook ads, so what?
And the Steele dossier has been discredited.
And in the end, the Mueller report showed no collusion.
Does that mean this was all just a nothingburger, Renée DiResta?
It's a great question.
First of all, I would say I do not ever know what people mean when they say Russiagate.
I think I always ask the question, what do you mean by that, specifically? Because... so I can talk a little bit about the work that I did and how it fit into the broader space at the time.
But there were two very, very, very different sets of questions that were being asked.
The first, which was the one that I was asked to look at, was with these data sets, which to be clear, were attributed by the tech companies, not by me, by the tech companies.
And they were turned over to the Senate Intelligence Committee.
And then a few different groups of researchers were given access to these data sets.
And we were blinded.
We did not know who the other researchers were.
And the reason that they did that was because they wanted to make sure that no one group, you know, if there was some bias, if there were findings that overstated a conclusion to a degree or said something, you have to remember this was very, very sensitive, right?
This was the question of, you know, what was the Russian effort, and what was it doing.
And if the answer included that it was trying to help elect President Trump, that's a bit of a political minefield.
And so having multiple different groups have access to the same data, the entire effort there was to try to minimize ideological bias to the greatest extent possible.
And so there were, I think, five or six people on my team from a couple of different institutions.
And what we tried to get at was just, what did the data show?
And so the data shows this Russian effort.
It's very different on different platforms.
And that's something that unfortunately gets lost quite a bit, particularly because, unfortunately, the Facebook data was never made public.
The Twitter data was.
And so there are a lot of conclusions that I think are drawn from the Twitter data, when they behaved actually quite differently on Facebook.
And so what they're doing on Facebook, Facebook is a place to build communities.
Twitter is a place to have fights.
And so on Twitter, whatever the hot topic of the day is, the Russians go and fight about it.
So funny enough, the vaccine stuff during the Disneyland measles outbreak becomes a fight that the Russians go and insert themselves into.
And it's not very large.
It's a couple thousand tweets.
And so it doesn't even appear in the Facebook or Instagram or YouTube data because it wasn't a major thing for them.
It was a thing that they would go and dabble in, whatever the arena fight was on Twitter that day.
On Facebook, though, they made a much, much more sustained effort, trying to grow identity-based communities.
And they would reinforce people's pride in the identity that they were targeting.
And then they would position those identities oppositionally.
If you were a veteran, you had to be opposed to Muslims because there were refugees who were Muslim and they were taking your benefits.
If you were a black American, obviously this was at a time of quite high racial tension; there was, you know, Mike Brown and a number of these officer-involved shootings that took place around that time.
And so they would run ads targeting cities that had had one of these shootings, and then they would try to create, you know, to kind of exacerbate racial tension in those places.
Now, the tension is real, right?
There are many veterans who do not receive benefits and do not receive care.
There are very plausible arguments to be made about, you know, race relations in America and the very real grievances that these accounts begin to amplify and to exacerbate.
But that's the dynamic primarily that takes place.
And the political stuff gets layered on at the end.
So it's not the main focus of the operation.
And so it's been interesting to see the work that I did get reclassified, particularly by the Twitter Files propagandists, as, oh, DiResta said the Russians elected President Trump.
I said nothing of the sort, because that's not actually in the data.
And we didn't even make that claim.
We just said, here is objectively, here is what it shows.
It shows a very sustained effort to try to demotivate voters for Secretary Clinton.
And it shows a clear effort to try to actually demotivate, even in the primary, people who supported Marco Rubio and Ted Cruz and Rand Paul, and to redirect that support to Donald Trump.
And so, again, this is objectively in the posts.
Did it swing the election?
We had no way of telling that, and my personal opinion is no, it did not.
But that is the Russia interference investigation.
Over on the other side of government is the collusion investigation, which I had nothing to do with, and I just read about it along with everybody else.
The area in which there was some intersection, though, that was interesting, was that you do see a couple of different Mueller indictments come down. And in one of them, there's an indictment of the accountant and others who worked for Concord, one of Prigozhin's kind of shell company vehicle type things. And the Internet Research Agency, of course, is affiliated with this. And you do see in some of those documents it pointed out that they spent about $22 million on this effort. So not the $100,000 number that's tossed around, which is specific to the Facebook ads; the $22 million refers to the entire scope of the operation, most of which did not involve the ads.
It involved the creation of persuasive personas who could participate in the rumor mills and the Facebook groups and the other places where people form their opinions about the world and decide who to trust and where to engage.
So I think I've probably kind of talked for long enough now.
But I think the one final point that I want to make, though, is just that that engagement really becomes the thing that sort of sets them apart, right?
They realize, hey, we've got this phenomenal tool, we can meet people where they are.
And I think my remaining kind of concern is that as we point out the fact that no, this did not swing an election, which, you know, in my opinion, has never really been in dispute.
The idea that it had no impact at all is also wrong.
The idea that it is not something that we should be inherently concerned about, in my opinion, again, is also wrong.
And so that's where, unfortunately, it does require a whole nuanced answer for you, and I'm sorry I couldn't say that.
To some extent, obviously, Renee, a lot of what you've just been saying is part of your 2018 testimony, your expert testimony before the Senate Intelligence Committee, which was specifically about foreign influence on social media.
And you had tracked a lot of what was going on with the IRA and the GRU.
As all of this keeps unfolding, when we get to 2020, right?
At that moment, we're in an election year, the pandemic is raging, and the term fake news, which had at that point often been used to describe false stories designed by foreign actors to specifically appeal to Trump supporters so as to generate clicks and ad revenue, has now been appropriated by Trump to smear America's free press in this proto-authoritarian way.
And this stokes the flames of polarization and conspiracism as exemplified by QAnon disinformation jumping the rails from 4chan onto Facebook and Instagram and then into CNN footage of MAGA rallies and also into the really unhinged real world violence that we started to see.
As if that wasn't enough, and you've touched on this already, the other hugely significant cultural phenomenon is that the George Floyd protests are going on.
I wanted to ask you, between 2017 and that climax that we saw in 2020, how specifically was Russian interference interacting with and framing the narrative around Black Lives Matter?
We published an interesting paper on this.
Probably most people are familiar with the fact that they were doing a lot of work to exacerbate tension.
Anytime that there was this potential tinderbox, they were in it.
It wasn't just Russia, I think it's important to note.
Iran did it and China did it as well.
And that's because you have to think about this just from the standpoint of geopolitical power games.
Propaganda has always been used, you know, whenever your ideological adversary or national adversary is experiencing some sort of unrest that might lead to a change that is advantageous to you, or maybe just the kind of breakdown of their country or their governance system is advantageous to you; you can kind of throw gasoline on the fire.
And this becomes something that we start to see state actors do.
We can talk about U.S. operations at some point, maybe. But as that's happening, Russia has now, at this point, become a subject of what the platforms call integrity teams, which are looking for and trying to take down their networks as they spin up.
And so these fake accounts, these individual people or the Facebook pages that become front media vehicles are getting moderated away.
And the platforms begin to get quite good at this, particularly around 2018.
They begin to release the data to the public.
They begin to write these reports on it.
And so that dynamic really shifts.
They do, in fact, begin to look for it, find it and take it down.
What you start to see happen, though, is new types of media pop up on new and emerging platforms.
So TikTok becomes popular.
And so you start to see these entities that wind up, years later, filing what are called Foreign Agents Registration Act, FARA, disclosure forms.
In the early days of this unrest, these accounts become quite prominent as a new form of media, you know, new TikTok media, global leftist media that's kind of pointing to what's happening in the U.S. as a thing that the U.S. should be ashamed of.
And it should, right?
That's the reality of it, right?
You do have that grain of truth.
And so a lot of the narratives really date back to the way the Russians leveraged racial unrest during the civil rights movement, where they say, how can America sit on this moral high ground when, you know, there's a situation with the Scottsboro Boys. You know, there's a phrase that the Soviets used, they put it on the propaganda posters: "and in your country you lynch Negroes," right, is the phrase.
And so they are using this as a way to say, you know, why are we framing this capitalist-communist tension, this Cold War, as having a moral high ground, when in reality you treat your own people terribly, right? And of course there is that grain of truth in there. And so it becomes a very highly effective narrative. And you see echoes of that from Russia, from China, from others in their state media. And then these kinds of new front media, these semi-undeclared gray media outlets, begin to become popular.
Because again, the social media companies are trying to take down their fake accounts and their fake personas.
Nobody just gives up and goes home.
You know, it is a propaganda war.
It is a sort of constant, roiling thing that is always happening, with varying degrees of impact.
But so you do see them just try to move to new platforms, new places where they're not going to get moderated as quickly or where they can use different ways of engaging and getting their message out.
So during this period, it sounds like you're saying there are several foreign actors that are wanting to stoke division around or like exacerbate the divisive topic of racial conflict already happening in the country.
Are there specific ways that they did that?
Interesting ways that they did that.
Some of it is, we use this little rubric, we say overt to covert and then broadcast to social, right?
And we think about that as like a two-by-two.
So, you have their state media, which is very overt, that writes articles and just articulates a very attributable state point of view.
You have their overt social accounts in China, this kind of class of accounts that comes to be called the Wolf Warriors.
These very kind of Twitter-personality Ministry of Foreign Affairs accounts.
Like, they're really in there.
Like, they are getting in the fight.
There's shitposting, there's memes.
They're not the way, you know, you don't see this from American government officials.
Like, they haven't quite entered into this modern era, but the Chinese MFA guys, like, they really just get into it.
Their state media editors also create accounts that become quite popular around this time.
And that's through, again, a combination of playing the Twitter game well, and then also the other quadrant, the sort of covert social accounts, which are the fake accounts.
They're really there primarily to boost these other accounts, which are not going to be taken down because they are legitimate speakers.
And so you start to see the ways in which state actors realize, hey, maybe creating a whole bunch of fake accounts isn't necessarily doing the trick, but we can use these fake accounts to boost our real speakers.
And so that dynamic begins to come into play.
And then you have the last quadrant, which is the sort of covert broadcast, and that's where you have the front media and the paid media.
And it begins to move into interesting spaces.
You see China begin to offer to pay existing YouTube influencers to say favorable things about China on their programs.
You see Russia do the same thing.
Again, these are not new strategies.
There has always been that dynamic of, if you can't start the outlet, maybe you buy the outlet, or you buy the people speaking on the outlet.
And so you just see that strategic adaptation.
So it's a very holistic thing.
And so I think there's quite often an overfocus on, you know, the covert social bot accounts or Facebook personas or whatever, when what they're really doing is using all of these things in tandem.
And the United States does this too, just to be clear.
But they use all of these things in tandem to create favorable messages for themselves and to exploit social turmoil in the countries that they want to destabilize.
So I can't let that slide. Give me some examples of how the U.S. does this.
We did a report on this. The U.S. Pentagon had been running accounts as well, right.
And again, some of these accounts... this was actually the most interesting part of the Twitter Files. One of the installments that I got mentioned in was Lee Fang's analysis of the situation with the Pentagon.
We called our report Unheard Voice. If anybody wants to go and find it and take a look at it, we did it with a company called Graphika that does social media research as well.
Dating back to roughly the 2012-to-2015 timeframe, there's this thing called the Trans-Regional Web Initiative, TRWI.
And it was a series of Pentagon-linked accounts, run by SOCOM, I believe it was, that at the time were overt.
Again, the disclosures were properly there on the website.
You had to kind of go into the About page to find them.
But they were putting out the U.S. government's point of view in Central Asia, in, I believe, Southeast Asia, some parts of the Middle East.
And what they're doing is they're speaking to local populations about the United States' point of view.
Now, sometimes what you see is, again, the hallmarks of propaganda, things like "some people are saying," right, which is one kind of red-flag piece of rhetoric, where they'll say: border guards in this community in Afghanistan say that Iranians are sending back bodies of people killed in conflicts without their organs, right?
So they allege that the Iranian regime is doing some sort of organ harvesting.
And again, it's written not as "this is a fact," but as "this person is saying," which is possibly true, possibly not, who knows.
But that's how that kind of content begins to appear.
So, they originally make these things as websites.
And then what you start to see is Congress cuts funding.
In around 2015, there's a dormancy period.
These things disappear.
But then a couple years later, the infrastructure comes back.
And you can see things like AdWords IDs and other kind of technical links that indicate that there's some continuity between these websites, these efforts.
And you start to see fake accounts associated as well.
And there, they are not disclosing that they are affiliated with the Pentagon or SOCOM, and so there becomes a question of whether the correct authorities were followed.
You know, the U.S. government does have certain authorities under which it can run psychological operations.
In this case, we did not know whether they had been followed or not.
So, in certain situations, we will do an analysis of a network and we will say, this is our understanding.
We will not make an attribution.
We're actually really, really careful before we make an attribution saying, we believe this thing is this thing.
There's a multi-stage process that we go through to make sure we have the best possible understanding of that.
And oftentimes that's where we would communicate with a social platform to say, do you also see this?
What is your interpretation of the provenance of these accounts?
And the same thing applies here: we're not going to attribute something to the United States government, or a particular United States government contractor, in that way either.
So what winds up happening is the Washington Post goes and does an investigation and gets a set of comments indicating that, yes, in fact, they're going to investigate what happened here.
It didn't look like things were necessarily entirely above board on that particular set of projects.
So, that's one example of the kind of work that we do.
We are not a shop that goes out and hunts for Russians, just to be clear.
It's more a question of how does state actor influence work?
Who is doing it?
What does it look like?
Is there evidence of inauthenticity?
Is there evidence of this kind of cross-pollination of overt, covert, broadcast, social, this kind of propaganda apparatus? If so, just for me as a researcher who studies propaganda, I'm very interested in that. And when it's coming from the USG, of course, I feel a sense of a need to weigh in just as an American: like, maybe we shouldn't be doing this, guys, you know.
Maybe there are other ways rather than kind of getting in the mud with a bunch of mediocre fake accounts.
Maybe we could do this in other ways.
Maybe we should think about countering foreign propaganda in other ways.
And that was one of the things that I tried to get at, actually, in that Senate testimony in 2018, which of course was years before I saw this stuff: just that question of, you can't do nothing, so what do you do?
And that, I think, comes back to what I said at the start of our chat: what do you want to happen?
That's really the question we need to be asking given this understanding of the world.
What is it that we want to happen?
Speaking of hunting Russians, let's turn to Yevgeny Prigozhin.
He's sometimes referred to as Putin's chef because, after nine years in prison as a young man, he went from selling hot dogs on the street to owning grocery stores and casinos and eventually upscale restaurants, where he would cook for the Russian premier and his visiting dignitaries from time to time.
He's recently become well known for some other reasons.
What can you tell us about Yevgeny Prigozhin?
So Prigozhin has a number of different entities that, for a very, very long time, he denied even existed, right?
The Wagner Group legally is not supposed to exist in Russia.
There's a ban on private military corporations, which is what, for those who don't know, Wagner Group is, right?
It goes and it fights in kinetic conflicts.
So it was in Syria, even though, of course, it did not exist.
It appears in numerous places in Africa, Central African Republic being one, even though it does not exist.
And so, much like the Internet Research Agency, there is always this sense of plausible deniability.
It creates plausible deniability for the Russian government to use mercenary organizations, whether that is for the kind of propaganda contracting or, you know, the sort of social media manipulation that the Internet Research Agency does, or the kind of kinetic operations that Wagner does.
Prigozhin also has a few other organizations linked to him; he's kind of a man of many talents.
Foundations and a news outlet, RIA FAN, and these things work in concert.
So in around 2019, we were doing an investigation into the GRU, actually, Russian military intelligence, and one of the other researchers on my team, Dr. Shelby Grossman, was looking at media in Libya around the election and began to notice these pages.
We did a pretty big report on them.
We actually reached out to Facebook.
We said, you know, these don't seem to be authentic.
They would sometimes just say very, very favorable things about Prigozhin quite directly.
Interestingly, there was not really a strong effort to completely obscure the ties between the two things.
And so in that particular case, one of the questions became, how do you attribute this?
And we went with "entities linked to Yevgeny Prigozhin," because nobody could say definitively. These were Facebook pages and groups that were popping up in countries where Wagner had a presence on the ground.
And they appeared to be trying to bolster relations with the local publics and to make them feel more favorably about Prigozhin's involvement in the country, Russia's involvement in the country; of course, in certain parts of Africa you have Russia and China and France kind of in competition.
So what you start to see again is these tools are used to support some sort of real world power dynamic.
And so Prigozhin is a man who has now taken on a much greater position of prominence, because both his media empire and his kinetic teams, Wagner, are quite visibly involved in the Ukraine war.
And interestingly, he has begun to boast about activities that he previously denied.
I mean, this is not an honest person, right? This is a propagandist. But he has begun to boast about his interference in U.S. politics and in politics in various other parts of the world as well.
He had an interesting phrasing he used: we did interfere, we will interfere, and we are interfering, right, was how he put it.
And he made that statement right before the U.S. 2022 elections, so it got quite a flurry of activity.
I think it's really important to note, you never want to let the adversary turn themselves into this omnipotent, all-powerful figure. That's of course what he's trying to do, because that reputation now helps him in domestic politics in Russia, as there are power struggles going on and the Ukraine war has been quite the debacle for them.
And so again, you never want to assume that this quote unquote confession of nefarious evil deeds being done in 2022 actually means anything beyond the boasting of a propagandist.
I think we should be able to hold two ideas in our head, which is that interference is bad.
It is a thing we should look for.
It is a thing we should understand.
If we do nothing, it grows like a cancer.
But at the same time, it is not necessarily the be-all end-all of threats facing the U.S. polity or impacting our domestic cohesion.
And those two things are simultaneously true.
Yeah, I've been following him for a little while, and it's been fascinating to see him become more prominent.
Part of how he's become more prominent is that this last November he started being open about it, where before he had denied that he was the leader of Wagner; he even started saying that he created the Wagner Group, that he founded it, not Dmitry Utkin, who had been credited with that.
And then just this past February, he started saying, I invented the Internet Research Agency, I funded it, I ran it for a long time, I was the boss.
And yeah, it's hard to tell how much of that is just more of his propaganda, now that he has an agenda to become a more prominent figure on the world stage.
I've seen that he's been having disputes with the senior Russian military leaders.
I've even heard some speculation that if anyone's going to oust Putin, it might be him, and he might be the next leader of Russia, which does seem pretty terrifying.
And then, to what you were saying before, I'm with you in terms of holding those two things at the same time.
It seems to me that, more than swinging the election, part of the crisis that we're in is that we can't agree on the facts anymore.
Everyone's pointing the finger and saying: that's propaganda, that's a narrative, you're being dishonest, your media source has been corrupted.
I don't think that that situation has been created by foreign influence, and you can correct me if I'm wrong, but I feel like it exploits that situation and perhaps makes it worse.
So, you know, I want to ask you by way of closing: as we find ourselves heading towards another election year, with Trump, who I would say is kind of the archetypal embodiment of what goes wrong when disinformation campaigns succeed, potentially running again.
You know, I think if Schellenberger is right about anything, it's that you are at the forefront of something quite significant for our times.
I see it as the battle to preserve democracy, and the study of how corrosive digital disinformation really is.
How it can completely corrupt the cultural conversation, not only on politics, but also on reality, on the reality that we inhabit together.
What kind of closing remarks might you have for us about how to understand the current crisis and how to perhaps be on the right side of history?
There was an effort in the 1930s by a bunch of academics and journalists to try to help just create population resilience by explaining how certain types of rhetoric worked.
And that, I think, is what's really needed today.
It's sort of a similar model, where people who are trusted by their communities become both the counterspeakers and the educators. You guys do this with your podcast.
Here's how you should think about it when you hear this type of rhetoric; when you see these types of words, this is what they actually mean.
Here's the thing they're not saying that you should pick up on as a kind of red flag.
So I think we should do more of that, not through government, but rather through civil society organizations or, you know, artists, creators, people who have a position of trust with communities.
I think a lot of it has to be done at that much smaller level.
I think the idea that we are going to rebuild trust because we're going to elect somebody new, and all of a sudden everything is going to reset itself to some halcyon days of old, like that's just not going to happen.
It's going to take work.
And so a lot of what we see our role being, in the projects we've done at SIO, is to say: how can we help people understand what they need to counterspeak about?
How can we help identify people who are effective at counterspeaking?
How can we say, here's the structure and here's the substance and this is working and that's not?
And can we make that process a little bit easier for the people who have to actually go and do the work?
And that's what I anticipate us doing in 2024 as well.
You know, we are continuing to try to develop an understanding of how narratives work and what influences people and why they trust certain things and that's going to keep going.
Thank you everyone for listening to Conspiratuality.
We'll see you here again next week on the main feed or of course on the weekends or Monday on our other offerings.