65: The Alternative Influence Network, with Rebecca Lewis
This time, Daniel welcomes special guest Rebecca Lewis to the show. Rebecca is the author of the Data & Society report 'Alternative Influence: Broadcasting the Reactionary Right on YouTube' about which Eric Weinstein was such an asshole (our word, not Rebecca's). Content Warnings as ever. The report: https://datasociety.net/library/alternative-influence/ Rebecca's Twitter: https://twitter.com/beccalew IDSG 62 (on Eric Weinstein and Peter Thiel, etc): https://idontspeakgerman.libsyn.com/62-eric-weinstein-part-2-cancel-culture
I'm Jack Graham, he/him, and in this podcast I talk to my friend Daniel Harper, also he/him, about what he learned from years of listening to today's Nazis, white nationalists, and white supremacists, and what they say to each other when they think we're not listening.
Be warned, this is difficult subject matter.
Content warnings always apply.
All right, and welcome to episode 65 of I Don't Speak German, the podcast in which I tell people about the things that terrible people say when they don't think we're listening.
As you know, if you're a regular listener, and if you don't know, if you're not, normally I'm joined by my very good British friend Jack, who does all the actual work of this podcast in terms of making it listenable, so thank you to Jack.
But he is not going to be here today.
Instead, I am joined by a no doubt lovely guest.
I am joined by Rebecca, aka Becca Lewis, the author of the Alternative Influence Network report from Data & Society and a PhD student at Stanford.
Please say hi, Becca.
Hi!
Thanks so much for having me!
I am very happy to have you. And I must say, given that you are on the West Coast, I hope I will never have to ask this question again, but: is your home currently on fire?
So I'm very privileged to be able to say no.
I feel like there aren't many people in California who can say that at this point.
It's so scary and depressing and kind of... I posted about this on Twitter that I think everyone on the West Coast right now is just filled with this sense of existential dread.
And that's the best that you can hope for, right?
Because either you're in an active state of evacuating, or, you know, in actual danger, or else you're kind of sitting there just watching the air quality index and watching the sky turn various shades of orange.
And it's very strange, and horrifying, and, you know, tragic to see what everyone's going through.
But luckily I am safe for the time being.
Okay, well, we hope that that remains the way it is.
So yeah, no, great.
So, over here on this podcast, we have done a number of episodes about the IDW and certain people whose last name rhymes with Einstein, and, you know, someone by the name of Dave Rubin, who said some terrible things about you.
And you and I have been in some contact for a little while.
I mean, you know, kind of friendly, but casual.
And I'm not trying to diminish that or overstate it, I'm just trying to say, you know, we've shared information, etc., etc.
And I really did want to ask you to come on and talk about the report, and talk about some of the issues that you ran into in making it.
And also, since the report is now two years old, to talk about what you see from that same group of people, that same ecosystem, today.
And so this is going to be, like the last episode, more of a grab-baggy kind of chat episode.
I don't have like a specific outline here, but I think we're going to have a good conversation.
I think there's a lot of stuff to talk about.
Um, so, um, I guess, I guess we should just get started.
I can recommend that people look back to our previous episodes where we've talked about the report a little bit, but I guess: tell me, what is Data & Society, and what is the Alternative Influence Network?
Like, kind of tell me how the report, how that thing, happened.
Yeah, yeah.
And I should say up front that I don't work with Data & Society anymore, so anything you hear here is me representing me and not me representing Data & Society.
So if I say anything you're very angry with as a listener, the fault is all mine.
But... is that because you're such a deeply unprofessional researcher that Data & Society shit-canned you?
Is that what happened?
Exactly.
Because I can only imagine that's what the intellectual dark web types are going to assume, yeah.
Well, I'm still very close with them and that team, but at the time that report came out, I had actually already left my full-time position and begun my PhD at Stanford. But there's a team at Data & Society, which is a New York-based think tank that looks at issues of technology and, you know, the social consequences of technology, particularly the internet, a lot of it through a kind of social science lens, so kind of sociological approaches. And this team that I was on was starting to look at what we were calling media manipulation.
So this team formed in 2016-2017 to look at kind of the hodgepodge of characters that you cover on this show, a similar hodgepodge of characters.
And we started out by looking at the ways that this kind of loose group of, you know, men's rights activists, white nationalists, just, you know, chan trolls, all of these different groups were working together to amplify their messaging within the mainstream media and basically kind of hack people's attention.
And, you know, whether it was through Richard Spencer being kind of the suave Nazi and getting coverage through that shtick, or whether it was Andrew Anglin and weev hacking computer printers and getting local news outlets to write about it.
There were so many different ways that they were actually able to get their messaging kind of into the mainstream.
And so that was what we were looking at at first.
And I co-wrote a paper on that in 2017 with my colleague Alice Marwick.
And throughout that, YouTube kept becoming this super important vector.
We would find time and again that these conspiracy theories or messaging points would start on YouTube, or, excuse me, would start in far-right spaces, you know, like a far-right forum or one of the chan message boards.
And it would kind of, you know, just be staying around there.
And then a YouTuber would cover it.
And, you know, before you knew it, a Fox News host was covering it or, you know, an even bigger YouTuber was covering it.
And pretty soon it would kind of make its way into mainstream discourse.
And so because of that, I really turned to YouTube as kind of a site for my next research project.
And of course, I think anyone who spends a lot of time in this world knows that YouTube is this rat's nest of far-right creators.
But at the time, no one was really, or very few people, I should say, were writing about it or talking about it.
Academics were very focused on disinformation on Facebook and Twitter.
Right.
Because it's way easier to do kind of mass data analysis on those sites, right?
Because they're text based.
Well, right.
At least mostly text based.
I don't know anyone who might, you know, understand the travails of trying to understand an entire community through hours and hours of just listening to audio.
I know, what would that be like?
I don't know, I don't know.
That seems like something that no reasonable person would ever try to do, you know.
Just having white nationalists on at two times speed in the background of your day-to-day life at all times.
Right, yeah.
Effectively, no reasonable person would do that.
Actually, what I'm actually learning is I need to contact your previous team and see if they will pay me.
Yeah, there you go.
Well, that was what a lot of people said afterwards, and, you know, I will get into this.
I experienced like a range of backlash after this report got published.
But my favorite criticisms were the ones that were like, so basically, she's getting paid to just sit there and watch YouTube.
And it's like, well, in a way, yes.
Yeah, yeah.
Try it.
Try it sometime.
This is an aside, and I'm not trying to put myself at the center of this, but I do get people who say, like, hey, I'd love to help out.
Give me a channel or two to follow.
And I give them very easy ones.
And they come back in two weeks and go, this is soul crushing.
I can't imagine doing this.
And I don't blame people for it.
I really, really don't blame people for it, because it really is, like, soul-crushing.
But, like, it does require, like, it does take a lot out of you.
So, you know, again...
It really does!
And I think that story is spot-on, in terms of: no one exactly knows how they're going to react.
Right.
When I was going through, actually, the job interview process at Data & Society, they asked me, like, you know, are you going to be okay kind of looking at this far-right content?
And of course, you know, I wanted the job and I said, well, yes, I'm, you know, I'll face it.
But in reality, I had no idea how I would react to it.
And, you know, I have found that I am someone who, much like... you know, I like watching horror movies.
I like kind of understanding humanity through its darkest you know, elements or impulses or whatever.
I found that there there's a way that it helps make sense of this awful, awful moment that we're in.
But it also is, like, genuinely distressing and kind of has, like, a real mental health toll and all of that.
But anyway, sorry, I digress.
Oh, no, no, no, it's fine.
I'm happy to you know, believe me, I always happy to like share notes on that sort But yeah, no, so you kind of approached, like Data Society were looking for someone to kind of do some project like this.
You applied to it and they gave you the job, I guess, a contract or whatever for it, you know?
So I'm sorry, I'm not fully aware of how these things work.
I am the barefoot barbarian in this situation.
Think tanks are very weird organizations.
And in fact, a more recent project I'm working on is looking at right-wing think tanks, because they're these weird hybrid organizations.
But yes, essentially, we were doing a mix of op-ed-type pieces and longer-term research projects.
And the goal, the specific vision of the founder of this particular think tank, her name is danah boyd, and she's a scholar of kind of internet culture.
And her idea was that, you know, academic research often stays like so siphoned off from public conversations.
And it's this often really counterproductive thing where, the more esoteric your academic work is, the better people think it is.
And, you know, the less accessible it is to the public.
And of course, for some fields, sure, if you want to go off in philosophy, sorry if there are any philosophers listening. But with social science, you know, there are real policy decisions made based on social science, a lot of the major tech companies hire social scientists; it has really pretty intense real-world impact, or the ability to. And so she wanted to start this research institute where people would be funded to do research, and then it would get published in a more broadly accessible way, written for a range of different audiences, not just academic audiences.
I interviewed and joined this team, and the mission was to research these far-right groups, but specifically with this bent of: how is the internet shaping this, right? Like, how are they utilizing the internet to their advantage, and how is the internet shaping what they're doing? And so that's why eventually I centered on YouTube.
And I think, yeah, there were a few motivations.
One was that I kept seeing it crop up in my other projects.
Another was that all of the academic research I was seeing come out at that point was much more focused on Facebook and Twitter, and most of the journalism too, frankly, because anyone, you know, in their 30s and beyond has a very different relationship to social media than people in their teens and 20s.
And so I think, for baby boomers, for example, YouTube often is just the place where you go and look up, like, how to get your cabinet fixed or whatever, right?
Right, yeah.
You do like DIY projects there.
Let's watch Fleetwood Mac videos or you know.
Exactly.
Both lovely applications for YouTube.
It's great for those things.
I need to fix my lawnmower.
Here's how to do that.
There you go.
But I think a lot of people, you know, above a certain age don't realize that it is, and now TikTok as well, one of the central spaces where young people get their entertainment, information, and to a certain extent, community.
It's like, you know, TikTok, YouTube, Twitch.
Right.
And then, you know, there's other ones.
But really, there's this kind of central feature of video content and how important that is.
So that was one driving force.
And then the other thing was, you know, I started to get really frustrated after the 2016 election, when all of these news stories started coming out and various liberal foundations started to fund research focused specifically on this concept of, you know, disinformation as just, like, fact-checking.
Like, oh, there's just a series of facts that they get wrong, and if you present that, you know, we'll have these fact-checkers on the show.
You just awarded infinite Pinocchios, and it doesn't exist anymore.
Exactly.
That's the way it happens.
Yeah, clearly.
That's the answer.
The Washington Post is just gonna, yeah.
We shake our finger at you, sir, Richard Spencer, for declaring that all brown people should be deported, because you made a factual error.
It turns out that the 13th Amendment was actually... you got the wording wrong in your statement of the 13th Amendment, and so you now get mini Pinocchios.
And that's the state of fact-checking.
Listen, Daily Stormer, the Jews do not run the entire media.
It turns out that Jewish people are only the heads of 75% of large media organizations, because that's really the key to challenging this.
So yeah, and then I guess the last thing, and maybe, I can't remember now, I think this pet peeve of mine may be a bit more recent: more and more, as YouTube has become part of the discussion around these issues, and propaganda and disinformation and all of this, the discourse tends to focus really intensely on the recommendation algorithm.
You know, the kind of summary that you'll hear or, you know, see in the media is like, oh, the algorithm is radicalizing people.
The algorithm is radicalizing people.
And, you know, it brings them into the dark corners of the Internet.
It brings them down the rabbit hole.
And that's certainly a piece of it.
But I think it misses a lot of the story.
And so what I focused on in this report was much more about, kind of, influencers; micro-celebrity is the term that academics like to use, but essentially these content creators, the relationships that they develop with their audiences, and how they're able to build really strong ties with their audiences and gain their trust, use these persuasive techniques, whether they're fully aware of it or not, and essentially sell them on this, you know, white supremacist ideology, or at the very least kind of reactionary ideology.
And then also the other piece of it is kind of good old-fashioned social networking, right?
And this idea that people were appearing in each other's content so much. And I just realized, you know, I was like, wow, there's really kind of a cross-section here of a bunch of people appearing in each other's content.
And I first was thinking about it when I was going through and watching all of Dave Rubin's videos, so.
I do apologize.
I do deeply apologize.
No one, even I haven't done that.
Like, that's some effort, I will say.
Not because he's even that bad, but just because he's that stupid.
He's just that stupid.
Right.
It's a certain kind of... there's a mind-numbing element to it that you really have to look out for.
He's the right-wing equivalent of a TV tuned to static.
You discern the patterns in the random electrons hitting the screen.
So, yeah, so, you know, what I thought was so interesting, though, about him is that he, I mean, he is able to have this kind of mainstream image, right?
Like, you know, he got written up in the New York Times in Bari Weiss's article about the intellectual dark web.
He used to be an atheist liberal, so he constantly talks about that.
Well, he has a previous appearance on The Young Turks, right?
He had a bit of a background there.
Yeah, which is kind of seen as more of a... I'm not a big Young Turks person; like, who gives a fuck, frankly. But, you know, looking for the haters, that's where we go.
But, you know, he kind of left that and then kind of moved into, like, doing his own thing.
And he kind of pivots from being this, you know, fairly kind of milquetoasty liberal kind of person into being part of this kind of right-wing ecosystem.
Like, you know, I didn't leave the left, the left left me.
And there's an interview that Michael Brooks did with Anna Kasparian.
Where Anna Kasparian talks about, like, how, oh no, he wanted more money.
He wanted to get a lot of money for doing, like, not a lot of work.
And we just weren't able to do that.
And so he left and, like, she's like, I don't blame him for doing it.
If he wanted more money, that's fine.
But ultimately, even she says, like, I think she says something to the effect of, like, he was hired because his husband was, like, one of our producers and he was a really good producer.
And so we hired Dave Rubin because, like, he kind of came as a package deal.
That's so funny.
And so Dave Rubin is like, no, I'm going to be a new star.
And that wasn't happening at the Young Turks.
And so like, what happens?
Let's just go sit and have fireside chats with Dennis Prager.
And that's the way.
Right.
Well, I got to hand it to him.
I mean, like everyone, I think, rightly points out that maybe he's not the most intellectual fellow in the intellectual dark web. But, at least for a time, he was kind of in the right place at the right time, and had that incredible ambition, and was able to, you know, bring disparate people together and hitch his wagon to Jordan Peterson, and all of these things that had him doing quite well for a long time.
And I think, you know, we'll we'll get to this a bit more, I presume.
But, you know, I think he's struggling now a little bit more.
But, you know, what I found so fascinating at the time was, speaking of getting more money, he ended up developing this partnership with a Koch brothers foundation called Learn Liberty.
And I actually haven't seen actual information about them paying him, and he has denied that they pay him, although it is a Koch institution and, you know, I wouldn't be surprised to learn that they were paying him.
Oh no, I'm not being paid by the Koch brothers.
I get my funding through a completely separate organization, which receives some funding from the Koch brothers.
And we are not required to disclose the amount of funding that they get.
And it turns out that that number is very, very high.
We just have an amicable partnership.
We have an amicable partnership.
I understand.
No, the question is, it's really ambiguous how much money I get from right-wing billionaires and how those things affect the opinions that I share.
It's kind of an open question.
I really think that that asshole that doesn't speak German should really speak much more, should really get his facts in order before he states outright that I am being paid by terrible people.
Oh my god, yeah.
Yeah, clearly.
But what they definitely do, like, on the record, what they talk about, is that they supply him with an endless supply of, like, libertarian scholars and public figures to go on his show.
So he became this hub of, like, you know, conversation around these libertarian issues, essentially.
And then he would have on, like, Gamergate guys and Milo Yiannopoulos and all these people.
And so, you know, you asked me at the beginning about my methodology, the way that I figured out, like, who exactly I was going to study, because YouTube is this nebulous space.
I started with, you know, for network graphs, what they call like a seed account.
That's the account I started with, was Dave Rubin's.
Right.
And then I used what they call a snowball methodology, which is, you know, I went through all of his content, for the full year of 2017 and up until April 1st, 2018, which was when I was conducting the research.
And I looked at every single person who had been on his channel throughout that time and logged it; and then, if they had a YouTube channel, I went to their YouTube channel and saw every single person who went on their channel, and just let it go from there. And what happens in these cases is, you know...
You could theoretically have it go on forever, until you've included every YouTuber who has collaborated with someone, but I bounded it at four people, which means it's not enough just to have appeared on Dave Rubin's channel to make it into the network I was looking at. You had to also appear with three other people that showed up in that network.
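[Editor's note: the bounding rule described here, start from a seed channel, follow guest appearances outward, then keep only channels with at least four co-appearance links inside the network, can be sketched in a few lines of Python. This is a hypothetical reconstruction using toy data, not the report's actual code or dataset, and the iterative pruning is just one plausible reading of the four-person bound:]

```python
def snowball(seed, appearances):
    """Breadth-first snowball crawl: starting from a seed channel,
    follow co-appearance links to discover connected channels.
    `appearances` maps each channel to the set of channels it has
    appeared with (assumed symmetric)."""
    seen = {seed}
    frontier = [seed]
    while frontier:
        nxt = []
        for channel in frontier:
            for guest in appearances.get(channel, ()):
                if guest not in seen:
                    seen.add(guest)
                    nxt.append(guest)
        frontier = nxt
    return seen

def prune(nodes, appearances, min_links=4):
    """Keep only channels with at least `min_links` co-appearances
    inside the network. Removing one weakly linked channel can drop
    a neighbour below the threshold, so iterate until stable."""
    nodes = set(nodes)
    changed = True
    while changed:
        changed = False
        for channel in list(nodes):
            if len(appearances.get(channel, set()) & nodes) < min_links:
                nodes.discard(channel)
                changed = True
    return nodes
```

On a toy co-appearance map, a guest who only ever appeared with the seed channel gets pruned out, while a tightly interlinked cluster survives, which is the "appeared with Dave Rubin plus three others" rule in miniature.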
No, but Noam Chomsky.
Noam Chomsky, Becca.
How did Noam Chomsky not end up on your list?
Linda Sarsour, surely she appeared.
She made several appearances on Dave Rubin's show and Joe Rogan's show.
I think she went on the Daily Show a couple of times.
Isn't that something that happened?
Isn't that something?
Doesn't it seem like that should go on the list?
Yeah, no.
Sorry, I'm beating the dead horse from a previous episode.
No, it is very therapeutic for me to hear because this is what I was met with when I released this research.
I mean, essentially what happened, I think you talked about this, but I developed this network, first of all, to bound who I was going to be looking at.
So it ended up being 65 creators.
And also to show just how amorphous the lines between them were.
I mean, I initially wanted to actually be able to divvy up that network and be like, OK, here's the libertarian part of the network.
Here's the conservative part.
Here's the white nationalist part.
And at a certain point, I realized, like, I literally couldn't because first of all, a lot of them are slippery with how they identify themselves.
Right.
And, you know, I wanted to stay conscious of sticking to how they describe themselves, in terms of, like, I wasn't going to call someone a white nationalist unless they had explicitly said: I believe in a white ethnostate, right?
Right, right, yeah.
I actually will apologize to you because I actually mischaracterized you slightly in the previous episode.
Because I said that your report called Ben Shapiro a mainstream conservative.
But that was Ezra Klein, writing about your report, who described Ben Shapiro that way.
That's right.
That error was Ezra Klein's and not yours.
And I noticed that when I kind of went back and re-listened to it and was looking at it.
So I did just want to clarify that.
I think that is a reasonable... well, I would say that is how he brands himself, right?
So I think that is a reasonable way to talk about it, with the caveat that, you know, part of the project that he is partaking in is pushing mainstream conservatism to the right, to the point that... sure, it's mainstream, but what does mainstream mean in that context, right?
So they can be mainstream and still, like, incredibly extremist or incredibly Islamophobic or all of these things.
Well, and part of what we do here, and I've said this on many occasions, is to examine this super far-right, tip-of-the-spear stuff as a way of understanding, you know, how the talking points get fed into the mainstream, but also because these people exist as this sort of, well, I'm not far right.
Those guys who are excluded from the conversation are far right.
And I think, you know, there is this... if I can get on a little bit of my hobby horse just for a second, about a particular someone, someone whose name rhymes with Beric Weinstein.
He did a podcast with Andrew Marantz. Andrew Marantz is a writer at The New Yorker, and did a big profile of this guy Mike Enoch, Mike Peinovich.
People who know this podcast know very well I am very, very familiar with Mike Enoch.
I may be more familiar with Mike Enoch than Mike Enoch's parents at this point.
It is very disconcerting the amount to which, like, we develop, you know, like, we...
There is a bit of a parasocial relationship.
There is a parasocial relationship, absolutely.
It is not a friendly parasocial relationship, and in fact, Mike Enoch blocks me on every single one of his new Twitter accounts.
I'm, like, one of his Insta blocks, so, you know.
That's so funny.
It's fine, it's fine.
He has a restraining order against some people, he doesn't have a restraining order against me, but I've never tried to contact him, so, you know, that's probably why.
But he has wished me dead of coronavirus, so, you know, there is that.
Oh, there you go.
Um, no, so, Andrew Marantz did this profile of Mike Enoch in late 2017, early 2018.
And he is sort of a friend of Eric Weinstein.
He was originally going to do a profile of Eric, and they sort of became kind of social friends, and then the profile kind of never really happened.
But in, I think, September of 2019, they recorded an episode of The Portal, which is Eric Weinstein's show, which came out some months later.
It's one of the more recent episodes of the show.
And I say this because they have a really interesting... I wanted to bring this up in one of the episodes I did about Eric, except it just kind of goes off in its own direction; it's kind of its own topic, right?
But there's a lot of really fascinating stuff there. And among that: Andrew Marantz is very familiar with this world. Andrew Marantz would be someone who is at least as familiar with this world as you and I are. Like, he wrote a book about this topic.
He is perfectly, you know, cromulent with this stuff.
I mean, Andrew, if you want to come on the show, I would, you know, certainly if your friend Eric doesn't want it to happen or whatever, I understand.
But, you know, I would absolutely love to interview Andrew about his experience with Mike Enoch.
But among other things, he's like, you know, the question of is Ben Shapiro alt-right kind of comes up, right?
And Andrew Marantz is kind of on the side of, well, no, of course Ben Shapiro isn't alt-right, because Ben Shapiro isn't anti-Semitic.
Ben Shapiro is Jewish.
He doesn't believe in the Jewish Question.
Yeah.
But that's not the question to ask, right?
Because like alt-right means a lot of different things in this context, right?
And if you're looking at mid-2016, when Ben Shapiro was a big person at Breitbart, and Breitbart was kind of calling itself sort of the front page of the alt-right, the big news source for the alt-right, and Ben Shapiro was their editor-at-large, or whatever his title was.
Yeah.
Ben Shapiro was absolutely one of the major figures in the quote-unquote alt-right. And so the question of who is and is not alt-right becomes this moving target, and that's why I don't really use the term that much; nowadays it doesn't mean much in particular at all.
No, I think that's so spot-on, because in a lot of ways, that's what I was trying to show with the network graph: it's like, okay, well, sure, technically, if you look at Ben Shapiro, if you look at Dave Rubin or any of the most mainstream guys with the biggest platforms, they are very careful about what they do and don't say, right?
And so you're not going to catch them saying, you know, there should be a white ethnostate.
And, you know, I believe that probably neither of them genuinely wants a white ethnostate. But what they are doing is, first of all, kind of helping weaponize people, or turn people against mainstream information sources, and thus destabilizing any previous worldview they may have had, positioning themselves as, actually, the most important sources of information, and of ways to frame how you see the world.
And then they will have these friendly conversations on their channel with people that are kind of, you know, open white nationalists or at the very least kind of one step closer along that journey.
And I think, you know, I think you've talked about this specific case before.
I mean, the quintessential case, which I wrote about in my research, was when Dave Rubin had Stefan Molyneux on his show.
I don't think I talked... I mean, I did an episode about Stefan, but I don't think I talked about Dave Rubin in that particular instance. But I'm certainly fully aware of it.
Got it.
Yeah.
Yeah.
And so, yeah, it's like, you know, Stefan Molyneux has been documented time and time again as a radicalizing force for people.
And Dave Rubin had him on.
And, you know, Dave Rubin has a very big platform.
I think he has a million subscribers at this point.
Yes.
I don't think he did at the time, but he still had a very large audience, and he literally gave Stefan that platform. On that show, Stefan literally says that black people have smaller brains than white people, and Dave Rubin nods along.
He's like, oh, okay.
Absolutely no pushback because Dave Rubin doesn't know how to do that.
Or doesn't care.
Or doesn't care.
Right, right.
What I find interesting about Dave Rubin, in particular, when he interviewed Peter Thiel, which I believe I talked about in the most recent episode, is he seems to have sort of a list of very friendly questions.
It's sort of almost this pre-prepared list of questions which he will ask people.
Like, look, I give friendly interviews, but I don't invite Nazis on the show for a reason.
Right. It's not like... of course, yeah, not every interview has to be hard-hitting. But I think they hide behind, you know, the line that's like, we want to have conversations, we're engaging with dangerous ideas, we have to confront them, or whatever. But that's not what's happening.
There is, you know... even when people give hard-hitting interviews, a lot of times it kind of backfires and actually gives the person that they're interviewing more publicity or more attention or whatever.
And then when you're not even giving them a hard-hitting interview, you're giving them a very friendly one, it just becomes: you're amplifying them, you are giving them the potential to broaden their audience.
And that's what happens.
Like, you'll literally see in YouTube comments people talking about, like, who else came here from this channel?
Like, I loved when you guest-appeared on this channel, and now I've subscribed and I love you, this sort of thing, right?
So when we talk about, like, the algorithm... it certainly plays a very important role, right?
Like, videos get surfaced that way, but content also gets surfaced when you've built trust with a creator and you really like them and are essentially a fan of them, and then they tell you, like, hey, go check out this other person's channel, and they supply the link right in the About section of the video. Then it's super easy just to go over, and, you know, it has your favorite celebrity's endorsement.
And that also, like, that all of these things kind of also feed into the algorithm.
So anyways, you get like all of these forces at work that are both pushing people towards more radical content and pushing that radical content more into mainstream sources.
Yeah, I mean, I feel like the conversation around the algorithm... we did an episode a little more than a year ago about the YouTube algorithm, back when it was at the forefront of people's conversations, around the time we were starting to see, you know, mass shooters, and the way that sort of the YouTube algorithm was... you know, the way that people were avoiding the algorithm to a certain degree by moving to other platforms, and I hope that's something we can talk about here in a minute.
Totally.
But I think something that, I think that there is sort of an algorithmic problem that comes from all these sort of social media networks.
It comes from the fact that, you know, we have four websites, and those four websites are programmed by people who are trying to drive engagement, and they don't necessarily care what kind of engagement they're driving.
They're perfectly happy for you to stay on the website and watch cooking videos.
They're also perfectly happy for you to stay on the website and watch videos that are all about how the Jews are trying to kill children to create adrenochrome.
Right.
Right, right.
They're very happy to kind of do both of those things.
Yes.
Right, right, right.
And it's kind of value neutral in that way.
Yes.
And they don't.
And this is where, not to give credit to the Nazis here, but this is where the Nazis do have sort of a, they're like, well, tell me what rule I broke when you banned me, right?
Oh, absolutely.
Like they have a certain point of view of like, well, I'm happy.
You know, Richard Spencer or Mike Enoch or whoever is very happy to learn a list of very precise rules, such that if I just avoid breaking any of these rules... I can just not use the N-word.
I can use another word for that.
I can just not do this thing They're very happy to do that too in order to spread their message and so I think not having a set of like specific rules is Like, we all know what you're doing, you know, when you kinda come on the platform, but at the same time, like, the fact that these things are not, like, evenly applied, the fact that there isn't, like, this clear standard, like, they do have a complaint on that, except, like, they're promoting genocide, and so, like, maybe, you know, maybe that's something that we just kinda shouldn't have.
On the platform regardless, you know?
And of course, this obviously gets applied to a much, much larger degree against left-leaning creators, but left-leaning creators don't have the sort of, like, platform to complain about it in the same way.
Like, you know, they extract the far-left and the far-right people from their platform.
The far-right people who promote genocide, which we can understand, and the far-left people who think that maybe trans people are good.
Those people are treated as equally bad: someone who might say, like, you know, here are ways you can get to do your medical transition, you know.
Also, just on the terms of, I'm just gonna do this right here, just on the terms of, like, YouTube banning your content: try to post a blowjob video.
That gets taken down faster than anything.
Right.
Like, anything that's pornographic or anything like that goes very quickly. You know, you don't have to agree with this.
You can kind of say... but, like, completely healthy, you know, very normalized adult content is absolutely disallowed on YouTube.
And so the idea that overt neo-Nazis are going to be like, you know, I just have a political belief, is, you know, I don't know.
It's, it's.
No, I think it's so true that the politics of porn and nudity on mainstream platforms are fascinating because it is, I mean,
To the point that you've been making, I think, you know, I focus, as one has to, on a lot of the individuals on YouTube. And not to minimize the role that they play, the decisions they make, and, you know, the accountability that's on their shoulders, but at the end of the day, my biggest critique is for YouTube, because YouTube is the one that's not only hosting this content, but in many ways incentivizing people to make it, right?
If you are an up-and-coming YouTuber... I mean, I saw this time and again with the people I was looking at: they would start with much more moderate content, and over time it would become more and more extreme, because they have these instant feedback mechanisms.
I mean, it's like, you've talked about this with good old Chris Cantwell, right? How he got radicalized by...
Oh, hold on, hold on.
Who?
I don't know.
I haven't heard of this person.
No, no, no.
He, you know, and a bunch of them get radicalized by their audience.
Like, you'll start to hear- I mean, um...
They'll start out with kind of these stories of like, oh, you know, I don't entirely agree with things that have been promoted by contemporary feminism.
It clashes with my personal experience in these ways.
And then, like, several videos later, they'll be saying, like, you know, why Black Lives Matter is a cancer, right?
And it's because they see what is doing well on their channel, and they have this incentive to continue to make more and more extremist content, since it does very well with the audience.
Sure.
And there are plenty of small creators.
I mean, I'm thinking of a particular one, which I won't name because I will only give them more attention.
But don't worry, they're on my archive list.
Whose, you know, sort of brand is, like, incels, whose kind of brand is, like, oh, I'm just kind of a normal dude, I can't get laid, I'm sitting around.
But then also, like, my intro has like Hitler memes, and I define myself as a third positionist.
And, like, I talk about the Jews a lot when I'm, you know, talking about how I can't get laid.
It's like, well, you know, I'd join the military, but I don't want to go die for Israel, you know.
And so, like, I think what we're circling around, and something I want to put a pin in here, is that it is the algorithm, and I think you and I both agree the algorithm was a problem then, and probably less so now. I mean, I don't know, we could talk about it. I do want to talk about it now, can I?
How things have changed since 2018, because I do think that the YouTube experience has changed dramatically several times over the course of the last couple of years.
But the algorithm is a problem because it does kind of drive this particular kind of content.
It does drive radicalization one way or the other.
But there's also kind of a social phenomenon kind of happening.
Yes.
And I differentiate between that, or I sort of highlight the difference by talking about BreadTube, and I don't know how familiar you are with sort of the BreadTube phenomenon.
Yeah, quite.
But, you know, there are a lot of really great creators, like Shaun.
It used to be Shaun and Jen. He's someone whose content I adore.
I think he's great.
Yeah, he spent seven months making a two-and-a-half-hour video completely demolishing the claims of The Bell Curve, which...
Right, right, right, right.
An excellent video.
Which is brilliant content.
And he is well compensated as a creator because he has an audience that will sit for seven months and trust him to make that kind of brilliant content, right?
Yeah.
But that's something where, like, you watch that video, and despite the fact that it's two and a half hours long, you need to watch it, like, a couple of times.
Which I have. I have watched that video. I actually put it on sometimes when I'm just bored, like before bed I'll just put it on. I don't know.
You're not the first person to have told me that.
I put on my favorites. Like, there are some great creators, and we might talk about another one here in a few minutes, but I put on something like that, you know. Because I do what I do, and because I'm very interested in, like, how to communicate these ideas, when I find content that does it very well, I try to understand it, and I just love it.
It's just, like, it's great, right?
But that doesn't kind of fill the same need as someone like, I don't know, Destiny, who I'm certainly not planning to do something on in the near future. I feel very validated, because people were very pissed off that he showed up in the network.
If you would like to come back and talk about Destiny, you would be welcome to come back and talk about that.
Yeah, that's a whole other can of worms, you know.
Yes, I'm gonna go on the Ralph Report and giggle at Holocaust denial jokes and then go, like, oh, come on guys, you know, my neoliberal ideas about how politics works are clearly superior to your banning the Jews from society.
No, yeah, no.
But Destiny makes, like, kind of similar money to someone like Shaun, but produces this, like, huge array of sort of low-quality content, right? You know, and I don't think Destiny is clueless, or a Nazi, right?
Like, but that's sort of the measure. Like, for a long time, and for a lot of these creators, particularly on the right wing, the idea is not, like, let's produce some really well-produced documentaries that make our point. The point is, let's sort of draw people in. You know who's doing eight-hour live streams at this point.
Yeah. Right, right, right.
But the idea is that it sort of creates this like social situation where then you get to hang out in the chats and you get to – they'll share links to like the next guy or you kind of – and you find these other things.
And so like I just want to kind of highlight something that you were saying and I'm sorry to kind of babble a bit about that.
But like there really is this sense in which it's both.
There's a culture and an algorithm that are sort of feeding on each other.
And to some degree, I think it was intentional.
But I don't think that there were like a bunch of like SEO farmers kind of like sitting and like planning this.
I think it sort of grew organically.
But then once it started to happen, it became like this is the kind of thing that sort of works.
And we just... and I expressed this back in, like, a very early episode. I mean, I think episode 15 was talking about the YouTube algorithm.
It was somewhere in that realm.
It was in the teens.
And I was kind of expressing, like, it would be really nice if we had some left version of this, a kind of, we're gonna sit and hang out. And I think there are channels, like The Serfs and, you know, some other ones, that sort of do kind of similar things, but they don't have the same kind of engagement, because there is a kind of difference in the way that people just want to kind of absorb that content. I don't know, like, it's a difficult problem, right?
It absolutely is.
And I think, no, it's true.
I don't mean to minimize the role that the algorithm plays.
I think that the way that you phrased it is, like, spot on: the algorithm feeds into and impacts the culture and community on the platform, and the culture and community on the platform feed back into the algorithm and help shape it.
And you have the politics of the people that built the algorithm.
I mean, they're kind of feeding into how it works.
And there's a really amazing book on this called Algorithms of Oppression by a professor at UCLA named Safiya Noble.
And she was writing about the Google algorithm, but talking about, basically, if you look at both Google and YouTube's search, the way that they present themselves is as information retrieval services.
That's how we all use them.
But the way that you would actually get that is through a library database.
What they're giving you instead is an advertising service.
Right.
Well, and not even like, like it is, God, this is such a big topic and like, I, well, the idea, again, this is the thing I kind of come back to over and over again and talk about this stuff is like, yeah, we only we have four websites now, right, you know, and so.
and they work by kind of getting you engaged as long as possible.
Whereas previously, you know, like in the 90s or the early 2000s, kind of before social media, the idea with Google was: we're going to give you search results that are designed to give you the kind of content that you're likely to want based on the thing that you searched for. And I remember the days when, like, Google AdSense was this kind of controversial thing.
Like, you know, is it well enough advertised?
And now it's just like we don't even think about sort of the invisible algorithm that like provides – like my Twitter feed is designed – like, you know, people think like, oh, your Twitter feed is just sort of a backwards sequential organization of like the people you follow and their tweets, but it's clearly not that.
It's curated, it's designed to give you the thing, but the way that the algorithm works is invisible, right?
And so it's all based on, like, what you've engaged with in the past, and so you just get kind of more and more... Like, I happen to follow a couple of thousand people because I've been on Twitter for a long time, and I just follow... I find a little account that says something I like, and I tend to give them a follow, right?
But I never see their content because, like, they tweet, like, once every three months or whatever, and so it just doesn't appear.
Like, yeah.
Well, yeah, that's the thing.
The two guys that developed the Google algorithm, Larry Page and Sergey Brin, in their original paper that they wrote about the Google search algorithm, they explicitly said this cannot have advertising tied to it because it will inherently corrupt it.
And then of course, like, you know, they have a business to run.
So of course that, you know, very quickly falls by the wayside.
But what it means is that, you know, it's not the best results... whatever best means. Like, you know, there are certainly debates to be had around what the best results to come back would be.
But that's not even the conversation that's happening internally there, right?
It's about what results are going to get the most engagement so that then it can place the most ad revenue on the landing page and all of these different things, right?
And what you get is all of these far-right people, who basically since the beginning of the internet have been incredibly opportunistic about getting their content in front of people, and they do kind of a low-key version of search engine optimization, right?
I wrote about this in my report. You know, it's basically impossible to study quantitatively... scholars have tried to study the algorithm at scale and the search results at scale, and it's really, really difficult to do.
I just did provisional testing, but it was amazing the extent to which, for not only right-wing terms, but also social justice terms, the extent to which the search results were dominated by right-wing creators.
Part of that, like you said, is like, you know, they are churning out content at a much faster rate.
There's just a higher volume of it.
And part of it is that they're really strategic, actually, with like how they use how they tag their content to show up in search results and all of these different things.
I mean, every YouTube creator has to become kind of a mini entrepreneur and they they learn a bunch of tactics and whatnot.
And the way that, you know, YouTube and Google and Facebook, all these places, as you said, you know, they're, um...
They have become kind of the new public square where we have these debates and stuff, but they are not built with productive conversations in mind, right?
They are built to make money, to keep people engaged, and so the incentives are very skewed, and that's why you end up with...
I mean, I just imagine if Lincoln and Douglas had had to do the debates within 280 characters, or within 15-second clips, you know?
And there was an algorithm that was being fed by the slaveholding society.
Right!
That was sort of pushing things in that direction.
And one of the things that I kind of run into, and sorry, we're a little bit far afield, but one of the things that the Nazis will say is, they'll kind of look at my tweets.
And they'll go like, you have no engagement, you got 10 retweets on this or kind of whatever when they're trying to troll me.
And I said, well yeah, because I don't have like an army of people who will just kind of follow me and sit online all the time and retweet and fave like every single tweet and will like respond and engage.
Like, you know, the people who follow me have like lives.
They go about their day.
They're not sitting on Twitter all the time.
They're not sitting on YouTube all the time.
And, like, I'm, you know, like, I'm not trying to, like, believe me, I'm not trying to blame people for being perpetually online, you know.
I'm just saying, like, there is this, like, community, and that's where, again, the culture and the algorithm feed on each other, right?
And so the things that get pushed to the top are precisely those things of, like, the people who are perpetually willing to just kind of sit and kind of mindlessly engage, you know, with it.
Right, right.
And then there's this sense... you know, I think it's deeply ingrained in these communities, and more broadly on YouTube and other social media sites, that the content that's the most popular is somehow the best.
There's this idea that it's this weird, like, meritocracy, or that something is winning in the battle of ideas or whatever, right?
And not the fact that plenty of really awful things are incredibly popular on these platforms. Like, Jake Paul and Logan Paul are two of the most popular YouTubers, right?
It's really, you know, trashy stuff. But a lot of this goes back to, like, what philosophies and ideologies were guiding the people who built these things.
And, you know, the fact that Peter Thiel was like, you know, is clearly, you know... Peter who?
I know nothing about this man.
I clearly know nothing about this man.
I have not been observing.
I have not been watching his C-Span appearances from 1996 or anything like that, you know.
But yeah, like, I mean, he's of course, I would say, the worst of them. But all throughout Silicon Valley, in the early days and still, it's informed by a deeply libertarian philosophy: that we should not interfere at all in how speech takes place, that everyone's gonna get along, it's fine. Never mind the fact that, you know, we tested this initially in a group of, like, 50 white men who were all friends with each other.
And that's what the early internet was.
It's kind of baked into these platforms in a lot of ways, right?
The ideas, the policies, the incentives, all of these things.
It's almost as if it's systemic.
Oh, almost.
We should move on.
I don't know.
I don't know how much longer you want to sit and chat.
I mean, I can I can kind of sit here for a while, but I do.
One of the things that I find interesting is the range of dates that you chose, or that you just sort of lucked into, in a way, for the Data & Society report.
It's the whole of 2017 up until, like, April 2018, right? Those are the dates.
And that's, like, such a... that's almost the perfect time period if you wanted to study something like this, right, you know?
Because you really run into, like, halfway, like, basically halfway in that is Unite the Right.
Which kind of creates this fragmentation. And the whole of the first part, up until mid-August 2017, is this period of consolidation in which all these people are really kind of communicating with each other.
They're really kind of building this thing, and the whole point of Unite the Right was: we're going to unite all of us together and put a united front forward to express these ideas, to try to create some version of a white ethnostate, right?
And because of the events of that day, and, frankly, because of the actions of the on-the-ground activists, who should be seen as heroes of our society, honestly... I was not there that day.
It is almost impossible for me to overstate the importance of the people on the ground on that day who were confronting this evil as it existed, right?
But because of that, you got this clear fragmentation of this thing.
You got this clear... They didn't really know what to do.
You get the optics debate.
I have discussed this in some detail in previous episodes.
I'm not going to kind of recover it today.
But in the beginning of 2018, you start to see these kind of alternative channels, you start to see these live-stream channels. Like, Heel Turn becomes a thing, and the sort of Warski Live, like, the blood sports era, becomes the thing. And that's what you start to see. So, I mean, I'm not saying, like, please reproduce the data for me, Becca, and show me, but it would be really interesting to see sort of a division. Like, I don't know, kind of the end of August 2017, like, what that looks like.
And then, like, sort of late 2017 to the end of your data, to see, like, the shift in the way that the data works to that point.
I don't know.
It's sort of a fascinating look, right?
I know, I... maybe at some point I should do this.
Please take a break from the lackadaisical pace of a PhD to amuse me, Becca.
No, no, I genuinely would be interested to see more of this too, because...
As you were saying, it's like a very particular snapshot in time.
And like, you know, I hadn't looked back at that graph for a long time before today because I was, you know, reviewing it so I could remember things to talk about.
And it is such a snapshot, right?
Because, like, one of the biggest hubs is Andy Warski, because at that moment in time he was really this real node in the network, bringing, like, a ton of different people together for these blood sports debates. And then it really wasn't too long after the end of my data collection that his influence kind of totally imploded and...
Well, JF Gariépy kind of went off to do his thing.
One day I'm gonna have to do a JF Gariépy episode.
I've been trying to avoid it.
Yeah, you probably will.
I don't blame you.
But yeah, and then you have events like Unite the Right, which of course, as you said, fractured a lot of these spaces. And you also have... like, one thing that's fascinating to me is how much
it became clear how much Jordan Peterson was kind of the glue holding the Intellectual Dark Web together, and that with his personal issues...
But your data... you ended your collection before the Intellectual Dark Web was a thing, right? Because that really doesn't get named until, like, August of 2018, I think, or something, you know?
I don't... they definitely were already hanging out together, right?
No, no, I'm not disagreeing with that. I'm not trying to, you know, like, please, little lady, let me tell you the problem. But I believe that wasn't named until significantly later. But you're right, they were kind of hanging out together, and I do want to put a pin in that. I do want to come back to that.
Yeah.
But yeah, you're right.
Jordan Peterson is this very significant figure within that kind of mainstreaming of a lot of these ideas, despite the fact that he's just weird, right?
Yeah.
He's kind of way off.
If you look at his rhetoric, he's not really a Nazi.
He's not really a libertarian.
He's a reactionary.
He has... you know, it's almost like you can take kind of five minutes of him and go, yeah, I kind of like this, this is fine.
You know, even the kookier stuff, you can just kind of look at it as, like, yeah, he's kind of a kooky professor. But when you look at sort of the big picture, and kind of the people that he attracts, and the way that he... it's hard for me to know if that's intentional or not with someone like Jordan Peterson.
I don't really have a really clear take.
I don't think I'm ever going to do a Jordan Peterson episode for a lot of reasons, but part of it is that I don't feel like I really have an understanding of Jordan Peterson.
He's slippery in that way, and I think that he's been so successful in part because he's so slippery and hard to pin down.
It's become basically a meme at this point.
Try to characterize his ethos in any way, you know, and his fans will be like, oh, you have to read this other piece, or watch this video.
That's very common, and the IDW types do it too. It's like, no, you can't possibly have understood him if you disagree.
I mean, I feel like Jordan Peterson is someone... and I guess if we're gonna... God, this is another big conversation.
Yeah, you need to come back.
We need to have more conversations like this But I feel like I feel like the thing with Jordan Peterson is, like, he's, like, the way to understand him as, is, like, A, he's reactionary.
I mean, he kind of comes to the fore.
But just being a transphobe, let's not, let's not sugarcoat that.
He's a fucking asshole transphobe.
That's how he comes to, that's how he becomes, he would have been a nobody, kooky professor in Canada, except, like, he decided to pull his metaphorical dick out and hate on trans people for a while.
Like, that's, that's who he is.
But I feel like what a lot of people, and particularly young men... and you know, as someone who was formerly a young man, I will not speak for you... I sort of understand the idea of, like, you know, I'm 20 years old, I've got a shit job, I'm looking for some kind of direction in life, and kind of having a mentor, a father figure, this kind of online father figure who says: make your bed, make sure you eat every day, kind of control yourself, don't watch too much porn, etc.
Like, I understand how that's useful for people, and on that kind of baseline self-help stuff, I get that. Except the real toxicity comes with: and then I'm pushing this super reactionary kind of vision alongside that.
Yes.
And then when you start to interrogate, because Jordan Peterson is so incoherent in what he actually says, when kids start to interact with that and start to kind of analyze that, they start looking for other people who are, you know, also kind of giving, kind of answering these questions.
And then suddenly it becomes: either you're pushing more towards kind of that IDW direction, and you start to follow Sam Harris and the Weinsteins, or you become a full Nazi, right?
Right, right, right!
Choose your path!
Yeah, sorry, I just kind of thought through that as I was sitting here talking to you.
No, I think that's spot on.
And I agree. I think that part of what is so pernicious about the way that this works is that, like you said, there is a lack of... or I should say, it is a tough moment in time to be a young person, regardless of who you are, right?
Even prior to COVID, even prior to COVID.
Even prior to COVID, absolutely.
And so I think that there is, like, this real opportunity for charismatic people to kind of step in and provide this, like, self-help guidance, and that that in itself is not necessarily a bad thing.
And, like, if he's telling you to clean your room, like, honestly, that's...
Very harmless and good advice, you know. Like, it's kind of trying to set people up to, you know, take a little action, get a little accountability going in your life, whatever. I don't have any issue with that.
I can say, I'm looking at Becca on screen, and her room is much cleaner than mine.
That was not a subtweet.
I just mean all of that is relatively harmless. But Becca Lewis is looking at me and going, maybe you need to read a little Jordan Peterson.
I have a book that may be of interest to you.
It's called 12 Rules for Life.
But yes, then it starts becoming: and the reason that you are so unhappy is because feminists are taking over the university and pop culture, and there are inherent differences between men and women, and that's why diversity programs are actually just bringing men down, and blah blah blah.
It just goes off from there.
Yeah, as you said, it's not always going to end up with... I mean, I focused a lot on radicalization in my paper, even with the knowledge that, of course, not everyone goes all the way to white nationalism from there.
But even in the years since... you know, speaking of things that have changed, I mean, some of it is these shifting grounds of personalities.
But I also think, like, more and more it has become clearer to me, at least, how much even the people that are considered mainstream have such extremist rhetoric around... you know, like, the people committing violence, who now I'm going to forget.
I think it was a shooter in Toronto, was it, whose main media diet was Ben Shapiro?
Oh yeah, yeah, yeah. It's like, you don't even have to go all the way to, um, like, the Mike Enochs.
Well, and to be clear, like, Ben Shapiro is, like, 95% of a Mike Enoch, right? Like, you know.
Right, right.
Like, let's just kind of frame it as, like, Judeo-Christian, you know. Like, he's not a Holocaust denier, he's not doing... you know. But he's really, really far out there.
And I think that, like, one of the other things, again, just to talk about sort of the perception of, like, Ben Shapiro within these
spaces, is, like, he kind of gets... like, Andrew Marantz is like, clearly Ben Shapiro is not alt-right, he's actually, like, the largest target of the alt-right.
Well, the reason he's the largest target of, like, Spencer and Enoch and Anglin, etc., is because they see him as sort of the Jewish interloper who's coming in to give people most of what we believe, but is going to avoid, like, the real and important issues. And so, like, the reason is because they agree with him so much, but he rejects sort of their central framing device. As opposed to, like, Ben Shapiro is fucking awful. Like, Ben Shapiro is absolutely atrociously bad, and he speaks to, like, these giant audiences. And that's something that these guys have learned to a certain degree. I mean, like, someone like Nick Fuentes, who I've, you know, I have some familiarity with.
Right, right, some familiarity with,
euphemistically speaking, you know. Let me go, let me go watch another three-hour episode of Nick Fuentes, you know.
One of the things that he's learned is, like, you know, to sort of frame himself as a pro-Trump Republican, and this kind of trad-Cath sort of thing. Yeah.
Who is so good at that. He will talk in sort of coded language, or will talk to his audience about the Jews, will express, you know, genocidal, Holocaust-denying rhetoric, but, like, sort of frame it in a way that his audience knows what he's saying.
And even when he gets, like, these kind of, you know, super chats, even when he gets these comments that he's responding to, he kind of, like, will laugh and go, like, well, we can't really talk so much about that, you know, sort of thing.
Yeah.
The knowing.
And they've really learned how to do that.
And so, like, it's very much on this, you know... I mean, Nick Fuentes did get banned from YouTube.
I was, I was amazed that he got banned from YouTube.
I know.
I have been actually quite surprised by the amount of people that have been banned in, you know, recent months and years.
I'm actually really mad at YouTube because they banned Stefan Molyneux.
Not because I thought they shouldn't ban Stefan Molyneux, but because they didn't tell me first.
They didn't.
I know, as researchers it provides a very specific problem.
Like, can you please leave his content up only for us?
Can you leave his content up just for the two days it will take me to re-archive it?
Because I had it all at one point.
And then I didn't have the money to buy a new hard drive, so I had to delete it, because I figured it was a low priority for YouTube to delete it, since he avoids tripping the censors in that way.
And then you delete his content, and apparently he didn't even have it all, and so he put a bunch of it up on his BitChute, and so a bunch of that stuff is completely lost.
But I had it at one point.
I had it at a certain point.
I had it all.
There has to be someone who has it sitting on a hard drive somewhere and it's just a question of finding them.
Right.
It's just it's just kind of one of those things of like, you know, please, YouTube, just drop me a note.
That's all I'm saying.
You know, like, you know.
But yeah, to your point, like so much of it is around how this like personal branding piece of it.
Right.
And some people like are fine with and prefer to brand themselves as open white nationalists.
Other people and some of this gets to the the optics debate.
But like one of the things that was so fascinating and troubling to me doing this research was like the women, the far right women and the amount that they emulated like beauty influencer tropes.
Right.
And like.
I follow some of those.
I follow some of those.
Yeah.
Yeah.
And like like their Instagrams, I mean, like Brittany Pettibone, who literally is married to like the head of the.
Yeah.
Martin Sellner, who the Christchurch shooter, Brenton Tarrant, literally gave money to, like... Exactly, like an actual neo-Nazi.
A person who actually killed 51 people, 51 brown people in New Zealand, gave a big chunk of money to this guy, and this guy is married to this woman, and this woman is on YouTube.
Like, that's...
And and she and she also has an Instagram page where she doesn't talk about ideology.
She just posts like romantic photos of her and Martin Sellner together as any, you know, as any Instagram influencer would do.
Right.
And so it's this real kind of use of social media aesthetics and personal branding to kind of launder their image.
Yeah, I mean, there's a really big, again, another big topic we could probably talk about in terms of sort of the women in this movement and sort of the way that they use kind of an online persona because a lot of them do embrace this kind of like trad wife, you know, kind of wheat field girl thing.
You know, in my experience, it's sort of like you get kind of one of two extremes with women in the movement: either they embrace this sort of, like, Sarah, Plain and Tall, very, very mild aesthetic of, like, I'm just going to give you children and, no, I'm not going to, I'm not going to ever challenge anything, or they go full on, like, you know, rate me songs.
And I assume you know which person I'm referring to there who.
Right, right, right.
We'll eventually have an episode devoted to her.
But, you know, so many people, so little time, you know, ultimately.
She's been kind of quiet, so I kind of am willing to let her kind of do her thing.
Although she kind of, yeah, no, we don't have to talk about that.
I guess, I mean, we've been, you know, an hour and 15 minutes or so.
Again, I can talk forever.
But I do want to not strain the audience a little bit, and I'm interested in kind of what you're seeing from either the figures that you saw at the time that you finished the report, or from like the kind of the alternative influence network at large.
You know, sort of that same idea.
How you see 2018 as different from 2020, because I have very specific opinions about this, and I suspect that you and I are kind of following some of the same things. But as I kind of ran into with, like, Megan Squire, like, one of the things, because I did an episode with Megan Squire, who I'm assuming...
Yeah, yeah, I was just listening to that earlier today.
Yeah, she's amazing.
Oh, no, she's great.
She's great.
I love Megan.
I mean, you know, either you or she is welcome back at any time, you know?
Thank you, I would love to come back.
Talking about, kind of like, what I run into is that I kind of experience phenomenologically what the two of you kind of experience through your data science, right?
And so, you know, for me, I can tell you that like in 2017 and even 2018, I got, I kind of would, you know, I would see, like, these, like, kind of overt Nazis who are maybe slightly coded.
You know, obviously, like, uh... I mean, TRS, The Right Stuff, had a YouTube channel for a while, and then they kind of got banned, and they decided, like, oh, we're not gonna keep uploading just because it's, like, annoying, and so they have, like, a BitChute channel instead.
Um, but, like, Nick Fuentes is doing his thing.
James Allsup was around forever.
Like, he had 500,000 subscribers before they banned him.
Guess who has that entire channel archived?
But, you know, there is a sense of, like, you actually could find kind of overt Nazi content, and that kind of got, like, trimmed away first, like, the algorithm wouldn't send me to it.
You know, because, yeah, again, for a while, and this even into early 2019, I could kind of log into my YouTube and I would see, like, overt Nazi shit kind of just, you know... As a friend of mine said, like, you know, I don't want to see what Daniel's YouTube recommendations are like.
It's just probably, like, a wall of screaming skulls. And I would kind of just do, like, screenshots from my phone and go, like, no, this is... this is actually, you know, this is literally what I'm recommended.
And it's like, you know, I'd listen to a few, like, songs just to, like, pull away from, like, the absolute, like, worst of the worst, which is the sign that, like, I kind of found the right wall, right?
I kind of found, like, the furthest of the furthest content that was allowed on YouTube. And so I used my own kind of recommendations as a barometer as to what is allowed and what is recommended, because for a while I was getting all this recommended, and then it wouldn't be recommended, but it would be available.
Like it would just kind of feed me more Nick Fuentes with nothing kind of further than him or nothing like just kind of based on kind of what I saw.
And then at a certain point it just kind of like stopped giving me any of that at all.
And then, like, and particularly after, um, you know, sort of the live streamers all moved to DLive, because they weren't really getting engagement on YouTube, because the YouTube stuff, uh... just the Terrorgram types, the Siege-pilled.
Um, and now ironically, you know, most of what I get is IDW stuff.
Most of what I get is, like, more Eric and Brett, and, you know... And now those guys are doing the same thing that the Nazis were doing, and I'm not trying to directly equate the ideology here.
Please be clear, for fans of Eric or Brett Weinstein who might be tuning into this podcast for an obscure reason, I am not trying to say that Eric Weinstein is a Nazi.
I'm saying he's kind of buddies with a pseudo-Nazi, and he's employed by a pseudo-Nazi, who maybe is like paying Nazis to do content, but who knows.
But what I'm getting is, like, they're doing the same thing now that those guys were doing then, because you get...
Eric and Brett will bring on Sam Harris, and then Sam Harris will bring on Coleman Hughes, and then Coleman Hughes will bring on Sam Harris, and then they'll all kind of talk to, and so you get this whole, the network, the idea of the network is the same.
It's just that it's shifted away from kind of the overt stuff into this more like, we just have to, we just have to crush the left guys.
We just, we can't, we can't have socialism.
We can't have socialism.
We gotta have the police!
The violent thugs in the street, the Antifa.
The police, the police, they just have to be brutal.
They just have to be brutal, don't you understand?
This is what has to happen.
So that's kind of my experience of the whole thing.
And again, maybe that's a leading question.
What's your kind of thought about it?
No, I think, unsurprisingly, I think you're absolutely right that, you know, the grounds have shifted.
The check is in the mail, Becca.
The check is in the mail.
You're saying everything so intelligently!
And so handsome as well, that was the other...
So I think that a few things have happened.
I mean, YouTube has changed certain policies in really important ways, and they have continually talked about changes they have made to their algorithms to kind of not drive people so deep down towards Nazi content.
They also have been, yes, banning like a surprising number of these figures.
And of course, like you were saying, like that that stops at a certain point, right?
Like they are never going to ban Ben Shapiro.
They're never going to ban Stephen Crowder because these figures are too entrenched in the mainstream.
They're too powerful at this point.
And the platforms are terrified of saying that they have a bias.
Yeah.
We might have said that about Alex Jones at a certain point, right?
And they did ban Alex Jones.
And frankly, Alex Jones is terrible.
I'm not saying Alex Jones isn't terrible.
He's less bad than Ben Shapiro.
But that's kind of because Ben Shapiro has sort of a mainstream voice.
He has a connection to more of a mainstream audience, whereas Alex Jones... I don't know, it's kind of a difficult question.
I shouldn't say they never would, but I think that if YouTube was going to ban Steven Crowder, they would have done it throughout the Carlos Maza stuff.
When, like, it was very clear, like, time after time after time, all of these times he had violated their policy.
And because it was this giant media thing, they really got caught.
Like, they didn't want to come across as, you know, disproportionately taking down conservative content.
And so they just didn't do much at all.
And so they they kind of have backed themselves into a corner in that sense.
But And, you know, Ben Shapiro brings in a lot of revenue for them, you know?
At a certain point, with some of these, you can tell, like, with their biggest celebrities, that YouTube is very hesitant to take certain actions, because these celebrities drive revenue for them and they have big, you know, platforms with which to criticize YouTube.
But anyway, so yeah, the figures have shifted, but a lot of the content is still there, just kind of like in slightly more dog-whistley terms and, you know, framed in different ways or the paths are slightly different.
But I think the other thing that's interesting is that, you know, Not to say that banning people or de-platforming people isn't effective.
I think that in a lot of ways it is effective, but at the very best, it's a game of whack-a-mole.
When you still have these incentives in place, these extremely laissez-faire content moderation policies, other people are going to figure out other ways to kind of, you know, become famous until they, you know, maybe get banned.
And then other people are going to do it.
So like, now you see this upswing of QAnon content, right?
And that's a different network.
But that's, that's new since I did my research, right?
Oh, yeah, sure.
No, that's, I mean, the first QAnon posts were at the end of 2017.
And so and that was tiny.
Certainly not, like, on YouTube. That was, like, 4chan and shit, like, you know, right?
That's a whole other thing. Like, you know, I... I don't know.
Did you did you have the chance to watch the Dan Olson video by chance?
I did it was really good.
Yeah, so so I feel like Maybe we should wrap up and kind of talk a little bit about, because I thought that was a really important video.
And this is, this was released like, you know, Becca and I planned to do this like three weeks ago, because that's kind of how I plan out episodes.
I'm usually prepping, you know, a few weeks in advance.
And then this video drops and I watch it and I'm like, this is clearly something that Becca and I need to have at least a brief conversation about.
Yeah.
And so this video was called In Search of a Flat Earth, and I have watched it three times in the two days since it dropped.
I think it's a really profound video.
And it's by Dan Olson.
He goes by Folding Ideas.
We've talked about him a bit in our Fight Club episode and in a couple of other places.
I think he's a great... He's one of the great BreadTubers, precisely because he talks about things in a really...
Involved, sociological kind of sense, as opposed to kind of, like, getting distracted by the big shinies. And I think the best IDSG episodes get, like, 10% as good as, like, a Dan Olson video, you know.
I think the one that I did by myself, about The Right Stuff and genocide, I think that one kind of was... I was thinking of Dan Olson when I produced that one.
I think they're both great.
Different types of media.
It's a very different thing.
I'm not asking you to say I'm better than Dan Olson or anything.
But the video, it starts off as talking about debunking a particular kind of weirdo flat-earth thing.
And he goes out and does some experiments and produces really beautiful footage to debunk Flat Earth, you know, and then talks about how it's not enough because of these kind of sociological phenomenon.
Talks about, like, Flat Earth, and then at a certain point goes like, well, the community has kind of been crushed.
They're not really kind of getting play the way they used to.
This is something that's going to go away, and in a lesser episode, that would be the end of the episode.
Except then he goes, but they've all gone to QAnon at the 37-minute mark.
And then he does a really interesting talk about how QAnon has sort of absorbed the same kind of ideas and the same kind of phenomenon that sort of fed the flat earth phenomenon.
And talks a lot about how YouTube allows that to happen and how it is a kind of political phenomenon, a sociological phenomenon, as well as an algorithm.
And no, I just thought it was a really interesting thing, and I would love to get your thoughts about it.
I was just trying to give it to the audience who maybe haven't watched it yet, since it just came out two days ago.
Yes!
No, I thought it was really incredible as well.
And yes, it gets at that exact thing, that it's like they're I think because, you know, we're living in an information environment now where like, you know, certainly there are people who go up and like, you know, join official Nazi groups with membership and so on.
But more often than not, you get these kind of, like, loosely networked hodgepodges that kind of will come together at various moments in time for a while.
And then kind of disperse, and we'll go in different directions, and then we'll reformulate in other iterations, and then those will disperse, and so on and so forth.
So like, you know, in 2015, 2016, like, a huge swath of the internet, right, in various forms kind of rallied behind Trump, right, and that became a big unifying force.
The God Emperor, right, yeah.
The God Emperor, then they all went their separate ways.
And then you kind of had this fragile alliance between the alt-right and the alt-lite.
And then after Charlottesville, they kind of went their separate ways.
And, you know, even earlier than that, you had kind of like the New Atheists, which like a whole piece of them kind of went off in the Gamergate direction, right?
Oh, Elevatorgate was the thing.
I come out of that New Atheist thing.
That's sort of my origin story, yeah.
We'll have to talk about this, because I'm doing some research on this.
Okay, well maybe after you turn off the mic, I'd be happy to chat with you.
I was in that from the beginning.
I was there before the beginning of New Atheism.
Wow, fascinating.
But yeah, so it's like these moments come and go, but a lot of the same themes and anxieties crop up, and there's always this reactionary strain, whether it's anti-feminism or Islamophobia,
Or kind of like, you know, the complete distrust of official narratives, right, that you find with the Flat Earthers and the… Well, and you can't overstate the importance of, like, Fox News.
Like, Fox News was something I talked about with Ed last week.
You know, Fox News was founded in 1996.
And it was kind of explicitly, you know, this sort of, and this, this feeds back to the Weinsteins, right?
You know, of, you know, the Weinsteins talk about, particularly Eric talks about, like, well, the New York Times is this, is this, like, kind of reputable source, and we can't get access to the mainstream media.
And it's like, Your brother literally announced this bullshit Unity 2020 thing on like the biggest show on cable news.
So what the fuck are you talking about?
You have an enormous microphone, both on your kind of YouTube channel, but you're being fed into this kind of right-wing media ecosystem, which is gigantic.
It's much, I mean, it's at least as large as, you know, kind of milquetoast liberal centrist New York Times, you know, kind of whatever.
But that doesn't count, right?
And that's the thing that they sort of leave behind, is like, oh, there's this right-wing ecosystem, and everybody knows those people are completely out to lunch.
We're not being taken seriously in the mainstream publications, because the mainstream publications, for all their faults, look at you and go, like, we actually have the slightest amount of journalistic integrity.
Right, right, right.
Whereas Tucker Carlson has an agenda.
Pushing a covert white nationalism.
Come at me, Tucker Carlson's lawyers.
Please don't.
And again, I feel like, sorry to bring this up kind of late in the episode or whatever, but there is this sort of context in which all this stuff, like... it feeds on this aggrievedness.
It feeds on this kind of idea of we are being left out of the conversation and not because we're not allowed to have a kind of conversation amongst ourselves, not because we're actually being censored, but because we're not, our ideas are not being trumpeted in this kind of uncritical way from what we consider these kind of like big media outlets.
And the reason is because You're just fucking stupid and you're fucking wrong and nobody cares about your thing, you know?
Also, it's the result of a literal decades-old strategy to try to continue to move public discourse to the right from these well-funded institutions.
...social media platforms, but this stuff didn't begin with them, right?
Oh, no, not at all.
I mean, you can find clear parallels.
I mean, the original networks, and Chip Berlet has written about this... I mean, I have, you know, papers written in 1985 that talk about, like, how some of the original uses of the early internet were literally, like, pushing white nationalist terrorist propaganda onto BBS servers, you know, which I assume you've seen, you've seen some of.
I've got some books right here.
I've got some PDFs I could show people.
Yeah, yeah, yeah!
One day we will definitely, like, talk about that.
I've just been looking for more information about it, frankly, you know.
Yeah, Jessie Daniels, the sociologist.
I don't know if you... I know Jessie Daniels.
Her stuff is... She follows me on Twitter.
She follows me on Twitter, you know.
No big deal.
Yeah, she's amazing.
And yeah, I love her book about kind of far-right activists on the early internet.
What's the name of that book?
Wait, wait something.
Wait.
So she has White Lies, which is from before the internet, actually.
That one's also amazing.
But then she has one called Cyber Racism that talks about kind of the early days of Stormfront.
I own White Lies.
I have read parts of it.
I don't own Cyber Racism yet, but you know.
They're both great.
Yeah.
Anyways, yes.
So it's been going on for so long and shape-shifting and, you know, the grounds shift, but kind of the problems stay.
Or, you know, people and movements and media figures are able to kind of reinvent what they're talking about in ways that make it seem new and exciting.
Yeah, because, you know, ultimately there's always another way to blame the Jews.
That's the truth.
So Becca, I want to give you the last word.
Do you have anything that you wish I had asked you about or anything you'd like to kind of say before we kind of wrap up here?
And please, you're welcome back anytime.
You can come back next week, if we're all lucky.
I mean, I would love to talk to you again.
This has been a lot of fun for me.
No, I would love to come back at some point.
Yeah, this has been really great.
I appreciate you giving me the chance to kind of ramble about the things that I find super interesting.
Would you like to call Eric Weinstein an asshole?
Because you can or cannot.
That is up to you.
I don't think that's legally actionable.
I am going to be incredibly professional.
Okay, I am going to call Eric Weinstein an asshole, and Becca Lewis does not officially agree or disagree with that statement.
I will say that he has led a months-long obsessive harassment campaign against me, and let that speak for itself.
Sure.
Actually, it's probably unfair to call him an asshole, because assholes have a very powerful and useful function in society.
No, sorry.
I just thought of that joke.
Where it breaks down.
Where the metaphor breaks down.
So did you have anything you wanted to add?
No, I think that, you know, as you said, like, we could go on for hours about any number of the topics that we touched on.
But if anyone's interested in checking out the research, it is publicly available, unlike much academic research.
It's called Alternative Influence, and it's by me, Rebecca Lewis.
So if you just Google that, it should come up.
It'll be in the show notes.
I'll stick it in the show notes.
Oh, great.
Great.
And anything else you want added, just give it to me.
I'll make sure it goes in the show notes.
Sounds good, sounds good.
And yeah, I guess that's pretty much it.
Thank you so much for having me.
Oh, well, thank you for coming on, and yeah, again, welcome back any time.
And until next time, next time we're going to talk about, Jack will be back, and we're going to talk about the new political party.
I put that in big quotes from the boys at TRS.
From the Daily Shoah boys, they have a new political party, and it's amazing.
And there's lots of Nazi infighting, which is always our favorite thing, is exploring Nazi infighting.
When they argue amongst themselves, it only benefits the rest of us.
So, Becca, where can we find you on the internet?
Oh, my Twitter account is at beccalew, B-E-C-C-A-L-E-W, and everything else you can probably find through there.
Awesome.
I'll put that in the show notes.
So yeah.
Great.
It's great.
Thanks again for coming on and we'll see you next time.
That was I Don't Speak German.
Thanks for listening.
If you enjoyed the show or found it useful, please spread the word.
If you want to contact me, I'm at underscore Jack underscore Graham underscore, Daniel is at Daniel E Harper, and the show's Twitter is at IDSGpod.
If you want to help us make the show and stay 100% editorially independent, we both have Patreons.
I Don't Speak German is hosted at idontspeakgerman.libsyn.com, and we're also on Apple Podcasts, SoundCloud, Spotify, Stitcher, and we show up in all podcast apps.
This show is associated with Eruditorum Press, where you can find more details about it.