Danny Jones Podcast - #116 - Unmasking The Man Behind QAnon | Cullen Hoback Aired: 2021-11-12 Duration: 01:53:05 === Cullen Hoback's HBO Project (09:47) === [00:00:01] Hello world. [00:00:03] Cullen Hoback is a documentary filmmaker who is responsible for the new HBO series titled Q Into the Storm. [00:00:10] Cullen spent three years filming behind the scenes with the guys responsible for the website where Q posts. [00:00:16] Q Into the Storm is about a bitter rivalry, the origins of QAnon, and who is behind the operation. [00:00:22] It's also about free speech and pigs. [00:00:24] The series examines the connection between QAnon, President Trump, and political and ex-military operatives. [00:00:31] It also explores QAnon's influence on American culture and politics and probes the consequences of unfettered free speech permeating the darkest corners of the internet. [00:00:40] I highly recommend listening to this one from start to finish. [00:00:42] It's packed with gems. [00:00:44] Please enjoy this gripping podcast with Cullen Hoback. [00:00:55] This is going to be the most high production value Zoom podcast I've ever done, I think. [00:01:01] You know, if I can't set up a decent AV setup on my end, what good am I? [00:01:06] So, it would have been way cooler to have you here in person, but this is cool too because I think it works as well, since it's like a common way you interview people in your documentary. [00:01:17] Well, usually what would happen is I would interview somebody through Skype or Zoom or whatever, you know, whatever tool they prefer beforehand. [00:01:26] And then if it makes sense, you know, at the end, like, okay, you got a sense of who I am, vice versa, there's more to talk about, you know, and then I would fly out and do the interview in person. [00:01:36] Um, that of course changed during COVID, and I couldn't get to Japan to film with Ron, and, you know, I really didn't want to do that interview over Skype or whatever.
[00:01:49] I wanted to bring up my sort of list of reasons as to why I could make a strong case that he's Q in person, but. [00:01:59] Anyway, if that had happened, he probably wouldn't have slipped up in the same way, and I wouldn't have the ending that I do. [00:02:05] Right, right. [00:02:06] Well, it's fascinating. [00:02:07] That is one of the best documentaries I've seen in a long time. [00:02:10] I mean, everything about it, I was fascinated by. [00:02:13] I told all my friends about it. [00:02:15] Everything from just even the intro to it to the way you did some of those interviews with people that wanted to remain anonymous and you animated that one guy to be a fly. [00:02:24] I mean, just everything about it is so detailed, so much attention to detail. [00:02:28] Yeah, it's like a twitching cicada. [00:02:31] Puzzle moving in the background. [00:02:33] Yeah. [00:02:33] Well, thank you. [00:02:34] Yeah. [00:02:34] I mean, there were so many things like that, when I was recording audio or when I was in Italy and I wanted to represent Luther Blissett, you know, in some kind of visual way, where I was shooting background plates, knowing that I was going to want to animate that in the future. [00:02:52] But I didn't know how I was going to financially do it because that kind of stuff costs a lot of money. [00:02:58] So, you know, the whole time I was in production, it was all independent. [00:03:03] So, you know, I was just kind of imagining that at some point in the future I would get the resources necessary to tell the story that I was hoping to. [00:03:12] So I was kind of gambling along the way that shooting background plates in a situation like that, I'd be able to later animate it. [00:03:18] And, you know, it worked out, but there was no reason to believe at the time that it would have. [00:03:25] You know, HBO didn't come on to Q Into the Storm until September of 2020.
[00:03:30] So the two years before that, it was just mostly flying solo. [00:03:35] Wow. [00:03:36] How long after HBO came on? [00:03:38] How much more work did you have to put into it after that? [00:03:43] Well, I mean, shooting was about 95% done at that point. [00:03:48] Obviously, I had to do the follow-ups with Ron, shot at the Q convention in Arizona, and then went to January 6th with Jim. [00:03:57] But that's when we were doing the majority of the post-production on the series. [00:04:02] So, I mean, the series hadn't been edited. [00:04:05] We didn't start editing. [00:04:07] Until October. [00:04:08] So the show itself was constructed in about four months' time in post-production. [00:04:14] So we had to scale up massively almost overnight. [00:04:18] And that kind of production schedule for editing an entire documentary series that's this complicated, I mean, we just had a remarkable team, but it was just crushingly difficult for everyone involved. [00:04:30] I don't think anyone, including HBO, had ever done anything on a timeline like that to deliver a doc series. [00:04:40] I had done some editing prior to it. [00:04:42] I had cut about 80, 90 minutes of scenes, and that's what I took out to HBO. [00:04:48] And I said, here's what I think it's going to look like. [00:04:52] But obviously, even those scenes changed along the way. [00:04:56] I'm so interested in what that process is like because I've been a part of that process before, trying to shop shows and documentaries. [00:05:02] And when you're putting in all your time, like, your mindset is like, I'm completing this project, whether HBO or anybody comes through and finances it or not. [00:05:12] Like, when you're going into it, obviously, you're in the process, you have someone that's helping you shop it to these buyers, and you have some sort of packaged. [00:05:24] Sort of synopsis or sizzle about what you're doing, or how did that whole process go?
[00:05:30] Yeah, it's always, I mean, it's always a little bit different. [00:05:34] We had above average security concerns for this project. [00:05:39] You know, we're dealing with a lot of people who are world-class hackers, and, you know, I was filming with people on both sides. [00:05:49] So you have people, you know, who are trying to sort of destroy the other side, whether it's the guy who, you know, claims he was the founder of Anonymous, who was going after Jim and Ron, or vice versa. [00:06:03] I mean, Ron, you know, has his own hacking skills, you know, running 8chan. [00:06:09] And of course, Fred also has some skills and resources he can tap. [00:06:13] So we had all kinds of security protocols in place to try to protect the footage, but when we were taking it out, I had to also protect the material. [00:06:24] So I couldn't pitch it in the same way I would normally pitch a show. [00:06:27] So I put together secure iPads to present that. [00:06:31] Where I took out all connectivity to the internet. [00:06:34] And so the only way that you could see the footage that we had constructed was by watching one of these iPads. [00:06:41] So that's how you would see the scenes. [00:06:43] And I had also just built the story around that. [00:06:46] So I had written what I thought would be the six episodes when I took it out to HBO. [00:06:50] And I didn't take it out to HBO by myself. [00:06:52] I mean, in the run-up to taking it out to HBO, I knew that I needed help. [00:06:59] I needed a 10,000-pound gorilla in my corner. [00:07:03] And the reason that I thought that was because Q had become such a big story by summer of 2020 that there were lots of documentary filmmakers with big names who were pitching a Q project at that time. [00:07:22] But I knew I had the real story in the bag at that point. [00:07:27] I just needed to get it in front of the right people and have someone like Adam McKay in my corner.
[00:07:36] And Adam McKay is specifically who I sought out. [00:07:39] I had no relationship to him whatsoever. [00:07:41] I basically asked everyone I know, everyone I'd ever met at a film festival along the way, like, do you have any kind of connection to Adam McKay? [00:07:48] Because I was convinced that he was really going to understand what I was doing with this story, the tone, the style, and the importance of it. [00:07:56] He had done The Big Short, which is one of my favorite films of all time. [00:08:01] I mean, before that, he had been at Saturday Night Live. [00:08:05] I think he did Step Brothers. [00:08:06] Films like that as well. [00:08:08] So he had a background that was sort of a mixture of comedy and then also something like The Big Short, which is humorously tackling an incredibly complicated subject. [00:08:17] So I, by hook or by crook, got one of these iPads in front of him and he responded right away. [00:08:24] And I knew he had a deal with HBO as well, which is where I wanted to place the series more than anywhere else because I knew that they're very filmmaker friendly. [00:08:34] And that turned out to be the case. [00:08:35] So, you know, a series of small miracles. [00:08:38] Like, I honestly, to this day, it seems. [00:08:41] Impossible to me that all of that worked out. [00:08:45] And that we were able to get it done on such a crazy timeline. [00:08:51] And I couldn't have had a better partner than HBO. [00:08:57] I mean, they really let me tell the story I wanted to tell here. [00:09:02] And they were good creative partners as well. [00:09:06] I don't know what it might have been like with like a Netflix or something like that. [00:09:12] But historically, HBO has been sort of the bastion for controversial or difficult topics in the documentary realm. [00:09:24] Yeah, yeah.
[00:09:25] HBO truly is the holy grail of that kind of content, any kind of content as far as stuff that kind of like teeters on the edge, like something like QAnon would. [00:09:36] When you talk about those encrypted iPads, were you concerned, like when you were trying to show people this footage or communicate with these networks, were you concerned about people like Ron Watkins? === The Secret to Cheap Flights (03:18) === [00:09:48] Or Fred, Frederick Brennan somehow getting a hold of it? [00:09:53] And what were you actually afraid of? [00:09:56] It's hard to know what the exact threat model really was. [00:09:59] Who was working with whom? [00:10:01] Were there government subcontractors or people who had access to hacking tools or whatnot who might want to know what any of these sides were doing? [00:10:17] So I was trying to protect everyone from everyone. [00:10:20] and minimize the possibility of any kind of a leak. [00:10:25] Because I took a position of neutrality in this. [00:10:28] So I wasn't concerned about one party more than the other. [00:10:32] I was concerned about all of them almost equally. [00:10:36] Because I just didn't know what the real risk was in that scenario. [00:10:41] And I wasn't even sure if it was really a matter of the characters so much as other entities who might be interested in the footage that I had gathered along the way. [00:10:53] Maybe state-sponsored entities. [00:10:55] So, at what point from when you began following this QAnon movement did you decide I need to make a documentary about this? [00:11:08] I decided fairly early on. [00:11:11] It was about nine months into Q. [00:11:15] I was aware of QAnon peripherally just because I'm on Reddit. [00:11:21] I wasn't a Chan user at all. [00:11:23] I was aware of what the Chans were, but I hadn't ever spent any time there. [00:11:27] And It was when Reddit banned QAnon, like the Great Awakening subreddit. [00:11:34] That's what piqued my interest. 
[00:11:35] I was like, well, what's something that's so dangerous or problematic in the minds of Reddit that it needed to be banned? [00:11:44] And might this be a sign of things to come? [00:11:46] Might this be sort of where the internet is headed? [00:11:48] Are we going to see more of this? [00:11:51] Because usually it's the more extreme, disfavored things in society when it comes to speech that test whether or not we have. [00:12:00] A right. [00:12:01] So that's what got me paying attention. [00:12:05] And of course, I was just drawn to the mystery of who was behind all of this. [00:12:08] And I thought that unmasking whoever Q was or what the organization, whatever the structure looked like behind it, might bring what in 2018 still seemed like a game to its logical conclusion. [00:12:25] When you began, like, through this whole process of producing this documentary and spending all your own money, like flying everywhere to interview these guys, you know, Japan and where did Ron's dad live? [00:12:40] In the Philippines? [00:12:42] The Philippines, like, spending all this money. [00:12:44] Like, did. [00:12:46] Well, the secret was Scott's Cheap Flights. [00:12:49] So I would wait until there was an airfare error, you know, an error fare. [00:12:56] Sometimes the airlines make a mistake. [00:12:59] So, like, I flew to South Africa for $500. [00:13:03] I don't think I ever spent more than $600 round trip flying to Manila. === Unraveling the QAnon Relationship (14:58) === [00:13:07] I used air miles a few times to get back and forth from Japan, but I don't think I ever spent more than $800 on a ticket round trip for Japan either. [00:13:15] So, part of my strategy was timing things out based on when I could get cheap flights to go to these places. [00:13:22] You know, when I would use air miles is when a big event happened and suddenly I just had to be there the next day. [00:13:29] But yeah, so that was part of my strategy.
[00:13:34] And honestly, the majority of my budget was travel. [00:13:37] Like, I obviously wasn't paying myself. [00:13:39] When I was traveling to these places, for the most part, I was shooting by myself. [00:13:42] In the Philippines, I usually had at least one person there helping me. [00:13:46] In Japan, I was by myself. [00:13:49] So it just sort of depended on the environment. [00:13:51] And if I needed somebody on the ground there who could also help film me as I kind of became a character in the story as well. [00:14:01] And that was. [00:14:02] It's never really by choice. [00:14:05] I kind of resist being in these projects, but as playing the surrogate for the audience, there comes a certain point at which, in a story like this, my involvement just sort of becomes unavoidable in some way, shape, or form, even though I try to maintain as neutral of a position as possible. [00:14:27] I mean, obviously, the situation with Fred when he needed to flee the Philippines. [00:14:34] You know, the need to help him in that situation overruled the neutrality principle I had in place. [00:14:44] And, you know, because above that is minimize harm. [00:14:48] So minimizing harm in that situation meant getting on a plane to both document him, but physically assist him as well because of his condition. [00:15:00] So in the series, Frederick Brennan, who I was just describing, [00:15:04] created 8chan while he was coming down from mushrooms when he was, I believe, about 18 at the time. [00:15:12] He has brittle bone disease. [00:15:14] You know, just a fall out of bed could kill him. [00:15:18] So, you know, he's very fragile. [00:15:21] And he had started to part ways with the current owner-operators of 8chan, Jim and Ron Watkins. [00:15:32] And part of what I document in the series is this unraveling of their relationship that then turns into an all out.
[00:15:39] Rivalry where they're essentially trying to put each other in prison on a global scale. [00:15:45] And Fred is trying to take down 8chan at all costs. [00:15:50] 8chan is, of course, the site where Q was hosted, where Q was anonymously posting these cryptic messages. [00:15:59] And Fred took it upon himself to try to destroy the thing that he had created. [00:16:06] And so I had cameras in the middle of that and I was bouncing back and forth between both sides. [00:16:11] And what you see. [00:16:12] Later in the series, Jim and Ron Watkins come up with a strategy to get him arrested in the Philippines for what's called cyber libel. [00:16:23] In this case, Fred had called Jim senile on Twitter. [00:16:26] But he had been levying a lot of attacks, but that was the one that they decided could stick. [00:16:33] And, you know, it's kind of the justice system, as I think all of these characters would admit in the Philippines, isn't exactly fair. [00:16:44] It can be bought. [00:16:46] And I don't know exactly if that's what happened or not, but the cyber libel charge looked like it was going to stick. [00:16:55] Fred had a. [00:16:58] Fred was going to be brought in by the police and he messaged me saying that he needed to flee the Philippines so that he wouldn't go to prison. [00:17:08] And with his condition, he would have likely died in prison for that. [00:17:13] So there's no way I could let someone like that in his condition for something they said on Twitter have a death sentence. [00:17:21] So I got on a plane and kind of documented his escape as he left his family and everything else behind. [00:17:28] And that's just one piece of this story. [00:17:31] In the series, what you see, the rivalry also reveals a lot about who's behind QAnon. [00:17:37] And it reveals a lot about the site and the culture from which Q was born, which was, you know, first 4chan, but then 8chan, then 8kun. 
[00:17:49] Yeah, it's such an, number one, it's just such an incredible cast of characters. [00:17:53] The characters that just happened to run this weird, dark world of the internet. [00:18:01] Are just so fascinating. [00:18:02] Like Jim Watkins and Ron Watkins are such oddballs. [00:18:07] They come off so, just so quirky and odd in the show. [00:18:13] I don't know if that's how they are in real life, but. [00:18:16] Oh, they're more quirky and odd in real life. [00:18:19] You know, they were on their best behavior when the cameras were on. [00:18:23] So if that's any indication. [00:18:25] But yes, they were. [00:18:28] That is true. [00:18:30] They kind of revel in being as absurd as possible. [00:18:35] You know, trying to be as offensive as possible, usually to make each other laugh. [00:18:41] But, you know, it's kind of like you say awful things long enough and then you don't even know what you believe or don't believe anymore. [00:18:47] You lie often enough, you don't know what's true and what's not or what you believe or what you don't. [00:18:51] And I don't know that Ron can even keep track of all of his lies at a certain point, you know, what he's told me, what he hasn't told me. [00:19:00] I mean, the first time I went to, you know, film with Jim and Ron Watkins in the Philippines, I don't know if they were expecting that I was going to ask as much about Q as I did. [00:19:10] You know, the next time I filmed with them, it was clear that they had communicated behind the scenes and their stories changed and suddenly they knew less than they knew the time before, which is just even more suspicious. [00:19:21] But when I went there to film with these, as you say, pretty eccentric characters, I didn't think that they were behind Q at the time. [00:19:28] I just knew that Q was posting on their website. [00:19:31] So if anybody knew who was behind the operation, it would have been those who had the technical data.
[00:19:39] But after filming with them that first time, I was just like, oh, fuck, these guys are suspicious. [00:19:43] So. [00:19:44] I find myself in the position of trying to explain to a lot of people that I know what QAnon is. [00:19:52] And I sometimes have a hard time coming up with that explanation. [00:19:55] Do you have a very precise, boiled down summary of what QAnon is? [00:20:04] Well, I should have that at this point. [00:20:07] You know what I would say? [00:20:08] The way I would describe it is, you know, QAnon. [00:20:17] So there is a. [00:20:20] Most QAnons would tell you that they believe that there is this anonymous insider known as Q who they think is close to Donald Trump, leaving these cryptic messages through, you know, 4chan, then 8chan, using these fringe message boards on the edge of the internet to get the secret truth out about this top secret plan that. [00:20:43] That involves Donald Trump and all of this complicated cast of characters who are going to take down the deep state and this evil global cabal of Satan worshiping pedophiles. [00:20:52] And that Q is leaving these anonymous insider drops to help the anons along, the anonymous people online who are following Q, to figure out the details of this sort of top secret plan. [00:21:08] And Q uses all kinds of keywords and phrases in order to. [00:21:14] Keep the audience engaged, asks a lot of questions rather than making statements. [00:21:20] And this is part of the game that Q is playing in order to basically let the Anons who are on the Chans come up with new theories as to what all of that means. [00:21:29] And so, on a long enough timeline, what ends up happening is QAnon becomes this massive big tent for basically every theory or conspiracy theory or whatever you want to call it over time. [00:21:42] And not just theories from the past, but it's generating theories in real time. 
[00:21:48] And so if you were to talk to someone who is a QAnon follower, a QAnon believer, they wouldn't necessarily agree with another QAnon follower or believer. [00:21:55] They would agree with that first thing that I said, the plan and the taking down of this global cabal and that there's some government insider who's involved. [00:22:04] But they wouldn't agree on who they thought Q was. [00:22:07] They wouldn't even agree on the same theories necessarily. [00:22:11] So it's really more of like a big tent for believe whatever the hell you want as long as you believe in this sort of core concept. [00:22:19] And don't question Q, which has always been the most baffling part of it to me. [00:22:24] It's like, you know, one of the mantras that Q would repeat was question everything. [00:22:30] Well, that's a reasonable thing, I guess. [00:22:36] But you should be able to question something pretty simple, like, okay, is the Earth flat or round? [00:22:40] Well, that's a dumb thing to question. [00:22:42] I can figure that out pretty quickly based on science and things. [00:22:44] Okay, let's check that one off. [00:22:47] Earth is round. [00:22:48] Fine. [00:22:49] But, you know, what happened pretty quickly was that people stopped questioning Q, and it became a problem to question Q or question the narrative or the beliefs around this. [00:23:04] So it wasn't question everything anymore, and it became more dogmatic as a byproduct of that. [00:23:15] It seems to me that the whole QAnon movement was like. [00:23:22] It was an idea that was like a mind virus that infects vulnerable people who have no sense of community, who generally are lonely, lack a sense of purpose, or maybe suffer from depression, etc. [00:23:42] And basically, that equates to a weaker immune system when it comes to detecting bullshit. [00:23:53] And that's kind of like.
[00:23:55] You know, if it were a biological virus, they would get sick from it. [00:23:59] Is that sort of accurate at all? [00:24:04] I mean, that's a great analogy, I think. [00:24:07] A very helpful analogy. [00:24:09] Yeah, I had said a little bit ago that Q kind of operates similar to, yeah, like a digital virus in a way. [00:24:22] Like an idea virus. [00:24:25] Hmm. [00:24:25] Yeah. [00:24:26] Yeah. [00:24:26] I mean, Ron even seemingly released a virus on his followers through one of his videos, and they were still fine with it in the aftermath. [00:24:34] Like a couple months ago. [00:24:37] It's like, well, why would they be fine with that? [00:24:39] Well, he released Q, which operated in a very similar way to malware. [00:24:44] And yeah, it does draw on people who are more susceptible to it. [00:24:48] And I think what makes people susceptible to it is a distrust in the media, a distrust in institutions, and a sense that they're being lied to. [00:25:01] And so they turn away from these sources and they look elsewhere. [00:25:09] But the sources that they find, or QAnon, is not a more reliable source, but it is more sensational, it's more exciting, and it tells a story of heaven or hell, black and white, and it's sort of all or nothing. [00:25:27] It's not about the banality of evil, which isn't sexy, so it becomes about something far more ethereal, otherworldly, more magical, more fantasy. [00:25:38] It gives people a sense also that they have more control in their lives. [00:25:42] Gives them a sense of purpose and it gives them community. [00:25:45] And I think we also live in a society right now where, you know, people are looking for other ways to get religious experience.
[00:25:57] You know, maybe before, some of that sense of community might have come from, like, watching a sports team, but it's almost like all of the things that would draw people to a church before are drawing them now to politics. [00:26:19] Like that same level of fervor and faith is what you see in QAnon, and the same level of sort of passion and just belief. [00:26:32] It's like all of that kind of religious thinking is now being put in a political box. [00:26:38] And so that's why it does have a lot of kind of religious qualities to it as well. [00:26:48] But the sort of red pill that I think gets a lot of people hooked on Q in the beginning is something that relies on a certain amount of truth about society. [00:27:05] I think at the beginning of The Plan to Save the World, it's like, have you ever wondered why there's homelessness, hunger, why you can never seem to get out of debt, why there are endless wars? [00:27:17] And these are all things that people are like, yeah, why all of that stuff? [00:27:22] Which speaks to that institutional failure point. [00:27:26] And then it leads people down a rabbit hole of increasingly implausible beliefs. [00:27:35] But the central tenet that sort of draws them in is the failures that I think many people are experiencing. [00:27:47] In society. [00:27:49] If the institutions were doing a better job, if the media was doing a better job, if we weren't in a clickbait society, where people were driven into echo chambers by algorithms, the conversation would just be a lot different. === Big Tech as Government Proxy (15:19) === [00:28:06] But there's sort of a confluence of factors here that have created an opportunity for something like Q. [00:28:14] I don't think it would have worked 10 years ago. [00:28:17] And if I had to guess what it is, what caused all of this?
[00:28:19] If I could say, if there's one thing that I could go back in time and change and say, change this, and we wouldn't get to the point we're at now with the sort of hyperpolarization where people turn to something like QAnon, it would be online privacy, which sounds at first like it doesn't make a lot of sense. [00:28:40] Like, what's the connection between those two things? [00:28:43] But, you know, I made a film almost a decade ago called Terms and Conditions May Apply. [00:28:49] And in that film, I highlighted all of the ways in which Silicon Valley and the government were kind of working together to extract our personal information and what the cost of that might be. [00:29:08] And at the time, people would say, well, I have nothing to hide. [00:29:13] What does it matter? [00:29:14] What does it matter if Facebook is collecting thousands of data points on me, knows my fears and my insecurities? [00:29:21] You know, it knows everything about me. [00:29:23] What's the harm of that? [00:29:24] All they're doing is, you know, trying to sell me shoes, or they're just using it for advertising, right? [00:29:29] Well, now we know what the cost is. [00:29:35] The cost is that all of that personal data could be used to manipulate us, to play on those insecurities, to drive us towards increasingly sensational content that reinforces our biases. [00:29:51] And results in, I think, the kind of hyperpolarized society that we have now. [00:29:59] And Q, that kind of stuff that Q was doing, that would have normally just stayed on the Chans. [00:30:05] But because it was sensational and exciting, and because there were algorithms that could help fuel it, it was like gasoline. [00:30:12] And individuals like Ron Watkins and those who he was coordinating with would know how to capitalize on that. And, you know, it wasn't [00:30:27] 8chan specifically that made Q a hit. [00:30:29] It was YouTube.
[00:30:30] You know, it was all of these Qtubers who started talking about all this wild stuff. [00:30:36] And it drew in an audience, it drew in a big audience. [00:30:39] You know, and a lot of these guys, they just, they were big fans of Alex Jones. [00:30:43] They were kind of his acolytes, right? [00:30:45] And they were just taking what they had learned from Alex Jones and now repurposing that same sort of entertainment style using Q as their. [00:30:58] As sort of their daily input. [00:31:01] And because Q would frequently ask new questions, there would be this whole ecosystem that formed around Q. [00:31:09] And because they got a ton of followers almost overnight from talking about Q, it created a functional feedback loop where a lot of people were able to profit off of talking about something, and they would not have been able to be successful on YouTube otherwise. [00:31:31] Yeah, I mean, YouTube is more responsible for QAnon success, I think, than almost any other site out there. [00:31:39] Really? [00:31:41] Wow. [00:31:41] And they've since banned all QAnon YouTubers, right? [00:31:45] Like QAnon content? [00:31:47] Yeah, they did. [00:31:49] Except for your stuff. [00:31:50] I mean, I don't think they should have removed their videos. [00:31:54] If they want to demonetize them, so be it. [00:31:56] But they're basically burning the past. [00:31:58] It makes it very difficult for researchers to go back and see what everyone was talking about. [00:32:03] I think it's a convenient excuse to some extent for these tech companies to delete all of these accounts and say, oh, look, look, we're such good stewards now. [00:32:13] Look, Coca Cola, we did it. [00:32:17] Yeah, so they wipe out the kind of digital past that really just shows their culpability in all of this. [00:32:25] So, really, they're just getting rid of the evidence of their involvement in bolstering all of this stuff. 
[00:32:35] And the real solution undermines their business model, which requires them to extract our personal data [00:32:47] and amplify and feed us content based off of that personal data. [00:32:54] They'll never offer that as a solution. [00:32:55] So now instead, they're offering censorship and removal of content as a solution. [00:33:01] You don't necessarily have to drive people to the most sensational shit. [00:33:08] You can have a platform that lets people watch things, and it doesn't necessarily mean that you have to drive them to that kind of content. [00:33:18] Based on their biases, fears, and interests at an accelerated level. [00:33:24] So you will never hear, I don't think, Google sitting in front of Congress saying, yeah, yeah, yeah, we should get rid of data-personalization-driven algorithms. [00:33:32] I don't think they're going to ever offer that as a solution. [00:33:35] Censorship to them is a perfectly viable solution because they have AI that they've been working on to do quote unquote moderation. [00:33:46] We know, and we show this in the series, how ineffective that moderation actually is. [00:33:52] How it actually captures accounts and videos that it's not supposed to. [00:34:01] It's very easy for people to manipulate that and to get around it. [00:34:05] I mean, we saw that they weren't even able to stop the Christchurch video from being uploaded millions of times. [00:34:10] It only caught maybe 80% of it. [00:34:13] So the AI moderation doesn't even work, but it's being presented as this kind of panacea. [00:34:19] Meanwhile, if you're a startup or you're a competing company who's saying, okay, well, all these people have been banned [00:34:27] off of Twitter now, where are they going to go? [00:34:29] Maybe they go to Gettr, maybe they go to Gab, maybe they go to Telegram.
[00:34:33] But if laws were put in place, or requirements were put in place, that forced these companies to do this kind of moderation, well, a startup wouldn't be able to compete. [00:34:49] So it's perfectly viable for a big tech company like Twitter or Google or Facebook to employ AI-driven moderation algorithms. [00:34:59] It's very difficult for a startup to do that. [00:35:01] So it just allows a big tech company like that to have more control than they do even now, because it makes it less likely that competition would be able to rise up in that environment. [00:35:16] Why? [00:35:16] Now, the reason that they have this big dragnet, right, to pull down all this content that supposedly doesn't comply with their terms and conditions, the reason they're doing that is threats from advertisers, right? [00:35:31] That's the only incentive they have to do that. [00:35:34] Am I wrong there? [00:35:37] Well, there's political incentives, right? [00:35:40] Like, once you start playing the game of arbiter of what's true and what's not true, what's good or bad, sure, you can take down some content that I think most people would agree they'd prefer not to have on those platforms, but it also allows them to start taking down things that might be true. [00:36:01] You know, if there's something that's unsavory, an unsavory truth about a politician, you know, and that politician calls them up and says, this is fake news. [00:36:11] This isn't true about me. [00:36:12] Pull this thing down. [00:36:14] Don't let this video go up. [00:36:15] Well, there's obviously a quid pro quo that goes on between Congress and big tech. [00:36:23] I mean, big tech donates a lot of money to these guys. [00:36:29] You know, big tech money used to be cheap money, easy money to take, money that didn't come with any strings. [00:36:36] You know, the intel agencies also have a lot to benefit from the monitoring
[00:36:41] that goes on in these websites and the kind of data that they're extracting as well. [00:36:46] So, yeah, the incentives are there; they're endless. [00:36:56] Another very scary thing that YouTube does as well, you know, other than doing, like, takedowns or demonetizations, is they do a thing which I'm sure you're very aware of, shadow banning content, where they make it invisible in search, so you have no idea, you get no notification that you've been shut [00:37:14] down or that you've been demonetized; they literally don't tell you anything. [00:37:19] They just erase it from search. [00:37:21] But to you, the creator or the uploader, it looks like it's still there, but no one else is going to see it. [00:37:30] Yeah. [00:37:31] So I struggle a little bit with the question of what these companies are required to or should host, right? [00:37:42] On the one hand, they're private companies. [00:37:45] And because they're private companies, they have the right to moderate [00:37:49] as they see fit. [00:37:52] On the other hand, their scale is so extreme that it's just the place where everybody goes. [00:38:04] They either have a monopoly on video sharing or they've reached monopoly status. [00:38:15] And these companies have so much control over information at this point. [00:38:21] We have to sit back and say, well, are they really just private companies who are stewards of our information and get to moderate at this level? [00:38:27] Or is this something where we either need to break them up or we need to create some new rules that create neutrality on these platforms because of their scale? [00:38:39] And at a minimum, if they are going to have this level of scale, should we have protections in place that allow for competition to actually exist? [00:38:50] So, right now, for instance, it's not easy to leave Twitter. [00:38:52] Why?
[00:38:53] You lose your contacts, you lose your conversations. [00:38:58] Same with Facebook. [00:38:59] I mean, all of these platforms, it's very difficult to leave, because they control all of the content that you've created and your relationships. [00:39:11] So, one step I think would be data portability. [00:39:16] Basically, make it very easy to leave one of these companies if you don't like their behavior, if you don't like how they're restricting certain types of speech, things like that. [00:39:30] But right now, individuals don't really have any kind of meaningful choice. [00:39:34] You know, it's very difficult. [00:39:35] Like, if you want to just leave YouTube at this point, they don't make it all that easy. [00:39:42] And something like Google, or YouTube itself, maybe is just too big. [00:39:47] So I think we have to have the conversation of, at a certain scale point, are they really just a private company, or are they kind of operating as a shadow government? In the same way they were doing data extraction, [00:40:00] and that data extraction was valuable for the government, they were able to take things from us directly that a government would have never been able to take from us directly. [00:40:07] Is the same thing now happening with speech, where the government can use corporate entities as proxies to accomplish things that they could never do directly? [00:40:19] Like, the government can't come in and say, you can't say this, take this down, remove this, this is false, this is true. [00:40:25] That's not allowed, but a private company can. [00:40:29] And since this is where conversations are happening and [00:40:34] narratives are kind of driven and information is spread, [00:40:44] how much influence is the government having over those conversations at this point? [00:40:48] So, in a way, big tech is a kind of proxy for the government when it comes to regulating speech.
[00:40:59] But right now, as the law is written, they're allowed to. [00:41:05] You know, with Section 230, a lot of times people say, oh, well, that just means they're supposed to treat all content neutrally. [00:41:12] And actually, what Section 230 allows is for these companies to moderate as they see fit. [00:41:19] And so I think the question then becomes, well, do we change that? And it is the law that kind of created the internet. [00:41:25] I mean, it's not just about moderation, it's also what allows you or me to go on a website and post comments [00:41:36] and to write things anonymously. [00:41:37] And you can do that and you can say what you want there. [00:41:40] And it doesn't make that company liable for doing that. [00:41:42] So it protects the little guy almost more than it protects the big guy. [00:41:47] So, you know, in a way, Section 230 is sort of the free speech law of the internet. [00:41:53] It's one of the few regulations that actually protected some rights online. [00:41:59] These companies wouldn't be able to exist without that law, right? [00:42:02] Because they would just be buried by lawsuits. [00:42:05] Wasn't Trump trying to get rid of that and abolish that? [00:42:11] Trump seemed to be upset that he couldn't say whatever he wanted. [00:42:14] And I think whoever was advising him on Section 230 didn't perhaps clarify exactly what that meant. [00:42:20] So he's like, I'm going to take away this thing that allows the internet to work the way it does. You know, if I can't have Facebook, no one can. [00:42:26] If I can't have Twitter, no one can. [00:42:28] Meanwhile, you know, Biden was coming from sort of the opposite perspective, saying, we want the internet to be a safer and happier, whatever, place that has less [00:42:40] misinformation. [00:42:41] So, we need to get rid of Section 230 so that we can have more control, you know, to make these companies accountable for everything that gets posted on their sites.
[00:42:49] So, they both wanted to get rid of Section 230, but for completely different reasons. [00:42:55] Wow. [00:42:56] It's like they both didn't know that. [00:42:58] Or to get rid of it and replace it with something else, right? [00:43:01] Yeah. [00:43:02] You know, and I think that that's a mistake. [00:43:07] I think what we need to be looking at is the scale of these companies, whether or not we should break them up, how we break them up, and restoring rights online, restoring digital privacy rights, the lack of which I think has put us in this situation. === Saving Children Masked Ideology (14:59) === [00:43:26] And is, basically, the censorship that's going on on websites like, not Google, but YouTube and Twitter, is that what spawned websites like 8chan and 4chan and 8kun? [00:43:43] No, it's more. [00:43:44] I mean, there have been a lot of social media companies that have tried to go up against Twitter and Facebook, and none have really managed to stick, you know? [00:43:54] And I think that the only reason that they're starting to find success now is because of the banning that's happened on these more mainstream platforms. [00:44:03] But if we thought we had echo chambers in society before, now it's just like, here's the social media for the right and here's the social media for the left. [00:44:12] There's not even the cross-pollination occurring. [00:44:16] You don't see the same amount of people having to battle it out in a comment section on Twitter, or being exposed to ideas that are different than what their assumptions may be or beliefs may be. [00:44:31] So just because it's been pushed somewhere else doesn't mean it's gone anywhere. [00:44:34] It hasn't disappeared. [00:44:37] It's just been pushed somewhere else. [00:44:39] So, I think a lot of people who are on Twitter are like, oh, you know, problem solved.
[00:44:42] Well, actually, guys, what you consider to be a problem, or speech that you don't like, has just moved somewhere else. [00:44:51] You know, and by the way, those people who were kicked off of Twitter are pissed off about it, as they should be, and they are mobilizing, and they are going to be very motivated to vote in the upcoming election and to vote for candidates that are pushing back against this. [00:45:13] And I think that that's going to become the dominant argument in the upcoming election cycle for any Republican candidate. [00:45:20] The other topics won't really even matter. [00:45:24] It will just be a focus on censorship. [00:45:28] And this is something that I've seen at the conferences that I've been to and filmed at: a focus on that. [00:45:37] It doesn't even matter what the other ideas are, it just matters that they can't express them. [00:45:44] Are there any left-wing QAnons, or are they all right-wingers? [00:45:49] They're all Trump supporters? [00:45:52] No, there are. [00:45:53] So I don't know what the exact ratio is, but certainly there are people on the left who follow QAnon. [00:45:59] It's not just a right-wing thing. [00:46:02] Particularly in the last year or year and a half, after COVID hit, you saw QAnon move over to more of the new age communities, UFO communities. [00:46:17] Just communities where people are a little more conspiracy-minded. [00:46:22] And also, there was the whole Save the Children campaign, which, you know, QAnon kind of masked itself as just being about that, using the Save the Children hashtag. [00:46:35] And that brought more people from the left into the sort of Q orbit. [00:46:44] But now, QAnon has been on the decline in the last [00:46:48] nine months. [00:46:50] And not just people who identify as QAnon; beliefs that are associated with QAnon have been on the decline.
[00:46:58] But one of the things that Q did near the very end, right before the election, was say, there is no QAnon. [00:47:09] There's Q and there are anons, but there is no QAnon. [00:47:15] And it was a tactic [00:47:18] that has allowed those who follow QAnon to run these kinds of gaslighting campaigns where, like, anytime you say QAnon now, like I've said it a million times on this podcast, they would just say, like, he doesn't know what he's talking about. [00:47:33] There is no QAnon, because Q told them to say that. [00:47:37] So it becomes this, like, gaslighting technique where anytime somebody says QAnon, then people attack. [00:47:42] And I've actually even seen people in the media now stop saying it. [00:47:45] It's like, guys, don't give in [00:47:49] to this nonsensical thing. [00:47:50] Like, tons of influencers in the QAnon community wrote books with QAnon in the title. [00:47:55] Like, it's just a gaslighting technique. [00:47:58] But the new flavor of QAnon that I think we're seeing emerge now is one which is more of a wink and a nod. [00:48:07] It's coded language, but it's obvious coded language. [00:48:11] It's just using all the keywords and phrases that Q would use and embedding them in campaigns and posters. And if you're going to put on a conference, like there was just the Patriot Double Down this weekend, you know, it wasn't officially a QAnon event, but all of the keywords and phrases that Q uses were everywhere, including in all of their marketing materials. [00:48:34] Whether it was, like, you know, where we go one, we go all, or trust the plan. [00:48:40] And then their main poster image, the one that they used the most for this Double Down, was two cards, like you're playing blackjack, two cards, a queen and a seven. [00:48:51] This is their main image. [00:48:52] You know, in blackjack, that adds up to 17, which stands for Q, the 17th letter of the alphabet. There's also a queen, a Q. [00:48:58] So, this is what I mean when I say it's coded.
[00:48:59] It's like, okay, they're going to be doing. [00:49:03] And then all of the speakers were all the QAnon people. [00:49:05] Ron Watkins headlined the event. [00:49:08] So, you know, it hasn't gone anywhere. [00:49:13] QAnon as a movement has just changed kind of name and technique, but it's still very much alive. [00:49:21] And I think actually it's more well funded than before. [00:49:24] And what we're going to see in the next election cycle. [00:49:26] I mean, look, if the guy who's essentially Q can run for office, you know, it's like having Q run for Congress, which is what Ron is doing now. [00:49:34] He's running for Congress in Arizona. [00:49:37] If that can happen, we're going to see more QAnon candidates. [00:49:41] And I think it's going to be well funded. [00:49:43] I think there's money to support this at this point. [00:49:45] It's going to be slicker than it was in the last election cycle, when Marjorie Taylor Greene got elected. [00:49:51] And I'm telling you, I think that the primary focus is just going to be [00:49:56] digital censorship, and all the rest of it isn't even going to matter; people are just going to be thinking about that one thing, and it will be a strong argument, I think, for a lot of people on the right. [00:50:09] I think they'll be attracted to that, and maybe some on the left as well. [00:50:13] Have you spoken to Ron? [00:50:15] Are you guys still friendly after you released the documentary? [00:50:19] Um, we have communicated some, yeah. [00:50:22] I mean, he sent me messages. [00:50:24] He felt the documentary was fair, [00:50:29] which was important to me. [00:50:32] I wanted everybody who was in the documentary to feel that they had been treated fairly and that I had respected their intentions in any given scene. [00:50:38] I would never want someone to watch a scene and be like, that's not what I meant.
[00:50:44] You know, you cut this in a way that isn't how I intended it. [00:50:50] So I try really hard when I'm in the edit bay to give everyone a fair shake. [00:50:57] So I guess that was part of it. [00:50:59] You know, he also said, like, you know, I identify more with villains. [00:51:04] He said, I learned a long time ago that making internet personalities larger than life makes for a more entertaining existence. [00:51:15] So, he's done all these things since the series came out to basically say, yeah, I'm Q, but I'm not. [00:51:22] Because I don't think that he can ever fully take responsibility for it. [00:51:29] But he still wants people to know how smart he is and kind of what he pulled off. [00:51:34] So, it's kind of like he gets as close as he can to admitting it at this point and then steps back. [00:51:41] And that's been my impression since. [00:51:44] And look, now that he's running for Congress, with the media, it's already happening. [00:51:50] Like, the amount of free press he's gotten, you know, the media wouldn't say his name until he ran for Congress, and now he's all over the news. [00:51:59] And I don't mean all of the media; many people in the media would say it. But I was surprised how many outlets kind of dodged around even naming him. [00:52:09] But I guess now that he's running for Congress, it's kind of changed the rules a little bit. [00:52:13] And, you know, he made this Rosa Parks statement while he was at the Patriot Double Down. [00:52:19] And all of the outlets picked it up. [00:52:23] He's already done interviews with CNN, the BBC, Rolling Stone. [00:52:26] Like, he's all over the news now as he runs for Congress, and he's using the same techniques that Trump used. [00:52:33] You know, he is essentially just saying the most sensational shit, trolling along the way, and he knows that it'll get picked up because it's click-worthy, right? [00:52:49] And gets shared.
[00:52:50] So he's taking the lessons he learned from Q [00:52:55] and applying them to a political campaign to gain real power. [00:53:01] And he might win. [00:53:02] I mean, I think he has a shot, you know, I really do. [00:53:05] And I don't know if his opposition right now has some sense of what they're up against, you know, particularly if they watch the series. [00:53:17] But Ron has a digital army at his disposal. [00:53:20] He's really talented at memetic warfare. [00:53:23] He'll say whatever he wants, you know, [00:53:29] make up and play whatever character he thinks he needs to at any given moment to win. [00:53:37] So I think that he's a serious threat to who he's running against in Arizona. [00:53:42] And he picked the district where they had been running this whole audit thing, Maricopa County. [00:53:49] You know, he selected that specific area to run in at this point. [00:53:55] So I'm trying to understand, like, what is this guy? [00:53:58] He doesn't seem to have a motive. [00:54:01] He doesn't seem to have any sort of real incentive to do this other than fame and/or to entertain himself. [00:54:10] He just seems like the Joker, just sort of seeing what sort of systems he could unravel. [00:54:16] I don't think that's far off. [00:54:20] I mean, he identifies more with villains or anti-heroes. [00:54:28] I do think it's all kind of a game to him. [00:54:33] And I think that, [00:54:36] since he sort of exists outside of culture, he enjoys leveraging the things that get people upset or riled up or offend them in order to gain more attention for himself. [00:54:55] So, I mean, if there is anything that motivates him that's an actual belief, I do think he genuinely does care about free speech online, [00:55:10] if only because he likes it himself. [00:55:15] But I think that that is something that he does care about. [00:55:20] The rest is a show. [00:55:23] Would you vote for him?
[00:55:29] I mean, look, I'm going to chronicle this probably from afar. [00:55:33] We'll see what happens. [00:55:35] But I'm just going to take a neutral position on that point. [00:55:43] How, I mean, it is crazy. [00:55:45] I would abstain from commenting on that one. [00:55:50] It's so crazy. [00:55:51] You know, all the coincidences are pretty undeniable. [00:55:55] Like, you know, you point out in the documentary how Q posts things that are almost identical to something Trump would post on Twitter five minutes later. [00:56:05] How would Ron know this kind of stuff? [00:56:08] Like, how did Ron, essentially, as you say, get in [00:56:14] connection with Trump and his whole team? [00:56:20] So, what you see in the series is that Ron obviously has visibility into what's going on behind the scenes with Q. [00:56:28] He knew that Q was going to end on the day of the election. [00:56:31] It didn't exactly end. [00:56:33] There were a couple other drops sort of after that, just a small handful that seemed more like cover. [00:56:41] But for all intents and purposes, it did end on that day. [00:56:44] He and his father both described Q as kind of being a marketing campaign. [00:56:50] You know, he would just know things, like that Q was going to go on the offense. [00:56:55] And then, lo and behold, Q says a couple days later, I'm going on the offense. [00:57:00] So, I mean, there's a million clues like this, way more than I even put in the series. [00:57:07] But at what point did Ron and his father actually get plugged in with, you know, Trump's inner circle? [00:57:17] That's something I'm still interested in figuring out. [00:57:21] We know that [00:57:22] individuals like General Flynn at least saw the power that Q represented. [00:57:27] They got involved fairly early on. [00:57:29] Jerome Corsi, who's tied to Alex Jones, you know, they got involved very early on.
[00:57:42] So there was sort of a typical cast of players who gravitated towards the power that Q represented. [00:57:49] What makes it difficult [00:57:52] to parse out the network is that it is decentralized and that people wear different hats online; you know, they use different handles and they communicate in a decentralized fashion. [00:58:05] And so, you know, Ron could have been wearing any number of digital hats when communicating with someone from, say, General Flynn's team or something, or other operators working out of the US who had connections up the food chain. [00:58:20] But in terms of actual political power, of seeing when it really seemed to switch. === Decentralized Networks and Moderation (14:41) === [00:58:25] I don't think that that happened until 2020. [00:58:29] There's that call in the series where I'm on it with Ron and Jason Sullivan and Bill Binney. [00:58:36] And Jason Sullivan, he's Roger Stone's head of social media. [00:58:43] That call indicates that they probably weren't working together prior to that. [00:58:48] It sure didn't seem that way on the call. [00:58:52] And Jason Sullivan, you know, I mean, he's not the biggest player out there, but he had connections to them. [00:59:00] I mean, Roger Stone was [00:59:01] briefly Trump's campaign manager and was an advisor, you know, along the way. [00:59:09] And so that's when I think what I describe, or, I mean, what, you know, many describe as sort of meme magic happens. [00:59:15] That's the moment where the sort of fantasy of Q, the sort of LARP, starts to really become real. [00:59:24] Starts to, yeah, just becomes real at that point to some extent. [00:59:28] You know, this idea of Q having a relationship to Trump's inner circle, [00:59:34] we get closer to that becoming a kind of reality.
[00:59:37] And Ron would message me over the course of 2020 saying, like, oh, look, I just got retweeted by Flynn or by Trump, you know, in order to sort of show, I think, that Q was becoming real. [00:59:52] At least in the one sense of Q having some kind of access to Trump's inner circle, there being some sort of relationship there. [01:00:05] And by the time the election rolls around, Ron sort of steps out from his role as the admin of 8kun. [01:00:16] And he starts doing publicly on Twitter what he says he was doing privately on 8kun. [01:00:25] And he gains followers very quickly. [01:00:27] He gains more than half a million followers using Jason Sullivan's amplification tool. [01:00:32] And Jason Sullivan had one goal on that call. [01:00:35] His goal, [01:00:37] nine months earlier, was to put his amplification tool, his Twitter amplification tool, into Q's hands. [01:00:44] Now, on that call, he didn't think Q was necessarily Ron Watkins. [01:00:49] He just thought that Ron Watkins would be able to help him get the tool into Q's hands. [01:00:55] But it sure seems like Jason Sullivan found his man when he got his amplification tool into Ron's hands. [01:01:02] So Ron gained a ton of power on Twitter, you know? [01:01:06] He was getting a lot of followers. [01:01:08] Um, very rapidly, Trump started retweeting him, you know, and then he became the advisor, or one of the advisors, to the election fraud campaign. [01:01:22] You know, he gets in touch with Chanel Rion of One America News. [01:01:27] Chanel Rion gets hired by Rudy Giuliani to work with the Trump campaign. [01:01:35] This is all in the deposition that recently came out in the lawsuit filed by Dominion against Chanel Rion, Giuliani, and a handful of other players. [01:01:47] So Ron Watkins, who, like, I don't know, read an election manual or something, is suddenly the election voter fraud guy.
[01:01:57] There are credible experts who they could have hired for something like this, but he was telling the story that they wanted to hear. [01:02:03] And Ron was more than happy to do that. [01:02:06] So Ron is the guy who's basically just inventing the narrative that they're running with. [01:02:13] I'm sure there were other people advising as well. [01:02:15] But we know with certainty that Ron and Chanel, based on what he told me and what you can see publicly, were working together and that Ron was her primary source on that stuff. [01:02:29] So now Ron was really plugged in with the inner circle, because he was advising them. [01:02:35] And since that time, he's been active on Telegram, pushing all of the audit stuff. [01:02:45] He's doing on Telegram [01:02:48] what he had been doing before on 8chan and then 8kun. [01:02:55] But little did I know he would try to turn that into real political power by running for Congress. [01:03:01] And I think we're going to see more QAnon-affiliated candidates. [01:03:04] And there are already dozens of them coming into this next election cycle. [01:03:10] Did you see Trump's new social platform that he created, Truth? [01:03:14] Yeah, like, speak your truth, send out your truth or whatever. [01:03:19] But don't say anything bad about Trump. [01:03:20] I bet all of that will get banned on the Truth platform. [01:03:26] Right. [01:03:27] I wonder if Ron Watkins had anything to do with that. [01:03:32] Yeah, I am very curious. [01:03:37] Ron has claimed that he has not in any kind of way been directly communicating with Trump, but that he's, like, one hop away, you know? [01:03:44] So did Ron help code that? [01:03:48] What's his exact level of involvement with it? [01:03:51] You know, I don't know. [01:03:52] Would he get involved with it? [01:03:54] You know, probably. [01:03:56] But Ron, again, he's born from the chans. [01:04:05] So he's a free speech absolutist.
[01:04:07] And none of these platforms are free speech absolutists. [01:04:11] They all have their respective biases when it comes to moderation, which most people want. [01:04:17] Most people don't want to deal with the kind of stuff that gets posted on a site like 8chan, [01:04:25] because it's uncomfortable. [01:04:28] There's a lot of pretty fucked up things that get posted there. [01:04:32] And moderators have to take it down. [01:04:35] I mean, they have a huge child porn problem. [01:04:38] People post a lot of that crap there. [01:04:40] Fred was trying to moderate that. [01:04:45] So the reality of an absolutist free speech website is that there have to be moderators in place to take down crap like that. [01:04:52] And Facebook has the same problem. [01:04:54] They have to take down that kind of stuff. [01:04:59] But they allow everything on 8chan up to the limit of free speech, right? [01:05:04] That's the absolute max of what's legal. [01:05:06] That's what's allowed on 8chan now, 8kun. [01:05:09] So, of course, they're going to take down the most fucked up things on the internet that are illegal. [01:05:16] Of course, they're going to take that down. [01:05:19] But it doesn't mean it always gets taken down immediately. [01:05:24] And because on 8kun anyone can create their own board, [01:05:28] especially in the beginning, you know, they had a problem with boards getting created where people would share that kind of content. [01:05:37] What is the problem with those types of boards? [01:05:41] Like, what is there? [01:05:42] Do you see a fundamental problem with them existing? [01:05:47] The chans? [01:05:48] Yeah. [01:05:48] No, I mean, I think that obviously there's a concern with their ability to take down [01:06:01] illegal content. [01:06:04] But in terms of a site like that, I mean, the chans don't use algorithms. [01:06:09] There's not the same level of amplification, and they're completely anonymous. [01:06:15] So, the chans are very old.
[01:06:17] They've been around for decades. [01:06:20] What's new are the algorithms. [01:06:22] So, I don't see the chans as being the primary problem, and I don't think Q would have escaped the chans if it weren't for the algorithms. [01:06:31] Yeah, so I don't understand. [01:06:34] I mean, I do understand how they function. [01:06:39] I understand why people go to the chans, but it seems to me [01:06:42] like the issue with all the fucked up content that gets posted there is the anonymity. [01:06:50] You know, the fact that people can remain anonymous and not be responsible for anything they post, they just want to post the most fucked up shit that exists on the planet. [01:07:02] Yeah, but what's wrong with that? Like, that's where the internet was born, with anonymity. [01:07:08] Not being anonymous on the internet, that's new. [01:07:11] And I tell you what, like I said, Facebook has a huge problem with illegal content getting posted there. [01:07:20] And someone doesn't have to be anonymous to say the most fucked up things on Facebook. [01:07:24] Go there. [01:07:25] It's a trash fire. [01:07:26] So anonymity, it protects whistleblowers. [01:07:33] It protects people who have things about themselves, whether it's their sexuality or beliefs, where they just want to try something out. [01:07:44] You know, they want to test it out, and anonymity allows them to converse, especially in a society that's more repressive. [01:07:51] In Japan, way more people use the chans, because it's an outlet, you know. [01:07:57] Having a place where you can anonymously say something and talk to other people, where it's not attached to your identity in the same way, allows people to experiment with ideas, express things, and try out different aspects of their identity.
[01:08:14] I mean, I don't know how old you are, but, like, when I was a kid, there was AOL Instant Messenger, and you could go on there and pretend to be whoever you want. [01:08:22] And part of the charm of the internet was being able to maintain anonymity. [01:08:29] And anonymity has been a tool for many political movements in the past. [01:08:36] Being able to even just share leaflets or documents or things that were written that you didn't necessarily want traced back to you, because you knew [01:08:49] what the consequences might be based on whoever was in power. [01:08:53] So, no, anonymity is a tool for the masses that I don't think is the problem at all. [01:09:00] Really? [01:09:01] If we're going to say there's a problem, when I say the problem, what I mean is there's hyper-polarization in society, and there are people who, like, hate each other so much they want to kill each other or something. [01:09:15] That's a problem. [01:09:16] And what's driving that problem is, I think, to a major extent, these algorithms. [01:09:21] And they thrive on our personal data. [01:09:23] So that's, to me, that's the problem. [01:09:27] Right. [01:09:27] Them collecting the personal data is the problem. [01:09:29] But wouldn't you say that people being able to remain anonymous and say the most foul things about or dox somebody, and still remain anonymous, is a problem? Doxing somebody is a different question. [01:09:42] And I think that that's something that we could potentially regulate against. [01:09:47] As for people being able to go on there and say really fucked up things, [01:09:50] I mean, if they're not saying something that's illegal, it's not illegal. [01:09:53] If we as a society determine that something goes beyond the pale to where it actually doesn't pass the emergency test, those things already get taken down on those sites. [01:10:06] If there's an imminent threat, right?
[01:10:08] Like if it's clear that someone is making a threat to, like, murder someone else, they have to respond to FBI requests. [01:10:19] That's not legal. [01:10:21] It has to be, you know, it has to be treated as such. [01:10:26] So, absolutist free speech doesn't mean that literally anything goes. [01:10:30] And 8chan or 8kun, along with a lot of those other websites, will respond to those requests. [01:10:37] So, if we as a society say, well, we don't want people to be able to talk about certain fucked up things, then we have to be prescriptive and say, well, what are those fucked up things they can or cannot talk about? [01:10:47] But it's very, very difficult, if not impossible, to regulate against one certain type of speech or one certain way of thinking without unintended consequences. [01:11:04] Because, for instance, we saw this with QAnon. [01:11:12] Like they tried to ban QAnon on Twitter, the hashtag. [01:11:16] And inevitably, they were just like, well, let's just pick a new hashtag. [01:11:20] Let's pick another letter. [01:11:21] Let's just change it. [01:11:22] What are they going to do? [01:11:23] Keep banning everything? [01:11:25] So it becomes whack-a-mole in that sense. [01:11:28] And also, by censoring things or suppressing them, it gives those ideas the veneer of intellectualism. [01:11:40] It makes them seem like there must be something to it if someone is suppressing it. [01:11:45] Isn't that how QAnon got catapulted into the mainstream in the first place, when they got banned from Reddit? [01:11:55] Yeah, yeah. [01:11:56] I think every time Q got censored, it made it stronger, because it was able to leverage that and say, I'm being censored. [01:12:03] We're being censored. [01:12:05] Surely what we're doing must be so righteous [01:12:11] that our enemies are responding by purging us or silencing us.
[01:12:20] So, yeah, the censorship is what made Q grow. [01:12:23] Yeah, very much so. [01:12:25] You know, there's a phrase I think is pretty on point, which is that the hostility of suppression speeds up the treadmill of extremism. [01:12:35] So, that sense of being silenced or suppressed, when somebody experiences that, it makes them more sure that they're right. [01:12:44] And makes them more likely to act on those beliefs. [01:12:48] But going back to talking about being anonymous on these websites, what about when you have something like this guy who was live streaming when he went to go shoot up the synagogue? [01:13:05] I think it was a synagogue or something. === Suppression Speeds Extremism (11:13) === [01:13:07] The guy who was... [01:13:07] Well, he wasn't live streaming on... [01:13:08] He was live. [01:13:09] Hold on, let me just toggle this. [01:13:14] Is it because you're saying [01:13:17] it's within the bounds of the law, right? [01:13:21] But these people are like cheering him on. [01:13:23] These anonymous users are like, yeah, yeah, yeah, finally, you stood up for us. [01:13:28] You stood up for white supremacy and you killed these Muslims or whatever it was. [01:13:32] And, you know, it's an argument. [01:13:35] It's kind of an argument to say, like, maybe this guy wouldn't have done this if all these anonymous people hadn't been cheering him on, being his cheerleaders. [01:13:44] But you could also make the argument that if the media didn't talk about it... you know, you could look at all kinds of things that led to that. [01:13:52] But the reality was, if we're talking about this specific case, you know, at the time, the story was that he had been radicalized on the chans. [01:14:04] What we would later find out is that he had been radicalized through YouTube. [01:14:10] So years earlier, he had been led to white supremacist content.
[01:14:18] He had already flown to, like, the Balkans to try to meet with white nationalist leaders there. [01:14:26] You know, that guy had a history; he should have been on all kinds of watch lists for traditional reasons. [01:14:34] So this is not a case where he was just going out there specifically to appeal to those groups. [01:14:41] However, he was aware of those cultures, certainly of the chan culture, and, you know, he did create something that was a kind of show for them. [01:14:59] But where did he live stream? [01:15:01] You know, he live streamed what, on like Facebook? [01:15:05] Is that where it was? [01:15:05] Yeah. [01:15:06] Yeah. [01:15:06] I mean, he wasn't live streaming on the chans. [01:15:08] So, okay. [01:15:10] And then that live stream got picked up and replicated and shared time and time again. [01:15:14] So, right. [01:15:16] Facebook had a real problem there. [01:15:17] If the question is, I mean, there are some real fissures in society and some things that we need to tackle. [01:15:26] I think that in that case, [01:15:30] there were ways that that guy could have been caught beforehand. [01:15:33] And some things are just, some fucked up things happen in the world and people cheer them on and it's a nightmare. [01:15:43] But at the same time, what are we looking at? [01:15:49] I guess I'm trying to understand what it is that you're hoping to solve here. [01:15:54] I'm really trying to understand what the argument is [01:16:01] for, because I hear so many people say, you know, you're the first person I've heard argue for remaining anonymous on the internet. [01:16:12] I just hear so many people say that's the issue. [01:16:17] Because so many people, I understand, you know, the comment section, it can hurt your feelings or whatever.
[01:16:21] You know, these assholes, you know, from their mom's basement, they shouldn't be able to make fun of me and call me stupid in the comment section. [01:16:27] They should have to have their real name on there, so I can, you know... because you wouldn't be able to do that in real life. [01:16:32] You couldn't walk up to somebody at the supermarket and say that shit to them, or else, you know, it's going to result in violence, real violence. [01:16:40] So, getting rid of, and I hate that word, anonymity, getting rid of that would kind of like resolve that issue. [01:16:52] And you're saying that, you know, if you make it impossible to be anonymous online, you're never going to solve the issue of privacy, which is essentially what these algorithms are based on: people's private data, in order to make money and keep you in your same little echo chamber. [01:17:13] So I'm just trying to understand the best argument for anonymity on the internet. [01:17:22] Which you've already articulated very well; I'm just trying to work through it myself, I guess. [01:17:29] Yeah, yeah, yeah. [01:17:30] I mean, I think that [01:17:34] there are places online where, just, where do you want to spend your free time? [01:17:39] Like, you know, do you want to spend it on the chans, where people are posting the worst fucked up shit and, like, celebrating atrocity? [01:17:49] That's not where I want to spend my time. [01:17:54] But then again, what is the cost of not allowing anonymity on a place like Reddit? [01:17:59] Especially once you end up in an environment where we imagine that society will remain the same. [01:18:08] We imagine that authoritarian impulses or fascists or whatever will not be in positions of power, and that we will not need anonymity to be able to communicate and organize.
[01:18:22] If we imagine that governments are benevolent and cannot be corrupted, and that we won't need anonymity, the shield of anonymity, to be able to protect ourselves, to say things that might be unfavorable to those in powerful positions. [01:18:41] If we really believe that, then sure, everything we do should be in public, out in the open, and we can monitor everything at all times. [01:18:47] That makes sense. [01:18:48] There's a reason that privacy is the Fourth Amendment, and there's a cost to every right. [01:18:55] And however ugly [01:18:57] some of the outcomes inevitably will be because of that right, [01:19:02] I would rather have the right than surrender it, because you never know when you're going to need it. [01:19:09] People used to say that it didn't matter that our privacy was being eroded online in 2012. [01:19:16] And now we know the cost. [01:19:18] And if our response to that is to completely get rid of privacy, well, my God, we've just handed over the keys to everyone in power, those who monitor us [01:19:30] and have access to everything that we do. [01:19:33] Already, that's the state that we're at. [01:19:35] So, the turmoil and chaos that we're seeing in society right now, the arguments are like, well, we need to restrict the speech, we need to silence the people who have disfavored ideas or dangerous ideas, and we need to get rid of anonymity. [01:19:50] And I think the opposite is the case. [01:19:51] I think it was a lack of privacy that landed us here. [01:19:56] And I'm very concerned that the lack of a right that landed us here is now being used to take away more rights. [01:20:08] And I think that the Fourth Amendment is there for a reason. [01:20:13] And it is a negotiation. [01:20:20] It's what we're willing to accept, how much privacy we're willing to surrender in exchange for security. [01:20:25] And I think that's a question everybody has to ask themselves.
[01:20:28] But when Democrats were under Obama, they felt more comfortable with government spying on them. [01:20:35] In exchange for security, you know, I think they felt less comfortable during the Trump years. [01:20:40] And I feel uncomfortable with it during all years. [01:20:43] So it doesn't matter who's in power. [01:20:46] That's very well put. [01:20:47] That's very well put. [01:20:50] What do you use as far as like everyday applications? [01:20:54] Do you use like those encrypted text messaging apps, email apps, browsers? [01:20:59] Like, how far do you go in everyday life? [01:21:03] As much as I can. [01:21:06] You know, I don't have a VPN turned on right now because our internet connection is already unstable. [01:21:12] You know, I mean, I advocate, I've spent the last eight years teaching college students how to use encrypted services. [01:21:20] You know, it's not because you have something to hide, it's because why do other entities have a right to everything that you do at all times, your thoughts? [01:21:30] And you never know when you're going to need to have that privacy. [01:21:33] You know, you never know when, what kind of regime may step into power. [01:21:39] And suddenly, your beliefs or things that you care about are dangerous or maybe even illegal. [01:21:47] What's legal today isn't necessarily legal tomorrow. [01:21:50] And with a past being our permanent record, all of that stuff can be used against you. [01:21:54] I mean, we've seen that playing out in the last couple of years. [01:21:58] Tweets that people said from a decade ago, now that culture doesn't accept that anymore or a certain thing that they might have said, it becomes a weapon against them. [01:22:07] So, with all of our lives being a sort of a permanent record, you can see how much information and sort of belief from the past has consequences in the present. [01:22:19] And I feel like we're in a place now where it's not okay to be wrong anymore. 
[01:22:24] And I think part of that is because we're beholden to everything we've ever said every second. [01:22:29] So it's very difficult for people to change their minds. [01:22:33] And that's not a good place to be in for democracy, nor is it a good place when the right and the left can't communicate with each other. [01:22:40] And there's lots of things that have bolstered that polarization. [01:22:43] You know, I've obviously hit on the one I think is the biggest, but [01:22:48] I mean, these things worry me. [01:22:50] And it worries me that people don't have faith in our elections. [01:22:59] Trump got elected. [01:23:01] The left thought it was stolen. [01:23:05] Biden got elected. [01:23:06] The right thought it was stolen. [01:23:08] And look, we need to instill more faith in elections for sure. [01:23:16] We could do that with paper ballots. [01:23:17] You know, you can't hack paper. [01:23:19] It doesn't mean that people still won't come up with narratives to justify how it might have been stolen. [01:23:23] They will. [01:23:24] But at least there are things we can do to give people more faith and more trust in our elections. [01:23:30] And if we don't have that going into the next election cycles, my God, man. And if we don't fix the privacy problem online, all of the things that created this polarized environment we're in right now, this kind of information war we witnessed online, it's only going to get worse. [01:23:50] So, I don't like to be a doomsday prognosticator here, but I just haven't seen anything get fixed. [01:23:56] I haven't seen any of this stuff get fixed since the Snowden leaks. [01:23:59] I thought Snowden, when he revealed that the NSA was doing this massive spying on everybody, I thought that was going to change things. [01:24:06] And they did like the smallest amount of reform.
[01:24:10] They basically outsourced the metadata to a private company, and that was basically all that changed. [01:24:15] So, I don't have a great amount of optimism. === Censorship Levers and Market Share (03:15) === [01:24:20] These companies don't want to do anything that hurts their business model. [01:24:23] They're not going to offer up the solution that we need. [01:24:27] And there's plenty of people right now who are seeing the information state of the internet as a useful tool to gain power. [01:24:41] And they will continue to leverage that. [01:24:46] And my fear is that in trying to suppress that, we're pulling the wrong levers [01:24:55] to de-escalate the tensions that exist right now in society. [01:25:03] And there are too many bad actors interested in fucking everything up and creating chaos for their own political gain. [01:25:15] Yeah, man, it's so scary. [01:25:17] It's just, you know, what are the right levers to push? [01:25:20] And how do you incentivize people to pull the right levers? [01:25:24] You know, what are those levers? [01:25:26] And, you know, why would anyone do that? Another interesting kind of example of this is, you know, the NBA in China. [01:25:36] There's the player who plays for the Boston Celtics, the big guy, I forget his name, but he speaks out a lot about the stuff going on in his home country, Turkey, and in China. [01:25:50] And, you know, I think China accounts for 15 to 20% of the NBA's total revenue. [01:25:58] So whenever anyone who happens to be employed by the NBA speaks their mind on Twitter, they bury it. [01:26:06] You know, they're censoring people within their own organization, even players, because it affects their bottom line, because, you know, China doesn't like it. [01:26:16] Well, this is the thing: it really is a slow creep.
[01:26:20] And I actually don't even know how slow it's been. [01:26:22] You know, it always starts with, like, let's get rid of this thing that 98% of people agree should be gotten rid of, or 90% of people agree should be gotten rid of. [01:26:33] And people are like, fine, you know, Milo Yiannopoulos, gone, whatever. [01:26:37] You know, nobody misses him. [01:26:38] Oh, Alex Jones canceled, whatever. [01:26:40] He's gone. [01:26:41] But then you put levers in place, and when China wants to suppress certain things in our ecosystem, well, maybe that lever gets pulled. [01:26:53] And you don't know what you see and what you don't on social media platforms. [01:26:57] You don't know how your feed is being curated and how the narrative is being shaped. [01:27:01] So what starts as getting rid of voices that people really dislike moves along the way to getting rid of voices [01:27:08] and ideas that are necessary, that I think most people would agree are necessary, to have a functioning democracy. [01:27:16] And ultimately, those levers of censorship, once put in place, will be pulled by people in positions of power, or by organizations trying to curry favor with foreign governments where they're trying to get market share. === Algorithms Shaping Democratic Voices (09:51) === [01:27:36] So that's why it can start with something like getting rid of QAnon, but where does it end? [01:27:43] And that's why most legal scholars who debate free speech issues will say, well, that's why you can't get rid of something like QAnon. [01:27:55] Because there's kind of no end to it. [01:28:02] There's no way to say you can't talk about this thing without a bunch of other things inevitably going with it. [01:28:09] So getting rid of free speech protections usually ends up harming the very people that the policy was intended to protect. [01:28:25] Right. [01:28:27] Yeah, man. [01:28:28] It's...
[01:28:28] Well, that's why we hit that in episode four, right? [01:28:31] Like we go back to Skokie, Illinois, where you have this Jewish attorney defending, you know, a Nazi who wants to demonstrate as a Nazi in Skokie, Illinois, [01:28:45] a heavily Jewish community. And a Jewish lawyer took on the case to allow them to demonstrate, because he understood this principle at its core, despite it being incredibly emotionally draining for people who were Holocaust survivors, right? [01:29:02] And that's the America that I know, because there's an understanding that that has to be allowed for the right to actually exist. [01:29:12] Because, say they had banned it, you know, what law can you put in place to ban that kind of [01:29:17] speech that wouldn't then be used to silence, I don't know, a civil rights activist in a community where they're incredibly unpopular? [01:29:26] Martin Luther King Jr.? [01:29:27] Would they use the same law that says, okay, well, we now establish that people who are saying things which are disfavored in some way, shape, or form can be silenced? [01:29:45] I mean, there's a great book written by Nadine Strossen, who headed the ACLU for, I think, more than 10 years, called Hate, where she outlines her case of why you can't, [01:29:58] why it's impossible to legislate against something, or create a policy against something like a Nazi speaking, that wouldn't have far-reaching unintended consequences. [01:30:12] And they've tried these things in Europe, and the list goes on and on and on of the kinds of speech that gets suppressed, or where people end up getting arrested or fined or put in jail [01:30:31] for saying things that you and I would go, wait a second, that's a perfectly reasonable critique. [01:30:39] I'm trying to think of some examples off the top of my head. [01:30:42] It's been like a year since I read the book. [01:30:47] You know, it's like saying something bad about Christians, for instance.
[01:30:55] There are countries where that can be sort of inverted, and someone can go to jail for saying something like that. [01:31:01] And then what you don't know is the chilling effect that it has. [01:31:05] So, when people feel like they can't say certain things, there is self-censorship that kicks in. [01:31:12] And then that's when people turn even more to something like the chans. [01:31:16] You know, this is, I think, a big part of why it happens in Japan, because people are looking for an outlet to be able to express what they can't socially express. [01:31:24] So it's, I mean, it's a thorny issue. [01:31:27] And that's why I think time and time again it gets tested in front of the Supreme Court, because it's always the most vile stuff that [01:31:34] tests sort of the limits of free speech. [01:31:37] And we do have, there are plenty of things that are illegal that you can't do. [01:31:42] There's the emergency test that says, well, if the speech is basically inciting people to commit imminent violence in some way, shape, or form, that is disallowed. [01:31:50] You can't yell fire in a crowded theater, right? [01:31:55] That's illegal. [01:31:56] That's not protected speech. [01:31:58] So there is a limit, you know, and this is something that gets tested time and time again. [01:32:05] And I do think that one of the big challenges is, well, the internet's global. [01:32:11] But does that mean that someone who is in South Africa shouldn't be able to influence policy in America? [01:32:20] Because that's what's happening. [01:32:21] You know, that's what you see in the series. [01:32:22] You've got a dude who's managing the Q board who's operating in South Africa, but he's able to write and say things that influence people in America. [01:32:31] So in a way, we're exporting free speech around the world. [01:32:35] But it means that their speech, and foreign actors, can influence our way of thinking as well.
[01:32:41] And that's why I think now we're going into a... [01:32:46] I've seen this from my travels. [01:32:49] I had production assistants in Macau and in South Africa who were red-pilled. [01:32:56] Like, a dude in Macau knew who Tucker Carlson was. [01:33:01] I'm like, why? [01:33:02] And he was able to cite all of these characters, [01:33:06] like political characters from the United States. [01:33:08] I was like, why in the world would this guy even know who any of these people are? [01:33:12] And it's because of things like YouTube. [01:33:15] So what we're seeing now, I think, is that ideologies are spreading on a global scale. [01:33:23] It's borderless. [01:33:25] It's super decentralized. [01:33:28] And those ideologies are very strongly held. [01:33:37] So if there is going to be conflict in the future, and I am doing everything that I can think of to try to avoid that, I really hope that it doesn't [01:33:50] come to pass. But, you know, if there was another world war, it would be decentralized; it wouldn't be with borders. [01:33:58] You know, it would be driven by ideologies, where people have come to believe certain things that they've learned through the internet. [01:34:04] And QAnon is an expression of that. [01:34:07] We've seen it permeate many, many, many other countries. [01:34:13] There was recently in Brazil a sort of failed coup attempt, where they had their own Q Shaman. [01:34:23] Just a Brazilian version of the Q Shaman was there. [01:34:27] In Japan, they've got two different flavors of QAnon there. [01:34:33] It's all over the world now. [01:34:36] Wow. [01:34:36] And they just kind of repackaged it for their own political players, with this idea of fighting this evil global cabal.
[01:34:45] And if that can spread in the form of Q in a global way, then that means that these ideologies are spreading in a borderless way, because the internet's borderless. [01:34:57] So, in a way, American free speech, providing this to the world, has opened up free speech to the world, but it means that now, on a global scale, we're influencing each other, which is fascinating when you compare that to the Cold War, where we were so afraid of the communist ideology somehow finding its way into America. [01:35:22] And then we had a battle, a proxy battle, of communism versus capitalism. [01:35:30] And now, you know, a Russian operative or someone from another country doesn't have to leave their computer to shitpost and influence things. [01:35:43] They can do it from the comfort of their own home. [01:35:46] So people use the term information war, and it's kind of hard to sometimes picture that, but it's a battle for people's minds on a global scale right now. [01:35:59] Do you think the first order of business to sort of combat this would be to get rid of the algorithms? [01:36:04] Do you think that would be one of the main things that would help solve this problem? [01:36:11] I don't know that we can get rid of the algorithms. [01:36:13] It's like you can't really put the genie back in the bottle. [01:36:16] And I want algorithms on a search engine. [01:36:20] I want to be able to find things on the internet more efficiently. [01:36:23] But I think it's the data personalization feeding the algorithms that's the problem. [01:36:28] You can have algorithms that don't involve all the information that the company knows about me, right? [01:36:35] That is a possibility. [01:36:39] And that would change the kind of information feedback loop that I would experience online. [01:36:50] So I don't think we have to get rid of algorithms in their totality.
[01:36:53] I think we just have to, I think we start by enacting privacy regulations, make it so these companies maybe have to get rid of the psychometric profiles that they've built on each of us, and severely limit their ability to use data personalization [01:37:14] when feeding the algorithms. That's where I would start. [01:37:18] It's slightly more prescriptive, but what makes the algorithms powerful is the personal data. [01:37:24] And that's where I think the problem is sourced. === Hijacking Patriotism for Hate (10:32) === [01:37:27] I have a friend who's a lawyer who I just had on last week. He's a criminal defense lawyer, and he's represented neo-Nazis. [01:37:43] He's represented drug smugglers convicted of bringing insane amounts of drugs into Florida. [01:37:52] And one of his most recent clients is one of the guys who got charged with assaulting a police officer on January 6th. [01:38:01] And this friend of mine, he's a liberal. [01:38:06] He's a Democrat. [01:38:07] He's outspoken about it. [01:38:08] And he voted for Obama. [01:38:10] He can't stand Trump. [01:38:12] But he takes on cases like this. [01:38:14] Like I said, he's represented neo-Nazis, and he represented this guy who, you know, was convicted of assaulting a police officer on the Capitol steps. [01:38:24] And he said one of the things that really kind of shocked him is how much this guy, when he first met him, just kept pushing this one thing. [01:38:35] He's a patriot. [01:38:35] He's a patriot. [01:38:37] And he's like, it's interesting. [01:38:38] He's like, why? [01:38:40] He's like, he's hijacked the word patriot. [01:38:42] He's like, you see the picture? [01:38:43] There's me and Obama shaking hands. [01:38:46] He's like, we're all patriots. [01:38:47] Cut off the patriot bullshit. [01:38:49] You know what I mean?
[01:38:50] And I asked him, I'm like, did this guy ever talk to you about QAnon? [01:38:55] Like, what did he say about QAnon? [01:38:56] And he's like, I've never heard of QAnon. [01:38:59] And he never said that once. [01:39:01] I thought that was quite odd. [01:39:04] But it is interesting how they do hijack these words like patriot. Or whenever you see "we the people," like, how did "we the people" become this hardcore right-wing thing? [01:39:22] Yeah, patriotism. I'm trying to remember the study or the book that said that patriotism is a very useful psychological tool to manipulate people with. [01:39:37] You know, to play to their sort of... because patriotism has a kind of faith in it. [01:39:42] It does have a religious quality to it. [01:39:44] So if you can use patriotism as a kind of shield or a mask in order to convince people that you're on the righteous side, you know, it helps them feel like whatever they're doing, their cause is just. [01:40:05] So there's a lot of power in kind of repurposing patriotism as faith in one's country. [01:40:19] And I think that those who have done so understand that. [01:40:27] They're not idiots. [01:40:28] I mean, using that term, using patriot, it draws in people who tend to be religious, but it also draws in people who are ex-military guys. [01:40:42] It makes them feel like, oh, this is the thing that I belong to, because patriotism and the flag is so central to it. [01:40:53] And what is the alternative for them? [01:40:55] If they see this thing that says, like, yeah, come over here. [01:40:59] We really support the country. [01:41:03] We blanket ourselves in flags. [01:41:07] I think it's attractive to a lot of people. [01:41:11] It's another effective tool for mobilization. [01:41:16] You know, I'm a pretty serious constitutionalist, and I care a lot about civil rights. [01:41:28] I mean, I would consider myself to be a patriot, right?
[01:41:30] But when it gets repurposed, it's fascinating, because then people are afraid to say something like that. [01:41:38] You know, like, the American experiment founded on the Constitution [01:41:45] set the stage for the rest of the world when it came to civil liberties. [01:41:52] It wasn't perfect when it came out, but it was the best document, and they knew it needed to be a working document. [01:41:58] And that's why, when I look at something like this new question around free speech online or digital privacy online, it's like we're just having the argument again because it's in a digital space. [01:42:10] But it's really the same argument, or same question, that we've had for hundreds of years. [01:42:15] And when people were escaping tyranny, they said, here's what we need. [01:42:19] Here are the rights we want. [01:42:20] Here's what we wish we'd had. [01:42:23] So it does frustrate the hell out of me to see it get repurposed to a certain extent. [01:42:31] But I do think that those who get drawn to that, they get drawn to it because they feel like there's no patriotism somewhere else. [01:42:40] If you went and nearly died for your country and you came back, you're like, well, which group would I align with? [01:42:46] Would you align with the one that, you know, seems to support you and uses a lot of American flags, or one that is unwilling to say that they have any sense of patriotism? [01:43:07] It's a clever ideological weapon. [01:43:13] Yeah, and I've never seen it used so effectively. It's just so confusing. [01:43:18] It is very confusing. [01:43:19] It's very confusing. [01:43:21] Yeah. [01:43:22] What was it like on January 6th, when you guys [01:43:26] went there to film and you met with Jim? [01:43:31] Well, I didn't get much sleep going into that day, the two nights before. [01:43:38] I was just really anxious, because I knew something bad was coming.
[01:43:44] I actually thought that it might break out into civil war that day. [01:43:48] That's how bad I thought it was going to be. [01:43:51] And I think that we are lucky that it didn't. [01:43:58] You know, especially looking back now at the quote-unquote failures that allowed people to get into the Capitol. [01:44:09] I mean, there were still senators in the room while rioters were in the building. [01:44:17] So it was just chance that that didn't escalate, or that more people didn't get shot and killed. [01:44:26] I mean, if they had started firing on that crowd, what would have happened? [01:44:28] Would DC have just turned into a war zone? [01:44:32] So I'm thankful that it wasn't worse, but I think for most people sitting at home who hadn't been tracking this stuff for months or years, it was very shocking that anything like that could have been possible. [01:44:46] So I was on high alert going into that day, and I was there to document Jim's reaction to all of it and get a better grip on just how much of a role Q had played. [01:45:05] I don't think that the 6th happened solely because of Q, [01:45:08] but I don't think it would have happened without it either. [01:45:11] And the ideology of Q, its presence was everywhere that day, whether it was in the form of flags and patches, or just the chants that people were chanting or the songs that people were playing. [01:45:30] You know, you could see how much of it was informed by Q. [01:45:35] And Jim reflects that in what he says that day. [01:45:43] You know, he's like, Q's real. [01:45:44] You know, it belongs in the history books now. [01:45:51] And, you know, those guys were stoking the flames in the run-up to it. [01:45:56] So now I'm feeling stressed just thinking about what that day was like. [01:46:03] Sorry.
[01:46:05] Yeah, you know, in the heat of the moment, it was really unclear. [01:46:11] I mean, I didn't know what was going on inside of the Capitol until later. [01:46:15] With Jim, I couldn't tell if he was LARPing or if he meant it when he was cheering at the fact that there was somebody out the window in the Capitol. [01:46:26] I'm like, that can't be possible. [01:46:28] There isn't actually somebody out the window. [01:46:30] I mean, it wasn't until later that night, when I saw the footage, [01:46:33] that I was like, holy fucking shit, the Q Shaman got on the dais? [01:46:37] Like, what happened? [01:46:38] How is that possible? [01:46:39] The guy with the horns? [01:46:41] Yeah, yeah. [01:46:43] I mean, I'd seen that guy a couple months earlier at a Q convention. [01:46:48] He's the definition of a LARPer, you know? [01:46:53] Right, right. [01:46:55] But it was like a switch had flipped that day and they had gone beyond LARPing, and that's why we play that hallucinogenic tune going into the Capitol, because it was almost like the end of a really bad trip, [01:47:11] where suddenly here they all were, [01:47:15] in the Capitol building. It was like a switch had suddenly been flipped. [01:47:23] And the people who were there, like the Q Shaman and some of these guys, who knows what it felt like for them. [01:47:31] But I think now they understand that they had gone from LARPing to doing something very real and consequential. [01:47:42] And I think that the people who were provoking them understood what they were doing. [01:47:46] I think that Trump and his inner circle knew exactly what they were doing. [01:47:49] But, you know, they just kind of let those guys do the work for them. === Self-Preservation Over Agreement (03:46) === [01:47:59] And it doesn't seem like.
[01:48:06] I mean, Trump didn't come out and, you know, support them after the fact, right? [01:48:12] He just kind of hung them all out to dry. [01:48:14] So, I mean, he was able to kind of [01:48:17] walk away from it all untarnished. [01:48:19] But we'll see how that narrative continues to get rewritten around what happened on the 6th. [01:48:25] I mean, we're seeing it every day. [01:48:29] And how is that going to be used in the upcoming election cycle? [01:48:37] I guess we just have to sit back and watch. [01:48:40] But some of the agencies, it seems like they're kind of brushing it under the rug, like they don't really want to look at what happened that day. [01:48:49] We'll see what the January 6th Commission dredges up. [01:48:52] But most likely in a year, we'll see a flip in who's in Congress. [01:48:56] I mean, look, everyone in Congress was there. [01:48:59] The fact that there couldn't be some agreement between the right and the left on how to investigate what had happened, I think that gives you a sense of where we're at as a country right now. [01:49:18] That should have been a bipartisan concern. [01:49:23] You know, they weren't just going for Democrats that day, they were going for Pence. [01:49:28] So there's no effort to sort of [01:49:32] heal this wound of division in this country. [01:49:36] It's just that these people want to use it to their advantage for more political power or more influence or whatever it is. [01:49:47] It doesn't seem like any of these people in power are interested in solving this problem. [01:49:57] Yeah, I mean, I'm sure that there are some who are interested in solving it, but [01:50:09] I think there's maybe just too much focus on self-preservation with a lot of the people who are in power.
[01:50:16] And I think when Biden took office, maybe there was a hope that things would just go back to normal or something. [01:50:27] And that's just not the case. [01:50:32] You know, just because it's been brushed off of Twitter doesn't mean [01:50:38] that almost half of the country has changed the way that it feels about any of this, or that any of the problems that enabled that attempt have gone away. [01:50:52] If anything, they've learned from it and they're going to double down, no pun intended; they're going to be more organized, better funded, and more angry in the future. [01:51:08] So, yeah. [01:51:15] I wish that there was more focus on getting the extremes closer to center, and the right and the left talking again. [01:51:27] And I've seen some of it. [01:51:29] Bill Maher's been doing it. [01:51:33] There have been individuals who are saying, whoa, whoa, whoa, let's start to break down what the fissures really are and how these two sides are playing each other for their own political gain, === Bridging the Political Divide (01:18) === [01:51:46] and what the cost of that is. [01:51:48] And if we don't start to de-escalate it, these two realities aren't going to be able to coexist. [01:51:55] Well, you're doing a fucking phenomenal job yourself. [01:51:59] I mean, I think your film is one of the biggest things out there right now that amplifies this issue. [01:52:08] And you just being you and creating the work that you're creating, I think if there's anything that can help [01:52:19] heal this wound in this country, [01:52:21] it's work like the work that you're doing and doing talks like these with people. [01:52:26] You know, even you going on Joe Rogan, things like that.
[01:52:29] You know, I think the more that we can amplify people like you, and the more people become aware of films like yours, the better. [01:52:40] If there's anything that you can do, I mean, you're doing the most that you could possibly do. [01:52:45] And I appreciate you for that. [01:52:47] Well, [01:52:49] that's a very, very generous and considerate sentiment. [01:52:54] Thank you. [01:52:55] I really enjoyed talking to you, and thank you for your thoughts on the work I'm doing and for letting me speak to your audience as well.