True Anon Truth Feed - Episode 65: Tower of Terror Aired: 2020-05-01 Duration: 01:21:42 === British Guy Debates 5G (03:29) === [00:00:00] Everyone got really mad at us when we were talking about 5G. [00:00:03] Yeah, what they don't understand is that we're on 3G. [00:00:07] 3 genius. [00:00:09] Yeah, they did. [00:00:10] You know, I got to say, there's a lot of, you know, first of all, a lot of telecom cucks out there. [00:00:17] But second of all, telecucks. [00:00:19] You know what I found out recently? [00:00:20] You know how, like, whenever I post a screenshot or I have Mehmet post a screenshot to Twitter, it says 5G in the upper right-hand corner of my phone. [00:00:28] And people are always like circle that. [00:00:30] It says 5GE. [00:00:31] Apparently, that's just fake. [00:00:33] That's like not, that's just 4G, but they just put it, like, there's no law that says they can't call it 5G. [00:00:39] So they just call it 5G, which actually kind of makes me like them. [00:00:44] Well, I just like, I just love people that are like, oh my God, I can't believe you're making jokes about 5G. [00:00:49] Oh, my God. [00:00:49] I can't believe you take it seriously. [00:00:51] And it's like, wait, guys, you're saying both of those things at the same time. [00:00:54] Yeah. [00:00:55] Yeah. [00:00:55] It's, it's, I mean, anyway, we heard your pleas. [00:01:00] You're, you're, your, your mewlings and your cries. [00:01:05] And I did receive several letters, which made me incredibly uncomfortable. [00:01:10] So we've got, well, we're talking about, what are we talking about today? [00:01:14] We're talking about the intersection of art, technology, culture. [00:01:17] I'm sorry, I do this every time we talk about technology. [00:01:21] No, we have, we have us, uh, with us a really good roundup tonight. [00:01:25] We have editor-in-chief of Wired magazine, uh, Dirk Hitler. 
[00:01:32] Uh, we have, we have a couple of guys from Slatecast, uh, which is Martin Bormann and the guy named Wagner. [00:01:42] And then we're also. [00:01:42] We've got a panel going, right? [00:01:44] And then we're also joined by 15 faceless, nameless, screeching she-devils from a website called Kotaku who are bare-breasted and thus very much distracting me. [00:01:56] But Liz, can you, can you, actually, Young Chomsky, can you dim the lights on their Google G-Chat panel? [00:02:29] On podcast. [00:02:32] You know what I fucking despise is when people do like they make some YouTube video, for instance, about World War I, and then they have the British guy, like AI, not AI, but like the British guy, like narration on it. [00:02:46] You know, you can just like make your computer talk in a British accent. [00:02:50] When they do that, I fucking hate that British guy's accent. [00:02:54] I think he's okay. [00:02:56] I don't like him. [00:02:57] I don't like the lady. [00:02:59] I really like her. [00:03:00] Yeah, I knew you were going to say that. [00:03:02] I just, I do. [00:03:03] No, I like her. [00:03:04] I'm not telling me. [00:03:05] Okay, okay. [00:03:05] I've never seen her face. [00:03:07] Okay. [00:03:08] Welcome to TrueAnon. [00:03:11] Hi, guys. [00:03:13] I'm sorry. [00:03:13] No, we got it. [00:03:16] Say it again. [00:03:17] No, you're going to do it again. [00:03:21] I'm blissed. [00:03:23] I didn't do it. [00:03:23] I didn't do it. [00:03:24] That's called trust. [00:03:25] Because I thought you were going to say it. [00:03:26] No take backs. [00:03:27] No take backs. [00:03:28] Fuck it. === Welcome to TrueAnon (03:36) === [00:03:29] Fuck it. [00:03:29] We're going. [00:03:30] We'll do it live. [00:03:30] Fuck it. [00:03:32] I'll write it and we'll do it live. [00:03:36] I'm Bryce. [00:03:37] I can't. [00:03:38] I didn't do it right. [00:03:39] I'm Brian. [00:03:41] All right. [00:03:41] All right. [00:03:45] I can't get it right.
[00:03:46] I'm Brace. [00:03:49] I'm Brace. [00:03:52] And we're, and we're, I'm doing regular voice now, and we're, uh, we're joined by producer Young Chomsky. [00:04:00] And this is, did we already say this is TrueAnon? [00:04:03] If you didn't, then it's my time in the sun, baby. [00:04:06] This is TrueAnon. [00:04:08] Stop it, I can't stop laughing. [00:04:12] Oh my God. [00:04:13] Okay. [00:04:14] So we are actually talking about the intersection of technology and art, body spaces, all that jazz today. [00:04:23] No, we're talking 5G. [00:04:26] We're talking the internet of things, IoT. [00:04:29] That's right. [00:04:30] We're talking AI. [00:04:32] Oh, yeah. [00:04:34] And we, what else do we got? [00:04:36] What else are we talking about? [00:04:38] That's pretty much it. [00:04:39] I mean, related topics. [00:04:41] You know, one thing that, because you know, you know, like a lot of people don't realize this, but I have actually been spending my entire sort of COVID, let's say, seclusion in what's called Masdar City, right outside of Abu Dhabi. [00:04:58] And so we are, I am, I am actually recording right now from a smart city, and we'll be touching on several topics related to that. [00:05:06] Do you feel smarter? [00:05:08] Well, I am. [00:05:09] So I don't want to say, I didn't want to give this away on this episode, but you know, you know what a sheikh is? [00:05:16] I'm actually a sheikh now. [00:05:18] Okay. [00:05:19] Yeah. [00:05:19] Yeah. [00:05:21] Sheikh Belden. [00:05:22] It's pretty good. [00:05:23] The first Jewish sheikh. [00:05:26] All right. [00:05:27] So, uh, would you want to introduce our guest, or should we just get the interview on the roll? [00:05:33] I mean, we introduced him at the beginning. [00:05:35] I know. [00:05:35] I never know how to do this. [00:05:37] You know what? [00:05:37] I can't believe we're recording this. [00:05:38] It's like, we don't even know what we're doing. [00:05:40] What are we even saying?
[00:05:43] I mean, well, fuck. [00:05:46] Don't do me like this, baby. [00:05:48] Let's let it roll. [00:06:22] Welcome to the Cyber Zones, is that what? [00:06:24] Is that too much? [00:06:25] Regardless, welcome to the cyber zone. [00:06:28] We are here joined by Jathan Sadowski, who, by the i at the end of his last name, I deduce is a fellow Pole, much like Liz. [00:06:38] He is the author of Too Smart: How Digital Capitalism Is Extracting Data, Controlling Our Lives, and Taking Over the World. [00:06:44] And he is here to tell us how to, the best way to blow up a 5G tower. [00:06:51] Is that correct? [00:06:53] Allegedly. [00:06:56] How's it going? [00:06:57] It's good. [00:06:58] It's good. [00:06:58] Just, you know, in lockdown in the cyber zone. [00:07:03] How are you guys doing? [00:07:04] I'm good. === Smart Power and Psyche (06:16) === [00:07:05] I think I'm handling it pretty well this week. [00:07:09] I had a terrific day. [00:07:10] I'm on cloud nine. [00:07:12] I'm all, I'm juiced. [00:07:14] Guys, hear me out here. [00:07:15] I can't tell if we're living in dang Black Mirror or what. [00:07:19] Yeah, this is crazy. [00:07:21] It is. [00:07:22] So we are, I feel like we're going to have a pretty wide-ranging discussion. [00:07:28] We got a little bit of snark on the internet for our musings, our ironic musings on 5G and emerging new digital technologies from the Peanut Gallery, the Twitter Peanut Gallery. [00:07:43] So we're so excited to have you on. [00:07:46] And I think, like I said, I think this conversation is going to end up going a lot of places. [00:07:50] So I kind of wanted to read two things to start out, to kind of give us a frame. [00:07:57] The first is from the philosopher Byung-Chul Han, and the second is actually from two pieces that you've written. [00:08:04] And I think that there's a way we can kind of connect a thread here that will be a good way to frame this conversation that we start with.
[00:08:12] So this is from Psychopolitics from 2017. [00:08:16] Power that relies on violence does not represent power of the highest order. [00:08:20] The mere fact that another will manages to form and turn against the power holder attests to the latter's weakness. [00:08:27] Wherever power does not come into view at all, it exists without question. [00:08:31] The greater a power is, the more quietly it works. [00:08:34] Power that is smart and friendly does not operate frontally, i.e. against the will of those who are subject to it. [00:08:42] Instead, it guides their will to its own benefit. [00:08:45] Smart power cozies up to the psyche rather than disciplining through coercion. [00:08:50] It is constantly calling on us to share and participate. [00:08:54] And he continues, drawing a contrast with Naomi Klein. [00:08:57] Klein's theory of shock blinds her to the actual workings of neoliberal psychopolitics. [00:09:03] Shock therapy is a genuinely disciplinary technique. [00:09:07] In contrast, the neoliberal technology of power does not exercise disciplinary coercion. [00:09:13] Electroshock owes its efficacy to paralyzing and annihilating the contents of the psyche. [00:09:18] Its essential trait is negativity. [00:09:21] In contrast, neoliberal psychopolitics is dominated by positivity. [00:09:26] Instead of administering bitter medicine, it enlists liking. [00:09:30] It flatters the psyche instead of shaking it and paralyzing it with shocks. [00:09:34] Psychopolitics seduces the soul. [00:09:37] It preempts it in lieu of opposing it. [00:09:39] This is smart politics. [00:09:41] It seeks to please and fulfill, not to repress. [00:09:45] So sorry, that was a little long, but I do really want to connect this with two things that you've written about basically AI as a disciplinary ideology rather than like even a concrete power. [00:09:57] And I think we can kind of get into that.
[00:09:59] And also, as we'll kind of move through, these kind of emerging possibly authoritarian impulses that the corona epidemic is kind of unmasking. [00:10:16] So you write in Potemkin AI, which was from 2018 in Real Life magazine, and we'll have links to that in the show notes. [00:10:27] The disciplining power is much greater if people believe that an inhuman force is tirelessly processing feeds from the ubiquitous cameras rather than groups of human analysts who take time, get fatigued, and make mistakes. [00:10:40] Persuading people that the police are using AI is a way to normalize the idea that AI should be, and perhaps more important, already is, ceaselessly monitoring society. [00:10:51] Again, for the purposes of power and discipline, it matters less if the AI is real or fake. [00:10:57] What matters is what people believe. [00:11:00] And then this is from a piece you wrote just weeks ago, I believe, also for Real Life magazine. [00:11:07] By the pragmatist standards, adopting these intrusive surveillance programs to counter the pandemic is at best a gamble. [00:11:15] There's a strong case laid out by legal scholar Susan Landau that location surveillance of cell phones does not work for contact tracing because of technical limitations. [00:11:24] If a proposed solution is not efficacious, there is no reason to consider the program, Landau writes. [00:11:31] But the exigencies of political economy say otherwise. [00:11:35] The efficacy challenge does not actually govern policy. [00:11:38] Exercising power is not just about effectively achieving particular outcomes or doing what works. [00:11:44] It's also deciding the parameters of how those ideas will be defined. [00:11:50] So, sorry, that was very long, but I wanted to kind of use that as like a general frame for how we understand these like technologies of discipline, both how they exist physically and digitally, right?
[00:12:05] But also psychologically, which I think is something that you've written really excellently about. [00:12:10] And how, you know, like you write, it's less even what the capabilities of the technology are, but the kind of the specter of its possibility that can kind of quiet dissent or quiet contestation of, you know, what these, what these, you know, technologies are being developed and used for. [00:12:36] Yeah, I mean, that, that psychopolitics quote is so dead on. [00:12:41] I was like, just pumping my fist as you were reading it because I do think it gets to a lot of like it's a very fine line that we have to walk because so much of the technology, there's such a like whole industry of marketing and branding around it, which pushes its efficacy, right? [00:12:58] Like this is going to work. [00:12:59] It's perfect. [00:13:01] It won't glitch. [00:13:02] It's unmistaking. [00:13:03] Like it's unerring. [00:13:04] It's going to work. [00:13:05] And on one hand, it's really hard as a critic to not also take on that same kind of assumption that these things are going to work in the exact way that their entrepreneurs and inventors say that they're going to. === Military Hardware in Smart Cities (13:16) === [00:13:22] But as you were getting at with the quotes, it doesn't really matter because I mean, like the like so much of Silicon Valley is about this like fake it till you make it, right? [00:13:32] The technology is a placeholder. [00:13:34] It's a way of normalizing some kind of system until they can actually get the technology to the point where it works approximating in the way that they say it does. [00:13:46] So it's like they're kind of priming the pump, getting us used to it by having this really strong marketing push.
[00:13:52] And then once it is actually there, like with the Potemkin AI piece you were talking about, a lot of that was about how so much, you know, quote unquote artificial intelligence is actually powered by human labor, right? [00:14:07] It's like office buildings full of like low-wage workers doing things that the companies like Facebook or the Chinese police say is actually artificial intelligence, right? [00:14:18] It's not, not yet, because the technology is not there, but that doesn't stop them from trying to kind of like cash out on those disciplinary dividends before they can actually make good on that technological promise. [00:14:33] Yeah, and you write a lot about this in relation to the smart city. [00:14:38] Can you give us a little bit of a working definition of kind of what that is? [00:14:44] Yeah, I mean, the smart city is such a like massive buzzword and there's so many competing definitions. [00:14:52] Like everyone wants to have their own definition because if you can define it, you can own it. [00:14:58] But I mean, the kind of like working definition that I go with is that it's a city built on top of and governed by kind of data-driven systems, right? [00:15:10] So this kind of like massive real-time collection of data about everything that's happening in the city, how it operates, how people live in it, move through it, the infrastructure, and feeding that in through these kinds of like network systems that keep everything interconnected, keep all the infrastructure, all the systems, all the people, buildings, et cetera, kind of talking to each other. [00:15:33] So it's about this kind of data. [00:15:35] It's about networks and it's about automation. [00:15:37] How do we then automate the operations of the city? [00:15:40] How do we govern the city in a way that's like, you know, optimized by algorithmic analysis or by some kind of artificial intelligence? 
[00:15:48] So it's about like making it a lot of it is really focused on the operations and the governance of the city and how do we optimize that, make it more efficient. [00:15:58] So the city runs like a well-oiled machine or as IBM, who's like, you know, one of the major kind of global proponents and vanguards of the smart city, conceptualizes the city as a system of systems. [00:16:14] Well, like in what, like, in, so sort of more to the specifics, like what would like a smart city look like? [00:16:20] Are we talking about like traffic lights? [00:16:21] Are we talking about policing or all of it? [00:16:24] Yeah. [00:16:25] So like the number one example that they always use, by like journalists and these tech companies, is always about these like NASA-esque like mission control rooms. [00:16:36] You know, that's what I picture. [00:16:39] Yeah, yeah, it's what we picture. [00:16:40] And they, these actually do exist. [00:16:43] One of IBM's like first ones was in Rio de Janeiro. [00:16:46] So it became this kind of like major example. [00:16:49] Like the New York Times wrote this like all the way back in like 2008 when all this started really rolling out. [00:16:56] Like there are all these kind of puff pieces. [00:16:57] Like this is the city of the future. [00:17:00] And you know, they, but it's supposed to be this kind of like, you know, just imagine the Apollo 11 mission or something. [00:17:08] You know, you've got a whole wall full of like screens that are showing different views of the city. [00:17:13] They're crunching data. [00:17:14] There's like, you know, rows of analysts, data analysts sitting there kind of like, you know, real time, keeping track of the city, you know, all of this. [00:17:24] So that's, I mean, that's the idea of the smart city. [00:17:28] And in that is definitely things like, you know, networked and like algorithmically optimized traffic light systems and these kind of like infrastructures.
[00:17:41] But like you ask, what does the smart city look like? [00:17:46] Like there's not like there's nothing that when you walk into a like quote unquote smart city that you're like, oh shit, I'm in a smart city now. [00:17:55] Dang, this is crazy. [00:17:57] This is wild. [00:17:59] Like it doesn't feel any different. [00:18:01] And I think that speaks to a lot about like who the smart city is for. [00:18:06] Yeah. [00:18:06] Right. [00:18:08] Like the users of these like enterprise systems are not the people that live in the cities. [00:18:15] It's the people who govern the cities. [00:18:17] Right. [00:18:18] It's really just like a managerial governance structure then at that point. [00:18:23] It's yeah, like like the like the major marketing around it, like the public kind of stuff that like the IBMs and Cisco's and Google and so on kind of talk about really focuses on the like the planners, like planning a city and governing it. [00:18:38] So this like, so that kind of side of things. [00:18:41] But there's like definitely, I think, a much larger like iceberg kind of much larger beneath the surface idea of who's actually using the smart city and how the smart city is being built. [00:18:53] And that is, as I've argued, you know, in pieces before, it's the police, right? [00:18:59] The smart city is for the police. [00:19:02] They're the ones who have the most powerful technologies. [00:19:05] They're the ones who are using it the most widely, both like in a city, but also across multiple cities, right? [00:19:12] Like they are the ones who are taking full advantage of the smart city. [00:19:18] Well, like you say in one of the articles that like they're basically, you know, all these, these, you know, radars and these tracking systems, I mean, it's essentially like a military, military hardware, or at least policing hardware. 
[00:19:30] And instead of being used solely for the purposes of policing, which obviously is a pretty broad sort of category, it's used to track everything that happens in your life, which I think kind of blurs the line between policing and governance. [00:19:45] Yeah, I mean, the police are, they are doing the governance, right? [00:19:51] And it's not like military hardware. [00:19:54] A lot of it is actually military hardware. [00:19:57] So you've got like, you know, Palantir is one of the major examples. [00:20:03] Friend of the pod, Palantir. [00:20:05] Peter Thiel, what's up? [00:20:06] Big ups. [00:20:09] But like, you know, so they're literally a CIA-backed data cruncher, like a defense contractor. [00:20:16] But then they decided that it was actually more lucrative to sell their wares to police departments, right? [00:20:25] Because there's a lot more police departments than there are, you know, DODs. [00:20:30] And so, I mean, you've got Palantir. [00:20:33] You've got a lot of irony is dead because like Palantir, you know, named after the all-seeing stones from Lord of the Rings, fucking nerds. [00:20:43] You know, another one of these more kind of like analog surveillance is another system or another company called Persistent Surveillance Systems. [00:20:54] Persistent Surveillance Systems. [00:20:56] Yeah. [00:20:57] So it's like low, it's low-flying aerial surveillance that orbits around the city and kind of like captures what this, what the founder called like TiVo for the city. [00:21:09] So it's like recording everything that's happening in like a 10 square mile radius. [00:21:14] And then you can like fast forward, rewind, pause to like follow the movement of a specific vehicle or something, you know? [00:21:22] And that technology originated from a military, like military lab for the Iraq war in Fallujah.
[00:21:33] And then the guy like retired from the military, owned the IP somehow still for this technology and was like, I'm going to start marketing this to police departments in the U.S., in Mexico, like whoever's going to, whoever's going to buy it. [00:21:48] Incredible. [00:21:49] Yeah, I think, you know, so much of this stuff too is really like kept from the public. [00:21:57] Or you mentioned that like the there's a difference between, like you say, a kind of iceberg where the stuff public facing is, you know, oh, these are, you know, the internet of things and all of these things are going to be outfitted. [00:22:12] And then the one behind the scenes is all the information about the D, you know, it coming from like a DOD contractor and who, what companies they're partnering with and, you know, all those companies, what contracts they have with municipalities, because municipalities are so starved from state budgets that they have to, you know, they outsource all of this stuff to private companies and, you know, et cetera, et cetera. [00:22:34] And it's kind of a snowball effect there. [00:22:37] Yeah, like municipality, exactly. [00:22:40] So municipalities, you know, the companies are stepping in because of austerity to kind of promise these solutions, right? [00:22:48] Like you can do more with less. [00:22:49] That's always the mantra, like efficiency, efficiency, do more with less. [00:22:53] So municipalities, I mean, you know, you can't really blame them when a big tech company is saying like we can solve all of your, all of your operational problems for, for, for nothing. [00:23:06] You know, they're like, all right, do it. [00:23:08] But the irony of that is that the one, like the one agency in cities that does have a ton of money are the police, right? [00:23:16] So like, they're getting it from both sides because they're getting all this like DHS funding, like Homeland Security. 
[00:23:22] It's all being done under this like counterterrorism mandate. [00:23:28] Yeah, I mean, it reminds me, like, I can see essentially why cities would get into this in the first place, especially if, for instance, one of these big companies, Google, whatever, I know Google is trying to build some smart city like outside of Toronto, I think. [00:23:42] But like it's in Toronto. [00:23:44] Yeah. [00:23:45] Well, one of the smartest cities on earth anyways, even without Google. [00:23:49] But what, like, it's, it's, I could see them sort of being able to capture cities in this sense because cities, a lot of city governments, excluding the one that I'm in, San Francisco, but a lot of city governments across the nation are starved for money. [00:24:03] And if a company like Google or IBM steps in and says, well, we will not only outfit your city with all this new stuff, but we will do it for free. [00:24:10] Well, it could be similar to something like what they did with Project Nightingale, where their health insurance scam where they, where they basically got 50 million people's health data, you know, in gross violation, although not actually apparently of HIPAA, for free. [00:24:26] I mean, it could be the same thing. [00:24:27] It's like, okay, well, we'll do this for you if you basically let us scrape or don't ask us what we're doing with this data. [00:24:36] That seems like the most logical thing to happen for me. [00:24:40] Yeah. [00:24:41] I mean, it's like I've been trying to map out the political economy of this in my head, but there was an opening in 2008 after the financial crash where Wall Street took this major hit. [00:24:59] But there's still a lot of capital looking for somewhere to put it, right? [00:25:03] So like real estate was gone. [00:25:06] Like the finance industry was gone, right? [00:25:09] Like insurance was still was like struggling. [00:25:12] They were trying to figure it out.
[00:25:13] So a lot of capital got fed into technology. [00:25:16] Like it's no coincidence that something like the smart city originated by IBM and Cisco around the same exact time in 2008, at the end of 2008. [00:25:30] It's no coincidence that Uber and Airbnb were founded in 2008 and 2009, respectively, right? [00:25:37] Like none of these are coincidences because they saw openings, right? [00:25:42] Like Uber and Airbnb saw an opening in the city because of austerity, because they weren't able to provide these kinds of public services. [00:25:50] So they saw an opening to kind of disrupt it, to take it over, right? [00:25:54] IBM and Cisco saw an opening in the cities to move in with the smart city stuff to be like, we can revamp the city. [00:26:03] So, you know, while everybody was kind of like down and out, you know, the tech giants, all this venture capital, they became the big dog in town and they're still riding on that high from the 2008 crash. [00:26:19] So what kind of data are they? [00:26:21] I mean, this is something I'm always confused about. [00:26:25] Like, what are they doing with this data? [00:26:29] Because so far, it seems like the only thing anyone's figured out is like just to sell us like skinny tea ads on Instagram. === Algorithms And Advertising (03:43) === [00:26:38] For instance. [00:26:39] As a for instance. [00:26:43] Well, now you're going to get a lot more skinny tea ads after saying that. [00:26:47] She doesn't need it. [00:26:48] They know the algorithm knows she's skinny. [00:26:52] Yeah, that was a. [00:26:55] All right. [00:26:56] So it's a good question. [00:26:57] What are they doing with all this data? [00:26:59] I mean, a lot of it is advertising, right? [00:27:01] Like, like, yeah, I mean, Google gets like 90-something percent of its revenue is still advertising. [00:27:08] Facebook, right? [00:27:08] Like, so these massive companies are actually like super profitable on advertising. 
[00:27:15] But of course, they're collecting a lot more data. [00:27:19] And not all of that data is about economic capital or trying to change it into money, which is what advertising is, right? [00:27:26] It's a way of trying to take data capital and turn it into economic capital, into money. [00:27:32] But that's not always the case. [00:27:34] So something like Palantir, this kind of like, you know, policing technology, they're collecting a ton of data too, but they're not trying to turn it into money. [00:27:43] They're trying to turn it into power. [00:27:45] They're trying to turn it into control, into management, into governance. [00:27:50] And they are. [00:27:50] They're doing it. [00:27:51] We don't always see the effects of that because we're not the ones being targeted by it. [00:27:57] Right. [00:27:57] Like they're collecting data about us, but they're still targeting the same groups, right? [00:28:03] They're still targeting people of color. [00:28:05] They're still targeting poor people. [00:28:06] They're still targeting the same geographies of the kind of like vulnerable and oppressed. [00:28:12] Right. [00:28:12] Like it's a lot of this is about doing what you were already doing, but now like wash them. [00:28:18] But now you can launder it through data, through algorithms, make it objective, right? [00:28:23] But it's shit in, it's shit out. [00:28:24] It's injustice in, it's injustice out. [00:28:27] Like if all of your policing data is like overrepresenting certain groups or certain neighborhoods and you feed that into an algorithm that's supposed to predict where the next like crime is going to happen or who's going to do it. [00:28:41] Right, right, right. [00:28:42] Then all you're doing is just like predicting what you already knew because it was already like overrepresented in this biased data. 
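The feedback loop being described, biased records feeding a predictor that then generates more of the same biased records, can be sketched in a few lines. This is a purely illustrative toy model, not anyone's actual system: the neighborhood names, rates, and starting counts are invented, both neighborhoods are given the same underlying incident rate, and the only difference is a skewed historical record.

```python
import random

random.seed(0)  # deterministic run for the illustration

# Both neighborhoods have the SAME true incident rate...
true_rate = {"A": 0.10, "B": 0.10}
# ...but the historical record starts out over-representing A.
recorded = {"A": 60, "B": 40}

def predict_hotspot(data):
    """A stand-in 'predictive policing' model: patrol wherever the
    recorded counts are highest."""
    return max(data, key=data.get)

for day in range(200):
    patrol = predict_hotspot(recorded)
    # Incidents are only recorded where patrols are sent, so the biased
    # prediction decides where new 'evidence' is allowed to accumulate.
    if random.random() < true_rate[patrol]:
        recorded[patrol] += 1

# A's count keeps growing while B's never moves: the model has
# 'predicted' exactly the bias it was trained on.
print(recorded)
```

Nothing about neighborhood B ever changes in reality; it simply stops being measured, which is the "shit in, shit out" point made above.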
[00:28:50] Well, and encouraging even more policing of that area, which is only going to increase, you know, crime, you know, and it's, it's a cycle that then fulfills itself. [00:28:59] And people say, you know, this is always my whole thing with algorithms. [00:29:02] It's like, you know, the people always are like, I mean, I don't know, you know, people are like, oh, the algorithms are racist or, oh, the algorithms are targeting this, they're targeting that. [00:29:12] And it's like, no, but the people are writing the algorithm. [00:29:15] It's not, you're not going to, you're not going to, it's not the algorithm itself, right? [00:29:19] I mean, it is, but that's not the point. [00:29:21] And if you don't deal with the actual material disparities that are producing the person who's writing the algorithm, right? [00:29:30] And what they're responding to, you know, because some people are like, well, just fix the algorithm. [00:29:35] And it's like, you can't just, but that's not going to fix anything. [00:29:38] Yeah. [00:29:38] Like, my main objective is not, I mean, obviously I object, you know, a lot of algorithms. [00:29:45] Like, I agree with what Liz is saying, but I'm against the existence of the algorithm to begin with. [00:29:51] Like, I don't like the algorithm at all. [00:29:54] I'm against, you know, the underlying causes of the algorithm, but I'm also, to be clear, no algorithm against the algorithm. [00:30:03] I mean, you're totally getting at it here, though, too, where it is like this. [00:30:07] It's a very like ironically vulgar materialism because it's like so focused on the technology. [00:30:13] So it's like, oh, it's materialist. [00:30:15] Like we're focused on the technology, the algorithm, the data. [00:30:18] It's like, no, you gotta, you gotta take that back a little bit. === People Behind the Algorithm (11:08) === [00:30:21] Like there's people behind that. [00:30:24] And those technologies are materializing certain interests. 
[00:30:28] They're prioritizing certain values, right? [00:30:31] That's the key there. [00:30:33] It's like, I'm with Brace. [00:30:34] Like, I hate the algorithm, but I hate the algorithm because it represents like capitalism. [00:30:39] It represents this like oppressive domination, right? [00:30:42] Like it's a tool for that, not by accident, because it was made to do those things by those people. [00:30:50] But like you were saying, though, like all these tech companies saw an opening in like 2008, 2009. [00:30:56] And we touched on this for a bit earlier, but I think a lot of companies are seeing an opening right now too. [00:31:02] It's with the COVID outbreak, you know, to roll out, I mean, whatever psychotechnologies they've been brewing up in their Dr. Mengele's fucking laboratories. [00:31:14] But yeah, I mean, we talked about Palantir. [00:31:17] Palantir has actually been meeting with Donald Trump, et cetera, in the White House to talk about contact tracing, which is giving me a little bit of pause. [00:31:29] Palantir is partnering with the NHS in the UK. [00:31:35] Palantir is in it because they're like, look, we've got, so Palantir's technology is based on a technique that they call social network analysis, right? [00:31:44] It was designed to find terrorists and find all the people and addresses and things that are associated with known terrorists. [00:31:53] So you can kind of like map out that whole like terror network. [00:31:57] You can get a full picture of the cell, right? [00:32:00] And so they like use that and they're like, well, shit, if we can do it with terrorists, we can also do it with criminals. [00:32:06] We can see like, you know, in a city, like who are all the people associated with this known criminal? [00:32:12] And then we can map out this whole like crime network. [00:32:16] That's just contact tracing. [00:32:18] That's exactly what it is, right? [00:32:20] So this is a crisis that is like tailor-made.
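The "social network analysis" move described here, mapping everyone associated with one known node, is computationally just a breadth-first expansion outward from a seed. The graph, node names, and hop limit below are invented for illustration; the point is that the identical traversal serves for a terror network, a crime network, or contact tracing, only the label on the edges changes.

```python
from collections import deque

# Hypothetical association graph: edges could mean "called each other",
# "share an address", or "were co-located" -- the algorithm doesn't care.
links = {
    "seed": ["a", "b"],
    "a":    ["seed", "c"],
    "b":    ["seed"],
    "c":    ["a", "d"],
    "d":    ["c"],
    "e":    ["f"],  # unconnected to the seed
    "f":    ["e"],
}

def map_network(graph, seed, hops=2):
    """Breadth-first expansion from one known node, returning each
    reached node with its distance (in hops) from the seed."""
    seen = {seed: 0}
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        if seen[node] == hops:
            continue  # stop expanding at the hop limit
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen[neighbor] = seen[node] + 1
                queue.append(neighbor)
    return seen

# Two hops out from the seed reaches a, b, and c, but not d, e, or f.
print(map_network(links, "seed"))
```

Swap "seed" from a terror suspect to an infected patient and the output is a contact-tracing list, which is the equivalence the guest is drawing.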
[00:32:24] I mean, I'm not going to say tailor-made because, you know, the peanut gallery is going to have some words about that. [00:32:31] Let's just say we don't know where it came from exactly. [00:32:35] Yeah. [00:32:35] Wet market of the mind. [00:32:39] Let's not talk about intentions, but let's talk about functions. [00:32:45] But so like Palantir, like they're boom, like, you know, contact tracing, but they're also working with. [00:32:51] So if you look on their website, they've got a list of people that they're working with on corona response, but they're very secretive. [00:33:00] So you can assume that they're working with a lot more people and organizations than they list. [00:33:05] But they list things like a large Anglo-Australian mining company that they're doing work for on logistics. [00:33:15] They list a major gas and oil retailer that they are mapping out and keeping track of like 50,000 employees of this gas and oil company to keep them safe, right? [00:33:30] To kind of keep tabs on them. [00:33:33] So they saw an opening for their social network analysis and they are in it. [00:33:39] They're moving from crime to contagion. [00:33:42] It's all the same thing to them. [00:33:45] Yeah, this is kind of why I want to just circle back real quick or bring back that quote from psychopolitics, because I do think it's interesting that these, you know, these technologies and Palantir, even with, as you rightly point out, the like horrifying name, comic book villain name of the company, is that like, you know, people engage with and are receptive to these new kind of technologies of surveillance because they are kind of, [00:34:14] they're marketed or sold to them as like positive good things. [00:34:18] And corona really opens up. [00:34:20] I never know to call it corona or COVID. [00:34:22] It's always like depending on how I feel. [00:34:24] You know, I do it. [00:34:25] I go with the wind on it, you know? [00:34:27] I know. 
[00:34:27] It's kind of fun to, whoop, what am I to say next? [00:34:31] Just don't say the roni. [00:34:33] No, I don't like that. [00:34:34] I don't like that either. [00:34:35] I do like it. [00:34:35] I do like calling it 19, though. [00:34:38] Ooh, just 19. [00:34:40] Some people call Trump 45. [00:34:42] I'm like, I'm always like, yeah. [00:34:44] 19. [00:34:44] That's that. [00:34:46] I like that. [00:34:46] Participate. [00:34:47] That sounds like with you. [00:34:48] I don't want 19. [00:34:50] That sounds like the next Netflix dystopian TV. [00:34:53] Yes, it does. [00:34:54] Absolutely. [00:34:56] No, but okay. [00:34:57] So I want to bring it back to that because it's, you know, I love the way that he contrasted it with Naomi Klein's shock doctrine because he's saying like, no, no, no, there isn't this sort of like moment of rupture where a dominating power comes in and like kind of, you know, electrocutes you into submission. [00:35:17] Like, it's actually the complete opposite, the way these kind of new dominating powers work, where it's all based on actual like acceptance and to the point of like almost asking for it. [00:35:29] I mean, I just get it like that relationship that you're put in. [00:35:32] That was probably not the best way to put that. [00:35:33] But, you know, like, where, you know, all these new sort of, you know, contact tracing or, you know, thermal mapping or whatever all these companies are doing get to be sold as, you know, for public health or to strengthen communities. [00:35:52] And it's how you get people to buy in. [00:35:54] And that buy-in is absolutely crucial to the continued domination. [00:36:03] Completely. [00:36:04] I mean, I like to me, it's impossible not to think about this in relation to 9-11 and post-9-11, right? [00:36:12] Because right now it's public health, but back then it was national security, right? 
[00:36:17] But it's, but it's this, these same kind of like concepts are being used and the same exact reason for the same exact kind of like patriotism, kind of buying in, right? [00:36:28] Like, and you're exactly right, Liz. [00:36:30] I mean, this is how you get people to buy in. [00:36:34] This is how you COVID wash your technology or your system because it's about public health. [00:36:41] And if you're against Palantir partnering with the NHS to make sure that their contact tracing is more accurate, then you are against all of the people that have been sick, all of the people that are being impacted economically, socially, physically by the pandemic. [00:36:59] You are actively working against their interest because you criticize Palantir doing this. [00:37:06] Yeah, it's it's I think it's important to have that sort of buy-in from the public there. [00:37:12] I mean, famously in Spain, I believe it was when the French got kicked out and sort of the entire idea of like an electoral democracy, anything like that, constitutional government was thrown out. [00:37:25] People rioted in sort of like a jubilee of ecstasy, screaming, long live our chains, because they had bought in to that so much. [00:37:34] And you need that. [00:37:35] You can't like have a despotism with, I mean, you can't, you can rule by force, but it's not, it's not nearly as effective as ruling by a sort of, I mean, let's not to be a corny guy about this, but a manufactured consent. [00:37:48] And I think that's really important. [00:37:49] I think that COVID provides perfect, perfect cover for that because, you know, if I had to ask myself very early on during this, like, what number of deaths do I find acceptable over, let's say, contact tracing and like extreme surveillance, anything like that? [00:38:07] And I don't know the answer to that. [00:38:08] I don't know how many, I mean, it doesn't matter what I think, obviously.
[00:38:13] Like it does, you know, it's they're not, they're not asking me my opinion on this, but uh, but it's it's something hard to look at. [00:38:21] I don't think it's a very pretty number, to be honest with you, because like it's I've seen some of the technology that has been proposed to deal with corona, which which which just I mean seems terrifying. [00:38:32] That camera that can tell how far apart you are from somebody, what it does when it does see that you're six feet, that you're close to somebody. [00:38:40] It does not say, you know, what sort of penalty you accrue for being within three feet of somebody, for instance. [00:38:46] And then those heat mapping cameras we were talking about before we started rolling. [00:38:49] I mean, that terrifies me. [00:38:53] I mean, it's a terrible question to have to ask yourself that, you know, that what's that threshold of acceptable, but that, but on one hand, that kind of like utilitarian calculus puts us on, like it starts the debate on the wrong premises, right? [00:39:10] Okay. [00:39:11] And it, I think it starts the discussion with this like kind of false dilemma, false trade-off from the very beginning. [00:39:19] Cause now all of a sudden it is set up as a debate between liberty and security, between surveillance and health, right? [00:39:29] And I mean, like this is familiar to us, but it's familiar because it's a really effective tactic of framing this rather than saying, like, why aren't, why aren't you, you know, the government, the people in power, why, why are you turning to these kinds of like mass surveillance and like, you know, technology, technological solutions and population management? [00:39:56] And why are you turning to these techniques versus the techniques that we know work and work humanely, right? [00:40:04] Widespread testing, providing people with medical care, providing people with economic assistance, right? [00:40:11] Like we know these things work. 
[00:40:13] Things like providing people with support for isolation, for social distancing, right? [00:40:19] Like we know these things are what's effective. [00:40:22] So what's going on here that we're looking for this like silver bullet technological solution and then forcing people to have to ask themselves those questions that you raised, Brace, of like, how, how, how many people can I live with? [00:40:37] You know, dead people can I live with on my conscience if it means that, you know, we don't have this AI or this technology or whatever. [00:40:44] It's like, no, that's the, that's, that's the wrong framing from the get-go. [00:40:49] I know it's the wrong frame for some people, but unfortunately, the answer with me is a lot. [00:40:54] Well, I would say that, you know, also that's a framing that helps only the people in charge. [00:41:00] It doesn't help anyone else. [00:41:01] It's a completely paralyzing framework, which is exactly its purpose, right? [00:41:06] Exactly. [00:41:07] Yeah, no, that makes a lot of sense because I, you know, I realized that the reason I was using that framework, or at least I was thinking sort of personally in that framework, is because when this all started, we were presented with a choice of the economy and people's basically, in essence, people's financial security versus people staying alive. [00:41:25] When actually it should be like, well, why isn't the government just money's fake? === 5G Speed and Latency (16:14) === [00:41:29] Just give me some. [00:41:31] You know? [00:41:32] Yeah. [00:41:32] I mean, where's the money printer, right? [00:41:35] Exactly. [00:41:35] There literally is one. [00:41:38] We know there is one. [00:41:40] I mean, even just beyond money printer, it's like, yeah, we can just plan the economy. [00:41:45] You could just centrally plan the economy and we wouldn't have to deal with any of this. [00:41:49] Yeah.
[00:41:49] But instead, what we need is blood sacrifice to the market and we need to decide how many old people are willing to die so that we can open up the bars and salons and you know, whatever. [00:42:03] And believe you me, it is a literal blood sacrifice. [00:42:08] We are not being metaphorical here. [00:42:10] They are actually trying to summon Baphomet. [00:42:13] Well, speaking of Peter Thiel, yeah, I was going to say, you know, now that we're talking about blood sacrifice, we should bring up 5G, which is the technology that is going to, I mean, it's going to enable all of these systems to work. [00:42:29] I mean, isn't that the case now? [00:42:30] They're saying Bluetooth technology won't be able to support these sorts of systems that they are interested in implementing or that they're designing to implement these sort of, you know, mass social mapping systems. [00:42:44] But our old friend, our frenemy, 5G, is here to save the day. [00:42:52] Like I mentioned at the beginning of the show, we've gotten a lot of shit for our humorous takes on 5G. [00:42:58] So why don't we just start off with what is 5G? [00:43:03] 5G, as they'll tell you, the G stands for generation, not gangster, not the five gangsters. [00:43:11] But so 5G is the fifth generation of telecom, right? [00:43:16] Like right now, we're on 4G. [00:43:18] So I mean, if we kind of map out what those G's mean in a really kind of like broad sense, right? [00:43:23] Like 1G was analog. [00:43:25] It was the phone. [00:43:26] It was voice. [00:43:27] 2G was things like text messaging. [00:43:30] 3G brought us the mobile web. [00:43:32] 4G is where we're at now, where we can do things like, you know, downloading like YouTube videos or watching YouTube or like doing Zoom on our phone or whatever. [00:43:41] And 5G, that brings us to like new levels that we've never even imagined in terms of like what the technology can do.
[00:43:53] So I think the two main things that they talk about with like where 5G is upgrading us, it's about speed and latency. [00:44:04] So speed is about how much data can you be downloading and uploading. [00:44:11] And 5G, right now, like 4G tops out at about 2 gigabits per second download speeds. [00:44:20] That's the fastest. [00:44:21] They're talking about 5G being like 20 gigabits per second. [00:44:25] So a tenfold increase in speed. [00:44:29] And the way they always like the example that people always use is that you can download a full high-definition movie in four seconds with 5G. [00:44:40] That's wild. [00:44:42] Yeah, it's wild, right? [00:44:43] It's wild. [00:44:44] Who needs their movie like that quick? [00:44:45] But you try to Netflix and chill at the last minute and you need that high-def movie. [00:44:52] You're like looking at the girl, you're like, she looks like she works at Beacon's Closet. [00:44:55] I need a Fellini film. [00:44:57] So that is a very specific message there. [00:45:00] And latency. [00:45:01] Now, what do you mean by latency? [00:45:03] So latency is about the time it takes between signals. [00:45:09] So right now, 4G is like 10 millisecond latency. [00:45:14] So if you send a signal to a device or something, there's like a 10 millisecond lag basically between like you sending that signal and you getting one in return. [00:45:27] So it's basically like the towers that they're building that they can communicate with each other very quickly. [00:45:35] Very quickly. [00:45:36] So 5G, they're talking like one millisecond. [00:45:38] So again, like a tenfold decrease in latency. [00:45:45] So for our listeners, just to illustrate that, right now, this is a millisecond. [00:45:50] Oh, this is 10 milliseconds. [00:45:53] And this is one millisecond. [00:45:54] But that's a, you know, it's a layman's effort at it, but nailed it. [00:46:03] So 5G is happening everywhere.
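The "movie in four seconds" example discussed above checks out with a bit of arithmetic, assuming roughly a 10-gigabyte file for a full HD movie (the file size is an assumption; the peak rates are the commonly quoted 4G and 5G figures, in gigabits per second — note the bits-versus-bytes distinction):

```python
def download_seconds(size_gigabytes, link_gbps):
    """Time to move a file of size_gigabytes over a link rated
    in gigabits per second (1 byte = 8 bits)."""
    return size_gigabytes * 8 / link_gbps

movie_gb = 10  # assumed size of a full high-definition movie
print(download_seconds(movie_gb, 2))   # 4G peak, ~2 Gbit/s  -> 40.0 seconds
print(download_seconds(movie_gb, 20))  # 5G peak, ~20 Gbit/s -> 4.0 seconds
```

Same tenfold ratio as the latency numbers in the conversation: 10 milliseconds down to 1.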
[00:46:06] I was reading that the FCC, I mean, they basically have removed all like environmental and historic protection reviews. [00:46:17] There's like no, they don't really have like waiting periods for people for cities to kind of like stipulate whether or not they want these in their, you know, in their districts or whatever. [00:46:29] It's just, it's happening, right? [00:46:32] Why is that? [00:46:36] Yeah, so it's like rolling out. [00:46:38] You know, they say it's nationwide, but it's still very, it is still very like piecemeal and patchwork. [00:46:43] Yeah. [00:46:44] And like some carriers are kind of like, you know, dominating the market in some cities, but it's still very, it's still like the infrastructure project is still at the very beginning. [00:46:56] And so, I mean, to the question of like, why is that? [00:46:59] I mean, because it's like, I mean, at the base level, it's going to be a huge economic boom, right? [00:47:08] They're talking, you know, one statistic you keep seeing is that it's going to lead to trillions of dollars of like economic gain and like tens of thousands of new jobs, right? [00:47:25] So here, you get. [00:47:26] So some industry analysts, and you see this number across multiple articles, predict that 5G could generate up to $12.3 trillion in goods and services by 2035 and add 22 million jobs in the US alone. [00:47:42] So like that's the kind of economic story there, right? [00:47:45] Like with this new like next generation telecom infrastructure project, we're going to be producing more, more goods, more services, making more jobs. [00:47:56] But on the like at an even baser level than that, I mean, it's a huge boon for the telecom companies, for the Verizons, for the Sprints, for the Huaweis, right? [00:48:08] Like they are going to be making a lot of money by rolling out and owning this infrastructure, right? [00:48:18] Like this is 5G has been described as the critical infrastructure for global transformation.
[00:48:26] Like it's going to touch everything. [00:48:28] Nothing will be left unchanged by the 5G revolution. [00:48:34] I dislike that very much. [00:48:37] So how much of that though is like telecom industry PR? [00:48:41] And how much of it is actually kind of true, I guess, is my question. [00:48:47] Yeah. [00:48:47] And this really gets back to what we started off talking with. [00:48:51] What's actually working and what's not working? [00:48:54] And right now, it's still very early to tell because obviously a lot of the predictions and a lot of the hype around 5G is still just that. [00:49:02] It's predictions and hype because it doesn't really exist in any kind of like meaningful way yet. [00:49:09] So when they talk about what it can do, I mean, some examples of that is, you know, on the consumer tech side, it means things like downloading that Fellini film like at the drop of a hat. [00:49:22] But it also means like opening up the ability for like VR, like virtual reality and augmented reality to become like, you know, actual like kind of consumer-grade things that we can do. [00:49:35] And it's, you know, so it's opening up those kinds of capabilities. [00:49:40] It means being able to have access to extremely high speed, low latency internet anywhere where you have 5G. [00:49:50] So you can do things like, you know, like in South Korea where they do have 5G, like at a much higher rate than we do in the US or in Australia or wherever. [00:50:02] Like examples that some of the consumer tech reviewers give is that you can have like really high resolution, really smooth video chatting on your phone. [00:50:14] So you, so, you know, they talk about that as like something really cool and awesome. [00:50:19] So a lot of it can be. [00:50:20] It's great during COVID too, or COVID-20, I mean. [00:50:23] Yeah, yeah, exactly. [00:50:25] So when the next COVID comes, we will all have 5G by then.
[00:50:30] And then we like, we don't even have to leave our apartment. [00:50:33] We can all go meet up in like a virtual saloon. [00:50:36] Well, it's funny how these technologies and then powers of domination keep reifying each other. [00:50:42] Isn't that so funny? [00:50:43] What a coincidence. [00:50:45] It's all just a big coincidence, Liz. [00:50:47] That's all it is. [00:50:48] Okay, so let me just really quickly defend the like 5G conspiracy theorists. [00:50:54] Or I would say also the people really pushing back in their localities on the like imposition of 5G. [00:51:02] And the reason I want to defend that is because governments all over the world, but especially in America, have a really rich history of poisoning people and poisoning communities either with toxic waste or lead paint or chemical dumps, [00:51:24] giving generations cancer with materials that at the time, everyone was told were perfectly safe and you're insane if you push back on any of it. [00:51:35] And so the kind of shit that people get for having questions and being suspicious of impositions of new commercial technology and these new towers is like completely reasonable. [00:51:56] And I just don't understand why, why, or I do understand, and I, you know, and I really hate that people are so dismissive of these people's concerns. [00:52:09] You might even say it's a pretty privileged position to just operate. [00:52:13] There we go. [00:52:14] That's the buzzword. [00:52:16] But honestly, you're totally right because there is this long history of injustice, infrastructural, industrial, environmental, medical. [00:52:27] Like there's this nexus of injustices and a long history of it, which is the context for these conspiracies. [00:52:36] And it's the context that like gets completely washed away in any kind of like debunking or any kind of like calling this like, you know, calling these people like dumb and stupid and kind of talking down to them, right?
[00:52:50] Like, like, you, they, like, that, yeah, that context of injustice is never put in place. [00:52:57] And you're, you're totally right. [00:52:58] Like, on one hand, like, you can't blame some people and you can't blame these people for being at the very least skeptical of what 5G means. [00:53:10] And, and if I can roll this back a little bit and talk a little bit more about what the like infrastructure for 5G looks like, I think that provides some good background as to why people are worried and why, honestly, we all should be a little bit like about it. [00:53:27] So, like, the way 5G works, it's these like the really top-level five, like the powerful 5G operates at these high-frequency millimeter waves, right? [00:53:42] And on an infrastructural sense, they don't travel nearly as far as the 4G signals. [00:53:48] So, like, the 5Gs, like millimeter waves, will travel from a like cell, like cell base station around a thousand feet before they drop off, as opposed to a 4G, like 200-foot tower. [00:54:03] Um, signal from that can reach several miles, right? [00:54:07] So, um, and the 5G signal is like susceptible to a lot of obstacles, so it can be blocked by walls and trees, windows, weather, like rain, right? [00:54:18] Like, it can like degrade the signal. [00:54:21] So, the new infrastructure, and it is like they are having to build like billions or trillions of dollars of infrastructure for this kind of like full coverage, like nationwide full coverage. [00:54:33] Um, that new infrastructure has to be dense, like like really, really dense. [00:54:38] We're talking like while the base stations for 5G are small compared to like a like a cell tower, there needs to be a lot of them, like millions, tens of millions of them. [00:54:48] We're talking cell relays every like 500 feet with full coverage. [00:54:53] Wait, and like these are, are these like little boxes? 
[00:54:57] Are they, I mean, because I read that people were sort of incensed because they were getting put in some churches and they were being put places with basically, I mean, obviously, they don't, they're not going to ask your input if they put one in your fucking apartment building. [00:55:11] Absolutely not. [00:55:12] Like, no one has input on this. [00:55:15] Um, and like, like the commissioner of the FCC has called criticism of this infrastructure NIMBYism run amok. [00:55:23] Yes, and which is more, oh, yeah, don't get me stuck, which is, I'm just going to show YIMBYs, that is your, that is your man out there. [00:55:31] The people burning down cell phone towers, those are my fellow, let's go, I'll be honest, a bit of a NIMBY. [00:55:40] But yeah, so the size of these cell, like these, these relay stations, they vary. [00:55:45] So they are like depending on the in-place infrastructure, they can just be a box with like antennas and electronics and stuff that can be kind of like attached to an existing tower in places where that the infrastructure for that to be kind of like installed is not fully there. [00:56:04] It might be like a refrigerator sized box like on the ground as a kind of like base station and then have a like smaller box kind of like up high on a on a tower. [00:56:14] But you're right. [00:56:14] Like, I mean, we're talking, you know, every 500 feet, which means like strapping them to street lights on buildings and hallways, like embedding them in manhole covers. [00:56:27] Like they have to be everywhere for full coverage to even be possible. [00:56:32] But isn't that also just like, I mean, I don't even know what advocate I'm playing here. [00:56:37] I don't think it's the devils, but like, it's very possible then for those to break easier and for you to lose coverage, I would guess. [00:56:46] I don't know. [00:56:47] It seems also just like, it seems absurd to have to place these little boxes everywhere.
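The density being described — relays roughly every 500 feet for full coverage — can be put in rough numbers with a back-of-the-envelope grid estimate. The simple square-grid model is an assumption for illustration; real deployments depend on terrain, buildings, and carrier overlap:

```python
import math

def relays_needed(area_sq_miles, spacing_feet=500):
    """Rough grid estimate: one relay per spacing_feet x spacing_feet
    cell. Ignores terrain and obstructions, so it's a floor on the
    count, not a deployment plan."""
    per_mile = 5280 / spacing_feet          # relays along one mile
    per_sq_mile = math.ceil(per_mile) ** 2  # grid points per square mile
    return area_sq_miles * per_sq_mile

print(relays_needed(1))   # 121 relays just to blanket one square mile
print(relays_needed(47))  # ~San Francisco's land area: 5687 relays
```

Even this crude model lands at well over a hundred boxes per square mile, which is why the millions-of-relays figure for nationwide coverage is plausible on its face.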
[00:56:52] I mean, this must be like really vital, I guess, to them, like 5G. [00:56:58] And what sort of strikes me is that like, you know, you mentioned the statistic of 22 million jobs earlier, or that somebody had, that statistic has been repeated a lot. [00:57:08] It strikes me more as, and the way I've always viewed 5G is actually somebody that comes and takes your job because they talk about automation. [00:57:16] And I know specifically in my union, the ILWU, in the port of Los Angeles, they were really pushing a hard rollout of 5G. [00:57:25] And in fact, they had a sort of test port with trucks, basically, you know, like I don't know the actual statistic, but like a greatly reduced workforce where everything from the trucks to the things loading was all automated using 5G. [00:57:43] Exactly. === Smart Cities and Autonomous Vehicles (06:12) === [00:57:44] So we talked a little bit about like how this might be used in a consumer sense, but really like the quote-unquote killer app for 5G is going to be industrial. [00:57:54] Like it's going to be about things like automated seaports. [00:57:58] It's going to be about things like fully networked autonomous factories. [00:58:02] Like it is about, I mean, it's explicitly about hypercharging, supercharging, automation and AI, right? [00:58:13] That's what it's, that's what it's about. [00:58:15] Because if you have 5G, then not only are like, is everything able to be connected in that kind of like internet of things kind of way where it's all like collecting and transmitting data and talking to each other and stuff, it's able to do so with more speed and higher latency or lower latency, but it's also able to do things like store things in the cloud. [00:58:39] that's able to run artificial intelligence off of the cloud because the speeds are so fast.
[00:58:45] So it opens up all these possibilities for like, you know, anything, anything could have a like high-powered artificial intelligence kind of running it because you no longer have to have that local. [00:58:58] It can all be based in some like remote cloud server somewhere else, right? [00:59:03] And so, but yeah, industrial uses are the, are where like the rubber really hits the road with how 5G is going to, and, and, and, you know, another big case they use is like this is going to finally activate autonomous vehicles. [00:59:19] Yeah. [00:59:20] Yeah, that's, that's something I've read, the big Uber map that where everything, and that comes into play with the smart cities too. [00:59:26] And that's why 5G is so necessary for these so-called, I call them dumb cities because this all sounds so fucking stupid to me, but they, they, they need 5G in order for these, these cars to work. [00:59:40] And, you know, I live in San Francisco. [00:59:42] I see these cars everywhere now. [00:59:46] I rarely see them actually not being driven by somebody, but, but it strikes me as like, you know, they don't really work very well right now. [00:59:54] They, they look like they won't for maybe a while. [00:59:57] But, but 5G is huge, it's a huge leaps and bounds advantage for them. [01:00:01] And, and, and, and it's absolutely would make them work better, right? [01:00:05] Exactly. [01:00:06] It's, it's that infrastructure, right? [01:00:08] Like, like the another kind of analog that I've heard is describing 5G as like this generation's highway, right? [01:00:16] It's that massive of an infrastructure project, and it lays quite literally, like, lays the groundwork for everything else to be built on top of it. [01:00:26] But, like, the highway, I mean, those were extremely, I mean, those decimated some communities. [01:00:32] They put a highway right in the middle of San Francisco and demolished whole poor neighborhoods. 
[01:00:36] And that, I see, that analog really does make a lot of sense to me. [01:00:40] I mean, it gets back to exactly what Liz was saying. [01:00:42] That's exactly right. [01:00:44] Right. [01:00:45] Like, that, that, if we're, if we're going to call this, you know, the new highway, the information superhighway, you might even say. [01:00:56] I came up with that coined register trademark. [01:01:00] Then we have to also ask questions about the injustices of that, right? [01:01:04] Like, as you were saying, as Liz was saying, who's being demolished, you know, so that you can lay down the tarmac of this hall, of this highway, right? [01:01:14] Like, whose houses are that being laid over? [01:01:16] Whose lives are being disrupted by that? [01:01:19] And right now, we're doing a lot to discredit any kind of criticism from those people and calling them, you know, they're all just loony, right? [01:01:30] That's all they are. [01:01:31] They're just loony. [01:01:32] The thing too for me is like the thing about, this is what I always remind people about self-driving cars is that like none of the self-driving cars work because our actual physical infrastructure is so bad that the cars cannot operate on the roads that we have. [01:01:50] And you talk about kind of the physical always kind of interjecting itself into the digital, if we're still calling this digital, although I guess that's contestable. [01:01:59] But, you know, we talk about, you know, like what you're saying about, you know, what communities or where these technologies are being deployed, it's like they're not going to send, they're not going to put in the money and the investment into redoing every single road in non-capital city centers for autonomous vehicles. 
[01:02:22] And so when you talk about like the rollout of 5G and what those will then, what that'll kind of, what those networks will then be used for, in already existing, completely austere communities and rural communities and places where capital has completely fled, it will be, I mean, there won't be self-driving cars. [01:02:44] There won't be the consumer side of 5G that will be empowered. [01:02:48] It will be completely oppressive, completely, you know, the police state like that you're talking about, surveillance, and used in the, you know, industrial capacities, getting people, you know, subjugated to these new systems. [01:03:03] And in the capital city centers, they'll make the investment to make San Francisco a smart city, to make Las Vegas remake it a new capital city center Singapore of the West or whatever they're going to do with what remains of the detritus of the COVID-ill casinos or whatever. [01:03:24] But so, you know, those disparities are going to be even more radical than even just within these new smart cities themselves. [01:03:33] Like you're really looking at like, I mean, a completely binary, like completely binary state in a lot of ways that I think kind of mirrors what we see in this weird new, I mean, that we've talked about on the podcast, this weird new economy emerging of the people who can work from home and those who can't. [01:03:54] Yeah, and the people who serve those. === Crank Reactions and Disparities (15:34) === [01:03:56] Exactly, exactly. [01:03:57] Exactly. [01:03:58] Yeah, I mean, and those are the people where it's going to be the people who are consumers and the people who are not of this new digital infrastructure. [01:04:07] Yeah, I mean, you know, if they're building 5G in Detroit, who do you think that's for? [01:04:13] Right. [01:04:13] Do you think that's for the residents of Detroit so that they can download a movie in four seconds? [01:04:18] Or you think that's for the police? 
[01:04:22] Or you think that's for these platforms who are going to be owning and operating the services, like the city services? [01:04:29] I mean, exactly. [01:04:30] And then just the one Stock X office because they need their like hype for download speeds for their like sneaker stock market or whatever. [01:04:39] Well, there's also, I mean, you can look up videos of this, of these sort of model factories they've built to show off like 5G's prowess at automation. [01:04:49] And I tell you what, you do not need a lot of people working there. [01:04:53] And it's like, it's, it's, you know, you already see it like being rolled out in the restaurant industry, but imagine like that. [01:05:00] That is with today's technology. [01:05:02] Like tomorrow's technology, which they are rolling out, is going to be, I mean, just much more intense and much more widespread than that. [01:05:11] Yeah. [01:05:12] So should we get into actually mapping out what some of the conspiracies are around 5G? [01:05:19] Well, there's first, it is that it, A, that it gives you cancer, right? [01:05:25] That cell phones, like, cause cancer. [01:05:28] That's like, that's the, that's the original 5G conspiracy. [01:05:32] Classico. [01:05:34] That's Classico. [01:05:35] Yeah. [01:05:37] So it's giving you cancer. [01:05:39] And then obviously there's like the new corona wave, the new edition of the 5G conspiracies. [01:05:48] But there's multiple different things going on there as well. [01:05:52] There's not just like one conspiracy around 5G and Corona. [01:05:57] There's like a hierarchy of more popular conspiracies. [01:06:03] Well, I've always heard that cell phones, you know, like sort of like the layman's rumor or whatever is that cell phones gave you cancer. [01:06:10] But I also just like don't trust any studies that come out from anybody who says they don't. [01:06:16] I'm not saying that they do.
[01:06:17] I'm just also saying that like, I'm not exactly sold that they don't because who the fuck do you think is funding these motherfucking studies? [01:06:26] Fucking no. [01:06:27] So much of this is just follow the money, and it's going to lead you down all the right rabbit holes. [01:06:32] Because I mean, that is a major issue with research on these network infrastructures and signals, is that it's all either funded by the telecoms, or they interfere with research on this, right? [01:06:49] There is like a paltry amount of actual independent research that is not somehow funded or touched by the telecom industry. [01:07:02] And so like it might very well be sound scientific work, but the fact that it's all like industry financed, can you blame people for mistrusting it, for there being all this actual like uncertainty, when within experts, basically the most credible, [01:07:26] most skeptical expert line that people can say is there just needs to be more research, right? [01:07:33] Because if you say anything more than that, then you are like discredited as a crank. [01:07:39] You're outside the mainstream. [01:07:41] And you got to think about that. [01:07:42] Like, there is such a sort of reflexive defense of these giant telecom companies where it's like, yes, I mean, I have seen absolutely zero proof. [01:07:52] And it does not appear to be scientifically possible, perhaps, for 5G to spread coronavirus. [01:07:59] However, I am so much more on the side of the lone crank who thinks it does spread coronavirus than I am of that fucking John Legere motherfucker who's the head of, I think, Verizon. [01:08:12] It's like, it's not even a question, you know? [01:08:15] And it's like, I think a lot of these sort of conspiracies are pushed to the fore. [01:08:18] And you do see so much press about anti-5G conspiracies.
[01:08:23] And I think part of that is to make people reflexively defend the rollout of 5G and to not question whether it's a good thing. [01:08:30] Because if you think it's a bad thing, well, then you're like these, you know, wackos who are burning down the towers. [01:08:37] But why? [01:08:39] And essentially, it's getting people to do, it's putting a spell on them, essentially, to get them to defend their own eventual possible job loss. [01:08:51] Yeah, I mean, it's, this is a massive problem with the discourse around technology in general. [01:08:58] Like if you're really plugged into like the tech discourse in the way that I am, you see people out there just like stanning for these technologies and for these entrepreneurs and stuff. [01:09:13] It's like, motherfucker, do you have stock in this company or something? [01:09:17] It's like, like, no, they just have this reflexive need to be like, technology is good. [01:09:23] And if you don't like it, then you are, you are wrong. [01:09:27] I mean, one of my favorite lines that I read about this, but you see this sentiment all over the place, is from a Slate article about the 5G COVID conspiracies. [01:09:40] And it says, quote, finally, the conspiracies sow distrust in a technology that, like it or not, will be a major site of national investment in the coming years. [01:09:51] Like it or not, resistance is futile, right? [01:09:54] Like it's this total deterministic, like Borg kind of complex. [01:10:00] But 5G is just especially bad about this, but it's something that's common to like all technology, right? [01:10:07] People are just like, like it or not, here it's coming. [01:10:10] You can't stop it. [01:10:12] So your only choice is to welcome it.
[01:10:15] But that's what's so wild to me because, I mean, I understand that, but at the same time, it's like, okay, The other choice isn't to just, I mean, okay, you know, there's a political calculation you can make about resisting, you know, kind of as a strategy or whatever. [01:10:31] But like, why, like, we need to kind of like reinvigorate the sciences and technology with material, with dialectical materialism, which is, which I mean, which is what I mean to say is that all of this is contestable. [01:10:50] And the fact that there's such a like reticence or like hesitation or reflexive like reaction to demanding to have a democratic voice in the trajectory of technological advancement is like no Marxism that I understand. [01:11:12] Just like from, I mean, not to sound like a Stalinist or whatever, but I mean, I really just like don't, I don't understand that at all. [01:11:20] And there seems to be just a, not just a retreat, but like a disinterest from any kind of like taking any part or even encouraging those at the forefront of these, [01:11:37] you know, you know, development of these new technologies and, you know, the scientists to kind of like have political charged discussions about the trajectory of these, you know, new technologies. [01:11:53] No, you've exactly nailed it. [01:11:55] I mean, there's something that is so political, right? [01:12:00] Like technology is so political from end to end because it's a way that governs our lives in a way that almost nothing else does. [01:12:12] But it's been depoliticized in the kind of discussion and the analysis of technology has been completely depoliticized. [01:12:23] But you're totally right in that like when there is no democratic control over technology, when there is no participation in decisions about what technologies are made and for whose benefit and what interest and so on, when there's none of that, it's going to lead to this kind of like quiet acquiescence in the core, right? 
[01:12:48] Where people are kind of like, what can I really do about it? [01:12:51] Like, you know, I'll accept the like marginal pittance of like a more convenient thing or a faster thing because like that's all that I can really, that's all that's trickled down to me. [01:13:04] But on the fringes, that acquiescence gives way to these kind of like violent outbursts, right? [01:13:10] It's a reaction to that systemic powerlessness. [01:13:14] It's a reaction to that long history of injustice. [01:13:18] And it's a reaction that's looking for some kind of outlet. [01:13:22] Absolutely. [01:13:23] And you can't really, it's hard to blame people. [01:13:26] It's hard to blame people for it. [01:13:28] I mean, like, you talk about the 5G towers, right? [01:13:31] Like, if we extracted it from this kind of like crank discourse, right? [01:13:38] Like, oh, this is all like anti-technology, Luddite, crank, whatever. [01:13:42] If we extract it from that and just look at the action. [01:13:46] Burning a 5G tower because someone is building a massive infrastructure project in your backyard without your input, how much different is that than busting out the shop windows in a riot for, like, a Black Lives Matter movement, right? [01:14:02] Like to be like, or how much different is that than, you know, members of the IWW sabotaging capitalist production? [01:14:10] Exactly. [01:14:11] Like, like, how much different are these things? [01:14:14] And what's really different, is it about the intention and the strategy behind the tactic? [01:14:19] Is it about the discourses surrounding the action? [01:14:22] I mean, I think it's a little bit of column A, a little bit of column B, but like, we really do have to see these things as analogs, right? [01:14:30] Like burning down 5G towers, busting out the shop windows, sabotaging capitalist production. [01:14:35] They're all coming from a very similar kind of core. [01:14:39] Yeah.
[01:14:40] And ultimately, like what these narratives that we're buying into, who do they ultimately serve? [01:14:45] Right. [01:14:46] Yeah, exactly. [01:14:48] I mean, we saw, you know, every time there's a quote-unquote riot or some kind of social movement demonstration and, you know, a car gets busted up or a shop window gets busted out. [01:14:59] We see, you know, the think pieces start rolling off the, you know, the assembly line, saying that like, oh, well, we need to be civil. [01:15:08] We need to be civil. [01:15:09] We should see the reaction to those social movements in the same kind of way that we see the reaction of just completely delegitimizing and discrediting the 5G cranks without actually asking the dialectical materialist question about, like, why are they doing this? [01:15:28] Where is this coming from? [01:15:30] And what does a materialist technology look like? [01:15:34] Right. [01:15:34] Like the original materialist algorithm is from each according to their ability, to each according to their need. [01:15:40] That's an algorithm. [01:15:42] That's an if-then statement. [01:15:43] And that's all an algorithm is. [01:15:46] So, I mean, where's more of those algorithms? [01:15:49] I feel like that is the best place for us to wrap this up. [01:15:52] That's a great note to end on. [01:15:54] I want to thank you so much for coming on. [01:15:57] This has been such a fantastic conversation. [01:16:00] And I feel like people are going to be real jealous because this is probably not what they expected from us when we said we were going to do an episode about 5G, but haha, suckers. [01:16:09] No, I mean, we didn't even really get into the weeds of like the 5G conspiracy, which I wasn't sure if we were or not. [01:16:16] But I like that because I think you're right that like this was really an episode about technological politics. [01:16:23] Yes, absolutely. [01:16:24] And those, you cannot separate the two.
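[Editor's note: the "that's an algorithm, that's an if-then statement" aside above can be sketched literally. This is a toy illustration only; the function name, data, and numbers are invented for the example and are not from the episode.]

```python
# Toy sketch of "from each according to their ability, to each according
# to their need" written as a simple if-then allocation rule.
# All names and numbers here are illustrative assumptions.

def allocate(members):
    """Pool what each member can contribute, then split the pool by need."""
    pool = sum(m["ability"] for m in members)        # from each...
    total_need = sum(m["need"] for m in members)
    if total_need == 0:                              # the "if-then" part:
        share = pool / len(members)                  # no stated needs ->
        return {m["name"]: share for m in members}   # split equally
    return {m["name"]: pool * m["need"] / total_need  # ...to each
            for m in members}

shares = allocate([
    {"name": "a", "ability": 10, "need": 1},
    {"name": "b", "ability": 2, "need": 3},
])
# member "b" receives three times member "a"'s share,
# regardless of who contributed more
```

The point of the sketch is only that the rule is expressible as a conditional plus an arithmetic distribution, which is all the speakers mean by "algorithm" here.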
[01:16:26] I think that's the less said. [01:16:29] No, completely not. [01:16:30] I will say 5G may not give you coronavirus, but it's going to give you something a hell of a lot worse on the other side. [01:16:38] Yeah. [01:16:38] We didn't even get into my own conspiracy theory about it, which is that it's all a false flag. [01:16:44] Ooh, I like that. [01:16:46] Wait, wait, wait. [01:16:46] Lay it on me. [01:16:47] Lay it on me real quick. [01:16:49] So, I mean, my own theory is that like all of the conspiracies and debate around 5G are astroturfed by the telecom companies as a false flag to delegitimize and discredit any criticism of their massive monopolistic infrastructure projects. [01:17:09] Yes. [01:17:10] Oh man, I love it. [01:17:12] And even if it isn't actually astroturfed, it functionally operates the same exact way. Absolutely. [01:17:20] It totally does. [01:17:21] You're right. [01:17:21] Which is the best kind of conspiracy: ultimately intention doesn't matter, because the outcome is the same regardless of whether there's collusion or not. [01:17:31] Love it. [01:17:32] Well, that rocked. [01:17:35] That was, and I guess this still is. [01:17:37] He's still on the horn here. [01:17:39] This is Jathan Sadowski, the author of Too Smart. [01:17:43] Is this book out yet? [01:17:45] It came out last month. [01:17:46] Oh, amazing. [01:17:48] This is the author of Too Smart: How Digital Capitalism Is Extracting Data, Controlling Our Lives, and Taking Over the World. [01:17:56] Where could people buy this book? [01:17:58] Buy it anywhere but on Amazon. [01:18:01] That's right. [01:18:02] We'll put a little link. [01:18:03] We'll put a link in the show notes. [01:18:05] Yeah. [01:18:05] Sweet. [01:18:07] All right. [01:18:07] Well, I will say, I was going to say something funny technology-wise, but I don't know enough words to do that. [01:18:17] Vlogging off from the cyber zone.
[01:18:22] I cannot [01:18:56] believe we got Joe Montana on the podcast. [01:18:59] What? [01:18:59] Wait, which episode is this? [01:19:02] Sorry, I've been doing, I'm doing the next three months' endings in a row. [01:19:07] That was the football episode, right? [01:19:10] Joe Montana. [01:19:11] Yeah, we got Joe Montana. [01:19:13] Do you like that? [01:19:14] We talk about, uh, unfortunately. [01:19:18] Well, we were supposed to talk about passing. [01:19:20] Unfortunately, we just talked about group sex in the 90s. [01:19:24] Oh, god, okay. [01:19:25] Let me say, unfortunately, I'm the one who'd want to be talking about football. [01:19:28] You're the one who'd be wanting to talk about group sex in the 90s. === Love Big Faces (02:10) === [01:19:31] I've never, first of all, in the 90s, I was a child. [01:19:34] So literally, up until 2015, I was still legally a child, but that's because of court things with my parents. [01:19:43] Unfortunately, I was a full-grown adult at the time. [01:19:46] Oh, my God. [01:19:47] But I've never participated in anything like that. [01:19:50] That's not what I said. [01:19:51] I said you just want to talk about it. [01:19:53] Oh, no, in football, I'm talking about. [01:19:56] All right, all right, all right. [01:19:58] Uh, that rocked, though. [01:19:59] I love, I love, uh, I love tip-tapping on the old keyboards with our cyber, uh, cyber guests. [01:20:04] Yeah, me too. [01:20:05] We got to do more episodes on technology and finance and stuff like that. [01:20:10] I, you know, I can go on forever about all this stuff. [01:20:13] You know, I've been trying to get this one guy. [01:20:16] In fact, he was instrumental in the creation of the internet, just in general, not the internet of things. [01:20:21] He is, he's been on Chapo a couple of times. [01:20:23] So I'm like, I feel weird about stealing one of their guests, but are you familiar? [01:20:26] Have you heard of Albert Gore?
[01:20:30] This guy, I mean, it's crazy because he's not only smart, but he's a hunk. [01:20:36] Oh, God. [01:20:37] I mean, Jesus, I know. [01:20:39] He's got a big face. [01:20:40] You know, that's always what I always thought about Al Gore. [01:20:43] Huge face. [01:20:45] It's like wider. [01:20:46] It's a wide face. [01:20:48] I'm afraid to chime in again because eventually, after several years of us recording podcasts, enough of these will accrue for someone to make a supercut YouTube video of us talking shit about big faces. [01:21:00] First of all, I love a big face. [01:21:02] I want to be clear on that. [01:21:03] I love a big face. [01:21:04] Don't love Al Gore's big face. [01:21:06] But to reiterate, when I was younger, my friends and I used to call them super faces. [01:21:12] Super faces. [01:21:13] Or say, like, hey, that girl's got a real moon. [01:21:17] Sometimes you call them moons. [01:21:20] You love all women's faces. [01:21:23] Honestly, 100% down the line, yes. [01:21:27] I know. [01:21:27] All right. [01:21:28] So I'm Liz. [01:21:31] My name is Face. [01:21:35] We are joined by producer Young Chomsky. [01:21:38] And, well, that's true. [01:21:41] We'll see you next time.