TrueAnon Truth Feed - Episode 18: DARPA Marx Aired: 2019-10-15 Duration: 01:32:44 === Internet Addiction Confessions (04:11) === [00:00:00] What are we talking about today, the internet? [00:00:02] Oh, we are talking about the internet of things. [00:00:04] And by things, I mean everything I'm wearing right now. [00:00:09] We basically live on the internet. [00:00:11] Uh-huh. [00:00:11] Yeah. [00:00:12] You know how it says your screen time thing on your iPhone sometimes? [00:00:15] I've never looked at that. [00:00:16] 46 hours a day for me. [00:00:18] I've broken the mainframe. [00:00:22] What I do is I do this little test for myself. [00:00:27] I go back in time on my enemies' Facebooks, and I like statuses where they say slurs. [00:00:37] Just to let them know, I got the beam on you. [00:00:39] I'm watching. [00:00:40] That's like when people go through your Instagram and accidentally like something from two years ago or whatever. [00:00:50] Oh, I don't use Instagram. [00:00:51] I don't really either. [00:00:52] Yeah. [00:00:53] You don't really. [00:00:54] I literally don't use it. [00:00:56] Stop trying to be my friend on Instagram. [00:00:58] Yeah. [00:00:58] And also, let's give a round of applause for the first woman to break free. [00:01:02] It's true. [00:01:03] I'm on the Vanguard. [00:01:04] Yeah. [00:01:04] Feminist Vanguard. [00:01:06] I will say your rampant use of Ello is a little color tight. [00:01:13] Ello? [00:01:14] Yeah. [00:01:14] It's the thing where all your horrible, like kind of like what do you call, mass indie, like artsy friends, like semi-artsy friends would be like, huh, Facebook's done. [00:01:25] Find me on Ello and post. [00:01:26] I don't know. [00:01:27] I never looked it up. [00:01:28] I don't know what that is. [00:01:29] You know, I don't know. [00:01:30] I mean, I got Twitter. [00:01:32] I don't really do anything else. [00:01:33] Yeah. [00:01:34] I literally don't have Facebook. 
[00:01:36] Liz, you have the top third biggest account on Gab. [00:01:40] Oh, yeah. [00:01:40] Love Gab. [00:01:41] I love to Gab. [00:01:42] Yeah. [00:01:42] With all my friends on Gab. [00:01:43] I'd like to, yeah. [00:01:44] I'm on whatever social media platform Thomas Wichter is on. [00:01:48] One thing that algorithms, I think, have predicted pretty well is that every single American male wants to fuck their stepmother. [00:01:56] What? [00:01:56] Yeah. [00:01:57] Oh, are you talking about the porn stuff? [00:01:58] Yeah, that has been all. [00:01:59] I don't know about porn because I don't look at porn. [00:02:02] Liz, you have one of the highest ranking OnlyCams in human history. [00:02:07] OnlyCams. [00:02:10] It's like what girls in DSA do if they need to make rent. [00:02:15] Banana bats. [00:02:18] Just kidding. [00:02:19] Liz isn't in DSA. [00:02:21] That's true. [00:02:22] But yeah, it's stepmother, stepsister, presumably stepbrother, stepfather. [00:02:28] Although I'm fascinated by this. [00:02:30] You keep sending me links about stepfans. [00:02:32] I don't know. [00:02:33] Let's be clear here. [00:02:34] I do not keep sending her links to the actual porn. [00:02:38] She just finds me watching those when we're supposed to be recording. [00:02:41] I haven't been sending people 10 minutes. [00:02:42] No, you sent me articles about it. [00:02:44] Yeah, it's why? [00:02:46] Because they're pushing this on us. [00:02:48] I don't want no part of me, especially not my dick, wants to fuck my stepmother. [00:02:54] You should just stop looking at porn. [00:02:56] I don't look at it. [00:02:56] It's shown to me by the algorithm. [00:03:00] No, porn is bad. [00:03:01] I try to type in and uncool. [00:03:03] I try to type in pcgamer.com porn hub. [00:03:07] Uncool. [00:03:07] There is nothing cool about porn. 
[00:03:09] Sorry, women can like you know do their thing while thinking about like the lavender and like what are those things that like diffusers and stuff like that. [00:03:20] Men need something more stimulating visually. [00:03:22] Oh my god. [00:03:23] Which is why I look at nature. [00:03:25] Beautiful vistas. [00:03:28] You know, the sublime really is God's porn. [00:03:32] You to sublime? [00:03:34] The sublime. [00:03:35] Even Lou Dog. [00:03:37] Oh god. [00:03:38] Oh my god. [00:03:39] All right. [00:03:39] Let's just get this show on the road. [00:04:10] Patreon subscribers. === Welcome to TrueAnon (04:02) === [00:04:12] Greetings, earthlings. [00:04:14] Welcome to this week's TrueAnon. [00:04:18] I'm well, I'm Liz. [00:04:20] I'm Brace, joined by producer Young Chomsky. [00:04:24] And today we are discussing the internet. [00:04:29] Oh, oh, man. [00:04:32] I've been doing research for this for like the past eight years. [00:04:35] Six hours a day. [00:04:37] We had a lot of feedback. [00:04:39] We did an episode recently about when all of the news about MIT and the Media Lab was coming out about Epstein and Joi Ito. [00:04:47] And we kind of did a sort of cursory discussion about the history of the MIT Media Lab, Stanford, DOD as concomitant sort of forces in the development of what we call the internet and related Silicon Valley tech. [00:05:06] So we thought it would be good to have kind of a bigger or longer, deeper dive into some of this history. [00:05:16] More penetrating. [00:05:19] So we have Ben Tarnoff on the show today, who's the founding editor of Logic magazine. [00:05:25] Which I'm looking at right here. [00:05:27] Yeah, the current issue is called Bodies. [00:05:31] Bodies. [00:05:32] Bodies, minds, spaces. [00:05:35] Logic. [00:05:36] It's the only mention of logic that I don't get annoyed at. [00:05:39] Oh, yeah. [00:05:40] By your laundry. [00:05:41] Yeah, I hate nerds. [00:05:43] These guys are nerds too, but I know Ben. 
[00:05:46] I like him. [00:05:47] Yeah. [00:05:47] So we talk about a lot of things in this interview. [00:05:51] We're going to go kind of like meandering, wandering through some of this history and into some more kind of theoretical Marxian discussions about the nature of data and labor. [00:06:02] So hopefully you stay tuned for some of that and don't get a little annoyed with me and my musings. [00:06:08] So what you should do right now is go draw a bath, get all your smart appliances, perch them on the sill of the bath, and then whenever Liz makes a good point, just put one in. [00:06:19] Put another one in. [00:06:20] Put another one in. [00:06:21] At the end of it, you should have all your toasters, et cetera, clean and be much more informed about it. [00:06:29] I don't think you should do that. [00:06:30] Be careful, everyone. [00:06:31] Yeah. [00:06:32] Also, if you don't have smart appliances, you can just open up an electrical socket by unscrewing those top and bottom screws in there and just reach in. [00:06:41] Yeah, don't hurt yourself, but enjoy the interview. [00:06:44] Welcome to the mainframe, to [00:07:14] the microchip pit. [00:07:17] With me, Brace Belden, Liz Franczak, producer Young Chomsky, and the editor of Logic magazine, Ben Tarnoff. [00:07:26] Welcome, Ben. [00:07:26] Hey, thanks for having me, guys. [00:07:28] Thanks for coming. [00:07:31] So we're talking net today, baby. [00:07:33] The Net. [00:07:36] Starring Sandra Bullock. [00:07:37] Are you jacked in, Ben? [00:07:39] I'm fully jacked in. [00:07:40] I'm actually wearing Bluetooth headphones right now. [00:07:42] I'm so ready. [00:07:44] I'm wearing, Liz is wearing an oversized pair of Google Glass. [00:07:49] Vintage. [00:07:50] Yeah. [00:07:50] And my body is, I'm wearing one of those suits from Tron that's just hooked up to several different whirring machines. [00:07:58] Yeah, this is, we're going full VR for this episode. [00:08:01] Yes. 
[00:08:02] So Ben, you've written extensively on the history of the internet and its sort of, like, founding mythos. === Silicon Valley's Countercultural Roots (10:40) === [00:08:14] We recently talked on the podcast kind of briefly about the long relationship, long-standing relationship between the Pentagon and various US agencies and Silicon Valley and the kind of, you know, where what the thing that we call the internet kind of comes from. [00:08:38] So we really wanted to have you on and kind of just have a deeper dive into that because I think our listeners were really, their interest was really piqued when we were kind of talking very generally. [00:08:49] But you've written extensively about this history. [00:08:52] Yeah, so from the very beginning, there's an extremely close relationship between the Pentagon and Silicon Valley. [00:08:58] I mean, modern Silicon Valley kind of takes its form in the 1950s, 1960s. [00:09:04] And at that point, it's really a cluster of microchip and transistor manufacturing companies. [00:09:10] So they're actually building physical stuff there. [00:09:13] You know, they have actually factories, these facilities, and they're churning out these microchips. [00:09:19] The number one customer for these firms is the U.S. military. [00:09:22] So the U.S. military is taking these microchips, it's putting them in advanced weaponry like Polaris missiles. [00:09:30] It's also, of course, using computers. [00:09:32] U.S. military is really the first major consumer for the computing industry. [00:09:39] So there's this very, very tight symbiosis, even really dependency that Silicon Valley has with the military. [00:09:48] And that relationship changes. [00:09:50] And we can talk about the different phases of that evolution. [00:09:54] But it's important to emphasize that from the very beginning, Silicon Valley couldn't exist without the U.S. military funding it. [00:10:02] Yeah. 
[00:10:03] I mean, my understanding or the way that I, you know, I kind of look at these, that their relationship is that the kind of boundary between what we would call now private companies and public infrastructure, public agencies, you know, however you want to describe the military, is very fuzzy. [00:10:28] And like that kind of that line is very, very much blurred. [00:10:32] I think a lot of people tend to think of Silicon Valley companies as these sort of separate private entities. [00:10:38] But as you say, they like really could not exist without those massive public contracts, like from their very inception. [00:10:48] Absolutely. [00:10:48] And part of what you're describing is a very successful ideological push by contemporary Silicon Valley to conceal its origins in public funding. [00:10:58] We have this kind of myth of the individual entrepreneur who disappears into the Palo Alto garage with the Apple computer. [00:11:07] It's a very appealing story, and it's one that's kind of told to us again and again. [00:11:12] And that story does a lot of work for these firms because to the extent that they can conceal their origins in public funding, that obviously has political implications for things like how they're taxed and how they're governed. [00:11:25] But the story, I mean, it's not a particularly radical or controversial opinion to say that the story of Silicon Valley is completely a story of public funding. [00:11:36] And what's interesting, again, as I mentioned before, is that story has different phases. [00:11:40] So by the early 60s, the military is kind of moving away from being the major, primary customer of Silicon Valley. [00:11:50] It's starting to pull back a bit. [00:11:52] So that's typically seen as the moment in the 60s where Silicon Valley is starting to orient itself more towards a civilian market. 
[00:11:59] It's also the decade where computers, very, very big mainframe computers of the kind that IBM manufactured, are starting to enter corporate America. [00:12:08] So you have these big computers doing things like payroll for big corporations in the United States. [00:12:15] Even so, the U.S. military remains absolutely critical to the industry because it's funding the basic R&D that is generating the innovations that the industry then monetizes very successfully. [00:12:29] So most famously, the internet, right? [00:12:32] But also a number of other technologies like GPS, touchscreens, and so on, all of which go into really making Silicon Valley what it is today. [00:12:42] Yeah. [00:12:44] So when you're talking about what companies exactly are you talking about at this point in the 60s? [00:12:48] This is like IBM? [00:12:50] So the earliest days, the big firms that are talked about typically are Shockley Semiconductor; Fairchild is another big one. [00:13:00] You see Intel emerge in this period. [00:13:03] This is also relatedly the emergence of the VC sector in Silicon Valley. [00:13:08] I think it's generally in the kind of 60s and early 70s. [00:13:11] You have firms like Sequoia Capital being founded, Kleiner Perkins. [00:13:16] So there's this symbiotic kind of VC wing that is emerging in tandem with the growth of the computer industry. [00:13:24] Yeah, also when everyone was doing a lot of acid in California. [00:13:27] Yeah. [00:13:28] It's not like government figures are absent from these VC firms either. [00:13:32] No, yeah. [00:13:33] Famously, Henry Kissinger funded Theranos, right? [00:13:37] Theranos? [00:13:38] Yeah, he was definitely. [00:13:39] Yeah, he was on the board. [00:13:41] On the board. [00:13:42] I think Jim Mattis was too, which is so funny. [00:13:44] Yeah, it's funny. 
[00:13:45] The relationship between what you're saying with the VC firms emerging, but also these kind of like the California, like acid kind of hippie, but also there's like the, you know, the constant backdrop of the military complex is fascinating. [00:14:03] I was talking to someone who I think is going to come on the pod, Daniel Bessner, who's a professor of history out of University of Washington, who's writing a book about the RAND Corporation in the early 50s, like running around Los Angeles, basically like advising on nuke technology, doing acid, being like wild 60s, late 50s kind of, you know, hippie lifestyle. [00:14:31] I mean, it's very similar to the kind of culture that permeates Silicon Valley today. [00:14:37] Stanford was a big, was a big part of that early technological push, too, right? [00:14:43] Absolutely. [00:14:43] I mean, Stanford is really a central piece of this whole ecosystem, you know, tying together both industry and the Defense Department. [00:14:51] But Liz, to return to your earlier point, the countercultural strain here is super important to acknowledge as well. [00:14:57] There's a great historian at Stanford named Fred Turner who's written a book about this, From Counterculture to Cyberculture, that really traces the evolution of this kind of cultural formation through the history of Silicon Valley. [00:15:11] Figures like Stewart Brand and his Whole Earth Catalog are really central to how, as Silicon Valley is coming into its own and starting to emerge as a kind of powerhouse for the industry, those engineers, you know, many of those folks are participating in the Bay Area counterculture. [00:15:29] There's kind of a free exchange between what's happening in San Francisco and what's happening further down the peninsula in these labs. [00:15:36] And it's a very interesting back and forth. 
[00:15:38] And I think, as you point out, definitely informs the current kind of cultural vibe and in ways, I think, both good and bad in today's Silicon Valley. [00:15:46] Yeah, that's something I think Franklin Foer has written about in his book, World Without Mind, when he's kind of like profiling all the big tech companies and their kind of ideological, like they're more like ideological pursuits. [00:16:01] That kind of, I mean, I don't know how else to describe it as like, it's not just that it's like a libertarian kind of framework, but this very like freedom for freedom's sake. [00:16:13] Yes. [00:16:13] Like not a lot of, I would say there's not a big culture of responsibility, it feels like, or social responsibility. [00:16:21] There's that hyper-individualist sort of streak that like is that Ken Kesey sort of serves as like an emblem in like the Merry Pranksters, where it's this, it's this counterculture, right? [00:16:32] Like it's this like opposition to authority and to rules, but not like in pursuit of a greater project like socialism or something like that, but instead of like it's this like individual like path to seeking a comfortable life and fortune and all that. [00:16:48] It's really self-serving. [00:16:49] Yeah, it's very like spiritual as well, like a new spiritualism that kind of like runs through it that I can't totally put myself in. [00:16:57] It's like worship of the self. [00:16:58] Yeah. [00:16:59] It's like they're totally, they're, they're, I mean, yeah, they're total psychopaths. [00:17:03] Yeah, that definitely resonates. [00:17:04] I mean, I think it's important to note that that ideology goes through some mutation. [00:17:10] So I think in the 60s and 70s, it may retain a little bit more of a communalist or communitarian spirit. [00:17:18] And once you move into the 80s and 90s, it really kind of becomes more neoliberal, for lack of a better word. 
[00:17:25] There's a term, you know, the Californian Ideology, which was coined by a pair of British media theorists in the 90s. [00:17:32] And the way they describe it is as a kind of hippie Reaganism, kind of what you're describing. [00:17:37] There's this hyper-individualism, this narcissism, this kind of still this tinge of counterculture, but embedded in this kind of very like intense celebration of capitalist values. [00:17:49] But again, I think that it's moved a bit. [00:17:52] I mean, I'm not sure that that was entirely the case in the 60s and 70s. [00:17:56] There were certainly elements of that, but I think it really does move to the right over the course of the decade. [00:18:01] Well, that whole like generation of people did the same too. [00:18:05] I mean, look at some of like the hippie founders or whatever becoming these like, you know, sandal-wearing capitalists. [00:18:11] I mean, they're just literally like anarcho-capitalists in a lot of ways. [00:18:15] Exactly. [00:18:15] They're just Marin people, like these sort of like businessmen who live in the suburbs, except they have ponytails. [00:18:21] Like that is the difference. [00:18:23] Do the same amount of cocaine, wear sandals too, like boats, all that. [00:18:26] Well, and this is also like a lot of the like kind of hacktivist culture that was emerging in the 80s kind of made that same turn. [00:18:34] Like I'm thinking about actually Beto, wasn't he like a part of those kind of like a kind of like hacker collective that kind of shared that kind of quasi-communitarian, but also easily co-opted by the U.S. government? [00:18:52] He was, yeah. [00:18:52] He was part of the Cult of the Dead Cow, I think. [00:18:54] Yeah. === Bit History of VC Influence (04:06) === [00:18:55] Like a prominent hacking group. [00:18:56] Yeah. [00:18:57] I believe. [00:18:58] No. [00:19:00] But back to, okay, back to the little bit of the history. 
[00:19:03] So you're saying that these kind of relationships shift over the decades. [00:19:08] So how do you think the VC, emergence of VC, the VC has kind of shifted this relationship? [00:19:16] Well, it's interesting because as I mentioned, Silicon Valley is becoming less directly dependent on the U.S. military in the 60s. [00:19:24] The 60s is when the military pulls back slightly on its orders. [00:19:29] And Silicon Valley needs to find a civilian market. [00:19:34] And VC becomes very important to how it manages to do that because it's looking for alternative funding sources to scale these kind of businesses to the point where they can really sustain themselves in a civilian market. [00:19:49] But I think it's important, again, to acknowledge the division of labor that's going on between something like VC and an agency like DARPA, which is the Defense Department's R&D wing, which continues to fund the basic research that is generating the kind of major breakthroughs that the industry will go on to monetize. [00:20:11] So VC is not a very good model to generate major breakthroughs in innovation because it tends to demand large returns on a short timeframe. [00:20:22] So there's a lot you can do with VC, of course, but in terms of making the internet, for instance, the internet, the kind of radical breakthrough technology that it was, could never have been created on a VC model because it required decades of very patient funding with no return. [00:20:42] And so it's really no coincidence that that comes out of the state and not the private sector and specifically VC. [00:20:49] Well, it's like, you know, you kind of think of it like the same thing with NASA. [00:20:53] Like there's absolutely, I mean, I don't know. [00:20:56] I'm not a huge fan of like the SpaceX programs and like private space exploration. [00:21:04] And I know that a lot of that does have state funding, or there's a lot of contracts tied up in satellites, et cetera. [00:21:11] And all of that. 
[00:21:12] But the kind of, you know, the massive like state push for the NASA program, like you say, could never have happened on these sort of short-term, not necessarily even profit, but like immediate results-driven funding, which is exactly how angel investment is structured. [00:21:36] Absolutely. [00:21:36] I mean, I think a lot of this comes down to how you define innovation. [00:21:40] And this is a word that I think has been defined downward very aggressively over the decades. [00:21:46] Like innovation essentially comes to mean absolutely anything. [00:21:50] Right. [00:21:51] And right. [00:21:52] I mean, it's just like, it's a horrible word, actually, really. [00:21:56] It's just like a total nonsense word at this point. [00:21:59] But I think it matters how you define it, because if we're talking about innovation on the scale of the internet, that's one category. [00:22:06] And we know from the historical record that something like VC is not very good at generating innovations on that scale, that you actually need long-term patient funding of the kind that is not seeking a return really of any kind, much less a large return on the scale that VC requires. [00:22:23] And that kind of arrangement comes from the state. [00:22:26] So we know that that's the model that works to create that kind of innovation. [00:22:29] If we're talking about innovation in the sense of creating an iPhone, sure. [00:22:34] I mean, there's no doubt that there's a fair bit of work that goes into harvesting the publicly funded innovations like the internet, like GPS, like Siri, like the touchscreen, and packaging that into the iPhone, not to mention all the software that has to be written. [00:22:51] So it's not to say that all the workers in those firms and all the workers at the firms that contract with them in East Asia don't do any work. 
=== ARPANET's Unexpected Path (15:02) === [00:23:01] Obviously, they do, but it's kind of where is the bulk of the labor behind the breakthrough? [00:23:06] And it's generally coming from the state. [00:23:08] Yeah, absolutely. [00:23:10] So you mentioned ARPA or DARPA briefly as the R&D wing of DOD. [00:23:19] That's correct. [00:23:20] Yeah. [00:23:22] What is maybe you can just go into a little bit of history there and kind of how the internet or whatever we want to call the internet when we use that term emerges out of that project or out of that wing. [00:23:35] So ARPA, later renamed DARPA, is the Defense Department's advanced R&D wing. [00:23:42] It's first founded in response to the Sputnik launch. [00:23:46] So in 1957, the Soviets put up the first satellite and it produces a real panic in U.S. elite policymaking circles that they're falling behind and that they need to catch up. [00:24:00] That the Soviets, in terms of their science and engineering capacity, are way ahead of the United States. [00:24:05] So one of the responses to this is to found ARPA, later DARPA, which is a lavishly funded advanced R&D wing for the Defense Department. [00:24:17] And they fairly soon in their history orient themselves heavily towards computer technology and networking technology. [00:24:27] So in the late 60s, they create something called ARPANET, which is this very innovative computer network spanning the United States, where these different mainframes at various research sites affiliated with DARPA can connect to one another and communicate. [00:24:44] So a real breakthrough in the history of computer networking. [00:24:48] And then the following decade in the 70s is when they start developing the technology of the internet. [00:24:54] And the origins there are that the internet is, they're looking for ways to inter-network. [00:25:01] So the origin of internet is just from inter-networking. [00:25:04] They're looking for ways to connect more than one network. 
[00:25:08] So ARPANET is a single network, but what if we could interlink that network with other kinds of networks, say radio networks, satellite networks? [00:25:18] And the reason that they're hoping to make these connections is to bring computing power into the field, into the theater of war. [00:25:28] That's at least the justification that the scientists are using to pursue these experiments, which is what if we had a mainframe in Northern Virginia that was connected to ARPANET and we had a Jeep, let's say somewhere in Vietnam, and through inter-networking, we were able to have a smaller computer in that Jeep or whatever it is communicate with the big computer in Northern Virginia because maybe there's some program running on that mainframe in Northern Virginia that can help soldiers in the field. [00:25:58] So it has a specifically military orientation. [00:26:01] Of course, that dream is never fulfilled. [00:26:04] I mean, it is now, obviously, but just to say that even though that's the rationale for creating the first internet protocols, it's not used that way. [00:26:14] What do you mean? [00:26:17] So that idea, I mean, just to draw a distinction between the fact that sometimes you create a technology with a certain rationale of why are we creating this technology? [00:26:27] What is its purpose? [00:26:28] And when we tell the history of technology, it's important to acknowledge that purpose, but often it's used for something very different. [00:26:35] So a good example to return to ARPANET, this innovative computer network. [00:26:40] The idea here was resource sharing, which was instead of buying a lot of these very expensive mainframe computers for every different research installation across the country, what if we just have a certain number of them, but we're able to share them? [00:26:56] So you can keep your nice mainframe right in Northern Virginia. [00:26:59] I have mine in Stanford, and I can talk to yours and use programs on yours. 
[00:27:05] That's the purpose, but it's not used that way. [00:27:07] Gotcha. [00:27:08] So it was like a way to share, like, you know, if I have a program that can do this kind of math equation, and there's a computer over there that can do a different kind, they can talk to each other. [00:27:17] I don't really know what computers did back then, so I'm kind of just making that up. [00:27:20] But yeah, and now it is what? [00:27:24] It's more or less decentralized? [00:27:27] Well, so as you're saying, originally that was the idea. [00:27:32] You have a program on your computer. [00:27:34] I want to use it. [00:27:35] And this is basically a way to save the Defense Department money. [00:27:38] What ARPANET ends up being used for actually is not really resource sharing, even though that's the rationale for creating it. [00:27:44] It's email. [00:27:45] Email turns out to be the major, the so-called killer app that makes ARPANET so much fun. [00:27:51] And obviously, email spreads to the internet once they develop those interconnected networks. [00:27:55] But I think that's an important point because, again, to move to the internet, the rationale is what if we brought the power of the mainframe into the field? [00:28:04] But again, that's not really what it's used for. [00:28:06] I mean, that's the rationale to develop these protocols. [00:28:10] Once they have them, they actually kind of put them to a more banal purpose, which is making better interconnections among the networks that the Defense Department has, the fixed link networks in the United States. [00:28:24] So again, the rationale is important, but the purpose that it's created for is often different than how it's used in practice. [00:28:32] Got it. [00:28:34] And so at what point does, I know, I mean, we've talked a lot on the show about PROMIS. [00:28:40] Yes. 
[00:28:41] And about that kind of emerging, you know, obviously in the, you know, it's what, the early 80s as a kind of, you know, national security file sharing program, supposedly. [00:28:57] But this is, so at some point, that, you know, the national security complex kind of takes hold of some of this? [00:29:04] Yeah. [00:29:04] I would say, or would you contest that, which is also okay? [00:29:09] So from the beginning, the national security complex is deeply integrated and embedded within these experiments, right? [00:29:18] It's one of the first users of these networking technologies. [00:29:22] There's a big scandal in particular, I want to say the early 70s, where Army intelligence is caught conducting surveillance on domestic subversives, kind of anti-war protesters and associated radicals. [00:29:40] And instead of destroying their files, they just move the files from one site to another using ARPANET. [00:29:47] So right away, the kind of broader security state is using these technologies. [00:29:54] I think they're quite useful to them for a number of reasons. [00:29:58] I would say the one distinction I would make is there aren't that many people on these networks quite yet. [00:30:04] I mean, we're talking about ARPANET in the 70s. [00:30:07] We're talking about Defense Department personnel, researchers who are receiving DARPA funding. [00:30:14] It's not as if they can construct the kind of massive global surveillance apparatus they do today, just because it's a fairly limited population. [00:30:24] Obviously, that changes over the decades, but at first, you can't really do that scale of Snowden-style surveillance. 
[00:30:31] It seems key then, like, I mean, not saying that this was, you know, the idea from the beginning, but then the kind of, like you mentioned, the push of all of this technology into the consumer sphere then seems completely key to then at, you know, at whatever point, getting into these larger, you know, the Snowden kind of surveillance apparatus. [00:30:54] Because without the kind of consumerization, if I can use that as a word, of these types of technologies, you wouldn't have the massive database that you would need in order to build those types of networks. [00:31:10] Yeah, the more people you got using the internet or whatever, the more people you got doing whatever they're doing on the internet, sort of the more profiles you can make on them for, I guess, intelligence agencies. [00:31:21] Well, it turns out to be a perfect technology for mass surveillance. [00:31:25] Absolutely. [00:31:26] And one of the elements of this is, you know, in the original internet protocols, they're not thinking about privacy. [00:31:32] They're not thinking about security. [00:31:34] And that's because they're designing these technologies for fairly small communities of research scientists who don't expect the need for secure communications. [00:31:45] So that's something that's widely commented upon and people try to mitigate that, particularly as the internet becomes more commercial and they actually need things like encryption for e-commerce. [00:31:56] But it's really not baked in to the start. [00:31:58] I mean, there's not really much expectation that they would even need it. [00:32:03] Damn. [00:32:06] Seems like something they overlooked. [00:32:08] Yes. 
[00:32:09] So I do want to touch on how just the internet for kind of lack of a more specific word there and its sort of invasiveness in all corners of our lives and the fact that people's like houses are now being connected to the internet and like everything they do from the light bulbs they turn on to how many times they open the refrigerator to whatever songs they want to play. [00:32:34] They're willingly sort of bringing these devices into their homes and speaking to them. [00:32:39] I mean, it's like the internet of things. [00:32:41] The internet of things. [00:32:42] I mean, that to me, I mean, there's been obviously a lot of people have kind of raised their hackles when that started coming out and still do because it's a pretty insane wave of technologies. [00:32:55] But like, to me, I'm like, this seems crazy. [00:32:59] It's like a, it's like an NSA agent's wet dream is, I don't want to say, I should rephrase that. [00:33:08] It's like an NSA guy's like, I don't know, dream. [00:33:14] Bringing basically the internet and thus surveillance into every facet of your life. [00:33:19] I mean, it seems, it's like, it's appalling, but it's also like pushed on us by these companies like Amazon and other ones. [00:33:27] Well, not even just Amazon. [00:33:28] I mean, GE. [00:33:28] I mean, every appliance that's manufactured now is microchipped. [00:33:36] Yeah. [00:33:37] And has different kinds of, you know, Bluetooth. [00:33:40] I don't know why anyone would need Bluetooth technology in their refrigerator, but people do. [00:33:46] I'm old school in that way. [00:33:47] Old school Luddite. [00:33:48] Yeah. [00:33:49] Also could not afford a new refrigerator, even one that doesn't have Bluetooth on it. [00:33:54] Yeah, but that's a good segue into maybe some other stuff. 
[00:33:59] Yeah, I mean, it just seems like this, it's our lives have become like it's like these private companies are telling us for their own purposes, but also maybe for some different purposes that we need to be surveilled in order to have a more efficient home. [00:34:15] Absolutely. [00:34:16] And I think that's a phenomenon that's really emerged over the past decade, I would say. [00:34:21] I mean, I think what you're describing is probably due to a handful of factors, the growth of smartphones. [00:34:28] So the first iPhone comes out in 2007, and there's really an explosion in smartphone adoption in the decade following. [00:34:35] The rise of the cloud is an important consideration. [00:34:38] And then, you know, another really important one, maybe the most important is the growth of what is often described as artificial intelligence, but is really more specifically machine learning. [00:34:49] And there's a series of breakthroughs over the past decade that make it possible for corporations and governments to discover useful patterns in large amounts of data, which had not really been possible before. [00:35:03] So that helps stimulate this generalized hunger for data, which drives a lot of the kind of ubiquitous smartness that you're describing. [00:35:12] So everyone has an incentive now to put tiny computers all over your life because the data that those computers are collecting have the potential to be lucrative to a number of companies for a variety of reasons. [00:35:25] So that's really why this trend has accelerated in recent years. [00:35:29] Yeah, wait, I want to pinpoint that for a second. [00:35:32] So when we talk about, you said has the potential to be lucrative. 
[00:35:37] And I think that's a really interesting, or I don't know, I kind of want to highlight that because I think there's so much data that they're just hoovering up that I wonder, it seems like no one really knows like what or how this stuff could be lucrative, but there's a kind of assumption that it is. [00:36:02] And I know there's advertising and, you know, obviously, I mean, I would think that the state would find a lot of that data lucrative. [00:36:11] But it seems a bit unclear. [00:36:14] And this is kind of, you know, baked into a lot of the mythos around a lot of current tech companies and their valuations. [00:36:26] But that there's kind of an assumption baked in about the value of all of this information. [00:36:31] Or the future value of it. [00:36:33] Yeah, but it's very speculative. [00:36:35] Yeah. [00:36:36] But I mean, that's just my feeling. [00:36:38] I don't know. [00:36:39] I think it's a bit unclear. [00:36:41] No, that's a fascinating point specifically about the valuations of tech companies. [00:36:46] And they are, you know, some of them are, of course, very, very profitable. [00:36:50] I mean, a company like Google or Facebook, they do make a lot of money. [00:36:53] But as you're describing, some of that is investors looking at these companies and saying, well, they have an enormous amount of data and we expect that to be worth a certain amount. [00:37:04] And the big question is, what if it's not? [00:37:07] And, you know, one of the ways this manifests quite concretely is with something like Facebook. [00:37:11] You know, of course they have all of these users, you know, this massive 2 billion plus platform. [00:37:17] But what is a user really worth? [00:37:19] How many of these users are real users? [00:37:21] How many of the interactions on these platforms are actually human interactions? 
[00:37:26] And how many of them are bots or automated programs for misinformation or gaming or whatever? [00:37:31] So a lot of this is kind of guesswork about like some of this data is just garbage. [00:37:35] Like a lot of internet traffic in general is just kind of automated garbage. [00:37:40] So how do you actually begin to evaluate its speculative value is a super interesting question. [00:37:48] Yeah, I remember there was something recently, I want to say it was earlier this year or last year, when Facebook was getting into a bunch of trouble with, I mean, it was like the Cambridge Analytica stuff, but then other just general, you know, Facebook being garbage. === Collective Data Production (15:34) === [00:38:04] Zuckerberg was called up to testify. [00:38:06] Uh, justly. [00:38:09] But then basically, I think it was like a Facebook press release and they kind of made a big thing that was like, we're pivoting to privacy or private. [00:38:19] And that was the kind of slogan. [00:38:21] And they're kind of, it was this like, you know, obviously concerted PR campaign about how they were going to start changing the way they collected data as if it would be more private. [00:38:34] But I had a question because it seemed like what they were actually doing was that rather than like the way that they were collecting data before was just collecting like surface-level vast amounts, as much as we can get. [00:38:48] Yeah, wide net. [00:38:49] But to then pivot, as they called it, pivot to private, I think is what they called it, what it meant was that what they wanted was actually more meaningful data. [00:39:00] Because if you were under the assumption that you're having more private interactions with people, then I would assume that you would be revealing more things, which would then lend itself to like, you know, possibly more meaningful data collection. 
[00:39:17] So it didn't seem at all in any way like it was, you know, obviously, I mean, I wouldn't trust Facebook with anything. [00:39:24] I don't even have a Facebook. [00:39:26] But, you know, this idea that there could be a version of it where, like, I don't know, it just seemed like there's a way that that conversation can get manipulated for like greater data extraction. [00:39:45] Definitely. [00:39:46] And it's a fascinating question because quality of data at this point for a company like Facebook probably matters more than the quantity of data. [00:39:56] I wrote a piece with Moira Weigel last year for The Guardian that explored this idea, particularly in the context of Facebook. [00:40:04] And the comparison we drew was to the earlier period of industrial capitalism where capitalists running factories have to find ways to be more productive within a certain amount of time because workers had successfully won a shorter workday. [00:40:21] And Marx talks about this in the first volume of Capital. [00:40:23] It's kind of in his discussion of the distinction between absolute and relative surplus value, that capitalists, because of pressure from below, which take the form of regulations in the UK restricting the length of the workday, are forced to do more with less. [00:40:39] So actually restraining the workday to like a nine or a 10 hour workday is an incredible stimulant to technological innovation because they invest in machinery, they develop new types of labor discipline, and they manage to increase labor productivity in ways that they wouldn't have before. [00:40:58] And I think there's something analogous happening with a company like Facebook, where again, they have a lot of kind of garbage data because there's a lot of garbage on Facebook, a lot of low quality interactions that don't let them learn much about their users and also don't give their users much reason to stay on the platform. 
[00:41:18] So I think both for user retention and for their value proposition to advertisers, they're pushing much more into things like Messenger, into things like groups, into smaller or even one-on-one interactive modes with the expectation that those interactions will harvest higher quality data that will then let them monetize more efficiently. [00:41:41] Well, it's like Fordism for nerds. [00:41:43] Basically, yeah. [00:41:45] I mean, it's a really, you know, this is a question that I, I mean, I don't know, we can just like riff for a second, but I find it just completely fascinating, this huge bet on data. [00:41:56] Yeah. [00:41:56] And I, you know, I think it's not just data. [00:42:00] Like I think it's also attention, which is even kind of more, I know people have talked about the attention economy and I, you know, I don't really like a lot of people. [00:42:08] You're gonna make a lot of money in that then. [00:42:10] Yeah. [00:42:11] A lot of currency. [00:42:15] But I do think it's funny that these companies, the value that they're then, that they're extracting, right, is not what you would call labor, right? [00:42:26] Or it's not being extracted from labor in like a traditional way that we would think of it. [00:42:30] Like I think about a company like Netflix, for example, which, you know, when you're watching Netflix, you specifically do not think of that as labor and it's not labor. [00:42:44] And yet a company is extracting like its primary value from that act. [00:42:53] And so this kind of like this change in this relationship, I think, I don't know, it's something I think about a lot. 
[00:42:59] And, you know, I don't know, I think it's a real question whether or not this stuff, like, you know, data and attention, if it's ever going to pan out in the way they think, or if it's going to pan out in a different way, or if these companies are just going to keep floating on valuations that are just, you know, it's almost as if firms are just becoming speculative assets, right? [00:43:25] Does that make any sense? [00:43:26] I think it does. [00:43:27] And it's a very difficult question, I think, that you're raising about is data labor or not. [00:43:34] This is a very contested theoretical question by scholars about how do we think about data, and particularly within a Marxist tradition. [00:43:42] Is data labor? [00:43:43] Is it raw materials? [00:43:45] How do we reason about this? [00:43:47] And this is one where I think there is still a lot of work to be done. [00:43:52] There's a scholar, Tiziana Terranova, who has this great canonical article in 2000 where she introduces this idea of free labor and is drawing on feminism and Italian autonomism to kind of make the case that these data producing activities should be seen as a kind of labor. [00:44:11] Other folks, I think Nick Srnicek prefers to talk about it in terms of raw materials. [00:44:16] I think the difficulty that I have is that data is a very capacious category, talking about anything that can be encoded as information digitally. [00:44:26] And I think the reason that computing is so good at what it does, why Turing called computers a universal machine, is almost anything can be encoded as data. [00:44:38] So I think we probably have to be more specific about what we're talking about to then think about whether labor is the appropriate category. [00:44:48] So for instance, you could be posting on Facebook or Twitter. [00:44:52] That's certainly creating data of some kind. 
[00:44:55] Alternatively, you clicking around the internet, that's also leaving a trail of data that is useful to advertisers, but it feels like a different category of activity than you typing out like a tweet or typing out like three sentences and putting them on Facebook. [00:45:12] Another category is you walking down the street and having your face picked up by a security camera and then that image ending up in a machine learning data set that's used to train facial recognition. [00:45:24] So is that labor just by you walking down the street? [00:45:27] Again, I think this is where these questions get quite difficult and the extent to which we use unwieldy abstractions like data can sometimes obscure what's really going on, I think. [00:45:37] Yeah, definitely. [00:45:38] And I think it's like, it's worth asking too, if it's even a helpful category. [00:45:45] Like, I don't know, like, is it even helpful for in our analysis of social relations of these new emerging relations of production to think of it as labor, right? [00:45:59] I don't know if it gets us any further toward understanding kind of, you know, exactly the kind of new formations of power and exploitation that have emerged in, you know, the last decade or so. [00:46:17] I don't know if it gets, if it's helpful. [00:46:19] And that's like something that I really struggle with. [00:46:23] Because it's like, yeah, you can think, okay, if you want to think of social media as labor, okay, but then that means it should be compensated. [00:46:31] Like it just, I don't know if it gets us any closer to a kind of like helpful analytic, if that makes sense. [00:46:38] Or, like, what's the next step? [00:46:40] Like, okay, so it is labor. [00:46:41] You know, we have this theoretical debate and come out agreeing that it's labor or something. [00:46:46] What next? [00:46:47] Because I don't think there really, I don't know if there is an answer to that. 
[00:46:50] Like, do you like, yeah, do you get paid for it? [00:46:53] Do we, do we unionize? [00:46:56] Yeah, because exactly. [00:46:58] Like, it's, it's, I don't see any sort of future like after figuring that out. [00:47:05] I think figuring it out is kind of the end point there. [00:47:09] Well, I think you're absolutely right to try to extend the logic a little further and think about, well, what would this mean concretely? [00:47:16] What's the cash value, so to speak, of this analysis? [00:47:19] And there are folks, people I don't think that would be on the political left necessarily, but people like Jaron Lanier, major VR pioneer and kind of thinker, who is calling for people to be paid on an individual basis for their data. [00:47:34] So he did a video for the New York Times, I think, just a couple of weeks ago where he makes the case for this. [00:47:40] And again, I think, quite apart from the feasibility of it, it's troubling for a lot of reasons, I think, for folks on the left, because it suggests a kind of deeper marketization, deeper commodification of our social life, right? [00:47:53] So I don't think it's very satisfying. [00:47:54] I mean, I think, regardless of the tricky theoretical question of is it labor or is it not, I think at the very least, we could say this is a highly collective process so that your data is actually not worth anything without the data of a lot of other people. [00:48:11] Certainly the data that you provide in your kind of walking around the internet, or even in our example of your face ending up in the machine learning data set, it's your data with a lot of other data that makes it valuable. [00:48:25] And this is also why the idea of an individual payout isn't really conceptually coherent because your data is valuable in relation to other people's data. [00:48:35] It's those relations, in fact, that create the patterns that are then harvested for profit. 
[00:48:41] So I think, again, as Marxists, we could emphasize the collective nature of this operation. [00:48:48] And for some, that might lead us to the idea of things like a data dividend. [00:48:53] You know, I've made this case before, although I now feel more lukewarm about it. [00:48:57] I think Nick Srnicek has made this case. [00:49:00] I think Morozov has an idea for this. [00:49:02] A lot of folks have kind of made this argument that you could imagine thinking of data as a kind of social asset that is generating a dividend, not unlike the dividend they have in Alaska for the oil fund, that it is paid out to all members of society. [00:49:18] So that might be a somewhat more collective approach to this question, but of course, it then raises a lot of difficulties of its own. [00:49:24] Yeah, I think that, yeah, I don't, I don't, I'd have to do more reading on that. [00:49:30] Like, off the bat it rubs me the wrong way. [00:49:33] What you say about the commodification, it's like basically, rather than attempting to disrupt the power relations innate to this, to the social production, it's just demanding compensation, which to me is like, not a Marxist understanding of like. [00:49:59] Like that I don't. [00:50:00] I don't understand how that would get us any closer to a fairer or like more just society, but I'd have to think about that more. [00:50:12] Yeah, this is something I really like conceptually struggle with. [00:50:16] I know a lot of people are doing a lot of really good work on it. [00:50:19] But my like dark kind of black pill suspicion is I think that there, I mean, I don't know if this is something I should talk about on the podcast. [00:50:30] Maybe we'll end up. [00:50:31] I'm about to say something way more insane after that. [00:50:33] So go for it. [00:50:35] I've had two black pills today. 
[00:50:39] But I think that at least, I mean, this is kind of my kind of theory that I'll like just throw out there is that at least within the global north, obviously, when we're talking about these sort of changing relations of production, that there's been a mass shift from what we would call productive labor into unproductive labor in the Marxian sense, right? [00:51:04] So you have, at least, especially after the 08 crisis, you have a massive shift in job creation, whether it's in either precariat, like Uber work or whatever, or it's all in marketing, media, social media, et cetera, right? [00:51:27] Making Denny's tweets. [00:51:30] And that there's been, there's, you know, just a, I mean, it's, you know, I would say in the nascent stages, but, you know, it's not like a complete mass movement. [00:51:38] But I think when you, it was something like 80% of new jobs after the great, after the great recession, as they call it, after the crash, were in precariat work, right? [00:51:50] However we want to call that. [00:51:52] Wild. [00:51:52] Yeah. [00:51:52] And also, you know, a lot of the jobs numbers, it's difficult to get like into the weeds about what this is actually, what we're actually talking about. [00:52:00] But, you know, my, my like weird, like probably what people would say would be not Marxist, but I think it's Marxist black pill theory is that there's like a kind of long-standing, I mean, who knows if he actually said this, but apparently Engels at the Second International said something like, the labor theory of value is forever. [00:52:24] And what my work would presuppose is what if it's not? [00:52:31] Yeah, and that there's a kind of a shift. 
[00:52:34] And I think there's something, there's like a connection here between the ways we think about, you know, like you say, data, the way we think about hyper-financialization and asset production, these sort of like kind of faux nebulous financial instruments that are being produced by this sort of like new hyper-financialized, supranational, globalized economy. [00:53:05] That's my black pill theory. [00:53:06] I don't know. [00:53:07] Yeah, I think it's, I think, one angle maybe on what you're describing is that the information technology revolution very broadly has not produced a major rise in labor productivity, which is often described as the productivity paradox, because according to mainstream thinking, that technology, as it advances, increases productivity. [00:53:35] And that that really hasn't been the case. === Tech's Alienating Effect (04:15) === [00:53:38] There's a brief surge in the late 90s, but it mostly has not been the case when it comes to what we would broadly think of as tech. [00:53:46] And again, it's a contested question as to why; there's different theories of why that's the case. [00:53:52] There's a wonderful new article in the latest New Left Review by Aaron Benanav that goes into this, I think, in great detail and makes a very persuasive case that one of the reasons that we're not seeing the kinds of gains that we might expect from information technology is because we've been dealing with a major problem of manufacturing overcapacity for decades that's produced deep stagnation and in turn a low demand for labor, [00:54:21] which is reflected in stagnant wages and falling rates of labor force participation. [00:54:28] So we might actually look for the answers to these questions outside of the technology. [00:54:33] In fact, sometimes the technology itself can be a bit of a distraction from the origins of these problems. 
[00:54:41] But I think in general, what you're describing is a phenomenon that's felt by a lot of people in the United States, which is decades of disinvestment, stagnation, declining living standards. [00:54:53] And tech, as much as it can be celebrated for its disruptive kind of wealth creation potential, is quite clearly not filling that gap. [00:55:03] In fact, to many people's eyes, it seems to be intensifying those same inequalities. [00:55:09] Yeah, I would say not just inequalities, but alienation as well. [00:55:13] Oh, yeah. [00:55:14] That like a lot of these emerging, or not even emerging, but you know, a lot of the kind of like tech that we just use in our daily lives is like increasingly alienating individuals from each other. [00:55:29] I mean, absolutely, there's that study that came out that I keep harping on where it shows like nobody has any friends anymore. [00:55:36] And like people like have very little social contact, like a certain segment of the population. [00:55:42] I think predominantly, I think it's a lot of males have the Joker, the Joker cast. [00:55:48] Yeah, we call this, we call this, we call this the cyber joker syndrome. [00:55:53] The lumpenjokertariat. [00:55:55] Yeah, but there are, there are literally like hundreds of thousands, if not millions, of lumpen jokers out there in the U.S. [00:56:03] And like, I know. [00:56:04] I'm a friend to the Lumpen Joker. [00:56:06] You know that. [00:56:07] Oh, you are, sweetheart. [00:56:09] They have no friends. [00:56:12] I see you. [00:56:13] I hear you. [00:56:13] You're valid. [00:56:15] They're not listening to this pod. [00:56:16] Actually, they are listening to podcasts because they think the hosts of podcasts are their friends. [00:56:20] We are your friends to those listening. [00:56:22] But they think the hosts of other podcasts are their friends. [00:56:25] And I think that's true for a lot of people. 
[00:56:26] I think they substitute social relations with, or they're forced to substitute like actual in-person social relations with these relations they have with people on the internet. [00:56:36] It's sort of this like false, false front for that stuff. [00:56:40] And I think it drives people crazy. [00:56:42] I mean, it's certainly got this weird, I don't know, it's definitely had an effect on people that I know personally, just anecdotally, like has, I think, increased loneliness in their lives. [00:56:54] And it's interesting when I went to a country where there is not much internet, which was Cuba, people seemed a lot more likely to hang out with one another than they are here because you're not stuck, you know, you don't have this, you know, anything at your beck and call, you know, information-wise or friend-wise or entertainment-wise. [00:57:17] You have to actually go seek it out. [00:57:19] And I think the fact that we're sort of on the vanguard, America is sort of on the vanguard of these like new technological innovations or these social media, whatever, that I think it's that that has a lot to do with why everyone's just a fucking psycho now. [00:57:36] And my question is, can we stop that? [00:57:40] Like, is there a way to like, if, if, you know, just conjecture, just throwing out some future possibilities here, if there was a change in government, could we stop the internet? === Amazon's Military Entanglement (15:26) === [00:57:54] Brace is getting super heated. [00:57:56] I don't like the internet. [00:57:57] But like, if, like, I'm serious, like, for example, if there was, you know, if America collapsed, whatever, something took its place, is this like an, this is an extra-national project. [00:58:07] Like, Amazon doesn't need the U.S. government to keep doing what Amazon does. [00:58:11] Well, I guess it does in terms of delivery, but not in terms of like cloud service, all the stuff that it does online. 
[00:58:18] And that freaks me out that it's an extranational project and that it's like a globalized thing because we can't really stop it then. [00:58:26] That might bring us back nicely to the question of the military's relationship to the tech industry. [00:58:32] Because you mentioned Amazon, and it's interesting. [00:58:34] I mean, I think Amazon is obviously a multinational company. [00:58:38] It has interests outside of the United States. [00:58:41] But in important ways, one piece of its business, its cloud business, is increasingly reliant on the U.S. government and on the military in particular, but also law enforcement agencies as consumers of its services. [00:58:54] So you could see this as maybe the next and current iteration of this long relationship between Silicon Valley and the military, which is now really centered on things like the cloud and also AI machine learning, [00:59:11] what I was describing before, because the Pentagon in particular is looking at these companies like Google and Amazon and Microsoft and seeing the kind of technologies that they've been building around cloud and AI and saying we want that for our purposes. [00:59:30] Because if you think about the U.S. military, they certainly produce an immense amount of data. [00:59:37] If you think about the global footprint of the U.S. military, I think they have 800 military bases around the world. [00:59:44] I'm sure there are numbers about how many aircraft carriers and helicopters and so on. [00:59:49] But there's an absolutely massive footprint. [00:59:52] And I'm sure they generate a lot of data currently. [00:59:54] And they could generate a lot more once all of these vehicles are wired up with Internet of Things devices. 
[01:00:01] And they figure that if Google and Facebook can use all of that data and all of these machine learning algorithms to sell advertising as effectively as they do, why can't they use the same technologies to increase their operational effectiveness around the world? [01:00:17] Yeah. [01:00:18] And when, I mean, another more kind of likely future is that these companies like Amazon and the U.S. government start becoming more enmeshed, right? [01:00:28] And more reliant on, well, sort of a one-sided reliance on like a company like Amazon Services, which, yeah, if you have, I mean, yeah, you have these smart military stuff too. [01:00:39] But like something that has occurred to me, you know, there's this PG&E outage here in California, which I'm sure you've heard about, where they're shutting off all the power to these people's homes. [01:00:50] They say it's because of wildfire stuff, but really I think it's in retaliation for the $11 billion fine they got. [01:00:56] That's basically a capital strike. [01:00:58] Yeah, but like these same companies, if the government started sort of intruding too much into their whatever realm, they could potentially do the same with like your refrigerator or your website too. [01:01:11] I mean, look what they did with that book company, Hachette. [01:01:14] I don't know how to pronounce it. [01:01:15] Hachette, where Hachette declined Amazon's offer. [01:01:22] And then they basically put their books way down on the algorithm and started like not shipping them, shipping them late, doing all these things on purpose. [01:01:31] I mean, and if you think about like they could do that to a small publishing company, when Amazon sort of grows and grows and grows, and it doesn't look like, I mean, to this layman's perspective, it doesn't look like it's going to stop anytime soon. [01:01:42] They could potentially do something like that to a government too, right? 
[01:01:47] Yeah, I mean, one case where you could see something like what you're describing developing is with the Ring doorbell, these kind of connected IoT doorbells with the video cameras. [01:01:58] Amazon has been selling these very, very aggressively and has been partnering with law enforcement and giving law enforcement access to the data that these devices are acquiring and kind of forming neighborhood watch associations that have closer relationships with police. [01:02:16] So there is a kind of deeper and deeper entanglement of a company like Amazon with these local security agencies in ways that I think will make a lot of the Snowden stuff look kind of quaint. [01:02:29] I mean, I think that's my fear is in a decade, you know, we'll be teaching the Snowden disclosures to a bunch of high school or college kids and they'll kind of laugh about it. [01:02:38] They're like, this is such small change. [01:02:41] I mean, the notion of the government tracking you around Google, who cares compared to what they will be able to do soon. [01:02:47] Yeah, I mean, although I do think it's cute to think that we would ever teach anyone about the Snowden disclosures. [01:02:52] But it's like, it's like, it's like comparing like the Pentagon Papers to the lead up to the Iraq war or something. [01:02:57] Like, oh, there was a fake instance of them starting a war. [01:03:00] Like, well, they hid it. [01:03:01] Like, you know, in Iraq, they didn't even hide it. [01:03:03] You know, like everybody. [01:03:04] No, they published it straight in the failing New York Times. [01:03:07] Exactly. [01:03:07] So, like, I definitely agree that there's a lot of parallels there. [01:03:13] Well, it's a really, I mean, again, to get a little like black pill theory here, because it's fun to do that. 
[01:03:20] Um, it's a really interesting question then about what is, like, the changing nature of the state, which I think, you know, it's like, you know, the United States as its kind of like, you know, regulatory apparatus has been completely demolished. [01:03:40] And as its continued austerity has hollowed out pretty much any and all like social provisions. [01:03:49] And you have that kind of concomitant with the rise of these corporations that are, you know, I mean, we talked, we had Tim Faust on and we were talking about Amazon's, you know, joint venture with Berkshire Hathaway on, you know, it's very unclear on its kind of, you know, getting into the healthcare industry. [01:04:13] And it's not, and, you know, I know that Google is interested in this stuff as well. [01:04:18] And kind of like, you know, not just telemedicine, but like small local clinics, boutique things. [01:04:27] And so it's very easy to see that kind of as the state, you know, the thing that we call the state is hollowed out to its basic military, surveillance, and law enforcement, you know, structures, that these massive monopolies become these kind of social feudal, like neo-feudal arrangements. [01:04:53] Well, where we differ from the black pill people is that we don't think that's good. [01:04:58] No, no, obviously I don't think that's good. [01:05:00] But I'm saying it's very easy to see, and especially as you see, I mean, you know, you see these, again, these kind of like supranational state governance, you know, governance structures like the European Union kind of allow, like furthering this kind of withering away of the state as a kind of public entity to, you know, and, you know, [01:05:27] while allowing massive corporations to kind of rise in its place. [01:05:33] You know, I think Amazon, Bezos scares the shit out of me. 
[01:05:37] Like, I've said that before on the podcast, but like in many ways, you know, not to be like to, you know, I think I have a theory that a lot of the kind of professionalization and bureaucratization of the US military has led to there actually being fewer ideologues than there were in, say, like the 50s and 60s. [01:06:01] But where you have, I mean, Bezos, I think, is nothing if not an ideologue. [01:06:05] Yeah, absolutely. [01:06:06] Or you think of someone like Peter Thiel, who certainly has his own Moldbuggian theories. [01:06:17] And, you know, it's a quite dangerous situation that I don't think a lot of people on the left really grapple with. [01:06:28] At least not, you know, as much as I'd like. [01:06:31] I don't know what your feelings on that are. [01:06:33] Well, one way to pose the question that you're asking, Liz, is what is a state for? [01:06:38] Is it for killing people? [01:06:39] Is it for locking them up? [01:06:40] Or is it for trying to give them the kind of basic resources that they would need to lead a dignified life? [01:06:47] In the 60s, they would talk about the welfare warfare state. [01:06:51] And I often think today we have one half of that. [01:06:53] We still have the warfare state. [01:06:54] It's bigger than ever before. [01:06:56] We don't have the welfare state anymore, increasingly. [01:07:00] And to return to the question of Silicon Valley and the tech industry more generally, you could think of tech as playing a kind of facilitating role in some of the transformations that you're describing, which is on the one hand, how do governments and in particular kind of right-wing politicians find ways to further erode those social provisioning capacities of the state? [01:07:25] And that can be often done through partnering with tech firms. [01:07:30] There's a great book by Virginia Eubanks called Automating Inequality, where she looks at certain cases.
[01:07:37] There's one, I think, in Indiana in particular, in her first chapter, where a Republican governor is using digitization essentially as a Trojan horse for eliminating the Medicaid program in his state. [01:07:54] And you see this a lot. [01:07:55] I mean, this has happened in other places as well. [01:07:57] There's a famous case in Australia of kind of algorithmic austerity, where under the cover of making things more efficient, you basically fire all the caseworkers, you shrink these centers, and have software that doesn't, frankly, work very well, but it's not supposed to. [01:08:13] So it actually works. [01:08:14] It performs its function, which is being so impossible that nobody can use it and rejecting all of these deserving applicants for welfare. [01:08:25] So it's working by not working, essentially. [01:08:27] So that could be one way we would see how tech is being enlisted in this kind of erosion of social provision. [01:08:36] And then, of course, tech is also enlisted in the hardening and the amplification of the punitive aspects of the state. [01:08:43] So the carceral state, you know, we think about the proliferation of things like algorithmic sentencing and incarceration through electronic monitoring. [01:08:53] And then, of course, also the military and the police, facial recognition, new forms of algorithmic warfare that the Pentagon is developing, these kind of new forms of smart weaponry. [01:09:05] So yeah, I think it's a bleak picture. [01:09:06] I mean, I think my small point of optimism at the end of that is that there's nothing fated about this arrangement. [01:09:15] I mean, these technologies, as we've been discussing, were largely developed with U.S. military funding, but many of them do have emancipatory potentials. [01:09:25] Maybe not all of them. [01:09:26] I mean, I think there's a case to be made that some of them shouldn't exist. [01:09:29] But I think the purposes that they're being put to today are not inevitable.
[01:09:34] We could imagine new ones. [01:09:36] Oh my gosh, what's algorithmic weaponry? [01:09:40] So I think we discussed this a little bit early in our conversation, but Amazon is pursuing this. [01:09:46] A number of these cloud providers are pursuing this huge contract with the U.S. military called JEDI. [01:09:53] Oh, yes. [01:09:55] And you may have read about this. [01:09:56] JEDI, which I think stands for like Joint Enterprise Defense Initiative, something like that, is part of this broader years-long initiative that the U.S. military, I think, really started to pursue in the late Obama years, but has accelerated under Trump, which is to get more serious about embracing artificial intelligence. [01:10:16] There's now a joint artificial intelligence center at the Pentagon, which oversees the various AI initiatives that are being undertaken across the department. [01:10:25] And the idea with JEDI is to have a single integrated cloud platform for all of the U.S. military. [01:10:31] So this is a really massive undertaking. [01:10:33] You think about the size of the U.S. military. [01:10:36] And JEDI will essentially be a clearinghouse for all of this data that is being collected and generated by the U.S. military and their operations worldwide. [01:10:46] And will also be where they can use machine learning to find patterns in that data that might be useful for killing people, essentially. [01:10:57] Jesus Christ. [01:10:58] So it's a somewhat terrifying picture. [01:11:01] I mean, the way that one Google engineer described it to me is that it's basically Skynet. [01:11:08] I mean, that's kind of the vision. [01:11:09] It's not totally precise, but I think it kind of gets us most of the way there. [01:11:15] That's where this is leading, right? [01:11:18] Yeah. [01:11:19] I mean, if this is where we're at in 2019, God help us 20 years down the line. 
[01:11:25] Yeah, I like that is a, you know, that dovetails so easily then when you start to think about social credit systems and social credit kind of theories, at least, or I don't even know how you want to describe it, for maybe the way that some of these social networking platforms are moving. [01:11:47] I know there was like the VC firm or VC guy, Silicon Valley guy, who's a big supporter of Greta Thunberg. [01:11:57] Who is he? [01:11:58] I can't remember his name. [01:12:00] Some scumbag. [01:12:00] But he wants to basically have a worldwide, I mean, social credit system for companies and kind of like green, they're like green practices. [01:12:16] But you can see very quickly how, like you say, that can be sort of perhaps the animating idea where that would, you know, lend itself to more kind of nefarious social, you know, surveillance applications. [01:12:36] I mean, there's Ghislaine Maxwell's sisters, who came up with Chiliad, you know, it was this company that creates this sort of central platform for information sharing and profile creating for law enforcement. [01:12:50] And they're like, well, our dream is to bring this to the private sector. [01:12:53] And I think the vision for a lot of these people, whether they know it or not, is a national, or excuse me, a global network that the private sector and law enforcement are both perfectly integrated in, whether they're integrated together or they're looking at the same information through two separate lenses. [01:13:13] I think, you know, these things tend towards centralization. [01:13:17] And I can imagine that eventually they'll just be using the same lens. === Systemic Social Control (05:14) === [01:13:20] And it'll be the companies and the government look at you through the same fucking program. [01:13:25] Sounds great to me. [01:13:27] That sounds cool. [01:13:28] They'll know if I need more toilet paper easier. [01:13:30] No.
[01:13:31] Yeah, I mean, I don't know, Ben, do you have any insight on social credit? [01:13:35] So we ran a piece on the social credit system in Logic's China issue, which came out earlier this year. [01:13:42] Nihal. [01:13:43] And it's by this wonderful scholar named Shazeda Ahmed. [01:13:49] And it's an interesting piece. [01:13:51] I might struggle to summarize it accurately, but what she's laying out is trying to parse some of the myth-making from some of the reality on the ground. [01:14:04] And I think she does it very, very carefully because she's not trying to convey the impression that this is not a dystopian system or that this could not be used for social control. [01:14:15] But I think that the thrust of her argument is that at least in its present iteration, it's more analogous to things like debtor blacklists and kind of various similar forms of blacklisting that happen in the United States than it is to something like a kind of Orwellian system of social control. [01:14:37] I think, again, I would encourage folks to check out that piece. [01:14:40] It's available on our website at logicmag.io because I'm not sure I could do justice to all of the nuance. [01:14:46] My casual observation without being an expert is that sometimes folks conflate the social credit system with what's happening with the Uyghurs in Xinjiang. [01:14:56] Yeah, they do. [01:14:57] And the latter is, I think, a very clear case of a kind of technologically enabled totalitarianism. [01:15:06] And there's another piece, if I can continue to keep plugging Logic, in that same issue that's available online by Darren Byler on what's happening with the Uyghurs and particularly looking at the technological aspects of that system. [01:15:21] So again, I think if we're thinking about kind of techno-totalitarianism, I think Xinjiang would be a better look for what we're, I mean, not better in any good sense, but certainly something a bit more terrifying.
[01:15:35] So, with the social credit stuff that was reported on pretty heavily in the West here. [01:15:39] Yeah, there was that one kind of hyperbolic piece in Wired. [01:15:43] Yeah. [01:15:43] Right. [01:15:43] I remember. [01:15:44] They made it sound like you could be arrested for like posting cringe. [01:15:48] You would like to do it. [01:15:48] You know what, though? [01:15:49] If you do post cringe, I will arrest you. [01:15:52] Oh, yeah, absolutely. [01:15:53] But like, they were basically putting forth that, like, you know, if you ever say the word Winnie the Pooh, you will be denied train travel by this system. [01:16:02] It's like, that seems a little, it's, I, from what I gather, it's not one big centralized system either. [01:16:08] It's often localized. [01:16:10] So I think that's an important point, Brace, which is that it's often reported on as a kind of single monolithic system. [01:16:17] It's better understood as an umbrella term, which involves a lot of different initiatives, some of which are local initiatives. [01:16:25] And many of them are quite scary. [01:16:27] I mean, Shazeda describes these and they will, I think, creep you out. [01:16:32] I mean, it's not a good system by any means, but I think there's a level of coherence and kind of top-down integration that doesn't quite exist, but is often presented in the U.S. media. [01:16:44] I mean, I think for me, what the more like it's more of a, you know, I think it's a useful specter, maybe, even though I think that some people can be hyperbolic about it. [01:16:57] I'm just thinking about that Wired piece. [01:16:59] I don't know. [01:16:59] Maybe I'm being rude to that piece. [01:17:01] But I hate Wired. [01:17:03] I don't go far. [01:17:05] They have their own relationship with the U.S. military. [01:17:08] Yes, they do. [01:17:09] A fawning one.
[01:17:12] But I do think that it is, I tend to think of it more as a kind of emerging ideology and the way you see its kind of logic deployed, not just within kind of technological systems, [01:17:26] but more like kind of, I don't know, as a kind of social phenomenon, which, you know, that's where I start to get like worried because it's very easy to see kind of some of the ways in which like, [01:17:45] you know, phenomena on social media and et cetera can, you know, then legitimate, ideologically legitimate larger, like perhaps if one were so inclined, kind of like more sophisticated systems like a social credit system. [01:18:04] I mean, just to the sort of basic point there about social media sort of having this influence in politics and then being able to create these, I mean, the whole bot craze that came out. [01:18:15] You know, now everyone accuses each other of being bots. [01:18:17] But that is like, you know, there are a lot of bots out there. [01:18:21] And it's, it's the US government has talked about experimenting. [01:18:26] I mean, the Hillary Clinton State Department, as we talked about in another interview, was very focused on social change through social media. === Social Media's Influence (05:48) === [01:18:34] Of course, not in America, in other countries. [01:18:37] And it's going to be interesting. [01:18:38] About the Arab Spring. [01:18:40] Yes, Mushbrava, or parts of Ukraine. [01:18:44] Oh, yeah, possibly some places in the Caucasus. [01:18:47] And I think that's why, like, it's also going to lead to a lot of new kinds of paranoia, I think, that would astound our counterparts from back in the 1960s. [01:18:58] You know, we need like a 21st-century Pynchon. [01:19:01] Exactly. [01:19:01] You can't tell if anything's real, right? [01:19:05] Like, you can't tell.
[01:19:06] You know, you'll see, you could be convinced if you're sort of a more or less detailed researcher, you could be convinced all sorts of things are happening in all sorts of parts of the world just from viewing recycled videos on Facebook. [01:19:19] And like, oh, they can tell you, oh, this is Morocco, or, oh, this is Syria, or this is, and it can be anywhere. [01:19:24] And like, it's, it's, I don't know, it gives me the fucking willies because it gives the state a vector of control and, like, an in on any social movement, because they can shift the dialogue around it online, I think, fairly easily. [01:19:45] And it leaves the lone proletarian like myself wondering like, what, what am I reading is real and like, what am I being, it's, it's fucked up. [01:19:54] So yeah, these new kinds of paranoia. [01:19:56] It's going to drive me insane. [01:19:59] Which is why we have a podcast about Jeffrey Epstein and all his compatriots. [01:20:04] So Ben, what should we do? [01:20:06] What do we do next? [01:20:07] Oh man, this is the difficult final question. [01:20:10] Exactly. [01:20:10] What is to be done? [01:20:12] Is there a fiber optic cable, just one, that an enterprising man can snip with a large pair of garden shears or something of that nature? [01:20:21] I think there was a video that I saw this morning actually of sharks gnawing away at the Google fiber optic cable. [01:20:27] Oh my God, I love that. [01:20:28] That's a good question for you, bro. [01:20:30] We need to weaponize sharks. [01:20:31] Yeah, we need to strap explosives to dolphins. [01:20:36] Yeah, I mean, you know. [01:20:38] But like, is there, I mean, is there, obviously, things do not look so great right now because we're continuing to head in the wrong direction. [01:20:45] But like, what, what do you think the future, like, how do you think we can affect this future?
[01:20:52] If we can at all, if we can't, let me know because, you know, I'm biting my nails over here. [01:20:59] It's a good question. [01:21:00] It's a difficult one, particularly because I think we covered so much ground in our conversation. [01:21:04] Yeah. [01:21:04] I think my approach to this type of thinking is to try to look at what are the parts of society that are in motion? [01:21:13] Where are the social movements? [01:21:14] Where are people organizing? [01:21:16] And what kind of logic can we extract from that? [01:21:20] What kind of world might we build around that? [01:21:23] So I really like to think of social movements as good materials to think with and to kind of draw inspiration from. [01:21:30] I think it also helps us find a less technocratic mode so that we're not kind of proposing the solution, proposing the model legislation, the model reform, but really trying to think through how are people on the ground responding, building alternatives, and learn from them. [01:21:47] So towards that end, we have seen a lot of successful efforts by local organizers to push back on some of the surveillance technologies that we've been discussing. [01:21:58] There's a group in Los Angeles called the Stop LAPD Spying Coalition that has successfully gotten the LAPD to shut down a couple of its so-called predictive policing, essentially algorithmic policing programs. [01:22:12] There are also campaigns to ban facial recognition by public agencies. [01:22:17] I know in San Francisco that was successful. [01:22:19] Here in Somerville, Massachusetts, and in a couple other places that's been successful. [01:22:24] I believe there's something in the California legislature now along those lines as well. [01:22:28] So again, I think none of these individually really speak to the scale of the problems that we've been discussing, but I think they give us interesting points of departure for ways to think about cohering movements around broadly democratizing technology.
[01:22:45] Because I think that's ultimately what we're talking about is the horizon beyond these individual moments of resistance. [01:22:51] How do we make it so that all of society, however broadly we define that, is involved in decisions about how technologies are built and implemented? [01:23:02] Because right now, of course, you have a fairly small number of people, primarily executives at these big firms who are, of course, constrained by the profit motive and the kind of competitive limits of the capitalist market, and then the kind of bureaucracy of the state security agencies. [01:23:19] These are basically the people who are making the decisions about what kind of technological world we live in. [01:23:25] So to the extent that we could build spaces and kind of conduct struggles that allow more and more people to get to participate in those decisions, I think that's constructive. [01:23:35] In terms of what that looks like concretely, I mean, again, I think we have to point to what is the work that organizers are doing now. [01:23:42] Of course, there's a lot of mobilization within the tech industry itself. [01:23:46] Tech workers have been agitating for the cancellation of certain contracts with the military, with law enforcement, with ICE. [01:23:53] I think that's another important source of inspiration for this. [01:23:56] Yeah. [01:23:57] Also, like, you know, an accidental death or two would probably go really far. [01:24:04] I mean, all these motherfuckers overdose already all the time. [01:24:08] It's true. [01:24:09] Their heart explodes at their sex parties, et cetera. [01:24:13] Their poly parties. [01:24:14] They're poly, exactly. [01:24:15] Every single one of these people is in an open relationship with all of the rest. [01:24:19] They can suffocate in the poly ball pit. [01:24:22] Yeah. === Bernie's Organizing Potential (04:25) === [01:24:23] Do you want my last question? [01:24:24] I don't know if Liz has any more.
[01:24:25] My last question. [01:24:26] If Bernie Sanders survives the several impending assassination attempts on him and becomes president, could he arrest Jeffrey Bezos? [01:24:36] Could he arrest Jeffrey Bezos? [01:24:37] I think he could make life for Bezos and for that whole class pretty uncomfortable. [01:24:42] I mean, I think this is a little bit beyond my sphere of expertise, but I think a lot of the conversation I've seen about what Bernie could or couldn't do, I think, rests in part on the rhetorical capacities of the presidency, I guess, for lack of a better phrase, like the bully pulpit. [01:25:00] What could he do, even if he faced an obstructionist Congress, a hostile executive bureaucracy, because I think leftists always do, what could he do just at the level of rhetoric and messaging, which I don't think would be the only thing he could do. [01:25:15] But that alone, I think, could have a really powerful impact. [01:25:19] I mean, we're at a moment, I think, in the organized left where we need such a massive kind of recomposition, reconstitution. [01:25:27] And a lot of it is just putting ideas into circulation that have not been in circulation for a while. [01:25:33] And of course, Bernie has already done a great service in terms of how many of those ideas he's already put into circulation, ideas like socialism, ideas around class struggle. [01:25:43] One could imagine that he could do a lot more if he were in the White House along those lines. [01:25:47] Yeah, I mean, I think that's like the, I mean, at least when I think about the prospect of a Sanders presidency, that is, you know, the biggest goal in a lot of ways. [01:26:04] Because I think, you know, a lot of people are, you know, especially, let's just call them Warren supporters, will say things like, well, he won't be able to get anything done in Congress and how's he going to pass Medicare for All?
[01:26:17] As if that is, you know, I mean, not to sound whatever, you know, I think he could get possibly very far. [01:26:24] But the idea that if you just get one person in the White House and they want to do, they want to get free health care, I mean, people have been agitating for this for over a century. [01:26:34] So, I mean, I think people, like you say, you know, pushing back on a lot of the kind of technocratic arguments for what, like, the executive's role is, is like very important, you know, because it's not, there's already going to be hostility from every side, I think, of the state, ideologically and, [01:27:04] you know, just in terms of the various arms of the state, that focusing on those kind of like, kind of like technocratic concerns is like the exact opposite. [01:27:19] Like it's like totally missing the point. [01:27:22] Yeah. [01:27:23] Because like you say, there needs to be a massive, you know, recomposition, reawakening in a lot of ways. [01:27:30] I mean, what we used to call consciousness, class consciousness. [01:27:34] Gamers rise up. [01:27:35] Yeah, jokers rise up. [01:27:38] And that, you know, leaders are incredibly valuable for that very reason. [01:27:45] Totally. [01:27:45] I mean, you see it with Trump, right? [01:27:47] I mean, obviously we've had a far right in this country since the beginning. [01:27:50] But you think the way that Trump has used the bully pulpit to recompose, reconstitute, amplify this like very terrifying, you know, ultra-nationalist far right, I think quite successfully. [01:28:04] And I think that that would be the question, you know, could Bernie do something analogous for the left? [01:28:10] Obviously, that leaves a lot of work for organizers on the ground to do because simply putting those ideas and that rhetoric into circulation is a necessary but not sufficient condition, right?
[01:28:21] It has to be then matched by a certain amount of organizing work, that they're pulling people into organizations. [01:28:27] But again, I think a lot of this depends on, you know, depending on your politics, depending on the way you think about politics, what criteria are you using to evaluate a possible Sanders presidency? [01:28:40] I think folks with a more technocratic frame of mind might look at it and see, well, how much legislation could he get through Congress, for instance? === Stimulating Organized Left Growth (03:56) === [01:28:48] And again, I think that's an important consideration. [01:28:50] I don't think we can exclude that. [01:28:52] But I think folks on the left might put the emphasis somewhere else, which is to what extent can he stimulate this growth of the organized left and make it possible for this infrastructure, which has been developing but is still quite fragile and kind of nascent, to really take deeper root. [01:29:12] And, you know, what does that look like in the labor movement, organizations like DSA? [01:29:16] How do we extend this ecosystem that has started to recover in the last few years, but is still quite fragile? [01:29:22] Yeah. [01:29:22] Yeah, I think there's probably a fun argument to be made that Obama actually did the same thing, but for a kind of liberal technocratic PMC class. [01:29:31] Even though I understand that right now, PMC is a highly contested term. [01:29:37] Yeah. [01:29:39] Pussy munching cuckoo. [01:29:42] Sorry. [01:29:43] Brace. [01:29:44] Liz did not seem happy. [01:29:46] Ben, this has been so fun. [01:29:48] We could talk for like another hour. [01:29:50] Yeah, yeah, fantastic. [01:29:52] Well, we should wrap this up. [01:29:54] Yes. [01:29:54] Yeah, before we heap compliments onto the compliment pile. [01:29:59] Thanks so much. [01:29:59] This has really been fun. [01:30:00] I've had a great time. [01:30:01] Yeah, thank you. [01:30:02] So Ben Tarnoff, thank you so much.
[01:30:05] Founding editor of Logic magazine, magazine or journal? [01:30:10] Do you have a preference? [01:30:11] I don't know. [01:30:11] Is it a magazine or a journal? [01:30:13] It's a good question. [01:30:14] It's journal size, but magazine content. [01:30:18] Yeah, it's more of a vertical, I'd say. [01:30:20] Oh, yes. [01:30:21] There you go. [01:30:22] And we'll link to it in the show notes where everyone can check out those articles that you mentioned. [01:30:28] Yeah, there's so much more to talk about with this stuff. [01:30:31] But thank you so much. [01:30:33] Thanks, guys. [01:30:34] I will talk to you soon. [01:30:35] See ya. [01:30:35] Top tune. [01:30:36] Have a good one. [01:30:37] Man, I'm [01:31:07] destroying my phone. [01:31:09] Oh, yeah. [01:31:09] I actually, so a lot of listeners don't know this, but I actually dropped my phone in the couch that I'm sitting on right now a few weeks ago. [01:31:18] With Tim. [01:31:18] Yeah, when Tim Faust was on the show, he was in studio and he and I were sitting next to each other. [01:31:23] I'm usually sprawled across this love seat here. [01:31:26] And my phone fell out of one of the folds of fat on my stomach and it's usually stuck there with sweat and gum, etc. [01:31:35] Just kidding, I don't have any folds. [01:31:36] It goes into the couch and we lose it for what was almost 45 minutes. [01:31:41] Yeah, I don't know if this is a great story to tell. [01:31:43] Okay. [01:31:44] Anyways, I should have just left it there. [01:31:46] No. [01:31:47] Yeah, that, yeah, 45 minutes of trying to get a phone out of the couch because the couch, you can't remove the cushions. [01:31:53] Yeah, luckily, Liz has a wrist that can only be measured in centimeters, not inches. [01:31:59] So she was able to reach in there eventually. [01:32:00] Yes, my delicate lady fingers and hands. [01:32:05] Snipper mobile giants. [01:32:06] Yes, like a child factory worker making beautiful Nikes. [01:32:13] Yes.
[01:32:14] Like a safecracker. [01:32:15] Exactly. [01:32:17] And so her childlike hands were able to reach inside the guts of this couch and remove my telephone. [01:32:23] So thank you for that, Liz. [01:32:24] Yeah, again, this is a stupid story. [01:32:26] Thank you, Ben. [01:32:30] Hope you guys enjoyed that. [01:32:31] We'll be back next week. [01:32:32] Yes. [01:32:34] So this has been Brace, Doc to Technology, Liz, and our producer, Cyber Young Chomsky. [01:32:43] And we'll see you next time.