The Tim Dillon Show - 282 - Lex Fridman Aired: 2022-01-04 Duration: 01:42:05 === Triple-Tex and Universal Sketches (05:00) === [00:00:00] (Norwegian ad read for Tripletex, an accounting program, listing the trades it serves and reminding listeners of the May 31st deadline for year-end accounts.) [00:00:30] Ladies and gentlemen, welcome to the Tim Dillon Show. [00:00:33] My voice is slightly suffering right now. [00:00:40] We are here with Lex Fridman, who is an incredibly intelligent podcaster and researcher. [00:00:47] He has a podcast that, probably right now, [00:00:50] I don't know that you could find a podcast with better guests than yours, if you look at your ability to get top-tier people in every field, predominantly the things you're interested in: robotics, technology, science, physics. [00:01:08] But you get everybody. [00:01:10] I mean, you've had Richard Dawkins, you just had Elon Musk. [00:01:14] How do you get the top people in those fields? [00:01:19] And they love coming on and talking to you. [00:01:22] Yeah, it's an interesting question. [00:01:24] I think if there's one universal thing that I've noticed, it's that somebody close to those folks is just a fan of the podcast. [00:01:35] So it's like this momentum that keeps going. [00:01:37] And oftentimes, it's family members where they say, you know, dad or mom or whatever, this guy's all about love and is kind of an idiot and fun to talk to. [00:01:51] So, and then it just kind of builds like that.
[00:01:53] With Elon, a few folks were fans of the lectures I was giving at MIT. [00:01:59] And so it's like somebody always recommends it to them. [00:02:03] Right. [00:02:03] And then some of it is also just networking. [00:02:06] Like, I just know a lot of people in the tech space. [00:02:08] But usually, it's funny, for some of these busy folks, it's just somebody close to them, somebody they trust, somebody they like, will say, Hey, you should do this kind of podcast. [00:02:20] And that, because you find a lot of really busy people don't know anything about podcasts. [00:02:28] Yeah, and they're wary of it, right? [00:02:30] They're wary of sitting down with somebody they may not know, and where that'll go, and what they'll end up saying, especially a lot of people that are in positions of power. [00:02:38] Usually, they're, you know, I spent time with the CEO of NBC once for a day. [00:02:45] They were doing a sketch, an internal sketch to be played at their board meeting. [00:02:51] And they hired me as a comedic actor to play a cab driver in the sketch. [00:02:57] And the whole sketch was going to be the CEO of NBC Universal, who's no longer the CEO, but he was Steve something or other. [00:03:04] He was going to roast the board, the people, you know, his people. [00:03:11] He was going to roast them in a playful, fun way. [00:03:14] And I was going to be the cab driver. [00:03:16] And they spent tens of thousands of dollars, maybe more. [00:03:21] You know, the cab was on a dolly. [00:03:23] It was being carted around New York. [00:03:25] Now, what amazed me about spending like a few hours with him was how tightly structured his entire life is. [00:03:31] Yeah. [00:03:32] He had a lawyer with him at all times. [00:03:34] He had a personal assistant with him at all times. [00:03:37] NBC Universal had an attorney there. [00:03:40] The script, which was such a simple script, had gone through 10 or 15 rewrites and redrafts.
[00:03:49] And he was a very intelligent guy. [00:03:52] And his time is at such a premium. [00:03:54] It has such value that to say to anybody in that space, come and spend two hours with me and just let it fly. [00:04:03] Like, I remember the first thing I said when I saw him: he walked out and he goes, You're the cab driver. [00:04:10] And I said, Yep, last white cab driver in New York. [00:04:13] And he went, Whoa, whoa, that's not in the script, right? [00:04:16] I said, No, no, no, no. [00:04:17] He goes, Okay, that's. [00:04:17] And he's looking at like his lawyers and his people. [00:04:19] He's like, Wait a minute, we're not going to say that, right? [00:04:22] So very interesting that you have these people that come on and they feel comfortable enough with you to just speak. [00:04:32] Everything you described with the situation you were in, to me, is toxic for a great conversation. [00:04:39] So 100%. [00:04:40] I'm becoming so. [00:04:41] One of the deeper questions here is: how do you talk to these people for two, three, four hours? [00:04:47] And how do you make sure that nobody else is allowed in the room? [00:04:50] Like, I have to vet anyone who's allowed. [00:04:53] So, no marketing people, no people that have a notebook and they're taking notes about the things you're not supposed to say, where they look over. === House Ownership as an Excuse (07:20) === [00:05:01] Is that appropriate? [00:05:02] None of those people. [00:05:03] Sometimes it's cool to have just friends there, so you can look over at them and laugh, like Ben is here. [00:05:10] You can look over and laugh. [00:05:11] That's great. [00:05:12] But maybe in the first few dozen conversations, I was okay with people being present; now I'm, like, hardcore. [00:05:22] You start at three hours, it has to be three hours.
[00:05:24] Is it Joe Rogan-inspired that you could do long-form conversation with these folks and then say, politely, of course, and respectfully, I have different ways of saying it, but, like, I feel most comfortable creatively with just us two in the room? Absolutely. And, uh, well, [00:05:44] yeah, and then there's the uncomfortable pressure of saying, like, I know what the hell I'm doing. That exact conversation has come up, which is ultimately: you have to trust me that I know what the hell I'm doing, and then I will bring out the best in this person. [00:06:02] What's interesting about you, we talked about it a little bit before, uh, we started, is that you don't own real estate, you don't own a property, you rent, you like the freedom of that, yes. [00:06:14] And by the way, sorry to interrupt: your conversation with Joe about digital real estate was quite epic. [00:06:19] I think it, I think it was interesting, yeah, and I think a lot of people responded to it. [00:06:24] Yeah, right now, real life, people tell me all the time, they go, Are you kidding or are you serious? [00:06:30] And I go, Real life right now is living on that line. [00:06:35] Yeah, the things that are real sound like comedy, yeah. [00:06:40] So I'm talking to Joe and I'm being funny because that's what I try to do, but I'm also bringing up real actual trends that are happening. [00:06:53] And people, I think, are so, you know, removed from that world that it all sounds like comedy. [00:07:03] And a lot of it is very funny, but it's actually a real thing that is happening. [00:07:09] But you, what's interesting about you, is the concept of ownership in the real world you view as kind of a lack of freedom. [00:07:18] You were saying, Yeah, I think I value freedom to make big leaps into the unknown as exceptionally important. [00:07:31] So, I also value love and monogamy and relationships, all those kinds of things.
[00:07:35] And those, for a lot of people, put a limit on freedom. [00:07:38] Right. [00:07:39] I don't see it that way. [00:07:40] So, like, I think people use that as an excuse. [00:07:43] In the same way, house ownership, people often use as an excuse to get complacent, to limit themselves. [00:07:51] So I don't think a house, I don't think ownership in itself, is a limit on your freedom, but people get comfortable, they get soft, it becomes a limitation, because I want to be able to say tomorrow, you know, I'm going to start a company and I'm not going to do it in Austin. [00:08:06] I'm going to do it in Kansas City. [00:08:08] I'm going to do it in San Francisco and move. [00:08:10] Just walk away. [00:08:12] Yeah. [00:08:13] Cool guys don't look back at explosions. [00:08:15] Just walk away. [00:08:16] Just walk away. [00:08:17] And that's something about me, I mean, I don't want to generalize, but for me, I probably won't do that. [00:08:23] But the freedom to do that allows you to think big, to take leaps into the unknown in a way that materializes, like, the best version of yourself. [00:08:35] I mean, I fully agree with you. [00:08:38] As a person who bought a house. [00:08:40] Well, I did buy a house in Austin. [00:08:42] I think that part of what I like is problems. [00:08:48] Yes. [00:08:48] Problems are essential for my business. [00:08:52] What's your business exactly? [00:08:55] It's a great question. [00:08:58] I think it's, you know, in its very simplest form, it's making people happy, right? [00:09:10] That's like the simplest way to say it. [00:09:13] And if there are no problems, there's never going to be a reason to make anyone happy. [00:09:17] That's why utopians hate comedy: because they think you can create a world in which there will be no pain. [00:09:22] So if there is no pain, there's no reason to alleviate that pain through comedy.
[00:09:27] But I'm someone who realizes that no matter how great you make the world, you're going to have pain. [00:09:32] You're going to have problems. [00:09:34] So one of the reasons that I kind of like entanglements, problems, things like that, is because I see a lot of opportunities in them to, you know, release that valve of pressure comedically. [00:09:51] So you lean into the chaos, because from the chaos comes a lot of problems. [00:09:55] And then you can understand the human condition by analyzing it. [00:09:58] I think that's part of it. [00:09:59] I also think I have to do things to know what they are. [00:10:05] And part of my job is to hopefully make those things funny. [00:10:09] So, owning a house, I've owned houses; I mean, I owned one house that I lost. [00:10:15] I'm very open about that, during the mortgage crisis, right? [00:10:19] So I went through that. [00:10:20] And now I bought a house very responsibly. [00:10:23] But when you buy a house very responsibly, it's still very much, you know, a weight. [00:10:30] It's a weight on you. [00:10:32] You feel that weight, you know? [00:10:35] And, but there's value, I think, to weighing yourself down sometimes. [00:10:40] Yes. [00:10:41] Because then you understand what other people are going through. [00:10:45] When I'm in a comedy club or a theater, I'm looking out in the audience and I'm going, all of these people have that weight. [00:10:51] They have more than that. [00:10:52] They have kids. [00:10:53] They have, you know, jobs. [00:10:56] They have family members they support. [00:10:59] They have lots of weight. [00:11:01] So to understand that and to relate to that, for me, I think there's a value to it. [00:11:06] But I also agree with you and completely understand that it slows you down. [00:11:12] Houses are, they slow you down. [00:11:14] Families, even though they're amazing and valuable, can have that effect as well. [00:11:20] And it's well worth it to be slowed.
[00:11:22] Certain times, slowing down, I think, is worth it. [00:11:25] But you also have to be honest with yourself that the slowing down is often artificial. [00:11:29] Like, it's in your mind, not to be stoic for a second, but yeah. [00:11:34] Like, there's no reason you can't sell the house. [00:11:37] Well, some of it is you want to make sure you get a house you can actually afford. [00:11:42] Yes. [00:11:42] All those kinds of things. [00:11:44] But even still, people talk about, like, FU money, and they use the lack of money or an insufficient amount of money as an excuse to stay in the job they don't like, stay in the life they don't like, stay in the marriage they don't like, stay in a house they don't like, when in reality, you could, I think it's an excuse. [00:12:04] Like, most people can take the bold action. [00:12:07] It's a huge risk, but take a leap into the unknown, get rid of the house, and go [00:12:12] pursue a job that you actually love. [00:12:14] And then everything else will just follow. [00:12:19] Do you think everybody loves something? === Pursuing What You Love (15:17) === [00:12:21] So this is, see, this is a fundamental disagreement that we have. [00:12:25] They have the capacity to discover something they love. [00:12:28] Yes. [00:12:30] Well, that's a different thing. [00:12:32] So, if you think there's nothing in this world you love, outside of chemical depression, if there's just a chemistry that makes you incapable of seeing some aspect of beauty in the world, I think everybody could discover it. [00:12:44] And it's a little flame that's in everybody that you just have to find. [00:12:50] Haven't you ever met stupid people? [00:12:53] I think when I speak to you, I speak to a guy who, because he's very smart, has had the privilege of speaking to some very, very intelligent people. [00:13:03] But have you ever spoken to someone who is not that smart? [00:13:07] And I don't mean a bad person.
[00:13:09] I don't mean a malevolent human being in any way. [00:13:13] I mean someone who's not complex. [00:13:15] They're a simple person. [00:13:18] So let me just make clear that I don't like the quote-unquote intelligent people or the elite people or the rich people. [00:13:26] I like simple people, simple, simple people. [00:13:29] That's my opinion. [00:13:30] I see myself as a, you say I'm. [00:13:32] But you're not. [00:13:32] But this is when, this is when I'm going to, this is the thing: you've got a robot on your floor. [00:13:36] You interviewed the richest man in the world yesterday. [00:13:38] This is when we have to stop. [00:13:40] You're wearing a suit. [00:13:41] It's, it's, it's 12. [00:13:42] I mean, Rogan thinks he's a simple man too. [00:13:44] He owns the Colorado River. [00:13:46] Can we stop? [00:13:47] This is when, as a comedian, I have to point out how insane this sounds to everyone. [00:13:51] You're a brilliant guy. [00:13:53] You give lectures at MIT. [00:13:54] You're just not a simple person. [00:13:56] I was in Buc-ee's yesterday. [00:13:57] You know what Buc-ee's is? [00:13:58] It's a Walmart competitor, and Buc-ee is a beaver. [00:14:04] And it's a big Walmart, and people go into it and they buy everything they need, and they have Buc-ee's nuggets, and they're like Corn Pops, a cereal. [00:14:11] They're not good for you. [00:14:12] They're like potato chips, essentially. [00:14:14] They're like, you know, this highly processed sugar food, and they're so popular. [00:14:19] People go to Buc-ee's and they buy the nuggets and they eat the Buc-ee's nuggets on their couch. [00:14:24] Don't you understand that those people fundamentally have some differences with you? [00:14:30] No, I mean, okay, I just don't think so. [00:14:34] I think they actually appreciate life more than the people who are in this rat race of, like, trying.
[00:14:44] So the quote-unquote smart people are the people who are climbing some kind of career tree. [00:14:50] If you said to some of those people, do what you love, [00:14:51] they might say, I'm doing what I love. [00:14:53] I love the Buc-ee's nuggets and I love my couch. [00:14:56] First of all, that, yes. [00:14:58] If they, well, okay. [00:15:00] So I wanted to be a psychiatrist when I was growing up. [00:15:03] So I would have to have like several months' worth of sessions with them to discuss, is it truly what they love? [00:15:08] I think on session two, you would realize that you're kind of going up against a brick wall. [00:15:15] Nope. [00:15:16] I don't think that's. [00:15:16] Not everybody is meant to live this fully, you know, examined life. [00:15:23] Many people are quite happy. [00:15:26] It doesn't have to be examined. [00:15:27] You don't have to be, you don't have to read philosophical literature. [00:15:31] You can just be honest with yourself, like, self-reflect in a simple way about what the hell you love about life. [00:15:38] Well, that's fair. [00:15:40] And I'm with you on that. [00:15:41] I'm with you on that. [00:15:42] I think this is what we've done as a society, in my estimation. [00:15:46] This is one of our fundamental disagreements. [00:15:48] We agree on many things. [00:15:50] You mean you and I? [00:15:51] I think we agree on a lot. [00:15:52] Yes. [00:15:53] But then there's things we disagree on. [00:15:55] Like, such as. [00:15:57] Well, I think love. [00:16:00] No, love is important and essential. [00:16:03] I have to talk to you about relationships. [00:16:05] We'll do that in a second. [00:16:06] But one of the things where me and you have a disagreement is this very vague and general advice that we give to people: we tell them your life is not valid or meaningful unless you have this dream, or this deep love of something, that you have to pursue, and you have to be on this journey.
[00:16:28] And to me, I know people that derive a lot of happiness from the greatest simplicities in their life. [00:16:38] Whether that might be, you know, and I'm not saying family is simple, but it might be just having a family. [00:16:44] It might be just being sober. [00:16:48] It might be just having enough money to pay the rent. [00:16:50] So to me, to tell everybody that they have to love something in a way that motivates them or inspires them to chase it or follow it, [00:17:05] I think it's, it's incomplete as advice. [00:17:11] Yeah. [00:17:11] Maybe like if there's a goal that you have to reach for, you have to improve, like a New Year's Resolution style goal. [00:17:17] Sure. [00:17:17] Yeah. [00:17:18] No, I was. [00:17:20] Like, you know, these great guys, these David Goggins, all these people who are like running backwards on a mountain. [00:17:25] Yes. [00:17:25] Totally get it, right? [00:17:27] I mean, that's great. [00:17:30] Who is that for? [00:17:32] Like, I know it's for people. [00:17:34] Like, the people that run up the hill are going to run up the hill. [00:17:40] The people that aren't going to run up the hill aren't. [00:17:43] Now, sometimes the people that aren't running up the hill will decide to run up the hill. [00:17:47] For me, the positive aspect of David Goggins is it wakes you up to consider, do I want to be a David Goggins in this world or do I want to be something else? [00:17:56] Like it just, it just like shakes you up. [00:17:58] Like looking at weirdness, like, that's a good point. [00:18:02] Like just looking at something that's completely out of the ordinary. [00:18:05] Right. [00:18:05] I mean, you could just look at any kind of weird video, I suppose, of a human being doing anything weird. [00:18:10] And then it reminds you, wait a minute. [00:18:12] So people are capable of this. [00:18:15] So maybe I'm capable of a lot of stuff. [00:18:17] What am I actually interested in?
[00:18:19] I was just having a conversation with somebody and an older woman. [00:18:23] I've been practicing Russian. [00:18:24] I've been speaking in Russian with folks. [00:18:28] What's interesting about you is they hired you at the FSB without even speaking Russian. [00:18:33] They just let you. [00:18:35] That's odd. [00:18:35] They just said you'll figure it out. [00:18:37] Yeah. [00:18:37] Yeah. [00:18:38] They have a whole program to get you up to speed. [00:18:41] Fair enough. [00:18:42] So I was talking to her and she said the thing that brings her happiness is like this is going to sound a little bit strange, but sort of making dinner for a family, putting everybody to bed. [00:18:55] She has three kids. [00:18:56] Like her husband is like waiting for her in bed or he might be already sleeping and just sitting there at the kitchen table. [00:19:04] Oh, also making breakfast and lunch for the next day for the kids. [00:19:09] And just everything is done. [00:19:10] Everything is clean. [00:19:12] Everything is perfect. [00:19:12] And she's just sitting at the kitchen table and realizing this is happiness. [00:19:17] And that to me was like, yeah, that's another way of happiness. [00:19:20] And I am the same way. [00:19:22] To me, like, I mean, one form of happiness is just being pretty good at whatever I take on. [00:19:31] So like, I don't know, organizing any kind of stuff. [00:19:34] I love organizing. [00:19:35] Right. [00:19:36] You couldn't tell it by this place, but like I have an OCD nature, and it makes me feel good to organize stuff like spreadsheets or data and all those kinds of things. [00:19:46] And I could, you know, people talk about desk, you know, office work, but I would love that. [00:19:53] I would love to be an Excel person where I just organize, or you know what? [00:19:57] Like somebody's assistant where I like schedule stuff. [00:20:00] I love scheduling things. [00:20:02] You like that? [00:20:02] Yeah, I love it. 
[00:20:03] I would, yeah. [00:20:04] You were talking about love earlier. [00:20:06] Yeah. [00:20:06] And you were saying, what are your thoughts on that as an organizing principle of people's lives? [00:20:16] And when you see people in relationships, it's very interesting to me. [00:20:22] I know people that I think are genuinely in love. [00:20:25] Like I think Ben and his wife are genuinely in love. [00:20:27] But I also see other relationships that are business arrangements. [00:20:32] And they're also successful. [00:20:34] Truly. [00:20:36] So that's very interesting. [00:20:38] Yeah. [00:20:39] Partnerships. [00:20:40] That's right. [00:20:40] That's different from love. [00:20:42] So love to me. [00:20:43] So broadly, first of all, love to me is much bigger than romantic love. [00:20:47] Of course. [00:20:48] So you have to put friendship in there. Like, ever since I was young, I was longing for other human beings and that connection you have. [00:20:56] You mentioned, like, sort of, you know, happiness is contrasted to the suffering or whatever, the hardship that people are going through. [00:21:04] And that's kind of your job. [00:21:05] Put yourself in a place of hardship so you can make people happy. [00:21:08] But I always saw that there's just unfairness and cruelty and all that kind of stuff in the world. [00:21:15] And you could just huddle like penguins together amidst that. [00:21:19] And that's what friendship is. [00:21:20] But aside from humans, also, there's just, like, sitting at the kitchen table, or looking out at a tree and nature, and just appreciating being alive. [00:21:31] That to me is love, like that connection to just, fuck, [00:21:35] it's good to be alive. [00:21:36] That feeling. [00:21:37] That's huge. [00:21:38] Yeah.
[00:21:38] But romantic love, I just have seen just over and over and over again, hanging out with Rogan and David Goggins in Vegas, just hanging out with their wives and seeing, like, holy shit, so much of their success has to do with these people and their relationship. [00:21:59] That's a huge component. [00:22:00] And so, like, I've interacted with a lot of successful people where I don't think it's an accident, where there's this belief in each other, excitement for each other. [00:22:08] And it's like, you also realize, what the hell are you doing any of this for? [00:22:12] It's to your point. [00:22:12] Like, what are we trying to teach people? [00:22:14] I think success tastes the best, feels the best, when you get to share it with people you love. [00:22:22] Yes. [00:22:22] And that's 100%. [00:22:24] And that's, you know, so to me, that seems to be essential for a happy life: having, like, an inner circle, a small collection of people that you just love getting excited about shit together with. [00:22:36] That is absolutely true. [00:22:38] And sometimes they're, you know, the opposite sex or same sex, whatever they are, then you have sex with them. [00:22:44] Yes. [00:22:44] For many years. [00:22:45] Right. [00:22:45] Let me ask you an interesting question. [00:22:47] That's how that works. [00:22:48] In your estimation, I look at Jeffrey and Ghislaine and I say they were in love. [00:22:55] That was a very deep and powerful love. [00:22:58] I see the same. [00:22:59] I look at you and Ben the same way. [00:23:02] Absolutely. [00:23:02] Yeah. [00:23:03] He's Ghislaine. [00:23:04] We are not as successful as Jeffrey and Ghislaine. [00:23:10] By the way, tonight is close to the end of the year. [00:23:11] Yeah, no, no, no. [00:23:12] The word is yet. [00:23:13] Yeah. [00:23:14] But what I mean about that, right? [00:23:16] Is you have this from the outside world. [00:23:19] This is a very interesting conversation, right?
[00:23:21] Because everything you just said is beautiful and true. [00:23:24] But let's look at the other side of that. [00:23:26] So to us, you have Jeffrey and Ghislaine. [00:23:29] They're having this amazing love affair. [00:23:31] They genuinely, deeply love each other. [00:23:32] Do you think so, by the way? [00:23:33] It seems so. [00:23:35] Right? [00:23:35] I mean, the way they look at each other, I mean, they really truly seemed to love and trust each other. [00:23:43] Right. [00:23:44] They built this empire. [00:23:46] And nothing bonds you like doing evil unto the world. [00:23:50] True. [00:23:51] So my question is, is that valid, the love that they have for each other? [00:23:57] Because it does check a lot of the boxes of what you talked about, where it's like huddling together. [00:24:05] You know, it's us against the world. [00:24:08] Is that a valid love? [00:24:10] Yeah, it is. [00:24:11] It is. [00:24:12] But it can blind you to where you can do evil. [00:24:16] The harm you're doing to others. [00:24:17] Like, love of nation is something I also have. [00:24:21] Which nation? [00:24:23] That's funny [00:24:24] you should ask. [00:24:25] No, of course, man. [00:24:27] I'm American through and through, which is what I would say on a podcast that's going to be televised. [00:24:32] You live in Austin, Texas. [00:24:33] Nothing more American than that. [00:24:35] Exactly. [00:24:36] It really sells, I believe. [00:24:39] You're shooting guns now. [00:24:40] I'm going to start wearing a cowboy hat just to make people believe. [00:24:44] And can we edit out the part where I said I talk Russian? [00:24:47] Cause that's not going to fit the brand of what I'm trying to sell here. [00:24:52] No. [00:24:53] You know, you grew up in the Soviet Union where the idea, I grew up in the Soviet Union, where the idea of love of country was really powerful.
[00:25:00] And so much of it is propaganda, but there was, you know, literally the national anthem said, ironically enough, we're the greatest country on earth, indestructible. [00:25:10] It is the greatest. [00:25:11] The Soviet anthem is the greatest anthem ever written. [00:25:13] I play it in the car with Penn. [00:25:14] It is the most powerful. [00:25:16] You want to talk about musical prowess? [00:25:18] I mean, it's amazing. [00:25:19] And I don't know, you should look at the lyrics, because they say we're indestructible. [00:25:24] There's a, you know, the American national anthem is kind of polite. [00:25:28] Yes. [00:25:29] It's proud. [00:25:30] It's beautiful. [00:25:30] Yeah. [00:25:31] The Soviet national anthem is like, we're the baddest motherfuckers on this planet. [00:25:36] You will never destroy us. [00:25:37] Of course, it collapsed. [00:25:38] So there you go. [00:25:40] They changed it quickly. [00:25:41] They changed the national anthem after that. [00:25:43] So that's a love of country. [00:25:45] Hitler is an example of somebody that very effectively sold the love of country. [00:25:49] So love doesn't necessarily create good in the world. [00:25:54] Right. [00:25:54] It's, this is, more kind of. [00:25:56] Because love is a very powerful force, a blinding force. [00:26:01] When you talk about propaganda in the Soviet Union, you've lived in both countries. [00:26:06] You've seen American propaganda. [00:26:08] You've seen Soviet propaganda. [00:26:10] How does American propaganda function differently, or does it, than what you remember growing up in the Soviet Union? [00:26:19] Maybe you could speak to what you see as propaganda, because there's so many beautiful flavors of propaganda. [00:26:25] Sure. [00:26:26] Compared to the United States, what I remember from the Soviet Union is a centralized, government-driven propaganda. [00:26:33] So it's more boring. [00:26:35] Capitalism does everything great, even the propaganda. [00:26:38] Right.
[00:26:39] Well, I guess I would define it kind of loosely as the way that narratives emerge and are sustained in a society. [00:26:49] Those narratives could be about the society itself. [00:26:53] They could be about the way that society interacts with other societies. [00:26:58] But that, to me, is the most enduring form of propaganda: basically these specific belief systems that we have that are sustained through our media, Hollywood, athletics, sports, things like that. [00:27:19] Yeah, it's incomparable to me. [00:27:22] So, you could say, like, the mainstream media for a long time was an arm of the propaganda machine or something like that, because they would create narratives, maybe like going to war in Afghanistan and Iraq. [00:27:36] That was created through propaganda. === Zuckerberg and Freedom of Speech (15:31) === [00:27:39] Right. [00:27:40] You can make all those kinds of arguments, but ultimately, in the United States, the freedom of speech is obvious. [00:27:46] Not just the actual institutions that enable freedom of speech, but the desire in each human heart, in each American, like, the obsession with individualist freedom is, like, insatiable. [00:28:02] So, it's amazing to watch, actually. [00:28:04] And so, like, propaganda takes weird forms here. [00:28:09] There's just a lot of competing narratives and there's battles. [00:28:12] And then you realize somebody like Alex Jones could make a lot of money with one narrative. [00:28:17] Then, like, CNN could make a lot of money with another narrative. [00:28:20] And so, it's a competition of narratives, which ones go viral. [00:28:23] Right. [00:28:23] And so, ultimately, though, it doesn't feel like centralized control in the way the Soviet Union did, where everybody legitimately believed the story told. [00:28:38] In the United States, because of the competing narratives, I feel like everybody is constantly skeptical about any of that.
[00:28:45] You're very cynical and skeptical. [00:28:46] To me, it feels like [00:28:54] the lack of centralized control is not for lack of trying. [00:28:58] It feels like you have this carcass, this dead animal by the side of the road, which is the media. [00:29:05] And you have vultures, I'm one of them, that are taking a little bit of it. [00:29:11] And the animal may be still alive, and it's moaning and going, and those moans are the stories you get. [00:29:20] It does feel like this was a centralized. [00:29:23] So, the centralized effort is the dead animal on the side of the road. [00:29:27] Yeah. [00:29:27] And you're the vultures, the people. [00:29:30] We're the people. [00:29:31] We're now sustaining ourselves by coming in and saying, hey, we possess a technology and the ambition to also have a voice in a very crowded space where a lot of people want a voice. [00:29:46] But I think you're absolutely right. [00:29:48] But what I see with tech, there's a real danger that we can re-centralize, meaning that you have very few companies that have an enormous amount of power, and you have a lot of different people and a lot of different beliefs and narratives. [00:30:08] You know, every institution, somebody said this once, and I believe it: every institution seeks power through control. [00:30:17] And that's something that I've seen with COVID over the last 12 months. [00:30:23] You've seen people get banned. [00:30:25] You've seen people have their accounts frozen. [00:30:29] They've been demonetized. [00:30:31] They've lost careers. [00:30:32] They've lost their ability to speak because they've had a different opinion and a different view. [00:30:39] Robert Malone, at the same time. [00:30:40] Robert Malone was banned yesterday from Twitter. [00:30:44] This is one of the creators of the mRNA vaccine, right?
[00:30:47] I don't know why we can't live in a society where Robert Malone is allowed to have an opinion. [00:30:53] I don't know why we can't live in a society where Joe Rogan is allowed to have an opinion. [00:31:00] I mean, he is now, right? [00:31:01] I mean, but just to clarify, you want him to be banned. [00:31:05] Who? [00:31:06] Joe Rogan. [00:31:07] Well, I think the way you made that sound. [00:31:11] I wasn't sure if you were doing the joke. [00:31:13] No, no, no. [00:31:14] People are going to see that clip. [00:31:17] Yeah. [00:31:17] Well, Joe's show is not big, so I don't think it needs to be banned. [00:31:21] When you get to a certain level. I'm surprised you're not banned yet. [00:31:25] They will. [00:31:26] Well, it's interesting. [00:31:27] You do know I know Jack Dorsey pretty well. [00:31:30] So yeah, I don't think Jack's the problem. [00:31:32] He's not. [00:31:33] He's not. [00:31:34] I think he's like this libertarian guy with a beard who actually does believe in free speech. [00:31:40] Twitter was the freest of all of these. [00:31:43] I think Zuckerberg's a problem. [00:31:46] I think he's isolated, truly. [00:31:49] I think the people in his group are a problem. [00:31:53] And I think. [00:31:55] Can I ask you a question? [00:31:56] Yeah, actually. [00:31:57] So on Zuckerberg, because I have the same kind of intuition, but, like, how do we know for sure? [00:32:09] Because it feels like some of it is also narratives. I think he wants to be a good guy. [00:32:14] And I think if you want to be a good guy and you want the world to see you at all times as a good guy, you're going to be very malleable. [00:32:23] And if you don't have values that are set in stone at any given time, I mean, look at the last year. [00:32:32] In the beginning of the last year, mentioning that the virus came maybe, potentially, from a lab in China made you a hysterical racist, some crazy reactionary.
[00:32:41] Now it's a pretty well-established, reasonable hypothesis. [00:32:47] In only 12 months you've seen voicing that opinion go from exile from polite society to pretty reasonable, rational, mainstream belief. [00:32:59] Now, if you believe in freedom of speech, you ride that wave and you say you're allowed to have this opinion and you're allowed to have this opinion. [00:33:09] If you're looking to be a good guy, well, back there, you're going, that opinion's not allowed. [00:33:16] But then a year later, you go, we'll allow it now. [00:33:20] So you're constantly taking the temperature of public opinion and these institutions that are around you. And when I say public opinion, I don't even mean the public. [00:33:33] I mean the small set of people that really define public discourse. [00:33:39] That's my worry about Zuckerberg: he wants to be seen as good all the time and he wants to be liked all the time. [00:33:46] And that can corrupt you. A lot of us want to be liked, right? [00:33:53] True. [00:33:54] I want to be a good person. [00:33:55] But we don't own Facebook. [00:33:57] So the point is that my aunt wants to be liked all the time, but she's unemployed. [00:34:02] So when you control one of the most massive exchanges of information in the world, you know, you having a basic human trait can be a lot more dangerous than a comedian having it, who has very little power, you know, in terms of. [00:34:26] So your sense is that for anybody in Mark Zuckerberg's position, your ability to truly have a good sense of who you are and what the reality is gets corrupted. [00:34:37] Yeah, I think that's why Jack probably left Twitter. [00:34:39] He maybe sensed something. [00:34:42] I think Jack was basically like, I can't do it anymore. [00:34:45] And again, I'm speaking completely out of turn. [00:34:47] I've never met him or spoken to him.
[00:34:50] But what I do think is that he, from my belief system, did believe in things. [00:34:56] Now, it doesn't mean his company behaved that way all the time. [00:35:03] I don't think they executed always, but I think there was some overarching belief in the freedom of people to exchange their opinions. [00:35:16] Now, that started to get curbed later in his tenure, and now he's gone, and it scares me what's coming next. [00:35:22] So I think the belief in freedom of speech is an interesting one as a platform, because they have a lot of videos of rape and violence being uploaded, so they have to remove something from the platform. [00:35:38] That's right. [00:35:38] And it becomes a slippery slope for every platform, which they realize. Every social media platform, even the ones that advocate freedom of speech, they have a line that, if you cross it. Wasn't 8chan the one that was created saying, we're saying there's no line? [00:35:56] But I think even they had a line, and that was a big thing. [00:35:58] Well, I believe there should be a line, right? [00:36:01] I mean, in terms of, like, child pornography, rape, scenes of gratuitous violence, right? [00:36:07] Threatening people to a point, you know, doxing people, putting out their personal information, encouraging harm. [00:36:14] Yeah. [00:36:14] Right. [00:36:15] There are all these lines that in a society we need to have. [00:36:19] What I don't know, I don't think, is that those lines should necessarily be political. [00:36:24] And I think that's what I've seen more and more of. [00:36:28] I think my biggest problem is with determining lines from a place of arrogance versus humility. [00:36:34] So I really like it when people kind of approach all of this with a high degree of uncertainty about how much they truly understand the world. [00:36:45] So to me, the problem isn't the drawing of a line.
[00:36:48] It's the drawing of a line with arrogance, like as if you know better. [00:36:52] And so, I mean, you could see reasonably that quote-unquote misinformation about a pandemic could be viewed in a boardroom at Facebook or whatever social media company as something that could lead to the death of millions, right? [00:37:08] So from their perspective, they could say, all right, well, we want to stop the spread of misinformation. [00:37:13] So we're going to censor everything that uses the word ivermectin or something like that. [00:37:18] But you have to have the humility to say, wait a minute, maybe we don't perfectly know what the solutions to this pandemic are. [00:37:27] We need to be open-minded here. [00:37:29] Maybe very specific figures in the scientific community aren't the sole possessors of the truth. [00:37:34] Even the scientific institutions themselves, maybe they're not the sole possessors of the truth. [00:37:39] And having arrogance about that is really dangerous. [00:37:41] So the reason I'm bringing all this up is I hope it's possible to be at the scale of Facebook, at the scale of Twitter, and have the humility to actually have a product that does good for the world. [00:38:00] Right. [00:38:00] And, like, constantly ask itself, wait a minute, are we doing the right thing here? [00:38:04] As opposed to having the very specific metrics about engagement and those kinds of things that I'm talking about. [00:38:10] In your estimation, in tech, just like in every business, are the monopolies inevitable, seemingly? [00:38:21] Like, we have all these smart people in America. [00:38:23] We have all these capable people in tech, but whether it's Amazon or Facebook, there's really five massive companies that control everything. [00:38:33] And it seems to me that's going to be the way that it'll be forever. [00:38:37] Do you disagree? [00:38:39] Okay, well, that's good. [00:38:40] I would love it to change.
[00:38:42] It's new. Tech, everything we see with the internet, is new. [00:38:46] So there's another generation of people coming up. Where I would place the problem, it's complicated, but let me try to explain. [00:38:56] I think part of it has to do with money. [00:38:58] So what happens is these big companies, they acquire. [00:39:04] The moment you become successful, they start acquiring you. [00:39:06] And to me, one of the things that I value, like, for me, I have a dream of starting a company, and I know for sure I will never sell that company. [00:39:16] So you have to believe in your idea. [00:39:18] And there's a lot of entrepreneurs that I see now that really believe in an idea. [00:39:24] And in that way, you can build a larger number of companies that create a market of ideas: how you can do social media, how you can do whatever we're talking about, how you can do platforms that show video, platforms that show images, all those kinds of things. [00:39:41] And then you could have a competition in terms of censorship, in terms of how data is stored and shared, how much transparency there is about the data. [00:39:49] So I have a hope for that. [00:39:52] So I don't think it always has to be that way. [00:39:54] One of the problems is just that Facebook keeps acquiring everything that comes up. [00:39:58] So when companies sell, they just lose control. [00:40:03] And whatever that initial idea was gets potentially perverted to fit someone else's idea or someone else's profit model. [00:40:15] And there's a kind of complacency within big companies where they lose track of what made them great in the first place, which is constantly questioning: are we doing the right thing? [00:40:29] How can we do this better? [00:40:30] Right. [00:40:30] So that urgency you have when you're going to go bankrupt if you make the wrong decision, that's what startups have.
[00:40:37] And you can't, you can never lose that urgency. [00:40:39] That's what Steve Jobs was great at, artificially or not, creating urgency. [00:40:44] Like, we need to constantly create a new idea, constantly creating an idea. [00:40:48] And those people are rare. [00:40:49] And Elon's an example, somebody like that. [00:40:50] I do think that one of the, you know, weaknesses, I think, of the pro-free-speech contingent of people is they don't appreciate the complexity of the issue. [00:41:00] Like, it is very difficult. [00:41:02] Like all the things we talked about, violence against children, you know, threatening things, making people's home addresses available, like, there is a lot of gray area. [00:41:11] There always has been. [00:41:13] So I certainly don't envy the job that somebody like a Jack Dorsey has, right? [00:41:19] I don't envy Mark Zuckerberg's job. [00:41:21] You know, these people are dealing with foreign manipulation. [00:41:25] They're dealing with bad actors in their space. [00:41:27] They're dealing with organized troll campaigns directed from other places. [00:41:32] So I don't envy that. [00:41:33] It's an insanely difficult thing to do, especially because we are living more and more online. [00:41:38] More and more people, their ability to earn a living, their ability to market themselves to companies, all rests on their ability to have a digital identity. [00:41:50] So when you look at this stuff, you go, you know, it's all these, like, QAnon networks that start, you know, Parler and all these things that are, you know, explicitly political. [00:42:05] And they go, well, if you're off Twitter, you can go on Parler, but then you go on Parler and you're like, okay, this is Looney Tunes, right? [00:42:11] You're going into this crazy bubble. [00:42:14] You're going into this crazy bubble where a lot of things are just not based in reality at all.
[00:42:22] It's a hard problem to solve. [00:42:24] You could look at Parler and you could have hope. [00:42:26] Like, okay, here's a place where freedom of speech reigns. [00:42:30] And then you look there and it's just a bunch of hate, and it's not even fun to hang out there. [00:42:35] That's right. [00:42:35] So you go, you need a little bit of hate. [00:42:38] You need a little bit of hate. [00:42:39] This is my new campaign. [00:42:40] It's called a little bit of hate. [00:42:41] You know, a lot of people are getting rid of hate wholesale in the schools. [00:42:45] And I believe I should go back and just go, we need a little bit of hate. [00:42:48] That's your TED talk. [00:42:49] Yeah, it's my TED talk. [00:42:50] A little bit of hate. [00:42:51] You need a little bit of hate. [00:42:52] You need a little bit of, you know, contention. [00:42:58] You need people who have different ideas to be able to fight it out a little bit without completely saying, you have to use this service and you have to use that service, you know? [00:43:09] But it's tough. [00:43:10] It's difficult. === Proof of Work in Physical Reality (10:54) === [00:43:11] You know who Tom Waits is? [00:43:12] Yeah, of course. [00:43:12] Got that song, um, I like my town with a little drop of poison. [00:43:16] Yeah, I think that's true. [00:43:18] I actually kind of see you as that person in the tavern. [00:43:21] I think it's in Shrek, yeah, where there's, like, an old man in a tavern playing a piano, right? [00:43:27] He's got a voice kind of like yours. [00:43:28] Yes, I'm, over time, becoming more and more Tom Waits. [00:43:32] That's very true. [00:43:34] My final form is Tom Waits. [00:43:36] My final form is a piano.
[00:45:35] What do you think the challenges are for you? [00:45:40] Because you have this new internet, Web3, as they call it, right? [00:45:44] Digital tokens, crypto. [00:45:47] Are you a believer in this? [00:45:48] Are you a skeptic? [00:45:49] Which do you mean, Web3 or crypto? [00:45:51] Well, that's a good delineation. [00:45:55] Web3 is not something to believe in or not, right? [00:45:57] I mean, that's just kind of here. [00:45:58] Do you believe that crypto is going to be this force that helps things decentralize, or not? [00:46:06] Absolutely.
[00:46:07] Yeah, so, you know, I've had a lot of conversations about cryptocurrency, so I could give you the full rundown of all the different cryptocurrencies that are exciting to me. [00:46:19] But to me, personally, zero of it has to do with money. [00:46:23] So I'm not interested in the financial, what do you call it, investment side. [00:46:28] Right. [00:46:29] So, a lot of people, that's part of the problem with the space, is a lot of people just want to make money and they want to make a quick buck. [00:46:35] So their excitement is exaggerated by the fact that they're invested in a particular coin. [00:46:42] And so, to me, what's interesting is the different technologies involved, and they're amazing. [00:46:46] So, explain briefly, to people that are unaware, what blockchain technology is, because maybe they have not come in contact with anyone who has made it their business to yell at them for 10 hours about it. [00:47:02] Well, it's a pretty tough task. [00:47:05] Maybe one nice way to do it is just to mention some of the key technologies that are competing. [00:47:13] So, the original implementation of cryptocurrency on a blockchain is Bitcoin. [00:47:19] And one of the defining aspects of it is it's proof of work. [00:47:23] So, it's very difficult. [00:47:25] It's grounded to physical reality in the way that gold is grounded to physical reality, and that's really powerful. [00:47:33] And also it's powerful because no government can control it, and there's a finite amount of it. [00:47:44] So the scarcity is enforced, so it's going to hold its value. [00:47:47] There's a lot of nice properties to it, and then the negative side is the transactions are very slow and expensive.
[00:47:54] So, people have constructed layer two technologies, which means using Bitcoin for the great things that Bitcoin does while having the fast transactions that you want, like if you want to pay for dinner. They have layer two technologies, like Lightning Network as an example, that sit on top of Bitcoin. [00:48:12] Okay, then there's Ethereum. Now, blockchain, when people talk about a ledger, how would you break that down to somebody who's like, wait a minute, what in God's name are you talking about? [00:48:31] It's a way that all of these transactions are stored. [00:48:39] Yeah, so it's basically a database, right? [00:48:42] It's a way to store data such that it cannot be falsified or attacked or faked or changed by anybody, because there's a lot of people, through a consensus mechanism, that are monitoring and have full access to that data. [00:49:02] And so blockchain just means a store of data that has a temporal component. [00:49:09] So there's a sequence to it, and nobody can modify it. [00:49:14] Okay. [00:49:14] And so this is, like, secure and reliable in the way that physical reality is secure and reliable. [00:49:21] If you break a glass, it's broken. [00:49:29] That's a fact that nobody can undo; nobody can unbreak that glass and fake it, convince you it never broke. Fake news. [00:49:36] Like, that reality happened. [00:49:37] And in the same way, blockchain is the first technology of its kind, in a digital space, that allows you to have that same reliability across a population, in the same way that physical reality is reliable. [00:49:54] Blockchain just stores information in a reliable way. [00:49:58] Nobody can mess with it. [00:49:59] And then everybody just uses that blockchain in different ways. [00:50:02] And is there one blockchain or are there several? [00:50:06] Yeah, each one, there's one per crypto.
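The "store of data with a temporal component" that Lex describes can be made concrete with a minimal sketch in Python. This is a hypothetical toy, not any real cryptocurrency's block format: each block records some data plus the hash of the block before it, so changing any past entry breaks every link that comes after it, the "broken glass" property.

```python
import hashlib
import json
import time

def block_hash(block):
    # Deterministic SHA-256 digest of a block's contents.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data, prev_hash):
    # Each block stores its data, a timestamp, and the previous block's hash.
    return {"data": data, "time": time.time(), "prev": prev_hash}

def verify(chain):
    # The temporal link: every block must reference the hash of the one before it.
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev"] != block_hash(prev):
            return False
    return True

# Build a tiny chain: genesis -> block1 -> block2.
genesis = make_block("genesis", prev_hash="0" * 64)
block1 = make_block("Alice pays Bob 1 coin", block_hash(genesis))
block2 = make_block("Bob pays Carol 1 coin", block_hash(block1))
chain = [genesis, block1, block2]
assert verify(chain)

# Tampering with history invalidates every later link.
genesis["data"] = "genesis (forged)"
assert not verify(chain)
```

In a real system, the consensus mechanism he mentions is what stops an attacker from simply recomputing all the later hashes after tampering; this sketch only shows the tamper-evident linking itself.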
[00:50:11] So Ethereum, a lot of the NFTs that are selling right now, they're on the Ethereum blockchain. [00:50:15] Yes. [00:50:15] Right. [00:50:16] Yeah. [00:50:16] And then Ethereum has a lot of cool properties. [00:50:18] Right. [00:50:18] Of course, it's evolving: Ethereum 2.0. [00:50:20] I think it's supposed to come out in 2022. [00:50:24] You had the founder on your show. [00:50:26] Yeah, a couple of times. [00:50:26] Brilliant guy, really humble. [00:50:28] So some of these cryptocurrencies don't have one figurehead. [00:50:36] They have a community of people, a community of developers. [00:50:39] Now, obviously, it's well known that the Bitcoin creator and founder is anonymous: Satoshi Nakamoto. [00:50:48] Right. [00:50:48] That seems to be less true every year. [00:50:51] More people. [00:50:52] I wouldn't be surprised if you are Satoshi Nakamoto. [00:50:54] Oh, you're saying people are seemingly drawing a line around a group of people. [00:51:00] It's got to be one of them, right? [00:51:02] Yeah, yeah, yeah. [00:51:02] It's not like it could be anybody. [00:51:04] Well, but that's the beauty of it. [00:51:07] Right. [00:51:07] That you don't know. [00:51:08] And that's a powerful idea in itself, the anonymity. [00:51:11] But then Vitalik Buterin, who's the co-founder of Ethereum, is kind of like the figurehead, and he represents it. [00:51:19] And there's a few others, like another co-founder of Ethereum is Charles Hoskinson, who created Cardano. [00:51:28] That's a very interesting piece of technology. [00:51:30] Also, I would say there's smart contracts, which are ways to use the blockchain to make agreements between people, like exchange assets or do, you know, the same kind of contracts you do between people. [00:51:46] So that's really interesting. [00:51:48] And there's a move away from proof of work, which requires you to use a lot of resources. [00:51:58] Explain that.
[00:51:58] Explain proof of work, when you said it grounds Bitcoin to physical reality. [00:52:03] Explain proof of work. [00:52:05] Proof of work: you have to solve a cryptographic puzzle on a computer. [00:52:08] So a computer has to do a really hard thing for a long time to solve it. [00:52:12] And so you have to have a giant number of computers doing a giant number of computations. [00:52:18] That's why it takes all this energy. [00:52:19] It takes all of this energy. [00:52:21] Yeah. [00:52:21] But that's why it's very difficult to fake. [00:52:24] It's very difficult for a large number of people to get together and say, we're going to create a bunch of fake Bitcoin. [00:52:31] So this really grounds it. [00:52:34] You always have to come back to this mining effort of performing the computation. [00:52:39] There's another. [00:52:40] Now, what is it? [00:52:40] Is it a math problem, essentially? [00:52:42] It's a math. [00:52:42] It's a simple math problem. [00:52:43] You just have to do a lot of it. [00:52:45] To get one Bitcoin. [00:52:47] To mine one Bitcoin. [00:52:48] And how many computers have to do this? [00:52:50] As many as you want. [00:52:52] Okay. [00:52:52] So you can be mining Bitcoin with one computer. [00:52:55] With one computer. [00:52:56] And it would take a long time. [00:52:57] It would take a very long time. [00:52:58] So how many computers do you need if you want to make quick work of it? [00:53:02] It sounds like you're getting ideas in your head. [00:53:06] Maybe I am. [00:53:07] I think it's getting less and less possible to really make money with this mining effort. [00:53:13] I'm actually not paying too much attention to where the major mining facilities are. [00:53:20] China recently banned mining, Bitcoin mining. [00:53:26] So they banned cryptocurrency and banned mining. [00:53:28] So all the mines that were in China had to move elsewhere. [00:53:32] But I think it's distributed across the world.
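The "simple math problem you just have to do a lot of" can be illustrated with a toy proof-of-work loop in Python. This is a deliberately simplified sketch of the idea only; real Bitcoin mining hashes an 80-byte block header with double SHA-256 against a vastly harder target:

```python
import hashlib

def proof_of_work(block_header: str, difficulty: int) -> int:
    """Brute-force a nonce so that SHA-256(header + nonce) starts with
    `difficulty` hex zeros. There is no shortcut: finding the nonce
    requires real computation, which is what 'grounds' the result."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_header}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Finding the nonce takes many hash attempts; checking it takes one.
nonce = proof_of_work("block-42", difficulty=4)
digest = hashlib.sha256(f"block-42{nonce}".encode()).hexdigest()
assert digest.startswith("0000")
```

The asymmetry is the point: at difficulty 4 the miner needs on the order of 16^4 hash attempts, but anyone can verify the answer with a single hash, which is why faking work is impractical while checking it is cheap.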
[00:53:33] I think there's some in South America. [00:53:35] There's some in. [00:53:36] And one of the reasons that Elon. [00:53:39] There's some here in Austin. [00:53:40] Sorry to interrupt. [00:53:41] Sure. [00:53:41] I'm sure there are. [00:53:43] Elon made some news by saying it needs to be a greener, more environmentally friendly process. [00:53:48] This was maybe last year. [00:53:50] Yeah. [00:53:50] So what the Bitcoin folks will say is, yes, it's not environmentally friendly, but the benefit you get far outweighs the cost. [00:53:59] So there's a lot of stuff we do that's not environmentally friendly: driving cars, eating meat. === Fauci, Diseases, and Sacred Beliefs (07:07) === [00:54:05] And their argument is, like, well, the benefit to society is much higher than the cost it has to the environment. [00:54:10] So, like, don't just find a boogeyman just because you want one. [00:54:16] They just see it as another way to attack Bitcoin. [00:54:18] Okay. [00:54:19] But yes, it's bad for the environment, but so are a lot of things. [00:54:23] Absolutely. [00:54:24] That's the argument they would make, because this has the opportunity to revolutionize the monetary system across the world. [00:54:30] So one of the downsides of the way money is used in the world is it's used by governments to control the populace. [00:54:38] That's right. [00:54:39] There's a lot of people that talk about authoritarian regimes that use inflation, that use money to control people. [00:54:48] And that kind of money allows you to start wars. [00:54:51] You do everything. [00:54:52] And to fund gain-of-function research into diseases that leak out. [00:54:57] Yeah. [00:54:58] Okay. [00:54:58] Allegedly. [00:55:03] I've been thinking about whether I want to talk to Fauci. [00:55:05] I really don't. [00:55:06] I'd really, he's not going to come on. [00:55:08] He will. [00:55:09] Do you think he'll come on? [00:55:10] Yeah. [00:55:11] So, no, this was a discussion here.
[00:55:13] He said yes, but the problem is he wants to do 15 minutes. [00:55:18] So he doesn't know who I am or anybody. [00:55:21] It's the same podcast problem. [00:55:22] And I said, no, it has to be three hours. [00:55:24] And so, because I do, this is one example of a person, again, that has a lot of, not again, but there's a lot of positive aspects to him; he's accomplished a lot of incredible things in his life. [00:55:38] He started AIDS. [00:55:41] I mean, that's a huge thing. [00:55:43] That was a big deal. [00:55:45] Yeah, it's very impactful. [00:55:46] Hitler was, I think, man of the century for the 20th century. [00:55:48] He killed beagles. [00:55:50] Hitler did? [00:55:51] No, Fauci. [00:55:52] Your friend. [00:55:54] He's not my friend. [00:55:55] And then him and Bill Gates recently. [00:55:59] Pizza. [00:56:01] Well, I don't know about that. [00:56:02] Okay. [00:56:03] But Bill Gates is buying a bunch of farmland. [00:56:07] What's the connection between Fauci and Gates? There must be a connection. [00:56:10] They're disease freaks. [00:56:12] Yeah. [00:56:13] They like diseases a little too much. [00:56:15] Yeah. [00:56:16] There's, listen, I understand: when you spend your life researching pandemics, they're a little too excited about it. [00:56:22] It's Nietzsche. [00:56:23] If you gaze into the abyss, the disease gazes back into you. [00:56:27] Fauci's also an Italian. [00:56:29] That's big problems. [00:56:31] That's the first big problem that you have. [00:56:32] They have many defects. [00:56:34] Yeah. [00:56:35] And he's just more suited to make pizza or be a maitre d' at a restaurant than he is to be a scientist. [00:56:45] Yeah. [00:56:46] It makes perfect sense. [00:56:48] Italians are not meant. [00:56:49] Italians are criminals, and wherever they are, they're going to commit crime.
[00:56:55] So if they're in a pizzeria and it's a cash business, they can steal some of the money and feel good about themselves. [00:57:01] But when they're put in the government, they can steal so much money that they can have too much power. [00:57:05] So Fauci is a problem. [00:57:08] Yeah. [00:57:08] Primarily because he's Italian. [00:57:10] That's huge. [00:57:11] Well, it's a huge part of it, but he's a power-crazed vaccine Hitler. [00:57:21] Go back to the Ethereum and the canary in a coal mine. [00:57:28] Let me just wrap up the Fauci thing, though. [00:57:31] Yes. [00:57:32] To quickly say that I do think, as a political leader in the response to this pandemic, he did a terrible job. [00:57:38] He, to me, does not represent science. [00:57:41] There are a lot of good things to say about him, but none of those things come to mind for me in the pandemic. [00:57:47] He did a poor job responding to the pandemic, but I think he did a really good job starting it. [00:57:53] Yeah, yeah, yeah. [00:57:55] Tim Dillon speaking here, the man who also called him something Hitler. It was beautiful. [00:58:04] It was poetic, the way you referred to him as Hitler. [00:58:06] I forgot. [00:58:07] Me? [00:58:07] Fauci, yeah. [00:58:08] Vaccine Hitler? [00:58:09] Vaccine Hitler. [00:58:10] Thank you. [00:58:11] That was very good. [00:58:12] We should have four shots, five shots, six shots. [00:58:17] I like to study diseases, and then we like to leak them into the population and see what happens. [00:58:24] And then Bill and Melinda Gates get divorced because Bill's on Epstein's island. [00:58:29] Fauci didn't even get invited to Epstein's island. [00:58:31] That's how big of a loser he is. [00:58:35] Yeah. [00:58:36] Yeah. [00:58:36] I'm sure that that's the reason. Bill Gates can. [00:58:39] So you're a big believer in the transformative effect of crypto rather than the investment side. [00:58:43] Transition, Tim. [00:58:45] Yes.
[00:58:48] Not a big believer in that way, because big believer in the crypto space means something else; there are really big believers in the crypto space. [00:58:56] I'm one of them. [00:58:57] I think there are certain cultish aspects to cryptocurrency supporters, where they're not always able to approach things with humility and an open mind. [00:59:09] Well, that's why we need humor, right? [00:59:11] That's why we need to also poke fun at things. [00:59:13] Like, that's where people don't realize the value in that: the value in, like, not taking yourself so seriously. [00:59:18] You know who's great at it? [00:59:19] The Winklevoss twins are great. [00:59:20] Yeah, you can make fun of stuff; they get it. [00:59:23] Gary Vee is actually really good. [00:59:24] I was texting with him; he understands. Like, there are people that surprise me. [00:59:27] Like, there are people who understand humor, and then there's a lot of people that don't understand humor, that don't understand the value, because humor at its core, at its base, all it seeks to do, right, is, I think, just take down these sacred cows. [00:59:45] Yeah, and it doesn't mean that the sacred cows aren't sacred for a reason, right? [00:59:48] Making a joke about Jesus doesn't mean that Jesus wasn't this tremendously impactful figure in history, that billions of people have put their faith in. [00:59:57] What it means is that humor has this unique ability to maybe make anything more accessible to people. [01:00:13] I think it makes things accessible; it takes kings off thrones. [01:00:17] Yeah, and the people that believe so much in Bitcoin, they can take a joke. [01:00:22] The crypto world can take a joke, yeah. [01:00:25] And that's why I'm a big fan of you. [01:00:27] That's why I've been honored that you make fun of me a few times, which is great. [01:00:32] I love it.
[01:00:33] And it's never nasty, no, or mean. No, that's not it. I like smart people acting silly versus silly people acting smart. [01:00:45] I like that. [01:00:46] I'm not necessarily saying I'm smart, but I err on the side of the silly. That's why Elon is also an inspiration: despite everything else, he's able to be ridiculously silly. [01:00:58] Well, that's what I like about him. [01:00:59] He kind of pokes the bear, yeah, and he doesn't get, like, you've made fun of him a bunch, doesn't care, doesn't care. I mean, not in a nasty way. [01:01:08] I don't think he knows who I am or even knows that I've made fun of him. === The Game Played by Intelligence Agencies (15:13) === [01:01:12] Oh, I'm sure that's not true, actually, but that'd be interesting. [01:01:15] I'll bring that up. [01:01:16] Yeah, I don't know if it's on his radar, but I also think if it was on his radar, he wouldn't have me drone-striked. [01:01:23] I think he'd just chuckle. Yeah, exactly, just chuckle. But yeah, of course, there's different kinds of humor, too. [01:01:29] I mean, he's more in the dad jokes, silly memes category of humor. [01:01:35] You're more like, I'm going to do a tweet thread on Ghislaine Maxwell and how great Kim Jong-un is for the world. [01:01:43] And, you know, he's fat-forward. [01:01:48] Yeah, fat-positive, fat-positive. [01:01:50] Is that a new one, fat-forward? [01:01:51] Okay, fat-forward. [01:01:52] Meaning that he believes the future is fat. [01:01:55] Yeah, he's a progressive fattist. [01:01:59] I agree. [01:02:00] Yeah. [01:02:01] Is it interesting to you? [01:02:02] You're in worlds with a lot of very interesting people. [01:02:04] This technology, the robotics that you're into, all of these countries, right, are competing for, I guess the word would be supremacy, in a lot of these arenas, right?
[01:02:17] If China has the best technology, China's going to export it to everybody. [01:02:21] If the U.S. has the best technology, the U.S. is going to export it to everybody. [01:02:25] There's a lot of intellectual property concerns. [01:02:28] There's a lot of shadowy intelligence figures that embed themselves in institutions to try to gain access to a lot of highly sensitive stuff that's being developed, and stuff like that. [01:02:46] When you look at the landscape and you see a country like China... I've had journalists on my show that have said they believe that China is very influential, in the sense that they see a lot of American corporate interests saying that we have to adopt a lot of these Chinese technologies, surveillance states, social credit scores, things like that, so that we can then export them to the rest of the world. [01:03:14] We have that hegemony. [01:03:16] Does that worry you at all, that we're in a competition with an authoritarian country, and that in order to beat them, we may have to become them? [01:03:27] Yeah, of course that worries me. [01:03:29] So, for example, in the AI space, there was a recent report from the United States that we are now fully engaged in developing autonomous weapon systems because China is developing them. [01:03:42] And the argument was made that, you know, we're willing to go into the race for development of artificial intelligence driven weapons, because that's for the good of the world, because China is doing it. [01:03:57] So if you don't do it, then they're going to do it. [01:04:00] And that's how any kind of, you know, bad thing happens. [01:04:05] So I'm really worried about that, because there's a lot of people saying, you know, robot dogs, for example, aren't the main concern here, actually. [01:04:14] It's drones. [01:04:16] Drones are terrifying.
[01:04:18] People have... there's some kind of magic connection that people have with legged robots, for example, because it just connects them to dogs and cats and humans, but they're not actually very good at committing violence compared to drones. [01:04:32] Drones are great at it. [01:04:33] Yeah, the best. [01:04:35] So, the development of autonomy. [01:04:37] Currently, most drones are remotely controlled, and the degree of autonomy is minimal in terms of what the drones are doing. [01:04:44] But we're moving into an arena where you would have drones just deciding who and when to blow people up. [01:04:50] Well, it's more and more... it's greater and greater degrees of automation. [01:04:54] So you would say, okay, this location, this building, has a terrorist. [01:04:58] And currently you have remote-controlled operation of the drone, where you can target very precisely that particular building. [01:05:07] But then you could start seeing it going to a place where, okay, these are the set of locations we believe the terrorists are located in, but let the drones start to decide which is the most likely of those locations. [01:05:22] And then you more and more automate the decision about where to launch the bomb or the missile. [01:05:30] And then you start to expand the region of automation, where now there would be a set of drones that just monitor the presence of terrorists in a particular city. [01:05:41] I mean, it just becomes a slippery slope, where when you take the human out of the loop of the decision making, you start to... you have the capacity to commit vastly unethical things. [01:05:56] So yeah, that really worries me. [01:05:58] But also another thing that kind of worries me is the warmongering.
[01:06:02] So on the flip side, I've actually talked to a bunch of folks that live in China. Americans have a way of seeing the rest of the world as kind of flawed, and of thinking America has the right answer for everything, for how to live life, the idea of freedom, that we are the good guys. [01:06:21] I mean, every country has this idea, but we have to be careful with that, because I think you have to try to understand another culture. [01:06:29] Agreed. [01:06:29] So it's not obvious to me that the Chinese government is absolutely, unequivocally, like, evil. [01:06:39] They have a very different culture. [01:06:41] I've interacted with folks there. In many ways, their culture is very different. [01:06:46] Like, the value of uniqueness and innovation, for example, it's not there in the same way it is in the United States. [01:06:55] So Chinese folks, Chinese engineers, Chinese entrepreneurs, are more okay with copying ideas. [01:07:03] Right. [01:07:03] Like legally copying, the way I would say Bill Gates and Microsoft copied ideas. [01:07:09] Their whole approach was, let's see what Apple does and take the best ideas from Apple, from Netscape, from Firefox, from Google. [01:07:19] They take ideas and they try to make them better. [01:07:23] That kind of way of thinking is just very fundamentally different from the United States. [01:07:27] And if you constantly call them out for, like, stealing or something like that, you're just constantly trying to create tension where there doesn't necessarily need to exist tension. [01:07:39] Now, there's obviously a lot of aspects of the surveillance state that they're trying to create that are really worrying in terms of human rights. [01:07:47] But again, I just feel like you have to lead with understanding. [01:07:50] You have to start by trying to understand them.
[01:07:52] I think one of the things that I like that they do is that when they have a problem with someone in their society, usually a very wealthy person, like a billionaire, instead of, like, beating around the bush, they just disappear that person until they can be re-educated. [01:08:18] Now, it's dark, but here's the thing. [01:08:23] We bring all these billionaires before Congress and we, like, yell at them. [01:08:32] We have, like, these Midwestern mom congressmen be like, you're making all this money and nobody's got any money. [01:08:39] And it goes viral on Twitter and everyone's got screenshots. [01:08:42] But the reality is, instead of that, why not kidnap and torture these people? [01:08:47] Disappear, allegedly torture. [01:08:50] Right. [01:08:52] So, I mean, I'm not actually... that's really bad, if that's happening. [01:08:59] Well, I'm making light of it, but I'm also saying we seem anemic. [01:09:03] This is my serious point behind the joke. [01:09:05] Yeah. [01:09:06] We seem impotent to control anybody in this country if they've got enough capital. [01:09:12] Yes. [01:09:12] China does not. [01:09:15] So that's interesting. [01:09:16] And currently, China is making a lot of centralized decisions that are actually good for its society. [01:09:20] So there's a lot of people discussing that. [01:09:22] That's interesting, right? [01:09:23] Like, if you set the human rights violations aside for a brief moment, if you look at the markers in terms of the economy, in terms of the actual quality of life across the population, it's increasing rapidly. [01:09:40] So the problem is, you know, that's the problem with communism. [01:09:44] Like, you can do a lot of good in the short term, but it's the slippery slope of power that corrupts people. [01:09:51] Yeah, my biggest worry with China, if there's an overbearing government, is not the disappearing of billionaires.
[01:09:57] It's the fear that overtakes the populace, where they can't speak up. [01:10:03] It's the self-censorship. [01:10:06] It's not censorship, it's self-censorship. [01:10:08] What do you think people get wrong about Vladimir Putin? [01:10:13] This is a very touchy subject on this particular program. [01:10:18] Well, what do you mean by that? [01:10:20] I have not... [01:10:21] Well, now that Larry King has passed, you've become the new Larry King. [01:10:25] That is true. [01:10:27] That is unequivocally true, and no one would deny that. [01:10:30] What do they get wrong? [01:10:32] Well, because he's a historical figure and a present figure. [01:10:36] You, growing up in the Soviet Union, you have an understanding of him, maybe, as a leader that other people don't, right? [01:10:43] From the West or from outside. [01:10:45] One of the things I was highly critical of was when our government, our CIA, was, like, fomenting coups in Ukraine, encircling Putin, encircling Russia. [01:10:58] And then when Putin would respond to that, we'd go, what a monster. [01:11:02] But I mean, if he was doing that to us, we'd have the exact same reaction. [01:11:07] This is what I mean about propaganda and narratives emerging, right? [01:11:11] So, however, that being said, it does seem like there's a consolidation of power that is unhealthy. [01:11:20] So, I'll say one thing, just building on your previous point of the CIA and the FSB and the KGB: there's a game being played there. [01:11:30] There's a game being played in the shadows, and there's a bunch of people that are very good at this game. [01:11:36] And I think what I could say about Vladimir Putin is I think he's very good at this game. [01:11:40] And I don't like... I personally don't like that such a game is played, that anything is done in the shadows, that there's this manipulation that's happening. [01:11:49] What does this game look like?
[01:11:51] Just spying on each other, the cyber attacks in the modern world, stealing information, manipulating information, like, you know, creating narratives that get the American populace pissed off about something, and then America creating narratives in Europe that get Europe pissed off at Russia, all those things. [01:12:15] That's a game. [01:12:16] That's the game of geopolitics. [01:12:18] Henry Kissinger is somebody who was incredibly good at this game. [01:12:21] And the fact that such a game has to be played is deeply dishonest and destructive to the world. [01:12:28] But, I mean, it's a fascinating game. [01:12:32] And what troubles me most, of course, is that we know so little about it. [01:12:36] There's just a giant number of conspiracy theories, but they're just flirting at the door of truth. [01:12:44] You can't really verify them; it's not transparent. [01:12:47] These organizations are not transparent, unfortunately. [01:12:50] So I just want to, you know, say that there is kind of in Russia a discussion, especially amongst young folks, and certainly in the United States, that sounds similar to the way people analyze Donald Trump, which is saying that Putin is dumb, that Putin is just a lucky idiot. [01:13:12] And that's not true. [01:13:15] Well, there's a lot of people that believe this. [01:13:17] And what I personally believe is he's a very intelligent man. [01:13:22] And in terms of the game that we were just referring to... [01:13:26] Oh, of course. [01:13:27] The other thing I should mention, that people might not know, the reason I find him fascinating as a historical figure, is he is very intelligent in conversation. [01:13:38] So he's able to analyze the world in very interesting ways. [01:13:43] And leaders aren't always that intelligent in conversation, or just as, you know, analysts of history, analysts of the present geopolitics of the world, and, which you would appreciate, humor.
[01:14:01] Right. [01:14:02] And, like, dark humor. [01:14:05] Right. [01:14:06] So that's what makes him really fascinating to me. [01:14:09] Obviously, what also makes him fascinating to me is that he's been in power for 20 years now, 20, 21 years. [01:14:18] And I think power has changed him. [01:14:20] And it's unclear to me in which direction, but it has. [01:14:23] I think he was a different man at the beginning of his rule. [01:14:28] I don't want to say too much on this podcast about sort of what that is, because it requires a lot more context. [01:14:35] But let me just say that, as somebody who was in the Soviet Union in the 90s, through the collapse of the Soviet Union: that was true suffering and uncertainty. [01:14:47] That was a really, really difficult time. [01:14:50] And with Putin, there was a significant economic recovery that started the 21st century with him at the helm. [01:15:02] And so that's why a lot of people still see him as a powerful leader, because he saved Russia from the collapse of the Soviet Union, from, like, the destructive aspects of the Soviet Union. [01:15:15] Yeah, from the looting. [01:15:16] Yeah, well, but also the financial system was wrecked. [01:15:20] Yeah. [01:15:21] And there's oligarchs just kind of taking more and more. [01:15:25] And that's one of the difficult realities you have to see, or think about, when you think about Putin: the context in which he operates, the level of corruption all around. [01:15:38] There is some aspect to Russia where a strong leader in the short term is necessary to keep the corruption at bay. [01:15:50] Now, that's a dark reality, and I don't think it's a... [01:15:53] It's grim. [01:15:55] I think all revolutions are really painful. [01:15:57] That doesn't mean you shouldn't go through with them. [01:16:00] But if you just analyze the situation as it currently is, Russia is in a very difficult place.
[01:16:06] And as a human being, just for me, it hurts my heart, because I know the culture of the 20th century, the Soviet Union. [01:16:14] There was so much love for mathematics and science, for poetry and music. [01:16:18] There's something about a world war where tens of millions of people die that creates some of the most beautiful art. === Cynicism About Powerful Figures (16:54) === [01:16:26] Great comedy. [01:16:27] Amazing. [01:16:27] And so much of that... I think it's just a beautiful place that culturally can flourish and be part of the world. [01:16:39] And it currently is not nearly as much a part of the world as it could be. [01:16:45] In science and tech, there's just so little communication between the United States and the rest of the world and Russia. [01:16:53] What do you think the... when Putin steps down, what does the future of that political system look like? [01:16:59] This is the scary thing. [01:17:02] It's not good. [01:17:04] That's why a lot of people say, well, my life is good now. [01:17:08] And they're afraid of the uncertainty of Putin stepping down. [01:17:12] Would you have him as a guest on the show? [01:17:15] Yes, of course. [01:17:16] Do you think he would do it? [01:17:17] Yes. [01:17:18] Really? [01:17:19] Yeah. [01:17:20] When will he do it? [01:17:23] You'll be the Larry King soon. [01:17:26] Interesting. [01:17:27] Yeah. [01:17:28] Do you think he will do a podcast tour? [01:17:30] Well, he will also do my show, and Red Scare, and a bunch of other shows. [01:17:34] Red Scare. [01:17:35] That's a good one. [01:17:36] Yeah. [01:17:37] Well, they have to go from there. [01:17:38] Yeah. [01:17:39] Probably Joe Rogan first. [01:17:42] It would be great if he did Rogan. [01:17:44] Yeah. [01:17:44] Well, I told Joe I'd be the translator. [01:17:47] He speaks pretty good English, actually, but he doesn't speak English publicly, because you kind of have to... [01:17:55] He only needs to understand one word. [01:17:57] Yes.
[01:17:57] Ivermectin. [01:18:00] He would probably say that there's not enough love for the Russian vaccine. [01:18:04] Or, you know, there's a challenge. [01:18:05] What is the Russian vaccine? [01:18:08] I see you brought me on to the show for the scientific analysis of misconceptions. [01:18:13] Sinovac is China. [01:18:15] Right. [01:18:15] What is the Russian vax? [01:18:18] Yeah, I don't actually know the details about it. [01:18:20] Do I know the name of it even? [01:18:22] It's Sputnik V or whatever. [01:18:23] Sputnik V. [01:18:25] I'll get boosted with Sputnik V. [01:18:27] Is it? [01:18:28] Ben, can you fact check me on that? [01:18:30] Kovacs. [01:18:31] That's not the Russian one. [01:18:32] That's not the Russian one, Ben. [01:18:35] What kind of mediocre research team do you have here? [01:18:38] I mean, he's not a research team. [01:18:43] Yeah. [01:18:43] Thank you. [01:18:47] Who's he working for? [01:18:48] That's the real question. [01:18:49] Well, he's the Ghislaine Maxwell of the operation here at the show. [01:18:53] He's got a lot more. [01:18:53] Who do you think is the intelligent one? [01:18:56] Who do you think is the brains, honestly? [01:18:58] Jeffrey and Ghislaine? [01:19:00] Yeah. [01:19:02] Neither. [01:19:02] Well, here's what I mean by that. [01:19:04] Ghislaine's father was intelligence. [01:19:07] Ghislaine was very schooled in the world of intelligence, right? [01:19:10] And, you know, understanding what intelligence is: it's the maintenance of relationships, essentially, right? [01:19:17] And having leverage over people, and how relationships are structured, and who has the upper hand at any given time, and that could change, and blah, blah, blah. [01:19:25] And, you know, so Ghislaine understood that. [01:19:27] Her dad was this media mogul, Robert Maxwell, who was murdered. [01:19:31] Her sisters are in tech. [01:19:33] It's an intelligence family.
[01:19:37] Epstein was, you know, from many people that have met him, sort of by no means a brilliant guy, right? [01:19:48] Like, he was not this amazing intellect. [01:19:51] However, he had a very good understanding of people, their appetites, their proclivities, their weaknesses surrounding underage women, surrounding illicit streams of money, where they wanted their money hidden. [01:20:09] I'm sure surrounding many other things as well. [01:20:12] He seemed to be very good at that. [01:20:14] I think the way that they both functioned was as a team of people that, you know, ran this... I think Epstein was the public face of it, as a philanthropist, as an investor. [01:20:30] I think Ghislaine was just a socialite... but not just a socialite. To the public, just a socialite. [01:20:38] But behind the scenes, she was a madam. [01:20:41] She was a pimp. [01:20:42] She was an access agent. [01:20:44] She connected Jeffrey to people that were very powerful in the UK and other places, that were friends of her father. [01:20:53] And they had an operation that was based on entrapping, blackmailing, gaining leverage over very powerful politicians. [01:21:03] So do you think there was an intelligence connection to all of this? [01:21:06] Mossad. [01:21:07] It was absolutely Mossad. [01:21:08] You're with Eric on this. [01:21:10] Oh, it was a Mossad operation. [01:21:12] But that doesn't mean the CIA didn't know about it and give it its blessing. [01:21:15] You know, the CIA ran operations like this in the 80s, in the 90s. [01:21:20] These honeypot blackmail operations are not terribly new. [01:21:23] They were pioneered by the mafia. [01:21:25] I mean, this has been going on for a very long time. [01:21:29] But to have so many powerful people involved... [01:21:32] Oh, rich people. [01:21:33] Oh, yeah. [01:21:34] Sure. [01:21:34] I mean, that's the point.
[01:21:36] The point is, all of these institutions, whether it's the Council on Foreign Relations or all these things that people think are very nefarious, whatever it is, Bohemian Grove or Skull and Bones, what they really do is create a consensus amongst the wealthiest and most powerful people on the planet. [01:21:53] And that consensus becomes our foreign policy, our domestic policy, our corporate policy. [01:21:59] So it's just creating a consensus. [01:22:00] So if you put a bunch of young guys at Yale together and whatever they do, whatever bonding activities they do to become closer, or if you put them out in the woods in Monterey, California, at Bohemian Grove, or if you put them on an island committing crimes, [01:22:16] what this does is it fosters a sense of mutually assured destruction, a sense of a homogeneous ruling class. [01:22:32] I agree. [01:22:32] By the way, I should mention for the record, Ben is wearing a yellow Harvard shirt that he was given by a fat woman who's no longer fat because she had a gastric bypass. [01:22:41] Well, if you were more fat-forward, like your friend, you wouldn't even make that comment. [01:22:47] But I mean, I think I'm right about this, where it's like it's an intelligence op. [01:22:51] They were the face of it. [01:22:52] As they moved into the digital world, it became less necessary to have these physical honeypots. [01:22:57] Yeah, you tweeted about this. [01:22:59] The true tragedy of Ghislaine Maxwell is that it's just yet another human being losing out to automation. [01:23:05] They're being replaced by AI. [01:23:07] No matter what you think of Jeffrey and Ghislaine, it is a real issue. [01:23:10] Human trafficking used to involve humans. [01:23:12] Yeah. [01:23:13] There's a human cost. [01:23:15] This is absolutely the case. [01:23:17] Do you think I'm wrong about anything I've said?
[01:23:20] Well, I'm uncomfortable with the intelligence connection. [01:23:24] Why? [01:23:26] There's a book about Ghislaine's father called Israel's Superspy. [01:23:29] Yeah. [01:23:30] So the father is definitely shady. [01:23:32] Okay, so the father's in the Mossad. [01:23:34] No, I don't know about the Mossad thing. [01:23:35] You don't know about that. [01:23:36] There's something about the suicide or whatever, the death. [01:23:38] Well, yeah, but it's kind of widely believed. [01:23:41] You know, at the funeral there was, I forget, some big Israeli premier. [01:23:44] Like, I think it's widely believed. [01:23:46] But there's a difference between, for example, if I have a conversation with Putin, am I part of the FSB? [01:23:50] No. [01:23:51] But that's important. [01:23:53] You know, Elon Musk was once at a party with Ghislaine Maxwell, at the same party. [01:23:58] Everybody was at a party with Ghislaine Maxwell. [01:24:01] Well, yes. [01:24:01] And so the question is, what is the depth of the connection? [01:24:05] Well, there's a book about it. [01:24:08] I mean, there's a lot of books about a lot of stuff. [01:24:11] Well, but written by Seymour Hersh. [01:24:13] I mean, Ben, get the title of the book up, right? [01:24:16] What is the title of the book? [01:24:18] Written by one of America's greatest Pulitzer Prize-winning journalists. [01:24:20] Oh, yeah. [01:24:21] It's Gordon Thomas and Martin Dillon. [01:24:23] Okay. [01:24:25] Yeah, but also Seymour Hersh wrote something. [01:24:27] It might not have been a book, but Seymour Hersh wrote something as well about him. [01:24:32] But again, there's an entire book about it. [01:24:35] So there is evidence. [01:24:37] No, I know. [01:24:37] So I'm underinformed about this topic. [01:24:42] I just have a general wariness of calling people intelligence.
[01:24:46] Yeah, because it's so tempting. Whenever I, in my mind, feel a temptation to see a conspiracy in something, I hesitate. [01:24:58] So what do you think it would be if it wasn't intelligence? [01:25:00] What? [01:25:01] And let's say it's not the Mossad. [01:25:02] We'll back that out. [01:25:03] Yeah. [01:25:04] Which it was. [01:25:05] But also the CIA and the Mossad and Saudi intelligence are all, like, really tight. [01:25:10] And MI6. [01:25:11] They're like bros. [01:25:13] And then there's, like, Iranian intelligence and Russian intelligence and Syrian intelligence. [01:25:21] And they're, like, cool. [01:25:23] They're, like, friends. [01:25:24] Yeah. [01:25:24] So this is a view of the world with the intelligence organizations as ultra-competent. [01:25:30] Yes, they are. [01:25:31] Okay. [01:25:31] Well, my view is they're not ultra-competent. [01:25:35] Not all of them. [01:25:36] They're exceptionally well funded. [01:25:38] They're large organizations that have a lot of people pushing paper, and they are pretty effective at a lot of things, but these are not the A students of the world, I believe. [01:25:47] Okay, the A students don't run the world. [01:25:49] C students do. [01:25:51] And you don't have to be an A student if you're willing to kill someone. [01:25:54] Truly. [01:25:55] If you're a C student and you're willing to put a gun in someone's mouth, it's actually probably just as good as having an A. What would you say it is? [01:26:03] If it's not an intelligence blackmail operation, what would you say that it could be? [01:26:09] I would say it's one charismatic person, Jeffrey Epstein, that was evil and committed these things. [01:26:20] But that's the thing: everyone listening to this, you sound crazy on this show, which is why I enjoy doing it. [01:26:27] No, but think about it. [01:26:27] No, no, no.
[01:26:29] I'm a little bit playing devil's advocate here, so I don't fully believe this, but there's a sense where I feel that there's an important distinction to draw. [01:26:39] Jeffrey Epstein being a complete manufacture and manifestation of an intelligence agency, versus one charismatic evil person who, as he becomes more and more successful on an individual level, starts meeting powerful people, and then intelligence gets involved. [01:26:54] I'm not saying that it's necessarily the entire intelligence agency, right? [01:26:58] These intelligence agencies are bureaucracies, but they do have within them fiefdoms and factions of power. [01:27:04] You have bad actors, you have people that are outside of the agency officially that have all the skills and know all the people. [01:27:11] So it's not necessarily as easy as saying it's the entire CIA behind any of these things, right? [01:27:18] But there are groups of people. [01:27:19] All the intelligence agencies do is they work for billionaires, essentially. [01:27:23] They work for American corporate interests. [01:27:25] That's what the CIA does. [01:27:27] They've overthrown governments at the behest of American corporate interests. [01:27:31] The CIA works for wealthy people. [01:27:34] They essentially are the information, the muscle. [01:27:38] And yes, where we cannot do things overtly with our military and with U.S. policy and get things passed through Congress, you have a shadow organization that's able to do that. [01:27:51] That's not terribly controversial, right? [01:27:53] I mean, that's pretty well known. [01:27:55] If you look at all the coups, you know, in other countries, that are provable, maybe coups in this country... What's important for me to understand with this... [01:28:04] I don't think it's controversial.
[01:28:06] The question is the scale at which, like, how many people are involved, how much of it is at the core of what the CIA does, versus this is a few people. [01:28:16] Well, the core of what the CIA does probably is largely, you know, they gather a lot of information and analyze it. [01:28:23] Yeah. [01:28:23] But then it's like the core of any business, right? [01:28:27] So if you looked at Walmart, the core of Walmart is you going in and buying socks and someone taking your money, you know, but then there's a group of people in Walmart making decisions and charting the direction of the company and what they're going to do for the next five or 10 years. [01:28:45] So I think the core of these institutions can be relatively benign, but there's, again, power factions within them that are not all homogeneous, right? [01:28:55] There's people in the CIA that had no idea what was going on. [01:28:58] There's people that would be disgusted by it. [01:28:59] There's people that would be repulsed by it. [01:29:01] There's no getting around this fact. [01:29:04] Blackmail has been an essential part of the recruitment of agents. [01:29:13] It's been an essential part of maintaining relationships with people. [01:29:18] This is how intelligence people have often found leverage to gain information. [01:29:24] That is true. [01:29:25] Yeah, that's very true. [01:29:26] My question is, is the amount of that being done increasing or decreasing? [01:29:34] So ultimately, my question is about... whenever you think intelligence agencies are running the most powerful people in the world, that they're blackmailed and everything is being controlled by certain very powerful figures, you're very naturally led to a cynicism about the future of the world that deeply troubles me. [01:29:55] So I want to land on optimism with people. [01:29:59] Have you tried drugs? [01:30:02] No, drugs are very good for this, because they create optimism.
[01:30:07] Well, they make you feel a way that reality might not. [01:30:10] No, I don't think so. [01:30:11] I think drugs... [01:30:12] I'm not saying everything's cynical. [01:30:13] I'm not saying everybody's controlled. [01:30:15] What I am saying is that one cannot bury their head in the sand when there is an island, and a mansion in New York City, that has recording equipment. [01:30:26] You have a guy that was let off the first time with a very sweetheart deal who's palling around with prime ministers, kings, and presidents. [01:30:34] It's the definition of "this looks bad." [01:30:38] But I would argue it's the Nietzsche thing: gaze into the abyss, and the abyss gazes back into you. [01:30:43] I would argue, if all you study is the Jeffrey Epstein case, you will see devils everywhere. [01:30:50] I don't think there's devils everywhere. [01:30:52] So this is the burden of the... that's the QAnon thing. [01:30:56] Well, that's also, I mean, Jones talks about this. [01:31:00] Alex is crazy. [01:31:01] I love Alex, but he's nuts. [01:31:03] When you constantly analyze this stuff, all you see is red. [01:31:07] Understood. [01:31:08] And so I... [01:31:09] If you don't see any red, that's also a problem. [01:31:12] If there's no devils, you go, well, I don't think there's any devils. [01:31:15] That's an issue. [01:31:16] There's definitely a balance to strike there. [01:31:19] It seems unsupported that this is one evil person and another evil person together with no larger network. [01:31:28] No, no, no. [01:31:29] The network builds over time. [01:31:30] Right. [01:31:31] And a lot of it is parties and handshakes and so on, and you greet each other, politicians. [01:31:38] I'm sure. [01:31:38] So, okay. [01:31:39] One thing that seems to be the case, because I've known people that interacted with Jeffrey Epstein: he seems to have completely deluded them. [01:31:49] Like, the charisma is there, and it doesn't have to do with money or women.
[01:31:57] At the core level, it has to do with straight-up, in-the-room-together charisma. [01:32:02] That kind of guy. [01:32:02] Yeah. [01:32:03] And so I just believe that one charismatic guy can do a lot of damage in this world without first being manufactured as a strategic deployment onto the United States by some intelligence agency. [01:32:21] Now, over time, as he interacts with more and more powerful people, he probably got some phone calls and probably had some meetings. [01:32:29] I don't even think I would say that he was deployed. [01:32:31] I don't believe that these agencies necessarily operate the way we're thinking of them, as, you know, custodians of the national interest, which I don't think they are, right? [01:32:43] I think they are kind of for-hire organizations, where there's certain people in those countries that have a lot of money, that have an idea about the way things should be run, that, you know, kind of use these agencies. [01:32:57] I don't necessarily think that all of these agencies, all the time, are focused on the betterment of any one particular nation-state. [01:33:08] I think there's a certain group of people that have enough money and enough power that they use these agencies as ways to sustain that position in society. === CEOs Who Own Diners (04:55) === [01:33:20] So I think Epstein, whether it was in Tel Aviv or in London or in Manhattan, was having conversations with these people, whose loyalty is really not to any particular country, but to their own interests. [01:33:34] I just have to say that I am really troubled, on a personal level, that there's a lot of people I respect that, when they were in a room with Jeffrey Epstein, were not able to see through it, were not able to see evil, did not have the integrity to, because they were looking at pussy. [01:33:52] Well, this is the thing. I mean, maybe I'm built different.
[01:33:57] This is, isn't that like a Ford thing or something? [01:34:00] Built, yeah. [01:34:07] I just think you can't let pussy or money delude you and destroy your integrity. [01:34:13] Integrity should be above all else. [01:34:15] And when I see that not being the case with Jeffrey Epstein, like people that have spoken with Jeffrey Epstein, it really troubles me that his charisma and manipulation revealed the people who lack integrity, in my view. [01:34:33] Now, I could be naive, because you could be the next Jeffrey Epstein and Ben could be the next Ghislaine Maxwell. [01:34:41] And here I am doing this beautiful podcast, and I'm even a fan. [01:34:46] So maybe it's your charisma that's... Nobody's giving me an island. [01:34:50] Nobody's giving me an island, and nobody's giving me one of the largest pieces of real estate in Manhattan. [01:34:56] This is also the problem. [01:34:57] When you look at this, you know, as just this independent thing, you look at, well, why did Les Wexner, who was the head of Limited Brands, gift Jeffrey Epstein these massive, insanely desirable pieces of real estate? [01:35:17] The one in New York City, right? [01:35:20] What is the connection there? [01:35:22] That connection has never been explored. [01:35:24] Yeah, that one. [01:35:25] Yeah. [01:35:25] People have spoken about that; it's really shady. [01:35:28] Well, it's just a shady connection, right? [01:35:30] So you have this guy who is a well-established human trafficker, and you have his biggest financial benefactor not questioned at all in any meaningful way about this. [01:35:44] I mean, you can be as averse to conspiracy theories as you want, but a lot of times, unfortunately, they just become the news. [01:35:54] And this has kind of become the news. [01:35:59] Yeah. [01:36:00] Yeah, it's become the news. [01:36:01] But I understand your thing. [01:36:02] I think your thing's valid. 
[01:36:04] The QAnon movement is psychotic people whose minds melted when they got a few nuggets of truth, and then they concocted Disney-esque, childish stories about good and evil because they wanted everything in life to have the clarity and certitude of religion. [01:36:26] And they created this thing where everyone in Hollywood's evil and everyone's a pedophile. [01:36:31] And then dead people are going to come back, and JFK Jr.... It was a very religious movement. [01:36:36] That's what happens to people when they see devils everywhere and they're unable to differentiate fact from fiction. [01:36:44] And on the other side, there's similar things. [01:36:47] The scientific community has really been disappointing to me with the level of arrogance, how they dismiss every conspiracy theory. [01:36:54] They dismiss basically everything as a conspiracy theory that's not, like, very narrowly defined public policy. [01:37:03] I watched your interview with the CEO of Pfizer, that guy, the Greek guy who owns a diner. [01:37:09] And what I liked about that interview was that he was a... No, but he's a diner guy. [01:37:20] You've offended the Greeks. [01:37:22] He's a diner guy. [01:37:24] Even though he's the CEO of Pfizer, he literally owns a diner in Astoria, Queens, where he brings out spanakopita and he yells at his wife and their nephews who work at the diner. [01:37:35] By the way, that's the reality. [01:37:36] I'm guessing his son is a fan of yours, because he mentioned his son likes the podcast. [01:37:41] He's a fan of this podcast or my podcast? [01:37:44] And I'm guessing. [01:37:44] It doesn't mean they're a fan of me. [01:37:46] I mean, very smart people listen to you. [01:37:47] That doesn't mean they come over to me. [01:37:49] They should. [01:37:52] Yeah, that's the next step. [01:37:54] But my advice to him is less listening to Lex Fridman and more getting people mozzarella sticks in your father's diner. [01:37:59] Get back to work. 
[01:38:00] Because when I see that guy, you go, oh, he's the CEO of Pfizer. [01:38:02] I go, this is a guy who owns a diner. [01:38:06] And it's a struggling diner, but it's good. [01:38:09] Yeah, they get mileage out of the coleslaw. [01:38:12] Let me finish up with this. [01:38:13] Ben, how long have we done? [01:38:14] We've done a while. === Letting Your Bluebird Out (02:48) === [01:38:16] Can I just say before you finish up how much I love Ben? [01:38:21] He's really the best part of your podcast. [01:38:23] Ben is a... that's not true. But Ben is a... no, I love Ben. [01:38:28] Ben is essential for me to do what I do. [01:38:30] No, he's just a kind soul. [01:38:32] He's a kind soul. [01:38:32] He's a good-hearted person. [01:38:34] He's a good-hearted person. [01:38:36] And he works very, very hard. [01:38:39] And he's one of these people where I'm lucky to have a Ben. [01:38:43] Not everybody has a Ben, right? [01:38:45] So I'm lucky to have him. [01:38:47] Now, that being said, he must be watched very closely. [01:38:52] Harvard. [01:38:53] He must be watched. [01:38:53] Oh, you know who else wore a Harvard sweatshirt? [01:38:55] Jeffrey Epstein. [01:38:57] Yeah. [01:38:58] Is that the reason you...? [01:39:00] Yeah. [01:39:00] But Ben must be watched like everyone. [01:39:03] They must be surveilled. [01:39:05] Yeah. [01:39:06] So you do like the Chinese government after all. [01:39:08] I love the Chinese government. [01:39:12] And they're advertising on my podcast next month. [01:39:15] No, here's what I say about them. [01:39:17] They're going to be tough to beat. [01:39:19] Yeah. [01:39:19] They're going to be hard to beat because they're willing to do things that we might not be willing to do. [01:39:26] And that's, you know, I don't know. [01:39:30] Can I read you a poem? [01:39:31] Yes. [01:39:32] Because this poem makes me think of you. [01:39:34] Yes. [01:39:36] I like this. [01:39:37] I haven't even heard it, but I'm excited. 
[01:39:39] And then I'll read you a poem. [01:39:41] So you go first. [01:39:42] Okay. [01:39:44] It's The Cat in the Hat. [01:39:45] All right. [01:39:46] Well, this is probably my favorite Bukowski poem. [01:39:48] And it's exactly about you. [01:39:50] Oh, that's very sweet. [01:39:51] There's a kind of Bukowski-esque aspect to your career and to who you are. [01:39:58] Yeah. [01:39:59] Okay. [01:40:00] It's called Bluebird. [01:40:02] There's a bluebird in my heart that wants to get out, but I'm too tough for him. [01:40:07] I say, stay in there. [01:40:08] I'm not going to let anybody see you. [01:40:11] There's a bluebird in my heart that wants to get out, but I pour whiskey on him and inhale cigarette smoke. [01:40:18] And the whores and the bartenders and the grocery clerks never know that he's in there. [01:40:23] There's a bluebird in my heart that wants to get out, but I'm too tough for him. [01:40:28] I say, stay down. [01:40:30] Do you want to mess me up? [01:40:31] You want to screw up the works? [01:40:34] You want to blow my book sales in Europe? [01:40:37] There's a bluebird in my heart that wants to get out, but I'm too clever. [01:40:41] I only let him out at night sometimes when everybody's asleep. [01:40:46] I say, I know that you're there, so don't be sad. [01:40:50] Then I put him back, but he's singing a little in there. [01:40:54] I haven't quite let him die. [01:40:56] And we sleep together like that with our secret pact. [01:41:00] And it's nice enough to make a man weep, but I don't weep. === A Secret Pact with Bukowski (01:01) === [01:41:04] Do you? [01:41:07] Charles Bukowski. [01:41:08] That's a great poem. [01:41:09] Now I want to do a poem for you. [01:41:11] Yeah. [01:41:11] That was, I love that. [01:41:14] It's an encouragement to let your bluebird out. [01:41:18] Yeah. [01:41:19] It's a little bit of love. [01:41:22] Okay. [01:41:22] I mean, I want to find this one. [01:41:27] I'm almost scared. [01:41:32] Are you a Bukowski fan? 
[01:41:34] Yeah. [01:41:37] Badass bitch. [01:41:39] Bad attitude. [01:41:41] Nails done. [01:41:42] Hair done. [01:41:43] Ass too. [01:41:44] Your baby daddy fucking me and sucking me. [01:41:48] He don't answer you, bitch. [01:41:49] That's because of me. [01:41:54] Who's the author of this masterpiece? [01:41:57] Yung Miami. [01:41:59] Okay. [01:42:00] Well, thank you so much for having me on your podcast. [01:42:02] Thank you for having me. [01:42:04] Thanks, Lex Fridman, everyone.