Knowledge Fight - #877: A Philosophy For Curiosity Aired: 2023-12-13 Duration: 02:16:19 === Welcome Back To Knowledge Fight (02:20) === [00:00:16] Dan and Jordan, I am sweating. [00:00:19] Knowledgefight.com. [00:00:20] It's time to pray. [00:00:21] I have great respect for knowledge fight. [00:00:24] Knowledge fight. [00:00:25] I'm sick of them posing as if they're the good guys. [00:00:27] Saying we are the bad guys. [00:00:29] Knowledge my fight. [00:00:30] Dan and Jordan. [00:00:31] Knowledge fight. [00:00:35] I need money. [00:00:39] Andy and Panzer. [00:00:42] Andy in Kansas. [00:00:43] Andy in Kansas. [00:00:45] Andy. [00:00:46] It's time to pray. [00:00:47] Andy in Kansas. [00:00:48] You're on the air. [00:00:49] Hello, Alex. [00:00:50] I'm a first-time caller. [00:00:51] I'm a huge fan. [00:00:51] I love your show. [00:00:52] Knowledge fight. [00:00:55] Knowledgefight.com. [00:00:58] I love you. [00:00:59] Hey, everybody. [00:01:00] Welcome back to Knowledge Fight. [00:01:01] I'm Dan. [00:01:01] I'm Jordan. [00:01:02] We're a couple dudes. [00:01:02] Like to sit around, worship at the altar of Celine, and talk a little bit about Alex Jones. [00:01:06] Oh, indeed we are. [00:01:07] Dan. [00:01:08] Jordan. [00:01:08] Dan. [00:01:09] Jordan. [00:01:09] Quick question for you. [00:01:10] What's up? [00:01:11] What's your bright spot today, buddy? [00:01:12] My bright spot today is Shohei Ohtani. [00:01:14] Got a giant deal. [00:01:15] No, it's going to try and steal yours in case. [00:01:18] Why would I? [00:01:19] Yeah. [00:01:19] I don't know. [00:01:20] I guess it's sort of, I don't have a good bright spot, so I have sort of a bi-pronged kind of comment. [00:01:30] The first is that the cheese advent calendar hasn't been as good as I expected. [00:01:34] That is expected. [00:01:36] Yeah. [00:01:36] That is exactly as good as I expected. [00:01:38] When we talk about the wonderful positive things about Aldi, it's important to also bring up some negatives. 
[00:01:43] And that is that these cheeses have not been blowing my mind. [00:01:46] That's not to be surprised. [00:01:47] But it's still fun to open a little door and all that stuff. [00:01:50] It is. [00:01:51] It is. [00:01:52] There's no denying that. [00:01:53] The second thing, the second prong in this makeshift bright spot is you getting dragged on social media for saying that All-Star was a cover. [00:02:05] How dare you? [00:02:06] I 100% thought it was I'm a Believer. [00:02:09] Wow. [00:02:10] No, I was pretty sure you were wrong, but I didn't have enough confidence to push back on it. [00:02:18] 100%. [00:02:19] And then you got the, I mean, it wasn't as bad as the Chumbawamba article, but you still got quite a bit of correction on that. === Smash Mouth's Grammy Wonder (03:21) === [00:02:27] Oh, well, the problem was I got the accidental correct answer on the first try, and then it was all over. [00:02:36] I was like, there can't be that many Smash Mouth Grammy-winning movie soundtrack songs. [00:02:44] And there are more than one. [00:02:45] Because you got Shrek. [00:02:46] I'm a Believer cover. [00:02:47] Exactly. [00:02:48] That's why I was like, movie songs, Smash Mouth, my brain made it. [00:02:52] I'm a Believer. [00:02:52] Right. [00:02:53] And then they also won a Grammy for might as well be walking on the sun. [00:02:58] I don't understand. [00:02:59] That was in Schindler's List. [00:03:01] It was. [00:03:01] It's so strange. [00:03:03] It's so strange whenever people were like, oh, you got the song wrong. [00:03:07] And I was like, wait a second. [00:03:08] We should really be talking about how much Smash Mouth did. [00:03:13] That is insane to me. [00:03:14] Yeah, it's true. [00:03:15] It is insane. [00:03:17] They had a really interesting trajectory as a band because you had Walkin' on the Sun, which was a big hit, but that was like, veering in the direction of one-hit wonder of that era. [00:03:28] That was like, this is great. 
[00:03:29] You're going to be like Fastball. [00:03:31] Yeah. [00:03:33] Perfect. [00:03:33] Yeah. [00:03:34] Yeah, The Way. [00:03:35] Everybody loves that song. [00:03:36] And then go away. [00:03:37] Right. [00:03:37] Yes. [00:03:38] Or even something like Kula Shaker with Tattva. [00:03:44] Sure, sure. [00:03:45] Yes. [00:03:47] That was one-hit wonder territory, and they had nailed it. [00:03:50] Yeah. [00:03:50] And then they had the All-Star, which was a song that I would say the only thing that I can think of that comes close to it in the modern time is something like DMX's Y'all Gonna Make Me Lose My Mind. [00:04:04] Yeah, yeah. [00:04:05] They're like in every trailer and commercial and all these things. [00:04:09] So they got elevated into this like really bizarre territory. [00:04:13] Yeah, yeah. [00:04:13] And then they did the Shrek soundtrack. [00:04:16] Yeah, because it's not like a music, it's not like a Hey Ya moment where everybody goes, oh my God, this music video or the music or Andre 3000. [00:04:24] It's like they have put All-Star on every commercial, in every movie, in every in the store when I walk down. [00:04:32] I also think that there's a dramatic difference between the Hey Ya kind of moment of being like that maybe gets to a wider audience and then people can realize how great OutKast's catalog is earlier on. [00:04:44] Absolutely. [00:04:44] Whereas I don't think that All-Star led to a rediscovery of Smash Mouth's album cuts. [00:04:51] No offense to Smash Mouth. [00:04:53] They were fucking bad. [00:04:54] It's a critical reinvention of Smash Mouth. [00:04:58] Yeah. [00:04:59] But anyway, you were wrong. [00:05:00] I was thoroughly wrong. [00:05:01] Happy to admit it. [00:05:02] What's your bright spot? [00:05:03] My bright spot is this is it's weird to say this because I'm not a big economics guy, but I will own my car in like I will have paid off my car in like two payments. [00:05:19] Okay. 
[00:05:19] Which has never happened to me before. [00:05:21] That's never happened before. [00:05:22] Congratulations. [00:05:23] Of all the cars, I don't think one, well, actually, no, I won't own it. [00:05:27] It's still in my wife's name. === Cars Vanished and Loans Forgiven (02:23) === [00:05:28] Everything is in my wife's name. [00:05:29] I own zero things. [00:05:31] That's not suspicious. [00:05:33] I'm off the grid completely. [00:05:36] But in order, my cars have disappeared via hit and run. [00:05:43] The last one that I had in Chicago, I parked on the street, and then it was hit by a truck and totaled completely. [00:05:51] And then I came out one morning, I saw it, and then the cops came. [00:05:55] And God, I love the Chicago cops whenever something horrible happens to you because they laugh in your face. [00:06:00] Sure. [00:06:00] That was great. [00:06:02] And then after that, it was stolen. [00:06:05] See, that's more of a disappearance than a hit and run. [00:06:07] I mean, the car is still there. [00:06:08] No, no, no, no. [00:06:09] Well, I mean, yeah, yeah, yeah. [00:06:11] This was the car before that. [00:06:12] And then the car before that was my dad's red truck. [00:06:15] Yeah. [00:06:16] All of these cars never been mine and have gone horribly wrong. [00:06:20] Well, this one still isn't. [00:06:22] Still, that's true. [00:06:23] That's true. [00:06:24] But I have paid it. [00:06:26] I have paid for it with monies. [00:06:28] I mean, it is an accomplishment, and it is something to be excited about. [00:06:32] But it is still your wife's car. [00:06:34] I understand. [00:06:34] But I mean, in 40 years, I'll be able to say, hey, I paid off my student loans. [00:06:39] And that's what will be important. [00:06:41] Do you have that much student loans? [00:06:43] Yep. [00:06:44] You went to like one year of college. [00:06:46] No. [00:06:46] If you recall, I went to five colleges. [00:06:50] Dude, that's true. 
[00:06:50] But never longer than a year. [00:06:52] That's the problem. [00:06:54] The problem was not that I didn't go to enough college. [00:06:58] Yeah. [00:06:59] I do think that if you have a path like that, you definitely should have your loans forgiven. [00:07:04] Exactly. [00:07:04] Like, you've got nothing to show for it. [00:07:06] It's not fair. [00:07:07] Yeah. [00:07:07] It's not fair. [00:07:09] And I mean, the other problem was just like, you know, I could get a degree from somewhere, but they want you to stay there at least two years. [00:07:17] And then I was like, I don't want to do that. [00:07:18] So I just left. [00:07:20] That was probably a bad move, but it's hard to go back now. [00:07:22] Yeah. [00:07:22] You can't relitigate. [00:07:24] So, Jordan, today we have an episode to go over. [00:07:26] Yeah. [00:07:27] On our last episode, towards the end, we were alerted to the fact there was a Twitter Spaces going on featuring Alex Jones and some other noted luminary shitheads. === Party Call Politics (03:53) === [00:07:36] Oh, there is a last time on. [00:07:38] Yeah. [00:07:38] Yeah. [00:07:39] So today we are covering that because I wrestled with this. [00:07:44] So first of all, I don't think a lot of people maybe even know what Twitter Spaces is. [00:07:48] I don't. [00:07:49] So I tried to explain it to my dad earlier. [00:07:52] And the way best I could come up with is it's like a party call, like a party phone line. [00:07:58] You know, there's a like a big group phone call. [00:08:00] Sure, sure, sure. [00:08:01] But streamed on Twitter. [00:08:02] So, like, on Twitter, you can have a bunch of people who are on the line, and then they can all take turns speaking and some sort of a group call. [00:08:09] Okay. [00:08:09] Dude, wait. [00:08:11] Do they call it SpaceX or Spaces? [00:08:15] Right, but I mean, it's not Twitter anymore. [00:08:17] Oh, no, it's X Spaces. [00:08:18] Yeah, I guess. [00:08:19] Oh, no, that's stupid. 
[00:08:20] Well, the other name's already taken. [00:08:22] Yeah, it's his name. [00:08:23] So, yeah, I think that would get confusing. [00:08:26] So I wrestled with this because I think it's stupid and I found it's three hours long and I found it insufferable. [00:08:34] No, no, no, no. [00:08:35] It felt like it would never end. [00:08:38] But the list of people who are in the mix here who are involved in this conversation, it doesn't feel like something I should just ignore. [00:08:46] It feels like, you know, when you have someone like Marjorie Taylor Greene coming on InfoWars the first time, it's like, this is relevant. [00:08:54] This is a person who is in Congress who is on Alex's show. [00:08:57] Or when Vivek comes on Alex's shit, you know, or I guess actually Alex was interviewed by Vivek for his shit. [00:09:05] You know, it is like someone is running for president. [00:09:08] These intersections of like what would be respected positions, people in those supposedly respected positions, it feels like I would be remiss if we just ignored that this happened. [00:09:20] Right. [00:09:20] That's true. [00:09:22] And there's one thing that's completely insane that happens. [00:09:26] That I believe. [00:09:26] So maybe it's worth it for that. [00:09:28] Just for that. [00:09:29] Oh, man. [00:09:30] It's hard not to be like, I understand it is newsworthy that we are surrounded by burning lakes of fire, but we are in hell. [00:09:41] So it's not that newsworthy, you know? [00:09:43] Well, it's giving some sort of window dressing. [00:09:48] It's letting you know what's going on. [00:09:50] Sure, sure, sure. [00:09:51] Showing you the lake. [00:09:51] Right, right, right, right. [00:09:53] So let's look at that lake. [00:09:54] Okay. [00:09:55] But first, let's say hello to some new wonks. [00:09:57] Okay. [00:09:57] So first, that moose that got the parking ticket. [00:09:59] Thank you so much. You are now a policy wonk. [00:10:01] I'm a policy wonk. 
[00:10:02] Thank you very much. [00:10:02] This is a Woody Allen bit. [00:10:05] Yes. [00:10:05] Next, Ben from Canada. [00:10:07] Thank you so much. You are now a policy wonk. [00:10:08] I'm a policy wonk. [00:10:09] Thank you very much. [00:10:10] Thank you. [00:10:10] Next, I know these get delayed a bit, but here's hoping we're not too far into the monthly mustard tasting. [00:10:16] Looking forward to your stone ground Dijon. [00:10:18] Thank you so much. You are now a policy wonk. [00:10:19] I'm a policy wonk. [00:10:20] Thank you very much. [00:10:23] Next, Maestro Jones says Bear Dog is number one dog. [00:10:26] Thank you so much. You are now a policy wonk. [00:10:28] I'm a policy wonk. [00:10:29] Thank you very much. [00:10:30] And we got a couple technocrats in the mix, George. [00:10:32] So thank you so much to Beats by Nim appearing against the advice of my publicist Odette. [00:10:37] You are now a technocrat. [00:10:38] And I became a technocrat just to publicly shame Dan for his garbage Survivor takes. [00:10:42] Thank you so much. [00:10:43] You are now a technocrat. [00:10:44] I'm a policy wonk. [00:10:45] I have risen above my enemies. [00:10:49] I might quit tomorrow, actually. [00:10:50] I'm just going to take a little breakie now. [00:10:52] A little breakie for me. [00:10:56] And then we're going to come back. [00:11:00] And I'm going to start the show over. [00:11:02] But I'm the devil. [00:11:03] I got to be taken off here. [00:11:04] I've been all this. [00:11:06] Fuck you. [00:11:07] Fuck you. [00:11:08] I got plenty of words for you, but at the end of the day, fuck you and your new world order and fuck the horse you rode in on and all your shit. === Garbage Survivor Takes Confusion (03:56) === [00:11:16] Maybe today should be my last broadcast. [00:11:19] Maybe I'll just be gone a month, maybe five years. [00:11:22] Maybe I'll walk out of here tomorrow and you never see me again. [00:11:26] That's really what I want to do. 
[00:11:28] I never want to come back here again. [00:11:30] I apologize to the crew and the listeners yesterday that I was legitimately having breakdowns on air. [00:11:37] I'll be better tomorrow. [00:11:39] Never. [00:11:40] So I feel attacked. [00:11:42] I'm not entirely sure what these garbage survivor takes are. [00:11:44] That's what I was interested in. [00:11:46] That's why I left it in there. [00:11:47] They're unspecific, so I can't respond to any of them. [00:11:49] But I do not have garbage survivor takes. [00:11:52] Well, the reason I left it in there, I'm interested to know what do you think those could even be? [00:11:58] I feel like I probably was disparaging about this season. [00:12:01] Okay. [00:12:02] And I stand by that. [00:12:03] Okay. [00:12:03] I think I've been disparaging of a number of seasons so far. [00:12:06] Wow. [00:12:06] Like, especially the recent seasons, I think they have been dog shit. [00:12:10] Okay. [00:12:10] There's a problem that they've shortened the show. [00:12:13] Sure. [00:12:13] So, like, I think that probably had something to do with COVID. [00:12:16] Yeah. [00:12:17] Wow, I mean, yeah. [00:12:17] So they're only out there for a shorter amount of time. [00:12:21] So the game is played faster. [00:12:23] And that kind of sucks. [00:12:24] Yeah. [00:12:24] There's too many advantages. [00:12:26] There's too many weird twists. [00:12:28] It's just, it's gone away from what made the show alluring. [00:12:32] Right. [00:12:33] And also, I think the casting has been not great a couple seasons. [00:12:39] I don't know. [00:12:40] I'm just not into it. [00:12:41] Yeah. [00:12:41] I'm still watching it, but I'm not into it. [00:12:43] That's what, I mean, I guess what confuses me about the whole garbage takes thing is essentially you're like, everything's not as good as it should be. [00:12:51] Yeah. [00:12:52] Nah, that's not really a garbage take. 
[00:12:53] No, but I think this person probably thinks that it is as good as it should be. [00:12:57] And maybe it's Jeff Probst. [00:12:59] You know what? [00:13:00] If this is Jeff, stop listening to the show. [00:13:04] You don't have enough time. [00:13:05] Here's something I want to say about Survivor. [00:13:07] Yeah. [00:13:07] On a recent episode, they did something with editing that I have never seen them do before. [00:13:13] And I found it to be absurd. [00:13:16] So there was a challenge. [00:13:18] Immunity and reward challenge, I believe, or maybe just a reward. [00:13:21] Something like that. [00:13:22] And some woman won, and she decided to take all the women who were left with her on the reward. [00:13:30] So they went to a spa or got food or something. [00:13:33] Yeah, yeah. [00:13:34] And so the men stayed behind at camp. [00:13:36] Right. [00:13:36] Right? [00:13:37] Yeah, yeah. [00:13:37] And so then the men are like burping and like having bro time and stuff. [00:13:42] Great, I guess. [00:13:43] That I have seen before. [00:13:44] Right. [00:13:45] That has happened in Survivor past. [00:13:46] But how they edited it together. [00:13:48] This turned into a montage of them doing bro stuff with the song Playing with the Boys by Kenny Loggins. [00:13:57] I'm sorry, what? [00:13:58] Over it. [00:13:58] The song from the volleyball scene. [00:14:00] Oh, I know. [00:14:01] Oh, I know my Kenny Loggins. [00:14:03] I don't remember the last time they had a licensed song for a montage. [00:14:10] And it was just to do Playing with the Boys. [00:14:14] It was bizarre. [00:14:15] I had no idea what they were doing. [00:14:17] I like that. [00:14:17] I like that because there's a story behind that. [00:14:19] And you know that one of the editors had to get permission. [00:14:23] They had to go through. [00:14:24] They had to call. [00:14:25] It was a whole thing. 
[00:14:26] And that's because somebody had a wild hair up their ass that was like, hey, you know what? [00:14:30] We don't normally do montages like this. [00:14:32] But I think it's time to reach for the stars. [00:14:35] It's completely out of line with the way this show is normally edited. [00:14:38] But after chasing sunsets, one of life's greatest joys is playing with the boys. [00:14:44] You got to do it. [00:14:45] And that's kind of what's going on on this show. [00:14:47] Oh, God. [00:14:47] Is that your chase? [00:14:49] That's playing with the boys. [00:14:51] It is a lot of dudes. [00:14:53] Oh, my God. [00:14:54] What are they even doing? [00:14:55] What do they do? === Why Applaud Elon? (15:43) === [00:14:56] What are they doing? [00:14:57] Are they talking about something or are they just like, we're here. [00:15:02] As best I can tell, there's a character named Mario. [00:15:05] I'm not entirely sure who he is, but he seems to be the host of this space. [00:15:09] Oh, I thought we were doing a bit. [00:15:11] No. [00:15:11] I thought we were going to get to jumping. [00:15:13] Nope. [00:15:13] Nope. [00:15:14] His name is actually Mario. [00:15:15] Mario something. [00:15:16] I'm not sure. [00:15:16] Gotcha. [00:15:17] He plays very little role in this except as a periodic questioner of people. [00:15:22] Okay. [00:15:23] And he tells people sometimes that their mic is on. [00:15:25] Okay. [00:15:26] That shouldn't be. [00:15:27] Wow. [00:15:27] And does not do a great job of keeping things on the rails or moderating or anything like that. [00:15:32] So he's doing this space. [00:15:34] Yeah. [00:15:34] And from what I gather, the intention is to have Alex on because he is now back on Twitter. [00:15:41] Right. [00:15:42] And we'll talk about how great it is that he's back, ask some questions and stuff. [00:15:46] And then as it is going on, tons of other people just end up showing up because it's on Twitter and you could just ask to join. 
[00:15:56] You have a whole. [00:15:57] So you end up with Jackson Hinkle, Matt Gaetz, Patrick Bet-David, and others who end up showing up. [00:16:06] It's a who's who of dum-dums. [00:16:08] So you can, okay, so then the more general concept is anybody can like ask to be on. [00:16:15] Yes, and like you screen them or not. [00:16:18] Right. [00:16:18] So it could be anybody. [00:16:19] It's not like you have to have a number or you have to have anything like that. [00:16:23] You're watching, and you can participate in an instant if they let you. [00:16:28] I saw like a screenshot of this on Twitter when it was happening on Sunday. [00:16:34] And one of the faces that I saw as one of the speakers was Chase Geyser. [00:16:38] God damn it. [00:16:39] And it was like, what? [00:16:40] He ranks? [00:16:41] He's relevant enough to be in this conversation. [00:16:44] And then I realized as I listened to the beginning of this, Alex's account has been reinstated, but some of the features don't work. [00:16:51] So he can't join Twitter Spaces with his account because that's still coming online. [00:16:56] So he's using Chase Geyser's account. [00:16:58] So that's a lot of fun. [00:16:59] Excellent. [00:17:00] So we start here with Mario giving a little bit of a self-congratulatory premise of the interview. [00:17:10] First question that I have for you, we've got a pretty incredible panel. [00:17:13] A lot of people that you probably know. [00:17:15] We're going to go to the panel in a bit as well for questions. [00:17:19] But, you know, first I want to applaud you for coming here. [00:17:22] You could have gone anywhere, but you came on a platform that, you know, we ask the hard questions. [00:17:27] We always try to represent all sides of every discussion and avoid echo chambers. [00:17:31] And that was the first place you came on as you got reinstated on X. [00:17:35] So I want to applaud that step. [00:17:38] First question that I have for you. 
[00:17:40] Thank you for coming on my show. [00:17:41] You should be back. [00:17:42] So proud of you. [00:17:43] Well, I've got to say, it's not about me. [00:17:47] It's about everybody having the courage to stand up for free speech, particularly Elon Musk, Tucker Carlson, and others. [00:17:53] And because when we put up with somebody else being censored, for whatever reason, the media says, or whether something they said that was wrong or something that hurts our feelings, when we censor somebody else or go along with it, we end up censoring everybody else, including ourselves. [00:18:10] So I want to applaud everybody for after five, six years of this nightmare for waking up. [00:18:15] I don't know who this Mario guy is per se. [00:18:20] So I don't know if he does generally ask the hard questions and avoid echo chambers. [00:18:25] But I can tell you based on the experience of listening to this, that is a fraudulent presentation of how this show goes down. [00:18:33] I mean, it's very... [00:18:37] I just want to applaud you. [00:18:39] Me. [00:18:40] For having the courage to come on my show. [00:18:43] Mm-hmm. [00:18:44] You're brave. [00:18:45] So amazing. [00:18:47] Because, you know, what I do is I go hard in the paint. [00:18:50] I'm maybe the best interviewer that's ever been. [00:18:53] No softballs around here. [00:18:54] People don't want to. [00:18:56] I mean, people don't want to come on my show. [00:18:58] I'm so good. [00:18:59] Now, first question, what does it feel like to be back on Twitter? [00:19:01] You're the greatest human being that ever lived. [00:19:03] How do you feel about that? [00:19:05] So, look, here's the deal. [00:19:07] Yeah. [00:19:08] You're going to hear a lot of Alex kissing some Elon Musk ass in this. [00:19:13] And it starts pretty early. [00:19:14] Wow. [00:19:15] So in your interview with Paco, you did express concern on what could happen to you. 
[00:19:19] So you expressed understanding on why Elon didn't bring you back. [00:19:22] Obviously, that didn't last long. [00:19:24] And you also had concerns on what could happen to Elon if he does reinstate your account. [00:19:28] Now that you're back, do you expect Elon, who also quote, I think he replied to Brian, who's on stage, one of his replies, he said, I expect revenue to potentially drop from this decision. [00:19:38] Do you expect there to be a tax or pressure on Elon after he reinstated you? [00:19:43] Well, I think he said that yesterday. [00:19:46] I don't have the comment on X in front of me, but he basically said it's the right thing to do because the principle, and it's also what the majority of the vote said. [00:19:56] So, this probably will hurt X monetarily, but justice be done, may the heavens fall, basically. [00:20:02] But at the end of the day, it already has way more traffic, way more people. [00:20:06] It's avant-garde. [00:20:08] It's revolutionary. [00:20:10] It's rebel. [00:20:11] It's Maverick. [00:20:11] And that's what Elon Musk has always said he is. [00:20:14] And he's done all these Maverick things. [00:20:16] And people always say, well, he didn't really do that. [00:20:18] But I went back and researched it. [00:20:20] He did do that. [00:20:20] All these incredible successes. [00:20:22] I'm not kissing his ass. [00:20:23] It's true. [00:20:24] And the fact that he could take over Twitter, free it, and bring back one of the most demonized people, if not the most demonized in the world. [00:20:33] Talk about a witch hunt. [00:20:34] Shows he put his money where his mouth is and is the Mavericks Maverick. [00:20:39] I mean, nobody can argue now that this guy is not cutting-edge Maverick. [00:20:44] He's Maverick. [00:20:46] And I would just like to point out something that maybe I haven't brought up on the show, but is very funny. 
[00:20:52] And that is that towards the end of Alex's book, The Great Reset, he relays the plot of a couple scenes of Top Gun: Maverick. [00:21:03] That sounds right. [00:21:04] It's pretty amazing. [00:21:05] You're telling me he loves that movie. [00:21:07] You're telling me that even in the book, it's always a movie. [00:21:10] Yep. [00:21:12] So I'm sure that Musk knew that revenues would likely go down and he'd lose more advertisers if he let Alex back on the platform, but he's also being boxed in by the extreme right-wing audience he's cultivated to the point where the decision isn't really a Maverick decision at all. [00:21:24] It's bowing to the shouts of the mob. [00:21:26] By his words and actions, Elon Musk is clearly politically and ideologically aligned with the most extreme right-wing voices in the media, many of whom are in this Twitter space. [00:21:35] He's shown that he's desperate in his need to look cool for these shitheads, and that's much more important for him than creating a functioning, decent-to-use website, and more important than running a successful business. [00:21:46] In the second quarter of 2023, Twitter's revenue was down just about 50% year over year and went from about $1.5 billion in Q4 2021 to just under $600 million in Q2 2023. [00:21:59] He's overseeing the full deflation of a site from a revenue standpoint. [00:22:04] Twitter has also been bleeding users who don't want to hang out on a website full of unchecked shitheads because it's a really bad user experience. [00:22:11] The site was losing millions of daily users, so in response, Musk stopped using that metric and started pointing to monthly users since that number looks a bit better. [00:22:20] By almost every metric, Twitter's going pretty much down the drain. [00:22:23] According to stats compiled by Axios, app downloads dropped 38% globally and 57% in the U.S. in the year after Musk bought the site. 
[00:22:32] User time on the app is decreasing and overall traffic to the site is down as much as 11% on web browsers in the United States. [00:22:40] Essentially, what's going on is that Elon Musk is trying to take Twitter on a $44 billion joyride. [00:22:45] And a big part of that is letting the types of shitheads he likes create an unpleasant environment for the other users. [00:22:51] It'll be a lot of fun for them while it lasts, but ultimately the end result of this is that the whole thing is going to go out of business as it becomes just another Telegram or Gab where all the user base is just the dum-dums like these people and a handful of folks who just have accounts to keep tabs on what they're up to. [00:23:06] It's pretty annoying and it's a farce to see all this being masqueraded around as some kind of free speech activism, but you can't really expect them to do anything else. [00:23:13] This is exactly what they would do. [00:23:15] And if this is really about free speech, then Musk needs to go all the way. [00:23:20] Let David Duke back on. [00:23:21] Get Milo Yiannopoulos back on the site. [00:23:23] Let Tila Tequila come back. [00:23:25] All she did was post pictures of herself doing a Nazi salute at a white nationalist gathering. [00:23:29] That's free speech, baby. [00:23:31] Come on, man. [00:23:31] Let Tila Tequila back. [00:23:33] Yeah, I mean, he should let everybody back on. [00:23:36] If it is a free speech thing. [00:23:37] It doesn't. [00:23:37] I mean, I appreciate that people continually confuse Elon Musk as somebody with a business or anything along those lines. [00:23:46] But people really need to start thinking of him more along the lines of like MBS and that kind of thing. [00:23:50] Like, he has a sovereign wealth fund. [00:23:53] He does not have money. [00:23:54] Money is not real to Elon Musk. [00:23:57] No. [00:23:57] Money is pretend. [00:23:58] It's not like it's $44 billion of his own money or anything like that. 
[00:24:02] $44 billion isn't even a real number of money. [00:24:05] That's just a pretend thing that rich people talk about. [00:24:08] That's not real money. [00:24:10] Right. [00:24:10] You know, like it's all fun. [00:24:11] It's fun bills. [00:24:12] It's all fun and games to do that, but it doesn't exist as something. [00:24:17] It's pretend. [00:24:18] And he can do whatever he wants. [00:24:20] Yeah. [00:24:21] Forever. [00:24:21] Yeah. [00:24:22] And he doesn't have a business in the form of Twitter either because he's not running a business. [00:24:28] No. [00:24:28] It's not protecting free speech. [00:24:30] It's creating a fiefdom where everyone kowtows and bows to him like they do in this Twitter space. [00:24:37] Yeah, the Saudi wealth, like the reason that like LIV Golf and all of these, like the reason that MBS is spending $300 million on something that will make $40 or whatever is because one, money is pretend. [00:24:52] It doesn't exist to him. [00:24:53] And two, it makes him feel good. [00:24:55] This is what happens when you give one man too much power is something that feels good is worth spending a billion dollars on. [00:25:04] Hey, man, I'm a hedonist. [00:25:05] I don't mind the idea of pursuing pleasure. [00:25:08] I understand that, but other places where they're like, oh, let's have a board or something can at least be like, can we have some businesses? [00:25:16] Well, tell that to Linda Yaccarino. [00:25:18] Yeah. [00:25:19] There we go. [00:25:20] Poor lady. [00:25:21] I don't know. [00:25:22] Don't care. [00:25:22] Fuck her. [00:25:23] So anyway, this is all about free speech. [00:25:25] So demonized, so lied about. [00:25:27] They built me into the devil's devil at an emotional level with so many people. [00:25:35] Deep. [00:25:36] At a never-before-seen level, this happens. [00:25:38] And the ADL, the CIA, and the Justice Department and the Southern Poverty Law Center and Media Matters. 
[00:25:43] They were already doing it when I was taping the interview like three weeks ago. [00:25:46] But since then, all those big establishment sponsors who are already being boycotted by the American people, by the way, so they're losing literally hundreds of millions of dollars in the aggregate because the people are boycotting them. [00:26:00] As Elon said, well, now they're trying to bully Elon to bully the American people, the people of the world. [00:26:05] So he said, go fuck yourself. [00:26:06] And I can't speak for Elon. [00:26:07] Maybe he'll call in. [00:26:09] But I think when he, as soon as they did that to him, even though he tried to basically work with them, was being very, very fair. [00:26:15] And I think really cleaned up the thing of bots and things like that. [00:26:20] As soon as they doubled down, he said, you know what? [00:26:23] Screw you. [00:26:24] I'll go ahead then and release the kraken. [00:26:27] And it's not that I'm that good on Twitter or that I'm even that great of a talk show host. [00:26:31] It's the symbol of what they built of Alex Jones is now an archetype of the rebel populist and what the establishment fears. [00:26:41] And so what Elon did was really throw down the gauntlet. [00:26:44] So that's not free speech at all then. [00:26:46] He's just throwing a tantrum. [00:26:48] He's mad at people. [00:26:49] And so he's like, all right, I'll bring back this bombastic fucking idiot. [00:26:53] Yeah. [00:26:53] Right? [00:26:54] I mean, like, that's not a free speech stance. [00:26:56] That's a fuck you. [00:26:59] I want to do something that'll make you unhappy. [00:27:02] Yeah. [00:27:02] Stance. [00:27:03] Yeah, that's pretty much the definition of spite. [00:27:07] Yeah. [00:27:07] I believe that's spite. [00:27:08] Sure. [00:27:09] He did a spiteful thing. [00:27:10] Yeah. [00:27:11] Not motivated by a concern for free speech and everybody has the right to talk and blah, blah, blah. [00:27:16] No. 
[00:27:16] But I don't, Alex is speaking for Elon. [00:27:19] This is not necessarily what he did. [00:27:20] But even in Alex's conception, Elon's throwing a temper tantrum. [00:27:24] Yeah. [00:27:25] What Elon actually did was try and measure his own dick. [00:27:28] Yeah. [00:27:28] Yeah. [00:27:28] So Elon's going to show up later, obviously. [00:27:31] I don't know if I made that clear. [00:27:32] Yeah, yeah, yeah. [00:27:32] No, we've all got that. [00:27:34] The big deal of this is Alex ends up talking to Elon for a long time. [00:27:38] I hate us all. [00:27:39] But he's not there yet. [00:27:40] Yeah. [00:27:41] And I don't think that they knew he was going to come. [00:27:45] No. [00:27:45] Like, I think it's a surprise. [00:27:46] Yeah. [00:27:47] Sort of that he just showed up. [00:27:49] But there is a bit of a panel that's there asking some questions. [00:27:53] And the first of them, noted embarrassment and shithead, Laura Loomer. [00:27:58] Great. [00:27:58] Got to take a few questions from the panel. [00:28:00] Maybe three quick questions. [00:28:01] Laura and then Brian. [00:28:06] Thanks, Mario. [00:28:07] No, I saw that you were asking Alex about whether this threatens democracy. [00:28:11] I think the biggest issue is that this has been a threat to our Constitutional Republic and democracy all around the world. [00:28:19] And we've been blowing, you know, blowing the whistle on this for years. [00:28:23] I can't help but think back to the day before Alex was permanently banned from Twitter. [00:28:28] And that was September 5th, 2018. [00:28:30] I think you can see that I attended the congressional hearing on big tech censorship, and we called out Jack Dorsey. [00:28:37] So I just wanted to play that because Laura cannot stop bringing up that she was there in Congress with Alex the day before he got kicked off Twitter. [00:28:46] In case anybody's wondering.
[00:28:48] Before Alex shows up, there's a little bit of time where they're just talking to her, and she brings it up twice. [00:28:52] Nice. [00:28:53] It's like, all right, I get it. [00:28:54] This is about you. [00:28:55] Yes, yes, yes. [00:28:56] You're very important as well. [00:28:58] You're very important. [00:29:00] So next, we get Jack Posobiec coming in, giving a little bit of a question slash comment. [00:29:06] Good stuff. [00:29:07] Prior to Alex Jones's just really public execution in terms of the digital square, it was a digital public execution that we saw. [00:29:15] Prior to that, the idea that people were getting suspended on Twitter, you know, Twitter 1.0 or even Facebook or YouTube or any of these things, it was ridiculous. [00:29:23] You had to either, you know, really, you know, seriously violate terms of service, death threats, doxing. [00:29:30] That was pretty much it. [00:29:31] That was pretty much the only thing, or like actually hacking the system. [00:29:33] It was the only thing that gets you out. [00:29:35] But when they did that to Alex Jones, it fundamentally changed the way that we operate on social media and the way that we share information. [00:29:42] And up until the point of Elon Musk purchasing Twitter and transforming it now into X, we have not had the ability to freely share information. [00:29:53] And that's why in the past year, it's basically been about a year and change since he purchased it. [00:29:57] That's why you've suddenly started to see people, and not even just on X, but out in the real world, out in normal conversations. [00:30:04] We're finally starting to move past that point of intense censorship, where if you've lived in any country that has an authoritarian regime, you know that the censorship isn't external, it's internal. [00:30:15] So it's internal. [00:30:16] It's in your mind. [00:30:17] You know you're not allowed to hold certain opinions. [00:30:19] You know you're not allowed to say certain things.
[00:30:21] And so you censor yourself before you speak. [00:30:23] And so, Alex, I just commend you. === Step Throat Bonanza (04:24) === [00:30:25] So they've got a myth here, but this isn't reality. [00:30:28] Before Alex got kicked off Twitter, there were plenty of people who got banned. [00:30:31] Milo got banned two years prior. [00:30:34] Holocaust denier and racist troll Charles Johnson, who's now claiming he was an FBI informant, got kicked off in 2015. [00:30:40] Tommy Robinson was kicked off earlier than Alex in 2018. [00:30:43] Same for the Proud Boys and Gavin McInnes. [00:30:46] Shit, Azealia Banks was banned from Twitter earlier than Alex. [00:30:49] Roger Stone was kicked off a year before Alex. [00:30:52] Alex's Nazi buddy Owen Benjamin was kicked off before him. [00:30:55] Anyway, the point is that Alex was not digitally crucified or executed, and he wasn't the first high-profile person to be kicked off for being a fucking asshole. [00:31:03] But this is the new mythological telling of the story because it works really well for narrative purposes. [00:31:09] Now that Alex is back on Twitter, if they pretend that him being kicked off was the beginning of this rash of censorship that they've all been persecuted by, this becomes all the more triumphant. [00:31:18] It's completely inaccurate, but very emotionally satisfying for them. [00:31:22] So, you know, you publish the legend, as they say. [00:31:24] Yeah. [00:31:26] Yeah. [00:31:27] I think that's why I, I mean, and my point, you know, I think I said it on the last episode about the whole like you have to step on their throat in the fourth quarter. [00:31:35] You're from sports. [00:31:36] I'm from sports. [00:31:37] Yeah. [00:31:37] You have to step on their throat. [00:31:39] And this is going to be one of the great, I told you so, because I'm turning into, I'm going to go full psychopath when this comes true.
[00:31:47] And Alex is reinstated and popular among everyone because none of you fucks stepped on his throat. [00:31:53] And I'm going to blame all of you. [00:31:55] I'm going to blame Mark. [00:31:57] I'm going to blame everybody who said, oh, the judge is cool. [00:32:00] Nope. [00:32:00] Fuck all those judges. [00:32:01] Fuck all of those people. [00:32:03] But they didn't step on his throat, because here we are. [00:32:05] But I'm not sure that in their capacities in those circumstances there was any way to metaphorically step on his throat. [00:32:11] I understand what you're saying. [00:32:12] That's fine. [00:32:13] And then I blame all of you and society. [00:32:16] You can blame society. [00:32:17] You're creating a situation wherein all of you behave stupid instead of not stupid. [00:32:22] It's fair to blame society. [00:32:23] I'll give you that one. [00:32:23] Okay. [00:32:24] So, yeah, just a bunch of dumb, dumb shitheads talking about Alex. [00:32:28] And here's another one. [00:32:30] So another example of someone that's gone through the same thing is obviously Andrew Tate, which you retweeted earlier. [00:32:35] We've got Tristan on stage. [00:32:37] Tristan, we've got Alex Jones back on X. Would love your thoughts on your experience getting censored and congratulations on finally being able to walk free. [00:32:46] Well, thank you very much. [00:32:47] As most of you know, I am free now to travel the entire country of Romania. [00:32:51] It's just another step on the way to my eventual exoneration. [00:32:55] This isn't as much of a setup as people think, but obviously I can't talk too much about the case. [00:33:00] Yeah, I probably wouldn't either. [00:33:02] All right. [00:33:02] So now the panel on this show that really loves to ask the hard questions and thinks about here, you know, it's important to hear from all sides has been Laura Loomer, Jack Posobiec, and Andrew Tate's brother.
[00:33:13] The woman who handcuffed herself to Twitter's door, the guy who almost pissed his pants in fear, eating at Comet Ping Pong while trying to push Pizzagate, and a dude who's definitely going to prison for organized crime and sex trafficking. [00:33:24] I honestly cannot imagine why advertisers wouldn't want to be associated with this kind of platform since the owner of the fucking site shows up and hangs out for like two hours with these shitheads. [00:33:34] Yeah. [00:33:34] Recently, the Tate brothers have gotten their geographical restrictions loosened up, so now they can travel around Romania as long as they get the court's permission first. [00:33:42] The criminal charges are very much still pending, and Andrew Tate is being sued by four women for multiple sex crimes in addition to that. [00:33:48] There's that old saying, you know a person by the company they keep, and man, this is bad company. [00:33:53] Like that band, Bad Company. [00:33:55] Who did the song, Bad Company. [00:33:57] Man, just, all the names. [00:33:59] You know, it is, I just like the names don't even have meaning anymore. [00:34:03] It's just that collection of sounds makes me tense up. [00:34:07] You know, like if there's like a baboon scream in the distance, like evolutionary, like your body tenses up, you're like, I gotta fight. [00:34:14] Like just hearing Andrew Tate, like that. [00:34:17] It's not a name. [00:34:18] It's not a name. [00:34:19] It's not a name to me. [00:34:20] It's just a collection of sounds that make fucking Wolverine claws pop out of my head. [00:34:25] If you see this collection of people, and I know that it's hacky to say nightmare blunt rotation, but really, this is the worst thing I could think of. === Alex's Convenient Story (15:28) === [00:34:35] This is a room where if I walked into it, you couldn't stop me from getting out of that room. [00:34:40] Yeah. [00:34:40] Immediately. [00:34:41] Yeah, yeah.
[00:34:41] Wild horses couldn't keep me in that room. [00:34:43] And I don't know how they would. [00:34:45] Not a lot of space for the horses. [00:34:47] That's not how horses are. [00:34:47] You understand what I'm saying? [00:34:48] Yeah, I do. [00:34:49] And shit gets worse because there's apparently a voice from the other side that's allowed in the mix here. [00:34:57] And of course, it's one of the Krassenstein brothers. [00:35:00] There we go. [00:35:01] Brian, before we move on to the war in Ukraine and the war in Gaza, I'll give you the mic for a quick question. [00:35:07] Yeah, so before I ask you a question, I just want to say we're probably about as far politically away from each other as anybody can get. [00:35:15] I disagree with you on probably so much. [00:35:19] But I think dialogue's key. [00:35:21] And I think I even advocated for your banning at one point, probably like 2017 or something like that. [00:35:28] And when we were banned in 2019, you were literally the only person to actually speak out for us. [00:35:35] So I thank you for that. [00:35:38] So I guess my question would be, what are your thoughts about the new X? [00:35:45] Community notes. [00:35:46] Obviously, that's going to help fact-checking. [00:35:49] Have you played around with community notes? [00:35:50] Do you understand how that works? [00:35:52] And how do you think that's going to be used, whether it's for you or against you? [00:35:58] I welcome community notes. [00:36:01] I mean, I think that it's not censorship. [00:36:02] It's the community square. [00:36:04] All right, you want a better answer, Ding-Dong? [00:36:06] All the idiots who like Alex's shit already think community notes are bullshit and they just ignore them. [00:36:10] Yeah, I was going to say. [00:36:11] Plus, Alex was on his show days before this workshopping with callers how to evade the notes in the first place.
[00:36:16] It's an incompetent system that applies literally no pressure or pushback on misinformation, but is designed to give the image of doing exactly that, which is why Alex doesn't have a problem with it. [00:36:25] It poses zero threat to his ability to spread bullshit, so it's fine. [00:36:29] And like, I understand it's like, oh, there's fact-checking there. [00:36:31] Like, oh, if someone posts something wrong, then the thing that's right is underneath. [00:36:35] Hey, guess what? [00:36:36] All those people are still making money off those wrong posts. [00:36:39] I mean, hilarious. [00:36:42] Yeah, there's still a giant financial incentive in posting the bullshit that gets notes that people ignore. [00:36:48] Yeah, the idea that that matters is absurd. [00:36:52] Absurd. [00:36:53] But here's the thing. [00:36:54] Alex loves being corrected. [00:36:57] That's why he loves the idea of these community notes. [00:36:59] He'll update his stuff. [00:37:00] The answer to a crazy statement is being able to respond to it. [00:37:04] And so I really welcome community notes because then it's not some big corporation like Microsoft with their NewsGuard who's got one of the worst records actually of truthfulness out there telling you that they're the arbiter of truth and using sly methods like Snopes and others do to deceive people. [00:37:21] Sly methods. [00:37:22] I welcome. [00:37:23] Developed by Sylvester Stallone. [00:37:24] I welcome people correcting me. [00:37:25] You know, when I'm on the radio slash TV doing my weekday shows at Infowars, I've got a crew in there running it and they don't tell me what to say or do, but a couple times a day they said, hey, you just set a date wrong or you just said something that they know is wrong. [00:37:39] And then I correct it because I never consciously get something wrong. [00:37:44] But when you're talking three, four hours a day, you make mistakes. [00:37:48] And so I think that that's what this is all about.
[00:37:50] And so I really look forward. [00:37:52] In fact, I want people that politically disagree with me to come on. [00:37:56] That's much better radio and TV. [00:37:59] But the internet put people in their own silos where we don't communicate with each other. [00:38:05] And half the time, we're disagreeing with the other side because we have a distorted perspective of what they're saying. [00:38:11] So just to be clear, Alex is saying that in his belief, his enemies have a distorted view of who he is. [00:38:16] This is not a two-way street. [00:38:17] He's sure he knows exactly who his enemies are and they all work for the devil. [00:38:21] He's trying to paint this as a thing where the two sides just need to get together and realize they're not so different. [00:38:26] But that conversation is not going to be give and take. [00:38:28] And we've seen it a bunch of times. [00:38:30] The two sides need to get together in a room where they can either join my side or never leave the room again. [00:38:37] Right. [00:38:37] It's simple. [00:38:38] But it would be good radio because it would just be Alex yelling at someone and making an ass of himself like he did with Bill Ayers. [00:38:43] I mean, that's good stuff. [00:38:44] That was pretty good. [00:38:45] If Alex loves being corrected, you'd think that at some point he'd get on air and issue a correction based on the literal thousands of things we've talked about that he's been wrong about, was lying about, or just entirely made up. [00:38:55] I imagine his engagement with community notes won't be any better. [00:38:58] And within a month or so, I would expect him to come up with a conspiracy about leftists within the community notes structure working against him. [00:39:05] I wouldn't be too surprised if he tried to make some hay out of that. [00:39:08] At a certain point, like, obviously you have no problem with community notes because it doesn't really threaten you. 
[00:39:15] But Alex is somebody who gets annoyed. [00:39:17] And I could see him being annoyed by it to the point where he comes up with a conspiracy about the notes. [00:39:22] Well, I mean, the notes are good for two groups of people. [00:39:25] They're good for the group of people who likes to see people on the far right say stuff that is obviously wrong, and then they're like, ha ha, I bet they care. [00:39:34] The idiots, you know? [00:39:35] And then there's the other group of people who it's important to, and those are the people who work at Twitter. [00:39:41] Who get to complain about how evil it is and how terrible it is for those people. [00:39:48] There's a third group then, too. [00:39:49] And that is the people who want to cover their ass. [00:39:51] Shit. [00:39:51] Who pretend that they're doing something. [00:39:52] Yeah, yeah, yeah. [00:39:53] Definitely that. [00:39:54] Pretend that this has any impact whatsoever on misinformation. [00:39:57] Right. [00:39:58] That the people like Elon Musk. [00:39:59] Yeah. [00:40:00] Who coincidentally is about to show up. [00:40:01] Great. [00:40:02] He shows up and he has a question for Alex right out of the gate. [00:40:05] And Alex, you know, look, I honestly don't really know you and you don't know me. [00:40:09] Wait, I thought Alex had dinner with Musk and hung out with him with Rogan. [00:40:13] I don't know. [00:40:13] Oh, man, maybe that was all made up. [00:40:15] Maybe. [00:40:16] You know, one of the questions I really have to just get out of the way, and you've probably talked about this already before, is the whole Sandy Hook thing. [00:40:22] And, you know, because it's not, like, obviously, if, you know, if somebody's sort of denying murders of children, that's not cool at all. [00:40:37] You know, and so just what exactly did you say? [00:40:40] And what is going on with that situation? [00:40:43] You know, I just, I would like to actually hear what did you say? [00:40:48] And yeah.
[00:40:50] Well, Elon, thank you for allowing me back into the public square. [00:40:56] Oh, man. [00:40:56] So yeah, this is, you know, it's good. [00:41:00] It's good. [00:41:02] Really starting the conversation off right? [00:41:07] Asking Alex, hey, man, it's not cool for somebody to deny the deaths of children. [00:41:12] What was up with that, man? [00:41:14] What was up? [00:41:16] I appreciate the massive, ridiculous and over-inflated ego that comes with somebody saying stuff like, why don't you tell me what you said? [00:41:31] Because it comes along with two things. [00:41:33] It comes along first with, I'll know if you're lying. [00:41:38] Right. [00:41:38] Yeah, yeah. [00:41:39] Because I'm a genius. [00:41:41] I have done no looking into this of my own accord, which could be done very easily. [00:41:46] All I need to do is look a man in the eye and hear what he has to say, and I'll know if he's being honest. [00:41:53] I don't even think it's that. [00:41:54] I think it's just a, please lie to me in a way that allows me to cover my own ass about this. [00:41:59] And that was the other thing. [00:42:00] Yeah. [00:42:00] Yeah. [00:42:01] That's what people want Alex to do. [00:42:04] They're just like, please just make it okay for me to not be mad at you or whatever. [00:42:08] I don't really care what you did. [00:42:10] Yeah. [00:42:10] Yeah, obviously. [00:42:11] We never did. [00:42:12] We were helping harass the Sandy Hook families. [00:42:16] So Alex is like, hey, man, you want the short answer or the long answer? [00:42:20] Good question. [00:42:21] I don't know how much. [00:42:22] Would you like the short answer or the medium long? [00:42:25] I think at least the medium answer. [00:42:28] Look, I guess people just want to know, like, obviously it would be like heartless and cruel to deny a school shooting of children or to attack the parents or anyone who was involved.
[00:42:42] It seems that that would be, you know, just incredibly mean and cruel, frankly. [00:42:50] So it's sort of, that's, I think, you know, what a lot of people are upset about, or at least they think that is the reason to be upset about it. [00:43:00] And, you know, if that were true, I think we would rightly be upset with you. [00:43:07] Oh, okay. [00:43:08] I mean, that's well communicated. [00:43:11] I think we know the stakes of this conversation. [00:43:15] So he wants at least the medium version. [00:43:17] Sure. [00:43:18] And here goes, Alex is going to jump into what we obviously know is going to be a lie. [00:43:22] Okay. [00:43:23] Yes, sir. [00:43:24] Well, please let me then just tell you what really happened, okay? [00:43:27] Yes. [00:43:27] And if you want me to send you a dossier with clips, post it to your account, you know? [00:43:34] Yes, sir. [00:43:34] I will do that. [00:43:35] So, so let me let me tell you what happened. [00:43:37] I'm a guy that didn't go to college, a few years of community college. [00:43:41] I started out on access TV. [00:43:43] Nope, already done. [00:43:45] I was not professionally trained. [00:43:48] And by 19, by 2016, I had 30 million viewers conservatively. [00:43:54] I was the biggest show, as big as Rogan is now or bigger. [00:43:57] And I had a very small operation and did not even understand how powerful I was. [00:44:02] And so when that event, I just call it the school shooting, which I do believe happened, happened 11 years ago, the internet exploded. [00:44:12] And it was a top story for off and on for years with all the professors and former school safety people and all of them saying they believed it was a drill. [00:44:25] And I simply covered them covering that. [00:44:29] What was entered in court against me in both cases where I was found guilty by judges. [00:44:33] Kind of like in New York, there's a judge in Trump's case, not even a jury, in his real estate case.
[00:44:40] And then years later, after Trump got elected and after I was deplatformed, it made me bigger. [00:44:49] And so suddenly I would wake up and there would be sometimes 100 articles or more a day, every major news channel saying that I was currently saying nobody died, currently sending people to their houses, currently peeing on graves. [00:45:03] I didn't even know these people's names. [00:45:05] I only said one of their names ever. [00:45:07] So I love the way Alex is making himself such a squeaky-clean, respectable little boy for Mr. Musk, calling him sir. [00:45:13] What a worm. [00:45:14] Obviously, anybody who followed the case at all or has listened to Alex's shows from the time of the shooting knows that he's entirely full of shit. [00:45:21] The day of the shooting, Alex got on air and suggested it was a false flag. [00:45:24] He started insinuating that Robbie Parker, one of the parents of a murdered child, was an actor almost immediately, long before he ever came into contact with Wolfgang Halbig, the wildly discredited school safety administrator he's claiming he just covered. [00:45:37] Alex didn't just cover Halbig, Alex promoted Halbig, all while he had every reason to know that Halbig was literally stalking and harassing the parents, insisting they were actors and that their kids never existed. [00:45:47] Alex had Halbig on the show as an expert and directed his fans to financially support Halbig so he could continue his harassment. [00:45:54] And not only that, Alex paid for his employee, Dan Bidondi, to go to Connecticut to join in the harassment of Newtown residents, which Alex celebrated and aired on his show as a triumphant thing in the InfoWar. [00:46:05] Further, Alex sold Jim Fetzer's book, Nobody Died at Sandy Hook, on his website, and literally all of the quote anomalies Alex was so confused by that made him suspect it was all a hoax came directly from either Fetzer or Halbig.
[00:46:17] It was those two and then a third person even closer to Alex, Steve Pieczenik, who was feeding Alex's crisis actor narratives long before he met Wolfgang Halbig. [00:46:25] The reality is that Alex wanted to cover Sandy Hook as a fake event because it was an incredibly powerful driver of traffic and sales for him. [00:46:32] Alex understands that emotionally traumatic events are ones that have a way of overwhelming people's rational thinking. [00:46:38] And there are few things that are more traumatic than a mass murder of children in a school. [00:46:42] No one wants to believe that that kind of thing could happen. [00:46:44] So it's not difficult for talented conspiracy theorists to entice people into that false narrative that allows them to not accept that reality. [00:46:52] Alex wanted to cover the story this way, and that's why he gravitated towards these bullshit experts who were clearly full of shit, like Wolfgang Halbig, Jim Fetzer, and he just has a soft spot for Steve. [00:47:03] So, you know, Steve can do whatever he wants on the show. [00:47:05] Yeah. [00:47:06] It's a convenient story to pretend that this was the internet. [00:47:08] It was just really interested in Sandy Hook. [00:47:11] Alex was just covering what these guys were saying, but that's cowardly revisionist trash. [00:47:15] Alex was doing what Alex always does, which is trying to profit from other people's pain. [00:47:20] There weren't hundreds of stories all the time, but Alex was, you know, still saying things that sent people to the victims' families' homes. [00:47:28] He repeated his defamatory claims about them because he got mad about Neil Heslin going on Megyn Kelly's show and talking about his experience. [00:47:36] So some people rightly reported that Alex is still saying the things about these families that he said in the past, and it was the stuff that led to their harassment. [00:47:44] He has no one to blame for that but himself.
[00:47:46] And for what it's worth, multiple times under oath during his deposition and trial, as well as on air during the trial, Alex repeated his assertion that Sandy Hook was probably fake. [00:47:55] So he can calm the fuck down with this I'm a victim stuff. [00:47:59] I want to take a quick moment and examine something that Alex said at the beginning of that clip, though, because I think it reveals something important. [00:48:05] He's trying to dodge responsibility for his actions or minimize them by saying that he never went to college and he wasn't formally trained. [00:48:11] I've listened to countless hours of Alex's show, and I can tell you with zero reservation that Alex has no respect for formal education or training, particularly in the field of journalism. [00:48:20] The fact of whether he was trained or not should be immaterial here. [00:48:24] And Alex is doing that and bringing this up just to make his actions look like an innocent mistake. [00:48:29] Also, if "I wasn't trained for the job" is a possible excuse for defaming the grieving families of dead children and directing harassment in their direction, then I would suggest that training for this job should be mandatory. [00:48:40] Clearly, the subject matter being handled by someone in Alex's position is far too sensitive and dangerous to be trusted in the hands of someone who's this fucking irresponsible. [00:48:48] Speaking of which, I wonder if Alex is going to tell Elon that he was drunk on air when he said that the families were actors. [00:48:54] I'm going to guess he's going to avoid that because that's a detail that's just there for the cool kids' table when he's trying to sound like a badass rebel. [00:49:01] Here, he's in sniveling worm territory, so that kind of game just isn't going to work. [00:49:05] And Elon Musk is going to eat all this shit up. [00:49:08] It's pathetic. [00:49:10] Yep. [00:49:12] No other, nothing else to say. [00:49:14] Just, yep. [00:49:15] I mean, garbage.
[00:49:16] I have plenty of, I have plenty to say. [00:49:19] And I mean, at a certain point, this is not our first time going down this road. [00:49:24] No. [00:49:24] So I think that's. [00:49:25] It's interesting to hear it being gone down with Elon Musk, though. [00:49:28] Yeah, I mean, I'm just now more interested in, like, because here's the. [00:49:34] How could I put it this way? [00:49:37] I think that in the concept that I'm given by people, that the law matters, right? [00:49:46] And so that means that the case has been handled with Alex. === Musk's Clarifying Question (15:43) === [00:49:49] The case is over. [00:49:51] The law has done its job, right? [00:49:53] So at this point on, everything that Alex gets to do is everybody else's fault, not Alex's, because the law has handled it, right? [00:50:03] So that means that the Sandy Hook chapter is closed in terms of what people can argue about. [00:50:10] It's been done in the law, so that's that. [00:50:12] And if you don't like the results of that, which I have a hard time imagining anybody does, that's your fault. [00:50:19] That's our fault. [00:50:21] That's, again, that is the justice system's fault. [00:50:24] That's everybody's fault, but Alex's. [00:50:26] At this point, Alex is playing with house money, and he can do whatever he wants, and whatever he gets to do is our fault. [00:50:34] Well, as a society, as a society, I think that it is just so disgusting, I guess, on some level that people will not inform themselves in any way about the reality of these situations to the point where Alex has free rein to just lie about what the case is about, what he did, what anything happened. [00:50:55] Like, that, to me, is such an abdication of any concern on the part of everybody in this call, quite frankly. [00:51:03] Right. [00:51:04] Well, to me, I have, I mean, these people are supposed to be shitheads. [00:51:07] That's what they do. [00:51:08] That's their job.
[00:51:09] It is the media's job, ostensibly, to not allow that to happen. [00:51:14] I feel like sometimes they try, but like it's. [00:51:17] I mean, in this case, from what I saw of the past weekend, I think they work for Alex. [00:51:24] I think they do. [00:51:25] I think, genuinely speaking, somewhere along the line money works for Alex now. [00:51:32] We talk about this a bit, though. [00:51:33] That is the like, yeah, it's going to drive traffic to have a story about Alex being back on Twitter. [00:51:39] And it's not in service of the public being informed. [00:51:43] It's not really helping anything, but it will drive traffic. [00:51:46] And so that's what's done. [00:51:47] I would be stoked. [00:51:49] Honestly, I would be stoked if that's all they did. [00:51:51] If all they did was write, Alex Jones is back on Twitter. [00:51:54] That's news. [00:51:55] I get it. [00:51:55] That's fine. [00:51:56] Then they go on to say, Alex Jones was back on Twitter. [00:51:59] He apologized for... They lie. [00:52:01] They lie because it's easier than actually doing their job. [00:52:05] And they do it all the time. [00:52:06] Yeah, no one wants to do this job. [00:52:08] It's hard. [00:52:08] That's why I end up doing it. [00:52:10] Exactly. [00:52:10] So anyway, Musk does call out one thing that Alex says about the Sandy Hook stuff. [00:52:15] And it seems like an interesting ask for more information, but it's really not. [00:52:20] Yeah. [00:52:21] Now, now, and I believe her children died, and I understand all that. [00:52:25] But I'm saying, imagine I was not deplatformed, no mention of the school shooting in Connecticut for like six, seven years. [00:52:37] Then they go back to my timeline, and it turns out it was a big New York PR firm, Democratic Party. [00:52:43] They dredge it up. [00:52:44] They run hundreds of articles, sometimes a day, but a week for over a year. [00:52:50] Suddenly, it becomes a big story again. [00:52:52] What's the PR firm?
[00:52:54] Pardon me? [00:52:55] Which propaganda firm was this? [00:53:00] Public relations is a polite word for propaganda. [00:53:03] So I think we should call PR firms, propaganda firms, because that is in fact what they do. [00:53:08] So, which propaganda firm was this? [00:53:10] I will find the name as soon as I'm off because I can't do it today. [00:53:13] I'm not good at doing two things at once. [00:53:15] I can't walk and chew bubblegum, but I will post it to X. [00:53:20] So that's the first really interesting thing that's happened here. [00:53:22] Elon Musk actually asked Alex a follow-up question that Alex absolutely cannot answer. [00:53:27] Alex's story about why he got sued about Sandy Hook was that a PR firm that works with the UN and the Democratic Party got mad at him because he was too popular and Trump got elected. [00:53:35] So they went through his history and found something they could attack him over. [00:53:38] This is absolute bullshit, but I've never seen anyone call Alex out on it when he repeats it. [00:53:43] That's true. [00:53:43] Alex's answer is really interesting that he can't remember the name of the PR firm. [00:53:48] It's interesting because here's Alex from like 35 minutes earlier in the call, just before Elon Musk had shown up. [00:53:54] Then they had, we later learned, I'm not going to say names or even get into it because that's what they want, is attention, but we learned there was a big, powerful New York PR firm that actually does the PR for the UN and also for some of the CIA operations. [00:54:08] And that's really what we're talking about here. [00:54:09] That's all come out in Congress now. [00:54:11] And then they went and dredged up stuff in my timeline out of context, blew it up times a million, and then said that I was currently doing things I'd never done. [00:54:23] Just outrageous, horrible things. [00:54:25] But notice, never proof, never a clip, never a video. [00:54:28] Plenty of it.
[00:54:29] So that seems a little bit different. [00:54:31] Alex is never going to come out and say who this firm is because he's making all this up. [00:54:35] Alex is a total idiot, but even he knows that if he were to actually make an accusation like this about a real business, he would get sued immediately. [00:54:42] Yeah. [00:54:43] It's really weird to see Musk have the instinct to ask a clarifying question here and yet still be completely oblivious to how flagrantly Alex is lying to his face. [00:54:51] But I think that what's going on is that, you know, Musk has this battle against Media Matters going on right now and all this. [00:54:57] And so he considers all of them PR firms. [00:55:00] Yeah, he didn't want to ask a question. [00:55:01] He wanted to say that PR firms were propaganda firms. [00:55:04] Yeah, he wanted to use this as like part of his... he was saying something. [00:55:07] Yeah. [00:55:07] Yeah. [00:55:07] He was making a statement in the form of asking Alex a question. [00:55:10] It's less the point of like, oh, what is that? [00:55:14] Yeah. [00:55:14] You know, like, it's not asking a follow-up question. [00:55:16] It's a, can we talk about something I like? [00:55:19] Yeah, exactly. [00:55:20] Yeah. [00:55:20] So Alex gets to admitting that he, you know, he did say it was fake, which is progress, I guess. [00:55:27] I guess. [00:55:27] I did question it. [00:55:29] I did say at times I could see that I might even say it now. [00:55:32] They'll take out of context to say it again. [00:55:35] I did have what they entered in court on me was 23 minutes of video and audio over five, six years. [00:55:42] We did an audit. [00:55:44] I hadn't talked about them when they sued me for two years. [00:55:48] I refused to talk about it. [00:55:49] I apologized when the PR firm got involved. [00:55:52] And I know who it was at the time. [00:55:53] It was just all the news. [00:55:54] I said, hey, I thought it happened. [00:55:56] I said it happened.
[00:55:57] I said it happened. [00:55:58] I decided it happened five years after it happened. [00:56:01] So I said, I'm not the Sandy Hook guy. [00:56:03] It turns out some of these experts that said it didn't happen are crazy. [00:56:08] They made up stuff. [00:56:09] I said, I believe it happened. [00:56:12] And then they spun it and said, oh, now he admits he lied about it. [00:56:16] So it isn't who I was. [00:56:18] It's kind of like they've done with you, and you did nothing like I did. [00:56:20] I mean, I did question it. [00:56:21] I did say a few times that I thought it hadn't happened, but I didn't turn the knife. [00:56:25] I didn't really think about it. [00:56:26] I thought about how I was talking about the internet with YouTube videos with 30, 40 million views that I didn't make. [00:56:35] It was a hot topic that would come back from time to time. [00:56:38] But no, I was not the creator of it. [00:56:40] I was not the progenitor of it. [00:56:42] I was not the guy pushing it. [00:56:44] Alex just said that he decided the shooting really happened five years after it happened. [00:56:48] That would be in December 2017. [00:56:51] Alex was interviewed on Megyn Kelly's show, which also featured the Neil Heslin interview in June 2017. [00:56:57] The actions he took trying to attack Heslin's appearance in that interview are the grounds for bringing this suit against him. [00:57:03] So by his own admission, at the time he was on Megyn Kelly, he thought the shooting didn't happen and that these people were all fake. [00:57:09] What's going on here is that Alex is mixing up his timeline because he knows he's talking to shitheads who don't actually want the truth. [00:57:14] They just want Alex to lie to them in a way that gets them off the hook for looking into any of this and having to deal with the fact that, oh my God, this guy is a real piece of shit.
[00:57:22] Yeah, when they go and talk to their friends in real life and they're like, I can't believe you talked to Alex, then they can be like, oh, Alex is different. [00:57:28] He's not what you think. [00:57:29] Right, right. [00:57:30] That allows that. [00:57:31] Whereas if they dealt with this in reality, they can't do that. [00:57:35] Yeah, they'd be like, why did you talk to Alex? [00:57:37] Yeah, I know. [00:57:38] I'm ashamed. [00:57:39] Also, not for nothing, but the idea that Alex is admitting to believing and preaching on air that the shooting didn't happen for the first five years isn't helping his case the way he thinks it is. [00:57:47] That was when most of these people were traumatized the most deeply, where the grieving process was literally interrupted by violent harassment from idiots who thought that they were actors. [00:57:57] By saying that he didn't think the shooting happened for the first five years, he's in effect admitting to complicity with the worst part of what he's accused of. [00:58:04] Alex said that he apologized and admitted that some of these experts he relied on to make the claims were crazy liars, but this introduces some more problems. [00:58:12] The first is that one of these so-called crazy liars was Steve Pieczenik, who remained a respected guest on Alex's show until the 2020 election when he got on Alex's bad side by insisting that all the ballots were secretly watermarked. [00:58:23] The second is that even after this point, Alex continued to repeat the claims made by Halbig and Fetzer, the ones that he's conceding these dudes made up. [00:58:32] Even under oath in his deposition, Alex still clung to some of these things to justify why he thought Sandy Hook was suspicious. [00:58:38] The third problem is that Alex didn't just repeat these guys' claims in a way that would be like, Wolfgang Halbig believes X, Y, or Z. Read about it here. 
[00:58:46] Alex presented the position that it was a fake event with crisis actors as the result of him researching the whole thing. [00:58:52] This wasn't a slip-up. [00:58:54] He was lying to the audience about doing research about this when in reality, he was just parroting the insane claims of these liars because it was more interesting to the audience and it drove traffic. [00:59:02] Like Alex said, it was a hot topic. [00:59:05] At the end there, you get an interesting insight into Alex's mind. [00:59:08] He says that he wasn't thinking about the shooting and he wasn't turning the knife. [00:59:12] The thing is, he clearly was turning the knife. [00:59:15] He wasn't thinking about what he was doing because he didn't care. [00:59:18] He didn't care that he was lying about people in a way that turned a knife in their wounds. [00:59:23] They didn't exist to Alex, but the narrative did. [00:59:25] And that was a juicy narrative. [00:59:28] If you can take the deeply traumatic event and blame it on your enemies, that is a powerful recruitment tool for the InfoWar. [00:59:34] And that was what Alex was interested in. [00:59:36] He didn't give a fuck about these people and what they were going through and what impact his actions had on them. [00:59:42] Yep. [00:59:42] He's a bad person. [00:59:44] He is a bad person. [00:59:45] But he's apologized. [00:59:47] Yeah, man. [00:59:49] What is fun about Alex is that because I'm like, I'm raised in a fiction world, you know, like I've always interacted with reality through fiction first, you know, that kind of concept. [01:00:01] That's where I learned about, you know, like I learned about a lot of shit from Moby Dick when I was like too young for Moby Dick, that kind of stuff. [01:00:08] Mainly, don't click MobyDick.com. [01:00:13] Good tip. [01:00:15] No, it is just like all of the morality tales, you know, that idea of the scarlet letter, of the albatross, of all of these things, should shove it up their ass. 
[01:00:27] Because in 2023, that does not exist. [01:00:30] It doesn't exist. [01:00:32] Do you know what I mean? [01:00:33] That idea of like... [01:00:34] But also, I think that all of those stories are saying that that's a bad thing to do. [01:00:39] Right? [01:00:39] I mean, wearing a scarlet letter is not good. [01:00:42] No, no, no, no. [01:00:43] I mean, the concept of public shaming and of something being with you forever. [01:00:50] Right. [01:00:51] That kind of idea. [01:00:52] And I mean, I mean, also in terms of like, oh, when he was permanently banned from Twitter, that no longer exists. [01:00:58] We should not allow him. [01:00:59] No, I know, but I mean, that kind of idea is that we should not even consider the permanently as a part of things. [01:01:05] Right. [01:01:05] You know, there's no permanently. [01:01:07] You can always redo it. [01:01:08] Yeah. [01:01:09] You can always run it up again. [01:01:10] Run it back one more time. [01:01:12] And like whatever kind of public shaming there is is not effective if you can find a group of people for whom it's profitable to deny that you have any reason to be ashamed. [01:01:25] Yep, absolutely. [01:01:26] So that's what we're experiencing here. [01:01:30] But the issue comes down to Alex has apologized so many times. [01:01:33] So, so many times. [01:01:34] No, he has not. [01:01:36] Not one time. [01:01:37] Not really. [01:01:38] No. [01:01:38] I mean, it's like, I'll post it on X if you want. [01:01:41] There's over 100 apologies that I've given. [01:01:46] Over 100. [01:01:47] In fact, probably 500. [01:01:48] Every show I go on, they ask this. [01:01:50] I apologized on Joe Rogan's show five years ago. [01:01:54] I apologized on Patrick Bet-David's show five years ago. [01:01:58] I mean, these are prominent ones. [01:01:59] I apologize on every show. [01:02:02] And I'll say it again. [01:02:03] I apologize that I just gave my commentary, because I'm really just a guy, a talk radio host.
[01:02:10] So I do that on the internet. [01:02:11] I just take calls and interview guests, and that I played devil's advocate. [01:02:14] And if that hurt people's feelings, I apologize. [01:02:18] But I did not send people to your houses. [01:02:20] I did not pee on graves. [01:02:22] I don't know any of this stuff that went on. [01:02:24] And then when they had the trials, after I was found guilty, trials and damages, there was never any video of people peeing on graves. [01:02:30] Any video of people at houses. [01:02:32] I love the idea that Alex has this, like, whenever he needs to get out of trouble for stuff, I'm just a talk show host. [01:02:37] You know, I'm just, you know, I'm an untrained, I didn't go to college guy. [01:02:41] And then on his show, he's screaming about how God chose him to fight the devil. [01:02:45] He has prophetic visions and he's never wrong. [01:02:49] Get the fuck out of here. [01:02:50] Nonsense. [01:02:51] Yep. [01:02:51] When I'm around anybody who might hold me accountable, I'm just a regular old person who can't be held accountable. [01:02:57] Right. [01:02:57] And when I'm not around anybody who can hold me accountable, not even God can hold me accountable, you motherfuckers. [01:03:04] I am basically an entity that requires you to change your religious beliefs. [01:03:10] If you accept me, you have to change your religion. [01:03:14] If you accept Alex, you have to accept, boy, so much. [01:03:19] Alexism. [01:03:19] Like, I honestly am getting to the point where I'm thinking this is more like cult shit than maybe we give it credit for. [01:03:25] It has to be, yeah. [01:03:26] So none of Alex's apologies are real. [01:03:28] They're all like this. [01:03:29] Complete bullshit, lies about what he did, followed by a performative, if it hurt you, I'm sorry. [01:03:35] There's no apology without responsibility, and Alex is literally incapable of taking responsibility for his actions.
[01:03:40] The perfect encapsulation of that is him insisting, I never sent anyone to your homes. [01:03:45] And maybe that's true in the sense that he never literally ordered anyone to go to people's homes. [01:03:50] However, he promoted and preached that these people were actors, faking having had their family members murdered so that they could be used to take away American citizens' guns. [01:03:59] They were being used as a weapon against the Second Amendment, and they were willing participants in that. [01:04:04] This put a giant target on these grieving people, which was only made worse by Alex's frequent comments that people needed to look into what's going on there and all that. [01:04:12] Beyond that, Alex literally did send Dan Bidondi to Newtown to harass people. [01:04:16] So maybe he didn't go to their homes, but he did go to their town. [01:04:19] And let's not forget that he was fundraising for Wolfgang Halbig, who was using those funds to continue his campaign of harassment against these people, where he literally did go to their homes. [01:04:28] I know that Alex doesn't want to feel responsible for his part in this, but that doesn't really matter. [01:04:32] Until he stops dodging this aspect of things, his apologies don't really matter. [01:04:36] Also, no one was accusing Alex of peeing on a grave, and I don't know why he would expect there to be video of it. [01:04:41] I mean, what happened was that Mark and Jackie Barden received a letter from someone who claimed that he had peed on their seven-year-old son's grave. [01:04:48] They got another letter threatening to dig it up to prove that their son didn't exist. [01:04:52] Alex has taken this unthinkably painful detail that the family shared during the trial, and he's weaponizing it for his own purposes. [01:04:59] Alex still doesn't give a single shit about any of these people and the pain he caused and was a part of causing.
[01:05:04] And it would really do Elon a lot of good to look into the actual reality of this instead of relying on a liar to tell you the truth about it. [01:05:11] Like, that's so dumb. [01:05:12] Yeah. [01:05:13] Yeah. [01:05:14] Pointless exercise. === Imagining Alex's Deterrent (06:25) === [01:05:16] I can't believe I harp on this so much, but I do. [01:05:19] It's just his apologies are so fucking. [01:05:22] I can't understand how the world doesn't just listen to what he says, because I understand using the word sorry, S-O-R-R-Y. [01:05:31] I understand that exists in a sentence. [01:05:33] It's not a magic word. [01:05:34] But if your sentence structure is like this: here is why what you're telling me I did wrong wasn't that bad. [01:05:41] And I didn't do it. [01:05:42] Second step: here is why what you're telling me I did wrong is probably something that I didn't even do. [01:05:49] Third step: here is why what you're telling me I did wrong hurts me, and you should apologize to me for hurting my feelings. [01:05:56] I'm really the victim. [01:05:56] Here is why, step number four. [01:05:59] So, anyways, I hope that now we can be fine. [01:06:03] After you apologize to me. [01:06:06] Yeah, and if any of this bullshit that really I'm the victim and all of that somehow inconveniences you, I'm sorry. [01:06:11] Yeah, exactly. [01:06:12] Yeah, that's to even pretend that adding the word sorry to that is an affront to language itself. [01:06:19] True. [01:06:20] And so Alex talks a little bit more in this next clip about how he is really the victim in all this. [01:06:24] Yeah, there we go. [01:06:24] I challenge people to find me saying any of their names. [01:06:27] I said one guy's name and I apologized to him on the stand. [01:06:32] The thing had probably 100 million views on five or six different YouTube videos of him smiling and laughing before he walked to the mic. [01:06:41] I played the clip one time and said, yeah, that looks like he's an actor.
[01:06:46] I did not attack him, did not come after him, did not say his child didn't die. [01:06:51] Look, I don't want to fight with him, though. [01:06:53] I said to them in a deposition, I said, I will chop my pinky finger off with a meat cleaver right now. [01:07:00] And I will. [01:07:03] If you'll just leave me alone and stop saying I made hundreds of millions of dollars off of you and stop saying I'm attacking you. [01:07:10] So, so it's very simple. [01:07:11] I had become, and I know what happened. [01:07:14] The media ran a year of articles attacking them in my name, saying things I never said as a straw man, enraging them against me. [01:07:23] And then, so they've been victimized, they've been manipulated by a PR operation. [01:07:29] And so, I would love to come on X with the families. [01:07:32] I'd love to raise money on this show or your show, Elon, or any of them. [01:07:37] I'd love to come on here and raise them $10 million for gun safety awareness next week. [01:07:41] I would love to, I would love to be in an open panel with them. [01:07:46] I would love it. [01:07:49] It actually looks like Ed Krassenstein wants to talk. [01:07:52] Oh, God. [01:07:53] So, Alex just said that for five years, he thought the shooting didn't happen. [01:07:56] By definition, he was saying that Robbie Parker's child didn't exist. [01:07:59] This is such lazy bullshit, and I can't believe Musk can be persuaded this easily. [01:08:03] Like, how can he possibly go through life if he's this uncurious about liars clearly lying to him? [01:08:08] When you take somebody out of their circumstance and put them in a different circumstance and then ask them a question about something, that can often help you understand what it is they're saying in that other circumstance. [01:08:18] For instance, if you were to ask Alex, would you be so easily tricked by a PR firm? [01:08:26] He would say, No, I'm not that stupid.
[01:08:29] And then we go over here into this scenario and he says, Oh, these families, they've been tricked by these PR firms, by these lawyers, by all of these things. [01:08:36] Now, if you understand how language works, that means he's calling them stupid. [01:08:40] Sure, and he's entirely invalidating the entire perspective and autonomy of the families by insisting that the media and the PR firms incited them against him. [01:08:49] He's literally incapable of accepting responsibility for his actions and cannot conceive of a reality where these people don't like him for the reasons they say. [01:08:57] If they think Alex is bad, their opinion can't possibly be genuine, because Alex knows that he's super good. [01:09:03] Yeah, another thing to point out is how dangerous a game Alex is playing here. [01:09:06] The families clearly don't want to engage in media bullshit games with him, like coming on a panel, something that would be literally just whitewashing PR for Alex. [01:09:14] Yeah, but what he's doing has a very real potential to restart a fair amount of harassment. [01:09:19] It's not a very difficult thing to imagine Alex's fans doing some kind of why won't you do a panel with Alex kind of shit. [01:09:24] And I find it difficult to imagine that Alex doesn't understand that that is a potential outcome of what he's doing. [01:09:29] Like, he's making attention and harassment in their direction more likely, and that's, uh, probably not good. [01:09:37] Yeah, I mean, I think it will be an interesting test to see what kind of a deterrent Alex's court case actually was against future harassment like this, you know, because if Alex's fans continue harassing, I think that says everything you need to know. [01:09:53] I would imagine that they will not. [01:09:57] Well, so Musk takes this question from Ed Krassenstein. [01:10:02] So, we've had both Krassenstein brothers. [01:10:05] Who are the Krassensteins? [01:10:07] What are they?
[01:10:08] Are they the one? [01:10:09] Were they the Facebook ones? [01:10:11] They were... no, that's the Winklevoss twins. [01:10:14] Yeah, they're the ones who were with them. [01:10:16] They don't even know who founded Twitter or Facebook. No brothers exist anyway. [01:10:21] Which are the property brothers? [01:10:22] Are they brothers or those weirdos on TV? [01:10:25] Okay. [01:10:26] They flip houses. [01:10:27] All right. [01:10:28] The Krassenstein brothers are guys who had a Twitter presence, and they were real, like, blue wave hashtag resistance types. [01:10:38] Oh, like the 2017 people who got popular on the show. [01:10:41] A lot of like, uh, Trump is going to go to prison, don't you agree? [01:10:46] Sure. [01:10:46] Kind of engagement farming kind of stuff. [01:10:49] Those guys. [01:10:49] And then I think there also were some scams and fake accounts. [01:10:53] And so I didn't follow it too closely. [01:10:54] Yeah. [01:10:55] I'm not an expert on the Krassenstein brothers because I think they're just, I don't know, social media shitheads for the most part. [01:11:00] But they're ostensibly on the left. [01:11:01] And so that's why they are like sort of a left-wing voice in this thing. [01:11:07] Great. [01:11:07] So now we've got both of them, though. [01:11:08] Good to have you guys there. [01:11:09] I already heard from Brian. [01:11:10] Now Ed's going to come in. [01:11:12] Ed Krassenstein wants to talk. [01:11:14] Maybe, Ed? [01:11:16] Could be good to hear from you. [01:11:17] Yeah, so I kind of have a question for you, Elon. [01:11:21] So let's say that. [01:11:24] This is not an opportunity to interview me. === Alex Jones and Prior Restraint (09:37) === [01:11:26] It will be. [01:11:27] In between Alex Jones, though. [01:11:29] It could be an interesting idea. [01:11:30] So let's say Alex Jones or somebody else does the same thing, but clearly directly claims that a school shooting did not take place when we know it did.
[01:11:41] Does that future person or Alex Jones get banned, or is the new policy that they remain? [01:11:51] We need to look at the circumstances there. [01:11:56] The rule that we're trying to follow here is to obey the law. [01:12:00] Stop. [01:12:01] Stop. [01:12:01] Obey the laws of the United States and the laws of the countries in which X is present, and really do our best to not go beyond the law, on the premise that if the people wish the law to be different, then the people will ask their representatives or their leaders to change the law. [01:12:22] But otherwise, our goal is to hew as closely to the law as possible. [01:12:29] So if somebody says something that is unlawful, then we will take action. [01:12:34] If someone does not do something that is unlawful, then we would aspire to not take action. [01:12:38] So I guess the answer is that you can deny school shootings on Twitter. [01:12:41] That's fine. [01:12:42] This dude's a real piece of shit. [01:12:43] He's hiding behind this thin veneer of pretending that his rules on Twitter are just going to be in line with the law of the country, but that's not at all how he's operated. [01:12:50] Within the last month, he said that using the word decolonization was a euphemism that, quote, necessarily implied genocide, which wasn't going to be allowed. [01:12:57] Cool. [01:12:58] Does the law of the United States say that you can't use the word decolonization? [01:13:01] It seems like Elon's extending past the law of the land there because he wants to. [01:13:06] In November of last year, Elon kicked off a bunch of people for making parody accounts about him. [01:13:10] Was that against the law of the land, or was that Musk just being a little fucking baby? [01:13:14] Saying that you'll allow whatever speech is allowed by law is a perfect way to run cover for why you're not kicking off the racist trolls and lying grifters who are Elon's main fan base.
[01:13:23] It just rings a little bit hollow when that's clearly not the standard that you're using. [01:13:27] And to be clear, Elon's free to use whatever standard he likes, so long as he's not restricting people based on a protected class. [01:13:33] Like, it is his business. [01:13:34] He did buy it. [01:13:35] Yeah. [01:13:36] But it's a pathetic charade to pretend what he's doing is guided by adherence to the law and protecting speech. [01:13:41] It's just, I don't know, it's a farce. [01:13:43] I mean, it is like a guy going out of his way to make me not like him. [01:13:50] Yeah. [01:13:50] Because there is, there is one, I mean, one thing he could do that nobody would even be mad at him about is if he was just like, I am going to behave whimsically. [01:13:59] Yeah. [01:14:00] I'm going to evaluate each one on a case-by-case basis depending on how I feel on that day. [01:14:06] Maybe he will be banned and maybe he won't. [01:14:09] I don't fucking care. [01:14:10] Get ready. [01:14:11] It's chaos world. [01:14:12] Absolutely. [01:14:13] If he said that, I'd be like, this is a shithead who needs to be stopped, but that's a great thing to say. [01:14:18] Good on you, buddy. [01:14:19] Yeah, I mean, it's the pretending that you're just like, oh, we'll adhere to the laws. [01:14:23] Oh, fuck you. [01:14:24] That's just a way to not have people leave. [01:14:28] Oh man, it's just, everybody who says they're so courageous is a coward. [01:14:33] Everybody who says they're smart is fucking stupid. [01:14:35] It's annoying. [01:14:37] Completely unrelated to that. [01:14:38] Yeah. [01:14:39] Then Vivek Ramaswamy shows up. [01:14:41] God damn it, it's annoying. [01:14:44] So here he jumps in. [01:14:46] I think there's two kinds of ways to worry. [01:14:49] If you're worried about the spread of misinformation or false facts, one is let's deal with the content of the speech itself.
[01:14:56] Elon talked about community notes as being imperfect, though it is maybe the best method out there on the internet, and I think that that's one set of discussions to have. [01:15:06] That's quite a different thing from what the law certainly looks at as the most draconian First Amendment violation, which is what it calls a prior restraint, which is to previously just restrain a person from speaking, period, and I think that that is, no matter what the content of the speech is. [01:15:25] It's quite a different thing to say that because of who you are or what you have said in the past, you may not say another thing in the future, and so I just think that those are two different concepts, and I, for one, just wanted to share my experience of, uh, my Alex Jones experience, if I may as well. [01:15:44] That is perfectly said, because you don't cut somebody's tongue out, yeah, even if I did bad things in the past, but you know what, if you, I mean, but if people say false things, they deserve to have the consequences for that particular speech aired out in the marketplace of opinions, and you know, community notes is one feature of doing that, and I think that's a legitimate discussion to have. [01:16:02] All these people love community notes because they're entirely ineffective in terms of limiting misinformation. [01:16:08] Yeah, it's the best. [01:16:08] It's important to understand that. [01:16:10] They love this system because it's pretending to address the problem of how they make money, but it's actually not doing anything. [01:16:16] Literally every post Alex puts up could be community noted, and the people who want to believe him will ignore each one. [01:16:21] Prior restraint is a fun way for Vivek to describe being booted from Twitter, but it's not the same as having your ability to speak taken from you.
[01:16:29] This is a group of adults, including one of the richest people in the world and a guy pretending to run for president, sitting around whining about how they used to have to worry about being suspended from Twitter. [01:16:38] This is a fucking embarrassing look. [01:16:39] Yeah, yeah, yeah. [01:16:44] If I was, if I was alive right now, I'd be like, real embarrassing. [01:16:48] Yeah, what would, what would our, our grandparents think of the political system working like this? [01:16:55] I genuinely, I genuinely 100% believe that we could go back in time, explain what's going on to a Neanderthal. [01:17:04] They would purely understand it instantly and then be like, well, we got to kill all humans, that's what we got to do. [01:17:10] We can't go down this road. [01:17:11] Homo sapiens must be destroyed, Neanderthal forever. [01:17:17] Yep, yep. [01:17:18] Anyway, I will say that Elon Musk makes an interesting point in this next clip, okay, about people who should maybe be booted from social media. [01:17:25] Interesting, yeah, I mean, as Alex said earlier. [01:17:28] I mean, and and this is, it does not mention the, any kind of, any kind of attack on George Bush, but George Bush did say, uh, George Bush II did say that there were WMDs, right, in Iraq. [01:17:38] He said it not once, but but many times, and and that, that was, that was, that there was, there was. [01:17:44] He did not have sufficient evidence to make the, to make, and a lot more people died as a consequence of that statement, exactly, than of anything that Alex Jones has, incorrectly, not killed anybody, but Alex, I think, you know, Alex, I think one of the things I just wanted to share with people is that I was actually just curious about this guy, Alex Jones. [01:18:02] You know, I'd never met him, but I happened to be in Texas, I visited the southern border, and you know, I popped in Austin on the way back or somewhere, I forget where we were, and I heard Alex Jones was, that's where he was.
[01:18:12] I said, I want to meet this guy that everybody says don't meet. [01:18:15] And so we sat down, and I was actually surprised. [01:18:17] We actually aired it on X, or we, or I put out clips of it on X, I can't remember in like a podcast format. [01:18:24] And so I was expecting to get in a debate about the Sandy Hook thing. [01:18:27] And as soon as I open up, what I get is a guy apologized for being wrong. [01:18:30] And then we moved on and talked about something else. [01:18:32] Oh, wow. [01:18:33] So, in terms of what Elon was saying, fine, kick Bush off Twitter then. [01:18:37] Yeah, it's interesting that for these guys, the deaths in Iraq are the result of speech. [01:18:42] Like, somehow Bush saying that there were weapons of mass destruction was the underlying cause of their deaths. [01:18:47] There were decisions made and actions taken that led to the war, and misleading the country about this was something that they used speech to do. [01:18:55] But is a public figure lying against the First Amendment, the law of the land that he cares so much about? [01:19:01] In a situation like this, they're perfectly able to take speech and then abstract it out to the consequences of actions taken related to that speech. [01:19:09] Bush lied about weapons of mass destruction, which was a part of creating a larger public support for the idea of going to war. [01:19:16] The problem really isn't the speech itself, it's the results of it. [01:19:19] The lie is bad, but the results are really what you care about, like all the dead civilians. [01:19:24] Now, let's expand this out to Alex's situation. [01:19:27] He lied about families of murder victims being actors and that their loved ones never existed or died. [01:19:32] The speech itself there is bad, and I think that all right-thinking people would consider it distasteful. [01:19:37] But the real problem is the results of that speech. 
[01:19:39] That speech was part of creating an environment where the victims were targets of harassment and receiving threats, like having their children's graves desecrated and shit like that. [01:19:49] Yeah. [01:19:50] Neither of these results were necessarily what the person was going for. [01:19:54] Bush wasn't setting out specifically to kill civilians. [01:19:57] He just didn't really care if that happened as a result of the thing that he was intending to do. [01:20:01] Alex didn't set out specifically to terrorize these families. [01:20:04] He just didn't really care if that was the result of him doing the kind of coverage that he wanted to do. [01:20:09] This should be a simple thing for Elon to grasp if he's bringing this up as some kind of a free speech issue, and yet he remains a buffoon. [01:20:15] Also, to Vivek's point, Alex didn't apologize for being wrong when you talked to him. [01:20:19] He lied about what he did and then did the performance of a bad apology in order to defuse the situation and make sure you didn't explore the point anymore and realize that you were talking to a monster. [01:20:28] Yeah. [01:20:29] But whatever. [01:20:31] So when Vivek brought up the great replacement theory, I mean, on the debate stage. [01:20:40] Yeah, I was so, so disappointed, because the funniest thing in the world could have happened, which is just one of them turning and looking at Vivek and just being like, yeah, Burgum should have been standing there. === Elon Complains About PR (15:10) === [01:20:53] It would have been so fucking funny. [01:20:55] And it would have ended everybody's lives. [01:20:57] We would all have like clapped and walked away from the United States into the oceans and returned to our dolphin brethren. [01:21:04] I believe that's what would have happened. [01:21:06] You a Doug guy? [01:21:08] You a big Doug guy? [01:21:10] Yeah, that's what I was referencing. [01:21:13] I want to be clear about something.
[01:21:15] I know nothing about Doug Burgum. [01:21:17] Me neither. [01:21:18] I know nothing about him. [01:21:19] Never even knew the name existed. [01:21:21] Now that I do, he's gone. [01:21:24] It's amazing. [01:21:24] Yeah, he's gotten out of the GOP primary. [01:21:27] I know nothing about him, but I have apparently made one too many jokes about supporting his candidacy. [01:21:34] Sure. [01:21:35] That sounds right. [01:21:36] Yep. [01:21:37] Maybe next time. [01:21:38] So Elon complains. [01:21:41] That was all right. [01:21:42] Elon complains here a little bit about PR firms. [01:21:46] Just to elaborate on the public relations firms, PR firms, which really should be called propaganda firms because that's literally public relations is literally a propaganda word for propaganda. [01:21:57] You can read the history of how public relations was developed. [01:22:01] Edward Bernays. [01:22:02] Correct. [01:22:04] So, but the way that PR firms actually corrupt the media is actually in a very significant way, is that journalists are paid very little as journalists, but they can later retire and join a PR firm and make a lot of money on the basis of the articles they've written in the past and their contacts at their publication. [01:22:31] So there is actually a strong monetary incentive, very strong, for corruption of the traditional media. [01:22:39] Elon, Elon, I'm so glad you said that because people don't understand that's how these work. [01:22:43] And I'm just now learning about that. [01:22:44] The PR firms are so powerful. [01:22:46] They're full of intelligence agency former people. [01:22:48] So just like a general will sign off on some bad weapon system, then he retires and gets on the major board of the defense contractor for $5 million a year. [01:22:58] It's a revolving door. [01:22:59] And so that's why when these big PR firms snap, they jump, the media says how high, and you're actually right. 
[01:23:06] When it comes to fact-checking, they're really one of the most nasty, deceptive groups out there. [01:23:13] Yeah, they're propaganda firms. [01:23:15] It would just make a lot more sense if you just think PR firm equals propaganda. [01:23:18] PR equals propaganda. [01:23:20] That is literally what it means. [01:23:24] And if you read the story, it's ridiculous. [01:23:31] I think you're dropping out, Elon. [01:23:33] Is he dropping out for anyone else? [01:23:37] Yes, he's dropping off for me. [01:23:38] Elon, you said. [01:23:39] Yeah. [01:23:40] Yeah, you just dropped out, Elon. [01:23:41] I think you're back now. [01:23:43] Oh, okay. [01:23:44] Is your mic working? [01:23:45] It's good. [01:23:46] Now it's good. Now my mic is working. We're on the PR firm— You were on PR firms equal to propaganda. [01:23:52] Yes. [01:23:54] The public should really understand that public relations literally just means propaganda. [01:23:59] PR firm means propaganda. [01:24:01] And the PR firms have very strong control over the traditional media because that's where journalists go to retire and make tons of money. [01:24:11] So there's a very strong monetary incentive for journalists to do what the PR firms say because they know that that is where they're going to get rich in the future. [01:24:22] Bingo. [01:24:22] Bingo. [01:24:23] So this is complete insanity, but it makes sense for Elon to have this stance. [01:24:27] He's been acting like an erratic asshole, posting all kinds of racist and anti-Semitic shit, associating with some of the most bigoted accounts on his own website, and people notice. [01:24:35] Because people notice, there are articles about his behavior, and he needs to come up with a good excuse for why that is. [01:24:40] It's not because his behavior is unacceptable and really scary considering his position. [01:24:44] It's because PR firms are all propaganda outlets out to get him. [01:24:47] Now, I'll go with him halfway on this. [01:24:49] Sure. 
[01:24:49] I don't think PR as an institution is a really great thing. [01:24:52] I've always been averse to marketing and advertising in their entirety, but what Elon's saying is fucking idiotic. [01:24:58] Also, fun fact, when Elon took over Twitter, one of the things he did was to diminish the PR department. [01:25:03] This was probably because that was the point of contact for journalists, that they could contact the company and ask for comment on things like him letting all the Nazis back on the site. [01:25:12] However, in time, Musk realized that he kind of needed PR in order to run his company, so now he just calls it, quote, business operations. [01:25:20] They even brought in Joe Benarroch as the head of this effort, a guy whose previous jobs were director of Facebook's small business and international advertising, as well as the executive vice president of communications for NBC. [01:25:33] Elon operates like a child. [01:25:35] He makes a big mess, and then Linda Yaccarino and the rest of the corporate PR that's not called PR inside Twitter work like mad to clean it up and save the company from losing more money. [01:25:44] He's like somebody who thinks doing laundry is a scam because someone else always does his laundry. [01:25:49] Also, journalists definitely aren't paid enough, but they don't just do the bidding of PR firms so they can get a job with them after they retire. [01:25:56] I'd like to see any backing of this claim. [01:26:00] That's a conspiracy theory that belongs on InfoWars. [01:26:03] I mean, I get it when you're like lobbyists or politicians who then go through the revolving door, that makes perfect sense. [01:26:10] Not quite analogous. [01:26:11] I don't remember Woodward being like, oh, no, I'm a PR flack. [01:26:16] But maybe he was. [01:26:17] Actually, he probably was. [01:26:19] I bet Woodward was a PR flack. [01:26:21] But this also seems to imply that PR is not a profession of its own. 
[01:26:25] I don't even know what we're talking about, to be honest. [01:26:28] Right. [01:26:28] I don't know. [01:26:29] Yeah. [01:26:30] Anyway, just sort of stuff that exists in Elon's brain. [01:26:33] Yeah, that sounds right. [01:26:34] So we got a question for Elon here. [01:26:36] Elon, a question is more for Alex. [01:26:39] And me and Alex were discussing it before you jumped on. [01:26:41] Is the pressure, if any, that you've faced since reinstating or since you said you'll reinstate Alex. [01:26:47] Now, Alex kind of referred to extreme scenarios where your life could be in danger because of what you're doing with X. [01:26:55] I know you mentioned it briefly in one of the first spaces we had. [01:26:58] You made a joke about, hey, guys, if something happens to me, I'll never commit suicide. [01:27:02] But how much pressure have you faced in recent months? [01:27:06] Obviously, we've seen the back and forth with a few organizations that are trying to censor others. [01:27:11] And has that increased since you reinstated Alex? [01:27:17] I mean, at this point, reading the sort of legacy media is just depressing. [01:27:26] I imagine so. [01:27:27] Evidently, once in a while, we'll go see Google News or whatever, or Yahoo News or whatever, some sort of random thing. [01:27:35] And I'm like, I'll accidentally Google my name every five seconds for the rest of my life. [01:27:38] The quality of the propaganda isn't even good. [01:27:41] Look, if you're going to do propaganda, at least make it entertaining. [01:27:44] And I find it dull, boring. [01:27:49] And just not even well written. [01:27:52] Well, that's right. [01:27:53] They'll put out one thing and then they all parrot it. [01:27:56] It's dog-lazy propaganda. [01:27:58] It's like, we're caught up. [01:28:01] We've got Andrew Tate here as well. [01:28:02] Andrew, how are you? [01:28:03] Oh, God. [01:28:04] Oh, God. [01:28:06] Andrew, your mic, is it working? 
[01:28:08] You've got to unmute bottom left corner. [01:28:09] Can you hear me? [01:28:11] We can, yeah. [01:28:12] Yeah, I'm good, friend. [01:28:13] I'm in the middle of a poker game, but since this is the battle for humanity against the Satanists and the Matrix with its constant deception of the populace, I thought I would jump in and say hello to everybody. [01:28:22] This is good stuff. [01:28:24] Now, the second Tate brother has shown up. [01:28:26] We have got two collections of shithead brothers, the Krassensteins and the Tates. [01:28:31] Both Tates have a sibilant S? [01:28:33] They sound exactly the same. [01:28:34] That is insane. [01:28:36] And now I can't stop thinking about it. [01:28:39] When Tristan was on earlier, I thought like, that's Andrew. [01:28:44] He sounds almost exactly the same. [01:28:46] I can't believe they both have the same sibilant S. That's freaking me out. [01:28:50] Well, maybe it's practiced. [01:28:52] That does seem fake. [01:28:53] That would make sense. [01:28:54] So I guess there's no real answer to whether or not Elon's life is in danger or there's pressure. [01:28:59] He just took the opportunity to complain about how boring MSM propaganda is. [01:29:03] Yeah. [01:29:04] Fine. [01:29:04] Cool. [01:29:05] Yeah. [01:29:06] You know, it's just getting depressing, really, reading all of the people point out how fucking stupid I am. [01:29:11] Right. [01:29:11] Yeah. [01:29:12] That's probably. [01:29:13] And all the negative indications about my business and shit. [01:29:17] Yeah. [01:29:18] Yeah. [01:29:18] I imagine, I imagine it must be a little bit like going home every day to when my, you know, my mom would be like, your dad's going to be home after I did something wrong at school. [01:29:30] And just never not doing it, just continuing to do it. [01:29:35] What if you get on Google and your dad will be home soon? [01:29:38] Yeah, I mean, it might as well be that. [01:29:39] It really is. 
[01:29:40] And he's just like, nah, I think I'll just keep doing the exact same shit. [01:29:43] Fine. [01:29:44] So Alex is pretty enthusiastically kissing Elon's ass throughout this. [01:29:49] Your thoughts, Andrew? We had your brother, Tristan, come on earlier. [01:29:54] Your thoughts on Alex Jones being back on X? [01:29:57] I'd rather hear his thoughts on Elon Musk being the biggest maverick of the last 500 years. [01:30:02] I'm not kissing ass here. [01:30:03] Elon, I mean, you've got big ones, man. [01:30:07] On every front, you are literally overturning the entire power structure. [01:30:10] I was just going to say this before Andrew got in, but I just want to say this while you're here. [01:30:14] I mean, you are literally changing the entire paradigm, and you've definitely got the system scared. [01:30:22] And so everybody needs to support X. Everybody needs to support the sponsors on X. [01:30:27] I personally am doing all my Christmas shopping this year with all the great gadgets and stuff that are on X. [01:30:33] This is a bit much. [01:30:35] I know the kids are eating ass these days. [01:30:37] But Jesus Christ. [01:30:40] I feel like Alex is. [01:30:44] I mean, if anybody has familiarity with him, has listened to enough of him, hearing him be this subservient should set off some alarm bells. [01:30:54] It should be like, hey, Alex, fucks wrong with you. [01:30:57] Why are you acting like this? [01:30:59] You ever, there were these old, those old 90s strongman, ESPN World's Strongest Man competitions. [01:31:07] And back in the day, they stopped doing it for very obvious reasons. [01:31:11] But back in the day, they used to do tug of war between two absurdly strong, like 400-pound men. [01:31:18] And it just lasted forever and it was incredibly boring because they're absurdly strong relatively equally. [01:31:24] They would need one of those guys to get Alex's nose out of Elon Musk's ass right now. [01:31:29] Yeah. [01:31:30] Yeah. 
[01:31:30] That was a long walk. [01:31:32] It was a long walk, but I got there. [01:31:33] You did arrive at your destination. [01:31:35] I sure did. [01:31:35] So Andrew Tate finally does get around to answering this question that Alex so rudely interrupted to brown nose a little, but here we go. [01:31:44] The purchase of a simple website has literally cracked the Matrix in real time, and it becomes extremely difficult now to run the psyops they were previously running and enslave the populace, which is their primary goal. [01:31:55] So Elon is a hero, absolutely. [01:31:58] And the risks you are taking, Elon, I don't think many people at home actually understand the gravity of the risks you are taking because your ability to speak freely is heavily leveraged against your insignificance. [01:32:08] You're only allowed to speak if nobody listens to you. [01:32:11] And if you get big and people start listening, they're going to come at you hard. [01:32:14] And I think I'm not completely versed, but from what I understand, Elon's already suffering the lawfare tactics, which they're going to do. [01:32:21] They're going to keep pulling things out of the hat to try and slow him down. [01:32:26] Andrew, let me interrupt before I forget. [01:32:27] I don't give any attention. [01:32:29] The same law firm that came after me with these PR firms. [01:32:36] You've just jumped out, I think, Alex. [01:32:38] Did you just drop out? [01:32:39] Anyone else can hear? [01:32:40] Yeah, I think he got a call. [01:32:41] Now you got a cold. [01:32:42] Yeah, go ahead, Alex. [01:32:43] There is a three-letter agency running this. [01:32:45] Not all CAA. [01:32:46] Let's just say it starts with a C and it ends with an A. Sorry, still could be. [01:32:51] Still could be CAA. [01:32:52] Yeah. [01:32:53] Might be the great talent agency. [01:32:56] No, so what Alex is saying is that, you know, Mark Bankston is suing Elon over that. [01:33:02] So that's the connective tissue that Alex is making. 
[01:33:04] He's trying to say that that's all being run by this PR firm out of New York. [01:33:09] And this is a way of connecting his own plight to Elon's plight and then making Elon more sympathetic to, oh, the way I'm being attacked is just like the way you are being attacked. [01:33:19] That's where we are one. [01:33:20] Yeah, it's a way to be even more convincing. [01:33:23] Yeah. [01:33:24] So yeah, Andrew Tate freaks me out. [01:33:28] Is that I saw, I've seen pictures of him. [01:33:33] I really thought he I've never, I think this is the first time I've actually heard his voice. [01:33:38] Maybe. [01:33:38] I feel like we might have covered a time that he was on InfoWars way in the past. [01:33:43] Okay. [01:33:43] Yeah. [01:33:44] I just thought he would try and sound cooler. [01:33:46] No, he doesn't need to because he sex traffics people. [01:33:49] See, that's kind of, but I thought he engages in organized crime and enslaves. [01:33:53] I know. [01:33:54] That's the thing. [01:33:55] I thought he was doing a whole thing. [01:33:57] This is weird. [01:33:58] No, the voice does not have to be... blah. So, yeah, he also just got some news that he's not getting all of his shit back from the Romanian government. [01:34:07] He asked to get his millions of dollars of things that were seized from him back. [01:34:14] And they're like, no, we're not going to give that up. [01:34:16] Yeah, that's probably smart. [01:34:17] So he's dealing with that. [01:34:19] I don't know. [01:34:21] There's a part of me that feels like this is all people whose identity surrounds how important it is that they post on Twitter. [01:34:32] And to me, I understand that using Twitter is an incredibly effective tool to scam people and to easily manipulate folks into your revenue streams. [01:34:44] Yeah, that's a con man's dream. [01:34:45] And I understand why that is so important to them. 
[01:34:48] But pretending that it has some kind of a like, you, Johnny Q Public, the person on the street, your voice is being taken away. [01:34:57] I don't know. [01:34:59] It's just another level of the con. [01:35:02] It's bizarre to me to hear them take this so seriously. [01:35:05] Yeah, it is like the levels of the con is what's so fucking infuriating is that you're you're like, okay, finally, I've unraveled it. [01:35:16] I'm out of this con. [01:35:17] And then, oh shit, it's just another part. [01:35:19] Now I have to just be gone. [01:35:21] You know, like I have to completely get rid of Twitter. [01:35:23] And then not just that, if you find out that's part of the other thing. [01:35:26] And then Andrew Tate's watching you. [01:35:28] You got to run. [01:35:29] Everyone, get off the grid. [01:35:30] That's what I'm saying. [01:35:31] Oh, no. [01:35:31] Yep. [01:35:32] You know, who's off the grid? [01:35:34] Who? [01:35:34] Tristan Tate. [01:35:35] He's waiting for you. [01:35:36] God damn it. [01:35:37] So Alex makes a declaration of victory for Elon Musk. [01:35:41] Sure. [01:35:41] This is what happened. [01:35:42] I'm going to shut up. [01:35:43] I want to hear from Elon, but this is so historic. === Elon Musk's Impact (11:55) === [01:35:45] Elon Musk's courage, and it's true, I'm saying, has broken the back of the globalists. [01:35:49] They'll never be able to turn this around again unless they have a nuclear war. [01:35:54] Elon Musk has broken their back. [01:35:59] Yeah. [01:36:00] Yeah. [01:36:00] Well, I guess some people are afraid to die, but I am not. [01:36:04] Oh, my God. [01:36:05] Oof. [01:36:06] Oof, oof, oof. [01:36:08] Bunch of dorks. [01:36:09] Wow. [01:36:10] There's a lot of talk about how none of these people are going to kill themselves. [01:36:13] Like, Andrew Tate says it a bunch. [01:36:15] Elon Musk brings it up. [01:36:17] Yeah, I know. [01:36:18] It's all the like, we're about to be killed by the man. 
[01:36:20] I understand you're not going to do that because people like me are afraid of saying it out loud for very smart reasons. [01:36:28] So Elon discusses his ideas of how we need to colonize space. [01:36:34] Sure. [01:36:34] And you may notice his kid is making a bunch of noise in the background. [01:36:39] Okay. [01:36:39] So Elon, when are you going to? [01:36:41] I know you got 100 irons in the fire, but I've really, when you talk about we need to create a plan B for humanity, well, that's really, I mean, an alternate master plan. [01:36:54] Because the globalists are in control right now, you're trying to wrest control, with us helping. [01:36:58] I mean, when are you going to put out your battle plan, or are you already putting it out in pieces? [01:37:04] No, I mean, what I'm saying is that actually, I think we should expand humanity. [01:37:10] Like, basically, we should have basically more kids. [01:37:13] You know, population should increase and we should become a multi-planet species and make life multi-planetary. [01:37:24] Build a self-sustaining civilization on Mars. [01:37:27] Let's do that. [01:37:28] And then ultimately, this will be long after I'm dead, probably, but almost certainly we can go to other star systems and go out there. [01:37:37] And I don't know, maybe we'll find some long-dead alien civilizations. [01:37:41] And I don't think we want to be one of those lame one-planet civilizations that never got beyond its home planet. [01:37:48] I mean, we've got to, you know, what are the aliens going to think of that? [01:37:55] We got to make a good showing. [01:37:57] Team Human. [01:37:58] Yeah, absolutely. [01:38:00] That's extremely disappointing. [01:38:03] But it's essential, and truthfully, it's so amazing we even speak about these things. [01:38:08] Only two years ago, you couldn't even speak about these subjects, but it's so pertinently obvious to anyone who pays attention. 
[01:38:14] What exactly does Andrew Tate, alleged human trafficker, think that we couldn't discuss two years ago? [01:38:20] Does he think that rambling vague ideas about colonizing Mars is what drew the censors to him? [01:38:24] I mean, it's a little bit much. [01:38:26] I mean, great. [01:38:28] Talk about do you want to put a colony on Mars? [01:38:30] Great. [01:38:31] Don't want to disappoint the aliens. [01:38:33] Great. [01:38:34] So long as anyone is listening to Elon Musk, we will, as a species, be long dead before Mars colonies are even a thought. [01:38:41] It does seem like that'll probably not help. [01:38:44] Yeah, like, let's colonize Mars, but let's also light Earth on fire while we're doing it. [01:38:52] And in terms of, like, really trying to set up this, the way that we're going to colonize Mars, it's important that everyone's voice is heard. [01:39:00] So I'm going to get Laura Loomer, Jack Posobiec, a couple sex trafficking brothers, and the Krassenstein brothers together, and we're going to really get to the bottom of this thing. [01:39:08] You know what, Dan? [01:39:09] You have made a very good accidental point. [01:39:12] Those people need to be on a ship to Mars right away. [01:39:17] Here's my problem. [01:39:19] If you send them to Mars. [01:39:20] Ship not big enough? [01:39:21] No. [01:39:21] No, if it's successful and they establish a colony there, you've just given them a planet. [01:39:28] I gift it to them happily. [01:39:30] I don't know if you have the authority. [01:39:32] As king of Mars, I do. [01:39:34] So they're all rambling about the globalists, and these globalists are bad. [01:39:38] Sure. [01:39:39] And so one of the Krassenstein brothers, I'm not sure which one, decides to pipe up and say, what about it? [01:39:44] Not all globalists. [01:39:45] Oh, my God. [01:39:46] What are we doing? [01:39:47] Yeah, it's a mess. 
[01:39:48] You guys are all attacking the globalists, but if you ask a globalist, like I have friends who I would consider globalists. [01:39:56] If you ask them, their ideologies are aligned that they believe that somebody living. [01:40:05] I don't know what you're doing. [01:40:06] Lives in America, and I know, you know— They've already enslaved the third world. And that's not how everybody who you would categorize as— A lot of useful idiot globalists... Globalists at the top are depopulationists. [01:40:21] That's their word. [01:40:22] So maybe if you want to look at the top, you can say globalists at the top. [01:40:26] Some of them might have that view. [01:40:27] But, you know, if you just talk to an ordinary person who views themselves as a globalist, they're not saying, oh, you know, I'm evil. [01:40:35] They're not an evil person. [01:40:36] They just have this belief that everybody, we're not talking about being a race. [01:40:40] We don't care about them. [01:40:41] We don't care about you. [01:40:42] I would call that an international code word. [01:40:45] What are you doing? [01:40:45] Globalists want one world government run by corporate— [01:40:48] What are you doing? [01:40:49] I mean, I think you can label them differently. [01:40:51] Stop this. [01:40:52] Well, Henry Kissinger was a globalist. [01:40:54] Zbigniew Brzezinski was a globalist. [01:40:56] I'm not trying to be mean to you, but their number one rule is the Earth is too small. [01:41:01] We can't expand. [01:41:03] We've got to bean-count and put everybody on rations. [01:41:05] We've got to social engineer and end the normal human program because humans are failed. [01:41:10] They want to turn us into factory farm humans. [01:41:12] Those are a lot of surprise. [01:41:14] I'll answer the question. [01:41:15] I'll answer the question. [01:41:16] Sorry, guys. [01:41:19] Some of these titles are a little confusing. [01:41:20] If you say someone's an internationalist or a globalist. 
[01:41:23] Arbitrary, in fact. [01:41:24] Almost made up. [01:41:27] To avoid saying what you actually want to say. [01:41:29] Does someone have as an axiomatic belief that there are too many people on the Earth, or do they not? [01:41:35] Do they believe that the Earth can sustain the current population, or do they think it cannot? [01:41:40] Now, the reality is Earth can actually handle a human population probably 10 times larger than the current population. [01:41:47] Not with you. [01:41:47] It's actually very sparsely populated by humans. [01:41:50] We only see density if we're in a dense urban environment like New York or Boston, London, or something like that. [01:41:58] But like, here'll be a good test. [01:42:01] If you took a plane from LA to New York and you try to drop a bowling ball and hit somebody, your chances of success are basically zero. [01:42:09] You'd have to drop 10,000 bowling balls, maybe. [01:42:12] It's cute that one of the Krassenstein brothers decided to try to stick up for all the globalists out there, but unfortunately he doesn't understand the way terms are used in the extreme right-wing communities where the people he's talking to exist. [01:42:22] These people are just batting around terms that have no consistent definitions. [01:42:26] There's no point. [01:42:27] No point. [01:42:28] Blurgs. [01:42:29] Just say, yeah, it doesn't matter. [01:42:31] We make sounds. [01:42:32] We don't make words. [01:42:33] We make sounds. [01:42:34] Now, as for Elon's point, he's an idiot. [01:42:36] It is true that if you just go based on how much space there is in the world, we could sustain a much larger population than we have. [01:42:42] We could have tens of billions if we just had urban population densities all over the place. [01:42:47] But the carrying capacity of the planet involves a whole lot of other variables that his idiotic bowling ball test doesn't take into account. 
[01:42:54] Just on a basic level, it's estimated that you need about 2.5 acres of land to feed one person per year. [01:42:59] There are some estimates that are lower, and there are some that are higher, depending on diet and how directly you're involved with farming, but this is a median estimate. [01:43:06] There are only about 11 billion acres of arable land in the world that can be used for agriculture, so that's a pretty difficult hurdle to get past in terms of this infinite population thing. [01:43:16] I guess you could turn to innovations to try to find a way to feed people using less land, but you've seen the way that Alex and his idiot friends have responded to synthetic meats and insect protein, which are efforts to do exactly what they would want. [01:43:29] Even beyond the concern about feeding all the additional people Elon wants to create, you run into so many more issues like access to water. [01:43:35] Climate change is exacerbating these problems as it diminishes the amount of arable land we have at our disposal and also cuts down on the amount of water that's available. [01:43:43] These are real problems. [01:43:45] And the people Alex hates so much are the ones who are trying to address the problem in a way that hurts the least people. [01:43:51] Promotion of access to birth control and family planning is a giant way that this can be challenged, but Alex acts like that's just an attempt to depopulate. [01:43:58] Working to provide better access to general health care and raising the standard of living in the developing world is another way to reduce the amount of children people have because as you become more stable, you have less kids. [01:44:09] You have a whole ton of kids when you expect half of them to die before the age of five and you need them to provide manual labor. [01:44:16] And Alex is, of course, opposed to having vaccines and health care aid given to the developing world. [01:44:25] These guys are absolute clowns. 
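The back-of-the-envelope arithmetic here is simple enough to check. A minimal sketch using the figures cited in the episode (2.5 acres per person per year as a median estimate, roughly 11 billion arable acres worldwide); the ~8 billion world population figure is an added assumption for comparison, not from the episode:

```python
# Rough carrying-capacity check using the episode's cited figures.
# These are coarse estimates, not authoritative agronomy data.
ACRES_PER_PERSON = 2.5   # median estimate: acres needed to feed one person per year
ARABLE_ACRES = 11e9      # approximate arable acres worldwide
WORLD_POPULATION = 8e9   # assumed ~2023 world population (added for comparison)

people_fed = ARABLE_ACRES / ACRES_PER_PERSON
print(f"People that arable land could feed: {people_fed / 1e9:.1f} billion")   # 4.4 billion
print(f"As a multiple of current population: {people_fed / WORLD_POPULATION:.2f}x")  # 0.55x
```

By this median estimate, the food-land budget doesn't even cover the current population at 10x, or at 1x for that matter (real agriculture gets by on less land per person), which is the point: the "10 times larger" claim collapses on the very first variable you bother to check.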
[01:44:26] Their entire premise is based around a failed understanding of a problem, which they just aggressively decide isn't a problem because they don't want to get it. [01:44:34] If you can drop a bowling ball and not hit somebody while flying over middle America, then the world isn't overpopulated. [01:44:40] These are the thoughts of a child, but they aren't really different than Alex's whole thing about Trump obviously winning the 2020 election because he had larger crowds at his rallies. [01:44:49] This is all about making an argument by pointing to an optical point that isn't meaningful in order to obscure the actual data that would refute their point. [01:44:57] And it's a, I don't know, it's a game for dullards. [01:45:01] Yeah. [01:45:02] Yeah. [01:45:02] Very stupid. [01:45:03] It is, it is just kind of heartbreaking. [01:45:07] Like to hear someone say that out loud and then pat themselves on the back. [01:45:12] What if you drop a bowling ball and you hit the land that's needed to feed the people in the world? [01:45:17] Does that count? [01:45:18] Can you just toss a person there? [01:45:20] It is like you, I mean, you just genuinely have to sit and look at them and be like, you're not serious people. [01:45:25] No, not at all. [01:45:26] You need to have all power taken from you instantly. [01:45:29] Yeah. [01:45:29] Because this is your threat. [01:45:31] I don't understand why, like, for all the bullshit I've heard from my stupid fucking government about what's dangerous, I don't understand how Elon Musk is not considered a national security threat. [01:45:43] I don't understand that at all. [01:45:44] Assassinate that motherfucker. [01:45:46] You've got the CIA. [01:45:47] That's what they're for. [01:45:49] I don't know if that's the answer. [01:45:50] That's obviously not the answer. [01:45:52] You can destabilize a fucking South American country. [01:45:55] You can murder Elon Musk. [01:45:58] It's not different. 
[01:46:00] I'm going to push back. [01:46:01] I'm going to hold back on this. [01:46:03] Fine. [01:46:04] Calls for assassination. [01:46:05] Fine. [01:46:06] Listen, the CIA's murdered so many people while I'm alive. [01:46:09] Might as well do it for one that I want. [01:46:11] That doesn't make it. [01:46:12] I pay their salary, and they have murdered so many people who are not on my list. [01:46:18] Look, here's the situation. [01:46:20] Yeah. [01:46:20] No, no, no. [01:46:21] It doesn't work that way. [01:46:22] No. [01:46:22] Just because the CIA has done bad things doesn't mean you get one. [01:46:27] Okay. [01:46:27] It doesn't mean you get them to do another bad thing. [01:46:29] And the FBI murdered Fred Hampton, so they get to kill Elon Musk. [01:46:34] Same problem. [01:46:35] Okay, fine. [01:46:36] Another bad thing doesn't make up for a past bad thing. [01:46:39] I'm not asking it to make up for a past bad thing. [01:46:41] I'm just saying if they're going to continue doing bad things, then there's one on the way that they could get. [01:46:49] Take a little, hey, man, look at that off the side of the road. [01:46:52] I'm going to have to stop you. [01:46:53] I understand the fun you're having and all this, but I'm going to have to pump the brakes on this murder talk for the time being because we have more pressing things to deal with. [01:47:03] Sorry, it was government-mandated, so I wasn't being personal about it. [01:47:07] So one of the Krassenstein brothers tries again to stand up for globalists. [01:47:12] Sure. [01:47:14] And then something incredibly shocking happens. [01:47:16] You know, it's not black and white. [01:47:18] You're not either a globalist or you're not. [01:47:19] I think people fall in between, and they have, there's different reasons for why people might feel one way about one, you know, you could say globalist idea and another. 
=== Regulatory Concerns (08:38) === [01:47:29] So, I mean, I don't like painting people like, you know, with a black and white pen because I feel that everybody falls somewhere in the middle. [01:47:39] All I want to know is this. [01:47:41] There are people that want a corporate world government whose aim is depopulation and not giving the general public access to technology by lying about resources and literally saying carbon dioxide that plants breathe is evil and then telling us the world's going to end in 2030 and the ice caps are all going to melt and none of that's true. [01:48:00] So our children basically give up on the future and decide not to have children. [01:48:04] That's all I'm saying. [01:48:05] And Elon Musk is promoting an optimistic pro-human future that the science and evidence shows is real and that we need. [01:48:12] Gentlemen, I have to go. [01:48:14] I just want to be sort of exactly clear about my position. [01:48:18] I'm super pro-human, and I need all humans. [01:48:21] You know, humans in America, humans, and somebody's got your thing on 15 and every growth. [01:48:30] Yeah, that's Vivek. [01:48:31] Vivek, that's your phone, Vivek. [01:48:33] I'm not able to mute you. [01:48:35] Vivek. [01:48:37] Go ahead, Elon. [01:48:39] Sorry about that. [01:48:41] Yeah, Vivek just peed on a live stream. [01:48:44] That is, you know what? [01:48:46] I will say this. [01:48:47] That's a healthy stream. [01:48:49] Good on you, Vivek. [01:48:50] I'm not sure. [01:48:50] Good on you. [01:48:51] I'm not positive I even agree with that. [01:48:56] In my day, someone who is running for president peeing on a live stream, accidental or not, that might have been a big deal. [01:49:06] That might have been enough to be like, all right, come on. [01:49:08] Wow. [01:49:09] Yep. [01:49:09] Wow. [01:49:10] Wow. [01:49:11] Deeply unserious people. [01:49:13] Very. [01:49:14] Wow. [01:49:14] Very. 
[01:49:16] There is like an image of like, yeah, this is exactly what this is. [01:49:20] Like Alex rambling climate change denialism while Vivek pees. [01:49:24] Yeah, I don't, that is, that is one of those kind of found metaphors that can never be equaled or surpassed. [01:49:32] No. [01:49:32] It's just it doesn't get better than a presidential candidate accidentally leaving his mic on while he's peeing as these fuckers listen to Alex Jones talk about nonsense. [01:49:47] Yep. [01:49:48] That is, I mean, what do you say? [01:49:51] The zeitgeist. [01:49:52] What do you say? [01:49:53] So, earlier, Musk was very clear. [01:49:57] This is not going to turn into an interview. [01:49:58] This isn't for you to interview me. [01:50:00] Sure. [01:50:01] And, of course, it does turn into that. [01:50:02] Naturally. [01:50:03] And Alex just decides he's going to interview Musk. [01:50:05] Great. [01:50:06] Elon, and Elon, way more interesting. [01:50:09] And I just asked one question of Elon. [01:50:12] Elon, it's great having you here in Texas. [01:50:15] You're kicking ass. [01:50:16] You're Texas through and through. [01:50:17] Your whole spirit. [01:50:18] We love you. [01:50:20] Whatever happens with Trump down the road, should we change the Constitution so you can run for president? [01:50:26] What did you ever think about that? [01:50:29] I would like to stay a technologist, and build rockets and electric cars and things, technologies that hopefully have a good effect on the world and advance civilization. That would be my preference. [01:50:47] I would not like to be president. [01:50:51] So that would, that would, I would just like to further civilization. [01:50:59] That's a smart. [01:51:02] I think that's a smart answer. [01:51:03] Smart answer. [01:51:04] Why do you ask that? [01:51:06] Hey, Elon, should we change the Constitution so you can be president? [01:51:09] I mean, it is like, hey, Elon, do you need a ball massager right now? 
[01:51:14] Oh, yeah. [01:51:14] I'm available. [01:51:16] And while I'm doing that, can you give us a scoop? [01:51:19] Can you give us a scoop, Elon? [01:51:21] Just keep my rubbing your eyes. [01:51:22] I know you talk about incredible jets you want to invent, so many other things. [01:51:25] Is there any other big invention you've got on the drawing board in the back of your mind that you haven't announced to the world that you want to tell people about today? [01:51:34] No. [01:51:35] This would not be the forum for announcing any new products or technologies. [01:51:40] I'm not going to say that on this dumb space. [01:51:42] I would use my PR apparatus for that. [01:51:45] I mean, yeah, but it is really cute. [01:51:50] Alex, like, oh, you have any new inventions, Elon? [01:51:53] Oh, my God. [01:51:57] But I mean, I guess that is the image that they still project onto him, is that like Iron Man image. [01:52:02] Yeah, yeah, yeah. [01:52:03] That's what that's what. [01:52:04] You've been in the lab. [01:52:05] You have anything crazy to reveal? [01:52:06] Yeah, yeah, yeah. [01:52:07] That's what keeps them going. [01:52:08] The idea that he's actually geniusing as opposed to being a fucking idiot. [01:52:13] Yeah, and retweeting bigger. [01:52:15] I mean, he's concerning. [01:52:17] He's talking to you. [01:52:19] Iron Man does not talk to you. [01:52:21] No, Iron Man's not going to do a podcast with J. Jonah Jameson. [01:52:25] Not going to happen. [01:52:26] So Elon does have a technology that he does want to talk about, though. [01:52:29] Sure. [01:52:29] And that is his brain chip. [01:52:31] Oh, right. [01:52:32] He talks about this a little bit, the Neuralink. [01:52:34] Yeah. [01:52:35] And Alex really shouldn't be on board with this. [01:52:37] No, he should really love a private, ultra-powerful, unaccountable billionaire putting a computer chip inside your skull. [01:52:46] But he is actually in favor of it. [01:52:48] He's thrilled. [01:52:48] Thrilled. 
[01:52:49] We do have the Neuralink chip, which I know some people might be concerned about, but that's really something that will take a very long time to be in any kind of widespread use. [01:53:00] We've got the first use. [01:53:02] The first patient will get a Neuralink chip. [01:53:05] This is a quadriplegic, and it will enable them to control their computer and their phone. [01:53:12] My uncle was in a motorcycle accident and he was having seizures about to die, couldn't even walk. [01:53:20] He got, it's not one of your brain chips, but a brain chip. [01:53:23] And actually, he can walk and talk and is happy now. [01:53:26] So that's not bad. [01:53:28] Exactly. [01:53:29] So, I mean, the regulatory stuff on this is very intense. [01:53:35] But the first one will have, you know, you could think of it sort of like a telepathy. [01:53:40] You can sort of control your computer and phone just by thinking. [01:53:44] So it's kind of like telepathy. [01:53:47] And then kind of like that would be my car is trying to work. [01:53:51] I'm not trying to give a name for it, but you can think of it like Blindsight. [01:53:53] It's kind of like even if somebody has lost both eyes or lost the optic nerve is completely blind. [01:53:59] It's kind of like how all of that other stuff doesn't work either. [01:54:02] Give them some amount of eyesight. [01:54:03] Like Twitter? [01:54:04] Full of shit. [01:54:05] I think high-resolution sight, kind of like Geordi La Forge from Star Trek, you could actually see in multiple wavelengths. [01:54:12] You could see ultraviolet and infrared. [01:54:14] We're going to bring Geordi La Forge into the world. [01:54:17] You would have to be a stupid billionaire or somebody who thinks Alex is not an idiot to listen to that and be like, oh, he's not lying. [01:54:27] You would have to have no context of anything he's ever done in order to be really wowed by this is going to work really well. [01:54:35] And I want to say another thing, too. 
[01:54:36] Like, I'm not sure I'm perfectly in sync with the general population as a whole. [01:54:41] Sure. [01:54:42] But I can't imagine how little I want a brain chip to telepathically communicate with my computer. [01:54:48] Sure. [01:54:49] That is a problem I don't need solved. [01:54:52] Right. [01:54:52] A mouse works. [01:54:54] Not just that. [01:54:55] And I mean, yeah, 100% totally agree with you on that front. [01:55:00] But the idea of that being something that will work for any length of time to justify allowing that man anywhere near my head is absurd. [01:55:12] Can it filter intrusive thoughts? [01:55:15] Because I could imagine myself buying a lot of stuff on Amazon telepathically, accidentally. [01:55:21] I mean, there's just, there's no way for that not to be a dumb idea in any kind of practice. [01:55:27] It's such a fun idea in a science fiction story. [01:55:30] And if you think that science fiction, because you're a billionaire, is anything you want it to be, then sure, you can lie to people like that and say bullshit like that. [01:55:40] But in reality, that's never going to happen. [01:55:43] I would say the odds are low. [01:55:44] Yeah. [01:55:45] So Alex has taken over this whole ship basically and turned it into an interview with Musk. === Quick Questions on Ukraine Support (05:20) === [01:55:51] And he's bringing his buddies in. [01:55:53] Great. [01:55:54] It's probably a trademarked name, but I think Blindsight is a cool name for it. [01:55:59] Yeah. [01:55:59] Let me get one question from Mark Dice because this is the best interview ever. [01:56:02] Mark Dice is a great journalist. [01:56:04] He stood up for me for the last five years when nobody else would. [01:56:06] Mark, quick question for Mark, please. [01:56:10] A comment, really, just to force all the journalists. [01:56:14] And thank you, by the way, Elon, for unbanning Alex for the platform. [01:56:19] So much, man. 
[01:56:20] Hey, Elon, you're here, and bring Mark Dice in here to ask you a question. [01:56:26] Chase Geyser is going to come in and ask you a question. [01:56:28] Chase Geyser doesn't come in and ask you a question. [01:56:30] I was joking. [01:56:31] It would have been great. [01:56:32] However, this actually is not somebody that Alex is just bringing into the proceedings. [01:56:36] He's been around since the beginning. [01:56:38] But Jackson Hinkle decides to make an appearance and ask a pro-Russia question, of course. [01:56:45] Elon, I've got a quick, quick, quick, quick question. [01:56:48] You tweeted about the imprisonment of the American-Chilean Gonzalo Lira in Kharkiv, Ukraine yesterday. [01:56:57] I'm curious if that imprisonment of an American for speaking his mind on YouTube and X has caused you to reconsider further support for Ukraine, albeit through Starlink or other means, and also, unrelated to that, can you provide any updates about Starlink for Gaza? [01:57:19] Yeah, I mean, I generally think, look, look, I understand that if somebody, if an American citizen is in another country and violates that country's laws, even if their actions would not violate the laws in the United States, that that person would then be put in prison. [01:57:34] But in the case of Ukraine, the United States is providing a vast amount of aid to Ukraine. [01:57:41] And the United States government has an obligation to protect its citizens. [01:57:45] And so I think even if one disagrees with what that, I guess, YouTuber or journalist, depending on your perspective, what they posted, I feel uncomfortable sending massive aid to Ukraine if they're putting American citizens in jail for doing videos on YouTube. [01:58:08] Sure. [01:58:09] That's not cool. [01:58:10] Yeah. [01:58:10] And it's like, it could say like, okay, well. [01:58:13] Yeah, but Ukraine in their country. [01:58:16] It's like, yeah, but they don't have a right to our money as well. 
[01:58:20] So it's like they don't have a right to our money. [01:58:26] Frank up your country. [01:58:27] Well, that's right. [01:58:28] Look, look, I don't have a dog in the fight, Russia, Ukraine. [01:58:32] It's an old ancient fight between the two countries that's been going on forever. [01:58:36] It's a Slavic civil war. [01:58:38] Son of man. [01:58:39] Ukraine is arresting the Orthodox Church. [01:58:42] Ukraine, even the mayor of Kiev, has said Zelensky's becoming a dictator. [01:58:47] So all I'm saying is this black hole we're feeding hundreds of billions into, we should at least be able to debate it. [01:58:54] And if an American journalist is critical, he doesn't deserve to be put in a gulag. [01:58:59] That's very dangerous. [01:59:00] I agree, Elon. [01:59:01] So this isn't a matter of Ukraine arresting a journalist who's just critical of the government. [01:59:05] This is a person actively supporting Putin's war effort against Ukraine inside Ukraine. [01:59:10] Ooh, that's tough. [01:59:11] In Ukraine, as is the case in many places, you're not allowed to engage in wartime propaganda for the country that's invading you. [01:59:17] That's usually a good idea. [01:59:19] Yeah. [01:59:19] If you choose to do so, you're going to be considered part of the opposing side, and you can't really be too surprised when you get arrested. [01:59:26] It's even more glaring when you consider that this guy is, he's also accused of spreading information about Ukrainian military movements on social media, which could constitute providing material aid to the enemy. [01:59:36] Yeah, that'd get you. [01:59:37] So that's beyond journalism or any of that stuff. [01:59:41] I get that makes sense. [01:59:43] If you're a Russian soldier and you go into Kyiv and you're like, hey, listen, I'm a Russian soldier. [01:59:48] I'm totally going to kill you guys, but I don't have a gun right now. [01:59:50] I'm not doing anything. [01:59:51] I'm just coming for a vacation. 
[01:59:52] Just hanging out. [01:59:53] No big deal. [01:59:54] I'll go back to killing you when I go back to Russia. [01:59:57] Or I'm just, I don't have a gun or anything. [01:59:59] I'm just doing some sort of like information reconnaissance type stuff. [02:00:03] I think it would be reasonable if I go, excuse me, sir, I'm going to have to detain you until the end of the war. [02:00:08] Sure. [02:00:08] Now, Elon introduces an interesting thought, which is that if we give a country money, our citizens should be immune from being arrested by that country. [02:00:15] That is an interesting thought. [02:00:16] This is obviously stupid, but I've really been trying to come up with a better way to understand his point, and this is about as generous as I can be. [02:00:23] No, that's literally what he's saying. [02:00:24] Listen, we gave you a lot of money, so why can't this American citizen kick babies in the face? [02:00:30] Or subvert your war effort against the country that's invading you. [02:00:34] Sure. [02:00:35] Very weird. [02:00:35] Yeah. [02:00:36] So we get another weirdo in the mix. [02:00:38] Sure. [02:00:38] Matt Gaetz. [02:00:39] Great. [02:00:40] Representative Matt Gaetz. [02:00:41] Oh, my God. [02:00:41] Your thoughts on Alex Jones being back on the platform? [02:00:44] I think it's great. [02:00:44] Alex has been someone who's provoked a lot of critical thinking from policymakers and broad audiences. [02:00:52] Of course, there are things that I'm going to say that would offend people, things that Alex would say that would offend people, but I think they'll just have to be offended. === Difficult Conversations (14:50) === [02:01:00] I think it enriches the discussion to have Alex back. [02:01:03] Yeah, it does. [02:01:04] It doesn't. [02:01:05] Actually, it doesn't at all. [02:01:08] Nope. [02:01:09] Nope. [02:01:10] Also, Matt Gaetz may be back under investigation for sex crimes by the House Ethics Committee. 
[02:01:16] So that story is still unfolding. [02:01:19] Man, man. [02:01:22] It is very difficult to listen to this. [02:01:25] Yeah, it is. [02:01:26] It is. [02:01:26] It is. [02:01:28] It was dragging. [02:01:29] It is. [02:01:30] It is very difficult to have. [02:01:33] I mean, you called for the government to assassinate Musk. [02:01:36] That's how tough this is for you to listen to. [02:01:38] I mean, I would have done that before we started this. [02:01:41] I don't like hearing all these people say nice things to Alex where he can hear. [02:01:45] It is such an interesting collection of awful people that are all coming around to, like, this is a, this is your life. [02:01:51] It is, no, it is 100%. [02:01:54] It is hell. [02:01:55] It is hellish. Yeah, it is. [02:01:57] If you are fucking a duke in fucking hell, this is a great day for you. [02:02:03] I don't think I have any respect for anybody who is on this panel. [02:02:09] I can't imagine not throwing hands upon first sight. [02:02:13] It's so bizarre, and by that I mean it's not bizarre at all, that there couldn't be anybody who isn't a shithead who they got on this. [02:02:20] Not one to talk to, not one. Remarkable. [02:02:23] No, I throw hands all day. [02:02:25] Elon has an interesting, uh, it's not really interesting, it just rhymes. Sure. He has a philosophy of curiosity. [02:02:32] Oh, my gosh, I would say. [02:02:34] I would say I have a philosophy of curiosity, which is, you know, trying to understand more about the nature of the universe and our place in it, and that's why I studied physics. Not for career reasons, just to try to understand how the universe actually works and what has good predictive value. [02:02:52] And physics has got very good predictive value, so that's why I studied it. [02:02:59] Um, then give us, give us your, [02:03:02] give us your, your predictive value. 
[02:03:04] Gut level, Elon Musk, the, I mean, this is like beyond any Hollywood movie. [02:03:09] Where we're at right now, does humanity survive? [02:03:13] Gut level, you've got all these great children. [02:03:15] That means you bet on humanity. [02:03:16] Do we make it to the next level? [02:03:18] And what is the next level? [02:03:19] Got all those shitty children, though. [02:03:20] So 50-50, good on you there, like. [02:03:31] I gotta say, listening to this also had a hilarious thing, because they keep talking about how great the Spaces platform is and then the owner of the site keeps cutting out. [02:03:40] Yeah, like, and other people are cutting out too. [02:03:43] It is, it's not really as remarkable as they, [02:03:46] they seem to be, uh, saying. Yeah, so yeah, that's always, [02:03:51] that's always one of those weird ones where it's like, of all the things that people do, where it's like the CEO is given way too much attention or anything like that. [02:04:00] For me, it's just like, if you're going out and demonstrating the thing, you'd better be the best at that thing. [02:04:08] If you're gonna be out there doing it right, you should, you should. [02:04:10] Instead, his call drops repeatedly. [02:04:12] I mean, that's, I would, [02:04:14] I would be like, I'm really mad at the crew right now. [02:04:17] We're gonna be done. [02:04:18] Yeah, he has no real, uh, prediction for the future or anything. [02:04:21] He just says you have to do things to make the future. Wow. [02:04:24] So that's deep. It is. [02:04:25] So you get another person who jumps in, uh, and it's the host of a podcast called the All-In Podcast. [02:04:31] Uh-huh, don't know what that is. [02:04:32] No idea, I'm all out, but he has, [02:04:35] he has a question for Alex. [02:04:36] Okay, uh, that is getting back to Sandy Hook stuff. [02:04:39] Great, and nobody likes that. [02:04:41] He's asking, yeah, of course not. [02:04:43] Uh, Jason, a quick question to you first. 
[02:04:46] Would you have Alex Jones on the All-In Podcast? [02:04:49] Sorry to put you on the spot. [02:04:51] Well, um, I have him here now, so would it be okay if I asked him three questions? [02:04:56] Yeah, but Elon's far more fascinating. [02:04:58] Please ask Elon questions. [02:05:01] Alex, my first question for you is, I'm curious if you'd be willing to answer three questions about the Sandy Hook parents. [02:05:09] Oh, great. Yeah, sure, go ahead. [02:05:12] Well, I mean, you now have your freedom of speech and you're here, so I think a lot of people are wondering what theory or evidence led you to believe that that was a fake, uh, staged, uh, situation. [02:05:28] He's already answered these questions. [02:05:30] Yeah, every time we hear it on stage, [02:05:32] it's the first time. Let's ask him. [02:05:33] He already answered this like 40 minutes ago, idiot. [02:05:38] We don't actually care. [02:05:40] Everything I say is misrepresented. [02:05:42] You'll say I'm saying it again. [02:05:43] I believe it happened. [02:05:44] And I'm sorry. [02:05:45] And I apologize. [02:05:46] And I'm done. [02:05:48] Okay. [02:05:49] Okay. [02:05:49] So Alex can't get specific without sounding like a complete idiot. [02:05:52] So his go-to response is always to pretend like he's already discussed the incorrect things he's believed in the past. [02:05:57] And if he does it, if he does it again, people will just accuse him of still believing those things. [02:06:01] It's a coward's move, but it's super effective as a dodge that people who aren't really curious are just going to accept. [02:06:07] You notice here how everybody got super defensive on Alex's behalf. [02:06:10] Even like, I think it was Jack Posobiec jumping in before Alex even answers. [02:06:14] And that's kind of because they know that any further examination of Alex's actual record will reveal that what he's been saying to Musk this whole time is a lie. 
[02:06:22] Alex's story for Musk is a constructed reality that all these right-wing scam folk have just agreed to accept as real, but deep down, they know it's not true. [02:06:30] The guy coming in and asking this question is kind of threatening to ruin their fun and poke holes in their illusion. [02:06:36] I don't know what the All-In Podcast is or who this guy is, but you can tell that even before he asks the question, the panel is not excited for him to ask questions. [02:06:43] No, I mean, it is very much them being like, buddy, we got that out of the way. [02:06:49] Right. [02:06:49] We never have to talk about it. [02:06:50] We did the satisfactory lie in order to cover our ass. [02:06:53] Stop it. [02:06:53] Yep. [02:06:54] Yep. [02:06:54] It's done. [02:06:55] And Musk even kind of feels like he has a similar perspective. [02:06:58] Yeah, Jason, I should say that the Sandy Hook issue was the first thing I raised with Alex. [02:07:05] He did answer it at length at the beginning of this Spaces conversation. [02:07:10] So I think you may not have heard the start of it. [02:07:15] I was alerted to this towards the end. [02:07:18] I think candidly. [02:07:21] I'm not commenting on whether I agree or disagree with his answer, but it was the very first thing that I asked when I got on this Spaces conversation, just as it was the first thing that you asked. [02:07:33] I think it's the first, you know, for people that care about whether there's sort of empathy and whether somebody has been cruel or mean or something, that's like the first thing they're going to ask about. [02:07:48] And so that was the first thing I asked about. [02:07:50] And it was answered. [02:07:54] Somebody can agree with that answer or not. [02:07:58] I did ask it. [02:07:59] No matter how many times I answer it, it's never good enough because I'm that guy. [02:08:03] I'm not what they said. [02:08:04] I covered the internet questioning it. 
[02:08:07] I've already said I'm sorry over 500 times, three or four times today. [02:08:11] But it's always the same question. [02:08:13] I'm not that guy. [02:08:14] I won't even say the name of it. [02:08:16] Yeah, because you're not answering the question, then you're dodging your responsibility and you're not actually apologizing. [02:08:21] That's why the question keeps coming up. [02:08:23] Yep. [02:08:23] There's an easy way to deal with this, and that's just sincerity and actually wrestling with the issue. [02:08:30] But it's too dangerous. [02:08:31] It's too vulnerable for his position as the God prophet or whatever the fuck they're trying to brand him as now. [02:08:39] And Elon Musk, I mean, like, this whole bullshit of he answered the question, and I don't know, you know, you can agree with him or disagree with him. [02:08:46] If you disagree with him, continuing this call is a monstrous act. [02:08:51] So we know that you agree with him. [02:08:52] Yeah. [02:08:53] You accepted the version of the story that he told you. [02:08:55] Otherwise, you wouldn't be having continuing this call. [02:08:58] So cut it out with me. [02:09:00] He would be off Twitter by now. [02:09:02] Maybe. [02:09:02] Yeah. [02:09:03] Is it against the law to lie to a CEO or a man who owns a business? [02:09:08] I fucking hate these people so much. [02:09:09] So Elon Musk is suing Media Matters. [02:09:12] And so this is discussed a little bit. [02:09:14] And then he leaves the call. [02:09:16] Okay. [02:09:17] I was perfect, Tommy. [02:09:18] Elon, maybe you can also give us an update if there is any on Media Matters and why you decided to sue them. [02:09:27] Yeah, Media Matters is an evil propaganda machine. [02:09:32] So I just generally am against evil propaganda machines. [02:09:37] So, we are suing them in every country that they operate. [02:09:40] And we will pursue not just the organization, but anyone funding that organization. [02:09:44] I want to be clear about that. 
[02:09:46] Anyone funding them? [02:09:51] I'm so strong. [02:09:52] I've got big muscles. [02:09:54] Media Matters is an evil propaganda machine. [02:09:56] Unlike InfoWars. [02:09:58] I mean, get the fuck out of here. [02:10:00] I hope they do. [02:10:02] So, based. [02:10:04] Yeah. [02:10:06] Benny? [02:10:07] So, actually, I need to, I hope, you know, just step off the call at this point because I just have some family obligations. [02:10:17] But I think it's been certainly a very interesting conversation. [02:10:21] I suspect this will go viral. [02:10:24] Probably snippets of it will go viral in a way that don't entirely represent the situation. [02:10:31] Vivek wasn't peeing. [02:10:33] Hopefully anyone who reads about it or hears about it actually just takes the time to listen to the entire Spaces conversation. [02:10:42] I did. [02:10:42] And it's actually probably less interesting than the version we've presented. [02:10:47] Yeah. [02:10:48] Like, because there's long, it's long and boring. [02:10:53] Yeah. [02:10:54] But yeah, I imagine there are things that are going to go viral. [02:10:58] And I love that Elon is already getting into this game that all these people play, which is the preemptive prediction of their own, like, we're going to be lied about. [02:11:07] Yeah. [02:11:07] Which is basically just doing preemptive damage control when people discuss the fucked up things that you said. [02:11:13] Yeah. [02:11:13] You'd be like, oh, yeah, of course they were going to take me out of context when they commented on the thing I said. [02:11:18] Yeah, I mean, it is an exercise in building an impenetrable wall. [02:11:28] It is that exercise of not only will I never engage with what I'm saying. [02:11:37] I mean, not only won't I engage with the criticism of what I'm saying, I won't even engage with what I'm saying. [02:11:42] You know, like, I'm not even going to take responsibility for it two seconds after this call is over. 
[02:11:46] I'm going to say that I said whatever it is that I felt like I wanted to say now. [02:11:52] Right. [02:11:52] And then in an hour from now, I'm going to say whatever I felt like I wanted to say an hour from now. [02:11:58] What I said does not mean anything. [02:12:00] Particularly if it's criticizable and indefensible. [02:12:03] And then I never said any of that stuff. [02:12:05] I never said any of that. [02:12:06] And I've apologized 100 times. [02:12:08] And no matter what, I will not change my behavior one bit. [02:12:13] Why would I? [02:12:14] This is profitable. [02:12:15] Any conversation with me is useless and a waste of everyone's time. [02:12:20] Quite. [02:12:21] Yep. [02:12:21] Quite. [02:12:22] So yeah, Elon leaves, and then the call basically just. [02:12:26] And then everybody's like, oh, okay. [02:12:27] Why would anyone stick around? [02:12:29] The first person to leave the party usually ends a party. [02:12:31] Well, but also the first person to leave, you know, well, I mean, if it's someone that no one likes, then the party gets better. [02:12:37] But if one person leaves and then the party's over, that means that they were the reason, you know, the high-status person there. [02:12:45] Yeah, which is interesting because he wasn't even part of the conversation for the first chunk of it. [02:12:51] Right. [02:12:52] Alex is back on Twitter and we're getting shitheads like Posobiec and Loomer to talk about it and all this. [02:12:58] That was the name of the game. [02:13:00] And then Elon Musk shows up and it's all Musk, baby. [02:13:04] And Alex trying to interview him, all this shit, to the point where he took power over the entire thing. [02:13:13] Well, I mean, it makes sense. [02:13:15] It's hard not to, it's hard not to do that. [02:13:18] He owns the website. [02:13:19] Well, not just that, but I mean, like, for these people, they're on the grift side of things. [02:13:27] And that means that they're always looking up, you know? 
[02:13:30] Right. [02:13:30] Elon is somebody who has agency, which is something that, like, what, you know, a thousand people on this planet truly have, where it is, he can just be like, I don't like this thing. [02:13:41] I will make it end. [02:13:43] You know, that is a thing he can do that most of them, all of us can't do. [02:13:47] Yeah, and I think that element is there with all of these people that realize that their existence is at the whim of Musk. [02:13:56] Exactly. [02:13:57] Because for all the talk about free speech and all of this, like, we'll follow the laws, they kind of know that if they run afoul of him, we'll do whatever Elon says. [02:14:07] And then we'll apologize like a little baby if he says, if he's mad at us, we'll go, I'm so sorry, I'll never do it again. [02:14:12] I'm so sorry. [02:14:14] Yeah. [02:14:14] Yeah, like Alex. [02:14:15] Yeah. [02:14:16] It's pathetic. [02:14:17] It is. [02:14:17] It is. [02:14:18] It's a grim look. [02:14:19] And I mean, I can't stress enough what a roster this is. [02:14:25] I would like the people who call themselves the Krassenstein brothers to look in the mirror and just figure out what the fuck just happened. [02:14:35] Because if you wind up there, I don't even, I don't. [02:14:38] Shit went bad. [02:14:39] What I know about you is bad. [02:14:41] Yeah. [02:14:41] But if you wound up there doing that, it's way worse. [02:14:45] If you're there as the sort of token, presumable person who's on the left is a part of the right-wing scammy media ecosystem. [02:14:52] Yeah. [02:14:52] You have to know, first of all, things went wrong. [02:14:56] Yeah. [02:14:57] Second. [02:14:58] Yeah, Carville. [02:14:58] Come on now, Cajun. [02:15:00] How you doing? [02:15:00] Second, they recognize that you're part of the same grift. [02:15:04] Yeah. [02:15:05] The only reason that you're allowed there is that they recognize that whatever you're bringing to the table is not a threat and is kind of a joke to them. 
[02:15:13] We can argue about politics because the true rule number one is don't fuck with the grift. [02:15:19] Yes. [02:15:19] And that is what you will follow. [02:15:21] Yeah, so congratulations on that. [02:15:22] Yeah, good work, Krassensteins. [02:15:24] Anyway, this sucked. [02:15:26] I hated doing it. [02:15:27] And I imagine there might be more Twitter Spaces in the future that Alex does. === Twitter Adventure Continues (00:49) === [02:15:31] And I don't need him to do more shit where he just says the same thing. [02:15:35] Yeah, yeah. [02:15:36] But we'll find out what happens in the future as his Twitter adventure continues. [02:15:40] But until then, we have a website. [02:15:42] Indeed, we do. [02:15:42] It's KnowledgeFight.com. [02:15:44] Yep. [02:15:44] We're also on Twitter. [02:15:44] We are. [02:15:45] Well, maybe not for long. [02:15:47] Well, yeah, not for long. [02:15:48] But we're there for now. [02:15:49] Yeah. [02:15:50] At knowledge underscore fight. [02:15:52] Yep. [02:15:52] Sigh. [02:15:53] Blue sky. [02:15:54] Maybe make a move over there. [02:15:56] Yeah, probably time. [02:15:57] But hey, we'll be back. [02:15:58] But until then, I'm Neo. [02:15:59] I'm Leo. [02:16:00] I'm DZX Clark. [02:16:01] Dan does not support the comments made by Jordan on this episode about governmental assassinations. [02:16:07] Woo-yo! [02:16:08] Woo-yo! [02:16:09] And now here comes the sex robots. [02:16:12] Andy in Kansas, you're on the air. [02:16:13] Thanks for holding. [02:16:15] Hello, Alex. [02:16:16] I'm a first-time caller. [02:16:17] I'm a huge fan. [02:16:18] I love your work.