The Glenn Beck Program - Best of the Program | Guest: Brendan Carr | 12/1/22 Aired: 2022-12-01 Duration: 45:11 === Tough Questions for the Billionaire (15:17) === [00:00:00] All right. [00:00:00] Podcast is fantastic today, especially I think like the last hour and a half of the show today, last half of the podcast, is something very, very different. [00:00:12] We talk about, well, we start in hour two with the FCC commissioner who is responsible for a lot of great things. [00:00:20] And we kind of got into a philosophical discussion about TikTok, Twitter, what's going on there, and also freedom of speech. [00:00:29] From there, we went to Elon Musk's Neuralink, to a simulated wormhole for the very first time with quantum computing, and the future that is coming right around the corner. [00:00:43] All that and so much more on today's podcast. [00:00:46] Don't forget, Goldline is out there and they want to talk to you about protecting your retirement account. [00:00:52] If you have a retirement account or you just have even money in the bank, make sure that you spread the risk out. [00:01:00] The Fed came out yesterday and they're still saying, nobody saw this inflation coming. [00:01:05] You should all be fired if you didn't see this coming. [00:01:08] Anyway, gold or silver is a great way for you to be able to protect your money, what you have. [00:01:17] Somebody's got to have some money. [00:01:20] Somebody's got to have things to rebuild our country as we go through this mass transition. [00:01:27] Please call Goldline now. [00:01:28] Tell them that I sent you from the podcast and give Goldline the code MYB, which represents Mind Your Business, the silver bar that they'll give you for free, just as a thank you for calling in. [00:01:39] Request the information. [00:01:40] You can go to their website, goldline.com, or call them. [00:01:44] Tell them I sent you: 866-GOLDLINE.
[00:01:46] 866-GOLDLINE or goldline.com. You're listening to the best of the Glenn Beck Program. So, Stu, yesterday I kind of had a bad day. [00:02:07] I had a bad day. [00:02:08] Yeah, I would say that. [00:02:10] You're doing okay? [00:02:11] Yeah, I'm doing fine. [00:02:12] But I would say that was a bad day. [00:02:14] Sure. [00:02:15] Yes, I would agree. [00:02:15] But that bad day didn't include me losing $10 billion for other people. [00:02:23] No, you didn't lose any billions of dollars. [00:02:24] No, I didn't pull that off. [00:02:25] No, no, because I don't know if I would describe that as having a bad month. [00:02:29] You know, I had a bad month, lost $30 billion, $10 billion in other people's money. [00:02:35] Right. [00:02:35] That's a really bad thing. [00:02:37] That's a bad month. [00:02:38] I think it's above a bad month. [00:02:41] But that is the way that Bankman-Fried started the interview yesterday with the New York Times. [00:02:48] He apologized and said, I've just, I mean, here, let me actually play it for you. [00:02:58] Let's play cut 11, please. [00:03:00] I mean, look, I've had a bad month. [00:03:03] This has not been a fun year for me. [00:03:05] That's a real line. [00:03:06] That's not what matters here. [00:03:07] Like, what matters here is the millions of customers. [00:03:11] What matters here is all the- Stop. [00:03:14] I want to come back to that part of it at the end. [00:03:17] Put a pin in that. [00:03:19] But let's start with what he said. [00:03:23] I just didn't know. [00:03:24] Cut nine, please. [00:03:26] Was there commingling of funds? [00:03:28] That's what it appears like. [00:03:29] It appears like there's a genuine commingling of the funds that are of FTX customers that were not supposed to be commingled with your separate firm. [00:03:40] I didn't knowingly commingle funds. [00:03:43] And again, one piece of this, you have the margin trading. [00:03:46] You have customers borrowing from each other.
[00:03:48] Alameda is one of those. [00:03:49] I was frankly surprised by how big Alameda's position was, which points to another failure of oversight on my part and failure to appoint someone to be chiefly in charge of that. [00:04:04] But I wasn't trying to commingle funds. [00:04:08] Oh. [00:04:09] Okay. [00:04:10] Well, there's all kinds of evidence that Alameda, which was the hedge fund, and FTX shared an account with their banking partner. [00:04:21] So, I mean, you're sharing an account at Silvergate. [00:04:26] So not sure how you square that circle or, you know, you weren't aware. [00:04:30] But what he's saying here basically is I wasn't aware of it. [00:04:34] That's his girlfriend that's running it. [00:04:37] So in other words, hey, I didn't look at her. [00:04:41] His girlfriend is running it. [00:04:43] He appointed her and he's still the owner of Alameda. [00:04:47] He still owns the, you know, that's kind of a big part of it. [00:04:50] And his incompetence slash fraudulent activities are what we're talking about. [00:04:56] Yeah, right. [00:04:58] I wasn't running that. [00:04:59] And I didn't get involved because I was nervous about the conflict of interest if I were too involved in that. [00:05:06] You got the same bank account, dude. [00:05:08] You have the same bank account. [00:05:11] All right. [00:05:11] So let's go to cut 10. [00:05:14] I personally don't think I have criminal liability. [00:05:16] How concerned are you about criminal liability at this point? [00:05:20] So I don't think that, I mean, obviously, I don't personally think that I have, you know, criminal [00:05:27] liability. I, I think the real answer is, that's not... it sounds weird to say, but, but I think the real answer is that's not what I'm focusing on. Um... What are you? [00:05:39] There's going to be a time and a place for me to sort of think about myself and my own future, but I don't think this is it. [00:05:50] Oh. [00:05:52] So, dude, did you commit any crimes?
[00:05:55] Look, I don't think so, but it's not the time or place to think about me. [00:06:01] You can think about me later. [00:06:02] What I'm concerned about are all of the people who have lost their money. [00:06:07] What an amazing answer. [00:06:11] And I guess that's the best answer you can give in this moment other than the correct answer, which is don't do the interview. [00:06:18] That's the first thing is stay away. [00:06:21] Correct. [00:06:22] So you don't do an interview like this. [00:06:27] Stu, how many interviews in the course of my career have I been asked to do and everyone clearly said, don't do it. [00:06:42] You do not want to do it. [00:06:45] This sounds familiar. [00:06:46] We've had that conversation a thousand times. [00:06:48] A thousand times. [00:06:50] Now. [00:06:50] And to be clear, you have not lost $10 billion for investors at any point. [00:06:54] Correct. [00:06:55] And never done anything wrong. [00:06:57] Knowingly. [00:06:59] So, however, people who watched me when I was on Bill O'Reilly used to say, [00:07:10] why do you continually go on his program? [00:07:14] He kills you. [00:07:18] And that never was ever discussed except for the first time I went on Bill O'Reilly, right? [00:07:25] Bill is going to, you know, he's a wild card. [00:07:27] You don't know what's going to happen. [00:07:29] Okay. [00:07:31] Why did I always not listen to that advice on Bill O'Reilly? [00:07:38] Well, I mean, I think you guys actually had a good relationship. [00:07:43] We had a good relationship. [00:07:43] And he would push you on things, but you knew it was coming from a good place. [00:07:48] Exactly. [00:07:50] He said to me at one point, look, if I think you're wrong, I'm going to tell you you're wrong. [00:07:54] If I think you're out of line, I'm going to tell you you're out of line. [00:07:58] But most of the stuff that you do, I don't think you're out of line.
[00:08:02] I may not agree with your conclusions, but I don't think it's out of line. [00:08:06] It's a good question. [00:08:08] So, Glenn, come on my show. [00:08:11] I will ask you the hard questions, but it won't be a setup. [00:08:15] I'm not trying to destroy you. [00:08:17] Right. [00:08:18] Uh-huh. [00:08:19] So why would Sam go on? [00:08:24] Go on with the New York Times against all advice because he knew he was walking into a friendly room. [00:08:34] They're going to ask you tough questions. [00:08:37] We're going to ask you tough questions. [00:08:41] But he knew it would not be a lynching. [00:08:45] He knew that there were friends at the New York Times and he could ask and then answer. [00:08:55] This is not the time to think about me. [00:08:58] It's not the time. [00:09:01] I mean, my question is, why aren't you in jail, dude? [00:09:05] What makes you different from Ken Lay? [00:09:08] What makes you different from Enron? [00:09:10] What makes you different than Bernie Madoff? [00:09:15] He then appeals to the people in the audience. [00:09:18] I care about the people who lost their money. [00:09:20] And I'm sure there's people in your audience that have lost money. [00:09:25] And they laugh. [00:09:27] And they laugh. [00:09:30] Wow, somebody who lost $10 billion of investors' money. [00:09:35] And he shows up and people are like, oh, that crazy kid. [00:09:41] This is a whitewashing. [00:09:45] This is money laundering, except it's reputation laundering. [00:09:50] They are just laundering him here. [00:09:53] That's interesting. [00:09:54] I mean, look, I think everybody wants this interview, right? [00:09:59] Like this is, it's not like people are resisting talking to this guy right now. [00:10:02] I think, you know, so he's selecting who he's going on with. [00:10:04] He's also going on with George Stephanopoulos, apparently. [00:10:07] Oh, George Stephanopoulos, which is interesting, kind of, I think, supports your thesis there. [00:10:12] Right.
[00:10:13] You know, I'm not surprised that the New York Times would take the interview or offer the interview. [00:10:19] I'm sure every mainstream financial journalist has offered this interview. [00:10:23] He's selecting where he's going, though, right? [00:10:26] And this is, you know, we were in that room, weren't we, Glenn? [00:10:29] Yes, yes, yes. [00:10:30] I can't remember what event we did there. [00:10:32] We did something there. [00:10:32] I remember doing it. [00:10:33] And it's an impressive, impressive room, right? [00:10:37] Like at the New York Times. [00:10:38] It's like an incredible place for one of these things. [00:10:41] Yeah. [00:10:42] We did, just to piss them off, we did an event for the Blaze when we first launched in that same room. [00:10:48] That's what it was. [00:10:50] It was a launch announcement of the Blaze. [00:10:53] That's right. [00:10:54] And it was pretty sweet. [00:10:56] It is. [00:10:56] Pretty sweet. [00:10:57] And we walked into the New York Times and everybody's like, good Lord, what are these people doing here? [00:11:01] They're just hoping that it was some arrest announcement. [00:11:04] Don't worry. [00:11:04] We're just doing an exorcism. [00:11:06] Don't worry about it. [00:11:08] But you're right. [00:11:10] I mean, it's like, I mean, I watched a good chunk of this interview. [00:11:13] It was over an hour. [00:11:14] And apparently he did almost two hours with George Stephanopoulos that's coming out partially today. [00:11:20] Another guy who's not going to really press him. [00:11:24] So how much money were you giving to the Democratic Party? [00:11:27] Yeah, because that's, you know, I don't, I didn't hear one question about that from the Times. [00:11:32] See, that's where they're washing all of this. [00:11:37] Yeah, there's this. [00:11:37] Just make sure that he's not tied to any of that.
[00:11:41] Let's not get into any of how much money was going to the people who are now going to keep him out of jail. [00:11:48] Let's make sure we don't ask those questions. [00:11:51] See, there's two ways this can go, I think. [00:11:54] One way is the way you're talking about where they will protect him. [00:11:57] And I think there's a real argument to be made that that's the way it goes. [00:12:01] I think it's maybe the most likely way it goes that he will be protected because of all the money that he was giving to Democrats. [00:12:08] There is that part, though, where this does cause problems for Democrats, right? [00:12:12] Like it does expose them. [00:12:15] You know, like you don't want pictures of you when you're running for reelection with Bernie Madoff with your arm around him. [00:12:22] And like these sorts of problems are going to be real for Democrats going forward. [00:12:26] You don't have Bernie Madoff here. [00:12:29] You don't have a mainstream press making him into Ken Lay or Bernie Madoff. [00:12:35] I don't know that he, this is really bad. [00:12:37] And you might be right, but it's going to be hard to whitewash this guy. [00:12:41] You remember what it was like with Bernie Madoff? [00:12:44] How they just hounded him all the time. [00:12:49] Have you seen him being hounded in his Bahamas home? [00:12:53] Have you seen that? [00:12:54] No. [00:12:54] Have you seen the gaggle of... [00:12:56] No, I haven't either. [00:12:57] No. [00:12:57] No, though, Bernie Madoff was walking down the streets of New York when the footage was taken. [00:13:02] This guy knows that they were also staked out in front of his house. [00:13:05] They never let these guys rest because they were on a mission to make sure they showed how evil these people were. [00:13:14] That's true. [00:13:14] They're not on that mission. [00:13:16] So the people who are average people are not hearing the Sam Bankman-Fried jokes. [00:13:26] Yeah, that's true.
[00:13:27] You know, it's interesting. [00:13:28] They don't seem to be on a mission. [00:13:29] They seem to, like, the tone of the coverage, and I've watched a lot of it. [00:13:33] They are on a mission. [00:13:34] Yeah. [00:13:34] That's a different story. [00:13:35] It's like the way the tone of it is, like, we need to understand. [00:13:39] Yeah. [00:13:40] Almost like we need to understand because this guy who we all said was so great may have done a couple of things wrong. [00:13:46] So let's come up with a reason to, or let's let him explain, give him an ample opportunity to explain why the reason wasn't that he wanted a private jet and a $30 million apartment in the Bahamas. [00:13:59] Imagine that I lost $10 million of people's investment and I was commingling funds. [00:14:08] And it was an honest error. [00:14:10] $10 million, not billion. [00:14:13] They would slaughter me. [00:14:14] They'd be all over the place. [00:14:15] They'd have people parked outside these windows right now. [00:14:18] Absolutely they would. [00:14:19] They are on a mission. [00:14:21] And that mission, to me, has two parts. [00:14:26] He's doing interviews where he knows he won't be pushed on the tough questions. [00:14:33] Keep this away from everyone else. [00:14:36] Contain this. [00:14:37] Okay. [00:14:38] So he's looking like a really good guy. [00:14:42] Look, I'm just trying to help out. [00:14:44] I just believe in giving all this money away and it just got out of hand, but I wasn't part of it. [00:14:49] And don't ask any questions about Democrats. [00:14:52] Don't ask any questions why all of a sudden everything is different with this guy than Ken Lay or Enron. [00:15:00] And they want to contain it. [00:15:02] But the second thing they need to do is make sure America learns the lesson about how bad these unregulated markets really are. === Containing Unregulated Crypto Markets (02:27) === [00:15:17] I mean, it is so dangerous.
[00:15:19] We can't just have this cryptocurrency out there. [00:15:23] We need a central cryptocurrency. [00:15:26] That's what he was for. [00:15:29] He was leading the band on saying, we got to, these people are out of control. [00:15:34] We got to regulate all of this. [00:15:37] He was a major force in that. [00:15:40] So they need to tell the story that cryptocurrency is bad. [00:15:47] That's why we need a Fed coin. [00:15:49] And by the way, we weren't in league with him on that or anything else. [00:15:54] No, no, no. [00:15:56] The media, the politicians, the Democrats, no. What money? [00:16:01] He gave us money. [00:16:02] What? [00:16:03] That's what's happening. [00:16:09] This is the best of the Glenn Beck program, and we really want to thank you for listening. [00:16:17] So this guy is the guy, they call him the FCC's 5G crusader. [00:16:23] He's a guy who cut all of the red tape and really pushed for the high-speed networks to be built by private businesses. [00:16:35] He is also the guy who is one of the big forces behind telehealth, mainly for veterans and low-income Americans to be able to get to doctors on their smartphones or tablets or any other connected device, driving down the price and driving up the access to medicine all around the country. [00:17:01] And he also, like Mike Rowe and me, believes in apprenticeships and everything else. I think this guy is a real warrior for what we believe are American truths. [00:17:14] His name is Brendan Carr. [00:17:15] He is a commissioner with the FCC. [00:17:17] Brendan, how are you, sir? [00:17:19] Glenn, so good to join you. [00:17:21] I really appreciate the chance to be with you. [00:17:23] Big fan of everything you're doing. [00:17:24] And listen, if you ever get in trouble at the FCC, if anyone files a profanity or indecency complaint against you, just don't mention you know me. [00:17:32] It'll go a lot better. [00:17:33] It'll go a lot better. [00:17:34] I know.
[00:17:34] I'll do that for you. [00:17:35] You and I never talked. [00:17:37] That'll be a story going forward. [00:17:39] I know, I know. [00:17:40] I know how this works. [00:17:41] Anyway, I wanted to talk to you about two things. === TikTok Data Feeding Beijing (07:46) === [00:17:45] Let's start with TikTok. [00:17:47] Everybody in the tech industry seems to be against Twitter. [00:17:52] I mean, it's crazy how, by letting people talk, they are being accused of destroying free speech. [00:18:00] It's an upside-down world. [00:18:02] But TikTok, nobody seems to want to do anything about this. [00:18:06] I've read your letter. [00:18:07] I've read your reports on this. [00:18:09] TikTok is extraordinarily dangerous to Americans. [00:18:15] Can you fill in why it's a danger and why everybody in America seems to be focused on Twitter, including the White House, and not TikTok? [00:18:26] Well, it's quite amazing. [00:18:27] And, you know, TikTok is an example of this. [00:18:30] And as we may get into Apple as well, when your product is, you know, for better or worse, immensely popular with consumers, it's amazing what you can get away with. [00:18:38] And I think TikTok is the prime example. [00:18:39] It's popular with millions and millions of Americans, including young Americans. [00:18:42] And they look at it and they think, well, that's just a fun platform for sharing videos and dance memes. [00:18:47] And the reality is that's just the sheep's clothing. [00:18:49] Underneath, it operates as a very sophisticated surveillance technology, right? [00:18:53] In the terms of service, they reserve the right to get your biometrics, including face prints and voice prints, search and browsing history, keystroke patterns. [00:19:03] The list goes on from there. [00:19:04] And for years, they said, don't worry, this is stored outside of Beijing, not a big deal, even though our parent company, ByteDance, is based in Beijing.
[00:19:11] And well, that's been revealed as nothing more than gaslighting. [00:19:14] It turns out that, according to internal communications, quote, everything is seen inside China. [00:19:21] And that's a massive, massive problem. [00:19:23] In fact, their COO was testifying in Congress a couple weeks ago and was asked point blank, do you transfer U.S. user data to employees in Beijing who are themselves members of the CCP? [00:19:36] And the COO said that she declined to answer that particular question. [00:19:41] So that's troubling. [00:19:42] There's also a new report that just came out that they had this Beijing-based operation that was attempting to surveil the location of specific Americans based on their usage of the TikTok application. [00:19:54] And that's not to mention, obviously, the concerns that come from the content side, where Americans, including children as young as 10 years old, are being fed things like the blackout challenge that literally convinces them to choke themselves. [00:20:06] And some have done that and died as a result. [00:20:09] So it's a national security threat, and it's something that parents should be worried about as well. [00:20:13] So explain to, because I've tried to explain this to my family. [00:20:18] You know, my kids are like, yeah, right, Dad, I got it. [00:20:21] What is China going to do with my, you know, my face print and my information? [00:20:26] Can you explain why that's dangerous? [00:20:29] Yeah, it really is. [00:20:30] And if you, one way to think about it is there's a version of TikTok. [00:20:33] TikTok itself isn't available in Beijing, but there's a version of it called Douyin, a sister app run by the parent company. [00:20:39] And that application shows kids science experiments, museum exhibits, educational material. [00:20:45] And again, here in the U.S., it's showing kids the blackout challenge. [00:20:49] So that's where the real danger comes.
[00:20:51] But also, if you step back, what's really happening when you're using TikTok, every time you swipe or click or search, what you're doing is you're feeding, training, and improving China's artificial intelligence, their AI. [00:21:04] And China has said, we want to dominate the world in AI by 2030. [00:21:08] And they're going to use it for authoritarian purposes, for surveillance, for exporting their control. [00:21:14] So even if you step back from your own self and your own kids, and even TikTok itself, the idea that we're sending this data, these clicks back to Beijing is improving their AI, and that's going to come around and bite us in ways that are, again, unrelated to TikTok itself. [00:21:28] So we have Google doing the same thing. [00:21:30] I mean, that's why Google was free, is they wanted all that information to work on AI. [00:21:40] So you're saying this is just another version of Google, if you will, that's here in America to be able to mine for all of that information. [00:21:51] Yeah, you're right. [00:21:51] You know, China has a fundamental flaw both in their system of government, obviously, but it carries through to AI, which is they don't have feedback loops. [00:21:59] They don't understand sort of Western free thinking. [00:22:02] And so they need Americans to be on TikTok to be observing their usage of data in order to create their AI and make it a healthy system. [00:22:13] So the sooner we cut off data flows back to Beijing, the sooner their version of AI starts to atrophy and go down a separate path in which it's going to be a lot less successful. [00:22:23] So I think we do need to look broadly, how do we stop training China's artificial intelligence? [00:22:28] But again, that's a piece of it. [00:22:29] It's used for surveillance. [00:22:31] It can be used for blackmail. [00:22:32] It can be used for foreign influence campaigns. [00:22:34] And where things are right now is this is in the court of the Biden administration.
[00:22:38] The Treasury Department has a group called CFIUS, the Committee on Foreign Investment in the United States, and they've been reviewing TikTok for over a year at this point. [00:22:46] And the New York Times reports that they've got a preliminary deal in place to allow TikTok to continue to operate. [00:22:52] And frankly, I think this is a big IQ test for the administration. [00:22:55] And it's sort of a pass-fail at this point. [00:22:58] And in fact, you just had FBI Director Chris Wray testify last week in Congress and said that the FBI has serious national security concerns. [00:23:05] So I don't see how the Biden administration can go forward and bless TikTok to continue to operate when you have the FBI, when you have Democrat Senator Mark Warner, chair of the Senate Intel Committee, saying that it is TikTok that scares the dickens out of him. [00:23:18] But we may very well be heading towards that direction here. [00:23:21] Google Play Store, Apple App Store, I know you wrote a letter to both of them and said, drop, drop this. [00:23:28] This is really bad for the country. [00:23:32] Yeah, I mean, putting aside the content of what's in this application, Google and Apple have very clear terms of service to stay in the app store. [00:23:41] If data is being used for purposes that aren't being disclosed, or if data is traveling to countries and being accessed from countries without that being properly disclosed, there's precedent for Google and Apple to boot apps off the app store for that reason. [00:23:54] And so I wrote them a letter and said, Look, in light of the national security concerns, in light of these clearly surreptitious data flows that we're now learning about, just apply the terms of your app store policies and boot them from the app store. [00:24:06] Of course, they didn't do that. [00:24:07] And that's why it's obviously highly ironic that there was at least the concern this week that Apple might take action against Twitter.
[00:24:15] Because look, if you're pulling advertising dollars or pulling support in Apple's case, potentially from Twitter while keeping your support or expanding your advertising on TikTok, you are sending quite the signal about your brand value. [00:24:29] I think it's very different than the one you think. [00:24:32] One last thing because I've got something else I want to talk to you about, but one last thing on this. [00:24:37] You just kind of brushed over this, but I think it is critical. [00:24:41] There was a new survey out that showed, I can't remember, six or eight out of 10 children in China want to be astronauts and want to be scientists. [00:24:53] Here, eight in 10 want to be social media movers. [00:25:01] Influencers, influencers, yeah. [00:25:04] That's crazy. [00:25:05] And part of that is because of TikTok. [00:25:08] As you said, the same thing under a different name over in China is encouraging people to do crazy great things and science and knowledge and education. [00:25:23] And this same platform is programmed here to really make you as dumb as a box of rocks. === FCC Scrutiny and Divisive Content (06:36) === [00:25:31] I don't think that's, I don't think that's just, oh, really? [00:25:34] I didn't even notice that. [00:25:35] That's intentional. [00:25:37] Yeah, you're right. [00:25:38] I mean, this is why I've talked about TikTok as China's digital fentanyl, because it's effectively, you know, a pipe directly from Beijing, from the CCP, into the ears and eyes and minds of millions and millions of America's youth. [00:25:53] And what they're being served is divisive content. [00:25:55] It's content that is increasing ADHD problems, suicide ideations, body image issues. [00:26:05] This is what is being fed to us. [00:26:07] And that's deeply, deeply concerning. [00:26:10] And that's why I think it's incumbent on the Biden administration to step in and take some tough action here. [00:26:15] So, Brendan, I have a philosophical question. 
[00:26:19] And I'd like you, if you would, noodle this out. [00:26:24] I tried to contact you a few weeks ago because I was presented with a story of a book that was in a school library and being read to kids in school. [00:26:38] And it was one of the most vile things I have ever read. [00:26:43] And look, I've done this for 40 plus years. [00:26:46] I know exactly what I can and can't say with the FCC, okay? [00:26:50] And I've always understood those to be community standards, et cetera, et cetera. [00:26:57] Here's my problem. [00:26:58] There are times when things need to be heard by the general public. [00:27:03] And I know we can go online and do it, et cetera, et cetera. [00:27:06] But why when we are a community standards-based system, if you can teach it to my children and have it in the classroom, why can't I, a program that is aimed at adults and during the day when kids should be in school, why can't I read that book on the air? [00:27:30] Well, you're right. [00:27:31] Look, we still have in place at the FCC rules that apply to broadcast radio and broadcast television that regulate profanity, indecency, similar content like that. [00:27:43] It obviously hasn't been enforced very much in the last few years, but they're still on the books. [00:27:47] And so you're right. [00:27:48] There is a point at which potentially you reading things from across the broadcast airwaves that may be found in a library somewhere could have issues under the FCC's profanity and indecency regulations. [00:28:04] Now, of course, there tends to be a newsworthy exception to a lot of that stuff so you can cover issues and things like that. [00:28:10] But it's a challenge. [00:28:11] And some people say, look, how do you generally square this pro-speech, free speech view with that type of stuff? 
[00:28:18] And I would say, look, what we can speak of as adults and talk about freely is very different than the content that should be stocking the shelves of school libraries for kindergartners. [00:28:30] Yeah, my problem is, this is a show that is based on information and opinion. [00:28:37] You may not like it, but we take our job seriously. [00:28:41] We try to be responsible. [00:28:43] I've always been responsible with the FCC. [00:28:47] And it's not a 1990s Howard Stern kind of thing, which we're way past that. [00:28:56] This is being read to our students in many schools all across the country. [00:29:03] And it is absolutely indecent. [00:29:05] And I know it's indecent. [00:29:07] But why do I get in trouble for exposing this indecency? [00:29:15] And the way to expose it is to make people understand by hearing it how unbelievably indecent it is. [00:29:25] Yeah, look, I think we've gone a long way recently in trying to address this issue by doing what you're doing. [00:29:31] We've had instances where parents have tried to read books from their, you know, again, kindergarten library at school board meetings, at city council meetings, and they've been shut down and said we can't allow that content to be spoken at these city council meetings. [00:29:47] Yet, you know, there it is in the kids' classroom. [00:29:51] And so I do think there's been some progress in that. [00:29:53] Now, look, from my perspective, I remember, you know, growing up in high school, the famous Eminem song, the FCC Won't Let Me Be. [00:30:01] And it's quite ironic after humming that song in high school that I've ended up at the FCC. [00:30:06] Look, we try to be very, you know, pro-free speech about this stuff, but this is an issue that we're dealing with as a cultural matter right now. [00:30:12] And I would not have a problem if it were just me possibly losing my license. But it's not just my license.
[00:30:20] Anything I do could possibly jeopardize the license of every station in my chain. [00:30:26] So there's no way, there's no way I'm going to put people out of work to prove this. [00:30:34] What do you recommend? [00:30:37] Well, look, again, there's a newsworthy exception to discussing some of this stuff. [00:30:42] You know, look, if you think it's, I mean, it could be good or bad, I don't know, but if it's close to the line, you know, there still are background indecency, profanity rules of the FCC. [00:30:51] We do get complaints from time to time. [00:30:53] I mean, we usually dismiss them or don't address them, in the main. [00:30:56] But yeah, you do potentially subject yourself to FCC scrutiny in those cases. [00:31:01] My problem is I had some of the best attorneys in Washington on free speech and FCC. [00:31:07] I've always had, I have for 25 years. [00:31:09] About three years ago, they called. [00:31:12] They also represent Google and Apple and Facebook. [00:31:17] And they dropped me in the middle of a case as a client because it made their other clients uncomfortable and they had to make a choice. [00:31:27] So I'm not sure if you will see me and my attorney at some point because, you know, it's hard to get one if you have my opinions today. [00:31:37] Brendan, thank you so much. [00:31:38] I appreciate all that you do at the FCC. [00:31:40] God bless. [00:31:41] Appreciate it. [00:31:42] Thank you. [00:31:42] You bet. [00:31:43] Brendan Carr, FCC Commissioner. [00:31:50] The best of the Glenn Beck program. [00:31:57] So, Stu, there are two stories that I barely understand. [00:32:02] Let me start with the one that I really am a little foggy on. === Rats Evolving New Sex Chromosomes (13:04) === [00:32:07] For any mammal, the loss of the Y chromosome should mean the loss of males and the demise of the species. [00:32:16] However, the Amami spiny rat manages without a Y chromosome and has puzzled biologists for decades.
[00:32:27] Now, a Japanese scientist and her colleagues have shown that one of the rats' normal chromosomes effectively evolved into a new male sex chromosome. [00:32:38] Now, I hate to get all sciencey because I don't know how these rats identify. [00:32:44] I don't know any of their pronouns or anything else. [00:32:47] So, the reason why this is important is because the Y chromosome seems to be getting weaker and weaker in a lot of mammals, including man. [00:33:06] And once you lose the Y, then what happens? [00:33:10] You've only got females, end of the species. [00:33:14] So that's why they're looking into this, because they believe that we are headed for the same kind of thing, which is the end of the species. [00:33:23] Yeah. [00:33:24] I mean, I guess think of just all the car accidents. [00:33:27] When we're just wiped out. [00:33:28] Only women drivers? [00:33:29] It would be crazy. [00:33:30] Oh, my gosh. [00:33:31] And women presidents and CEOs. [00:33:33] Oh, gosh. [00:33:34] Just shut the thing down. [00:33:35] Lord, please come now. [00:33:36] Anyway, so stupid. [00:33:42] That was largely just to piss off Sarah in the other room. [00:33:44] Oh, it is. [00:33:45] Largely. [00:33:46] 100%. [00:33:48] And of course, the fact that it's true. [00:33:50] Right. [00:33:51] So the next story is a quantum computer has simulated a wormhole for the first time. [00:33:59] Now, do you know what a wormhole is? [00:34:02] It's a space thing. [00:34:03] It's like a sciencey space thing. [00:34:05] Okay, so it's like you take a piece of paper and you fold it in half, then you, I think, fold it again. [00:34:11] Okay. [00:34:11] And you put a little hole in it. [00:34:14] Okay. [00:34:16] You would see that there would be two holes in the piece of paper. [00:34:20] It's like a mask. [00:34:20] Okay. [00:34:21] With your eye holes. [00:34:22] In fact, it's almost the perfect mask. [00:34:24] Okay. [00:34:25] So, and probably Fauci would have me wear this.
[00:34:28] Anyway, so a wormhole is a way to collapse the distance in between those two holes in space. [00:34:38] And then they're right there. [00:34:39] You go through one hole and you're right there because they're next to each other. [00:34:43] Right. [00:34:44] Instantly. [00:34:44] If space is folded. [00:34:46] Okay. [00:34:47] So that's the idea of a wormhole. [00:34:50] You could travel great distances through that quickly. [00:34:55] So this has just been a theory. [00:34:58] Scientists with a quantum computer have just simulated a wormhole for the very first time. [00:35:07] Now, it gets very complex, because they say it was holographic, but it's not exactly a hologram. [00:35:16] They just simplified things by taking gravity out of the equation, which gets into Einstein and the theory of relativity. [00:35:24] So they had to have something that would take gravity out and see if they could simulate this. [00:35:29] Well, they did. [00:35:31] And what this means is you could have, without any wires, cables, Wi-Fi, nothing, you can take something digitally and send it from, let's say, my desk to a desk in London, and it would exist in both places. [00:35:54] And you could close one of the doors and it would either come back to me and only be here, or I could close my door and it would be in London. [00:36:05] They just did this. [00:36:07] This changes everything. [00:36:10] This changes everything. [00:36:12] This is, you remember Einstein when he was, they talked to him about quantum physics. [00:36:17] He said, God doesn't play dice. [00:36:21] Meaning there is no superposition of a molecule or, I don't even know, of a qubit, as they're now called. [00:36:32] It can't be both positive and negative. [00:36:35] It can't be both one and a zero. [00:36:37] But quantum says, yes, it can. [00:36:40] That led him to say, God doesn't play dice. [00:36:44] It doesn't work that way. [00:36:47] Remember, the theory of relativity is only a theory.
[00:36:52] It's the best theory we have on how things work. [00:36:56] Quantum comes up and says, the basic soup, I don't think it really goes with any of that physics. [00:37:05] I think it breaks down at some point and starts behaving completely illogically. [00:37:12] This shows that Einstein may have been wrong. [00:37:16] Maybe God is playing dice. [00:37:21] The things that we have on the horizon are so groundbreaking. [00:37:27] And just quantum computing, all of this stuff, will change life in ways we can't imagine. [00:37:36] It's like we're standing in the 1200s and trying to imagine today. [00:37:41] But it's going to happen in the next 50 years. [00:37:44] Do we have any idea where this would end up? [00:37:46] Like, what would be the end game of this type of technology? [00:37:50] The biggest thing about quantum computing is you will probably solve cancer in a week. [00:37:57] You will solve these problems that cannot be solved, because it can model a million different things all at the same time. [00:38:04] So you remember Edison said, you know, I didn't fail a thousand times. [00:38:12] I found a thousand ways the light bulb doesn't work. [00:38:17] With this, you'll fail and succeed one time, because you'll try all of the combinations all at once. [00:38:28] And you'll have the answer. [00:38:30] It feels like there are so many things right now on the fringes of science, where scientists are really playing, right? [00:38:41] They're at the very edges of understanding where they can go, but see the path forward. [00:38:47] You know, some of these problems like this one are just beginning to be solved. [00:38:50] And there's so many different directions, whether it's, you know, we talked about the singularity, or whether it's quantum computing, or all sorts of different technologies, that it feels like one of these is going to hit in a way that totally changes the world almost immediately.
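For readers who want the "both one and a zero" idea in concrete terms: a minimal numeric sketch of superposition, assuming only NumPy. This is an illustration of how a qubit's two amplitudes turn into measurement odds; it is not from the show or from the wormhole experiment discussed, and all names here are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit's state is a vector of two complex amplitudes, one for |0> and one for |1>.
# Equal superposition: amplitude 1/sqrt(2) on each, so it is "both one and zero" until measured.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Born rule: measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2  # approximately [0.5, 0.5]

# Simulate many measurements: each one collapses to a definite 0 or 1.
samples = rng.choice([0, 1], size=10_000, p=probs)

print(probs)           # roughly [0.5 0.5]
print(samples.mean())  # close to 0.5
```

Each single measurement gives a definite answer; only the statistics over many runs reveal that the state held both possibilities at once, which is the property Einstein's "God doesn't play dice" objection was aimed at.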
[00:39:06] But in a way, let's look at the telephone for a minute. [00:39:09] Put yourself back at Alexander Graham Bell's time. [00:39:13] Alexander Graham Bell comes up with it and people think, oh, this is great. [00:39:16] Look at this. [00:39:16] But nobody's going to have a telephone for a long time. [00:39:18] Yeah, they say about everything. [00:39:19] Yeah. [00:39:20] And they think, oh, well, right. [00:39:22] I'll just go to, you know, the town square that will have a telephone and I'll be able to call, you know, Washington if I needed to talk to the president because it was an emergency. [00:39:33] They were thinking like that. [00:39:34] They would have never thought, think of the phone today. [00:39:40] It's no longer cordless. [00:39:41] I mean, it's no longer corded. [00:39:43] Right. [00:39:44] It doesn't work with wires. [00:39:48] It's a television. [00:39:50] It's a camera. [00:39:51] It's no longer really even for phone conversations. [00:39:53] Right. [00:39:55] And that's, I think, a really interesting example of how this goes. [00:39:58] Think about the singularity for a second, right? [00:40:00] Singularity being eventually we merge with machines. [00:40:02] Tell me if this is a terrible description, but my very terrible understanding of it. [00:40:06] Eventually we merge with computers where we are able to access information instantly because we have maybe a chip in our head or whatever that allows us. [00:40:16] Right. [00:40:16] And we also have nanobot technology in our bloodstream that is keeping you alive. [00:40:22] You don't have to take medicine anymore. [00:40:24] The nanobots are programmed to take care of your body and it repairs itself through technology, which is connected to AI, a giant machine outside of your body. [00:40:36] Right. [00:40:36] So you're one with AI. [00:40:39] You're one with machines. [00:40:40] You're a hybrid person. [00:40:42] Who is that? [00:40:42] That's the singularity. 
[00:40:43] That's the singularity. [00:40:45] So if you think about, let's just say for information purposes, you want to get an answer about something in this world of the singularity. [00:40:54] You want to know who was the president of France in 2004, right? [00:41:00] You'd be able to access that information instantly inside your brain, basically. [00:41:04] Yeah. [00:41:04] Right now, you have to go to Google, open up Google, and type in your question. [00:41:09] Right. [00:41:10] The singularity, the way it would be imagined to be used at its highest level, would be: [00:41:16] Oh, who was the president of France? [00:41:18] Oh, it was so-and-so. [00:41:19] Right. [00:41:20] The minute you think it, the answer is there. [00:41:23] Right. [00:41:24] Because you're connected to everything. [00:41:28] But in a way, what you're describing is essentially the same process, just faster. [00:41:34] Right. [00:41:34] Right now, we have crossed a line. When we talked about this in radio terms before, radio used to be really fun, because what you'd be able to do is come on the air and you'd say, oh, what was that movie with Corey Haim? [00:41:52] Remember this? [00:41:53] He was a guy and he would go. [00:41:56] Do you remember, there's two Coreys in it? [00:41:58] What was that movie? [00:41:59] And then people would call in. [00:42:00] They'd say, oh, it was Goonies. [00:42:02] No, no, it wasn't Goonies. [00:42:04] It was, you go through this whole thing. [00:42:06] And you could do hours on this, and people would reminisce about these memories and think about these things and try to figure them out. [00:42:12] And now that's all dead, because everyone just goes, oh, Corey Haim, types it in, and looks at his IMDb page and knows the answer in five seconds, right? [00:42:21] And so, wait, wait, wait, before you move on from there, what has that also done to our memory? [00:42:28] Right. [00:42:28] Right.
[00:42:29] Terrible. [00:42:29] It's worse. [00:42:30] You don't even think, I have to store that, or, I remember, what's his name? [00:42:34] Oh, I remember we were sitting in a room and it was so-and-so that said it. You don't do that. [00:42:40] So your memory is weakened. [00:42:43] I see it with my kids. [00:42:44] When they want answers to things, they're like, who's, you know, what was the score of the, they just ask the device, the dumb device whose name I won't say, because I'd screw all the people listening by saying it. [00:42:55] But they'll ask the device, you know, without trying to think about it for an hour. [00:43:01] They just know the answer's there. [00:43:03] And that's the same concept of what the singularity could theoretically become, right? [00:43:07] So imagine, imagine if you're going to Italy and you want translation, you'll be able to understand them instantly, because it'll be there. [00:43:17] The translator will be inside of you. [00:43:20] You'll probably butcher it, because it requires the physical use of your mouth, but you will know how that is supposed to be said and you'll say it. [00:43:31] But once that information is gone, you can't communicate in that language anymore. [00:43:37] If you're cut off, you know, there's no memory of it. [00:43:42] There's memory that you did it, but there's no memory. [00:43:45] There's no muscle memory. [00:43:46] There's nothing. [00:43:47] And this gets to the point that, you know, think about being de-platformed now. [00:43:51] What does that mean? [00:43:52] Oh, I lose my Twitter account. [00:43:53] What does it mean if the singularity exists and you're de-platformed from all of this knowledge that everyone else can access immediately? [00:43:59] Oh, no, Ray Kurzweil said that would never. [00:44:02] That would never happen. [00:44:02] That would never. [00:44:03] So, okay. [00:44:04] Why would we do that? [00:44:05] That's a totally different road.
[00:44:06] When we think about the innovations that happen when these things kick in, we could talk about Alexander Graham Bell, but go back just a decade, right? [00:44:16] Go back to 2008, right? [00:44:18] Before the iPhone. [00:44:19] Before the iPhone. [00:44:20] We've gone from literally no one having these things, maybe just for occasional phone calls, to the era where everyone expects to have one and is on these things five, six hours a day. [00:44:33] That is the merging of man and machine. [00:44:36] It's already happening. [00:44:37] On the air, in the 90s, I said to you that networks and watching shows, it's not going to be Thursday night at eight o'clock. [00:44:46] You just will log on and download it, and you'll have all the episodes that you want. [00:44:51] And that seemed completely insane. [00:44:53] Insane in the 90s. [00:44:55] And here we are. [00:44:56] We all now expect it. [00:44:56] Nobody watches TV that way, and we put very little thought into what it means or how it happened. [00:45:02] Wait a minute. [00:45:03] You sound like you're starting to make a point on this. [00:45:06] Yeah. [00:45:06] Well, hang on. [00:45:07] I just want to live in my fantasy world here for a second and not think about that.