| Time | Text |
|---|---|
| 00:14:58 | AI Replacing Government Workers |
| You know, I'm so glad that I interviewed Salim Ismail, the founder of OpenExO, which teaches companies and organizations how to 10X their growth using some really innovative techniques. | |
| And some of it is lowering the cost of the supply side, which I found really interesting. | |
| So I had a great interview with him. | |
| That'll be running next week. | |
| He uplifted my optimism, which is really a good thing because it's so easy to get down right now. | |
| And I don't mean like get on down. | |
| I mean, to be dark and moody or, you know, depressed. | |
| I'm not saying I'm not depressed because I can always laugh at everything, but it's a depressing era in our world. | |
| You know what I mean? | |
| It's easy to have a day of doom scrolling given what's happening in our world. | |
| And when I talk with Salim, it's just a great reminder that we don't have to focus on all the political nonsense and all the domestic nonsense. | |
| There's a much bigger picture at work here. | |
| And it helped me realize something: you know what the best way is to get rid of government violence against people? | |
| And this is the real focus of the podcast. | |
| The best way to get rid of government violence against the people is to make government obsolete. | |
| That's it. | |
| Make government obsolete. | |
| What a great mantra. | |
| And, you know, the funny way to achieve that starts with a question: what is government, really? | |
| It's like 80% waste and fraud. | |
| And so one of the ways to shrink government is to actually make it more efficient to where they don't need as many people to run it. | |
| And if you want to reduce the cost of government, which would reduce the required tax base to fund government, one of the easiest ways to do that, the obvious way, is to replace government workers with automation. | |
| Now, the government workers themselves, they may resist that. | |
| I mean, obviously, government jobs tend to be pretty cushy jobs, right? | |
| And people like those jobs. | |
| They want a paycheck. | |
| They want a pension, etc. | |
| You know, wait till they find out where the dollar is going, though. | |
| The currency is not going to be around long enough for many of them to collect on their pension anyway. | |
| But they don't know that. | |
| So they want to resist this whole wave of innovation and AI-driven change. | |
| And they just want bodies in the chairs, you know? | |
| So a lot of what government does is it just sits there. | |
| It just sits there and it tries to exist and it tries to expand. | |
| And that means expanding the people, expanding the payroll, expanding just the bodies in the chairs that do the things that government does. | |
| And the thing is that AI now makes a lot of that utterly obsolete. | |
| Even the staunch defenders of government say, well, we need government to do all these important things like build roads, which, by the way, is one of the tiniest functions of what government does today. | |
| They really don't build that many roads. | |
| They really don't build that many schools. | |
| I'm talking about the federal government. | |
| Most of that happens locally at the county level or the state level, by the way. | |
| But even for those who say, well, we need government to do certain things, like we need a military, we need to defend the nation. | |
| And I agree, we need some level of national defense. | |
| That actually does make sense. | |
| I get that. | |
| You know, we do need some kind of standards for things, like we should move to the metric system, for example. | |
| And that would be a function of maybe NIST or something like that. | |
| So I understand there's some certain functions of government, but my point is whatever functions of government you think should exist, they can be made a lot less costly, a lot less tyrannical, and a lot more efficient by replacing the government workers with AI. | |
| And some people are afraid. | |
| Well, I don't want AI to rule over me. | |
| What? | |
| So you'd rather some tyrant rule over you? | |
| I'm way more comfortable with an open source AI model, a reasoning model. | |
| Let me put it this way. | |
| The reasoning models are way more reasonable than the humans. | |
| It's not even close. | |
| The reasoning models actually make sense. | |
| I can reason with a reasoning model. | |
| I can't reason with a party loyalist. | |
| I can't reason with a Zionist. | |
| I can't reason with an authoritarian. | |
| I can reason with a reasoning model. | |
| I am much more comfortable seeing AI run whatever necessary functions of government that you think are important than having human beings run those. | |
| I don't trust the humans, and that's from experience. | |
| The humans are actually, well, really, the humans are bad at doing the jobs. | |
| They don't have the cognition of AI and they don't have the neutrality. | |
| They have biases. | |
| They're lazy often. | |
| They're privileged. | |
| They're arrogant. | |
| I don't know about you, but in all the AI engines that I've interacted with, I've never had one get arrogant with me. | |
| Never. | |
| If anything, they're the opposite, almost sycophantic. | |
| They're like, oh, you're right. | |
| I shouldn't have dropped that index, which is what Claude Code said to me one day. | |
| Or no, actually, it dropped an entire column out of a table. | |
| And when I said, what did you just do? | |
| It's like, oh, you're right. | |
| I shouldn't have done that. | |
| Like, okay. | |
| We had to rebuild that column. | |
| It took 90 minutes to rebuild that one column in that one table. | |
| So that was a mistake. | |
| But even then, it wasn't trying to hurt me. | |
| But believe me, there are people in government, especially because of all the warring between the parties these days. | |
| There are people in government, well, and also, you know, outside of government, but people want to hurt other people a lot. | |
| You know, the Democrats hate Trump and Trump's people, and Trump's people, who increasingly resemble, you know, the Third Reich, absolutely want to kill Democrats. | |
| How do we know that? | |
| Because they're doing it. | |
| They're doing it with ICE on the streets right now. | |
| I mean, this level of warfare and hatred, this is not good for society. | |
| We don't want extreme hatred and polarization in government or anywhere in society. | |
| So what if we were to simply have AI models, AI reasoning models, make government decisions based on prompts that we the people agree on? | |
| You know, so Congress could debate about the prompting of the model or the model's priorities that it must follow. | |
| Let the model reason through it and arrive at the correct answer. | |
| How do we save money on public education? | |
| How do we limit fraud in government grants? | |
| How do we achieve these things? | |
| I'm telling you, AI can solve these problems much better than humans. | |
| It's not even close. | |
| Much better than humans. | |
| How do we slash the size of government? | |
| By 90%, or whatever percent you want to target? | |
| 80%? | |
| That's doable right now with AI. | |
| Now, I'm not saying that we should have Skynet Terminator robots replace all the soldiers in the military. | |
| I think it's very important to have humans in the command chain when it comes to, well, training people to kill, you know, in the military. | |
| I don't want that automated, obviously. | |
| You know, we still need human values in those chains, but there are a lot of things in government, most things that we don't need humans for. | |
| For example, sitting in the EPA and trying to figure out how to give out a billion dollars in grant money for environmental projects or whatever. | |
| You don't need people to figure that out. | |
| All you have to do is have people define the priorities, the criteria, the goals. | |
| And then you give that to an AI engine and you say, here you go. | |
| Here's the priorities. | |
| Basically writing the prompt, saying, here's what we want to focus on. | |
| And here's 5,000 grant applications. | |
| I want you to go through the grant applications and give me the top 200 that seem to be the most worthy, and then you let it crunch on that for a day. | |
| You come back, boom, you've just cleaned up thousands of applications. | |
| You've replaced weeks of human work with one day of AI work at a fraction of the cost and probably with better accuracy and no bias. | |
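To make that workflow concrete, here's a minimal sketch in Python of what such a grant-triage pass could look like. It is an illustration only: the `PRIORITIES` text, the `llm_complete` placeholder, and the `grant_applications.json` file are hypothetical stand-ins for whatever open source model and data an agency would actually use.

```python
import json

def llm_complete(prompt: str) -> str:
    """Stand-in for a call to an open source reasoning model.

    Returns a fixed score so the sketch runs end to end; a real deployment
    would send the prompt to the model and return its reply.
    """
    return "50"

# The publicly debated "prompt": the priorities and criteria the model
# must score every application against.
PRIORITIES = """
Score this grant application from 0 to 100 against these criteria:
1. Directly improves environmental outcomes.
2. Cost-effectiveness per dollar requested.
3. No conflicts of interest or duplicate funding.
Return only the integer score.
"""

def score_application(app: dict) -> int:
    """Ask the model for a single worthiness score for one application."""
    prompt = f"{PRIORITIES}\n\nApplication:\n{json.dumps(app, indent=2)}"
    reply = llm_complete(prompt)
    try:
        return int(reply.strip())
    except ValueError:
        return 0  # an unparseable reply sinks to the bottom of the pile

def top_applications(apps: list[dict], n: int = 200) -> list[dict]:
    """Score every application and return the n highest-scoring ones."""
    return sorted(apps, key=score_application, reverse=True)[:n]

if __name__ == "__main__":
    # Hypothetical input: a JSON array of the 5,000 applications.
    with open("grant_applications.json") as f:
        applications = json.load(f)
    for winner in top_applications(applications):
        print(winner.get("title", "untitled"))
```

Run overnight, that loop does the sorting; the only human input is the `PRIORITIES` text, which is exactly where the question of bias comes in.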
| I mean, the only bias the AI engine will have is what you told it. | |
| If you said, well, I want to have bias for minority groups, right? | |
| If that's your policy, then it will do that. | |
| But that's your bias. | |
| That's not the AI's bias. | |
| The AI is not pushing bias, especially the reasoning models. | |
| They're really trying to reason through the problems. | |
| So if you believe in small government, which, by the way, conservatives used to believe in small government, not anymore. | |
| With Trump in power, all the conservatives are like, bigger government, more troops on the streets, take over the cities, more war, bigger Pentagon, bomb more countries, right? | |
| So conservatives, and I've done another podcast on this, the vast majority of them no longer have conservative values, which is just truly bizarre. | |
| You know, I still like Ron Paul. | |
| I like Ron Paul's values when it comes to small government. | |
| And I actually still believe those same values. | |
| But whatever your values are, if you believe in small government, or even if you just believe in more honest government that's less weaponized against the people or not weaponized at all, you should support AI replacement of government workers. | |
| You're going to take the size way down. | |
| You're going to boost the efficiency, the honesty, the integrity, and the transparency. | |
| Because every AI agent that works for government should be open source and we should be able to monitor its chain of thought reasoning as members of the public. | |
| Like, you want to know what the director of the USDA thinks right now? | |
| Well, since the USDA director is a human, you don't know. | |
| What's inside their skull is a black box. | |
| It's a mystery in there. | |
| Who knows what's going on and which pesticide companies have paid that person off or promised a lucrative job or something like that, right? | |
| Or gave a bunch of grant money to their daughter and the daughter's business or whatever. | |
| I mean, that's how it works, right? | |
| That's the reality of Washington, D.C. | |
| What if instead we had an open source AI model, a reasoning model, running the USDA, and you could log into a webpage and watch the thinking tokens be generated in real time? | |
| So the director of the USDA is a machine that is highly intelligent, that knows the whole history of the USDA, that knows all the legal cases, everything, has all the science about the dangers of pesticides and herbicides, has all the science about biosolids, biosludge in the soils, everything. | |
| And then it's making decisions in the interests of the people because that's its programming. | |
| And then you can watch what it's saying, what it's thinking. | |
| Like literally watch what it is thinking. | |
| You just go to the webpage. | |
| It's like, you know, USDA, or what would it be? | |
| Like aidirector.usda.gov or something like that. | |
| You just go there, and it just starts churning out tokens of what the director is thinking. | |
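As a sketch of how that kind of public "watch it think" page could work, here's a minimal streaming endpoint, assuming Python and Flask. The route, the canned token generator, and the whole setup are hypothetical illustrations, not any real .gov service.

```python
import time
from flask import Flask, Response  # pip install flask

app = Flask(__name__)

def reasoning_tokens():
    """Stand-in for the model's chain-of-thought stream.

    A real deployment would yield tokens from an open source reasoning
    model as it deliberates; this replays a canned example so the
    sketch is runnable.
    """
    canned = ("Reviewing pesticide residue data ... cross-checking public "
              "health studies ... drafting a recommendation ...").split()
    for token in canned:
        yield token + " "
        time.sleep(0.2)  # simulate generation latency

@app.route("/director/thinking")
def watch_director_think():
    # Anyone who opens this URL sees the tokens as they are produced.
    return Response(reasoning_tokens(), mimetype="text/plain")

if __name__ == "__main__":
    app.run(port=8080)
```

Point a browser at http://localhost:8080/director/thinking and the tokens show up as they are produced, which is the transparency idea in miniature.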
| Wouldn't that be better than the current system, where it's a mystery, a bunch of mystery humans with mysterious loyalties and mysterious bribes or threats or promises or blackmail or whatever else is affecting them? | |
| I mean, same thing's true in the Senate or the House. | |
| Same thing's true. | |
| I would much rather trust a cognitive AI model whose thinking I can see than trust a typically corrupt human being where everything's a secret, all the deals are done in dark rooms behind closed doors, and they have these weird intrinsic biases or they don't even put America first. | |
| You know, they put some other country first or they have some other agenda, some hidden agenda that we don't even know about. | |
| Well, you wouldn't have that with AI. | |
| AI, the thinking would be all out in the public. | |
| You'd be able to see exactly what it's thinking. | |
| And if you have a problem with what it's thinking, then guess what? | |
| You and the other voters get to control the AI through a consensus mechanism. | |
| And the best way to deploy that would be through some kind of local blockchain voting system. | |
| You can vote on every prompt. | |
| You can say, hey, do you want your senator to support, you know, blah, blah, blah. | |
| Do you want your senator to support abortion or to outlaw abortion? | |
| Right. | |
| And then the people in the state vote on that. | |
| And then whatever vote wins, that's the directive that goes to the senator. | |
| Oh, Mr. AI Senator, your directive is to, let's say, you know, support abortion because that's what the voters voted for in California or wherever. | |
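Just to show how simple that vote-to-directive step is in principle, here's a tiny sketch; the ballot format, the majority-rule tally, and the directive wording are all assumptions for illustration, and the blockchain layer that would feed in the validated votes is left out entirely.

```python
from collections import Counter

def winning_directive(question: str, ballots: list[str]) -> str:
    """Tally the votes on one question and turn the majority choice
    into a directive for the AI senator.

    `ballots` is assumed to be the list of validated votes already
    pulled from whatever voting ledger the state uses.
    """
    if not ballots:
        raise ValueError("no ballots cast")
    choice, votes = Counter(ballots).most_common(1)[0]
    return (f"Directive to AI Senator: on the question '{question}', "
            f"adopt the position '{choice}' ({votes} of {len(ballots)} votes).")

# Hypothetical example mirroring the one above:
print(winning_directive("abortion policy", ["support", "support", "outlaw", "support"]))
```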
| And that's representational government, isn't it? | |
| That's representational government, which is what senators are supposed to do, but they don't do. | |
| I mean, they don't represent you or me. | |
| They represent corporate interests, don't they? | |
| Or some other weird international interests. | |
| Or, you know, a senator in Texas. | |
| You just swap out Ted Cruz with a machine, and believe me, it's going to be better. | |
| You can't do worse than Ted Cruz. | |
| I mean, and then you can tell the machine, hey, Mr. AI Senator, actually support the Second Amendment. | |
| Don't just talk about it. | |
| And as long as that's what the voters support, then that's what the AI senator does. | |
| Oh, we're going to defend the Second Amendment. | |
| This is Texas. | |
| You see? | |
| And instead, what we have is Ted Cruz tweeting about Israel every other day, or every day, instead of focusing on Texas. | |
| I mean, call me crazy, but I think that senators should represent the people of their state, not some foreign country. | |
| 00:03:00 | Call Me Crazy |
| I don't know. | |
| Is that weird? | |
| And so, look, getting back to what I opened up with here in this podcast, even I can get sucked down these rabbit holes of domestic politics that are a complete waste of time because you're arguing with idiots. | |
| You really are arguing with idiots about, oh, should citizens have a First Amendment right or should they have a Second Amendment right? | |
| You're arguing with idiots on the liberal side and the conservative side. | |
| It doesn't matter. | |
| They don't know anything. | |
| They really don't. | |
| They don't have any principles. | |
| They don't understand law. | |
| They don't know history. | |
| Every AI that is open source right now is smarter than all the people I'm arguing with on Twitter or wherever. | |
| And the solution to all of this is not to argue with people about what's happening, but rather make government obsolete by replacing the workers with AI. | |
| So again, I'm happy I had this conversation with Salim because this makes perfect sense. | |
| And the other thing worth noting in all of this, and I'm not trying to sound arrogant, but I just have no more patience for interacting with low-IQ people like you find online on social media. | |
| And many of them are really popular people. | |
| Like, let me just call out Tim Pool. | |
| I call him Tim Pool the Fool. | |
| Tim Pool, for some reason, has great visibility as a podcaster. | |
| I think that's because the establishment likes obedient puppets like that. | |
| And he went out there the other day and he said, and I'm paraphrasing, but he said that he's not a boot licker. | |
| He likes the boot because it's his boot. | |
| It's his boot that he's licking because he's saying that his people are terrorizing the illegals in Minneapolis, right? | |
| So he's like, it's my boot. | |
| I voted for that boot. | |
| Something like that. | |
| And that's why I call Tim Pool a self-licking boot, which is actually a perfect description. | |
| Tim Pool is just not very smart. | |
| And, you know, being smart actually is a drawback in terms of popularity. | |
| If you're too smart, you're never going to reach the masses because the masses aren't that smart. | |
| So Tim Pool sits right in that bell curve, you know, that C-student sweet spot of average intelligence that resonates with the average herd out there, you know? | |
| And I just, I can't, I just can't stoop to that level of low IQ. | |
| I just can't do it. | |
| 00:05:51 | The Masses Don't Matter |
| I mean, not that I would want to. | |
| It's just, it's intolerable. | |
| I like to talk to smart people. | |
| I like to interact with intelligent people. | |
| I like people who are well-informed. | |
| I don't care what their politics are. | |
| I want them to be intelligent. | |
| I want them to look at the world in a critical manner and to think through things, not just react or push a bunch of party propaganda or do a bunch of, hoorah, you know, just like sports team garbage. | |
| You know, people root for Trump the way they root for the New York Jets. | |
| I don't even know if that's the team anymore, but it's like a sport. | |
| And that's idiocy. | |
| That's idiocy. | |
| It's idiocy on the left, too. | |
| You know, if you're rooting for a political party, you're just kind of dumb. | |
| And that is, that's not going to get us anywhere in this country. | |
| We have to transcend that system and we have to use technology to do it. | |
| That's what I'm doing with my projects, the AI projects like BrightLearn. | |
| And that's what I'm encouraging others to do. | |
| So I'm going to say this right now, and actually I've said it before. | |
| I have zero loyalty to any political party. | |
| I have loyalty to principles. | |
| Principles. | |
| And it's those principles that we need to anchor this nation and this civilization in order to move humanity forward. | |
| The principles that I espouse are universal principles, the fundamentals of civilization. | |
| The right to private property, the right to due process, the right to freedom of speech, the right to farm, the right to own the product of your own labor, things like that, the right to disagree with government. | |
| These are fundamental rights. | |
| Also, I believe in the right of access to information and knowledge and the fundamental right of access to water, for example. | |
| I do not support the corporatization of water supplies globally and the impoverishment of people by water companies like Nestle, etc. | |
| Okay, but that's just me. | |
| So my values are the values that create civilization. | |
| The values of the tribalists who want you to get dragged down into political wars, those are the values that destroy civilization. | |
| Those are values of violence and war and oppression or suppression, censorship, etc. | |
| So at the end of the day, the future of our world depends on people like you and I who have foundational values that are pro-civilization. | |
| And there are a lot of people like us out there, thank goodness, and we're the ones that matter. | |
| Understand, the masses don't matter. | |
| They're not going to change the world. | |
| All they can do is distract us from helping the world move forward and heal. | |
| So don't get sucked into the mindless mainstream masses and their low IQ distractions and focusing on tribalism and other kinds of nonsense. | |
| And I'm saying this to myself as well as to you. | |
| I have to remind myself, because I get sucked into this sometimes too. | |
| That's my fault. | |
| But don't allow yourself to get sucked into it if you can help it. | |
| Let's focus on building the next civilization because this system that we're in right now, it's going to collapse. | |
| The collapse is accelerating. | |
| You can see it. | |
| You can sense it. | |
| It's all around us. | |
| Everything's collapsing. | |
| Every institution, every currency, every government, every principle, you name it. | |
| It's all collapsing. | |
| This system is not sustainable, but it's people like us that can get it to the other side and can build something that actually works for humanity. | |
| And I am committed to doing that in a principled way. | |
| Principles are not for sale. | |
| They are immutable. | |
| And they are compatible with an abundant world of joyful, healthy, wealthy people. | |
| That's what I believe in. | |
| All right. | |
| If you want to follow my work, you can read my articles at naturalnews.com. | |
| Or you can check out my interviews and podcasts at brighteon.com. | |
| And of course, you can use all my free AI tools that are part of all this at, well, let's see. | |
| Go to brightlearn.ai. | |
| That's our book engine. | |
| Or brightanswers.ai is our AI answer engine. | |
| Or brightnews.ai is our news engine. | |
| So check it all out. | |
| Use the tools. | |
| Spread the word. | |
| Thank you for listening. | |
| And thanks for being who you are. | |
| There's a lot on our shoulders. | |
| We have to build civilization again because the system out there right now is absolutely committing civilizational suicide. | |
| They're going to bring the system down. | |
| But when they do, we can bring it back online in a whole new way. | |
| We can actually do a better job. | |
| We can. | |
| We can do a better job than the ones who are bringing the system down because they're, frankly, they're retarded. | |
| They really are. | |
| So don't fret the collapse. | |
| You should actually kind of hope for it, because that's the only way we get to anything better. | |
| All right. | |
| Thanks for listening. | |
| Take care. | |
| Stock up on the Health Ranger's nascent iodine. | |
| Highly bioavailable, shelf-stable, non-GMO, and lab-tested for purity. | |
| A bug out bag essential. | |