Speaker | Time | Text |
---|---|---|
This is what you're fighting for. | ||
I mean, every day you're out there. | ||
What they're doing is blowing people off. | ||
If you continue to look the other way and shut up, then the oppressors, the authoritarians, get total control and total power. | ||
Because this is just like in Arizona. | ||
This is just like in Georgia. | ||
It's another element that backs them into a corner and shows their lies and misrepresentations. | ||
unidentified | | This is why this audience is going to have to get engaged. ||
As we've told you, this is the fight. | ||
All this nonsense, all this spin, they can't handle the truth. | ||
War Room Battleground. | ||
Here's your host, Stephen K. Bannon. | ||
In public and policy conversations, talk of human-level AI is often treated as either science fiction or marketing hype. | ||
But many top AI companies, including OpenAI, Google, Anthropic, are treating building AGI as an entirely serious goal, and a goal that many people inside those companies think they might reach in 10 or 20 years, and some believe could be as close as one to three years away. | ||
More to the point, many of these same people believe that if they succeed in building computers that are as smart as humans, or perhaps far smarter than humans, that technology will be at a minimum extraordinarily disruptive, and at a maximum could lead to literal human extinction. | ||
The companies in question often say that it's too early for any regulation because the science of how AI works and how to make it safe is too nascent. | ||
I'd like to restate that in different words. | ||
They're saying we don't have good science of how these systems work or how to tell when they'll be smarter than us. | ||
We don't have good science for how to make sure they won't cause massive harm. | ||
But don't worry. | ||
The main factors driving our decisions are profit incentives and unrelenting market pressure to move faster than our competitors. | ||
So we promise you're being extra, extra safe. | ||
Just in general, is your impression now is it was OpenAI doing enough in terms of its safety procedures and protocols to adequately vet its own products and to protect the public? | ||
unidentified | | I think it depends entirely on how rapidly their research progresses. ||
If their most aggressive predictions of how quickly their systems will get more advanced are correct, then I have serious concerns. | ||
Their most aggressive predictions may well be wrong, in which case I'm somewhat less concerned. ||
And I think that it's very, very important that we be prepared for a variety of threats. | ||
Those threats could include deepfakes of election officials, of candidates like yourselves, and also deepfakes that present election apparatus in them, indicating that there was tampering with physical objects associated with the election. ||
In two bills that I mentioned earlier that I worked on in California, currently awaiting signature, fingers crossed, by Governor Newsom, Assembly Bill 2655 by Marc Berman and Assembly Bill 2839 by Gail Pellerin, there would actually be serious consequences: if someone posted those types of election deepfakes, they could be removed; the platforms would be required to remove them. ||
Yeah, well our bill allows for them. | ||
Many of the platforms are actually supporting this bill because it makes it clear that they would have to take it down, but it also puts liability on the people that put those up. ||
Yeah. | ||
Potential liability. | ||
The police will be on their best behavior because we're constantly recording, watching and recording everything that's going on. | ||
Citizens will be on their best behavior. | ||
Because we're constantly recording and reporting everything that's going on. | ||
It's unimpeachable. | ||
The cars have cameras on them. | ||
I think we have a squad car here someplace. | ||
But those kinds of applications use AI, and we're using AI to monitor the video, so it's not people that are looking at those cameras. ||
It's AI that's looking at the camera. | ||
No, no, no. | ||
You can't do this. | ||
Take something like a shooting. That's immediately going to be an event; an alarm is going to go off immediately. ||
And we're going to have supervision. ||
In other words, every police officer is going to be supervised at all times. | ||
And if there's a problem, AI will report the problem and report it to the appropriate person, whether it's a sheriff or the chief or whomever we need to take control of the situation. | ||
We have drones. | ||
If there's something going on in a shopping center, and I'll stop, a drone goes out there way faster than a police car. | ||
There's no reason for, by the way, high-speed chases. | ||
You shouldn't have high-speed chases between cars. | ||
You just have a drone follow the car. | ||
I mean, it's very, very simple. | ||
And the new generation of autonomous drones. | ||
Good evening. | ||
It is September 19th in the year of our Lord 2024. | ||
I am Joe Allen, sitting in for Stephen K. Bannon, who still sits in prison unjustly. | ||
What you heard there at the beginning is from the recent hearing of the Senate Subcommittee on Privacy, Technology, and the Law. ||
Helen Toner, the first speaker, was on the board of OpenAI before she was ousted after raising the alarm that OpenAI was being reckless with the technologies that they are developing. | ||
The real issue that we hear there is not necessarily the existential threat of AI coming alive and killing everybody. | ||
It's not necessarily the threat of deepfakes, although this is going to be a major problem going forward. | ||
I would say that the real problem of artificial intelligence, which has already invaded our lives like soulless insects, is that we now have non-human entities that can speak to us, that can listen to us, that can produce very dramatic illusions such as video content or visual content, and we are being primed | ||
for a symbiosis with these entities. | ||
We are already seeing in schools, children being acculturated to speaking to non-human entities and trusting them to give them accurate information. | ||
We're already seeing the AI rolled out across corporations, government agencies, and even in religious institutions to help people through their prayer lives. | ||
What we're facing is not necessarily something that Congress or any state government is going to be able to prepare us for, protect us from, or control. | ||
What we're facing is a broad cultural revolution in which our closest companions aren't necessarily going to be human beings. | ||
For many people, it's going to be machines. | ||
And as you heard there at the end of the video with Larry Ellison of Oracle, Oracle being a major contractor with both the U.S. government and U.S. intelligence agencies, we hear this vision of artificial intelligence allowing surveillance to basically be a behavioral modification program: if everyone knows that they're being watched, then everyone will be incentivized to alter their behavior accordingly. ||
On the one hand, Ellison is speaking about surveillance of the police. | ||
I think that to a large extent, many of us can relate to the desire for the police to be under as much surveillance as ourselves, but it won't be just the police. | ||
Over 20 years after the Patriot Act, we are already seeing the fruits of mass surveillance, not only in the stifling of political dissent, but also in the overarching sentiment that privacy is becoming a kind of quaint desire: no longer can you expect to be your own person in your own realm; you are inevitably, invariably part of a larger technological system, one which you are impacting and one which will impact you, one which is constantly monitoring your every move. ||
And in the end, perhaps, silencing you or even seeing you put in prison for something you simply said. | ||
I want to bring in Noor bin Ladin, but first there's a video that I think should give you a good idea of what the sentiment is in the Democratic Party in regard to free speech and censorship. ||
So Denver, if you could roll that. | ||
He has lost his privileges and it should be taken down. | ||
And the bottom line is that you can't say that you have one rule for Facebook and you have a different rule for Twitter. | ||
The same rule has to apply, which is that there has to be a responsibility that is placed on these social media sites to understand their power. | ||
They are directly speaking to millions and millions of people without any level of oversight or regulation. | ||
And that has to stop. | ||
unidentified | | I think we need to push back on this. ||
There's no guarantee to free speech on misinformation or hate speech and especially around our democracy. | ||
...Trump back in 2016, but I also think there are Americans who are engaged in this kind of propaganda. | ||
And whether they should be civilly or, even in some cases, criminally charged is something that would be a better deterrent. ||
Excellencies, ladies and gentlemen, dear Klaus, your annual global risk report makes for a stunning and sobering read. | ||
For the global business community, the top concern for the next two years is not conflict or climate. | ||
It is disinformation and misinformation, followed closely by polarization within our societies. | ||
These risks are serious because they limit our ability to tackle the big global challenges we are facing. | ||
Changes in our climate and our geopolitical climate. | ||
Shifts in our demography and in our technology. | ||
Spiraling regional conflicts and intensified geopolitical competition, and their impacts on supply chains. ||
The sobering reality is that we are once again competing more intensely across countries than we have in several decades. | ||
And this makes the theme of this year's Davos meeting even more relevant. | ||
Rebuilding trust. | ||
This is not a time for conflicts or polarization. | ||
This is a time to build trust. | ||
This is a time to drive global collaboration more than ever before. | ||
This requires immediate and structural responses to match the size of the global challenges. | ||
I believe it can be done. | ||
And I believe that Europe can and must take the lead in shaping that global response. | ||
The starting point for that is to look deeper at the Global Risk Report to map out a way forward. | ||
Many of the solutions lie not only in countries working together, but crucially in businesses and governments, businesses and democracies, working together. ||
It has never been more important for the public and private sector to create new, connective tissue. | ||
Because none of these challenges respects borders. | ||
They each require collaboration to manage risks and to forge a path forward. | ||
Rebuilding trust. | ||
I am not feeling that. | ||
Noor bin Ladin, welcome. ||
Thank you very much for joining us. | ||
Are you feeling your trust levels increase after hearing that riveting speech? | ||
Not at all, Joe, quite on the contrary. | ||
And I think this is a very important clip, which is why I wanted to share it once more with the Posse, because we covered it back in January in Davos, where Ursula von der Leyen was speaking and giving this address to open the World Economic Forum annual meeting in Davos. | ||
It's incredibly telling how they really are working so hard with all of these stakeholders in order to quell any form of speech and to quell any form of dissent. | ||
On the heels of the other clips that you played, you know, Hillary Clinton, most notably, was calling for even the arrest of certain members of the population who are calling out, you know, these excesses and these unjust unconstitutional measures that are being deployed against the population without their consent. | ||
You have, you know, the WEF and these globalist institutions that are continuing with this agenda. | ||
On the 9th of September, just a couple of weeks ago, they published their annual WEF report for 2023 and 2024. | ||
And once more, misinformation and disinformation are at the center of the report, and calls for collaboration, just as Ursula von der Leyen said back in January, are reiterated between all of these different stakeholders: the corporations, the politicians, these minions, these, you know, oligarchs, the international organizations and institutions, our governments, all working hand in hand to... ||
You know, already we've seen, we've covered a lot, the UK protests and in the aftermath, many people being hauled off to jail for something they posted online. | ||
You have examples here in America: Douglas Mackey, for just a kind of joke tweet, spent time in prison for it. And you can tell without any shadow of a doubt that the machine is right on the edge of rolling over American citizens, not because of things they did, but because of things they said. ||
Myself, I'm actually quite concerned about the spread of misinformation and disinformation, but the real problem I see is that these organizations are positioning themselves to determine what is and is not misinformation and disinformation, while they are also the purveyors of a lot of mis- and disinformation. ||
You just recently wrote an article about a colleague of yours who, in Switzerland, was convicted and fined for, I guess you would call it, something inflammatory. ||
I thought it was quite funny. | ||
What is the title of the article and can you tell us a little bit about that? | ||
Sure, Joe, but first, bouncing off of what you just said, they are absolutely the main purveyors of misinformation and disinformation. | ||
And because we are noticing that and calling it out and trying to share truthful information, they are basically inverting the reality. | ||
You know, that that speech from Ursula von der Leyen had many, many important points that she made. | ||
And she did mention that this is something that is beyond borders, you know, that it impacts all countries. | ||
And you rightly pointed out, you know, different cases in different countries: the U.S., obviously, but also the U.K., as we saw over the summer. ||
I also wrote an article about that in Sock Unmasked. | ||
Talking about how people were being sent to jail for retweets, you know, coming out of the members of the government's own mouths, on tape, threatening the population if they were to even just retweet a post that would supposedly lead to these protests, excuse me, in the street. ||
And it's across the board. | ||
And in Switzerland as well. | ||
I wrote an article entitled The End of Free Speech in Switzerland, because there are quite a few cases of ordinary citizens who are being fined hefty sums and also risk going to jail for mere tweets. ||
And this indeed did happen to a friend of mine who goes by the name of Barbuy, and he tweeted a single word below a video that was denouncing the propaganda, the LGBTQPA-plus-whatever indoctrination, in classrooms here in Switzerland. ||
And my friend just called out this indoctrination and he's facing, you know, close to $7,000 equivalent of a fine and two years probation, you know. | ||
It's just absolutely insane what is going on across the West. | ||
And we are truly living under the false pretense that we have freedoms here. | ||
Freedoms that are increasingly and exponentially eroding by the day. | ||
And it is absolutely critical that we talk about these things before we are completely locked up in these digital gulags that they have been preparing for us. | ||
And the key step for them, the reason they are so intent on framing misinformation and disinformation as the number one threat: it's because without our voice, without the ability to call out their many, many different machinations in the first place, we don't stand a chance. ||
And coming back to the opening video that you shared, you know, this digital gulag that they have built and in which they want to imprison us. ||
The walls are closing in very rapidly, Joe, and it is absolutely imperative that we stand up against it, and very firmly so. | ||
There's a tension that I see here, and we'll be bringing in Tim Hinchliffe on the other side. ||
I'd actually like to hold you over just to come back at the beginning of the next block and talk about the new project that you are working on and that you've just launched. | ||
But just on that note, with Larry Ellison, he, of course, is a frequent flyer at the World Economic Forum. | ||
But what you see with Ellison is, I think, a point that I try to make a lot. | ||
This is not some monolithic, just completely homogenous blob coming out of Davos. | ||
Larry Ellison, you could say, is very much on the conservative end of that spectrum. | ||
And even if he's a part of that system, he is much less inclined towards any sort of socialist measures that would see money redistributed to less wealthy people. | ||
He is much more inclined towards minimizing regulation, as opposed to the highly regulated programs you hear a lot about there. ||
You see that right now with Trump and the people in his camp from the tech sector. | ||
You've got Elon Musk, Peter Thiel, Marc Andreessen, Ben Horowitz. | ||
These people want to see a very different kind of dystopia. | ||
Still dystopian in my eyes, but just in the two and a half minutes we have left, what are your thoughts on that tension between these vying powers within that central kind of global system? | ||
Well, actually, I'm not sure I agree with you, Joe, on this point. Although there are certain elements or factions, and there is a competition there, when you look at most of these oligarchs, I would call them manufactured characters that are put on the central stage to lead certain of these projects. ||
But you know, they essentially come from the same source, which is the American Deep State, when we talk about the big tech companies. | ||
And, you know, Larry Ellison was assigned to the CIA's Project Oracle, and it's from that project that the company Oracle was born. | ||
And it's the case of most of these big tech companies. | ||
And on that note, I would really encourage the Posse to go watch James Corbett's 45-minute mini-documentary entitled The Secrets of Silicon Valley, What Big Tech Doesn't Want You to Know. | ||
And it goes more into the details of how Silicon Valley was born. | ||
And how, you know, companies like Facebook came about; that's quite well known. ||
It was a DARPA project called LifeLog. ||
And the day that they killed the project, Facebook was born. ||
And there are many such stories when it comes to big tech. ||
And I think there is very much a concerted effort by the powers that be, this elite that is very shadowy and working behind the scenes, to advance this agenda, which on its face may present certain differences, but at the end of the day the road very much leads to the same bleak place. ||
I would certainly agree with it leading to at least similar bleak places, but I guess we'll have to agree to disagree on the uniformity of the elite powers. | ||
I think it's going to be much more complex behind that curtain than any series of trails leading back to intelligence would allow for. | ||
On that note, we will come back after the commercial break, and Noor has a very, very special project she has just launched. | ||
I look forward to hearing about it, and I hope you do too. | ||
unidentified | | Stay tuned. ||
With the massive tax hikes proposed by Harris, an almost 40% top income tax rate, a 7% increase to the corporate tax, a capital gains tax on unrealized gains, and the fact that she's proposed to add almost $2 trillion to the current $2 trillion deficit, you might be thinking it's time to make more of your savings tax-sheltered and inflation-sheltered. ||
This is where I trust the good people at Birchgold Group to help you. | ||
Birchgold will assist you in converting an existing IRA or 401k to an IRA in gold. | ||
And the best news, you don't have to pay a penny out of pocket. | ||
Just text the word BANNON to 989898 and get a free info kit on gold. | ||
There's no obligation, just information on fortifying your savings before the crazy really hits. | ||
With an A plus rating with the Better Business Bureau and thousands of happy customers, you too can trust Birchgold. | ||
Text Bannon to the number 989898 for your free info kit today. | ||
You owe back taxes, right? | ||
Here's the question. | ||
Why is the IRS targeting you and not millionaires who owe a fortune compared to you? | ||
Rich people have tax attorneys. | ||
You probably don't. | ||
Tax Network USA are patriots you want on your side to solve your IRS tax problems quickly and painlessly. | ||
Their attorneys, strategists, and expert negotiators employ brilliant strategies designed to solve your IRS problem quickly in your favor. | ||
They have a preferred direct line to the IRS. | ||
They know which agents to talk to and which to avoid. | ||
And Tax Network USA learned of a limited-time special IRS offer. | ||
The IRS is willing to forgive $1 billion in tax penalties. | ||
Find out if you qualify before it's too late. | ||
Schedule your free confidential consultation now. | ||
Look, Tax Network USA has resolved over a billion in tax debts, and they offer a best-in-class client satisfaction guarantee. | ||
This is the team I recommend to solve your IRS issues so you can get your life back. | ||
Call 1-800-958-1000 or visit TNUSA.COM slash Bannon. | ||
TNUSA.COM slash Bannon. | ||
unidentified | | All this nonsense, all this spin, they can't handle the truth. ||
War Room Battleground with Stephen K. Bannon. | ||
You like the idea that you're an individual and think your own thoughts. | ||
If you have one thought in your life that is uniquely your own, you are extremely lucky. | ||
You will strongly object to this statement. | ||
Reality would disagree with you. | ||
Cultural programming from your parents and greater environment has you locked into its way of seeing things by the time you are three years old. ||
You are given a menu of options. | ||
Choose yours, and act out those choices. ||
There is no option for the original you in this scenario. | ||
All this programming is controlled by middlemen. | ||
Middlemen who profit from your directed thought. | ||
This is no news. | ||
It has gone on for centuries, but now it is different. | ||
Now the digital world completely dictates how the ball rolls down your internal slots and you shall obey. | ||
All right, welcome back. | ||
That is antimatters.world. | ||
Noor bin Ladin, tell us about it. ||
What is this? | ||
What can people expect to find there? | ||
Listen, it's a project that was born between me and three friends, and we were just talking about all of this digitization that is going on in the world and how these walls are slowly but very surely closing in on us. | ||
We basically created some clothing and some designs that we wanted to wear ourselves. | ||
And then we thought, you know, it would be great if we could create a movement whereby people who would wear these very thought-provoking designs could spark conversations in real life. | ||
And also it would be a way to educate people by having a platform where we would collect resources for people to learn about the dangers of digitization and how slowly but very surely humanity is being directed towards a place whereby our humanity is actually being eroded. | ||
And, yeah, it's a way for us basically to talk about these challenges, to call out this digital status quo, and to warn people about what awaits us, about what I like to refer to as the digital dystopia that these overlords have prepared. ||
And I've mentioned it on the War Room many times before: you know, I'm a human being, not a QR code. ||
And I strongly object to this digitization of all aspects of our society, but also the digitization of us as a species. | ||
And this project was a way for us to be able to participate in raising awareness and to basically take a stand against that. | ||
You know, one of the things I love about the project is that very pro-human sentiment, so that it's not simply an anti-digital "we're against that." ||
Wonderful. | ||
So tell us real quick before you sign off, where do people go? | ||
How do people find Anti Matters? | ||
You can go to antimatters.world and you'll have all the information on there, social media. | ||
You can sign up to our newsletter. | ||
We'll aim to send out a monthly newsletter with all the latest news about these dangers of the digitization of our society. | ||
And we hope you'll really like what we've created. | ||
We've certainly had fun and it was a way for us also to express our creativity. | ||
We didn't use AI for anything; even the fonts are hand-drawn, so everything is human-made from beginning to finish. ||
unidentified | | So we hope you'll enjoy it. ||
I don't care what anybody says, human-made is the future. | ||
It's our future anyway. | ||
I agree with you. | ||
It is our future, 100%. | ||
Noor bin Ladin, thank you very much for coming by. ||
Thank you, Joe. | ||
Always a pleasure to be on with you. | ||
All right, the UN is now meeting in New York. | ||
They begin the high-level debates on the 24th of this month. | ||
Here to talk about their Global Digital Compact is Tim Hinchliffe of Sociable.co. ||
Tim, what can you tell us about this compact? | ||
Is it looking good? | ||
Is our future brighter by the day or what? | ||
unidentified | | Well, the UN will want to have you think that the future is good and bright with this Global Digital Compact, but I don't believe so at all. ||
I've gone through it, and although it has some flowery language, which sounds good, there's a lot of underlying advances towards a digital control grid. | ||
So the Global Digital Compact is actually an annex of the Pact for the Future, which is also going to include a Declaration on Future Generations. ||
And so in the overall context of the Pact for the Future, which is expected to be signed at the Summit of the Future taking place Sunday and Monday, the whole purpose is to upgrade the UN to UN 2.0, to restructure the whole financial architecture of things, and to basically give the UN more power. ||
And the U.N. doesn't just call on member states, but also stakeholders, so private companies, NGOs, and the like. ||
But the Global Digital Compact, on its surface, says the purpose is to establish an inclusive global framework essential for multi-stakeholder action required to overcome digital, data, and innovation divides. ||
So it has five objectives. | ||
They sound nice, again, but I've gone through them, and if you read between the lines, they're not. ||
So they want to close all digital divides and accelerate progress across the sustainable development goals. | ||
This is Agenda 2030. | ||
They want to expand inclusion in and benefits from the digital economy for all. | ||
They want to foster an inclusive, open, safe, and secure digital space that respects, protects, and promotes human rights. ||
Number four is they want to advance responsible, equitable, and interoperable data governance approaches. | ||
And they want to enhance international governance of artificial intelligence for the benefit of humanity. | ||
But if you go through them, first goal, close all digital divides. | ||
What does that mean? | ||
They want everyone connected to the internet. | ||
That's what they say. | ||
So they say there's 2.6 billion people that are not connected to the Internet. | ||
We want to connect them because, I mean, how can you build a digital control grid, a digital gulag without having everyone connected? | ||
So that's the first step. ||
The second step, expanding the inclusion in and benefits from this digital economy, is just about how to effectively set up their digital ecosystems. ||
So this is where digital public infrastructure comes in. ||
That's a civic technology stack consisting of digital ID, fast payment systems like programmable CBDCs, and massive data sharing. ||
So first, get everyone online; second, get them hooked up to this system here, which is just a surveillance and coercion and control system. ||
And then the third step is fostering an inclusive, open, safe, and secure digital space, which sounds nice again, but it's really about censorship and how to control views. ||
I mean, we know that a couple of years ago, I broke the story on the UN partnering with Google to censor anything about climate change and COVID narratives. | ||
You know, there's been work done seeing how Google can sway elections. | ||
So if you say anything that goes against UN narratives, especially as it relates to the sustainable development goals, because that's what this is about, then they call on nations to stomp that out. ||
In fact, last year they came out with a voluntary code of conduct, calling bit by bit on every member state to crush misinformation and disinformation, to work with authorities and with the private sector, to do everything to crush any narrative that can impede their sustainable development goals. ||
They're going to push through with this agenda no matter what. | ||
And the fourth objective, advanced, responsible, equitable, and interoperable data governance approaches, which means they just want to collect as much data and information on you and everything else as possible using track and trace technologies, and then share that information across borders. | ||
Again, it's in the name of sustainable development goals. | ||
I mean, this is things like: how much carbon are you emitting? ||
You know, if you're saying misinformation online, that impedes their agenda. | ||
So it's all of these things, all of these objectives, building on one another. ||
And then the fifth one is about international governance of artificial intelligence and supporting interoperability and compatibility in AI governance. ||
But I mean, if you try doing that between the US and China, or just any intelligence agency, or, you know, OpenAI, and see how willing they are to share best practices and share what they're doing, good luck. ||
Astounding. | ||
And, you know, just from beginning to end, in a sort of summarized sense, what I hear is that, as you say, the expansion of this digital gulag, the expansion of total digitization, is seen as a necessity in order to allow people to fulfill themselves. ||
Right. | ||
And in the system that we have now, that's absolutely true. ||
It's very, very difficult to survive outside of this digital grid, as I've oftentimes mentioned. | ||
Here we are. | ||
But what really alarms me is the notion that this is accompanied by clearly stated surveillance objectives, and also clearly stated objectives to tilt the scales in favor of authority rather than any other sort of independent or even dissenting point of view. | ||
Looking at this, Tim, how do you see it fitting in with a lot of the work you're doing on U.S. companies and U.S. organizations and think tanks like the RAND Corporation? ||
You've chronicled in great detail the ways in which corporations and the U.S. government are in many ways merging, or at least partnering, to roll out all of these, what I would call, kind of transhuman objectives. ||
How does that fit in with the more globally oriented UN objectives? ||
unidentified
|
Well, I mean, so we're kind of merging corporation and state. | ||
That's the modus operandi. | ||
That's corporatism, fascism. | ||
So where the government can't fit in and actually censor, then the private corporations can. | ||
And there's trade-offs there. | ||
But no, it's these organizations, they work with each other to... I mean, if we're looking at... I'm sorry, where were we going with the whole transhumanism bit? | ||
To me, you see in the U.S. the same sort of process that you're talking about in the U.N., in which you have the government and these corporations kind of partnering together. | ||
But I wonder, in the context of the U.N., is this just an extension of it, or is there some sense of tension there between U.S. objectives and other nation-states that are U.N. partners or U.N. nations? | ||
Something that I'd like to tease out if possible, where is that separating line, right, between the kind of globalist and the nationalist objectives? | ||
Because I mean, many of the member states of the UN are notoriously defiant towards the U.N. | ||
and certainly at odds with each other. | ||
So I just wonder, as far as national objectives go versus this overarching global paradigm, where do you see tension? | ||
And hopefully, where do you see possibility that we can break out? | ||
Maybe even, who knows, leave and abolish the U.N. | ||
as many dream of doing, maybe a little ambitious. | ||
unidentified
|
Well, I think what the UN is doing is kind of, even though it wants to give itself more power, the key word is interoperable. | |
So it recognizes that each country, each nation, is going to do things their own way. | ||
So I think that's kind of desirable for each nation, because they can build their own digital control grids in their own ways, as they see fit, in line with their own cultural norms and whatever kinds of freedoms they have, or, you know, whatever is best for them. | ||
So it's a blueprint that allows you to set something up in China, and something similar in Russia or the United States. | ||
And I mean, government's going to partner with the corporations, especially the United States on commercially available data, data sharing. | ||
And so it's just going to feed one another. | ||
What this does, if you get everybody connected to the internet with the Global Digital Compact, is you get more ways to surveil people, and then you get this free exchange of information between governments and corporations, and governments and governments. | ||
So I mean, it's like Yuval Harari said about who controls the data: you don't have to send in whole physical armies; whoever controls the data controls the world. | ||
So what the UN is doing is providing that kind of platform, like, okay, this is how we're going to set everything up globally with the whole infrastructure. | ||
Now do your thing. | ||
Wow. | ||
Yeah, to my mind, much of the power, the engine of these ambitions, really lies within the corporations who are developing them. | ||
Obviously, there's government assistance and many times government seed money and missions to start these companies. | ||
But once they get going, they seem to have a kind of momentum of their own, and occasionally go against U.S. government policy in the U.S., or you see it to some extent in Europe. | ||
You definitely don't see it in China. | ||
So going forward, looking ahead at the impacts that this will have, do you think that these U.N. policies, especially on A.I. regulation, are in some sense a means to capture that energy, to capture that power of data collection, data analysis, surveillance, social control? | ||
And if so, do you see it in any way being a kind of positive restraint on these corporations, or does it all look pretty black to your eyes? | ||
unidentified
|
It all looks pretty bleak to my eyes because it just gives everyone more power to collect information, you know, through internet of things, sensors, you know, just, or drones, whatever kind of spying device or surveillance device. | |
And then the AI is just, you know, more on the processing side, how it can go through so much information and make sense of it all. | ||
That is extremely valuable to all involved. | ||
So I don't, you know... you were talking earlier about companies that kind of spun out of government, like Noor Bin Ladin was saying about how, you know, DARPA had LifeLog, and then Facebook coming off of that and doing its own thing. | ||
You know, CIA, NSA backed Google. | ||
And then the CIA's In-Q-Tel with Palantir, with Peter Thiel. | ||
So they, and then Oracle, you know, they spawn these things, and then all of a sudden the companies are kind of going off on their own. | ||
But it almost is like, you know, they can kind of reel them in a bit and use them again, because they kind of funded that stuff. | ||
So they kind of own them, not own them exactly, but a little influence here and there. | ||
So that's where I see that happening. | ||
Yeah, absolutely. | ||
You know, it's impossible. | ||
Obviously, these are state secrets. | ||
So it's impossible to know exactly how much control is asserted in regard to these companies, but certainly some. | ||
You know, the CIA churns, the intel agencies churn, the objectives churn. | ||
It's a confusing landscape, and one in which we are not privy to much of what goes on behind the scenes. | ||
So all we can do, I guess, is hold on. | ||
unidentified
|
I will say one thing, though, you know, far from all the doom and gloom: something that did come out of DARPA was the ARPANET, which became the internet years ago. I mean, that was supposed to be for government use and military use, and, you know, GPS and things like that. | |
But the internet opened up communications and gave us a lot of good things, a lot of bad things, but we're doing what we're doing now because of that, you know, and GPS. | ||
They came out with voice assistants like Alexa as well. | ||
I don't touch those, but I guess they help some people. | ||
I wish you hadn't ended on that. | ||
I was actually feeling pretty good about things for a second there. | ||
Then you had to bring Alexa in, that evil demonic wench. | ||
Tim Hinchliffe, where can people find your work? | ||
You just published a piece on the Global Compact, correct? | ||
Or you're about to? | ||
unidentified
|
I'm about to, yeah. | |
It's 75% done. | ||
So you can find me at sociable.co, that's the website, and then also on X, formerly known as Twitter, at The Sociable or at Tim Hinchliffe, which is my name. | ||
And yeah, I'm looking to get this new article published either this afternoon or early tomorrow. | ||
Fantastic. | ||
Thank you very much, Tim. | ||
Thanks for stopping by. | ||
We will see you again. | ||
Sociable.co. | ||
All right, War Room Posse, that is it. | ||
If I can leave you with one bright note, you can always turn these devices off. | ||
You can always do your best to detach yourself from the system and go out, I don't know, into the sunshine. | ||
Maybe go see a family member. | ||
Maybe bring your kids in and explain to them that this civilizational transformation may have a few pitfalls and dangers, but it's going to be okay because mom and dad are here. | ||
At any rate, Godspeed, God bless, and of course, free Steve Bannon. | ||
Thank you very much. | ||
We will see you tomorrow. | ||
Thankfully, there are companies like Patriot Mobile that still believe in America and our Constitution. | ||
They're on the front lines fighting for the First and Second Amendments, sanctity of life, and our military and first responder heroes. | ||
Take a stand for conservative causes and put America first by switching to Patriot Mobile today. | ||
You'll get the same nationwide coverage as the big providers because Patriot Mobile operates across all three major networks. | ||
Plus, they back their service with a coverage guarantee. | ||
Keep your number, keep your phone, or upgrade. | ||
Go to patriotmobile.com slash Bannon or call 972-PATRIOT. | ||
Right now, get a free month when you use the offer code BANNON. | ||
Switch to America's only Christian conservative mobile provider, Patriot Mobile. | ||
Go to patriotmobile.com slash BANNON or call 972-PATRIOT for your free month of service today. | ||
America is standing on the brink of an election meltdown. | ||
And Jim Rickards, editor of Strategic Intelligence, the man who predicted the 2008 financial crisis, Trump's victory in 2016, and the COVID disaster, is sounding the alarm. | ||
Rickards has just dropped a bombshell that could change everything you think you know about the 2024 election. | ||
He's uncovered what he calls the Democrats' secret plan to keep Trump out of the White House, even if he wins. | ||
This isn't some far-off theory. | ||
He's warning that this meltdown could lead to a 50% market crash, the total collapse of the U.S. dollar, and violent riots in our streets. | ||
The stakes are higher than ever, folks, and if you're not prepared, you could lose everything. | ||
But Jim Rickards isn't just predicting disaster. | ||
He's laying out five critical steps you need to take now to protect yourself, your family, and your financial future. | ||
This isn't fear-mongering. | ||
It's coming straight from a man who's been at the highest levels of intelligence, finance, and national security. | ||
Go to Meltdown24.com. Don't wait until it's too late. | ||
Meltdown24.com. |