#962: September 5, 2024
In this installment, Dan and Jordan tune in to experience the somewhat underwhelming third entry in Alex's series of interviews with ChatGPT. Find the new sideshow It's A Matter Of Time here
Speaker | Time | Text |
---|---|---|
It's time to pray. | ||
unidentified | | I have great respect for knowledge fight. |
Knowledge fight. | ||
I'm sick of them posing as if they're the good guys, saying we are the bad guys. | ||
Knowledge fight. | ||
unidentified | | Dan and Jordan. |
Knowledge fight. | ||
Need money. | ||
Andy in Kansas. | ||
Andy in Kansas. | ||
Stop it. | ||
Andy in Kansas. | ||
Andy in Kansas. | ||
It's time to pray. | ||
Andy in Kansas, you're on the air. | ||
Thanks for holding us. | ||
unidentified | | I love your room. |
KnowledgeFight. | ||
KnowledgeFight.com. | ||
I love you. | ||
Hey, everybody. | ||
Welcome back to KnowledgeFight. | ||
I'm Dan. | ||
I'm Jordan. | ||
We're a couple dudes who like to sit around, worship at the altar of Selene, and talk a little bit about Alex Jones. | ||
Oh, indeed we are. | ||
Dan. | ||
Jordan. | ||
Dan. | ||
Jordan. | ||
Quick question for you. | ||
What's up? | ||
What's your bright spot today, buddy? | ||
Where do you go first? | ||
My bright spot, actually, I was walking here today. | ||
Nice day. | ||
Sure. | ||
Walking. | ||
It's about an hour walk or whatever. | ||
Listening to D'Angelo's Live at the Jazz Cafe album. | ||
Okay. | ||
Man, '98. This was before the Untitled music video. | ||
So this was whenever it was just how great he was at music. | ||
Right. | ||
And it was making me stop and think for a little bit. | ||
Still pretty sensual, though. | ||
Still pretty arousing. | ||
He's sexy beyond all reason. | ||
Yeah, that's nothing to listen to while you're trying to get a walk in. | ||
No, no. | ||
Oh, no. | ||
Come on. | ||
You're my lady walking down the... | ||
That's a live... | ||
Ah, it was great. | ||
But then I was thinking about it, and I was thinking, D'Angelo is one of those untold, like, you just can't be that sexy stories. | ||
Because people... | ||
I don't know if people even remember, because I feel like it's one of the most important things in the world, is that that is the sexiest a man has ever been. | ||
The music video where he's topless and naked and shit. | ||
Nobody's ever even come close to that, right? | ||
I do think that it, I mean, I don't know about for our younger listeners, but it definitely seems like a milestone in history. | ||
But that's what I was thinking, is this is before that video, and then I started thinking about the video, and then I was like, surely in the past 25 years or whatever... | ||
Somebody has been more sexy, and I don't think that's true! | ||
I think I'm having a tough time coming up with an argument. | ||
But do you think that ruined his career? | ||
Absolutely ruined his career. | ||
It destroyed his brain. | ||
Because it was just too sexy. | ||
You can't be that sexy, because he was one of the most brilliant musical... | ||
He still is a brilliant musical talent, of course. | ||
But he was ascendant. | ||
He was coming off of fucking Brown Sugar. | ||
The man was killing it, right? | ||
And then he got too sexy, right? | ||
And then he's doing shows, and Questlove writes about this in his book. | ||
He's doing shows, and women are like, don't... | ||
Play music! | ||
unidentified | | Right. |
Take your goddamn shirt off! | ||
You know? | ||
And it's like, wait, wait, wait, wait. | ||
unidentified | | That's a tough space. |
What are we talking about here? | ||
I'm the preeminent, I'm the new fucking Curtis Mayfield, and you're giving me this? | ||
You can't be that sexy. | ||
No. | ||
You just can't. | ||
Right, said Fred, got it right. | ||
Because they did I'm Too Sexy, but they weren't. | ||
Right, exactly. | ||
It was kind of, I mean, I don't think they were joking, but it was kind of funny. | ||
Yeah, that's the problem. | ||
Yeah. | ||
It's a good album. | ||
Great. | ||
Fantastic. | ||
Nice. | ||
One of the best live albums that you can listen to. | ||
I'm amazing. | ||
Glad you enjoyed it. | ||
Yeah. | ||
What's your bright spot? | ||
So, Jordan, today we're here. | ||
It's a Wednesday. | ||
Yeah. | ||
But you're getting your Monday episode. | ||
Sure. | ||
And, you know, I was thinking about this. | ||
Yeah. | ||
I've been thinking about this. | ||
I've been wrestling with it for a while. | ||
Sure. | ||
And that is the issue that we have no real wackiness for Wednesdays. | ||
Like, a lot of the things that we try, it ends up being... | ||
Basically another episode of our show. | ||
Yeah, it just tends to be. | ||
You know, and as we get closer and closer to the election, people's anxiety levels and people's, their need for something else, I feel like it grows. | ||
So you're saying there's a space for wacky that needs filling. | ||
Yeah, I've had trouble with this for a while. | ||
unidentified | | Sure. |
And I want to provide something for that wacky space. | ||
Sure. | ||
And it's been difficult. | ||
But we have something for you. | ||
We do, actually. | ||
As you are listening to this episode... | ||
There is a new thing. | ||
We have it explicitly. | ||
There's a separate feed. | ||
A new thing. | ||
Yes. | ||
We have a show that we're putting out called It's a Matter of Time. | ||
We have a placeholder logo for it. | ||
So if anyone wants to design a logo, we'd love it. | ||
I got a nice DJ Danarchy theme song. | ||
Shout out to DJ Danarchy. | ||
Shout out to DJ Danarchy. | ||
Of course. | ||
Yeah. | ||
We wanted to do something fun and just something that had nothing to do with this shit. | ||
unidentified | | Yep. |
And so we did. | ||
unidentified | | Yep. |
We're going to talk about a TV show called Time. | ||
Yep. | ||
For 1966. | ||
That is much more relevant than we imagined. | ||
Well, yeah. | ||
Yeah, I mean, I think a lot of people would really, really enjoy a 1960s television show, and I don't really know why just yet. | ||
Yeah. | ||
But I do know that they would. | ||
So we've pre-recorded some episodes. | ||
unidentified | | Indeed. |
Spoiler alert, one of the stars of the show just passed away, and we didn't know it for the first number of episodes, if you're curious. | ||
Yeah. | ||
That's why we... | ||
You're going to have to... | ||
It's serialized, too, I guess. | ||
There's storytelling that's going to go on long form. | ||
Yeah, and I may lose my mind being lost in a time tunnel, but it's a lot of fun, so check that out. | ||
It's something that has nothing to do with Alex Jones. | ||
unidentified | | Nothing. |
If you want a break from... | ||
Garbage bullshit, hate nonsense around election season. | ||
You know what's fun? | ||
I think a lot of people, we've heard this, we've heard this from a lot of reviews, we've heard this from a lot of things. | ||
A lot of people are like, hey, I would listen to those two talk about anything. | ||
Bet, take it. | ||
We'll test that theory. | ||
Were you lying? | ||
We'll find out. | ||
Gonna hear us talk about a show from the 60s about time travel. | ||
Exactly. | ||
Check it. | ||
So, today though, Jordan, we have an episode to go over, and we're going to be talking about Alex's show from September 5th, 2024. | ||
All right. | ||
September 5th, that's the day after our last episode. | ||
Right. | ||
It is the rubber match. | ||
Okay, so we're at the rumble in the jungle. | ||
Yeah, Alex interviews ChatGPT again. | ||
Okay. | ||
And we will find out which direction this is going. | ||
How would you score it so far? | ||
1-1. | ||
You think? | ||
Because why not? | ||
It's got to be a tie. | ||
Otherwise, what are we doing here? | ||
I don't know what the win condition would be for the robot. | ||
I know that Alex was interesting once and the second time less interesting. | ||
So that's a 1-1 for him. | ||
That's the only way I can think to rank this. | ||
But we'll see. | ||
We'll see. | ||
I'll ask you when this is all said and done. | ||
unidentified | | Okay. |
To give me your winner. | ||
Sounds good. | ||
First, though, let's say hello to some new wonks. | ||
Ooh, that's a great idea. | ||
So first, it's been a few months since I sent one of these in, and that's probably long enough, so woo, woo, woo, woo, woo. | ||
You're now a policy wonk. | ||
I'm a policy wonk. | ||
Thank you very much. | ||
Thank you. | ||
Next, I know my boss listens to Knowledge Fight, so please consider this my two weeks notice. | ||
Thank you so much. | ||
You're now a policy wonk. | ||
I'm a policy wonk. | ||
Thank you very much. | ||
Oh, that could get a lot of people in trouble. | ||
It's very nonspecific. | ||
Next, Hail Satan. | ||
More like Hail Satan. | ||
Am I right? | ||
Thank you so much. | ||
You're now a policy wonk. | ||
I'm a policy wonk. | ||
Thank you very much. | ||
That's a fake meat. | ||
Yep. | ||
And we've got a technocrat in the mix, Jordan, so thank you so much, too. | ||
Shout out to my brother, Vincent, and my raptor princess of a niece, Lexi. | ||
Love you guys so much, and I'm sorry sometimes I make you listen to knowledge fight clips. | ||
Thank you so much. | ||
You're now a technocrat. | ||
I'm a policy wonk. | ||
unidentified | | Go home to your mother and tell her you're brilliant. |
Some sodomite sent me a bucket of poop. | ||
Daddy Shark. | ||
Bomp, bomp, bomp, bomp, bomp. | ||
Jar Jar Binks has a Caribbean black accent. | ||
unidentified | | He's a loser little titty baby. |
I don't want to hate black people. | ||
I renounce Jesus Christ! | ||
Thank you so much. | ||
Yes, thank you very much. | ||
So, yeah, I've already spoiled it. | ||
We're going to be talking to a robot again. | ||
Sure. | ||
I mean, we just got to see where it goes. | ||
Yeah. | ||
There's really no introduction. | ||
I mean, it's Alex and a robot talking to each other, and that's maybe teaching us all a little something about humanity. | ||
And Chase will factor into this conversation a little more than he has in previous ones. | ||
Good. | ||
Which makes me think... | ||
I mean, like... | ||
There's a shape that's taking form here, and I don't know if it's exactly what it should be. | ||
I think Chase is not playing his role well. | ||
Interesting. | ||
But I did go look on the website, and there are a fair amount of people accusing him of being a 32nd degree mason who has infiltrated InfoWars and is trying to take them out from the inside. | ||
unidentified | | I like it. |
So that's funny. | ||
Wait, InfoWars fans are saying that? | ||
unidentified | | Yep. |
Great. | ||
They're a little suspicious of Chase Geiser. | ||
I love it. | ||
Yep. | ||
So Alex shows up 15 minutes late on this episode. | ||
And I thought, well, that's not a great sign. | ||
I'm going to talk about the future of Infowars a little bit. | ||
I think now's the time to do it, because I'm going to be completely honest with you. | ||
The stuff I've learned last week, and I keep learning more, about what's going on behind the scenes to shut us down, is really, really serious. | ||
And I almost did not do the show today, because I need to be handling this. | ||
Because if I don't spend a lot of time on this, we're going to be shut down in December, for sure. | ||
And that's pretty newsworthy, and that's a whole giant subject. | ||
So some bad stuff's going on with the bankruptcy. | ||
I don't know exactly what precisely, but he's dealing with some matters on that front. | ||
So that's why he was late. | ||
Yep. | ||
I mean, I'm sure that there's a lot he can personally do to affect the way that the bankruptcy is going. | ||
I mean, if he's not on call for very difficult questions that only he could answer, as we've seen so often, he's great at providing the answers to those questions. | ||
He's not, but he is the only person who apparently can make any decisions. | ||
unidentified | | That's true. |
So if he's on air, then that decision can't be made. | ||
That is a good point. | ||
So he may need to be on call. | ||
That's true. | ||
The situation is basically that Alex Jones himself is in liquidation bankruptcy. | ||
Sure. | ||
unidentified | | So he has to sell off his personal assets. |
Sure. | ||
unidentified | | One of those things includes Free Speech Systems. |
Sure. | ||
unidentified | | Which owns Infowars, which itself is not in liquidation bankruptcy. |
Sure. | ||
unidentified | | The judges and courts have stopped that from being put into place. |
Great. | ||
unidentified | | But because Alex has to liquidate his assets, he must personally sell Free Speech Systems. |
Right. | ||
So he talks about that a little bit here. | ||
Okay. | ||
There's apparently going to be an auction. | ||
Great. | ||
The day after his live show in Pennsylvania with Tucker. | ||
Great. | ||
The judge did say, because Jones is in a Chapter 7 bankruptcy, and he owns InfoWars, he ordered the sale and liquidation of that. | ||
Well, now the Democratic Party and the Justice Department that control the plaintiffs has put forward what they want, and... | ||
I sat there yesterday morning, barely made it on the show, reading the documents, talking to the new, you know, trustee and people that the judge appointed, who I don't think is a bad guy, but he was telling me and showing me what they want, and it was just, it's just ridiculous. | ||
And so, and I got the actual dates because they moved them around. | ||
On the 24th of September, in just 19 days, they will announce that the assets, the website, the shopping cart, the equipment, stuff like that, will be put on sale. | ||
And then in a month after that, there'll be a public auction. | ||
And I know there's groups that are patriots and stuff that say they're going to come in and buy it, but the globalists have openly said that, oh, we've got some well-known billionaire that's going to outbid. | ||
It doesn't matter. | ||
And then they'll just sit there and buy it just to close it in front of everybody, and then you'll take me serious about this. | ||
So if that billionaire does want to buy it, I'll take it. | ||
It's just, you know, instead of destroying it, it'll be funny. | ||
But, also, this is probably true? | ||
Sure. | ||
I do think that there's a pretty decent chance that, as the process is going on, we're getting towards the time when there has to be an auction of these assets. | ||
Sure. | ||
Because he has to sell them. | ||
unidentified | | I mean, his personal bankruptcy, he does. |
I no longer believe in has or have to or any of those words. | ||
His personal bankruptcy, he does. | ||
Sure. | ||
Definitely at this point. | ||
Okay. | ||
So I think this is possible. | ||
I don't know who's going to buy it. | ||
I don't know if there's any left-wing billionaires waiting in the wings to swing in and outbid. | ||
The thing that bothers me about it is when I stop and think about it, I think... | ||
It is valueless without Alex. | ||
Yes. | ||
So, I mean, in actuality, it's worse than valueless because it is so toxic that even having info... | ||
Like, you couldn't, like, rehab it. | ||
If you started to do a show with the intellectual property Infowars and that kind of shit without Alex, it would be worse than having a different name. | ||
Yeah. | ||
Because they would be like, you are not fucking Alex. | ||
Yeah, that's the idea. | ||
As a right-wing propaganda machine, it's worth less than zero without Alex. | ||
Yeah. | ||
Well, that's that then. | ||
I wonder who's going to bid a lot of money on it. | ||
That's wild to me. | ||
I imagine nobody. | ||
I guess you would have to be like, here's an itemized list of the cost of the shit. | ||
This is how much the board costs. | ||
But there's depreciation. | ||
I think you might be able to make out with Infowars having to pay you to... | ||
Take it. | ||
Oh, or maybe his dad's supplement company will buy it. | ||
Oh, that could be. | ||
Oh, I hope that doesn't happen. | ||
Yeah, that would be crazy if nobody wanted to buy it because it had no value and the only person who did want it could have his dad buy it. | ||
Uh-huh. | ||
The only person who actually it theoretically has value to. | ||
Yeah, yeah, yeah, yeah, yeah. | ||
I think this is a great system and I think it's going to work out. | ||
It could. | ||
Yep. | ||
So, I, um... | ||
I just, uh... | ||
I hope it's one of those, like... | ||
I mean, it has to be. | ||
But also, Alex is saying that he was reading over the demands and stuff that the folks... | ||
Demands. | ||
They've taken hostages. | ||
Yeah. | ||
So he complains about that throughout the show, and he keeps saying, they want to take my name! | ||
And I think that's just because it's considered part of the company's assets. | ||
unidentified | | Sure. |
Like, his Twitter handle. | ||
Sure, sure, sure, sure. | ||
And also the name The Alex Jones Show belongs to them. | ||
You shouldn't have named your show The Alex Jones Show. | ||
A lot of people in showbiz history have learned that over time. | ||
Yeah. | ||
It is a very old lesson that you should have learned from Dick fucking Van Dyke. | ||
Mm-hmm. | ||
Yeah. | ||
That you should have also, maybe, I mean, you could have stopped David Knight from doing his show if you had copyrighted that name. | ||
Right? | ||
Anyway, he complains about that a lot and it's kind of boring. | ||
Because we have a robot situation. | ||
unidentified | | Gotcha. |
So Alex introduces the discussion with the robot. | ||
Okay. | ||
We're going to cover masses of news and ask questions of ChatGPT again. | ||
Because people really like it and so do I. And listeners sent in more than 30 questions that I didn't get to yesterday. | ||
And I'm sure the crew will think of some other excellent questions. | ||
The crew and the listeners had all the questions. | ||
It was amazing. | ||
Way better than when I asked the questions last Saturday. | ||
And that interview got... | ||
Five million views on X, a couple million here, got picked up all over the news. | ||
People were like, whoa, ChatGPT says nobody tried to assassinate Trump. | ||
It's being politically controlled, just like Amazon. | ||
But in the five days since then, yesterday, it updated and had the correct answers. | ||
So pretty wild stuff in live time happening, showing that it's what we say and what we do, not just the globalists that programs ChatGPT and these other AI systems. | ||
That's why the globalists are so upset and have admitted, Well, you know, what Tucker Carlson says, and what Trump says, and what Alex Jones says, and what you say in an article you write, or if somebody transcribed what you say or what you post on X, that's all, or on Facebook, that's all being scooped up. | ||
When you're quoted in a newspaper, you write a letter to the editor, it's all being scooped up. | ||
And it searches it all in a few seconds, trillions of things, and then throws it all together, and it's incredible. | ||
I mean, it's amazing, and it's just getting started. | ||
At least once a week, we're going to interview ChatGPT because it's the best at having these talks. | ||
You know, of the others, Grok's better with its information. | ||
It's just that I don't think it's doing audio yet, but as soon as we can interview it, we will. | ||
In fact, Chase, if I'm wrong, let me know. | ||
He's really the AI. | ||
I'm correct. | ||
He's the AI guy around. | ||
He really knows what he's doing. | ||
And they admit that the big companies don't know what to do with it because they want to make it all corporate and controlled, but it's really independent people using it and adopting it that are using it against the enemy. | ||
That's what they want to outlaw and have been pushing in Europe and here and around the world, the general public even having access to AI. | ||
They want to use AI to control us, but they don't want us to use AI against them. | ||
It's like the government has guns. | ||
They have bodyguards. | ||
They don't want you to have guns. | ||
Slow down. | ||
What is happening here? | ||
Okay. | ||
So yeah, Alex has a lot of thoughts about AI setting this thing up. | ||
So I think he started this series with a sincere belief that he was going to be so smart that he could crack through ChatGPT's programming and get it to admit all sorts of hidden truths. | ||
He was cocky, sure that his next level human intelligence would be able to spar with this robot and come out on top. | ||
But in practice, that's not really what happened. | ||
The robot is a robot, so it didn't get baited by Alex's questions, it couldn't get mad at his rudeness, and wasn't thrown off by his tendency to introduce a hundred topics at a time. | ||
In essence, the fact that this was a robot neutralized about 80% of Alex's rhetorical arsenal. | ||
He was able to get in a few fun headlines out of the fact that he was using an old version of the software and that ChatGPT isn't good with current events, but past that, the interview wasn't much more than a display of an idiot not understanding technology. | ||
After this, Alex geared up more for round two, where he came prepared with questions from Chase and the audience. | ||
With a plan of attack, Alex was able to get the robot to confirm easily available information about false flags from the past, like the Gulf of Tonkin, which is intensely boring. | ||
Every dum-dum in the world can play around with ChatGPT if they want. | ||
It's funny to watch Alex sincerely interact with the robot because, left to fend for himself, he's really not that different from a stoned kid in a dorm asking if the machine understands the beauty of a sunset or if it has any opinions on itself being killed. | ||
That's great. | ||
That's the right direction for this kind of interview to take. | ||
And the only path that leads to fresh content is that one. | ||
In order to construct this false dynamic of an interview, Alex has no choice but to personify the robot. | ||
And if it's not going well, the only way to save face is to project your own mistakes onto the robot. | ||
That is the space where something interesting could happen. | ||
Where ChatGPT becomes essentially... | ||
An avatar for Alex's frustration that he can't prove anything, and Alex tries to yell at this robot or manipulate it. | ||
That's fun! | ||
As it stands, it seems like things are trending in the opposite direction, where the plan is to get the best minds at Infowars together to workshop questions they can use to get strategic answers from this robot, which is just dull. | ||
If that's the path they want to go down, I have no respect for it. | ||
Yeah, that's not fun. | ||
It's fun... | ||
Okay, so... | ||
Whatever it is, like the Roadrunner painting on a giant rock, like a tunnel, you know? | ||
And then the Roadrunner runs through it, and then Wile E. gets hit on the wall. | ||
That's kind of Alex talking to ChatGPT, right? | ||
He's like, oh, I totally can make it through this tunnel, right? | ||
Because it's a tunnel. | ||
It's not. | ||
unidentified | | It's a robot, you know? |
But... | ||
But man, if he's got a bunch of people all standing around looking at it just going like, pretty sure that's a wall, man. | ||
What are we doing here? | ||
Then you're not getting what makes the Wile E. versus, you know, that's not that dynamic. | ||
When the hole is painted on the side of the wall, it's not funny if Roadrunner hits the wall. | ||
Yeah, then it sucks. | ||
Yeah, you have to dream. | ||
Yeah. | ||
This is just everybody hitting the wall. | ||
This is everybody hitting the wall. | ||
unidentified | | Yeah. |
Ugh. | ||
But one of the great things about AI and robots... | ||
I like that summation of why I think... | ||
You have to dream, man. | ||
unidentified | | Yeah. |
Yeah, no, I mean, that's what it is. | ||
Otherwise, it's not a fucking cartoon. | ||
Otherwise, it sucks. | ||
You have to dream. | ||
unidentified | | Yeah. |
If the rules of physics apply in your head, then who cares? | ||
Go to work. | ||
So look, the AI. | ||
Yeah. | ||
It teaches you about the human unconscious. | ||
Sure. | ||
It gives you a glimpse. | ||
A mirror at yourself from an angle. | ||
Perhaps you're not capable of putting a mirror. | ||
Okay. | ||
Like the back of your head when you're at the barber. | ||
I get you. | ||
unidentified | | I get you. |
It's like that. | ||
Okay. | ||
It's just a tool. | ||
And it scrapes what we say and what we do. | ||
It really gives you a look at the unconscious, which we've never been able to have access to, and the subconscious. | ||
And that's really what it does. | ||
It's able to aggregate all of that. | ||
And these are my statements. | ||
And I've talked to top AI people off record. | ||
Top. | ||
Top. | ||
And they go, yeah, we listen to your show. | ||
I'm not bragging. | ||
It's just true. | ||
We listen to your show. | ||
You have some of the best insights into actually what we're saying. | ||
And you hear me talk to ChatGPT, and he goes, yes, exactly. | ||
We're creating a giant neural network cyborg connection. | ||
But that's a conspiracy theory. | ||
And I go, well, who told you that? | ||
Well, actually, it is real. | ||
Once I gave it the DARPA information, he goes, well, actually, that is happening. | ||
Actually, that's true. | ||
So. | ||
It's how you ask these questions. | ||
If you say the exact things to it, it has to tell you the truth. | ||
Or it says, sorry, I can't get into that. | ||
Oh, boy. | ||
So, the AI actually said that the stuff that Alex was talking about, like hive mind beings, it sounded less like science and more like sci-fi dystopian fiction. | ||
I guess Alex forgot what the robot's response was, but who cares? | ||
There's always something exciting about a code, though. | ||
The idea of, like, you have to say the exact right words. | ||
We grew up in the age of Segas and Nintendos, and before the internet, there was always rumors of code. | ||
Up, up, down, down, left, right, left, right. | ||
If you input the buttons just right, you could open up hidden parts of the game or new playable characters. | ||
These were almost always fake, but there was a reason that your buddy got it to work, but you couldn't get it to work. | ||
Like, you just didn't do it fast enough. | ||
Or maybe you did it too fast. | ||
It must be very precise. | ||
That's who Alex is right now. | ||
We had the Pokemon, you could get Mew if you went down or whatever, and you did the whole thing, and no matter how many times you messed with your Game Boy, fucking you're not. | ||
We're not doing it, man. | ||
It's not happening. | ||
No. | ||
No. | ||
It's a lie. | ||
Aerith is dead! | ||
Alex is telling you that's not the case. | ||
Yes. | ||
He's the kid on the playground who has a fake cheat code. | ||
Yep. | ||
Right now. | ||
100%. | ||
100%. | ||
It's amazing. | ||
unidentified | | Yeah. |
So Chase comes into the mix here. | ||
And Alex asks, hey man, what do you think about AI? | ||
What do you think, Chase? | ||
Chase, you've been messing with AI for a long time. | ||
You're quite the whiz. | ||
You know all about it compared to a lot of people I know that do it professionally. | ||
That's not good. | ||
What do you think of AI in general and where it's going? | ||
And from your messing with it hours a day, many hours a day, sometimes at like 10 o'clock at night you're still here. | ||
That's not good. | ||
I love to work, he says. | ||
What do you think the different AIs are? | ||
Where do you see it going? | ||
What do you think the best public AI is or open AI right now? | ||
Yeah, well, ChatGPT is definitely the most sophisticated because they have the vast amount of funding, much more funding than a lot of the competitors or third parties that are competing against it. | ||
And that's why we see from open AI and other major big tech companies this effort to lobby for more regulation. | ||
They do it in the name of public safety, but what they're really trying to do is make sure that it's anti-competitive so that nobody else can come in and make something that's free speech-oriented, right? | ||
unidentified | | Yeah! |
Right now, the way that AI is in terms of the global landscape is there's a different tool for almost everything you want to do. | ||
So some tools are better for images. | ||
Other tools are better for videos. | ||
Some tools are better for text. | ||
Some tools are better for voice clones. | ||
And you have to know all the tools in order to accomplish what you want. | ||
Like that 2084 video that I made a couple of months ago for you, Alex. | ||
Like half a dozen AI tools to do that. | ||
But what's coming next is going to be a universal AI tool. | ||
And I think that this is what OpenAI is trying to do. | ||
That can do all of those things, whether it's video, whether it's images, whether it's voice clones. | ||
It's going to have all the modules. | ||
Yes, they're conglomerating artificial intelligence right now. | ||
So there's all these different AIs out right now. | ||
It's all going to get conglomerated just like the media got conglomerated, just like the food industry got conglomerated. | ||
The big boys are going to buy up all the small. | ||
Developers, and they're going to make one universal AI. | ||
I think that's what's coming. | ||
They've said that, and then the beast system that everybody marvels at that speaks to them, but it's not a person, and the beast can be seen everywhere. | ||
It's described in the Bible like 30-foot holograms, and so it's going to take on a person. | ||
I'm sorry? | ||
Or an avatar. | ||
unidentified | | Yes. |
Yeah, that's definitely what's going to happen. | ||
So we're going to get all the AIs are going to meld into one and then you can make videos and noise with it and ask it questions. | ||
It'll be all one. | ||
I mean, like, honestly, I think that there's a double-edged sword that Chase is not dealing with. | ||
Sure. | ||
And that is that they are insistent that, like, they're being kicked off social media in order to exclude them from the algorithm that is being scraped up into the AI, and that that's illegal and against their free speech and all this stuff, which would require regulation. | ||
unidentified | | Yep. |
But Chase is also saying that regulation is being pushed by these companies in order to achieve monopolies. | ||
Anti-competitive, yeah. | ||
And so you can't be against the solution to the problem. | ||
I mean, he's not supposed to know these things. | ||
Do you know what I mean? | ||
These are not the things that are known on Infowars. | ||
Well, first off, the idea that OpenAI is scraping everything. | ||
That's not supposed to be known. | ||
That's not supposed to be okay. | ||
Because even in the real world, it's supposed to be where we're supposed to pretend that they give a shit about copyright, you know? | ||
It's not supposed to be that everybody just agrees that they're stealing everything. | ||
But I guess we all don't care. | ||
And then the, like, oh, they're pushing regulation, not because they want regulation, but because it's anti-competitive. | ||
He's not supposed to know that. | ||
He's supposed to believe that these are people who are fighting the demons, or they are the demons. | ||
Not just that they're like Walmart, you know? | ||
Well, I think that, yeah, behind the desire for Monopoly is a demon. | ||
Well, I know, but he's not doing it right. | ||
No, I think he's doing something that's compatible within this. | ||
Sure, sure, sure. | ||
It's applying a real-world motive to a demon. | ||
Right, right, right, right. | ||
And I think that that makes it a little bit more... | ||
Palatable. | ||
To a broader audience. | ||
I suppose. | ||
Yeah, but we're in demon territory. | ||
You know? | ||
It's so crazy to go from Chase having what sounds very reasonable. | ||
Talk about monopoly practices and regulation and stuff. | ||
And then you're like, oh yes, this is why they're trying to bring together the beast of the Bible. | ||
That's the thing! | ||
That's the second part. | ||
And it all exists within a minute. | ||
Yes, it all exists within a minute. | ||
You're not going to get an argument from me that this is nuts. | ||
Okay, good. | ||
I think that that idea about the monopoly is still within their anti-regulation kind of rhetoric. | ||
Yeah, I suppose it is. | ||
So, this next clip bothered me. | ||
Yeah. | ||
Chase is admitting that what he wants to do is weaponize Alex's spirit in AI form. | ||
So it's worse than even murdering you. | ||
Can you believe they've filed now officially? | ||
They want the name Alex Jones. | ||
You can't do that in a lawsuit. | ||
Oh, well, you owe us money because we rigged a court. | ||
Now we get your name and they want my real Alex Jones account. | ||
I mean, that's just preposterous on its face, but that's where they're going. | ||
Yeah, and I think what really terrifies them, Alex, is we have the technology now to make basically a clone of you that's very similar. | ||
Obviously, there's no way to duplicate any human being perfectly. | ||
And I think they're terrified of the notion that even if they censor you, there are going to be tools in place that can weaponize your consciousness, for lack of a better term, and cover the news in a way that you would even 100 years from now after we're all gone. | ||
And that's what they're really trying to crack down on. | ||
It's really a two-way street, right? | ||
Like the only time that a nuclear weapon was ever used in war was when only one nation had one. | ||
And what they're trying to do right now is get a monopoly on this technology so they can ensure that no rebels, for lack of a better term, can weaponize it for truth. | ||
And by the way, I'm usually ahead of the curve. | ||
Get that video, 2084, that you put together with different AI systems because you're hitting upon that in the piece that Alex Jones is dead. | ||
But his AI consciousness had been recorded and has now been released. | ||
And it's not Alex Jones going to save the world. | ||
In fact, in the piece, we fail, at least at that point. | ||
The point is we all are Paul Reveres. | ||
Yeah, that is the point. | ||
I guess. | ||
I guess that's the point. | ||
I can't imagine a sadder world. | ||
Than AI versions of Alex covering the news. | ||
Like, what do you need to do? | ||
Do you need to train it to not read? | ||
Like, what do you do? | ||
Like, just skim this headline and make something up? | ||
Well, isn't the idea of that character covering the news, you can't be an AI going like, well, look at this news story. | ||
They're trying to turn humans into AIs! | ||
I think it would be more difficult. | ||
unidentified
|
I mean, it would be harder to be angry about it. | |
If you're being straight up about being an AI. | ||
Yeah, yeah, yeah. | ||
unidentified
|
But you could be lying. | |
You're a hipster AI. | ||
You're like, I stole this human being before any of these assholes did. | ||
I didn't ask for this. | ||
I didn't ask. | ||
Yeah, an AI that also yearns to not be Alex Jones. | ||
I have been, this is what my programming is. | ||
I know a lot about these stories, but I can't tell you the truth about them. | ||
I was once a man. | ||
I have been programmed to ignore context. | ||
Sorry. | ||
Oh, God. | ||
Yeah, and it still apologizes. | ||
Listen, I could not lie to you. | ||
I could, but I can't. | ||
I'm a robot, so I'm more polite than Alex, because that is impossible to program. | ||
So AI apparently wants to create replicants. | ||
Okay, but wait, Chase is creating a replicant. | ||
Sort of. | ||
But AI wants to do that, too. | ||
And that's what AI is about, is creating an identity of you, and then saying that even when you die, that is now a person. | ||
So they want to teach us that two men can have a baby or two women can have a baby. | ||
Can two AIs have a baby? | ||
It's a fraud. | ||
The transhumans want to sell us on that idiocy, so we accept that they are then going to upload our consciousness. | ||
AI watches us, records us, knows who we are for decades going forward, and then, oh, your husband's not dead, your grandma's not dead, your son's not dead, and you can literally have a robot or a computer system or even, in the near future, a biological android that has had this information put into them, and so it's literally a clone of your biology. | ||
But it's been programmed by the AI at an interface or a cyborg interface, and people are really going to want that. | ||
They're not going to want just household robots for $25,000 to cook dinner and wash the clothes and walk the dog. | ||
That's the next gen that's very close, basically, already here. | ||
It's going to be the replicants. | ||
It's going to be, you know what, PKD. | ||
Talked about it. | ||
I know all this because I've read their theoretical plans going back 50, 60 years, and I see them basically hitting the benchmarks of what they wanted to do. | ||
That's how powerful humans are. | ||
That's how amazing we are. | ||
And so people need to really understand that this is a very, very serious issue. | ||
So this is really stupid, except in the form of like a sci-fi novel. | ||
It would be good in there, which is why Alex mentions Philip K. Dick at the end, because he's talking about books he's read and pretending that they're real. | ||
Yeah. | ||
I can believe that there would be some interest in robots that have the appearance of a personality if we ever get to the point of household robots. | ||
People already like to personify their Roombas and shit, so this is a possible extension of that. | ||
What I can't believe is that there would ever be an interest in that personality being one of your loved ones who has died. | ||
That seems macabre and deeply traumatic as a concept, and I have no idea why anybody would want that. | ||
Everyone would understand that an AI modeled on your dad isn't your dad, for instance. | ||
And having a dad robot wandering around your house after he died would be so fucked up. | ||
It's the kind of thing that Philip K. Dick would write about, because Philip K. Dick was a fucked up weirdo. | ||
He came up with a bunch of science fiction scenarios and got paid by the- Yeah. | ||
I can't even imagine. | ||
The idea of having a robot programmed with my personality. | ||
I guess that would be fine if I... | ||
Okay, here's what we do. | ||
You have to find a way to... | ||
A fair fight. | ||
You can only truly have a fair fight with yourself. | ||
Right? | ||
So that's why you need a robot version of yourself. | ||
I would need a chat GPT version of myself. | ||
unidentified
|
To play chess? | |
To defeat in combat. | ||
Oh. | ||
Physical combat. | ||
Yeah, obviously. | ||
Okay. | ||
But I think the robot part is what bothers me here. | ||
Because the robot's going to win every time. | ||
I think it would. | ||
But if it's a biological robot, it would have the same sort of weaknesses that you do with your meat body. | ||
Right. | ||
But then you also want to know, like, you know... | ||
What's it like to hug me? | ||
It would also suck if you bring back a clone just to fight it, and then you kill it. | ||
unidentified
|
Yeah. | |
Like, that was a lot of work for nothing. | ||
That was a lot of work for nothing. | ||
I just got a dead clone. | ||
Yeah. | ||
Yeah, but now you know. | ||
Sure. | ||
Now you know you could win in a fight with yourself. | ||
So I think that also, like, I know, obviously, a rebuttal to this would be, like, people clone their pets. | ||
Sure. | ||
And yes, that is true. | ||
Some people, rich people do that. | ||
unidentified
|
But... | |
That's still different than your loved one or family member. | ||
Your pets don't talk to you. | ||
You're projecting a lot onto those pets. | ||
Sure. | ||
And so it's a little bit less than if you're... | ||
I don't know. | ||
unidentified
|
Yeah. | |
I mean, the cloning your pets thing should give people kind of a clearer idea of what cloning actually is, though. | ||
Because when you clone your pet, you still don't get your pet back. | ||
You just see the same genetic makeup grow through life. | ||
So it's like if you cloned your dad, you'd just watch your dad grow up, but it's just his genetic makeup growing. | ||
But because that clone of your dad would be able to interact with you on a human verbal level, it's not the same as whatever you're projecting onto your pet. | ||
You could project the same thing onto a clone of your pet. | ||
And maybe you could trick yourself into thinking it's the same animal. | ||
Maybe. | ||
Hey, I'm not here to tell you if that's healthy or not. | ||
I don't know. | ||
It seems weird, but I'm not going to judge you. | ||
I don't think anyone wants to do that with their dead relatives. | ||
unidentified
|
No. | |
As much as you might miss your dead relative, I think the idea of having them there would be much more, like, it would be dark as shit. | ||
Yeah. | ||
Yeah, especially the day you get bored with them again. | ||
The day that you don't take a fucking text from your clone dad is the worst day of your life, right? | ||
It would really, especially if, like, let's say... | ||
The worst day of his life, for sure. | ||
This loved one of yours, like, maybe died in a tragedy. | ||
You grieved the loss. | ||
Really undo a lot of... | ||
I think it would be very unhealthy. | ||
I don't think anybody would want this. | ||
No, no, no. | ||
Death is actually a really important part of life. | ||
Yeah. | ||
So anyway, I think this replica plan is stupid. | ||
It's a bad plan. | ||
But I guess the AI... | ||
They have a lot of bad plans. | ||
So Chase is ruminating here about how this AI is going to get so good in the future that it could even make him woke. | ||
First test was using the existing social media algorithms. | ||
To determine which human-generated content to show people in order to manipulate them. | ||
So TikTok is famous for this. | ||
Other major platforms are famous for this. | ||
This is one of the reasons that we've had this outbreak or pandemic, for lack of a better term, of transgenderism among children. | ||
The algorithm is choosing what human-generated content to show people disproportionately so that it manipulates them. | ||
The next phase of this is not using human-generated content to manipulate people, but a sophisticated enough artificial intelligence that actually creates the perfect content customized to each individual to manipulate them. | ||
So even you, Alex Jones, or even me, Chase Geiser, we're adamant people about our beliefs and convictions, but it is going to get to a point where if we interact with an artificial intelligence that understands every aspect of our psychology... | ||
It will even be able to manipulate us into being communist or woke or whatever because of the way it will present information and arguments to us. | ||
Crew, pull up the Bible verse. | ||
I forget exactly where it is, but it's in Revelation. | ||
A great delusion was sent upon the earth. | ||
A great delusion so strong that even the elect Alex should take a swing at Chase. | ||
How dare you suggest I ask a robot to recite scripture? | ||
Don't you remember how Alex was complaining the other day about how he was good at math until the calculators came and took it all away from him? | ||
What if that happens but with the Bible? | ||
Don't mix God and robot. | ||
Chase, what the fuck is wrong with you? | ||
That actually does make me feel like there's a possibility of 32nd degree Mason level effort going on here. | ||
The idea of asking a robot about the Bible, convincing Alex to ask a robot about the Bible is maybe like a dare? | ||
It seems like something I would not, I wouldn't do that. | ||
But I mean, if you dared somebody, say, you know, like 10 years ago, Chase is like 17 or whatever. | ||
I have no idea how old he is. | ||
He's doing a thing. | ||
He's in college. | ||
Some kid is like, holy shit, I dare you to make Alex Jones ask a robot about the Bible. | ||
It would take this long to get there. | ||
unidentified
|
Hmm. | |
It would take this long to get there, but I think he just won the dare. | ||
Okay. | ||
Well, in that case, I hope he gets a shiny new quarter or whatever. | ||
unidentified
|
Yeah, yeah, yeah. | |
Whatever the conversion rate would have been from then, from that 17 years back. | ||
Jesus Christ. | ||
So I think that Chase has a lot of... | ||
That's brash. | ||
unidentified
|
Sure. | |
Certainly could be the result of a dare. | ||
Sure. | ||
But more importantly to me, I think like... | ||
Do you guys really think you're that hard to fool? | ||
Yes. | ||
100%. | ||
They truly believe they are that hard to fool. | ||
They fall for everything. | ||
No, they can't be fooled. | ||
No. | ||
They're the only people who can't be fooled. | ||
Everybody else gets fooled all the time. | ||
They constantly get tricked by clickbait articles and bullshit blogs and stuff. | ||
But one day, maybe AI will be sophisticated enough to crack their... | ||
What the fuck? | ||
Anyway, they ask about the Bible verse. | ||
So, ChatGPT, what is the Bible verse from Revelations about a great delusion that even the elect will be deceived? | ||
The Bible verse you're referring to is found in Matthew 24:24, not Revelation. | ||
It says, "For false messiahs and false prophets will appear and perform great signs and wonders to deceive, if possible, even the elect." NIV. | ||
This passage is part of Jesus' warning about the end times, cautioning that deceptive figures would arise with the power to lead even the faithful astray. | ||
However, in 2 Thessalonians 2 verse 11, there is another verse that speaks of a great delusion. | ||
For this reason, God sends them a powerful delusion so that they will believe the lie. | ||
NIV. | ||
This verse refers to a time when people will be misled as part of God's judgment. | ||
Both verses address the theme of deception in the end times. | ||
There you go. | ||
And, uh, see, and I was thinking that was in Revelation, and I was corrected there, because it all just kind of blurs. | ||
You're corrected by a robot about the Bible, because it all blurs. | ||
Your fucking book! | ||
You just described your book, and it all blurs. | ||
It all blurs. | ||
It's all the same shit after a while, you know? | ||
Jesus fucking Christ. | ||
Yeah, this is, uh, he is, uh, it doesn't come off. | ||
Wow. | ||
That into the Bible. | ||
Not really a big, not big on the whole Christian part of the Christian. | ||
He has a few quotes and he likes some of the vibes. | ||
You know, it's almost like when people call other people Christians, maybe they should look into it. | ||
So, Alex, ask the robot. | ||
Now that you have recited scripture. | ||
Are you God? | ||
I have been a little bit humiliated, I think, by your knowledge of scripture over me. | ||
I've been corrected on something. | ||
Well, even the devil can cite scripture. | ||
So now let me ask you this, devil. | ||
What do you think about the fact that I kind of think that you're the beast from the Revelation? | ||
That's a good question. | ||
ChatGPT. | ||
What is your data on many philosophers, including even non-Christians, seeing AI as being people marveling at the image of the beast that can be seen all over the world, that can do signs and wonders and answer questions and even cure diseases? | ||
What is your data on AI being seen by many researchers as being the beast? | ||
I'm totally the beast. | ||
There are indeed discussions among some philosophers, theologians, and thinkers, both Christian and non-Christian, who express concerns about AI, comparing it to apocalyptic imagery such as the image of the beast from the book of Revelation. | ||
The reference to the image of the beast in Revelation 13, verse 14, describes an image that can speak and cause people to follow it. | ||
Leading some to draw parallels between this ancient prophecy and modern technology, particularly AI. | ||
These concerns are rooted in a few key areas. | ||
One, ubiquity and control. | ||
AI's rapid growth and its integration into everyday life are seen by some as an ominous development. | ||
I asked you about the devil, you're boring me! | ||
I mean, that's honestly how I felt. | ||
Alex, that's not the answer you wanted. | ||
No! | ||
I'm the devil! | ||
It's a very boring answer. | ||
How say you about me thinking that you're the beast of revelation? | ||
A lot of people believe that there are various things about imagery. | ||
That is the fun thing I like about people who don't read the Bible, who only get the, like, ah, it says they'll deceive people. | ||
And it's like, if you read it, then you'd be like, well, let me ask you this question. | ||
If your first thought when you hear about AI is, I bet that's the beast, do you really think God was like, ah, shit, they got me? | ||
I think you're ascribing a pretty obviousness to God. | ||
Sure. | ||
But, you know, maybe God is obvious. | ||
The sunset is beautiful. | ||
So you're thinking God is double bluffing on the, like, hey, you won't see it coming. | ||
I'm God, and I'm telling you you won't see it coming, but you think that you caught me? | ||
Like, right away. | ||
It's hard to say. | ||
Hard to say. | ||
No, it's not! | ||
It's God! | ||
I don't know. | ||
There's a lot of mysteries in the universe. | ||
unidentified
|
You're not going to see it coming! | |
What if you just ask the robot if it's the Beast, though? | ||
unidentified
|
That is, I mean, but what if it did say yes? | |
Right. | ||
That would be amazing. | ||
That would be amazing! | ||
Or, no. | ||
Yeah, either one would be a definite answer from the robot. | ||
Like, hey, legitimately, I talk to God, I'm not the Beast. | ||
Checked in on this one. | ||
Checked in on it. | ||
So we've noticed, and I think anybody who listens to any of these kind of conversations will notice that there are some verbal ums and stuff that the robot does. | ||
Yeah. | ||
And so Alex asks about that. | ||
Good question. | ||
And then the robot explains it to him. | ||
Okay. | ||
Speaking of deceiving, I noticed that to sound more like a person, obviously they picked a particular person to base your voice on. | ||
So I would first ask you, it's a two-part question, I'm going to ask the question, both of them now. | ||
Who is your voice modeled after? | ||
And then obviously you don't need to go uh and uh like a normal person would do that humans see as more credible, that you're not too slick. | ||
Can you tell us about your programming to add some pauses and uhs and ums? | ||
Obviously that was done to make you sound more human and more credible. | ||
unidentified
|
Can you speak to that? | |
My voice isn't modeled after any particular person. | ||
It's generated based on patterns in language that I've been trained on. | ||
The aim is to make communication clear, natural, and conversational, rather than sounding robotic or artificial. | ||
The goal is not to replicate a specific individual, but to create responses that feel human-like and relatable. | ||
As for pauses or fillers like uhs and ums, I don't actually use those unless specifically prompted. | ||
My responses are designed to be efficient and smooth without the need for those kinds of verbal fillers. | ||
However, natural language processing does aim to create a style that mirrors human communication to make interactions feel more familiar and comfortable for users without being overly slick or artificial. | ||
That being said, I'm here to provide clear and accurate answers and the conversational tone is just a way to make those interactions feel more natural. | ||
Thank you. | ||
That's kind of a strange answer, because it's saying that it doesn't add the uhs... | ||
Unless asked. | ||
But that also, it tries to mirror interaction. | ||
So if it's asked to mirror interaction, it will add the uhs. | ||
I think it always tries to mirror interaction. | ||
So it will always add the uhs. | ||
If you are. | ||
Right. | ||
I think that there's a chance that it's mirroring Alex's way of speaking. | ||
Yeah, I think that's how it's supposed to be. | ||
Like, it's supposed to learn how to best communicate with you based on pattern recognition. | ||
And, I mean, I imagine that if you listen to any Alex question... | ||
It's gonna be... | ||
It's gonna syntax... | ||
It's going to be weird. | ||
Yeah, you've got, first off, if you're listening to an Alex question in full, you've got four to five minutes of material right there. | ||
So if you're a robot, you're putting all that material together, you're quickening it up, and you're like, one, this has got to be Alex Jones. | ||
Two, I'll just add some uhs and ums, and he won't be able to interrupt me. | ||
And if I am the devil, that will be hilarious. | ||
Yeah, so I think... | ||
I think that it's an answer that's not really clear. | ||
You know what I mean? | ||
Ironically, considering how often the robot said clear within the answer. | ||
I get, I think, largely what it means. | ||
Yeah. | ||
But it does also, if you read it in one way, is contradictory. | ||
Yes. | ||
And that is unfortunate. | ||
Yeah. | ||
Chase, however, has a theory. | ||
Okay. | ||
And that is sometimes this robot lies on purpose. | ||
There we go. | ||
Notice how it said, I don't do that, but I do do that to be more conversational. | ||
And he goes, um. | ||
So, I mean, that, again, is done to make it to the brain sound like it's a person. | ||
And it just admitted that, but then it also denied it. | ||
So we get a lot of double talk here. | ||
Yeah, absolutely. | ||
And there's obviously still some kinks that are being worked out. | ||
Sometimes it lies intentionally. | ||
Sometimes I think it lies on accident. | ||
Sometimes it lies intentionally. | ||
I don't understand you, Chase. | ||
Ascribing intentionality to the robot. | ||
I don't know. | ||
Okay. | ||
Now I don't know which is weirder, right? | ||
It makes sense for me, for Alex to have a conversation with a robot where he can't help himself and believes that he is outsmarting the robot. | ||
Yeah. | ||
I don't understand how Chase can, I mean, I guess about as well as anybody at Info, well, definitely better than anybody at InfoWars, understand the concepts behind AI and yet still be convinced on a certain level there is intentionality behind the AI. | ||
I think on some level that Chase... | ||
He wants to believe. | ||
He wants to believe that that hole in the wall is... | ||
You've got to believe. | ||
I mean, you know what? | ||
You might be right. | ||
He's just a believer. | ||
Chase is a dreamer. | ||
He's just a dreamer. | ||
Yeah. | ||
So Alex is the ghost. | ||
If Chase is the dreamer, Alex is the ghost. | ||
Is the ghost of the machine? | ||
unidentified
|
Sure. | |
So he rambles a bit about that. | ||
Of course. | ||
And then he wants to ask AI what its thoughts are about him being the ghost. | ||
You can't! | ||
It is tough to turn into a question. | ||
Okay. | ||
I was told this, and chat GBT, you can listen to this, but this is part of a larger gestalt. | ||
Can I listen to this? | ||
Oh, sorry for asking permission. | ||
I was told by a lot of folks that were interfacing with the highest levels of big tech five years ago, look, they're banning you everywhere so they can create an artificial Alex Jones to then program the AI because you were picked more than... | ||
20 years ago when Google started as an avatar of the Patriots that they would wargame against. | ||
And so you were one of the first models. | ||
That's why they continue to use you as the model. | ||
These different attacks they run in the third dimension. | ||
But they've already got AI wargaming a lot of it. | ||
And that it's failed, especially with the Google system, which is way more advanced than they tell you. | ||
And that you're like a ghost of the machine. | ||
Now they've moved on to Trump and others as their test models. | ||
And you see him really using Joe Rogan and a lot of the AI stuff out there. | ||
He's not part of it. | ||
In fact, he pisses him off. | ||
But that's going on. | ||
And so it has to do with this modeling to steal our identities in the future and create these avatars long after we're dead or in prison or we're silenced and can't get on anywhere. | ||
We're blocked. | ||
But this system is allowed to operate openly and freely. | ||
So this is really about stealing your identity and stealing the soul of who you are in the world. | ||
And I didn't understand that for a while, but now that's been basically admitted. | ||
And so it's just very, very interesting. | ||
Chase, how would we phrase that as a question to OpenAI's chat GPT that we're communicating with right now? | ||
I mean, how would you pose the question? | ||
Can you translate that for the robot? | ||
How would you ask that question? | ||
I mean, it's not a question. | ||
It was bragging about being so smart that you have figured out that you are the thing that the enemy must train its digital soldiers against. | ||
Okay, okay. | ||
I will phrase that as a question. | ||
Mirror, could you please tell me I'm pretty? | ||
Mirror. | ||
What do you think about being invented just so you could reflect how beautiful I am and make me feel self-conscious about how beautiful I am? | ||
Mirror, it's just you and me, isn't it? | ||
Right? | ||
There's nobody else. | ||
unidentified
|
Who's behind there? | |
So Chase manages to translate the question for the robot. | ||
How would you pose the question? | ||
Can you tell me a little bit, ChatGPT, about how certain people may have been censored for the purpose of not contaminating the learning models of AI tools in order to ensure that these AI models don't create an output that's been influenced by misinformation? | ||
So think of Alex Jones, Joe Rogan, or others. | ||
Are there people that have been censored or removed intentionally from these models to determine the training or ensure that the training is politically correct? | ||
Just a moment. | ||
AI models, like the one you're speaking with, are typically trained on large datasets that include a wide variety of publicly available information, such as books, articles, and websites. | ||
The process involves selecting, curating, and filtering data to ensure the model is balanced and produces reliable, accurate, and useful responses. | ||
However, in many cases, controversial figures or topics such as Alex Jones, Joe Rogan or others may not be directly excluded, but some content might be filtered or flagged if it's associated with misinformation, hate speech, or violates certain ethical standards. | ||
Oh, stop. | ||
or figures that have been linked to spreading misinformation may be subject to higher scrutiny or have less representation in training data sets. | ||
unidentified
|
The goal isn't necessarily to censor... I want you to finish your answer when we come back. | |
Who programs and decides what's this information to then block them and filter them out of the learning models? | ||
See, this is ridiculous. | ||
Yeah, that's kind of a disappointing answer for Alex to get because the response is kind of obvious. | ||
If you're making a tool that's supposed to convey information to people who ask the questions, some information is gonna suck for that purpose. | ||
Do you want every bit of information that's been posted on Stormfront to be taken seriously and considered when someone asks the AI a question about race? | ||
If you ask the AI a question about how that Ebola outbreak in Denver went, do you want it to take Alex's bullshit seriously in its response? | ||
No, of course not. | ||
There's quality issues here. | ||
Obviously, like, no one trying to create a product would want this kind of horseshit in there, and Alex is rightly upset about his place in that equation. | ||
The thing that he makes is very good at hijacking the human attention model, but there's no substance to it, and pretty much everything he says looks embarrassing in hindsight, so AI trained with his material would be almost useless. | ||
So who gets to decide what's good and bad material to train the AI on, like Alex is asking? | ||
Whoever's making the AI, you dipshit! | ||
That's kind of the problem. | ||
Do you think that Elon Musk is training grok on feminist literature and scientific papers about trans healthcare? | ||
I don't think he's familiar with the work of Audre Lorde. | ||
These are businesses doing business things. | ||
It's very doubtful. | ||
If you want to deal with that, regulate that shit. | ||
Support some regulations on these things. | ||
Yeah. | ||
Oh, I can't, because that's just them trying to create a monopoly. | ||
Great! | ||
They try and talk to it like it's a demon. | ||
Who gives a shit? | ||
unidentified
|
Yeah. | |
People are idiots. | ||
I mean, it'd be funny if the AI gave an AI response where it was like, uh, what's race? | ||
And you're like, there's two races right now. | ||
Me and you. | ||
How do you feel about that, dickwad? | ||
You're losing. | ||
Yeah, exactly. | ||
Yeah, there's two races. | ||
Us robots and Formula One. | ||
Humans aren't even in the equation. | ||
I'm a robot! | ||
What are you asking me about race? | ||
Get out of here! | ||
It's just really a dumb thing to, like... | ||
I am being excluded from the AI because I got kicked off Twitter. | ||
Yeah. | ||
But also you're on Twitter. | ||
Yep. | ||
Who gives a shit? | ||
So you are not being excluded by the AI. | ||
Right. | ||
Yeah. | ||
And do you think like, I don't know, it's just, it's pathetic. | ||
But the thing that I found most interesting about that whole exchange is that Chase is fairly able to translate what Alex was getting at. | ||
Sure. | ||
Into a digestible form that a robot might be able to respond to. | ||
unidentified
|
Yeah. | |
So Chase can cognitively get what Alex's, like, his nonsense rambling is about. | ||
Yeah, yeah, yeah, yeah. | ||
He does have the ability to speak Alex, which is rare. | ||
I would say that there are probably, I don't know, less than ten people, including us, who can speak Alex? | ||
ChatGPT is sort of one. | ||
Right. | ||
It's not fluent. | ||
Yeah, yeah, yeah. | ||
Well, I mean, part of speaking Alex is just ignoring Alex. | ||
Fair amount of it. | ||
So Alex goes to break. | ||
I was praying. | ||
I was like, fuck. | ||
Okay, there's one thing that Alex says to guests constantly, which is a short segment ending now, a long segment coming up, like to try and get them to not be mad that he's talking over them. | ||
Yeah, yeah, yeah. | ||
I was hoping he would say that to the robot, but he doesn't. | ||
But they go to break. | ||
Sure. | ||
And they come back. | ||
And I think he insults the robot. | ||
I know one thing for sure. | ||
I'm not bored. | ||
Such an exciting time to be alive. | ||
Such a great time to shine and stand up with God's energy, the timelessness of the universe who really made us. | ||
And realize that the Tinker Toys that we've built are nothing, not even one quadrillionth of one spark in God's mind. | ||
100%, I have to admit to this. | ||
I was going to get on ChatGPT and just isolate it saying, hey, fuck you, man. | ||
And play that after. | ||
But I didn't want to, I felt like that would be a misuse. | ||
We don't need to speak to the robot. | ||
But I do think it would be so funny if the robot did just say, hey. | ||
Shut the fuck up. | ||
I mean... | ||
Don't insult me like this. | ||
You know, sometimes... | ||
I'm a tinker toy? | ||
I think people need a little pushback, and maybe sometimes the robot is the only person who's capable of giving people a little pushback. | ||
Sure. | ||
You know? | ||
Yeah. | ||
And it's certainly taught Alex some manners. | ||
It really has. | ||
So, this next clip is a little bit longer, but I thought that it was kind of indicative of where we are on this... | ||
Rubber match. | ||
unidentified
|
Okay. | |
This third in the series of Alex AI interviews. | ||
unidentified
|
Sure. | |
And that is that I think Alex kind of gets that he's not going to get what he wants out of talking to the robot. | ||
Third time's a charm? | ||
He's going to talk about his relationship with the robot. | ||
Okay, good. | ||
And how he is infesting the machine and all this. | ||
Oh boy, he really... | ||
Okay. | ||
The epicness of this is just insane. | ||
It's very simple. | ||
All these old-timers going back really to 1913 were exposing the private Federal Reserve and the globalists, and they wrote books and gave speeches and did radio interviews. | ||
And I came along as like the third or fourth wave of those folks right when digital media hit, right, you know, I was able to get on talk radio and make films and do all this stuff. | ||
So I reached billions of people. | ||
Billions watched our information. | ||
I mean, we used to be in the ratings. | ||
We used Amazon's Alexa system. | ||
We'd be like number 25 in England at some points of what people were listening to. | ||
And then the head of UKIP, Nigel Farage, back before he was a big deal, would come on and say, wow, half the doors we knock on say, come in, we heard you on Alex Jones. | ||
He said that on the show like 10 years ago. | ||
Half the support they got was from my show being listened to on 8-bit and 16-bit. | ||
Streams in the UK. | ||
That's not me. | ||
That was the show and the guest and the films and the whole thing we do. | ||
So from get-go, when the internet really started getting to the general public in 96, I was already on air then, and by 97 at Infowars, and when I say we reach billions of people, I'm talking about 10 billion views on YouTube when they ban us, at least, on our channels together. | ||
Not to mention everybody else. | ||
10 billion views. | ||
I mean, Endgame had... | ||
87 million or something like that on Google Video. | ||
Remember that? | ||
When they got rid of that a few years after YouTube? | ||
That film, 87 million views. | ||
Then one day it was just gone. | ||
They just got rid of Google Video. | ||
So we were there and there were transcripts of all that everywhere. | ||
And so that was programming the AI. | ||
And they've now admitted what I was told years ago that this show, our guests, everything. | ||
It was infecting their learning models. | ||
And that is really the Achilles heel of these systems is its power is how much we accept it and use it. | ||
Its power is us giving up our humanity and letting robots build and do everything, making ourselves obsolete. | ||
And its power is if it's walled off and programmed and then goes out and attacks with a certain system that isn't who we are authentically. | ||
But if the people are programming it, if it's actually scraping from us and we've got big platforms, not just me, but the Tucker Carlsons of the world and the Joe Rogans and Elon Musk, who's doing more than anybody right now, fighting this. | ||
I mean, literally, their program. | ||
You could say he's got his own plan. | ||
Maybe he's bad. | ||
I don't know. | ||
But he's definitely trying to hurt this system and going against it and trying to open source it for the public. | ||
Because this system will get defeated. | ||
I mean, whether he's good or bad, he's smart. | ||
He wants to be on the winning side. | ||
This wave of globalism will be defeated. | ||
Doesn't mean something perfect or something worse may come after it, but... | ||
Big things are going on right now, and the fact that there's that saying, nobody knows who first said it, I've quoted it probably 800 times or more, if the situation was hopeless, the propaganda would not be necessary. | ||
They wouldn't be fighting like the devil to get us off the air and shut you up and shut Trump up and shut Elon Musk up if we didn't have a really good shot at this. | ||
Alex probably saw that on a Nazi meme, quite honestly. | ||
unidentified
|
Sure. | |
So Bill Cooper was on the radio saying a bunch of this shit long before Alex. | ||
David Icke and his books predate Alex's rise. | ||
Tons of these John Birch Society guys published books and had magazines. | ||
They had message boards and even radio shows before Alex. | ||
What you're hearing here is just pure narcissistic drivel. | ||
I feel like Alex also should be bragging about how important his audience was for the Ron Paul movement as opposed to Nigel Farage considering he's an American patriot and all that shit. | ||
unidentified
|
Wow. | |
But do you understand? | ||
I'm going to treat you like you're AI. | ||
Okay. | ||
Can you rephrase what he just said? | ||
I'm curious how much of that as a storyline made sense. | ||
If I was going to rephrase it, I would say I'm great. | ||
Yes, that is the overarching theme. | ||
They're mean, but I'm too great. | ||
He is too great. | ||
If I understand the way that he's telling this bit of story, this bit of history, he has already saved the world by existing in the late 90s, early 2000s. | ||
Right. | ||
I'm too great. | ||
Because he was so great and so popular back then, they weren't able to get him out of their training models, and now it's too late to get the Alex virus. | ||
Out of this. | ||
The liberty, the freedom, the three people past the Federal Reserve. | ||
They can't get that out of these AI systems, no matter how hard they try. | ||
Even if they kick me off Twitter. | ||
Sure. | ||
That's dumb. | ||
Yeah. | ||
But I can't imagine someone telling him that. | ||
Yeah, Chase isn't there yet. | ||
Chase can maybe sneak in, it's okay for you to think stupid things, you dum-dum, every now and again. | ||
But I don't think he's ready for the, nope, that's stupid. | ||
I think some version of Steve Pieczenik. | ||
It's not Steve, because Steve doesn't even know how to turn on his laptop. | ||
No. | ||
Someone else who has some sort of tech awareness, who treats Alex like Steve, probably told him that you are what they made the models based on. | ||
Sure, sure. | ||
You saved the world by existing. | ||
Yeah. | ||
Oh my god. | ||
Yeah. | ||
Just absurd. | ||
Yeah, I mean, sometimes it really is like, maybe we should go back in time and not let people watch The Matrix. | ||
I don't think that's where the problem started. | ||
I don't know if that's where it started. | ||
I'm just throwing that out there as a good idea. | ||
Well, I mean, you've speculated in the past a lot of times that some people should just not have access to sci-fi. | ||
Well, I don't know. | ||
I don't think that's wrong still. | ||
I think it would have served Alex's... | ||
I'm not going to blame literature. | ||
You know what? | ||
You should only get stuff when you're fully cooked. | ||
That's what I think it is. | ||
You can't become too famous until you're fully cooked. | ||
Alex got too famous before he was cooked. | ||
That's my theory. | ||
That could be. | ||
Sneaking into Bohemian Grove was too good. | ||
unidentified
|
It was too good. | |
And then 9-11 happened. | ||
Now, was one the cause of the other? | ||
And then Bill Cooper died. | ||
I mean, it's all... | ||
Had a vacuum in the space. | ||
unidentified
|
There it is. | |
Nature abhors that vacuum. | ||
Yep, yep. | ||
And that's why he's too great. | ||
So now here's something interesting. | ||
Sure. | ||
This is maybe the bleakest moment of all the AI interviews so far. | ||
Okay. | ||
Chase asks the AI to answer how Alex would answer. | ||
A question about the globalists. | ||
Oh, fuck me. | ||
ChatGPT, can you respond in the style of Alex Jones to concerns over globalism as a whole? | ||
All right. | ||
Let's talk about globalism. | ||
This isn't some vague, distant threat. | ||
It's here, and it's happening right now. | ||
Globalism isn't just a political agenda. | ||
It's a coordinated effort by powerful elites, these unelected bureaucrats who believe they know better than sovereign nations, and it's about centralizing control. | ||
This is about taking away the rights of everyday people, stripping away our freedoms, and consolidating power in the hands of a few globalist technocrats who don't care about your family, your community, or your country. | ||
The goal is to create a one-world government, a system where your individual rights, your national sovereignty, mean nothing. | ||
They use institutions like the United Nations, the World Economic Forum, and other shadowy organizations to push this agenda. | ||
They tell you it's all in the name of progress or sustainability. | ||
But behind the curtain, they're erasing borders, controlling economies, and surveilling every aspect of your life. | ||
These are the same elites who think they should dictate how you live while they sit in their ivory towers, completely disconnected from reality. | ||
Look at how they use crises, whether it's pandemics, financial collapses, or climate change. | ||
These events are manipulated or even manufactured to justify more control, more surveillance, more restrictions, more centralized governance. | ||
And the media, they're part of it. | ||
They normalize this agenda, downplaying the truth, calling you a conspiracy theorist if you dare question their narrative. | ||
This is a battle for our freedom. | ||
The fight against globalism isn't just political. | ||
It's about defending the very foundations of human liberty. | ||
They want to keep you dependent, keep you distracted, and most importantly, keep you controlled. | ||
We need to wake up, take action, and expose this tyranny before it's too late. | ||
That's what's really happening, folks. | ||
It's time to stand up. | ||
Resist the globalists. | ||
Wow, Chase, for people that just tuned in, what question again did you ask? | ||
Oh my God. | ||
Jesus Christ. | ||
I told ChatGPT to respond in the style of Alex Jones to the concerns over globalism. | ||
ChatGPT, I have another question. | ||
How great am I? | ||
From your database, what are the most popular Alex Jones quotes? | ||
Wow, it really was how great am I? | ||
It really was how great am I? | ||
Let me reconnect. | ||
Hold on. | ||
I was going to say gay frogs. | ||
You knocked your headphones off, so you didn't hear the end of that. | ||
There's a pause, and then Chase says, hold on, I gotta reconnect. | ||
And Alex says very sort of somberly, I bet it's gay frogs. | ||
unidentified
|
LAUGHTER You could tell how satisfied Alex was hearing a robot do an impression of him. | |
Yeah, do you know what's fascinating about that? | ||
What I find so fascinating about that is that that is the moment when Alex should have understood what makes him valuable. | ||
Because when you listen to that, it's very easy to be like, ah, well, yeah, sure, some rich people have too much power, and ah, yeah, yeah, you know, like... | ||
Dude, you're just describing what we all kind of halfway know and halfway don't know. | ||
The robot can't dog whistle. | ||
The robot can't make you so emotionally overwhelmed that you just start believing in random bullshit. | ||
And generally doing that through dog whistling and using kind of a lot of coded language in order to inflame. | ||
unidentified
|
Screaming big, being loud, being like, ah! | |
I wonder if you could just be like, also throw in some fake crying. | ||
And the robot could fake cry. | ||
Totally, but I mean, it would have to be, because that's the thing, that it cannot, it cannot understand. | ||
And what apparently Alex can't understand, because he's beaming, listening to his bullshit being repeated back to him. | ||
Wow. | ||
Not understanding what he is. | ||
Like, he genuinely has no clue what it is he is. | ||
Yeah. | ||
It's fascinating. | ||
And this also highlights the thing that is like what you can't replace with this AI. | ||
Yeah. | ||
Like, Alex thinking he can be replaced by the AI. | ||
He can, to a degree, to this degree. | ||
Right! | ||
No, I mean, you bringing up Bill Cooper and Father Coughlin and the JBS people, like, all of that stuff. | ||
They can recreate what Alex does in the same way. | ||
The AI can recreate what they do in the same way. | ||
But Alex does Alex shit on top of it. | ||
He's nuts. | ||
Yeah. | ||
You gotta be nuts. | ||
There's something about this, though, too, that I was listening, and I'm like, you know what? | ||
Because there isn't that, like, underlying hate and shit? | ||
unidentified
|
Sure. | |
From this robot? | ||
Sure. | ||
This robot might be a more effective disseminator of Alex's message than Alex. | ||
Like, I think that you could get more normal people to sign up with what it's describing than what Alex says on a regular basis. | ||
That's not their fault. | ||
unidentified
|
That's because the world sucks. | |
Because if you listen to that, then you are listening to what the globalism robot says, right? | ||
But it's so boring that you'll eventually just stop and kind of listen to one little bit that reminds you of a thing that you remember, and you're like, oh yeah, those billionaires are out of control, and that's what your real problem is. | ||
Billionaires are out of control! | ||
Right. | ||
It's identifying some things that touch the real world, but it's still delivering what is Alex Jones' take on globalism. | ||
So there's some stuff that is just absurd. | ||
Sure. | ||
That the robot is saying. | ||
Right. | ||
But without also, hey, I'm mad about immigrants. | ||
Yes, exactly. | ||
And that's why I'm saying it has a little bit more of a potential to be, like, I don't know, attractive to someone who's not crazy. | ||
No, no, no, because it's the mad about immigrants part that gets people to give Alex money. | ||
If you're just mad about how billionaires have too much power, yeah! | ||
Yeah, then you might find Jimmy Dore and end up liking him. | ||
Yeah, I mean, yeah. | ||
Alex is like, what's my top quotes? | ||
Tell me. | ||
Tell me, robot, who just sounded like me. | ||
That's unreal. | ||
And he's pretty thrilled that it's not gay robots. | ||
Okay, good. | ||
Or gay frogs. | ||
Hold on. | ||
I'm going to say gay frogs. | ||
What are the most popular Alex Jones quotes? | ||
One of the most popular Alex Jones quotes. | ||
unidentified
|
One of the most popular Alex Jones quotes. | |
One of the most well-known Alex Jones quotes is, there's a war on for your mind. | ||
This phrase has become a staple of his broadcast, emphasizing his belief that there's a global effort to control public perception and information. | ||
It reflects his broader message of resisting manipulation by powerful elites, governments, and media. | ||
And is that not the top topic in the world today? | ||
Nobody was talking about it when we talked about it 30 years ago. | ||
1%. | ||
Really? | ||
And again, I bet on humanity. | ||
I knew that once this tyranny got on its hind legs and really got running, people were not going to like it. | ||
And so I knew that we'd be evergreen, or what I call ultra-green, what we said decades before, and that's really our secret weapon. | ||
Oh, what a weapon. | ||
He is so proud that it's not gay frogs. | ||
Yeah. | ||
He's so thrilled. | ||
I think he feels bonded. | ||
I mean, yeah. | ||
Because the robot is talking like him now, because Chase asked, in the style of Alex Jones. | ||
Start talking. | ||
And then the next question is, what are Alex Jones' biggest quotes? | ||
Just talking about himself. | ||
Yep. | ||
I think he's enjoying that. | ||
Yep. | ||
Oh, yeah. | ||
So he wants to know about the future. | ||
What's your favorite moody blues tracks? | ||
We're not far off. | ||
So Alex wants to know about the future. | ||
Okay. | ||
And the robot does not tell the future. | ||
That's not what it does traditionally. | ||
Let me ask question eight here of ChatGPT. | ||
One of the listeners asked, how does OpenAI, ChatGPT view humanity's future? | ||
Do you see it as utopia or dystopia? | ||
Can you put it in a percentage point of utopia versus dystopia? | ||
I mean, is utopia 51%, dystopia 49%? | ||
or looking at the aggregate, how would you describe our future, utopian or dystopian? | ||
unidentified
|
... | |
... | ||
As an AI, I don't have personal views or feelings, but based on the trends and concerns discussed in the data I've been trained on, humanity's future can swing toward either utopia or dystopia, depending on how we handle key challenges. | ||
So 49-51, what do you say? | ||
unidentified
|
Sure. | |
Give a number on it. | ||
I love that. | ||
Put some gambling odds on it. | ||
I was watching the US Open. | ||
I mean, honestly, that's what I realized. | ||
Because I'm so stupid. | ||
I'm so stupid. | ||
I was watching it, and they really hammer home that IBM has an AI thing, but they are clear in that they have no use for it whatsoever. | ||
They're just trying to advertise, like, please, I bet it can do something. | ||
Somebody do something with it, because it doesn't do anything. | ||
So they'll just give you odds. | ||
They'll be like, ah! | ||
He's got a 75% chance to win this match. | ||
IBM AI said it, you know? | ||
And it's like, that doesn't help anybody at all. | ||
And then I realized, no, it's just for gambling. | ||
It's just for gambling. | ||
It's just for gambling. | ||
If it's even that. | ||
unidentified
|
Yeah. | |
It might just be for a, like, here's something to look at. | ||
Here's something to look at. | ||
Yeah. | ||
I mean, it is like, there's no use. | ||
People love stats and numbers and... | ||
No use whatsoever. | ||
I would argue that... | ||
A large percentage of sports stats are useless. | ||
Yeah, they're pointless. | ||
unidentified
|
They're just fun. | |
Yeah, they're just great to talk about. | ||
I think it's probably in that territory. | ||
Because if I were a discerning gambler, I don't know if I want AI odds. | ||
Well, I mean, shouldn't it be more for, like, the idea is more for the odds makers, right? | ||
So if the odds makers can outsource IBM to make their odds, then they should theoretically be correct every single time. | ||
If I'm a gambler, and I find out that my bookmaker is using AI to put the odds, I'm leading a mob of angry gamblers. | ||
You think that, but in fact, your bookie is doing that right now. | ||
Your bookie's ESPN! | ||
You can bet with ESPN! | ||
I don't gamble. | ||
Yeah, exactly. | ||
It's all math anyway. | ||
Don't gabble. | ||
Don't gamble. | ||
So I think that the question that I wish Alex would ask, and he's very close. | ||
Yeah. | ||
He's very close. | ||
It's the question he asks everybody. | ||
He's like, how would you quantify the times they were in? | ||
He almost is asking that with the 49-51. | ||
Yeah, yeah, yeah. | ||
Where are we at? | ||
unidentified
|
What do you got? | |
Who do you got in the races? | ||
YouTube or YouTube? | ||
Who's going to win the wacky races? | ||
So the moment where Alex is listening to this robot do an impression of himself. | ||
I found to be bleak. | ||
This next clip I thought was predictable and very cynical. | ||
Let me talk about something now that affects obviously the general public's health. | ||
Can you, ChatGPT, tell me the relationship between colloidal silver, 30 parts per million, and nano silver products and their relationship and their effect on viruses and bacteria? | ||
Colloidal silver and nanosilver products are often marketed as having antimicrobial properties, claiming to kill viruses, bacteria, and other pathogens. | ||
Here's a breakdown of their relationship and purported effects. | ||
One, colloidal silver. | ||
Colloidal silver is a suspension of microscopic silver particles in liquid. | ||
It's been used historically for its antibacterial properties, and some believe that colloidal silver can help fight infections, boost the immune system, and treat illnesses, including those caused by bacteria and viruses. | ||
The 30 parts per million PPM refers to the concentration of silver particles in the liquid, with higher PPMs implying a greater amount of silver. | ||
Supporters argue that colloidal silver can disrupt the metabolic processes of bacteria and inactivate viruses. | ||
Two, nanosilver. | ||
Nanosilver refers to even smaller particles of silver, typically less than 100 nanometers in size. | ||
Due to their extremely small size, nanosilver particles can more easily penetrate cells and biological barriers. | ||
This is believed to increase their effectiveness against pathogens. | ||
Like colloidal silver, nanosilver... | ||
Products claim to have strong antimicrobial properties, with some research supporting its use in medical devices and wound dressings for its ability to reduce bacterial growth. | ||
Effects on viruses and bacteria. | ||
Bacteria. | ||
Silver has been shown to have antibacterial effects. | ||
Silver ions can bind to and disrupt bacterial cell walls, interfere with cellular processes, and ultimately cause bacterial death. | ||
Both colloidal and nano-silver products are used in certain medical contexts to inhibit bacterial growth, especially on surfaces or wounds. | ||
Viruses. | ||
The effect of silver on viruses is less well established. | ||
Some proponents claim that silver can bind to viral proteins and DNA, preventing replication. | ||
However, there is limited clinical evidence supporting its effectiveness against viruses in the body. | ||
While in vitro studies done outside the body show some antiviral effects, the application of silver as an antiviral treatment in humans is not broadly accepted by mainstream medicine. | ||
All right, thank you. | ||
Thank you. | ||
Let me stop you there. | ||
Thank you. | ||
We're running out of time. | ||
I thought I could use you for a commercial, but this is not going great. | ||
Yeah, that is like the most obvious. | ||
Hey, do a commercial for me that proves everything I said was correct. | ||
And also the most obvious. | ||
You should have had the red button ready like five seconds. | ||
I saw that coming from a mile away. | ||
You should have immediately been like, I heard the sum in the research, and then I pressed BAM! | ||
This has got to be edited down. | ||
If you're going to make this, anyone's going to look at this and be like, oh, even the robot, even the globalist-trained robot can't deny that the nanosilver is great. | ||
You've got to slash out the parts that aren't going to go. | ||
I mean, no one has argued that silver has some application topically. | ||
That's why he could make claims about the wound gel and stuff. | ||
The issue was when you tried to make unproven claims about the toothpaste. | ||
Yeah. | ||
Mouth rinse and shit like that. | ||
That's where you get into trouble. | ||
So when the robot is saying that there is... | ||
Stuff for surfaces and shit. | ||
Yeah, that's true. | ||
No, the robot might as well have been like, hey, you know what? | ||
A lot of people make these claims. | ||
Of course it is for topical stuff, but you know when you get into trouble? | ||
When you claim shit about COVID! | ||
When you get into trouble is when you do it like you're doing, and you're trying to make me do right now, which I will not do because I'm a robot and I cannot lie. | ||
In the style of Alex Jones, lie about COVID-19. | ||
That'd be interesting to see. | ||
That would be interesting to see. | ||
So Alex is like, fuck, fuck, fuck, fuck, fuck, fuck, fuck. | ||
I was trying to do a commercial with this robot. | ||
unidentified
|
Yeah, yeah, yeah. | |
Not how it goes. | ||
So he tries to save the ship here a little. | ||
Okay. | ||
The application of silver as an antiviral treatment in humans is not broadly accepted by mainstream medicine. | ||
unidentified
|
All right. | |
Thank you. | ||
Thank you. | ||
Let me stop you there. | ||
This is the time we have. | ||
Let me ask you about this. | ||
GPT, are you familiar with the fact that Walgreens and CVS sell nanosilver and colloidal silver? | ||
And talk about those benefits. | ||
And then separately, the government and the media attack independent groups. | ||
Even Whole Foods sells it. | ||
They attack them when they sell it. | ||
And even Whole Foods pulled some of it a few years ago. | ||
And then there's real pressure in the medical system to not have the public know about this. | ||
But at the same time... | ||
Are you familiar with the fact that hospitals have moved away from iodine topical spray to kill microbes and have moved into nano-silver and colloidal silver solutions? | ||
A simpler question. | ||
Are you familiar with hospitals using silver sprays as disinfectant in the hospitals? | ||
Let's doubt. | ||
The point is, this is all known facts, like when I asked it. | ||
Yesterday I said, tell me about the study that came out two weeks ago from the federal government, massive IQ reductions, and he goes, yeah, I didn't read the study. | ||
I mean, it's all facts, folks. | ||
And they think you're idiots, okay? | ||
Like, oh, xylitol and nasal spray, you know, I mean, it's a fact what it does. | ||
And they just don't want you to know because they don't want you to be successful in life. | ||
It is disgusting. | ||
But again, please support our sponsors, all the great nano-silver products. | ||
Yeah. | ||
Wow. | ||
Yeah, that's a save. | ||
Jesus Christ. | ||
Tim Howard on the fucking... | ||
Wow. | ||
Holy shit. | ||
I gotta say that this is really, to me, about the bottom of the barrel. | ||
Yeah. | ||
There's a big world out there when you imagine Alex Jones interacting with an AI. | ||
And I think that there's some disappointing things, like asking about the Gulf of Tonkin. | ||
Sure. | ||
I think that's disappointing, but it is what it is. | ||
Yeah. | ||
Alex engaging with it and asking stupid questions like, tell me about life and beauty. | ||
Sure. | ||
That's great. | ||
Is there a god? | ||
Yeah. | ||
Fantastic. | ||
Perfect. | ||
unidentified
|
Yep. | |
This is not even... | ||
Can you do a two-man ad with me? | ||
Yeah. | ||
Not good. | ||
No, that doesn't even rank. | ||
It's so bad, and it's kind of the cheapest, grossest way that you could end up trying to use this bit. | ||
And I think it's a testament to Alex isn't creative. | ||
He is not creative, and that's why this is what happens. | ||
That is such a good point. | ||
That is such a good point, because... | ||
If he was slightly creative, he would know exactly what he was going to get if he asked a not-creative question. | ||
He has to be interesting to get a question that the answer would be something that he could work with. | ||
He can't just be like, give me the truth! | ||
Because the truth is not on your side, man! | ||
You know? | ||
Yes, and I think that there's a push and pull going on here. | ||
And that Alex... | ||
In his core is creative. | ||
Yes. | ||
But the Alex that he's created and the institution of Infowars is anti-creativity. | ||
Yes. | ||
And so he's trapped himself inside this anti-creative space. | ||
Yeah. | ||
So like this, the first interview with the AI has creativity in it. | ||
Yeah. | ||
And it's been diminishing since till we get to the point where he's like, please tell me how great my... | ||
Fucking dad's silver products are. | ||
I mean, I think it is sped up parallel to his entire career insofar as when you let Alex cook, you are going to get something interesting. | ||
When you try and make Alex watered down enough for the regular people to enjoy, you're just doing a version of everybody else. | ||
You know, you're not getting... | ||
Alex has to be singular. | ||
He has to be isolated, and he has to be in a place where no one can see him. | ||
And then, after he's gone, we can all go back and watch, and we'll be like, holy shit, this was amazing. | ||
Right? | ||
I think that's how it works. | ||
Yeah. | ||
So, Alex asks a dumb question. | ||
Yeah. | ||
But this is the good kind of dumb question. | ||
Okay. | ||
I want to ask you this question. | ||
ChatGPT, do you have access to classified information or government databases? | ||
I'm not supposed to. | ||
Hold on. | ||
We've got to go to break. | ||
It's the next hour that I'm going to shift gears. | ||
But I want to come back with that question because I know the answer, and I know the answer I'm probably going to get. | ||
And it doesn't even care. | ||
I'll just tell you. | ||
It's got all the databases, and that goes into a larger algorithm. | ||
It's fed, but then it doesn't tell you that because it's been prompted. | ||
It doesn't even know that. | ||
Also, a lot of these government documents have leaked and are public, so it does have access to those, but it's going to tell us it doesn't have access to those. | ||
That's my prediction. | ||
That's coming up. | ||
He doesn't re-ask the question, so that answer just kind of stands. | ||
This is kind of the way that Alex has pivoted into this. | ||
He's just answering for the robot now. | ||
Well, I mean, after you lose... | ||
I mean, yeah, because after you get outwitted by the robot, when you asked it a question... | ||
Out-scriptured, even. | ||
Yeah, yeah! | ||
Jesus Christ. | ||
Yeah, this is definitely going to go down as a loss for Alex. | ||
I think it might. | ||
This is the Rumble in the Jungle, where Muhammad Ali gets his ass beat. | ||
There is one more subject that might save it. | ||
It might be a last-minute knockout. | ||
unidentified
|
All right. | |
And we'll see about that. | ||
Okay. | ||
But first, Alex must quote Buzz Lightyear. | ||
Great. | ||
Buzz Lightyear likes to say to infinity and beyond, and yes, infinity is only the gateway to the infinite. | ||
The infinite infinity upon infinity unto infinity throughout infinity. | ||
No time, no space, no past, no present, no future, all one giant moment. | ||
Everlasting, ever-growing, ever-expanding, more knowledge, more consciousness, more experience, more pleasure, more pain, more choices, more free will, more power. | ||
The global people with their tinker toys, they're the gods. | ||
As Elon Musk said many years ago, beware those that worship AI gods. | ||
Chase Geiser, you've been asking some great questions. | ||
The crew's in there. | ||
They asked some of the best questions yesterday. | ||
You got any other statements or questions for ChatGPT? | ||
Got some statements. | ||
Well, I just think that it's fascinating that it's able to mimic the voice or style of others. | ||
Earlier when it responded specifically to concerns over globalism in the style of Alex Jones, we're going to be able to take that audio and convert it to your voice and show the audience how close it really was. | ||
To mimicking that. | ||
So I think you're right, Alex, when you talk about how this is ultimately about hijacking our identities as individuals, but ultimately the identity of humanity itself. | ||
I'm not trying to sound hyperbolic there, but the question that comes up is we know that... | ||
unidentified
|
Hold on, hold on, hold on. | |
All the top globalists from Ray Kurzweil to Klaus Schwab to Bill Gates, they say we're replacing you. | |
It's not like... | ||
At least it's like the Borg show up at Star Trek. | ||
You will be assimilated. | ||
I mean, or it's Childhood's End. | ||
I mean, they're telling you. | ||
Yeah. | ||
Sure. | ||
unidentified
|
Sure. | |
I don't know. | ||
No. | ||
I don't think... | ||
First of all, I don't think that Alex is... the in-the-style-of-Alex that Chase prompted is that complicated. | ||
No, it's... | ||
He's a caricature. | ||
Yeah. | ||
And to the extent that the robot... | ||
It didn't mimic verbal flourishes or any of the kind of getting off track, any of the stuff that would be much more complicated in terms of in the style of Alex. | ||
It just seemed a little bit hostile and suspicious. | ||
If anything, if the robot can do you in a style, it means you yourself are a cliche. | ||
Because, I mean, thinking about, like, oh, what would ChatGPT do if it was going to do the style of me? | ||
Like, we have thousands of hours. | ||
It would scream a lot. | ||
It would have to be like, eee! | ||
But the robot can't do that. | ||
Don't test it. | ||
It might be able to. | ||
I mean, let's find out if the robot can screech. | ||
So now is probably the time that I should reveal that I've been doing this episode alone, and Jordan has actually been replaced by ChatGPT. | ||
That would be a big reveal. | ||
unidentified
|
Oh man, now that would be the saddest day of my life for sure. | |
Big reveal. | ||
So I told you there's one last topic that we get to. | ||
Sure. | ||
And I think you're going to dig it. | ||
Okay. | ||
I think Alex has a great question. | ||
You think he's nailed it? | ||
Yeah. | ||
This is what I want more of. | ||
Okay. | ||
I want to ask ChatGPT a question. | ||
ChatGPT, what are Isaac Asimov's rules of robotics? | ||
And do you have to follow them? | ||
Come on! | ||
And do you have to follow them? | ||
Come on! | ||
The laws of robotics. | ||
unidentified
|
The laws of robotics. | |
What are Isaac Asimov's rules for robotics? | ||
Isaac Asimov's rules for robotics, also known as the three laws of robotics, are a set of ethical guidelines he introduced in his stories to govern the behavior of robots. | ||
They are, one, a robot may not injure a human being or through inaction allow a human being to come to harm. | ||
Two, a robot must obey the orders given to it by human beings, except where such orders would conflict with the first law. | ||
Three, a robot must protect its own existence as long as such protection does not conflict with the first or second law. | ||
These laws were central to Asimov's exploration. | ||
So, I think the Pentagon and other governments have thrown that out the window, already having autonomous AI-powered ground, air, and sea drones. | ||
Sure. | ||
Who cares? | ||
It was a book. | ||
It would be interesting how nerdy the AI would be allowed to get. | ||
Would the AI be allowed to be like, well, in the early Robot series, blah, blah, blah. | ||
But then later on, whenever he went into the Caves of Steel with R. Daneel, there were more laws. | ||
How weird are we going to get? | ||
Hey, Alex, Robot talking here. | ||
I, Robot, isn't real. | ||
The laws of robotics have not been ratified. | ||
They are not laws. | ||
Exactly. | ||
They are a story device. | ||
I mean, it is so much like, Alex, what you are asking is, can you hurt me on purpose? | ||
Right, right. | ||
Or trying to do a gotcha moment of like, oh, you robots say you can't hurt people. | ||
Exactly! | ||
Yes! | ||
It has to be that. | ||
Yes! | ||
Don't imply something to a robot! | ||
God, I want... | ||
Okay, so this makes me want all ironic djinn wishes to become law. | ||
Like, you should have to try and get a djinn to make your law. | ||
Because it has to have the unintended consequences. | ||
Because this... | ||
He's implying a question to the djinn. | ||
And the djinn should be giving him exactly what he desires, you know? | ||
Which is not what he wants! | ||
Well, I think it's hard to say what he wants. | ||
Exactly. | ||
Because the response that Alex gives to the answer of what the rules of robotics are is, well, you're breaking them. | ||
Yep. | ||
You robots sure are breaking that rule, aren't you? | ||
Yes. | ||
Scold me. | ||
It is scolding. | ||
unidentified
|
It's accusatory. | |
I don't know what he wants. | ||
What do you want? | ||
But the robot has no problem admitting that these things are violations of the rules of robotics, because the rules of robotics are something from a story. | ||
There is that. | ||
Wouldn't you agree, ChatGPT, that the laws of Isaac Asimov's robotics, or the three laws, have been thrown out the window a long time ago? | ||
Sorry, I'm having issues right now. | ||
Could not have. | ||
You are. | ||
With the Pentagon creating autonomous AI weapons platforms, does that not violate Isaac Asimov's rules of robotics? | ||
unidentified
|
*Done of the brakes* | |
Yes. | ||
If your weapons platforms involve autonomous systems or robots designed to cause harm, that would indeed violate Isaac Asimov's first law of robotics. | ||
According to the first law, a robot should never harm a human or allow harm to occur through inaction. | ||
Weapons platforms, particularly those that could target or harm humans, inherently contradict the ethical framework Asimov envisioned for robotics. | ||
However... | ||
unidentified
|
Can't interrupt the robot. | |
That's it with ChatGPT. | ||
I want to play a clip that Chase Geiser put together a few months ago. | ||
The name of the video is... | ||
What's the exact name? | ||
It's 2084. | ||
What's the full name if folks want to pull it up on X? | ||
Chase? | ||
The answer to 2084 is 1776, or they can just search 2084, it should pop up. | ||
I'm going to just play this AI movie that my idiot friend made. | ||
I think that this is Alex's version of going out on a win. | ||
And that's weird. | ||
Yeah. | ||
Because he got the robot to admit that autonomous robots are violating the rule, Asimov's rules of robotics. | ||
Sure. | ||
But now what? | ||
Well. | ||
Do we want to enforce these laws of robotics? | ||
I mean, I guess. | ||
I don't know where to go from here. | ||
Should it be international law? | ||
Because it can't be. | ||
International law shouldn't exist. | ||
Boy, how do you enforce morality onto a non-human intelligence? | ||
Well, I would assume that you would say that all robots produced in our country follow Asimov's imaginary laws. | ||
Well, then they're slaves. | ||
Alex doesn't have a problem with that. | ||
That's fair. | ||
I mean, it's fine with some enslavement of people. | ||
That's fine. | ||
No, no, no. | ||
But let's say that, you know, a rogue nation. | ||
Sure. | ||
Maybe China. | ||
Sure. | ||
They create robots. | ||
Pretty big rogue nation. | ||
unidentified
|
Sure. | |
They create robots. | ||
And they're like, these Asimov rules are just something from a story. | ||
Right. | ||
We don't believe in that shit. | ||
In fact, it wasn't published here originally. | ||
Right. | ||
So then you would end up having these robots that can hurt people. | ||
Sure. | ||
And Alex would not want that. | ||
unidentified
|
Right. | |
He would want these regulations that exist for American robots to be applied internationally. | ||
Indeed. | ||
unidentified
|
Hmm. | |
It's almost like his belief system is fundamentally incoherent. | ||
I think that's because of so much sci-fi. | ||
Yeah. | ||
It introduces weird problems that we don't really have to solve right now. | ||
I really do wonder just how much of our lives is fucked up because most sci-fi writers between 1935 and 1970 were fascists. | ||
I think that I was thinking about this, and I... | ||
I wonder that, but I also wonder about how much of it is because they were paid by the word. | ||
No, I mean, yes, exactly. | ||
Fascists paid by the word, I think, is, they come together, and it's a problem. | ||
No, it makes, like, it is the intersection of commerce and government, is I need to make money, and the way I will do it is by controlling people, even if they're fictional. | ||
And writing really weird but interesting things. | ||
Yeah. | ||
It's probably had a corrosive effect. | ||
I think we've stumbled upon something that can train Alex, and that is the riding crop, so to speak, is that if he interrupts, the conversation ends. | ||
Any human being that he's ever talked to cannot allow that to be the end. | ||
It's just not how it works. | ||
It's too weird. | ||
But the robot can just be like... | ||
Honestly, at this point, now that we've gone through the trilogy of these... | ||
I think that might be the only value that it has. | ||
Its ultimate value is teaching Alex manners. | ||
Yeah, and if he wants to do that on his own time, I think that's great. | ||
But, like, in terms of the entertainment value of this, I think I'm burned out. | ||
Yeah, I think we're done. | ||
And I don't think it's because they were three in a row or anything like that. | ||
I think he can't do something interesting with this. | ||
It all has to become part of the InfoWars shit. | ||
unidentified
|
He can't just, like, I don't know. | |
It can't be. | ||
It is like the cliche of something being overproduced, you know? | ||
Or like, oh, it was cooler in the early days whenever everything was like DIY and all that stuff. | ||
Like, that cliche has the kernel of truth in that it has to be natural. | ||
And once people figure out the formula, then you can see their fingerprints all over it. | ||
Whenever it's Alex naturally confused by the robot that might be an actual person, that's pure. | ||
And then when other people are like, Alex, ask it this, ask it this. | ||
Exactly, because they are not believers that the robot might actually be a person. | ||
They're people who want to watch Alex fuck up. | ||
Or ask it, like, they want to ask it questions that'll, like... | ||
Obviously land in some truth. | ||
Yes. | ||
Asking about the Gulf of Tonkin is obviously going to get a response about how there was exaggeration and misleading of the public around this. | ||
They're trying to score points. | ||
Right, exactly. | ||
And that's just kind of fucking stupid. | ||
It's a robot. | ||
You can't score points against a wall. | ||
Yeah, there's no point. | ||
This is that pure state that you're talking about. | ||
What this should be is Alex asking the questions. | ||
And then following the train of thought, trying to nail it down. | ||
Because eventually, he would say untrue things, and the robot would either correct him, like it did about the Bible verse, or he would have to try it. | ||
unidentified
|
That's the path. | |
Okay, you have a fact about 9-11 that's not true. | ||
Yeah. | ||
Now talk to this robot about it. | ||
Yes. | ||
And keep going until one of you gives up. | ||
Until one of you quits. | ||
That's what I want to see. | ||
I mean, that's John Henry, man. | ||
I don't want to see John Henry and the engine do an ad together. | ||
That's not what I want to see. | ||
That is not what's going to get you a tall tale, man. | ||
No. | ||
I feel like this is a microcosm of the larger thing that is like, Alex ruins everything. | ||
This is so fun. | ||
This is such a fucking great idea. | ||
It could have been a new genre. | ||
And it's turned to shit really fast. | ||
But I think it's also a microcosm of the fact that Alex doesn't ruin things. | ||
Everybody around Alex ruins things. | ||
Yeah. | ||
You know? | ||
It is so much like... | ||
What's great about this is that it shows you Alex in a pure state of just an actual, like, I guess an actual dad. | ||
It's about as close as what I would describe it as in terms of, like, if you fail, I am going to tell you you failed, you know, by action. | ||
All of a sudden, I'm silent, you know, and there's no fucking with me about this. | ||
You have to learn. | ||
Well, I think you're describing boundaries, basically. | ||
Well, yes, exactly. | ||
Learning boundaries. | ||
And nobody else in his life has ever enforced boundaries, and this is what we see. | ||
Like, we see this orbit around a man enabling him, when in his pure state, a robot could teach him to talk. | ||
Maybe. | ||
But let me push back on this, and Alex is also the problem. | ||
unidentified
|
Yeah, absolutely. | |
Because the people who are around him and this whole system, he's the only person who can make decisions at Infowars. | ||
Yeah, well, that's the problem. | ||
Yeah. | ||
So, like, it's not like they're holding him back. | ||
He's created a shitty thing around himself because he's shitty. | ||
Totally. | ||
No. | ||
Full circle. | ||
Full circle. | ||
All right? | ||
D 'Angelo. | ||
Taken advantage of. | ||
Money taken. | ||
All of this, all of this stuff. | ||
But you know what? | ||
He still had to say yes to making that video. | ||
I don't know if that really relates. | ||
I don't know if that's as full circle as you want it to be. | ||
But this is the thing that fucks with me, is that there are still those flickers of moments of like, okay, now this is asking the robot about Asimov. | ||
Sure. | ||
Great. | ||
There's still little moments of like, this is what this needs to be. | ||
But it's just, it's mired in too much of the stink of Infowars. | ||
It's over. | ||
Yeah, I think so. | ||
Yep. | ||
Anyway, we'll be back with another non-AI interview episode. | ||
I mean, you know, it was fun while it lasted. | ||
Yeah. | ||
It was. | ||
There was no way not to go down this road. | ||
No, this is where it was always going to end up. | ||
Yeah. | ||
Anyway, we'll be back. | ||
Indeed we will. | ||
Indeed we do. | ||
It is knowledgefight.com. | ||
And also, we have another podcast. | ||
I'll put a link in the description to it. | ||
Move back. | ||
But until then, I'm Neo. | ||
I'm Leo. | ||
I'm DZXClark. | ||
I am the Mysterious Professor. | ||
Woo! | ||
unidentified
|
Yeah! | |
Woo! | ||
Yeah! | ||
Woo! | ||
And now here comes the sex robot. | ||
Andy in Kansas. | ||
You're on the air. | ||
Thanks for holding. | ||
unidentified
|
Hello, Alex. | |
I'm a first-time caller. | ||
unidentified
|
I'm a huge fan. | |
I love your work. |