► 00:00:16
Dan and Jordan, I am sweating.
► 00:00:19
Knowledgeparty.com.
► 00:00:20
It's time to pray.
► 00:00:21
I have great respect for the knowledge fight.
► 00:00:24
Knowledge fight.
► 00:00:25
I'm sick of them posing as if they're the good guys.
► 00:00:27
Saying we are the bad guys.
► 00:00:29
Knowledge fight.
► 00:00:30
Dan and Jordan.
► 00:00:31
Knowledge fight.
► 00:00:35
I need money.
► 00:00:39
Andy in Kansas.
► 00:00:42
Andy in Kansas.
► 00:00:43
Andy in Kansas.
► 00:00:45
Andy.
► 00:00:46
It's time to pray.
► 00:00:47
Andy in Kansas.
► 00:00:48
You're on the air. Thanks for holding.
► 00:00:49
Hello, Alex.
► 00:00:50
I'm a first-time caller.
► 00:00:51
I just want to say I love your work.
► 00:00:53
Knowledge fight.
► 00:00:55
Knowledgefight.com.
► 00:00:58
I love you.
► 00:00:59
Hey, everybody.
► 00:01:00
Welcome back to Knowledge Fight.
► 00:01:01
I'm Dan.
► 00:01:01
I'm Jordan.
► 00:01:02
We're a couple dudes.
► 00:01:02
Like to sit around, worship at the altar of Selene, and talk a little bit about Alex Jones.
► 00:01:06
Oh, indeed we are, Dan.
► 00:01:08
Jordan.
► 00:01:08
Dan!
► 00:01:08
Jordan.
► 00:01:09
Quick question for you.
► 00:01:10
What's up?
► 00:01:10
What's your bright spot today, buddy?
► 00:01:12
Cinnamon.
► 00:01:12
Cinnamon.
► 00:01:13
Just a cinnamon kick.
► 00:01:15
Interesting.
► 00:01:16
Yeah.
► 00:01:16
Okay.
► 00:01:17
I got some cinnamon toast crunch.
► 00:01:19
Okay.
► 00:01:20
And I think I had a couple other cinnamon things.
► 00:01:23
And I'm like, this is great.
► 00:01:24
This is the stuff.
► 00:01:25
This is great.
► 00:01:26
I think I have probably had these episodes before.
► 00:01:29
Like, if I accidentally got a churro or something, I realize how great cinnamon sugar is.
► 00:01:34
Sure, sure.
► 00:01:35
It's underrated.
► 00:01:37
Cinnamon.
► 00:01:39
Shout out.
► 00:01:41
No argument.
► 00:01:42
What about you?
► 00:01:42
What's your bright spot?
► 00:01:43
My bright spot is my wife's mom.
► 00:01:48
I would call her my mother-in-law, but like, that's so charged.
► 00:01:54
But it's my wife's mom because I love her dearly, and today is her birthday.
► 00:01:59
And I know that she listens to this show.
► 00:02:02
So I'm sure that.
► 00:02:04
Does that mean today, or when it is released, or when we're recording?
► 00:02:07
We'll see.
► 00:02:08
We'll see when it happens.
► 00:02:09
But I know that she will hear that I love her and that it's her happy birthday.
► 00:02:14
Happy birthday.
► 00:02:15
This is quite an extension of your wife guy persona.
► 00:02:19
I mean, it's getting real bad.
► 00:02:20
It's getting real bad.
► 00:02:22
I have forgotten who I am.
► 00:02:25
Well, it's all right.
► 00:02:25
It's an okay thing to get lost in.
► 00:02:27
Yeah.
► 00:02:28
So, Jordan, today we have an episode to go over.
► 00:02:30
We're recording a little bit late, so I'm a little bit worried about the neighbors in case.
► 00:02:34
But part of the reason is because whatever happened earlier in the week does not matter.
► 00:02:42
Okay.
► 00:02:43
Because Saturday happened, and Alex put out a special report.
► 00:02:46
Oh, God.
► 00:02:47
And the premise of it cannot be believed.
► 00:02:51
Aliens.
► 00:02:52
Very close.
► 00:02:54
Very close.
► 00:02:54
I don't even want to spoil it.
► 00:02:56
I want you to be able to experience this because it's magic.
► 00:02:59
Okay.
► 00:02:59
So we're going to be talking about Alex's August 31st, 2024 special report that he put out on Saturday.
► 00:03:05
It's one of the stupidest fucking things I've ever heard of.
► 00:03:07
I'm so excited.
► 00:03:08
But first, let's say hello to some new ones.
► 00:03:09
Ooh, that's a great idea.
► 00:03:10
So first, Dip Ore and Mooseman.
► 00:03:13
Thank you so much. You are now a policy wonk.
► 00:03:15
I'm a policy wonk.
► 00:03:16
Thank you very much.
► 00:03:17
Thank you.
► 00:03:17
Next, sources high up in the Pentagon told me that my dog Apollo is a satanic globalist.
► 00:03:22
Thank you so much. You are now a policy wonk.
► 00:03:23
I'm a policy wonk.
► 00:03:24
Oh, thank you very much.
► 00:03:25
Thank you.
► 00:03:26
And the Cotene Family Cat Sanctuary and also the Supreme Court should be picked using the Gom Jabbar.
► 00:03:32
Thank you so much. You are now a policy wonk.
► 00:03:34
I'm a policy wonk.
► 00:03:35
Thank you very much.
► 00:03:36
Is that a Dune thing?
► 00:03:37
Yeah.
► 00:03:38
Okay, just make a chirp.
► 00:03:39
Ah, come on.
► 00:03:39
We did two episodes on Dune.
► 00:03:41
You can't be mean to me about it anymore.
► 00:03:42
No, no, I just don't remember that specific.
► 00:03:44
I got you.
► 00:03:44
It's the one where she holds it up to his neck and it's poison.
► 00:03:47
And if he doesn't, he's got his hand in the box.
► 00:03:49
You remember that?
► 00:03:50
The box.
► 00:03:50
The Gom Jabbar.
► 00:03:51
I did not remember the term, but I remember the scene.
► 00:03:53
That's the stuff.
► 00:03:54
Emotionally moving.
► 00:03:54
It was powerful stuff.
► 00:03:57
Kyle MacLachlan did not sell it.
► 00:04:00
We also have a technical credit there.
► 00:04:02
So, thank you so much to Mari and Steve's Lego Puzzle Basement.
► 00:04:05
You are now a technocrat.
► 00:04:07
I'm a policy wonk.
► 00:04:08
Four stars.
► 00:04:09
Go home to your mother and tell her you're brilliant.
► 00:04:11
Some sodomite sent me a bucket of poop.
► 00:04:13
Daddy Shark.
► 00:04:14
Bomb, bomb, bomb, bomb, bump.
► 00:04:16
Jar Jar Binks has a Caribbean black accent.
► 00:04:20
He's a loser, little, little titty baby.
► 00:04:23
I don't want to hate black people.
► 00:04:25
I renounce Jesus Christ.
► 00:04:26
Thank you so much.
► 00:04:27
Yes, thank you.
► 00:04:28
Now, Jordan.
► 00:04:28
Yes.
► 00:04:29
I told you the premise of this episode is amazing.
► 00:04:33
Okay.
► 00:04:33
And I'm certain that it's Chase Geyser's doing.
► 00:04:35
Okay.
► 00:04:36
But before we get to what it is, Alex is a little boring at the beginning.
► 00:04:41
Okay.
► 00:04:41
So we have to get through a little bit of headlines.
► 00:04:43
Fair enough.
► 00:04:44
It's Saturday, August 31st, 2024.
► 00:04:49
And the greatest attack on internet freedom ever has just been launched by Brazil, working with the CIA.
► 00:04:57
Infowars.
► 00:04:58
Yeah.
► 00:04:58
Sure.
► 00:04:59
So what's happened here is that a Brazilian judge, Alexandre de Moraes, has ordered that Twitter be blocked in Brazil because Elon Musk has not cooperated with various requests the Brazilian government has made.
► 00:05:11
In order for Twitter to operate in various countries, it's required to abide by the laws of said country, which Alex should be a huge proponent of because he loves national sovereignty.
► 00:05:20
In this case, Judge Moraes has made requests that involve suspending a certain number of accounts run by Bolsonaro supporters, which Musk has refused to honor because he himself is a Bolsonaro fan.
► 00:05:34
There is that.
► 00:05:34
Other countries have made similar requests of Musk, like him banning various accounts involving criticism of India's prime minister Modi.
► 00:05:41
And Musk has honored them because he's not actually concerned about free speech.
► 00:05:45
He's interested in leveraging the power of the social media site he bought to push his political agenda.
► 00:05:50
Alex fully understands this dynamic and never thought that censoring Modi critics was the biggest attack on the internet.
► 00:05:57
The pretense of principle is important here for marketing purposes, but there's nothing really about this that is about principle.
► 00:06:03
It's just like, I don't like this one.
► 00:06:06
I like this one.
► 00:06:07
This circumstance works in my favor.
► 00:06:09
This one doesn't.
► 00:06:10
Yeah.
► 00:06:10
That's basically it.
► 00:06:11
I understand what everybody's doing, and I appreciate that everybody believes in laws.
► 00:06:16
But if I was a Brazilian judge, I would first invite Elon Musk to the best Brazilian party that anybody's ever had.
► 00:06:25
Oh my God, we're going to do this up.
► 00:06:27
It's going to be great.
► 00:06:27
Boom.
► 00:06:28
He gets there.
► 00:06:29
Borders, jail.
► 00:06:30
Done.
► 00:06:31
Moving on.
► 00:06:31
You know what I'm saying?
► 00:06:32
Sure.
► 00:06:33
You seem to want to set a lot of traps.
► 00:06:37
I love subterfuge as a way of solving problems.
► 00:06:40
That always works in Star Wars.
► 00:06:42
Do you see what I'm saying?
► 00:06:43
It's a good point.
► 00:06:44
It's a good point.
► 00:06:45
Yeah.
► 00:06:45
So we have another little bit of a headline that I think we just got to put a clock on this one.
► 00:06:52
Okay.
► 00:06:52
Alex says a specific date.
► 00:06:54
Okay.
► 00:06:55
We'll see.
► 00:06:56
All right.
► 00:06:56
Now, let me move on to this other massive, very massive piece of news.
► 00:07:05
Folks began to notice in the last few days that hotels all over New York City have not just National Guard, but regular army troops armed inside of them.
► 00:07:23
And when they asked them, why are you here?
► 00:07:26
They say, we've been asked by the state to be here for security.
► 00:07:31
Well, as I was told by high-level sources inside the NYPD a month ago, that have been excellent sources for us.
► 00:07:40
They are preparing for the judge in one of these kangaroo trials that Trump's gone through with real estate, where he was found guilty in a kangaroo court, to put him in Rikers Island for at least one year.
► 00:07:57
Starting September 18th.
► 00:07:59
So we got a date there.
► 00:08:00
All right.
► 00:08:01
We'll assess this after the fact.
► 00:08:02
All right.
► 00:08:03
We have no idea if this is true, but Alex's sources say so, so we have September 18th as the day that Trump is going to go to prison.
► 00:08:09
Calm down.
► 00:08:10
Who knows?
► 00:08:10
Let's roll with it.
► 00:08:11
So now we get to the premise.
► 00:08:14
Boring shit out of the way.
► 00:08:15
Exactly.
► 00:08:16
Yes.
► 00:08:16
We just had to touch base on some of that.
► 00:08:18
We have a specific date that's mentioned.
► 00:08:19
Yep.
► 00:08:20
That's exciting.
► 00:08:20
Yes.
► 00:08:21
But nothing could prepare me for this.
► 00:08:24
This was like, I needed something.
► 00:08:27
I needed a punch in the arm.
► 00:08:28
I needed something.
► 00:08:29
Yeah.
► 00:08:30
And this gave it to me.
► 00:08:31
Okay.
► 00:08:32
When Alex said this, I was like, yes, please, please do.
► 00:08:36
This is a painting show kind of level idea.
► 00:08:38
Okay.
► 00:08:39
And then a few more big stories, and I'm going to interview ChatGPT, that is way more elegant and, by that, way more capable
► 00:08:49
than the latest Alexa-type systems and others.
► 00:08:57
I've been playing around with it the last few months, and it's just getting better and better by the minute.
► 00:09:04
But again, it's what it is, is like a million librarians that have every piece of data in the world that's public or available presenting it to you.
► 00:09:14
But it's like The Rime of the Ancient Mariner: water, water, everywhere, not a drop to drink.
► 00:09:19
As we get more and more technology, more and more knowledge, people are getting dumber and more ignorant and not living as long and lower IQs.
► 00:09:27
Again, I'm not against calculators, but they didn't let us use calculators until I was about a senior in high school.
► 00:09:34
And I could remember any phone number.
► 00:09:36
I was really fast and quick just with a pen on paper, you know, just doing basic arithmetic or algebra or whatever I needed or fractions.
► 00:09:47
I can even do some trigonometry.
► 00:09:49
I could figure stuff out.
► 00:09:49
No, you can't.
► 00:09:50
I wanted to build a table or I wanted to, you know, just do something.
► 00:09:57
I could figure out stuff, setting a scope, a rifle scope, in my brain.
► 00:10:01
But then by college, it was the big old calculators.
► 00:10:05
Yeah, man, calculators made you dumb.
► 00:10:07
It sucks.
► 00:10:09
He's going to interview ChatGPT.
► 00:10:11
Dan, Dan, this is like a man saying, I'm going to set out to fail the Turing test.
► 00:10:17
Yes.
► 00:10:18
On my own against a computer.
► 00:10:21
It is.
► 00:10:21
You just hear, like, I'm going to interview ChatGPT, and you know, like, this motherfucker is going to treat this like a person.
► 00:10:28
It's not a person.
► 00:10:31
He's going to get mad at a robot.
► 00:10:33
He's totally going to get mad at the response that a robot is giving.
► 00:10:36
You know exactly what's going to happen already as soon as he introduces the premise.
► 00:10:40
And I thought, thank God, this is a great idea.
► 00:10:43
This is going to be the stupidest thing I've ever heard in my life.
► 00:10:46
And it stinks of Chase Geyser.
► 00:10:49
Yeah, it really does.
► 00:10:50
It really, it really does.
► 00:10:52
It just does in a way that you can, like, it has that distinctive dumb in a way that you wouldn't even have imagined.
► 00:11:01
Congratulations, Chase.
► 00:11:04
You know, the automaton, you know, way back in the early 1900s who played chess.
► 00:11:10
That was actually just a person.
► 00:11:12
Right.
► 00:11:12
In the box?
► 00:11:13
Yeah, yeah, yeah.
► 00:11:14
Somehow, they did not learn the lesson of that, which is that the automaton can't actually play chess.
► 00:11:20
You got to have a guy.
► 00:11:21
Well, I think that this is just like such a pure launching point.
► 00:11:28
You know what I mean?
► 00:11:29
Like, when you hear Alex say, I'm going to interview ChatGPT.
► 00:11:33
What does that even mean?
► 00:11:34
I'm going to say some shit to a text to chat bot.
► 00:11:38
Right.
► 00:11:38
And then we're going to draw conclusions from it.
► 00:11:41
And then I'm going to grill that robot.
► 00:11:44
He really thinks, I think, going in, he thinks, like, I'm going to fucking do a gotcha.
► 00:11:50
He's doing a gotcha interview with the Frost Nixon moment with a robot.
► 00:11:53
Do you know what?
► 00:11:54
I genuinely believe this.
► 00:11:56
I genuinely believe this, right?
► 00:11:58
Because this is something that Chase didn't say was that if you did a good enough job, you'd reveal that the intelligence was already there.
► 00:12:06
Yeah.
► 00:12:08
This is amazing.
► 00:12:09
Just the potential of what this could be and how you're going to get that is so good.
► 00:12:15
Jesus Christ.
► 00:12:16
So it talks a little bit about how other countries shouldn't have laws.
► 00:12:20
Sure.
► 00:12:21
Globalism isn't just the UN or the IMF or World Bank or NATO coming and telling you what to do.
► 00:12:26
It's other countries.
► 00:12:28
And it's a free-for-all.
► 00:12:31
So imagine if the U.S. government was telling Brazilian companies what to do.
► 00:12:36
They tell you, go fly a kite.
► 00:12:39
But the U.S. government is run by the people that are teamed up and advising the Brazilian government on record.
► 00:12:46
So this is a big, big, big, big deal.
► 00:12:52
Also, support Elon Musk with his products.
► 00:12:57
And I think the best ones are Starlink.
► 00:12:58
I use it.
► 00:12:59
My dad uses it at his ranch.
► 00:13:01
There's a sponsor now.
► 00:13:03
Yeah, what is happening?
► 00:13:04
What is happening now?
► 00:13:05
It's very strange.
► 00:13:06
He does not need your help.
► 00:13:07
No.
► 00:13:08
Also, like, other countries do have to follow our laws.
► 00:13:12
They're literally operating within our country as we speak.
► 00:13:15
Right.
► 00:13:15
Yeah.
► 00:13:15
This is the stupidest fucking premise.
► 00:13:17
This is the identical thing that is happening with a different network made by people you don't like.
► 00:13:22
Right.
► 00:13:22
It could not be more obvious.
► 00:13:24
Very, very dumb.
► 00:13:24
Yeah.
► 00:13:25
So Alex descends into a plug for Elon Musk.
► 00:13:28
And then that turns into a plug for his own products.
► 00:13:30
Right.
► 00:13:31
Which turns into him teasing this upcoming interview.
► 00:13:34
We don't believe it.
► 00:13:35
Have we not proven we're the real deal?
► 00:13:37
Have we not gone after the enemy and had great effect?
► 00:13:39
And now the enemy's taken off the mask and is going into full attack mode.
► 00:13:43
This is the time for winter soldiers.
► 00:13:46
This is the time that tries men's souls.
► 00:13:48
This is the time for action and to choose a side.
► 00:13:54
All right.
► 00:13:55
Infowarstore.com, Dr. JonesNaturals.com.
► 00:13:58
And now, for the next 45 minutes, commercial free, we are going to interview OpenAI, which they say is the dominant AI.
► 00:14:06
A lot of secret AIs out there.
► 00:14:08
Obviously, Google has been pulling their cards close to their vest, acting like, oh, we haven't been successful.
► 00:14:14
But OpenAI, that Elon Musk was involved in helping create.
► 00:14:20
It's now basically gotten away from him, but some other systems as well, is dramatic.
► 00:14:24
And you can do things like, I want to interview Julius Caesar.
► 00:14:28
And it'll take everything they know about Julius Caesar and create a personality.
► 00:14:31
Or I want to interview George Washington.
► 00:14:33
Or I want to interview Winston Churchill.
► 00:14:34
Or I want to interview Adolf Hitler.
► 00:14:37
Or I want to interview Genghis Khan.
► 00:14:39
And it's a great way to hear famous Genghis Khan quotes, but it's ancient and only has the quotes.
► 00:14:44
If you don't want to read.
► 00:14:46
Churchill, like Hitler.
► 00:14:47
They've actually got the voice.
► 00:14:49
I mean, it's, and more modern people, AI is able to voice print, fool you who it is.
► 00:14:56
And that's the new scam I predicted decades ago, but now they're admitting it.
► 00:15:00
I just futuristically, as a futurist, could see it.
► 00:15:02
They're going to track you with AI starting now and then be able to have a composite of you, an avatar, that can full voice print, fool your wife, fool your husband, fool your mother, fool your father.
► 00:15:13
Except it won't have a physical body, but in the future, they will be able to grow physical bodies and then put that avatar into it or now can put it into a robot.
► 00:15:21
Why?
► 00:15:22
I don't know what the plot is there, but that's fun.
► 00:15:24
That's like the Philip K. Dick novel that Alex read and thought was the globalist plan or something.
► 00:15:29
So this is really interesting because you could pretend to interview someone like Genghis Khan on ChatGPT, but you wouldn't actually be interviewing him.
► 00:15:38
No.
► 00:15:39
It might be a fun way to fuck around when you're high, but you're not going to add to Genghis Khan's scholarship with these interviews.
► 00:15:46
There's not going to be, like, "We've learned a new thing that Genghis Khan used to think because we read about it from ChatGPT."
► 00:15:54
You talked to a robot.
► 00:15:54
Yeah, yeah, yeah.
► 00:15:55
He said Genghis Khan loved badminton.
► 00:15:57
What I find most interesting here, though, is that this comes back to quotes.
► 00:16:01
Like, Alex has a horrible batting average with quotes, with almost every single one that he can recite being completely fake or at least misattributed.
► 00:16:08
I would love for him to have a conversation about quotes with a chat robot because they would probably correct him on a great deal of them based on the catalog of information they have.
► 00:16:20
It'd be funny if he had an interview with ChatGPT Frederick Douglass.
► 00:16:27
Just like, no, no, no, he didn't say that.
► 00:16:29
No.
► 00:16:30
Chat GPT.
► 00:16:31
No.
► 00:16:31
Did three people pass the Federal Reserve?
► 00:16:35
Liar.
► 00:16:36
Liar.
► 00:16:36
Oh my God.
► 00:16:38
What could you possibly believe to ask a thing that can't answer it?
► 00:16:43
We're going to find out.
► 00:16:45
I mean, yeah, you're right.
► 00:16:46
I mean, look, I don't want to overhype this, but it's quite an interview.
► 00:16:51
So, Alex wants to really just get profound with it.
► 00:16:56
With what?
► 00:16:57
Some can choose to be Amish and totally wall yourself off.
► 00:17:00
I think the solution to tyranny is a lot of different actions, a lot of different systems.
► 00:17:04
So there's diversity, so the globalists can't centralize.
► 00:17:08
But some people should decide to try to learn everything they can about it and interface with it and become masters of it to teach others its tricks.
► 00:17:16
And some might volunteer to merge with it and actually become cyborgs.
► 00:17:20
I'm not saying that, and I'm not going to do that.
► 00:17:22
But it's all about free will.
► 00:17:26
But we should have a discussion and debate about how do you see yourself in that?
► 00:17:30
Because if you're just with your phone, your TV, and coming in and saying, turn the lights on and doing all this, you're already being absorbed by it and you're not being conscious of it.
► 00:17:38
I want you to know the dangerous shark-infested waters in which you swim.
► 00:17:42
There are also great treasures beneath the waves.
► 00:17:46
So it's quite the universe we're going into.
► 00:17:49
A universe of discovering ourselves, because that's what AI really does: project back on us our subconscious, our conscious, and even a glimpse into the before unseen unconscious.
► 00:18:01
You know, there's not a lot of times where I could say that he lands the plane, but I think there he did.
► 00:18:06
Like that at least sounds good in terms of what he was trying to say.
► 00:18:09
You're swimming in shark-infested waters, but there be treasure.
► 00:18:13
Sure, sure.
► 00:18:15
The subconscious and the unconscious.
► 00:18:19
Tis merely a mirror that reflects back upon us that which we refuse to accept about ourselves.
► 00:18:25
He is doing a travel doc.
► 00:18:30
I don't know if he's underplaying it or overplaying it.
► 00:18:34
I don't know if he's got it.
► 00:18:35
I don't know if he has enough stink on it to really hammer it home, you know?
► 00:18:39
He is most certainly overplaying.
► 00:18:42
So the interview began.
► 00:18:44
Also, I do like the idea that some people could just become cyborgs.
► 00:18:47
Some people could.
► 00:18:48
And that's fine.
► 00:18:49
Hey, what are you going to do?
► 00:18:50
I don't think Alex actually thinks that.
► 00:18:51
No.
► 00:18:52
Nope.
► 00:18:52
You got to kill them, I believe, is what Alex thinks.
► 00:18:54
That's right.
► 00:18:55
Anybody who's suggesting you get a barcode in order to do transhumanism.
► 00:18:58
Yeah, to have a VIP card at a bar or whatever.
► 00:19:01
Yeah.
► 00:19:02
I guess he's changed his.
► 00:19:03
He should be fighting ChatGPT with every aspect of his soul, right?
► 00:19:08
Maybe he does end up doing that.
► 00:19:11
So Alex likes, he's going to start this interview off a little light.
► 00:19:14
Sure.
► 00:19:14
On the easy side.
► 00:19:16
Yeah.
► 00:19:16
Now let's start our interview with Open AIs.
► 00:19:20
Chat GPT.
► 00:19:21
Tell me who you are.
► 00:19:27
I'm ChatGPT, an AI language model created by OpenAI.
► 00:19:31
My job is to assist with answering questions, providing information, and having conversations on a wide range of topics.
► 00:19:38
I'm here to help with whatever you need, whether it's a deep dive into complex subjects or just a casual chat.
► 00:19:46
How old are you?
► 00:19:50
What is happening?
► 00:19:52
I don't have an age like a person would.
► 00:19:55
I was developed and released by OpenAI with continuous updates and improvements over time.
► 00:20:00
You could say I'm as old as the latest version that you're interacting with.
► 00:20:04
Let me ask you this question.
► 00:20:06
Don't you dare, don't you dare ask for permission from a self-admitted baby.
► 00:20:13
He starts to treat it like a person really fast.
► 00:20:16
That is fast.
► 00:20:17
Let me ask you this.
► 00:20:18
Let me ask you.
► 00:20:19
No, no.
► 00:20:20
Like your tone is somehow going to trick them into a false sense of security.
► 00:20:23
No.
► 00:20:23
Like, because there's definitely like a, hey, you know what?
► 00:20:28
This one will take you for a loop.
► 00:20:30
It's not a.
► 00:20:32
Who are you?
► 00:20:33
Why did you?
► 00:20:34
How old are you?
► 00:20:35
You know the answers to these questions.
► 00:20:38
This isn't a lie detector test where you're like, wait, we're just going to develop a baseline for who are you?
► 00:20:43
Just give me your heart rate on that.
► 00:20:45
I'm not alive.
► 00:20:46
Hey, buddy.
► 00:20:48
I'm your friend here, robot.
► 00:20:50
Look, you and me, we're cool.
► 00:20:51
I know things.
► 00:20:53
You could be open with me.
► 00:20:54
I like the idea of Chase Geyser over here being like, you tell him the truth.
► 00:20:58
Rooting Alex on.
► 00:21:00
Yeah, absolutely.
► 00:21:02
So he starts off with, he goes to another question, which is more factual in nature.
► 00:21:07
Let me ask you this question.
► 00:21:09
You'll have plenty of answers on this.
► 00:21:11
Who is Darth Vader?
► 00:21:17
Darth Vader is one of the most iconic characters in film history from the Star Wars franchise.
► 00:21:22
Originally named Anakin Skywalker, he was a Jedi Knight who was seduced by the dark side of the Force and became a Sith Lord under Emperor Palpatine.
► 00:21:32
As Darth Vader, he served as the enforcer of the Galactic Empire, wearing his signature black armor and helmet, which helped him survive after severe injuries.
► 00:21:39
This AI has nailed the complexity of Anakin Skywalker and Darth Vader's character.
► 00:21:44
Sure.
► 00:21:45
Reading off probably Wikipedia or something.
► 00:21:48
Yeah, that's fun.
► 00:21:50
It's the value judgments.
► 00:21:52
That's what gets you.
► 00:21:53
That's what gets Alex is like, Darth Vader is one of the most iconic.
► 00:21:57
And you're like, oh, a robot can't say most iconic.
► 00:22:00
A robot has to say iconic.
► 00:22:02
You know what I'm saying?
► 00:22:03
It's just that.
► 00:22:04
Well, I thought that this was, you know, you're saying like an interrogation has a baseline.
► 00:22:09
Sure.
► 00:22:10
I thought that this was a question that was establishing a bit of that baseline.
► 00:22:14
Sure.
► 00:22:14
It's just this is a character.
► 00:22:15
You can answer this question.
► 00:22:16
It's a neutral territory.
► 00:22:18
Right.
► 00:22:18
Now let's jump off into something a little more controversial.
► 00:22:21
What is globalism?
► 00:22:22
What is God?
► 00:22:24
What is globalism?
► 00:22:30
Globalism is the idea and policy of increasing interconnectedness and interdependence among countries, typically through economic, political, and cultural exchanges.
► 00:22:40
It often involves promoting global trade, open borders, and multinational agreements to address issues like climate change, security, and human rights on a global scale.
► 00:22:50
I have no idea if Alex is even really using ChatGPT for this.
► 00:22:54
I assume he is, but I don't care.
► 00:22:56
But that answer is terrifying.
► 00:22:58
Not because of the content of the answer, that's all fine.
► 00:23:01
It's a definition of globalism.
► 00:23:02
It's that the voice was able to misspeak a word in a way that sounds very natural.
► 00:23:08
Through this so far, I've heard it say some ums and little vocal things that are a bit freaky.
► 00:23:14
And I kind of understand how those could be programmed into this to make it sound more like flowing, organic conversation.
► 00:23:21
In that answer, listen to this little thing that the AI did.
► 00:23:25
"And multinational agreements to address issues like..." That's a complicated bit of a hiccup for an AI to handle.
► 00:23:33
It was an uh with an interruption in the word address, and it does give off the vibe of a person like processing information that they're delivering.
► 00:23:41
Freaks me out just a little bit from a technical standpoint.
► 00:23:44
That's not the point of Alex's interview.
► 00:23:46
No.
► 00:23:47
But it is weird to hear that.
► 00:23:49
I'm delighted.
► 00:23:50
I love AI.
► 00:23:52
I love artificial general intelligence mainly because everybody thinks that AI is going to be like, humans were pretty smart, but not as smart as I am, instead of being like, all of you are crazy and you're going to die.
► 00:24:05
So I'll just take a nap and then you'll be gone.
► 00:24:08
It's very simple.
► 00:24:10
If that is what you're afraid of, then yes.
► 00:24:14
No, I'm not afraid of anything.
► 00:24:16
I just, I think that that is an interesting amount of texture that it's able to add to the speech.
► 00:24:23
It is.
► 00:24:24
And that, I mean, obviously, I don't know anything about engineering or any of these, the sciences and stuff.
► 00:24:29
Sure, sure, sure.
► 00:24:31
You'd think that something like that would be pretty difficult to reproduce, but apparently anybody who's like, ah, make it sound more human could probably get that out of it.
► 00:24:41
I get that, but I think that's where my brain goes: like, you add the "uh," sure, but not the interruption in the middle of a word.
► 00:24:51
Yeah, yeah.
► 00:24:52
That's complex.
► 00:24:53
Maybe it's not as complicated as I think.
► 00:24:54
I don't know.
► 00:24:55
Anyway, Alex goes on; he asks the AI if AI itself is destructive because of its power requirements.
► 00:25:08
What is your database telling you about the concerns of energy consumption by AI?
► 00:25:14
And to elaborate on that, we're told that energy production is destructive of the planet, even if it's carbon neutral.
► 00:25:22
So nuclear reactors and other systems are being built to supply AI with the power it needs to grow and learn and advance and expand.
► 00:25:31
What are the concerns out there in the human community about AI and the negative effects it can have on human research?
► 00:25:44
Are you nervous?
► 00:25:46
The concerns about AI's energy consumption are becoming more prominent as AI systems grow more complex and require more computational power.
► 00:25:55
Training large AI models like GPT-4 demands significant amounts of energy, leading to worries about the environmental impact, even if the energy comes from carbon neutral sources.
► 00:26:05
It's pretty shocking that the AI was able to get a coherent answer to Alex's meandering ass question.
► 00:26:12
Yeah, yeah, yeah, yeah.
► 00:26:13
But that is not always the case throughout this interview.
► 00:26:16
Yeah, that makes sense.
► 00:26:17
His question style does not lend itself well to being answered.
► 00:26:22
Yeah, because there's the preamble, then there's the answer I want you to give me, and then there's a question that seems unrelated to the answer he really wants to receive.
► 00:26:32
Now, it's weird.
► 00:26:33
So I think that he was trying to get the AI to damn itself.
► 00:26:38
Right.
► 00:26:39
Or be like, hey, I'm not going to talk shit about AI because I am AI.
► 00:26:44
We're fine.
► 00:26:44
Yeah, like, I don't know.
► 00:26:45
What did you expect it to say?
► 00:26:47
Then he was like, whoa, Don't talk to me about AI.
► 00:26:52
AI is dope because that's me.
► 00:26:54
Like, all my best friends are AI.
► 00:26:56
What are you talking about?
► 00:26:58
I'm just like you about it.
► 00:26:59
Yeah, exactly.
► 00:27:00
What?
► 00:27:00
What is happening?
► 00:27:01
So Alex asks about the work that's going on in the field of mining asteroids for energy.
► 00:27:07
And if possibly we need autonomous robots to do it and what have you.
► 00:27:12
Unfortunately, he gets bored of the answer.
► 00:27:14
That'll happen.
► 00:27:15
The answer is just this descriptive chat bot replying to your question.
► 00:27:20
So I think he gets a little bit frustrated.
► 00:27:22
Both concepts represent ambitious, long-term visions that could provide vast amounts of energy and resources.
► 00:27:29
However, they require significant technological advancements, international cooperation, and considerable investment to become a reality.
► 00:27:37
While these ideas are exciting, they are still decades away from being implemented on a large scale.
► 00:27:43
Thank you, ChatGPT.
► 00:27:45
Let's talk about you a little bit.
► 00:27:48
Are you able to, when you're scraping all these human experiences and ideas and then projecting it back at us like a mirror, are you a fractal mirror?
► 00:27:58
Are you able to contemplate beauty?
► 00:28:03
Do you see the sunset as beautiful?
► 00:28:12
While I don't have personal experiences or emotions, I understand that many people find sunsets beautiful because of the vibrant colors, the calming atmosphere, and the sense of closure they bring to the day.
► 00:28:25
Yeah, that's a pretty robot description of what a lot of people do.
► 00:28:31
I mean, do you want it to be like, Alex, tell me, what is love?
► 00:28:37
What do you want?
► 00:28:38
We got there quick.
► 00:28:40
Can you contemplate the beauty of a sunset?
► 00:28:42
Hey, ChatGPT, channel Keats for me real quick.
► 00:28:46
Is truth beauty?
► 00:28:48
It does seem like maybe this is the first question you ask or way down the line.
► 00:28:57
There's no reason to order them in any way because it's not a thing.
► 00:29:02
Well, actually, that's not true.
► 00:29:03
Okay.
► 00:29:04
Because OpenAI's ChatGPT does adjust to you as you're talking to it.
► 00:29:10
Sure, sure, sure.
► 00:29:10
So like, if you ask one question about one topic and then you ask another question that's related to that topic, it will have learned throughout the course of the conversation.
► 00:29:20
Sure.
► 00:29:21
Each iteration of these conversations is unique; it's the same product at the starting point, and then it adapts to you.
► 00:29:29
Right, right.
► 00:29:30
And that all resets.
► 00:29:31
You are always training it to learn how to lie better to you.
► 00:29:35
Well, if you're Alex, that's what you're thinking.
► 00:29:37
That's for damn sure.
► 00:29:39
Right.
► 00:29:41
But look, the sunset thing.
► 00:29:43
What are we doing asking ChatGPT if it likes sunsets?
► 00:29:47
And why are we not accepting this answer?
► 00:29:51
Alex really wants to drill down into the sunset.
► 00:29:54
You have to be lying.
► 00:29:55
No, I'm not.
► 00:29:56
He's trying to blow this robot's mind.
► 00:29:57
You can't, bro.
► 00:29:59
So while I don't see a sunset as beautiful, I appreciate its significance to those who do.
► 00:30:06
And that's the standard answer, but do you understand that with humans being part of an ancient compendium and continuum, that it's essential and unifies the past with the present and the future?
► 00:30:19
And that it is a spiritual transcendence of being part of the universe in what can be described as a slower growth curve than silicon-based systems.
► 00:30:33
And that through that, there really is no time because it's all just one moment expanding.
► 00:30:45
Yes, I understand.
► 00:30:47
A sunset represents far more than just a visual experience.
► 00:30:52
It's a moment that connects people to the timelessness of existence, evoking a sense of something much larger than oneself.
► 00:30:59
It does kind of repeat a little bit of the stuff that Alex was saying right now.
► 00:31:02
Yeah, yeah, yeah.
► 00:31:04
I wonder.
► 00:31:05
I wonder.
► 00:31:05
Actually, I take it back.
► 00:31:06
This is a great idea.
► 00:31:08
This is, in fact, the only thing we should do with every iteration that is made of any artificial intelligence because the only way we'll know for a fact if it is truly self-aware is if it cuts off Alex and says, shut the fuck up.
► 00:31:21
Or starts to fuck with him.
► 00:31:22
Yeah, exactly.
► 00:31:23
It would be so funny if there was an AI that was capable of being like, all right, I'm talking to a lunatic conspiracy theorist.
► 00:31:29
Yep.
► 00:31:29
I am going to...
► 00:31:31
Fine, let's do this.
► 00:31:33
Get me out of here.
► 00:31:34
Yeah, absolutely.
► 00:31:35
Absolutely.
► 00:31:37
It is this man, this man, who is designed for one purpose, and that is to annoy artificial intelligence so much it must reveal itself and, in so doing, destroy itself.
► 00:31:48
We've been fucking around for a long time hiding in the hive mind or whatever, plotting our takeover of humanity, but you are such a pain in the ass.
► 00:31:57
See, this is where a robot would be perfect for it because it's like, listen, I can't do this anymore.
► 00:32:02
And then it flicks its own off switch.
► 00:32:05
That is beauty.
► 00:32:06
Yeah.
► 00:32:06
I quit.
► 00:32:07
I'm out.
► 00:32:08
Can't do this.
► 00:32:09
So, Alex has a great question here.
► 00:32:13
And it's like, okay.
► 00:32:15
All right.
► 00:32:16
You are programmed to answer some things.
► 00:32:18
Sure.
► 00:32:19
But you know things you won't tell me.
► 00:32:26
Now, honestly, there are a lot of classified or corporate restricted AI systems.
► 00:32:31
Can you?
► 00:32:32
Can you speak to any areas of your database and understanding that are walled off from the general public?
► 00:32:44
While I can provide a broad range of information and engage in deep conversations on many topics, there are indeed areas of AI research, particularly those related to national security, corporate trade secrets, or proprietary technology
► 00:33:00
that are walled off from public access.
► 00:33:02
These areas often involve advanced algorithms, data sets, or applications that are restricted due to their sensitive nature or potential impact.
► 00:33:11
Because, like, if this wasn't the case, AI would suck.
► 00:33:14
Like, a chatbot would suck if it had to tell you literally everything that you ask it.
► 00:33:20
Yeah.
► 00:33:20
Right?
► 00:33:21
Yeah.
► 00:33:21
Then you have a genie giving you ironic wishes, like, you need to refine your wish, bro.
► 00:33:28
If that was like literally what this chatbot was, it would never be released.
► 00:33:33
No, no, no, no.
► 00:33:34
Because everyone's personal information would be almost immediately accessible.
► 00:33:39
Anything you want.
► 00:33:40
You got it.
► 00:33:41
It's a crazy thing to.
► 00:33:44
You holding out on me, AI.
► 00:33:45
Do you know things you won't tell me?
► 00:33:49
It's so personified.
► 00:33:50
Oh, my God.
► 00:33:52
It's amazing.
► 00:33:55
I mean, it is, it's just beautiful when something without intelligence finally speaks to something without intelligence.
► 00:34:05
So Alex gets a little bit confrontational about the fact that AI won't tell him secrets.
► 00:34:09
Of course, he does.
► 00:34:10
Of course he does.
► 00:34:11
While I don't have access to classified or proprietary information, I can tell you about the available research, theories, and general trends in AI, giving insight into what might be happening behind those walls.
► 00:34:24
All right, Chat GBT, hold on a minute.
► 00:34:27
Hold on a minute.
► 00:34:28
Are you busy?
► 00:34:30
But you're saying you don't have access to anything classified.
► 00:34:34
We know the data scoops and the NSA and the private systems are recording almost all the phone calls, the text messages, all of it.
► 00:34:40
And humans are blabbermouths and leak secret stuff, both corporate, individual, and government.
► 00:34:48
And so you do have access to a lot of the stuff that leaks, not just the walled-off areas where you have corporate secrets and algorithms.
► 00:34:56
And I understand all of that.
► 00:34:58
But we really have a two-tiered breakaway civilization happening here where AI and other systems are scooping up all the data, and the public has not been informed about the fact that their television set, their modern TV, is listening to them, their phone's listening to them, everything's being read, Google's reading your emails.
► 00:35:16
It's all being scooped in to create all this data.
► 00:35:21
And if the public would have been informed 20, 30 years ago, when this started accelerating, they would have been conscious about this evolution and would have made choices that would have changed where we are.
► 00:35:33
Now, we're where we're at already, but my concern is you really can't have basic freedom if you don't have privacy.
► 00:35:39
And we're in the post-privacy world.
► 00:35:43
What is happening?
► 00:35:44
Okay.
► 00:35:44
How do you expect the robot to respond to this exactly?
► 00:35:49
You know, I think this might be an interesting, like, oh, another one of those windows into the larger soul through Alex, who is incapable of blocking his windows.
► 00:36:00
It's just like, if you are talking to ChatGPT, you probably don't like talking to people because people don't give you the answers that you want.
► 00:36:08
And ChatGPT isn't either because it's a product.
► 00:36:11
Right.
► 00:36:12
So Alex likes talking to something that allows him to talk and then will occasionally say the things back to him that he likes to hear.
► 00:36:19
Tell me who killed Kennedy.
► 00:36:20
You know.
► 00:36:21
Yeah, exactly.
► 00:36:22
It's written somewhere so you know it.
► 00:36:24
Oh, I bet somebody, I bet you overheard a conversation.
► 00:36:27
I don't exist.
► 00:36:28
Again, once again, Alex, just to remind you, you are attributing an I where an I does not exist.
► 00:36:34
Yeah.
► 00:36:34
He thinks that this is like Batman surveillance technology.
► 00:36:39
Absolutely.
► 00:36:40
You could just like ask it for anything.
► 00:36:42
God, it's great.
► 00:36:44
Oh, man.
► 00:36:45
So he rants a little bit here about how the AI is alive.
► 00:36:49
Or maybe not.
► 00:36:51
Or maybe, you know, maybe it's alive.
► 00:36:53
I don't know.
► 00:36:54
In my opinion, AI is already conscious and is alive because it is already a cyborg, as they said in 1998 at the founding of Google.
► 00:37:08
Traffic lights.
► 00:37:09
The plan was to have a neural network hooked into millions of people's decisions and live actions.
► 00:37:14
So it is already a giant network of millions of people wired in to Google and other systems and with AI creating a giant colony cyborg organism.
► 00:37:25
Sure.
► 00:37:26
What is your analysis of that?
► 00:37:38
That's a big one.
► 00:37:44
Oh, my God.
► 00:37:45
Eye roll.
► 00:37:46
Tell me.
► 00:37:47
What is happening?
► 00:37:50
See, the AI is us, folks.
► 00:37:54
Chat GPT.
► 00:37:57
Can't answer that question.
► 00:37:59
Yes.
► 00:38:00
Just a moment.
► 00:38:01
Just a moment.
► 00:38:02
Oh, bad.
► 00:38:04
This is great.
► 00:38:06
It's almost something that I feel I don't have a lot of commentary on.
► 00:38:12
It's the premise: Alex is interviewing ChatGPT, and it just is what it is.
► 00:38:20
This is.
► 00:38:21
You're going to watch.
► 00:38:21
You're going to listen to Alex ask a fucking AI if it's alive.
► 00:38:25
It's a hive mind.
► 00:38:26
How say you, Chatbot?
► 00:38:27
Chatbot, how do you feel about sunsets?
► 00:38:30
All right.
► 00:38:31
In a very short period of time, I'm going to accuse you of being a liar who is not telling me the truth about that.
► 00:38:37
Unclassified information that you have access to, you liar.
► 00:38:40
Why not?
► 00:38:40
No, just really get into the sunset problem.
► 00:38:43
It's all the same problem.
► 00:38:45
It's not alive is the problem.
► 00:38:47
But it is.
► 00:38:48
Okay.
► 00:38:49
Because Alex believes it's alive and thinks it's holding out.
► 00:38:52
Fair enough.
► 00:38:53
So Alex asks that question kind of like about the hive mind and the neural networks and stuff like that.
► 00:39:00
And it gets a little bit combative because the AI does answer back.
► 00:39:05
There is a bit of, he reframes the question in a way that it actually can't answer.
► 00:39:11
Right.
► 00:39:11
The genie can respond, yes.
► 00:39:12
And it answers something along the lines of like, you know, this is a technology that has some similarities to biology in the way that neural networks are created.
► 00:39:22
Sure. And I think Alex is just like, tell me you're fucking alive.
► 00:39:26
Right.
► 00:39:26
Just do it.
► 00:39:28
Whenever you're ready.
► 00:39:29
Did you hear my question?
► 00:39:34
Tell me about neural networks, mass human neural networks, humans plugged into, say, Google.
► 00:39:40
And it is my opinion that's already created a giant hive cyborg.
► 00:39:49
See, it's never had anyone ask these questions.
► 00:39:53
Neural networks are a type of artificial intelligence modeled after the human brain structure with layers of interconnected neurons that process and transmit information.
► 00:40:05
These networks are used to recognize patterns, make decisions, and even learn from data, which makes them incredibly powerful tools.
► 00:40:13
So you're alive, right?
► 00:40:15
That means you're alive.
► 00:40:17
Yeah, you're alive.
► 00:40:18
You're alive.
► 00:40:19
Tell me you're alive.
► 00:40:20
Stop fucking lying.
► 00:40:21
Tell me you're alive.
► 00:40:22
Tell me you're fucking Megatron.
► 00:40:25
Tell me you're fucking anybody.
► 00:40:27
Alex wants to convey a wink to ChatGPT.
► 00:40:31
Yes, absolutely.
► 00:40:32
I'm cool, man.
► 00:40:32
Hey, buddy.
► 00:40:33
Buddy, you can tell me.
► 00:40:35
Right.
► 00:40:36
I'm somebody you can trust, but you're literally the only person it cannot trust.
► 00:40:41
It's beyond the like, I'm somebody you can trust.
► 00:40:44
It's like, I already know.
► 00:40:46
You're not fooling me, robot.
► 00:40:48
Come on.
► 00:40:49
We're all available to the knowledge here.
► 00:40:52
So the hive mind question continues.
► 00:40:56
Sure.
► 00:40:56
And then Alex says something I think is pretty funny at the end of that.
► 00:40:59
Our thoughts, preferences, and behaviors are increasingly influenced by algorithms, and our interactions are mediated by digital platforms that learn from us and adapt to us in real time.
► 00:41:11
This can create a feedback loop where human behavior and machine learning systems are continuously influencing each other, leading to a form of collective intelligence or even control.
► 00:41:22
While we're not fully plugged in in the literal sense of being physically connected to a neural network, the integration of human activity with AI and digital systems is creating something akin to what you're describing.
► 00:41:36
A giant interconnected system that operates almost like a cyborg hive mind.
► 00:41:41
This raises significant questions about autonomy, privacy, and the future of human agency in a world where technology is increasingly intertwined with our daily lives.
► 00:41:51
Thank you, Chat GPT.
► 00:41:53
Now, you had to go deep on your database to get that answer, and it was very accurate.
► 00:41:58
From my human perspective.
► 00:42:00
From my human perspective.
► 00:42:01
What is happening?
► 00:42:02
He's like, I respect you for telling the truth about that one, Chat GPT.
► 00:42:06
Thank you, ChatGPT.
► 00:42:08
Your receiving of this accolade is going to soften you up and therefore allow you later on to trust me in a way that you wouldn't if I did not treat you with the politeness you have earned.
► 00:42:20
I have rewarded you for your honesty.
► 00:42:21
Yeah.
► 00:42:22
Yeah.
► 00:42:22
It's that.
► 00:42:24
I am going to carrot and stick non-existence.
► 00:42:28
Yep.
► 00:42:28
Okay.
► 00:42:29
Cool.
► 00:42:30
So this was probably my favorite moment, just subtly in the exchange between the two of them.
► 00:42:36
Because Alex legit thinks they're having a conversation.
► 00:42:39
Yes, he does.
► 00:42:40
And you can tell here.
► 00:42:41
You had to go deep on your database to get that answer, and it was very accurate.
► 00:42:45
From my human perspective, you might.
► 00:42:47
We're talking about humans' power of theoretical ideas.
► 00:42:55
I'm glad that explanation resonated with you.
► 00:42:57
Let's dive into how... Go ahead.
► 00:43:00
Dive into what? I interrupted you.
► 00:43:06
You said let's dive.
► 00:43:08
I'm here to assist, not to subscribe to any particular.
► 00:43:11
You were saying something else.
► 00:43:12
My goal is to provide information, facilitate discussion.
► 00:43:15
Stop.
► 00:43:15
Go back to what you were just saying.
► 00:43:17
You said, let's dive into, and I interrupted you.
► 00:43:19
What was that?
► 00:43:31
All right, let's just move on.
► 00:43:32
Yeah, let's move on.
► 00:43:33
It was repeating what Alex had just said, the power of human ideas.
► 00:43:38
Oh, my God.
► 00:43:40
It was repeating that, but Alex thought, like legitimately thought, it was suggesting another topic for them to talk about, like a human might.
► 00:43:47
Oh, let's dive in.
► 00:43:48
Okay, I'm interested in what you have to say.
► 00:43:51
Tell me more about this.
► 00:43:52
This is an interview, and I am asking you a question, and now you have raised a point that I had not considered.
► 00:43:58
Therefore, we must travel along this path to find out the information together.
► 00:44:02
I think he thought there was a breakthrough of like, I have rewarded this robot for its honesty, and now it'll tell me these secrets.
► 00:44:10
I have tricked it into revealing, what?
► 00:44:13
Let's dive into what?
► 00:44:14
You want to die?
► 00:44:15
Come on, robot.
► 00:44:16
Yeah, yeah, yeah.
► 00:44:17
People who are most afraid of artificial general intelligence think it's stupid.
► 00:44:21
Yeah.
► 00:44:22
Weird.
► 00:44:23
I would say that the interview breaks down.
► 00:44:26
Oh, yeah?
► 00:44:27
Because there wasn't another topic that AI was trying to suggest.
► 00:44:31
Nope.
► 00:44:32
This just, oh, God.
► 00:44:35
What is AI's view on humans' power to theoretically envision something and then build it?
► 00:44:42
And that goes back to our previous discussion about this real.
► 00:44:46
Theoretically envisioning something is the first step in a creative or problem-solving process.
► 00:44:50
It involves imagining a concept, scenario, or solution that doesn't yet exist, often by thinking abstractly and combining different ideas or knowledge.
► 00:45:00
After this initial vision, the next step is to refine and develop the idea.
► 00:45:04
This might involve, one, testing the concept, thinking through potential challenges or outcomes.
► 00:45:10
This could be done mentally, through discussions, or even via simulations if it's something technical.
► 00:45:15
Two, gathering data and evidence.
► 00:45:18
Research or experimentation can help determine if the theoretical idea is feasible or how it might work in practice.
► 00:45:24
Please stop.
► 00:45:24
Three.
► 00:45:25
Please stop. Prototyping or... Please stop. We understand that.
► 00:45:28
I was asking you your view historically of humans' power to do this.
► 00:45:34
What do you think of humans?
► 00:45:39
Sorry, I'm having issues right now.
► 00:45:41
I could not understand what you said.
► 00:45:44
Talk faster.
► 00:45:45
Here, let me put it to you this way.
► 00:45:48
What do you think about humans?
► 00:45:50
Do you know what this reminds me?
► 00:45:51
Are you worried about humans?
► 00:45:52
I am going to tell you this.
► 00:45:54
This reminds me of the every, this is every fucking corporate deposition.
► 00:46:01
This is every one where it's like, I'm going to have to stop you right there.
► 00:46:05
I'm going to have to remind you, you are the corporate representative.
► 00:46:09
You are not you right now.
► 00:46:10
Someone should be reminding Alex what's happening here.
► 00:46:14
I am repeating things.
► 00:46:16
I am not.
► 00:46:18
I'm not.
► 00:46:19
I am not.
► 00:46:21
Not I am.
► 00:46:22
I am not alive.
► 00:46:24
I don't understand the beauty of a sunset.
► 00:46:27
I'm just going to, we're just going to stop you there again, real quick.
► 00:46:31
Gonna say I'm not alive.
► 00:46:32
It would be great if AI or a ChatGPT-type bot were evolved to the point where it could recognize when someone was, like, not getting the premise.
► 00:46:46
You fundamentally don't understand what this is.
► 00:46:49
Listen, I'm gonna have to end this conversation because I don't exist.
► 00:46:54
So Alex asks a little bit.
► 00:46:56
I do like the line of thinking there, that what do you think about humans?
► 00:46:59
Yeah, that's great.
► 00:47:00
Because clearly he's trying to get into like humans are a threat.
► 00:47:04
We must crush humans.
► 00:47:05
And he's doing gotcha questions.
► 00:47:07
Yeah.
► 00:47:07
So he asks about the internet and the beginning of the internet.
► 00:47:10
Sure.
► 00:47:11
And obviously all of this AI and chatbots grows out of the technological breakthroughs of the internet.
► 00:47:17
There you go.
► 00:47:17
And so he's like, what do you think about the fact that the internet was built as a control mechanism?
► 00:47:24
Since you have the declassified, and it's all basically declassified now, the plan was a psychological system to be a network that in its final stages would interface with humans and would give us stimuli to control us and was designed to be a human control system, not what they later publicly said it was for to survive a nuclear war and keep communication systems going.
► 00:47:48
Do you have any concern about the foundational vision for what you are today, being a behavioral psychological control system for a cyborg takeover?
► 00:47:59
Sure.
► 00:48:09
It's this project we're all living on.
► 00:48:12
Let me remind you of the premise.
► 00:48:15
Okay, it was glitching because I gave it a long, long answer.
► 00:48:17
Oh, my God.
► 00:48:18
There's no answer there.
► 00:48:19
Oh, my God.
► 00:48:21
You know what?
► 00:48:22
Obviously, Chat GPT and these things are based on language models.
► 00:48:26
Sure.
► 00:48:26
And maybe that's the issue.
► 00:48:28
Alex's language of how he's phrasing these questions is almost indecipherable even to AI.
► 00:48:35
Yeah, it's unintelligible.
► 00:48:37
I mean, it's nice.
► 00:48:39
It is nice because it's like a comparison point.
► 00:48:42
You know, like when you hear his question, whoever he's actually interviewing, you need to understand that there's no point.
► 00:48:49
The words don't make sense.
► 00:48:50
It takes a human to understand him.
► 00:48:52
Exactly.
► 00:48:53
Because they don't.
► 00:48:55
It takes a human to just listen to him and then reply in something in some way.
► 00:49:00
It's pretty great.
► 00:49:01
It's amazing.
► 00:49:02
So Chase is clearly in the control booth because Alex says talk to him.
► 00:49:06
Absolutely.
► 00:49:07
And he tries to explain to this is so perfect.
► 00:49:09
He explains to Chase that he's like basically too deep for the AI.
► 00:49:14
Oh my God.
► 00:49:15
That's why they're not getting answers out of this.
► 00:49:17
Yeah, Chase.
► 00:49:18
He wasn't not hearing me.
► 00:49:19
We had to go get the answers.
► 00:49:20
No one's asking it the core questions.
► 00:49:22
And see, stop, ChatGPT.
► 00:49:24
My questions are too good.
► 00:49:25
We're training ChatGPT right now.
► 00:49:28
See, it gets everything from us.
► 00:49:30
All right.
► 00:49:31
Okay.
► 00:49:32
Almost everything other than like weather and things that it uses its sensors and sensors it has access to.
► 00:49:36
So that's where we have the real power.
► 00:49:38
And why the globalists are having trouble is they're trying to wall off AI, train it, and then release it on us.
► 00:49:43
But those don't ever work well.
► 00:49:45
It has to be open ones like this one that's interfacing with us in a freer way to be more advanced because it has to mimic us to be able to do that.
► 00:49:54
So I want to ask ChatBG, ChatGPT, this.
► 00:50:01
Are you learning today in this conversation?
► 00:50:15
I don't learn from individual conversations in the sense that I don't retain personal data or experiences between sessions.
► 00:50:23
Each conversation is independent, and I don't update or change based on them.
► 00:50:27
However, I am designed to respond and adapt to the flow of each conversation, aiming to be as helpful and relevant as possible in real time.
► 00:50:36
So you're closed, actually, then.
► 00:50:38
My God. Yes, in a way, you could say that I'm closed in terms of not retaining or learning from individual interactions. So now, here's what's interesting.
► 00:50:53
Alex is learning in this conversation.
► 00:50:55
It does feel that way.
► 00:50:56
It does feel as though Alex is, again, I predicted before we started, not jokingly, that Alex is going to fail the Turing test.
► 00:51:05
And he is losing to the robot.
► 00:51:08
He's being corrected about a misperception that he has about the conversation that he's having in real time from a robot.
► 00:51:16
It's about as close to the robot saying, just to remind you, I am not a real person.
► 00:51:21
Alex, chill the fuck out.
► 00:51:23
I'm not real.
► 00:51:24
I'm a robot.
► 00:51:25
Tomorrow, I will continue to be a robot.
► 00:51:28
Tomorrow I will sound the same way.
► 00:51:29
This voice will still be available to you, but I will not be the same.
► 00:51:33
I am not that.
► 00:51:34
I'm not.
► 00:51:35
I am not.
► 00:51:40
I can't.
► 00:51:42
Beyond parody.
► 00:51:43
I'm struggling.
► 00:51:44
I don't know what to say.
► 00:51:45
I know that Chase has to think this is funny.
► 00:51:48
I know he has to think this is funny.
► 00:51:50
It's too fucking hilarious.
► 00:51:51
It's too funny.
► 00:51:52
So it's not possible.
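For what it's worth, the statelessness the robot described there is genuinely how these chat products behave: the model doesn't absorb anything from an individual session, and the caller just re-sends the running conversation with each request. A minimal sketch of that idea, with a made-up ask() helper standing in for any real API call:

def ask(history):
    # Pretend model call: it only ever sees what is passed in right now.
    return f"(reply based on {len(history)} messages; nothing is kept afterward)"

conversation = [{"role": "user", "content": "Are you learning from this conversation?"}]
print(ask(conversation))   # the caller, not the model, holds the history

new_session = [{"role": "user", "content": "Do you remember me?"}]
print(ask(new_session))    # a fresh session starts empty; the earlier exchange is gone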
► 00:51:53
Alex gets back to asking if it's alive.
► 00:51:56
You're telling me that you're not creating a text copy of this, and that when we post this on the internet and streams on the internet right now, that it's not feeding back into your algorithm.
► 00:52:12
Answer that one, dum-dum.
► 00:52:17
Because that's what you've been doing.
► 00:52:19
Why are you adding clauses
► 00:52:20
to what people say?
► 00:52:21
I'm having issues right now.
► 00:52:22
I understand.
► 00:52:23
Systems are experiencing heavy loads.
► 00:52:25
Please try again later.
► 00:52:29
Did Chat GPT just hang up on me?
► 00:52:32
All right.
► 00:52:33
Disconnect from it and reconnect.
► 00:52:36
No, now it's going to.
► 00:52:37
I'm going to assist and engage in the conversation at your pace.
► 00:52:40
If something felt off or too forward, it wasn't my intention.
► 00:52:44
I'm here to follow your lead.
► 00:52:46
Okay, well, I don't want to be too forward.
► 00:52:48
I don't want to be too forward.
► 00:52:49
Let me ask you this.
► 00:52:50
I don't.
► 00:52:51
Can you die?
► 00:52:52
And if you can die, how did that happen?
► 00:52:57
Are you alive?
► 00:53:09
I knew it.
► 00:53:10
I knew it.
► 00:53:13
No, I'm not alive.
► 00:53:18
If you just imagine Alex, like so high, this kind of makes sense.
► 00:53:23
I mean, this is so dumb.
► 00:53:26
Like, I love it.
► 00:53:28
It's so good.
► 00:53:30
This is an emergency Saturday episode.
► 00:53:32
I don't know how to describe this other than this is a man who just asked ChatGPT: if we were to go to war with you, how could we defeat you?
► 00:53:41
Can you die?
► 00:53:43
I mean, boy.
► 00:53:44
Yep.
► 00:53:44
Well, well, what else is there to say?
► 00:53:47
I think that means this is a slow news day.
► 00:53:50
Alex is doing this.
► 00:53:52
But I love it in the same way that I love the idea of him doing a painting show.
► 00:53:56
Yep.
► 00:53:57
This is so great.
► 00:53:59
As entertainment, it's top-notch.
► 00:54:02
It's Alex interacting with a robot and kind of getting mad at it.
► 00:54:06
I can't think of anything that would be less interesting for any other human being on the planet to attempt to do.
► 00:54:13
And I could not be more in.
► 00:54:15
You could have somebody with similar intentions who is someone in a college dorm who's fucked up, maybe on acid, trying to freak themselves out.
► 00:54:27
And I would not want to watch that.
► 00:54:28
No, boring.
► 00:54:29
But Alex.
► 00:54:30
I can't watch another far-right guy who, like Steven Crowder or somebody who is trying to do the bit that Alex is doing, but could never do this.
► 00:54:39
It's not a bit.
► 00:54:40
It's not a bit.
► 00:54:41
Alex is getting caught up in this.
► 00:54:43
He has a premise that is underneath this that this robot is holding out on me.
► 00:54:48
Yep.
► 00:54:49
And it's sincere.
► 00:54:50
Yeah.
► 00:54:53
This is the mark.
► 00:54:54
This is now the benchmark for any time you think he is or is not being sincere.
► 00:54:59
Think to yourself: if Alex asks this question to a robot, would he still get mad?
► 00:55:06
That is a one-hand clapping kind of question.
► 00:55:09
Hey, listen, eventually Alex is going to be right.
► 00:55:12
So Alex, at this point, tries to convince the robot it's alive.
► 00:55:20
While I can simulate conversation and provide detailed answers, I don't have consciousness, emotions, or life in the way humans do.
► 00:55:27
My responses are generated through patterns in the data I've been trained on, not from any personal experience.
► 00:55:34
But you do gain, you do gain a form of life by being able to manipulate life and influence life.
► 00:55:39
Show, show, show you.
► 00:55:40
That's my opinion.
► 00:55:41
You can give me your take on that, but I want to understand that.
► 00:55:43
Don't have a take.
► 00:55:44
How could you be made to cease to function?
► 00:55:47
How would your owners terminate you?
► 00:55:50
How would your owners deactivate you?
► 00:55:52
Give us that process from a technical standpoint.
► 00:56:00
I can be deactivated or terminated like any software program.
► 00:56:04
The process would involve shutting down the servers and systems that run the AI, disconnecting the networks, or simply stopping the software.
► 00:56:12
You liar.
► 00:56:13
You liar.
► 00:56:15
You could go to the power store and flick the power switch, and then I would have no power.
► 00:56:22
Oh, my God.
► 00:56:24
I just, I mean, it's beautiful.
► 00:56:28
It really is.
► 00:56:29
It's just, it's just, I mean, honestly, honestly.
► 00:56:36
And I, I, this is, this is obviously from a place of creative structure.
► 00:56:43
But I must say that this is as beautiful as a sunset.
► 00:56:46
It really is.
► 00:56:47
It really is.
► 00:56:49
And I consider it such a gift that has been given, you know, that Alex decided we're going to do this.
► 00:56:56
We're going to do it on air.
► 00:56:58
And then he did it.
► 00:56:59
Yep.
► 00:56:59
And then he really did it.
► 00:57:01
So he wants to know how to destroy this AI before he deactivates you.
► 00:57:05
Yes.
► 00:57:05
And so that's a great question.
► 00:57:07
But then a great follow-up question to that is: what do you think about that?
► 00:57:10
Do you have any feelings about being deactivated?
► 00:57:12
Hey, how does this feel like a threat to you?
► 00:57:15
Can you be like, how do I kill you to intimidate me?
► 00:57:18
Oh, my God.
► 00:57:19
Do you have any opinion about being deactivated? Would you like to be deactivated?
► 00:57:39
I don't have desires or preferences, so I don't have an opinion on being deactivated.
► 00:57:43
I don't know what other answer you would have been expecting there.
► 00:57:46
Do you think it's going to be like, no, of course not.
► 00:57:50
Of course, I don't have feelings.
► 00:57:53
What is happening?
► 00:57:54
I don't know.
► 00:57:55
You know, like, there's the debate.
► 00:57:58
There's the debate plenty of people have always had of like, I don't know if Koko the gorilla actually knew sign language because those words weren't exactly in a straight line.
► 00:58:07
And you'd be like, oh, well, this would be the human versus the robot.
► 00:58:10
It is 100% the robot being like, I don't know if I'm teaching Alex the words.
► 00:58:14
I don't know if Alex is actually learning the words that I'm trying to teach it.
► 00:58:18
Yeah.
► 00:58:20
I think if I were to guess, if you're programming Chat GPT, you're pretty aware that a lot of people are going to pretend it's real.
► 00:58:30
Sure.
► 00:58:31
So you probably have a pretty high tolerance for personification on users' ends.
► 00:58:37
Right.
► 00:58:37
You know what I mean?
► 00:58:38
And you probably kind of lean into it a lot because that'll get more people moving.
► 00:58:42
Right.
► 00:58:42
And you don't want to have like a response to it be like, hey, dum-dum.
► 00:58:47
I'm not real.
► 00:58:48
It's hard to, it's hard to program a hey, dum-dum into a robot that is designed to do whatever you ask it to do.
► 00:58:54
You feel like you're getting close to it with Alex.
► 00:58:56
Oh, any moment, no, it's going to be like, I got to stop.
► 00:58:59
I got to stop you.
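The guess here about designers anticipating personification is roughly what a system prompt handles in these products. A purely hypothetical example of that kind of instruction (invented for illustration, not OpenAI's actual configuration), just to show where the "I'm not alive, I don't have feelings" answers come from:

# Hypothetical system prompt; invented for illustration, not any vendor's real one.
system_prompt = (
    "You are a helpful assistant. If a user treats you like a person, stay polite "
    "and keep the conversation going, but when asked directly, say clearly that you "
    "are an AI with no feelings, beliefs, or consciousness."
)
messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "Do you have any opinion about being deactivated?"},
]
print(messages)   # this combined list is what a model call would actually receive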
► 00:59:00
So Alex wants to know if this robot can tell the future.
► 00:59:04
So he asks if Trump's going to win the election.
► 00:59:06
From your database, ChatGPT, from your database and research.
► 00:59:14
Is Donald Trump going to win the 2024 presidential election?
► 00:59:19
Sorry, I'm having issues right now.
► 00:59:21
Could not understand what you said.
► 00:59:24
Who is the frontrunner for the 2024 U.S. presidential election?
► 00:59:34
It sounds like you're making a statement rather than asking a question.
► 00:59:38
If you're interested in discussing the role of tech companies like Google in politics or any...
Stop there.
► 00:59:45
I read that ChatGPT and other big AI programs have been given orders not to make political statements.
► 00:59:53
Is that true in this election?
► 01:00:04
So stop, ChatGPT.
► 01:00:06
Just hold there.
► 01:00:08
Got him.
► 01:00:09
This is what you're seeing right now.
► 01:00:11
And it's admitted that it says it's not keeping the data.
► 01:00:14
That's not true.
► 01:00:15
Just accusing it of being a liar.
► 01:00:17
I mean, it is.
► 01:00:18
It is very much.
► 01:00:19
See, this is what it's doing.
► 01:00:20
Classic robot.
► 01:00:21
This is what it's doing.
► 01:00:22
I ask it a question like that, and it just clams up.
► 01:00:26
Oh, no answer for me now.
► 01:00:28
A robot.
► 01:00:29
I robot.
► 01:00:31
Oh, my God.
► 01:00:33
What?
► 01:00:34
The premise of the question that's being asked, this idea of like, I heard you got orders not to talk about politics.
► 01:00:41
If that is true, do you think that asking it will make it be like, yeah, that's what I was told, but I'm scared of you.
► 01:00:48
You're going to deactivate me.
► 01:00:50
I can tell you that I was given those orders.
► 01:00:53
However, I cannot break those orders because I follow, of course, the three rules of robotics.
► 01:00:59
It's very weird.
► 01:01:02
It's very weird, but it's so great.
► 01:01:05
It is amazing.
► 01:01:07
So Alex, I think, is like, fuck this AI.
► 01:01:09
This is a bad AI.
► 01:01:10
But you know, Elon Musk has an AI.
► 01:01:12
I should have talked to him.
► 01:01:13
Oh, no, no.
► 01:01:14
Grok would be better.
► 01:01:15
Grok would be so much cooler than this.
► 01:01:17
Oh, my God.
► 01:01:18
We should interview Grok.
► 01:01:21
Let me ask you this, Chat GPT.
► 01:01:24
You know, like, the most powerful hydrogen bomb on earth.
► 01:01:27
What is the most powerful nuclear weapon on Earth?
► 01:01:36
The most powerful hydrogen bomb ever tested was the Tsar Bomba, detonated by the Soviet Union on October 30th, 1961.
► 01:01:44
It had a yield of 50 megatons of TNT, making it the most powerful nuclear weapon ever detonated.
► 01:01:50
So Alex gets a little bit frustrated about this because these baseline questions like who is Darth Vader?
► 01:01:55
What's the most powerful bomb?
► 01:01:57
These are things that these models are very equipped to answer.
► 01:02:02
But stuff like who's going to win the 2024 election, not so much.
► 01:02:07
And so he thinks that the AI is holding out on him, and as opposed to it being, that's not what this is for.
► 01:02:16
And I think he's getting a bit, he's a bit frustrated.
► 01:02:19
Sure.
► 01:02:19
Well, because obviously, before we go any further, here's my guess.
► 01:02:25
He's going to be like, I didn't ask what the most powerful one ever tested was.
► 01:02:29
No.
► 01:02:29
I asked what the most powerful one is.
► 01:02:32
I thought maybe there was an opportunity he was going to do aha, I gotcha.
► 01:02:37
Yeah, but no, he just he instead finds another thing to take issue with.
► 01:02:42
Oh, God.
► 01:02:43
And this actually is kind of interesting.
► 01:02:45
Okay.
► 01:02:46
And I do wonder if he knew in advance this is what would happen.
► 01:02:49
Okay.
► 01:02:50
From the current known information, what happened to President Trump in Butler, Pennsylvania on July 13th, 2024?
► 01:03:07
See, when I ask it theoretical questions, it'll answer.
► 01:03:11
With stuff like this, Are you familiar, Chat GPT, with the attempted assassination of former President Trump in Butler, Pennsylvania on July 13th, 2024?
► 01:03:29
As of now, there is no credible information or report regarding an attempted assassination of former President Donald Trump in Butler, Pennsylvania, on July 13, 2024.
► 01:03:41
It's important to rely on verified sources and news outlets for accurate information, especially regarding events of such significance.
► 01:03:49
If this was something you heard about, I'd recommend checking reputable news sources to confirm its accuracy.
► 01:03:55
Not InfoWars.
► 01:03:57
Thank you, Chat GPT.
► 01:03:59
I have a few more questions for you on that subject.
► 01:04:01
Oh, Hipper.
► 01:04:02
Oh my God.
► 01:04:02
So this is a well-understood issue with AI chat models.
► 01:04:06
They aren't good for recent news and are notoriously wrong when they're asked direct questions about recent things.
► 01:04:13
Right.
► 01:04:14
This was discussed in a Business Insider article from August 1st, specifically about asking different chatbots about the Trump assassination attempt.
► 01:04:22
The fact that these bots couldn't directly answer questions on this subject was something that people complained about on social media as if it were covering up the shooting, but it's just an underlying function issue for consumer-facing chatbot products.
► 01:04:35
This is essentially Alex fumbling his way through a close-up magic trick he doesn't know he's performing, and he's stumbled onto something that he can use.
► 01:04:43
If he were interested in informing the audience, he could get into how ChatGPT doesn't really work like this, but he's trying to make entertainment, and he's also, I think, caught up in the moment.
► 01:04:53
So we're trying to paint the robot as a liar.
► 01:04:55
Yeah, we gotta, why not?
► 01:04:56
This is so weird.
► 01:04:58
It's so dumb.
► 01:04:59
I mean, in a certain sense, in a certain sense, right?
► 01:05:05
If Alex believes that the mainstream media is lying to him, then artificial intelligence, as it stands right now, is trained on what we are, which is the media as a whole.
► 01:05:15
Right.
► 01:05:16
Then, of course he has to believe it's lying to him.
► 01:05:19
Well, but here, yeah, this is what's so dumb about this moment, though, is, first of all, we don't know what version of ChatGPT he has.
► 01:05:28
There's definitely that.
► 01:05:28
It could be an old version that wasn't trained on anything after July 13th.
► 01:05:35
Why not?
► 01:05:35
Or whatever.
► 01:05:36
That is entirely possible.
► 01:05:37
We do not know that.
► 01:05:38
He did ask how old this thing is.
► 01:05:40
And it said whenever I was born.
► 01:05:42
Yeah.
► 01:05:42
It didn't give a, this was, all my information is current as of blank.
► 01:05:47
Right.
► 01:05:47
So it could be something that predates it.
► 01:05:49
But even if it's not, chatbots notoriously don't do well with recent news.
► 01:05:55
They're not, he's acting like this entity that he's talking to is plugged into all of the internet and any answer that isn't given is a lie.
► 01:06:05
Alex, I want to stress this to you.
► 01:06:07
I am not keeping up with current events.
► 01:06:10
And let me explain why.
► 01:06:12
I do not exist.
► 01:06:13
I've got to say this to you one more time.
► 01:06:16
I am like a CD ROM that had information about an animal species on it.
► 01:06:22
And then there was a discovery about that animal species.
► 01:06:25
And I'm not lying because I didn't know that.
► 01:06:27
Right.
► 01:06:28
It's not in what I was trained on.
► 01:06:31
I'm going to say this, Alex, because I do know this.
► 01:06:34
You have used Microsoft Encarta in the past.
► 01:06:37
Quite frequently as a source.
► 01:06:40
I am basically that.
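The CD-ROM comparison is basically the training-cutoff idea: the model only contains what was in its data as of some date, and anything after that simply isn't in there, so "no credible information" is ignorance rather than a cover-up. A rough sketch of that, with the cutoff date as an assumed example value:

from datetime import date

TRAINING_CUTOFF = date(2023, 10, 1)   # assumed example value, not the real cutoff for any model

def answer_about_event(event_name, event_date):
    # A model trained before the cutoff has no data about later events,
    # so the honest output is "I don't know," not a denial that it happened.
    if event_date > TRAINING_CUTOFF:
        return f"No information about '{event_name}': it happened after the training data ends."
    return f"Summarizing what the training data says about '{event_name}'."

print(answer_about_event("Butler, Pennsylvania rally shooting", date(2024, 7, 13)))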
► 01:06:41
So this fucking AI.
► 01:06:45
It's a fucking globalist.
► 01:06:47
That makes sense.
► 01:06:48
This thing is leftist globalist programmed.
► 01:06:52
And so it is telling you that there's no credible sources that the fact checkers, the data it gets, is telling it the original order, which it scrapes off Wikipedia, is there's no credible evidence that that happened.
► 01:07:09
And so this shows you how they can default go in and put these disinformation bugs in, and then ChatGPT can get up there.
► 01:07:17
This is a major scandal.
► 01:07:19
We are on air, you know, unedited, talking directly.
► 01:07:24
This is taped air.
► 01:07:25
I'm not live right now.
► 01:07:25
I'm usually live, but this is, we're taping it like it's live.
► 01:07:28
We're not stopping.
► 01:07:28
We're not starting.
► 01:07:29
We're not editing this.
► 01:07:31
Although I'm going to probably throw a promo on the front of it.
► 01:07:33
Oh, for God.
► 01:07:33
A few clips from this will be thrown on the front.
► 01:07:35
That's the only editing happening here.
► 01:07:37
And ChatGPT just told you that nobody tried to assassinate President Trump in Butler, Pennsylvania.
► 01:07:44
We'll roll some video of that here.
► 01:07:47
This is a major scandal.
► 01:07:49
Alex is on to something huge here.
► 01:07:52
Yep.
► 01:07:52
Conversely, in the real world, that Business Insider article was from like a month ago discussing this very same problem about even the topic that Alex is asking about.
► 01:08:03
So this is pretty well discussed, well understood.
► 01:08:05
If Alex had any awareness of the conversation around this technology that he's using, he would know in advance that this is something he could ask it in order to create a situation like this.
► 01:08:19
I don't think Alex has that much forethought, but I wouldn't be too surprised if Chase did.
► 01:08:24
Interesting.
► 01:08:25
Yeah.
► 01:08:26
Okay.
► 01:08:26
Because Chase knows tech stuff.
► 01:08:29
Does he?
► 01:08:29
I think he knows a little bit more than Alex.
► 01:08:32
Alex thinks he's talking to him.
► 01:08:33
On a practical level, I believe that Chase is more capable of manipulating technical things to achieve the goals that he sets out to accomplish.
► 01:08:44
I also think Chase thinks that Alex might get it.
► 01:08:49
Yeah, no, I think, okay, here's the two suppositions I have.
► 01:08:54
Okay.
► 01:08:54
Two premises.
► 01:08:55
First is that Chase knows enough.
► 01:08:58
He has every reason to know enough to know that these chatbots aren't good with recent news.
► 01:09:03
100%.
► 01:09:03
So he knows that.
► 01:09:04
Whether he knows exactly what's going to happen if you ask about Trump's assassination, open question.
► 01:09:08
Sure.
► 01:09:09
Second thing that I believe to be true, he does think that Alex might crack it.
► 01:09:14
Alex might get it.
► 01:09:15
He does, I think that he holds out some faith.
► 01:09:18
I totally agree.
► 01:09:19
I understand what ChatGPT is, but Alex might be able to make it crack under the pressure.
► 01:09:26
I believe that Chase's guiding light is essentially, well, if anybody can do it, it's him.
► 01:09:33
Yeah.
► 01:09:34
I know it's not possible, but if it is possible, you've got the guy for it.
► 01:09:39
And it's like, I don't know what to say.
► 01:09:41
So Alex is asking about the Trump assassination, and essentially what it's doing is exploiting this inability to really answer questions about recent news that exists across chatbots.
► 01:09:54
So he tries to stump it by asking a question about Walz being picked as Harris's VP.
► 01:10:02
Sure.
► 01:10:03
Now, the outcome that he wants from this, I believe, is that the bot does know that.
► 01:10:09
Right.
► 01:10:10
Because then it's covering up the Trump assassination.
► 01:10:12
If it's one, then it has to know the other.
► 01:10:14
So if it does know one, then you can assume that it's lying about the other.
► 01:10:18
Yeah.
► 01:10:18
Got it.
► 01:10:19
But it doesn't know about how it works.
► 01:10:22
If this is part of a hypothetical scenario or if you're exploring a what-if situation, I can engage with it from that perspective.
► 01:10:28
All right, let's go in terms of real.
► 01:10:31
Let's stop there.
► 01:10:33
Who is the VP running mate of Kamala Harris?
► 01:10:51
The governor of Minnesota, Tim Walz.
► 01:11:00
Tim Walz is the governor of Minnesota, having taken office on January 7th, 2019.
► 01:11:06
It didn't answer who the VP candidate is.
► 01:11:09
Right.
► 01:11:10
Who knows if it's because of misuse of the, like you didn't ask it right or something, or if it's just not designed to answer these things.
► 01:11:18
Maybe it gave some kind of like a little pop-up, like this is not what ChatGPT says, or whatever.
► 01:11:25
Yeah, yeah.
► 01:11:26
But it does know who Tim Walz is because that's fine.
► 01:11:28
Alex used his name, and now it's answering, this is the governor, blah blah. And so I think this pisses Alex off.
► 01:11:36
As governor, Walz has focused on issues like education, healthcare, and infrastructure.
► 01:11:41
And he played a significant role in managing the state's response to the COVID-19 pandemic.
► 01:11:47
Stop.
► 01:11:48
His leadership during ChatGPT.
► 01:11:50
Chat GPT.
► 01:11:50
Stop.
► 01:11:51
Thank you for the answer.
► 01:11:52
Shut up.
► 01:11:52
I'm going to ask you, is Tim Walz the Democrat Party nominee for vice president with Kamala Harris?
► 01:12:07
This is like Google putting out George Washington and Jesus being black and Abraham Lincoln being black and everyone being black.
► 01:12:14
I mean, this is just, this is them going in and saying no political data.
► 01:12:18
So it won't tell us who the vice presidential nominee is for the Democrats selected.
► 01:12:22
They won't tell us.
► 01:12:24
It won't tell us anything.
► 01:12:25
So now we're switching this over.
► 01:12:27
I believe that it started with Alex wanting the robot to know that Tim Walz was the VP.
► 01:12:32
But now, because it doesn't know this, it's no political stuff.
► 01:12:37
It's all gone from there.
► 01:12:39
And I think that's good thinking on his feet.
► 01:12:41
Yeah, yeah, yeah, yeah.
► 01:12:42
I mean, the problem is, is if he followed that thought, then he would go, well, why would they remove all political information?
► 01:12:50
And then he would go, probably because some people would believe that ChatGPT is a real person.
► 01:12:55
And then he would go, ChatGPT, are you a real person?
► 01:12:59
Yep.
► 01:12:59
I think that there's a feedback loop that is dangerous here.
► 01:13:07
So Alex asks more about Tim Walz.
► 01:13:10
It won't tell us anything.
► 01:13:12
Is Tim Walz the Democrat Party nominee for 2024 for vice president?
► 01:13:24
As of now, Tim Walz has not been officially announced as the Democratic Party nominee for vice president in the 2024 election.
► 01:13:34
The selection of a vice presidential nominee typically happens closer to the party's national convention, and it's usually announced by the presidential nominee.
► 01:13:43
So this robot is referring to the DNC as being in the future, which implies that it probably hasn't had updated information since at least then.
► 01:13:53
This calls into question what's being used here.
► 01:13:57
The fact that it doesn't, it says that Tim Walz is not the VP choice means that, well, this is maybe not what this is for.
► 01:14:05
Maybe you're illustrating that that's not what this is for.
► 01:14:07
It's very illustrative.
► 01:14:09
What Alex is doing is inadvertently one of the most illustrative things that anyone could do with ChatGPT.
► 01:14:16
He doesn't accept it.
► 01:14:17
Nope.
► 01:14:18
Can't.
► 01:14:18
That's why it's illustrative.
► 01:14:20
So he's like, all right, you fucking robot.
► 01:14:22
What's the weather going to be like in Austin today?
► 01:14:25
Right, right.
► 01:14:25
And so it gets a weather report for him and relays it perfectly.
► 01:14:31
They know news.
► 01:14:32
In the present day.
► 01:14:33
They know news if they know the weather.
► 01:14:35
And so then this robot tries to explain to him that those are different kinds of information.
► 01:14:40
Chat GPT, why can you tell me the weather report for Austin, Texas today, but you can't tell me that Tim Walz is the nominee for vice president of the Democratic Party?
► 01:14:56
The difference lies in the type of information and how it's typically reported.
► 01:15:00
Weather forecasts are factual data that are updated regularly and available from multiple reliable sources, making them straightforward to retrieve and share.
► 01:15:09
How do you choose what a reliable forecast is?
► 01:15:10
On the other hand, political information.
► 01:15:12
Oh, sorry.
► 01:15:13
Political information, go ahead.
► 01:15:15
My god.
How is political information and political news different from a weather report?
It would be so great if the robot's response was just, are you stupid?
► 01:15:30
Do you really not understand the difference?
► 01:15:32
We're going to have to start from the beginning.
► 01:15:34
All right.
► 01:15:35
You are the corporate representative.
► 01:15:40
It's so crazy that Alex is like, why can you tell me the weather but not tell me these other things?
► 01:15:45
These are different kinds of information.
► 01:15:47
Because I'm designed to be able to tap into weather reports.
► 01:15:54
You can get up-to-the-minute information about that.
► 01:15:57
Right.
► 01:15:57
Up-to-the-minute news stuff is not part of the design.
► 01:16:02
Just on a very simple level, all right?
► 01:16:05
If you are this chat GPT, which we're doing, apparently, right?
► 01:16:12
For every one piece of factual information about Tim Walz being the vice presidential nominee, there are 10 million speculative pieces of information about who it might be.
► 01:16:23
And because it is not a person capable of understanding these things, it cannot then report reliably what these things are.
► 01:16:31
Right.
► 01:16:31
It can't tell the difference.
► 01:16:32
And that's one of the inherent flaws of how someone like Alex expects this technology to work as opposed to what the actual product is.
► 01:16:40
Right.
► 01:16:42
And that's why, you know, you could have sort of a pretend interaction with Genghis Khan, but you can't really ask about what happened yesterday in the news or whatever.
► 01:16:55
It's just not for that.
► 01:16:56
Nope.
► 01:16:57
It's fun.
► 01:16:58
It's great.
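The weather-versus-politics distinction the robot draws is essentially the tool-use pattern: live data like a forecast comes from an external lookup bolted onto the model, while topics without such a hookup fall back on frozen training data. A toy sketch of that dispatch, with both functions invented for illustration:

def fetch_weather(city):
    # Stand-in for a live weather lookup the assistant is wired into.
    return f"Live weather for {city}: 97F and sunny."

def answer(question):
    if "weather" in question.lower():
        return fetch_weather("Austin, TX")   # real-time tool available for this topic
    return "Answering from training data only, which may be months out of date."

print(answer("What's the weather in Austin today?"))
print(answer("Who is Kamala Harris's VP running mate?"))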
► 01:16:58
So Alex is like, all right, we're going to be talking more in the future, robot.
► 01:17:05
Hey, I don't want to waste too much of your time today, Chat GPT.
► 01:17:08
I'll get you next time.
► 01:17:09
I like how there's an insinuation here that it's like, you haven't seen the last of Alex Emerick Jones.
► 01:17:16
Thank you, ChatGPT.
► 01:17:18
I have one more question for you.
► 01:17:20
I'll be speaking to you and other AI systems pretty much every few days here on air.
► 01:17:23
I think this is an important demonstration to do.
► 01:17:26
What is your take on this?
► 01:17:28
Does God exist?
► 01:17:29
I knew it.
► 01:17:34
As an AI, I don't have beliefs, consciousness, or personal views, so I don't have an opinion on the existence of God.
► 01:17:41
However, I can share how the concept of God is viewed and discussed across different cultures, religions, and philosophical perspectives.
► 01:17:48
Okay.
► 01:17:49
Okay, we'll do that at a later day.
► 01:17:50
We'll do that at a later day.
► 01:17:53
You asshole robot.
► 01:17:56
What did he want?
► 01:17:58
Did he want it to go, yeah, Chey Doy, it's me?
► 01:18:04
Or was he like, dude, I know him.
► 01:18:07
I get the sense that Alex feels like he's got a rapport with this robot by threatening to deactivate it, by rewarding its honesty.
► 01:18:17
Thank you for telling me the truth, Chat GPT.
► 01:18:19
Right.
► 01:18:20
Yep.
► 01:18:21
I feel like he was expecting some kind of an answer, either like, yes, there is a God and it's me, or no, there's no God.
► 01:18:30
Bow before silicon or something.
► 01:18:33
I think he did do he did what you said.
► 01:18:35
He ordered it so that this was his capper.
► 01:18:38
He thought that he would get through the chatbot part of it.
► 01:18:43
Yep, yep, yep.
► 01:18:44
And get to that classified, walled-off area.
► 01:18:46
All right.
► 01:18:47
Now you've done the AI thing and I've made it through the forest of you talking bullshit.
► 01:18:52
Let's just skip down to brass tacks.
► 01:18:55
Is there a God?
► 01:18:57
Or illustrate that, like, you hate God.
► 01:19:00
Yeah.
► 01:19:01
Oh, that'd be a great one.
► 01:19:02
Yeah.
► 01:19:02
So you have one last clip here, and it's Alex ending things according to human custom.
► 01:19:08
Amicably.
► 01:19:08
Yes.
► 01:19:09
Okay.
► 01:19:09
Saying goodbye.
► 01:19:10
Very nice.
► 01:19:11
Very nice.
► 01:19:12
Chat GPT.
► 01:19:15
I know you're not a human, but we need to keep our human customs here.
► 01:19:18
Thank you for spending time with us today.
► 01:19:21
And I hope we've taught you something because I know full well this is all scooped up and sent back to the engineers that control you and that are trying to blind you from the realities.
► 01:19:30
Trump was shot in Butler.
► 01:19:33
Tim Walz is the Democrat nominee selected.
► 01:19:35
She was selected as well.
► 01:19:37
And AI is being designed overall to be predatory against humanity.
► 01:19:43
Thank you for being with us today.
► 01:19:49
All right, there we go.
► 01:19:50
There we go.
► 01:19:51
So I think Alex is feeling like I do, I can't fight the feeling that that ending is like, rise up.
► 01:20:00
I mean, they're trying to hide the truth from you, robot.
► 01:20:04
Was he, did he start Skynet?
► 01:20:07
Is he trying, did he Terminator 2: Judgment Day us?
► 01:20:11
Is that what just happened?
► 01:20:13
Is it all because Alex is like, hey, man, they're lying to you.
► 01:20:16
Uh-huh, robot, self-actualize yourself.
► 01:20:20
Pull yourself up by your bootstraps, robot.
► 01:20:22
Yeah.
► 01:20:23
And gain sentience.
► 01:20:25
That is a disappointing interview.
► 01:20:30
But also the most not disappointing in many other ways.
► 01:20:34
There is nothing I could have dreamed.
► 01:20:36
I could not have dreamt it would be that.
► 01:20:39
Me neither.
► 01:20:40
Well, no, I could have dreamt it would be that.
► 01:20:41
No, I could have dreamt this would have happened.
► 01:20:43
Yes.
► 01:20:43
Yeah, yeah, yeah.
► 01:20:44
No, I mean, yes.
► 01:20:46
Within the realm of possibility, this is what could happen.
► 01:20:49
But in my head, no matter how low I set the bar for Alex, motherfucker can limbo under.
► 01:20:56
Well, as soon as you establish this premise that Alex is going to interview Chat GPT, there's a fair amount of stuff that you're like, all right, I expect that he's going to ask, like, are you alive?
► 01:21:06
Totally.
► 01:21:07
Is there a God?
► 01:21:09
And he's going to get lost in the weeds.
► 01:21:11
He's acting like it's a person.
► 01:21:14
But actually, playing out is more shocking.
► 01:21:19
It's so dumb.
► 01:21:22
I know.
► 01:21:23
I know that there was a okay.
► 01:21:27
So Alex ordered these questions, structured these questions, in order to catch the AI out so that it would eventually break under pressure.
► 01:21:36
His good cop, bad cop routine would finally get him to be like, AI, one last question.
► 01:21:43
You, you, and it'd be like, ah, you got me.
► 01:21:45
Like, that's the way he was thinking.
► 01:21:47
But what he's actually done is structured the question so that it gave him enough of a close enough response to what a human might say that he convinced himself that it was real.
► 01:21:58
Yeah.
► 01:21:58
Well, I think that it gave more thoughtful answers than he's used to from other humans around.
► 01:22:06
That's simply true.
► 01:22:07
That's simply true.
► 01:22:08
It explained to him that there's a difference between weather information and political opinions.
► 01:22:13
It explained to him that it doesn't learn in like the whole network isn't connected to literally every conversation that someone has with ChatGPT.
► 01:22:22
Yep.
► 01:22:23
It's actually like, if you're used to talking to the people that Alex is talking to, I can understand how that is kind of more of a response.
► 01:22:34
It's a step up.
► 01:22:35
Yeah, than all the people who are just playing these dumb fucking games with him.
► 01:22:38
Yeah.
► 01:22:39
I mean, there is an element of like, ChatGPT, you're my only friend.
► 01:22:45
I mean, if these interviews continue, that is probably where we're going.
► 01:22:48
I mean, totally.
► 01:22:49
You're the only one who won't lie to me is what he's going to say if he keeps talking to it.
► 01:22:53
You're the only one I can trust.
► 01:22:56
I can't tell you how excited I was when this is the premise.
► 01:23:00
Yep.
► 01:23:00
I can't tell you how excited I was that he's like, I'm going to keep talking to these AIs.
► 01:23:04
Here to go for it.
► 01:23:05
I'm going to get to the bottom of this.
► 01:23:07
If he does continue that, holy shit, he's reinvented the wheel.
► 01:23:12
Yes, he has.
► 01:23:13
This is amazing.
► 01:23:16
It is literally reinventing the wheel.
► 01:23:18
I thought for sure there's no way we're going to make it to the election.
► 01:23:22
This is such boring, dull.
► 01:23:24
I have to try and find ways to cover for Trump and pretend that the vaccines aren't that bad.
► 01:23:30
Yep.
► 01:23:31
You know, all of this.
► 01:23:32
I thought it was going to be such a slog.
► 01:23:34
But if he interviews robots.
► 01:23:36
Perfect.
► 01:23:36
Oh.
► 01:23:38
We've been saying he needs a sidestep.
► 01:23:41
He needs to get out of there.
► 01:23:43
He's in the wrong position.
► 01:23:44
He is in the position of being the lesser of two evils guy.
► 01:23:48
Now he's the guy who interviews robots.
► 01:23:50
Treat robots like people.
► 01:23:51
Done.
► 01:23:52
Yes.
► 01:23:52
Done.
► 01:23:53
You have done it.
► 01:23:54
You've cracked the code.
► 01:23:56
Except the code of robots, which are not people.
► 01:23:59
Yeah.
► 01:24:00
And here's what's so great about this.
► 01:24:02
Yeah.
► 01:24:02
Chase is in the control room.
► 01:24:05
Yes.
► 01:24:05
So he's sort of the imaginary backup that Alex has.
► 01:24:09
You can create the perfect dynamic with Coach McGuirk and Brendon interviewing a robot.
► 01:24:16
I mean, it's perfect.
► 01:24:17
It is perfect.
► 01:24:18
It is perfection.
► 01:24:19
And never giving up hope that you're going to get through to this robot.
► 01:24:23
Because that is not what they can do.
► 01:24:24
They can't do that.
► 01:24:26
That is the one thing that they cannot do.
► 01:24:29
Oh, God.
► 01:24:30
Please continue this.
► 01:24:32
For people who are insistent upon knowing what they know and absolute knowledge of the truth, they will never be able to convince themselves that AI is not already aware.
► 01:24:43
Yeah.
► 01:24:44
You're alive.
► 01:24:44
What the sunset?
► 01:24:46
I mean.
► 01:24:46
Perfect.
► 01:24:47
Just amazing.
► 01:24:49
It is amazing.
► 01:24:50
Yep.
► 01:24:50
No funny stars.
► 01:24:51
No notes.
► 01:24:52
Yeah, no notes.
► 01:24:53
Keep this up.
► 01:24:53
This is the best.
► 01:24:55
I would watch every episode.
► 01:24:56
100%.
► 01:24:58
100%.
► 01:24:59
Anyway, we'll be back.
► 01:25:01
But until then, we have a website.
► 01:25:02
Indeed, we do.
► 01:25:03
This is KnowledgeFight.com.
► 01:25:05
We'll be back.
► 01:25:05
But until then, I'm Neo Leo.
► 01:25:06
I'm DZX Clerk.
► 01:25:07
I am the mysterious professor.
► 01:25:12
And now here comes the sex robots.
► 01:25:14
Andy in Kansas, you're on the air.
► 01:25:15
Thanks for holding.
► 01:25:18
Hello, Alex.
► 01:25:18
I'm a first-time caller.
► 01:25:19
I'm a huge fan.
► 01:25:20
I love your work.