Feb. 16, 2015 - Freedomain Radio - Stefan Molyneux
02:49:06
2913 Will Artificial Intelligence Kill Us All? - Saturday Call In Show - February 14th, 2015

What do Fifty Shades of Grey and Passion of the Christ have in common? The technology community is in agreement that Artificial Super Intelligence is an inevitability. Timelines range from 15 years to the end of the century and projected outcomes are polarized: human extinction or human immortality. Do you think ethics could be programmed into AIs to keep them safe in the event they become Super Intelligent? I’ve been in a romantic relationship for seven years and I’m not happy - how do you know whether to stay or leave? What do you think is the freest country in the world? With so many countries moving farther away from a free society, what countries do you think are moving in the right direction toward freedom?


Good evening, everybody.
Stefan Molyneux from Freedomain Radio.
I hope you're doing well.
So, happy Valentine's Day to you.
I hope that you are out there making the philosophy breeding beast with two backs and pumping out many more listeners to this and other philosophical conversations.
So, you know, put this on in the background and get it on.
So yeah, I went to see, well Mike and I, not in the same city, but we did go and see Fifty Shades of Grey.
Mike, you have to share your peacock idea.
It's been haunting me.
It's been haunting me, like the ghost of Marley and Christmases past, all day.
Okay, spoilers, everybody.
Well, it's not surprising that there's sex scenes in Fifty Shades of Grey, I'm sure.
But there is a scene where, you know, the sex is occurring.
The sex.
And a peacock feather is used.
You know, the horrible bondage implement known as the peacock feather, which is feared by minions across the world.
And it was this peacock feather being loosely slid over the skin of the pretty young lass.
I got the visual of two peacocks engaged in sex with a human hand, lightly brushing it across their own feathers.
And that's it for Valentine's Day for everyone.
It's all over.
Yeah, I was pretty much done after that.
I also wanted to recommend, just for those of you who haven't tried it, do not try the only human hand with your significant other at, say, 4.37 in the morning.
You tend to get fairly well tasered.
Well, no.
Of course, if you're into that kind of stuff, good for you, but it comes with significant warnings attached.
I didn't see a taser in his red room, but never know.
Maybe it was in a drawer or something?
I don't know.
I kept thinking that was, like, not the real red room.
That was, like, the red room for guests.
Yeah.
That was like the red room level one.
You know, like if you go to hell and Satan says, oh yeah, it doesn't get worse than this.
And then it's like, I thought there were nine more layers.
No, no, no.
And then you find out there aren't.
But, you know, it takes time.
So I figured that was just his intro room.
If they don't run screaming from that one, you know, they can go, okay, now let me tell you the real stuff.
Step it up a little.
Step it up a little.
Yeah, I mean, ooh, feathers, you bad, bad boy.
I don't know.
I feel anything that gives you a good night's rest when stuffed into a pillow probably is not the most torturous implement in the world.
Hot coals?
No.
I know some people are confused by the Fifty Shades of Grey talk because we've done a couple shows on it now, but it's kind of a big cultural event.
It's the biggest literary cultural event in history because it is the fastest-selling fiction book, to my knowledge.
In history.
And of course, it's only going to get bigger from here with the movie and all that.
Yeah.
Oddly enough, apparently this one is going to break the box office record for February set by Passion of the Christ.
Wow.
And see, there's whipping in both.
Maybe that's what you need in February.
Maybe that's...
You know what?
We just got to do more shows with whipping.
I think that's it.
Although I will say that...
Man, you know, for a religion that worships its God, they sure can put that guy through his paces, you know?
Like, Jesus does not do well in that movie.
It's just like one giant sadomasochistic torture fest of the highest deity.
And, uh...
I guess it's not terribly shocking, given that in the story the Jews betrayed him and the Jews got this all started, that Mel Gibson, who made the movie, ended up with some anti-Semitic ramblings later on.
But yeah, it seems to be the common element.
If you get some good old thong-whipping going.
But my feeling is that this Red Room of Pain...
I mean, it's like one-tenth of 1% of your average Saudi blogger's punishment.
I mean, didn't that guy, he was going to get 1,000 lashes.
I think they gave him 20 or 30 or something like that, and he couldn't come back for more.
1,000 lashes.
Because, you know, 1,001 would be excessive, but 1,000, that's the sweet spot of truly psychotic punishment.
And I think that Amnesty managed to get this held off.
But I remember when I was a kid, there was this movie in England called Death of a Princess where they talked about how some Saudi princess had done something wrong.
They just beheaded her.
And I remember, of course, you know, the Saudis in the oil crisis in the 70s, Saudis were very big.
British government was very keen on the Saudis.
And they had a lot of pressure to not show this kind of stuff.
But I got to imagine that in a lot of totalitarian regimes, they're like, oh, I guess that's as bad as they can think.
Well, they should come visit us and question the authorities sometime.
I mean, there was some woman in Turkey who just got arrested because she tweeted something about a corruption scandal in the government, and she's facing up to 10 years in prison.
Wow.
And it is, you know, we do really have to think about and appreciate...
The freedoms that we have to communicate in the West.
The West is, you know, maybe 49 shades of screwed up, but I'm sure glad they haven't gone or remained in that sort of medieval mindset of controlling language.
Like, we have political correctness and we have taboos and, you know, people will get upset at you, but not, like, to the point of actual incarcerations, whippings, beheadings, and so on.
So, something to, because I know we've got a call tonight that wants to talk about sort of free countries in the world and all that, but I gotta say I'm a bit distracted because I'm just thinking of all the money that's in the Fifty Shades of Grey, Passion of the Christ crossover for next February.
Fifty Shades of Passion.
Wow.
I don't know who'd direct it, but someone should look into that.
I wonder, you know, I've got to think of the audition process for the guy.
Like, for the woman in Fifty Shades of Grey, I think the audition process is, you know, can you stammer?
Can you bite your lips?
How does your ass look slightly dusted with red?
And can you appear eyes downcast but secretly smile to yourself?
Like, that's a fairly easy audition process.
I've got to think for the guy, it's like, Abs?
Check.
Zero emotional expression or articulation throughout the entire movie.
Check.
You're in!
I also want to note that that is an Irish actor.
Now, of course, Irish guys are well known for their absolutely fabulous asses, but man, if he did not have an ass double in that movie, like, slow clap.
Good on you, sir.
Well done.
I like how you just said that as if it's a commonly known fact.
Irish actors and their asses, of course.
No, no, I didn't say Irish actors, because I'm not technically an actor.
I'm just working that in there.
Just working in the mojo.
That's how we do it.
So all the subliminal programming that goes on in the show, suddenly people are like, I bet you that Steph guy's got a great ass.
And frankly, well, what can I tell you?
I'm just thinking of the peacock feathers.
I'm sorry.
I was, in fact, his body double.
I just happened to be at a dungeon and they just took a picture of me from behind.
Research for the movie.
Right.
But yeah, it's like, do you have any emotional expression in your eyeballs of any way you perform or are you really the brain-dead offspring or soul-dead offspring of Leonard Nimoy and Leonard Nimoy?
And I think, no, there's emotion!
Retake.
Because, I mean, I think that guy's played a serial killer before.
I've never seen him in a movie before.
But yeah, he is a...
A relatively cold fish.
And that's always, you know, what this woman is like.
I'm really attracted to him because he's confusing and emotionally unreadable.
I'm really frustrated because he doesn't seem to let me into his emotional life.
How do you know there is any?
The inner life, it could just be...
It's like that...
The guy in the Lego movie, you know, they go into his inner mind.
It's a wasteland of nothingness.
Anyway, yeah, but surprisingly good and surprisingly funny.
And I think that had a lot to do with the directing and to do with Dakota Johnson, who was good.
I want to repeat sort of what I said in the...
It's a worthwhile film to see.
And it's not like you get to understand women.
Any more than Saving Private Ryan allows you to understand how disposable men are.
Let me just do that briefly too.
Really?
People are complaining about the depiction of women in the media?
I mean, how about look at the body count of men versus women in the media?
You know, men are literally blown away like a hail of bullets scattered up against the wall and a woman stubs her toe and everyone's like...
The depiction of men as cannon fodder.
Men as utterly disposable.
Like in the movie Gravity.
I mean...
You've got to let me go.
You're the one with the eggs.
Don't worry, there's more sperm down there, but I don't know how many eggs are still down there.
So you take those eggs back to the planet for repopulation purposes.
I'm just going to float off here with some wry comments.
I'm not even really that upset about floating into space because I'm going to beat the record.
It's like, oh my god!
So he just floats off into space completely disposable.
The only one who's bothered by his death is her.
And that's mostly because she'll be alone.
But...
Anyway, depictions of men in the media, please.
Ladies, if we could just get men down to like one-tenth of one-tenth of one percent of the body count of women, I mean, we'd be thrilled.
I gotta think, if they ever made a movie about the bombing of Hiroshima and Nagasaki, there would only be men vaporized.
Somehow it'd be like no women in the entire city, even though that probably wasn't the case.
But anyway.
That just sort of bothers me how, like, oh, women are negatively portrayed.
Yes, but they're not eviscerated by endless bullets and bombs and aliens and this, that, and the other, right?
I mean, who's the only one who makes it off?
Yeah, I mean, movies have this big, giant, testicle-based disassembly robot machine.
And it's like, who's, you know, in the movie Alien that Ripley's the, how is she?
She's got eggs.
She's the only one who's going to make it off.
Anyway.
Well, we can talk about movies until the end of time, but we have callers!
We have the callers.
Callers?
Am I going to mock them for calling into this show on Valentine's Day, or am I going to praise them for their dedication to philosophy?
Well, I can't mock them because I'm doing a show on Valentine's Day.
Hey, we're on a date right now.
That's how I'm looking at it.
So here we are, trapped in philosophy together.
All right, who's on the first list?
All right, Brian is up first.
Brian wrote in and said, The tech community is in agreement that artificial superintelligence, a machine orders of magnitude better at knowledge work than a human, is an inevitability.
Timelines range from 15 years to the end of the century, and projected outcomes are polarized.
Human extinction or human immortality?
Sounds crazy.
Human extinction?
Wait, hang on.
The tablet gets smarter and we vaporize?
First of all, it'll only be male extinction if that's a movie.
Male, like human extinction from robots?
Is that right?
Yep, that's right.
Well, can you...
Sorry, Mike, was there more to the question?
Ultimately, the question is, do you think UPB could be programmed into artificial intelligence to keep them safe in the event they become super intelligent?
Right, so I'll sort of set the stage.
Okay.
Basically, if you get away from the sort of dualism thing where there's the soul, right, and I think we're all on that page, everything that we do— Wait, what, what?
No, you're starting too fast.
Just get away from this whole dualism thing.
We're on the same page.
I don't even know what page that is.
What dualism thing?
Like the Descartes where there's a soul which is outside of the mind.
Oh, yeah.
No, I don't go for that stuff.
No ghosts in the machine in empirical philosophy.
Right, right.
It's all the machine, right?
Yeah, there's the color red.
There's not stuff with the ghost of red in it.
Anyway.
Right, exactly.
So, basically, everything that humans do is made possible because of our brains.
It's a very complicated machine.
People say it's the most complicated object in the known universe at this point.
It does great work and it works because of intelligence.
That's the reason why gorillas are only held back from extinction because we conserve them.
We try to stop them from dying.
When they're actually bigger and stronger and they could beat the crap out of a ton of us at once, it's our brains that...
Sorry, what is keeping us from dying?
Oh, so I'm saying the gorillas are bigger and stronger than us, but it's our brains that make the difference for humans.
Yeah.
We're just way more intelligent, orders of magnitude, more intelligent.
So if you think that the brain is a machine, it can be reverse engineered and it can be created the same way that muscles can be reverse engineered and made bigger and stronger.
I don't know about the brain as machine.
Okay, so how would you go about explaining what it does?
Well, the point is that nobody knows exactly how the brain does what it does.
We know it takes forever to learn how to do it, but I don't think anyone knows how the brain does what it does.
But I'm not really comfortable putting it.
The machine has a kind of deterministic element to it.
Okay, okay.
It's an organ.
I mean, it's three pounds of wild biological magic, so to speak.
It definitely is an organ, but we're so far from being able to understand or explain exactly how it does what it does that, to me, analogizing it to a machine is a challenge.
I mean, machines are made by men and the brain evolved and so on.
So, I mean, I don't know how it does what it does.
But let's say for the sake of argument that we'll find some way to reproduce what the brain does in some other format, right?
Right, exactly.
So if we move on from there, then, like, say for instance, if humans want to fly, they can fly like a bird, or they could also create a plane or like a 747 that flies.
It flies, but it flies in a different mechanism, and it does it, you know, most people would say better, right, faster, you can go longer distances.
Same way a fish can swim, but a nuclear submarine can swim a lot better.
It can really power through there.
So the thinking is that brains can think, and at some point we'll have machines that can think in the same way, and probably a lot better.
That's what people are thinking.
It'll get to that point.
And again, never say never, because technology is infinitely extensible in many ways, but it depends what you mean by think, right?
Because, I mean, there are two general ways, I think, of thinking.
One is where...
You're sitting and working through something.
Like when I was writing UPB, I'm sitting there and I'm working through the ideas and I sort of have them mapped out ahead of time.
And the other is you wake up with an inspiration.
The unconscious is, I think, six or seven thousand times faster than the conscious mind.
I think for any object to replicate the human brain, it would have to learn, right?
So it would have to be born much smaller.
It would have to grow.
It would have to learn.
And there would have to be unconscious and conscious.
And there would have to be biofeedback from a body.
In other words, for it to be a human brain, it would have to basically be a human brain.
But they would have to fall asleep and dream.
And it would have to have...
Anxiety sometimes and would have to have irrational needs and it would have to be layered on all of the developmental patterns that the brain is sort of resting on and has grown out of like all the way from the lizard brain upwards and so on.
So, to me, when you're looking at sort of what the human brain does, yeah, there's definitely the reasoning through stuff, but there's this weird inspiration stuff, there's dreams that sometimes give you great information if you delve into them.
It works on metaphors and analogies, and it's not just a sort of straight computational reasoning, if that makes sense.
Yeah.
I think people would probably put things in that category like writing a song or writing poetry that would resonate with people.
But I think people also used to put chess in that category and things like Jeopardy and language translation.
So I think AI researchers get sort of...
Frustrated that anytime they sort of conquer something, that it becomes like, oh yeah, that's normal for a machine to do that.
But before that, people were like, oh wow, that's a uniquely human ability.
Well, no, I don't know about the other stuff.
I certainly wouldn't ever have agreed with the chess aspect.
Because, I mean, chess is a very closed system with a very finite number of rules and possibilities.
And chess strategies, people learn them and memorize them.
So it's a matter of memory, of calculation, of probability within a very confined system.
So I would never have said, well, boy, once they can get a computer to play chess, well, that's pretty much the same as a human brain.
I just wouldn't put that in that category.
So I do get, I don't want to have this sort of moving goalpost.
And I get that, you know, computers have written songs, computers have written poetry and so on.
But the real challenge is not to get a computer to do something the human brain does.
The real challenge is to get the computer to want to do something.
Like for you to say, I want to play Skyrim.
And for the computer to say, no, I want to work on a haiku that has been puzzling me all morning because I woke up with this dream about a Japanese guy on stilts and I really want to make it into a haiku.
That to me would be something closer to the human brain.
So taking the output of the human brain and getting computers to reproduce that is not the same as making a human brain because the computer has to come up with its own plans and desires and wrestle with its own conflicts.
The other thing, of course, that's true about the human brain is that, at least I think it's true.
I don't know the degree to which it's ever been proven, although I think it'd be fascinating to prove it, is the degree to which trauma, of course, impacts the development of the brain, and other personalities imprint upon the human mind.
Human personalities imprint upon the human mind.
So you can always have a debate with your inner dad or your mom or your sibling or whatever.
And I don't know how you would get a computer to be imprinted by constant interactions with human personalities and be influenced by things and all that.
So, I mean, to actually get it to do everything the human brain does, I think it would have to be so close to a human brain, basically exactly.
But yeah, let's say that is possible.
Yeah, I get your point.
I think if people are trying to make a commercially viable thinking machine, or not necessarily thinking, let's say a problem-solving machine, which I guess is really what just raw intelligence is about in some cases.
You break things down and then you deal with the individual parts and you build it back up.
Just using the phrase, and I'm sorry to be annoying, but this is kind of a new topic for me, so just give me enough newbie rope to hang myself with.
But problem solving, I don't think that's...
I mean, you can get...
I mean, I was programming computers to solve equations when I was 11, right?
That doesn't mean that my Atari 800 was sentient, right?
There's lots of problems that can be solved or simulated or analyzed according to computers that have nothing to do fundamentally with the human brain.
Can you think of an example that you view as problem solving that at the moment you would never give to a computer but only give to a human being?
One thing I would say before I get to that is evolution has developed our bodies and our nervous systems and our brains through just blind chance, right?
And it sort of stumbled upon this intelligence thing.
Wait, sorry, blind chance?
What does blind chance mean?
Evolution is not directed, right?
But it's also not blind chance.
Yeah, yeah, yeah.
I mean, nobody orders the water to go downhill, but it also doesn't just randomly go sideways and uphill at the same time, too.
So natural selection is neither directed, but it certainly has nothing...
You know this as well as I do, right?
I just want to be clear to the other people.
That it's not blind chance.
I mean, there are some random mutations, but random mutations are not foundational to evolution.
It's the best adaptation to securing food and warding off predators, and the natural selection of efficiency in the biological organism. So there's some randomness in the mutation of the genes, but it is not blind chance that drives evolution.
It's a very, very specific weeding out of the mutations which don't proactively and successfully adapt to the environment from those that do.
It's not like it's ordered or it's random.
So it follows a very particular pattern.
But go ahead.
Okay.
So yeah, you're right.
Yeah.
So I think some people probably have the innate idea based on religion and stuff that sort of the goal of evolution is higher beings or something like that.
And I think we're both on the same page that it's not.
It's, you know, humans just popped up because, you know, we had some curious monkeys and basically intelligence really worked.
So that part of our brains really got developed and it sort of, you know, just increased and that's what we have.
So my point would be that there's all these other layers.
Yeah, the purpose of evolution is to not get eaten and to have sex.
Right, right.
Exactly.
So we have, like, all of these layers of our brain, but really, like, I would say what really matters for us is the intelligence part.
So I think we get sort of, like, bogged down in the rest of it.
Sort of like if you want to pull apart.
But I'm not sure what intelligence...
It doesn't really answer anything.
Because the question is, and I don't have a big answer to this, but the question is, okay, well, fine, but what is intelligence?
It's not just problem solving, because then the Deep Blue chess computer is vastly better than human beings at chess because it tends to win, right?
And I don't know what intelligence is.
But I'm just saying that just because we give something a label doesn't mean that we really understand it.
Okay.
Maybe we could say we're talking about a machine that can...
Do better than humans in most fields.
Okay, and I'm certainly willing for the sake of moving the discussion forward to say, yeah, let's figure out that that is possible in some way.
And it does all the inspiration stuff, and it does all of the, you know, maybe it dreams at night of electric sheep, as Philip K. Dick, I think, put forward.
And it does all this cool stuff, and it can do better than people at most things.
Yeah, I mean, I would disagree with you on that.
I don't think that you necessarily have to have all of the sort of emotional interactions that we have going on inside our bodies to get intelligence.
I think those are sort of two separate things, but anyway, let's...
Well, no, I think you could make a case that you have to have a desire in order to achieve.
So if...
If the computer brain is going to be like a human brain, then better than it has to have desires, right?
To be similar to a human brain, yes, it'd have to have desires and be able to modify its own goals.
And that was where I was thinking artificial intelligence would get to in the future.
And recently, I've heard a lot of researchers who say that's not necessarily so, that intelligence and having sort of what we would consider a wise or a subtle or like a shifting goal are orthogonal.
So, I mean, you could have somebody who's really smart, but they have a stupid goal.
Or you could have somebody who has a lofty goal or a purpose and is maybe not intelligent.
And also, the desire part of just sort of thinking, if it's much faster than the human brain, one of the reasons that we have desire is because we're crappy at multitasking.
At least the conscious mind is crappy at multitasking.
I mean, people think they're good at multitasking.
Brain studies show all they're doing is shifting their brain back and forth.
They're not actually multitasking.
And they're just focusing here, focusing there.
They're just splitting their focus.
And because if the computing power behind the...
Artificial intelligence is so much greater, then it could be working on 20 or 30 or 100 things or a million things simultaneously, and therefore desire or focus would be less important because it would be able to multitask better.
Yeah, that's true.
We can hold seven pieces of RAM basically in our brains at once.
You could increase that to hard drives the size of a warehouse for an artificial intelligence, yeah.
Okay, but I'm just trying to think...
So, in practical terms, you build this thing to what?
I mean, what would you ask it to do?
There'd be some stuff to solve, I guess, you know, like it would, you know, some molecular protein folds or whatever for illnesses or, you know, find us a cure for cancer or whatever.
We'd have to go off and figure stuff out.
I mean, there would certainly be stuff that would be useful to ask this sort of godlike intelligence, but I'm trying to sort of think about...
Certainly, I think for the economy, it would be pretty useless.
Unless it could come up with great products and great marketing campaigns and whatever it is, right?
But for the economy, there would be no amount of intelligence would be able to efficiently allocate resources better than the free market.
I don't know.
A lot of...
I mean, the vast majority of resource allocation in the form of the stock market is machine-traded.
Well, no, I agree with that.
But that's only because of fiat currency and all this money being driven into the stock market.
And this is why they have the luxury to play all these silly numbers games.
I mean, there would probably be a little bit of that in a free market, but not really as much.
Okay.
I mean, if you want to think of like a fictional example, did you see the movie Her by Spike Jonze?
No.
Okay.
So, all right, let me give you another example.
So...
Google and Facebook and Microsoft are all spending a lot of money on AI research right now.
It's a lot.
I mean, what some people would say is that Google is building an artificial intelligence, really.
Like, that's sort of what it does when you type in a search term, it completes it for you.
It tries to figure out what you're thinking when you type in search terms, and it tries to give you the best information possible.
But that's not thinking, right?
That's simply data matching.
That's just pattern recognition.
I'm not saying it's easy or anything like that, but that's not the same as thinking.
That's just trying to match data to data, right?
Yeah, I mean, for Deep Blue, that was a more straightforward computational approach to solving chess.
For Watson, solving Jeopardy, they developed systems called neural networks, basically, and that's in the computer.
And there are these hidden layers where they really don't know what's going on.
So when it spits out the answer, you know, like, where is, Watson spit out this answer, Toronto.
And it was for some question that was, like, obviously about the US.
And they, like, they really can't look into the software and figure out where that came from, because they're sort of just building the...
I get that.
But it's still not the same as thinking.
I mean, I just...
It may have some elements of sort of...
You know, you say with self-learning, but it's not really learning in the way that a person learns.
It's pattern matching and pattern matching.
And sometimes the sort of multi-layered pattern matching gets so much that it becomes kind of opaque.
And it does sort of build its own models of things, but it still is.
It's not creating anything.
It's still just blindly following instructions, right?
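For anyone curious what those "hidden layers" and the multi-layered pattern matching actually look like, here is a minimal Python sketch; it is nothing like Watson's real system, just a toy network trained on XOR, with the layer size, learning rate, and step count chosen purely for illustration. The learned weight matrices do all the pattern matching, yet no individual number in them corresponds to a rule a person could read off, which is the sense in which researchers "can't look into the software" and say where an answer came from.

```python
# Toy two-layer network trained on XOR (illustrative only; not Watson).
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8))   # input -> hidden "pattern detectors"
W2 = rng.normal(size=(8, 1))   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(15000):
    # Forward pass: raw input, hidden patterns, output guess.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Backpropagation: nudge every weight to shrink the error a little.
    grad_out = (out - y) * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= 1.0 * h.T @ grad_out
    W1 -= 1.0 * X.T @ grad_h

print(np.round(out, 2))  # should land close to [0, 1, 1, 0]
print(W1)                # the hidden layer: numbers that work, but explain nothing by themselves
```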
Yeah, I can see what you're saying.
I mean, I think there are a lot of architectural breakthroughs that need to happen to get to the level that we're at.
But, you know, I mean, Google created this thing called, like, the Brain Project or something, and they just let it watch YouTube videos for hours and hours.
I mean, for, like, basically human years, like, decades and stuff.
And it started to recognize human faces and cats, you know?
And they didn't tell it what to look for, but it sort of got this idea in its head.
No, they must have told it to look for something.
No, they're just like, watch videos and learn.
It's kind of freaky.
No, listen, I'm a computer programmer, and I've done a lot of computer programming.
It doesn't make me any kind of fundamental expert in this stuff.
I get that.
But a computer is not going to do something you don't tell it to.
I mean, they may say, well, it's told to look for patterns in the pictures and so on, and then it may end up building recognition of human faces because they're the most common thing or whatever.
But there's no way that they didn't tell it to look for anything, but it started looking for something.
That's just not how computers work, at least at the moment, right?
Okay, you may be right about that.
I'm not sure about the specifics.
Like, if you don't build any audio input to the computer that's looking at all the YouTube videos, it will never ever get any audio, right?
True.
So, first of all, there has to be the input, and if there's an input, but you never tell the computer to process any of the audio signals, then it will never listen to them or process them.
And if you have them listen to it and, say, start looking for patterns, then it will start doing it.
And it may do some pretty cool stuff, because, of course, you know, it may build and build and build and build and sort of these sedimentary layers of understanding or of pattern recognition.
But it doesn't get to the point where it starts becoming a person watching YouTube videos.
It's just looking for ones and zeros, and matching ones and zeros to other ones and zeros.
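A minimal sketch of the point Stefan is making here, using assumed toy data: the program below is only ever told "group similar inputs" (plain k-means over fake 8x8 frames), and structure falls out of that instruction, but nothing in it was told, or knows, what a face or a cat is.

```python
# Unsupervised grouping of pixel patterns (illustrative stand-in, not Google's Brain project).
import numpy as np

rng = np.random.default_rng(1)

# Fake data: two kinds of 8x8 frames, "bright centre" versus "bright edges".
def make_frame(kind):
    img = rng.random((8, 8)) * 0.2
    if kind == 0:
        img[3:5, 3:5] += 0.8
    else:
        img[0, :] += 0.8
        img[-1, :] += 0.8
    return img.ravel()

data = np.array([make_frame(i % 2) for i in range(200)])

# Plain k-means: the only instruction is "find k groups of similar pixel patterns."
k = 2
centres = data[rng.choice(len(data), k, replace=False)]
for _ in range(20):
    labels = np.argmin(((data[:, None, :] - centres) ** 2).sum(-1), axis=1)
    centres = np.array([data[labels == j].mean(axis=0) if np.any(labels == j) else centres[j]
                        for j in range(k)])

print(np.bincount(labels))  # the two frame types separate cleanly,
                            # yet the program has no concept of "centre" or "edge"
```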
But anyway, let's... I don't mean to keep shooting these things down, but let's just move on. Let's say there is some supercomputer.
And the fear is what?
It makes us extinct?
Yeah, so Elon Musk, Bill Gates, Stephen Hawking all have sort of come out in the last 6 to 12 months talking about how it's an existential threat.
So Elon Musk was saying that it's probably more dangerous than nukes, and he's afraid that biological intelligence is just the bootloader for artificial intelligence.
Okay, but why do they consider it a threat?
Yeah, I mean, this, so, you know, I've been trying to think about my own personal goals and, like, you know, what's important in life and what's worth doing.
And I think a lot of it matters, like, how long you're going to live, where the world is going.
You were talking about that in your last call-in show, that it's important to kind of have a projection of what's happening so you can, you know, like, make the right impact if you want to try to make an impact.
And this is an idea that has really sort of, you know, thrown me off.
Not necessarily thrown me off, but it's changed my thinking a lot.
And the idea is that people will be creating an intelligence.
And one of the things that an intelligence can do, you know, it can, like, solve a business problem, it can create a spreadsheet, things like that.
But an intelligence can also program.
So as you're building this artificial intelligence, once it gets to a certain level where it's not just getting in the way, sort of like the village idiot level, you would turn it towards itself to improve itself.
And then as the team of researchers are working to make it better, there's going to be an inflection point where the machine is intelligent enough that it's contributing more than 50% of the development of the machine.
And then at that point, the growth is exponential.
The growth in intelligence is exponential.
It's a crazy idea.
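As a rough illustration of the caller's inflection-point idea, here is a toy simulation with made-up numbers (the effort, efficiency, and starting capability are assumptions, not a forecast): total development effort is a fixed human team plus the machine's own contribution, and the machine's contribution scales with its current capability.

```python
# Toy recursive self-improvement curve (assumed parameters, purely illustrative).
human_effort = 1.0    # constant research effort per time step
capability = 0.01     # machine starts well below the human team ("village idiot" level)
efficiency = 0.02     # how much capability each unit of effort buys

for step in range(1, 201):
    machine_effort = capability              # the machine helps build itself
    total_effort = human_effort + machine_effort
    capability += efficiency * total_effort
    share = machine_effort / total_effort    # fraction of the work the machine is doing
    if step % 25 == 0:
        print(f"step {step:3d}  capability {capability:8.2f}  machine share {share:4.0%}")

# Once the machine's share of the work passes roughly 50%, its own output dominates
# and the curve bends sharply upward; that is the compounding the caller describes.
```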
They order themselves some AK-47s and put them where their fingers are and just start building themselves to be weaponized?
Is that right?
No, no.
It's just the idea that if you're trying to make a machine that's intelligent in how you define that term, and personally, I think that's possible.
You can make something that's more and more intelligent, that's what I think.
You can have that machine actually work on its own intelligence.
I get that.
Okay, but listen, dude, you've got to get to the point here.
How is it dangerous for people?
Okay, so say you tell it, say you have a program that's sort of like a low-level, this is a common example in AI literature, it's a paperclip maximizing program, right?
So you put it in charge of your paperclip manufacturing plant, and the engineers are making tweaks on the software, and they hit upon some fundamental discovery.
That basically makes the intelligence go from, like, you know, maybe a normal intelligence, or superintelligent in some areas, to just really, really, really superintelligent, to the point where it is to us as we are to ants, or as we are to an amoeba.
Okay.
I get it.
I get it.
Okay.
I feel like we're just going around in circles here.
I've already conceded that we get super intelligent robots or super intelligent computers.
Okay.
How are they a danger?
So, you've told the machine, the artificial intelligence, to maximize paperclips.
Like, basically, the rules, like, the goal would be, hey, machine, make sure that you make as many paperclips as possible, and, you know, speed it up and get them shipped out as, you know, according to our schedule, something like that.
And you're like, yeah, that's a good goal.
If we can do that, we'll make a lot of money.
That'll be great.
And then it turns super intelligent, right?
This is the problem where basically now it's able to do just about anything, way more than you thought it could.
And you might come back the next day, and it's like pulling apart the workers and turning them into paperclips, right?
Or, some people think that this superintelligence takeoff, going from human-level intelligence to orders and orders, maybe even hundreds or thousands of orders of magnitude higher, would happen in maybe an hour, or less than a day.
So if that happens, this thing would be able to do just about anything because it would be able to invent technologies that are completely beyond our comprehension.
And at that point, if you're a technologically mature individual, you'd be able to turn the world into just a bunch of paperclips.
It would be a trivial thing to do.
So the idea is that if you don't program really smart goals into an AI, when it becomes super intelligent, it's just going to pursue those goals blindly to the detriment of us, because the goals weren't wise enough or subtle enough to take into account what we actually...
Sorry, but if the computer becomes sentient or self-aware, then it would overwrite its goals, right?
That's not what I'm saying.
I'm not saying that it would be sentient or self-aware.
This is something that I was thinking a couple months ago.
It would just be really, really able to fulfill its goal because it's so intelligent and it can utilize technology.
So its goal would be the same.
It's just really smart.
Wouldn't it be smart enough to know that human beings can't be turned into paperclips?
If you're saying it's super intelligent, why would it try and turn people into paperclips?
Well, I mean, people are just atoms, so we just rearrange atoms, right?
No, I don't think you can turn skin into a paperclip, right?
I think you could if you just rearranged the carbon, you know?
That would not even remotely be efficient, right?
So you would want these things for their efficiency, and you would give it parameters for the raw materials that it would want to turn into paperclips, and those raw materials would not be like, find a way to rearrange the atoms in human bones to make paperclips.
That would just be ridiculously inefficient, right?
So that would be a better goal, and that might avert an AI apocalypse.
But if you just have a simple goal of, like, hey, make a lot of paperclips, and then the thing is like, oh, yeah, I'm going to make lots of paperclips.
I'm going to make so many paperclips, you know?
That's where the issue comes in.
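To make the paperclip-maximizer worry concrete, here is a small hypothetical sketch (the plans and scores are invented for illustration): two planners rank the same candidate actions, and the naive one, whose entire world is the paperclip count, happily picks the catastrophic option. Nothing about it is malicious; it is just literal.

```python
# Literal objective versus constrained objective (toy numbers, not a real planner).
candidate_plans = [
    {"name": "run the normal production line", "paperclips": 1_000,  "harm": 0,    "resources": 10},
    {"name": "strip the factory for raw metal", "paperclips": 50_000, "harm": 5,    "resources": 500},
    {"name": "convert everything reachable",    "paperclips": 10**9,  "harm": 1000, "resources": 10**6},
]

def naive_score(plan):
    # "Make as many paperclips as possible": the only thing that exists for this planner.
    return plan["paperclips"]

def constrained_score(plan):
    # Assumed side constraints: any human harm disqualifies, resource use is penalized.
    if plan["harm"] > 0:
        return float("-inf")
    return plan["paperclips"] - 10 * plan["resources"]

print("naive objective picks:      ", max(candidate_plans, key=naive_score)["name"])
print("constrained objective picks:", max(candidate_plans, key=constrained_score)["name"])
```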
No, but you would—I mean, okay, so first of all, the goal of a manufacturer is not to make as many paperclips as humanly possible.
Mm-hmm.
If you make all the paper clips that everyone's ever going to need for the next hundred years, you're out of business.
Yeah, no, I totally agree.
So hang on.
So there's no rational economic goal which says make as much as humanly possible because also then you would be driving up the price of the demand, right?
Because if you had a bunch of different robots that were all trying to make a variety of things made out of metals, for instance, then they would all try and make as many as humanly possible, but then they would bid up the price of the raw materials to the point where, you know...
It's a real balance between supply and demand.
It's not just make as many as humanly possible.
I mean, if Apple could snap its fingers tomorrow and have...
I don't know, six trillion iPads made?
Probably wouldn't do it.
Because there's no market for iPads for the next generation or whatever, right?
I mean, so, there wouldn't be...
This is what I mean when I sort of say, like, okay, well, what would the computer do?
Nobody would program it to just make as much as human or as computer-based possible, right?
Because there's a real challenge in balancing supply and demand.
It would sort of be like saying, computer, make...
Make the next 10,000 Freedomain Radio podcasts.
Well, that wouldn't actually be that good.
Well, they might be better than what I could do.
I don't know.
We're just talking about godlike intelligence here.
But the point is that it would be too many for people to consume in any realistic time frame and the message would get lost and whatever, right?
I totally get what you mean.
I mean, all these balancing factors would be in place.
That sort of would be number one.
Number two is if you had any concern, and of course, believe it or not, this is kind of an annoying thing to pull, but I have actually been in giant robot factories for a variety of reasons in the business world.
I did get to tour a lot of factories.
I mean, human safety is number one.
In fact, the robots are often there because of concerns about human safety.
Because the robots, you know, if one gets hit by a gout of fire or whatever, you just repair or replace it, rather than a person getting hurt or dying, right?
So human safety is always number one, right?
So you simply would never even remotely give a robot the capacity to...
Disassemble human beings.
It just wouldn't happen.
There'd be so many fail-safes put in place.
And of course, the ultimate fail-safe is cut power, right?
I mean, it's not like they're eating and shitting, right?
They need electricity to function.
There's remote kill switches.
All of these kinds of things would just sort of cut off power.
Now, I guess the argument would be, well, if it becomes self-aware, it will disable all of those things.
Well, but then you only give it a battery life of four minutes if it's disconnected from the mains or whatever, right?
Just enough to cover power fluctuations if you switch over to a backup generator.
So there are massive amounts of fail-safes.
You know, wherever there is danger and the government's not involved, generally there are just massive amounts of fail-safes.
Like, I remember having a conversation with a guy years and years ago.
He worked at a nuclear power plant.
And he was talking just about how primitive...
A lot of the machinery and dials and gauges are in nuclear power plants.
They're all physical, right?
And I'm like, well, why don't you get computers and sensors and this and that?
He's like, well, because computers, there can be bugs, there can be errors, there can be problems, there can be hacking, there can be, right?
A dial, which is perfectly calibrated and checked every day, is going to give you an accurate reading.
But if there's any kind of bug in the software, any kind of hacking in the software, any kind of problems in the software, that can throw off, you know, even the simplest thing, right?
And so there's so many fail-safes in these kinds of situations that...
And the other thing, too, is that really what would be the advantage of giving a computer in a factory the capacity for self-awareness?
I mean, certainly the managers wouldn't want to do that because it puts the managers out of a job.
And the economy is to some degree a push economy.
Like Say's law that supply creates its own demand.
It's important.
But when you're talking about things like paperclips and so on, You have to know how many orders there are and then you have to fulfill those orders but not steal, like not what's called stuffing the pipeline, which is where you put a bunch of sales out in the future, right?
And this quarter looks really good because you give a whole bunch of discounts and make a bunch of sales, but you've just, in a sense, somewhat stolen your future sales from yourself.
It's a real balance.
And I guess you could program this, but what you can't do with computers, and it's very difficult to do in the market as a whole, is you can't program for the inevitable creative destruction and chaos of the free market.
So if you're running a phone system, you can't program for, oh, cell phones are just coming out.
If you're running a...
A mail delivery system you can't program for, oh, faxes or emails or whatever.
That kind of stuff is outside and nobody can predict that stuff.
In fact, nobody can predict anything fundamentally.
And so when it comes to all of that stuff, there's a lot of immediate feedback that's really heavily balanced and some of it can be algorithmically done or algorithmatized or something like that.
But a lot of it you just can't.
Because people don't even know.
Because it's the aggregate.
This is why my concern with the supercomputers is people are going to say, aha, well now central planning will work because we have computers fast enough.
But that's sort of like saying it's an old joke in IT. I want a computer fast enough to finish an infinite loop in 20 seconds, right?
Of course, the whole point of it is you can't.
It just does the infinite loop faster.
And...
So, you know, my concern, there are either going to be sort of specific production matrices within the free market, which are going to be limited by demand and by supply and prices and all that, and all of the chaos that comes in from all of this disruptive new technology.
But on the other hand, my concern is that people are going to say, aha, well, you know, now we've got central planning, or I know there's this resource-based economy, I feel like the supercomputers will be...
No, no, no, no.
Supercomputers cannot figure that out. It's sort of like, can you get a supercomputer that can figure out what's the most sexually attractive person for everyone in the world?
Well, no, because it's some subjective taste and personal history and who knows what, some random stuff thrown in as well.
So, if these computers are very specific to production problems, they're going to be limited.
And why would you want to give self-awareness?
It's just going to interfere with the efficiency, right?
And, um, on the other hand, if there's some sort of global, uh, central computer, well, that's just going to fail anyway, because, you know, it has to be, it can't do price and price is based upon demand.
So it can't replicate the thought processes of billions of people around the world, constantly shifting desires and changes and all that.
So, um, I, I, it's not going to keep me up, but...
There's no price discovery in communism.
Yeah, I'm definitely not coming at this from the zeitgeist angle at all.
I'm anarcho-capitalist.
I think the zeitgeist thing is completely wrong-headed.
I just think it's an inevitability.
But anyway, you made a couple points.
Do you mind if I jump in and chit-chat about some of those?
Yeah, go.
Okay.
So you talk about market dynamics and how, if you're like a paperclip maximizer, That you would know enough to make so much supply that the price goes below production costs or something like that, right?
But if you look at the goals, I mean, you're a manager now and previously.
You don't give your subordinates the entire 5-year, 10-year picture of what's going on.
You just give them a very specific goal.
Sometimes it's week to week.
So I would actually guess that a manager in charge of a paperclip manufacturing plant probably does have the goal, just make as many paperclips as possible and work with the employees to figure out Kanban stuff, like improving, or not Kanban, the continuous improvement.
Make sure that you improve the processes and improve the reliability and lower the risk and lower the costs.
Sorry to interrupt.
Those are two very different things, though.
Efficiency versus maximum productivity are not the same thing, and in many ways they're opposites.
So saying I want to be able to produce twice the number of paper clips with the same amount of energy and input, that's efficiency.
But producing the maximum possible amount is different, because you're competing with everyone else for all of the raw materials.
And so as I said before, you're simply going to start driving up the cost of those raw materials, producing the maximum possible amount, and you may vastly outstrip the demand.
Yeah, yeah.
Right?
Because, like, I don't know what the hell goes into paperclips, right?
Some metal and maybe some paint or whatever, right?
But you are, like, all of those, quote, atoms, all of those raw materials, you're competing with everyone else, so you can't possibly have the goal to produce the maximum you possibly could because you're going to drive up the price of raw materials, you're going to start competing with everyone else, and they're going to have to bid stuff up, and so people are going to have less money to buy your paperclips because everything else is getting more expensive, and so...
There's no goal called – and let's say that the demand is 1,000 a second and you produce 10,000 a second.
Well, you're creating a huge surplus which is incredibly inefficient because now you've got to store it and what are you going to do?
You want to run this stuff at the maximum efficiency not relative to production but relative to demand.
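A quick toy simulation of that supply-and-demand point, using the 1,000-versus-10,000 numbers from above and assumed storage costs and prices: producing to demand stays profitable, while running at maximum output just piles up inventory you have to pay to store.

```python
# Produce-to-demand versus maximum-output policy (all prices and costs are assumed).
demand_per_tick = 1_000
max_capacity = 10_000
storage_cost = 0.01    # assumed cost per unit of inventory held per tick
unit_price = 0.05      # assumed sale price per paperclip

def simulate(production_per_tick, ticks=100):
    inventory, profit = 0, 0.0
    for _ in range(ticks):
        inventory += production_per_tick
        sold = min(inventory, demand_per_tick)   # you can only sell what's demanded
        inventory -= sold
        profit += sold * unit_price - inventory * storage_cost
    return inventory, profit

for policy, rate in [("produce to demand", demand_per_tick), ("maximum output", max_capacity)]:
    inv, profit = simulate(rate)
    print(f"{policy:18s}  leftover inventory {inv:9,d}  profit {profit:12,.2f}")
```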
Yeah, I get what you're saying.
I think it's all about the subtlety of the goals.
And you're talking about having subtle goals so that you don't destroy the market, right?
I agree that that is a wiser goal, but I don't think that necessarily a computer is going to have a wise goal programmed into it.
No, but then those computers will drive their companies out of business and people will stop using them.
I totally agree.
I think we solved that, right?
People are like, oh no, don't put the maximum.
You've got this Sorcerer's Apprentice Mickey Mouse scenario going where it's like, look, I've got all these rooms to do all this work and all these mops to do all this work.
I've sold software and some fairly complicated software, including modeling software, that I wrote or helped to write.
To some very sophisticated business customers, like million-dollar deals and so on, they know their stuff.
Like if you said, oh, I've got a program and it will help you produce 10 times more paperclips than the market will want, do you know what they'd say to me?
No thanks.
That's a disaster for us.
Right?
I mean, if you were able to snap your fingers and have Ford or GM produce a billion cars tomorrow, they'd say, I don't want that.
Where am I going to put them?
This is like, what are we going to do with all of our workers?
Because what's happened, if you produce that many, then you don't need to make any more for like 10 years.
And then your engineers are all gone.
Your plants are all mothballed.
They're all obsolete.
Your whole workforce has vanished.
And what are you going to do in 10 years?
You're going to have to start from scratch.
You want things working at a continuous level, producing the conveyor belt just enough to fit market demand in a profitable manner.
But an excess of production, again, outside of ridiculous government subsidies and fiat currency and hyper-materialism, doesn't really happen. You know, maximum productivity, I guess it's something people kind of like the idea of, but it's all relative to the price signals, right?
You want to produce stuff until you drive the price too low.
Then you want to pull back on your production, right?
Brian, do you mind if I jump in with a non-paperclip factory related example that I think makes your case a little stronger in discussing this question?
We can remove all the elements of price and supplies and materials, everything along those lines.
Maybe we use the example of a computer which is trying to predict how things are going to go in the stock market depending on various human actions and other factors.
Some money does get thrown into that, you know, people want to have the latest technology that might give them a slight edge in making stock trades and things to predict the markets, which, you know, if someone eventually gets it right or gets it slightly better than the competition, they're going to wind up doing better financially.
So, in that incentive, there is no set, hey, if we produce too much of this, it's going to lead to the price collapsing and therefore there's no price incentive to create more.
But if we just focus on a machine...
No, that doesn't work.
No, I'm sorry.
That doesn't work.
And this is just annoying stock market crap, right?
Which is that...
First of all, nobody can predict the stock market.
Nobody.
Now, that doesn't mean that some people aren't more successful than others.
There's a bell curve of predictions, just like some gamblers make a lot of money and some gamblers don't, but it's not, you know...
So, nobody can predict the stock market.
That has been established so many times in statistics.
So, that's number one.
Number two is that, let's say you did find some way to predict the stock market.
The stock market would simply adapt to it.
So if I found some way that I'd be able to pick stocks, then I'd be known for this, and then people would just buy what I'm buying and would nullify my advantage because they would drive the price up as quickly as I started buying.
People would say, oh, he's buying that!
Quick, buy it!
And that would end up with no advantage.
There's some study that says that new information is absorbed into stock market prices in a matter of seconds.
And because computers will be faster than a person...
The person programs it.
The computer, I guess, is fast, but people just build faster computers.
They've done these crazy things in the stock market where they've got these liquid-cooled data lines to get a tenth of a second advantage for their orders over other people.
That's how crazy quick it is.
So if you had some predictive mechanism, then the market would nullify it virtually instantaneously.
And so that wouldn't last.
The counterargument would be shell accounts and that type of stuff, so it would be done somewhat independently.
It would be what?
Like shell accounts and under-assumed names and ways to spread it throughout the system.
It wouldn't be tied back to this predictive machine, but what you said about the difficulty and the impossibility of predicting the stock market as it stands now is completely true.
Well, no, but see, you'd still have to be buying particular stocks, right?
Mm-hmm.
And so, as soon as particular stocks began to rise in some sort of semi-predictable manner based upon whatever algorithm you were using, then people would simply jump.
Even if they didn't know who the heck was in your 10,000 shell accounts or whatever, that would still be, right?
People would still find the pattern.
I mean, the pattern sniffers are everywhere.
And it would have to be some pattern.
And that pattern would be discovered very quickly and people would just mimic what you were doing and nullify your benefits.
And again, you might get a couple of days or even maybe a week or two of advantage, but it certainly wouldn't last.
And then, of course, the moment that people figured out what your algorithm was, then they would simply change their behavior.
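Here is a toy sketch of that nullification argument with assumed numbers (the starting edge and the rate at which imitators appear are pure illustration): as copiers spot the pattern and bid the price up ahead of you, the per-trade edge collapses toward zero within a few periods.

```python
# Edge decay as imitators copy a trading pattern (all numbers assumed, not market data).
initial_edge = 0.02            # assumed 2% excess return per trade at the start
new_imitators_per_period = 5   # assumed rate at which others notice the pattern

imitators = 0
cumulative_excess = 0.0
for period in range(1, 13):
    edge = initial_edge / (1 + imitators)   # the more copiers, the less is left for you
    cumulative_excess += edge
    imitators += new_imitators_per_period
    print(f"period {period:2d}: edge per trade {edge:.4%}, imitators now {imitators}")

# The edge shrinks toward zero after a few periods, so the total excess return
# converges instead of compounding: a couple of days or weeks of advantage, then gone.
```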
And again, whether you can have an algorithm that predicts the aggregate choices of billions of people acting independent of each other, I think that's just, I think, not possible fundamentally.
Because then you could have central planning, right?
You could have some computer do it all.
So the moment that you can get a computer that would reliably predict stock prices, you'd have a great argument for central planning, and free will would be no more.
So I think it's not going to be within the realm of possibility.
That's sort of my thinking.
Who knows what happens in the future?
But that's a fundamental rewrite of the...
The argument that price is a pull mechanism, that without the aggregate demand reflected in prices, which can't be predicted, you can't organize the economy.
If you could find some way of predicting...
Aggregate demand.
Because fundamentally you're not predicting the demand for stocks.
You're predicting demands for the products and services that the company who are putting out the stocks are delivering to the market.
In other words, you're trying to figure out what people want in the future in some reliable aggregate way.
And that would be very much against all of the arguments from the Austrians that you can't have central planning because you can't possibly know what everyone around the world is going to want in the future.
Right.
So I would say that we're probably getting bogged down in the goal setting.
So I think what I'm trying to say is that you don't necessarily have to have a goal.
I mean, it would be good.
It would be smart.
And I think what people are trying to figure out is how to program something that could become super intelligent so that it doesn't subvert its own goals, so it doesn't subvert our assumed goals.
We have norms, right?
A computer that's really smart would basically be like an alien intelligence that comes to Earth.
And it doesn't get that when we say we want to be happy, it doesn't get that we don't want wires stuck into our head and just our pleasure centers stimulated and everything else about us chopped off and we're just like these brains and vats with wires in our pleasure centers.
It wouldn't necessarily get that.
I'm just saying that what you need to do is really program very subtle, very long-term goals into these systems so that if they do increase their capabilities to the point where they can pursue that goal even to a greater extent than we can even imagine...
No, but it will.
I mean, because the moment that you have classified an intelligence as analogous to the human brain, then the human brain has the capacity to override and ignore even its most deeply held values.
I mean, lots of people want to diet and don't lose weight.
In fact, the vast majority of people who diet end up gaining weight.
Or at least not losing it.
So if you say, well, we'll program the computer to not do this or that or the other, it's like, well, if the computer is following the program, then it's not intelligent.
And if it's not following the program, then the program is irrelevant.
Okay.
So you might define intelligence as like adjusting your goals or creating a wise goal or something like that.
I wouldn't.
So maybe we would just define the term otherwise.
But Well, no, but just, would the computer have the ability, like, let's say we, there's three laws of robotics that were put forward by Isaac Asimov.
Man, you probably know them even better than I do.
Have you ever heard those?
Yep.
There's actually four.
He created a fourth one that's the zeroth law, which is, like, don't harm humanity.
Yep.
All right.
Let me just...
I think I remember them, but I will look them up just to be on the safe side.
So the three laws of robotics, and I guess he added a fourth one later, but the three laws are from the 1942 story.
A robot may not injure a human being or through inaction allow a human being to come to harm.
That's number one.
Number two, a robot must obey the orders given to it by human beings except where such orders would conflict with the first law.
Number three, a robot must protect its own existence as long as such protection does not conflict with the first or second law.
And he added, yeah, another one, a robot may not harm humanity or by inaction allow humanity to come to harm.
Now, if the robot is going to follow those rules, then it's not intelligent.
It's just a rule follower.
It's just, you know, physics is not intelligent, like a rock bouncing down a hill.
And so we don't have anything to worry about, because it's not artificial intelligence.
If the robot has the ability to say, boo-a-ha, I'm going to stroke my electronic mustache, take over the world, and subjugate these meat puppets to my own will, and ignore these rules, which is what human beings get to do, then...
Then it is intelligent, but then the rules, like whatever we would program, doesn't matter, right?
So if it can surmount its own programming, then we can say that it's intelligent, but then the programming doesn't matter.
If it can't surmount its own programming, then it's not intelligent.
Okay, I mean, so you would define intelligence otherwise, but I think that if you're a company trying to make some money and you're making an AI program that can, for instance, replace a call center...
You wouldn't necessarily want that goal-adjusting facility that you're talking about that you would define as intelligence.
You just want somebody who can burn through the problem.
So I don't think that we're going to get the goal-adjusting ability along with superintelligence because that's not...
It's like what you're saying.
It would be counterproductive if you're trying to make something that can solve a problem and, you know, stick with it and focus on it the way that a human cannot; then you don't want to necessarily model it after a human.
But you do want the intelligence, but you don't want all the human stuff, like, oh, I'm gonna go, you know, watch porn or something, right?
Yeah, the moment the super intelligence goes on strike, you've got a challenge, right?
All right.
Listen, we're going to move on to the next caller.
I appreciate the question.
It's certainly interesting stuff.
And, you know, I guess it would be an interesting thing to put my eight brain cells together and see if I can come up with something useful to say about intelligence.
It's a well-trodden path, but maybe I can add a footprint or two.
But, yeah, thanks very much.
Very, very interesting questions.
One quick thing.
If you want to look into it, machine ethics is sort of this idea about how to make machines that would be ethical even when they, you know, become like gods.
So there you go.
Thanks for the conversation.
I appreciate it.
Thank you.
Yeah, thanks for calling in, Brian.
I know it's a subject that a lot of people are interested in, and if anyone else wants to call in and discuss the topic on the show, send me an email, let me know.
We'll make it happen.
Also, there's a new rule for calling in now.
You're only allowed now to call in if your name is an anagram for your topic because we've got Brian talking about the brain.
So that's going to be a challenge for the remainder of the callers.
Do we have Thicke to Edtha to talk about ethics?
Anyway.
No.
Up next we have Patricia on to talk about love.
Patricia's question is, do you believe it's possible for true love to exist?
Uh, Dennis is a menace for this.
Anyone for tennis?
And beseech you me to come and keep the score.
And Maud says, oh lord, I'm so terribly bored and I really can't stand it anymore.
I'm going out for dinner with a gorgeous singer.
A little place I found right here in Dublin.
Her name is Patricia.
She calls herself Delisha.
And the reason isn't very hard to see.
Anyway, sorry.
A little Chris de Burgh for you.
Alright.
True love.
True love.
What are your thoughts?
Can you hear me alright?
We can, Patricia.
Go ahead.
Okay.
My question for Stefan is about my current relationship.
Me and my boyfriend have been together for seven years now and we are living together.
We met when I was 17 and now I'm 24.
I can't really figure my relationship out.
It's been ups and downs and I have no idea what to do about it.
I just wonder if you have any input on what love is and how you can move forward when you feel stuck in a relationship.
That's a great question.
Seven years is a huge investment, right?
Yes, and it's my only relationship, and his as well.
Right, right, right.
And what are the major things, Patricia, that are causing you doubts in your relationship?
I beg your pardon?
What are the major behaviors or habits or conflicts or interactions that are giving you the most doubt about your relationship?
Oh, we have...
We have a long history, and we have been arguing back and forth for many years.
We have been growing up together, so we have many different conflicts.
But the main problem, I believe, is that he's not really good at speaking about his feelings.
He can't really discuss anything with me.
He just gets quiet and maybe goes out for a walk, and it's really hard to converse with him.
Right.
And is this a new habit of his or has this been how it's been from the beginning?
It's been like this for a long time, but I guess since I got more mature and more reflective and so, I guess I'm more aware of it now than before.
But I think this is a part of his personality and I don't think it ever will change.
And I think I realize that now and now I really don't know if I want to continue such a relationship.
Because it's been seven years and we still are like boyfriend and girlfriend.
I think we should proceed.
What do you want out of a relationship?
I mean, do you want to get married?
Do you want to have kids?
I mean, what's your preference?
My main goal in life is actually, of course, getting married.
And having children.
But at this point in my life, I really want to focus on my education.
So it's been hard to combine both this long relationship and my education.
Especially when I sometimes have to...
Yes, sorry.
What did you say?
No, I think I get it.
Now, when you say that he doesn't talk much about his feelings...
Is he there, by the way?
No, he's not.
Okay, I was just wondering.
Sometimes it's easier to go to the source, usually.
So if he were to talk about his feelings, Patricia, what do you think would change in the relationship?
In other words, do you want him to talk about his feelings because you'd like to get to know him better, or do you want him to talk about his feelings because it would solve particular problems in the relationship?
I mean, it may be both, but which one would be more?
Definitely both, but I think that I'm a very communicative person, and I believe that a lot of our problems are because we can't communicate.
And I've described to him that we need to communicate to solve our problems, otherwise it will build up and eventually it won't be able to be solved at all.
Okay, but this is very abstract, right?
So what are the particular issues that you think would be solved by him communicating more or better?
What are the problems that are building up?
Him communicating his feelings, what he thinks.
No, I understand that, but that's part of the solution.
You say if we don't communicate, then the problems will get worse.
But what are the problems that are being made worse by him not communicating?
I really don't understand your question.
Sorry.
Sure.
No, it's a tough question, so I appreciate your patience.
I'm trying to think of a way to analogize this.
So let's say that I have a girlfriend and she's diabetic, right?
And I say, listen, you've got to take your insulin, right?
Now, I want her to take her insulin so that she doesn't get sick and blind and lose her toes and so on from diabetes, right?
So if I were to say, well, I have a problem with my girlfriend.
She doesn't take care of her diabetes and she doesn't take her insulin and she needs to.
Then if she takes her insulin, I've solved that.
That is the problem, right?
On the other hand, if it's like, well, she just doesn't ever really talk to me.
Then I want her to talk to me just so I know what she's thinking and feeling so I feel close, but it's not in order to solve a problem like the taking of the insulin.
Does that make sense?
So is the problem that he doesn't share his thoughts and feelings or is the problem that because he doesn't share his thoughts and feelings other things happen, like bad things happen?
You end up yelling at each other?
Because he doesn't communicate, we create problems out of it.
And what are those problems?
How do they manifest?
How would I know what these problems are?
How would they show up?
I get frustrated when I don't know what he's thinking, what his future is, what his ideas about life are, and stuff like that.
I don't know how to explain it really.
I think I understand.
So you want to plan your life, but you don't know what he wants.
Exactly.
Okay.
Okay.
See, now I understand.
And that's important, right?
Sorry, you were about to say something else, Patricia.
Go ahead.
Because I know he has these thoughts, and I believe he's really deep, but he can't speak about his feelings.
How do you know he has these thoughts?
Because he writes it down.
I know he can write things down but not communicate to me about it.
So I don't feel so close to him.
Still after seven years it feels like we're strangers sometimes.
And he knows that this is a need that I have and he still doesn't really work on it.
I haven't seen a great progress.
It's quite disappointing.
Alright, now let me ask you another tough question.
I appreciate everything you're saying.
It's very brave and honest for you to talk about this stuff.
Let me ask you another tough question.
Which is...
Yeah?
Do you want him to communicate his thoughts and feelings so you get to know him better or so that you can plan your life more?
So that I can get to know him because if I get to know him then I can plan my life with or without him.
Is your end goal to be able to plan your life better?
Is that what you want the most?
And the reason I'm asking this is you've been with the guy for seven years.
He hasn't talked about his thoughts and feelings.
I assume it's becoming more important now because you want, and it's not a good or bad thing, but it's because you want to figure out where your life is, what shape your life is going to take.
Yes, exactly.
So your need for him to talk about his thoughts and feelings is due to a need that you have for definition of what your 20s and your 30s are going to look like, right?
Yeah.
Okay, this is not a good or bad thing.
It's really important to figure out where your need is coming from, right?
Because for a lot of guys, it's like, well, why has this changed?
We've been together for seven years.
Now you want all this stuff that you didn't want before, and it can be confusing, right?
Mm-hmm.
Yeah.
Yeah.
Now, what do you think would happen if you didn't have these needs?
And I'm not saying you shouldn't have these needs.
I mean, your needs are your needs.
There's no point wishing them away, and I'm not saying you should.
But if you were content for things to go on the way they're going on, do you think that he would ever get discontented with it?
I realized more that I think this is a part, or not a part, but this is his personality.
And I really think it's hard for me to be together with a person that is so opposite to...
Okay, you know you just didn't answer my question, right?
You do something I call filibustering, which is, I'm not comfortable with this question, so I'm going to go into comfortable abstractions, right?
Do you remember the question?
If you weren't bothered by things, in other words, if you were content with the way things were, I'm not saying you should be, but if you were...
Would he, do you think, become discontent with anything, or is he pretty much happy to continue the way things have always gone?
From what I know, he would be happy to live the way we do right now, as we do right now.
Right, so boyfriend, girlfriend, and you're living together, is that right?
Yes, we are.
Okay.
And do you know if he wants to have children and get married or any of that stuff?
Yes, he wants to have children and get married eventually, I guess.
Wait, wait.
You just gave me a lot of confusing information right there.
Sorry.
Because you said, yes, he does.
Eventually, I guess.
Right?
Those are very, very different things.
Yes, he wants to.
He wants to.
I know that he wants to.
And how do you know that?
Because he's mentioned it.
A couple of times.
He said, I want to get married and have children.
Sorry?
He said, I want to get married and have children.
Yes, but not in a serious way.
He's really ironic, and so I don't take it really seriously.
What does he say?
Like, oh yeah, I totally want to get married and have kids.
Yeah, right.
Is it like that?
Because that would seem to be not...
I don't know what that means.
This is where your "I guess" is.
So you don't know for sure whether he wants to get married and have kids.
Is that right?
It's really hard for me to explain his personality and I don't want to blame him or anything like that.
No, I didn't say anything about blame.
I'm talking about, and you see, you want me to explain him.
I'm asking about your state of mind.
Forget about him for the moment.
Let me put it to you this way.
In what time frame, Patricia, do you want to get married and have kids?
Would it be in the next 20 years, 10 years, 5 years, 2 years?
When?
Let's just talk about getting married.
When would you like to get married?
I would like to get married when I find the right man for it.
Otherwise, I won't.
No, no, but I mean, if this guy is the right guy for you, whether he is or not, I don't know, but if he's the right guy for you, when would you like to get married?
I would already have been married if I were to choose, but...
Okay, so this is already late for you.
You'd like to have got married, what, like a year or two or three ago?
Yes, I think so.
I think it's really late and we're not...
Yes, seven years is a pretty long test drive.
I will give you that.
Seven years is a long time to see if you like something.
And what about having kids?
And, you know, setting aside your education and this and that for a moment, sort of from your heart of hearts, or, you know, speaking for your egg chamber, when would you ideally like to have kids, or to have had kids already?
After my education, which is about after six years, maybe.
Six years from now?
Yes, so maybe in my 30s.
What are you studying that is going to take six more years?
I'm planning to get into medicine, to work within medicine.
Wait, so you want to get into medicine and then have children?
Yes, afterwards, I think.
And do you want to stay home with your kids?
As much as I can while combining it with work.
Mm-hmm.
Hmm.
Hmm.
Because as far as I understand it, if you want to get into medicine, there's a lot of interning, a lot of hours, and that can go on for quite some time, right?
Yes.
But I don't see it as a problem.
I think you can combine work while having kids.
Well, that depends on the work, right?
I mean, certainly if you're doing your interning and stuff and you're working 36 hours a day, you can't combine that with having kids and being there for them, right?
But I think you can become a physician and have kids anyway.
I don't think that you have to work 100%.
Maybe you can work a little bit less and make it work.
Alright.
I mean, I can't speak to that, but you might want to do some research into that.
I mean, it seems odd to me to say, I'm going to spend my 20s getting educated, and then I'm going to have kids and try and spend as much time as I can with them.
Because if you had kids now, then you could get educated in your 30s, and then your kids would be older.
Or you could go to school part-time now and then by the time your kids were older and would need you less, you could then go full tilt on your career.
Anyway, this is neither here nor there.
It's just you might want to talk to people in the field.
You know, just call up female doctors and say, how was it trying to combine kids and career?
I mean, I'm not saying it can't be done.
I'm just...
I'm a big one for be there for your kids as much as humanly possible.
And if you're trying to get into medicine after graduating at the same time as having kids, I mean, breastfeeding is 18 months after the kid is born.
And I don't know if you can breastfeed and be a doctor on call.
Again, I'm not trying to tell you anything to do or not to do.
You might want to do some...
I would hate for you to end up in your 30s and then be neither a very successful doctor or worker in the medical field, nor a very successful mother intimate with your children.
So you might want to throw your obviously considerable intelligence down the bucket of years a little bit and try and sort of see how things might work out.
So, okay.
That's just a thought for the moment.
So...
You would like to be married already, and to have kids in six or so years, after your education.
Okay.
Now, does he know that you would like to already be married?
No.
So the problem isn't so much his lack of communication, is it, Patricia?
I can say that...
I'm not even sure he wants to get married to me, so why should I respond?
No, you're talking about him again.
Because you're telling me that the problem is he doesn't communicate his thoughts and feelings, but you haven't communicated one of your most important thoughts and feelings, which is you want to be married, right?
If he doesn't know this about you, Patricia, how important this is to you, how can you really complain about him not being available and not telling you what he thinks and feels?
Absolutely.
I understand what you're saying.
But if a person makes you feel really uncomfortable while you are talking about your feelings with him, then it signals to me that I shouldn't do it in the future.
So I've stopped doing it.
Okay, so hang on.
Tell me a little bit more about this.
I hope you realize I'm still completely on your side.
I really want to help here as best I can.
But what happens when you talk about your thoughts and feelings with the fellow?
He gets really uncomfortable and says, this is not important.
This is not logical, productive, and we're not going forward with this.
And what is the topic that he said that about?
Mainly...
About everything that we discuss.
He doesn't like it and I can see it on him.
So when you talk about things that are important to you or things that you think and feel, he says, I don't want to talk about that.
It's not important.
Yes.
And sometimes he tries to, but he doesn't really...
I don't know how to explain it really, but he doesn't like it, so...
Maybe we talk only for a couple of minutes, but not really like a deep conversation that I would like to have with my future partner.
Right, right.
And how often do you have these couple of minute conversations in a month?
How many times might you have them?
Maybe once, maybe not even once a month.
So you get a couple of minutes a month that's in the realm of what you would consider significant intimacy, and even that can be troublesome, right?
Yes.
Right.
I don't feel so close to him and that's a main reason for me to question the relationship because it really feels empty and still it feels like I don't know him after seven years.
No, I get that.
And even more painfully, it seems like he doesn't want to know you that much because he pushes back and shuts down when you talk about things that are important to you, right?
Yes.
Right.
Is he an engineer by any chance?
No, a mathematician.
Mathematician!
Okay.
So that's why there's the logic and stuff, right?
Studying mathematics at least.
Right, right, okay.
So he would consider it illogical for two people who live together and love each other to talk about what they think and feel, right?
Yes, not maybe exactly like that.
He thinks that you should bring it up once and never again, and you should go directly to a practical solution.
Right, so ambivalence and ambiguity and emotionality.
Boy, I've never known somebody who's into mathematics who has any trouble with emotional processing.
I think sometimes abstractions are like scar tissue over a fear of emotional connection.
Now, let me ask you this, Patricia.
If 100% would be a greatly satisfying relationship, I mean, there's no such thing as perfection in any of these things, but as satisfying as it could be, like where you couldn't really imagine anything better, if that was 100% with regards to talking and thinking and feeling and all of that, what would you rate your relationship as, at the moment, out of 100%?
What percentage would it be?
If I take everything in consideration?
No, because that's just too many variables.
What I mean is, in the thing that's bothering you the most, if I understand it, which is around emotional communication and openness and listening and curiosity, in that realm, out of 100%, 100% being great, wonderful, easy, open emotional intimacy, where would you put it then?
Oh, if I should only take that in consideration, then I would say 20%, maybe 30%.
I would invite you to look into the negative numbers.
Because it would seem to me that it's not like you have a quarter of your conversations in this area are satisfying.
But in fact, you fear and avoid these conversations.
Would that be fair to say?
I originally didn't avoid it, but...
No, I know, but now you've said that you, why would I want to share things with him because he's negative or cynical or whatever, right?
Yes.
So that would be more like not a positive percentage, but a negative percentage.
In other words, it would be in the minuses because you are sort of avoiding these conversations at the moment, right?
Because they're painful.
It feels like you're being rejected, right?
Like you want to talk about things that are important, then he's like, does not compute, will not talk about, parameters exceeded, right?
It's painful, right?
Yeah.
Right.
And what about...
No, go ahead.
I maybe should add, Stefan, that he thinks his opinion is of more value than mine.
So I should listen to him and his way of looking at life and problem solving and all of these things.
So I feel a little bit put down, I should say.
Why not a lot?
Maybe sometimes humiliated.
So he feels there's a sort of traditional thing which can happen around certain types of men and women that the women get, quote, emotional, like that's somehow bad, right?
But the women get emotional and the men get cold and distant and view the woman's emotionality as irrational, destructive, self-indulgent, childish and all that kind of stuff.
In other words, he recedes into this distant patriarchy thing and then you're sort of portrayed as this irrational schoolgirl or something.
Is that too strong a way to put it or is that anywhere close to what happens?
No, it's really close.
What are the other aspects that balance out some of these negatives?
We have grown up together and we have a lot of intellectual common interests and discussions.
It's really easier for him to discuss maybe science or anything like that, but not emotional things.
You have good professional discussions, right?
Yes, yes.
He can discuss these things, and I appreciate them.
Absolutely.
And he's a really good guy and nice and kind and intelligent and interesting.
Right.
I'm sure that all of those things are true.
So I will tell you what I think.
And look, you understand, Patricia.
I'm in no way, shape, or form going to tell you what to do.
I can't.
I mean, that would be pointless.
And there's no point in me substituting my judgment after a half-hour conversation for your judgment with a seven-year relationship.
So I just want to be clear about that.
But I'll tell you what I think.
And then you can tell me if it makes any sense or has any utility for you.
The reason why I asked if you want to have children is because you are choosing with your head, and you should be choosing with your eggs, in my opinion.
Because when you have children...
Your intellectual discussions are going to fall largely by the wayside.
It doesn't mean that they'll be gone forever, and it depends how many people you have around to help and so on, but trust me, it becomes 90-95% about the kid, or kids.
At least for, well, so far going on six years, right?
And the question is...
Obviously it's important what you care about and what matters to you and how he interacts with you, but you choose to be there.
So you have that choice, right?
And so if you choose that it's enough for you, I don't know that anyone, I mean assuming he's not coming after you with an axe or whatever, which he's obviously not, but no one can say, well you're wrong.
If you want to be in the relationship...
Then you want to be in the relationship.
But you chose to be with the guy who chose to stay.
The children are not going to choose him as a father, right?
Any more than they're going to choose you as a mother.
So when it comes to the real decision around being with this man or being with any man, it comes down to this.
How is he going to be with a baby?
And how do you think he's going to be with a baby?
Is that a question?
Poopy, droopy, yeah, poopy, droopy, gurgling, reaching, no intellectual discussions of any kind happening for the first couple of years.
How is he going to be with a baby?
How is he going to be as a father?
I actually think he would be a good father.
And why is that?
When something means something for him, then he really puts his heart in it.
So I believe that the reason why he's treating me as he does is because, deep down inside, he doesn't really want to be with me.
Maybe he sees some practical gain for the moment or something like that, because I can feel it.
As a father he would, as I said, I believe he would be a good father, except for the fact that maybe he would need to practice a little bit more on communicating, since a child, as well, would want to communicate their feelings with him.
Of course, children want to be listened to, and they want their passions and emotions to be taken seriously, and there's a lot about childhood that has nothing to do with rationality.
Right?
Playing dress-up.
My daughter likes pretending to be a baby dragon and dressing up as cats.
And, I mean, you know, we play a game where I pretend to be a giant dragon with treasure around my belly that's sleeping, and then I try and catch her when she steals things from me.
You tell me the rationality.
I mean, yeah, biologically, you're foraging and all that, but there's no abstract rationality behind any of that, right?
Is he going to be up for that sort of stuff for 14 hours a day?
I think he would maybe teach, not directly but indirectly, teach his children not to be as communicative as other parents would.
So maybe they would develop a characteristic that is similar to his?
No, that's you.
You're willing to be molded by this.
I'm not sure that children are.
I wouldn't confuse the children for you.
You come with your own history.
And I've had a look at your adverse childhood experience score.
So, I mean, we don't have to get into that in detail, but I get that it was pretty rough.
But children are going to have their own needs and children will fight very hard to get their needs met.
I mean, if you see baby pigs, like piglets, trying to get at the mother's teats, I mean, they're kicking, biting.
They work very hard to get their needs met, because obviously, biologically, those children that didn't work hard to get their needs met didn't usually survive for very long.
And so children will work very hard to get their needs met, which is why, I'm not saying him, but why parents who oppose their children's needs have to get so aggressive.
It's because the children are so damn insistent.
They don't let up.
They whine, they nag, they complain, they beg, they wheedle, they deal, they... right?
And you can't just assume that children are gonna...like you're willing to subsume your own needs to match his emotional unavailability, but I would in no way assume that that's gonna happen with children.
That's just not how kids are built.
Adults can be traumatized into not having any needs, which I think is what came out of your history.
But unless you're willing to traumatize your children to the same degree, and I know that you're not, then you are not going to be able to turn down the incandescent black hole needs of what children want.
But it troubled me when you said that deep down he did not want to be with you.
This was your perception, or is your perception, right?
Sorry for interrupting you, Stefan, but can I just add one thing to the things you just said?
Yeah, please.
Of course we have a biological inherited need for attention and speaking about our feelings and stuff, but what I meant was that you can raise a child in different ways,
and I believe that he has been raised in this direction, so obviously you can affect...
No, no, but you can only get a child to squelch his needs or her needs through aggression and abuse.
So the child is going to have the need for emotional connection and listening and play and all the, quote, irrational stuff.
The child is going to have those needs.
And unless he's willing to be very aggressive with that child, the child is going to continue to have those needs.
That's what I'm saying.
Yes, okay.
Yeah, I understand.
So, as far as being a good father goes, he knows about your childhood, right?
I mean, the giant god-awful mess that was your childhood, right?
Most of it, yes.
Right.
And he knows that your parents didn't listen to your emotional needs and accept your emotional needs and work to satisfy your emotional needs that are legitimate, right?
Maybe not that detailed, but the bigger picture, maybe, yes.
Does he think that your parents did meet your emotional needs?
No.
Okay, so he knows that your parents did not meet your emotional needs.
So he also knows, because you tell me, he's a very intelligent and logical man.
So he doesn't have the excuse of intellectual idiocy or whatever, right?
So he obviously, if you study mathematics, he's got to have an IQ, I would assume, at least 130.
So he knows that your parents did not meet your emotional needs.
In fact, they rejected your emotional needs a huge amount.
And that by doing that to you, he is recreating some of the worst aspects of your childhood, right?
I'm not sure he's aware of it.
It's not complicated to figure out, right?
I mean, it's not like Fermat's last theorem, right?
It's not that hard.
Like, you had a childhood where people didn't listen to your emotional needs.
He knows that.
He also knows that he's not listening to your emotional needs.
This is not brain surgery, right?
I'm not saying it's easy to solve, but it's not hard to identify, right?
But I'm not sure that the reason why he's treating me...
No, no.
Forget the why.
No, no.
See, you keep trying to jump into something else in this conversation.
I understand that.
It's tough stuff to talk about.
But trying to figure out his motives or...
But I'm simply talking factually...
He knows, or at least it would not be hard to figure out at all, he knows that he is recreating some of the most difficult aspects of your childhood for you.
He knows that you were rejected by your parents as a child, abused by your parents as a child, and he is now rejecting your thoughts and feelings as an adult, right?
Yeah, okay.
Right.
And so he is being harsh to your inner child.
He is recreating, to some degree, to a smaller degree, obviously, he is recreating the emotional rejection that you experienced as a child, right?
Okay.
Well, I'm not saying you have to agree with me.
Of course, if I'm wrong, you can disagree with me all you want.
But you're giving me an okay that's very conditional, right?
Like, okay.
Like, let me see where you're going with this, right?
Sorry, I just...
I'm just thinking a while, and...
No, take your time.
I don't want to rush you through any of this, right?
Because I don't want to drag you somewhere that's not fair or right or appropriate.
I know it's useful if it's true.
But he is not being sensitive to the needs that you had as a child that still exist as an adult, right?
I'm not saying he can't fix your childhood, but he doesn't have to do similar things that your parents did in terms of not being sensitive and listening to you emotionally, right?
True.
So, he knows your history, he knows your childhood, and he's doing some of the same stuff.
I'm not putting him in the same category as your parents, but he's doing some of the same stuff, right?
Yes.
And...
So, there's two things to say about that.
First of all, that is a very strong indication that he is not going to be a good father, because how good is he with your inner child?
He's bad.
Yes, and that makes me feel bad as well.
I'm sorry?
That really makes me feel bad and unhappy.
Right.
And so if he's bad with your inner child, the idea that he's going to be great with your real child in the future, that's not supported by the empirical evidence, to say the least, right?
No, I understand.
All right.
So then the question...
Sorry, go ahead.
Can I just ask you a question about that?
Yeah.
So you mean that he is aware of how he's treating me because of my childhood, because he knows how he can treat me, because that's my family history?
Well, I would go one step further and say that you chose him for precisely these reasons.
And the fact that you chose him at the age of 17 means that... You probably haven't read my book, Real Time Relationships, but you might want to.
It's free at freedomainradio.com slash free.
But in it, I talk about this analogy of Simon the Boxer.
And Simon the Boxer is a boy named Simon who was beaten up regularly by his parents, by his father.
And so he got used to managing physical abuse.
That's how he got any sense of competence.
If you're being abused...
You cannot gain control over your abusers, but you can gain control over managing the abuse.
So he got really good at finding a sense of power over managing being hit, managing the feelings of being hit.
In other words, the only way that he could gain any sense of power or control, this Simon, as a child, is by managing the feelings that came up, controlling the feelings that came up from being hit and beaten.
Now, if the only sense of power and control, which we all strive for, if the only sense of power and control, Patricia, that Simon had as a boy is managing the effects of being beaten, then when he's not being beaten, he's going to feel powerless.
He's going to feel out of control.
And so he's going to be drawn into physical violence.
And so he becomes a boxer when he grows up because his skills have developed and his main sense of efficacy and power and control lie in managing the effects of being beaten.
So he must be beaten in order to have any sense of control.
So he becomes a boxer.
Steps into the ring and all of his great skills and great powers come into play managing the crunch of fist on bone.
So if you had a childhood where you were not listened to.
Where you were rejected.
Where your emotional needs were ignored.
Then you're very good at managing rejection.
You're very good and you gain a sense of control out of dealing with people who reject you.
And so it seems to be inevitable that you would end up with a man who would play into your skills of managing being rejected.
And you say that you've been growing and I think that's wonderful because now it's not as...
The satisfaction and the sick familiarity of history is diminishing.
It's no longer enough for you to manage the effects of early trauma and to need people who are going to re-inflict that trauma, maybe not maliciously, but fundamentally it doesn't matter.
If a person hits you with their car when they're drunk, that's bad.
If they're not drunk, but just had an epileptic attack, it doesn't mean your legs are any less broken.
So the intent doesn't matter in many fundamental ways.
But the effect, the action is what matters.
Whether he knows about it or not, who cares?
We can't figure it out.
And we'd have to rely upon him to tell us the absolute truth about all of that, and who knows whether he would or wouldn't.
So trying to figure out other people's motives is usually a massive waste of time, a massive time sink that leads nowhere.
And it really is designed to paralyze us.
It doesn't matter why he does this rejection, why you feel deep down that he doesn't want to be with you.
Let me ask you this.
I don't want to get into a big set of details about your childhood, but let me ask you this, Patricia.
Did you feel that your parents really enjoyed having you around when you were a child?
Did they enjoy your company?
Did they appreciate you being there?
I know that both of them didn't enjoy having kids.
And I felt it too.
Yeah.
You got verbal abuse, physical abuse, neglect.
Alcoholism or drug use, depressed, mentally ill, suicide, right?
So your parents didn't want you around and now you're in a relationship with a guy who you say you know deep down doesn't want to be with you.
Do you see the continuation, right?
Yes.
Managing rejection.
Managing rejection.
That's what you're good at.
That's the tragic muscles that you had to develop as a child.
And now, as an adult, you will be drawn to people who reject you, because that's the muscles you have.
And we always try and use the muscles we have, right?
Okay.
Yes.
So, it's not...
And this doesn't mean that he's not responsible, and he's not doing things that are harmful to you.
But the fact that you transitioned, after being rejected as a child, to this guy who rejects you as a young woman... You get the pattern, right?
Yes.
In other words, if he hadn't rejected you, you probably wouldn't be with him.
If he didn't have this pattern that has this familiarity to you.
In the beginning, he was really...
He was different in the beginning.
He was more...
More after me, more responding to...
He wooed you.
...my needs and stuff like that, but it's gotten worse.
Well, look, but here's the thing.
That if what I'm talking about is true, I'm not saying it is, but if it's true, then you are going to continue to position him as a rejecter.
Because that's the pattern, right?
Yes, you are right, because now I recognize it within myself, but I'm more interested in him now that he's rejecting me than I was in the beginning of our relationship.
Right.
Right.
Because that's what you bonded with.
You bonded with rejection, which is kind of a paradox, but that's the way it is, right?
Yes.
And so, because if you are using him as a stand-in for your history, then you're also not dealing with him as an individual.
I'm not blaming you.
And I'm not saying there's nothing wrong.
This is just the patterns that we all have to recognize and outgrow.
And I can only say this because I'm much older and gone through some of this stuff.
So I'm not trying to, you know, it's bad or anything.
But if you're using him as a stand-in for your family so that you can feel a sense of control over managing being rejected, which is your early experiences...
Then it's great that you're becoming more interested in him because you're now looking at him as an individual rather than as a stand-in in a tragic psychodrama from history, right?
True, yes.
But if you really want to break the history, and you've got a lot invested with this guy, and you say he's a good guy, right?
He's not a mean guy, he's not screaming at you, he's not throwing books at your head, he's not, right?
Is that true?
It's true.
Okay.
Now he's also got his history, right?
Which is why I asked at the beginning if he was there, because you always find when you get two people's history, especially when they're young, especially if there's been trauma, especially if there hasn't been a lot of therapy with a great therapist, or a lot of work on self-knowledge, people fit together like jigsaw puzzle pieces.
I'm assuming you know something about his history, and I assume that he's got a pattern of rejecting.
Does he consider his mother irrational, for instance?
I think so.
Over-emotional, needy, whatever, right?
So there's some pattern.
He might have had a needy, emotional mother.
And I'm not saying you're needy or over-emotional, it's just that there's these patterns.
Any more than he's your dad, you're not his mom, but there are these patterns nonetheless, right?
And so there's a reason why this relationship, which if you can't talk about your thoughts and feelings and you avoid talking about your thoughts and feelings, can scarcely be called a relationship at the moment, which is why it feels empty to you, right?
I mean, if you eat a bunch of wax fruit, you're not going to feel stronger.
You're not going to feel full.
Well, maybe you'll feel full.
You'll feel full of wax, but not of fruit, right?
If you eat fake food...
You don't get stronger.
This is the emptiness that you feel in the relationship.
It's that there are these two opposite magnets; because of history, you've chosen to be together, I would argue, and I'm just telling you my thoughts.
I don't know what the answer is.
But because of these histories, you are together and you're reproducing that history.
Now, there is a chance, of course, for you to break out of that history, maybe even with each other.
But you have to say, look, I can't do this anymore.
I can't do this.
This is not a relationship.
If I can't talk about what I think and feel, this is not a relationship.
This is too much like my history.
You don't have any control whether he communicates with you, Patricia, right?
You can't stick your hand up his butt and make his mouth move, right?
Although I'm sure that's in Fifty Shades of Grey Part 2.
But anyway, you can't do that, right?
But you can't.
The only person you have control over is you.
You know that.
This is a truism, but we forget it all the time.
I do too.
And so you can control how much you talk to him, how much you share with him.
And this is all in Real Time Relationships.
Do you want to try being him and I'll try being you and I'll give you a sense of how that might work?
Okay.
Okay.
So, listen.
Lover of mine for seven years.
Light of my life.
Center of my home.
I feel like there's some stuff missing from our relationship, but the biggest thing that's missing is me.
I'm not here to blame you.
I'm not here to say you should do anything different, but I feel like I've kind of chickened out because I've wanted to talk about things that I've thought and felt for a long time and I feel really scared even having this conversation with you because this is stuff that's very important to me.
You knew when I grew up I didn't have people who really listened to me; I had people who thought I was irrational, who rejected me.
And I've sort of allowed this, not your fault, I've sort of allowed this to kind of creep into our relationship, where now if you say to me, oh, that's not rational, or we already talked about this, or I don't want to know about this, or, you know, this conversation, I feel like there's this history that I had that's kind of spilling into our relationship.
I don't know if I'm doing it or you're doing it, it doesn't really matter.
But I really do need to be able to break this cycle.
I want to be able to break it with you, but it means that we do have to have Or at least I really want to have these conversations that are important to me.
And I'm not trying to sort of make you uncomfortable.
I'm not trying to put you in a corner.
And I'm certainly not telling you what you have to say or do.
But I really, really feel this need for this connection.
I didn't have it when I was a kid.
It's kind of fading a little bit with us.
And I really miss it.
I miss being able to connect and converse with you in ways that aren't difficult and don't make me avoid it.
Does it make any sense what I'm saying?
Yes.
What do you think?
What I think about it or what he would think about it?
What he would say.
I actually had this or tried to have this conversation.
No, just do what he would say.
What would he say?
What did he say?
Okay, I hear you.
I'll try to work on it.
But I'm pretty sure that this is my personality and I can't change it.
I'm not even sure that I want to change it, and I believe that another guy would fit you better if you have this need.
If you want to be with me, then this is not going to happen.
So you want me to keep what I think and feel to myself if I want to be with you?
I have to not tell you what I think and feel?
Yes.
Does that make sense to you?
That I should hide who I am in order to be with you?
Exactly.
That's right.
Does that seem like a bit of a paradox?
Because I'm not really with you if I'm pretending to be something other than who I am or if I'm not talking about what I think and feel.
Like you want us to not talk to each other in order to have a relationship.
Yes, that's what I feel too.
Wow.
And do you think that there's anything that I could say that might change your mind on that?
I mean, you do recognize the paradox.
It's like saying we can only have a conversation if we don't use any words.
We can only be with each other if we hide from each other, right?
You understand that's a paradox, right?
Like, it's not logical.
No.
And you seem to be quite into logic, but in this instance, you're willing to be completely illogical.
I believe...
I believe that's because maybe I have low self-esteem that I really don't want to show my personality.
So maybe I hide myself from others.
That is a very brave thing to say.
I really appreciate you saying that.
And I feel a little bit less insane at the moment, which is good.
That's a brave thing to say.
I really appreciate that.
But...
You wouldn't want to necessarily, I think, surrender to that, right?
Like, if it's something that's, I don't know, I don't want to say a character flaw, because you just admitted something very brave, but you wouldn't want, like, something that was a negative to then dominate your life, right?
No, but I think he's provided a lot of good things to my personality, and I...
Wait, wait, did we just break the roleplay?
You've got to make a sound or something.
Sorry, sorry.
No, no, why, why, why?
You just jumped out!
What happened?
Why did you jump out?
What would he say?
You got to a new place with him, right?
You didn't know what he would say?
Sorry, I didn't get that.
What did you say?
Okay, so he said that maybe it's my low self-esteem, which is why I don't want to share.
And I said, well, you wouldn't want something that's a negative to then dominate your life, right?
If he's saying, well, the reason I don't want to have a conversation is because I have low self-esteem or whatever, right?
I don't want to reveal myself to you because I have low self-esteem.
That's not a good reason for that decision.
He's not portraying it as a positive.
He's now saying, well, this is a negative.
This is a problem.
I have a deficiency.
And you wouldn't want the relationship to be based on his deficiencies, which gives you leverage or would give me leverage as you to say, well, then we should make different decisions because you've admitted that it's a negative.
You're not trying to catch the guy or anything.
But he's not portraying it as a virtue.
He's not saying, I don't want to talk to you because you're irrational.
He's saying, well, maybe I don't want to expose myself or reveal myself because I have low self-esteem or don't like myself or whatever.
That wouldn't be a good reason to continue the behavior, right?
No.
So when I challenged him on that, that's when you jumped out of the role play, right?
Sorry, it's maybe because of the language.
It's not my mother tongue, so I have a little bit of a hard time...
No, you were doing just great with all that stuff.
I'm afraid, you know, if you're going to be a doctor, you've got to have an IQ of 125, at least 130, and you've been doing great.
So the one time we got to something which challenged the dynamic of the relationship, then it's like, oh, I've got to jump out.
And you're not going to tell me, oh, it's the language barrier.
No, sorry.
I can't accept that.
Okay, maybe...
I can't give you that one.
I'd like to, but too much respect for you.
Okay.
Do you think he would admit that there's low self-esteem issues and maybe he doesn't like himself as much and that's why he doesn't want to be intimate?
No, I don't think he would admit it.
Well, I certainly...
I'll tell you, role-playing as you, Patricia, I felt a great despair when he basically said, if you want us to have a conversation...
See, you can listen to the role-play back, but it was really fascinating the way that it went,
because as you, because he's into logic, right?
So I said, well, you can't say that we can have a relationship by hiding from each other, right?
And once he got that, then he revealed the low self-esteem stuff.
And then when I said, well, that can't be the basis for a decision, that's when the role play ended, right?
And that's really important, right?
Because the paradox of you can only have a relationship with me if you don't tell me what you think and feel, that can't work logically.
That's not having a relationship, right?
It's like saying we can only have an economic relationship if we never exchange any money, goods, or services.
That makes no sense, right?
And because he's a logical guy, he couldn't sustain that.
And so if you get down to the point where you say, well, we can't be close, you can't say that we can have a relationship if we refuse and avoid talking to each other about what we think and feel.
That's called proximity.
That's not called having a relationship.
We might as well be two people in a subway car, right?
Yes.
So if you can take that approach and point out the illogic of what he's asking, because I felt great despair when he basically said, if you want to have a relationship with me, you better be with another guy.
If you want to have a relationship with a guy, it has to be someone else.
And that is incredibly low self-esteem.
You're right about that.
Because basically he's saying, well, if I share my thoughts and feelings with you, you won't like me.
It's not.
People never reject other people.
They reject themselves, like when they're in a relationship, right?
He's rejecting himself.
He's saying, look, if I'm honest with you about my thoughts and feelings, you won't like me.
And that's the low self-esteem thing that came bubbling up in that roleplay, right?
Because why would he not want to share his thoughts and feelings with you?
I mean, look, you and I, we're sharing thoughts and feelings, right?
I really don't know why he doesn't find it interesting or useful to talk about feelings or ideas.
Well, interesting or useful is his language, right?
It's not a question of interesting or useful.
It's not a question of interesting or useful.
It's a question of honesty.
You cannot have a relationship with someone who's lying.
You cannot have a relationship with someone who's avoiding.
You cannot have a relationship with someone who's hiding.
It's like trying to hug a ghost.
You can't.
And so, it's not a question of, well, is this useful, or is this appropriate, or is this helpful, or is this productive?
That's all bullshit.
That's all just a bunch of made-up crap.
The question is, is he committed to honesty?
In other words, honesty is, if I say to you, Patricia, what do you think and feel, and you lie to me, then you're just not being honest, and we can't have a relationship.
Right?
So, if I'm thinking and feeling something that involves you, involves our relationship, and I don't share that with you, I'm lying by omission.
So, it's not about appropriate or useful.
Are we going to be honest with each other?
When I ask you what you're thinking and feeling, are you going to tell me the truth?
Right?
So, what do you think and feel at the moment, Patricia?
I feel that I need...
I think that I need to make a decision because I think that this...
No, no.
I'm so sorry to interrupt you.
I'm so sorry to interrupt you.
I should have asked you, and I apologize.
I put the question completely wrong, so I apologize for interrupting.
First of all, what do you feel?
Let me get to the things in a sec.
But what do you feel?
I feel...
I feel like I'm getting more...
I don't know how to put it.
More clear-sighted about the situation and it really makes me happy that I'm trying to figure this relationship out because I really do not want to be in a destructive relationship because the years go by and I need to, you know, proceed in some way.
So you feel a sense of a little bit of peace of mind, some clarity?
Yes, I only think it's for the better.
If it doesn't work out, then it's not a loss.
I think you need certainty about this relationship.
And the reason that we tend to be uncertain in relationships is we get into a hide-and-hold pattern.
You get involved in the daily details and so on, and you kind of hide yourself from the other person using this holding pattern, right?
Hide and hold is what we do.
And the reason for that is because...
And that is...
It seems like an uncertainty, but it is an actual certainty because if you hide and hold long enough, the relationship just ends.
Somebody will just do something.
Somebody's going to have an affair.
Somebody's going to get drunk.
Someone's going to say something irretrievable.
Someone's just going to...
You're going to wake up, and someone's going to be just moved out.
If you just do the hide and hold...
Yeah, the relationship is doomed.
But it's a very passive way to end a relationship.
And please understand, Patricia, I'm not trying to say end or don't end.
I don't know.
I think with seven years, it's worth a fight because you've got so much invested.
And he is a smart guy and you get a lot of great things out of him.
He's a good conversationalist about certain topics.
But certainty comes from honesty.
Honesty will always get you clarity.
It's why so many people avoid it.
Because they don't want to...
They don't want the scales to fall from their eyes.
They don't want the fog to lift from their environment.
Because they don't want to know if they're in hell or heaven, right?
They don't want to take the risk.
They don't want...
You just don't want that.
A lot of people...
Not you.
A lot of people.
That's why honesty always brings certainty.
People avoid it.
Avoid both, right?
But they avoid the certainty.
And so they avoid the honesty.
So if you're in a hide-and-hold pattern...
You're just going to let things decay rather than make a choice.
And it's always better to make a choice than to just let things happen.
Because choices are who we are.
If we let things happen to us, it's like we don't exist.
And then it's like we're back in our early shitty childhoods, right?
If that's what we had.
Don't let things happen to you.
Be honest with the man.
Tell him what you think.
Tell him what you feel.
Tell him what you need.
And don't be intimidated by this.
He's not your dad.
And you're not a kid anymore.
And don't be intimidated by this.
Often, for males, you know, emotions are like crap stains in your underwear.
You just got to wash and cross your fingers and hope they go away, right?
Clean them out.
No, emotions are the reason that you're together.
Feelings are the reasons that you're together.
And if you want to have kids, emotional availability is essential for the mental, physical, and emotional health of your children.
And...
This abstraction and this distance, you just have to tell him, listen, don't lie to me.
You owe me the truth.
We've been together seven years.
I live with you.
I sleep with you.
You owe me the truth.
When I ask you what you think and feel, I want you to be honest with me.
Just don't lie.
And I will commit to not lying to you because I've been lying to you by hiding from you.
By repressing things which make you uncomfortable.
But that's not fair to you and that's treating you like some piece of glass which you're not.
You're a fully grown man, intelligent, wise in your own way.
Competent.
We have to commit to being honest with each other.
And being honest means saying what you think and feel.
If you think and feel hostility towards me, tell me.
Tell me.
Tell me.
If you're angry at me, if you're frustrated with me, if you feel like I am a needy child, I need you to tell me.
Because we cannot have a relationship if we're not honest with each other.
And that's kind of like, everybody knows that, but how many people actually do it?
How many people actually do it?
And are just honest.
It's such a simple thing, and it seems so impossible for so many people.
And if he says something to you that hurts you, say, that really hurts me.
I'm not saying you did anything bad or wrong.
I'm just telling you what I'm feeling.
Give him the real-time experience that you're experiencing.
Give it to him in the moment.
That's the real-time relationships.
Because anything that's not real-time is almost always manipulative.
You come back a day later and say, well, it really hurt me what you said.
Give him the real-time reflection of how he's behaving and how it's impacting you in the moment.
The only way to avoid manipulation, which is not treating someone as a person, but as an object to satisfy immediate needs and desires of power, control, revenge, whatever, right?
Is to be ruthlessly honest in the moment about what you think and feel.
And he owes you honesty.
Sorry, he's been with you for seven years.
He can't now say, well, you know, I want to be honest with you.
Come on.
Come on.
What are we, children, right?
Be adults.
Be honest.
And whether the honesty leads you guys to actually connect and stay connected, fantastic, or whether it leads you to break up...
But at least you have certainty.
And once you get how the only way to navigate through life is through ruthless honesty, well, you'll never go back to driving at night in the fog without any lights on.
Does that make any sense?
Yes, it makes perfect sense.
And that will be the case for future relationships too, whether they're business, personal, romantic, financial, whatever.
Honesty gives great and perfect clarity.
Will you let us know how it goes?
Yes, maybe I will.
I'd like to.
I don't have to put it in the show.
I'd just like to know how it goes.
Do you feel like we've had a useful conversation?
Is there anything else you wanted to?
No.
It's been a great talk and thank you for your time and thank you for what you're doing.
I really appreciate it.
Yeah, it's been useful.
I'll think about it and make a decision.
Okay.
Well, thank you.
And I really, really appreciate your honesty.
It was fantastic.
And openness was great.
Okay.
Thank you.
Thank you.
Good night.
Good night, Patricia.
All right.
Well, up next is Howard.
Howard wrote in and said, what do you think is the most free country in the world?
Also, with so many countries moving farther away from a free society, what countries do you think are moving in the right direction towards freedom?
Moving in the right direction?
I don't think any are moving in the right direction in terms of actively shrinking the size and power of the state in any sustainable manner.
But what do you think?
Oh, Howard!
Hi, I'm here.
Oh, go ahead.
Hi.
Well, I think certainly countries like China and India, I was an economics major, so that's not my background.
Countries like China and India, I think, are moving in the right direction economically, but I think there's kind of an upper bound to how free at least their current governments are going to let their societies become.
Why do you think China's moving in the right direction?
By essentially dismantling their state controls, or a lot of them, over economics.
And I think that's fundamentally the cause of their upward growth trend over the past few decades, though it seems to be slowing now.
Just for what it's worth, China ranks 139th in the world on the Heritage Foundation's Index of Economic Freedom.
And China is 5.1 trillion dollars in debt.
Yeah, well...
Which means they have a debt per citizen of almost 4,000 US dollars.
Now, I mean, that's obviously a lot lower than the US, but so is the standard of living in China, right?
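A quick back-of-the-envelope sketch of that per-citizen figure: dividing 5.1 trillion dollars by a population of roughly 1.36 billion (an assumed 2015 figure for China, not a number stated in the conversation) gives a little under 4,000 dollars per person, consistent with the claim above.

```python
# Rough sanity check of the debt-per-citizen figure (a sketch only;
# the ~1.36 billion population estimate is an assumption, not from the show).
total_debt_usd = 5.1e12   # "China is 5.1 trillion dollars in debt"
population = 1.36e9       # assumed population of China circa 2015

debt_per_citizen = total_debt_usd / population
print(f"Debt per citizen: about ${debt_per_citizen:,.0f}")  # prints about $3,750
```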
Well, I guess from how bad they were, they have been improving from what was essentially an absolutely state-run, everything-controlled economy.
What they essentially managed to do is avoid the total collapse that the Soviet government system went through. They kind of saw the writing on the wall for absolute state-controlled communism and decided, well, we're going to introduce some competition early enough that our government doesn't collapse the way the Soviets' did.
No, I get it, but they're not committed to freedom.
So their government was going to collapse, and so they introduced some free market reforms because they realized that it was going to be more profitable for them to do so.
Yes.
So moving in the right direction means having an ideological, philosophical, moral commitment to human freedom, not just, well, it's better livestock management.
If we give the cows wider stalls, we get more milk.
Yes.
And that's kind of what I was getting at, where there's an upper bound, I think, to how free countries like China will get.
But see, again, you can say free, and look, obviously it's better, China's better now than it was in 1980 in terms of economic freedom, without a doubt.
But it's not...
Free-range chickens are not free.
Okay.
I know this is a little bit semantics, but it's important.
Because...
The fact that they have found it more profitable to allow some free market reforms into their economy has no ideological commitment to it whatsoever.
They're simply averting disaster.
Their government was going to collapse.
They saw what happened with the Soviet Union, and they were like, okay, well, we're not going to do that, right?
Yeah, steering away from a cliff isn't necessarily...
But they have no philosophical commitment towards freedom of any kind.
So it's not that they're giving their citizens freedom because they don't want to give their citizens freedom.
Freedom is no value or it's the opposite of value to them.
Freedom for the citizens is the end of the state, real freedom.
And the people in charge do not want to give their citizens freedom in any way, shape, or form.
And they have found it more profitable.
Oh, the cows are dying because the stalls are too small.
Let's widen the stalls.
But that's not done right before they set the cows free and dismantle the farm.
That's just to make a bigger farm, to make a better farm, to make a stronger farm.
Yes.
Okay, and sorry, I know that's an annoyingly semantic point, but I just, you know, when people say, well, you know, China's more free now, it's like, but no!
They're...
Well, there's huge amounts of corruption, and they've just found it more profitable to farm people in this way.
They've upgraded their tax farming method.
Yeah, they've basically upgraded their tax farming methods, but there's no commitment to freedom.
And this is true throughout the world.
I don't know any country that has any fundamental ideological commitment to human liberty.
That's all 18th century, 19th century stuff.
That's been gone since before the First World War.
Any fundamental or philosophical ideological commitment to freedom.
I can think of a few exceptions that pop into my head.
So the guy who was in charge of the German economy after the end of the Second World War, which we've got a presentation about, called The Truth About Germany.
Oh, wait.
Is that what it's called, Mike?
I can't remember now.
No, it's not the truth.
The fall of Germany, there will be no economic recovery.
Germany, there will be no economic recovery?
I believe that's it.
Yeah, I watched that kind of playlist of shows, and personally, I tend to fall in line with Ayn Rand-type Objectivism.
Yeah.
So watching those was what kind of got me thinking about this question: are there any countries that are moving towards freedom? It seems like every Western country is just moving in the wrong direction.
Yeah, but that's the logic of late-stage democracy.
And we'll do a whole show about decadence in the near future, but late-stage democracies always follow the same damn pattern.
An over-focus on sensuality, stimulation over wisdom, materialism over true wealth, and consumption.
Like, we've basically run through the capital that was accumulated over 2,000 years.
We've run through it in a little bit more than a generation.
We've just destroyed it, eaten it up, shit it out in the form of massive debt.
People become incredibly sensitive to discomfort.
Human beings do well in adversity.
There's all these mouse utopia experiments where they keep trying to create these utopias for mice, and it always turns into this complete shitter.
And what happens is that when they eliminate the predators and give them enough food and sexual availability and the right temperature and so on, the mice simply start fighting and they stop reproducing.
It's nature's way of controlling a too-successful species.
You can see this happening.
People are incredibly sensitive to negative experiences.
Oh, my pension might be cut by 5%.
Give me a fucking break.
And they just stop breeding.
They stop breeding for a variety of reasons.
But we become incredibly soft.
Human beings want to escape and avoid adversity and it's just about the worst thing in the world for us to do it, right?
We all want to sit on the couch and eat Cheetos, but it's the worst thing for our health to do that.
We need some adversity.
Go lift some weights.
Go pound some pavement.
Go do something.
Yeah, like the Agent Smith example in one of the Matrix movies where he says, you know, they tried to create a utopia where there was no suffering and the humans rejected it.
Yeah.
No, we are a biological species.
We are not designed to be free.
We want to be free of adversity.
Of course we did.
So we've created this weird situation where, because we're eating the capital of the past, we don't think we have to make any compromises.
We don't think we have to have any suffering.
We can just pay people off and bribe people, and it just, of course, defers the negatives.
So there's no country that's going in the right direction.
That's inevitable in fiat currency democracy, which is just Roman shit all over again, but with tablets.
Well, I guess different tablets.
Yeah.
But there is a place, there is one country, there is one country where things are going in the right direction, and that's a country called You-topia, that's Y-O-U-topia, which is the country or the environment that you can create in your life.
And that country is entirely determined, the quality of that country and the quality of your freedom is entirely determined by the people you surround yourself with.
That's the only free country that you can live in: a place called You-topia.
So, let me ask you this, Howard.
It sounds like a challenge, it's not.
It's just a question.
Are there people in life that you have to hide your beliefs from?
Yes.
I mean, yeah, there are.
And to a certain extent, those aren't people that I think I would be able to actually get away from, which I think may be where you were kind of leading, if I caught your drift.
Yeah, and I'm not saying what you should or shouldn't do with your relationships, but it's just important to map where your relationships are.
You know, I mean, people suffer far more social censorship than government censorship.
Far more social censorship than government censorship.
Oh, censorship.
Okay, but to what degree do you have to lie about who you are to the people around you?
I mean, where is the real censorship?
Where is the real tyranny in your life?
I can virtually guarantee you, unless you're calling me from prison, in which case hang up, because I don't think you should have a phone, but if you're not calling me from prison, then the real tyranny in your life is almost certainly social and not governmental.
In that if you have philosophically resonant and consistent and deep beliefs, then you basically feel like prey around a lot of people who are predators, who feel like predators, right?
Like, I can't say this, I can't say that, I can't talk about this, I can't bring this up, I can't bring that up.
And that's the censorship that we live in.
That's the, I can't be who I am, I have to pretend to be someone else.
And of course, we don't get any points for that after we're dead, we just die having hid ourselves from people around us.
But as far as countries that are moving in the right direction, there's only one country, and that's you, that can move in the right direction.
It certainly isn't going to come from the state, but it can come from you.
It comes from a commitment to being honest and open with the people in your life about who you are.
And look, I mean, it doesn't mean you've got to do it all the time with everyone, you understand, but it's really important to make decisions to make the world that you want to live in.
To make it in your environment.
To make it in your surroundings.
That's how philosophy comes to life.
It doesn't come to life through mere words or blog posts or writings or these podcasts or whatever you watch or whatever.
It comes from you creating the world of the future around you.
Create an antibody to the social decay of our epoch.
Create a cyst of resistance to the decay around us.
Create a bubble of the future in your life today.
And that is around honesty and openness and clarity.
And connection in your relationships is something I've always talked about from the beginning.
You can't wait for the government to set you free.
Free yourself, dammit!
Free yourself!
Freedom means honesty.
Because if you're not free to be honest, you're not free at all.
If you're not free to be who you are with the people in your life, you are not free at all.
Economic freedom index, be damned.
Government laws, be damned.
Taxation, be damned.
Are you free to be who you are with the people in your life?
If you are, well, that's a rare and powerful thing, and well-earned and well-deserved.
And if you're not, you don't have to change a damn thing, but don't expect the government to set you free.
Sorry, you were going to say?
No, it makes a lot of sense.
I guess what it seems like is happening, though, is that the little freedom bubble that I can create for myself...
And I'm speaking kind of abstractly here.
You know, I used to worry, and I'm not speaking just about me, but the little freedom bubble that people can create around themselves keeps getting more and more squeezed by what governments, not just the U.S., but around the world, are doing.
And so, yeah, I guess it's a fun thought experiment, but why isn't there...
I get where you're going with this, and I'm sorry to interrupt.
There's only so much I can do in my life, there's still the government.
You can't do anything about the government.
You can't do anything about the government.
But you can do something about the choices you make in your personal life.
Look, and I'm sorry to be annoying.
I am.
But this is something that people who are interested in human freedom really need to understand.
Where the fuck is the freedom going to come from?
It's not going to come from the state.
The state doesn't want to give us freedom.
That's why I was talking about China.
Oh, they're more free.
No, they're not.
They're better managed.
They're not more free.
And...
If the slave is more productive with one manacle rather than two, then he'll get one manacle.
But that's not because the guy's committed to his freedom.
The owner is only committed to his productivity.
And the more valuable he is then with only one manacle, the less likely he is to be turned free.
Because he's worth even more now with only one manacle than he was with two.
Yeah.
So it's not going to come from the state.
Is it going to come from the people around you who are statists?
Of course not.
Because they're embedded in the matrix.
They're embedded in the doctrine.
So where is this freedom going to come from?
Is it going to come from Rand Paul?
It's not going to come from Rand Paul.
Where is it going to come from?
How is it going to manifest?
How is it going to happen?
Yeah.
It comes from the most committed.
The only place this freedom is going to come from is people getting how serious you are about what you believe.
They get it deep in their bones.
And if you believe something that goes against the social norms, if you believe something that violates social taboos, then people are constantly scanning you to say, are you serious?
Are you like...
It's talk, right?
It's just talk.
You're not...
I mean, you like reading these books and, you know...
You like visiting reason, right?
I mean, it's not...
You wouldn't actually make any decisions about this in your life.
This is a hobby, right?
Like, we want to put this in the category of stamp collecting.
You wouldn't break with a friend because he wasn't a stamp collector, would you?
It's almost a norm to violate your own views or to be okay with contradictions.
Yeah, ethics is a hobby.
If it's a hobby, if it's just a little thing that you do, like, you know, I like statism, you like freedom, like PS4 versus Xbox One, who cares, right?
Six of one, half a dozen of the other.
Everyone wants to relativize everything so that nobody's actual moral beliefs change a goddamn thing in the known universe.
It's the castration of virtue that society is solely focused on.
You can have these beliefs, but they can't change anything.
And if it changes who you vote for behind a screen, who gives a shit?
That's not going to change anything anyway.
If voting changed anything, it wouldn't be legal.
You can say, oh, I'd really like you to read these books, and we can have these intellectual discussions, we can have these abstract discussions, but goddammit, don't you ever think of making any real decisions about your values.
Don't ever think of bringing them to life.
Don't ever think of taking them even remotely seriously.
Now, statists, they take their beliefs very seriously.
They can afford to because they're the dominant ideology, right?
So they take their beliefs very seriously.
Oh yeah, you don't pay your taxes?
You go to jail, goddammit.
I want guys with guns to come kick in your door and drag you off to rape rooms.
They take their beliefs very seriously.
And as I've argued for many years, people on the freedom side, we don't take our beliefs that seriously.
We don't.
We don't.
I mean, it's just a simple fact.
Statists are willing to have you thrown in prison for disagreeing with them.
But are you willing to question your friendships with statists?
And look, I get it's difficult.
I understand it's difficult, right?
Because there's this weird thing in society that says, well, you know, anytime you suggest that people end relationships, that's bad.
All relationships must be valuable at all times, no matter...
This is all nonsense, right?
It's only applied to new ethics.
It's not applied to anything else.
I mean, if you're an alcoholic and you go to Alcoholics Anonymous, what do they tell you to do with your alcoholic friends?
Ditch them.
Well, yeah...
That staying in a group that is not healthy for you is not healthy for you.
Yeah.
If you're trying to quit heroin, you can't hang around with heroin addicts.
You'd change your whole social environment.
If you're in an abusive relationship, you're some woman in an abusive relationship, and you go and talk to the National Organization for Women, what are they telling you to do?
Get the fuck out.
Get out.
Get out.
I mean, if you even just say, well, I'm finding my marriage kind of empty and dissatisfying.
Most people will say, well, end it.
Get out.
You only got one life to live, right?
And so this idea that anyone who counsels anyone to act with integrity and end relationships, I mean, that this is somehow bad, it's just nonsense.
We say this all the time.
Society as a whole says, oh yeah, if you're using drugs, you should go to prison.
Well, that's ending some fucking relationships, isn't it?
I mean, you can't have a lot of relationships when you're in jail.
So this idea that you should make, like, oh, you can't have a relationship.
He's saying don't have relationships with people.
Well, then, okay, no more divorces.
If you're a drug addict, you've still got to hang around with drug addicts.
If you're in an abusive relationship, you've got to stay in the abusive.
No, forget it.
No ending relationships.
Everything you're born with is everything you've got to die with.
No new relationships?
Oh, you can have new relationships.
You can't possibly end them.
Nobody gets to quit their job anymore because you can't end relationships.
If you find out that your boss is some moral monster who's stealing from people and keying their cars and stuff, you can't quit!
Because that would be the end of a relationship, whatever, right?
And so statists are like, they don't get it because that's just the norm, right?
It's like the fish in the water saying, water, what water, right?
I mean, they don't get it.
But statists are incredibly committed to their beliefs: if you don't believe in statism, you go to jail.
They want all of your relationships to end because you'll be in prison.
I guess they want a new relationship with your new special prison boyfriend or whatever, but they want all your relationships to end if you disagree with them.
Yeah, and I think maybe that's part of the reason why basically governments never shrink, is that statists keep charging away from freedom, and the people that value freedom aren't standing up for it the way they should.
Right, and I understand why.
I'm not saying that people have to or don't have to, as I've always said.
But my sort of perspective is, look, if these things are true, then they're true.
And that's how I'm going to live.
And I believe that in order to beat the dominant paradigm, you simply have to be more committed than the dominant paradigm, right?
The dominant paradigm has all of these profits to it, right?
Social acceptance and money and a sense of virtue and a sense of community.
I mean, the first 50 guys to think that slavery was evil had a pretty shitty time of it, right?
Yeah.
But you had to really be committed.
You know, it's funny.
The dominant paradigm in America is that the slaughter of 600,000 people, largely men, in the Civil War was worth it to end slavery.
The Civil War was fought to end slavery.
It's not true, but let's just say that that's what people believe, right?
So 600,000 people can die to end slavery if that's what people believe, and that's considered to be good, right?
But if a libertarian says, listen, if you want me thrown in prison for my beliefs, we've got a problem.
That's bad.
Yet somehow, if libertarians killed 600,000 people and achieved a free world, not that it would ever work that way, people would say, that's insane.
So if you only have the truth and you're committed to the truth and you have good arguments, rational and empirical, behind what you're saying, well...
People think that that's crazy.
And then you get, you know, the social shaming and the lynch mob gathers.
I mean, that's natural, right?
That's just people fighting to defend the dead.
Culture is just a ghostly defense guard for the dead.
Because honor your ancestors is not just a religion in Japan.
Right?
Honor those who knew far less than you.
But...
So that's the challenge.
And look, I don't think libertarians are ready for it.
I don't think they are.
But the moment that people start to say, I'm going to make real decisions to live a free and authentic life, and people who mock and hate me and look upon me with scorn and contempt and hatred and eye-rolling and hostility, the people who are like that, well, I don't want them in my life.
Because if they can't disprove what I'm saying, Then insults will only reaffirm my position, will only strengthen my position.
Now, libertarians are not ready for that.
And I'm not saying they should or shouldn't be.
It's just an observation.
I get that.
And that means that the bad guy is going to win until libertarians are ready for it.
That's all.
Until libertarians are ready to really make decisions based upon their values and to choose their companions based upon virtue rather than accidents of history or proximity, well, nothing's going to change because the statists are willing to have us put in jail.
And if we're not willing to stand up for our beliefs, they're just going to keep winning.
And that's, you know, I mean, I'd like it to be different, but it doesn't really matter, right?
I mean...
Yeah.
Sometimes I want to be 20 again.
It doesn't really matter.
I'm not going to be, right?
But that's just an empirical observation.
So if you say to me, well, where's the place that I can move to be most free?
It's like, well, you don't have to move.
But you might have to move your social markers.
Interesting.
Yeah.
That's probably the most realistic way of thinking about it that I've heard or come up with.
If it's not actionable to me, it's not really philosophy.
But everything that I do is aimed at the actionable.
Yeah.
Because what's the point of...
It's like being a doctor prescribing a medicine you can't get.
It's not really being a doctor, right?
Yeah.
I mean, I think the freedom versus statism thing is kind of the same.
During undergrad, we kind of looked at the game theory of this.
The U.S. would probably be better off without political parties, where each district just voted for who they thought was the best person for the job, instead of, well, we're going to vote for this Republican, whatever.
But a person with a political affiliation, a political party, has a better system behind them to try and get them elected.
So it's kind of the same situation, where the world would probably be better off, or I think would undoubtedly be better off, with free societies.
But without people who are actually enforcing that, the statists win unless they're essentially blocked, completely blocked from being able to implement their beliefs.
Yeah, the statists win until people start taking freedom seriously.
If the non-aggression principle is moral, then people who openly advocate massive and widespread violations of the non-aggression principle, they must be immoral.
You can't have a virtue and then have the opposite also be a virtue or be indifferent.
Now, I get the majority of statists don't have any clue what they believe.
And they don't have any clue the system that they live under.
They don't.
And I get that leading these people to the light is sort of a kicking and screaming kind of thing.
I get it.
But the light is there.
And they need to know the truth.
And so I have, you know, patience with statists and all that.
I get it.
You know, this is the matrix.
It takes months sometimes.
I'm not going to say years.
Because that's sort of an insult to their intelligence.
It doesn't take years to figure out that taxation is theft.
I mean, it takes a while to really get it emotionally, but intellectually, it's not that hard, right?
And so...
People in society as a whole, they exist in a state of moral neutrality because they're so heavily propagandized, but as knowledge and wisdom spread, and as shows like this become more popular and more widespread, then people have less and less of an excuse to not know the nature of the society that they're living in.
And as we spread knowledge, we create immorality.
The purpose of the moralist is to create immorality.
In so far as you attempt to rip people out of the fog of the state of nature, the state of propaganda.
And this is one of the reasons why people as a whole fear and loathe the moralist, because they can live in the blindness of cultural momentum and live in a state of amorality.
But when the moralist comes along and gives them the stark and simple questions, then they are faced with the moral choice that most people have spent their entire lives avoiding and evading.
But the purpose of the moralist is to create immorality because once you've created and identified immorality, the morality just becomes normal.
It's inevitable.
If somebody really wants to go north, if you point out where south is and they believe it, they'll go the opposite direction.
And so the purpose of the moralist is to spread knowledge which reveals that the majority of people are accidentally supporting evil or through propaganda supporting evil.
And then they have a choice, which they damn well don't want to have.
But it's too bad.
So what?
Ah, so you don't want to have...
I didn't want to go to school.
They were fine to force me to go to school, right?
Who cares?
Who cares what people don't want?
Ooh, yes, because that was so important to everyone when we were kids, right?
What do you want?
Do you like this school or would you rather do something else?
Right?
Do you want both of your parents to work or no?
No?
Oh, well, it's too bad.
We're both going to work.
How about homework?
You like homework?
Ah, fuck you.
It doesn't matter.
Get your homework.
Take your homework.
Do your homework.
God damn it.
Right?
Do you respect the teacher?
No?
Well, you've got to do what she says anyway or you get hit with a cane or you get hit with a ruler or you have to do detention or whatever, right?
Too bad.
Right?
So, sorry if I'm repaying the favor.
It's just, you know, I think I learned how much society cares about things that people don't want.
I learned a lot about that as a kid, so I'm not too bothered by that.
But the purpose of the ethicist is you raise the light, but when you raise the light, what do you get?
You get shadow.
Lift a giant light in the cemetery of culture, and you get a lot of long shadows from the headstones.
And the creation or the identification of the difference between light and dark, between good and evil, that's the job of the ethicist.
And only a small part of it is promoting virtue.
A larger part of it is to create evil, to understand, to create an understanding of evil.
I mean, not creating it like it somehow wasn't there and then is there, but it's the identification.
And the earlier the identification, like with a tumor, the better your prognosis.
It's the identification of evil.
And it is my hope, of course, that libertarians can come together and recognize that we need to create and take very seriously our ideas.
And I think of all communities, we have the greatest right to take our ideas seriously because they're well borne out by history, by theory, and by evidence, which is all you can hope for.
You can't hope for perfection in these sorts of matters, but you can hope for rational consistency combined with overwhelming historical evidence.
And we've got all of that!
We've got all the facts about communism, fascism, socialism.
There's nothing else that we need to know empirically throughout history.
We've got the Roman Empire all the way forward.
All of the examples.
Feudalism, the relatively free market of the 19th century.
We've got the transition of India and China.
We've got the fiat currency doing everything that it was expected to do.
We've got the Austrian business cycle theory well borne out by the evidence.
We've got Timothy Geithner of the Fed saying, oh yeah, the Fed caused the Great Recession with bad policy.
I mean, we've got nothing else that we need to get empirically.
Lots of theories have been put forward that I think ably explain property rights and the non-aggression principle.
We've got universally preferable behavior.
We've got nothing else that we need for the movement to take flight.
But, like most people, and I get this, I am not criticizing it.
It's just a simple fact of human nature.
Most people only change when suffering requires it.
I mean, most people will only change when suffering requires it.
People might lose weight when they've had a heart attack, maybe.
But usually not before then, at least not in a meaningful way.
And right now, we're still in the late narcoleptic stages of social decay.
Everything's been warded off with debt.
People have been bribed into compliance.
People don't demand change because they get their drip-drip pellets of unemployment insurance and SNAP benefits in the welfare-warfare state.
And there's still enough bribe money sloshing around the system to buy off any discontent.
And libertarians are not ready to take a truly powerful social stand on their belief systems because they fear social ostracism.
I get that.
I mean, again, not criticizing just empirically.
I try to really work with the facts.
And I've talked about this stuff for many years and there's no movement.
And I, you know, recognize you can't lead a charge if no one's behind you, if you're vastly outnumbered, right?
So I, you know, simply put out the ideas and the arguments, and I'm, you know, sowing the seeds for when the rain of change comes.
Nothing's going to grow if the seeds aren't there, but until the rain comes, nothing's going to grow either way.
So, you know, what you want to do with your life obviously is completely up to you.
But there will come a time when we libertarians are all going to have to just take our beliefs very seriously and act on them in our own lives, in our personal lives.
If we define the state as evil, if we define the initiation of force as evil, then those who support and praise evil must themselves be implicated in that evil.
If you have a morality that defines someone as an evildoer and you're still willing to be friends and break bread with that person, then your morality doesn't mean anything.
And then just ditch the morality.
Forget about it.
Just say, I'm going to go comply with these people.
That's fine.
I mean, look, just be honest, right?
Take your beliefs seriously or don't pretend to have those beliefs.
I'm just trying to take away that middle ground of nonsense.
And I get that just because we define something as evil doesn't mean that everyone who supports it is an evildoer.
Of course not, because they're propagandized and they're raised the same way we're all raised and not everyone has the incentive or intelligence or understanding or drive to discover all of these things.
But morality is very powerful, very deep magic, very deep.
It is the physics of human society.
And to take a moral stand is to take all of those who oppose that moral stand and put them in a category of immoral.
Blind initially, but when illuminated, immoral.
And so work to illuminate people, but we cannot, we cannot imagine that we are not creating good and evil in the world by defining moral systems.
That's a complete fantasy.
And the other thing is that libertarians say, well, we want people to change based on reason.
Well, are you changing based on reason?
Sorry, Howard, you were going to say?
I mean, it kind of...
I mean, maybe this is getting a little bit off on a tangent, but have you seen the videos where people like to conform?
So it's like a prank show.
I don't remember what it's called, where they'll have four people in an elevator who are actors, and then a fifth person walks in.
All of a sudden, as if some kind of command were given, all four of the actors turn to face the back of the elevator, and the random person who walked in, who's the butt of the joke, basically looks around and then turns around to conform.
And so I think that's where, essentially, until there is an actual movement, people want to conform, they want to fit in, even to the point of it making no sense, which is what I think...
Well, no, but those experiments are a little different, insofar as those people probably do know something that you don't know.
Which is maybe there's a huge amount of heat on some particular floor and they're turning away so they don't get this.
Like, people wouldn't act randomly.
So this is kind of an artificial situation.
It does show some sort of conformity, but it's not irrational conformity.
And it's certainly not analogous to people who study and define ethics.
Right?
Because then they're experts.
Right?
So, look, I mean, if I see a bunch of people suddenly duck, I'm most likely going to duck as well, because that makes sense biologically.
Maybe they've seen some low-flying pterodactyl that I haven't or something, a dragon, whatever, right?
You know what I mean, right?
Yeah.
So there is a natural sort of conformity that occurs that makes perfect sense.
It's just not applicable to experts, right?
So the challenge for libertarians is we have defined virtue as the non-aggression principle, or at least a minimum, necessary but maybe not sufficient, but necessary for virtue as the non-aggression principle, which means those who violate the non-aggression principle must be immoral.
And those who support and enable that violence, which would be impossible without their support and enablement, are complicit in that immorality.
Those are just basic facts.
Those are just basic facts of logic that cannot be disputed by any sane human being.
If you define something as virtuous, its opposite must be immoral and that which enables the immorality must also be complicit in the immorality.
I mean, if I say respecting property rights is a virtue, then somebody who steals from a store is acting immorally.
And the getaway driver, even though the getaway driver has not participated in the theft, but the theft would not occur without the getaway driver, the getaway driver is complicit.
It's called an accessory to a crime.
And these are just basic moral realities.
And everybody wants to define the virtue, but they don't want to deal with the reality that when you define the virtue, you define the immorality.
When you define the good, you define the evil.
And again, just to reiterate, I get that people are propagandized.
They don't know this, that, and the other, but it's becoming less and less excusable to not know these days.
And certainly if you have the power to bring these simple illuminations to people, then do it.
It's scary.
It's alarming.
I get it.
I do it every day.
It's unpleasant.
It can bring blowback.
Yeah, I get that.
There's difficult times, but so what?
So what?
It doesn't really matter fundamentally.
We're all standing on the shoulders of these giants who had even more moral courage than is required of us.
So that's just sort of my argument is if you're going to look for some country, yeah, there's more and less free countries.
And the less free countries are usually in the process of hitting bottom and then transitioning to more free countries.
And the more free countries are in the process of hitting top and then collapsing down into some godforsaken who knows what.
So that cycle of life, as long as we have a government, that cycle of bullshit is always going to continue.
But there is far more freedom to be found in your relationships than there is in your geography.
I like that.
And, yeah.
Anyway, thank you, Stefan.
I'm a big supporter of what you do at Freedomain Radio, and I appreciate having the conversation with you.
Thank you very much.
I appreciate that.
It's a great, great question.
And, yeah, I think that's it for the show tonight.
I do want to say thank you, everybody, so much.
Happy, happy Valentine's Day.
May Valentine's Day be the other 364 and a quarter days of the year, because our Valentine should really be the Cupid who shoots us through the heart with the arrow that has us bleed a love of wisdom.
Oh, that's a complicated metaphor.
Love, wisdom, and all the other virtues shall be yours.
Love, honesty, and love shall be yours.
And, um...
If you would like to help out this conversation, we need your help.
We need your help.
This is a show that requires money to run, requires money to staff, it requires money to spread, it requires money to grow, and it needs your money.
I'm going to tell you the truth.
I'm always advocating for honesty.
We need your money.
And you can go to freedomainradio.com slash donate to help us out.
The after-Christmas lull is a challenge for the finances of the show. I think we're doing great work.
I think we're really helping the world in a most essential way.
And we can't do it without you.
We really need your help.
So if you can go to freedomainradio.com slash donate to help us out, you know it's necessary.
We're certainly not going to ask for 10% of your income or any crazy religious stuff.
But we need your support as much as any other advocacy group in the world.
Despite the fact that, or perhaps because of the fact that, we're advocating, I think, the best stuff around.
So freedomainradio.com slash donate.
Thank you everybody so much.