June 2, 2025 - Freedomain Radio - Stefan Molyneux
05:48
How Much Sympathy Do Men Get in the World?

How much sympathy do men get, regardless of race? How much sympathy do you get? Dr. Warren Farrell wrote a great book on this called The Boy Crisis.
So there is bias, bigotry, and sexism against boys in the educational system.
I mean, there are very few males teaching at the younger levels.
And we know for a simple fact that there's massive bias against boys, because when you take essays and tests, strip off whether the author is a boy or a girl, and give them to teachers to grade, the boys' marks go up considerably.
Yeah, got to have those women-owned businesses and all of that, right?
So how many times have you received sympathy from the world, from people as a whole, for being male or for being white and all this kind of stuff, if you face challenges in the hiring process?
And, you know, you can see some of these challenges and I'm sure you've experienced that.
So why would you give sympathy to people who don't have sympathy for you?
Right?
I mean, if AI replaces a bunch of HR, human resources stuff, well...
I don't care.
It's a foundational contract you must have in society.
You have to have this contract in society that you do not give people more compassion and empathy than they give you.
You don't do it.
You do not be a spiritual whore.
And I use that term with great emphasis.
And it's an insult to whores, because at least whores get paid.
Do not be a spiritual whore.
Do not be a slave.
Do not have this commandment of be nice, be nice, be nice.
It is a relationship.
Consideration, compassion, empathy, concern, care, love.
Sympathy is a relationship.
It is not a fucking commandment that you must do like gravity.
It is not that at all.
It is not that at all.
Not that at all.
You will lose everything.
If that's your role, you will lose everything.
Everything.
Your history, your culture, everything.
I mean, this is the argument to some degree with deportations, right?
I mean, again, I'm not a statist, and I would like nothing more than a truly free society, and I've written about this in my novel, The Future.
I talk about immigration there, how it works in a free society, so that's what I want.
But in terms of the general argument, like, well, we've got to have compassion for the people who came into America illegally.
It's like, but where's the reciprocity?
How much compassion do the people who come in illegally have for the preferences and laws of the host country?
Dystopian?
But the funny thing is, it would probably be more objective than the HR women we have currently.
So, prejudice in people is undocumented.
In general, right?
Prejudice is hidden.
It's not like there are all of these things written out, right?
So prejudice is undocumented.
However, AI code is documented, right?
So if there's bias in HR, you can't look in people's heads to see their bias, but you can sure as shit parse the AI code to see if it's programmed to be biased.
So people are reverse engineering the bias that's in AI.
All the time.
You can see it in Google, right?
That's pretty obvious, but it would be much more objective.
So, for instance, if you have AI as part of your hiring process, well, in America, it's against the Civil Rights Act to have any preferences or obstacles based on race.
Right?
You can't read the internal thoughts, right?
There's no breaking down the source code of the brain, but you sure as shit can break down the source code of your AI and see if it's biased against race or sex.
So, I disagree.
I think AI... and now an AI... No, it was a gold panning.
Society didn't set up a job fair to help everyone that lost their oil and gas jobs.
Yeah, of course.
Of course.
Of course.
So AI, if it has source code and so on.
See, with AI, you can test the biases, because AI will simply do what it is told, right? Whatever the program tells it.
However, you can't test the biases in HR because people will just lie to you.
No, no, no, we don't do any of this and that, blah, blah, blah, right?
So AI has the possibility of being much more objective and rational and non-discriminatory.
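The point that AI bias is testable while human bias is not can be sketched as a counterfactual audit: run the same input through the model twice, changing only one protected attribute, and see whether the output moves. Everything here is hypothetical for illustration, including the `score_resume` function (a stand-in for whatever model a hiring pipeline actually uses, with a deliberately planted bias so the audit has something to find) and the field names.

```python
# Hypothetical resume-scoring model with a deliberately planted bias,
# so the counterfactual audit below has something to detect.
def score_resume(resume: dict) -> float:
    score = resume["years_experience"] * 10.0
    if resume["sex"] == "M":  # the planted bias we want to surface
        score -= 5.0
    return score

def counterfactual_gap(resume: dict, attribute: str, alt_value) -> float:
    """Score the same resume twice, flipping only one protected
    attribute. A nonzero gap means the model's output depends on
    that attribute."""
    flipped = dict(resume, **{attribute: alt_value})
    return score_resume(resume) - score_resume(flipped)

resume = {"years_experience": 5, "sex": "M"}
gap = counterfactual_gap(resume, "sex", "F")
print(gap)  # -5.0: the male version of this resume scores 5 points lower
```

You can't run this kind of controlled flip on a human recruiter's head; with a program, it's a one-line test.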
Isn't that what we want?
Non-discriminatory.