Oct. 10, 2018 - Tim Pool Daily Show
11:30
Gender Diversity Initiatives BACKFIRE, Get Canceled

Gender Blind Diversity Initiatives BACKFIRE, Get Canceled. The latest news is that Amazon used an artificial intelligence to bring about greater diversity in the hiring process. However, the social justice initiative had the opposite effect, with the AI slowly learning to discriminate against women. Other studies have shown similar results. In Australia, a gender-blind social justice initiative resulted in more men being offered interviews. Interestingly, without the initiative, women received an advantage. The results of the diversity study were replicated, according to Harvard Business School. In the past, feminists have championed gender-blind hiring as a way to increase gender diversity, but a few studies and the latest news may prove it to be detrimental to feminism. Support the show (http://timcast.com/donate)


tim pool
We're starting today's video with two questions.
Do you think that men are likely to be sexist against women in the hiring process?
That, in some circumstances, a man won't hire a woman because she is a woman?
Comment below, let me know what you think.
The next question.
Do you think that a computer algorithm would do a better job of hiring people equally, meaning that there would be less discrimination and more women and minorities would be hired?
Comment below, let me know what you think.
We've got a story now that Amazon was secretly running an artificial intelligence in their recruiting process.
The idea was a machine will do a better job of finding the best candidate and it won't discriminate based on gender or race.
However, it ultimately did start discriminating because there are key factors and key differences between men and women that the algorithm actually found.
Amazon had to suspend this practice.
And there's other data to suggest that even though we don't have an equality of outcome, meaning that in many jobs we don't see 52% women and 48% men, we do see that men actually favor women in the hiring process.
And when we go gender blind in recruitment, it actually penalizes women.
So perhaps it's not so much about sexism, more so that men and women make different choices.
But let's explore this a little bit.
First, we'll start with the story about Amazon.
Now, before we get started, please head over to Patreon.com forward slash TimCast to help support my work.
Patrons are the backbone of the content I create, so if you like these videos, and you like the videos on my second channel, please go to Patreon.com forward slash TimCast and become a patron today.
From VentureBeat, Amazon scrapped a secret AI recruitment tool that showed bias against women.
Amazon.com's machine learning specialist uncovered a big problem.
Their new recruiting engine did not like women.
The team had been building computer programs since 2014 to review job applicants' resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters.
The company's experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars, much like shoppers rate products on Amazon, some of the people said.
Everyone wanted this holy grail.
They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top 5, and we'll hire those.
But by 2015, the company realized the new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.
That is because Amazon's computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period.
Most came from men, a reflection of male dominance across the tech industry.
In effect, Amazon's system taught itself that male candidates were preferable.
It penalized resumes that included the word women's, as in women's chess club captain, and it downgraded graduates of two all-women's colleges, according to people familiar with the matter.
Amazon edited the program to make them neutral to these particular terms, but that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory, the people said.
The Seattle company ultimately disbanded the team by the start of last year because executives lost hope for the project.
Amazon's recruiters looked at the recommendations generated by the tool when searching for new hires, but never relied solely on those rankings, they said.
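The mechanism described above — a model trained on historically biased hiring decisions that learns to penalize women even when gender itself is removed from the inputs — can be sketched with synthetic data. This is purely an illustration, not Amazon's actual system: the data, the proxy feature (think "attended a women's college"), and the simple logistic regression are all assumptions made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic "historical resumes": gender is never shown to the model,
# but a proxy feature correlates strongly with it (90% of the time).
gender = rng.integers(0, 2, n)                              # 1 = woman (hidden)
proxy = np.where(rng.random(n) < 0.9, gender, 1 - gender)   # correlated stand-in
skill = rng.normal(0, 1, n)                                 # job-relevant signal

# Historical hiring decisions were biased against women, independent of skill.
logit = 1.5 * skill - 2.0 * gender
hired = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Train a logistic regression on (skill, proxy) only -- gender "removed".
X = np.column_stack([skill, proxy, np.ones(n)])
w = np.zeros(3)
for _ in range(2000):                       # plain gradient descent
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - hired) / n

# The model never saw gender, yet the proxy's learned weight is negative:
# it absorbed the historical anti-women bias through the correlated feature.
print(w)
```

This is why editing out explicit gendered terms, as Amazon did, offers no guarantee: any feature correlated with gender can quietly carry the same signal.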
The story mentions how some 55% of U.S. human resource managers said artificial intelligence would be a regular part of their work within the next five years.
And this is according to a 2017 survey by talent software firm CareerBuilder.
If someone said to me that they thought a computer program was going to do a good job of removing discrimination and bringing about equality of outcome, I would say you are wrong, and it's very obvious the computer program will do the opposite.
There are people who went to fancy, prestigious schools and they had access to money.
Somebody who goes to Harvard, for instance, is probably more likely to have money in their family, which means they're more likely to have extracurricular activities, which means they're more likely to have accolades, and the resume is going to look better.
They're also going to have more access to resources, and their parents, who are going to be highly educated, are probably going to help them craft a better resume.
So you'll have poor people, whose resumes don't look that good, who went to less prestigious schools, who will get weighed down.
Then you're going to have women, who might have gaps in their employment history from taking care of their family.
And the machine is going to say, well this person wasn't working and this person was, so it's ultimately going to favor those who are more likely to be wealthy and less likely to be female.
It's rather obvious.
Now a lot of people say, maybe we just need humans to do it, but we remove the names so they can't tell if it's a woman or a man, and they can't tell if the person's white or non-white.
In fact, that actually doesn't work either.
Blind Recruitment Trial to Boost Gender Equality Making Things Worse, Study Reveals.
Leaders of the Australian Public Service will today be told to hit pause on blind recruitment trials, which many believed would increase the number of women in senior positions.
Blind recruitment means recruiters cannot tell the gender of candidates because those details are removed from applications.
It is seen as an alternative to gender quotas, and has also been embraced by Deloitte, Ernst & Young, Victoria Police, and Westpac Bank.
In a bid to eliminate sexism, thousands of public servants have been told to pick recruits who have had all mentions of their gender and ethnic background stripped from their CVs.
The assumption behind the trial is that management will hire more women when they can only consider the professional merits of candidates.
Their choices have been monitored by behavioral economists in the Prime Minister's Department, colloquially known as the Nudge Unit.
Professor Michael Hiscox, a Harvard academic who oversaw the trial, said he was shocked by the results and has urged caution.
We anticipated this would have a positive impact on diversity, making it more likely that female candidates and those from ethnic minorities are selected for the shortlist.
We found the opposite, that de-identifying candidates reduced the likelihood of women being selected for the shortlist.
The trial found assigning a male name to a candidate made them 3.2% less likely to get a job interview.
Adding a woman's name to a CV made the candidate 2.9% more likely to get a foot in the door.
We should hit pause and be very cautious about introducing this as a way of improving diversity, as it can have the opposite effect, Hiscox said.
At least according to this study done in Australia, it would seem that in the hiring process, men actually give the benefit to women.
Perhaps the reason that we don't see equal outcome in these jobs is because women just choose to do different things.
Maybe not that many women are actually applying for these jobs, so when it comes to hiring people, they end up choosing men.
When they would put a woman's name on a resume, they were more likely to get an interview.
And that means the recruiters gave the benefit to women, even if they were less qualified.
And men were penalized, even if they were equally qualified.
But there's actually some data from a Harvard Business School report.
They talk about a study done to test for gender bias in hiring processes.
They said that researchers created online experiments with 100 participants representing workers seeking jobs and another 800 representing employers looking to hire workers.
The workers were asked to complete a series of sports and math quizzes.
Some of these questions were easy and others hard.
Overall, men performed slightly better than women, answering on average one extra question correctly.
Employers then had to hire a candidate, choosing between one woman and one man.
Each candidate's score results on the easy questions were made available to the hiring official, but employers were not provided workers' scores on the difficult questions.
Yet they were additionally told they would receive compensation if their hire did well on the hard quiz.
When told that men did slightly better on average than women on sports or math tasks, employers were much less likely to hire a female worker than a male worker, even when two individual workers had identical easy quiz grades.
The researchers then took gender out of the hiring decision.
Workers were simply identified to potential employers as either born in an even month or an odd month.
In reality, but unknown to the employers, the researchers labeled all women candidates as odd month and all men as even month.
Using test results as their guide, employers still steered clear of the odd-month, or female, workers, choosing them only 37% of the time.
When identified as women, they were chosen 43% of the time.
Just like the woman was hired less often, the odd month worker was hired less often too, Kaufman says.
That tells us the discrimination isn't based on a prejudice against women.
So it's not that people in the setting don't like hiring women.
Instead, employers are drawing on the information about average performance and are not hiring members of lower-performing groups.
The important point here is that just like in the study done in Australia, when female workers were not identified, they were chosen 37% of the time.
But when they were identified as women, they were chosen 43% of the time.
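The "statistical discrimination" mechanism the researchers describe can be made concrete with a toy decision rule: the employer's best guess of a candidate's hard-quiz performance blends the candidate's own easy-quiz score with their group's average. This is a sketch of the general idea, not the researchers' actual model; the scores and the 30% weight on the group average are assumptions chosen for illustration.

```python
# Toy model of statistical discrimination: the employer shrinks each
# candidate's individual signal toward their group's average performance.
def expected_hard_score(easy_score, group_mean, weight_on_group=0.3):
    """Blend individual easy-quiz score with the group prior (weight assumed)."""
    return (1 - weight_on_group) * easy_score + weight_on_group * group_mean

# Two candidates with identical easy-quiz scores, different group averages.
woman = expected_hard_score(easy_score=7.0, group_mean=6.0)
man = expected_hard_score(easy_score=7.0, group_mean=7.0)

print(man > woman)  # True: the group average alone tips the decision
```

Under this rule, no individual dislike of women is needed: identical candidates are ranked differently purely because of the group statistic attached to them, which matches the researchers' finding that the odd-month label was penalized just like the female label.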
And this is just two studies that I found.
It's not necessarily indicative of the entire world and whether or not all companies operate in the same way, but it's an interesting find that at least in these reports, in these studies, when identified as a woman, they actually received an advantage.
Gender-blind hiring actually had the opposite effect.
Personally, I don't know if we'll ever have equality of outcome because some jobs just require certain kinds of people.
We're not surprised when the NBA is dominated by people who are over six feet tall.
Firefighters are more likely to be men because you need to be able to lift a certain weight to be a firefighter.
It's just the way it is.
It's strange to me, then, that we can recognize certain jobs require certain types of people, yet when it comes to issues like STEM, we think we will find equality of outcome.
It's just not working.
The computers aren't solving the problem, and neither are gender-blind hiring processes.
All it's doing is actually making things worse.
In fact, the only thing that's actually helping bring about equality of outcome is the fact that individuals of certain ideologies are actively trying to end the disparity.
When it comes to a process, like being gender blind, for natural reasons, people are going to choose those who tend to be wealthier or men.
And computers are going to do the same thing, but probably much, much faster.
So if you are somebody who believes we should have an equality of outcome, the only thing you can do is advocate for ideology to prevail.
But when it comes to trying to find the most efficient person for the job, it's always going to be biased, and it's probably going to be sexist.
For the simple reason that, in the long run, women tend to take more time off from work to raise families, and that will weigh against them.
If we use a computer algorithm, it will probably look at women who took time off and put them in the same category as women who didn't and still weigh down other women.
Now, it's possible the algorithms can be programmed to overlook this, but even when Amazon tried to remove gendered language from how the algorithm worked, it found other ways to be biased against women, and it just didn't work.
And then, as we saw from the other bits of data, it looks like when recruiters are presented with the name of a woman on a resume, they give them a benefit.
They actually are more likely to give them an interview.
So it just really seems like ideology is the driving force behind all of this.
Equality of opportunity might not be enough necessarily, but equality of outcome doesn't seem to work.
So, I can't say I know what the solution is.
I can just say that it's really interesting that Amazon had to abandon a computer algorithm-based hiring process because the computer algorithm was biased against women.
For whatever reason, when it came down to the nitty-gritty, the computer chose men.
But let me know what you think in the comments below.
We'll keep the conversation going.
You can follow me on Twitter at TimCast.
Stay tuned.
New videos every day at 4 p.m.
And more videos coming up on my second channel, youtube.com slash TimCastNews, starting at 6 p.m.