Why Did Mashable Make The YouTube Shooter Look White?
SUPPORT JOURNALISM. Become a patron at http://www.patreon.com/Timcast
My Second Channel - https://www.youtube.com/timcastnews

Mashable ran a story about the YouTube incident a few days ago, yet for some reason ran a doctored photo of the suspect that made her skin whiter and her eyes green. Was this deliberate, or was it an accident? How did the author accidentally use this photo when it was hard to find? Why didn't the author use one of the thousands of photos that had gone viral already?

Make sure to subscribe for more travel, news, opinion, and documentary with Tim Pool every day.

Support the show (http://timcast.com/donate)
Yesterday I was talking a bit about how whenever there's a major mass casualty incident, terror attack or shooting, you will see the left and the right try to use it to push their political agenda.
But what happens when they try to push one of these narratives and then information comes out proving them wrong?
It would seem we now have a pretty interesting example of just that,
because Mashable published a photo of the shooter, who was a Persian vegan activist,
and for some reason, the photo they used whitened her skin and made her eyes green instead of brown.
Before I get started, make sure to hit that subscribe
button and click that little bell icon,
because YouTube doesn't do a very good job of notifying you all when I have a new video out.
And also, make sure to go to Patreon.com forward slash TimCast and become a patron today.
There are many different tiers to choose from, most notably Tier 1.
At $10 per month, you get access to behind-the-scenes photos and video and bonus commentary videos when available, usually when I'm out reporting in the field, so please consider becoming a patron to help support me and my work.
Several months ago, I tweeted something out to exemplify just how I feel about the left and the right and partisan bickering.
And in this tweet, I said, the left and the right.
In this photo, you have Charles Clymer.
In one, he says, my thoughts are with those in Manchester and the UK tonight, and I am disgusted and unsurprised at Trump supporters exploiting this tragedy.
But just over here, you see him saying, this is a terrorist attack.
The perpetrator is a white terrorist.
We need to say these things out loud as much as possible.
Next, we have Peter Sweden, who said, stop politicizing Texas shooting.
27 people have lost their lives.
This is not about politics or skin color.
This is about evil versus good.
And then just below, New York terrorist turns out to be a 29-year-old migrant from Uzbekistan
who worked as an Uber driver.
Time for extreme vetting.
And the point of that tweet was just to say that whenever something happens, one side will downplay it if it negatively impacts them, or they'll play it up if it will benefit their political agenda.
Well, during the YouTube shooting, we saw this tweet.
If this shooter at YouTube isn't a white male with far right leanings, I will eat my effing hat.
Get rid of the effing guns.
And as it turns out, the YouTube shooter was a vegan bodybuilder, Nasim Aghdam.
And you can see this is what she looks like.
But now to the main point.
It wasn't long until people found out this woman was, well, a woman and Persian.
We found out that she was vegan.
We found her website.
We found her YouTube videos.
But Mashable decided to do something interesting.
In an archived version of the story from Mashable, it says, YouTube shooter identified as creator who accused company of filtering her videos.
And this is the photo they used.
Her skin has been lightened, and her eyes have been turned green.
And beneath it, it says, an image of Nassim Afgham, as it appeared on her Instagram account, at Yassil Nassim, which has been deleted.
Image, Instagram.
That's really interesting, because that is not what the woman actually looks like.
Why did Mashable run a photo of a woman who was Persian with brown eyes with her skin lightened and her eyes turned green, and then captioned it to say that it came from Instagram when it probably didn't?
The article title has since been changed and updated, and the doctored photo has been removed entirely.
Here we can see the original photo and the doctored photo.
Now, I want to know how this happens.
Why is it that Mashable ran the incorrect photo that apparently tried to make this woman appear white with green eyes?
On Twitter, FrexB tweeted, Hey Mashable, Johnny Liu, why did you lighten Nassim Aghdam's
skin and turn her eyes green?
I'm sure you're not trying to gaslight your audience into believing she's white.
Then he posted a photo from ABC7 News with the correct photo and you can see her eyes are dark, presumably brown.
And then the photo from Mashable where her skin has been lightened and her eyes have been turned green.
Now it starts to get a little interesting when we see the response from Johnny Liu, the author of the article.
First, he said, the Instagram Facebook pages this was from are all now deactivated.
Now that's very convenient, because you can't actually check to see if Johnny was telling the truth.
He claims the image came from Instagram.
Well now the pages are deactivated, you can't check.
Fortunately, I began to archive her pages after the shooting because I... I believed they would be taken down.
Now, although it wasn't my intention to capture her profile photo, you can clearly see that the doctored version is not the photo she is using.
She is using the photo that everyone else has, with her eyes, the natural color, and her skin tone, the natural color.
So it would seem that something is wrong with what Johnny is saying, because he's posting links to her Facebook and Instagram, but claiming they're deactivated now.
Well, how did people respond?
A Twitter user said, he is lying.
Every single cached image is of the original, untouched photo.
To which Johnny Liu responded, here's one I got from an Instagram aggregator.
And this is the image he posted.
You can see that it's a Google search of Yaseel Nassim, and sure enough, here is the image of her with green eyes and whitened skin.
But here's what I want you to do.
Take a look at the next three photographs.
You have another picture of Nasim, another woman who is unrelated, and a young man who is unrelated.
The same Twitter user responded with his own version of the same Google search, and you can see the same four photos, yet for some reason, the image of Nasim is normal.
Now, when I did a search, we can see Yassiel Nassim, and none of those photos actually come up.
When I search for Yassiel Nassim Pyctomy, I do find the doctored image from a website called Pyctomy, which is not Instagram.
Apparently, that is the Instagram aggregator that Johnny Liu is talking about.
But let's go back because Johnny Liu said the Instagram slash Facebook pages this was from are all now deactivated.
Well, that's strange because he claimed the photo was found on Google through an Instagram aggregator and not through these links.
So was this first post a lie or was he mistaken?
How did it come to be that he sent links to the accounts in which the photo was not being used as a profile image?
And if he got the photo from Google, why did he link this in the first place?
Perhaps it was all a mistake.
Perhaps this writer just did a really bad job.
At the very least, we can definitively say Johnny Liu is bad at what he does.
Because he posted a doctored image of a woman in an article, in a breaking news article nonetheless.
And then when he fixed the article, he didn't actually put the correct photo up.
He just removed the bad image.
At the bottom of the updated story, it says, Update, April 4th, 3:45 p.m., citing San Bruno police and a report from the Mercury News.
Editor's note: an inaccurate image from Aghdam's Instagram account
was originally featured in this post.
It has since been removed.
Mashable apologizes for any confusion caused.
He got her name wrong in the original article.
He called her Afgham.
And they're claiming the image came from Instagram.
Well, that's really hard to prove considering the accounts are all deactivated.
But when I went through her account, I never saw this photo.
And what I can say is that it was not her profile image.
Her profile image was not doctored.
So, I don't know exactly where they got this photo from.
They claim it came from Instagram, but the author also claims it came from an aggregator.
So, this seems to be very strange.
Many people believe this to be a case of an author trying to make it seem like this woman was actually white with green eyes.
And we can't say that's true.
At the very least, we can say Johnny Liu is bad at what he does, because he published a bad photograph.
It was the wrong photo with the wrong name.
At the very least, that's what we can say.
But it is entirely possible he did it on purpose.
We also have to consider that if this was an accident, that would mean that when he did a Google search for this woman, he saw all of the photos of her as someone with darker skin and brown eyes, and decided to choose the one photo that made this woman white with green eyes.
Which, in my opinion, I find very strange.
But, as Hanlon's razor suggests, never attribute to malice that which is easily explained by incompetence.
I can always just defer to the fact that this guy is probably just bad at what he does, and I don't have to be worried about getting sued by him or Mashable, because at the very least, that is true.
Accusing him of doing this on purpose, we can't prove that.
But if it's his statement that they just picked the wrong image, well then, I think it's fair to say you're bad at this.
Next, let me give an honorable mention to the Daily Beast, because it has been two days, and the Daily Beast still has this story up on their website.
Update: Official: Woman shot boyfriend at YouTube HQ.
A federal law enforcement official told Daily Beast that a woman shot her boyfriend at YouTube headquarters in San Bruno, California on Tuesday.
The assailant is dead from a self-inflicted gunshot wound, San Bruno Police Chief Ed Barberini told reporters.
This is all wrong and should not be public right now.
People are probably looking at this article and believing it to be true.
Many people have tweeted about this woman saying it was a domestic incident, that she was targeting her boyfriend.
We know that's not the case.
As it stands, the working theory is that she was upset with Google and YouTube for demonetizing and filtering her videos.
But for whatever reason, the Daily Beast is still running this article.
News is not perfect.
People make mistakes.
But with so many people calling out the Daily Beast and calling out Mashable, you'd think they'd get it right.
Why is it that Mashable can make such a serious mistake in publishing a whitened version of a woman with green eyes instead of the actual photo which was going viral?
How does that happen?
In my opinion, I'm going to defer to incompetence and just say this Johnny Liu guy probably wasn't paying attention, probably didn't care, he's probably paid crap, and he just did a quick Google search and then picked whatever photo he found without paying attention.
But man, if that's true, and that is the simple solution, that is serious incompetence.
The point of being a journalist, the point of working for a company like Mashable, is to do work to find the truth.
And if the average Twitter user is doing a better job than you, you probably shouldn't have a job in the first place.
Why is Mashable paying people who would make such serious mistakes?
Could you imagine if you hired a plumber and they came to your house, and they didn't know what they were doing, and you knew more about plumbing than they did?
Why would you want them to do that job?
Why would you pay them?
What is it about these individuals who work for these various news organizations that they get to have these jobs even though they are worse at their job than your run-of-the-mill Twitter user who just happened to do a Google search?
This is one of the biggest problems we have in media.
Is this political?
I don't know.
But we have one of two scenarios.
This person was trying to politicize the incident and make the woman appear white, or he's just bad at his job.
Either way, it paints a pretty damning picture of the current state of politics.
Should we assume that media companies are political and trying to push a narrative and an agenda, even if that means manipulating the public?
Or should we assume that people are just really bad at their jobs and underpaid and don't really care if they're getting the facts right?
In my opinion, a little bit from column A, a little bit from column B. Why not both?
In my experience, that tends to be the case.
And it is entirely frustrating.
It's frustrating for me that I have to make videos complaining about and highlighting this poor journalism.
I would much prefer to be out in the field doing interviews with people.
But when things like this happen, when news is in such a sorry state, I feel like one of the most important things I can do is just continually shine a light on the bad journalism.
We all need to pick up some of the slack and start fact-checking on our own.
And this is what a lot of people on Twitter did.
How hard is it to do a Google search to find this woman's website, to find pictures of her?
It took me only about 10 minutes to find all of her social media accounts.
And I'll be honest, I do this for a living.
I know how to find a lot more than just her social media accounts.
I found out her family's information, I found her address and phone numbers.
Journalists should be able to dig through information, determine what is relevant to the public, and publish facts.
We are supposed to be doing our best to find the truth.
And many people are not doing that.
Both because of malice and because of incompetence.
And goddammit is it frustrating to have to see this every single day.
I'll tell you what, I had a different story lined up.
I was going to be talking about data theft from Facebook and I have another story lined up about the current state of the American fear of terrorism.
We can talk about data analytics.
We can talk about terrorism.
The problem is, if our media isn't doing its job, none of these conversations matter because people are going to be fed such bad information.
It won't matter what I do.
It doesn't matter if I go out and do a good interview.
It doesn't matter if I actually film Antifa in Berkeley throwing explosives at people.
People aren't going to believe it.
They're going to say, oh, that's fake news, and they're going to accuse me of being partisan because of stories like this, because of people who claim to be journalists but can't even find the right photo.
What's particularly strange about this is that the doctored image is the needle in the haystack.
This guy was searching through a haystack of photos of this woman that were accurate, and he chose the needle.
I just don't understand it.
So let me know what you think in the comments below.
Do you think this was deliberate, or do you think this guy is just bad at his job?
Because at least in the bigger picture, we can say that there's a lot of people who are doing this on purpose, and a lot of people who are just bad at their jobs.