You know, one of the main criticisms against social media companies like YouTube, the place where we published this episode, is that they appear to have their cake and eat it too.
Because on the one hand, whether it's Google, Twitter, YouTube (which is owned by Google), Facebook, or Instagram, they all call themselves platforms, which in theory makes them not liable for the content that their users post.
Their argument is that they're just like a phone company.
They provide the service, and they're not responsible for whatever people might say over the phone.
However, that is the way it works in theory.
Because in reality, for the past six or seven years now, these social media companies, they have been acting more and more like publishers.
They choose which videos or which types of content get promoted.
They choose which publishers are authoritative and therefore deserve more prominent placement.
They have so-called fact checkers, who are pretty much like editors at a newspaper, and who quite literally censor whatever videos they deem to be untrue.
They have certain narratives that they absolutely need to push out, such as the narrative surrounding... And then, of course, there's the ever-present algorithm, which just takes down whatever videos contain keywords that the YouTube executives have deemed to be taboo.
For example, just yesterday, our sister media outlet, NTD, put out the trailer for their new documentary, and they put it on YouTube.
That documentary is called The Unseen Crisis, and it includes a number of interviews with people who were vaccine-injured.
Now, regardless of what you might think of the topic, it's important.
These are the real-life stories of real-life people who were severely injured after getting the shot and then subsequently completely ignored.
But wouldn't you know it, within minutes, the trailer was taken down.
Now, just like that, the supposed platform called YouTube took down the trailer for a fact-based documentary which took about half a year to put together.
You can say all you want, but that does not really sound like something a phone company would do.
And just as an aside, if you'd like to check out that documentary, The Unseen Crisis, you can watch it over on Epic TV, our awesome no-censorship video platform.
The link to that documentary will be right there at the very top of the description. Regardless, with all that as the preface, there were two very high-profile cases before the U.S. Supreme Court which truly had the legal teams at these big tech companies sweating bullets.
Because these two cases, which separately made their way up to the U.S. Supreme Court, they gave the court an opportunity to actually reevaluate Section 230.
And in case you don't know, Section 230 is the legal provision in the U.S. Code, part of the Communications Decency Act of 1996, which exempts online platforms like Google, YouTube, Facebook, and Twitter from being held liable for content that's posted by their users.
It's basically the very small part of the U.S. legal code which allows them to be treated as platforms rather than as publishers.
And wouldn't you know it, after months of legal back and forth, just yesterday, in a unanimous decision, the U.S. Supreme Court sided with Twitter, Google, and Facebook.
Although, in an interesting turn of events, the court actually almost completely sidestepped the question of Section 230 altogether.
Let me break down for you the specifics of their decision here.
And by the way, I do hope that if you appreciate content like this, you take a super quick moment to smash those like and subscribe buttons, which quite literally forces the algorithm, the biased algorithm, to share this video with ever more people.
Now, in regards to the first lawsuit, it was filed after a deadly ISIS attack took place at a nightclub over in Istanbul.
The families of the victims sued Twitter, arguing that the company was liable because it allowed terrorist videos, specifically videos from ISIS, to be posted on its platform, and/or it failed to do enough to police the terrorist accounts that were posting those videos.
Essentially, their argument was that by not policing this speech, Twitter was culpable in the attack.
On the flip side, however, as per usual, the Twitter legal team made the counter-argument that the company has near-universal protection from liability for whatever its users post, based on Section 230.
Now, during the oral arguments for this particular case, the Supreme Court justices struggled back and forth a bit over the extent to which social media platforms should be held liable when actual terrorist groups use those platforms to promote their terrorist ideologies.
For instance, Chief Justice John Roberts said that, quote, Despite any algorithm YouTube may use to push users to view videos, the company is still not responsible for the content of the videos, or the text that is transmitted.
Meanwhile, Justice Elena Kagan, from the court's liberal wing, told a lawyer for one of the families that, quote, I can imagine a world where you're right, that none of this stuff gets protection.
And you know, every other industry has to internalize the cost of its conduct.
Why is it that the tech industry gets a pass?
A little bit unclear.
On the other hand, I mean, we are a court.
We really don't know about these things.
You know, these are not like the nine greatest experts on the internet.
And as I mentioned earlier, after several months' worth of deliberations, in their final 38-page decision, the U.S. Supreme Court actually sidestepped the issue of Section 230 altogether.
And instead of adjudicating Section 230, the court focused on the technicalities of this specific case.
Here's what the Supreme Court wrote in their opinion statement in relevant part.
Quote, The plaintiffs sought to hold Twitter, Facebook, and Google liable for the terrorist attack that allegedly injured them.
But the court concluded that plaintiffs' allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack.
The connection between the online platforms and the nightclub attack was far removed.
The allegations plaintiffs make here are not the type of pervasive, systemic, and culpable assistance to a series of terrorist activities that could be described as aiding and abetting each terrorist act by ISIS.
So that was the first case.
The second lawsuit was filed against Google.
This case dates back to 2015, when a U.S. citizen was killed in an ISIS terrorist attack over in Paris.
Now, if you remember back to that period of time, that particular killing was part of a larger series of attacks carried out across multiple areas of Paris, which ultimately led to about 129 deaths.
And so the family of one of the victims filed a lawsuit against Google, the owner of YouTube, claiming that the company was liable for the attack under the federal Anti-Terrorism Act.
Their argument here was essentially that YouTube was aiding the recruitment efforts of ISIS by using their algorithms to steer more and more people towards ISIS videos.
Here's specifically what the initial lawsuit that was brought forth against Google said in relevant part.
Quote, The plaintiffs asserted that Google had knowingly permitted ISIS to post on YouTube hundreds of radicalizing videos inciting violence and recruiting potential supporters to join the ISIS forces then terrorizing a large area of the Middle East and to conduct terrorist attacks in their home countries.
Because of the algorithm-based recommendations, users were able to locate other videos and accounts related to ISIS even if they did not know the correct identifier or if the original YouTube account had been replaced.
Google's services played a uniquely essential role in the development of ISIS's image, its success in recruiting members from around the world, and its ability to carry out attacks.
Google officials were well aware that the company's services were assisting ISIS.

Google, on the other hand, denied liability, saying that it's impossible for them to review every single video that gets posted on their platform, given that over 500 hours of new content gets uploaded every single minute.
Although, to be frank with you, if you happen to post a video mentioning the words vaccine and injury in the same sentence, well, suddenly Google seems to find time to review your video instantaneously.
Regardless of that, though, just like in the earlier case, the one against Twitter, the U.S. Supreme Court ruled in favor of Google in this case as well.
However, again, just like in that earlier case, the court sidestepped the question of Section 230 altogether and focused instead on just the merits of this particular complaint.
Here's part of what they wrote.
Quote, It was unnecessary to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief.
And so the status quo remains.
You and I, the law-abiding American citizens, continue to be censored by these companies, even as these companies can once again call themselves platforms that are not responsible for the speech of their users.
They just continue to censor us nonetheless.
Very cool.
If you'd like to go through the details of any of the cases we covered in today's episode, I'll throw the PDF versions of the Supreme Court rulings down in the description box below.
Speaking of box, let me show you the sponsor of today's episode.
That's right, the sponsor of today's episode is an absolutely awesome small elderberry farm over in beautiful Wisconsin.
Now listen, as great as all these mRNA vaccines are, and I know we're all clamoring to get ever more of them into our food, the truth is that elderberries are one of the most potent natural sources of antioxidants, especially quercetin, a naturally potent antioxidant that helps rid the body of free radicals and keeps us happy and healthy.
Elderberries are, by the way, also a great source of vitamin C.
Now listen, I'm of course not a doctor, but I will say that drinking these elderberry extracts makes me feel good.
They come straight from the beautiful ground in Wisconsin, go into these little packets, and then into my mouth.
And we actually asked the owner of that farm what makes his elderberry extract different from others, and here's what he told us.
Quote, I infuse elderberry flowers into the fruit extract.
We're the only ones that do this in the ultimate packet, which increases the quercetin levels.
Flowers blend well with the fruit because they have the same molecular composition.
And quercetin has been shown to remove toxins from the body.
Our elderberries are locally grown and tundra tough.
And so these small elderberry packets are awesome.
They're healthy and they're budget friendly.
They're essentially made for people who don't need a full eight ounce bottle of the stuff.
They come in these perfectly sized small packets so you can either take it on the go or give it to your kids as they rush off to school and that way you can make sure that they get their daily dose of antioxidants.
And best of all, right now they're offering viewers of Facts Matter a cool 20% off.
Just use AMERICAN20 as the promo code. That's American, with an N, then 20. And so, again, it's a great product from a small, beautiful farm over in Wisconsin, owned by patriotic Americans who care not only about the country, but also about the truth, which is why they sponsor a program like ours.
So, check it out, Wisconsin Elderberry.
The link will be down in the description box below.
Use promo code AMERICAN20 to save 20% off, and now let's head on back to the studio.
And then, until next time, I'm your host, Roman from the Epoch Times.