All Episodes
March 23, 2018 - Tim Pool Daily Show
14:52
Free Speech on the Internet May Have just Ended

SUPPORT JOURNALISM. Become a patron at http://www.patreon.com/Timcast
My second channel - https://www.youtube.com/timcastnews
Many people have called for an "internet bill of rights," and others have highlighted how problematic defamation is, but perhaps one of the more important issues facing online free speech is the erosion of Section 230, which provides "safe harbor" protections for social media networks and online platforms.
Support the show (http://timcast.com/donate)

Participants
Main voices
tim pool
14:49

tim pool
A lot of people have been talking about an internet bill of rights to guarantee the right of free speech on these massive platforms.
One of the things that I've talked about repeatedly, especially yesterday and the day before, is that our public and political discourse is happening on private platforms where a small group of elite individuals can decide what is or is not allowed.
And if our political discussions are taking place on a private platform, and the person who owns that platform says, I don't like this particular idea, they can strip that idea from political discourse.
This is going to have a huge negative impact on the future.
Now, a lot of people have also discussed defamation cases and fake news.
Why is it that so many outlets are allowed to publish lies and get away with it?
Why aren't any of these platforms responsible for putting out misinformation?
One of the most important pieces of legislation, which was passed in 1996, is called Section 230, and it gives these companies protection against being sued based on what a third party published to their platform.
Some people say this guarantees free speech on the internet, and is possibly the most important legislation as it pertains to the internet.
But some of these rights are being stripped away.
Some of these protections are being stripped away with a new law that was recently passed and is expected to be signed soon.
This could spell the end for free speech on the internet.
In the early days of the internet, a lawsuit triggered a fierce debate as to whether or not websites were publishers or just platforms.
Section 230 was passed with bipartisan support, saying that you can't hold a website responsible for what a third party publishes.
Now, before I go any further, I've got to give a quick shout out to today's sponsor, all of you guys.
If you haven't already, go to patreon.com forward slash timcast and become a patron today.
There are many different tiers to choose from, most notably tier one at $10 per month.
You get access to behind-the-scenes photos and videos and commentary, usually from when I'm out in the field, and we do have some trips coming up.
So please stay tuned for that and consider supporting me at whatever level you feel comfortable today to help me do the work that I do.
Following the suit, we gained Section 230 protections.
And some people say that without this, Twitter, Facebook, YouTube, and many other platforms would not exist.
But it's more complicated than that.
First, let's look at how this free speech protection came to be.
This from NPR: the suit had to do with some online posts about a company called Stratton Oakmont.
On one finance-themed bulletin board, someone had accused the investment firm of fraud.
But in 1994, the firm called the accusations libel and wanted to sue.
But because it was the internet, the posts were anonymous.
So instead, the firm sued Prodigy, the online service that hosted the bulletin board.
Prodigy argued it couldn't be responsible for a user's post.
Like a library, it could not be liable for what's inside its books.
Or, in now familiar terms, it's a platform, not a publisher.
The court disagreed, but for an unexpected reason.
Prodigy moderated posts, cleaning up foul language, and because of that, the court treated Prodigy like a newspaper liable for its articles.
After seeing this ruling, some politicians thought that that's exactly the wrong response.
That Prodigy was trying to do a good thing by moderating its platform and it shouldn't be punished because of it.
Thus, a bipartisan bill was born which granted Section 230 protections that states no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
If you want to go on Twitter and say something that is libelous, slanderous, whatever, if it's defamation, you can't hold that platform accountable.
So when you go on Twitter and say some nasty stuff about a company, make some accusations, Twitter is not going to get in trouble because of it, simply because they moderate.
But here's the latest news.
Congress okays sex trafficking bill that critics say will censor the internet.
Critics say law will limit free speech online and won't help trafficking victims.
Now this story is from two days ago, but I'm going to read it as is.
The U.S. Senate today passed a bill that weakens legal protections given to websites that host third-party content, saying the measure will help stop promotion of prostitution and sex trafficking on the internet.
But the legislation won't actually help victims of sex trafficking and will erode online free speech, critics say.
The Senate passed the Stop Enabling Sex Traffickers Act in a 97 to 2 vote.
Only Senators Ron Wyden and Rand Paul voted against the bill, which is also known as the Allow States and Victims to Fight Online Sex Trafficking Act.
It already passed the House of Representatives and is expected to be signed by President Donald Trump.
The bill changes Section 230 of the 1996 Communications Decency Act, which provides website operators with broad immunity for hosting third-party content.
The bill declares that Section 230 was never intended to provide legal protection to websites that unlawfully promote and facilitate prostitution and websites that facilitate traffickers in advertising the sale of unlawful sex acts with sex trafficking victims.
As such, the bill says that website operators who promote or facilitate the prostitution of another person will no longer have the legal protections of Section 230.
Violators could face fines or prison sentences up to 25 years.
The bill was spurred largely by Backpage.com, even though the site already shut down its adult advertisement section because of government pressure.
Now that seems pretty straightforward and I think most people would agree with it.
We don't want to allow sex traffickers to get away with hosting this kind of content when they know it exists and it should be taken down and they should be held responsible.
But a lot of people are concerned that this is an erosion of a particularly important free speech guarantee, that this section makes sure that everyone has a right to free speech.
And that just because these things are being posted to these platforms doesn't mean that the person hosting them should be responsible for it.
You should always be wary when politicians try to use the most egregious crimes and circumstances to push laws that might erode particular freedoms.
But is Section 230 actually protecting us?
Is it potentially hurting us?
First, let's look at what the Electronic Frontier Foundation has to say about Section 230.
What is this Section 230 thing anyway?
Section 230 refers to Section 230 of Title 47 of the United States Code.
It was passed as part of the much-maligned Communications Decency Act of 1996.
Many aspects of the CDA were unconstitutional restrictions of freedom of speech and, with EFF's help, were struck down by the Supreme Court.
But this section survived and has been a valuable defense for internet intermediaries ever since.
Can my commenters sue me for editing or deleting their comments on my blog?
Generally no, if you're not in the government.
Section 230 protects a blogger from liability for "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."
This would include editing or deleting posts you consider objectionable, even if those posts would be protected by the First Amendment against government censorship.
So I'm not a lawyer.
I am not a litigator.
I don't know exactly all of the cases that have cited Section 230.
But what I can say is there's some interesting reading to consider.
Now, EFF says that generally you can't be sued even if you remove posts that would be protected by the First Amendment against government censorship.
Now I wonder if that means, without Section 230, you could sue these platforms for violating your First Amendment rights.
But let's go back to the original story from NPR and look at another interesting piece of information.
Prodigy argued it couldn't be responsible for a user's post.
Like a library, it could not be liable for what's inside its books.
Or, in now familiar terms, it's a platform, not a publisher.
The court disagreed, but for an unexpected reason.
Prodigy moderated posts, cleaning up foul language, and because of that, the court treated Prodigy like a newspaper liable for its articles.
Now, I'm not the internet historian, and I don't know a lot about this case outside of this light reading from the EFF, NPR, and a few other news sources.
But at least at a cursory glance, it would seem like this is rooted in Prodigy, the original company, moderating its posts.
And perhaps if they did not do that, they would not have been seen as a publisher, simply a platform for people to communicate.
The court felt that because they were moderating, they had some culpability in what was posted because they were getting rid of some content and not others.
Thus, Section 230 came into existence, stating that you can't hold these platforms responsible for what third parties say.
And that alone is very important.
But it's interesting when you look at how the court felt about this.
It seems to me that there's some possibility that without Section 230, these companies would have no choice but to let commenters post whatever they want so long as it wasn't a violation of the law.
In a story written by The Hill: "We now have the ability to go after these websites who are exploiting women and children online," Senator Rob Portman, Republican of Ohio, one of the original authors of the bill, said at a press conference after the vote.
The legal liability protections are codified in Section 230 of the Communications Decency Act from 1996, a law that many internet companies see as vital to protecting their platforms.
SESTA would amend the law to create an exception for sex trafficking, making it easier to target websites with legal action for enabling such crimes.
So, looking at this case from before Section 230 existed, it would seem that removing Section 230 protections would result in a massive loss for free speech.
These companies would have no choice but to heavily moderate their platforms, or these platforms might not be able to exist in the first place.
According to the EFF: what if there were no CDA 230 to protect online speech?
Blogs and social media would look radically different.
They say sites like Huffington Post, Facebook, Twitter, Google+, and Reddit could be sued every time a user crossed the line.
Blogs, review sites, forums, and other sites that deal with controversial issues could be pressured to silence unpopular opinions.
Innovation would be diminished.
Sites may feel pressure to limit real-time posts in order to review them to avoid liability.
More importantly, Twitter, Facebook, Google, etc. can't police this, and it's been one of the biggest problems.
Now the reason that these social media sites are censoring content isn't because of government pressure or liability.
It's because they don't want to lose advertisers.
So regardless of what the government thinks, these companies are taking action based on ad revenue.
Back to the Ars Technica story.
"SESTA/FOSTA undermines Section 230, the most important law protecting free speech online," EFF activist Elliot Harmon wrote today in a post titled "How Congress Censored the Internet."
It's easy to see the impact that this ramp up in liability will have on online speech.
Facing the risk of ruinous litigation, online platforms will have little choice but to become much more restrictive in what sorts of discussion and what sorts of users they allow, censoring innocent people in the process.
What forms that erasure takes will vary from platform to platform.
For some, it will mean increasingly restrictive terms of service, banning sexual content, for example, or advertising for legal escort services.
For others, it will mean over-reliance on automated filters to delete borderline posts.
No matter what methods platforms use to mitigate their risks, one thing is certain: when platforms choose to err on the side of censorship, marginalized voices are censored disproportionately.
The internet will become a less inclusive place, something that hurts all of us.
Now again, I'm not a lawyer, I don't know exactly how this will be implemented, but I have to say, at a cursory glance, that seems to be the case.
We already know that YouTube, Twitter, and Facebook take a very heavy hand in policing content they fear might negatively impact ad revenue.
And this has to do mostly with how we perceived what YouTube did following the adpocalypse.
It has nothing to do with their liability, it has to do with their bottom line.
If these companies can't maintain revenue, they can't function, they will cease to exist.
And in this instance, the attack isn't coming from advertisers, it's coming from the government.
If YouTube feels that anything that could be perceived as sexual trafficking needs to be removed, then the restrictions will become increasingly tighter on all of its users.
And as the EFF activist said, it's likely that innocent people will be censored and restricted by accident.
We've seen this in response to the adpocalypse.
Videos that should not be demonetized are demonetized.
Videos that should not be restricted get restricted.
And news providers seem to get a pass.
Certain channels seem to get a pass.
Why?
Well, in the instance of large creators, YouTube, Facebook, and Twitter can personally review that.
If you've got millions of followers, it's not hard for these social media networks to say, we can look at these channels, we can hire people to look over at least a few hundred of them, and know they aren't breaking the rules.
But what about the smaller creators?
These big platforms are going to say, look, there's no way we can police every tweet, every post, every video.
The only thing they can do is increase how the filters work.
Target more people, remove more content, protect themselves.
We need to pay attention when these protections are removed, and we should pay attention to what's going on.
As I stated earlier in this video, when you look at the history of Section 230, it seems like this was born out of a website's moderation of speech.
And I would say there is a possibility that without this rule, many platforms might take an absolute hands-off approach and say, we aren't publishers, we in no way curate, moderate, or edit content, and therefore we are not responsible for how these platforms are used.
But it would seem that the provisions that were passed recently will punish a platform if it is used for trafficking.
In which case, it would seem like this can only get a lot worse.
So I recommend doing a bit more digging into Section 230 and the bill that was just passed.
Don't take my word for it.
We can't predict the future, but I think it's something we absolutely need to pay attention to if our focus is on protecting free speech.
Let me know what you think in the comments below, and we'll keep the conversation going.
It seems like a lot of people who are talking about this are on the left, so I wonder what people on the right think about this.
Is there a move that can be made to protect free speech?
Or do you think that this bill is actually a good thing to protect those who are victims of sex trafficking?
Comment below.
We'll keep the conversation going.
You can follow me on Twitter at TimCast.
Stay tuned.
New videos every day at 4 p.m.