Jason Fyk reveals how the courts got Sec. 230 WRONG, unleashing decades of unlawful CENSORSHIP
|
Time
Text
Welcome to today's Brighteon.com interview.
I'm Mike Adams, the founder of Brighteon.com, the free speech platform.
And speaking of that, today we have an extraordinary guest.
Mr. Jason Fyk joins us.
He's the founder of what's called the Social Media Freedom Foundation, and he is the expert in Section 230.
And he and I have met for the last several hours, and he's given me quite an education in understanding how the courts have misinterpreted Section 230 to grant big tech the ability to arbitrarily censor almost anybody for almost anything, even though that's completely illegal or outside the law and unconstitutional.
So, Jason Fyk, welcome.
It's great to have you here.
Thank you for having me, Mike.
Well, thank you for your years of expertise about this.
You are widely recognized as the expert on Section 230, and you are also a pro-freedom, pro-liberty, pro-free speech type of person.
Now, correct me if I'm wrong on any of this, but does that sound...
Yeah, I don't see how anybody could be against those things.
Who could be against freedom and liberty and the ability to say what you want to say?
Well...
People are.
Of course, the whole tech cartel is against that.
But people understand, I think, the basics of Section 230.
They know that any time that a platform censors them, if they try to sue the platform, the courts just say, oh, Section 230, and throw out the case.
But you have informed me of why they're wrong.
And I'd like you to share that with the audience.
We'll go into some details.
But also here in Texas, we have a relatively new law, HB 20.
And could you explain also why that is highly relevant to restoring the freedom to speak in America?
Absolutely, Mike.
So essentially what has happened is this: 1996 is when this law came into place.
And its formal name is Title 47, U.S. Code Section 230 of the Communications Decency Act.
Decency.
That's what this was about.
This is what Congress wanted to have done, was they wanted to protect companies that were going to act in good faith to remove certain content that was considered indecent.
So what happened is that they read the law, but they didn't read it correctly.
And when I say they, I mean the courts.
Predominantly, most of these cases end up in the Ninth Circuit Court, ultimately.
And the Ninth Circuit Court, from almost day one, has misread the statute. I've pointed out the actual language of the statute and how they articulate this in what are called legal briefs, orders, and judgments, and it's wrong.
They're not using the right words.
And that's what we see a lot nowadays is this conflation between private rights and a business's ability to function and what this protection allows.
Well, over time, that has gotten to be pervasive.
The protections have become almost unlimited.
They simply say that you can't treat them as a publisher in the general sense, and then that's it.
Well, that's not what it says.
It says that they cannot be treated as the publisher or speaker of information provided by someone else.
They are prevented from being treated as that someone else, for that person's conduct and content.
But they shouldn't be protected for their own conduct.
Ever.
Okay, but you're already getting into the weeds a little bit for people who are new to this.
I mean, you're the expert and it took me a long time to get up to speed on this as well.
So let me back up just a little bit.
Sure.
The Ninth Circuit covers California and almost all the Western United States.
Predominantly, yes.
And so the Ninth Circuit tends to protect the interests of big tech.
It would seem so.
Yeah.
And so they interpret this law in a way that is favorable to the tech platforms.
Correct.
When the law was originally written, it was designed, as you said, to allow these information platforms, I'll call them, to censor nudity, violence, harassment, snuff films, porn, whatever, especially to protect children in particular.
Correct.
But then, especially since Trump became president, these companies have used this law and said, well, we can censor anybody for viewpoints that we don't like.
Correct.
How is the Ninth Circuit justifying that in their own thinking?
Well, what they're doing is that they're just applying it as blanket immunity and not justifying anything.
There is no justification of good faith.
There is no being a good guy in this circumstance.
They simply say you can't be treated as a publisher and it ends there.
That's not the way it was supposed to be.
They are supposed to have a measure of good faith.
We see it right in the language.
It says that they must act in good faith.
What do they mean?
"Good Samaritan" is right there in the law.
Good faith applies to being a Good Samaritan.
Now, originally, obviously, when I sued Facebook, it was over financial issues.
But that same process of restricting speech without any limitations, right, became a weapon against ideology, against religion, and against political viewpoint.
It's the same process, and unfortunately, the courts just aren't, you know, holding them accountable for their own conduct that falls way outside of being a good faith act.
Well, yeah, it's so crazy now that even if you don't bow down to their, I don't know if you call it a religion or cult or what, let's say LGBT. If you don't bow down to the LGBT ideology, then you are considered hateful and you will be censored simply for not using someone's, quote, preferred pronouns.
That was never intended as the reach of 230, not even close.
I can't imagine a congressional meeting in 1996 where they even considered changing someone's pronouns.
Right.
Much less considering that to be hate speech, much less that being considered indecent.
Because that's what this comes down to.
It's the decency act.
Right.
It's about being decent, about being a good Samaritan.
It doesn't mean that you have to agree.
It's like if you don't agree with them now, you simply are hateful.
Right.
That's just not true.
But they get to decide, of course, whether or not they agree with you, and they change their position all the time.
Correct.
So they're constantly advancing what they think, and then suddenly everybody else has to conform with that, or they get deplatformed.
Well, that leads us to another interesting thing, and believe it or not, I'll give you the credit on this one.
You came up with this analogy that's absolutely perfect to understand this.
They take away everything that they don't want with the intent for it to be what they think.
Of what's left behind.
Of what's left behind.
And it's very much like a sculpture.
If you start to chip away a sculpture, you're only removing content, but with the intent to develop some type of sculpture.
Let's say the Statue of David, for example.
What's left is what they have wanted.
In other words, they've developed the information that's left behind by removing everything that they don't like.
So you use the key word there, which is developing or development.
And yeah, thank you for bringing up that analogy.
These platforms like Facebook or even a search engine like Google or Apple, which approves apps, they claim they're not in the content development business.
But as you mentioned, by shaping what's left behind, as in a sculpture, chipping away the wood or the marble to leave behind something, they are in fact a developer of the resulting content that is then distributed to everybody.
Correct.
But the courts don't yet understand that.
No.
Unfortunately, I think one of the problems that we have with courts specifically these days is oftentimes these judges are older.
They're people that have not been using technology.
And the other thing that we've sort of come to realize is that lawyers lawyer, right?
They do law stuff.
But social media people do social media stuff.
There was no bridge there.
And what I sort of came to realize is that, being somebody who is in social media, I managed to bridge those gaps to the law side of things, so that we can now articulate this to judges that may have...
I mean, even the Supreme Court admitted that they're not the nine most internet savvy people on the planet.
Well, they're not.
They don't use it to the extent we do.
So they wouldn't know what this means.
But of course, if you're making content decisions as to what to remove, you are by proxy making decisions of what to allow.
That's content development.
Well, and in addition, what Twitter has done, for example, and Facebook does it too, is they add fact-checking content.
Yep.
So they basically...
And they shadow ban, right?
So they'll limit the distribution of your content, even if it's true, even if it's factual, such as a woman's true story of a vaccine injury.
That will be censored, limited, and then for those few who do get to see it, there will be a fact check added to it, an annotation added by a publisher, which is Facebook in that case.
They've become a publisher by adding notes onto your post.
And what's even more important is they became a content provider.
They became a content provider.
So they are doing publishing.
So they are a publisher, right?
Right.
In the context of the law, that's a very big difference between being treated as the publisher.
The original publisher.
Exactly.
But what's more important is in developing that information when they...
And let's just take, for example, Facebook's model.
They identify misinformation.
They send it out to a third-party fact-checker who creates and or develops information to add to it.
The fact-check label is then sent back to Facebook and then it is added, as you said, the word "added," to your post, which means that that is their content.
And they say, well, no, it came from a third-party fact-checker.
Who they paid to create the content.
That they authorized that they hired to do that.
Correct.
Yes.
So it's just all they're doing is they're subletting or subcontracting the work out, but it's still their content.
Right.
Which means that by definition, they are responsible in part for the development and creation in this circumstance of that information provided through the internet.
Right.
They're a content provider.
Well, and isn't the same thing true, but in a different way with Google, by choosing the order in which content appears?
Even though Google itself, let's say, doesn't write the titles of the pages that they index, but they do absolutely choose the...
They choose which ones to include, which ones to reject based on their arbitrary beliefs, political opinions, medical opinions.
Financial gain.
Financial gain, right?
And then they present it in an order that's determined by an algorithm that they wrote.
Correct.
Their people, their employees on their payroll wrote those algorithms.
And even if they say AI wrote it, their people wrote the damn AI in the first place, right?
So they are developing the content in order to present it.
Yes, they are developing the information that they want.
They are content providers of it.
And part and parcel with that is that if they're providing that content, they are responsible for that order.
And I think that that's kind of the revelation that you had today, is when you realize that all of these platforms are content providers.
And courts have said, well, we want to protect them when they're doing their business.
That's not what Section 230 was about.
It was not protecting them to choose the order.
It was not about recommending content.
It was not about developing information.
It was not about fact-checking tags.
It was nothing about that.
It had one thing in mind, or excuse me, there's technically two things in mind.
The first thing was giving them the ability under civil liability protection to remove indecent content.
Indecent.
Correct.
Again, violence.
That was step one.
Right.
And the second thing was that if they act upon some content, they didn't want to be held accountable for all of the rest of the content that they failed to get to.
That's the second portion of it, where they cannot be treated as the one who published the information, meaning another information content provider.
As soon as they become the information content provider, even in part, because it says those words, in part responsible, right?
They are responsible for all of the content on their site.
It's amazing.
So if they had just left it all alone except they could remove lewd, violent, harassment, porn content, they could have done that, but then not mess with the other stuff, not augmented content, not thrown fact checking, not deplatformed channels they don't like in order to shape the content that's remaining, then you wouldn't have a problem with them.
Exactly.
And if you recall, in our conversation you brought up, well, you need to consider how Brighteon runs its sites, right?
And my point to you was that if you're doing the right thing currently, which is removing indecent content, then for content that happens to exist on your site that you're unaware of, you're protected.
This won't hurt the little guys that are following the rules.
This change would only hurt the big guys that have used it and manipulated it and turned it into something that it never was intended to be.
When you say this change, you mean if this is overturned in the courts?
If the courts apply it correctly.
And, of course, what I haven't mentioned yet is that I currently have a petition for writ of certiorari in the Supreme Court right now that is asking very fundamental questions.
And we're essentially asking the Supreme Court, do we do what the text says or do we stick with status quo?
Yeah.
Well, it's not the court's job to interpret what the legislature did if the legislature specifically articulated it.
It's there in the words.
We're asking them just do what the text says.
Thank you.
Well, you may or may not like this comparison, but I think our audience will appreciate it.
I consider the current interpretation of Section 230 to be kind of like a digital Roe versus Wade.
That's my analogy.
So it took decades to overturn Roe versus Wade.
Roe versus Wade...
allowed legal protection for the killing of babies in the same way that Section 230 allows legal protection for the killing of baby ideas.
Yeah.
Right?
I mean, again, maybe it's a loose metaphor, but there is so much ideological momentum, especially on the authoritarian left.
So the left has become the anti-free-speech political party in America. And I know that there are many more purist progressives who agree with us on freedom of speech.
For example, Robert F. Kennedy Jr.
would be one of them, and maybe many others as well.
And we're with them on the same page on this.
But isn't overturning this going to be as difficult as overturning, maybe, Roe v. Wade?
It will be, because the concern is the economic impact to the Internet.
And unfortunately, one of the things that we have to deal with right at the moment is that, and I'm going to say this relatively bluntly, they made their bed.
The courts need to deal with it, right?
They created this monster.
These companies would not nearly be this big if it had not been for the fact that they can wipe out their competition.
That's true.
And they became these gigantic companies because nobody can compete with them.
In doing so, now the problem is what?
About seven companies constitute 95% of the entire internet.
Well, of course, if we change this, it means that they're responsible for what they've done, and it protects the interests of the public.
So realistically, what's in front of the Supreme Court, and it took me a little while to wrap my head around this, but somebody said it to me the other day.
It was actually Susan Prager who said it to me.
She said, you do realize that your case in the Supreme Court is probably the most important case in modern history.
Wow.
Because it will decide whether or not there is still free speech, whether the Supreme Court wants to protect the interests of the public over the corporate interests, financial.
I mean, there's so many implications in this case that really what needs to be done is we need to protect the interests of the public because we're going to lose free speech on the internet.
And when, just using America as an example, they say, what, 73.7% of communications occur online.
If they control that flow of information, we're in dangerous times.
Very dangerous times.
We are indeed.
That's why I so much appreciate the effort that you're putting forth to help protect the freedoms we have left or to restore the freedoms that we have lost.
I want to go to your website, the Social Media Freedom Foundation.
What is it?
Socialmediafreedom.org and Free Speech on Social Media.
You founded this organization, correct?
We did.
We are a 501(c)(3) charity organization.
People can donate to us.
And our entire goal is to restore our freedoms online.
I can't tell you how much I think you know from having spent some time with me.
Hey, there's our interview right there.
Our radio interview.
Yeah.
We've spent...
I know personally I've spent 10,000 plus hours on this because the reality is that if we lose our freedoms, they're gone.
They're gone forever.
And we are the last pillar of hope for the rest of the world right at the moment.
We have to maintain this.
This is important stuff.
So that's what our foundation is all about is the real world work.
And I can say across the board, every single one of the members that has helped with this, not one has taken a dime.
All the funds that we have received have gone to the work itself.
That's it.
Which is legal work.
That is all legal work trying to protect our interests.
Yeah.
Right?
And everyone's interests.
And then you have this event or you're speaking at this event.
What's the correct website to get to this?
It would be IES23.com.
That's the easiest way to get to it.
Yes, for Internet Equality Summit 2023.
It is our first public event.
Great.
It is a nonpartisan event as a charity fundraiser for the Social Media Freedom Foundation.
May 11th and 12th.
May 11th and 12th in Orange County, California.
Wow, Orange County.
So we're going right to their front door and saying, hey, can we have a conversation about this?
Yeah, and I noticed you're using the term equality, which is typically a term that the progressive left uses, even though they no longer believe in equal speech.
Well, nowadays it's equity.
Yeah, it's equity.
It's not.
And we say, wait, no, we just want an equal footing.
It doesn't mean that every business will be the same, that everybody will survive the same.
Not everybody's point of view is the same.
But the point is you should have an equal opportunity To express it.
Exactly.
That's what the Internet Equality Summit is about, is where do those lines exist?
And actually, one of our biggest features, and this is an amazing thing, again, even these speakers have come in and donated their time.
So these are for the right reasons.
We're not out here trying to get rich or trying to, what do they call it nowadays, clout chase.
We have Dennis Prager has graciously given us his time.
He is going to be doing a live debate on May 11th with Grant Stern, who is the executive editor of Occupy Democrats.
We have a far left speaking to a far right to have a discussion.
Where does free speech exist?
Where does that line begin and end on the Internet?
Well, I'm so glad you have someone you described on the far left of the spectrum of being part of this because...
Because...
For us to have a functioning constitutional republic, we have to be able to debate.
Correct.
We don't have to necessarily agree with people, but we have to be able to debate, as you said.
But it seems to me, this is my opinion, that the political left in America today can only maintain a dominant cultural and political position due to extreme censorship and deplatforming.
They can't win on ideas because, well, their ideas are horrible.
Facts are a difficult thing to fight.
Yeah, like for example, transgenderism again.
You know, a biological man cannot have a baby.
Right.
It's never happened before, will never happen ever.
Yeah, no matter how much you wish it, it's just not going to happen.
And if it has, it's been some insanely rare situation where, you know, genetically everything was messed up anyway.
Yeah, good point.
But also, so many other issues, like, for example, economics, monetary theory.
Progressives tend to believe that you can print unlimited amounts of money without causing inflation, which is economic insanity.
It's like economic transgenderism.
It's almost economic warfare, the government against the people, because it's diluting all of our dollars as the government just spends it everywhere.
True.
So they don't want, let's say, Austrian economics to have a voice or the Ron Pauls of the world to have a voice because Ron Paul makes so much sense that he would change people's minds if he were allowed to be heard.
And that's one of the things that I do know that Grant dealt with immediately.
When he posted about it, they said, oh great, you're going to give that Nazi more of a platform to speak.
Nazi?
Where's that coming from?
Exactly.
Where does that even come from?
No, he's just got points.
And I got to say, I mean, I give credit to both of them for standing up and having that conversation.
Yes.
You know, whoever wins, wins.
Whoever loses, loses.
At least you get to the right information.
But like you said, silencing one side of an argument is not winning an argument.
No.
That's just cheating.
Well, and then Section 230...
It has been used now on expanding lists of issues to silence dissent or views that the left does not want to be heard.
For example, during COVID, well, especially under Biden, the White House and the CDC instructed tech platforms to censor specific types of content, such as content that was critical of vaccines, critical of masks, critical of lockdowns, and so on.
I mean, that's all come out in the Twitter files, by the way.
Yes.
But now we're hearing calls to censor views about climate change that don't conform to the radical climate cult, I would call it, on the left, which thinks that carbon dioxide is a pollutant.
Right.
You're not allowed to have.
And then Senator Kelly, Mark Kelly of Arizona, said maybe we should stop people from talking about banks being unstable because it could cause bank runs.
So where does this end?
That's exactly it.
Where does it end?
We are already on the slippery slope.
We are already sliding down it.
Somebody needs to stop it.
And what you're talking about is this private-public entity partnership.
It's the state working with these private entities.
And, of course, they all point at each other and they go, well, you did it.
No, you did it.
You did it.
So the courts are going, oh, we don't know who did it.
And the point there is, and even something that I pointed out to you today, is that whether it is a directive from the state or not, which we know is happening now, it's clear as day that they have been instructing these companies to silence the information they don't want out there.
Hunter Biden laptop is an obvious one because Mark Zuckerberg admittedly said the FBI asked him to do it.
Yes.
But then they say, well, it was asked, so it was a voluntary act.
But was it?
In reality, had they not been asked, would they have done it?
It wasn't entirely voluntary.
It was influence.
But what I pointed out to you today, and I think you saw it, the directive to restrict this kind of content is right in the statute, is right in Section 230.
So if they are seeking the protections of Section 230, what does it mean?
It means they had to have acted as an agency of state.
They chose to voluntarily, but they still did what the state asked.
Which means it would be unconstitutional to do so because it violates the First Amendment.
Correct.
And that is what my second case, which is against the United States, is about, in the Washington, D.C. courts: we are challenging the constitutionality of this law, because it is unconstitutional in lots of aspects, and its abuse is clear to everybody.
Right.
Now, I find it fascinating how you got into this, and I think this is okay to share publicly, but correct me if it's not: many years ago, over a decade ago, you had quite a thriving social media business.
I did.
Very successful.
You're a very creative entrepreneur.
You knew how to harness social media in order to be an influencer of, in your case, I think it was kind of comedic content or funny type of content.
Yeah.
Like these days, it would be like, I Can Has Cheezburger-type stuff, right?
Yeah.
That's a popular site.
Or the people of Walmart.
You mad, bro?
Like memes, stuff like that?
Yeah, memes and stuff.
All that kind of stuff is what we used to do.
You were very successful in that realm, and then Facebook decided to economically strangle your business.
I mean, to just cut it off.
Yeah.
And that thrust you onto this path to where we are today.
Right.
But do you want to share a little bit about what happened to you?
Well, sure.
So one of the things that I think everybody sort of misses about all of these sites is the fact that they're developing information.
If you take money to show an ad, isn't that ad literally developing that information? You're providing that information to a larger audience, making it more available, which is the definition of it, right?
So they are providing this content.
Well, if they're taking money from advertisers to put content in the news feed, then they're content providers in the news feed, which competes with me and you and everybody else for the news feed.
Well, everybody right now is mostly focused on the political and the ideological and the religious issues, but it started out with money.
It started out financially.
They needed to move their competition out of the way.
So they had this engagement model.
Get everybody on the site.
Sure, run your business.
Build your business.
Oh, by the way, we're going to take it all away from you.
Because we're going to displace you with content we make money on.
And that's exactly what they did to me, is they needed to get me out of the way so that they could put more advertisers in there.
Which means that they control the information coming out from me, the information to the end user, and the bridge between the two.
And they basically just shoved me out of the market.
So what they did was fraudulent.
They made a promise to you and all of us that, hey, you can come on and use the Facebook platform for free.
You bring an audience with you and you'll be able to reach that audience and maybe even attract new audiences.
And then once they've reached a critical mass of audience, then...
They took us down, but they kept the audience.
Correct.
For themselves.
So they could run ads and they could earn their money, which Mark Zuckerberg, by the way, just flushed down the toilet with the Metaverse.
Those billions of dollars he wasted?
It came from...
I mean, it should have gone to people like you and I and other content creators.
Should have settled me.
Should have settled me out.
Yeah.
No, we were the ones who brought them that audience in the first place.
Exactly.
We built them.
Everybody says, oh, they built their site.
No, they didn't.
They built a service.
They built a platform.
We built the site.
We built everything.
We brought everybody in.
We were the ones that got them all there.
And when I say we, I mean the whole collective of all of us. We essentially walked like cattle into the stalls.
You know, like that is what we did.
We are all now here, and now we're trapped.
We can't do anything because, no matter how they treat us or anything else like that, the courts won't impart any liability on them.
And that's ridiculous.
That's a complete bait-and-switch program.
And we've proven that, but it doesn't seem like the courts seem to care.
Well, not yet, but let's talk about HB 20.
Because even though there's no chance that there would be, let's say, a new federal law under this administration, and with Democrats controlling the Senate, I mean, they've come to love censorship.
They've come to depend on it.
But at the state level, in Texas here, we have HB 20.
Tell our audience about HB 20 and why that matters for this fight.
Well, HB 20 being a Texas law, Texas stepped up and said that they wanted to protect the interests of Texas state residents, right?
And Texas-run businesses and any content that runs through Texas.
And the intent was that they wanted to protect the ideas and so forth, because right now there is this term, "otherwise objectionable."
Nobody knows whether it should be considered subjectively or objectively, whether it's anything that they want, or if it has to apply to unlawful, indecent content.
So what HB 20 did is, to use a cowboy term, it headed them off at the pass.
It essentially walks in and says, no, company, you have to articulate how you're imparting your regulations.
You have to be up front about it and you have to be honest about it, which means you can't defraud people.
For example, Facebook is a platform for all ideas.
Except ideas they don't like.
So that's not being upfront and honest.
They also have to be accurate about it.
These are things that are stated in here, and it essentially protects from censorship.
Now, everybody says, well, wait a second.
These big tech companies have a First Amendment right.
Right?
And they get into this argument, and everybody's like, oh, yeah.
They flipped it upside down.
Well, here's the thing.
They do have a First Amendment right.
And you know what?
They can run their businesses how they choose to run them.
But if they run them unlawfully, you should be able to sue, correct?
That's how normal businesses operate.
You break laws, you get held accountable.
Well, the thing about censorship is, and it was challenged, and they said, well, this is unconstitutional because it compels speech, right?
It makes them host content.
Well, it doesn't.
They've already gone through this, and they made a determination.
It's subtle, but I'll make this distinction.
It's really key to this whole thing.
Is this the Fifth Circuit?
This is the Fifth Circuit considering this law.
The Fifth Circuit recognized that if content is considered pre-publishing, meaning it hasn't gone up on their platform yet, that is, in essence, expression of speech.
Because some of their consideration is going into the publishing itself.
Pre-publishing.
But that's not how these sites work, is it?
They allow us to direct publish first because we're the publisher.
Once it's up there, though, they made the determination that post-publishing, after it's been published, if they take the content down, there's no freedom of expression there.
There's no expression.
It's only suppression, which means it's not First Amendment protected activity.
Censorship is not First Amendment protected.
Right.
And just to clarify, the left has been arguing, or the tech platforms, that censorship is freedom of speech.
Exactly.
Which is Orwellian.
No, it's not.
I mean, war is peace.
Freedom is slavery, right?
Right.
Censorship is freedom of speech.
And finally, the Fifth Circuit said, no, it isn't.
And that means that HB 20 is in effect right now in Texas, right?
Yep.
It's in place.
It's in place.
Yep.
So then shouldn't we expect a flurry of lawsuits coming out of Texas?
Well, I can say that I've been involved in a couple so far and been consulting on them and so forth, and people are gearing up for this.
Wow.
Because things have changed and now there are protections.
And, of course, I think one of the biggest battles is going to be the forum selection issue, because these companies, as I said, are predominantly situated in the California area and Silicon Valley, and they all go to the Ninth Circuit.
Well, of course, one of the cases that I had seen...
It was really interesting because it was an HB 20 related case.
It was brought in Texas and the company that they were suing brought 30 lawyers in on it.
And they're all fighting forum selection.
Why would that be?
Because they don't want it to stay in Texas.
They don't want it to stay in Texas, because if it is brought out of Texas and brought back to California, you essentially just neutered it.
You've killed it.
HB 20 will have no effect whatsoever if it goes back to California, because California has so far, in two and a half decades, done absolutely nothing to fix this thing.
Okay, but it seems to me, and I'm just a layperson on this point, that HB 20, number one, intends for the right to speech to be asserted in the state of Texas. But also, all these companies engage in censorship and deplatforming in the state of Texas, and many of these companies, such as Facebook, have offices in Texas.
Google has representatives in Texas.
They all have employees in Texas.
And these employees are involved in the censorship.
And again, people use their online websites from Texas.
Correct.
So the censorship happens on the screens in Texas.
Right.
And I think the point of HB 20 is that it articulates that it was written for the interests of Texans, right?
And Texas businesses and Texas operations.
Why, if it was to be in this state of Texas, would you consider it out of state?
Right.
So why would we move forum selection if they're saying, no, we're making this law for Texans?
Why would you send it to California to decide what Texans do?
So all of the judges that are in Texas should heed what I'm saying right now, which is the intent was to protect Texans.
Keep it in Texas.
Do not let it go back to California because—and this is a strange thing.
I think we were talking about this earlier is— It shouldn't.
Shouldn't.
It does.
It shouldn't matter which district, which circuit you go through.
The law should be consistent.
It is not in Section 230.
It is not in social media companies.
Right now, in my petition to the Supreme Court, the Fourth Circuit is in conflict with the Ninth Circuit.
And it's only the Supreme Court that could ever break the tie.
The two cases don't fit with one another.
They're in conflict.
Fifth Circuit is the same situation.
They're in conflict with the Ninth in certain circumstances.
Right, right.
Okay, well, let me mention another potential conflict.
Here we are at Brighteon.com.
Brighteon.com is censored by YouTube and Facebook and Twitter, by the way.
And you can't even share a Brighteon.com URL in a private message on Facebook.
That's how crazy the censorship is.
But let's suppose that we sue YouTube.
Mm-hmm.
And under HB20 in Texas.
And YouTube, let's say they lose that case.
Let's just say we win in Texas.
Couldn't YouTube just then segment their results and say, in Texas, we won't censor Brighteon, but everywhere else we will?
And then they're going to have a fractured...
Internet.
They're going to have the censored states, which will be California, of course, Oregon, Washington.
And then they'll have the free states, like Texas, maybe Louisiana, Florida, whatever.
Is that a possibility?
Could that happen in this case?
Well, yes.
They could choose not to do business in Texas.
Entirely.
Entirely.
No, I meant what if they just didn't censor Texas?
I know, but we could take it to the extreme.
What if they just decided, well, we're not going to do business with Texas?
What about the next state that does it?
I know Florida's trying to push in directions like that.
What happens?
What, are they just going to continue to lose business?
Well, now they're going to have to answer to their shareholders and stock holders.
Right.
Well, you...
You've got to stop censoring.
Well, then the problem is that if you have one state where you're protected, first off, that's going to help Texas, because people are going to start moving here to run their businesses and bring businesses here.
I will set up a VPN service in Texas if that ever happens because everybody will want a Texas IP address.
Exactly.
And that's what will happen: everybody will start doing business out of Texas.
And the point there is that what that will do is it will motivate them by bottom line.
Money.
It will push them to do the right thing across the board.
That's what will happen.
Litigation and liability are the biggest motivators when trying to get somebody to do the right thing, if they can be held accountable.
But right now, unfortunately, they're not being held accountable for what they do.
So in kind of wrapping this up here, I mean, I know time has already flown by, but I can't believe it.
A lot of information.
There's a lot.
I mean, you and I spent hours on this.
I'm barely beginning to understand this.
But these companies...
Do you get a sense that there is momentum now because Texas has passed HB 20, Florida you just mentioned, other states are working on this, and all the whistleblowers, the leakers, have come out.
Even Elon Musk seems to be helping to get information out about how evil Twitter was before he bought it, by the way.
Do you feel like the tide is turning here?
I absolutely do.
The reason I do is because, as you know, and we discussed this in the past couple hours, and it's not something we've necessarily talked about today, in our petition, which is in the United States Supreme Court right now, what we have asked the court to do is exactly what the justices asked in Gonzalez versus Google.
We already have the alignment just peripherally, but the Department of Justice sees this the way we do.
There's a big mistake in there that can easily be fixed.
It's a textual issue.
And if they simply correct 230C1, which is the first portion of it, it would pretty much correct 95% of the problem overnight.
The Department of Justice is on the same page.
Ted Cruz and 17 members of Congress are on the same page.
And Texas Attorney General Paxton, I actually spoke with his office yesterday.
They're all on the same page.
They recognize the mistake.
It's fixable.
It's very easy to fix.
And it's not the legislation.
You know, the courts keep looking at, well, fix it legislatively.
Doesn't need to be.
The law is correct.
It's the application of it that's wrong.
Let's get the application right.
Protect the interests of the public.
And honestly, it will be a better place for it.
Will they still be able to censor to some degree?
Yes.
But they're going to have to be a whole lot more honest about it.
Yeah.
It's going to change.
Well, exactly.
And I think Schmitt in Missouri is working on some of these issues as well.
He's got an active case.
Missouri versus Biden.
He's digging into that.
I don't know if he has an active Section 230 case.
I haven't spoken with him directly yet.
Okay.
Paxton in Texas.
I'm very impressed with Ken Paxton.
Yes.
And his work.
And I think it seems to me like Texas just got tired of waiting around for anybody else to do the right thing.
Texas stepped up.
And being that we're in Texas, we're like, we're just going to get her done.
You know, let's just do it.
And they did.
Yeah, they did.
I mean, they have a huge legal team.
I know that the information has slowly worked its way through and sort of everybody caught up.
Because I showed you, we have been right about this since 2019.
We've been saying it, screaming it from the mountaintops, but now everybody is finally catching up to the fact that we've been right. Which is why, though I can't discuss any of the cases, I do consult on other cases professionally to help people understand it.
Because, as you said, as you saw, when you sit down and walk through the language of it with me, you realize what they're doing is wrong.
It just doesn't fit.
And that's what people need: the time to understand how this thing really functions.
Right, right.
And for me, the big aha was, and you taught me this, you showed me this, that every one of these platforms, Google, Facebook, YouTube, Vimeo, Pinterest, even PayPal, every single one of them, is actually developing content.
Yes.
They are content service providers.
Is that the term?
Information content providers.
Information content providers.
Right.
They are developing content.
Right.
By applying their censorship.
Right.
They're developing their interests.
They're sculpting the content that is pushed out there.
Yep.
It's manipulation of content at post-publishing.
So creation would be pre-publishing, which means that you bring it into existence.
But after it's published, if you manipulate it from there out, that's content development.
It's just like developing film.
The film's already been done, right?
You've already taken the picture.
But developing it is just bringing it out through a process over time.
Right, right.
That's what they're doing with content.
That's the point.
And unfortunately, the courts have really just diminished the value of development, what its meaning is, what they call the germane definition.
It's very simple.
If they play with content and they manipulate it for their own interests, whether that be financial, ideological, religious, political, I don't care what it is.
If they manipulate that content, they are responsible in part for the development and provision of that content.
Yeah, let me offer another kind of metaphor that just popped into my head on this or an illustration.
Let's say I have the world's largest library, public library, with every book that's ever been written, in every language, every religion.
All content.
And the doors are wide open.
Everybody come in, read any book you want.
Right.
There we go.
I'm not a publisher.
Right.
Instance.
Yeah, you're not involved in the content creation or development.
It's just there.
It's just there.
But now suppose I decide that 50% of these books I don't like.
And so I pull them off the shelves.
And then half of the remaining books, I put labels.
Don't read this one.
This might be dangerous.
Or I put a label over here.
Definitely read this book.
This is the best.
Or I take chapters out of books because I don't like the chapters of those books.
And then I say, this is your new freedom.
Come into this library.
That seems to me, that's what these platforms are doing.
And put all the books that don't really make you any money, put them way in the back.
But like you said, order.
Bring all, oh wait, this book, they actually pay me a royalty for it.
So we're going to put that right there on that table, right in the front.
That's what I'm saying.
That's how Google operates.
Oh, look at that.
I'm looking for a good doctor.
Gives you one that's out of state, but it's a paid doctor.
And meanwhile, you can't even find...
You might even search a doctor's name and you'll get paid ones before you even get the doctor you're looking for.
Okay, here's even better.
The library has a room way in the back where they put the books they don't want.
The lights don't work in that room.
That's deplatforming.
Yeah.
The lights only work out here in the lobby area.
Right, so you can't read them.
Yeah.
The books in the back, oh sure, it's freedom, but the lights don't work.
Yeah.
And a library is even a better example because oftentimes people try to explain 230 using a bookstore or a newsstand.
The bookstore or newsstand still orders books, meaning they still...
Ask for the content.
It's not a good example.
A library is better because the content is just there.
It was put together by somebody else and it is only ordered, meaning the aggregation of the content is maybe sectioned by genre, right?
Science fiction, fiction, but even the adult book section, right?
If somebody else created it and you never looked at that book as a librarian, you are not responsible because you can't be treated as the person who spoke that, who wrote it.
That's a great example because they never looked at it.
But let me give you another example.
Imagine you come over and you pick a book out of the shelf that's in science fiction.
And it's got really graphic content in it.
And the librarian looks at it and goes, wow, that's really bad.
And they put it back.
Now they've considered that content and they've allowed it to remain.
So they have some negligence in that act.
Well, they are actually becoming a publisher.
They weren't the publisher, the original one, the original speaker, but they became an additional person making a content decision.
Right.
Development.
They put it back on the shelf.
Now, what happens if some kid comes up and pulls that off the shelf and is like, oh my god, this is crazy, right?
And harms some child.
Should they not be held accountable for their own conduct?
Right.
That's where it gets confusing, because what we're saying is, no, if they did not ever look at that book, cool.
Can't be treated as the person who did.
But if you did, you are a publisher, and therefore the second part of this law applies, the part that says you're allowed to restrict, specifically, you know, bad content, indecent content.
So if they came up and they put that book out and they go, oh, that actually is kind of harmful, and they took it down, they do have a liability protection to do that.
That's what it's supposed to do.
But part of the slippery slope on this, and stop me if I'm going over the time that you have available, part of the slippery slope is they expand the definition of harm.
Exactly.
So they say that anybody who isn't 100% on board with every vaccine, even experimental vaccines, even vaccines that are killing some people, they say, oh, you're engaged in harmful content.
Because promoting vaccines is the good option here.
Because the government told us.
Because the government, right.
So they can twist anything to be harmful.
Even the Bible, right?
They can say, oh, this person quoted scripture.
Well, that's advocating harm in some way.
Right.
And they do.
They do that right now.
They do.
In the UK, people are being arrested.
In Canada, the preacher there has been thrown in jail for citing scripture.
Right.
Which is to say that if you go back to why are they doing it, it's because they want to promote another idea, right?
They want to advance the idea that they're good.
So they're going to remove everything that's bad, right?
That's the fact that we were saying.
If it's anti-vaccine, it's bad.
It's got to come down.
It's harmful if it's good.
But they're developing that idea.
That brings us to what would be called an irreconcilable statutory conflict.
And we talked about this.
This is where this law has a conflict within it that's irreconcilable.
There's no way to fix this.
And that is simply the flip side of the coin issue, a proxy.
It means that if you're allowed to consider what is removable, restrictable, by proxy, meaning the flip side of that coin, what are you also able to decide?
Whether it's allowed.
Well, if they decide what's allowed, that's content development.
That would make them a content provider.
But the content has to be provided by someone else.
And it wouldn't be.
So the problem with that is that if you both allow them to develop information, but also not allow them to develop information, which is it?
It makes no sense.
Exactly.
That's why it's irreconcilable.
It allows them to both develop and not develop information simultaneously.
So all that is necessary for this Section 230 regime, I'll say, to be overturned is for the proper courts to simply engage in the logic that is already embedded in the wording of the law right now.
Yes.
That's it.
Yes.
It just requires a change in understanding or a proper parsing of the sentences.
There are two fundamental things that occur here that need to change.
One, what is called the intelligible principle or the general provision of the statute, which is that they have to act as a good Samaritan.
Courts aren't applying that, but it's the fundamental principle upon which they must make their regulation.
So the threshold question for any litigation is, did they act in the interest of the public or for the good of others?
Now, they can make that argument.
You just said that, that they can make almost any argument that they, you know, except for making their own financial gain.
That one's really hard to make, right?
That, oh, it was helping you out by making us money.
But if they surpass the Good Samaritan issue and they say, yeah, we were acting for the good and benefit of others, the courts have traditionally said that you just can't treat them as a publisher under the first section, C1, which in effect says that you're protected from any publishing conduct you do.
All of it.
Well, that's actually wrong.
If you read it correctly, and this is what we have asked the court, there's a second element, which would change almost everything overnight.
As Ted Cruz articulated, and this isn't even my words, 230C1 does not protect any conduct at all.
That means they don't get to check the book.
They don't get to allow content.
They can't have any involvement with the content.
Nothing.
Now, procedural function-wise, they may be able to move the book to a genre section, but they sure can't tell what's in it.
As soon as they consider the content, C1 no longer applies, and that would mean that any actions they take, any conduct whatsoever, falls only under C2A, at which point there is an element of good faith, and you can then argue in court that it wasn't done in good faith, because you'd have to be a good Samaritan acting in good faith.
And they would have to lay out their logic for what they consider to be objectionable content.
It would have to be an objective list.
Yeah.
Tell us why.
Tell the court why.
Exactly.
They never ever have to justify what they did.
That's the problem.
It's all done in secret.
The courts don't care because they just say you can't be treated as a publisher and they never get into, did they act in good faith?
Did they act as a good Samaritan?
Right.
That's why you have these vague terms.
It's hate speech.
It's spam.
Because there's no justification.
They can't point to a single rule that we broke.
They don't even have an articulated single rule that is anything beyond, it's bad.
I mean, it's that broad.
Mm-hmm.
And it keeps changing.
Exactly.
Even among them.
Nobody knows what is allowed and what isn't allowed, which has a chilling effect on speech.
Even these same tech platforms, 10 years ago, if someone had posted, I am woman, hear me roar, it would have been thumbs up.
Awesome.
Women, feminism, you're awesome.
Motherhood, breastfeeding, all of it.
Awesome.
Today, I am woman, hear me roar.
Oh, you're a trans hater, I guess.
You believe in women.
Yeah, it should have been I am them.
I am them.
It's so ridiculous.
I mean, the things that they're claiming are offensive and objectionable maybe to somebody, but everything is objectionable to someone.
Right.
Everything.
Yeah, exactly.
So the point then becomes, let's put good faith to work finally.
Let's get the statute operating correctly so the courts go, okay, was this done in good faith?
I mean, that would be a massive change just to even consider it.
Yeah.
It's not being done.
Well, look, I hope, Jason, that with your, you mentioned your consulting on some cases and with your efforts, I know you have a lot of contacts out there.
I hope, and I really want to support this effort, that we can get this law interpreted correctly and restore freedom of speech across all these platforms.
Now, you're sitting here in the Brighteon studios.
Brighteon would not exist, except they censored me.
In fact, the day I decided to build Brighteon, which has been many millions of dollars, as you can imagine, was the day I was deplatformed by YouTube.
That was the day I made the decision.
If they hadn't deplatformed me, there would not be a brighteon.com.
Think about that.
They could have actually controlled the space if they had just allowed some freedom of speech.
Instead, They've gone on what I think is a corporate suicide path, and they've pushed people out the doors by coercion.
I think it's funny because people say to me, like, well, what happens if you ever do get to trial, right?
Do you think they're going to be able to find 12 jurors who actually like big tech anymore?
Right.
I mean, it's so pervasive.
No, people know big tech is suppressing.
It's the strangest thing when you see like 70 and 80 year old grandmoms going, Oh, I don't like this anymore.
They censored my information to so-and-so.
And you're like...
Why?
Like, why does everybody hate you now?
Because they've just gotten so far off the rails.
Just let us say what we want to say is all we're asking.
And you have to commend, you know, I know that you're an individual that has built this network of Brighteon and Natural News and the Health Ranger.
And you've taken the shots.
Oh, yeah.
You've been like me.
We've gotten hit because we are not willing to stop saying what we want to say.
Exactly.
And that's the hard part is you have to make a decision.
Do you continue to say what you want or do you bow down to what they want?
Because it's a war of attrition.
They want to run us all out of money by silencing us.
Well, yes.
And one final thought, we're not the only alternative video platform, obviously.
And I think the most prominent one right now is Rumble.
And Rumble is going to give YouTube a run for their money.
Rumble is signing people like crazy.
And I even know of some big names that have been given multi-million dollar offers by Rumble.
Not me, but people I know.
They're going to go to Rumble.
And Rumble is going to eventually dwarf YouTube, or basically relegate YouTube to a sector where sort of ignorant people who don't know what's happening in the world might go.
But anybody who wants to be informed would go to Rumble or other alternative platforms.
BitChute, Brighteon, Gab, you know, there's a lot of choices.
Odysee and others.
So that's kind of the direction it's going, by the way.
But I would forewarn people, and we've seen this with companies like Rumble, for example: if they're allowed to survive, why?
You have to ask yourself, why is it growing so big?
Why are they allowing it?
Because Parler, man, they cropped them right off as soon as they were gaining steam.
So we have to be careful that we're just not creating another monster by a different name.
Well, I agree, and I've mentioned the same thing about all these other platforms, that how are they getting their apps approved by Apple and Google?
That's the question.
Are they selling out?
Well, I think some of these platforms have had to engage in some kind of agreement of censorship, or maybe censorship is not the right term, but certain categories.
Yeah, conformity.
Now, notably, we have had no conversations with Apple or Google or anybody.
They won't allow our apps at all, right?
But we're not even interested in going to them and saying something like, hey, how can we be in your good graces?
Because I don't believe in that.
I don't want to conform to them.
I want this to be pure free speech.
Or at least explain, why can't we be here?
You're holding yourself out to be a public service to anybody that wants to come on and use it.
But not us.
Why is that exactly?
Explain to me, because I always, what did I do?
Because everybody asked me what I did, what bad content did I have?
I didn't.
It was just funny stuff.
And that's the thing is that, you know, and you know your business far better than I do.
I'm sure it would be difficult to point the finger and go, this is bad.
It's just your opinions.
Well, yeah, but I know what I did to piss off the system.
I warned people about big pharma before COVID, a decade before COVID. I told people, you don't need to take pharmaceuticals to be healthy.
That cut into a multi-trillion dollar profit system.
Oh, it did.
And that's why they took me out even before they censored Alex Jones, by the way.
But they censored you before they even censored me.
Yeah, they censored me straight as competition.
I was straight up...
They had to move guys like me that had...
I mean, I listen to a lot of people.
They're like, oh, yeah, I had 100,000 hits.
I used to have in the billions of hits...
Billions per month.
Because it was just...
We didn't have the restrictions back then.
It was full throttle.
And I mean, we had enough reach that we could change national trends at the time.
Oh, no question.
It was crazy.
Look, if they weren't censoring people like myself or Alex Jones, I just mentioned, or others in the alt-media space, we would be the media.
Correct.
We'd be far bigger.
We'd be far bigger.
I mean, frankly, InfoWars is already bigger than CNN.
Well, that's not hard to do.
No, it's not hard to do these days.
Actually, kind of funny thing, just as a side note, because I think everybody will laugh about this.
At one point, my one page was actually ranked fifth in the nation.
It was one ahead of CNN.
My little magazine page on Facebook had more traffic than CNN.
See?
And so they had to cut us down in order to control speech and things and elections and issues.
Because, well, it's all obvious at this point.
But I've gone way over time.
I apologize.
I apologize.
Oh, but it's a good...
What we're talking about is great stuff.
It is.
This is not the argument that you hear anywhere else.
You always hear these crazy stories of people that have not been doing this.
I've been working in this segment for seven years.
So we have a different perspective on it.
We know how to fix it.
And I think that that's the big message here to everybody is we know how to fix the Internet.
It comes down to whether the Supreme Court decides to do it or not.
Good point.
No wonder they so viciously fought against the Trump nominees being accepted to the court.
But in any case, and it's also interesting that this interview about censorship will be censored on YouTube.
Most likely.
Of course.
Of course.
On Facebook.
But let me give out your website again here, socialmediafreedom.org.
Is where people can go and follow your work.
And I'd love to have you back on again as well, especially if there's progress or updates on any of these lawsuits as they're made public.
I would imagine there's going to be a lot of action here in Texas.
It sounds really interesting.
So keep me posted on that, would you?
Absolutely.
Things are going to be changing.
I will definitely keep you up to speed on this one.
Okay, and we've got to thank the Texas legislature for making that happen.
All of them, including Ted Cruz's office and Attorney General Paxton's office.
They're doing the right job.
They are.
Texas really has stepped up here.
I'm impressed.
I hope to be working with them a bit more.
The communication back and forth has been a little bit lax, but we're in one of the biggest fights in the world.
Like I said, this case is insanely important.
We need their help, if nothing else, to pressure the Supreme Court.
Yes, exactly.
And thank God that I'm a Texan, and that we are in Texas, and that Brighteon is in Texas, because it's the perfect place to speak freely.
It is.
Yep.
Okay.
God bless Texas.
God bless America as well.
Thank you for watching, folks, and thank you, Jason, for coming on.
Thank you for having me.
Absolutely.
It's a pleasure.
Jason Fyk's website, again, is socialmediafreedom.org.
So check that out.
And then your upcoming event is May 11th and 12th.
Internet Equality Summit.
Internet Equality Summit.
And that's going to be in what part of California again?
Orange County.
Orange County.
Yep.
Okay.
All right.
And people can go there in person and attend it?
Yeah, this is a live event.
We'll have a lot of people there; I believe Dr. Malone is signing up with us as well.
We're involved with the Unity Project.
I mean, we've got some big players in this thing that are interested, obviously, in internet equality.
Okay.
So we're going to have that discussion.
All right.
I should talk with you after this to see if maybe Brighteon could live stream that event with you or something like that.
Maybe be represented there.
Maybe.
Yeah, let's talk about it.
I think it'd be great.
In any case, look, folks, thank you for your support.
We couldn't do it without you.
I'm Mike Adams, of course, the founder of Brighteon.com, with Jason Fyk here in studio today.
This man has an amazing encyclopedia of knowledge about this, and I think that his ideas are going to change the world.
So thank you for watching today.
Feel free to repost this interview.
You have my permission to post it on other platforms and other channels, especially on YouTube.
See what they think about it.
And if you want to support us, just keep visiting brighteon.com or brighteon.tv.
God bless you all.
Take care.
A global reset is coming.
And that's why I've recorded a new nine-hour audiobook.
It's called The Global Reset Survival Guide.
You can download it for free by subscribing to the naturalnews.com email newsletter, which is also free.
I'll describe how the monetary system fails.
I also cover emergency medicine and first aid and what to buy to help you avoid infections.