Sept. 22, 2022 - Dark Horse - Weinstein & Heying
01:33:34
Implausible Deniability: Bret Speaks with Ramsey Ramerman

Bret speaks with Ramsey Ramerman about Section 230 in the context of the social media and governmental landscapes in 2022. They address the problem Section 230 was meant to solve, how it has been interpreted by courts, and the power it has allowed to be abused; furthermore, they discuss what routes are accessible to address this issue, and what paths are being and have been explored already.

Find Ramsey on Twitter: @RamseyRamerman (https://twitter.com/RamseyRamerman)

Mindbloom: Go to Mindbloom.com/...


One of the things that I've thought that's always weird is a lot of conspiracy theories get demonized and suppressed, but then the whole flat earth conspiracy theory seemed to be amazingly promoted.
And so it's just made me think it's less valuable because no one's trying to suppress it.
It was so ridiculous that nobody bothered to suppress it.
Yeah, it's an odd one because you can demonstrate to yourself that the Earth can't be flat.
So I do wonder why it suddenly showed up.
But nonetheless, you know, here we are on a very round earth.
Hey folks, welcome to the Dark Horse Podcast.
I have the distinct pleasure of sitting today with Ramsey Ramerman, who is an attorney with 20 years of experience on First Amendment and open government issues.
Ramsey, welcome to Dark Horse.
Thank you.
So you and I have been in conversation for I don't know how long about issues surrounding Section 230 of the Communications Decency Act, is that correct?
Is that what it is?
This is an arcane piece of legislation that turns out to be at the core of what I swear is the dysfunction of civilization.
Now, we have our work cut out for us here, in the sense that one of the ways that skullduggery protects itself is to be so deathly boring that nobody can pay attention long enough to figure out how it works.
So our job here is to figure out how to reveal to the Dark Horse audience what Section 230 is, what effect it's having on the world, what its authors intended, how that's different from what has unfolded, and that's not going to be easy to do in a way that is cultivating of attention.
So if I can ask you a favor, if you could sprinkle into your explanations a certain number of curse words, and when you use analogies, please make them sexual.
That will keep people on track.
Very good.
That's very fitting in this context.
Yes, perfect.
All right.
So, Ramsey, tell us what Section 230 of the Communications Decency Act is.
So it's a statute that impacts the liability of social media companies for the speech that they carry, which as in social media, as you know, is the speech of third parties.
And it was enacted in 1996 to address problems in the law as it existed up to that point concerning when a third party could be liable for somebody else's speech.
I see.
So 1996 is substantially before the social media era.
The internet was digital by that point, but it was still a crude prototype of what we have now.
So, of course, the authors of this section couldn't really envision things like Twitter and Facebook and the way their amendment might impact us.
What problem were they trying to solve?
They were trying to deal with the problems of message boards.
And so the two cases that dealt with this were Cubby versus CompuServe.
You may remember CompuServe.
Sure.
And then Stratton Oakmont versus Prodigy Services.
These were two cases dealing with message boards.
And the premise or the issue of liability for third party speech is the same with a message board as it would eventually be with social media, in that you have someone hosting speech, but the speech is made by third parties and the host has a certain level of control or lack of control over what speech gets posted.
And these two cases dealt with when would a message board be liable for posts by a third party?
So, this is interesting to me.
As a kid, I was involved in what were called message boards.
They were a crude prototype of what eventually became services like CompuServe.
In our book, Heather and I discuss the danger of hyper-novelty, which is making modern people sick both physically and psychologically.
Two of the most common disorders are anxiety and depression.
Our first sponsor, Mindbloom, is a leader in the treatment of anxiety and depression using at-home ketamine therapy.
It is a combination of science-backed medicine with clinician guidance and support for people looking to improve mental health and increase their sense of well-being.
Mindbloom connects patients to licensed psychiatric professionals to help them achieve better outcomes with lower costs, greater convenience, and an artfully crafted experience.
To begin, take Mindbloom's online assessment and schedule a video consult with a licensed clinician to determine if Mindbloom is right for you.
If approved, you'll discuss your health history and goals for mental health treatment with your clinician to tailor your Mindbloom regimen.
Mindbloom will send you a kit in the mail complete with medicine, treatment materials, and tips for getting the most out of your experience.
After only two sessions, 87% of Mindbloom clients reported improvements in depression, and 85% reported improvements in anxiety.
It's time to enter the next chapter in mental health and well-being.
Let Mindbloom guide you.
Right now, Mindbloom is offering our listeners $100 off your first 6-session program when you sign up at mindbloom.com and use the promo code DarkHorse at checkout.
Our second sponsor for this episode is American Hartford Gold.
Inflation is at its highest level in 40 years.
We all feel it at the grocery store and the fuel pump.
Interest rates are soaring.
Retirement accounts are in real danger.
If you want to better protect your family's future, you should consider that people have been putting wealth into precious metals for thousands of years.
The more uncertain access to other stores of value gets, the more precious precious metals are likely to become.
Call American Hartford Gold to see how easy it is to get started.
They can show you how to protect your savings and retirement account by diversifying your portfolio with physical gold and silver composed of actual atoms.
All it takes to get started is a short phone call and they'll have physical gold and silver delivered right to your door or inside your IRA or 401k.
They are the highest rated firm in the country with an A-plus rating from the Better Business Bureau and thousands of satisfied clients.
Call them now.
They will give you a percentage of your first qualifying order back in free silver.
Call American Hartford Gold at 866-828-1117.
That's 866-828-1117.
Or text "Darkhorse" to 998899.
You know, I remember dial-up message boards where there was literally a phone number that connected to somebody's computer in their house that they had set up as a board to discuss whatever it might be; you know, back then we were talking about things like how to hack the copy protection on software.
Kids would exchange games and other programs and, you know, they would essentially violate the rights of the intellectual property owners by exchanging information on how to defeat the copy protection.
So that sort of thing existed there.
And I can certainly imagine, in the context of dial-up message boards, there was stuff on there that was really marginal, right?
And it was the Wild West, and you can certainly imagine the instinct to protect this environment, because there was really no way for somebody who set up this service, which was a good thing. You know, what you discussed on it wasn't necessarily good, but the idea that somebody would use their own computer to facilitate the interaction of people who lived hundreds or thousands of miles away from each other was a positive thing. And it wasn't going to be possible if, you know, any psychopath could post any criminal material on there and cause the person who simply supplied the computer technology to be liable.
So is that the problem that was being addressed?
That is the problem that was being addressed, and I think to understand how it or why it became a problem, it would be useful to talk about how third-party liability worked before the computer age, because the rules were developed pre-computer, and that's kind of where there became a problem here.
So prior to the computer age, liability for third parties was broken into two categories.
You had publisher liability and distributor liability.
And so publishers were deemed to be just as liable for defamation and other speech-based torts as the speaker.
Exact same liability.
And the theory being that the publisher has editorial discretion on choosing what they publish and then changing the words, and therefore it's fair to make the publisher liable to the same extent as the speaker.
Then you have distributors.
Distributors would be like a bookstore or a library or a shipping company that ships books.
And for distributors, the law developed that they were only liable if they had actual knowledge or constructive knowledge of the tortious nature of the speech.
And for defamation, it's really hard for a distributor to be liable because you have to say they know it's actually defamation.
And absent a court finding that it's defamation, that's very hard.
But the same rules applied for obscenity.
And so the case law dealing with distributors basically asked: did they know that what they were, you know, distributing qualified as obscene?
And you can imagine, given the covers of those types of magazines and such, there may be at least constructive knowledge, a duty to figure out if it's obscene.
But they had to have actual knowledge.
And distributors were treated differently because they can't change the content.
So as far as defamation, they don't have any ability to edit it.
And so they were only liable if they had actual knowledge.
And so when the first case, the Cubby versus CompuServe case, came up, they were the message board.
There was no editorial discretion.
The host did not exercise any editorial authority.
Whatever got posted, got posted.
And so there the court looked at it and said, okay, this message board is like a distributor because you aren't exercising any editorial authority.
That was in 1992 or 1991, sorry.
And then the second case, Stratton Oakmont versus Prodigy, was 1995.
And there, by then, Prodigy had learned that you can't have complete free rein on it.
You have to exercise some editorial limits.
So they had some basic rules of, you know, like no profanity, no pornography or whatever, and, you know, certain limitations.
And they exercised very limited editorial authority.
And so when they were sued, they said, you should treat us just like CompuServe, because we're exercising very minimal editorial authority.
It's preset rules.
We aren't trying to change viewpoints.
We're just trying to keep certain garbage off our listserv.
And the court said, no.
The essence of a publisher is the exercise of editorial authority.
And if you exercise any editorial authority, we're going to treat you like a publisher, not a distributor.
Ah, which creates a terrible bind then, because there's obviously stuff that you have to be able to keep off of a service, and as soon as you've done that minimal work, you have opened yourself to the possibility of liability suits over things like defamation.
Exactly.
And so that was in 1995.
And then at that time, the Communications Decency Act was already in the works.
And this part, Section 230, gets added to that to address that problem.
So that's the basic defamation law.
I think one other foundational aspect that's important to understand is the First Amendment concepts of association and viewpoint discrimination.
And so under the First Amendment, there are the five explicit rights: religion, speech, press, assembly, petition.
But there's also a sixth right the courts have recognized, which is the right of association.
Each of the five explicit rights has an associational aspect, but the courts have recognized that this right of association is an independent First Amendment right.
And the right of association also includes, obviously, the right not to associate, just like the right to speech includes the right not to speak.
And so that's one fundamental principle, understanding the right of association.
The second aspect is understanding the levels of censorship.
We have content-neutral censorship rules.
These are generally time, place, and manner rules.
Things like you can't have music played at a certain volume after a certain time of day.
They don't depend on the content at all.
It's just the volume, but they obviously impact speech.
Then you have content-based, but viewpoint-neutral limitations.
These are limitations that may impact what speech is being made, but don't turn on the viewpoint being expressed.
They're going to be neutral rules.
And then you have viewpoint-based.
And viewpoint-based is obviously, as it sounds, you're censoring people based on the viewpoint they're expressing.
And viewpoint censorship is almost always unconstitutional.
In very limited circumstances it can be permitted, but it has to meet the highest level of court scrutiny, and it's almost always prohibited.
Viewpoint also plays a role in the right of association.
You can have limitations on association if they're content neutral, that is, if they aren't based on the viewpoint being expressed.
And so non-discrimination laws generally are considered viewpoint neutral, and they obviously force association, but they can be applied in situations where the association is not aimed at expressing some point of view.
And so that's why you can force a restaurant to associate with customers through non-discrimination laws.
But if the association is there for the purpose of expressing a viewpoint, then certain limitations end up being viewpoint-based.
And that's why, for example, in religious organizations, people who play a spiritual role are not subject to discrimination laws.
The idea is that something in that religion may bear on the role.
Like if you don't want women to be priests, that rests on a spiritual basis, and therefore you can do that.
And so non-discrimination laws don't apply to those type of positions.
In these, well, maybe I should let you get where you were going.
Go ahead.
Well, I can almost feel us losing people in the various intricacies you are describing.
I understand why they're important.
The law has to be precise, and it has to allow for all of the important things and allow the prohibiting of the things that are truly troubling.
And the difficulty is in doing these things in such a way that, you know, especially in the context of an unfolding technology, you don't end up with massive unintended consequences, which is exactly where we are, right?
You're talking about laws that emerged as the world went from dial-up bulletin boards to services like CompuServe and Prodigy.
And then the question is, okay, so you make a rule that protects CompuServe and Prodigy in that context, and then what happens when the next evolutionary phase happens and you're dealing with Facebook and Twitter and YouTube, right?
So anyway, keep going.
Tell us how this impacts those.
Okay, so kind of bringing the ideas of the publisher, the distributor, and association together: in the 70s there were laws that created what was called the right to respond.
And basically what these laws said is that if some media organization says something bad about you personally, you have a right to appear in that media and respond to it.
And so these were challenged under the right of association, and the lead case was the Miami Herald case, where some political actor was, you know, criticized, and the law that Florida had said that he had a right to have an editorial in that newspaper.
They challenged it, and the Supreme Court said no, that's unconstitutional, because a publisher has a First Amendment right to choose what views they're expressing and promoting, and who they associate with when they promote those views, and therefore the right-to-respond laws were deemed unconstitutional.
And that comes to what we get with Section 230.
And so Section 230 has two relevant provisions in it.
The first is 230(c)(1), and it states, let me get the exact language here so I can get it right, that no provider or user of an interactive computer service, i.e.
social media, shall be treated as a publisher or speaker of any information provided by another information content provider.
So that's the first immunity, which is protecting them, basically saying you aren't going to be liable for defamation that a third party person speaks.
You're not liable for what people say on your platform.
Yes.
Just because you're hosting a mix.
And so that basically overturns, or deals with, the Prodigy case, and makes everything look like the CompuServe case.
But the second part, its title is "Civil Liability," and it says no provider of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or the availability of material that the provider or user considers obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.
So this sounds to me like a dry, dull, and boring version of Catch-22, right?
So in Catch-22, I will remind the audience, Catch-22 is the following provision.
You do not have to fly military missions if you are crazy.
But if you don't want to fly military missions, then you're obviously sane, so get up there.
And the line is, "That's some catch. It's Catch-22, it's the best there is."
And this Section 230 is really Catch-230, and it becomes really diabolical at the point that you get to the phrase, otherwise objectionable.
Is that the correct phrase?
That is certainly one of the problems that has happened.
Basically, the courts have interpreted otherwise objectionable to mean anything that the social media company deems objectionable.
If we were interpreting under rules of statutory interpretation that courts apply, generally what they do is they look at the other items in the list and then interpret the open phrase otherwise objectionable to have a similar limitation.
So "otherwise objectionable" would be read as something related to harassment, violence, or obscenity of some sort.
But the courts have not taken that route and they basically interpreted otherwise objectionable to mean anything.
But I think the bigger problem in this statute is that these two provisions are not expressly tied together.
I think the intent was basically to accept the argument that Prodigy made, which was: we have very limited rules on censorship.
They're all laid out.
And when we engage in that limited level of censorship under these express rules, then we should still be treated like a distributor and not be liable for other speech.
And so if the civil liability provision said, no provider or user of interactive computer services shall be held liable as a publisher on account of the editorial decisions they make in good faith, then that would be very limited.
And it would limit the immunity from liability to social media companies that have clearly written rules that they're enforcing in good faith.
And if they instead want to have kind of this freewheeling editorial control, then they'd be treated like a publisher.
Right, and I can certainly tell you, as somebody who has been thrown off of platforms, has been demonetized on platforms, that these rules are impossible to follow.
They are vague, and it would appear that they are vague by design to provide discretion for these platforms to arbitrarily enforce the rules.
And, you know, this is difficult stuff, because you have to stand in the right place to see what is occurring.
This pair of provisions, right?
Section 1 and Section 2, or not.
Yeah.
Subsection 1 and 2 of Section 230.
They basically create what we would call in biology a super species.
That is to say, something with advantages that allow it to defeat all other competitors, or in this case to defeat users who wish to utilize the platforms as a distributor.
Right?
We face the ruthless and arbitrary publisher that wants to avail itself of the right to alter what can be said and remove people who would say things that it doesn't want said, to label things as misinformation that are actually facts, etc.
But it has no liability, and we have no recourse to even point out that the rules that they've provided are so unclear as to be unfollowable.
So it is that pair of things that functions together like a bludgeon.
Go ahead.
Yeah, they're exercising the full editorial control of a newspaper publisher, but with no risk of any consequences for what's being said.
So they can take steps that actually shape the message that's being delivered, and control it, but they have just zero risk of liability for anything that's being said.
And it really just creates this kind of beast.
And I don't think that was the original intent, because at least if we look at the two cases that developed, the type of censorship, or editorial control, that was being exercised by Prodigy was so limited.
But as it's written, it has this blanket immunity from any defamation, and then this immunity for censorship.
Now, generally, and this is why the right of association is important to understand, generally, you know, the publisher has that First Amendment right to edit and censor, you know, at their complete discretion.
And so the civil immunity in Section 2 actually doesn't really do that much if it's a standalone provision, because you already have the First Amendment right to edit and censor in bad faith, if you want, and there isn't going to be any liability.
It provides immunity in very limited situations.
There's one case that was recently decided involving intentional interference with business expectancy, which is basically bad-mouthing your competition to try and get them to break a deal or potential deal with a rival.
The immunity here might come into play there, potentially, but generally it doesn't provide that much, unless it was key to getting the immunity from defamation in the first place.
So, I'd like to come at this from a slightly different direction.
The problem is, you know, if we step back into the mindset of 1996, you have an emerging landscape, and as much as these major platforms are intuitive to us now, they were anything but intuitive to anyone at that point, right?
I mean, the idea of Twitter is preposterous on its face, right?
You're going to have a text-based service in which you're limited to, what was it originally, 140 characters?
Right?
That doesn't sound like that's going to be a winner.
But, obviously, things are different.
But here's the problem.
In the world of dial-up, and I believe CompuServe was dial-up initially, you had these services that somebody was opting into, right?
Some service that you were opting into.
It was not ubiquitous, right?
I did know a couple of people with CompuServe email addresses, but this was not the place where the world's business was conducted.
And in that environment, the freedom to set rules that govern what takes place in the space is very different than when this is the de facto public square, where the arbitrary rights of the platform actually interfere with the de facto rights of the public.
The problem is we are dealing, and I don't think I've even heard this term used, we are dealing with quasi-public spaces, right?
And I want to dispense with the argument that we all hear so frequently, well, it's a private company, it can do what it wants.
Well, it is a private company.
However, it is managing a quasi-public space.
Airports are, by and large, privately held.
If airports arbitrarily decide that, you know, Republicans can't fly, that's obviously a problem.
And you can make the same argument, they're private corporations, they can provide service to whoever they like.
I'm sorry, no.
If you have effectively private corporations allowed to set up a no-fly list based on political orientation, you have a serious problem immediately.
Likewise, hospitals, often privately held.
Do you really want hospitals to be able to refuse service to people?
I mean, we've seen this actually under COVID.
We've seen questions about whether or not the so-called unvaccinated were entitled to medical treatment.
And yes, I understand that there's a philosophical question there, but there's also this deeper legal question, which is: if you're providing a service that is necessary for citizens, you are in a different situation than if you are engaged in some niche activity that is not fundamental to life, like the ability to travel, or the ability to enter the public square and participate in a discussion.
I think that I understand the appeal of that argument.
The problem with it, from a legal standpoint, a First Amendment standpoint, is this right to association.
And your airport or your hospital is an example of non-expressive associational relations.
And therefore imposing limitations, saying you can't restrict based on viewpoint, doesn't impact the purpose of that association, which is to fly somewhere or to provide medical services.
But social media is deemed an expressive association.
It's media.
It's like a newspaper to a certain extent.
And even in the right to respond cases, the argument there was the newspaper has this huge platform and I can't, I as an individual, can't get out to as many people.
Therefore, it's totally unfair because the newspaper has this huge platform.
They can totally change the conversation and I'm helpless to try and respond.
And the courts rejected that, saying that that's the nature of the business here, this is still an expressive entity, therefore they have a right to choose who they associate with and who they don't associate with.
And a second problem comes in figuring out how to draw the fine lines: what type of viewpoint can you not discriminate against, versus what type of viewpoint can you discriminate against?
And under the First Amendment, the courts have been really clear that if we're going to impose liability, you have to have really bright, easy-to-define lines in place, or else you end up chilling speech.
So I think that trying to adopt any law that prohibits viewpoint discrimination becomes very tricky to do, and that's what we've seen in the Texas law and the Florida law, which have both been deemed unconstitutional.
The Florida law basically was saying you can't censor political candidates.
And that was deemed to be viewpoint-based discrimination that was unconstitutional.
But they did permit some of the disclosure requirements.
The law also required them to disclose, you know, all the rules that they have for censorship, any changes to those rules, and the number of view counts that you've been getting.
So you could try and track the shadow banning.
And they held that those were viewpoint neutral rules, and those disclosure requirements were upheld.
They did, in the Florida case, hold that one of the disclosure requirements, which was a thorough explanation of why you're being censored, was too burdensome and therefore unconstitutional, because of the volume of censorship that they do.
It was just deemed to be impractical to exercise that right.
But you can have some types of limitations; as long as they're viewpoint neutral, they may have a chance of surviving a First Amendment challenge.
So, this again goes to the issue of this super species, right?
On the one hand, I understand the argument that it is technically too onerous to require these platforms to explain every instance of censorship.
Probably.
Some, you know, well above 99% of instances of censorship are some tiny account saying something obnoxious and it's transparent on its face why it's not being allowed.
On the other hand, when YouTube doesn't have to explain why it has demonetized our channel, it does not have to point to the statements that it finds to be false, for example.
And then the statements that it clearly was responding to turn out to be true.
Obviously, it should reverse the decision, right?
And the fact that these platforms do not have to be specific about this stuff means that there is no requirement of consistency for them.
They are not required to behave logically.
They can just simply decide, effectively, who they don't like, and, you know, put a shackle on.
And this is obviously the result of an anachronistic law being applied to modern circumstances that its authors did not understand.
I want to say one other thing before we lose it.
The argument that it's a private company, it can do what it wants: I understand the appeal of it too, as much as, you know, I'm a dyed-in-the-wool liberal progressive, so that's not a native argument to me.
I am concerned about things like corporate power, and good regulations are a necessary feature of the landscape as I see it.
In 2022?
Boy, do I understand this argument, right?
My feeling is we have malignant government and I want its power minimized, because the way it wields that power is paradoxical, and I would just point out that the classically libertarian argument, it's a private corporation, it can do what it wants, is actually functioning in the service of aggressive, malignant governance.
As we've now discovered, the platforms are apparently meeting with the government, and the government is telling them who it wants censored.
So this is a fascist nightmare unfolding on these platforms hiding behind this libertarian notion that these are private corporations.
They can do what they want.
So, I think there's a couple things to address that.
The first way that it could be addressed would be, again, going back to tying the immunity from defamation to an agreement that they only censor in good faith based on existing, clearly written rules.
And so there, what's happening there, by granting the immunity from defamation, the government is granting a huge benefit to the social media companies.
They could not exist but for that immunity from liability.
And so when government grants someone a benefit, they can exact a price from that.
And that price could be to comply with rules on censorship: limit your censorship to good-faith censorship based on pre-written rules, provide clear explanations for that censorship, and so on.
And so, if we tied sections one and two together, that would address this by requiring social media companies to make a choice.
You can censor freely in bad faith if you want, but then you don't get the immunity.
If you want the immunity, you have to have these clear rules.
And that was what happened under the Trump administration: they put forth a huge proposal on amending Section 230.
And this is one of the things that they proposed specifically.
It was tying these together and having specific rules: if you want the immunity, you have to have clearly written rules.
You have to have an enforcement mechanism, a way to appeal and challenge that ruling.
And so it's definitely something that could be fixed in this that would do that.
And it's permissible because we're providing that benefit.
It's the trade-off saying if you want this immunity from liability, you have to do something.
And I do think that was the original intent.
But I think that would address the super beast that we've created, by saying that, you know, they have to have clear rules.
And we can actually back that up.
As for the other things that states can do: in the Florida case that permitted some of the disclosure requirements, the court did say that requiring a thorough explanation of the censorship was impermissible.
But I can imagine lesser requirements, such as you need to quote the specific language that you say violates the rules, and you need to identify what that rule is, is something that states may be able to do on an individual basis.
Without amending Section 230, that, I think, would also address at least part of the problem.
Because I do think, I mean, one of the biggest complaints that I hear from a lot of the, the content providers is that we don't know what the rules are.
I mean, take the whole thing of demonetizing Dark Horse. Originally the line was that the advertisers were making them do it, but now, you know, I watch your podcast and there are ads on it.
Yeah, they got ads and they make the money.
So obviously YouTube doesn't think that whatever you're saying is un-adworthy; they're just doing it arbitrarily, because they don't like you.
And I do think that even Section 230's censorship provision does talk about good faith, and the courts have not gone too much into what that actually means. But that may be part of the solution, too: trying to get a good case that says, you know, the censorship wasn't in good faith.
The one case from the Ninth Circuit that I mentioned earlier, on tortious interference with business expectancy, the court held, well, if you're trying to ruin someone's business, you aren't acting in good faith, so the immunity actually didn't protect the defendant in that case. But that's the most on-point case we have, and we don't really know what good faith means otherwise. So I do think that we could address this problem without adopting a rule that says you can't discriminate based on viewpoint, because I think the First Amendment is going to prohibit that type of limitation.
Florida was really clever in how it did it. They were trying to say, well, we're focusing on candidates, and therefore we have an interest: basically, if you're censoring one candidate in a race but not the other candidate, you're effectively giving a campaign contribution, and we're allowed to adopt laws on campaign contributions.
And even that, the court held, was not a good enough basis to require them to carry all candidates or to limit their ability to censor candidates.
So it is going to be a really steep road.
The second point is that Justice Thomas on the Supreme Court has advocated something similar to what you were advocating: that we should treat social media companies like common carriers, which are required to take all comers. And that would, I think, recategorize social media as something else. It might work, but I think it's going to be hard.
There was a case that came out a couple of years ago out of New York, involving a public access channel that was run by a private entity. And the Supreme Court held that that private entity had the First Amendment right to choose who gets channels and who doesn't, and how it organizes them. And I think that logic would apply to attempts to limit social media companies' censorship under the common carrier theory.
But that is one of the other theories that are out there.
All right, there are a number of points that I want to catch up on here.
One thing I hear missing in what you've described as necessary, and I think it's implied in what you said, but I want to highlight it, is not only clear rules and an explanation of what you violated, but a process.
There has to be a process, and it has to be transparent.
The idea that Twitter throws you off because it claims you violated a rule, and then sometimes it'll say, as it did in Jordan Peterson's case, for example, or Dr. Rollergator's, you have to delete this tweet, which is offensive, and by doing so, you are admitting that you violated the terms of service, right?
And so, you know, talk about catch-22s.
That's exactly what they've set up.
And there's no mechanism other than, frankly, having your friends shout at them in public.
You know, to go in and say, actually, no, I didn't violate the rules.
Or here are, you know, 17 examples where you have let the exact same behavior or something far more extreme live on your servers.
And you've singled me out because of who I am and what I think, not because of what I said, right?
You need to have that process.
I would also point out, and I think this is really key: you've got the authors of Section 230 in 1996, understandably and unavoidably, failing to understand how the law they wrote would work in a future they could not possibly have foreseen.
That is what it is.
But what happens is you now have the federal government hiding behind the protections of private corporations who have this pair of provisions that make them into an unbeatable force, a force with all of the immunities and none of the costs, right?
And this is reminiscent, I would say, of what we discovered, largely through Snowden, right?
That, you know, we have a constitution that protects us from the overreach of our government, but nobody thought to prevent our government from subcontracting the spying on its citizens to another government, right?
In other words, I'm protected from the federal government spying on me by provisions that were written long before the modern technological era, but I'm not protected from the British government spying on me.
And if the U.S. government says, hey, spy on Brett to the British government, right, this is just unforeseen.
So that Five Eyes outsourcing of obviously unconstitutional activities is now happening in the context of social media, where the federal government, which is not allowed to censor me, is now telling Twitter and YouTube, you know, here are the people we don't want heard. You have Biden stepping onto a blood-red stage and invoking clear and present danger, which is effectively a thinly veiled argument that the censorship is required for the protection of the nation.
And this is, you know, a nightmare. So this is not a small matter that rests on the question of whether or not Section 230 should be two isolated provisions that provide benefits without the obvious costs that should go with them.
This is something on which the functioning of civilization depends.
Yeah, no, I think you're right.
And I do think this is a good time, maybe, to transition to talk about the cases that are going on right now, or the main case, which is the states of Missouri and Louisiana suing all the federal agencies that are involved in the censorship. And there is a doctrine under the First Amendment recognizing that, under certain circumstances, when the government is acting in conjunction with a private entity to censor speech, that private entity's censorship can itself violate the First Amendment.
And so what the state of Missouri and Louisiana are arguing is that the federal government has entered some type of partnership with the social media companies.
They've adopted Section 230, which gives these social media companies this massive power to censor.
And then they're working in conjunction with those companies to tell them what to censor.
And the courts have recognized a couple ways that this can happen.
One of them is when the government compels private entities to engage in censorship.
And here, a recent case dealt with the NRA in New York, where the NRA was offering these insurance packages through private insurance companies that would basically give you some type of protection if you're sued for firing your gun.
And the state regulators and the governor basically pressured them, saying, hey, we regulate all of you insurance companies, and we're going to start cracking down on you with our regulations if you stay in partnership with the NRA.
And therefore, all the insurance companies stopped working with the NRA.
And the court held that that was an example of suppressing the NRA's gun advocacy by pressuring their partners to not work with them.
And therefore, that did violate the NRA's First Amendment rights.
And so that was the compulsion.
The second way that it can happen is if the government enters into a partnership with private entities to create a censorship regime.
And that's what the state of Missouri and Louisiana are arguing.
And they're saying: the federal government has given these social media companies this great censorship power through Section 230, and then is partnering with them to tell them what to censor.
Through discovery, and through some of the documents that came out in the earlier lawsuits by individuals, they've gotten a certain number of documents that show this level of coordination and partnering. And they've gotten past the first hurdle.
So they're seeking a preliminary injunction to basically tell the government to stop telling social media companies who to censor.
And in order to get that, they need discovery.
And so the lawsuit faced a first test.
And back in July, the lawsuit got past that test.
The court basically said, you have enough actual evidence of this partnership that if we assume it's all true, would state a First Amendment violation.
So we're going to allow you to get this discovery.
And so they've gotten a certain level of discovery. Then, just yesterday, the court ruled that they should be able to get broader discovery, including from people in the White House, and Anthony Fauci and his communications with the social media companies. So there's more to come on that.
And so we're really going to be able to dig into the evidence and see how deep this partnership runs.
And if it is what we think it is, then that will end up being a violation of the First Amendment.
Yes, although I believe we have the evidence to detect an arms race of sorts here. And I do think the courts are the most likely avenue; they are the governmental entity that retains the most public-spiritedness and immunity from capture, such that they might still be able to function.
But the problem really is if you listen to what is said.
No, maybe I should step back one step. The reason that we have the constitutional protections that we have is that the American founders, having experienced tyranny in their own time from the crown, understood the ferocious power of government and sought to explicitly limit it to protect citizens.
The government has now in many regards, I would argue in most regards, been captured by private interests.
Interests that are not public spirited.
Now, the power of government is tremendous.
Captured, it becomes an asymmetrical weapon that functions on behalf of its captors.
But even worse, like a child engaged in the question of, well, you know, if you had three wishes, what would you wish for?
Well, wish number one, I want more wishes.
Those who have captured this ferociously powerful entity we call the government are not content with the limits placed on it by the Constitution.
And so, because the Constitution is understood not to be a suicide pact, that is to say, even the provisions in it are understood in some contexts to be flexible, they are using certain terms and tropes to trigger the removal of ordinary limits, right?
Now, this is particularly egregious in the case of the term terrorism. Just like clear and present danger, the invocation of the concept of terrorism triggers many post-9/11 provisions that effectively unhook constitutional rights. So when the Department of Homeland Security, for example, declares that there are three kinds of terrorism involved in social media traffic, you've got mis-, dis-, and malinformation: misinformation being errors, disinformation being intentional errors, and malinformation being truth that causes distrust in government.
They are saying something preposterous and laughable, right?
Truth that causes you to distrust your government is terrorism?
Like I'm not allowed to talk about the fact that my government has been captured?
That's obviously nuts.
But that's not the important part of it.
The fact that it's funny hides the fact that it also has a legal implication, which is that there are all sorts of provisions by which the executive branch, on its own authority, can designate people as terrorists, and once that has been done, the necessity to defeat terrorists is so great that we must avail ourselves of exotic rights to censor, to monitor, to corral, et cetera. That logically absurd chain means that we don't know what it is that has been said. And frankly, the right of discovery that the court grants only goes so far once the executive branch has, behind the scenes, said that because so-and-so said such-and-such, they are guilty of malinformation terrorism and are therefore a threat to the blah, blah, blah.
Once they make that leap, even if a court would laugh at it, were that court to see it, there's no provision, there's no mechanism to know it's happened.
You can't challenge it.
There's no way to ask a court to review it.
And I know this has been long-winded, but my basic point is: they have built a structure where, if they can trigger a certain pathway, it unhooks all of the normal protections.
And there is no way to even know it's happened.
So as much as the courts may be our best hope, I'm not convinced they have access to the information they would need.
No, I mean, certainly the courts are, you know, the weakest branch of our government.
They have no enforcement ability.
They've got no, you know, built in financial resources.
They're entirely dependent on the other two branches of government for their support.
And they're also dependent on those two branches of government acting honestly and truthfully.
And if you look at the cases, you can find many an example of where the government has not been truthful to the court, including in the terrorism context. There's one particular case that I've read where the Department of Justice was investigating potential terrorists, and they just lied about the existence of certain records; they tried to argue to the court that these documents were so secret they had to lie about them, and they got caught, of course.
And the court wasn't having it, but it's an example of that trying to be attempted.
And you have to imagine that's going to be happening in other situations as well.
In the social media context, we do have the advantage that outside private companies are also involved. And so the federal government doesn't have complete control: if it's an email between the government and a private entity, there are two copies of that email.
And so I think that there's a greater chance of uncovering at least the collusion on the censorship here on that basis.
But it's always a risk that the government's going to do it.
And they've certainly used the terrorism scare as justifications for lots of secrecy.
And they'll continue to use it.
If we look at the precedent going back to the 60s and 70s, the courts were facing, you know, a lot of social upheaval, and a federal government thinking that it needed secrecy, and the courts stood up pretty well against it.
The clear and present danger standard is a pretty high standard for when speech can be limited.
It's got to be, you know, an immediate call to arms in front of a body that is volatile and can actually, you know, create this type of problem.
And the courts have been pretty strict; it's fairly limited in when it actually can apply. If the precedent is applied, it would never apply to something in written form on social media, because that doesn't have the immediate imminency that the courts in the past have required before that type of speech can be censored permissibly.
Well, but, I mean, I don't want to get stuck down this particular rabbit hole. Again, my point would be: when it comes to the post-9/11 anti-terrorism provisions, the problem is that a court has to see a case. And if the executive effectively comes up with a rationale whereby, through its own reading of the law, these instances cannot be shared with a court, then the fact that a court would say, that's absurd, you have to share it with the court, doesn't matter, because what you effectively have is the executive branch having freed itself from checks and balances because terrorism is so darn dangerous.
So, you know, take a context where I'm involved, right: I am discussing in public, as a biologist, the truth of COVID and of the various treatments available. And then I hear the Department of Homeland Security saying that even true things that you say in public may be terrorism.
I know why they're doing that.
They're not doing that because they don't realize how insane that sounds.
They're doing it because it triggers something internal to this system, which involves it not being accountable to a court.
It also triggers something in the public, and I think that this is where we get this horrid feedback loop with social media censorship.
In that, you know, the one tool that the public has against this type of government secrecy, a government that's not willing to play by the rules, is supposed to be the fourth branch, the media, which is supposed to be exposing it. But when you have the media working hand in hand with the government, and they're all on the same page about what message they want getting out there, it makes it so much harder to get any other message out there.
And social media was supposed to be a great equalizer, because we, the people, are the content providers. You saw this in the election of Trump, where social media was able to get past the big mainstream media barriers. And then they cracked down, and we've had this massive social media censorship campaign ever since.
And so how do we counteract these type of things?
And obviously terrorism still does trigger something in people, so it makes it that much harder to get through using whatever media is free enough to actually have views expressed openly.
So yeah, we are in a real quandary.
It's a real problem right now.
And the courts, you know, have the potential of a role, with this Missouri suit, to really dig into a real problem. We saw it break in the 70s, when you had the Church hearings and such; we actually got to dig into a lot of the secrecy from the CIA and the FBI and all that was going on there, at least partially. Maybe this lawsuit has the potential to produce that type of break, when we see how open the collusion has been between the government and the social media companies on censorship.
But, you know, we'll see how far it gets, and there's only so much the courts can do. As long as the media that the majority of the public digests is so willing to go along with the federal government's line, it's going to be hard to break that open as well.
But it's all the more important for different voices out there.
As long as there's at least demonetized or not voices like Dark Horse and such that can get the opposite views out there, there's some hope.
Well, I want to go back: I invoked fascism earlier when I was talking about the collusion between the federal government and these private entities to censor, and I'm sure that sounded like hyperbole to some, but I want to point out just how frightening this is. The sine qua non of fascism is the hybrid between corporate and governmental power. And it always comes with the demonization of minorities.
And in this case, I think the partnership is different than it's ever been; this is a historically novel version of fascism. Not only do you have the federal government meeting with platforms and giving them guidance on who it wants censored, but you also have the power of these corporations to shut down would-be competitors.
So if the argument is, well, you can't use Twitter because Twitter has the right to throw you off, and then the answer is, well, okay, then I'll use Parler.
Well, no, you won't use Parler, because Parler depends on an architecture that we control.
And so the point is, Twitter would be insane to cannibalize its own business by throwing people off so that basically, A, there was a large population that needed to go somewhere else, and B, that it became an uninteresting platform because it was just a single big echo chamber, right?
On the other hand, in a context where you can't create that alternative service, because if you do create it and it becomes wildly popular it will be shut down, then the point is, oh, well, Twitter now has an anti-competitive advantage. Not only does it have the freedoms of a publisher and the rights of a platform simultaneously, but it also has the ability to create an absolutely gargantuan barrier to entry for a competitor who would take advantage of the insanity of Twitter.
And so that combination of things is this fusion of governmental power and private corporate power together in one entity that is now doing many of the things that are classically involved in fascism.
It is censoring, it is punishing in extrajudicial ways, and it is basically designating minorities who are not deserving of the protections that citizens ordinarily have.
So, at what point do you have enough circumstantial evidence that something an objective observer would call fascism is apparently alive and well in the United States in 2022? At what point do you have enough evidence to say, holy moly, it's happening here?
Well, I mean, that's the question: who can do something with that? We've talked about the limits of the courts, but the courts have some ability in addressing that.
But, you know, it's really a matter of public opinion.
And given that the collusion is between the media and the government, trying to sway public opinion is particularly difficult.
I mean, just the absurdity that fact-checkers treat statements by the government as the truth when they're deciding what's true and not.
I mean, I can't think of anything that is just more insane than that.
And the whole point of media is to check the government.
So to take the government's word is just, it's kind of mind-blowing, and it shows how warped our current media environment is right now.
And to your fascism point: the fact that there's such reverence for government right now is, you know, very novel, but it certainly shows that conjunction between government and private entities that we should be concerned about.
Yep.
And, you know, you could spend all day spotting the little asymmetries and arbitrary forms of power. But the ability to throw people off social media on the basis that what they are saying is so dangerous that it cannot be allowed obviously creates a mechanism whereby they can construct a pretense of consensus where there is no consensus whatsoever.
So the basic point is, well, how do you know that that statement is dangerous?
And the answer is because all the experts agree.
Well, what happens when you throw off anybody who says, I don't agree, is you get only the people who agree, so it looks like a consensus.
And that then creates a false justification for continuing with an even more aggressive policy.
We keep seeing that.
We saw that with the Hunter Biden laptop.
We saw that with, you know, the lab leak theory, the efficacy of masking and shutdowns, and we saw it with the 2020 election.
I mean... And safe and effective vaccines.
Yeah, and then vaccines.
I mean, if you look at the election context just abstractly, you know, there was the almost immediate demonization of anyone wanting to challenge the validity of elections.
I mean, if you don't have an absolutely free right to challenge, to allege that an election is corrupt, you're going to get corrupt elections. The only protection against that type of corruption is the ability to call it out.
But when you have this very quick agreement in the media that it's all bogus, you know, it undermines that. And that gets to the core of our whole governmental system: the ability to challenge elections.
And then obviously the vaccines. I mean, the safety of our lives, and just the pressure that's been put on vaccines. It was just like a month or two ago, I saw something in my grocery store: a cardboard cutout of a kid in a superhero outfit saying, you know, I got my vaccine, once they approved it for five-year-olds or whatever. I mean, just the idea that you can't challenge that type of thing is really something.
Yes, even where the challenges are wrong, they are fundamental, right?
It is the fact that it is challenged and it stands up to challenge that tells you that it was free and fair or close to it.
And, you know, we know two things.
One, both sides challenge the fairness of elections when they lose, right?
That's an unfortunate fact.
I don't want to live in that country, but that's the country we live in, right?
The Democrats did not treat the Trump election as free and fair.
They demonized it as downstream of a Russian campaign to sway the election.
Remember 2000 and 2004, too. In both, you had Congress members challenging the electoral count in Congress as well. It's happened a lot.
It's happened a lot.
And so the point is, that's the world we live in and we have to accept it, not cherry-pick and look at one side as crying foul when it loses.
The fact is we have a crisis of confidence in our electoral system.
And, you know, I think frankly, rightly so.
I think there's a lot at stake in elections.
And we are told that, you know, it's an existential threat to the nation if, you know, the other side wins.
Frankly, we're told that no matter who the enemy is, right?
Mitt Romney was, you know, the reason we couldn't engage in discussing a third-party option, and then it became Trump.
And people don't remember that.
That's just, that's always the argument.
Remember the demonization of Obama as well.
I mean, both sides do it.
It's just part of the nature of the thing: we're going to have both sides demonizing the other side. And that's why you need to have a more detached way of looking at election challenges, one that doesn't turn on who it is.
It's supposed to be a process that's there, and it needs to be explored and not immediately demonized.
Right.
And as you point out, I like your formulation a lot, that if you forbid the challenge, then you enable the distortion, right?
You enable collusion to affect these elections.
And frankly, the Hunter Biden laptop story, which I remember being demonized for saying, hey, this is serious and important. And it does involve the president, because what was being discussed was influence peddling, which was clearly something that Biden had engaged in over his career.
Then you also get the blackmail ability.
I mean, once someone allows themselves to be bribed, they can then be blackmailed.
It's a big deal.
Right.
But the point is the suppression of that story was an attempt by whoever did it to sway an election.
Right.
It's newsworthy information that that laptop exists. And it was fairly clear at the time that it wasn't phony.
And yeah, its emergence was unfortunate for Team Blue, but I guess the point would be, hey, how about nominating people who don't have this kind of shit in their background, right?
So when you make it impossible to discuss conspiracy, you are giving a gift to those who would conspire.
That's the point.
And when you make it impossible to discuss the corruption of an election, you make it possible to corrupt elections.
And that principle, a principle that must have a name, though I don't happen to know what it is, is this: if you don't want the thing to happen, you need a process in which we can explore whether it has taken place. And as soon as you start forbidding that exploration, you are guaranteeing the thing. That's the point.
Yeah, I mean, those who want to permit this type of suppression argue, well, you know, people are so easily tricked by lies.
I think we discount the power of truth to, you know, eventually climb its way to the top.
And, you know, I think that, you know, however convincing a false argument is, you know, it's much better to allow the truth to dispel it as opposed to just try and suppress it.
Well, I mean, you know, they have a nasty habit of telling you exactly what they're doing and then saying the opposite of the truth. And the fact is, yes, getting to the truth is a messy process, but the truth does tend to out on a level playing field.
And the problem is not that people are so easily persuaded by lies.
That's certainly true in the immediate, right?
But in the long term, I mean, look at what happened with Dark Horse.
We discussed a lot of things that were apparent if you had a biologically sophisticated perspective, and people heard it, and the point is YouTube had to avail itself of extraordinary rights in order to prevent that information from spreading. And, you know, demonetizing us was clearly a threat to our economic well-being, designed to persuade us to stop talking about certain things.
But it didn't work.
And so now they've resorted to something else: despite the fact that it is evident to us, when we walk around in the world, that we are reaching a large number of people, our numbers do not reflect it.
And this is something people who do not have a large channel will not intuit.
But it is very apparent when you are being shadow banned, or otherwise having the information about your channel altered. You feel it, right?
You feel the numbers go suddenly flat.
And, you know, when you talk about it, it sounds like you're being petty, right?
Or you're imagining things because you think you're reaching more people than you are.
But it's not subtle when it's targeted at you.
I don't even think it's designed to be subtle.
I think it is designed to create a sense of futility and, you know, one has to ignore it.
But the power of these entities is so asymmetrical, right?
This is one area where I think that states might be able to address this.
Again, going back to the Florida lawsuit and the disclosure requirements: under the case holding that approved some of those requirements, I think you could make a good argument that states could adopt laws that require you to provide notice when you're doing any type of censorship, and basically outlaw shadow banning.
Because I do think shadow banning is the most nefarious form of censorship.
It's secret censorship.
And I don't think you can say you're doing it in good faith if you're doing it secretly.
And therefore, I think that Section 230 does permit states to enact certain laws that are inconsistent with it.
And the Florida lawsuit shows that some of those disclosure requirements can survive a challenge.
And the Florida law did not have a specific prohibition on shadow banning, but it did require disclosure of view counts, so people could at least observe it.
But I think you can probably go a little further on that and states could prohibit shadow banning.
Because it is such a nefarious thing when you don't know what's happening, when your reach is limited and the true organic reach isn't allowed.
And you see this on channels like yours.
I know Jimmy Dore has pointed this out a bunch: he gets great view counts, he was growing and growing, and then he just hits a plateau and stops growing.
Yep.
No, it's exactly what it looks like.
Our numbers have been flat for half a year or more.
It's shocking how clear it is that some new force has been applied.
But I like your idea of states addressing this, and I would point out there are numerous examples where, because of the globalized nature of our system, a single state, or a single nation in the case of larger issues, enacting wise regulation can often force the issue, right?
California forced emissions regulations that, I mean, were necessary.
I grew up in LA and my God, the air was disgusting.
And the fact is California enacted some regulations, which yes, people still grumble about, but wow, the quality-of-life improvements that came from that were spectacular.
And the fact is, you know, California had a big car market, so you'd be crazy to make a car that couldn't be sold in California, so people just lived by the standard even where they didn't have to.
In the case of, you know, do you really want to have a social media platform that you can't watch in Texas?
No.
So, you know, that means the ability is there.
Once you require social media companies to build the infrastructure to meet those obligations, even if it's for one state, it's easily scalable.
So it's not a bigger burden on social media companies either, as far as the, you know, the infrastructure to do it.
And if you get a couple of states like Florida and Texas to enact something like that, those two big states are going to be enough, I think, to probably tip the balance.
You know, we'll see.
We just have one court ruling so far.
It's an appellate decision, so it's good in that sense.
But different courts may look at these laws differently; I think it's at least a good start.
And because Section 230 does specifically allow state regulations that are not inconsistent with it, it's definitely a possible avenue to try to at least get at one of the problems.
And I do think if we could end shadow banning, that would go far, because when they have to censor openly, there's the Streisand effect: once you know something is being censored, everybody wants to see it.
And, you know, I think that that would be at least a little step in the right direction that doesn't actually require amending Section 230.
Because I think trying to amend Section 230 at this point is too hard with Congress as it is right now.
Obviously they can't do anything that's controversial.
Well, I do think that I have a principle I like, which is read the books they seek to burn.
And I think in this environment, given what's afoot, we have an obligation as citizens to do this.
And you know, some of that stuff may be garbage, but in that garbage are going to be the things you need to know and you should pursue them.
One of the things that I've thought that's always weird is that a lot of conspiracy theories get demonized and suppressed, but then the whole Flat Earth conspiracy theory seemed to be amazingly promoted.
And so it's just made me think it's less valuable because no one's trying to suppress it.
It was so ridiculous that nobody bothered to suppress it.
Yeah, it's an odd one because you can demonstrate to yourself that the Earth can't be flat.
So I do wonder why it suddenly showed up.
But nonetheless, you know, here we are on a very round Earth.
Well, there's no effort to suppress it, you know, when you've got ancient aliens being promoted, again something nobody bothers to suppress.
And that's the flip side of read the books they want to burn.
Also look at what they don't care about and compare it.
Yeah, I would agree.
Oh, one last point I wanted to make and then anything that you think should be on the table here, we'll go with that.
But there's a part of the shadow banning issue that is uncomfortable to talk about.
Heather and I scrambled when our very successful run as professors came to a sudden end as a result of lunacy, and so we ended up accidentally becoming something else in the world, including podcasters.
And then that got, not ended, but demonetized by a related kind of lunacy.
But what people, I think, don't really intuit is how directly this shadow banning and all of its analogues impacts one's security in the world and one's ability to earn, right?
In other words, you can look at the number of subscribers and the fact that it's gone flat and say, well, okay, I don't know what to make of that, but how many subscribers do you actually need?
On the other hand, there are two points here.
One is, if we're saying things that are important in the conversation, limiting our reach artificially basically means that we don't have whatever our natural impact in the world would be.
But it also means that our ability to make our way in the world is severely impacted, right?
And we will never know how severely impacted it is; we would have to sue Google in order to get discovery, to find out how severely they had been altering our view counts and how they had changed our access to an audience, right?
So that's the point is the barrier to finding out even how badly you've been impacted is tremendously large.
Well, it's just contrary to all of the values we look at in this country.
The First Amendment is supposed to protect the marketplace of ideas.
And there's the basic American value that if you get out there and hustle hard, your fate is in your own hands to a certain extent; the general belief is that if you do your best work, you're going to make it.
And shadowbanning totally goes against this.
I mean, we saw what happened to you guys at Evergreen.
Well, that created a bunch of media, and it gave you a new platform, a new ability to promote yourself to a wider audience.
And so that's the marketplace of ideas, in effect, working properly.
But shadowbanning, because it's secret, because you can't see it, it's just so contrary to it.
It says that, as hard as you're going to work, we're going to hold you back in a secret way, and you aren't going to be able to get there.
It's the type of thing we imagine happened in the South: whispering of, oh, this person is too friendly to the minority, so we're not going to go to their business.
It's that hidden, secret, inherently unfair kind of limitation.
And it's just so contrary to everything.
And I think If we could address it in a way, I think it would be at least a step in the right direction.
And I certainly think it's the most nefarious aspect of censorship that's going on.
Yeah.
Well, I mean, these things are bad enough when it's a whisper campaign.
But a whisper campaign is one thing.
A corporate-government secret hybrid is quite another, and really we have to be incredibly careful not to burn out the claim.
People will just get used to hearing what this implies and they will stop paying attention.
When this happens in history, what comes next is historic tragedy.
And this is the moment to avoid the historic tragedy by recognizing that it's not going to goose step this time, right?
It learned the lesson.
Goose stepping is out, but the tropes, they're there.
It's something that cannot, and should not, be ignored.
You know, the First Amendment protects everybody's views.
You're supposed to protect the least popular views with it.
And it always comes around.
I mean, certain people may think they have comfort because their side is not being censored as much as the other side, but it'll all come around; once you lose your First Amendment rights, everybody's going to suffer from it.
So I think everybody needs to recognize the threat.
I mean, if you look at the censorship in the 2020 election cycle, if there was shadow banning of pro-Trump stuff, that's clearly swaying an election.
It certainly looks like that kind of activity.
But once you permit that, then that could be used to suppress your candidate next time.
I mean, nobody should be taking comfort just because they don't like the people who are being censored.
Yeah, I mean, it's almost too easy.
Had they been censoring things that were truly wrong, this would still be a disastrous, dangerous signal.
The fact that what they censored has so quickly been vindicated says not only that it is wrong and dangerous to censor, but that these people cannot be trusted even to decide what's good and what's bad.
They're censoring actual information and they're promoting misinformation.
And whatever process has created that inversion is not on your side.
And it may be that COVID ends up, God forbid, being a catalyst for real change.
If the vaccines end up being as dangerous as some fear, maybe in the long run that'll be a catalyst, because I think censorship played such a massive role in the promotion of the vaccines.
Yes, but it's also a double-edged sword, because to the extent that these things are at the very least far more dangerous than we were led to believe in the first place, and the information suggesting they were extremely risky did exist, the incentive for them to continue to censor the discovery of the truth only grows.
Oh yeah.
I mean, that's why, depending on how dangerous they end up actually being, I think there's a breaking point where they can only suppress it so much.
I mean, if you can get anybody to watch just a reel of all the young athletes who've died on pitches, in the middle of games, shortly after getting the vaccine, you see a certain number of those and it starts to be hard to ignore.
Obviously, we aren't there yet and I'm not saying that those are necessarily caused by the vaccine or not, but I'm just trying to say that I can imagine a level of evidence that becomes too hard to suppress.
Yes; on the other hand, what I fear is that although you couldn't suppress the pattern on its own, you can falsely portray the pattern and punish and economically cripple anyone who points out that that's what's happening.
Let's put it this way: from the Nixon era, I believe, we got the term plausible deniability.
What we now have is implausible deniability.
It doesn't have to be plausible.
It just has to be forceful enough that most people know not to touch it.
And I worry that the number of individual tools that have come together at the disposal of one ideology is so powerful that we have to realize those tools are not only being used against the fundamental truths, but against the process that would normally reveal them.
Yeah.
No, I mean, it's definitely an uphill battle.
I'm hoping it doesn't end up being this bad, but I do know that once somebody gets touched by the harm of it, they're much harder to brainwash.
Look at all the people on the left who have seen the harms of wokeness because they've been touched by it.
You're an example: you were touched by it, and now you're immune from that level of brainwashing.
The more people who have a personal experience with tragedy from the harms of the vaccine, the more people will be immune from this type of stuff, and if it gets bad enough, God forbid, it'll hit a breaking point.
Yeah, and we do see that, whatever people are willing to say, they are not lining up to get inoculated.
Yeah, the tide turned there.
Yep, sure did.
All right, is there anything else you believe should be on the table in this conversation?
I'm sure there is, but not that I'm thinking of.
I think the point in looking at Section 230 is, obviously it could be amended itself, but I really want to reiterate that states can potentially have a role in requiring disclosure of the censorship.
That can, I think, be at least a step in the right direction that doesn't have the same impacts on First Amendment rights.
And it doesn't conflict with what Section 230 gives.
I think the second point is just remembering that Section 230 gives such a huge boon to these private companies.
The more libertarian-minded say these are just private companies.
Well, these are private companies that are getting a massive government benefit in the form of this immunity.
So asking those companies to give a little back in exchange for this massive government benefit is not contrary to libertarian principles on its face.
When you're giving a government benefit, you can ask for something back; if you want to be truly libertarian, get rid of the government benefit.
And then we won't have social media, but we won't have this problem either.
Yeah, I think those are excellent points.
I mean, there's tremendous precedent: the government is facilitating the possibility for these platforms to exist, granting them access and protection, and it is entitled to ask something in return, and it should be doing that.
If it was acting in our interest, it certainly would.
I love your point that the right thing to do is to amend Section 230 to link the two provisions, providing a process and a description of the requirements that would connect the benefits these platforms get to the obligations they should naturally have, but that that may be impractical for political reasons.
But what is not necessarily impractical at the moment is that states could act and impose a reasonable balance on their own and courts could bring it about.
That's a hopeful message.
And I hope it does occur.
Sounds good.
All right.
Well, Ramsey Ramerman, thank you so much for joining me.
Where can people find you?
They can find you at @RamseyRamerman on Twitter?
Yeah.
Yeah, at @RamseyRamerman, just my full name.
It's the best place to find me.
Right.
By the way, kudos to your parents for creating that name.
It's excellent.
Always happy.
They were debating between Ramsey and Roscoe, and for anyone my age who remembers The Dukes of Hazzard, I'm much happier to have Ramsey than Roscoe.
Yeah, Ramsey Ramerman works.
Roscoe, after the Dukes of Hazzard thing kind of died down, would have been all right, but you would have had a tough decade there.
Yeah, all right.
So, @RamseyRamerman on Twitter, and we'll have to have you back as things develop on Section 230, as they clearly are doing.
So, anyway, we will keep the audience posted.
All right.