June 30, 2019 - Dark Horse - Weinstein & Heying
47:02
Benjamin Boyce - Bret Weinstein's DarkHorse Podcast #2

Bret Weinstein sits down with Evergreen graduate Benjamin Boyce to discuss The Evergreen State College, Project Veritas, big tech influence, machine learning, AI, and the artificial shaping of cultural ideas. Follow Benjamin on Twitter & YouTube: https://twitter.com/BenjaminABoyce https://www.youtube.com/c/BenjaminABoyce Find and Help Support this work below: Patreon: https://www.patreon.com/bretweinstein/ Twitter: @BretWeinstein https://twitter.com/Bre...


Hey folks, Bret here.
I am sitting with someone most of the people who follow me almost certainly know, but for those who don't, this is Benjamin Boyce.
Benjamin has been chronicling what has been going on at Evergreen since the debacle in spring of 2017.
He and I did not know each other before, but Benjamin is an alum of Evergreen.
I was teaching there, and we've become friends in the aftermath of that event.
So anyway, we are here to talk on what will soon be the set of Bret Weinstein's Dark Horse podcast.
We are here to talk about what is taking place with you, with Google, with the recent revelations out of Project Veritas. Do you want to say something about what you've been up to and what you've experienced recently at the hands of Google?
Well, let's back up a little bit.
So I've been working on the story about Evergreen, which includes you, but I've gone beyond the Bret Weinstein story into various layers of student activism and student counter-activism, and then the administration and the faculty.
And I've been following this for now two years.
And it's kind of obvious to me that I might be kind of annoying for certain people within the institution of the Evergreen State College, because I keep on bringing to light things that they very explicitly don't want to be brought to light.
However, they are a public institution, and Washington state law has a very explicit public records set of statutes.
So they owe me information.
They owe the public information.
And I've been able to get information, even though they've been basically obstructing my access through official means.
The public records department is basically locked at a standstill.
So I get information from other places.
Well, wait, I want to stop you right there, because I think this is an incredibly important fact that the public doesn't know, which is that this public institution is, by all evidence, conspiring against the laws of the state of Washington.
It is obstructing your attempt to get information that you are legally entitled to get from the institution, and it's giving you the runaround.
That raises serious questions about an institution that has problems that have been revealed through public records requests.
So effectively the institution has gone rogue and surely there is some legal mechanism that should kick in that forces them to comply because the mechanism that keeps them honest is people like you or me taking information that we are able to extract and exposing it and discussing it and letting the chips fall where they may.
We'll add to that the fact that Evergreen's enrollment is tanking and their main source of revenue right now is the legislator and the only thing that they can... It's the legislature.
Sure.
Legislature.
The legislature.
Yes.
Which is a governing body.
A governing body full of legislators.
Yes.
And the one proud thing that Evergreen can tout as a win is securing funding from the legislature.
They are getting state money in order to survive.
Therefore, it compounds the problem: why would they not be following the letter of the state law, or even the spirit of the law?
Why would they be wanting to take money, but not giving anything back in return?
It seems like a quid pro quo.
If you are a public servant, then you are beholden to the public.
Yeah, and the law is the law, and the legislature should surely be holding them to the laws of the state of Washington.
Yes, withholding funds if they don't comply.
I would also say that the instruction not to comply must have come from somewhere, and I think you and I would be likely to guess the same source.
Yes, George Bridges.
George has been all about spin, and President Bridges,
the current President Bridges, oddly not fired after one of the most amazing shows of incompetence anyone has ever seen at an American college.
Well, at least live streamed.
Yeah, well that's for sure, but let's just say we don't have evidence for greater incompetence shown by other college presidents that I'm aware of.
This is really singular and yet the man retains his job.
So what brings this to an intersection is my work on Evergreen intersecting with this Project Veritas revelation about the way that Google, well, we've already known that Google has a hand in controlling information.
That should be pretty much obvious, but this makes it very explicit.
A couple weeks ago, I got demonetized.
No, last Tuesday I got demonetized.
About 40 videos in the last six months were all of a sudden, well, demonetized.
I don't want to quibble here, but I think we have to be super careful.
You were soft demonetized.
Which I wasn't even aware was a thing, but apparently a monetized video can exist at different levels of monetization, and a bunch of your videos suddenly dropped from the green state to the yellow state.
Yeah, thanks for correcting me.
And you have inquired from Google?
Yeah, I inquired into it, and I was told that I need to have all those videos manually reviewed, and I've since submitted all those videos to manual review, and most of them are getting flipped back into the green.
But more interesting to me, this caused me to look at the Google search.
And I just typed in Evergreen College and Evergreen State College.
And I've repeated this a couple times now.
And I know everybody has a different result.
So it might be because I'm in the area of Evergreen that they're localizing it for some reason.
But my videos about the Evergreen State College, of which I have 92, are mostly based entirely on fact, and what's not based on fact is me responding to or processing what these documents say. But it's all based on public records, on primary sources.
It's a major source of material for every documentarian that comes and does the Evergreen story.
They contact me because I have all the information and the contacts.
So I'm the basis for any story about Evergreen.
The main basis, and I am no longer listed in the video tab on Google search results.
You have become hard to find.
Yeah.
In fact, you're hard to find even if you use Google to search for you, which I find amazing.
On the other hand, there's something here. I hear you, but I would say you're working too hard.
What you're saying about the difficulty finding you is very disturbing, but in light of what's come out of Project Veritas in the last couple days, I think there's a very interesting interpretation, which is that machine learning is being employed by Google in order to create what they were calling fairness, which, much like equity, is the inverse of what we would normally assume that term means when deployed by Google.
But the idea that a bunch of your videos suddenly got shifted, and then manual review is restoring them, opens the possibility that the machine learning algorithm, whether it was sent to look at your stuff, or whether it finally got around to it, or whether somebody changed a parameter, is what caused your videos to be altered in their state.
Somehow, something inside Google did a lot of thinking for the public and decided that your stuff, which is very interesting clearly to many who are paying attention to the story, should disappear.
And the idea that that can happen ought to frighten us, because as you point out, what you do is really just expose the facts of the situation.
You're not hostile to Evergreen.
I know from many conversations with you, some of them on camera and some of them not, that you and I and Heather all share a desire to see the institution right itself and restore the model that we know can be so successful at educating students.
So the fact is the public has every reason to want to look at your content, because the critique that you're leveling at this institution is actually in the hope of restoring it to its former glory, making it better in the future, serving students. And the idea that some algorithm might think that it knows best what people should hear about Evergreen, and that somehow you're off limits based on,
I don't know, what Project Veritas suggested was that they're using some kind of language-analysis algorithm that searches for terms and decides what might be conservative, of all things.
It's just, it's out of control.
Well, I don't know if this is more or less disturbing, but could it also be the case that some third party that was hired by Evergreen to boost their reputation on the web listed me somehow as an untrustworthy news source?
So Google then automatically scrubbed me, because I was told I was flagged somehow.
Well, let's try that a different way.
Let's say to the extent that there is some algorithm out there making decisions that have big consequences, not only for your ability to earn, but for the ability of the public to find information, that that algorithm, the existence of it, creates an opportunity for agents on the other side to game it.
So if there is an algorithm, then there's a question of, well, what do I have to do to trigger that algorithm to make my detractors' content obscure?
And some time ago, I tweeted the question.
I can't find it on Twitter.
I don't remember exactly what phraseology I used.
But the question was approximately: if it is possible to pay Twitter or some other platform to promote your content, what is to stop those platforms from selling the right to downregulate your competitors' content or your detractors' content?
And I think that's where we are.
Well, we are there, and we just watched the Project Veritas video, and one of the questions I wanted to ask you about that is that if any one of us is given the power to shape the world, such as Google has the power to shape the world, to quote Tim Pool, would it not be a sin to not try to shape the world for the better?
Is this not something that is inevitable, that somebody given this much power is going to want to manipulate reality or perception in order to manipulate reality?
Well, I think it is inevitable if you don't build some sort of protection into the system.
And one of the things that I find very surprising about the present moment is that lots of people are losing touch with why we have counterintuitive rights like the right to free speech.
After all, we should all be able to agree that certain things that people say are just terrible, and it would be great if nobody ever said them again.
Right?
We don't need more Nazi garbage.
But there's a reason that the Constitution itself says actually even that is protected.
And the reason it is protected is because there is no way to draw the line that protects heterodox speech and bars truly obnoxious speech.
And because we can't draw that line surgically, the value of heterodox speech is so great, and the founders were so clear on its importance, that they put in a protection against governmental interference with such speech.
Now, unfortunately, the founders did not see the present moment.
They could not.
I mean, these were people who never saw a bicycle, they never saw a chainsaw, a train.
They didn't have any idea what world we would be living in.
They would have no way of imagining an entity like Google that sat as an interface, a literal interface, between human beings at a mechanistic level.
That something that has some kind of crude intelligence could be making decisions about what I am able to say and who gets to hear it and how likely they are to encounter it.
Founders would never have seen that coming.
And because they didn't, they didn't properly fear private interference with speech.
So the First Amendment, and this is just what I said to Congress, is inadequate to protect the right that the founders were attempting to protect, which is the free exchange of ideas.
So are you advocating that the government expand its reach in order to give more liberty, like control, like take a little liberty in order to?
As a liberal, there's nothing I like better than governmental control.
The more of it that we have, the happier I am, is basically where that falls out.
Now, look, I've said many times that the problem with much liberal thought is that it underrates the danger of solution making, and that as you make solutions, you are always inviting unintended consequences, and that liberals have a blind spot about those things.
So believe me, I have lots of fear about what happens if we attempt to regulate our way out of this puzzle.
We could make things worse.
And so, I also believe we need to recognize that we are generally, just across the board, in novel circumstances, where the documents that we were handed that built this flawed but great nation are going to find themselves without the ability to address the problems that we encounter.
And that means that we have to figure out what to do about that.
Do you open up a constitutional convention?
Well, I happen to think that would be a disaster if you did it.
So I'm not in favor of a constitutional convention.
What do you mean?
Why would it be a disaster?
And what do you envision this convocation to be like?
Well, if you opened a constitutional convention, suddenly the immense potential to shift the nature of the republic in favor of one constituency or another would be on the table.
And I don't see the wisdom available to us to do it in a way that would not be detrimental.
And what I do see is a lot of powerful players who would not resist the opportunity to attempt to remake the nation to their interests.
So I don't favor that sort of thing.
But I do favor us thinking about the fact that we have a general problem with the novelty of our systems relative to the documents that are built to protect us.
And we have to have a frank conversation about what to do about that.
Well, if these companies are so big and so important to anybody who creates content, Twitter is absolutely essential for me to interface with other thinkers and my audience and grow my base and then also figure out and grow as a content creator.
Evergreen was very good for me.
YouTube's been wonderful for me.
It's given me a voice.
It's given me contact that I never imagined was possible.
So within these frameworks, is it possible to change them?
Is it possible to band together to get people to... I don't know.
Outcry doesn't seem to work.
Maybe Congress will go after them at some point.
Do we have to reboot everything?
We can't reboot everything.
So, I'm unfortunately not quite ready to talk about this publicly yet, but I will say I think this is the core question that we face, which is how exactly do you deal with a mechanism as complex as civilization has become?
How do you upgrade it without taking it offline, or allowing it to collapse, or any of the things that we can't afford? I do think there is a category of answers to this question, and the question of what to do about the net is, I think, going to be the first test case. The net is new enough,
and the mechanisms by which we might upgrade it so that it does not become a totalitarian nightmare, those mechanisms actually exist at our disposal, if we can think, I hate to use a cliche, but think outside the box enough to deploy them intelligently.
We have seen enough components that are available to us that actually we, the public, can upgrade the net and out-compete the authoritarians in their own landscape.
That sounds like a technical solution.
Do you have any ideas about a cultural solution, the way that we interact with one another, the kind of content that we train ourselves to favor, or the types of interactions that we promote or demote inside of ourselves that might be playing into the hands of these large behemoths?
I am not sure those are different questions.
My sense is, if you look at, for example, television. When I was growing up, and you're not that much younger than me.
You're 40-ish?
Yeah.
Dang, I'm 50-ish.
In any case, when I was growing up, everybody understood television to be dangerous and destructive, especially to kids.
And about the time I was in high school, I started saying, it's not the box, it's the business model.
The box itself, and you know the thing that caused me to think that was that I spent plenty of time watching documentaries on public TV.
And I didn't have the sense that I was degraded by them.
In fact, I felt enhanced.
I was educated by them.
And so, obviously, it came over the same box.
And what was different was that the business model was different.
Now, I don't think NPR or PBS are a model of anything at the moment.
I think they've been captured by something quite dangerous.
What I do know is that HBO at first, and then later on Netflix and others, have taken the very same box that was delivering really toxic content and have repaired it by getting people to pay up front for a service and then delivering them a higher quality good that does not require advertising.
The HBO version, where what they do is they sell you the right to watch some long-form narrative and they don't interrupt it with commercials, actually you can do something very sophisticated in that space.
And yeah, there's a lot of garbage out there still, but the point is, nothing had to change about the tech in order to change the narrative content and enhance it.
And I think the same thing is going on with the net, where what we have are platforms that, because of the competition they are caught in, are forced to manipulate us into sticking around longer than is healthy for us, to staying on site, as they say, when our mental health would suggest that we should get up and turn off the computer, do something else.
So the real question is, if we were to fix the net and fix the platforms that ride on it, what would that do to discourse?
And my guess, and it's not a wild guess, it's an educated guess, my guess is the quality of discourse would go way up.
And in fact, I think IDW has been a demonstration of this.
So what would cause the quality to go up?
The need to no longer keep people stuck around but give them something that they want rather than something that they are told they want or manipulated into wanting?
Yes, if you fix the incentive structure around the content and the interaction between, let's say, creators and platforms, then the quality of that which was delivered would go up, and people would seek that out.
You know, in a video that you filmed of me quite some time ago, we talked a little bit about wisdom.
Yeah.
And I argued that the core of wisdom was delayed gratification.
And it's funny, in the aftermath of releasing that video, people have pointed me to lots of other places where other people have argued that that's actually the core of wisdom.
So it's a resonant idea.
And I think the point is, given good tools, wisdom about how to spend your time online will go up.
We don't have that happening now because we have antagonists.
We have very sophisticated antagonists who are deploying very sophisticated machine learning to keep us from getting wise, to keep us coming back for the junk food that they're delivering.
Well, at the same time, I think we can be a little bit generous with at least the intentionality of the Google, whatever it's called,
or whatever they have. The language itself is really telling.
Like how they frame all their little ministries is just alarm bells.
Yeah, especially the nothing-to-see-here-move-along ministry.
What the hell is that?
Go away, go away.
But it seems like in the Project Veritas video, the woman that was caught on camera who's higher up on some level of the appropriations or the appropriateness committee, she wants to change the election.
She doesn't want Trump to happen.
Like, they're really scared of Trump.
They really want the world to be a better place.
So it seems like they are acting out of an idea of wisdom or care or concern.
Well, is there any way to reach them and get them to not go down this direction?
No.
I'm utterly convinced there is no way to reach them.
These are maniacs who do not realize that that's what they've become.
And, you know, we saw this at Evergreen.
Was there any way to reach Evergreen's faculty?
There's still no way to reach... Which tells you something.
I think that's a model of what's going on.
Google is incapable of questioning the wisdom of doing what it's doing.
Now, with respect to their point about Trump, I too think that the election of Trump is a very frightening fact.
As we are staring down the barrel of potential war with Iran, having a guy like Donald Trump in charge of whether or not we go that direction makes us much more vulnerable, I believe.
And so there is something to be said for the question of, well, what does this mean?
On the other hand, the idea that Google thinks it has the right to adjust content so that he doesn't get elected again is the height of hubris.
And it carries with it a very obvious danger, which Google, which has studied artificial intelligence, should well understand, which is if you start with the conclusion, which is I'd like to live in a world in which Donald Trump doesn't get elected, and then you start adjusting content so that it points away from a guy like Trump.
You derange us.
You make it impossible for us to have a nuanced conversation.
You make it impossible for us to have a conversation in which the counterintuitive plays an important role.
Because the algorithm, you know, it's not artificial genius.
It doesn't even deserve the... Intelligence, yeah.
Right.
Intelligence is a stretch in and of itself.
The point is, look, you have to let us have discussions.
And the fact that you and Tim Pool are finding yourselves on the wrong end of this algorithm tells you everything you need to know.
Well, what does it tell me?
I just can't get my head around why they would think I am important enough to scrub out.
Let me put it to you this way.
It tells you that the algorithm thinks that you are bad for the future.
When in fact, I've listened to you.
You are a difficult but positive force.
You're a difficult force because you're forcing people to look at something they don't want to see.
And you know, it's the same thing I encountered.
The problem with the Evergreen story was that good people don't like a story with black bigots.
Black bigots were the problem.
I don't like a story with black bigots.
On the other hand, if there are black bigots, which we encountered at Evergreen, You can't decide that story is unreportable because it goes against the narrative that you favor.
You have to say, well, what's happened?
Why is that a phenomenon that's occurring in the present?
So, anyway, my point would be, you and Tim Pool are delivering very high quality material, and for the algorithm to have decided that what you are doing is actually counterproductive tells you that the algorithm is not interested in truth.
It's not interested in in-depth explorations, because what both of you are doing is exploring things in depth so that people can see the horror show, and they can also see what's being lost. In any case, I'm against any algorithm that would decide that you and/or Tim Pool were on the wrong side of history, because I believe you are on the right side of history, and in great company.
Well, I'm not on the side of those who would rewrite history.
We know that at this point.
Well, you have an inability to look away.
Yeah, it seems like, in what you brought up with wisdom, that delayed gratification, there's also, it just makes me wonder what that desire is for us to declare an outcome, or to be outcome oriented. And it seems like an
unintuitive principle to not become outcome oriented in certain respects, with regards to how they don't want Trump elected.
They want to fix it with Evergreen.
They want to fix the numbers even though the numbers were fixing themselves.
They wanted to fix these gaps between different races and their achievements.
And that equality of outcome, it seems really good.
It seems like a fool's gold though.
It seems like a foolish wisdom.
Do you see that?
It seems like you brought that up with Google deciding where to go and then trying to muddy the waters to get there.
Something has switched sides, and the desire to stamp out inquiry at Evergreen, or in the academy,
is obviously paradoxical. And the desire to, I mean, this is Google. Google became Google because it allowed people to discover what was actually available in the world. It indexed it enough that we could actually find what we were looking for. And so in some sense,
Google became a kind of next level perceptual apparatus, like an eye to look into civilization and what's taking place.
And now what is that eye doing?
It's editing what you can see.
It's serving a particular brain.
It is becoming paternalistic.
And it has decided that it doesn't want you to look at your content, for example, and evaluate it for yourself.
It's going to evaluate it for you and disappear it.
And if you really want to find it, you can.
But you and I both know that the degree to which Google intervenes between you and your audience, that adjusts your fate going forward.
They've decided that they've looked at your content so other people don't have to.
And what the heck?
Google is going to decide that?
Yeah.
That's like the Academy deciding that inquiry is a form of oppression.
Yeah.
It just, it's, it's odd.
And I tried to design, as I developed as a content creator, I'm still very early in this experiment, but I tried to design content that was complex, that kind of used that knee-jerk reaction to get people into the door.
Getting attention is important.
The way that I kept attention, and tried to go forward and manipulate attention, or to break apart attention and engage attention over the course, is to not feed on those clickbait things, is to actually upset expectations and have long-form conversations with people that you wouldn't normally have a conversation with.
And it seems like the algorithm is somehow misreading me and my intentions, or reading a certain intention into it, and it either can't process that level of complexity, or that level of complexity goes against what its handlers want to happen.
I think that that level of complexity, such as we're exercising right now, that I don't think any machine could actually gauge what we're actually doing here.
That level of complexity, I believe, is the way forward.
It's the way for us to get to a place where we can disagree and build something while disagreeing on things or whatever we need to do.
Well, I've said elsewhere that we missed the boat with respect to the fears about AI, that we were expecting robots and that we are actually now living the very early stages of the AI apocalypse, and we don't even know it because the robots aren't physical. And in some sense, the algorithms have started to think for us.
Now, here's a very basic question.
Do the executives at Google have an escape from their own algorithmic adjustment of reality?
Or is this a positive feedback where the executives of Google are going to be convinced by the algorithms that they stupidly set in motion?
Are they going to be reinforced in their wrongthink? I hate to use that term, ironically.
They think we're involved in wrongthink, but to the extent that they have deployed an overly simplistic view of what's worth hearing and what isn't, are they creating their own echo chamber, which is then going to derange them?
Right?
In other words, the overlords have created the seeds of their undoing and it's, you know, it's very much like the scene in whatever movie where the robots turn on their creators.
The algorithms are inevitably going to confuse people at Google who are programming the algorithms.
It seems like every once in a while a new video surfaces from one of those dynamic companies out in Boston, where they have a robot.
Boston Dynamics.
They beat up the robot now, and the robot can sustain a beating.
It's still, if you look at those robots, they're scary, they're funny, but they're not really articulate.
What Google has done is released robots into our cultural space, non-physical robots, but now they're going through our cultural landscape.
Are they any more adroit?
Are they any less clumsy than the ones that we see in those videos?
Yeah, they're not quite a droid.
They're Android.
Oh, sorry.
All right, we can edit that out.
I was headed somewhere though, but it does remind me.
So every once in a while, I put out an Evergreen video, and there's always somebody saying, get over this, kid.
You got fired from there.
I'm like, I was a student, but.
People don't understand.
But I really do think that the Evergreen situation maps onto other things.
And what you just said about Google kind of circling the wagons or getting their robots to cut them off from discourse, they don't have any accountability to the outside world.
And when we don't have accountability, just like what happened with the Evergreen story recently, certain faculty were trying to get the faculty to have an open and honest discussion, and every layer of authority above the faculty, which is all supposed to serve the faculty, shut down that discussion.
They finally had a little bit of discussion, and even within that discussion, they didn't want anybody else to hear them speaking.
When we cut ourselves off from that discourse, that damages us more than anybody else.
We are unable to actually look at the world, because that which we've designed to protect us from the world ends up creating huge blind spots for us going forward.
Yeah, I mean, in fact, the...
The point is, this authoritarian stuff ends in a dystopia every time you deploy it.
And we are now seeing every stripe of authoritarian, right?
We saw it at Evergreen.
I mean, Evergreen now functions like a little dictatorship.
And the fact is, the faculty don't even have the ability to email each other, right?
The dictatorship decided that because faculty emailing each other was the root of their unhappiness, they cut that off.
You can't reach the board of trustees without going through the president's right-hand man.
You can't reach your colleagues.
It's a little totalitarian state.
Google is behaving like a little totalitarian state, except it happens to be one that is now in some amorphous way sitting right in the center of our ability to collectively think.
That's a very dangerous process, and it's not going to end well.
So the question is, A, what should we think?
This video is going to be released on YouTube.
Will YouTube's algorithm be able to figure out?
Will it detect that we are actually attempting to figure out how to defend liberal values?
What does Google's algorithm think of traditionally liberal values?
Sounds to me, from what Project Veritas captured, and Project Veritas frankly makes me very nervous, that Google's algorithm has looked at traditionally liberal values and it thinks it knows better and doesn't need them anymore.
And, you know, we all know where that's headed.
Is there anything that we can do then other than sound the alarm?
I mean, I don't want to, I'm tired myself of sounding the alarm of, of, of feeding into outrage.
I'd rather keep on trying to produce deep, good content, but it seems like we do need to continue to come together on some level and organize, or get ready for moving in another direction.
Well I'll give you two answers.
One is we need to retool the internet which will allow us to walk away from those who have decided to think on our behalf.
That's one level and that's going to require some careful technological thinking.
The other level, though, involves us figuring out how to reject the diagnosis of us.
And this is a place where I think IDW, at least for a while, succeeded.
When IDW first began to form, before there was even a name for whatever that phenomenon is, it was immediately dismissed by a wide range of people as a conservative phenomenon.
And many people, including me, kept pointing out that doesn't make any sense.
At least half the people who are associated with that conversation are actually left of center.
Not only that, but everybody involved in the conversation is heterodox in one way or another and broadly tolerant.
So it isn't an ideological conversation.
Now my point though is, at first, there was a lot of resonance to the dismissal on the basis that it was conservative.
But figuring out how to respond to it without becoming enraged, and saying no, that simply doesn't fit the facts, actually caused an update to people's model of what they were seeing.
So I guess what I'm saying is we have evidence of places where stigma that is supposed to shut down one's access to an audience was actually repelled, because the audience is interested and there is enough openness to an analysis of what's worth listening to that, in fact, we won, at least for a while.
That threat of stigma will always be popping up, it seems like; no matter where any particular people congregate, they will be slandered.
And the weird thing is that there are these useful idiots, these little gaggles of groups that just want to go around and stigmatize people, and those are being used.
That action itself so easily feeds into larger authoritarian movements, because it seems like it shuts down people's ability to really engage with things.
And I wonder, maybe we don't have time to talk about it right now, but about the mechanics of stigma: how do we see it in ourselves and shut it down, or use it in the proper way?
Well, the most important thing is that, you know, I've said in many of these places where I've been asked to talk about the free speech crisis, I don't think this is a free speech crisis.
Part of the problem with the framing of what we're facing as a free speech crisis is that it focuses too much on the speaker, when in fact what we saw at Evergreen, on other college campuses, is an attempt to prevent people from being able to listen to what they want to hear, right?
You shut down a speaker, it's not their rights that are the most important.
The question is, what about all the people who wanted to hear what they had to say, who don't now have the ability to access it?
So, the point of the stigma is to interrupt your ability to seek content that you want to engage with.
And in some sense, that means that the solution to the stigma problem is not to deal with the stigmatizing itself, but to make sure that it does not interrupt the ability of an audience to find the content that it wants to engage with.
And I, you know, there is a frightening aspect of that.
I don't want people engaging with white nationalist content, but the reason that the founders protected speech the way they did is that they trusted that the net effect of protecting all speech is to allow heterodox ideas to flourish when they are right.
And for that, we have to accept a certain amount of speech that, on the whole, we would rather not exist.
In your estimation, are there forms of speech that are so toxic that they can destroy or corrupt everybody's mind?
That should be like some sort of science fiction novel, like the book that will upload us all into the net, or something very deadly or dangerous.
Is there a combination of words that can ruin us?
You have to accept a qualified answer on that point.
I don't think there's a simple answer.
There is no set of words that does that.
But given a particular context, there is a set of words that will set us in motion in a particular direction that is very dangerous.
And the point I've been making is that economic contraction, the experience of the opposite of growth, puts people in mind of who to go after in order to restore growth for their family, for their kin group.
And so my point is we are actually wired for messages that will cause us to turn on people who are vulnerable.
This is what happened in Germany in the 30s and it can happen anytime.
So, to the extent that what we have are people playing on this, like, I would argue, Donald Trump did.
I don't think it's necessarily what he's interested in, but to the extent that people's cognitive structures were looking for evidence of somebody who was going to point out which group we could turn on, well, he played on anti-immigrant sentiment and, you know, distrust between populations and things like that.
It's part of how he got elected.
And so we have to be aware that we live in a period in which messages that we do not want to spread are likely to be more resonant than we would like them to be.
It doesn't mean that those same messages would have any resonance whatsoever were we living in a boom.
Okay, alright.
So, yeah, the answer is usually not just the words themselves, but the context in which those words might ignite a large fire, let's say.
Yeah, maybe the simple way to say it is, there are no words that will do it, but there are combinations of contexts and words that will.
If Google goes through and starts to banish us from certain nutrients, let's say, thinking that these things are bad, but just goes overboard, then it could lead us to a place where, on a psychological or cultural level, we're hungry for some sort of upheaval that could otherwise be avoided.
I think that's true, but maybe at a more basic level, the thing that we need most is our ability to think clearly collectively.
And that requires that nobody decide which fraction of the information we are able to see.
As bad as it is that there are messages out there that people may spread that are dangerous, it is far worse that somebody decides that they know which messages we get to see and which ones we don't.
So that right is a very important one.
It is not synonymous with free speech.
It's related, but it is not the same thing, and we're going to have to figure out how to protect our ability to think collectively, because it's quite clear that people wielding algorithms have set us on a very dangerous course.
By thinking collectively, you mean the opposite of groupthink, in a way.
It's together-alone, or together-against; it's another way of framing interpersonal communication that doesn't come down to everybody thinking in a rigid form, but interacting with the opposite of rigidity sometimes.
Yes.
The humor and the ugly and the unnecessary, the uncomfortable.
Let me put it this way.
There is no reason that language would have evolved to just synchronize people's thinking.
Synchronizing people's thinking is not a very interesting phenomenon, and it would not have caused the evolution of something as absolutely miraculous as human language.
So yes, it is true that sometimes people use language to cause groupthink.
But the most interesting thing that language does is it allows minds that are different to compare notes and to upgrade each other.
And that's what I mean by our ability to think collectively.
I'm imagining something much more like an analog of a brain that exists in between us, that actually does result in us getting smarter over time, and to the extent that Google believes that it knows what that brain should think about, we ought to be quite frightened of Google.
Yeah, it's as if they've put an Android thing into our heads, an insert of some sort, some sort of Borgian modifier or enhancer that does the opposite of enhancing the world?
I have to say, when I walk down the street now and I see Apple's earbuds hanging out of people's ears, I do sort of have the sense... I'm not sure why it strikes me as more dire than when they had earphones with cords, but there is something about a large fraction of the population walking around that way that makes me think,
oh goodness, where have we ended up? - All right, Benjamin.
This has been marvelous.
What I'm hoping, we don't live that far apart, what I'm hoping is that you'll come down and we can have further conversations about what's taking place and what it means, because it's always a pleasure engaging with you.
It's been absolutely wonderful.
Thanks, Bret.
Terrific.
Thanks, Benjamin.
All right, for the rest of you, I would say if you want more content like this, hit like and subscribe and maybe hit the notification button and we will talk more about civilization as it develops.
Oh, and find my channel in the description somewhere.
Also, his channel will be linked in the description.
It was very impolite of me not to mention that, but of course it would have been and it will be, and so you can find him there if you're not already subscribed, which is what I suspect.