Special Episode: Interview with Thi Nguyen, a Gurometer's Guru
Today we talk to C. Thi Nguyen, a philosophy professor at the University of Utah. He has some excellent insights into the kind of discourse that *feels* like it gives us insight, that wonderful 'aha' moment. Basically, what happens when unscrupulous actors aim to optimise that feeling, putting aside concerns as to whether or not it's the real thing.

Thi has previously studied 'moral outrage porn', which is a bit like food porn, but for your emotions. You might say "X-porn" is any material that gives us the facsimile of the thing, without having to put in the hard yards of actually doing the thing. You might also say there's such a thing as 'insight porn', and maybe that's what Gurus deliver! Matt and Chris feel like they got some real insights (touch wood) from their chat with Thi and they hope you do too!

--------------------------------

More from Thi Nguyen (he writes about many things, including echo chambers, epistemic bubbles and the seductive feeling of clarity):
A free preview of the first chapter of Thi's Games book
Moral outrage porn
The op-ed version of moral outrage porn
The echo chambers paper
How Twitter Gamifies Communication
Recommendation from Thi: The best book on Echo Chambers
Another recommendation: A great recent book arguing for the large-scale misinformation thesis (it's open access)
Thi's appearance on the ETV pod discussing Cheap Talk
And welcome to Decoding the Gurus, the podcast where two academics listen to content from gurus across the online world and we try to understand what they're talking about.
And in this special interview slash chatette episode, we have someone here to help us understand some of the reasons why gurus are so appealing.
So welcome, Thi Nguyen.
Hello, hello.
Okay, so Thi is a professor of philosophy at the University of Utah.
He's done some very interesting work on how the online infosphere affects people's thinking, including the phenomenon of outrage porn, which he can tell us about, as well as epistemic bubbles and echo chambers and other interesting things.
So it's great to have you on, Thi. Thank you very much.
So to get us rolling, we might...
Start off briefly with your stuff on moral outrage porn, because on DTG, we're really interested in essentially fake things masquerading as the real thing, right?
Well stated, Matt.
Very well explained.
So, yeah, so Outrage Porn is kind of a similar thing, so maybe tell us a quick bit about that.
Okay, so this paper, and I will be honest, the paper started as a drunk Facebook conversation on someone else's thread between me and Bekka Williams, who turned into my co-author for this.
And I was just like...
Two in the morning, I was like, you know what?
No one's given a good definition of...
Food porn!
Because we talk about it all the time!
Of course, there's this huge amount of work on sexual pornography, but there's this new use, right?
And I think we all know it.
Food porn, poverty porn, ruin porn, closet porn.
My wife says to calm herself down, she goes to look at this site called Things Organized Neatly, which is just...
Obviously, organization porn.
I'm adding it to my bookmarks.
It is strangely sexual, and one of the things you can see is when you look at these porn sites, like food porn and organization porn, it's obviously porn-like.
So we were trying to figure out what it was, and Becca had this incredible suggestion.
There's this old paper from Michael Rea where he says what sexual pornography is, is that you exchange sexual images outside of the context of a relationship, not for furthering a relationship.
He was really interested in the fact that people could exchange naked and erotic pictures as part of a healthy relationship, but porn was something else.
So he thought it was like this weird thing that existed outside of the normal goals of intimacy and connection and a romantic relationship.
We were like, hell yeah, we can generalize that definition.
So here's our definition of porn.
X-porn, for any X, because I'm a philosopher, I have to use variables.
X-porn is representations of X used for immediate gratification while avoiding the costs and consequences of entanglements with the real thing.
So food porn.
Pictures of food make you feel all hungry and good and whatever, salivating, but without having to...
Buy food or make it or deal with the nutritional consequences or go out.
Like real estate porn?
Cool pictures of real estate without having to buy it or care for it.
And one of the suggestions we made was, okay, here's a new kind you can identify.
Moral outrage porn, right?
Moral outrage porn is representations of morally outrageous situations
that are engaged with for immediate gratification, for the pleasure of moral outrage, rather than for actual moral action.
And I want to be super clear here.
A lot of people read this stuff of ours and immediately try to adapt it to this crappy end that I don't believe in at all.
The crappy end is, oh, this means moral outrage is bad.
Let's all be civil and nice to each other.
Fuck that.
That's not what we meant.
What we meant was, when you say that...
If you think that sexual pornography is bad, you don't think sex is bad.
We don't think that moral outrage is bad.
Moral outrage is incredibly important.
It's motivating.
Aimed at the real thing, aimed at actual morally problematic situations, it's one of the most crucial emotions we have.
It's because real outrage, moral outrage, is so important that the pornified version, which simplifies moral outrage for the sake of pleasure, is so devastating.
And one of the worries is that, as with all other kinds of porn, it gets shorn from the responsibilities of doing it in a nuanced and careful way, when you're just, like, optimizing it for pleasure.
So, I mean, if you want to be really moral, you have to pay attention to nuance, you have to pay attention to people's feelings.
But if you're just in it for the pleasure...
Of outrage, then you want to do something else.
You want to tune it for max pleasure.
And tuning it for max pleasure involves making it simple, making it easy to access, making it un-nuanced, making it uncomplicated, making it into moral candy.
Yeah, that seems like very much an idea for the modern age where...
So much of people's lives is conducted online and in this virtual, not unreal kind of sense and is often performative to some degree.
So when I heard about...
This idea, I immediately thought of so many instances of, let's see, moral grandstanding, which is public shaming and that kind of online activism, which becomes a kind of slacktivism.
So I'm extending from your idea here.
I know it's not exactly what you were talking about, but it feels like those are also things that can sometimes be a facsimile of the real thing, which is quite time-consuming and difficult and frustrating, but are done.
Just really for the pleasure of it.
I mean, you might think there's a slight difference.
So moral grandstanding is like using expressions of morality for status, and moral porn is using expressions of morality for pleasure.
So they have slightly different purposes, but they share a similar structure, which is you're not supposed to use morality for...
Pleasure or status, right?
You're supposed to use it to be good.
This is what I think is happening with many expressions of morality.
I wish I could go back in time and change one thing.
A lot of people read this stuff and immediately are like, oh, this only applies to expressions of outrage.
I have expressions of civility and calm and connection.
That shit is just as pornifiable, right?
If you offer centrist expressions of, like, let's all get together, let's be civil and kind to each other, enough with this moralism,
and you do it to feel smug towards, say, the radical left...
It's just as pornified.
I noticed that whenever this concept came up and people were debating it online, they kept conflating things when they heard you say that.
They think you're saying civility doesn't matter,
like we don't need to be nice to people,
when civility is important.
And they avoid the fact that the qualifier porn is there, right?
That's the whole point, that you're not denigrating civility and being respectful to people.
You're denigrating the indulgence of it outside of its purpose.
Well, I'd say when it's done in a performative way, Chris, we've both seen instances of how it's done in a very elaborate way, where it's a way of showing, oh, look how much credit I'm giving.
Look how open-minded I'm being.
Yeah.
I think you want to distinguish between the performative and the hedonistic, right?
And I think they might go together.
So I think the moral grandstanding stuff may be more performative.
You're doing it...
More to look like you're moral than to actually be moral.
And the hedonistic is you're engaging it for your own pleasure.
And my suspicion is that often a lot of the producers of moral outrage porn would be described as performative.
And a lot of the receivers, the audience, are engaged in it for hedonistic reasons.
And these may be parasitic with each other.
Yeah, that makes a lot of sense to me.
Yeah, fascinating.
So one thing that came up when you were describing that is the concept of a super stimulus, that in the modern environment we have things that activate our moral senses or desires, but they're kind of super attractors for it.
And, you know, there's lots of examples from evolutionary biology applying to other animals.
And so I was wondering...
Is that inherent to it, that for something to be porn there's a super stimulus aspect to it?
Or does that not need to be there for the concept?
So, I don't think it's necessary, but it's a common result of a certain functional relationship.
So, you can use...
Anything as porn, right?
So Michael Rea made this point.
People can exchange intimate sexual pictures as part of the relationship, and then someone else grabs it and uses it as porn, right?
And I think the same thing is true of moral outrage porn, which is...
I mean, I could act...
The situation is really complicated.
Like you might think that someone could sincerely tweet a genuine expression of moral outrage from a really morally difficult situation.
And other people could use or retweet it for porn, right?
And so, but the thing that I'm really worried about is, when you use something
for porn, you're trying to use it for pleasure.
So if people start consciously producing it, then they'll want to optimize the pleasurable aspect.
So I keep coming back to food in everything.
We evolved to want sugar and salt for a fine reason.
In the environment of evolutionary adaptedness, they were moderately correlated with nutrition.
We get pleasure from those things.
The moment someone can profit off of giving us pleasure, then they're not going to target the original function.
They're not going to target nutrition.
They're going to target the thing that gives us pleasure.
So if there's any wiggle room, you should expect companies that make money off of selling food to exaggerate the super stimulus part that gives us pleasure.
I suspect the same thing.
If you're peddling moral outrage porn, and especially if that gives you power, then you're going to...
If you have an audience of people who have started to get used to and want and crave pleasurable moral stimulation, then you'll have reason to exaggerate whatever parts of it are pleasurable.
Like, whatever will give people the sensations of confidence or smugness or any of that.
Now, the point where your work really became super interesting and relevant for me was...
When you moved into looking at the sort of stuff you cover in your manuscript, The Seductions of Clarity, which, correct me if I'm wrong, seems like a cognitive parallel to the outrage porn in that it's,
well, I'll let you describe it, but in my fuzzy understanding, it seems that you're talking about how actors can focus on giving the feeling and impression of insight, that aha
kind of feeling, which can actually be a substitute for the real thing.
Yeah, exactly right.
So, I mean, you all should help me because I shuffled a bunch of papers at you.
For people in the audience, these two did this enormously, like, insane thing of reading this whole pile of my papers for no reason.
And they're all related to each other in ways that are really hard for me to say.
And I'm actually currently trying to write a book about it, but I'm having a little bit of trouble saying what that center is.
I can say it in technical philosophical language.
I'll do that later.
It's gross.
So, but no, this is exactly right.
So this is, there's a separation between the signal and the actual content.
So here's the idea of the seductions of clarity.
So I mean something specific by clarity.
I mean the feeling we get, right?
So Alison Gopnik, a psychologist who studies this stuff, has this paper called...
Explanation as Orgasm.
And it's this moment of like, she's trying to talk about that cognitive moment of like epiphany, like, aha, like, I get it.
And everything falls into place.
And it feels good, right?
So what I'm thinking is, look, we're limited beings.
We can't do everything.
We need to know what to pay attention to and what not to pay attention to.
And so...
My claim, which I've tried to integrate with the psychological literature, and you can tell me if the integration was good, but I think I have some backing from the cognitive sciences and psychology.
My claim is that we use the feeling of clarity, the sensation of understanding as a guide to when to stop thinking.
When you get that aha moment, right?
That feels right.
Then you're like, oh, I get it.
So you stop thinking about it and you start thinking about...
Something else, right?
So we use the feeling of understanding as a heuristic for terminating inquiries.
So if that's true, then it would be really valuable for anybody who wanted to manipulate our beliefs to game that feeling.
So if you could fake the feeling of understanding, then you could get control of people's attention, what they paid attention to, right?
I think in the paper I have this analogy of, like, look, stage magicians actually, what they train in is to make the hand that's actually doing the work look boring and the hand that's not doing the work really interesting, to send people's attention away.
A signal of boringness is an invisibility cloak.
And so if you can manipulate people's feeling that they understand, then you can cloak things behind a similar cognitive invisibility cloak.
So there's actually a really useful description from folks in the philosophy of science.
They ask, like, so what is it to understand something?
Like, to really understand, to, like, the real thing, not the feeling.
And they say, look, understanding is not just knowing separate facts.
It's having all the facts cohere together in a usable way.
When you understand something, you have a model that connects things and makes them coherent.
That model is usable to generate new explanations and actions, and it's easily communicable.
In this paper, what I was saying was, look, so if you want to game this, if you want to fake this, then you want to fake the feeling of coherence, you want to fake the feeling of usability, and you want to fake the feeling of being able to communicate things easily.
How do you do that?
Conspiracy theories are one.
Actually, in the paper, I think bureaucracies are another.
I think you two are far more interested in the conspiracy theories, but I think it's just as applicable to bureaucracy.
Bureaucracies are there in everyone's lives as well, in those intricate webs they weave.
But the thing that those points make me think about is, you know, a lot of the people that we look at, the gurus, they actually do create these extremely elaborate, interlocked series of narratives and theoretical frameworks about how the world works,
the reasons that they are disparaged by people, and also broader...
Often civilisation-sweeping narratives about how the issue of trans bathroom access will relate to the downfall of Western civilisation.
What it strikes me as, you know, when, for example, Jordan Peterson's devoted fans are saying to people when they criticize him that you're not understanding him in context and you haven't looked at the lectures where he connects these ideas and gives a more nuanced understanding.
You know, it's sometimes presented as that's them being disingenuous, right?
They'll never be satisfied, which may be the case.
But I think part of it is...
More related to the point that you're making.
They are swimming in these dense networks of symbols and connections and narratives.
And when people come in and poke holes in it, it doesn't really work, because they have a whole elaborate network, and taking out one part of it just feels like it's barely making a dent.
Well, I mean, the thing that it reminds me of is when you
go into the conspiratorial communities.
You so often hear them say, you have to go do the research.
If you don't get this, then go do the research.
That's like a slogan.
And I think you're right, Chris, they really do mean that in the sense that this thing might seem silly on the surface, but when you've done all of the reading that we have and you've accumulated this vast, complex infrastructure, then it makes sense.
So I have a question, though, that relates to that, and then I'll shut up.
So there is an aspect of that where that's a reasonable thing to do.
You know, when you have genuine expertise in the topic and somebody comes and says, well, I think this, you know, I have this opinion on immigration.
But have you read anything about immigration or the policies, the statistics?
And genuine people say, you know, you need to do the research.
So a question is how to distinguish those requests.
Yeah.
I mean, a lot of the work I'm doing is fighting this view that, say, people in echo chambers, people in the alt-right are, like, unthinking or intellectually lazy.
It seems to me like the opposite.
They're like hyper-intellectual.
In fact, sometimes it's almost about being like too attracted to the pleasures of intellectual power.
And what I mean is something like – so when I was an undergraduate, I had an English professor, Richard Marius, and he said – Something that I've always remembered.
And he said, we were reading Thomas Pynchon, which is all about real epiphanies and fake epiphanies in The Crying of Lot 49. Something you definitely read as an undergraduate.
And he said, well, he had this lovely southern accent.
He said, well, I've always thought that the pleasure of mystery novels was like the pleasure of religion.
Everything that seems disconnected stands revealed as having some kind of perfect order.
And a lot of the times, the pleasures of these networks you're talking about remind me of the kind of like some of the vast fantasy novels I read where like I'm reading Brandon Sanderson right now.
And the thing about Brandon Sanderson is there's all these like cool hints and things.
And in the end, like there is an order, like everything makes sense.
And that's so pleasurable, right?
So one thing that I think is going on is if you compare what it's like to be a real scientist – by the way, I just read this marvelous book, a popular book on the philosophy of science from Michael Strevens called The Knowledge Machine.
And one of the things it talks about is, look, what you're aiming for, what you hope for, is total coherence, right?
That's the long-term goal.
But as long as you're getting hit with all this other evidence that doesn't fit, you have to be in the uncomfortable position of saying, we don't know yet.
No theory we have works perfectly.
It would be nicer if we had a theory that fit perfectly, but we're still waiting.
I think that takes a certain, I don't know, something.
Where with this stuff, you're like, no, no, we've got it.
And one of the things I was trying to talk about in this paper is that...
It seems to me that a lot of these theories are made to be easily applicable, so they constantly give you the sensation of intellectual power and understanding.
Because if you can generate explanations for anything easily, then you feel like it's not that you're unintellectual, you're getting confirmation of your own intellectual powers.
Yes.
Yeah, no, I think that's true, and it gels with what I know about the literature on this.
One random example is that it's actually people who are more open to experience, more intellectually curious, who tend to succumb to conspiracy theories or various other belief systems.
And so the fact that they're so elaborate and, you know, Byzantine and their complexity is part of the appeal.
Because it is like saccharine.
It's intellectually pleasurable in that sense.
But, you know, I've never really got my head around this apparent contradiction, which seems to be that on one hand, there's one simple explanation for everything.
As you said, this huge amount of explanatory power.
Who did it?
Well, it's the New World Order and the Illuminati or whatever, or the Jews or God, you know, depending on what your theory is.
Or all of them.
Or all of them.
If you're Alex Jones, yeah.
So from one point of view, it's extremely simple.
Because everything comes back to the one thing.
But on the other hand, it's also extremely complicated and elaborate.
But I think the key thing that you said is that it has huge explanatory power.
Any new thing that comes along can be explained quite easily.
Yeah, that's a really interesting observation.
I mean, I haven't spent as much time as you two looking at the particulars of the current gurus, but my guess would be something like: the relationship between a single core idea and a really complex application is something that would both give you pleasure,
because you would tie it back to the single central idea, but also the sensation of power.
As long as the idea is complicated, but within your grasp, then you get to have the feeling of intellectual power.
I think it's interesting to compare it to the kind of knowledge that is real but unsatisfying.
Which is the kind of stuff that I feel like I have, right, in psychology, right?
Which is, it's a mess.
You know, there's a bunch of different explanations and theories for various things.
None of it fits together very well.
So it's a very unsatisfying state of affairs.
And I think that's often the case for real knowledge.
The ratio of hard work you have to do to the satisfaction you get from having it is, yeah.
The thing it makes me think about is that, you know, the kind of guru people that we look at, I think they do exactly what you're talking about, where they expand their explanatory ideas across all these fields that they're not experts in to showcase their ability,
how insightful their worldview is.
But the opposite of that is that as academics become more specialized and more proficient in a particular area, they tend to become less willing.
To venture grand opinions about fields that they don't know about.
So it's a kind of inverse relationship.
And of course, there's plenty of mainstream academics who do venture grand opinions.
But I think in general, there's that knowledge amongst academics that...
Becoming highly proficient in a certain field makes you like this uber nerd about a topic that you recognize is extremely niche.
And that seems the opposite dynamic.
I think you're right.
The sort of graduate student in a field is generally far more certain of themselves than a professor.
I think this is super interesting.
This is weird because this connects to another part of my research that seems totally unconnected, but I think is weirdly connected.
So if you look at the history of philosophy, so the history of philosophy in the modern era has this fetish for intellectual autonomy.
Right?
For like, think it through yourself.
You can think independently and you can understand everything.
One of my own life-changing experiences was reading this book from a philosopher, Elijah Milgram, called The Great Endarkenment.
And he has the suggestion that the Great Enlightenment undermined itself.
It said, think for yourself.
And that created the sciences.
And the sciences were so vast and so enormous that it's now impossible for us to think for ourselves, right?
We have to trust experts.
So, I mean, we're dependent on expertise.
So, I was asking my wife about this.
She's a chemist.
And I'm like, so how far away in chemistry do you have to go before things are basically incomprehensible to you?
And she was like, I'm in organic chemistry.
It's not just inorganic chemists I can't understand.
It's like any sub-sub specialty right next door.
I have no fucking clue what's going on.
So, this is really painful.
And I think...
One of the things that happens with a lot of the figures you're talking about is they actually offer this fantasy of being able to be back in the time when you can understand it all.
Yes.
And weirdly, that's built into intellectual...
If you listen to most...
Philosophers, scientists, they're like, it's so important to think for yourself.
And then we're sitting here being like, but hold on a second.
I don't know the math behind climate change science.
Yeah.
Look, I love that observation.
I think that is fantastic.
I mean, it's so true about the absolute need for trust that just comes out of specialization these days.
I've published in the journal Vaccine, in Human Vaccines & Immunotherapeutics, and in various public health and epidemiological journals, right?
I don't have any hot takes on COVID or how the vaccine rollout and various things are going, right?
Because the expertise is so narrow these days.
Just because we're covering such a broad range of technologies and forms of knowledge, that's a horribly unsatisfying state of affairs, as you said.
And the gurus offer this polymath...
An ability to draw it all together, which is so satisfying.
I've published in a philosophy journal.
That's clearly an error on someone's part.
So one more thing I want to say.
I think at the end of the Seductions of Clarity paper, I'm trying to figure out what we can do about this.
And the thing that I end up saying is something like, at some point, I think a lot of us have this, I mean, go back to the food analogy.
I spent a lot of time just, like, stuffing my face with, like, the crappiest chips.
And at some point, you're like, okay, this stuff is not good.
Someone has engineered this stuff to be addictive.
And you get it.
You evolve this sense of, like, no, that's a little too fucking yummy.
That's a little too salty and savory and fried.
And maybe I'll indulge once in a while, but I need to be suspicious because that shit was engineered.
And I think there's something similar where I think...
The actual intellectual life, when you're exposed to the complexity and difficulty, is painful and humiliating.
It's fucking awful.
And you almost have to...
It's like learning a taste for kale.
You have to cultivate it in yourself.
That makes you morally better.
No, it means you're not fucked.
Yeah, I think that's excellent.
I've actually said something similar, which is that...
You know, you have to become a little bit suspicious.
When it feels too appealing, when it sounds, oh, that sounds right, that's got to be true.
And, you know, we all have that feeling a lot of the time.
In this conversation right now, a lot of the time someone is saying something and, oh yeah, that sounds completely right.
We should always be suspicious.
Of that feeling, hey?
You know, with a lot of the gurus, when we listen to their content and we don't break it down, we just let it wash over us, we often say that it feels really satisfying in the moment.
And you can kind of, you know, follow along the connections where they're going.
Jordan Peterson is great at it, giving these extended metaphors and analogies, layering them on top of each other and connecting them to grand narratives.
And it feels satisfying in the moment.
But then when you...
Take time like what we do in the podcast.
And you stop and say, okay, so what was the argument made here?
And what's the evidence for what they're claiming?
And it very quickly becomes apparent that it's a lot of emptiness.
You know, you can spend 15 minutes describing this idea, which actually would only take two sentences to explain.
So there's a...
Genuine sense in which your analogy to junk food, that it's super satisfying and we enjoy it in the moment, but afterwards we look at what we've done and feel kind of ashamed of ourselves, really applies.
Yeah.
Yeah, my take on that, and I sign off on that totally, but another thing I've thought about that phenomenon, Chris, about how...
That feeling of washing over you.
And it does, it kind of feels right, you know, like the analogies are evocative, the connections kind of seem all right.
But then when you stop and think about it, and what we're doing, Chris, is we analyse it, like we're adopting an analytic frame of mind where we actually go, okay, hang on, does that follow from that?
Does that actually make, or is that a contradiction with this other thing and so on?
But I think that gurus largely rely on intuitive information processing.
As long as you sit back and just let it wash over you, that's kind of...
That intuitive feeling of it feeling right.
And I think those are ideas from cognitive psychology, which I think are helpful here.
That's not the way they editorialize it, though.
Especially the people we look at tend to invoke that they are practicing real science and doing it in an analytical, scientific, rationalist way.
But I think we all know that people who claim the mantle of rationalism and science, and claim not to be tribalistic, often tend to be exactly that.
Let me ask you about something, because this is something I've been puzzling over.
So when I wrote this thing about echo chambers and how they manipulate trust, I had this basic image where echo chambers are structures where you're told to distrust everyone on the outside.
And I kind of said...
And you kind of trust everyone on the inside.
And I was thinking about this, having spent a lot of time on the online echo chambers that I've looked at.
By the way, to anyone listening, I just have to say, I think it's really important...
To distinguish between filter bubbles and echo chambers, because people have been confusing these two notions.
A filter bubble is when you don't hear the other side, and an echo chamber is when you don't trust the other side.
The original research into echo chambers was about trust structures, and lately people have bundled them all together, and there are all these disproofs that, oh, echo chambers don't exist, but they're all talking about filter bubbles.
They're all talking about how you hear people on the other side.
Climate change deniers know all the other arguments on the other side.
They just don't trust them.
And it's a great point because I think the notion that people aren't hearing the other side is just wrong.
It's often clearly wrong because they spend all day often talking about the other side.
Right.
Obsessively discrediting the other side.
Yeah.
Well, I want to pick up on that echo chambers stuff because it really closely ties to one of the key features of our gurus.
We actually describe it as anti-establishmentarianism.
But we have another feature where we talk about their cultish in-group, out-group dynamics.
But if you actually sort of put those two things together, right, for a lot of gurus, the out-group is everyone else.
It can be like another political side, but a lot of the people we follow are not really political.
Political stuff is not their main game.
Their out-group is the institutions, the experts, the establishment.
Academics like us are definitely in the out-group.
And so the establishment is presented by the gurus as being hopelessly corrupted by incentives, by groupthink, by ideologies, etc.
And therefore, you really can't trust them.
And they spend an awful lot of their time undermining all other sources of knowledge while building themselves and sometimes their friends up.
So that really strikes me as a very, now I'll probably get the two terms mixed up, echo chamber-y thing to do.
Did I get it wrong?
No, you got it right.
You got it right.
So I think I have this image of...
Echo chambers as ones where everyone on the inside trusted each other and distrusted everyone on the outside world.
And then Joshua DiPaolo, another philosopher, a friend of mine, wrote a critical paper about it where he points out, look, there's another option that also counts as an echo chamber, which is when the leaders make the people inside the echo chamber distrust themselves.
Not just the outside world, but also distrust themselves and create this total vacuum where the only person you trust is the leader.
So now it seems to me like there are two echo chamber structures you could have.
One is everyone in the echo chamber trusts everyone else in the echo chamber and each other, but especially the leaders.
The other one is everyone in the echo chamber distrusts everyone on the outside and themselves.
They've been taught to think that they themselves are dumb.
They only trust the leader.
So Josh pointed out that the second structure is actually characteristic of a lot of older religious cults.
And I haven't seen that structure a lot in the new online world.
I tend to see this thing where leaders pump up the confidence of the followers.
And I'm not sure.
And I wanted to ask you because you two follow this stuff better than me.
My sense is that it has something to do with the structure of online recruiting, that giving people a sense of pleasure and confidence is a better way to snag people online, and it's harder to use the self-hate approach.
I think the self-hate methodology was the methodology that cults used when they could isolate people and take them out to a compound.
But now you need something else.
You need a sugary bait to put on the internet.
So my sense is that the methodology of cult building has slightly changed to this more hedonistic, satisfying, pump up the ego of your audience thing on the online world.
Yeah, I think you're largely right about that, that it is more carrot than stick.
There's certainly an awful lot of flattery being done of followers by gurus.
On the other hand, you do see some tricks.
The thing that we called the Emperor's New Clothes Maneuver, where they will say things like, now look, I think this is going to be too complicated for a lot of you.
I respect your intelligence and I think some of you might be able to get it.
And then they proceed to say something quite absurd and outrageous.
So nobody wants to be the person that doesn't get it, right?
So we do see it. We interviewed someone who had spent a fair bit of time in Eric Weinstein's Discord server, and he told us a fair bit about some of the
follower management that was done.
There were certainly in-groups and out-groups within the Discord where the people who were more loyal, more on board with the message, were definitely treated more favourably than the people who asked irritating questions and challenged things and thought for themselves too much.
So that's my take.
How about you, Chris?
Yeah, so I'm echoing probably some of the things that you're saying, but basically I agree that there's a lot of flattery along the lines of: you are the guys who can look at these topics with nuance and complexity, and you're interested in scientific approaches.
But there's also this element where it depends on the guru, but some of them really go along the lines of indulging in almost like Star Trek style techno babble.
about specific mathematical or scientific topics, where the way that they illustrate the concepts feels like it's there to illustrate their own intelligence.
Yes, but the other thing it does is highlight to the followers that they're invited into the club.
To be a part and to watch.
And maybe they'll get to that level.
But there's such a vast difference between where you are and where the guru is that if you are going to chastise them for not knowing things, you really better do your homework.
And in their internal communities, like the Discords and the Patreon groups and these kind of things...
I think there is a greater chance for the old-style dynamics, kind of the things Matt is talking about with community management, threatening to withdraw access or negging people, right?
So I think both dynamics are probably at play, but it just depends what part of the network you're looking at or how deep you're into the gurus.
This makes me think something.
So there's this...
Paper draft.
I haven't even sent it to you, but it's about different kinds of epistemic traps.
And I start – one of them I want to call like a deference trap.
Like it's like stop thinking for yourself.
Just believe this thing.
Just believe me.
And another one I want to call an inquiry trap, which is like a belief system that lets you do a lot of intellectual inquiry but like subtly sends it down the wrong channels and like redirects your trust settings in various ways.
The inquiry trap works because it gives people the sensation of being intellectually autonomous and gives people, like, the feeling of power.
But in order for it to be a trap, it has to end up in the same place.
So if you're building something like this, you need to, like, do...
You can't...
It's like you want to give people the feeling of intellectual autonomy and power, but you don't want to actually give them real intellectual autonomy, or they'll leave and not follow you.
So you have to build this weird – I mean, this feels, again, like a magician's sleight of hand.
And listening to the two of you talk about this push and pull – I mean, I'm really – I need to listen to this stuff because I'm really interested in seeing – hearing the experiences of people in the current kind of online thing.
The more cultish thing.
And it does seem like it sounds like the thing you're describing sounds like the technique you'd need to fake giving people intellectual autonomy and then subtly not give it to them.
Matt and I have often commented, like, I've described it as people acting as if words are magic.
And what I mean by that is when somebody says...
I'm not advocating a conspiracy theory, or, I'm not just going to pat myself on the back.
And then they do exactly that.
They proceed for 30 or 40 minutes to outline a conspiracy theory that's sweeping and makes all these large disparaging claims about entire fields.
The fact that they added in the disclaimers
really works, because if you point out, well, that guy was advancing conspiracy theories,
People come back and say, no, did you not hear?
They said that they're not doing that.
And they added disclaimers at the end saying they're not entirely sure.
Maybe they got some things wrong.
But it feels to me that that's not epistemic humility.
It's not genuine.
It's a kind of covering your ass.
You know, if you had real epistemic humility...
Humble.
Yeah, humble.
That's what I'm looking for.
You wouldn't then spend the hour on the claims and only two minutes on disclaimers.
It would be the inverse structure.
So, yeah.
My take on what you said, Thi, was that perhaps that problem of managing where the independent inquiry goes to, the end point, is not...
Such a big problem for the gurus or for these communities a lot of the time.
So if you take something like the COVID conspiracies around...
Everyone wants to blame China, right?
So they will be naturally drawn there.
Or around climate change.
They naturally just want to deny that it's happening, right?
So I feel like the kind of gurus who exploit these...
These issues are pandering to a pre-existing prejudice that is widely held and I guess helping them along with the very complex intellectual rationalisation for what they wanted to believe in the first place.
Does that make sense?
Yeah, I mean, one thing that unites a lot of this stuff is all of these tricks and tactics, like the moral outrage porn stuff, the fake clarity stuff, it's all dirty tactics you would use if you don't actually care about the truth.
So it's all playing up the symptoms of the truth and then making maneuvers that don't actually require loving the truth or giving a crap about the truth.
Only fronting as much as it's useful.
Yeah, well, I mean, people have talked about Trump in exactly the same way, as having a complete disregard for the truth.
So this is what they call bullshitting, right?
It's not that you care about the truth and deliberately want to deceive people.
It's just completely having no regard for it whatsoever.
So people have described that, in Trump's case, as a superpower.
Because it gives you suddenly so many more degrees of freedom with which to optimize your persuasive tactics.
Is that a fair summary of what you were saying?
Or not a summary?
Yeah, I mean, the analogy that comes to mind is, again, it's easy to make things delicious if you don't care if they're nutritious.
That's totally easy.
Like, it's really, really hard.
The balance of nutrition and deliciousness and goodness is tough and requires other sacrifices.
Let me actually float a weird theory, from the other side of my brain, that just came to mind about this really interesting balance between trying to give your followers the feeling of freedom and not actually giving it to them.
I don't think we've talked about this yet, but the other half of my philosophical life is about understanding games and the philosophy of games.
And I find something really...
They've started to collide, and I think there's a really interesting similarity.
So, for me, one of the things that makes games incredibly pleasurable is they offer a completely...
Clear sense of value and purpose.
Like normal life is like full of these incredibly conflicting plural values and they're hard to apply.
And then a game, you know exactly what you're doing.
You know exactly where you stand.
All the values like fit into one economic point system and things are clear.
It's like this relief.
In particular, I think one of – and this relates to some of the conspiracy theory stuff.
I think in our actual lives, trying to get things done is very rarely pleasurable, because problems are either so vast and overwhelming that our abilities don't fit, or so boring that they're easy things we have to do over and over again,
and we want to shoot ourselves.
But games are like engineered environments to make the process of...
Thinking or doing whatever just fit.
Your ability just fit.
It's a world of practical struggle where the struggles are engineered to feel good.
There's this article that everyone's been sending me about how a game designer says QAnon is like a game.
It seems like what you're doing is creating this game-like puzzle experience.
The thing about games is, unlike, say, science, the puzzles are hard, but they're...
Built for people to solve.
And you can do that because you have a lot of free play in the game to redesign the environment and the abilities.
And I kind of think that if you're out there to build pleasurable, candy-like intellectual belief systems, you want to make them hard,
But within human capacity.
So the weird connection was something like, and you know what else game designers are really good at?
They're really good at giving you the feeling of freedom and yet steering your action down a pre-channelized path, right?
Like game designers are masters of, oh, you feel free, but you're going to end up at this next cutscene anyway.
And I just wonder, like, maybe the analogy goes to that next level too.
Like, being able to create a choice environment.
I mean, this is like the Nudge stuff.
So when I started researching this stuff, everyone was talking about how games are good because they're like fiction or they tell stories or like movies.
And I'm like, no, they're more like cities or governments.
They like are these choice spaces full of nudges to get people to go in certain ways.
And I feel like intellectually, a lot of these conspiracy-ish theories have the same like, you're free, but hey, somehow we've constructed it so you end up in this place.
Yeah.
Well, look, I think...
One angle of it too, though, is the richness and complexity of the space of the game or the conspiracy theory.
So there's lots and lots of space for people to do their own research and to come up with their own little insights and elaborations and make their own connections.
And it's...
It's challenging but not too challenging in the same way that you're talking about with games.
And when you were talking, I was reminded of the work I've done on complementary and alternative medicines.
So this field of alternative health treatments, ranging from homeopathy to energy therapies to kinesiology, there are just so many.
And it has some similar things to what you described.
Like, it's an extraordinarily rich and interesting landscape for an interested person to explore.
I think, like a conspiracy theory, it taps into some fundamental anxieties that people have, perhaps even existential ones, about health in the case of alternative medicines.
And with conspiracy theories, there's often an underlying kind of thing that they tap into that's quite different.
So you have scientific medicine.
Which is boring and difficult and technical.
It doesn't have any of these satisfying properties.
And then you have this sort of alternate version, which any interested person can quite easily feel like they're making a lot of progress in mastering and understanding.
And yeah, anyway, I just felt like it was an interesting parallel.
I'm sure you're used to this, Thi, but because I have some history and interest in games, I really liked your discussions and your work on gamification.
But I think like many people who have played games, I'm also inevitably thinking, well, what about that counter example that doesn't really exactly fit?
So in a contrarian way, I was thinking about Minecraft, where part of the appeal is that the goals are...
Although there is a game there, a survival game, the reality is that most people play it in an open-ended way.
So I'd be interested to hear your thought about that as a nitpicky thing.
But the other thing is that thinking about that, when you pointed out that...
You're allowed to play, but there are actual hidden constraints.
And it seems like you have endless opportunities, but really you don't.
And that also fits the Minecraft analogy where you can do incredible things.
You can rebuild, you know, the Star Trek Enterprise, if you want, and walk around all the nacelles.
But you're still ultimately trapped in voxels.
And yeah, so I think the metaphor sits really nicely.
But I'd be interested, because there are games that are popular which seem to have rather open-ended reward systems now.
I mean, so one thing I can say is the thing that I'm analyzing is constructed systems in which you have a clear goal.
And clear rules that constrain how you get to that goal.
Not all things that are called games in our natural language are like that.
So I think chess is like that, right?
Another thing to think is you have to be really careful.
Now I'm just putting on my philosophy of games hat.
You have to be really careful to distinguish the software environment from the game.
And you can play different games with the same software environment.
So you can play Super Mario Bros., or you can speedrun Super Mario Bros., which is a different game with a different goal, played on the same software.
You can play World of Warcraft for experience points in gold, or you can go to socialize in the environment.
Then I think you're doing something slightly different.
So one thing I think is there are some things that people call video games.
But really, they're more like toys.
They're more open-ended.
They're like structures for play.
And I think a lot of the times, a lot of modern games are made, a lot of modern video games are made so you can engage within it, engage with it with this clear goal, or you could just play around in the environment as a toy.
And that's like different activities supported by the same thing.
So I think like you have to be careful there.
It's almost as if you thought about this topic.
For eight years of my life.
But there is a point related to that.
You've talked about the gamification of Twitter or other social networks, harvesting likes and retweets.
And I think nobody is immune from the reward dynamics that are in play there.
But one thing that struck me about that when I was listening to you talk about that.
Is that when I look at the case of James Lindsay, who's a super stimulus in the guru sphere at the minute, because he's burning brightly as somebody who, whatever you thought of him, was seen to be on the...
kind of legitimate side of things to some extent, right?
That he might be obnoxious and whatnot, but he has some legitimate arguments that he makes.
But in recent months, famously since Trump retweeted him, he's...
Leaned much more into complete right-wing partisanship, retweeting people from Infowars, endorsing voter ballot conspiracies, and doing things that, you know, if you are a secular, rational atheist concerned about science, you don't do, like promoting coronavirus conspiracies,
which he does.
So when I've looked at that, one of the things that keeps coming up when people are discussing it is, you know, to what extent does he...
Believe the things he's doing, and to what extent is he just engaging in harvesting followers or playing to a certain audience?
And I'd be really interested to hear your view on that.
From my perspective, it looks like...
It's a little bit of, you know, column A and column B, the academic's eternal answer: that he is intentionally doing things to garner controversy and pander to an audience.
But at the same time, he seems genuinely to have bought into a whole ecosystem of ideas that are not his own, that are about, you know, the Great Replacement or the Great Reset and about Soros' influence that predate him.
And that kind of co-opt his agenda to some extent.
That's a lot of things, but I'd be really interested to hear your opinion on any of that.
I can give you some ideas and you can apply them to Lindsay, because I honestly can't stomach following him.
You have more stomach for following him than I do.
So I have no actual evidence about him himself, more than, like...
Flybys on Twitter.
So Thi has already proved himself far more emotionally aware and stable and healthy than the two of us, Chris.
Well, I will add: the Gurus account only follows the people that we cover on the show.
So it's only got like nine or ten people.
And basically the reason I see his tweets, because he blocked me long ago,
is that our Decoding the Gurus account is basically his Twitter feed, because he tweets so prolifically.
But after your appearance, you and ContraPoints will dilute the stream.
So that'll be nice at least.
Sorry to interrupt you there.
Go ahead.
Yes, sorry.
Okay.
So let me step back for a bit and vomit some stuff on gamification and then we can try to think about how it...
Connects to the situation.
So there's a standard thought out there that something like games are good, so gamification is good.
This is what, like, Jane McGonigal, the gamification booster, says.
I think actually if you understand why games work, you'll understand why gamification is terrible.
And the reason is that games offer you this wonderful value clarity of a simple artificially clear goal, but they do so in a secluded environment where you pursue it.
Away from the rest of the world and where the goal isn't connected to the rest of the world.
When you gamify ordinary activity, to get that pleasure, you have to simplify the goals in real life, right?
Like in some sense, it doesn't matter who I kill in like Dota or whatever, right?
But what I say on Twitter matters.
So the worry is that when you gamify an activity, so I worry about like Fitbit and Twitter.
When you gamify Twitter, to make that exciting, you have to change what you care about from whatever rich and natural goals you have to just whatever the points measure.
It's only thrilling to watch the points come up if you care about them.
So this is part of this larger phenomenon.
What I'm actually trying to write about right now is this larger phenomenon I'm calling value capture, which is when you have rich and subtle values, you get put in an environment with really simple, often quantified versions of them, and then they drift into you.
And they start to take over.
And I mean things like for academics, like citation rates, right?
Or like the status.
There's a really interesting example in, like, what the U.S. News and World Report law school rankings do.
Like it seems like everyone in that system gets value captured and they just start caring about moving up this clear ranking.
And there's this weird sense in which it seems like, I haven't quite figured out how
It works exactly, but it does look like you have this promise of pleasure.
If you align your values with whatever this thing, whatever this point system is pounding out, then suddenly you get these huge bursts of pleasure.
And so, this seems reminiscent in a way of what you're talking about, about Lindsay, but again, I don't know.
What I would imagine is that suddenly you get this huge burst of points for doing a certain kind of action, right?
And then if you continue that, actually, you get more points.
And I mean, I'm not a psychologist.
I don't know how incentives change belief systems.
But I definitely like...
I've been on Twitter for a while and I can feel the pull.
To me, it feels like sometimes I'm on Twitter and then some tweets do really well.
I try some tweets that are really about what I like, what I care about, and they're like, you know, whatever.
And you say some zippy, peppy thing and then it explodes.
And then you can start feeling your brain reorient around saying things like that.
And I don't even know...
I mean, I try to pull back.
Like, I can recognize it because of, like, my background.
One of the reasons I write about games and game addiction is that I've lost years of my life to certain games.
Like, Civilization 2, 3, and 4, which I'm never allowed to touch again.
Good games.
Good games.
I didn't say that.
Yeah, good games.
Yeah, yeah.
I know.
They're good.
I can't touch them.
But, like, I can feel that.
And, like, the way...
The way games can take over your brain is, like, you just start looking...
I'm a climber and sometimes when you climb well and you're in a climbing mind, you just look around and the world is suddenly just like, how would I climb that?
How would I go up that?
And I feel like when the Twitter thing gets its hooks in you, I walk around the world and I'm like, would that be a fun tweet?
Would that be a good tweet?
And it's almost like the thing that I do that's called believing things because they're true is a little bit disengaged.
And the filter I'm looking at the world through is not, is that true?
Yeah.
Does that make a good tweet?
Yeah.
And that creeps me out.
Whenever that happens, I make myself delete Twitter for a while because I can feel it and it fucks me up.
Yeah.
Look, I'll jump in now.
First of all, about Twitter, one thing I noticed is that it's always the worst tweets, the tweets that I'm actually a little bit ashamed of because they're cheap, that do the best.
And when I noticed that, it was a good reminder never to...
You know, never to pay attention to that particular scoreboard.
But, I mean, I think, you know, if I understand your point correctly, you're basically saying that the rewiring is going on in one's brain and, you know, value system such that there isn't really a dichotomy between,
oh, are you bona fide about...
About this stuff, or are you just chasing the thing?
The brain's rewired, so those two things have become conflated.
And when you mention the incentivisation of academics towards citations and the various metrics they apply to us, I mean, you guys probably know the same characters that I did.
There's a couple of famous figures, and I know one of them personally, who ended up publishing...
Like, more than 100 papers a year.
And, you know, just these crazy citation metrics.
Most of it is self-plagiarized and just regurgitating the same thing.
And I can tell you that in his mind, he has definitely conflated his original goal of being, you know, scientifically influential in a genuine way with those metrics.
I mean, look, I will do more autobiography than you probably expected.
But like, at some point, I was like, I was super depressed in philosophy.
And the reason was, I realized, I mean, it's the exact same thing.
I was just looking at ideas and being like, well, that's, is that published?
Will that get in a good journal?
Right?
Is that the kind of thing you get published in this fancy journal?
And again, like, thinking about me then, it's not like I was saying things I thought were false in order to advance professionally.
It's like something had slipped in my brain, and I was just looking at ideas, and the criterion in my head for good ideas was the kind of thing that would get published.
And I was really depressed for a long time because I was writing things I thought were boring.
I actually almost quit the profession and then had this moment where I was like, I can't fucking do this anymore.
One of the interesting things is I ended up writing a blog post on this internal philosophy thing about how I tried to throw all this stuff out of my head and write about things I cared about.
And I got this flood of...
Emails from people, like, all private.
I won't mention the names.
They're all like, oh my god, yeah.
I've forgotten why I got into philosophy.
Like, why am I writing about this boring stuff?
And I'm like, these are philosophers, right?
If anyone's supposed to be fucking resistant to this shit, it's the lovers of wisdom.
Like, why the fuck are you in philosophy if you don't care about ideas?
But somehow, like, even, I mean, even, this will sound weird, like...
Even the philosophers who are supposed to be the best are completely vulnerable to having their belief criteria shifted by institutional metrics and measures.
So, I mean...
I think of myself as fairly intellectually rigorous and careful, and this shit will subvert me in like a second, right?
I feel like I have to be constantly vigilant, and I don't know, I feel like this...
Lately, I've been trying to figure out, I don't know how much we should assign responsibility to people for it.
Sometimes I think, when the entire system pervasively hits you with these points, it's so hard to rely on yourself.
My other field is addiction.
I'm definitely on board with the idea that you can't necessarily blame the individual for their vulnerability to that kind of dopamine reward delivery.
But Chris, sorry.
No, I was just going to say that, you know, the points that you are making, and I think we've discussed also, Matt, on the podcast and offline, relate to the fact that there's plenty of genuine criticisms to be made about institutions and academia and incentive structures,
which are...
There's validity to them.
And that's part of the reason I find it so annoying when what we call anti-establishmentarianism is like, you know, a kind of hollow version of that where they don't actually address things like the citation metrics.
That's not a big focus.
It's mentioned, but just in passing.
And they act as if, if you do not agree with them in their critique of, you know, the establishment, then you are a defender of the status quo and the mainstream establishment.
And that might be valid on some occasions, but it's chucked around so often.
And in my case, it usually doesn't bother me that much because when I see people presenting me as something, like one of the things I get presented as is an advocate for wokeism.
And it's so far from an actual accurate hit on me.
But it doesn't bother me because it just feels, you know, like they're attacking an image that doesn't exist.
But the other point that you made about, you know, pursuing what you're interested in, and how that often is at odds, or seems at odds, at least intrinsically, with institutional metrics and the things...
Even just social metrics like Twitter, yeah.
Oh, yes, yes, both.
But the...
I, you know...
When I saw your talks, the ones that partly made us interested to interview you, it was clear you had a passion for the topic.
And you were talking about it in an academic way, but out of genuine interest.
And that's, to me, the most gripping thing.
And it's not really related to the guru point.
It's more just to echo your personal story about pursuing the things that are interesting and which you feel, you know, are important or have insight.
I think that's really important.
It's not related to conspiracies or anything, but yeah, just saying.
Follow your dreams, kids.
Follow your dreams.
You too can have a podcast.
So, okay, so this is essentially to a question I wanted to ask, which is, so, you know, we talked about those social media gamified incentives.
And so it sort of raises the question of to what degree our gurus are actually, you know, you think of a guru as sort of leading the flock, but to what degree are they driven by their audiences,
you know, that desire to build an audience and keep an audience?
Are they victims?
Yeah, that's really interesting.
So I realize I haven't thought about things from that angle at all.
Mostly I've been thinking, I've been trying to adopt this like, in some sense, super simplistic model just to help me think.
And that super simplistic model is like, imagine you were out to manipulate people and get them to believe this system you wanted.
How would you build the system?
And from that angle...
Yeah.
I was thinking about gamification as a useful tool for a manipulator because if the manipulator is trying to bait people into joining the system for pleasure, gamified systems, the gamification of Twitter really – well, basically it gives – it offers a lot of pleasure for being a part of a large unanimous group.
And so it's a good reinforcement mechanism for getting people to be in a group because you get a ton of likes if you say something that people in the group agree with.
But now you have me thinking this other thing, where there's another possibility, right, where the leaders and the followers evolve together. Like, you can imagine them both chasing pleasure, and the pleasure comes, for the followers, from having a pleasurable system, and, for the leaders, from having uptake.
And so you might think that like you could kind of wander into one of the guru positions.
Not from being this kind of like...
I mean, I think like Steve Bannon, for example, is just like a purely conscious manipulator.
Like, he's building these systems deliberately.
But, like, you can imagine other systems where someone just, like, starts saying things, and people start responding, and they get, like, serotonin-hacked into saying more things.
And right, and so they co-evolve.
And that seems like, I mean, does that seem...
Right, for some of the gurus you're looking at?
Yeah, I mean, that's my opinion.
I would definitely describe it as sort of a co-evolving thing for most of them.
I think there's certainly a few of them that have some strong ideological prior thing that they're looking to convince people of.
And there's some who are political partisans, like that's how they started, like Scott Adams, for instance.
So they already have like a neat...
Audience to sort of talk to.
But I think a lot of them are much more flexible and that they're kind of bullshitters.
It's a bit like Trump's policies, you know.
Really, I don't feel like they come to the table with a strong desire to convince people of something, but rather they interact and co-evolve with their followers.
Yes, yes.
Okay, analogy.
Let's go back to the junk food analogy.
Frito-Lay company doesn't have to be out to make you unhealthy or control you.
They just have to respond to profits.
So they don't have to be aggressively trying to game the gap between nutrition and pleasure.
All they know is that if they do this thing, then they get more money.
And so like functionally, that creates a gaming of the gap between nutrition and pleasure.
But they can just be responding to incentives.
So you might think that someone is just like, oh, my God.
They just say shit, and then some of it takes, because it gives people pleasure, and then they get this huge response.
They're like, I should say shit more like that.
So without aggressively gaming the system, they're pushed to become the kind of people that emit pleasurable, catchy, sticky ideas.
I totally agree with that.
And one of the reasons I agree with it so strongly is because one of these features we've identified again and again is narcissism.
So we're all subject to the pull of attention and praise, right?
But narcissists are really subject to it.
They're almost victims to it.
A large majority of them that we cover, you know, the narcissism is so strong.
The self-aggrandisement is so strong.
And I feel like that makes them particularly vulnerable or particularly incentivized, just like the company you described, because they're just like they're just 100%.
If you're 100% focused on maximizing profits, these guys are 100% focused on maximizing attention.
So someone like yourself might look at it and go, oh, that's a viral tweet, but I'm not very proud of it.
And put that aside, but a narcissist wouldn't be able to do that.
That is so interesting.
I mean, this is why talking across fields, like, I don't know.
We should pat ourselves on the back for, you know, being willing to engage with difficult ideas.
Yeah, it feels so good.
So, okay.
So there's a standard view, I don't know if I believe it, that, like, companies are psychopaths that just...
Aim at profit, because that's the only thing they respond to.
So you might think that, I mean, exactly what you said, like, the more you only respond to praise, the more you will spend all your energies optimizing your thought patterns to get praise.
And the structure of Twitter makes it really good.
For harvesting praise.
And obviously not just Twitter, but YouTube and stuff like that.
All of the modern...
Yeah, all of it.
All the modern, like, you have a channel and likes go up.
I mean, most of the time, your followers like you.
It's almost like, if you could create an environment to optimally evolve
the ultimate, like, emitter of viral ideas...
If you wanted to select for narcissism, filter for the people who are narcissists, and then build up the narcissism amongst those people, then we have that now.
It's awful.
I think I have a related question, T, and I want to get there before I know your time will be running out.
So these ideas, like your talks, I really like them.
Obviously, Matt really likes them.
And I know that a lot of people that have been exposed to them, especially on the left of center or the far left, you know, the left wing in general, tend to find them appealing, right?
Because it hits a lot of buttons.
For one, it's criticizing social networks for the ways their incentives are damaging society and our brains.
And two, it points out a lot of the features within the right wing...
Like conspiracy communities, or right-wing gurus, or even the IDW so-called centrists.
The dynamics that are at play there.
But one thing I wanted to put to you and get your...
I think there's an issue that you're sampling from a biased pool because there's a bigger amount of it on the right.
But I want to ask: do you have any recommendations or suggestions about how people avoid essentially taking the points that you're making and viewing them as, these are things which my opponents and the right wing do,
but which us on the left are generally immune to or less prone to acting on?
And do you think that's true?
Is there an imbalance?
Or is it just our self-serving biases in play?
Right.
So obviously, I'm fairly left.
There are echo chambers and moral outrage porn and seductive clarity on both sides.
I think it's quite asymmetric.
But of course the other side would say, oh no, it's the other way around.
But I don't think it's totally asymmetric.
And I think there are a lot of super...
I mean, when I write this stuff...
I always am hoping to write it so that the experience of someone reading is like, I see it on the other side, but wait, what about me?
And I always try to put little hooks in at the end, because I think it's easy to get someone to take it on board and then turn it only on the other side.
But I think the thing that I'm talking about is I have become really cautious of versions that look like this on the left, and I think I see...
a decent number.
And again, it has that feel.
Here's a really nice theory that explains everything and it makes the other side totally evil.
But one of the dangers here is, I mean, some of this stuff, I think, starts to look like versions of a conspiracy theory.
And one thing that you always have to remember, here's a way in which I'm kind of opposed to a lot of other people who think about conspiracy theories in the academy.
A lot of people want to say that all conspiracy theories are bad.
What I want to say is, no, no.
Conspiracy theories are sometimes good, and they are good when there's actually a real conspiracy.
You know what?
Here's a situation in which you should believe in a conspiracy and believe that the mainstream media is corrupted.
If you're in Nazi Germany and you're looking around, you're being like, there's a conspiracy that's sweeping the world and it's corrupt.
You're actually right.
So one of the things that we have to be careful about is, you can't just say, look, no conspiracy theories, ever.
So the thing is, like a lot of the people on the left, their beliefs about the functioning of capitalism look a lot like a conspiracy theory.
And now we have to do the hard work of not saying, like, well, dismiss all conspiracy theories.
We actually have to figure out which ones are legitimate and which are not.
Okay, so one thing that struck me when you were talking about that, and it's an issue that we come up with, is that some of the gurus we look at, like...
The Weinsteins are always front and center of my mind because they're kind of excellent at this.
But, like, Eric Weinstein talks about responsible conspiracy theorizing and Brett Weinstein talks about conspiracy hypothesizing, not theorizing.
It's just a hypothesis.
And both of them make the point that you just did where they...
Indicate that there are, you know, there was Watergate, there are the dirty tricks of the CIA and attempted blackmails, and there are conspiracies in the real world.
So one thing I'm curious to get your feedback or opinion on is how do we avoid that we basically say...
You know, on the right, they have conspiracies about the postmodern neo-Marxists overtaking academia.
And that's obvious nonsense.
But the left has things that look similar about capitalism, and institutionalized racism could be presented that way as well.
And how do we avoid it just being that we say, well, the conspiracy theories on the right are...
Obviously crazy, but the ones on the left, well, they're in the category of, you know, reasonable ones.
I mean, I'll do you one better.
So let me give you something that I think has the shape of a conspiracy theory, that I probably believe, that I think there's pretty good evidence for.
So if you read the book Dark Money, this is a journalistic investigation of the Koch brothers.
And how they've been spending money to influence politics for the last 20 years.
And it looks like they've been funding various libertarian think tanks.
They're funding scientific ventures that support the progress of big oil, stuff like that.
And it's a story about a long-term informational manipulation for a purpose by a particular elite cabal, this time the Koch brothers.
I've read this thing.
I've checked up on it.
Seems reasonable to me.
I have high credence in it, and it's very explanatory of a lot of weird features that you see.
So, I mean, here's the difficult line to walk.
I mean, when I talk about this 'clarity is seductive' thing, people always say, and I mean, I think it's a good thing to say,
Like, that made sense of everything.
So should I be suspicious of it?
And maybe the right answer is yes, be suspicious.
But it would be too easy if all conspiracy theories were false.
Those stupid people, they believe in conspiracy theories.
That's too easy, right?
That is exactly the earmarks of the thing we're worried about.
Because we know for a fact that some conspiracy theories are true.
So now we get into a much more complicated space.
First of all, we know that some conspiracy theories are true.
Second of all, remember, I have this worry about...
Clarity being seductive by imitating the joys of understanding.
But the other thing is, real understanding also makes things coherent and is joyful.
I mean, you're a scientist.
When you see a good theory, it's not like we should be suspicious of all unifying, coherent experiences that feel pleasurable.
It's that there's a cartoon manipulative version that's imitating closely a real thing, which is, oh my god!
Some theories do make sense of the world, and that feels good.
And so now we're in this incredibly difficult space where we have to carefully separate real conspiracy theories from fake ones, and genuinely pleasurable unifying experiences from cartoon ones with the pleasure slightly amped up.
And that is incredibly hard.
Sometimes I worry that for many of us...
We don't quite have enough information to do it.
That's like my paranoid worry.
Sometimes I worry it's just a matter of luck about which institutions you ended up connected with.
But I mean, it's super hard.
And I'm not, again, I'm not quite sure how to do it.
One of the things we talked about before is there is this signal that certain things are just a little too easy, that they've just been made for pleasure.
But I'm almost worried that a sufficiently clever group could fake that by making it fairly difficult.
But not too difficult.
Like, again, you talk about the labyrinthiness of QAnon, right?
So, like, this is hard.
Yeah.
Look, I'm going to be a bit of a philosopher here and define terms a little bit, because in psychology, we like to focus on conspiratorial ideation.
Rather than conspiracy theories, yeah?
So if conspiracy theories are the content, then the ideation is the sort of the mental process.
So the problem with conspiracy theories, as we've talked about, is that there's heaps of them around and plenty of them are completely true.
Because the way they're defined is any group of powerful actors acting secretly...
In their own interests and maybe not in everyone else's best interests, right?
So that is mundane.
That happens all the time everywhere.
So it's far too broad a thing.
So really when we talk about conspiracy theories, we're really using a bit of lazy language here.
Really what we're talking about is conspiratorial ideation, which we define specifically to be having unwarranted...
Paranoid and overly elaborate models of the world.
So I think that's just a helpful way.
Because it's almost like the opposite of good science, isn't it?
Like, if you think of all the things good science involves, like Occam's razor and, you know, working from an evidence base and so on, conspiratorial ideation is almost doing the opposite: having a large, intricate kind of theory with lots of tenuous connections, maybe some internal contradictions,
which a theory shouldn't have, and being basically motivated by these prejudices or biases.
Anyway.
Yeah, it sounds like both of you are hitting the point, you know, it's not paranoia when they're really out to get you.
But I think the point you made, T, about, you know, that there's versions of conspiracies which are accurate.
And, like, for example, the flip side of the Koch funding is people focusing on George Soros, right?
And now, again, I'm not saying there's an exact equivalence to draw here, but there's a version of it where, yes, there are funders who support specific kinds of causes, and some can be more nefarious than others, but they are funding things.
Often they're doing it fairly openly.
So the question is when it's hidden and through shell companies and all that kind of thing.
But your point that it's hard to thread the needle is really important because like take, for example, the current issue about the origins of the coronavirus.
Now, this is a topic that's super popular amongst the gurus we look at, to highlight their heterodox credentials, that they're willing to consider the possibility that it's a lab leak. But they don't just consider it.
But what I find is they frame it as if nobody's allowed to talk about that hypothesis and it's not even on the cards.
It's verboten.
But when you actually listen to experts discuss it, they do leave space for that possibility.
But what they do is that they put it in with the probabilities and they say, we can't rule this out completely, but it's very unlikely from the current evidence.
And they give the reasons and go for it.
But getting to that nuanced place, where you're saying what the other person is doing is conspiracy theorizing even though there is still a possibility that it's true...
It's the reasoning approach that's going wrong, echoing what Matt said, rather than the outcome. Say there was some massive Chinese government cover-up, and all of the virologist community had not anticipated the level of duplicity that was involved.
It wouldn't mean that the reasoning was...
It would just be that there was a grand conspiracy, which was extremely unlikely.
And yeah, it seems difficult.
So thinking about this, and thinking about the ideation thing, I'm a little worried that the psychological approach that Matt is talking about makes things a little too easy and helps itself to a certain thing. I mean, think about something like 'paranoid'.
Could I use this in self-reflection?
Could I be like, look, am I...
Doing the real thing or am I involved in ideation?
Well, it depends if I'm paranoid.
But again, the problem is a belief in a conspiracy theory is paranoid if it's false.
But if you believe it, you believe it's true, right?
Similarly, like, if you did the reasoning, then you're not going to think that you're unwarranted.
So in the background, I think there's a slightly different picture about what's happening with conspiracy theories.
So my worry is that...
Some of the psychological conspiratorial ideation stuff may be right about some people, but I'm worried that it's sometimes too individualistic an account, one that is mostly focused on trying to find a reasoning error in a person.
And my worry is, again, if someone sets the evidence in the right way, I mean, remember, the big background picture here is that we learned who to trust from other people, right?
Everyone has to start by trusting their parents and teachers about parts of the world. So if your parents and teachers tell you most of these sources are false, that it's only Fox News or whatever that tells you the truth, and believing the teachers around you is a reasonable first move, now your...
Trusted source of evidence is giving you these sets of evidence.
So a lot of the times my worry is that it's not an ideation problem.
It's like a large-scale sustaining misinformation problem.
And if you make it pleasurable, it's simultaneously a little extra sticky, right?
So my worry is that what we should be looking for is...
Reasonable procedures in tainted informational environments, which is a different story from a purely psychological process.
And I think that story might be true of some people.
Yeah, look, look, yeah, no, I completely agree with that.
And with that psychological frame I gave it, I don't want to overemphasize, you know, the biases and fallacies or the emotional motivations of the people.
Those are useful explanatory factors, but I definitely agree with you that they're not...
A necessary component at all.
But I probably would stick to my guns slightly in describing the process by which they're evaluated as enacting bad scientific investigation principles,
I suppose.
However, this connects back to what you mentioned earlier on, which, again, I wanted to follow up on because I really think it's important.
Having a good trust network is key, and we all necessarily rely on authoritative sources of information.
Like my opinions about climate change do not derive from a close inspection.
Of the raw data.
There is far, far too much of it.
We've talked about specialisation and so on.
So I'm just simply agreeing with you that in practical terms, in terms of how us or just people as just consumers of information and havers of opinions, probably the most important thing is figuring out the correct trust allocations to have.
And a lot of the people who are victims of conspiracy theories or adhere to them are not, there's nothing wrong with them psychologically.
Just as you said, they've simply, often through no fault of their own, allocated the trust to the wrong sources.
I mean, sometimes I can tell a story that says, like, no, that procedure is totally reasonable.
Other times, I think, like, what it often seems like is, once you accept the belief system, it's self-sustaining, but what about the moment of acceptance?
And sometimes I think, like, what you might find is not, like, wild irrationality, but, like, a moment of weakness where you're like, oh yeah, it's nice to believe that person.
And then once you start doing that, rational procedures are self-sustaining to continue that belief.
So it's just like a little slip.
And I don't even know how to assign responsibilities.
And I mean, there's also all this other stuff where I worry that like, if you have these situations where you reason a little bit more loosely, just a little bit.
And you get enormous amounts of emotional comfort.
Like, that's really hard to resist.
Yeah, I know.
My gut feeling on this, and this is pure opinion now, not being all professorial and stuff.
It's just that my gut feeling is that the two things that help with that are trying to be dispassionate, like just cultivating that a little bit.
And that helps, I think.
And being willing to revisit.
One's assumptions.
Those two little good habits can maybe help us pull back from the brink after we've sort of accepted that first premise and then have started going down a rabbit hole.
And I think we all have.
I think everyone has gone down some little rabbit hole at some point in their lives and you need to be able to walk it back and I feel like those two things can help.
Here's a worry.
I'm going to continue to play the pessimist about rationality here.
I'm not totally sure about this, but here's my worry.
A lot of the times, the systems that are so sticky and catchy get some of their catchiness by simulating a particular experience of rethinking your assumptions.
And that looks like: oh, your assumption was that CNN is trustworthy.
Rethink that.
And that's why, I mean, right?
So that's why in some ways, like...
And I think the party line in a lot of...
This world is like, oh, you sheep, you haven't even, you just trust CNN.
We've thought about it.
We've stepped back.
We've worried.
And that, I mean, so I don't think you can just say to them, like, oh, you're never rethinking your assumptions.
Right?
And that's the pleasure.
No, no, I agree.
There is no magic bullet that solves that problem.
It's very hard to reason your way out of a place of delusion.
But that point about having skepticism and cynicism and it being rewarding.
I was strongly interested in Buddhism when I was a teenager in the slightly exotic, oh, it's a philosophy, not a religion kind of way.
And then I went to university and started studying actual Buddhist history and cultures and found out, oh dear, my illusion was shattered.
And that was unpleasant, right?
But then there's a pleasure that comes from it where you're like, oh.
Actually, now I get to find out the reality and it's complex and it's messy and the history is actually interesting.
But there's a pleasure in that, oh, I saw through the facade.
And when I see people talking exactly like you said, T, about CNN or the WHO or institutions, there's the same feeling that they've seen through things that others haven't.
And it's really hard to explain.
Well, I get the pleasure, and I also get that you're right to be cynical, but you've taken it too far.
And that feels like it's a position that it's hard to communicate in a way that doesn't end up sounding like special pleading.
But it's probably exactly what you said at the start of this conversation about, you know, the reality.
Is complex, unsatisfying, sometimes a bit contradictory.
But that's what you have to deal with if you want to grapple with reality.
I was actually thinking about this on a walk earlier today.
This went somewhere weird, so let me see what you think about this.
So sometimes I think, is there an internal...
Hint that I'm caught in something like this, right?
And one thought you might have is, I said before, like, one of the pleasures of a game is that it's made to fit your mind, and it feels like you're capable of doing the stuff in the game.
And the worry I had was, it's almost like, look, if you think you have a total picture of the world...
How could you think that the world is something that would just fit easily in your mind?
So maybe a little sign is that it's too easy.
You think you have a final answer.
But then again, once again the worry is, what does it feel like to be Darwin?
Oh my god!
I understand so much now.
Yeah, there was a real Galileo.
Even though there's a lot of people thinking they are Galileo.
Yeah.
Well, yeah.
I still think it's a good rule of thumb.
I quite like that.
When it seems to fit like a hand in a glove and the mist seems to fall from your eyes and everything now makes sense, that should make you very, very suspicious and cautious.
Yeah.
Cool.
So is there anything else we wanted to, points we needed to cover before we wrap this up?
Anything?
I don't know...
Anything you feel we've badly misrepresented?
This is awesome.
This has been an incredible time.
If you spend some time and see more phenomena, and as I've been thinking about this more, I'd love to talk again and figure out more stuff.
I do think that the philosopher's way of thinking, the psychologist's way of thinking, and the anthropologist's way of thinking are usefully different and intersecting.
Yeah, totally agree.
You know, I feel, after we just did the Douglas Murray episode, where Douglas Murray and Eric Weinstein slapped each other on the back for four hours...
I'm in danger of falling into that area, but I will say for me, just for me, this has been an extremely enjoyable conversation.
And T, I really genuinely love the work you're doing.
Yeah, keep it up.
And I'm sure our paths will cross again before too long.
Absolutely.
I think there's so many intersections between the stuff we're doing with the gurus and the stuff you're investigating in your academic work.
And as we said offline, we are hoping, planning to eventually write something academic on this ourselves.
And yeah, it'll be great to be working in the same field.
So just want to say thanks very much for coming on.
We will post links to those excellent lectures you've given that are available on YouTube and also a link to your interview on Embrace the Void.
And so some good stuff there.
And if there's any other cool things you want to share with our audience, we can probably find a space for it.