All Episodes
May 23, 2024 - Conspirituality
01:04:38
207: Gaming Realities (w/Thi Nguyen)

Philosopher Thi Nguyen first visited us 150 episodes ago (!!) to discuss how social media gamification exploded online conspiracy theories and how audience capture drags content producers toward the seductions of premature clarity—and the ecstasy of fascism. Nguyen returns to discuss "value capture": how simplified and portable metrics in institutions, technology, and media landscapes erode our moral capacities as we pursue goals we never signed up for. (We even consider this influence on podcasting!) Throughout, we also talk about the heart of Nguyen's book, Games: Agency as Art, in which he explores the liberatory nature of games that offer the pleasures of striving and absorption. We wonder whether—if we valued and understood play for its own sake—we might not need to gamify the world.

Show Notes

Games: Agency as Art
Games and the Art of Agency (Philosophical Review) (2020 APA Article Prize; selected for the Philosopher's Annual's "10 Best Philosophy Articles of 2019")
Value Capture (JESP)
Trust as an Unquestioning Attitude (OSE)
Transparency is Surveillance (Philosophy and Phenomenological Research) (short summary)
Hostile Epistemology (keynote for the 2022 NASSP)
Autonomy and Aesthetic Engagement (Mind) (audio)
Art as a Shelter from Science (Aristotelian Society Supplementary)
The Arts of Action (Philosopher's Imprint)
Moral Outrage Porn, with Bekka Williams (Journal of Ethics and Social Philosophy) (selected for the Philosopher's Annual's "10 Best Philosophy Articles of 2020")
How Twitter Gamifies Communication (Applied Epistemology, OUP) (and a shortened version for students, with suggested classroom exercises)
Echo Chambers and Epistemic Bubbles (Episteme)
The Seductions of Clarity (RIPS)
Cultural Appropriation and the Intimacy of Groups, with Matt Strohl (Philosophical Studies)
Trust and Antitrust — Annette Baier


I think a lot of people in this space think games are awesome, so if we gamify ordinary life, Duolingo, Fitbit, everything will be awesome.
And my theory tells me, it confirms my experience of life, that games are often amazing, and gamification and metrics are often soul-destroying.
So I'm trying to understand why.
I'm going to do a little bit of research.
Hello, everyone.
Welcome to Conspirituality, where we investigate the intersection of conspiracy theories and spiritual influence to uncover cults, pseudoscience, and authoritarian extremism.
And today we can add to that tagline a paradox that both conspiracy theories and Twitter gain power to the extent they are gamified, but that's because games in their purest form offer the joy of striving and the power of agency.
I'm Matthew Remsky.
I'm Julian Walker.
I am Thi Nguyen, Associate Professor of Philosophy at the University of Utah.
We're so happy to have you, Thi. Happy to be back.
We are on Instagram and threads at ConspiritualityPod, and you can access all of our episodes ad-free, plus our Monday bonus episodes on Patreon, or just our bonus episodes via Apple subscriptions.
As independent media creators, we appreciate your support.
Conspirituality 207: Gaming Realities with Thi Nguyen.
So Thi, welcome back.
I think I can speak for everybody here at the pod in saying that since your last visit with us, and that was 150 episodes ago, by the way.
That was episode 55.
Yeah.
Your work on gamification is pretty much always buzzing in our ears as we consider the explosion of conspiracy theorizing online; the audience capture that drags content producers towards the most visible and inflammatory positions, in most cases towards, like, the ecstasies of fascism; and the seduction of premature clarity, how mic drops function as much to shut people up as they do to communicate the pith of something, because it's a mic drop, which implies no one can use the microphone after the mic drop.
But there's two huge conversations slash topics we didn't burrow deeply enough into in the last visit.
One, because I think it was still in development for you.
So this is the nuts and bolts of value capture.
And the other, because we didn't fully explore the liberatory philosophy of games that you laid out in your 2020 book, Games: Agency as Art.
Yeah.
Welcome back, Thi. We've been so looking forward to an opportunity to have you back.
And, you know, of course, anyone listening can tell we're already like into the nerd sphere and referencing terms and using jargon
and like quoting you from your tweets.
So let's back up a little and just for the listeners, let's quickly define a couple things
that Matthew referenced. So for anyone unfamiliar, you're a philosopher who's written extensively on
the philosophy of games and especially on how certain aspects of games have increasingly
influenced online discourse, social media, echo chambers, the filter bubbles of political
punditry and conspiracism, and even institutional decision-making.
Can you refresh our memories now by connecting the dots on how gamification has led to this phenomenon that you are calling value capture?
So I'll take the long ramp in, and then you can, like, stop me or tell me if I'm boring you.
So let me talk about games first and gamification second.
And so where this is going is I think a lot of people in the space think games are awesome.
So if we gamify ordinary life, Duolingo, Fitbit, everything will be awesome.
And my theory tells me, it confirms my experience of life, that games are often amazing and gamification and metrics are often soul-destroying.
So I'm trying to understand why.
And I think at the center is what it is to have a scoring system and to be given points.
By the way, this is very much on my mind, because I'm actually in the middle of writing a manuscript for my first popular book, which is just about this.
It is called The Score, Games, Metrics, and the Meaning of Life, and it's kind of about how games are a way into a meaningful life, and metrics often destroy it.
So, I was trying to understand the nature of games, and if you look in this academic world of people being like, oh, games are great.
Games are an art form.
A lot of the times what you see is people desperately trying to show that games are an art form by comparing them to already existing art forms.
Oh, games are great because they're like movies.
They can tell stories like fiction.
People try to tie them to recognizable high status art objects.
And a lot of the stuff, and it's in both the academic discourse, but a lot of the times in, like, fan writing, when you read about games in the New Yorker or, like, the Atlantic, it's always going to be about some game with a rich story and fixed dialogue and a script, and as little like a game and as much like a movie as possible.
And so I was trying to understand this and how far this had gotten from my understanding of games and all the stuff I read about online about games.
And by the way, I should just say here, my notion of game is really broad.
I'll talk a little bit more about this, but it includes video games, sports, card games, board games, role-playing games, right?
I was listening to a lecture from Reiner Knizia, who's my favorite European board game designer, and he said, the scoring system is the most important part of my game design toolbox, because the points set the player's desires.
They tell the players what to care about.
And as a board game player, I was like, this makes perfect sense.
Like, these days, you open up a board game, and not only do you find out whether you're, like, collecting sheep or trying to kill each other militarily, you literally find out whether you're cooperating or competing around teams, right?
Your whole, like, you can open up a game, and we're like, oh, this is a co-op game.
We're all in it together.
Or, oh, we're trying to kill each other, right?
The game just suddenly aligns your desires.
So this is totally obvious to me as a game player.
And as a philosopher, I was like, oh my god, no one talks about this.
Games just set your desires.
And so when you start thinking about things like gamification, rankings, and metrics, you see the same phenomenon outside in the real world, in a way that is extremely terrifying.
So one of my favorite books about this is Wendy Espeland and Michael Sauder's Engines of Anxiety.
It's a study of what the U.S. News & World Report law school rankings do to legal culture.
And one of the things I say is that before the law school rankings, law schools had deeply different values and deeply different missions.
Some law schools pursued corporate missions, trying to get their students into big-paying jobs.
Some pursued theoretical missions, doing lots of legal theoretical research.
Some pursued outreach missions or training people for social justice.
And the moment the rankings show up, the entire legal world instantly orients towards the rankings, and what they care about seems to be set by just whatever the rankings measure.
And a lot of the times that's very different from what they might have valued.
So for example, the U.S. News & World Report rankings are heavily set on incoming class GPA and LSAT score.
So if you're interested in outreach to underserved minority populations, if you serve that mission, your ranking will drop.
Because students that come from historically marginalized groups, students that come from impoverished backgrounds, even if they can be just as good lawyers, will typically have poorer GPAs and LSATs coming in.
It's funny, we're talking about spirituality.
I was on another podcast called A Philosopher and a Pastor Walk Into a Bar, and when we were talking about it, the pastor was like, oh my God, this is happening in our church.
Right.
There's a leaderboard for baptisms, and now like everyone's obsessed with upping baptism numbers, which is different from building like a healthy flock and a good community, right?
Upping baptism numbers is much narrower.
What I think is that games have these very simple, clear scoring systems that help us enter into alternate agencies.
So my claim for my games book is that what games are is an art form that works in the medium of agency itself.
It tells you what to desire, it tells you what abilities you can use, and it tells you the obstacles, and then it shapes an interesting activity.
And then ranking systems do the same, but, and there's so much more to say here, typically they're not built for your pleasure and interest, they're not shaping activity that you find rich and lovely, and you often don't have a lot of control over which ones you...

Just to push on that law school transformation example a little bit, we might say that being interested in the law was a way of seeking a kind of agency.
I want to be this type of person in the world, or I want to do this type of work in the world.
And in that sense, the law itself as a discipline would grant the person a way of accomplishing that.
But if the law school sort of entrance...

So the first place I saw this, I mean, I'm an educator, so I see this all the time with students. Like, a lot of the times, my freshmen, right, the people that are just coming in, are, like, alive and curious. By the time they're seniors, they're just like, what do I need to know for the tests?
I just need to get out of this class.
And in my own discipline, like if you would have thought anyone would be immune to this, it would be philosophers.
But instead, it's the opposite.
We have a ranking system.
We have two ranking systems.
We have a journal ranking system, and we have a department ranking system.
And literally, so many people go into philosophy with a sense of like, oh my God, I want to do this because these big topics are really meaningful.
They're really engaging.
They're really exciting.
I want to talk about the meaning of life.
And then after seven years of professionalization in graduate school, they're like, well, to get a good job in a good ranked university, I need to publish in highly ranked places.
And the way to publish in highly ranked places is to write small technical articles on minute topics that no one, including in many cases the people themselves, cares about. But that's the route.
And so like, there's this thing.
So for me, I felt it heavily on Twitter.
I started writing about this a lot because I was on Twitter and I would do these things where I was like, I was on Twitter because there were other cool academics on Twitter.
And I could find out about cool ideas and talk about interesting ideas and find out about stuff.
And then a couple of times, I posted a dumb joke that my five-year-old said, and twice I went viral.
To give you numbers, when I'm posting about philosophy stuff, I get 300, 400, 500 likes.
Twice I posted a joke from my kid, and it got 120,000 likes.
I had to delete Twitter.
Right.
Like my brain went on fire and I was just like, okay, what's another funny thing I can do?
Wait, what?
What's the next quip, right?
And it just like, it shapes you.
And really loosely, one of the things that I'm really worried about is that a scoring system in a game is very artificial, but it's not an artificial version of anything in reality.
Right.
The score in chess is not a transformation of something outside of chess.
But the score in Twitter is a transformation of your values of communication.
My score in philosophy is a transformation of my interest in, like, wisdom.
Yeah.
And thoughtfulness, right?
So games, insofar as they're temporary artificialities, aren't transforming a rich, dynamic value into something really simplified.
But gamifications are typically targeting something like education or communication or journalism, right, which has distinct and complex values, and then simplifying it.
What it amounts to in gamification cases is outsourcing your values.
That one of the things that's going on is, instead of deliberating about what you care about and thinking about why you care about the thing you're doing, you're letting U.S. News & World Report, you're letting Fitbit, you're letting Elon Musk set what your values are.
Before the U.S. News & World Report law school rankings, the decision of which law school to go to, because law schools represented deeply different values that didn't scale against each other easily, would trigger a kind of process of figuring out what they cared about in their own legal education.
They would ask themselves, why do I want to go to law school?
What kind of career do I want?
What kind of education do I want?
Who do I want to help?
Yeah, who do I want to help?
And then the moment the law school rankings show up, they stop deliberating.
The vast majority take their goal to be just to get into the best law school, and they take "best" to be set by the U.S. News & World Report rankings.
So I treat that as a kind of value outsourcing.
Right now, whatever you care about is set by whatever criteria feed into the U.S. News & World Report ranking systems.
So you're outsourcing your process of evaluation.
You've got these two messages, though, throughout your work.
Like, you persuasively argue that the gamification of social media and of institutions, and especially on social media, exacerbates cruelty and ignorance, and narrows our values down to, like, really crude parameters.
You know, you're a real bummer with that stuff.
Because it's a picture of a world in which I think we're familiar with it.
People are literally masturbating themselves on a feedback loop of intensifying content within a narrow range of meaning.
They're yearning for a resolution that never comes.
And then I'm reading your book about games, and I can see that this is really a distortion of this other possibility, and that broadening our understanding of human play might shed some light on how we might all just chill out a little bit more.
So how do I resolve this contradiction?
Like, why are games so lovely?
Much of the time, not always.
And why is gamification so terrible?
Much of the time, not always.
There are a few ways to put it.
The simplest way is that games end, and there are lots of them.
Right.
And gamifications do not.
I was in the middle of writing my first book, this games book, and I was, like, trying to explain what made games so flexible.
And I encountered this work, this guy, Miguel Sicart.
He's a really good thinker.
He really got me going.
So he has this early paper that I love called The Banality of Simulated Evil.
Right.
Wow.
And he talks about how a lot of people's, a lot of simple moralist response to Grand Theft Auto was to be like, oh, this is encouraging people to be evil.
We need to fix it.
And the way that people tried to fix it was gaming companies like Bethesda created simple in-game moral point systems.
Like, they did it for their Star Wars game.
It was always super simple.
It would be like, an orphan asked you for money!
If you give them money, you get a morality point.
If you kick them, you get a dark side point.
It was ultra simple.
And Sicart's argument was, actually, this was worse.
Because what it inculcated was the idea that morality was simple and quantified.
Grand Theft Auto did not gamify morality.
It gave you complex, nauseating situations, and then just left you to think about them.
One of the things I think that's most important about games is the reason you play them.
And to think about that reason, you often want to distinguish between a goal and a purpose.
So I recently discovered that I had two forgotten on-ramps to this thought.
One was a fly-fishing book from John Gierach, an author I read, like, decades ago, who turned out to have been a philosophy major as an undergraduate.
And he says, you go fly fishing, sometimes you catch fish, sometimes you don't, but the process like kind of attunes you to the water and the insects.
And sometimes you're all frustrated because you didn't catch fish, and you're walking home, and, like, your senses are all hyped up, and you can smell the air and you can see the insects and you can see the light, and you remember that catching fish is the goal, but not the purpose, of fly fishing.
Right.
Similarly, my advisor, Barbara Herman, a Kantian ethicist, at one point, we were talking and she was like, oh, you're all just forgetting the difference between a goal and a purpose.
And we're like, there's no difference.
And she was like, sure.
When you go over to your friends to play a board game, the goal is to win, but the purpose is to have fun.
Yeah.
I think this is incredibly deep.
So what I ended up saying in my book, developing this idea out, is that there are two reasons you can play a game.
One is achievement play, where you try to win because you value winning.
And the other is striving play, where you try to win for the sake of your absorption in the struggle or the process.
So who does achievement play?
Tons of people.
But I think typically, here's an easy example, most Olympic athletes, I'm guessing, maybe not all, but many, care about winning.
A really easy example is like a professional poker player that just wants the money.
What matters to them is winning.
And I think there are a lot of people that just rawly want to win.
One of the ways you can detect striving play is that you try really hard to win, but afterwards you don't care.
Like, I mean, think about if you play a party game, like if you play charades, right?
It's only fun if you're trying to win.
But if you lost your round and you were like, screw you all, the evening is wasted.
I hated this.
You're an asshole, right?
You didn't get the point.
So for me, this is really deep.
I think a lot of the times what's happening is, in games, you have a goal that's specified in the game that's really thin and narrow, but that's not really the reason you're doing it.
The reason you're doing it is to shape this kind of rich...
And you can pick the game that gives you the kind of activity you want.
I think that's often not true of gamifications.
I mean, a simple way to put it is like, look, you can bounce around hobbies and like, like running doesn't fit me.
Climbing was great.
I started fly fishing during the pandemic.
And I think, so my theory is that fly fishing is meditation, right? That, like, you stare at water and what you actually get is a cleansing spiritual experience.
And then you can pick the kind of activity, if what you want is meditative state,
then you try on different goals. Maybe you try running more miles, maybe you try
climbing harder cliffs, maybe you try fly fishing, right?
And then the goal that gives you the state you want, that's the goal you glom on to, but that's not the goal itself.
So the thinness of the goal is just a trick to get you to this other bigger thing, right?
You pick the party game, you all try to like collect more word points, and you pick the one that gets you fun, right?
Yeah.
And I think in many cases... But if you forget, what forgetting is like is being like, well, okay, that's it.
My goal is to get as many fish as I can, and no matter how miserable I am, that's what I do.
My worry is that state is the state that's inculcated by a lot of gamifications.
Yeah, I want to get as many fish as I can so I'm not the loser.
Exactly.
It's fascinating because I hear this paradox in here, which is that, you know, games in and of themselves have this value which is different than the sort of stated goal of the game, right?
There's an experiential piece, there's a social piece, there's maybe a flow state kind of piece, there's these different things you're gesturing towards.
When that kind of gamification happens where it gets imported into these other spheres that are in the real world, the stakes actually get higher.
But the structure within which the game is happening has this artificiality that is just sort of taken on board as if like, well, this is how you quote-unquote play the game.
And I was thinking about that, about how many people who are intensely career-driven talk about, well, you got to play the game, right?
You got to figure out What are the rules?
What are the ways to get ahead?
And how do you do that regardless of perhaps ethics or deeper, you know, deeper personal sort of meaning in terms of what you're doing?
And the thing I find really fascinating about all of this, you said the gamification, the structure of it is not artificial to anything else.
And I was hearing the idea of metaphor, right?
And that there's a symbolism that's happening in actual gameplay.
It somehow gets corrupted or gets turned superficial or gets turned in service of something that you're not even noticing that you've lost track of the actual reason you're doing the thing you're doing, like studying philosophy or becoming a lawyer, right?
Okay, so the inspiration for a lot of my work is this book by Bernard Suits, a philosopher from the 70s who wrote a book called The Grasshopper.
And he has a definition of a game, and playing a game is voluntarily taking on unnecessary obstacles to create the possibility of the activity of struggling to overcome them.
So a key to that notion is the voluntariness.
Another key is that with actual games, usually, not always, usually we get to pick the rule set and the scoring system that gives us this kind of process that we like.
And at the end of the book, Suits has this argument.
And he says, OK, imagine Utopia, where we've solved all our practical problems.
What would we do?
We would play games or be bored out of our skulls.
So if games are what we would do in Utopia, they must be the meaning of life.
Yeah.
Right.
This is something like a lot of people hate.
When I first read it, I was like, this is weird.
And I thought about it.
I think it's actually a way for him.
So, Suits was also an Aristotle scholar.
And Aristotle thought that meaning in life came from activity, not from the outcomes of activity.
The thing that gives our life meaning is the activity of exercising your abilities.
And you make stuff, and some of that stuff is useful to get you to be back into the activity, to support you in doing activity.
But if you just think that the point of life is to accumulate those outcomes, right?
So here's a really rough version of a solution to this paradox you're worrying about.
With games, a lot of the times, We change games until we find an activity we like.
Games are places where we shift around the scoring system to give us lovely activity.
In a lot of gamifications, we're stuck doing shit we hate for our entire lives in order to make a
set of points go up, and we don't take ourselves to have freedom or agency about which point system
we're picking.
One note about striving play is that I think we can say that when Ron Watkins, at the end of the
QAnon phenomenon, says, well, maybe this was all about the friends that we made along the way,
I think he's describing he was involved in striving play, that QAnon actually was a game,
Not only did it make use of a gamified internet structure and social media structure, but it did feel like a game to those who were running it.
And of course, it had disastrous circumstances, you know, outcomes.
But here's the other thing that I wanted to pick up on.
The Suits argument about whether games are the meaning of life, if that's what we're left with, or that's what we choose to do in Utopia because all of our problems are solved.
It sounds weird to adults, but I can tell you, I heard you give this example, or maybe I read it in the book, and then I passed it along to my 11-year-old, and his jaw just dropped to the floor, and he said, that is absolutely true.
That is absolutely correct.
And so there's something about that argument, I think, that probably makes more sense to children than it does to adults.
Is it the case that children have this naive view and adults, like, know the truth?
Or that adults have been bureaucratized and taken up into large-scale work institutions and reprogrammed to think that... Yeah.
Right?
That the purpose of our life is to make some number go up and that kids understand that the point of games is to have fun.
Yeah.
Or to have a rich interaction.
So, one of my terms for this in the book is that what we're talking about is a process object, and something that encourages process aesthetics.
So an object artwork is something where the good things are in the artwork itself.
So the novel, right?
The painting.
And a process artwork is something which structures your actions and that makes the good stuff come out in you.
Like the player.
One way to put it is in a game, the primary seat of beauty and elegance is the player.
I think like, and like for me, This is a broad category.
This category includes dancing, yoga, right?
All of these activities where I think, I mean...
Right, is the purpose of yoga to get into the hardest pose you can, or is the purpose of yoga to feel your body in motion, and becoming more elegant, and becoming attuned, for the sake of attunement itself, right?
The way in which, to me, gamifications are deep corruptions of true gameplay is that true play involves doing things for the sake of doing itself and manipulating the kind of activity you're doing to give you a joyous, a rich, a fascinating kind of activity.
And gamifications are about creating exhausting, miserable, grinding work, and motivating us in order to, right, make more junk or move things around faster in the warehouse or something.
I love that you brought up dance and yoga.
I was thinking about dance in that exact moment.
And I was thinking, wouldn't it be weird if we thought about dance as having some kind of purpose that it was, it was training you for that, that if you, if you could learn how to dance in this way, then out in the real world, you could do this other thing.
And it's making me think about how games in and of themselves, seem to ride upon some kind of evolutionary, you know, set of functions where, you know, you watch a little animals playing and they're play fighting, right?
They're preparing for like real life situations in which they may need to use their reflexes and certain kinds of moves in terms of protecting their lives or winning or getting the, you know, getting the food.
But there's, yeah, there's some kind of fascinating way that I hear you talking about games
almost in their pure sense as sublimating various drives in our lives that ultimately could be just,
end up being, you know, an endless pursuit with no real sense of joy or satisfaction or pleasure.
To me, what happens when serious adults get their hands on games and play
is they try to make them do work.
And what I mean is like, they try to say like, well, they're justified if they make you more moral.
Or they're justified if they train you better for your job.
Or they're justified if they make you, you know, more able to learn like fine motor skills
so you can go work in the factory or whatever, right?
Like, yeah.
And one of the fascinating things is how much of the entire option space doesn't include that they're fun.
Yes.
And so this idea that kids have it backwards because they don't realize that the whole purpose of kids games is that they train you to eventually be a productive adult.
Right?
Right.
Do you know William Davies' book, The Happiness Industry?
No.
This seems up your alley.
No.
This is a great book.
He's a journalist.
And he has this analysis of a modern movement in the psychological sciences called positive psychology.
And he thinks basically positive psychology, a lot of the methodology is quite scientific, but at the core is an operationalization of the notion of happiness, where happiness is measured in terms of your ability to work.
Basically, people are depressed if they can't go to work anymore, and we fix them and they're happy if they can go back to work, right?
I mean, I'm guilty of this a lot, too.
And I do think games are useful for a lot of things.
I do think that they do teach us things.
We do enter into other agencies.
They do teach us a kind of fluidity.
But let me tell you something.
Well, I've been in a basement a lot; I got a research release to write this book.
So I've been in a basement by myself.
And this is a book about scoring systems.
And during it, I was doing all this research, and I got really interested in cases where you had an activity that was kind of rich and lovely.
And then it enters kind of a formal competition scene and it changes.
So one of the ones that's really interesting to me is skateboarding.
So skateboarding kind of naturally is about doing these cool, rich, inventive tricks.
And then when it enters a more official setting, like the ESPN X Games, it has to become objectively measurable.
Yeah.
Very weird.
In order to give the prize.
Very strange.
And then it becomes, like, about who can do the highest jump and the most flips.
And although I think skateboarding culture has pushed back against this heavily.
When I was researching this, I discovered that there's this world of, like, modern competitive yo-yoing.
Yeah.
I started reading about this and I started doing it and it's awesome.
Like it's right in line with, if you like yoga and rock climbing and all this stuff, like modern yo-yoing is the same.
It's like all these incredibly precise, intricate, dynamic tricks.
And then I would find myself with other adults being like, what are you doing?
Well, I'm writing a book and I'm learning to yo-yo.
And then I would feel awkward.
And even though I literally am writing a book about this, I'd be like, yeah, yo-yoing is really good because it helps me de-stress and write more.
And it's really useful because it like calms me down and I get a lot of ideas.
So my writing goes better.
Right?
Like, and that feels like, you know, it feels like a betrayal of the thing, right?
Like, I feel the same.
I feel the same thing about, like, the world which asks us to justify doing yoga in terms of, well, you know, it helps me de-stress so I can, so I'm more productive.
Look how much more productive I am when I take an hour.
I mean, I say this all the time: academia is a workaholic culture, and people will be like, how could you take an hour of the morning to do yoga? Like, in shock.
And the ready response is, oh yeah, it makes me much more productive.
And the people are like, oh yeah, that's great.
That makes sense.
Right?
Right.
Great.
Well, it gives snipers better aim when they calm down their breathing, right?
It's all kinds of usages.
And at the center of all of this is agency, right?
And so I had a question about consent that comes out of the value capture paper.
You write, many of us feel an intuitive horror when contemplating cases of institutional value capture.
And so you've listed some of them.
But it is rather difficult to say in a principled way exactly why value capture is so horrifying.
Because for one thing, value capture is often consensual.
My question is, how do we define and exercise consent in an immersive, dominating, techno-capitalist environment?
That's a great question.
I mean, it could be that we don't really have consent in a deep way.
I mean, at least one of my philosophy teachers thought that there was no such thing as real consent if you had to work for a living or starve.
Yeah.
Okay, let me walk into this question for a bit.
So when I was trying to figure out why it was that these cases of value capture where
someone internalizes a simple external account, why they were so problematic, the first thought
is like, oh, it's because it's not consensual because you're losing control of your values.
But that isn't right for two reasons.
One, a lot of our values do come from the outside.
And like, that's not a problem.
Right?
I learned to like jazz from other people teaching me and showing me the way.
And the other is, a lot of gamifications are as consensual as anything.
I mean, maybe not perfectly consensual, but if you're like, well, I want to work out, and I'm going to buy a Fitbit.
About as consensual as it gets, right?
And for a lot of philosophers in this space, a lot of these techniques are considered ways to fix weakness of the will.
And there's this thought from people like Jon Elster, who's this really interesting social thinker, that a lot of the time, when you reduce your will in the short term, you're really actually augmenting it in the long term.
So these examples like, if you're like, okay, I'm trying to quit smoking, no one give me any smokes.
That's a way of reducing your will in the short term to extend your will in the long term.
The book of his is called Ulysses and the Sirens, because that's an example.
Ulysses has himself strapped to the mast and deprives himself of his will because in the long term he wants to achieve something he wants, which is hearing the sirens.
You might think of a lot of these gamification cases as cases where you are augmenting your will by like choosing and then tying yourself to the system that will lock you in to this value system.
Why not?
Why isn't that just great?
It's going to make you a strong man.
It's going to make you a real masculine dude who like can stick to his commitments, right?
Right.
You're about to distract me into a totally different topic that I find super interesting.
So for me, the answer can't lie in consent.
It has to lie in the nature of the values and your relationship to them and something much bigger than consent.
And for me, a lot of the times it has to do with two things.
One is that the values are so thin, right, that the kinds of things that we can measure easily in large-scale institutional settings are very different from the kinds of things we actually care about.
The easiest example is obviously Twitter.
I care about communication, but what we actually can measure is people clicking like, and that doesn't capture everything about communication.
All that captures is this very simple, short-scale, did I like it at the moment I saw it?
Oh, man, I think a lot about that and how these simplified assessments and bits of feedback just push people in directions that they can't even necessarily see.
And, you know, Julian and I were thinking exactly about this with regard to our own field.
So there's something a lot of people notice that is related to this whole algorithmic filter bubble and echo chamber effect that we've talked with you about before.
And that's sometimes called audience capture.
We've seen it with how some of the figures we cover started off as culturally liberal spiritual influencers who, during the pandemic, increasingly shifted to the right via conspiracy theory discourse.
It's like the metrics of likes and views and follower counts, and then monetization opportunities, become a vortex sucking these influencers into espousing more and more radical points of view, because that's what gets them the result they want.
And we've seen it too with contrarian political news channels on YouTube and perhaps most notably with huge podcasts like Joe Rogan and all of the satellite shows that cluster around him.
What do you think about this idea that the feedback loop of gamification can create dangerous incentives that distort our discourse?
So I've lately been calling it the gap.
And it's the gap between what matters and what's easy to measure.
And a lot of the time, I think what gets this loop going is that the measurement has a greater impact on us than what we originally valued.
So again, if you value kind of rich connection and communication, and then Twitter gives you very clear feedback on popularity, which is a different measure, then, insofar as we find those numbers motivating, and all the empirical research says we find numbers going up incredibly motivating, you'll start to shift a little, right?
You'll start to shift in terms of what's easily measurable.
So then the question is like, what makes certain things easily measurable?
So I can give you some kind of abstract theory and then we can try to touch it down to social media.
So when I was trying to figure this out, I found this incredibly good literature in a discipline called science and technology studies.
And there are a bunch of people who've been trying to figure out the nature of quantification, right?
Why it has this incredible pull on us.
Theodore Porter is this very interesting historian of quantification.
So there are two kinds of justification, he says.
Qualitative justification, that's like rich in terms of words, right?
And quantitative justification, numbers.
And he ends up saying that they're both good, he thinks, but they're good at different things.
So qualitative justification is good at being nuanced and sensitive and responsive to what's going on in the particular situation, but it travels really badly between contexts.
It requires a lot of shared background.
Quantitative justification is comprehensible across backgrounds because it's been engineered to be comprehensible across backgrounds.
So what Porter says is, when you make an institutional quantification, like a metric, you identify some context-invariant nugget, and you distribute that across a large number of people in different contexts.
So for me, as an educator, qualitative justification looks like writing paragraphs of response to student papers, in which I talk about what they did and how it worked.
And quantitative justifications are the letter grade, A, B, C, D. And there's a huge information loss between this multidimensional responsive thing of writing to students about what they did and this very simple thing, which is, where are you in this simple spectrum?
But because that spectrum of letter grades is so simple, everyone can understand it, and it can travel easily between contexts.
Right?
So that's, for him, what quantification is.
It is a denuancing, desensitizing, decontextualizing drive that is powerful precisely because decontextualizing things makes them travel more easily between contexts and be more easily understood across contexts.
Again, right, I can talk, oh, this idea was super interesting, super rich.
People are like, whatever.
And I'm like, oh, yeah, I got 15,000 likes.
Right?
That makes sense.
And that's such a thin measure.
So when I was trying to figure out what social media was doing to me, one of the ways I thought about it was this.
So if I'm having a conversation with a bunch of people at a party and I say a weird thing to 20 people and 19 of them don't get it and one of them just comes alive and is like super excited and we have a conversation, that registers to me as meaningful.
If I tweet something to 20 people and only one of them likes it, that feels like a failure.
Right?
So there's some interaction between this decontextualizing and the thing we already know, which is that you're playing to the lowest common denominator.
But one of the interesting things about social media is the degree to which it rewards things that are extremely, extremely low context, and that can be understood instantly by people without background.
Right?
And so there's this question about why that works so well with things like conspiracy theories.
My own theory is that conspiracy theories are themselves also engineered to travel really well and be understood by everyone and applicable immediately, right?
Oh my god, this is fascinating, right?
Because on the one hand, you're talking about a joke, right?
A joke that is just like instant fast food gratification.
Anyone can understand it, right?
Given enough like cultural sort of relevance or familiarity to their lives.
But then you're talking about conspiracy theory that is, it's like a layered piece of fast food
that you can really sink your teeth into and get a lot of enjoyment from.
But it's also this sort of, this thing that has been engineered in such a way
that it doesn't matter whether the facts and evidence add up, right?
Right, no, this is it.
So this is, so you're asking me to connect two things I haven't connected yet, so I'm kind of like
lost in like all these threads.
But one way to put it is, so I have this paper, The Seductions of Clarity, and in it I say: we love this feeling of understanding, of like grasping a whole, right?
So the philosophers who've worked in the space say that what it is to understand something is not just to have disparate facts, but to see the whole and see how all the parts fit and then be able to use the whole to explain new things.
That's what happens when you have a scientific theory.
And so I was asking, how would you game that, right?
How would you create something that got people the feeling of understanding so sharply that they would be addicted to it, and you would engineer something that was easy to grasp, easy to use, and powerful, right?
But really quick to grasp.
And two of my examples are actually conspiracy theories and bureaucratic metrics and institutional metrics.
If a conspiracy theory were super simplified, and you didn't have to do any work, it wouldn't feel like intellectual investigation.
Yeah, there'd be no game.
Right.
It's not a game.
If your theory is so complicated that it requires you to interface with thousands of experts, like, I don't know, real science, that wouldn't be satisfying because you couldn't fit the whole model in your head.
So what you'd want is to carefully design a model that was hard enough that it would involve all of an individual's intellectual capacities, but also complete enough that once one individual had it, they could run around applying it to everything.
I realized recently that everything I've been doing in philosophy is about one idea, and that idea is that the world is too big for a human head.
I have two favorite philosophers who write on this, Elijah Millgram and Annette Baier.
Elijah Millgram puts it this way: the essential fact of the modern world is that there are so many sciences, and every argument has to cross so many fields, that no one can actually understand every part.
This is from a book of his called The Great Endarkenment, and the interesting thing he says is that this is actually all screwed up, because we had this thing called the Great Enlightenment, and the Great Enlightenment started with this idea of intellectual autonomy.
Think for yourself.
Don't just believe things because other people said so, right?
Throw off the shackles or whatever.
But then the Great Enlightenment created so many sciences that it rendered intellectual autonomy impossible.
And now suddenly the size of the world is such that we have to constantly be trusting people that we don't understand.
So Annette Baier has this paper that is probably my favorite piece of philosophy written in the last hundred years, a paper called Trust and Antitrust.
She's a feminist epistemologist, and she's railing against how masculinized moral theory is.
And she starts saying like, people try to do all this philosophy about socializing and politics, and they never once mention trust.
She ends up saying that what trust is, is making yourself vulnerable to somebody else.
And entrusting a piece of you to their care out of your sight.
And the simple version of this is like, for me, putting my kid with a babysitter.
But more complexly, whenever you trust a doctor or a scientist or a mechanic with what pill to take, right, you're entrusting a part of you to something that you don't understand and you cannot understand because the world is now too large.
So here's the problem.
We want understanding, but the real understandings of the world are too big.
So here are two things we can do.
We can play games, and games reorient the world to be right-sized to fit our brains.
Right?
My spouse once put it this way: she said, it took me a while to realize, but games are made to be playable by humans.
The puzzles aren't too hard, right?
The puzzles are just hard enough to be interesting.
Games are something where a lot of the actual world, when you have to deal with the real world, it's either too simple and boring, or it's too overwhelming and you can't understand it.
In games, you can just right-size the world to be interesting, but not overwhelming.
And also, because different ones of us have different capacities, we can, like, you know, my physical abilities are really poor, and so what's difficult and interesting to me is really simple, right?
It's like I am a bad climber, but that's okay, right?
I just pick the climb.
I don't know, that climb might be a beginner climb for someone else, but for me, it's the hardest thing I've ever done, and, you know, even if it's a warm-up for a professional, that's fine, right?
I can pick the thing that's right for me.
But...
That works, because games are not reality, and I'm taking a temporary departure from reality to have this little, like, right-sized refuge.
And so my worry, one way to put the worry about conspiracy theories, is that they're engineered nuggets that are made so you don't have to trust other people, you don't have to be vulnerable, because you can put it all in your head, and it's just the right size.
But also, and this is the connection that you were pushing me towards, I hadn't made before, they're weirdly decontextualized.
Because a lot of other information, a lot of other explanations for the world require a large amount of expertise, right?
You need a lot of expertise to understand something that doesn't transfer easily.
If conspiracy theories are kind of engineered nuggets where you've cut the expertise out, then They transfer easily, right?
You can just give them to other people, and it doesn't matter if the other person doesn't have the same degree or expertise as you do.
It's going to be just as comprehensible.
Yeah, and not only does that conspiracy theory become the new lens through which you look at absolutely everything, but a lot of the influencers we cover become sort of galaxy-brained about how they have hot takes on every single possible discipline and subject.
This whole round, everything we're talking about, for me started like, I don't know, like 15 years ago, when I got obsessed with a single question, which I thought was super interesting, and I couldn't get anyone interested in it for like 10 years.
And that question is, how does the non-expert identify which expert to trust?
Right.
And I found this result that blew my mind, and it was just a bunch of empirical research about how good people, jurors, were at picking expert witnesses, and they're crap at it.
And here's what happens.
Most people tend to pick as an expert witness someone who's clear and declamatory and offers strong, confident opinions about everything.
Actually, experts are often going to be like, that's not outside of my expertise.
I find this a lot doing podcasts, since I've been doing this a lot.
So I do a lot of stuff about data, obviously, and metrics.
And then people will be like, so what do you think about machine learning and AI?
I have learned to be like, I don't know anything about that.
And then they push me, people are like, oh, but don't, and I'm like, I once said something, and immediately afterwards it was read by some AI people, and they're like, that was dumb as crap.
So here's the thing: a conspiracy theory is something that gives you the power, and gives you the appearance, of being able to make quick, powerful decisions about everything.
And you can transfer it, right?
It's funny, because in this philosophy about what understanding is, what they say is that what it is to understand something is to have a complete model that you can use and transfer to others.
And they're thinking about things like, you know, physics models, right?
You have a good model of planetary motion if you have a model that explains things, unifies everything you say, and you can transfer it to someone else.
I just feel like conspiracy theories have, like, amped that up and made it a little easier.
So we've put together these two fields in which we have denuanced nuggets of data that are highly portable, they can travel far, they can be sort of taken up in many different contexts.
With conspiracy theories we have, okay, there's an outgroup, okay, there's a cabal, okay, there's a plan that's rich and, you know, it's complicated and only I have the solution to it.
But then with the value capture that you're speaking about in terms of media and social media and institutionalization, and then when we return to the notion of the podcast, we ourselves deal with these packets of denuanced data.
Like Julian, we have to stay aware of the metrics of download numbers and retention rates, right?
So how does that work, Julian?
Yeah, I mean, this is going to give us an existential crisis if we let it, but maybe that's a good place to be.
I mean, you know, we track various metrics.
We look at download numbers.
We look at retention rates, like how long are people actually listening to an entire episode?
Do they get bored?
Do they drop off?
And, you know, one good reason I think to do this is to try to get a sense of what our listeners are interested in, what holds their attention, how well we're doing, whether we're rambling too much, etc.
But then behind that hovers the specter of like, are we just getting value captured and gradually losing track of our true kind of, you know, motivations and ethics?
What do you think?
Yeah, so do you have any advice for us?
I don't have advice.
I want to give you a nauseating picture that I think you're moving towards.
Oh no.
Oh no.
We've been crapping on portability, right?
And being like, look, these things are denuanced and portable.
That's literally what science is based on.
So Sabina Leonelli, who's my favorite philosopher of data, she does philosophy of biology and data.
Her idea of what scientific data is, is, quote, information that's been prepared to travel to unexpected places and be used by unexpected people for unexpected ends.
What makes science powerful is precisely the data that's been prepared to travel between contexts.
I mean, Theodore Porter, who I think is a background influence for Leonelli, the person I was talking about before, actually says that information is a kind of understanding that's been prepared to travel and be understood by a distant stranger.
And we both, I mean, it's funny because we are sitting here, a professor who's writing a popular book, who teaches intro classes, and two podcasters crapping on the notion of communicability.
Right?
I mean, what I'm saying is there's this weird paradox.
So when I first got on this train, I ended up writing this paper called Transparency is Surveillance, about transparency metrics, about these attempts to make, you know, government workings transparent.
I found this line from Onora O'Neill, who's a philosopher who was working in bioethics at the time, really studying metrics in hospitals.
And she says, people think transparency metrics and trust go together, but they actually come apart.
Because transparency forces experts to explain themselves to non-experts, and they can't.
So they have to invent things.
So she thought transparency forced people to deceive others.
I think it's even worse.
If you believe in value capture, then transparency actually asks experts to act on reasons that non-experts can understand, which actually undermines their expertise.
I just keep seeing Anthony Fauci.
That's a great example.
Okay, here's the problem.
Transparency isn't always bad because the reason we wanted it was because transparency gets rid of corruption and bias.
And it's totally true.
Here's the problem.
Transparency metrics do both, right?
Transparency metrics force you to explain yourselves in publicly comprehensible terms.
This prevents corruption and bias, and it prevents expertise at the same time.
Because one way for me to think about it is that what transparency metrics do, when you have to explain yourselves in publicly clear terms, is they prevent you from making intuitive, rich, hard-to-explain judgments.
And intuitive, rich, hard-to-explain judgments are exactly where bias creeps in, and exactly where expert sensitivity creeps in.
So you get rid of both.
So you can see this problem.
I mean, Anthony Fauci is a great example, because people who work... So I've been talking to a bunch of people in this field called science communications.
This is the basic problem.
You want to earn people's trust, but people can't understand the science.
So you have to simplify it.
But if you simplify it, you'll be caught out for deceiving people.
But if you don't simplify it, it'll be incomprehensible and no one will trust you.
Oh man.
I know you want to talk about conspiracy theories.
This is where my head is at.
Yes, philosophize on.
This is, I think, the basic problem of the modern era.
Right?
The world is too big.
And I think maybe it's not that far afield, because I think this is where conspiracy theories take root, right?
The whole problem of our era is that you don't want to not think for yourself, but you also can't perfectly think for yourself.
One time, I was teaching Annette Baier's trust paper, and this guy in the back of my class raises his hand, and he's like, well, that's why I've never trusted anyone with anything in my life.
You can't be vulnerable.
And I was like, how'd you get to school?
He said, I drove.
I said, on the highway?
He said, on the highway.
I said, how many people did you trust with your life in five minutes of highway driving?
And he had an emotional meltdown in class because it just, it spreads, right?
Because it's not, it's the drivers, but also the brake mechanics, but also whoever made the brakes and all the science that went into making the brakes and all the, right?
We are on this precipice of endless trust.
And I think Annette Baier is right that it's vulnerable and it's nauseating.
And one of the pleasures of both conspiracy theories and metrics is they give things the illusion of comprehensibility and completeness.
That you can actually see everything for once in your life.
This is also how I think about games.
Games are systems where you can see everything for once in your life and hold a whole system in your head.
But games don't pretend to be about reality.
With regard to games pretending or not pretending to be about reality, in a couple of weeks I'm going to be interviewing a gaming reviewer.
His name is Riley McLeod.
And we're gonna be talking about how I've been learning to play Cult of the Lamb, which is a roguelite game
in which you play this cute little lamb guy who has to recruit members to a cult.
And they will ultimately serve your master, who's called the one who waits.
So anyway, my kid is teaching me how to play.
The most important thing that I'm trying to learn is how not to grip the Joy-Con so hard
that my hands go into spasm, because I'm not used to that.
But what I find fascinating, and more importantly, hilarious,
is that this game lays out the joys and troubles of cult life in such a clear form.
And actually makes me empathize with the stress and everything that the malignant narcissist
is dealing with all the time, because, you know, he's gaining and maintaining immersive power
over the group, and it's exhausting, and he has to spend money to feed them,
and, you know, then you have to clean up their shit.
If your followers in this game get disillusioned, you have to either kill them before they spread discontent, or you have to whip up a kind of prophecy to increase everybody's faith quotient.
So it's this hilarious nightmare.
So I wanted to ask Riley, but I'm gonna ask you as well, about the possibility that setting cult dynamics,
or setting cult mechanics within game mechanics, is actually way more educational,
way more defanging and demystifying than reading books about cults.
Because I think, I have this feeling that it could normalize cult dynamics.
I think it makes them transparently obvious to the point of providing a kind of inoculation.
So, I don't know, maybe we can finish on that general idea about the possible social goods that can come from games.
And maybe even imagine what kind of game would get at this problem of conspiracy theorizing
that we've been discussing as this core human dilemma.
I got convinced early researching games, I can't remember who was writing about this,
but one of the things they thought was that a lot of people are trying to make games like fictions
and capture emotional perspectives, but this writer thought that what games are really good at
was modeling mechanical scenarios.
Oh, this might have been Ian Bogost's stuff.
He's a big game studies person.
So what games are really good at is giving you a mechanical system that has causal links and seeing how the parts interconnect.
So for me, what this ended up being was something like: if you add on the fact that games give you a goal, they're really good at getting you into an alternate perspective, but it's not an emotional perspective.
It's an instrumental perspective.
So one of my favorite games is a truly evil game called Imperial.
In an imperial, it's World War I and the six nations are fighting it out, but you don't play them.
You play shadowy investors trading investments in the nations and manipulating the war for profit.
And a lot of what the game is about is watching the co-investments.
So, for example, if I was heavily invested in England, and the player that was heavily invested in Germany was going to come for England using Germany, one thing I could do is fight.
Another thing I could do is sell them cheap stock in England, so now they're co-invested.
It's evil.
But also, every time I have to, like, deal with the business school trying to defund philosophy, like, I think about this, right?
Because that has given me the practical mindset of being like, okay, are we going to fight?
Or can I shift the incentives to give us shared incentives?
Right?
So each of these things encapsulates this different mechanical perspective.
But I do think there's this, it's funny, I've just been using the word inoculation a lot.
So one of the things I ended up saying at the end of the games book was that a reason why I thought that actual real deep play was in some sense an antidote to this kind of hyper-gamified, narrowed view was that playing games is practicing stepping into sharp mechanical scoring systems, and then stepping out of them, and asking yourself if it was worth it.
Or if you liked it.
Let me try another thing.
I have another paper that you might be interested in.
I think I wrote this after the last time I talked to you all, about the value of intellectual playfulness.
What could intellectual playfulness be for?
So if you think of intellectual playfulness as stepping into other intellectual perspectives for fun, that would be a really good insurance policy against dogma, against an entrapping system, against an echo chamber, a conspiracy theory.
Because those things are feedback loops, right?
And they work when you're stuck inside them and you don't see the world in another perspective.
And then there's this question about like why... Okay, this is the geekiest part of this paper, but you all might like it.
So why not think instead of being playful and exploring for fun, you should just explore to know more?
And my answer is something like: if you already have a worldview and you explore to expand your intellectual knowledge, you will be exploring what seem like plausible scenarios to you.
But inside the right echo chamber or conspiracy theory, they work by making reality deeply implausible.
I felt this way about, like, the way Rush Limbaugh, like, made feminism, like, the ultra evil.
Like, if you bought into his worldview, you would never explore that.
That's such a low likelihood scenario.
But if you're doing it for fun, then you're just kind of going to be random.
So I don't know if that's satisfying to you, but I think somewhere in this stuff, the feature that conspiracy theories and institutional metrics seem to share is this kind of locking into a simple worldview, where you stop seeing things that are outside.
And the thing that unites playfulness and fun and games is this tendency to shift perspectives easily.
Thi, it sounds like a sort of distinguishing moral feature of the positive or liberatory game is that it is a box that you can walk into, with distinct rules and identities and agencies that you can operate.
But you can also step out of that box, and then you can step into a different box, and there's a gap in between the two boxes that you step into, or even a third one later on.
And gamification just does not allow you to do that.
So is that fair as a summary, and what do you do in between the boxes?
I mean, I think it's a fair but partial summary.
Like there's this thing about engaging with rule systems under control.
And I think it's possible to know what you're doing when you enter a pervasive rule system and really give yourself to it: an academic life, a life with yoga, a life as an artist.
And it's also possible that some games, even though you open the box and close the box, they're so pervasive and you're so obsessed with them that you... I think, I mean, part of this is, I think the thing I want to worry about is, that I don't want to make easy, is to think that, oh, there are certain games and they're just safe.
And there are other things that are just evil and you should just avoid that.
And it's much more complicated than that. I mean, since we've talked so much about yoga, I've felt in my life both very unhealthy and very healthy relationships to yoga, right?
Where there have been times where it's something like, this is this incredible thing that I'm doing.
And I go into this room and I try to do this thing and it leaves me rich and fulfilled.
And there are other times when I was like, I just want to make the next pose.
I just want to make the next pose.
And so it is both. The fact that games close and you can exit them makes it easier to reflect; that encourages reflection, right?
That encourages figuring out whether this thing is actually working for you.
I think the deeper question is something like: when you're engaged with this kind of external system, do you know why you're doing it?
Do you have a reason?
Have you changed it to make it work for you?
Thi Nguyen, thank you so much for coming around and playing with us.
Thank you.
Always a pleasure.
Yeah.
Thanks, Thi.
Thank you for listening to another episode of Conspirituality.