Jon Guy, author of Think Straight: An Owner's Manual for the Mind, pushes back on "critical thinking" as a buzzword, framing it as rigorous self-evaluation, like rewiring the brain's "street world" with physics, statistics, and metacognition to discard pseudoscience (e.g., UFOs) without abandoning experience. He warns fringe beliefs enable dangerous skepticism, like vaccine denialism, and uses Socratic questioning to guide resistant minds, contrasting online refutation battles with real-world rapport-building. The "veritical vending" metaphor reveals how brains filter data: reliable but poorly presented info gets rejected, while misaligned claims slip through. Guy's Richard scale (Feynman to Dawkins) and neuroplasticity insights, like mastering The Blind Watchmaker, show critical thinking strengthens mental pathways when paired with deliberate learning, not just contrarianism. [Automatically generated summary]
And we're back with Truth Unrestricted, the podcast that would have a better name if they weren't all taken.
I'm Spencer, your host, and I have a special guest today.
I will allow him to introduce himself.
Go ahead, John.
Hey, Spencer, thanks for having me on.
My name is John Guy.
I'm the author of a book on critical thinking called Think Straight: An Owner's Manual for the Mind.
I'm also a contributing author to an upcoming book called Investigating Clinical Psychology: Pseudoscience, Fringe Science, and Controversies, which should be coming out in October, and that's available for pre-order on Amazon.
And I also write for a website dedicated to promoting critical thinking called thinkingispower.com.
Wow, great.
And to no one's surprise, really, I have you on today to talk about critical thinking.
It seems a little on the nose, but I think we'll do it anyway.
So when I think about critical thinking, one thing that occurs to me right away, just from the name alone, is that the term is used, and really overused, by a lot of people who don't really delve into the subject matter that would justify it, but just kind of use it as a byword to say that they're doing something good.
It tends to lose connection with the meaning of the words themselves.
In the same way that if, let's say, you went to grade school with someone whose name was Terminator.
And you went from grades three through seven with this person named Terminator.
And you might even have some, you know, nickname for them or something.
And then in the eighth grade, they kill someone.
And suddenly you have to face the fact: oh, right, this is a Terminator.
That's in the name.
It's in the meaning of the word.
What I mean is that through everyday contact with the term outside of its real context, in discussions not directly involved with actual critical thinking, where it's just used as a byword, a lot of people lose contact with what's really happening with critical thinking.
And so I start there with this.
And I think the idea that we should just talk about what we really mean.
I mean, it literally is in the name, thinking critically about first your own thoughts and then also the information that you get.
So let's just kind of start there.
What's your take, your first thoughts on critical thinking, having written a whole book about the subject?
Yeah, there's a lot to unpack there.
Oh, yeah.
I'd say that in one of those contexts, like with the Terminator guy, that's more like equivocation, right?
That's when you take a particular term that has multiple meanings and then apply it to a specific context; those meanings need to be teased apart in that situation, right?
And critical thinking kind of suffers from a similar flaw, in that a lot of people hear critical thinking and aspire to be critical thinkers because it's catchy, right?
And it's a good, positive quality to have.
Yeah.
It's yeah, people, people strive to be critical thinkers.
And in my experience, most people who haven't received instruction on critical thinking just have kind of a folk understanding of the term, right?
So it gets confused with like contrarianism or cynicism or denialism.
And those are all separate things.
And again, the nuance is important to tease those things out.
And when you get into the weeds on all of those, they do have fundamental differences.
But when we talk about critical thinking, man, there's so many components to it, right?
Because when you get into critical thinking, it's not just about metacognition or self-examination or reflection.
It is about all those things, but it's also applied knowledge and applied skills in pursuing an intellectual endeavor that aims to get to the truth of reality, or as close as we can to understanding truth through our limited senses.
So in the book, I wrote, I tried to cover a lot of different fields of study because there's so many that you can write about that are relevant to critical thinking.
So, for example, a lot of people will make arguments, I saw it with my own eyes, or I remember it as clear as day, right?
Well, our memories are massively flawed.
So if you want to understand your cognition a little better, you should understand what those flaws and memories are and how they work.
How are memories stored?
How are they formed?
How are they retrieved?
Those kinds of things are really important.
Because I don't make that argument very often, like, no, it happened.
I remember it.
Because I know that I could be misremembering it.
And most of the research suggests that I probably am misremembering it, at least to some degree.
Yeah.
And some people would attempt to use the fact that your memory isn't perfect to deliberately distort things about the past and about past events, or about your own memory, or to mislead you.
And once you know more about how your memory distorts things or gets things wrong, and about the ways in which you can more reliably trust your own memory, then that won't happen as much.
Right.
And so those are just good things that we should all know, and yet we don't.
Yeah, that's wonderfully said.
I think the idea is that critical thinking and skepticism are a way of approaching information, right?
It's not like a position.
It's not a stance.
It's not, I'm a critical thinker, that's my stance.
It's not a place on the political spectrum or anything like that.
No, it's not.
It's a way of looking at the world with respect to claims about reality.
And just like you said with memory: if we understand the foibles of memory a little bit better, then if you and I get into some sort of argument or debate about something and I'm relying on a memory, I can put a caveat on it and say, this is how I remember it, rather than being certain and saying, no, it happened because I remember it happening.
So there's a little bit of room for uncertainty.
And then I have to have the intellectual humility to say, I could be wrong because I understand that my memory might not be accurate on this particular thing.
Yeah, memory is a particularly tricky one because much of what we see as ourselves comes from things that are in our memory.
And so it feels like you have to, you know, let go sometimes of a carefully cherished belief or memory in that way.
And that's really difficult for a lot of people to not have the thing that they call themselves firmly cemented by their seemingly perfect memory, which, as we said, is not perfect.
Yeah, I got kind of a cool anecdote about that.
So when I was a kid, we used to go camping at this river and there's this old wooden bridge that goes across the river.
And it was like, you know, what they used when they carried wagons and stuff across the river.
It's shut down now.
But my brother and I were lying on this bridge at night watching stars, and there was this spaceship, and we all saw it.
It lit up the sky for just a brief second.
And it was obviously an alien spacecraft because we saw it go from one particular star to another particular star.
And it wasn't like it went a little past it or it started a little before it.
It was like clearly it started here, it had a destination and it went to that other destination.
So growing up, this was like really strong evidence for me that like UFOs were like circling around the skies and they were watching us.
I don't know what they were doing, but they were definitely there.
And I held that belief till, you know, probably into my, I'd say into my mid-20s.
And when I started getting into skepticism and critical thinking, that wasn't the first thing on my mind, obviously, but it did cross my mind as the years went by.
And I had to kind of give up that memory because now I have a relatively decent understanding of, you know, the fundamentals of physics.
And physically, that's not possible.
I also have a relatively decent understanding of cosmology.
So I know that that was probably a meteor that just happened to cross the sky in that particular pattern.
I also have a relatively decent understanding of statistics.
And statistical probability would suggest that every once in a while, a meteor is going to go or seemingly go from my vantage point from point A to point B.
So all those things kind of did what you said.
They kind of collapsed that cherished memory of laying on that bridge with my brother and seeing a UFO, you know, and now it's kind of a different perspective.
So for me, I really love the idea of skepticism as an approach because I can appreciate that memory for what it was, right?
I shared this moment with my brother on this bridge as kids looking at the stars.
We saw something cool.
And now having been, having had all this kind of these opportunities to learn about these things, I think that my experience, or at least my memory of the experience, is more closely grounded in reality now than it was then.
And I think that's, you know, much more productive than believing in, you know, a fantasy.
Yeah.
And having seen a thing that you couldn't explain, and explaining it in a fanciful way, is, you know, not new for children, right?
Right.
So, you know, to go back and say, well, you have to reexamine this and rethink something about yourself in this way.
Some people would look back on that and say, well, yeah, but then I have to give up that thing.
That's a thing about myself.
I've always felt this was true, and now I have to give it up.
But you don't.
That shouldn't have to be a part of yourself.
And it's the same with people who convince themselves that their dead relatives, the ones they loved, are still in the house.
And that can be a comforting thought, but it doesn't change what's real around us.
You know, belief in ghosts is interesting, but it's not falsifiable.
It's not a part of our scientific canon because we don't seem to have any way to observe ghosts with any kind of third-party instrument.
It's only individual people who claim this, and all of them have reasons to claim it's true.
And it's maybe not.
And that's sad.
But yeah, there's a lot of beliefs that are that are like this.
And a lot of people attach a sense of themselves to that.
And I think that we need to move past that because right now in our world, people are attempting to distort those things for political gain.
And that's much more dangerous.
I mean, happening to believe in ghosts, I've never really thought too much about trying to debunk any of those things, because they don't really hurt anyone all that much, unless they contribute to an unreality wherein a person can start to think: well, science says that ghosts aren't real.
And then science also says that vaccines are safe, but I want to believe in ghosts and I don't want to believe in science.
So I'm much more likely to not get vaccinated and endanger myself, my family, and the rest of society.
Maybe then it's time to, you know, lump these all in together and tie them in, because maybe they are tied in.
Yeah.
Once you stop thinking critically on one thing, maybe you're more likely to think less critically about other things.
What do you think?
Yeah, I think the research bears that out as well.
There's kind of a fun term called crank magnetism that suggests that.
Yeah, you're familiar.
Yeah, you're familiar with the term.
And this basically talks about how, you know, these beliefs about weird things, whether they're ghosts or the paranormal or, you know, anti-science ideas like, you know, vaccines are dangerous or GMOs are harmful or whatever, they kind of run in pairs.
One of the strongest predictors of belief in one particular thing, like ghosts or UFOs, is whether or not you believe in other particular things.
And that holds true across the board with respect to conspiracy theories as well.
So my position is that there's really no such thing as a harmless pseudoscience because it kind of creates that sort of culture and mentality of uncritically accepting things on whimsical or shoddy evidence.
And if you do have that propensity, you're probably going to carry it over to other things that are more serious, like health or politics.
So it's definitely an issue.
If you look back at the history of skepticism, a lot of people didn't really care about it, because it was focused on aliens and Bigfoot and the Loch Ness Monster and, you know, things of that nature that a lot of people thought were trivial.
And I never held that belief because exactly what you just said is that those kinds of beliefs, they're maladaptive.
They can carry over into other realms of our lives when we're thinking about things that are much more important.
Yeah, it's like an erosion of rationality.
Sure, exactly.
Yeah.
So I have a couple of questions.
I didn't have a huge list of questions, but I do have a couple that I'd like to ask.
So we said before, critical thinking is generally seen as a pure positive, which seems obvious, but to me it's not so obvious.
There are some people who think that all that book-learning stuff isn't all that good.
So the idea that thinking itself can sometimes be seen as a negative is interesting, though not useful here.
But so far, critical thinking itself is pretty much universally seen as a positive.
So everyone generally thinks that they're doing it or would like to think that they're doing it.
And so they're biased in that way.
What do you think is the best way to approach a person who clearly isn't thinking critically when they clearly think that they are?
Do you have any thoughts on this?
That's a really tough question to answer, because there are going to be many variables involved.
Where are we having this conversation?
Who are we having this conversation with?
How politically charged is it, right?
Right.
Yeah.
How emotionally salient or politically charged it is, there are a lot of factors.
I have conversations like those regularly.
I have them with people at work.
I have them with family members.
I have them with people on social media.
And my style on how I have those conversations kind of changes based on what my goals are as well as on what the situation calls for.
So I think if your goal is to try to persuade somebody that their beliefs are mistaken, then you need to be very careful with that, right?
You should like first and foremost, you should establish a rapport if you don't have one already.
If it's with a family member, you probably already have some decent rapport.
If it's not with a family member or if it's with a co-worker who you're not really close with, establish a rapport with that person.
And then after that, get some sort of baseline of agreed facts and goals, right?
So that this kind of puts them in the perspective that they understand where you're coming from.
You understand where they're coming from.
And most of the time, people have shared goals, right?
Like, for example, I think that vaccines are safe and effective.
Anti-vaccinators think they are not safe and not effective, right?
Yeah.
But we both really do have the same goals in mind.
Safety of people, yeah.
Right.
The safety of people, the safety of children, right?
I want vaccines to be safe and effective.
I happen to believe in the scientific vetting process of bringing vaccines to market.
And I think that that process demonstrates safety and efficacy.
They tend to not think that, right?
But like if I'm having a conversation with a close family member about vaccine safety and efficacy, we already have rapport, but then I'm going to go to what are the goals here, right?
What are our goals?
You want something safe and effective.
I also want that.
Well, then where are we differing here?
And then you can get into what their beliefs are.
And a really good, effective tool is to just let them talk, and then ask pointed questions in kind of a Socratic way, so that you put their own thoughts and ideas into perspective for them, through themselves; when they vocalize these thoughts and you ask these pointed questions, they can come to different conclusions all on their own.
And that's a really effective strategy.
Yeah, that's really good.
I've found myself that I have two distinctly separate strategies based on where it is.
And there are only two wheres now, really, in the grand scheme of things: real life and the internet.
I like how you put the internet as not part of real life.
Well, it's so true.
In World of Warcraft nomenclature, that's how we referred to it when I played World of Warcraft.
There was World of Warcraft life and then there was real life.
And so it just fits naturally.
But the internet is its own beast.
It's not really like real life.
There's very few empathy reactions.
There's very few face-to-face interactions.
Most of the social cues are gone.
And so, like I say, I find that I have two completely separate strategies.
And one is, I think, similar to what you mentioned, which is about developing a rapport and not just being contrary, because otherwise they can just paint you in their mind as a contrarian.
But on the internet, I find that I have to have a completely different strategy because the ability to just block and tune a person out is ever present and it's one click away, right?
So I tend to get a little more tricky.
I use essentially sales techniques.
I let a person start a conversation and I let it draw out as long as I can before I contradict them, because I think that if they've spent enough time in the conversation, they want to have more time in the conversation.
It's a sort of, you know, they've spent that much time building ground.
It is a logical fallacy, right?
Oh, yeah, that's escalation of commitment.
Yeah.
And so I tend to do that.
It takes longer, but I tend to do that.
I tend to draw people in more and more, and it's like you said, with asking questions.
I tend to ask more questions, get them to get more and more responses.
And then I start contradicting what they're saying with more and more evidence.
And almost always I only contradict the evidence that they've brought up.
Make them think about what they think is true when it's not true.
Or, if it's true but they've come to the wrong conclusion about it, try to show them how it is that they came to the wrong conclusion.
Right.
And I don't know how well it's working, honestly.
You know, results are very, very mixed.
Yeah.
There's no way.
I was going to ask that.
How does that strategy work?
But like you said, it's painstaking, right?
Because when you think about Brandolini's bullshit asymmetry principle, I think Brandolini was off by a couple orders of magnitude, right?
So for anybody who's not familiar with it, Brandolini's bullshit asymmetry principle says that for any piece of bullshit, it takes an order of magnitude more time and effort to refute it than it does to spout it.
And it is, like you said, like it's painstaking, right?
Because how do you explain something to somebody?
On social media, they might make five completely separate false claims in one tweet, right?
And each one of those needs to be separated out.
And then they all have a history and they have like research behind them.
They all have context.
They all have context.
Exactly.
And it's like you have to backtrack so much just to get to that base level we were just talking about, the one you have in an in-person conversation, that it takes so much effort to establish any kind of mutual playing field with people, because it's just so overwhelming.
And then most of the time, science communicators get really jaded because you can go through all this effort, right?
Yeah.
You can, like, I have files on my computer and on my phone with like, yeah, you know, tons of a list of references.
Exactly.
Yeah.
Just tons of stuff.
Papers that I've been reading for the last five, 10 years or whatever.
And, you know, they're all there, but, like, what is the point?
If I'm going to have this conversation with somebody and we get to the point where they say, well, where's your evidence, right?
Mind you, they're the one that made the claim, so the burden of proof is on them.
But if you get past that, just because you want to establish that you do have some evidence, and you present them with some sort of paper, then they just move the goalposts and say, well, that was funded by Bill Gates.
So you can't trust Bill Gates, because he wants to depopulate the planet.
And then they just sea-lion with a whole bunch more claims.
And then you're back to square one and you did all this effort for absolutely nothing.
And it's really, it gets really tiring sometimes.
Yeah.
And there's, there's almost no payoff.
No, there's no payoff.
There's no sense of satisfaction at the end of the day.
Yeah.
I totally agree.
I think Jonathan Howard mentioned that in his book; he was talking about people who combat scientific misinformation, scientists and doctors who are qualified to do that sort of thing.
They're not paid to do that.
That's on their own spare time.
And most people don't care.
Most doctors and scientists, they want to treat patients and do science, and they don't want to get into all that.
Because one, they don't care, and two, they're not trained for that kind of thing.
And I think one of the things that Jonathan Howard wanted to do is push more people towards doing that, because obviously we're getting to a state in society where, what happens when everything we hear is fake, right?
When every piece of news is false, like where is the tipping point where people start to apply some basic skepticism?
And I think Jonathan Howard's push to encourage more doctors and scientists to get involved in public communication, get involved in politics is hugely important.
Yeah, that's, I almost think that should be a whole other episode because there's so much there.
Maybe next time, but I'm going to get to my next question here because I think it's interesting.
When I was thinking about this, what I would want to talk to you about, I mentioned before that critical thinking has become sort of a buzz phrase.
We have another buzz phrase that's right next to it, maybe also overlapping with it, probably is overlapping with it, at least somewhat.
And that's a phrase called thinking outside the box.
So in most conversations, these two things are sort of used interchangeably, completely synonymously.
So in your opinion, do you think they are exactly synonymous or do you think they're not quite the same or some overlap?
Where do you think these two things fall in relation to each other?
Well, to be honest with you, I haven't really thought deeply about that, but I can say that I don't agree with the premise that they're the same, right?
Because as far as I'm concerned, like thinking outside the box, that's a healthy endeavor, right?
Yeah.
But when I think about thinking outside the box, to me, that has to do more with creativity rather than background education.
Whereas I think critical thinking has more to do with background education and more applied self-reflection and knowledge towards a claim, right?
So if you make a claim and someone says, well, we need to think outside the box about that.
Well, what does that really mean, to think outside the box?
Does it mean outside of a preferred narrative?
Or does it mean that we need to brainstorm different ways of approaching it, or whatever?
So to me, that term has more to do with creative thinking, whereas critical thinking has more to do with having informed approaches to information.
Yeah, yeah.
I mean, oftentimes thinking outside the box is used in the wrong context.
I think that its original popularly used context is in problem solving, riddle solving, getting yourself out of a jam kind of thing.
We have the term hack, which originally meant something like computer hacking, but it's developed a broader definition now: devices being used in some way that's not their original intent.
And we have life hacks now.
Yeah, life hacks and that sort of thing.
They're all about devices and situations where you're using something outside of its original designed intent.
And so in that context, you are thinking outside the box every time you're doing that.
And that's admirable.
Critical thinking, I think you're right, is more about discovery.
It's more about a system inside your brain that you're using to teach yourself about reality.
And so they're almost like two halves of a whole.
It's like two soldiers who fight back to back.
One who's looking backward at all of the world and universe attempting to make sense of it.
And the other that's attempting to use that knowledge to then solve some potentially intractable problem, right?
And of course, one would feed into the other.
The more information you get and glean accurately about the way everything works in the universe, the better you're going to be at thinking outside the box.
And probably all the muscles in your brain, well, I shouldn't call them muscles, but all the ways in which your brain thinks when you're thinking outside the box, are probably going to help you think critically.
But I think that you're right in that it's a Venn diagram: there are two circles smushed together into two ovals that aren't quite overlapping, but are very, very close together.
I think that's to me anyway, I think that's where that sits.
So I'm kind of glad you agree is what I'm saying.
Yeah, no, I think you articulated it far better than I did.
Well, you definitely had the critical thinking part down.
Yeah.
Yeah.
I think, no, I think you're absolutely right that there are definitely two parts to a whole.
I like how you put it in the terms of the Venn diagram because, you know, to visualize it that way, it makes a lot of sense because they're smushed together clearly, but I don't think they're quite overlapping.
Right, right, right.
Yeah.
I think, I think that's really useful.
And I think you're probably right too that our brains work through like exercise, right?
So one of the one of the common sayings is neurons that fire together wire together.
So if you, if you, um, if you, if you tend to think outside the box and creatively, and you also have some critical thinking skills, I think that that sort of the wiring that allows you to think creatively will also influence how you think critically about things and vice versa.
If you are a good critical thinker, then when you are thinking outside the box, you can bring your strengths from that domain over into the other one as well.
I think it couldn't be any other way.
Yeah, and I welcome the internet to come and tell us the ways in which we're wrong about this.
I'm sure they will.
Great.
I'm sure they will.
Yeah, that's one of the strengths of the internet.
Yeah, well, that's what it should be used for.
Yeah, absolutely.
Correcting the things that are wrong.
Yep, exactly.
I agree.
So I reference a very small, very small corner of neuroscience very, very often, actually, in my everyday life and also on this podcast, where I talk about the models in the brain.
And you referenced something in the book that you even brought up when we were talking about doing this episode of the podcast.
You talk about a metaphor you call street world.
Oh, yeah.
And as soon as I read it, I thought immediately: okay, it's just a different metaphor for the same thing that I do.
Do you see this as a neuroscientific concept?
Or did you just come up with it as a way to explain the way in which we think?
I'd say both.
Yeah, sure.
Yeah.
Yeah, I'd say, I'd say a bit of both.
Maybe just explain quickly what you mean when you say street world.
Yeah, so street world is kind of an overarching theme throughout the book, because I apply it to the chapter on memory, and I also have a chapter on conspiracy theories.
And, you know, Conspiracy City would be a particular destination in street world.
And I thought it was a really powerful metaphor.
So what I basically did is I took the brain as we know it, as a ball of, you know, interconnecting neurons that send signals back and forth to each other to create what we have as an experience of a human and a self.
Right.
And instead of having dendrites and axons and neurotransmitters and synapses and all that stuff, I changed it to streets.
So you can think about the brain inside your head as just a big giant city with a bunch of interconnected streets.
And those interconnected streets have different destinations and they have different places.
And those different destinations and places represent certain things, whether that's how you feel about Hillary Clinton or how you feel about aliens, whatever it may be.
Yeah.
Or both, whatever.
Yeah.
Yeah.
It doesn't matter, it could be anything.
You know, how you feel about public education or how, you know, what do you think about redwood trees or whatever, you know, whatever it may be.
And so the units of travel in the brain are neurotransmitters; well, in street world, they would be, you know, people moving around and passing information back and forth to each other, kind of memetically.
And I developed the metaphor throughout the book to illuminate different ways in which the brain actually does work, using it to paint a picture of whatever it is I'm talking about in that section of the book.
Right.
I thought it was a very good metaphor, especially when you look at things like we were just mentioning: the more you do these activities, the more you actively engage your brain in critical thinking work and thinking outside the box and every other activity that your brain does.
You are, in your metaphor, building infrastructure in street world.
You're helping to strengthen pathways between different parts of the brain to help you think better at some future moment, because we don't normally think about things we do now as training for a future moment.
We might do it.
We might think that way when we're training the body, but we don't often think that way when we're training the brain.
We probably should, because the decisions we're making right now are important, but the decisions we're going to make in the future, we generally feel are even more important.
Yeah, I like the idea about the infrastructure, because the way I developed it in the book is: let's say you have a thought about something, call it one of the 300,000 species of beetles, right?
Yeah.
In your brain, that's going to represent kind of like a cul-de-sac or a dirt-road dead end in the metaphor of street world, right?
There's nothing really there, right?
Unless you're an entomologist.
Most of us are not, right?
Right, exactly.
And if we develop an interest in this particular beetle, right?
We are going to expand those neuronal connections in different ways so that we have kind of a model in our head about this beetle, right?
And to use the street world metaphor, what I would say is that if we need to understand more about this beetle, we can't have a dirt road getting us to that destination, right?
We have to lay some concrete.
If you're going to build a whole neighborhood there, you're going to bring a lot of supplies.
You're going to need a bigger road, right?
And you're going to have to pave it.
You're going to have to put lights down.
And if we're going to do that, we want to build it with the most current technology, with the best materials, right?
We don't want to use shoddy electrical systems to build our houses because it's going to fail on us, right?
And so we're not going to build our neighborhood with shoddy concrete or old reused telephone poles that are archaic and don't work anymore, right?
We're going to bring to the table the best we have, right?
And the reason we do that is because we want to have the best understanding about this beetle as we possibly can.
And I think critical thinking is kind of the most modern version of technology and architecture we have for the brain to get us to build those cities and those destinations with the best technology available.
Yeah, I agree.
I mean, I often try to ask the question: how do I know what it is that I think I know?
And I have a more general question that I ask.
It's this: what have I seen or experienced that leads me to believe that this particular thing is true?
And I ask myself this at least 20 times a day.
It might seem silly, but when you walk into your own office and see all the things there, you're going to assume that many of those things are in the same state as they were the day before.
Every once in a while, you're going to be disappointed in that assumption.
But when you have to compare a new thing to all the things you currently have in your brain and there's a mismatch, most people, I think, just discard the new thing.
But we owe it to ourselves to examine ourselves and ask: what if the thing inside our brain needs to be changed to fit this new thing in?
So I ask myself that question: what have I seen or experienced that leads me to believe that this particular thing, which I have always thought was true, is actually true?
And I think that we need to ask ourselves that question much more often about many more things because flat earth is a phenomenon.
There's actually people who really, really strongly believe this, and they haven't asked themselves this question.
Yeah, that's a really, really important point.
There's another metaphor in the book that touches on that pretty strongly, about veridical vending, right?
So what I did was I took the idea of a vending machine, and a vending machine processes bills according to internal programming instructions, right?
So we all know that it will try to pull in basically anything that's shaped like a bill, or close to it, right?
Yeah, so it will pull in and accept anything that it's fed, right?
Which our brains do that too through our senses, right?
It pulls in anything that is fed to us, right?
You can't unsee things, right?
You can't unexperience them, right?
So with the vending machine, let's say we try to slide a three-by-five card into it; it's going to pull it in and spit it right back out, because that card violates the machine's programming instructions, right?
Now, the analogy here is that our programming instructions are exactly what you were talking about: our prior beliefs, our prior experiences, what we've seen, what we've heard, whatever would lead us to believe the things we do, right?
So, that's what we call our internal models of reality, right?
Or our worldviews.
So, if information comes in that is dissonant with our worldviews, we tend to spit it back out just like that vending machine does the three by five card, right?
And if it's consonant with our worldviews, we tend to accept that information without challenge, right?
So, we'll go back to the vending machine.
We know that vending machines can be tricked by counterfeit money, right?
It can look like money, it can feel like money, it can go through scanners like real money.
And vending machines don't have really powerful programming instructions to detect counterfeit bills, so counterfeits can make their way through.
And by analogy, misinformation, bad ideas, and disinformation can present themselves in the same manner that counterfeit money is presented to the vending machine, and they can make their way past our programming instructions, and we can start imbibing that information.
And too much of that information can cause what we were talking about earlier, about complete changes in our worldviews based on all the information that we've seen.
And the opposite can also happen: say you have a dollar bill that's legal tender, and you feed it into the machine.
Sometimes it won't take it, right?
And the analogous thing happens with us.
We can have good, accurate, true information presented to us in a way that our programming instructions just reject it.
Or it violates our worldviews or internal models of reality in some way.
And with the vending machine, sometimes we can pull that bill out and we can smooth it out and feed it back into it and get it in there.
And that's kind of what I think our job is as science communicators is to take that information and smooth it out.
If it violates people's internal models of reality, we need to figure out clever ways to smooth out that bill and feed it back to people so that they can, you know, get basic scientific concepts and kind of approach information with the sort of critical eye that skepticism demands.
Yeah, that's a great way to put it.
And I must say that I haven't finished the book, but I have appreciated it so far.
The metaphors especially are very useful and colorful and productive.
As a metaphor guy myself, I really appreciated that.
Well, that's great praise.
Thank you.
I appreciate it.
I also wanted to tell you that informally, I rate scientific literature on what I call the Richard scale.
It's not a good or bad scale.
It's a readable to unreadable scale based on the two Richards, Feynman and Dawkins.
That was going to be my guess.
And you're a lot closer to the Feynman part of the scale than the Dawkins part of the scale.
And that makes it a lot easier to read.
And that's another thing I greatly appreciate about the book.
Well, wonderful.
Thank you for that.
Authors want to hear that.
Well, as I was getting through it, I noticed something: I don't really enjoy reading scientific literature.
I much prefer fiction, as I think most people do.
But this book doesn't have the same difficulty that writers like Richard Dawkins put into their words, the kind that makes a book a lot more difficult to digest in large pieces.
So yeah, I'll give you another little anecdote.
And you can keep this for the show if you want or not.
But it's kind of a cool story.
When I read The Blind Watchmaker by Richard Dawkins, I remember getting done with that book.
And specifically, there were two chapters where I read them, but I didn't understand anything.
I didn't understand a single thing he said.
It was just completely over my head.
And that was really early on, when I took an interest in intellectualism and evolutionary biology, and when I started getting into physics and cosmology and all that kind of cool sciencey stuff.
I think it was about three years later, and at the time I was reading probably anywhere between 50 and 75 books a year, plus thousands of research papers.
And I reread The Blind Watchmaker.
And when I got done with it, I didn't even know which two chapters I didn't understand the first time.
I just couldn't remember because I understood the whole book the second time.
And I was reading it with an eye for that.
I was specifically trying to identify where it was that I'd had difficulty the first time.
But by that time, I had read so many books on biology and so many books on evolution and genes and that kind of thing that I couldn't identify what I didn't understand the first time.
It's pretty cool.
And that speaks again towards that neuroplasticity that we were talking about earlier.
Building those pathways, yeah, right.
Building those pathways, right?
The first time I read that book, I didn't have those pathways fortified.
And when I came back a couple of years later and I read the book again, those pathways were fortified.
I went right through the toll.
There were no stops, no construction I had to slow down for, nothing like that.
It just went smoothly.
And that's one of the things that was really enlightening for me when I was going through that whole phase of like discovering, you know, the world of intellectualism.
Right.
So, wrapping this up: where can people find you, Jon?
Yeah.
Well, I have a Facebook page called Think Straight that people can like and follow.
I can be found on Twitter at skepticjonguy, and that's J-O-N-G-U-I.
I also have a YouTube channel coming up with the same username; you can find it at skepticjonguy.
That's going to be launching towards the end of this month.
And we've got some really exciting content lined up for that.
The end of August, that is, this month being August, but yeah.
Yep.
So yeah, that's all great.
I wish I was nearly as busy as you are.
Yeah, it's not as fun as it sounds.
Yeah.
And I'm available on Twitter at Spencer G. Watson.
And anyone who has any comments or concerns or wants to tell us all about what we got wrong on this episode can send that email to truthunrestricted at gmail.com.