All Episodes
Feb. 16, 2026 - Decoding the Gurus
37:06
Decoding Academia 34: When Prophecy Fails Debunked? (Patreon Series)

When Prophecy Fails (1956), by Festinger et al., claimed UFO cult members, led by Dorothy Martin, doubled down after a failed prophecy, proving cognitive dissonance. But Thomas Kelly's 2024 article in the Journal of the History of the Behavioral Sciences debunks this, revealing archival evidence of group fragmentation and Martin's later recantation while exposing ethical misconduct, like faked alien communications. A 2024 replication study across 39 labs and 4,800 participants found little support for the theory's core claims. The episode underscores how "persistent" cases skew research, urging scrutiny of disconfirming evidence to challenge flawed psychological narratives. [Automatically generated summary]


Why We Align Behavior 00:06:37
Hello and welcome to Decoding the Gurus, Decoding Academia Edition.
That's the academically themed sub-podcast of the world-famous and highly successful podcast, Decoding the Gurus, of which there are two hosts: Matthew Browne, the psychologist, and Christopher Kavanagh, the cognitive anthropologist slash psychologist.
That's me and him.
Hello, Professor Matt.
Nice to see you today.
Hello.
Hello, Mr. Chris.
Hello, hello.
Yeah, would you say we have a very special episode of Decoding Academia coming up?
Yeah, I guess so.
I guess it's what I would consider an interesting case, a thing that's been in the news, also a pillar of psychology.
And it involves ethnographic study, all sorts of things.
I think this is an interesting little case for us to look at.
I think so too.
I think so too.
Yeah, yeah.
Okay.
So where are we starting?
We're starting when prophecy fails, I guess.
Isn't that where we're starting?
Okay.
Well, we're going to be looking at an article that just came out this year in the Journal of the History of the Behavioral Sciences called Debunking When Prophecy Fails by Thomas Kelly, right?
But it is worth beginning with what is When Prophecy Fails and why would anybody want to debunk it anyway.
So When Prophecy Fails is a book from 1956 written by a group of psychologists, most famous among them Leon Festinger, who is associated with cognitive dissonance, the theory of cognitive dissonance.
Now, Matt, you're a psychologist.
What do you know about the theory of cognitive dissonance?
What's that theory about?
It's kind of a big deal, Chris.
Kind of a big deal.
Yeah, it's very, very influential.
And I think just one of those classic phenomena that is thought to have broad-ranging impacts.
So what is it in a nutshell?
Well, it's basically the idea that it challenges that kind of more intuitive notion that what we normally do is we align our behavior to fit our opinions, beliefs, or perceptions.
Right.
So we go, you know, I think the world is like such and such, and therefore I'm going to do this.
I think it's not very far to get to the shops.
Therefore, I'll walk instead of driving the car.
Right.
That's how we, a lay person, a naive non-psychologist might think that people operate.
So cognitive dissonance says, hey, no, no, sometimes what we do is we align our beliefs to match our behavior.
It actually goes the other way around.
And you sort of think to yourself, well, I've been doing it this way all the time.
Therefore, I must think that, you know, this is the right way to do it.
So kind of the opposite tail wagging the dog scenario.
How did I do?
Is that all right?
Yeah, you know, I give A for effort.
But you're, you know, you're right that a lot of the famous examples go that way where you induce participants to do something that is boring or unpleasant or embarrassing.
And then they, in turn, try to justify it by either expressing, you know, how much they enjoyed it or saying that they really like the group, even though it's a crap group and so on.
So like you say, that's bringing things in line such that why would I have endured such unpleasant things for something that I don't care about?
I must care about it, therefore.
But the broader theory is suggesting that, like, in general, humans don't like it when they have inconsistent cognitions.
And these can be just thoughts.
It can be behaviors and thoughts, or it can be any combination thereof, right?
So it could be that you want to be fit and healthy, but you're going to eat that Big Mac hamburger, right?
And all those nice chocolate cakes and so on.
So this gives rise to dissonance, because how do I hold these two things together?
And rationalizations are one way that people can address this, or they can, you know, change the behavior so that it better matches their self-perception and so on.
But cognitive dissonance is kind of saying we don't like this when we notice things are in dissonance.
So we strive to reduce the dissonance via something.
And it can be adding beliefs, subtracting beliefs, changing behavior, whatever the case might be, right?
Yeah.
The more famous examples are changing beliefs to match behavior.
Yeah.
Yeah.
That's right.
There are kind of layers to the idea, because there's a very general way in which we just don't like things to be inconsistent.
We like things, you know, everything that we think, say, and do and believe to be, yeah, not dissonant, right?
To match up.
And we are motivated to adjust something, not necessarily the behaviors, not necessarily the beliefs, but do something in order to make it to make it work.
I mean, you know, this is why, for instance, it would probably come into play when people are trying to understand why we're so susceptible to fake news when it suits us, right?
So if ICE had, let's say, murdered somebody and some information came to light that, you know, maybe reflected badly on them in some way, then we might be more ready to doubt that or find reasons to dismiss it or disbelieve it entirely.
Yeah.
So like you could go that way or you could go the other way where Alex Predi is killed.
And if you are a big supporter of MAGA and the need for ICE to be more strict and enforce the things and whatever, then you might go, yeah, looking for evidence that he deserved it, right?
Or he did something wrong.
And in that way, lead to that you can justify your position, right?
So you could go in any direction around any topic.
Exactly.
And so it's sort of connected to those broader ideas, like need for completion, need for cognitive closure.
Yeah.
Yeah.
Where, you know, we just don't like it when things are mismatching, when things are complicated.
We would much prefer things to be coherent and to not challenge our existing beliefs and commitments.
Yeah.
And so one of the pillars of the cognitive dissonance literature relates to effort justification, which is kind of what Matt was talking about: whenever you engage in effortful acts or you go through painful initiations or whatever the case might be, there comes a need to justify why you did that, right?
Researchers Double Down 00:12:47
And like a famous study, Aronson and Mills looked at people going through severe, what they called severe initiations for a group, which in that study involved university students, all women, being forced to read erotic passages and swear words to the professors, right?
That was the unpleasant initiation event that they had to go through.
Times were pretty wild in the 60s and 70s of experimental social psychology.
They just won't let us do that kind of thing these days, Chris.
Yeah.
And like I have some issues around the way that particular study was conducted.
But generally, there were other conditions where people didn't have to do that.
Some people read milder words, and other ones didn't read anything that was sexually explicit.
And then they gave them access to this boring recording of a group.
It was supposed to be a group discussing like sexual things, which is part of how they justified the cover story.
But it was a completely boring, dull, non-informative conversation designed to be such.
And then they asked people to rank their attitudes towards the group and so on.
And the finding of that study is reportedly that the group that went through the severe initiation reported higher liking and more value from the conversation.
So that would be a kind of illustration of effort justification.
Again, take that account for what it's worth.
Because it is a social psychology finding from a certain time.
Yes.
So When Prophecy Fails is not that; it's actually not a study, or at least not a traditional study.
It's a classic social psychology book, but it is actually about a group of researchers observing members of a UFO cult or religion that had apocalyptic beliefs, made some prophecies, and these prophecies did not come true.
And then they look at the dynamics of what occurred whenever the prophecies failed, like the title suggests.
And the traditional account is that in light of the failure of the prophecy by the leader of the group, Dorothy Martin, unlike what you might expect, members did not become disillusioned and lose faith in the group, but in fact became more devoted and engaged in more proselytization.
And this is an example of cognitive dissonance in the real world, right?
Where there is disconfirming evidence provided, but people have already devoted themselves.
So they have to keep going.
And this became an often-cited example, but also a thing that people reference just in general around processes where there are groups whose members appear to, you know, double down in the face of failed predictions or things not coming true.
So it's a very influential book.
And the cognitive dissonance literature in general is a big pillar of science.
I think Festinger is in the top five psychologists cited of all time.
He's number five, I think, but still, there we go, big deal.
So yeah, so why would anybody want to debunk this pillar of the social psychology literature, aside from the fact that we know that a lot of classical studies, when examined critically, have not fared so well, right?
The Stanford Prison Experiment, and also Milgram, which has actually held up a little bit better, but there are issues there as well.
So yes.
So that's the context for the debunking when prophecy fails, right?
This is an influential book.
And we have an article published by Thomas Kelly, who is noted as an independent scholar in Washington, arguing that that account of what occurred is wrong for a variety of reasons.
Yeah, so it's interesting that Thomas Kelly is an independent scholar.
Not that there's anything wrong with that.
Some of my best friends are independent scholars.
Yeah, exactly.
But it's not incredibly common to rare.
So anyway, that's interesting.
But yeah, so I think, I mean, at least to me, and it's not like I've fact-checked the debunking, but at least to me, when I read this, Chris, it seemed to put together a pretty convincing case that actually the When Prophecy Fails case study and how it was represented is utterly rife with all kinds of misrepresentations and misconduct, actually.
That is scientific in terms of empirical misconduct and also ethical.
And he makes a pretty strong case that actually the narrative that is spelt out in When Prophecy Fails doesn't actually match reality.
What really happened is that the group responded pretty poorly to their prophecy not coming true.
You know, there was some short-lived rationalization of it, which lasted a matter of days.
And the group started to fragment.
They started to get less enthralled with this particular prediction, right, that the aliens, the UFOs would come and lead us all to enlightenment or whatever.
They didn't stop believing in UFOs.
They didn't stop being these cosmic mystic religious types, but they sort of, I guess, edged off this particular prediction, fragmented, and moved on to other things.
So the argument is, is that, no, well, it actually doesn't support this particular application of cognitive dissonance theory at all.
They actually changed their behavior in response to the disconfirming evidence.
Right.
And I mean, it's very strong at the end.
And in the conclusions overall, it says, you know, the end of the article says reappraisal should be swift.
Every major claim of the book is false.
And the researchers' notes leave no option but to conclude the misrepresentations were intentional.
It's not pulling punches.
And I also want to note that the approach is one of kind of investigative journalism in a way or like archival research.
They got documents, you know, research notes that were unsealed in archival material, interviewed people who had contact with the members or others involved later, looked back at contemporary reporting in newspapers and so on, and triangulated accounts.
So it's in the Journal of the History of the Behavioral Sciences, and that's a good place for it because it's like critically re-evaluating the claims made in the book and coverage compared to the actual historical information that we have.
And like you said, Matt, it paints a rather damning picture.
There's quite a few things that we might highlight.
Ones that I would note are that there's a lot of researchers.
Yes, there are people there observing who are identified as researchers, but there's a lot of people crossing the line.
There seem to be undercover researchers posing as new members.
There are researchers who take on the role of quasi-leaders in the group or potential successors and encourage the group to respond to events and so on.
So rather than this dispassionate observer role, this is actually researchers deeply involved and, in some cases, heading off social services investigations and child welfare.
Yeah, I mean, just the ethics alone of it would be like, this would be career ending today.
Because, I mean, you know, they infiltrated this group and were very deceptive.
Like, no, no one.
But in some cases, they identified themselves as like researchers, right?
Yeah, in some cases.
But like you said, you mentioned this guy, Brother Henry, who became like a quasi.
So he's an observer, but became a quasi-leader of the group, was treated as spiritually authoritative by the group, and then later admitted he precipitated climactic events.
So you have the actual observers influencing the group; you don't have to be a methodologist to understand the problem there.
And other ones like faked supernatural experiences, they use like automatic writing to communicate with the aliens and they participated in that and faked that as well.
And actually they were active, you know, enthusiastic participants.
So yeah.
Yes.
And there were in the research notes, there were some parts where the researchers were like, is this ideal?
Ideally, you probably wouldn't become this involved.
But the other thing, apart from their kind of interference, is that they also misrepresent activities, right?
Like, so he gives an example of something which is presented as non-proselytizing activity prior to the disconfirmed prophecy, right?
Where somebody visits a salesman and they're kind of lectured for an hour about the beliefs of the group.
And then after the failed prophecy, a similar event where they talk about somebody visiting and being lectured is presented as evidence of increased proselytization, whereas the first thing is not treated as that.
So they're saying that, you know, basically their narrative takes precedence over the actual facts and it colors the way that they interpret activities.
And so he looks back and notes, you know, cases where it appears they were proselytizing and trying to promote things in media before the failed prophecies, right?
It's kind of ironic, isn't it, that like they were investigating cognitive dissonance.
But then the researchers themselves just basically took whatever they observed and happened and molded it to fit their pre-existing belief in cognitive dissonance theory.
Isn't that kind of a beautiful thing?
Yeah, it is.
And, you know, and like you said at the start, Matt, they also, you know, they trace that the leader went on to recant these specific prophecies, but emigrated to South America, I think, somewhere.
And then, you know, went on still in contact with mystical forces and beliefs about UFOs and so on.
So it wasn't like they recanted in general all of their beliefs, right?
In most cases, they continued on, involved in, like, UFO subculture or psychic belief stuff, and founded new groups or became involved in new groups and so on.
Yeah, that's right.
So, you know, it is still a disconfirmation of the theory, I would say, right?
Because as you said, the leader, Dorothy Martin, publicly recanted this specific belief, right?
So the specific belief that the flying saucers would come down and rescue them.
Is that what they were supposed to do?
Take them away?
I was never sure.
I believe so, yeah.
Yeah, yeah.
So, so of course, this particular prophecy not coming true didn't cause them to go, oh, well, UFOs are a load of shit.
I'm going to become an atheist now and get into science.
No, of course.
They continued being into the occult.
They continued being into UFOs.
But this specific belief that the UFOs, you know, were communicating with them and were going to come down and take them away.
They recanted those ones.
They did change their mind and they stopped promoting that particular prophecy.
So they didn't double down on the specific belief that was disconfirmed.
And they dropped their evangelizing to a large degree.
And as you said, the things that Festinger et al. cited as, like, increased evangelization, which would have been a nice thing from the point of view of cognitive dissonance theory, didn't happen.
Yeah.
So look, we haven't fact-checked it, I don't know if you have, but I haven't attempted to debunk the debunking.
I've taken everything that's written there on good faith, I suppose.
Well, so I went and looked at the materials that were posted up in the Open Science Framework because the archival materials are scanned and posted there.
And I don't have any reason to doubt the account.
And I think it's in line with a lot of things that have occurred recently, right?
Many Versions of Cognitive Dissonance 00:15:16
Like the similar sort of thing happened around the Stanford prison experiment where the researchers appear to be giving rather strong instructions for the people about how to behave.
And then later the behavior is presented as spontaneous, right?
So this is not out of line of what we have come to observe in multiple classical studies from this period.
And I've got no strong attachment really to any psychological theories, but cognitive dissonance in particular.
I like the general idea, right?
But I'm also aware, Matt, in 2024, there was a multi-lab replication of the induced compliance paradigm.
This was conducted with 39 labs and 4,800-something participants, and they used this writing of a counter-attitudinal essay, like students promoting non-drinking on campus, for example, or increases in tuition fees, and then looking, you know, at whether they adjust beliefs to match this.
And this large many lab study failed to find any strong effect.
It actually did find a slight effect in comparison with a control, but no difference between what was supposed to be the key contrast, this high choice versus low choice, where you're forced to do it or you're not.
And even in the case where there was a difference with the control, it was a very small effect overall, right?
So the general finding is quite null, right?
And that was in 2024.
And so, you know, sort of bad news for cognitive dissonance theory is not a big deal for me.
I'm happy teaching about it, you know, I like failed replications and stuff because I see it as science progressing.
And it doesn't mean that everything about the theory is wrong, but if you take it together, you have a large-scale failed replication effort, and you have this re-evaluation of a classical study that is associated with it.
You know, these things begin to pile up and raise questions.
Yeah, I think the actual specifics of the methodology that they did that many labs replication on are instructive.
I think it gives a sense of why, you know, in its pure form, the predictions made by cognitive dissonance theory are always going to be, you know, actually pretty unlikely to be observed.
So let's just spell it out, right?
They recruited a bunch of students.
The students were given a task to write an essay in favor of policies they know that the vast majority of students would not be in favor of, right?
So like increased student fees.
Counter-attitudinal essay, yes.
Yeah, that's right.
So this is the thing, counter-attitudinal essay.
And then their key thing is to have those high choice and low choice conditions.
In the high choice condition, this is where it's more, the motivation is supposedly more internal to the students.
It's emphasized to them that participation is voluntary.
They are explicitly asked for consent to perform the task and they sign something or write something or verbally confirm this signing a form.
And the idea with this condition is it's meant to instigate a kind of feeling amongst the students that they chose to do this.
It was their choice to do the behavior, right?
So this should be better at inducing dissonance, because if it's forced, if you have no choice, right?
Then there is no dissonance, right?
Or at least lower dissonance.
Yeah, that's right.
Because being forced is a good explanation for why you did it.
Yeah.
Now, in the low choice condition, they're just told to do the task.
They're not reminded of that it's voluntary and they're not asked to affirm consent.
So, but that's the only difference, right?
And it's kind of, I know, a very minor thing.
It's a weak manipulation.
That's, I guess, what I'm saying, but that's the kind of thing you need to do if you want to really show that the strong version of cognitive dissonance theory is correct.
Because this is the thing that's challenging for me, Chris, in thinking about this.
There are, there are many versions of sort of cognitive dissonance and there are sort of there are strong versions and weak ones.
This is validating a relatively strong one.
And like you said, it was totally not confirmed.
It did not replicate.
And after reading the methodology, I'm not at all surprised.
Yes, no, again, the kind of wrinkle there, just to note, is that they did find a relatively consistent effect when you had the high choice versus the control, the neutral condition.
But there was no difference between high choice versus low choice, which is supposed to be part of the whole thing.
So yeah.
And even then, you know, it's a relatively limited effect, but a weak manipulation in general.
So my point here, though, is that you are right when you see these kinds of things and results to, you know, increase your critical perception of the quality of evidence that is there in the cognitive dissonance literature in general.
But aside from a few topics where I do think that applies, in general my view is that the evidence is not going to be super strong, right, in a lot of cases.
What tends to fail is the strong version of the theory, right?
That it's very pure and you can induce these things, I don't know, in a almost magical kind of clever little experiment, right?
Like this kind of manipulation, right?
Where you show people a couple of words that activate a concept or you give them like one-line disclaimers that change the way that they frame it or this kind of thing, right?
Yeah.
And the idea being that, so you've got this very strong or pure, if you like, Festinger type theory, right?
That the inconsistencies create motivationally aversive states and then people are motivated to reduce it, et cetera.
And then that changes their attitudes to match the beliefs.
And then it's all very subtle and it all works perfectly.
So that doesn't seem to be confirmed.
But that isn't to say that certain weaker versions of the sort of general concept, that there isn't truth in them, right?
As you said, they did find a weak effect there compared to the null.
And, you know, there's other evidence that like weaker versions of it might be like if you give people a task to say, okay, generate a whole bunch of reasons why, I don't know, America staying in NATO is a good idea or a bad idea, right?
Do a lot of work, do some research, write a whole essay, then they're going to sort of self-persuade themselves a bit just by doing that task.
Yes, right.
Absolutely.
And this is one of the things the authors of this study present: this difference found between the counter-attitudinal essay and the control essay could be explained by those processes.
It doesn't have to rely on dissonance.
It could just be simply thinking about this issue and like arguing for it more, right?
So completely that.
No, but I will say, Matt, as well, that on that point, it cuts both ways, right?
Because you and I are well aware of many examples in the gurus that we cover.
And I'm aware of many religious groups and historical accounts from my own background, you know, in study of religions, where you have cases where people make predictions or failed prophecies and it doesn't lead to the destruction of their influence and public persona or power, right?
Just think of Brett Weinstein.
How many predictions he has made?
How many have been proven false?
And yet he remains, you know, like a popular figure, not amongst everyone or Alex Jones, right?
Infowars.
Oh, yeah, yeah, yeah.
But I'll just point out, Chris, that this is like subtly different.
Like certainly, it's indisputably true, right, that people are very resistant to disconfirming evidence, changing their beliefs.
They have high commitments to.
And Brett Weinstein's a perfect example of that.
But the test here, or the predictions made by the strong version of cognitive dissonance theory, is that someone like Brett Weinstein becomes more committed to anti-vax or whatever it is when the evidence comes in.
Not that the disconfirming evidence just doesn't perturb him, it doesn't affect him, but rather the disconfirming evidence is actually pushing him more.
Well, isn't the theory also more about the followers in terms of the response of the followers to experiencing disconfirmation of prophecies and whatnot?
And in that case, this is actually an argument I have a little bit with the interpretation, because the fact that the Martin person didn't double down on their specific prediction, right?
They went on to abandon it.
Of course they did, because time went on.
And like, they're not going to say, well, actually, if you go back, like, you know, the world ended then and so on.
So they have to abandon it by the reality of events.
But what you see very often with guru types, charismatic leaders, religious figures is a reinterpretation that even though it looks like that was not fulfilled, it actually was if you reinterpret it in this way.
And it doesn't mean that it has no effect.
But I do think that when you look at, for example, all the people that predicted that lots of people would die from the vaccines, you know, there would be millions dead.
And obviously that didn't occur.
But has that now damaged their credibility amongst the followers?
No, they just declare it did happen.
It's been denied or it will happen.
Or actually, they were talking about something else and it's like long-term damage and so on.
But remember, the claim by the original book, When Prophecy Fails, about the group is that they would double down, that the fact of the prophecy failing would make them more committed to the thing.
Yeah, but so how is that different than like this?
You are saying that they would acknowledge that it failed and that makes them more committed, but they have to interpret it as being true, right?
To argue that it occurred.
So, like, it has been proven false, what the people claimed about the pandemic and what was going to happen.
But their commitment to their narrative is such that they just reinterpret all the disconfirming evidence as false, as lies by the authorities.
So they haven't given up on that belief, right?
No, that's right.
That's right.
But I'll just point out that the When Prophecy Fails authors, Festinger et al., they made the claim, right, that they became more proselytizing.
Became more committed, not that they stayed the same.
So I'd say Brett is, or people like him are, just as committed about anti-vax as they were before.
But, rather than, you know, the lack of evidence making them more so.
Just to be clear, I don't think it's the lack of evidence that's making them more extreme.
But I would say that Brett's anti-vax attitudes over the course of the pandemic have become like, much more extreme.
Like, there was once a time when we had people responding to me saying, you know, you're really over-interpreting that he's anti-vaccines, he's just expressing doubts, right? But he's obviously way in the anti-vaccine camp now, you know, and so on.
So, yes, there has been a kind of shift; I just don't think that's been caused by the lack of confirmation. He was just on that trajectory, and the lack of evidence didn't bother him.
Oh agreed, agreed.
Well, I'd take some more issues with the interpretation, but I agree that I don't think that's the causal factor applying there.
Right, and there is another article, Matt, that Thomas Kelly wrote the year before, which I read, and I like some things about it.
It's called Failed Prophecies Are Fatal.
It came out in 2023.
And the argument is kind of connected to what he argues in, you know, the debunking of Dorothy Martin's group.
It's essentially that when you go back and look at these cases that are often cited, where religious groups experience failed prophecies and then go on to flourish, in many cases this doesn't happen.
If you look for the cases where groups have made prophecies that are very publicly disconfirmed right, in actual fact, many of them dissolve and what you're dealing with in the literature is often a survivorship bias, where people are looking at cases where there is a group which survives till now but went through a, you know, a controversial event, but that doesn't mean that in general,
groups that go through those events will experience that.
So it's kind of like a double-pronged attack where he's saying some of the cases are misrepresented when you go and look at the details, just like with when prophecy fails, and many of the groups do either disband or, you know, become less popular, and even if they do follow the pattern that's described, that doesn't mean that that is the normal like structure of what occurs, right?
Yeah, that's right.
It can just be a bit random.
Yeah, some of them get more committed, some of them stay the same, some of them become less committed.
There's not necessarily any relationship between the failed prophecy and what happens next.
Yes, and so I think you know we've recommended this and I think this is a good thing to do, and Thomas Kelly is, in a way, modeling it that, like you, can get a lot of benefit from, if you've got a thesis, looking for evidence that disconfirms or is counter to your expectations right, rather than just seeking out confirmatory cases because yes, a bunch of scholars have found cases where it seems like you know these dynamics are at play,
but how representative are they? Are there cases where it isn't at play?
That is what you want, but that is often not what people do.
And his title for the article is Failed Prophecies Are Fatal, right?
He's arguing it's the opposite, actually, than the common understanding.
Yeah, well, I mean, we had this criticism of Helen Lewis's book about when she was reviewing cases of these geniuses who were, you know, very disagreeable, you know, marketing themselves, kind of a, you know, anti-social personality cult creation.
Yeah.
Like marketing thing.
Yeah.
And, you know, she found some examples and reviewed them in some depth.
But yeah, I mean, you can find examples of anything.
What you need to do is try to be more systematic and even-handed in the way you collect your data.
We really like the book, but of course, as everyone knows, right?
That was one of our few criticisms.
But if you wanted to approach it like in a kind of social science way, right?
Like a rigorous approach, this is what you should do.
And in actual, just in general, in science, right?
This is what you should be doing.
And in Helen's defense, she is a journalist writing a popular book.
Systematic Evidence Collection 00:02:00
Yeah.
Now, one thing that I noticed, though, Matt, throughout these, so, well, hold on, before I say that, I'll just say that I heard about this on The Studies Show with Stuart Ritchie and Tom Chivers, right?
And I went and read it.
I enjoyed both of these and thought, oh, this is good.
I actually taught about it last week to students, right?
And I enjoyed teaching them the original version and then going through these critical re-evaluations.
And so I have no issue with people doing what Thomas Kelly does.
I enjoy this kind of work.
And I am convinced, like you are, by the case he makes for When Prophecy Fails being overstated.
And in general, people buying into this narrative a bit much without looking at counter evidence, right, to the claim, this potential disconfirming evidence.
With that said, I did notice throughout these that references to early Christian communities came up fairly often, more than I would anticipate in articles.
And I'm sensitive to this, right, in a way, because I went through the study of religion.
So I know when people are giving examples, and it comes up in the Debunking When Prophecy Fails article, and it also comes up in this other article.
If you'd like to continue listening to this conversation, you'll need to subscribe at patreon.com/slash decoding the gurus.
Once you do, you'll get access to full-length episodes of the Decoding the Gurus podcast, including bonus shows, garometer episodes, and decoding academia.
The decoding the gurus podcast is ad-free and relies entirely on listener support.
And for as little as $5 a month, you can discover the real and secret academic insights the Ivory Tower elites won't tell you.
This forbidden knowledge is more valuable than a top-tier university diploma, minus the accreditation.
Your donations bring us closer to saving Western civilization.