The 1956 book When Prophecy Fails describes one of the most famous case studies in social psychology. The researchers Leon Festinger, Henry Riecken, and Stanley Schachter covertly joined a small, apocalyptic UFO group, observing how members prepared, including quitting jobs, giving away possessions, and severing ties with skeptics. According to the book, when the group’s predicted disaster did not occur, instead of simply abandoning their beliefs, many core members strengthened their commitment and actively sought new converts.
But it didn’t actually happen like that. In truth, Festinger and his fellow researchers glossed over evidence that contradicted their thesis and actively influenced the UFO group to get the result they wanted.
This is the discovery of our guest Thomas Kelly. By combing through newly unsealed materials from Festinger’s archives, UFO and occult magazines of the 1950s, and later writings by group leader Dorothy Martin, he discovered crucial information that was omitted from the original book. Kelly detailed his startling findings in a paper published in the Journal of the History of the Behavioral Sciences titled Debunking “When Prophecy Fails.”
Travis interviews Kelly to discuss the real story behind When Prophecy Fails, the possible consequences of invalidating Festinger’s study, and how his work fits in the wider “replication crisis” in experimental psychology.
Subscribe for $5 a month to get all the premium episodes:
www.patreon.com/qaa
The first five episodes of Annie Kelly’s new 6-part podcast miniseries “Truly Tradly Deeply” are available to Cursed Media subscribers.
www.cursedmedia.net/
Cursed Media subscribers also get access to every episode of every QAA miniseries we produced, including Manclan by Julian Feeld and Annie Kelly, Trickle Down by Travis View, The Spectral Voyager by Jake Rockatansky and Brad Abrahams, and Perverts by Julian Feeld and Liv Agar. Plus, Cursed Media subscribers will get access to at least three new exclusive podcast miniseries every year.
www.cursedmedia.net/
Editing by Corey Klotz. Theme by Nick Sena. Additional music by Pontus Berghe. Theme Vocals by THEY/LIVE (https://instagram.com/theyylivve / https://sptfy.com/QrDm). Cover Art by Pedro Correa: (https://pedrocorrea.com)
https://qaapodcast.com
QAA was formerly known as the QAnon Anonymous podcast.
SOURCES
Debunking “When Prophecy Fails”
https://onlinelibrary.wiley.com/doi/full/10.1002/jhbs.70043
Debunking “When Prophecy Fails” (Free Preprint Version)
https://osf.io/preprints/socarxiv/9j7qc_v2
Failed Prophecies Are Fatal
https://journal.equinoxpub.com/IJSNR/article/view/33085/32543
Cults, Conscripts, and College Boys: Whither Cognitive Dissonance? (Preprint)
https://osf.io/preprints/socarxiv/xdj2u_v1
The first post on 4chan from the anonymous entity later known as Q read, HRC extradition already in motion effective yesterday with several countries in case of cross-border run.
Passport approved to be flagged effective 10.30 at 12.01 a.m.
Expect massive riots organized in defiance and others fleeing the U.S. to occur.
Of course, Hillary Clinton wasn't extradited, her passport wasn't flagged, and there were no massive riots in the days after that post.
The fact that QAnon's very first message contained predictions that did not come to pass didn't seem to dampen Q's popularity among conspiracists.
How could so many people see QAnon as a source of truth when it was clearly wrong in this and many other instances?
Well, social psychology provided a ready answer.
In the 1950s, Leon Festinger developed the theory of cognitive dissonance.
According to this theory, people feel most comfortable when their beliefs, known facts, actions, and values don't contradict each other, that is, when they are consonant.
When their beliefs or mental states do contradict in some way, the resulting cognitive dissonance creates powerfully uncomfortable feelings, which in turn drives people to resolve the contradiction in their heads, even in ways that may seem strange or irrational to outside observers.
This theory was tested in the famous field study, which was reported in the 1956 book, When Prophecy Fails.
Festinger, along with his colleagues Henry Riecken and Stanley Schachter, observed members of an obscure apocalyptic UFO group.
The group's leader, Dorothy Martin, claimed that the entire world was destined to be destroyed by a flood on December 21st, 1954.
According to the book, the group was mostly reluctant to evangelize their beliefs before that date, but that changed after the day passed with no apocalypse.
Suddenly, the group's most committed believers became even more committed, and further, their previous reluctance to spread their beliefs gave way to a newfound willingness for publicity.
This, the authors argued, demonstrated that when committed believers are faced with disconfirmation of their beliefs, they resolve their cognitive dissonance in ways besides abandoning what they believe.
This case study has been cited countless times over the past 70 years to explain the strange behavior of cults and other committed groups.
The problem is that it's just not true.
It didn't happen the way that the authors said it happened in the book.
This is the discovery of our guest today, Thomas Kelly.
By combing through newly unsealed materials from Festinger's archives, UFO and occult magazines of the 1950s, later writings by Dorothy Martin, and other neglected sources, he discovered crucial information that was omitted.
Kelly detailed his startling findings in a paper published in the Journal of the History of the Behavioral Sciences titled Debunking When Prophecy Fails.
Now, Thomas Kelly joins me today to discuss what he found and how it may impact how we understand the psychology of belief.
Thomas, thank you so much for taking the time to speak with me today.
Thanks for having me.
I'm excited.
It's a really interesting topic.
It really, really is.
And I have to say, I discovered your paper when it was mentioned in a tweet while I was browsing Twitter one day.
And my immediate reaction to just the title, Debunking When Prophecy Fails, was, I guess, incredulity.
Because, again, this is something that many people in the field have referenced, and something that I've referenced several times on this podcast, because I really thought it provided a good case study and a good framework for understanding this kind of phenomenon.
But boy, when I dove into the paper, I have to say that your case is very convincing.
Thanks.
For a long time, I also thought of When Prophecy Fails as, like, presumptively true, just because it's canonical.
So, you know, it's really interesting.
Also, you are listed as an independent researcher on this paper.
So is this not your primary area of research?
That's right.
I'm a political scientist and I mostly work on like public health policy.
So this was definitely a side project for me.
My educational background is political science, not psychology.
Well, I mean, yeah, it's a hell of a side project.
I mean, it's just really interesting because of like how impactful the study is.
Like the 1999 review, When Prophecy Fails and Faith Persists: A Theoretical Overview, opens with the line: almost everyone in the sociology of religion is familiar with the classic 1956 study by Festinger et al. of how religious groups respond to the failure of their prophetic announcements.
I mean, the study provided the default model for how we understand how groups respond to failed prophecies.
It inspired decades of empirical work on doomsday and millenarian movements, and it continues to function as a reference point.
It's like it's one of those studies that like even people who are not really in the field may be familiar with.
So let's get into your findings.
Again, the story in it is really important, because it basically goes: at first, they were kind of reluctant to publicize.
They did a little bit, but not that much.
But then this big disconfirming event happened.
And all of a sudden, they were willing to call newspapers and try to get more people into the group.
So it's a counterintuitive finding, but it did, according to Festinger, confirm his then-developing theory of cognitive dissonance, like a perfect example of what he was trying to push in academia.
But you found that basically this wasn't true.
Let's start with the idea that they didn't work to spread their beliefs before the disconfirming event.
You talk about how they actually published a lot of articles and letters promoting their messages about the coming cataclysm in these UFO and occult magazines.
You found them in magazines like Roundhouse and Mystic.
This is a few months before December.
So you found that they actually were very much interested in talking about what they believed, right?
Yeah, that's right.
So the official story of When Prophecy Fails is that Dorothy Martin and her main followers, a married couple, Charles and Lillian Laughead, had a brief, passing interest in spreading the word.
They sent out a press release and attended a couple of talks, and the authors say that's how they found out about this group.
You know, they were just in the right place at the right time, and during this brief spurt of publicity, a few months before the flood was supposed to happen, they heard about it.
But right away, just reading the account that Festinger and his co-authors provide in When Prophecy Fails, it seemed a little more substantial than that, because they would provide examples.
Like, they say there was also a book publisher at these meetings, and they were trying to get him to publish a book of their teachings.
And yes, there is some subjectivity between what counts as a sustained evangelism campaign and a brief interest in one.
But that seemed pretty sincere to me.
Another issue was that they would offhandedly reference that one of the followers' jobs was to copy down these messages and send them out to UFO groups across the country and the world.
And quite quickly, when I just started reading what people had written about Martin, mostly from the UFO community, people were aware of how eager she was to spread her word.
So for instance, in the Saucerian magazine, Gray Barker, who was a little bit dismissive of Martin but did cover her, because she was well known, wrote something like: she was sending her messages out to anyone who would listen.
Yeah.
Again, the complete opposite of the thesis of the book.
Archival notes also record Martin telling followers that from then until the cataclysm, her job was to gather in the recruits, and that aliens had explicitly commanded her and the Laugheads to spread the message.
And there was even like specific roles handed out.
Yes, that's right.
So Dorothy Martin received most of her messages through automatic writing, where she would sit and write on paper, the idea being that her hand was guided by this ascended extraterrestrial intelligence.
And a lot of these messages would say something like, Dorothy, you need to spread the word.
Dorothy, you need to tell the world.
And because of her channeled messages, she even thought it was the role of Charles to help out with the publicity.
And you see that he does this.
He like publishes a couple of accounts about her teachings.
And this is all before the prophecy starts to fail.
And then what's interesting is, you see from the internal notes of the researchers, some of which were only opened up this year, that they openly talked about gathering in the recruits, or how to start the proselytizing.
And they talked about who Martin trusted: when people would visit her house, who she trusted to teach them the real teachings and who she didn't.
She was kind of choosy about this.
So, okay, so she had favored lieutenants who were like more adept at sort of spreading the word over others.
So, I mean, yeah, this is the second shocking part of it.
Because the thing is, bad science happens.
Bad science is part of science.
It's normal to publish ideas that may be a little off, but are sincerely explored or perhaps on the right track.
I'm thinking of Charles Darwin and On the Origin of Species.
He didn't even have Mendelian genetics.
He believed in an idea of blended inheritance, which doesn't even make sense on its face and sort of contradicted his own theory.
So he was wildly wrong in some sense.
But the core idea was so solid, it's been expanded upon and refined over the decades.
But you're proposing something more serious: this isn't merely bad science.
This is science in which the scientists knew that something was off, knew that there was information that contradicted their main thesis, but chose to ignore or gloss over it anyway.
Yeah, that's right.
I think that's fair.
And like what you were saying about bad science is even if you do everything right, there's always some risk you're just going to get some random or non-replicable result, right?
Like they could have just studied a really weird cult that didn't generalize, right?
And it's not their fault that that can happen.
But here at every step along the way, there's serious problems.
So for one, they do a lot of things that help convince the cult members that their beliefs are real.
Yeah, yeah, yeah.
I mean, you know, it's interesting.
I have read criticisms of the study before, that the researchers were a little too embedded, a little too involved in the group, but that by itself didn't invalidate the results.
Like the core thesis essentially could survive kind of research practices that today would be considered unethical.
But yeah, you're suggesting something more extreme: that they really had their thumb on the scale and were trying to push the group towards the kind of result that they thought would support their cognitive dissonance thesis.
Yes, that's right.
So one example of this, which they briefly acknowledge in When Prophecy Fails but which you see in more detail in the archives and which I discuss in the paper, is that one of their paid research assistants joined the branch in East Lansing.
When she shows up to the house, they're like, hey, why are you here?
And she's like, well, I had a dream about a cataclysmic flood and being rescued from it.
And then another research assistant joins them too.
And of course, the cult is thrilled.
They're like, wow, these two people joined us out of nowhere.
They're obviously like channeling sacred messages from the extraterrestrials as well.
We've been right all along.
But the most persistent bad actor in this whole study is Henry Riecken, whom everyone loves and adores.
For whatever reason, they immediately become convinced he's spiritually ascended.
They start calling him Brother Henry.
Everyone else basically goes by their first name, but he's like the big man on the cult campus.
And you see this in his writings and also in the writings of the other observers and Leon.
So, you know, there's a meeting at Martin's house and they're all waiting for Henry, and they're like, we need to wait for the man from Minnesota.
And of course, Stanley and Leon are also from Minnesota, but no one cares that much about them.
They're just like normal members.
And people will ask him for blessings.
They'll ask him for messages.
In one of his research journals, he says, yeah, if I wanted to, I could easily take over this group.
And he starts to play a bigger and bigger role.
And so probably the most dramatic instance is on the night when the prophecy fails.
He kind of like coaches them through specific things.
Maybe that's jumping ahead a little.
No, I mean, it's interesting, it's shocking.
It would be as if, through doing this podcast, I didn't merely report on what they believed.
I went on 4chan and stuff and encouraged people to go down rabbit holes, or explained why they should believe despite Q's failed prophecies, and took an active role in the direction of the belief and the group dynamics, rather than being an observer.
So I think it's a pretty extreme example of them fiddling with their own experiment.
Martin said that Henry Riecken was the favorite son of the most high God.
And oh yeah, could you explain the magic box episode that you say was actually misdescribed in When Prophecy Fails?
Sure.
So Martin had a book of sealed teachings, some documents that she believed other people weren't supposed to see.
But after the prophecy fails, both Martin and Charles Laughead are in legal trouble.
They're both dealing with allegations of mental incompetence, and they're worried about being institutionalized.
So Martin ends up leaving Illinois, and she's like, oh, what to do with these books?
And the researchers have wanted to see these for a long time.
So Henry, who has just claimed to the group that the Space Brothers sent him all along, has just said, you guys are right.
My title is Earthly Verifier and I was sent from above.
And he's like, you should give the books to us.
I'm going to put a magic protection symbol on them and then give them to someone trustworthy to look after them.
And that guy was one of the paid research assistants who had also infiltrated the cult.
In When Prophecy Fails, this happens too, but in a different way.
They say the group was scared about what to do, so they asked us, what should we do?
And Henry was like, fine, I'll put a magic symbol on it.
Although they don't identify who's who; they just always say one of the authors.
And in that telling, they gave the box to one of the true believers.
And they go on to say this proved the belief of the true believer, because, according to them, some of his schoolbooks had inadvertently been sealed in this box and he couldn't get them out, because he was afraid of breaking the magic symbol.
But in their internal research notes, there's no reference to this.
So I guess there's a tiny possibility this box made its way to the true believer later, but there's no documentation or discussion of it.
So it appears to be invented.
Yeah.
Yeah.
Real terrible stuff.
The other part of the equation in When Prophecy Fails is the idea that there was this disconfirming event, and the group reacted in a counterintuitive way: they doubled down and believed with even more energy.
And then, all of a sudden, they found a new passion for spreading their beliefs that was not present before.
Again, this is something you also found to be not true.
Could you talk about what you found was the actual reaction to the date coming and going without the apocalyptic flood?
Sure.
So even in the original account of When Prophecy Fails, they admit that some people give up while just waiting for the flood; they get too bored.
Like, there's this story about a teenage girl who's a follower, and she gets bored.
Her boyfriend's like, hey, want to go get a drink?
And she just leaves and never comes back.
So they acknowledge there's like some attrition.
But what really happens is that a fair number of people do leave right after.
But among like the hardcore who are gathered at Martin's house for the apocalypse, there's definitely a few days where they're like trying to keep each other's spirits up, right?
So, you know, they do release a press release with a psychic message that says, you guys were such good people, your light prevented the flood.
Although, as I show, this message was in part prompted by Henry Riecken himself.
So they do put out some press releases, and at one point they go outside and sing Christmas carols as a song of praise to the aliens, hoping maybe they'll decide to show up.
There's like a few days where they're like still holding on.
And then the group essentially disintegrates.
So Martin is worried about legal trouble.
So she goes and stays with a friend outside of her hometown.
And the younger members, you know, are still cordial with her.
People aren't mad at her, I want to say.
People still seem like they always like her as a person.
But, you know, one of the younger believers is quoted by the researchers saying, hey, it was pretty disappointing, because nothing really happened and we thought something would.
And then a new paid researcher shows up, Marsh Ray, and they're really excited, because they're like, maybe this is a sign of vindication.
So they're still holding on to hope, and they're like, welcome, Brother Marsh.
Like, do you want to give us a blessing?
And he's like, no.
And from his telling, they had totally given up on evangelism at this point.
And his first interaction with them is on Christmas Day, right?
So within like four days, they'd abandoned it.
And the other thing, I do want to stress, these people don't all go on to become, you know, normal, stereotypical suburban people.
At least Martin doesn't, and neither do the Laugheads, right?
Before they connected, they were interested in the occult and UFOs, and they stay like that.
But that's actually helpful, because you can see their writings in later years, and they refuse to talk about it.
So a few months later, in 1955, there's a big piece in the Saucerian magazine, published by Gray Barker, who I mentioned earlier, and she's already walking back her beliefs.
She's no longer saying there was going to be a real flood and that they were really going to be rescued.
She's like, well, I never really believed we were going to be physically picked up by aliens.
And the Laugheads are like, yeah, you know, but maybe our spirits were lifted by bonding through it.
And they both have lots of opportunities to spread the word.
So a few months after that, Lillian Laughead is published in Mystic magazine.
And everyone in the UFO community knows who these people are now, because there was mainstream press coverage of them.
And so the editor literally has an editor's note before her article, like, you guys, this is the Lillian Laughead who was involved in this whole failed prophecy.
So this is a great opportunity for her to talk about the Christmas message or reaffirm her belief.
But instead, she starts talking about these secret messages and these alleged alien footprints that someone found in the desert.
And for Lillian, this is a long-term project of hers.
She had written about this before meeting Martin, and she would continue writing about it afterwards.
So she doesn't want to talk about it.
The Laugheads and Martin do stay friends, at least briefly.
They visit Peru and, you know, try to channel psychic messages.
There's some ancient-aliens-type archaeology going on there.
But then they split up, and Dorothy Martin goes off to call herself Sister Thedra.
And she is, I guess, pretty charismatic, and she does once again gather a small number of followers in her later years, but she refuses to talk about this.
And we know that because they put out many books of compilations of her teachings where she never talks about it.
And they talk about how they have no idea what happened in her life from 1950 to 1955.
And she even changes her story about how she got her psychic powers.
And she totally omits the Illinois thing.
So instead of affirming this incident, she's written it out entirely: she's changed her entire biography.
She has a new name.
The cult itself doesn't exist.
Yeah, like, under the theory they were operating under, they would hold on to this disconfirming event but reinterpret it in some way.
But that's not what you saw.
They seem to run away from it entirely.
Yeah, they do make this interpretive move for a couple of days, right?
I mean, it's clouded because Henry played a big role in triggering it, right?
But they do come up with the initial explanation that it was going to happen, but they were such pure people, and meditated and prayed, essentially, that they saved the world, right?
So, you know, it's not that this move didn't happen.
It's just that it didn't lead to long-term reaffirmation of their beliefs.
Well, I mean, yeah, so this is essentially cherry-picking evidence: they did find something that seemed to affirm their theory, but then they ignored the more long-term consequences of the failed prophecy.
Yes, that's right.
And unfortunately, I think that's a pattern in the literature on failed prophecies.
I talk about this in another paper, Failed Prophecies Are Fatal, where I discuss the 1999 article you mentioned earlier: people will interview believers a couple of days after a prophecy failed, and if they don't immediately say, we were completely wrong, it gets listed as a case study for how beliefs are really resilient.
Yeah, that's pretty shocking.
So yeah, I mean, again, it's not just empirical problems that you found in your research.
There are also some serious ethical problems.
For example, you talk about the observer Liz Williams, who fabricated a mystical dream about being saved from a flood.
I mean, what was this about?
Yeah, so that was her way into the group; she wasn't sure how else she could join.
So the group essentially has two bases: Martin's home in Oak Park, Illinois, and the Laugheads' home in East Lansing.
And Liz was a graduate student at Michigan State University at the time, although I think it had a different name back in the 50s.
And in order to gain entrance, she comes up with a flood story, and they're really impressed.
And then she starts functioning essentially as a nanny for the younger daughter of the Laugheads.
So she's almost constantly at their house.
So she's able to observe them in great detail.
Liz is, you know, she's a funny writer.
She's not a fan of some of the other cult members.
So when she wants one of them to leave the house, she pretends she's an automatic writer too, and gets a psychic message encouraging the woman to leave.
Or this one time, this woman she did not like, who would sometimes have a beer or a cigarette, was having one, and Liz would speak loudly about how we must be so enlightened and avoid the temptations of the flesh, like alcohol and nicotine.
Actually, a very large portion of her notes is just dedicated to complaining about this one cult member.
And what was probably the most shocking is that Liz and her fellow observer, Frank, actually try to stall a child welfare investigation into the Laugheads.
This is triggered by Charles's sister, who has a long-running grudge against Charles.
For instance, he would always encourage her not to seek out conventional medical treatment, even when she was seriously ill, and to focus instead on the power of positive thinking.
So she really had a grudge against him, at least as she's portrayed by the Saucerian magazine and the accounts in the archives.
And she contacts people in Michigan and says, hey, I'm really worried that my brother and his wife are neglecting their kids because they're in a cult.
And so a child welfare psychologist is sent over to interview the children.
And at first, she's really dismissive of Liz; she thinks Liz is just a cult member.
This is from Liz and Frank's account.
And then Liz is like, listen up.
I'm a real social scientist.
This is a crucial study.
You can't interfere with this.
And just so you know, higher-ups at Michigan State know about this and have approved it.
You need to go talk to them.
And so she tries to pull rank on her to stop this, because this happens just a couple of days before the flood is supposed to occur.
So this really would be the worst time imaginable for the study.
And then the psychologist goes and talks to people at MSU, and Liz has the younger daughter with her.
And the younger daughter loves Liz, because she's like her nanny.
And the people at MSU are like, oh, I don't know that we should interrupt the situation at such a crucial point and go tell Charles, who is maybe, from their perspective, crazy, that there's a spy in his house.
So the court officers don't show up until after the prophecy fails.
Well, we don't know the counterfactual, but at least the courts don't send a probate officer until after the crucial date.
I mean, this is really, really shocking.
I suppose it does happen, where researchers are more interested in making sure their grant money isn't wasted than in actually doing rigorous, legitimate science.
Now, like I said, I was pretty skeptical of the bold title, Debunking When Prophecy Fails, until I read through all the evidence we just discussed, which shows that, no, this is pretty bad for the case.
But that still leaves us with a lot of questions about what this means for social psychology research and for the theory of cognitive dissonance more broadly.
Because, first of all, it is the case that we know of many examples of prophetic groups that continue believing despite failed prophecies, like QAnon.
This is something that happens, and it requires some kind of explanation, I feel like, because it's so counterintuitive.
So the question is, what exactly is happening here, if it can't be explained by the kind of events described in When Prophecy Fails?
So here I want to make a distinction between what I think are the implications of my work for cognitive dissonance in general versus for the study of prophetic failure.
People who think the theory of cognitive dissonance has a lot of explanatory power don't think that because of a single study, right?
They'll point to a larger body of evidence that they think substantiates it.
There's definitely criticism of that larger body of evidence, which I share, but this paper by itself doesn't show that they need to abandon the theory.
When it comes to new religious movements, I think the implications are a lot more substantial, because this is a field where we don't get to observe that many examples of prophecy failing, right?
There's not really a huge number of case studies.
And when you actually go and evaluate these case studies, you see that the evidence for persistence of belief is a lot weaker than people say.
First of all, just like this one case study actually shows the opposite.
The other issue is this, is that some people who have tried to study this have just like looked at the past and said that group exists, that group had a prophetic failure.
That's fine to some extent, because it shows that it's not impossible, right?
Like so you can point to Jehovah's Witnesses and you can say like, just because you have like a failed prediction early in your history, it doesn't stop you from becoming a large religion.
So that's true.
But the issue is, if you're selecting only from surviving groups, you're going to way overstate the survival rate.
The other issue, and if people are interested, they can read the open-access article Failed Prophecies Are Fatal at a different peer-reviewed journal, is that a lot of groups have been falsely claimed to survive when they didn't survive.
To some extent, that's just ambiguity.
Like, what does it mean to survive a failed prophecy? But if you just zoom out a decade, a lot of these groups that are supposed to provide evidence for survival don't exist anymore. You know, they fall apart.
So I would say that we should now think of groups surviving failed prophecy as exceptional rather than expected.
And then you can look at the survivors to see if there's any commonalities that might provide at least a reasonable hypothesis for why some are better than others at dealing with it.
No, yeah, one thing I really liked about your paper and some of its implications is that it kind of vindicates debunking a little bit as a persuasive tactic.
Because my whole stance through doing this podcast and researching this material was like, you know, debunking may be worthwhile on its own just as an exercise, but it's not a method of persuasion.
You know, it's like, it's like, you know, finding the logical flaws or showing how, you know, the belief system does not actually predict what it claims to predict is, I think, valuable on its own.
And in some cases, may even persuade people who are less committed, but generally is ineffective.
But I think the idea that these groups with strong belief systems generally don't survive a failed prediction is sort of, I don't know, the vindication of the debunkers: it is, in fact, possible very frequently to persuade people to abandon false beliefs by showing that they are demonstrably false. I mean, that's kind of a hopeful, I think, possible consequence of your research.
Yes.
One thing that I do think maybe distinguishes these false beliefs from other categories is that the stakes of them being true or false are really high, right?
So in Martin's group, some of these people had literally quit their jobs because they thought they were like saving their lives and getting to go hang out with like spiritually like enlightened extraterrestrials, right?
So the potential payoff was really high either way.
So the truth of it mattered a lot.
But for a lot of things we believe, it really doesn't matter whether they're true; for us as individuals, often there's no harm in having inaccurate beliefs. But for these high-demand groups, you really care whether they're saving you from the cataclysm or not.
One thing I really wondered about when I was reading your paper was, how is it that this went unnoticed for seven decades?
I mean, like you mentioned, like there were newly released materials from his notes.
But even absent that, I think as you point out very well, there were very serious flaws in the study.
And there are, there are like many smart people who investigated this and took it seriously.
And as you point out, sort of like tried to replicate it with not much success.
So, I mean, how is it that this study has become so iconic and without any serious challenge for seven decades?
Yeah, I do think it's pretty bad that that happened.
If you simply read Festinger's own work, comparing When Prophecy Fails to his later account of the cult in A Theory of Cognitive Dissonance, literally published one year later, you can see he's revising his story to make it more dramatic.
And I think there is, or at least there was, an issue where people who like to argue about social psychology like to argue by coming up with their own pet theories rather than diving into data or case studies.
So if you look at the early criticism of cognitive dissonance theory, you'll see people saying like, hey, this theory could also explain these results, not just your theory.
But you see much less work going into like, does this literal exact experiment replicate, right?
Or why are you throwing out so much data in your data analysis?
Or in the case of When Prophecy Fails, these people could have gone and read several different UFO magazines that would have posed a lot of challenges to the study.
Or Martin, aka Sister Thedra: you could have written to her. You could have gone and talked to the Lawheads. This would have been really easy to fact-check.
And some newspapers would cover them just as like, you know, fun public interest stories, right?
So when Martin like moved to Arizona, her local newspaper ran just like a human interest story that's basically like, hey, we have a new neighbor who gets psychic messages.
So yeah, it's troubling that they trusted the account of the credentialed psychologists, even though the actual people were trivial to go out and talk to. They just didn't.
When we were emailing back and forth, I talked to you about this.
It reminds me of a case I studied for a couple of podcast episodes for my miniseries Trickle Down, which covered Henry Herbert Goddard's study of what they called the Kallikak family, in which he claimed that there was a feeble-minded barmaid in Revolutionary times who gave birth to generations of feeble-minded children who behaved in these awful antisocial ways.
And from this, Goddard basically put together a scientific case for eugenics, saying that, like, you know, the feeble-mindedness is a very serious hereditary quality, and therefore it was like good for society to like limit these people from reproducing.
And there were like many, many flaws with this, as like scientists discovered later.
Like, for example, he got some of the genealogy wrong.
He didn't really actually know much about this feeble-minded barmaid.
He was very, very selective.
He did cherry-picking in order to kind of like bolster his case.
But the original study was published in 1912, and the theory was more or less debunked by the 1930s, yet it persisted for decades afterward in academia. It was just too good of a story.
People liked it so much.
I mean, are you sort of like concerned that maybe something similar might happen now?
It's like, even as more critical research, like your own, and maybe more work inspired by it, shows that that study was really, really bad, it's just going to persist in popular consciousness and even the scientific literature, despite that fact, because, gosh, it's such a good story.
I think that work like this is effective, but only over time, right?
I think there's a difference between like if you're like a new academic or researcher who like sees this at a pivotal moment in your career, maybe versus someone who's been like working in a subfield for 30 years and now, you know, some of the things, you know, some of the premises you should have questioned are now being questioned.
And also, you know, it is appropriate, I think, for like people interested in this field to like take their time to like read my work and think about other things as well, right?
Like, you know, and evaluate it.
And if people want to do this, the supporting archival material is up at the Open Science Framework, so people can read some of these psychic messages or research notes themselves.
Generally, the whole area of like field research psychology, experimental psychology has taken kind of a beating in the past couple decades.
You know, there's been what they call the replication crisis, in which, you know, studies of these kinds of psychological phenomena, which were considered very solid and serious, just are not holding up, either because they can't be replicated or because, when people go back and scrutinize them more carefully, they realize there are very serious ethical or empirical flaws in them.
I think a famous example is like, you know, the Stanford prison experiment.
People have some sort of really serious questions about the Milgram compliance experiments.
All this research that we previously thought opened up a whole new empirical understanding of how people behave, what their motivations are, and how they react under certain circumstances has kind of been crumbling over the past 20 years.
I mean, I'm curious.
I mean, how does your study sort of like fall into this current phenomenon?
I don't know.
What does this mean for the field more generally?
Yeah, that's interesting.
So I had no reason to think that When Prophecy Fails in particular was flawed, but I had, I guess, been influenced by awareness of the replication crisis and how some of these famous studies didn't hold up.
So I guess whenever I read these studies, especially from the 20th century, I'm maybe not as charitable to them as people reading them 30 years ago would have been, back when it wasn't normalized to think that these might have been really flawed.
I don't have a great answer. I think it's just a gradual replacement of bad evidence, as bad studies are slowly fact-checked and better, larger-scale, more replicable work gets done.
And I guess maybe it's like a vindication for people who are just like maybe a little skeptical in general, right?
Like, probably looking back, why did people find psychologists so convincing, right?
Like, yeah, like that's maybe just like a bad call on us.
I mean, yeah, I think it's especially interesting.
It's like, it's like, yeah, that's again, that's a normal part of science.
Like the example I brought up of Goddard's Kallikak study.
This was debunked by other people in the field, often younger people in the field who sort of like took a closer look at his research and found serious flaws.
But what's interesting to me about your case is that, again, you are, this is not your primary area of research.
You would expect people who were really raised up in this field and immersed in it to take a more skeptical eye towards it and revise it, and to provide perhaps more satisfying or empirically supported explanations for the phenomena of, you know, fringe belief or whatever.
But that hasn't happened.
It took someone who was listed as an independent researcher on their paper in order to find the serious flaws.
Yeah, I believe the Stanford prison experiment's flaws were also exposed by, sorry, I'm blanking on the name. I believe he's a French journalist or filmmaker.
So this wouldn't be the first time.
And there was, you know, some weird defensiveness I did get through this process at a couple of points.
Not at the journal where I published, but I'll say at a different journal.
I won't say which one.
I did get one reviewer comment, and it was something like, you're coming very close to accusing these people of dishonesty. And I just literally didn't know how to respond, because it's like, well, I feel like I'm showing facts that will cause a normal person to conclude that.
Or when working on kind of the companion piece, Failed Prophecies Are Fatal, I said we also shouldn't rely too heavily on this other study, by the researchers Braden and Hardyck, not because there's any reason to doubt it itself, but because everyone in it has a pseudonym, and even the location, like, we can't check on anything. So we should just be cautious about over-relying on it.
And I got a reviewer comment there that was like, that was standard in the field, you're coming very close to saying this was wrong.
And it's just like, you know, and I have no reason to think they did anything wrong, right?
I'm just saying we should take things we can't verify with a grain of salt; that's the only point I made in that article.
Yeah, you know, I feel like in academia there's this principle of charity. There's this default assumption that people doing this work are sincere and are genuinely working towards finding the truth, and they're not deliberately being fraudulent.
That's just a very serious accusation.
And I think that, for the sake of discourse, this is a fine principle.
But man, but the examples you provide, it seems as though it actually causes people to kind of like dig in their heels and refuse to look at very serious evidence that shows that they were fraudulent regardless of how much charity you might extend to them.
I mean, it's convenient for me.
I'm not like a social psychologist.
Like nothing I do like depends on like professionally on the validity of this body of work.
So it's like, you know, I'm less invested, I guess.
Yeah, yeah, because like I mentioned, this thing is in just about every book that I have that talks about, you know, new religious movements or QAnon or any kind of strange fringe belief that contradicts its own predictions or empirical evidence.
This study is trotted out.
It's like if the field were to go in and say that, okay, actually this study is no good.
We can't really rely upon it anymore and it shouldn't be referenced except possibly as a case study of what happens when field research sort of violates very serious ethical norms.
They're going to have to basically like rewrite decades and decades and decades of research into this matter because so much is built upon this original study.
I mean, I don't know exactly what would happen, but it feels like the consequences of accepting that When Prophecy Fails is fraudulent are pretty severe.
It is amazing to me how guided people's interpretation of the events was by When Prophecy Fails, to the extent that people would use it as an example of new religious movements surviving disconfirmation even though everyone knew this movement didn't survive, right?
Like it's not still around.
Like they didn't do anything.
And they had this little coda that's like, oh, who knows, if they hadn't run into these legal difficulties, maybe it would have survived. But that's where it's like, a lot of new religious movements run into very serious legal difficulties or persecution. And in this example, we know that Martin and the Lawheads even reunited briefly; they could have done whatever they wanted. But I do wonder, and I kind of share in this, if it's just fun to have a story about how crazy people can be, right?
Because like, you know, I don't agree with how they interpreted events, right?
Like, I don't find their beliefs very convincing personally.
And so maybe because they were like, you know, like a stigmatized belief system, maybe it's like easy to go along with like, you know, kind of this dismissive story.
Like they're so crazy, they don't even notice when they're obviously wrong.
But, you know, a more realistic one is: maybe they seemed crazy in the sense that they were still open to the idea of psychics, but they recognized which parts of their beliefs were wrong and moved on from them.
Yeah, I think one sort of like optimistic possibility of the idea that when prophecy fails is debunked is the idea, it opens the avenue to a more nuanced understanding of these dynamics.
You discuss a couple of possibilities in that other article you mentioned, Failed Prophecies Are Fatal, where you propose two possible situations in which a belief system survives failed prophecy. One of them is simply age: if a belief system is very old, it's been around for generations, and it has a failed prophecy, people probably aren't going to totally abandon it, which makes sense. The other one, which I thought was really interesting, especially in the context of QAnon, is the idea that it could survive failed prophecy if it's based upon a textual interpretation as opposed to, say, divine channeling or something like that. And the reason, again, we're hypothesizing, is that with a textual interpretation, you can always blame the interpretation while affirming the validity of the text.
This is relevant to QAnon because it's all based upon a text, the Q drops. This is something I ran into over the years, over and over again: whenever I would debunk something, or say, this person claimed that this thing would happen based upon this Q drop and it didn't happen, the response I got from believers was frequently, oh, that interpretation obviously was wrong, but the Q drop still comes from a valid source who has insight and information and knows what they're talking about, and it does reveal something important, I just can't say what it is. So you can't just point to a wrong interpretation of a Q drop and say that therefore QAnon is wrong.
And I think that's, I don't know, that's, that's really, really interesting because it, I don't know, it provides us with a more complex framework of sort of like understanding why sometimes failed prophecy is in fact fatal and why it isn't.
Yeah.
And so, like you said, I think it's a pretty reasonable hypothesis, but the sample size isn't enormous. And I don't really know much about, you know, QAnon.
It's not my area of expertise, but looking at new religious movements, right?
Look at the ones who did the best, like the Jehovah's Witnesses, who were probably the least harmed of all, and their beliefs were like, hey, we're interpreting the Bible and we're trying to do it accurately, and this was in a largely Christian population, right?
So, just because one of their interpretations of the New Testament and the Old Testament was wrong, it doesn't mean that these people were suddenly like, well, I guess I don't believe in Jesus anymore.
Or like well, I guess I don't believe in the Bible.
Or even if you look at the Millerites, who did a bit worse: most people left the movement after their prediction was wrong, but a lot of them stayed. Once again, they were trying to interpret when Jesus would return, based on a prophecy from the Book of Daniel.
Right, so just because their math was wrong, you don't have to be like, well, I don't believe the Old Testament was inspired. And you even see this with another group that did survive, although it was very harmed, the Baha'is Under the Provisions of the Covenant, which comes up a lot in this literature. A lot of people quit after they were wrong, but the leaders would emphasize, like, hey, we're interpreters; we're not predicting, we're interpreting based off pre-existing sacred texts.
And that makes sense, because, you know, you don't need to abandon your entire belief system if one part is wrong and you're just a human trying to interpret it. If you're like, hey, I'm an omniscient angel channel, and you're wrong, that looks pretty bad for you. But if you're someone who gets a couple of details wrong about how to read part of the Bible, that's more sympathetic.
Yeah, there's another way I think your research invites a more nuanced understanding of these dynamics. Again, talking about the Millerites: after the failed prophecy, after the Great Disappointment, the Millerites as they existed before kind of disbanded, but they gave birth to the Adventist movement, which still survives today as the Seventh-day Adventists. But obviously it's not the exact same belief system that Miller preached; it's an altered form.
And this sort of invites the question: if there's a belief system that's based upon some sort of prophecy and it fails, what happens when it changes?
In a way that's sort of between abandoning your belief system entirely, you know, becoming an atheist or something, and doubling down and saying, no, actually it's right, or perhaps it's right in a spiritual way, or some sort of reinvention of the narrative.
What if it's altered?
And how is it altered?
In what ways does the failed prophecy alter the core belief system, what metamorphosis emerges, and what belief system comes out the other side? I think this might be an interesting area of research, one that's more nuanced than assuming that every time a failed prophecy happens, the people who hold the belief system either double down or abandon it entirely.
Yeah, that's right.
I do think, you know, most of my work professionally is pretty quantitative, but this is an area where I do think you just have reasonable people making judgment calls, right? Because there are some cases where it's honestly even ambiguous whether a group had a failed prophecy at all. A lot of groups have very hedged language. Like, some UFO religions will talk about, hey, the aliens are coming, they're probably going to come next year.
And then they'll also have a message saying like well, the aliens will only come when humanity is ready.
So it makes it really hard to say, like, you guys were definitely wrong, right?
So there is a lot of real social ambiguity, sometimes, in the literal text of the prophecies and in how it's received by the adherents.
Yeah, um, so I'm curious what you're working on next.
Because like we mentioned earlier, just because a study is bad, or even fraudulent, that doesn't mean that the phenomena described in the study are totally wrong. But you were saying earlier that you suspect there are maybe more serious flaws with the whole concept of cognitive dissonance theory than people think.
Yes, that that is my position, and that comes from reading from outside of um social psychology, to be honest.
So there's a couple of other um like famous findings of the cognitive dissonance theory, and I think there's reason to be suspicious of one.
So one famous study said that if you undergo a more intense or stressful initiation into a group, you'll value it more.
So there's this famous study from the late 1950s where women in college had to like read something to join a discussion group.
And some of them were given like a really boring thing to read and some of them were given like dirty words, essentially like sexual phrases to read in front of a male observer.
And then they did like a small survey and they're like, oh, look, like the women who had to embarrass themselves liked the group more.
And then people were like, see, this is like why hazing works.
You know, if you have a more stressful initiation into a group, you'll rationalize.
You'll say, I didn't like do something stupid.
You know, I went through the pain because the prize was worth it.
Right.
And I think, honestly, cognitive dissonance theory is weird to me because on one hand, it always has an intuitive appeal.
But then you think about it and you're like, wait, does like making a job worse make me like it more?
Or does like making an airplane worse make me like it more?
But in 2022, you had these two researchers, I believe they're sociologists, Cimino and Thomas, do a study where they actually tracked people going through fraternity initiation, some of which had what you might call hazing, and they found that, in the real world, more intense initiation had no effect.
Another part of cognitive dissonance theory I'm skeptical of is this idea of induced compliance.
And this is the idea if you're like pressured to do something, it might cause you to convince yourself you like it.
You don't want to think of yourself as like bullied or battered into doing something.
So there's this idea that if you're paid to do a boring task, maybe you'll convince yourself that you didn't just do something really boring for nothing; you actually liked it or something.
But we have real-world experiments on this from military conscription. During the Vietnam War, there was literally a lottery based on birthdays, where some men in the United States were at high risk of being sent to fight in Vietnam and some weren't. And the people who were exposed to this state coercion actually became more anti-war, which I think a normal person would expect, but which you wouldn't expect if you thought people rationalized their compliance with authority.
And there are similar experiments from East Germany.
So I have a preprint, if people are interested, that you can read freely at the Open Science Framework, called Cults, Conscripts, and College Boys, where I review evidence from outside of social psychology that I think challenges some of this lab work.
But it's not peer-reviewed yet.
Those are my reasons for skepticism about the field, I'll say, but I don't want to present it as conclusive yet.
I mean, yeah, it is really, really interesting because, again, so much of the research I look into is really about how irrational people are and like how people like they behave in these counterintuitive ways.
But I don't know, I kind of like your angle, because it has a little bit more optimistic sort of attitude towards humanity.
It presents people as like more sort of like aware of their beliefs and more willing to change than perhaps some of the literature gives them credit for.
I do think, like, I don't think people are perfectly rational nor perfectly intelligent, but I think when we step outside of the world of psychology and what we've read in psychology studies, it does seem like people are normally rational-ish.
We can say: if something's cheaper on sale, people buy more of it.
If your job is more pleasant, you like it more.
You know, you'll have like skeptic groups who go out and debunk mediums by showing their tricks.
That's bad for the mediums.
The mediums don't think to themselves, yes, now my fans will double down and believe my powers more.
It's like humiliating, right?
And so, I think if we step outside of the world of social psychology literature, we normally assume people move in the right direction, in general, when they're given information.
Not like 100% of the time, but generally.
That's really interesting stuff.
We've been speaking to Thomas Kelly about his really interesting research into When Prophecy Fails.
I'm going to link to some of the papers you've written in the show notes so they can explore this more.
I think it's really interesting.
I think it's, I don't know, it's like, again, it's like, it gives me a hopeful feeling.
As like, you know, it's the vindication of the debunkers.
We're not just wasting our time.
We're potentially helping people, providing new perspectives on their belief systems that may actually change them.
So, yeah, thank you so much, Thomas.
This has been a really interesting talk.
It was great to be here.
Thank you so much.
Thanks for listening to another episode of the QAA podcast.
You can go to patreon.com/QAA and subscribe for five bucks a month to get a whole second episode every week, plus access to our entire archive of premium episodes.
For everything else, we have a website qaapodcast.com.
Listener, until next week, may the deep dish bless you and keep you.
The psychological story of decision-making doesn't end, however, when a decision has been made.
The act of making a decision can trigger a flood of other processes.
According to psychologist Leon Festinger, whenever we choose to do something that conflicts with our prior beliefs, feelings, or values, a state of cognitive dissonance is created in us.
A tension between what we think and what we do.
When this tension makes us uncomfortable enough, we're motivated to reduce it in a number of ways.
We may change the way we think about the decision, or try to change how others think about it so that they can support our decision.
Or we may change some aspect of our behavior so that our decision seems more in character with us.
In other words, we try to reduce the dissonance between how we think we should act and how we actually act by changing one or the other.