Miss Information Episode 1 - Holiday Special Episode
On this Memorial Day, we are proud to introduce the newest limited series from the Institute for Religion, Media, and Civic Engagement and Axis Mundi Media. Miss Information is a podcast about how conspiracies and misinformation infiltrate wellness communities and religious spaces.
Subscribe here: https://redcircle.com/shows/21b4b512-ceef-4289-b9fc-76f302f5bd22/episodes/3532f1b2-5f15-4302-82f6-8a281d676871
Misinformation is big news, but what does it mean and why does it matter? If misinformation is simply incorrect information, can it be solved simply by telling people the right answer?
In this episode, we learn how misinformation can prevent people from voting if they think they aren’t eligible or can’t vote by mail; how misinformation can convince people to take certain drugs to cure a disease even if it’s not proven to be safe; and the ways misinformation can draw people into conspiracies like QAnon. But it’s not as simple as dispelling all misinformation from our midst. That seems impossible. Rather, in dialogue with Dr. David Robertson from the Open University, what we will discover points to a different question: Why do people believe misinformation at all and what does it do for them? In other words, instead of focusing on what people believe, perhaps the phenomenon of misinformation directs us to ask what beliefs do - who they favor, who they put in power, who they marginalize, and who they leave vulnerable. And by understanding the mechanics, maybe we can mitigate the damage misinformation does to our public square.
For more information about research-based media by Axis Mundi Media visit: www.axismundi.us
For more information about public scholarship by the Institute for Religion, Media, and Civic Engagement follow us @irmceorg or go to www.irmce.org
Funding for this series has been generously provided by the Henry Luce Foundation.
Creator: Dr. Susannah Crockford
Executive Producer: Dr. Bradley Onishi (@bradleyonishi)
Audio Engineer: Scott Okamoto (@rsokamoto)
Production Assistance: Kari Onishi
Dr. Susannah Crockford (@suscrockford): Ripples of the Universe: Spirituality in Sedona, Arizona
Further Reading
Robertson, David G. UFOs, Conspiracy Theories and the New Age: Millennial Conspiracism. London: Bloomsbury, 2016.
Robertson, David G., and Amarnath Amarasingam. “How Conspiracy Theorists Argue: Epistemic Capital in the Qanon Social Media Sphere.” Popular Communication 20 (2022): 193-207. https://doi.org/10.1080/15405702.2022.2050238.
Howard, Philip N. Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives. New Haven: Yale University Press, 2020.
Bail, Chris. Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing. Princeton, NJ: Princeton University Press, 2021.
Uscinski, Joseph E., and Joseph M. Parent. American Conspiracy Theories. Oxford: Oxford University Press, 2014.
Byford, Jovan. Conspiracy Theories: A Critical Introduction. London: Palgrave Macmillan, 2011.
Argentino, Marc-Andre. “The Church of QAnon: Will Conspiracy Theories Form the Basis of a New Religious Movement?” The Conversation, May 18, 2020, https://theconversation.com/the-church-of-qanon-will-conspiracy-theories-form-the-basis-of-a-new-religious-movement-137859
Hao, Karen. “How Facebook got addicted to spreading misinformation,” MIT Technology Review, March 11, 2021, https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/
If a man being ugly isn't enough of a reason for you to not want to date them, let me give you another.
My name is Axis Mundi.
My curly hair is from my dad's side, and until my mom became pregnant with me, she had pin-straight hair.
All of a sudden, it turned really curly.
I just found out a man's genes can actually alter yours if you're pregnant with their child.
So you're telling me that getting pregnant can make you look like your partner?
It changes your DNA?
You're telling me that when I get pregnant, I suddenly look like the father of the child?
Like him getting me pregnant also made me his baby somehow?
If this is true, it might be the best form of contraception ever invented.
But is it true?
This TikTok video went viral in September of 2023.
There are thousands of comments of people gasping at the horror of becoming the man who impregnated them, and users sharing stories of how their mum's hair or skin changed after pregnancy.
But at least one user realised that what was being shared in the video might not be scientifically accurate.
After posting information from the TikTok, they used a popular emoji to signal their realisation that the information was false.
The emoji is a large, shiny blue face creased with laughter, a white hand covering its mouth, under a caption in black type:
When I purposefully spread misinformation on the internet!
After posting this TikTok, the person came back a few days later and said, sorry, my source was TikTok.
This is a 21st century mea culpa, the meme version of an apology.
It seems like they hadn't watched enough videos yet that day to realise they'd been misled.
Here's Dr Chad, an ER doctor who is popular on TikTok, explaining why the idea that pregnancy changes your DNA isn't exactly accurate.
...as much as it is the pregnancy itself.
Pregnancy does change your DNA, but not in the way of, like, the baby's genes infesting your DNA.
The genetics of hair type are quite complex, spanning many different genes, controlling many different types of hair.
Part of what determines whether a person has straight, wavy, or curly hair is the angle that the hair follicle grows.
Many people with straight hair actually have curly genes that are bound up in part of the DNA that is not being expressed.
And when pregnancy comes along, the hormones change which genes are expressed and can change those features of your hair.
Sometimes these changes are pretty dramatic, as you can see with this woman before pregnancy, during her first pregnancy, and during her second pregnancy.
Many times these are temporary changes lasting a few months to a year, but sometimes they are permanent.
So while you don't exactly absorb a man's DNA during pregnancy, there are times where you can get fetal tissue lodged within your brain.
One of the things we'll learn throughout this series is that misinformation often contains a thread of truth that makes it easier to believe. That tiny grain of fact makes it seem like it could be true.
This is often much more convincing than something that's entirely made up.
Another example of the misinformation meme in use came in response to a list of the world's most livable cities in 2023, three of which were Canadian.
Above the picture of the list, a user wrote, as someone from Canada reading this list, and then put the misinformation meme.
Replies to the quote tweet concurred that those cities were only livable for people earning over $200,000.
But is this misinformation?
What makes a city livable is subjective and contingent.
Depending on how you define livable, this information may be true, false, or somewhere in between.
This is why the kernel of truth is so important when there is no real right answer.
Such as, what makes a city livable, right?
There's a lot of different variables that go into making a city livable for different people.
So it is entirely true that there's some things about those cities that make them liveable, but for others that may be entirely false because those particular variables don't apply to them.
So when it comes to misinformation, it may be one thing to spend a few days believing something demonstrably false about what happens during pregnancy, or to think that a city is livable when the story is a little more complex than you first thought.
But it's another when the information one consumes leads to life-altering behaviour.
For example, what types of medicines or chemicals to take in order to ward off a virus causing a global pandemic.
I'm the president of the United States.
I'm not the president of other countries.
Every one of these doctors said, how do you know so much about this?
Maybe I have a natural ability.
Maybe I should have done that instead of running for president.
What do you have to lose?
I'll say it again.
What do you have to lose?
Take it.
Hydroxychloroquine.
Try it.
If you'd like.
When somebody's the President of the United States, the authority is total.
And that's the way it's gotta be.
Supposing you brought the light inside the body, which you can do either through the skin or in some other way.
And I think you said you're gonna test that too.
Sounds interesting.
Right, and then I see the disinfectant where it knocks it out in a minute.
One minute.
And is there a way we can do something like that by injection inside or almost a cleaning?
A lot of good things have come out about the hydroxy.
A lot of good things have come out.
I happen to be taking it.
I happen to be taking it.
If we were testing a million people instead of 14 million people, we would have far fewer cases.
Right?
So, I view it as a badge of honour.
Or whether or not vaccines cause autism.
Time for Health Watch and the heated debate over childhood vaccines and whether they cause autism.
Yesterday, a respected British medical journal retracted a study that said the MMR vaccine may trigger autism.
CBS News correspondent Richard Roth...
Or if an election was or was not stolen by way of sham votes and rampant cheating.
The ballots that you said you saw are lying around the place or in trash cans or whatever.
Where are you hearing that from?
Oh, I mean, it's there.
The videos are going viral everywhere.
I've seen them on TikTok.
I've seen them on Facebook.
I've seen them on Fox News.
I've seen them on the local news around my area.
I've seen too much pieces of different evidence so far that shows that at this point I would be okay with a revote.
Really?
Yeah, absolutely.
When you have video footage of people taking bags of ballots and showing that they are for Donald Trump and lighting them on fire.
I helped write a fact check on CNN on that particular video.
The election officials said that video has been going around for a few days.
So what is the cost of misinformation?
Is it human lives in a pandemic?
Damaged lives due to vaccine hesitancy?
The weakening of democracy in the wake of polarising elections?
Or just silly videos on TikTok about curly hair and memes about whether or not a city is liveable?
The question is not whether or not misinformation exists, but who does misinformation hurt, and why do we fall for it?
Misinformation matters, and sometimes it matters more than at other times. It is a big deal because it can affect how we view and experience birth, death, and everything meaningful in between.
Join me as we consider these questions and examine the ways misinformation infiltrates human communities.
Throughout this series, we'll take on topics like biohacking, yoga, QAnon, 15-minute cities, and wellness culture in order to figure out the role misinformation plays in cultivating conspiracy theories, changing individual behaviours, and building distrust in our institutions and governments.
We are going to uncover what misinformation is and why it matters for how we think about the bodies we live in, the cities we inhabit, the communities we build, and the future we want.
Our hope is that by examining the mechanics of how misinformation works and spreads, we might get better at detecting it when it comes across our timelines.
Or at least help us stop believing TikToks about DNA changing when you're pregnant.
That would be a start.
Welcome to Miss Information, a limited podcast series by me, Dr Susannah Crockford, in conjunction with the Institute for Religion, Media, and Civic Engagement and Axis Mundi Media. Miss Information was produced by Dr Bradley Onishi and engineered by Scott Okamoto.
Kari Onishi provided production assistance.
Miss Information was made possible through generous funding from the Henry Luce Foundation.
Bye now.
We all know that people are wrong on social media all the time.
The internet is full of false claims, lies, misrepresentations, rumours and conspiracy theories.
This is old news.
What has emerged are various attempts to control or curtail the firehose of unwitting falsehoods and deliberate lies that is contemporary online discourse. And the word misinformation, along with the related disinformation, is at the forefront of this attempt to clean up public discourse.
If you're watching in Europe and own a smartphone, which probably applies to everyone, then you're going to start noticing changes over the next few months with new safety, verification and consent features.
The DSA will force companies with over 45 million monthly users like Google, YouTube and Instagram to clean up its act in terms of its content moderation, user privacy and transparency.
But this laundry list is still unclear over free speech.
Most large platforms already remove lawful but awful content, but the lines are blurry for keyboard warriors on X for instance.
But even billionaire Elon Musk, who took over the Twitter platform last year and immediately butted heads with the EU over content, is agreeing to comply.
While there's nothing new about sifting out fake news and tackling hate speech, hefty fines and punishments may help.
In 2022, the European Union enacted the Digital Services Act and it came into effect in 2023.
The law requires companies with at least 45 million active monthly users, so Facebook, Instagram, YouTube, TikTok, Twitter, now X, to control the spread of misinformation, hate speech and propaganda, or face a fine of up to 6% of their global annual revenue, or even a ban in EU countries.
The UK and Australia have also been working on similar laws to force technology companies like Meta and Alphabet to take responsibility for the bad and harmful information that spreads on their platforms.
Platforms that earn those companies billions of dollars in revenue every year, primarily from advertising.
Tech companies have been profiting from misinformation for some time.
But what does misinformation even mean?
In his book, Lie Machines, Philip N. Howard, a sociologist at Oxford University, defines misinformation as contested information that reflects political disagreement and deviation from expert consensus, scientific knowledge, or lived experience.
Misinformation is often distinguished from disinformation, which Howard defines as purposefully crafted and strategically placed information that deceives someone, tricks them into believing a lie or taking action that serves someone else's political interests.
The main difference that scholars draw between misinformation and disinformation is intentionality.
Both words describe information that is wrong or inaccurate somehow.
But disinformation is wrong on purpose, and that purpose is often assumed to be nefarious.
The problem, from a legal perspective, is that information does not need to break any laws to be harmful.
That's why the alarm over misinformation has brought this flurry of new laws seeking to regulate what can be said, especially online.
But who decides what is harmful?
And as the "when I purposefully spread misinformation on the internet" meme shows, for some, lying online is funny.
Internet culture is full of jokes, parodies and satire.
Harm is often a subjective evaluation, like what makes a city livable.
What one person dismisses as a joke, another may find offensive.
What Howard draws our attention to with his definitions is that misinformation and disinformation are often political in nature.
So it's more than just someone telling you that getting pregnant can alter your genes.
It's someone telling you that mail-in ballots won't be accepted when they are, for example.
Or that a certain politician committed a crime when they didn't.
What is the difference between misinformation and disinformation?
Misinformation, which is kind of like gossip spreading through like a telephone line, right?
It gets distorted slightly.
That's very different from disinformation, which is like a push of power to manipulate you, a lie seeded for a purpose.
That was Nobel Peace Prize winning journalist Maria Ressa making clear that disinformation is more serious than misinformation.
Disinformation, a lie seeded for a purpose, is a way of exerting power.
It's trying to deceive people to manipulate or control them.
It's more like propaganda or even psychological warfare.
And while on the face of it, ascertaining the truth seems like it should be straightforward, it often isn't.
Facts are open to interpretation.
The source of a piece of information is often taken as grounds for discrediting its veracity, especially in our current polarised political environment.
Simply calling something misinformation does not solve the problem.
Because first, you have to answer a much harder question.
What is the truth?
Who gets to decide what is true?
Whose truth matters?
Drawing hard boundaries around misinformation and disinformation requires that we share what political scientist Joseph Uscinski called properly constituted epistemic authorities in his work on conspiracy theories.
So we know that Watergate was a real conspiracy because the media and Congress investigated it and confirmed it.
Whereas when governmental authorities investigated 9-11, they confirmed the official account was true.
But like Howard's definition of misinformation that rests in part on expert consensus and scientific knowledge, this requires that we can agree on who is an expert and which authorities are trustworthy.
If you lived through the Bush era, you may have already scoffed at the suggestion that we should believe the 9-11 Commission.
Especially when we now know that Bush and his advisors lied about the existence of WMDs in Iraq as a pretext to invade.
While disbelieving politicians is a common disposition, scientific knowledge is also increasingly under attack.
So, do we have properly constituted epistemic authorities?
Laws against misinformation arouse suspicion that they may be infringing free speech.
Free speech is a concern that arose specifically in response to the question of what to do if the government tries to punish people for what they say.
But free speech has never been an unlimited right, and speech that breaks the law is not protected.
Intentionality comes in again, because there are different laws around satire and parodies, around sincerely held beliefs, and the result is an epistemic minefield.
In today's episode, we unpack the terms misinformation, disinformation and conspiracy theories.
How do the categories of misinformation and disinformation operate as a form of social power?
Misinformation is not only a weapon used to deceive and mislead people, it can also be weaponized against marginalized groups, or whichever side you want to discredit.
Misinformation and disinformation became big news after the 2016 US presidential election.
I'd like to welcome the fake news media, which is back there.
There's a lot of people back there.
We have a lot of big races going on right now, so enjoy that, enjoy the food, and enjoy everything.
And really, in all fairness, it is a great honor to have the media with us, and we hope you enjoy yourself.
The election of Donald Trump shocked many, and in searching around for reasons why he won, misinformation and disinformation became a popular answer.
Donald Trump's campaign, and Trump himself, made so many false claims that fact-checkers lost count.
We do not need a reckless president who believes she is above the law.
Lock her up!
Lock her up!
That's right.
Yes, that's right.
Lock her up!
I'm going to tell you what.
It's unbelievable.
That was Michael Flynn, the former director of the Defense Intelligence Agency, among other things, leading a "lock her up" chant at the Republican National Convention in 2016.
The big claim, of course, was that Trump's opponent, Hillary Clinton, had committed multiple criminal offences and therefore should be locked up.
This claim was supported at the time by the fact that the FBI did investigate her for use of a personal server for her emails.
None of these claims were substantiated and Hillary Clinton remains unindicted.
Part of Trump's success was attributed to his digital campaign.
It later emerged that his campaign took advantage of digital services from companies like Cambridge Analytica that were harvesting data from Facebook and other social media platforms.
Harvested data allowed campaigns to target personalised ads, boosting one side or discrediting the other.
Our data was being used against us!
Even more alarming were claims that Russian agencies like the Internet Research Agency were purposefully stoking discontent through inflammatory posting on divisive issues in America, such as race or LGBTQ rights.
Russian troll farms were posing as Americans and sharing memes and replying to others' posts to increase the circulation of polarising content.
This case became the archetypal example of disinformation, spread by an authoritarian regime to weaken its enemies through taking advantage of their domestic social problems.
By the 2020 US presidential election, the online dystopia was even worse.
QAnon had emerged online as a network of pre-existing conspiracy theories that crystallized around claims that Hillary Clinton and other Democrats had been arrested for heinous crimes and that Trump was going to impose martial law.
This event was called The Storm, the outbreak of which was meant to signal the triumph of Trump and the so-called Patriots against the forces of the Deep State.
This online conspiracy theory was retweeted by Trump himself on numerous occasions.
QAnon followers were also part of the crowd present on Jan 6, 2021.
After losing the 2020 election, Trump claimed fraud and declared the election had been stolen from him.
False claims spread on Facebook groups devoted to "Stop the Steal."
And then, when the electoral count was to be certified by Congress, an angry mob gathered outside the Capitol building.
Militias, including the Proud Boys and the Oath Keepers, instigated an invasion of the building, and the protest turned into an insurrection aimed at overturning the election and installing Trump by force.
Many at the Jan 6th insurrection claimed religious reasons for being there.
They held prayer circles and blew Jericho horns.
They held signs declaring, God wins!
As part of the motive for Jan 6th, scholars of religion have identified a current of Christian nationalism: the belief that America is a Christian nation and rightly belongs to a specific type of Christian, one who is white, patriarchal, and heterosexual.
Religion sits uneasily beside conspiracy theories like QAnon. Scholar of religion and extremism Marc-André Argentino has called QAnon a new religious movement.
Both are founded on non-empirical beliefs.
So, is there a clear distinction between the two?
And if truth is at least partly dependent on authority, who we believe as much as what we believe, what happens if the highest political authority in the nation uses lies and conspiracy theories to seize power illegitimately?
To parse these thorny issues, I spoke to Dr David Robertson, a Senior Lecturer in Religious Studies at The Open University in the UK.
My name's David Robertson.
I'm a Senior Lecturer in Religious Studies at The Open University.
I research new religions, conspiracy theories, the history of thinking about religion and all kinds of marginalised knowledge.
So since David studies marginalised knowledge and conspiracy theories, I asked him about how we can define misinformation and disinformation.
These kind of social categories that are being used in academia, outside of academia, in the press, in politics, in everyday speech, they never have specific tied down meanings.
I think we have a tendency as academics sometimes to try and find a single kind of definition, whereas actually I tend to think that the vagueness and the way that it changes is part of the deal, right?
It's baked in, it's part of what it does.
The main difference then is about intentionality.
Misinformation is information that's wrong, but disinformation is information which is deliberately wrong.
It's there to mislead.
The point of disinformation is often to make people disengage, rather than to convince them of something else.
The more people disengage, the more passive they become, the less they will oppose bad actors, but also the less politically and socially active they become.
And the more distracted we are with trying to work out what is even true, the less likely we become to do anything at all.
And as we've just heard, disinformation is not the same as conspiracy theories.
Conspiracy theories refer to a broad category of stigmatised knowledge that identifies secret plots and groups of collaborators manipulating world events largely unseen.
It may be easy to dismiss conspiracy theories as just wrong, but that's too simplistic.
There's a reason why these theories persist.
Even though their details are fantastical and phantasmagoric, their broader diagnoses of how power works in society refer to wider social truths.
Take chemtrails.
This conspiracy theory suggests that white trails seen behind airplanes crisscrossing the sky on a clear day are the remains of poisonous gases sprayed intentionally to control the weather or our minds.
Some say they suffer from chemtrail flu on days when spraying is high.
Aluminum, boron, and other particulate matter are blamed.
Admittedly, these claims are easy to ridicule: aluminum is a naturally occurring element, and who would fly a plane spraying poison and then get out and breathe the same air?
But, it's not like we directly control what industries, like the airline industry, are doing to our atmosphere.
It's not like planes are good for human health or the environment.
Carbon emissions from jet fuel are a major driver of anthropogenic climate change and the rumours of weather control come from documents published by the US Air Force speculating that they need control over the weather to maintain military dominance.
It's not unreasonable to be sceptical about the powers and motives of the US military.
The conspiracy theory that HAARP, the High Frequency Active Auroral Research Program, a scientific lab in Alaska, now closed down, could create hurricanes came in the wake of the federal government's insufficient response to Hurricane Katrina.
New Orleans is a majority black city, and many saw a latent racism in the weak response from the Bush administration.
Conspiracy theories are often prevalent in minoritized communities for a reason.
They have experienced that the state, the government, the law, all the epistemic authorities do not work in their favor and in fact may work against them.
So, is a conspiracy theory just a mistaken form of belief?
And if it is, is it any different from religious belief?
Here's David Robertson again.
When somebody, you know, when a foreign power does it, we might talk about propaganda, right?
It's not simply that you're presenting your position, it's that you're deliberately misleading the other side.
certain exaggerated claims or misrepresenting what the other people have said.
There's various ways of doing it, which I'm sure we'll get into.
But the intentionality is the main issue.
How easy is it to read intentionality, especially when we're looking at things like online speech?
It's not easy.
And in fact, a lot of the earlier studies of conspiracy theories online and misinformation and disinformation online got that wrong and assumed that somebody sharing something on Twitter meant that they believed it.
We now know that's not the case at all.
I sometimes liken it to when I went to the G8 riots in London in, when was that?
2007, I think?
It became very clear that maybe 20% at most of the people there were there to riot and protest, right?
I mean, there was maybe a slightly larger bunch of people who were like me, who were sort of protesting, but sort of just watching what was going on.
Press photographers outnumbered everyone.
So there'll be a small core of people who really strongly believe it.
There'll be a slightly larger circle around that of people who are maybe interested in the idea.
You know, they might go, oh, this is mad.
This is it.
But this is like, what if this is true?
That's wild.
But aren't necessarily believers, right?
And then there'll be an even larger circle of people around that who are pointing at them and calling them idiots or saying this is a danger to society.
We must do something about this.
But counting all of those as believers massively exaggerates it.
Yeah, I think that there's this phenomenon now, especially online on social media, where like a lot of the, especially the shares of a post that is misinformation, are just people pointing out that it's wrong.
So now you get the dynamic where people will try and be wrong on purpose, like with spelling mistakes, because it increases engagement, right?
The kind of meta-commentary on misinformation has become part of the dynamic of how it spreads.
It should be borne in mind that disinformation and misinformation aren't always about actively sharing material.
They are also about which material you share and which material you don't share.
And it can also be about the whole flooding-the-zone-with-shit kind of Steve Bannon strategy, which is also kind of associated with Putin's Russia, that idea of, you know, funding both sides to produce this information. You're not trying to incept one particular idea into the mind of the public.
Rather, you're creating so much confusion that they don't actually know what's going on.
People do things they don't necessarily believe in, and they say they believe in things which don't guide their actions.
And sometimes people will just do what they feel they need to do, whether it's something they believe in or not.
The simplest version you'll be given is that a conspiracy theory is something that somebody believes that isn't true.
And, of course, the question becomes immediately, according to what criterion of truth?
There's always a suggestion of it being a belief in a conspiracy of some sort, but it's more than just that.
It's not only that.
There's a sense of malevolent agency that whoever it is is working against the common good, for want of a better way of putting it.
The other thing, though, is that you have to talk about what a belief is, right?
And there is no agreement on that.
Depending on whether you're coming from a more sort of hard scientific side or the social science side, there are many different ideas, right?
I don't believe in horoscopes, but I used to read it every time when I was in the metro and I was on the bus going to work.
That's out of interest.
I used to ask people in religious studies lectures and say, so who believes in horoscopes?
I'd usually use horoscopes, not always.
And one or two people might put their hand up.
And then I would ask, OK, so who reads their horoscope regularly?
And a few more hands go up, right?
Not as many as you might get asking people my age, because we remember newspapers, so it's probably higher there.
But if I then asked, have you ever looked up to see whether your star sign and a partner's star sign are compatible?
And at least three quarters of the class would put their hand up.
And those people had just said they didn't believe it, right?
And so that's the opposite way around.
It's saying, I don't believe it, but it's actually driving their behavior.
So belief can mean a lot of different things.
The other thing is chronic pain issues particularly, but other sort of chronic issues that we have or even really dramatic things like losing a loved one, losing a partner or a child particularly.
Say you've got like really severe arthritis and the painkillers you're getting from the doctor and the physiotherapy just aren't doing it.
I would try acupuncture or aromatherapy or hypnotherapy or anything.
I would try it.
If it works, I'll change my beliefs to fit.
Because what I'm looking for is the pain relief.
I don't really care whether it fits my worldview.
At times of heightened media coverage, it can seem like conspiracy theories, misinformation and disinformation are a serious problem.
But the existence of wrong information, and more broadly, non-empirical beliefs, is ongoing.
But are they dangerous?
When we talk about belief in conspiracy theories, we've already sort of talked about it in terms of misinformation, right?
The number of people who seem to believe conspiracy theories is around 6%, maybe slightly higher, and has sat there for a long, long time.
Conspiracy theories are a form of non-empirical beliefs, beliefs that cannot be substantiated through empirical observation or evidence.
But so is religion, and religion is often credited as a social benefit, not as a danger to society, even though it equally does not fit with empirical reality.
Conspiracy theory is a category that operates socially as a way of designating bad belief.
Religion is seen as good belief, a form of belief that leads to socially productive activity.
When religion is abusive, it is more likely to be labelled a cult.
But this too constructs a category to isolate small, socially objectionable groups.
Large established religions, such as the Catholic Church, have been able to cover up abusive behaviour because of their level of social power.
And power, when it comes to misinformation and disinformation, is really important.
The solution to misinformation that is frequently advocated is to tell people correct information.
Debunking and fact-checking have become common practices in the media over the past few years.
We now have little boxes at the bottom of social media posts with community notes or content warnings.
But does this work?
Only if the source of the information is trusted.
Because truth is tied up with authority and authority stems from power relations.
There are many aspects of our lives that we do not have access to directly.
We actually don't know firsthand if the votes counted are legitimate.
We have to trust the electoral officers in charge of counting the votes to follow the rules.
We have to believe that those in power are doing the right thing and following those same rules.
In Arizona, claims of vote tampering followed the 2022 midterms.
Losing candidates such as Kari Lake in the governor's race claimed that faulty printers and bamboo papers created irregularities that resulted in her loss.
Following a disastrous midterm election showing, Arizona's Attorney General ordered Maricopa County officials to submit a report on its botched handling of November 8th election before anything can be certified.
Kari Lake, the Trump-endorsed Republican candidate for Arizona governor, lost her race to Democratic opponent Katie Hobbs with 49.7% and 50.3% of the votes, respectively.
Her campaign is now calling for a redo of the election following the ballot printing issues at about one-third of polling locations.
Kari tweeted, quote, I'm so sorry they did this to us, Arizona.
We will not let them get away with disenfranchising our vote.
If you experienced issues on Election Day, please submit them at saveaznow.com.
So, Robbie, where are you on that?
Kari Lake's actions show that Trump's claims of election fraud have gained traction among Republicans.
They believe he had the election stolen from him.
And then, when they lose subsequent elections, they claim that is a fraud as well.
Electoral fraud becomes the standard response to Democrats winning.
Long and detailed investigations into the elections in Arizona have largely not changed the majority of Republicans' minds.
The COVID-19 pandemic was also met with claims of disinformation and conspiracy theory.
Some denied the pandemic was real.
Others claimed vaccines were really microchips implanted by Bill Gates.
Mitigation measures were rejected as control mechanisms imposed by a corrupt federal government, which was suspected of forcing a "great reset" to socialism under the guise of emergency measures to limit the spread of COVID-19.
Arizona had a limited shelter-in-place order during the summer of 2020 and a mask mandate that was lifted in the fall, although some cities maintained their own orders.
I remember being shouted at for wearing a mask in Sedona in October 2020.
Arizona had limited measures to mitigate the pandemic.
It also had the highest death rate from COVID-19 of all the U.S. states.
But the Arizona Senate investigated the anti-COVID measures, not the reasons why the Arizona death rate was so high.
Well, I would say pay attention to the choreography.
If you go into any hospital and you see a complicated patient, the first thing you see is groups of doctors and nurses.
It's always a team game.
Medicine, no single person has the right ideas.
Always takes multiple minds.
No two doctors agree.
No two nurses agree.
No two lawyers agree.
And that it's teamwork.
There simply are medical observations and data that are produced.
And there's one or more interpretive points of view.
The word misinformation doesn't exist in medicine.
The word misinformation doesn't exist in this pandemic.
Does not exist.
Because no one holds agency over the truth.
No one does.
There simply are the evolving observations that we have and we're trying to interpret it.
So when you hear the word information and misinformation, immediately that's a red flag for not trusting.
That was Dr. Peter McCullough, an American cardiologist, former vice chief of internal medicine at Baylor University Medical Center, and professor at Texas A&M.
Since COVID-19, he has become a prominent proponent of questioning the mainstream medical response to the pandemic, promoting alternative therapies such as ivermectin and questioning the efficacy of vaccines.
What he is saying is that no one person can claim agency over the truth and so there is no such thing as misinformation in medicine.
It's all just differences in opinion.
He goes on to criticise governments for acting like dictators, claiming they alone know the truth and they alone spoke for the science in imposing mitigation measures during the pandemic.
So he's implying that governments lied in order to limit civil liberties and "take our freedom," as the right-wing rallying cry goes.
So, if misinformation doesn't exist, and anyone who claims they know the truth is a dictator, where does this leave us?
Wading through an information zone flooded with shit?
And how do we square powerful people, Republican state senators and credentialed medical experts, spreading distrust of authorities with the fact that conspiracy theories are most commonly believed in marginalised communities?
I asked David Robertson.
If non-empirical or epistemically dubious ideas which are not based on facts are a danger to society, what are we going to do about the fact that three quarters of Americans accept non-empirical claims such as "God gave his only son that we might have eternal life"?
Why is that not misinformation or post-truth?
The government lies and manipulates.
This is common knowledge.
But that was also the question that Steve Montenegro was posing to Peter McCullough in the previous clip.
Montenegro said Americans tend not to trust their government and other authority figures.
So how can we trust medical professionals?
And he is a state senator!
He's saying Americans don't trust the government when he is the government!
If people in government are telling us not to trust the government, and doctors are telling us no one knows the truth in medicine, how do we know what is true, and who do we trust?
Marginalised groups are more likely to be accused of conspiracy theories, but they're also more likely to accept conspiracy theories because they have legitimately been lied to and brutalised by the people who are accusing them of conspiracy theories, right?
So there was lots of critique of black communities for not wanting to get vaccinated, but those people who were critiquing them were forgetting that there is a long history of black communities in the US being allowed to have syphilis or sterilized or arrested.
And I'm not even bringing the slavery aspect into it because that's a whole other level that I don't have time to get into.
But the point is that they have been historically oppressed and lied to.
So it's not irrational for them to be scared of it happening again.
Asserting what is true is a form of power.
And while it may seem like people like Peter McCullough and Steve Montenegro are speaking out against power, they are actually in positions of power and using their position to attack the policies of their opponents.
Right?
Republicans like Steve Montenegro don't mean all government is untrustworthy, they mean Democrats in government are untrustworthy.
It's an assertion of power, just like Kari Lake tried when she claimed the election she lost was fraudulent.
And just like Donald Trump tried when he lost the election.
Claims of misinformation and disinformation are emerging in a political context of rising global authoritarianism, and authoritarians use disinformation to drive a message, according to Rachel Maddow.
That message is the same.
It's to turn us against each other, to make us believe that democracy doesn't work, that there's no knowable truth, right?
Yes, this is really important.
This is really important, and I can tell right now that this sounds woo-woo, but it's not woo-woo.
It's very specific.
One of the things they do is they tell you, don't believe journalism, don't believe science, don't believe experts, don't believe history.
It's all fake.
It's all designed to bamboozle you.
None of these so-called sources of expertise are... The only knowable truth is something that you feel in your gut, and let me tell you what to feel in your gut.
Separating us from the idea of knowable truth means we don't recognize real practical problems in the world, we don't recognize real practical solutions to those problems, which we should put our government to, and it means that you're very susceptible to both conspiracy theories and you're susceptible to suggestion from the leader who wants you to do things that you probably would not do on your own steam if you had your wits about you.
Misinformation and disinformation are social categories with contested boundaries.
They can be used in many different ways to signal disbelief.
It can seem like anyone can say anything on social media and misinformation is barely distinguishable from parody.
Social media platforms are not editors.
They do not take responsibility for lies that are spread on their sites because they claim they are not publishers.
They only host other people's content, so they are not responsible for what that content is.
However, they do remove illegal material, such as terrorist content and child pornography.
There has also been a dissolution of institutional authority over time.
Media authority peaked in the 1950s when the traditional print and television media dominated and there were limited outlets with editorial control.
Social media destabilized the information environment and disrupted the business model of traditional media.
And social media companies have been criticized extensively for not doing more to tamp down on hate speech, extremism and propaganda.
These facets of disinformation are used by authoritarians to push and manipulate people into doing what they want.
An egregious example of the consequences of inaction by platforms is the anti-Muslim disinformation spread on Facebook after an August 2017 attack by Rohingya militants.
The Rohingya are a Muslim minority in Buddhist-majority Myanmar, where they have faced ongoing state persecution, leading to mass migration to neighbouring Bangladesh and other countries.
In 2017, thousands of articles carrying anti-Muslim messages proliferated on Facebook in Myanmar, spread by politically or financially motivated actors.
They spread inflammatory articles, which were then amplified through recommendation algorithms.
Members of the Myanmar military used fake accounts, posing as fans of pop stars or national heroes, to post false allegations of Muslim violence against Buddhists, calling Islam a global threat to Buddhism.
They used Facebook as a tool for ethnic cleansing, according to journalist Paul Mozur in a 2018 New York Times article.
It was a systematic campaign on Facebook for over five years.
Military members used fan pages and troll accounts to post incendiary comments and articles at times of high internet usage, maximizing engagement and leaving harassing comments on posts critical of the military.
They used the platform to collect intelligence on their opponents and they disseminated pictures of violence that they then falsely attributed to the Rohingya.
1.3 million people followed accounts found to be fronts used by the Myanmar military.
These military-controlled accounts used the anniversary of 9-11 to allege thousands of armed Muslims were gathering to attack.
They sent direct messages through Facebook, warning of jihad attacks about to occur.
To Muslims, they sent messages warning Buddhist groups were about to attack.
Tensions were already high.
The social media messages generated widespread feelings of anxiety and despair, feeding into people's fear, and presenting the military as the only saviours from violence.
The military used psychological operations tactics developed during the long period of military rule in Myanmar, when they were discrediting American and British radio broadcasts.
Others trained in Russia and their campaigns use similar tactics to the notorious Russian disinformation operations.
The golden rule of disinformation is, if one quarter of the content is true, that helps make the rest of it believable, says Mozur, paraphrasing a veteran Myanmar military psychological warfare operative.
And making lies about imminent Muslim violence believable had terrible consequences in Myanmar.
During 2017, 10,000 Rohingya were killed, 354 villages were burned down and 750,000 were displaced, mainly to neighbouring Bangladesh.
80% of the Rohingya population in Rakhine state were killed or fled.
In 2018, a United Nations investigation determined that the violence against the Rohingya constituted a genocide, and that Facebook had played a determining role in the atrocities.
Facebook substantially contributed to the genocide because they proactively amplified and promoted anti-Rohingya content on the platform, according to an Amnesty International report in 2022.
Facebook later admitted that they hadn't done enough to stop political violence in Myanmar and that its platform was used to help provoke genocide.
It did ban the Myanmar military's account and their television station's account in 2018.
It was used to convey public messages but we know that the ultra-nationalist Buddhists have their own Facebooks and are really inciting a lot of violence and a lot of hatred against the Rohingya or other ethnic minorities.
The revelations were shocking.
The UN investigators named Facebook in their report, calling it a useful instrument to spread hate.
Facebook didn't cause the genocide of the Rohingya, but the platform did enable the exacerbation of pre-existing tensions that led to an escalation of violence.
So, here's the crux of the matter.
Facebook doesn't stop political pages posting misinformation.
And if they did, they would have to stop all sides posting misinformation.
And then who would be the arbiter of truth?
Facebook?
Remember what we learned about epistemic authorities and how having shared sources of truth and reality helps to create a society where people agree on basic facts?
The problem in the case of social media in particular, but all forms of media, is who gets to decide what is true and further, do we want media companies playing the role of gatekeepers of truth?
Because the thing is, people actually believe things that are not true all the time. But is this really a problem?
There's a difference between believing your DNA changes when you get pregnant and believing that an election was stolen when it wasn't.
Believing lies about an election can affect the choices you make when you go to vote next time.
Democracy requires accurate information.
This is something former President Barack Obama claimed when he described an epistemological crisis that's undermining democracy.
Basically people don't have shared sources of truth or trust in those sources and so we can no longer agree on the basic facts that are upholding our sense of what's going on and what we know.
The idea is that if everyone participates in a political system then everyone needs access to clear and truthful information about the candidates and their policies.
This way they can base their choices on facts rather than lies.
Then the voters can make the right choice.
But what is the right choice and who gets to decide?
The narrative that these people are believing the wrong things and therefore misinformation is a problem is itself problematic.
This narrative rests on an assumption that with the correct information people would not vote for Donald Trump.
But what if people like Trump, even though they know he lies?
What if they agree with what he stands for, despite the fact that it's not based in reality or fact?
What if that is a big part of why they vote for him?
What we've learned today is that misinformation can be simply defined as incorrect information, but it's more complicated than that because not everything has a right answer.
Who you should vote for is a subjective decision that is determined by a plurality of factors.
It is not simply the outcome of giving voters the correct information and then they go and make the right decision at the ballot box.
So it may be that misinformation is not a problem for some people in terms of who they choose to vote for.
Having the "truth" may not have steered many people away from Trump over the last decade, but misinformation can prevent people from voting at all if they think they aren't eligible or can't vote by mail.
Misinformation can convince people to take certain drugs to cure a disease, even if it's not proven to be safe.
And misinformation can draw people into conspiracy theories like QAnon.
The answer is not figuring out how to dispel all the misinformation from our midst; that seems impossible.
Rather, what we have learned today points to the question of why people believe misinformation at all, and what it does for them.
In other words, instead of focusing on what people believe, perhaps the phenomenon of misinformation directs us to ask what beliefs do, who they favour, who they put in power, who they marginalise, and who they leave vulnerable.
And by understanding the mechanics, maybe we can mitigate the damage misinformation does to our public square.
Thank you for listening to Miss Information.
Next episode we're going to talk about vaccinations and how false claims about vaccines are spread online.