E.J. Dickson, Rolling Stone’s senior internet culture writer, exposes how staged "Karen" videos—often featuring recycled actors and outrage-driven narratives—go viral on TikTok and X, fueling misogynistic retribution while platforms profit from engagement. Examples like a fake Ebola panic at Burning Man or Sabrina Prater’s weaponized doxxing reveal algorithms prioritizing spectacle over truth, normalizing cruelty under hollow justifications. Even apolitical conflicts, like Dave Portnoy’s pizza feud, spiral into culture-war performativity, with real victims facing death threats while staged ones escape accountability. Dickson warns of a media literacy void, where the internet’s darkest trends—doxxing, harassment, and "internet bloodsport"—thrive alongside its rare moments of genuine progress, leaving ethical boundaries eroded in the pursuit of clicks. [Automatically generated summary]
And I would say my For You page is very wide-ranging as a result.
Like, I get a mixture of a lot of different kinds of stuff.
And the one thing that I started noticing I was seeing a lot of were these videos of, it was always white women in their 30s, 40s, 50s, 60s, freaking out, largely at service workers, but also, you know, occasionally just at strangers on the street.
There were a lot of like parking and driving altercations.
Mm-hmm.
And these videos were getting millions of views, which didn't surprise me too much in itself because TikTok, especially as a platform, really prioritizes outrage-inducing content.
There's really, there's nothing more reliably outrage-inducing than a video of a Karen going apeshit on somebody.
But what did surprise me a little bit was that the videos were clearly fake.
They were staged, they were clearly scripted to some degree.
They were clearly taking place on a set because there was nobody else in the videos.
And that was very unusual considering a lot of them took place at a restaurant or someplace where you'd expect a lot of people.
It does get a little complicated, because, as I have covered in the past, some of these accounts that create these types of viral content will have a disclaimer on the original video saying, like, this is just a comedy sketch.
But then the video gets reposted across platforms by multiple different types of meme accounts with that disclaimer taken away.
Right.
You know, right.
So even if there's sort of like, and this wasn't the case for this particular video, but even if there's an initial cursory attempt to, like, include a disclaimer or say, hey, this is not real.
Like, there's sort of, like, a knowledge that that disclaimer is not going to be included in reproductions of the video.
So there's almost, like, a laundering process wherein they make the original video, put this little disclaimer on it as cover, but then other accounts, which, realistically, may be operated by them too, will put it out without that.
Yeah, that's a good way to describe it as a laundering process because they know that these videos are going to be picked up by these very high engagement meme accounts.
The one that I talk about in the piece is this account called Crazy Karens, which is on Twitter and has something like 500,000 followers.
And they know that it's going to be completely devoid of context when it's shared there.
So it's sort of just a way to be "ethically responsible," in quotes, while simultaneously ensuring the content goes as viral as possible, because you know that context is going to be stripped away.
Yeah, yeah.
Yeah, it's all white, middle class, usually older women.
So I think that there's a lot of misogyny that's specifically targeted towards women who are not deemed as having value, like a certain type of sexual value, in society anymore.
So there is that.
I also think that among people of color who share these videos, there's a very understandable feeling of...
I mean, the question then that follows is, does that then kind of, like, defuse or delegitimize things whenever somebody is actually behaving terribly?
You know, like, if you're giving such vehement criticism to this fake video, then, well, I mean, honestly, there's a certain part of me that says what that does is actually decrease the ability of anybody to actually address it.
You know, if you feel like...
Talking shit on this fake video is what you need to do in order to fight against this type of behavior, then why would you actually do something in real life?
That's really what I've learned in covering the internet.
People really don't care what the truth is.
They don't.
That's ultimately secondary to the idea that people can coalesce behind a given narrative and join in behind it.
And I'm not saying that...
That's not an argument against, like, cancel culture.
And I put cancel culture in quotes.
Like, that's not an argument against, like, the process of using the internet as a tool for holding people accountable when they behave badly.
Like, that's not what I'm saying.
I'm just saying that I think we're at this point in internet culture and in our culture in general where the truth behind a given story or, like, the context is just completely immaterial.
Like, I really do feel that way.
So, no, I don't think anybody particularly cares that these videos are fake.
Honestly.
And I don't think anybody will care.
I think that probably what's going to happen is we're just going to see more and more and more of them.
And the boundaries between what is legitimate and what is staged for clicks and for outrage are just going to completely collapse.
I think that's very much the direction we're moving in.
I can't even think of one specific example because it happens so often.
I regularly see at least five things on the internet every day that are fake and are easy to prove that are fake and generate a shit ton of outrage and, you know, in some cases generate entire news cycles, you know?
And the fact that I can't think of a specific one...
Well, okay, I guess one example that's very recent is the rumor that people...
Yeah, so there were like two days where Ebola was trending on X, formerly Twitter, which is a phrase I have to, unfortunately, write all the time now in my stories.
Ebola was trending on X for quite some time because there was this rumor that seemed to have been completely just crafted out of thin air that somebody at Burning Man had Ebola.
I don't think it was covered by any legitimate mainstream media outlets because they have to follow a certain fact-checking protocol and basic reporting principles.
But it was picked up enough that it was just kind of regarded as fact within a few hours.
And this happens all the time.
That's just one minor example of something that I just see happening all the time.
And it's not a particularly new argument I'm making, I don't think, that people don't care about the truth anymore.
Or people just fashion, you know, create their own narratives, create their own agendas, so to speak.
But it's just, I just see it happening all the time.
Well, I mean, part of why I'm so fascinated is with our job, you know, we've looked at Alex Jones over his past 20 years, and that concept of, like, people don't give a shit about what is true or what isn't true seems like it, I mean, it didn't originate with him, but it seems like he was a proto-form of it on the internet.
At the very beginning of people not giving a fuck about reality.
There was Alex.
And so what I'm looking at whenever I see all of these different videos is the idea that it's not a grift.
Well, it's not crazy because it's not like they're not making money off of it.
You know, they're just monetizing.
And honestly, I mean, on TikTok, which is a famously difficult platform to monetize content on, you know, you have to think that they're not making that much money off of it.
But, yeah, I mean, they are making money off of it, so it's not like they're not profiting off it.
It's just a more indirect type of profiting than the type of profiting you were talking about earlier.
But, I mean, the reason that I bring that up isn't because I want them to do it either direction or anything like that.
It's that, with Alex, that extra step, you know, that extra step of not being just the person who tells you the truth.
He's also the person who sells you the shit you need to survive.
That kind of thing.
That's kind of an extra step of evil.
That's an intent.
That's an intent to take an audience and then steal their money, basically, by selling them garbage.
And with the fake Karens, I'm fascinated because if they're not trying to then convert their audience into something else, does that mean that they're not doing it yet?
Or does that mean that this is a different type of grift?
Yeah, so I don't really want to be, I don't really want to say, you know, accuse them of grifting in any way, because they're not.
I mean, it's not a grift in the technical sense, but it's just a different type of attention grab.
It's just a different type of, I mean, it's all the same, really, in the attention economy.
There's not that much of a distinction between what Alex Jones is doing.
I mean, he's got a more overtly political agenda, but he's also just going on air and saying things for attention that a lot of the time he knows aren't true.
But that's, I mean, again, that's what I'm kind of driving towards is this idea.
And we see it all too often with, you know, like with your wellness scams or your other types of grift is that they have suddenly started to turn towards the right wing.
They've all suddenly started turning political.
You know, we've covered natural healing people who over time, after the COVID vaccine, because they're anti-vaccine, turned into far-right...
Yeah.
And, you know, there is a question of whether or not you...
And also I used to cover extremism and the far right far more, although I still do to some extent, like far more than I do now.
And there is always this question of like, well, are we adding more oxygen to something or platforming something that really shouldn't be...
I think it's a fine line.
But I think at a certain point, and I'm starting to...
It's really a case-by-case thing with each story that I think about covering, but I'm finding myself coming to this conclusion for more and more of them.
At a certain point, something gets so big that you can't ignore it.
At a certain point, the horse is out of the barn.
I think...
I think this is actually a pretty good example of that because they were getting millions and millions of views.
And they were showing up on my For You page and I'm not a person who actively seeks this kind of content.
A, because I think just having increased media literacy in general is helpful, even if nobody is actually interested in having increased media literacy.
And B, to some extent, I mean, less in this specific case, because this isn't as overtly harmful as some of the other things that I've covered that TikTok has promoted.
But it puts pressure on the platforms, which play a huge role in spreading this type of content.
And ensuring that it gets as many eyeballs as possible because it increases engagement and increases time on the platform.
It's only to their benefit that these platforms implicitly promote misinformation.
So I think this type of coverage is important because it ultimately puts pressure on them and holds them more accountable than they otherwise would be to doing that.
TikTok, I think, is a particularly powerful social platform in that regard.
Because even though it's supposedly uniquely tailored to an individual's interests, which it is to some degree.
Like, I mean, I get tons of TikToks about things I like: musical theater, Disney, Harry Styles content, animal content.
I get tons of stuff like that.
But it also is unique in that it has the power to decide what does and doesn't go viral a lot of the time.
And I'm not saying that there's somebody behind a computer deciding, okay, we're going to make this fake Karen video go viral today.
But it sees that something is about to get a lot of engagement.
It sees that something is getting a lot of engagement.
It sees that something is getting a lot of eyeballs on it.
And it will boost that because it is to their benefit to keep people on the platform.
There's a lot of discourse about, you know, TikTok and China, and, you know, that's not my beat, that's not what I report on.
I'm saying that's why people are scared of TikTok.
And I'm saying, no, you should be scared of TikTok because it is an incredibly powerful source of information that actively prioritizes information that is not correct.
So then, I mean, that's why I find it difficult to talk about it as though it is an "it," you know, as though it is an unfeeling corporate engine that can't be held responsible.
You know, like, there are people behind this who are doing this on purpose.
The more I started writing stuff that I was super proud of, and the more confident I felt about my abilities and my career, the more harassment I got.
It was kind of astonishing.
Basically, the moment I started working for Rolling Stone is the moment...
Well, that's actually not true.
About a year into working for Rolling Stone, when I really thought that I had hit my stride and I figured out what I wanted to write about and thought that I was doing pretty good work, that was when the harassment started increasing.
And I bet a lot of women who go on your podcast will probably say the same thing.
The prouder you are of your work, the more you want to put it out there, the more you want people to see it, and then more people see it, and then pieces of shit see it.
I mean, it's literally happening to me in my mentions right now.
Like, I won't get into it, but, like, there's literally, like, somebody called me an ugly cunt, like, an hour ago.
I don't know if you can put that on your podcast, but somebody, like, an hour ago, and I was just like, oh, somebody tweeted me who, like, shouldn't have.
And the more it happens, the more you get used to it, which has helped me just on a personal level and a pragmatic level in doing my job.
But is not really a great reflection on the state of discourse these days.
That people culturally have permission to hate Jews and trans people and Black people and women more than they have had, certainly, within my lifetime.
If you're saying that you didn't know that Dave Portnoy was a popular pizza critic, then I guess the context needs to be explained.
Yeah, he's got this series.
No, people know.
He's really viral.
He's really popular.
It kind of goes back to your point earlier about something's happening on one sector of the internet that's huge and the other sector of the internet has no idea about it.
He does these pizza reviews.
He just goes to various pizza places on the East Coast, takes a bite of pizza, gives it a highly specific grade.
And it's pretty harmless.
I actually enjoy his videos, the pizza videos.
I think they're good.
I agree with his opinions.
I grew up in New York, so I care a lot about pizza.
And even though I'm not Dave Portnoy's target audience, like, I think he's a pretty good pizza critic.
Like, you know, everything he says about pizza, I'm like, yeah, I agree with you.
Even if I disagree with him about many other things, I do enjoy his pizza reviews.
So those have been pretty popular for a really long time.
And he recently went into the pizzeria of some guy, who I interviewed after this happened.
And this guy was like, no, I do not want this guy.
Because apparently there had been some sort of rumors circulating in the pizza community that a Dave Portnoy review had a lot of power over the success of an individual business.
And this guy was like, no, I don't want to be a part of this.
He feels strongly about his product, and he does not want this guy who he perceives as having too much power and influence over the pizza community in general.
He doesn't want to play the game, which I understand.
And the flip side of that, which I also understand, is that, you know, what Portnoy does is no different than food criticism in general, and this guy should be, like, open to that and should be understanding of that.
But, like, regardless, he came out and he told him to go fuck himself, and they got in a big fight, and Portnoy, like, blasted it on his social channels, and he talked about it on Tucker Carlson, and it became...
It unleashed this, like, wave of harassment against this guy.
Again, I'm not a psychologist, so I can't speak to what goes on in the brain when people just want to be fucking assholes on the internet.
I just can't.
And I want to be clear that I have a very complicated opinion about accountability culture.
I am not going to sit here and make the argument that someone who verbally abuses a white person for no apparent reason doesn't deserve to be held accountable for it.
I think that there are a lot of cases, especially if somebody is being racist or abusive or harmful in some way, where it is absolutely acceptable to film their behavior and to call them out on it.
They should face consequences.
But that's with an asterisk.
I see a lot of the times, and I'm seeing this more and more on TikTok in particular lately, videos of private citizens being filmed who are clearly mentally ill or in distress, or a lot of times are neurodivergent and not picking up on social cues.
And I just want to be like, if you just took a second to think about who this person is and what the context is, then you would come to that conclusion.
Or, like, you would come to the conclusion that it might be possible that they're vulnerable in this way.
So why attack them?
You know, like, why use your platform to attack somebody for clout if they are potentially, you know, really, really struggling and deeply sick?
You know?
And that's not a question people are asking themselves right now.
It's just not.
I see it every day.
Like, it's just not a question people are asking themselves.
I can name ten examples off the top of my head.
The one example I'm thinking of, I guess I'll just talk about it, even though you didn't ask.
The one example that I'm thinking of is this video that is going viral.
It's this woman, right?
It's actually a series of videos.
And she has been filming this guy in her neighborhood who looks really disheveled.
His legs are black for some reason.
He, like, clearly hasn't showered in weeks.
He walks around her neighborhood singing to himself, singing gibberish, basically, you know, being disruptive in the neighborhood, but he is clearly not well.
This guy is, like, clearly not well.
And instead of, you know, trying to see where this guy's family is, if he has any family, you know, getting a wellness check on him, trying to see if he's okay, if he could use any support.
This woman's reaction was to create an entire TikTok account around her crazy neighbor.
And she's got millions of followers now.
And people are, like, lip-syncing the songs that he sings to himself.
Like, they're remixing it.
And this happens all the time.
I cannot even tell you.
How commonplace this is.
And again, because it's TikTok, I am getting these videos even though I have no interest in them.
You know?
That's because they are just getting so much engagement.
And I really wish that somebody would write about it.
I don't know. I struggle with whether or not I should.
Now, so to me, what that indicates is that the algorithm that people have designed is specifically angling towards creating a place where people who would say that's psychopathic never see that shit.
Well, no, I would actually argue that it was a cordoned-off area of the internet, but that cordoned-off area has gotten bigger and bigger and bigger because of platforms like TikTok.
Like, in the past, you know, five years ago, ten years ago, you would see the only people...
Who were sort of making bloodsport out of, you know, doxing private individuals and cruelty like that was 4chan or Kiwi Farms or places like that.
And now it's sort of...
Not only are there more forums for, like, right-wing assholes to do that, and not only are there more right-wing assholes doing that, but there's also sort of been this...
Watered-down effect, I guess you could call it, where we're not just talking about doxxing.
It's not just right-wing assholes doxxing women and trans folks because they posted a photo on the internet.
It's left-wing people, right-wing people; it's totally politically unbiased.
And it's like they are creating a sort of social justification for it.
There's another example.
This is one of the cruelest things I've ever seen.
This woman, Sabrina Prater, was a trans woman on TikTok a couple years ago who would record videos of herself dancing in her house.
And her house was kind of ill-kept on the surface, and it looked a little...
She was doing some construction on it, I think.
So people were taking these videos and not only memeing them and remixing them, but also accusing her of being a serial killer and calling CPS on her because she had kids.
You know, calling the police to check her basement and doing these, like, true crime-esque videos, like, dissecting her content.
And that's really, like, when I say a watered-down effect, that's in part because that's, like, the Kiwi Farms and 4chan shit sort of trickling down to general society.
We need to educate people in media literacy and how to use the internet responsibly.
And I think that a lot of...
I mean, my kids aren't old enough.
But if I had to guess, I would say that a lot of media literacy education targeted at Gen Z has not been focused on how to consume information responsibly, or focused on misinformation at all.
I think it's probably been, like, don't send nudes, you know?
Like, I think that's the majority of, like, internet education for teenagers.