| Speaker | Time | Text |
|---|---|---|
| The narrative in the New York Times did not match reality. That brand is so strong, and so many people really put a lot of faith and store in what they have to say. | ||
| But when you go back and look forensically at their coverage on the most important topics of the last century, you uncover some shocking stuff, which is what I found in the book. | ||
| Their Berlin Bureau chief in the lead-up to World War II was a Nazi. | ||
| Cuba, where they resurrected Fidel Castro from irrelevance and made him an international hero. | ||
| So Google and Wikipedia have this symbiosis. | ||
| What it becomes is kind of a knowledge cartel. | ||
| You can't break it. | ||
| It's a monopoly. | ||
| It's Google's monopoly. | ||
| It's Wikipedia's monopoly, which is a monopoly on information online. | ||
| Why should anyone care about Wikipedia? | ||
| They changed the mission of Wikipedia from building an online encyclopedia that's reliable to building a social movement, social justice movement powered by DEI, those are their terms. | ||
| They wanted to create what they call global knowledge infrastructure in 2016-17. | ||
| This is right after Hillary Clinton declares a fake news epidemic in front of Congress. | ||
| So you're getting the information on ChatGPT as a reference. | ||
| It'll just pull from an article and tell you this is what Wikipedia says about it. | ||
| But it's also because it's trained on the data. | ||
| The AI is getting shaped by Wikipedia's worldview. | ||
| The bots, like you said, are undetectable. | ||
| The impact is absolutely massive. | ||
| Someone's not paying for this for no reason. | ||
| They're doing it because they're advancing an unseen agenda. | ||
| And we don't know what that is. | ||
| You say "they" when you're talking about Wikipedia. Who do you mean by "they"? All right, I'm Dave Rubin, and I'm in the lower chair today, which means I have a guest live in studio. Joining me is the chief investigative officer of NPOV, my friend Ashley Rinsberg. | ||
| Ashley, nice to see you again. | ||
| Thank you, Dave. | ||
| Have we done this in person? | ||
| We've done this digitally many times. | ||
| Digitally. | ||
| Yes. | ||
| First time in person. | ||
| That's right. | ||
| Once with Liz Wheeler and once with Josh Hammer, but it was all remote. | ||
| And now you get to explain yourself for 50 minutes, in person, to the Inquisition. | ||
| For people that have never seen you on the show or that don't know your work, and I mentioned you on my show earlier this morning, you are sort of like an internet sleuth, I think. | ||
| You kind of figure out where these trends are coming from on the internet. | ||
| You're doing a lot of work on what's going on with Wikipedia these days. | ||
| There's a lot of craziness happening on Reddit. | ||
| It's sort of these underbelly internet sites. | ||
| But sort of what put you on, at least my map about two years ago or so, was you wrote a book about the New York Times. | ||
| And it will not surprise anyone watching this that there's been a lot of, let's say, terrible stuff going on at the New York Times for quite some time. | ||
| So maybe if you give me just a minute or two recap of that and what got you involved into all this, and then we'll dive into the horrible parts of the internet. | ||
| Yeah, the New York Times. | ||
| Or anything else you want to talk about. | ||
| How is that for an intro? | ||
| That's great. | ||
| I can sprawl all across the propaganda world because it's everywhere. | ||
| And the New York Times is sort of the Wellspring because that brand is so strong and so many people really put a lot of faith and store in what they have to say. | ||
| But when you go back and look forensically at their coverage on the most important topics of the last century, you uncover some shocking stuff, which is what I found in the book. | ||
| Their Berlin bureau chief in the lead-up to World War II was a Nazi, like a Nazi. | ||
| They covered up the Ukraine famine. | ||
| This was the famous, or infamous, Walter Duranty, which is well known. | ||
| But what's less known is that so much of that came from the New York Times management. | ||
| They wanted to, like, shuffle and shift the blame to Duranty as a rogue reporter. | ||
| That wasn't true. | ||
| It was the New York Times institution itself that put it out. | ||
| Right, so it wasn't just Duranty who went there, made up some nonsense, sort of glossing over what was really going on; actually, they kind of put him there. | ||
| They put him there and they directed him to tell that lie, that there was no famine in the Ukraine caused by Stalin. | ||
| And they did that because they wanted to open up, they were part of this big consortium of New York money and business, and they wanted to open up that market to American business. | ||
| And you couldn't convince the American public to have the president recognize the new Soviet government, which he hadn't yet done, if that government had just killed five or ten million of its own people for no reason. | ||
| So they just kind of smoothed over that narrative. | ||
| Cuba, where they resurrected Fidel Castro from irrelevance and made him an international hero, a democratic savior of the Cuban people. | ||
| A total lie. | ||
| On and on and on: the Iraq War, and the 1619 Project, which distorted American history, which, as a point of view, is fine if that's what you think. | ||
| The problem is they lied. | ||
| Right. | ||
| They put facts into the world that were just not true and they knew it. | ||
| And this is kind of the through line: they will do whatever it takes to put that narrative out there and also to stay number one as a news publication. | ||
| Like they're ruthless. | ||
| It's a family. | ||
| It's a dynasty. | ||
| The publisher and chairman of the New York Times Company today is named Arthur Sulzberger. | ||
| His father was Arthur Sulzberger. | ||
| His father was Arthur Sulzberger and his father was Arthur Sulzberger. | ||
| This is an American dynasty in the most classic sense, like the Vanderbilts were or the Rockefellers, but this is one that's still in power today. | ||
| Yeah, as you probably see over my right shoulder over there is a front page New York Times Sunday from about six, seven years ago, where I made it to the front page of the New York Times, which is very exciting. | ||
| That's impressive. | ||
| unidentified | Head of the alt-right, basically, is what it says right there. | |
| Not a great picture of me. | ||
| I wish they would have used a better picture. | ||
| But actually, that's from my interview with John Stossel, who we were just talking about right before we started filming. | ||
| But when did you wake up to it? | ||
| unidentified | Because I, you know, as someone from New York, from kind of, I would say, moderate liberal circles, the New York Times, in some sense, was kind of like the Bible. | |
| Spending a lot of formative years on the Upper West, it was like you had the New York Times with you. | ||
| That's what you were doing. | ||
| And I think even now, even despite so much of what the last couple of years of just pure propaganda from them and COVID stuff and Russia, Russia, Russia, and Hunter Biden laptop, all of those things, I think a certain set of people, they cannot give it up. | ||
| They're almost addicted to the lie because it might uncover something about them that they don't want to see. | ||
| I mean, that's, I guess, the psychological version of it. | ||
| Yeah. | ||
| For me, I was a New York Times reader in college. | ||
| I was in upstate New York at college. | ||
| And where'd you go? | ||
| I was at Cornell. | ||
| unidentified | Oh, okay. | |
| Binghamton. | ||
| Right down the road. | ||
| Neighbors. | ||
| Yeah. | ||
| So I liked it. | ||
| I was a believer. | ||
| And I think after that, not long after college, I went to Israel, just kind of like just traveling around type thing. | ||
| And I could just see in front of my own eyes the narrative in the New York Times did not match reality, which you're kind of like, fine, that's not that big a deal. | ||
| But while I was in Israel, I picked up this book by William Shirer. | ||
| He was a journalist during World War II in Europe, and he wrote this book about what was going on, The Rise and Fall of the Third Reich. | ||
| And he had this footnote in one of the chapters saying, when hostilities broke out in World War II, the New York Times lead story of that day, September 1st, 1939, claimed that Poland invaded Germany. | ||
| And this was one of those wheels-screeching, record-scratch moments. | ||
| And I went in, I looked at the story, and that's actually what it said. | ||
| unidentified | Wow. | |
| Turns out that was a Nazi propaganda ploy to make it seem as if Poland invaded Germany. | ||
| They took a bunch of prisoners of war from one of their camps. | ||
| They dressed them up as civilians in a German radio station, killed them, and then pretended it was Polish guerrillas who'd invaded the radio station. It was called Operation Himmler. | ||
| And the New York Times just bought it and they printed it for the world. | ||
| And that was exactly the plan by the Nazi press. | ||
| God, I mean, it sounds so similar to what they do with Hamas propaganda now. | ||
| It's a Hamas ministry. | ||
| So the health ministry, the Gaza health ministry, which is basically just Hamas. | ||
| Okay, so you wake up to some of this stuff. | ||
| You write the book exposing that. | ||
| And then how did you get into the online world of this? | ||
| Because now there's a whole other layer. | ||
| That was sort of the mainstream layer, the mainstream outlet, for a while. | ||
| And now there's something else going on online that I think is probably way more pernicious and perverse in a sense because of the amount of people that can be involved as opposed to a family maybe that's putting out propaganda. | ||
| Yeah. | ||
| And online, you really can't see it. | ||
| Like in the media, you can see what's happening. | ||
| You can see how it's manifested. | ||
| But when it goes to Wikipedia, it goes to Reddit. | ||
| Sometimes you can't even see it because you're only getting the output on the back end of it, on ChatGPT or on Google, or places where you don't know what the source is and you don't know what the source of the source is. | ||
| So with Wikipedia, I was just interested in the topic. | ||
| A friend of mine, his name is David Rozado, did some interesting research. | ||
| He's a machine learning researcher in New Zealand. | ||
| He just did this study looking at bias on Wikipedia, political bias, left and right. | ||
| He looked at how they talk about American senators, members of the House of Representatives, journalists, and Supreme Court members, and the bias is not just consistent, but it's also flagrant. | ||
| When you talk about a figure on the right, the sentiment is really low. | ||
| It's bad. | ||
| When you talk about a figure on the left, the sentiment is really good. | ||
| And this is across the board. | ||
| unidentified | Right. | |
| This is sort of how nobody is ever described as far left in the New York Times, or even on Wikipedia. | ||
| They never say that, but everyone is always far right. | ||
| If you're far left, they just leave that out. | ||
| You're just neutral. | ||
| You're neutral. | ||
| Right. | ||
| Exactly. | ||
| So I was like, okay, I think there's a good story here. | ||
| And I started to dig into the first story. | ||
| I spent about a week just digging. | ||
| And what I started to see was an ideological capture of Wikipedia, which had taken place in 2017. | ||
| And they were very explicit about it. | ||
| They're very overt about it. | ||
| They called it the movement strategy. | ||
| They changed the mission of Wikipedia from building an online encyclopedia that's reliable to building a social movement, social justice movement powered by DEI. | ||
| Those are their terms. | ||
| They wanted to create what they called global knowledge infrastructure in 2016-17. | ||
| This is right after Hillary Clinton declares a fake news epidemic in front of Congress. | ||
| Because for her, she just lost the election. | ||
| There's no good explanation aside from the fact that voters didn't know what they were talking about because they were being fed Russian lies. | ||
| So we have this vacuum open up. | ||
| And Wikipedia understands they can fill the gap. | ||
| When you say they, when you're talking about Wikipedia, who do you mean by they? | ||
| The Wikimedia Foundation, which owns Wikipedia and oversees and operates the site. | ||
| Catherine Maher, we've seen these clips of her on Steve. | ||
| She went on to become the CEO of NPR. | ||
| She was the CEO of Wikimedia Foundation at the time. | ||
| So she had this moment where she was head of comms at Wikimedia Foundation. | ||
| They had a big scandal, which is that they tried to create a Google-killing search engine. | ||
| Wikipedia wanted to have a search engine of its own. | ||
| This caused a big ruckus. | ||
| The executive director had to leave. | ||
| Catherine Maher steps in. | ||
| The first thing she does is orders a PR audit, like a media audit of Wikipedia to be run by their PR agency. | ||
| The agency is owned and run by a guy named Craig Minassian. | ||
| At that same time, Minassian was CMO of the Clinton Global Initiative. | ||
| He was a direct employee of the Clintons, and he was reshaping Wikipedia for the next 10 years. | ||
| And he produces this audit. | ||
| He guides them how to take it forward, how to shift the mission. | ||
| And the person they hired to implement that was another Hillary Clinton aide. | ||
| Gold has officially gone mainstream. | ||
| You're seeing it everywhere, and prices just keep climbing. | ||
| Every day, more and more Americans are discovering the power of gold. | ||
| But how do you actually take advantage of it? | ||
| How do you invest in a way that's safe, simple, and built for the long run? | ||
| That's where Noble Gold Investments comes in. | ||
| Their specialists are trained to help investors like you go from confused to confident, from wondering what's next to knowing exactly what to do. | ||
| Whether you're looking to roll over an old 401k into a gold IRA or you want physical coins and bars delivered right to your home, Noble Gold makes the process simple, secure, and stress-free. | ||
| You'll get step-by-step guidance, transparent pricing, and U.S.-based service from a trusted team that's handled over $2.5 billion in precious metal transactions. | ||
| With Noble Gold Investments, protecting and growing your wealth isn't just their job, it's their commitment. | ||
| Visit DaveRubinGold.com today to claim your free wealth protection kit. | ||
| That's DaveRubinGold.com. | ||
| Noble Gold, helping you build a stronger future. | ||
| So what is going on here? | ||
| Is this coordinated with the DNC, the Democrats, the Hillary machine, the Obama people? | ||
| Is this a bottom-up thing? | ||
| How much do they even like those guys? | ||
| Because I think a lot of it is sort of more connected to like the Antifa type. | ||
| I mean, how do you figure out where this is all coming from? | ||
| So this is where things get complicated as you're picking up on, because there's the community on Wikipedia, which does all the editing, and then there's the Wikimedia Foundation, which runs things. | ||
| And they're supposedly air-gapped, but they're not. | ||
| There is a dynamic between them. | ||
| Wikimedia Foundation can make any edit it wants. | ||
| It can ban any person it wants. | ||
| It doesn't have to say why. | ||
| It doesn't have to tell you what happened. | ||
| It can just make that decision and it does that. | ||
| So they exert a tremendous amount of control. | ||
| I think what happened with Catherine Maher at the time was that she was a true believer. | ||
| She's a Soros acolyte. | ||
| In her very first job, she ran this Lebanon election monitoring thing when she was in her 20s. | ||
| And she talks about her profound belief in George Soros' open society philosophy, political philosophy. | ||
| She really, really believes this. | ||
| Hillary Clinton is, in Soros' own words, one of the greatest evangelizers for Soros' ideology. | ||
| And I think this is just what they truly believe. | ||
| And there's a quote, I'm going to butcher it slightly, but there was a quote we played on the show a couple of times from Maher, to the effect that truth can't get in the way of making a better society, which sounds as dangerous as it is. | ||
| She calls it a distraction. | ||
| She calls the truth a distraction on Wikipedia. | ||
| Yeah, that was literally it. | ||
| unidentified | Yes. | |
| And she also called Wikipedia, while she was running it, a white male, a white Western male construction or construct or something like that, because this is actually how she thinks. | ||
| She thinks that truth is something that we don't find, but we make up. | ||
| It's a product of like power dynamics. | ||
| It's a very Marxist ideal. | ||
| And Soros is very adjacent to that kind of thinking. | ||
| It's always about breaking down power, breaking down hierarchies, removing borders, removing religious tradition so we can have a flat, open global society. | ||
| This is actually, it sounds like what I'm saying is a conspiracy theory, but it's not. | ||
| When you study Soros, this is what he wants. | ||
| He's not shy about it. | ||
| And Maher wanted the same thing, and so did Clinton. | ||
| Okay, so let's pause for a second and back up because why should anyone care about what's going on on Wikipedia? | ||
| There's a million websites out there. | ||
| We've got Grok, we've got AI, you've got ChatGPT. | ||
| Why should anyone care about Wikipedia? | ||
| When the Wikimedia Foundation tried to shift Wikipedia's role to global knowledge infrastructure, they succeeded. | ||
| So today, when you look at Google and you do a search, I did, for example, I tried this out. | ||
| Free speech is what I typed in. | ||
| The entire page was Wikipedia, not just the first result, which is always the case with Wikipedia. | ||
| You're always going to get that first result dominated by Wikipedia. | ||
| But we also have a knowledge panel on the side of Google. | ||
| That's Wikipedia. | ||
| We have an AI overview. | ||
| That's Wikipedia. | ||
| We have other summaries down at the bottom of the page on the Google search. | ||
| That's Wikipedia. | ||
| And that's just Google. | ||
| So Google and Wikipedia have this symbiosis. | ||
| So what is that connection? | ||
| Are they partners? | ||
| Is that a financial partnership? | ||
| They are. | ||
| They are partnered. | ||
| Nobody really knows how it works. | ||
| They've never disclosed it. | ||
| What we know is that there's a deep relationship. | ||
| And at the same time this Wikimedia movement strategy took place, they created at Wikimedia an endowment. | ||
| They wanted to raise $100 million in six years. | ||
| They put that endowment inside of the Tides Foundation, which is a far, far-left billion-dollar foundation. | ||
| Why did they do that? | ||
| It's not clear because Wikimedia is a 501c3. | ||
| They could have raised the money themselves. | ||
| But they put it in Tides, which is a donor-advised fund, meaning donors can give without it being known where the money ends up. | ||
| At that same time, Google gave Tides $200 million over the course of three years. | ||
| This is vastly more money than Google has ever given to anyone in its entire history. | ||
| And it lines up. | ||
| When we go back and remember that I said Wikipedia was trying to create a Google killer search engine at that same time, the project gets scrapped. | ||
| Wikimedia creates an endowment, puts it in Tides. | ||
| Google gives Tides $200 million. | ||
| And the endowment, which was seeking to raise $100 million in six years, completes that goal in three years. | ||
| So you put the two and two together. | ||
| Google's just paid a lot of money to Wikimedia Foundation, and they've supported it over the years. | ||
| The other thing to remember is that Google in the 2010s, when it was still growing, wasn't the company it is today. | ||
| Today, if Google wanted to just create as much content as it pleased, it could do that. | ||
| It's got the resources. | ||
| It didn't have the resources back then. | ||
| Back then, having Wikipedia fill out its top results meant that Google could get verified content, millions and millions of pages of great, high-quality content, for free. | ||
| In return, Wikipedia gets the most valuable real estate on the entire internet for free. | ||
| So this is kind of this like, again, like a mutualism, a symbiosis. | ||
| They both benefit from it. | ||
| But in effect, what it becomes is kind of a knowledge cartel. | ||
| You can't break it. | ||
| It's a monopoly. | ||
| It's Google's monopoly. | ||
| It's Wikipedia's monopoly, which is a monopoly on information online, locked together. | ||
| Are they still convincing? | ||
| Well, I guess the answer to this is yes. | ||
| I normally don't ask a question, then answer at the same time. | ||
| But are they still convincing a lot of people that they are unbiased? | ||
| Like, I never go on Wikipedia anymore. | ||
| My page was hijacked a long time ago, Reddit, all of these things. | ||
| Like, to me, they've all been just taken over by the activist class and just all of the bad actors. | ||
| And I just view that as a portion of the internet that exists that's just a cesspool of horror. | ||
| unidentified | Obviously, not everyone necessarily is as attuned to that, but I mean, how much effect are they having? | |
| Is there a way to quantify it? | ||
| It's massive, because Reddit and Wikipedia together, both of them, but particularly Wikipedia, train every frontier AI out there, disproportionately. | ||
| So they weight it more heavily in the training than any other website. | ||
| It gets sort of shown to the model more times than a normal website would because it's considered credible and verified and all that kind of thing. | ||
| So you're getting the information on ChatGPT as a reference. | ||
| It'll just pull from an article and tell you this is what Wikipedia says about it. | ||
| But it's also, because it's trained on the data, the AI is getting shaped by Wikipedia's worldview. | ||
| The worldview of AI reflects Wikipedia's worldview disproportionately. | ||
| And you don't know that as a user of ChatGPT or Anthropic's Claude or Gemini or anything else, or even Siri and Alexa when you ask a question. | ||
| That's the same model. | ||
| So whether or not someone thinks it's unbiased, and I think most people believe that it's unbiased, most people still believe that founding myth, which is that it's neutral and crowdsourced. | ||
| It's the people. | ||
| It's the people. | ||
| It's got to be fair. | ||
| And that in itself was also a myth. | ||
| That was never true. | ||
| Because if you go on Wikipedia today and be like, actually, I don't think this is true. | ||
| I don't think this particular left-wing point of view, a Marxist point of view, on an important topic is accurate. | ||
| And you say, I'm going to insert some counterbalance here. | ||
| Your edit will never stick. | ||
| There is no chance unless you are a dedicated, full-time, ideologically driven editor, and most people on the right are not part of that mechanism. | ||
| It's not going to work. | ||
| How do you become one of those people? | ||
| Like, someone's watching this going, well, I should just devote more time to counterbalancing some of this nonsense. | ||
| How do I get involved? | ||
| How do you do it? | ||
| You would just need to get in there and start experimenting in good faith, thinking this is what I actually believe, but you're going to have to put in the time. | ||
| You're going to need, I would say, 50 to 100 hours before you're really getting conversant with the maneuvering, the backbiting, the backstabbing, the rules-lawyering, all this stuff that goes on behind the scenes, which we don't see. | ||
| And this is part of the problem. | ||
| We see the surface. | ||
| It's not even like an iceberg, because with an iceberg at least you see like 20%. | ||
| You're seeing 1%. | ||
| You're seeing this kind of glare of Wikipedia. | ||
| And beneath it, what is actually going on is something that looks a lot more like social media. | ||
| Just the fighting, the clout chasing, all this stuff that you would see on X is going on beneath the surface of Wikipedia. | ||
| unidentified | Right. | |
| So this is the editors fighting with the other editors and who can get more posts and who can get other people's posts deleted and downranked. | ||
| Bans, people getting banned off the site, people maneuvering to get someone else banned because they don't like them. | ||
| Edit gangs. | ||
| Like I did this very big investigation into these pro-Hamas editors, this gang of 40, as I call them, three dozen plus. | ||
| And these guys are relentless. | ||
| Some of them are clearly aligned with Hamas. | ||
| Some of them are openly supportive of Hezbollah. | ||
| They have a little user box saying this user supports Hezbollah. | ||
| They are removing mentions of Hamas terror attacks from Wikipedia en masse, not just one or two, dozens. | ||
| They're doing the same for Iranian human rights abuses, removing them, gone. | ||
| They're doing the same for Hezbollah. | ||
| They are severing ties between Israel and the Jewish people. | ||
| This is coordinated. | ||
| It's at a mass level. | ||
| They together did a million edits across 10,000 articles just in the Israel-Palestine space. | ||
| How do you compete with that? | ||
| All right. | ||
| If you follow politics, you know everyone's got an opinion. | ||
| Your group chats, the pundits, and yes, even me. | ||
| But on Polymarket, you get something better. | ||
| Well, maybe not to me, but it's still pretty good. | ||
| Real odds on what's actually likely to happen. | ||
| Polymarket is a prediction market where people trade on real events, elections, debates, policy moves, and more. | ||
| And it doesn't stop at politics. | ||
| You'll find markets on the economy, tech, sports, pop culture, and even random internet moments that blow up for a week. | ||
| It's live, transparent, and gives you a legit vibe check on what people really think. | ||
| You've definitely heard about Polymarket before, so take a second and check it out. | ||
| Head on over to polymarket.com right now. | ||
| So how do you figure out the other parts of this related to state actors and that sort of thing? | ||
| Because some of that sounds like it would probably take some state money or outside actors to be pushing this. | ||
| It's a very hard thing to do because it's all anonymous. | ||
| We can't trace them. | ||
| We don't know who they are. | ||
| You can only look at the patterns and say this is clearly coordinated. | ||
| Statistically, this is coordinated. | ||
| And then tell the world that this is happening. | ||
| The same goes with China. | ||
| I mean, there are at least 300 dedicated Chinese editors, and these are activists who say they are pro-CCP. | ||
| And they have their own little user box saying, this user supports the CCP. | ||
| And they're working on Wikipedia around the clock on the most important topics to influence the platform in their direction. | ||
| Wikipedia presumably knows this. | ||
| If they do, they're not doing anything about it. | ||
| But we on our side can track it because the information is all available to us. | ||
| This is the one advantage we have. | ||
| It's all out there. | ||
| So if we say, okay, let's look at the pattern. | ||
| Let's look at what these guys are affecting. | ||
| Let's look at how they're changing articles about Xi Jinping, about COVID lab leak, about Taiwan, about tariffs, about President Trump. | ||
| And you can just start to tell the story. | ||
| And at least then people know what's happening. | ||
| What do you want to happen by exposing, like when you're doing these exposés, when you're doing the research on this, like what is the end game there? | ||
| I mean, it doesn't sound like this thing's going anywhere anytime soon. | ||
| The end game is for people to understand that this is not cute little Wikipedia of 2010, 2005. | ||
| This is a platform that has mass influence and that is a back door into our information ecosystem. | ||
| It is the most hackable system out there for propagandists, for ideological warriors. | ||
| So number one, awareness. | ||
| That's the most important thing. | ||
| But number two, like you're saying, it would be great to see more people on the other side of the issue involved, out there editing, striving, learning in good faith, not to be propagandists for the other side, but to really do something they believe. | ||
| But do you think it's too far gone? | ||
| I mean, that there's been enough damage. | ||
| It sounds like that to extricate yourself out of that seems a little upstream. | ||
| I think you're probably right about that. | ||
| But I think the proof that they're too far gone is that you got to a point where someone like Elon Musk, who was a supporter of Wikipedia, said enough and did the thing that nobody else can do, which is just launch his own version of the thing. | ||
| And now you have a market of competition. | ||
| This is what Wikipedia never had, which is direct competition. | ||
| They had a full-blown monopoly for 20 years that nobody noticed. | ||
| Nobody even talked about this. | ||
| But now we have something, an alternative, something that you can use if you don't like what you're getting from Wikipedia. | ||
| Over time, Grokipedia will start to feed other AIs. | ||
| It'll start to feed Google. | ||
| There's another site called Justapedia, started by a former senior editor, a veteran editor on Wikipedia, who is working to counterbalance some of the left-wing bias that she encountered. | ||
| And she got to the point where she threw up her hands. | ||
| She said, this can't work. | ||
| I'm just going to start my own thing. | ||
| And she actually did it. | ||
| And it's actually working. | ||
| How worried are you that even something like Grokipedia wouldn't be infected one way or another? | ||
| Or I think I just saw in the last couple of days that Elon's saying something like they're going to eliminate the current X algorithm, which I think is quite horrific. | ||
| It has its moments where it used to be terrible, then it gets better at times, then it gets worse, then he made it better. | ||
| But it's sort of impossible if you've been around long enough, you can sort of see the trends. | ||
| But how worried are you that even a guy like Elon, who I think is incredible, and in buying X he might have saved Western civilization, that even something he creates will be susceptible to the same nonsense? | ||
| I am concerned about that, to be honest with you. | ||
| But having more than one viewpoint out there is the main thing. | ||
| And if it's more than two, even better. | ||
| And if we get to the point where we have five or ten different reliable online encyclopedias or sources of information, and it's not just one making all the decisions for the rest of us, I think we're better off. | ||
| And I think that's the model we're all trending towards. | ||
| And it was Wikipedia's original model, which is to have a multitude of viewpoints represented online. | ||
| And they did that for the first few years, but then it got captured. | ||
| So I think the more, the merrier. | ||
| And I think for now, that's where we're heading directionally. | ||
| Whether that actually plays out in reality, I don't know. | ||
| So is the dystopian version of this that, depending on what AI you keep on your phone and what I keep on mine and everybody else keeps on theirs, we're just going to live in... it's hard for us to sort of see it now in 2025. | ||
| But if you throw this thing 10 years into the future, we're just going to basically live in completely different realities. | ||
| Like we'll all be on Earth for the most part. | ||
| Maybe. | ||
| Who knows if El Antenna saw it first? | ||
| But that's in essence where we will be, because it will just be reflective. | ||
| You know, it's not going to be a mirror. | ||
| It's just going to be a sort of funhouse mirror, with a different AI behind it. | ||
| I think so. | ||
| I think we're seeing that on social media. | ||
| We're seeing X as sort of right of center, Bluesky as far left of center. | ||
| We're seeing Facebook for a different demographic. | ||
| Facebook is still doing that. | ||
| It's still there. | ||
| And apparently there's still like a billion boomers using it. | ||
| That's good for something. | ||
| But I don't necessarily think that's a bad thing, because there's a great tech thinker and entrepreneur, Balaji Srinivasan, who talks about Twitter in the old style as a prison yard where all the gangs are in the same place at the same time. | ||
| And that creates chaos. | ||
| It creates a kind of violence in a way. | ||
| So at least having a place where it's like, okay, Bluesky is for the left-wing crazies and, you know, X is sometimes for the right-wing crazies. | ||
| It's, I think, a better model than what we have right now. | ||
| It's like these people just don't belong in the same space. | ||
| Right. | ||
| Except that we end up bumping into each other in the real world. | ||
| And that seems like the next version of the problem. | ||
| Yeah, though, I think for most normal people, when they do bump into other people in the real world, that's actually okay. | ||
| Like my taxi driver, I had an Uber driver today. | ||
| He's a Kamala Harris supporter, and it was a nice conversation. | ||
| In Florida? | ||
| Was he lost? | ||
| Yeah, it was surprising. | ||
| He must have been dropping somebody off from New York. | ||
| But nice guy. | ||
| I could relate to him. | ||
| I listened to his point of view. | ||
| It wasn't that bad. | ||
| It was fine. | ||
| And, you know, it was a real world thing. | ||
| Online, I would have hated the guy. | ||
| And he would have hated me. | ||
| Right. | ||
| What about the other places? | ||
| So we've mentioned Reddit a little bit, but there's a couple of other things like this. | ||
| Like, is it all sort of the same, what's going on underneath the hood with these things? | ||
| I would say in most cases, it is. | ||
| I think some are more susceptible to wide-scale, broad-scale propaganda campaigns. | ||
| I think Reddit in particular is one of those where you're just seeing this again and again, where it's not a one-off. | ||
| It's not just a bunch of people expressing a crazy idea, which also happens. | ||
| It is something that is coordinated. | ||
| It's something that is mechanized. | ||
| On Reddit, I did a piece about this very big propaganda group that is, again, pro-Hamas. | ||
| In this case, they're actually literally laundering Hamas ideology. | ||
| What they do is they take messages from Telegram. | ||
| This is from Hezbollah, Hamas, the Houthis, et cetera, et cetera, and translate them into English, put them on Reddit, and spread them from Reddit onto Discord, X, community notes, basically everywhere else. | ||
| So this is actually a terror pipeline, terror propaganda pipeline on Reddit. | ||
| Reddit knew about this. | ||
| My sources who gave me this story worked with Reddit for a year trying to get them to address this. | ||
| They banned every other community on Reddit. | ||
| The_Donald, which was by far the biggest pro-Trump community online. | ||
| I did an Ask Me Anything with them, and then they got booted shortly after. | ||
| They had no problem banning the fat-shaming communities, the anti-trans communities. Gone. | ||
| When these guys went to Reddit, and these were like left-of-center guys, liberal people, they said, okay, look, we're fine with how you operate in terms of your content moderation. | ||
| These are choices you're making, but be consistent. | ||
| Reddit ignored them for a year, to the point where they said to me, we have to do something about this. | ||
| And I do the story and Reddit denies it. | ||
| So when you have this kind of susceptibility, it usually indicates there's a problem at the management level of the company. | ||
| So what is the management level at a place like Reddit? | ||
| You know, I think I mentioned to you privately, I got doxxed over there and they refused to do anything. | ||
| I had a high-up person in the C-suite, and they were like, oh, well, submit the generic form, which I did. | ||
| And of course, nothing ever came back. | ||
| Yeah, Reddit. | ||
| This is a couple of years ago. | ||
| Reddit's got, I think, a few staffers there. | ||
| Again, these are senior trust and safety people who have a history of the same kind of ideology. | ||
| And when you think about who's making these kinds of decisions, it's usually not the CEO. | ||
| The CEO is doing CEO stuff, big deals, and which direction are we heading in the next five to 10 years. | ||
| When you look at the trust and safety people, and it sounds like this anonymous kind of boring thing, but these are the people with the power who are making those decisions on a month by month or even week by week basis about who stays, who goes, what you can say, what you can't say, why you can call someone a Zionazi on Reddit, and that's totally fine. | ||
| But if you say something like a man is a man, that's not okay. | ||
| That's where that decision gets made. | ||
| And it's not just Reddit. | ||
| It's at all the platforms that have trust and safety, including Wikipedia. | ||
| That's this kind of unique hinge, this pivot point online. | ||
| And again, this is one of those things that we saw a little bit of at Twitter/X when Elon took over and fired them all. | ||
| But aside from that, we don't really talk about trust and safety. | ||
| It's part of the weird thing around this that, you know, we got the Twitter files, but we never got the YouTube files. | ||
| We never got the Google files. | ||
| We never got the Facebook files, et cetera. | ||
| I went and I met with Elon and some of the engineers who showed me what was going on under the hood there. | ||
| The entire Twitter system was built to shadow ban. | ||
| You know, Jack Dorsey testified under oath saying we don't shadow ban, and my guess is the reason that's not perjury is that shadow ban is not a technical term. | ||
| So he was able to scoot around it. | ||
| But I mean, he, in essence, he lied under oath. | ||
| I saw the thing, and they had all these if-thens: if you communicated with this person, you won't be seen. | ||
| And if you said this within the time limit of that, and they had all these arrows and, you know, charts going. It was crazy. | ||
| You look like a crazy person, like a conspiracy theorist, you know, with the red string thing. | ||
| Yeah, yeah. | ||
| So what do we do about those other companies? | ||
| I mean, okay, so fine. | ||
| Wikipedia, Reddit. | ||
| That's a certain portion of the internet. | ||
| We got YouTube. | ||
| You mentioned Google already, but I mean, Facebook, all of these things. | ||
| We have just no idea what's going on. | ||
| We don't. | ||
| I think some of the vibe shift of 2024-25 has maybe blunted that a little bit. | ||
| You know, Zuckerberg came out saying, we're firing all the content moderation people. | ||
| We're not doing the fact check thing anymore. | ||
| News agencies are saying the same thing where it's just like, this doesn't work as a model. | ||
| We have to let people say what they have to say and we can let the market decide on the platform what is considered acceptable and what's considered unacceptable. | ||
| And people can block it, ignore it, mute it, respond to it, debate it, make a choice. | ||
| So I think it's shifted a little bit, but I do think you're right, because we don't have a full accounting of what happened up until now. | ||
| We don't have that. Nobody knows. | ||
| Well, that's the thing. | ||
| We're basically in Plato's Cave, in essence. | ||
| We're so deep in the cave, we have no idea what goes on on the outside at this point. | ||
| So even if they were to now say, all right, well, we're getting rid of the content moderation, we just don't know what the algorithm's even feeding us. | ||
| Yeah, that's right. | ||
| And when I was doing a lot of the COVID lab leak reporting back in 2021 and 2022, I noticed that things in my account started to change. | ||
| I didn't get that engagement. | ||
| This was still like, you know, Twitter era where you couldn't say that. | ||
| You couldn't say that maybe this virus came from the lab in China that makes these viruses. | ||
| And you get punished for it. | ||
| When my book came out, I hired someone to do some Facebook ads, and he said the engagement on the ads was something he had never seen in his career, and this is a professional performance marketer. | ||
| Until the moment they just banned my account. | ||
| And I could never advertise. | ||
| I still can't advertise on Facebook today. | ||
| Facebook would not let me advertise for my book tour. | ||
| It's too political, or whatever they say. | ||
| But meanwhile, plenty of people on the left. | ||
| On the left, you have no problem. | ||
| Right. | ||
| So do you want, what do you want the government to do, if anything? | ||
| I mean, if Trump was sitting here or somebody was sitting here and could help at a governmental level, what is it that they could do? | ||
| On a governmental level, I think it's investigate foreign influence because this is something that is national security. | ||
| It's not just like, oh, you can't say this kind of thing. | ||
| It's like, obviously, we want to keep the government out of that part of things. | ||
| But if there is the Chinese government infiltrating and changing things on Wikipedia to suit its own national security interests, we should at least know that. | ||
| Whether or not you want to force people to stop doing it, that's a different question, but we should at least know. | ||
| The American people should know. | ||
| On the Wikipedia level, I would say it's probably about time for the president to write an executive order to say this should not be in the federal government. | ||
| You should not be referencing Wikipedia in the federal government. | ||
| I don't just mean like looking it up for some casual information. | ||
| I mean, if the federal government is developing AI for itself, it should not be training on Wikipedia. | ||
| I think an executive order could get that done and get Wikipedia out of the federal government and find a better way to gather information for this kind of thing. | ||
| I think also the last thing is to look at the funding. | ||
| Look at the money. | ||
| Where is it coming from? | ||
| Where is it going? | ||
| Wikipedia raises a lot of money. | ||
| They have an annual revenue of around $200 million. | ||
| They fund a lot of other NGOs that you don't hear about or you don't even know about. | ||
| Most of them are on the far left. | ||
| Sometimes that's fine. | ||
| I think they should be more transparent about it. | ||
| But if they're funding stuff that is working against American interests, that's another question that we need to raise. | ||
| That's something that needs to be investigated and looked at carefully because they're a tax-exempt organization. | ||
| They don't pay tax because the American government says you are theoretically doing something good for the country. | ||
| We just need to make sure that that's true. | ||
| Where does the bot part of this fall into the game? | ||
| One of the things that I'm seeing is, and again, when you're on the internet long enough, you just see things come and go. | ||
| You see the way things happen. | ||
| So there was a time when bots first appeared and it sort of became pretty obvious quickly because it was just kind of cut and paste and same type of thing over and over. | ||
| They looked very similar. | ||
| They've gotten much more sophisticated now because of AI. | ||
| And every now and again, you see one of these Pakistani bot farms with the phones attached to all of the wires, and it looks right out of The Matrix and all this. | ||
| But I think it's having a huge impact on our political discourse because if you say something crazy, the bots like the hell out of it and retweet the hell out of it. | ||
| You get engagement and then you just start saying crazier things. | ||
| And I think this explains a lot of what's happening actually on the right right now. | ||
| I mean, it's been happening on the left for a long time, but I think it's happening there now too. | ||
| Yes, it's the bots. | ||
| And like you said, they're at the point that they're undetectable. | ||
| Even the best. There's one, I'm not going to say the name of it, but it's the sort of gold-standard bot detection company. | ||
| This is what they do and what they're known for. | ||
| Someone who's an expert in this said to me recently, that's basically at this point snake oil. | ||
| It doesn't work. | ||
| You cannot, even with the best technology, detect a bot. | ||
| You can only detect it as a human, because you're kind of attuned to it after so much experience. | ||
| It's the last thing that's human about us, our ability to detect the bots. | ||
| Yeah, detect the inhuman. | ||
| I think the impact is absolutely massive. | ||
| I think about so many of the comments, accounts that look real. Sometimes if I post something about anything related to Israel or push back on some anti-Semitism, I just get this slew of responses saying, Jew, Jew, Jew, Jew. | ||
| Those are bots. | ||
| Those are not all bots. I mean, some of them are real. | ||
| But mostly those are bots. | ||
| It sways the discourse. | ||
| It contorts information online. | ||
| And again, someone's not paying for this for no reason. | ||
| Someone's not just saying, I want to employ all this server power, all this electricity, just for shits and giggles, excuse my French. | ||
| They're doing it because they're advancing an unseen agenda. | ||
| And we don't know what that is. | ||
| It seems like it's to topple Western civilization, if I were to throw a dart out there. | ||
| Yeah, generally so. | ||
| I mean, if you think about countries that are good at this stuff and invested in it, it's China, it's Russia, it's Iran, it's Venezuela, it's North Korea, it's the countries that have an interest in weakening America and trying to disrupt American global hegemony. | ||
| This is not what they want in the world. | ||
| And they understand that one great way you can do that is by creating a divide internally in the culture. | ||
| Put the people against each other, create this kind of cultural civil war, which is what we're seeing. | ||
| I don't think they started it, but I think they're amplifying it. | ||
| Do you think there's just going to be a complete disconnect with a certain set of people that are just going to put their phones down and want to be off the machines altogether? | ||
| I hear that sort of sentiment more and more. | ||
| You know, I do the off-the-grid August thing, but I think people saying it and doing it are two very different things. | ||
| I agree with you on the last point. | ||
| I think that would be amazing, personally. | ||
| I think as someone who kind of looks at this stuff really carefully and forensically, it just really warps your brain. | ||
| It makes you someone you're not. | ||
| So if that were ever to be the case, it would be a win for humanity, for society, for our country. | ||
| I just don't know it's ever going to happen because it's like this thing Cal Newport points out, which is that these companies spend billions and billions of dollars to keep you on the platform. | ||
| Even for one extra minute, they'll pay for that. | ||
| We are so conditioned. | ||
| Our society culturally is addicted to this, not just individually. | ||
| I don't know how we break that cultural addiction. | ||
| I don't think anyone knows. | ||
| And I'm pretty damn good about it, and I certainly don't have the answer. | ||
| And I think you can just see something happening right now. I'm very aware of this. | ||
| If I sit down with somebody for dinner or go out with a group of people and someone starts looking at their phone, I'm basically done with them at that point. | ||
| And then there's this other group of people that have an ability not to do it. | ||
| Yes. | ||
| And maybe that's going to be the divide more than anything else going forward. | ||
| I hope so. | ||
| We're going to have a bunch of us slogging through the real world, not knowing what the hell's going on, but it'll be a little bit better than the other guys. | ||
| But the real world is so much better. | ||
| You know, the texture of it, the difference between putting your kids to bed with a phone in your hand and the phone just not in the same room on the same floor of the house is huge because you can be a human being with other human beings. | ||
| And it's, you know, I think you and I are old enough to remember a time without it, without any of it. | ||
| And it was a different time and it was a better time. | ||
| It's easy to sort of reminisce about how great, how beautiful the past was. | ||
| But I think we do need to bring some of that back into where we're going in the future. | ||
| Otherwise, I don't know what it looks like. | ||
| Right. | ||
| It's interesting. | ||
| It's not to do Midnight in Paris, like, oh my God, every generation before us was so special and so unique. | ||
| And if we could have only been there, but it is true. | ||
| We are now, Gen Xers are really the last generation to have any real memory with a little bit of adulthood before this thing. | ||
| And it's, I guess, incumbent upon us to remind people that it was okay. | ||
| It was great to go outside and to play, even to play the video games we were talking about. | ||
| The thing is, even though I could play Contra for two hours, you could put it down and Contra wasn't talking back to you. | ||
| Right. | ||
| Where now we've handed them the world and anyone on the other side is talking. | ||
| Right. | ||
| Zelda didn't tell you to hate your neighbor for no reason. | ||
| Zelda taught you to hate... who is the bad guy in Zelda? | ||
| Gamora? | ||
| Gondola? | ||
| Come on, somebody help me here. | ||
| One of my video game guys. | ||
| Ganon. | ||
| That seems like the ending of the show. | ||
| Thank you, Ashley. | ||
| Thank you, Dave. | ||
| Good to be here. | ||
| If you're tired of the mainstream media circus and want more honest conversations, go check out our media playlist. | ||
| And if you want to watch full interviews on a wide variety of topics, watch our full episode playlist all right over here. |