Speaker | Time | Text |
---|---|---|
unidentified | | [music] | ||
Why, hello there, you excellent person, you. | ||
I'm Dave Rubin, this is the Rubin Report Direct Message, and today is October 5th, 2021. | ||
Do me a favor, share our videos, subscribe to our channel, and tap the notification bell. | ||
We got three stories for you. | ||
One of them is breaking right at this very minute. | ||
There is a hearing on Capitol Hill. | ||
with a Facebook whistleblower and I've been watching the hearing and watching all the clips fly on Twitter and elsewhere and it's quite interesting. | ||
We're going to show you four clips of this thing because at first she was kind of saying, yeah, you know, these big tech companies and Facebook specifically are very evil and they're manipulating us and making us depressed and, you know, driving political polarization. | ||
But then suddenly she started prescribing what the government should do, which is an odd thing to do as a whistleblower. | ||
A whistleblower, you would think, would just give you the information. | ||
Oh, this is all the bad stuff, the secret stuff that I'm exposing. | ||
But then suddenly in the midst of the hearing, she started acting a little bit more like an activist and actually prescribing what Congress should do. | ||
It involves censorship. | ||
Yeah, of course, of course. | ||
So is she a whistleblower or a plant? | ||
We'll discuss. | ||
Part two of today's show, you know, Joe Biden, let's go Brandon, that guy, he is talking about this debt situation. | ||
And we've been talking a lot about it, how they're pushing this idea that $3.5 trillion equals $0, because they're just going to steal it from the rich people and the corporations because they don't pay. | ||
Well, he gave a ridiculous, rambling, mostly incoherent little chat about what the debt ceiling is and debt in general, and that, you know, we can pay it back because we just go into, well, you'll see, you'll see. | ||
That was my version of Joe Biden right there. | ||
And then finally, this one, I mean, this is just going to be an ongoing thing, unfortunately. | ||
We've crossed what I would say is a very strange and dangerous Rubicon in America. | ||
Kyrsten Sinema, who is the senator from Arizona, we showed the video yesterday of her being harassed in the bathroom. | ||
I think it was at Arizona State University, if I'm not mistaken. | ||
Well, now she was on a plane where she got harassed again, then at an airport after, so we're going to show you some video. | ||
And we've sort of said now it's okay to get up in their face, which as you guys know, this is exactly what Maxine Waters and many of the Democrats were saying all along. | ||
You can get up in their face. | ||
She was talking about Republicans, but now it has come home even to her Democrat people. | ||
Should we do an ad first, or are we going right into story number one? | ||
We're going to do story number one? | ||
All right. | ||
There you go. | ||
All right, so let's talk about this Facebook whistleblower situation. | ||
So this woman by the name of Frances Haugen, she is actually testifying right now as we speak. | ||
You might have it even open in another window, because that's one of the things that the internet does. | ||
You can have 800 windows open. | ||
You can pay attention to 800 things. | ||
You can actually overload your brain until you're nothing but a drooling zombie. | ||
But I know you wouldn't do that. | ||
You're watching me. | ||
You're focused. | ||
You're paying attention. | ||
You're learning. | ||
And you're going to transmit this information to other people. | ||
Anyway, she is testifying right now, and she's a Facebook whistleblower, basically bringing some information to light on how Facebook manipulates people, how it's leading to political division, how they're data mining. | ||
I mean, a whole bunch of stuff that we all know, right? | ||
We all know this stuff is happening. | ||
Nothing that I heard her say or that you're about to hear, I think, is so mind-blowing. | ||
And I would also preface this all by saying that, you know, we've done this before, right? | ||
Like we've been to this rodeo before where Congress has these hearings about big tech. | ||
Usually it's a bunch of old people who don't even know how to turn on their computers. | ||
They're asking tech people things. | ||
We never get real answers. | ||
People lie under oath as Jack Dorsey did to Ted Cruz. | ||
I mean, you've seen all of these things. | ||
And even the guys that I like, say like Ted Cruz or Rand Paul or whatever, they ask good questions, but then nothing ever happens after the hearing. | ||
And one other thing before I show you the video, you know I kept saying all along when Trump was president it was like there was a chance to do something then. | ||
And I was always leery of the government getting involved because what happens if Trump wasn't president and then suddenly the Democrats were in power and now you've married big tech and the government even further. | ||
That seems to be the direction we are now heading in. | ||
So we've got four videos from her and I think you'll see why they each get more disturbing. | ||
Here's number one. | ||
unidentified | | They want you to believe in false choices. | ||
They want you to believe that you must choose between a Facebook full of divisive and extreme content or losing one of the most important values our country was founded upon, free speech. | ||
That you must choose between public oversight of Facebook's choices and your personal privacy. | ||
That to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality. | ||
They want you to believe that this is just part of the deal. | ||
I am here today to tell you that's not true. | ||
Okay, so far so good, right? | ||
We all kind of know this stuff. | ||
There's nothing that she said there that I think you're going, oh my God, I can't believe Facebook's doing that. | ||
Like, we know it's sort of driving political division. | ||
We know that they manipulate the feed, right? | ||
The feed, if it was unmanipulated, would be purely chronological. | ||
I think that's what we mostly thought when we signed up for these things. | ||
Say, 20 years ago we thought, oh, I follow someone or I follow an organization or a school or whatever it might be and I'm just going to have a feed that's going to be fed to me chronologically. | ||
Then, of course, the algorithms got involved. | ||
They start looking at your behaviors and they start feeding you more things. | ||
They could try to push you one way politically or another way. | ||
All sorts of things. | ||
So she says all that, and I don't think that's anywhere as mind-blowing. | ||
And then, of course, she mentions that this is sort of an assault on free speech, because they're giving you false choices between things. | ||
And, of course, an assault on privacy. | ||
So, so far, pretty good, right? | ||
Like, she's saying things that we believe to be true. | ||
Nothing major new under the sun there. | ||
Here's clip number two. | ||
unidentified | | Could you talk more about why engagement-based ranking is dangerous, and do you think Congress... together? | ||
Facebook is going to say, you don't want to give up engagement-based ranking. | ||
You're not going to like Facebook as much if we're not picking out the content for you. | ||
That's just not true. | ||
There are a lot of-- | ||
Facebook likes to present things as false choices. | ||
Like, you have to choose between having lots of spam. | ||
Like, let's say, imagine we ordered our feeds by time, like on iMessage or on-- | ||
there are other forms of social media that are chronologically based. | ||
They're gonna say, you're gonna get spammed, like you're not gonna enjoy your feed. | ||
The reality is that those experiences have a lot of permutations. | ||
There are ways that we can make those experiences where computers don't regulate what we see; we together socially regulate what we see. | ||
But they don't want us to have that conversation because Facebook knows that when they pick out the content that we focus on using computers, we spend more time on their platform, they make more money. | ||
The dangers of engagement-based ranking are that Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment, or a reshare. And it's interesting, because those clicks and comments and reshares aren't even necessarily for your benefit. | ||
It's because they know that other people will produce more content if they get the likes and comments and reshares. | ||
They prioritize content in your feed so that you will give little hits of dopamine to your friends so they will create more content. | ||
And they have run experiments on people, producer-side experiments, where they have confirmed this. | ||
Okay, so now we're getting to something interesting here, right? | ||
We're getting a little bit more of a look under the hood. | ||
As I said, we all sort of thought we were getting this unmanipulated feed years ago, but we all know it's been manipulated. | ||
And what she's saying there is the more that you do things that outrage people, let's say, the more that that drives engagement and gets more clicks and likes and comments and all that, the more they're gonna show you a certain type of thing. | ||
And that if they left it just to us, to our own devices, well, we might not be on the platform as much. | ||
Our engagement and the amount of time that we let that thing suck us off for... | ||
How about that? | ||
That it would go down, and obviously that's not what they want. | ||
So that's kind of interesting. | ||
And it goes to the tension between, oh, can we just have what I think Facebook was originally designed to be, this place where you could reconnect with old friends and family members, know a little bit about people, maybe get a little bit of information, versus the sort of monolithic, sort of all-encompassing force that it is now, part of all of our lives. | ||
And I actually want to relate a little of this to YouTube, although this hearing is not specifically about YouTube. | ||
But we know, I do a show on YouTube, right? | ||
We do pretty well around here. | ||
But we know that YouTube has a feed and an algorithm that feeds you guys different things. | ||
So right now, if you're watching this video, you probably have suggested videos on the side. | ||
We hope, and we do everything we can, we've got our guy Chris, who is a great YouTube optimization guy. | ||
We hope that most of those videos are our videos, so that you'll click around through our videos, right? | ||
We are in business, and part of the business is that we keep you watching. | ||
Now, I am not in the business of keeping you outraged and angry and trying to drag you |