Ed Zitron of Better Offline critiques Google's "Code Yellow" crisis and Sam Altman's manipulative leadership, arguing that non-technical executives prioritizing profit over product have destroyed the internet. He exposes how legacy media gaslights audiences regarding inflation while tech giants like Meta and Apple enable authoritarianism through enshittification. Ultimately, Zitron warns that this erosion of trust and quality creates fertile ground for a Silicon Valley apocalypse unless the industry returns to building sustainable solutions for real human problems. [Automatically generated summary]
Transcriber: nvidia/parakeet-tdt-0.6b-v2, sat-12l-sm, and large-v3-turbo
Trust Your Girlfriends — 00:03:02
This is an iHeart podcast.
Guaranteed human.
When a group of women discover they've all dated the same prolific con artist, they take matters into their own hands.
I vowed I will be his last target.
He is not going to get away with this.
He's going to get what he deserves.
We always say that, trust your girlfriends.
Listen to the girlfriends.
Trust me, babe.
On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
I got you, I got you.
Hey, it's Norah Jones, and my podcast, Playing Along, is back with more of my favorite musicians.
Check out my newest episode with Josh Groban.
You related to the Phantom at that point.
Yeah, I was definitely the Phantom in that.
That's so funny.
Share each day with me each night, each morning.
Listen to Norah Jones's Playing Along on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
What's up, everyone?
I'm Ego Nwodim.
My next guest, it's Will Ferrell.
My dad gave me the best advice ever.
He goes, just give it a shot.
But if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit.
If you saw it written down, it would not be an inspiration.
It would not be on a calendar of, you know, the cat just hanging in there.
Yeah, it would not be.
Right, it wouldn't be that.
There's a lot of life.
Listen to Thanks Dad on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
In 2023, Bachelor star Clayton Echard was accused of fathering twins, but the pregnancy appeared to be a hoax.
You doctored this particular test twice, Miss Owens, correct?
I doctored the test once.
It took an army of internet detectives to uncover a disturbing pattern.
Two more men who'd been through the same thing.
Greg Gillespie and Michael Mancini.
My mind was blown.
I'm Stephanie Young.
This is Love Trapped.
Laura, Scottsdale Police.
As the season continues, Laura Owens finally faces consequences.
Listen to the Love Trapped podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
10-10 shots five, City Hall building.
How could this ever happen in City Hall?
Somebody tell me that.
Jeffrey Williams.
A shocking public murder.
This is one of the most dramatic events that really ever happened in New York City politics.
They screamed, get down, get down.
Those are shots.
A tragedy that's now forgotten.
And a mystery that may or may not have been political, that may have been about sex.
Listen to Rorschach, Murder at City Hall on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Cool Zone Media.
Google Code Yellow Rollback — 00:10:48
Hey, everybody.
Robert Gosh-Darn Evans for you here.
And, you know, for the end of the year, to celebrate and stuff, we've got our normal Behind the Bastards content coming to you.
Do not worry.
That's all going to continue as normal.
But we also wanted to highlight some other shows in our network, most of which are new and launched this year.
We've got some compilation best of episodes that we think the bastards audience is going to love.
And we're delivering to you now in a special format with fewer ads.
So today, you're going to hear some episodes of Better Offline, Ed Zitron's excellent critical tech industry podcast, which has taken the tech world by storm.
And I'm excited for you to learn about the man who killed Google Search, about Sam Altman, the CEO of OpenAI, and why he's dangerous for society, and what Ed Zitron calls the rot economy.
Hello and welcome to Better Offline.
I'm your host, Ed Zitron.
And in the next two episodes, I'm going to tell you the names of some of the people responsible for destroying the internet.
And I'm going to start on February 5th, 2019, when Ben Gomes, Google's former head of search, well, he had a problem.
Jerry Dischler, then the VP and GM of Ads at Google, and Shiv Venkataraman, then the VP of Engineering, Search and Ads on Google properties, had called something called a code yellow for search revenue due to, and I quote emails that came out as part of Google's antitrust hearing, steady weakness in the daily numbers and a likelihood that it would end the quarter significantly behind, in metrics that remain kind of unclear.
For those unfamiliar with Google's internal kind of Scientology-esque jargon, which means most people, let me explain.
A code yellow isn't a terrible need to piss or some sort of crisis of moderate severity.
The yellow, according to Steven Levy's tell-all book about Google, refers to, and I promise this is not a joke, the colour of a tank top that a former VP of engineering called Wayne Rosing used to wear during his time at the company.
It's essentially the equivalent of a DEF CON 1 and activates, as Levy explained, a war room-like situation where workers are pulled from their desks and into a conference room where they tackle the problem as a top priority.
Any other projects or concerns are sidelined.
And independently, I've heard there are other colors like purple.
I'm not going to get into that though.
It's quite boring and irrelevant to this situation.
In emails released as part of the Department of Justice's antitrust case against Google, as I previously mentioned, Dischler laid out several contributing factors.
Search query growth was significantly behind forecast.
The timing of revenue launches was significantly behind.
And he had this vague worry that several advertiser-specific and sector weaknesses existed in search.
Now I want to cover something because I've messed up and I really want to be clear about this.
I've previously and erroneously referred to the code yellow as something that Gomes raised as a means of calling attention to Google's ad side getting a little too close to search.
I'm afraid the truth is extremely depressing and so much grimmer.
The code yellow was actually the rumble of the goddamn rot economy, with Google's revenue arm sounding the alarm that its golden goose wasn't laying enough eggs.
Gomes, a Googler of 19 years who basically built the foundation of modern search engines, should go down as one of the few people in tech who actually fought for an actual principle.
And he was destroyed by a guy called Prabhakar Raghavan, a computer scientist class traitor who sided with the management consultancy sect.
More confusingly, one of their problems was that there was insufficient growth in queries, as in the amount of things that people were asking Google.
It's a bit like if Ford decided that things were going poorly because their drivers weren't putting enough goddamn miles on their trucks.
This whole story has personally upset me, and I think you're going to hear that in this.
But going through these emails is just very depressing.
Anyway, a few days beforehand on February 1st, 2019, Kristen Gill, then Google's VP Business Finance Officer, had emailed Shashi Thakur, then Google's VP of Engineering, Search and Discover, saying that the ads team had been considering a code yellow to close the search gap it was seeing.
Vaguely referring to how critical that growth was to an unnamed company plan.
To be clear, this email was in response to Thakur stating that there is nothing that the search team could do to operate at the fidelity of growth that the ads department had demanded.
Shashi forwarded the email to Gomes, asking if there was any way to discuss this with Sundar Pichai, Google's CEO, and declared that there was no way he would sign up for a high-fidelity business metric for daily active users on search.
Thakur also said something that I've been thinking about constantly since I read these emails.
That there was a good reason that Google's founders separated search from ads.
I want you to remember that line for later.
A day later, on February 2nd, 2019, Thakur and Gomes shared their anxieties with Nick Fox, a vice president of search and Google Assistant, entering a multiple day-long debate about Google's sudden lust for growth.
This thread is a dark window into the world of growth-focused tech, where Thakur listed the multiple points of disconnection between ads and search, discussing how the search team wasn't able to finely optimize engagement on Google without hacking it, a term that effectively means tricking users into spending more time on a site, and that doing so would lead them to, and I quote, abandon work on efficient journeys.
In one email, Fox adds that there was a pretty big disconnect between what finance and ads wants and what search was doing.
Every part of this story pisses me off so much.
When Gomes pushed back on the multiple requests for growth, Fox added that all three of them were responsible for search and that search was, and again I quote, the revenue engine of the company, and that bartering with the ads and finance teams was now potentially the new reality of their jobs.
On February 6th, 2019, Gomes said that he believed that search was getting too close to the money and ended his email by saying that he was concerned that growth is all that Google was thinking about.
On March 22nd, 2019, Google VP of product management, Darshan Kantak, would declare the end of the Code Yellow.
The thread mostly consisted of congratulatory emails until Gomes made the mistake of responding, congratulating everyone, saying that the plans architected as part of the Code Yellow would do well throughout the year.
Enter Prabhakar Raghavan, then Google's head of ads and the true mastermind behind the Code Yellow, who would respond curtly, saying that the current revenue targets were addressed by heroic RPM engineering and that the core query softness continued without mitigation.
A very clunky way of saying that despite these changes, query growth was not happening at the rate he needed it to.
A day later, Gomes emailed Fox and Thakur an email he intended to send to Raghavan.
He led by saying that he was annoyed both personally and on behalf of the search team.
In this very long email, he explained in arduous detail how one might increase engagement with Google search, but specifically added that they could increase queries quite easily in the short term, but only in user-negative ways, like turning off spell correction or ranking improvements, or placing refinements, effectively labels, all over the page, adding that it was possible that there are trade-offs here between the different kinds of user negativity caused by engagement hacking, and that he was deeply, deeply uncomfortable with this.
He also added that this was the reason he didn't believe that queries, as in the amount of things people were searching on Google, were a good metric to measure search, and that the best defense against the weakness of queries was to create compelling user experiences that make users want to come back.
Crazy idea there.
What if the product was good?
Not good enough for Prabhakar.
So a little bit of history about Google here.
They regularly throughout the year do core updates to search.
These are updates that change the algorithm that say, okay, we're going to suppress this kind of thing.
We're going to elevate this kind of thing.
And they are actually the reason that search changes.
It's why certain sites suddenly disappear or reappear.
It's why sites get a ton of traffic, some don't get any, and so on and so forth.
But they do a lot of them.
I'm a little bastard, and I went and looked through pretty much the last decade of these.
The one that stood out to me was the March 2019 core update to search, which happened about a week before the end of the Code Yellow, meaning that it's very likely that this was a result of Prabhakar's bullshit.
So this was expected to be one of the largest updates to search in a very long time, and I'm quoting Search Engine Journal there.
Yet when it launched, many found that the update mostly rolled back changes, and traffic was increasing to sites that had been suppressed by previous updates like Google Search's Penguin update from 2012 that specifically targeted spammy search results.
There were others that were seeing traffic as well from an update that happened on the 1st of August 2018.
That was a few months after Gomes became head of search.
While I'm guessing here, I really don't know.
I do not work for Google.
I do not have friends there.
I think the timing of the March 2019 core update, along with the traffic increases to previously suppressed sites that 100% were spammy, SEO nonsense, I think these suggest that Google's response to the Code Yellow was to roll back changes that were made to maintain the quality of search.
A few months later, in May 2019, Google would roll out a redesign of how ads were shown on Google search, specifically on mobile, replacing the bright green ad label and URL color on ads with a tiny little bolded black note that said "Ad" in the smallest font you could possibly put there, with the link looking otherwise identical to a regular search link.
I guess that's how they managed to start hitting their numbers, huh?
And then in January 2020, Google would bring this change to desktop, and The Verge's Jon Porter would suggest that it made Google's ads look just like search results now.
Awesome.
Five months later, a little over a year after the Code Yellow situation, Google would make Prabhakar Raghavan the head of Google Search, with Jerry Dischler taking his place as the head of ads.
After nearly 20 years of building Google search, Gomes would be relegated to the SVP of Education at Google.
Raghavan's Failed Yahoo Era — 00:15:07
Gomes, who was a critical part of the original team that made Google search work, who has been credited with establishing the culture of the world's largest and most important search engine, was chased out by a growth-hungry managerial type.
Several of them, actually, led by Prabhakar Raghavan, a management consultant wearing an engineer costume.
As a side note, by the way, I use the term management consultant there as a pejorative.
While he exhibits all the same bean-counting, morally unguided behaviors of a management consultant, from what I can tell, Raghavan has never actually worked in that particular sector of the economy.
But you know who has?
Sundar Pichai, the CEO of Google, who previously worked at McKinsey, arguably the most morally abhorrent company that's ever existed, having played roles both in the 2008 financial crisis, where it encouraged banks to load up on debt and flawed mortgage-backed securities, and the ongoing opioid crisis, where it effectively advised Purdue Pharma on how to growth hack sales of OxyContin, an extremely addictive painkiller.
McKinsey has paid nearly $1 billion over several settlements due to its work with Purdue.
But I'm getting sidetracked.
But one last point.
McKinsey is actively anti-labor.
When a company brings in a McKinsey consultant, they're often there to advise on how to cut costs, which inevitably means layoffs and outsourcing.
McKinsey is to the middle class what flesh-eating bacteria is to your skin.
But back to the emails, which are a stark example of the monstrous, disgusting rot economy, the growth at all cost mindset that's dominating the tech ecosystem.
And if you take one thing away from this episode, I want it to be the name Prabhakar Raghavan, and an understanding that there are people responsible for the current state of the internet.
These emails, which I really encourage you to look up — if you go to wheresyoured.at, you'll be able to see a newsletter that has links to them.
Well, these emails tell a dramatic story about how Google's finance and advertising teams, led by Raghavan with the blessing of CEO Sundar Pichai, the McKinsey guy, actively worked to make Google worse to make the company more money.
This is exactly what I mean when I talk about the rot economy, an illogical, product-destroying mindset that turns products you love into torturous, frustrating quasi-tools that require you to fight the company to get the thing you want.
Ben Gomes was instrumental in making search work, both as a product and a business.
He joined the company in 1999, a time long before Google established dominance in the field.
And the same year when Larry Page and Sergey Brin tried to sell the company to Excite for $1 million, only to walk away after Vinod Khosla, an Excite investor and co-founder of Sun Microsystems who's now a VC who tried to stop people going to a beach in Half Moon Bay.
Well, he tried to lowball them with a $750,000 offer, also known as a 100-square-foot apartment in San Francisco.
In an interview with Fast Company's Harry McCracken from 2018, Gomes framed Google's challenge as taking the PageRank algorithm from one machine to a whole bunch of machines.
And they weren't very good machines at the time.
Despite his impact and tenure, Gomes had only been made head of search in the middle of 2018, after John Giannandrea moved to Apple to work on its machine learning and AI strategy.
Gomes had been described as Google's search czar, beloved for his ability to communicate across Google's many quite decentralized departments.
Every single article I've read about Gomes and his tenure at Google spoke of a man deeply ingrained in the foundation of one of the most important technologies ever made.
A man who had dedicated decades to maintaining a product with a, and I quote Gomes here, guiding light of serving the user and using technology to do that.
And when finally given the keys to the kingdom, the ability to elevate Google search even further, he was ratfucked by a series of rotten careerists trying to please Wall Street, led by Prabhakar Raghavan.
Do you want to know what Prabhakar Raghavan's old job was?
Well, Prabhakar Raghavan, the new head of Google Search, the guy that ran Google search, that runs Google search right now, that is running Google search into the goddamn ground.
Do you want to know what his job was?
His job before Google?
He was the head of search for goddamn Yahoo from 2005 through 2012.
When he joined the company, when Prabhakar Raghavan took over Yahoo Search, they held a 30.4% market share, not far from Google's own 36.9% and miles ahead of the 15.7% that Microsoft's MSN search had.
By May 2012, Yahoo was down to just 13.4% and had shrunk for the previous nine consecutive months and was being beaten by even the newly released Bing.
That same year, Yahoo had the largest layoffs in its corporate history, shedding 2,000 employees or 14% of its overall workforce.
The man who deposed Ben Gomes, someone who worked on Google search from its very beginnings, was so shit at his job that in 2009, Yahoo effectively threw in the towel on its own search tech, instead choosing to license Bing's engine in a 10-year deal.
If we take a long view of things, this likely precipitated the overall decline of the company, which went from being worth $125 billion at the peak of the dot-com boom to being sold to Verizon for $4.8 billion in 2017, which is roughly a 3,000 square foot apartment in San Francisco.
With search no longer a priority and making less money for the company, Yahoo decided to pivot into Web 2.0 and original content, making some bets that paid off, but far, far too many that did not.
It spent $1.1 billion on Tumblr in 2013, only for Verizon to sell it for just $3 million in 2019.
It bought Zimbra in 2007, ostensibly to compete with the new Google Apps productivity suite, only to sell it for a reported fraction of the original purchase price to VMware a few years later.
That's not his fault.
But nevertheless, Yahoo was a company without a mission, a purpose, or an objective.
Nobody, and I'll speculate, even those leading the company, really knew what it was or what it did.
Anyway, just a big shout-out right now to Kara Swisher, who referred to Prabhakar as well-respected when he moved from Yahoo to Google.
You absolutely nailed it, Kara.
Bang-up job.
In an interview with ZDNet's Dan Farber from 2005, Raghavan spoke of his intent to align the commercial incentives of a billion content providers with social good intent while at Yahoo and his eagerness to inspire the audience to give more data.
What?
Anyway, before that, it's actually hard to find out exactly what Raghavan did, though.
According to ZDNet, he spent 14 years doing search and data mining research at IBM.
In April 2011, The Guardian ran an interview with Raghavan that called him Yahoo's secret weapon, describing his plan to use rigorous scientific research and practice to inform Yahoo's business, from email to advertising, and how, under then-CEO Carol Bartz, the focus had shifted to the direct development of new products.
It speaks of Raghavan's scientific approach and his steady process-based logic to innovation that is very different to the common perception that ideas and development are more about luck and spontaneity.
A sentence that I'm only reading to you because I really need you to hear how stupid it sounds and how specious some of the tech press used to be.
Frankly, this entire article is ridiculous, so utterly vacuous that I'm actually astonished.
I don't want to name the reporter.
I feel bad.
What about Raghavan's career made this feel right?
How has nobody connected these dots before?
I have a day job.
I run a PR firm.
I am a blogger with a podcast.
And I'm the one who said, yeah, okay, Dracula is now the CEO of the blood bank.
Nobody saw this.
Nobody saw this at the time.
I just feel a bit crazy.
I feel a bit crazy.
But to be clear, this was something written several years after Yahoo had licensed its search technology to Microsoft, in a financial deal that the next CEO, Marissa Mayer, who replaced Bartz, was still angry about for years.
Raghavan's reign as what ZDNet referred to as the search master was one so successful that it ended up being replaced by a search engine that not a single person in the world enjoys saying out loud.
The Guardian article ran exactly one year before dramatic layoffs at Yahoo that involved firing entire divisions worth of people and four months before Carol Bartz would be fired by telephone by then chairman Roy Bostock.
Her replacement, Scott Thompson, who previously served as president of PayPal, would last a whole five months in the role before he was replaced by former Google executive Marissa Mayer, in part because it emerged he lied on his resume about having a computer science degree.
Hey, Prabhakar, did you not notice that?
Anyway, whatever.
Bartz joined Yahoo in 2009, so about four years into Prabhakar's reign of terror, I guess.
And she joined in the aftermath of its previous CEO, Jerry Yang, refusing to sell the company to Microsoft for $45 billion.
In her first year, she laid off hundreds of people and struck a deal that I've mentioned before to power Yahoo's search using Microsoft's Bing search engine tech with Microsoft paying Yahoo 88% of the revenue it gained from searches, a deal that made Yahoo a couple hundred million dollars for handing over the keys and the tech to its most high-traffic platform.
As I previously stated, when Prabhakar Raghavan, Yahoo's secret weapon, was doing his work, Yahoo's search was so valuable that it was replaced by Bing.
Its sole value, in fact...
I mean, maybe I'm being a little unfair, but there's a way of looking at this.
You could say that Yahoo's entire value at the end of his career was driven by nostalgia in association with days before he worked there.
Anyway, thanks to the state of modern search, it's actually very, very difficult to find much about Raghavan's history.
It took me hours of digging through Google, and at one point Bing, embarrassingly, to find three or four articles that went into any depth about him.
But from what I've gleaned, his expertise lies primarily in failing upwards, ascending through the ranks of technology on the momentum from the explosions he's caused.
In a Wired interview from 2021, glad-hander Steven Levy said Raghavan isn't the CEO of Google, he just runs the place, and described his addition to the company as a move from research to management.
While Levy calls him a world-class computer scientist who has authored definitive texts in the field, which is true, he also describes Raghavan as choosing a management track, which definitely tracks with everything I've found out about him.
Raghavan proudly declares that Google's third-party ad tech plays a critical role in keeping journalism alive in a really shitty answer to a question that was also made at a time when he was aggressively incentivizing search engine optimized content and a year after he deposed someone who actually gave a shit about search.
Under Raghavan, Google has become less reliable and is dominated by search engine optimization and just outright spam.
And I've said this before, but look, we complain about the state of Twitter under Elon Musk, and justifiably, he's a vile, antisemitic, racist bigot.
We all know this.
It's fully true.
We can say it a million times.
However, I'd argue that Raghavan and, by extension, Sundar Pichai deserve 100 times more criticism.
They've done unfathomable damage to society.
You really can't fix the damage they've been doing and the damage they'll continue to do, especially as we go into an election.
Raghavan and his cronies worked to oust Ben Gomes, a man who dedicated a good portion of his life to making the world's information more accessible, in the process burning the Library of Alexandria to the goddamn ground so that Sundar Pichai could make more than $200 million a year.
And Raghavan, a manager, hired by Sundar Pichai, a former McKinsey man, the king of managers, is an example of everything wrong with the tech industry.
Despite his history as a true computer scientist with actual academic credentials, Raghavan chose to bulldoze actual workers, people who did things and people that care about technology, and replace them with horrifying toadies that would make Google more profitable and less useful.
Since Prabhakar took the reins of Google Search in 2020, Google search has dramatically declined, with these core search updates I mentioned, allegedly made to improve the quality of results, having the adverse effect, increasing the prevalence of spammy, shitty search-optimized content.
It's frustrating.
The anger you hear in my voice, the emotion, is because I've read all of these antitrust emails.
I have gone through this guy's history and I've read all the things about Ben Gomes too.
Every article about Ben Gomes where they interviewed is this guy just having these dreamy thoughts about the future of information and the complexity of delivering it at high speed.
Every interview with Raghavan is some vague bullshit about how important data is.
It's so goddamn offensive to me.
And all of this stuff happening is just one example of what I think are probably hundreds of things happening across startups or that have happened across startups in the last 10 or 15 years and big tech too.
And it's because the people running the tech industry are no longer those who built it.
Larry Page and Sergey Brin left Google in December 2019, the same year, by the way, as the Code Yellow thing.
And while they remained as controlling shareholders, they clearly don't give a shit about what Google means anymore.
Prabhakar Raghavan is a manager, and his career, from what I can tell, is mostly made up of: did some stuff at IBM, failed to make Yahoo anything of note, and fucked up Google so badly that every news outlet has run a story about how bad it is.
This, this is the result of taking technology out of the hands of real builders and handing it to managers at a time when management is synonymous with staying as far away from actual work as possible.
When you're a do-nothing looking to profit as much as possible, who doesn't use tech, who doesn't care about tech, and you only care about growth, well, you're not a user, you're a parasite, and it's these parasites that have dominated and are now draining the tech industry of its value.
They're driving it into a goddamn ditch.
Raghavan's story is unique insofar as the damage he's managed to inflict, or, if we're being exceptionally charitable, failed to avoid in the case of Yahoo, on two industry-defining companies.
Managers Obfuscate Reality — 00:04:17
And the fact that he did it without being a CEO or founder is remarkable.
Yet, he's far from the only example of a manager falling upwards.
I'm going to editorialize a bit here.
I want you to think about your job history.
I want you to think about the managers you've had.
I've written a lot about management and specifically to do with remote work and the whole thing around guys who don't do work, who are barely in the office, telling you you need to be in the office.
This problem is everywhere.
Managers are everywhere.
And managers aren't doing work.
I'm sure someone will email me now and say, well, I'm a manager and I do work all the time.
Yeah, mate, sure you do.
That's why you're emailing me telling me how good you are at your job.
People who actually do work don't feel defensive about it.
People who do things and are part of the actual profit center, they don't need a podcast to tell them they're good at their job.
What I think the problem is in modern American corporate society is that management is no longer synonymous with actually managing people.
It's not about getting the people what they need.
It's not about organizing things and making things efficient and good.
It's not about execution.
It's about handing work off to other people and getting paid handsomely.
And if you disagree, easy at betteroffline.com.
I will read your email.
Maybe I'll even respond.
But the thing is, management has become a poison in America.
Managers have become poisonous because managers are not actually held to any kind of standard.
No, only the workers who do the work are.
What happened to Ben Gomes is one of the most disgusting, disgraceful things to happen in the tech industry.
It's an absolute joke.
Ben Gomes was a goddamn hero.
And I really need you to read the newsletter and read these emails.
I need you to see how many times he and Thakur, great guy as well, were saying, hey, growth is bad for search.
The thing that Ben Gomes was being asked to do was increase queries on Google, the literal amount that people search.
There are many ways of looking at that and thinking, oh shit, that's not what you want.
Surely you don't want no queries.
You don't want people not using it at all.
But queries going upwards linearly suggests, if they're not matched to user growth at least, that people are not getting what they want on the first try, which, by the way, kind of feels like how Google is nowadays.
When you go to Google and the first result and the second result and the fifth result and the tenth result just don't get you what you need, because it's all SEO crap.
Now, this is all theorizing, but what I think Prabhakar Raghavan did was I think he took off all the fucking guidelines on Google search.
I think he rolled back changes specifically to make search worse, to increase queries, to give Google more chance to show you adverts.
I am guessing.
Don't have a source telling me this.
But the pattern around the core search updates, the fact that Google search started getting worse toward the middle and end of 2019 and unquestionably dipped in 2020.
Well, that's when Prabhakar took over.
That's when the big man took the reins.
That's when Dracula got his job at the blood bank.
And this is the thing.
There's very little that you and I can actually do about this.
But what we can do is say names like Prabhakar Raghavan a great deal of times so that people like this can be known, so that the actions of these scurrilous assholes can be seen and heard and pointed at and spat upon.
I'm not suggesting spitting on anyone, no violent acts.
No.
You can be pissy on the internet like the rest of us.
Now I'm ranting.
I realize I'm ranting.
But this subject really, it really got to me.
But it's not the only one.
In the next episode, I'm going to conclude this sordid three-part fiasco with a few more examples of these managers, these bean counters, devoid of imagination or ability or anything of note, save for that utter slug-like ability to protect oneself.
I want to talk about how these people managed to obfuscate their true intentions by pretending to be engineers, by pretending to be technologists and pretending to be innovators.
I want to tell you all about how Adam Mosseri destroyed Instagram.
And I want to tell you how little Sam Altman has achieved other than making himself and his friends rich.
See you next time.
Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song is Matt Osowski.
You can check out more of his music and audio projects at mattosowski.com.
M-A-T-T-O-S-O-W-S-K-I dot com.
You can email me at easy at betteroffline.com or check out betteroffline.com to find my newsletter and more links to this podcast.
Thank you so much for listening.
Better Offline is a production of CoolZone Media.
For more from CoolZone Media, visit our website, coolzonemedia.com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
There's two golden rules that any man should live by.
Rule one, never mess with a country girl.
You play stupid games, you get stupid prizes.
And rule two, never mess with her friends either.
We always say, trust your girlfriends.
I'm Anna Sinfield, and in this new season of The Girlfriends...
Oh my God, this is the same man.
A group of women discover they've all dated the same prolific con artist.
I felt like I got hit by a truck.
I thought, how could this happen to me?
The cops didn't seem to care.
So they take matters into their own hands.
I said, oh, hell no.
I vowed I will be his last target.
He's going to get what he deserves.
Listen to the girlfriends.
Trust me, babe.
On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hey, I'm Norah Jones, and I love playing music with people so much that my podcast called Playing Along is back.
I sit down with musicians from all musical styles to play songs together in an intimate setting.
Every episode's a little different, but it all involves music and conversation with some of my favorite musicians.
Over the past two seasons, I've had special guests like Dave Grohl, Laufey, Mavis Staples, Remi Wolf, Jeff Tweedy, really too many to name.
And this season, I've sat down with Alessia Cara, Sarah McLachlan, John Legend, and more.
Check out my new episode with Josh Groban.
You related to the Phantom at that point.
Yeah, I was definitely the Phantom in that.
That's so funny.
Share each day with me each night, each morning.
Say you love me.
You know I.
So come hang out with us in the studio and listen to Playing Along on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
I'm Laurie Siegel, and on Mostly Human, I go beyond the headlines with the people building our future.
This week, an interview with one of the most influential figures in Silicon Valley, OpenAI CEO Sam Altman.
I think society is going to decide that creators of AI products bear a tremendous amount of responsibility to products we put out in the world.
From power to parenthood.
Kids, teenagers, I think they will need a lot of guardrails around AI.
This is such a powerful and such a new thing.
From addiction to acceleration.
The world we live in is a competitive world, and I don't think that's going to stop, even if you did a lot of redistribution.
You know, we have a deep desire to excel and be competitive and gain status and be useful to others.
And it's a multiplayer game.
What does the man who has extraordinary influence over our lives have to say about the weight of that responsibility?
Find out on Mostly Human.
My highest order bit is to not destroy the world with AI.
Listen to Mostly Human on the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
What's up, everyone?
I'm Ego Nwodim.
My next guest, you know, from Step Brothers, Anchorman, Saturday Night Live, and the Big Money Players Network, it's Will Ferrell.
Woo, My dad gave me the best advice ever.
I went and had lunch with him one day, and I was like, and dad, I think I want to really give this a shot.
I don't know what that means, but I just know the groundlings.
I'm working my way up through and I know it's a place they come look for up and coming talent.
He said, if it was based solely on talent, I wouldn't worry about you, which is really sweet.
Yeah.
He goes, but there's so much luck involved.
And he's like, just give it a shot.
He goes, but if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit.
If you saw it written down, it would not be an inspiration.
It would not be on a calendar of, you know, the cat just hang in there.
Yeah, it would not be.
Right, it wouldn't be that.
There's a lot of luck.
Listen to Thanks Dad on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
In 2023, former Bachelor star Clayton Echard found himself at the center of a paternity scandal.
The family court hearings that followed revealed glaring inconsistencies in her story.
This began a years-long court battle to prove the truth.
You doctored this particular test twice, Miss Owens, correct?
I doctored the test once.
It took an army of internet detectives to crack the case.
I wanted people to be able to see what their tax dollars were being used for.
Sunlight's the greatest disinfectant.
They would uncover a disturbing pattern.
Two more men who'd been through the same thing.
Greg Owespi and Michael Marancini.
My mind was blown.
I'm Stephanie Young.
This is Love Trap.
Laura, Scottsdale Police.
As the season continues, Laura Owens finally faces consequences.
Ladies and gentlemen, breaking news out of Maricopa County as Laura Owens has been indicted on fraud charges.
This isn't over until justice is served in Arizona.
Listen to the Love Trap podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hello and welcome to Better Offline.
I'm your host, Ed Zitron.
As I've discussed in the last episode, Sam Altman has spent more than a decade accumulating power and wealth in Silicon Valley without ever having to actually build anything, using a network of tech industry all-stars like LinkedIn co-founder and investor Reid Hoffman and Airbnb CEO Brian Chesky to insulate himself from responsibility and accountability.
Yet things are beginning to fall apart as years of half-baked ideas and terrible, terrible product decisions have kind of made society sour on the tech industry.
And the last month has been particularly difficult for Sam, starting with the chaos caused by OpenAI blatantly mimicking Scarlett Johansson's voice for the new version of ChatGPT, followed by the resignation of researchers who claim that OpenAI prioritized, and I quote, shiny products over AI safety after the dissolution of OpenAI's safety team.
I know, it's just, it's almost cliché.
Shortly thereafter, former OpenAI board member Helen Toner revealed that Sam Altman was fired from OpenAI because of a regular pattern of deception, one where Altman would give inaccurate info about the company's safety processes on multiple occasions.
And his deceit was so severe that OpenAI's board only found out about the launch of ChatGPT, which, by the way, is OpenAI's first product that really made money, arguably the biggest product in tech.
You want to know how they found out about it?
Well, they found out when they were browsing Twitter.
They found out then, not from the CEO of OpenAI, the company which they were the board of.
Very weird.
Toner also noted that Altman was an aggressive political player, with the board, correctly, by the way, worrying that, and I quote again, that if Sam Altman had any inkling that the board might do something that went against him, he'd pull out all the stops, do everything in his power to undermine the board, and to prevent them from even getting to the point of being able to fire him.
As a reminder, by the way, the board succeeded in firing Sam Altman in November last year, but not for long, with Altman returning as CEO a few days later, kicking Helen Toner off the board along with Ilya Sutskever, a technical co-founder whom Altman manipulated long enough to build ChatGPT and then ousted the moment he chose to complain.
Sutskever, by the way, has resigned now.
He's also one of the biggest technical minds there.
So how is OpenAI going to continue?
Anyway, last week, a group of insiders at various AI companies published an open letter asking their overlords, the heads of these companies, for the right to warn about advanced artificial intelligence, in a monument, a genuinely impressive monument, to the bullshit machine that Sam Altman has created.
While there are genuine safety concerns with AI, there really are.
There are many of them to consider.
These people are desperately afraid of the computer coming alive and killing them when they should fear the non-technical asshole manipulator getting rich, making egregious promises about what AI can do.
AI researchers, you have to live up to Sam Altman's promises.
Sam Altman doesn't.
This is not your friend.
The problem is not the boogeyman computer coming alive.
That's not happening, man.
What's happening is this guy is leading your industry to ruin.
And the bigger concern that they should have should be about what Leopold Aschenbrenner, a former safety researcher at OpenAI, had to say on the Dwarkesh Patel podcast, where he claimed that security processes at OpenAI were, and I quote, egregiously insufficient, and that the priorities at the company were focused on growth over stability or security.
These people are afraid of OpenAI potentially creating a computer that can think for itself that will come and kill them at a time where they should be far more concerned about this manipulative con artist that's running OpenAI.
Sam Altman is dangerous to artificial intelligence, not because he's building artificial general intelligence, which is a kind of AI that meets or surpasses human cognitive capabilities, by the way.
Kind of like Data from Star Trek, they're afraid of that happening when they should be afraid of Altman's focus.
What does Sam Altman care about?
Because the only thing that I can find reading about what Sam Altman cares about is Sam Bloody Altman.
And right now, the progress attached to Sam Altman actually isn't looking that great.
OpenAI's growth is stalling, with Alex Kantrowitz reporting that user growth has effectively come to a halt based on a recent release claiming that ChatGPT had 100 million users a couple of weeks ago, which is, by the way, the exact same number that the company claimed ChatGPT had in November 2023.
ChatGPT is also a goddamn expensive product to operate.
With the company burning through capital at this insane rate, it's definitely more than $700,000 a day.
It's got to be in the millions, if not more.
It's insane.
And while OpenAI is aggressively monetizing ChatGPT, both to customers and to businesses, it's so obviously far from crossing the break-even Rubicon.
They keep leaking and they'll claim, oh, I didn't put that out there.
They keep telling people, oh, it's making billions of revenue, but they never say profit.
And eventually someone's going to turn to them and say, hey, man, you can't just do this for free or for negative.
At some point, Satya Nadella is going to call Sam Altman and say, Sammy.
Sammy, it's time.
Sammy, it's got to be a real business.
I assume he calls him that because of Supernatural.
But as things get desperate, Sam Altman's going to use the only approach he really has.
Sheer force of will.
He's going to push OpenAI to grow and sell into as many industries as possible.
And he's a specious hype man.
He's going to be selling to other specious hype men.
The Jim Kramers of the world are going to eat it up.
And they're all, all of them, the Marc Benioffs, the Satya Nadellas, the Sundar Pichais, they're all desperate to connect themselves with the future and with generative AI.
And those that he's selling to, the companies brokering deals, yes, even Apple.
They're desperate to connect their companies to another company, which is building a bubble.
A bubble inflated by Sam Altman.
And I'd argue that this is exceedingly dangerous for Silicon Valley and for the tech industry writ large, as executives that have become disconnected from the process of creating software and hardware follow yet another non-technical founder hawking unprofitable, unsustainable, and hallucination-prone software.
It's just very frustrating.
If there was a very technical mind at these companies, they might walk away.
And I'm not going to give Tim Cook much credit.
But looking into it, I can't find any evidence that Apple is buying a bunch of GPUs, the things that you use to power these generative AI models.
I found some research and analysts suggesting that they would buy a lot, but now OpenAI is doing a deal with Apple to power the next iOS.
And it's interesting.
It is interesting that Apple isn't doing this themselves.
Apple, a company with hundreds of billions of dollars in the bank, I believe, pretty much prints money.
That alone makes me think it's a bubble.
Now, they might look like arseholes if it comes out they have, but also, why are they subcontracting this to OpenAI when they could build it themselves, as Apple has always done?
Very strange.
It's all so peculiar.
But I wanted to get a little bit deeper into the Sam Altman story.
And as I discussed last episode, Ellen Huet of Bloomberg, she's been doing this excellent reporting on the man and joins me today to talk about the subject of her recent podcast, Sam Altman's rise to power.
So tell me a little bit about the show you're working on.
The show is the new season of Foundering, which is a serialized podcast from Bloomberg Technology.
So this is season five.
And in every season, we've told one story of a high-stakes drama in Silicon Valley.
I was also the host of season one, which came out several years ago.
It was about WeWork.
And we've done other companies since then.
And season five is about OpenAI and Sam Altman.
And I think we really tried to, you know, cover the arc of the company's creation and where it is now.
But in doing so, we really tried to do a character study of Sam Altman.
Like he's a very important person in the tech industry right now with a lot of power.
And we really wanted to ask ourselves the question and to help listeners ask themselves the question, should we trust him?
Should we trust this person who is currently in a position of a lot of influence and about whom there have been very serious, you know, allegations and questions raised about, you know, to put it in the words of the OpenAI board, his not consistently candid behavior.
And I think it's, you know, my hope is that we give listeners a chance to hear kind of the whole story and this like broader, you know, when there's news that's happening, it can happen so quickly.
It's hard to get a step back.
And I think what the show really does is it collects a lot of information in one place.
And we also have lots of new information that you won't hear anywhere else.
And interviews with people who, you know, have worked with Sam, who knew him when he was younger.
We have an interview with Sam's sister, Annie, from whom he is estranged.
And there's a lot of material in there, I think, that tries to get closer to this answer of like, what should we make of this person?
How should we think about checks and balances of power when we have these companies that are, by all appearances, gathering a lot of power?
And therefore, the people who are running them have a lot of power as well.
So we have, it's a five-episode arc, five-episode season.
And the first three episodes are out now to the general public.
And the last two will come out on subsequent Thursdays.
And if you would like to binge the whole season right away, the episodes are available early to bloomberg.com subscribers.
So you've just started this series about Sam Altman and his upbringing and also the growth of OpenAI and Loopt and everything.
Who are the people that have helped him get where he is today, though?
So the making of Sam Altman is a really interesting part of the overall story of Sam Altman.
Many people know him as the CEO of OpenAI because that's the role he's been in when he has risen to prominence beyond Silicon Valley.
Like, I think for many years, he was well known in Silicon Valley, but this is like now he's kind of a household name.
And so it's important to understand where Sam came from.
You know, he's been in the Valley since 2005, I think is when he started college, 2004, 2005 at Stanford.
Then he dropped out and then he joined Y Combinator, the now famous startup accelerator, but he was actually part of the first cohort of founders ever in YC, along with Twitch as well, right?
Yes, including the co-founders of Twitch and of Reddit.
And so Emmett Shear, you know, for those who know Emmett Shear, has a like very short 72-hour cameo in the OpenAI Sam Altman firing saga.
But yes, Emmett and Sam were both in the same YC batch.
So when we think about Sam's early career in Silicon Valley, I think what's important to know is that he rose very quickly in part because he was very successful in making these strategic, advantageous friendships and connections with already established people in the Valley.
The most important one is Paul Graham, who is the, you know, one of the founders of Y Combinator and basically like immediately took Sam under his wing when Sam joined this first batch of YC.
And yeah, Paul's a really important mentor to Sam.
He's kind of the first person who really sees in Sam this ambition, this hunger for power, this like drive to really build bigger and bigger companies.
Even when they met when Sam Altman was 19.
So Paul like sees him as a teenager and sees this future potential.
And so yes, you know, not only did Paul become a mentor to him and sort of helped build Sam's profile over those early years because he would, you know, Paul Graham is very famous for writing these essays about how to build startups and how to build the best startups.
And if you're at all interested in building startups, you've read many of them.
They're kind of like almost like a startup Bible.
And in many of them, he extols the virtues of Sam Altman.
He talks about Sam's ambition.
He talks about Sam's cunning, his ability to like, you know, make deals and like think big.
These are actually things Sam Altman has done, is what I've found.
Yeah, there are some interesting, you know, I've read many of the things Paul has written about Sam.
Some of my favorite ones include Paul writing that within three minutes of meeting Sam, this was when Sam was 19, Paul thought to himself, ah, so this is what a young Bill Gates is like, or this is what Bill Gates was like at 19, I think is the exact quote.
So, you know, he really builds him up in this way.
And I do think Paul had like unique insight into Sam.
Like they were close.
They, in many ways, I'm sure still are.
But it is this interesting role where Paul met Sam when he really didn't have much to his name and he really elevated him early on through his writings as this like startup founder to emulate, right?
That other founders should be emulating Sam.
And then of course, as Sam progresses in the Valley, he also starts to write these like startup wisdom essays.
Quasi-vacuous stuff.
In a similar style to Paul.
And then of course, the most important thing that happens is that in 2014, when Paul decides he no longer wants to run Y Combinator, which at this point is a much bigger vehicle than it was when Sam first started; it's no longer just a few startups.
Totally has produced Stripe, Dropbox, Airbnb.
This is a big job, right?
Like running Y Combinator.
And when Paul wants to hand it off to someone, he has said that the only person he considered giving this to was Sam.
So in 2014, when Sam is, I believe, 28 years old, he becomes president of Y Combinator.
This is, you know, he had started a startup.
It didn't really work.
He sold it and was starting to dabble in angel investing.
And at that point, Paul really elevated Sam to this new position of power.
And then he ran YC for a while and then started OpenAI.
And in starting OpenAI, he also leveraged these very useful connections with particularly powerful people who could help him, such as Elon Musk, who was able to give the vast majority of the pledged funding to start OpenAI.
Later, when Elon Musk splits from OpenAI, Sam makes this very powerful partnership with Satya Nadella to help fund OpenAI.
Another important partnership that Sam has made, you know, much earlier on was his friendship with Peter Thiel.
And one of the things Peter Thiel does is also, you know, gives him millions of dollars to start investing.
This is like before Sam takes over at YC.
With Hydrazine, right?
Yeah.
And, you know, another thing that Paul Graham did that really helped Sam: Paul had the opportunity to be one of the first investors in Stripe.
Yep.
He was offered the chance to invest $30,000 for 4% of Stripe, which, of course, now that Stripe is enormous, we all know how valuable that was.
And Paul split it with Sam.
He was like, oh, I might as well share this with Sam.
So Sam has said that that $15,000 for 2% of Stripe has been, you know, one of his best performing angel investments ever.
That was something he had.
The question is always where he got 15 grand from.
He was still working on Loopt at the time.
It's funny how privileged.
Anyway.
Yeah.
I don't actually know this, but my guess is 15 grand was not hard for him to pull up.
And it's one of those things where it really is, you know, access. Access and relationships are the sorts of things that can build a career and can lead to great wealth, right?
Like Sam is now, you know, by our own internal accounts and by other lists, a billionaire.
And this money comes from, you know, not from OpenAI, but from these angel investments that he's made early on that have been enormously successful.
So you called him, in one of the titles, the most Silicon Valley man alive.
Is this what you're getting at, this kind of power player mentality?
Yeah, I think it reflects a few things.
One, that even though he's, um, you know, he's in his late 30s, he's been a player in Silicon Valley for such a long time, you know, close to two decades.
And also that he's just someone who is extremely well connected.
So even before he took over Y Combinator, which I think you could argue is like kind of king of the startup world in some sense, like Y Combinator is like, you know, the top accelerator.
Early stage.
Totally.
Even before he took over at Y Combinator, I think he was extremely well connected.
He's very social.
He's very helpful.
He's very efficient.
Like many people have told me stories in which he, you know, calling Sam and talking for five minutes has solved their problem because he knows exactly the right person to call to fix it or, you know, he's really good at making deals.
I think it's just clear he's extremely well integrated into this world and has very successfully moved up the Silicon Valley status ladder to the point where he is now, which is, you know, CEO of one of the arguably hottest companies in the Valley right now.
And I think that that's not luck, right?
Like he didn't just come up with, he's not like a nobody who came up with an idea.
It's like he has the connections and has parlayed his connections into power to bring him to the point he is now.
So in your experience talking to people about Sam Altman, how technical is he, do you think?
What have you heard?
Because you say that he wasn't lucky, but he also does not appear to have successfully run a business, because Loopt shut down and two people, well, two executives, tried to get him fired from there.
He got fired from Y Combinator, which did very well.
But at the same time, YC was basically a conveyor belt for money at one point.
Not so much recently.
Yeah.
It just, it feels weird that this completely non-technical, semi-non-technical guy has ascended so far.
My sense is that's not maybe the most fair description.
Like, I think Sam is incredibly smart and people say this a lot.
And I, you know, I believe them.
I think his special skill, you know, he obviously knows how to, like, he's an engineer.
He has training.
I'm sure he can build a lot of stuff.
It seems like his comparative advantage, his special skill is relationships, deal making, figuring out who exactly is the right person to help him in whatever he's really trying to get done and figuring out the best way to get something to happen.
You know, one of the people I spoke to is someone who knows Sam from when he was younger and knows him personally and said that his superpower is figuring out who's in charge or figuring out who is in the best position to help him and then charming them so that they help him with whatever goal he's trying to get done.
And I think that like, yeah, one could argue that that's actually a really good skill set if you want to build a very big company, which, you know, I think at this current moment he has, right?
Like, with OpenAI, there's a lot that you can say about whether they're upholding their original mission; you know, that's up for debate.
But I think that they've obviously been commercially successful so far.
So it feels like Silicon Valley on some level, and I just want to give some thoughts here.
Within the two episodes I'm doing here, the pattern I've seen with Sam Altman is that everyone seems to want him to win.
And there's almost a degree of they will make it so.
Have you seen anyone who's really a detractor or anyone who's not pro Sam Altman?
Because it's interesting how few people are in tech.
Well, there is, I won't get too into it because this is in some future episodes, which will drop in future weeks.
But, you know, I would say, in some of the conversations I've had off the record with people about Sam, I think, you know, my general impressions are people often do find him impressive in terms of what he has gotten done, you know, the size and scale of his ambitions and the way that he has generally been able to make that happen.
I think there's also a lot of people who are willing to privately share some gripes that they might have about him.
Also, in recent weeks, we've seen people be a lot more public about some of those gripes.
We have Helen Toner, a former board member at OpenAI who voted to remove Sam last November, saying publicly in the last few weeks that Sam lied to her and the other former board members, that his misdirection made them feel like they couldn't do their jobs.
And she has also said that people were intimidated to the point where they did not feel comfortable speaking more publicly about negative experiences they'd had with Sam, that they are afraid to speak more publicly about times he has not been honest with them or times in which they've had challenging experiences.
And that has also been reflected in some of the private conversations I've had in which people, you know, they might have complaints or they might have had like challenging situations with him.
And I think they just feel like the risk calculus is not worth it to come out and say something like that.
But, you know, there have been bits and pieces where people have come out and said things; another thing that the board members have said was that Sam had been deceptive and manipulative.
And that's also followed up by, or not followed up.
There was also, I think back in November, a former OpenAI employee who had tweeted something publicly about that, you know, saying that Sam had lied to him on occasion, even though he had also always been nice to him, which I think is a very interesting combination of characteristics.
I'm talking about Silicon Valley, though.
I'm afraid of dealing with them, but they were so nice to me.
And yeah, of course, that person has not elaborated more publicly about what they meant.
I think this is why people are asking themselves these questions, which is like, you know, the more that we hear about what the board was thinking before they decided to fire Sam, I think the more people are wondering about what are the patterns of behavior that he shows that, you know, that led to the board trying to make this drastic move.
Yeah, that's actually an interesting point.
So when Sam Altman was fired from OpenAI, there was this very strange reaction from Silicon Valley, including some in the media, where it was almost like the Hunger Games, everyone doing the salute thing, where everyone was like, oh, we've got to put Sam Altman back in.
Isn't it kind of strange we still don't know why he was actually fired, though?
I mean, Helen Toner has elaborated.
Like, I've never seen, have you seen anything like this in your career?
I think that it has been surprising that there has not been more of a clear answer.
I think, you know, as time has gone on, like, we have heard a little bit more.
Like, I think Helen Toner has, you know, to her credit, tried to give more information in recent weeks about what happened.
I think, you know, people were obviously asking this question six months ago.
And so I think like there's been a little bit of a delay in trying to get this answer.
And I wonder if maybe there just isn't like a very neat answer to it.
And so in that absence, we get this kind of more of a like murky, multifaceted, multi-voiced answer.
But yes, I agree that it is sort of surprising that there hasn't been more clarification on what exactly happened or a little bit more granular detail about what led up to it.
So on to the AI hype in general.
Said that a bit weird, but I'll keep going.
Why do you think there's such a gulf between what Sam Altman says and what ChatGPT can actually do?
What Sam Altman says, what are you talking about specifically?
As in, he says it will be a super smart companion.
Yeah, yeah, yeah.
That he'll be all of these things.
Well, this is something that we get into in episode three, which is a personal interest of mine, which is kind of the psychology of the AI industry right now.
And, you know, what I find so interesting about this and what we try to delve into in episode three and kind of throughout the series is these kind of like extreme projections about AI.
And in the industry, you see both positive ones and negative ones.
And I think, you know, the negative ones, that's what looks like AI doomerism, AI existential risk, sometimes called AI safety, depending on your point of view.
But, you know, it's these projections that, you know, super intelligence might very quickly and very soon learn to self-improve in a way that allows it to rapidly outstrip our control and our capabilities and could lead to the extinction of humanity.
There are so many interesting things to say about the psychology of believing that our human race might either be wiped out or incredibly changed within our lifetimes.
And we get into that in episode three.
I think I really wanted to get into the psychology of someone who believes that AI doom is just around the corner.
We talked to someone who sort of became convinced of this belief soon after the 2016 AlphaGo matches in which the Go playing AI beat the world's champion in Go.
And he talks about, yeah, not long after, you know, deciding not to make a retirement account because he was like, what is the point?
By the time I reach retirement age, either the world will be dramatically different and money won't matter or we'll all be dead.
And I think that even though some people might scoff at that, that's like a real belief that people believe that this, you know, these extreme possible scenarios are in our near future.
And on the other hand, we also see extreme projections in a positive direction, you know, this idea that AI is going to unlock a whole new era of human flourishing that we might expand beyond our planet, that we might be able to give.
Say what?
Abundance.
Abundance.
Right, exactly.
You know, one of the things we do, I believe in episode three is do a little bit of a supercut of Sam Altman talking about abundance.
It's pretty clear that this is a way that he likes to frame our AI future is going to be this future in which everyone has plenty, right?
Everyone has, you know, access to intelligence, abundant energy, abundant access to superintelligence that can help us live kind of our best lives and beyond our wildest dreams.
Right.
And I, you know, obviously Silicon Valley is a place where people like to make grandiose statements, but this is beyond that, right?
This is not just like, you know, we joked about WeWork, WeWork's mission statement was to elevate the world's consciousness.
Like, well, galaxies of human flourishing for eons beyond us, like that is on another scale, right?
Like we're talking about something that is sort of at an unprecedented level of extreme rhetoric.
And I think that's really interesting.
I think it is a very powerful motivator, both in a, you know, in the doomer sense and also in the abundance sense.
People believing that what they're working on is the most important technological leap forward for humanity.
Talk about a motivating reason to work on this technology, right?
Talk about a way to feel powerful or feel like you're making a huge difference.
I think that's a really key part of what's driving a lot of work in AI right now.
It's driving a lot of work, sure.
But with Altman himself, there is this gulf.
It is a million mile gulf between the things he says and what ChatGPT is even, even on the most basic level, capable of doing and will be capable.
And it just feels like, it almost feels like he's become the propagandist for the tech industry.
And it's very strange to me how far that distance is because you've got the AI doomers and the AI optimists, I guess you'd call them.
But Altman doesn't even feel like he's in with either.
The Propaganda of Generative AI (00:16:08)
He's just kind of, he'll say one day that he doesn't think it's a creature.
The next one he'll say it's going to kill us.
Or it all just feels like a PR campaign, but for nothing.
Yeah, it has been interesting to try to answer the question.
You know, one of the questions we tried to answer in the podcast is, does Sam actually believe, you know, because as you mentioned, there are some early clips of him, you know, and when I say early, I mean around the time of founding OpenAI, like 2015 or so.
There's some clips of him talking about, you know, saying somewhat jokingly that AI might kill us all.
But there's also this very famous blog post that he wrote in 2015 in which he says that basically superintelligence is one of the most serious risks to humanity, full stop.
And so it's clear that at some point in his life, he believed kind of what we might now call a more doomery outlook.
But as time has gone on, he has, you know, offered views that are a little bit more measured and more positive.
You know, he tends to, you know, in his big media tour of 2023, he tended to talk about how AI was going to, you know, his projection was that AI would radically transform society, but that it would be net good, right?
That like overall, we would be glad that this happened and that it would improve lives, even if in the short term, or for some people, it might prove to be bringing a lot of challenges as well.
And so it is, you know, I think one of the interesting things about him is it is a little hard to pin down exactly what he thinks.
I think you're right that I wouldn't consider him like a gung-ho effective accelerationist.
I would not consider him a doomer.
He is like somewhere in this large gulf in between there.
But I think he's also smart enough to know that making grandiose projections about what AI could bring is a compelling story, right?
Like is a story that he can help sell by being like a spokesman for it.
And often that is the role of a CEO is to be a really good storyteller, to bring the pitch of the company to the public, to investors, to potential employees, to customers, to try to sell them on this vision of the future.
And I do think Sam is good at that.
There is an interesting tidbit in episode three in which we interview a fiction writer who was actually hired on contract by OpenAI to write like a novella about AI futures and things like that.
And yeah, he just talks a little bit about, you know, the novella is not, I think, in active use within OpenAI, but they did at some point see value in commissioning it.
And I think, you know, something that the author, Patrick House, explains to us is, you know, that OpenAI, just like many other startups, is really motivated by story, right?
And that Sam Altman is inspired by fiction, you know, is inspired by certain kinds of sci-fi.
I think this is not unique to Sam.
Many founders in Silicon Valley, you know, Elon Musk has talked about this as well, are driven to create things in part because of what they read about when they were younger, these dreams of the future.
And so it's just interesting to get his perspective on how motivating a story can be and how motivating this compelling story of like, oh, we're building something that's going to change the course of human history.
Like you just couldn't ask for a more powerful motivating force.
So as Altman accumulates power and as he kind of ascends to the top of OpenAI, do you think he's done there?
Do you think there's going to be another thing he starts?
Because it feels like you've discussed like UBI and all these other things.
Do you think he has grander ideas that he wants to pursue?
Well, obviously, I can't speak to what's inside Sam's head.
No, you don't know the man's mind.
But I mean, past indicators would suggest yes.
Like I think he has proven pretty consistently that he's someone who, you know, as much as he might focus on one project with a lot of effort, like he is cooking things on the side.
Like, this is a man, this is going to be an extended metaphor, but this is a man working at a stove that has like six burners, not one.
And, you know, he, we already know that, what'd you say?
Sorry.
He's got a big house.
He's got multiple houses.
The, you know, we already know that, you know, in addition to running OpenAI, he has funded and/or helped prompt the founding of or has been very involved in investing in or supporting other startups that are part of this kind of ecosystem of businesses that are connected to an AI future or might benefit in an AI future.
So for example, Helion, which is a nuclear fusion company, which he has invested a ton of money into.
I think he has said publicly that his vision is that this is a potential way to provide abundant energy that could then power the technology that we need to improve AI to the level that we're hoping that it can get to or that he's hoping that it can get to.
At the same time, we've talked a little bit about universal basic income.
This has been something that Sam has been a proponent of and an advocate of since at least 2016 when he was running Y Combinator and they started a side research project to study universal basic income by giving cash payments to families in Oakland of, I believe, $1,000 a month.
That research project is still ongoing.
It's now moved away from Y Combinator and is associated with Open Research, which is, I believe, funded by OpenAI.
And so it has kind of moved with Sam to his new role.
And of course, he also co-founded this company called Worldcoin, which used these silver orb machines to take pictures of your iris, to register every individual human as, like, a unique human individual, and to create this eyeball registry by which one could in the future distribute a universal basic income.
So he's funding these energy companies.
He's like involved in this sort of crypto eyeball registry project that will help distribute UBI in this future that he's imagining.
Like I think it's safe to say he's definitely thinking about things beyond just OpenAI for the future and imagining like, okay, well, if we have this piece that's growing, what else would we need to support it?
And I'm sure there are other things he's working on that we don't even know about, right?
Like I know he has also funded some like longevity bioscience projects and things like that.
He's, I guarantee he's thinking about stuff beyond what we know about.
Final question.
Why do you think the entire tech industry has become so fascinated with AI?
Do you think it's just Altman or is it something more?
I do think ChatGPT started heating up this interest that was already percolating a little bit in the tech industry.
But it does seem like something about ChatGPT captured the public imagination, made people imagine very seriously for the first time how AI could affect their lives, their lives individually.
It used to be kind of this abstract thing that was a little farther away.
Or maybe you understood that like you were interacting with AI sometimes, like when you would look at like flight price predictors.
Yeah, exactly.
But I think as we, you know, we talk about this in episode three, but ChatGPT wasn't even new technology.
It was actually just a different user interface on a model that already existed, GPT-3.5.
And so to me, that actually speaks, I guess, to the power of making a technology accessible to everyone in a way that was easy to use. And, for better or for worse, that kind of created this public momentum of people thinking about AI, feeling like it had rapidly increased its capabilities in a short period of time.
And yeah, something about that really captured not just, you know, the minds, but also the hearts of people and like getting them really thinking about like what could a future like this look like.
And I think while some people were excited, a lot of people also reacted with fear, right?
And like, I think in the Valley, like you will hear a lot of people more openly discussing their fears of sort of like job loss or just like dramatic social change that might come about in the next 10 or 20 years.
The feeling I get in conversations that I have in and around San Francisco is, you know, even people who are pretty deep in this technology are uncertain about whether it's going to be overall good or bad.
Like they're just uncertain of how to look back on this time, like whether it will have ended up being a leap forward for humanity or something different.
Altman has taken advantage of the fact that the tech industry might not have any hypergrowth markets left, knowing that ChatGPT is, much like Sam Altman, incredibly adept at mimicking depth and experience by parroting the experiences of those that have actually done things.
Like Sam Altman, ChatGPT consumes information and feeds it back to the people using it in a way that feels superficially satisfying.
And it's quite impressive to those who don't really care about creativity or depth.
And like I've said, it takes advantage of the fact that the techie ecosystem has been dominated and funded by people who don't really build tech.
As I've said before, generative AI, things like ChatGPT, Anthropic's Claude, Microsoft's Copilot, which is also powered by OpenAI's models, it's not going to become the incredible supercomputer that Sam Altman is promising.
It will not be a virtual brain or imminently human-like or a super smart person that knows everything about you because it is at its deepest complexity, a fundamentally different technology based on mathematics and the probabilistic answer to what you have asked it, rather than anything resembling how human beings think or act or even know things.
Generative AI does not know anything.
How can a thing think when it doesn't know?
Eh?
Anyone want to ask Brad Lightcap, Mira Murati, Sam Altman any of these questions just once?
Hear what they fart out?
No?
Well, ChatGPT isn't inherently useless.
Altman realizes that it's impossible to generate the kind of funding and hype he needs based on its actual achievements.
And that to continue to accumulate power and money, which is his only goal, he has to speciously hype it.
And he has to hype it to wealthy and powerful people who also do not participate in the creation of anything.
And that's who he is.
I've been pretty mean about this guy.
I really have, but he does have a skill.
He knows a mark.
He knows how to say the right things and get in the right rooms with the people who aren't really touching the software or the hardware.
He knows what they need to hear.
He knows what the VCs need to hear.
He knows quite aptly what this needs to sound like.
But if he had to say what ChatGPT does today, what would he say?
Yeah, yeah, it's really good at generating a bunch of text that's kind of shitty.
Yeah, sometimes it does math, right?
And sometimes it does it really wrong.
Sometimes you ask it to do it, it can draw a picture, eh?
What do you think of that?
These are all things, by the way, that if like a six-year-old told you, you'd be like, wow, that's really impressive, or like a 10-year-old perhaps, because that's a living being.
ChatGPT does these things and it does it, I know it's cheesy to say, but in a soulless way.
And the reason all of this, the writing and the horrible video and the images, the reason it feels so empty, is because even the most manure-adjacent press release still has gone through someone's manure-adjacent brain.
Even the most pallid, empty copy you've read has gone through someone.
A person has put thought and intention in, even if they're not great with the English language.
What ChatGPT does is use math to generate the next thing.
And sometimes it gets it pretty right.
But pretty right is not enough to mimic human creation.
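That core mechanic, using statistics over past text to "generate the next thing" rather than knowing anything, can be sketched with a toy bigram model. This is purely illustrative (the corpus and names here are made up, and real LLMs are vastly larger neural networks over tokens, not word counts), but it shows the loop: look at the last word, sample a likely next word, repeat.

```python
import random
from collections import Counter, defaultdict

# "Train" by counting which word follows which in a tiny corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev, rng):
    # Sample the next word in proportion to how often it followed `prev`.
    counter = follows[prev]
    if not counter:  # dead end: this word was never seen with a successor
        return None
    words = list(counter)
    weights = [counter[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

# Generate: each step only asks "what plausibly comes next?"
rng = random.Random(0)
out = ["the"]
for _ in range(5):
    nxt = next_word(out[-1], rng)
    if nxt is None:
        break
    out.append(nxt)
print(" ".join(out))
```

The output is fluent-looking locally ("the cat sat...", "the cat ate the fish") without the model knowing what a cat is, which is the gap between "pretty right" and actual understanding that the passage is pointing at.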
But look at Sam Altman.
Look who he is.
What has he created other than wealth for him and other people?
What about Sam Altman is particularly exciting?
Well, he's been rich before.
And his money made him even richer.
That's pretty good.
He was at Y Combinator.
Don't ask too much about what happened there.
Just feels like sometimes Silicon Valley can't wipe its own ass.
It can't see when there's a wolf amongst the sheep.
It can't see when someone isn't really part of the system other than finding new ways to manipulate and extract value from it.
And Sam Altman is a monster created by Silicon Valley's sin.
And their sin, by the way, is empowering and elevating those who don't build software, which in turn has led to the greater sin of allowing the tech industry to drift away from fixing the problems of actual human beings.
Sam Altman's manipulative little power plays have been so effective because so many of the power players in venture capital and the public markets and even tech companies are disconnected from the process of building things, of building software and hardware, and that makes them incapable or perhaps unwilling to understand that Sam Altman is leading them to a deeply desolate place.
And on some level, it's kind of impressive how he succeeded in bending these fools to his whims, to the point that executives like Sundar Pichai of Google are willing to break Google search in pursuit of this next big hype cycle created by Sam Altman.
He might not create anything, but he's excellent at spotting market opportunities, even if these opportunities involve him transparently lying about the technology he creates, all while having his nasty little boosters further propagate this bullshit, mostly because they don't know.
Or perhaps they don't care if Sam Altman's full of shit.
Maybe it doesn't matter to them.
It doesn't matter that Google search is still plagued with nonsensical AI answers that sometimes steal other people's work or that AI in legal research has been proven to regularly hallucinate, which by the way is a problem that's impossible to fix.
It's all happening because AI is the new thing that can be sold to the markets.
And it's all happening because Sam Altman, intentionally or otherwise, has created a totally hollow hype cycle.
And all of this is thanks to Sam Altman and a tech industry that's lost its ability to create things worthy of an actual hype cycle to the point that this specious non-technical manipulator can lead it down this nasty, ugly, offensive, anti-tech path.
The tech industry has spent years pissing off customers with platforms like Facebook and Google actively making their products worse in the pursuit of perpetual growth, unashamedly turning their backs on the people that made them rich and acting with this horrifying contempt for their users.
Tech Faces Harsh Reprimand (00:07:27)
And I believe the result will be that tech is going to face a harsh reprimand from society.
As I mentioned in the Rot-Com Bubble episode, things are already falling apart.
Web traffic is already dropping.
And what sucks is the people around Sam Altman should have been able to see this.
Even putting aside his resume, I've listened to an alarming amount of Sam Altman talk.
And I'm a public relations person.
Who the hell am I?
I'm someone who's been around a lot of people who make shit up.
I've been around a lot of people whose job it is to kind of obfuscate things.
And quite frankly, Altman's really obvious.
I'm not going to do any weird Lie to Me-esque ways of proving he's lying.
He just doesn't ever get pushed into any depth.
No one ever asks him really technical questions or even just a question like, hey Sam, did you work on any of the code at OpenAI?
What did you work on?
I know you can't talk about the future, Sam.
But how close are we actually to AGI?
And if he says, oh, a few years, that's not specific enough, Sam.
How about give me a ballpark?
And then when he lies, again, you say, okay, Sam, how do we get from generative AI to AGI?
And when he starts waffling, say, no, no, no, be specific, Sam.
This is how you actually ask questions.
And when you say things like this, by the way, to technical founders, they don't get worried.
They don't obfuscate.
They may say, I can't talk about the specifics for legal reasons, which is fine.
But they'll generally try and talk to you.
Listen to any interview with any other technical AI person.
Listen to them and then listen to Sam Altman.
He's full of it.
It's so obvious.
And one deeply unfair thing with the Valley is there are people that get held to these standards.
Early stage startups generally do.
The ones that aren't handed to people like Altman or Alexis Ohanian of Reddit or Paul Graham or Reid Hoffman.
They don't get those chances because they're not saying the things that need to be said to the venture capitalists.
They're not in the circles.
They're not doing the right things because the right things are no longer the right thing for the tech industry.
And when all of this falls apart, Sam Altman's going to be fine.
When this all collapses, he'll find something to blame it on.
Market forces, a lack of energy breakthroughs, unfortunate economic things, all of that nonsense.
And he'll remain a billionaire capable of doing anything he wants.
The people that are going to suffer are the people working in Silicon Valley who aren't Sam Altman.
The people that did not get born with a silver spoon in each hand and then handed further silver spoons as they walk the streets of San Francisco.
People that don't live in 9,500 square foot mansions.
The people trying to raise money who can't right now because all the VCs are obsessed with AI.
The people that will get fired from public tech companies when a depression hits because the markets realize that the generative AI boom was a bubble.
When they realize that the most famous people in tech have been making these promises for nobody other than the markets.
Well, the markets need you to do something eventually.
And I just don't think it's going to happen.
And I think that we need to really think, why was Sam Altman allowed to get to this point?
Why did so many people like Paul Graham, like Reid Hoffman, like Brian Chesky, like Satya Nadella, back up this obvious con artist who has acted like this forever?
And what sucks is, I don't know if the Valley is going to learn anything unless it's really bad.
And I don't want it to be, by the way.
I would love to be wrong.
I would love for all of this to just be like, Sam Altman's actually a genius.
Turns out the whole thing was not a con.
No, it's not going to happen.
And I worry that...
There is no smooth way out of this, that there is no way to just casually integrate OpenAI with Microsoft.
Because now there's an antitrust thing going on with Microsoft acquiring Inflection AI, another AI company.
And that's the thing.
It feels like we are approaching a precipice here.
And the only way to avoid it is for people to come clean, which is never going to happen.
Or, of course, for Sam Altman not to be lying.
For AGI to actually come out of OpenAI.
And by the way, it's going to need to be in the next year.
I don't think they've got even three quarters left.
I think that once this falls apart, once the markets realize, oh shit, this is not profitable.
This is not sustainable.
They're going to walk away from it.
When companies realize that generative AI has given them a couple percent profit, maybe, they're going to be pissed.
Because this is not a stock-rally-worthy boondoggle.
This is not going to be pretty.
When things fall apart for NVIDIA, which is still over $1,000, when those orders stop coming in quite as fast, what do you think is going to happen to tech stocks?
Startups are already having trouble raising money.
And they're having trouble raising money because the people giving out the money are too disconnected from the creation of software and hardware.
The only way to fix Silicon Valley perhaps is an apocalypse.
Perhaps is people like Sam Altman getting washed out.
I don't want it to happen.
I really must be bloody clear.
But maybe it won't be apocalyptic.
Maybe it will just be a brutal realignment.
And maybe Silicon Valley needs that realignment.
Because this industry desperately needs a big bath full of ice and they need to dunk their head in it aggressively and wake the hell up.
Venture capital needs to put money back into real things.
The largest tech companies need to realign and build for sustainability so they're not binging and purging staff with every boom.
And if we really are at the end of the hypergrowth era, every tech company needs to be thinking profit and sustainability again.
And that's a better Silicon Valley.
Because a better Silicon Valley builds things for people.
It solves real problems.
It doesn't have to lie about what the thing could do in the future so that it can sell a thing today.
And I realize that sounds like the foundation of most venture capital.
That's fine at the seed stage.
That's fine at this moonshot stage where you're early, early days.
It is not befitting the most famous company in tech.
It is not befitting a multi-billionaire.
It is not befitting anyone.
And it is insulting to the people actually building things, both in and outside of technology.
The people I hear from after every episode, they are angry.
They are frustrated because there are good people in tech.
There are people building real things.
There are people that remember a time when the tech industry was exciting.
When people were talking about cool shit in the future, and then they'd actually do it.
Returning to that is better for society and the tech industry.
Just don't know when it's going to happen.
Media Fails to Make You Intelligent (00:14:40)
Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song is Mattosowski.
You can check out more of his music and audio projects at mattosowski.com.
M-A-T-T-O-S-O-W-S-K-I dot com.
You can email me at ez@betteroffline.com or visit betteroffline.com to find more podcast links and of course my newsletter.
I also really recommend you go to chat.wheresyoured.at to visit the Discord and go to r/betteroffline to check out our Reddit.
Thank you so much for listening.
Better Offline is a production of CoolZone Media.
For more from CoolZone Media, visit our website, coolzonemedia.com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hello and welcome to Better Offline.
I'm your host, Ed Zitron.
It's been a hard couple of weeks.
It's been pretty hard to focus.
I've written a few newsletters, I've gone to Portugal, I've done a bunch of shit, just trying not to think about everything happening outside, but it's time to do so.
Seemingly every single person on earth with a blog or a podcast or even a Twitter account, or X, the everything app, whatever it's called now, they've all tried to drill down into what happened on November 5th to find the people to blame, to explain what could have gone differently.
Really, though, looking for who to blame and finding out why so many actions led to a result that will overwhelmingly harm women, minorities, immigrants, LGBTQ people and lower-income workers is terrifying.
It fucking sucks.
I'm not going to mince words.
Not that I would usually anyway.
And I don't feel fully equipped to respond to the moment.
I don't have any real answers, at least not political ones.
I'm not a political analyst and I'd feel disingenuous trying to dissect either the Harris or the Trump campaigns.
Because I just feel like there's a take Olympics right now.
It's the Dunning-Kruger Festival out there.
Everyone is trying to rationalize and intellectualize these events that ultimately come down to something quite simple.
People don't trust authority.
And yeah, it's pretty ironic that this often leads them towards authoritarianism.
Now, I don't want to give you the impression that I'm going to go on my crank mode, that I'm somehow against institutions on their face.
I'm not.
But at the same time, understanding this moment requires us to acknowledge that institutions have failed us and failed most people and how certain institutions' missteps have led us to exactly where we are today.
Legacy media, and while oftentimes they're staffed by people who truly love their readers and care about their beats, they're weighed down by this hysterical, nonsensical attachment to the imaginary concept of objectivity and the will of the markets.
Case in point.
Regular people have spent years watching the price of goods increase due to inflation, despite the fact that the increase in pricing was mostly driven by, get this, corporations raising their prices.
Now, that's not to say that external factors like the war in Ukraine or lingering COVID restrictions in China, these things did play a role in it.
They did.
But the bulk of these price increases were caused by these fucking companies raising the prices.
It was in their earnings.
It was right there.
Pepsi Cola said it on the news.
Yet some parts of the legacy media spent an alarming amount of time chiding their readers for thinking otherwise, even going against their own reporting, and there will be links in the episode notes, I promise, as a means of providing balanced coverage, insisting again and again that the economy is actually good, contorting their little bodies to prove that prices aren't actually higher, even as companies literally boasted about raising their prices on earnings.
In fact, the media spent years debating with itself whether price gouging was actually happening, despite years of proof that it was.
Some of them even reported that the price gouging was happening.
So like, I get this.
I just don't think people trust authority and they especially don't trust the media, especially the legacy media.
It also probably didn't help that the legacy media implored readers and viewers to ignore what they saw at the supermarket or at the pump and the growing hits to their wallets from the daily necessities of life.
It was just a national level gaslighting and it was disgusting.
And I know some of you might say, you know where to email me.
Oh, it's not just this.
No, of course it's not just this asshole.
But I think this is a big thing.
Now, before I go any further, I've used the term legacy media here repeatedly, but I don't completely intend for it to come across as a pejorative.
Despite my criticisms, and believe me, I've got a few of them.
There are people in the legacy media doing a good job.
They're reporting the truth.
They're doing the kinds of work that matters and they're actually trying to teach their readers stuff and tell them what's happening and giving them context.
I read and pay for several legacy media outlets and I think the world is a better place for them existing, despite their flaws.
The problem, as I'll explain, is this editorial-industrial complex, and how the people writing about the powerful don't seem to be able to, or maybe don't want to, actually interrogate them.
This could be an entire episode on its own, but I don't think the answer to these failings is to simply discard legacy media entirely.
But I want to implore them to do better and to strive for the values of truth hunting and truth telling and actually explaining what's happening and criticizing the people that don't have PR firms and lobbying groups and lawyers and the means to protect themselves from the world.
The time for fucking around is over and we're currently finding out.
Now, anyway, as you know, as a person existing in the real world, the price of everything has kept increasing despite the fact that wages are stagnating.
It's forcing many of the poorest people to choose between food and fuel or, I don't know, eating and having heat.
Simultaneously, businesses have spent several years telling workers they're asking for too much and doing too little, telling people a few years ago they were quiet quitting, which is a fucking stupid term that just means going to your job and doing the thing you're paid to do.
Anyway, anyway.
And a year later, in 2023, they insisted that the years of remote work were actually bad because profits didn't reach the levels of 2021, which supposedly had something to do with remote work.
Now, did anyone actually prove this?
Did anyone actually go and...
No, they didn't.
They just went, well, I just listened to Marc Benioff, who's one of the more evil people alive.
Now, I also think a lot of these problems come to 2021, a year that we really need to dig into more.
We might not do so today, but we will in the future.
But one of the big things that punished workers and led to so many layoffs in 2023 was the fact that we couldn't get back to the post-lockdown boom of 2021, when everyone bought everything always as they left the house for the first time in a while.
Now, any corporation would be smart enough to know that that was a phase, that that was not going to be forever.
Except every single big company seemed to make the same mistake and say, number going up forever, line go up forever.
When it didn't, well, they started punishing workers, and they started thinking, well, could it be that we as companies set unrealistic expectations for the markets and just thought that we'd keep growing forever?
Or maybe it was the people using the computer at home.
Yeah, that seems way better.
Anyway, while the majority of people don't work remotely, from talking to the people I know outside of tech or business, there's this genuine sense that the media has allied itself with the bosses.
And I imagine it's because of the many articles that literally call workers lazy and have done so for years.
Yet when it comes to the powerful, legacy media doesn't seem to have that much piss and vinegar.
They just have much more guarded critiques.
The appetite for shaming and finger wagging is always directed at middle and working class workers and seemingly disappears when a person has a three character job title like CEO.
It's fucking stupid, it's insulting, and yes, it's demoralizing for the average person.
Despite the fact that Elon Musk has spent years telegraphing his intent to use his billions of dollars to wield power equivalent to that of a nation state, as you may remember from my first episode of anything, over on It Could Happen Here.
Too much of the media, both legacy and otherwise, responded slowly, cautiously, failing to call him a liar, a con artist, an aggressor, a manipulator, a racist, a deadbeat dad, you know, all the things actually happening.
No, no, no.
They kind of danced around him.
They reported stories that might make you think that they maybe noticed it, but there was this desperation to guard objectivity.
And it was just...
It lacked any real intent.
It lacked any interest in calling to account a man who has pretty much bought an election for Donald Trump.
A racist billionaire using his outsized capital to bend society to his will just isn't a fucking problem for the media, or at least not as much of a problem as a worker who might not work 50 to 100 hours a week for a boss who makes 130 times what they do.
The news, at least outside of the right wing, is always separate from opinion, always guarded, always safe, for fear that they might piss somebody off and be declared biased.
Something that happens anyway.
And while there are columnists that are given some space to have their own thoughts, sometimes in the newspaper, sometimes online, the stories themselves are delivered with the kind of reserved hmm tone that often fails to express any actual consequences or context around the news itself and just doesn't seem to care about making sure that the reader or listener learns something.
My mate Casey has a good point about podcasts, and I'd apply it to some of the news too, that there's too much stuff out there that is there to make you feel intelligent rather than make you intelligent.
And I think this falls into it.
Now, this isn't to say that outlets are incapable of doing this correctly.
I love the Washington Post.
They've done an excellent job on analyzing major tech stories.
But a lot of these outlets feel custom-built to be bulldozed the moment an authoritarian turns up, a force that exists to crush those desperately attached to norms and objectivity.
Authoritarians know that their ideologically charged words will be quoted verbatim with the occasional little dribble, this drizzle, this spurt of context that gets lost in a headline that repeats exactly what the fucking authoritarian wants it to.
And guess what?
Some people don't read the article.
They just read the headline.
And Musk is the most brutal example of this, by the way.
Despite the fact that he's turned Twitter into a website pumped full of racism and hatred that literally helped make Donald Trump president, Musk was still able to get mostly positive coverage from the majority of the mainstream media for his fucking robo-taxi nonsense.
Despite the fact that he spent the best part of a decade lying about what Tesla will do next, there are entire websites just based on how much Elon Musk lies, yet they still report this shit.
It makes me very upset.
And it doesn't matter that some of these outlets, by the way, had accompanying coverage that suggested that the markets weren't impressed by Tesla's theoretical robo-taxi plans or their fake ass robots run by people.
Musk is still able to use the media's desperation for objectivity against them.
And he knows that they never dare to combine reporting on stuff with thinking about stuff for fear that Elon Musk might say they're biased, which he has been doing for years.
Do you see my goddamn point yet?
And this, by the way, is not always the fault of the writers.
There are entire foundations of editors that have more faith in the markets and the powerful than they do the people writing or the people reading their fucking words.
And above them are entire editorial superstructures that exist to make sure that the editorial vision never colors too far outside the lines or informs people a little too much.
I'm not even talking about Jeff Bezos or Lauren Powell Jobs or any number of billionaires who own any number of publications, but the editors editing business and tech reports who don't know anything about business and tech, or the senior editors that are terrified of any byline that might dare get the outlet under fire from somebody who could call their boss.
It's fucking cowardice.
There are, however, I should add, also those who simply defer to the powerful, that assume that this much money can't be wrong, even if said money, in the case of Elon Musk, is repeatedly wrong.
And there's an entire website about the wrongness and the lies and the bullshit.
And I'm talking about Elon Musk still, obviously.
These editors are the people that look at the current crop of powerful tech companies that have failed to deliver any truly meaningful innovation in years and they go, ooh, ooh, send me more, daddy.
Show me more of the apps.
It's fucking disgraceful.
Just look at the coverage of Sam Altman from the last year.
You know, the guy who spent years lying about what AI can do.
And tell me why every single thought he says must be uncritically catalogued.
His every goddamn decision applauded.
His every claim trumpeted as certain.
His brittle little company that burns $5 billion a year talked about like it's a fucking living god.
Sam Altman is a liar who's been fired from two companies, including OpenAI, and yet because he's a billionaire with a buzzy company, he's left totally unscathed.
The powerful get a completely different set of rules to live by and exist in a totally different media environment.
They're geniuses, entrepreneurs, firebrands.
Their challenges are framed as missteps and their victories framed as certainties by the same outlets that told us that we were quiet quitting and that the economy is actually good and that we're the problem for high prices.
While it's correct to suggest that the right wing is horrendously ideological and they're terribly biased, it's very hard to look at the rest of the media and claim that they're not.
The problem is that the so-called left media, which usually is just the center, isn't biased towards what we may consider left-wing causes like universal healthcare, strong unions, expanded social safety nets, you know, the stuff that would actually be helpful.
No, they're biased in favor of fellating an ever-growing carousel of sociopathic billionaire assholes, elevating them to the status of American royalty, where they exist above the expectations and norms that you and I must live by.
This is the definition of elitism.
The media has literally created a class of people who can lie and cheat and steal.
And rather than condemn them for it, they're celebrated.
Accepting a Sucked Digital Life 00:15:24
While it might feel a little tangential to bring technology into this, I truly believe that everybody is affected by the rot economy, the growth-at-all-costs ecosystem where the number must always go up, because everybody is using technology all the time and the technology in question is getting worse.
This election cycle saw more than 25 billion text messages sent to potential voters, and seemingly every website was crammed full of random election advertising.
Here's the thing about elections.
They're not really always about policy.
No, they're a referendum on the incumbent party or president, and by proxy, a poll on how people feel.
And the reality is that most people are fucking miserable.
There's this all-encompassing feeling that things are just harder now.
It's harder to pay your bills.
It's harder to keep in touch with your friends.
It's harder to start a family.
It's harder to buy a house.
It's harder to fall in love.
It's harder to do everything.
And what we're seeing is an enshittification of existence, to use Mr. Doctorow's phrase.
Everything just...
I don't want to be this much of a curmudgeon, but everything just kind of sucks.
It's all terrible.
It's miserable.
And hardly anyone thinks it's going to get better.
And this creates the kind of fertile conditions for a strong man to emerge.
One who arises and says that only he can fix things, even if he spent four years proving how he could not.
And the problem for Democrats and for institutions more broadly is that the all-encompassing nature of this milieu is kind of hard to solve.
It's hard to change the perception that everything's terrible when you're reminded of it when you're trying to do the most basic of tasks.
Our phones are full of notifications trying to growth hack us into doing things that companies want.
Our apps are full of microtransactions.
Our websites are slower and harder to use with endless demands of our emails and our phone numbers and the need to log back in because they couldn't possibly lose a dollar to someone who dared to consume a Washington Post article.
And yes, I'm talking about the post, which I fucking pay for, despite the fact it logs me out all the time.
Our social networks are so algorithmically charged that they barely show us the things we want them to anymore.
With executives dedicated to filling our feeds full of AI-generated slop because despite being the customer, we're also the revenue mechanism.
Our search engines do less as a means of making us use them more.
Our dating apps have become vehicles of private equity to add a toll to falling in love.
Our video games are constantly nagging us to give them more money.
And despite it costing money and being attached to our account, we don't actually own any of the streaming media we purchase.
We're drowning in spam, both in our emails and our phones.
And at this point in our lives, we've probably agreed to 3 million pages of privacy policies allowing companies to use our information as they see fit.
We get one value transaction with every company.
They get 11.
They get 100.
We really actually don't know because there's no legislation to tell us what they're fucking doing.
And these are the issues that hit everything we do all the time, constantly, unrelentingly.
Technology is our lives now.
We wake up, we use our phone, we check our text, three spam calls, two spam texts.
We look at our bank balance, two-factor authentication check.
We read the news, a quarter of the page is bought by an advertisement asking for our email that's deliberately built to hide the button to get rid of it.
And then we log into Slack and feel a pang of anxiety as 15 different notifications appear in a way that is really not built for us to find what we need, just to let us know something happened.
Modern existence is just engulfed in sludge.
The institutions that exist to cut through it seem to bounce between the ignorance of their masters and this misplaced duty to objectivity.
Our mechanisms for exploring and enjoying the world are interfered with by powerful forces that are just basically left unchecked.
Opening our devices is willfully subjecting us to attack after attack after attack from applications, websites, and devices that are built to make us do things for them rather than let us operate with the dignity and freedom that much of the internet was actually founded upon.
These millions of invisible acts of terror are too often left undiscussed because accepting the truth requires you to accept that most of the tech ecosystem is rotten and that billions of dollars are made harassing and punishing billions of people every single day of their lives through the devices that we're required to use in order to exist in the modern world.
Most users suffer the consequences and most of the media fails to account for them.
And in turn, people walk around knowing something is wrong but not knowing who to blame until somebody provides a convenient excuse like immigrants, like the Democrats, like whatever fucking works because we can't actually call the people out.
The corporations crushing our existence.
Why wouldn't people crave change?
Why wouldn't people be angry?
Living in the current world absolutely fucking sucks sometimes.
It's miserable.
It's bereft of industry and filthy with manipulation.
It's undignified.
It's disrespectful.
And it must be crushed if we want to escape this depressing goddamn world we've found ourselves in.
Our media institutions are fully fucking capable of dealing with these problems.
But it starts with actually evaluating them and aggressively interrogating them without fearing accusations of bias that, as I've said repeatedly, happen either way.
The truth is that the media is more afraid of accusations of bias than they are of misleading their readers.
And while that seems like a slippery slope, and it may very well be one, there must be room to inject the writer's voice back into their work and a willingness to call out bad actors as such, no matter how rich they are, no matter how big their products are, no matter how willing they are to bark and scream that things are unfair as they accumulate more power and money.
We need context in our news.
We need it.
We need it now.
We need opinion.
We need voice.
We need character.
We need life.
Because as long as we follow this bullshit objectivity path, we're screwed.
And if you're in the tech industry and hearing this and saying, oh, the media's too critical of tech, you're flat fucking wrong.
Kiss my asshole.
Everything we're seeing happening right now is a direct result of a society that let technology and the ultra-rich run rampant, free of both the governmental guardrails that might have stopped them and the media ecosystem that might have actually held them in check.
Our default position in interrogating the intentions and actions of the tech industry has become that they will work it out as they continually redefine what work it out means and turn it into make their products worse but more profitable.
Covering Meta, Twitter, Google, OpenAI and other huge tech companies as if the products they make are remarkable and perfect is disrespectful to the reader's intelligence and a disgusting abdication of responsibility, as their products, even when they're functional, are significantly worse, more annoying, more frustrating and more convoluted than ever.
And that's before you get to the ones, like Facebook and Instagram, that are outright broken.
I don't give a shit if these people have raised a lot of money, unless you use that as proof that something is fundamentally wrong with the tech industry.
Meta making billions of dollars of profit is a sign that something is wrong with society, not proof that it's a good company or anything that should grant Mark Zuckerberg any kind of special treatment.
Shove your chains up your ass, Mark.
OpenAI being worth $157 billion as a company that burns $5 billion or more a year to make a product that destroys our environment and has yet to find any real meaning isn't a sign that it should get more coverage or be taken more seriously.
No, it should be a sign that something is broken, that something is wrong with society.
Whatever you may feel about ChatGPT, the coverage it received is outsized compared to its actual utility and the things built on top of it.
And that's a direct result of a media industry that seems incapable of holding the powerful accountable or actually learning about the subject matter in question.
It's time to accept that most people's digital life fucking sucks, as does the way we consume our information, and that there are people directly responsible.
Be as angry as you want at Jeff Bezos, whose wealth and the inherent cruelty of Amazon's labor practices makes him an obvious target.
But please don't forget Mark Zuckerberg, Elon Musk, Sundar Pichai, Tim Cook, and every single other tech executive that has allowed our digital experiences to become fucked up through algorithms that we know nothing about.
Similarly, governments have entirely failed to push through any legislation that might stop the rot, both in terms of the dominance and opaqueness of algorithmic manipulation and the ways in which tech products exist with few real quality standards.
We may have, at least for now, consumer standards for the majority of consumer goods, but software is left effectively untouched, which is why so much of our digital lives are such unfettered dog shit.
And if you're hearing this and saying I'm being a hater or a pessimist, shut the fuck up.
I'm tired of you.
I'm so fucking tired of being told to calm down about this as we stare down the barrel of four years of authoritarianism built on top of the decay of our lives, both physical and digital, with a media ecosystem that doesn't do a great job explaining what's being done to people in an ideologically consistent way.
There's this extremely common assumption in the tech media, based on what I'm really not sure, that these companies are all doing a good job and that good job means having lots of users and making lots of money.
And it drives tons of editorial decision making.
If three quarters of the biggest car manufacturers were making record profits by making half of their cars with a brake that sometimes didn't work, that'd be international news.
Government inquiries would happen.
People would go to prison.
And this isn't even conjecture.
It actually happened.
Volkswagen was caught deliberately programming its engines to meet emission standards only during laboratory testing, leaving them to spew excessive pollution into the real world.
But once lawmakers found out, they responded with civil and criminal action.
The executives and engineers responsible were indicted.
One received seven years in jail.
And their former CEO is currently being tried in Germany and has been indicted in the US too.
And here we are in the tech industry.
Facebook barely works, has been used to ignite genocides and bully people and harass teen girls.
Pedophiles run rampant on there.
There was a Wall Street Journal story about it.
They're fine.
So much of the tech industry, consumer software like Google, Facebook, Twitter, and even ChatGPT and business software from companies like Microsoft and Slack.
It sucks.
It sucks.
It's bad.
You use it every day.
You've been listening to me ramble for 50 episodes now.
You know what I'm talking about.
It's everywhere.
Yet the media covers it just like, eh, you know, it's just how things are, mate.
Now, Meta, by the admission of its own internal documents, makes products that are ruinous to the mental health of teenage girls.
And it hasn't made any substantial changes as a result.
Nor has it received any significant pushback for failing to do so.
Little bit of a side note here.
Big shout out to Jeff Horwitz and the rest of the Wall Street Journal people who did the Facebook files.
There are legacy media people doing a good job on this.
Nevertheless, Meta exercises this reckless disregard for public safety, kind of like the auto industry in the 60s.
And that was when Ralph Nader wrote Unsafe at Any Speed.
And his book, it actually brought about change.
It led to the Department of Transportation and the passage of seatbelt laws in 49 states and a bunch of other things that can get overlooked.
But the tech industry is somehow inoculated against any kind of public pressure or shame because it operates in this completely different world with this different rulebook and a different criteria for success as well as this completely different set of expectations.
By allowing the market to become disconnected from the value it creates, we enable companies like, I don't know, NVIDIA that reduce the quality of their services so they make more money for their GeForce Now service or Facebook.
They can just destroy our political discourse or they can facilitate genocide in Myanmar and then, well, they get headlines about how good a CEO Mark Zuckerberg is and how cool his chains are and how everything's just fine with Facebook and they're making more money.
No, no.
I actually want to take a step back though.
I want to take a little bit of a step back.
I previously mentioned, I said it twice now, oh, Meta enables genocide and it destroys our political discourse.
I want to be clear.
When I say that everything is justified at Meta, I'm actually quoting their chief technology officer.
That's quite literally what Andrew Bosworth said in an internal memo from 2016, where he said that, and I quote, hmm, all the work Facebook does in growth is justified.
Even if that includes, and I'm quoting him directly, somebody dying in a terrorist attack coordinated using Facebook's tools.
Now, the mere mention of violent crime is enough to create reams of articles questioning whether society is safe and whether we need more plastic in our Walgreens.
Yet our digital lives are this wasteland that people still discuss like a utopia.
Seriously, putting aside the social networks, have you visited a website on the phone recently?
Have you tried to use a new app?
Have you tried to buy something online starting with a Google search?
Within those experiences, tell me, has anything gone wrong?
You know it.
I know it has.
You know it has.
It's time to wake up.
We, the users of products, we're at war with the products we're using and the people that make them.
And right now, we are losing.
The media must realign to fight for how things should be.
This doesn't mean that they can't cover things positively or give credit where credit is due or be willing to accept that something could be cool.
But what has to change is the evaluation of the products themselves, which have been allowed to decay to a level that has become at best annoying and at worst actively harmful for society.
Our networks are rotten.
Our information ecosystem is poisoned, with its purest parts ideologically and strategically concussed.
Our means of speaking to those that we love and making new connections are so constantly interfered with that personal choice and dignity is all but removed.
But there is hope.
There really is.
Those covering the tech industry right now have one of the most consequential jobs in journalism if they choose to fucking do it.
Those willing to guide people through the wasteland, those willing to discuss what needs to change, how bad things have gotten, and hold the powerful accountable and say what good might look like, have the opportunity to push for a better future by spitting in the faces of those ruining it.
I don't know where I sit, by the way.
I don't know what to call myself.
Am I legacy media?
I got my start writing in print magazines.
Am I an independent contractor?
Am I an influencer?
Am I a content creator?
I truly don't know and I don't know if I care.
But all that I know is that I feel like I'm at war too and that we, if I can be considered part of the media, are at war with people that have changed the terms of innovation so that it's synonymous with value extraction.
Technology is how I became a person, how I met my closest friends and loved ones.
And without it, I wouldn't be able to write.
I wouldn't be able to read this podcast.
I wouldn't have got this podcast.
And I feel this poison flowing through my veins as I see what these motherfuckers have done and what they're continuing to do.
And I see how inconsistently and tepidly they're interrogated.
Now is the time to talk bluntly about what is happening.
The declining quality of tech products, the scourge of growth hacking, the cancerous growth at all cost mindset.
These are all the things that need to be raised in every single piece.
And judgments must be unrelenting.
The companies will squeal, ooh, that they're being so unfairly treated by the biased legacy media.
Oh, oh, save me.
Hey, Nilay Patel, in your interview with Sundar Pichai.
This is how you sounded when you handed him your phone.
It was pathetic.
They should be scared of you, Nilay.
The powerful should be scared of the media.
They shouldn't be sitting there sending letters to the editor like fucking customer support.
No.
They should see this podcast.
They should see these newsletters.
Users Hold the Real Power 00:07:51
They should see everything published by the tech media and go, uh-oh.
And there can be good people.
There can be good boys and girls and others.
There can be plenty of people that make good products and get great press for it.
But do you really think Meta, Google, Apple to an extent, frankly, do you think Amazon looks good right now?
Do you think it's easy to find stuff?
Or do you think it's slop full of more slop?
Mark Zuckerberg said on an earnings call the other day that he intends there to be an AI-specific slop feed.
That should...
These are harmful things.
This is pouring vats of oil into rivers and then getting told you're the best boy in town.
These companies, they're poisoning the digital world and they must be held accountable for the damage they're causing.
Readers are already aware, but, and this is really thanks to members of the media, by the way, they're gaslighting themselves into believing that, oh, I just don't keep up with technology.
It's getting away from me.
I'm not technical enough to use this.
When the thing that they don't get, that the average person doesn't get is that the tech industry has built legions of obfuscations, legions of legal tricks, and these horrible little user interface traps specifically made to trick you into doing things, to make the experience kind of subordinate to getting the money off of you.
And I think that this is one of the biggest issues in society.
And yes, of course I'm biased.
I'm doing a podcast about tech.
But for real though, billions of people use smartphones.
Billions of people are on the computer every day.
It's how we do everything.
And it stinks.
It stinks so bad.
This is the rot economy.
We're in the rot society.
But things can change.
And for them to change, it has to start with the information sources.
And that starts with journalism.
And the work has already begun and will continue.
But it must scale up and it must do so quickly.
And you, the user, have the power.
Learn to read a privacy policy.
And the link there is to the Washington Post.
Yes, there are plenty of great reporters there.
Fuck Bezos.
You can move to Signal, which is an encrypted messaging app that works on just about everything.
Get a service like DeleteMe.
And by the way, I pay for it.
I worked with them, like, four years ago.
I have no financial relationship with them, but they're great for removing you from data brokers.
Molly White, who's a dear friend of mine and an even better writer, who you might remember from one of the early episodes about Wikipedia, has also written this extremely long guide about what to do next that I'll link to in the notes.
And it runs through a ton of great things you can do.
Unionization, finding your communities, dropping apps that collect and store sensitive data and so on.
I also heartily recommend Wired's Guide to Protecting Yourself from Government Surveillance, which is linked in the show notes.
Now, before we go, I want to leave you with something that I posted on November 6th on the Better Offline Reddit.
The last 24 hours have felt bleak and will likely feel more bleak as the months and years go on.
It'll be easy to give in to doom, to assume the fight is lost, to assume that the bad guys have permanently won and there will never be any justice or joy again.
Now's the time for solidarity, to crystallize around ideas that matter, even if their position in society is delayed.
Even as the clouds darken and the storms brew and the darkness feels all-encompassing and suffocating.
Reach out to those you love and don't just commiserate.
Plan.
It doesn't have to be political.
It doesn't even really have to matter.
Put shit on your fucking calendar.
Keep yourself active and busy.
And if not distracted, at very least, animated.
Darkness feeds on idleness.
Darkness feasts on a sense of failure and a sense of inability to make change.
You don't know me very well, but know that I'm aware of the darkness and the sadness and the suffocation of when things feel overwhelming.
Give yourself some mercy.
And in the days to come, don't castigate yourself for feeling gutted.
Then keep going.
I realize it's little solace to think, well, if I keep saying stuff out loud, things will get better.
But I promise you doing so has an effect and actually matters.
Keep talking about how fucked up things are.
Make sure it's written down.
Make sure it's spoken cleanly and with the rage and fire and piss and vinegar it deserves.
Things will change for the better, even if it takes more time than it should.
Look, I know I'm imperfect.
I'm emotional.
I'm off kilter at times.
I get emails saying that I'm too angry.
I'm sorry if it's ever triggered you.
I really do mean that.
It's not intentional.
I just feel this in everything I do.
I use technology all the time and it is extremely annoying, but also I'm aware that I have privilege.
And the more privilege you have within tech, the more you're able to escape the little things.
Go and buy a cheap laptop today.
Try and see what a $200, $300 laptop is.
It's slow.
It's full of 18 pop-ups trying to sell you access to cloud storage, the shit that you'll never use, tricking grannies and people who can't afford laptops or people that just don't know.
When I see this stuff, it enrages me.
Not just for me, but because I know that I'm at least lucky enough to know how to get around this shit.
Spent most of my life online, spent most of my life playing with tech.
I know how it works.
And I know I have my tangents and my biases, but I wear them kind of like my heart on my sleeve.
I care about all this stuff in a way that might be a little different to some.
And it's because I've watched an industry that really made me as a person, that allowed me to grow as a person, to actually meet people, to not feel as alone.
And I imagine some of you feel like this too.
And then watching what happens to it every day, watching the people who get so rich off of making it so much worse, and then seeing what happened on November 5th.
And you can draw a line from it.
People are scared.
They're lost.
Their lives are spent digitally, and their digital lives are just endless terrorism.
Endless harm.
Some of you know your way around tech so you can escape some of it, but it's impossible to escape all of it.
Try meeting people these days.
You can't.
Everything is online.
And everything online, everything in your phone is mitigated and interfered with.
It's an assault on your senses, one deprived of dignity.
And I see the people doing this.
And it fills me full of fucking rage.
And it makes me angry for you and for me.
For my son growing up in what will probably be a worse world.
For my friends and loved ones who are harder to see, harder to speak to, whose lives too are interfered with.
And there are the millions and millions of people who have no fucking idea it's happening.
They just exist in this swill, in this active digital terrorism.
Poked and prodded and nagged and notified constantly.
Early on in this show, I got a message saying, don't tell people to be angry.
And I stick by that.
But I'm not going to hide that I am.
I'm not going to hide the pain I feel.
I'm not going to hide the pain I feel seeing this shit happen.
And I've watched this thing that I love, technology.
Really do love tech.
I really do deeply.
I've watched it corrupted and broken and the people breaking it.
They don't just make billions of dollars.
They get articles and they get interviewed on the news.
Mark Zuckerberg, he wears a chain and there's articles about how cool he is.
He should be in fucking prison.
He should be in a prison on a boat that just circles the world, and he shouldn't have air conditioning or heat, depending on the weather.
And I know that I'm kind of errant and again, tons of tangents, but look, the reason I'm like this is because I really care.
And I think caring, I think being angry at the things that actually matter and giving context as a result, I think that's deeply valuable.
And I realize I do fly off the handle a lot, but it really is because I care.
Mark Zuckerberg Deserves Prison00:04:21
I care about you.
I care about the subject matter.
I'm so grateful and so honored that you spend your time listening to me every week and hope you'll continue to do so because I'm not going anywhere.
Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song is Matt Osowski.
You can check out more of his music and audio projects at mattosowski.com.
M-A-T-T-O-S-O-W-S-K-I dot com.
You can email me at ez@betteroffline.com or visit betteroffline.com to find more podcast links and, of course, my newsletter.
I also really recommend you go to chat.wheresyoured.at to visit the Discord, and go to r/betteroffline to check out our Reddit.
Thank you so much for listening.
Better Offline is a production of CoolZone Media.
For more from CoolZone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
In 2023, Bachelor star Clayton Echard was accused of fathering twins, but the pregnancy appeared to be a hoax.
You doctored this particular test twice, Miss Owens, correct?
I doctored the test once.
It took an army of internet detectives to uncover a disturbing pattern.
Two more men who'd been through the same thing.
Greg Gillespie and Michael Marcini.
My mind was blown.
I'm Stephanie Young.
This is Love Trapped.
Laura, Scottsdale Police.
As the season continues, Laura Owens finally faces consequences.
Listen to Love Trapped podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
10-10, shots fired, City Hall building.
How could this ever happen in City Hall?
Somebody tell me that.
A shocking public murder.
This is one of the most dramatic events that really ever happened in New York City politics.
They screamed, get down, get down.
Those are shots.
A tragedy that's now forgotten.
And a mystery that may or may not have been political, that may have been about sex.
Listen to Rorschach, murder at City Hall on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.