March 14, 2021 - I Don't Speak German
01:28:26
82: Scott Alexander & Slate Star Codex, with David Gerard and Elizabeth Sandifer

Another special episode this time, as we welcome to the show David Gerard, author of acclaimed takedown of cryptocurrency Attack of the 50ft Blockchain, and Elizabeth Sandifer, author of acclaimed takedown of neo-reaction Neoreaction a Basilisk, to discuss Scott Alexander (or is it Siskind?) and his blog Slate Star Codex, with digressions into LessWrong, 'rationalism', and why David is responsible for Grimes hooking up with Elon Musk. Content warnings apply, as ever. Also, once again we have audio that doesn't quite meet our recent standards, though it's perfectly listenable.

Podcast Notes:

Please consider donating to help us make the show and stay independent. Patrons get exclusive access to one full extra episode a month.

Daniel's Patreon: https://www.patreon.com/danielharper
Jack's Patreon: https://www.patreon.com/user?u=4196618
IDSG Twitter: https://twitter.com/idsgpod
Daniel's Twitter: @danieleharper
Jack's Twitter: @_Jack_Graham_
IDSG on Apple Podcasts: https://podcasts.apple.com/us/podcast/i-dont-speak-german/id1449848509?ls=1

Episode Notes:

Elizabeth's Twitter: https://twitter.com/ElSandifer
Elizabeth's site: http://www.eruditorumpress.com/blog/
Elizabeth's Patreon: https://www.patreon.com/elizabethsandifer
David's Twitter: https://twitter.com/davidgerard
David's site: https://davidgerard.co.uk/blockchain/
David's Patreon: https://www.patreon.com/davidgerard/overview
Elizabeth's recent article on Scott Alexander/Siskind: http://www.eruditorumpress.com/blog/the-beigeness-or-how-to-kill-people-with-bad-writing-the-scott-alexander-method/
RationalWiki on Scott Alexander: https://rationalwiki.org/wiki/Scott_Alexander


This is I Don't Speak German.
I'm Jack Graham, he him, and in this podcast I talk to my friend Daniel Harper, also he him, about what he learned from years of listening to today's Nazis, white nationalists, white supremacists, and what they say to each other when they think we're not listening.
Be warned, this is difficult subject matter.
Content warnings always apply.
Hello and welcome to episode 82 of I Don't Speak German, the podcast that blah, blah, blah, blah, blah.
And yeah, this is another very special episode this week.
We have more people on the show than have ever been on before.
Daniel, what's up with you then at the moment?
I'm doing all right.
It warmed up here finally.
So all the snow has melted, which just means that the iciness of my heart is not matched by the experience of being outside, which is fine, I suppose.
This is a podcast about the weather now.
You didn't realize that we shifted, but that's... We're just going to be covering climate change on a very, very incremental basis.
No, it's not about climate.
It's about the weather.
But joining us for this episode, we have two old cronies of mine, particularly Elizabeth Sandifer.
Hello there.
Hello.
Hello.
And also, we're very honoured, as well as being honoured to have El here, we're very honoured also to have David Gerard with us.
Good evening.
So if you could just let us know who you are and what the hell you're doing on this show.
El, you start.
I'm Elizabeth Sandifer.
I am the writer of a book called Neoreaction a Basilisk, which looks at the alt-right movement and also this weird thing that we're going to be talking more about called the rationalist subculture.
And some people like it and think it's good.
The book.
Nobody likes Neoreaction at all.
And I'm David Gerard.
I am the person who helped Elizabeth write Neoreaction a Basilisk, and I've been following these guys for about a decade now.
I do other things as well, mostly talking about why Bitcoin is stupid.
That was also inspired by Elizabeth as well.
You haven't written any books, have you, David, that you'd like to plug?
I have written two books.
I have written Attack of the 50 Foot Blockchain, which is my self-published hit, and Libra Shrugged: How Facebook Tried to Take Over the Money, which is a really, really good book, but it sort of is selling slowly because nobody actually cares about Facebook's dumb crypto idea anymore.
So today we are going to be talking about Slate Star Codex, aka, I don't know, should we say the name?
He might get mad at us if we say his name, which has been publicly available for years, but apparently he threw a shit fit over it.
Scott Siskind.
Who wrote the quote-unquote rationality blog Slate Star Codex. He has long been on my radar as someone who is worth covering on this podcast eventually. There was a recent New York Times article about Mr. Siskind, a very anodyne New York Times article, talking about him and his connections to kind of the larger Silicon Valley community and how the rationalist community is.
Well, how it behaves. And I knew we needed to do the podcast, and I thought who better to cover this with us than these two individuals, who are probably among the most knowledgeable people in the world about this subculture and this person. Thank you to both of you for showing up. And I guess the place we should start,
Because I know you were both consulted for the New York Times piece is, you know, what is this New York Times piece?
And so what's the experience of being involved with it?
I mean, to some extent the experience of being involved with it was being interviewed eight months ago and then suddenly finding out the article is finally coming out.
But as I understand the story, the reporter Cade Metz was doing a very bland, oh look at this important person in the tech sphere article on Slate Star Codex.
Which is written under the pen name Scott Alexander.
And in the course of that he intended to mention that Scott Alexander is actually this gentleman named Scott Siskind.
And Siskind threw a complete shit fit about it.
Were you interviewed before or after the shit fit?
I was after the shit fit, because it was the shit fit, and the fact that Siskind directed all of his followers to, I mean, did not, like, say, hey, everyone go harass this guy, but the effect of his shit fit was hundreds of people harassing
Cade Metz and the New York Times. And it turns out that if you swarm the New York Times with a harassment mob, they start to wonder what sort of asshole you are and notice things like your repeated support of eugenics, which turns out to change the tone of the article about you, shockingly enough.
And at that point, someone suggested to Metz that I would be a good person to talk to and that David would be a good person to talk to, and we both got roped in to provide background on that whole eugenics thing.
I'm sorry to be the idiot in the room, but I basically don't know anything about this blog or this person, so I was wondering if one of you kind people could maybe go back to basics and give us the 101 guide to Slate Star Codex and who the fuck this Alexander-stroke-Siskind guy is that's having all these shit fits.
Sure.
So Slate Star Codex is a blog.
It comes from the rationalist subculture, which is the LessWrong mob.
Who unkind people like me would call the Roko's Basilisk people.
Even that is perhaps too kind, but please continue.
I love how we've just instantly jettisoned the, you know, for beginners thing.
It's wonderful.
I'll back it up further than David.
They're a group that emerged out of the San Francisco Bay, Silicon Valley, Transhumanist community around a guy named Eliezer Yudkowsky, who was a blogger who wrote about how to be more rational in your thinking, nominally, and in practice wrote about how cool it would be to be reincarnated on a computer forever.
Yudkowsky runs a completely crank AI research facility that does not actually do anything that can meaningfully be called AI research.
And he is bankrolled, or at least was, by Peter Thiel.
And Scott Alexander slash Siskind is part of what's called the LessWrong diaspora.
Yudkowsky's site was Less Wrong.
When Yudkowsky wound down his blogging, a bunch of other people stepped in, of which Scott Siskind was one.
He was one of the best writers on Less Wrong at the time.
He used to keep his essays under about 10,000 words.
That's saying very little.
We're comparing him to Eliezer Yudkowsky, who is a good pop science writer, right?
When he's writing about things that are actually real, he does a really nice approachable thing.
The trouble is that he then takes that style and uses it to advance crankery, and his followers go, wow, that's amazing!
So that's the problem with rationalists.
Siskind wrote on Less Wrong, and he was quite popular, because he's a good writer.
And that was when he was even slightly constrained. So he got his own blog and he was no longer constrained.
You would not believe, I don't know how he writes that many words a week, but boy can he crank out the copy.
I think it would be an interesting experiment to give Siskind, like, an editor and a word count.
I'd really like to see that happen.
But yeah, five to ten thousand word essays three times a week are nothing unusual.
His tone is sort of polite, interested, quirky, casually full of the worst possible comparisons, odd spot of Charles Murray apologetics, that sort of thing.
It's a very nice sort of style and has a lot of fans who think he is one of the best writers they've ever seen.
They presumably have not read very many writers.
I think that if you expect rationalists to have done the reading on any topic whatsoever, you will be disappointed.
I must point out, and this is important, for the purposes of I Don't Speak German: this is not a Nazi subculture.
Oh yeah, that is an important note to hit.
Yeah, we have to do that whenever we talk about people that aren't Nazis.
But it is totally the intellectual dark web and he is also totally enthusiastic about human biodiversity, that is, scientific racism.
Racism, yeah.
But I'd say Yudkowsky, definitely not a Nazi, only bankrolled by them.
Siskind, who fucking knows, honestly.
I mean, it was great.
So, for about eight years, me and people like me would read Siskind's works and look at all the dog whistles.
If you go into the comments, you see where the dog whistles went down to the contrabass dog trumpet.
So we spent eight years on this, and then the New York Times piece came out, and it dared to say that Scott Siskind had an interest in the ideas of Charles Murray.
Lots and lots of earnest centrists got deeply upset at this because how dare you say that he might like any ideas that Charles Murray liked.
So one of Siskind's ex-friends leaked an email from 2014, when they were still on good terms, talking about what he wanted to do with Slate Star Codex and what his interests were.
I'll just quote you from it.
I said a while ago I'd collect lists of importantly correct neoreactionary stuff to convince you I'm not wrong to waste time with neoreactionaries.
That's just a great start right there.
The neoreactionaries were also part of the same Bay Area transhumanist subculture, which is more broadly just Silicon Valley startup culture.
Curtis Yarvin is your man, Mencius Moldbug.
The book to read on this is Elizabeth's book, Neoreaction a Basilisk, which is a fantastic book, and I am not just saying that because my name is in the first sentence.
Yarvin wrote a million words of political theory, which was basically... He wrote a million words.
Sorry?
Well, he wrote a million words.
A million words that were on the topic of politics.
And he convinced people that he was a goddamn genius and they should emulate his style.
So you had a lot of these quasi-fascist bloggers who were writing with fabulous logorrhoea, but really, really badly.
Like, even worse.
These people more or less fell into quiescence around late 2015, early 2016, which is when their less intellectual descendants, the alt-right, started coming to prominence.
And anyone listening probably knows that story.
Yeah, several of the figures that we've covered, at least glancingly, speak very highly of their time in the neoreactionary community.
In particular, the formerly TRS-affiliated Identity Dixie crowd will heavily reference their time with NRX and some of those quote-unquote thinkers in that crowd.
Yes, and in addition, all of the kind of, like, early HBD stuff. A lot of the, you know, there was an old podcast called the Darwin Digest, and they would kind of reference a lot of those ideas. There's a woman who I haven't seen around lately named Cathedral Princess, or called herself Cathedral Princess, and that obviously, for anybody who knows that community, is very intimately connected to, you know, neoreaction stuff. So there is a clear ideological through line from
this kind of stuff that we're talking about, with the neoreactionaries, with this kind of rationalist community, and the kind of early version of what became the alt-right a year or two later.
Just to draw that direct line.
There's also some pretty strong evidence that the sort of money intellectual figures behind Trumpism and the alt-right, your Steve Bannon, your Robert Mercer, are all people who have read Mencius Moldbug and think he's really smart.
There's not quite an outright smoking gun on that, but there's a lot of reason to think.
I'm pretty sure Bannon's actually said it, or has had it said, that he actually was into Yarvin.
And of course Yarvin is a good friend of Thiel's.
They're mates.
And of course Thiel directly employs Eric Weinstein, who's the brother of Bret Weinstein, who's the husband of Heather Heying, and we've done multiple episodes on that crowd. And they are at the core of what is now the intellectual dark web, who, you know, are not Nazis, but definitely traffic in a lot of the same talking points, with
the overt anti-Semitism and, you know, the worst of the explicit politics filed off.
So again, this is all part of the same kind of messy mix of individuals.
And part of what we're doing here is to sort of separate that and kind of understand where the threads are.
But part of it is also to kind of go, yeah, you may think you're a reasonable centrist, but if you're lying down too closely with Nazis, people will draw conclusions.
So please, David, continue.
Where I stress yet again that this is not a Nazi blog!
I have to add the, as such.
Because Steve Sailer shows up in those blog comments fairly regularly, as I understand, right?
Yes.
In the leaked email from 2014, where Scott's setting out how much he loves neoreactionaries and he really wants this blog to be neoreactionary friendly, and talks about how much he loves human biodiversity, he says, one, HBD is probably partially correct, or at least very non-provably not correct, unquote.
Now, that's how to do word salad in a single goddamn sentence, but the thing is that he's trying to make himself sound reasonable and moderate about it.
However, this is what PZ Myers calls halfway to crazy town.
Like, the compromise position on HBD is not, well, maybe it's partially true.
The compromise position is, no, we literally know the history of this stuff and it's pernicious bollocks.
So, can I get a philosophical justification for basing your politics on something that was, what was it you said, very possibly non-provably non-correct or something?
That sounds like a great idea.
Well done, guys.
And the first things he quotes as evidence are Steve Sailer and scientific luminaries such as HBD Chick.
So really, we spent eight years trying to divine the dog whistles, and then his ex-friend just tweets it out.
The tweet in question, the guy who posted it deleted it a few hours after he posted it, by which time it had of course been archived and spread everywhere.
No one has actually denied the tweet though, and Yudkowsky basically declared the guy persona non grata for daring to attack him, or Scott, who is one of his dearest friends.
Because, you know, they're not a cult or anything like that.
Right.
You also have a LiveJournal exchange that's been screenshotted and going around a bit from 2012, in which someone informs Siskind that, quote, paying undesirables to be sterilized is happening, and links to a charity that pays drug addicts money to get vasectomies, and Siskind replies, I actually think I'm probably going to donate to that charity next time I get money.
So, again, you have a really long history of really fucked up eugenics shit on the part of Scott.
Mike Enoch commented on that charity at the time on his podcast, approvingly, by the way.
That's fantastic.
Well, there you go!
So, yeah, it's – they're not Nazis, but they're very possibly non-provably non-Nazis.
Right.
It's definitely very cuddly.
So yeah, there's a number of public issues about this.
Like when, in this email from 2014, Siskind says to his friend, quote, I will appreciate if you never tell anyone I said this.
Not even in confidence.
And by, quote, appreciate, unquote, I mean that if you ever do, I'll probably either leave the internet forever or seek some sort of horrible revenge, unquote.
Oh, if only!
So, it turns out that Scott knew very well indeed that he had a whole bunch of landmines just waiting to go off, and I'm actually amazed it took seven years.
So, when the New York Times article came about, Scott will have thought, oh my god, my name is going to come out.
So, in his day job, Scott is a doctor, he's a psychiatrist. And so the thing about the New York Times article and his real name, there are arguments both ways, okay?
Let's be fair.
Let's be centrist about this.
His name is Scott Siskind.
Usually, if you're in the New York Times, because your thing is important, they're going to put your name.
On the other hand, he's a psychiatrist, and he didn't want it revealed.
That would actually be a problem.
He's had ex-patients harass him before.
So, I mean, that's the reason why his name wasn't on the RationalWiki article for many years, even though it was somewhat public knowledge, because, you know, he was actually being hassled by dicks.
And there's plenty to complain about him otherwise.
Wouldn't his ex-patients have known his name?
They have known his name.
It might not be obvious here, but...
But anyway, he actually got harassment over blog stuff.
He claimed it was an ex-patient.
No, it doesn't quite add up for me either, but, you know, I'm going to take the claim because there are stupid people on the internet.
That's true.
I mean, I could totally believe that an ex-patient, like, discovered Scott Alexander was Scott Siskind, was unhappy with their care, and, you know, harassed Scott Siskind.
I don't find that, like, implausible.
Yeah.
This is something that happens to psychiatrists.
Like, even ones who aren't raging eugenicists.
I would say the obvious flip side of this is certainly if my psychiatrist was a raging eugenicist, I would rethink my care.
And yes, that's the other side of the argument to use his name in the New York Times out of sheer public interest.
But anyway, the article was going to come out in June 2020.
The first the world knew of this was when Scott shut down his blog and threw an absolute shit fit and told his fans, this is the guy, this is where to contact him, be polite, please.
Reader, they were not polite.
It turns out that when you have an audience like that and you tell them, be polite, please, they aren't polite.
I'm just trying to imagine it with my smaller-than-Scott-Siskind's audience.
If I said, there's this person who is going to come and harass me.
I think you should know that this is happening.
Please do not go harass this person.
I mean, It's just a bad look.
Here's his contact details.
I'm not saying harass him, but if you think this is bad, maybe email the New York Times and here's the contact information.
That's not the thing you do when you really don't want this person harassed.
It was really good because Scott was determined that using the name Siskind would constitute doxxing.
Now, this is an exciting new definition of the word doxing, where you use your name publicly all over the place for many, many years.
Everyone knows it's your name.
You publish Slate Star Codex blog content in academic works under the name Siskind, but it's doxing if someone tells you.
Admittedly, the New York Times is a new level up, so he's got an argument there, but it's sort of not doxing when you 100% did it yourself.
I find that bit implausible.
But the one person who did get doxxed, of course, was Cade Metz.
Because rationalists are lovely.
It's the most rational thing.
What I love is this is the mistake rationalists keep making.
They scream bloody murder about how no one should pay attention to this thing.
That has never gone wrong for the rationalists before.
It's quite amazing.
Like, there was a RationalWiki article on LessWrong in 2010.
Nobody read RationalWiki then either.
But they could not not obsess over the fact that someone had an opinion on them.
And it was like, Daniel, why are you paying this much attention?
We are nobody.
Why do you care what our pissy little blog says about you?
I have very intimate personal experience with this.
Yes, thank you.
Please go ahead.
And then, of course, there's Yudkowsky's famous all-caps freakout about how no one should pay attention to Roko's Basilisk, which immediately became the thing everyone best knows him for.
Like, has this man ever been on the internet?
No, you know, when I was promoting Neoreaction a Basilisk, I was really thrilled when Eliezer Yudkowsky made a Tumblr post about how no one should talk about this book and it's evil.
I was like, oh, thank you!
That is exactly the promotion you want from Eliezer Yudkowsky, is for him to really do his signature move like that.
You realize, of course, we're going to have to explain Roko's Basilisk now.
Yeah, now everyone knows Roko's Basilisk at this point.
That's permeated into the pop culture.
This is true.
Yes, I mean, it is my fault that Elon Musk and Grimes hooked up.
Literally, X Musk is my doing.
Perhaps you'd like to explain that further.
Yeah, before we get to Roko's Basilisk, I would like to know this story, please.
Oh, absolutely.
There was an article on RationalWiki about LessWrong.
Then the Roko's Basilisk incident on LessWrong happened, where a guy called Roko posted this thing, and it was taken down in seconds, about how if you think the wrong thoughts, the future AI will have to create simulations of you that it has to torture.
This is the good guy AI that is the best thing for all of humanity.
It's necessary and utilitarian in terms to torture you to make sure that you will put everything you can towards creating the AI.
That makes sense, right?
So, it's elegantly obvious.
So, this was deleted within hours, but everyone went, hmm, forbidden post, and they tried to pretend it never happened, because rationalists have this thing of shitting the floor and then loudly declaring that if they think it doesn't count, then it doesn't count, and you're a liar and evil if you mention it.
They're very bad on history and object permanence.
So, we wrote about it on RationalWiki.
And I started getting emails from upset rationalists who couldn't get this idea out of their heads because it was like tailor-made for the sort of people who fall for rationalism, you know.
And it turns out you can't think your way out of things you didn't think your way into either.
So these were sincere, perfectly nice people who had been caught up by this horrible idea and couldn't cope with it.
So, yeah.
The idea, to be specific, the idea being that at some point in the future, simulacra, AI simulacra of them, would be resurrected and tortured for eternity by an AI that resented them not doing everything they possibly could in the past to bring it into existence.
If you already believe nine impossible things, that makes perfect sense.
Well, and because you are interested in effective altruism, and because you believe that a future simulacrum of yourself is functionally equivalent to you yourself as you exist in the material world, therefore any punishment given to that future simulacrum
is by definition equal to the pain that you yourself would feel. And so the idea is that literally the most important thing you could possibly do in this world is to devote every possible amount of resources to building, as fast as possible, the computer that will solve all the world's ills.
Or else it will torture you, because if the computer in the future determines that you did not do that, then you end up on the torture list.
It's such a strained logic.
It's like this set of dominoes.
Thousands of people, like, legitimately believe it?
Like, this isn't, like, a handful of, like, cranks.
This isn't, like, five people.
This is a larger segment of people who believe some version of this, if I understand correctly.
Well, this is LessWrong's version of QAnon, isn't it?
You know, this emerges from LessWrong in the same way that QAnon emerged from 8chan.
Somewhat.
It's more like the philosophical, materialist, rational atheists constructed a vengeful Yahweh from first principles, which is quite an achievement, really.
Sounds like materialism to me.
It's pretty good.
No, no, it'll all be on computers, you see.
So that makes it material.
Are there things involved?
All right, then there's materialism.
Fine.
Absolutely.
So we've now confused anyone who doesn't already know what we're talking about hopelessly.
However, believe me when I say that of the people who were into rationalism, a small proportion of them believed.
It wasn't lots of them or even most of them.
Well, except maybe Yudkowsky, who went off in his fabulous all-caps rant about it, but no, he totally doesn't believe it either, because he said he didn't.
But enough of them got really quite upset about this idea, and whenever they tried to ask about their concerns, their posts were deleted from the blog, and everyone tried to pretend that they were the problem for raising this issue.
Because you only get tortured by the AI if you think about the AI.
If they don't think about it, it can't hurt them.
It literally is that joke about the missionary that goes and tells the tribe about, you know, if you don't believe in God, you'll go to hell.
And they say to him, well, what happens if you've never heard of him?
And he says, well, you're fine then.
And they say, well, thanks for telling us.
It is exactly that joke.
So what happened was these people were coming to us because we were the only people who even talked about it because nobody else on the internet gave a hoot because it's stupid.
So, what do you do?
And I got enough emails from these people that I went, look, we have to do something about this.
So we wrote a separate article.
You'll see, the first half of the article explains the nine impossible things you need to believe, then the story of what happened, and then there's a whole second half of the article which is trying to reasonably talk through why this doesn't make sense.
It sort of helps some people, so that's good.
The thing is that, of course, this is a hilarious and brilliant science fictional idea.
And of course it was used widely.
You know, Charlie Stross wrote about it, saying, I could have gotten three novels out of this!
And among the people who wrote about it was the electronic musician Grimes, who on her third or fourth album, whichever one Art Angels was, had a character she portrayed in some of the songs named Rococo Basilisk.
Which, you know, just because Grimes is the sort of nerd who does things like that and writes concept albums about Dune.
And the consequence of that was, a few years later, when Elon Musk thought up the pun Rococo Basilisk and Googled it, he discovered that Grimes had already made the joke.
And slid into her DMs.
And, well, I think the story from there is pretty well known.
Yes.
When I saw that piece of news come out, I messaged El and said, hey, look at that.
And El was upset and went, damn, my chance is gone, but I think you dodged the bullet there.
So, yeah, Musk is also in with these people, by the way.
He's one of the main funders of OpenAI, the people who do the GPT-2 and GPT-3 text examples.
The text generators.
So it turns out that Musk is not actually a brilliant engineer.
He's a salesman.
He's a huckster.
Anyone says- No!
No!
How dare you!
He's easily swayed by science fictional stories from people who've already worked out how to get money from Peter Thiel.
So he funded OpenAI, he got some of this stuff through to reasonably high levels of government.
In Hillary Clinton's book about her campaign trail, there was a paragraph which was basically reheated Yudkowsky via Musk, and it somehow reached her.
So Musk is one of the vectors of the disease as well.
So yeah, that's Roko's Basilisk.
It's fantastic.
The Basilisk was never a subject on Slate Star Codex, but it is absolutely the first thing you should know about rationalism, and if you read it, it will actually give you a good summary of the main weird talking points of LessWrong.
It's fabulous.
So, Scott Alexander was a writer on LessWrong, and Slate Star Codex is his spin-off blog, is that right?
Yes.
Perhaps you can give us some idea of the general content of Slate Star Codex and what its political orientation is.
Oh, do I have some posts for you?
Good, good.
So the thing is that he's one of these guys that when you pick out the incredibly dodgy stuff he says and the contrabass dog trumpets, his fans immediately swoop in, yes, but what about the other 99%?
Right.
Because there's so much of it.
One of his posts, Untitled, is like, it's incredible.
It has a disclaimer at the top.
It might be the worst thing ever written on the internet, and I know exactly what I'm saying when I say that.
As someone who has plumbed the depths of the fucking internet, it may be the worst single thing ever written on the internet.
Having done a lengthy close reading of Untitled, you're not wrong.
I sort of skimmed off the surface of it, and most people don't realize how bad it is because they never get past word 5000 or so.
But the attitude, and this is important to understand, not just what he writes, but what the things he writes are about.
He put at the top a disclaimer saying that if this is the only thing of his you've read, you should read his other stuff.
That you shouldn't take this post out of context.
Like, you know, to read a 10,000-word post that the author has worked very hard on and, like, assume that any of that is his opinions or relevant or something like that, that'd be a totally unfair reading.
Because he said this, rationalists then went, yes, that's a totally unfair reading, how dare you?
You liar, you thief, and so on.
It's sounding very, very familiar.
What was the guy's name?
Porden Jeterson or something like that?
I'm sure I've encountered this syndrome before.
Absolutely.
And, you know, when you do things like quote his actual words.
So, Scott's very good at being mealy-mouthed, where he will say things in a way that he doesn't think can be quoted and used against him.
You know, things that are probably partially correct or at least very non-provably not correct.
He's very into that sort of circumlocutist sort of blanket of disclaimers and qualifiers.
Which, of course, is an infallible sign of good faith.
Yeah, he's made himself almost entirely Twitter proof because there's literally no quote you can offer from him that's shorter than 240 characters.
So the things he wrote about anything, he would write about the practice of being a psychiatrist with somewhat anonymized stories about his patients.
This is of questionable ethicality, but he assumed that he had sort of anonymized them sufficiently.
The Jordan Peterson vibes are getting stronger.
No, no, no.
Alexander is actually, Siskind is actually a medical doctor.
The difference is that Siskind can give you drugs; Peterson can only suggest them.
I think having to interact with Siskind to any degree would require it.
So, I've met him once.
It was at a LessWrong meetup in early 2011 in London.
I stopped going to these after one guy went into this huge rant about how Racial IQ differences are totally true, and it's pernicious nonsense to say they aren't, and so on and so on and so on, and I decided I didn't want to go to these anymore.
It was quite an experience.
So you cancelled them then, with your fanatical, closed-minded wokeism?
I did, I did.
My lethal weapon is my woke.
But I met Scott, and he's one of these people, his superpowers, he projects a niceness field, a civility field.
It's hard to describe.
This doesn't require a person to actually be nice or be civil, but somehow they have this air about them that people will sort of be on their best behavior around them.
It's great for defusing arguments without doing anything.
I've met a few people who have that sort of superpower.
It's a pretty good one, and many of them are quite decent people, so there's nothing wrong with that.
Shades of the Weinsteins and a bunch of the other IDW crowd who are, you know, so very polite.
And even Charles Murray, who was described by Ezra Klein in the famous Ezra Klein-Sam Harris quote-unquote debate.
Klein described Murray as, like, he is a lovely man in person.
Which, you know, not to give Ezra Klein bonus points or anything like that, but this is a common trait of these kinds of
figures who are like, I'm only interested in facts and data.
I really care about racial equality and all these things.
I just think that we need to look through the lens of what's true and what's not.
And it turns out that the racists are completely right.
But I feel really bad about that.
I feel really terrible.
But the racists are right.
So I don't think that characteristic is a sign that this is a bad person.
But if you're a bad person, then it will tend to select people who have that trait because then they'll get away with it longer.
I make the effort of being a righteous asshole in person just to avoid this trope personally.
I don't even have to make any effort.
I mean, I've never met Siskind, but I did, you know, write 15,000 words on his prose style recently.
And this bland niceness really does carry through to his prose style.
He's very good at not actually saying anything, and using the lack of content, the lack of actually saying anything, to slowly drift towards "therefore eugenics." He doesn't even get around to saying "therefore eugenics"; he gets around to "eugenics is provably not..." what even was the verbiage? I can't even keep that fucking sentence in my head, it's so boring. Which is kind of a demonstration of how it works.
"Non-provably not correct," right? Yes, there you go: "non-provably not correct." That's, that's it in a nutshell. Which is also the most non-scientific thing.
If you are trying to be uber-rationalist scientific and I'm interested in data and facts and what we can actually demonstrate and understanding epistemic limits, the very idea of using the word proof or provable in that context, you've given the game away.
Yeah.
It's very important to remember that scientific racism is absolutely mainstream in the rationalist subculture, right?
It has been since the beginning.
These people, this LessWrong subculture and the neoreactionaries and Bitcoin and Silicon Valley startups, they're all manifestations of what is called the Californian Ideology.
This is a nice short little essay.
It's got its own Wikipedia article.
Everyone should read it because it explains these fucking people, all of them.
The thing is that the racism got into the LessWrong subculture quite early on, because it filled with the people who became the neoreactionaries, which is basically tech-fash.
The people who work a nice, highly paid job as a computer toucher, believe in a just world, and therefore people like them are obvious front-end JavaScript.
Well, everything in the world is basically a simpler form of front-end JavaScript.
The Singularity Institute, as it was, moved from Atlanta, where Yudkowsky was living then, to the Bay Area so that they could hook into more bright and brilliant people and transhumanists and, hypothetically, people who knew anything about anything.
But they left that step out and it didn't hurt them any.
Got there in 2005-2006 and immediately hooked up with Peter Thiel.
He then gave the money to... And you can practically see his influence on them from the start.
There's an early, unheralded Yudkowsky blog post that's clearly kind of quietly recycling scientific racist talking points about evolutionary psychology.
Yes, he's very into evolutionary psychology.
He thinks it's correct.
It's obviously correct.
Obviously, our psychology evolved and that's all you need to know because, you know, it's obvious from first principles.
Never mind anyone who's actually looked and said, you know, it turns out it's actually a bit more complicated than that.
No, it's not.
Evolution of psychology, great stuff.
There were two amazing LessWrong blog posts, I've got them open in front of me, actually, which absolutely fired the starting pistol for LessWrong racism.
One is called, Why Are Individual IQ Differences Okay?
This was about what was at the time called the James Watson affair, where the guy who stole the discovery of DNA from his female co-worker... Co-stole, to be fair.
He had a collaborator, you know?
Yes, he had a co-conspirator, where the heralded scientist Dr. Watson had said some horribly racist garbage.
And so, This was terribly controversial and people worked super hard to say that, no, no, he didn't say literally what his words said, no, he meant something else.
As always happens when someone acts like a huge racist.
Yudkowsky came out saying, well, of course, individuals can have IQ differences.
Why can't we say that groups can?
What groups?
What groups?
What could group mean?
Then he went into race.
So Yudkowsky thought this was fine to think about.
Then there's another post a month later, Beware of Stephen Jay Gould.
This is the one I was talking about.
This is a great post.
This is great, this is great.
Please, we chatted about this a bit in the back channel, so I have some pre-roll on this, so please, David, continue.
Now, the thing is that you know what a list of scientific racist talking points looks like, right?
They'll go through it point by point, in what looks like a good side-by-side refutation, defeating the thing in debate.
Now, I love that stuff.
Rational Wiki even has a special template to do side-by-side discussions, you know.
Back in the blogosphere, we'd call that fisking.
Back in the olden days, we called it just Usenet quoting.
I was there too, but probably not as early as you were.
Please continue.
I apologize.
I bow to your superior knowledge there.
Let me tell you!
So, The Post is talking about Stephen Jay Gould and how bad he is and how you must not believe anything Stephen Jay Gould writes.
Now, the important thing. Quote: "If you've read anything Stephen Jay Gould's ever said about evolutionary biology, I have some bad news for you.
In the field of evolutionary biology at large, Gould's reputation is mud.
Not because he was wrong.
Many honest scientists have made honest mistakes.
What Gould did was much worse, involving deliberate misrepresentation of science." So that's bollocks.
His reputation is fine.
And fucking hilarious for Yudkowsky to say.
Yudkowsky is literally not qualified in anything whatsoever.
He had a really, really bad time coping with school.
He dropped out of high school and was homeschooled.
He's an autodidact.
He's a very bright lad.
He should have gone to college.
He would actually know stuff, because he would have escaped the autodidact problem, which is that you don't know what you don't know.
Like, one of the reasons to go through a syllabus is so you learn the stuff that you wouldn't have learned just picking it up yourself, you know.
But he never got that memo.
Knowing things is reliably kryptonite to this kind of figure, I've found.
Well, and this is specifically, I think, related to their presence and prominence in the tech scene. They love themselves some disruptors. Fundamentally, they think it is better, far better than expertise or knowing things, to come in there ready to completely shake up the whole scene. And so they love reinventing the wheel, and at best what they get from that is a kind of lumpy, misshapen wheel, and at worst it's eugenics.
The wheel is probably eugenics as well, probably, yeah. Yeah, the wheel they make is also eugenics.
It's a wheel that has eugenics added onto it.
As the wheel turns, it just kills the quote-unquote genetically inferior.
That's the wheel they always make, apparently.
And we're not exaggerating in that, because look at how many times they're still trying to reinvent AI-based phrenology.
So, this post is one of those attempted takedowns of someone.
Now, the post does not mention Mismeasure of Man.
It goes through everything else about Gould, and then it links, saying other evolutionary scientists just absolutely despise him, and as evidence of this, "many others have said much of what I said here," and he links to an article in the New Criterion.
So are you familiar with the New Criterion?
I am, please go ahead!
So the New Criterion was basically for culture warriors before they were quite called culture warriors, or anti-wokists, or whatever. But it's absolutely a culture magazine.
It turns out it is not a science magazine.
And what this article did was list every single scientist that Gould had ever argued with.
They get a bunch of, quote-unquote, iconoclasts with interesting ideas, and they are from all walks of life, from the political perspective.
And so we have one vaguely left-of-center social democrat, all the way up to proto-Nazis.
Yeah.
And so we have 20 proto-Nazis and one social democrat, and we're representing the entire spectrum of ideas.
That's how we do this, yes.
That's how the New Criterion works.
The specific article about Gould is one of those ones that goes through quotes from other scientists, scientists who are reasonably respected.
Like, they get a quote from Dawkins disagreeing with something that Gould said, but you know, Dawkins and Gould respected each other just fine.
They take a lot of quotes out of context and attempt to use them as evidence that all the biologists hated Gould, and that's like, literally false?
So the comments on this post, which was absolutely the smoking gun post for saying, hey, it's okay to be into this stuff.
The first commenter is Razib Khan, the guy who was fired from the New York Times for being a rage eugenicist.
Yep.
But it's okay because he's not white.
He's not white, so therefore it's fine.
He can't be a white supremacist, you know?
Come on!
Techfash tend to be fine as long as you're a good coder.
But there's actual biologists in the comments saying, what the hell are you talking about?
Eliezer, you're in over your head on this one.
It is clear you're behind on current literature, and the guy gets downvoted hugely by all the rationalists who have been told what to think, and so they will make sure that they deal with this fellow.
The thing about this post is, he never posted on the subject again, right?
And the other thing about Yudkowsky is, he loves his own ideas more than anyone.
If he has an idea he thinks is great, he will post it repeatedly, elaborate on it at length, and really, really make sure you know that's what he thinks.
This post looks like it was something that he was fed, because if he thought of it himself, he would have repeated it, and he wouldn't have just made this single post.
So, I'm wondering who fed it to him, and it will definitely have been someone from the Thiel sphere.
So, the actual book he's quoting, just to spend a couple of minutes on this, and I don't have the post in front of me, but the actual book he's quoting, I read in high school. I'm not gonna say it was one of those, like, big books for me, but I read a lot of Stephen Jay Gould in, like, my last two years of high school.
I still read and reread Stephen Jay Gould.
He got into a lot of debates with people.
I think you could argue that you can over-appreciate him in certain ways, but he was a brilliant thinker, and I think anyone interested in these arguments should be studying Stephen Jay Gould.
I think he was an amazing person, and I respect him a lot, and if my career ends up being like his, I can die happy.
The book that's being referred to in this article is Full House, and the point of this book is to challenge this sort of great chain of being idea within biology, the idea that there are higher and lower forms of life, which comes from kind of early Christian imagery.
And then once, you know, evolutionary biology, change over time, starts happening, you start to see a progression in terms of biologists sort of accepting that, like, okay, at the bottom of life there are single-celled organisms, and then below that, maybe protoplasm or whatever.
And then it kind of rises to, you know, reptiles and the mammals and then man at the top.
And of course, white men are at the top.
And of course, it's always white men, not women.
But, you know, sometimes that's more explicit than others.
And this whole idea is basically nonsense.
And there's a quote from Stephen Jay Gould from that book where he says, like, we are in the age of bacteria.
We are always in the age of bacteria.
The point being that the vast majority of material on this planet is bacteria.
The vast majority of biomass is single-celled organisms.
The fact that higher organisms have evolved is not a change in terms of higher and lower, because the amoeba that's on my skin, or the things that are in my gut flora right now, are just as quote-unquote evolved as I am. And in fact, more so, because their lineage will likely live longer, because we're all going to die of climate change in 50 years.
Anyway, that's another point.
But he's making a fairly sophisticated argument, not just sort of, like, quoting that, but using baseball statistics and the way that what we measure changes. Like, if you're looking for a single metric, like the .400 hitter in baseball, and you realize that there have been no .400 hitters since a certain date, you think, well, baseball has declined.
Well, no, it just turns out that the outfielders are getting better, and we're catching more balls than we used to, and so hitters can't get up to .400. And he makes a really compelling analogy between that and the way that biology works. And all of this gets completely missed, these really interesting, even if you think they're wrong, really interesting arguments about society and sociology, and the way that we view the world, the way that we view the scientific world, and the history of life on this planet.
All that gets simplified into this one thing, this one paragraph, which Yudkowsky quotes badly, and then uses this quote-mining technique to attack Gould over it.
It's just bad.
It's just bad on every conceivable level, this piece.
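As an aside for readers who haven't read Full House: the batting-average argument Daniel describes can be sketched in a few lines of code. This is our own toy illustration with invented numbers, not Gould's data; his actual argument uses the real historical distribution of batting averages. The point is just that if the spread of averages shrinks around a stable mean as overall play improves, the best average falls below .400 even though no one has gotten worse.

```python
import random

random.seed(0)  # make the illustration reproducible

def best_average(mean, spread, players=200):
    """Simulate one season of batting averages; return the league's best."""
    return max(random.gauss(mean, spread) for _ in range(players))

# Hypothetical eras: the typical hitter stays the same (.260),
# but play tightens up, so the spread of averages shrinks over time.
early_era_best = best_average(mean=0.260, spread=0.040)
modern_era_best = best_average(mean=0.260, spread=0.020)

# The narrower distribution's best hitter falls well short of the old
# peaks, with no decline in the typical player -- Gould's point.
```

The era labels, player count, and spreads here are all made-up parameters for illustration; only the shape of the argument is Gould's.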
It is the species of advocacy for scientific racism, where they give you an article that looks like a refutation with lots of citations, and you go and look, and none of it checks out.
Right.
He said it in a sentence.
I'm sorry, it took me like four minutes to do that.
These ideas were thoroughly embedded in the rationalist subculture.
I met up with a friend who was into this stuff before I got onto Less Wrong.
I hung out on Less Wrong for a few years because it was fun.
Arguing philosophy is great.
It's highly enjoyable.
Anyway, we were down the pub, and he started talking to me about race and IQ theories, and I said, well, those are obviously wrong.
You know, look, Burakumin in Japan, literally the same gene pool, separated 500 years ago, they're treated worse in society, therefore they routinely score 15 points lower on IQ tests.
We already know that you can get 15 points for free.
So, if you're trying to argue on three or four points, then you're obviously full of it.
And he sort of kept going.
They never respond to that.
They never respond to that.
They never manage to respond to, okay, what exactly is race from a genetic or scientific perspective?
It's a group, you see.
It's a sort of cluster.
They get very non-specific.
They'll start talking about multi-locus alleles, and, you know, you can build groups based on, you know, genome-wide association studies, and they'll use a lot of, like, graphs and numbers that they have sort of understood.
Then whenever anybody with an education in biology looks at it, it goes like, well, yeah, that's complete nonsense.
That's not how we use that data.
That data doesn't mean anything in that context.
Yes.
And we have very good documentation from historians of the process of what we call race being invented, and the races being invented.
Please go read The Invention of the White Race.
Yes.
There's a great thing on Andrew Hickey's podcast, 500 Songs, which everybody should listen to because it's brilliant.
He did one episode where he was talking about how Charlie Mingus was just sick of white people, so he didn't want to work with them.
And someone said, what about that guy?
He's Italian.
He's not white, he's Italian.
Italians were added to white in historical memory, that sort of thing.
Right.
Anyway, so Scott of course took on the attitudes of his subculture, and human biodiversity is probably partially correct, or at least very non-provably not correct.
You can't say it either!
Not just non-provably non-correct, but very non-provably not correct.
Is that stronger than quite?
Of course, assuming the possibility that things are true because they haven't yet been disproved, that's a very good scientific principle.
Well done with that.
Particularly when they have.
Yeah, well exactly.
I mean, quite apart from the fact that they have, you know.
Well, you know, if we just ignore all of sociology and history and we ignore the, you know, the voices of the people telling us their experiences, and if we just ignore everything, then we can just accept this data that we have.
It's probably correct, and probably collected properly, and probably accurate. In the same way that, if we ignore all the vast documentary evidence, and we ignore the voices of all the Jewish people in the 30s and 40s, and we ignore the testimony of people at the Nuremberg trials, because, well, clearly those people were probably coerced, and we only look at the compounds of cyanide
on brick walls from 70 years ago, then we can say that it's very provably non-correct that the Holocaust happened.
It's the same form of argument.
The point you're making is very good and true and correct, but with respect, it's even more basic than that.
You go back 20 years, pick up one of the piles of atheism books that were published round about 2000 to 2005, right? The books that, you know, baby rationalists read, that got them into this
subculture in the first place. In every single one of those, there will be a thing about how, you know,
"science doesn't know everything," or "God hasn't been disproved yet" is not an argument. Every single one of them will have something like that thing about, you know, the invisible, intangible, inaudible dragon that lives in your bedroom.
That, you know, the fact that you can't prove it's not there isn't a valid reason for basing your approach to the world on something like that.
This is page one shit, you know, on science.
With the race realism thing, they're not interested in science anyway, because they don't listen to their own scientists.
You have people like Jerry Coyne, evolutionary biologist, anti-wokist, big culture warrior, hates all these SJWs, but he's also, like, competent, and he was one of the guys who wrote a letter to the New York Times protesting Nicholas Wade, saying, stop abusing our research to push scientific racism.
He thinks race and IQ theories are bollocks.
You know, you have groups of people in different areas with different genetic characteristics, but you know, if you correlate that to IQ, you're talking nonsense.
Richard Dawkins, another culture warrior these days, who is Jerry Coyne's good friend, trying to politely tell Brett Weinstein on stage that he's full of it.
That has made a cameo appearance on this show before, that event.
Because Dawkins suffers greatly from no one having taken away his Twitter, but you know, he's like a biologist.
He's like good at the thing he does.
It strikes me that the underlying idea here, of, you know, ignore all that and trust your own instincts and your own clever thought, kind of unites, in, I think, a revealing way, the logic of, again, that Silicon Valley tech disruptor: ignore everything, let's just change the world with one weird idea.
And the logic of conspiracy theories, which, again, rests on the idea that all of consensus knowledge is wrong.
There's actually a really easy bleed from Silicon Valley Disruptor to conspiracy theory that is, I think, a lot of why scientific racism successfully took off in this crowd.
And to kind of drag us kicking and screaming back to Slate Star Codex, that is ultimately what Siskind served as a vector to do.
I mean, Siskind's practical effect, hugely read in Silicon Valley crowds, has been to legitimize scientific racism and eugenics among tech people, who then go on to build algorithms that mysteriously can't recognize the faces of black people.
Yeah, it is bad thinking that they learnt on LessWrong.
Yudkowsky has a thing he does where he'll say, so this thing is possible.
Then his next post will say, now we know this thing can happen.
Then he'll say in the next post, so this is a known result.
And a fourth post will say, so this is a fact.
You see him progress this way through a number of things.
It's also, none of these people do the reading.
They never ever do the reading in any field.
They will absolutely always choose to start from first principles, apply their own brilliance, and then decide that they understand the world.
There's things like Bayesian epistemology.
Harry Potter and the Methods of Rationality, yeah.
Absolutely.
So, this is a thing that Kelsey talked about continuously, that a lot of people claim that they actually do on a daily basis.
Yes, I've updated on that information, according to my priors.
But the thing is that that's nonsense.
So I'm going to get a little mathematical at this point.
But the reason it's nonsense is not because... Jack and Elle, you can just hold off for a second.
It's fine.
I was raised by math professors.
I can do this bit.
Okay, we're good.
I read Capital Volume 3.
Absolutely.
So Bayes' Rule is a rule of probability.
You know what you know about a thing.
That's your prior probability.
You get some new information.
You can then calculate what you think the probability should therefore be based on that.
It's just a simple mathematical identity.
So that's fair enough.
But it turns out that applying Bayesian statistics for real is really, really hard to do.
And the reason for that is because probabilities aren't single numbers.
They're distributions.
You don't just have a number.
You have a graph, like a normal distribution, or maybe a Poisson distribution, or some weird-ass distribution that isn't regular and that you don't have a formula for.
Usually the third kind.
Then you're taking that and new information, which is also a distribution.
So you're not just doing a quick dividing one number by another and adding it or whatever.
You're multiplying two matrices and then combining them, and this is extremely hard and gets you into second year calculus, and I looked at this... Can I admit that I had a nightmare about linear algebra last night?
Absolutely.
So, yeah, but you have that dream every night.
I realized that I forgot.
Come on.
Daniel's basilisk.
It was very, it's very rare, but I literally was, like, sitting there, and, you know that, like, math test
nightmare that people are supposed to have? I almost never had that dream, but I had it last night. And it was literally, I was sitting there going, oh, I have to take a math test, and I have all these things that are wrong on my math test, but also I had completely forgotten how to do cross products in the dream. Because who does cross products unless you do it on a regular basis?
Sorry, it just kind of came back to me of like, you know, apparently I've been prepping for this episode.
So, Daniel understands what I'm talking about.
Because, you know, I did this stuff and went off to become a rock critic.
So, basically, Bayesian epistemology would require you to be doing numbers all the time.
And not just, you see some new information, quickly do some arithmetic in your head, and you have what your new belief should be.
You'd be doing matrix calculations in your head multiple times a minute in daily life.
So anyone who claims they're doing Bayesian epistemology in real life is spouting bollocks.
They're doing literary Bayesianism where you think that if you say certain words then your prejudices must be science.
Yeah, I was going to say, as the person who coined the term "literary Bayesianism": what Bayesian epistemology is in practice is, you make up a number that you think feels like about the probability of something, and then you talk about that number in a particular way, using a particular type of word, and if you do that, nothing you say can possibly be wrong.
Yeah.
If you look into it, you realize that basically it's not even an epistemology, it's a sort of sketch of what an epistemology might be if humans had matrix algebra calculators in their frontal lobes, which they don't.
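As a footnote for anyone who wants to see the gap Gerard is describing: a single Bayes-rule update over a handful of discrete hypotheses really is simple arithmetic, sketched below (the coin example is our own illustration, not from the episode). The catch is that this only works when your beliefs are a few labelled points; the moment they are continuous distributions, as they are for almost anything real, the update becomes the calculus exercise the speakers are describing.

```python
def bayes_update(prior, likelihood):
    """One Bayes-rule update over a discrete set of hypotheses.

    prior: {hypothesis: P(h)}, likelihood: {hypothesis: P(data | h)}.
    Returns the normalized posterior {hypothesis: P(h | data)}.
    """
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# Toy example: a coin whose bias toward heads is one of three values,
# all equally likely beforehand.
prior = {0.3: 1 / 3, 0.5: 1 / 3, 0.8: 1 / 3}

# We observe a single heads; P(heads | bias) is just the bias itself.
posterior = bayes_update(prior, {h: h for h in prior})
# The 0.8-bias hypothesis now carries half the probability mass.
```

No real-world belief comes pre-packaged as three numbered hypotheses with known likelihoods, which is exactly why "I updated on that" is, as the speakers put it, literary rather than mathematical.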
It's a jargony writing style that lets them quickly filter out people who are members of the cult from people who aren't.
So I went to the source and looked this up.
Let's look at the Stanford Encyclopedia of Philosophy.
You know, it's not terrible, but it's not great, but it's okay.
It'll give you an idea.
And I looked at it, and that too was like a rough sketch of what this might look like as an epistemology, if humans weren't actually humans but some other species.
It was great.
So this is what LessWrong epistemology rests on, which is functionally how to take your prejudices, say that they're your priors, and therefore saying "black people don't show up on AI because they're stupid" is science.
That's what it actually is.
And this is where rationalists get their bad thinking patterns from and eventually learn to just follow with the in-group.
The in-group can do no wrong, the out-group can do no right.
Yeah, that makes an awful lot of sense to me.
I mean, I tend to think of most forms of reactionary politics as basically different varieties of aesthetics for packaging the same ideas.
So this is just another species of that.
Oh, he's not a reactionary.
No, no.
He's very left-liberal.
Oh, yeah.
He's anti-reactionary.
Absolutely.
He even wrote the Anti-Reactionary FAQ.
Which is one of my favorite examples of highlighting the particular sort of flavor of bullshit that he goes for.
Siskind wrote these two articles, Neo-Reaction in a Planet-Sized Nutshell and the Anti-Reactionary FAQ.
And his defenders love citing this last one as, he's not a neoreactionary.
Look, he wrote this.
And the thing is, if you look at these two articles, Neo-Reaction in a Planet-Sized Nutshell is this lengthy, but, especially by the terms of neoreaction, reasonably succinct account of neoreaction's kind of high-level philosophical claims and broad beliefs, and it spins it as: these guys are saying something interesting.
And then the anti-reactionary FAQ is a gish gallop through a ton of completely minor points that doesn't address a single high-level claim.
And the idea is these two somehow are, you know, provide equal weight.
And no, one of them is presenting high-level arguments for, you know, Mencius Moldbug who thinks black people genetically make good slaves, and the other one is refuting a whole bevy of minor points nobody gives a shit about.
Yeah, Scott's very into the aesthetic of near reaction.
I'll quote again from the leaked email from 2014 where he's talking about how he's going to try to attract reactionaries to Slate Star Codex.
They are correct about a bunch of scattered other things.
One, the superiority of corporal punishment to our current punishment system.
Two, various scattered historical events which they seem to be able to parse much better than anyone else.
This links to a "see, for example," and it links to a book review.
It's a review of the book The Last Lion by Paul Reid, which is a biography of Churchill.
The review is lamenting that Churchill didn't join forces with Hitler and how badly everything worked out because of that.
Which is literally Pat Buchanan's Hitler's Unnecessary War.
Well, it's literally Hitler.
Churchill, Hitler, and the Unnecessary War.
Hitler was a great fan of the British Empire.
He loved the Empire.
He thought it was what he wanted to model himself on.
And yeah.
Three.
Moldbug's theory of why modern poetry is so atrocious, which I will not bore you by asking you to read.
I'd just like you to imagine Scott Alexander going to Mencius Moldbug concerning the aesthetics of writing.
I just heard Brett Weinstein do a Dr. Seuss pastiche this afternoon, so I shall show you.
Oh, I bet that was good.
Given that we're recording this on March 7th, 2021, if you look at the news stories from that day, you understand what was actually being said here.
We'll just leave that be.
The places that my research takes me... I just love the idea of anybody turning to Mencius Moldbug for aesthetics.
I've read some Moldbug.
It's like hacking your way through the world's most boring jungle.
Yeah, Moldbug's great.
Um, I worked out the problem with Moldbug.
So, he started writing short... There's one.
There's one.
No, the problem with his writing, why he writes like that.
His first post was 1600 words.
It was like an essay with a start, a middle, a conclusion.
It had a thesis, you could critique it.
So the comments have now been deleted: he moved everything to a new blog, and left behind all the comments where people called him out.
How sad.
And all the comments immediately got stuck into him for his laughably bad logic and his misunderstandings of basic terms.
This is when he went into his latest style, where he spends a thousand words redefining the English language so that his big bombshell revelations sound profound instead of him using very special personal definitions of things.
And that really could be a house style for this whole bunch of people, couldn't it?
And it's a real conscious strategy, I think, to just obfuscate and obfuscate and obfuscate aesthetically.
And I think that plays in very, very directly to your recent essay about Scott Siskind, El.
Yeah, I mean, again, you know, I did a real deep dive into his prose style.
And what's, I hesitate to use the word interesting for anything related to this, but what's notable about it is he has this tendency to deploy claims entirely through kind of rhetorical structures.
He'll do something like, Ask a question.
Is the answer obviously wrong thing?
Is the answer other obviously wrong thing?
Statement.
And, you know, and that rhetorical structure that keys you in for I'm about to say a true thing is actually the only argument he makes in favor of it.
And then thousands of words of glurge that say nothing, and then he'll subtly restate his earlier not argued for statement.
A couple more thousand words, a third restatement.
Little more.
And now it's somehow become a much spikier and more problematic point.
The case of Untitled, which we all talked about but I don't think we ever actually got the thesis statement out, is roughly feminists should stop talking about sexual assault because they're making it worse.
And he gets there through complete non-argument, and then sort of slowly, over that length, restating the thing he didn't argue for in more and more inflammatory ways, in a kind of slow boiling of the frog. And he's doing that in the context of talking about another Scott, Scott Aaronson, who...
Is it Princeton or MIT?
I think it's MIT, right?
He is, from what I understand, a brilliant computer scientist, who was confronted by a young woman in the comment section of his blog who talked about her sexual assaults.
Right.
Right.
She talked about how some of the most misogynistic and gropey people she's ever met were shy, awkward nerds, and she doesn't feel safe around such people as a result.
Right.
And Aaronson's response was this "oh, poor me" whinge about how absolutely awful it was to grow up as a shy and awkward nerd who couldn't read social cues around women and thought he was never going to get a girlfriend and be alone and was afraid to even talk to a woman lest he be accused of sexual assault.
And his prescribed answer, just to put this in right now, was: well, if I had grown up in this, like, insular community in which a marriage would just have been arranged for me, I would have been fine.
I would have been much more comfortable.
Without ever considering, A, the feelings of the woman he's responding to, and B, the feelings of this hypothetical young woman who would have been, like, forcibly betrothed to him.
And he's the victim here.
If I'd just been given a woman, it was my right, people like me wouldn't have to rape them.
I mean, come on, what do you want?
I should note, by the way, Scott Aaronson is married with kids.
He met a woman who presumably actually likes him.
And he's still talking like a dangerous incel.
But anyway, untitled.
So yeah, I mean, Aaronson wrote this post, and a couple of feminist bloggers, most notably Amanda Marcotte and Laurie Penny, took him to task for it with varying degrees of severity.
And Siskind writes this furious post about how utterly unreasonable it is that anyone would come at Scott Aaronson this way, and, you know, posts about how absolutely inflammatory and body-shaming and wrong feminist rhetoric around nerds is.
And his argument for how body-shaming and wrong feminist rhetoric around nerds is, is a completely cherry-picked image board of a couple of memes, some of which, you know, you can validly make a complaint about: oh no, that's body-shaming, and wasn't a great move.
Some of which are making fun of GamerGate people, and some of which are actually anti-Semitic cartoons that he just threw in to argue purely by analogy that the visual stereotypes of the neck-bearded fedora person and the happy merchant are basically the same thing.
Oh yeah, I love that move.
The old, what if this thing here was something else?
It'd be different, wouldn't it?
Yes, yes it would!
Shockingly!
What if we changed some words in Mein Kampf and then tried to get it published?
What if we changed Jews to another word and then tried to get that published in an academic paper?
That's certainly not an upcoming episode of IDSGPod.
Yeah, and this is literally the entire argument, if you actually drill down and look at Untitled, this image board is the entire argument for feminism is unduly cruel and body-shaming and stereotypical about nerds, and it's this cherry-picked selection of like eight images.
No feminist thinkers are quoted at any length beyond Laurie Penny and Marcotte, who Siskind is railing against.
And again, I really want to stress the Gamergate example that was on there, because the larger context of what's going on here is that when women are even a little upset about geeks, anytime they say anything, they're being swarmed by a fucking organized harassment mob; this is 2014, when all of that is happening.
And this becomes the staging ground for what becomes the alt-right.
Again, we're talking about... Untitled has aged in an elaborately bad way.
Not least because he, at some point, one of his central insights is to compare the way that nerds are treated in society to the way that Jews were treated in 1930s Germany.
Yeah, and so, you know, Siskind's conclusion from all of this is literally that feminists shouldn't talk about how shy, awkward men should behave, because, being heterosexual women, they don't know anything about being attracted to women.
So why should they talk about this?
It's fractally stupid!
But, like, any individual paragraph is stupid, and then you connect it to any other three paragraphs, and it's more stupid, and, like, I have literally sat down and tried to diagram this essay several times back at the time.
Like, I was literally doing my degree in chemistry, and so I think I was in analytical chemistry at the time, and doing, like, Calc 3, or, like, differential equations.
And, like, that was less brainy than, like, trying to figure out what Siskind was doing in this essay.
Like, an academic could build a career understanding what's going on in this essay, I think.
So, yeah, I mean, this is the sort of intense rationalism that Siskind and his ilk provide to the world.
Right.
Right.
I mean, I have another one I would like to call out.
I know we're kind of getting to the to the point at which we're going to have to kind of cut this off.
And we've had like a fascinating conversation.
But I did want to call out the You Are Still Crying Wolf essay.
Oh, yeah.
That's amazing.
This is like listening to connoisseurs discuss fine wines.
Ah, this one's from the sunny side of the toxic waste dump!
When I was writing The Beigeness, my essay on Scott Siskind, I was really torn between that one and Untitled as my example of an essay that isn't just stupid but evil.
Well, this podcast, and I'm going to say arguably, but this podcast basically exists because I read You Are a Crying Wolf.
You Are Still Crying Wolf.
And I got really mad.
And El may or may not remember this, but this was published shortly after the 2016 American election in which Trump was elected.
And Siskind kind of comes out and puts out this:
Trump isn't really a racist.
He's not a white nationalist.
Here's the thing that he's doing.
Uh, you know, he talks all about black people and like when he talks about Mexicans bringing crime and rape, he's not talking about, he's not, that's not a racist thing.
That's just a, that's just a thing.
He, he doesn't like immigrants and that's a very different kind of thing.
And the more you try to dig into the references he makes to his own blog, the more you realize that he's referencing things that he has never really clearly defined, like this idea of the meta versus object level understanding of processes and morality and writing.
This is one of the big things that LessWrong is really huge on that Scott has been pushing since the LessWrong days.
You go up a level, that is you change all the words and then you say, haha, that means something different, doesn't it?
Well, so does LessWrong ever, like, clearly, really give a definition?
Like, is it ever clearly defined?
Does LessWrong ever clearly define something?
No, I can tell you the answer to that without much reading.
I couldn't find it in my Googling in 2016, and, like, the whole argument that he's making in You Are Still Crying Wolf, and I'm gonna, like, over-summarize, I still might write this up for my Patreon at some point, so, you know, like, we'll see if I have the time.
I've been super busy lately with things that you will hopefully find out about in the future. But, like, the argument that he makes is: well, the Ku Klux Klan only has a few thousand members, so therefore there are only a few thousand open white supremacists in the United States, so clearly who would want to dog whistle to those people? And also, dog whistles don't exist, because being called a racist is so bad that nobody would ever do it.
No one would ever, like, try to signal to those people. Well, that's funny in the context of his email. But his big thing is, like, Donald Trump is not a racist, because he says he's not a racist, so therefore he's not. And when he said "some, I assume, are good people," obviously it's not a racial thing: if he admits the possibility that one Mexican is not a rapist, then he's not saying that Mexicans are rapists, is he?
Not provably not a rapist.
You have literally, like, completely demolished all of this in one sentence.
But I said I can do better than that.
And what I said I was going to do is to go and look into the structure of the essay and try to find all his references and where he's going.
And of course, like, the blog format lends itself to referencing the thing you said before, which references the thing you said before that, which references something else.
And then when you kind of dig that out, you find there's no there there; there's never a central place where he's actually defined the core thing.
He just kind of assumes it, which we've discussed before.
But, um, I said, well, what I need to do is I need to find things that like people that Scott Siskind or Scott Alexander would clearly agree are racist.
Who also said the thing of, well, I'm not really a racist because I also like black people.
And I have here in front of me the 1963 speech by noted anti-racist George Wallace, the speech written by Asa Carter, who led his own group of Klansmen and who is very credibly connected to bombing campaigns against black churches, etc., etc.
This is literally a quote from the Segregation Now, Segregation Forever speech when George Wallace was inaugurated as governor of Alabama in 1963.
And I'm going to read just this little point.
So this is a paragraph.
And we invite the Negro citizens of Alabama to work with us from his separate racial station, as we will work with him to develop, to grow in individual freedom and enrichment.
We want jobs and a good future for both races, the tubercular and the infirm.
This is the basic heritage of my religion, of which I make full practice, for we are all the handiwork of God.
There you go.
George Wallace, Asa Carter, not racist.
In the same fucking speech as Segregation Now, Segregation Forever, he's saying, we want all races, we want what's best for all the races, you see?
That's what we want.
And this would convince Scott Siskind. Based on everything we know about him, that single bit from that speech would convince all these fucking rationalists: well, he said he wasn't racist, so clearly he's not.
By identical logic, George Wallace in 1963, when he's actively pursuing segregationist policies in the very same fucking speech: not a racist.
Yes, but you're assuming he's in the in-group as well.
If he's in the in-group, then it absolutely follows.
And then I fell down this rabbit hole of finding more and more of this stuff and getting fascinated with it.
And so, this podcast follows directly from me reading that essay and getting very mad about it and wanting to find all the references I could.
It turns out I found a few.
It turns out I found a few.
So, I Don't Speak German is literally Scott Siskind's fault.
Yeah, like, if you're a racist and you don't like this podcast and you're listening, and you think that it shouldn't exist and you want me to go do something else, you only have Scott Siskind to blame, ultimately.
We're learning so many secret origins here.
We know why Grimes and Elon Musk got together.
We know why the podcast exists.
This is a deep dive on history.
You finished the podcast by episode 82, you know, six episodes early.
That's always the problem.
I don't know what 88 is going to be.
I'm kind of terrified to get there.
But we will get there, because we're not done yet.
I can promise you.
I used to send 88s when I was doing amateur radio, and 88 means love and kisses.
Ah, well, you know.
73 is good wishes, 88 is love and kisses.
This is actually true.
Look it up.
Just think of all those poor people born in 1988 who are trying to find a Twitter handle these days.
Oh yeah, no, it is a problem for people, I know.
I've seen people commenting on it.
Anyway, I thought it was worthwhile mentioning that on this podcast, because that's where this kind of "I'm gonna go dig as deep as I can into all this stuff" comes from, and then you just fall down the rabbit holes and you get really interested, and then you don't know what to do with the writing once you have it, or how to organize it.
And then we start doing a podcast because Jack says, hey, I know you're having problems figuring out how to organize this.
Why don't we do a podcast and you can do show notes and the show notes will be the basis of essays.
And then the podcast explodes because people want to listen to it.
And Jack says, you can't stop doing the podcast and write your book because I need the Patreon money because there's a pandemic.
And also, I want the domino meme to exist.
The first domino is Scott Siskind writes a shitty piece about Donald Trump, and the end is Atomwaffen Division and the Bowl Patrol are raided by feds.
That's what should happen.
See, that proves what a non-racist Siskind can do, and that is probably partially correct, or at least very non-provably non-correct.
Anyway, I did just want to make sure we threw that in.
I did just want to make sure we threw that in here, because it is such a terrible essay.
And it is... like, I really admire The Beigeness, which is El's piece, and which we will link in the show notes, because it does dig into Siskind's rhetorical style. I have tried to do more of that kind of fisking style, to kind of dig in and find the references and find the things that he has said.
But he's so, there's so many words produced, and those words are so hard to condense without feeling like you are putting words in his mouth.
Because there's no... he never says anything directly. It's always in this sort of context of, like, well, maybe this and maybe that, but you're left with the impression of the thing. But if you say he said the thing, then he won't... he would never admit to that.
The further you dig in, then the more it just comes out.
It's like cotton candy.
Try to dig into cotton candy looking for the diamond in the middle.
That's what trying to analyze Siskind's prose is like.
And that's what doing all of this work is kind of like, when you try to analyze it directly.
So, my hat's off to both of you for doing so much of this work and for coming on this podcast.
Well, I've never seen an abyss I didn't think, wow, that's a really cool looking abyss.
I better do a great swan dive into it!
I mean, I now have a second career writing about fucking Bitcoin, so... Well, and we're going to bring you back to talk about that at some point in the very near future, I hope.
If you're willing to come back, I'd love to talk about Nazis and Bitcoin some, so... Big time.
Okay.
Thanks ever so much to both of you for coming on.
It's been a great, fun chat.
My pleasure.
Thank you.
Thank you for having me.
We should have people talk where we can find them on the internet, so they can promote their shit.
El, where do we find you on the internet?
I was going to say, just do it yourself.
I'm working on it.
Well, I'm on eruditorumpress.com, where you might have been linked to this very podcast from, where I am imminently restarting my history of British comic books and occultism.
My search engine optimization game is top-notch, because I got on the internet in 1995, so if you search for David Gerard, you'll probably find me.
I'm @davidgerard on Twitter.
My website's at davidgerard.co.uk, and that's mostly me talking about why Bitcoin is stupid libertarian trash that came from the same cluster of subcultures that brought us LessWrong and Slate Star Codex.
Yeah, it all goes back to the fucking Austrian school, but that's another broadcast.
I think we should blame William Shockley.
It's all his fault.
He started Silicon Valley and he was a massive racist.
What more do you need?
David Duke himself quotes William Shockley as the authority on racism.
And I believe William Luther Pierce did as well.
Transistors and racism.
It's pretty awesome.
Can you believe there's actually a Bond film where there's a guy that wants to set off a bomb to set off the San Andreas Fault and it'll flood Silicon Valley and he's the bad guy?
That is fucked up.
No comment.
That was I Don't Speak German.
Thanks for listening.
If you enjoyed the show or found it useful, please spread the word.
If you want to contact me, I'm at underscore Jack underscore Graham underscore, Daniel is at Daniel E Harper, and the show's Twitter is at IDSGpod.
If you want to help us make the show and stay 100% editorially independent, we both have Patreons.
I Don't Speak German is hosted at idontspeakgerman.libsyn.com, and we're also on Apple Podcasts, Soundcloud, Spotify, Stitcher, and we show up in all podcast apps.
This show is associated with Eruditorum Press, where you can find more details about it.