Behind the Bastards - Part Two: AI Is Coming for Your Children Aired: 2023-06-22 Duration: 01:10:27 === Ridiculous History Origins (02:45) === [00:00:00] This is an iHeart podcast. [00:00:02] Guaranteed human. [00:00:04] When a group of women discover they've all dated the same prolific con artist, they take matters into their own hands. [00:00:13] I vowed I will be his last target. [00:00:15] He is not going to get away with this. [00:00:17] He's going to get what he deserves. [00:00:19] We always say that. [00:00:21] Trust your girlfriends. [00:00:24] Listen to The Girlfriends. [00:00:25] Trust me, babe. [00:00:26] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:00:36] On a recent episode of the podcast Money and Wealth with John Hope Bryant, I sit down with Tiffany the Budgetnista Aliche to talk about what it really takes to take control of your money. [00:00:46] What would that look like in our families if everyone was able to pass on wealth to the people when they're no longer here? [00:00:53] We break down budgeting, financial discipline, and how to build real wealth, starting with the mindset shifts too many of us were never ever taught. [00:01:02] If you've ever felt you didn't get the memo on money, this conversation is for you. [00:01:07] To hear more, listen to Money and Wealth with John Hope Bryant from the Black Effect Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:01:18] Earners, what's up? [00:01:19] Look, money is something we all deal with, but financial literacy is what helps turn income into real wealth. [00:01:24] On each episode of the podcast, Earn Your Leisure, we break down the conversations you need to understand money, investing, and entrepreneurship. [00:01:32] From stocks and real estate to credit, business, and generational wealth, our goal is simple. [00:01:37] Make financial literacy accessible for everyone. 
[00:01:40] Because when you understand the system, you can start to build within it. [00:01:43] Open your free iHeartRadio app, search Earn Your Leisure, and listen now. [00:01:50] Ah, what's soulless? My children's fiction. [00:01:57] I don't know. [00:01:58] I don't know how to open this one. [00:01:59] We're coming back on my investigation, my two-part investigation into the unsettling and moderately evil world of AI children's book grifters. [00:02:10] My guest is Ben Bowlin from Ridiculous History. [00:02:14] And, boy, basically all of the podcasts that helped invent podcasting as an industry. [00:02:21] Thank you, Ben, for being on the show and talking with us today. [00:02:26] Ben, you mentioned a book called Plotto in our last episode, which is this weird algorithmic plotting book that a guy wrote in the '20s where he lays out the 1,462 possible book plots. [00:02:41] And we wanted to start talking about this because you found your copy of Plotto. === Pontiac Aztek Analysis (07:05) === [00:02:45] Yeah. [00:02:46] Yes. [00:02:46] Yeah. [00:02:47] Thank you for having me. [00:02:49] Robert, Sophie, fellow fans of Behind the Bastards. [00:02:53] Let's give just a short excerpt from the foreword of Plotto by William Wallace Cook. [00:03:01] Yeah, it begins. [00:03:03] Picture a man and a woman walking through a thick fog in London. [00:03:08] The year is 1926. [00:03:10] They are in love and they are miserable. [00:03:15] Blind men already. [00:03:16] Yeah. [00:03:17] I'm hooked. [00:03:18] Tell me more. [00:03:20] Yeah. [00:03:23] You know what? [00:03:24] Greenlit. [00:03:25] You get exactly two seasons on Netflix. [00:03:27] Make sure to end it on a cliffhanger that we won't resolve, because we have to pay you more if we do season three. [00:03:34] Yeah. [00:03:37] Robert owns Netflix now. [00:03:39] Yeah. [00:03:40] I fucking knew it. [00:03:41] I knew it was you. 
[00:03:43] You know, I traded in my Pontiac Aztek, which provided me with almost twice as much money as I needed to buy Netflix. [00:03:50] So yeah, no, I now own Netflix and no longer have a death trap car. [00:03:55] Hey, brother, they let me keep the Aztek. [00:03:58] They paid me not to sell it to them. [00:04:00] Sorry. [00:04:01] I'm just seeing native advertising. [00:04:03] I'm hearing native advertising for Pontiac here, for Big Pontiac. [00:04:07] Yeah, big advertiser on podcasts. [00:04:10] Dead manufacturer Pontiac. [00:04:13] Pontiac. [00:04:14] We apparently made real cars once. [00:04:16] And I'm wondering, I'm wondering, did you write the beginning of this show? [00:04:23] Did you, I guess we should go public with this. [00:04:27] Part one of this week's series was written entirely by ChatGPT. [00:04:34] Is that correct? [00:04:35] Yeah, I actually had meant to write part two in ChatGPT, but ChatGPT was so horrified by the task of writing Behind the Bastards that it attempted to hijack my Pontiac Aztek, which then drove itself into a mailbox and detonated on impact. [00:04:55] So RIP, ChatGPT. You know, it was, it contained billions and billions of lines of text, but it did not contain the ability to safely pilot a Pontiac vehicle. [00:05:07] Now, if I was a child, I would think this is an intriguing story, perhaps with some hidden treasure. [00:05:16] Yeah. [00:05:17] Yeah. [00:05:18] We could do a whole children's book about the Pontiac Aztek from Breaking Bad and what the fact that Walt has to drive such a piece of shit car says about his character. [00:05:29] Really, some of the best, most masterful character building. [00:05:32] These AIs could never do such effective character building as setting up the desperation of an impoverished chemistry teacher's life by showing him drive a Pontiac Aztek. [00:05:44] Pontiac, are you unhappy? [00:05:47] Take it on the road. [00:05:48] Yeah. [00:05:50] Pontiac, you won't live long in this car. 
[00:05:54] Also, Breaking Bad's really good. [00:05:56] It is quite good. [00:05:58] As good as the Aztek was a shitty car. [00:06:01] So when we left off, we were talking about, we went through some really remarkable looking Tyrannosaur images for this terrible coloring book. [00:06:11] Now, in addition to making three-legged T-Rexes, most AI image generators struggle to keep characters consistent across multiple images within a single book. [00:06:20] So if you're like, you know, you've got like 20 pages, right? [00:06:22] You need 20 illustrations for this book, and your character is this little girl in a zoo or whatever. [00:06:27] You can give it the same input, like describe the little girl the same in each prompt. [00:06:33] But it's really hard to get it to actually do the exact same girl in each illustration, right? [00:06:40] There are ways, there's whole guides to like keeping characters consistent, but it's a thing that like isn't easy. [00:06:47] And most of the creators that I follow just kind of ignored this because it is kind of a pain in the ass to do. [00:06:53] And they kind of trusted that what they were putting out, like the different illustrations, looked close enough that like the parents buying these books wouldn't notice. [00:07:01] And Sophie's going to show you, these are two pages from like a children's book about a little girl at a zoo. [00:07:07] It's like bad. [00:07:08] It's not about anything. [00:07:10] But you can see like the little girl. [00:07:12] This is supposed to be the same character, but that's like, those are clearly different little girls in both images. [00:07:19] Wow. [00:07:20] Yeah. [00:07:20] One of them has kind of curly hair. [00:07:21] One of them has straight hair. [00:07:23] They're both done in slightly different styles, but it is, they are kind of close enough that unless you're really looking, you might not notice it. [00:07:32] Most of the books I've seen, the consistency is even worse than that. 
[00:07:36] And the laziest example of this I found is from the comic book, the adventure comic book that we talked about last episode, which is titled Treasures Beyond Gold, even though no treasures, gold or otherwise, actually make it into the book. [00:07:51] Now, the author of this, Chris Hydorn, who, or Christian Hydorn, uh, he prefers, does very technical prompts for his images, but this still means that he's just asking the machine to draw an attractive Western man or an attractive young Asian woman. [00:08:05] Like, those are what he plugs in. [00:08:07] Um, and yeah, it's, it's not like the, so, for example, like one of the, one of the prompts he's got is like: slash imagine, blend of comic book art and line art in full natural colors, attractive Western man in his early 30s with short cropped brown hair and stubble beard, shirt in beige color, walking through a bustling Southeast Asian market reading treasure map. [00:08:28] Um, and this results in a comic book where every single page, both of our main two characters are completely different people, often drawn in different styles. [00:08:37] So, you can see a different style. [00:08:38] So, you can see in these two different images from two different pages. [00:08:41] In the first one, she looks like, I don't know, like a, you know, you've got like the lady character and she's kind of like got a t-shirt and what looks like a bandolier, a satchel around her shoulder. [00:08:55] She's got long straight hair. [00:08:56] Uh, the male lead looks like Dean Winchester from a, from Supernatural. [00:09:01] Um, you know, and he's got like a green over, he's not actually wearing beige like, like the prompt said, but like, and then in the second image from like a page or two later, she's been like anime'd up like 20%. [00:09:13] Her eyes are like three times as large. [00:09:15] Um, and then he's gotten like 15% Shaggy from Scooby-Doo added to him. [00:09:20] Like, they're not the same people. 
[00:09:24] She's got lip filler. [00:09:25] She's wearing like a, whatchamacallit, a sleeveless shirt now. [00:09:30] Um, and again, like, anime'd up a little bit, he's like, they're not the same people, like they're very, clearly very different looking characters. [00:09:38] They lost their, uh, they also, they lost their gear. [00:09:42] Yeah, they're wearing totally different clothing and equipment. [00:09:45] Yeah, yeah, and you can tell from the facial structure, like the, the, the curvature of the jaw. === AI Art Risks for Kids (15:14) === [00:09:51] You know what it is? [00:09:52] You know what it is? [00:09:54] If you're a parent thumbing through something like this, and uh, again, everybody tune into part one if you haven't listened yet. [00:10:01] Uh, if you're a parent thumbing through something like this or encountering this stuff with a cursory look, then the two images of people, they, they look like they could be related to each other. [00:10:15] But to your point, Robert, very clearly, not the same folks, not the same folks. [00:10:22] And like, obviously, there's the normal like weirdness, like, you know, the Dean Winchester version of the character looks better, but his neck is like cocked at a weird angle. [00:10:31] They made her, yeah, she's like a little kid in the second one. [00:10:35] Her face is the size of a child's face, and then, but this is so, yeah, this is uncanny. [00:10:44] It's weird, it's weird, and again, an adult who like actually looked at this would kind of recognize a couple of pages in, oh, this is like some weird shitty AI art thing. [00:10:54] But again, these books, I think, can be damaging to little kids. [00:10:57] And this is what gets me into the actual educational research that I did for this investigation. [00:11:01] Because there's a literary, uh, you were talking about this in part one, right? [00:11:08] This, this triggered, uh, or inspired a deep dive into a sort of, um, yeah, literary educational theory. 
[00:11:17] Because like, I wanted to know, is it like bad for kids to get handed nonsense books that aren't like, aren't about anything at all? [00:11:26] Um, and where the art is like not actually art, like where there's no intentionality behind it, it's just kind of like clip art placed more or less randomly, um, and often slightly warped by, you know, uh, a machine hallucination. [00:11:39] Um, and yeah, there's actually, there's a substantial body of scientific research into what is referred to as emergent literacy, right? [00:11:47] Emergent literacy is the, these are the reading and writing skills that a child possesses or builds before they can formally read or write. [00:11:57] So, when you are sitting down with your six-month, eight-month-old kid and you're going over a storybook, that kid can't read, right? [00:12:04] They can't like look at words and recognize what the individual words are. [00:12:08] But because you're reading them a story, they are starting to pick up on aspects of how stories are structured. [00:12:13] What it like, what a story contains, what characters are. [00:12:16] These are all things that they are picking up that aren't literacy, but also are a crucial building block to literacy, right? [00:12:24] This is emergent literacy, right? [00:12:26] That's, this is a critical part of kids learning how to read and learning how to appreciate reading, right? [00:12:32] It's why, like, when I was a kid, my mom, like, honestly, like 90% of her parenting strategy was basically make sure he always has a book in his hands. [00:12:42] Oh, nice. [00:12:43] Did you, uh, did, were you able to, did you have, uh, curation, a curator, uh, did you have autonomy? [00:12:53] When I was too young to pick my, because, you know, at a certain age, like, you know, I was six months, a year old or whatever. [00:12:58] I'm not really picking my own book. 
[00:12:59] She's just like, you were like, no, you know, but it was also like, I did have a lot of, like, my grandma had basically every National Geographic, and like I would see ones with pirates or dinosaurs. [00:13:09] And so I, I, I had like a lot of that shit. [00:13:12] Um, and as I, as I, as I got older, you know, when I was in second grade, I found my dad had a copy of The Lost World checked out from the library and I like demanded he renew it because there was a dinosaur skull on the front. [00:13:24] And so in second grade, I read The Lost World, which is not a book that's for second graders, but my mom's attitude was like, like, I had, my TV access was super restricted. [00:13:33] I couldn't watch like Ren and Stimpy or The Simpsons as a little kid, but like if it was a book, it didn't matter if there was fucking, if there was murder, if there was like sex crimes. As long as it was a book, it was okay. [00:13:44] That was like my mom's attitude. [00:13:45] If he's reading, it's fine. [00:13:47] I feel like I read Stephen King's It when I was probably too young. [00:13:52] That's a fucked up book for a kid. [00:13:53] Sure. [00:13:54] It's not ideal. [00:13:56] And my parents were like, hey, they were bragging. [00:14:01] They were like, hey, look, our kid reads this stuff on his own. [00:14:08] What a self-starter. [00:14:09] What a literate child. [00:14:12] And I actually think, we're joking about it, I actually think like kids reading about fucked up shit is good for them in a way that like maybe kids watching fucked up shit in movies or TV isn't, because there's the degree of, we're going to talk about this, like watching TV or a movie is much more of a one-way street, especially for a kid. [00:14:33] You know, as adults, you kind of get the ability to sort of interact with and analyze it more. 
[00:14:38] I do think that like watching TV or a movie is more of a one-way thing than like when you are reading; as we'll talk about, it's a dialogue between you and the book, right? [00:14:48] Like you are actively constructing meaning alongside that work. [00:14:52] Anyway, this gets us back to like emergent storytelling, because a lot of aspects of emergent storytelling are things like understanding that the illustrations in a book are carrying aspects of character and aspects of the story. [00:15:07] And so little kids, very little kids, one-year-olds, two-year-olds, earlier than you'd expect, have already started to realize that when they are looking at a storybook, what you're reading them is not the whole story. [00:15:20] The illustrations are part of the story. [00:15:22] One study on emergent reading strategies I read by Judith Lysaker and Elizabeth Hopper of Purdue University noted: emergent reading strategies such as wordless book reading are often seen as precursors to the meaning making that comes later during print reading. [00:15:36] And I actually found one study where like they would read kids a story and then they would hand them a copy of that storybook without any text on it. [00:15:44] And they would ask the kids to write the story. [00:15:47] And the kids would write more detailed stories than the original versions, because they're taking things that they recognize from the illustrations and adding that in when they recreate the story, you know? [00:15:59] Which is really interesting to me. [00:16:00] And that's what scares me a lot about a lot of these Midjourney-created children's books, because when you've got a story that somebody is wanting to tell, transmit information through their story, their illustrations are transmitting information too. [00:16:14] None of these illustrations in these AI books are transmitting information. [00:16:17] They are there to tick a box, but like the characters aren't interacting. 
[00:16:21] They don't even match what they're supposed to match, what the prompt says. [00:16:24] Like it's all off. [00:16:25] Like it may look like a human drawing, like these look like competent drawings, but they're not drawings of anything. [00:16:31] Nothing is being revealed in the faces of these characters, in their physical positioning, in the actions that they're shown taking part in. [00:16:39] Like none of that is actually present here because there's not a person driving the artwork. [00:16:46] And that's, that's really frightening. [00:16:48] Small children are info vacuums. [00:16:50] They are hoovering up observations about the world at a terrifying pace. [00:16:54] And before they can read, they come to understand things like story structure and the meaning of words and phrases by studying the illustrations that accompany text. [00:17:02] By breaking the illustrative part of a storybook, you are breaking the way in which kids learn to read at a fundamental level; the precursors to literacy are shattered by not showing them actual illustrations. [00:17:18] Like there's a real danger of that here. [00:17:21] The fact that these are so disjointed and wrong could fuck up the way kids sort of are understanding these stories on a very fundamental level. [00:17:30] Yeah. [00:17:31] And that's really frightening to me. [00:17:33] That's a real risk. [00:17:35] That's not even, I mean, we can call it a risk, but this is, as you've earlier established, man, this is a thing that is happening. [00:17:47] Yeah. [00:17:47] There is dangerous potential, sure, but that potential has to a degree been actualized. [00:17:53] You know, and some number of kids have had these books handed to them already. [00:17:59] And what, what, what would it be then? [00:18:02] I think everybody probably is thinking the same question. [00:18:06] What would the risk be? [00:18:10] Like, what is the, what is the worst case scenario? 
[00:18:14] Does a, does a child read a, does like a latchkey kid sit alone with their fantastically terrible children's stories? [00:18:27] And they, what, they go to a museum one day and they say, hey, the T-Rex skeleton is wrong. [00:18:35] Where are the thumbs? [00:18:36] I think that's, I think that's like one specific thing that could happen as a result of these like weird coloring books. [00:18:42] I think the scarier thing that could happen as the result of stories, one of them, is that like kids who become readers, who come to love reading and fiction and thus writing, then create, you know, culture, right? [00:18:54] Like large aspects of our culture are created by kids who love to read as kids and then become writers. [00:18:59] A necessary part of that is loving and understanding stories. [00:19:03] And a necessary part of that is gradually integrating the illustrations, which are the first things that you start to recognize in storybooks as a kid, into words and the way that words work and the way that words tell stories and describe characters and plot. [00:19:18] And this kind of can break that. [00:19:21] The risk is that like, and kids are, you know, potentially kind of fragile here. [00:19:25] Like you could, you could damage the ability of children to appreciate reading and to appreciate stories. [00:19:32] And maybe kids who would have been readers, who would have cared about this stuff or who would have wanted to create things, won't, because at a very early stage their understanding of what reading is for is broken, because it's not being, they're not having a conversation, you know, between an author and an illustrator and themselves and constructing meaning from that. [00:19:57] They are having this simulacrum of a story, this, this nonsense, these, the, the fucking potato chips of, of, of, like, even worse than potato chips. [00:20:08] If you ever, you know, you know, did you ever read Good Omens? [00:20:11] Yes, yeah. 
[00:20:12] Terry Pratchett and Neil. [00:20:14] Pratchett and Neil Gaiman. [00:20:15] One of the, it's, you know, it's, it's a book, the Antichrist is like the hero kind of, and there's like the four horsemen of the apocalypse are characters in it. [00:20:23] And one of them, Famine, is like kind of a little counterintuitive at first. [00:20:27] He's like, his, his big plot is like running this fast food franchise. [00:20:31] So none of the food has any nutrition in it. [00:20:33] Like there's nothing. [00:20:34] It's actually, you can starve to death eating this food. [00:20:37] Like that's what I think of when I think of these books and what they can do to like kids' ability, like, developing literacy. [00:20:45] I find that unsettling. [00:20:47] AI advocates, when you talk about how fucked up and wrong a lot of this looks, will talk about like, well, you know, this is just Midjourney version, I don't know, three or four or whatever, or this is, you know, version three or four of ChatGPT, and it's only going to get better. [00:21:00] Look at how much better it is now than it was, you know, before you knew these things existed. [00:21:04] It's going to get so much better. [00:21:06] You know, eventually it'll be seamless. [00:21:07] You won't be able to tell. [00:21:09] That's actually not a guarantee. [00:21:10] Nobody knows that. [00:21:11] For one thing, we're kind of off the map here. [00:21:13] Like one of the big, like there's aspects of how these things work that are kind of like unclear, even to the people making them, and aspects of how good can they be. [00:21:24] And one of the things I will say, I heard this from somebody who's in the industry recently who was like, you know, like, how do you tell whether or not the model is getting more intelligent? [00:21:34] And he's like, I don't know. [00:21:35] How do you tell if people are more or less intelligent? [00:21:37] We don't have a good agreement on that. [00:21:38] Like IQ is bullshit. 
[00:21:40] Like, yeah, that's actually a pretty good point. [00:21:42] You know, like you're not wrong. [00:21:45] But also just like the idea that these models will get better and better at storytelling, at creating, you know, fiction, at creating images to go with fiction, that is not a guarantee. [00:21:55] One of the reasons why that's not a guarantee is that the popularity of AI tools means that the internet, at a very rapid pace, is being flooded with more AI generated stuff, right? [00:22:05] More and more of this is getting spat out on the internet every day. [00:22:08] And because new AIs will be trained on this content, that means that new AIs, and AIs that are updated to have like more stuff from after 2022, are going to be trained on stuff that they generated. [00:22:22] So you are feeding AI art and text back into the model. [00:22:27] It's a feedback loop. [00:22:28] It can lead to what researchers call model collapse, right? [00:22:31] Which is the idea that like, well, the kind of, these derangements, these messed up illustrations, like the faults, if you're feeding this back and retraining it on flawed stuff that it already put out, it's going to just keep exacerbating those flaws. [00:22:45] A group of researchers published on the preprint server arXiv described this as what happens when, quote, the use of model generated content in training causes irreversible defects in the resulting models. [00:22:56] Oh, like that movie, Multiplicity. [00:22:59] Exactly like Multiplicity. [00:23:00] That's right. [00:23:01] That's right. [00:23:01] This is a Multiplicity kind of situation. [00:23:04] Yeah. [00:23:06] And I find this particularly worrisome because the present models are already pretty full of defects. [00:23:12] Take the storybook generated for a video called I Create a Best Selling Children's Book Using AI in Under an Hour. 
[00:23:20] Now, the creator of this video, whose name is Grayson Sands, looks like what you'd get if you fed Midjourney the prompt, what if Ron Weasley was a registered sex offender? [00:23:29] And I don't know if I'll include that joke in the final article. [00:23:33] It's mean and not proper for like a serious piece of journalism, but I don't like this guy. [00:23:41] It's also just, I mean, because we're an audio podcast, right? [00:23:44] So it's also just for everyone playing along at home. [00:23:51] Do look it up. [00:23:52] You know, I hate to body-shame. [00:23:55] He's got black hair. [00:23:55] He's got the big, like, white-framed sunglasses. [00:23:59] He's got a leather jacket on. [00:24:01] There's a guitar hung on his wall behind him for some reason. [00:24:04] He's got epaulettes, which for some reason bothers me. [00:24:08] It's like, again, we're very not into body shaming, but. [00:24:13] No, this is not about his body shape or anything about this. [00:24:16] This is his aesthetic choices. [00:24:19] His vibe. [00:24:19] There we go. [00:24:20] His vibe. [00:24:21] His vibe. [00:24:22] His vibe looks off. [00:24:25] So wait, in under an hour, we create a best-selling children's book. [00:24:31] Sounds like. [00:24:31] That's what he promises in the video. [00:24:33] Sounds like there should be some air quotes around a couple of these things, Robert. [00:24:38] All of them. [00:24:39] So he decides he's going to generate a book for kids based on a character in a tattoo he got. [00:24:46] He says, like, I got this tattoo for no reason. [00:24:48] It means nothing to me, basically, and it's a tattoo of a stegosaur playing a stand-up bass. [00:24:54] Um, so he got this stegosaurus, or triceratops, sorry, why did I say stegosaurus? [00:25:00] Jesus Christ, what a... see, [00:25:02] I used to know dinosaur stuff when I was a kid. [00:25:04] You forget. === Tattoo Book Hallucinations (05:32) === [00:25:06] It doesn't even look like a dinosaur. 
[00:25:08] Really, it's not a good tattoo, and it's one of the things that's like, look, I have a lot of tattoos. [00:25:13] I love tattoos. [00:25:14] I don't think every tattoo has to have deep meaning, but if you're, if you're specifically being like, I got this random tattoo for no reason, and now I'm going to make a book, I'm going to try to trick kids into reading a book about the character in this tattoo. [00:25:28] That just makes me angry. [00:25:29] It does. [00:25:30] I don't like it, Robert. Fuck you for emphasizing how little you care about what you're making. [00:25:35] So yeah, he feeds a picture of this tattoo in, and this is one of those, like, there are cool things these AIs can do; the fact that you can take a picture of your tattoo and say, hey, use this as an illustration in a storybook. [00:25:45] That's kind of neat. [00:25:47] Um, you know, considering the crude drawing that it has to work with, Midjourney does a decent enough job turning this into a character. [00:25:55] Um, it's like, it's okay. [00:25:56] It's not good, but it's like fine, it's off-ish. [00:26:00] It's off-ish, but what's really off? [00:26:03] It's intriguing. [00:26:05] Yeah, there's there. [00:26:06] What's really off is the world behind the character. [00:26:10] Um, there's almost a little bit of a Seussian vibe, but like without any kind of intent behind it, which is unsettling. [00:26:17] And this becomes more obvious in the subsequent pages, which get outright hallucinatory. [00:26:21] Look at this, like there's random music signs, like, drifting through, like, hung like garlands on the, the branchless trees. [00:26:29] All of the dinosaurs have these weird long arcing heads that almost make them shaped like question marks. [00:26:34] Like, the sun has a giant question mark in the middle of it, and then there's a second question mark next to the sun for no reason. [00:26:42] This is very Salvador Dalí. [00:26:44] Yeah. [00:26:46] A little bit Dalí-esque, but really bad. 
[00:26:49] Yeah. [00:26:50] And then check out this next page, because as it goes on, like, it gets increasingly more divorced from anything that, like, how would you describe this? [00:27:00] Almost pornographic, right? [00:27:01] Like those are tits on that tree. [00:27:03] I was going to say, those are, those are genitalia. [00:27:07] Yeah. [00:27:07] Like, yeah, there's, there's some dicks and some tits in this one. [00:27:11] It's weird, right? [00:27:14] Yeah. [00:27:14] I can admit, like, there's something interesting and amusing about how hallucinatory this is. [00:27:19] If this was like a Winamp visualization, like if I was, if I had just taken a bunch of 5-MeO-MiPT and was like putting on a Murder City Devils record and sat like a fucking, I don't know, yeah, like, Winamp visualizations, and I got shit like this. [00:27:34] I'd be like, oh, cool. [00:27:35] That's kind of neat. [00:27:36] But it doesn't have anything to do with the story, right? [00:27:40] The story that this guy has had ChatGPT write for his book is about a brachiosaurus who plays piano teaching a triceratops how to play bass, right? [00:27:49] Like this isn't, there's no reason for the art to look like this. [00:27:52] It's not like in keeping. [00:27:54] He hasn't, like, written, he hasn't, like, written, like, a psychedelic dinosaur story here. [00:28:01] And thus this is kind of, like, not fitting. [00:28:02] It's like a very basic story about a dinosaur that wants to learn bass and meets a friend who teaches him how to do music. [00:28:08] And like it doesn't make any sense that it looks this way. [00:28:10] It's just confusing to kids. [00:28:12] At the end of the video, Grayson tells us how he got this book to be a bestseller. [00:28:17] And it turns out he just called up a bunch of his friends and family and told them he'd written a book that was on Amazon. [00:28:23] And then he begged them all to buy it. [00:28:24] They did. 
[00:28:25] And then it went up through the rankings, because it doesn't take a lot to do that with print books. [00:28:30] And it started appearing higher in Amazon search results. [00:28:33] And he's like, hey, guys, I wrote a book. [00:28:35] Would you buy it? [00:28:35] And they're like, wow, you wrote a book. [00:28:37] And like, man, when they get a, when they get a sight of what you've actually done, I don't know. [00:28:43] Don't invite this kid. [00:28:44] Look, if you're his parents, it's time to just cut bait, you know? [00:28:49] Lock the doors when he comes by for the holidays. [00:28:51] Don't let Grayson in. [00:28:53] It's very, it's, it is very, um, what we're seeing here, to your earlier point about this breaking kids, right? [00:29:01] And fundamentals of storytelling, is now we're seeing, if you're a kid, you're reading this and you're seeing two very divided things. [00:29:11] There are two different stories being told. [00:29:13] One is kind of Mad Libs style, about a Brachiosaurus who feels authoritative enough on bass to teach Triceratops or whatever. [00:29:24] And the second thing is the clear, violent mental decline of an artist. [00:29:32] That's what it looks like. [00:29:34] The styles keep switching. [00:29:36] You know what I mean? [00:29:37] Like, I get what you're saying, man. [00:29:41] I think that is, I think another dangerous part of this is that if you were a kid reading this, right? [00:29:48] When you're a kid, you prize the books. [00:29:51] You prize the information you have access to as a sponge. [00:29:55] So if you are reading these things and you're quite impressionable, then you will have these sort of indelibly imprinted in your mind. [00:30:08] And years, decades later, you might say, Triceratops? She'd get into jazz. [00:30:16] Yeah, exactly. [00:30:17] This could lead to a whole new world of jazz heads. [00:30:21] And then they'll be smoking their jazz cigarettes. [00:30:24] We don't need that kind of shit. 
[00:30:26] Listeners, regular listeners will know this, but if you're new to the show, Ben and I are both the angry faculty members from Back to the Future who were trying to stop Marty from hanging out with those jazz singers. === Jazz Today Missing (04:10) === [00:30:38] That was our big break. [00:30:42] Speaking of old-timey 50s people being bigoted against jazz music, you know who also hates jazz? [00:30:50] Sponsors of this podcast. [00:30:52] Oh, yeah, yeah, yeah. [00:30:54] Hate it. [00:30:54] We make them send us a certified document stating that, with their signature on it, or else. [00:31:03] It's a jazz affidavit. [00:31:06] Yeah, that's right. [00:31:12] There's two golden rules that any man should live by. [00:31:16] Rule one, never mess with a country girl. [00:31:20] You play stupid games, you get stupid prizes. [00:31:22] And rule two, never mess with her friends either. [00:31:26] We always say, trust your girlfriends. [00:31:30] I'm Anna Sinfield, and in this new season of The Girlfriends... [00:31:34] Oh my God, this is the same man. [00:31:36] A group of women discover they've all dated the same prolific con artist. [00:31:40] I felt like I got hit by a truck. [00:31:42] I thought, how could this happen to me? [00:31:44] The cops didn't seem to care. [00:31:46] So they take matters into their own hands. [00:31:49] They said, oh, hell no. [00:31:51] I vowed I will be his last target. [00:31:53] He's going to get what he deserves. [00:31:58] Listen to The Girlfriends. [00:31:59] Trust me, babe. [00:32:00] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:32:10] I'm Laurie Segall, and on Mostly Human, I go beyond the headlines with the people building our future. [00:32:16] This week, an interview with one of the most influential figures in Silicon Valley, OpenAI CEO Sam Altman.
[00:32:23] I think society is going to decide that creators of AI products bear a tremendous amount of responsibility for the products we put out in the world. [00:32:29] From power to parenthood. [00:32:31] Kids, teenagers, I think they will need a lot of guardrails around AI. [00:32:35] This is such a powerful and such a new thing. [00:32:37] From addiction to acceleration. [00:32:39] The world we live in is a competitive world, and I don't think that's going to stop, even if you did a lot of redistribution. [00:32:44] You know, we have a deep desire to excel and be competitive and gain status and be useful to others. [00:32:50] And it's a multiplayer game. [00:32:53] What does the man who has extraordinary influence over our lives have to say about the weight of that responsibility? [00:32:59] Find out on Mostly Human. [00:33:01] My highest order bit is to not destroy the world with AI. [00:33:04] Listen to Mostly Human on the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. [00:33:12] Hey, I'm Norah Jones, and I love playing music with people so much that my podcast called Playing Along is back. [00:33:18] I sit down with musicians from all musical styles to play songs together in an intimate setting. [00:33:23] Every episode's a little different, but it all involves music and conversation with some of my favorite musicians. [00:33:28] Over the past two seasons, I've had special guests like Dave Grohl, Leve, Mavis Staples, Remi Wolf, Jeff Tweedy, really too many to name. [00:33:38] And this season, I've sat down with Alessia Cara, Sarah McLachlan, John Legend, and more. [00:33:43] Check out my new episode with Josh Groban. [00:33:46] You related to the Phantom at that point. [00:33:49] Yeah, I was definitely the Phantom in that. [00:33:51] That's so funny. [00:33:52] Share each day with me, each night, each morning. [00:34:01] Say you love me. [00:34:04] You know I.
[00:34:05] So come hang out with us in the studio and listen to Playing Along on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:34:15] We're back. [00:34:16] And boy, you know, you know what jazz today is missing? [00:34:21] What's that, Robert? [00:34:22] Jazz today is missing William Riker wearing a onesie, sitting awkwardly backwards in a chair, playing the saxophone in order to win the love of an AI-generated woman. [00:34:35] In that one episode where like he and Picard are kind of gooning together on the holodeck, if you know the term gooning, in my favorite episode of Star Trek, by the way. [00:34:44] I have learned so much in the space of less than 60 seconds. === James Joyce Absurdity (15:35) === [00:34:49] Yeah, that's right. [00:34:50] We're going to get some t-shirts out for you people. [00:34:52] That's Riker and Picard just gooning out together with that AI lady in the jazz club. [00:34:58] Just really, really, really, because it's the 24th century. [00:35:02] They don't have shame like we do, you know? [00:35:05] You act like Rory Blank couldn't make a beautiful design with that. [00:35:09] I intend to have Rory Blank do a Picard and Riker gooning t-shirt. [00:35:14] Hi, Rory. [00:35:15] Well, they're a post-shame economy in the world, in the universe of Star Trek. [00:35:20] Oh, absolutely. [00:35:21] No, there's no such thing as shame. [00:35:24] They are all as horny as a letter from, oh shit. [00:35:31] I just ruined this joke because as a letter from James Joyce to his wife. [00:35:38] Yeah. [00:35:38] Google James Joyce love letters. [00:35:40] You'll learn some fun things about one of history's greatest artists. [00:35:45] So speaking of not one of history's greatest artists. [00:35:50] Sure. [00:35:50] Under an hour. [00:35:51] Under an hour, he wrote this book. [00:35:53] Yeah. [00:35:54] Yeah. [00:35:54] James Joyce wrote Ulysses in less than an hour. [00:35:57] It took him a lot longer.
[00:35:58] You know, actually, when you read Ulysses, because one of the most famous scenes in that book that got it attacked by a lot of anti-obscenity laws is one of the characters walking along the strand, basically, and masturbating through a hole in his pocket while his wife fucks some other guy. [00:36:15] And when you are reading a story about a man masturbating through a hole in his pocket, again, all fiction is a dialogue between reader and author. [00:36:24] You're kind of gooning with James Joyce. [00:36:27] Beautiful. [00:36:28] Good point. [00:36:28] Isn't that beautiful? [00:36:29] Inspiring. [00:36:30] Art can really take us to some amazing places. [00:36:35] It sure can. [00:36:36] You know, literacy is the first holodeck, you know? [00:36:39] Exactly. [00:36:41] It's also like, you know, the thing I think people forget often is that being able to read, to encounter a story, is sort of the closest folks have gotten to necromancy, right? [00:36:58] That's right. [00:36:59] Time travel, speaking with the dead. [00:37:01] And you got to be careful with this stuff. [00:37:04] Also, Finnegans Wake, man. [00:37:07] Just between us, I can tell you're a fellow enthusiast. [00:37:10] Oh, Wakehead? [00:37:11] Absolutely. [00:37:12] Yeah. [00:37:13] Our filthy, our filthy pal, James. [00:37:17] Do you think James Joyce knows what Finnegans Wake is about? [00:37:21] I go back and forth on that. [00:37:23] One of my fiction teachers, when I was a younger man, was very angry whenever you would bring up Finnegans Wake, and just, like, he hated James Joyce and he thought the whole book was a con. [00:37:36] Sometimes I'll go through it when I'm trying to go to sleep, because it's actually, I think, like Blood Meridian, challenging in some ways that aren't dissimilar, because so much of reading Blood Meridian is like, what the fuck did he mean by that?
[00:37:50] Like, what was actually going on here? [00:37:53] Blood Meridian is much closer to a normal novel than Finnegans Wake is, though. [00:37:59] But like, they're both interesting. [00:38:01] Also, when I was a kid, I read a series of mystery books that were themed after Finnegans Wake, where the character's name was Finnegan Zwake, like Z-W-A-K-E. [00:38:11] Weird. [00:38:11] I just remembered that now. [00:38:14] What an odd idea, to make a series of children's mystery novels themed after James Joyce's Finnegans Wake. [00:38:20] ChatGPT, make an intriguing story about a dinosaur learning to play the piccolo, written in the style of Cormac McCarthy. [00:38:31] Oh, also make it mind-blowing. [00:38:33] Yeah, yeah, definitely make it mind-blowing. [00:38:36] God, especially, yeah, okay. [00:38:39] So we were just talking about this fucked up Dandy the Dinosaur, the dog shit book that this guy generates based off of his tattoo. [00:38:50] At present, there's only two reviews for it. [00:38:53] Both of them are five stars, but like, I don't know if they're real or not, or if he had family members write reviews for him to try to help his book sell. [00:39:01] So it's one of those things. [00:39:02] He's gaming the system here, right? [00:39:03] He's having his friends and family buy a bunch of copies to try to shoot it up the Amazon rankings in the hope that that generates organic sales. [00:39:12] I can't tell, based on the information that's available to us, if Grayson's effort to game the system was successful or not, if it made him money. [00:39:20] But the basic tactics he's engaging in can be used and will be used by other people generating AI books who have access to more resources and who might have actual agendas. [00:39:32] Because this is the kind of thing that already happens, right?
[00:39:34] Like, if you've got, you know, a celebrity or a politician who releases some ghostwritten book, and they work for the Heritage Foundation or some fucking think tank or whatever, that think tank will buy like 10,000 copies of the book so that it makes it on the New York Times bestseller list. [00:39:51] Like, that's a common tactic, right? [00:39:53] And what Grayson was kind of incompetently attempting to do is the Kindle version of that. [00:39:59] But there's no reason why people who have an actual agenda couldn't generate a series of children's books and buy a bunch of copies of them to get them to shoot up in the rankings, in the hope that that tricks a bunch of charities and parents and all these kind of libraries and stuff, to flood the market, and to flood kids, with copies of whatever book they've got. [00:40:19] And when it comes to how that could be unsettling, that brings me to another unfortunate soul, another unfortunate AI author. [00:40:29] His name is Lucas Kitchen. [00:40:32] You can see Lucas up there in the front image. [00:40:36] Can I just say, all these people, same vibe? [00:40:39] They all have the exact same vibe. [00:40:41] You know, they all have strong opinions about ape pictures and how much money they might be worth. [00:40:47] I can't wait to hear their opinions on NFTs. [00:40:50] Yeah. [00:40:50] Oh, yeah. [00:40:51] And yeah, Lucas, you know, based on the illustration that we've got down there for his book, it features this kind of old man on one side looking at a leather-bound book, and then a puppy dog in the middle with some creepy gnomes under it. [00:41:08] And then this very demonic-looking rabid unicorn character that does not look like the same style of art as the other characters.
[00:41:17] So looking at the cover of his book, you would guess that it's like the story of an old man and his dog maybe getting murdered in the woods by a unicorn. [00:41:25] But what he's actually writing here is much more frightening, because Lucas is an evangelical Christian. [00:41:30] He might be a fundamentalist. [00:41:32] He writes science fiction books about, like, proselytizing on Mars. [00:41:37] He does a large number of, like, weird kind of Christian evangelical fiction. [00:41:47] And the story prompt that he feeds his AI is one of the more absurd ones I've come across. [00:41:52] And I'm going to read that to you now. [00:41:54] Write a children's book where the protagonist is a little puppy named Fluff. [00:41:57] Fluff wants someone to tell him about Jesus, but he can't read John 3:16. [00:42:02] So he needs a solution. [00:42:03] The antagonist of this story is a bad unicorn. [00:42:06] In the story, include a field with trees. [00:42:09] Also, include magic in the forest. [00:42:11] Also, include a character who says, uh-uh, over and over. [00:42:15] Now, if that sounds weird, it's because Lucas has his kids help him write the prompt, which, you know, I try not to be totally negative. [00:42:22] I can see, like, if you're a parent and you've got kids who are a little bit older, who can read a bit themselves, you sit down and you have them, all right, you know, like you would, what kind of bedtime story do we want? [00:42:33] Give me some character names and a plot, and let's plug it into ChatGPT. [00:42:37] And then, if you're a good parent, a way that this could be a good learning exercise is it generates a crappy AI story. [00:42:42] And then you sit down and you go over it with your kids and you go, well, what's missing here? [00:42:46] Why doesn't this work?
[00:42:47] You know, what is, like, why isn't this complete? [00:42:50] What are the things that we would add to this in order to make an actual proper story? [00:42:54] You could actually teach kids something about storytelling in a way that would be useful doing that. [00:43:00] That's not what Lucas is doing here. [00:43:02] Don't worry. [00:43:02] I mean, he's not. [00:43:04] You know, because I know one of the things we get sometimes is the idea that we are not fun at parties, or we're dark or depressing, but I think you laid out a really good hypothetical scenario. [00:43:20] A what-if of children and parents communicating, emergent storytelling, creating dialogue. [00:43:27] And I just want to take a moment, before we go towards some even more troubling horizons, to say that was really nice, man. [00:43:37] Yeah. [00:43:37] That's a really cool idea. [00:43:39] I try to not be totally negative about all of this. [00:43:40] So one of the things I did researching this is I played some of these videos for a friend of mine who is a young mother. [00:43:47] And, you know, who is not nearly as online as I am. [00:43:52] And she's not someone who is as much of a pessimist about all of this stuff as I am. [00:43:57] And she was like, oh, you know, this is creepy, the fact that these people are just shotgunning novels out onto Kindle to trick people. [00:44:04] But like, I could see it being cool. [00:44:05] Maybe I could make a story and the AI could help me, because I'm not a writer. [00:44:10] And I could, you know, use it to generate the bones, and I could fill it out and make a custom little story for my kid. [00:44:16] And that might be nice. [00:44:16] I'm like, yeah, sure. [00:44:17] I don't think that's harmful. [00:44:19] Like, that's fine as long as you're not just printing what the AI gives you.
[00:44:22] If this helps you make a neat little bespoke storybook for your kid and that makes you feel good, that's fine. [00:44:29] You know, I'm not saying that; the problem is that all of this stuff is immediately being taken by the worst common denominators in our society, right? [00:44:37] The same people who were trying to get you to spend your life savings on monkey drawings a year ago are now shotgunning hundreds and hundreds of books a month onto Amazon that are going to do damage we probably don't fully understand to any kids who read enough of them. [00:44:56] That's the problem. [00:44:57] Not that there aren't cool uses for this stuff here, but Lucas is not doing anything cool here. [00:45:02] This book, this weird Jesus book that he generates using ChatGPT, is as nonsensical and devoid of actual plot as all of the others we've seen. [00:45:11] And I'm going to read you the text that the chatbot cooks up for this story. [00:45:15] Oh, buddy. [00:45:16] Once upon a time, in a magical forest with fields full of trees, there lived a little puppy named Fluff. [00:45:22] Fluff was a curious puppy and he loved to explore the forest and learn new things. [00:45:26] One day, Fluff heard about a man named Jesus who was very special and had done many wonderful things. [00:45:31] Fluff was very interested and wanted to learn more about Jesus, but there was one problem. [00:45:36] Fluff couldn't read. [00:45:40] Oh, God. [00:45:41] That's so weird. [00:45:44] First off, if you're a fucking dedicated enough Christian that your AI scam book is trying to get kids hooked on Jesus, can you describe him better than he was special and did wonderful things, right? [00:45:57] Like, at least be like, yeah, he saved people's souls. [00:46:00] He made it possible for us to go. [00:46:02] Like, you can say more about Jesus than this. [00:46:06] Well, he did nice stuff. [00:46:08] Yeah.
[00:46:08] Checking the boxes, right? [00:46:10] Like, there's this guy, Jesus. [00:46:12] He's got a good vibe. [00:46:14] Yeah, he got a cool vibe. [00:46:15] Yeah. [00:46:16] Yeah. [00:46:16] Dogs can't read, though. [00:46:17] So that's like a problem. [00:46:19] Guess I'm fucked. [00:46:21] So wait, this is also dangerous, is it not, because we're verging from the land of Mad Lib, check-the-boxes absence of motivation, right? [00:46:35] Into the land of proselytizing or propagandizing without an understanding of it. [00:46:46] Like, that's very weird. [00:46:47] That's very off-putting. [00:46:49] Yeah, the scale of propaganda that's possible when you don't even have to have anyone actually write or illustrate it. But also, with propaganda that's that disjointed and incoherent, does it all just become noise and get lost? [00:47:03] And then the primary problem is just that it floods the zone and makes it hard to find stuff that isn't this kind of trash? [00:47:08] Or does being exposed to so much of this disjointed, weird robotic hallucination shit alter the way that we think? [00:47:16] We don't really know yet, you know? [00:47:17] We didn't know what social media was going to do to us. [00:47:20] And now all of us, we have the attention spans of a fucking fruit fly. [00:47:24] So the dopamine casino. [00:47:26] Yeah. [00:47:29] There's a point in this fucking off-putting video where he has an AI narrator narrate this book, because again, why involve human creativity in any way, shape, or form? [00:47:44] Alongside these terrible images that it's generated, with video of his kids reacting to it. [00:47:50] So you can see how his children react to the story. [00:47:54] So we're going to play a section of that, and keep an eye on their faces. [00:47:59] The unicorn stopped in her tracks.
[00:48:02] She couldn't move any closer to Fluff and the old man. [00:48:09] The old man smiled and said, See, Fluff, Jesus not only gives us eternal life for free, but he protects us too. [00:48:18] We just have to trust him and ask for help. [00:48:22] Fluff was amazed. [00:48:24] He thanked the old man for teaching him about Jesus. [00:48:27] From that day on, Fluff was a happy puppy, and he always remembered the lesson the old man had taught him. [00:48:33] He knew that no matter what, Jesus gives eternal life to those who believe and helps them every day. [00:48:40] Evil. [00:48:41] This shit is evil, man. [00:48:50] Oh my God. [00:48:52] It's so hard. [00:48:53] Because if you watch the video of the kids, at the start of it, they're all kind of sitting up. [00:49:01] And as it goes on, one girl puts her head into her hands. [00:49:03] The other has her head resting on her hands. [00:49:06] Like, they look uncomfortable. [00:49:09] They're not enjoying this story that they're being fed. [00:49:13] Oh, it's so fucked up. [00:49:16] Also, dog shit story. [00:49:20] But yeah, the fact that in his video, where he's trying to show how well robots can generate Christian propaganda, his own kids could not be less engaged in this shit, which maybe is a good thing, right? [00:49:35] Maybe the fact that they're so inherently bored by it means it won't do as much damage as I'm afraid. [00:49:41] But you know what will do as much damage as I'm afraid of? [00:49:45] Jazz. [00:49:46] Well, yeah, jazz. [00:49:47] Fundamentally a mind poison, you know? [00:49:50] That's why the teens these days, they're always pulling up in their jalopies to the malted milkshakes and the civil rights movement. [00:50:01] God damn it. [00:50:03] Anyway, here's ads. [00:50:07] I like how we're attacking jazz unreasonably in an unrelated way. [00:50:11] Okay. [00:50:12] Let's take it down.
[00:50:12] Let's take it down. [00:50:13] Jazz has had enough time in the sun. [00:50:17] All right. [00:50:18] Here's ads. === Poisoning Teens with Ads (02:19) ===
=== Plagiarism and Guardrails (15:26) === [00:53:28] Ah, we are B-A-Dou. [00:53:36] Which is how I spell back. [00:53:37] So, yeah, this is like fucked up, and I don't think his kids liked it, but it is the kind of thing that I worry about, or at least there's like evidence of the kind of thing that I worry about of like where some of the direction some of this might head.
[00:53:55] Because churches, and also political organizations, groups like Turning Point USA, who are already targeting young people and have access to a lot of money, have the potential to generate huge amounts of propaganda content and to buy up copies to get Amazon to spread it. [00:54:11] And in order to do that, in order to trick large numbers of parents and kids into buying books that contain weird right-wing propaganda, like, they're already doing versions of this con all throughout the world and throughout media, throughout our culture. [00:54:28] I am deeply concerned about the ability to spread it in more subtle ways through these fake children's books and shit. [00:54:35] Right. [00:54:35] Oh, like a story about dinosaurs who learned that there are bathrooms only for some specific types of dinosaurs. [00:54:45] And that's why, you know, when they stop doing that, when they let all of the dinosaurs use whatever bathroom they feel they should use, that's when the meteor hits, because God decided to kill them all. [00:54:57] Yeah, I don't know. [00:54:59] I think it could be fucked up. [00:55:01] And, you know, again, there's some actual rigorous reason to think that this could cause some serious damage. [00:55:07] In the book Literature as Exploration, which is a very influential book on literary theory by Professor Louise Rosenblatt, [00:55:14] Rosenblatt argues that the reader, as I've been talking about, is a crucial part of any piece of literature. [00:55:21] She writes, there is no such thing as a generic reader or a generic literary work. [00:55:26] There are only the potential millions of individual readers or the potential millions of individual literary works. [00:55:32] A novel or a poem or a play remains merely ink spots on paper until a reader transforms them into a set of meaningful symbols. [00:55:40] And what, yeah, it's beautiful.
[00:55:42] And what she means by this is that books, from Blood Meridian to Hop on Pop, are a dialogue between writer and reader. [00:55:49] The machines that are generating these stories cannot participate in a conversation. [00:55:54] They are mechanical Turks. [00:55:56] They are not conversing. [00:55:57] They are guessing what word comes next based on a mix of complex math and the labor of Kenyan contractors paid $2 an hour to make sure that the responses aren't too racist. [00:56:07] This is a problem in part because one of the things we know about how books impact people is that reading real books, reading novels, teaches empathy. [00:56:17] It is common knowledge and well documented that reading long-form fiction makes people better able to identify with other people's thoughts and struggles. [00:56:25] Being a reader makes you more empathetic, right? [00:56:28] This is well established and well documented. [00:56:31] Educational researchers have found that very young children can actually be influenced towards engaging in new behavior by the stories that they read. [00:56:40] And I'm going to quote from an article by Peggy Albers in The Atlantic here. [00:56:44] Stories can be used to change children's perspectives about their views on people in different parts of the world. [00:56:49] For example, Hillary Jenks works with children and teachers on how images and stories on refugees can influence the way that refugees are perceived. [00:56:57] Kathy Short studied children's engagement with literature around human rights. [00:57:01] In their work in a diverse K-5 school with 200 children, they found stories moved even such young children to consider how they could bring change in their own local community and school.
[00:57:11] Now, in that last case, with Kathy Short, she basically read these students from collections of books about kids who did amazing things. [00:57:24] Right, right, yeah. [00:57:24] Kathy reads a bunch of these K-through-5 students the story of an anti-child-labor activist, a real kid named Iqbal Masih, who was murdered at age 12 as a result of attempting to end child labor. [00:57:36] And I think it was in Pakistan. [00:57:38] And the kids that she's reading these to, very young children, are so moved by the story of this person, and by what they've read, that they decided to create a community garden, and they built a community garden together and then grew food that they donated to a local food bank. [00:57:54] Like, this was months and months and months of work. [00:57:57] And kind of the point that Kathy was making with this research is that something as simple as reading a single story can inspire and influence young children to take action, months of action, to seriously engage themselves in things, because that's how influential stories can be on behavior, and particularly the behavior of children, because they've been fed less shit. [00:58:23] It means more when a kid encounters a story than it does when you do, because you've got a lot more stories in your head. [00:58:30] And that's part of what is unsettling to me about all this. Because, like, is it possible that an AI-generated story about Iqbal Masih could inspire little kids in a school to take positive action like that? [00:58:45] Maybe. [00:58:46] Is it possible that an AI-generated story could inspire kids to take negative actions? [00:58:52] Maybe. [00:58:53] We don't know.
[00:58:54] But the thing that's most frightening to me is we're all going to learn the answer to these questions together in the very near future, whether we want to or not. [00:59:04] That is chilling. [00:59:07] Yeah, it's cool. [00:59:07] It's good stuff. [00:59:08] Love that we're doing this. [00:59:12] I would say that for many of us playing along at home, this concept might sound somewhat abstract. [00:59:21] This might sound somewhat hypothetical, or a thought experiment or something, or one of the only 1,462 possible stories in Plotto. [00:59:33] But the reality that you have outlined here, Robert, is stark. [00:59:39] It is inevitable that this will occur. [00:59:43] It's happening now, right? [00:59:44] You're coming to us in real time. [00:59:47] You've cited multiple stories. [00:59:50] Yeah, it could happen here. [00:59:52] I don't want to do fanboy stuff. [00:59:54] But the point is, I think the point is sobering. [01:00:00] And one question that a lot of folks are going to have here is what, if anything, will people do with this knowledge? [01:00:12] Like, for people who are listening now, who have kids or who have loved ones, who have any sort of ability to curate access to information, is there something they can do? [01:00:27] Yeah. [01:00:28] I mean, I think number one, if you're a parent, if you're someone who buys gifts for kids, you know, because you've got some in the family or whatever, be aware of this. [01:00:38] Be aware of what's out there. [01:00:39] Be aware that you can't just look at, oh, this kid likes coloring books, let me see what the most popular dinosaur coloring book is, right? [01:00:48] Take a second look. [01:00:49] Take more of a look at the reviews. [01:00:52] See if you can look through a couple of different pages from it on the Amazon listing. [01:00:57] See if it has any of these hallmarks. [01:00:59] Again, if you go to shatterzone.substack.com, you'll find the actual article version of this episode.
[01:01:08] Look at the images we've put up there and take a look to see if you can spot any of these hallmarks. [01:01:13] Take a look at the text. [01:01:14] It should be pretty obvious to you, an adult, if the text is AI generated. [01:01:20] That's kind of the first, most basic thing you can do: be aware of what's possible, try to keep an eye on it, and make sure that you don't contribute to paying these people or to getting more of this stuff out to kids. [01:01:34] I think the other things that can and should be done: number one, we could be pressuring Amazon to make it harder to do this stuff, make it clear what their plagiarism detectors are, make it clear where their lines are for AI work being crapped out to children. [01:01:48] Like, do they have any restrictions there? [01:01:50] Are you just allowed to put up as many of these random books as you want without any kind of limitation? [01:01:55] Well, so far, that's the situation. [01:01:57] Can Amazon be pressured to take a different tack? [01:01:59] Well, if there actually was enough bad PR, perhaps. [01:02:03] I just want to pause right there, because that sounded like a bar. [01:02:08] Like, that sounded like you were about to drop a beat, with the internal rhyme scheme and the cadence. [01:02:15] Everybody play that back. [01:02:16] Play that part back. [01:02:17] Set it to a beat. [01:02:18] Someone can fix that up for us. [01:02:21] But yeah, you know, look, that's what I would like people to do. [01:02:25] There needs to be pressure on Amazon about this stuff, among other things. [01:02:29] Again, I've reached out to several of these creators for comment, and I reached out to Amazon for comment on a number of things. [01:02:35] They haven't gotten back to me. [01:02:37] I know a lot of journalists listen to this stuff. [01:02:40] There's room for other articles on this. [01:02:41] It'll get good traffic. [01:02:43] Everybody reads AI shit.
[01:02:44] Get out there yourself and make them answer some of these things. [01:02:50] You know, there's a couple of different questions that I asked Amazon, and I'll read them right now. [01:02:56] Number one, does Amazon restrict the publication of AI works on Kindle in any way? [01:03:01] Does it make a difference if the works are marketed towards children? [01:03:04] Number two, does Amazon keep data on how many AI-generated books of various types are selling? [01:03:09] Number three, in the guides that I have watched, creators discuss how the text they generate sets off Amazon's plagiarism detectors. [01:03:17] To get around this, creators use a service called Quillbot, which replaces adjectives with synonyms. [01:03:21] Is this a violation of Kindle/KDP terms of service? [01:03:26] You know, these are pretty basic questions. [01:03:27] There's more to be asked, but the sheer fact of being reached out to and talked to matters; if there's enough bad press, it's theoretically possible that they might make it harder for these people to do what they're doing. [01:03:39] Likewise, you know, I think that with these different sort of grindset creators, for one thing, it's possible that there's some violation of YouTube's terms here, in that they are admitting to engaging in plagiarism and then finding ways around it. [01:03:58] Like, I think there's an argument to be made, at least, that there might actually be rights issues here. [01:04:03] If the initial text of this is plagiarism, and they are disguising that plagiarism and then profiting off of it, there's a degree to which they could be in a legally dicey area. [01:04:13] It's possible that you could get YouTube to take action against some of this content. [01:04:17] I don't know.
[01:04:19] It's largely a matter of: if you can get enough people angry about this on behalf of the kids, then these companies will eventually take action, not because it's the right thing to do, but because if enough people get angry, corporations tend to take the coward's way out, which is what the angry people want. [01:04:36] Yeah. [01:04:37] So I don't know. [01:04:39] That's my only suggestion right now. [01:04:41] I'll keep thinking about it. [01:04:42] Yeah. [01:04:43] I think that's great, though. [01:04:45] This is actionable advice. [01:04:48] I hope so. [01:04:49] Yeah. [01:04:50] We'll see. [01:04:50] Yeah. [01:04:51] Yeah. [01:04:51] We'll see. [01:04:53] It'll be interesting to listen to this episode in, what do you think, five years? [01:05:00] Yeah. [01:05:01] Yeah. [01:05:01] When all of our jobs have been replaced by AI except for Sophie's, and the entire world of media is just Sophie sitting down to talk with Harrison Ford about his new baldness cream, and Sophie sitting down with Joe Rogan to talk about steroids. [01:05:19] Sophie sitting down with AI Me to read a story about Hitler that uses facts that were just made up in the ether. [01:05:27] There's real life. [01:05:29] There's no way robot you could atonal shriek the way that real you can. Sophie, you understand the only thing that I'm actually proud of is my atonal shrieking. [01:05:41] I went to school for four years to learn how to shriek like that, you know? [01:05:45] Yeah, I know, I know. [01:05:47] Yeah, that's Dr. Atonal Shrieking. [01:05:49] That's right. [01:05:50] That's right. [01:05:50] Yeah. [01:05:51] Yeah. [01:05:51] I teach a class at Stanford on how to go. [01:05:57] Is it at Stanford? [01:05:58] Because I heard it was on a cross street. [01:06:00] Like, it's on a cross street. [01:06:03] I have a bullhorn. [01:06:06] I also have a crude spear that I whittled, you know, but it's basically Stanford. [01:06:11] Hey, that's a human-made spear, though.
[01:06:12] Credit where it's due. [01:06:13] Oh, yeah, absolutely. [01:06:15] I only use human-made spears, and also sometimes, one time, a spear that was crafted by a chimpanzee. [01:06:22] But, you know, same deal. [01:06:24] That's between you and the chimp, man. [01:06:26] A lot of things are between me and that chimp. [01:06:30] Just to say, do we have any pluggables at the end here, Ben? [01:06:34] Yeah. [01:06:35] You have anything you want to plug? [01:06:36] Oh, yeah. [01:06:38] I will say you can check out more by going to CoolZone Media. [01:06:44] I want to give a big shout out to Gare, to Garrison, who has done some top-notch reporting, in my opinion, on the current events in Atlanta. [01:06:57] You may be familiar with the Stop. [01:06:58] Yeah. [01:06:59] With the Stop Cop City. [01:07:00] Garrison's reporting, which is available on It Could Happen Here and is truly incredible. [01:07:05] Yeah, great stuff. [01:07:08] Anything else, Ben? [01:07:10] Oh, yeah. [01:07:11] You can find me. [01:07:12] Yeah, you can find me on Twitter or Instagram, wherever; you can find me on Friendster, Farmers Only. [01:07:22] You know, no spoilers. [01:07:24] Only. [01:07:25] Yeah. [01:07:26] So big. [01:07:27] You know, I'm an overall influencer. [01:07:30] Oh, my gosh, levels to these jokes. [01:07:32] Take that. [01:07:33] You know, try that, ChatGPT. [01:07:36] But in a burst of creativity, calling myself some derivation of at Ben Bowlin. [01:07:42] Yeah, super secret code. [01:07:45] Sensational. [01:07:46] Robert, what's that link for your Substack one more time? [01:07:50] Shatterzone.substack.com. [01:07:52] It is not regularly updated, but, you know, it's not super rarely updated either. [01:07:59] It's free. [01:08:00] And you can find this article on fucking how AI is coming for your children. [01:08:05] That'll probably be the title, How AI Is Coming for Your Children. [01:08:08] Just go to the Substack. [01:08:10] You'll find it.
=== Dinosaur Image Check (02:16) === [01:08:11] You'll find all these fucked up dinosaur images. [01:08:14] I guarantee you, you haven't seen enough fucked up looking dinosaurs in your life. [01:08:18] Simply have not. [01:08:19] Check that out. [01:08:20] Maybe. [01:08:22] Okay, bye. [01:08:23] Bye. [01:08:26] Behind the Bastards is a production of CoolZone Media. [01:08:29] For more from CoolZone Media, visit our website, coolzonemedia.com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.