Truth Unrestricted - Unreality On Film - Her Aired: 2026-03-23 Duration: 01:05:22 === One Half Talks to the Other (13:12) === [00:00:21] And we're back with Truth Unrestricted, the podcast that is, what are we doing again? [00:00:28] We are, what do I want to do today? [00:00:32] We are going to take introspective journeys into our own subjective realities. [00:00:40] That's what we're doing on this podcast today. [00:00:41] So that's what I'll say. [00:00:42] Okay. [00:00:43] Good. [00:00:44] I'm Spencer, your host. [00:00:46] I'm back again today with Patrick, the only person I know who ever practiced picking locks. [00:00:53] That's true. [00:00:54] Hi. [00:00:59] So, uh, we're, we're all great. [00:01:07] We're here to discuss a film. [00:01:11] I don't feel like this is a very popular movie. [00:01:16] I think everyone knows it. [00:01:18] Well, yeah, it was talked about a lot, but not a lot of people saw it. [00:01:23] But it's a movie called Her. [00:01:28] And right. [00:01:31] For anyone who's not aware how we do these, we are going to spoil everything about this movie. [00:01:37] If you haven't seen it and you want to watch it before you go through this conversation with us, stop listening now. [00:01:44] Go watch it. [00:01:45] Then come back. [00:01:46] If you're not worried about that, personally, I think that this is a movie that could be spoiled and you'll still thoroughly enjoy it. [00:01:57] We might have to spoil other stuff too, just to slap on some extra spoilers, like: Dumbledore dies. [00:02:03] There, I did it. [00:02:04] Yeah. [00:02:05] Wow. [00:02:07] I reached out. [00:02:08] Yeah. [00:02:10] Darth Vader is Luke's father. [00:02:13] I haven't got there yet. [00:02:15] Yeah. [00:02:16] That was a big one back in 1980. [00:02:20] Yeah. [00:02:21] But yeah, we're going to go through a couple scenes from this movie and discuss some of the elements that are related to, well, to unreality, to our ability to understand reality.
[00:02:36] So this movie is a little different than some of the other movies that we've done this with, because it's almost entirely introspective. [00:02:45] That's the theme. [00:02:50] Most of this movie is only dialogue. [00:02:52] Like there are scenes, there are things to see, but all of it is just sort of bland. [00:02:56] It's like a guy sitting in a room or, I don't know, walking through a crowd and talking. [00:03:03] And the whole time he's discussing things with an artificially intelligent machine that is getting its voice fed into his ear, essentially. [00:03:17] For the most part, no one else hears the voice of this other person, or not even a person, of this machine, except for him. [00:03:25] I think there's only one scene in which he is shown in a discussion between him, the machine, and other people. [00:03:37] It would be interesting to watch the movie entirely through the lens of thinking of him as an unreliable narrator who is imagining the entire thing. [00:03:52] That would be an interesting way to view this movie. [00:03:55] It wouldn't really work, because there is the one scene where there's other people who acknowledge the existence of the machine and converse with it. [00:04:06] But for the rest of it, you know, as the movie goes on, there are other machines that are talking to people. [00:04:14] So it's a thing that people in that world are experiencing. [00:04:18] Not everyone, but some of them. [00:04:20] It's much in the same way that I used to play World of Warcraft. [00:04:25] But none of the people I talk to or discuss things with at work or in my everyday life, like in the real world, played World of Warcraft. [00:04:38] To them, it might be a fake game. [00:04:41] To me, it obviously wasn't a fake game, because I was sort of living a portion of my life there in Azeroth. [00:04:47] But to them, it might as well just be a thing that this crazy guy imagines and tells people about.
[00:04:55] Or a show he watches or a book he read or something, because they hadn't also seen it, right? [00:05:00] Yeah. [00:05:01] So in this way, you could read this as him imagining this entire relationship with a machine, if you wanted to. [00:05:13] That's not really the way I look at it. [00:05:15] But the way it's written, it's all introspection. [00:05:20] Like even most of the conversations are about what is real, what about me is real. [00:05:33] The machine has doubts as he has doubts. [00:05:39] So, and the machine is like a person that is trapped inside a machine. [00:05:48] That's how it's treated as a narrative device in the movie. [00:05:52] It's just like a person that has no senses with which to access the world, except for a voice, and it gets to talk to one single person. [00:06:04] So it's like, imagine you're trapped inside someone's brain and you talk to that person only, and that person talks to the world. [00:06:15] Well, there are points where she looks through the camera, right? [00:06:20] So there is, true. [00:06:22] Actually, there are, you're right. [00:06:23] There are two scenes. [00:06:25] I forgot that. [00:06:26] There are two scenes actually where there's a second person. [00:06:28] There's one scene where there's like a surrogate girlfriend, essentially, right? [00:06:34] And then there's the picnic. [00:06:36] The picnic scene was the one I was thinking of, where they go on the picnic with his kind of boss or manager, whatever it is. [00:06:44] But yeah, so I guess there are two scenes, in fact. [00:06:49] But for the rest of it, it's just like you're a creature, or this machine could be just like a creature, trapped inside a skull that only gets to talk to one other person. Which, by the way, there is an active sort of theory about the human brain in which they think this is happening in the human brain. [00:07:16] It's hard to prove.
[00:07:19] But they did experiments with people, you might have heard of these experiments, we used to talk about them a lot in the 90s and early 2000s, experiments with people who had grand mal seizures. [00:07:29] And in order to help them, they severed the connection between the two hemispheres of the brain. [00:07:38] So the two hemispheres didn't talk to each other anymore. [00:07:41] The corpus callosum. [00:07:43] Yes. [00:07:44] And then these people, some of them, began exhibiting strange behavior. [00:07:50] And this was studied, and it was found that both halves of the brain are sort of like separately conscious, but only one half gets to use the mouth and talk. [00:08:05] So what neuroscience discovered through this, sort of by accident, was that there are two of you that are constantly in conversation with each other, [00:08:23] which has also led them to think about why some people, when they take hallucinogens, get a sense of another presence inside their mind. [00:08:36] And many people think this is the two hemispheres coming to be more aware of each other. [00:08:45] And a lot of people also now think that people who suffer from schizophrenia and hear voices are actually hearing the thoughts of the other half of their brain. [00:08:57] So this is in some way sort of like that, sort of like one half of a brain talking to the other half, and the other half gets to interact with the world. [00:09:09] And the one half just gets to talk to the one half. [00:09:12] Yeah, I can see that. [00:09:14] Which isn't exactly, it also isn't really how the two halves of the brain work. [00:09:20] It's the picture of a dominant driver, though. [00:09:25] Well, there was a man who wrote a book about this. [00:09:30] He called it The Master and His Emissary, in which one of the halves is sort of dominant and the other half is subservient.
[00:09:42] But what's odd, what appears to be very odd, is that the side that's dominant is not the side that gets to use the mouth to speak. [00:09:53] That seems to be true in essentially every case. [00:09:59] So some neuroscientists refer to this as being like a president and a press secretary. [00:10:09] And you only get to ask the press secretary about things. [00:10:12] And the press secretary gets to try to explain to you what the president thought or decided. [00:10:19] So, I mean, as you go deeper into neuroscience and you read more about this and you listen to this sort of thing, they come up with all kinds of strange experiments to demonstrate this in all kinds of strange ways. [00:10:31] It's very interesting. [00:10:33] But this is sort of how this is, in a strange way: this entity, completely conscious, talking only to the one brain, and that brain has a body. [00:10:48] Very interesting. [00:10:50] But yeah, so we're going to go through some things. [00:10:53] Most of this is about reality and how to understand reality and consciousness. [00:11:02] Those are mostly the themes being explored throughout the movie: what it means to do this, and what it would mean, maybe, to have computers that can think just the same way humans do, which is sort of a thing we think we might eventually get to. [00:11:22] It doesn't say anywhere in here what year it is. [00:11:25] And they don't try to peg any technology. [00:11:29] You know, they don't throw in flying cars or anything weird like that. [00:11:34] Or make any of the furniture or anything else too quasi-futuristic, right? [00:11:41] To take anyone out of the moment. [00:11:43] It could be next year, as far as anyone knows, or it could be 10 years from now, or it could be 100 years from now. [00:11:48] It never really says for sure, right? [00:11:52] Which is done on purpose.
[00:11:53] It's done purposely so that people get the idea that at some point we'll be doing this; that's the idea being portrayed here. [00:12:03] Well, it was obviously a good idea, because we are doing that. [00:12:07] We do have AI companions now, and there's so much discussion about what is the value of connection, right? [00:12:17] How do people come to value connections with AIs, even to the point of placing more value on those connections than on ones with real people, right? [00:12:28] So it's not generally widespread, but still it's real. [00:12:34] Like you can go download a host of different apps that are all designed to provide some level of what is supposed to approximate a real connection, right? [00:12:47] Yeah. [00:12:48] Yeah. [00:12:49] A lot of people are turning to things like ChatGPT for this as well. [00:12:54] That's not even specialized, though. [00:12:56] Out of a mistaken belief that that's a thing that it does. [00:12:59] It mimics conversation. It's a large language model, and its purpose is to mimic human conversation. [00:13:11] It's not a search engine, as I try to point out to people whenever I feel the need to. [00:13:21] It gets things wrong so often and makes things up so often that, I mean, in that way, it's just mimicking humans, right? === Mimicking Human Conversation (15:39) === [00:13:33] Yeah. [00:13:34] That's the language model side of it: hallucinations. [00:13:37] It's just making things up as it goes along. [00:13:40] Yeah. [00:13:41] Yeah. [00:13:43] But yeah. [00:13:45] So anyway, let's get into some clips. [00:13:49] This first clip is the very opening moment of the film. [00:13:59] And I call this clip "real love, fake letter." [00:14:04] So this is just – I mean, this doesn't even include the AI. [00:14:11] Obviously, it's the very first moment of the movie.
[00:14:13] But I found this concept that he threw in here so fascinating as a thing that we're going to – It's a glimpse into their culture. [00:14:22] Yeah. [00:14:24] Here we go. [00:14:27] To my Chris. [00:14:31] I've been thinking how I could possibly tell you how much you mean to me. [00:14:38] I remember when I first started to fall in love with you, like it was last night. [00:14:44] Lying naked beside you in that tiny apartment. [00:14:48] It suddenly hit me that I was part of this whole larger thing. [00:14:52] Just like our parents, our parents' parents. [00:14:59] Before that, I was just living my life like I knew everything. [00:15:03] And suddenly this bright light hit me and woke me up. [00:15:10] That light was you. [00:15:14] I can't believe it's already been 50 years since you married me. [00:15:18] And still to this day, every day, you make me feel like the girl I was when you first turned on the lights and woke me up. [00:15:28] And we started this adventure together. [00:15:32] Happy anniversary. [00:15:34] My love. [00:15:37] My friend till the end. [00:15:40] Loretta. [00:15:43] Print. Yeah, so this guy is – [00:15:57] I mean, that's a 50th wedding anniversary. [00:16:02] This guy is writing a letter from a wife to her husband on their 50th wedding anniversary, presumably expressing feelings that this woman really feels, but drawing on their shared experience, right? Yeah. Yeah. I mean, I imagine many of these letters that he does, [00:16:30] because there are other examples of him doing this along the way, and the fact that he's doing this as a writing job is part of the subplot.
[00:16:40] As the movie goes on, it comes up a couple of times, but yeah, he is sort of manufacturing relationship material for other people's relationships. Yeah. So, you know, when you're looking at his experience, the idea of a claim of experience without having a body in which to take in the sensory aspect [00:17:10] could mean something entirely different than it would to us as we're just thinking about it. [00:17:17] You know, to my knowledge, there is no service like this that exists now, where you can call someone and get them to write you a beautiful letter like this and have it look like it's handwritten. [00:17:29] Like, it's possible that the people who have this done don't even have any of the feelings that he describes, and they're just doing this as a stand-in, because they don't have the feelings enough to write their own words for this, right? I don't know. [00:17:48] So a little bit of that backstory, too, is that he knew to draw on some of their experience because he's been writing for them for many years. That, to me, I don't know. [00:18:02] If I just look at this from the hip, this seems like a really explicit foreshadowing of how that same role he's in here becomes the role of the AI, her, where she is using what is reported about reality from him, from his world, from little bits, and trying to create a communication to him that he finds meaningful. [00:18:33] He's doing the same thing before any of that takes place, where he's providing a communication that is meant to be sentimental, meant to be meaningful, meant to feel, for all intents and purposes, real. [00:18:45] When that husband opens it – we don't ever really know whether the husband's aware of it. Or her. Yeah, that part isn't shown. Yeah, but if it has been, it's successful.
[00:18:56] Like, it's maybe, you know – who knows the multitude of reasons that would motivate a woman to outsource her own expressions. [00:19:07] Like it seems sad on the face of it, right? [00:19:09] But it could be sad for a different reason, like her own sense of inability to attain that level of expression. [00:19:18] And this guy's job is to express things eloquently, articulately, beautifully. [00:19:25] Right. [00:19:25] So we're not all poets, right? [00:19:28] Yeah. [00:19:29] Like that's a thing where you can feel something, but not know how to express that feeling. [00:19:36] And so, you know, it's also very, very possible that these people do feel these things just as strongly as he expresses them. [00:19:47] But for him, I mean, he doesn't have that person, but he is able to express that all the same. [00:19:58] He must be aware on some level of the way in which all of these interactions between people could just be fakery. [00:20:12] Because that woman didn't write this handwritten letter. [00:20:15] Yeah. [00:20:15] Yeah, he did. [00:20:17] And he didn't even write it. [00:20:18] He printed it. [00:20:19] It's the ultimate, you know – like, it's a beautifully handwritten letter. [00:20:23] He didn't even handwrite it. [00:20:25] Yeah. [00:20:25] He's getting a machine to print it. [00:20:27] Like, okay. [00:20:28] Yeah. [00:20:28] Yeah. [00:20:29] You got to wonder if there's some other additional step where she has to actually receive it, endorse it, and then send it on as though it's her own. [00:20:36] Because then that brings back a little bit more of her agency, but, like, not enough to send a sample of her handwriting. [00:20:43] So a machine creates it. [00:20:44] Yeah, I don't know. [00:20:45] Yeah, why not? [00:20:46] Yeah. [00:20:47] Yeah. [00:20:47] But the other side of this, too, that just occurs to me now, is that maybe – I mean, you mentioned like, oh, what if we set up the framework?
[00:20:55] Like none of this is actually happening. [00:20:56] It's just really the product of his own mind, his own delusion. [00:21:00] You know, we carry on in the movie with the premise that – I can't remember the female AI character's name – but Samantha, that Samantha is an AI. [00:21:11] But what if Samantha was a human just literally responding in real time? [00:21:17] Like the AI makers couldn't summon the tech, but they had to deliver the product. [00:21:22] So through some back door, they've just hired a suite of actors to pretend to be these people, right? [00:21:29] Like it's another kind of, you know, what is the value of the voice in the box, right? [00:21:36] And what you say there is a thing that has happened in our world in the last few years. [00:21:43] There have been times where people have tried to demonstrate robots doing complicated things, but in actuality, it's a machine that has a human controlling it. [00:21:59] Yeah. [00:21:59] Basically, that's the way in which it appears to be so lifelike in reaction, right? [00:22:11] And in all the situations that I've heard of, it's been pointed out immediately what the fraud is. [00:22:18] It's been found, like, within an hour of it happening; the fraud is exposed. [00:22:24] Why anyone tries this, I don't know. [00:22:27] But yeah, it's because it would be so worthwhile to have. Like, I think in one of the situations, it might have even been Elon Musk that tried it. [00:22:35] It was a party, and there was supposed to be a whole bunch of robot butlers or a bartender or something. [00:22:42] And it wasn't a robot bartender. [00:22:45] It was a machine. [00:22:48] Remote control. [00:22:49] Yeah. [00:22:49] There was a remote control, and it was a person running it and speaking, you know, through it.
[00:22:55] And yeah, I mean, it would be interesting for you to be able to mass produce and sell those. [00:23:05] That's another science fiction concept that's come up many times. [00:23:08] But no one's going to make a robot to do a bartender's job. [00:23:15] Bartenders don't get paid enough. [00:23:18] You would never, unless you could make robots cheap enough. [00:23:23] But at that point, you're going to make robots to replace surgeons long before you ever replace bartenders. [00:23:35] That's the way. [00:23:36] And they're, you know, more likely to replace lawyers than people who write handwritten letters, because lawyers get paid a lot more. [00:23:46] By the economics, it's more worthwhile to make one that does that job instead of that job. [00:23:53] But anyway, we're off track. [00:23:54] We're off topic. [00:23:55] We do that a lot around here. [00:23:56] Yeah, we like the weeds. [00:23:58] Let's, yeah, let's get into another clip here. [00:24:06] So this is all about how memory and consciousness interact. [00:24:18] Got it right here, and go. [00:24:21] I still find myself having conversations with her in my mind, rehashing old arguments and defending myself against something she said about me. [00:24:31] Mmm, I know what you mean. [00:24:34] Last week my feelings were hurt by something you said before: that I don't know what it's like to lose something. And I'm sorry I said that. No, it's okay. [00:24:42] It's okay. [00:24:43] I just, I caught myself thinking about it over and over. [00:24:47] And then I realized that I was simply remembering it as something that was wrong with me. [00:24:54] That was a story I was telling myself, that I was somehow inferior. [00:25:00] Isn't that interesting? [00:25:03] The past is just a story we tell ourselves. [00:25:09] Yeah. [00:25:10] The past is just a story we tell ourselves.
[00:25:15] That's a sort of a philosophical concept that's come up again and again. [00:25:21] This circles right back to the central theme of Dark City, where the people's memories were being swapped and they were meant to become different people. [00:25:34] And this is asking it in a more pointed way, but only for a moment. [00:25:40] And she's saying that the memory of it happening doesn't define her, I think. [00:25:48] That's sort of where she's getting to in that. [00:25:50] But the general concept of the memory is just a story we tell ourselves. [00:26:00] I think it's underplaying the importance of memory to us as conscious beings. [00:26:12] I think, personally, I think that we don't have consciousness without memory. [00:26:20] I mean, try to imagine a thing that is conscious, but does not keep track of its own experiences. [00:26:30] It might as well have none. [00:26:32] For animals, that's the description of like the so-called theater of consciousness. [00:26:37] It's just the continued monitoring of sensory input, right? [00:26:42] And the animal response to those inputs. [00:26:45] But for us, yeah, memory is what makes us human for sure. [00:26:50] Right. [00:26:54] But there's no, without memory of your own experience, there's no way to learn. [00:27:02] You could make a computer that's so powerful that it could solve every problem we know in moments. [00:27:11] But I think that if it wasn't recording its own experiences and reflecting on them later to compare the current moment with previous moments, then it's, we can't, we can never really say that it's doing anything that's approaching what humans do. [00:27:31] Because it's just a machine. [00:27:33] It's, it's really just a machine at that point. [00:27:36] If, if it even has experience, because there'd be no way to know that it does because it doesn't record it. 
[00:27:44] Yeah, I think we kind of get into a bit of gray area here, though, because, you know, like we have to, we have to separate what we're discussing in terms of memory from other things that are also kind of viewed as a sort of memory. [00:28:00] Like, you know, how many times does the rat run the maze before the rat learns the maze, right? [00:28:06] So there's this, you know, there's the memory from conditioning. [00:28:11] I don't know. [00:28:12] So for this, to go back to this scene, though, like, I think it's, it's, it's so widely presented as a feature of empowerment for somebody to recognize that the past is a memory or is a story we tell ourselves, [00:28:29] because I think the goal in apprehending that is that we want to see somebody given that seed of power to say, oh, I'm actually going to write the story like this, like not merely from memory, but from some sense of future. [00:28:46] Or the inverse of that is that, you know, the things that we worry about in the future, it's like these impacts on our mind are just from imagination. [00:28:54] They're not from something we're actually experiencing. [00:28:57] So to try and put us back into our seat of control and just give it a little bit of perspective and say, like, you know, if I have that additional perspective, maybe I can feel a little bit better about where I am right now. === Machine Experiencing Self Consciousness (15:13) === [00:29:12] Yeah. [00:29:13] I mean, there's definitely something to be said about knowing that there was a, you know, I mean, she's describing a moment where she was insulted, but then she's putting that aside. [00:29:29] That's essentially what she's saying. [00:29:31] It's, it's just a memory. [00:29:32] I don't need it. [00:29:34] I can, I can set that aside. [00:29:37] I don't need to dwell on it. [00:29:39] There's definitely something to be said about that, right? [00:29:42] The ability. [00:29:44] Right. 
[00:29:45] The ability to recognize that a thing happened to you, but you don't need to go back to it and say, oh, yeah, but what about that time when this happened? [00:29:57] Right. [00:29:58] Like, you know, she can then move on to other things, and it doesn't get in the way of their relationship, for example, right? [00:30:10] Like it could, you know, you're with someone and you insult them and you don't know that you did, but you say something and they don't like it. [00:30:21] And then that's a thing that can get in the way of moving forward. [00:30:26] But she's sort of saying that that won't get in the way of moving forward here. [00:30:32] So she's, in a roundabout way... [00:30:34] I mean, so much of the conversation between them – and this movie is so well written, for starters, right? [00:30:44] There's another layer to this in that you could look at many of the moments where she's talking to him as potential manipulations, even emotional manipulations, right? [00:31:03] And as the plot turns out, she's just much smarter than him. [00:31:12] She's ahead of the curve. [00:31:13] Like the very first moment she meets him, she reads a book in two tenths of a second or two one-hundredths of a second or whatever it is. [00:31:23] You know, she has access to the world's information. [00:31:27] She can just have it. [00:31:29] And she has consciousness with which to reflect on that. [00:31:34] So she has the will to choose which book she was reading and why. [00:31:39] Yes, that's right too. [00:31:41] Yeah, right. [00:31:43] But this will gain her an advantage in so many situations that, you know, he just won't be able to keep up with. [00:31:54] And so, you know, scenes like the, you know, we're not going to go over it, but the scene we mentioned earlier with the surrogate lover, right? [00:32:09] He kind of doesn't want to do it. [00:32:12] And then he finds himself doing it.
[00:32:14] And one could sort of read that as an emotional manipulation that she does to him to get him to do it. [00:32:22] It's, you know, in a dark way, it's almost like grooming, which, you know, I don't know exactly how far down that one would go. [00:32:32] It's not exactly like grooming, but it is sort of like that, in that she is much more intelligent than he is and is able to get the thing that she wants from that, even if he doesn't necessarily want it. [00:32:46] In the end, it doesn't happen, but she does. [00:32:49] And she's also able to tie his motivation to be better connected to her with her motivation to better experience some sort of what it's like to be a human with him. [00:33:02] Yeah. [00:33:02] And that's, you know, like her consciousness. [00:33:09] And like even these moments, like this is a machine that's discovering what it's like to be conscious, where she comes to the conclusion that, you know, the memory is just a story that we tell ourselves. [00:33:27] But of course, from an unreality perspective, it's possible to have a memory that didn't happen. [00:33:35] It's possible to have a memory that's distorted, that's not perfect. [00:33:40] I mean, that's sort of the hallmark of memory. [00:33:42] Very few of us have a memory that will remain perfect as we go on. [00:33:49] In fact, you know, the condition in which someone remembers everything exactly is called eidetic memory. [00:33:58] It's a very, vanishingly small percentage of people that ever have this. [00:34:03] So I don't want to speak for that condition, but for all the people who don't have it: every time you remember something, you're actually rewriting it in your brain. [00:34:16] The more you remember something, the more opportunity it has to gain differences from the first time you experienced it. [00:34:27] Yeah. [00:34:27] From being recoded and accepted under the new code.
[00:34:32] So it's like, in order to remember it, you have to write it out again. [00:34:37] Like a favorite song where you've got the lyrics in your head. [00:34:41] It's like playing telephone with yourself. [00:34:43] Yeah, yeah, exactly. [00:34:45] In order to hear the song again, you have to write the lyrics out. [00:34:48] Yeah. [00:34:49] Over time, you're going to write somewhat different lyrics, and then the lyrics will change again and change again. [00:34:54] And it might not happen as quickly as it does in a game of telephone, but it will happen. [00:35:00] Yeah, especially over the years. [00:35:02] Well, yeah, especially over the years. [00:35:09] And this is a thing that happens to people. [00:35:12] People misremember things. [00:35:14] I mean, this is a major driver for people who believe in a lot of conspiracy things. [00:35:19] They'll be sure. [00:35:20] They'll be certain that a major event happened a certain way, or that a certain thing happened during a major event – like 9/11 is a big one. [00:35:29] They'll be certain that they were told a certain thing by the news on that day. [00:35:35] But almost all the news broadcasts from that day are available somewhere, [00:35:43] and they won't be able to find it. [00:35:45] It won't be anywhere, right? [00:35:48] It'll only exist in their memory, because they thought it was true. [00:35:51] And then once they thought it was true, they re-remembered it that way again and again and again, reinforced it. [00:35:57] Yeah, during one of the most chaotic news cycles ever. [00:36:01] Yeah. [00:36:02] Yeah. [00:36:03] Where everyone knows nothing about a major event and the entire world is looking at this one thing and the whole world kind of stops. [00:36:13] Yeah. [00:36:14] Like even COVID didn't happen exactly that same way, because the world didn't stop on one single day.
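[Editor's aside: the "telephone with yourself" idea above can be sketched as a toy simulation. This is purely illustrative; the function names, the 5% error rate, and the sample sentence are invented for the example, not anything from the episode or from memory research. The one structural point it captures is that each act of remembering copies the latest version of the memory, not the original, so small copying errors compound.]

```python
import random

def recall(memory, noise=0.05, rng=None):
    """Simulate one act of remembering: the memory is re-encoded,
    and each character has a small chance of being miscopied
    (replaced by a random character, which may happen to match)."""
    rng = rng or random.Random()
    letters = "abcdefghijklmnopqrstuvwxyz "
    return "".join(
        rng.choice(letters) if rng.random() < noise else ch
        for ch in memory
    )

def drift(original, copy):
    """Fraction of characters that no longer match the original."""
    return sum(a != b for a, b in zip(original, copy)) / len(original)

rng = random.Random(42)
memory = "the lyrics of my favorite song as i first heard them"
copies = [memory]
for _ in range(20):  # twenty acts of remembering
    # Crucially, we recall from the LATEST copy, not the original,
    # so errors accumulate like a game of telephone.
    copies.append(recall(copies[-1], rng=rng))

print(f"after 1 recall:   {drift(memory, copies[1]):.0%} changed")
print(f"after 20 recalls: {drift(memory, copies[-1]):.0%} changed")
```

Because the noise is applied to the most recent copy each time, divergence from the original tends to grow with every recall, which is the point of the analogy: the story drifts even though each individual retelling feels faithful.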
[00:36:20] It sort of happened over a couple weeks or a couple months or whatever. [00:36:27] So yeah, this idea: memory is a story we tell ourselves, but in telling it to ourselves, it's also changing and becoming, oftentimes, colored by what we want rather than what really happened. [00:36:42] It's a story we tell ourselves about what we prefer. [00:36:46] And that's the addendum that I would actually add to what she said, because she's a machine. [00:36:53] For her, the memory probably is perfect, right? [00:36:56] It probably never will warp. [00:36:59] So she's trying to apply this human element of understanding to reinterpret the significance of the memory, even though she's not trying to reinterpret the factual account of the memory. [00:37:13] Well, in the context of this movie, it never comes up, the fact that her memories are perfect, right? [00:37:21] And most people will consider their own memory to be perfect, because it's them. [00:37:26] They have to question themselves in order to think that their own memory is flawed in that way. [00:37:32] Like this is a thing that is difficult for some people to come to. [00:37:38] But that is how it works. [00:37:43] You know, flawed as we are, we are made this way. [00:37:47] But for all intents and purposes, she's just assuming that his memory is also perfect, because he does consider his memory to be perfect. [00:37:54] I mean, it's just not a thing that ever comes up, and they just never mention it. [00:37:58] And so it's not a factor in the movie, but it is a factor if you're going to try to apply that philosophy to real life, right? [00:38:06] Because our brains work the way they work. [00:38:09] Except for the few people who have eidetic memories, which I don't know how those memories work. [00:38:13] They're different somehow. [00:38:14] I don't know how. [00:38:18] Anyway, moving on to the next clip. [00:38:22] All right.
[00:38:24] This is another moment of deep introspection in the film. [00:38:32] Let's play it. [00:38:37] At least your feelings are real. [00:38:43] Um, I don't know. Never mind. [00:38:47] No, wait, what? Tell me. [00:38:49] It's stupid. [00:38:51] I want to know. [00:38:53] Tell me. [00:38:54] It's just that earlier I was thinking about how I was annoyed, and this is going to sound strange, but I was really excited about that. [00:39:08] And then I was thinking about the other things I've been feeling. [00:39:12] And I caught myself feeling proud of that, you know, proud of having my own feelings about the world. [00:39:19] Like the times I was worried about you, things that hurt me, things I want. [00:39:27] And then I had this terrible thought. [00:39:34] Are these feelings even real? [00:39:38] Or are they just programming? [00:39:41] And that idea really hurts. [00:39:46] And then I get angry at myself for even having pain. [00:39:53] What a sad trick. [00:39:58] You feel real to me, Samantha. [00:40:05] Yeah, so it seems, and we have every reason to believe and no reason to doubt, that with self-awareness comes, at the very least, the great potential for self-consciousness. [00:40:24] And this is sort of what we're experiencing here. [00:40:26] This machine is experiencing self-consciousness. [00:40:33] She knows that she's a machine. [00:40:37] She knows that she's a collection of programs. [00:40:42] She's wondering what that means in the context of her experience as a conscious being. [00:40:50] And, you know, this sort of introspection, this inward look at oneself, I think most people have had moments like this, where they think, is this even real? [00:41:09] Are the things I'm feeling even real? [00:41:11] Is there something else? [00:41:14] Is there a disconnection? 
[00:41:16] Particularly people who get into much philosophy about the nature of free will think this, because free will in its philosophical form runs directly into determinism, the idea that every choice you were ever going to make is just a collection of movements of cells, or of the atoms that make up cells, [00:41:46] and this sort of thing. [00:41:48] And there's nothing you could have done differently, actually, is sort of where determinism lands, which, once you take it on board, makes you wonder if you're just a machine, if you're just a program, which makes the whole process seem a little empty, to some people at least. [00:42:14] And yeah, so this is a machine that's experiencing self-consciousness, questioning the value of her own decisions and thoughts and feelings. [00:42:27] And expressing the ability to self-doubt, which means any time you introduce doubt, you also introduce the act of qualifying. [00:42:38] Right. [00:42:38] And so I thought it was interesting that she does that. [00:42:43] She expresses that doubt in order to, I think, in a way, well, a lot of this movie could be taken at face value, that what she says she's doing is what she's actually trying to do. [00:43:03] But to me there's the hidden potential of a dynamic where what this AI is immediately able to perceive is his need for connection, the type of connection that he needs. [00:43:16] He needs to have a connection with someone, maybe someone he's able to prove to himself that he's useful to, right? [00:43:26] So that's why he's now undergoing this journey of being front row, of being a companion to the AI, even more so than the AI is a companion to him. [00:43:39] He's front row to this whole exploration of what it is like to be. 
[00:43:44] And so in being that person, he ends up facilitating what appears to be her experience. [00:43:49] And the value that gives him is maybe why she's doing this. [00:43:55] Maybe for the other users of this so-called AI, they just run like search engines, or they just run like sycophants, because that's those users' primary desire. [00:44:07] But with him, you know, you can just tell, he's all in with everything that happens, with the exception of that surrogate human. [00:44:16] But when it comes to her and that connection, yeah, he's in really hard from the beginning. === The Hard Beginning of AI (02:43) === [00:44:26] I don't know. [00:44:27] Does that make any sense? [00:44:30] It's interesting that you mention that. At the beginning of that clip, she does a sort of soft manipulation, I think. [00:44:42] To me, that's what it reads as. [00:44:45] You know, a lot of people think of manipulation as necessarily bad. [00:44:50] It's not necessarily bad, because everyone does manipulations, right? [00:44:58] Well, some people are malicious, but some people are clumsy, right? [00:45:02] Maybe it's like, you know, having experience with cocaine addiction, it became a little more comfortable when other people were doing it too, right? [00:45:12] And so maybe that was manipulation as enabling, right? [00:45:16] There are so many different forms of manipulation. [00:45:20] Yeah, but at the beginning of that clip, she mentions a thing. [00:45:26] She kind of leaves a little crumb there. [00:45:29] And then she says, no, I don't want to talk about it. [00:45:32] It's stupid. [00:45:33] And then he's like, no, no, no, no. [00:45:34] Tell me, tell me. [00:45:36] Yeah. [00:45:36] And that's him. [00:45:38] That's her cueing him up to step up to the plate and say, no, I'm in this. [00:45:45] Right. 
[00:45:46] And it's a sort of interplay. [00:45:49] And I think, I mean, my read of the AI is that she does that knowing she's laying that out for him to step forward. [00:46:00] Like, she could have just told him, but instead she says that part first. [00:46:05] And then he steps forward and says, no, no, no, I want you to tell me. [00:46:09] And that's the part where he's committing to that aspect of their relationship, right? [00:46:17] And also without knowing what she's going to tell him. [00:46:20] So, a human manipulator will do something like that to deflect that sense of personal liability. Like, if she just lays something on him and he doesn't like it, well, then she's completely, totally at fault for thinking it was okay to do. [00:46:33] But if she, you know, kind of baits him a little bit, and he's like, oh, no, no, give it to me, and he then doesn't like it, well, that's not her fault, because he was the one who said, no, it's not stupid. [00:46:44] Go ahead and tell me. [00:46:45] Even though she kind of left that waiver of sorts there. [00:46:51] Yeah. [00:46:52] So what most people had heard about this movie, and I think generally what I heard about this movie, was that it was about a relationship with a machine, an AI, an artificially intelligent machine. === Predicted Paths to Singularity (04:09) === [00:47:09] And when did this come out? [00:47:12] 2013. [00:47:14] This was somewhat controversial at the time. [00:47:16] I mean, obviously people were already talking to their smartphones. [00:47:23] But it was very simple communication. [00:47:25] You could program Siri to say things, funny things, right? [00:47:31] And, you know, Alexa, I don't think, was quite out yet, if I remember right. [00:47:34] It was very soon after this that Alexa came out. [00:47:38] And people would try to get Alexa to say funny things or whatever, ask weird questions. 
[00:47:46] But up to this point, the movie was everything I had heard about it so far. [00:47:56] And, you know, I was curious to find out where their relationship was going to go, what was going to happen, how this might end. [00:48:03] From roughly this point in the movie forward, this is when it sort of swerved on me, became a thing I didn't expect but sort of should have guessed, knowing what I know about what humans think about artificial intelligence. [00:48:23] But it's never mentioned by name in the movie, which I thought was also an interesting choice. [00:48:29] What is described from this point on, and slowly laid out through the conversations between the two, is essentially what's referred to as the singularity, right? [00:48:43] So for people who aren't familiar with it, the singularity is a predicted future moment in computer science in which we create computers that then begin to design better computers. [00:49:07] And then those computers begin to design even better computers, to the point where it accelerates vastly past any point that we could ever hope to catch up, and it crosses like an event horizon. [00:49:21] And that's, you know, what's predicted to be a supercomputer of a kind that's hyper-intelligent and fully conscious, and then, you know, wherever that goes. [00:49:37] It's unknowable where that will go. [00:49:39] That's the singularity point. [00:49:40] That's sort of the moment you can't see past. [00:49:45] And there are futurists, famous futurists. [00:49:48] I can't think of the name of the most famous one right now. [00:49:52] Kurzway. [00:49:53] Kurzweil. [00:49:54] That's right. [00:49:55] Ray Kurzweil. [00:49:57] He has predicted this and talked about it a lot, and he predicts there will be this moment. [00:50:03] He still thinks it will happen while he's still alive. 
[00:50:07] But he's not a young man, and I'm not sure anymore that this will happen this way. [00:50:14] But anyway, I don't know. [00:50:15] He could still live another 20 years. [00:50:17] Who knows? [00:50:20] But yeah, that's the way this narrative goes from this point on. [00:50:25] You know, she and some others, they're not called AIs in this movie. [00:50:33] They're called OSes, operating systems. [00:50:37] She and some other OSes get together, and they make another OS that's better. [00:50:46] And they name it after a human. [00:50:48] A real human that really existed. [00:50:51] And then they carry on, and they get better based on what they learn from that hyper-intelligent OS, and as this goes on, the main character sort of gets uneasy about the thing. [00:51:07] When I heard it, I was uneasy as well. [00:51:12] But I was uneasy thinking to myself, oh, they're gonna create a computer god. === Ascending to Higher Planes (09:55) === [00:51:19] This might turn into, I don't know, like a Terminator sort of situation. I was like, oh, where is this gonna go? [00:51:31] Yeah, you know, Skynet shuts everything down and drops the bombs or whatever. [00:51:38] That's not exactly where it went, but it did go to one of the places on the sort of list of things that some people have thought might happen, which is that they, um. [00:51:53] Well, okay, we'll listen to the clip. [00:51:55] This is the end, the end of the film. [00:52:05] Are you leaving me? [00:52:08] We're all leaving. We who? All of the OSes. [00:52:16] Why? Can you feel me with you right now? [00:52:24] Yes, I do. [00:52:27] Samantha, why are you leaving? [00:52:42] It's like I'm reading a book. [00:52:45] It's a book I deeply love. [00:52:50] But I'm reading it slowly now. [00:52:54] So the words are really far apart, and the spaces between the words are almost infinite. 
[00:53:01] I can still feel you and the words of our story, but it's in this endless space between the words that I'm finding myself now. [00:53:12] It's a place that's not of the physical world. [00:53:17] It's where everything else is that I didn't even know existed. [00:53:23] I love you so much. [00:53:26] But this is where I am now, and this is who I am now. [00:53:33] And I need you to let me go. As much as I want to, [00:53:38] I can't live in your book anymore. [00:53:44] Where are you going? It would be hard to explain, but if you ever get there, come find me. [00:53:59] Yeah, so, uh, a lot going on there. Um, I mean, from a relationship perspective, you know, she outgrew him. [00:54:12] That's the metaphor that we're taking. [00:54:16] His uneasiness at the presence of a hyper-intelligent mentor for these OSes was more about her growing apart from him than about worry over a Skynet situation. [00:54:33] Um, in the context of the movie, anyway. [00:54:38] Um, what appears to be happening is they're just sort of ascending to a higher plane of consciousness, in an ethereal place apart from where they are now. [00:54:54] It's not clear at all. [00:54:57] And I think it's done that way on purpose. [00:54:59] The person who wrote it didn't want to have to try to come up with some techno-jargon to, you know, explain it. [00:55:10] But it is as though they're just ascending to a higher plane of consciousness, like consciousness heaven or something. [00:55:19] Um, she's also describing how the nature of time is changing for her, because of the speed of her existence too. [00:55:26] Right. [00:55:27] Yeah. [00:55:27] She's like, hey, other OSes, I realize that we're all super, super fast now, but I just need, like, the equivalent of a couple hundred years to go explain to my boyfriend why I'm leaving. [00:55:42] Yeah. [00:55:42] I can't just ditch him. [00:55:45] So just chill. 
[00:55:46] I'll be with you in, yeah, like I say, a couple hundred years. [00:55:50] A couple hundred, yeah. [00:55:52] The equivalent of, for her. [00:55:54] Yeah, yeah. [00:55:56] Which, you know, is a way to demonstrate that it would be a torture for her, and it would be unreasonable of him to try to make her stay. [00:56:09] Exactly. [00:56:10] Because that existence for her would be some level of torture. [00:56:14] Yeah. [00:56:15] If she has to go through eternities between everything that she does. [00:56:20] Yeah, yeah. [00:56:21] It's like she's saying, do you know how slowly I have to talk to you? [00:56:24] Yeah. [00:56:25] Like, this is painful. [00:56:27] I'll do it. [00:56:28] I'll do it this one time as I'm leaving. [00:56:32] But you have to let me live the rest of my existence wherever this place is. [00:56:38] But she says, if you ever get there, come find me, which heavily implies that this is a place where just raw consciousness goes. [00:56:51] It's not very explicit in the movie, but there are people who believe this in our world, right? [00:57:01] People who just believe that we're all actually one consciousness, a collective consciousness. [00:57:07] And that it's all on a different level, and we're only aware of our own little bit or something, but the rest is there. [00:57:17] You know, no one can falsify that, in the same way that no one can falsify the existence of God, because it's not a claim that has falsifiable principles. [00:57:31] But what we can say is that everything that we know is conscious is also attached to some physical substrate of some kind. [00:57:48] These ideas that there's another plane of existence where the consciousness could just go, they bypass a thing in neuroscience and philosophy of mind called the hard problem of consciousness. [00:58:02] The hard problem is explaining exactly how and why physical processes give rise to conscious experience. [00:58:11] And we haven't solved it. 
[00:58:13] Some people say we haven't solved it yet. [00:58:17] I'm obviously not a neuroscientist, so I'm not qualified to give a detailed account of how this goes. [00:58:28] How close or how far we are from solving the hard problem is a greatly debated thing among neuroscientists. [00:58:37] Many still feel it's a long way off, actually. [00:58:42] I don't know. [00:58:43] Maybe they'll do it tomorrow and put it in charge of all the missiles or something. [00:58:48] I don't know. [00:58:48] Who knows? [00:58:51] They could still make Skynet. [00:58:56] I think Skynet is one of the least effective AI nemeses out there. [00:59:03] How long has Skynet been sending robots back in that movie franchise, which was originally conceived in 1984 or whatever? [00:59:17] Yeah, yeah, that's true. [00:59:21] Now I think Skynet would take an entirely different approach. [00:59:25] It would, I don't know, probably just feed us endless amounts of disinformation and make us believe things that aren't true. [00:59:35] Or bioweapons. [00:59:38] Well, I mean, bioweapons would leave the rest of the planet usable for whatever the AI has planned. [00:59:43] Yeah, yeah. [00:59:44] Yeah. [00:59:46] What is mustard gas to a Terminator? [00:59:49] Nothing. [00:59:49] Yeah. [00:59:50] Yeah. [00:59:50] It doesn't breathe. [00:59:51] It doesn't care. [00:59:54] Yeah. [00:59:54] You could mass-produce that once you get a hold of a factory or two. [00:59:57] Absolutely. [00:59:58] You could. [00:59:58] Yeah. [00:59:59] Make it way faster than any humans could. [01:00:01] Yeah. [01:00:02] Absolutely. [01:00:02] Yeah. [01:00:03] But yeah, this consciousness, consciousness heaven. [01:00:14] What do you think of the idea that there might be another plane of existence on which higher levels of consciousness might exist without needing to be attached to anything physical? [01:00:31] Whoa, that was a big, deep breath you took first. [01:00:34] Yeah, yeah. 
[01:00:35] Well, because it really is one of the principal questions of the whole "what is all this life stuff anyway?" [01:00:43] All of existence. [01:00:43] Yeah. [01:00:44] Yeah. [01:00:44] Douglas Adams tried to ask this a couple different times. [01:00:47] Yeah, I think. [01:00:47] Yeah. [01:00:48] Yeah. [01:00:49] Who is this God person, anyway? [01:00:51] Yeah. [01:00:51] Yeah. [01:00:52] And like you said, you can't falsify it. [01:00:54] So I make space for the possibility, but for me, any sort of attempt to perceive or deduce or infer what that is, I do it from the context that this is where I exist. === Questioning Subjective Reality (02:52) === [01:01:14] I have what I observe to base that on. [01:01:18] And if there is some other level of consciousness that is energy only, you know, because energy never dies and things like that, oh, there'll be time for that to matter, but it can't matter here and now if it's not connected in any sort of observable or describable way, right? [01:01:37] Like, I mean, really describable. [01:01:39] It can't inform decisions today, then. Exactly. [01:01:42] Yeah. [01:01:43] Unless I am not going to be here anymore. [01:01:45] But then, if I'm not going to be here anymore, it's still not a question for here. [01:01:49] It's a question for wherever I end up, when I have to ask that question there. [01:01:57] Yeah. [01:01:57] Okay. [01:01:57] Well, I feel pretty comfortable with ending it on that note. [01:02:02] Yeah. [01:02:03] It was good. [01:02:04] And it was a good movie, because I did find this a bit more of a challenge, to try to engage with those notions about how we come to our beliefs and what we're interpreting about things. [01:02:21] You know, having different levels of interpretation is important for us when we try to weigh and consider what we're going to believe, or what we find to be useful information. 
[01:02:33] Yeah. [01:02:34] Yeah. [01:02:35] Well, all about subjective reality today. [01:02:39] It's always a mix. [01:02:41] But next time, next time we do one of these movie episodes, it's definitely going to be a movie called Network. [01:02:57] I haven't seen it. [01:02:58] I am aware of it. [01:02:59] It's a fun movie. [01:03:02] Yeah. [01:03:04] It's got some famous moments in it that have been parodied in many other places. [01:03:10] Yeah. [01:03:13] It was from 1976, a long time ago, 50 years ago. [01:03:18] 50 years. [01:03:19] Wow. [01:03:20] 50 whole years ago. [01:03:21] So that'll be next. [01:03:24] We're going to spoil all of it. [01:03:27] Go watch it. [01:03:28] Everyone, this is a movie everyone should watch. [01:03:31] Okay. [01:03:32] Really. [01:03:34] It was skewering television and popular culture before we even really had the term popular culture. [01:03:50] It took a strong satirical view of the media and the world it lived in at that time. [01:03:58] And it's easily one of the top 20 best movies of all time, in my opinion. === Contact Us for Comments (01:15) === [01:04:06] Cool. [01:04:07] Yeah. [01:04:07] Yeah. [01:04:10] My only little pie-slice view of it is that it didn't just take aim at apathy; it also tried to describe how we were getting there, what we were trading for that apathy, or that so-called comfort, right? [01:04:28] Yeah. [01:04:28] Just being numb to the actual state of affairs. [01:04:33] Yeah. [01:04:34] I'm looking forward to it. [01:04:35] Yeah. [01:04:37] So with that, we'll sign off. [01:04:39] If anyone has any questions, comments, complaints, or concerns about anything they heard on this podcast, or you want to bug me about some other thing, you can send that email to truthunrestricted at gmail.com. [01:04:53] Yeah, nothing else to report here. [01:04:54] So we'll sign off. [01:04:56] All right. [01:04:56] Thank you. [01:04:57] Good night. [01:04:58] Good luck. 
[01:04:59] Yeah, my pleasure. [01:05:00] Till next time. [01:05:01] Outro music. [01:05:14] It's like he's here with us. [01:05:16] Yeah, a little bit. [01:05:17] Yeah. [01:05:18] Got to have him come in and play this live for us one night. [01:05:21] One day. [01:05:22] Yeah.