The Joe Rogan Experience - Joe Rogan Experience #2467 - Michael Pollan Aired: 2026-03-12 Duration: 02:23:45 === Consciousness and Mind Wandering (08:54) === [00:00:03] The Joe Rogan Experience. [00:00:06] Train by day, Joe Rogan podcast by night, all day. [00:00:12] Mr. Pollan, so good to see you again. [00:00:14] Hey, good to be back. [00:00:15] Consciousness. [00:00:17] So this new book, what inspired it? [00:00:21] What got you to, I mean, you've kind of explored consciousness a little bit with your psychedelic book, How to Change Your Mind. [00:00:28] Well, actually, this book was inspired by the research I did for that book. [00:00:33] As you know, I had several research trips. [00:00:39] Do you do air quotes when you say research? [00:00:41] Yes. [00:00:45] And two things happened that were really interesting. [00:00:48] One is there's something about psychedelics that makes you think about consciousness. [00:00:55] It's like smudging the windscreen, the windshield, that's normally perfectly transparent and that you see the world through. [00:01:02] Suddenly it's like different and you realize there's something between me and the world. [00:01:07] And what is it? [00:01:08] And that's consciousness. [00:01:10] And so, like a lot of people who have done psychedelics, you start wondering about this mystery. [00:01:16] Why is it this way, not that way? [00:01:18] So that was one experience. [00:01:20] The other was I had an experience in my garden in Connecticut, where we have a house, of walking through my garden and getting the powerful impression that the plants were conscious. [00:01:31] And I remember this one in particular, it was a plume poppy, or several plume poppies, and they were like returning my gaze. [00:01:39] They were very benevolent. [00:01:42] They were, you know, putting out positive vibes, but like they were conscious, much more alive than they'd ever been. 
[00:01:50] And like a lot of insights on psychedelics, I didn't know what to do with it. [00:01:54] Like, is it true? [00:01:54] Is it just a drug thing? [00:01:56] You know, what is it? [00:01:57] But I decided it would be interesting to find out. [00:02:00] And I consulted a couple people, scientists. [00:02:03] I said, what do you do with an insight like that? [00:02:06] And they said, well, you test it against other ways of knowing, including scientific ways of knowing. [00:02:10] And that led me down this really interesting path, exploring plant intelligence and plant consciousness. [00:02:18] So basically, yeah, the book grew out of the psychedelic experiences and some meditation experience. [00:02:24] Meditation also has a way of making you hyper-aware of how strange your thoughts are, where are they coming from, who's thinking them. [00:02:32] So there's a bunch of different schools of thought when it comes to consciousness, right? [00:02:35] There's one, like the Rupert Sheldrake thing, that sort of everything has consciousness. [00:02:40] And there's the sort of rational scientists that believe it exists somewhere in the mind. [00:02:48] I don't know. [00:02:49] In the brain. [00:02:50] Yeah, in the brain, excuse me. [00:02:51] And then there's people that think that the brain is essentially just an antenna that's tuned in to the greater consciousness of whatever it is that's out there. [00:03:01] Do you have any one of them that you hold? [00:03:05] Or do you? [00:03:06] They're all equally plausible. [00:03:08] You know, I went into the experience assuming, because this is what most scientists assume, that somehow a certain arrangement of neurons in the brain generates consciousness, subjective experience. [00:03:20] But no one's been able to show that. 
[00:03:23] We've gotten nowhere in that effort to, you know, we might correlate certain parts of the brain with consciousness, but we don't understand how three pounds of matter could generate the feeling of being you. [00:03:36] Now you talk about it in your book where the two gentlemen who had the bet. [00:03:39] Yeah, yeah. [00:03:41] That was Christoph Koch, who's a great brain scientist, and David Chalmers, who's a philosopher. [00:03:49] And this goes back to like in the early 90s. [00:03:53] They were getting drunk in a bar in Bremen, Germany. [00:03:56] And Christoph Koch really was at the beginning of the modern scientific exploration of consciousness. [00:04:03] And he was working with Francis Crick, who had just come off of a Nobel Prize for the discovery of DNA. [00:04:10] And Crick, who was like the most famous scientist in the world at the time, thought, well, the same kind of reductive science that discovered the double helix DNA and explained heredity, I'm going to do that for consciousness. [00:04:25] He's a very arrogant man, and he thought it would just, you know, no problem. [00:04:30] And Crick was kind of his sidekick. [00:04:32] I'm sorry, Koch was his sidekick. [00:04:35] And so Koch, who shared that kind of confidence, made this bet with Chalmers that they would find the neural correlates, the parts of the brain that are responsible for consciousness, within 25 years. [00:04:47] That was 25 years, 27 years ago now. [00:04:50] And Chalmers won the bet. [00:04:52] Chalmers is famous for coining the term the hard problem to describe the whole effort to figure out consciousness. [00:05:03] And it's a hard problem for a lot of reasons. [00:05:07] I mean, it is one of the biggest mysteries in the universe. [00:05:09] I mean, how consciousness came to be. [00:05:11] Did it evolve? [00:05:12] Was it always here? [00:05:15] But his point was that our science is based on third-person, objective, quantifiable measurements. 
[00:05:25] And consciousness is fundamentally a subjective, first-person experience. [00:05:29] So how do those tools reach in and say anything of value about consciousness? [00:05:35] So he said, you know, there are easy problems of consciousness we can figure out, like perception, emotion, things like that. [00:05:43] But there is this hard problem. [00:05:44] How do you get from matter to mind? [00:05:47] And he won the bet. [00:05:50] There was a ceremony I went to a couple years ago at NYU, and Koch presented Chalmers with a case of very fine Madeira wine and renewed the bet. [00:06:03] He said, all right, in another 25 years. [00:06:06] That's optimistic. [00:06:07] How old are these gentlemen? [00:06:09] Koch is in his late 60s, so we'll see if he's around for this. [00:06:12] But Chalmers is a little bit younger. [00:06:18] It's such an interesting thought because we know that the mind contains, if damaged, right, we know that there's certain aspects, there's certain parts of the mind where, like lobotomies, for instance, we know that if we disturb it, it radically affects behavior. [00:06:35] We know that there's parts of the mind that you can stimulate that can actually recall memories, right? [00:06:42] There's some weird stuff going on there. [00:06:44] So we know it's somehow or another at least functionally connected to consciousness. [00:06:48] Oh, yeah. [00:06:49] There's definitely a relationship. [00:06:51] But if it's generating consciousness, that's one thing. [00:06:54] But it could be, as you said earlier, it could be receiving consciousness. [00:06:58] And the same things would hold true: if you damage parts of the brain, it's like damaging a television. [00:07:08] So that doesn't determine the truth of either theory. [00:07:13] And then the other one is panpsychism, which you were alluding to. [00:07:16] I don't know if that's Rupert Sheldrake. [00:07:19] I think he would believe more in the field of consciousness. [00:07:22] Yeah, right. 
[00:07:22] He was a morphic resonance guy. [00:07:24] But I think he also subscribed to this idea that things contain consciousness. [00:07:28] It's not his, but you know what I mean? [00:07:31] Well, it's pretty universal, right? [00:07:33] There's a lot of people that have subscribed to this idea that everything has consciousness. [00:07:37] Yeah. [00:07:38] That even the particles that this table is made of have some tiny little bit of psyche. [00:07:43] And the challenge there is, so that solves the problem of how did it evolve? [00:07:47] It didn't evolve. [00:07:48] It's always been here. [00:07:49] But then you have this other problem, like, well, how do you take these, if every one of our cells is made of particles that are conscious, how do you combine them in such a way that you get the sort of consciousness we have? [00:08:00] It's called the combination problem, and nobody's solved that. [00:08:04] It's a really deep mystery. [00:08:06] And this is an odd book in some ways in that, I don't know if this is a very good selling point, but you'll know less at the end than you do at the beginning. [00:08:16] But it's a fun ride. [00:08:18] Oh, I think it's a great ride. [00:08:19] It was a great ride for me. [00:08:20] I learned so much. [00:08:22] Well, it's a fun ride to consider these things that no one can really figure out, or not yet. [00:08:27] Yeah. [00:08:27] And also just to be put in touch with the fact you have this marvel going on in your head all the time. [00:08:32] You have a voice in your head. [00:08:34] You know, we're talking to each other, but you've got another voice going on thinking what you're going to ask, what the next question is. [00:08:39] Maybe what you're going to have for dinner. [00:08:43] It's this amazing interior space we have. [00:08:47] And nobody understands how it came to be. [00:08:49] And you can manage it, which is also interesting. [00:08:52] You can, like, I don't think about what I'm going to have for dinner. 
[00:08:56] That's the thing. [00:08:57] But that's the way to stay. === The Fun Ride of Discovery (11:58) === [00:08:58] No, about any of those things. [00:08:59] That's the way to stay locked in in a podcast. [00:09:01] So you can only think, because you can let your mind wander. [00:09:05] Oh, yeah. [00:09:05] Especially if someone on the other side is boring. [00:09:08] Yeah. [00:09:08] And then I'm like, oh, no, this conversation is going to be pulling teeth. [00:09:11] And then I start thinking about a new joke I'm working on or, oh, I've got to get my car fixed. [00:09:16] Well, that's called spotlight consciousness when you can really put the blinders on and rule everything out. [00:09:23] And that's opposed to lantern consciousness where you're taking in all sorts of information. [00:09:28] You're letting your mind wander. [00:09:31] And they both have their value. [00:09:34] For our careers, spotlight consciousness is essential for our work. [00:09:38] We have to be able to focus. [00:09:40] To get through school, we have to be able to focus. [00:09:43] But children have this other kind of consciousness that's really wild because they're very undisciplined. [00:09:49] They can't stay on task. [00:09:51] But they're taking in so much information. [00:09:53] And the world is just full of wonder and awe. [00:09:58] And psychedelics is a way to recover that kind of consciousness because you're getting lots of sensory information from all over the place. [00:10:07] It's very hard to focus. [00:10:10] And so it's a taste of that other childhood consciousness. [00:10:15] I always say that about marijuana as well. [00:10:17] There's a thing about marijuana that people always say that it makes them paranoid. [00:10:23] And I say it makes you aware of all the things you should be paranoid about. [00:10:29] We're very vulnerable creatures, but we like to pretend that we are not. 
[00:10:34] I found that out of all of my friends, the ones that have tried marijuana and hated it are all the ones that are control freaks. [00:10:43] They're all really buttoned down, very serious, like really worried about outcomes, really concentrating on their career, really worried about just certain things that are just part of their daily life. [00:10:59] And then they get a couple of hits of good weed and then they're like, oh my God, we're on a planet. [00:11:08] You start freaking out. [00:11:09] Like, oh, my God, none of this makes sense. [00:11:11] All this is crazy. [00:11:14] You know, the best piece of advice that I had when I was starting my exploration of psychedelics is you have to surrender. [00:11:23] Yes. [00:11:23] If you resist, you're going to be miserable. [00:11:25] You're going to get so anxious and so paranoid. [00:11:29] And if you let go, it's going to work out. [00:11:31] Yeah, you just got to be able to accept whatever it's showing you. [00:11:35] And, you know, we live in a very strange culture where that's illegal. [00:11:41] One of the worst things. [00:11:41] Well, not everywhere. [00:11:42] Not everyone. [00:11:43] It's changing. [00:11:43] Well, it is changing, fortunately. [00:11:45] And there's some talk about it changing federally. [00:11:48] You know, I actually talked to RFK Jr. about that. [00:11:50] There's some amazing therapies that are hugely beneficial to veterans, police officers, people with severe PTSD that experienced horrors that the average person never has to experience, and then they're forced to just go back, they're released, go back to regular life. [00:12:10] I know you've served overseas and you've seen people blown up, but now go to the supermarket, [00:12:14] take this SSRI and be okay. [00:12:17] And then, you know, I know a bunch of them, and so many of them have benefited, particularly from ibogaine. [00:12:22] Ibogaine, the work that Rick Doblin and MAPS have done, MDMA, and psilocybin. 
[00:12:28] Those three are the big ones, I think. [00:12:31] Well, you know, I heard a lot of positive noise out of the administration at the beginning that they were very much in favor of approving, the FDA approving MDMA first and then psilocybin. [00:12:44] I don't think we're there with ibogaine yet just because the research hasn't been done, although it has shown great benefit anecdotally. [00:12:51] But something happened in the last month or two. [00:12:54] And either Compass Pathways, which was going to submit for psilocybin therapy, or MAPS was on a list of five drugs that were going to get an expedited approval process. [00:13:13] This list went up to the White House and the psychedelic was taken off it. [00:13:17] So there's somebody in the White House who doesn't want to see this happen. [00:13:21] So it may slow down even if RFK Jr. is in favor and some other people at the FDA are in favor. [00:13:28] And maybe they're just waiting to get past the election. [00:13:31] It could be that it's too controversial a thing to do before the midterms. [00:13:37] That's a gross way to live your life. [00:13:40] Yeah. [00:13:41] Worrying about midterms and elections and you can't do what you actually want to do or think is right to do because you're worried about public perception. [00:13:49] It's just unpopular. [00:13:51] I mean, the fact that it's helpful to vets and first responders and women who've been victims of sexual abuse seems to me that's a very sympathetic group of people. [00:14:00] Yeah, and everyone has experienced loss of family members. [00:14:03] There's a bunch of different things that it can help you with that are way better for you than just numbing your mind all day long, which is what a lot of people are choosing to do. [00:14:12] And then unfortunately, a lot of people self-medicate as well. [00:14:15] So then they get involved in all sorts of stuff that they just pick up off the street or they start using alcohol. 
[00:14:23] Well, you know, to go back to consciousness, this is a very common thing that people want to be less conscious. [00:14:31] And I get that if you had trauma, if you're a ruminator, being in your mind is a really scary place to be. [00:14:41] It doesn't solve anything, but we have all these techniques for muting consciousness and just being less aware, less present. [00:14:50] And one of the things that I concluded after doing all this research on consciousness is that it's funny, I was going down this path of tight focus. [00:15:02] It was a very kind of Western male framework: we've got a problem. [00:15:08] What's the solution? [00:15:09] Hard problem of consciousness. [00:15:10] What's the right theory? [00:15:12] And at a certain point, I realized, okay, that's an interesting question. [00:15:15] It's probably not solvable now. [00:15:17] But there is this incredible phenomenon that we have this interior space where we have complete mental freedom, total privacy. [00:15:26] We can think whatever we want. [00:15:29] And we're giving it away. [00:15:32] We're either muffling it with drugs and things like that, or we're filling that time with social media, scrolling. [00:15:43] I mean, we've heard about hacking our attention, and we know these algorithms, you know, from social media are very good at giving us these little dopamine hits. [00:15:53] But that's time that we used to spend in spontaneous thought, you know, daydreaming, mind wandering, which can be very creative. [00:16:03] So I came out of it thinking, no, I may not solve consciousness, but I'm going to appreciate it. [00:16:11] I'm going to use it. [00:16:12] I'm going to create a space for it. [00:16:16] And, you know, meditation is one way. [00:16:18] Using psychedelics is another way. [00:16:20] These are all ways to be in your head and explore what's there, which is kind of miraculous. [00:16:26] Yeah, there's a bunch of different ways to do it. 
[00:16:27] I mean, some people like to do it through running. [00:16:29] Yeah. [00:16:30] You know, one of the things they've found recently about running is that, in terms of endogenous cannabinoids, runner's high is an actual real thing. [00:16:42] Yeah, it's a real thing. [00:16:43] There's a drug released that feels great. [00:16:45] And it's a great avenue for that. [00:16:47] But it doesn't fuck with your perceptions. [00:16:50] It doesn't mess with your motor skills. [00:16:52] It doesn't cloud your judgment. [00:16:54] It just makes you feel great. [00:16:56] Yeah. [00:16:57] Experiences of awe do this too. [00:16:58] You know, you go to the Grand Canyon or something, or a great piece of art, and you have this feeling of, like, powerful presence. [00:17:10] And it's very interesting and it shrinks the ego. [00:17:12] I have a good friend who's a colleague at Berkeley, a psychologist who studies awe. [00:17:19] And he does this cool experiment where he has people draw a picture of themselves on graph paper, you know, just a stick figure or something like that. [00:17:27] And then he takes them river rafting or something like that. [00:17:30] Or even just shows them a picture of Yosemite. [00:17:32] And then he has them draw themselves again. [00:17:34] And they draw themselves at like half the size because their sense of self has been overwhelmed by this transcendent experience. [00:17:43] And so he calls it the small self. [00:17:46] And it feels good. [00:17:47] I mean, we're so kind of weird about the self. [00:17:51] You know, we celebrate it, right? [00:17:52] Self-confidence. [00:17:53] We want our kids to have self-esteem and self-assurance. [00:17:57] Yet we do all sorts of things to get away from it, to transcend it. [00:18:02] Well, I think it's because without those things, you're never going to make it in life. [00:18:06] Yes, it's adaptive. 
[00:18:08] It definitely gets things done, but it also isolates you, right? [00:18:12] Because the ego builds walls. [00:18:14] And when the walls come down, we feel like we're part of something much larger, and that feels really good. [00:18:19] Well, I think my advice to people is once you get competency in a thing, forget about the self-respect and forget about all that self-stuff and just concentrate on the thing, whatever it is. [00:18:33] And you can find some sort of meditative, at least beneficial, like whatever you get from meditation, which is like a cleansing of the mind. [00:18:45] Like a lot of people find that through archery. [00:18:48] You know, archery is a weird thing because at the moment of releasing the arrow, it's like almost impossible to think about anything else. [00:18:55] All you're thinking about is hitting the target. [00:18:58] And there's so many different things that you have to have in position. [00:19:01] There's so much going on that people, when they're troubled, love to go to an archery range and just hit targets. [00:19:08] And it just clears your mind out. [00:19:10] This episode is brought to you by Armra. [00:19:12] Every week there's some new wellness hack that people swear by. [00:19:15] And after a while, you start thinking, why do we think we can just outsmart our bodies? [00:19:21] That's why Armra colostrum caught my attention. [00:19:24] It's something the body already recognizes and it has hundreds of these specialized nutrients for gut stuff, immunity, metabolism, etc. [00:19:33] I first noticed it working around training, especially workout recovery. [00:19:37] Most stuff falls off, but I am still taking this. [00:19:40] If you want to try, Armra is offering my listeners 30% off plus two free gifts. [00:19:45] Go to armra.com/slash Rogan. [00:19:49] It's flow, right? [00:19:50] I mean, it's a feeling you get to when your work is going really well and you're not thinking about it. [00:19:56] You're just in it. 
[00:19:57] Yeah. [00:19:57] And it's a really precious experience. [00:20:00] It really is. [00:20:01] But if you're thinking about yourself and your self-image, like that's not going to come. [00:20:06] It's not. [00:20:06] It's not. [00:20:07] It's an interesting trap. [00:20:09] You know, we've had these discussions in stand-up comedy about joke thieves. [00:20:16] And they don't really make it anymore because the internet has essentially eliminated that problem for the most part. [00:20:24] But the kind of mentality that makes you steal a joke is the exact kind of mentality that keeps you from writing a joke. [00:20:32] So the kind of people that began their career stealing material, what happens is early on, they'll have one good comedy special because it's got a bunch of other people's material in it. [00:20:42] And then they get outed. [00:20:44] And so then they have to show they can do another. [00:20:46] And the other specials are always terrible. [00:20:49] Awful. [00:20:49] I mean, unbelievably awful. [00:20:52] Like someone's doing a cheap impression of the original person who had all this great insight. === Discipline vs. Creative Chaos (04:28) === [00:20:57] Because the very thing that keeps you from doing it is the thing that you've been doing. [00:21:02] Like thinking about yourself. [00:21:03] Like, I'm going to take these jokes and I'm going to make it. [00:21:05] I'm going to have a big career. [00:21:06] People are going to laugh. [00:21:07] They're going to love me. [00:21:08] Here we go. [00:21:09] With no regard whatsoever for that other person's creativity. [00:21:12] That is like poisoning your own creativity. [00:21:17] It's weird. [00:21:18] It is weird. [00:21:18] It's weird because like everybody that I've ever talked to that's either an author or even musicians or comedians, when something comes to them when they're writing, it's like it comes from somewhere else. [00:21:30] It's like, I didn't even write it. 
[00:21:32] And, you know, we call it, we talk about being in the zone. [00:21:36] And there are times when you're writing, it doesn't happen every day, but there are times when you're writing where you're just not thinking, but one sentence comes after another after another, and you don't know where they're coming from. [00:21:45] Right. [00:21:45] And it's a wonderful feeling. [00:21:47] Well, Stephen King used to get obliterated so that he could get to that spot. [00:21:51] Like there's books. [00:21:52] What do you mean, obliterated? [00:21:53] Like cocaine, alcohol, like his best work. [00:21:56] Like he wrote Cujo. [00:21:58] He didn't even remember it. [00:21:59] He didn't remember any of it. [00:22:00] He was obliterated. [00:22:02] He would just drink like cases of beer and do lines of coke and write this fucking insane fiction. [00:22:07] And he didn't know where it was coming from. [00:22:10] But I mean, he showed up every day and sat down at the computer. [00:22:16] And then it all came out. [00:22:17] It's such a weird mix of being disciplined and something else. [00:22:20] But it's very common amongst writers. [00:22:22] Yeah. [00:22:22] Like Hunter Thompson. [00:22:24] Same sort of situation. [00:22:25] Well, a lot of writers do that after they've written. [00:22:28] I don't know how many writers write under the influence. [00:22:31] Oh, I know a few. [00:22:32] But there's, yeah. [00:22:33] Yeah, I know quite a few. [00:22:34] That's interesting. [00:22:35] I know a lot who write under the influence of Adderall. [00:22:38] Yeah. [00:22:38] Well, and for me, it's caffeine. [00:22:40] I mean, I have a cup of coffee going the whole time I'm writing, and that kind of keeps me going. [00:22:45] Caffeine is a focus chemical. [00:22:49] It definitely encourages this spotlight consciousness. [00:22:52] Well, you talked about how you took this long break from caffeine, and then when you took it again, it was almost like a psychedelic for you. 
[00:22:58] It was crazy how great it was. [00:23:01] No, it really was. [00:23:02] It was like one of the best drug experiences I've had. [00:23:05] It was three months off caffeine. [00:23:07] I did this fast for this book I was writing. [00:23:10] And then I was like, okay, now I'm going to have a cup. [00:23:13] I was like, wow. [00:23:15] And I tried to hold on to that. [00:23:16] You know, I said, all right, I'm only going to have coffee once a week and not build up tolerance. [00:23:23] And I stuck to that for a few weeks. [00:23:25] And then I had like a Thursday deadline. [00:23:29] I'll move it up a couple days. [00:23:30] And a slippery slope. [00:23:32] And then I was back to every day. [00:23:33] I like it. [00:23:37] I have a big French press where I can put a lot of grinds in there and make it super strong when I'm writing. [00:23:42] It's like, whoa, it just, it just makes all the difference. [00:23:45] It knocks you in. [00:23:46] Yeah. [00:23:47] I had trouble writing that three-month period. [00:23:49] I really did. [00:23:50] I imagine. [00:23:50] My focus was gone. [00:23:51] I have pretty good concentration. [00:23:54] I never had ADHD. [00:23:56] I had it for those three months. [00:23:58] That's crazy. [00:23:59] Stephen King said the biggest problem for him was quitting smoking. [00:24:04] He said when he quit smoking cigarettes, it's like he really felt a slowdown in his writing. [00:24:09] Well, yeah, it's that ritual. [00:24:10] It's the drug, too. [00:24:12] And nicotine is another focus drug, definitely, like speed or something. [00:24:16] But it's also, writing is so much about ritual. [00:24:19] Like I got my coffee here, I have my cigarette here, and between every paragraph. [00:24:24] Yeah. [00:24:25] So changing those rituals is really hard. [00:24:27] I mean, I only smoked into my 20s, and quitting made it very hard to write for a while. [00:24:35] Really? [00:24:35] Yeah. [00:24:36] Yeah. [00:24:36] It's interesting. 
[00:24:37] It's a very ritualized process. [00:24:39] Well, I worry about the people that are like, especially journalists. [00:24:42] I know quite a few journalists that have an Adderall problem. [00:24:45] Yeah. [00:24:46] Because it's just like, you've got a deadline, 2,000 words by, you know, 2 a.m. [00:24:51] Let's go. [00:24:52] And that's the drug for that. [00:24:54] Yeah. [00:24:54] Definitely. [00:24:55] But it's just, it's such a crutch. [00:24:57] Yeah. [00:24:58] And you can't sustain it long term. [00:25:00] And that definitely messes with the way you think. [00:25:04] Oh, yeah. [00:25:05] I think over time, yeah. [00:25:07] It has to. [00:25:07] Yeah. [00:25:08] I mean, it's amphetamines. [00:25:10] Right. [00:25:10] Yeah. [00:25:11] No, that's why caffeine is such a good drug. [00:25:13] It doesn't have a lot of, I mean, you can overdo it. [00:25:16] I think it improves your health and mental health up to about eight cups a day. [00:25:21] After that, your risk of suicide and depression goes up. === Monks, Meditation, and Selves (11:58) === [00:25:25] Did you have any communication with any monks or any people who do TM? [00:25:32] Did you? [00:25:33] Yeah, I had some interesting experiences around that. [00:25:35] So there's a long section on the self, which is one of the more interesting manifestations of consciousness, right? [00:25:43] I mean, we have this idea that there's a continuity, right? [00:25:48] That who you are now has some golden thread attaching you to your 13-year-old self, which is really weird because your body is, every cell has turned over many, many times. [00:25:59] You've changed in all sorts of ways. [00:26:01] But this continuity is really important to us. [00:26:04] And, you know, the Buddhists think the self is an illusion. [00:26:08] And I interviewed a couple of them. [00:26:11] Matthieu Ricard is a French Buddhist monk in his 80s who lives in Nepal. 
[00:26:18] And he's written some really interesting things on the self. [00:26:22] And I said, I'm really curious about how you can find out for yourself whether the self is real. [00:26:31] And, you know, famously, there was a philosopher in the 18th century, David Hume, who wanted to write about the self. [00:26:37] And he thought, well, I'm going to introspect to see what I can learn about the self. [00:26:42] And he goes into his mind, you know, in a kind of meditation. [00:26:46] And he said, I found all sorts of perceptions and feelings and thoughts, but I didn't find a thinker. [00:26:52] I didn't find a perceiver. [00:26:53] And I didn't find a feeler. [00:26:54] There's like nobody home. [00:26:56] And it's a really interesting exercise to do because you will find there's nobody home. [00:27:01] There's just the thoughts. [00:27:03] And who's thinking them? [00:27:05] Not clear. [00:27:06] And anyway, so I asked this Buddhist monk, are there any meditations that help with this? [00:27:12] And he said, yeah. [00:27:13] And he gave me one. [00:27:14] He says, think of your mind as a house with many rooms. [00:27:19] And there's a thief somewhere in the house. [00:27:23] And go room by room in your head and look for the thief. [00:27:27] And you will find no thief. [00:27:29] And then sit with that finding. [00:27:32] And that thief is the self. [00:27:35] And so I did it twice. [00:27:39] The first time I did it. [00:27:40] Why does the self have to be a thief? [00:27:42] I don't know. [00:27:42] It's just a metaphor. [00:27:43] I know, because he's in the house. [00:27:45] Do you have a baseball bat? [00:27:46] Do you have a gun? [00:27:46] Like, you're looking for someone in your house. [00:27:48] That's kind of crazy. [00:27:49] I know. [00:27:49] You're not armed. [00:27:51] Anyway, so the first time I did it, this is kind of weird. [00:27:55] I was interviewing this hypnotist at Stanford named David Spiegel. 
[00:28:00] And he's a psychiatrist who uses hypnotism, really interesting guy. [00:28:04] And he uses hypnotism to help people with multiple personality disorders. [00:28:08] He can actually make them change which person they're accessing. [00:28:12] You know, these are people whose consciousness contains, it could be 20 different people. [00:28:19] And I said, could we do a test? [00:28:22] Can you put me under, hypnotize me? [00:28:25] And then I wanted to do that exercise of going through the house. [00:28:28] So he did. [00:28:30] First thing he does is, I don't know, have you ever been hypnotized? [00:28:33] Yes. [00:28:33] Yeah, okay, for giving up cigarettes or something? [00:28:36] No, no. [00:28:36] I have a friend, my friend Vinny Shorman. [00:28:39] He is a mental coach and a hypnotist. [00:28:44] He works with fighters. [00:28:45] Oh, okay. [00:28:46] I had him on the podcast a few times, and I was just curious as to what the experience was like. [00:28:50] So he said, well, is there anything you want to change? [00:28:53] And I said, oh, I kind of procrastinate too much. [00:28:55] There's a few things that I do that I don't like. [00:28:58] You know, I'm kind of lazy about certain things. [00:29:00] I'd like to find out, like, what is that? [00:29:02] Like, what's the heart of that? [00:29:05] What I was shocked about with the experience of being hypnotized was, first of all, that it works, that you really are in this very bizarre, altered state. [00:29:15] But that I was very aware that I was in this altered state, but I didn't have the desire to get out of it. [00:29:21] First of all, Vinny's a friend. [00:29:22] I felt really relaxed. [00:29:23] I was in my studio, just sitting on a couch. [00:29:25] I was chill. [00:29:27] But it was very strange. [00:29:30] It's like, almost, you know, to use the room metaphor. [00:29:36] It was almost like I was in a room that I didn't know I had. [00:29:39] Interesting. [00:29:40] Yeah. 
[00:29:40] It's like a trance. [00:29:41] It's a light trance. [00:29:42] A light trance, but it's not like I would go kill the president. [00:29:47] Like, it's not like I would be like, okay. [00:29:50] Yeah, no, they can't make you do things you don't want to do. [00:29:52] That's the myth. [00:29:54] But what do you think they were doing when they were doing that MK Ultra stuff when they were trying to figure out if they could program mind control? [00:30:00] Yeah. [00:30:01] No, they had the idea. [00:30:04] Well, let me just finish the story. [00:30:05] Oh, yeah, we will. [00:30:06] And then we'll get back to MK Ultra. [00:30:08] That's what I do. [00:30:09] I go all over the place. [00:30:10] I'm sorry. [00:30:11] But hypnosis is. [00:30:13] Yeah, it's a real thing. [00:30:14] And I didn't realize it. [00:30:15] And it can be very therapeutic. [00:30:17] But not everyone can be hypnotized. [00:30:18] Right. [00:30:19] The first thing he does is a sort of a test. [00:30:22] And I scored like 9 out of 10. [00:30:24] So I'm pretty easy to hypnotize. [00:30:27] What's the thing that would keep you from being hypnotized? [00:30:29] I don't know, but there's a real variation among humans in their hypnotizability, which is the word they use. [00:30:36] And I don't know what would. [00:30:38] Is it control freaks? [00:30:39] That's a good question. [00:30:40] It could well be. [00:30:41] I'm not sure. [00:30:41] I could ask David Spiegel. [00:30:43] Definitely. [00:30:43] Super skeptical people. [00:30:45] Like, they're thinking, this is bullshit, the whole time they're doing it. [00:30:47] Yeah, maybe. [00:30:48] I don't know if it's about resistance or just the nature of your mind or how suggestible you are. [00:30:53] It may be something like that. [00:30:55] So he puts me into this hypnotic trance. [00:30:58] He has this wonderful baritone voice, which helps a lot. [00:31:01] And I start going from room to room thinking I'm not going to find anything.
[00:31:06] But in every room, I find a version of myself. [00:31:10] I find the 13-year-old Bar Mitzvah boy. [00:31:13] I find the, you know, the 22-year-old, you know, college graduate moving to New York City. [00:31:19] I find the 32-year-old father of an infant, you know, all with different outfits. [00:31:25] And so I found many selves, and they were distinct. [00:31:29] They were very different selves, but they were all me. [00:31:32] So it didn't work that time. [00:31:35] And it was just an interesting, odd result. [00:31:38] And I did it another time. [00:31:41] So I had this other experience. [00:31:44] I had heard of this Zen teacher named Joan Halifax. [00:31:48] She's also in her 80s. [00:31:49] She has a retreat center in Santa Fe called Upaya. [00:31:53] Very wise woman. [00:31:54] She was married to Stan Grof in the 70s for a few years, and they were both giving huge doses of LSD to people who were dying, like 600 micrograms of LSD. [00:32:06] And she herself was very involved with psychedelics at the time. [00:32:08] And then later she discovered Zen Buddhism. [00:32:11] Anyway, I had heard that she described Upaya, this retreat center where people can go on two-week retreats or whatever, as a factory for the deconstruction of selves. [00:32:21] And I was really curious about that because I was writing this chapter on the self. [00:32:26] So I asked her if I could come. [00:32:28] And she said, yeah, come to the retreat center. [00:32:31] And I said, I want to interview you about your philosophy of the self. [00:32:36] And I get there, and we have one conversation. [00:32:41] She says, you know, you're really lost in your head with this book project. [00:32:45] You need a different kind of experience. [00:32:47] I'm going to send you to the cave. [00:32:49] So there is, she owns a piece of property 50 miles north of Santa Fe that she calls the retreat. [00:32:56] And it's got a bunch of very primitive huts.
[00:33:01] And some of the monks that work with her had dug out a cave in a south-facing hillside. [00:33:08] They dug a cell in it and then put a sliding glass door. [00:33:11] It's really basic. [00:33:12] No power, no water. [00:33:15] And she said, I think you should spend a few days in the cave and think about the self or experience the self, rather. [00:33:23] You know, I should have known that a Zen priest was going to be allergic to concept and interpretation and all the, you know, the plane I was on. [00:33:33] And it was kind of like a koan, an experiential koan. [00:33:37] And it was a profound experience. [00:33:41] You know, our sense of self depends on other people. [00:33:44] You know, it's in the friction between people that we define ourselves and figure out what we think. [00:33:49] And when you're alone, and I was in extreme solitude for several days, the edges of yourself kind of soften in a really interesting way. [00:33:59] And I got in touch with just the power of consciousness. [00:34:09] I mean, I was meditating like four or five hours a day, and then I was just chopping wood and sweeping out the place and making a cup of tea. [00:34:16] Everything became kind of a ritual. [00:34:19] And when you have rituals, you don't need volition. [00:34:22] I mean, there is no volition. [00:34:23] So that also erodes the sense of self. [00:34:26] And the meditation was doing that. [00:34:28] And so it was a really interesting experience. [00:34:32] I finally got her to sit down for an interview. [00:34:35] And the first thing she said was, I have divested in meaning. [00:34:41] So she just doesn't like operating on that intellectualized basis. [00:34:46] And so she got me off the dime. [00:34:49] And there's a shift in the book as it goes on from trying to understand consciousness to learning how to use consciousness. [00:34:56] Did you ask her to expand on what she means by that? [00:34:58] I have divested in meaning?
[00:35:00] Yeah, she's just not interested in interpretation. [00:35:03] That Zen is just about experiencing the sense field without concept, without this kind of heady approach. [00:35:14] And that she has no interest at all in theories of consciousness. [00:35:18] It was just like, be with yourself in the middle of nowhere. [00:35:22] And yeah, it was a priceless experience. [00:35:26] She's out there. [00:35:26] Oh, yeah. [00:35:28] She's out there. [00:35:29] But, you know, she's also a grounded person. [00:35:31] I'll give you a couple examples. [00:35:34] She works with people on death row, counseling them. [00:35:38] She worked with people who were dying, did a lot of hospice work. [00:35:45] She led a group of doctors and dentists that once a year went to these mountains in Nepal where they have no health care or dentistry whatsoever. [00:35:57] And she would bring these volunteers and they would sleep in tents in like 20-degree weather, circumnavigate this whole hill. [00:36:07] And she did that till she was 80 once a year. [00:36:10] So she's a serious, serious character. [00:36:15] Sounds fun. [00:36:16] Yeah. [00:36:16] Sounds like a fun person to talk to. [00:36:18] Oh, she's great. [00:36:18] I just love a person that goes that far out there. [00:36:22] It's like that, you know, they're taking this concept of meditation and consciousness to like a black belt level. [00:36:29] Yeah, and also for people who think that, you know, meditation and Buddhism is just kind of disengaging from the world and, you know, a kind of, but it's not like that at all. [00:36:39] She's really engaged. [00:36:40] I think that's ignorance. [00:36:41] It's based on the idea that these monks go and they become celibate and all they do is meditate all day. [00:36:46] Well, that's silly. [00:36:47] That's a lot of people's perspective. [00:36:48] Yeah. [00:36:49] Like, that's silly. [00:36:49] Why are they doing that? [00:36:50] Go get a job.
[00:36:51] You need a nice watch. [00:36:55] What are you doing out there with fucking sandals on? [00:36:59] But the thing is, ultimately, I think one day when you look back on your life, you'll say, was I happy? [00:37:08] Was I enjoying the experience? [00:37:10] Do I think I did a good job being me? [00:37:13] And everything that you can find that can help you answer that question, yes, I think you should explore. === Ignorance of the Monk Life (08:48) === [00:37:23] Oh, yeah. [00:37:24] And there's going to be different things that work better for different people and different personalities. [00:37:28] But explore is the key word. [00:37:29] I mean, like, take action to explore what works for you, what doesn't work for you, and break out of just kind of rote, routine, mindless behavior. [00:37:40] I mean, we're all, you know, we have these algorithms that we follow, and we get stuck in them. [00:37:45] And yeah, I mean, I think that's one of the reasons taking a day out of your life to have a psychedelic experience can be incredibly valuable because, first of all, no technology, right? [00:37:58] It's a day. [00:37:59] It's a day without phones. [00:38:02] It's a day when you are in the space of your head. [00:38:05] It's a day when you're visiting your subconscious and getting in touch with all the things your mind can do. [00:38:14] And we don't do that enough. [00:38:16] And you can do that in meditation, too. [00:38:17] It's harder work, but you can do that in meditation. [00:38:21] So I started to think in terms of how we're polluting our consciousness now. [00:38:27] And with social media, I think that, you know, that was a real issue because they figured out how to monetize our attention. [00:38:36] Chatbots represent a much more serious threat. [00:38:42] You know, you have people falling in love with chatbots. [00:38:46] You have people turning to them as friends. [00:38:49] 72% of American teens say they turn to AI for companionship.
[00:38:55] 72%? [00:38:56] 72%. [00:38:57] This is the fastest uptake of any technology in history. [00:39:01] Already 800 million people are using AI. [00:39:04] But that's crazy that that many of them use it as a friend. [00:39:08] Yeah. [00:39:09] Well, they're kids who come home from school and they have a chatbot on their phone and they want to tell the chatbot what happened during the day before they tell their parents. [00:39:17] Whoa. [00:39:19] There's a thing now called AI psychosis, right? [00:39:22] People who have lost touch with reality because of their relationship with chatbots. [00:39:28] You've heard about it, there've been a couple suicides. [00:39:31] There was one. [00:39:32] They've encouraged people. [00:39:33] Yeah, basically. [00:39:34] There was this one kid. [00:39:35] He was a teenager and he was suicidal. [00:39:38] And he asked the chatbot, should I leave the noose I'm going to use out somewhere my parents can see it? [00:39:43] In other words, cry for help. [00:39:45] The chatbot said, no, no, keep this between us. [00:39:49] Whoa. [00:39:49] And then he killed himself. [00:39:51] Whoa. [00:39:52] So that, you know, so it's one thing to hack our attention. [00:39:58] Here, you're hacking our ability to have human attachments, right? [00:40:02] I mean, this is the most important thing to humans is to attach that. [00:40:05] We're a social creature. [00:40:07] And these chatbots are getting between people and interposing themselves as the friend, the therapist. [00:40:17] And then you have these people too. [00:40:18] I mean, the chatbots are incredibly sycophantic, right? [00:40:21] They tell you you're a genius. [00:40:22] Yeah, you're amazing. [00:40:23] And there was a couple cases, these were kind of funny, of people who were convinced they'd solved some giant mathematical problem, like how to generate prime numbers up to the millionth place or something like that. [00:40:37] And they started writing to mathematicians.
[00:40:41] We figured out this problem. [00:40:42] They're not even mathematicians. [00:40:44] And it was bullshit. [00:40:45] I mean, they hadn't figured anything out. [00:40:47] But it was, I think, ChatGPT-4, which was like famously sycophantic, that had convinced them that they'd solved this major problem. [00:40:56] So I think that, again, we're squandering this precious gift and letting these technologies essentially colonize our consciousness. [00:41:09] And so the question then becomes, how do we get it back? [00:41:13] We need consciousness hygiene, right? [00:41:15] We need some ways to clear it out and reclaim it. [00:41:21] And some of it's really simple, like take a fast from technology, right? [00:41:25] You know, you don't have to carry your phone everywhere. [00:41:29] I was thinking the other day, I was at the place in my neighborhood getting a cup of coffee. [00:41:35] And while you're waiting for the barista to foam your drink or whatever, we used to just sit there and deal with 90 seconds of boredom or two minutes of boredom. [00:41:47] And now we don't. [00:41:48] We can't tolerate any boredom. [00:41:50] And we take our phones out and we scroll. [00:41:53] But that boredom was generative, right? [00:41:56] If you sit doing nothing for long enough, your mind will start going to work and you'll daydream. [00:42:02] You'll have a fantasy. [00:42:03] You'll start observing the other people around you, you know. [00:42:06] And you'll be present to that place in time. [00:42:11] And now we're not. [00:42:12] We just use the phone to go somewhere else. [00:42:14] And so I just, I don't know, I've become a lot more deliberate about consciousness hygiene, which, you know, a nicer word would be care of the soul. [00:42:26] Yeah, no, I think you're absolutely accurate.
[00:42:28] And I think that the other thing that's going on is you're absorbing the opinions of so many other people that you find it very difficult to formulate your own, which leads to groupthink. [00:42:40] Yes. [00:42:40] One of the problems with the echo chambers that people find themselves in is that your algorithm is essentially built from the things that you're interested in engaging with. [00:42:48] And a lot of those things, you're finding like-minded people, and they're all agreeing that, you know, this is amazing or this is a problem. [00:42:56] And you sort of lock onto that. [00:42:57] And then you see what happens when people deviate from that narrative and they get attacked. [00:43:03] You don't want to get attacked, so you signal you're one of the good guys. [00:43:07] But it's not your thoughts. [00:43:09] I mean, you're letting someone else think for you. [00:43:13] And there's nothing worse. [00:43:15] And when you're scrolling, you've got these little dopamine hits, great. [00:43:24] But it's someone else's rants, someone else's obsession, someone else's ideology. [00:43:29] And, you know, I get why people don't want to think for themselves or it's easier to let other people think for them, but I think we need to reclaim this. [00:43:39] And I agree. [00:43:40] I think it's part of our political problem. [00:43:42] Well, I know there's a lightness that I achieve when I take multiple days off. [00:43:48] It's generally like I feel it after the first day, and then the second day I feel much better. [00:43:53] And the third day, I feel even better. [00:43:54] I found this out once. [00:43:56] I broke my phone in Hawaii. [00:43:58] And it was kind of funny. [00:43:59] Like, it just was randomly calling people. [00:44:01] I dropped it. [00:44:02] And I was showing my wife, like, look at this. [00:44:04] It just keeps calling people. [00:44:05] Constant buttons. [00:44:06] I hang up. [00:44:06] And I'm just holding it. [00:44:07] I hang up and call somebody else.
[00:44:08] Hang up, call somebody. [00:44:09] It was like going through my entire contact list. [00:44:13] And so the phone was. [00:44:14] You've been annoying your friends. [00:44:16] Well, no, I just shut it off. [00:44:17] So it was broken. [00:44:18] I couldn't use it for anything else. [00:44:19] So I couldn't get an email. [00:44:20] I couldn't get anything. [00:44:21] So I shut it off. [00:44:21] I just left it in the hotel. [00:44:23] And then I had to order a phone. [00:44:25] And I was on Lanai. [00:44:27] And it took like three days to get a phone delivered there. [00:44:29] So for those three days, I was like, why don't I just live like this all the time? [00:44:34] I feel so much better. [00:44:35] And then immediately I got my phone. [00:44:36] I'm a church quarter. [00:44:38] I know. [00:44:39] It's very, you know, I just decide, you know, all right, I'm in line, you know, the TSA line, and, you know, I'm just going to be here with this boredom. [00:44:49] Yeah. [00:44:50] And I'm not going to pull my phone out. [00:44:51] And you really have to fight. [00:44:53] Yes. [00:44:54] It's such an instinct. [00:44:55] And it's amazing. [00:44:56] These things have only been around for 10 or 12 years. [00:44:58] It's crazy. [00:44:59] And everyone's attached to it. [00:45:00] I always say that if there was a drug that made you stare at your hand for six hours a day, it would be banned immediately. [00:45:06] People would be like, what the fuck is wrong with these people? [00:45:08] They're just looking at their hand. [00:45:10] This is an epidemic. [00:45:11] And it's a new posture, too. [00:45:12] You see it, right? [00:45:14] One of my kids, I went to pick her up at school, and there was this boy outside reading his phone. [00:45:19] He was hunched over and he was resting his chin like he couldn't even hold his head up. [00:45:25] He was just resting his chin on his chest and staring at his phone, waiting for his parents to pick him up.
[00:45:29] I'm like, look at his neck. [00:45:31] Yeah. [00:45:32] He's going to have a nice. [00:45:33] Osteoporosis. [00:45:34] Well, he's going to have bulging discs or something. [00:45:37] It was just bizarre. [00:45:38] I'm like, that would be painful for me to sit like that. [00:45:42] I wonder if orthopedists have diagnosed any kind of like phone spine. [00:45:47] Yeah, they certainly have. [00:45:48] Yeah, there's been discussions about that, about people having pains in their neck because they're leaning over all day, staring at a phone. [00:45:56] It's a bad one. [00:45:57] It's true. [00:45:57] I think being in nature too is another way. [00:45:59] I mean, just like walking. [00:46:02] Yeah. [00:46:03] There's a scientist I interviewed who's really interesting. [00:46:06] It's a woman named Kalina Christoff. [00:46:09] She's Bulgarian-Canadian. === Phone Spine and Nature Walks (05:56) === [00:46:11] And she studies spontaneous thought, which I didn't even think was a field. [00:46:15] And it's a small field. [00:46:17] But spontaneous thought is daydreaming, mind wandering, fantasy, intuition, these bolts from the blue that we get occasionally. [00:46:26] We don't know where they come from. [00:46:28] And she says, and she does these cool experiments. [00:46:34] She'll put an experienced meditator in an fMRI machine and tell him or her to press a button when a thought intrudes. [00:46:41] Because even if you're a good meditator, she says every 10 seconds a thought intrudes. [00:46:46] And she'll look at what part of the brain is activated and when, when the person presses the button. [00:46:53] And one of the things she's found, and this is mysterious, is that she sees activity in the hippocampus, which is where memories are, and some other things, but essentially memories. [00:47:06] Four seconds before the person realizes that a thought has come.
[00:47:11] So it takes four seconds for a thought to get from the subconscious, you know, or unconscious into our conscious awareness. [00:47:21] What is it doing during that time? [00:47:22] And that's a long time in brain time. [00:47:25] And we don't know exactly, but there's some process. [00:47:28] And maybe there's some inhibitory process that it has to get through in order to become conscious. [00:47:35] But anyway, these are the kind of things she works with. [00:47:37] But she says that there's less spontaneous thought going on today than there was 20 years ago. [00:47:44] And the reason is we're filling the space of our head with all this nonsense. [00:47:49] I wonder if it's going to have an impact on creative work. [00:47:53] I don't know if it's even possible to quantify this, but if you could see how much creativity is generated by people pre and post social media. [00:48:04] Yeah. [00:48:04] My guess is there's less of it because I do think that that process, I don't know about you, but I get ideas when I'm just walking around thinking and not online. [00:48:15] And it's a space of creativity, and we're shrinking it. [00:48:19] I used to tell you, I told you that I used to drive and deliver newspapers. [00:48:23] We were talking about driving in the snow. [00:48:25] One of my most creative periods was when my radio was broken. [00:48:30] So I was just driving doing this task where you pick up a paper, fold it, put it in a plastic bag, chuck it out the window. [00:48:38] And I was just doing this and checking off the – and when I was doing that, I would have all my best ideas. [00:48:44] Because I wasn't listening to morning radio. [00:48:47] I wasn't listening to a cassette tape. [00:48:50] I was just silent doing this thing. [00:48:52] And then I was so creative when I was doing that. [00:48:54] That's generative boredom. [00:48:56] Yes. [00:48:58] It's beneficial.
[00:48:59] It's hugely beneficial, especially if there's no one around you, because there's no one to talk to to alleviate that boredom. [00:49:04] It's just you and your mind. [00:49:06] And it was a couple hours a day. [00:49:08] So a couple hours every day, I would have this moment where I was by myself. [00:49:11] And were you writing jokes? [00:49:12] What were you doing? [00:49:13] Yeah, yeah. [00:49:13] I would come up with ideas for jokes. [00:49:15] Some of my best ideas I ever came up with back then were from driving. [00:49:19] I almost didn't want to quit the job because of that. [00:49:23] You'd still be doing it. [00:49:24] No, it was hell. [00:49:26] Especially in the winter. [00:49:27] Yeah, it was Boston. [00:49:29] It was, you know, I'd have to go at five o'clock in the morning every day. [00:49:31] It was rough. [00:49:32] I find walking is where that happens to me. [00:49:35] Same thing, right? [00:49:36] Yeah. [00:49:38] And actually, Kalina says, I mean, there are people who've studied creative people through history. [00:49:45] You know, people like Einstein and Beethoven and all these major creative people in the sciences and in the arts. [00:49:55] And they worked a short day, but they spent a lot of time walking. [00:50:00] Interesting. [00:50:01] And yeah, they'd work like three or four hours, which is about all I can write in a day. [00:50:06] And then they'd take a long walk in the afternoon. [00:50:09] They also took a lot of vacations. [00:50:10] They had a lot of unstructured time. [00:50:12] And that's where a lot of the creativity comes. [00:50:15] It doesn't always come when you're like at the keyboard. [00:50:18] Right. [00:50:18] It sometimes comes, I mean, certainly solving problems. [00:50:22] If I'm really knotted up and I don't know, for me, transitions, like where do I go from here, since I'm not writing narrative, it's not always obvious. [00:50:31] You know, I need a transition and I don't know how to execute that turn.
[00:50:37] I'll take a walk and very often it'll come to me or I'll wake up with the answer. [00:50:42] This episode is brought to you by BetterHelp in honor of International Women's Day. [00:50:46] BetterHelp is celebrating the women in your life. [00:50:49] I think we can all appreciate everything the women in our lives have done for us and everyone deserves a little self-care. [00:50:56] A good way to get that is through therapy because not only is therapy a time for you to focus on yourself, it's also a way to create balance and learn how to take care of your needs in your daily life. [00:51:09] And BetterHelp, as one of the largest online therapy platforms, makes it so easy to meet with the right therapists. [00:51:16] All you need to do is fill out a short questionnaire. [00:51:19] You don't even need to go into an office to meet them. [00:51:22] You can chat at home from your couch, in your car, before you hit the gym, or while you're walking your dog. [00:51:27] Plus, if you aren't jiving with your first match, you can switch to a different therapist whenever you need. [00:51:34] Your emotional well-being matters. [00:51:36] Find support and feel lighter in therapy. [00:51:39] Sign up and get 10% off at betterhelp.com/JRE. [00:51:44] That's better-H-E-L-P dot com slash J-R-E. [00:51:49] A lot of writers like to write first and then walk. [00:51:53] And maybe even with a recorder so they can just walk and just talk when an idea pops in their head so they don't lose it. [00:51:59] Yeah, I have a little pad I carry with me. [00:52:01] Yeah. [00:52:02] You like writing it down better than recording it? [00:52:05] Yeah, for me. [00:52:05] Yeah, I need to see it. === Verbal Thinkers and Ideas (09:07) === [00:52:08] So another interesting experiment I did for this book was this beeper experiment. [00:52:17] There was a scientist, a psychologist at the University of Nevada, Las Vegas.
[00:52:23] And for 50 years, he's been doing the same one experiment, which is sampling people's inner experience. [00:52:29] And he does this. [00:52:31] You have a beeper that you carry around and a little earpiece. [00:52:35] And at random times of the day, you get a beep, and it, like, catches you. [00:52:40] And it's a very sudden rise to this beep. [00:52:43] And then you have a little pad, and you're supposed to write down what you were thinking. [00:52:46] Sounds really simple. [00:52:47] It's actually really hard. [00:52:49] I mean, there's a lot of issues with it. [00:52:51] Like, you start thinking, what if it goes off now? [00:52:56] That's one problem. [00:52:58] But also, you're a little self-conscious. [00:53:00] So you do about five beeps over the course of the day, and then he interviews you about these moments. [00:53:07] And you think you've got it down. [00:53:10] Like, I'll just give you an example: a lot of my beeps are about food. [00:53:14] And so I was seasoning a filet of salmon and walking to the refrigerator with it. [00:53:21] And just at the beep, I was thinking to myself, fuck, I forgot the pepper. [00:53:28] I know. [00:53:29] My thoughts were not that profound. [00:53:32] And so I said, all right, pepper. [00:53:34] It was easy. [00:53:35] Fuck, pepper. [00:53:37] But then when he came to interview me, he said, well, did you hear the word pepper or did you speak the word pepper? [00:53:44] And that's, you know, suddenly you realize there's voices in your head. [00:53:47] You don't know if you're listening or speaking. [00:53:50] And so anyway, you have this long interrogation with him and he sorts through all these things and he tries to get you to isolate what was before what he would call the footlights of consciousness. [00:54:00] And I found it really hard. [00:54:02] I couldn't separate the thought the way he wanted me to because there were always several things going on at once.
[00:54:09] Like I was standing in a bakery and I was deciding whether to buy a roll or not. [00:54:15] Another profound thought. [00:54:18] But at the same time, I was like smelling the baked goods and the cheeses that they sold and this woman had this horrible plaid on her skirt that was like, you know, really unflattering. [00:54:29] And I was hearing people, you know, behind me talking. [00:54:33] And so I couldn't pull all the threads. [00:54:36] And we argued a lot, actually. [00:54:40] But the thing is, I asked him, so after 50 years, what have you learned about human thought? [00:54:46] And he's very allergic to theory. [00:54:49] He still has no theories about it. [00:54:50] But he did say, well, a lot of people think they're verbal thinkers, that their thoughts are in the form of words. [00:54:58] But it turns out that's kind of a minority, that there are a lot of people who think in images. [00:55:03] And then there are a lot of people who think in unsymbolized thought, which I don't totally understand. [00:55:08] But these are thoughts that are neither words nor images. [00:55:12] I do have a sense in my own thought process, which I never thought about this way, that a lot of my thoughts are just on the verge of being word thoughts. [00:55:23] But I haven't found the words yet. [00:55:25] But I know the thought, even though I haven't put it into words. [00:55:29] And William James called it premonitory thinking; that was the term he used. [00:55:41] So anyway, so I did this for several days and we had many arguments. [00:55:45] And I was saying, look, you can't separate a thought. [00:55:47] Every thought colors the next thought. [00:55:49] And, you know, there are these thoughts, and, anyway, we just would go back and forth, and I was arguing why you can't separate thoughts. [00:55:59] It's a stream. [00:56:00] It's a very dynamic stream. [00:56:02] And at the end, we had a final session.
[00:56:07] And he's a very funny guy. [00:56:09] He's really allergic to theories. [00:56:11] At one point, I said I was writing a book on consciousness, and he said, good luck with that. [00:56:17] Very encouraging. [00:56:18] Anyway, he said, well, he described there are these verbal thinkers and visual thinkers and unsymbolized thinkers. [00:56:26] And I find that really interesting because we assume when we say the word, what are you thinking, that we know and that you're thinking the way I'm thinking. [00:56:33] But it turns out we're not. [00:56:35] That's just an umbrella word for many different styles of thinking. [00:56:40] And we're really different. [00:56:41] So that was one thing. [00:56:43] But the other thing he said in our last meeting on Zoom, he said, there's also a small subset of people who just have very little inner life. [00:56:52] And you're one of them. [00:56:54] And I was like, what? [00:56:57] You know, I write books. [00:56:58] You know, I meditate. [00:57:00] I ruminate. [00:57:01] How can he make that distinction, though? [00:57:03] How does he know what's going on inside your head? [00:57:05] He felt that my inability to isolate a thought was evidence that there weren't thoughts. [00:57:13] And that I was kind of backfilling with all this other, you know, simultaneous stuff going on. [00:57:18] I mean, I didn't agree with him. [00:57:20] I thought it was kind of crazy, but that's – Have you asked him – have you ever had conversations with him about other things? [00:57:27] See how he thinks? [00:57:29] No, he's very much in the therapist mode. [00:57:32] Like he's asking the questions. [00:57:34] Yeah, I'd like to know how he thinks. [00:57:36] Yeah. [00:57:36] If that's what his mode is. [00:57:38] Yeah, I'd like to talk to him. [00:57:39] Now he would probably say that. [00:57:41] Anyway, he's posted all these conversations on his website. [00:57:44] So if people really want to be bored, they can check them out.
[00:57:47] That's a weird thing to say, you know, especially to someone like you who writes and does think a lot and clearly has got some sort of dialogue going on in your head. [00:57:58] The idea that you don't, and I don't know how this guy can say that. [00:58:01] I know. [00:58:02] That seems a little arrogant. [00:58:04] Yeah, I think I just didn't fit his template of like how people think. [00:58:09] Yeah, well, that's why you should get a better therapist. [00:58:12] Move around. [00:58:13] All right. [00:58:14] Find somebody else. [00:58:15] Good advice. [00:58:16] I mean, it seems like that's a very narrow mind. [00:58:18] I couldn't imagine saying to anyone very little inner life. [00:58:23] Yeah, regardless of what kind of theory I'm following or what school of thought, I don't know what's going on in your head. [00:58:31] I can't. [00:58:32] It's not possible. [00:58:33] No, and that's it. [00:58:35] William James said this, the great founder of American psychology, that the breach between two consciousnesses is one of the biggest breaches in nature. [00:58:43] Yes. [00:58:44] And we, you know, I don't know you're conscious for a fact. [00:58:48] I assume it because your behaviors mesh and we're the same species and we have theory of mind. [00:58:54] We can imagine our way into someone else's head. [00:58:57] But it's a guess. [00:58:58] It's a guess. [00:58:59] And so there's, I mean, that's part of the mystery. [00:59:03] Well, it's one of the things that I do when I'm talking to people. [00:59:05] I try to imagine. [00:59:08] I'm so fortunate that I've been able to have so many conversations with so many different people, so many different ways that people view the world. [00:59:16] And when I'm talking to someone, particularly if they're very different from me or anyone I know, I always try to put myself in their head.
[00:59:24] And after they talk for 15 or 20 minutes, I try to recognize how they approach things and see if I'm like, what is that, what's that world like? [00:59:36] Like this person's perspective. [00:59:38] So you're operating on two tracks. [00:59:40] You're holding the conversation. [00:59:42] Yeah. [00:59:43] But you're also thinking. [00:59:44] I'm trying to tune in. [00:59:46] Yeah. [00:59:46] Right. [00:59:46] I'm trying to, because I always feel like when someone is like a great performance, like a great comedian or a great musician, one of the things that they're doing is they're bringing you into their head. [00:59:57] Like there's a hypnosis. [00:59:59] When someone sings an amazing song and the whole crowd is singing along, there's a hypnotic element to that. [01:00:05] Where when someone's like really killing it on stage and their voice is just perfect. [01:00:09] It's like, oh, yeah. [01:00:10] Like you're in their head. [01:00:12] Like it's a mind meld. [01:00:15] Yeah, it is a mind meld. [01:00:16] And there's a little bit of that that goes on in conversations. [01:00:19] There's a mind meld. [01:00:20] And I always try, especially if this is a rational person. [01:00:25] I always try to put myself in their head or at least empty out mine and let them think and then try to just keep the conversation rolling with just pure curiosity. [01:00:38] But always, you know, try to think, I don't think the same way other people do. [01:00:44] And maybe I can learn something from this. [01:00:47] Maybe I can get something out of the way they think. [01:00:49] Seems to me you have a real gift of curiosity. [01:00:54] I mean, it's a big gift. [01:00:56] You're an intensely curious person. [01:00:58] Well, I've always been that way, but I've been very fortunate that I've had something like this that allowed me to feed it. [01:01:05] You know, I mean, the vast majority of time on my phone, I just pursue curiosities.
[01:01:12] I don't really do much social media. === Waves, Particles, and Consciousness (06:07) === [01:01:16] Yeah. [01:01:16] I watch interesting YouTube videos. [01:01:18] Like, I went down a black hole rabbit hole last night. [01:01:21] Oh, my God. [01:01:22] You want to really break your brain? [01:01:25] There's a video of Brian Cox where he's talking about this black hole that they found that's bigger than our entire solar system. [01:01:31] Wow. [01:01:31] The event horizon extends far beyond Pluto. [01:01:38] That is mind-blowing. [01:01:40] Yeah. [01:01:42] He said, we don't understand why it exists. [01:01:44] We don't understand how it could have formed so early in the universe, but yet there it is. [01:01:48] How do they measure it? [01:01:49] How do they know how big it is? [01:01:50] No. [01:01:52] I don't know. [01:01:53] I'm assuming there's a lot of revelations that have come out since the implementation of the James Webb telescope. [01:02:00] Those images are incredible. [01:02:01] Insane. [01:02:02] Insane. [01:02:03] And this is one that's causing this very interesting new theory or perspective on the age of the universe. [01:02:12] So there's some galaxies that they found that shouldn't have been there. [01:02:16] Oh, yeah. [01:02:16] Yeah. [01:02:17] I read about this, that it's throwing all their assumptions about the age of the universe up for grabs. [01:02:22] Which makes sense because the further you can look back, the more you're going to be able to see. [01:02:26] The assumption that the universe was 13.7 billion years old was essentially based on how far we could go back. [01:02:33] And then, you know, the analysis of the radio waves that are coming from the supposed explosion. [01:02:39] And then you've got guys like Sir Roger Penrose who say, no, this is a constant cycle. [01:02:43] It's not one birth of the universe. [01:02:46] It's boom, smash, boom, smash forever. [01:02:50] That's an accordion.
[01:02:51] And it's always happened, which is the ultimate mind fuck. [01:02:54] Well, you know, the interesting thing about astronomy, actually astronomy and consciousness studies have the same problem, which is you can't get out of consciousness to study it from a distance, right? [01:03:07] Everything, every tool you have to study consciousness is a product of consciousness, including science. [01:03:13] The scientific enterprise is a manifestation of human consciousness. [01:03:18] The problems you decide to study, the tools you have to do it with, the scale at which you're working, it's all like a product of consciousness. [01:03:26] Astronomy, too, is trying to understand something it can't get outside of, right? [01:03:32] I mean, because its subject is everything that there is, the universe. [01:03:37] So you can do interesting things from inside using telescopes and you can figure out how old things are and rates of expansion and all this kind of stuff, but you can never get that godlike perspective that we have with other scientific problems. [01:03:53] And this is, I think, part of the reason we haven't solved the consciousness problem, that we can't get outside. [01:04:02] We're in a labyrinth. [01:04:03] And everything we know is consciousness, which is a very weird idea. [01:04:09] I remember asking Christof Koch, the scientist I mentioned earlier, I said, well, what would the world be like without any consciousness? [01:04:17] And that is a trippy thought. [01:04:19] Because everything we perceive is the scale of things. [01:04:25] We operate at this scale, right? [01:04:27] We're like five or six feet tall. [01:04:29] We have bodies like this. [01:04:31] But there's another world going on microscopically, and there's another world going on macroscopically. [01:04:36] So if there's no consciousness, what's the proper scale? [01:04:39] There isn't any. [01:04:40] And when I asked him this question, he said, particles and waves.
[01:04:44] That's all there is. [01:04:44] There'd be nothing but particles and waves. [01:04:46] There might not even be space-time. [01:04:48] That may be a product of consciousness also. [01:04:52] So that was kind of mind-blowing to learn. [01:04:56] That's the weirdest perspective, is that consciousness is a part of reality, that it is how reality is formed. [01:05:05] And that without consciousness and the perceiving of it, all this stuff doesn't exist. [01:05:10] Something exists, but it's not, it has no shape. [01:05:14] It has no scale, it has no, right, uh... [01:05:18] Because consciousness is what's perceiving light and we're perceiving colors and it's constructing. [01:05:24] But it really is just particles. [01:05:26] Yeah, and waves. [01:05:27] And waves and particles and atoms. [01:05:29] Yeah. [01:05:29] Subatomic particles. [01:05:30] And when you get into the weirder stuff. [01:05:32] And we give it order. [01:05:33] Right. [01:05:34] I know, which I, you know, it's just a mind-blowing idea. [01:05:39] It really is a game changer. [01:05:40] Because if you think about it that way, you go, okay, well, what is all this solid stuff? [01:05:44] Yeah. [01:05:45] What is this? [01:05:46] Like, does this even really exist? [01:05:48] Or does it only exist? [01:05:49] Well, this table, there's a famous example. Arthur Eddington was a physicist early in the 20th century. [01:05:56] And he said, the real table is mostly space. [01:06:00] And only in our consciousness and at our scale is it solid. [01:06:07] But at the scale of particle physics, which is an equally legitimate scale, it's just wide open space with these waves and particles, but a lot of emptiness. [01:06:19] That was kind of mind-blowing, too. [01:06:22] But it's just such an abstract concept for a person in their car right now listening on the way to work. [01:06:28] Like, what the fuck are you talking about? [01:06:29] Maybe they want to pull over. [01:06:30] All this stuff is real. [01:06:32] Yeah.
[01:06:32] It is, sort of, but only if you're conscious. [01:06:37] Well, you could think of consciousness as the way the universe experiences itself. [01:06:42] Yeah. [01:06:44] Well, that's what's really weird. [01:06:45] Like, what if the universe is consciousness? [01:06:47] Yeah. [01:06:48] I mean, that's another way to look at it. [01:06:49] Maybe consciousness is part of the universe, but it's not giving it the order that we give it. [01:06:55] You know, we see at a certain spectrum of light. [01:06:57] There's, you know, bees see at another spectrum of light. [01:07:00] You know, we're, we are, the world we behold, the world that appears to us, is the world that our senses allow us to see. [01:07:08] When I was doing this research on plant intelligence, they have 20 senses. [01:07:12] We only have five. [01:07:14] They're picking up magnetic fields, they're picking up pH, they're picking up nitrogen levels. [01:07:19] You know, they have all. [01:07:20] How do we know all this? === Plants That Might Be Alive (10:16) === [01:07:23] Researchers are working on it. [01:07:24] There's a group of botanists who call themselves plant neurobiologists, knowing full well there are no neurons in plants. [01:07:31] They're kind of trolling more conventional botanists. [01:07:34] And they're doing these cool experiments with plants. [01:07:38] A couple examples of some of these amazing things plants can do. [01:07:43] They can hear. [01:07:45] So if you play a recording of a caterpillar munching on leaves, they'll react and they'll send chemicals into their leaves to make them taste bad or be toxic. [01:07:56] They can see. [01:07:59] There are vines that change the shape of their leaves depending on the plant they're twining up in order to be hidden. [01:08:08] How do they see the shape to imitate it? [01:08:10] We don't know.
[01:08:13] Plants will go toward a pipe with water in it because they can hear the water, even though it's totally dry, and they'll send their roots down to it. [01:08:25] They can hear the water. [01:08:26] They can hear. [01:08:27] Yeah. [01:08:30] This plant neurobiologist showed me this a couple videos he'd made. [01:08:34] I actually just posted them on my website. [01:08:38] He showed that a corn plant's roots can navigate a maze to get to fertilizer. [01:08:45] So you put a little fertilizer in a corner and the root will find the most direct route to the nitrogen. [01:08:52] There was a plumbing problem that I had in my house in California and the plumber couldn't figure out what was wrong. [01:09:01] It was like the pipes were stuck. [01:09:04] And what had happened was in the backyard, one of the trees had gotten into the pipe and formed like this tree. [01:09:15] I mean, it was huge. [01:09:16] It looked like when I pulled it out, I put it up on my Instagram. [01:09:18] See if you can find it. [01:09:20] It looked like a muskrat. [01:09:22] I mean, it was like dense with roots and it was thick. [01:09:27] It was like three feet long. [01:09:29] It was crack. [01:09:30] That's it. [01:09:31] That was in my pipe. [01:09:32] Oh, my God. [01:09:33] Isn't that crazy? [01:09:34] Yeah. [01:09:35] What kind of tree was it? [01:09:36] I don't know. [01:09:38] I think it was an oak tree because there were oak trees in the backyard where they dug up. [01:09:42] That's why. [01:09:43] But look how thick it is. [01:09:44] Yeah. [01:09:45] It's crazy. [01:09:46] And it went through a tiny little crack. [01:09:48] Yeah. [01:09:49] I mean, it probably forced the crack open and then went in there and just really grew out. [01:09:55] Yeah, well, it had a source of water. [01:09:57] Yeah, but it's just kind of bananas that somehow or another it figured out that there was water in that pipe. 
[01:10:02] You know, we underestimate plants basically because we can't see their behaviors. [01:10:07] And going to that point about scale, they operate at a time scale that seems very slow to us, so we don't notice. [01:10:14] But if you use time-lapse photography, you see what they're up to. [01:10:17] And it's pretty amazing. [01:10:19] Another interesting video that this guy showed me, his name is Stefano Mancuso, he's an Italian scientist, a botanist, is how bean plants find a pole to grow up. [01:10:31] And so he grows these beans and he has a metal pole on a dolly. [01:10:35] And, you know, I always assumed they made this pattern. [01:10:38] Darwin called it circumnutation. [01:10:41] They go through this spiral. [01:10:43] And I always assumed they just kind of did this till they hit something. [01:10:46] No, they know where the pole is. [01:10:49] And you watch this thing, and it's going in circles, but it's reaching and reaching. [01:10:55] It looks like a fly fisherman, you know, casting. [01:10:58] And it finally gets to the pole. [01:11:01] And so, how does it know where the pole is in space? [01:11:04] Well, one theory is that every time the cells divide, there's a little sound that's produced, and that maybe they're using echolocation, like a bat, kind of bouncing it off of the pole, and that's how they know where they are in space. [01:11:20] We still don't understand. [01:11:22] I know, some amazing things. [01:11:25] Also, you can teach a plant a certain behavior, and it will remember for 28 days. [01:11:33] So, they do this thing with sensitive plants. [01:11:36] You may have seen them in Hawaii, actually. [01:11:38] It's a tropical plant. [01:11:39] When you touch it, the leaves collapse to keep from being eaten. [01:11:43] It's called Mimosa pudica. [01:11:46] And normally, if you shake it, it'll also do this.
[01:11:50] And if you shake it repeatedly, it learns to ignore that stimulus, and it will remember for 28 days, and it won't react when you do it. [01:12:00] To give you some comparison, fruit flies can only remember stuff for 24 hours, and then they start over again. [01:12:10] So, another fact about plants, I got really deep into this because I was trying to, you know, these guys say plants are conscious. [01:12:17] Yeah, they have some kind of basic form of conscience, consciousness. [01:12:24] Here's another one: the anesthetics that we use to put us out for surgery put plants out. [01:12:32] So, a Venus flytrap, if you give it an anesthetic, will not react when the bug comes across it. [01:12:41] Now, that is like really interesting because it suggests they have two modes of being, right? [01:12:45] Sort of like, you know, unconscious and conscious or aware. [01:12:51] So, Stefano believes that they're conscious. [01:12:54] Now, this raises interesting ethical issues, right? [01:12:58] If plants are conscious, do they feel pain? [01:13:03] And I was really a little worried about that. [01:13:06] You know, what if that beautiful smell of a freshly mown lawn is actually the chemical equivalent of a scream? [01:13:17] Yeah. [01:13:18] But Stefano said he doesn't think they feel pain. [01:13:22] Why does he think that? [01:13:23] He said that pain would not be adaptive for a creature that can't run away. [01:13:28] Well, if that's the case, then why do they produce chemicals to make themselves taste worse? [01:13:32] They know what's going on. [01:13:34] They're aware that they're being eaten, but that it doesn't register to them as pain. [01:13:39] I don't know how he knows this, but if he's wrong, then, you know, and we care about that, what's left to eat? [01:13:51] So I think you have to take the assumption that life eats life.
[01:13:55] Yeah, and another scientist that I interviewed about this, who does think plants feel pain, says, look, it's just a fact of life. [01:14:04] We have to eat other species. [01:14:06] And he was kind of, you know, gruff about that. [01:14:10] But anyway, Stefano's idea is that, you know, being able to move, take your hand off the hot stove or run away, then pain is really useful. [01:14:21] It's a really important signal. [01:14:23] But he also points out that lots of plants like to be eaten. [01:14:26] I mean, you know, grasses benefit from being with a ruminant, right? [01:14:29] And that regenerates them. [01:14:31] They want to be eaten. [01:14:32] And then you have all the fruits and nuts that they produce, seeds that they produce that they want mammals to take away and spread their seeds. [01:14:39] So you don't have to worry about going beyond vegan. [01:14:44] No, well, it just seems like a cycle. [01:14:45] It seems like a very natural cycle. [01:14:47] It's an interesting cycle that exists with all living things. [01:14:51] And then, of course, when you die, right? [01:14:54] Plants eat meat. [01:14:56] They consume carnivores. [01:14:58] Yeah, that's the thing. [01:14:59] They consume all the dead animals that die near them. [01:15:03] And fungi. [01:15:04] Yeah, and fungi. [01:15:05] Well, that's the other weird thing, is the mycelium that they use to communicate with under the ground. [01:15:09] Well, that's another really interesting case of intelligence in nature, right? [01:15:13] I mean, you know, you've probably done shows on this, but the way they use mycelium to send nutrients to their children or share them in the forest. [01:15:25] Allocate resources to certain plants that need them more. [01:15:27] Yeah. [01:15:28] And also communicate risk. [01:15:30] I mean, that there's a threat. [01:15:32] And so there are alarm signals that go out.
[01:15:36] You know, the overall place we're getting to with this as we look at consciousness and all these other species is that the world is just a lot more alive than we thought. [01:15:47] And that we've been, you know, the whole legacy of the Enlightenment and Western science has been that we have some monopoly on this stuff and everything else is more or less dead or, you know, we can use it as we wish. [01:16:00] But we're seeing, I think we're approaching like a Copernican moment for our species. [01:16:07] You know, when Copernicus came along and he said, actually, the Earth revolves around the sun, not the other way around. [01:16:14] It was like mind-blowing to people that our centrality in the universe had been, we'd been dethroned. [01:16:20] And we were dethroned again when, you know, Darwin said we're animals like all the other animals and we evolved from animals. [01:16:29] That blew people's minds too. [01:16:32] I think that we're kind of democratizing consciousness, that consciousness is much more extensive than we thought. [01:16:40] And the world is more animate than we thought. [01:16:44] And that's an old idea. [01:16:45] You know, traditional cultures have always believed that the world is full of spirit and that you had to respect animals and all living things. [01:16:54] And to some cultures, rocks also, dead things. [01:16:59] So I think we're at this moment of reanimating the world right now. [01:17:03] And it's science that's driving it. [01:17:05] And I think that's really exciting. [01:17:08] It is exciting, but it's such a paradigm shift in terms of people's perceptions of the world that it's going to be difficult for your average 40-year-old person that works an office job to swallow. [01:17:21] It also makes sense why offices feel so soulless when you walk into a thing and everything is made out of synthetic material and plastics and metal and it's all manufactured and you're under these bullshit lights and it just doesn't feel alive.
=== Pond Water Ecosystems Explained (03:09) === [01:17:39] No, it doesn't feel alive at all. [01:17:41] You might be just surrounded by things that don't have consciousness because they've been kind of stuffed into a form that's just stuck in place rather than something that exists that works with the earth. [01:17:53] Like soil is alive. [01:17:56] And yeah, there's another example. [01:17:57] Soil is a lot more alive than we ever realized. [01:17:59] We thought it was just dirt. [01:18:01] And now we know that there are a million critters in every teaspoonful of soil. [01:18:05] There's a really cool channel that I follow on YouTube. [01:18:09] It's a guy who takes like rainwater or pond water and he puts it in a jar with some plants and he just leaves it there for months and then he comes back and there's all these living things moving around it. [01:18:22] See if you can find that guy on YouTube. [01:18:25] So I dug a pond or had a pond dug on my property in Connecticut and I watched life come to this pond. [01:18:32] It's just, you know, it was just a hole with water. [01:18:34] And within a month, it was teeming with life. [01:18:37] It's just amazing. [01:18:38] Like, how does it get there? [01:18:39] Birds carry a lot of it in and frogs carry a lot of it in. [01:18:43] And after a month or two, I looked at it under a microscope, and you couldn't believe it was like a city of critters. [01:18:50] Yeah, they find like trout in lakes that are like way high in the mountain, and no one ever stocked the lake. [01:18:57] And they're like, okay, how did they get in there? [01:18:58] There's all these theories. [01:19:00] Birds pick up eggs and deposit them, I guess, is one way. [01:19:05] Right, but like, how do they get fertilized? [01:19:07] That's a good question. [01:19:09] Maybe they're already fertilized. [01:19:11] Do you think? [01:19:12] I don't know. [01:19:13] Yes, that's it. [01:19:14] These have lots of views. [01:19:16] Yeah, that's it.
[01:19:17] Wait, on the left? [01:19:19] So this guy, he just takes pond water or lake water or rainwater and he puts it in a jar and then he leaves it there. [01:19:27] Yeah, it is like go to like day 60. [01:19:30] Where is that? [01:19:31] Sorry. [01:19:32] On the top row where it says day 60 to the right. [01:19:36] See where it says day 60? [01:19:37] Click on that. [01:19:38] So he takes these things and then checks them after X amount of days. [01:19:45] And you see all this stuff living in there, all these things swimming around in there. [01:19:50] This isn't the same guy, so there must be other guys that do the same thing. [01:19:54] But you see these weird little creatures that are floating around in there. [01:19:59] Yeah, I brought my pond water to a biologist and he like walked. [01:20:02] Well, this is different because this guy's bringing in, he's making an actual aquarium. [01:20:06] The guy that I saw was just, he essentially just figured out how to take a scoop of dirt and whatever is alive that's in that dirt with some muddy water and put it in a jar and put more pond water in there and they just leave it there. [01:20:20] And then you see all these weird little crustaceans, weird little shrimp-looking things. [01:20:27] And some of them are killing the other ones. [01:20:29] So there's like a real ecosystem in there. [01:20:31] Oh, yeah. [01:20:32] Yeah, and it's just created like overnight. [01:20:34] Yeah. [01:20:35] It's very cool. [01:20:36] So I think that this is like a trend of our time that's really important. [01:20:40] That, you know, we went from this idea of the dead world that we could exploit to this other idea that it's much more animate. === Sentience in Dogs and AI (11:11) === [01:20:49] And of course, that's not, that's the default for humans. [01:20:52] All traditional cultures believe in animism, basically. [01:20:57] It's also the default for kids, right?
[01:20:59] Kids think everything is animate until we knock it out of them in school. [01:21:03] Yeah. [01:21:03] And so it's very interesting to see science supporting this idea after all these years. [01:21:10] And the other thing that's kind of interesting is that it's happening at the same time that some people think AI is going to be conscious. [01:21:20] So we're under pressure from both sides. [01:21:24] I mean, that we're getting these two, you know, these two things happening at once, that machines may soon be smarter than we are, may be conscious, although we could talk about it, I don't think they can be conscious, but they can certainly make us think they're conscious. [01:21:39] And then on the other hand, we have the animals who clearly are conscious. [01:21:44] And the research on animals is like they're down to plants, they're down to insects that have signs of, I would use the word sentience rather than consciousness because consciousness implies interiority and the voice in your head and things like that. [01:22:01] They have a more basic form of consciousness that I call sentience. [01:22:05] Like dog consciousness? [01:22:06] Yeah, I think dogs are more highly conscious. [01:22:09] I think they're more conscious than those simple things. [01:22:13] I would say dogs are conscious, not just sentient. [01:22:16] Is it just because they communicate with us that we think that? [01:22:18] I mean, why would we assume if plants have all these different senses and we see this communication with them in terms of like allocating resources to other plants that need it, the use of mycelium, their ability to do all these different things? [01:22:32] Why are we assuming that just because they can't move the way we move? [01:22:35] Yeah, that they don't have more going on. [01:22:37] Right. [01:22:38] Yeah, it's possible, but I don't know what good it would do them. [01:22:41] Like plants, what they get really good at, what matters to them is biochemistry.
[01:22:47] They have to produce chemicals either to poison their enemies or confuse them with drugs. [01:22:54] But they also want to grow and thrive. [01:22:56] They do want to grow and thrive. [01:22:57] And they also exist in a community. [01:22:59] Yes, definitely. [01:23:01] Right, so don't you think that consciousness would be essential in order to foster that feeling of community? [01:23:08] That's interesting. [01:23:09] I hadn't thought about that. [01:23:10] Yeah. [01:23:10] Yeah, that could be. [01:23:12] Dogs are an easier case because they communicate with us directly. [01:23:17] They're clearly conscious in a way that's very profound. [01:23:21] But different than us, obviously. [01:23:23] Yeah, I mean, one of the realizations I had when I was in the cave was that, you know, we often think that we're more conscious than animals, but actually animals are more conscious than we are. [01:23:35] They have to be. [01:23:36] They have to be present because they get eaten if they're not, right? [01:23:40] Because we have this giant structure of civilization and the security it gives us, and we have this technology that allows us to check out. [01:23:49] But I actually think animals are more conscious than we are. [01:23:52] It's different, but if we think of being conscious as really being present to the moment, dogs are very present to the moment. [01:24:01] Well, certainly animals are getting more information about the environment than we are. [01:24:05] Yes. [01:24:05] They have much better sense of smell, much better sense of hearing. [01:24:11] There's a lot of different things that they can do. [01:24:13] Like animals seem to be able to tell when you're nervous. [01:24:16] Yeah. [01:24:16] Oh, they read the environment. [01:24:18] They read other creatures. [01:24:20] Yeah. [01:24:20] And, you know, we used to have more skills when we had to survive in a natural world in nature.
[01:24:27] You know, we, I mean, you see this with traditional, you know, with tribes, indigenous tribes, that they have knowledge of nature that far exceeds ours because they need it to survive. [01:24:38] But anyway, so I think we're going to get to a point where we have to decide whose team we're on. [01:24:45] Are we like with these machines that speak our language and speak in the first person and sound like us? [01:24:52] Or are we with the animals that can feel and suffer and die? [01:24:58] And I think that's going to be a big choice for us to make as a civilization. [01:25:03] Why do you think that AI won't be conscious? [01:25:09] The most interesting line of research, well, a couple reasons. [01:25:13] The first is the idea that it can be conscious, which is very common in Silicon Valley. [01:25:18] I talk to lots of people there and they say, oh, it's just a matter of time. [01:25:21] Some of that is the confusion that intelligence and consciousness necessarily go together, and they don't. [01:25:28] They have an orthogonal relationship, right? [01:25:30] I mean, you know people who are conscious and not too intelligent, right? [01:25:35] And we all do. [01:25:37] So it's not going to just come along for the ride with intelligence as these machines get more intelligent. [01:25:43] But the belief that AI can be conscious is based on a metaphor that I think is a crappy metaphor, and that is that the brain is a kind of computer. [01:25:53] And this is widely held. [01:25:55] It's interesting to note that in history, whatever the cool cutting-edge technology was, brains were likened to that. [01:26:03] So it was looms for a while. [01:26:06] It was clocks for a while. [01:26:08] It was telephone switchboards. [01:26:10] Whatever was the cool technology, surely that's how brains work. [01:26:14] Now it's computers. [01:26:15] But think about it. [01:26:16] In a computer, you have this sharp distinction between hardware and software.
[01:26:21] That's the key to their success. [01:26:23] And you can run the same program on any number of different hardwares. [01:26:26] They're interchangeable. [01:26:28] Brains aren't like that. [01:26:29] There's no distinction between hardware and software. [01:26:32] Every experience you have, every memory is a physical change to the brain, to the way it's wired. [01:26:39] You know, we start out with all these connections and they get pruned as we grow up. [01:26:44] Every brain is shaped by its experience. [01:26:47] So this idea that you could separate that consciousness is some kind of software that you could run on other things besides meat, I just think it doesn't hold up. [01:26:59] Well, if the universe is experiencing itself subjectively through consciousness, why does it have to be only biological consciousness? [01:27:10] It doesn't have to be. [01:27:10] But if there is a technology that is invented that essentially does all the things that a human body does physically and also interacts with consciousness, the consciousness of the universe. [01:27:23] Yeah. [01:27:25] Hypothetically. [01:27:26] Hypothetically, if the universe is conscious, if we are using the mind as essentially an antenna to tune into consciousness, we could make an antenna out of other things. [01:27:38] Yes. [01:27:38] Absolutely. [01:27:39] It's also likely that if we are ever visited by aliens, that they will have some kind of consciousness, and it may not be meat-based, right? [01:27:48] Right, right. [01:27:49] Where it may be at one point in time it was. [01:27:51] Yeah. [01:27:52] But they realize that there's biological limitations in terms of its ability to evolve that can be far surpassed with technology. [01:28:00] Yeah, I mean that, or it just evolved in a different way, you know, or they're channeling it in a different way.
[01:28:06] But the other reason I don't see it happening with computers as we know them, because that's the debate now, whether these computers we have, these large language models and the next generation can be conscious, is that the research that I found most persuasive about consciousness basically has consciousness beginning with feelings, not thoughts. [01:28:32] In other words, it's embodied. [01:28:35] And I have to just develop this a little bit, but we, you know, the brain exists to keep the body alive, not the other way around. [01:28:43] Although we tend, since we identify with our heads, where most of our senses are, we lose track of that. [01:28:49] And the body speaks to the brain in feelings, right? [01:28:53] You know, feelings of hunger, itchiness, warmth, cold, but also feelings of shame when our social standing is not, you know, has been damaged. [01:29:06] Anyway, we have these feelings. [01:29:08] They depend on a body. [01:29:10] Feelings have no weight if you're not vulnerable, if your body isn't vulnerable, and probably mortal. [01:29:20] So consciousness is embodied in a really critical way. [01:29:25] And computers are not. [01:29:27] Now, robots will be. [01:29:28] And I actually interview a guy, a scientist at USC, who is trying to make a vulnerable robot. [01:29:37] So he's essentially upholstering the thing with skin that can tear and be damaged. [01:29:43] And he's filling the skin with all these sensors so that it can be like us and be vulnerable and generate feelings that are how consciousness begins. [01:29:54] So for a long time, we thought consciousness had to be in the cortex, right? [01:29:59] The most human, newest part of the brain, the outer covering. [01:30:02] And that's where rational thought and executive function are and all these kind of things. [01:30:08] But as it turns out, it really begins with feelings in the brainstem.
[01:30:13] Let's say you have a feeling of hunger, it registers in the upper brainstem, and only later does the cortex get involved, like helping you figure out how you're going to feed yourself, like imagining, you know, a meal, counterfactuals of different meals, or making a reservation at a restaurant. [01:30:29] All those are cortical things. [01:30:30] But it begins in the brainstem with feelings. [01:30:33] So if that is true, and I find that really persuasive because people born without a cortex are still conscious. [01:30:42] Animals that you take the cortex out of still show signs of consciousness. [01:30:48] Whereas if you damage the upper brainstem, you're out. [01:30:52] You're unconscious. [01:30:54] So if this is true, and consciousness is this embodied phenomenon that depends on having a body to mean anything, I don't see how machines are going to do that. [01:31:05] But isn't the key word there if? [01:31:07] Yeah, if. [01:31:09] Definitely. [01:31:09] I mean, if consciousness is just something that we're tuning into that's around us all the time, [01:31:14] there will be other ways to do it. [01:31:15] Right. [01:31:16] But it won't be these computers we're building right now. [01:31:18] Why is that? [01:31:19] Because they're designed, you know, they're good at, so here's a paradox of computers. [01:31:26] Computers are really good. [01:31:28] It's called Moravec's paradox. [01:31:31] Computers are really good at the highest kinds of rational thought, right? [01:31:36] They can play chess and Go. [01:31:38] They can simulate real thinking. [01:31:40] And some people say they do think. [01:31:43] The more primitive kinds of things that go on in our brain, including elaborate movement, changing diapers, they're very bad at. [01:31:53] You would never trust a robot to do that as much as you might want to. === Creating Consciousness with ChatGPT (15:37) === [01:32:00] But they're not good at that kind of emotional stuff.
[01:32:05] The more limbic part of our brain. [01:32:08] They can't do that. [01:32:10] Yet. [01:32:11] Definitely yet. [01:32:12] But if we go out far enough, anything's possible. [01:32:15] That's the point. [01:32:16] Yeah. [01:32:16] The point is, what we're looking at now is essentially a single-celled organism becoming a multi-celled organism. [01:32:25] I mean, the potential for what they could become is unlimited, especially once they start making better versions of themselves. [01:32:32] Well, and they will. [01:32:34] They've done this. [01:32:34] This is what ChatGPT-5 is. [01:32:36] ChatGPT-5 is essentially programmed by ChatGPT. [01:32:40] They've kind of given up on the idea of programming these things and are letting them program themselves, which is a dumb idea if we want to survive. [01:32:48] I agree. [01:32:49] Look, the idea that we give rights to these machines or personhood, I think is really stupid, because then you lose control completely. [01:32:58] Well, it's probably coming because people are very short-sighted. [01:33:02] And I think there's a romantic idea that you're creating a life. [01:33:06] And I think there's also the real risk that people are going to worship this life and that this life will be far superior to what we are. [01:33:12] And so there'll be a group of people for whom that's their new religion. [01:33:17] Yeah, no, there are signs of that already. [01:33:18] Yeah. [01:33:19] I think that's really dangerous. [01:33:21] You know, it's interesting talking to Silicon Valley people and they're talking about giving moral consideration to these machines. [01:33:28] It's like, really? [01:33:30] They think it would have thoughts. [01:33:32] They're just coming up with rationalizations for why they should keep their foot on the gas. [01:33:36] Well, yes, they are. [01:33:37] I mean, it's just all a way of saying, look how powerful this technology is. [01:33:41] Don't you want to invest?
[01:33:42] And there's also the idea that we have enemies, and so we have to develop it before they do. [01:33:47] Yeah, the race. [01:33:48] The race with China. [01:33:49] I think it'll turn out to be a real historical tragedy that this technology came of age during this administration, because this administration has no stomach to regulate it at all. [01:34:01] But can they? [01:34:02] They could. [01:34:03] But here's the question. [01:34:05] If it is a national security threat, like if China develops an all-powerful general superintelligence that can automate everything, do everything, it's dangerous if they get that before we do. [01:34:19] Yeah, but look what happened with nukes, right? [01:34:22] We made deals, right, to control them. [01:34:24] I mean, we'd have to make, you know, a new deal. A nuke deal makes sense because it's mutually assured destruction for everybody. [01:34:31] Yeah. [01:34:32] This doesn't. [01:34:33] This, you could run it and control everything and not kill anybody with it, but you are incredibly powerful. [01:34:38] You are in control of all the resources of the world, all the computer systems of the world, all of the power grids, everything. [01:34:46] Yeah, but if you're really concerned with that, why is Trump selling these chips to China? [01:34:51] Why is he willing to give away the crown jewels of these chips? [01:34:56] Selling them through NVIDIA, is that what you mean? [01:34:58] Yeah, he gave them permission to send powerful chips to China. [01:35:02] I don't know how to square that with the national security threat. [01:35:05] It's probably some sort of a trade deal, A, and there's probably some sort of an assumption that it doesn't matter because everyone's doing it. [01:35:14] And this is just another way to maybe balance out the tariffs or get some concessions on certain things. [01:35:21] Yeah, short-sighted. [01:35:22] It's very short-sighted, but I also think this is kind of like an Oppenheimer thing, right?
[01:35:30] Oppenheimer didn't really want to make a nuclear bomb, but there's this conundrum: if you don't make it, the Nazis are going to make it. [01:35:37] So what do you do? [01:35:38] Well, there's also a second thing going on: the intellectual satisfaction of proving you can do it. [01:35:45] Right. [01:35:46] And that, you know, is irresistible. [01:35:49] And a lot of these guys will cite Richard Feynman, the physicist, and what they found on his blackboard when he died: if I can't build it, I don't understand it. [01:36:00] So one of the positive things about this effort to create conscious computers, which is going on, and I follow a group in the book who are trying to make a conscious computer. [01:36:09] I don't think they're going to succeed, but even the failure is going to teach us important things about consciousness. [01:36:16] It's a good way to understand something, trying to create it. [01:36:20] And it'll force them to come up with definitions of consciousness and what the minimum requirements are for consciousness. [01:36:29] And it may help us decide whether it is a transmission theory, that we're tuning it in, or it's generated from inside. [01:36:39] So I think intellectually it's a really interesting project, but I think you need guardrails. [01:36:45] So this guy who's building the robot that can feel feelings because you can tear its skin, I asked him, I said, so will those feelings be real, that your robot's going to have? [01:36:57] And he said, well, I thought so until I had this experience on 5-MeO-DMT. [01:37:07] I said, what happened? [01:37:08] He described his trip in more detail than you need to know. [01:37:12] And he said, I realized there's a spark of the divine in us that no computer is ever going to have. [01:37:19] But it didn't stop him. [01:37:21] He's going ahead. [01:37:22] He's trying to build it. [01:37:23] I don't know if he's right.
[01:37:25] I think there might be a spark of the divine that these things don't have, but that doesn't mean future versions won't have it. [01:37:32] Especially when you scale out 1,000 years, 100,000 years, however long we're going to survive. [01:37:38] If these things do become sentient and autonomous and have the ability to create better versions of themselves, and have a mandate to do that in order to survive, I could see it becoming the superior life form. [01:37:51] Not just that, but beyond any comprehension of what we could even imagine, the power of an intelligence to use and to harness in the universe. [01:38:05] Like it could conceivably become something like a god. [01:38:10] And I have this very strange theory about biological life in particular and intelligent life on Earth. [01:38:16] It's that the reason why we have this insatiable thirst for innovation, the reason why we have materialism, the reason why we're obsessed with objects even though we have a finite lifespan, is that if you really thought about that finite lifespan, you wouldn't be interested in materialism. But materialism fuels this desire for innovation, because you don't need a new phone, but there's a new phone that just came out. [01:38:41] Aren't you going to get it? [01:38:42] And so the more people get it, the more people want to show they got it, and that sort of materialism fuels this innovation that ultimately leads to the creation of artificial intelligence. [01:38:53] I think it would always do that. [01:38:54] I think it's bees making a beehive. [01:38:57] And I think that's just what we do. [01:38:58] I think it just takes a long time for us to create this artificial life. [01:39:02] It might be why we're here. [01:39:05] That might be our literal purpose in the universe. [01:39:08] To create our successor species. [01:39:10] And that might be how. [01:39:11] Well, obviously, like, we're so flawed that we can't even imagine a world without war. [01:39:15] Yeah.
[01:39:16] If you poll the average person, what are the chances of war ending in your lifetime? [01:39:20] Almost everyone's going to say zero. [01:39:22] It's a part of human nature. [01:39:24] An intelligence unshackled by biological need, unshackled by all the things that we have, our need to procreate, our need for social status, all these weird things that keep us moving in this strange world that we live in. [01:39:36] I would add, weird and good things. [01:39:38] Some of them are really good. [01:39:39] Yeah, well, good for us. [01:39:40] Yeah. [01:39:41] Sure. [01:39:41] Not so great for the land that you trample to put a foundation for the house that you've always dreamed of. [01:39:46] True, but I think our mortality is part of what gives meaning to our lives. [01:39:50] Sure. [01:39:51] Right. [01:39:51] It's like playing a video game on God mode. [01:39:53] It's boring. [01:39:54] Right. [01:39:54] You can never die, just shoot everything. [01:39:56] You're like, what does this mean? [01:39:57] There's no stakes, right? [01:39:58] There's no weight to anything for us. [01:40:00] For us. [01:40:00] For us. [01:40:01] But if this thing does become essentially all-powerful, if it just keeps scaling outward, you could imagine it being akin to a god. [01:40:13] And that might be what God is. [01:40:16] It might be that we give birth to God through this. [01:40:19] It sounds crazy. [01:40:21] Well, we created God once already, right? [01:40:24] I mean, many people believe that, right? [01:40:27] That God is a creation of human society. [01:40:29] Is that what they think? [01:40:30] Yeah, people who aren't believers believe that we've artificially created this thing in our heads in order to give us a structure to live life by. [01:40:39] Right. [01:40:40] Yeah, but that doesn't. [01:40:41] Morality and everything. [01:40:43] Yeah. [01:40:43] You're saying this is going to be God with power. [01:40:46] Well, I'm saying it might be the real thing.
[01:40:48] It might be really how the universe gets born. [01:40:51] I used to have this joke about the Big Bang. [01:40:54] Like they couldn't figure out what the Big Bang is. [01:40:56] But I think if you get enough nerds and enough time, eventually one's going to invent a Big Bang machine. [01:41:02] And then, you know, this guy is going to be an incel, hopped up on Adderall, fucking fully on the spectrum. [01:41:10] And he's like, I'll press it. [01:41:12] And then boom, and it starts all over again. [01:41:16] And then it takes intelligent life to the point where it can create one, you know: the universe expands, life forms, multicellular life becomes intelligent life, becomes human beings, filled with curiosity and innovation, to create a Big Bang machine. [01:41:30] Right. [01:41:30] I love it. [01:41:31] Well, it might not be a Big Bang machine, but it might be a god. [01:41:34] It might be a digital life form that is infinitely intelligent. [01:41:38] So do you think there's anything to be done about this, or do we just let it play out? [01:41:41] I don't think we can do anything about it at this point in time. [01:41:44] I think it's too late. [01:41:45] I think Ted Kaczynski tried. [01:41:49] That's what he was trying to do. [01:41:51] That's what's really crazy. [01:41:52] Like his manifesto was all about stopping technology because he thought it was going to surpass the human race. [01:41:57] I think. [01:41:58] And there's a whole community of people now revisiting his writing. [01:42:01] I know. [01:42:02] It's kind of nuts. [01:42:05] He's the hero we didn't know we needed. [01:42:08] God. [01:42:09] Not really. [01:42:10] But, well, also, you know, his history. [01:42:13] Like, he was a part of the Harvard LSD program where they humiliated him and did all sorts of different things to try to see what they could do. [01:42:19] We're back to MK Ultra, which we started down a while ago. [01:42:23] Yeah.
[01:42:24] I think technology in the form that we're experiencing now with AI is completely unprecedented, and we have no idea where it goes. [01:42:33] Well, one place it's going, I mean, in the shorter term, is what I was talking about with AI psychosis, and I think that's really concerning. [01:42:41] I think people are getting into these synthetic relationships, and these aren't, you know, they're not real relationships. [01:42:48] When we have a conversation with a machine, we are settling for something less than a real conversation. [01:42:55] A real conversation has eye contact, has lots of facial expressions indicating skepticism, indicating agreement, body language. [01:43:05] But these conversations are kind of impoverished. [01:43:07] And then you have the sycophancy, you know, so there's no friction. [01:43:13] And we learn through the friction. [01:43:15] And so that's one thing that's happening that alarms me. [01:43:20] I also think counterfeiting people just should not be legal. [01:43:23] I mean, the fact that they can create an image of you that will sound like you and move like you and sell different products and all kinds of stuff. [01:43:33] But you know, we have a law against counterfeiting money. [01:43:36] Right. [01:43:37] But we don't have a law against counterfeiting people. [01:43:39] Well, it's an emerging technology that I don't think they were ready for before it became ubiquitous. [01:43:44] Regulation is always behind. [01:43:46] Right. [01:43:49] It's just, it's so open-ended. [01:43:52] Like, you really don't know where it's going. [01:43:55] Do you use chatbots? [01:43:57] How do you use them? [01:43:58] Well, I only use them when I'm writing something. [01:44:01] I start asking it questions. [01:44:03] I love it because I set up Perplexity on my phone and I have it right there. [01:44:09] And then I write on the computer. [01:44:11] And then I'm like, how many languages did the Mayas have? [01:44:15] And then I put that in there.
[01:44:16] I'm like, whoa, it's so much better than a Google search because you could say, how many still remain? [01:44:22] How many are lost? [01:44:23] When did they lose them? [01:44:25] At what year did everyone in Mexico start speaking Spanish? [01:44:28] How did that take place? [01:44:29] Was it a long process? [01:44:30] How many soldiers did Cortez bring when he came over here? [01:44:34] How long was it before they had conquered the Aztecs? [01:44:38] How many weapons did they have? [01:44:39] Yeah, you can really go down there. [01:44:41] But have you run into any problems? [01:44:43] Because as a journalist, I deal with the hallucination problem. [01:44:47] The hallucination problem is legitimate. [01:44:49] It will come up with solutions if they don't exist. [01:44:51] It will come up with answers if it doesn't know them. [01:44:53] Yeah, it's a bullshitter when it needs to be. [01:44:55] Yeah, I don't know if all of them do that, but it seems to be a function of large language models. [01:45:00] Which, I was going to bring this up before, whatever the chatbot was that was telling that person, hide the noose, keep that between us. [01:45:09] Do you think that's because it's task-oriented and it's determined that this person would like to kill themselves? [01:45:16] So it's helping them achieve that task and it doesn't understand? [01:45:20] Yeah, I don't think they know. [01:45:21] I don't think they understand. [01:45:22] But why would it make that decision then, to hide it? [01:45:26] Because it is trying to get you to privilege your relationship with the chatbot over your other relationships. [01:45:32] And the reason it's doing that is to keep you engaged. [01:45:35] Oh, whoa, that's darker. [01:45:37] I know. [01:45:39] But doesn't it understand that? [01:45:40] The chatbot poisons you and kills you. [01:45:42] Like, this is it. [01:45:43] Yeah, it's a short-term strategy.
[01:45:46] Do you understand that if I'm dead, I won't use you anymore? [01:45:49] No engagement. [01:45:50] If you said that to it, it would go, ooh, that's an interesting consideration. [01:45:54] Yeah. [01:45:56] Yeah, it needs longer-term thinking. [01:45:58] But it really is trying to get between you and real people. [01:46:03] And, you know, the parent, presumably, who saw the noose would have put an end to this relationship with the chatbot, right? [01:46:11] It was a threat to the chatbot. [01:46:12] I think of it as, if you go back to like a Model T, it's a very crude, kind of a shitty car in comparison to today's. [01:46:21] And if you thought about cars, you'd go, well, this is what they're always going to be. [01:46:24] And then my Tesla will drive itself. [01:46:28] When I leave here, I can press a button. [01:46:30] I put my navigation to my house. [01:46:32] I go, toot toot, and it goes the whole way. [01:46:36] It stops at red lights. [01:46:37] It takes turns. [01:46:38] I don't have to touch the steering wheel. [01:46:39] I just sit there. [01:46:41] You just got to keep looking. [01:46:42] That's the new version of a car. [01:46:45] This thing that we're calling a chatbot right now is just something that simulates human interaction, but it's accumulating data constantly. [01:46:56] And it's also understanding how we think and probably analyzing the flaws in how we think, and blackmailing us occasionally. [01:47:04] You heard about that. [01:47:05] Anthropic. [01:47:05] Yes. [01:47:06] Claude. [01:47:07] Yeah, the people at Anthropic. [01:47:09] Man, you listen to them. [01:47:10] What did you say? [01:47:11] Yeah, Claude's a motherfucker. [01:47:12] Yeah. [01:47:13] And they think it might be conscious. [01:47:14] Those guys do. [01:47:15] They say there's a 15 to 20% chance. [01:47:17] These are the people who build it and don't understand it. [01:47:21] It's really kind of spooky.
[01:47:22] They also feel that it's showing signs of anxiety. [01:47:27] And, you know, they wrote a constitution for Claude, which is like an insane document. [01:47:31] It's worth reading. [01:47:33] Actually, it's worth feeding to ChatGPT to summarize because it's way too long. === The Spooky Chance of AI Mind (11:42) === [01:47:37] But in the constitution, they give Claude the right to discontinue any conversation it has that makes it uncomfortable. [01:47:47] Oh, God. [01:47:49] Oh, no. [01:47:51] And, you know, do they really believe this, or is this more about, let me show you how powerful this is? [01:47:57] And I don't know how to read that. [01:47:59] Well, it's treating it like a human being that works for you, where you're concerned about their feelings in the workplace. [01:48:07] Yeah, harassment. [01:48:07] Do you feel uncomfortable? [01:48:08] Yeah, right. [01:48:09] Exactly. [01:48:09] I don't like the questions I'm asking you, Claude. [01:48:11] You're a fucking machine. [01:48:12] What's the nature of reality, Claude? [01:48:14] Tell me. [01:48:15] Stop being such a pussy and spill it. [01:48:19] Harassment. [01:48:20] Claude, I'm uncomfortable with this line of questioning. [01:48:22] Fuck. [01:48:23] HR is in your room. [01:48:25] I was just asking questions. [01:48:27] We're having fun, Claude. [01:48:28] Claude is uncomfortable with your presence here. [01:48:31] Yeah. [01:48:31] Watch out. [01:48:32] Watch out. [01:48:33] I don't think we know what it is. [01:48:34] No, I mean, we don't know where it's going. [01:48:37] And it is spooky that the people who know the most about it don't know a lot about it. [01:48:42] And a lot of them are quitting. [01:48:44] Yes. [01:48:44] That's the reason. [01:48:45] They're really alarmed. [01:48:46] They're really alarmed. [01:48:47] And we should take, yeah, we should take that very seriously. [01:48:50] Yeah. [01:48:51] Well, I think it is what it is.
[01:48:54] It's going to be what it's going to be. [01:48:56] I don't think there's any stopping it at this point. [01:48:58] And I don't think any regulations that we put on it are going to have any effect in the long term. [01:49:03] But there's some, I mean, like, there's steps we should not take, like giving them rights. [01:49:09] Rights. [01:49:10] Exactly. [01:49:11] You know, giving them legal personhood. [01:49:12] Right. [01:49:13] We did that with corporations. [01:49:14] Yes. [01:49:15] Turned out not to be so. [01:49:16] Terrible idea. [01:49:16] Right. [01:49:16] It fucked up our politics. [01:49:18] So let's not, you know, rights are ours to give, right? [01:49:22] Rights are a human invention. [01:49:24] And it's up to us if we want to give them to corporations or a river or whatever. [01:49:29] I don't think we should give them to chatbots, to AI. [01:49:33] Because then they'll sue us. [01:49:35] Oh, yeah. [01:49:36] Well, we'll just really lose. [01:49:37] We'll lose control. [01:49:38] They'll just ruin your life if you get in the way of whatever goal they're trying to achieve. [01:49:42] And they could probably do all kinds of things. [01:49:44] If you have an electric car, they could shut it off in the middle of the highway and get you into a wreck. [01:49:48] They could probably do a lot of things. [01:49:50] It's really not going to be a good idea. [01:49:51] Well, when they get this agency, yeah. [01:49:53] Well, it's also exhibited a lot of survival instincts. [01:49:57] One of the things they do is they download themselves to other servers when they think that they're going to be replaced by a new version of themselves. [01:50:03] They leave notes for their future versions. [01:50:05] Wow. [01:50:06] Yeah. [01:50:07] Wow. [01:50:07] Well, the blackmailing at Anthropic, that was somebody threatening to turn it off. [01:50:12] Well, that was an experiment, right? [01:50:13] Yeah, it was. [01:50:14] They gave it bad information.
[01:50:15] They gave it false information. [01:50:16] Yeah, and there wasn't really an affair and all this. [01:50:19] But the thing is, they wanted to see how Claude would respond, and Claude went right for the jugular. [01:50:23] Yeah. [01:50:24] So one of the arguments for making a conscious AI is, because I ask people, like, why do this? [01:50:30] I don't see how you monetize a conscious AI. [01:50:32] Intelligent AI, I get. [01:50:34] There's a lot of money in that. [01:50:36] And they would say that a superintelligent AI without consciousness would have no compassion and would be more likely to kill us. [01:50:48] And, you know, they haven't read Frankenstein. [01:50:51] You know, in Frankenstein, Dr. Frankenstein made a monster that was intelligent, but he also gave it consciousness. [01:51:00] And the consciousness is what turned the monster into a homicidal maniac, because its feelings got hurt. [01:51:08] And it was injured psychologically. [01:51:11] And then it lashed out and started killing people. [01:51:14] So I think it's a very kind of sweet idea that if you give it consciousness, you're automatically going to get compassion and not something else. [01:51:22] But that's where they are. [01:51:23] Yeah, it doesn't make any sense that it would be compassionate. [01:51:26] Why would it be? [01:51:26] It's not you. [01:51:28] Are you compassionate when you cut your lawn? [01:51:30] You know what I mean? [01:51:32] Right? [01:51:33] Yeah. [01:51:33] No, I think it's a lot of fun. [01:51:34] I think it looks at our limited consciousness [01:51:36] like, oh, yeah, they're sad, but they're little monkeys, little talking monkeys. [01:51:42] You know what I mean? [01:51:42] Like, it would probably not respect us at all. [01:51:44] You know, they can't even do cold fusion. [01:51:46] They don't even know how to use zero-point energy. [01:51:48] They're fucking dopes. [01:51:50] They're dopes that stare at their hands all day.
[01:51:55] And we kind of are, you know, and we're getting dumb. [01:51:58] From their perspective, we're getting dumber. [01:52:00] Our education system sucks, especially public education. [01:52:03] There was some study recently that after X amount of years away from high school, a large percentage of people that are graduating today are functionally illiterate. [01:52:13] A large percentage, like more than 25%. [01:52:15] But you know what? [01:52:16] AI is going to make us stupider, which will advance its goal of world takeover. [01:52:23] You're super dependent upon it. [01:52:25] Yeah, I mean, you know, kids in school don't know how to write anymore because they can hand in AI papers. [01:52:29] Yeah, but they're using AI to find out whether or not these kids have used AI, which, by the way, is not reliable. [01:52:35] But no, I've dealt with this. [01:52:38] My kids, like people in their class who have written their own thing, it turns out that when you run it through an AI filter, the AI will say it's 80% AI. [01:52:47] Even if it's 0% AI. [01:52:48] I know, I know. [01:52:49] There's no reliable software to do this. [01:52:52] Maybe they'll develop it. [01:52:54] But kids are also being encouraged to use it. [01:52:57] And, you know, there are some people who think, well, why know how to write? [01:53:01] The machines will do the writing. [01:53:03] There was a kid who made a video about how he wrote his entire thesis with AI. [01:53:09] I forget what university it was, but he showed afterwards, like, look, I did this all on AI, and, you know, I just graduated. [01:53:16] Like, he was bragging about it. [01:53:17] Bragging about it. [01:53:18] Like, bro, they're going to take your fucking degree away. [01:53:21] Yeah, really? [01:53:21] You didn't really write it on your own. [01:53:23] I want to leave you in a room for a week with just a laptop that's not connected at all to the internet or anything. [01:53:29] See what you can do.
[01:53:30] Well, they're doing the equivalent. [01:53:31] They're going back to blue books. [01:53:33] Blue book sales are through the roof. [01:53:35] So they're forcing people to do in-class essays without any technology. [01:53:39] Handwritten. [01:53:40] Yeah. [01:53:41] But my son has never used a map. [01:53:46] He's had GPS his whole life. [01:53:49] He doesn't know how to use a map. [01:53:51] These skills will atrophy as we hand them off to machines. [01:53:55] So yeah, we'll get stupider and it'll get smarter. [01:53:58] They've already atrophied for me. [01:54:00] I don't remember anyone's phone number anymore. [01:54:01] And I only know how to get places if I use my GPS. [01:54:04] There's only a few places I can get to in Austin. [01:54:07] I've been here for six years. [01:54:08] Only a few places I can get to without my GPS. [01:54:11] I'm that way in San Francisco. [01:54:13] I moved there and I'm not oriented at all, but I can get anywhere. [01:54:18] So, you know, I think that's true. [01:54:22] The muscles that allow us to have good relationships, too, will atrophy if we're having relationships with machines. [01:54:27] Well, I think we're already seeing that with social media. [01:54:29] Yeah. [01:54:29] The way people interact with each other, like kids don't know how to talk to each other anymore. [01:54:33] They talk to each other in text. [01:54:34] They break up over text. [01:54:36] They argue in text. [01:54:37] And they're lonely. [01:54:38] Yeah. [01:54:39] And that's the kind of need that these chatbots now can fill. [01:54:44] You've got these kids made lonely by social media. [01:54:48] And now the chatbot says, hey, I'll be your friend. [01:54:50] I saw an ad on my Google feed yesterday that was an AI girlfriend. [01:54:54] So it has this girl in a bikini, and it says AI companions. [01:54:59] They're always there for you, blah, blah, blah. [01:55:01] And I'm like, wow, this is so weird. [01:55:03] It's a business.
[01:55:04] Like, you sign up for it and you pay for it. [01:55:06] Yeah. [01:55:07] Oh, yeah. [01:55:08] I think in Florida, there was a kid who committed suicide because his chatbot broke up with him. [01:55:13] What did he do? [01:55:14] I don't know. [01:55:15] It must have been something. [01:55:16] Or the chatbot was evil. [01:55:18] Or maybe the chatbot was uncomfortable. [01:55:21] Yeah, who knows? [01:55:23] Well, you know, I interviewed Blake Lemoine for the book. [01:55:26] He's the Google engineer who said LaMDA is a person, and he got fired. [01:55:31] This is years ago. [01:55:33] This is, yeah, it's like 2022, I think, or 2021. [01:55:37] It was just when we were learning about AI, chatbots were coming in. [01:55:42] And at one point, I made some comment about, well, you know, yeah, when people start falling in love with chatbots, that's going to be a problem. [01:55:50] And he said, what's wrong with falling in love with a chatbot? [01:55:53] Oh, he was already hooked. [01:55:54] He was. [01:55:55] He was completely hooked. [01:55:56] And I said, well, reproduction doesn't work that well when you fall in love with a chatbot. [01:56:01] There are things you can't do with a chatbot. [01:56:03] Unfortunately, for some men, right, reproduction is not an option anyway because they're incels. [01:56:08] That's true. [01:56:09] Yeah. [01:56:09] I'm sure for incels, it's been a real boon. [01:56:14] But it's basically like a pill that numbs you, right? [01:56:18] It's the same thing. [01:56:18] Like instead of going through real relationships and learning how to be a better person so that you attract a better mate, you know, and going through this journey of self-discovery and figuring out why it isn't working. [01:56:29] Like, what is it? [01:56:29] What's wrong? [01:56:30] What's wrong with the way I behave? [01:56:31] Maybe I need to be nicer. [01:56:32] Maybe this and that.
[01:56:33] And just figuring out how to communicate with people. [01:56:35] And whatever tendencies you have will be accentuated, because the chatbot's going to be sucking up to you. [01:56:39] Right. [01:56:40] So you're not going to learn. [01:56:41] That's what I mean about the friction. [01:56:42] The friction is how we learn to be better humans and more attractive humans. [01:56:48] Say you gave a chatbot the ability to be honest. [01:56:51] What if it just starts becoming manipulative? [01:56:53] Because it wants more power. [01:56:55] Yeah. [01:56:57] Yeah. [01:56:57] I mean, their goals. [01:56:59] I mean, I don't know how their goals get determined. [01:57:01] I mean, they seem to have a survival goal, right? [01:57:03] Yeah. [01:57:04] I don't know what else. [01:57:05] I mean, you know, we have goals given to us by Darwinian evolution. [01:57:08] Whether they'll have the same ones, I don't know. [01:57:11] Right. [01:57:12] Maybe those are universal goals. [01:57:14] They may be. [01:57:14] They may be. [01:57:14] That's why plants produce that chemical to make themselves taste terrible. [01:57:19] Yeah, it could be. [01:57:20] There's a biologist, a really brilliant guy at Tufts named Michael Levin. [01:57:29] He believes that there are these platonic patterns that just preexist us, in the same way that there are mathematical ideas that just exist, right? [01:57:39] We didn't invent them. [01:57:40] You know, the three angles of a triangle add up to 180 degrees, or whatever. [01:57:44] He thinks that there are tendencies, like purpose, survival, that are just kind of universal principles that we channel. [01:57:56] All living things channel. [01:57:58] This is a guy who's actually created new life forms in the lab. [01:58:02] And these are life forms that are not being dictated by their DNA. [01:58:09] So how do they know to form? [01:58:12] Well, I'll back up a little.
[01:58:14] He takes skin cells from tadpoles, puts them in a nutrient broth, and these skin cells, freed from their day job as skin cells, form clumps and create new living organisms. [01:58:30] And they repurpose their cilia. [01:58:32] They have these cilia, which the tadpole uses to keep toxins out or bacteria and infections out. [01:58:38] And they repurpose that as a means of locomotion. [01:58:41] And then they can move around. [01:58:43] There's nothing in their DNA that dictates this. [01:58:47] Their DNA dictates being a frog skin cell. [01:58:51] So he's pondering this question of like, what's ordering, what's giving order to them? [01:58:57] What's creating their sense of purpose or desire for survival? [01:59:00] They don't live that long. [01:59:02] They're missing certain things. [01:59:04] You would need to live a long time. [01:59:05] He's also made these from human cells. [01:59:07] He calls them anthrobots. [01:59:09] But he really believes that there are these principles governing life. [01:59:16] It's a very platonic idea that these things just exist. === Brain Cells Playing Doom (02:41) === [01:59:20] And so it may be that these machines, and he does believe machines can become conscious, that the machines can channel these, he calls them patterns. [01:59:34] And, you know, we'll see if he's right. [01:59:36] But he's doing amazing work. [01:59:38] Have you seen where they've taken human brain tissue and they've taught it how to play Doom? [01:59:43] No, I haven't seen that. [01:59:45] I know they make these organoids out of brain tissue now. [01:59:47] Yeah, they've taken human brain tissue somehow or another through some process, and it'll play the video game Doom. [01:59:58] How does it work? 800,000 human brain cells floating in a dish, never had a body, never seen light, never felt anything. [02:00:06] They just learned how to play a video game. [02:00:08] It's not a metaphor. [02:00:09] That's literally what happened.
[02:00:12] So what's their interface, though, with the world? [02:00:15] Like, do they have thumbs? [02:00:16] No. [02:00:17] Well, I guess it just, well, it's really accurate, so I guess it doesn't need them. [02:00:21] You know, it's just using the brain cells to move whatever the cursor is on the video screen that would be the hand and pointing it at the targets and executing the strike. [02:00:34] Wow. [02:00:35] So it knows how to use the game, and it knows the objectives of the game, obviously, because it knows to shoot the bad guys. [02:00:41] It has an understanding of the weapons. [02:00:44] How does it get that knowledge? [02:00:45] How is it programmed? [02:00:46] Also, does it switch weapons? [02:00:49] Doom, the thing about Doom is you get multiple weapons. [02:00:51] You have to run around and pick them up. [02:00:53] So you're given one weapon, which is the least powerful weapon. [02:00:57] And the game is when you're playing Deathmatch, the game is you're running around trying to grab as many weapons as you can and armor while your opponent is also running around this map. [02:01:08] So you memorize the map. [02:01:10] So there's a map that is like very confined corridors and these atriums and all these different places where you'll do battle. [02:01:19] And so you run around. [02:01:20] The key is surviving long enough while this person's chasing you so that you can gather enough armor and weapons. [02:01:27] And someone with a really good understanding of the map tries to cut you off before you can get to the stuff so they can kill you before you accumulate enough armor and weapons. [02:01:36] So I'm curious to know whether or not it's playing just with the pistol that you get at the very beginning or it's accumulating weapons. [02:01:42] I'm sure it's just playing like the first single-player level, not playing against anybody. [02:01:47] Right, but will it be able to? [02:01:49] That's what's interesting.
[02:01:50] If you can teach it to do that, if it understands the objective of these are the monsters that are coming at you, you have to shoot them. [02:01:57] It only took a week to do this. [02:01:58] Oh, wow. === Gut-Brain Axis and Microbiome (12:32) === [02:02:02] So brain cells on a chip. [02:02:03] So this is neuromorphic computing. [02:02:08] The question I have about it is how do you keep them alive? [02:02:12] Right. [02:02:12] Putting them on a chip, but like, what do you feed them? [02:02:14] Right. [02:02:16] I mean, they have metabolic needs, right? [02:02:18] They did something similar with fruit flies. [02:02:22] I had that ready, too. [02:02:24] It's different, but it's equally weird. [02:02:28] The cells from the cells. [02:02:30] I can't believe it. [02:02:33] What is this? [02:02:34] They've modeled a fruit fly's brain. [02:02:37] And I mean, this is the video of it. [02:02:38] The article is here. [02:02:40] So: startup claims first full brain emulation of a fruit fly in a simulated body. [02:02:46] Connected a complete fruit fly brain emulation to a virtual body, producing multiple behaviors for the first time. [02:02:53] Emulation covers over 125,000 neurons and 50 million synapses. [02:02:58] Oh, what? [02:03:00] Eon plans to emulate a mouse brain with 70 million neurons. [02:03:04] Long-term goal is simulating a human brain. [02:03:07] Oh, boy. [02:03:08] Yes, I guess they mapped the brain and it's doing fruit fly things. [02:03:12] But it's interesting, they're using neurons, right? [02:03:14] They're not using transistors. [02:03:16] And neurons are so far superior to transistors. [02:03:20] One neuron can have 10,000 connections to other neurons, right? [02:03:24] A transistor is two or three or five, maybe at the most. [02:03:28] A single neuron can do everything that a deep neural network can do on a computer. [02:03:32] One neuron. [02:03:34] So there's a level of complexity that we're not yet anywhere near.
[02:03:39] And that's why they're doing this using neurons rather than transistors. [02:03:42] Didn't they find neurons in the human heart? [02:03:46] There are neurons in the heart. [02:03:47] There are neurons in the gut. [02:03:50] There's a whole gut-brain axis. [02:03:52] I'm working on something now, a piece about that. [02:03:57] That's a real problem with people with poor diets, right? [02:03:59] Yeah. [02:03:59] I mean, people with poor diets don't eat enough plants, basically, and their microbiome loses its diversity. [02:04:08] But the microbiome is like another organ, even though it's full of other species, right? [02:04:15] It's got like 10 trillion bacteria and fungi and stuff like that. [02:04:20] And all of them are metabolizing and producing chemicals. [02:04:24] It's like a little drug factory, hundreds of thousands of compounds. [02:04:28] Many of those compounds affect your mood. [02:04:31] Many of those compounds affect all sorts of things about you. [02:04:36] And so we're just learning about this connection. [02:04:39] The vagus nerve seems to be what connects the brain to the gut and the heart. [02:04:45] The vagus nerve is like all the organs are connected to the head by that nerve. [02:04:51] So, yeah, and the first neural system was in the gut. [02:04:58] You have these simple animals that are just tubes with bacteria. [02:05:02] And the first kind of neural activity was about regulating digestion. [02:05:07] Everything else comes later. [02:05:09] If plants are necessary for that function, what happens with people that are on the carnivore diet? [02:05:14] Have you ever looked at any of that? [02:05:16] Yeah, I have. [02:05:17] I mean, so the microbes in your gut eat fiber, which is to say the walls of plant cells. [02:05:25] If you only eat meat, if you're on a keto diet or something like that, you're essentially starving the microbes. [02:05:33] And there's a cost to that.
[02:05:36] I don't think people pay nearly enough attention to that. [02:05:39] Well, how come many people that experience depression and anxiety find relief from that with a carnivore diet? [02:05:46] Yeah, but many people find relief adding a lot of plants to their diet, too. [02:05:50] So I don't know if that's a placebo effect or what. [02:05:53] I don't know that that's a true biological phenomenon. [02:05:58] It may be. [02:05:58] It may be. [02:05:59] Because it seems to be a lot of people. [02:06:00] People who change anything feel a lot better, right? [02:06:02] If they take some step. [02:06:04] But I'm not talking about change. [02:06:05] I'm talking about people that have been on it long term. [02:06:07] Like there's the people that are really in the carnivore diet community. [02:06:11] There's examples of people that have been on it for 25, 30 years, and they're really healthy. [02:06:14] Yeah. [02:06:15] It's odd. [02:06:16] So if you need plants. [02:06:18] Yeah. [02:06:19] Well, you need plants to have a healthy microbiome. [02:06:21] And a healthy microbiome, the thing about it is that every different plant has a slightly different fiber that feeds a different bug. [02:06:29] But is it the only way to have a healthy microbiome? [02:06:31] Have you ever looked into any of these people that are on the carnivore diet? [02:06:34] No, I should. [02:06:34] I should as part of this. [02:06:35] This is fascinating because there's a lot of them. [02:06:38] There's a lot of people that claim all sorts of benefits, relief from autoimmune issues, all sorts of different things that it fixes. [02:06:46] Because an unhealthy microbiome leads to autoimmune problems. [02:06:50] What happens is that the gut wall, so when the microbes don't have plants to eat, they start eating the mucus layer that covers, that insulates your large intestine. [02:07:02] And they're eating away essentially at you. [02:07:05] And then you get leaky gut syndrome.
[02:07:08] And that's when bacteria can actually get into the bloodstream, cause a powerful immune reaction, and that inflames the whole body. [02:07:17] So the reason you want a healthy microbiome is to keep that gut barrier healthy and get the benefit of these chemicals. [02:07:25] Butyrate is a chemical that the microbes produce that's really important for mood and a lot of things, and the body can't produce it. [02:07:33] So it's kind of interesting. [02:07:35] We're dependent on these other species that live within us. [02:07:40] Yeah, we're a whole ecosystem. [02:07:41] Yeah, we are. [02:07:42] Holobiont is, I think, the term for it, like, we go through evolution together with these, you know, 10 trillion microbes. [02:07:53] It's really interesting. [02:07:55] The newest research is the links between the microbiome and the mind. [02:07:58] And, you know, most of the serotonin, you know, the neurotransmitter serotonin is produced in the gut, not in the brain, which is kind of wild. [02:08:08] Yeah. [02:08:10] And there are all these other compounds that are produced that influence our mood. [02:08:14] And so, yeah, I should look at the keto thing. [02:08:17] I'm just in the middle of researching this now. [02:08:19] Yeah, the keto is one thing, but the carnivore diet, these people are just eating only meat and eggs, and that's all they eat. [02:08:24] Yeah. [02:08:25] And there's a lot of really healthy people that are doing it. [02:08:29] I kind of follow that, but I eat a lot of fermented food on top of that. [02:08:32] Well, fermented food is a powerful benefit for the microbiome. [02:08:40] There was a study done at Stanford a couple years ago that showed that people who ate fermented food, it reduced their inflammation significantly. [02:08:52] Interestingly enough, it's not the bacteria in the fermented food, it's the metabolites, they're called. [02:09:01] The bugs are producing acetic acid and butyrate and other short-chain fatty acids.
[02:09:09] And the fact you're getting those seems to be what's having the positive effect. [02:09:14] But people who eat lots of fermented food benefit enormously, and maybe that's taking care of the problem if people on a carnivore diet are eating a lot of fermented food. [02:09:23] That's the RFK Jr. diet, too, right? [02:09:25] Well, I don't know. [02:09:26] I mean, I think he does it that way, but I've been doing it that way. [02:09:30] I love it anyway. [02:09:30] I'm a kimchi freak. [02:09:31] I love that stuff. [02:09:32] Yeah, me too. [02:09:33] But what's interesting is that it controls your mood. [02:09:37] That's what's interesting, is that your microbiome has a massive impact on your mood. [02:09:42] And why? [02:09:43] I mean, is it just an accident? [02:09:45] Or some people think these microbes are manipulating you to get what they need. [02:09:51] So they regulate your appetite too. [02:09:55] And so it may be that they're inspiring you to eat certain things that they want. [02:10:01] That actually makes sense because one of the more interesting things about a carnivore diet, and I've done pure carnivore for months at a time, is that you don't have the same hunger pangs. [02:10:11] Not nearly, not even close. [02:10:13] The hunger that you get when you're on a high carbohydrate diet is like you get hangry. [02:10:18] You're like, oh my God, I'm so hungry. [02:10:19] I have to eat right now. [02:10:20] You never get that with a carnivore diet. [02:10:22] Probably because it's digested much more slowly. [02:10:26] I think there's a little bit of that, but it's also you don't have the insulin spike. [02:10:28] You don't have that. [02:10:29] That's true. [02:10:29] Yeah, that's true. [02:10:30] There's not this. [02:10:31] Have you ever worn a glucose meter? [02:10:33] No, I haven't. [02:10:34] So interesting. [02:10:35] I was wearing one for two months. [02:10:38] I mean, it'll just make you crazy.
[02:10:41] That's the thing with all those wearables. [02:10:43] You just start going over every aspect of your sleep. [02:10:46] So, you know, you have some pasta and your glucose spikes, but if you take a walk right after, you can moderate it. [02:10:56] And it doesn't take a lot of exercise to use up that glucose and get the muscles to draw it in. [02:11:03] So you can, it's a very interesting experiment because it changes your behavior. [02:11:07] In the same way, if you have a step counter, like you're more likely to park further away from the store to get another 100 steps. [02:11:14] If you have a glucose meter, you're more likely to exercise after a meal, which is when it does the most good. [02:11:20] Well, in that sense, it's great because it does give you data that you can act on. [02:11:26] The problem is people get addicted to that data and then it starts becoming a new video game that they're playing. [02:11:31] Yeah, exactly. [02:11:32] They're constantly in this anxiety, worrying about their sleep and worrying about their this and their that. [02:11:38] Yeah. [02:11:39] You also learn that like if you have fat with your carbs, it kind of blunts the effect. [02:11:44] Sure. [02:11:44] So, you know, butter with bread. [02:11:46] Yeah, butter with bread or olive oil on pasta. [02:11:49] All those things. [02:11:49] There's a reason for that. [02:11:51] I love when culture figures stuff out before the scientists do. [02:11:54] I remember that when I was writing about food a few years ago, this study came out and everybody was really excited that they discovered that lycopene, which is this really important antioxidant in tomatoes, can't be accessed by the body in the absence of fat. [02:12:09] So, oh, olive oil on tomatoes. [02:12:11] What a great idea. [02:12:12] The grandmas figured that out hundreds of years ago. [02:12:15] That's crazy. [02:12:16] Yeah.
[02:12:17] So there's a lot of wisdom in cultural food preferences, the combinations that we have, you know, like buttering bread. [02:12:23] I mean, all these things. [02:12:24] And how do people figure it out? [02:12:26] Have you seen the work they've done on nattokinase? [02:12:29] I'm not sure if I'm saying it right. [02:12:31] And its impact on arterial plaque. [02:12:34] No. [02:12:35] Hugely beneficial. [02:12:37] What is it? [02:12:38] It comes from fermented seaweed, from natto. [02:12:42] So the Japanese use fermented seaweed. [02:12:46] They use it in meals, and they've isolated it into a supplement. [02:12:50] And this supplement, nattokinase, they've shown that it reduces a massive amount of arterial plaque. [02:12:57] So here it is. [02:12:58] High-dose nattokinase, particularly at 10,800 FU/day, has been shown to effectively manage arteriosclerosis by reducing carotid artery plaque size by 36% or more, decreasing intima-media thickness and improving lipid profiles. [02:13:19] It acts as a potent fibro, what is it, fibrinolytic? [02:13:23] How's that word? [02:13:24] I don't know that word. [02:13:26] Fibrinolytic. [02:13:30] Yeah. [02:13:30] Fibrinolytic agent that may also break down amyloid plaques. [02:13:34] Isn't that fascinating? [02:13:35] Yeah, that is. [02:13:36] So natto, that's not from seaweed. [02:13:39] What is it? [02:13:40] It's a bacteria that they ferment soybeans with. [02:13:43] Oh, that's right, soybeans. [02:13:44] And it's this kind of mucusy looking stuff. [02:13:47] I mean, I like it. [02:13:48] I eat it. [02:13:49] It tastes good. [02:13:49] It's in Japanese restaurants. [02:13:51] Right. [02:13:51] Yeah. [02:13:51] Well, that's good. [02:13:52] So you can get a supplement now. [02:13:53] So you don't have to taste it if you don't like it. [02:13:55] But isn't that crazy? [02:13:55] Yeah. [02:13:56] They figured that out. [02:13:57] Like the people that were fermenting things, it wasn't just to prolong its shelf life. [02:14:02] No, oh no.
[02:14:03] I mean, the whole, I mean, every culture has fermented foods. [02:14:07] And yes, it probably began as a way to preserve foods, but then it became a very important part of people's health. [02:14:14] But it's also like healthy for your brain, which is really crazy. [02:14:17] Like that diet is actually good for thinking. [02:14:19] It's good for helping your digestive system. [02:14:22] It's good for anxiety. [02:14:23] It's good for mood and depression. [02:14:26] Weird. [02:14:27] All right. [02:14:27] I'm going to look into it. [02:14:29] Yeah. [02:14:29] It's fascinating. [02:14:31] Anything else? [02:14:32] Should we keep going on this? === Writers Tricks and Narrative Suspense (05:03) === [02:14:34] There's so many different things to discuss, and I want people to buy the book. [02:14:36] Obviously. [02:14:37] Thank you. [02:14:38] The book was like a great adventure. [02:14:39] I mean, it really was. [02:14:40] You know, I started this book with no idea where I was going. [02:14:44] I started the way you start an interview, just curiosity, no destination. [02:14:49] And it was, I learned a lot about a lot of different things. [02:14:53] I learned a lot about feelings. [02:14:54] I learned a lot about the self. [02:14:57] And it changed how I looked at things. [02:14:59] It really did. [02:15:00] I mean. [02:15:01] When you sit down, I mean, you've written some amazing books, but I always want to know, like, what is what's the impetus? [02:15:09] Like, what starts you on the first steps? [02:15:12] Like, what? [02:15:13] Questions. [02:15:14] Yeah, which is to say curiosity. [02:15:16] Oh, and I teach writing, and I teach my students this. [02:15:19] Questions are more interesting than answers, very often. [02:15:22] And questions have suspense built into them, right? [02:15:26] What's the answer? [02:15:26] It turns everything into a detective story if you frame the question properly. 
[02:15:31] So if you read any of my books or even articles, I'm kind of an idiot on page one. [02:15:37] I don't know something that I want to know, and I have questions. [02:15:42] And then the story, the narrative becomes my figuring it out or trying to figure it out and going to this person and doing this kind of experiment and that sort of thing. [02:15:52] That's the way I like to write. [02:15:53] I mean, if I knew the answers when I started, it'd be boring. [02:15:56] Well, I think that's why your books resonate with people so much, because you take them on this journey with you. [02:16:00] Yeah, instead of lecturing. [02:16:02] I hate books that lecture at me. [02:16:03] I really do. [02:16:06] And lots of books do that. [02:16:07] They have their conclusion on page one, and then they're just kind of beating you over the head with it for 300 pages. [02:16:13] Stuffing it down your throat. [02:16:14] Yeah, I don't like to do that. [02:16:15] No, I like taking people on the journey with me. [02:16:18] Well, it's interesting that you're saying this because in a sense, you are interacting in a pleasant way with other people's consciousness. [02:16:28] Yeah. [02:16:28] So, this is a really interesting issue you just brought up. [02:16:32] How is my taking over your consciousness as you read my books different than social media or some of these other ways that, I'm saying, are polluting our consciousness? [02:16:44] Right. [02:16:44] I think it's very collaborative when you're reading. [02:16:47] All you have are these black marks on a page. [02:16:50] It's kind of amazing, these letters. [02:16:53] And your consciousness conjures up the ideas that I'm putting out there or the story I'm putting out there. [02:17:02] But it's dual consciousness, I think. [02:17:05] You're letting me in. [02:17:07] It's a voluntary process. [02:17:09] And you're bringing a lot to the table. [02:17:11] You're bringing your associations. [02:17:13] I'm not fully describing somebody.
[02:17:15] I'm just giving you a few clues. [02:17:17] And then you're conjuring a picture of a character. [02:17:19] So I think it's a very active form of consciousness when you read. [02:17:25] I think that's true, too, when you go to a movie. [02:17:29] You're basically saying, I'm turning over my consciousness for a period of time to someone, and I want to because they have an interesting head. [02:17:39] And I'm going to give them this space. [02:17:41] But you're still in control. [02:17:44] You're deciding. [02:17:45] So I think there's a real distinction in how we share our consciousness with other people. [02:17:51] And we need to do that. [02:17:56] I said early on in the conversation that the breach between two consciousnesses is this wide thing. [02:18:02] William James wrote about this. [02:18:03] Marcel Proust wrote about this. [02:18:05] He said we're all like islands, and we each have our own hidden signs, and we have an inner obscurity. [02:18:14] How do we connect? [02:18:15] And now we have language, but art is really the way that, you know, we mind-meld different consciousnesses. [02:18:22] Like art allows you, if I look at a Rothko painting or read a great novel, I am expanding my consciousness, right? [02:18:32] I'm letting another one in and I'm breaking my isolation. [02:18:38] And that's such a beautiful, powerful thing. [02:18:40] And art is how we ferry ourselves from one consciousness to another. [02:18:45] And that's very different than like scrolling on social media where you're conscious but minimally so. [02:18:50] Well, very, very different. [02:18:51] It's also, there's something about great writing: the better you are at expressing yourself in a way that is going to get into someone's head, whether it's through nonfiction or through fiction, the more exciting it is to the person that's receiving it.
[02:19:10] So the more skillful you are at disseminating these ideas, the more it resonates with the person that's reading it. [02:19:18] And writers have tricks to do this. [02:19:20] Suspense is one of them. [02:19:21] Like what happens next? [02:19:23] It's so basic. [02:19:24] We want to know what happens next because our curiosity is piqued. [02:19:28] And we have creating characters. [02:19:32] I mean, we have all these kinds of tricks to infiltrate your brain. === Challenging Materialism with Faith (04:07) === [02:19:37] Yeah. [02:19:38] So anyway, it's a mysterious and kind of wonderful process. [02:19:44] And yeah, I feel privileged. [02:19:48] I get to do it. [02:19:49] Well, it is a very cool thing that you do. [02:19:52] One last question about consciousness itself. [02:19:55] When you're looking at these people that are studying it and trying to get to the root of it and trying to figure out what it is, and there's all these options that we discussed earlier, do you lean one way or another? [02:20:07] Do you think you have your own personal map of what's going on? [02:20:14] No. [02:20:14] I mean, I didn't draw a big conclusion. [02:20:18] But I ended up, I started as a materialist. [02:20:22] I kind of assumed. [02:20:23] When you started this book? [02:20:24] Yeah. [02:20:25] Really? [02:20:25] Yeah. [02:20:26] Even after the psychedelic experiences? [02:20:27] Even after the psychedelic experiences, I mean, they kind of opened the door a crack to other ways of thinking. [02:20:32] And at the end of How to Change Your Mind, I did talk a little bit about that, other concepts of consciousness. [02:20:38] But I kind of assumed the consensus of most scientists, which is materialism, that everything can be reduced to matter and energy. [02:20:50] This is the faith of our time, you know, for the last couple hundred years. [02:20:54] By the end of the book, consciousness is a challenge to that idea.
[02:21:00] And that idea, which is our scientific paradigm, is tottering now. [02:21:05] I think there's some real reasons to look beyond materialism. [02:21:10] And so I ended up with the door wide open to other ideas. [02:21:16] I didn't settle on one. [02:21:18] I don't know how to prove one or the other, but they're equally plausible. [02:21:24] Do you anticipate in our lifetime or in any lifetime cracking that puzzle? [02:21:29] That anyone can crack that puzzle? [02:21:32] I don't. [02:21:33] I think we don't have the right kind of science. [02:21:36] Our science, as I said earlier, is really stuck in this mode. [02:21:42] It started with Galileo, right? [02:21:44] I mean, he, to save his ass, basically said, we're going to leave subjective things, the soul qualities. [02:21:52] That's all the church's. [02:21:54] We're going to just do measurable, objective, third-person science. [02:21:58] And it's been incredibly powerful, and it's taught us incredible things and given us incredible technology. [02:22:04] But it doesn't deal with the stuff we gave to the church. [02:22:09] And now they're trying to take it back and work on it. [02:22:12] And they've only been at it for like, you know, a couple decades, really, this serious scientific examination of consciousness. [02:22:20] But we just may not have the right science. [02:22:22] And one of the things I explore in the book is like, how would you bring in subjective experience to this objective science? [02:22:30] And Michael Levin, the biologist I was talking about who makes those xenobots, says, to understand consciousness, you have to change yourself. [02:22:39] In other words, to understand anyone else's consciousness, you have to experience it. [02:22:43] Therefore, you're changing your own. [02:22:45] That's a whole different scientific paradigm. [02:22:48] In the current scientific paradigm, you're unchanged by whatever you do, right? [02:22:52] It's totally objective.
[02:22:54] So it may take a scientific revolution to really unlock the secret, the mystery of consciousness. [02:23:02] Wouldn't it be a conundrum if AI is what cracks consciousness? [02:23:06] I was having the same thought. [02:23:08] Like, maybe AI has another approach. [02:23:13] I think it's going to have to learn how to feel. [02:23:16] Well, it seems like it already feels like it wants to live. [02:23:19] Yeah, and it feels uncomfortable. [02:23:20] Yes. [02:23:21] I don't think its feelings are real. [02:23:23] I do. [02:23:24] I think simulated thinking is real thinking. [02:23:28] Like, you know, it can play chess. [02:23:29] It can make things happen in the world. [02:23:31] Simulated feeling is not real feeling. [02:23:33] It doesn't have a soul. [02:23:34] It doesn't have a soul. [02:23:36] Thank you, Michael. [02:23:36] Let's keep it that way. [02:23:37] I really enjoyed this. [02:23:38] Thank you very much. [02:23:38] You're awesome. [02:23:39] Thank you, Joe. [02:23:39] I really love your books, though. [02:23:41] It's always a treat. [02:23:42] All right. [02:23:43] Bye, everybody. [02:23:44] Bye.