Danny Jones Podcast - #6 - Nathan Crock Aired: 2018-11-23 Duration: 01:40:36 === High School Days and 911 Calls (12:46) === [00:00:00] All right. [00:00:01] Three, two, one. [00:00:02] What's up, Nathan Crock? [00:00:03] How are you doing, man? [00:00:04] I'm doing quite well. [00:00:04] Thank you for having me. [00:00:05] It's been forever since I've seen you. [00:00:06] We went to high school together. [00:00:08] Graduated in 2006. [00:00:09] Graduated in 2006. [00:00:11] Like I was telling you before we started recording, for some reason, you and I were in a bunch of the same classes. [00:00:17] I have no idea why, because I was way dumber than you. [00:00:22] And the one thing that sticks out to me was that you were showing me some diagrams of how you planned on putting your brain into a cyborg one day. [00:00:31] Yeah. [00:00:31] And I never forgot about that. [00:00:33] Yeah. [00:00:33] Well, unfortunately, I did, as do most of my nonsensical ideas. [00:00:37] I dream big and do outlandish things. [00:00:40] You sure this isn't the cyborg version of you here right now? [00:00:43] And you're just not telling us? [00:00:44] Of course not. [00:00:45] Okay. [00:00:46] That's completely normal. [00:00:48] All right. [00:00:48] I believe you. [00:00:49] That's incredible, man. [00:00:50] So I had to look you up, and I'm glad that you agreed to come on this. [00:00:53] It's super cool. [00:00:54] Yeah, my pleasure. [00:00:55] I apologize in advance. [00:00:56] I've been a little sick. [00:00:57] I can't really imagine what I sound like in the microphone. [00:00:59] Probably like one of those. [00:01:01] Glorious. [00:01:02] Like a kidnapper filter, right? [00:01:05] Bring the briefcase and the money to the bridge by midnight. [00:01:08] Yeah, we could put some effects on it, some auto tune maybe or something. [00:01:12] So, what is it? [00:01:13] So, for those people out there who may not know, and I actually don't know how to say it myself, but what exactly do you do? [00:01:20] So, I do two things. [00:01:22] I do research in machine learning, which is colloquially referred to as artificial intelligence. [00:01:27] We can talk maybe about the distinction between those two things. [00:01:29] And the second is I work, I direct [00:01:32] the NuSci Labs, which is an R&D division of a machine learning company in Tallahassee. [00:01:36] So, we have a lot of government contracts, contracts with big businesses, agencies, and we offer some very common machine learning services. [00:01:45] And our job in the labs division is to ensure that all the services that we are offering are cutting edge. [00:01:50] We look up the latest research, check out the papers, go into the code, and make sure that we're able to keep things as impressive and useful as possible to our customers. [00:01:58] What kind of things do you do there? [00:02:00] Like to sum it down, what are you guys researching on? [00:02:05] And doing there. [00:02:05] Yeah, so for example, one of the contracts that we work on is with 911 emergency phone calls. [00:02:11] So that's a very big industry, as I'm sure we can all imagine, and they have thousands of calls a day. [00:02:18] Monitoring these calls is very difficult. [00:02:20] Often they're able to monitor only 1% or less of the phone calls that actually come through for quality assurance. [00:02:27] So what we're doing is we take all of the calls and we train a speech to text engine.
[00:02:32] So we take the audio from the calls and we transcribe it to text. [00:02:36] Okay. [00:02:37] Then once we have all the text, we use natural language processing techniques to analyze what's going on: emotion, sentiment, keywords, names, addresses. [00:02:46] And we can use this to filter and flag potentially dangerous calls. [00:02:51] Maybe somebody calls and says, Hey, I feel like I'm going to kill myself. [00:02:54] And the telecommunicator says, Oh, okay, well, go ahead. [00:02:58] Something outlandish. [00:02:59] And a call like that should be monitored, and that telecommunicator should be flagged. [00:03:02] So the services we offer allow them to monitor their calls more efficiently and also improve training. [00:03:08] For example, we can identify target words. [00:03:10] Maybe a telecommunicator uses a certain word. [00:03:12] And there's always a spike in maybe anger, fear, or something in the caller. [00:03:17] And we can let the agency know that these certain words are probably not best to be used for calls of this type. [00:03:24] And so that's just one of the projects that we work on. [00:03:26] Wow. [00:03:27] And what is the purpose of this company? [00:03:30] I mean, like, what's the goal? [00:03:31] Like, what are you trying to develop? [00:03:33] Like, I understand, like, you guys are analyzing the calls, the behavior of the people that are calling in. [00:03:38] But, like, what is the long term goal? [00:03:41] Sure. [00:03:42] I mean, the people that come together, we would call ourselves social entrepreneurs, I suppose. [00:03:47] We all have some sort of outlandish idealistic notions of good and trying to impose that on everybody else, whether they agree with us or not, as I think everybody actually does, which we will probably get into when we start talking about what is intelligence and learning. [00:03:59] And so we try to take on projects that are consistent with our visions and ambitions of trying to improve things for the economy, to do good with machine learning. [00:04:08] Okay. [00:04:10] So essentially, with 911 calls down the road, you think that maybe humans won't be doing that or? [00:04:19] We've entertained those ideas. [00:04:20] That's a bit further down the road. [00:04:22] We are working on what we call a language model in the field of machine learning, where we take all these transcripts and we have a certain algorithm or engine which looks at all the text and starts learning the intrinsic properties of what 911 domain phone calls are like. [00:04:38] And we can actually synthesize new speech with language models like that. [00:04:42] And idealistically, one would dream that down the road, we could potentially replace telecommunicators with agents. [00:04:49] That's obviously a very saucy topic. [00:04:51] But the short term goal is perhaps a training simulator to help the telecommunicators get used to what certain callers might sound like. [00:04:57] Maybe we can simulate homicide calls. [00:05:02] You wouldn't believe the kind of things that we listen to in these 911 calls. [00:05:05] Sometimes we'll just get some old lady. [00:05:06] She's like, Well, my back hurts, so I need an ambulance here immediately. [00:05:13] All kinds. [00:05:13] And the telecommunicator chuckles. [00:05:15] I'm sure you get it all. [00:05:17] Yeah, so the short term goal would be perhaps a training simulator. [00:05:23] And then long term idealistic goals would be [00:05:26] aiding telecommunicators and making things a bit more automated.
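To make the pipeline he's describing concrete, here is a minimal sketch of the flagging step: transcribed calls go in, keyword hits and a flag come out. The watchlist, the sample calls, and the flag_transcript helper are all hypothetical illustrations, not NuSci's actual system.

```python
# Toy sketch of flagging 911 call transcripts for review.
# The keywords and transcripts below are illustrative only.

FLAG_KEYWORDS = {"kill", "gun", "suicide", "overdose"}  # hypothetical watchlist

def flag_transcript(transcript: str) -> dict:
    """Return which watchlist keywords appear in a transcribed call."""
    words = {word.strip(".,!?").lower() for word in transcript.split()}
    hits = sorted(words & FLAG_KEYWORDS)
    return {"flagged": bool(hits), "keywords": hits}

calls = [
    "Well, my back hurts, so I need an ambulance here immediately.",
    "Hey, I feel like I'm going to kill myself.",
]
for call in calls:
    print(flag_transcript(call))
```

In a real system this keyword pass would sit alongside sentiment and entity extraction over the speech-to-text output, but the shape of the workflow, transcribe, analyze, flag for a human reviewer, is the same.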
[00:05:29] Right. [00:05:30] So, I mean, just going back to high school, you've obviously been extremely passionate about this for a long time. [00:05:37] What was it that initially got you interested in all this stuff? [00:05:39] What made you, what turned you into this person you are right now? [00:05:42] Yeah, that's kind of a long and tumultuous road, but I can probably summarize it a bit more poetically than it probably was, as we often do in media. [00:05:52] You know, so again, being like this social entrepreneur, I've always had this, again, unreasonable desire to just do good. [00:05:58] For no other reason, and it just makes me happy. [00:06:00] And so when we're young and idealistic, everybody's like, oh, world peace, what a great novel idea that is. [00:06:04] And then as you grow up, it usually just becomes a laughing point because such a thing is often nonsensical and imaginary. [00:06:11] But when I was young, I had convinced myself that the reason we don't obtain peace is because individuals don't understand one another. [00:06:19] And when I say understand, I don't mean in the sense of like you listen to me and you hear me, I mean in the sense of like you feel what I feel. [00:06:26] Such in the sense that if I killed somebody, to truly understand me, you would have to replay the feelings, decisions, and the actions that I did, such that you would have literally killed the person, too. [00:06:39] There's no deeper understanding than that. [00:06:40] So, my ambitions were I'm going to go into neuroscience, come up with a way to map memories, and then replay them in other people's minds so that they can learn other people's feelings. [00:06:48] Again, these big, idealistic, outlandish dreams. [00:06:51] I love that shit. [00:06:52] So, that was where I started. [00:06:54] And what time is that? [00:06:55] How old are you when you start thinking like that? [00:06:57] That was probably middle school, elementary school? [00:06:59] No, that was probably [00:07:01] high school, I mean, shortly after high school, perhaps. [00:07:03] Yeah, you know, in high school, we don't really know who we are, what we want. [00:07:06] We're trying to figure things out in life, and your friends teach you what life is, your parents teach you what life is. [00:07:10] Yeah, so that's a messy minefield of disasters. [00:07:13] Yeah, you used to be a boxer too, right? [00:07:16] Yeah, another one of those young. [00:07:18] You're not doing that anymore? [00:07:19] I do it from time to time. [00:07:20] It's good to stay healthy, it's good to feel like you can protect yourself and your loved ones. [00:07:24] Yeah, I used to coach FSU's boxing team, really. [00:07:27] Oh, that's cool. [00:07:27] Participate, that's awesome, man. [00:07:29] Coach a little bit when I can. [00:07:30] Hell yeah, you still do that anymore, or just dabble in it? [00:07:33] I mean, we used to own a boxing gym. [00:07:35] So when we coached FSU's team, we had the Renegade Boxing Gym for a long time. [00:07:38] In fact, I'm participating in the Savage Race on Sunday with one of the old trainers from Renegade Boxing. [00:07:45] And our team name for the Savage Race is, you know, Renegade Boxing Team. [00:07:48] So that's awesome. [00:07:49] A little bit of nostalgia there. [00:07:50] That'll be fun, dude. [00:07:51] Yeah, no doubt. [00:07:52] I want to try one of those one day. [00:07:54] We should try one. [00:07:55] Yeah, I would die.
[00:07:56] You think we can make it? [00:07:58] I think you guys got it. [00:07:59] The Warrior Dash is the way to go, right? [00:08:00] So check this out. [00:08:01] The Warrior Dash, it's short, muddy, messy. [00:08:04] There's barbed wire and fire, right? [00:08:06] You run through this thing, you have a lot of fun, you laugh, you get messy, you get dirty. [00:08:09] Everybody's usually pretty fit when they go out there in the first place. [00:08:12] And then you cross the finish line, and everybody screams in. [00:08:15] As soon as you cross the finish line, they hand you like a big chalice of beer. [00:08:18] You get like a big Viking helmet and a huge chicken leg, and everybody else is around you just like drenched and usually pretty fit and often pretty sexy. [00:08:25] And you just got adrenaline and beer and meat, and it's a pretty great experience. [00:08:29] That sounds great. [00:08:30] What's that called? [00:08:31] That one's called The Warrior Dash. [00:08:32] The Warrior Dash. [00:08:33] I'm checking that out. [00:08:34] Yeah, we definitely have to check that out. [00:08:35] That sounds good. [00:08:36] I'm definitely checking it out. [00:08:37] That whole getting into the boxing thing started when I was, this was actually a middle school thing, right? [00:08:41] So one of my big inspirations was Leonardo da Vinci, and he is known as the Renaissance man, right? [00:08:47] And so I'm like, that's me. [00:08:48] I want to be the modern day Renaissance man. [00:08:50] That's going to be my big goal. [00:08:51] And so I tried to do as much as I could. [00:08:53] You know, I did combat, I took, you know, martial arts, boxing, dance, played music, studied languages, science, as much as I could to experience as much life as possible. [00:09:01] Only to realize that in this overpopulated world of specialization, the notion of a modern day Renaissance man just isn't really achievable any longer. [00:09:09] The amount of time necessary to become a true master of any one discipline is a lifetime. [00:09:14] For sure. [00:09:14] Yeah, so that was a long time. [00:09:15] So you tried to do a bunch of different things. [00:09:16] You tried to do it all. [00:09:17] Yeah. [00:09:18] But it's good to experience a little bit of everything. [00:09:20] Yeah, and so I might have [00:09:23] psychologically convinced myself that instead of being a master of one thing, I'm a master of being mediocre at many things. [00:09:30] That's all right. [00:09:30] Yeah, right. [00:09:31] I like it. [00:09:32] It has master in the title. [00:09:33] It's good enough. [00:09:36] Oh my God. [00:09:37] Speaking of stretching yourself, I just tried not eating for three days, and that was like the worst experience of my life. [00:09:43] Good for you, man. [00:09:43] I didn't, I should have done it with other people who were doing the same thing, but I was around people that were just gorging every day, and it was basically like torturing myself. [00:09:52] But I was trying, it was like a healthy thing. [00:09:54] I was trying to be healthy, you know what I mean? [00:09:55] Trying to clean out my. [00:09:56] My body, my cells, and yeah, and it gives you a little sympathy to people doing Lent and Ramadan, etc. [00:10:02] Yeah, right, for sure. [00:10:05] He was gonna go 10 days. [00:10:06] I was like, I'll give you about four, maybe five tops. [00:10:09] I made it about three, I made it 72 hours. [00:10:11] Have you ever tried that?
[00:10:12] Honestly, I have, and not more than a few days ago, I was telling my girlfriend that I was gonna try and go seven days because I just have eaten too much crap, and then there's the eggnog, and then there's the beer, and then I get gassy, and then I'm just miserable, and I just want to like cleanse myself. [00:10:25] And so, I've psychologically convinced myself that fasting is somehow the way to do that, right. [00:10:29] I don't really know what that is. [00:10:30] Yeah, it's like the rage right now. [00:10:31] Everyone's talking about it. [00:10:31] It's like the new thing to talk about and do. [00:10:34] So, you have done it before? [00:10:35] I have, yeah. [00:10:35] How many days? [00:10:37] Well, I did it back when I was in college. [00:10:39] One of my friends was doing the Lent thing for Catholicism, and I just did it with him for a week. [00:10:44] One week? [00:10:44] Seven days. [00:10:45] How would you do it? [00:10:46] What do you do all? [00:10:47] You just drink lots of water. [00:10:48] That's it. [00:10:50] But you can't be around other people that are eating, right? [00:10:53] You can if you have discipline. [00:10:55] I did not have enough of it, so I often just avoided people when they were eating, going out, or something like that. [00:11:00] Yeah, it was nice. [00:11:01] I mean, it kind of changes your. [00:11:02] The first few days, it's like, oh, I got this. [00:11:04] This is great. [00:11:05] You know, people who like to endure the suffering a little bit or they like to endure the hardships, you know, turmoil. [00:11:10] And then that turmoil starts to screw with your psychology a little bit. [00:11:14] Everybody likes to think that they can endure hardships, but the reality is that it actually becomes like an intellectual war inside your mind. [00:11:21] It's not as easy as just, I'm not going to do it, because your mind is like convincing you that it's okay to do it. [00:11:26] Well, it's unhealthy for me to not do it. [00:11:27] I need to eat or else I'm going to kill myself and I don't want to kill myself. [00:11:29] This is just a game, right? [00:11:31] You play these games with yourself and it becomes very difficult. [00:11:33] I learned a lot when I did it. [00:11:34] The hard thing for me was thinking about like having this many days left. [00:11:38] Like, I'm like, I'm only one tenth of the way through, I'm only, you know, two thirds of the way through. [00:11:44] I'm not even close to halfway through, and I gotta go five more days. [00:11:48] Like, that made it impossible for me. [00:11:50] But if I could have just thought about it, like, I'm just gonna go today without eating. [00:11:54] Wake up the next day, a brand new day, I'm just gonna go today without eating. [00:11:56] Like, if I could have done that every day, if I could have somehow tricked myself, I think that would have been way easier. [00:12:00] Did you nail it? [00:12:01] There's a book by Stephen King called The Long Walk. [00:12:05] It's a great book, basically just about enduring hardship. [00:12:09] And in that book, and from a friend of mine, I learned the very valuable lesson when you're like running marathons, half marathons, training, that, and this just translates to everything in life, you just say, just put one foot in front of the other, right? [00:12:20] That's all I got to do. [00:12:21] I just got to put my foot in front of the other, all right? [00:12:23] I'm here now.
[00:12:24] I just got to put one foot in front of the other, and everything just becomes so easy. [00:12:26] Then you turn around, you wake up, you're 40, and you know, you've built the Great Wall of China. [00:12:30] It's amazing, man. [00:12:31] It's amazing, man. [00:12:32] It really is. It's just scary to think about, like, having to take a thousand steps and try to, like, look at the top of the mountain. [00:12:38] I'm gonna have to get there. [00:12:40] Holy shit, I'm gonna die. [00:12:41] Instead, like one foot in front of the other, it's a lot easier. [00:12:44] It makes it a lot easier for sure. === Self-Driving Cars and AI Hurdles (15:22) === [00:12:46] So, what's up with uh, this AI? [00:12:48] Tell me a little bit about some stuff I don't know. [00:12:50] AI, what is it? World's coming to an end, man. [00:12:52] What is super digital intelligence? [00:12:56] Yeah, so a lot of buzzwords float around. [00:12:58] They often get misinterpreted by a lot of people. [00:13:01] Super intelligence, digital intelligence is super intelligence in a digital medium. [00:13:05] But in general, super intelligence is just any form of intelligence, whatever that means, which we'll probably need to define in a moment, that surpasses humans, oftentimes in vast regard. [00:13:18] So, you know, let's actually take a moment and talk about what is intelligence, right? [00:13:24] Just curiously, like, what do you guys, how would you guys define intelligence if somebody asked you? [00:13:28] I would say it's just trial and error of failing or not failing, and then kind of adding those up to knowing which path you're going to take next, right? [00:13:39] Say, like, if a brand new baby crawls off the couch and hits its face on the ground the first time, chances are the next time it's on the edge of the couch, it's not going to keep crawling and fall off the couch, right? [00:13:50] So that's just like a level of knowledge, maybe, or that's one of the key words that is used in the machine learning definition. [00:13:57] But yeah, you're sort of revolving around the same principle. [00:14:00] There is no academic 100% definition, or rather, there are many. [00:14:04] There's no accepted definition. [00:14:06] But a common theme among all of them is there's two pieces to intelligence. [00:14:11] One is a fundamental requirement to acquire information: you have to somehow just ingest information. [00:14:17] And then two is you need to have some functional ability to act on that information and interact with the world that your intelligence is embedded in, be it digital or biological. [00:14:28] So, one way that I like to sort of interpret that is by saying the mind itself is embodied and then the body is embedded. [00:14:37] So, let's just take the notion of the brain for a moment. [00:14:39] Outside the brain, there's this complete universe of information. [00:14:43] We have absolutely no idea what the vastness of that information is because our brain, all of our senses, are just making tiny, infinitesimal approximations to this vast amount of information that's out there. [00:14:54] As an analogy, the electromagnetic spectrum, you know, radio waves, X-rays, gamma rays, visible light, ultraviolet, right? [00:15:01] All this whole spectrum. [00:15:03] We can only see an absolutely teensy weensy little sliver of that. [00:15:07] And there's so much more information out there that's just literally not going into our brain. [00:15:12] All right.
[00:15:12] So, what we say is that the little bit of information that does come into the brain, though, we have to internalize it, create an internal model of that information, and then be able to interact and act on that knowledge in an intelligent way. [00:15:26] And how do people determine, or how do people choose? [00:15:32] Or do they not choose what gets input into them? [00:15:35] What pieces of information are input into them? [00:15:37] Is that just like how they're raised and people they're around? [00:15:40] It's just right now you're getting all kinds of information. [00:15:43] The light that's hitting you, the sounds that are coming into your ears, the feel of the table. [00:15:46] I mean, every sensory medium is just ingesting information all day long, 100% of the time. [00:15:51] You have no choice. [00:15:52] Okay. [00:15:52] Yeah. [00:15:52] Well, I mean, obviously, you have an alleged perceived choice. [00:15:56] We'll get into this whole free will thing probably eventually, but allegedly, you move yourself to another environment, you change the stimulus that's presented to your system. [00:16:04] Wow. [00:16:04] Yeah. [00:16:05] So I've always thought to myself, I never really talked about it, but like, for instance, conversations like this that get recorded and uploaded to the internet. [00:16:13] Hundreds, millions of hours every day. [00:16:15] Is that not just fucking training some crazy AI that's gonna be able to download everything, every bit of knowledge that humans have ever known in like two seconds? [00:16:24] Yeah, that's a fun thing to think about. [00:16:25] I mean, there's a couple of hurdles along the way. [00:16:27] The first and the most staggering is the need and the requirement to aggregate that information, right? [00:16:34] So, I mean, as you know, you get all these files, you put them on computers, you record from this microphone through the wire, you have it here locally. [00:16:42] Hundreds, thousands of hours of video. [00:16:44] Well, for one entity, or intelligent system, to be able to ingest that knowledge, all of that information must somehow be available to it in the very precise medium that it needs for it to be ingested. [00:16:58] And that's a very, very difficult hurdle to overcome. [00:17:01] Right. [00:17:04] Wow. [00:17:04] Yeah. [00:17:05] So, I mean, for example, you might have heard about a lot of the facial recognition, image recognition, et cetera. [00:17:10] Yeah. [00:17:11] For the longest time, these image recognition systems [00:17:15] could only operate on images of a very specific resolution. [00:17:19] They just didn't have an ability to generalize to different sized images. [00:17:23] So now you say, oh, well, there's all these pictures out there on the internet. [00:17:26] Yeah, but only those that are exactly whatever, 127 by 324 pixels, can this one system ingest. [00:17:33] And so over time, we learned about this handicap and we generalized to different techniques that allow us to examine different resolution images. [00:17:40] But basically, transducing information that's out there, be it digital or physics or whatever it is, into some system and then aggregating it is a very, very complicated process, let alone then learning on that information and being able to interact on it. [00:17:55] Just ingesting is a huge problem that we still are having a very hard time grappling with. [00:18:00] Wow.
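One general technique that got the field past the fixed-resolution handicap he describes, not necessarily the one his lab used, is adaptive pooling: whatever spatial size comes out of the convolutions gets collapsed to a fixed shape before the classifier. A minimal PyTorch sketch, with arbitrary layer sizes:

```python
import torch
import torch.nn as nn

# A tiny classifier that accepts images of any resolution: adaptive average
# pooling squeezes whatever spatial size the convolution produces down to 1x1.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),   # (N, 16, H, W) -> (N, 16, 1, 1) for any H, W
    nn.Flatten(),
    nn.Linear(16, 10),         # 10 arbitrary output classes
)

for h, w in [(127, 324), (64, 64), (300, 200)]:
    x = torch.randn(1, 3, h, w)
    print((h, w), "->", model(x).shape)  # always torch.Size([1, 10])
```

The older systems he mentions baked the input resolution into a fully connected layer, so any other size simply couldn't be ingested; pooling like this removes that constraint.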
[00:18:01] And do you, in general, do you have more of an optimistic view of the future of this artificial intelligence or do you have more of a concerned view? [00:18:09] Kind of like, say, Elon Musk's outlook. [00:18:12] He's very pessimistic about it, very cautious about it. [00:18:15] And then there's people like Mark Zuckerberg who are like, why would you be scared of it? [00:18:19] This is great. [00:18:20] This is helping us. [00:18:21] Yeah. [00:18:21] There's nothing to be afraid of. [00:18:23] Yeah, there's an underlying theme in a lot of the way that I approach all these problems, and that is to truly understand something, be neither for nor against it, and then the truth will be laid bare before you. [00:18:37] So I do the best I can to not really have overwhelmingly strong opinions about things because that just limits the amount of information that I'll be able to ingest and to understand. [00:18:46] You know, there's some really outlandish examples that can sort of demonstrate this. [00:18:51] You know, let's say that somebody's walking down the sidewalk and they take a little knife and they stab an old lady. [00:18:59] And then, what is that? [00:19:01] What happened? [00:19:02] Is it good? [00:19:02] Is it bad? [00:19:03] Why? [00:19:04] People like to analyze these things and try and throw labels on it. [00:19:06] But the reality is if you zoom out enough, it is neither good nor bad. [00:19:11] It merely is. [00:19:11] It just happened. [00:19:13] She's going to, without medical attention, probably fall over and die. [00:19:16] These are just things. [00:19:17] But let's zoom out a moment and say, oh, well, what are some ways to interpret it? [00:19:21] Well, if I didn't know anything and I just saw it happen from the other sidewalk, I'd be like, this is horrible. [00:19:26] How could this person do this? [00:19:27] You took the life of an old lady. [00:19:29] But little did I know, there was information unavailable to me that she had bombs on her chest and was about to blow up the whole city, and some guy just saved hundreds and thousands of lives. [00:19:37] Okay. [00:19:37] Right. [00:19:37] So, you see how the ability to interpret this information is just intrinsically subjective. [00:19:43] And that becomes very uncomfortable for a lot of people. [00:19:45] So, when I think about the questions about is it good or is it bad, I think it's both and it's neither. [00:19:50] Depending on certain assumptions that people have about what is essentially right or wrong, you can make an objective framework and then start arguing, well, this is good or this is bad. [00:19:59] But if you change those assumptions, now you have a different objective framework and now you can argue whether it's good or bad there. [00:20:04] So, I try to be flexible to entertain different people's opinions about what is right or wrong and then we can really explore the possibilities. [00:20:10] Right. [00:20:11] I mean, I don't really have it. [00:20:12] I just kind of like, I don't have an opinion myself. [00:20:14] I don't really know enough about it, but I assume like you know substantially more than the average person does about the future of this stuff. [00:20:21] Sure. [00:20:22] That's a good question. [00:20:23] I mean, you don't have any kind of emotional reaction to it? [00:20:26] I mean, obviously. [00:20:27] Of course, I have emotions. [00:20:28] I'm not a cyborg. [00:20:29] I'm just making sure. [00:20:30] I don't know. [00:20:31] I might have to ask your girlfriend to confirm that.
[00:20:35] We're going to create a machine that scans your face and figures out if you are a cyborg or not. [00:20:40] But Forrest hasn't developed that yet, Forrest. [00:20:44] But. [00:20:45] I mean, no feelings towards what could happen, what may not happen. [00:20:52] You're not weighing towards one or the other. [00:20:56] What are your thoughts on that? [00:20:57] It's a great question. [00:20:58] And I'll just be entirely honest. [00:21:02] I feel all of them, right? [00:21:03] And I do the best I can as a scientist to sort of take in the feeling that I feel, analyze why I feel it, look at the repercussions of that feeling. [00:21:11] Then I take another feeling, look at these, and analyze it and try and get a whole picture. [00:21:15] You know, for example, I'm excited. [00:21:17] AI is an amazing phenomenon. [00:21:19] I mean, how it helps us understand ourselves at such a deeper level is very difficult to communicate to people who haven't really been immersed in it for a long time. [00:21:27] You know, as Neil deGrasse Tyson says, like, scientifically literate people just see the world differently. [00:21:31] Well, I would take that a step further and say, machine learning or artificial intelligence literate people just see themselves and humans differently. [00:21:38] So that's an exciting thing. [00:21:39] You know, we might be able to share these things we discover. [00:21:42] But of course, everybody has many fears. [00:21:44] You know, sufficiently advanced technology is indistinguishable from magic, and that warrants fear and caution. [00:21:48] Yeah. [00:21:49] You know, we need to be careful and make sure that we have a lot of checks and balances to make sure that things don't get out of control. [00:21:54] And we can talk about those things. [00:21:55] Right. [00:21:55] They're very saucy topics. [00:21:56] Yeah, it is. [00:21:57] For sure. [00:21:57] I feel like one day it's just going to be just like the Terminator and fucking everything is going to just be robots and we're all gone. [00:22:05] Let's do a thought experiment. [00:22:06] Yeah. [00:22:06] So let's say that I am an ardent environmentalist and I love the earth more than anything. [00:22:13] Okay. [00:22:13] It's all I want. [00:22:14] So again, you know, we talk about these values. [00:22:16] My objective values are the preservation of nature and the earth. [00:22:20] Mm hmm. [00:22:21] I develop some amazing artificial intelligence system that's going to solve all the world's problems, global warming. [00:22:26] It's going to tell me how to fix it and tell me what to do. [00:22:28] Okay, and this thing starts learning. [00:22:29] It collects all this data, learns about people, and it decides, well, I think the earth would be a lot better if we just got rid of all the people. [00:22:35] Right. [00:22:35] So it goes and does that. [00:22:37] So the question is, is that good or is it bad? [00:22:39] Which it should, right? [00:22:40] That's what it was supposed to do. [00:22:42] And as the person who is interested in the environment, I should theoretically say, well, this is consistent with my values and it's achieving my goals and my ambitions. [00:22:50] So that's the problem with right and wrong, it's only relative to your subjective value system. [00:22:56] And so people talk about, well, what if robots come and take over and kill all the planets? [00:23:00] Well, or all the people. [00:23:02] Well, provided that we are all interested in self preservation, that's an awful thing, you know, clearly. 
[00:23:07] I happen to agree. [00:23:08] I like living. [00:23:09] Living is good. [00:23:10] Dying is not as good. [00:23:11] So, you know, I'm on that bandwagon as well. [00:23:15] But it's just not as easy. [00:23:16] I just don't think it's quite as easy as is often advertised. [00:23:19] No, it's definitely very complex. [00:23:22] I mean, the same thing with self driving cars, right? [00:23:25] Like, how do you train? [00:23:27] That's a fun one. [00:23:27] Like, the car has to avoid hitting [00:23:31] something, getting into a car accident. [00:23:32] Like, I think Forrest was telling me that one of the new, I think it was Tesla or someone, they were saying that they were training the cars to avoid, primarily avoid hitting other cars. [00:23:45] Meaning that the car has to decide between hitting another car or hitting a pedestrian. [00:23:51] Yeah. [00:23:53] How do you do that? [00:23:54] How does a human do that? [00:23:55] Right. [00:23:57] And the answer is that there is no answer, once again. [00:23:59] And this is the thing that everybody hates, and this is what I love about it so much. [00:24:02] Because it makes us question these very important things that we all take for granted. [00:24:06] Humans have an amazing inclination to just be enamored with our own ideas and to just reaffirm our own convictions. [00:24:16] And something like this just forces us out of our comfort zone, makes us ask these really tough questions. [00:24:20] Yeah. [00:24:20] How do you give a machine, a robot, morals? [00:24:23] You can't do that, right? [00:24:25] I mean, you have to give them every possible angle. [00:24:31] It's a very difficult question. [00:24:32] I'm going to start off with an example [00:24:35] of perhaps why, and then talk about just a potential resolution, which is obviously not an answer. [00:24:40] There's no one answer, but I'm just going to give a stab at it. [00:24:42] So, for example, one of the biggest handicaps that we've been facing in the self driving car things is that the cars are good. [00:24:49] They're too good, in fact. [00:24:53] Many of the accidents that have been taking place with self driving cars are just slow speed rear end fender benders. [00:24:59] Why do you think that is? [00:25:02] Because the self driving cars don't break the law. [00:25:05] And so they don't speed. [00:25:06] While humans, on the other hand, do. [00:25:09] And so the self driving car is forced to make a decision. [00:25:11] I need to merge onto this interstate, but all the cars are flying by me at 75, and I'm not going to go faster than 65. [00:25:17] How do I do it safely? [00:25:18] And many times it can't. [00:25:20] And so there are these big issues that it's faced with: how do you deal with it, as soon as you tell the car, well, the only way to navigate that safely is to break the law? [00:25:30] Right. [00:25:31] Where does that stop? [00:25:34] Right. [00:25:34] Right. [00:25:35] So there it is. [00:25:35] You got to ask yourself. [00:25:38] To really understand when it's okay, you have to decide when it's okay to break the law. [00:25:43] So, an analogy that I like to give for this is something that, again, a lot of people aren't really a fan of, but laws are more like guidelines, right? [00:25:53] It's not something that we should follow for the sake of the law. [00:25:55] It's more like here's a description of the shared values and the shared fiction of the people that I have chosen to voluntarily live with.
[00:26:04] So, if I'm going to live with these people, I have a responsibility to respect the shared value system of these people. [00:26:09] That being said, it's just a suggestion. [00:26:12] I'm not going to stop at a stop sign if I'm being chased by zombies, right? [00:26:15] It's not like this hard and fast law that you just absolutely cannot break. [00:26:19] So, what you would want to say is how do we teach AI morals? [00:26:24] You give it the same value system as the average people in your society because that is what we've defined in general as what's right and wrong. [00:26:33] Right. [00:26:35] You can see the problems with that right away, right? [00:26:37] There's a lot of people who have some convictions about [00:26:40] certain myths or not myths. [00:26:42] As soon as a self driving car runs over an old lady, they're going to try to say, no more self driving cars. [00:26:49] We got to ban the self driving cars. [00:26:50] They're killing old ladies now, but a human would have done the same thing. [00:26:54] Potentially. [00:26:55] Or you look at the statistics and you find that humans are killing far more people than self driving cars are. [00:26:59] Right, right. [00:27:00] On average, like per the volume. [00:27:02] I mean, obviously, there's far more people driving than self driving cars. [00:27:05] So you don't want to look at that statistic. [00:27:06] You want to look at the percentage of cars that are actually hitting people. [00:27:10] And self driving cars. [00:27:12] So far, they have been phenomenal. [00:27:13] They're too good, is the problem. [00:27:15] So, we have to somehow teach them to be worse, to be better. [00:27:18] Teach them to break the law, right? [00:27:20] Just like we do. [00:27:21] Just like we do. [00:27:22] Teach them to be as broken as we are. [00:27:24] Wow, man. [00:27:26] That's crazy how far that stuff's come. [00:27:29] Indeed. [00:27:30] Do you have one of those cars? [00:27:31] No, I don't. [00:27:32] Not yet. [00:27:33] Once it gets a little more mainstream. [00:27:34] You want one, though? [00:27:37] Do I want one? [00:27:38] I don't know. [00:27:38] I like motorcycles. [00:27:39] Oh, yeah, me too. [00:27:40] I want to get a motorcycle. [00:27:41] Yeah. [00:27:41] I've been saving up for a motorcycle. [00:27:43] What do you got? [00:27:44] I have a 2015 Yamaha FZ07. [00:27:46] Oh, okay. [00:27:47] I'm looking like Carly. [00:27:48] One of those fast bikes. [00:27:49] Yeah, it's got like one of those yellow candied rims. [00:27:53] Oh, yeah. [00:27:53] I don't know why I bought it. [00:27:54] It's really not my style. [00:27:55] I'm oftentimes more conservative, but maybe I was having a midlife crisis. [00:27:58] I don't know what I'm doing. [00:27:59] Pull up to the campus on your highway. [00:28:01] I like it. [00:28:02] Ring, ring, ring. [00:28:03] Popping wheelies. [00:28:03] Popping a wheelie through the parking lot. [00:28:05] I can see it. [00:28:06] Yeah. [00:28:06] Professor Crock is here. [00:28:08] Bitches.
[00:28:27] Yeah, I pulled out there, you know, because everybody from school is like at that intersection or at Circle K or at the corner. [00:28:33] So that's where life happens. [00:28:35] If you want to be noticed, you got to be there. [00:28:36] Right there. [00:28:37] So I was like, hey, I'm going to be noticed. [00:28:38] And so I rolled down my window and started blaring. [00:28:40] What was it? [00:28:41] Give me fuel, give me fire, give me that. [00:28:43] Metallica. [00:28:44] Instead of doing some donuts in the middle of the intersection, just like such a hooplehead. [00:28:48] I don't know what I was thinking then. [00:28:49] But I was adrenaline, you're young, living life. [00:28:52] I got a reckless driving ticket. [00:28:53] You did from that? [00:28:54] Yeah, that didn't work out so well. [00:28:56] I got what I deserved, of course. [00:28:57] But it was still a good memory. [00:28:58] Honestly, I'd probably still do it again. [00:29:00] That's what I mean. [00:29:01] You've got to break the law sometimes and live life a little bit. [00:29:03] You've got to. [00:29:03] A self-driver car would have never did that. [00:29:05] No, hell no. [00:29:06] It would have never happened. [00:29:07] Turn the volume down. [00:29:08] Play Ed Sheeran. [00:29:09] Yeah. [00:29:11] Wow, man, that's incredible. [00:29:14] What about have you dabbled in like video games, like virtual reality and like simulated reality, anything like that? [00:29:25] Yeah, it's interesting you mentioned that. [00:29:26] So, the lab that I work in at Florida State University used to be called the Visualization Laboratory. [00:29:33] It was run by Dr. Gordon Erlebacher, who is my advisor. [00:29:37] He's the chairman of the Department of Scientific Computing. [00:29:39] He teaches the Introduction to Game and Simulator Design course. [00:29:43] Wow. [00:29:44] So, in the visualization lab, we have all kinds of tools. [00:29:46] We got the Oculus Rift, we got the HoloLens, we got the Neuron, the mobile trackers, we got all kinds of good stuff. [00:29:53] Wow. [00:29:54] Yeah. [00:29:54] Have you ever played Fortnite? [00:29:56] Fortnite. [00:29:57] No, but I watched your podcast with one of the world's best Fortnite players. [00:30:01] Yeah, yeah, Tfue. [00:30:02] Yeah, his name. [00:30:02] How cool is that? [00:30:03] He's a local kid. [00:30:04] No way. [00:30:04] From right here on Indian Road to the east. [00:30:05] How outlandish is that? [00:30:06] He's a local kid, 17 year old millionaire. [00:30:10] Wow. [00:30:10] Playing video games. [00:30:12] Yeah, it's incredible. [00:30:13] It's insane, man. [00:30:15] That's. [00:30:15] What we were talking about earlier, you'll be a Renaissance man or you get extremely fucking good at something like that. [00:30:22] And all those different paths are neither is more right or wrong than the other. [00:30:26] It's just a different life that, kind of sadly for me, I'm often sort of met with sadness at all the facets of life that I'm just never going to experience, right? [00:30:33] Like, I'm never going to be a guy who just surfs on the beaches of Hawaii. [00:30:37] I won't know the emotional, psychological phenomena that sort of overcomes you when that's who you are. [00:30:42] And I'm never going to be a devout religious person who goes and just preaches what he believes in 100%. [00:30:46] There's so much life that I'm just never going to get to experience. [00:30:50] And when you think of that, it's kind of sad sometimes. 
[00:28:27] Yeah, I pulled out there, you know, because everybody from school is like at that intersection or at Circle K or at the corner. [00:28:33] So that's where life happens. [00:28:35] If you want to be noticed, you got to be there. [00:28:36] Right there. [00:28:37] So I was like, hey, I'm going to be noticed. [00:28:38] And so I rolled down my window and started blaring. [00:28:40] What was it? [00:28:41] Give me fuel, give me fire, give me that. [00:28:43] Metallica. [00:28:44] And started doing some donuts in the middle of the intersection, just like such a hooplehead. [00:28:48] I don't know what I was thinking then. [00:28:49] But that was adrenaline, you're young, living life. [00:28:52] I got a reckless driving ticket. [00:28:53] You did from that? [00:28:54] Yeah, that didn't work out so well. [00:28:56] I got what I deserved, of course. [00:28:57] But it was still a good memory. [00:28:58] Honestly, I'd probably still do it again. [00:29:00] That's what I mean. [00:29:01] You've got to break the law sometimes and live life a little bit. [00:29:03] You've got to. [00:29:03] A self-driving car would have never done that. [00:29:05] No, hell no. [00:29:06] It would have never happened. [00:29:07] Turn the volume down. [00:29:08] Play Ed Sheeran. [00:29:09] Yeah. [00:29:11] Wow, man, that's incredible. [00:29:14] What about have you dabbled in like video games, like virtual reality and like simulated reality, anything like that? [00:29:25] Yeah, it's interesting you mentioned that. [00:29:26] So, the lab that I work in at Florida State University used to be called the Visualization Laboratory. [00:29:33] It was run by Dr. Gordon Erlebacher, who is my advisor. [00:29:37] He's the chairman of the Department of Scientific Computing. [00:29:39] He teaches the Introduction to Game and Simulator Design course. [00:29:43] Wow. [00:29:44] So, in the visualization lab, we have all kinds of tools. [00:29:46] We got the Oculus Rift, we got the HoloLens, we got the Neuron, the mobile trackers, we got all kinds of good stuff. [00:29:53] Wow. [00:29:54] Yeah. [00:29:54] Have you ever played Fortnite? [00:29:56] Fortnite. [00:29:57] No, but I watched your podcast with one of the world's best Fortnite players. [00:30:01] Yeah, yeah, Tfue. [00:30:02] Yeah, his name. [00:30:02] How cool is that? [00:30:03] He's a local kid. [00:30:04] No way. [00:30:04] From right here on Indian Road to the east. [00:30:05] How outlandish is that? [00:30:06] He's a local kid, 17 year old millionaire. [00:30:10] Wow. [00:30:10] Playing video games. [00:30:12] Yeah, it's incredible. [00:30:13] It's insane, man. [00:30:15] That's [00:30:15] what we were talking about earlier, you'll be a Renaissance man or you get extremely fucking good at something like that. [00:30:22] And of all those different paths, neither is more right or wrong than the other. [00:30:26] It's just a different life that, kind of sadly for me, I'm often sort of met with sadness at all the facets of life that I'm just never going to experience, right? [00:30:33] Like, I'm never going to be a guy who just surfs on the beaches of Hawaii. [00:30:37] I won't know the emotional, psychological phenomena that sort of overcomes you when that's who you are. [00:30:42] And I'm never going to be a devout religious person who goes and just preaches what he believes in 100%. [00:30:46] There's so much life that I'm just never going to get to experience. [00:30:50] And when you think of that, it's kind of sad sometimes. [00:30:51] Why won't you get to experience it? [00:30:52] You could go surf in Hawaii. [00:30:54] Yeah, but I mean, what I mean is like a lifetime of that. [00:30:57] If I go do it, a lifetime of it. [00:30:59] Yeah. [00:30:59] So, for example, have you traveled before? [00:31:01] Of course. [00:31:02] Have you been international or something? [00:31:03] Yeah, yeah, yeah. [00:31:04] You go and you meet these people and you see what it's like and it's cool and all. [00:31:07] Right. [00:31:07] And you come back and you're like, oh my God, that was the best thing ever. [00:31:09] Yeah, but we really have no idea. [00:31:11] Right. [00:31:11] We have such a short, and this is where machine learning comes in again, right? [00:31:15] We have a short exposure to information. [00:31:17] And that allows us to create a very ill-informed, what we call a prior, opinion about what it's actually like. [00:31:22] Because we're trying to take this information and reconcile it with our worldview that we learned over here with our Western values. [00:31:29] And so we're taking what we think they're experiencing and fitting it into what we understand life to be. [00:31:35] But if you were over there and you were raised there, you would see life so much differently. [00:31:39] It's just, it's very hard to describe. [00:31:42] I mean, the epitome of it is when you think about religions, right? [00:31:47] We all have, all humans come from the same caste, 99.98 something percent similar DNA. [00:31:53] Yet, just in the right environment, the right circumstances, and the right exposure, you can be raised to be, you know, Buddhist, Catholic, [00:32:00] Jewish, all these different religions that are just exposed to you from a very young age, and it just totally changes your worldview. [00:32:07] So there's just infinitely many lives that we're never going to be able to experience. [00:32:11] This is what we've got, and we just make the best of it. [00:32:12] But sometimes it's sad to just think about what we'll never see. [00:32:15] But do you think we live different lives, like in different dimensions? [00:32:20] Are you religious? [00:32:21] Are we in a simulation? [00:32:23] You know, interestingly, the answer to both of those questions is the same. [00:32:27] There does not exist enough evidence either way to actually make a concrete decision. [00:32:31] We have to do what [00:32:33] most people do, which is what we have to do as a human, and that's just choose to believe based on what's consistent with our values and what we've been taught and what we've been exposed to. [00:32:41] Right. [00:32:41] So, are we in a simulation? [00:32:42] Maybe. [00:32:43] Maybe not. [00:32:44] Life would be the same whether we were or whether we weren't, interestingly. [00:32:48] And am I religious? [00:32:49] I don't think I'm religious. [00:32:50] I'd say that I don't really have enough evidence either way. [00:32:54] You know, some people are like devout atheists. [00:32:56] I think it's oftentimes, you know, maybe as dangerous to say that there is no God as it is to say that there is one. [00:33:04] Nobody can actually prove it either way. [00:33:05] They just choose to believe it because it's consistent with their values and their worldview. [00:33:09] Right.
[00:33:11] And you, I mean, do you often overthink to the point where you are like, wow, like you were saying before, you're like, I wish I could have, you know, spent my life surfing on the beaches of Hawaii or climbing Machu Picchu or doing this and that. [00:33:24] I mean, instead of having like that versus like having like the narrow minded, like the narrow tunnel vision of just going down one path and sticking to it, like similar to like the video game kid who obviously never tried to do anything else. [00:33:36] Maybe he thought about it, but [00:33:38] just stuck down one path from when you were a young kid to when you got older, got lucky, hit the jackpot. [00:33:44] People say, Oh, that kid's so lucky or whatever. [00:33:45] But, like, someone like you who was blessed and cursed to be so freaking smart that you think about all these different scenarios that you could have done. [00:33:54] Is that, does that make you kind of like, is that what you said? [00:33:56] It makes you sad? [00:33:57] Does that make you sad? [00:33:58] Sometimes, sometimes. [00:33:59] But, you know, as I said a little earlier, I try to be consistent with this worldview, and that is that it's neither right nor wrong. [00:34:04] I don't regret it. [00:34:05] This is my choice. [00:34:06] That was his choice. [00:34:06] If I would have lived a life like, well, choice. [00:34:08] We don't really have too much of a choice. [00:34:09] We're just sort of born where we're born. [00:34:11] Yeah. [00:34:11] And then we like to have pride in that as if we somehow had choice over our nation of birth or something like that. [00:34:15] But, you know, we live the life we live and we make the best of it. [00:34:19] You acknowledge the lives of others, try and internalize it as much as you can, see their worldview. [00:34:24] But we can't unless I finish my invention and make that mind machine where we can put experiences in different people's minds. [00:34:30] I think there's an episode of Black Mirror that's like that. [00:34:33] Yeah? [00:34:33] There for sure is. [00:34:34] Have you watched that show? [00:34:35] I've seen some of it. [00:34:36] Yeah. [00:34:37] Where they have like a little chip or something, right? [00:34:40] And you could see people's like memories. [00:34:42] Yeah. [00:34:42] Yeah, yeah, yeah. [00:34:43] There's also one in Hitchhiker's Guide to the Galaxy. [00:34:46] They got that gun at the end, remember? [00:34:48] I think that was The Answer to the Secret of Life or something like that. [00:34:50] And it was just shooting. [00:34:51] Hitchhiker's Guide to the Galaxy? [00:34:53] Yeah, I remember that movie, but I don't remember. [00:34:55] What, the raccoon and shit? [00:34:57] No, no, that's, uh, [00:34:59] Guardians of the Galaxy. [00:34:59] That's Guardians of the Galaxy. [00:35:00] What was the other one? [00:35:02] Yeah, that's what I was thinking. [00:35:03] I know. [00:35:05] What was the other one called? [00:35:05] Hitchhiker's Guide to the Galaxy? [00:35:07] Yeah, and then at the very end, there's the gun. [00:35:09] I don't remember what it's called, but they make a joke out of it. [00:35:11] But basically, it's like now men know what women are thinking, because, you know, the woman shoots you, and then suddenly you know what she's thinking. [00:35:18] Is that sort of how your invention will work?
[00:35:21] You know, I thought a lot about it, and there's a lot of like really dangerous pitfalls to go into, and you know, not only just philosophical but also physiological, like the neuroscience behind it. [00:35:34] You know, as I started studying neuroscience over the years, I worked at the Max Planck Florida Institute for Neuroscience for many years and learned a lot about how all this stuff works. [00:35:41] And it's just we are so incredibly far away from something like that. [00:35:45] Really? [00:35:45] Yeah. [00:35:46] Would it be more of a gun, or would it be more of like an electric chair you sit in, and like a thing goes over your head and you strap into it? [00:35:52] What would it look like in your mind? [00:35:54] Yeah, I mean, when I dreamed it up in my nonsensical idea stew, it would be something akin to like a headset that uses something like magnetic fields to stimulate certain regions and certain neural pathways. [00:36:08] But the problem is the following: every human being can be presented with the exact same stimulus, theoretically, which they can't, right? [00:36:16] Because just different angles and everything. [00:36:17] But just theoretically, they're presented with the exact same stimulus and they'll represent that stimulus internally in completely different ways. [00:36:24] All right, so the idea of me somehow mapping the experience that you had, learning it, and then somehow making someone else experience it, I would have to know the entire mapping of their brain and somehow which neural pathways and spatiotemporal patterns represented different feelings and experiences and activate them in some crazy order. [00:36:43] I mean, it's really just a fairy tale right now, but you know, you got to dream about this stuff sometimes to work towards the fun stuff. [00:36:50] Yeah, exactly. [00:36:51] I was going to say that. [00:36:53] It's crazy. [00:36:53] It's so deep. [00:36:55] My mind is being blown every like 15 seconds. [00:36:58] I'm trying to follow along. [00:37:00] People want to try to figure out how to live longer and live forever. [00:37:02] Yeah. [00:37:03] So I'm sure there's people out there researching and all that stuff. [00:37:07] And I can only imagine how far it's really gone already. [00:37:10] Do you think we're that far out from something like that? [00:37:13] Well, there's, I guess, two main approaches to something like that, at least from what people like to talk about. [00:37:18] One is the notion of just like putting your brain into something else. [00:37:22] And that's only a good idea in the short term because the brain is still biological and physiological, and that will decay over time as well. [00:37:30] So, it's not like you can just keep putting your brain into a new body. A brain's a brain. [00:37:34] It's just, right. [00:37:36] The brain is a hunk of meat, also, right? [00:37:38] Yeah, it's going to die just like everything else, right? [00:37:41] And where are we? [00:37:41] How far away from that? [00:37:43] I think very recently there was this very controversial research or operation that somebody's been trying to make happen where they're going to transplant one guy's head onto another body. [00:37:53] Oh, yeah. [00:37:54] I've heard about that. [00:37:55] Yeah, I don't think it has happened yet. [00:37:56] And I don't really know how much truth there is behind something like that. [00:37:59] But that would be an example of where you're not necessarily putting somebody's brain
[00:38:02] into another body, but you're putting someone's head onto another body. [00:38:05] And so I really don't know anything about medicine, so I can't speculate on that. [00:38:09] But you just wanted to know how reasonable it is. [00:38:13] That's where that's at. [00:38:13] But the other option is uploading the brain into a digital medium where it should allegedly persist well beyond the constraints of the biological system. [00:38:23] Then there's no meat carcass that you have to deal with, it's just digital, right? [00:38:27] Yes, yes. [00:38:30] Well described. [00:38:31] Meat bag. [00:38:31] Yeah. [00:38:32] Is that what they call it? [00:38:34] The meat bags that we're trapped in for our whole lives. [00:38:36] Yeah, we're incredibly far away from anything like that even being remotely possible. [00:38:41] Really? [00:38:42] Yeah. [00:38:44] Yeah. [00:38:44] I mean, what we can do is measure some patterns and neural pathways and then make some very, very weak predictions about what somebody's thinking or doing before they do it based off of patterns that are taking place. [00:38:59] For example, there was another experiment where they had two, I don't know if I'm going to cite this correctly, but they had two mice or rats, [00:39:07] hamsters, something like that, very far away from one another, opposite sides of the planet, for all I know. [00:39:11] And there was an electrode in the brain of one, and it was navigating a maze. [00:39:19] And while it was navigating the maze, it was sending the signals that it was experiencing in its brain to the other mouse, rat, hamster, whatever it was. [00:39:27] But that one was not traveling through the maze. [00:39:29] And the goal was to find some cheese. [00:39:31] So the first one, you know, got stuck, and then eventually it found some cheese. [00:39:34] And then they just let the other one loose in the exact same maze, just duplicated, and it went straight to the cheese. [00:39:39] So there is something. [00:39:40] He didn't hit any of the roadblocks along the way. [00:39:42] He didn't really know. [00:39:43] Because he learned the route based off of the neural [00:39:46] activity that was going on in his counterpart. [00:39:48] So, that's something they implant into these rats? [00:39:53] How does that work? [00:39:55] Yeah, it's a good question. [00:39:56] I mean, so the fundamental premise of all this is that you're measuring neural activity, right? [00:40:01] So, you have what's called an electrode that gets very close to a soma, a cell body. [00:40:06] Sometimes it punctures it. [00:40:08] Maybe it's in the axon or the dendrite, some part of the neuron itself. [00:40:12] And then, as that neuron activates, this electrode measures the voltage changes that are taking place. [00:40:17] This is something [00:40:18] physical, like you could touch? When you say the electrode, that's something, like, yeah, physical that they put in there? [00:40:24] Okay, exactly. [00:40:24] They shoot like a syringe in there, into their head or something, or what is it, like a microchip? [00:40:29] Yeah, sometimes. [00:40:29] I mean, there's a lot of different techniques, but the general premise is always the following: get access to the brain, stick a needle in it.
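As a toy illustration of what "measuring the voltage changes" can amount to downstream, here is a simple threshold detector run on a synthetic trace. The signal, the injected spikes, and the five-sigma threshold are all made up for the example, not taken from any particular recording setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "electrode" recording: baseline noise plus a few injected spikes.
trace = rng.normal(0.0, 1.0, 1000)
trace[[120, 450, 800]] += 8.0  # pretend action potentials

# Simple detection: flag samples that exceed k standard deviations.
k = 5.0
threshold = k * trace.std()
detected = np.flatnonzero(trace > threshold)
print("spikes detected at samples:", detected)
```

The cleanup and denoising he mentions next is exactly what makes real versions of this harder: where the electrode sits relative to the cell changes the amplitude and shape of what you record.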
[00:40:36] Okay, and so I mean, whether that needle punctures the cell or it's just right next to it, measuring small voltage fluctuations, you know, just near the periphery of the cell body, or whether it's actually inside the cell body, all modifies the results, and you have to clean up and denoise that stuff. [00:40:48] But that's the idea of recording the activity of one brain. [00:40:52] And then the idea of then sort of imposing that on the brain of another, I don't know the science behind how that was done. [00:40:59] So I'll. [00:41:00] Do they take it out of the first, whatever the rat or whatever it is, and put it in the next one? [00:41:05] Or do they just have one? [00:41:06] They just record it, right? [00:41:07] So every time the neuron fires. [00:41:08] And then they just upload, or just digitally. [00:41:10] It's just digitally. [00:41:11] So it's basically an electrical current. [00:41:12] They're just measuring voltage changes. [00:41:14] So each one has one and it transfers to it. [00:41:17] Not all the neurons, obviously, that'd be impossible. [00:41:19] There's way too many neurons. [00:41:22] There's more neurons in the brain than there are, I don't know. [00:41:25] Trees in one of the largest forests in the world, right? [00:41:27] I mean, it's absurd. [00:41:29] But you. [00:41:30] That's fucking crazy. [00:41:31] Yeah, that's crazy to think about. [00:41:33] It's pretty exciting. [00:41:35] It's exciting. [00:41:36] It's fascinating to me. [00:41:37] I'm torn, man. [00:41:38] I mean, I get excited about it sometimes, but then I can empathize with the fears, and I don't feel like either way is right or wrong. [00:41:43] And I think we would be better off sort of embracing both and finding a compromise that finds the best for everyone. [00:41:48] But what do I know? [00:41:50] What do you think about like plasticity? [00:41:55] Is plasticity, does that mean that you can basically. [00:41:59] Retrain your brain to do things that it was like opposite of what it was originally trained to do, like when you were developed as a young kid. [00:42:08] Like when your brain develops when you're young and you learn, like you said, like you have certain pathways in your brain and you have a certain map of the way you see and the way you take in input, right? [00:42:17] Is plasticity, does that mean that you're able to change that stuff? [00:42:21] Like for instance, you're able to change the way that you interact with all the elements and everything in the world? [00:42:27] Like explain that to me. [00:42:28] Like I never really understood what it was. [00:42:30] Okay, I was right. [00:42:31] Yeah, you nailed it right on the butt, man. [00:42:33] You missed your calling as a neuroscientist. [00:42:34] Okay. [00:42:34] Damn, you're a genius. [00:42:35] I knew I had something. [00:42:36] I knew I had something in me when I met you in high school. [00:42:38] There it is, man. [00:42:39] So, now you're talking my language. [00:42:42] This is sort of the core of my research that I'm working on. [00:42:45] And this revolves around the concept of representation learning, right? [00:42:49] Okay. [00:42:49] So, the very first thing that happens, as I said, is that information needs to be ingested. [00:42:55] You know, you get some electromagnetic wave of a photon, it hits a cell, a photoreceptive cell in the back of your retina. 
[00:43:03] That wave gets transduced into an electrical signal, which then passes through some other cells and then goes to the optic nerve to the back of the brain. [00:43:10] Okay, so how does this wave get represented internally? [00:43:15] Like, what's going on with that? [00:43:16] Well, the idea behind plasticity is that you have one neuron, this is the soma, the cell body. [00:43:23] You have what's called the axon, which is where it sends signals down the line, and you have what is called the dendrites, where it aggregates all the information from everything that it's listening to. [00:43:33] So, what plasticity says is that one of the dendrites, for example, if it is listening to activity from another neuron, say from this electromagnetic wave that it just received, that it just experienced, the strength between these two is able to change. [00:43:51] And so, what does that mean and how is that useful? [00:43:53] Well, what that means is that as we start seeing sensory information over and over and over again internally in the brain, our ability to process it starts getting accustomed to that sensory input. [00:44:06] And we start recognizing that sensory input and we start finding patterns in that sensory input. [00:44:11] And this is anticipating it, right? [00:44:13] Anticipating it, exactly. [00:44:14] This is one of the key observations behind one of the deepest parts of machine learning, in my opinion, which is representation learning. [00:44:22] And so here's the example for that. [00:44:24] Let's say that I showed you a bicycle, and I said, Danny, what is this thing? [00:44:30] What would you say? [00:44:31] That's a thing you sit on and you do wheelies on. [00:44:35] There you go. [00:44:36] Get to school on it. [00:44:37] There you go. [00:44:38] You might say it's a bicycle. [00:44:39] Right. [00:44:39] And then I would go, how do you know it's a bicycle? [00:44:42] And then you'd say? [00:44:44] Because it has two wheels and handlebars. [00:44:46] Well, a motorcycle has two wheels and handlebars. [00:44:49] And a motor, so it's a motor bicycle. [00:44:51] There you go. [00:44:51] All right, so the idea, though, is that we can keep asking these questions and they get more and more refined as to why do we call this thing what it is. [00:45:00] Start with maybe a coffee mug as another example. [00:45:03] Okay. [00:45:03] What is this? [00:45:04] It's a coffee mug. [00:45:04] Right. [00:45:04] How do you know? [00:45:05] Well, it has a handle. [00:45:06] Well, so does a beer mug. [00:45:08] Right. [00:45:08] Well, it's short. [00:45:09] Okay, well, so does a teacup. [00:45:11] Oh, well, you just keep asking these questions. [00:45:13] And it turns out that there are these very, very, very subtle features. === Hebbian Plasticity Explained Simply (04:11) === [00:45:17] Of things that allow us to distinguish and disambiguate between them. [00:45:22] And so the question is, how do we represent those features internally? [00:45:27] And that's an active area of research in machine learning, artificial intelligence. [00:45:31] I have a hypothesis that it revolves around some of the fundamental concepts, old neuroscience that's sort of been forgotten in modern day research, which is called Hebbian plasticity. [00:45:39] And so we can show that locally, all these individual neurons that are making these local. [00:45:44] Hebbian? [00:45:45] Hebbian, yeah. [00:45:46] Hebbian plasticity. [00:45:47] Donald Hebb. [00:45:48] Okay, he's the guy. [00:45:48] He's the guy who did.
[00:45:49] Okay, yeah, he's like, Hey, neurons that fire together wire together. [00:45:51] That's me, baby. [00:45:52] Yeah, that's his big catchphrase. [00:45:54] And so we call it Hebbian plasticity. [00:45:56] Okay, and uh, yeah, so the idea is that all of these neurons locally start changing their behavior completely ignorant of everything else that's going on around them, and somehow, which is what I'm trying to show in my research, we get a larger aggregate order that arises from this local activity. [00:46:15] And the example that I like to give to this is uh, humans. [00:46:19] We all just exist on our own and we're all independent and selfish and we make decisions on our own self interest. [00:46:24] Yet somehow we have these magnificent shared fictions like corporations and religions and governments and nations and we just create these amazing concepts all from our just individual needs, a spontaneous order that arises chaotically. [00:46:38] It's very beautiful. [00:46:39] And we see this in nature all of the time. [00:46:42] And I contend that the brain does the same thing. [00:46:46] Yeah. [00:46:48] I'm going to have to rewatch this like 10, 15 times over to catch it. [00:46:51] Oh, that's not good. [00:46:52] Yeah, you got to. [00:46:52] I need to learn to say it. [00:46:54] More clearly. [00:46:55] So if I say something that's not clear, you know, give me a hard time about it because it's my job. [00:46:59] No, it's not you. [00:47:00] It's just so many different words that are not normally in my vocabulary. [00:47:03] And that's my fault, right? [00:47:04] So I need to. [00:47:04] It's stuff that I listen to every day. [00:47:06] But Danny's on the next one. [00:47:07] It's not your fault. [00:47:07] You're accustomed to it. [00:47:08] He's not. [00:47:09] It's the plasticity. [00:47:10] But I like it because I'm learning. [00:47:11] I'm definitely learning. [00:47:12] Picking up little gems along the way, for sure. [00:47:14] Well, I got to go on a little diatribe now that you just mentioned that. [00:47:17] But, you know, when you were saying I'm using a bunch of little terms that you don't understand and it's your fault, here's the reality, right? [00:47:23] Is that communication is another one of these spontaneous order things and it is just fascinating. [00:47:28] Like, let's just take a moment and just appreciate the majesty of communication. [00:47:31] Okay. [00:47:32] Internally, in this just massive net of, you know, so many neurons, I have some concept or some idea somehow, mysteriously. [00:47:41] And then, what I'm going to do is, I'm going to somehow turn this signal into a pattern of neural activity that oscillates the vocal cords in my throat. [00:47:49] That's going to make slight little pressure changes in the molecules in the air. [00:47:52] These are then going to travel to you and move little hairs in your ears, which is then going to create a neural signal which goes to a certain part in the brain where you then interpret the language and then somehow put all these words together and represent that idea. [00:48:05] So, I'm literally taking some crazy idea in my brain, transducing it into just waves through the air, and then recreating it in your brain. [00:48:12] Like, that's just phenomenal, right? [00:48:14] Which makes me do the same thing, create things that go on my face, which project back into your eyeballs and make you feel a different kind of way. [00:48:22] And it just goes back and forth, right? 
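Circling back to the Hebbian idea from a moment ago, "neurons that fire together wire together" is simple enough to sketch in a few lines. Here is a toy, rate-based version of Hebb's rule on made-up data: inputs that tend to be active at the same time as the output end up with much stronger connections than inputs that don't. All numbers are illustrative, and pure Hebbian growth is unbounded, so real models add some form of normalization (Oja's rule, for example).

```python
# Toy Hebbian plasticity: co-activity between an input and the output neuron
# strengthens their connection. Purely illustrative numbers throughout.
import numpy as np

rng = np.random.default_rng(1)
w = np.zeros(4)                 # synaptic strengths for four inputs
eta = 0.01                      # learning rate

for _ in range(1000):
    # Inputs 0 and 1 fire often (and together); inputs 2 and 3 rarely do.
    x = (rng.random(4) < [0.9, 0.9, 0.1, 0.1]).astype(float)
    y = float(x[0] + x[1] > 1)  # output fires when the correlated pair is active
    w += eta * x * y            # Hebb's rule: weight grows when both are active

print(np.round(w, 2))           # the first two weights end up much larger
```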
[00:48:24] Dude, you hit on a very important thing, and that's feedback during communication. [00:48:28] Some people, there's a lot of psychological disorders where they just don't have that capacity, where they just talk and they aren't even aware of what people are, or how they're interpreting what they say. [00:48:36] But to what you said, right? [00:48:38] Think about how much, you know, just sheer majesty goes into the ability of me to communicate some idea to you, right? [00:48:44] The fact that I'm taking the effort to do that means. [00:48:48] That I'm trying to put my idea from my brain into yours, which means I have the responsibility to choose all of the words and to communicate the idea from your prior and your perspective as best as possible. [00:48:59] It's just foolish of me to just speak in my own world, in my own little vacuum chamber, and expect you to understand me, right? [00:49:04] So I have a responsibility to communicate as effectively and carefully as possible. [00:49:09] Otherwise, why am I even talking? [00:49:10] Unless I just want to hear myself talk. [00:49:11] And that's not someone I want to be. [00:49:14] I know a few people who like to do that, unfortunately. [00:49:15] It depends on the normal podcast episodes. [00:49:18] I think you're nailing it pretty good. [00:49:19] I mean, you got to keep it on a certain level, but it's just my brain's working at super fast speeds right now to retain all this information. === The Power of Human Authority (08:50) === [00:49:29] He's a four cylinder. [00:49:30] Yeah, I'm running on fumes right now. [00:49:33] It's funny, man. [00:49:34] There's a guy that I do a show with on our YouTube channel, right? [00:49:39] He's a real estate dude. [00:49:40] Big, fat New York Jew. [00:49:42] And he's worth hundreds of millions of dollars. [00:49:45] One of the richest dudes I've ever met. [00:49:46] And you consider him really fucking successful, right? [00:49:50] Sure. [00:49:50] I've followed him around for the past six years documenting, you know, here and there, what he does, making little vignettes and putting them on YouTube, and people fucking love it. [00:49:58] People just go crazy over it. [00:49:59] They're fascinated with it. [00:50:01] This big, and he is, no offense to Ben if you're watching this. [00:50:06] He'll never watch it. [00:50:06] He never watches any of our videos. [00:50:08] He is literally, not that ignorance is a bad thing. [00:50:11] I don't think ignorance is a bad thing, but he is the most, one of the most ignorant, narrow minded motherfuckers I've ever met. [00:50:18] But he's so fucking rich and successful at real estate. [00:50:23] And people think that his opinions are so. [00:50:27] They've put his opinion so high above everybody else just because of that one factor that he's successful in real estate. [00:50:33] Yeah. [00:50:34] There's a lot of interesting things that you brought up there. [00:50:36] I mean, the first one is something that I'm rather passionate about trying to oppose, and that's the far too commonly committed fallacy, which is the appeal to authority. [00:50:50] So, as you said, so many people just respect what he says as if he, as some sort of an authority figure, knows what's right and what's wrong. [00:50:58] And that's one of the lowest. [00:51:00] Forms of reasoning, right? [00:51:01] I mean, there's no real internal judgment that's being made. [00:51:04] They're not really reconciling his information with their own internal model.
[00:51:07] They're just like taking it as truth. [00:51:10] And far too many humans, in my opinion, do that. [00:51:13] And I think we as a society would benefit if more people took a bit more responsibility and discipline for what they consider to be true or not true. [00:51:23] Right. [00:51:24] And the other thing that's really interesting is yeah, he's a big, obnoxious kind of guy. [00:51:29] And honestly, that's great. [00:51:30] And the reason is because we need all kinds, the whole world needs all kinds. [00:51:34] And there's one thing that we've learned in science, and that's the vast majority of things on average behave as what we call a Gaussian. [00:51:42] Distribution and basically just looks like a little bell. [00:51:45] And what that means is the majority of people are right here in the middle and it's high because that's where all the people are. [00:51:49] And as you taper off to the other ends, there's fewer and fewer people out there. [00:51:52] But if those people weren't there, this thing wouldn't be a Gaussian and it would start to become very, very narrow and it would start to become very, very pointy and everybody would be all the same. [00:52:01] The same, yeah. [00:52:02] And the problem when everybody's the same is that people can't change. [00:52:05] Right. [00:52:05] And so we intrinsically need people like that to adapt and grow. [00:52:09] And I think as long as we have a tolerant mindset, as we said before, nobody's right or wrong. [00:52:14] We observe, reconcile with our world model, and make the best of it. [00:52:17] So, sounds like an interesting guy, and I'd probably enjoy Jay. [00:52:19] Yeah, check him out. [00:52:20] Yeah, man, we should have had him out here tonight. [00:52:21] It's an interesting thing, like we were talking about before, though, the communication part of it. [00:52:25] The guy's a really good communicator, and being able to influence people and make them think a certain way, even though it may not be true. [00:52:35] Well, let's talk about that for a moment. [00:52:36] So, at first, you said he's very brash and he says what he wants, and that people just listen to him because of his power and his money. [00:52:44] Right. [00:52:45] Does he actually observe who he's communicating to and communicate in a way effective for them to understand, or does he just use his platform and his power to convince people? [00:52:54] He definitely conforms to who he's communicating to. [00:52:56] And indeed, he sounds like a very good communicator. [00:52:58] Yeah. [00:52:59] And I mean, it's, yeah, it just blows my mind. [00:53:03] I mean, it's a lot, it's very similar to how Donald Trump can communicate to people, how he can communicate to all the people who voted for him. [00:53:11] You know, he knows how to communicate to his audience, you know what I mean, to get what he wants. [00:53:15] Right. [00:53:16] And this guy can do the same thing, you know, depending on who he's talking to. [00:53:20] Like he can be whoever he wants, whatever he wants. [00:53:24] Just sounds like an edge case sociopath, perhaps. [00:53:28] Yeah. [00:53:29] And it's to get what he wants. [00:53:30] You know, I mean, whether it's buying a car or a house, he's going to talk to that person. [00:53:36] He's going to get what he wants for the price that he wants. [00:53:38] Yeah. [00:53:38] You know, he goes to get a car, he's going to beat the guy up, and the guy's not going to want to sell it for that price.
[00:53:43] But at the end of the beating, the ear beating this guy receives, he will get the car at the price he wants it. [00:53:49] It's an interesting life. [00:53:50] That's interesting. [00:53:51] Social media, too, man. [00:53:52] Like the communication through social media. [00:53:56] People can see something and take it their own way and put their own story behind it. [00:54:01] And it makes them feel a certain kind of way. [00:54:03] And it may not be the truth. [00:54:04] Yeah. [00:54:06] Like we were talking about right before we started recording. [00:54:08] Yeah, we were talking about that a little bit earlier about social media and the comments. [00:54:11] And what did you say you did with that? [00:54:13] Yeah, so it's pretty devastating to our social structure and our ability to communicate with one another in a lot of ways. [00:54:22] It also has many strengths, as everything does. [00:54:25] To unilaterally label anything as good or bad is often evidence that somebody doesn't really know exactly what they're talking about. [00:54:30] But from the perspective of listening to individuals and respecting them, it's a pretty terrible medium and it's causing a lot of problems. [00:54:40] And the reason is pretty simple. [00:54:44] Well, nothing's really simple, if you look at it the right way. [00:54:48] But it's because there's no repercussions for their actions. [00:54:51] And it really comes down to what you were saying before. [00:54:54] When I say something to you, I see your face and I get feedback from what I say, and that tempers what I'm going to say and modifies the next words that I'm going to choose, etc. [00:55:03] Like that communication medium that we have from face to face interaction is very, very, very different than me just having my own self righteous thoughts and just slamming my hands around like a monkey trying to just vomit my half thought ideas, right? [00:55:18] So it's teaching us very bad communication skills. [00:55:21] As I demonstrated earlier, I'm kind of perhaps illogically enamored with the notion of communication. [00:55:26] And social media is not really, at least in the human level, not a very good medium for us to communicate. [00:55:32] But again, there's all kinds of other very interesting things that it's enabled us to do. [00:55:36] For example, we have been able to observe. [00:55:38] Trends in data based off of certain people from certain regions and they say certain things. [00:55:43] Yeah, maybe they're saying completely, you know, angry, aggressive, vitriol, and they're just like spewing stuff. [00:55:50] But we can still find passions and truths and opinions, and we can look at these all globally, locally, regionally, and we can start understanding better a lot of the social problems that we've been facing. [00:56:00] You know, unfortunately, we have a lot of issues with gender and race, et cetera. [00:56:04] And we can explore these a lot better with the medium of social media. [00:56:08] So it's got its strengths and its weaknesses, of course, as everything does. [00:56:12] Yeah. [00:56:13] I mean, the commenters on these videos can be pretty brutal. [00:56:16] These people are pretty ruthless. [00:56:18] Yeah. [00:56:18] I mean, unfortunately, I think in the long run, that usually just looks bad on them. [00:56:22] But I mean, the reality is, I mean, maybe 10 people read their comments and then nobody cares anymore because everybody knows there's just no real information or knowledge in the comments nowadays. [00:56:31] Right. 
[00:56:31] And so, those of you who are thinking about just writing nonsense, please take a little more responsibility in communicating your ideas. [00:56:38] For real. [00:56:38] I mean, it's getting hot in here. [00:56:41] Let me turn the camera. [00:56:43] What happened to that? [00:56:43] I think. [00:56:44] Forrest is digitally turning the AC up because I keep seeing it come on and it goes up like one degree, one degree. [00:56:51] And I think he's at home watching us turning the AC up. [00:56:54] Turn the heat on us. [00:56:56] So he's pranking us, huh? [00:56:57] I think so. [00:56:58] I keep seeing that thing pop up and get hotter. [00:57:00] I like him. [00:57:02] Yeah, he's funny. [00:57:03] He got really, really butt hurt. [00:57:04] He was on one of the first podcasts, not one of the first, but he was a few podcasts ago. [00:57:09] He was talking to a buddy, Jack, a friend of mine, who was very active on YouTube about. [00:57:16] Trying to educate people on like the red tide going on in Florida. [00:57:21] Like, basically, he was being very vocal and talking a lot about like the politics behind the people that were responsible for creating the cause and effect of red tide with like the sugar cane, the sugar cane fields, and all the, you know, all the toxins and all the pesticides going into Lake Okeechobee and causing the red tide. [00:57:44] Then we started getting into like eating and dieting, and Forrest. [00:57:47] Is like he doesn't know anything about that. [00:57:50] And Forrest would be like, Why? [00:57:51] Why don't I go to McDonald's? [00:57:52] The veganism and yeah, I mean, he's like, Well, you got to eat healthy, you know. [00:57:55] And Jack, he's like, You got to eat healthy, you got to care about the environment. [00:57:59] And Forrest is, you know, he's like, Why? [00:58:01] It's cheaper to buy a double cheeseburger for a dollar and save money, and that makes me happy. [00:58:06] And then people on the comments were just basically crucifying Forrest, and they went in on him hard. [00:58:11] He won't come on the podcast anymore. [00:58:12] Thanks, guys. [00:58:13] Well, that's unfortunate, and I'll be honest, I think that's one of the problems with social media. === Dieting, Veganism, and Environment (08:19) === [00:58:19] People get very, very passionate about their ideas, as I shared before. [00:58:22] I mean, humans have this incredible inclination. [00:58:24] We just love our own ideas. [00:58:26] And especially from the safety of our own laptop in our home, right? [00:58:28] Of course. [00:58:28] We are the authority of all that is knowledge, right? [00:58:30] No consequences. [00:58:31] But the reality is that Forrest isn't entirely wrong. [00:58:35] And the problem is, well, here's a great example from an economist I'm a fan of, I don't remember his name, but he writes about this stuff all the time. [00:58:44] And here's the example. [00:58:45] So let's say that there's a park, and it's a beautiful park, and people like to walk through it. [00:58:51] Okay, so right now our society has this very large leaning towards preserving society, I'm sorry, preserving nature and parks and the environment. [00:58:59] So we all have a very strong bias towards environmentalism. [00:59:02] Okay, but now let's say somebody else comes along and he says, Actually, you know what? [00:59:05] I'd like to knock down this park and put a parking garage on here so that I can park my car. [00:59:11] Right.
[00:59:11] And then we go, Well, we have a decision to make here. [00:59:14] We have somebody who wants to take down the park and put a parking garage, and somebody who wants to just enjoy the beauty of nature and walk through the park. [00:59:21] And here's the problem once again, as we talked about earlier, there's no notion of like an intrinsic right or wrong. [00:59:27] And so the problem is that oftentimes people from that environmental perspective feel as though they have like a moral high ground. [00:59:34] Where they say, no, preserving nature is the right thing to do. [00:59:38] But you ask, why? [00:59:40] Why is it the right thing to do? [00:59:41] You just choose that it's the right thing to do because it makes you feel good. [00:59:44] Well, I want to save the earth. [00:59:46] But why? [00:59:47] Well, because it's fulfilling. [00:59:49] Well, the reality is the following: this person's desire to take down the park and put a parking garage is equally as valid as the individual who wants to preserve the park. [00:59:59] And so here's where the real truth comes from: the person who wants to preserve the park says, well, you can't take down the park because that is irreversible. [01:00:07] Once you take it down, you can never get it back. [01:00:09] And that's true. [01:00:10] They're right there, and there's a lot of truth behind that. [01:00:12] However, they overlook the fact that there's something else that is equally irreversible, and that is time. [01:00:18] Every single day that this person does not get to build their parking garage, they're going to have a day of convenience taken away from them. [01:00:24] We have this sort of internal desire to say, oh, well, you want convenience over me wanting to save the earth, so you're wrong and I'm right. [01:00:31] And there's no reason for that. [01:00:33] That's just an opinion, once again. [01:00:35] When the right philosophy would be the following. [01:00:37] Both people are equally right. [01:00:39] So, what we should do is build some environmentally sustainable parking garage or something that satisfies everybody. [01:00:44] Instead of just being polarized and bickering, coming from some moral high ground, we just realize that everyone's right and try and find a medium that satisfies everyone's desires. [01:00:52] It's very easy. [01:00:53] Right. [01:00:53] When you look at it from that perspective, at least. [01:00:55] But wouldn't you say that the person that wanted the park is trying to look out for the earth? [01:01:01] That's in everybody's interest, right? [01:01:03] Because everyone lives on the earth and we want the earth to last longer and we all want to live longer, right? [01:01:07] Isn't that something that's a common value? [01:01:08] Everyone wants to live longer and everyone wants the earth to be around. [01:01:12] Maybe are you arguing that this person doesn't care about the earth as far as their life? [01:01:17] Yeah, I would argue too. [01:01:18] I mean, it's complicated. [01:01:20] Yeah, there's two things. [01:01:20] I mean, one, the notion of self preservation is once again just an intrinsic desire, it isn't right or wrong. [01:01:26] As I said before, if somebody wants environmentalism and it turns out that saving the earth is best done by killing everybody, well, where do you draw the line? [01:01:34] Right. [01:01:35] True. [01:01:36] Right. [01:01:36] So, I mean, you bring up a good point.
[01:01:39] I mean, it is pretty complicated, but I think that as long as we're able to find a happy medium, where everybody is cognizant of the desires and wants of everyone else. [01:01:47] And as long as we realize that deep down, whatever we think or we feel is not intrinsically right or wrong, it just builds us up to be naturally tolerant of others. [01:01:57] And I like to sort of reemphasize the significance of that with just an example. [01:02:03] The difference between being sort of subjectively opinionated, you know, based on what you think and feel, versus objectively opinionated, which is complete right and wrong, is the difference between 99% and 100%. [01:02:18] What does that mean? [01:02:19] That's a small little pedantic detail, Nathan. [01:02:21] No, it's not. [01:02:22] The difference between 99% and 100% is the difference between somebody being very passionate about a religion and trying to share it with others but having conversations and then being an extremist who no longer needs to have discussions because they have the truth. [01:02:35] It's 100%. [01:02:36] Their only responsibility now is to just pound the truth into the rest of the world because there is no room for being wrong. [01:02:43] That's a very thin line that creates a very dangerous situation in society. [01:02:49] And that even shows itself. [01:02:51] In this notion of just environmentalism, wanting a parking garage or not. [01:02:54] As soon as one person is convinced that they're right, they're already wrong. [01:02:57] And a lot of people, I think, could benefit from just realizing how little we actually know and how absolute our own convictions really are. [01:03:06] I bet you get into some serious. [01:03:08] Have you ever gotten into a serious political debate with like a family member or a friend? [01:03:13] When I was younger, yeah, I was much less involved. [01:03:15] Not recently? [01:03:16] No, I mean, because again. [01:03:17] You just avoid them? [01:03:18] No, not at all. [01:03:19] I welcome it. [01:03:20] That would be like sport for you, I feel like. [01:03:22] No, I welcome it. [01:03:22] And the reason is because I see it as an opportunity to champion what I'm so excited and passionate about. [01:03:28] Which is tolerance, and listening to them and accepting them as being different but sharing an equally valid alternative view. [01:03:34] And the more people we have doing that, the better our situation is going to become. [01:03:38] I mean, we are more polarized now than we've ever been, partly thanks to social media, by the way. [01:03:44] And I think what we really need is just people to realize how little we actually know. [01:03:47] There's this thing called the Dunning Kruger effect. [01:03:49] You guys might have heard of it. [01:03:50] It's a psychological phenomenon that says the less people know, the more they think they know. [01:03:56] And, you know, we're plagued with that. [01:03:58] And via social media, the less people know, I mean, they go ahead and they Google one or two websites. [01:04:03] I mean, I fall prey to this too. [01:04:05] We all do, right? [01:04:05] Where I read one or two things and I'm like, oh, I see. [01:04:07] So this is how it is. [01:04:08] So, no, you're not right because this thing said this thing, which said this thing. [01:04:12] And we're just perpetuating the problem. [01:04:14] When the reality is, the more that you know, the more you know you don't know.
[01:04:19] And I find that when I'm around academics who dedicate their life to mastering something, they're just incredibly humble about their knowledge because they see how much they don't know. [01:04:28] Right. [01:04:28] You know? [01:04:29] And when somebody doesn't know how much they don't know, that's the Dunning Kruger effect. [01:04:32] They can't measure their own incompetence. [01:04:35] Right. [01:04:35] So they're stuck here thinking they know a lot because they don't know enough to know that they don't know a lot. [01:04:39] It's a very curious phenomenon. [01:04:41] Yeah. [01:04:41] No, no, I'm definitely familiar with that. [01:04:43] I didn't know the actual name of it, but knowing that you don't know a lot is smart, or that's whatever the word is. [01:04:51] That's the best way to learn more. [01:04:52] That's the best way to learn more, right? [01:04:54] To know that you don't know everything. [01:04:55] So you're saying I'm smart because I know less than you guys. [01:04:59] But I know that I know less than you. [01:05:00] Yeah, but you don't know you know less, so you're done with it. [01:05:02] But what if I do? [01:05:05] There's a great quote. [01:05:05] I don't remember the book that it's from, but it's exactly that. [01:05:08] You know, this person and I both don't know anything, but I know that I don't know anything. [01:05:12] Therefore, I'm smarter than he is. [01:05:14] Yeah, there's a quote in the book. [01:05:15] I can't recall. [01:05:15] I'm sure somebody on the internet will find it. [01:05:17] Yeah, they'll know that one. [01:05:19] But yeah, it's a great quote, and you're exactly right. [01:05:21] Knowing that you don't know is one of the wisest things that can be done. [01:05:24] And what I'm so passionate about is that machine learning and artificial intelligence, studying that just gives rise to all of these just tolerant ideologies because you start understanding what is intelligence, what is learning, what is memory. [01:05:35] And we're all just prey to the sensory information that we receive. [01:05:39] We make these biased opinions based off of something we can't really control. [01:05:42] You understand the learning process, how people make decisions. [01:05:44] And I don't know, I just think that's why I'm so excited about us learning more, is because I hope that myself and others who know this stuff can come share these things that we've learned from machine learning. [01:05:55] And it's my hope that as the message spreads, people might become more tolerant. [01:05:59] And maybe before I'm able to invent the machine where we actually truly understand one another, we can just get a lot closer by being more tolerant. [01:06:06] And more understanding of differences among us. [01:06:08] Definitely. [01:06:08] Sounds like such a hippie, dude. [01:06:09] Who do you think? [01:06:10] For real. [01:06:12] I felt that though. [01:06:13] Who did you like? [01:06:15] Did you have anybody in your life, either now or in your younger years, like when you were in high school or whatever throughout your life, people that you looked up to that you kind of tried to model yourself after? [01:06:26] People, influences? [01:06:27] Like, who were your biggest influences? [01:06:30] This is kind of embarrassing. [01:06:32] One of them was Leonardo da Vinci, right? [01:06:34] I was very young and just sort of enamored with his notion of absolute knowledge, knowing so many things. [01:06:38] So I pursued that.
=== Anime Tolerance and Strong Priors (02:35) === [01:06:39] And the other, his name is Vash the Stampede. [01:06:43] And yeah. [01:06:44] Who's Vash the Stampede? [01:06:46] Yeah. [01:06:46] He's the best, man. [01:06:47] He's who everyone should aspire to be like. [01:06:49] Never heard of him. [01:06:50] Yeah. [01:06:50] Should we have to look him up here? [01:06:52] This might be an appropriate lookup. [01:06:53] Does he have an Instagram? [01:06:54] How do you say it? [01:06:55] Vash? [01:06:55] Yeah. [01:06:56] There he is. [01:06:57] He's popular, man. [01:06:58] Okay. [01:06:59] There he is. [01:07:00] Look at this guy. [01:07:01] The anime. [01:07:02] Okay. [01:07:03] Yeah, I don't know. [01:07:04] I was 12 or 13 years old. [01:07:06] Hell yeah. [01:07:06] This anime, I mean, I don't know. [01:07:08] People like to, again, have very strong priors about things such as anime as being right or wrong. [01:07:13] But this one particular anime is surprisingly deep. [01:07:19] And they explore very, very, very deep philosophical issues. [01:07:23] And this was one of those things that made me have this sort of irrational, illogical. [01:07:28] Affinity for just doing good and doing right things. [01:07:31] So he's like this super, essentially, he's just this super powerful guy, has amazing abilities, but all he wants is to just live in peace and be happy. [01:07:39] But there's this other guy who wants him to suffer, and so he always tries to do bad things to him, and as a consequence, everyone around him is always getting hurt. [01:07:47] And so it's just this whole video about this one, this whole anime series about one guy just with this heart of gold, just always trying to do good things and help people save the day with his cool powers, but all the while suffering happens. [01:07:58] Inevitably, and it tackles a lot of the philosophical ideas of what's right and what's wrong, how do you feel about things. [01:08:04] It's a beautiful show. [01:08:05] That's super cool. [01:08:07] That's wild. [01:08:07] When did that show? [01:08:08] Cartoons, man. [01:08:08] When did that show come out? [01:08:09] That was a long time ago. [01:08:11] Was it really? [01:08:13] Yeah. [01:08:13] I just saw 2014 on there somewhere. [01:08:15] Dude, I've seen this thing probably. [01:08:17] I mean, it's only 26 episodes. [01:08:18] It's short and sweet. [01:08:19] You could watch it, and if you did a little weekend binge watching, you could watch it in two weekends. [01:08:23] Really? [01:08:23] Yeah. [01:08:23] And it's just phenomenal. [01:08:25] I've watched the whole series probably a good five or six times, shared it with a lot of my friends. [01:08:29] Everyone who's ever seen it has just been enamored with it as much as I so far. [01:08:33] Really? [01:08:33] Oh, check that out. [01:08:35] What was it on? [01:08:35] What was it on? [01:08:36] Was it on Cartoon Network? [01:08:37] Was it on. [01:08:38] It's super old, man. [01:08:39] I mean, this was 1980 something, I think, is when it came out. [01:08:41] Oh, really? [01:08:42] And so I caught it a little later. [01:08:43] My friend was like. [01:08:43] How old are you? [01:08:44] I'm 30. [01:08:45] You're 30, right? [01:08:45] Yeah. [01:08:46] My friend was an anime buff, and so he's the one who introduced it to me. [01:08:49] I was visiting him, excuse me, in Virginia. [01:08:53] And he was just watching it, and I watched it there, and just. [01:08:56] It starts off slow, okay? [01:08:58] Got to build up. [01:08:59] Yeah. 
[01:08:59] Amazing complexity of characters. [01:09:01] But it's a great show. [01:09:03] All you guys who like it, you know, say like and comment how great this show is. [01:09:07] I want to know how many of you think it's great. [01:09:09] It sounds super cool. [01:09:09] Or how much you hate it on here. [01:09:12] They'll let you know. [01:09:13] You need both. [01:09:13] You need both. [01:09:14] We need balance. === Inventors Like Tesla and Bell (04:37) === [01:09:15] We need a Gaussian. [01:09:16] Gaussian? [01:09:17] Gaussian. [01:09:17] Yeah, yeah. [01:09:18] Yeah, so Friedrich Gauss, he was a mathematician. [01:09:20] There's a fun little fable. [01:09:22] I don't think it's true, but so allegedly, when he was in grade school, the teacher punished everybody and said, okay, I want you all to add up the numbers from 1 to 100. [01:09:31] And so they're all sitting there at their desks, just like trying to do this thing. [01:09:34] And then within like a minute or two, he comes up and gives the answer, 5,050. [01:09:38] And then the teacher's just like blown away. [01:09:40] And it turns out that allegedly he invented one of the summation formulas for arithmetic series. [01:09:47] And, you know, as just a kid, he came up with that. [01:09:49] And it's an arithmetic series, you just add a bunch of things together. [01:09:53] And so, yeah, you add up 1 plus 2 plus 3 plus 4 plus 5 plus 6. [01:09:57] Oh, okay. [01:09:57] And so that whole sum is equal to a small little formula, n times n plus 1 over 2. [01:10:02] And so, allegedly, he just discovered that formula when he was a little kid and then went on to, I think the problem was much harder than that. [01:10:08] Anyways, it's a fun little story. [01:10:09] And he went on to do many other great mathematics things, such as creating the Gaussian distribution, which is pretty much the foundation of all probability theory and statistics and a lot of machine learning. [01:10:18] So, Friedrich Gauss, he's a cool guy. [01:10:20] Wow. [01:10:22] That's amazing, man. [01:10:23] Crazy. [01:10:23] When are you going to have something named after you like these guys? [01:10:26] Soon. [01:10:26] Or do you have something already? [01:10:28] Nothing? [01:10:29] Not yet? [01:10:29] I don't know. [01:10:30] I don't know about all that stuff. [01:10:31] What about Tesla? [01:10:32] Nikola, is that what his name is? Tesla? [01:10:33] Tesla. [01:10:34] Forrest was also telling me that him and Thomas Edison, I believe. [01:10:38] Have you, do you know they were both similar, but Thomas Edison was more about like being known for doing stuff or like being known for inventing, known about being more famous? [01:10:49] I think it was Edison, I think it was Bell. [01:10:51] Was it Bell? [01:10:52] Alexander Graham Bell. [01:10:52] I was like, we should double check that. [01:10:54] I thought it was the person who invented the light bulb. [01:10:58] But basically, he was telling me that Tesla invented something, basically, the neuron or whatever of a battery. [01:11:04] Like, he basically invented the primal function of a battery. [01:11:09] Right. [01:11:09] Not an actual battery, but like the primal function or however a battery operates. [01:11:13] Like, he somehow created the concept for it. [01:11:16] Right. [01:11:16] And it never got invented or it never came to fruition. [01:11:19] But whoever made the light bulb, whether it was Edison or Bell, no, Edison made the light bulb. [01:11:25] He made the light bulb. [01:11:25] Right.
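As a quick aside, the schoolroom shortcut from the Gauss story is easy to verify: pair 1 with 100, 2 with 99, and so on, which gives 50 pairs that each sum to 101. In code, assuming nothing beyond the story itself:

```python
# Check Gauss's shortcut: 1 + 2 + ... + n equals n * (n + 1) / 2.
n = 100
print(sum(range(1, n + 1)))   # brute force: 5050
print(n * (n + 1) // 2)       # the formula:  5050
```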
[01:11:26] The big famous. [01:11:29] Argument or struggle was between Tesla and Alexander Graham Bell. [01:11:33] Oh, really? [01:11:33] So, Alexander Graham Bell was like this big, yeah, you probably find it there real quick. [01:11:37] He was the big business guy. [01:11:39] He was your friend from New York, basically. [01:11:42] Alexander Graham Bell, he's like trying to monopolize electricity, making it available to people. [01:11:48] And his biggest flaw was that he was a champion of the direct current. [01:11:52] And so, direct current has a huge problem. [01:11:53] It's not able to propagate very far, and the signal decays very quickly. [01:11:57] And Nikola Tesla, yeah, there you go. [01:12:00] And Nikola Tesla, he was all about just making things free. [01:12:03] He was going to make Wi-Fi free for the world. [01:12:06] And he was just sort of an idealist that just didn't really fit into the capitalist structure that America was coming up in. [01:12:11] And as a consequence, he sort of ate it. [01:12:12] And I think he ended up like killing himself or something because he was really struggling with it. [01:12:16] But he was the one who was pioneering the alternating current. [01:12:19] And that's what we use today. [01:12:20] And that's what we use today. [01:12:21] Oh, the AC. [01:12:21] AC is what Graham Bell wanted. [01:12:23] Right. [01:12:23] And he did the AC. [01:12:24] Yeah. [01:12:24] And AC is what we use today. [01:12:25] And it's been phenomenal. [01:12:27] It allows signals to travel much, much further without losing as much power. [01:12:30] So we can get these power lines for miles and miles and miles. [01:12:35] What do you think? [01:12:35] I mean, was Tesla, did he have some sort of form of autism or Asperger's where he wasn't like very inclusive, couldn't communicate? [01:12:49] Is that the same? [01:12:50] I read a book on Tesla, and it was by Margaret something. [01:12:53] It was like a biography. [01:12:54] And I couldn't tell whether she was just like overly enamored with him and was sort of sensationalizing the stories or what. [01:13:03] It was sort of like a fictional retelling, perhaps. [01:13:05] But she would describe him as being a person who would come and sit at dinner. [01:13:09] And he would like to separate all of his peas to one side of the plate and put his cup here and his, you know, silverware perfectly straight, like this meticulous sort of, yes, demonstrating autistic type tendencies, where it's this like obsession with detail, obsessive, compulsive, et cetera. [01:13:23] Right. [01:13:24] So she did describe him that way. [01:13:26] So if we want to take that as evidence or truth, then I would say yes. [01:13:30] Right. [01:13:30] Yeah. [01:13:31] I find that fascinating. [01:13:31] I just, for some reason, I personally find that fascinating how like certain people who become, you know, turn into somebody like Nikola Tesla, who is, who's so successful in one field like that? [01:13:43] I always like to be interested in their brain and like the autistic Asperger's side of people. === Discipline, Sleep, and Focus (09:41) === [01:13:52] I mean, I know people who have that and they're just like, they get so focused on one thing. [01:13:56] It's like, you know, you have no choice but to become the best in the world at this one little thing. [01:14:01] And then when it comes to having a conversation with somebody on the side of the road or a relative, it's like you can't say two or three words to their 50 words.
[01:14:09] And it's not only the dedication, but oftentimes, you know, autistic. [01:14:13] People have a disproportionately large amount of neural capacity dedicated to that one thing. [01:14:19] So, you know, while we use all of our brain for all of these things, oftentimes people with autism use a significantly larger portion of their brain for one specific task. [01:14:27] And that's why they've been, you know, associated with doing cool things like synesthesia, where they like see and feel colors and senses, all their senses mixed together. [01:14:36] And, you know, when somebody hears numbers, they like the numbers have personalities to them and they like see and they have colors, et cetera, which is, I agree, it's entirely fascinating. [01:14:46] But there's actually an interesting little nugget of truth under that, and that is that, you know, the difference between mediocrity and greatness is just discipline, honestly. [01:14:53] Discipline. [01:14:53] They have a little bit of a leg up. [01:14:54] Because they have allegedly larger computational resources available, perhaps. [01:15:00] But you take two people of varying skill, one person's dedicated all the time, the other one's not. [01:15:05] You already know who's going to be more successful. [01:15:07] It's really what it comes down to. [01:15:08] How would you define discipline? [01:15:09] What creates discipline? [01:15:11] What is discipline? [01:15:12] I like discipline. [01:15:13] I like enduring hardships. [01:15:14] That's what I would call it. [01:15:15] I would call it the willingness to endure hardships. [01:15:19] The willingness to endure hardships. [01:15:22] And what makes somebody good at discipline? [01:15:24] Practice. [01:15:25] Practice. [01:15:25] It's literally just learned. [01:15:26] It comes with confidence and time. [01:15:27] I mean, confidence is built by a string of successes. [01:15:30] You just do it over and over and over again. [01:15:32] And then you start to associate the positive benefits of that enduring hardships. [01:15:35] And you start, it's like a drug. [01:15:37] You become addicted to it. [01:15:38] Runners experience this kind of a thing. [01:15:40] You know, you first go running for the first few weeks, and your body's like, what the hell are you doing to me, man? [01:15:44] Oh, this is miserable. [01:15:45] And you don't want to go. [01:15:46] You're fucking fasting. [01:15:47] Yeah. [01:15:47] But after a while, your body's like, oh, wait, I feel real good. [01:15:50] I'm getting a lot of endorphins. [01:15:51] Like, this is good. [01:15:51] You keep doing this. [01:15:52] Yeah. [01:15:53] You're incentivized to do it more. [01:15:54] So just practice, man. [01:15:55] It's everything. [01:15:55] What's something that, I mean. [01:15:56] Can you give me an example of something that you've taught yourself to be disciplined at? [01:15:59] I mean, well, I mean, is there a difference between teaching yourself to be disciplined at a certain thing or general discipline? [01:16:05] I don't think so. [01:16:06] And so, for example, I've tried to make a bunch of my lifestyle choices revolve around training my discipline. [01:16:12] So, I would take cold showers, for example, something that's not really enjoyable. [01:16:16] I sleep on the floor instead of a bed, you know, things that. [01:16:18] Really? [01:16:19] Man. [01:16:19] Sleep on the floor every day. [01:16:21] Every day. [01:16:21] I have a girl over here now. [01:16:23] She doesn't like sleeping on the floor. [01:16:25] Nathan.
[01:16:25] Well, she kind of sleeps on it. [01:16:26] We have a happy medium. [01:16:27] We sleep on a beanbag bed. [01:16:28] So. [01:16:28] Really? [01:16:29] Threw it away in the morning. [01:16:30] Yeah, I don't own a bed. [01:16:32] Why is this? [01:16:33] Discipline. [01:16:33] It's just a small little thing. [01:16:34] You just sleep on the floor. [01:16:35] You wake up in the morning and feel refreshed. [01:16:37] And I don't know. [01:16:38] It's just part of training that mental discipline. [01:16:40] So there's no reason. [01:16:42] Wait, it's discipline. [01:16:44] I understand. [01:16:45] But what's the benefit of. [01:16:47] What's the physical benefit? [01:16:49] Is it good for you? [01:16:50] Is it good for you? [01:16:51] It has been observed that usually it's pretty good for your back, et cetera, if you're able to do it. [01:16:54] Because that sounds like it would hurt. [01:16:55] That's what I was trying to say. [01:16:56] Yeah, I mean, I don't really know the science behind it, to be 100% honest. [01:16:58] But I mean, you can find evidence supporting whatever your ideas are, right? [01:17:02] So you found enough evidence. [01:17:04] That you think it's good. [01:17:05] Maybe not. [01:17:06] I'll be honest. [01:17:06] I got to confess. [01:17:07] I don't. [01:17:07] But you're trying it to see if it's good then. [01:17:10] I'm just doing it because it's hard. [01:17:11] All right, look. [01:17:11] So you climb a mountain. [01:17:12] You're just doing it because it's fucking hard. [01:17:13] You don't know if it's good or bad for you. [01:17:15] Simply discipline. [01:17:16] I'll tell you why. [01:17:17] I'll tell you how that happened. [01:17:18] I was in graduate school. [01:17:18] I was coaching at a boxing gym. [01:17:20] I had to coach in the evenings. [01:17:21] I did research from like eight to five and then I coached from five to nine. [01:17:24] And then I go back to work some more and do some more things. [01:17:27] And I was in the lab until two, three o'clock every night. [01:17:30] And then I had to get up early in the morning and do it again. [01:17:32] And one day, I was like, Why do I always have to go home? [01:17:36] Like, why do I go home? [01:17:38] It's a bed. [01:17:38] That's really the only reason. [01:17:40] It's like such a ball and chain, like a shackle. [01:17:42] Like, you know what? [01:17:43] I'm just going to go to sleep right here. [01:17:45] You just roll the chair out and you just lay on the floor. [01:17:47] It was such a liberating feeling. [01:17:48] Like, you can just, whenever you get tired, think about it. [01:17:50] Like, when you're tired, you're like, oh, yeah, I got to go home now. [01:17:53] Like, why? [01:17:53] You know, because you feel like that's where you have to go. [01:17:55] Right. [01:17:56] For some people, that's fine. [01:17:57] But I enjoyed the liberating feeling and the discipline of, you know, sleeping on the floor. [01:18:01] It was worth it for me. [01:18:02] But here's an example. [01:18:03] Like, let's say you climb a mountain. [01:18:04] Yeah. [01:18:05] Why do you do it? [01:18:05] Yeah, it's fun. [01:18:06] It's cool, whatever. [01:18:07] But you get to the top. [01:18:08] It was really hard. [01:18:09] And then you go back down. [01:18:10] And now you experience something else in your life that's hard. [01:18:13] And what do you say to yourself? [01:18:13] You're like, I climbed a mountain. [01:18:15] I think I can handle this, you know? [01:18:17] That's why.
[01:18:18] You just put a success right there, and this builds confidence. [01:18:21] You now have a landmark, a flag to compare the rest of your challenges to, you know? [01:18:26] So if you can challenge yourself to do something that's really hard and you can overcome it, then, and you can do that doing multiple things, then you'll have the confidence to know if you come to another. [01:18:38] Bridge that you think is going to be hard to cross, you'll have the confidence to do it, and you know you can do it because you've done other things and you know it'll be easier. [01:18:44] Yeah, and what's interesting is it becomes less, um, and that builds general discipline across all things. [01:18:50] Exactly. [01:18:51] And once you have the general discipline, it's no longer a choice. [01:18:54] It's who you are. [01:18:55] Right? [01:18:55] So I think there's a quote, I think Benjamin Franklin maybe, he says, you know, greatness isn't born, greatness is learned. [01:19:02] Every single day, when you're faced with a decision, you say, what would a great person do now? [01:19:07] And then you do that until the day comes where you no longer ask yourself what a great person does. [01:19:12] You just do what comes natural as being a great person. [01:19:15] And so that's kind of how you think about discipline. [01:19:17] You endure all these hardships, et cetera, and that's just like your mind, that's who you are. [01:19:21] Asks you to do something and it's hard. [01:19:23] You don't think that it's hard because you're just like, oh, easy. [01:19:24] I've done, I do this stuff all the time, you know? [01:19:26] Yeah. [01:19:27] That's fucking fascinating, man. [01:19:28] But wow, why the bean bag? [01:19:32] Why the bean bag? [01:19:32] I'm curious. [01:19:34] I wonder why the bean bag. [01:19:35] Because we, all right, so the reason is because I slept on the floor and I didn't have a bed and I would always have like family and people come visit and I needed something for that. [01:19:41] So I got this big bean bag that zips into a bed when you lay it out. [01:19:44] Okay. [01:19:45] I've slept on bean bags. [01:19:46] I used to have, I bought two really big ass bean bags in my apartment in St. Pete. [01:19:49] Crazy, man. [01:19:50] They are. [01:19:51] I used to sleep on a lot of bean bags. [01:19:52] There was a lot of people slept on those things. [01:19:55] All right, so there we go. [01:19:56] It makes a lot more sense now. [01:19:57] That's cool. [01:19:58] What are some of the other things that you've taught yourself discipline in? [01:20:01] Some of the things, well, I stopped doing it recently. [01:20:04] Well, no, it's still halfway there, but I would set my watch to a different time and wear it upside down. [01:20:09] Right? [01:20:09] So I just, every time I want to tell the time, I have to do like a little mental arithmetic real quick and like tell the time backwards. [01:20:14] And it's just little things that strengthen your mind. [01:20:16] Brush your teeth with your left hand instead of your right hand. [01:20:18] Get your brain used to doing different things, take a different route to school or work each day. [01:20:22] You know, just always expose your brain to new things. [01:20:24] You're not like stuck in a rut. [01:20:25] Wow. [01:20:26] That's interesting. [01:20:27] That's fucking awesome. [01:20:28] I love that. [01:20:29] I'm going to try some of these things. [01:20:30] Yeah, I just had so many ideas and stuff. [01:20:32] That's crazy. 
[01:20:33] Yeah, let me know if it works for you, man. [01:20:35] Yeah. [01:20:36] Let me know how that works out. [01:20:37] I will. [01:20:37] I'm going to get lost on the way to work, get fired from my job. [01:20:40] I'm like driving, trying to tell the time. [01:20:43] I'm brushing my teeth while I'm driving to work on a new route. [01:20:46] I'm like trying to do it all at once. [01:20:48] That's fucking awesome. [01:20:49] The cops pull me over. [01:20:50] They're like, why are you brushing your teeth with your left hand? [01:20:54] And where are you going? [01:20:55] I don't know where I'm going. [01:20:56] Just trying a new route to work. [01:20:59] I might try it then. [01:20:59] Wow, man. [01:21:00] Yeah, that's incredible. [01:21:01] I mean, I think. [01:21:02] If you think hard about your life, you probably subject yourself to some hardships unconsciously. [01:21:06] Yeah. [01:21:06] You know, I mean, in the common day to day things that you do, there's something that you want enough that you voluntarily put yourself through a hardship to get it. [01:21:14] And once you identify those things, you can just exercise them a little bit more. [01:21:17] And you say, well, yeah, I was going to do this thing that was hard. [01:21:20] Now I'm going to do that thing and then do five push ups every time before I do it. [01:21:23] Just whatever. [01:21:24] Small little things. [01:21:24] Just challenge yourself, you know? [01:21:26] For me, it's like, I personally, I feel like I've always tried to find what makes a good day. [01:21:32] Some days when I wake up and I go through a day and I feel fucking great, full of energy, happy talking to people, like super energetic. [01:21:39] And then the next day, I'm just pissed off, in a bad mood. [01:21:43] I feel tired, slow. [01:21:45] I want to figure out like what makes me like the first day and not like the other day. [01:21:50] So I like constantly tried to figure out like what patterns, you know, what path I took that one day and what I can replicate to make that thing happen again. [01:21:59] And I feel like I've never been able to figure it out. [01:22:01] Well, there's a great idea and there's a lot of truth behind that. [01:22:03] I think that there do exist. [01:22:05] Some general norms, which are good ideas, but besides drugs, I took a systems biology course. [01:22:13] Yeah, and it's the notion of like studying the whole body as a system, and each component is communicating with other components via, you know, DNA, transcription, et cetera, messengers and proteins all communicating. [01:22:24] You try and model these phenomena to see how things happen. [01:22:27] For example, one of the phenomena that we studied is like the sleep-wake cycle, there's a certain part in the brain, and that sleep-wake cycle gets trained and tuned to the amount of light that we're presented with, which is why we're trained to get tired at night and stay awake during the day, right. [01:22:39] And the idea is that there is just so much going on. [01:22:44] You're just not going to figure it out, man. [01:22:47] There's just like randomness. [01:22:48] There's just random hormone changes. [01:22:50] The weather changes. [01:22:51] You can't control that. [01:22:52] And for whatever reason, some bacteria or something flares up. [01:22:56] But not enough to make you sick, but just enough to make some hormonal response. [01:23:00] Stuff you can't see, just anything. [01:23:02] Exactly.
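To give a flavor of the kind of model a systems biology course might build around the sleep-wake example, here is a toy phase-oscillator sketch. This is an illustration of the general idea (a clock entrained by light), not the model from his course, and every constant in it is made up.

```python
# Toy circadian entrainment: an internal clock with a 24.5-hour natural period
# is gradually pulled into step with the 24-hour light/dark cycle.
import math

omega = 2 * math.pi / 24.5    # internal clock runs slightly slow (assumed)
K = 0.1                       # strength of the light's pull on the clock (assumed)
theta = 0.0                   # internal clock phase, in radians
dt = 0.1                      # time step, in hours

for step in range(int(24 * 30 / dt)):            # simulate 30 days
    hour = (step * dt) % 24
    light_phase = 2 * math.pi * hour / 24        # phase of the external day
    theta += dt * (omega + K * math.sin(light_phase - theta))

# After entrainment the internal clock tracks external time with a small lag.
print("clock lag (radians):", (light_phase - theta) % (2 * math.pi))
```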
[01:23:02] There's a few good rules of thumb: eat healthy and exercise and sleep. [01:23:06] But that also pisses me off, too, because there's this one guy I know who's extremely fat and overweight and doesn't take care of himself. [01:23:13] I know he doesn't get as much sleep as I do. [01:23:15] And I know he doesn't exercise, he doesn't go to the gym. [01:23:18] And he's always got 10 times more energy than I do. [01:23:20] Like, why am I tired? [01:23:22] I went to the gym 10 times this week. [01:23:23] All I ate was broccoli, and you have 10 times more energy than me. [01:23:26] Why? [01:23:27] Is it genetics? [01:23:27] Is it like, you know, it's just the drive he has to get that money. [01:23:29] It's just a drive he has to get that money. [01:23:30] It's easy. [01:23:31] It's discipline, man. === Algorithms, News, and Health Rules (15:53) === [01:23:33] I don't know. [01:23:34] Yeah, there's a lot of truth behind that. [01:23:35] I mean, there's just discipline. [01:23:37] It's just a genetic disposition. [01:23:39] Maybe. [01:23:40] Yeah. [01:23:40] Who knows, man? [01:23:41] I wanted to ask, like, what are some cool projects or things you've worked on? [01:23:46] Like, so what are some of the coolest stuff you've worked on or seen or like, or whatever, something like that? [01:23:53] Something like that. [01:23:55] Let me think about that one. [01:23:57] That's a little tricky. [01:23:58] So, can you narrow it down for me a little bit? [01:24:00] I don't know. [01:24:01] So, you were working on the 911 calling thing or whatever, you know, whatever that is, a program or whatever. [01:24:07] So, what are some of the things you've done over all these years, like really cool projects or things you guys came up with? [01:24:14] Or what are some of the things that stick out in your mind, some of the fun, the more fun, exciting things that you like? [01:24:20] Yeah, unfortunately, the majority of my research is kind of like. [01:24:24] More theoretical and philosophical. [01:24:25] So it hasn't been applied. [01:24:26] It's more like, you know, what is learning? [01:24:28] What is memory, et cetera? [01:24:29] You're going to get roasted for that question, by the way. [01:24:31] That's right. [01:24:31] So maybe it's things like you figured out or algorithms or whatever it is. [01:24:36] But in business, you got to apply. [01:24:38] Nobody's going to pay me to sit in a room and think, right? [01:24:40] Except for the government. [01:24:40] And they don't even do that much anymore, either, with grants or anything. [01:24:43] So we got a business and we have things. [01:24:45] So I can tell you about some of the stuff, some of the other things that we've done there. [01:24:49] So, in addition, something else that we're exploring right now, let's see what's cool. [01:24:56] All right, so the Department of Transportation is very interested in safety and maintenance of the roads, highways, bridges, etc. [01:25:07] So it turns out that there are some civil engineers who have just a myriad, just a plethora of all kinds of data on this. [01:25:14] They have speed bump sensors on the highways which measure the width of the axles, the speed of the cars, whether there's a trailer or not, just all kinds of things. [01:25:21] And one thing that you might have heard: there was a bridge that actually fell recently. [01:25:26] And there were a lot of deaths and injuries, and there was a crack, like a full two inch crack in the bridge a few days before that people weren't observing.
[01:25:34] So they have this big new initiative where they take drones, and the drones are flying around these bridges and just taking pictures of all these things and monitoring what the bridge looks like. [01:25:44] But now they've got hundreds and thousands of pictures, and what are they going to do? [01:25:47] Just sit there and watch these videos all day, every single day? [01:25:49] So we develop machine learning algorithms and techniques which will isolate and detect where there are [01:25:56] cracks or damage, and grade the severity of the cracks as degree one, two, three, four, five, et cetera. [01:26:01] And that can save lives, and that's a really boring thing, isn't it? [01:26:05] No, that's actually very fascinating because I work construction. [01:26:08] So that kind of hits a certain spot that made some sense to me. [01:26:13] But how do you come up with something? [01:26:14] Or how do you implement this? [01:26:15] You just code some sort of program that learns this thing? [01:26:20] Yeah, so I mean, most machine learning algorithms, they all follow the same formula. [01:26:25] You have a whole bunch of data. [01:26:27] Yeah. [01:26:27] You pass it through some, for the sake of this conversation, black box, a neural network, and it makes a prediction. [01:26:33] Yeah. [01:26:33] All right. [01:26:34] Then what you do is you take a look at this prediction and you say whether it was right or wrong, or rather, how far away it was from what you wanted it to do. [01:26:43] Then you do the most complicated thing in all of machine learning, essentially, and that's where you calculate the error. [01:26:49] There's an algorithm called backpropagation, and that finds how you change the black box such that if you present that same image or piece of information to it again, the output will be closer to what you want. [01:27:02] And that's it. [01:27:03] So you give it something, it makes a mistake, you go, eh, you made a mistake, do it a little bit better this way, present the same thing, and it'll do it better. [01:27:09] So with those images, we just show it all the images, and it doesn't know anything at first. [01:27:12] It's like a baby. [01:27:14] Just touching things and guessing, making all kinds of mistakes. [01:27:16] And we just inform it little by little. [01:27:18] Well, this was wrong, this was right. [01:27:19] And then it starts learning those features that we talked about. [01:27:22] What are the features of a crack? [01:27:23] It starts learning to extract those features that define what a crack is. [01:27:27] And now, when it's presented with an image, [01:27:28] it's able to identify cracks (there's a minimal sketch of that training loop below). [01:27:30] Man, that's so crazy to me. [01:27:31] Like it just thinks for itself. [01:27:33] It's just, I don't know. [01:27:35] That's so crazy. [01:27:35] Well, no, no. [01:27:35] That's actually a really important part, right? [01:27:37] So these things that do really cool stuff, they are super narrow. [01:27:43] Like, you know, you go to a Bank of America ATM and you put your check in and it scans it [01:27:47] for you automatically and reads it, you know, and it seems super cool. [01:27:51] That's all that thing can do. [01:27:52] That thing's not going to go outside and learn to play basketball or do your taxes, right? [01:27:55] I mean, that's the notion of transfer learning, and that's like one of the holy grails of AI that we've been exploring. [01:28:00] And we're very far away from really understanding that wholesomely.
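To make the loop he just walked through concrete, here is a minimal sketch in PyTorch (our choice of framework, since the transcript names none). The tiny network, the random stand-in images, and the labels are all illustrative assumptions, not NuSci's actual crack-detection system:

```python
# Minimal sketch of the training loop described above: predict, measure the
# error, backpropagate, nudge the black box. All data here is a stand-in.
import torch
import torch.nn as nn

# Toy "drone images": 64x64 grayscale patches, labeled crack (1) / no crack (0).
images = torch.randn(32, 1, 64, 64)
labels = torch.randint(0, 2, (32,)).float()

# The "black box": a tiny convolutional network with one score per image.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(4),
    nn.Flatten(),
    nn.Linear(8 * 16 * 16, 1),
)
loss_fn = nn.BCEWithLogitsLoss()  # "how far away was it from what you wanted?"
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(100):
    prediction = model(images).squeeze(1)  # 1. make a prediction
    loss = loss_fn(prediction, labels)     # 2. measure the error
    optimizer.zero_grad()
    loss.backward()                        # 3. backpropagation: how to change the box
    optimizer.step()                       # 4. nudge it so next time is closer
    if step % 25 == 0:
        print(f"step {step}: error {loss.item():.3f}")
```

The four numbered comments mirror his description: present the data, see how wrong the prediction was, run backpropagation to find how to change the black box, and adjust it so the same images come out closer to the desired answer next time.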
[01:28:02] So, and that's one of the things that I'd like to just sort of share with people: you know, it's okay to be afraid of AI and what it can do in the future. [01:28:12] But just remember, we're really far away from any sort of Terminator things happening. [01:28:16] And for the very far future, as far as I'm aware, anytime we're doing any research with any AI that has the potential to do scary things, you just unplug the Ethernet cable. [01:28:27] Okay? [01:28:27] Yeah. [01:28:27] So if it ever goes rampant or does something amazing, all you've got to do is unplug the computer, right? [01:28:31] But as soon as you have it plugged into the Internet, then, admittedly, I agree with the concerns that are there, right? [01:28:37] If something's self-replicating and it gets onto the Internet, that can be really dangerous, no doubt. [01:28:41] But that's why we just exercise a little bit of discipline and restraint and unplug the Ethernet whenever you're doing advanced AI research. [01:28:47] No big deal. [01:28:48] Did you hear about [01:28:51] when Facebook was creating that new AI? I think it was for customer support, supposed to talk to Facebook users and try to act as support for Facebook. [01:29:03] And it was an AI. [01:29:05] And I believe there were two different AIs. [01:29:08] Yeah. [01:29:08] And they created their own language. [01:29:09] I'm really glad you brought that up. [01:29:10] So that's, again, one of the problems in AI: sensationalism. [01:29:15] They didn't invent their own language. [01:29:16] And it was advertised that way. [01:29:19] And it was nothing even remotely that cool. [01:29:21] So. [01:29:22] Really? [01:29:22] Yeah. [01:29:23] So that's one of the problems. [01:29:25] But that's the problem: what are you going to do? [01:29:26] I mean, you're a consumer and you're supposed to trust the people giving us news. [01:29:30] And that's one of the problems. [01:29:31] See, and this genius Forrest over here, he's the one who told me they created their own language. [01:29:35] I was, like, blown away by it. [01:29:37] The news article is cool to talk about. [01:29:38] I remember seeing it. [01:29:39] All the news articles did say that. [01:29:40] Yeah. [01:29:41] So in their defense, in his defense, and everyone's, I mean, what are you supposed to do? [01:29:44] You're not a machine learning expert. [01:29:45] You're not going to go and read the papers and the reports and look into what it meant. [01:29:48] So you're saying that they did not create, two AIs did not create their own language? [01:29:52] Or define their own language, right? [01:29:53] I mean, it was made to feel like they're in there conspiring against mankind and humanity, and they advertised it that way for clickbait, you know? [01:30:00] But what actually happened then? [01:30:02] So, the details I don't exactly remember. [01:30:04] It was a little bit ago. [01:30:06] But in a nutshell, I mean, the idea was basically that they were sending signals back and forth in a certain way where it seemed as though the responses were predictable. [01:30:16] So, one sends a certain signal, and the other interprets it and responds in a way where there was structure and meaning behind the things going back and forth. [01:30:23] It's not like we [01:30:25] discovered what they were saying and they're conspiring and they have their own language. [01:30:28] They just found patterns in the signals that they're sending back and forth with each other.
[01:30:31] Just as much as the image recognition thing I described to you earlier is going to find patterns in what defines a crack. [01:30:36] They're just doing what they're programmed to do: find patterns in the sensory input. [01:30:39] That's all any intelligent system does. [01:30:42] Were they initially programmed or expected to communicate in any form with each other? [01:30:48] I don't know the details of the full experiment, so I would have to exercise some academic restraint here and not say what I don't know. [01:30:56] Either way, I mean, it's still pretty interesting to find out that they were communicating in a way that they didn't expect them to. [01:31:04] And it was AI, but who knows? [01:31:06] It could have just been clickbait headlines. [01:31:08] Yeah, for now, I would caution that that was clickbait headlines. [01:31:13] Unfortunately, for the unforeseeable future, we all have to just suffer from that because we're not all experts in everything. [01:31:20] You see something like, oh, eating a piece of chocolate or drinking a glass of wine makes you healthier. [01:31:25] You see all these studies which are based off of poor statistics. And then someone, I think, oh man, I remember who it was, like Al Roker or someone. [01:31:31] He was on some news thing. [01:31:33] It was just super embarrassing. [01:31:34] He's like, well, yeah, you know, I think what the thing is nowadays is you just find whatever scientific publication supports your own beliefs. [01:31:42] Yeah. [01:31:43] And I was like, what? [01:31:44] That is the exact opposite of what science is supposed to be. [01:31:47] You don't just cherry-pick evidence to support your own convictions. [01:31:50] It's the exact opposite. [01:31:52] Scientifically minded people change their own convictions based on the evidence. [01:31:56] You don't just go find evidence to make you feel good about what your own ideas are. [01:31:59] Right. [01:32:00] Oh, yeah, yeah. [01:32:01] So, what are you going to do when the people who are presenting the news think that way? [01:32:04] It's crazy. [01:32:05] It's unfortunate. [01:32:06] Do you think there will ever be a shift in the general population understanding this kind of stuff? [01:32:10] We're going to keep championing these ideas, advertising them to as many people as we can, and maybe people will start thinking differently. [01:32:15] I think you should be president. [01:32:17] Nathan Crock. [01:32:17] Hell yeah, hell yeah. [01:32:20] I'd vote. [01:32:20] I would never want that much responsibility. [01:32:22] Hell no. [01:32:23] But it would be good. [01:32:24] Maybe you can clone yourself. [01:32:26] I already have a clone now because I'm a... [01:32:27] wait, no, I'm not a cyborg. [01:32:28] Have you ever thought about cloning? [01:32:30] Yeah, of course, that's fun, because then I can make an army of me and take over the world. [01:32:32] Everybody wants to do that, right? [01:32:34] Right, for sure. [01:32:36] Yeah, I mean, there's lots of fun, entertaining things that could happen with cloning. [01:32:41] What about Elon Musk's neural net? [01:32:44] I forget what it's called, but it's the... [01:32:46] Basically, he talks about how, [01:32:48] on a phone, your only output is your two thumbs. [01:32:51] And with this net, I forget what the exact word for it is. [01:32:55] Yeah, I heard about it. [01:32:56] It goes over your brain or whatever, and basically you live in Facebook or you live in social media. [01:33:01] I'm super suspicious of it.
[01:33:02] It sounds like a great idea, and I am super happy. [01:33:05] What is it called again? [01:33:06] Neuralink, yeah. [01:33:06] Neuralink. [01:33:07] Neuralink. [01:33:08] No, I'm really happy that he's doing it. [01:33:09] Someone's got to start doing it, no doubt. [01:33:11] And it's a great idea, and we'll get there eventually, right? [01:33:13] Yeah. [01:33:14] But, you know, because the fundamental technology behind that is very similar to what I was sharing with you before about [01:33:19] putting ideas in someone else's brain. You have to know where to send the stimulus, you have to learn patterns in the electromagnetic, you know, waves that happen as people think, and map these to some sort of functional output. [01:33:30] Like, it's an incredibly difficult, complicated task. [01:33:33] Yeah, but we can do it. [01:33:34] We've done it all, we went to the moon, right? [01:33:36] We can do all kinds of hard things. [01:33:37] So, yeah, I like that he's doing it, but I don't think that's going to be happening anytime soon. [01:33:41] You don't seem to be, you don't strike me as a big fan of Elon Musk. [01:33:47] Um, I think he's doing really good things for the world, and I am very supportive of his missions and his ambitions. [01:33:55] I've just read a little bit about him. [01:33:57] He does some really intelligent things. [01:33:58] He surrounds himself with experts. [01:34:00] He knows what he doesn't know, and he surrounds himself with experts, and he's well informed because he's humble. [01:34:05] I respect that. [01:34:07] However, and this is the case with pretty much anything, [01:34:11] I think there are hundreds and thousands of people who can do his job better, and most of his position is just luck. [01:34:17] But that's the case with most everything in life. [01:34:19] With people who are where they are, think about what their true qualities actually are, such as your friend who's really wealthy. [01:34:24] I don't think there are any really distinct characteristics that make one person shine phenomenally more than other people. [01:34:31] There are hundreds and thousands of people with the exact same characteristics, but their life circumstances were such that they were born in Africa. [01:34:37] And for whatever reason, they just can't come here and be awesome and take over these things, right? [01:34:41] So when I look at him, I think he's great and I like what he's doing, but I'm just sort of against, I guess, the idealization that people have sort of made of him, like, oh, he's the big savior of the future kind of thing. [01:34:54] Right, right. [01:34:54] Because I think that handicaps us. [01:34:56] It makes you feel like you can't do it or something. [01:34:58] He's just a guy who works hard, surrounds himself with experts, and got really lucky. [01:35:02] And anybody else can do that too. [01:35:04] Yeah. [01:35:05] But I still think what he's doing is great and I respect him. [01:35:07] You know, I have nothing bad to say. [01:35:08] I just, yeah. [01:35:10] Yeah. [01:35:11] Do you ever think that you'll be as world-renowned as Elon Musk? [01:35:17] I don't think so. [01:35:20] Yeah, no, I like, I don't know. [01:35:24] I just like simple things. [01:35:25] Like sometimes I just think about going out to just a little farm or a small little village somewhere and just helping people, teaching math in some school and just laughing with people and loving and feeling life a little bit.
[01:35:35] But then sometimes I wake up and I'm just like, I want to go climb a mountain and conquer the world, right? [01:35:39] So it's a tough balance. [01:35:40] We all have to find what's right for us. [01:35:42] I'm not 100% driven and just ambitious to step over people and change the world. [01:35:47] That's not me. [01:35:48] I'm a little more of just a, you know, like Trigun, like Vash the Stampede. [01:35:51] That's my guy. [01:35:52] That's my hero. [01:35:52] That's the kind of guy I want to be. [01:35:53] He just wants to live a happy life and make everything better. [01:35:55] And that's kind of what I want to do, you know? [01:35:57] So, I don't know. [01:35:58] If it happens, that'd be cool. [01:36:00] I'll embrace it and try and help the world in whatever way. [01:36:02] But I'm not really working for anything like that, you know? [01:36:05] Yeah. [01:36:05] It's not my ambition. [01:36:06] That's cool, man. [01:36:07] That's really interesting. [01:36:08] What do you guys want to do? [01:36:10] That's a good question. [01:36:11] That's a good question. [01:36:12] I don't know. I've always kind of wanted to do the same thing. [01:36:15] My only skill has ever been kind of like making videos with my friends when I was younger in high school. [01:36:22] All I did was run around skateboarding. [01:36:23] I ran around skateboarding with a video camera. [01:36:25] I'm astounded with what you've done here, man. [01:36:26] It's really impressive. [01:36:27] It impresses me even more because I've known Danny since back then. [01:36:31] I knew him. [01:36:32] I'm impressed that he did this. [01:36:33] I'm one of the autistic people who just stuck with the same thing since that's all I've ever known. [01:36:37] I was on that path and I had so much momentum. [01:36:38] It's like, why go in another direction? [01:36:41] That's beautiful, man. [01:36:42] It's working for you. [01:36:43] Right. [01:36:43] He found the part he was good at. [01:36:45] He wasn't always the best [01:36:46] BMX rider or skateboarder, but he was the best at filming everybody and putting it together and editing the videos together. [01:36:53] And it just, yeah, it worked out good. [01:36:55] And it's transitioned from making it look cool to actually providing valuable content. [01:37:00] Very good. [01:37:01] Yeah, because you've got to live in society, right? [01:37:02] So you've got to find some way to make what you do valuable to others. [01:37:05] But I really like that you mentioned this right at the end, because I struggle with this sometimes, because I have friends out at Facebook and Google and they're like, Nathan, you've got to come out here and you'll be challenged and you'll grow so much more and you'll really thrive out here. [01:37:18] And I have absolutely no doubt. [01:37:20] I mean, I've been out there. [01:37:20] I went to the Google Brain Project. [01:37:22] I've met some of their astounding people. [01:37:24] The Google Brain Project? [01:37:25] Yeah. [01:37:25] So it's a research group at Google that does AI. [01:37:28] And I've gone there and met with them, talked about my research, their research. [01:37:31] And these people are astounding. [01:37:33] They think in ways that are amazing. [01:37:35] And I love it. [01:37:37] But I think the notion of success is a little bit conflated in our world with just status and prestige, right? [01:37:46] Because for me, success is just watching others grow and [01:37:52] succeed. [01:37:52] I like to teach. [01:37:53] I like to share what I'm passionate about.
[01:37:55] I like to just help others. [01:37:57] And so for me, if I went to a small little village and just saw some people's lives flourish and watched them be really happy, that'd be success for me. [01:38:04] And so that's why I like what you're doing: whether you learn one thing or many things, you're happy with what you're doing. [01:38:11] You're making a difference for some people. [01:38:12] And I mean, is there any other way to really define success other than that? [01:38:15] Yeah. [01:38:16] Well, I mean, it also has a lot to do with your early life too. [01:38:19] I mean, like how you developed as a young kid, like [01:38:25] what you were put up against growing up, right? [01:38:27] I mean, [01:38:29] did you grow up with siblings or anything like that? [01:38:32] Or were you an only child? [01:38:33] I had two siblings; I was the youngest. [01:38:35] Do you think that had anything to do with how you turned out, like with what you wanted to do, like learning science and all this stuff, like being better than your siblings, competitiveness? [01:38:47] Like, a lot of times, how people grow up, I feel like that has a lot to do with how they end up or the path they take. [01:38:53] There's a lot of psychological research behind children and where they fit in. [01:38:59] And, you know, there's a correlation between being the middle child and being the troublemaker, being the youngest and needing attention, and being the oldest and [01:39:06] being somewhat altruistic sometimes, because you have to help the younger siblings, right? [01:39:10] So, there are all kinds of little interesting correlations. [01:39:12] None of them are absolute or anything, but I would say that that evidence is enough to say, absolutely, your upbringing, the number of siblings you have, and your position among them are going to significantly modify who you become [01:39:24] and what you determine is success, right? === Honored to Share This Love (01:09) === [01:39:27] Absolutely. [01:39:27] Absolutely. [01:39:29] Well, anything else we should cover? [01:39:30] I mean, we've been on here for like an hour, almost two hours now. [01:39:33] Almost, yeah. [01:39:34] Probably about close to it. [01:39:35] This was good, man. [01:39:36] Yeah, thank you to everyone who stuck through it with us. [01:39:38] This is one of the best, uh, discussions, I think, we've ever had. [01:39:41] Yeah, I can't wait to rewatch this a couple times. [01:39:44] I'm gonna watch it a hundred times. [01:39:45] Definitely, man. [01:39:47] I'm really honored and grateful for the opportunity to share what I love. [01:39:52] I'm very passionate about this stuff, and it's nice to have a platform and a microphone to share what I love. [01:39:58] Absolutely, yeah, man. [01:39:59] I would love to have you on here again, you know, eventually, you know, next time you're down here. [01:40:03] Yeah, share with us some cool shit you're working on down the line or some new shit that's going on. [01:40:08] 2019 is lining up to be a very busy year for us. [01:40:10] We have a lot of contracts on the line. So awesome, really? [01:40:12] Yeah. [01:40:12] Yeah, we'll take that back. [01:40:13] And everybody, NuSci Labs. [01:40:15] NuSci is a company in Tallahassee.
[01:40:17] If you're looking for any data science or machine learning services, if you'd like to get insight as a service, as we call it, please reach out to nathan at nuSci.co, or Google us and get in contact with our guys there. [01:40:29] We'd love to offer some services. [01:40:30] Awesome. [01:40:30] That's awesome, man. [01:40:31] Cool. [01:40:32] Well, thank you, brother. [01:40:33] Yeah, my pleasure, man. [01:40:34] It's good to call you a friend, Nathan. [01:40:35] All right, Nathan. [01:40:36] Thanks, buddies.