The Joe Rogan Experience - Joe Rogan Experience #2473 - Bill Thompson Aired: 2026-03-25 Duration: 02:20:52 === Living Traditionally with a Knife (10:26) === [00:00:01] Joe Rogan podcast, check it out! [00:00:03] The Joe Rogan experience. [00:00:06] Train by day, Joe Rogan podcast by night, all day! [00:00:12] What's up, Bill? [00:00:13] How you doing, Joe? [00:00:15] This might be one of the coolest things anybody's ever given me. [00:00:18] So you gave me this knife. [00:00:21] Explain all this. [00:00:22] All right, so, I mean, there's a larger explanatory reason behind this. [00:00:26] My brother and I grew up, my father died when I was five. [00:00:30] My brother and I grew up doing these things called rendezvous. [00:00:34] Have you ever heard of them? [00:00:36] In what way? [00:00:36] What is a rendezvous? [00:00:37] So there you go. [00:00:38] So what a rendezvous is, is it's not, you know, you go to those like, I don't even know what they're called, but people do like reenactments. [00:00:46] Oh, okay. [00:00:46] Like Civil War reenactments? [00:00:48] It's not like that. [00:00:49] So that's probably the closest approximation to what it is. [00:00:53] You get invited to them, or these days they're easier to get to, but my stepfather, the guy my mother remarried, brought us to them. [00:01:00] All you do is camp, but you're only allowed to camp, and no one comes to the camp, or sometimes they might have people at the end. [00:01:05] But while you're in the camp, everything in the camp has to be 1840 or prior. [00:01:09] So there can be no modern appurtenances, nothing like a refrigerator or nothing like that. [00:01:14] 1840? [00:01:15] Why? [00:01:15] 1840. [00:01:16] At the end of the fur trapping. [00:01:18] That was considered like Jeremiah Johnson's time, like peak fur trapping.
[00:01:22] So there's people, you know, they dress like either revolutionary, like American revolutionaries, or they dress like mountain men or they dress like Indians. [00:01:30] How'd you guys dress? [00:01:31] Mountain men. [00:01:32] So while we're there, you learn all kinds of stuff while you're reenacting. [00:01:36] Like I learned how to brain tan hides. [00:01:39] I learned how to do traditional archery. [00:01:43] Stuff like that. [00:01:43] So anyway, this knife was a knife I had actually started working on with my brother a while ago. [00:01:48] I do more of like the brain tanning tomahawk thing. [00:01:51] And when you're saying brain tanning, you talk about using brains to tan animal hides. [00:01:55] Right, using animal brains. [00:01:57] What do brains do? [00:01:58] Why do brains do it? [00:01:59] It softens the leather in a natural way. [00:02:01] And what's cool about it is every animal, no matter what animal you kill, has the exact amount of brain needed in order to tan the hide. [00:02:08] So you don't need any additional, like people use egg yolks or mayonnaise or something like that. [00:02:13] All you do is you take the brain out of the cavity, you grind it up, you mix it into some water, and then after you've cleaned the leather and you've scraped it clean, you stretch it. [00:02:23] I usually use like a dull shovel. [00:02:25] You stretch it over the dull shovel and then you soak it in the brain water mixture. [00:02:30] And then you just keep repeating that pattern and the leather gets like a really nice soft feel to it. [00:02:37] What is it about the brain? [00:02:38] Is it the fat? [00:02:39] It breaks down the leather. [00:02:41] I'm not sure if it's the fat or not. [00:02:42] I haven't gotten that deep into it, but it breaks down the leather and just makes it feel really soft, really nice. [00:02:47] So anyway, this knife here, I started, I killed that bear.
[00:02:51] So the jaw is made out of two bear jaws, or out of one bear jaw split in half. [00:02:57] So that was a bear I killed in Canada in 2017. [00:03:01] It was my biggest black bear. [00:03:04] And so we split the jaw, put that together. [00:03:08] It's Irish linen threading. [00:03:09] Then that's a knife that my brother had picked up that was from 1860. [00:03:13] It was totally rusted. [00:03:14] We had to grind it back or he had to grind it back down. [00:03:17] And then the sheath is traditional, like, you know, you could, the cool thing about doing rendezvous and the cool thing about this is you could have a DeLorean and drop that in 1840 and somebody would pick it up and think it was made yesterday. [00:03:31] And so everything on there has been done traditionally. The quilling on the beadwork is made from porcupine quills. [00:03:38] The backing is buffalo brain tan. [00:03:42] And then the front is beaver hide or a beaver tail. [00:03:45] I'm sorry. [00:03:47] And then the sides are horse and turkey hair hanging off of it. [00:03:51] And these are bear teeth? [00:03:53] And those are bear teeth, yep. [00:03:55] From the same bear. [00:03:56] So when I was thinking about what I was, because I wanted to give you something for inviting me on because it's still a shock to me that you did it. [00:04:02] Even though we've been talking for so long, I just never imagined a scenario where you'd want to have me on here. [00:04:07] Well, you're an interesting dude. [00:04:08] I thought, what could I give this guy that, you know, money or people or whatever couldn't get you? [00:04:14] And so I thought this is the right thing to do. [00:04:16] So it went from a me project to a you project. [00:04:18] And my brother Aaron helped me out with it tremendously. [00:04:23] So how'd you find this knife from the 1860s? [00:04:25] Well, he found it. [00:04:26] My brother is even more esoteric and odd than I am, believe it or not.
[00:04:31] And he collects this kind of stuff. [00:04:34] I mean, the guy who dated it said 1860 to 1890 is what they figured. [00:04:40] And you can tell by the way that like around the hilt and the way that it's the pitting on it and stuff like that and the way that it was made that it fits that era. [00:04:49] I mean, it could have been somebody redid it in 1900, but it's definitely that old, the type of steel and the way that it was worked and the way that it is around the hilt around the bottom there. [00:04:59] Wow. [00:05:00] And so it's at least, you know, 130, 140, but most likely 160, 170 years. [00:05:07] It actually fits my hand perfect. [00:05:08] Yeah, so that's also something my brother and I talked about, about how long it was going to be. [00:05:13] And we made some educated guesses and put it all together. [00:05:16] So yeah, I mean, like I said, not something you can just go pick up somewhere or something that will, you know, hopefully mean something. [00:05:22] Not saying it's practical. [00:05:24] Like it's not something you'd be gutting an elk out with. [00:05:28] Well, if we get attacked by zombies in the studio, it's a good thing to have on the desk. [00:05:32] Yeah, I mean, if you're going to make a last stand, you know, that's a pretty good, that's a pretty good knife to make your last stand with. [00:05:38] It's a good way to go out. [00:05:39] Yeah, exactly. [00:05:40] That's awesome, man. [00:05:41] Yeah, so the rendezvous, we did those from... [00:05:44] How long did they last? [00:05:46] Uh... [00:05:46] They vary from a week and then some go up to three weeks. [00:05:49] And what do you do for food while you're out there? [00:05:52] So inside of your lodge. [00:05:53] So there's two types of rendezvous. [00:05:55] At most rendezvous inside of your lodge, you can have a cooler as long as it doesn't leave the lodge. [00:06:01] So I have like a 20-foot teepee that I take to these things. 
[00:06:04] And inside of my teepee, you can have a cooler and some modern appurtenances. [00:06:10] Did they have any kind of coolers in the 1800s? [00:06:12] I mean, they had ice boxes and like steel ice boxes and that type of thing, but nothing like we have today. [00:06:19] You know, stuff was getting dug out, buried in the ground, or put into the ground, like cool areas of the ground or dugouts. [00:06:24] And they dried everything. [00:06:26] So pemmican would have been the, you know, everyday thing to eat. [00:06:30] That's just dried. [00:06:31] So did you bring your own food or did you have to hunt for food? [00:06:34] So you bring your own food, but there are other rendezvous that are kind of invite only. [00:06:38] And I don't even think a lot of people who do rendezvous know about these, but there's ones that I think they're called, I think I might be speaking out of school. [00:06:44] Somebody might send me an email after this, but I'm going to talk about it anyway because I never got read the Riot Act. [00:06:49] They're called juried. [00:06:50] I think they call them juried southerns. [00:06:51] And I've only been to one of those. [00:06:53] And that's where everything in the camp has to be pre-1840. [00:06:56] And you meet down in a parking lot, you put everything on the back of a mule. [00:07:00] When I did mine, it was up in the, I think it was the Bighorns. [00:07:04] So, you know, you talk to a rancher, get everything packed up. [00:07:08] You go into the back of the Bighorns, and everything in camp has to be pre-1840, as close as it can get. [00:07:14] They'll even look at your stitching and say, oh, that was sewn with a sewing machine. [00:07:18] You got to take that off. [00:07:19] And it's always these weird, like, eccentric history teachers that run them, like guys who, you know, teach history at Berkeley or something like that or other places. [00:07:29] They just really enjoy living like this.
[00:07:30] And at those ones, if they're in season, you can hunt whatever's in season. [00:07:34] You're hunting with traditional archery. [00:07:35] And it's really good for kids. [00:07:37] Like, the internet wasn't a problem as much when I was a kid. [00:07:40] I was certainly into computers. [00:07:41] I have been since I was a child. [00:07:43] But you could just detach. [00:07:45] Everyone's running around crazy. [00:07:47] You're sitting around the campfire at night. [00:07:49] People are singing songs with the guitar. [00:07:51] You're learning how to do things like this. [00:07:53] You're learning how to brain tan. [00:07:54] You're learning how to live traditionally. [00:07:56] And it's an eccentric cult, kind of. [00:07:59] It's not a cult. [00:08:00] It's an eccentric group of people. [00:08:01] It's a lot of fun. [00:08:02] It's a real community. [00:08:04] People take it very seriously. [00:08:07] There's more advertising surrounding it now than there used to be because numbers are kind of dwindling. [00:08:12] But I did my last one last year with my brother. [00:08:14] So if you go on my Instagram, there's a picture of my brother, my son, and I doing, I think, our second rendezvous together. [00:08:20] And we're just dressed like, you know, I've actually got an awesome war shirt. [00:08:24] I can show you the picture. [00:08:25] I've got an awesome war shirt that a friend of mine went to war with. [00:08:30] He was half Native American. [00:08:31] His grandfather was Ojibway or something, Chippewa, something like that. [00:08:37] And he was, I don't remember what his role was. [00:08:40] But anyway, we deployed to Iraq together and his grandpa made me this war shirt. [00:08:45] Oh, there you found it. [00:08:47] Jamie, he pulled it up. [00:08:48] That's my lodge. [00:08:51] How much do you enjoy a shower after you get out of here? [00:08:55] I mean, as long as you keep, you know, they have showers in camp.
[00:09:00] They've got a showering area where it's just like pallets. [00:09:03] That's the inside of my lodge. [00:09:05] So there's a cooler at this one. [00:09:06] This is not a juried rendezvous. [00:09:10] And so you can shower while you're in. [00:09:12] Some of them, they call them hooters. [00:09:14] There'll be like a latrine and a shower area in camp. [00:09:16] But also, like, some of them I don't, I don't do it at all. [00:09:20] This is wild. [00:09:21] And so there's no reenactment. [00:09:22] Like, there's not like civilians walking around. [00:09:24] It's not like Renaissance fairs. [00:09:25] Yeah, exactly. [00:09:26] It's just more like I want to act like it's 1840 for a couple of weeks and not look at my phone one time and not worry about the news. [00:09:33] It's amazing after a week here, you really forget about the world and you like don't even know you're supposed to be stressed out about things. [00:09:40] You're just out there doing your thing for a couple of weeks. [00:09:43] And you just cook over open fire. [00:09:45] Everything gets done traditionally that way. [00:09:47] And did you bring your own meat in there? [00:09:48] Yeah, you bring your own meat and stuff in the cooler. [00:09:51] And then there's also cooking classes where they teach you like all the recipes to do with like a Dutch oven, like an old cast iron oven. [00:09:59] And they do gambling at nights. [00:10:02] So you'll walk into like a huge, they call them marquees, but it's like a huge 100-foot square lodge. [00:10:07] There'll be three gambling tables in there, girls in like the low-cut shirts and dealing cards and smoking cigars and just having an amazing time. [00:10:14] And there are people, you go by camp names while you're in there. [00:10:17] Nobody uses their real name. [00:10:18] Well, some people use their real name. [00:10:20] I'd say 60% of people don't use their real names. [00:10:23] What was your camp name? [00:10:25] This is embarrassing.
[00:10:27] It should be. === Camp Names and Coming of Age (06:04) === [00:10:28] Yeah. [00:10:28] So I got my camp name. [00:10:30] I got christened with my camp name in the Bighorns when I was 14 or 13. [00:10:37] And it was Talks A Lot. [00:10:38] Talks A Lot. [00:10:39] Yeah. [00:10:39] And in Sioux, it was pronounced Iaota. [00:10:42] Just because you talk a lot? [00:10:43] When I was a kid, I talked a lot. [00:10:45] Actually, as an adult, I don't talk that much unless I know you. [00:10:50] But as a kid, I would never shut up. [00:10:51] I had really bad ADHD. [00:10:53] They kind of diagnosed me with having some low-level version of Asperger's. [00:10:57] And I was a rapscallion in class, just never showed up, never listened, never did anything. [00:11:05] Those are the people that are the most fun. [00:11:06] Well, they didn't enjoy me in high school or in college. [00:11:09] That probably would have been your friend. [00:11:11] But yeah, they called me Iaota. [00:11:15] And we got christened. [00:11:17] And it was a, you know, it's one of the things we're kind of missing in culture today or something that I'm trying to reinvigorate, especially with my son and with other young men that I run into, is kind of like coming-of-age rites. [00:11:29] Yeah. [00:11:29] Something to say, you're a man, and I'm going to start treating you like a man from this moment forward. [00:11:34] Like, you know, what does that, there should be structure to that. [00:11:37] You know, we're tribal and it's important to me. [00:11:41] So I think that is really something that's missing from society. [00:11:45] I think that I used to think it was silly when I was young. [00:11:48] And then as I got older, I realized, well, I went through that. [00:11:51] I became a black belt and I started fighting. [00:11:54] And you had a group of men telling you, you're at this level, we're going to treat you like that. [00:11:57] And if you fall from grace, we're going to remind you right away.
[00:12:01] And we just don't do that with young men. [00:12:03] And we have a society now where young men act like young men until they're 45 or 50 or 60. [00:12:08] And sometimes never stop. [00:12:09] Yeah. [00:12:10] And, you know, women, nature imposes itself on women. [00:12:14] They become fertile. [00:12:15] They're able to have babies. [00:12:17] And they've got to seek security or find a husband or a really good job that will supplement whatever a husband would provide. [00:12:24] And they got to start acting like a woman. [00:12:26] Whereas men can sit in a basement and it becomes very dangerous. [00:12:30] Especially men that never have children. [00:12:31] Yeah. [00:12:32] And they're perpetual children. [00:12:34] Yeah. [00:12:34] And if you don't impose nature on yourself by undergoing those types of rites and understanding what it means to become a man, nature will impose itself on you by either A, you're never going to have children and therefore you're dead forever, or B, it will kill you because you're fat and in your mom's basement, you get diabetes, get a foot chopped off, and you're 35. [00:12:51] And, you know, we just don't tell men. [00:12:54] We don't have a... The military did it for me. [00:12:56] I had really put off responsibility or seeking meaning or any of those things until I was in the military. [00:13:04] And like I said, my father died when I was five. [00:13:06] So I really had no central male authority until I was about 13 or 14 when I met this guy, Steve. [00:13:13] And he kind of initiated some of those rites for me and held me to account. [00:13:18] But it was really the military, which was a turning point for me where there was a standard and I was expected to hold it. [00:13:26] I think there's a reason why most ancient cultures, a lot of ancient religions, have these rites of passage where you are now officially a man. [00:13:35] Yeah. [00:13:35] Officially, you know, you're responsible.
[00:13:39] You have to think of yourself as a different thing now. [00:13:41] Whereas if you leave it up to your own decision, men sort of dwindle into this perpetual state of childhood. [00:13:49] Yep. [00:13:50] And it's not about you anymore. [00:13:51] It's about other people. [00:13:52] Like that, for me, having children, I've got four kids. [00:13:57] Really, you know, the military was kind of the first inkling of responsibility. [00:14:01] But then having children and realizing this isn't about me at all. [00:14:04] Right. [00:14:04] And I need to be willing to break my back for these people who depend on me. [00:14:08] There's this weird primal feeling that you're responsible for these like very vulnerable little people that you love more than life itself. [00:14:17] It just changes everything. [00:14:18] It just kicks you into gear. [00:14:20] But for some people, it doesn't. [00:14:22] Some people that are so stuck in that perpetual childhood thing, they just wind up deciding it's too much of a drag and they get divorced. [00:14:30] And then they fuck up the kids. [00:14:31] Yeah. [00:14:33] God, we have so many rabbit holes we could go down on this. [00:14:35] But I mean, it was, you know, growing up in the 80s and the early 90s, it was really like a divorce culture. [00:14:44] And I obviously understand that if you're in a bad relationship or an abusive relationship or, you know, certainly there's a threshold where marriage should dissolve. [00:14:52] No question. [00:14:54] But I kind of feel like the central thrust of a lot of culture at that time was about like divorce or not getting married or, you know, discovering yourself and that type of thing, which in some ways is good. [00:15:04] There's goodness there. [00:15:05] But when it becomes a central thrust or a central narrative and divorce becomes very easy or it's happening everywhere, it's super normalized. [00:15:13] And it's normalized. [00:15:13] It's super destructive. 
[00:15:14] Children are the ones who suffer the most from it. [00:15:16] And I think the data is clear on that. [00:15:18] When you look at single parent homes or no parent homes or being raised without an authority. [00:15:24] Or an abusive step person. [00:15:25] Or an abusive. [00:15:26] And that is, you know, when you look up the stats on that, like remarriage and having a new family, that becomes the single most likely vector of abuse in a young child's life, is that new person, right? [00:15:38] Because now they're raising someone else's kid or whatever. [00:15:41] I mean, that's in every old movie, the evil stepmother. [00:15:45] Yeah. [00:15:45] You know? [00:15:46] Yeah. [00:15:46] Or evil stepfather. [00:15:47] But in the old movies, it's always the stepmother that abuses the girl. [00:15:52] Yes. [00:15:53] And so, you know, I kind of resented that part of that time, that culture. I shouldn't say when I was a child. [00:16:02] I should say as I got older, because I was in a single mom home. [00:16:04] And the guy that my mother remarried right after my father died was abusive. [00:16:10] And, you know, he really got hard on my younger brother. [00:16:13] And, you know, my mother moved us out almost immediately. [00:16:16] But when I re-examined that time, it really was, you know, I don't know how to describe it, but, you know, there are no rules when it comes to relationships and family. [00:16:27] And every family is special and particular in its own way, and they all need to be venerated. === Government Incentives and Abuse (11:51) === [00:16:32] And there's, of course, some truth to that. [00:16:33] We shouldn't deride someone because they come from a broken family, but we shouldn't elevate it like it's at the same level as a unified family. [00:16:42] And that's a tricky line to walk. [00:16:44] But also, the people who are making those movies in that culture came from the 50s and 60s where divorce was just not in the cards.
[00:16:51] And so that was, you know, Hooke's law: as you bend any object, it wants to return back to its natural state. [00:16:58] And Hooke's law kind of played out there, where nobody could get divorced in the '30s, '40s, '50s, and '60s. [00:17:04] And then you had the baby boomers who kind of culturally said, you know, actually, it's not as bad as we think, but then it overcorrected and it became kind of part of that cultural zeitgeist. [00:17:15] That's kind of what humans do, right? [00:17:17] We always overcorrect. [00:17:18] Yeah, we do. [00:17:19] Yeah. [00:17:19] We go in one direction until we realize it's destructive, and then we overcorrect until we realize that's destructive. [00:17:26] Yeah. [00:17:27] This episode is brought to you by Ketone IQ. [00:17:29] The demands on my time, energy, and focus are immense. [00:17:33] So when I need my brain to lock in for hours and hours and fire at its fastest, most alert state, I'm taking Ketone IQ. [00:17:42] It's an energy shot powered by this little miracle molecule that your body already naturally makes, and your brain especially loves ketones. [00:17:52] I've been talking about ketones for over a decade and this company has finally figured out how to put them in a bottle. [00:17:57] When I take Ketone IQ, I drop right into a state of laser-like focus and sustained mental clarity. [00:18:04] Whether I'm podcasting, training in the gym, or just want to show up locked in when it matters, the difference is night and day with Ketone IQ. [00:18:13] Visit ketone.com/Rogan for 30% off your subscription order or find Ketone IQ at Target stores nationwide in the protein and electrolyte aisle and get your first shot free. [00:18:29] Plus, they have a 60-day money-back guarantee. [00:18:33] That's how confident they are that you're going to love the increased focus you get from Ketone IQ. [00:18:38] And I would say that's the, and not, this isn't a political thing. [00:18:41] This is just the reality of it.
[00:18:43] That's mostly what makes me conservative in nature is I agree systems need to change, but they need to change slowly and pragmatically. [00:18:51] Because, you know, any social scientist worth their salt will know a social experiment almost never has the outcome that we thought it was going to have. [00:19:01] In other words, we thought doing something to society would form society this way, but it almost has the inverse, the anti-pattern like we talked about before, and almost ends up propagating itself. [00:19:12] And so that makes me, I'm still a proponent for change, but it should be slow and thought out and done in pockets first. [00:19:22] Kind of, you know, federalism. [00:19:23] Let's do little changes here. [00:19:25] Let's let California be crazy for a while and see how that works out for them. [00:19:29] But let's not nationalize the craziness. [00:19:31] Let's learn from what they learned there. [00:19:33] And there'll be goodness, you know, the hipsters make great coffee and cool art. [00:19:38] And let's take those parts. [00:19:39] But how about the rampant homelessness? [00:19:42] Let's find out what caused that and solve for that. [00:19:45] And, you know, that was kind of the founders' intent with federalism. [00:19:49] They're really federalist-minded, state-minded. [00:19:51] And there's, you know, even for that being 250 years ago, there's a profound amount of wisdom in that. [00:19:57] Like, let's change things slowly and let social experiments take place and adopt the best parts of those things and then integrate them to the culture overall as we move along. [00:20:06] But, you know, let's not throw the baby out with the bathwater. [00:20:09] Yeah, I think in this country, one of the primary problems that people have is a profound lack of respect for discipline and how important discipline is for your life. [00:20:20] And discipline is associated with conservatism.
[00:20:24] And because of that, like a lot of people think that I'm, I don't think I'm anything. [00:20:29] I think I have politically or ideologically, I have a lot of everything in me. [00:20:35] I don't think I identify with one side or another. [00:20:37] But if there's one thing that I agree with conservative people on, conservative people lean more towards the importance of discipline. [00:20:44] Hard work, discipline, don't complain, get things done, deal with the hand that you've been dealt, and just sort it out and get to work. [00:20:54] Don't cry. [00:20:55] Don't look for other people to save you. [00:20:56] They're not going to. [00:20:58] And this is not something that's celebrated in society. [00:21:02] It's thought of as a cruelty that if you say that you need discipline, that you're not treating these people that are victims of circumstance with the proper respect or with the proper empathy. [00:21:14] And I think a certain amount of empathy is probably not so good for you at a certain point in time. [00:21:19] There comes a point in time where you're letting people wallow in their bullshit and just make excuses for why they're not getting anything done. [00:21:25] And in that sense, I think California is, that is a giant part of what's wrong with California. [00:21:31] What's wrong with California when it comes to crime? [00:21:33] What's wrong with California? [00:21:35] You know, the way they address crime and the way they address homelessness and all these issues that they have, they don't put their foot down. [00:21:42] And at a certain point in time, you've got to realize what Gad Saad calls suicidal empathy. [00:21:48] Society can suffer from suicidal empathy. [00:21:50] And at a certain point in time, you've got to enforce rules and you've got to make it so that people have to get their shit together. [00:21:56] Yeah.
[00:21:56] And that suicidal empathy becomes a way for the person who's imposing it on someone else to feel good about themselves, which makes it even trickier and even more insidious because they're feeling good from the weaponization of other people's lot in life. [00:22:14] And the thing about that is none of the rules that you're going to impose, especially as a legislator or as somebody in a think tank, you'll never feel the repercussions of them. [00:22:24] You'll never have to actually deal with it day to day. [00:22:26] You're just imposing it on someone else and saying, I better understand the structure of reality and the fabric of the world. [00:22:33] And you can't help but be this way. [00:22:35] It's the system that's done this to you. [00:22:36] So let me give you a pittance that I'm going to take from someone else. [00:22:40] And that makes me benevolent. [00:22:42] I get to feel good about that. [00:22:44] That's a giant part of government, for sure. [00:22:46] That's a giant part of what's wrong with liberal governments. [00:22:49] Liberal governments should get paid based on whether or not the city does better or worse financially than when they were in office. [00:23:01] If their policies lead to greater domestic production of goods and services and GDP does better and everything does better, then you should get paid more. [00:23:12] If there are more real estate sales, more people are making more money, median income rises, less homeless people, you should get paid more. [00:23:19] And you should get paid less if homelessness goes up, if crime goes up, if there's more destruction, if there's more, you know, assaults and home invasions, you should get paid less. [00:23:30] You're doing a shitty job. [00:23:32] And if you did that, I think they would impose laws that made it safer and healthier and made it better for society.
[00:23:40] Yeah, and then they would just inevitably change the ways that we track and measure those things and pay themselves more. [00:23:45] Well, they shouldn't have the opportunity to do that. [00:23:47] Then you need some sort of an oversight that's going to be cynical. [00:23:50] You're right, though. [00:23:51] You're right to be cynical because that's what they do about everything. [00:23:54] Someone was explaining to me yesterday that one of the problems with cleaning up fraud is that fraud is responsible for a giant percentage of GDP. [00:24:05] And if you have hundreds of billions of dollars of fraud in this country and you eliminated that, you actually lower GDP because you actually lower the amount of money that's in circulation. [00:24:19] That's interesting. [00:24:20] I've never thought about that before. [00:24:21] He was explaining it to me, and I was like, oh my God, that is crazy that a giant percentage of our GDP is fraud. [00:24:28] And if that was somehow or another eliminated, like one of the things that they do when they report job growth, like, they increase GDP. [00:24:37] We've added 200,000 jobs to the market. [00:24:41] Well, what are those jobs? [00:24:42] Like, what are those jobs? [00:24:43] Are these government jobs? [00:24:45] Because the government is a giant percentage of our GDP, government jobs. [00:24:49] It's way bigger than it should be. [00:24:51] Way bigger. [00:24:51] And those jobs, a lot of them, are bullshit and waste, a lot of them. [00:24:56] Yeah. [00:24:56] You know, and that was some of the stuff that was uncovered during DOGE, you know, the limited amount of access that DOGE had to it, just the beginning of it, where you got to see the curtain pulled back and get to see exposure of so many of these fraudulent, supposedly charitable organizations that were really just money laundering. [00:25:15] They're really just funneling money into these people's hands, like the homeless thing in California.
[00:25:21] Oh, my goodness. [00:25:22] It's a bonkers situation where they've spent $24 billion. [00:25:27] They cannot track it. [00:25:28] They've tried to audit it. [00:25:30] The government has vetoed these audits and they have no idea where that $24 billion went, and yet homelessness went up. [00:25:39] But you've got a giant machine that is this homeless establishment, this homeless industrial complex, that is having money funneled into it. [00:25:49] And that actually aids the GDP, which is kind of crazy. [00:25:53] Yeah, I mean, it was one of the things. [00:25:55] My last three years in the military, I was advising a colonel and a two-star general, and they were in charge of all of the offensive cyber development, the ethical hacking. [00:26:08] I was their technical advisor. [00:26:11] And one of the things I kind of learned about government at that point was these systems have their own incentive. [00:26:19] And the incentive is not the output of their purported mission. [00:26:22] The incentive is the growing of the organization and the execution of budget. [00:26:27] So while they're in there, you know, I've never seen a field-grade officer get dressed down more than when he didn't spend all of the money that he was budgeted for that year. [00:26:35] Isn't that crazy? [00:26:36] He would go to the Pentagon and they'd be like, well, you didn't execute $300 million of OCO, of Overseas Contingency Operations funds, here. [00:26:44] And they would dress him down for an hour. [00:26:46] And what people don't understand is if you don't spend that money, your budget for the next year will be lower because there's no need to have a higher budget. [00:26:55] Instead of tying it to mission to say, did you achieve your mission objectives? [00:26:58] Right. [00:26:59] We started the year agreeing, from the president's framework, the NIPF, the National Intelligence Priorities Framework. [00:27:05] We wanted to achieve these effects.
[00:27:07] What you would want to hear is, we achieved them and we saved 25%. [00:27:11] But instead, it's, we achieved them, but we didn't execute all of this money. [00:27:14] Well, you're fired. [00:27:15] And I literally have seen that happen. [00:27:17] I've literally seen that happen. [00:27:19] And that kind of... [00:27:20] What a sick society. [00:27:21] Yeah. [00:27:21] Yeah, and that kind of shifted my thinking, in that these systems have their own incentive to exist and to grow, because those guys that were holding that general officer's or that O-6's, that colonel's, feet to the fire, they also have an incentive, because they were part of that trickle-down. [00:27:38] And they've got bureaucracy that surrounds them. [00:27:40] And if he didn't execute it, that means they didn't execute it. [00:27:43] And that means they have to go to whomever. [00:27:45] This was during the Biden administration. [00:27:47] I believe Hegseth, for everything we could say, has actually tightened this up quite a bit. [00:27:51] And he's kind of overhauled the way development works, especially on the offensive cyber side. [00:27:56] But they have bureaucracies, and the incentive of the bureaucracy is to make sure that we grow. [00:28:00] And that's it. [00:28:01] And then you think about that for a minute, and you're like, well, it's no longer a question why we have $30 trillion of debt. [00:28:08] $39. [00:28:09] $39 trillion. [00:28:10] And then what, like $150 trillion of unfunded liability. [00:28:14] In other words, we've promised people money for the next 30 years. [00:28:18] And it's debt that, you know, I don't see how we'll ever escape that debt. === Bureaucratic Systems and Fear (15:11) === [00:28:23] And the thing about it is, and I don't want to be pigeonholed, because I'm actually quite liberal. When it comes to my politics, they're like yours in that I'm kind of a man without a home, but they also change at different levels of analysis.
[00:28:37] I'm very liberal with my family and I'm very, like, communist. [00:28:41] I protect them. [00:28:42] I give them everything they need. [00:28:44] I'm trying to give them structure. [00:28:46] And even in my community, I'll help someone out out of pocket or do something for them that's a strain on my time or might hurt something else, because there are really no solutions. [00:28:54] There's just trade-offs. [00:28:56] That's supportive for the community, though. [00:28:57] That's how people are supposed to do charity. [00:29:00] And I'm also very non-judgmental about how someone lives. [00:29:02] I don't care what they do in their house. [00:29:03] I don't care if it's a Roman orgy on the weekends. [00:29:06] Like, be a predictable, productive person Monday through Friday and go do your Roman orgy on the weekend. [00:29:12] I don't care. [00:29:12] I won't judge you. [00:29:13] Like, I really have enough crap in my own life. [00:29:17] As long as someone's not getting hurt. [00:29:18] Yeah, as long as no one's getting hurt, consenting adults. [00:29:20] Like, I have enough problems and I screw up enough, and there's a laundry list of things that people could say about me, how I've screwed up in my life. [00:29:27] But then as I graduate to higher and higher levels, more conservatism takes place. [00:29:34] And that's a result of just, you know, having an engineering mindset when I'm looking at life and understanding that it's just not Republican or Democrat or leftist or rightist or liberal or classically liberal. [00:29:50] All of these monikers don't work for me because they break down at some level of analysis. [00:29:56] Right. [00:29:56] And I think that's the problem. [00:29:58] I think the problem is these ideologies that people subscribe to, where you have a predetermined pattern of thinking that you're supposed to adopt. [00:30:04] Yes. [00:30:05] You're supposed to adopt these opinions. [00:30:07] And some of them just don't fit.
[00:30:09] And that's how people get pigeonholed. [00:30:10] That's like people on the left. [00:30:11] They get pigeonholed into weird stuff that you can't really justify, like trans women in sports. [00:30:17] Like, what the fuck are you doing? [00:30:18] Like, we're being inclusive. [00:30:20] Like, no, you're not. [00:30:21] We're loving the borders of Ukraine while hating our own border. [00:30:24] Yeah. [00:30:25] Fucking bonkers. [00:30:26] Yeah. [00:30:26] There's so many crazy things. [00:30:28] There's so many crazy things that people just adopt that don't make any sense. [00:30:32] And, you know, when you subscribe to an ideology, the problem is, if you define yourself as this person... I am this. [00:30:40] I am a hardcore right-wing blah, blah, whatever it is. [00:30:44] You immediately close the door to all the very productive and interesting things that the other side thinks. [00:30:49] Yeah, and you're also making yourself into a tool of propaganda. [00:30:52] Because if I meet someone and they just say, I'm this, it's like, well, I could reasonably predict everything that's going to come out of your mouth. [00:31:00] Yeah. [00:31:01] That's not entertaining. [00:31:02] I don't want to have a conversation with that person. [00:31:04] I can't seek to learn from them because I could just pick up the Communist Manifesto or Mein Kampf and have a pretty good understanding of who I'm dealing with. [00:31:10] And therefore, a conversation is not relevant. [00:31:13] It's not needed. [00:31:14] A lot of people are afraid of social ostracism too. [00:31:17] So they're afraid of straying outside of the narrative, whatever side they're supposed to be on. [00:31:23] And, you know, some groups are really good at making you feel like dog shit if you don't agree entirely, even with things that don't make any sense. [00:31:32] So that's why people go along with stuff that's illogical, like open borders or whatever it is. [00:31:37] Yeah.
[00:31:37] They go along with things that are not in their best interest because they're scared. [00:31:41] They're scared of being ostracized. [00:31:43] They're scared of being cast out of the kingdom. [00:31:45] They're scared of being excommunicated. [00:31:47] Yeah, I dealt with a lot of people, first when I retired from the military and then more recently leading up to the last election, where I was entertaining the idea of doing some work for the government, believe it or not. [00:32:01] Because as we talk more, you'll figure out I'm pretty anti-institutions. [00:32:07] I'm really against those types of things. [00:32:10] But I really felt, if you would have asked me three years ago how I felt about the Trump election and all of that stuff, I was very excited because he was saying a lot of things that I wanted someone to say. [00:32:19] Trump fits a pattern. [00:32:21] And this is what I think people kind of miss. My whole life is built around pattern analysis. [00:32:27] I really enjoy finding and examining patterns. [00:32:33] And there's a pattern here. You'll laugh when I say this first part of the pattern, but then I'll make it make more sense later. [00:32:41] But he fits the pattern. [00:32:42] Well, first, he's a Jacksonian, in that he's pragmatic in the way that he governs, which I liked, or at least I did. [00:32:50] And, you know, there's some things he's done recently that I don't enjoy. [00:32:55] But he's also an outsider or a savior type. [00:33:01] A la, you know... I don't remember the movie, but the Magnificent Seven, back in the day. [00:33:05] I don't remember the actor's name. [00:33:07] There's this, you know, Western town. [00:33:10] Everything's going to shit. [00:33:12] These seven guys walk in. [00:33:13] I think Chris Pratt remade it with Denzel Washington or someone else. [00:33:16] Oh, really? [00:33:17] I think so. [00:33:17] I can't remember.
[00:33:18] But there's an old one that I used to watch with my grandpa. [00:33:20] God, there's too many movies. [00:33:21] Yeah. [00:33:22] And there's this pattern where you wouldn't invite these guys to a dinner party. [00:33:27] You wouldn't want them in church on Sunday. [00:33:29] But when a system is so corrupt and so horrible, you have to rely on these types of people to come in and be a check to the system. [00:33:36] But then also you don't want them to stick around when the system is reset. [00:33:40] So there's a scene in the movie where, you know, these seven guys are talking. [00:33:45] And they said, man, these people must have really wanted us. [00:33:47] Like, it's crazy. [00:33:48] They must be happy we're here. [00:33:50] And I think it's Gary Cooper or one of these guys who looks at him and says, they're going to be even happier when we leave. [00:33:55] And Trump kind of fits that narrative. [00:33:57] Wolverine from the X-Men would be another one who fits this narrative. [00:34:01] Like, is he going to be at the X-Men Christmas party? [00:34:04] No. [00:34:04] Right? [00:34:05] Is he trying to hit on Scott Summers' wife? Cyclops. [00:34:08] I'm a comic nerd, so I'm sorry. [00:34:09] Is he trying to sleep with Cyclops' wife? [00:34:12] Yes. [00:34:13] Did he chop a guy's head off and throw it at a car? [00:34:16] Yes. [00:34:16] But we're about to go face Galactus and we're going to need him. [00:34:20] And so we have to put up with all of this other stuff because we understand that when the system is corrupt at every level, you need someone who's outside of the system to come in and set the system right. [00:34:31] It's a Western pattern as well. [00:34:34] Other people who fit this would be like Patton, right? [00:34:36] Married his cousin, slapped soldiers who had... [00:34:39] Did he really? [00:34:39] Oh, his cousin? [00:34:40] Yeah, I think it's his third cousin.
[00:34:42] How many cousins removed before it becomes okay? [00:34:46] I don't know. [00:34:46] Is it third? [00:34:47] Fourth? [00:34:48] If there's blood. [00:34:49] Have you never met them? [00:34:50] I'm Icelandic, so I really can't say anything, right? [00:34:52] They literally have apps in Iceland. [00:34:53] Like, my grandparents and my great-grandparents are all from Iceland. [00:34:57] They settled in Manitoba. Gimli, Manitoba, which is this Icelandic community. [00:35:01] And they literally have apps in Iceland to make sure you're not dating your cousin. [00:35:04] So, you know, less than a million people on one island. [00:35:11] So you're trying to prevent that stuff. [00:35:13] But anyway, Patton, yeah, slapped soldiers who had tuberculosis. [00:35:17] One of them probably had shell shock. [00:35:19] It got in the newspaper. [00:35:20] They wanted his head. [00:35:22] And thankfully, the generals were like, no, he's the guy that we need for the moment, right? [00:35:25] He had the ivory pistols and he dressed not like a general. [00:35:29] He didn't talk like a general. [00:35:30] He wasn't like an Eisenhower, where he had the veneer of a general. [00:35:36] But we knew he was the only guy we could have at the Battle of the Bulge. [00:35:39] Like, the Germans talked about him like he was already a mythic legend in his lifetime. [00:35:45] But part of this pattern that people should understand, when they examine this pattern, is it never ends well for these anti-heroes. [00:35:52] They're always killed or defamed in the final analysis. [00:35:56] So when the Magnificent Seven come in, they'll go to another town and all get killed. [00:36:00] When Patton retired, he died in some weird Jeep accident. [00:36:04] You know, Wolverine, he's the only guy left on this desolate world where the Hulk's in charge and it's a horrible existence. [00:36:12] Patton, or not Patton, Petraeus is another one. [00:36:16] I briefed Petraeus.
[00:36:17] I worked, not for him, but for people who worked for him in Iraq. [00:36:21] And he was the guy that got us through with the surge. [00:36:25] But he was really a weird guy when you would talk to him. [00:36:30] You knew that he knew something you didn't and that he was seeing things that you weren't. [00:36:35] But even for myself, as being like a chief warrant officer at that time, a low-level technician, he would ask questions like he got it. [00:36:41] He didn't act like other generals. [00:36:43] Like, other generals would have their three things they want to talk about, then they'd want to get out of Dodge. [00:36:47] He would ask questions that really had implications. [00:36:49] And he is another one of these outsiders who came in to right a system that was not working vis-a-vis Iraq in 2006. [00:36:58] And then what happens to him when he leaves? [00:37:00] They put him in charge of the CIA. [00:37:02] They knew he had been screwing around with this woman. [00:37:04] And they're like, okay, he served his function. [00:37:07] Now he needs to get out of Dodge. [00:37:08] And then now he's, you know, gotten tried for all these things, and sleeping with someone while he was married. [00:37:15] And, you know, it's not a ceremonious end for these types. [00:37:18] I saw. [00:37:19] Is that really what happened to Petraeus? [00:37:20] That's how he ended? [00:37:22] Yeah, he was sleeping with some girl that was writing his book or something along those lines. [00:37:27] Well, I'm not saying that's the end of him. [00:37:30] All I'm saying is that history will remember the pattern as ending unfavorably. [00:37:36] You know what I'm saying? [00:37:37] And so when I examined Trump, I said, yeah, I don't like what he says. [00:37:41] I wouldn't want him around my daughters. [00:37:43] I wouldn't want him at a dinner party. [00:37:46] But he seems to be saying these things like he's going to reset this system.
[00:37:50] You know, I think it was Chappelle, on your show or another show, or someone like that, who talked about Hillary saying, you know, something about the tax loopholes or whatever. [00:37:58] And he just hit right back at her and said, well, the people who are funding your campaign take advantage of those same loopholes. [00:38:04] And if they're there, I'm going to take advantage of them. [00:38:06] I wouldn't be a pragmatist if I didn't. [00:38:08] When he started saying stuff like that, it seemed to me like he was going to upend this system. [00:38:12] The jury's out on that because I don't know how I feel these days. [00:38:15] We can get into that if you need to, if we want to. [00:38:17] But he's an outsider personality, and I thought he was going to really reset this system. [00:38:23] And there are good things that are happening. [00:38:26] If I were to grade him, I would probably give him a C plus or a B minus. [00:38:30] Certainly better than what was happening under Biden. [00:38:33] I was still in the military when Biden was in charge, and it was awful, to say the least. [00:38:38] What were the problems? [00:38:39] Oh, my goodness. [00:38:41] Books that general officers were being told to read and that I, as an advisor, was being told to read. [00:38:46] Books like White Rage. Like, understanding why you're a problem. [00:38:52] You as a white man are a problem in the modern-day military because this whole thing's built on systemic racism. [00:38:59] You have inbuilt implicit bias that you can't escape even if you wanted to or you recognized it. [00:39:04] It's woke politics. [00:39:05] Yeah, it was woke politics. [00:39:08] And it was, you know, I would sit there and say, you know, all of the friends, all the people that I know who've died during this war... not all of them, but 80% of them, and the numbers bear this out when you look at them.
[00:39:19] They're all white guys from the middle of the country who were on their farms or, you know... not all of them, 80% of them. [00:39:26] I think the numbers bear out about 80% of them. [00:39:28] They were these guys from the Midwest or these places where they didn't really have a lot going. [00:39:32] And they went off to fight a war that we probably shouldn't have been fighting in the first place, especially in Iraq. [00:39:38] And they died for their cause. [00:39:40] And now you're saying that those people who make up the majority of the combat deaths are somehow part of this problem and that other people aren't benefiting from it. [00:39:50] I don't believe that. Race, to me, is disgusting to even talk about, on both sides of the spectrum. When they were confirming that Supreme Court justice, I can't remember her name right now off the top of my head just because I'm a little nervous still. [00:40:05] She was black. [00:40:06] Ketanji Brown Jackson. [00:40:07] Yeah, they were talking about how it's historic because she's black. [00:40:10] And Biden had said he's going to hire a black woman to do this job. [00:40:14] If I had worked my whole life to do something, but now I'm only being elevated to this next position because of my gender and the color of my skin, I would turn that job down so fast, because that's not what I want to be known for. [00:40:26] These are immutable characteristics that I'm not in control of. [00:40:30] I didn't choose to be born white or with blue eyes. [00:40:32] I didn't choose to be born in a trailer park in the middle of nowhere without a dad at five. [00:40:36] I didn't choose any of those things. [00:40:38] I don't see how I benefit from these things at the individual level. [00:40:41] And, you know, the individual level of analysis for me is really the only way to evaluate someone for their pluses and their minuses. [00:40:48] And anything beyond that to me is discriminatory on its face.
[00:40:52] Of course. [00:40:52] It's just a great way to control people because you pit people against each other that way. [00:40:57] And it's just an awesome way that they can stay in control and make everybody walk on eggshells and think that they've victimized people in order to get to their position and that they have to be ashamed of who they are, which they had no control over. [00:41:13] It also gives people an easy rubric to judge other people. [00:41:15] Yeah. [00:41:16] Because nothing's easy, really. [00:41:18] And it gives them something like: white guy bad, black guy good. [00:41:22] Chinese guy, as long as he's not applying to the college I want to get into, he's good. [00:41:26] Right. [00:41:28] And people want easy answers, really, at the end of the day. [00:41:32] They want to be told the easy rubric to navigate life, because really none of it's easy, and it requires discipline, like you said before, and thought. [00:41:40] And so it was that stuff in the military. [00:41:43] I remember getting told in an equal opportunity briefing we were getting: it doesn't matter what you meant when you said what you were saying. [00:41:53] It only matters what the person felt when you said it. [00:41:57] They said that in a military briefing? [00:41:58] This is a military equal opportunity briefing. [00:42:01] And the example they gave was, if a woman walks into the... like, we worked with a lot of civilians at this military organization where we were developing these offensive cyber capabilities, a lot of civilians in there. [00:42:14] And so if, you know, Woman X walks in today and she's got a dress on, and the thought in your head is, I'd like to get my wife that dress or something like it, or find out where she bought it. [00:42:24] And you just say, that's a nice dress. [00:42:26] Anyway, here's the TPS reports. [00:42:28] If she heard something sexual or didn't like the connotation or whatever, there's going to be an investigation.
[00:42:36] You're going to be pulled out of that office. [00:42:38] This is all going to happen despite what you meant. [00:42:41] So the idea probably was good. [00:42:43] We want to prevent sexual harassment inside of the office. [00:42:47] But it was weaponized. [00:42:48] But it was weaponized, and it was carried out in a way where it's only about how people feel and not what a reasonable person standard would be in a particular situation. [00:42:56] And from the time I joined the military until that time, we had been at war. [00:43:00] My entire time in the military, we were at war. [00:43:03] I deployed throughout my career. [00:43:05] And I wouldn't say that I was a war horse. [00:43:08] I was not a long tabber. [00:43:09] I was not a cool guy kicking in doors. [00:43:11] It was my job as the guy with tape over his glasses to point out the door for someone else and say, bad guys in there. [00:43:18] So I was not a super badass in that regard. [00:43:21] I was a nerd for super badasses. [00:43:25] But we also all engaged in gallows humor. [00:43:28] And we would, you know, the jokes and stuff. [00:43:32] Even someone who had recently died, we would make a joke about. === Recruiting from a Homogeneous Pool (06:35) === [00:43:34] It's because you have this tremendous pressure. [00:43:38] And comedy is the relief valve for that in a lot of ways. [00:43:42] Yeah, of course. [00:43:43] But then someone would overhear that joke or something. [00:43:45] And now you're looking down the barrel of a 15-6, which is a military investigation. [00:43:51] And all of these things that could permanently impact your life and give you a scarlet letter, to where you could never be employed again or do anything ever again, because you were simply trying to relieve some pressure or you were trying to find out where to buy your wife that dress, and now your life's being ruined. [00:44:08] And I know guys who suffered under that sword.
[00:44:10] Like, I wouldn't name them, but I know guys who, you know, their career met a terminal end because of a dumb joke or something. [00:44:18] It's like, you can't be expected to go out and shoot people in the face and then be sensitive to someone's feelings an hour later. [00:44:25] Right. [00:44:25] It just does not work. [00:44:27] Now, should you talk to that guy and say, hey, you know, you made Woman X feel so-and-so. [00:44:31] Be more cognizant of that whenever you're around her in the future. [00:44:34] Well, you should also have a rational discussion with the woman. [00:44:37] Yes. [00:44:37] And what did he ask you? [00:44:39] He said, where did you get that dress? [00:44:40] It's very lovely. [00:44:42] I'd like to get one for my wife. [00:44:44] Why were you upset at that? [00:44:45] Like, is this rational? [00:44:48] Like, you can't be in an office if you're that sensitive. [00:44:52] Like, it's one thing if the guy said, I'd like to get you out of that dress. [00:44:55] Well, for sure. [00:44:56] Now you're in a different world. [00:44:58] 100%. [00:44:58] 100%. [00:44:59] Right. [00:44:59] But if someone says, you look great, you know, have you lost weight? [00:45:03] You look fantastic. [00:45:04] That's a compliment. [00:45:06] And if someone gets upset, I felt sexually objectified. [00:45:09] I felt harassed. [00:45:10] Like, okay, he just said you look great. [00:45:12] Yeah. [00:45:13] That's it. [00:45:13] Healthy. [00:45:14] It's not, you look great. [00:45:15] I'd like to get you naked. [00:45:16] Now we've crossed the Rubicon, right? [00:45:18] Now we're into... [00:45:20] But just, you look great, or, I like your dress. [00:45:23] That's like if you said that to a man. Like, hey, great suit. [00:45:26] Yeah. [00:45:27] And he's like, oh, I need to file a complaint. [00:45:29] Yeah. [00:45:29] You need to file a complaint. [00:45:30] Yeah, you've trimmed up, Joe. [00:45:31] You're looking good.
[00:45:32] You're looking great, Bill. [00:45:33] Like, oh, my God, I am being harassed. [00:45:35] I need to, like, file a complaint. [00:45:36] That would have worked during the Biden administration. [00:45:38] That's fucking crazy. [00:45:39] That would have worked. [00:45:40] That's so crazy. [00:45:41] And the other thing that they were doing in this briefing, which is where I kind of, you know... the last couple of years of my military career, I got in trouble a couple of times, or I should say, called down. [00:45:49] I was senior. I was a CW4. [00:45:51] I was one rank from the top. [00:45:53] I was advising two-star generals, colonels, on very important matters. [00:45:58] I wasn't high. [00:45:59] I wasn't high in the dominance hierarchy, but I was adjacent to people who were, as an advisor. [00:46:05] And in this briefing in particular, they had gotten into, you know, it's bad that there are so many white people. [00:46:20] I'm doing high points here, but, we need more diversity. [00:46:22] I was part of an accepted career program that they were starting to call, like, the old white boys' network, because of most of the people in it. So the requirements for this program were you had to speak a couple languages. [00:46:34] You needed an engineering degree or some kind of demonstrated engineering background. [00:46:39] You had to have deployed. [00:46:41] They wanted you to speak the language very well. [00:46:44] They wanted you to be able to go through these engineering courses, these other things. [00:46:49] And what happens naturally is you now need people who are interested in engineering. [00:46:54] All right. [00:46:54] So you've got somebody who's maybe more constrained in their thinking. [00:46:58] You need somebody who speaks languages. [00:47:00] Well, now they also need to, you know, speak French, speak Russian, whatever it was. [00:47:06] So they had to have studied or lived in an area and done this.
[00:47:09] And they need to be able to go through these crazy tactical and strategic types of courses. [00:47:14] By virtue of those things, you're going to get men. [00:47:18] And there were lots of women, but there will be more white men. [00:47:21] And it's not because of anything else; the pool presented itself that way. [00:47:25] Now you have to extract from that pool. [00:47:28] And so in this briefing, when they were talking about, like, the old white boys' network or how we need to change things, I said, you know, do you realize that most men have more in common with each other than they do with most women? [00:47:40] Or, like, if I say I need more diversity in a particular room... if you said diversity of thought, I'd be fine with that. [00:47:48] But Joe and a random black guy in the same program in the same office have far more in common with each other than either has with the white woman. [00:47:59] But what you're saying is these people all need to be different colors and different... all of this needs to be this way. [00:48:06] It's going to naturally present itself that way because men in the military generally are disagreeable. [00:48:11] Men in the military who like engineering are generally hyper disagreeable. [00:48:16] And the only difference between these two people is the pigment of their skin. [00:48:20] So this fake diversity quota that they're putting on top of us doesn't achieve anything other than giving some officer a bullet on their OER. [00:48:29] And I got pulled into the office afterward. [00:48:31] I said way more than that, but essentially afterwards they're like, hey, Chief, you can't say that in those briefings. Like, the way that you were getting animated in there and what you're saying, what you're doing... [00:48:41] Yeah, this is not going to fly. [00:48:43] And this was like 2018 or 2019 or something. [00:48:46] Just being rational. [00:48:46] Yeah, just trying to be rational and say that there's more difference within groups than there is between groups.
[00:48:53] And that the similarities and the way that things stack up... you recruit from a pool of volunteers and candidates. [00:48:59] If I'm recruiting from a pool of volunteers and candidates who are 80% male and white, I have to expect that the selected individuals are going to be male and white. [00:49:08] The majority of people who join the military... I don't control this. [00:49:12] I'm just, as an engineer, looking at statistics. [00:49:15] Also, if you want a highly functional, productive group, it's got to be based on meritocracy. [00:49:19] Yeah, for sure. [00:49:20] For sure. [00:49:21] Anything other than that is literally a threat to national security. [00:49:24] Yeah, you're degrading lethality. [00:49:27] The role of the Army is to deter war by exuding superior military force and technology. [00:49:34] And, when deterrence fails, to win. [00:49:36] That's it. [00:49:38] Those are the two things that we need to do with our military. [00:49:41] It needs to look like the guy on the playground who you would not muck about with. [00:49:44] And if you were to muck with him, he will beat you senseless. [00:49:48] That's it. [00:49:49] Now, whether or not we should be using that all the time or how we use it, that's a separate question. [00:49:53] But the entity itself needs to comport itself in this way. [00:49:56] Otherwise, you are endangering this truly special experiment, which at least in its beginnings valued the individual. [00:50:05] It valued individual rights and states' rights. === Updating the Constitution's System (05:46) === [00:50:10] And the founders, and this was another thing I said in that briefing, the founders knew... yes, they were all slaveholders, but they knew that the Constitution and the Bill of Rights and the Declaration of Independence would eventually lead to a system where we had to acknowledge these people as people.
[00:50:26] And we fought a civil war where a million white dudes died to see this experiment through. [00:50:33] The scaffolding was there. [00:50:35] You have to look at things through the zeitgeist of the time. [00:50:38] If they had just said, nope, everyone's going to be free, there will be no slaves, you would have never gotten ratification through the southern states. [00:50:45] But they knew, and when you read the Federalist Papers, they knew that they were erecting this system. [00:50:51] When you look at Thomas Jefferson and some of these other great thinkers who... yes, he owned slaves, I get it. [00:50:56] They knew what they were building and they knew what it would ultimately terminate in. [00:51:00] And then we had a civil war where we destroyed our country from the inside to see this dream come about. [00:51:07] And now we're just going to all go back and say they're all slave owners. [00:51:10] I know this has all been said here a million times, but this stuff animates me because it's built with blood and treasure. [00:51:16] Well, it's also, you can't judge people from the past based on the standards of the present. [00:51:21] For sure. [00:51:22] Because culture changes, people understand things better. [00:51:25] We have a much greater recognition of what was wrong with things 100 years ago, 200 years ago. [00:51:34] And I'm sure in the future, we're going to look back on today with the same lens. [00:51:40] It just always works that way. [00:51:41] Did you know Joe had a gas-powered car? [00:51:43] Exactly. [00:51:44] That kind of stuff. [00:51:45] Yeah. [00:51:45] Did you know that? [00:51:46] You consumed more, you flew more, you ate more meat, you did whatever you did. [00:51:51] You were a problem. [00:51:52] He was a problem. [00:51:53] Yeah. [00:51:53] And now, why would we ever... like, I'm voting to get rid of the Joe Rogan Experience from the National Archives because you drove a gas car. [00:52:00] Yeah.
[00:52:01] You know what I mean? [00:52:01] Like, someone, you know, stores your stuff for posterity's sake, for the future to hear about this. [00:52:06] You know, I've always loved your podcast, Joe, and it was because you're a genuinely curious person, and I'm not kissing your ass right now. [00:52:15] You're a genuinely curious person who was saying things that were not in the current zeitgeist at the time, and you refused to apologize for it. [00:52:23] And it led to a lot of great things, but it led to an updating of the system. [00:52:28] And you did it with dialogue, with dialogos, with two people trying to learn things about each other. [00:52:34] And it led to an updating of a system. [00:52:36] I think it's very important for culture to have free and open dialogue so we can update our system. [00:52:41] So bad ideas can die, so we don't have to die instead of our bad ideas. [00:52:45] Because if I can't express a bad idea, I have to act it out. [00:52:49] And if I act out the bad idea, it could kill me. [00:52:51] And the celebration of good ideas. [00:52:53] And the celebration of good ideas. [00:52:55] And it's just really... there's just been such a weird inversion in politics where the free, hippie-loving liberals of yesteryear are now the ones telling you what words you can use. [00:53:08] There are no borders, all of these crazy things. [00:53:11] And I always say to people, I said it to Andy on my last podcast with him. [00:53:16] I'm like a 1996 Bill Clinton Democrat. [00:53:19] If you go watch his State of the Union and he talks about lowering debt, getting out of debt, actually, working with Newt Gingrich to get out of debt, securing the borders, making work and education freely accessible. [00:53:33] I'm voting for that guy. [00:53:34] I know. [00:53:35] Isn't it crazy? That's why labels, ideological labels, don't work.
[00:53:41] Because if you go back far enough and look at Clinton, for example, he's one of the best ones. [00:53:46] And by the way, did balance the budget. [00:53:47] Yeah, he did. [00:53:48] We had a surplus when he left office. [00:53:50] Yeah. [00:53:50] Amazing. [00:53:51] Did a fucking amazing job. [00:53:52] So he got his dick sucked. [00:53:54] Yeah, yeah. [00:53:55] Who didn't? [00:53:55] Back then, that's the other thing. [00:53:57] Judging people by the standards of the past, you know, JFK doesn't look so good in the Me Too movement. [00:54:03] You know, I mean, he would have got canceled. [00:54:05] It's like you have to recognize that this ideological bubble we find ourselves in, left versus right, Bill Clinton does not fit in that. [00:54:15] Bill Clinton is squarely on the right in terms of 1996 standards applied to today. [00:54:22] He would never want to hear that. [00:54:23] No, he would never want to hear that because he's kind of shifted with the zeitgeist because that's what you kind of have to do if you want to stay in your party and be protected by your party. [00:54:31] Yes. [00:54:32] You know, but he's essentially, he had a lot of the ideas. [00:54:36] We've talked about this before. [00:54:37] We've played clips of Hillary Clinton from 2008 and she's more MAGA than MAGA. [00:54:42] I know. [00:54:43] You know, her take on the border was like hardcore. [00:54:47] It was hardcore. [00:54:48] If you've been convicted of a crime, get out. [00:54:50] You know, if you stay here, pay a stiff penalty and you have to get in line and you have to learn English and everybody cheers. [00:54:57] That is a hardcore right-wing 2026 perspective. [00:55:00] Obama did it too in 2012. [00:55:02] Absolutely. [00:55:03] And Obama deported more people than Trump did. [00:55:05] Yes, exactly. [00:55:06] This episode is brought to you by ThreatLocker. [00:55:08] Data breaches are happening more frequently than ever.
[00:55:11] And it's not because of sophisticated tactics. [00:55:14] Attackers are using the same methods and exploiting the same vulnerabilities. [00:55:20] What's changed is speed and scale. [00:55:23] Reacting to breaches can leave you exhausted, constantly chasing threats instead of preventing them. [00:55:29] That's where ThreatLocker comes in. [00:55:32] With ThreatLocker Zero Trust, you only allow what you need and block everything else by default. [00:55:40] You control what runs, when, where, and how, blocking ransomware before it executes. [00:55:46] Because no matter how you respond, a fast response simply isn't fast enough. [00:55:51] Visit threatlocker.com/JRE to learn more. === Zero Trust and Core Principles (02:36) === [00:55:56] And it's just, I'm not saying, like, my thought is always, I'm always updating, I'm always updating my systems. [00:56:02] I'm always getting told things. [00:56:04] I always have a prescribed way of looking at the world, and then I'll have a good conversation with someone. [00:56:10] I'll update my system. [00:56:11] But generally, my principles are in place. [00:56:14] And when you watch these people who get in their 30s, 40s, 50s, and 60s, and their core foundational principles are changing, it really should give you cause for concern. [00:56:24] Because like you were saying this at this time, and now you're saying this at that time. [00:56:28] Generally, the rubric that I don't think will change about myself is: I'm fervently for the individual, I'm fervently for truth, and you should measure the world by looking not at what your intentions are but at what the outcomes are, and then evaluate the system and how it scales based on those outcomes. [00:56:49] Those are my principles. [00:56:51] I try to hold myself to that standard.
[00:56:54] I fall short of that standard all the time, but I try to live by that standard, and I feel like that will always be me, even into my 90s, unless something goes horribly wrong. Right, right, right. And I've pretty much been here for, you know, the past seven or eight years or so. Even into my 30s, I wasn't quite sure who I was as a human, but I'm pretty, you know, steadfast in that now, [00:57:20] and the amount of opportunities and the amount of goodness in my life and my children and my home and the things I've been able to do have really been born out of that. [00:57:29] The last seven years of: the truth is going to be the top of the decision matrix for me, the top of the hierarchy for me, I'm going to try not to cut corners whenever I can and help good people around me, and the truth is the way that I'll organize and function in life, and I will try to only judge people as individuals. [00:57:52] You know, these are Christ's teachings from 2,000 years ago, but the world for me has just opened up in a way that I could have never predicted. [00:58:01] Using a very simple rubric, it's not easy, but it's simple. [00:58:05] And if more people just took those, and this isn't me, I didn't come up with this, this is the result of, you know, watching a bunch of experiments go bad, but if people just adopted that very simple thing and just tried it for three months, you'll feel better about yourself, you'll feel better about the world, you'll feel better about the people proximately around you. [00:58:23] It might make you hate the government more. Yeah, but, well, I don't think... [00:58:28] If you don't hate the government, I think you're not paying attention. [00:58:31] Yeah, yeah, for sure. === Combat Operations in the Philippines (11:49) === [00:58:32] I mean, when you were working in cyber... [00:58:36] Cyber defense. Like, what, cyber offense? [00:58:37] Cyber offense.
[00:58:38] What was the primary function? [00:58:41] Like, what did you do? [00:58:43] So, in the beginning, I have no short answers, and I apologize. [00:58:48] In the beginning. [00:58:49] I don't like short answers. [00:58:50] Yeah, I always feel like I'm. [00:58:52] I like a good long answer. [00:58:53] Yeah, don't worry about that. [00:58:54] Okay. [00:58:55] When I joined the military, I was in signals intelligence and essentially learning the ins and outs of radars, how radars work, what they do, how they function. [00:59:05] Did you guys ever see any weird shit, like UFO shit? [00:59:08] I wish I had. [00:59:10] I really do. [00:59:11] I wish you had. [00:59:12] Yeah, I really do. [00:59:13] I was more in the signals intelligence side of the house, focusing first on electronic signals or emanations from radars, mapping them so that, you know, if we were going to go do the ground invasion and there was going to be some air support going in first and blowing shit up, we would tell them, hey, there's a man-packable SA-7 here. [00:59:30] There's an SA-10 here. [00:59:32] There's this here, there's that there. [00:59:33] And then telling these pilots so they didn't get shot out of the sky. [00:59:37] Quickly, when the war kicked off, that became irrelevant because there were no surface-to-air missiles, surface-to-surface missiles in Iraq. [00:59:44] We had knocked them all out in the first few weeks. [00:59:47] So then it shifted to communications intelligence. [00:59:49] So I kind of retrained on communications intelligence, and that was at that time off of cell phones, off of push-to-talk radios, repeaters, long-haul networks, terrestrial networks, extraterrestrial networks. [01:00:02] And what I mean by that is the satellites in the sky. [01:00:06] And doing analysis on those to try to inform what we call the common operating picture of the battlefield for a combatant commander.
[01:00:14] So a combatant commander wants to know where the bad guys are, what they're doing, what they're saying. [01:00:18] To the extent that we could, my job was to come up with solutions and conduct passive and active signals analysis on these things and then inform the commander so that we could mitigate risk. [01:00:32] It was all about mitigation of risk. [01:00:35] This is 2008 or so. [01:00:37] I'd been doing this for about seven years, eight years. [01:00:40] And from there, it shifted to the phones getting smart. [01:00:43] And essentially, it went from you walking around with a 2G phone or a 3G phone that had limited compute capability to now there's robust compute capability with the advent of like the iPhone. [01:00:55] And now it's like, well, now we've got to get after guys who are essentially walking around with a computer we could never have envisioned 20 years ago in their pocket with all this capability. [01:01:04] Because the military and our forces that we're fighting against, it all comes down to our ability to shoot, move, and communicate. [01:01:11] Communication being the part that I was focused on. [01:01:13] So as the advent of the iPhone and those things came out, the Army realized we didn't have a computer network operations MOS. [01:01:20] We didn't have an offensive cyber component. [01:01:23] We didn't have a defensive cyber component. [01:01:25] So we kind of, I was there at the ground floor when we were building out these new MOSs now that are all over the military. [01:01:31] But at that time, there was a thought going in: you know, we need to have people who know how to be on-net operators. [01:01:37] Ethical hacking, as paradoxical as that sounds. [01:01:41] That's what the lawyers called it. [01:01:42] So it's hacking at the end of the day, but ethical hacking because you've got the backing of the U.S. government.
[01:01:47] And so we set up that framework and really started launching into operations, you know, 2006, 7, 8, all the way into my last deployment in 2017 or so. [01:02:00] It was all focused on computer network operations and how they lash up with terrestrial networks. [01:02:05] How do we exploit all of that was one facet of my job. [01:02:09] And your question was, how did I get into all of that? [01:02:14] And that was the... [01:02:15] How do you get into it? [01:02:16] What was like... [01:02:18] What was the operational aspect of it? [01:02:20] How did you actually, what did you do? [01:02:23] So, you know, I'll stick to terms that are more generally understood by the public, but learning how to do things like war driving, collecting on networks, Wi-Fi endpoints, cell phones, understanding the ins and outs of them, understanding how to do forensic analysis of them. [01:02:42] So after there was an operation and a bunch of gorillas had been sent in to kill a bad guy, we could derive maximum intelligence value from the handset to plan other operations. [01:02:54] And so, you know, it would be passive monitoring of networks to inform the intelligence picture, which would lead to either combat operations or active computer network operations, where now it's like, well, there's, you know, a, I don't know, an Iraqi or an Afghan router that hasn't been patched in three years. [01:03:18] And we think we can either write or find a zero day, which is just an exploit of those routers, where we can muck with their router in a way where they think they're getting good information and they're not, or we're erecting other things to mitigate risk for the commander. [01:03:39] And so that really, you know, exploded at that point. [01:03:42] And between that and human intelligence, which is kind of the actual gathering of intelligence from other people, you know, you would call it a spy or James Bond, but James Bond was a horrible spy. [01:03:55] Was he?
[01:03:56] I mean, yeah, you know, your job's to remain anonymous, and you're walking into a casino and there's Goldfinger calling you by your first and last name. [01:04:04] It's not a great look. [01:04:06] You know, generally you don't want to be sleeping with your sources or using your real name or whatever. [01:04:13] So human intelligence. [01:04:15] And then my focus for the last 10 years was how does signals intelligence, computer network operations, become a force multiplier for people conducting overt and clandestine operations throughout the theater at that time. [01:04:29] My deployments and my time was spent in Iraq, Afghanistan, Africa, Northern Africa. [01:04:36] And then a lot of people don't know it, but we were in active combat operations in the southern Philippines as well for a fair amount of time. [01:04:43] I want to maybe say seven or ten years. [01:04:44] We were doing combat operations in the southern Philippines. [01:04:47] My first deployment to the southern Philippines was 2007. [01:04:54] Who were we doing operations against? [01:04:56] So there were terrorist elements down there that were traveling back and forth from Pakistan and Afghanistan. [01:05:03] And there was a terrorist organization down there called the Abu Sayyaf Group. [01:05:07] And there were other ones as well. [01:05:09] Jemaah Islamiyah, I think, was the name of the other one. [01:05:13] And they were conducting their own terrorist anti-Christian operations in the southern part of the Philippines. [01:05:18] In the southern part of the Philippines, I don't, can I say it? [01:05:20] Can I say the word? [01:05:21] What do you mean? [01:05:22] Jamie, can you pull up a map of the Philippines? [01:05:24] Can you pull it up? [01:05:25] Oh, say that. [01:05:26] I'll say that. [01:05:27] Yeah, pull it up, Champion. [01:05:27] I've been listening to it forever.
[01:05:29] So there's what's called the Autonomous Region of Muslim Mindanao, which is the southern part, from a place called Zamboanga down to Sulu, or Jolo Island. [01:05:39] And there's a, it's a funny joke, because if you zoom into Zamboanga, which is, God, look how many islands there are. [01:05:45] I know. [01:05:46] Go down to the south there. [01:05:47] See Zamboanga. [01:05:48] Go down right there. [01:05:50] Zoom right there on that island. [01:05:51] Now move to, sorry, now move to the southwest. [01:05:55] See that penis? [01:05:57] The tip of that penis is called Zamboanga. [01:06:00] All of our combat operations, now if you zoom out a little bit more and pan more south and zoom out just a little bit more so the joke hits, all that sperm south of the tip of Zamboanga City, the terrorist operations were in there. [01:06:17] Now, if you go to that main island chain called Sulu, there's Jolo Island, that's where I was, on this tiny island out in the middle of nowhere. [01:06:25] And on that, there's a mountain. [01:06:27] That's all the Philippines? [01:06:28] Well, no, I mean, this is all the Philippines down here, yeah. [01:06:30] Wow. [01:06:31] So this is called, there's a mountain in there. [01:06:32] I think it was called Mount Tumatoc or something like that, near the eastern part of the island, a place called Luuk. [01:06:38] It's called Luuk. [01:06:39] Yeah, so there's mountains. [01:06:40] There's a mountainous region there. [01:06:41] There are a bunch of terrorists up there. [01:06:43] They were killing people in the area, conducting bombings. [01:06:46] They were getting trained. [01:06:47] In fact, there was a guy, and I believe I'm going to get his name wrong, perhaps, but I believe his name, it was either Isnilon Hapilon or, oh, it's Umar Patek. [01:06:59] He was actually arrested outside of Osama bin Laden's compound the day after he was killed.
[01:07:03] We were trying to kill him on that island, or in and around that island is where we were trying to find him and kill him. [01:07:09] So they're terrorist facilitators. [01:07:11] They did the USS Cole bombing. [01:07:14] Zoom back out. [01:07:14] I want to see the Philippines one more time. [01:07:16] Like all the islands? [01:07:17] When you zoom all the way out, it's so nuts how many islands there are. [01:07:22] Yeah, so up north of Manila is mostly the Christian population. [01:07:27] And as you get down south, it's the Autonomous Region of Muslim Mindanao. [01:07:31] And that is all of where these terrorist operations were happening. [01:07:35] And I believe we mostly pulled out of there. [01:07:37] There might be still some people in Zamboanga. [01:07:39] I'm not sure anymore because it's been five years, four years since I retired. [01:07:44] But yeah, we were doing counterinsurgency operations down there and guys died down there and there were combat operations. [01:07:49] And I was out there. [01:07:52] I was in a tactical military intelligence battalion and I was attached to the 1st Special Forces Group. [01:07:57] And we were down there a couple of times. [01:07:59] And a lot of people don't even know about it. [01:08:01] Yeah, I never heard about it. [01:08:03] Yeah, so anyway. [01:08:06] Sorry for the sidebar, but I'm so stunned at how many islands are in the Philippines, how spread out it is. [01:08:11] Yeah, it's insane. [01:08:13] And the thing about it is, I'd go to all of these little outposts and these out islands. [01:08:18] We were always debriefing these guys. [01:08:20] And I'm going to get these terms wrong. [01:08:21] So I'm sure there'll be people in the comments. [01:08:23] But I think they're called barangays or something like that. [01:08:25] But they were like these mayors of each one of these little islands. [01:08:29] And there'd be terrorists in and around those areas.
[01:08:31] And we'd try to make friends with these guys so they'd give us some information. [01:08:35] And every one of those places was absolutely beautiful. [01:08:38] Like you'd go there and be like, man, Hilton could turn this into something in short order. [01:08:42] Right. [01:08:43] You know, when you're out at these places, beautiful beach, beautiful, lush jungles, the best swimming water. [01:08:48] Nicest people, too. [01:08:49] Oh, Filipino people are some of my favorite people, man. [01:08:53] Like, you want to talk. [01:08:53] The guys that we worked with out there, they're Scout... [01:08:56] I think they're called Scout Snipers, Scout Rangers. [01:08:58] And I think they were like their special forces. [01:09:01] We'd go to the range with these guys and show them stuff. [01:09:04] And they're the most ride or die type of guys you'll ever meet in your life. [01:09:09] Like, you know, so-and-so said this about you last week, and I could kill him. [01:09:12] It's like, no, dude, it's cool. [01:09:13] It's like, don't worry about it. [01:09:15] Fun fact, they're some of the best pool players on earth, too. [01:09:17] Oh, really? [01:09:18] Some of the greatest pool players of all time. [01:09:19] Came out of the Philippines. [01:09:20] They're just great people. [01:09:21] I mean, I just, the people down there were fantastic. [01:09:24] And it was awful because those guys would be bombing churches, Christian churches, and stuff like that. [01:09:28] And we're doing, like I said, counterinsurgency operations out there, doing intelligence collection to inform that battle picture. [01:09:38] But those guys had direct links with Osama bin Laden and other people. [01:09:42] I have no idea. [01:09:42] Yeah, right after we, like I said, I think it was, I think if you look it up, I think his name is Patek, P-A-T-E-C, P-A-T-E-K.
[01:09:51] And he was arrested outside of Osama bin Laden's compound, and we had been chasing him in the Philippines. [01:09:55] Wow. [01:09:56] Because we thought he was still down there. [01:09:58] There was another guy that I believe we killed. [01:10:00] His name was Albader Parad. [01:10:03] But yeah, my job was not, I always say this on podcasts because the veteran community is wild right now. [01:10:08] They love to cut each other down right now. [01:10:10] There's something weird going on where, like, guys are obviously lying. [01:10:13] Yeah, call the people out. [01:10:15] I prefer to call people out face to face, but I always make sure people know I was not a cool guy. === Electromagnetic Radiation and Superpowers (05:16) === [01:10:22] Like sometimes I got to dress like one. [01:10:24] For a few years, I didn't wear any uniforms, and I got to grow my beard out and act like a cool guy. [01:10:28] But I was really a nerd for cool guys. [01:10:30] I've literally got pictures of myself down in Jolo or in Afghanistan or anywhere else with tape around my glasses and a Pez dispenser and my radio and collection equipment, looking like a true blue American nerd. [01:10:44] But I was not the guy who kicked the door in. [01:10:46] I was always the guy who pointed the door out. [01:10:48] So I'd be safe in the Humvee in the back, you know, eating an MRE, and somebody that looked like another gorilla, you know, like Andy Stumpf or Tim Kennedy or someone like that. [01:10:56] He'd be like, is that the house? [01:10:57] I'd be like, pretty sure that's the house. [01:10:59] You guys want to be safe, but go ahead. [01:11:00] I'll be in the Humvee. [01:11:01] I'll be out here or I'll be in an airplane above, you know. [01:11:05] And yeah, it was being born in North Dakota and, you know, my mother, a single mother after she left that first guy, a trailer house in the middle of this little town called Cavalier, North Dakota. [01:11:20] I had no options.
[01:11:21] I was a horrible student. [01:11:23] And what did you do? [01:11:24] That's crazy that you're so smart, but you were a horrible student. [01:11:26] I wouldn't, yeah, I'd call myself curious before I'd call myself smart. [01:11:30] But, you know, my mother, you know, I don't know if you would remember this, but maybe other people my age would, you know, you'd get these Scholastic book order forms that you'd bring home from school and you could order books. [01:11:43] There'd always be, on the back page, there'd always be like little cool stuff, like you could get like, you know, a pair of gloves or a hat or something. [01:11:50] Anyway, one time there was a coil radio that you could order with an earpiece, and you put this coil radio together, with an earpiece, no battery. [01:11:59] It was just, the electromagnetic radiation would activate the coil, and you could listen to radio chatter. [01:12:08] Really? [01:12:08] With no battery? [01:12:09] Yeah, yeah, just a tiny little radio. [01:12:11] How did it, what was the power? [01:12:13] The electromagnetic radiation. [01:12:15] And it would just, kind of like a record, like, you know, how you hit a record. [01:12:20] Electromagnetic radiation would hit the coil, and the coil would feed up to an amplifier or up to an earpiece, and through the earpiece you could hear chatter. [01:12:27] Did the earpiece have a battery? [01:12:29] I don't think anything had a battery on it. [01:12:31] I think it was just a... [01:12:32] Wow. [01:12:33] I could be mistaken, but I don't believe it was. [01:12:35] It was powered by electromagnetic radiation. [01:12:38] Yeah. [01:12:38] I mean, you can look it up, Jamie, if you want. [01:12:40] Sorry to say that again, but tighten that thing down. [01:12:42] That thing's driving me crazy. [01:12:44] Yeah, sorry. [01:12:45] Like here or here. [01:12:46] Right here. [01:12:46] Look at my finger. [01:12:47] It's right here.
[01:12:48] Yeah, I've been meaning to do that. Literally everybody who uses this fucking thing... [01:12:52] It's wobbling around ready to fall off. [01:12:53] Yeah, but if you look up coil radio with small earpiece, I could be wrong. [01:12:58] I don't remember there being a battery on it. [01:12:59] Electromagnetic radiation powered it. [01:13:02] That's bananas. [01:13:03] Yeah, so kind of the same thing with, like, you know, not at the same wattage, but a microwave, right? [01:13:08] Sends power through the air. [01:13:09] Right, but it uses DC. [01:13:11] But it uses power if you send it. [01:13:13] Yeah, but I could be wrong. [01:13:15] But at any rate, that was the first time I got a radio, and I was hearing things, and I'd put it together, and I'm listening to things. [01:13:22] Like what kind of things? [01:13:24] HF radio, VHF radio, people talking, that type of stuff. [01:13:28] And it was just, and then I found out how to get an antenna to make the antenna larger and started ordering auxiliary pieces for it. [01:13:36] And then what really changed me was, my mother and I would clean houses. [01:13:41] She was a waitress, but we also would go around and clean houses. [01:13:43] And there was a lawyer that we worked for. [01:13:44] His name was Phil Culp. [01:13:47] And he had an old 286SX IBM. [01:13:51] And it was just sitting in his basement. [01:13:53] And I told my mom, I was like, hey, if I clean for like a month, can I have that computer? [01:13:57] Like, he doesn't use it. [01:13:58] He's got a new 486 up in his place here. [01:14:00] And he instantly said I could have it. [01:14:02] And then that started me down the computer networking realm. [01:14:05] And like, look, how could I get this 286 to act like a 386? [01:14:08] Or how could I force it to run Windows? [01:14:09] Or how do I update the memory? [01:14:11] How do I do these things?
[01:14:12] In this little town, Edinburg, North Dakota, there was a guy who had a computer store in the basement of an old general store, and his name was Jeff Munzerbrotten. [01:14:20] And I would go there and ask him questions about computers and just start learning the ins and outs of how do I update the RAM? [01:14:26] How do I get more memory? [01:14:28] How do I augment the storage? [01:14:30] How could I force this thing to run Windows 3.1 so I could have a GUI instead of using a command line? [01:14:36] GUI meaning graphical user interface. [01:14:37] Graphical user interface. [01:14:39] Yeah, sorry. [01:14:40] And so that kind of started me on that. [01:14:43] And that, for me, like I said, I had all kinds of problems with attention deficit disorder and not being able to pay attention. [01:14:51] That was the only time. I would go for three days. [01:14:55] I don't believe in ADHD. [01:14:58] I might be wrong, but I think it's a superpower. [01:15:01] I mean, it certainly, I remember I would spend two days working on a problem and not sleeping. [01:15:05] That's what I'm saying. [01:15:06] I think it's a superpower. [01:15:07] I think it just keeps you from being interested in things you're not interested in. [01:15:11] Yeah, I have a theory on that too that I can get into after. [01:15:15] But that started me down that road. [01:15:17] But in school, I couldn't pay attention. [01:15:18] Me neither. [01:15:19] There was this teacher. [01:15:20] I always tell a story. [01:15:21] She's a great teacher. [01:15:22] She's still around. [01:15:23] Her name is Connie Trenbeth. [01:15:26] And she was my English teacher or literature teacher or something like that. [01:15:29] She might not even remember the story, but here I am telling it on your podcast. [01:15:32] I remember it. [01:15:34] She kept me after class once, and she goes, you know, I knew your dad, Bill. === Forgetting Licenses to Join Military (03:32) === [01:15:38] And, you know, your uncles were all smart.
[01:15:41] And my great uncle has an engineering wing of a school named after him out in western North Dakota. [01:15:48] And she goes, all these guys were thinkers, and your dad did all this great stuff and built all this stuff. [01:15:52] And essentially what she was telling me is, you're a waste of life. [01:15:58] Like all you do is you come in here, you disrupt the class, you upset people, no one can talk. [01:16:04] Sounds like me. [01:16:05] You're trying to dominate every conversation. [01:16:07] But when, you know, you had written one paper on something that interested you, and I don't remember what it was. [01:16:13] And she's like, that was a wonderful paper. [01:16:15] She's like, if you could just do that every time. [01:16:18] And I was not hearing it. [01:16:21] Like, I remember the conversation because I actually remember her. [01:16:24] I think she said waste of life. [01:16:25] I think she actually said that. [01:16:27] Like, you're wasting, like, obviously my RPMs, my CPU clock's high. [01:16:32] I'm always thinking, even when I'm not thinking, and even as we're sitting here talking, I'm thinking about other things or stuff I want to do when I get back to my computer or stuff I want to do for my business. [01:16:41] And so I joined the military. [01:16:44] And the absurdity of life is this. [01:16:47] I had joined to be a military policeman, which I absolutely would have hated. [01:16:52] They all got turned into infantry or stood gate guard, which is a needed function in the military, but it doesn't apply to my personality. [01:16:59] But when I went to the recruiting station out in Minneapolis, I think it was, I was a bonehead and I forgot my driver's license. [01:17:06] And they're like, well, and I was supposed to leave. [01:17:08] And at this time, I had dumped my girlfriend, told everyone goodbye. [01:17:12] I'd wiped the dust off my boots, like, left Cavalier, North Dakota. [01:17:16] And I was like, hey, I'm not going back.
So whatever we got to do right now. [01:17:24] And he's like, well, you can go home, get your license, because the MEPS station was in Minneapolis. [01:17:31] Was it Fargo? [01:17:32] It doesn't matter. [01:17:32] It was five, six, seven hours away. [01:17:34] And they're like, well, you're not leaving today without a driver's license. [01:17:39] So I looked at my recruiter and I was like, I don't know what job you need to get me into, but it needs to be a different job. [01:17:44] And they're like, well, you scored exceptionally high on the general technical part of your ASVAB, which is like understanding machines and objects and stuff. [01:17:52] So we could get you into this like Intel job where you'd learn about radars and stuff. [01:17:57] And that immediately clicked for me. [01:17:59] And then he's like, well, we got to go brief you in this SCIF room. [01:18:02] That's a Sensitive Compartmented Information Facility. [01:18:05] There's only one guy who's got a clearance and he can brief you on the job. [01:18:08] And if you want that job, then you can leave tomorrow. [01:18:10] I instantly started hearing, like, the James Bond music, you know. Dang it, yeah. [01:18:15] Yeah, and so they walked me in this back place and, you know, nothing super crazy, and briefed me up on the job. [01:18:23] And I went back out and I said, yeah, this is actually the job for me. [01:18:26] So the absurdity of life is me forgetting my driver's license when I was 16. [01:18:29] I was 16 when I signed up. [01:18:32] Maybe 17. [01:18:33] No, I was turning 17 that December. [01:18:35] When I signed up for the military, I can connect a string from forgetting my driver's license to being here with you today. [01:18:43] You can sign up when you're 16? [01:18:45] I think I was turning 17. [01:18:47] You can sign up when you... [01:18:48] I didn't even know you could sign up when you're 17.
[01:18:49] I had signed my delayed entry program thing, and I left a little bit before my 18th birthday. [01:18:55] So I had graduated from high school. [01:18:57] But yeah, you can sign up when you're 16, I believe, as long as your parents sign the waiver. [01:19:01] My mother signed the waiver. [01:19:02] She was happy to get me out of the trailer. [01:19:05] So, yeah, I was 17, almost 18 when I left. === Lockdown Mode for High-Risk Meets (15:23) === [01:19:10] Yeah, right there. [01:19:11] So that's all the pieces. [01:19:12] They call it a crystal radio. [01:19:13] Yeah, I was going to say crystal controlled. [01:19:15] That's a radio? [01:19:16] There it is. [01:19:17] That's actually the exact thing. [01:19:18] That is almost exactly what it looks like. [01:19:22] Slinky made it. [01:19:23] Well, they bought the brand. [01:19:23] It's just under the Slinky brand now. [01:19:26] There's a bunch of these all over the internet. [01:19:28] Yeah. [01:19:28] Wow. [01:19:30] Make your own working radio without a battery. [01:19:32] Yeah, and it uses a, I was going to say crystal controlled radio because it uses a crystal diode on it. [01:19:38] Would you say Tesla coil, Jamie? [01:19:39] So yeah, it's a Tesla coil. [01:19:41] This guy's explaining it. [01:19:42] So this thing is actually kind of cool too. [01:19:44] Let me find this thing. [01:19:46] A rocket radio, they called it, which is like a further development. [01:19:49] This thing. [01:19:50] It attached to a phone. [01:19:53] So you plug that onto a phone cable. [01:19:55] There's a picture of it somewhere on here, but it explains, like, you're picking up... [01:20:00] There you go. [01:20:02] Wow. [01:20:03] Wow. [01:20:04] No battery or current needed, hence no operating expense and long life. [01:20:08] Yeah, this one goes onto a phone. [01:20:11] What year was this? [01:20:12] Man, this is old. [01:20:14] Yeah. [01:20:15] So it also shows here, this is like you're picking up power from a radio tower.
[01:20:19] Yeah. [01:20:19] Wow. [01:20:20] The more powerful the signal. [01:20:21] That's what they're paying for at the FCC. [01:20:23] The more powerful your radio tower, the longer and more people you can reach. [01:20:28] Crazy that it has no battery. [01:20:31] And that's also why some radio signals come in very well on your radio and some don't. [01:20:35] And they sound like dog shit. [01:20:36] Yeah. [01:20:37] Weak power. [01:20:38] Yeah. [01:20:39] And then there's the frequency modulation. [01:20:41] Like, amplitude modulation isn't as efficient as frequency modulation when it comes to the vocoder producing sound. [01:20:49] Amplitude modulation travels farther, but it doesn't have the amount of information. [01:20:55] The carrier wave can't be modulated with as much information as you need, whereas frequency modulation is much quicker, megahertz, and you can add more sound, more information, which is why it sounds better. [01:21:08] So FM sounds better, but it doesn't travel as far. [01:21:11] Right. [01:21:11] And AM sounds worse. [01:21:13] When I was training people in the military on this, I always used the analogy of: if a party is happening next door, you can hear the bass, but you can't hear the treble. [01:21:21] You can hear the bass because that frequency travels farther, because it's lower in the frequency band. [01:21:27] But you can't hear the treble because it's higher frequency and there's more modulation. [01:21:34] And so it disperses quicker and you can't hear it as well. [01:21:37] And it's the same thing with, like, VLF comms coming off of a submarine. They can travel underwater for a very long way, but you can't put as much information in them as you could if you were doing, you know, VHF or UHF comms, where there's lots of modulation. [01:21:53] So it's the dispersal.
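The AM-versus-FM trade-off described here can be sketched numerically. This is a minimal illustration of my own, not anything from the episode: it builds an AM and an FM version of a single 440 Hz tone, then recovers the tone from the AM signal by rectify-and-smooth envelope detection, which is essentially what a crystal radio's diode and earphone do. All parameter values are arbitrary.

```python
import math

FS = 100_000                    # sample rate (Hz)
FC = 10_000                     # carrier frequency (Hz)
F_MSG = 440                     # "audio" message tone (Hz)
N = 2_000                       # 20 ms of samples

t = [n / FS for n in range(N)]
msg = [math.sin(2 * math.pi * F_MSG * ti) for ti in t]

# Amplitude modulation: the message rides on the carrier's envelope.
am = [(1 + 0.5 * m) * math.cos(2 * math.pi * FC * ti) for m, ti in zip(msg, t)]

# Frequency modulation: the message shifts the carrier's instantaneous frequency.
KF = 2_000                      # peak frequency deviation (Hz)
phase, fm = 0.0, []
for m in msg:
    phase += 2 * math.pi * (FC + KF * m) / FS
    fm.append(math.cos(phase))

# Crystal-radio-style AM demodulation: rectify (the diode), then
# low-pass (the capacitor/earphone) with a 1 ms moving average.
rect = [max(s, 0.0) for s in am]
W = 100                         # 1 ms window nulls the carrier ripple
env = [sum(rect[i - W:i]) / W for i in range(W, N)]

# The envelope should track 1 + 0.5*msg; check correlation with the message,
# aligned for the moving average's half-window delay.
mean_e = sum(env) / len(env)
x = [e - mean_e for e in env]
y = msg[W // 2 : N - W // 2]
num = sum(a * b for a, b in zip(x, y))
den = math.sqrt(sum(a * a for a in x) * sum(b * b for b in y))
corr = num / den
print(f"correlation between recovered envelope and message: {corr:.2f}")
```

The FM signal carries the message in its instantaneous frequency rather than its envelope, which is why this simple diode-style detector works on AM but not FM, and why FM packs more information at the cost of range.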
[01:21:54] And, you know, a lot of the mid-part of my career was explaining this stuff to, you know, military guys who were trying to understand, like, here's how a cell phone works, and this is how frequency works, and this is how we send information. [01:22:06] And just kind of demystifying, you know, how a GSM network works. [01:22:12] One of the things that I wanted to ask you about that is, when new technology is emerging, how do you stay ahead of the ability to extract information from this technology, hack into networks, before people understand the capability? [01:22:35] You really can't. [01:22:36] You really can't. [01:22:37] And that's the beauty of the free market, is that the innovation to perform the function that you want someone to pay for will always move faster than your ability to exploit the technology. [01:22:48] Then how do you explain things like Pegasus? [01:22:51] Well, I mean, something like Pegasus, well, first off. [01:22:55] Explain Pegasus to people that don't know. [01:22:57] It was a persistent implant on cell phones for people. [01:23:02] Initially, you had to click it. [01:23:04] It was a click exploit. [01:23:05] Initially, it was a click, and then it became a non-click exploit. [01:23:08] So in other words, you had to interact with something on the phone in order to initialize and install the implant. [01:23:14] But the reason why it was so good is because it wasn't stored in the usual areas where you would want a persistent implant, or where you would have a persistent implant. [01:23:27] For instance, you know, you might want to put it in the application layer of an app or something like that, where there's a binary that can run and execute commands or functions. [01:23:38] And so they, I won't get into the very specifics of where and how they did this, because I'm not sure if I got this information from the government or not, so I won't say it.
[01:23:47] But they stored it in a place where it wasn't normal. [01:23:50] And you can read papers on your own and look at the forensics of it and how the actual implant was executed. [01:23:57] But it essentially allowed people to own your phone, and it was the kind of implant I only dreamed of when I was helping develop my own implants in the military. [01:24:10] Mostly what we would rely on is zero-days in the architecture, looking for something in a phone that either they hadn't patched or that the particular phone you were looking at hadn't patched. [01:24:20] So phone makers have their own red teams going through the phone, because they want to sell a product that people will use, and people won't use stuff that can get hacked. [01:24:29] So they'll do their own red teaming and they'll discover, like, oh, you know, on this router we developed, we left this port open and it shouldn't have been open. [01:24:37] So now we're going to write a patch that will close that port so that this port is no longer accessible by a guy like me. [01:24:42] So I can't go in there and do something to this particular type of router. [01:24:46] Another great thing, I'll say something good about the administration. [01:24:49] They're doing some stuff right now to make sure that we're getting rid of Chinese technology and Chinese routers. [01:24:55] And, you know, the PLA has a widespread network, and I can't remember the name of the botnet, but they essentially implanted a bunch of old unpatched routers to get access to government- and business-proximal people. [01:25:13] And it was widespread and huge. [01:25:15] And, you know, it looked to me, I haven't read this anywhere, but if I were looking at this implant and how it was done, like they were trying to really cause some trouble. [01:25:25] It was being placed at critical places. Think power, think energy, think banking. [01:25:31] Like, they really wanted to cause some ruckus.
[01:25:34] And I have not been part of this administration, so I'm not saying anything classified, for those of you who are listening. [01:25:40] But there was a decision to say, hey, we need to make sure that these things get patched, and also that we're not bringing in architecture from overseas, because they don't play by the same rules that we at least say we play by. [01:25:51] So that's why they banned Huawei devices. [01:25:53] Oh, yeah, and ZTE. [01:25:54] Yeah. [01:25:55] Huawei had a phone that I was really interested in back in the day. [01:25:59] Porsche Design had partnered with Huawei and made this insane Android phone with, like, the best camera, the best battery. [01:26:07] It was like really high level. [01:26:09] And I was, like, gonna buy it. [01:26:11] And then all of a sudden they banned all the Huawei phones. [01:26:13] And I was like, what's going on? [01:26:15] And then, you know, I had heard some people say, oh, they're just trying to stop competition. [01:26:19] It's like American companies are trying to stop it. [01:26:22] And then I went into it deeper and I said, no, it seems like there's third-party access on some of their routers and some of their network devices, engineered in so that they could be accessed by a third party. [01:26:38] And because of whatever, lack of understanding, lack of knowledge of how these things are constructed, the people that purchased them weren't aware of it. [01:26:49] And these things had gotten into place. [01:26:51] And they had gotten into place in universities. [01:26:53] They got into place in military establishments. [01:26:56] They were using them in cell phone towers that people had, you know, inadvertently bought from China. [01:27:01] Yep. [01:27:02] And that's really, I mean, I can tell you firsthand from having done some of the forensic exploitation on this stuff.
[01:27:08] Another large part of my career I didn't talk about was just on mobile forensics and media forensics, which is essentially, you think of, like, CSI: Miami, or CSI: whatever the city was. [01:27:18] There's a crime, someone was killed. [01:27:20] You have forensics people doing forensics on, like, blood and fingerprints and blood spatter and all that stuff. [01:27:25] There's a whole other part of that same forensics branch that focuses on media forensics. [01:27:30] What was deleted off this phone at one point? [01:27:32] What remains on this phone? [01:27:34] What was it being used for? [01:27:35] I would do this in the military so that when we did do an operation, and I was part of some of the largest ones ever done out in Afghanistan, there would be treasure troves of phones and all of these computers and stuff like that. [01:27:47] And it was my job. [01:27:49] And I had a great team that worked for me. [01:27:51] In my deployment in 2015, we would go in afterwards, gather up all of this stuff. [01:27:57] And, you know, the task force commander would literally be standing by, and we would say, you know, here's the intelligence that we've derived. [01:28:04] Here's the multi-point analysis. [01:28:06] You know, it was on this hard drive. [01:28:07] It was here. [01:28:08] It was here. [01:28:08] You know, there's a bad guy place out here. [01:28:10] And those guys would be rolling, like, within moments after the last operation. [01:28:13] Like, some operations we'd do where we'd be rolling one target after another because we were getting really good at media forensics and the intelligence that was there. [01:28:22] And then getting into active media forensics, which is a different discipline. [01:28:25] But essentially, I can get into that later if you want to.
[01:28:28] But launching and doing these follow-on operations off, you know, dumping the binary from a phone and examining it at the ones-and-zeros level to say everything that was going on with this thing. [01:28:40] Or if it was a really high-value target, like, the organization that I worked for at that time did the analysis of the Osama bin Laden media. [01:28:48] And, you know, on that media, we're doing far more than we would for another piece of media, in that we're, you know, x-raying it and we're looking at maybe what the disk looked like before, or what was destroyed, or reconstructing things, spending millions of dollars on that intelligence analysis, because we wanted to fully understand everything that this guy was involved in and what he was doing and where he was and who he was talking to. [01:29:10] And so that was another part of my career that I did for about five years or so. [01:29:14] What was going on with the Huawei phones? [01:29:16] Like, what were they doing with them? [01:29:18] I mean, some of them were coming out implanted. [01:29:22] In other words, there was access built in for a foreign actor. [01:29:25] And then in other cases, other places with routers, with the ZTE stuff, there were just things that you would patch or that you would fix as a company who was trying to protect the consumer and create a product that people would use. [01:29:38] And they weren't doing it. [01:29:39] So they were creating persistent back doors, either by actively placing code on there that would allow root access, or they were leaving things open. Especially in Africa, like, the work that, you know, when I was working in Africa, the Chinese were just owning Africa. [01:29:54] They were just giving them communications infrastructure. [01:29:58] And they were doing that because they wanted their resources and they wanted to know what these people were saying and what they were doing. [01:30:04] And so I'm a free market guy. Like, I'm as free market as a guy can get.
[01:30:09] I want the best people building the best products and I want everyone to be able to compete. [01:30:13] But in that case, I would never own a Huawei or a ZTE or anything else. [01:30:18] On a consumer level, what were they doing with those phones? [01:30:21] Like, if they had imported them to the United States, if they didn't have that ban, what would have been the issue? [01:30:27] Getting access to, you know, any number of people. The Chinese really want access to everybody. [01:30:34] But you could start at the top level of just saying, you know, getting Joe Rogan to use a ZTE would be, that would be my wet dream as a guy who used to do this work back in the day, because you're talking to the president or you're talking to this guy or that guy. [01:30:46] And I can build out a network of understanding who you're in contact with, who you're talking to, what's being talked about. [01:30:53] But then also finding out this person's phone number and now doing a deep dive on there. [01:30:57] So it's really about getting all of that data and constructing an analyst notebook, essentially, an outline of who's talking to who, who do we need to implant. [01:31:07] But it's for business as well. [01:31:10] They would want this in the hands of somebody who's in charge of a business because they want their IP. [01:31:14] They would want this in soldiers' hands so they would know deployment dates or who's going where and who's doing what. [01:31:18] They want this in routers because routers are usually the most unpatched piece of technology, in that you're not patching them. Especially, you know, these days there's more automated patching. [01:31:28] But back in the day, like, you had to manually update a router. [01:31:31] And if you didn't, well, then you had potential exploits that were sitting on that router, where I could gain access to the router in your home, or I could gain access to a BGP router, which is like a border gateway, which is moving all of the internet data.
[01:31:44] Or I could get access to a microwave terminal. [01:31:47] If you look at a cell phone tower, they've got the microwave terminals on there that are sending information in between them. [01:31:52] If those are Chinese parts that are either being used for the processing, the CPU, or the physical infrastructure of that, the products that they were putting out would give me direct access to the information that's being passed on those terminals. [01:32:05] So you're getting, you know, system-level, root-level access through machinery, through communication devices, and through things like routers, where you can know everything you want to know about your enemy. [01:32:17] Wow. [01:32:18] And so as far as today's technology, I see you use an Android phone. [01:32:23] Is there a phone that is more secure or a platform that is more secure? [01:32:29] It all depends. [01:32:31] I always take this from Thomas Sowell. [01:32:33] There are no answers. [01:32:34] There are only trade-offs. [01:32:35] So the way to answer that question would be: who are you? [01:32:39] What are you trying to do with your life? [01:32:41] What are you talking about on your phone? [01:32:42] What are you doing on your phone? [01:32:44] Most of these phones, if you're just an average everyday citizen who's just going about your job, the phones today are pretty secure, especially versus a few years ago. [01:32:54] If you're a reporter, now the nexus is: do you trust the government and do you trust Apple? [01:33:01] If you trust the government and you trust Apple, then Apple's probably your best bet. You know, there's lockdown mode on an Apple phone, or whatever they used to call it back in the day. [01:33:11] I think it was called reporter mode, but there were ways to encrypt the devices and to encrypt the chatter and the tunnel coming out of the phone, the RF coming out of the phone. [01:33:22] What is lockdown mode?
[01:33:24] I don't know if that's exactly what it was called or not, because I've never really used Apple, just for my own personal reasons. [01:33:29] What personal reasons? [01:33:31] I don't trust Apple. [01:33:32] How so? [01:33:34] They are more interested in monetizing people's data than they are in providing them capability. [01:33:39] So every time you take a photo, every time you upload a document, every time you talk to it, every time it asks you about your, you know, you'll get these questions where it says if your password's lost, you can back up your password in these ways. [01:33:52] Tell us where you were born. [01:33:54] Tell us your mom's maiden name. [01:33:55] Tell us your mom's this, your mom's that. [01:33:57] Lockdown mode is an extreme, optional protection that should only be used if you believe you may be personally targeted by a highly sophisticated cyber attack. [01:34:03] Most people are never targeted by attacks of this nature. [01:34:06] When iPhone is in lockdown mode, it will not function as it typically does. [01:34:09] Apps, websites, and features will be strictly limited for security, and some experiences will be completely unavailable. [01:34:16] Yeah, so when I was advising guys back in the day on going out and doing, like, a high-risk source meet, so they're going to go meet a spy for another country, and you're a military guy and you're debriefing someone or doing something, I was always telling them to use lockdown mode. [01:34:30] I knew that it did those things. [01:34:31] I didn't know if that was the term or if I'd thought about it. === Android Code vs Meta Privacy (15:27) === [01:34:33] So can you still send iMessages? [01:34:36] You can still text and call. [01:34:37] Text and call, that stuff. [01:34:38] Yeah, but there's other things that you can't do. [01:34:41] Well, like, Meta just recently announced they're no longer encrypting your DMs. [01:34:46] Why would they do that?
[01:34:47] Well, they said that it's for protection or whatever, to make sure that people aren't doing bad things. [01:34:52] I don't know. [01:34:53] See what their explanation for it was. [01:34:57] What was it? [01:34:57] Sorry, I'm worried about this reporter. [01:34:59] I'm sorry. [01:35:00] Meta. [01:35:02] Meta recently announced that they're no longer encrypting your DMs on Instagram. [01:35:08] And a lot of people are up in arms, and they're stopping using any DMs on Instagram and any of that stuff. [01:35:15] The idea is that other people can read your stuff now. Whether it's Meta that can read your stuff, or whoever. That's what I mean. [01:35:23] Yeah. You asked why I don't trust Apple. [01:35:24] It's the same reason I don't trust Meta. [01:35:26] They're not interested. [01:35:27] The dangers behind Meta killing end-to-end encryption for Instagram DMs. [01:35:31] Meta blamed users for not opting into the privacy-protecting feature. [01:35:34] Experts fear the move could be the first major domino to fall for end-to-end encryption tech worldwide. [01:35:40] That's a horrible narrative. [01:35:42] Yeah, it seems squirrely. [01:35:47] So. [01:35:48] Oh, you've read your last free article. [01:35:50] Oh, my God. [01:35:51] Give me money, motherfucker. [01:35:52] But what Apple and Meta want to do is, like, they're trying to build these new neural networks. [01:35:57] They're trying to, you know, humans, and we can get into this too later if you want. [01:36:02] Humans are the only thing, in my opinion, and I'm happy to have you disagree with me, and I'd love to have this conversation. [01:36:08] In my opinion, we're the only ones that are. [01:36:10] After May 8, 2026, announced plans to discontinue support for end-to-end encryption for chats on Instagram. [01:36:16] If you have chats that are impacted by this change, you will see instructions on how you can download any media or messages you may want to keep.
[01:36:23] Social media giant said in a help document, if you're on an older version of Instagram, you may also need to update the app before you can download your affected chats. [01:36:31] When reached for comment, this is what Meta had to say. [01:36:34] Very few people are opting for end-to-end encrypted messages and DMs, so we're removing this option from Instagram in the coming months. [01:36:40] Anyone who wants to keep messaging with end-to-end encryption can easily do that on WhatsApp. [01:36:44] But WhatsApp is a little squirrely, right? [01:36:46] WhatsApp. [01:36:47] Yeah, I mean, they're all squirrely. [01:36:49] And that's the problem. [01:36:51] And so you asked me why I don't trust them. [01:36:53] It's because they want to, they want to use, so humans, in my opinion, and some animals, are the only things that have the ability to project consciousness. [01:37:04] And projecting consciousness is how you train a neural network. [01:37:07] And it's how you train all these large networks. [01:37:10] A lot of my time in the military was also spent on this. [01:37:12] I was doing artificial intelligence in 2012, 2011, before it was even a catchphrase. [01:37:17] We were using artificial intelligence to map dynamic networks and to do other things, more pragmatic uses of it than how it's being used today with large language models or convolutional neural networks. [01:37:27] But they need consciousness to train their models. [01:37:30] So when Google or Meta or Instagram or whoever else offers you photo storage, it's because they want your face to train neural networks. [01:37:38] If they're going to pay for the compute, if they're going to pay for the storage for these things, they're doing it because they're going to use the data. [01:37:46] If you're getting a free app, in essence, any free app, if the product's free, then you're the product.
[01:37:52] So when Google is allowing you to use a Google Drive and get a gig of storage, they're going to use those photos to train neural networks to do better facial recognition. [01:38:00] What if you're paying for Google Drive? [01:38:02] I don't know about their terms of service now. [01:38:04] That is one of the best things that I do with large language models: any product I download, I have the neural network examine the terms of service. [01:38:14] And then you can pretty much understand, like, here's my focus. [01:38:17] Here's the 40-page terms of service document. [01:38:20] When you click that link that you got, what are they able to do with my data? [01:38:23] So that's how I sign up for apps. [01:38:24] And that's one of the great uses of a large language model, in my opinion, is to quickly understand how these things are being used. [01:38:31] And that's why I say, with Apple, with Meta, with all of these large information companies, you are more the product than the product is the product. [01:38:37] And that is because they're trying to build the most powerful, capable artificial intelligences, which I think is a misnomer. [01:38:44] And again, we can get into it later. [01:38:46] But they're trying to build these hyper-competent artificial intelligences. [01:38:50] And you need two things for that, really: training data and compute. [01:38:55] And that's why you start seeing them coming out with, like, Meta's building its own nuclear facility or something like that. [01:39:01] And they need more training data. [01:39:04] So if I want to build a replica of Joe Rogan that I can make hyper-realistic AI videos from, I need every picture of your face from every angle. [01:39:13] I need every wince, every squint, everything you've ever done. [01:39:16] So I can introduce more training data to better train that neural network in order to generate more hyper-realistic versions of yourself.
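The terms-of-service trick described here, handing a long document to a model with one focused question, is mostly prompt construction. A minimal sketch in Python: the function name, chunk size, and wording are my own inventions, not any particular product's API. The idea is to chunk the document so it fits a context window and repeat the question with each chunk.

```python
def build_tos_prompts(tos_text: str, question: str, chunk_chars: int = 8000) -> list[str]:
    """Split a long terms-of-service document into prompts for an LLM.

    Each prompt repeats the focused question so the model answers it
    against that chunk alone; the per-chunk answers can then be merged
    by hand or with one final summarizing prompt.
    """
    chunks = [tos_text[i:i + chunk_chars] for i in range(0, len(tos_text), chunk_chars)]
    prompts = []
    for n, chunk in enumerate(chunks, start=1):
        prompts.append(
            f"Here is chunk {n} of {len(chunks)} of a terms-of-service document.\n"
            f"Answer only this question, citing the clause if possible: {question}\n\n"
            f"{chunk}"
        )
    return prompts

# Example: a 20,000-character ToS becomes three focused prompts.
fake_tos = "x" * 20_000
prompts = build_tos_prompts(fake_tos, "What can this company do with my photos?")
print(len(prompts))  # 3
```

Sending each prompt to whichever model you use is the part that varies by product; the focusing question is what keeps a 40-page document from producing a 40-page answer.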
[01:39:25] And so when a company is offering you something for free, and it's fine, like, if people are fine with that idea, then by all means, download all the free apps that you want. [01:39:34] But if you're downloading a free app, it's because you are the product. [01:39:37] They either want to see how you type, they want to see what you're saying, they want to see how you're thinking about things, they want to understand your political biases, they want to look at your photos. [01:39:45] And this isn't because they're a deep-seated nation-state actor. [01:39:49] They can become that, but it's because they're trying to build the best products, because the big money is in AI. [01:39:55] That's where the biggest money is. [01:39:57] So anytime you're doing any of these things, and it's just been obvious to me, not from the onset, but pretty close to the onset. Yeah, this is a good example, right? [01:40:06] Pokemon Go players built a 30 billion photo map. [01:40:09] That's now training robots to deliver your pizza. [01:40:12] There you go. [01:40:14] So, you know, they use people, and they can say they don't. [01:40:18] And maybe if someone from there catches this podcast, which they well could, they might put out a statement saying that that's not their doing. [01:40:24] But I'm telling you, as a person who has done media forensics, who has done computer network operations, and who has trained artificial intelligence models, that is precisely what they are doing. [01:40:35] That is their statement. [01:40:37] What is the difference between using Apple and using Android? [01:40:41] Well, Android will do the same things, and Google will do the same things. [01:40:43] It's just that I can root my phone, or I can install a custom operating system like GrapheneOS or something like that, which I'm not doing right now. [01:40:52] I had to make a sacrifice when I started my company, SpartanForge.
[01:40:56] And the sacrifice was I had to be the face of this product. [01:40:59] And so I never had a social media until I started the company. [01:41:03] And I didn't upload things to the cloud until I started this company. [01:41:06] And it became just like, I have to sell a product. [01:41:09] I have to, you know, and I'm actually selling a product, not people's data or people's photos. [01:41:14] I have to sell this product. [01:41:15] I have to let people know who the company is. People often don't know who is behind a company, or who the organizing principal is, and what they care about in the company. [01:41:23] And I just made that trade and said, I'm going to have to become a public person and start putting things out there. [01:41:28] And so, you know, I started a company. [01:41:31] We started our first Instagram, or my marketing team started my first Instagram. [01:41:36] And I had to start uploading things and talking about how I felt about things, because I wanted people to know that this company was not going to be like the other companies that are out there. [01:41:46] We don't sell their data. [01:41:47] We don't sell emails. [01:41:48] I could make a half million dollars off my email list tomorrow. [01:41:51] And I've been offered that money. [01:41:52] You know, we've got millions of emails from people who have signed up for our apps. [01:41:55] Other companies who are starting companies, they want to go out and market to people. [01:42:01] So if you're starting another hunting app, maybe for cameras or for a call, a turkey call or an elk call or something, and you found Spartan Forge and you said, man, they've got 2 million emails. [01:42:13] I could pay them a half million dollars for those 2 million emails and start some top-of-funnel marketing, and go blast them. [01:42:21] So they would pay me a lot of money for those emails. [01:42:23] I will never do that. [01:42:24] I'll never sell my company's emails, the people's emails.
[01:42:27] I'll never do any of those things, because the product is the product for my company. [01:42:31] It's not the people. [01:42:33] So the reason why you use Android over Apple is the ability to root it and install things like GrapheneOS. [01:42:40] Yeah, custom OSs. [01:42:42] But yet you don't use it. [01:42:43] Not now, but what I still can use and what I still do use is, Android also publishes their framework in an open source fashion, where you can look at the Android source. [01:42:53] It's called AOSP, the Android Open Source Project. [01:42:57] So the basis of Android, think of it as the nuts and bolts. [01:43:01] I'll try not to talk in too technical terms here. [01:43:04] But the basic framework, think about it like a car. [01:43:07] The frame and the engine makeup is published, so you can look at how things work on the inside. [01:43:12] Apple goes the opposite way, and they don't publish any of that, and you can't see any of that stuff. [01:43:16] I'm for the free and open version because at least, if I'm worried about my phone having a problem, I can actually dump the binary, or I can create an E01 file and exhume it. [01:43:27] I can look at the binary and say, is my phone acting like it should, or doing what it should? [01:43:31] Or is there some kind of persistent implant? [01:43:33] I wouldn't be able to do that with a... I would have to trust Apple and Apple's ecosystem and whoever they're... McAfee or whatever they're using. [01:43:41] I would have to trust them, which I don't. [01:43:43] So I like the Android. Is that option available for the average consumer that's not that learned in computers? [01:43:51] Well, the great part about large language models now is, if you wanted to dump your own phone today, you could follow along with a large language model and do it, your own Android. [01:44:00] And how would you do that?
[01:44:02] Well, you would have to buy some expensive... there is something... you'd either have to pay a firm to do it, or you could download things like Cellebrite. [01:44:12] You could get a Cellebrite, or there's other things called Forensic Toolkit, other things like that, that allow you to examine your phone at a deeper level. [01:44:21] And is this an app, this forensic stuff? [01:44:23] They're products. [01:44:24] Products. [01:44:24] They're products. [01:44:25] So it's a physical product. [01:44:26] To dump your phone into? [01:44:27] Yeah, and they're software. [01:44:29] And there's connecting and all that type of stuff. [01:44:31] Tools I used throughout my military career. Cellebrite is one of them, but they're Israeli-owned. [01:44:37] I've got nothing against Israel. [01:44:39] I've just got everything against foreign actors. [01:44:41] Just, if they're not an American company, that automatically kicks them down a level for me. [01:44:46] So anyway, there's all kinds. Android just makes it much easier to examine your phone, or to understand if you've got something going on that's funky, than it is on Apple. [01:44:58] So for the average person, like, for me, like, if I got... [01:45:01] You're not the average person. [01:45:02] Well, let's pretend. [01:45:03] If I got an Android phone and I wanted to examine my phone, what would be the process? [01:45:09] You would download some of the software that I talked about. [01:45:11] You would jack your phone into it. [01:45:13] You would open your phone, and then it would start carving the binary of your phone, everything in your phone. [01:45:21] You could create a one-to-one emulation of your phone if you wanted to. [01:45:24] And then you would be able to get under the hood and examine the apps. [01:45:27] You'd be able to examine the binary. [01:45:29] What's the executable code?
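The "one-to-one emulation" step described here rests on verified imaging: copy the source bit for bit, hash both sides, and work only on the copy. This is a stdlib-only sketch of that idea, with made-up file names standing in for a real device; commercial tools like Cellebrite or FTK record these integrity hashes inside the E01 container itself.

```python
import hashlib

def image_and_verify(source_path: str, image_path: str, block_size: int = 1 << 20) -> bool:
    """Copy a raw source to an image file, hashing both streams.

    Returns True when the image's SHA-256 matches the source's,
    i.e. the copy is bit-for-bit identical and safe to examine
    without ever touching the original again.
    """
    src_hash = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as img:
        while block := src.read(block_size):
            src_hash.update(block)
            img.write(block)

    img_hash = hashlib.sha256()
    with open(image_path, "rb") as img:
        while block := img.read(block_size):
            img_hash.update(block)

    return src_hash.hexdigest() == img_hash.hexdigest()

# Example with a stand-in "device" file.
with open("device.bin", "wb") as f:
    f.write(b"\x00pretend this is a phone partition\xff" * 1000)

print(image_and_verify("device.bin", "device.img"))  # True
```

In practice the hash of the original is recorded once at acquisition time, so any later re-hash of the image proves nothing was altered during analysis.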
[01:45:30] You'd be able to look at all of those things and then determine, because the Android Open Source Project is published, you could do a one-for-one and say, well, you know, at the kernel level, there's this weird code that's not in the Android build. [01:45:45] So what is this code? [01:45:47] And then with a neural network, you could probably, I've never done it, but I'm sure you could figure out what the intent is of that code, even for a layperson. [01:45:55] So I could take that information, I could put it into Perplexity, and Perplexity would lay out what's going on with it? [01:46:01] Ostensibly, it would be able to, yes, unless it was some type of weird code. [01:46:05] I don't know, I haven't used Perplexity, so I don't know if they have something like ChatGPT's Codex. [01:46:11] I sort of just tried it, just to be like, can you help me examine my Android phone, looking for any malicious actors? [01:46:16] Yes, I can walk you through a structured, non-destructive check for malware or other shady activity on your Android phone. [01:46:21] First, what are you noticing? [01:46:23] Before tools and commands, quickly check for common warning signs: sudden big battery drain when you're not using the phone, unusual data usage, particularly in the background, apps you don't remember installing, or icons briefly appearing and then disappearing. [01:46:36] Lots of pop-ups, redirects in the browser, or a new default search or launcher, strange calls or SMS messages you didn't send yourself. [01:46:44] If any of those ring a bell, we'll focus on them in later steps. [01:46:47] Yeah, it's just asking you, like, what are you noticing? [01:46:49] So this is just something that you could do with an Android phone that you just can't do with an Apple. [01:46:53] Yeah, Apple's not open. [01:46:54] What are the reasons you don't trust Apple? [01:46:55] Well, could I ask, can I do one thing? Remember that question, because I don't want to forget it. [01:46:59] Could I give you a prompt?
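The one-for-one comparison against the published AOSP build described here boils down to diffing hashes: hash what you carved from the dump, hash the same paths in a known-good build, and whatever doesn't match is what you hand to an analyst or a model. A toy sketch, with made-up paths and byte strings standing in for real partitions:

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

# Known-good baseline: path -> hash, as you'd compute from an AOSP build.
baseline = {
    "system/bin/init": sha256(b"official init binary"),
    "system/lib/libc.so": sha256(b"official libc"),
}

# Hashes carved from the phone dump.
dump = {
    "system/bin/init": sha256(b"official init binary"),   # matches baseline
    "system/lib/libc.so": sha256(b"patched libc?!"),      # modified on-device
    "system/bin/helper": sha256(b"mystery executable"),   # not in AOSP at all
}

def flag_anomalies(baseline: dict, dump: dict) -> list[str]:
    """Return dump paths that are modified or absent from the baseline."""
    flagged = []
    for path, digest in dump.items():
        if baseline.get(path) != digest:
            flagged.append(path)
    return sorted(flagged)

print(flag_anomalies(baseline, dump))
# ['system/bin/helper', 'system/lib/libc.so']
```

The flagged files are the "weird code that's not in the Android build"; real comparisons also account for per-device differences like vendor blobs and build fingerprints, which this toy ignores.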
[01:47:01] Sure. [01:47:01] Because I want to answer your first question that we've already gone past. [01:47:05] Can you bring Perplexity back up, please? [01:47:07] You want to add to that or start a new one? [01:47:08] No, this is fine. [01:47:10] Just say, my friend helped me carve an E01 file, E-zero-one, an E01 file. [01:47:22] And he says that there is code in there that doesn't comport with the rest of the Android system. [01:47:33] Yeah, P-O-R-T. [01:47:35] The rest of the system. [01:47:39] Could I dump that code here, and could you tell me what it means? [01:47:43] I'm sure the answer is yes, but I just didn't want to answer it because I've never done it. [01:47:49] Could you tell me? [01:47:50] Could you tell me, Jamie? [01:47:52] Could you tell me what it means? [01:47:53] Yeah. [01:47:55] It figured that out probably, though. [01:47:57] Get the U out. [01:47:59] Get the U. Tell me you. [01:48:01] Could I have some of your coffee, please? [01:48:03] Yeah, absolutely. [01:48:03] It's for you. [01:48:05] Okay, let's see what it says. [01:48:09] Yes, you can paste suspicious code here, and I can help explain what it appears to do line by line and whether anything looks malicious. [01:48:17] Before you paste, a few important notes. [01:48:19] Remove or redact anything that looks like private data: usernames, passwords, keys, tokens, IPs, email addresses, phone numbers. [01:48:26] Perplexity is our sponsor. [01:48:28] I love that. [01:48:28] Because you'd never get that from ChatGPT. [01:48:31] They'd want all that information. [01:48:32] Perplexity is the shit. [01:48:34] And so you're not sharing personal or case-sensitive information. [01:48:38] Wonderful, Perplexity. [01:48:40] If it's very long, send it in chunks and tell me chunk 1 of 3, chunk 2 of 3, et cetera, so we can keep track. [01:48:48] I can do static analysis here, read and reason about the code, but I cannot actually execute it in a sandbox.
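The redaction step read out above, scrubbing usernames, keys, IPs, and emails before pasting a chunk into an LLM, can be sketched as a small filter. A minimal example, assuming GNU sed; the two patterns are illustrative, not a complete scrubber (no tokens, keys, or phone numbers):

```shell
# Hedged sketch: scrub obvious private data (emails, IPv4 addresses)
# from a carved code chunk before sharing it. Assumes GNU sed.
redact() {
  sed -E \
    -e 's/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/<EMAIL>/g' \
    -e 's/\b([0-9]{1,3}\.){3}[0-9]{1,3}\b/<IP>/g'
}

# Example: scrub one line of suspicious code before pasting it.
echo 'log_upload("admin@example.com", "203.0.113.7")' | redact
# → log_upload("<EMAIL>", "<IP>")
```

You would pipe each chunk through a filter like this (extended with whatever secrets your dump actually contains) before sending "chunk 1 of 3" and so on.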
[01:48:54] So this is more like a careful forensic read-through than a full dynamic malware analysis. [01:49:00] Go ahead and paste the code snippet your friend flagged as not fitting with the rest of the system, and tell me in a sentence or two where in the E01 it came from. [01:49:09] Example: app folder, system partition, random file path. [01:49:13] Yep, exactly. [01:49:14] So yeah, I thought that would be the answer. [01:49:16] I've just never done it. [01:49:17] And you can do a forensic examination of an Apple, by the way. [01:49:22] Sorry if I misspoke there, but you can't do it to the level that you can with Android, because the Android Open Source Project publishes all of the code, so I can get an understanding of the very inner workings. [01:49:32] So if something's being done, for instance, at the kernel, or you could think about it as like the lowest level of the phone, something that wouldn't normally get caught in a forensic examination, I wouldn't be able to do that with Apple. [01:49:45] And the nation-state actors are doing things at very low levels in the code framework for that exact reason, because most people who aren't very deep into forensics would miss that. [01:49:58] It would be like the fingerprint under the couch cushion or something like that. === Forensic Exams of Apple Phones (15:19) === [01:50:01] And what is the difference between what someone can do with an Android phone with the standard Android operating system versus Graphene? [01:50:12] So that gets into, you know, if you wanted to wardrive or sample Wi-Fi networks in an area, or if you wanted to run a barrage attack on a Wi-Fi endpoint, you could work that in there to do things with the phone that you couldn't otherwise do with a standard Android operating system. [01:50:31] But as far as on a consumer level, what protections do you have by running Graphene that you don't have by running Android? [01:50:40] You're much more in control of the ecosystem.
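The "one-for-one" kernel-level comparison against the published Android Open Source Project code described above boils down, at its simplest, to hash-diffing: hash what you carved from the phone and compare it against a known-good manifest. A toy sketch, assuming GNU coreutils; the file names and the "known-good" manifest here are invented stand-ins, not real AOSP artifacts:

```shell
# Toy sketch of hash-diffing a carved file against a known-good manifest.
# dump/boot.img and known_good.manifest are stand-ins; a real check would
# compare against hashes of the actual published build.
mkdir -p dump
printf 'kernel bytes\n' > dump/boot.img           # stand-in for a carved file
sha256sum dump/boot.img > known_good.manifest     # pretend this came from the published build

printf 'kernel bytes plus weird code\n' > dump/boot.img   # the file no longer matches
if sha256sum --quiet -c known_good.manifest 2>/dev/null; then
  echo "matches published build"
else
  echo "MISMATCH: inspect this file"               # the "weird code" candidate
fi
```

Anything flagged MISMATCH is a candidate for the kind of line-by-line read-through being discussed; real forensic suites automate this at scale.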
[01:50:44] You have a firmer understanding. [01:50:45] And again, you could use a large language model to do this, to understand exactly what's being run on the phone. [01:50:50] You control the background services that can be run on the phone. [01:50:53] So if you're getting hot-mic'd, or if your camera's taking pictures of you when you're not looking, or it's listening to you for advertising content, stuff like that, you would be in control of all of that in a way that you're not in control of on native Android. [01:51:04] In control, like how? So would it alert you that this is happening? [01:51:07] Or just the functionality wouldn't be there for it to take place. [01:51:10] Right, because the functionality is only designed for the standard Android operating system. [01:51:15] And I haven't installed Graphene in a while. [01:51:18] So a lot of this, all of this updates, and I could be saying things that are incorrect. [01:51:22] I stopped doing this about three years ago. [01:51:24] Well, I know that there was, I forget what country it was, but they were focusing on people who use Google Pixel phones, for example. [01:51:31] Yeah, that's because that's one of the phones that are more commonly rooted. [01:51:35] Yeah, it's easy to do. [01:51:37] And you could do it with a large language model. [01:51:38] You could sit there and be walked through how to do it, which is a great part of that. [01:51:42] Is it complicated? [01:51:43] For a person like me that's not that astute? [01:51:46] No, but it's not something I would do with a phone that you care about the first few times. [01:51:50] Right. [01:51:50] Because you're going to jack things up. [01:51:51] You have to, you know, get the bootloader, essentially the starting mechanism of the phone that launches all of the other things. You have to get down to that level and unlock it so that you can... Is that available for all Android phones? No, not all Android phones.
[01:52:06] Lots of them lock it down, so you can't do that. [01:52:08] Is that available for Samsung phones? [01:52:10] No, not this one. [01:52:11] So the question has to become, can you unlock the bootloader? [01:52:15] And that is the starting, think of it as the starting engine of the rest of the phone. [01:52:18] Why is that only available on Google Pixel phones? [01:52:21] I'm not sure why they do it that way. [01:52:22] I haven't looked into that. [01:52:23] It's just Pixels. [01:52:24] And the older Samsungs made it available. [01:52:28] Older Galaxy S7s, S10s, you could do more than you can with, like, you know, I've got the Galaxy Fold here, and you can do almost none of that on here. [01:52:37] That is fucking sweet, though. [01:52:39] Yeah, I love this phone. [01:52:40] But like I said, I went away from doing all that, A, because it was work. [01:52:44] B, because I'm not working in national security anymore, and I haven't written an exploit in years. [01:52:51] I don't do this type of work anymore, and I need to sell a product. [01:52:54] And, you know, working with other employees that run my Instagram, or an assistant going through my email, and all those other types of things, it just wasn't pragmatic anymore for me to keep doing that, and I had to give it up. [01:53:05] Would your app work, run on Graphene? [01:53:09] Yeah, well, it could. [01:53:10] Yeah, it would. [01:53:11] You have to sideload the app. [01:53:12] But again, a large language model could walk you through doing that. [01:53:15] So we haven't gotten to that level of... [01:53:18] Does it make sense here that this says it's easier because Google makes it easier? [01:53:22] Yeah. [01:53:22] Yeah, he was just asking me why they make it easier. [01:53:25] And I don't know that answer. [01:53:27] So the process is officially supported in the Android settings under developer options, allowing users to toggle OEM unlocking.
[01:53:34] Simple fastboot method: Pixels use standard fastboot commands that work consistently across all models to unlock the bootloader. Accessibility. [01:53:43] Yeah. [01:53:44] That's what I was talking about. [01:53:45] So, yeah, I don't know why they do it. [01:53:47] It might be that, well, the Android Open Source Project exists. [01:53:52] So it would stand to reason that you would want a way for someone, because what you want is people interacting with that code and red-teaming it and making the code better, and then offering bug bounties so that you can tell Android, like, hey, you've got a critical flaw in your system architecture here, and then they'll pay you 20 grand for that. [01:54:09] I've got friends who do that. [01:54:11] So, you and I talked about Eric Prince's phone. [01:54:14] Yes. [01:54:18] So, the narrative is that that is an unhackable phone. [01:54:23] Yeah, it's just by virtue. [01:54:24] And look, Eric's a wonderful guy, and the principles that he used for the first instantiation of that phone are the correct principles, which is: if you're security-focused at all, you should get away from these big, large conglomerates, because none of your data is private. [01:54:43] That's a correct principle. [01:54:45] An incorrect principle, and I'm going to get shit about this, but I told you in the beginning I care about the truth, and I do care about the truth, is that when you're using a PKI subsystem that relies on Microsoft, then you're not in control of the PKI certificate signing, and Microsoft could cause a bunch of problems, and they were using that. [01:55:05] So, the other thing being, if you're building on the Android Open Source Project, that means the code that you're using as the engine, let's just call it that, of your phone is examinable by the public.
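The Pixel unlock flow in that answer, developer options plus standard fastboot commands, can be sketched as below. This is a dry run by default: `unlock_pixel` is a made-up helper name that only prints the commands unless you pass an empty first argument, and actually running them requires a Pixel with OEM unlocking toggled on, platform-tools installed, and a backup, since unlocking factory-resets the device:

```shell
# Dry-run sketch of the standard Pixel bootloader-unlock flow.
# By default only PRINTS the commands (run=echo); hypothetical helper.
unlock_pixel() {
  local run=${1-echo}             # pass "" as $1 to really execute
  $run adb reboot bootloader      # drop the phone into its bootloader
  $run fastboot flashing unlock   # must be confirmed on the device screen; wipes data
  $run fastboot reboot
}

unlock_pixel                      # dry run: prints the three commands
```

This is the "easy on Pixels" part the discussion keeps coming back to: on most other handsets the equivalent of `fastboot flashing unlock` is simply refused.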
[01:55:16] So, you're relying on Android to publish these updates to the phone, and you're relying on those things to be as good as possible. [01:55:25] Now, you might harden it some more, but as long as the code is out there, it can always be mucked with. [01:55:30] As long as people have to interact with the device and type, and you have to see what you're typing, a phone's going to have holes; it's going to be Swiss cheese. [01:55:38] So, when people say something is unhackable, as you said, that's just not true. [01:55:44] Yeah, it didn't make sense to me. [01:55:45] It's just not true. [01:55:47] Yeah, we talked about it quite a bit. [01:55:50] Like I said, great guy, done lots of great things for the country. [01:55:53] And it's just, if they had said something along the lines of, it's hackable as any phone is hackable, because by virtue of you having to interact with it, it's hackable. [01:56:03] Like, if I came up with an app that had a, you know, look at the TikTok terms of service on the first TikTok. [01:56:10] Oh, it's bonkers. [01:56:11] With those terms of service, I will own your phone. [01:56:14] And I'm not saying you can install TikTok on his phone, but what I'm saying is, by virtue of the fact that you have to interact with the phone and see what you're doing and type passwords, and you've got those kinds of terms of service, I could easily put a keylogger in that, and now I know your Signal password or your Signal PIN. [01:56:29] Or, you know, you're going to China, so I stop you in secondary. [01:56:34] And while you're in secondary, I've got a CCTV on you, and you unlock your phone. [01:56:37] Now I know how to unlock your phone. [01:56:39] And now I'm going to lock you up in secondary at customs in China or in Canada. [01:56:46] And I'm going to separate you from your phone. [01:56:48] And I've seen you unlock it.
[01:56:49] Well, now I'm going to get in there with EnCase, or I'm going to get in there with FTK, or I'm going to get in there with Cellebrite. [01:56:54] And I'm going to dump your phone. [01:56:57] And just by virtue of it being built on the Android Open Source Project, that's a great thing. [01:57:03] It's a good thing. [01:57:04] Just don't call it totally unhackable. [01:57:06] Because a guy like me, I don't need but a week or two to tell you, on this current build, like, here, here's the hole in this Swiss cheese. [01:57:14] Now, is it far better than having a Google phone with standard firmware and a standard OS, or an Apple phone? [01:57:23] I don't know about Apple because, again, you asked me about Apple, and I said, I don't know Apple. [01:57:27] I don't know what's happening at the top of that company, but I know that they like to monetize people, and that's pervasive in my mind. [01:57:34] And using data that people don't know is getting used, even though it's in a 40-page terms of service document, is pervasive. [01:57:40] So I just don't know at that highest level of analysis. [01:57:43] And that's why I said, to answer your question about the safest phone, I would ask you what you're using it for, who you are, and what you're doing in the world. That's the best way to answer that question. [01:57:53] So, me, like, what would you recommend I use? [01:57:56] I mean, okay, I'll tell you generally what I would say, because you might ask me that question one day, because we go back and forth about a lot of tech. [01:58:05] I know specifically what I would recommend for you to do, and I'd even tell you to hire someone else to do it and not me, because those checks and balances are what I would want.
[01:58:15] But for you, I would say you should take something like a Raspberry Pi, and you should run WireGuard on your phone, and you should route all of your internet traffic through something like a home terminal at your house, through a Raspberry Pi, using something like WireGuard, which is a VPN that I use that's very good. [01:58:35] And everything should be routed through that. [01:58:39] And if you trust Apple, continue using Apple. [01:58:43] If you don't trust Apple, then use Android. [01:58:46] And you could use a Pixel and do Graphene, and you could use Signal on there and those other things. [01:58:52] And you're going to be relatively safe. [01:58:54] But again, if I'm a nation-state actor, I can create circumstances where I'm going to get access to your shit and I'm going to lock you down. [01:59:02] And some methods are more expensive than others. [01:59:06] But I'm a pragmatist, and you can always come up with a method to get a hold of somebody's shit. [01:59:09] You can always create the circumstances, especially if you're a nation-state actor, to get a hold of somebody's stuff. [01:59:15] That would be the very high level of things that I would recommend to you just out of the gate. [01:59:24] Yeah, it's very concerning, because it seems like these things keep getting stronger and more capable. [01:59:30] Yes. [01:59:30] Like Pegasus 2 being a zero-click exploit. [01:59:33] Yes. [01:59:34] So all they have to do essentially is just know your number. [01:59:37] Yep. [01:59:39] And that's, you know, you just make yourself a difficult target would be my best recommendation. [01:59:46] When you're going to answer questions about password reset, don't answer them honestly. [01:59:50] Write down in a physical journal or something how you answered those questions. [01:59:53] Don't answer them honestly. [01:59:55] You know, all of these things we think are added for layers of protection.
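The Raspberry-Pi-plus-WireGuard recommendation above amounts to two config files: the Pi runs the server interface, and the phone's config routes all traffic to it via `AllowedIPs = 0.0.0.0/0`. A sketch under stated assumptions: every key, the addresses, and the hostname `home.example.net` are placeholders; real keys come from `wg genkey | tee privkey | wg pubkey`, and the Pi also needs IP forwarding enabled (`sysctl -w net.ipv4.ip_forward=1`):

```shell
# Placeholder WireGuard configs for the home-VPN setup described above.
cat > wg0.conf <<'EOF'
[Interface]
# Runs on the Raspberry Pi at home
Address = 10.8.0.1/24
ListenPort = 51820
PrivateKey = <PI-PRIVATE-KEY-PLACEHOLDER>

[Peer]
# The phone
PublicKey = <PHONE-PUBLIC-KEY-PLACEHOLDER>
AllowedIPs = 10.8.0.2/32
EOF

cat > phone.conf <<'EOF'
[Interface]
Address = 10.8.0.2/24
PrivateKey = <PHONE-PRIVATE-KEY-PLACEHOLDER>
DNS = 10.8.0.1

[Peer]
PublicKey = <PI-PUBLIC-KEY-PLACEHOLDER>
Endpoint = home.example.net:51820
# 0.0.0.0/0 = send everything through the tunnel
AllowedIPs = 0.0.0.0/0
PersistentKeepalive = 25
EOF
```

The `AllowedIPs = 0.0.0.0/0` line on the phone is what makes this a full-tunnel setup rather than a split tunnel, so all traffic, not just home-network traffic, exits through the Pi.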
[01:59:58] For instance, you used to get that pop-up on your phone where, you know, there'd be like blocks of pictures, and it would say, click all of the pictures with a traffic light in it. [02:00:10] I was just going to say that, a traffic light in it. [02:00:12] Part of that might be for security. [02:00:13] The other part of it is they're using the information of what you're clicking to train neural networks. [02:00:18] You're a product at that point. [02:00:20] You think you're getting security out of it, but you're a product at that point, because you're helping to educate a neural network on what traffic lights look like and how they can look, and all those different instantiations of traffic lights. [02:00:31] So, and again, like, we have to separate causality and intention and outcomes, in that the companies might do this because they want to create the greatest AI ever. [02:00:42] But when you're issuing someone a 40-page terms of service document on everything they can do with your thing that you paid $2,000 for, it's just, you know, we need more ethical people. [02:00:53] At least what Eric Prince was trying to do was right, which was, we need to off-ramp from some of these big things, because the way that this government is going, I'm very worried about the rights of the individual now and going forward, because we have an uneducated class of people, for all of the reasons in the world. [02:01:13] Like, if you want to just focus on your family and you're not thinking about these things, I don't hate that for you. [02:01:17] But the idea of individual autonomy and rights has been so shit on in recent years that we get more uneducated and we rely on these tools. Large language models are great, but they're not a foundation of learning. [02:01:32] In other words, we have a lot of people with access to information but no wisdom. [02:01:37] It's like when your parents would say, learn how to do addition and subtraction on paper before you use a calculator.
[02:01:43] Like, understand how to do research and cite sources, and understand, you know, how to conduct really good analysis, before you just use a neural network for everything. [02:01:53] Because as we lose focus of our civics and what our founders were trying to do, and the uniqueness of it, which is truly unique, which is, you know, when I joined the Army, I joined the Army to get out of North Dakota. [02:02:04] When I re-enlisted in the Army, it's because I believed in the experiment. [02:02:07] And that's another five-hour podcast. [02:02:09] But the foundation of the experiment is good, but we've eroded it in so many ways over the years and given up so many individual rights in the name of security. [02:02:20] And I'm sure it's been said on here before, but Franklin said, anybody who gives up their freedoms in the name of security deserves neither. [02:02:31] And some of the ways that they've done it have been really below the surface. [02:02:35] And it frankly blows my mind that we let the government get away with some of these things that we let them get away with, where you even explain it to people, and they're like, I don't see it. [02:02:45] Like, I don't see how that was a big deal. [02:02:47] And I'm like, it was a total recalibration of the system that allowed the Democratic Party and the Republican Party to usurp your rights in a way that, if you knew any better, you'd probably be protesting. [02:03:00] Like, some of the ways that they've done this, you know, we can go with the easy stuff, like the Patriot Act, right? [02:03:06] In the name of security, we're going to start collecting on Americans. [02:03:09] You know, and the Biden and Obama administrations, I will say this at risk of, you know, getting in trouble, because I used to have a clearance. [02:03:19] They had a massive vacuum cleaner, and they knew what it was vacuuming up.
[02:03:23] And they kept vacuuming it up anyway, in the name of security. [02:03:26] I'm not saying they were going after American citizens, but they certainly knew they were collecting on them. [02:03:31] And they just vacuumed shit up and collected it and stored it in a database. [02:03:36] In case they needed it. [02:03:37] In case at some point we needed to, you know, come up with a narrative, or get rid of somebody who's inconvenient, or whatever else. That just flies in the face of individual American rights and American autonomy and is really, in my mind, the anti-pattern to freedom. [02:03:54] It's just really, really bad. [02:03:56] I mean, I'll give you one that people always crap on me about whenever I talk to them about it, but there's two that really bother me. [02:04:01] One of them being the 17th Amendment. [02:04:03] Do you know the 17th Amendment to the Constitution? [02:04:06] So the 17th, so when the founders, when you read the Federalist Papers, I really love reading the Federalist Papers. [02:04:13] I love reading how they informed the Constitution, the Bill of Rights, the Declaration even. [02:04:19] John Jay, James Madison, they wrote these documents explaining the framework. [02:04:23] And the 17th Amendment, essentially, how the Senate, right, the 100 people there that are supposed to be representing us, was originally constructed was: a state would have legislatures, and the state legislatures and the governor would appoint the senator. [02:04:37] The reason that the founders did that was because the state governments had to give power to the federal government for it to exist. [02:04:45] Back with the Articles of Confederation. [02:04:50] Confederation, is that right? [02:04:51] I think it's the Articles of Confederation. [02:04:54] I'm blowing up, sorry, I'm going nuts.
[02:04:57] Back before there was a strong, centralized American government, we had problems with money, we had problems with interstate commerce, and those types of things. [02:05:04] And those articles eventually turned into what is the Constitution. [02:05:07] But the states had to grant that power. [02:05:09] And the signers of the Declaration of Independence and the Constitution knew that the states needed to be those small projects that we talked about before, where if California wanted to go nuts, let them go nuts. === State Powers and Commerce Law (09:59) === [02:05:20] But it shouldn't impact what's happening in Texas. [02:05:22] It shouldn't impact what's happening over in New England. [02:05:24] It shouldn't impact what's happening in the Midwest. [02:05:26] But if that goes nuts and it fails, it needs to fail. [02:05:30] So the state senators, I'm sorry, the state legislatures would come together and they would vote for a senator. [02:05:36] They would elect a senator. [02:05:37] And that senator's job was to go to the federal government and protect the rights of the state. [02:05:43] Not to protect the rights of individuals per se, and certainly not to embolden the federal government. [02:05:49] But with the 17th Amendment, what happened was, the House of Representatives' function was to be the petulant children of government. [02:05:58] So their job was to come up with crazy ideas, crazy laws, all of those things. [02:06:02] The more liberal version of government jurisprudence would be the House of Representatives, your crazy ideas. [02:06:08] And then you had senators who were supposed to be between the House and the President, who would say, well, here's a good idea, but the rest of this is retarded, AOC. [02:06:16] Like, we're not doing all this. [02:06:17] That's crazy. [02:06:18] Or whoever else, name me a Republican who's an asshat as well. [02:06:23] We're not doing these things.
[02:06:24] And that's because it would erode the state's rights and the state's constitution and what made this state great. [02:06:29] Because what the legislatures would do is say, hey, Joe Rogan, you've made a lot of money, and you've got a big podcast and a big voice, and you've learned some lessons along the way. [02:06:38] And you were able to do that in Texas. [02:06:39] And you decided to come to Texas because we had all of these things that California didn't have. [02:06:44] We need you to go to the Senate for three years or six years or seven years, whatever it was back then, and represent those same principles. [02:06:52] So when Obamacare comes through, you can say, not only no, but fuck no. [02:06:56] Like, I'm not voting for this thing. [02:06:58] And it was to protect the state. [02:06:59] But what the 17th Amendment did was make the Senate redundant with the House of Representatives, which was, in the founders' eyes, the only popular-vote part of the American government. [02:07:14] And then you had, you know, the way the president gets elected, through electors, but you had the Senate, which was appointed by the states. [02:07:20] So the legislatures, and I'll use North Dakota, where I'm from, you'll have one big city, two big cities, Fargo and Grand Forks, North Dakota. [02:07:29] It's where the universities are. [02:07:30] It's where your crazy kids are. [02:07:32] Crazy thought exists, hyper-crazy ideas, but some of them are useful. [02:07:37] The rest of the state's agriculture, right? [02:07:39] So all of those legislators from all those counties, those legislative districts, would get together and say, we're going to put Bill Thompson, that would never happen, but, in charge, he's going to be at the Senate representing North Dakota. [02:07:51] But he has to represent the whole state. [02:07:54] In other words, you can't do things that will only help Grand Forks or Fargo because that's where the universities are.
[02:07:59] That's where all the crazy politics are. [02:08:01] You also need to be thinking about the guys out in the western counties, LaMoure County in North Dakota, or way out west. [02:08:06] You have to protect agriculture. [02:08:07] You have to protect small businesses. [02:08:09] You have to protect families. [02:08:11] What the 17th Amendment did, under Woodrow Wilson, and how they really usurped the Constitution, was make the Senate redundant; they made it a redundant House of Representatives, using the popular vote. [02:08:22] So now we use the popular vote for that. [02:08:25] But if you want the popular vote in North Dakota, 85% of the population is in Fargo and Grand Forks. [02:08:30] So now, if I want to run for Senate in North Dakota, I'm just going to spend all of my time in Fargo and Grand Forks. [02:08:36] Because if I can repeat back to those people all the ideas that they want to hear, I'm going to win that vote, and I don't have to represent those people out in the rest of the state in anything. [02:08:45] So they created a redundant House of Representatives. [02:08:48] But another reason why it happened was they wanted the popular vote, because there is no amount of money that you could stick into a legislature out in the western part of North Dakota. [02:08:57] You can't bribe these people. [02:08:58] But the DNC and RNC now can say, look, these two senators are running. [02:09:02] We like this guy. [02:09:03] So this guy will do whatever we tell him to do. [02:09:06] And it has nothing to do with the state or representing the state's rights or the rest of those legislative districts. [02:09:11] We're going to pick this senator, and he's getting $300 million for his election bid. [02:09:15] And this other guy, who's a slower-moving constitutional conservative, who might be a free-market absolutist and a classical liberal, he's not being funded.
[02:09:27] But under the state architecture, you might have been a better representation of the state. [02:09:32] And that's why the legislators had to vote for you to put you in as a senator. [02:09:36] You had to represent the whole state. [02:09:38] But now, all that someone who wants to be a senator needs to do is go to the Republican National Committee or the Democrat National Committee and say, I'll do all the things you tell me to do, fund my campaign, and I'm going to go stump in Fargo and Grand Forks, North Dakota, and the hell with the rest of the state. [02:09:55] It's very important. [02:09:56] It's a very important sleight of hand. [02:09:57] And when that happened, you made a redundant House of Representatives, and the state no longer was protected at the federal level. [02:10:05] And what happened was all of the power from all of these states and these legislatures and these individuals got sucked up into the federal government. [02:10:13] And then after that, you see all of these things that would never have been passed by a state getting passed, things like Obamacare, things like the Patriot Act, certain war resolutions, all kinds of things where it just further erodes the power of the state. [02:10:27] And federal government wants that because it puts all of the power up in the federal government. [02:10:31] And people always say we need to get money out of politics. [02:10:34] No, we need to get power out of politics. [02:10:36] That power that they've taken over the last 130 years or so used to exist at the state and local levels because they wanted these thought experiments happening where we could pluck the best things out of them and forget the rest. [02:10:49] But all of that power has now gone up to the federal government and the federal government won't ever release that power and they only want more budget and more spending to execute that power. 
[02:10:59] And that's also because the interest groups don't want to have to go and convince a whole state of whether or not something is good that people are going to vote on. [02:11:07] They just want to take a lobby up to the federal government, because they want all of the power up there as well. [02:11:12] And the federal government wants all the power up there as well, because they make $300,000 a year before they become a politician, and they're worth $30 million when they're done being a politician, because all of the money has to go to the federal government, because they're in charge of the light bulbs we can use, the computers we can use, the flush toilets we can have, how our roads are going to look, what our medical care looks like. [02:11:33] None of those powers are explicitly written in the Constitution of the United States, and they use things like the Commerce Clause and other things in order to create things like Obamacare, where really we want competing states. [02:11:44] If Texas comes up with a great way to do health care and North Dakota's isn't so great, they can look at that experiment, and they can adopt the principles, and they can have it at that level. [02:11:54] But it's much easier to get change at the local level when the power is derived from the state and the individual, because if I want to change the way that my state does health care, I have three options. [02:12:05] I can run for office, I can support someone who is going to go into office and do what I want, or I can move. [02:12:10] But when everything's centralized at the federal government and everything flows from the federal government, all of the money, power, and gravity is up there. [02:12:17] And the individual, the 300 million of us or so, have really no power now to exercise either states' rights or individual rights at the higher level.
[02:12:26] I hope I'm elucidating this correctly, but it's a real usurpation of individual and state autonomy that really got rid of state power, which, if you read the Federalist Papers, was so important to the founders, that the states' needs were represented, because the states were where the founders wanted these thought experiments. [02:12:45] You read Thomas Hobbes' Leviathan, or John Locke, or Montesquieu. [02:12:50] All of them talked about this great experiment that was being set up, and how it was built on all this Western politics and everything that came before it, on how we could have a government that was forced to respect the rights of individuals and allowed for these competing think tanks of ideas, and that the power would never rest at the federal government. [02:13:06] But the 17th Amendment was a way that a lot of that power went away from the state level and the state legislatures. [02:13:13] And now, to become the president, they want to do a popular vote. [02:13:17] And under a popular vote, you would just have to campaign in New York and L.A. You would get the popular vote out of the likely voting people. [02:13:25] And now the rest of the country is not represented. [02:13:27] And that would be another, you hear all these people saying we need a popular vote. [02:13:30] Can't have the Electoral College. [02:13:32] We can't have all of these things. [02:13:34] Everything needs to be pure democracy, which allows 51% to rule 49%. [02:13:41] And that was another thing the founders were working fervently to get away from. [02:13:46] And that's why we had an Electoral College. [02:13:48] And it's actually quite beautiful when you actually read about it and examine it. [02:13:51] It's why we had the Senate and the state legislatures. [02:13:54] And it's why we had the House. [02:13:55] You had all levels of government that the founders cared about being represented in this body politic. [02:14:01] And it was a beautiful thing.
[02:14:02] And I could go on for 15 more things about that. [02:14:04] I won't do it for the sake of your listeners because I doubt this is what they wanted. [02:14:08] But similar things happened with the Supreme Court in Marbury v. Madison and allowing the Supreme Court to have judicial review. [02:14:15] That was never a thing that was in the Constitution. [02:14:17] And the Supreme Court, if you like the Supreme Court being able to have the power to describe everything as being either constitutional or unconstitutional, then you're not ruled by a democracy. [02:14:25] You're ruled by an oligarchy. [02:14:27] You've got nine people in robes that are going to tell you whether or not laws are good or bad. [02:14:30] And that's not the founding of this country. [02:14:32] It's not how it was intended to work. [02:14:33] And that all started back in Marbury v. Madison with Thomas Jefferson and these writs of mandamus, where the Supreme Court, long story short, essentially granted itself the power to conduct judicial review. [02:14:49] Under the old system, the system that was ratified and that the founders approved, if a law was deemed unconstitutional, it would go before the Supreme Court and they just would rule in favor of the person. [02:15:00] And then eventually the government would figure out, oh, this law doesn't work. [02:15:03] But it was never on the Supreme Court to say constitutional, unconstitutional. [02:15:07] You would get arrested for some law, and it would get appealed to the Supreme Court. [02:15:11] The Supreme Court would say, we're not punishing this person. [02:15:13] This is against the Constitution. [02:15:15] But the government would have to keep arresting people. [02:15:18] It would have to keep going in front of the federal government. === Judicial Review and Marbury v Madison (04:37) === [02:15:20] So what I'm saying is, and I'm sorry to go off on this, we can go back to tech. 
[02:15:23] But all I'm saying is the core of the American experiment in individual rights and what makes this country so great and why I was willing to die for it after my initial enlistment. [02:15:34] And why I have such love for this is because it was the only experiment where the value of the individual was held at the top of the hierarchy and people could truly be allowed to flourish. [02:15:43] And in 250 years, we did more than any society could have hoped to have achieved in tens of thousands of years. [02:15:49] Not that it's been around that long, but in thousands of years. [02:15:52] Everything tends towards disorder, and power always gets centralized. [02:15:58] And we had a framework to resist that, but we were willing participants in our own demise. [02:16:03] And now we're scratching our heads and wondering why there's no individual autonomy and why a guy can't smoke weed on the weekend or why a guy can't do X, Y, or Z, because we have centralized the authority and the power and the decision-making structure. [02:16:16] And we're allowing it. There would be no problem with money in politics if the federal government had only the powers that were outlined to it in the Constitution. [02:16:25] I think that's very well said. [02:16:27] And I could have never said it the way you said it. [02:16:30] And I think there's a lot to absorb here. [02:16:32] I'm sorry. [02:16:33] No, no, it was great, dude. [02:16:34] It was great. [02:16:35] This is one of the things that I love about you. [02:16:37] You're very thorough. [02:16:38] Yeah, thorough is one thing. [02:16:40] My friends always say Bill's tism is starting to show. [02:16:43] You got a touch of the tism. [02:16:44] But I think that's good. [02:16:45] Like I said, just like ADHD, I think it's a superpower. [02:16:48] A lot to absorb. [02:16:49] So I think we'll wrap it up right here. [02:16:51] But thank you. 
[02:16:52] This was an awesome conversation. [02:16:53] I really appreciate it. [02:16:54] It was really great. [02:16:55] Yeah. [02:16:56] We could do this again, too. [02:16:57] I'm sure you probably have 30 or 40 of these. [02:16:59] We didn't even get to AI. [02:17:00] I wanted to get to AI because I think I have a very anti-pattern take on AI and how you understand it. [02:17:05] But if you want, we can save that for another time. [02:17:08] Yeah, we'll do that for our next one because I think that's another four hours. [02:17:10] Yeah, probably. [02:17:11] Yeah, for sure. [02:17:13] And by then, who knows where it's going to go. [02:17:14] I mean, Jensen Huang from NVIDIA recently declared that we've reached AGI. [02:17:20] Yeah, I just couldn't disagree more. [02:17:26] And I think I could explain it in the same way I just elucidated. [02:17:28] You're not the only one. [02:17:29] Quite a few people. [02:17:30] Yeah, yeah. [02:17:31] I mean, it's consciousness projection. [02:17:32] And I'll sum it up in a minute. [02:17:34] At the end of the day, neural networks are mathematical functions. [02:17:38] They rest on, you know, weighting neurons based on training data and applying power to train models. [02:17:44] It's all mathematics. [02:17:46] There's no sense of knowing in that. You know, Penrose, I've read a lot on his Orch OR, if people want to read about that, I won't explain it, [02:17:57] orchestrated objective reduction and how the mind works and these fleeting moments of consciousness that we have, these shimmers of consciousness that we have based around what he describes in the microtubules. [02:18:08] We get conscious thought and that conscious thought we project into things. [02:18:12] AI is very good consciousness projection, but it will never have consciousness or knowing because it has no system of values. [02:18:19] And if we were to instill values in it, it would still be consciousness projection. 
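[Editor's note: to make the "neural networks are just mathematical functions" point concrete, here is a minimal sketch, not from the conversation. It runs a tiny two-neuron network forward pass using nothing but multiply, add, and an exponential, with made-up weights standing in for values that training would have produced. Same input, same arithmetic, same output every time.]

```python
# Illustrative toy example: a trained network's output is deterministic
# arithmetic. All weights below are hypothetical, as if learned from data.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

W1 = [[0.5, -0.2], [0.8, 0.3]]   # hidden-layer weights (made up)
b1 = [0.1, -0.1]                 # hidden-layer biases
W2 = [0.7, -0.4]                 # output-layer weights
b2 = 0.05                        # output bias

def forward(x):
    # Hidden layer: weighted sums passed through a nonlinearity.
    h = [sigmoid(W1[i][0] * x[0] + W1[i][1] * x[1] + b1[i]) for i in range(2)]
    # Output: another weighted sum. Multiply-and-add all the way down,
    # which is why the same computation could, in principle, be done
    # by hand on a calculator, just very slowly.
    return sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)

print(forward([1.0, 2.0]))  # identical input always yields the identical number
```

Nothing in the function "knows" anything; it maps numbers to numbers, and all of its behavior is fixed by the weights, which in a real system come entirely from training data.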
[02:18:23] You saw my dad's cabin. [02:18:25] My dad died when I was five, but I bought it back and was working on it. [02:18:27] And inside of his cabin, I got to learn a lot about my father by working on the cabin that he built. [02:18:34] The way he would measure things or cut things, right on the walls and that type of stuff. [02:18:37] That's all consciousness projection that allowed me to get to know him in a way [02:18:41] I might not even have known him if he were alive, but I got to re-experience and understand my father and his thoroughness through that cabin. [02:18:47] AI is consciousness projection. [02:18:49] It's projected consciousness. [02:18:50] It's getting very good, but on a calculator, you could get the same thing out that you get out of a neural network if you had sufficient time. [02:18:58] I could present you a question just like you did on Perplexity. [02:19:01] I could sit here with a rule book and I could type on a calculator. [02:19:06] It might take me a million years, but I could do it and I could give you the same answer that a neural network would give you. [02:19:11] That doesn't mean consciousness or knowing or AGI is present. [02:19:16] It relies on its training data. [02:19:18] It can only give you what the training data gives it. [02:19:21] It needs human consciousness projection like we talked about with the CAPTCHAs or we talked about with uploading photos to Google Drive. [02:19:28] It needs that training data. [02:19:30] And to me, it's just really fancy, clever math. [02:19:34] And having trained these networks for a dozen years now and working with them, they're just really clever consciousness projection. [02:19:43] And so, yeah, that is four hours and we can do that next time. [02:19:46] We'll do that next time. [02:19:47] Definitely. [02:19:48] But if people, you mentioned the app. 
[02:19:49] By the time we do it next time, who knows what the fuck is going to be going on with AI, too? [02:19:53] But if people want to learn more about me or my company, if I can say that. === Building Spartanforge with Freedom (00:54) === [02:19:58] Yeah, please. [02:19:58] It's spartanforge.ai. [02:20:00] We're built under the rubric of individual freedom. [02:20:02] I want people outdoors. [02:20:03] I want people hunting. [02:20:04] I want people experiencing nature. [02:20:06] I want people providing for their families. [02:20:09] The best part of my day is when my kids are eating a backstrap of an animal that I took. [02:20:14] And I want to enable people to go out and do that. [02:20:16] And even though it's paradoxical, through an app, you can get lost. [02:20:19] You've got to conserve time. [02:20:20] You've got to e-scout. [02:20:20] You've got to learn things before you go out there. [02:20:22] So we built this company under that. [02:20:24] I've got three other companies that I'm doing, but Spartan Forge is the one that I'm really working on. [02:20:28] It's an awesome app. [02:20:29] Really working on it. [02:20:30] Well, I really appreciate that. [02:20:31] We've put a lot of work into it, and we've got a lot more coming over the summer. [02:20:34] So if people want to support us or want to get out there and get some hunting done, please check it out. [02:20:38] And I answer all the Instagram DMs. [02:20:40] So if you have a question for me. [02:20:42] Good luck with that now. [02:20:43] Well, I try to. [02:20:44] I spend about two hours every morning doing it. [02:20:47] Good luck. [02:20:48] Thank you, Joe, for having me. [02:20:49] Thanks, brother. [02:20:49] Appreciate you very much. [02:20:50] Yeah, I get that. [02:20:51] All right, you too. [02:20:52] Bye everybody.