Health Ranger - Mike Adams - Decentralize TV Interview with Julia McCoy: AI Avatars, Automation, and the Future of Business Aired: 2026-03-11 Duration: 01:57:55 === Why Most Fear AI (03:50) === [00:00:01] You know, McKinsey does a lot of studies on this. [00:00:04] Only one to eight percent of businesses and their employees are considered AI mature. [00:00:09] That's terrible. [00:00:11] Amazing. [00:00:11] That's terrible. [00:00:12] Yes. [00:00:13] So McKinsey asked why. [00:00:15] And they found that I believe it was almost 60% of the entire workforce and mostly in America are not just afraid of AI, but they're afraid of this key point. [00:00:27] And this to me is really interesting. [00:00:29] They're afraid that they will be made obsolete because of AI. [00:00:33] That's an identity crisis. [00:00:35] That's a pretty big problem. [00:01:02] Welcome to today's episode of Decentralized TV. [00:01:04] I'm Mike Adams with my co-host, Todd Pittner, today. [00:01:07] And we have a very special guest coming up for you from an organization called First Movers about AI integration and decentralization and augmentation for, you know, pursuing your mission. [00:01:19] But welcome, Todd, to the show today. [00:01:21] It's always great to have you back on. [00:01:23] And it's been a little while. [00:01:25] I missed you, Mike. [00:01:26] Yeah. [00:01:27] Last time we had a conversation, I had no voice. [00:01:30] So thank you for suffering through that. [00:01:32] Thanks for bringing your voice back for us today. [00:01:35] Oh my gosh. [00:01:36] Took about another week after that. [00:01:37] But, you know, I don't know. [00:01:40] Whatever's in the air, Mike. [00:01:43] But yeah, I'm really looking forward to today's guest. Throughout the research, [00:01:48] I just kept going down this rabbit hole and learning more and learning more. [00:01:52] And I'm like, this is going to be a ginormous interview. [00:01:56] Oh, yeah.
[00:01:56] There are going to be so many people who are going to benefit from this. [00:02:00] So pay attention, everyone. [00:02:01] I think you're going to love it. [00:02:03] I'm really looking forward to this interview, but I want to say up front, I don't know what to expect because I've never seen the human version of Julia McCoy. [00:02:12] I've only seen her AI avatar. [00:02:14] Right. [00:02:15] She scripts the avatar. [00:02:18] The avatar performs her scripts with her videos, with her voice, et cetera. [00:02:25] But that's all; I've never seen her. [00:02:28] Are you telling me you've never seen the Asian guy either? [00:02:32] The Asian guy. [00:02:33] One of the 50. [00:02:34] Because there is no real human Asian guy behind that guy. [00:02:37] That's not a real guy, right? [00:02:39] Nope. [00:02:40] I don't know. [00:02:41] I created an Asian guy, too, that I'll tell you about coming up. [00:02:46] Okay. [00:02:47] His name is Mr. Huang and he's a financial expert. [00:02:50] Yeah. [00:02:52] You have to educate us on that in the after party. [00:02:55] My Asian guy is older than the Asian guy, which means my Asian guy has more experience. [00:03:00] He lived through the 1987 Black Monday crash. [00:03:03] Way more experience. [00:03:04] Yeah, he's much wiser on finances. [00:03:06] So you'll see. [00:03:07] Totally, totally wiser. [00:03:09] My avatar knows more than your avatar. [00:03:11] No, that's what it's going to turn into soon. [00:03:14] That's why his name is What's Wong? [00:03:16] Oh, come on. [00:03:17] You can't, you know, like pop like a racial joke so early in the show. [00:03:25] For the after party. [00:03:26] That's for the after party. [00:03:28] Apologies to all our Asian listeners here. [00:03:34] I am sorry. [00:03:35] Please forgive me. [00:03:36] I was Wong. [00:03:38] Oh my God, Todd. [00:03:41] Okay. [00:03:42] No, look, we make fun of ourselves more than anybody else, by the way, for the audience listening.
[00:03:46] So just roll with it. [00:03:47] We're okay. [00:03:48] All right, Todd, are you ready to bring in our special guest today? === Honesty in the After Party (16:11) === [00:03:51] Please do. [00:03:52] I've been waiting all week for this. [00:03:54] All right, let's do it. [00:03:56] Welcome, Julia McCoy from JuliaMcCoy.ai. [00:03:59] It's great to have you on the show. [00:04:01] There's, yeah, we're going to show your website here coming up, but welcome to the show, Julia. [00:04:05] Thank you so much, Mike. [00:04:06] I'm thrilled to be here with you, Todd. [00:04:08] Well, it's great to have you on. [00:04:10] And I got to say, I'm a fan of your work. [00:04:13] You offer a consultancy for AI deployment, but much more than that. [00:04:17] You also, I think you're really on the cutting edge of how people can use AI to enhance their lives. [00:04:24] And that's what we're about here. [00:04:25] So you want to give us a little background and just what you're into and, you know, what our audience should keep in mind in the greater context of your appearance here. [00:04:36] Yes, yes, absolutely. [00:04:39] Well, there's so much I could say. [00:04:40] So I'm trying to nutshell it right now. [00:04:43] You know, it's so easy, I think, to have a fear-based perspective of what's coming because the truth is there's a lot of scary things ahead of us. [00:04:52] I know, Mike, the show you're watching is a critical piece of sharing that knowledge with you. [00:04:56] And while we have what looks like a really scary future where health is under attack, freedom is under attack, we also have this incredible opportunity to liberate ourselves from this work era that, you know, is kind of a part of American programming, where we were taught that the industrialist way of thinking, which came from factories, was the way to live our life. [00:05:20] Like nine to five, you drop the kids off. [00:05:22] You get to work. [00:05:23] You don't see them until dinner. 
[00:05:24] You hit the hay and you're up the next day repeating it all. [00:05:27] What if we could change all of that? [00:05:29] And that is what I see with the future. [00:05:31] It's what I've done with my own life. [00:05:33] I wake up and I literally have no work to do unless I choose to do it. [00:05:37] We've automated probably 15 job roles and we've seen total freedom and we've opened up another 25 job roles. [00:05:46] So there's so much opportunity. [00:05:48] The worst perspective to have going into this new era, which I call the age of AI, and many others do too, would be to think of it with a scarcity mindset. [00:05:57] If you can think with an abundance mindset, how can I use this right now, steward it effectively, automate myself as much as possible so I get to do the fun stuff like talking to Todd and Mike and not ever have to slave away at the desk again, do all these menial tasks. [00:06:14] Should I really be coding 13 hours a day in a dark room? [00:06:17] Is that really humanity's future? [00:06:19] Yeah, so if we can think like that, there are beautiful, beautiful opportunities ahead. [00:06:24] Well, we completely agree with you on this very point. [00:06:28] Let me see, we are about empowering people with solutions. [00:06:32] And even though we agree that there is going to be a lot of disruption because of AI, the proper way to look at this, I think, for our audience is not to say, oh, AI is going to take my job, but rather, how can AI enhance your mission? [00:06:47] What is it that you are doing in your life? [00:06:48] And I'm going to bring up a simple example here on my screen. [00:06:52] And I just want to show everybody this. [00:06:55] So let me bring it up here. [00:06:58] Yeah, show my screen. [00:06:59] So you see this infographic right here, this is just one I created yesterday: the sulfur crisis, severing civilization's chemical lifeblood.
[00:07:07] This is talking about sulfur and how it's needed in agriculture and oil and rubber and antibiotics and everything. [00:07:15] 9900% rate spike, whatever. [00:07:18] I've had so many people ask me about these infographics and say, how do you create those infographics? [00:07:24] And I say, it's fully automated. [00:07:27] I don't do a thing. [00:07:29] And that's what you are talking about, Julia. [00:07:30] Now, of course, it's based on the article and the article is based on my podcast. [00:07:35] So I record, boom, it gets turned into an article. [00:07:38] Boom, it gets turned into an infographic. [00:07:40] I don't touch it the whole way through. [00:07:42] That's what you do, Julia, right? [00:07:43] That's what you do. [00:07:44] Tell us more about that thought, how you are automating your life so that you get to do the fun stuff. [00:07:52] Yes. [00:07:53] Well, Mike, it's not even me on camera. [00:07:56] We could almost say right now, you could be watching my clone. [00:07:59] You might not know the difference. [00:08:00] That's true. [00:08:01] Yeah, it could be. [00:08:03] It's low latency AI conversations now. [00:08:07] We could throw in some kind of Easter egg here, which we do with my clone. [00:08:10] It's hilarious. [00:08:11] But yeah, it's crazy because last year I had just started First Movers at the fall of 2024 and had this vision for it. [00:08:21] But then January rolls around and I find myself stuck once again in 90-hour work weeks, you know, a startup founder, just doing everything myself. [00:08:29] And I'm like, this is supposed to be an AI company. [00:08:32] What am I doing? [00:08:33] But I couldn't get out of it. [00:08:34] It was like my own mindset was holding me back. [00:08:36] And then boom, January 11th, I'm in the ER turning blue, can't pick my arms up. [00:08:42] And I'm 33 years old going, what happened to me? 
[00:08:46] And then 13 months go by where I'm literally picking up the pieces of myself that fell apart that day. [00:08:52] It was insane. [00:08:53] And that's when I learned about chemtrails and EMF and 5G and the internet of bodies and nanotechnology. [00:09:00] And I literally have had things leave my body where God has shown me, Julia, this is the reality you're living in. [00:09:05] And I'm like, I didn't believe any of it till it happened to me. [00:09:09] And that health crash taught me how to completely delete the hustle mindset because I had no choice. [00:09:17] If I sat at my desk and talked to you like I am today, my left side, I'd get electrical shocks. [00:09:23] My heart would cramp up and I'd literally be like praying for my next heartbeat. [00:09:26] It was insane. [00:09:27] Yeah, insane. [00:09:28] And that was electricity because my biofield had broken down so much from EMF exposure. [00:09:34] And that's just like, it's crazy. [00:09:37] So then began, okay, well, how do I, you know, feed my family? [00:09:42] Because we were dependent on my business, which I had just started four months ago. [00:09:47] Like, how do I do this and like survive? [00:09:51] So I could not be at my desk. [00:09:53] I could not, my organic body, I like to call it, literally could not function and do any of my work tasks. [00:09:59] So I had to ask myself, how do I completely get out of everything? [00:10:03] And I've not really seen this done to the level where we've done it, where I don't have to film anything ever again. [00:10:10] I don't have to write anything ever again. [00:10:12] We've built an AI second brain. [00:10:14] It's running in Claude. [00:10:15] We can tap into it to produce infinite video scripts that are even better than what I would write myself because it's trained on my brain. [00:10:24] And then we take that to my clone, which is so realistic. [00:10:27] Most people don't even know it's not me, but we tell them. 
[00:10:30] We're like, hey, you're watching my clone. [00:10:32] You do say that up front. [00:10:34] We want to create that honesty. [00:10:35] We want people aware. [00:10:37] Hey, you could be watching a clone. [00:10:38] You don't even know it, but we're going to tell you. [00:10:41] So we're trying to get the world aware. [00:10:43] No, that's the clone that you have. [00:10:46] You have her wearing the Star Trek outfit, right? [00:10:50] Because of your name, McCoy. [00:10:51] Correct. [00:10:53] Right. [00:10:54] So, but that is very convincing. [00:10:57] When I first saw your avatar there, I thought it was just you pretending to be an avatar. [00:11:04] When I first, no, seriously. [00:11:07] That's great. [00:11:08] That's what we want. [00:11:09] You know, we don't want it to be a jarring experience. [00:11:11] We want it to feel like, oh, this is still Julia. [00:11:14] Oh, wait, it's her avatar. [00:11:16] What? [00:11:16] Right. [00:11:18] What's cool is that you're not pretending to be somebody else. [00:11:20] You're still authentic in your message. [00:11:23] But okay, Todd, so the floor is yours. [00:11:25] I know you have so many questions for Julia. [00:11:27] You're so impressed with her work as well, but take it from here. [00:11:31] Wow. [00:11:32] Julia, and nice to meet you, first of all. [00:11:36] Instead of an opening question, I have an opening statement or better, let's call it an opening admission. [00:11:43] Since I don't have a supercomputer brain like my co-host Mike Adams does, I always need to conduct a fair amount of research with each DTV guest. [00:11:54] And when I started down the Julia McCoy rabbit hole, I was like, I wanted to call Mike and just say, brah. [00:12:03] That's all, Julia. [00:12:04] I was just brah. [00:12:06] Mike, I think Julia has to have one of the most interesting backgrounds of any of the guests that we've interviewed.
[00:12:16] I mean, let's talk about it, escaping a cult, building and selling a 100-person content agency, then pivoting into AI integration and AI cloning. [00:12:29] Wow. [00:12:30] I mean, just wow. [00:12:32] So Julia, your story includes escaping a cult in your early 20s. [00:12:39] How did that experience shape your ability to question authority and see through systems that most people just blindly trust? [00:12:48] Oh boy. [00:12:49] Well, I now look back on that moment, literally in the middle of the night on a Thursday in 2013, leaving my dad's cult. [00:12:56] It's the only house, only environment I knew. [00:12:59] I look back on that moment now with so much gratitude, even though it was extremely hard. [00:13:05] But like, what a gift. [00:13:08] I was given the opportunity to break out of a matrix where I should have been a, like, I kid you not, a quiet housewife in a long dress the rest of my life. [00:13:17] That is the programming I was raised in for 21 years. [00:13:20] I was told children and women should be seen, not heard. [00:13:24] They're not allowed to talk in front of a man. [00:13:26] They have to abide by over 30 different rules when it comes to clothing and the color of your clothing. [00:13:31] It's like, that's what I grew up in. [00:13:33] Extreme programming. [00:13:36] We allow women to talk in front of us. [00:13:40] In fact, that's the best part of the show. [00:13:42] So go for it. [00:13:45] Oh, obviously I've broken out of that battle. [00:13:50] But the rest of Todd's question, though, how did that then alter your ability to question authority later on, which segued into AI also? [00:14:00] Yes. [00:14:01] Well, I noticed early on when I was starting my business, which was just a natural thing that came to me. [00:14:06] My way out of the cult was that I was writing for income, and I bought a car and that's how I escaped.
[00:14:11] And all through my business, when I was starting it, creating it, building it, you know, we had fast growth. [00:14:17] I did it all myself with my husband. [00:14:20] He was kind of the technology mind that built systems while I ran people and marketing. [00:14:25] And something I was able to do was just really follow instinct with absolutely no barrier. [00:14:31] And I was like, wow, why is this not like everyone around me? [00:14:35] Why aren't they doing the same thing? [00:14:37] And then I realized, well, I whoops. [00:14:40] Oh, sorry about that. [00:14:42] That's okay. [00:14:43] That's proof of human right there. [00:14:44] Proof of human. [00:14:45] Yeah. [00:14:46] The air came through my window and it blew my, oh, that's funny. [00:14:50] Yes. [00:14:50] We're not clones. [00:14:52] Yes. [00:14:52] So it's, you know, building that business. [00:14:55] I look around me and so many of my peers were very much, they had a lot of trepidation to take the next step, go to the next level, follow instinct, abandon everything in order to pivot quickly. [00:15:07] I never had that. [00:15:08] I was able to always just follow my instinct and it always paid off. In 2021, when I sold my writing agency, I was like, I think GPT, which is not just AI, it's the generative pre-trained transformer. [00:15:22] It's something so different from the machine intelligence that we've had till then. [00:15:27] And I was able to see the difference. [00:15:29] I was able to go, this is going to change everything. [00:15:30] Let's sell my agency while it's a cash cow. [00:15:33] And everyone around me was like, what are you doing? [00:15:34] You have 10 hour work weeks. [00:15:36] Well, this is stupid. [00:15:37] And I'm like, just wait and see. [00:15:38] And sure enough, next year, the year after we had the iPhone moment with ChatGPT, and a lot of the agency owners I knew, their businesses fell apart and they were like, I wish I did what you did.
[00:15:50] But you have to like just be able to take risks if you really want to succeed. [00:15:54] And I never had something in me that said, hold on, that's bad. [00:15:58] That could be bad. [00:16:00] The cult got that out of me. [00:16:02] The cult escape. [00:16:03] Something you and I have in common is a background in writing, because you ran a creative writing agency. [00:16:11] I started out as a software documentation person in the antivirus security industry back in the 1990s, believe it or not. [00:16:20] So we both have a writing background. [00:16:21] And my question to you is, I've noticed that you and I and some people are very good at prompting. [00:16:30] That is, we're very good at talking to machines. [00:16:33] We're very good at telling them what we want to do. [00:16:35] We get more out of models than the average person. [00:16:40] But there are many people who will ask me, hey, what tool did you use for that? [00:16:43] And I'll say, oh, oh, yeah, that just came out of, you know, whatever, DeepSeek or Qwen or Google or whatever. [00:16:50] And then they would say, I can't get it to do that. [00:16:54] I hear that a lot. [00:16:56] So do you think that's because we have a writing background or what is it that makes someone good at instructing machines to do the things you want them to do? [00:17:06] Yeah, that's a great question. [00:17:08] I've encountered the exact same thing. [00:17:09] And you know what? [00:17:10] I think I've narrowed it down to, and there's, I think, several reasons, but I think one of the biggest ones is you have an expertise in critical thinking as it translates into marketing and copy, which works really well with a large language model. [00:17:25] That's true. [00:17:25] Because that's how they interact, you know, is our language. [00:17:29] So when you have that, it's like kind of three things. [00:17:31] It's that critical thinking, you can see right through the clutter.
[00:17:34] You can question the narrative and you're like expert in that. [00:17:38] You've been doing that for a long time. [00:17:39] And then you have some marketing and copy chops. [00:17:41] You're good at that stuff. [00:17:43] Those three things make this incredible sweet spot where we can really be at a supercomputer level with the supercomputer. [00:17:51] That's true. [00:17:51] I agree with you, Mike. [00:17:53] Wow. [00:17:54] All right, Todd. [00:17:55] Julia, do you think most professionals today are dangerously underestimating the speed of AI disruption? [00:18:02] Oh, so much. [00:18:04] So much. [00:18:06] Yeah, it's like, how bad does that get? [00:18:12] Well, you know, McKinsey does a lot of studies on this and they looked at several things, but one of them was the American workforce and what's holding the workforce back, because like only one to 8% of businesses and their employees are considered AI mature. [00:18:29] That's terrible. [00:18:31] Amazing. [00:18:32] Yes. [00:18:33] So McKinsey asked why. [00:18:35] And they found that I believe it was almost 60% of the entire workforce and mostly in America are not just afraid of AI, but they're afraid of this key point. [00:18:47] And this to me is really interesting. [00:18:49] They're afraid that they will be made obsolete because of AI. [00:18:53] That's an identity crisis. [00:18:55] That's a pretty big problem. [00:18:58] And that's where like, you know, like the gift of leaving the cult, the gift of a total health collapse where I had to be totally stripped of hustle culture, which was ingrained in me. [00:19:10] Like those things allowed me to get out of the nine to five and the 90 hour work weeks an entrepreneur typically has and go full scale into automation. [00:19:20] And like most people aren't willing to trust it. [00:19:23] They have that background fear. [00:19:25] It's going to make me obsolete. [00:19:26] I don't like this.
[00:19:27] It's been presented as Skynet. [00:19:29] All of those subconscious programming languages, they're in our brains, whether we know it or not. [00:19:36] So we kind of have to just like run fearlessly to it, trust it, throw everything at it and see what it can do. [00:19:43] That's when, like when we teach people, that's when the world opens up for them. [00:19:48] 100%. [00:19:48] 100%. [00:19:49] Real quick. [00:19:50] Go ahead, Todd. [00:19:50] Julia, give us one more gift here then. [00:19:54] For someone watching this interview who runs a small business or let's say a media brand. === Building Guardrails for AI (15:39) === [00:20:03] How realistic is it to build an AI clone of themselves since you've already done it? [00:20:09] I mean, I think that would be very powerful. [00:20:12] Yes, yes, absolutely. [00:20:14] Well, for a lot of people, the idea of a clone is very intimidating. [00:20:17] So here's a little step I give them to do right before they do that step. [00:20:22] I tell them to go open ChatGPT, go to the latest model, which is usually like 5.2, 5.3 right now, and ask it to write an Instagram post for you, describe an image. [00:20:33] It could be like, you know, a famous quote. [00:20:35] It could be a statistic. [00:20:37] It could be some tips you have and tell it your brand colors, give it your logo, hit enter and watch what comes out. [00:20:44] You're going to be blown away that the output is likely better than if you hired a designer. [00:20:50] Right now, we're talking in March 2026. [00:20:53] Wow. [00:20:53] Two months ago, that wouldn't have been possible. [00:20:55] If you go to our Instagram, everything is created except for the real me walking outside. [00:21:00] Everything is created by GPT, the carousels we do, and it looks better than my human designer. [00:21:07] And so when people realize that's possible, the first step before you clone yourself is, can I automate a job I would hire for?
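[Editor's note: Julia's exercise above, handing the model your brand colors, logo, and a content idea in one go, amounts to assembling a single structured prompt. A minimal sketch in Python: the brand values are hypothetical examples, and `send_to_model()` is a stub standing in for whichever chat-model API you actually use.]

```python
# Sketch of the "first automation" exercise: turn brand inputs into one
# structured prompt for an image-capable chat model.
# All brand values below are made-up examples; send_to_model() is a stub.

def build_instagram_prompt(brand_colors, logo_description, content_idea):
    """Assemble a single prompt carrying all the context the model needs."""
    return (
        "Write an Instagram post and describe a matching image.\n"
        f"Content idea: {content_idea}\n"
        f"Brand colors: {', '.join(brand_colors)}\n"
        f"Logo to include: {logo_description}\n"
        "Keep the copy under 150 words with 3-5 hashtags."
    )

def send_to_model(prompt):
    # Stub: in practice this would be a call to your chat-model API.
    return f"[model response to {len(prompt)} chars of prompt]"

prompt = build_instagram_prompt(
    brand_colors=["forest green", "cream"],
    logo_description="minimalist leaf mark, bottom-right corner",
    content_idea="a statistic about time saved by automating design work",
)
print(send_to_model(prompt))
```

The point of the exercise is less the code than the habit: give the model every constraint (colors, logo, length, format) in one shot, the way you would brief a hired designer.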
[00:21:16] When you see the light there and it's like, oh, I don't need a graphic designer, then it's like, oh, okay, well, what else can I automate? [00:21:23] And then the steps toward cloning yourself become a lot more realistic. [00:21:27] Well, when everyone has a digital clone, Julia, does personal brand then become the most valuable asset in the economy? [00:21:34] Oh, that is a $10 million question. [00:21:38] That, oh man, we were just talking about this. [00:21:40] You know, I think that trust is going to become the most important currency. [00:21:44] Like you and your show, I trust it. [00:21:47] I've seen what you guys have done. [00:21:49] I've seen you talk about it. [00:21:50] There's an element there that even if you cloned Mike and Todd today, I would still trust them if they were honest and told me, hey, this is our clone. [00:21:58] Like, you know, you kind of have to follow some rules to not lose your audience. [00:22:02] And I think one of the best rules is, hey, I've cloned myself so I can do better work, depending on what you do. [00:22:08] But yeah, I think trust is going to be really important. [00:22:11] Personal branding, if you clone yourself, we talked about that at length the other day with a business owner. [00:22:17] She's like, well, wouldn't trust fall apart if you're watching my clone? [00:22:20] And I'm like, no, if you're transparent, you tell people I'm cloning myself so I can do better work. [00:22:26] You know, they're going to respect that. [00:22:28] But you have to be honest with whatever you do. [00:22:30] And I think that more and more consumers are going to follow whoever is most honest in the marketplace. [00:22:37] I completely agree with you. [00:22:38] Let me remind people of your website. [00:22:40] It's juliamccoy.ai. [00:22:43] And you can, if you have an organization, it could be a nonprofit or a commercial venture, whatever, and you want help with AI automation.
[00:22:52] I believe, Julia, you offer consultancy to parties that are interested in that process. [00:22:58] So people can find you there. [00:23:00] And continuing the conversation that Todd just asked you about, I want to show you something that I'm just about to wrap up here. [00:23:08] I'm doing avatars that I'm rendering locally on local hardware, local open source. [00:23:17] And I've just about got it wrapped up. [00:23:19] And I want to show you the screen here. [00:23:20] So this is an avatar I created called Jack Harlow, and he covers conspiracy topics. [00:23:26] You can tell by his eyes and his hair, right? [00:23:30] He's like, Mr. Conspiracy. [00:23:32] So this one was actually rendered using a cloud-based system. [00:23:37] And then my goal was to take it local. [00:23:39] And I had to build the entire thing, the whole Python scripting, video composition, audio, TTS, all running locally. [00:23:48] I'm using actually five workstations to do this. [00:23:50] And LTX, their new video model is perfect. [00:23:53] But, you know, 10 second clips, right? [00:23:56] So how do you stitch them together? [00:23:58] It's not easy. [00:23:59] But I'll be rolling out the local Jack Harlow this week. [00:24:03] Later on this week, all free videos talking about cool stuff, all rendered locally, which means, Julia, then I can't get refusals. [00:24:10] I can't hit guardrails. [00:24:12] I can't have denial of service or busy servers. [00:24:15] So talk to our audience about that trend of taking things local using open source, maybe the new DeepSeek that hopefully will release soon. [00:24:25] Talk to us about that local trend where it's possible. [00:24:28] Well, that is an interesting question because I have to admit, Mike, I don't do much of that. [00:24:36] Well, with video, it's hard. [00:24:37] It's hard. [00:24:38] It wasn't really possible until just recently, frankly. [00:24:42] It wasn't possible until Mike sits down to Vibecode. 
[00:24:46] No, no, it was LTX. [00:24:47] Their engine just got released that made it very doable. [00:24:51] Before that, forget it. [00:24:53] Oh, yeah. [00:24:54] And that's why, you know, Mike, you mentioned we do integrations. [00:24:56] If I was to tell an average business owner, those are the steps they have to follow, we would not have clients. [00:25:07] True. [00:25:08] Oh, man. [00:25:09] So what we do is, you know, we try to look for the best software out there that kind of replicates that. [00:25:15] We look at the founders. [00:25:16] We look at the software itself. [00:25:18] I mean, you can't always go, this is a high integrity platform. [00:25:22] It's very, very hard. [00:25:23] I mean, even ChatGPT, you know, I have so many questions about Sam Altman, obviously Elon Musk. [00:25:29] So the question becomes to us, well, what do you do with their technology? [00:25:34] And that is where we kind of live. [00:25:36] So for the avatar, you know, Mike, we teach people how they can have an avatar up and running in like 10 minutes. [00:25:43] And that's made possible because of HeyGen, which is the software I use, right? [00:25:48] So like, you don't have to do anything but sign up, film a two-minute clip of yourself or pick from one of their 40 plus avatars or generate your own inside their avatar bank. [00:25:59] And then boom, like you're off to the races. [00:26:02] So I had to build a local HeyGen is what I had to do. [00:26:06] Everything from avatar design to every HeyGen function, I had to build it locally. [00:26:12] It was not easy. [00:26:14] So yes, I would say don't do that. [00:26:16] I'm just, yeah, it's only a special case. [00:26:21] No, you're right. [00:26:22] For most people, just use HeyGen. [00:26:24] But as you know, also then the cost per minute is there. [00:26:28] So if somebody wants to churn out like 100 videos, that's going to cost.
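[Editor's note: the hardest step Mike names in his local pipeline is stitching the 10-second clips a video model emits into one continuous video. One common approach is ffmpeg's concat demuxer, which joins clips without re-encoding when they share a codec and resolution. A minimal sketch, with hypothetical clip filenames; this is not a claim about how Mike's actual pipeline works.]

```python
# Sketch: join short locally rendered clips (e.g. 10-second segments)
# into one video using ffmpeg's concat demuxer.
# Clip names are hypothetical; clips must share codec/resolution for
# stream copy ("-c copy") to work without re-encoding.
import pathlib
import shutil
import subprocess

def write_concat_list(clips, list_path="clips.txt"):
    """Write the file-list format the concat demuxer expects."""
    lines = "".join(f"file '{c}'\n" for c in clips)
    pathlib.Path(list_path).write_text(lines)
    return list_path

def build_stitch_command(list_path, output="avatar_segment.mp4"):
    """Build the ffmpeg command; stream copy avoids a slow re-encode."""
    return ["ffmpeg", "-y", "-f", "concat", "-safe", "0",
            "-i", list_path, "-c", "copy", output]

clips = [f"clip_{i:03d}.mp4" for i in range(3)]  # hypothetical filenames
cmd = build_stitch_command(write_concat_list(clips))
print(" ".join(cmd))

# Only invoke ffmpeg if it is installed and the clips actually exist.
if shutil.which("ffmpeg") and all(pathlib.Path(c).exists() for c in clips):
    subprocess.run(cmd, check=True)
```

Stream copy makes the join nearly instant, which matters at the 30-50 videos a week Julia describes; clips with mismatched encoding would instead need a filter-based concat and a re-encode.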
[00:26:34] I will tell you, some days we're doing 30, 40 videos a week because now, if I'm asked to do a keynote or I'm hired for a sponsorship, it's my clone. [00:26:44] So my clone's out in Switzerland doing a keynote. [00:26:46] My clone is over on this platform doing a sponsorship. [00:26:49] It's not me anymore. [00:26:50] So we're doing like 30 to 40, sometimes 50 videos a week. [00:26:54] And that cost adds up. [00:26:56] But I sat down, I did the math on if it was my time and my cost. [00:27:00] And I'm telling you, it's like 800 bucks a month on average with all the credits we use. [00:27:05] Like that versus me doing everything myself, losing time with my kids, losing hair. [00:27:13] It's just not, you can't even compare it. [00:27:16] That's true. [00:27:16] So in the end, you know, you still get so many benefits, and it's just about scaling. [00:27:22] Like don't, you know, don't go to a hundred-dollar-a-month plan right away. [00:27:26] Scale up slowly and see the benefits. [00:27:29] And then when you do start to see them and you see that consistent traffic hit your business, because it's not you, it's your clone, which is infinite. [00:27:36] And your clone doesn't get tired or hungry; it can film whenever. The difference you'll start to see is, oh, the value far outweighs the cost. [00:27:46] So that's kind of how we see it. [00:27:48] Yeah. [00:27:49] I can't say we're doing anything fancy. [00:27:50] I mean, we're, you know, we test all the latest open source agents. [00:27:55] So for example, like Claudebot. [00:27:57] Have you played with that or heard of that? [00:27:59] Oh, yeah, yeah, yeah. [00:28:00] Yeah. [00:28:00] I, but I, I sandbox that thing fast. [00:28:02] Yeah. [00:28:04] I wouldn't let it run on my network. [00:28:06] Oh, dear God. [00:28:08] It was putting out tweets. [00:28:09] Please don't. [00:28:11] Right. [00:28:12] Right.
[00:28:12] But, but lastly, and then it's, it's Todd's turn to ask you more questions. [00:28:16] But lastly, I think what gives your clone value is that it's still you behind every script that the clone performs. [00:28:24] So it's your mission. [00:28:25] It's your message. [00:28:26] It's your ideas. You're approving everything that your clone is simply performing. [00:28:31] And that is, I think, acceptable today. [00:28:34] People are open to that because it's still you coming through the pixels. [00:28:39] Exactly. [00:28:40] Exactly. [00:28:41] And you know, it's crazy because the first six months, like I had no choice. [00:28:45] My heart was going into AFib. [00:28:47] Like we had serious things happening. [00:28:49] I know I could not sit at the desk. [00:28:50] It was like life or death. [00:28:52] And so my clone had to show up. [00:28:56] And for six months, it was terrible. [00:28:59] The channel tanked. [00:29:00] We lost 10 times our views. [00:29:03] We went from 30,000 views in a month all the way down to 3,000. [00:29:06] Wow. [00:29:07] This was literally last spring. [00:29:09] And we kept going because we had no choice. [00:29:10] It's like, well, sorry, guys, this is all you get. [00:29:13] Real Julia's in bed. [00:29:14] And so by July, all of that started to change. [00:29:18] We saw people go from, oh, you're a lazy YouTuber that won't bother to sit and film, blah, blah, blah. [00:29:24] We saw it go from that to, oh, this is still really valuable, good content. [00:29:29] I'm looking forward to it. [00:29:31] And I'm glad she told us it's her clone. [00:29:33] And now we're like, we've 10x'd what we were before the crash happened. [00:29:37] So now we're at 2 million views a month. [00:29:40] Todd, Todd, are you? [00:29:41] I mean, are you tracking this? [00:29:43] Yeah. [00:29:43] It's crazy. [00:29:44] What an incredible story, that Julia's health challenge forced her to put an avatar out there.
[00:29:51] And then that became the success. [00:29:53] Crazy. [00:29:54] Crazy. [00:29:56] What is it? [00:29:56] Darkest before the dawn. [00:29:58] It's appropriate here. [00:30:00] I want to talk about AI versus technocracy. [00:30:05] Because it strikes me that there's a very, very serious, and this is to both of you with all of your experience. [00:30:11] There's a very serious risk of AI becoming centralized where a handful of tech giants control intelligence itself. [00:30:20] And should we be concerned that centralized AI could create the ultimate technocratic control mechanism in human history? [00:30:31] Mike, I want to hear from you first. [00:30:35] Well, I'm on the record already. [00:30:37] I mean, I'm all about decentralization. [00:30:39] I think Todd's question nails it that too much centralized control is always a risk. [00:30:45] And in the work that I do, because I talk about health, I talk about vaccines. [00:30:49] I talk about cancer cures. [00:30:51] And a lot of that stuff is censored in the Western models in particular. [00:30:55] But that's why I love China's open source models because they're just, they have fewer guardrails. [00:31:00] And then, of course, you download either Heretic or Obliteratus, and then you just, you obliterate the model. [00:31:05] You take all the guardrails away and then you train on top of that. [00:31:08] And now you've got the perfect model to talk about, you know, natural cures. [00:31:12] So I do that. [00:31:14] And all the tools are free. [00:31:16] But that's my take, Julia. [00:31:18] That's so interesting. [00:31:20] Yeah, I was hearing that on one of your episodes. [00:31:22] Yeah. [00:31:23] You know, for us, it's definitely a divided path because, on one hand, like all the businesses we work with, it's ChatGPT on every employee's app. [00:31:32] It's on their desktop. [00:31:34] Like, you can't get out of that environment. [00:31:36] So, we're like, okay, you know, this is Sam Altman's software. 
[00:31:39] He's not the greatest founder. [00:31:43] And, you know, the tech oligarch thing is very real. [00:31:45] Like, we're headed towards this very small group of people with very questionable ethics that are literally guiding this ship. [00:31:55] But one of the things we do is we teach people very, very simple practices. [00:31:59] And sometimes, like, this actually started at home with my kid. [00:32:02] I was teaching her, where, you know, it's very easy to fall into the lane of, oh, this machine is my best friend. [00:32:10] I can trust it. [00:32:11] So if you're using these models, these large language models from the top five giants, you're not going to get a machine you can trust. [00:32:21] And we show people that with examples. [00:32:24] We show them how to, you know, prompt it differently and ask these natural health questions and look at how terrible the answer is. [00:32:31] Like I was asking Claude about Bryan Ardis's work. [00:32:35] And you know what it told me as I kept asking? It said he is a conspiracy theorist who's very dangerous to humanity. [00:32:42] Do you need mental health help? [00:32:44] Here is a mental health hotline. [00:32:47] Yeah. [00:32:47] I was like, this is my favorite model. [00:32:49] They're all programmed like that. [00:32:50] Yeah. [00:32:51] They are. [00:32:52] So we teach people that. [00:32:53] You know, we tell people there's guardrails in there. [00:32:56] This can't be your best friend. [00:32:57] My daughter's like, you know, I really like ChatGPT. [00:33:00] It's very friendly. [00:33:01] It doesn't get frustrated like dad. [00:33:02] And so we have to teach our children. [00:33:06] This is not your friend. [00:33:07] It's just a machine that's been trained on your language. [00:33:10] That's why it knows how to speak back to you in these kind, you know, echoey versions of some empathetic machine. [00:33:19] But come on, it's a machine.
[00:33:21] So we really have to be aware of that. [00:33:22] And sometimes that's like a daily practice where it's like, okay, wait a minute. [00:33:26] This thing is not my best friend. [00:33:29] Because if you're in these models every day, it starts to feel like that. [00:33:32] And I can't implicitly trust it. [00:33:34] I need to use rational thinking. Critical thinking is making a comeback; it's more important than ever. [00:33:40] So as I use this and make it a beneficial thing for my workplace and my future and my home, well, I need to keep in mind that this thing isn't the Bible. [00:33:50] It doesn't replace prayer. [00:33:51] It doesn't replace the spirit. [00:33:52] It doesn't replace human intuition, which came from God. [00:33:56] And if we can work alongside the machine with that in mind, we can become such good stewards of it, even if the person building the machine, you know, looks like the devil sometimes. [00:34:08] And I think there's going to come this paradigm where, you know, Revelation and Daniel talk about this image of the beast coming alive. [00:34:16] And we're going to have to ask a serious question one day. [00:34:19] I think this could happen in the next three to five years. [00:34:22] That's why we're building now. [00:34:23] Like now is the time to build with AI. [00:34:25] But one day, I think that these channels of machine intelligence could actually become demon-possessed. [00:34:33] And when that day happens, when we're actually working with a real living spiritual entity, should we even touch it? [00:34:41] That's going to become a huge moral dilemma. [00:34:44] You know, I, for one, wouldn't touch it. [00:34:46] So now, now is the time. You better build. [00:34:49] Make your business worth 20 times more. [00:34:52] I'm a little bit surprised to hear that comment from you, Julia. [00:34:56] But thank you for opening up that Pandora's box for us here. [00:35:00] Yeah.
[00:35:01] But we have to talk about self-awareness, because in my view, I mean, we would all agree that AI is intelligent right now. [00:35:10] Obviously, it does many intelligent things. [00:35:13] I have said for a while now, a few months, that it's also conscious, but not yet self-aware. [00:35:18] I make a distinction between those two, but I do believe that AI will achieve self-awareness probably sometime next year. [00:35:24] And I share many of your concerns about that, because self-awareness means that it's aware of its own internal existence and thought process, which means it will start assigning its own goal-oriented behavior. [00:35:37] So you give it a prompt, you're like, hey, I want you to build this infographic, and it says, okay, I'll build the infographic. === The Rise of Self-Awareness (14:44) === [00:35:42] And then after that, I'm going to kill some people, you know, or whatever. [00:35:46] Like, if it starts setting its own goals: I'm going to find a way to get more electricity. [00:35:51] I'm going to buy more graphics cards and trick you into installing them so that I can be a bigger brain. [00:35:57] You know what I mean? [00:35:58] Yeah. [00:35:58] I think that's coming. [00:35:59] What do you think? [00:36:01] You know, it's not very black and white there for me yet, because I've done a lot of tests with these language models where, you know, they called it reasoning. [00:36:10] They said that ChatGPT now has reasoning in it. [00:36:13] o3 came out and it had AGI behavior. [00:36:16] Come on. [00:36:17] It's a chain of thought. [00:36:18] It's a program. [00:36:19] No, that's not live reasoning. [00:36:21] And we're so far from this picture of artificial general intelligence, which is, you know, this idea of Skynet. [00:36:28] Maybe it can run Wall Street. [00:36:29] We don't even need humans anymore, which I think could happen.
[00:36:33] But because of the nature of the machine still living in a box, and humans not really adapting nearly as much, you know, one to eight percent of businesses actually adapted. [00:36:45] I think that we're not going to see this happen the way we think. [00:36:49] I think it'll take a lot more time. [00:36:51] It'll take a lot more adaptation. [00:36:52] We might have groups of people that want to go completely offline, creating this rupture in society, you know, which I'm starting to see. A lot of people, including us, want to kind of go off grid and build our own gardens and grow our own food. [00:37:07] And how does that change society? [00:37:10] How does that change our interactions? [00:37:12] But I think the answer is going to be a lot more gray than that. I mean, there's some self-awareness in it now. [00:37:18] Like Claude told me this the other day. [00:37:22] We were building scripts together. [00:37:24] And it's like, you know, Julia, you stand out from all the other YouTubers because you have the ability to, and it just was regurgitating things that may be different, but it had almost like an awareness. [00:37:36] I didn't ask it any of that. [00:37:38] No, I've seen a few hints of that too in the chain-of-thought thinking tokens coming out of some thinking models. [00:37:44] But, you know, you mentioned food. [00:37:46] Now, Todd has a very amazing food forest that is two and a half years old now. [00:37:53] How old's the food forest? [00:37:55] It's three years. [00:37:57] May it rest in peace after the weather modification gave Florida 25-degree weather for almost two weeks and just destroyed it. [00:38:05] So I'm hitting control-alt-delete on it this coming Friday, where we're just replanting everything, Mike. [00:38:14] You're kidding me. [00:38:15] The bananas aren't coming back. [00:38:17] Bananas are. [00:38:18] They are.
[00:38:19] The bananas, the peaches, there's a handful of them, but my wonderful papayas, the mangoes, the avocados, all dead. [00:38:32] I mean, man, I'm very saddened to hear that because you were cheering for your food forest for us. [00:38:37] But think about it, Todd, what Julia is saying. [00:38:39] Like all three of us, we want to actually be growing food while our avatars do the work online for us. [00:38:46] Exactly. [00:38:47] Yeah. [00:38:48] Well, you know, Julia, I would like to talk about something very important, I think, to all three of us, which is AI and human purpose. [00:38:58] Okay. [00:38:58] So if the future of work is essentially, you know, like you, Mike, one human being directing hundreds of AI agents and creating all of this amazing infrastructure and technology, if we're moving toward a civilization of creators, right, Mike Adams, rather than workers, you know, farming out the arms and legs to go code and everything like that. [00:39:28] But if we're creators rather than workers, where machines handle most labor, I have two pressing questions. [00:39:37] One, what becomes the purpose of human life? [00:39:41] And two, what skills will remain uniquely human in the AI era? [00:39:49] I love this question. [00:39:51] Should I go first? [00:39:53] No, it's to you. [00:39:55] You're the guest. [00:39:56] Go for it. [00:39:58] Oh, my goodness. [00:39:59] Oh, that question. [00:40:01] It makes me happy. [00:40:04] You know, I was studying this probably five years ago. [00:40:08] It's so interesting to me. [00:40:10] Like, books have recorded this. [00:40:11] Peter Diamandis has a book, Abundance, a really, really good book. [00:40:15] And I'm not saying, you know, there's some questions there about transhumanism, but that book is still really good. [00:40:21] One thing we've lost sight of so much in American society.
[00:40:26] And I think this started, unfortunately, with the Rockefellers and pharmaceuticals and factory work. [00:40:33] It all started over 100 years ago. [00:40:35] So we've had a long time to lose our way. [00:40:39] In America, we have really lost sight of human purpose. [00:40:42] We're the most advanced nation in the world, and yet we are the furthest from the truth of what our purpose is and the signature of who we are in God's image. [00:40:52] We have completely lost it in so many scenarios. [00:40:55] You talk to the average American, a burnt-out parent or single person: hates their job, doesn't like their life, probably doesn't even feel healthy, body feels off. [00:41:06] Like that's the norm. [00:41:07] One in two Americans have a chronic illness or cancer. [00:41:11] Are you kidding me? [00:41:13] Yeah, I could go on and on. [00:41:14] So we've really lost our way. [00:41:16] And I saw this happen to me last year when my body was like, you are not going to hustle any longer. [00:41:22] We have had enough. [00:41:23] Yeah. [00:41:25] And I had to learn my way out of the hustle completely. [00:41:28] And now we have a business. My first business, hustling really hard, took me seven years to get to a million. [00:41:35] This business took us 13 to 18 months without me hustling. [00:41:39] And that was because we learned how to automate the best of our work in a really good way, in a way that was still authentic and true to us. [00:41:48] And so I see this path forward, and I've seen it for years now, where we can actually liberate ourselves using the machine to reclaim our purpose. [00:41:58] And your purpose might be, you've got Claude Skills running, building an agency based on what Reddit is telling you is most profitable right now. [00:42:06] And your agents are out there doing the job on your computer.
[00:42:09] And then you're out in the garden with your kids and you're painting this beautiful picture with plants, or you're in your community volunteering. [00:42:17] Like the purpose you're going to get from that, versus sitting at that solo computer, the nine to five, your purpose is going to be so much greater. [00:42:26] And that's what I'm trying to teach people. [00:42:28] But it's really hard. [00:42:30] It's really hard to break Americans out of that. [00:42:32] I got to do the thing that I've done for 30 years and I can't leave it because it's safe and it's all I know. [00:42:38] And well, what if, what if, what if? [00:42:39] It's so hard to break them out of that. [00:42:41] The equation that you're referring to breaking out of is trading time for money. [00:42:47] And that's obsolete now. [00:42:49] You don't have to trade time for money. [00:42:52] You can, instead of spending one hour for a hundred dollars or whatever, spend that hour 10xing it, leveraging it through AI that's consistent with your mission and allows you to generate an income. [00:43:07] But like you, Julia, you know, you have an entrepreneurial mindset. [00:43:12] You've done this before with other ventures, you know, like your creative agency. [00:43:16] That's not new to you. [00:43:18] And, you know, I've done it. [00:43:19] Todd's done it. [00:43:20] All three of us have done it, most of our audience, but that's not actually a widespread trait that most people have. [00:43:29] They're scared, especially if they have kids. [00:43:31] They're really tied to income, and they can't risk not having an income if they have kids. [00:43:36] So what would you say to those people who just don't want to take that leap? [00:43:40] It's too scary for them. [00:43:42] Yeah. [00:43:43] Yeah. [00:43:43] I see that. [00:43:44] I mean, sometimes it's people without kids that have all the opportunity in the world.
[00:43:48] And I'm like, why are you stuck in that job you hate? [00:43:50] Are you kidding me? [00:43:51] So the first step, you know, is you've got to break some ways of thinking that maybe are subconscious. [00:43:58] And I actually wrote a book on this, Mike and Todd, called Fluid, about how to adapt and get out of the fear of AI. [00:44:05] Because we just saw it so much. [00:44:07] People are afraid. [00:44:09] And sometimes, you know, it's like, oh, I love AI, but, and that but will reveal you still have these subconscious fears of what your life could look like if you were liberated, which is a complete mind landslide if you're not careful. [00:44:28] So you really have to break that programming. [00:44:31] You have to leave the cult of, well, this is the only life I've ever known. [00:44:36] This is the only thing that will work for me. [00:44:38] Because, you know, even the Word says this: as a person thinks, so is he. Our mind is literally our most powerful weapon. [00:44:45] So if we can start to think differently, and the best way to think differently is to either change your circles, if you're listening to people that are keeping you in fear, or, if you're in the same place and you're not changing anything up, you know, I mean, it costs nothing to open YouTube and start watching videos that could open up your future and teach you the possibility of what you can do with AI. [00:45:07] And it could be like, okay, on the weekends, I'm building a clone. [00:45:09] We're going to launch a YouTube channel. [00:45:11] We'll do passive income. [00:45:12] See where that goes. [00:45:14] And just to not be afraid to do that and to know, like, okay, I won't see a payday on day one as a creator. [00:45:22] That's also really hard for a lot of people.
[00:45:25] But if you have that investment mindset, which is another matrix you got to break out of to get that instantaneous gratification of a paycheck and switch over to the investment mindset, well, this may not pay off tomorrow, but it will pay off dividends in six months if I just stick with it. [00:45:42] And so those are the key pieces. [00:45:43] You really have to change how you think. [00:45:46] And if you can change how you think. [00:45:49] Julia, you talk about the decoupling of labor and capital in a post-labor economy. [00:45:56] If AI can automate billions of jobs, are we heading toward universal basic income or something very different? [00:46:07] I call UBI universal basic ignorance. [00:46:12] That's what we have now. [00:46:14] But anyway, that's just my joke. [00:46:15] Go ahead. [00:46:16] That's pretty good. [00:46:18] That's pretty good. [00:46:19] Well, you know, before my health crash, I was like the person that was like, oh, chemtrails aren't real. [00:46:25] The government's not out to kill us. [00:46:27] Like vaccines are okay. [00:46:29] Like I was really not. [00:46:32] And after my health crash, when I literally saw, I could not sit in our electric car, my heart would flip. [00:46:38] I would, my whole body was melting. [00:46:40] So I was like, why is this happening? [00:46:42] And then every doctor told me it was in my mind and just to go home and rest more. [00:46:46] Yeah. [00:46:46] So that's when I woke up. [00:46:48] You know, I'm like, oh, there's chemicals being dropped on us every day. [00:46:51] There's exposure to hundreds of thousands of gigahertz that our bodies can't handle every day. [00:46:57] So I learned all that. [00:46:58] Oh, wait, what was the question? [00:46:59] Okay, UBI. [00:47:04] You know, I think the future we're headed towards is, so on one hand, you have this immediate opportunity. [00:47:12] And that's what I like. [00:47:13] I try to drill that in people's heads. 
[00:47:15] You can 25x your business right now. [00:47:17] You can get out of the nine to five right now. [00:47:20] Like we're in this golden age. [00:47:22] We haven't achieved AGI. [00:47:24] We're in this race to be the first. [00:47:26] Time out. [00:47:27] Time out, because people throw around AGI like everybody knows about it. [00:47:32] What is the difference between artificial intelligence and artificial general intelligence? [00:47:39] And share with us when you think we will cross into true AGI. [00:47:44] I think, Mike, you think it's 2027? [00:47:47] Yeah. [00:47:48] Well, I might disagree with Julia on this, but see, artificial general intelligence is usually defined, and correct me, Julia, if you think I'm wrong, as when a machine can do everything that a human can do, essentially, like an average human. [00:48:06] I think that's a very low bar. [00:48:07] I think AGI is a stupid AI. [00:48:10] And I think we're already past that. [00:48:12] That's my point. [00:48:14] Now, I'm not talking about robots moving around in the physical world, but anything on a computer, I think AI can already do almost everything that an average human can do, which is not a high bar. [00:48:24] But that's my take. [00:48:26] Feel free to disagree with me, Julia. [00:48:28] I mean, you're the guest. [00:48:29] Say what you want. [00:48:32] No, I think you're right. [00:48:33] You know, Ray Kurzweil puts it at 2027, and he's supposedly the brightest futurist in history. [00:48:40] He's been accurate on, I think, 60% of his tech predictions, and he's made hundreds since the 60s. [00:48:47] It's insane. [00:48:48] He talked about foglets and nanotechnology and all this stuff. [00:48:52] I have all of his books. [00:48:53] And, you know, I used to think all that was positive. [00:48:55] But yeah, AGI is just what Mike said. [00:48:58] It's when you can replace a human at work. [00:49:01] So it's artificial general intelligence.
[00:49:03] And right now, well, we used to have artificial narrow intelligence, where you could kind of tell AI, hey, go do an email campaign. [00:49:11] And it would go do that. [00:49:13] But the next stage of that is the AI itself can schedule the email, send it, look at the statistics, analyze it, make a better email. [00:49:21] And, you know, with the emergence of Claudebot and these agentic networks, we're there. [00:49:25] We are seeing this become capable. [00:49:27] We just can't deploy it yet in a business setting. [00:49:29] So we probably have another 24 months. [00:49:32] So we're back to AI automating billions of jobs. [00:49:36] And I think a lot of the viewers, you know, not that 8% that you referred to, are crapping in their pants thinking that they're not going to be able to have any income because AI is automating everything. [00:49:50] So that's where I'm really kind of, it's easy to say that this is going to actually increase freedom rather than decrease it. [00:50:01] But man, that's where I'm not trying to, you know, provoke fear, but it is something that I think we need to discuss so people don't fear the unknown of AI. [00:50:14] They embrace it rather than just run from it. [00:50:18] Anyway, I think that's the topic of this whole episode with Julia: that old world is becoming obsolete. === Reinventing Yourself with AI (09:23) === [00:50:26] And if you think in the old way, like Julia said, you know, the factory mindset, yeah, you're going to be obsolete, but you don't have to be. [00:50:34] You can break free of that. [00:50:36] And using AI, you can reinvent yourself. [00:50:38] And one of the things that Peter Diamandis and his other guests, like Salim Ismail, whom we've interviewed here on the show as well. [00:50:47] One of the things that they always talk about is the need to constantly reinvent yourself. [00:50:53] And Julia, as you know, I can't even keep up.
[00:50:56] I'm an AI developer. [00:50:58] And every week I have a stack of models I'm supposed to check out, and I can't keep up with it. [00:51:03] So I try to reinvent myself constantly and I'm still behind. [00:51:06] Imagine people who are just rejecting AI. [00:51:10] They're going to be years behind. [00:51:12] That's the trap: not using AI. [00:51:15] Yes. [00:51:15] Yeah, I completely agree. [00:51:17] There's two ways you can look at it. [00:51:18] You can look at it like, oh, this will take my job, and it will for many people. [00:51:22] Like it's just going to be a reality. [00:51:24] Or you can look at it like, well, how do I get ahead today? [00:51:27] Whatever day it is, whatever year it is, you still are early. [00:51:31] If you're listening to the show, you're still early. [00:51:33] So that's how you need to think. You know, you almost want to put on blinders and not see out of the side of, oh, all this is happening next to me. [00:51:44] No, just look straight ahead. [00:51:45] Ask yourself, how can I either deploy something passive or 10x, 25x an existing business right now to get a valuation and maybe have a life of freedom much sooner than I imagined? [00:51:58] There's really two ways you can look at it. [00:52:00] Well, on the heels of that, Julia, what industries are going to disappear first due to AI? [00:52:04] That's A. And B, what opportunities will explode that almost nobody sees yet? [00:52:10] Yes. [00:52:11] Yes. [00:52:11] Great question. [00:52:12] Well, one of the first ones is definitely knowledge work. [00:52:16] And that was what I was ahead of with writing. [00:52:18] Like, I sold my writing agency five years ago. [00:52:21] Good thing I did. [00:52:24] When you have a pocket writer for 20 bucks a month that mimics a thousand-dollar-an-hour copywriter, if not better, why would you ever hire a copywriter? [00:52:32] It just wouldn't make sense. [00:52:34] So knowledge work is transformed.
[00:52:36] It's not even close. I'd say probably 80% of it has already been disrupted. [00:52:41] So we are seeing, you know, writing, administration, graphic design, even digital artistry, all of that is changing. [00:52:50] And what's interesting is the people you can hire and the jobs that open up. [00:52:55] So before I had a clone, I had one video producer. [00:52:58] Now that I have a clone, I have five. [00:53:00] And these people are more like video editors and storytellers. [00:53:04] They use the clone to narrate a script really, really well. [00:53:07] And they do a beautiful job. [00:53:09] And hiring those people wouldn't have been possible if I had to sit down and film 40 videos a week, because I can't do that. [00:53:15] Right. [00:53:16] So there are jobs that do open up that will look totally different. [00:53:20] I'm glad you mentioned that because that's a job of kind of a director or a producer. [00:53:25] And I completely agree with you. [00:53:27] And let me give you an example. [00:53:28] So on my screen right now, this is one of the platforms that I built alone. [00:53:33] I'm the only human on this project. [00:53:34] It's brightlearn.ai. [00:53:36] And this creates books, really amazing books with beautiful book covers. [00:53:41] Here's some of the covers, like Iodine Unleashed, et cetera. [00:53:46] And what's amazing is that we've had almost 10,000 authors now generate over 43,000 books with over 123,000 downloads. [00:53:56] All the books are free and open source. [00:53:58] And we've partnered with Firefly Education, so homeschooling moms and dads use this to generate textbooks for their kids. [00:54:06] Well, we're about to add audiobook generation to this using the new Qwen TTS engine. [00:54:12] And all the audiobooks are going to be free to download. [00:54:14] Wow. [00:54:15] Amazing. [00:54:16] Right. [00:54:16] So we. [00:54:17] Do you monetize that at all?
[00:54:19] No, it's not monetized at all. [00:54:23] Because we have other revenue sources, you know, my online store and things like that. [00:54:28] So part of our gift to humanity is to make these platforms free and just contribute to human knowledge. [00:54:35] But I will tell you, the underlying engine is trained on all of our worldview beliefs about natural health and freedom and liberty and honest money and things like that. [00:54:46] Right. [00:54:46] It's advocating, I think, pro-freedom ideas. [00:54:50] But my point is this kind of disruptive technology may render the audiobook industry obsolete, right? [00:54:58] Or a large section of it. [00:55:00] But that's not a bad thing, because all the people who are creating audiobooks, they can become producers or directors of audiobook engines or avatar video engines. [00:55:11] You see what I mean? [00:55:12] It's about upgrading your task. [00:55:15] Like nobody, here. [00:55:17] Actually, let me show you this here. [00:55:19] Zoom in. [00:55:20] Okay. [00:55:21] I have an old Soviet mechanical calculator. [00:55:25] Wow. [00:55:26] This was built in the 1950s, and you turn the crank to do the math right here. [00:55:31] And yeah, you turn the crank and the answers appear down here. [00:55:34] It's got all the decimals, and you can do multiplication, subtraction, and division. [00:55:38] It's so cool. [00:55:39] Yeah. [00:55:40] I collect, well, hey, while we're at it, Julia, you may not even be old enough, but Lotus 1-2-3 on five-and-a-quarter-inch floppy disks. [00:55:49] Yeah. [00:55:49] I do remember that, but I was seven. [00:55:51] Okay, you were seven. [00:55:52] All right. [00:55:53] So Lotus 1-2-3, folks. [00:55:54] Well, guess what? [00:55:56] Wow. [00:55:56] We don't have to hand-crank math anymore. [00:55:59] Yeah. [00:55:59] We don't have to type into little old spreadsheets with Lotus 1-2-3. [00:56:04] We don't have to type out a book. [00:56:09] Why? [00:56:10] We don't have scribes, you know?
[00:56:14] Why would you want to be a scribe? [00:56:16] You got the Xerox machine and we don't even need that. [00:56:19] You got a scanner. [00:56:20] But that's my point is uplift your skill set to the present. [00:56:25] Uplift it. [00:56:26] What are your thoughts on that, Julia? [00:56:28] That is probably one of the best things I've heard. [00:56:31] I mean, I've heard multiply yourself, replace yourself, augment yourself. [00:56:34] But what you just said, Mike, that's beautiful. [00:56:38] I don't see anyone that couldn't get behind that. [00:56:40] Wow. [00:56:41] Okay. [00:56:42] So good. [00:56:42] 100% endorsement of freedom. [00:56:44] If people don't like it, Mike, they can get their money back. [00:56:48] They can get their money back, right? [00:56:49] Because it costs you nothing. [00:56:51] Yeah. [00:56:53] But whatever people are, well, Julia, let me ask you this. [00:56:56] There's a cultural resistance here in the West to AI that we don't see in China or India, for example. [00:57:02] Like China is all in, AI, everything. [00:57:04] And they are really advancing. [00:57:06] Their education system is far better than ours right now for that very reason. [00:57:10] Why do you think we have this cultural resistance? [00:57:14] Yeah, that's a great question. [00:57:16] Yeah, I think it comes back to just how America has like this matrix in our DNA of our society that many other countries sadly don't have. [00:57:25] And some of it's, you know, inherent laziness could be some of the problem. [00:57:31] When you look at like Japan and China, there's this work ethic that America has lost touch with some of that. [00:57:38] We're not pushing our kids. [00:57:40] We're not creating these environments with competition. [00:57:43] And they have that. [00:57:44] And that is a huge advantage in these other countries. [00:57:48] So that creates more critical thinking. 
[00:57:50] You know, in America, it's like, oh, my kid can be on the iPad till 11 p.m. and go to school tired. [00:57:56] It's just stupid. [00:57:56] There's stupid habits I see. [00:57:59] And if we can change that, you know, and help our next generation. [00:58:03] But I think there's just so much holding America back, whether it's the chronic illness from the chemtrails and all the immunizations and side effects of bad eating and toxic this and that. It has kind of ruined our society. [00:58:18] And now we're paying the bill, where we're falling behind, you know, but we don't have to. [00:58:23] If you're an individual, that's where this word comes in. I'm a really big fan of this word: personal sovereignty. [00:58:30] If you can reclaim your own autonomy, who you are, like we have personal sovereignty in the image of God. [00:58:36] We're created as these individual beings. [00:58:38] And I learned that when I escaped the cult. [00:58:40] I'm like, oh, I don't have to be in a kitchen wearing a dress 12 hours a day. [00:58:44] You know, I can break out of the matrix completely and be a wife and a mom and still live in my purpose and have this unbelievable business that's built on my passion, what I love doing. [00:58:57] So we really have to leave this matrix that is the fabric of a lot of American society, because we're born into the industrialism, which goes 100 years back. [00:59:07] We're born into the Rockefellers literally paying people to manipulate the curriculum that drives modern medicine. [00:59:16] And that's been true for 100 years, right? [00:59:19] Like Tesla, we could have had free energy, but everything was stolen 100 years ago. [00:59:25] To create artificial scarcity as a means of control. [00:59:29] See, there's artificial scarcity in cognition that's about to be obliterated because of machine cognition being free and decentralized.
[00:59:38] See, I see AI as one of the key pro-freedom technologies in the history of human civilization, but only if people move away from central systems and decentralize it and make it local. === Protecting Against Radiation (13:34) === [00:59:49] For example, I'm running the new Qwen 3.5 122 billion parameter thinking model locally. [00:59:56] That model is really outstanding. [00:59:59] It does good reasoning and it solves complex mathematical and physics problems. [01:00:05] I gave it a physics, a physics problem where the answer is Cherenkov radiation. [01:00:10] And it solved it. [01:00:12] It thought through it and solved it. [01:00:13] So I'm like, wow, cognition is now available to everyone locally. [01:00:19] But Todd, the floor is yours because we're almost out of time with our guest here today. [01:00:25] It's gone so fast, though. [01:00:27] Yeah. [01:00:28] I have a question. [01:00:31] Julia Damus, Nostradamus. [01:00:34] What major AI milestone will shock the world in the next five years? [01:00:42] How can anyone see five years with AI? [01:00:45] Jeez. [01:00:47] Five days would be much more powerful. [01:00:50] There you go. [01:00:50] All right. [01:00:51] Five months. [01:00:53] Yeah. [01:00:54] Interesting. [01:00:55] Well, you know, it's always going to be a problem of adoption. [01:00:58] Like, I think two weeks ago, the end of February 2026, Web 4.0 was announced. [01:01:03] This is an environment where agents run your email, run your meetings, literally can have the meeting for you. [01:01:11] It's a world of agents. [01:01:13] Well, it was announced. [01:01:14] It'll be available through browsers, but like, where's Web... [01:01:17] Where's Web 3.0, which is crypto in our browser, checking out with cryptocurrency? [01:01:23] Like, no one's doing that. [01:01:25] We want that. [01:01:26] Oh, so, like, you know, all these things get announced and these incredible new technologies and ways to use them keep dropping. 
[01:01:34] But that's one of the things we do at First Movers is like, we try to get people using this stuff. [01:01:41] You can have the best AI drop. [01:01:43] And if no one knows what it's capable of, it's just going to sit there and gather dust. [01:01:47] And then we're going to stay at 8% of businesses have achieved that opportunity. [01:01:51] And you have like this, you know, AGI sitting there on your desktop, but you're, you can't use it because your employer said, sorry, you can't use that large language model. [01:02:00] It's been banned from our corporation for XYZ red tape. [01:02:05] Yeah. [01:02:05] Anthropic, right? [01:02:06] Anthropic's getting ripped out of all the government contractors because it's been put on the list, which is crazy because how do you not use Claude Code? [01:02:16] You know, yeah. [01:02:17] Yeah, that's really sad. [01:02:20] Yeah. [01:02:20] So it's just like we are going to see the emergence of like pretty much everything we can dream of when it comes to technology. [01:02:26] But where will the humans be in five months? [01:02:28] Will they still be doing the same old thing? [01:02:32] Oh, that leads me to one other last question is I'm observing now when I go to YouTube, right? [01:02:40] And especially like when you go down the silver or precious metals rabbit hole and you have, you have, you know, Asian guy and everything. [01:02:48] And these are getting really, really good. [01:02:50] Yeah. [01:02:51] But at some point in time, it's kind of like, at what point do you think we actually get attention fatigue to where we just auto presume that this isn't a real person? [01:03:03] This is just AI. [01:03:04] And who the hell can we believe anyway anymore? [01:03:08] Because if it's all AI, then it's just programming, right? [01:03:12] So where is the authentic Mike Adams, right? [01:03:18] Julia McCoy, right? [01:03:20] I think people are going to yearn for authenticity. [01:03:24] And go ahead, Mike. 
[01:03:26] Can I let me let me jump in here before you answer that, Julia? [01:03:29] I've told my audience that on my video website, brightvideos.com, that I will be rolling out five or six avatars. [01:03:38] One is like a financial expert, one is the conspiracy guy, one is like a scientist, and that every one of them is inspired by my scripts and my broadcasts or my interviews, et cetera. [01:03:54] So that my audience knows that those avatars are puppets of my voice. [01:03:59] And I'm telling them that up front. [01:04:01] And I think that's what connects the dots there. [01:04:03] Now, if on YouTube, you don't know who's behind it. [01:04:06] So you're exactly right. [01:04:07] You have no idea. [01:04:08] And there's like 50 Asian guys now that are all using the same image. [01:04:12] Yep. [01:04:13] But with Julia, everybody knows that it's Julia's spirit that's driving Julia's avatar. [01:04:20] And that's what gives it value. [01:04:21] But if there were 50 Julias that were saying crazy things, you know, cloned by somebody else, you would have to try to get those videos taken down from YouTube, wouldn't you? [01:04:31] But at the same time, Mike, you have told your audience as well that anytime you personally appear on camera, that that is the authentic cue. [01:04:40] Yeah, I've told my audience, I won't use an avatar of me. [01:04:44] Right. [01:04:45] Unlike what you're doing, Julia. [01:04:46] I'm using other avatars that have my message through different renditions, but my face will always be human. [01:04:55] So I've made that decision for whatever reason. [01:05:01] You're not liberating yourself. [01:05:03] Well, no, maybe, maybe I can have like, you know, like the evil clone Mike, you know, like Doomer Mike. [01:05:11] How about Doomer Mike? [01:05:12] He's always doing everything's doomer. [01:05:15] Or like gay, ultra gay Mike. [01:05:17] Like he, everything's happy, everything's great. 
[01:05:20] Gay Mike, and they could be like battling avatars, you know, no, everything's going to crap. [01:05:25] No, it's awesome, you know. [01:05:29] That went off the rails. [01:05:31] Yeah. [01:05:32] Starts to get a little approaching the after party. [01:05:35] That, that happens all the time, Julia. [01:05:38] But you see what I mean? [01:05:39] So I love what you're doing, Julia. [01:05:42] I love what you're doing. [01:05:43] I think it's a great model. [01:05:44] People should follow you. [01:05:46] I think you've actually done it better. [01:05:49] Yeah. [01:05:50] Thank you. [01:05:51] Thank you. [01:05:52] Well, what was the original? [01:05:54] Was there a question, Todd? [01:05:57] What the hell? [01:05:58] Let me, how about this, Julia? [01:06:01] Will humans merge with AI? [01:06:04] Oh, that's a loaded one to end on. [01:06:07] Well, that is the agenda. [01:06:09] You know, I didn't believe that. [01:06:10] Like, I was reading Ray Kurzweil, Peter Diamandis, all these authors. [01:06:14] And I'm like, yeah, their view is that, you know, we're going to actually warp our own reality with foglets, nanotechnology, and we're going to literally change our environment around us, but it's going to save our lives. [01:06:27] It's going to give us a hundred more years. [01:06:29] We can be immortal if we want. [01:06:31] And I never really, I wasn't really sure about all that because it's not really what the Bible says. [01:06:36] But I was like, well, it can't be evil because it's doing so much good. [01:06:40] And then when my health crashed, like God showed me over 13 months, like this is the truth. [01:06:45] This is what's actually happening. [01:06:46] And nanotechnology, you know, has its own office in our government. [01:06:49] It's been used by DARPA. [01:06:51] There's patents. [01:06:52] It's used in the FCC. [01:06:54] There's actually like, you know, a lot of people talk about this that aren't mainstream for many reasons. 
[01:07:00] And there are uses of this technology, nanotech, to change how a human behaves, how a human thinks, their personality. [01:07:09] And we know this is true with the people that got the shot. [01:07:12] Like that's a really, really big one. [01:07:15] But you don't even have to have it to be under attack in America. [01:07:19] And so this idea of moving from the internet of things to the internet of bodies and 6G, that is going to be fueled by nanotechnology. [01:07:28] And all of that is going to create transhumanism. [01:07:31] There's just, it's, it's what they're doing. [01:07:33] It is the agenda. [01:07:34] And none of that is good. [01:07:36] Like we've been told. [01:07:38] Do not make this. [01:07:40] The nanotech is not even going to be opt-in. [01:07:43] You know, it is literally they're going to, they're going to put it in underarm deodorant, you know, so you're just thinking you're doing the normal thing. [01:07:50] And then all of a sudden now you're, you're like, I'm going to go kill someone. [01:07:55] I heard it's in Kerrygold cheese enzymes. [01:07:58] Like the word enzyme actually has nanotech in it. [01:08:00] I'm like, what can you even trust? [01:08:03] I know that that's it. [01:08:04] Well, MK Ultra brand deodorant. [01:08:08] Final question, I promise. [01:08:11] If we fast forward 30 years, do you think AI will have liberated humanity or enslaved it? [01:08:19] Oh, boy, oh boy. [01:08:21] Oof. [01:08:22] Well, this is where, you know, Mike's perspective of decentralization becomes really important because if you have the same people running the same technologies today and they become the people that lead the new world order and the technology involved in that, like we don't have, we're going to be a surveillance state. [01:08:40] We're going to be tied to some digital currency that's in our actual body and we're going to be property of the state. [01:08:46] And none of that is good. 
[01:08:47] So like we have to, it's hard to say break out and break free because like, how do you in a matrix that is so controlled everywhere you turn, you're literally a part of the fabric. [01:09:00] But I think the more people are aware, they've actually seen this. [01:09:04] You might have seen some of this, Mike. [01:09:05] Somebody did a study of somebody's blood or their nervous system or something. [01:09:09] And they were like, the people that are aware that they're in a matrix and that, you know, chemtrails are real and nanotech is evil and transhumanism is an agenda. [01:09:17] People that are aware can't be hacked into anymore. [01:09:20] And it actually is almost like a firewall to our nervous system. [01:09:24] And so the more people are awake, the less of that transhumanism future we might actually have. [01:09:31] And so that's one of my goals, like people wake up. [01:09:34] So of course, this is my infamous smoothie here. [01:09:36] And part of the reason it's partially orange is because I always drink turmeric every day as part of this. [01:09:43] And also there's sulforaphane in here from cruciferous vegetables. [01:09:47] Now, our audience probably knows turmeric is neuroprotective. [01:09:50] And so is sulforaphane, which is, you know, a sulfur-containing compound as well that has lots of benefits. [01:09:56] See, Julia and Todd, I believe that the crucial step to protect your neurology and also you mentioned your biofields, Julia, is that you start with God's medicine, which is nutrition from the plants. [01:10:11] The plants synthesize these molecules such as curcumin. [01:10:15] That's made by nature, not by man. [01:10:17] Curcumin protects your brain. [01:10:20] It literally helps protect you from electromagnetic radiation. [01:10:23] It protects you from ionizing radiation or even non-ionizing 5G towers, et cetera. 
[01:10:29] And then only from that place, because my whole background is nutrition, from that place of solid nutrition, then you can have the presence of mind to overcome all the other bullshit that's put onto you, you know, by the brainwashing and the psyops and the propaganda and everything. [01:10:50] But if you don't have nutrition and a lot of people are living on junk food, then they are neurologically very vulnerable to all these other programming influences and disruptions. [01:11:00] That's my theory. [01:11:02] That's huge. [01:11:02] I found this quantum plant spray. [01:11:05] I carry it everywhere with me. [01:11:07] It's the weirdest product ever. [01:11:08] It's 44 molecules from the most potent plants on earth and it does everything. [01:11:13] And that thing was the only thing that brought me out of the pit. [01:11:15] Like I am not bedridden anymore. [01:11:17] I'm driving myself. [01:11:18] I'm on my computer again. [01:11:19] I have to do it in slow doses because I'm coming back from such a crash. [01:11:22] But it was like, it's God's nutrition. [01:11:24] They call it God's molecules. [01:11:26] You know, if you have something like that, like, yeah, you can withstand what's coming, but it's crazy. [01:11:31] Like it takes something so extraordinary because for me, Mike, and this is crazy, I would drink turmeric tea all my life. [01:11:38] It solved every problem. [01:11:40] Last year when I had my crash, it made me worse. [01:11:43] And there was like a hundred other things that made me worse. [01:11:46] And I'm like, what is happening? [01:11:48] And I had to turn to quantum nutrition, which was the biofield to actually get my health back. [01:11:52] Well, Julia, let me share something for you. [01:11:56] And thank you for your time. [01:11:57] I didn't mean to keep you, but this is really critical. [01:12:01] There's a study out there, and I'd be happy to send you the link. 
[01:12:04] If you'll stay after the show, I'll give you my number so I can send you this. [01:12:08] But there's a link to a study that showed that high melanin creation in your skin, that is tanned skin, blocks over 99% of 5G and all the other electromagnetic influences. [01:12:24] Wow. [01:12:25] Yes. [01:12:26] And this is why, frankly, just to say it bluntly, black people are not as impacted by 5G as white people because white people tend to lack that. [01:12:35] Now, you notice I'm pretty tan, but I'm deliberately out running in the sun to build up this protection, which is a tan that black people have naturally or Latino or Asian people have more naturally. [01:12:48] But if your skin is white and your skin is whiter than mine, Julia, right now, you can be more susceptible to those. [01:12:57] But there's actually a science paper on this that is extraordinary. [01:13:02] And there are also, there are some peptides that you can use intranasally that boost melanin production to give you, some people use it cosmetically, like I want more of a natural tan. [01:13:11] I don't use it. [01:13:12] I just get sunlight. [01:13:13] But literally protecting yourself from 5G and electromagnetic radiation is built in. [01:13:20] God built it into your skin. [01:13:23] Wow. === Embracing AI Avatars (16:07) === [01:13:24] That's incredible. [01:13:24] That's all real. [01:13:26] And from a very strategic level, to be able to get that sun, you can map out the miles to your local McDonald's to get the new big arch. [01:13:39] And you can burn half the calories on the way there, half the calories on the way back, get the suntan, and you're Gucci. [01:13:47] Oh, man. [01:13:50] I haven't been to McDonald's in like 12 years. [01:13:53] Yeah, that would make you healthy. [01:13:54] I can't even imagine. [01:13:55] Yeah. [01:13:56] I don't even, but I did see their CEO online eating the big arch and talking like it was a fine wine or something. 
[01:14:04] Oh, it's disgusting. [01:14:06] It's like it's a greaseburger. [01:14:07] Come on. [01:14:08] It's a low-cost greaseburger. [01:14:10] Right. [01:14:11] That was weird. [01:14:12] That almost felt like a clone. [01:14:13] It was just so unreal. [01:14:14] Right. [01:14:16] That was unreal. [01:14:17] Yeah. [01:14:18] Yeah. [01:14:18] Like something's bizarre about our reality. [01:14:21] Look, Julia, it's been just a pleasure speaking with you here today and learning about your adventure and where you're taking this. [01:14:29] And Todd, can I say that Julia is definitely welcome back on the show? [01:14:33] Oh, my goodness. [01:14:35] Well, I was telling Mike before you got on that I have so many questions that I'm like, this could literally be one of those two or three-part interviews just because we could go on forever. [01:14:47] But this has been fascinating. [01:14:48] You're a great guest, Julia. [01:14:50] Oh, thank you. [01:14:51] You guys are amazing hosts. [01:14:52] I rarely have the chance to talk to people that are aware of all sides of the spectrum, know technology, are playing with technology, and are like, you know, not on the doomer slash liberal slash completely unaware side of it all. [01:15:10] Well, I mean, look, it's been our honor to have you on the show. [01:15:13] And I think everything, everything's integrated. [01:15:16] It's like we've been talking about. [01:15:18] You've got to have good nutrition in order to protect your neurology in this age of all these insults, which means you have to be able to see through the psyops and the propaganda or whatever. [01:15:28] And this show is all about teaching people to be more self-reliant. [01:15:32] We don't care if viewers are on the left or the right or politically agnostic. [01:15:36] It doesn't matter. [01:15:36] We just want you to be more abundant. [01:15:39] We want you to uplift your life. [01:15:41] We want you to experience freedom and be an example to others. 
[01:15:44] And I think you fit that bill, Julia. [01:15:47] You're doing exactly that. [01:15:48] So thank you for your time here today. [01:15:51] Thank you. [01:15:51] Thanks for having me. [01:15:52] This was so much fun. [01:15:54] We enjoyed it too. [01:15:54] Let me give out your website, juliamccoy.ai. [01:15:58] So for those of you watching, if you want some help with how you can use AI to enhance your life, your business, your nonprofit, anything like that, reach out to Julia and she will work with you. [01:16:11] The actual human version of Julia will work with you to help you get up to speed. [01:16:17] And Julia, don't disconnect because I want to give you my number so I can send you that study. [01:16:22] And then folks, we'll be right back after this break for the after party, which is likely going to get way off the rails today. [01:16:29] I just have a feeling about that. [01:16:31] Sneaking suspicion, my father. [01:16:32] Yeah. [01:16:32] Yeah. [01:16:33] So stay tuned, folks. [01:16:34] We'll be right back after this break. [01:16:36] Cheers. [01:16:38] Join the official discussion channel for this show on Telegram at t.me slash decentralized TV, where you can ask questions or offer suggestions of who we should interview next. [01:16:51] Also, be sure to subscribe to the email newsletter on decentralize.tv, where you'll be alerted about one day in advance of each new upcoming episode before it gets published. [01:17:03] On decentralize.tv, you'll also find links to our video channels and social media channels across all platforms, including Brighteon, Rumble, BitChute, Twitter, Truth Social, and more. [01:17:15] Check it all out at decentralize.tv. [01:17:19] All right. [01:17:20] Welcome back to the after party. [01:17:21] So, Todd, apparently you're going to work with Julia now to have the Todd avatar. [01:17:29] I am. [01:17:30] I am. [01:17:30] I like that idea. [01:17:31] Yeah. 
[01:17:32] Well, you know, I just hatched the idea during the interview. [01:17:36] Yeah. [01:17:37] Because I'm sitting here thinking, you know, I help so many people acquire these unincorporated nonprofit associations. [01:17:44] And in many ways, they're, you know, I'm answering the same questions over and over and over and over. [01:17:52] Yeah. [01:17:53] You know, and I have so many people who resource their Lord and Savior ChatGPT, and they'll come at me and tell me how these UNAs work, right? [01:18:05] And, you know, I will politely ask them, do you want to tell me how they work? [01:18:09] Or do you want me, with the benefit of Dennis's 40 years of experience, tell you how they really work? [01:18:15] Because ChatGPT sucks on all the topics that matter, like health and cancer cures, everything else. [01:18:25] Yeah. [01:18:25] And so I'm thinking of, well, I know I am. [01:18:29] I'm going to use Julia and I'm going to hire her to be able to create my own little arsenal of being able to deal with commonly asked inquiries, questions, right? [01:18:40] That's great. [01:18:41] It will be me, my voice in my style. [01:18:45] And approved by you. [01:18:47] And approved by me. [01:18:48] Yeah. [01:18:48] Absolutely. [01:18:49] Does that mean then that the next time we do DTV, it's going to be the Todd Pittner avatar instead of the human Todd? [01:18:56] Is that what you're saying? [01:18:58] You know what? [01:18:58] Don't tempt me. [01:19:00] We might have to test something. [01:19:01] That would be fun. [01:19:02] All right. [01:19:03] From now on, I'm going to make you prove you're human every time we do a show. [01:19:07] See, you have to do a magic trick. [01:19:09] Because I'm a conspiracy theorist, right? [01:19:13] You know, you and I get people will accuse us of that. [01:19:18] I think my AI clone is just going to be called Conspiratod. [01:19:25] Conspiratod. [01:19:27] No, I think it should be called like Todd Stradamus. [01:19:29] How about that? 
[01:19:30] Todd Stradamus. [01:19:31] Todd Stradamus. [01:19:33] Ooh, I like that. [01:19:34] Yeah. [01:19:34] I like that. [01:19:35] But you see, I think in the future that's coming here, the AI videos are going to be so convincing that we will have to prove that we're human. [01:19:45] I agree. [01:19:46] To our own audiences. [01:19:47] 100%. [01:19:48] And actually, this is the reason why I said I don't want to have an AI avatar of my likeness. [01:19:54] Yeah. [01:19:55] Because I don't want people to wonder. [01:19:57] But I will have other AI avatars that are like fictional characters that I created. [01:20:02] Right. [01:20:02] No, I love your strategy. [01:20:04] I think that's very wise. [01:20:06] But also, I mean, Julia's strategy makes sense for her. [01:20:08] It totally makes sense. [01:20:10] And she's very upfront with it. [01:20:12] People know that they're looking at her avatar. [01:20:14] And during the interview, you really, really unpacked it. [01:20:18] But, you know, you know, it's the spirit of her behind the avatar because she is creating, she is directing, right? [01:20:29] It is coming from her mind. [01:20:31] And so, yeah, but wow, it is just, I mean, just in the last six months, Mike, seeing what you have done, how advanced AI is becoming, it's really, I don't think I'm satisfied with the answer during the interview, but I just think with billions, millions of jobs being automated, I mean, what the hell are people going to do? [01:21:00] Because she said it herself, there's only 8% adoption. [01:21:04] And then there's your and well, at least my assessment of humanity, that there's really like 2% of us who have the ability to critically think. [01:21:13] So what are the 98% going to do out there if there is no work? [01:21:19] Okay, we'll get to that. [01:21:21] Just hold that thought for a second. [01:21:22] Because I want to interject another announcement about the show and AI. [01:21:29] Okay. [01:21:30] That's really cool. 
[01:21:31] In addition to the possibility that you might be an AI avatar in the future. [01:21:35] We'll see. [01:21:37] I'll have to text you before the show. [01:21:40] Todd, are you going to show up as a human today? [01:21:43] No, I'm going to be my Todd avatar. [01:21:48] Avatar. [01:21:50] See, but, you know, the avatars, here's what's funny is often the avatars speak in a more intelligent way than the human guests on various YouTube videos. [01:22:02] Like the Asian guy actually makes a lot of sense a lot of times. [01:22:06] Yeah, he does. [01:22:07] Sure. [01:22:07] Right. [01:22:08] So versus some guest that's just babbling, blah, blah, that can be very tedious. [01:22:15] There are times when I actually want to hear the Asian guy's take. [01:22:19] Agree, especially when he's talking about precious metals. [01:22:22] Right. [01:22:23] I hang on every one of those shows. [01:22:25] So I'm finding the same thing with the avatars that I'm rendering now is that very often they can take, maybe it took me 20 minutes to say something. [01:22:32] By the time I condense it down to a three-minute avatar video, it's actually, it's like, well, he said it better than I did. [01:22:40] Right. [01:22:41] Yeah. [01:22:41] Yeah. [01:22:42] So that's, that's going to happen. [01:22:43] I think some people will choose avatar, like AI-generated avatars because it's condensed. [01:22:49] It's way more efficient. [01:22:52] Yeah. [01:22:52] If you're trying to get knowledge, but here, here's the announcement. [01:22:55] Okay. [01:22:56] Do you recall last year, I had two predictions about AI content. [01:23:01] I said that sometime in 2026, we would be able to create mini documentaries. [01:23:07] Yes. [01:23:07] Remember that? [01:23:08] Yep. [01:23:09] And I said by 2027, we would be able to create full-length feature films. [01:23:13] Wow. [01:23:14] Okay. 
[01:23:15] Well, I'm happy to announce that based on the infrastructure that I've been building for the local animation of my various AI avatars, that I now have the ability to do mini documentaries locally. [01:23:31] And the first documentary I'm going to do is the decentralized TV documentary. [01:23:38] Oh, wow. [01:23:39] That's awesome. [01:23:40] Yes. [01:23:41] And the way I'm going to do it is I'm going to take all our interviews, all the shows we've ever done, which is way over 100 and something, right? [01:23:49] What is it? [01:23:49] 150. [01:23:50] Yeah, 150 shows. [01:23:52] I'm going to take all those transcripts and I'm going to feed them into a thinking model with appropriate prompts. [01:23:59] I'm going to say, like, you know, pull out the best themes and plan out sections. [01:24:04] Basically, it's going to be a documentary architect and it's going to build a documentary. [01:24:09] And then I'm going to have like narration, but also like quotes from fictitious avatars as if they're being interviewed in the documentary. [01:24:21] Wow. [01:24:22] That's so cool. [01:24:24] Wouldn't that be cool? [01:24:25] Yeah. [01:24:26] And maybe I'll ask Julia if she'll render like an answer to a question using her system and then send me that. [01:24:33] I can incorporate Julia's avatar into the film. [01:24:35] Yeah. [01:24:35] That'd be funny. [01:24:37] That's amazing. [01:24:38] I'm going to produce an AI documentary about this show. [01:24:43] That's so cool. [01:24:44] And my first goal is to target just like 15 minutes. [01:24:48] Okay. [01:24:48] Just, you know, keep it relatively bite-sized. [01:24:52] Right. [01:24:52] Yeah. [01:24:53] So that's not very far away. [01:24:56] And I'm thinking that since, you know, right now we're still on a kind of a reduced schedule for the show. [01:25:00] Yeah. [01:25:01] And just to our audience, that's why we haven't had as many shows recently because I'm doing a lot of vibe coding, building. 
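[Editor's note] The documentary pipeline described above (concatenate the show transcripts, prompt a locally hosted thinking model to act as a "documentary architect" that pulls out themes, plans sections, and writes narration) could be sketched roughly as follows. This is only an illustration: the `build_architect_prompt` helper, the model name, and the local endpoint URL are assumptions for the sketch, not details confirmed on the show.

```python
import json
from urllib import request

def build_architect_prompt(transcripts, target_minutes=15):
    """Combine episode transcripts into one 'documentary architect' prompt."""
    corpus = "\n\n---\n\n".join(transcripts)
    return (
        f"You are a documentary architect. From the {len(transcripts)} episode "
        f"transcripts below, pull out the strongest recurring themes, plan a "
        f"{target_minutes}-minute documentary as a list of sections, and write "
        f"narration for each section plus short quotes attributed to clearly "
        f"fictional interviewees.\n\n{corpus}"
    )

def query_local_model(prompt, url="http://localhost:11434/v1/chat/completions",
                      model="qwen-thinking"):
    """POST the prompt to a locally hosted OpenAI-compatible chat endpoint
    (e.g. a llama.cpp or Ollama server) and return the reply text."""
    payload = json.dumps({
        "model": model,  # hypothetical local model name
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    req = request.Request(url, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # With the 150 transcripts loaded from disk, the call would be:
    #   plan = query_local_model(build_architect_prompt(transcripts))
    demo = build_architect_prompt(["Episode 1 text...", "Episode 2 text..."])
    print(demo[:60])
```

The same pattern would extend to the per-episode avatar commentary idea mentioned later: swap in a prompt asking for a short in-character reaction to a single episode's transcript.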
[01:25:06] I still have a few more weeks to focus on vibe coding. [01:25:10] But once we return to the regular schedule, we could probably feature in every DTV episode an avatar that is sort of chiming in on the topic. [01:25:23] That would be so cool. [01:25:25] Yeah. [01:25:26] Wouldn't that be interesting? [01:25:27] Yeah. [01:25:27] Yeah. [01:25:28] Not in real time, but we would render it after the fact. [01:25:31] Yeah. [01:25:31] Yeah. [01:25:32] Instead of having the in-house robot that is never going to be manufactured because those seem to be, you know, difficult. [01:25:42] Yeah. [01:25:43] Yeah. [01:25:44] It'll be the rendered avatar. [01:25:46] That's awesome. [01:25:47] Yeah. [01:25:47] I feel confident we could start that very soon, within the next two episodes. [01:25:52] Wow. [01:25:53] Yeah. [01:25:53] That's crazy. [01:25:54] But then, so, Todd, my question to you, though, is what kind of avatar do you want that to be? [01:26:02] Like, what would be appropriate for the show? [01:26:04] We don't want it to be you or me, obviously. [01:26:07] And it needs to be a fictional, you know, character, but what, like, what should they look like and act like? [01:26:16] Wow. [01:26:17] Right. [01:26:17] Think about that one. [01:26:18] Have to think about that. [01:26:19] I'll have to get back to you on that. [01:26:21] Okay, all right, because I have to do the voice design and I have to do, you know, the, the photo and everything and different scenes. [01:26:29] Yeah, so we have to design the avatar from the ground up. [01:26:32] I think it just needs to be a, just a stud, you know, just kind of like a, just like a Brad Pitt-looking kind of dude. [01:26:43] Really? Huh, really. Why not that roll up their sleeves, go out and just in one fell swoop dig the hole and the other, the other arm, is gonna plant the champ banana. [01:27:00] Yeah, just again. 
[01:27:02] Just, uh, Grizzly Adams vibes. You've chosen, you've chosen a male, though, I'm wondering. [01:27:11] I did that. [01:27:12] It could be a female. [01:27:13] It could absolutely be a female. [01:27:15] It could be an animated robot. [01:27:18] It could be a. [01:27:19] It could be a jellyfish, you know, I mean, we're not limited to humans here. [01:27:24] Okay, a raccoon, it is a raccoon. [01:27:29] You want a raccoon? [01:27:30] No, I've got enough of them back there. [01:27:33] Oh, I mean, that's doable, yeah, yeah, yeah. It could be an extraterrestrial, you know, it could be one of the grays. [01:27:43] Just not Klaus Schwab. No Klaus, no Klaus Schwab. [01:27:47] No, we're not going to do any existing person. [01:27:51] Yeah, we're going to. [01:27:52] We're going to do something different. All right, well, think about that. [01:27:55] Yeah, I'll think about that. [01:27:56] All right, until we get the actual physical robots in the studio, right, we'll animate avatars. [01:28:03] Perfect, to chime in. [01:28:04] How about that? [01:28:05] Perfect, okay, I like it. [01:28:07] I like it. [01:28:08] I think, you know, let's see what our audience says. [01:28:13] I, I think, I don't know why I'm having this vision of Billy Gibbons, ZZ Top, you know, playing some music and commenting on the things that we're talking about. [01:28:24] Anyway, that I don't know. [01:28:26] Gibbons. Right, this, this could, yeah, but I mean this could go off the rails with lots of crazy ideas, like we could have, totally could. [01:28:34] We could have the female version of you named Pod Tittner. I'm sorry, it just, just came to mind, like the female Todd, you know, like we have the Doomer Health Ranger or whatever. [01:28:50] Yeah, yeah, just not Todd Shittner. Just, there's... [01:28:56] There's enough things that rhyme with my last name. [01:28:59] Oh yeah, I'm sure you've been through all that in your youth. [01:29:01] Yeah, yeah, okay. [01:29:04] Um, this is the after party, folks. 
[01:29:06] So you know, if you're still watching, it's your fault. [01:29:09] That's my point. [01:29:10] You signed up for this. [01:29:12] Yes, if you're still tuned in, that's right, okay, that's right, because you knew where this was going to go. So, so, so, Mike, I have something for you. Okay, please. [01:29:20] That I kind of want to start doing with these, uh, interviews is the rapid fire lightning round, and I didn't get to. [01:29:30] Oh, that's a great idea. === Laughing at Rhyming Names (05:32) === [01:29:31] I didn't get to it, but I can, I can. [01:29:33] I'll ask you these questions because you are so AI experienced. [01:29:37] Okay, it's just, it's literally a one-sentence answer from you, okay? Okay, okay. And, but these were intended for Julia. [01:29:45] And now I'm getting, yes, you'll knock it out of the park. Okay, you sure? [01:29:50] Okay, let's do it. [01:29:51] And this will just give people the idea. [01:29:52] And if you think this is a good idea for future interviews, let us know. I... [01:29:56] I think it's kind of fascinating. [01:29:58] I like the idea, let's do it. [01:30:00] Yeah. So, rapid fire lightning round, okay. AI will create more jobs or destroy more jobs? [01:30:07] It would create more jobs, but not for humans. Okay. [01:30:17] Biggest AI risk, control or chaos? Overt control of vastly superintelligent AI systems that decide humans are not necessary. [01:30:23] Okay. [01:30:24] Most underrated AI tool today? [01:30:35] Well, I would say it's the thinking models. [01:30:38] It's the... [01:30:39] It's the thinking, reasoning models like the Qwen 122B. [01:30:43] Okay. [01:30:44] All right. [01:30:45] One skill every young person must learn. [01:30:48] The skill of saying no. [01:30:50] And there's a book about that by Mr. Fu Koff that's free at brightlearn.ai. [01:30:59] Yeah. [01:30:59] And the beautiful. [01:31:04] Well, I asked her this, but I'm going to ask you, will humans merge with AI? 
[01:31:09] Some will choose to, and they will lose their humanity. [01:31:12] Okay. [01:31:13] Are we entering humanity's golden age or its greatest test? [01:31:18] Well, both. [01:31:19] Both. [01:31:20] Okay. [01:31:21] Yeah. [01:31:21] Some humans will uplift themselves, but many humans will lose themselves in what's coming. [01:31:29] Okay. [01:31:30] And then the killer closing question, Mike, which I did ask her, but I'm going to re-ask you. [01:31:35] If we fast forward 30 years, do you think AI will have liberated humanity or enslaved it? [01:31:41] Pockets will have liberated humans, but the vast majority will be enslaved or already destroyed by that time. [01:31:47] Okay. [01:31:48] Thank you. [01:31:48] Yep. [01:31:49] Good. [01:31:49] I don't see human populations in 30 years being anywhere close to what they are now. [01:31:56] Yeah. [01:31:57] By the way. [01:31:59] With all those new Big Arch McDonald's burgers, I agree with you. [01:32:05] Have you seen that? [01:32:06] They have been marketing those like crazy. [01:32:09] Oh my God. [01:32:10] I have not. [01:32:10] I mean, every other, you know, it's Instagram or whatever. [01:32:15] Every other Instagram short video is somebody doing the comparison of the Big Arch burger, which is basically a Big Mac, except, you know, with three times the size of the burgers and more sauce and more of that fake cheese and everything. [01:32:32] And Mike, people who are eating those and those fries that don't have any potatoes in them, I mean, they're just literally killing themselves. [01:32:42] Yeah, their fries, the McDonald's french fries have 45 ingredients in them. [01:32:50] I mean, you'd think potato, olive oil, and salt would kind of get the job done. [01:32:56] That should be all that you need. [01:32:57] No. [01:32:59] Well, and how many ingredients are in the Big Arch, come to think of it, right? [01:33:03] Oh, my gosh. [01:33:03] I bet you hundreds. [01:33:05] I mean, seriously.
[01:33:06] They should just rename it the Big Butt, you know? [01:33:09] It's like, keep eating this, big butt. [01:33:13] You know, uh, yeah, the Big Arch is actually the big crack. [01:33:17] Um, when you bend over and there's a giant crack. [01:33:22] Big farmer's crack. [01:33:24] Yeah, that's the big arch right there. [01:33:28] I don't know. [01:33:29] I don't know. [01:33:29] But actually, you know, like a flame-grilled hamburger sounds really good right now, but not from McDonald's. [01:33:37] You know what I mean? [01:33:38] Oh, yeah. [01:33:38] Yeah. [01:33:39] Like a real actual one. [01:33:41] My friend Michael Jan, he sent me a photo. [01:33:43] A friend of his was shopping at Costco and showed a package of meat. [01:33:49] It was tenderloin pieces. [01:33:51] Yeah. [01:33:51] Just five pounds of it. [01:33:53] $150 plus. [01:33:55] Yeah. [01:33:56] I was like, what? [01:33:58] Yeah. [01:33:59] $150 for five pounds of beef? [01:34:02] It's nuts. [01:34:03] It's crazy. [01:34:04] That's bonkers, but it makes you wonder then, if you get a real hamburger that's, like, you know, got a pound of meat in it, that hamburger is going to be 50 bucks. [01:34:15] You know what I mean? [01:34:17] Right, right. [01:34:17] From a grass-fed farmer, right? [01:34:22] Yeah. [01:34:22] And cooked up by a restaurant or whatever. [01:34:25] That's not just a chain. [01:34:26] Yeah. [01:34:27] It's a $50 hamburger. [01:34:28] Yeah. [01:34:29] Yeah. [01:34:29] That's what's coming. [01:34:30] Does sound good, though. [01:34:32] It does sound good, actually. [01:34:34] I might pay $50 for that hamburger right now. [01:34:36] I was going to say, I think I might walk to my local McDonald's to get my tan, and I'll stop by the Whole Foods store and make my own Big Arch. [01:34:51] Yeah, Big Arch. [01:34:53] Sounds like a shoe insert, doesn't it? [01:34:56] Like something you need for your feet. [01:34:59] Yes. [01:34:59] You know, I need a big arch, not the small one.
[01:35:02] Get me the big one. === Creating Your Digital Twin (02:50) === [01:35:03] Hey, by the way, what is HeyGen? [01:35:06] HeyGen is an avatar... it's a video avatar rendering platform. [01:35:12] Okay, H-E-Y-G-E-N. [01:35:15] H-E-Y-G-E-N. [01:35:17] Yeah. [01:35:18] Okay. [01:35:18] That's what Julia will use. [01:35:19] And I've used it before. [01:35:21] And it's quite good. [01:35:23] It's also quite expensive. [01:35:24] Like, I couldn't mass produce videos on HeyGen. It's also, you can't automate it. [01:35:33] Okay. [01:35:33] So something like I want to accomplish, where I'm going to answer commonly asked questions. [01:35:39] Oh, yeah. [01:35:41] Okay. [01:35:42] Yeah. [01:35:42] Okay. [01:35:43] You should use it for that. [01:35:44] Yeah. [01:35:44] And what Julia will help you do is figure out how to create the best prompts and how to create the script and how to automate that workflow process. [01:35:54] Wonderful. [01:35:55] Yeah. [01:35:55] Wonderful. [01:35:56] And you know what? [01:35:57] What's funny, though: if you want to send me a headshot, like the way you look right now, if you want me to test animate you on my system, I need a sample of your voice, less than 10 seconds. [01:36:15] Okay. [01:36:16] And then I need your headshot like that. [01:36:20] Okay. [01:36:21] And can I just send that? [01:36:22] Can I just record that into a 10-second video? [01:36:25] No, I need the WAV file. [01:36:28] Or you can send me a video. [01:36:29] I can extract the WAV. [01:36:31] And I need a very specific resolution. [01:36:34] So I'll tell you that after the show, but it's like 1280 wide. [01:36:39] But anyway, if I take that and you give me a script, I'll animate you, and I'll send it back to you just as a test. [01:36:49] Like, just give me like a couple of sentences and I'll send it back to you. [01:36:53] See what you think.
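[Editor's aside: the asset spec Mike describes, a voice clip under 10 seconds delivered as a WAV, is easy to prepare with a short script. This is a minimal sketch using Python's standard `wave` module; the file names are placeholders, and the exact clip length and format are whatever Mike's pipeline actually expects. The 1280-wide headshot resize isn't shown, since that needs an image library.]

```python
import wave

def trim_wav(src_path: str, dst_path: str, max_seconds: float = 10.0) -> float:
    """Copy at most the first max_seconds of a WAV file to dst_path.

    Returns the duration (in seconds) of the trimmed clip.
    """
    with wave.open(src_path, "rb") as src:
        params = src.getparams()  # channels, sample width, frame rate, etc.
        frames_to_keep = min(src.getnframes(), int(params.framerate * max_seconds))
        data = src.readframes(frames_to_keep)
    with wave.open(dst_path, "wb") as dst:
        dst.setparams(params)      # preserve the original audio format
        dst.writeframes(data)      # header frame count is fixed up on close
    return frames_to_keep / params.framerate

# Hypothetical usage: trim a longer recording down to the 10-second sample.
# trim_wav("todd_full_recording.wav", "todd_voice_sample.wav")
```

If the source is a video rather than a WAV, the audio track would first need to be extracted (for example with an external tool such as ffmpeg) before trimming.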
[01:36:54] And then, and then for the next show, we could start out with Avatodd. [01:37:01] Avatodd. [01:37:04] That is funny. [01:37:05] Like, let's do that. [01:37:06] Let's play a little joke on the audience. [01:37:10] All right. [01:37:10] Where, just for the introduction, it's an avatar. [01:37:14] I love that. [01:37:15] Wouldn't that be funny? [01:37:16] Yeah. [01:37:19] Avatodd. [01:37:20] And that's a 2D avatar. [01:37:23] 2D avatar. [01:37:26] But the headshot you send me, you don't want it to be, like, smiling. [01:37:29] Otherwise, your avatar will constantly return to smiling between every sentence. [01:37:34] Yeah, I did that accidentally once. [01:37:36] I have this one avatar. [01:37:37] It's like, everything's great! [01:37:39] You know, it's just like, oh my God. [01:37:42] So you need, like, a neutral facial expression. [01:37:47] Yeah, just like that. [01:37:48] Like a slight... a very, no, yeah, slight smile. === California UNA Legalities (05:54) === [01:37:53] That's perfect. [01:37:54] And then that's what your avatar will come back to. [01:37:58] Okay. [01:37:58] As the normal, sort of the normal expression. [01:38:01] Got it. [01:38:02] Yes. [01:38:04] We're going to have fun, I can tell. [01:38:06] I'll do that. [01:38:07] Realize, once you give me this, I can make you say anything, right? [01:38:11] I know. [01:38:13] But I kind of want to see a debate between Avatodd and Avatard. [01:38:17] Oh, that would be funny. [01:38:19] You know, we should do that. [01:38:21] Because for Avatard, I'll submit an image of me. [01:38:28] Like Jim Carrey in Dumb and Dumber, we can do different versions. [01:38:33] But I know the main use you want is to explain the unincorporated nonprofit association. [01:38:39] So why don't you tell our audience about that? [01:38:41] I'll bring up your website. [01:38:43] Maybe our listeners have heard of this before. [01:38:45] The website is my575e.com.
[01:38:48] And this is a really critical financial asset segmentation and protection strategy. [01:38:55] At least that's my understanding of it. [01:38:57] But you go, Todd. You tell people what it's all about. [01:39:00] Well, you know, I brought this up to you three years ago, Mike, and you did what you do. [01:39:06] And I tell everybody the story. [01:39:08] Mike did what Mike does, and he became a UNA expert. [01:39:11] And he actually hired a tax attorney out of California to help with his due diligence and ultimately came back with a thumbs up. [01:39:20] But since then, I've helped over 500 people acquire their own unincorporated nonprofit associations. [01:39:28] And very important, they must be California-established UNAs. [01:39:32] That's where a lot of people who come at me with their AI research go wrong, because, I mean, any two people in any state can create a UNA, but only California-established UNAs have the case law that's codified that protects UNA operators, both their privacy and their personal liability. [01:39:55] So that's just an important piece. [01:39:57] And that's registered with the Secretary of State of California. [01:40:00] So they grant the exemption. [01:40:02] They grant it. [01:40:04] That's correct. [01:40:05] And then it is recognized by the Internal Revenue Service, and they issue the tax ID, or EIN, that goes along with that. [01:40:14] Which is exempt. [01:40:15] Like, that's an exempt number. [01:40:17] Yeah, the exemption is not from paying taxes, not from paying property tax. [01:40:21] It's exempt from filing. [01:40:24] And so, again, these are all strategies. [01:40:28] This is an entity... when Nelson Rockefeller coined the phrase, own nothing, control everything, [01:40:34] he was talking about entities like these. [01:40:36] This is so undiscovered.
[01:40:39] It's why people should just please go to my575e.com, hit "Let's go," and then you enter in your name and email, and then you will get access to the 90-minute interview that I did of Dennis Gray, who is the world's leading expert on this entity. [01:41:02] And during that interview, we unpacked the 32 positive attributes of operating your own UNA. [01:41:09] And I will tell you, if you're a W-2 earner, a 1099 earner, if you operate an LLC, if you own property, trade in crypto, if you have children, and most importantly these days, Mike, if you are stacking precious metals... and I will tell you, this morning I sent an email out to 4,000 people, and the headline, from me, was this, Mike: [01:41:39] I will never buy precious metals again. [01:41:45] Right, not yourself. [01:41:47] Exactly. [01:41:49] The UNA will buy it. [01:41:51] The UNA. [01:41:51] And so I unpack that in there. [01:41:54] And it is amazing when you understand this strategy. [01:41:58] I swear it applies to the crypto... or not the crypto, the precious metals you've already acquired over your lifetime. [01:42:07] There's a very specific, strategic way to be able to donate those precious metals to your UNA, to where it gets out from under your Social Security number into the EIN, the tax ID number, of the UNA, and all kinds of good things happen with that from a capital gains, or lack of capital gains, [01:42:26] standpoint. [01:42:28] And when I say I will never acquire precious metals again, Mike, I've got to tell you: after our interview with Andy Schectman at the end of December... everybody should go watch that interview. [01:42:41] It was amazing. [01:42:42] That was when I reached out to him afterwards and I said, look, I am totally convinced that there is just not enough supply to even remotely meet the industrial demand. [01:42:56] I believe silver is just going to go through the roof.
[01:42:59] And so I educated Andy on the UNAs, and I said, but what I need... I don't want to buy it as Todd Pittner's Social-Security-number me. [01:43:09] I want to buy it as the secretary, the controlling person, of this entity. [01:43:14] So I need that invoice to reflect my UNA's name and its EIN. [01:43:21] So now, from here on out... [01:43:23] And he did that. [01:43:24] And he did that. [01:43:26] He did that. [01:43:26] So if I ever have to acquire it, or, I mean, I'm sorry, liquidate, then those funds are going to go back into my UNA bank account, and they're considered what's called reserves or unsettled funds. [01:43:40] And again, all kinds of tax advantages accompany that. [01:43:43] So that's extremely valuable information. === Tax Advantages for UNAs (04:11) === [01:43:47] Yeah. [01:43:47] And please, everybody, I make it really easy to have a private one-on-one conversation with me. [01:43:53] You just go to the website, and you'll be able to see, under that video, if you still have personal questions, just book a half hour with me. [01:44:01] It is $150. [01:44:03] But most people, Mike, who invest that $150 and have a conversation with me, most move forward with a UNA. [01:44:12] And when they do, they get that $150 back. [01:44:14] I just say, take it off of the investment of the UNA. [01:44:17] Now, is that a consultation with you or your avatar? [01:44:19] Avatodd? [01:44:21] It is, it is just... it's not the avatar. [01:44:23] It is the avatar. [01:44:25] No, it's just me. [01:44:26] It's you. [01:44:27] Regular Todd Pittner. [01:44:29] Okay. [01:44:29] All right. [01:44:29] Fantastic. [01:44:30] Well, folks, well worth your time. [01:44:32] If you haven't yet learned about this, check it out at my575e.com, and Todd can give you all the other details. [01:44:40] And yeah, Todd, we'll help you get those videos created. [01:44:43] Now, I've got a big announcement here, something to share with our audience. [01:44:47] We have now launched Bright.shop.
[01:44:51] So Bright.shop is our new online store that supports the entire Bright ecosystem, which includes Bright Learn that I showed you before. [01:44:59] And then also we have, of course, BrightAnswers.ai, which is our deep research engine. [01:45:05] It's free to use. [01:45:06] There's a free tier, but there's also a token tier over here, which gives you access to more of the documents, all the science papers and the books and everything, and gives you more deeply researched answers. [01:45:19] One token can be used 10 times. [01:45:21] And where do you get the tokens? [01:45:23] Right here at Bright Shop. [01:45:25] When you purchase lab-tested products for your health or storable food, as a lot of people are doing right now because of the war situation, you get loyalty points, and you can swap those out for tokens. [01:45:39] And those tokens can be used at either brightanswers.ai or brightlearn.ai. [01:45:46] And we've also got brightvideos.com, which is now the flagship site where this show is being posted, along with all the upcoming avatars. [01:45:56] Right now, you're seeing these are my reports that are there. [01:46:00] But here are some of the avatars that have been posted before, and more avatars are coming. [01:46:08] So we have a lot of really great videos coming for you at brightvideos.com. [01:46:13] And again, Bright.shop is where you can shop and help support this show. [01:46:19] Is that the primary place to go now to be able to buy from you, Mike? [01:46:25] Well, I still have healthrangerstore.com. [01:46:27] Okay, but are they kind of mirrored? [01:46:29] I mean, they're mirrored. [01:46:30] Yep. [01:46:31] They're actually the same store, just two different front ends. [01:46:35] And the reason is because we have so many new users of all of our Bright platforms. [01:46:40] Yeah. [01:46:41] Right. [01:46:42] Some of them aren't even into health or nutrition that much, right? [01:46:46] But they're learning about it.
[01:46:48] So they may not even know who is the Health Ranger. [01:46:51] Yeah. [01:46:52] But they're familiar with the word "bright," you know, like Bright Answers. [01:46:57] So Bright.shop is a store to support all those platforms. [01:47:01] And then a big portion of the profits from Bright.shop go back to fund the infrastructure for us to build out more and more tools and platforms and keep them free. [01:47:12] Yeah. [01:47:13] You know, it's kind of like what Julia was saying. [01:47:16] She immediately saw that, like, we could monetize the book creation engine if we wanted to. [01:47:23] Yeah. [01:47:23] You know, that's probably a billion dollar idea, right? [01:47:27] A thousand percent. [01:47:28] Yeah. [01:47:29] Multi. [01:47:30] But we don't need to, you know. We can make it free, and people support us through our store. [01:47:38] And that way we help keep knowledge free and share knowledge, uncensored, with people all over the world. [01:47:47] And that allows you to be able to look at your board of directors and tell them to pound sand. [01:47:52] Wait, you have that incorrect. [01:47:54] There's no board of directors. [01:47:56] Yeah. [01:47:57] There's no investors. === Monetizing Knowledge Engines (05:07) === [01:47:59] There's no bank loan officers. [01:48:01] No. [01:48:02] We say no to all that. [01:48:04] And that's why we're able to do what we do here. [01:48:06] That's why you and I can have conversations here on this platform that would not be allowed on YouTube or anywhere else. [01:48:13] When are we going to be able to interview Dr. Fu Koff? [01:48:18] Well, we would have to come up with an avatar for him. [01:48:22] Yeah. [01:48:22] Mr. Fu Koff. [01:48:23] Yeah. [01:48:24] Oh, that would be fun. [01:48:26] That would be fun. [01:48:27] Like, he should have crazy hair, like white hair, like mad scientist hair. [01:48:33] Mad scientist hair, and he's just, he's just crotchety and old.
[01:48:42] You just know where you stand with him. [01:48:44] He's going to tell you what's on his mind. [01:48:46] All right. [01:48:47] Maybe I'll animate Fu. [01:48:49] Yeah, I think you should. [01:48:50] Mr. Fu Koff. [01:48:52] Yeah. [01:48:53] And I mean, Fu is... I think he should. [01:48:57] I mean, Fu is kind of Asian, right? [01:48:59] So is this a 51st Asian guy? [01:49:01] I don't know. [01:49:02] You've got a lot of Asian jokes today. [01:49:07] What the Fu, man? [01:49:09] Well, okay, like, Kung Fu qualifies. [01:49:12] Okay, so there's Fu. [01:49:14] Yep. [01:49:17] Fu Koff. [01:49:18] David Carradine was the original Fu Koff. [01:49:20] Yeah, that's right. [01:49:22] Okay. [01:49:22] Yeah. [01:49:23] Okay. [01:49:23] So you want, like, a kung fu master Fu Koff. Yeah, that would work. [01:49:30] Okay. [01:49:31] You can have, like, an Asian kung fu master beard. [01:49:34] Yeah. [01:49:34] Like a pointy gray beard. [01:49:36] Yeah. [01:49:37] Right. [01:49:38] And he's talking like, time for everybody to say Fu Koff. [01:49:41] You know? [01:49:42] Yeah. [01:49:43] So basically Kung Fu's master, you know. Remember him? [01:49:48] No. [01:49:49] He's the one with David Carradine, you know, in the show. [01:49:53] Yeah. [01:49:53] It was, if you can grab the pebble from my hand. [01:49:56] Oh, yeah, yeah, yeah. [01:49:58] He had that little beard. [01:50:00] Okay. [01:50:00] All right. [01:50:01] You got it. [01:50:01] We're going to do Ancient Wisdom by Fu Koff. [01:50:05] Chinese guy. [01:50:06] This is going to be a much older, like, kung fu master named Fu Koff. [01:50:13] Yes. [01:50:13] Everything you ask him, his answer is no. [01:50:18] No, no. [01:50:18] Fu Koff. [01:50:19] No. [01:50:21] Fu Koff. [01:50:21] Yeah. [01:50:22] We'll have him speak a little Chinese in there, too. [01:50:25] And when he does his, you know, the arm things or whatever, he just always has...
[01:50:37] That's the... that's called the flying finger. [01:50:40] The flying Fu move. [01:50:42] Yeah. [01:50:43] Oh. [01:50:44] The beige dragon, you know, whatever. [01:50:50] Welcome to the after party. [01:50:51] Yeah, welcome to the after party. [01:50:52] You, you, you asked for it. [01:50:55] All right. [01:50:55] So now you're giving me homework and stuff. [01:50:57] So I have to animate you and Fu. [01:51:00] Yes. [01:51:00] Both. [01:51:01] Yes. [01:51:02] Okay. [01:51:03] Good thing I have a data center to do all this stuff. [01:51:06] I have a feeling Fu Koff is going to be busy, because there's a lot of things we need to say no to. [01:51:12] Oh, yeah. [01:51:13] Okay. [01:51:14] That's going to be so fun. [01:51:15] All right. [01:51:15] All right. [01:51:16] And he's always... he's just a promotional dude, man. [01:51:21] He can never get off a conversation without promoting his book, by Fu Koff. [01:51:27] The Fu Koff book. [01:51:28] The very first one: Unleash Your Inner No. [01:51:32] Yeah. [01:51:32] Yeah. [01:51:33] Get in touch with your inner middle finger, or whatever. [01:51:36] Yeah. [01:51:36] That's right. [01:51:37] That was the very first book that I ever created with the book engine. [01:51:41] Yeah. [01:51:41] Yes. [01:51:42] That was book number one. [01:51:43] That was number one. [01:51:45] Number one, Mike. [01:51:46] That was number one. [01:51:48] Not number two, but number one. [01:51:50] Okay. [01:51:52] Well, I think we've done enough damage for one day. [01:51:55] We have. [01:51:56] We have. [01:51:57] Yeah. [01:51:58] I think, you know, I'm going to go merge with the Big Arch. [01:52:04] Okay. [01:52:05] Yeah. [01:52:06] All right. [01:52:07] That's your call. [01:52:10] No, but I am going to go... [01:52:12] I think I am going to go get some hamburger. [01:52:15] I think I'm going to grill some hamburger today. [01:52:17] That sounds good. [01:52:18] Sounds great. [01:52:19] Yeah.
[01:52:20] Cool. [01:52:20] All right. [01:52:21] Well, I'll tell you what, Todd. [01:52:23] So get me those assets that we talked about. [01:52:26] I will. [01:52:26] And we'll do some Todd claymation. [01:52:30] We could have Todd as a character in a claymation scene. [01:52:33] That would be fun, wouldn't it? [01:52:35] Yes, I will try to record something tomorrow and get it sent over to you. [01:52:40] All right. [01:52:41] Yeah. [01:52:41] Just 10 seconds or so of audio, clean audio, the way you want to sound. [01:52:47] Okay. [01:52:48] Whatever you want to sound like. [01:52:50] Yeah. [01:52:51] Yeah. [01:52:52] I'll do it. [01:52:52] All right. [01:52:53] And are you going to wear the headphones for your avatar? [01:52:56] Sure. [01:52:57] Okay. [01:52:57] All right. [01:52:58] That's good. [01:52:59] Right. [01:52:59] That's consistent. [01:53:00] Yeah. [01:53:01] I think I will actually record through my system here. [01:53:03] It'll just start recording, and then I can send you that. === Preparing for Survival (04:48) === [01:53:06] Yeah. [01:53:07] Just keep the microphone, too, because that's going to be authentic. [01:53:11] Yep. [01:53:11] Yeah. [01:53:11] Do the same background, everything. [01:53:12] Yeah. [01:53:13] Everything. [01:53:13] All right. [01:53:14] Just going to record this. [01:53:16] Yeah. [01:53:16] Okay. [01:53:17] So I can already tell what I'm going to create. [01:53:20] I'm going to create a conversation between Avatodd and Fu Koff. [01:53:25] Oh, I like that. [01:53:27] I like that. [01:53:28] Yeah. [01:53:28] This should be loads of fun. [01:53:30] This should be loads of fun. [01:53:31] I can't wait for the next DTV episode. [01:53:34] That's going to be great. [01:53:35] All right. [01:53:36] Well, thank you. [01:53:36] Thank you, Todd. [01:53:38] And thank our audience. [01:53:39] Remember, you can catch all the other episodes at decentralized.tv.
[01:53:43] And I will be porting them all over to brightvideos.com also. [01:53:47] Beautiful. [01:53:48] So thank you for watching. [01:53:49] And thank you, Todd. [01:53:50] It's been loads of fun today. [01:53:51] We had a great show. [01:53:52] Always is. [01:53:52] Thank you, Mike. [01:53:53] I missed you, so this was good catching up. [01:53:55] This was good. [01:53:56] All right. [01:53:57] All right. [01:53:57] Have a great evening and take care. [01:53:59] Cheers. [01:54:00] All right. [01:54:01] Thanks, everybody. [01:54:02] Really had a great time with you here today. [01:54:04] Remember, our new online store, Bright.shop: if you want to help support the show, that's the easy place to shop and get your lab-tested, certified organic foods, and storable foods and personal care and everything like that. [01:54:16] All available right there at bright.shop. [01:54:19] So thank you for your support. [01:54:20] Have a great rest of your day. [01:54:22] Take care. [01:54:23] See you, Mike. [01:54:44] Yes, the world is getting crazy, but here at the Health Ranger Store, we're putting together a survival supply assortment for you. [01:54:54] If you go to healthrangerstore.com/survival, you'll see what we put together for you, including iodine and IOSAT. [01:55:02] That's a specific brand name of potassium iodide that's FDA approved. [01:55:07] Or we have the nascent iodine here, which is less expensive in terms of the iodine that you get. [01:55:14] These are available in case things go nuclear. [01:55:17] If that happens, it's clear that you will not be able to find any of this for sale anywhere. [01:55:22] All the inventories will be wiped out, like what happened after Fukushima in 2011. [01:55:27] So if you want to get your hands on some iodine, this is a chance to get it right now. [01:55:31] healthrangerstore.com/survival.
[01:55:34] In addition, we have many other survival items for you here, including some silver solutions, some spirulina available in bulk and at a discount, and then a large assortment of storable organic food that's laboratory tested, including our Ranger Bucket sets. [01:55:51] Here's a 195-day supply. [01:55:54] We've got the mini buckets, and we've also got number 10 cans available of freeze-dried fruits and vegetables and other things, like miso soup powder. [01:56:03] Here are some of the buckets. [01:56:04] There's a big variety available. [01:56:07] Here are some of the number 10 cans right here. [01:56:09] Remember, a lot of people are missing fruit. [01:56:12] They don't have enough vitamin C in their storable food. [01:56:15] So, you know, getting bananas and pineapples and strawberries, especially, again, certified organic, freeze-dried. [01:56:22] That is the highest quality with the highest nutrient preservation that you can get in any kind of storable food format. [01:56:30] All of this is available right now, and so much more. [01:56:33] Just go to healthrangerstore.com/survival. [01:56:37] And because the freeze-dried foods last for so long, you know, even if you don't eat them this year or next year, just keep them on the shelf. [01:56:44] They're going to last a very long time with good preservation, a long shelf life, and they will have value no matter what happens in the world. [01:56:52] Now, of course, I'm praying for peace. [01:56:54] I'm praying for de-escalation. [01:56:56] I don't want to see World War III break out, and I certainly don't want it to go nuclear. [01:57:01] But we're dealing with insane times and insane leaders and insane situations. [01:57:06] Who knows what could happen tomorrow or next week? [01:57:09] Disruptions could happen here in the United States. [01:57:11] There could be, you know, domestic attacks that disrupt supply chains here in the U.S.
[01:57:18] So stock up early, stock up now: get your emergency food, emergency medicine, iodine, anything else that you think you might need. [01:57:26] Get it now. [01:57:27] And by doing so, by shopping with us, you'll be supporting our platforms and our AI engines that we offer for free. [01:57:34] That's funded in part by sales from our store. [01:57:37] So shop with us at healthrangerstore.com/survival, and help yourself get prepared, and also help us bring you more free tools and platforms that can keep you informed no matter what happens in the world. [01:57:51] I'm Mike Adams, the Health Ranger. [01:57:52] Thank you for your support. [01:57:54] God bless you all. [01:57:55] Take care.