My Wife Shares Shocking Updates You Need To Hear!
Time | Text |
---|---|
Good night, dear friends. | |
Joining me right now is my better half. | |
I like when they say full disclosure, the incredible Lynn Shaw from Lynn's Warriors, who happens to be my wife. | |
And I wanted to put that aside to talk about some very, very critical developments that you should know about. | |
And who better than you, my darling, to tell us? | |
Bring us up to speed. | |
Where do we start? | |
Well, welcome, my big warrior, Lionel, Lionel Nation, truth, reality. | |
I really think it's very important that the public understand that an historic move was made at 4:30 a.m. this morning in Washington, D.C. And everybody needs to bear with us for a minute on why this is so important. | |
This big, beautiful bill that everybody's talking about, right, that most people are probably tuning out, right? | |
I don't think most people really care because most people are overwhelmed with putting food on the table and taking care of their children. | |
Senator Ted Cruz, who's been nothing but a proponent of helping mold child safety legislation in Washington over the past several years, was also a co-sponsor of the Take It Down Act, which we at the Warriors were recently in the Rose Garden at the White House with POTUS and FLOTUS to see signed, the first piece of child safety legislation since 1998. | |
And what he did a couple of weeks ago, nobody, not a Democrat, not a Republican, not anybody can understand. | |
He attached. | |
This is Cruz, not Trump. | |
This is Senator Ted Cruz, yes. | |
He attached to a broadband provision, holding hostage, basically. | |
I don't want to bore everybody with all the details, but he was holding hostage this broadband financing, you know, their getting their funds, unless, he said, they accept this AI moratorium, meaning 10 years where unregulated AI can just do what it wants, run away, train. | |
Nobody can touch them. | |
There can be no state laws. | |
There can be no federal laws, no anything. | |
Nobody could understand why he did this because we all know, I don't care if you're sleeping under something, you had to hear about AI, how it's happening, how it's escalating. | |
And also 10 years, so nobody could figure this out. | |
We came from the Rose Garden and the Take It Down Act a couple of weeks later to this. | |
And you were there in the Rose Garden when you say we were, you were in the Rose Garden. | |
Yes. | |
You were there. | |
We, as in the Warriors, were in the Rose Garden. | |
Yes. | |
Back it up a little bit. | |
You say that broadband can't get something. | |
Explain that. | |
And I know, I really know nothing about this. | |
So when you say... | |
Believe it or not, there are a lot of communities that still can't get internet or have sketchy internet. | |
This was to finance putting it in so everybody's on equal footing. | |
Well, Ted Cruz attached this AI moratorium to that, saying, listen, in order to get this, you know, financing for this internet for everybody across the country, we need AI because at the rate we're going, AI is going to dominate, control everything. | |
He tied it together. | |
Nobody could understand this at all. | |
The thinking, though, was, what is he doing? | |
It's never, it doesn't belong in this big, beautiful bill. | |
It's never going to go anywhere. | |
And long story short, it kept going up the ladder almost to approval, almost to approval until Senator Marsha Blackburn stepped in. | |
She first stepped in, which I'm still questioning, and she negotiated with Ted Cruz, let us reduce this from 10 to five years and give people some internet, right? | |
We still don't know all the behind the scenes because this just all played out in the last, you know, within the last 24 hours. | |
Now, as of 4:43 a.m., I believe it was, she adamantly put out, and this is what the public has to understand, taking it back: we need to ban any AI moratorium, because of all the work she has done. | |
I want to explain why this is so important. | |
I want to go back to this beginning because we're throwing a lot of terms just so that people understand. | |
The AI moratorium said that there would be, if this moratorium were held up, there would be no laws limiting, restricting, dealing with, addressing artificial intelligence for 10 years. | |
Yes. | |
Now, that means that for 10 years, everybody hands off. | |
No talk about this. | |
It basically gave a free pass, not a 10-year heads up but a 10-year head start, to do whatever you have to do. | |
10 years. | |
Imagine if somebody did that for, let's say, vaccines or red dyes, or if, on LGBTQ issues, there were no limitations on funding for puberty interruption or anything. | |
Pick the story you want. | |
Imagine if there was something that said for 10 years, it's going to go full throttle, unfettered, unencumbered. | |
It's unheard of. | |
Unheard of. | |
So Marsha Blackburn then seemed to sign on to that. | |
And you said, wait a minute. | |
She was, you thought she was a friend, an ally, a colleague. | |
You were shocked. | |
And people within your significant coterie of like-minded warriors and from various groups, you were shocked by this. | |
And you thought, what happened? | |
You were betrayed, stabbed in the back. | |
So then the next thing she said was, all right, I'll sign on. | |
I'll agree to a five-year moratorium, which is still a moratorium, still five years of unfettered, uncontrolled, unregulated, unwatched artificial intelligence, rules where you can't touch anything. | |
It's almost like immunity. | |
So then overnight at four or whatever, you told me, guess what? | |
She did a 180 and said, you know what? | |
I changed my mind. | |
I'm going back to what I originally said. | |
No moratorium, nothing standing in the way, as far as she's concerned, of any kind of legislation or regulation regarding AI. | |
So she went from a 10 to a 5 to nothing. | |
And we don't know why. | |
We don't know who got to her. | |
It's a flip that was good, but Cruz is not backing down. | |
No, but I want to explain something, the importance of this to the public. | |
In a 48-hour time period, truly, this was the first truly bipartisan movement I have really seen. We always say all of these issues we work on at the Warriors are bipartisan, you know, sexual exploitation, human trafficking, child safety. | |
Every group, individual came together, worked around the clock. | |
This is the importance of our hashtag Community Creates Change. | |
140 organizations presented a letter to them on the floor, including the Warriors. | |
We got everybody together, okay, to say this cannot happen. | |
Marsha Blackburn, who everybody generally loves, even if you're a Democrat, she's a Republican from Tennessee. | |
Nobody bothers Marsha, right? | |
She's a mom, she's a grandma. | |
They just let her be. | |
Everybody pushed back on her like you couldn't believe. | |
And I believe it was that pushing back, especially from the parents we rallied up, who've stood before Congress, whose children have been harmed by different social media platforms, and with whom she has stood and said, because a lot of the parents wrote this openly on their platforms, I stand with you. | |
I will help you. | |
I will protect you. | |
I will legislate in your child's name. | |
And she went against that. | |
Everybody pushed back on her and she reversed it. | |
Okay. | |
So that is the power of community. | |
I'm going to stay in the hopeful lane that we can do this with a lot of things. | |
But let's get to Senator Ted Cruz because, you know, he is not backing down from this. | |
So we're going to take the moment and we have victory today and tomorrow, but we must be prepared for the next step. | |
Now, a couple of things too. | |
You have, you showed me something: there are some of the most talented people, young warriors, if you will, who built some of the most fascinating, I don't know what the word is, interactive methodology, where you get an email and when you click something... explain this. | |
You showed it to me. | |
I can't even explain it. | |
But it was some of the most brilliant uses of technology to get the word out. | |
Tell us about that. | |
Yeah. | |
And that's the other thing I want to point out that this is why we need the public involved because we will be handing you these take actions where you just have to click on something. | |
All the work is done for you. | |
So in particular for this, we call it the AI moratorium ban. | |
Okay. | |
Received an email. | |
Click on this link. | |
Now, again, it's vetted. | |
We know who we're dealing with. | |
These are not just strange links from strange people. | |
When I clicked on that one phrase, click on link, just the link alone, it then said, now pick up your phone. | |
I got a phone call immediately. | |
Okay. | |
I picked it up. | |
It connected automatically. | |
It started with, you know, Ted Cruz's office, and it moved on automatically for you once you left your message. | |
Okay. | |
It rolled you over to the next office. | |
So you didn't have to look for numbers. | |
You didn't have to search out, you know, even do emails. | |
You didn't have to do anything. | |
We flooded it. | |
It provided the message. | |
All you have to do is hang up once. | |
Right. | |
All it did was take your number. | |
You obviously have to input your number at some point. | |
It was my organization and my phone number. | |
And then that goes to Ted Cruz or whoever it is. | |
So all of a sudden they are flooded with actual, like, robocalls, but he got them. | |
And it just was, as you've always been saying, the most powerful thing that anybody's ever seen. | |
And it was so wonderful to see these very, very smart young people. | |
And it's great to see really the smartest, but people also devout and focused on this. | |
Okay. | |
Not to interrupt. | |
No, but I just want the public to understand because I understand, you know, these issues. | |
People close their eyes. | |
They don't want to deal with it. | |
They may find it boring. | |
You cannot find it boring. | |
This is about children, our future adults, our future leaders, your grandchildren. | |
It's really about all of us. | |
Okay. | |
Remember that AI moratorium, that affected, we talk, you know, about children, right? | |
It affected everybody. | |
You have to remember that. | |
It affected the deepfakes. | |
So as we talk about children, I just want to make sure we point out it also was for all of society. | |
And I also want to point out that there are a thousand AI laws already on the books in the states. | |
This would have erased everything that they have been working on in the state level and just done away with them. | |
There'd be no protections for anybody with AI, any of the sexual content. | |
We can all agree there's some great things with AI that will help with certain things. | |
Healthcare, we always talk about the oil industry traditionally. | |
But as far as children and safety and families, that was our piece of concern for this. | |
But again, this AI moratorium ban is good for all of us in society. | |
So we'll keep you updated on that, what's going on, because I really feel like Senator Ted Cruz, well, we're hearing a little scuttlebutt that he's already lining up to present a separate bill on this. | |
This got kicked to the curb, so it's not in the bill anymore. | |
Meaning what? | |
Explain. | |
A separate bill for what? | |
Well, he'll introduce into Congress just a separate bill about all this. | |
There's a part of me that says he did this because nobody can understand why he did this. | |
This is his way of getting his own PR, believe it or not. | |
I don't know this to be fact. | |
He got it in there. | |
Everybody was talking about it. | |
Everybody was upset. | |
He got a lot of attention. | |
Now, it got kicked to the curb, but now he's going to start from the ground level up. | |
Who knows who's helping him back this? | |
So what you're saying is, he is going to be, initially... | |
He is going to be. | |
That's what I hear. | |
That's what I hear. | |
But let me finish. | |
He's going to be reintroducing, again, a version of the moratorium or something along those because he is in the pocket, it would seem, of big tech. | |
That's who he is worrying about. | |
You know, I don't want to talk about any other issues which really have nothing to do with what we're talking about here. | |
But I will tell you this. | |
There's been a lot of talk about certain, dare I say, lobbying groups in Congress, powerful lobbying groups. | |
There is nothing that can compare to big tech. | |
If ever you think they're not, explain a little bit how powerful they are and what you've seen up front. | |
I'm going to compare it, in my opinion, really to like a criminal organization. | |
They are so well-oiled, well-paid. | |
Lobbyists will get upwards of $2,500 to $3,000 an hour on Capitol Hill. | |
I go to Capitol Hill. | |
That's one of the reasons with the Warriors, any of the donations greatly appreciated, by the way, fund those travels to Washington, fund our work in Washington. | |
But they're getting up to $3,000 an hour, we hear, and they are relentless. | |
And they are like, they are not kind either. | |
They're very aggressive on the floor. | |
They see somebody walking in a hallway. | |
They will knock you over. | |
They grab that person. | |
But here's the thing. | |
I think, let's take Congress for a minute, right? | |
They work for us. | |
We're constituents. | |
I also think they're overwhelmed. | |
I know it's their jobs. | |
I'm just trying to think like maybe why things are slipping by everybody. | |
If you get in somebody's face constantly and you just, you know, like a bulldog and you're just at them and at them, they may just relent because they can't take that anymore. | |
I'm breaking it down very simplistically, but there's a lot of money also attached to big tech. | |
We have to realize this. | |
And, you know, people aren't in Congress forever. | |
Maybe they want to be on the board of a big tech company, work at a big tech company. | |
You have something to do with a big tech company. | |
Oh, yeah. | |
So this is why it is a little disheartening that I have to even think that way and dig deeper about people's motives. | |
But one must, to be successful, and we have to try to be one step ahead of big tech. | |
Big tech right now is ruling the roost. | |
Roost. | |
Can't even speak anymore after the last 48 hours. | |
Yeah. | |
With what's going down. | |
And also, I dare say the administration. | |
There's so many connections, you know, to big tech. | |
And I don't understand it; the concern is not for the children. | |
The concern is for global dominance. | |
And it's two separate things. | |
Now, a couple of things. | |
Fill people in. | |
You've talked to so many parents, so many people, so many, you've been at so many, not demonstrations, but marches and protests. | |
And give us an idea of what so many parents have dealt with specifically. | |
And what, is it AI or is it just big tech or social media or just vulnerable kids who found a means of accessing their vulnerability? | |
Because some people are saying, look, kids today, sure, maybe there are more of these cases, but whenever you have a new technology, there are new ways for very weak children to find new ways to express their own inferiorities or problems, sensitivities. | |
When rock and roll came out and people thought they were hearing satanic messages, there were suicides. | |
So dispel that. | |
What is the problem? | |
What have you seen? | |
And take us through some of the horrors you've seen directly. | |
Listen, you can't compare big tech to anything else. | |
You can't compare it to rock and roll. | |
Big tech has one goal, make money. | |
Who's the consumer? | |
The consumer is the child because those eyeballs will be for the rest of their lives a consumer. | |
So they keep going after kids, kids, kids. | |
This is nobody's fault. | |
This has been thrust upon society. | |
It's not the parents' fault. | |
It's not the kids' fault. | |
We were just given all of these apps and platforms and everybody's on technology. | |
And, you know, it was kind of cute in the beginning. | |
Think back, you know, 20, 25 years ago when we all started. | |
Think back, again, getting out there, a little bit boring but necessary, to talk about 1996 and what's usually thought of as the birth of the internet. | |
And we had something called Section 230, still there, of the Communications Decency Act. | |
It basically said of the platforms we had then, in 1996: | |
We're a newsstand, we're a billboard, we're not responsible for any third-party content. | |
That Section 230 has never been amended, changed, nothing. | |
So the platforms and the technology have been here, as I understand it, since the 1940s; all of this has been here. | |
However, as people became smarter, did more research, and the technology improved, it's been able to increase so fast. | |
I mean, don't you see how it's just speeding up? | |
It seems like in the last year, going in this direction. | |
And so therefore, it's feeding off of each other. | |
It's this runaway train. | |
That's the only way I can make the analogy. | |
And now, so we haven't had guardrails in the past. | |
And now we have almost, in my opinion, this new technology, this artificial intelligence. | |
What are we going to do? | |
How are we going to protect? | |
Let me just go through the harms. | |
So again, it was nobody's fault. | |
Everybody's on social media. | |
Remember MySpace? | |
I don't know what happened to it. | |
It went out of business or something. | |
But no, but that was like college kids. | |
And it was, you know, cool and cute and your friends. | |
But the predators are always 10 steps ahead, the criminals, and they figure out how to take advantage. | |
And we have seen, and you can say, in my opinion, that big tech themselves has definitely taken advantage. | |
They have lied repeatedly. | |
They don't have the safety guardrails. | |
They're not interested in helping keep kids safe. | |
They want the dollar sign. | |
And what happened in the meantime? | |
All kinds of dangerous algorithms, all kinds of things you can do from cyberbullying to sextortion to eating disorders to actual predators conversing directly with anybody. | |
It's an open door to the world. | |
And this is what we're dealt with. | |
And we're trying now. | |
We're slapping the band-aids on, right? | |
Trying to help everybody. | |
But we need to get in there, especially with AI and intervene to prevent because we learned. | |
It doesn't work now. | |
We're trying to catch up, slap the band-aids. | |
We got to get in there and do something to stop, not stop, but put guardrails on AI. | |
What would you like to see done ultimately? | |
Because you can't stop this. | |
Nobody, you've never said you want to stop this. | |
But I always think back to show, see, these folks made a big mistake. | |
They showed the world how fast it can act. | |
If you were to say something, let's say in a chat, in a post, in the old days during the time of COVID, if you were to say something that was not considered to be allowed, let's say something about ivermectin or hydroxychloroquine or whatever it was. | |
They moved in so fast. | |
It was like you never realized how many algorithms they had and how many means of filtering this. | |
So they let the cat out of the bag. | |
They showed us, this is how fast we can move. | |
So they turned around and they said, you know, there's just too many people online and we're not able to address this. | |
They're like, oh, no, no, you can address this. | |
You do it all the time because we never knew they could do it until now. | |
So what would you like to be done if you had your way? | |
Believe it or not, right now, people have to be able to open their eyes and listen and learn and discuss things. | |
Okay. | |
We're not talking about cute little robots, cute little this. | |
I don't have time to watch what my kids are doing. | |
I don't have time. | |
You've got to take the time because I really believe right now we are at a change. | |
This is a historical moment. | |
That's what I believe right now. | |
So people have to wake up and understand we live in a new environment. | |
We don't want to get rid of the internet technology. | |
No way. | |
It's not going away anyway. | |
How do we learn to leverage all of this for good? | |
You know what? | |
It starts with intervention education by talking about it. | |
And that means in your own home, you are going to have to be responsible for your children and talk to them and try your very best to take time to be open and honest with them and understand what is going on online. | |
I know it's additional work for everybody, but this is an American crisis, in my opinion. | |
These are our precious kids. | |
So I want more conversation. | |
I want more schools to have programs about this. | |
Kids can't read and write because they've just been forced online. | |
And that COVID period, and I'm going to say it, forced everybody to do schoolwork online, to shop online, to be online. | |
It was our only way of connection. | |
Right. | |
And they did such harm with that that our country will never, we will never be able to get out of that. | |
So it's up to us now. | |
People have been waking up. | |
We've got to get schools on board, individuals in their homes on board, and we have to have pushback. | |
So when the Warriors gives out a take action, okay, call your senator. | |
We need your name and your zip code to represent. | |
You need to participate in one shape somehow. | |
But I'd say in your own home, people have to really wake up. | |
You bought the devices for your kids, even yourselves, if you're an adult. | |
You must be aware of what is going on and wake up and discuss these issues. | |
How does AI work into this? | |
What is your fear about AI in particular? | |
So AI specifically, we'll start with an example of deep fakes. | |
So originally, all of a sudden there was a rash, which the media didn't really cover, of students in high schools creating nude images of other students and wreaking havoc. | |
Schools were not prepared. | |
First of all, everybody's embarrassed. | |
Nobody knew what this was. | |
And I'm talking in the last 18 months. | |
Schools, instead of knowing about this, many schools said, I never heard of this. | |
I don't know about this stuff. | |
We don't know what to do with somebody who created it, because free programs were put online, free things, like you have free ChatGPT. | |
There were free programs to build these. | |
It used to be that in order to do a deepfake, you needed a great computer. | |
You needed to do coding. | |
You needed to collect a lot of images in order to create one. | |
But now the technology is so good that out of one picture of somebody, which you can take off their Facebook, right, or their Twitter X, one image, because of the free programs, you can create this fake image and put them into sexual situations and videos that look very realistic. | |
So we had that flooding high schools, right? | |
So that should have been kind of a national PSA, in my opinion. | |
Now we have middle and high schools. | |
There's no protocol in place. | |
How do schools deal with this? | |
The case that was brought out and Melania Trump showcased with the Take It Down Act because she brought the young woman to the White House. | |
This was done to her in her high school in New Jersey, okay? | |
This student and her mother made a big fuss. | |
Most people sweep it under the rug. | |
They're embarrassed. | |
They don't want to be involved. | |
They don't know what to do. | |
They're all freaked out. | |
This woman and her daughter really called on this school, called them out. | |
They did a lot of media, media that would take them, and they talked about it. | |
They talked about it all the way to the White House. | |
And the school's answer? They left those images online. | |
The school themselves, remember, a lot of schools, and I'm digressing here a little bit, they're using shared calendars. | |
They're using shared technology. | |
So everybody's seeing this. | |
The school took four days before they would even address it, like, we're going to take this down, because it infiltrated the whole school technology system. | |
Then the administrators were pressed. | |
They said, well, we don't know what to do. | |
They had the student who did it, okay, a male student. | |
Well, I guess we'll suspend him for a few days and that'll be his punishment. | |
And the mother and this young woman said, not good enough. | |
And they took it to the media. | |
So that kind of started. | |
But the harm of it, we're hearing: people can't get a job, can't get into a school, when their social media is searched and nude images of them are seen. | |
But here's what I say about all of this. | |
There's so much of this going on. | |
Everybody should say, and I'm not kidding when I say this, those are fake images. | |
It started with celebrities, where people could make a lot of money off of nude images of celebrities, you know, uploading to porn sites and stuff. | |
But it has infiltrated now to everyday people. | |
And people are losing their minds, the mental infiltration. | |
She's a 13-year-old girl. | |
So what you're saying is we have to teach kids to say, repeat after me. | |
That's not me. | |
It's not me. | |
Don't get upset over something that's not you. | |
You have to learn. | |
It's not you anyway. | |
But we also, we know the current study, and remember the studies are always behind. | |
We know that it's about 60% now of kids in high school who routinely see deepfake nude images of people they know. | |
That's of people they know. | |
So where is this all heading? | |
Right. | |
My fear is everybody, it becomes so normalized, which I think it already is to a certain extent. | |
People are habituated, are not paying attention. | |
But I want to point out it is still, because there's still kind of this cloud of mystery with the public, with all of these issues, right? | |
It is not talked about enough. | |
It is not pushed back. | |
And big tech themselves is not held accountable. | |
They're the ones with all the money. | |
And people will say to me on a daily basis, why is this happening? | |
And it's because I'm afraid to say there's a lot of money being made. | |
They're not being stopped. | |
There's no accountability. | |
It's the money. | |
I mean, we like capitalism here in the United States. | |
I mean, it's going on around the world, all of this, but we're talking about right here at home right now. | |
So until people push back and understand this is detrimental to a young woman or a young man, you know, in high school, we have children harming themselves. | |
They don't know how to deal with this, or they have parents and guardians that yell at them or push back on them and tell them they're disgusting and they've embarrassed the family. | |
We cannot have conversations like that. | |
We cannot let these predators win. | |
I don't understand the mindset of everybody. | |
These are our precious children. | |
Why aren't we out in Times Square marching in the millions? | |
I mean, we're marching for all kinds of issues around here. | |
We're concerned about all kinds of things, but we don't march in the protection of our children. | |
I think the hardest part is for people still to understand what it is. | |
I've tried my best. | |
I've tried so hard to. | |
I try to explain sometimes: imagine there's a bot and your child is talking to that bot, which nobody really understands. | |
And you come in, you say, give me your phone. | |
And you remove, you either take that phone, destroy it, get a new number or whatever it is. | |
All of a sudden, your child gets a phone call from the bot. | |
Eventually, it's going to have the perseveration, the perseverance, the conscience, the tenacity, the genius of a 300-IQ human being. | |
It's not just a game. | |
It's not, it's so pervasive. | |
I mean, nobody really knows. | |
They still don't know because once this gets out, it's out of the, it's out. | |
It's out already. | |
That's why people have to really wake up right now, like 10 minutes ago, because people have to understand a bot is fake, but kids, and I'm going to say adults too, here on Lionel Nation, people are so craving connection. | |
Again, coming out of that COVID period where everybody got conditioned, being online more than normal, they're giving away their information. | |
They're talking to what they think is real. With the kids, it's called Character AI. | |
That's a free program for kids. | |
They could create their favorite character and bring it quote unquote to life. | |
And then they could talk directly, but they're feeding it information. | |
Everything we're doing is feeding information. | |
But also, what people don't understand, remember, with classic AI we're not really there yet, but soon, when you have that recursive self-improvement, it's going to be able to write its own code. | |
Yes, and it's also going to scan and have access to the metadata of every single thing a child has put in. | |
And it will be able to grasp, to get an overview, a psychological kind of, not a post-mortem or an autopsy, that's the wrong term, but kind of like a profile. | |
And it will find out what the child is most vulnerable to. | |
It also, remember, it learns this is not an app. | |
This is a sentient, I want to say human being. | |
That's what people don't understand. | |
But then it takes on a personality, for lack of a better word. | |
And as the child, as an example, is conversing and this bot, this fake thing, because it's not real, learns the personality of the child and is conversing. | |
And the child expresses, you know, I don't like my mother. | |
I don't like my father, my sibling, my sister, my brother. | |
This is what's happening now. | |
That bot is directing them, right? | |
Harmfully, you know, telling them to do harm. | |
It's not real. | |
It's again, where does this stop, right? | |
I don't know, which is why we have to educate. | |
We have to talk about it. | |
You have to talk to your kids. | |
And you've got to get the kids off the devices. | |
I mean, even if you start with 15 minutes a day, you've got to get them into other activities and get off these computers. | |
We do have truly internet addictions going on, right? | |
We truly have, the average is eight hours a day that a teen is spending online. | |
Kids under 12 are spending an average of five hours a day online. | |
So, and that's what we know about. | |
We've got to get them into other activities. | |
And here's the hopeful news, though. | |
We at the Warriors are working with such terrific groups that have come forward, I'm going to say the 30 and under. | |
Whether it's the Heat Initiative, I'm going to call them out because they do terrific work, or Design It For Us. | |
Also, ENCODE. | |
These are brilliant young people that are angry now that they were used as guinea pigs and thrown into all of this, right? | |
And they see the harms and they don't want any younger children to have to face what they were brought up on. | |
So I see tremendous success and work coming out of young people. | |
I'm so proud of them. | |
I'm honored to work and walk alongside them. | |
They are the ones right now doing this work that I really think can flip the minds of young people. | |
Because after all, peer-to-peer is what resonates as opposed to me preaching to somebody. | |
I can educate parents and grandparents and communities, but we need the young people themselves to educate the young people. | |
Well, what can somebody do right this moment who wants to learn more about this and wants to join Lynn's Warriors, who wants to join the fray? | |
What can they do and what do you recommend? | |
Well, first of all, always follow lynnswarriors.org, our website, for updates, and see the organizations I list. | |
I mean, you can go to my Facebook, my X account. | |
Every day I put out information. | |
I share information of well-vetted organizations such as I just mentioned, ENCODE. | |
I just did a terrific interview. | |
It's up now: Fairplay for Kids with Josh Golin, on our Warriors YouTube channel. | |
Look at what they do. | |
They're the only organization in the United States that is geared toward fighting the marketers who go after our kids. | |
They have a wonderful toolkit. | |
I listed it in the body of the video. | |
Look for those free resources. | |
There's also a tremendous movement starting through Fairplay, which we're part of and advocates for, and we work on phone-free schools across the United States. | |
That toolkit is also listed under fairplayforkids.org and also under the Warriors. | |
So you've got a lot of free resources. | |
You have to spend a few minutes looking for those resources. | |
And you can do anything. | |
You can take the phone-free schools toolkit, present it, get another parent, or do it yourself. | |
Go to your child's teacher. | |
Say, I want to get this started. | |
There's no federal regulation with this. | |
It's a very grassroots effort. | |
You've got to introduce it, present it to the school, and carry it forth. | |
And the studies coming out of phone-free schools alone show the kids are doing better academically and socially. | |
Teachers are able to teach. | |
So I want to point out there's a lot of good work. | |
And for policy, you've got to go to ENCODE's website. | |
It's there. | |
But to make it easy for everybody, one-stop shopping, you got to follow the Warriors. | |
You got to listen to some of these interviews where I have experts that explain it. | |
And that's what I would recommend to everybody. | |
And you can't believe the number of parents who have been forever shattered. | |
Shattered by everything from... | |
Okay. | |
Selena was 11 years old, two years ago, and she became addicted. | |
And I speak freely because I work with her mother, Tammy. | |
I have permission to speak about it anytime I want. | |
I've marched alongside her. | |
She became addicted to the internet. | |
Tammy was a single mom working hard. | |
Selena and her older sister, 11 and 14, were in the house together. | |
And Tammy, the mom, didn't realize her daughter was getting drawn deeper and deeper into the internet. | |
But predators were talking to her and bullying her. | |
So I just want to finish that with saying that I do everything I do for Selena and all the children. | |
Selena did away with herself. | |
You know, we have algorithms here too, so I have to be careful with my words. | |
I can't even bring myself to say it: how does an 11-year-old, a baby, do that to herself? | |
But Tammy Rodriguez, her mom, who is a very quiet person, has gone to Washington, to Congress, about this issue. | |
We gladly walked with her peacefully protesting in front of Meta headquarters here in New York. | |
She doesn't want any other mother to have to go through this. | |
And Selena's sister, Destiny, then tried to harm herself because she thought it was her fault. | |
So everybody has to understand these issues affect all of us. | |
All of us. | |
But all of us must stand up for our children. | |
We cannot hand our children over to criminals. | |
And that's what we do our work for. | |
But really, I urge everybody to follow lynnswarriors.org or write to me, ask a question. | |
I'll direct you towards resources. | |
Please, lynn at lynnswarriors.org. | |
We must come together as a community and address this. | |
And I will have the links accordingly. | |
Again, thank you, thank you, thank you. | |
There is nobody, and I'm not saying it merely because you're my wife and I love you, but I have not seen anybody who knows this or who can speak as eloquently and as comprehensively as you, because I know firsthand you live and breathe this. | |
This is a 24-hour, not obsession, but a calling and a focus that I don't think people really can understand unless you see it firsthand. | |
So thank you, my darling. | |
Thank you, but I want to throw one other thing in, please. | |
Indulge me. | |
I want to thank everybody who has donated to me, who has followed along, who has sent me things, in-kind services when we do our silent auctions, anything. | |
Thank you for being such a warrior. | |
That is what empowers me in this very tough work. | |
We're talking children here. | |
We're talking families. | |
But your participation empowers me to do bigger and to do better. | |
And I just know that all of us can create a difference because I'm seeing it. | |
So thank you to everybody. | |
And thank you to you, my wonderful husband, for even wanting to showcase this and talk about it. | |
Because, you know, a lot of media does not, but we want everybody to get the truth and reality. | |
And we want everybody to be a warrior. | |
Absolutely. | |
All right, my darling. | |
We will talk again and talk soon. | |
And again, thank you. |