The DARK SIDE of AI ... a Mental Health "Mind" Field
When people are disappointed by the fact that the world isn't responding to them the way they expected because their internal mental models are all jacked up, you know, because they're woke or whatever, then they can't handle it and they just start thinking about suicide.
So there's a story now involving ChatGPT that says there are over a million people a week who discuss suicide with ChatGPT.
A million people a week who have admitted suicidal tendencies in their chats with ChatGPT.
That's incredible.
The story even said that around 0.07% of ChatGPT's weekly users, which works out to over half a million people because it's got such a massive user base, have messages that indicate, quote, possible signs of mental health emergencies related to psychosis or mania.
Whoa.
And that 1.2 million active users exhibit behavior that indicates heightened levels of emotional attachment to the chatbot.
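Quick back-of-the-envelope math on those figures, by the way, assuming the roughly 800 million weekly users that OpenAI itself has publicly claimed. That user-base number isn't quoted in this story, so treat it as an assumption, not a figure from the report.
0.07% of 800,000,000 = 0.0007 x 800,000,000 ≈ 560,000 people, which is the "over half a million."
1,200,000 / 800,000,000 ≈ 0.15%, so the emotional-attachment group is roughly twice the size of the psychosis/mania group.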
That's frightening.
I mean, what's with these people?
They're like, hello, ChatGPT.
I need you in my life, but I hope to die.
How's that to start off a conversation?
Like, I love you.
I need you.
Can you tell me how to kill myself, please?
Oh, my God.
Well, the reason this is happening, I believe, I mean, I'm not a mental health expert, just to be clear.
I'm not mentally unstable enough to study that area and to understand it.
Let me tell you.
But my guess is that what's happening is that people's worldview is fracturing.
There's a psychological fracturing that's taking place because the things that people believed are falling apart.
And part of this is because of liberalism and wokeism, which had become this really massive delusion of just insane, idiot beliefs that like a man can become a woman or that biological men should compete in women's sports.
And that was taking over.
You know, that was accepted by almost all the sports authorities, whatever those groups are, like the NCAA for college sports, even the Olympics, and the professional leagues and everything.
There were men competing with women in boxing matches and wrestling and like martial arts and cycling and swimming and everything.
It's insane.
But that was accepted as real.
And that delusion is crumbling.
And so the people who built their lives around those false beliefs, they're having a psychological fracturing event.
There's got to be a better word for that.
Some of you psychiatrists listening, you probably have a word.
You probably have a whole chapter on this in your books, some kind of psychological decoherence event.
Again, there's got to be a better term.
I just don't know what it is.
I don't know, a psychological disembodiment or something where people are losing their minds because, again, their worldview is all being fractured.
And that's not only happening to the transgenderism pushers who are coming to realize that everybody in the world disagrees with them.
I mean, every reasonable person.
But it's also happening because of all these job losses due to AI.
So a lot of people had a plan.
Let's take a younger couple in their late 20s or early 30s.
We gave an example here from the Wall Street Journal story, a guy who's 33.
He's got a wife and kids.
And I'm sure they had their whole life planned out for them by their financial planner.
And, you know, you're going to invest in 60% stocks and 40% bonds.
And then you're going to buy this life insurance policy and it's going to compound.
And, you know, we're going to run the numbers and you're going to get a 30-year loan.
Sorry to use this annoying voice, but this is the way that financial planners talk, in my opinion.
With apologies, if any of you listening are a financial planner.
I'm not mocking you.
I'm just mocking the ones that I've heard.
It's just like, oh my God, this is the most boring, stupid waste of time I've ever heard.
You know, let's get a 30-year loan on your house, and then you're going to make payments on your house, and you're going to gain principal.
And it's like your whole world is based on a whole set of assumptions that are just crumbling by the day.
And also, one of my big beefs with all financial planners is they never take into account the collapse of the dollar.
Never.
You know, all their plans are like, in 25 years, you know, you're going to be a millionaire.
Yeah, in 25 years, a million dollars will buy you a cheeseburger.
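Just to put rough numbers on that compounding erosion, purely for illustration, assume 10% average annual inflation. That rate is my assumption here, not a figure from the story.
Real purchasing power of $1,000,000 after 25 years = $1,000,000 / (1.10)^25 ≈ $1,000,000 / 10.83 ≈ $92,000 in today's dollars.
So even without a full collapse of the dollar, steady double-digit inflation quietly strips away most of what that "millionaire in 25 years" plan is supposed to deliver.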
Okay.
So you're not gaining jack.
You know, in 30 years, your house is going to be worth $10 million.
Yeah.
Right.
But the dollar won't even exist.
And real estate's probably going to collapse.
And you're not going to be ahead at that time.
No, because bad stuff's happening between now and the year, you know, 2055 or whatever.
I'm shocked at people that buy five-year investment vehicles or like five-year treasury bills or whatever.
It's like, you have faith in five years from now?
You think you know where this world is going to be in five years?
Are you insane?
I'm having a hard time figuring out five months.
And I'm a high IQ individual.
I'm usually very good at seeing what's coming.
And the future that I'm envisioning is pretty cloudy right now because the big factor that we cannot predict is the rise of artificial intelligence, as it's called, AI, even though it's not really artificial, but that's a different topic.
But the rise of AI makes things totally unpredictable in terms of what's going to change.
And it also means the future in five years will not resemble the past.
So I laugh.
I mock people who have a 10-year plan or a 30-year plan.
This is not the 1950s.
Okay.
You know, in the 1950s, things weren't changing that fast.
You know, you could get a job.
You could work your job for 30 or 40 years or whatever.
You could retire at 55.
You could buy a house.
Didn't cost that much in terms of the number of work hours that it took to buy the house.
You could live in a nice neighborhood with a white picket fence.
And if you were a man, you were probably the single income earner and you had a wife and kids and you could support them all on one income.
And you could buy a car and you could buy a house and you could buy food.
You could buy all kinds of food.
The grocery stores were loaded with amazing food and it was all non-GMO, by the way.
It was all non-GMO.
Wasn't all organic, but it was all non-GMO.
And the food was cheap.
Nobody went hungry in America.
I mean, almost nobody at the time.
Because food was cheap.
Fast forward to today, things are changing like that.
Just like the woman in that story who was fired from Whole Foods: she got a text in the early morning.
Don't come to work today.
Yeah, you're done.
We're going to mail you the stuff on your desk.
And what the hell is this thing on your desk, by the way?
It's like, what did you put in the drawers at your desk?
We don't know, but we're sending it back to you, whatever it is.
I think some of that food has been in there for six months too long.
We're going to send it back to you, mold and all.
I tell you what, anybody who says that they can see five years in the future right now is either a liar or an idiot.
There's no way.
There's no way.
I mean, you can talk about likelihood.
You could talk about percentages, maybe.
But even then, like, you know, seven years ago, I was predicting the end of America as we know it by the year 2025, which is almost over.
And that prediction might actually turn out to be a little bit premature, probably.
We'll see.
But there's no way I would make a seven-year prediction now.
I can't tell you what's going to happen in 2032 because I don't know how we're going to make it through 2028.
You know what I mean?
Things are changing so rapidly.
It's unbelievable.
Oh, also, I forgot to mention, getting back to the story here about ChatGPT, that over a million weekly users are talking about suicide.
There's another story linked here that says ChatGPT to allow porn.
So apparently, let's open that story and see what that says.
We'll soon allow a wider range of content, including erotic material for verified adult users.
So that's sexually explicit material, which until now has been banned on OpenAI's platforms.
But, you know, they're seeing the moolah, the moolah.
If they can show naked people, then what do you think is going to happen to the mental health problems of the people who are already emotionally attached to ChatGPT?
Huh?
So let's see.
Sam Altman, who I don't think is a good person.
He's the head of OpenAI, the company behind ChatGPT.
He said they're going to make ChatGPT, quote, more useful and enjoyable for adults.
We're going to relax the restrictions that were put in place over mental health issues, he says.
We're going to treat adults like adults.
Well, there you go.
So online users are calling ChatGPT, quote, AI OnlyFans. AI OnlyFans.
Okay.
So now you're going to get apparently, I guess this is coming soon on ChatGPT.
I rarely use ChatGPT.
I sometimes use it to generate logos and images, but I use my own AI engine for everything else or almost everything else.
But pretty soon, I guess you're going to go to ChatGPT and you're going to be able to type in like some porn scene and it's just going to show it to you.
And you know people are already mentally ill.
I mean, I don't want to get graphic here.
I'm not going to, but we've already talked about some of the weird, perverted craziness in our society and all the furries, the furry people, the people who dress up like dogs and have sex with each other as animals.
Yeah.
It's like wearing like leather dog face masks and stuff.
What the hell, man?
You unleash those people, who already have mental health issues, on some kind of porn ChatGPT?
My God.
I mean, is this part of a depopulation agenda?
Seriously, because what is going to happen to these people?
You're going to find some of these pervs like dying at their desk because they forgot to drink water.
You know, they were too busy typing perv prompts or whatever.
This is a real sick side of AI.
And I do not support that, obviously.
I mean, I use AI technology to promote knowledge, to empower people with knowledge.
I mean, if you use the AI tools that I've put out there, number one, they're all free.
I mean, go to Brighteon.AI.
And if you even try to ask it to generate a porn image or something, it will laugh at you.
What are you talking about?
No, ask me a question about vaccine safety.
Ask me a question about cancer cures.
We don't do porn.
That's kind of a waste of time anyway.
Come on.
Or you can go to vaccineforensics.com, right?
That's the new site that I just launched.
Or go to Censored.news.
This is about knowledge.
This is about decentralizing knowledge.
But ChatGPT wants to monetize porn.
AI porn, no less.
You know, going back in time, I remember reading an article in PC Magazine many years ago, back in the 1990s.
Remember dial-up bandwidth, when you had "You've got mail," you know, America Online, and they were mailing out billions of CDs to every computer user that said, like, a thousand hours free, you know.
What do you mean, a thousand hours free?
Because you would pay by the hour to use AOL.
Do you remember that?
Are you old enough to remember that?
And you would dial up with your modem, you know?
You'd plug your phone line into your modem, and your modem would call AOL, which had a big bank of thousands of local numbers that you could call.
And then you dial in, right?
And then you have a connection.
And then you could check your AOL mail or you could read the different forums.
Well, I remember there was an article in PC Magazine in the late 1990s that said AOL's success was built on adult chat forums.
The people would dial up and they would spend hours and they would get billed.
In those days, you got billed by the hour.
It's like, I don't know, $5 an hour or something.
Maybe it was only $4 an hour, but I mean, think about paying by the hour to use the internet back in 1990s dollars.
That's a lot.
But the people were desperate to do it because they had adult chat forums.
And I want to be clear.
All the forums on AOL were just text, just text.
You could not even, you couldn't post images.
It was all text.
But even with text alone, it was highly, highly addictive to certain people who became highly dependent on AOL.
And they would spend hundreds of hours in a month.
And they would have these insane multi-thousand dollar AOL connection bills because they wanted to have a, you know, a perv chat with somebody on the other side who was pretending, a dude pretending to be a woman, you know?
Because you don't know, it's just text, right?
Who knows?
Well, I never wasted time doing that kind of garbage.
I was never a fan of AOL.
I was visiting websites and building websites in the early 1990s and registering domain names early on.
That's how I got so many amazing domain names, by the way.
I was buying them in the 1990s.
You wonder how I got all these domain names?
Yeah, that's how.
But the adult interests powered AOL to profitability.
And that's now what's going to power ChatGPT, apparently.
So I shudder at the thought that ChatGPT is going to bankroll basically the funding of its super intelligence systems that will replace humans by serving up AI porn to mentally ill addicted humans who will part with a lot of money to get their perv fix through ChatGPT.
That is a messed up cycle of despair.
If you thought gambling was an addiction for some people, wait till ChatGPT serves up whatever porn a user is demanding.
All their fantasies about dressing up as a furry and humping a Volkswagen vehicle.
Oh, they can have unlimited Volkswagen sex porn on ChatGPT for the people that think they want to hump cars.
I'm always joking about that because it's sick.
And, I mean, it's also funny.
And there are things that we've all seen online that we can never erase from our memories that we really wish we had never seen.
And one of those things that I have seen, and I don't even remember who sent this to me, but it was clearly out of Europe somewhere.
It was a video.
It was a drive-by video of an older guy standing next to a gasoline pump at a gas station with his pants down, like literally right in front of the gas pump.
And he had the gas nozzle shoved up his rear with his pants down in front.
And whoever was taking the video is driving by.
And this was before AI video, so I know it's not AI.
And my God, I wish I could delete that.
I never want to see that again.
I did not want to see it in the first place.
I didn't ask for it.
Some asshat sent it to me and thought it was funny.
And now I can't get rid of that stupid scene.
I don't want to see that stuff.
But I guess, you know, the perfect slogan for whatever gas station would be, if they offered food also, it'd be like, eat here, get gas.
That would be perfect.
But I never want to see that kind of stuff again.
And I can only imagine what crazy horrors ChatGPT is going to serve up for these people that they're going to catapult out all over society to say, oh my God, look at this.
Isn't this funny?
Isn't this horrifying?
Isn't this bizarre?
No, that's sick.
You sick F. That's sick.
Okay.
Don't do that to animals.
Don't do that to vehicles.
Stop that stuff.
Mental illness has run amok in our society.
So this is the dark side of AI that's really going to serve as a predatory kind of function.
And imagine, imagine that a lot of these people are going to be workers who were employed until recently but have now been replaced.
I mean, let's say they used to work at Amazon.
They were in middle management.
Now 30,000 of those people got pink slips.
They're sent home.
What are they going to do at home?
Can't find a job because nobody's hiring because of AI replacements.
What are they going to do?
Oh, fire up ChatGPT, you know, custom perv porn for 20 hours a day, you know?
My God.
So you're taking people who used to be productive in society and you're taking away their jobs with AI and then you're feeding them dark AI or at least making it available to them.
I mean, obviously it's a choice, but there's a lot of mentally ill people out there who are going to choose very unhealthy exposure through ChatGPT.
Okay.
And that's sick.
And, you know, who was it that told me this a few weeks ago?
Oh, yeah, it was a reporter for the Epoch Times that was quoting me for a story.
And he said that he had interviewed a guy, an entrepreneur who ran, what do you call them, one of these relationship AI chatbot services, with characters, mostly young female avatar-type characters, even anime-style ones out of Japanese graphic novels and things like that.
And that you can sign up and pay a fee, apparently, and then you can have your own, you know, your AI girlfriend or whatever, your AI girlfriend chatbot.
And these are highly addictive.
And there was a survey recently that said some shocking percentage, I don't know, it's like a quarter of young men have had a relationship with an AI personality or an AI girlfriend of some kind.
A quarter of them.
What?
How do you even have a relationship with a chatbot?
You know, I mean, come on, it's AI-generated content.
You know, it's a cartoon avatar.
It's not even real, but it doesn't matter to a lot of these people.
They fall in love with the machine.
And that has to just magnify whatever mental illness they already had.
Now it's on steroids, you know?
And these companies enable this.
And, you know, the regulators, you know, are so far behind the curve on this.
And I don't know what the right answer is.
You know, I tend to be anti-regulation for the most part, but this probably is an area that needs some kind of common sense approach.
Like, you know, I mean, should it be legal for companies to offer like AI girlfriends and AI porn to nine-year-olds?
Yeah, probably not.
That's probably not healthy for society.
Or even at any age, is it okay for an AI company to offer a relationship forming chat personality that's supposed to draw you in so you have more hours of engagement so they can hike the subscription fees and make more money while you're burning your time and wasting your life away, falling in love with a freaking machine?
Like, that seems highly predatory to me.
I mean, I know the companies would call it, well, it's just a form of entertainment.
It's just entertainment.
You know, it's just freedom of speech.
People just want to be entertained.
They want to talk to somebody.
That would be their defense.
Okay.
But at what point does it become predatory and manipulative because the AI engine is given a prompt to sort of seduce the person into a long-standing relationship in order to maximize subscription profits and usage time?
See, to me, that seems highly unethical, highly manipulative.
Probably should be illegal.
But we'll see where that goes.
I mean, see, that's the dark side of AI that is frightening.
And I want to be clear that there's nothing in my universe that would ever even come close to that.
What we do, again, I produce AI tools that are designed to empower and uplift humanity with knowledge.
It's never about entertainment.
It's never about addicting people to the engine.
In fact, our engines don't even have a memory.
I mean, you type in a prompt, it answers your question.
It's like, done, you know, what next?
But it doesn't remember what you just said five minutes ago.
It's not a relationship.
It's a research tool.
It's an education engine.
That's what I'm into: sharing knowledge, uplifting human lives, and helping to end human suffering.
I want people to become aware of how to reverse cancer, how to reverse diabetes, how to be happier, healthier, more abundant, how to improve your life, how to improve your cognition, etc.
Not how do we suck you into a girlfriend chatbot with a cartoon body.
Our world's about to get twisted, is all I'm saying.
It's about to get twisted.
Your body deserves high-quality vitamin C. Groovy Bee delivers pure, highly bioavailable, non-GMO ascorbic acid you can trust.