When uncertainty strikes, peace of mind is priceless.
Dirty Man Underground Safes protects what matters most.
Discreetly designed, these safes are where innovation meets reliability, keeping your valuables close yet secure.
Be ready for anything.
Use code DIRTY10 for 10% off today and take the first step towards safeguarding your future.
Dirty Man's Safe.
Because protecting your family starts with protecting what you treasure.
Disaster can strike when least expected.
Wildfires, hurricanes, tornadoes, earthquakes.
They can instantly turn your world upside down.
Dirty Man Underground Safes is a safeguard against chaos.
Hidden below, your valuables remain protected no matter what.
Prepare for the unexpected.
Use code DIRTY10 for 10% off and secure peace of mind for you and your family.
Dirty Man Safe.
The storm is coming.
Markets are crashing.
Banks are closing.
When the economy collapses, how will you survive?
You need a plan.
Cash.
Gold.
Bitcoin.
Dirty Man safes keep your assets hidden underground at a secret location, ready for any crisis.
Don't wait for disaster to strike.
Get your Dirty Man safe today.
Use promo code DIRTY10 for 10% off your order.
The most important, the most existential threat to our civilization is artificial intelligence and specifically AGI or Artificial General Intelligence.
And I am telling you this is a waste of time on my part to try to even explain this to somebody.
But let me ask you a question.
What happens if you find out that somebody has created an AI deepfake photo of your daughter, nude, depicted in some horrible sexual contortion, but it's not real?
It might be her head, her face, it might be an image.
But the act itself, the particular act which is depicted, did not occur.
It was created by virtue of what have you.
What do you do?
What do you call that?
Is that against the law?
Should it be against the law?
What is it?
Remember one of Lionel's laws.
The law always lags behind technology.
When I was a prosecutor, years ago, we didn't have anything called stalking.
Nobody understood it.
Stalking?
You can't charge somebody for being a pest.
Well, times have changed.
The law has to give way to new forms of pathology, new forms of insidious criminality.
Now, let's look at this.
What is it? We used to call it kiddie porn, child pornography.
It's now called CSAM, Child Sexual Abuse Material.
Now, it is not under the protection of the First Amendment.
It is not under any of the usual considerations that the Supreme Court has used regarding legitimacy of message.
It is basically the depiction, the memorialization, of an act against a child, and the depiction itself is the crime, in addition to the underlying act.
So the child has to relive this.
You also want to stop it from being commercialized, or actually put into the stream of commerce.
Because that will disincentivize this from happening again.
But every time this is recorded, every time somebody looks at this, this is a new crime.
It is a memorialization.
It is a crime scene, is what I'm saying.
It's different.
And it has weird forms of variations.
For example, there have been people who have taken pictures of murders, lynchings, or robberies, things that have occurred.
And some have said, well, shouldn't that be under the same consideration?
And one can say, well, maybe that's newsworthy.
I don't want to get into the finer gradations.
But even something as simple as this does have some very interesting, perhaps, First Amendment expression aspects to it.
But we're going to put that aside.
Now, your child calls you, and by the way, this happened in Westfield, New Jersey.
This is a story which I will refer you to.
It's very, very simple.
You can read it.
A school in Westfield, New Jersey, investigated after a student allegedly created AI-generated nudes of girls.
And to make a long story short, let's assume your daughter calls you, your daughter's 14, and says, Mom, Dad, there are pictures being circulated of me, not only nude, but let's make it even worse, or just nude.
What do you do?
Question number one, has there been a crime committed?
Answer, no!
No!
Not that I know of.
No crimes on the books.
Why?
This is not child sexual abuse material.
Now, I don't want to get into the idea, and believe it or not, you might think this is pretty cut and dried.
It's not.
Is a picture of a nude child in and of itself problematic?
Is that child sexual abuse material?
Well, not really.
Not necessarily.
This has been problematic.
Years ago, when you would go to a drugstore or a one-hour developer, sometimes those people would call the police and say, listen, I've got a problem here.
I got these pictures.
I don't know if they're okay.
It's a bunch of nude kids.
Well, it turns out it was a nude beach.
It was a picture of the parents at a nude beach taking pictures of people who were nude at a place where nudity was legal.
Is that okay?
Babies in bathtubs.
Babies in bearskin rugs.
It can get tough.
Okay, that's a five-year-old.
What if it's a 13-year-old girl?
What if it's 15?
I don't know.
Even that was confusing for the law, because there was no sexual activity occurring other than the nudity.
So again, gray areas.
But in this particular case, what is it?
It's not a real picture.
It's not real.
Good luck trying to tell your daughter, honey, this isn't you.
Yes, it is.
This is my face.
They're going to think it's me, but it's not you.
But they're going to think it's me.
What do we do?
Here's what I suggest.
We need to look at this as a form of libel.
Why?
Libel, defamation, is basically a false statement, a statement asserted as true which is in fact false.
A statement of fact which is false, which hurts, defames, which basically distorts.
It puts you into a false light.
Embarrasses.
Depending upon how it's done, it can also be a form of extortion.
It can be revenge porn, which is another issue.
Fraught with problems, because in that particular case, who owns the image?
If I own it, if I take a picture of you, and it's you being nude, and I took the picture, and let's assume you knew I was taking it, I own it.
And if I'm putting it out there, one could argue that's mine to disseminate as I see fit.
Now, if I'm trying to extract something from you, that could be extortion, compulsion, whatever it is.
Again, not to confuse things, but I want you to understand the complexity of it.
In this particular case, legislatures need to come out and say that when you put out a picture that falsely depicts somebody, and it's not a parody, that is actionable.
We've got to carve First Amendment exceptions to it because remember the old parody exception and the old Jerry Falwell and Larry Flint case years ago.
The Supreme Court said that parody and exaggeration are protected.
This is different though.
This is almost two things, libel and the intentional infliction of emotional distress.
I'm doing it for two reasons.
Number one, I'm spreading something which is false, which causes damages and affects reputation; and number two, depending upon the context, it might fall under this intentional infliction of emotional distress, because I'm doing it not merely to show it, but also to exact some type of harm from you.
That's a factual consideration.
My point for this is, do you see how gray these areas are?
And I've been talking about this forever.
We're going to go a step further.
Not only are we going to talk about this, but let's say I'm able...
In addition to creating these particular images, I'm able to create a body form, a mannequin, an animatronic, of a baby, of a child, of an infant, that is so realistic.
Artificial intelligence that can show horripilation and skin blushing and, I mean, it's just...
Assume, arguendo, that it is absolutely lifelike.
And I sell it so that people can do horrible things to it, but it's not real.
But it's intended to entice people to do acts upon it, which, if performed on a child, would be against the law.
What is that?
That's a thought crime.
I would basically be prosecuting you for thinking something.
If I sold a thermos, one of those tartan kind of thermoses, and you used it for a variety of things, am I going to take that off the market also because I know what you're using it for?
It's a thought.
I'm penalizing your thought.
Understand the gray areas of this.
We can't throw away expression in the First Amendment just because the subject matter is problematic.
It's like Rashida Tlaib.
People want to censure her and kick her out of Congress because what she's saying is negative to Israel.
I got it.
What if she was saying the same thing but against Nazi Germany?
That's okay.
So it's not the subject you address; it's not what you're saying.
It's whether the subject matter is something that we like.
I have said, and I believe: if the First Amendment is ever lost completely, it will be lost not because somebody came and wrested it from our control, but because we would have given it away, because we didn't know enough about it to pay attention to it.