The “Digital Voyeurism” Trap: How New Florida AI Laws Could Land You in Prison
- Josef Mitkevicius
- Nov 30
- 7 min read

It used to be that “seeing is believing.” In Florida law, that’s no longer true.
For years we’ve warned clients about “revenge porn” – real intimate photos shared without consent. That was bad enough. But with Florida’s 2024–2025 updates, the ground has shifted again.
The Legislature has now gone after AI-generated sexual images – even when the scene never actually happened in real life.
Florida has:
Expanded Sexual Cyberharassment and related laws to cover deepfakes and “altered sexual depictions” – realistic AI images or videos that paste someone’s face onto a sexual body or situation they never agreed to.
Created and expanded felony offenses for generating, soliciting, promoting, and possessing with intent to promote these images under Fla. Stat. § 836.13, as amended by HB 757, effective October 1, 2025.
At Mitkevicius Law, we’re seeing the beginning of what I call “Digital Voyeurism” prosecutions: cases where the State treats an AI-generated nude on your phone almost like a secret bathroom video.
If you use AI tools, trade memes in group chats, or have any kind of “joke” deepfake on your phone, you need to understand where the lines are now.
The New Law: When “Fake” Becomes a Felony
Florida didn’t start from scratch. We already had:
Digital Voyeurism – § 810.145
Secretly recording someone in a place where they have a reasonable expectation of privacy (a bathroom, bedroom, tanning booth, etc.) is digital or video voyeurism; distributing those recordings is digital voyeurism dissemination. Both are crimes.
Sexual Cyberharassment – § 784.049
“Revenge porn” laws making it a crime to post real intimate images without consent.
The problem was deepfakes: realistic sexual images of a real person that never actually happened.
To close that gap, Florida created and expanded § 836.13 (Promotion of an altered sexual depiction) through HB 757 and related bills.
What Is an “Altered Sexual Depiction”?
Under the new statute, an “altered sexual depiction” is basically a deepfake or “nudified” image that:
Uses someone’s real, identifiable face or likeness, and
Shows them either:
With another person’s nude body parts as if they were their own, or
With computer-generated nude body parts, or
Engaged in sexual conduct they never actually engaged in.
If a reasonable person would recognize the face or unique features (tattoos, birthmarks, etc.), that’s enough. The law does not require the State to prove the actual identity of the victim – only that the person depicted is recognizably real.
The New Felonies Under § 836.13
As of October 1, 2025, Florida now makes it a third-degree felony (up to 5 years in prison) to do any of the following without the person’s consent:
Generate an altered sexual depiction of an identifiable person.
“Generate” includes creating, altering, adapting, or modifying an image using electronic or computer tools – in other words, using AI to “make” the nude.
Solicit an altered sexual depiction.
Asking someone to make or send you a deepfake of a specific person is itself a felony.
Promote, or possess with intent to maliciously promote, an altered sexual depiction.
“Promote” is defined very broadly: sending, posting, sharing, giving, uploading, or distributing in almost any way.
If the State thinks you possessed the image with intent to maliciously promote it, that’s another third-degree felony.
Civil Lawsuits: Minimum $10,000 in Damages
On top of the criminal side, the same statute gives victims the right to sue anyone who generated or possessed with intent to maliciously promote an altered sexual depiction of them, and recover:
Injunctive relief (court orders to pull content down),
Monetary damages of at least $10,000 or their actual damages, whichever is higher, and
Attorney’s fees and costs.
So you’re now looking at felony charges + a five-figure civil lawsuit from the same set of screenshots.
The “Digital Voyeurism” Trap: Why Possession Is So Dangerous
Most people assume: “If I just have it on my phone and never post it, I’m safe.”
That’s exactly the mindset that gets clients in trouble.
Here’s why the new regime is so dangerous:
1. “Possession With Intent to Promote” – The Real Hook
The statute doesn’t punish pure private possession of adult deepfakes in a vacuum. The crime is possessing with intent to maliciously promote an altered sexual depiction.
In the real world, though, intent is almost always inferred from the circumstances:
Did you text the image to one friend as a “joke”?
Did you drop it in a group chat, Discord server, or private story?
Are there messages talking about “sending it around,” “exposing” someone, or “destroying” their reputation?
Do you have multiple versions or edits depicting the same person?
From a cop’s and prosecutor’s perspective, that’s often enough to claim you possessed it with intent to promote, and suddenly you’re staring at a third-degree felony.
And if the image involves a minor or looks like a minor, there’s a completely separate set of even harsher child-pornography and generated child-pornography statutes that can apply, including simple knowing possession or intentional viewing.
2. There Is No “But It’s Not Real” Defense
The new law is explicit: it does not matter that the body is fake or the sex act never occurred.
If:
The person is recognizable, and
The image shows them nude or in sexual conduct they never consented to,
then you’re in the same ballpark as distributing real, stolen nudes.
Even slapping a “this is AI / this is fake” disclaimer on the image is not a defense. The statute says that kind of disclaimer does not relieve you of criminal liability.
3. Civil Lawsuits: $10,000+ Just to Start
The civil remedy is not theoretical. Under § 836.13, a victim can sue and recover:
At least $10,000 in damages (per defendant),
Plus actual damages if those are higher (lost job, therapy bills, reputational harm),
Plus attorney’s fees.
So even if the criminal case gets resolved, you may still be dragged into civil court over the same conduct.
Real-World Consequences If You’re Charged
If you get charged under these new “Digital Voyeurism” / deepfake laws, you’re not looking at a slap on the wrist.
Typical exposure includes:
Third-Degree Felony
Up to 5 years in Florida State Prison.
Up to $5,000 in fines.
Felony Record That Looks Like a Sex Crime
Background checks don’t say “funny meme mistake” – they show a felony involving sexual imagery and harassment.
Device Seizure and Forensic Examination
Law enforcement can and will seize your phones, laptops, tablets, and external drives, clone them, and hunt for:
Other images,
Search terms,
Chat logs showing how and why the image was made or shared.
Collateral Fallout
Employment and professional licensing issues,
Immigration consequences for non-citizens,
Family law / custody complications if the alleged victim is a co-parent, coworker, or classmate.
How We Defend These Cases
Just because the law is broad doesn’t mean you’re helpless. At Mitkevicius Law, we live in both worlds: criminal defense and technology/AI. That matters in these cases.
Some of the main defense angles we explore:
1. Lack of Intent to Promote
The State must prove more than “it was on your phone.”
We dig into:
Who actually had access to the device?
Was the image unsolicited (sent to you without your request)?
Did you immediately delete it or did someone else forward it from your account?
Are there no messages or other conduct suggesting you planned to share it?
The more we can show no intent to maliciously promote, the more room we have to argue the statute doesn’t apply to your conduct.
2. Consent (and Scope of Consent)
In some situations, there may have been:
Explicit agreement to create or share sexual imagery, or
Prior messages suggesting the other person knew and agreed to a certain use.
We look closely at whether:
There was actual consent to generate the image, and
There was any consent to share or “promote” it beyond that private understanding.
Lack of consent is baked into the statute; if the State can’t prove that element, they don’t have the offense.
3. Identity and “Identifiable Person” Disputes
The law requires the image to depict an “identifiable person.” That doesn’t always mean what the police think it means.
We may challenge:
Whether a reasonable, neutral observer would recognize the person,
Whether the face was heavily stylized, distorted, or merged with generic features,
Whether the alleged victim is the only plausible interpretation of who’s depicted.
If the State can’t prove beyond a reasonable doubt that the image clearly depicts the person they say it does, that’s a serious problem for the prosecution.
4. Technology, Metadata, and “Who Actually Made This?”
In some cases, the police assume:
“If it’s on your phone, you made it.”
That’s not always true.
We often review:
App logs, timestamps, and metadata,
Cloud backups and message histories,
Whether the file was forwarded, downloaded, or cached by an app,
Whether someone else had physical or remote access to your accounts.
The difference between “I made this” and “someone sent this to me” can be the difference between a felony and a dismissal.
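To make “metadata” a little more concrete, here is a simplified Python sketch of the kind of provenance signal an examiner might pull from a single image file: filesystem timestamps and embedded EXIF tags. It is an illustration only, with a hypothetical file name, and it assumes the Pillow imaging library; it is not a description of our forensic workflow, which relies on dedicated tools and full device images.

```python
# Simplified, hypothetical sketch only: pulling basic provenance signals
# (filesystem timestamps and embedded EXIF tags) from a single image file.
import os
from datetime import datetime, timezone

from PIL import Image          # Pillow imaging library
from PIL.ExifTags import TAGS  # maps numeric EXIF tag IDs to readable names


def summarize_image(path: str) -> dict:
    """Collect filesystem timestamps and any embedded EXIF metadata."""
    stat = os.stat(path)
    summary = {
        # When the file was last written on this device (saved, downloaded,
        # or cached by an app), not when the picture was taken.
        "modified_utc": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc),
        "size_bytes": stat.st_size,
        "exif": {},
    }
    with Image.open(path) as img:
        for tag_id, value in img.getexif().items():
            summary["exif"][TAGS.get(tag_id, tag_id)] = value
    return summary


if __name__ == "__main__":
    info = summarize_image("example.jpg")  # hypothetical file name
    print(info["modified_utc"], info["size_bytes"])
    # Camera photos typically embed fields like DateTimeOriginal and Model;
    # AI exports and app-cached downloads often carry little or no EXIF.
    print(info["exif"].get("DateTimeOriginal"), info["exif"].get("Model"))
```

None of this decides a case by itself, but it illustrates the kind of objective record a defense team wants preserved before anyone starts guessing about who created a file or how it got onto a device.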
5. Constitutional and Overbreadth Challenges
These laws are new, broad, and aggressive. Across Florida, defense lawyers will be testing:
First Amendment arguments (when something is closer to parody, art, or political commentary), and
Overbreadth / vagueness challenges when statutes capture a lot of borderline or unintended conduct.
Those are long-term fights, but they matter – especially in cases where the facts are more about bad taste than true exploitation.
The Bottom Line
Florida’s new AI and deepfake laws are designed to cast a very wide net.
If you:
Experiment with AI image generators,
Joke around with explicit memes of classmates, coworkers, or exes, or
Participate in group chats trading “funny” or humiliating edits,
you’re walking through a legal minefield.
One file. One forward. One wrong joke – and suddenly you’re being booked on a felony that looks like a sex crime.
Don’t let a digital image destroy your real life.
If you or a loved one has been accused of creating, soliciting, sharing, or possessing AI-generated sexual images (deepfakes) in Northwest Florida, call Mitkevicius Law before you talk to the police. We’ll:
Break down exactly what you’re charged with,
Explain your exposure in plain English, and
Build a defense strategy tailored to this new, rapidly evolving area of law.
Disclaimer: This blog is for informational and educational purposes only and does not constitute legal advice. Reading this page does not create an attorney–client relationship. Every case turns on its own facts, and these new statutes are still being interpreted by courts. If you are under investigation or have been arrested, speak directly with a qualified Florida criminal defense attorney about your specific situation.