this post was submitted on 27 Feb 2025
598 points (98.1% liked)

Technology

[–] [email protected] 90 points 1 day ago* (last edited 1 day ago) (24 children)

First off, I am sex positive, pro porn, and pro sex work. I don't believe sex work should be shameful, and I think there is nothing wrong with buying intimacy from a willing seller.

That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest one.

I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand it might be trained on other people's work and take people's jobs.

I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal stuff. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several things being legal, but I can't logically argue for them being illegal without a victim.

[–] [email protected] 1 points 1 hour ago* (last edited 1 hour ago)

without a victim

You are wrong.

AI media models have to be trained on real media. Illegal content would mean illegal training media, and thus benefiting from, supporting, and profiting from crimes against real victims.

The lengths pedophiles will go to, and the fallacies they will use, to justify themselves are absurd.

[–] [email protected] 0 points 4 hours ago (1 children)

What's illegal in real porn should be illegal in AI porn, since eventually we won't know whether it's AI.

[–] [email protected] 1 points 1 hour ago

That's the same as saying we shouldn't be able to make videos with murder in them because there is no way to tell if they're real or not.

[–] [email protected] 6 points 13 hours ago

I've found that a lot of things on the Internet went wrong because they were ad-supported for "free". Porn is one of them.

There is ethically produced porn out there, but you're going to have to pay for it. Incidentally, it also tends to be better porn overall. The versions of the videos they put up on tube sites are usually cut down, and are only part of their complete library. Up through 2012 or so, the tube sites were mostly pirated content, but then they came to an agreement with the mainstream porn industry. Now it's mostly the studios putting up their own content (plus independent, verified creators), and anything pirated gets taken down fast.

Anyway, sites like Crash Pad Series, Erika Lust, Vanessa Cliff, and Adulttime (the most mainstream of this list) are worth a subscription fee.

[–] [email protected] 40 points 1 day ago* (last edited 17 hours ago) (19 children)

I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal stuff. If it is AI generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several things being legal, but I can't logically argue for them being illegal without a victim.

I've been thinking about this recently too, and I have similar feelings.

I'm just gonna come out and say it without beating around the bush: what is the law's position on AI-generated child porn?

More importantly, what should it be?

It goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it didn't?

If we're basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour (i.e. increasing demand), or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour (i.e. lowering demand).

And to know that, we'd need extensive and extremely controversial studies. Beyond that, even in the event that allowing this stuff to be generated is an overall positive (and I don't know whether it would or wouldn't be), will many politicians actually call for it to be allowed? It seems like the kind of thing that could ruin a political career. Nobody's touching that with a ten foot pole.

[–] [email protected] 4 points 9 hours ago

I think the concern is that although it's victimless, if it's legal it could... normalise the practice (within certain circles). This might make the users more confident to do something that does create a victim.

Additionally, how do you tell if it's real or generated? If AI does get better, how do you tell?

[–] [email protected] 3 points 13 hours ago (1 children)

what is the law’s position on AI-generated child porn?

Pretend underage porn is illegal in the EU and some other countries. I believe in the US it is protected by the First Amendment.

Mind that when people talk about child porn or CSAM that means anything underage, as far as politics is concerned. When two 17-year-olds exchange nude selfies, that is child porn. There were some publicized cases of teens in the US being convicted as pedophile sex offenders for sexting.

[–] [email protected] 3 points 6 hours ago* (last edited 6 hours ago)

I believe in the US it is protected by the First Amendment.

CSAM, artificial or not, is illegal in the United States.

https://www.justice.gov/archives/opa/pr/man-arrested-producing-distributing-and-possessing-ai-generated-images-minors-engaged

[–] [email protected] 6 points 19 hours ago (1 children)
[–] [email protected] 1 points 1 hour ago

Illegal in most of the West already, as creating sexual assault material of minors is already illegal regardless of method.

[–] [email protected] 14 points 1 day ago (1 children)

It's so much simpler than that: it can be created now, so it will be. People will use narrative twists to post it on the clearnet, just like they do with anime (she's really a 1000-year-old vampire, etc.). Creating laws to allow it is simply setting the rules for a phenomenon that is already going to happen.

The only question is whether politicians will stop mudslinging long enough to have an adult conversation, or whether we will just shove everything into the more obscure parts of the internet and let it police itself.

[–] [email protected] 4 points 4 hours ago

No adult conversation required, just a quick "looks like we don't get internet privacy after all, everyone," and the erosion of more civil liberties. Again.

[–] [email protected] 3 points 22 hours ago* (last edited 22 hours ago)

what is the law’s position on AI-generated child porn?

The simplest possible explanation here is that any porn created based on images of children is de facto illegal. If it's trained explicitly on adults and you prompt it for child porn, that's a grey area; it will probably follow the precedent for drawn art rather than real content.

[–] [email protected] 8 points 23 hours ago (3 children)

i have no problem with ai porn assuming it's not based on any real identities; if it is, i think that should be considered identity theft or impersonation or something.

Outside of that, it's more complicated, but i don't think it's a net negative. People will still thrive in the porn industry; it's been around for as long as it's been possible, and i don't see why it wouldn't continue.

[–] [email protected] 1 points 4 hours ago* (last edited 4 hours ago) (1 children)

thispersondoesnotexist.com

Refresh for a new fake person

[–] [email protected] 1 points 58 minutes ago

this one's a classic.

[–] [email protected] 0 points 6 hours ago (1 children)

i have no problem with ai porn assuming it’s not based on any real identities

With any model in use, currently, that is impossible to meet. All models are trained on real images.

[–] [email protected] 1 points 55 minutes ago

With any model in use, currently, that is impossible to meet. All models are trained on real images.

yes but if i go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person close enough to perceptibly be them?

You are literally using the schizo argument right now. "If an artist creates a piece depicting no specific person, but his understanding of people is based inherently on the facial structures of other people that he knows and recognizes, then he must be stealing their likeness."

[–] [email protected] 8 points 22 hours ago (2 children)

Identity theft only makes sense for businesses. I can sketch naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?

[–] [email protected] 3 points 22 hours ago (1 children)

revenge porn, simple as. Creating fake revenge porn of real people is still to some degree revenge porn, and i would argue it is stealing someone's identity/impersonation.

To be clear, your example is a sketch of johnny depp; i'm talking about a video of a person that resembles the likeness of another person, where the entire video is manufactured. Those are, fundamentally, two different things.

[–] [email protected] 9 points 22 hours ago (3 children)

Again, you're talking about distribution.

[–] [email protected] 3 points 13 hours ago* (last edited 13 hours ago) (1 children)

I guess the point is that this enables the mass production of revenge porn, essentially at the person-on-the-street level, which makes it much harder to punish and prevent distribution. When relatively few sources produce the unwanted product, punishing only the distribution might be a viable method. But when the production method becomes available to the masses, the only feasible control mechanism is to try to regulate the production method. It is all a matter of where the most efficient place to put the bottleneck is.

For instance, when 3D printing allows people to produce automatic rifles in their homes, saying "civilian use of automatic rifles is illegal, so that is fine" is useless.

[–] [email protected] 2 points 12 hours ago (1 children)

I think that's a fair point, and I wonder how this will affect freedom of expression on the internet. If you can't find the distributor, then it'll be really tough to get a handle on this.

On the other hand, the sheer overabundance could simply break the entire value of revenge porn, as in a "nothing is real anyway so it doesn't matter" sort of thing, which I hope would be the case. No one will be watching revenge porn because they can generate any porn they want in a heartbeat. That's the ideal scenario, anyway.

[–] [email protected] 2 points 9 hours ago* (last edited 9 hours ago)

It is indeed a complicated problem with many intertwined variables; I wouldn't wanna be in the shoes of policy makers (assuming that they actually are searching for an honest solution and not trying to turn this into profit lol).

For instance, too much regulation on fields like this would essentially kill high quality open source AI tools and make most of them proprietary software, leaving the field at the mercy of tech monopolies. This is probably what these monopolies want, and they will surely try to push things this way to kill competition (talk about capitalism spurring competition and innovation!). They might even don the cloak of some of these bad actors to speed up the process. Given the possible application range of AI, this is probably even more dangerous than flooding the internet with revenge porn.

100% freedom, with no regulations, will essentially lead to a mixed situation: creative and possibly groundbreaking uses of the tech vs many bad actors using the tech for things like scamming, disinformation, etc. How it will balance out in the long run is probably very hard to predict.

I think two things are clear: 1) neither extreme is ideal, 2) between the two extremes, 100% freedom is still the better option (the regulated extreme just exchanges many small bad actors for a couple of giant bad actors and chokes any possible good outcomes).

Based on these, starting with a solution closer to the "freedom" edge and improving it step by step based on results is probably the most sensible approach.
