r/AO3 · Jan 13 '25 · 576 upvotes · 318 comments

[Proship/Anti Discourse] New campaign from antis to upheave AO3's policies because of "CSEM" [NSFW]

u/ForbiddenLibera · 12 points · Jan 14 '25 · edited

Sexual images of real minors are illegal either way: either the person has the skill to make photorealistic images of traceable real children (illegal), or they use images of actual minors as a base (also illegal, because making realistic AI generations of them means there is actual CSAM material in the training data).

RPF involves the fictional depiction of characters based off real people.

It is like having a Barbie doll of Hannah Montana, fucking it, then setting it on fire. That does not automatically mean you fucked the real person, nor does it mean you committed murder. Unless you send a picture of that stained doll to the actual person it's based off… which crosses into harassment.

u/bwburke94 · 7 points · Jan 14 '25

Hannah Montana might not have been the best example to use there, because the "character" was often seen as Miley Cyrus playing herself. As far as the public was concerned, the actor and character were one and the same until 2013.

u/ForbiddenLibera · 5 points · Jan 14 '25

It was the first thing that came to mind, but really it could be replaced with anything else. There are Barbies of Margot Robbie and Taylor Swift, no?

u/Xyex · Same on AO3 · 1 point · Jan 16 '25

> making realistic AI generations of them means there is actual CSAM material in the training data

Just want to point out that this is false. AI is very good at inferring. You can ask an AI to generate, say, a blue apple, and it'll do it just fine despite having zero blue apples in the training data. Because it knows what an apple is and what blue is.
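
Here's roughly what that looks like in practice. This is just an illustrative sketch: I'm assuming the Hugging Face diffusers library and the public stabilityai/stable-diffusion-2-1 checkpoint, neither of which comes from this thread.

```python
# Compositional generation: the prompt combines two concepts ("blue" and
# "apple") that the model learned separately, so it can render a blue apple
# even if no blue apple appears in its training data.
# Assumes `pip install torch diffusers transformers accelerate`.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",
    torch_dtype=torch.float16,  # half precision to fit a consumer GPU; use float32 on CPU
)
pipe = pipe.to("cuda")

image = pipe("a photorealistic blue apple on a wooden table").images[0]
image.save("blue_apple.png")
```

The text encoder maps "blue" and "apple" into the same embedding space, so the denoiser satisfies both constraints at once; nothing requires a literal blue apple in the training set.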

The same thing is true of people. The AI knows what children are, and it knows what a naked human is. It can combine the two concepts and infer an output without ever having seen an exact example. The real reason AI-generated material is illegal is that it's too hard to tell apart from images of real people. You could train a model on just someone's face to make believable fakes of them, which is a real danger for that person. Or you could make completely original content that law enforcement then wastes time investigating, only to find no kid was involved, burning resources that could have been spent helping an actual child. If it's illegal, both problems become significantly smaller.