A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led the uniformed McCorkle out of the theater in handcuffs.
Hard to say. I generally agree with what you’ve said, though. Also, lots of people have other fantasies that they would never enact in real life for various reasons (e.g. it’s unsafe, illegal, or both; edit: I should also absolutely list non-consensual here). I feel like pedophilia isn’t necessarily different.
However, part of the reason loli/whatever is also illegal to distribute (it is, right? I assume it is at least somewhere) is that it would otherwise help people facilitate and organize the distribution of real CSAM, which increases demand for it. That’s what I’ve heard, at least, and it makes sense to me. And I feel like that would apply to AI-generated material as well.
It’s obviously very hard to get accounts of what pedophiles are doing; the only ones you can survey are the ones who have been caught, which isn’t necessarily a representative sample. I don’t think there are any good estimates of the rate of pedophilic tendencies.
From a cursory reading, it looks like possession and distribution are both felonies. Lolicon hentai is pretty widely available online, and prosecutions appear to be very uncommon when compared to the availability. (Low priority for enforcement, probably?)
I’m not sure that increasing the supply of CSAM would necessarily increase demand for it among people who aren’t already pedophiles, though. To put it another way, I’m sure that increasing the supply of gay porn would increase consumption of gay porn, but I’m pretty sure it’s not going to make more people gay. And people who aren’t gay (or at least bi) aren’t going to be interested in gay porn, regardless of how hard up (heh) they might be for porn, as long as they have any choice at all. There’s a distinction between fetishes/paraphilias and orientations, and my impression has been that pedophilia is much more like an orientation than a paraphilia.
No, but allowing people to organize increases demand, because those who want CSAM then have a place where it’s safe for them to look for it and ask for it, and maybe even pay for it to be created. It’s rather the other way around: the demand increases the supply, if you want to put it like that. I’m not saying lolicon being freely available turns people into pedophiles or anything like that, at all.
I guess where I come down is that, as long as no real people are being harmed (either directly, or because their likeness is being used), I’d rather see it out in the open than hidden. At least if it’s in the open, you have a better chance of knowing who is immediately unsafe around children, and you can use that to exclude people from positions that would give them ready access to children (teachers, priests, etc.).
Unfortunately, there’s also a risk of pedophilia being ‘normalized’ to the point where people let their guard down around pedophiles.
Yeah I agree.