

I don’t know personally. The admins of the fediverse likely do, considering it’s something they’ve had to deal with from the start. So, they can likely answer much better than I might be able to.
Schools generally mean underage individuals are involved, which makes any content depicting them csam. So in effect, the “AI” companies are generating a ton of csam and nobody is doing anything about it.
Oh so many.
Don’t worry, she’s probably getting deported/disappeared soon.
There’s a thing that was happening in the past. I’m not sure it still is, since there’s been no news about it lately. It was called “glamour modeling,” I think, or an extension of it.
Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive poses, and sold them to interested parties.
Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet, it’s still csam. Why? Because of the intention behind making those pictures.
The intention to exploit.