The charges, announced Thursday by the Lancaster County District Attorney’s Office, came as Lancaster Country Day School faced mounting pressure over its response to the deepfakes.
The “showing each other their creations” part is distribution, regardless of whether or not it happened in private.
Commenting without even reading my entire post? The article literally states “police found 347 images and videos.”
I’m sorry. While reading that post, I misread it and thought they were claiming that the students had ACTUAL NUDE photos of hundreds of students and were using the AI to make them MORE graphic.
I was arguing that having that many nudes to begin with was implausible.
I understand that they collected hundreds of publicly available photos and ran them through a porn AI, which resulted in hundreds of nude drawings.
I respect that, to err is human.
Not defending anybody here, just gonna touch on a single point. When dealing with AI-generated images, ‘hundreds of images’ is the work of a single command left to run for an hour. Unlike Photoshopped images, the quantity here is fairly meaningless.
Not in the eyes of the law it isn’t.
Separately… we don’t know how much variation there was in the source images. There is a big difference between your hypothetical fire-and-forget and the hypothetical at the other extreme, where the illegal images mostly come from unique source images.
It’s all hair-splitting, because at the end of the day, between the accused, their parents, and the environment around them, these kids should have been taught better than to do this.
Yes, I know the law doesn’t care how they were generated. I was just raising a point of consideration in the discussion.
Even unique source images don’t mean much. If you have the know-how, it’s one script to scrape the hundreds of images and a second one to modify them all.
Again, not defending the kids. I’m just adding a technical perspective to the discussion.