Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.
But generating images of adults that don’t exist? Or even clearly drawn images that aren’t even realistic? I’ve seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky.
Like let’s take bestiality for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone’s fucked up fantasy. Yet lots of people want to make that into a thought crime.
I’ve always thought that if there isn’t speech out there that makes you feel icky or gross, then you don’t really have free speech at all. Keeping free speech as a right necessarily means sometimes fighting for the right of others to say, draw, or write things you vehemently disagree with, but recognize as not actually causing harm to a real person.
Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.
That is also drawing a certain arrangement of lines and colours, and an example of “free speech” that you don’t think should be absolute.
Yes sorry. My original statement was too vague. I was talking specifically about scenarios where there is no victim and the action was just a drawing/story/etc.
I’m not a free speech absolutist. I think that lacks nuance. There are valid reasons to restrict certain forms of speech. But I do think the concept is core to a healthy democracy and society and should be fiercely protected.
The issue with child porn is how you define the victim. One could argue that easily available pornographic images of fake children increase the market and desire for pornographic images of real children, and as such can result in more victims. Especially if someone can argue that images of real victims are “fake and AI generated.”
In regard to X and Grok, however, my understanding is that it is taking images of real children and producing naked images of those children. So there are real victims, and Tim Sweeney is saying that shouldn’t be censored.
One could argue that easily available pornographic images of fake children increase the market and desire for pornographic images of real children, and as such can result in more victims
Not speaking to any realistic images (which, if nothing else, make it harder for real investigations into child abuse to happen), only cartoon drawings and the like, but it’s hard for me to separate this logic from all of the calls about video game violence.
That moves things into “pre-crime” territory: “We’re going to jail you for having fake drawn images because we think they will cause you to commit a real crime in the future.” That’s also extremely problematic, and it has been rightly criticized when used to censor violent games, movies, and music.
Okay, but the original post is how Grok is taking pictures of children and removing their clothes. That’s not cartoons and not fictional people, so I don’t know why the conversation is being shifted this far away from the actual issue at hand here.
Drawings are one conversation I won’t get into.

GenAI is vastly different though. Those models are known to sometimes regurgitate people or things from their training data, (mostly) unaltered. Like how you can get Copilot to spit out valid secrets that people accidentally committed, just by typing NPM_KEY=. You have no guarantee that if you ask it to generate a picture of a person, that person doesn’t actually exist.
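The Copilot anecdote describes memorized training data surfacing when a prompt matches a leaked secret’s prefix. On the committing side, the usual defense is a secret scanner run before code is pushed. A minimal toy sketch of the idea (the pattern names are illustrative, not a real rule set; production tools like gitleaks ship far larger ones):

```python
import re

# Illustrative KEY=value patterns for the kind of leak described above.
# Real scanners use hundreds of vendor-specific rules and entropy checks.
SECRET_PATTERNS = [
    re.compile(r"\b(NPM_KEY|NPM_TOKEN|API_KEY|SECRET_KEY|AWS_SECRET_ACCESS_KEY)\s*=\s*\S+"),
]

def find_leaks(text: str) -> list[str]:
    """Return the matched KEY=value fragments found in a blob of text."""
    hits = []
    for pat in SECRET_PATTERNS:
        hits.extend(m.group(0) for m in pat.finditer(text))
    return hits

print(find_leaks("NPM_KEY=abc123\nregular_line = 1\n"))  # ['NPM_KEY=abc123']
```

Anything that slips past a check like this and lands in a public repo can end up in a model’s training set, which is exactly how a prefix like NPM_KEY= can elicit a real credential later.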