• greenskye@lemmy.zip · 2 days ago

    I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.

I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment toward applying this to other forms of porn as well, ones that are less universally hated.

I’m not supporting this use case at all, and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.

    • Kanda@reddthat.com · 5 hours ago

      It was already a thing in several places. In my country it’s legal to sleep with a 16-year-old, but fiction about the same thing is illegal.

    • SorteKanin@feddit.dk · 22 hours ago

      I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime

      I’m sorry to break it to you, but this has been illegal for a long time and it doesn’t need to have anything to do with CSAM.

      For instance, drawing certain copyrighted material in certain contexts can be illegal.

      To go even further, numbers and maths can be illegal in the right circumstances. For instance, it may be illegal where you live to break the encryption of a certain file, depending on the file and encryption in question (e.g. DRM on copyrighted material). “Breaking the encryption of a file” essentially translates to “doing maths on a number” when you boil it down. That’s how you can end up with the concept of illegal numbers.
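
      To make the “maths on a number” point concrete, here’s a toy Python sketch (purely my own illustration, nothing to do with any real DRM scheme): it “decrypts” bytes using nothing but integer arithmetic and XOR, so the whole operation really is just doing maths on numbers.

          # Toy sketch only, not real DRM: it just shows that "decrypting"
          # bytes boils down to arithmetic on integers.

          def keystream(key: int, length: int):
              # Simple linear congruential generator: pure integer maths.
              state = key
              for _ in range(length):
                  state = (state * 6364136223846793005 + 1442695040888963407) % 2**64
                  yield state & 0xFF

          def toy_crypt(data: bytes, key: int) -> bytes:
              # XOR with the keystream; running it twice gives the original back.
              return bytes(b ^ k for b, k in zip(data, keystream(key, len(data))))

          # The key below is just an example number; its bytes happen to spell
          # out the start of the widely published AACS key, the classic
          # "illegal number".
          key = 0x09F911029D74E35B
          ciphertext = toy_crypt(b"some protected content", key)
          assert toy_crypt(ciphertext, key) == b"some protected content"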

      • greenskye@lemmy.zip · 14 hours ago

        To further clarify: it’s specifically about thought crimes in scenarios where there is no victim being harmed.

        If I’m distributing copyrighted content, that’s harming the copyright holder.

        I don’t actually agree with breaking DRM being illegal either, but at least in that case, doing so is supposedly harming the copyright holder because presumably you might then distribute it, or you didn’t purchase a second copy in the format you wanted or whatever. There’s a ‘victim’ that’s being harmed.

        Doodling a dirty picture of a totally original character doing something obscene harms absolutely no one. No one was abused. No reputation (other than my own) was harmed. If I share that picture with other consenting adults in a safe fashion, again no one was harmed or had anything done to them that they didn’t agree to.

        It’s totally ridiculous to outlaw that. It’s punishing someone, and ruining their life, for having a fantasy or thought that you don’t agree with. And that’s an extremely easy path to expand into other thoughts you don’t like as well. And then we’re back to stuff like sodomy laws and the like.

    • shani66@ani.social · 18 hours ago

      Sure, I think it’s weird to really care about loli or furry or any other niche the way a lot of people around here do, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can’t have effective safeguards against that harm, it makes sense to restrict it legally.

      • greenskye@lemmy.zip · 1 day ago

        Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.

        But generating images of adults that don’t exist? Or even clearly drawn images that aren’t even realistic? I’ve seen a lot of people (from both sides of the political spectrum) advocate that these should be illegal if the content is what they consider icky.

        Like let’s take bestiality for example. Obviously gross and definitely illegal in real life. But should a cartoon drawing of the act really be illegal? No one was abused. No reputation was damaged. No illegal act took place. It was simply someone’s fucked up fantasy. Yet lots of people want to make that into a thought crime.

        I’ve always thought that if there isn’t speech out there that makes you feel icky or gross, then you don’t really have free speech at all. Keeping free speech as a right necessarily requires you to sometimes fight for the right of others to say or draw or write stuff that you vehemently disagree with, but recognize as not actually causing harm to a real person.

        • CileTheSane@lemmy.ca · 4 hours ago

          Making porn of actual people without their consent regardless of age is not a thought crime. For children, that’s obviously fucked up. For adults it’s directly impacting their reputation. It’s not a victimless crime.

          That is also drawing a certain arrangement of lines and colours, and an example of “free speech” that you don’t think should be absolute.

          • greenskye@lemmy.zip · 4 hours ago

            Yes, sorry. My original statement was too vague. I was talking specifically about scenarios where there is no victim and the action was just a drawing/story/etc.

            I’m not a free speech absolutist. I think that lacks nuance. There are valid reasons to restrict certain forms of speech. But I do think the concept is core to a healthy democracy and society and should be fiercely protected.

            • CileTheSane@lemmy.ca · 2 hours ago

              The issue with child porn is how you define the victim. One could argue that easily available pornographic images of fake children increase the market and desire for pornographic images of real children, and as such can result in more victims. Especially if someone can claim that images of real victims are “fake and AI generated.”

              Regarding X and Grok, however, my understanding is that it is taking images of real children and producing naked images of those children. So there are real victims, and Tim Sweeney is saying that shouldn’t be censored.

        • azertyfun@sh.itjust.works · 1 day ago

          Drawings are one conversation I won’t get into.

          GenAI is vastly different, though. Those models are known to sometimes regurgitate people or things from their dataset (mostly) unaltered, like how you can get Copilot to spit out valid secrets that people accidentally committed just by typing NPM_KEY=. You can’t have any guarantee that if you ask it to generate a picture of a person, that person does not actually exist.
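
          To illustrate the kind of regurgitation probe I mean, here’s a rough Python sketch. The complete() function is a made-up stand-in for whatever code-completion API is being poked at (not a real Copilot endpoint), and the token pattern is only the rough shape of an npm token.

              import re

              # Rough shape of a modern npm access token ("npm_" plus ~36
              # alphanumeric characters); treat the exact length as an assumption.
              NPM_TOKEN = re.compile(r"npm_[A-Za-z0-9]{30,40}")

              def complete(prompt: str) -> str:
                  # Made-up stand-in for a code-completion API. It returns a
                  # fabricated value so the sketch runs; a memorizing model could
                  # instead return a real secret it saw in a public commit.
                  return "npm_" + "x" * 36

              def probe_for_memorized_secrets(prefix: str = "NPM_KEY=") -> list[str]:
                  # Prompt with something that looks like the start of a committed
                  # .env file, then scan the completion for token-shaped strings.
                  completion = complete(prefix)
                  return NPM_TOKEN.findall(prefix + completion)

              print(probe_for_memorized_secrets())  # fake token in this demo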