• Carighan Maconar@lemmy.world

    It’s called the Chinese Room, and it’s exactly what “AI” is. It recombines pieces of data into “answers” to a “question”, despite not understanding the question, the answer it gives, or the pieces it uses.

    It has a very, very complex chart of which elements, in what combinations, need to be in an answer to a question containing which elements in what combinations, but that’s all it does. It just sticks word barf together based on learned patterns, with no understanding of words, language, context, or meaning.
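
    A minimal sketch of that “chart” idea, assuming a toy hand-written rulebook rather than learned weights (a real model learns its patterns from data, but the point about matching symbols with no comprehension is the same):

    ```python
    import re

    # Toy "rulebook": pattern -> response template. Pure symbol matching,
    # Eliza-style; the program never knows what any word means.
    RULEBOOK = [
        (re.compile(r"\bwhat is (\w+)", re.I), "{0} is a fascinating subject."),
        (re.compile(r"\bhello\b|\bhi\b", re.I), "Hello! How can I help you?"),
        (re.compile(r"\bweather\b", re.I), "The weather is lovely today."),
    ]

    def respond(question: str) -> str:
        """Return the first canned template whose pattern matches the input."""
        for pattern, template in RULEBOOK:
            match = pattern.search(question)
            if match:
                # "Recombine pieces of data": splice matched input back in.
                return template.format(*match.groups())
        return "I'm not sure what you mean."

    print(respond("What is consciousness?"))
    # -> "consciousness is a fascinating subject." (no understanding involved)
    ```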

    • Valmond@lemmy.world

      Yeah, but the argument was supposed to be about consciousness, and it’s a really bad one IMO.

      I mean, we ourselves are probably nothing more advanced than computers, so the argument’s implication that consciousness is needed to understand context seems very shaky.

      • kibiz0r@midwest.social

        I think it’s kind of strange.

        Between quantification and consciousness, we tend to dismiss consciousness because it can’t be quantified.

        Why don’t we dismiss quantification because it can’t explain consciousness?

        • Valmond@lemmy.world

          We can understand and poke at one but not the other, I guess. I think much more energy should be invested in understanding consciousness.