• merc@sh.itjust.works
    2 days ago

    No, I’m sure you’re wrong. There’s a certain cheerful confidence you get from every LLM response. It’s an upbeat “can-do attitude” brimming with confidence mixed with subservience, which is definitely not the standard way people communicate on the Internet, let alone Stack Overflow. Sure, sometimes people answering questions are overconfident, but it’s usually an arrogant kind of confidence, not the subservient kind you get from LLMs.

    I don’t think an LLM can sound like it lacks confidence for the right reasons, but it can definitely pull off a lack of confidence if it’s prompted correctly. To actually lack confidence it would have to have an understanding of the situation. But to imitate a lack of confidence, all it would need to do is draw on the training data where someone answers a question while lacking confidence.

    Similarly, it’s not like it actually has confidence normally. It’s just been trained / meta-prompted to emit answers in a style that mimics confidence.

    • locuester@lemmy.zip
      2 days ago

      ChatGPT went through a phase of overly bubbly, upbeat responses, but they chilled it out since. Not sure if that’s what you saw.

      One thing is for sure with all of them, they never say “I don’t know” because such responses aren’t likely to be found in any training data!

      It’s probably part of some system level prompt guidance too, like you say, to be confident.

      • merc@sh.itjust.works
        2 days ago

        I think “I don’t know” might sometimes be found in the training data. But I’m sure they optimize the meta-prompts so that it never shows up in a response. While it might be the “honest” answer a lot of the time, the makers of these LLMs seem to believe that people would prefer confident bullshit that’s wrong over “I don’t know”.