Posting this here as I feel like similar things are happening in open source projects we like to host.

  • parson0@startrek.website · 3 days ago

    Oh, I’ve been on a similar journey. After a year-long break I’m back in the education space, and things have only gotten worse.

    I voiced concern about professors using LLMs to create presentations. The sloppy images with nonsense text are one thing, but they’re hallucinating actual content. Since I’m still learning the topic, I can’t tell whether something is true or a hallucination. My criticism was taken as me just not being tech-savvy enough for AI.

    I’ll probably drop the course and get my money back; I could get a ChatGPT subscription for a lot less…

    I’ve beefed up my legal insurance, because I believe the only way to hammer some sense into people and businesses is to sue when their agentic AI makes mistakes. Right now nobody seems accountable for the output, and that has to change.

        • SlowBurn@slrpnk.net · 2 days ago

          Yes. Every manager has responsibilities, because when shit goes south the meeting in the corner office wants to know who to blame, or, in less punitive cultures, where the mistakes were made and where the opportunities to learn are. Corporations shield people from most personal criminal liability, but not all of it.

          Allowing people to avoid responsibility by pinning it on a machine would be a big mistake.

          Cathy O’Neil’s *Weapons of Math Destruction* is still a solid read on this stuff.