• ssillyssadass@lemmy.world
    link
    fedilink
    English
    arrow-up
    9
    ·
    1 day ago

    I wonder if it would be possible to plant an instruction bomb somewhere on the page which would trip up LLM-powered bots. I dunno how much of the page they take in.

    • ameancow@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      edit-2
      1 day ago

      If you have a personal web-page or blog, you can easily poison your content just my making white text on white background or something, containing an assortment of prompts and nonsense.

      But that’s only for the current models of LLM, next gen might easily bypass those kinds of tricks. We’re cooked, yo.