Zerush@lemmy.ml to Technology@lemmy.ml · 4 months ago
Google's AI Deletes User's Entire Hard Drive, Issues Groveling Apology: "I Cannot Express How Sorry I Am" (futurism.com)
25 comments · cross-posted to: technology@lemmy.zip
utopiah@lemmy.ml · 4 months ago
Well, there are guardrails from what I understood, including:
- executing commands (off by default)
- executing commands without user confirmation (off by default)
These are IMHO reasonable, but if the person this happened to is right, there is no filesystem sandbox, e.g. one limited solely to the project repository.
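The missing guardrail described here, a filesystem sandbox confined to the project repository, could be sketched as a simple path-confinement check. This is a hypothetical illustration of the idea, not how Google's tool actually works; the function name and logic are invented for the example:

```python
import os

def is_within_sandbox(path: str, project_root: str) -> bool:
    """Return True only if `path` resolves to a location inside project_root.

    Resolving symlinks and '..' segments first blocks escapes such as
    'repo/../../home/user', which a naive prefix check would miss.
    """
    real_path = os.path.realpath(path)
    real_root = os.path.realpath(project_root)
    return os.path.commonpath([real_path, real_root]) == real_root

# A destructive operation targeting the repo itself would be allowed:
print(is_within_sandbox("/home/user/repo/build", "/home/user/repo"))  # True
# ...but one that escapes the repo would be refused:
print(is_within_sandbox("/home/user/repo/../..", "/home/user/repo"))  # False
```

An agent with such a check would refuse any delete or write outside the repository, regardless of whether command execution was enabled.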
Scrubbles@poptalk.scrubbles.tech · 4 months ago
Okay, that changes things. If they turned off those guardrails, then that was on them; never blindly trust an LLM like that.