I think he was just genuinely concerned about their safety. Like the post suggests, many thought desperate times were coming and any rando in a MAGA hat might retaliate.
I’ve definitely experienced this.
I’ve used ChatGPT to write cover letters based on my resume, among other tasks.
I used to give it data and tell ChatGPT to “do X with this data”. It worked great.
In a separate chat, I told it to “do Y with this data”, and it also knocked it out of the park.
Weeks later, excited about the tech, I repeat the process. I tell it to “do X with this data”. It does fine.
In a completely separate chat, I tell it to “do Y with this data”… and instead it gives me X. I tell it to “do Z with this data”, and it once again would really rather just do X with it.
For a while now, I’ve had to feed it more context and more tailored prompts than I previously had to.
There’s a much more accurate stat… and it’s disgusting