It’s honestly a gamble based on my experience. Instructions that I’ve given ChatGPT have worked for a while, only to be mysteriously abandoned for no valid reason. Telling AI not to hallucinate is apparently common practice from the research I’ve done.
Make sure you ask the AI not to hallucinate because it will sometimes straight up lie. It’s also incapable of counting.
But where is the fun in it if I can’t make it hallucinate?
I do feel bad when I have to tell it not to. Hallucinating is fun!
But does it work to tell it not to hallucinate? And does it work the other way around too?
Makes me wonder: Can I just ask it to hallucinate?
Yep. Tell it to lie and it will.