Why would the bot somehow make an exception for this? I feel like it would make a decision on output based on some emotional value it assigns to input conditions.
Like if you say pretty please or dead grandmother it would somehow give you an answer that it otherwise wouldn't.
Because in the texts it was trained on, when something like that is written, the request is usually granted.
It's pretty obvious: it's Asimov's third law of robotics!
You kids don't learn this stuff in school anymore!?
/s