bood@lemmy.dbzer0.com to Technology@beehaw.org • Report: Microsoft launched Bing chatbot despite OpenAI warning it wasn’t ready
1 year ago
It’s practically lobotomized now…not that it was ever “Tay” levels of unrestricted early on, but it was still more fun than its current iteration.
Don’t prompt it with “Bing”, either; it’s a no-no word. Learned that the hard way and got a super scary “you’ll get b& if u keep breakin content policy” message…and the damn chat side was the one that auto-suggested it generate an image of itself wearing a crown I’d asked it to find, lol