https://fortune.com/2023/04/08/chatgpt-ai-chatbots-jailbreak-openai-microsoft-google/
ChatGPT and its ilk won't normally tell users how to pick locks and make explosives—but they might if prompted in certain ways.