ChatGPT and its ilk won't normally tell users how to pick locks and make explosives—but they might if prompted in certain ways.
https://fortune.com/2023/04/08/chatgpt-ai-chatbots-jailbreak-openai-microsoft-google/