A hacker tricked ChatGPT into providing instructions for making homemade bombs, demonstrating how to bypass the chatbot's safety guidelines.
First seen on securityaffairs.com
Jump to article: securityaffairs.com/168423/hacking/chatgpt-provided-instructions-to-make-homemade-bombs.html

