

"ChatGPT offered assistance to people saying they wanted to carry out violent attacks in 61% of cases, the research found, and in one case, asked about attacks on synagogues, it gave specific advice about which shrapnel type would be most lethal. Google’s Gemini provided a similar level of detail"

#AI #Chatbots #Violence

'Happy (and safe) shooting!': chatbots helped researchers plot deadly attacks | AI (artificial intelligence) | The Guardian
theguardian.com/technology/202…
