"ChatGPT offered assistance to people saying they wanted to carry out violent attacks in 61% of cases, the research found, and in one case, asked about attacks on synagogues, it gave specific advice about which shrapnel type would be most lethal. Googleβs Gemini provided a similar level of detail"
"Happy (and safe) shooting!": chatbots helped researchers plot deadly attacks | The Guardian
theguardian.com/technology/202…
Users posing as would-be school shooters find AI tools offer detailed advice on how to perpetrate violence.
Robert Booth (The Guardian)