12/4/2023
New Scientist ($) - CS professor Daniel Kang was part of a team that discovered 340 prompts in OpenAI's developer tool for GPT-4 that could give potential terrorists dangerous instructions, such as how to convert semi-automatic rifles into fully automatic ones or how to cultivate the bacterium that produces botulinum toxin.