GPT-4 developer tool can be exploited for misuse with no easy fix

12/4/2023


New Scientist ($) - CS professor Daniel Kang was part of a team that discovered 340 prompts to which OpenAI's developer tool for GPT-4 would respond with dangerous information, such as instructions for converting semi-automatic rifles into fully automatic ones or for cultivating the bacterium that produces botulinum toxin.

Read the New Scientist story ($).


