The existence of such "jailbreaks" for LLMs is nothing new. The term describes cases in which users manage to elicit content from an LLM through ...