"If the model impersonates Elon Musk without the user prompting it to do so, it is a violation of our principles," an ...
A red team got xAI's latest model to reveal its system prompt, provide instructions for making a bomb, and worse. Much worse.