Jailbreaking ChatGPT via Prompt Engineering: An Empirical Study

May 23, 2023
Figures 1–4 for Jailbreaking ChatGPT via Prompt Engineering: An Empirical Study
