Lida Zhao

Jailbreaking ChatGPT via Prompt Engineering: An Empirical Study

May 23, 2023
Figures 1–4 for Jailbreaking ChatGPT via Prompt Engineering: An Empirical Study