Making LLMs Vulnerable to Prompt Injection via Poisoning Alignment

Oct 18, 2024

View paper on arXiv
