PoisonPrompt: Backdoor Attack on Prompt-based Large Language Models

Oct 19, 2023

View paper on arXiv
