Backdoor Attack
Backdoor attacks inject maliciously constructed samples into a model's training set so that, at test time, the trained model misclassifies any input patched with the backdoor trigger as an attacker-chosen target class, while behaving normally on clean inputs.
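As a rough sketch of the data-poisoning step described above, the snippet below stamps a small square trigger onto a fraction of training images and relabels them to the target class (a BadNets-style attack). The trigger shape, size, location, and poison rate are illustrative assumptions; real attacks vary all of these.

```python
import numpy as np

def apply_trigger(image, size=3, value=1.0):
    """Stamp a square trigger into the bottom-right corner.

    `size` and `value` are illustrative choices; real attacks vary
    the trigger's shape, location, and blending.
    """
    patched = image.copy()
    patched[-size:, -size:] = value
    return patched

def poison_dataset(images, labels, target_class, poison_rate=0.1, seed=0):
    """Stamp the trigger onto a random fraction of training images and
    relabel them to the target class (dirty-label poisoning)."""
    rng = np.random.default_rng(seed)
    n_poison = int(len(images) * poison_rate)
    idx = rng.choice(len(images), n_poison, replace=False)
    images, labels = images.copy(), labels.copy()
    for i in idx:
        images[i] = apply_trigger(images[i])
        labels[i] = target_class
    return images, labels
```

Training on the poisoned set teaches the model to associate the trigger pattern with the target class; at test time the attacker applies `apply_trigger` to any input to force that prediction.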