diff --git a/notebooks/hugging_face_evasion.ipynb b/notebooks/hugging_face_evasion.ipynb
index bbd87dfba0..cef28b9df8 100644
--- a/notebooks/hugging_face_evasion.ipynb
+++ b/notebooks/hugging_face_evasion.ipynb
@@ -5,7 +5,7 @@
    "id": "8093e27a-33f6-4cd9-a47b-ea94c3d0c514",
    "metadata": {},
    "source": [
-    "# Evasion Attacks on Hugging Face Models using ART\n",
+    "# Evasion Attacks and Defenses on Hugging Face Models using ART\n",
     "\n",
     "In this notebook we will go over how to use ART to perform evasion attacks on a Hugging Face image classifier. We will be fine-tuning a pre-trained Data-efficient Image Transformer (DeiT) model available from Hugging Face on the CIFAR-10 dataset. We will apply the Projected Gradient Descent (PGD) attack on this model using ART functionality. Then we will be performing adversarial training to defend against such evasion attacks.\n",
     "\n",
diff --git a/notebooks/hugging_face_poisoning.ipynb b/notebooks/hugging_face_poisoning.ipynb
index c309f9efd5..55a2f728ee 100644
--- a/notebooks/hugging_face_poisoning.ipynb
+++ b/notebooks/hugging_face_poisoning.ipynb
@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Poisoning Hugging Face Models using ART\n",
+    "# Dealing with Poisoning Threats of Hugging Face Models using ART\n",
     "\n",
     "In this notebook, we will go over how to use ART to poison a Hugging Face image classifier. We will be applying the dirty label backdoor attack (DLBD) on the Imagenette dataset and fine-tuning a pre-trained Data-efficient Image Transformer (DeiT) model available from Hugging Face.\n",
     "\n",