Graduation-Project

A research project submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Electronic Engineering

Adversarial examples: Attacks Against Artificial Neural Networks

Prepared by:

  • Ahmed Eldaw Mohammed Abdelhamed
  • Mansour Hassan Osman Abdelwahid
  • Ruaa Ibrahim Mohamed Haroun

Supervised by:

  • Dr. Eiman Omer Mohammed Saleh

Contact: eiman.omer87@gmail.com

NOV 2020

Abstract

Artificial intelligence (AI) and machine learning algorithms receive input data and use statistical analysis to predict outcomes, giving machines the ability to think like humans in a way that supports many applications in daily life, such as self-driving cars, spam detectors, and machine-learning-powered scanners that check suitcases for weapons at airports. Unfortunately, despite their high accuracy, these algorithms can be tricked into making mistakes through adversarial attacks. Several known methods exist for crafting adversarial examples, and they vary greatly in complexity, computational cost, and the level of access required to the attacked model. Two pretrained image classification models were used as attack targets: LeNet, with 74.8% top-1 accuracy, and ResNet, with 92.3% top-1 accuracy, trained on the standard MNIST and CIFAR-10 datasets. The two attack methods were compared on the average distortion of the original image, the time and computing resources required to perform the attack, the percentage of attacks that succeed on the first attempt, and the resistance of the attacked models to each attack method. We also observed that the non-targeted attack has a very high success rate for both methods, and that the larger the epsilon value or the greater the number of targeted pixels, the less time the attack takes.
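The abstract does not name the two attack methods, but the epsilon parameter it mentions suggests a gradient-sign style perturbation such as FGSM. The following is only a minimal sketch of that kind of non-targeted attack, assuming a PyTorch image classifier; the function name and arguments are illustrative, not taken from this project's code.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon):
    """Craft a non-targeted adversarial example by nudging every pixel
    by epsilon in the direction that increases the classification loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step each pixel by epsilon along the sign of the loss gradient
    adversarial = image + epsilon * image.grad.sign()
    # Keep pixel values in the valid [0, 1] range
    return adversarial.clamp(0, 1).detach()
```

A larger epsilon makes the misclassification easier to trigger (consistent with the faster attacks reported above) at the cost of a more visible distortion of the original image.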
