USENIX Enigma 2017 — Adversarial Examples in Machine Learning

Nicolas Papernot, Google PhD Fellow at The Pennsylvania State University

Machine learning models, including deep neural networks, have been shown to be vulnerable to adversarial examples: subtly (and often imperceptibly) modified malicious inputs crafted to compromise the integrity of their outputs. Adversarial examples thus enable adversaries to manipulate system behavior. Potential attacks include attempts to control the behavior of vehicles, have spam identified as legitimate content, or have malware identified as legitimate software.
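To make "subtly modified" concrete, here is a minimal sketch of the fast gradient sign method (FGSM), one of the best-known crafting algorithms, applied to a toy logistic-regression classifier. The model, input, and epsilon budget are made up for illustration and are not from the talk; the point is only that a tiny per-feature perturbation, chosen along the loss gradient, can flip a confident prediction.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 784  # think of a flattened 28x28 grayscale image

# Hypothetical "victim": a logistic-regression classifier with random
# weights, standing in for any differentiable model.
w = rng.normal(size=d)

# A benign input with features in [0, 1], and a bias chosen so the
# model confidently assigns it to class 1.
x = rng.uniform(0.0, 1.0, size=d)
b = 3.0 - x @ w

def predict_proba(x):
    """Model's probability that the input belongs to class 1."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

def input_gradient(x, y):
    """Gradient of the cross-entropy loss w.r.t. the input (y in {0, 1})."""
    return (predict_proba(x) - y) * w

def fgsm(x, y, eps):
    """Fast gradient sign method: move every feature by at most eps
    in the direction that increases the loss for the true label y."""
    return np.clip(x + eps * np.sign(input_gradient(x, y)), 0.0, 1.0)

x_adv = fgsm(x, y=1, eps=0.02)  # 2% of the feature range per feature

print("clean prediction:        %.4f" % predict_proba(x))       # ~0.95
print("adversarial prediction:  %.4f" % predict_proba(x_adv))   # ~0.00
print("max per-feature change:  %.3f" % np.abs(x_adv - x).max())  # 0.02
```

Because the perturbation is bounded per feature, its size is fixed regardless of how much it shifts the model's output, which is what makes such changes easy to miss by inspection.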

In fact, the feasibility of misclassification attacks based on adversarial examples has been demonstrated for image, text, and malware classifiers. Furthermore, adversarial examples that affect one model often affect another model trained for the same task, even if the two models have very different architectures or training data. This effectively enables attackers to target remotely hosted victim classifiers with very little knowledge of their internals.
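The transferability property can be reproduced even in a toy setting. The sketch below, a rough illustration rather than the experiments from the talk, crafts adversarial examples against a local logistic-regression surrogate and evaluates them on a separately trained neural network "victim"; the dataset, models, and perturbation budget are illustrative assumptions using scikit-learn.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Illustrative binary classification task (not from the talk).
X, y = make_classification(n_samples=4000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Victim": a small neural network the attacker cannot inspect.
victim = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500,
                       random_state=1).fit(X_train, y_train)

# Attacker's local surrogate: a plain logistic regression trained on the
# same task (in a true black-box attack, it would be trained on labels
# obtained by querying the victim).
surrogate = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def fgsm_on_surrogate(X, y, eps):
    """FGSM using only the surrogate's gradients: for logistic
    regression, d(loss)/d(input) = (p - y) * coef_."""
    p = surrogate.predict_proba(X)[:, 1]
    grad = (p - y)[:, None] * surrogate.coef_
    return X + eps * np.sign(grad)

X_adv = fgsm_on_surrogate(X_test, y_test, eps=0.5)

print("victim accuracy, clean inputs:       %.3f" % victim.score(X_test, y_test))
print("victim accuracy, transferred inputs: %.3f" % victim.score(X_adv, y_test))
# The second number is typically much lower, even though the adversarial
# examples were crafted without ever touching the victim's parameters.
```

The two models share neither architecture nor parameters, yet perturbations aligned with the surrogate's decision boundary tend to cross the victim's boundary as well, which is the intuition behind black-box transfer attacks.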

This talk covers adversarial example crafting algorithms under varying threat models and in several application domains, as well as defenses proposed to mitigate such attacks. A practical tutorial is given throughout the talk, allowing participants to familiarize themselves with adversarial example crafting.

Sign up to find out more about Enigma conferences: https://www.usenix.org/conference/enigma2017#signup

Watch all Enigma 2017 videos at: http://enigma.usenix.org/youtube

via YouTube https://youtu.be/hUukErt3-7w
