Stealing Machine Learning Models via Prediction APIs

Machine learning (ML) models may be deemed confidential due to their sensitive training data, commercial value, or use in security applications. Increasingly often, confidential ML models are being deployed with publicly accessible query interfaces. ML-as-a-service ("predictive analytics") systems are an example: some allow users to train models on potentially sensitive data and charge others for access on a pay-per-query basis.

The tension between model confidentiality and public access motivates our investigation of model extraction attacks. In such attacks, an adversary with black-box access, but no prior knowledge of an ML model's parameters or training data, aims to duplicate the functionality of (i.e., "steal") the model. Unlike in classical learning theory settings, ML-as-a-service offerings may accept partial feature vectors as inputs and include confidence values with predictions. Given these practices, we show simple, efficient attacks that extract target ML models with near-perfect fidelity for popular model classes including logistic regression, neural networks, and decision trees. We demonstrate these attacks against the online services of BigML and Amazon Machine Learning. We further show that the natural countermeasure of omitting confidence values from model outputs still admits potentially harmful model extraction attacks. Our results highlight the need for careful ML model deployment and new model extraction countermeasures.

Source: https://www.usenix.org/system/files/conference/usenixsecurity16/sec16_paper_tramer.pdf
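
To make the abstract's claim concrete, here is a minimal sketch of the kind of equation-solving extraction the paper describes for logistic regression. If the API returns the real-valued class-1 confidence σ(w·x + b), each query yields one linear equation in the unknown parameters, so d + 1 linearly independent queries suffice to recover (w, b). The local sklearn model and the `query_api` helper below are hypothetical stand-ins for a real prediction API, not the paper's code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Hypothetical stand-in for the victim prediction API ---
rng = np.random.default_rng(0)
d = 5                                          # number of input features
X_train = rng.normal(size=(200, d))
y_train = (X_train @ rng.normal(size=d) > 0).astype(int)
victim = LogisticRegression().fit(X_train, y_train)

def query_api(x):
    """Black-box oracle: returns the class-1 confidence sigma(w.x + b)."""
    return victim.predict_proba(x.reshape(1, -1))[0, 1]

# Each confidence value gives one linear equation logit(f(x)) = w.x + b,
# so d + 1 linearly independent queries determine (w, b) exactly.
queries = rng.normal(size=(d + 1, d))
confidences = np.array([query_api(x) for x in queries])
logits = np.log(confidences / (1.0 - confidences))   # inverse sigmoid

A = np.hstack([queries, np.ones((d + 1, 1))])        # unknowns: [w, b]
solution = np.linalg.solve(A, logits)
w_stolen, b_stolen = solution[:-1], solution[-1]

print("max |w - w_stolen| =", np.max(np.abs(victim.coef_[0] - w_stolen)))
print("|b - b_stolen|     =", abs(victim.intercept_[0] - b_stolen))
```

Random Gaussian queries make the system almost surely invertible, and with exact confidences the recovery is exact up to floating-point error. This is why the paper's countermeasure discussion focuses on withholding or coarsening confidence values, and why the authors show that even label-only outputs still leak enough to permit extraction.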
