Extremal mechanisms for local differential privacy

Peter Kairouz, Sewoong Oh, Pramod Viswanath

Research output: Contribution to journal › Article

Abstract

Local differential privacy has recently surfaced as a strong measure of privacy in contexts where personal information remains private even from data analysts. Working in a setting where both the data providers and data analysts want to maximize the utility of statistical analyses performed on the released data, we study the fundamental trade-off between local differential privacy and utility. This trade-off is formulated as a constrained optimization problem: maximize utility subject to local differential privacy constraints. We introduce a combinatorial family of extremal privatization mechanisms, which we call staircase mechanisms, and show that it contains the optimal privatization mechanisms for a broad class of information theoretic utilities such as mutual information and f-divergences. We further prove that for any utility function and any privacy level, solving the privacy-utility maximization problem is equivalent to solving a finite-dimensional linear program, the outcome of which is the optimal staircase mechanism. However, solving this linear program can be computationally expensive since it has a number of variables that is exponential in the size of the alphabet the data lives in. To account for this, we show that two simple privatization mechanisms, the binary and randomized response mechanisms, are universally optimal in the low and high privacy regimes, and well approximate the intermediate regime.
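For a concrete sense of the privatization mechanisms discussed above, the minimal Python sketch below implements k-ary randomized response, one of the two simple mechanisms highlighted in the abstract, and numerically checks the ε-local-differential-privacy constraint (the ratio of conditional output probabilities across any two inputs is at most e^ε). This is an illustrative sketch, not the authors' code: the function names and the uniform-perturbation form are assumptions, and the paper's staircase and binary mechanisms are more general.

```python
import math
import random

def randomized_response(x, alphabet, epsilon):
    """k-ary randomized response: report the true symbol with probability
    e^eps / (e^eps + k - 1), otherwise report one of the remaining k - 1
    symbols uniformly at random."""
    k = len(alphabet)
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_true:
        return x
    return random.choice([a for a in alphabet if a != x])

def conditional_probs(alphabet, epsilon):
    """Row-stochastic matrix Q with Q[x][y] = P(report y | true value x)."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + len(alphabet) - 1)
    p_other = 1.0 / (math.exp(epsilon) + len(alphabet) - 1)
    return {x: {y: (p_true if y == x else p_other) for y in alphabet}
            for x in alphabet}

if __name__ == "__main__":
    alphabet = ["a", "b", "c", "d"]   # hypothetical 4-symbol data alphabet
    epsilon = 1.0
    print(randomized_response("a", alphabet, epsilon))

    # epsilon-LDP check: for every output y and every pair of inputs x, x',
    # Q(y | x) / Q(y | x') <= e^epsilon.
    Q = conditional_probs(alphabet, epsilon)
    worst = max(Q[x][y] / Q[xp][y]
                for y in alphabet for x in alphabet for xp in alphabet)
    assert worst <= math.exp(epsilon) + 1e-12
```

Running the script prints one privatized symbol and verifies the privacy constraint numerically for the assumed alphabet and privacy level.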

Original language: English (US)
Journal: Journal of Machine Learning Research
Volume: 17
State: Published - Apr 1 2016

Keywords

  • Estimation
  • Hypothesis testing
  • Information theoretic utilities
  • Local differential privacy
  • Mutual information
  • Privacy-preserving machine learning algorithms
  • Statistical inference
  • f-divergences

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Cite this

Extremal mechanisms for local differential privacy. / Kairouz, Peter; Oh, Sewoong; Viswanath, Pramod.

In: Journal of Machine Learning Research, Vol. 17, 01.04.2016.

@article{de7ac555460047a896730679471b13cf,
title = "Extremal mechanisms for local differential privacy",
keywords = "Estimation, Hypothesis testing, Information theoretic utilities, Local differential privacy, Mutual information, Privacy-preserving machine learning algorithms, Statistical inference, f-divergences",
author = "Peter Kairouz and Sewoong Oh and Pramod Viswanath",
year = "2016",
month = "4",
day = "1",
language = "English (US)",
volume = "17",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",

}
