Abstract
In distributed control systems with shared resources, participating agents can improve the overall performance of the system by sharing data about their personal preferences. In this paper, we formulate and study a natural tradeoff arising in these problems between the privacy of the agents' data and the performance of the control system. We formalize privacy in terms of differential privacy of the agents' preference vectors. The overall control system consists of N agents with linear discrete-time coupled dynamics, each controlled to track its preference vector. Performance of the system is measured by the mean squared tracking error. We present a mechanism that achieves differential privacy by adding Laplace noise to the shared information in a way that depends on the sensitivity of the control system to the private data. We show that for stable systems the performance cost of using this type of privacy-preserving mechanism grows as O(T³/(Nε²)), where T is the time horizon and ε is the privacy parameter. For unstable systems, the cost grows exponentially with time. From an estimation point of view, we establish a lower bound on the entropy of any unbiased estimator of the private data from any noise-adding mechanism that provides ε-differential privacy. We show that the mechanism achieving this lower bound is a randomized mechanism that also uses Laplace noise.
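As a concrete illustration of the noise-adding step described in the abstract, the sketch below shows a generic Laplace mechanism in Python: each coordinate of an agent's preference vector is perturbed with i.i.d. Laplace noise whose scale is the sensitivity divided by ε, the standard calibration for ε-differential privacy. The function name, the unit sensitivity, and the example vector are illustrative assumptions; the paper's actual mechanism calibrates the noise to the sensitivity of the control system to the private data.

```python
import numpy as np

def laplace_mechanism(preference, sensitivity, epsilon, rng=None):
    """Return an epsilon-differentially private copy of a preference vector.

    Adds i.i.d. Laplace noise with scale sensitivity / epsilon to each
    coordinate (generic Laplace mechanism; illustrative, not the paper's
    exact construction).
    """
    rng = np.random.default_rng() if rng is None else rng
    preference = np.asarray(preference, dtype=float)
    scale = sensitivity / epsilon
    return preference + rng.laplace(loc=0.0, scale=scale, size=preference.shape)

# Example: a single agent shares a noisy version of its preference vector.
if __name__ == "__main__":
    p = np.array([1.0, -0.5, 2.0])  # hypothetical private preference vector
    noisy_p = laplace_mechanism(p, sensitivity=1.0, epsilon=0.5)
    print(noisy_p)
```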
| Original language | English (US) |
| --- | --- |
| Article number | 7833044 |
| Pages (from-to) | 118-130 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Control of Network Systems |
| Volume | 4 |
| Issue number | 1 |
| DOIs | |
| State | Published - Mar 2017 |
Keywords
- Communication networks
- decision/estimation theory
- differential privacy
- distributed algorithms/control
ASJC Scopus subject areas
- Control and Systems Engineering
- Signal Processing
- Computer Networks and Communications
- Control and Optimization