Differential privacy is a strong privacy notion based on the indistinguishability of the outputs of two neighboring datasets, which represent the two states of an individual's information: being in or not in the dataset. However, when records are dependent, this representation loses its foundation. Motivated by this observation, we introduce a variant of the differential privacy notion based on the influence of the outputs on an individual's inputs. The new notion accurately captures how dependent records weaken the privacy guarantee of differential privacy. Our new privacy notion is compatible with differential privacy: when individuals are independent, the differential privacy model is a special case of our model; when individuals are dependent, the group privacy method for achieving differential privacy in the dependent case can also be used to achieve the new privacy model. This agrees with known results on differential privacy. Finally, our new privacy model is consistent with information theory. We prove that if a mechanism satisfies the new privacy notion, then the mutual information between an individual's input and the mechanism's outputs is upper bounded by a small value. This grounds the rationality of our new model in information theory.
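The group privacy method mentioned above can be illustrated with the standard Laplace mechanism: if an individual's record is correlated with up to k-1 other records, changing that individual's data can shift up to k records at once, so the noise must be calibrated to a k-fold effective sensitivity to retain the same guarantee. The following is a minimal sketch of that idea; the function `laplace_mechanism` and the group size `k` are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value with Laplace noise of scale sensitivity/epsilon."""
    scale = sensitivity / epsilon
    return true_value + rng.laplace(0.0, scale)

# Standard DP: neighboring datasets differ in one independent record,
# so a counting query has sensitivity 1.
# Group privacy under dependence: if one individual's change can affect
# up to k records, the effective sensitivity of the count grows to k,
# and the noise scale must grow by the same factor.
rng = np.random.default_rng(0)
data = np.array([1, 0, 1, 1, 0])
epsilon = 0.5
k = 3  # hypothetical size of a group of correlated records

independent_release = laplace_mechanism(data.sum(), sensitivity=1,
                                        epsilon=epsilon, rng=rng)
dependent_release = laplace_mechanism(data.sum(), sensitivity=k,
                                      epsilon=epsilon, rng=rng)
```

The dependent release uses noise of scale k/epsilon instead of 1/epsilon, which is exactly the epsilon-to-k*epsilon degradation that group privacy trades away to handle correlated records.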