CURATE: Scaling-Up Differentially Private Causal Graph Discovery

Entropy (Basel). 2024 Nov 5;26(11):946. doi: 10.3390/e26110946.

Abstract

Causal graph discovery (CGD) is the process of estimating the underlying probabilistic graphical model that represents the joint distribution of the features of a dataset. CGD algorithms fall broadly into two categories: (i) constraint-based algorithms, whose outcome depends on conditional independence (CI) tests, and (ii) score-based algorithms, whose outcome depends on an optimized score function. Because sensitive features of observational data are prone to privacy leakage, differential privacy (DP) has been adopted to ensure user privacy in CGD. Adding the same amount of noise at every step of this sequential estimation process degrades the predictive performance of the algorithms. The initial CI tests in constraint-based algorithms and the later iterations of the optimization process in score-based algorithms are crucial; they therefore need to be more accurate and less noisy. Based on this key observation, we present CURATE (CaUsal gRaph AdapTivE privacy), a DP-CGD framework with adaptive privacy budgeting. In contrast to existing DP-CGD algorithms that allocate a uniform privacy budget across all iterations, CURATE budgets privacy adaptively, minimizing the error probability of CI tests (constraint-based) and maximizing the number of iterations of the optimization problem (score-based), while keeping the cumulative privacy leakage bounded. To validate our framework, we present a comprehensive set of experiments on several datasets and show that CURATE achieves higher utility than existing DP-CGD algorithms with less privacy leakage.
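The core idea summarized above, spending more of a fixed privacy budget on the crucial iterations rather than splitting it uniformly, can be illustrated with a minimal sketch. The schedule below (a geometric decay favoring early CI tests), the Laplace mechanism, and all function names are illustrative assumptions, not the CURATE optimization itself; CURATE instead derives its per-iteration budgets from the error-probability/iteration-count optimization described in the abstract.

    # A minimal, hypothetical sketch (not CURATE's actual optimization): contrast a
    # uniform per-iteration privacy budget with an adaptive one under basic
    # sequential composition, where cumulative leakage sum(eps_i) <= eps_total.
    import numpy as np

    def uniform_budget(eps_total, num_steps):
        """Uniform budgeting: every iteration receives the same epsilon."""
        return np.full(num_steps, eps_total / num_steps)

    def adaptive_budget(eps_total, num_steps, decay=0.8):
        """Adaptive budgeting (illustrative): geometrically decaying weights give
        earlier iterations (e.g., initial CI tests) a larger share of the budget;
        normalization keeps the budgets summing to eps_total."""
        weights = decay ** np.arange(num_steps)
        return eps_total * weights / weights.sum()

    def noisy_release(true_value, sensitivity, eps, rng):
        """Standard Laplace mechanism: noise scale = sensitivity / eps, so a larger
        per-iteration eps yields a less noisy (more accurate) test statistic."""
        return true_value + rng.laplace(scale=sensitivity / eps)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        eps_total, steps = 1.0, 10
        for name, budgets in [("uniform", uniform_budget(eps_total, steps)),
                              ("adaptive", adaptive_budget(eps_total, steps))]:
            # Release a toy statistic (true value 0.5, sensitivity 1) per iteration.
            releases = [noisy_release(0.5, 1.0, eps, rng) for eps in budgets]
            print(f"{name:8s} budgets={np.round(budgets, 3)} "
                  f"total={budgets.sum():.3f} first_release={releases[0]:.3f}")

Both schedules keep the cumulative leakage at eps_total; the adaptive one simply concentrates accuracy where, per the observation above, it matters most.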

Keywords: adaptive privacy budgeting; causal graph discovery; differential privacy.