S Lab

Sibin Mohan

Systems Security Research Group at GWU and University of Illinois


Indistinguishability: Differential Privacy Protects Runtime Behaviors

Team Members: Sibin Mohan, Maryam Ghorbanvirdi

Collaborators: Chien-Ying Chen, Debopam Sanyal


Differential Privacy (DP) is the most widely used framework for data analysis with rigorous mathematical guarantees of privacy protection. Over the last two decades, DP has evolved from a theoretical concept into practical, large-scale deployments that protect critical data. Although its primary application has been data privacy, recent studies have examined DP's suitability for protecting real-time systems (RTS), where timing information itself can be a source of privacy leakage. Because RTS are highly deterministic, they are susceptible to scheduler side-channel attacks, in which a malicious actor uses observed execution patterns to infer private system state. Previous research has investigated ε-differential privacy (ε-DP) based scheduling, which adds Laplace noise to execution times to mitigate such risks. However, Laplace noise is vulnerable to persistent adversaries because of its predictability and fixed privacy budgets. Inspired by Rényi Differential Privacy (RDP), we present a novel adaptive privacy-preserving scheduling technique that dynamically adjusts the privacy noise according to system constraints and attack risk. Our method replaces fixed-noise ε-DP mechanisms with Gaussian-based noise injection and preserves schedule indistinguishability even under long-term adversarial monitoring. We evaluate the method in a client-server environment in which an adversary can either observe system responses from the server side or inject timing-based attacks from the client side. Our experimental results show that RDP-based scheduling outperforms ε-DP techniques in preserving indistinguishability while minimizing performance degradation.
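
To make the contrast concrete, the Python sketch below (a minimal illustration, not our actual scheduler implementation) compares the two noise mechanisms discussed above: ε-DP-style Laplace padding of a task's execution time versus the Gaussian padding used by RDP-based mechanisms. The function names, the nominal execution time, and the parameter values (epsilon, sigma) are hypothetical and chosen only for demonstration.

import numpy as np

def laplace_padded_time(exec_time, epsilon, sensitivity=1.0):
    # epsilon-DP style: Laplace noise with scale = sensitivity / epsilon
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return max(exec_time + noise, 0.0)  # padded time cannot be negative

def gaussian_padded_time(exec_time, sigma):
    # RDP style: Gaussian noise, whose privacy cost composes more gracefully
    # over the many repeated releases a periodic schedule produces
    noise = np.random.normal(loc=0.0, scale=sigma)
    return max(exec_time + noise, 0.0)

# Hypothetical illustration: what an observer sees over five scheduling periods
nominal = 10.0           # ms, assumed nominal execution time
eps, sigma = 1.0, 2.0    # assumed privacy parameters
print([round(laplace_padded_time(nominal, eps), 2) for _ in range(5)])
print([round(gaussian_padded_time(nominal, sigma), 2) for _ in range(5)])

In this toy setting, both mechanisms randomize the observable execution times; the difference our work targets is how the privacy loss of repeated observations accumulates, which RDP accounting handles more tightly than fixed-budget ε-DP.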

Funding: