Local Differentially Private Data Stream Mean Estimation
DOI: https://doi.org/10.63313/AJET.9038

Keywords: Local Differential Privacy, Privacy Protection, Data Stream, Privacy Budget Allocation, Mean Estimation

Abstract
With the rapid development of mobile IoT and smart devices, time-series data generated in applications such as health monitoring, traffic management, environmental sensing, and smart homes are continuously collected and uploaded for state monitoring, trend analysis, and intelligent decision-making. Due to their continuous generation, strong temporal correlation, and long-term accumulation, such data may still reveal sensitive information, including behavioral patterns, health conditions, and lifestyle preferences, even without explicit identifiers. Local Differential Privacy (LDP) offers an effective user-side privacy protection paradigm by perturbing raw data before transmission. However, in continuous data stream publishing scenarios, existing LDP methods for mean estimation still suffer from two major limitations: frequent perturbation uploads lead to cumulative noise and degraded estimation stability, while fixed privacy budget allocation cannot adapt well to dynamic data changes, resulting in an unsatisfactory privacy–utility trade-off. To address these challenges, this paper proposes Local Differentially Private Data Stream Mean Estimation (LDP-SM) for continuous data publishing. On the user side, a Kalman prediction-based triggering mechanism is introduced to adaptively switch between prediction-based and perturbation-based uploads, thereby reducing unnecessary perturbations and privacy budget consumption. When perturbation is required, a dynamic privacy budget allocation strategy combining PID feedback and sliding-window constraints is applied to improve budget utilization and maintain continuity in long-term release. On the server side, the reported data are aggregated in real time to produce more accurate and stable mean estimation results. Experimental results show that the proposed method effectively improves the accuracy and stability of mean estimation under LDP constraints.
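As a rough illustration of the user-side triggering idea described above (not the paper's actual algorithm), the sketch below alternates between prediction-based and perturbation-based uploads: a one-dimensional Kalman filter tracks the stream, and a perturbed value is uploaded only when the new reading drifts beyond a threshold from the prediction. All names, parameter values, and the plain Laplace mechanism are illustrative assumptions, and the PID/sliding-window budget allocation is omitted.

```python
import math
import random


def laplace_noise(scale):
    # Inverse-CDF sampling of Laplace(0, scale) noise.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)


class KalmanTriggeredReporter:
    """User-side reporter: uploads a perturbed value only when the
    Kalman prediction drifts beyond a threshold; otherwise it signals
    the server to reuse its prediction, saving privacy budget."""

    def __init__(self, epsilon_per_report, sensitivity, threshold,
                 process_var=1e-2, measure_var=1e-1):
        self.eps = epsilon_per_report   # budget spent per perturbed upload
        self.sens = sensitivity         # value range width (L1 sensitivity)
        self.threshold = threshold      # trigger for a perturbed upload
        self.q, self.r = process_var, measure_var
        self.x, self.p = 0.0, 1.0       # Kalman state estimate and variance

    def _kalman_update(self, z):
        # Standard 1-D Kalman filter: predict, then correct with reading z.
        p_pred = self.p + self.q
        k = p_pred / (p_pred + self.r)
        self.x = self.x + k * (z - self.x)
        self.p = (1.0 - k) * p_pred

    def report(self, value):
        prediction = self.x
        triggered = abs(value - prediction) > self.threshold
        self._kalman_update(value)
        if not triggered:
            return ("predict", None)    # no budget spent this step
        noisy = value + laplace_noise(self.sens / self.eps)
        return ("perturb", noisy)       # spends epsilon_per_report
```

On the server side, a "predict" report would be replaced by the server's own copy of the prediction before averaging, so only "perturb" steps consume privacy budget.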
License
Copyright (c) 2026 by author(s) and Erytis Publishing Limited.

This work is licensed under a Creative Commons Attribution 4.0 International License.