Academic literature on the topic 'Computational Differential Privacy'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Computational Differential Privacy.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Computational Differential Privacy"
Telaprolu, Bhavani Sankar. "Privacy-Preserving Federated Learning in Healthcare - A Secure AI Framework." International Journal of Scientific Research in Computer Science, Engineering and Information Technology 10, no. 3 (July 16, 2024): 703–7. https://doi.org/10.32628/cseit2410347.
Jain, Priyank, et al. "Differentially Private Data Release: Bias Weight Perturbation Method - A Novel Approach." Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 10 (April 28, 2021): 7165–73. http://dx.doi.org/10.17762/turcomat.v12i10.5607.
Kii, Masanobu, Atsunori Ichikawa, and Takayuki Miura. "Lightweight Two-Party Secure Sampling Protocol for Differential Privacy." Proceedings on Privacy Enhancing Technologies 2025, no. 1 (January 2025): 23–36. http://dx.doi.org/10.56553/popets-2025-0003.
Meisingseth, Fredrik, and Christian Rechberger. "SoK: Computational and Distributed Differential Privacy for MPC." Proceedings on Privacy Enhancing Technologies 2025, no. 1 (January 2025): 420–39. http://dx.doi.org/10.56553/popets-2025-0023.
Kim, Jongwook. "DistOD: A Hybrid Privacy-Preserving and Distributed Framework for Origin–Destination Matrix Computation." Electronics 13, no. 22 (November 19, 2024): 4545. http://dx.doi.org/10.3390/electronics13224545.
Fang, Juanru, and Ke Yi. "Privacy Amplification by Sampling under User-level Differential Privacy." Proceedings of the ACM on Management of Data 2, no. 1 (March 12, 2024): 1–26. http://dx.doi.org/10.1145/3639289.
Alborch Escobar, Ferran, Sébastien Canard, Fabien Laguillaumie, and Duong Hieu Phan. "Computational Differential Privacy for Encrypted Databases Supporting Linear Queries." Proceedings on Privacy Enhancing Technologies 2024, no. 4 (October 2024): 583–604. http://dx.doi.org/10.56553/popets-2024-0131.
Liu, Hai, Zhenqiang Wu, Yihui Zhou, Changgen Peng, Feng Tian, and Laifeng Lu. "Privacy-Preserving Monotonicity of Differential Privacy Mechanisms." Applied Sciences 8, no. 11 (October 28, 2018): 2081. http://dx.doi.org/10.3390/app8112081.
Vadrevu, Pavan Kumar. "Scalable Approaches for Enhancing Privacy in Blockchain Networks: A Comprehensive Review of Differential Privacy Techniques." Journal of Information Systems Engineering and Management 10, no. 8s (January 31, 2025): 635–48. https://doi.org/10.52783/jisem.v10i8s.1119.
Hong, Yiyang, Xingwen Zhao, Hui Zhu, and Hui Li. "A Blockchain-Integrated Divided-Block Sparse Matrix Transformation Differential Privacy Data Publishing Model." Security and Communication Networks 2021 (December 7, 2021): 1–15. http://dx.doi.org/10.1155/2021/2418539.
Full textDissertations / Theses on the topic "Computational Differential Privacy"
Alborch Escobar, Ferran. "Private Data Analysis over Encrypted Databases: Mixing Functional Encryption with Computational Differential Privacy." Electronic Thesis or Diss., Institut polytechnique de Paris, 2025. http://www.theses.fr/2025IPPAT003.
In our current digitalized society, data rules the world. But since it most often relates to individuals, its exploitation must respect their privacy. This concern gave rise to the differential privacy paradigm, which protects individuals when databases containing data about them are queried. With the emergence of cloud computing, it is also becoming increasingly necessary to ensure the confidentiality of the "on-cloud" storage of such vast databases, using encryption techniques. This thesis studies how to provide both privacy and confidentiality for such outsourced databases by mixing two primitives: computational differential privacy and functional encryption. First, we study the relationship between computational differential privacy and functional encryption for randomized functions in a generic way. We analyze the privacy of the setting where a malicious analyst may access the encrypted data stored on a server, either by corrupting or breaching it, and prove that a secure randomized functional encryption scheme supporting the appropriate family of functions guarantees the computational differential privacy of the system. Second, we construct efficient randomized functional encryption schemes for certain useful families of functions and prove them secure in the standard model under well-known assumptions. The families considered are linear functions, used for example in counting queries, histograms, and linear regressions, and quadratic functions, used for example in quadratic regressions and hypothesis testing. The schemes built are then combined with the first result to construct encrypted databases for their corresponding families of queries. Finally, we implement both randomized functional encryption schemes to analyze their efficiency, showing that our constructions are practical for databases with up to 1,000,000 entries in the case of linear queries and up to 10,000 entries in the case of quadratic queries.
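As a companion to the abstract above, here is a minimal, illustrative sketch of the Laplace mechanism applied to a counting query, the simplest member of the linear-query family the thesis targets. The function names and parameters are our own, not taken from the thesis, which additionally answers such queries over encrypted data under a computational differential privacy guarantee.

```python
import numpy as np

def laplace_counting_query(records, predicate, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one record
    changes the true count by at most 1), so Laplace noise with scale
    1/epsilon yields epsilon-DP. Computational DP relaxes this guarantee
    to computationally bounded adversaries, which is what allows the same
    kind of answer to be produced over encrypted records.
    """
    true_count = sum(1 for record in records if predicate(record))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Illustrative use: how many database entries exceed a threshold?
database = [12, 7, 30, 25, 3, 18]
print(laplace_counting_query(database, lambda x: x > 10, epsilon=0.5))
```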
Leukam Lako, Franklin. "Protection des données à caractère personnel pour les services énergétiques." Electronic Thesis or Diss., Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAS004.
Smart grids are important building blocks in the fight against climate change. They allow the massive introduction of renewable energies, which are intermittent, while guaranteeing grid stability, i.e., ensuring a real-time balance between demand and production in the power grid. Managing grid stability is possible thanks to smart meters installed in households, which allow the distribution system operator to collect consumption/production data from consumers/producers at a time step of up to 10 minutes in France. This real-time consumption data makes it possible to provide new energy services, such as customer consumption forecasts or demand response. Demand response services help avoid consumption peaks in a neighborhood by ensuring that, at all times, users' consumption does not exceed the maximum power of the local grid. However, the collection of users' consumption data is a key privacy concern. Indeed, individual consumption data reflect the use of all electric appliances by the inhabitants of a household over time, and make it possible to deduce the inhabitants' behaviors, activities, age, or preferences. This thesis aims to propose new energy services while protecting the privacy of consumers. We propose five contributions that relate to two themes:
1. The transformation of a demand response algorithm into a privacy-friendly one. This transformation uses secure multiparty computation, which allows computing an aggregate, such as a sum of users' consumption, without disclosing any individual consumption.
2. The publication of sums of users' consumption while preserving both privacy and utility. This publication uses differential privacy, ensuring that the published sum does not indirectly reveal any individual user's consumption. Among other energy services, these consumption sums enable consumption forecasting.
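For readers unfamiliar with the second theme, the following is a hedged sketch (not the thesis's actual algorithm) of publishing a differentially private sum of household consumption readings: each reading is clipped to a maximum power so one household's contribution to the sum is bounded, and Laplace noise calibrated to that bound is added before publication. All identifiers and parameter values are illustrative.

```python
import numpy as np

def private_consumption_sum(readings_kw, max_power_kw, epsilon):
    """Publish a sum of household consumption readings under epsilon-DP.

    Each reading is clipped to [0, max_power_kw], so a single household's
    contribution to the sum is bounded by max_power_kw; Laplace noise with
    scale max_power_kw / epsilon then makes the published aggregate
    epsilon-differentially private with respect to any one reading.
    """
    clipped = np.clip(readings_kw, 0.0, max_power_kw)
    noise = np.random.laplace(loc=0.0, scale=max_power_kw / epsilon)
    return float(np.sum(clipped) + noise)

# Illustrative readings (kW) from one neighborhood at a single time step.
readings = [1.2, 0.4, 3.1, 2.7, 0.0, 1.8]
print(private_consumption_sum(readings, max_power_kw=6.0, epsilon=1.0))
```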
Rao, Fang-Yu. "Privacy-Enhancing Techniques for Data Analytics." Thesis, 2019.
Organizations today collect and aggregate huge amounts of data from individuals under various scenarios and for different purposes. Such aggregation of individuals' data, when combined with techniques of data analytics, allows organizations to make informed decisions and predictions. But in many situations, different portions of the data associated with individuals are collected and curated by different organizations. To derive more accurate conclusions and predictions, those organizations may want to conduct the analysis based on their joint data, which cannot simply be accomplished by each organization exchanging its own data with other organizations, due to the sensitive nature of the data. Developing approaches for collaborative privacy-preserving data analytics, however, is a nontrivial task. At least two major challenges have to be addressed. The first challenge is that the security of the data possessed by each organization should always be properly protected during and after the collaborative analysis process, whereas the second challenge is the high computational complexity that usually accompanies the cryptographic primitives used to build such privacy-preserving protocols.
In this dissertation, based on widely adopted primitives in cryptography, we address the aforementioned challenges by developing techniques for data analytics that not only allow multiple mutually distrustful parties to perform data analysis on their joint data in a privacy-preserving manner, but also reduce the time required to complete the analysis. More specifically, using three common data analytics tasks as concrete examples, we show how to construct the respective privacy-preserving protocols under two different scenarios: (1) the protocols are executed by a collaborative process only involving the participating parties; (2) the protocols are outsourced to some service providers in the cloud. Two types of optimization for improving the efficiency of those protocols are also investigated. The first type allows each participating party access to a statistically controlled leakage so as to reduce the amount of required computation, while the second type utilizes the parallelism that could be incorporated into the task and pushes some computation to the offline phase to reduce the time needed for each participating party without any additional leakage. Extensive experiments are also conducted on real-world datasets to demonstrate the effectiveness of our proposed techniques.
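To make the flavor of such collaborative protocols concrete, below is a minimal additive-secret-sharing sketch of a secure sum among mutually distrustful parties. It is a textbook multiparty-computation building block, not one of the dissertation's protocols, and the function names and toy modulus are illustrative assumptions.

```python
import secrets

PRIME = 2_147_483_647  # toy field modulus (2**31 - 1); real systems use larger fields

def share(value, n_parties):
    """Split an integer into n additive shares modulo PRIME.

    Any n-1 shares are uniformly distributed and reveal nothing about the
    value; only the sum of all n shares reconstructs it.
    """
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def secure_sum(private_inputs):
    """Compute the sum of the parties' inputs without revealing any of them.

    Each party shares its input with everyone; party j adds up the j-th
    share it receives from every party, and the partial sums are combined.
    """
    n = len(private_inputs)
    all_shares = [share(value, n) for value in private_inputs]
    partial_sums = [sum(s[j] for s in all_shares) % PRIME for j in range(n)]
    return sum(partial_sums) % PRIME

print(secure_sum([10, 25, 7]))  # prints 42; no single input is disclosed
```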
Book chapters on the topic "Computational Differential Privacy"
Mironov, Ilya, Omkant Pandey, Omer Reingold, and Salil Vadhan. "Computational Differential Privacy." In Advances in Cryptology - CRYPTO 2009, 126–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03356-8_8.
Pejó, Balázs, and Damien Desfontaines. "Computational Power (C)." In Guide to Differential Privacy Modifications, 55–57. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-96398-9_9.
Ben Hamida, Sana, Hichem Mrabet, and Abderrazak Jemai. "How Differential Privacy Reinforces Privacy of Machine Learning Models?" In Advances in Computational Collective Intelligence, 661–73. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16210-7_54.
Valovich, Filipp, and Francesco Aldà. "Computational Differential Privacy from Lattice-Based Cryptography." In Number-Theoretic Methods in Cryptology, 121–41. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-76620-1_8.
Kang, Yilin, Jian Li, Yong Liu, and Weiping Wang. "Data Heterogeneity Differential Privacy: From Theory to Algorithm." In Computational Science – ICCS 2023, 119–33. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-35995-8_9.
Tao, Fuqiang, Zhe Sun, Rui Liang, Rundong Shao, Yuhan Chai, and Yangyang Wang. "FEDSET: Federated Random Forest Based on Differential Privacy." In Computational and Experimental Simulations in Engineering, 791–806. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-42987-3_55.
Podsevalov, Ivan, Alexei Podsevalov, and Vladimir Korkhov. "Differential Privacy for Statistical Data of Educational Institutions." In Computational Science and Its Applications – ICCSA 2022 Workshops, 603–15. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-10542-5_41.
Groce, Adam, Jonathan Katz, and Arkady Yerukhimovich. "Limits of Computational Differential Privacy in the Client/Server Setting." In Theory of Cryptography, 417–31. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-19571-6_25.
Qian, Jiaqing, Yanan Chen, and Sanxiu Jiao. "DPFL-AES: Differential Privacy Federated Learning Based on Adam Early Stopping." In Computational and Experimental Simulations in Engineering, 905–19. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-42515-8_64.
Bun, Mark, Yi-Hsiu Chen, and Salil Vadhan. "Separating Computational and Statistical Differential Privacy in the Client-Server Model." In Theory of Cryptography, 607–34. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-53641-4_23.
Full textConference papers on the topic "Computational Differential Privacy"
Li, Xianzhi, Ran Zmigrod, Zhiqiang Ma, Xiaomo Liu, and Xiaodan Zhu. "Fine-Tuning Language Models with Differential Privacy through Adaptive Noise Allocation." In Findings of the Association for Computational Linguistics: EMNLP 2024, 8368–75. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.491.
Gupta, Yash, Jeswin M. S, Aniruddh Mantrala, Davin Henry Monteiro, Adhithi M, and M. N. Thippeswamy. "Enhancing Differential Privacy in Federated Learning via Quantum Computation and Algorithms." In 2024 8th International Conference on Computational System and Information Technology for Sustainable Solutions (CSITSS), 1–6. IEEE, 2024. https://doi.org/10.1109/csitss64042.2024.10816807.
Vu, Doan Nam Long, Timour Igamberdiev, and Ivan Habernal. "Granularity is crucial when applying differential privacy to text: An investigation for neural machine translation." In Findings of the Association for Computational Linguistics: EMNLP 2024, 507–27. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.29.
Tajima, Arisa, Wei Jiang, Virendra Marathe, and Hamid Mozaffari. "Enhanced Private Decision Trees using Secure Multiparty Computation and Differential Privacy." In 2024 IEEE International Conference on Knowledge Graph (ICKG), 352–59. IEEE, 2024. https://doi.org/10.1109/ickg63256.2024.00051.
Flemings, James, and Murali Annavaram. "Differentially Private Knowledge Distillation via Synthetic Text Generation." In Findings of the Association for Computational Linguistics ACL 2024, 12957–68. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-acl.769.
Chen, Bo, Baike She, Calvin Hawkins, Alex Benvenuti, Brandon Fallin, Philip E. Paré, and Matthew Hale. "Differentially Private Computation of Basic Reproduction Numbers in Networked Epidemic Models." In 2024 American Control Conference (ACC), 4422–27. IEEE, 2024. http://dx.doi.org/10.23919/acc60939.2024.10644264.
Ramesh, Krithika, Nupoor Gandhi, Pulkit Madaan, Lisa Bauer, Charith Peris, and Anjalie Field. "Evaluating Differentially Private Synthetic Data Generation in High-Stakes Domains." In Findings of the Association for Computational Linguistics: EMNLP 2024, 15254–69. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.894.
Meisenbacher, Stephen, Maulik Chevli, Juraj Vladika, and Florian Matthes. "DP-MLM: Differentially Private Text Rewriting Using Masked Language Models." In Findings of the Association for Computational Linguistics ACL 2024, 9314–28. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-acl.554.
Ghazi, Badih, Rahul Ilango, Pritish Kamath, Ravi Kumar, and Pasin Manurangsi. "Towards Separating Computational and Statistical Differential Privacy." In 2023 IEEE 64th Annual Symposium on Foundations of Computer Science (FOCS). IEEE, 2023. http://dx.doi.org/10.1109/focs57990.2023.00042.
Kotevska, Olivera, Folami Alamudun, and Christopher Stanley. "Optimal Balance of Privacy and Utility with Differential Privacy Deep Learning Frameworks." In 2021 International Conference on Computational Science and Computational Intelligence (CSCI). IEEE, 2021. http://dx.doi.org/10.1109/csci54926.2021.00141.
Full textReports on the topic "Computational Differential Privacy"
Rannenberg, Kai, Sebastian Pape, Frédéric Tronnier, and Sascha Löbner. Study on the Technical Evaluation of De-Identification Procedures for Personal Data in the Automotive Sector. Universitätsbibliothek Johann Christian Senckenberg, October 2021. http://dx.doi.org/10.21248/gups.63413.