Contents
A selection of scholarly literature on the topic "Computational Differential Privacy"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Browse the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Computational Differential Privacy".
Next to each work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication as a PDF and read its online annotation, when the relevant parameters are available in the metadata.
Journal articles on the topic "Computational Differential Privacy"
Telaprolu, Bhavani Sankar. "Privacy-Preserving Federated Learning in Healthcare - A Secure AI Framework". International Journal of Scientific Research in Computer Science, Engineering and Information Technology 10, no. 3 (July 16, 2024): 703–7. https://doi.org/10.32628/cseit2410347.
Jain, Priyank, et al. "Differentially Private Data Release: Bias Weight Perturbation Method - A Novel Approach". Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 10 (April 28, 2021): 7165–73. http://dx.doi.org/10.17762/turcomat.v12i10.5607.
Kii, Masanobu, Atsunori Ichikawa, and Takayuki Miura. "Lightweight Two-Party Secure Sampling Protocol for Differential Privacy". Proceedings on Privacy Enhancing Technologies 2025, no. 1 (January 2025): 23–36. http://dx.doi.org/10.56553/popets-2025-0003.
Meisingseth, Fredrik, and Christian Rechberger. "SoK: Computational and Distributed Differential Privacy for MPC". Proceedings on Privacy Enhancing Technologies 2025, no. 1 (January 2025): 420–39. http://dx.doi.org/10.56553/popets-2025-0023.
Kim, Jongwook. "DistOD: A Hybrid Privacy-Preserving and Distributed Framework for Origin–Destination Matrix Computation". Electronics 13, no. 22 (November 19, 2024): 4545. http://dx.doi.org/10.3390/electronics13224545.
Fang, Juanru, and Ke Yi. "Privacy Amplification by Sampling under User-level Differential Privacy". Proceedings of the ACM on Management of Data 2, no. 1 (March 12, 2024): 1–26. http://dx.doi.org/10.1145/3639289.
Alborch Escobar, Ferran, Sébastien Canard, Fabien Laguillaumie, and Duong Hieu Phan. "Computational Differential Privacy for Encrypted Databases Supporting Linear Queries". Proceedings on Privacy Enhancing Technologies 2024, no. 4 (October 2024): 583–604. http://dx.doi.org/10.56553/popets-2024-0131.
Liu, Hai, Zhenqiang Wu, Yihui Zhou, Changgen Peng, Feng Tian, and Laifeng Lu. "Privacy-Preserving Monotonicity of Differential Privacy Mechanisms". Applied Sciences 8, no. 11 (October 28, 2018): 2081. http://dx.doi.org/10.3390/app8112081.
Vadrevu, Pavan Kumar. "Scalable Approaches for Enhancing Privacy in Blockchain Networks: A Comprehensive Review of Differential Privacy Techniques". Journal of Information Systems Engineering and Management 10, no. 8s (January 31, 2025): 635–48. https://doi.org/10.52783/jisem.v10i8s.1119.
Hong, Yiyang, Xingwen Zhao, Hui Zhu, and Hui Li. "A Blockchain-Integrated Divided-Block Sparse Matrix Transformation Differential Privacy Data Publishing Model". Security and Communication Networks 2021 (December 7, 2021): 1–15. http://dx.doi.org/10.1155/2021/2418539.
Dissertations on the topic "Computational Differential Privacy"
Alborch Escobar, Ferran. "Private Data Analysis over Encrypted Databases: Mixing Functional Encryption with Computational Differential Privacy". Electronic thesis, Institut polytechnique de Paris, 2025. http://www.theses.fr/2025IPPAT003.
In our current digitalized society, data rules the world. But since data mostly relates to individuals, its exploitation must respect their privacy. This concern gave rise to the differential privacy paradigm, which makes it possible to protect individuals when databases containing data about them are queried. With the emergence of cloud computing, it is also becoming increasingly necessary to protect the confidentiality of the on-cloud storage of such vast databases, using encryption techniques. This thesis studies how to provide both privacy and confidentiality for such outsourced databases by mixing two primitives: computational differential privacy and functional encryption. First, we study the relationship between computational differential privacy and functional encryption for randomized functions in a generic way. We analyze the privacy of the setting where a malicious analyst may access the encrypted data stored in a server, either by corrupting or breaching it, and prove that a secure randomized functional encryption scheme supporting the appropriate family of functions guarantees the computational differential privacy of the system. Second, we construct efficient randomized functional encryption schemes for certain useful families of functions, and we prove them secure in the standard model under well-known assumptions. The families considered are linear functions, used for example in counting queries, histograms, and linear regressions, and quadratic functions, used for example in quadratic regressions and hypothesis testing. These schemes are then combined with the first result to construct encrypted databases for the corresponding families of queries. Finally, we implement both randomized functional encryption schemes to analyze their efficiency. The results show that our constructions are practical for databases with up to 1,000,000 entries in the case of linear queries and up to 10,000 entries in the case of quadratic queries.
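For orientation, the "linear queries" discussed in this abstract are inner products between the database vector and a public coefficient vector. A minimal sketch, in plain Python without the encryption or noise layers of the thesis (all names are illustrative):

```python
def linear_query(database, coefficients):
    """Evaluate the linear query <database, coefficients>."""
    return sum(x * c for x, c in zip(database, coefficients))

# A counting query is the special case with 0/1 coefficients:
records = [1, 0, 1, 1, 0]        # 1 = record satisfies the predicate
count = linear_query(records, [1, 1, 1, 1, 1])  # counts over all rows -> 3
```

Histograms and linear regressions reduce to batches of such inner products, which is why a functional encryption scheme for linear functions covers all three query types.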
Leukam Lako, Franklin. "Protection des données à caractère personnel pour les services énergétiques" [Protection of personal data for energy services]. Electronic thesis, Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAS004.
Smart grids are important building blocks in the fight against climate change: they allow the massive introduction of intermittent renewable energies while guaranteeing grid stability, i.e., a real-time balance between demand and production in the power grid. Managing grid stability is possible thanks to smart meters installed in households, which let the distribution system operator collect consumption/production data from consumers/producers at a time step as fine as 10 minutes in France. This near-real-time consumption data enables new energy services, such as customer consumption forecasting or demand response. Demand response services help avoid consumption peaks in a neighborhood by ensuring that, at all times, users' consumption does not exceed the maximum power of the local grid. However, the collection of users' consumption data is a key privacy concern: individual consumption data reflect the use of all electric appliances in a household over time and make it possible to deduce the behaviors, activities, age, or preferences of the inhabitants. This thesis aims to propose new energy services while protecting the privacy of consumers. We propose five contributions organized around two themes: 1) transforming a demand response algorithm to make it privacy-friendly, using secure multiparty computation to compute an aggregate, such as a sum of users' consumption, without disclosing any individual consumption; 2) publishing sums of users' consumption with both privacy and good utility, using differential privacy to ensure that publishing the sum does not indirectly reveal individual users' consumption. Among other energy services, these consumption sums enable consumption forecasting.
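The second theme can be illustrated with the standard Laplace mechanism for releasing a noisy sum. A minimal sketch, not taken from the thesis (the clipping bound `max_kwh` and all names are illustrative assumptions):

```python
import random

def laplace_noise(scale: float) -> float:
    """Laplace(0, scale): the difference of two i.i.d. exponentials."""
    lam = 1.0 / scale
    return random.expovariate(lam) - random.expovariate(lam)

def private_sum(consumptions, epsilon: float, max_kwh: float) -> float:
    """Publish a sum of per-user consumption readings under epsilon-DP.

    Each user contributes one reading clipped to [0, max_kwh], so the
    L1 sensitivity of the sum is max_kwh, and Laplace noise with scale
    max_kwh / epsilon yields epsilon-differential privacy.
    """
    clipped = (min(max(c, 0.0), max_kwh) for c in consumptions)
    return sum(clipped) + laplace_noise(max_kwh / epsilon)
```

A smaller `epsilon` means stronger privacy but noisier published sums; at neighborhood scale the noise is amortized over many households, which is what preserves utility for forecasting.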
Rao, Fang-Yu. "Privacy-Enhancing Techniques for Data Analytics". Thesis, 2019.
Organizations today collect and aggregate huge amounts of data from individuals under various scenarios and for different purposes. Such aggregation of individuals' data, combined with data analytics techniques, allows organizations to make informed decisions and predictions. But in many situations, different portions of the data associated with individuals are collected and curated by different organizations. To derive more accurate conclusions and predictions, those organizations may want to conduct the analysis on their joint data, which cannot be accomplished simply by having each organization exchange its data with the others, due to the sensitive nature of the data. Developing approaches for collaborative privacy-preserving data analytics, however, is a nontrivial task. At least two major challenges have to be addressed: first, the data possessed by each organization must remain properly protected during and after the collaborative analysis; second, the cryptographic primitives used to build such privacy-preserving protocols usually carry high computational complexity.
In this dissertation, based on widely adopted cryptographic primitives, we address the aforementioned challenges by developing techniques for data analytics that not only allow multiple mutually distrustful parties to perform data analysis on their joint data in a privacy-preserving manner, but also reduce the time required to complete the analysis. More specifically, using three common data analytics tasks as concrete examples, we show how to construct the respective privacy-preserving protocols under two different scenarios: (1) the protocols are executed by a collaborative process involving only the participating parties; (2) the protocols are outsourced to service providers in the cloud. Two types of optimization for improving the efficiency of those protocols are also investigated. The first allows each participating party access to a statistically controlled leakage so as to reduce the amount of required computation, while the second utilizes the parallelism that can be incorporated into the task and pushes some computation to an offline phase, reducing the time needed for each participating party without any additional leakage. Extensive experiments on real-world datasets demonstrate the effectiveness of our proposed techniques.
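A standard building block for such multi-party protocols is additive secret sharing, which lets mutually distrustful parties learn an aggregate without revealing individual inputs. A toy single-process simulation, not taken from the dissertation (the modulus and function names are illustrative):

```python
import random

PRIME = 2**61 - 1  # public modulus; individual shares look uniformly random

def share(value: int, n_parties: int) -> list:
    """Split value into n additive shares that sum to value mod PRIME."""
    parts = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    parts.append((value - sum(parts)) % PRIME)
    return parts

def secure_sum(inputs: list) -> int:
    """Simulate the protocol: party i sends its j-th share to party j.

    Each party publishes only the sum of the shares it received; the
    published partial sums determine the total, but any proper subset
    of shares reveals nothing about an individual input.
    """
    n = len(inputs)
    received = [[] for _ in range(n)]
    for x in inputs:
        for j, s in enumerate(share(x, n)):
            received[j].append(s)
    partials = [sum(r) % PRIME for r in received]
    return sum(partials) % PRIME
```

In a real deployment the share exchange happens over pairwise secure channels, and the cost of this exchange is exactly the kind of overhead the dissertation's optimizations target.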
Book chapters on the topic "Computational Differential Privacy"
Mironov, Ilya, Omkant Pandey, Omer Reingold, and Salil Vadhan. "Computational Differential Privacy". In Advances in Cryptology - CRYPTO 2009, 126–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-03356-8_8.
Pejó, Balázs, and Damien Desfontaines. "Computational Power (C)". In Guide to Differential Privacy Modifications, 55–57. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-96398-9_9.
Ben Hamida, Sana, Hichem Mrabet, and Abderrazak Jemai. "How Differential Privacy Reinforces Privacy of Machine Learning Models?" In Advances in Computational Collective Intelligence, 661–73. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-16210-7_54.
Valovich, Filipp, and Francesco Aldà. "Computational Differential Privacy from Lattice-Based Cryptography". In Number-Theoretic Methods in Cryptology, 121–41. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-76620-1_8.
Kang, Yilin, Jian Li, Yong Liu, and Weiping Wang. "Data Heterogeneity Differential Privacy: From Theory to Algorithm". In Computational Science – ICCS 2023, 119–33. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-35995-8_9.
Tao, Fuqiang, Zhe Sun, Rui Liang, Rundong Shao, Yuhan Chai, and Yangyang Wang. "FEDSET: Federated Random Forest Based on Differential Privacy". In Computational and Experimental Simulations in Engineering, 791–806. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-42987-3_55.
Podsevalov, Ivan, Alexei Podsevalov, and Vladimir Korkhov. "Differential Privacy for Statistical Data of Educational Institutions". In Computational Science and Its Applications – ICCSA 2022 Workshops, 603–15. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-10542-5_41.
Groce, Adam, Jonathan Katz, and Arkady Yerukhimovich. "Limits of Computational Differential Privacy in the Client/Server Setting". In Theory of Cryptography, 417–31. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-19571-6_25.
Qian, Jiaqing, Yanan Chen, and Sanxiu Jiao. "DPFL-AES: Differential Privacy Federated Learning Based on Adam Early Stopping". In Computational and Experimental Simulations in Engineering, 905–19. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-42515-8_64.
Bun, Mark, Yi-Hsiu Chen, and Salil Vadhan. "Separating Computational and Statistical Differential Privacy in the Client-Server Model". In Theory of Cryptography, 607–34. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-53641-4_23.
Der volle Inhalt der QuelleKonferenzberichte zum Thema "Computational Differential Privacy"
Li, Xianzhi, Ran Zmigrod, Zhiqiang Ma, Xiaomo Liu und Xiaodan Zhu. „Fine-Tuning Language Models with Differential Privacy through Adaptive Noise Allocation“. In Findings of the Association for Computational Linguistics: EMNLP 2024, 8368–75. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.491.
Der volle Inhalt der QuelleGupta, Yash, Jeswin M. S, Aniruddh Mantrala, Davin Henry Monteiro, Adhithi M und M. N. Thippeswamy. „Enhancing Differential Privacy in Federated Learning via Quantum Computation and Algorithms“. In 2024 8th International Conference on Computational System and Information Technology for Sustainable Solutions (CSITSS), 1–6. IEEE, 2024. https://doi.org/10.1109/csitss64042.2024.10816807.
Der volle Inhalt der QuelleVu, Doan Nam Long, Timour Igamberdiev und Ivan Habernal. „Granularity is crucial when applying differential privacy to text: An investigation for neural machine translation“. In Findings of the Association for Computational Linguistics: EMNLP 2024, 507–27. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.29.
Der volle Inhalt der QuelleTajima, Arisa, Wei Jiang, Virendra Marathe und Hamid Mozaffari. „Enhanced Private Decision Trees using Secure Multiparty Computation and Differential Privacy“. In 2024 IEEE International Conference on Knowledge Graph (ICKG), 352–59. IEEE, 2024. https://doi.org/10.1109/ickg63256.2024.00051.
Der volle Inhalt der QuelleFlemings, James, und Murali Annavaram. „Differentially Private Knowledge Distillation via Synthetic Text Generation“. In Findings of the Association for Computational Linguistics ACL 2024, 12957–68. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-acl.769.
Der volle Inhalt der QuelleChen, Bo, Baike She, Calvin Hawkins, Alex Benvenuti, Brandon Fallin, Philip E. Paré und Matthew Hale. „Differentially Private Computation of Basic Reproduction Numbers in Networked Epidemic Models“. In 2024 American Control Conference (ACC), 4422–27. IEEE, 2024. http://dx.doi.org/10.23919/acc60939.2024.10644264.
Der volle Inhalt der QuelleRamesh, Krithika, Nupoor Gandhi, Pulkit Madaan, Lisa Bauer, Charith Peris und Anjalie Field. „Evaluating Differentially Private Synthetic Data Generation in High-Stakes Domains“. In Findings of the Association for Computational Linguistics: EMNLP 2024, 15254–69. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-emnlp.894.
Der volle Inhalt der QuelleMeisenbacher, Stephen, Maulik Chevli, Juraj Vladika und Florian Matthes. „DP-MLM: Differentially Private Text Rewriting Using Masked Language Models“. In Findings of the Association for Computational Linguistics ACL 2024, 9314–28. Stroudsburg, PA, USA: Association for Computational Linguistics, 2024. http://dx.doi.org/10.18653/v1/2024.findings-acl.554.
Der volle Inhalt der QuelleGhazi, Badih, Rahul Ilango, Pritish Kamath, Ravi Kumar und Pasin Manurangsi. „Towards Separating Computational and Statistical Differential Privacy“. In 2023 IEEE 64th Annual Symposium on Foundations of Computer Science (FOCS). IEEE, 2023. http://dx.doi.org/10.1109/focs57990.2023.00042.
Der volle Inhalt der QuelleKotevska, Olivera, Folami Alamudun und Christopher Stanley. „Optimal Balance of Privacy and Utility with Differential Privacy Deep Learning Frameworks“. In 2021 International Conference on Computational Science and Computational Intelligence (CSCI). IEEE, 2021. http://dx.doi.org/10.1109/csci54926.2021.00141.
Der volle Inhalt der QuelleBerichte der Organisationen zum Thema "Computational Differential Privacy"
Rannenberg, Kai, Sebastian Pape, Frédéric Tronnier und Sascha Löbner. Study on the Technical Evaluation of De-Identification Procedures for Personal Data in the Automotive Sector. Universitätsbibliothek Johann Christian Senckenberg, Oktober 2021. http://dx.doi.org/10.21248/gups.63413.