Theses on the topic "Privacy"
Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 theses for your research on the topic "Privacy".
Next to each source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Explore theses on a wide variety of disciplines and organize your bibliography correctly.
Loesing, Karsten. "Privacy-enhancing technologies for private services". Bamberg : Univ. of Bamberg Press, 2009. http://d-nb.info/994593937/34.
RICCI, STEFANO. "Il global privacy standard: i modelli di tutela della privacy". Doctoral thesis, Università degli Studi di Milano-Bicocca, 2013. http://hdl.handle.net/10281/44620.
This work proposes an alternative account of data protection in terms of confidentiality: the argument is that confidentiality should have more relevance because it illuminates the relationship between those who hold information and those to whom the information relates. In fact, I argue that the deficit of understanding in data protection laws is due to an equivocal centrality of the need to safeguard an individualistic and general right of privacy as a shield against arbitrary interference, instead of a need to protect relationships of trust. Conceptualising data protection through confidentiality serves to point out breaches of fiduciary duties. In contrast to the classic approach to privacy and data protection, confidentiality focuses on relationships rather than individuals because, far from a right to be let alone, confidentiality is based on the rules of trust within relationships: it is adequate to describe a breach of privacy through confidentiality as a breach of an implicit clause of a fiduciary relationship existing between the data controller (the confidant) and the data subject (the confider). The Global Privacy Standard (GPS) and Fair Information Practices (FIPs) show a clearly delineated ground for personal data protection in the form of confidentiality. If the GPS is understood not as creating a general right of privacy over personal data, but as carrying out the very different duty of confidentiality with respect to data protection, it can be seen not as attempting to regulate personal data based on the private nature of that information, but as merely establishing the framework in which such information is exchanged. Data protection should therefore be separated from privacy and placed within a legal framework of confidentiality, so that personal data can be better protected.
Foerster, Marian. "WWW Privacy - P3P Platform of Privacy Preferencers". Universitätsbibliothek Chemnitz, 2000. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200000598.
Purandare, Darshan. "ENHANCING MESSAGE PRIVACY IN WIRED EQUIVALENT PRIVACY". Master's thesis, University of Central Florida, 2005. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2998.
M.S., School of Computer Science, Engineering and Computer Science, Computer Science
WITTE, NATHAN ALLAN. "PRIVACY: ARCHITECTURE IN SUPPORT OF PRIVACY REGULATION". University of Cincinnati / OhioLINK, 2003. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1053701814.
Witte, Nathan. "Privacy architecture in support of privacy regulation /". Cincinnati, Ohio : University of Cincinnati, 2003. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=ucin1053701814.
De Montjoye, Yves-Alexandre. "Computational privacy : towards privacy-conscientious uses of metadata". Thesis, Massachusetts Institute of Technology, 2015. http://hdl.handle.net/1721.1/101850.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 111-127).
The breadcrumbs left behind by our technologies have the power to fundamentally transform the health and development of societies. Metadata about our whereabouts, social lives, preferences, and finances can be used for good but can also be abused. In this thesis, I show that the richness of today's datasets has rendered traditional data protection strategies outdated, requiring us to deeply rethink our approach. First, I show that the concept of anonymization, central to legal and technical data protection frameworks, does not scale. I introduce the concept of unicity to study the risks of re-identification of large-scale metadata datasets given p points. I then use unicity to show that four spatio-temporal points are enough to uniquely identify 95% of people in a mobile phone dataset and 90% of people in a credit card dataset. In both cases, I also show that traditional de-identification strategies such as data generalization are not sufficient to approach anonymity in modern high-dimensional datasets. Second, I argue that the second pillar of data protection, risk assessment, is similarly crumbling as data gets richer. I show, for instance, how standard mobile phone data (information on how and when somebody calls or texts) can be used to predict personality traits up to 1.7 times better than random. The risk of inference in big data will render comprehensive risk assessments increasingly difficult and, moving forward, potentially irrelevant, as they will require evaluating what can be inferred now, and in the future, from rich data. However, this data has great potential for good, especially in developing countries. While it is highly unlikely that we will ever find a magic bullet or even a one-size-fits-all approach to data protection, there are existing ways to use metadata in privacy-conscientious ways. I finish this thesis by discussing technical solutions (incl. privacy-through-security ones) which, when combined with legal and regulatory frameworks, provide a reasonable balance between the imperative of using this data and the legitimate concerns of the individual and society.
by Yves-Alexandre de Montjoye.
Ph.D.
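The unicity measure introduced in the abstract above lends itself to a compact sketch: draw p points at random from one user's trace and count how many traces in the dataset contain all of them. The snippet below is an illustration only; the traces are toy data, not the thesis's mobile phone or credit card datasets, and `unicity` is our own illustrative helper.

```python
import random

def unicity(traces, p, trials=200, seed=0):
    """Estimate unicity: the fraction of sampled users whose trace is the
    only one in the dataset consistent with p points drawn from it."""
    rng = random.Random(seed)
    users = list(traces)
    unique = 0
    for _ in range(trials):
        user = rng.choice(users)
        points = rng.sample(sorted(traces[user]), p)
        # A trace "matches" if it contains every one of the p sampled points.
        matches = sum(1 for t in traces.values() if all(pt in t for pt in points))
        unique += (matches == 1)
    return unique / trials

# Toy spatio-temporal traces: sets of (hour, antenna-id) points.
traces = {
    "u1": {(8, "A"), (12, "B"), (18, "C"), (22, "A")},
    "u2": {(8, "A"), (12, "B"), (18, "D"), (23, "E")},
    "u3": {(9, "F"), (13, "B"), (19, "C"), (21, "G")},
}
print(unicity(traces, p=3))  # -> 1.0: every 3-point sample is unique here
```

With p=2 the first two traces share two points, so some samples stop being unique; the thesis's finding is that in real high-dimensional traces very few points already suffice.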
Sato, Keiko. "Privacy on the internet : Investigation into corporate privacy policy of Australian large private sector organisations on the internet". Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2001. https://ro.ecu.edu.au/theses/1032.
Véliz, Carissa. "On privacy". Thesis, University of Oxford, 2017. https://ora.ox.ac.uk/objects/uuid:afb31b0e-f022-48a6-b239-4c704cfd4484.
Bruggen, Harry van der. "Patiënt, privaat en privacy de stoelgang als gezondheidswetenschappelijk probleem /". Lochem : Maastricht : De Tijdstroom ; University Library, Maastricht University [Host], 1991. http://arno.unimaas.nl/show.cgi?fid=5616.
Loesing, Karsten [Verfasser]. "Privacy-enhancing technologies for private services / von Karsten Loesing". Bamberg : Univ. of Bamberg Press, 2009. http://d-nb.info/994593937/34.
Tuomisto, Tino, Adrian Ringström and Aleksi Vekki. "Is your privacy private on mobile social media platforms?" Thesis, Linnéuniversitetet, Institutionen för marknadsföring (MF), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-96089.
Johnson, Virginia Wilson. "Architectural correlates of privacy : the dynamics of privacy regulation /". Diss., This resource online, 1990. http://scholar.lib.vt.edu/theses/available/etd-07132007-143142/.
Mao, Congcong. "Privacy Issues in IoT : Privacy concerns in smart home". Thesis, Linnéuniversitetet, Institutionen för informatik (IK), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-90587.
Alhussein, Nawras. "Privacy by Design & Internet of Things: managing privacy". Thesis, Malmö universitet, Fakulteten för teknik och samhälle (TS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:mau:diva-20590.
Texto completoPrivacy means the right to be left alone. It has been questioned many times if privacy really exists on the internet, especially in Internet of Things systems or smart systems as they are also called. More questions occur when the new general data protection regulation (GDPR) within the European Union applies in May. In this paper privacy by design that the general data protection regulation comes with is being studied. This study answers whether privacy by design will be able to increase the protection of privacy in Internet of Things systems. Advantages and disadvantages are also addressed and how companies and common users are affected by the implementation of privacy by design. The question has been answered by a literature review and two interviews. It turned out that a significant part of the problems in Internet of Things regarding privacy may be solved by data management. The privacy by design includes protection of data in all states through different methods such as encryption. In this way, privacy by design contributes to increased security within Internet of Things system.
Morrison, Roberta. "Drawing the line : understanding privacy concern, privacy literacy and trust influences on online social network privacy boundaries". Thesis, University of Strathclyde, 2013. http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=25563.
Kolter, Jan Paul. "User-centric privacy a usable and provider-independent privacy infrastructure". Lohmar Köln Eul, 2009. http://d-nb.info/1002958776/04.
Nordström, Michael and Sergej Sevcenko. "Internet Privacy : A look into the construct of Privacy Knowledge". Thesis, Internationella Handelshögskolan, Högskolan i Jönköping, IHH, Marketing and Logistics, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-18313.
Alhalafi, Dhafer. "Privacy policy-based framework for privacy disambiguation in distributed systems". Thesis, De Montfort University, 2015. http://hdl.handle.net/2086/12267.
Gunnarsson, Annicka and Siri Ekberg. "Invasion of Privacy : Spam - one result of bad privacy protection". Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik och datavetenskap, 2003. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-5393.
Biondi, Alessandro. "Tutela della privacy in Android ed educazione alla mobile privacy". Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2022. http://amslaurea.unibo.it/25784/.
Masmoudi, Souha. "Malleable privacy-enhancing-technologies for privacy-preserving identity management systems". Electronic Thesis or Diss., Institut polytechnique de Paris, 2022. http://www.theses.fr/2022IPPAS023.
Texto completoDigital identities are, nowadays, used at a large scale (i.e., in public services, social medias, at work, online shopping, etc.). This brings usability issues as users are constrained to deal with multiple identities and attributes for access control and data sharing objectives. In addition, security and privacy challenges have arisen as the interacting entities, those that issue, process and collect these identities can, due to their behavior or security deficiencies, lead to identity theft, massive data collection and tracking of users' behaviors on the Internet.This thesis aims at finding the best trade-off between security, privacy and usability for identity management systems, based on cryptographic primitives. The first two contributions focus on identity management for access control and consider real identities and attributes that contain personal (e.g., age) and sensitive (e.g., biometric traits) information.The first contribution proposes a user-centric and privacy-preserving identity management system in which users keep control over their attributes. A user, that receives attributes certified by an identity provider, is able to interact, in a pseudonymized manner, with a service provider and prove the authenticity of the provided attributes while ensuring that he discloses only the minimum number of attributes. This solution is based on a new malleable signature scheme that allows users to modify the certificate issued by the identity provider on his attributes in a restricted and controlled manner. It also preserves privacy by satisfying the unlinkability property between curious service providers that try to link different transactions to the same user.The second contribution presents a new biometric authentication scheme that offers robustness and privacy guarantees. Three steps are required. First, the user physically visits the identity provider that pushes an encrypted and certified biometric template onto his smartphone. 
Then he remotely enrolls at a service provider, in an anonymous manner. Finally, he authenticates offline to the service provider that captures a new biometric template in order to be locally verified via the smartphone. By relying on malleable signatures, the proposed solution prevents the use of fake biometric identities and guarantees the authentication soundness. Unlinkability and anonymity are also preserved.The third contribution provides a solution to meet the need of data sharing in an identity management system. In particular, it studies the management of users ephemeral attributes in the context of proximity tracing for e-healthcare systems. The proposed solution ensures data consistency and integrity and preserves the privacy of users who share their contact information with people in proximity. Alerts are issued to users who have been in contact with infected persons. The use of a hybrid architecture, which relies on a centralized server and decentralized proxies, allows to prevent malicious users from injecting false alerts, and to prevent the linkability of contact information to the same user and the re-identification of users involved in contact with an infected person
Iachello, Giovanni. "Privacy and Proportionality". Diss., Georgia Institute of Technology, 2006. http://hdl.handle.net/1853/10487.
Boldt, Martin. "Privacy-Invasive Software". Doctoral thesis, Karlskrona : Blekinge Institute of Technology, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-00459.
Zimmermann, Christian [Verfasser] and Günter [Akademischer Betreuer] Müller. "Privacy through accountability". Freiburg : Universität, 2016. http://d-nb.info/1122594003/34.
Kang, Ted Taiho. "Respect My Privacy". Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/53125.
Includes bibliographical references (p. 64).
Most social networks have implemented extensive, complex privacy controls in order to battle the host of privacy concerns that initially plagued their online communities. These privacy controls have taken the form of access restriction, which allows users to specify who is and who is not allowed to view their personal information. This binary system leaves users unprotected in the (hopefully rare) cases in which the access restriction mechanisms are bypassed. Respect My Privacy offers a different approach to privacy protection, founded on the philosophies of Information Accountability. It aims to allow users to clearly declare the policies that govern the use of their data, and to implement mechanisms that promptly notify the user of misuse after the fact. In its current state, the Respect My Privacy project has been implemented across three platforms, Facebook, OpenSocial, and the Tabulator extension, with a focus on defining a clear vocabulary for discussing restrictions on the use of data and on making it simple for users to display and edit the restrictions they wish to place on their personal information. There is also a discussion of decentralized social networks and their role in the future of Respect My Privacy and social networks in general.
by Ted Taiho Kang.
M.Eng.
Winkler, Stephanie D. "Protecting Online Privacy". UKnowledge, 2016. http://uknowledge.uky.edu/comm_etds/47.
Jakobsson, Björn Markus. "Privacy vs. authenticity /". Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 1997. http://wwwlib.umi.com/cr/ucsd/fullcit?p9804529.
Cobb, Christopher B. R. "Combatting maritime privacy". access online version, 1994. http://handle.dtic.mil/100.2/ADA295083.
Salucci, Simone. "Privacy su Facebook". Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2013. http://amslaurea.unibo.it/5055/.
Texto completoBonatti, Piero A., Bert Bos, Stefan Decker, Garcia Javier David Fernandez, Sabrina Kirrane, Vassilios Peristeras, Axel Polleres y Rigo Wenning. "Data Privacy Vocabularies and Controls: Semantic Web for Transparency and Privacy". CEUR Workshop Proceedings, 2018. http://epub.wu.ac.at/6490/1/SW4SG_2018.pdf.
BARBOSA, Pedro Yóssis Silva. "Privacy by evidence: a software development methodology to provide privacy assurance". Universidade Federal de Campina Grande, 2018. http://dspace.sti.ufcg.edu.br:8080/jspui/handle/riufcg/1613.
Texto completoMade available in DSpace on 2018-08-30T21:18:48Z (GMT). No. of bitstreams: 1 PEDRO YÓSSIS SILVA BARBOSA – TESE (PPGCC) 2018.pdf: 5191620 bytes, checksum: bf226ff6b5409b6330fd126cc2677503 (MD5) Previous issue date: 2018-02
Capes
In an increasingly connected world, a diversity of software and sensors collect data from the environment and its inhabitants. Because of the richness of the information collected, privacy becomes an important requirement. Applications are being developed, and, although there are principles and rules regarding the privacy of individuals, there is still a lack of methodologies to guide the integration of privacy guidelines into the development process. Existing methodologies like Privacy by Design (PbD) are still vague and leave many open questions on how to apply them in practice. In this work we propose the concept of Privacy by Evidence (PbE), a software development methodology to provide privacy assurance. Given the difficulty of providing total privacy in many applications, we propose to document the mitigations in the form of evidences of privacy, aiming to increase the confidence of the project. To validate its effectiveness, PbE has been used during the development of four applications that serve as case studies. The first is a smart metering application; the second a people counting and monitoring application; the third an energy efficiency monitoring system; and the fourth a two-factor authentication system. For these applications, the teams were able to provide seven, five, five, and four evidences of privacy, respectively, and we conclude that PbE can be effective in helping to understand and address privacy protection needs when developing software.
Al-Rawashdeh, Sami H. "Is privacy brought home? : criminal justice and the right to privacy". Thesis, University of Aberdeen, 2003. http://digitool.abdn.ac.uk/R?func=search-advanced-go&find_code1=WSN&request1=AAIU176274.
Bromander, Anton. "Using Privacy Indicators to Nudge Users into Selecting Privacy Friendly Applications". Thesis, Karlstads universitet, Institutionen för matematik och datavetenskap (from 2013), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-73154.
Thibes, Mariana Zanata. "Orkut: o público, o privado e o íntimo na era das novas tecnologias da informação". Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/8/8132/tde-12072010-135357/.
Texto completoThe concepts of private, public and the intimacy have suffered important changes since the new information technologies took part of the everyday life. If, on the one hand, it is possible to observe a certain refinement of the techniques of control and vigilance, on the other, these technologies allow a reflexive exercise that leads to new experiences of the private, public and the intimacy. Through the examination of the sociability that takes place at orkut, this research tried to analyze how these new experiences have been configured, observing that, despite of the dynamic of this sociability reveals affinities with the objectives of the control society, it also allows the liberty to create identities and to redefine the rules that guide the life, stimulating a kind of reflection that points out the strengthening of the politics.
Braathen, Anders Magnus and Hans Steien Rasmussen. "Preserving privacy in UbiCollab: Extending privacy support in a ubiquitous collaborative environment". Thesis, Norwegian University of Science and Technology, Department of Computer and Information Science, 2005. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-9224.
UbiCollab is a platform that supports the development of cooperative applications for collaboration in a ubiquitous environment. The platform enables entities of different types and technologies to work together and share a common set of resources. In a collaborative setting, trust is crucial for creating bonds between the different participants and the system. People using these kinds of systems need to feel secure and trust the system enough to give personal information away, and to feel that they can control the use of this gathered information. By personal information we mean name, title, email, etc., but also location or the type of task the user is performing within the system. This thesis explores multiple identities in ubiquitous collaboration as a mechanism for improving the privacy of UbiCollab. It also explores the building and displaying of a reputation from past collaborative experiences in connection with the different identities. To realize these mechanisms, the system allows anonymous access to services by communicating through a privacy proxy. UbiCollab uses a privacy policy description engine that enables negotiation on how private data is gathered and used by the system. The different identities are supplied with a set of preferences that describes what actions the system is allowed to perform on their personal data. This gives the user control over the gathering and sharing of personal information. The policy description is based on an adaptation of the P3P standard, designed to suit policy descriptions in a service-based architecture. Privacy extensions to existing or new services can easily be added through a reference to where the policies can be found. As a counterpart to the P3P policies, the P3P Preference Exchange Language (APPEL) has been incorporated into the platform to give users a way to post their privacy preferences.
The adapted API has been redefined to better suit the development of UbiCollab applications. The resulting prototype demonstrates the use of these privacy mechanisms and their value to the UbiCollab platform.
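The P3P/APPEL negotiation sketched in the abstract above can be illustrated with a toy matcher. Real P3P policies and APPEL preference rulesets are XML documents with a fixed vocabulary; the dictionaries and field names below (`purpose`, `retention`) are simplified stand-ins, not the actual P3P schema.

```python
# A service's declared data handling, and the user's accepted handling per
# data category. Field names are illustrative, not the real P3P vocabulary.
service_policy = {
    "location": {"purpose": "service", "retention": "session"},
    "email":    {"purpose": "marketing", "retention": "indefinite"},
}
user_prefs = {
    "location": {"purpose": {"service"}, "retention": {"session", "30-days"}},
    "email":    {"purpose": {"service"}, "retention": {"session"}},
}

def acceptable(policy, prefs):
    """Return the (item, field, value) triples where the declared handling
    falls outside what the user's preferences allow."""
    violations = []
    for item, terms in policy.items():
        allowed = prefs.get(item, {})
        for field, value in terms.items():
            if value not in allowed.get(field, set()):
                violations.append((item, field, value))
    return violations

print(acceptable(service_policy, user_prefs))
# -> [('email', 'purpose', 'marketing'), ('email', 'retention', 'indefinite')]
```

In a UbiCollab-style setup, a non-empty violation list would mean negotiation fails (or the user is warned) before any personal data is released to the service.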
Fischer-Hübner, Simone. "IT-security and privacy : design and use of privacy-enhancing security mechanisms /". Berlin [u.a.] : Springer, 2001. http://www.loc.gov/catdir/enhancements/fy0812/2001034161-d.html.
Harmer, Jeremy Michael. "Is Internet privacy dead? : recovering Internet privacy in an increasingly surveillant society". Thesis, University of Leeds, 2017. http://etheses.whiterose.ac.uk/18753/.
Texto completoBordenabe, Nicolás E. "Measuring Privacy with Distinguishability Metrics: Definitions, Mechanisms and Application to Location Privacy". Palaiseau, Ecole polytechnique, 2014. https://tel.archives-ouvertes.fr/tel-01098088/document.
Texto completoThe increasing availability of smartphone and tablets has given place to the development of a broad new class of applications, which collect and analyze big amounts of information about its users for different reasons: offering a personalized service, offer targeted advertisement, or provide accurate aggregated data for research and analysis purposes. However, serious privacy concerns have been risen about the kind and quantity of data being collected: this data is in general private by nature, and often it can be linked to other kinds of sensitive information. And in most cases, this information is made available to an untrusted entity, either because the service provider itself is not reliable, or because some aggregated information is being publicly released. In order to deal with these concerns, some kind of privacy guarantee is needed. Differential Privacy is one of the most prominent frameworks used to deal with disclosure prevention in statistical databases. It provides a formal privacy guarantee, ensuring that sensitive information relative to individuals cannot be easily inferred by disclosing answers to aggregate queries. If two databases are adjacent, i. E. Differ only for an individual, then the query should not allow to tell them apart by more than a certain factor. This induces a bound also on the distinguishability of two generic databases, which is determined by their distance on the Hamming graph of the adjacency relation. When the sensitive information to be protected is other than the value of a single individual, or when the secrets itself are not databases at all, it is common to consider different notions of distinguishability, which depend on the application at hand and the privacy guarantees we wish to express. In the first part of this thesis we explore the implications of differential privacy when the indistinguishability requirement depends on an arbitrary notion of distance. 
We show that we can naturally express, in this way, (protection against) privacy threats that cannot be represented with the standard notion, leading to new applications of the differential privacy framework. We give intuitive characterizations of these threats in terms of Bayesian adversaries. We revisit the well-known results about universally optimal mechanisms, and show that, in our setting, these mechanisms exist for sum, average, and percentile queries. In the second part of this thesis we introduce geo-indistinguishability, a formal notion of privacy for location-based systems. This privacy definition corresponds to an instance of the generalized version of differential privacy presented before. We also show a mechanism for achieving this notion and study different issues that arise with its implementation, namely the truncation of the result and the effect of the precision of the machine. We also describe how to use our mechanism to enhance LBS applications with geo-indistinguishability guarantees without compromising the quality of the results. In the last part of this thesis, we consider the location privacy framework of Shokri et al. , which offers an optimal trade-off between the loss of quality of service and the privacy protection with respect to a given Bayesian adversary. We show that it is possible to combine the advantages of this approach with ours: given a minimum threshold for the degree of geo-indistinguishability, we construct a mechanism that offer maximal utility, as the solution of a linear optimization problem. Since geo-indistinguishability is insensitive to the remapping of a Bayesian adversary, this mechanism is optimal also in the sense of Shokri et al. 
Furthermore we propose a method to reduce the number of constraints of the linear program from cubic to quadratic, enlarging significantly the size of location sets for which the optimal trade-off mechanisms can still be computed, while maintaining the privacy guarantees without affecting significantly the utility of the generated mechanism
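A concrete mechanism achieving geo-indistinguishability is the planar Laplace mechanism from the geo-indistinguishability literature: report a noisy point drawn around the true location with a radially decaying density. A minimal sketch in planar coordinates, deliberately omitting the truncation and machine-precision handling that the abstract notes are needed in practice:

```python
import math
import random

def planar_laplace(x, y, epsilon, rng):
    """Draw a noisy location around (x, y) from the planar Laplace
    distribution: any two true locations at distance d produce output
    densities that differ by at most a factor exp(epsilon * d)."""
    theta = rng.uniform(0.0, 2.0 * math.pi)  # uniform direction
    # The radial density eps^2 * r * exp(-eps * r) is Gamma(2, 1/eps),
    # i.e. the sum of two independent Exponential(eps) draws.
    r = rng.expovariate(epsilon) + rng.expovariate(epsilon)
    return x + r * math.cos(theta), y + r * math.sin(theta)

rng = random.Random(42)
# epsilon = 0.01 per metre: expected displacement is 2/epsilon = 200 m.
noisy_x, noisy_y = planar_laplace(0.0, 0.0, 0.01, rng)
```

Choosing epsilon per unit of distance is what makes the guarantee geographic: nearby locations stay nearly indistinguishable while far-apart ones need not be.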
Frik, Alisa. "Economics of Privacy: Users’ Attitudes and Economic Impact of Information Privacy Protection". Doctoral thesis, Università degli studi di Trento, 2017. https://hdl.handle.net/11572/368319.
Frik, Alisa. "Economics of Privacy: Users' Attitudes and Economic Impact of Information Privacy Protection". Doctoral thesis, University of Trento, 2017. http://eprints-phd.biblio.unitn.it/2025/1/Frik_Alisa_Thesis.pdf.
Grivet Sébert, Arnaud. "Combining differential privacy and homomorphic encryption for privacy-preserving collaborative machine learning". Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG037.
The purpose of this PhD is to design protocols to collaboratively train machine learning models while keeping the training data private. To do so, we focused on two privacy tools, namely differential privacy and homomorphic encryption. While differential privacy makes it possible to deliver a functional model immune to attacks on training data privacy by end-users, homomorphic encryption allows a server to be used as a totally blind intermediary between the data owners, providing computational resources without any access to clear information. Yet these two techniques are of totally different natures, and both entail their own constraints, which may interfere: differential privacy generally requires the use of continuous and unbounded noise, whereas homomorphic encryption can only deal with numbers encoded with a quite limited number of bits. The presented contributions make these two privacy tools work together by coping with their interferences and even leveraging them, so that the two techniques may benefit from each other. In our first work, SPEED, we built on the Private Aggregation of Teacher Ensembles (PATE) framework and extended the threat model to deal with an honest-but-curious server by covering the server computations with a homomorphic layer. We carefully define which operations are realised homomorphically so as to perform as little computation as possible in the costly encrypted domain, while revealing little enough information in the clear for it to be easily protected by differential privacy. This trade-off forced us to realise an argmax operation in the encrypted domain, which, even if reasonable, remained expensive. That is why, in another contribution, we propose SHIELD, an argmax operator made inaccurate on purpose, both to satisfy differential privacy and to lighten the homomorphic computation. The last presented contribution combines differential privacy and homomorphic encryption to secure a federated learning protocol.
The main challenge of this combination comes from the necessary quantisation of the noise induced by encryption, which complicates the differential privacy analysis and justifies the design and use of a novel quantisation operator that commutes with the aggregation.
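The quantisation issue mentioned in that last paragraph can be illustrated without any real cryptography: clients encode their updates as fixed-point integers (the representation a homomorphic scheme would encrypt) and add integer-valued two-sided geometric ("discrete Laplace") noise, whose privacy analysis survives quantisation. This is a generic sketch of the idea, not the protocol from the thesis; plain integer addition stands in for the homomorphic aggregation, and names such as `SCALE` and `client_share` are illustrative.

```python
import math
import random

SCALE = 1000  # fixed-point encoding: keep three decimal digits as integers

def two_sided_geometric(epsilon, rng):
    """Integer-valued 'discrete Laplace' noise: a natural DP noise for
    quantised data, since it lives on the integers that homomorphic
    schemes actually encrypt."""
    alpha = math.exp(-epsilon)
    def geometric():
        # Inverse-transform sampling of P(G = k) = (1 - alpha) * alpha**k.
        return int(math.log(1.0 - rng.random()) / math.log(alpha))
    return geometric() - geometric()

def client_share(update, epsilon, rng):
    """Quantise a real-valued update and add integer DP noise; in a real
    deployment this integer would then be homomorphically encrypted."""
    return round(update * SCALE) + two_sided_geometric(epsilon, rng)

rng = random.Random(7)
updates = [0.12, -0.05, 0.31, 0.08]
# The aggregator only ever sums the noisy integers (here in the clear).
noisy_sum = sum(client_share(u, epsilon=1.0, rng=rng) for u in updates) / SCALE
```

Because the noise is already integral, quantising it for encryption changes nothing, which is the property the thesis's quantisation operator is designed to preserve in the continuous-noise case.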
Zhang, Nan. "Privacy-preserving data mining". [College Station, Tex. : Texas A&M University, 2006. http://hdl.handle.net/1969.1/ETD-TAMU-1080.
Steine, Asgeir. "Privacy-Preserving Cryptographic Protocols". Doctoral thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2012. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-17284.
Lillebo, Ole Kristian. "Next generation privacy policy". Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for datateknikk og informasjonsvitenskap, 2011. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-13647.
Aleem, Muhammad Usman. "Essays in information privacy". Thesis, University of British Columbia, 2015. http://hdl.handle.net/2429/53940.
Texto completoBusiness, Sauder School of
Graduate
Bodriagov, Oleksandr. "Social Networks and Privacy". Licentiate thesis, KTH, Teoretisk datalogi, TCS, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-166818.
Centralized online social networks pose a threat to their users' privacy, since social network service providers have unrestricted access to the users' data. Decentralized online social networks (DOSNs) address this privacy problem by removing the providers and giving users control over their data, meaning that users themselves decide who may access it. Although there are several proposals and some progress toward privacy-preserving decentralized social networks, the goal of secure, efficient, and available social networks in a decentralized setting has not yet been fully achieved. This thesis contributes to research on social network security with a focus on decentralized social networks. It concentrates on encryption-based access control and on the management of the cryptographic keys required for this access control, using user accounts with password-based login in decentralized social networks. First, the thesis investigates encryption requirements for decentralized social networks and proposes evaluation criteria, which are then used to assess existing encryption-based access-control schemes. Our investigation shows that all of them guarantee confidentiality of the content itself, whereas the privacy of information about the content or about the access policies is either not protected at all or protected at the cost of system performance and flexibility. We highlight the potential of two classes of privacy-preserving schemes in the DOSN context, broadcast encryption schemes with hidden access structures and predicate encryption schemes, and propose their use. Both classes contain schemes with desirable properties that satisfy the criteria better.
Second, the thesis analyzes predicate encryption and adapts it to the DOSN context, since it is too expensive to use as is. We propose a univariate polynomial construction for access policies in predicate encryption that drastically increases the scheme's performance, but leaks part of the access policy to users with access rights. We use Bloom filters to reduce decryption time by indicating which objects a given user can decrypt. An experiment with news-feed assembly shows that the adapted scheme performs well and thus provides a good user experience. Third, the thesis presents a solution to the problem of managing the cryptographic keys used for authentication and communication between users in decentralized online social networks. We propose a password-based login procedure for the peer-to-peer (P2P) setting that lets a user who passes authentication retrieve the set of cryptographic keys required by the application. Besides password login, we also present supporting protocols providing related functionality, such as login with stored passwords, password change, and recovery of forgotten passwords. Together these protocols simulate password login in centralized systems. The performance evaluation shows that the login time is within acceptable limits.
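The role the abstract gives Bloom filters, quickly indicating which objects a user can decrypt, with definitive negative answers but possible false positives, can be sketched as follows. This is a generic Bloom filter, not the thesis's construction; sizes, hash counts, and user identifiers are made up for illustration.

```python
import hashlib

class BloomFilter:
    """A compact probabilistic set: 'no' answers are always correct,
    'yes' answers may be false positives."""

    def __init__(self, size=256, hashes=3):
        self.size, self.hashes, self.bits = size, hashes, 0

    def _positions(self, item):
        # Derive several bit positions per item from salted SHA-256.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:4], "big") % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        return all(self.bits >> p & 1 for p in self._positions(item))

# Index which (hypothetical) user ids can decrypt a given object, so a
# client can skip expensive decryption attempts on objects it cannot open.
bf = BloomFilter()
for uid in ["alice", "bob"]:
    bf.add(uid)
```

A negative `might_contain` lets the client skip the object entirely, which is where the decryption-time saving described above comes from.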
QC 20150602
Lindqvist, Anton. "Privacy Preserving Audit Proofs". Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-210694.
The increasing offloading of critical functionality to computers raises the demands on logging and on the ability to audit. The log must be tamper-resistant and must allow other parties to pose queries about a given event in the log without leaking sensitive information. Since the log is not assumed to be trusted, every answer must be verifiable by means of a proof. This thesis presents a protocol capable of producing verifiable and privacy-preserving answers to queries about a given event in the log through the use of Merkle trees. When the queried event is absent, a new method is used to authenticate a Bloom filter with the help of Merkle trees. Since Bloom filters are a probabilistic construction, a method for handling false positives is also presented.
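The Merkle-tree mechanism behind such audit proofs can be sketched in a few lines: a verifier holding only the root hash can check that one event is in the log given just the sibling hashes along the path, without seeing any other entry. This is a generic textbook construction (the leaf payloads and padding rule are illustrative), not the specific protocol of the thesis.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root hash of a Merkle tree built over the hashed leaves."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate last node on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def inclusion_proof(leaves, index):
    """Sibling hashes from the leaf up to the root (the audit path)."""
    level = [h(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        # Record the sibling and whether our node sits on the left.
        path.append((level[index ^ 1], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(leaf, path, root):
    node = h(leaf)
    for sibling, node_is_left in path:
        node = h(node + sibling) if node_is_left else h(sibling + node)
    return node == root

# The auditor keeps only `root`; the log server answers a query about
# one event with the event plus its audit path.
leaves = [b"event-0", b"event-1", b"event-2", b"event-3"]
root = merkle_root(leaves)
proof = inclusion_proof(leaves, 2)
```

The proof reveals only hashes of the other entries, which is what makes the answers privacy-preserving with respect to the rest of the log.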
Ball, Yvonne. "Privacy rights in employment". Thesis, University of Central Lancashire, 2008. http://clok.uclan.ac.uk/21606/.
Texto completoGibb, Susan Jennifer. "Privacy and Australian law". Title page, contents and abstract only, 1987. http://web4.library.adelaide.edu.au/theses/09PH/09phg4372.pdf.