To see the other types of publications on this topic, follow the link: Mathematical data- Security.

Dissertations / Theses on the topic 'Mathematical data- Security'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 19 dissertations / theses for your research on the topic 'Mathematical data- Security.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Ma, Chunyan. "Mathematical security models for multi-agent distributed systems." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2568.

Full text
Abstract:
This thesis presents a taxonomy of the security threats in agent-based distributed systems. Based on this taxonomy, a set of theories is developed to facilitate analyzing the security threats of mobile-agent systems. We propose the idea of using the developed security risk graph to model the system's vulnerabilities.
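The security risk graph idea can be illustrated with a minimal sketch (not the thesis's formalism; the component names are invented): vulnerabilities become exploit edges in a directed graph, and a system is at risk when an asset is reachable from an attacker-controlled node.

```python
# Toy risk graph: nodes are system components, directed edges are possible
# exploit steps. An asset is exposed if it is reachable from an attacker node.
from collections import defaultdict

def reachable(edges, start):
    """Return the set of nodes reachable from `start` via exploit edges."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph[node])
    return seen

# Hypothetical mobile-agent scenario: a malicious host can tamper with an
# agent it executes, which in turn can mislead the dispatching server.
edges = [("malicious_host", "visiting_agent"),
         ("visiting_agent", "agent_server"),
         ("agent_server", "result_database")]
exposed = reachable(edges, "malicious_host")
```

A real risk graph would annotate edges with exploit probabilities or costs; plain reachability is the simplest useful query.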
APA, Harvard, Vancouver, ISO, and other styles
2

Atoui, Ibrahim Abdelhalim. "Data reduction techniques for wireless sensor networks using mathematical models." Thesis, Bourgogne Franche-Comté, 2018. http://www.theses.fr/2018UBFCD009.

Full text
Abstract:
In this thesis, we present energy-efficient data reduction and security techniques dedicated to wireless sensor networks. First, we propose a data aggregation model based on a similarity function that helps in removing redundant data. In addition, based on fitting functions, we worked on sending fewer data features, accompanied by the fitting function that expresses all the features. Second, we focus on the heterogeneity of the data while studying the correlation among these multivariate features, in order to enhance the data prediction technique based on the polynomial function, all after removing the similar measures in the aggregation phase using the Euclidean distance. Finally, we provide a rigorous security framework, inherited from cryptography, that satisfies the level of exigence usually attained in tree-based WSNs. It prevents attackers from gaining any information about sensed data by ensuring end-to-end privacy between sensor nodes and the sink. In order to validate our proposed techniques, we implemented simulations of the first technique on real readings collected from a small Sensor Scope network deployed at the Grand-St-Bernard, while the simulations of the second and third techniques were conducted on real data collected from 54 sensors deployed in the Intel Berkeley Research Lab. The performance of our techniques is evaluated according to data reduction rate, energy consumption, data accuracy, and time complexity.
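The similarity-based aggregation step can be sketched as follows; the sample values are invented, and the thesis's actual similarity function and thresholds are not reproduced here.

```python
import math

def aggregate(readings, threshold):
    """Keep a reading only if it is farther than `threshold` (Euclidean
    distance) from every reading already kept; near-duplicates are dropped,
    so fewer packets need to be transmitted toward the sink."""
    kept = []
    for r in readings:
        if all(math.dist(r, k) > threshold for k in kept):
            kept.append(r)
    return kept

# Hypothetical (temperature, humidity) samples from one sensing period.
samples = [(21.0, 40.0), (21.1, 40.1), (25.0, 55.0), (21.05, 40.02)]
reduced = aggregate(samples, threshold=0.5)
```

Here the two near-identical readings around (21, 40) collapse into one, while the distinct (25, 55) reading survives.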
APA, Harvard, Vancouver, ISO, and other styles
3

Sathisan, Shashi Kumar. "Encapsulation of large scale policy assisting computer models." Thesis, Virginia Polytechnic Institute and State University, 1985. http://hdl.handle.net/10919/101261.

Full text
Abstract:
In the past two decades, policy assisting computer models have made a tremendous impact on the analysis of national security issues and of problems in various government affairs. SURMAN (Survivability Management) is a policy assisting model that has been developed for use in national security planning. It is a large-scale model formulated using the system dynamics approach of treating a problem in its entirety rather than in parts. In this thesis, an encapsulation of SURMAN is attempted so as to sharpen and focus its ability to perform policy/design evaluation. It also aims to make SURMAN more accessible to potential users and to provide decision makers with a simple tool that does not require a mainframe computer. To achieve these objectives, a personal/microcomputer version of SURMAN (PC SURMAN) and a series of curves relating inputs to outputs are developed. PC SURMAN reduces the complexity of SURMAN by dealing with generic aircraft. It details the essential survivability management parameters and their causal relationships through the life-cycle of aircraft systems. The model strives to link the decision parameters (inputs) to the measures of effectiveness (outputs). The principal decision variables identified are survivability, availability, and inventory of the aircraft system. The measures of effectiveness identified are the Increase Payload Delivered to Target Per Loss (ITDPL), Cost Elasticity of Targets Destroyed Per Loss (CETDPL), Combat Value Ratio (COMVR), Kill to Loss Ratio (KLR), and Decreased Program Life-Cycle Cost (DPLCC). The model provides an opportunity for trading off decision parameters. The trading off of survivability enhancement techniques against defense budget allocation parameters, so as to select those techniques/parameters with higher benefits and lower penalties, is discussed.
The information relating inputs to outputs for the tradeoff analysis is presented graphically using curves derived from experimentally designed computer runs.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
4

Alkadi, Alaa. "Anomaly Detection in RFID Networks." UNF Digital Commons, 2017. https://digitalcommons.unf.edu/etd/768.

Full text
Abstract:
Available security standards for RFID networks (e.g. ISO/IEC 29167) are designed to secure individual tag-reader sessions and do not protect against active attacks that could compromise the system as a whole (e.g. tag cloning or replay attacks). Proper traffic characterization models of the communication within an RFID network can lead to a better understanding of operation under “normal” system state conditions and can consequently help identify security breaches not addressed by current standards. This study of RFID traffic characterization considers two piecewise-constant data smoothing techniques, namely the Bayesian blocks and Knuth algorithms, applied to time-tagged events, and compares them in the context of rate-based anomaly detection. This was accomplished using data from experimental RFID readings and comparing (1) the event counts over time obtained from the smoothed curves versus empirical histograms of the raw data, and (2) the threshold-dependent alert rates based on inter-arrival times obtained from the smoothed curves versus those of the raw data itself. Results indicate that both algorithms adequately model RFID traffic in which inter-event time statistics are stationary, but that Bayesian blocks become superior for traffic in which such statistics experience abrupt changes.
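A minimal illustration of rate-based detection on inter-arrival times (a toy threshold rule with invented timestamps, not the Bayesian blocks or Knuth smoothing algorithms the study actually compares):

```python
def alert_rate(timestamps, min_gap):
    """Fraction of inter-arrival gaps shorter than `min_gap`. A burst of
    tag reads (e.g. a replayed or cloned tag) produces many abnormally
    short gaps, driving this crude rate-based score upward."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if not gaps:
        return 0.0
    return sum(g < min_gap for g in gaps) / len(gaps)

normal = [0.0, 1.0, 2.1, 3.0, 4.2]          # steady read events (seconds)
burst  = [0.0, 1.0, 1.05, 1.1, 1.15, 2.0]   # suspicious burst of reads
```

The smoothing algorithms in the thesis refine this idea by estimating a piecewise-constant event rate first, so the threshold is applied to a denoised curve rather than to raw gaps.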
APA, Harvard, Vancouver, ISO, and other styles
5

Li, Jianzhou, and University of Lethbridge Faculty of Arts and Science. "Design of a novel hybrid cryptographic processor." Thesis, Lethbridge, Alta. : University of Lethbridge, Faculty of Arts and Science, 2005, 2005. http://hdl.handle.net/10133/266.

Full text
Abstract:
A new multiplier that supports the fields GF(p) and GF(2^n) for public-key cryptography, and the field GF(2^8) for secret-key cryptography, is proposed in this thesis. Based on the core multiplier and other extracted common operations, a novel hybrid crypto-processor is built which processes both public-key and secret-key cryptosystems. The corresponding instruction set is also presented. Three cryptographic algorithms are targeted to run on the processor: Elliptic Curve Cryptography (ECC), AES, and RC5. To compute the scalar multiplication kP efficiently, a blend of efficient algorithms on elliptic curves and coordinate selections, and of hardware architecture that supports arithmetic operations on finite fields, is required. The Nonadjacent Form (NAF) of k is used in Jacobian projective coordinates over GF(p); Montgomery scalar multiplication is utilized in projective coordinates over GF(2^n). The dual-field multiplier supports multiplications over GF(p) and GF(2^n) according to multiple-precision Montgomery multiplication algorithms. The design ideas of AES and RC5 are also described. The proposed hybrid crypto-processor increases the flexibility of security schemes and reduces the total cost of cryptosystems.
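The Nonadjacent Form used for kP can be computed with a standard textbook routine; this is a generic software sketch, not the processor's hardware implementation.

```python
def naf(k):
    """Non-adjacent form of k: signed digits in {-1, 0, 1}, least significant
    first, with no two adjacent nonzero digits. Fewer nonzero digits means
    fewer point additions when computing the scalar multiplication k*P."""
    digits = []
    while k > 0:
        if k % 2:
            d = 2 - (k % 4)   # +1 when k = 1 (mod 4), -1 when k = 3 (mod 4)
            k -= d
        else:
            d = 0
        digits.append(d)
        k //= 2
    return digits

# Example: 7 = 8 - 1, so its NAF is [-1, 0, 0, 1] (LSB first) vs. binary 111.
```

Three nonzero binary digits of 7 shrink to two NAF digits, which is exactly the saving a scalar-multiplication unit exploits.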
viii, 87 leaves : ill. (some col.) ; 28 cm.
APA, Harvard, Vancouver, ISO, and other styles
6

Li, Ling Feng. "An image encryption system based on two-dimensional quantum random walks." Thesis, University of Macau, 2018. http://umaclib3.umac.mo/record=b3950660.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Terashima, Robert Seth. "Tweakable Ciphers: Constructions and Applications." PDXScholar, 2015. https://pdxscholar.library.pdx.edu/open_access_etds/2484.

Full text
Abstract:
Tweakable ciphers are a building block used to construct a variety of cryptographic algorithms. Typically, one proves (via a reduction) that a tweakable-cipher-based algorithm is about as secure as the underlying tweakable cipher. Hence improving the security or performance of tweakable ciphers immediately provides corresponding benefits to the wide array of cryptographic algorithms that employ them. We introduce new tweakable ciphers, some of which have better security and others of which have better performance than previous designs. Moreover, we demonstrate that tweakable ciphers can be used directly (as opposed to as a building block) to provide authenticated encryption with associated data in a way that (1) is robust against common misuses and (2) can, in some cases, result in significantly shorter ciphertexts than other approaches.
APA, Harvard, Vancouver, ISO, and other styles
8

Haraldsson, Emil. "Strong user authentication mechanisms." Thesis, Linköping University, Department of Electrical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2688.

Full text
Abstract:

For Siemens Industrial Turbomachinery to meet its business objectives, a modular authentication concept has to be implemented. Such a mechanism must be cost-effective while providing a well-balanced level of security, easy maintenance, and as much user-friendliness as possible.

Authenticating users securely involves the combination of two fields: the theory of authentication mechanisms in information systems and human-computer interaction. To construct a strong user authentication system, the correlations between these fields have to be understood and used to guide the design.

Strong user authentication mechanisms enforce the use of two-factor authentication or more. The combinations implemented rely on knowledge, possession and sometimes logical-location.

A user authentication system has been implemented using leading industrial products as building blocks glued together with security analysis, programming and usability research.

The thesis is divided into two parts, the first part giving the theoretical background of cryptography, authentication theory and protocols needed for the understanding of the second part, providing security analysis, blueprints, and detailed discussions on the implemented system.

Conclusions have been drawn regarding the implemented system and its context as well as from strict theoretical reasoning regarding the authentication field in general. Conclusions include:

· The unsuitability of remote authentication using biometrics

· The critical importance of client security in remote authentication

· The importance of a modular structure for the security of complex network-based systems
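The knowledge-plus-possession combinations discussed in this abstract are commonly realized with one-time-password tokens; as a generic illustration (not the system implemented in the thesis), a minimal sketch of the standard RFC 4226 HOTP computation:

```python
import hashlib, hmac, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password. The shared secret lives in a
    hardware token or phone (the 'possession' factor); combined with a
    password (the 'knowledge' factor) this yields two-factor authentication."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: hotp(b"12345678901234567890", 0) == "755224"
```

TOTP, the time-based variant, simply derives the counter from the current Unix time, which is what most authenticator apps do.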

APA, Harvard, Vancouver, ISO, and other styles
9

Huang, Jian. "FPGA Implementations of Elliptic Curve Cryptography and Tate Pairing over Binary Field." Thesis, University of North Texas, 2007. https://digital.library.unt.edu/ark:/67531/metadc3963/.

Full text
Abstract:
Elliptic curve cryptography (ECC) is an alternative to traditional techniques for public-key cryptography. It offers a smaller key size without sacrificing the security level. Tate pairing is a bilinear map used in identity-based cryptography schemes. In a typical elliptic curve cryptosystem, elliptic curve point multiplication is the most computationally expensive component. Similarly, Tate pairing is also quite computationally expensive. Therefore, it is more attractive to implement ECC and Tate pairing in hardware than in software. The basis of both ECC and Tate pairing is Galois field arithmetic units. In this thesis, I propose FPGA implementations of elliptic curve point multiplication in GF(2^283) as well as Tate pairing computation on a supersingular elliptic curve in GF(2^283). I have designed and synthesized the elliptic curve point multiplication and Tate pairing modules using Xilinx FPGAs, as well as synthesized all the Galois arithmetic units used in the designs. Experimental results demonstrate that the FPGA implementation can speed up elliptic curve point multiplication by 31.6 times compared to a software-based implementation. The results also demonstrate that the FPGA implementation can speed up the Tate pairing computation by 152 times compared to a software-based implementation.
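Elliptic curve point multiplication, the expensive component named above, reduces to repeated point doubling and addition. A toy software sketch over a small prime curve (invented parameters for illustration; the thesis works in the binary field GF(2^283) with dedicated hardware units):

```python
# Double-and-add scalar multiplication on y^2 = x^3 + 3 (mod 97).
MOD = 97          # small field prime, illustration only
A = 0             # coefficient a in y^2 = x^3 + a*x + b

def ec_add(P, Q):
    """Add two affine points; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % MOD == 0:
        return None                                   # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, MOD) % MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, MOD) % MOD
    x3 = (lam * lam - x1 - x2) % MOD
    return (x3, (lam * (x1 - x3) - y1) % MOD)

def ec_mul(k, P):
    """Right-to-left double-and-add: one doubling per bit of k,
    one addition per set bit -- the loop a hardware design unrolls."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

G = (1, 2)        # a point on the curve: 2^2 = 1 + 3 (mod 97)
```

Hardware implementations accelerate exactly the field operations inside `ec_add` (multiplication, squaring, inversion), which is why Galois field arithmetic units are the basis of both ECC and pairing designs.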
APA, Harvard, Vancouver, ISO, and other styles
10

Friot, Nicolas. "Itérations chaotiques pour la sécurité de l'information dissimulée." Thesis, Besançon, 2014. http://www.theses.fr/2014BESA2035/document.

Full text
Abstract:
Discrete dynamical systems operating through chaotic or asynchronous iterations have proved to be highly interesting tools in the field of computer security, thanks to their unpredictable behavior obtained under certain conditions. More precisely, these chaotic iterations possess the property of topological chaos and can be programmed in an efficient way. In the state of the art, they have proven particularly useful in digital watermarking schemes. However, despite their multiple advantages, these existing algorithms have revealed some limitations. This PhD thesis aims at removing these constraints, proposing new processes that can be applied both in the field of digital watermarking and in steganography. We have studied these new schemes on two aspects: topological security and security based on a probabilistic approach. The analysis of their respective security levels has allowed a comparison with other existing processes such as, for example, spread spectrum. Application tests have also been conducted to steganalyze the proposed processes and to evaluate their robustness. Thanks to the results obtained, it has been possible to determine the best fit of each algorithm with targeted application fields such as, for example, anonymity on the Internet, the contribution to the development of the semantic web, or the protection of digital documents and data. In parallel to this fundamental scientific work, several valorization projects have been proposed, aiming at creating a company of innovative technology.
APA, Harvard, Vancouver, ISO, and other styles
11

Suriadi, Suriadi. "Strengthening and formally verifying privacy in identity management systems." Thesis, Queensland University of Technology, 2010. https://eprints.qut.edu.au/39345/1/Suriadi_Suriadi_Thesis.pdf.

Full text
Abstract:
In a digital world, users’ Personally Identifiable Information (PII) is normally managed with a system called an Identity Management System (IMS). There are many types of IMSs. There are situations when two or more IMSs need to communicate with each other (such as when a service provider needs to obtain some identity information about a user from a trusted identity provider). There could be interoperability issues when communicating parties use different types of IMS. To facilitate interoperability between different IMSs, an Identity Meta System (IMetS) is normally used. An IMetS can, at least theoretically, join various types of IMSs to make them interoperable and give users the illusion that they are interacting with just one IMS. However, due to the complexity of an IMS, attempting to join various types of IMSs is a technically challenging task, let alone assessing how well an IMetS manages to integrate these IMSs. The first contribution of this thesis is the development of a generic IMS model called the Layered Identity Infrastructure Model (LIIM). Using this model, we develop a set of properties that an ideal IMetS should provide. This idealized form is then used as a benchmark to evaluate existing IMetSs. Different types of IMS provide varying levels of privacy protection support. Unfortunately, as observed by Jøsang et al (2007), there is insufficient privacy protection in many of the existing IMSs. In this thesis, we study and extend a type of privacy enhancing technology known as an Anonymous Credential System (ACS). In particular, we extend the ACS which is built on the cryptographic primitives proposed by Camenisch, Lysyanskaya, and Shoup. We call this system the Camenisch, Lysyanskaya, Shoup - Anonymous Credential System (CLS-ACS). The goal of CLS-ACS is to let users be as anonymous as possible. 
Unfortunately, CLS-ACS has problems, including (1) the concentration of power in a single entity - known as the Anonymity Revocation Manager (ARM) - who, if malicious, can trivially reveal a user’s PII (resulting in an illegal revocation of the user’s anonymity), and (2) poor performance due to the resource-intensive cryptographic operations required. The second and third contributions of this thesis are the proposal of two protocols that reduce the trust dependencies on the ARM during users’ anonymity revocation. Both protocols distribute trust from the ARM to a set of n referees (n > 1), resulting in a significant reduction of the probability of an anonymity revocation being performed illegally. The first protocol, called the User Centric Anonymity Revocation Protocol (UCARP), allows a user’s anonymity to be revoked in a user-centric manner (that is, the user is aware that his/her anonymity is about to be revoked). The second protocol, called the Anonymity Revocation Protocol with Re-encryption (ARPR), allows a user’s anonymity to be revoked by a service provider in an accountable manner (that is, there is a clear mechanism to determine which entity can eventually learn - and possibly misuse - the identity of the user). The fourth contribution of this thesis is the proposal of a protocol called the Private Information Escrow bound to Multiple Conditions Protocol (PIEMCP). This protocol is designed to address the performance issue of CLS-ACS by applying the CLS-ACS in a federated single sign-on (FSSO) environment. Our analysis shows that PIEMCP can both reduce the amount of expensive modular exponentiation operations required and lower the risk of illegal revocation of users’ anonymity. Finally, the protocols proposed in this thesis are complex and need to be formally evaluated to ensure that their required security properties are satisfied. In this thesis, we use Coloured Petri nets (CPNs) and their corresponding state space analysis techniques.
All of the protocols proposed in this thesis have been formally modeled and verified using these formal techniques. Therefore, the fifth contribution of this thesis is a demonstration of the applicability of CPNs and their corresponding analysis techniques in modeling and verifying privacy enhancing protocols. To our knowledge, this is the first time that CPNs have been comprehensively applied to model and verify privacy enhancing protocols. From our experience, we also propose several CPN modeling approaches, including the modeling of complex cryptographic primitives (such as zero-knowledge proof protocols), attack parameterization, and others. The proposed approaches can be applied to other security protocols, not just privacy enhancing protocols.
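Distributing trust from a single ARM to a set of n referees is in the spirit of threshold cryptography. As a generic illustration (a sketch of standard Shamir secret sharing, not the UCARP or ARPR protocols themselves), here is how a revocation key could be split so that no single referee can revoke anonymity alone:

```python
import random

PRIME = 2_147_483_647   # 2^31 - 1, a Mersenne prime; all arithmetic mod PRIME

def share(secret, n, t):
    """Split `secret` into n shares so that any t of them reconstruct it:
    shares are points on a random degree-(t-1) polynomial with f(0) = secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers f(0), i.e. the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total
```

With t-of-n sharing, fewer than t colluding referees learn nothing about the secret, which is the probabilistic guarantee behind reducing illegal revocations.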
APA, Harvard, Vancouver, ISO, and other styles
12

Morton, Stuart Michael. "An Improved Utility Driven Approach Towards K-Anonymity Using Data Constraint Rules." Thesis, 2013. http://hdl.handle.net/1805/3427.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
As medical data continues to transition to electronic formats, opportunities arise for researchers to use this microdata to discover patterns and increase knowledge that can improve patient care. Now more than ever, it is critical to protect the identities of the patients contained in these databases. Even after removing obvious “identifier” attributes, such as social security numbers or first and last names, that clearly identify a specific person, it is possible to join “quasi-identifier” attributes from two or more publicly available databases to identify individuals. K-anonymity is an approach that has been used to ensure that no one individual can be distinguished within a group of at least k individuals. However, the majority of the proposed approaches implementing k-anonymity have focused on improving the efficiency of the algorithms; less emphasis has been placed on ensuring the “utility” of anonymized data from a researcher’s perspective. We propose a new data utility measurement, called the research value (RV), which extends existing utility measurements by employing data constraint rules that are designed to improve the effectiveness of queries against the anonymized data. To anonymize a given raw dataset, two algorithms are proposed that use predefined generalizations provided by the data content expert and their corresponding research values to assess an attribute’s data utility as the data is generalized to ensure k-anonymity. In addition, an automated algorithm is presented that uses clustering and the RV to anonymize the dataset. All of the proposed algorithms scale efficiently when the number of attributes in a dataset is large.
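The k-anonymity property itself is easy to state in code. A minimal check with invented records (not the RV-driven algorithms of the thesis):

```python
from collections import Counter

def is_k_anonymous(rows, quasi_ids, k):
    """True if every combination of quasi-identifier values is shared by at
    least k rows, so no individual stands out within a group smaller than k."""
    groups = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
    return all(count >= k for count in groups.values())

# Hypothetical records after generalizing ZIP to a prefix and age to a range.
records = [
    {"zip": "462**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "462**", "age": "30-39", "diagnosis": "asthma"},
    {"zip": "463**", "age": "40-49", "diagnosis": "flu"},
    {"zip": "463**", "age": "40-49", "diagnosis": "diabetes"},
]
```

The anonymization algorithms described above search over such generalizations, scoring each candidate by its utility (here, the proposed RV) while keeping this predicate true.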
APA, Harvard, Vancouver, ISO, and other styles
13

"Anomaly detection via high-dimensional data analysis on web access data." 2009. http://library.cuhk.edu.hk/record=b5894067.

Full text
Abstract:
Suen, Ho Yan.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2009.
Includes bibliographical references (leaves 99-104).
Abstract also in Chinese.
Abstract
Acknowledgement
Chapter 1 --- Introduction
Chapter 1.1 --- Motivation
Chapter 1.2 --- Organization
Chapter 2 --- Literature Review
Chapter 2.1 --- Related Works
Chapter 2.2 --- Background Study
Chapter 2.2.1 --- World Wide Web
Chapter 2.2.2 --- Distributed Denial of Service Attack
Chapter 2.2.3 --- Tools for Dimension Reduction
Chapter 2.2.4 --- Tools for Anomaly Detection
Chapter 2.2.5 --- Receiver Operating Characteristics (ROC) Analysis
Chapter 3 --- System Design
Chapter 3.1 --- Methodology
Chapter 3.2 --- System Overview
Chapter 3.3 --- Reference Profile Construction
Chapter 3.4 --- Real-time Anomaly Detection and Response
Chapter 3.5 --- Chapter Summary
Chapter 4 --- Reference Profile Construction
Chapter 4.1 --- Web Access Logs Collection
Chapter 4.2 --- Data Preparation
Chapter 4.3 --- Feature Extraction and Embedding Engine (FEE Engine)
Chapter 4.3.1 --- Sub-Sequence Extraction
Chapter 4.3.2 --- Hash Function on Sub-sequences (optional)
Chapter 4.3.3 --- Feature Vector Construction
Chapter 4.3.4 --- Diffusion Wavelets Embedding
Chapter 4.3.5 --- Numerical Example of Feature Set Reduction
Chapter 4.3.6 --- Reference Profile and Further Use of FEE Engine
Chapter 4.4 --- Chapter Summary
Chapter 5 --- Real-time Anomaly Detection and Response
Chapter 5.1 --- Session Filtering and Data Preparation
Chapter 5.2 --- Feature Extraction and Embedding
Chapter 5.3 --- Distance-based Outlier Scores Calculation
Chapter 5.4 --- Anomaly Detection and Response
Chapter 5.4.1 --- Length-Based Anomaly Detection Modules
Chapter 5.4.2 --- Characteristics of Anomaly Detection Modules
Chapter 5.4.3 --- Dynamic Threshold Adaptation
Chapter 5.5 --- Chapter Summary
Chapter 6 --- Experimental Results
Chapter 6.1 --- Experiment Datasets
Chapter 6.1.1 --- Normal Web Access Logs
Chapter 6.1.2 --- Attack Data Generation
Chapter 6.2 --- ROC Curve Construction
Chapter 6.3 --- System Parameters Selection
Chapter 6.4 --- Performance of Anomaly Detection
Chapter 6.4.1 --- Performance Analysis
Chapter 6.4.2 --- Performance in Defending DDoS Attacks
Chapter 6.5 --- Computation Requirement
Chapter 6.6 --- Chapter Summary
Chapter 7 --- Conclusion and Future Work
Bibliography
APA, Harvard, Vancouver, ISO, and other styles
14

Mohammed, Yassene. "Data Protection and Data Security Concept for Medical Applications in a Grid Computing Environment." Doctoral thesis, 2008. http://hdl.handle.net/11858/00-1735-0000-0006-B3AE-A.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

"Privacy preserving in serial data and social network publishing." 2010. http://library.cuhk.edu.hk/record=b5894365.

Full text
Abstract:
Liu, Jia.
"August 2010."
Thesis (M.Phil.)--Chinese University of Hong Kong, 2010.
Includes bibliographical references (p. 69-72).
Abstracts in English and Chinese.
Chapter 1 --- Introduction
Chapter 2 --- Related Work
Chapter 3 --- Privacy Preserving Network Publication against Structural Attacks
Chapter 3.1 --- Background and Motivation
Chapter 3.1.1 --- Adversary Knowledge
Chapter 3.1.2 --- Targets of Protection
Chapter 3.1.3 --- Challenges and Contributions
Chapter 3.2 --- Preliminaries and Problem Definition
Chapter 3.3 --- Solution: K-Isomorphism
Chapter 3.4 --- Algorithm
Chapter 3.4.1 --- Refined Algorithm
Chapter 3.4.2 --- Locating Vertex Disjoint Embeddings
Chapter 3.4.3 --- Dynamic Releases
Chapter 3.5 --- Experimental Evaluation
Chapter 3.5.1 --- Datasets
Chapter 3.5.2 --- Data Structure of K-Isomorphism
Chapter 3.5.3 --- Data Utilities and Runtime
Chapter 3.5.4 --- Dynamic Releases
Chapter 3.6 --- Conclusions
Chapter 4 --- Global Privacy Guarantee in Serial Data Publishing
Chapter 4.1 --- Background and Motivation
Chapter 4.2 --- Problem Definition
Chapter 4.3 --- Breach Probability Analysis
Chapter 4.4 --- Anonymization
Chapter 4.4.1 --- AG Size Ratio
Chapter 4.4.2 --- Constant-Ratio Strategy
Chapter 4.4.3 --- Geometric Strategy
Chapter 4.5 --- Experiment
Chapter 4.5.1 --- Dataset
Chapter 4.5.2 --- Anonymization
Chapter 4.5.3 --- Evaluation
Chapter 4.6 --- Conclusion
Bibliography
APA, Harvard, Vancouver, ISO, and other styles
16

Matondo, Sandra Bazebo. "Two-level chaos-based cryptography for image security." 2012. http://encore.tut.ac.za/iii/cpro/DigitalItemViewPage.external?sp=1001028.

Full text
Abstract:
M. Tech. Electrical engineering.
A desirable chaos-based encryption scheme for image storage and transmission is one that can resist different types of attacks in less time and with successful decryption. To resist different kinds of attacks, a higher security level is required. As a result, there is a need to enhance the security level of existing chaos-based image encryption schemes using hyper-chaos. To increase the level of security using hyper-chaos, the research presents a scheme that combines two different techniques used to improve the degree of security of chaos-based cryptography: a classical chaos-based cryptographic technique and a hyper-chaos masking technique. The first technique focuses on the efficient combination and transformation of image characteristics based on hyper-chaos pseudorandom numbers. The second technique focuses on driving the hyper-chaos system by using the results of the first technique to change the transmitted chaos dynamic, as well as using synchronisation and a high-order differentiator for decryption. To achieve the objective of our research, the following sub-problems are addressed.
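As a generic illustration of the classical chaos-based technique mentioned above (a toy logistic-map keystream cipher with invented parameters, not the hyper-chaos masking scheme of the thesis):

```python
def logistic_keystream(x0, r, n):
    """Iterate the logistic map x -> r*x*(1-x) and quantize each state to a
    key byte. For r near 4 the map is chaotic: a tiny change in the key x0
    yields an unrelated keystream."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def chaos_xor(data: bytes, x0: float, r: float = 3.99) -> bytes:
    """Encrypt/decrypt by XORing data with the chaotic keystream;
    XOR is its own inverse, so the same call decrypts."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"pixel data"
ct = chaos_xor(msg, x0=0.3141592)
```

Real image schemes add confusion steps (pixel permutation) on top of such masking, and the thesis further replaces the single chaotic map with a hyper-chaotic system driven by the first stage.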
APA, Harvard, Vancouver, ISO, and other styles
17

Ling, Jie. "Smart card fault attacks on public key and elliptic curve cryptography." Thesis, 2014. http://hdl.handle.net/1805/5967.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
Blömer, Otto, and Seifert presented a fault attack on elliptic curve scalar multiplication called the Sign Change Attack, which induces a fault that changes the sign of the accumulation point. As the use of a sign bit for an extended integer is highly unlikely, this appears to be a highly selective manipulation of the key stream. In this thesis we describe two plausible fault attacks on a smart card implementation of elliptic curve cryptography. King and Wang designed a new attack, called the counter fault attack, by attacking the scalar multiplication of a discrete-log cryptosystem. They then successfully generalized this approach to a family of attacks. By implementing King and Wang's scheme on RSA, we successfully attacked RSA keys of a variety of sizes. Further, we generalized the attack model to an attack on any implementation that uses NAF and wNAF key representations.
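The NAF recoding the abstract mentions is the standard non-adjacent form: the scalar is rewritten with digits in {-1, 0, 1} such that no two adjacent digits are nonzero, which reduces the number of point additions in scalar multiplication. The sketch below is the textbook recoding algorithm, not the fault attack itself.

```python
# Standard non-adjacent form (NAF) recoding of an integer scalar, as used in
# elliptic curve scalar multiplication. Textbook algorithm; illustration only.

def to_naf(k: int) -> list:
    """Return the NAF digits of k, least-significant digit first."""
    digits = []
    while k > 0:
        if k & 1:
            d = 2 - (k % 4)      # pick +1 or -1 so the next bit becomes 0
            k -= d
        else:
            d = 0
        digits.append(d)
        k >>= 1
    return digits

k = 478
naf = to_naf(k)
# The recoding is value-preserving ...
assert sum(d * (1 << i) for i, d in enumerate(naf)) == k
# ... and non-adjacent: no two consecutive nonzero digits.
assert all(not (naf[i] and naf[i + 1]) for i in range(len(naf) - 1))
```

A fault that flips one recoded digit changes the computed multiple in a predictable way, which is what makes key representations like NAF and wNAF a natural target for the attack family described above.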
APA, Harvard, Vancouver, ISO, and other styles
18

(6012225), Huian Li. "Transparent and Mutual Restraining Electronic Voting." Thesis, 2019.

Find full text
Abstract:
Many e-voting techniques have been proposed but are not widely used in practice. One of the problems with most existing e-voting techniques is a lack of transparency, leading to a failure to deliver voter assurance. In this work, we propose a transparent, auditable, end-to-end verifiable, and mutually restraining e-voting protocol that exploits existing multi-party political dynamics such as those in the US. The new e-voting protocol consists of three original technical contributions -- a universally verifiable voting vector, forward and backward mutual lock voting, and in-process check and enforcement -- that, along with a public real-time bulletin board, resolve the apparent conflicts in voting such as anonymity vs. accountability and privacy vs. verifiability. In particular, trust is split equally among tallying authorities who have conflicting interests and will technically restrain each other. The voting and tallying processes are transparent to voters and any third party, which allows any voter to verify that his vote is indeed counted and allows any third party to audit the tally. For environments requiring receipt-freeness and coercion resistance, we introduce additional approaches to counter vote-selling and voter coercion. Our interactive voting protocol is suitable for a small number of voters, as in boardroom voting, where interaction between voters is encouraged and self-tallying is necessary, while our non-interactive protocol is for scenarios with a large number of voters, where interaction is prohibitively expensive. Equipped with a hierarchical voting structure, our protocols can enable open and fair elections at any scale.
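Two of the ideas in the abstract, a voting vector with a 1 in the chosen candidate's slot and trust split among mutually restraining tallying authorities, can be illustrated generically with additive secret sharing: each ballot is split into random shares so that no single authority learns an individual vote, yet the authorities' combined sums reveal the public tally. This is a minimal generic sketch under those assumptions, not the thesis's actual protocol; all names and parameters are illustrative.

```python
# Generic illustration (not the thesis protocol): one-hot voting vectors,
# additively secret-shared between tallying authorities so that no single
# authority sees a ballot, yet their combined column sums give the tally.

import random

N_CANDIDATES = 3
N_AUTHORITIES = 2
MOD = 2**31 - 1   # all arithmetic is done modulo a public prime

def share_ballot(choice):
    """Split a one-hot voting vector into random additive shares, one per authority."""
    vector = [1 if i == choice else 0 for i in range(N_CANDIDATES)]
    shares = [[random.randrange(MOD) for _ in range(N_CANDIDATES)]
              for _ in range(N_AUTHORITIES - 1)]
    # Last share is chosen so the column-wise sum equals the true vector mod MOD.
    last = [(v - sum(col)) % MOD for v, col in zip(vector, zip(*shares))]
    shares.append(last)
    return shares    # shares[a] goes to authority a

votes = [0, 2, 2, 1, 2]          # each voter's chosen candidate index
per_authority = [[0] * N_CANDIDATES for _ in range(N_AUTHORITIES)]
for choice in votes:
    for a, share in enumerate(share_ballot(choice)):
        per_authority[a] = [(s + x) % MOD for s, x in zip(per_authority[a], share)]

# Only the combination of all authorities' sums yields the public tally.
tally = [sum(col) % MOD for col in zip(*per_authority)]
assert tally == [1, 1, 3]
```

Each authority's running sums look uniformly random on their own; the tally appears only when every authority contributes, which is one simple way to realise the "mutual restraint" property described above.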
APA, Harvard, Vancouver, ISO, and other styles
19

Eloff, Corné. "Spatial technology as a tool to analyse and combat crime." Thesis, 2006. http://hdl.handle.net/10500/1193.

Full text
Abstract:
This study explores the utilisation of spatial technologies as a tool to analyse and combat crime. The study deals specifically with remote sensing and its potential for being integrated with geographical information systems (GIS). The integrated spatial approach resulted in the understanding of land use class behaviour over time and its relationship to specific crime incidents per police precinct area. The incorporation of spatial technologies to test criminological theories in practice, such as the ecological theories of criminology, provides the science with strategic value. It proves the value of combining multi-disciplinary scientific fields to create a more advanced platform to understand land use behaviour and its relationship to crime. Crime in South Africa is a serious concern and it impacts negatively on so many lives. The fear of crime, the loss of life, the socio-economic impact of crime, etc. create the impression that the battle against crime has been lost. The limited knowledge base within the law enforcement agencies, limited logistical resources and low retention rate of critical staff all contribute to making the reduction of crime more difficult to achieve. A practical procedure of using remote sensing technology integrated with geographical information systems (GIS), overlaid with geo-coded crime data to provide a spatial technological basis to analyse and combat crime, is illustrated by a practical study of the Tshwane municipality area. The methodology applied in this study required multi-skilled resources incorporating GIS and the understanding of crime to integrate the diverse scientific fields into a consolidated process that can contribute to the combating of crime in general. The existence of informal settlement areas in South Africa stresses the socio-economic problems that need to be addressed as there is a clear correlation of land use data with serious crime incidents in these areas. 
The fact that no formal cadastre exists for these areas, combined with a great diversity in densification and growth of the periphery, makes analysis very difficult without remote sensing imagery. Revisits over time to assess changes in these areas in order to adapt policing strategies will create an improved information layer for responding to crime. Final computerised maps generated from remote sensing and GIS layers are not the only information that can be used to prevent and combat crime. An important recipe for ultimately successfully managing and controlling crime in South Africa is to strategically combine training of the law enforcement agencies in the use of spatial information with police science. The researcher concludes with the hope that this study will contribute to the improved utilisation of spatial technology to analyse and combat crime in South Africa. The ultimate vision is the expansion of the science of criminology by adding an advanced spatial technology module to its curriculum.
Criminology
D.Litt. et Phil. (Criminology)
APA, Harvard, Vancouver, ISO, and other styles
