Dissertations / Theses on the topic 'Authentication method'




Consult the top 50 dissertations / theses for your research on the topic 'Authentication method.'


You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Bani-Hani, Raed M. "Enhancing the IKE preshared key authentication method." Diss., Columbia, Mo. : University of Missouri-Columbia, 2006. http://hdl.handle.net/10355/4406.

Full text
Abstract:
Thesis (Ph. D.) University of Missouri-Columbia, 2006.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file (viewed on July 31, 2007). Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
2

Tellini, Niklas, and Fredrik Vargas. "Two-Factor Authentication : Selecting and implementing a two-factor authentication method for a digital assessment platform." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208097.

Full text
Abstract:
Two-Factor Authentication (2FA) is a two-step verification process that aims to provide an additional layer of security by requiring the user to authenticate himself/herself using a secondary means (ownership factor or inheritance factor). Without the use of 2FA, an attacker could gain access to a person’s devices or accounts solely by knowing the victim’s password, while with 2FA knowing only this password is insufficient to pass the authentication check. In this project, we analyze different methods in which 2FA could be implemented by a Digital Assessment Platform. These platforms allow test assessments to be built directly into digital content; therefore, an important requirement of these systems is secure authentication. Moreover, it is important to securely protect teachers’ accounts in order to avoid unauthorized people gaining access to them. We investigate how 2FA could be used to add an extra layer of security to teachers’ accounts, focusing on cost, user experience, ease of use, and deployment of the solution. We arrived at the conclusion that 2FA through an ownership factor is a suitable method, and we implemented a solution based upon One-Time Passwords. This thesis project will hopefully benefit Digital Assessment Platforms that wish to implement 2FA by providing broader knowledge of this subject. The project should also benefit society by increasing the general knowledge of 2FA, hence leading to more secure services.
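The One-Time Password approach the authors settled on can be illustrated with a minimal sketch of the standard HOTP/TOTP construction (RFC 4226 / RFC 6238); the secret, time step, and digit count below are illustrative defaults, not values taken from the thesis.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                           # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HOTP over a time counter."""
    return hotp(secret, int(time.time()) // step, digits)
```

A verifier would typically accept the code for the current time window and one adjacent window to tolerate clock drift.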
3

OLSSON, JOAKIM. "Method for gesture based authentication in physical access control." Thesis, KTH, Maskinkonstruktion (Inst.), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-209339.

Full text
Abstract:
ASSA Abloy is the largest global supplier of intelligent locks and security solutions. The company constantly strives to develop new and innovative solutions for physical access control. One concept the company wanted to investigate aimed to allow the user to effortlessly unlock a door using gestures, resulting in a seamless experience. The idea was to use a wearable as a credential device and to identify the user's gestures with the sensors supplied by the wearable. The gesture used in this thesis project was a knock, meaning that the user unlocks the door by knocking on it. The main objective of this thesis project was to develop a system allowing knocks to be used as a method of authentication and to evaluate the system based on system security and user convenience. The system developed consists of two accelerometer sensors: one located in the wearable and one located in the lock/door. The signals from each sensor are processed and analyzed to detect knocks. The time correlation between the knocks detected by each sensor is analyzed to verify that they originate from the same user. A theoretical model of the system was developed to facilitate its evaluation. The evaluation showed that both the system security and the user convenience attained satisfactory values. This thesis shows that the concept has high potential, but further work is needed. The framework of methods used to evaluate the system in this thesis can likewise be used to evaluate systems in any further work.
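The time-correlation check between the two accelerometers can be sketched as follows. This is a simplification of the thesis's approach: the knock-detection step is assumed to have already produced per-sensor event timestamps, and the tolerance and match-ratio thresholds are illustrative assumptions.

```python
def correlated_knocks(times_wearable, times_door, tol=0.05):
    """Count knock events seen by both sensors within `tol` seconds of each other.
    Both timestamp lists are assumed sorted in ascending order."""
    matched = 0
    j = 0
    for t in times_wearable:
        # skip door-side events that are too early to match t
        while j < len(times_door) and times_door[j] < t - tol:
            j += 1
        if j < len(times_door) and abs(times_door[j] - t) <= tol:
            matched += 1
            j += 1  # each door event matches at most one wearable event
    return matched

def authenticate(times_wearable, times_door, tol=0.05, min_ratio=0.8):
    """Accept if enough of the wearable's knocks also appear at the door."""
    if not times_wearable:
        return False
    ratio = correlated_knocks(times_wearable, times_door, tol) / len(times_wearable)
    return ratio >= min_ratio
```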
4

Dasun, Weerasinghe P. W. H. "Parameter based identification, authentication and authorization method for mobile services." Thesis, City University London, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.510696.

Full text
5

Langlotz, Benjamin. "Usable Security : A seamless user authentication method using NFC and Bluetooth." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-297835.

Full text
Abstract:
Currently, the majority of user authentication procedures for computers, web services, or software involve typing user names and passwords, which must have reasonable complexity to be considered secure. Even the most secure password, however, does not guard a user's data if she does not log out when leaving the computer. The research question posed in this thesis is: "How should a user authentication method be designed to automate login/logout and to mitigate the negative effects of lacking security awareness?" Based on this question, the goal of this work is to develop a new solution for user authentication with NFC and Bluetooth that takes care of logging in and out of computers and services without the user having to give it a thought. This is done by first looking at currently existing alternatives to password authentication. Secondly, the qualities and requirements of a new user authentication concept are devised and described. Thirdly, a testable prototype called NFCLogin is implemented, covering the key aspects of logging in and out of Google Chrome as well as saving and reopening the user's opened tabs. Finally, an observational assessment test is conducted. The aim of the study is to get a hint of whether the system could be useful, whether users are inclined to trust it, and in which ways it could be improved. The main outcome of this thesis is the definition of a user authentication method, coupled with suggestions for improvement gathered from a usability study conducted with the method's prototype, NFCLogin. An important takeaway from the study is that participants seem to appreciate the prototype and are likely willing to use the proposed method if it is sufficiently secure.
6

Cetin, Cagri. "Design, Testing and Implementation of a New Authentication Method Using Multiple Devices." Scholar Commons, 2015. http://scholarcommons.usf.edu/etd/5660.

Full text
Abstract:
Authentication protocols are very common mechanisms to confirm the legitimacy of someone’s or something’s identity in digital and physical systems. This thesis presents a new and robust authentication method based on users’ multiple devices. Due to the popularity of mobile devices, users are becoming more likely to have more than one device (e.g., smartwatch, smartphone, laptop, tablet, smart-car, smart-ring, etc.). The authentication system presented here takes advantage of these multiple devices to implement authentication mechanisms. In particular, the system requires the devices to collaborate with each other in order for the authentication to succeed. This new authentication protocol is robust against theft-based attacks on a single device; an attacker would need to steal multiple devices in order to compromise the authentication system. The new authentication protocol comprises an authenticator and at least two user devices, where the user devices are associated with each other. To perform an authentication on a user device, the user needs to respond to a challenge by using his/her associated device. After describing how this authentication protocol works, this thesis discusses three different versions of the protocol that have been implemented. In the first implementation, the authentication process is performed using two smartphones, and a QR code is used as the challenge. In the second implementation, NFC technology is used for challenge transmission instead of a QR code. In the last implementation, usability across different platforms is explored: instead of two smartphones, a laptop computer and a smartphone are used in combination. Furthermore, the authentication protocol has been verified using an automated protocol-verification tool to check whether it satisfies authenticity and secrecy properties.
Finally, these implementations are tested and analyzed to demonstrate the performance variations over different versions of the protocol.
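The challenge-response flow between the authenticator and the associated device can be sketched as below. This is a hedged simplification, not the thesis's actual protocol: it assumes a pre-shared per-device key and an HMAC response, while the real system transports the challenge as a QR code or over NFC and was formally verified.

```python
import hashlib
import hmac
import secrets

def issue_challenge() -> bytes:
    # Authenticator generates a fresh random nonce; in the thesis's first
    # implementation this would be displayed as a QR code, in the second
    # transmitted over NFC.
    return secrets.token_bytes(16)

def respond(device_key: bytes, challenge: bytes) -> bytes:
    # The associated device proves possession of its key by keying an HMAC
    # over the challenge.
    return hmac.new(device_key, challenge, hashlib.sha256).digest()

def verify(device_key: bytes, challenge: bytes, response: bytes) -> bool:
    # Authenticator recomputes the expected response; constant-time compare
    # avoids timing side channels.
    expected = hmac.new(device_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because the key never leaves the associated device, stealing only the primary device does not let an attacker answer the challenge.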
7

Torres, Peralta Raquel. "Recognizing User Identity by Touch on Tabletop Displays: An Interactive Authentication Method." Diss., The University of Arizona, 2012. http://hdl.handle.net/10150/265555.

Full text
Abstract:
Multi-touch tablets allow users to interact with computers through intuitive, natural gestures and direct manipulation of digital objects. One advantage of these devices is that they can offer a large, collaborative space where several users can work on a task at the same time. However, the lack of privacy in these situations means that standard password-based authentication is easily compromised. This work presents a new gesture-based authentication system based on users' unique signature of touch motion. This technique has two key features. First, at each step in authentication the system prompts the user to make a specific gesture selected to maximize the expected long-term information gain. Second, each gesture is integrated using a hierarchical probabilistic model, allowing the system to accept or reject a user after a variable number of gestures. This touch-based approach would allow the user to accurately authenticate without the need to cover their hand or look over their shoulder. This method has been tested using a set of samples collected under real-world conditions in a business office, with a touch tablet that was used on a near daily basis by users familiar with the device. Despite the lack of sophisticated, high-precision equipment, the system is able to achieve high user recognition accuracy with relatively few gestures, demonstrating that human touch patterns have a distinctive "signature" that can be used as a powerful biometric measure for user recognition and personalization.
8

Cordeiro, Raposo Fernando. "Chromatographic studies of major milk proteins : towards a reliable method for the assessment of milk authentication." Thesis, University of the West of England, Bristol, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.275891.

Full text
9

Mendoza, Patricia A. "An enhanced method for the existing bluetooth pairing protocol to avoid impersonation attacks." To access this resource online via ProQuest Dissertations and Theses @ UTEP, 2009. http://0-proquest.umi.com.lib.utep.edu/login?COPT=REJTPTU0YmImSU5UPTAmVkVSPTI=&clientId=2515.

Full text
10

Marnell, Joseph. "An Empirical Investigation of Factors Affecting Resistance to Using Multi-Method Authentication Systems in Public-Access Environments." NSUWorks, 2016. http://nsuworks.nova.edu/gscis_etd/970.

Full text
Abstract:
Over the course of history, different means of object and person identification as well as verification have evolved for user authentication. In recent years, a new concern has emerged regarding the accuracy of verifiable authentication and protection of personal identifying information (PII), because previous misuses have resulted in significant financial loss. Such losses have escalated more noticeably because of human identity-theft incidents due to breaches of PII within multiple public-access environments. Although the use of various biometric and radio frequency identification (RFID) technologies is expanding, resistance to using these technologies for user authentication remains an issue. This study addressed the effect of individuals’ perceptions on their resistance to using multi-method authentication systems (RMS) in public-access environments and uncovered key constructs that may significantly contribute to such resistance. This study was a predictive study to assess the contributions of individuals’ perceptions of the importance of organizational protection of their PII, noted as Perceived Value of Organizational Protection of PII (PVOP), authentication complexity (AC), and invasion of privacy (IOP) on their resistance to using multi-method authentication systems (RMS) in public-access environments. Moreover, this study also investigated if there were any significant differences on the aforementioned constructs based on age, gender, prior experience with identity theft, and acquaintance experience with identity theft. As part of this study, a rollout project was implemented of multi-factor biometric and RFID technologies for system authentication prior to electronic-commerce (e-commerce) use in public-access environments. The experimental group experienced the multi-factor authentication and also was trained on its use. 
Computer users (faculty & students) from a small, private university participated in the study to determine their level of PVOP, IOP, and AC on their resistance to using the technology in public-access environments. Multiple Linear Regression (MLR) was used to formulate a model and test predictive power along with the significance of the contribution of the aforementioned constructs on RMS. The results show that all construct measures demonstrated very high reliability. The results also indicate that the experimental group of the multi-factor authentication had lower resistance than the control group that did not use the technology. The mean increases indicate a statistically significant difference between the experimental and control groups overall. The results also demonstrate that students and participants' increased levels of education indicate an overall statistically significant decrease in resistance. The findings demonstrate that computer authentication training does provide added value overall in the context of measuring resistance to using newer multi-method authentication technology.
11

Marnell, Joseph W. "An Empirical Investigation of Factors Affecting Resistance to Using Multi-Method Authentication Systems in Public-Access Environments." Thesis, Nova Southeastern University, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10126659.

Full text
Abstract:

Over the course of history, different means of object and person identification as well as verification have evolved for user authentication. In recent years, a new concern has emerged regarding the accuracy of verifiable authentication and protection of personal identifying information (PII), because previous misuses have resulted in significant financial loss. Such losses have escalated more noticeably because of human identity-theft incidents due to breaches of PII within multiple public-access environments. Although the use of various biometric and radio frequency identification (RFID) technologies is expanding, resistance to using these technologies for user authentication remains an issue. This study addressed the effect of individuals’ perceptions on their resistance to using multi-method authentication systems (RMS) in public-access environments and uncovered key constructs that may significantly contribute to such resistance.

This study was a predictive study to assess the contributions of individuals’ perceptions of the importance of organizational protection of their PII, noted as Perceived Value of Organizational Protection of PII (PVOP), authentication complexity (AC), and invasion of privacy (IOP) on their resistance to using multi-method authentication systems (RMS) in public-access environments. Moreover, this study also investigated if there were any significant differences on the aforementioned constructs based on age, gender, prior experience with identity theft, and acquaintance experience with identity theft. As part of this study, a rollout project was implemented of multi-factor biometric and RFID technologies for system authentication prior to electronic-commerce (e-commerce) use in public-access environments. The experimental group experienced the multi-factor authentication and also was trained on its use. Computer users (faculty & students) from a small, private university participated in the study to determine their level of PVOP, IOP, and AC on their resistance to using the technology in public-access environments. Multiple Linear Regression (MLR) was used to formulate a model and test predictive power along with the significance of the contribution of the aforementioned constructs on RMS. The results show that all construct measures demonstrated very high reliability. The results also indicate that the experimental group of the multi-factor authentication had lower resistance than the control group that did not use the technology. The mean increases indicate a statistically significant difference between the experimental and control groups overall. The results also demonstrate that students and participants’ increased levels of education indicate an overall statistically significant decrease in resistance.
The findings demonstrate that computer authentication training does provide added value overall in the context of measuring resistance to using newer multi-method authentication technology.

12

Pikoulas, John. "An agent-based Bayesian method for network intrusion detection." Thesis, Edinburgh Napier University, 2003. http://researchrepository.napier.ac.uk/Output/4057.

Full text
Abstract:
Security is one of the major issues in any network and on the Internet. It encapsulates many different areas, such as protecting individual users against intruders, protecting corporate systems against damage, and protecting data from intrusion. It is obviously impossible to make a network totally secure, as there are so many areas that must be protected. This thesis includes an evaluation of current techniques for internal misuse of computer systems, and tries to propose a new way of dealing with this problem. This thesis proposes that it is impossible to fully protect a computer network from intrusion, and shows how different methods are applied at differing levels of the OSI model. Most systems are now protected at the network and transport layer, with systems such as firewalls and secure sockets. A weakness, though, exists in the session layer that is responsible for user logon and the associated password. It is thus important for any highly secure system to be able to continually monitor a user, even after they have successfully logged into the system. This is because once an intruder has successfully logged into a system, they can use it as a stepping-stone to gain full access (often right up to the system administrator level). This type of login identifies another weakness of current intrusion detection systems, in that they are mainly focused on detecting external intrusion, whereas a great deal of research identifies that one of the main problems is from internal intruders, and from staff within an organisation. Fraudulent activities can often be identified by changes in user behaviour. While this type of behaviour monitoring might not be suited to most networks, it could be applied to high-security installations, such as in government and military organisations. Computer networks are now one of the most rapidly changing and vulnerable systems, where security is a major issue.
A dynamic approach, with the capacity to deal with and adapt to abrupt changes, and which is simple, will provide an effective modelling toolkit. Analysts must be able to understand how it works and be able to apply it without the aid of an expert. Such models do exist in the statistical world, and it is the purpose of this thesis to introduce them and to explain their basic notions and structure. One weakness identified is the centralisation and complex implementation of intrusion detection. The thesis proposes an agent-based approach to monitor the behaviour of each user. It also proposes that many intrusion detection systems cannot cope with new types of intrusion. It thus applies Bayesian statistics to evaluate user behaviour and predict the future behaviour of the user. The model developed is a unique application of Bayesian statistics, and the results show that it can predict future behaviour more accurately than existing ARIMA models. The thesis argues that the accuracy of long-term forecasting is questionable, especially in systems that have a rapid and often unexpected evolution and behaviour. Many of the existing models for prediction use long-term forecasting, which may not be the optimal type for intrusion detection systems. The experiments conducted have varied the number of users and the time interval used for monitoring user behaviour. These results have been compared with ARIMA, and an increased accuracy has been observed. The thesis also shows that the new model can better predict changes in user behaviour, which is a key factor in intrusion detection. The thesis concludes with recommendations for future work, including how the statistical model could be improved. This includes research into changing the specification of the design vector for the Bayesian model.
Another interesting area is the integration of standard agent communication languages, which would make the security agents more social in their approach and able to gather information from other agents.
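A minimal instance of the Bayesian behaviour modelling described above is a Dirichlet-multinomial posterior over a user's next action: observed action counts update a prior, and the predictive distribution flags behaviour that deviates from the user's profile. This sketch is purely illustrative and is not the thesis's actual design vector or agent architecture.

```python
from collections import Counter

def posterior_predictive(history, vocabulary, alpha=1.0):
    """P(next action = c | history) under a symmetric Dirichlet(alpha) prior.
    Unseen actions keep non-zero probability via the prior pseudo-counts."""
    counts = Counter(history)
    total = len(history) + alpha * len(vocabulary)
    return {c: (counts[c] + alpha) / total for c in vocabulary}

def surprise(action, history, vocabulary, alpha=1.0):
    """Lower predictive probability = more surprising = more suspicious."""
    return 1.0 - posterior_predictive(history, vocabulary, alpha)[action]
```

An agent monitoring a user would update `history` online and raise an alert when a run of actions is consistently surprising relative to that user's own profile.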
13

Ajdler, Arnaud 1975. "Iconic authentication methods." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/80621.

Full text
14

Hagström, Adrian, and Rustam Stanikzai. "Writer identification using semi-supervised GAN and LSR method on offline block characters." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-43316.

Full text
Abstract:
Block characters are often used when filling out forms, for example when writing one's personal number. The question of whether there is recoverable, biometric (identity-related) information within individual digits of handwritten personal numbers is therefore relevant. This thesis investigates the question by using both handcrafted features and features extracted via Deep learning (DL) models, while successively limiting the amount of available training samples. Some recent works using DL have presented semi-supervised methods using Generative adversarial network (GAN) generated data together with a modified Label smoothing regularization (LSR) function. Using this training method might improve performance over a baseline fully supervised model when doing authentication. This work additionally proposes a novel modified LSR function named Bootstrap label smoothing regularizer (BLSR), designed to mitigate some of the problems of previous methods, and compares it to the others. The DL feature extraction is done by training a ResNet50 model to recognize writers of personal numbers and then extracting the feature vector from the second-to-last layer of the network. Results show a clear indication of recoverable identity-related information within the handwritten (personal number) digits in boxes. Our results indicate an authentication performance, expressed in Equal error rate (EER), of around 25% with handcrafted features. The same performance measured in EER was between 20% and 30% when using the features extracted from the DL model. The DL methods, while showing potential for greater performance than the handcrafted features, seem to suffer from fluctuation (noisiness) of results, making conclusions on their use in practice hard to draw.
Additionally, when using 1-2 training samples, the handcrafted features easily beat the DL methods. When using the LSR-variant semi-supervised methods there is no noticeable performance boost, and BLSR gets the second best results among the alternatives.
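The equal error rate used to report these results can be approximated from sets of genuine and impostor match scores: sweep a threshold and pick the point where the false accept rate and false reject rate are closest. This is a generic sketch of the metric, not the authors' evaluation code.

```python
def equal_error_rate(genuine_scores, impostor_scores):
    """Approximate EER: the rate at the threshold where FAR and FRR are closest.
    Assumes higher scores mean a better (more genuine-looking) match."""
    thresholds = sorted(set(genuine_scores) | set(impostor_scores))
    best_gap, best_eer = float("inf"), 1.0
    for t in thresholds:
        frr = sum(s < t for s in genuine_scores) / len(genuine_scores)   # false rejects
        far = sum(s >= t for s in impostor_scores) / len(impostor_scores)  # false accepts
        gap = abs(far - frr)
        if gap < best_gap:
            best_gap, best_eer = gap, (far + frr) / 2
    return best_eer
```

With a dense score set this converges on the crossover of the two error curves; production biometric toolkits interpolate between thresholds for a smoother estimate.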
15

Fält, Markus. "Multi-factor Authentication : System proposal and analysis of continuous authentication methods." Thesis, Mittuniversitetet, Institutionen för informationssystem och –teknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-39212.

Full text
Abstract:
It is common knowledge that the average user has multiple online accounts which all require a password. Some studies have shown that the number of passwords for the average user is around 25. Considering this, one can see that it is unreasonable to expect the average user to have 25 truly unique passwords. Because of this, multi-factor authentication could potentially be used to reduce the number of passwords to remember while maintaining, and possibly exceeding, the security of unique passwords. This thesis therefore aims to examine continuous authentication methods and to propose an authentication system for combining various authentication methods. This was done by developing an authentication system using three different authentication factors. This system used a secret sharing scheme so that the authentication factors could be weighted according to their perceived security. The system also proposes a secure storage method for the secret shares, and the feasibility of this is shown. The continuous authentication methods were evaluated by testing various machine learning methods on two public datasets. The methods were graded on accuracy and the rate at which the wrong user was accepted. This showed that random forest and decision trees worked well on the particular datasets. Ensemble learning was then tested to see how the two continuous factors performed once combined into a single classifier. This gave an equal error rate of around 5%, which is comparable to state-of-the-art methods used for similar datasets.
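The abstract does not say which secret sharing scheme the system uses; a common choice for threshold schemes of this kind is Shamir's secret sharing, sketched below over a prime field. Weighting a factor by perceived security can then be realized by allocating it more shares. The prime and parameters are illustrative assumptions.

```python
import random

PRIME = 2 ** 127 - 1  # a Mersenne prime; any prime larger than the secret works

def split(secret: int, n: int, k: int):
    """Split `secret` into n shares, any k of which reconstruct it.
    The secret is the constant term of a random degree-(k-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):   # Horner evaluation of the polynomial at x
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret
```

A stronger factor given two shares and a weaker factor given one would let either "strong + weak" or "strong alone plus a stored share" meet a k-of-n threshold, which is one way the weighting described above could work.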
16

Rintelmann, Anke. "DNA based methods for food authentication." Thesis, University of Nottingham, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.272357.

Full text
17

Bocoum, Mounina G. "Acceptance threshold's adaptability in fingerprint-based authentication methods." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0030/MQ64319.pdf.

Full text
18

Van, Balen Nicolas Jorge. "Enhancing Usability and Security through Alternative Authentication Methods." W&M ScholarWorks, 2017. https://scholarworks.wm.edu/etd/1516639579.

Full text
Abstract:
With the expanding popularity of various Internet services, online users have be- come more vulnerable to malicious attacks as more of their private information is accessible on the Internet. The primary defense protecting private information is user authentication, which currently relies on less than ideal methods such as text passwords and PIN numbers. Alternative methods such as graphical passwords and behavioral biometrics have been proposed, but with too many limitations to replace current methods. However, with enhancements to overcome these limitations and harden existing methods, alternative authentications may become viable for future use. This dissertation aims to enhance the viability of alternative authentication systems. In particular, our research focuses on graphical passwords, biometrics that depend, directly or indirectly, on anthropometric data, and user authentication en- hancements using touch screen features on mobile devices. In the study of graphical passwords, we develop a new cued-recall graphical pass- word system called GridMap by exploring (1) the use of grids with variable input entered through the keyboard, and (2) the use of maps as background images. as a result, GridMap is able to achieve high key space and resistance to shoulder surfing attacks. to validate the efficacy of GridMap in practice, we conduct a user study with 50 participants. Our experimental results show that GridMap works well in domains in which a user logs in on a regular basis, and provides a memorability benefit if the chosen map has a personal significance to the user. In the study of anthropometric based biometrics through the use of mouse dy- namics, we present a method for choosing metrics based on empirical evidence of natural difference in the genders. In particular, we develop a novel gender classifi- cation model and evaluate the model’s accuracy based on the data collected from a group of 94 users. 
Temporal, spatial, and accuracy metrics are recorded from kinematic and spatial analyses of 256 mouse movements performed by each user. The effectiveness of our model is validated through the use of binary logistic regressions. Finally, we propose enhanced authentication schemes through redesigned input, along with the use of anthropometric biometrics on mobile devices. We design a novel scheme called Triple Touch PIN (TTP) that improves traditional PIN number based authentication with highly enlarged keyspace. We evaluate TTP on a group of 25 participants. Our evaluation results show that TTP is robust against dictionary attacks and achieves usability at acceptable levels for users. We also assess anthropometric based biometrics by attempting to differentiate user fingers through the readings of the sensors in the touch screen. We validate the viability of this biometric approach on 33 users, and observe that it is feasible for distinguishing the fingers with the largest anthropometric differences, the thumb and pinkie fingers.
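The keyspace enlargement claimed for TTP can be illustrated with a short sketch. The abstract does not specify the exact TTP design, so the assumption below (each digit entry additionally carries one of three distinguishable touch types) is only a plausible reading used to show how the keyspace multiplies:

```python
# Illustrative keyspace comparison for a PIN scheme augmented with touch
# attributes, in the spirit of Triple Touch PIN (TTP). The exact TTP design is
# not given in the abstract; we assume each digit entry also carries one of
# `touch_variants` distinguishable touch types.

def pin_keyspace(digits: int, symbols: int = 10, touch_variants: int = 1) -> int:
    """Number of distinct secrets for a PIN of the given length."""
    return (symbols * touch_variants) ** digits

classic = pin_keyspace(4)                      # plain 4-digit PIN
ttp_like = pin_keyspace(4, touch_variants=3)   # each digit plus 1 of 3 touch types

print(classic, ttp_like, ttp_like // classic)  # -> 10000 810000 81
```

Even this modest assumption multiplies the 4-digit keyspace by 81, which is the kind of effect an enlarged-keyspace scheme relies on against dictionary attacks.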
APA, Harvard, Vancouver, ISO, and other styles
19

Kiwanuka, Pauline. "The authentication of citrus essential oils by chromatographic methods." Thesis, University of Reading, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.390612.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Haitham, Seror. "Design and Evaluation of Accelerometer Based User Authentication Methods." Thesis, Linköpings universitet, Informationskodning, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-142036.

Full text
Abstract:
Smartphones are extremely popular and in high demand nowadays. They are easy to handle and very intuitive for end users compared with older phones. Approximately two billion people use smartphones all over the world, so it is clear that these phones are very popular. One of the major issues with these smartphones is theft. What happens if someone steals your phone? Why should we try to secure our phones? The reason is that, even if the phone is stolen, the thief should not be able to unlock and use it easily. People are generally careless while typing their password/PIN code or drawing a pattern while others are watching. Someone may see it just by standing next to or behind the person who is typing the PIN or drawing the pattern. This scenario of getting the information is called shoulder surfing. Another scenario is to use a hidden camera, so-called record monitoring. Shoulder surfing can be used by an attacker/observer to get passwords or PINs. Shoulder surfing is very easy to perform by just looking over the shoulder when a user is typing the PIN or drawing the unlock pattern. Record monitoring needs more preparation, but is not much more complicated to perform. Sometimes it also happens that the phone gets stolen and, by seeing fingerprints or smudge patterns on the phone, the attacker can unlock it. These are two general security threats for smartphone users. This thesis introduces some different approaches to overcome the above-mentioned security threats in smartphones. The basic aim is to make shoulder surfing and record monitoring more difficult to perform; these will not be easy for an observer to carry out after switching to the new techniques introduced in the thesis. In this thesis, the usability of each method developed will be described, as well as future use of these approaches. There are a number of techniques by which a user can protect the phone from observation attacks.
Some of these will be considered, and a user interface evaluation will be performed in the later phase of development. I will also consider some important aspects while developing the methods, such as user friendliness and good UI concepts. I will also evaluate the actual security added by the methods, and the overall user impression. Two separate user studies have been performed, the first with students from the Computer Science department, and then one with students from other departments. The results indicate that students from Computer Science are more attracted to the new security solution than students from other departments.
APA, Harvard, Vancouver, ISO, and other styles
21

Woo, Chaw-Seng. "Digital image watermarking methods for copyright protection and authentication." Thesis, Queensland University of Technology, 2007. https://eprints.qut.edu.au/16457/1/Chaw-Seng_Woo_Thesis.pdf.

Full text
Abstract:
The ease of digital media modification and dissemination necessitates content protection beyond encryption. Information hidden as digital watermarks in multimedia enables protection mechanism in decrypted contents. The aims of this research are three-fold: (i) to investigate the strength and limitations of current watermarking schemes, (ii) to design and develop new schemes to overcome the limitations, and (iii) to evaluate the new schemes using application scenarios of copyright protection, tamper detection and authentication. We focus on geometrically robust watermarking and semi-fragile watermarking for digital images. Additionally, hybrid schemes that combine the strength of both robust and semi-fragile watermarks are studied. Robust watermarks are well suited for copyright protection because they stay intact with the image under various manipulations. We investigated two major approaches of robust watermarking. In the synchronization approach, we employed motion estimation for watermark resynchronization. We also developed a novel watermark resynchronization method that has low computational cost using scale normalization and flowline curvature. In another approach, we firstly analyzed and improved a blind watermark detection method. The new method reduces significantly the computational cost of its watermark embedding. Secondly, we created a geometric invariant domain using a combination of transforms, and adapted the blind watermark detection method that we improved. It totally eliminates the need of resynchronization in watermark detection, which is a very desirable achievement that can hardly be found in existing schemes. On the other hand, semi-fragile watermarks are good at content authentication because they can differentiate minor image enhancements from major manipulations. New capabilities of semi-fragile watermarks are identified. 
Then, we developed a semi-fragile watermarking method in wavelet domain that offers content authentication and tamper localization. Unlike others, our scheme overcomes a major challenge called cropping attack and provides approximate content recovery without resorting to an original image. Hybrid schemes combine robust and semi-fragile watermarks to offer deductive information in digital media forensics. We firstly carried out a pilot study by combining robust and fragile watermarks. Then, we performed a comparative analysis on two implementation methods of a hybrid watermarking scheme. The first method has the robust watermark and the fragile watermark overlapped while the second method uses non-overlapping robust and fragile watermarks. Based on the results of the comparative analysis, we merge our geometric invariant domain with our semi-fragile watermark to produce a hybrid scheme. This hybrid scheme fulfilled the copyright protection, tamper detection, and content authentication objectives when evaluated in an investigation scenario.
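The tamper-localization idea in the abstract above can be illustrated with a deliberately simplified fragile watermark. The thesis works in the wavelet domain with a semi-fragile scheme; the sketch below instead uses plain least-significant-bit (LSB) embedding on a list of pixel values, purely to show how a disturbed watermark localizes an edit:

```python
# Minimal fragile-watermark sketch: embed a known bit pattern in the least
# significant bit (LSB) of each pixel; any edit that disturbs the LSBs is
# detected and localized. This is far simpler than the wavelet-domain
# semi-fragile scheme in the thesis and only illustrates tamper localization.

def embed(pixels, pattern):
    """Overwrite each pixel's LSB with the repeating watermark pattern."""
    return [(p & ~1) | pattern[i % len(pattern)] for i, p in enumerate(pixels)]

def tampered_positions(pixels, pattern):
    """Indices where the extracted LSB disagrees with the expected pattern."""
    return [i for i, p in enumerate(pixels) if (p & 1) != pattern[i % len(pattern)]]

image = [120, 121, 200, 55, 54, 53]
wm = embed(image, pattern=[1, 0])

assert tampered_positions(wm, [1, 0]) == []   # intact image verifies
wm[3] = 255                                   # simulate a local edit
print(tampered_positions(wm, [1, 0]))         # -> [3]
```

A semi-fragile scheme differs in that mild processing (e.g. light compression) leaves the mark verifiable while substantive edits still break it, which simple LSB embedding cannot provide.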
APA, Harvard, Vancouver, ISO, and other styles
22

Woo, Chaw-Seng. "Digital image watermarking methods for copyright protection and authentication." Queensland University of Technology, 2007. http://eprints.qut.edu.au/16457/.

Full text
Abstract:
The ease of digital media modification and dissemination necessitates content protection beyond encryption. Information hidden as digital watermarks in multimedia enables protection mechanism in decrypted contents. The aims of this research are three-fold: (i) to investigate the strength and limitations of current watermarking schemes, (ii) to design and develop new schemes to overcome the limitations, and (iii) to evaluate the new schemes using application scenarios of copyright protection, tamper detection and authentication. We focus on geometrically robust watermarking and semi-fragile watermarking for digital images. Additionally, hybrid schemes that combine the strength of both robust and semi-fragile watermarks are studied. Robust watermarks are well suited for copyright protection because they stay intact with the image under various manipulations. We investigated two major approaches of robust watermarking. In the synchronization approach, we employed motion estimation for watermark resynchronization. We also developed a novel watermark resynchronization method that has low computational cost using scale normalization and flowline curvature. In another approach, we firstly analyzed and improved a blind watermark detection method. The new method reduces significantly the computational cost of its watermark embedding. Secondly, we created a geometric invariant domain using a combination of transforms, and adapted the blind watermark detection method that we improved. It totally eliminates the need of resynchronization in watermark detection, which is a very desirable achievement that can hardly be found in existing schemes. On the other hand, semi-fragile watermarks are good at content authentication because they can differentiate minor image enhancements from major manipulations. New capabilities of semi-fragile watermarks are identified. 
Then, we developed a semi-fragile watermarking method in wavelet domain that offers content authentication and tamper localization. Unlike others, our scheme overcomes a major challenge called cropping attack and provides approximate content recovery without resorting to an original image. Hybrid schemes combine robust and semi-fragile watermarks to offer deductive information in digital media forensics. We firstly carried out a pilot study by combining robust and fragile watermarks. Then, we performed a comparative analysis on two implementation methods of a hybrid watermarking scheme. The first method has the robust watermark and the fragile watermark overlapped while the second method uses non-overlapping robust and fragile watermarks. Based on the results of the comparative analysis, we merge our geometric invariant domain with our semi-fragile watermark to produce a hybrid scheme. This hybrid scheme fulfilled the copyright protection, tamper detection, and content authentication objectives when evaluated in an investigation scenario.
APA, Harvard, Vancouver, ISO, and other styles
23

Kothaluru, Tirumala Rao, and Mohamed Youshah Shameel Mecca. "Evaluation of EAP Authentication Methods in Wired and Wireless Networks." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4240.

Full text
Abstract:
In any networking environment, security, connection time and scalability of the network are the major concerns in keeping the network safe, fast and stable. Administrators working within the networking environment need to have a complete account of the manageability, scalability and security of the network, so that organizational data can be kept confidential and its integrity maintained. There are different authentication methods used by network administrators for accessing networks in wired and wireless environments. As network usage and attacks on networks increase, a secure, scalable and standard network protocol is needed for access and for keeping data safe in both wired and wireless networks. IEEE 802.1x is an IEEE standard used to provide authentication and authorization to devices over LAN/WLAN. The IEEE 802.1x framework uses EAP for authentication and authorization with a RADIUS server. In this report, an experimental analysis of different EAP authentication methods in both wired and wireless networks, in terms of authentication time and total processing time, is presented. Wireshark is used to capture the network traffic on the server and client ends. After analyzing the packet timestamps captured using Wireshark, it is seen that EAP-MD5 takes the least time in both wired and wireless networks; if the number of users increases, there is not much difference in the network connection time. Concerning the security of the network, EAP-MD5 is vulnerable to many attacks, so it is not used by many companies. The alternative methods, with their strengths and weaknesses, are discussed.
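The measurement described above reduces to simple timestamp arithmetic once the capture is exported. The sketch below shows one way to compute authentication time from a list of (timestamp, message) pairs such as might be exported from Wireshark; the packet list itself is invented for illustration:

```python
# Computing EAP authentication time from captured packet timestamps, as in the
# Wireshark-based analysis described above. The packet trace is invented: a
# minimal EAP-MD5 exchange ending in EAP-Success.

packets = [
    (0.000, "EAP-Request/Identity"),
    (0.004, "EAP-Response/Identity"),
    (0.009, "EAP-Request/MD5-Challenge"),
    (0.015, "EAP-Response/MD5-Challenge"),
    (0.021, "EAP-Success"),
]

def auth_time(pkts):
    """Interval from the first EAP packet to the final EAP-Success."""
    start = min(t for t, _ in pkts)
    end = next(t for t, name in pkts if name == "EAP-Success")
    return end - start

print(f"{auth_time(packets):.3f} s")  # -> 0.021 s
```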
APA, Harvard, Vancouver, ISO, and other styles
24

Rooney, James. "The usability of knowledge based authentication methods on mobile devices." Thesis, Keele University, 2013. http://eprints.keele.ac.uk/382/.

Full text
Abstract:
Mobile devices are providing ever increasing functionality to users, and the risks associated with applications storing personal details are high. Graphical authentication methods have been shown to provide better security in terms of password space than traditional approaches, as well as being more memorable. The usability of any system is important since an unusable system will often be avoided. This thesis aims to investigate graphical authentication methods based on recall, cued recall and recognition memory in terms of their usability and security.
APA, Harvard, Vancouver, ISO, and other styles
25

Topal, Baran. "Comparison of Methods of Single Sign-On : Post authentication methods in single sign on." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-183144.

Full text
Abstract:
Single sign-on (SSO) is a session verification mechanism that allows a client to use a single password and name combination to access multiple applications. The mechanism validates the client for all the applications and eliminates the need for authentication prompts when a user switches between applications within a session. SSO mechanisms can be classified as software versus hardware, or customer-requirements oriented versus server-side arrangements. The five commonly used mechanisms of Single Sign-On currently are: Web Single Sign-On, Enterprise Single Sign-On, Kerberos (or Ticket/Token Authentication), OpenID, and Federation or Federated Identity. SSO has the main benefit of allowing a user to access many different systems without having to log on to each and every one of them separately. However, SSO introduces a security risk: once an attacker gains access to a single system, the attacker has access to all of the systems. This thesis describes SSO technology, the Security Assertion Markup Language, and the advantages and risks involved in using SSO. It examines authentication mechanisms and their suitability for SSO integration. The main emphasis is a description of a mechanism that ameliorates some of the disadvantages of SSO by monitoring the user's behavior with respect to a template. If a user performs actions that fit the defined template behavior, then the post-authentication mechanism will not be activated. If, on the other hand, a user does something unforeseen, the mechanism will not perform authentication for this user, but rather trigger manual authentication. If this manual authentication succeeds, the user can continue to interact with the system; otherwise the user session will be ended.
This behavioral extension to the authentication mechanism eases the authentication process: users are not expected to remember a username and password, which can easily be forgotten, or to rely on a biometric attribute that can change over time. The method can be integrated into an existing web application without major risk or increased cost.
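The template-monitoring idea described above can be sketched in a few lines. The template contents and the manual-authentication check below are invented placeholders; the point is only the control flow: template-conforming actions pass silently, unforeseen ones trigger manual authentication, and a failed manual check ends the session:

```python
# Sketch of behavior-based post-authentication: actions that match a per-user
# template pass without a prompt; an unforeseen action triggers a manual
# (re-)authentication step that decides whether the session continues.
# Template entries and the manual check are illustrative placeholders.

template = {"open_inbox", "read_mail", "search", "logout"}

def handle(action: str, manual_auth_ok: bool) -> bool:
    """Return True if the session may continue after this action."""
    if action in template:
        return True           # fits the template: no extra prompt
    return manual_auth_ok     # unforeseen action: require manual authentication

assert handle("read_mail", manual_auth_ok=False)                # normal action passes
assert handle("export_all_contacts", manual_auth_ok=True)       # re-auth succeeded
assert not handle("export_all_contacts", manual_auth_ok=False)  # session ends
```

A real implementation would build the template from observed behavior rather than a fixed set, but the decision logic has this shape.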
APA, Harvard, Vancouver, ISO, and other styles
26

Palombo, Hernan Miguel. "A Comparative Study of Formal Verification Techniques for Authentication Protocols." Scholar Commons, 2015. http://scholarcommons.usf.edu/etd/6008.

Full text
Abstract:
Protocol verification is an exciting area of network security that intersects engineering and formal methods. This thesis presents a comparison of formal verification tools for security protocols, covering their respective strengths and weaknesses, supported by the results from several case studies. The formal verification tools considered are based on explicit model checking (SPIN), symbolic analysis (ProVerif) and theorem proving (Coq). We formalize and provide models of several well-known authentication and key-establishment protocols in each of the specification languages, and use the tools to find attacks that demonstrate the protocols' insecurity. We contrast the modelling process on each of the tools by comparing features of their modelling languages, the verification effort involved, and the analysis results. Our results show that authentication and key-establishment protocols can be specified in Coq’s modeling language with an unbounded number of sessions and message space. However, proofs in Coq require human guidance. SPIN runs automated verification with a restricted version of the Dolev-Yao attacker model. ProVerif has several advantages over SPIN and Coq: a tailored specification language, and better performance on infinite state space analysis.
APA, Harvard, Vancouver, ISO, and other styles
27

Petersson, Jakob. "Analysis of Methods for Chained Connections with Mutual Authentication Using TLS." Thesis, Linköpings universitet, Informationskodning, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-119455.

Full text
Abstract:
TLS is a vital protocol used to secure communication over networks and it provides an end-to-end encrypted channel between two directly communicating parties. In certain situations it is not possible, or desirable, to establish direct connections from a client to a server, as for example when connecting to a server located on a secure network behind a gateway. In these cases chained connections are required. Mutual authentication and end-to-end encryption are important capabilities in a high assurance environment. These are provided by TLS, but there are no known solutions for chained connections. This thesis explores multiple methods that provide the functionality for chained connections using TLS in a high assurance environment with trusted servers and a public key infrastructure. A number of methods are formally described and analysed according to multiple criteria reflecting both functionality and security requirements. Furthermore, the most promising method is implemented and tested in order to verify that the method is viable in a real-life environment. The proposed solution modifies the TLS protocol through the use of an extension which allows for the distinction between direct and chained connections. The extension, which also allows for specifying the structure of chained connections, is used in the implementation of a method that creates chained connections by layering TLS connections inside each other. Testing demonstrates that the overhead of the method is negligible and that the method is a viable solution for creating chained connections with mutual authentication using TLS.
APA, Harvard, Vancouver, ISO, and other styles
28

Guerreiro, Joana Maria Gomes dos Santos. "Molecular methods for authentication of protected denomination of origin (PDO) cheeses." Thesis, University of Nottingham, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.435529.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Freimanis, Davis. "Vulnerability Assessment of Authentication Methods in a Large-Scale Computer System." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-261594.

Full text
Abstract:
Vulnerabilities exist in almost all software programs. Some software is more vulnerable than others. A method that can be used to mitigate vulnerabilities is penetration testing. In this thesis, a penetration test was conducted on a large-scale computer system provided by a company. The goal of the thesis was to see if vulnerabilities could be found, with a focus on the field of authentication. After conducting a thorough penetration test, vulnerabilities were found that threaten the confidentiality and integrity of the system. Authentication vulnerabilities were found by leaking password hashes and by performing pass-the-hash and pass-the-ticket exploits.
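Why a leaked password hash is enough for a pass-the-hash attack can be shown with a toy protocol. In challenge-response schemes keyed on hash(password) (as in NTLM-style authentication), the hash itself is the effective credential; the sketch below is illustrative and deliberately not the real NTLM construction:

```python
# Toy challenge-response protocol keyed on hash(password). An attacker who
# leaks the stored hash can compute valid responses without ever knowing the
# password -- the essence of pass-the-hash. Illustrative only, not actual NTLM.
import hashlib
import hmac
import os

def stored_hash(password: str) -> bytes:
    """What the authentication server stores instead of the password."""
    return hashlib.sha256(password.encode()).digest()

def response(secret_hash: bytes, challenge: bytes) -> bytes:
    """Prove knowledge of the hash (not the password) for this challenge."""
    return hmac.new(secret_hash, challenge, hashlib.sha256).digest()

server_hash = stored_hash("hunter2")   # password never leaves the user
challenge = os.urandom(16)

# The attacker leaked only the hash, yet authenticates successfully:
leaked = server_hash
assert hmac.compare_digest(response(leaked, challenge),
                           response(server_hash, challenge))
```

This is why the thesis treats hash leakage itself as an authentication vulnerability rather than a mere stepping stone to password cracking.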
APA, Harvard, Vancouver, ISO, and other styles
30

Dzurenda, Petr. "Bezpečnostní rizika autentizačních metod." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2013. http://www.nusl.cz/ntk/nusl-219935.

Full text
Abstract:
This Master's thesis deals with the security risks of current authentication methods. It describes methods based on the user's knowledge, on ownership of an authentication object, and on biometrics. The practical part of the thesis presents a specific design of an authentication system based on the ACP protocol, in which the user proves his identity with a smart card against provider assets, represented by an ACP portal on the user's computer.
APA, Harvard, Vancouver, ISO, and other styles
31

Lorentzen, Peter, and Johan Lindh. "Evaluation of EAP-methods." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4452.

Full text
Abstract:
Network administrators typically employ different methods for authenticating and authorizing access to their networks. A flexible and scalable network access method is needed to combat the ever increasing network ubiquity brought on by technological advancements. IEEE 802.1x Port-Based Network Access is a technology that allows transparent authentication to a network. It uses EAP methods in order to authenticate against a server. There are a lot of different EAP methods to choose from, and they vary in complexity and security. This report brings up the differences between the most commonly used authentication methods regarding authentication time under different delays and network loads. Results showed that EAP methods that are less complex take less time to perform authentication than their counterparts. When there is no delay, or a very small delay, this might not matter, but when the delay is higher, complex EAP methods take significantly longer to perform the authentication process. This is very negative considering the nature of transparent authentication, and could lead to users becoming annoyed. A general formula for determining how long an EAP authentication process will take is presented.
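The abstract mentions a general formula for EAP authentication time. A plausible first-order model, not necessarily the thesis's exact formula, is that total time grows linearly with the number of protocol round trips and the per-link delay, which explains why complex methods suffer most under high delay:

```python
# First-order model of EAP authentication time (an assumption, not the
# thesis's exact formula): fixed processing time plus one link delay per
# protocol round trip.
#
#   t_total = t_processing + round_trips * rtt

def eap_auth_time(t_processing: float, round_trips: int, rtt: float) -> float:
    return t_processing + round_trips * rtt

# A complex method (many round trips) suffers far more from added delay:
simple = eap_auth_time(0.01, round_trips=3, rtt=0.001)   # e.g. EAP-MD5, low delay
complex_ = eap_auth_time(0.01, round_trips=12, rtt=0.100)  # tunneled method, high delay
print(simple, complex_)
```

Under negligible delay the two results are close, matching the observation that method complexity only matters once network delay grows.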
APA, Harvard, Vancouver, ISO, and other styles
32

El-Din, Sherif Nour. "Application of generalized finite automata methods to image authentication and copyright protection." Thesis, Staffordshire University, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.402739.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Bin, Safie Sairul Izwan. "Pulse domain novel feature extraction methods with application to ecg biometric authentication." Thesis, University of Strathclyde, 2012. http://oleg.lib.strath.ac.uk:80/R/?func=dbin-jump-full&object_id=17829.

Full text
Abstract:
This thesis presents the concept of representing finite signals in terms of sequential output pulses, called the pulse domain, to extract electrocardiogram (ECG) features for biometric authentication systems. Two novel methods based on the pulse domain philosophy, namely the Pulse Active (PA) and Adaptive Pulse Active (APA) techniques, are presented in this thesis. A total of 11 algorithms are derived from these two methods and used to generate novel ECG feature vectors. The six algorithms of the PA technique are named Pulse Active Bit (PAB), Pulse Active Width (PAW), Pulse Active Area (PAA), Pulse Active Mean (PAM), Pulse Active Ratio (PAR) and Pulse Active Harmonic (PAH). The five APA algorithms are named Adaptive Pulse Active Bit (APAB), Adaptive Pulse Active Width (APAW), Adaptive Pulse Active Area (APAA), Adaptive Pulse Active Mean (APAM) and Adaptive Pulse Active Harmonic (APAH). The proposed techniques are validated using experimental ECG data from 112 subjects. Simulation results indicate that APAW generates the best biometric performance of all 11 algorithms. Selected ranges of PA and APA parameters that generate approximately similar biometric performance are determined in this thesis. Using these suggested ranges, the parameters are then used as a personal identification number (PIN), which is a part of the proposed PA-APA ECG based multilevel security biometric authentication system.
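The pulse-domain idea can be sketched with a hypothetical width-style feature. The thesis's exact PA/APA definitions are more elaborate; the code below only illustrates the general principle of thresholding a signal into a pulse train and measuring pulse widths, in the spirit of Pulse Active Width (PAW):

```python
# Hypothetical pulse-domain feature sketch: threshold the signal into active
# pulses and record the width (sample count) of each pulse. This is an
# illustration of the pulse-domain principle, not the thesis's PAW algorithm.

def pulse_widths(signal, threshold):
    widths, run = [], 0
    for x in signal:
        if x > threshold:
            run += 1          # inside an active pulse
        elif run:
            widths.append(run)  # pulse just ended
            run = 0
    if run:
        widths.append(run)    # signal ended mid-pulse
    return widths

ecg_like = [0.1, 0.9, 1.2, 0.8, 0.2, 0.1, 0.7, 1.1, 0.3]
print(pulse_widths(ecg_like, threshold=0.5))  # -> [3, 2]
```

A feature vector built from such widths varies with the shape of each subject's ECG, which is what makes pulse-domain features usable biometrically.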
APA, Harvard, Vancouver, ISO, and other styles
34

Cetin, Cagri. "Authentication and SQL-Injection Prevention Techniques in Web Applications." Scholar Commons, 2019. https://scholarcommons.usf.edu/etd/7766.

Full text
Abstract:
This dissertation addresses the top two “most critical web-application security risks” by combining two high-level contributions. The first high-level contribution introduces and evaluates collaborative authentication, or coauthentication, a single-factor technique in which multiple registered devices work together to authenticate a user. Coauthentication provides security benefits similar to those of multi-factor techniques, such as mitigating theft of any one authentication secret, without some of the inconveniences of multi-factor techniques, such as having to enter passwords or biometrics. Coauthentication provides additional security benefits, including: preventing phishing, replay, and man-in-the-middle attacks; basing authentications on high-entropy secrets that can be generated and updated automatically; and availability protections against, for example, device misplacement and denial-of-service attacks. Coauthentication is amenable to many applications, including m-out-of-n, continuous, group, shared-device, and anonymous authentications. The principal security properties of coauthentication have been formally verified in ProVerif, and implementations have performed efficiently compared to password-based authentication. The second high-level contribution defines a class of SQL-injection attacks that are based on injecting identifiers, such as table and column names, into SQL statements. An automated analysis of GitHub shows that 15.7% of 120,412 posted Java source files contain code vulnerable to SQL-Identifier Injection Attacks (SQL-IDIAs). We have manually verified that some of the 18,939 Java files identified during the automated analysis are indeed vulnerable to SQL-IDIAs, including deployed Electronic Medical Record software for which SQL-IDIAs enable discovery of confidential patient information. Although prepared statements are the standard defense against SQL injection attacks, existing prepared-statement APIs do not protect against SQL-IDIAs. 
This dissertation therefore proposes and evaluates an extended prepared-statement API to protect against SQL-IDIAs.
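The identifier-injection class described above can be demonstrated in a few lines. Value placeholders ("?") cannot parameterize identifiers such as table or column names, so code that splices a user-supplied column name into SQL is vulnerable even when it uses prepared statements for values; an allow-list over known identifiers is the usual defense. The schema, data, and `lookup` helper below are invented for illustration:

```python
# Demonstrating the SQL-Identifier Injection (SQL-IDIA) problem and the
# allow-list defense. Identifiers cannot go through "?" placeholders, so they
# must be validated before being spliced into the statement text.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patients (name TEXT, diagnosis TEXT)")
db.execute("INSERT INTO patients VALUES ('Alice', 'confidential')")

ALLOWED_COLUMNS = {"name"}  # columns this endpoint may expose

def lookup(column: str):
    if column not in ALLOWED_COLUMNS:        # identifier allow-list
        raise ValueError(f"illegal column: {column}")
    # Identifier validated above; values would still use "?" placeholders.
    return db.execute(f"SELECT {column} FROM patients").fetchall()

print(lookup("name"))                        # -> [('Alice',)]
try:
    lookup("diagnosis")                      # injected identifier is refused
except ValueError as e:
    print(e)
```

Without the allow-list check, an attacker-controlled `column` would let the query expose the confidential column, which is exactly the medical-records scenario the dissertation describes.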
APA, Harvard, Vancouver, ISO, and other styles
35

Yakimov, Vladimir. "Examining and comparing the authentication methods for users in computer networks and systems." Thesis, Linnéuniversitetet, Institutionen för datavetenskap, fysik och matematik, DFM, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-15164.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Shultz, Jacque. "Authenticating turbocharger performance utilizing ASME performance test code correction methods." Thesis, Kansas State University, 2011. http://hdl.handle.net/2097/8451.

Full text
Abstract:
Master of Science
Department of Mechanical and Nuclear Engineering
Kirby S. Chapman
Continued regulatory pressure necessitates the use of precisely designed turbochargers to create the design trapped equivalence ratio within large-bore stationary engines used in the natural gas transmission industry. The upgraded turbochargers scavenge the exhaust gases from the cylinder, and create the air manifold pressure and back pressure on the engine necessary to achieve a specific trapped mass. This combination serves to achieve the emissions reduction required by regulatory agencies. Many engine owner/operators request that an upgraded turbocharger be tested and verified prior to re-installation on the engine. Verification of the mechanical integrity and airflow performance prior to engine installation is necessary to prevent field hardware iterations. Confirming the as-built turbocharger design specification prior to transporting it to the field can decrease downtime and installation costs. There are, however, technical challenges to overcome in comparing test-cell data to field conditions. This thesis discusses the required corrections and testing methodology to verify turbocharger on-site performance from data collected in a precisely designed testing apparatus. As the litmus test of the testing system, test performance data is corrected to site conditions per the design air specification. Prior to field installation, the turbocharger is fitted with instrumentation to collect field operating data to authenticate the turbocharger testing system and correction methods. The correction method utilized herein is the ASME Performance Test Code 10 (PTC 10) for Compressors and Exhausters, 1997 version.
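The principle behind correcting test-cell data to site conditions can be sketched with the standard turbomachinery "corrected" parameters referenced to site inlet temperature and pressure. ASME PTC 10 prescribes a much fuller procedure (similarity limits, gas properties, and so on), so treat this only as the underlying idea; the numbers are invented:

```python
# Sketch of referring compressor test data to site inlet conditions via the
# standard corrected-parameter similarity relations. ASME PTC 10 defines the
# full procedure; this shows only the first-order principle. Values invented.
import math

def corrected_speed(n_rpm: float, t_inlet_k: float, t_ref_k: float) -> float:
    """Shaft speed referred to the reference inlet temperature."""
    return n_rpm / math.sqrt(t_inlet_k / t_ref_k)

def corrected_mass_flow(m_dot: float, t_inlet_k: float, t_ref_k: float,
                        p_inlet: float, p_ref: float) -> float:
    """Mass flow referred to the reference inlet temperature and pressure."""
    return m_dot * math.sqrt(t_inlet_k / t_ref_k) / (p_inlet / p_ref)

# Test cell: 288 K, 101 kPa inlet; site design point: 305 K, 97 kPa
print(corrected_speed(22000, 288.0, 305.0))
print(corrected_mass_flow(5.0, 288.0, 305.0, 101.0, 97.0))
```

Because the test cell is cooler and at higher pressure than the site, the corrected speed comes out above the raw test speed and the corrected mass flow below the raw value, which is the direction of adjustment the correction methodology must capture.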
APA, Harvard, Vancouver, ISO, and other styles
37

Tran, Florén Simon. "Implementation and Analysis of Authentication and Authorization Methods in a Microservice Architecture : A Comparison Between Microservice Security Design Patterns for Authentication and Authorization Flows." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-301620.

Full text
Abstract:
Microservices have emerged as an attractive alternative to more classical monolithic software application architectures. Microservices provide many benefits that help with code base comprehension, deployability, testability, and scalability. As the information technology (IT) industry has grown ever larger, it makes sense for the technology giants to adopt the microservice architecture to make use of these benefits. However, with new software solutions come new security vulnerabilities, especially when the technology is new and vulnerabilities are yet to be fully mapped out. Authentication and authorization are the cornerstone of any application that has a multitude of users. However, due to the lack of studies of microservices, stemming from their relatively young age, there are no standardized design patterns for how authentication and authorization are best implemented in a microservice. This thesis investigates an existing microservice in order to secure it by applying what is known as a security design pattern for authentication and authorization. Different security patterns were tested and compared on performance. The differing levels of security provided by these approaches assisted in identifying an acceptable security versus performance trade-off. Ultimately, the goal was to give the patterns greater validity as accepted security patterns within the area of microservice security. Another goal was to find such a security pattern suitable for the given microservice used in this project. The results showed a correlation between increased security and longer response times. For the general case, a security pattern which provided internal authentication and authorization but with some trust between services was suggested. If horizontal scaling was used, the results showed that normal services proved to be the best target. Further, it was also revealed that for lower user counts the performance penalties were close to equal between the tested patterns.
This meant that for the specific case where a microservice sees lower amounts of traffic, the recommended pattern was the one that implemented the maximum number of access control checks. In the environment where the research was performed, traffic volumes were low, and the recommended security pattern was therefore the one that secured all services of the microservice.
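The trade-off reported above, full internal authentication and authorization at every service versus partial trust between services, can be sketched in a few lines. This is an illustrative toy only: the service name, roles, and HMAC-signed token below are invented stand-ins for whatever token format (e.g. JWT) and services a real deployment would use.

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-shared-secret"  # hypothetical; real systems use per-service keys or a JWKS endpoint

def issue_token(user, roles):
    """Gateway authenticates the caller and signs its identity and roles
    (an HMAC stands in for a JWT signature here)."""
    payload = base64.urlsafe_b64encode(json.dumps({"user": user, "roles": roles}).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_token(token):
    """Full internal authentication: a service re-checks the signature itself."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise PermissionError("invalid token")
    return json.loads(base64.urlsafe_b64decode(payload))

def order_service(token, trust_upstream=False):
    """A hypothetical internal service. With trust_upstream=True it decodes the
    claims without verifying them -- the cheaper 'some trust between services'
    pattern, which a forged token slips past."""
    if trust_upstream:
        claims = json.loads(base64.urlsafe_b64decode(token.rsplit(".", 1)[0]))
    else:
        claims = verify_token(token)
    if "orders:read" not in claims["roles"]:  # authorization check
        raise PermissionError("forbidden")
    return "orders for " + claims["user"]
```

Verifying in every service costs an extra signature check per hop, consistent with the measured correlation between security level and response time; the trusting variant saves that work but accepts any well-formed payload from an upstream caller.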
APA, Harvard, Vancouver, ISO, and other styles
38

Webb, James Braxton. "Methods for Securing the Integrity of FPGA Configurations." Thesis, Virginia Tech, 2006. http://hdl.handle.net/10919/35209.

Full text
Abstract:
As Field Programmable Gate Arrays (FPGAs) continue to become integral parts of embedded systems, it is imperative to consider their security. While much of the research in this field is oriented toward the protection of the intellectual property contained in the FPGA's configuration, the protection of the design's integrity from malicious attack against the configuration is critical to the operation of the system. Methods for attacking the configuration include semi-invasive attacks, such as fault injection, and data tampering of incoming partial bitstreams. This thesis introduces methods for securing the integrity of an FPGA's configuration. The design and implementation are discussed for a system that consists of three parts. The first subsystem monitors the running configuration. The second subsystem authenticates partial bitstreams that may be used for repairing the configuration from malicious alterations during run-time. The third subsystem indicates if the system itself succumbs to a malicious attack. The system is implemented on-chip, allowing the FPGA to effectively secure itself from attack.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
39

Beaudin, Shauna. "An Empirical Study of Authentication Methods to Secure E-learning System Activities Against Impersonation Fraud." NSUWorks, 2016. http://nsuworks.nova.edu/gscis_etd/958.

Full text
Abstract:
Studies have revealed that securing Information Systems (IS) from intentional misuse is a concern among organizations today. The use of Web-based systems has grown dramatically across industries including e-commerce, e-banking, e-government, and e-learning to name a few. Web-based systems provide e-services through a number of diverse activities. The demand for e-learning systems in both academic and non-academic organizations has increased the need to improve security against impersonation fraud. Although there are a number of studies focused on securing Web-based systems from Information Systems (IS) misuse, research has recognized the importance of identifying suitable levels of authentication strength for various activities. In e-learning systems, it is evident that due to the variation in authentication strength among controls, a ‘one size fits all’ solution is not suitable for securing diverse e-learning activities against impersonation fraud. The main goal of this study was to use the framework of the Task-Technology Fit (TTF) theory to conduct an exploratory research design to empirically investigate what levels of authentication strength users perceive to be most suitable for activities in e-learning systems against impersonation fraud. This study aimed to assess if the ‘one size fits all’ approach mainly used nowadays is valid when it comes to securing e-learning activities from impersonation fraud. Following the development of an initial survey instrument (Phase 1), expert panel feedback was gathered for instrument validity using the Delphi methodology. The initial survey instrument was adjusted according to feedback (Phase 2). The finalized Web-based survey was used to collect quantitative data for final analyses (Phase 3). This study reported on data collected from 1,070 e-learners enrolled at a university.
Descriptive statistics were used to identify which e-learning activities users perceived, and which they believed their peers would identify, as having a high potential for impersonation. The findings determined that there is a specific set of e-learning activities that have high potential for impersonation fraud and need a moderate to high level of authentication strength to reduce the threat. Principal Component Analysis was used to identify significant components of authentication strength suitable against the threat of impersonation for e-learning activities.
APA, Harvard, Vancouver, ISO, and other styles
40

Doganay-Knapp, Kirsten [Verfasser]. "Potential of Complementary Methods for the Authentication of Herbal Substances and their Mixtures / Kirsten Doganay-Knapp." Bonn : Universitäts- und Landesbibliothek Bonn, 2015. http://d-nb.info/1188726110/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Spielmann, Gesche [Verfasser], and Gerhard [Akademischer Betreuer] Haszprunar. "Establishment and comparison of molecular biological methods for seafood species authentication / Gesche Spielmann ; Betreuer: Gerhard Haszprunar." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2019. http://d-nb.info/1198112085/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Amoah, Raphael. "Formal security analysis of the DNP3-Secure Authentication Protocol." Thesis, Queensland University of Technology, 2016. https://eprints.qut.edu.au/93798/1/Raphael_Amoah_Thesis.pdf.

Full text
Abstract:
This thesis evaluates the security of Supervisory Control and Data Acquisition (SCADA) systems, which are one of the key foundations of many critical infrastructures. Specifically, it examines one of the standardised SCADA protocols called the Distributed Network Protocol Version 3, which attempts to provide a security mechanism to ensure that messages transmitted between devices are adequately secured from rogue applications. To achieve this, the thesis applies formal methods from theoretical computer science to formally analyse the correctness of the protocol.
APA, Harvard, Vancouver, ISO, and other styles
43

Mattord, Herbert J. "Assessment of Web-Based Authentication Methods in the U.S.: Comparing E-Learning Systems to Internet Healthcare Information Systems." NSUWorks, 2012. http://nsuworks.nova.edu/gscis_etd/235.

Full text
Abstract:
Organizations continue to rely on password-based authentication methods to control access to many Web-based systems. This research study developed a benchmarking instrument intended to assess authentication methods used in Web-based information systems (IS). It developed an Authentication Method System Index (AMSI) to analyze collected data from representative samples of e-learning systems in the U.S. and from healthcare ISs, also in the U.S. These data were used to compare authentication methods used by those systems. The AMSI measured 1) password strength requirements, 2) password usage methods, and 3) password reset requirements. Those measures were combined into a single index that represents the current authentication methods. This study revealed that there is no significant difference in the ways that authentication methods are employed between the two groups of ISs. This research validated the criteria proposed for the AMSI using a panel of experts drawn from industry and academia. Simultaneously, the same panel provided preferences for the relative weight of specific criteria within some measures. The panel of experts also assessed the relative weight of each measure within the AMSI. Once the criteria were verified and the elicited weights were computed, an opportunity sample of Web-based ISs in the two groups identified earlier was assessed to ascertain the values for the criteria that comprise the AMSI. After completion of pre-analysis data screening, the collected data were assessed using the results of the AMSI benchmarking tool. Results of the comparison within and between the two sample groups are presented. This research found that the AMSI can be used as a mechanism to measure some aspects of the authentication methods used by Web-based systems. There was no measurable significance in the differences between the sample groups.
However, IS designers, quality assurance teams, and information security practitioners charged with validating IS methods may choose to use it to measure the effectiveness of such authentication methods. This can enable continuous improvement of authentication methods employed in such Web-based systems.
APA, Harvard, Vancouver, ISO, and other styles
44

Howard, Caroline. "The development of Deoxyribonucleic Acid (DNA) based methods for the identification and authentication of medicinal plant material." Thesis, De Montfort University, 2010. http://hdl.handle.net/2086/3972.

Full text
Abstract:
Herbal medicines are growing in popularity in the Western world and are becoming more stringently regulated under new EU legislation. Within the arena of herbal medicines, St. John’s Wort (SJW), Hypericum perforatum, is a top ten best seller with clinical evidence to support its use as an anti-depressant. A fundamental requirement of the new legislation is to prove the identity of the plant material in question. This is currently achieved via morphological and chemical methods, neither of which is ideal. A wide range of DNA-based methods have been applied to this arena; standardisation is required to realise the potential of DNA-based techniques. The DNA barcoding initiative aims to produce sequence data for all plant species, capable of species identification. The proposal is to use these data to design fast and effective DNA-based methods of identification. For assay design, the putative barcode region nrITS was selected as a platform. Three assays were designed:
• A PCR assay targeting hypervariable sequences within a barcode region. This assay is capable of distinguishing SJW from other closely related species.
• A quantitative qPCR assay designed to measure total DNA and specific SJW DNA within a mixed sample.
• A multiplex PCR incorporating fluorescently labelled primers, allowing amplicon detection by capillary electrophoresis. This assay identifies four separate Hypericum species, including SJW, within a mixed sample in one reaction.
The suitability of the nrITS and three other barcode regions is assessed based on sequence data generated for 32 vouchered samples of different Hypericum species, and a Lithuanian sample set of 22 and 16 H. perforatum and H. maculatum samples respectively. The matK is currently unusable, the rbcL highly conserved, trnH-psbA problematically variable, and the nrITS proved to be ideal for assay design.
APA, Harvard, Vancouver, ISO, and other styles
45

Quintal, Kyle. "Context-Awareness for Adversarial and Defensive Machine Learning Methods in Cybersecurity." Thesis, Université d'Ottawa / University of Ottawa, 2020. http://hdl.handle.net/10393/40835.

Full text
Abstract:
Machine Learning has shown great promise when combined with large volumes of historical data and produces great results when combined with contextual properties. In the world of the Internet of Things, the extraction of information regarding context, or contextual information, is increasingly prominent with scientific advances. Combining such advancements with artificial intelligence is one of the themes in this thesis. Particularly, there are two major areas of interest: context-aware attacker modelling and context-aware defensive methods. Both areas use authentication methods to either infiltrate or protect digital systems. After a brief introduction in chapter 1, chapter 2 discusses the current extracted contextual information within cybersecurity studies, and how machine learning accomplishes a variety of cybersecurity goals. Chapter 3 introduces an attacker injection model, championing the adversarial methods. Then, chapter 4 extracts contextual data and provides an intelligent machine learning technique to mitigate anomalous behaviours. Chapter 5 explores the feasibility of adopting a similar defensive methodology in the cyber-physical domain, and future directions are presented in chapter 6. Particularly, we begin this thesis by explaining the need for further improvements in cybersecurity using contextual information and discuss its feasibility, now that ubiquitous sensors exist in our everyday lives. These sensors often show a high correlation with user identity in surprising combinations. Our first contribution lies within the domain of Mobile CrowdSensing (MCS). Despite its benefits, MCS requires proper security solutions to prevent various attacks, notably injection attacks. Our smart-injection model, SINAM, monitors data traffic in an online-learning manner, simulating an injection model with non-detection rates of 99%. SINAM leverages contextual similarities within a given sensing campaign to mimic anomalous injections.
On the flip-side, we investigate how contextual features can be utilized to improve authentication methods in an enterprise context. Also motivated by the emergence of omnipresent mobile devices, we expand the spatio-temporal features of unfolding contexts by introducing three contextual metrics: document shareability, document valuation, and user cooperation. These metrics are vetted against modern machine learning techniques and achieved an average of 87% successful authentication attempts. Our third contribution aims to further improve such results by introducing a Smart Enterprise Access Control (SEAC) technique. Combining the new contextual metrics with SEAC achieved an authenticity precision of 99% and a recall of 97%. Finally, the last contribution is an introductory study on risk analysis and mitigation using context. Here, cyber-physical coupling metrics are created to extract a precise representation of unfolding contexts in the medical field. The presented consensus algorithm achieves initial system convenience and security ratings of 88% and 97% with these new metrics. Even as a feasibility study, physical context extraction shows good promise in improving cybersecurity decisions. In short, machine learning is a powerful tool when coupled with contextual data and is applicable across many industries. Our contributions show how the engineering of contextual features and adversarial and defensive methods can produce applicable solutions in cybersecurity, despite minor shortcomings.
APA, Harvard, Vancouver, ISO, and other styles
46

Cerda, III Cruz. "Medical Identity Theft and Palm Vein Authentication: The Healthcare Manager's Perspective." ScholarWorks, 2018. https://scholarworks.waldenu.edu/dissertations/4778.

Full text
Abstract:
The Federal Bureau of Investigation reported that cyber actors will likely increase cyber intrusions against healthcare systems and their concomitant medical devices because of the mandatory transition from paper to electronic health records, lax cyber security standards, and a higher financial payout for medical records in the deep web. The problem addressed in this quantitative correlational study was uncertainty surrounding the benefits of palm vein authentication adoption relative to the growing crime of medical identity theft. The purpose of this quantitative correlational study was to understand healthcare managers' and doctors' perceptions of the effectiveness of palm vein authentication technology. The research questions were designed to investigate the relationship between intention to adopt palm vein authentication technology and perceived usefulness, complexity, security, peer influence, and relative advantage. The unified theory of acceptance and use of technology was the theoretical basis for this quantitative study. Data were gathered through an anonymous online survey of 109 healthcare managers and doctors, and analyzed using principal axis factoring, Pearson's product moment correlation, multiple linear regression, and 1-way analysis of variance. The results of the study showed a statistically significant positive correlation between perceived usefulness, security, peer influence, relative advantage, and intention to adopt palm vein authentication. No statistically significant correlation existed between complexity and intention to adopt palm vein authentication. These findings indicate that by effectively using palm vein authentication, organizations can mitigate the risk of medical fraud and its associated costs, and positive social change can be realized.
APA, Harvard, Vancouver, ISO, and other styles
47

Defernez, Marianne. "Methods based on principal component analysis of mid-infrared spectra : a new approach for the classification and authentication of fruit products." Thesis, University of East Anglia, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.309908.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Doan, Thi Ngoc Canh. "Statistical Methods for Digital Image Forensics." Thesis, Troyes, 2018. http://www.theses.fr/2018TROY0036.

Full text
Abstract:
Digital imaging technology has grown explosively, posing tremendous concerns for information security. With the support of low-cost image editing tools, the ubiquity of tampered images has become an unavoidable reality. This situation highlights the need to improve and extend the current research in the field of digital forensics to restore trust in digital images. Since each stage of the image history leaves a specific trace on the data, we propose to extract the digital fingerprint as evidence of tampering. Two important problems are addressed in this thesis: quality factor estimation for a given JPEG image and image forgery authentication. For the first problem, a likelihood ratio test has been constructed, relying on a spatial-domain model of the variance of 8 × 8 blocks of JPEG images. In the second part of the thesis, robust forensic detectors have been designed for different types of tampering in the framework of hypothesis testing theory, based on a parametric model that characterizes statistical properties of natural images. The construction of this model is performed by studying the image processing pipeline of a digital camera. The statistical estimation of unknown parameters is employed, leading to application of these tests in practice. This approach allows the design of the most powerful test capable of warranting a prescribed false alarm probability while ensuring a high detection performance. Numerical experiments on simulated and real images have highlighted the relevance of the proposed approach.
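The testing framework described in this abstract, a likelihood ratio statistic whose threshold is set to warrant a prescribed false-alarm probability, can be illustrated with a deliberately simple Gaussian example. This is a textbook Neyman-Pearson sketch, not the thesis's actual model of 8 × 8 JPEG block variances:

```python
import random
from statistics import NormalDist

def log_lr(x, mu0, mu1, sigma):
    """Log-likelihood ratio of H1: N(mu1, sigma^2) against H0: N(mu0, sigma^2);
    with equal variances it is monotone increasing in x when mu1 > mu0."""
    return ((mu1 - mu0) * x - (mu1 ** 2 - mu0 ** 2) / 2) / sigma ** 2

def make_detector(alpha, mu0, mu1, sigma):
    """Neyman-Pearson construction: pick the threshold so that the
    false-alarm probability P(decide H1 | H0) equals the prescribed alpha."""
    x_crit = NormalDist(mu0, sigma).inv_cdf(1 - alpha)  # invert the H0 tail
    tau = log_lr(x_crit, mu0, mu1, sigma)
    return lambda x: log_lr(x, mu0, mu1, sigma) > tau

# Monte Carlo sanity check that the false-alarm constraint is met
rng = random.Random(0)
decide = make_detector(alpha=0.05, mu0=0.0, mu1=1.0, sigma=1.0)
false_alarm_rate = sum(decide(rng.gauss(0.0, 1.0)) for _ in range(20000)) / 20000
detection_rate = sum(decide(rng.gauss(1.0, 1.0)) for _ in range(20000)) / 20000
```

Running the check, the empirical false-alarm rate lands near the prescribed alpha of 0.05 while the detection rate under H1 is several times higher, which is exactly the trade-off the most powerful test optimizes.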
APA, Harvard, Vancouver, ISO, and other styles
49

Hitchcock, Yvonne Roslyn. "Elliptic curve cryptography for lightweight applications." Thesis, Queensland University of Technology, 2003. https://eprints.qut.edu.au/15838/1/Yvonne_Hitchcock_Thesis.pdf.

Full text
Abstract:
Elliptic curves were first proposed as a basis for public key cryptography in the mid 1980's. They provide public key cryptosystems based on the difficulty of the elliptic curve discrete logarithm problem (ECDLP), which is so called because of its similarity to the discrete logarithm problem (DLP) over the integers modulo a large prime. One benefit of elliptic curve cryptosystems (ECCs) is that they can use a much shorter key length than other public key cryptosystems to provide an equivalent level of security. For example, 160 bit ECCs are believed to provide about the same level of security as 1024 bit RSA. Also, the level of security provided by an ECC increases faster with key size than for integer based discrete logarithm (dl) or RSA cryptosystems. ECCs can also provide a faster implementation than RSA or dl systems, and use less bandwidth and power. These issues can be crucial in lightweight applications such as smart cards. In the last few years, ECCs have been included or proposed for inclusion in internationally recognized standards. Thus elliptic curve cryptography is set to become an integral part of lightweight applications in the immediate future. This thesis presents an analysis of several important issues for ECCs on lightweight devices. It begins with an introduction to elliptic curves and the algorithms required to implement an ECC. It then gives an analysis of the speed, code size and memory usage of various possible implementation options. Enough details are presented to enable an implementer to choose for implementation those algorithms which give the greatest speed whilst conforming to the code size and ram restrictions of a particular lightweight device. Recommendations are made for new functions to be included on coprocessors for lightweight devices to support ECC implementations. Another issue of concern for implementers is the side-channel attacks that have recently been proposed.
They obtain information about the cryptosystem by measuring side-channel information such as power consumption and processing time and the information is then used to break implementations that have not incorporated appropriate defences. A new method of defence to protect an implementation from the simple power analysis (spa) method of attack is presented in this thesis. It requires 44% fewer additions and 11% more doublings than the commonly recommended defence of performing a point addition in every loop of the binary scalar multiplication algorithm. The algorithm forms a contribution to the current range of possible spa defences which has a good speed but low memory usage. Another topic of paramount importance to ECCs for lightweight applications is whether the security of fixed curves is equivalent to that of random curves. Because of the inability of lightweight devices to generate secure random curves, fixed curves are used in such devices. These curves provide the additional advantage of requiring less bandwidth, code size and processing time. However, it is intuitively obvious that a large precomputation to aid in the breaking of the elliptic curve discrete logarithm problem (ECDLP) can be made for a fixed curve which would be unavailable for a random curve. Therefore, it would appear that fixed curves are less secure than random curves, but quantifying the loss of security is much more difficult. The thesis performs an examination of fixed curve security taking this observation into account, and includes a definition of equivalent security and an analysis of a variation of Pollard's rho method where computations from solutions of previous ECDLPs can be used to solve subsequent ECDLPs on the same curve. A lower bound on the expected time to solve such ECDLPs using this method is presented, as well as an approximation of the expected time remaining to solve an ECDLP when a given size of precomputation is available. 
It is concluded that adding a total of 11 bits to the size of a fixed curve provides an equivalent level of security compared to random curves. The final part of the thesis deals with proofs of security of key exchange protocols in the Canetti-Krawczyk proof model. This model has been used since it offers the advantage of a modular proof with reusable components. Firstly a password-based authentication mechanism and its security proof are discussed, followed by an analysis of the use of the authentication mechanism in key exchange protocols. The Canetti-Krawczyk model is then used to examine secure tripartite (three party) key exchange protocols. Tripartite key exchange protocols are particularly suited to ECCs because of the availability of bilinear mappings on elliptic curves, which allow more efficient tripartite key exchange protocols.
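The commonly recommended SPA defence that the abstract benchmarks against, performing a point addition in every loop iteration of binary scalar multiplication so that the operation sequence no longer leaks the key bits, can be sketched over a toy curve. The tiny parameters below are for illustration only (real ECC uses standardized curves of roughly 160-256 bits), and this shows the baseline "double-and-add-always" pattern, not the faster defence the thesis contributes:

```python
# Toy short Weierstrass curve y^2 = x^3 + A*x + B over F_p (illustration only)
P_MOD, A, B = 97, 2, 3
O = None  # the point at infinity

def ec_add(p, q):
    """Add two affine points (or O) on the toy curve."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None  # p + (-p) = O; also covers doubling a point with y = 0
    if p == q:  # point doubling
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:       # distinct-point addition
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult_always_add(k, p):
    """Double-and-add-always: one doubling and one addition per key bit, so a
    simple power trace shows the same operation pattern whatever the key is."""
    r = None
    for bit in bin(k)[2:]:          # most significant bit first
        r = ec_add(r, r)            # a doubling happens every iteration
        t = ec_add(r, p)            # ...and so does an addition
        r = t if bit == "1" else r  # the sum is kept only for a 1 bit
    return r
```

The price of this defence is a dummy addition for every 0 bit of the key; the thesis's contribution is a countermeasure that trades part of that overhead away (44% fewer additions at the cost of 11% more doublings).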
APA, Harvard, Vancouver, ISO, and other styles
50

Hitchcock, Yvonne Roslyn. "Elliptic Curve Cryptography for Lightweight Applications." Queensland University of Technology, 2003. http://eprints.qut.edu.au/15838/.

Full text
Abstract:
Elliptic curves were first proposed as a basis for public key cryptography in the mid 1980's. They provide public key cryptosystems based on the difficulty of the elliptic curve discrete logarithm problem (ECDLP), which is so called because of its similarity to the discrete logarithm problem (DLP) over the integers modulo a large prime. One benefit of elliptic curve cryptosystems (ECCs) is that they can use a much shorter key length than other public key cryptosystems to provide an equivalent level of security. For example, 160 bit ECCs are believed to provide about the same level of security as 1024 bit RSA. Also, the level of security provided by an ECC increases faster with key size than for integer based discrete logarithm (dl) or RSA cryptosystems. ECCs can also provide a faster implementation than RSA or dl systems, and use less bandwidth and power. These issues can be crucial in lightweight applications such as smart cards. In the last few years, ECCs have been included or proposed for inclusion in internationally recognized standards. Thus elliptic curve cryptography is set to become an integral part of lightweight applications in the immediate future. This thesis presents an analysis of several important issues for ECCs on lightweight devices. It begins with an introduction to elliptic curves and the algorithms required to implement an ECC. It then gives an analysis of the speed, code size and memory usage of various possible implementation options. Enough details are presented to enable an implementer to choose for implementation those algorithms which give the greatest speed whilst conforming to the code size and ram restrictions of a particular lightweight device. Recommendations are made for new functions to be included on coprocessors for lightweight devices to support ECC implementations. Another issue of concern for implementers is the side-channel attacks that have recently been proposed.
They obtain information about the cryptosystem by measuring side-channel information such as power consumption and processing time and the information is then used to break implementations that have not incorporated appropriate defences. A new method of defence to protect an implementation from the simple power analysis (spa) method of attack is presented in this thesis. It requires 44% fewer additions and 11% more doublings than the commonly recommended defence of performing a point addition in every loop of the binary scalar multiplication algorithm. The algorithm forms a contribution to the current range of possible spa defences which has a good speed but low memory usage. Another topic of paramount importance to ECCs for lightweight applications is whether the security of fixed curves is equivalent to that of random curves. Because of the inability of lightweight devices to generate secure random curves, fixed curves are used in such devices. These curves provide the additional advantage of requiring less bandwidth, code size and processing time. However, it is intuitively obvious that a large precomputation to aid in the breaking of the elliptic curve discrete logarithm problem (ECDLP) can be made for a fixed curve which would be unavailable for a random curve. Therefore, it would appear that fixed curves are less secure than random curves, but quantifying the loss of security is much more difficult. The thesis performs an examination of fixed curve security taking this observation into account, and includes a definition of equivalent security and an analysis of a variation of Pollard's rho method where computations from solutions of previous ECDLPs can be used to solve subsequent ECDLPs on the same curve. A lower bound on the expected time to solve such ECDLPs using this method is presented, as well as an approximation of the expected time remaining to solve an ECDLP when a given size of precomputation is available. 
It is concluded that adding a total of 11 bits to the size of a fixed curve provides an equivalent level of security compared to random curves. The final part of the thesis deals with proofs of security of key exchange protocols in the Canetti-Krawczyk proof model. This model has been used since it offers the advantage of a modular proof with reusable components. Firstly a password-based authentication mechanism and its security proof are discussed, followed by an analysis of the use of the authentication mechanism in key exchange protocols. The Canetti-Krawczyk model is then used to examine secure tripartite (three party) key exchange protocols. Tripartite key exchange protocols are particularly suited to ECCs because of the availability of bilinear mappings on elliptic curves, which allow more efficient tripartite key exchange protocols.
APA, Harvard, Vancouver, ISO, and other styles
