
Dissertations / Theses on the topic 'Reuse and protocol exchange'



Consult the top 50 dissertations / theses for your research on the topic 'Reuse and protocol exchange.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Djaffardjy, Marine. "Pipelines d'Analyse Bioinformatiques : solutions offertes par les Systèmes de Workflows, Cadre de représentation et Étude de la Réutilisation." Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG059.

Abstract:
Bioinformatics is a multidisciplinary field that combines biology, computer science, and statistics, aiming to gain a better understanding of living mechanisms. It relies primarily on the analysis of biological data. Major technological improvements, especially in sequencing technologies, gave rise to an exponential increase in data, posing new challenges in data analysis and management. In order to analyze this data, bioinformaticians use pipelines, which chain computational tools and processes. However, the reproducibility crisis in scientific research highlights the necessity of making analyses reproducible and reusable by others. Scientific workflow systems have emerged as a solution to make pipelines more structured, understandable, and reproducible. Workflows describe procedures with multiple coordinated steps involving tasks and their data dependencies. These systems assist bioinformaticians in designing and executing workflows, facilitating their sharing and reuse. In bioinformatics, the most popular workflow systems are Galaxy, Snakemake, and Nextflow. However, the reuse of workflows faces challenges, including the heterogeneity of workflow systems, limited accessibility of workflows, and the need for public workflow databases. Additionally, indexing and the development of workflow search engines are necessary to facilitate workflow discovery and reuse. In this study, we developed an analysis method for workflow specifications to extract several representative characteristics from a dataset of workflows. The goal was to propose a standardized representation framework independent of the specification language. Additionally, we selected a set of workflow characteristics and indexed them into a relational database and a structured semantic format. Finally, we established an approach to detect similarity between workflows and between processors, enabling us to observe the reuse practices adopted by workflow developers.
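To give a concrete flavour of the processor-similarity detection the abstract describes, here is a minimal Python sketch. The token-based Jaccard measure, the 0.8 threshold, and the toy Snakemake-style commands are illustrative assumptions, not the thesis's actual method:

# Hypothetical sketch: detecting near-identical workflow processors by
# comparing the sets of tokens in their command specifications.
def tokens(command: str) -> set[str]:
    """Tokenize a processor's shell command into a set of words."""
    return set(command.split())

def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def similar_processors(workflows: dict[str, list[str]], threshold: float = 0.8):
    """Yield pairs of (workflow, command) whose commands look reused."""
    items = [(wf, cmd) for wf, cmds in workflows.items() for cmd in cmds]
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if jaccard(tokens(items[i][1]), tokens(items[j][1])) >= threshold:
                yield items[i], items[j]

# Toy usage with two invented Snakemake-style rule commands:
wfs = {
    "wf_a": ["bwa mem -t 8 ref.fa reads.fq > out.sam"],
    "wf_b": ["bwa mem -t 4 ref.fa reads.fq > aln.sam"],
}
for p, q in similar_processors(wfs):
    print("possible reuse:", p, "~", q)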
2

Soltwisch, Rene Alexander. "The Inter-Domain Key Exchange Protocol." Doctoral thesis, [S.l.] : [s.n.], 2006. http://hdl.handle.net/11858/00-1735-0000-0006-B403-2.

3

Nilsson, Kim. "Reactive Networking using Dynamic Link Exchange Protocol." Thesis, KTH, Kommunikationsnät, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-154832.

Abstract:
This master thesis studies the possibilities of using a radio-router protocol in order to increase the quality of service in dynamic tactical network environments. We cover three radio-router protocols, with emphasis on the Dynamic Link Exchange Protocol (DLEP). Many applications, such as voice and video communication, have bandwidth and latency requirements which need to be fulfilled in order to provide a sufficient level of quality. This poses a problem in tactical network environments, where links are typically dynamic and both bandwidth and latency can vary. A radio-router protocol can alleviate this problem and also improve the routing in a network by allowing routers to make use of link-layer information. By using a radio link emulator (RLE) developed by Saab, we are able to simulate dynamic network environments. We have performed two experiments by combining the RLE and an implementation of a subset of the DLEP specification draft. Both experiments simulate typical military network scenarios and allow us to analyse the effects of utilizing link-layer feedback. Our results show that by using DLEP it is possible to provide better quality of service in highly dynamic conditions. We also show that DLEP can influence Optimized Link State Routing (OLSR) by making OLSR aware of changes in the network topology. This leads to a reduced network convergence time with only a small increase in OLSR overhead.
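The following sketch illustrates the general radio-router idea: the radio pushes per-neighbor link metrics to the router, which re-weights its routing cost instead of waiting for routing-protocol timeouts. The message fields and the cost formula are invented for illustration and are not taken from the DLEP specification:

# Illustrative sketch only: a DLEP-style flow in which a radio reports
# link-layer metrics and the router re-weights the corresponding link.
from dataclasses import dataclass

@dataclass
class LinkMetrics:
    neighbor: str
    data_rate_kbps: int   # current usable rate reported by the radio
    latency_ms: float     # measured one-hop latency

class Router:
    def __init__(self):
        self.link_cost: dict[str, float] = {}

    def on_destination_update(self, m: LinkMetrics) -> None:
        # Favor high rate, penalize latency; re-announce to OLSR when the
        # cost changes so topology recomputation does not wait for
        # hello-message timeouts.
        cost = 1e6 / max(m.data_rate_kbps, 1) + m.latency_ms
        if abs(cost - self.link_cost.get(m.neighbor, float("inf"))) > 1.0:
            self.link_cost[m.neighbor] = cost
            print(f"re-announce {m.neighbor} with cost {cost:.1f}")

r = Router()
r.on_destination_update(LinkMetrics("radio-7", data_rate_kbps=2000, latency_ms=12.0))
r.on_destination_update(LinkMetrics("radio-7", data_rate_kbps=250, latency_ms=80.0))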
4

Chandramohan, Vijay. "Design and Performance Evaluation of a New Spatial Reuse FireWire Protocol." [Tampa, Fla.] : University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000128.

5

Sardana, Divya. "Control-channel Reuse-based Multi-channel MAC Protocol for Ad Hoc Networks." University of Cincinnati / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1249856000.

6

Page, Shannon Charles. "Testing Protocol Development for a Proton Exchange Membrane Fuel Cell." Thesis, University of Canterbury. Department of Mechanical Engineering, 2007. http://hdl.handle.net/10092/3519.

Abstract:
Fuel cell technology has undergone significant development in the past 15 years, spurred in part by its unique energy conversion characteristics: directly converting chemical energy to electrical energy. As fuel cell technology has passed through prototype and pre-commercialisation development, there is increasing interest in manufacturing and application issues. Of the six different fuel cell types pursued commercially, the Proton Exchange Membrane (PEM) fuel cell has received the greatest amount of research and development investment due to its suitability for a variety of applications. A particular application to which state-of-the-art PEM fuel cell technology is suited is backup/uninterruptible power supply (UPS) systems, or stand-by power systems. The most important feature of any backup/UPS system is reliability. Traditional backup power systems, such as those utilising valve regulated lead acid (VRLA) batteries, employ remote testing protocols that acquire battery state-of-health and state-of-charge information. This information plays a critical role in system management and reliability assurance. A similar testing protocol developed for a PEM fuel cell would be a valuable contribution to the commercialisation of these systems for backup/UPS applications. This thesis presents a novel testing and analysis procedure, specifically designed for a PEM fuel cell in a backup power application. The test procedure electronically probes the fuel cell in the absence of hydrogen; thus, the fuel cell is in an inactive, or passive, state throughout the testing process. The procedure is referred to as the passive state dynamic behaviour (PSDB) test. Analysis and interpretation of the passive test results is achieved by determining the circuit parameter values of an equivalent circuit model (ECM). A novel ECM of a fuel cell in a passive state is proposed, in which physical properties of the fuel cell are attributed to the circuit model components. Therefore, insight into the physical state of the fuel cell is achieved by determining the values of the circuit model parameters. A method for determining the circuit parameter values of many series-connected cells (a stack) using the results from a single stack test is also presented. The PSDB test enables each cell in a fuel cell stack to be tested and analysed using a simple procedure that can be incorporated into a fuel cell system designed for backup power applications. An experimental system for implementing the PSDB test and evaluating the active performance of three different PEM fuel cells was developed. Each fuel cell exhibited the same characteristic voltage transient when subjected to the PSDB test. The proposed ECM was shown to accurately model the observed transient voltage behaviour of a single cell and of many series-connected cells. An example of how the PSDB test can provide information on the active functionality of a fuel cell is developed. This method consists of establishing baseline performance of the fuel cell in an active state, in conjunction with a PSDB test and identification of model parameter values. A subsequent PSDB test is used to detect changes in the state of the fuel cell that correspond to performance changes when the stack is active. An explicit example is provided, where certain cells in a stack were purposefully humidified. The change in state of the cells was identified by the PSDB test, and the performance change of the affected cells was successfully predicted. The experimental test results verify the theory presented in relation to the PSDB test and equivalent circuit model.
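As an illustration of the ECM-fitting step, the sketch below fits a single exponential relaxation to a synthetic passive-state voltage transient. The one-time-constant form and all numbers are assumptions for demonstration; the thesis's actual ECM attributes several circuit elements to physical properties of the cell:

# Hedged sketch: fitting an RC-type equivalent-circuit transient to a
# passive-state voltage decay (synthetic stand-in for PSDB test data).
import numpy as np
from scipy.optimize import curve_fit

def rc_transient(t, v0, tau):
    """Exponential relaxation V(t) = V0 * exp(-t / tau)."""
    return v0 * np.exp(-t / tau)

t = np.linspace(0, 10, 200)                      # seconds
v = rc_transient(t, 0.45, 2.5) + np.random.normal(0, 0.005, t.size)

(v0_hat, tau_hat), _ = curve_fit(rc_transient, t, v, p0=(0.4, 1.0))
print(f"fitted V0 = {v0_hat:.3f} V, tau = {tau_hat:.2f} s")
# A change in the fitted parameters between two PSDB tests would flag a
# change of state (e.g., membrane humidification) in the corresponding cell.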
7

Ninet, Tristan. "Formal verification of the Internet Key Exchange (IKEv2) security protocol." Thesis, Rennes 1, 2020. http://www.theses.fr/2020REN1S002.

Abstract:
In this thesis, we analyze the IKEv2 protocol specification using three formal verification tools: Spin, ProVerif and Tamarin. To perform the analysis with Spin, we extend and improve an existing modeling method with a simpler adversary model and models for common cryptographic primitives and Lowe's authentication properties. As a result, we show that the reflection attack, an attack found by a previous analysis, is actually not applicable. Moreover, our analysis using ProVerif and Tamarin provides new results regarding the non-injective agreement and injective agreement guarantees of IKEv2 in the unbounded model. We then show that the penultimate authentication flaw, a vulnerability that was considered harmless by previous analyses, actually allows for a new type of Denial-of-Service attack against IKEv2: the Deviation Attack. The Deviation Attack is harder to detect than existing DoS attacks, but is also harder to perform. To concretely demonstrate the attack, we successfully implement it against a popular open-source implementation of IKEv2. Finally, we study the use of existing DoS countermeasures and existing configuration options to defeat the attack, but we find only mitigations or incomplete workarounds. We therefore tackle the problem at a higher level: we propose two possible inexpensive modifications of the protocol, and formally prove that they both prevent the attack.
8

Thomson, Derek Stewart. "The development of packaged, reusable building services components : a pilot study in the UK national health service." Thesis, Heriot-Watt University, 2000. http://hdl.handle.net/10399/1165.

9

Buck, Randall Jay. "WiFu Transport: A User-level Protocol Framework." BYU ScholarsArchive, 2012. https://scholarsarchive.byu.edu/etd/2959.

Abstract:
It is well known that the transport-layer protocol TCP has low throughput and is unfair in wireless mesh networks. Transport-layer solutions for mesh networks have primarily been validated using simulations with simplified assumptions about the wireless network. The WiFu Transport framework complements simulator results by allowing developers to easily create and experiment with transport-layer protocols on live networks. We provide a user-space solution that is flexible and promotes code reuse while maintaining high performance and scalability. To validate WiFu Transport, we use it to build WiFu TCP, a decomposed Tahoe solution that preserves TCP semantics. Furthermore, we share other WiFu developers' experiences building several TCP variants as well as a hybrid protocol to demonstrate flexibility and code reuse. We demonstrate that WiFu Transport performs as well as the Linux kernel on 10 and 100 Mbps Ethernet connections and over a one-hop wireless connection. We also show that our WiFu TCP implementation is fair and that the framework scales to support multiple threads.
10

Gustavsson, C. C. Magnus. "Mail Exchange Protocol (MEP): Ett utkast till nytt protokoll för elektronisk post." Thesis, Linköping University, Department of Computer and Information Science, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2698.

Abstract:

SMTP, the current protocol for sending electronic mail (e-mail) over the Internet, has for many years suffered from several problems and limitations. When it was designed, well over twenty years ago, the requirements for e-mail were very different from those of today. A message was a text message in English, and both user and machine were explicitly named in the address. The protocol was not designed to transfer other types of messages, and no mechanism was included to verify the identity of the sender.

In order to solve these shortcomings, a new e-mail protocol needs to be defined. This report specifies a basis for what such a protocol may look like. The protocol has been designed to be easy to modify and expand, as well as to benefit from more recent ideas and technology. Binary message content is transferred without conversion, sender addresses are verified, and the address format is flexible. Along with the specification of the protocol, a sample implementation has been provided.

11

Chawla, Nitin. "Registration and authentication protocol for OCEAN (Open Computation Exchange and Auctioning Network) /." [Gainesville, Fla.] : University of Florida, 2002. http://purl.fcla.edu/fcla/etd/UFE1001125.

12

Myadam, Nishkal Gupta, and Bhavith Patnam. "Design and Implementation of Key Exchange Mechanisms for Software Artifacts using Ocean Protocol." Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-20665.

Abstract:
In modern times, innovators and researchers have developed a key technology known as the Artificial Intelligence (AI) marketplace, which leverages the power of AI to efficiently utilize the data generated by millions of devices to create new and better services and software products. H2020 Bonseyes is one such project: it provides a collaborative cloud-based model of the AI marketplace for users who generally don't have access to large data sets, algorithms, etc., by allowing them to collaborate with each other and exchange software artifacts. Collaboration leads to issues related to authentication and authorization, which are addressed by a Public Key Infrastructure (PKI). The main component of a PKI is the Certificate Authority (CA), which acts as an anchor of trust and whose architecture is traditionally centralized. A centralized architecture is prone to many attacks and failures, which makes it vulnerable and weak. The adverse effects of CA-based PKI can be avoided by implementing a distributed PKI. This thesis follows a hybrid methodology consisting of qualitative and quantitative analysis, performing a literature review to accumulate knowledge from the Ocean Protocol, which is a decentralized AI marketplace. The thesis aims to design and implement the framework used in the Ocean Protocol and to evaluate its performance. The thesis also aims to develop a reference framework compatible with the Bonseyes project. Moreover, our research provides the reader with the concepts and technologies used in other implementations of distributed PKI.
13

Dahlén, Marcus. "Security Architecture and Technologies for the Electronic Document Exchange with SOAP as Communication Protocol." Thesis, Linköping University, Department of Electrical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2835.

Abstract:

In many industries, the tracking and tracing of products within the supply chain is required by law. Companies in the metal-working industry exchange so-called material test reports, which specify the product's properties and the customer's requirements, and which serve as an assurance between the supplier and the customer. Internet technologies have changed the way companies exchange information and conduct business. In the metal-working industry, companies can implement an intermediary platform and make the exchange of material test reports more efficient. Furthermore, a client application that allows a company to export test reports from its information system directly to the intermediary can significantly decrease processing costs. This inter-organizational collaboration can yield an increase in productivity for customers and suppliers.

The main goal of the thesis is to analyze how companies in a supply chain can exchange documents with an intermediary over the SOAP protocol, and to support companies by presenting a structured procedure for achieving security in a system that uses SOAP. SOAP is a platform-independent, XML-based communication protocol. The Extensible Markup Language (XML) is of major importance in e-business applications because of its platform-, language-, and vendor-independent way of describing data. As a universal data format, it enables the seamless connection of business systems.

SOAP does not provide any security and is usually implemented over HTTP, which allows it to pass through firewalls. Companies are only prepared to join an inter-organizational collaboration if IT security is guaranteed. In the exchange of material test reports, security has two objectives. The first is to replace the handwritten signature of the paper-based document exchange. The second is to guarantee security for the material test reports as well as for the information intermediary.

SOAP's extensibility model allows organizations to develop new extensions, which build upon the protocol and provide functions that the base specification does not define. Specifications for attachments as well as for security should be implemented in the electronic document exchange. To design a secure system, each security concept, such as confidentiality, authentication and integrity, can be analyzed in its context, and the appropriate standard can thereafter be implemented.
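The cryptographic core of the first security objective, replacing the handwritten signature, can be sketched as follows. WS-Security would carry such a signature inside the SOAP envelope; here only the signing step is shown, with an invented report payload and Ed25519 chosen arbitrarily as the signature scheme:

# Hedged sketch: a supplier digitally signs a material test report and
# the customer verifies origin and integrity. Report content is invented.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

supplier_key = Ed25519PrivateKey.generate()
report = b"heat no. 4711; tensile strength 512 MPa; EN 10204 3.1"

signature = supplier_key.sign(report)                 # supplier signs
supplier_key.public_key().verify(signature, report)   # customer verifies
print("material test report integrity and origin verified")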

14

Alotaibi, Abdullah S. "Design and evaluate a fair exchange protocol based on online trusted third party (TTP)." Thesis, De Montfort University, 2012. http://hdl.handle.net/2086/7879.

Abstract:
One of the most crucial properties that e-commerce protocols should provide is fair exchange. In this research, an advanced method of cryptography coupled with the pay-per-use technique is used, and a new electronic commerce protocol for the exchange of commodities is introduced. The proposed protocol guarantees both features while addressing the main drawbacks associated with other related protocols. The suggested e-commerce protocol is composed of two stages: a pre-exchange stage and an exchange stage. When the suggested protocol is analysed with scrupulous protocol analysis, it attains fair exchange and a secure method of payment. The suggested e-commerce protocol is also more efficient than other related existing protocols. In this research, a protocol prototype and model checking are used for the purpose of verification. The protocol prototype verifies that the suggested protocol is executable when used in a real context. Through experimental designs, this research shows that the length of the asymmetric keys is the biggest factor affecting the efficiency of the protocol. When model checking is applied to the protocol, the outcome indicates that it achieves the required fairness properties. Protocol extensions give those involved in the protocol the capacity to be resilient to failure. By using three methods of verification, this research confirms that the proposed protocol is well formulated. The work reported in this thesis first studies the existing fair exchange protocols that solve the fairness problem, and then proposes a more efficient protocol. The original idea in this thesis is to reduce the communication overheads and risks, and to solve the bottleneck problems, in protocols that involve an online TTP.
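A minimal sketch of fair exchange through an online TTP follows. The flow (both items deposited in escrow, then released atomically or not at all) is a simplification for illustration, not the thesis's exact two-stage protocol:

# Sketch: an online trusted third party holds both items in escrow and
# releases them atomically, so neither party can walk away with both.
class OnlineTTP:
    def __init__(self):
        self.escrow: dict[str, bytes] = {}

    def deposit(self, party: str, item: bytes) -> None:
        """A party places its item (goods or payment) in escrow."""
        self.escrow[party] = item

    def settle(self, a: str, b: str) -> tuple[bytes, bytes] | None:
        # Either both parties receive the other's item, or neither does;
        # this all-or-nothing release is what makes the exchange "fair".
        if a in self.escrow and b in self.escrow:
            return self.escrow[b], self.escrow[a]
        return None  # one side missing: nobody receives anything

ttp = OnlineTTP()
ttp.deposit("buyer", b"payment-token")
ttp.deposit("seller", b"digital-commodity")
result = ttp.settle("buyer", "seller")
print("exchange completed" if result else "exchange aborted")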
15

Geary, Aaron C. "Analysis of a man-in-the-middle attack on the Diffie-Hellman key exchange protocol." Thesis, Monterey, California : Naval Postgraduate School, 2009. http://edocs.nps.edu/npspubs/scholarly/theses/2009/Sep/09Sep%5FGeary.pdf.

Abstract:
Thesis (M.S. in Applied Mathematics and M.S. in Information Technology Management), Naval Postgraduate School, September 2009. Thesis advisors: Pantelimon Stanica and Valery Kanevsky. Author's subject terms: cryptography, Diffie-Hellman, man-in-the-middle attack. Includes bibliographical references (p. 55-56). Also available in print.
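The man-in-the-middle attack on unauthenticated Diffie-Hellman that this thesis analyzes can be demonstrated with a toy script (deliberately tiny parameters; a real exchange would use a large group):

# Toy demonstration: Mallory substitutes her own public value, so Alice
# and Bob each unknowingly share a key with Mallory rather than each other.
import random

p, g = 23, 5                     # tiny toy group parameters

a = random.randrange(2, p - 1)   # Alice's secret
b = random.randrange(2, p - 1)   # Bob's secret
m = random.randrange(2, p - 1)   # Mallory's secret

A = pow(g, a, p)                 # Alice -> (intercepted by Mallory)
B = pow(g, b, p)                 # Bob   -> (intercepted by Mallory)
M = pow(g, m, p)                 # Mallory substitutes her own value

k_alice = pow(M, a, p)           # Alice's "shared" key
k_bob = pow(M, b, p)             # Bob's "shared" key
k_mallory_alice = pow(A, m, p)
k_mallory_bob = pow(B, m, p)

assert k_alice == k_mallory_alice and k_bob == k_mallory_bob
print("Mallory can decrypt and re-encrypt all traffic between the two keys")
# Without authentication of the exchanged values (e.g., signatures, as in
# the STS protocol), neither endpoint can detect the substitution.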
16

Watkins, E. James. "Foulant adsorption onto ion exchange membranes." Diss., Georgia Institute of Technology, 1999. http://hdl.handle.net/1853/7062.

17

van Leeuwen, Daniel, and Leonel Taku Ayuk. "Security testing of the Zigbee communication protocol in consumer grade IoT devices." Thesis, Högskolan i Halmstad, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-40189.

Abstract:
With the ever-increasing number of Zigbee-certified Internet of Things devices coming onto the consumer market, there is a need for security testing. This is to make sure that security standards are upheld and improved upon, in order to keep networks protected from unauthorized users. Even though a lot of research and testing has been done on the Zigbee key exchange mechanism, called Zigbee commissioning, the improvements have still not been enough, and severe vulnerabilities still exist in consumer-grade devices today. The devices tested in this study use EZ-mode commissioning to exchange the network key between a Zigbee coordinator and a Zigbee end device; the key is then used to encrypt later communication after pairing. Using a simple radio receiver and a packet-capturing program such as Wireshark, an eavesdropping attack was conducted in order to capture the network key. The experiment demonstrates that this is still a weak point, as the network key was successfully captured through eavesdropping. The analysis of the results shows that previous criticisms of Zigbee commissioning have still not been fully addressed and can be a potential weak point in networks that use Zigbee-certified IoT products.
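Once the network key has been eavesdropped, traffic protected with it can be decrypted offline. The sketch below illustrates the idea with the Python cryptography library; the key, nonce, and frame bytes are hypothetical stand-ins for values that real Zigbee frames carry in their auxiliary header:

# Hedged illustration: decrypting a captured Zigbee-style frame after the
# network key has been recovered. Zigbee uses AES-CCM with a 13-byte nonce
# and, commonly, a 4-byte MIC; all byte values here are made up.
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

network_key = bytes(16)                    # hypothetical captured 128-bit key
aead = AESCCM(network_key, tag_length=4)   # 4-byte MIC, as Zigbee often uses

nonce = bytes(13)        # CCM nonce built from address, frame counter, level
header = b"\x48\x02"     # authenticated-but-unencrypted frame header
ciphertext = aead.encrypt(nonce, b"toggle lightbulb", header)  # stand-in frame

# An eavesdropper holding the network key recovers the payload:
print(aead.decrypt(nonce, ciphertext, header))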
18

Stefik, Christopher J. "Effect of protocol mouthguard on VO₂ max in female hockey players using the skating treadmill." Thesis, McGill University, 2003. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=79136.

Abstract:
Athletes competing in contact sports commonly wear intra-oral dental mouthguards. Data are sparse concerning the influence of a mouthguard on breathing during exercise. We compared VE and VO₂ during submaximal and maximal exercise on a skating treadmill (TM) while wearing an intra-oral dental mouthguard. Female varsity hockey players (n = 12) performed two skating tests on a TM with and without a mouthguard (WIPSS Jaw-Joint Protecto™). The players wore the mouthguard during hockey practices prior to collection of ventilation data on the treadmill. The players also completed a questionnaire that examined their perceptions of the mouthguard in terms of ventilation, comfort and performance; a 10-point rating scale was used for this evaluation. Two performance tests on the skating treadmill examined the effect of the mouthguard on submaximal and maximal aerobic exercise. The subjects skated for 4 min at 2 submaximal velocities (14 and 16 km·h⁻¹) separated by 5 min of passive recovery. A VO₂ max test followed the submaximal tests and commenced at 18 km·h⁻¹, with the velocity increasing by 1 km·h⁻¹ every minute until volitional fatigue. VE, VO₂, VCO₂ and RER were analyzed using a SensorMedics 2900 metabolic cart. Two-way (2 conditions × 3 velocities) repeated-measures ANOVAs were used to examine differences in VE, VO₂ and HR. Ventilation was unchanged when skating at the two submaximal velocities. VO₂ max was 48.8 ml·kg⁻¹·min⁻¹ with the intra-oral mouthguard and 52.4 ml·kg⁻¹·min⁻¹ without a mouthguard. VE max was 108.5 L·min⁻¹ with the intra-oral mouthguard and 114.1 L·min⁻¹ without a mouthguard. The results showed that VE max and VO₂ max were lower using the mouthguard compared to the no-mouthguard condition.
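For readers who want to reproduce this style of analysis, a sketch of a two-way repeated-measures ANOVA with statsmodels follows; the column names and toy numbers are invented and are not the study's data:

# Sketch of a two-way (condition x velocity) repeated-measures ANOVA.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

data = pd.DataFrame({
    "player":     [1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2],
    "mouthguard": ["yes", "yes", "yes", "no", "no", "no"] * 2,
    "velocity":   [14, 16, 18] * 4,
    "vo2":        [38.1, 44.0, 48.5, 39.0, 45.2, 52.1,
                   37.5, 43.1, 49.0, 38.8, 44.9, 52.6],
})

res = AnovaRM(data, depvar="vo2", subject="player",
              within=["mouthguard", "velocity"]).fit()
print(res)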
19

Lampinen, Björn. "Protocol optimization of the filter exchange imaging (FEXI) sequence and implications on group sizes : a test-retest study." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-196327.

Abstract:
Diffusion-weighted imaging (DWI) is a branch of magnetic resonance imaging (MRI) that relies on the diffusion of water molecules for its contrast. Its clinical applications include the early diagnosis of ischemic stroke and mapping of the nerve tracts of the brain. The recent development of filter exchange imaging (FEXI) and the introduction of the apparent exchange rate (AXR) present a new DWI-based technique that uses the exchange of water between compartments as contrast. FEXI could offer new clinical possibilities in the diagnosis, differentiation and treatment follow-up of conditions involving edema or altered membrane permeability, such as tumors, cerebral edema, multiple sclerosis and stroke. Necessary steps in determining the potential of AXR as a new biomarker include running comparative studies between controls and different patient groups, looking for conditions showing large AXR changes. Before designing such studies, however, the experimental protocol of FEXI should be optimized to minimize the experimental variance. Such optimization would improve the data quality, shorten the scan time and keep the required study group sizes small. Here, optimization was done using an active imaging approach and the Cramér-Rao lower bound (CRLB) of Fisher information theory. Three optimal protocols were obtained, each specialized for a different tissue type, and the CRLB method was verified by bootstrapping. A test-retest study of 18 volunteers was conducted in order to investigate the reproducibility of the AXR as measured by one of the protocols, adapted for the scanner. Required group sizes were calculated based on both the CRLB and the variability of the test-retest data, as well as on choices in data analysis such as region of interest (ROI) size. The result of this study is a set of new protocols offering a reduction in the coefficient of variation (CV) of around 30% compared to previously presented protocols. Calculations of the required group sizes showed that the protocols can be used to decide whether any patient group, in a given brain region, has large alterations of AXR using as few as four individuals per group, on average, while still keeping the scan time below 15 minutes. The test-retest study showed a larger-than-expected variability, however, and uncovered artifact-like changes in AXR between measurements. Reproducibility of AXR values ranged from modest to acceptable, depending on the brain region. Group size estimations based on the collected data showed that it is still possible to detect AXR differences larger than 50% in most brain regions using fewer than ten individuals. Limitations of this study include imprecise knowledge of the model priors and a possibly suboptimal modeling of the bias caused by weak signals. Future studies on FEXI methodology could improve the method further by addressing these matters and possibly also the unknown source of variability. For minimal variability, comparative studies of AXR in patient groups could use one of the protocols presented here, while choosing large ROI sizes and calculating the AXR based on averaged signals.
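A sketch of the CRLB machinery used for such protocol optimization is given below. The signal model follows the commonly used FEXI form ADC'(tm) = ADC_eq * (1 - sigma * exp(-AXR * tm)) with S = S0 * exp(-b * ADC'(tm)); the parameter values, b-values and mixing times are illustrative assumptions:

# Hedged sketch: Cramér-Rao lower bounds for a FEXI/AXR protocol via a
# numerical Jacobian and the Gaussian-noise Fisher information matrix.
import numpy as np

def signal(theta, b, tm):
    s0, adc_eq, sigma, axr = theta
    adc = adc_eq * (1.0 - sigma * np.exp(-axr * tm))
    return s0 * np.exp(-b * adc)

def crlb_std(theta, b, tm, noise_sd=0.01, eps=1e-6):
    """CRLB on each parameter's standard deviation for a given protocol."""
    theta = np.asarray(theta, float)
    J = np.empty((b.size, theta.size))
    for k in range(theta.size):          # numerical Jacobian dS / dtheta_k
        d = np.zeros_like(theta); d[k] = eps * max(abs(theta[k]), 1.0)
        J[:, k] = (signal(theta + d, b, tm) - signal(theta - d, b, tm)) / (2 * d[k])
    fisher = J.T @ J / noise_sd**2
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

# Candidate protocol: 3 mixing times x 2 diffusion weightings (toy units).
tm = np.repeat([0.05, 0.2, 0.4], 2)
b = np.tile([0.3, 0.9], 3)
theta0 = (1.0, 1.0, 0.3, 2.0)            # S0, ADC_eq, sigma, AXR
print("CRLB std per parameter:", crlb_std(theta0, b, tm))
# Protocol optimization = choosing (b, tm) sets that minimize the AXR entry.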
20

Alston, Katherine Yvette. "A heuristic on the rearrangeability of shuffle-exchange networks." CSUSB ScholarWorks, 2004. https://scholarworks.lib.csusb.edu/etd-project/2521.

Abstract:
The algorithms which control network routing are specific to the network, because the algorithms are designed to take advantage of that network's topology. The "goodness" of a network includes such criteria as a simple routing algorithm, and a simple routing algorithm would increase the use of the shuffle-exchange network.
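The two primitive moves of a shuffle-exchange network are easy to state in code; this toy sketch (8 terminals) shows the perfect-shuffle rotation and the exchange bit-flip that routing algorithms compose:

# Illustrative sketch of the shuffle-exchange primitives for N = 2**n inputs.
def shuffle(addr: int, n: int) -> int:
    """Perfect shuffle: cyclic left rotation of the n-bit address."""
    return ((addr << 1) | (addr >> (n - 1))) & ((1 << n) - 1)

def exchange(addr: int) -> int:
    """Exchange: toggle the least significant address bit."""
    return addr ^ 1

n = 3                              # 8 terminals, 3-bit addresses
addr = 0b011
path = [addr]
for _ in range(n):                 # one pass: shuffle, then (optionally)
    addr = exchange(shuffle(addr, n))  # exchange; always exchanging here
    path.append(addr)
print([format(x, "03b") for x in path])
# Rearrangeability asks whether repeated passes can realize *every*
# input-to-output permutation; heuristics bound how many passes suffice.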
21

Saraswathy, R. V. "Zero-Knowledge Proof for Knowledge of RLWE (Ring-Learning with Errors) Secret Keys." University of Cincinnati / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1521192556946491.

22

Wang, Hao-Hsien. "Desired Features and Design Methodologies of Secure Authenticated Key Exchange Protocols in the Public-Key Infrastructure Setting." Thesis, University of Waterloo, 2004. http://hdl.handle.net/10012/1087.

Abstract:
The importance of an authenticated key exchange (AKE) protocol has long been known in the field of cryptography. Two of the questions still being asked today are: (1) what properties or features does a secure AKE protocol possess, and (2) how does one, in a step-by-step fashion, create a secure AKE protocol? This thesis aims to answer these two questions. The thesis contains two parts: one is a survey of previous work on the desired features of the Station-to-Station (STS) protocol, and the other is a study of a previously proposed design methodology for secure AKE protocols, together with an original contribution of such a methodology. Descriptions and comparisons of the two design methodologies are included. The thesis surveys the literature and conducts a case study of the STS protocol, analyzes various attacks on STS through some known attacks against it, and extracts the desired properties and features of a secure AKE protocol via the case study. This part of the thesis does not propose any new result, but summarizes a complete list of issues one should take into consideration while designing an AKE protocol. At the end of this part, we also present a secure version of STS that possesses the desired features of an AKE protocol. The other major part of the thesis surveys a design methodology for creating a secure AKE protocol by Bellare, Canetti, and Krawczyk; it is based on taking a secure key exchange protocol and then adding (mutual) authentication to it. The thesis then proposes another, original design methodology; it starts with a secure mutual authentication protocol, then adds the secure key exchange feature without modifying the overheads and the number of flows of the original mutual authentication protocol. We show that the "secure" AKE protocol developed through these two design approaches is identical to the secure version of STS described in the other part, and thus possesses the desired features of a secure AKE protocol. We also give a proof of security of the secure AKE protocol developed under our design methodology.
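The STS idea the thesis builds on, signing both exchanged Diffie-Hellman values and encrypting the signature under the fresh session key, can be sketched with modern primitives standing in for the original modular-exponentiation setting (message framing simplified, Alice's reply flow omitted):

# Hedged sketch of the Station-to-Station pattern using X25519, Ed25519
# and AES-GCM from the Python cryptography library.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def raw(pub):
    # Raw 32-byte encoding of a public key, used in the signed transcript.
    return pub.public_bytes(serialization.Encoding.Raw,
                            serialization.PublicFormat.Raw)

sig_b = Ed25519PrivateKey.generate()       # Bob's long-term signature key

eph_a = X25519PrivateKey.generate()        # Flow 1: Alice -> Bob: g^x
eph_b = X25519PrivateKey.generate()        # Flow 2: Bob -> Alice: g^y + token

k_b = eph_b.exchange(eph_a.public_key())   # Bob's view of the session key
transcript = raw(eph_b.public_key()) + raw(eph_a.public_key())
token = AESGCM(k_b).encrypt(b"\0" * 12, sig_b.sign(transcript), None)

k_a = eph_a.exchange(eph_b.public_key())   # Alice derives the same key
sig_b.public_key().verify(AESGCM(k_a).decrypt(b"\0" * 12, token, None),
                          transcript)      # raises if Bob is not authentic
print("Bob authenticated; shared key established:", k_a == k_b)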
23

Andreasson, Samuel, and Jesper Palmér. "OPC UA Field eXchange Prototyping : Enabling decentralized communication using Publish/Subscribe." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-45029.

Abstract:
Open Platform Communications Unified Architecture, or OPC UA, is a world-leading communication protocol specializing in unifying and automating production-system communication. In 2018 the OPC Foundation, an industrial consortium, started the OPC UA Field eXchange (FX) initiative to develop the current protocol further and extend the reach of the communication down to field-level devices like sensors and actuators. This paper explores whether OPC UA FX software can be implemented and integrated with HMS Networks' product AnyBus CompactCom M40. The problem formulation stems from the future needs of factory communication: for factories to compete, they need to adapt and keep up with technological progress. OPC UA FX is based on decentralized communication, where devices transmit data to each other and the load is distributed over the entire system. The purpose of this report is to develop, based on the Open62541 implementation, software that extends OPC UA with PubSub functionality and methods that enable two or more instances to run as an FX application, meaning that the program publishes and subscribes to data simultaneously. Once the software is developed, we integrate it on an AnyBus CompactCom 40 module. This serves as a communication prototype showing that it is possible to extend OPC UA with FX in HMS Networks' products. Open62541 is used to gather the libraries and methods needed for OPC UA development. The software is developed in C in Visual Studio and integrated into the hardware using Eclipse. The result, in the form of software, was a connection-oriented data exchange, based on the OPC UA information model, in which two or more instances can publish and subscribe to information simultaneously. HMS Networks can use the result on their way to implementing OPC UA FX in their products. In conclusion, the Open62541 implementation is beneficial when developing the OPC UA protocol. The software is complete, but it could not be fully integrated into the CompactCom module. The achieved application is still useful for the development of HMS Networks' products that might use the protocol.
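Conceptually, the FX goal of every instance acting as publisher and subscriber at once can be pictured with a broker-less toy model (this is not the open62541 C API; all names are invented):

# Conceptual sketch: each FX-style instance both publishes its own dataset
# and subscribes to others', with datagram-style fan-out instead of a broker.
class FXInstance:
    def __init__(self, name: str):
        self.name = name
        self.subscriptions = {}

    def subscribe(self, topic: str, handler) -> None:
        self.subscriptions[topic] = handler

    def deliver(self, topic: str, payload) -> None:
        if topic in self.subscriptions:
            self.subscriptions[topic](payload)

def publish(instances, topic: str, payload) -> None:
    """Fan the message out to every instance on the 'network'."""
    for inst in instances:
        inst.deliver(topic, payload)

sensor, actuator = FXInstance("sensor"), FXInstance("actuator")
actuator.subscribe("temperature", lambda v: print("actuator got", v))
sensor.subscribe("setpoint", lambda v: print("sensor got", v))
publish([sensor, actuator], "temperature", 21.5)   # sensor publishes
publish([sensor, actuator], "setpoint", 22.0)      # actuator publishes back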
24

Kunda, Saketh Ram. "Methods to Reuse CAD Data in Tessellated Models for Efficient Visual Configurations : An Investigation for Tacton Systems AB." Thesis, KTH, Skolan för industriell teknik och management (ITM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-281748.

Abstract:
Data loss has always been a side effect of sharing 3D models between different CAD systems. Continuous research and new frameworks have aimed to minimise the data loss in CAD and, separately, in other downstream applications such as 3D visual graphics applications (e.g. 3DS Max, Blender). As a first step into this research area, the thesis is an explorative study of the problem of CAD data loss when exchanging models between a CAD application and a visual application. The thesis was performed at Tacton Systems, which provides product configurations to its customers in both CAD and visual environments; the research therefore focuses on reusing CAD data in visual applications or restoring the data after the exchange. The research questions are framed to identify the causes of data loss and to address possible implementation techniques at the company. Being a niche topic, the thesis required input from different perspectives and knowledge sharing with people outside the company, which shows the significance of open innovation in technology-oriented companies. Ten different ideas were brainstormed and developed into concepts to solve the problem of data loss. All the concepts were analysed and evaluated to check their functionality and the feasibility of implementing them within the company workflow. The evaluations resulted in several concepts that are capable of solving the research problem; these have been verified with various people internal and external to the company. The results also highlight the strengths and weaknesses of each of these concepts, giving the company clear guidance on the next steps.
25

Dowling, Benjamin James. "Provable security of internet protocols." Thesis, Queensland University of Technology, 2017. https://eprints.qut.edu.au/108960/1/Benjamin%20James_Dowling_Thesis.pdf.

Abstract:
Secure communications over the Internet are typically established by first running an authenticated key exchange protocol, which computes a secret key shared between two users; the key is then utilised in an encryption protocol. In this work we examine novel security properties of the most prominent communications protocols, including the Transport Layer Security and Secure Shell protocols. We introduce new security frameworks for analysing security properties of protocols involving negotiation, multiple ciphersuites, long-term key reuse, and time synchronisation. Our results have increased confidence in the security of real-world protocols, and our analyses of next-generation protocols have informed their development by standardisation bodies.
26

Lluch-Ariet, Magí. "Contributions to efficient and secure exchange of networked clinical data : the MOSAIC system." Doctoral thesis, Universitat Politècnica de Catalunya, 2016. http://hdl.handle.net/10803/388037.

Abstract:
The understanding of certain data often requires the collection of similar data from different places to be analysed and interpreted. Multi-agent systems (MAS), interoperability standards (DICOM, HL7 or EN13606) and clinical ontologies are facilitating data interchange among different clinical centres around the world. However, as more and more data becomes available, and as this data becomes more heterogeneous, the task of accessing and exploiting the large number of distributed repositories to extract useful knowledge becomes increasingly complex. Beyond the existing networks and advances in data transfer, data sharing protocols that support multilateral agreements are useful for exploiting the knowledge of distributed data warehouses. Access to a certain data set in a federated data warehouse may be constrained by the requirement to deliver another specific data set. When bilateral agreements between two nodes of a network are not enough to satisfy the constraints for accessing a certain data set, multilateral agreements for data exchange can be a solution. The research carried out in this PhD thesis comprises the design and implementation of a multi-agent system for multilateral exchange agreements of clinical data, and an evaluation of how those multilateral agreements increase the percentage of data collected by a single node out of the total amount of data available in the network. Different strategies to reduce the number of messages needed to achieve an agreement are also considered. The results show that with this collaborative sharing scenario the percentage of data collected improves dramatically from bilateral agreements to multilateral ones, reaching almost all of the data available in the network.
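The benefit of multilateral over bilateral agreements can be illustrated with a toy "wants" graph: when each node releases its data set only in return for one it needs, a cycle of wants can be settled all at once even though no pair could trade bilaterally. The node names and wants below are invented:

# Sketch: brokering a multilateral agreement by finding a cycle in the
# graph of "which node holds the data set I require".
def find_cycle(wants: dict[str, str]) -> list[str] | None:
    """wants[a] = node holding the data set that a requires."""
    for start in wants:
        path, node = [start], wants[start]
        while node in wants and node not in path:
            path.append(node)
            node = wants[node]
        if node == start:
            return path  # every node in the cycle can be satisfied at once
    return None

# A wants B's data, B wants C's, C wants A's: no bilateral deal works,
# but a three-way agreement satisfies everyone.
print(find_cycle({"A": "B", "B": "C", "C": "A"}))  # ['A', 'B', 'C']
print(find_cycle({"A": "B", "B": "A", "C": "A"}))  # ['A', 'B'] (bilateral)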
27

Aguilar, Rodriguez Adriana. "Building networks in the Climate Change Convention : co-ordination failure in the establishment of Clean Development Mechanism (CDM) in Mexico." Thesis, University of Manchester, 2012. https://www.research.manchester.ac.uk/portal/en/theses/building-networks-in-the-climate-change-convention--coordination-failure-in-the-establishment-of-clean-development-mechanism-cdm-in-mexico(02f1f20b-914a-4ca0-8ce0-0423ab3e6100).html.

Abstract:
This thesis evaluates why the implementation of a tree plantation project in Chiapas, Mexico, called Scolel Te, failed in its attempt to participate in the CDM scheme. The Scolel Te project brings together farmers and local organisations into a network of exchange of resources that aims at producing an outcome that is only possible through the co-ordination and co-operation of all participants: the emission of carbon certificates. This thesis studies the co-ordination problems that local actors face at the moment of establishing the carbon projects by identifying how formal and informal mechanisms, such as contracts, economic incentives, trust, and reputation, create or solve co-ordination problems in the Scolel Te network. The thesis also describes how changes in the distribution of power among actors affect the functioning of the network, and how individuals' interests and strategic alliances have the potential to derail the aims of the environmental project. For these purposes, this thesis analyses the exchange relationships among actors at the micro level and identifies how these relationships evolve over time. An overall picture of the exchange relationships is then presented (the macro level), with a focus on understanding how and why power in the network is exerted. Findings suggest that relying on economic incentives as the main mechanism to generate commitment among communities has failed to create stable exchange relationships in the long term; trust and reputation are stronger mechanisms for achieving commitment. Moreover, we find that the ability to generate commitment depends strongly on the generation of interdependencies between tree plantation projects and the main economic activities of local actors. However, the type of land tenure, the main economic activity, and pre-existing power relationships embedded at the local level are also principal factors that determine the dynamism of the social exchange relationships and commitment in the long run. This thesis argues that co-ordination failure occurs because of a lack of knowledge about the real dependencies between local actors and their natural resources in the design of CDMs. At the macro level, this thesis found that the lack of accountability of the unregulated local carbon market has created unintended incentives for actors to adopt less environmentally responsible strategies and has discouraged participation in the CDMs.
28

Shoaib, Naveed. "A Portable and Improved Implementation of the Diffie-Hellman Protocol for Wireless Sensor Networks." Connect to resource online, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1253597142.

29

Pfeffer, Katharina. "Formal Verification of a LTE Security Protocol for Dual-Connectivity : An Evaluation of Automatic Model Checking Tools." Thesis, KTH, Radio Systems Laboratory (RS Lab), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-148047.

Abstract:
Security protocols are ubiquitously used in various applications with the intention of ensuring secure and private communication. To achieve this goal, a mechanism offering reliable and systematic protocol verification is needed. Accordingly, a major interest in academic research on formal methods for protocol analysis has been apparent for the last two decades. Such methods formalize the operational semantics of a protocol, laying the basis for protocol verification with automatic model checking tools. So far, little work in this field has focused on protocol standardization. Within this thesis, a security analysis of a novel Authenticated Key Exchange (AKE) protocol for secure association handover between two Long-Term Evolution (LTE) base stations (which support dual connectivity) is carried out by applying two state-of-the-art tools for automated model checking (Scyther and Tamarin Prover). In the course of this, a formal protocol model and tool input models are developed. Finally, the suitability of the tools for LTE protocol analysis is evaluated. The major outcome is that neither of the two applied tools is capable of modeling and verifying the dual-connectivity protocol accurately enough and in such detail that it would be particularly useful in the considered setting. The reasons for this are restrictions in the syntax of Scyther and the degraded performance of Tamarin when using complex protocol input models. However, the use of formal methods in protocol standardization can be highly beneficial, since it implies a careful consideration of a protocol's fundamentals. Hence, formal methods are helpful in improving and structuring a protocol's design process when applied in conjunction with current practices.
30

Pitchai, Karthik Raja. "An executable meta-model for safety oriented software and systems development processes within the avionics domain in compliance with RTCA DO 178 B." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-19296.

Abstract:
"There are two critical points in every aerial flight—its beginning and its end." — Alexander Graham Bell, 1906. From beginning till the end, the safety critical software plays a vital role in avionics and hence its development and its certification are indispensable. “RTCA DO-178B- Software Considerations in Airborne Systems and Equipment Certification” provides the normative guidelines to develop such systems. In particular, this standard provides the safety protocol and processes that should be followed to achieve safe systems. The safety guideline of DO178B emphasizes more on better documentation, communication and visibility into actual process. For realizing the guidelines of DO178B, a well-defined and collectively accepted (at least at the development team–level) interpretationof the protocol and processes is needed. To achieve such interpretation, a well-defined modeling language that models the process with safety construct is essential. The Object Management Group’s Software and System Process Engineering Metamodel SPEM 2.0 standard provides specification for modeling software and systems development processes. SPEM2.0, however, is a general purpose language and does notprovide sufficient coverage in terms of language constructs to address safety concerns. This thesis proposes S-SPEM, an extension of the SPEM2.0 to allow users to specify safety-oriented processes for the development of safety critical systems in the context of RTCA DO 178B. The DO178B is analyzed to capture the safety related process elements and SPEM 2.0 is extended to include those safety concepts. Moreover, to simulate and validate the modeled processes, S-SPEMconcepts are mapped onto XML Process Definition Language (XPDL) concepts and a transformation algorithm is sketched. Finally, a case-study will illustrate theusage and effectiveness of the proposed extension.
31

Rozsnyó, Tomáš. "Modular Multiple Liquidity Source Price Streams Aggregator." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2012. http://www.nusl.cz/ntk/nusl-236492.

Full text
Abstract:
This MSc thesis was written during a study stay at Hochschule Furtwangen University, Furtwangen, Germany. The project provides the theoretical background for understanding financial market principles. It focuses on the foreign exchange market, describing its fundamentals and price analysis. Further, it covers the principles of high-frequency trading, including strategy, development, and cost. The FIX protocol, the standard communication protocol of the financial markets, is discussed in detail. The core part of the project is sorting algorithms, which are covered at both the theoretical and the practical level. The aggregator design includes the implementation environment, the specification, and the individual parts of the aggregator application represented as objects. An implementation overview can be found in the last chapter.
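As an illustration of the aggregation step described above (not the thesis's implementation), here is a minimal sketch that merges one quote per liquidity source into a sorted top-of-book view; the sources and prices are made up:

```python
from collections import namedtuple

Quote = namedtuple("Quote", "source bid ask")

def aggregate_book(quotes):
    """Merge one quote per liquidity source into an aggregated book:
    bids sorted best-first (descending), asks best-first (ascending)."""
    bids = sorted(quotes, key=lambda q: q.bid, reverse=True)
    asks = sorted(quotes, key=lambda q: q.ask)
    return bids, asks

quotes = [Quote("LP1", 1.3102, 1.3105),
          Quote("LP2", 1.3103, 1.3107),
          Quote("LP3", 1.3101, 1.3104)]
bids, asks = aggregate_book(quotes)
print("best bid:", bids[0], "best ask:", asks[0])
```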
APA, Harvard, Vancouver, ISO, and other styles
32

Greene, Owen J., and P. Batchelor. "Information Exchange and Transparency: Key Elements of an International Action Programme on Small Arms." Thesis, British American Security Information Council (BASIC), International Alert and Saferworld, 2001. http://hdl.handle.net/10454/4267.

Full text
Abstract:
Efforts to combat and prevent illicit trafficking in, and proliferation and misuse of, small arms and light weapons (SALW) are hampered by a lack of relevant information exchange and transparency. International information exchange and transparency arrangements are key elements of each of the main components of the international action programme on SALW to be launched at the UN 2001 Conference. There is great scope to develop information management and distribution arrangements to disseminate and exchange relevant information on SALW without seriously compromising national security, necessary commercial secrecy, or law enforcement. Indeed, national security, commerce, crime prevention and law enforcement are generally enhanced by appropriate transparency and information exchange.
APA, Harvard, Vancouver, ISO, and other styles
33

Dhavala, Kishore. "Essays on Emissions Trading Markets." FIU Digital Commons, 2012. http://digitalcommons.fiu.edu/etd/733.

Full text
Abstract:
This dissertation is a collection of three economics essays on different aspects of carbon emission trading markets. The first essay analyzes the dynamic optimal emission control strategies of two nations. With the potential to become the largest buyer under the Kyoto Protocol, the US is assumed to be a monopsony, whereas with a large number of tradable permits on hand, Russia is assumed to be a monopoly. Optimal costs of emission control programs are estimated for both countries under four different market scenarios: non-cooperative no trade, US monopsony, Russia monopoly, and cooperative trading. The US monopsony scenario is found to be the most Pareto cost-efficient. The Pareto-efficient outcome, however, would require the US to make side payments to Russia, which would even out the differences in the cost savings from cooperative behavior. The second essay analyzes the price dynamics of the Chicago Climate Exchange (CCX), a voluntary emissions trading market. By examining the volatility in market returns using AR-GARCH and Markov switching models, the study associates the market price fluctuations with two different political regimes of the US government. Further, the study identifies high volatility in the returns a few months before the market's collapse. Three possible regulatory and market-based forces are identified as probable causes of the market volatility and its ultimate collapse. Organizers of other voluntary markets in the US and worldwide may watch closely for these regime-switching forces in order to avoid emission market crashes. The third essay compares excess skewness and kurtosis in carbon prices between the CCX and EU ETS (European Union Emission Trading Scheme) Phase I and II markets, by examining the tail behavior when market expectations exceed the threshold level. Dynamic extreme value theory is used to find the mean price exceedance of the threshold levels and estimate the risk loss. The calculated risk measures suggest that CCX and EU ETS Phase I are extremely immature markets for a risk investor, whereas EU ETS Phase II is a more stable market that could develop into a mature carbon market in future years.
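To make the extreme-value, peaks-over-threshold idea concrete, here is a small sketch of the empirical mean-excess computation that underlies such tail-risk estimates; the loss series and threshold below are made up for illustration:

```python
def mean_excess(returns, threshold):
    """Average exceedance over a threshold: the empirical quantity behind
    peaks-over-threshold risk measures for price tails."""
    exceedances = [r - threshold for r in returns if r > threshold]
    return sum(exceedances) / len(exceedances) if exceedances else 0.0

# illustrative daily losses (negative returns, in percent)
losses = [0.2, 1.5, 3.1, 0.7, 4.8, 2.2, 6.0, 0.9, 3.7]
u = 2.0   # threshold chosen from the tail of the empirical distribution
print(f"mean excess over {u}: {mean_excess(losses, u):.2f}")
```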
APA, Harvard, Vancouver, ISO, and other styles
34

Hussain, Dostdar, and Muhammad Ismail. "Requirement Engineering : A comparision between Traditional requirement elicitation techniqes with user story." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-70174.

Full text
Abstract:
Requirements are features or attributes which we discover at the initial stage of building a product. Requirements describe the system functionality that satisfies customer needs. Incomplete and inconsistent requirements lead to cost overruns or a failed project. So there should be a process for obtaining sufficient, accurate and refined requirements; such a process is known as requirement elicitation. The software requirement elicitation process is regarded as one of the most important parts of software development; during this stage it is decided precisely what should be built. There are many requirements elicitation techniques; however, selecting the appropriate technique according to the nature of the project is important for the successful development of the project. Traditional software development and agile approaches to requirements elicitation are each suitable in their own context. With agile approaches, a high-level, low-formality form of requirement specification is produced, and the team is fully prepared to respond to unavoidable changes in these requirements. On the other hand, in a traditional approach, a project may be carried out more satisfactorily with a plan-driven, well-documented specification. Agile processes introduced their most broadly applicable technique with user stories, which express the requirements of the project. A user story is a simple and short written description of desired functionality from the perspective of the user or owner. User stories play an effective role in time-constrained projects and are a good way of introducing a bit of agility to a project. Personas can be used to fill the gaps of user stories.
APA, Harvard, Vancouver, ISO, and other styles
35

Olejník, Tomáš. "Zpracování obchodních dat finančního trhu." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2011. http://www.nusl.cz/ntk/nusl-412828.

Full text
Abstract:
The objective of this master's thesis is to study the basics of high-frequency trading, especially trading on the foreign exchange market. The project deals with foreign exchange data preprocessing: the fundamentals of market data collection, storage and cleaning are discussed. Making decisions based on poor-quality data can have fatal consequences in the money business, so data cleaning is necessary. The thesis describes an adaptive data cleaning algorithm that is able to adapt to current market conditions. Following the design, a modular plug-in application for data collection, storage and subsequent cleaning has been implemented.
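As a hedged illustration of what an adaptive cleaning rule can look like (the thesis's algorithm is not reproduced here), consider a filter whose rejection band widens and narrows with a trailing median absolute deviation, so it adapts to current volatility:

```python
from statistics import median

def clean_ticks(prices, window=5, k=3.0, floor=1e-4):
    """Discard a tick when it deviates from the trailing median by more
    than k times the trailing median absolute deviation (MAD). The MAD
    adapts to current volatility; the floor avoids a zero band in flat
    markets."""
    kept = []
    for p in prices:
        recent = kept[-window:]
        if len(recent) >= 3:
            m = median(recent)
            mad = median(abs(x - m) for x in recent)
            if abs(p - m) > k * max(mad, floor):
                continue          # suspicious tick, drop it
        kept.append(p)
    return kept

ticks = [1.3102, 1.3103, 1.3101, 1.3104, 1.9999, 1.3105, 1.3106]
print(clean_ticks(ticks))         # the 1.9999 spike is removed
```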
APA, Harvard, Vancouver, ISO, and other styles
36

Behnam, Bobby. "Modélisation d'échange d'informations de composants électroniques virtuels." Grenoble INPG, 1998. http://www.theses.fr/1998INPG0126.

Full text
Abstract:
Today's integrated circuits are manufactured using submicron technologies, making it possible to build an entire system on a single silicon chip. This gave rise to the concept of design reuse, which, by reusing external functional blocks, offers designers a path toward the system-on-chip. These blocks are also called virtual components (VCs) or intellectual properties (IPs). This new concept raises many new problems. The very first problem encountered by an IP integrator is the absence of standards and of information about IPs: to learn about existing products, the integrator needs an index from which to start an IP search. The work of this thesis addresses this problem. First, a standard IP summary description format (ISDF) was developed to describe concisely the main characteristics of an IP, according to the requirements of potential IP integrators. Next, a taxonomy (classification) tree was defined to classify IPs by category, according to their application domain. Tagging techniques for grouping different offerings with the same functionality are presented; they enable efficient searching. The widespread access to the Internet led to the development of an Internet-oriented database that incorporates the ISDF, the taxonomy and the tags to offer an effective tool for exchanging IP information worldwide. Writing the database in Java allowed its rapid extension into an intranet IP catalog management system whose format and taxonomy are configurable. This work closely followed, and contributed to, the guidelines of the Virtual Socket Interface Alliance (VSIA), an organization whose goal is to define standards for IP development and integration.
APA, Harvard, Vancouver, ISO, and other styles
37

Wen, Wen. "Energy Efficient Secure Key Management Schemes for WSNs and IoT." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/35257.

Full text
Abstract:
Secret sharing is critical to most applications making use of security and remains one of the most challenging research areas in modern cryptography. In this thesis, we propose a novel, efficient multi-secret sharing scheme based on the Chinese remainder theorem (CRT) with two verification methods, whereas previous works are mostly based on Lagrange polynomials. Key management schemes play an important role in communication security in Wireless Sensor Networks (WSNs). While previous works mainly target two different types of WSNs, distributed and hierarchical, in this thesis we propose a flexible WSN key management scheme, based on the (n,t,n) multi-secret sharing technique, that provides a key management solution for heterogeneous architectures. The powerful key managers are responsible for most of the communication and computation workload. They can provide peer-to-peer pairwise keys for a pair of sensors to establish a secure communication session and, at the same time, they can also form communication clusters as cluster heads according to different application requirements. The Internet of Things (IoT) has become more and more popular and practical in recent years. Considering the diversity of devices and application scenarios, it is extremely hard to couple two devices or sub-networks with different communication and computation resources. In this thesis, we propose novel key agreement schemes based on (n,t,n) multi-secret sharing techniques for the IoT, in order to achieve lightweight key exchange while using the Host Identity Protocol (HIP). We refer to the new schemes as HIP-MEX, with different underlying multi-secret sharing techniques. We analyze the computation and communication costs of the extremely resource-constrained device, referred to as the Initiator; the CRT-based HIP-MEX successfully outsources the heavy workload to the proxy, which is considered more powerful, when establishing a new secret key.
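A minimal sketch of CRT-based threshold secret sharing in the style of Mignotte's scheme illustrates the building block; the thesis's (n,t,n) multi-secret construction and its verification methods are more involved, and the moduli below are toy values:

```python
from math import prod

# Mignotte-style (t, n) threshold scheme: pairwise coprime moduli where any
# t shares reconstruct the secret by the Chinese remainder theorem.
moduli = [103, 107, 109, 113, 127]          # n = 5, pairwise coprime
t = 3
lo = prod(sorted(moduli)[-(t - 1):])        # product of the t-1 largest
hi = prod(sorted(moduli)[:t])               # product of the t smallest
secret = 654_321                            # must satisfy lo < secret < hi
assert lo < secret < hi

shares = [(m, secret % m) for m in moduli]

def crt(pairs):
    """Chinese remainder reconstruction from (modulus, residue) pairs."""
    M = prod(m for m, _ in pairs)
    x = 0
    for m, r in pairs:
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)        # modular inverse (Python 3.8+)
    return x % M

print(crt(shares[:t]) == secret)            # any t shares recover it: True
```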
APA, Harvard, Vancouver, ISO, and other styles
38

Valle, Edson Cordeiro do. "Minimização do uso de água e efluentes com considerações econômicas e operacionais via programação matemática." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2005. http://hdl.handle.net/10183/8170.

Full text
Abstract:
Water is an important raw material in the chemical, petrochemical, and food industries, being used in several stages of a process. Given the economic importance of water and the environmental impact of its use in industry, its rational consumption has been receiving special attention. In recent decades, several methodologies have been proposed for the minimization of industrial water consumption and wastewater emissions, based on mass-integration concepts such as reuse, recycling, and regeneration of process streams. The methodologies proposed in the literature for mass integration can be classified into two main groups: the first uses the thermodynamic-heuristic-evolutive approach, while the second uses mathematical programming and optimization. Both methodologies have difficulty with the case of multiple pollutants: in the first, a series of approximations is needed to handle the problem, while in the second, limitations of the optimization methods are present. Another problem is the operational limitations that mass-integrated processes can present, due to stream recycling and reuse. The present study, using mathematical programming tools and the standard network synthesis problem proposed by Fontana (2002), proposes methods to reduce the complexity of the synthesis problem and to obtain mass-integrated networks that take economic and operational aspects into account. First, the elimination of the integer variables from the optimization problem was proposed, turning the mixed-integer nonlinear programming (MINLP) problem into a nonlinear programming (NLP) problem. Next, a methodology for reducing the complexity of the problem, based on the equality constraints, was applied to reduce the number of decision variables. The proposed formulations showed good results, considerably reducing the computational time relative to the original MINLP problem. Based on these complexity-reduction methodologies, a feasible initial guess generator based on random numbers was also implemented. Three formulations were also proposed for network synthesis with economic and operational considerations, solving the optimization problem at two levels: an external level for the evaluation of economic criteria and an internal level from which the operational index is obtained. Two of the three formulations with operational considerations produced networks with better operational performance compared with the formulations with only economic considerations. These results were evaluated through dynamic simulations with linearized models; in these two implementations, an increase in system robustness (reduction of the control action over the manipulated variables) was observed in the face of positive disturbances in the pollutant loads and changes in the set-points of the controlled variables.
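The variable-elimination idea can be shown on a toy NLP: an equality constraint lets one decision variable be substituted away, shrinking the problem the solver sees. A sketch with SciPy, on a made-up objective unrelated to the thesis's water network model:

```python
from scipy.optimize import minimize

# minimize f(x1, x2) = x1**2 + x2**2  subject to  x1 + x2 = 1

# full problem: two variables plus an explicit equality constraint
full = minimize(lambda x: x[0]**2 + x[1]**2, x0=[0.0, 0.0],
                constraints=[{"type": "eq",
                              "fun": lambda x: x[0] + x[1] - 1}])

# reduced problem: substitute x2 = 1 - x1, one variable, no constraint
reduced = minimize(lambda x: x[0]**2 + (1 - x[0])**2, x0=[0.0])

print(full.x, reduced.x)   # both converge to x1 = x2 = 0.5
```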
APA, Harvard, Vancouver, ISO, and other styles
39

Uematsu, Akira Arice de Moura Galvão. "Algoritmos de negociação com dados de alta frequência." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/45/45133/tde-28042012-114138/.

Full text
Abstract:
In our work we analyzed data from BM&F Bovespa, the São Paulo stock exchange. The dataset refers to January 2011 and is related to the BOVESPA index (IND), the mini BOVESPA index (WIN) and the exchange rate (DOL). These are high-frequency data representing various aspects of the dynamics of trading. The array of values includes the dates and times of trades, prices, volumes offered for trade and other trade characteristics. The first stage of the thesis was to extract the information needed for the analysis from files in the FIX protocol; a program in R was developed for this purpose. Afterwards, we studied the character of temporal dependence in the data, testing Markov properties of fixed and variable memory length. The results of this application show great variability in the character of the dependence, which requires further analysis. We believe this work will be of great importance in future academic studies, in particular the part dealing with the specific character of the FIX protocol used by Bovespa. This had been an obstacle in a number of academic studies, which was obviously undesirable, since Bovespa is one of the largest trading markets in the modern financial world.
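FIX messages are sequences of tag=value fields separated by the SOH (0x01) byte. The thesis's extractor was written in R; the following Python sketch shows the parsing idea with an illustrative, made-up message (tags: 8 BeginString, 35=8 ExecutionReport, 55 Symbol, 44 Price, 38 OrderQty, 52 SendingTime):

```python
SOH = "\x01"   # FIX field delimiter

def parse_fix(message: str) -> dict:
    """Split a FIX message into a tag -> value dictionary."""
    return dict(field.split("=", 1)
                for field in message.strip(SOH).split(SOH) if field)

raw = SOH.join(["8=FIX.4.4", "35=8", "55=IND", "44=61230", "38=5",
                "52=20110103-13:30:01.123"]) + SOH
fields = parse_fix(raw)
print(fields["55"], fields["44"])   # -> IND 61230
```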
APA, Harvard, Vancouver, ISO, and other styles
40

Cunha, Rafael de Souza. "Protocolo de Negociação Baseado em Aprendizagem-Q para Bolsa de Valores." Universidade Federal do Maranhão, 2013. http://tedebc.ufma.br:8080/jspui/handle/tede/501.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
In this work, we applied the technology of Multi-Agent Systems (MAS) to the capital market, i.e., the stock market, specifically the Bolsa de Mercadorias e Futuros de São Paulo (BM&FBovespa). The research focused mainly on the negotiation protocols involved and on the learning of the investor agents. Within the competitive environment of the stock exchange, an agent that learns to trade could become a differentiator for investors who wish to increase their profits. Decision-making based on historical data motivates further research in the same direction; however, we sought a different approach with regard to the representation of the states of the Q-learning algorithm. Reinforcement learning, in particular Q-learning, has been shown to be effective in environments with plenty of historical data, as it seeks to reward decisions with positive results. In this way it is possible to apply, to the purchase and sale of shares, an algorithm that rewards profit and punishes loss. Moreover, to achieve their goals the agents need to negotiate according to the specific protocols of the stock exchange. Therefore, we also specified the negotiation rules between agents that allow the purchase and sale of shares. Through the exchange of messages between agents, it is possible to determine how the trading will occur and to facilitate communication between them, because the way it happens is standardized. Hence, in view of the specification of negotiation protocols based on Q-learning, this research provides the modeling of the intelligent agents and the learning and negotiation models required for the decision-making of the entities involved.
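The learning rule itself is the standard tabular Q-learning update. A minimal sketch follows, with a made-up state encoding; the thesis's state representation and reward design differ:

```python
import random
from collections import defaultdict

# Tabular Q-learning for a buy/sell/hold agent: the reward is the realized
# profit (positive) or loss (negative) of the action.
ACTIONS = ["buy", "hold", "sell"]
Q = defaultdict(float)                      # (state, action) -> value
alpha, gamma, epsilon = 0.1, 0.95, 0.1

def choose(state):
    if random.random() < epsilon:           # explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])   # exploit

def update(state, action, reward, next_state):
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += alpha * (reward + gamma * best_next
                                   - Q[(state, action)])

# one illustrative step: the state could encode recent price movement
s, s_next = "down-down-up", "down-up-up"
a = choose(s)
update(s, a, reward=+1.0, next_state=s_next)   # hypothetical profit of 1
```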
APA, Harvard, Vancouver, ISO, and other styles
41

Zekri, Dorsaf. "Agrégation et extraction des connaissances dans les réseaux inter-véhicules." Thesis, Evry, Institut national des télécommunications, 2013. http://www.theses.fr/2013TELE0001/document.

Full text
Abstract:
The work in this thesis focuses on data management in inter-vehicular networks (VANETs). These networks consist of a set of moving objects that communicate via wireless networks such as IEEE 802.11, Bluetooth, or Ultra Wide Band (UWB). With such communication mechanisms, a vehicle may receive information from its close neighbors or from more remote ones, thanks to multi-hop techniques that use intermediate objects as relays. A lot of information can be exchanged in the context of VANETs, especially to alert drivers when an event occurs (an accident, emergency braking, a vehicle leaving a parking place and wanting to inform others, etc.). As they move, vehicles are then "contaminated" by the information provided by others. In this work, we exploit the data in a way substantially different from existing work, which uses the exchanged data to produce alerts for drivers; once these data are used, they become obsolete and are destroyed. Here, we seek to generate dynamically, from the data collected by vehicles along their path, a summary (or aggregate) that provides information to drivers, including when no communicating vehicle is nearby. To do this, we first propose a spatio-temporal aggregation structure enabling a vehicle to summarize all the observed events. Next, we define a protocol for exchanging summaries between vehicles without the mediation of an infrastructure, allowing a vehicle to improve its local knowledge base by exchanging with its neighbors. Finally, we define strategies for exploiting the summary to assist the driver in decision making. We validated all of our proposals using the VESPA simulator, extending it to take the notion of summaries into account. Simulation results show that our approach can effectively help drivers make good decisions, without the need for a centralized infrastructure.
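As an illustration of the kind of structure involved (not the thesis's design), here is a toy spatio-temporal summary that counts events per space-time cell and merges by simple addition when two vehicles meet; cell and slot sizes are arbitrary:

```python
from collections import Counter

class SpatioTemporalSummary:
    """Toy summary: events are counted per (grid cell, time slot, type),
    so a vehicle keeps bounded state instead of raw event lists."""
    def __init__(self, cell=0.01, slot=600):
        self.cell, self.slot, self.counts = cell, slot, Counter()

    def _key(self, lat, lon, t, kind):
        return (round(lat / self.cell), round(lon / self.cell),
                int(t // self.slot), kind)

    def observe(self, lat, lon, t, kind):
        self.counts[self._key(lat, lon, t, kind)] += 1

    def merge(self, other):
        """Exchange with a neighbour: summaries merge by addition."""
        self.counts += other.counts

a, b = SpatioTemporalSummary(), SpatioTemporalSummary()
a.observe(48.713, 2.208, t=1200, kind="parking-freed")
b.observe(48.713, 2.208, t=1250, kind="parking-freed")
a.merge(b)
print(a.counts.most_common(1))
```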
APA, Harvard, Vancouver, ISO, and other styles
42

Kuppusamy, Lakshmi Devi. "Modelling client puzzles and denial-of-service resistant protocols." Thesis, Queensland University of Technology, 2012. https://eprints.qut.edu.au/61032/1/Lakshmi_Kuppusamy_Thesis.pdf.

Full text
Abstract:
Denial-of-service (DoS) attacks are a growing concern to networked services like the Internet. In recent years, major Internet e-commerce and government sites have been disabled by various DoS attacks. A common form of DoS attack is a resource depletion attack, in which an attacker tries to overload the server's resources, such as memory or computational power, rendering the server unable to serve honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool to thwart DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model to analyse client puzzles. We revisit a few key establishment protocols to analyse their DoS-resilient properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle whose security holds in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we find a weakness in the most recent security model proposed to analyse client puzzles, and this study leads us to introduce a better security model for analysing client puzzles. We demonstrate the utility of our new security definitions by presenting two stronger hash-based client puzzles. We also show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilient properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework to analyse DoS-resilient properties. We also prove that the original security claim of JFK does not hold. We then combine an existing technique to reduce the server cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and is secure under the original security assumptions of JFK. Finally, we introduce a novel cost-shifting technique which significantly reduces the computation cost of the server, and we employ the technique in the most important network protocol, TLS, to analyse the security of the resultant protocol. We also observe that the cost-shifting technique can be incorporated into any Diffie-Hellman based key exchange protocol to reduce the Diffie-Hellman exponentiation cost of a party by one multiplication and one addition.
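A minimal hash-based client puzzle of the proof-of-work family discussed here works as follows: the client searches for a partial hash preimage, while the server verifies with a single hash. The sketch below is illustrative; the thesis's puzzles and security model are more refined, and the difficulty parameter is arbitrary:

```python
import hashlib, os, itertools

def solve(nonce: bytes, d: int) -> int:
    """Find x such that SHA-256(nonce || x) has d leading zero bits."""
    for x in itertools.count():
        digest = hashlib.sha256(nonce + x.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") >> (256 - d) == 0:
            return x

def verify(nonce: bytes, d: int, x: int) -> bool:
    digest = hashlib.sha256(nonce + x.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - d) == 0

nonce = os.urandom(16)        # fresh per request, so work cannot be reused
x = solve(nonce, d=12)        # ~2^12 hashes for the client, on average
print(verify(nonce, 12, x))   # one hash for the server: True
```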
APA, Harvard, Vancouver, ISO, and other styles
43

Foo, Ernest. "Strategies for designing efficient electronic payment schemes." Thesis, Queensland University of Technology, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
44

Krontiris, Alexandros. "Evaluation of Certificate Enrollment over Application Layer Security." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-236033.

Full text
Abstract:
This thesis analyzes Application Layer security protocols for certificate enrollment and management. EDHOC, Ephemeral Diffie-Hellman Over COSE, is a recently developed key exchange protocol which is designed to provide authentication and key-exchange functionality with compact message sizes and minimum round-trip time. The work of this thesis extends the EDHOC protocol with certificate enrollment functionality, targeting constrained IoT devices, and the extension has been implemented for analysis and evaluation purposes. The main scope of this document is to study the security, performance and scalability (in descending order of importance) of enrollment over EDHOC compared to other certificate enrollment protocols.
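The cryptographic core of such a key exchange is an ephemeral Diffie-Hellman followed by key derivation. A sketch using the Python cryptography package follows; EDHOC itself adds authentication, COSE/CBOR encoding and, in this thesis, the enrollment extension, none of which are shown here:

```python
# pip install cryptography
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each side generates an ephemeral X25519 key; the shared secret feeds a
# KDF that derives the session key.
init_priv = X25519PrivateKey.generate()
resp_priv = X25519PrivateKey.generate()

shared_i = init_priv.exchange(resp_priv.public_key())
shared_r = resp_priv.exchange(init_priv.public_key())
assert shared_i == shared_r

def session_key(shared: bytes) -> bytes:
    # the info label is illustrative, not EDHOC's actual key schedule
    return HKDF(algorithm=hashes.SHA256(), length=16,
                salt=None, info=b"illustrative-edhoc-core").derive(shared)

print(session_key(shared_i).hex())
```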
APA, Harvard, Vancouver, ISO, and other styles
45

"Channel reuse multiple access protocol for bidirectional bus networks." Massachusetts Institute of Technology, Laboratory for Information and Decision Systems], 1992. http://hdl.handle.net/1721.1/3254.

Full text
Abstract:
Whay Chiou Lee, Pierre Humblet.
Caption title.
Includes bibliographical references (leaf 5).
Supported by the Defense Advanced Research Projects Agency. N00014-84-K-0357 Supported by the National Science Foundation. NSF-ECS-7919880 Supported by the Army Research Office. ARO-DAAL03-86-K-0171
APA, Harvard, Vancouver, ISO, and other styles
46

Chen, Huai-Hsien. "Implementation of Internet Key Exchange Protocol." 2004. http://www.cetd.com.tw/ec/thesisdetail.aspx?etdun=U0001-2507200423262600.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Chen, Huai-Hsien, and 陳懷先. "Implementation of Internet Key Exchange Protocol." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/61507602852405847722.

Full text
Abstract:
Master's thesis, National Taiwan University, Graduate Institute of Electrical Engineering, ROC academic year 92 (2003/04).
As the Internet has grown over the past ten years, more and more people communicate with their friends over it. But the Internet does not provide privacy: a perpetrator may observe confidential data as it traverses the network. A perpetrator may also modify data in transit, so data integrity is lost. Things can be worse: a perpetrator may impersonate you and send data to others. Thus, security mechanisms must be used to prevent the above situations from happening. IP Security (IPSec) is a network-layer protocol. By implementing security mechanisms at the IP level, one can ensure secure communication not only for applications that have their own security mechanisms but also for many security-ignorant applications. IPSec combines a number of security technologies, including the IPSec headers, which define the information to be added to an IP packet, and IKE, which negotiates the security associations between two entities. In this thesis, we describe an implementation of the IKEv2 protocol, which can be used to perform mutual authentication and to establish and maintain security associations.
APA, Harvard, Vancouver, ISO, and other styles
48

HSU, LIN-CHIH, and 許令芷. "Password authentication and key exchange protocol." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/59451420718306413483.

Full text
Abstract:
Master's thesis, Fo Guang University, Department of Informatics, ROC academic year 96 (2007/08).
The Internet provides various services that make human life more convenient. However, it is accompanied by a variety of information security attacks. Before access to computer resources at a remote site is granted, mutual authentication of the communicating entities is one of the important issues in network security. In this paper we propose a new password authentication and key exchange protocol using smart cards. The proposed scheme has the following characteristics: (1) the server is not required to store any authentication data for users; (2) users can freely choose their own passwords; (3) it provides mutual authentication; (4) the communicating parties can exchange a common session key; (5) it enables the control of expiration; (6) the computational complexity of the login phase is lower than that of previously proposed schemes.
APA, Harvard, Vancouver, ISO, and other styles
49

Lee, Wei-Cheng, and 李威成. "A Fair Exchange Protocol for Internet Auction." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/94696804468461683616.

Full text
Abstract:
Master's thesis, Fu Jen Catholic University, Department of Information Management, ROC academic year 94 (2005/06).
With the popularity of the internet and the rise of electronic commerce, internet auctions have become one of the most popular ways of shopping. But there are many cheating incidents in internet auctions, which cause consumers to doubt the trustworthiness of online shopping and obstruct the development of internet auctions. Facing the increasing cheating problems, some major auction websites guarantee their customers against losses and have brought out corresponding policies. However, these policies have deficiencies, so it is necessary to design a fair transaction mechanism for the internet auction environment. Although the cheating problems in internet auctions are getting worse, there is still no standard protocol to solve them. This study proposes a fair exchange protocol for internet auctions, which lets the buyer and seller complete the transaction in a fair way. The protocol builds on the verifiable and recoverable encryption of DSA signatures proposed by Nenadic et al. (2005) and the electronic cash technique proposed by Song and Korba (2004). This protocol can: (1) assure the fairness of the transaction result, (2) protect the privacy of the transaction messages, (3) produce non-repudiation evidence to prove the participation of buyer and seller, (4) allow the buyer to verify that the goods are correct, and (5) reduce the auction website's involvement during the transaction process. The study uses cryptographic methods to ensure transaction security between buyers and sellers, and is expected to solve these problems and improve the security of internet auctions.
APA, Harvard, Vancouver, ISO, and other styles
50

Li, Wen-Hui, and 李文輝. "An Efficient Three-Party Key Exchange Protocol." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/59665431311256984116.

Full text
Abstract:
Master's thesis, Chaoyang University of Technology, Master's Program, Department of Information Engineering, ROC academic year 97 (2008/09).
The dawn of the Internet era also brought mass threats from viruses and hackers. Thus, the most appealing element of three-party communication is minimizing the number of session keys each user must hold. Recent studies of three-party communication have concentrated on relying on a public key cryptosystem on the authentication server for the security between each pair of parties. Yet this method is prone to identity forgery and password guessing attacks. Therefore, some have claimed that bypassing public key operations on the authentication server is necessary to prevent the above security issues; i.e., the users can obtain and exchange the session key through a trusted third party. During a session, the authentication server verifies the users' identities and passwords and generates a temporary session key for each participant; this session key is highly secure against all sorts of modern attacks, effectively and efficiently. In this paper, we design an efficient three-party session key exchange authentication protocol based on identities and passwords pre-registered with one authentication server. During the session, the public key is encrypted with the identity of the other participant, thereby confirming each counterpart's identity and protecting the session from the potential threats of unknown parties. Passwords are removed from the calculation of the temporary session keys that are sent separately to each participant, both to reduce the required computation and to keep the protocol secure, namely against dictionary guessing attacks. Thus, in comparison to other protocols, our protocol not only reduces the required computation but also safeguards communications in today's Internet environment.
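As a toy sketch of the server-mediated pattern described here (not the thesis's protocol), consider a trusted server that issues one fresh session key and wraps a copy for each pre-registered participant under that participant's long-term key; all names and the KDF-pad wrapping are illustrative:

```python
import hmac, hashlib, os

def kdf(key: bytes, *parts: bytes) -> bytes:
    return hmac.new(key, b"".join(parts), hashlib.sha256).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# each user shares a long-term key with the trusted server (registration)
users = {"alice": kdf(b"pw-alice", b"register"),
         "bob":   kdf(b"pw-bob", b"register")}

def server_issue(a: str, b: str):
    """Server generates one fresh session key and wraps a copy for each
    participant under that participant's long-term key."""
    sk, nonce = os.urandom(32), os.urandom(16)
    wrapped = {u: xor(sk, kdf(users[u], b"wrap", nonce,
                              a.encode(), b.encode()))
               for u in (a, b)}
    return nonce, wrapped

def client_unwrap(longterm: bytes, nonce: bytes, blob: bytes,
                  a: str, b: str) -> bytes:
    return xor(blob, kdf(longterm, b"wrap", nonce, a.encode(), b.encode()))

nonce, wrapped = server_issue("alice", "bob")
k_a = client_unwrap(users["alice"], nonce, wrapped["alice"], "alice", "bob")
k_b = client_unwrap(users["bob"], nonce, wrapped["bob"], "alice", "bob")
print(k_a == k_b)   # both ends hold the same fresh session key: True
```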
APA, Harvard, Vancouver, ISO, and other styles