Dissertations / Theses on the topic 'Verification'

Consult the top 50 dissertations / theses for your research on the topic 'Verification.'


1

Wang, Xuan. "Verification of Digital Controller Implementations." BYU ScholarsArchive, 2005. https://scholarsarchive.byu.edu/etd/681.

Full text
Abstract:
This thesis presents an analysis framework to verify the stability property of a closed-loop control system with a software controller implementation. The usual approach to verifying stability for software uses experiments, which are costly and can be dangerous. More recently, mathematical models of software have been proposed which can be used to reason about the correctness of controllers. However, these mathematical models ignore computational details that may be important in verification. We propose a method to determine the instability of a closed-loop system with a software controller implementation under l^2 inputs using simulation. This method avoids the cost of experimentation and the loss of precision inherent in mathematical modeling. The method uses the small gain theorem to compute a lower bound on the 2-induced norm of the uncertainty in the software implementation; if the lower bound is greater than 1/(2-induced norm of G), where G is the feedback system consisting of the mathematical model of the plant and the mathematical model of the controller, the closed-loop system is unsafe in a certain sense. The resulting method cannot determine whether the closed-loop system is stable, but can only suggest instability.
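A minimal sketch (not from the thesis) of the simulation-based check the abstract describes, assuming simulated input/output traces of the uncertainty block are available as NumPy arrays; the function names and trace format are illustrative assumptions.

```python
import numpy as np

def norm_lower_bound(traces):
    """Lower-bound the 2-induced norm of the uncertainty block from
    simulated (input, output) signal pairs gathered under l^2 inputs."""
    return max(np.linalg.norm(y) / np.linalg.norm(u) for u, y in traces)

def suggests_instability(traces, norm_G):
    """Small gain argument: stability is only guaranteed while
    ||Delta|| * ||G|| < 1, so a lower bound on ||Delta|| exceeding
    1 / ||G|| flags the closed loop as 'unsafe' in the thesis's sense."""
    return norm_lower_bound(traces) > 1.0 / norm_G
```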
APA, Harvard, Vancouver, ISO, and other styles
2

Krupp, Alfred Alexander. "A verification plan for systematic verification of mechatronic systems." Aachen Shaker, 2009. http://d-nb.info/995161909/04.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Tiikkainen, M. (Martti). "Automated functional coverage driven verification with Universal Verification Methodology." Master's thesis, University of Oulu, 2017. http://jultika.oulu.fi/Record/nbnfioulu-201711033027.

Full text
Abstract:
In this Master's thesis, the validity of the Universal Verification Methodology in digital design verification is studied. A brief look into the methodology's history is taken, and its unique properties and object-oriented features are presented. Important coverage topics in project planning are discussed; the two main types of coverage, code and functional coverage, are explained, and the methods by which they are captured are presented. The practical section of this thesis shows the implementation of a monitoring environment and a Universal Verification Methodology environment. The monitoring environment includes class-based components that are used to collect functional coverage from an existing SystemVerilog test bench. The Universal Verification Methodology environment uses the same monitoring system, but a different driving setup to stress the design under test. Coverage and simulation performance values are extracted from all test benches and the data are compared. The results indicate that the Universal Verification Methodology environment incorporating constrained random stimulus is capable of faster simulation run times and better code coverage values. The simulation time measured was up to 26% faster compared to a module-based environment.
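An illustrative sketch (an assumption, not taken from the thesis) of the functional-coverage idea the abstract relies on: coverage is the fraction of user-defined bins hit by observed transactions. The bin definitions and the observed field are invented for the example.

```python
from collections import Counter

# Hypothetical functional-coverage model: bins over an observed packet-length field.
BINS = {"small": range(0, 16), "medium": range(16, 256), "large": range(256, 1025)}

def coverage(observed_lengths):
    """Return the fraction of bins hit at least once by the monitor."""
    hits = Counter()
    for length in observed_lengths:
        for name, bin_range in BINS.items():
            if length in bin_range:
                hits[name] += 1
    return len(hits) / len(BINS)

print(coverage([4, 300, 7]))  # two of three bins hit -> 0.67
```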
APA, Harvard, Vancouver, ISO, and other styles
4

Sandström, Krantz Alexander. "Node hardening verification." Thesis, Linköping University, Department of Electrical Engineering, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11822.

Full text
Abstract:

Secure networks require each node to individually be as secure as possible. Transporting telecommunication data over IP-based networks increases the need for security solutions due to increased exposure to threats. Ericsson currently provides a reference solution for carrying radio traffic over standardized Ethernet using IP, which in the current version relies on third-party equipment. This equipment, and its recommended configuration, needs to be tested to ensure that the reference solution is as secure as possible.

The main purpose of this thesis is to provide Ericsson with a template that describes how security testing of the currently recommended equipment can be carried out.



APA, Harvard, Vancouver, ISO, and other styles
5

Yager, Neil Gordon Computer Science & Engineering, Faculty of Engineering, UNSW. "Hierarchical fingerprint verification." Awarded by: University of New South Wales. Computer Science and Engineering, 2006. http://handle.unsw.edu.au/1959.4/27008.

Full text
Abstract:
Fingerprints have been an invaluable tool for law enforcement and forensics for over a century, motivating research into automated fingerprint based identification in the early 1960's. More recently, fingerprints have found an application in the emerging industry of biometric systems. Biometrics is the automatic identification of an individual based on physiological or behavioral characteristics. Due to its security related applications and the current world political climate, biometrics is presently the subject of intense research by private and academic institutions. Fingerprints are emerging as the most common and trusted biometric for personal identification. However, despite decades of intense research there are still significant challenges for the developers of automated fingerprint verification systems. This thesis includes an examination of all major stages of the fingerprint verification process, with contributions made at each step. The primary focus is upon fingerprint registration, which is the challenging problem of aligning two prints in order to compare their corresponding features for verification. A hierarchical approach is proposed consisting of three stages, each of which employs novel features and techniques for alignment. Experimental results show that the hierarchical approach is robust and outperforms competing state-of-the-art registration methods from the literature. However, despite its power, like most algorithms it has limitations. Therefore, a novel method of information fusion at the registration level has been developed. The technique dynamically selects registration parameters from a set of competing algorithms using a statistical framework. This allows for the relative advantages of different approaches to be exploited. The results show a significant improvement in alignment accuracy for a wide variety of fingerprint databases. Given a robust alignment of two fingerprints, it still remains to be verified whether or not they have originated from the same finger. This is a non-trivial problem, and a close examination of fingerprint features available for this task is conducted with extensive experimental results.
APA, Harvard, Vancouver, ISO, and other styles
6

Wahab, Matthew. "Object code verification." Thesis, University of Warwick, 1998. http://wrap.warwick.ac.uk/61068/.

Full text
Abstract:
Object code is a program of a processor language and can be directly executed on a machine. Program verification constructs a formal proof that a program correctly implements its specification. Verifying object code therefore ensures that the program which is to be executed on a machine is correct. However, the nature of processor languages makes it difficult to specify and reason about object code programs in a formal system of logic. Furthermore, a proof of the correctness of an object code program will often be too large to construct manually because of the size of object code programs. The presence of pointers and computed jumps in object code programs constrains the use of automated tools to simplify object code verification. This thesis develops an abstract language which is expressive enough to describe any sequential object code program. The abstract language supports the definition of program logics in which to specify and verify object code programs. This allows the object code programs of any processor language to be verified in a single system of logic. The abstract language is expressive enough that a single command is enough to describe the behaviour of any processor instruction. An object code program can therefore be translated to the abstract language by replacing each instruction with the equivalent command of the abstract language. This ensures that the use of the abstract language does not increase the difficulty of verifying an object code program. The verification of an object code program can be simplified by constructing an abstraction of the program and showing that the abstraction correctly implements the program specification. Methods for abstracting programs of the abstract language are developed which consider only the text of a program. These methods are based on describing a finite sequence of commands as a single, equivalent, command of the abstract language. This is used to define transformations which abstract a program by replacing groups of program commands with a single command. The abstraction of a program formed in this way can be verified in the same system of logic as the original program. Because the transformations consider only the program text, they are suitable for efficient mechanisation in an automated proof tool. By reducing the number of commands which must be considered, these methods can reduce the manual work needed to verify a program. The use of an abstract language allows object code programs to be specified and verified in a system of logic while the use of abstraction to simplify programs makes verification practical. As examples, object code programs for two different processors are modelled, abstracted and verified in terms of the abstract language. Features of processor languages and of object code programs which affect verification and abstraction are also summarised.
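A small sketch, under the assumption that commands are modelled as state-transforming functions, of the abstraction step the abstract describes: a finite sequence of commands can be replaced by one equivalent composite command. The representation below is illustrative, not the thesis's notation.

```python
from functools import reduce

def inc(reg):
    """A toy 'instruction': increment one register of the machine state."""
    return lambda state: {**state, reg: state[reg] + 1}

def compose(commands):
    """Describe a finite sequence of commands as a single equivalent command."""
    return lambda state: reduce(lambda s, cmd: cmd(s), commands, state)

program = [inc("r0"), inc("r0"), inc("r1")]
single_command = compose(program)
print(single_command({"r0": 0, "r1": 0}))  # {'r0': 2, 'r1': 1}
```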
APA, Harvard, Vancouver, ISO, and other styles
7

Gonzalez, Perez Carlos Alberto. "Pragmatic model verification." Thesis, Nantes, Ecole des Mines, 2014. http://www.theses.fr/2014EMNA0189/document.

Full text
Abstract:
Model-Driven Engineering (MDE) is a popular approach to the development of software which promotes the use of models as first-class citizens in the software development process. In an MDE-based software development process, software is developed by creating models that are successively transformed into other models and eventually into the software source code. When MDE is applied to the development of complex software systems, the complexity of models and model transformations increases, thus risking both the reliability of the software development process and the soundness of the resulting software. Traditionally, ensuring software correctness and the absence of errors has been addressed by means of software verification approaches, based on the utilization of formal analysis techniques, and software testing approaches. In order to ensure the reliability of MDE-based software development processes, these techniques have somehow been adapted to try to ensure the correctness of models and model transformations. The objective of this thesis is to provide new mechanisms that improve the landscape of approaches devoted to the verification of static models, and to analyse how these static model verification approaches can be of assistance when testing model transformations.
APA, Harvard, Vancouver, ISO, and other styles
8

Ingraham, Daniel. "Verification of a Computational Aeroacoustics Code Using External Verification Analysis (EVA)." University of Toledo / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1271271426.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kenger, Patrik. "Module property verification : A method to plan and perform quality verifications in modular architectures." Doctoral thesis, Stockholm, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3965.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Krupp, Alfred Alexander [Verfasser]. "A Verification Plan for Systematic Verification of Mechatronic Systems / Alfred Alexander Krupp." Aachen : Shaker, 2009. http://d-nb.info/1156518482/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Hofer, Simon. "Verification of design patterns." Zurich : ETH, Swiss Federal Institute of Technology Zurich, Department of Computer Science, Chair of Programming Methodology, 2009. http://e-collection.ethbib.ethz.ch/show?type=dipl&nr=435.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Pace, Kyongsuk P. "FNMOC model verification system." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1998. http://handle.dtic.mil/100.2/ADA349774.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Klein, Gerwin. "Verified Java bytecode verification." [S.l. : s.n.], 2003. http://deposit.ddb.de/cgi-bin/dokserv?idn=967128749.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Schmidt, Karsten. "Explicit state space verification." Doctoral thesis, [S.l. : s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=967940745.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Nilsson, Daniel. "System for firmware verification." Thesis, University of Kalmar, School of Communication and Design, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hik:diva-2372.

Full text
Abstract:

Software verification is an important part of software development, and the most practical way to do this today is through dynamic testing. This report explains concepts connected to verification and testing and also presents the testing framework Trassel, developed during the writing of this report. Constructing domain-specific languages and tools by using an existing language as a starting ground can be a good strategy for solving certain problems; this was tried with Trassel, where the description language for writing test cases was written as a DSL using Python as the host language.
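A minimal sketch of the embedded-DSL idea mentioned in the abstract, with a hypothetical test-case API; Trassel's real syntax is not given in the record, so every name below is an assumption.

```python
class TestCase:
    """Hypothetical Python-hosted DSL for describing firmware test cases."""
    def __init__(self, name):
        self.name, self.steps = name, []

    def send(self, command):
        self.steps.append(("send", command)); return self

    def expect(self, response):
        self.steps.append(("expect", response)); return self

    def run(self, device):
        for kind, payload in self.steps:
            if kind == "send":
                device.write(payload)
            elif device.read() != payload:
                raise AssertionError(f"{self.name}: expected {payload!r}")

boot_test = TestCase("boot").send("RESET").expect("OK").send("VERSION").expect("1.0")
```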

APA, Harvard, Vancouver, ISO, and other styles
16

Sundell, Johanna. "Colour proof quality verification." Thesis, Linköping University, Department of Science and Technology, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2358.

Full text
Abstract:

BACKGROUND

When a customer delivers a colour proof to a printer, they expect the final print to look similar to that proof. Today it is impossible to verify whether a match between proof and print is technically achievable at all. This is mainly due to the fact that no information regarding the production circumstances of the proof is provided; for instance, the printer does not know which proofer, RIP or ICC profile was used. Situations where similarity between proof and print cannot be reached and the press has to be stopped are both costly and time consuming, and should therefore be avoided.

PURPOSE

The purpose of this thesis was to investigate the possibility of forming a method able to verify whether a proof is of such good quality that it is likely to produce a print similar to it.

METHOD

The basic assumption was that the quality of a proof could be decided by spectrally measuring known colour patches and comparing those values to reference values representing the same patches printed at optimal press conditions. To decide which and how many patches are required, literature and reports were studied, and then a test printing and a comparison between proofing systems were performed. To be able to analyse the measurement data in an effective way, a tool that analyses the difference between reference and measurement data was developed using MATLAB.

RESULT

The result was a suggestion for a colour proof quality verification method that consists of two parts that are supposed to complement each other. The first one, called Colour proofing system evaluation, is supposed to evaluate entire proofing systems. It consists of a test page containing colour patches, grey balance fields, gradations and photographs. The second part is called Colour proof control and consists of a smaller set of colour patches that is supposed to be attached to each proof.

CONCLUSIONS

The method is not complete, since more research regarding the difference between measurement results and visual impression is needed. To be able to obtain realistic tolerance levels for differences between measurement and reference data, the method must be tested in everyday production. If this is done, the method is thought to provide a good way of controlling the quality of colour proofs.
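A minimal sketch of the patch-comparison step described under METHOD, assuming the spectral measurements have already been reduced to one row of values per patch; the tolerance and array layout are assumptions (the thesis's original analysis tool was written in MATLAB).

```python
import numpy as np

def patch_differences(measured, reference):
    """Per-patch Euclidean difference between proof measurements and
    reference values; both arrays hold one row of values per patch."""
    return np.linalg.norm(measured - reference, axis=1)

def proof_passes(measured, reference, tolerance=3.0):
    """Accept the proof only if every patch stays within the chosen tolerance."""
    return bool(np.all(patch_differences(measured, reference) <= tolerance))
```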

APA, Harvard, Vancouver, ISO, and other styles
17

Nosratighods, Mohaddeseh Electrical Engineering & Telecommunications, Faculty of Engineering, UNSW. "Robust speaker verification system." Publisher: University of New South Wales. Electrical Engineering & Telecommunications, 2008. http://handle.unsw.edu.au/1959.4/42796.

Full text
Abstract:
Identity verification or biometric recognition systems play an important role in our daily lives. Applications include Automatic Teller Machines (ATM), banking and share information retrieval, and personal verification for credit cards. Among the biometric techniques, authentication of speakers by their voice is of great importance, since it employs a non-invasive approach and is the only available modality in many applications. However, the performance of Automatic Speaker Verification (ASV) systems degrades significantly under adverse conditions which cause recordings from the same speaker to be different. The objective of this research is to investigate and develop robust techniques for performing automatic speaker recognition over various channel conditions, such as telephony and recorded microphone speech. This research is shown to improve the robustness of ASV systems in three main areas: feature extraction, speaker modelling and score normalization. At the feature level, a new set of dynamic features, termed Delta Cepstral Energy (DCE), is proposed instead of traditional delta cepstra, which not only greatly reduces the dimensionality of the feature vector compared with delta and delta-delta cepstra, but is also shown to provide the same performance for matched testing and training conditions on TIMIT and a subset of the NIST 2002 dataset. The concept of speaker entropy, which conveys the information contained in a speaker's speech based on the extracted features, facilitates comparative evaluation of the proposed methods. In addition, Frequency Modulation features are combined in a complementary manner with the Mel Frequency Cepstral Coefficients (MFCCs) to improve the performance of the ASV system under channel variability of various types. The proposed fused system shows a relative reduction of up to 23% in Equal Error Rate (EER) over the MFCC-based system when evaluated on the NIST 2008 dataset. Currently, the main challenge in speaker modelling is channel variability across different sessions. A recent approach to channel compensation, based on Support Vector Machines (SVM), is Nuisance Attribute Projection (NAP). The proposed multi-component approach to NAP attempts to compensate for the main sources of inter-session variation through an additional optimization criterion, to allow more accurate estimates of the most dominant channel artefacts and to improve the system performance under mismatched training and test conditions. Another major issue in speaker recognition is that the variability of score distributions due to incompletely modelled regions of the feature space can produce segments of the test speech that are poorly matched to the claimed speaker model. A segment selection technique in score normalization is proposed that relies only on discriminative and reliable segments of the test utterance to verify the speaker. This approach is particularly useful in noisy conditions where using speech activity detection is not reliable at the feature level. Another source of score variability comes from the fact that not all phonemes are equally discriminative. To address this, a new score re-weighting technique is applied to likelihood values based on the discriminative level of each Gaussian component, i.e. each particular region of the feature space. It is found that a limited number of Gaussian mixtures, herein termed discriminative components, are responsible for the overall performance, and that inclusion of the other non-discriminative components may only degrade the system performance.
APA, Harvard, Vancouver, ISO, and other styles
18

Larkins, Robert L. "Off-line signature verification." The University of Waikato, 2009. http://hdl.handle.net/10289/2803.

Full text
Abstract:
In today’s society signatures are the most accepted form of identity verification. However, they have the unfortunate side-effect of being easily abused by those who would feign the identification or intent of an individual. This thesis implements and tests current approaches to off-line signature verification with the goal of determining the most beneficial techniques that are available. This investigation will also introduce novel techniques that are shown to significantly boost the achieved classification accuracy for both person-dependent (one-class training) and person-independent (two-class training) signature verification learning strategies. The findings presented in this thesis show that many common techniques do not always give any significant advantage and in some cases they actually detract from the classification accuracy. Using the techniques that are proven to be most beneficial, an effective approach to signature verification is constructed, which achieves approximately 90% and 91% on the standard CEDAR and GPDS signature datasets respectively. These results are significantly better than the majority of results that have been previously published. Additionally, this approach is shown to remain relatively stable when a minimal number of training signatures are used, representing feasibility for real-world situations.
APA, Harvard, Vancouver, ISO, and other styles
19

Millin, Anthony. "Verification of stereotactic radiotherapy." Thesis, Cardiff University, 2011. http://orca.cf.ac.uk/12287/.

Full text
Abstract:
Investigations have been made into the use of a computer-based simulation technique (Monte Carlo (MC)) for ionising radiation transport in order to verify the doses delivered during linear accelerator based stereotactic radiotherapy and radiosurgery. Due to the complex nature of the micro multi-leaf collimators (μMLC) used in these treatments, a bespoke model of the μMLC was developed and combined with standard component modules to represent the remainder of the linear accelerator. Following validation of the above models, investigations were made into the dosimetry of small fields, defined by the μMLC and measured with a variety of detectors. Comparisons of relative output, profiles and depth doses were made against MC simulations, and a series of correction factors determined, to account for detector geometry and the non-water equivalence of materials used in semiconductor detectors. An assessment was then made to determine the smallest fields that can be measured with each detector with confidence. Systems were then developed to independently simulate stereotactic treatments and compare the simulated doses with those calculated by the treatment planning system (TPS); excellent agreement between TPS calculations and MC simulations was observed. The application of MC methods to determine the most appropriate treatment tactics and calculation algorithms for stereotactic body radiotherapy in the lung was then investigated, with recommendations made on the most appropriate calculation algorithms and beam arrangements for the technique. The doses calculated using the type-b or collapsed cone algorithm agreed most closely with the MC simulation. There was little difference observed between plans using more than four beams in the treatment delivery. Treatment techniques using only three beams or fewer achieved poorer coverage of the tumour with dose, producing lower doses at the periphery of the tumour near the interface with the surrounding lung tissue, compared to using a greater number of beams. Finally, methods of transit dosimetry using Electronic Portal Imaging Devices were investigated for use in cranial stereotactic radiotherapy. Three methods were investigated, based on a full MC simulation of the radiation transport through the patient and on to the imager, prediction of the dose based on a TPS calculation, and an approximation of the radiological path length of the central axis of the beams to derive an expected dose at the imager plane. The MC method produced the best agreement at the expense of a longer time to acquire the comparison doses compared to the TPS calculation method. The equivalent path length method showed good agreement (within 3.5%) between delivered and predicted doses, but at a single point.
APA, Harvard, Vancouver, ISO, and other styles
20

Cunningham, P. A. "Verification of asynchronous circuits." Thesis, University of Cambridge, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.598222.

Full text
Abstract:
The purpose of this thesis is to introduce proposition-oriented behaviours and apply them to the verification of asynchronous circuits. The major contribution of proposition-oriented behaviours is their ability to extend existing formal notations to permit the explicit use of both signal levels and transitions. This thesis begins with the formalisation of proposition-oriented behaviours in the context of gate networks, and with the set-theoretic extension of both regular-expressions and trace-expressions to reason over proposition-oriented behaviours. A new trace-expression construct, referred to as biased composition, is also introduced. Algorithmic realisation of these set-theoretic extensions is documented using a special form of finite automata called proposition automata. A verification procedure for conformance of gate networks to a set of proposition automata is described in which each proposition automaton may be viewed either as a constraint or a specification. The implementation of this procedure as an automated verification program called Veraci is summarised, and a number of example Veraci programs are used to demonstrate contributions of proposition-oriented behaviour to asynchronous circuit design. These contributions include level-event unification, event abstraction, and relative timing assumptions using biased composition. The performance of Veraci is also compared to an existing event-oriented verification program called Versify, the result of this comparison being a consistent performance gain using Veraci over Versify. This thesis concludes with the design and implementation of a 2048 bit dual-rail asynchronous Montgomery exponentiator, MOD_EXP, in a 0.18μm standard-cell process. The application of Veraci to the design of MOD_EXP is summarised, and the practical benefits of proposition-oriented verification are discussed.
APA, Harvard, Vancouver, ISO, and other styles
21

Sigurnjak, S. K. "Biometric verification using gait." Thesis, Manchester Metropolitan University, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.592681.

Full text
Abstract:
The work presented within this document details the development of a novel gait verification system suitable for a variety of applications such as human motion studies, medical analysis and security situations. The human gait is a spatio-temporal process involving the coordination and interaction between the nervous, skeletal and muscular systems. Due to inherent variations in limb lengths, muscle strengths and body mass, gait is inherently individual. To develop a suitable feature extraction process a virtual gait laboratory was developed. The virtual laboratory contains virtual character templates articulated with a 32-bone skeleton system using motion capture data. Data was extracted from the character as a series of X, Y and Z translations for processing. The virtual laboratory allows the testing of data extraction processes without the need for direct testing on human subjects. Feature extraction was performed using Principal Component Analysis (PCA). PCA allows data to be compressed and described as a series of principal component (PC) scores containing the weightings of the data. Feature extraction was performed on human subjects with the motions applied to a skeletal system containing individual physical dimensions. A second set of features was created by applying the motions to a single skeletal system. This removed the interpersonal variations from the dataset to explore the difference in classification when these variables have been removed. Overall, generic motions are present within the first PC score. Higher PC scores contain unique motion characteristics suitable for classification of the subjects within a database. To verify a subject within the database Linear Discriminant Analysis (LDA) was performed. LDA projects data as a linear combination of features using a training data set of known outcomes. A subsequent sample can then be projected into the linear space for classification and verification of the subject within the database.
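A minimal sketch of the PCA-then-LDA pipeline the abstract describes, assuming each gait sample has been flattened into a fixed-length vector of X, Y and Z translations; the array shapes and the scikit-learn usage are assumptions, not the thesis's implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# X: one row per gait sample (flattened joint translations); y: subject labels.
X = np.random.rand(60, 300)
y = np.repeat(np.arange(6), 10)

pca = PCA(n_components=20).fit(X)      # compress motions into PC scores
scores = pca.transform(X)              # higher PCs carry subject-specific traits
lda = LinearDiscriminantAnalysis().fit(scores, y)

new_sample = np.random.rand(1, 300)
claimed_subject = 3
verified = lda.predict(pca.transform(new_sample))[0] == claimed_subject
```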
APA, Harvard, Vancouver, ISO, and other styles
22

Latham, J. T. "Abstraction in program verification." Thesis, University of Manchester, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.372026.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Buttle, Darren Lee. "Verification of compiled code." Thesis, University of York, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.341121.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Kim, Jaehyun 1970. "NC verification using octree." Thesis, Massachusetts Institute of Technology, 1998. http://hdl.handle.net/1721.1/9880.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Romano, Raquel Andrea. "Real-time face verification." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/36649.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1995.
Includes bibliographical references (p. 57-59).
by Raquel Andrea Romano.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
26

Kuncak, Viktor (Viktor Jaroslav) 1977. "Modular data structure verification." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/38533.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2007.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Includes bibliographical references (p. 149-166).
This dissertation describes an approach for automatically verifying data structures, focusing on techniques for automatically proving formulas that arise in such verification. I have implemented this approach with my colleagues in a verification system called Jahob. Jahob verifies properties of Java programs with dynamically allocated data structures. Developers write Jahob specifications in classical higher-order logic (HOL); Jahob reduces the verification problem to deciding the validity of HOL formulas. I present a new method for proving HOL formulas by combining automated reasoning techniques. My method consists of 1) splitting formulas into individual HOL conjuncts, 2) soundly approximating each HOL conjunct with a formula in a more tractable fragment and 3) proving the resulting approximation using a decision procedure or a theorem prover. I present three concrete logics; for each logic I show how to use it to approximate HOL formulas, and how to decide the validity of formulas in this logic. First, I present an approximation of HOL based on a translation to first-order logic, which enables the use of existing resolution-based theorem provers. Second, I present an approximation of HOL based on field constraint analysis, a new technique that enables decision procedures for special classes of graphs (such as monadic second-order logic over trees) to be applied to arbitrary graphs.
Third, I present an approximation using Boolean Algebra with Presburger Arithmetic (BAPA), a logic that combines reasoning about sets of elements with reasoning about cardinalities of sets. BAPA can express relationships between sizes of data structures and invariants that correlate data structure size with integer variables. I present the first implementation of a BAPA decision procedure, and establish the exact complexity bounds for BAPA and quantifier-free BAPA. Together, these techniques enabled Jahob to modularly and automatically verify data structure implementations based on singly and doubly-linked lists, trees with parent pointers, priority queues, and hash tables. In particular, Jahob was able to prove that data structure implementations satisfy their specifications, maintain key data structure invariants expressed in a rich logical notation, and never produce run-time errors such as null dereferences or out of bounds accesses.
by Viktor Kuncak.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
27

Bouldin, Shannon. "Verification of Use Cases." Digital Commons at Loyola Marymount University and Loyola Law School, 2011. https://digitalcommons.lmu.edu/etd/384.

Full text
Abstract:
40% of all project failures are the result of inadequate requirement analysis; the end results do not fit the customer's desires. To combat this problem, the Department of Defense has instituted a new acquisition process necessitating that the requirements aspect of the program be examined earlier. It is more imperative than ever that programs deliver the right system. Verification of Use Cases (VUC) is a solid way to increase the odds that the customer's wants are met. VUC has four main components: a tracking system, requirements attached to their respective use cases, test cases built from use cases, and the subsystem requirements tested against the system requirements. The tracking system is a way to ensure the requirements, use cases and test cases are all linked. Regardless of the document, or phase of the program, an alphanumeric system is used as the unique ID, linking the requirements, use cases, and test cases together. The use case does not usually include the requirement; however, with VUC it does. This way, regardless of the phase of the project or the personnel on the project, the use case can never be lost, as it is in the same document as the requirements. It is this document that is flowed down to the test engineers to use to build the test cases. Using the scenarios of the use cases, the test cases are built and stored in a matrix. Each scenario will have at least one, if not more, test cases associated with it. This results in thorough test cases for each use case and linked requirement, increasing the likelihood that no part of the requirement is overlooked. The last phase of VUC takes the subsystem requirements and tests them back against their parent system test case, which was built from its respective use cases. This ensures that each subsystem meets the customer's expectations, getting the system closer to fulfilling its expected role while staying on budget and on schedule.
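A minimal sketch of the alphanumeric tracking idea described above; the ID scheme and record fields are illustrative assumptions, not the thesis's actual format.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    uid: str                      # e.g. "REQ-001", reused across all documents
    text: str
    use_cases: list = field(default_factory=list)
    test_cases: list = field(default_factory=list)

req = Requirement("REQ-001", "The system shall lock after three failed logins")
req.use_cases.append("UC-001-A: operator enters a wrong password three times")
req.test_cases.append("TC-001-A-1: verify lockout on the third failure")

# The shared numeric part of the ID lets any use case or test case be traced
# back to its requirement, regardless of which document it lives in.
assert all("-001" in item for item in req.use_cases + req.test_cases)
```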
APA, Harvard, Vancouver, ISO, and other styles
28

Koskinen, Eric John. "Thermal verification of programs." Thesis, University of Cambridge, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.607698.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Coetzer, Johannes. "Off-line signature verification." Thesis, Stellenbosch : University of Stellenbosch, 2005. http://hdl.handle.net/10019.1/1355.

Full text
Abstract:
Thesis (PhD (Mathematical Sciences))--University of Stellenbosch, 2005.
A great deal of work has been done in the area of off-line signature verification over the past two decades. Off-line systems are of interest in scenarios where only hard copies of signatures are available, especially where a large number of documents need to be authenticated. This dissertation is inspired by, amongst other things, the potential financial benefits that the automatic clearing of cheques will have for the banking industry.
APA, Harvard, Vancouver, ISO, and other styles
30

Tashan, T. "Biologically inspired speaker verification." Thesis, Nottingham Trent University, 2012. http://irep.ntu.ac.uk/id/eprint/89/.

Full text
Abstract:
Speaker verification is an active research problem that has been addressed using a variety of different classification techniques. However, in general, methods inspired by the human auditory system tend to show better verification performance than other methods. In this thesis, three biologically inspired speaker verification algorithms are presented. The first is a vowel-dependent speaker verification method that uses a modified Self Organising Map (SOM) algorithm. For each speaker, a seeded SOM is trained to produce representative Discrete Fourier Transform (DFT) models of three vowels from a spoken input using positive samples only. This SOM training is performed both during a registration phase and during each subsequent verification attempt. Speaker verification is achieved by computing the Euclidean distance between the registration and verification SOM trained weight sets. An analysis of the comparative system performance when using DFT input vectors, as well as Linear Prediction Code (LPC) spectrum and Mel Frequency Cepstrum Coefficients (MFCC) alternative input features, indicates that the DFT spectrum outperforms both MFCC and LPC features. The algorithm was evaluated using 50 speakers from the Centre for Spoken Language Understanding (CSLU2002) speaker verification database. The second method consists of two neural network stages. The first stage is the modified SOM, which now operates as a vowel clustering stage that filters the input speech data and separates it into three sets of vowel information. The second stage then contains three Multi Layer Perceptron (MLP) networks, each acting as a distinct vowel verifier. Adding this second stage allows the use of negative sample training. The input of each MLP network is the respective filtered output vowel data from the first stage. The DFT spectrum is again used as the input feature vector due to its optimal performance in the first algorithm. The overall system was evaluated using the same dataset as used in the first algorithm, showing improved verification performance when compared to the algorithm without the MLP stage. The third biologically plausible method is a speaker verification algorithm that uses a positive-sample-only trained self organising map composed of spiking neurons. The architecture of the system is inspired by the biomechanical mechanism of the human auditory system, which converts speech into electrical spikes inside the cochlea. A spike-based rank order coding input feature vector is proposed that is designed to be representative of the real biological spike trains found within the human auditory nerve. The Spiking Self Organising Map (SSOM) updates its winner neuron only when its activity exceeds a specified threshold. The algorithm is evaluated using the same 50 speaker dataset from the CSLU2002 speaker verification database, and the results indicate that the SSOM verification performance is comparable to the non-spike-based SOM. Finally, a new speech detection technique to detect speech activity within speech signals is also proposed. This novel technique uses the linear correlation coefficient (Pearson coefficient). The correlation is calculated in the frequency domain between neighbouring frames of DFT spectrum feature vectors. By summing the correlation coefficients within a sliding window over time, a correlation envelope is produced, which can be used to identify speech activity.
The proposed technique is compared with a conventional energy frame analysis method and shows greater robustness against changes in speech volume level. A comparison of the two techniques, in terms of speaker verification application performance, is presented in Appendix A using 240 speech waveforms from the CSLU2002 speaker verification database.
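A minimal sketch of the correlation-envelope speech detector described at the end of the abstract, assuming the signal has already been split into DFT magnitude frames; the window length and threshold are assumptions.

```python
import numpy as np

def correlation_envelope(frames, window=5):
    """frames: 2-D array with one DFT magnitude spectrum per row.
    Correlate each frame with its neighbour, then sum over a sliding window."""
    corr = np.array([np.corrcoef(frames[i], frames[i + 1])[0, 1]
                     for i in range(len(frames) - 1)])
    return np.convolve(corr, np.ones(window), mode="same")

def speech_frames(frames, threshold=3.0):
    """Mark frames whose local correlation envelope exceeds the threshold."""
    return correlation_envelope(frames) > threshold
```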
APA, Harvard, Vancouver, ISO, and other styles
31

Von Essen, Christian. "Quantitative Verification and Synthesis." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENM090/document.

Full text
Abstract:
This thesis contributes to the theoretical study and application of quantitative verification and synthesis. We first study strategies that optimize the ratio of two rewards in MDPs. The goal is the synthesis of efficient controllers in probabilistic environments. We prove that deterministic and memoryless strategies are sufficient. Based on these results we suggest three algorithms to treat explicitly encoded models. Our evaluation of these algorithms shows that one of them is clearly faster than the others. To extend its scope, we propose and implement a symbolic variant based on binary decision diagrams, and show that it copes with millions of states. Second, we study the problem of program repair from a quantitative perspective. This leads to a reformulation of program repair with the requirement that only faulty runs of the program be changed. We study the limitations of this approach and show how we can relax the new requirement. We devise and implement an algorithm to automatically find repairs, and show that it improves the changes made to programs. Third, we study a novel approach to a quantitative verification and synthesis framework, in which verification and synthesis work in tandem to analyze the quality of a controller with respect to, e.g., robustness against modeling errors. We also include the possibility of approximating the Pareto curve that emerges from combining the model with multiple rewards. This allows us both to study the trade-offs inherent in the system and to choose a configuration to our liking. We apply our framework to several case studies. The major case study is concerned with the currently proposed next-generation airborne collision avoidance system (ACAS X). We use our framework to help analyze the design space of the system and to validate the controller as currently under investigation by the FAA. In particular, we contribute analysis via PCTL and stochastic model checking to add to the confidence in the controller.
APA, Harvard, Vancouver, ISO, and other styles
32

Renault, Sophie. "Verification des programmes normaux." Orléans, 1996. http://www.theses.fr/1996ORLE2018.

Full text
Abstract:
Program verification aims to guarantee that programs conform to their specification. This activity takes various forms depending on the programming language used, the nature of the specification, and what is meant by conformance. This thesis is devoted to the study of formal proof methods for logic programs with negation. Our approach is declarative, that is, the semantics of programs is given in purely logical terms. We mainly study two methods. The first aims to prove properties of the well-founded model. The proposed technique consists in representing these properties as annotations, which facilitates the organisation of proofs and increases their modularity. The second method uses an unfolding-based procedure. It makes it possible to prove properties that hold in all models of the completion. In practice, these methods have strengths and limitations of different kinds. The possibility of combining them, the benefits that result, and the support of automated tools point to a promising path towards carrying out proofs on a larger scale. This work contributes to promoting the value of logic programming for software engineering.
APA, Harvard, Vancouver, ISO, and other styles
33

Kattenbelt, Mark Alex. "Automated quantitative software verification." Thesis, University of Oxford, 2010. http://ora.ox.ac.uk/objects/uuid:62430df4-7fdf-4c4f-b3cd-97ba8912c9f5.

Full text
Abstract:
Many software systems exhibit probabilistic behaviour, either added explicitly, to improve performance or to break symmetry, or implicitly, through interaction with unreliable networks or faulty hardware. When employed in safety-critical applications, it is important to rigorously analyse the behaviour of these systems. This can be done with a formal verification technique called model checking, which establishes properties of systems by algorithmically considering all execution scenarios. In the presence of probabilistic behaviour, we consider quantitative properties such as "the worst-case probability that the airbag fails to deploy within 10ms", instead of qualitative properties such as "the airbag eventually deploys". Although many model checking techniques exist to verify qualitative properties of software, quantitative model checking techniques typically focus on manually derived models of systems and cannot directly verify software. In this thesis, we present two quantitative model checking techniques for probabilistic software. The first is a quantitative adaptation of a successful model checking technique called counter-example guided abstraction refinement which uses stochastic two-player games as abstractions of probabilistic software. We show how to achieve abstraction and refinement in a probabilistic setting and investigate theoretical extensions of stochastic two-player game abstractions. Our second technique instruments probabilistic software in such a way that existing, non-probabilistic software verification methods can be used to compute bounds on quantitative properties of the original, uninstrumented software. Our techniques are the first to target real, compilable software in a probabilistic setting. We present an experimental evaluation of both approaches on a large range of case studies and evaluate several extensions and heuristics. We demonstrate that, with our methods, we can successfully compute quantitative properties of real network clients comprising approximately 1,000 lines of complex ANSI-C code — the verification of such software is far beyond the capabilities of existing quantitative model checking techniques.
APA, Harvard, Vancouver, ISO, and other styles
34

Salem, Mohamed A. "Functional verification coverage closure." Thesis, University of Bristol, 2015. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.685927.

Full text
Abstract:
Verification is a critical phase of the development cycle. It confirms the compliance of a design implementation with its functional specification. Coverage measures the progress of the verification plan. Structural coverage determines the code exercised by the functional tests. Modified Condition Decision Coverage (MC/DC) is a structural coverage type. This thesis is based on a comprehensive study of MC/DC conventions. It provides a new MC/DC test generation algorithm, presents associated MC/DC empirical work from which it draws novel insights into MC/DC utilization as a coverage metric, and investigates the design fault detection strength of MC/DC. The research results have had significant impact on industry. The MC/DC study in hardware verification is motivated by the MC/DC certification requirements for critical software applications, the MC/DC foundation on hardware principles like controllability and observability, and the linear growth of the MC/DC test set. A new MC/DC test generation algorithm named OBSRV is developed, implemented, and optimized based on the D-algorithm. It is distinguished from conventional techniques as it is mainly based on logic analysis. The thesis provides the empirical work, and associated results, that represent an exhaustive validation of OBSRV. It has identified novel MC/DC insights represented by the minimal MC/DC requirements optimization, the MC/DC compositionality aspects, and the design options for MC/DC fulfillment. The research has had direct impact on industrial MC/DC applications: a major EDA MC/DC product has been completely re-architected, and the verification of an industrial safety-critical embedded processor has been guided towards MC/DC fulfillment. The thesis demonstrates the feasibility of MC/DC as an applicable solution for structural and functional coverage by an evaluation that proves the MC/DC detection strength for the main design faults in microprocessors. The results motivate future research leading to MC/DC adoption as a main metric for functional verification coverage closure in the hardware and software domains.
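A small illustrative example (not from the thesis) of what MC/DC asks for on the decision (A and B) or C: each condition must be shown to affect the outcome independently, i.e. flipping only that condition flips the decision.

```python
from itertools import product

def decision(a, b, c):
    return (a and b) or c

def independence_pairs(cond_index):
    """Test-vector pairs in which only one condition changes and the
    decision outcome flips -- the pairs MC/DC requires for that condition."""
    pairs = []
    for vec in product([False, True], repeat=3):
        flipped = list(vec)
        flipped[cond_index] = not flipped[cond_index]
        if decision(*vec) != decision(*flipped):
            pairs.append((vec, tuple(flipped)))
    return pairs

print(independence_pairs(2)[0])  # ((False, False, False), (False, False, True))
```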
APA, Harvard, Vancouver, ISO, and other styles
35

Poskitt, Christopher M. "Verification of graph programs." Thesis, University of York, 2013. http://etheses.whiterose.ac.uk/4700/.

Full text
Abstract:
This thesis is concerned with verifying the correctness of programs written in GP 2 (for Graph Programs), an experimental, nondeterministic graph manipulation language, in which program states are graphs, and computational steps are applications of graph transformation rules. GP 2 allows for visual programming at a high level of abstraction, with the programmer freed from manipulating low-level data structures and instead solving graph-based problems in a direct, declarative, and rule-based way. To verify that a graph program meets some specification, however, has been -- prior to the work described in this thesis -- an ad hoc task, detracting from the appeal of using GP 2 to reason about graph algorithms, high-level system specifications, pointer structures, and the many other practical problems in software engineering and programming languages that can be modelled as graph problems. This thesis describes some contributions towards the challenge of verifying graph programs, in particular, Hoare logics with which correctness specifications can be proven in a syntax-directed and compositional manner. We contribute calculi of proof rules for GP 2 that allow for rigorous reasoning about both partial correctness and termination of graph programs. These are given in an extensional style, i.e. independent of fixed assertion languages. This approach allows for the re-use of proof rules with different assertion languages for graphs, and moreover, allows for properties of the calculi to be inherited: soundness, completeness for termination, and relative completeness (for sufficiently expressive assertion languages). We propose E-conditions as a graphical, intuitive assertion language for expressing properties of graphs -- both about their structure and labelling -- generalising the nested conditions of Habel, Pennemann, and Rensink. We instantiate our calculi with this language, explore the relationship between the decidability of the model checking problem and the existence of effective constructions for the extensional assertions, and fix a subclass of graph programs for which we have both. The calculi are then demonstrated by verifying a number of data- and structure-manipulating programs. We explore the relationship between E-conditions and classical logic, defining translations between the former and a many-sorted predicate logic over graphs; the logic being a potential front end to an implementation of our work in a proof assistant. Finally, we speculate on several avenues of interesting future work; in particular, a possible extension of E-conditions with transitive closure, for proving specifications involving properties about arbitrary-length paths.
APA, Harvard, Vancouver, ISO, and other styles
36

Yan, Weiwei. "Software-hardware Cooperative Embedded Verification System Fusing Fingerprint Verification and Shared-key Authentication." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-66677.

Full text
Abstract:
In order to protect the security of commercial, personal, military, and governmental information on the Internet, a claimed identity must be authenticated. There are currently three main security authentication methods: first, using a user PIN, such as a password; second, using a physical key, such as a USBKey; third, using biometric authentication technology, such as fingerprints, iris, voice, or palm prints. Because of its uniqueness, invariance, and ubiquity, biometric authentication is becoming popular, especially fingerprint recognition. However, when fingerprint recognition information is transported over a public channel it may be attacked, for example by having the fingerprint information stolen, so a cryptographic mechanism is needed to protect it. In the field of embedded security authentication systems, traditional hardware implementations, such as ASICs, can satisfy functional and performance requirements, but they are not configurable, flexible, or easy to extend; traditional software implementations, such as general-purpose processors, are flexible, but their cost and power consumption are higher than those of hardware implementations. In order to combine the advantages of biometrics, cryptography, hardware implementation, and software implementation, a hardware-software cooperative embedded authentication system based on shared-key authentication and fingerprint verification is proposed. First, the system authenticates the identities of client and server by shared-key authentication and creates the current encryption key and hash key; it then authenticates identity via fingerprint recognition. During fingerprint recognition, the fingerprint information does not need to be transmitted over the public channel, so the security of the fingerprint is increased. Theoretical analysis and experiments show that this system reaches a very high authentication rate and level of security, and can effectively resist replay attacks, server template attacks, and device template attacks.
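As an illustrative aside (not part of the cited thesis): the sketch below shows the general shape of such a scheme in Python, with a shared-key challenge-response that also derives per-session keys, followed by local fingerprint matching so the template never crosses the public channel. All function names, parameters and the toy matcher are hypothetical simplifications.

import hmac, hashlib, os

SHARED_KEY = os.urandom(32)          # stands in for a key provisioned to client and server

def respond(shared_key, challenge):
    # Prove knowledge of the shared key without revealing it.
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def derive_session_keys(shared_key, challenge):
    # Derive per-session encryption and MAC keys from the shared secret.
    material = hmac.new(shared_key, b"session" + challenge, hashlib.sha256).digest()
    return material[:16], material[16:]       # (encrypt_key, hash_key)

challenge = os.urandom(16)                    # server side
response = respond(SHARED_KEY, challenge)     # client side
assert hmac.compare_digest(response, respond(SHARED_KEY, challenge))   # server verifies
encrypt_key, hash_key = derive_session_keys(SHARED_KEY, challenge)

def fingerprint_match(stored_template, live_sample, threshold=0.8):
    # Placeholder local matcher: a real system would compare minutiae features.
    score = sum(a == b for a, b in zip(stored_template, live_sample)) / len(stored_template)
    return score >= threshold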
APA, Harvard, Vancouver, ISO, and other styles
37

Hossain, Mousam. "Formal Verification Methodology for Asynchronous Sleep Convention Logic Circuits Based on Equivalence Verification." Thesis, North Dakota State University, 2019. https://hdl.handle.net/10365/31574.

Full text
Abstract:
Sleep Convention Logic (SCL) is an emerging ultra-low-power Quasi-Delay-Insensitive (QDI) asynchronous design paradigm with enormous potential for industrial applications. Design validation is a critical concern before commercialization. Unlike other QDI paradigms, such as NULL Convention Logic (NCL) and Pre-Charge Half Buffers (PCHB), SCL has no existing formal verification method. In this thesis, a unified formal verification scheme for combinational as well as sequential SCL circuits is proposed based on equivalence checking, which verifies both safety and liveness. The method is demonstrated using several multipliers, MACs, and ISCAS benchmarks.
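As an illustrative aside (not part of the cited thesis): at its simplest, equivalence checking compares an implementation against a reference function over all inputs. The Python sketch below does this exhaustively for a 1-bit full adder; real SCL verification must additionally handle the dual-rail, handshaking nature of the circuits, and the example names are invented.

from itertools import product

def spec(a, b, cin):
    # Specification: 1-bit full adder.
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def impl(a, b, cin):
    # Implementation written gate by gate, as a netlist would be.
    t1 = a ^ b
    s = t1 ^ cin
    t2 = a & b
    t3 = t1 & cin
    cout = t2 | t3
    return s, cout

def equivalent(f, g, n_inputs):
    return all(f(*v) == g(*v) for v in product([0, 1], repeat=n_inputs))

print("equivalent:", equivalent(spec, impl, 3))   # True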
APA, Harvard, Vancouver, ISO, and other styles
38

RENBERG, ADAM. "Test-inspired runtime verification : Using a unit test-like specification syntax for runtime verification." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-153674.

Full text
Abstract:
Computer software is growing ever more complex, and more sophisticated tools are required to make sure the software operates correctly, i.e. according to its specification. There are two general approaches to assuring the correctness of software. While formal methods can give useful mathematical proofs about the correctness of programs, they suffer from complexity and are difficult to use; often they work only on a constructed system model, not the actual program. Testing, on the other hand, has a simple syntax and great tool support, and it is in widespread use. It is however informal and incomplete, only testing the specific test cases that the test writers can come up with. A relatively new approach called runtime verification is an attempt at a lightweight alternative. It verifies a program's actual execution against its specification, possibly while the program is running. This work investigates how testing, and specifically unit testing, can be combined with runtime verification. It shows how the syntax of unit tests, written in the target program's programming language, can be used to inspire the syntax of specifications for runtime verification. Both informal and formal specifications are described and supported. A proof-of-concept framework for Python called pythonrv is implemented and described, and it is tested on a real-life application. A formal foundation is constructed for specifications written in a subset of Python, enabling formal verification. Informal specifications are also supported, with the full power of Python as the specification language. The results show that the proof-of-concept framework allows for effective use of runtime verification. It is easy to integrate into existing programs, and the informal specification syntax is relatively simple. The results also show that formal specifications can be written in Python, but in a more unwieldy syntax and structure than the informal one; many interesting properties can be verified with them that ordinary tests would have trouble with. The recommendation for future work is to improve the specification syntax, using unit testing concepts such as expectations, and to make the formal specification syntax more like that of its informal sibling.
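As an illustrative aside (not part of the cited thesis, and not pythonrv's actual API): the sketch below shows the flavour of a unit test-like runtime verification property, written in plain Python and checked while the program runs. The decorator, property and example function are invented for illustration.

import functools

_history = []

def monitor(func):
    # Record every call so properties can be checked over the actual execution.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        _history.append((func.__name__, args, result))
        check_properties()
        return result
    return wrapper

def check_properties():
    # Property written like a unit test assertion: withdraw never drives the
    # returned balance negative.
    for name, args, result in _history:
        if name == "withdraw":
            assert result >= 0, f"balance went negative: {result}"

@monitor
def withdraw(balance, amount):
    return balance - amount

withdraw(100, 30)      # passes
# withdraw(10, 30)     # would raise AssertionError at runtime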
APA, Harvard, Vancouver, ISO, and other styles
39

Finfando, Filip. "Indoor scene verification : Evaluation of indoor scene representations for the purpose of location verification." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-288856.

Full text
Abstract:
When the human visual system looks at two pictures taken in some indoor location, it is fairly easy to tell whether they were taken in exactly the same place, even when the location has never been visited in reality. This is possible because we can pay attention to multiple factors such as spatial properties (window shape, room shape), common patterns (floor, walls) or the presence of specific objects (furniture, lighting). Changes in camera pose, illumination, furniture location or digital alteration of the image (e.g. watermarks) have little influence on this ability. Traditional approaches to measuring the perceptual similarity of images have struggled to reproduce this skill. This thesis defines the Indoor Scene Verification (ISV) problem as distinguishing whether or not two indoor scene images were taken in the same indoor space. It explores the capabilities of state-of-the-art perceptual similarity metrics by introducing two new datasets designed specifically for this problem. Perceptual hashing, ORB, FaceNet and NetVLAD are evaluated as the baseline candidates. The results show that NetVLAD provides the best results on both datasets and is therefore chosen as the baseline for the experiments aiming to improve it. Three such experiments are carried out, testing the impact of using a different training dataset, changing the deep neural network architecture and introducing a new loss function. Quantitative analysis of the AUC score shows that switching from VGG16 to MobileNetV2 allows for improvement over the baseline.
With the human visual ability it is fairly easy to judge whether two pictures taken in the same indoor space really were taken in exactly the same place, even if one has never been there. This is possible thanks to many factors, such as spatial properties (window shapes, room shapes), common patterns (floor, walls) or the presence of particular objects (furniture, lighting). Changes in camera placement, lighting, furniture placement or digital alteration of the image (e.g. watermarks) affect this ability only minimally. Traditional methods of measuring the perceptual similarity of images have had difficulty reproducing this skill. This thesis defines indoor image verification, Indoor Scene Verification (ISV), as the task of determining whether or not two indoor images were taken in the same space. The study examines the leading perceptual similarity functions by introducing two new datasets designed specifically for this purpose. Perceptual hashing, ORB, FaceNet and NetVLAD were identified as potential baselines. The results show that NetVLAD delivers the best results on both datasets, whereupon it was chosen as the baseline for the experiments aiming to improve it. Three experiments examine the impact of using different training datasets, changing the structure of the neural network and introducing a new loss function. Quantitative analysis of the AUC value shows that a switch from VGG16 to MobileNetV2 allows improvements compared with the baseline.
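As an illustrative aside (not part of the cited thesis): scene verification of this kind typically reduces to scoring descriptor similarity for image pairs and measuring the separation of same-room versus different-room pairs with AUC. The Python sketch below assumes precomputed global descriptors (e.g. NetVLAD vectors) and uses synthetic data purely for illustration.

import numpy as np
from sklearn.metrics import roc_auc_score

def pair_scores(embeddings_a, embeddings_b):
    # Similarity = negative Euclidean distance between global descriptors;
    # higher means "more likely the same indoor space".
    return -np.linalg.norm(embeddings_a - embeddings_b, axis=1)

rng = np.random.default_rng(0)
same = rng.normal(0.0, 0.1, size=(50, 128))                     # pairs from the same room
emb_a = np.vstack([same, rng.normal(0, 1, (50, 128))])
emb_b = np.vstack([same + rng.normal(0, 0.05, (50, 128)), rng.normal(0, 1, (50, 128))])
labels = np.array([1] * 50 + [0] * 50)                          # 1 = same indoor space

auc = roc_auc_score(labels, pair_scores(emb_a, emb_b))
print(f"AUC = {auc:.3f}")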
APA, Harvard, Vancouver, ISO, and other styles
40

Rastelli, Davide. "Structural verification of the ESEO spacecraft: from launch loads analysis to composite panels verification." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/7618/.

Full text
Abstract:
The present thesis work was performed in the frame of the ESEO (European Student Earth Orbiter) project. The activities described in this document were carried out in the Microsatellites and Space Microsystems Lab led by Professor Paolo Tortora and in ALMASpace company facilities. The thesis deals with ESEO structural analysis, at system and unit level, and verification: after determining the design limit loads to be applied to the spacecraft as an envelope of different launchers' load profiles, a finite element structural analysis was performed on the model of the satellite in order to ensure its capability to withstand the loads encountered during launch; all the analyses were performed according to ESA standards and using the software MSC NASTRAN SIMXPERT. Amplification factors were derived and used to determine the loads to be considered at unit level. In particular, structural analyses were carried out on the GPS unit, the payload developed for ESEO by students of the University of Bologna, and the results were used in the preparation of the GPS payload design definition file. As for the verification phase, a study of the panels and inserts to be used in the spacecraft was performed: different designs were created exploiting methods to optimize weight and mechanical behavior. The configurations were analyzed and the results compared to select the final design.
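As an illustrative aside (not part of the cited thesis): the Python sketch below shows the basic idea of enveloping launcher load profiles into design limit loads and scaling them with an amplification factor for unit-level analysis. The launchers, accelerations and factor are invented numbers, not those used for ESEO.

launcher_loads_g = {                  # quasi-static accelerations (axial, lateral) in g
    "LauncherA": (5.5, 0.9),
    "LauncherB": (4.3, 1.8),
    "LauncherC": (6.0, 2.0),
}

def envelope(loads):
    # Design limit load = worst case over all candidate launch vehicles, per axis.
    axial = max(a for a, _ in loads.values())
    lateral = max(l for _, l in loads.values())
    return axial, lateral

design_axial, design_lateral = envelope(launcher_loads_g)
amplification_factor = 1.4            # stands in for a factor derived from system-level FEA
unit_axial = design_axial * amplification_factor
print(f"system limit loads: {design_axial} g axial, {design_lateral} g lateral")
print(f"unit-level design load: {unit_axial:.1f} g axial")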
APA, Harvard, Vancouver, ISO, and other styles
41

Lang, Matthew. "Maximality: modular verification and implementability." Columbus, Ohio : Ohio State University, 2009. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1243962353.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Tschannen, Julian. "Automatic verification of Eiffel programs." Zurich : ETH, Eidgenössische Technische Hochschule Zürich, Department of Computer Science, Chair of Sotware Engineering, 2009. http://e-collection.ethbib.ethz.ch/show?type=dipl&nr=459.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Wagner, Rolf. "Secure verification of computational power." Zürich : ETH, Eidgenössische Technische Hochschule Zürich, Institute of Information Security, 2009. http://e-collection.ethbib.ethz.ch/show?type=dipl&nr=450.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Wang, Xuan. "Verification of digital controller implementations." Diss., Brigham Young University, 2005. http://contentdm.lib.byu.edu/ETD/image/etd1073.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Stenzel, Kurt. "Verification of Java card programs." [S.l.] : [s.n.], 2005. http://deposit.ddb.de/cgi-bin/dokserv?idn=975584510.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Dunbäck, Otto, and Simon Gidlöf. "Verification of hybrid operation points." Thesis, Department of Electrical Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-19932.

Full text
Abstract:

This thesis is an approach to improving a two-mode hybrid electric vehicle, currently under development by GM, with respect to fuel consumption. The study is not restricted to the specific two-mode HEV but also presents results regarding parallel as well as serial HEVs. GM wishes to verify whether the online-based controller in the prototype vehicle makes the most of the HEV's ability and whether there is more potential to lower the fuel consumption. The purpose is that the results and conclusions from this work are to be implemented in the controller to further improve the vehicle's performance. To analyze the behavior of the two-mode HEV and to see where improvements can be made, models of its driveline and components are developed with a focus on losses and efficiency. The models are implemented in MATLAB together with an optimization algorithm based on Dynamic Programming. The models are validated against data retrieved from the prototype vehicle, and various cases with different inputs are set up and optimized over the NEDC cycle. Compensation for cold starts and NOx emissions is also implemented in the final model. Deliberate simplifications are made regarding the modeling of the power split's functionality due to the limited amount of time available for this thesis. The optimizations show that there is potential to lower the fuel consumption for the two-mode HEV. The results are further analyzed, and the behavior of the engine, motors/generators, and battery is compared with recorded data from a prototype vehicle and summarized into a list of suggestions to improve fuel economy.
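As an illustrative aside (not part of the cited thesis): the Python sketch below shows the skeleton of a Dynamic Programming optimization over a drive cycle, with battery state of charge as the state and the engine/battery power split as the decision. The fuel map, grids and cycle are invented toy values, not the thesis's driveline models.

import numpy as np

demand_kw = [10, 25, 15, 30, 5]              # power demand per 10 s time step
soc_grid = np.linspace(0.4, 0.8, 41)         # discretised battery state of charge
battery_kwh, dt_h = 1.5, 10 / 3600           # capacity and step length in hours

def fuel_per_step(engine_kw):                # invented fuel map, kg per step
    return 0.0 if engine_kw <= 0 else (0.05 + 0.02 * engine_kw) * dt_h

cost_to_go = np.zeros(len(soc_grid))         # terminal cost: none
for p_demand in reversed(demand_kw):         # backward pass over the drive cycle
    new_cost = np.full(len(soc_grid), np.inf)
    for i, soc in enumerate(soc_grid):
        for batt_kw in np.linspace(-10, 10, 21):     # >0 discharge, <0 charge
            engine_kw = p_demand - batt_kw
            if engine_kw < 0:
                continue
            next_soc = soc - batt_kw * dt_h / battery_kwh
            if not (soc_grid[0] <= next_soc <= soc_grid[-1]):
                continue
            j = int(np.argmin(np.abs(soc_grid - next_soc)))
            new_cost[i] = min(new_cost[i], fuel_per_step(engine_kw) + cost_to_go[j])
    cost_to_go = new_cost

start = int(np.argmin(np.abs(soc_grid - 0.6)))
print(f"minimal fuel over the cycle from SOC 0.6: {cost_to_go[start]:.4f} kg")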

APA, Harvard, Vancouver, ISO, and other styles
47

Mroczkowski, Piotr. "Identity Verification using Keyboard Statistics." Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2265.

Full text
Abstract:

In the age of a networking revolution, when the Internet has changed not only the way we see computing, but also the whole society, we constantly face new challenges in the area of user verification. It is often the case that the login-id password pair does not provide a sufficient level of security. Other, more sophisticated techniques are used: one-time passwords, smart cards or biometric identity verification. The biometric approach is considered to be one of the most secure ways of authentication.

On the other hand, many biometric methods require additional hardware in order to sample the corresponding biometric feature, which increases the costs and the complexity of implementation. There is however one biometric technique which does not demand any additional hardware – user identification based on keyboard statistics. This thesis is focused on this way of authentication.

The keyboard statistics approach is based on the user’s unique typing rhythm. Not only what the user types, but also how she/he types is important. This report describes the statistical analysis of typing samples which were collected from 20 volunteers, as well as the implementation and testing of the identity verification system, which uses the characteristics examined in the experimental stage.
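As an illustrative aside (not part of the cited thesis): keystroke-based verification usually compares timing features such as dwell times (how long a key is held down) and flight times (the gap between consecutive keys) against an enrolled profile. The Python sketch below is a minimal, hypothetical version of that idea; the features, threshold and sample data are invented.

import statistics

def features(events):
    # events: list of (key, press_time_ms, release_time_ms) for one typed phrase.
    dwell = [r - p for _, p, r in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell + flight

def enroll(samples):
    # Build a profile of per-feature means and standard deviations.
    columns = list(zip(*[features(s) for s in samples]))
    return [(statistics.mean(c), statistics.stdev(c) or 1.0) for c in columns]

def verify(profile, attempt, threshold=2.0):
    # Accept if the average z-score distance to the profile is small.
    f = features(attempt)
    z = [abs(x - m) / s for x, (m, s) in zip(f, profile)]
    return sum(z) / len(z) < threshold

enrolled = [[("p", 0, 95), ("a", 180, 260), ("s", 340, 430)],
            [("p", 0, 100), ("a", 170, 255), ("s", 350, 445)],
            [("p", 0, 90), ("a", 190, 270), ("s", 330, 425)]]
profile = enroll(enrolled)
print(verify(profile, [("p", 0, 97), ("a", 175, 258), ("s", 345, 432)]))   # True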

APA, Harvard, Vancouver, ISO, and other styles
48

Tristan, Jean-Baptiste. "Formal verification of translation validators." Phd thesis, Université Paris-Diderot - Paris VII, 2009. http://tel.archives-ouvertes.fr/tel-00437582.

Full text
Abstract:
Like any software, compilers, and especially optimizing compilers, can be defective. It is therefore possible that they change the semantics of the compiled program, and consequently its properties. In the context of the development of critical software, where formal methods are used to ensure that a program satisfies certain properties before it is compiled, this poses a fundamental problem. One solution to this problem is to verify the compiler by ensuring that it preserves the semantics of the programs it compiles. In this thesis, we evaluate a new method for developing safe compilation passes: the formal verification of translation validators. On the one hand, this method uses formal verification with a proof assistant in order to offer the strongest possible safety guarantees about the compiler. On the other hand, it relies on translation validation, where each run of the compiler is validated a posteriori, a more pragmatic verification method that has made it possible to verify advanced optimizations. We show that this new approach to the compiler verification problem is viable, and even advantageous in certain cases, through four examples of realistic and aggressive optimizations: list scheduling, trace scheduling, lazy code motion, and finally software pipelining.
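As an illustrative aside (not part of the cited thesis): translation validation checks each compiler run a posteriori by comparing the source and the generated code. The toy Python sketch below compares two straight-line programs on a set of input stores; a real validator, such as those verified in the thesis, would establish equivalence symbolically rather than by sampling, and all names here are invented.

import itertools

source    = [("t", "x", "+", "x"), ("y", "t", "+", "1")]   # y = x + x + 1
optimized = [("y", "x", "*", "2"), ("y", "y", "+", "1")]   # strength-reduced version

def run(program, store):
    # Interpret three-address instructions (dst, lhs, op, rhs) over a store.
    env = dict(store)
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    for dst, lhs, op, rhs in program:
        val = lambda v: env[v] if v in env else int(v)
        env[dst] = ops[op](val(lhs), val(rhs))
    return env

def validate(src, opt, outputs, inputs):
    # Accept this compiler run only if both programs agree on the outputs
    # for every sampled assignment to the inputs.
    return all(
        all(run(src, dict(zip(inputs, vals)))[o] == run(opt, dict(zip(inputs, vals)))[o]
            for o in outputs)
        for vals in itertools.product(range(-3, 4), repeat=len(inputs)))

print("validated:", validate(source, optimized, outputs=["y"], inputs=["x"]))   # True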
APA, Harvard, Vancouver, ISO, and other styles
49

Trinh, Cong Quy. "Formal Verification of Skiplist Algorithms." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-160314.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Wickerson, John Peter. "Concurrent verification for sequential programs." Thesis, University of Cambridge, 2013. https://www.repository.cam.ac.uk/handle/1810/265613.

Full text
Abstract:
This dissertation makes two contributions to the field of software verification. The first explains how verification techniques originally developed for concurrency can be usefully applied to sequential programs. The second describes how sequential programs can be verified using diagrams that have a parallel nature. The first contribution involves a new treatment of stability in verification methods based on rely-guarantee. When an assertion made in one thread of a concurrent system cannot be invalidated by the actions of other threads, that assertion is said to be 'stable'. Stability is normally enforced through side-conditions on rely-guarantee proof rules. This dissertation proposes instead to encode stability information into the syntactic form of the assertion. This approach, which we call explicit stabilisation, brings several benefits. First, we empower rely-guarantee with the ability to reason about library code for the first time. Second, when the rely-guarantee method is redeployed in a sequential setting, explicit stabilisation allows more details of a module's implementation to be hidden when verifying clients. Third, explicit stabilisation brings a more nuanced understanding of the important issue of stability in concurrent and sequential verification; such an understanding grows ever more important as verification techniques grow ever more complex. The second contribution is a new method of presenting program proofs conducted in separation logic. Building on work by Jules Bean, the ribbon proof is a diagrammatic alternative to the standard 'proof outline'. By emphasising the structure of a proof, ribbon proofs are intelligible and hence useful pedagogically. Because they contain less redundancy than proof outlines, and allow each proof step to be checked locally, they are highly scalable; this we illustrate with a ribbon proof of the Version 7 Unix memory manager. Where proof outlines are cumbersome to modify, ribbon proofs can be visually manoeuvred to yield proofs of variant programs. We describe the ribbon proof system, prove its soundness and completeness, and outline a prototype tool for mechanically checking the diagrams it produces.
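As an illustrative aside (not part of the cited thesis): on a finite toy state space, stability of an assertion under a rely relation, and one flavour of stabilisation (closing the assertion under environment steps), can be sketched in a few lines of Python; the states and relation below are invented.

def stable(assertion, rely):
    # assertion: predicate on states; rely: set of (s, s') environment steps.
    return all(assertion(t) for (s, t) in rely if assertion(s))

def stabilise(assertion_states, rely):
    # Close a set of states under environment steps: the strongest stable
    # assertion weaker than the original one.
    result = set(assertion_states)
    changed = True
    while changed:
        changed = False
        for (s, t) in rely:
            if s in result and t not in result:
                result.add(t)
                changed = True
    return result

rely = {(1, 2), (2, 3)}                      # the environment may step 1->2 and 2->3
p = {0, 1}                                   # assertion: the state is 0 or 1
print(stable(lambda s: s in p, rely))        # False: the environment can break it
print(sorted(stabilise(p, rely)))            # [0, 1, 2, 3]: the stabilised assertion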
APA, Harvard, Vancouver, ISO, and other styles