Doctoral dissertations on the topic "Système de vérification de locuteur"
Create a correct reference in APA, MLA, Chicago, Harvard and many other citation styles
Consult the 50 best doctoral dissertations for your research on the topic "Système de vérification de locuteur".
An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a ".pdf" file and read its abstract online, whenever such details are available in the metadata.
Browse doctoral dissertations from many different fields and build a fitting bibliography.
Larcher, Anthony. "Modèles acoustiques à structure temporelle renforcée pour la vérification du locuteur embarquée". Phd thesis, Université d'Avignon, 2009. http://tel.archives-ouvertes.fr/tel-00453645.
Ottens, Kévin. "Un système multi-agent adaptatif pour la construction d'ontologies à partir de textes". Phd thesis, Université Paul Sabatier - Toulouse III, 2007. http://tel.archives-ouvertes.fr/tel-00176883.
Because the ontology must be maintained, and because it can be seen as a complex system made up of concepts, we propose to use adaptive multi-agent systems to semi-automate the process of building ontologies from text. The stable state of these systems results from the cooperative interactions between the software agents that constitute them. In our case, the agents use distributed statistical-analysis algorithms to find the most satisfactory structure according to a syntactic and distributional analysis of the texts. The user can then validate, criticize or modify parts of this agent structure, which is the basis of the emerging ontology, to make it conform to his objectives and to his vision of the modelled domain. In return, the agents reorganize themselves to satisfy the newly introduced constraints. Ontologies, usually fixed, here become dynamic; their design becomes "living". These are the principles underlying our system, named Dynamo.
The relevance of this approach was put to the test through experiments aimed at evaluating the algorithmic complexity of our system, and through its use under real conditions. In this dissertation, we present and analyse the results obtained.
Mariéthoz, Johnny. "Algorithmes d'apprentissage discriminants en vérification du locuteur". Lyon 2, 2006. http://theses.univ-lyon2.fr/documents/lyon2/2006/mariethoz_j.
In this thesis the problem of text-independent speaker verification is approached from the machine-learning point of view. The theories developed in statistical learning make it possible to better define this problem, to develop new unbiased performance measures and to propose new statistical tests in order to compare the proposed models objectively. A new interpretation of the state-of-the-art models based on Gaussian mixture models (GMMs) shows that these models are in fact discriminant and equivalent to a mixture of linear experts. A general theoretical framework for score normalization is also proposed, for both probabilistic and non-probabilistic models. Thanks to this new framework, the assumptions made when using T- and Z-normalization (T- and Z-norm) are made explicit. Various discriminant models are proposed. We present a new kernel for support vector machines (SVMs) that makes it possible to handle sequences. This kernel is in fact the generalization of an existing kernel whose drawback is to be limited to a polynomial form. The proposed approach allows the data to be mapped into an infinite-dimensional space, as is the case, for example, with a Gaussian kernel. A variant of this kernel, which looks for the best acoustic vector (frame) in the sequence to be compared, improves on the currently known results. As this approach is particularly costly for long sequences, a clustering algorithm is used to reduce its complexity. Finally, this thesis also addresses problems specific to speaker verification, such as the fact that the numbers of positive and negative examples are highly unbalanced and that the distribution of intra- and inter-class distances is specific to this type of problem.
Thus, the kernel is modified by adding Gaussian noise to each negative example. Even though this approach currently lacks theoretical justification, it produces very good empirical results and opens interesting perspectives for future research.
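The T- and Z-score normalisations discussed in the abstract above follow a simple pattern that can be sketched as follows. This is an illustrative reconstruction (function names are ours), not the thesis's code:

```python
from statistics import mean, pstdev

def z_norm(raw_score, impostor_scores):
    """Z-norm: centre and scale a trial score using impostor scores obtained
    by scoring the target speaker's model against a cohort of impostor
    utterances, so the statistics are model-dependent."""
    return (raw_score - mean(impostor_scores)) / pstdev(impostor_scores)

def t_norm(raw_score, cohort_scores):
    """T-norm: same formula, but the statistics come from scoring the test
    utterance against a cohort of impostor models, so they are
    utterance-dependent."""
    return (raw_score - mean(cohort_scores)) / pstdev(cohort_scores)
```

Both normalisations implicitly assume the impostor score distribution is roughly Gaussian; making such hidden assumptions explicit is precisely one contribution of the thesis.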
Sanchez-Soto, Eduardo. "Réseaux Bayésiens Dynamiques pour la Vérification du Locuteur". Phd thesis, Télécom ParisTech, 2005. http://tel.archives-ouvertes.fr/tel-00011440.
Pierrot, Jean-Benoît. "Elaboration et validation d'approches en vérification du locuteur". Paris, ENST, 1998. http://www.theses.fr/1998ENST0029.
Sánchez-Soto, Eduardo. "Réseaux bayésiens dynamiques pour la vérification du locuteur". Paris, ENST, 2005. http://www.theses.fr/2005ENST0032.
This thesis is concerned with the statistical modeling of the speech signal applied to Speaker Verification (SV) using Bayesian Networks (BNs). The main idea of this work is to use BNs as a mathematical tool to model pertinent speech features while keeping their relations. It combines theoretical and experimental work. The difference between system and human performance in SV lies in the quantity of information and the relationships between the sources of information used to make decisions. A single statistical framework that keeps the conditional dependence and independence relations between those variables is difficult to attain. Therefore, the use of BNs as a tool for modeling the available information and its dependence and independence relationships is proposed. The first part of this work reviews the main modules of an SV system, the possible sources of information, as well as the basic concepts of graphical models. The second part deals with modeling. A new approach to the problems associated with SV systems is proposed. The problems of inference and learning (parameters and structure) in BNs are presented. In order to obtain an adapted structure, the relations of conditional independence among the variables are learned directly from the data. These relations are then used to build an adapted BN. In particular, a new model-adaptation technique for BNs has been proposed. This adaptation is based on a measure between conditional probability distributions for discrete variables, and on a regression matrix for continuous variables, used to model the relationships. On a large database for the SV task, the results have confirmed the potential of the BN approach.
Mtibaa, Aymen. "Towards robust and privacy-preserving speaker verification systems". Electronic Thesis or Diss., Institut polytechnique de Paris, 2022. http://www.theses.fr/2022IPPAS002.
Speaker verification systems are a key technology in many devices and services like smartphones, intelligent digital assistants, healthcare, and banking applications. Additionally, with the COVID pandemic, access control systems based on fingerprint scanners or keypads increase the risk of virus propagation. Therefore, companies are now rethinking their employee access control systems and considering touchless authorization technologies, such as speaker verification systems. However, a speaker verification system requires users to transmit their recordings, features, or models derived from their voice samples, without any obfuscation, over untrusted public networks, which store and process them on a cloud-based infrastructure. If the system is compromised, an adversary can use this biometric information to impersonate the genuine user and extract personal information. The voice samples may contain information about the user's gender, accent, ethnicity, and health status, which raises several privacy issues. In this context, the present PhD thesis addresses the privacy and security issues of speaker verification systems based on Gaussian mixture models (GMM), i-vectors, and x-vectors as speaker models. The objective is the development of speaker verification systems that perform biometric verification while preserving the privacy and security of the user. To that end, we proposed biometric protection schemes for speaker verification systems to achieve the privacy requirements (revocability, unlinkability, irreversibility) described in the standard ISO/IEC 24745 on biometric information protection, and to improve the robustness of the systems against different attack scenarios.
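One common family of biometric template protection schemes (not necessarily the one used in this thesis) is the key-seeded random projection: the voice embedding is transformed with a user-specific key before storage, and comparison happens in the protected domain. A toy sketch, with invented names:

```python
import hashlib
import random

def protect_template(embedding, user_key, out_dim=4):
    """Toy cancelable-template sketch: project a voice embedding (e.g. an
    i-vector or x-vector) with a random matrix seeded by a user-specific
    key. Revocation simply means issuing a new key, which yields an
    unrelated template (revocability and unlinkability). Illustrative
    only; real ISO/IEC 24745-compliant schemes need stronger
    irreversibility guarantees than a bare random projection."""
    seed = hashlib.sha256(user_key.encode("utf-8")).digest()
    rng = random.Random(seed)
    # key-dependent Gaussian projection matrix, out_dim x len(embedding)
    matrix = [[rng.gauss(0.0, 1.0) for _ in embedding] for _ in range(out_dim)]
    return [sum(m * e for m, e in zip(row, embedding)) for row in matrix]
```

The same embedding with the same key always yields the same protected template, while a fresh key produces an uncorrelated one, so a leaked template can be revoked without re-enrolling the voice itself.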
Dugas, Charles. "Les techniques de normalisation appliquées à la vérification du locuteur". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape17/PQDD_0006/MQ38676.pdf.
Scheffer, Nicolas. "Structuration de l'espace acoustique par le modèle générique pour la vérification du locuteur". Avignon, 2006. http://www.theses.fr/2006AVIG0146.
Louradour, Jérôme. "Noyaux de séquences pour la vérification du locuteur par machines à vecteurs de support". Toulouse 3, 2007. http://www.theses.fr/2007TOU30004.
This thesis is focused on the application of Support Vector Machines (SVM) to Automatic Text-Independent Speaker Verification. This speech processing task consists in determining whether a speech utterance was pronounced or not by a target speaker, without any constraint on the speech content. In order to apply a kernel method such as SVM to this binary classification of variable-length sequences, an appropriate approach is to use kernels that can handle sequences, rather than acoustic vectors within sequences. As explained in the thesis report, both theoretical and practical reasons justify the effort of searching for such kernels. The present study concentrates on exploring several aspects of kernels for sequences, and on applying them to a very large database speaker verification problem under realistic recording conditions. After reviewing emergent methods to conceive sequence kernels and presenting them in a unified framework, we propose a new family of such kernels: the Feature Space Normalized Sequence (FSNS) kernels. These kernels are a generalization of the GLDS kernel, which is now well known for its efficiency in speaker verification. A theoretical and algorithmic study of FSNS kernels is carried out. In particular, several forms are introduced and justified, and a sparse greedy matrix approximation method is used to suggest an efficient and suitable implementation of FSNS kernels for speaker verification.
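The GLDS kernel that the FSNS family generalizes can be sketched in a few lines: every frame is passed through an explicit polynomial feature map, the maps are averaged over the sequence, and two sequences are compared by a dot product of their fixed-size averages. The background-dependent normalisation of the real GLDS kernel is omitted, and the function names are ours:

```python
from itertools import combinations_with_replacement

def poly_expand(frame, degree=2):
    """Explicit feature map of a polynomial kernel: all monomials of the
    frame components up to the given degree, plus a constant term."""
    feats = [1.0]
    for d in range(1, degree + 1):
        for combo in combinations_with_replacement(range(len(frame)), d):
            prod = 1.0
            for i in combo:
                prod *= frame[i]
            feats.append(prod)
    return feats

def glds_kernel(seq_a, seq_b, degree=2):
    """GLDS-style sequence kernel sketch: average the per-frame maps over
    each sequence, then take the dot product of the two averages. This
    turns variable-length sequences into points of one common space."""
    def mean_map(seq):
        maps = [poly_expand(f, degree) for f in seq]
        return [sum(col) / len(maps) for col in zip(*maps)]
    return sum(x * y for x, y in zip(mean_map(seq_a), mean_map(seq_b)))
```

Because the polynomial map is explicit, this kernel is limited to a polynomial form, which is exactly the drawback the FSNS kernels of the thesis are designed to lift.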
Kharroubi, Jamal. "Etude de techniques de classement "Machines à vecteurs supports" pour la vérification automatique du locuteur". Phd thesis, Télécom ParisTech, 2002. http://pastel.archives-ouvertes.fr/pastel-00001124.
Blouet, Raphaël. "Approche probabiliste par arbres de décision pour la vérification automatique du locuteur sur architectures embarquées". Rennes 1, 2002. http://www.theses.fr/2002REN10151.
Pełny tekst źródłaSeidner, Charlotte. "Vérification des EFFBDs : model checking en ingénierie système". Nantes, 2009. http://www.theses.fr/2009NANT2140.
Systems Engineering (SE) is an interdisciplinary and methodological approach for the design and operation of complex systems. Safety Engineering is a major SE process, yet the use of formal methods such as model checking, however powerful they may be, is hampered by their intrinsic complexity. Our research work, supported by an industrial partnership between the IRCCyN lab and Sodius, aimed at designing a tool which is directly usable during the SE design phase and which formally verifies functional models. To that end, high-level models and behavioral properties are transformed into low-level equivalents on which formal verifications are performed; analysis results are then expressed on the high-level models. To be specific, we considered EFFBDs as input models; this modeling language is indeed widely used in SE and adapted to model-checking constraints. We formally established their syntax and semantics; we were then able to describe a translation into time Petri nets (TPNs) which we proved to preserve the model's temporal behavior. Simultaneously, we described a quantitative temporal logic adapted to EFFBDs and its translation into a corresponding logic on TPNs; we then established the computational complexity of its model checking. These successive theoretical results led us to develop a simulation and verification software tool that can analyze both functional and dysfunctional architectures (i.e. modeling function failures); this tool is deployed and operated in an industrial context.
Soldi, Giovanni. "Diarisation du locuteur en temps réel pour les objets intelligents". Electronic Thesis or Diss., Paris, ENST, 2016. http://www.theses.fr/2016ENST0061.
On-line speaker diarization aims to detect "who is speaking now" in a given audio stream. The majority of proposed on-line speaker diarization systems has focused on less challenging domains, such as broadcast news and plenary speeches, characterised by long speaker turns and low spontaneity. The first contribution of this thesis is the development of a completely unsupervised adaptive on-line diarization system for challenging and highly spontaneous meeting data. Due to the obtained high diarization error rates, a semi-supervised approach to on-line diarization, whereby speaker models are seeded with a modest amount of manually labelled data and adapted by an efficient incremental maximum a-posteriori adaptation (MAP) procedure, is proposed. Obtained error rates may be low enough to support practical applications. The second part of the thesis addresses instead the problem of phone normalisation when dealing with short-duration speaker modelling. First, Phone Adaptive Training (PAT), a recently proposed technique, is assessed and optimised at the speaker modelling level and in the context of automatic speaker verification (ASV), and then is further developed towards a completely unsupervised system using automatically generated acoustic class transcriptions, whose number is controlled by regression tree analysis. PAT delivers significant improvements in the performance of a state-of-the-art iVector ASV system even when accurate phonetic transcriptions are not available.
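The incremental MAP adaptation of seeded speaker models can be illustrated with the classic relevance-factor update for component means. This is a generic, one-dimensional sketch of MAP mean adaptation, not the exact procedure of the thesis:

```python
def map_adapt_means(prior_means, frame_sums, frame_counts, relevance=16.0):
    """Relevance-factor MAP update of per-component means:
    alpha = n / (n + r);  new_mean = alpha * data_mean + (1 - alpha) * prior.
    Incremental (on-line) use simply keeps accumulating frame_sums and
    frame_counts as new labelled speech arrives, then re-applies the rule."""
    adapted = []
    for mu, s, n in zip(prior_means, frame_sums, frame_counts):
        if n == 0:
            adapted.append(mu)  # no assigned frames: keep the prior mean
        else:
            alpha = n / (n + relevance)
            adapted.append(alpha * (s / n) + (1.0 - alpha) * mu)
    return adapted
```

With few frames the adapted mean stays close to the prior and drifts towards the data mean as evidence accumulates, which is what makes the procedure safe to run incrementally on a live audio stream.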
Fux, Thibaut. "Vers un système indiquant la distance d'un locuteur par transformation de sa voix". Thesis, Grenoble, 2012. http://www.theses.fr/2012GRENT120/document.
This thesis focuses on speaker voice transformation with the aim of indicating the speaker's distance: a spoken-to-whispered voice transformation to indicate a close distance and a spoken-to-shouted voice transformation for a rather far distance. We first perform in-depth analyses to determine the most relevant features in whispered voices and especially in shouted voices (much harder). The main contribution of this part is to show the relevance of prosodic parameters in the perception of vocal effort in a shouted voice. Then, we propose some descriptors to better characterize the prosodic contours. For the actual transformation, we propose several new transformation rules which importantly control the quality of the transformed voice. The results showed a very good quality of transformed whispered voices and transformed shouted voices for relatively simple linguistic structures (CVC, CVCV, etc.)
Vanackère, Vincent. "Trust : un système de vérification automatique de protocoles cryptographiques". Aix-Marseille 1, 2004. http://www.theses.fr/2004AIX11063.
Chane-Yack-Fa, Raphaël. "Vérification formelle de systèmes d'information". Thèse, Université de Sherbrooke, 2018. http://hdl.handle.net/11143/11630.
Charbuillet, Christophe. "Algorithmes évolutionnistes appliqués à l'extraction de caractéristiques pour la reconnaissance du locuteur". Paris 6, 2008. http://www.theses.fr/2008PA066564.
Benlahouar, Azzouz. "Nouvelles techniques de segmentation pour caractériser le timbre vocal d'un locuteur en vue de la vérification automatique de l'identité". Mémoire, École de technologie supérieure, 2003. http://espace.etsmtl.ca/778/1/BENLAHOUAR_Azzouz.pdf.
Fernandez, Jean-Claude. "ALDEBARAN : un système de vérification par réduction de processus communicants". Phd thesis, Grenoble 1, 1988. http://tel.archives-ouvertes.fr/tel-00326157.
Liu, Yinling. "Conception et vérification du système d'Information pour la maintenance aéronautique". Thesis, Lyon, 2019. http://www.theses.fr/2019LYSEI133.
Operational support is one of the most important aspects of aeronautical maintenance. It aims to provide a portfolio of services to implement maintenance with a high level of efficiency, reliability and accessibility. One of the major difficulties in operational support is that there is no platform that integrates all aircraft maintenance processes in order to reduce costs and improve the level of service. It is therefore necessary to build an autonomous aircraft maintenance system in which all maintenance information can be collected, organized, analyzed and managed in a way that facilitates decision-making. To do this, an innovative methodology has been proposed, concerning the modelling, simulation, formal verification and performance analysis of the autonomous system mentioned. Three axes were addressed in this thesis. The first axis concerns the design and simulation of an autonomous system for aeronautical maintenance. We offer an innovative design of an autonomous system that supports automatic decision making for maintenance planning. The second axis is the verification of models on simulation systems. We propose a more comprehensive approach to verifying global behaviours and operational behaviours of systems. The third axis focuses on the analysis of the performance of simulation systems. We propose an approach combining an agent-based simulation system with the "Fuzzy Rough Nearest Neighbor" method, in order to implement efficient classification and prediction of aircraft maintenance failures with missing data. Finally, simulation models and systems have been proposed. Simulation experiments illustrate the feasibility of the proposed approach.
Mpe, A. Guilikeng Albert. "Un système de prédiction/vérification pour la localisation d'objets tridimentionnels". Compiègne, 1990. http://www.theses.fr/1990COMPD286.
Pełny tekst źródłaFrançois, Dominique. "Détection et identification des occlusives et fricatives au sein du système indépendant du locuteur APHODEX". Nancy 1, 1995. http://www.theses.fr/1995NAN10044.
Pełny tekst źródłaGondelman, Léon. "Un système de types pragmatique pour la vérification déductive des programmes". Thesis, Université Paris-Saclay (ComUE), 2016. http://www.theses.fr/2016SACLS583/document.
This thesis is conducted in the framework of deductive software verification. It aims to formalize some concepts that are implemented in the verification tool Why3. The main idea is to explore the solutions that a type-system-based approach can bring to deductive verification. First, we focus our attention on the notion of ghost code, a technique used in most modern verification tools, which consists in giving some parts of the specification the appearance of operational code. Using ghost code correctly requires various precautions, since the ghost code must never interfere with the operational code. The first chapter presents a type system with effects illustrating how ghost code can be used in a way which is both correct and expressive. The second chapter addresses some questions related to the verification of programs with pointers in the presence of aliasing, i.e. when several pointers handled by a program denote the same memory cell. Rather than moving towards approaches that address the problem in all its complexity at the cost of abandoning the framework of Hoare logic, we present a type system with effects and singleton regions which resolves aliasing issues by performing a static control of aliases even before the proof obligations are generated. Although our system is limited to pointers whose identity must be known statically, we observe that it fits most of the code we want to verify. Finally, we focus our attention on a situation where there exists an abstraction barrier between the user's code and that of the libraries it depends on. That means that libraries provide the user a set of functions and data structures, without revealing the details of their implementation. When programs are developed in such a modular way, verification must be modular itself.
It means that the verification of the user's code must take into account only the function contracts supplied by the libraries, while the verification of the libraries must ensure that their implementations correctly refine the exposed entities. The third chapter extends the system presented in the previous chapter with these concepts of modularity and data refinement.
Vergamini, Didier. "Vérification de réseaux d'automates finis par équivalences observationnelles : le système AUTO". Nice, 1987. http://www.theses.fr/1987NICE4142.
Robbana, Riadh. "Spécification et vérification de systèmes hybrides". Phd thesis, Université Joseph Fourier (Grenoble), 1995. http://tel.archives-ouvertes.fr/tel-00346070.
Boucherit, Aziza. "L'implication du locuteur dans le discours et le renouvellement du système aspectif dans le dialecte arabe d'Alger". Paris 3, 1994. http://www.theses.fr/1994PA030140.
The study sets up a synchronic description of the dialectal Arabic verbal system spoken in Algiers. Emphasis is laid on one of the processes by which a verbal system, functioning basically as an aspectual one, is likely to renew itself by the insertion, within the system, of an inflected pre-verbal particle (ra- + suffixal pronoun) aiming at expressing the notion of concomitance. Besides the introduction (chap. I), defining the aims of the research, the different types of material analyzed and the conditions of the field investigation, the study includes four sections. The first section presents the main outlines of the history of the dialect examined and its position from the point of view of dialectal typology (chap. II), and the essential linguistic characteristics of this dialect: a phonological study and remarks on morphology and syntax with regard to the functioning of the verbal syntagm (chap. III). The second section (chap. IV) presents, from the same point of view, the main theoretical notions on which the usages (IV.1) and the situation are based in the Maghrebi Arabic idioms (IV.2)
Charguéraud, Arthur. "Vérification de programmes à l'aide de formules caractéristiques". Paris 7, 2010. http://www.theses.fr/2010PA077214.
This dissertation describes a new approach to program verification, based on characteristic formulae. The characteristic formula of a program is a higher-order logic formula that describes the behavior of that program, in the sense that it is sound and complete with respect to the semantics. This formula can be exploited in an interactive theorem prover to establish that the program satisfies a specification expressed in the style of separation logic, with respect to total correctness. The characteristic formula of a program is automatically generated from its source code alone. In particular, there is no need to annotate the source code with specifications or loop invariants, as such information can be given in interactive proof scripts. One key feature of characteristic formulae is that they are of linear size and that they can be pretty-printed in a way that closely resembles the source code they describe, even though they do not refer to the syntax of the programming language. Characteristic formulae serve as the basis for a tool, called CFML, that supports the verification of Caml programs using the Coq proof assistant. CFML has been employed to verify about half of the content of Okasaki's book on purely functional data structures, and to verify several imperative data structures such as mutable lists, sparse arrays and union-find. CFML also supports reasoning on higher-order imperative functions, such as functions in CPS form and higher-order iterators.
Graja, Zaineb. "Vérification formelle des systèmes multi-agents auto-adaptatifs". Thesis, Toulouse 3, 2015. http://www.theses.fr/2015TOU30105/document.
A major challenge for the development of self-organizing MAS is to guarantee the convergence of the system to the overall function expected by an external observer and to ensure that agents are able to adapt to changes. In the literature, several works were based on simulation and model checking to study self-organizing MAS. Simulation allows designers to experiment with various settings and create heuristics to facilitate the system design. Model checking provides support to discover deadlocks and property violations. However, to cope with the complexity of self-organizing MAS, the designer also needs techniques that support not only verification, but also the development process itself. Moreover, such techniques should support disciplined development and facilitate reasoning about various aspects of the system behavior at different levels of abstraction. In this thesis, three essential contributions were made in the field of formal development and verification of self-organizing MAS: a formalization with the Event-B language of the key concepts of self-organizing MAS into three levels of abstraction, an experimentation of a top-down refinement strategy for the development of self-organizing MAS, and the definition of a bottom-up refinement process based on refinement patterns.
Mateescu, Radu. "Vérification des propriétés temporelles des programmes parallèles". Phd thesis, Grenoble INPG, 1998. http://tel.archives-ouvertes.fr/tel-00004896.
Methni, Amira. "Méthode de conception de logiciel système critique couplée à une démarche de vérification formelle". Thesis, Paris, CNAM, 2016. http://www.theses.fr/2016CNAM1057/document.
Software systems are critical and complex. In order to guarantee their correctness, the use of formal methods is important. These methods can be defined as mathematically based techniques, languages and tools for specifying and reasoning about systems. But the application of formal methods to software systems implemented in C is challenging due to the presence of pointers, pointer arithmetic and interaction with hardware. Moreover, software systems are often concurrent, making the verification process infeasible. This work provides a methodology to specify and verify C software systems using the model-checking technique. The proposed methodology is based on translating the semantics of C into TLA+, a formal specification language for reasoning about concurrent and reactive systems. We define a memory and execution model for a sequential program and a set of translation rules from C to TLA+ that we developed in a tool called C2TLA+. Based on this model, we show that it can be extended to support concurrency, synchronization primitives and process scheduling. Although model checking is an efficient and automatic technique, it faces the state-explosion problem when the system becomes large. To overcome this problem, we propose a state-space reduction technique, based on agglomerating a set of C instructions during the generation phase of the TLA+ specification. This methodology has been applied to a concrete case study, a microkernel of an industrial real-time operating system, on which a set of functional properties has been verified. The application of the agglomeration technique to the case study shows the usefulness of the proposed technique in reducing the complexity of verification. The obtained results allow us to study the behavior of the system and to find errors undetectable using traditional testing techniques.
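The agglomeration idea behind the state-space reduction can be illustrated at a toy scale: consecutive instructions that do not interact with other processes are merged into one atomic block, so the model checker explores far fewer interleavings. This is our own illustrative reconstruction, not C2TLA+ code:

```python
def agglomerate(instructions, interacts):
    """Group a straight-line instruction list into atomic blocks: maximal
    runs of non-interacting instructions are merged into one block, while
    every interacting instruction (e.g. an access to a shared variable or
    a synchronization primitive) stays alone, so all of its interleavings
    with other processes are still explored."""
    blocks, run = [], []
    for ins in instructions:
        if interacts(ins):
            if run:
                blocks.append(run)
                run = []
            blocks.append([ins])
        else:
            run.append(ins)
    if run:
        blocks.append(run)
    return blocks
```

Since the number of interleavings of concurrent processes grows combinatorially with the number of atomic steps per process, shrinking the step count this way directly shrinks the state space the model checker must visit.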
Cambolive, Guillaume. "Scrables : un système intelligent d'audit". Toulouse 3, 1993. http://www.theses.fr/1993TOU30237.
Pełny tekst źródłaBerkane, Bachir. "Vérification des systèmes matériels numériques séquentiels synchrones : application du langage Lustre et de l'outil de vérification Lesar". Phd thesis, Grenoble INPG, 1992. http://tel.archives-ouvertes.fr/tel-00340909.
Pełny tekst źródłaLy, Van Bao. "Réalisation d'un système de vérification de signature manuscrite en-ligne indépendant de la plateforme d'acquisition". Evry, Institut national des télécommunications, 2005. http://www.theses.fr/2005TELE0008.
This thesis contributes to automatic identity verification using the online handwritten signature, which is often sampled by a digitizing tablet or a touch screen. The handwritten signature is a highly accepted biometric modality. The proposed algorithm is original, generic and independent of the experimental signature database. It can be installed with different acquisition devices without any adaptation. The signature is modelled by a Hidden Markov Model. Firstly, we perform a personalized normalization of the signature features, which improves the quality of the Hidden Markov Model. In this stage, we experiment only with the likelihood information of the Hidden Markov Model, and show that the normalization of the signature features is crucial to system performance. Then, we exploit second information given by the Hidden Markov Model in order to verify the identity: the segmentation of the signature, never used previously for this task. This information is then fused with the likelihood information to reinforce the verification system. The experiments show that system performance is greatly improved compared to the exclusive use of the likelihood information. These experiments are performed on 4 signature databases with very different characteristics, and then on the integrated database, which is simply a mixture of the 4 previous databases. The good system performance shows the independence of the proposed algorithm with respect to the considered signature database and to the signature acquisition device.
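Fusing the likelihood information with the segmentation information can be sketched as a weighted sum of the two (already normalised) scores followed by a threshold decision. The exact fusion rule of the thesis may differ, and the names below are ours:

```python
def fuse_scores(likelihood_score, segmentation_score, weight=0.5):
    """Linear score fusion sketch: combine the HMM likelihood score with a
    score derived from the Viterbi segmentation of the signature."""
    return weight * likelihood_score + (1.0 - weight) * segmentation_score

def verify(likelihood_score, segmentation_score, threshold=0.0, weight=0.5):
    """Accept the claimed identity when the fused score reaches a global
    decision threshold."""
    return fuse_scores(likelihood_score, segmentation_score, weight) >= threshold
```

The weight trades off the two information sources; because each source fails on different forgeries, even this simple linear combination can outperform either score alone.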
Mami, Yassine. "Reconnaissance de locuteurs par localisation dans un espace de locuteurs de référence". Phd thesis, Télécom ParisTech, 2003. http://tel.archives-ouvertes.fr/tel-00005757.
Pełny tekst źródła
We then proposed a new representation of speakers based on a distribution of distances. The idea is to model a speaker by a distribution over the distances measured in the anchor-model space. This makes it possible to apply a statistical measure between the test utterance and the models of the speakers to be recognized (instead of a geometric measure).
Thus, while we deepened the modelling of a speaker by its position in a space of reference speakers, we also studied how this position could yield a better estimate of the speaker's GMM model, for example by merging the models of its nearest neighbours. Finally, as a complement to GMM-UBM modelling, we studied decision-fusion algorithms combining the different proposed approaches.
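The anchor-space idea can be sketched as follows: each utterance is summarized by its scores against a set of anchor models, and recognition reduces to a nearest-neighbour search in that space. This shows only the geometric variant mentioned in the abstract (not the distance-distribution model); names and data are hypothetical.

```python
import math

def euclidean(u, v):
    # distance between two positions in the anchor-model space
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def identify(test_vector, enrolled):
    """Return the enrolled speaker whose anchor-space position is
    closest to the test utterance's vector of anchor-model scores."""
    return min(enrolled, key=lambda name: euclidean(test_vector, enrolled[name]))
```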
Arcile, Johan. "Conception, modélisation et vérification formelle d’un système temps-réel d’agents coopératifs : application aux véhicules autonomes communicants". Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLE029.
Pełny tekst źródła
This thesis is motivated by the question of validating properties in a system composed of several mobile agents that individually make decisions in real time. Each agent has a perception of its own environment and can communicate with other agents nearby. The application chosen as a case study is that of autonomous vehicles, where the large number of variables involved in representing such systems makes naive approaches impossible. The issues addressed concern, on the one hand, the modelling of such a system, in particular the choice of formalism and the level of abstraction of the model, and on the other hand, the implementation of a protocol for evaluating the decision making of vehicles. This last point includes the question of how efficiently the state space of the model can be explored. The thesis presents a set of complementary works aiming to address these problems. First, the system, consisting of autonomous vehicles and their environment, is precisely defined. It makes it possible, in particular, to observe the impact of communications between vehicles on their behavior. The VerifCar software framework, dedicated to decision-making analysis of communicating autonomous vehicles, is then presented. It includes a parametric timed-automata model on which temporal logic properties can be checked. An analysis methodology using these properties is presented. A complementary approach is also proposed, which in some cases offers greater efficiency and greater expressiveness. It is based on the formalism of MAPTs (Multi-Agent systems with Timed Periodic Tasks), which was designed for modelling real-time systems of cooperative agents. Algorithms allowing a dynamic exploration of the states of this type of model (that is to say, without the state space having to be built beforehand) are presented. Finally, a method combining simulation and model-checking tools to control the level of realism is described and applied to the case study.
Lafon, Philippe. "Méthodes de vérification de bases de connaissances". Phd thesis, Ecole Nationale des Ponts et Chaussées, 1991. http://tel.archives-ouvertes.fr/tel-00520738.
Pełny tekst źródła
Azzoune, Hamid. "Les types en Prolog : un système d'inférence de type et ses applications". Phd thesis, Grenoble INPG, 1989. http://tel.archives-ouvertes.fr/tel-00332314.
Pełny tekst źródłaCellier, Peggy. "DeLLIS : débogage de programmes par localisation de fautes avec un système d’information logique". Rennes 1, 2008. ftp://ftp.irisa.fr/techreports/theses/2008/cellier.pdf.
Pełny tekst źródła
When testing a program, some executions can fail. Fault localization gives clues to locate the faults that cause those failures. The first contribution of this thesis is a new data structure for fault localization: a lattice that contains information from execution traces. The lattice is computed thanks to the combination of association rules and formal concept analysis, two data-mining techniques. The lattice captures all differences between execution traces and, at the same time, gives a partial ordering on those differences. Unlike existing work, the method takes into account the dependencies between elements of the traces thanks to the lattice. The second contribution of this thesis is an algorithm that traverses the lattice in order to locate several faults in one pass over a test suite of the program. Experiments show that while the method takes multiple faults into account, it is not penalized, compared to existing work, when the program contains only one fault (in terms of the number of lines to inspect). In addition, a study of the impact of dependencies between faults on the method shows that in three out of the four identified cases of dependency the faults can be located. The third contribution is an algorithm to compute association rules. The particularity of that algorithm is that it can take into account taxonomies, such as the hierarchy of the abstract syntax tree, without redundancy. It is used to generate the association rules from which the lattice for fault localization is built.
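To give a concrete flavour of trace-based fault localization, the sketch below ranks program lines by how strongly they associate with failing runs. This is a simplified, Tarantula-style spectrum scoring, not the lattice construction of the thesis; all names and data are illustrative.

```python
def rank_suspicious_lines(traces):
    """traces: list of (executed_lines, failed) pairs, where executed_lines
    is a set of line numbers and failed is a boolean.
    Lines executed mostly by failing runs come first."""
    lines = set().union(*(executed for executed, _ in traces))
    n_fail = sum(1 for _, failed in traces if failed) or 1
    n_pass = sum(1 for _, failed in traces if not failed) or 1

    def score(line):
        ef = sum(1 for t, failed in traces if failed and line in t) / n_fail
        ep = sum(1 for t, failed in traces if not failed and line in t) / n_pass
        return ef / (ef + ep) if ef + ep else 0.0

    return sorted(lines, key=score, reverse=True)
```

A line executed by every failing run but no passing run ends up at the head of the ranking.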
Karfoul, Hazem. "L'efficacité du système de contrôle interne et le seuil optimal du risque opérationnel : un modèle d'équilibre de l'allocation des ressources pour l'organisation bancaire". Bordeaux 4, 2010. http://www.theses.fr/2010BOR40089.
Pełny tekst źródła
Operational risk measurement models, aiming to meet regulatory capital requirements (ORC), have proliferated lately. Nonetheless, accounting for the effectiveness of the internal control system (ICS) in these models is still carried out in an implicit manner. Such a failure may lead to biased estimations of ORC that are inconsistent with the effective operational risk profile of the bank. In this work, we endeavor to overcome this pitfall through a direct and explicit measure of the internal control system's quality. Our findings show that ORC is negatively related to ICS. However, each bank has a given threshold beyond which its operational risk capital does not respond to additional improvements brought to its ICS (i.e., an optimal threshold for operational risk, OTOR). We argue that the so-called "law of decreasing effectiveness" seems to explain the OTOR theorem well. Taking into account information asymmetry problems, arguably hard for banking supervisors to verify, we also provide an equilibrium model for resource allocation subject to regulatory constraints and management preferences. The introduction of OTOR, as an optimization problem, affects the bank's decision about how scarce resources should be allocated. Accordingly, sharing out the resources becomes more efficient. While reducing the ORC charge, the bank's management can shift more resources to higher return-generating activities while remaining compliant with regulatory requirements. Furthermore, by means of OTOR, a simple formula for pricing the "operational risk premium" is provided.
Bellefeuille, Sylvain. "Proposition d'un modèle de système d'aide à la vérification de la cohérence dans les bases de règles". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ60612.pdf.
Pełny tekst źródła
Khoury, Raphaël. "Détection du code malicieux : système de type à effets et instrumentation du code". Thesis, Université Laval, 2005. http://www.theses.ulaval.ca/2005/23250/23250.pdf.
Pełny tekst źródła
The purpose of this thesis is twofold. In the first place, it presents a comparative study of the advantages and drawbacks of several approaches to ensure software safety and security. It then focuses more particularly on combining static analyses and dynamic monitoring in order to produce a more powerful security architecture. The first chapters of the thesis present an analytical review of the various static, dynamic and hybrid approaches that can be used to secure a potentially malicious code. The advantages and drawbacks of each approach are analyzed, and the class of security properties it can enforce is identified. The thesis then focuses on the possibility of combining static and dynamic analysis through a new hybrid approach. This approach consists in a code instrumentation that only alters those parts of a program where it is necessary to do so to ensure the respect of a user-defined security policy, expressed as a set of modal μ-calculus properties. This instrumentation is guided by a static analysis based on a type and effect system. The effects represent the accesses made to protected system resources.
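The inlined-monitor idea can be illustrated with a toy dynamic check. In the thesis, instrumentation is inserted only where the static type-and-effect analysis cannot discharge the policy; the hypothetical sketch below instead checks every call, enforcing the policy "no read after close" on a resource.

```python
class PolicyViolation(Exception):
    """Raised when an execution step breaks the security policy."""

class MonitoredFile:
    """Toy resource with the inlined policy 'no read after close'."""

    def __init__(self):
        self._closed = False

    def read(self):
        if self._closed:  # inlined runtime check
            raise PolicyViolation("read after close")
        return "data"

    def close(self):
        self._closed = True
```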
Dang, Thi Xuan Thao. "Vérification et synthèse des systèmes hybrides". Phd thesis, Grenoble INPG, 2000. https://theses.hal.science/tel-00006738.
Pełny tekst źródła
This thesis proposes a practical framework for the verification and synthesis of hybrid systems, that is, systems combining continuous and discrete dynamics. The lack of methods for computing reachable sets of continuous dynamics has been the main obstacle towards an algorithmic verification methodology for hybrid systems. We develop two effective approximate reachability techniques for continuous systems, based on an efficient representation of sets and a combination of techniques from simulation, computational geometry, optimization, and optimal control. One is specialized for linear systems and extended to systems with uncertain input; the other can be applied to non-linear systems. Using these reachability techniques, we develop a safety verification algorithm which works for a broad class of hybrid systems with arbitrary continuous dynamics and rather general switching behavior. We next study the problem of synthesizing switching controllers for hybrid systems with respect to a safety property. We present an effective synthesis algorithm based on the calculation of the maximal invariant set and the approximate reachability techniques. Finally, we describe the experimental tool "d/dt", which provides automatic safety verification and controller synthesis for hybrid systems with linear differential inclusions. Besides numerous academic examples, we have successfully applied the tool to verify some practical systems.
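The reachable-set computation can be illustrated in a drastically simplified one-dimensional, discrete-time form, with interval arithmetic standing in for the polyhedral over-approximations implemented in d/dt; the dynamics and numbers below are hypothetical.

```python
def step_interval(lo, hi, a, u_lo, u_hi):
    """Over-approximate one step of x' = a*x + u, with x in [lo, hi]
    and the uncertain input u in [u_lo, u_hi]."""
    images = (a * lo, a * hi)
    return min(images) + u_lo, max(images) + u_hi

def reach_tube(lo, hi, a, u_lo, u_hi, steps):
    """Sequence of interval over-approximations of the reachable set."""
    tube = [(lo, hi)]
    for _ in range(steps):
        lo, hi = step_interval(lo, hi, a, u_lo, u_hi)
        tube.append((lo, hi))
    return tube
```

A safety check then amounts to verifying that every interval of the tube avoids the bad states.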
Johnen, Colette. "Analyse algorithmique des réseaux de Petri : vérification d'espace d'accueil, systèmes de réécriture". Paris 11, 1987. http://www.theses.fr/1987PA112481.
Pełny tekst źródła
The main motivation of this thesis is the design of automatic analysis techniques for Petri nets. The thesis deals with two themes: verification of home spaces, and the application of techniques based on rewriting systems. A home space is a set of markings that remains reachable whatever the evolution of the system may be. This property allows, for instance, verifying that the idle state of a process is always accessible. It also allows validating behavioural properties (such as liveness). In the first part, we prove that the property "a finite union of linear sets having the same periods is a home space" is decidable. A semi-decision algorithm checking a home space is presented in the second part. An approach using marking sets leads to especially efficient results: in one stage, it verifies that the home space is reachable from all the markings of a marking set. A class of easily manipulated marking sets is defined (delimited sets). The resulting text is short; nevertheless it precisely indicates how to reach the home space from a given marking. The third part links Petri nets and abstract data types. A representation of Petri nets as a set of equations is established. Some properties (boundedness, confluence) are directly related to properties of the rewriting system obtained by completion "à la Knuth-Bendix" of the equations. Some other properties (quasi-liveness, finite termination, reachability of a marking, etc.) can be validated by the convergence of the rewriting system obtained by adding the equation corresponding to the property to be proved. Proofs which usually require manually elaborated induction principles can be completed automatically this way.
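For a bounded net, the home-space property can be checked naively by exhaustive state-space exploration, as in the sketch below (the thesis targets the general case, where only a semi-decision procedure exists; the encoding of transitions as (pre, post) tuples is an assumption of this sketch).

```python
from collections import deque

def reachable(m0, transitions):
    """All markings reachable from m0; each transition is a (pre, post)
    pair of tuples over the places. Assumes the net is bounded."""
    seen, frontier = {m0}, deque([m0])
    while frontier:
        m = frontier.popleft()
        for pre, post in transitions:
            if all(t >= p for t, p in zip(m, pre)):  # transition enabled
                m2 = tuple(t - p + q for t, p, q in zip(m, pre, post))
                if m2 not in seen:
                    seen.add(m2)
                    frontier.append(m2)
    return seen

def is_home_space(m0, transitions, home):
    """'home' is a home space iff some marking of 'home' stays reachable
    from every marking reachable from m0."""
    return all(reachable(m, transitions) & home for m in reachable(m0, transitions))
```

In a two-place net where a token can shuttle back and forth, the initial marking is a home space; remove the returning transition and it no longer is.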
Geffroy, Thomas. "Vers des outils efficaces pour la vérification de systèmes concurrents". Thesis, Bordeaux, 2017. http://www.theses.fr/2017BORD0848/document.
Pełny tekst źródła
The goal of this thesis is to solve in practice the coverability problem in Petri nets and lossy channel systems (LCS). These systems are interesting to study because they can be used to model concurrent and distributed systems. The coverability problem in a transition system is to decide whether it is possible, from an initial state, to reach a state greater than a target state. In the first part, we discuss how to solve this problem for well-structured transition systems (WSTS), a class that includes both Petri nets and LCS, and we present a general method to solve it quickly in practice. This method uses coverability invariants, which are over-approximations of the set of coverable states. The second part studies Petri nets. We compare coverability invariants, both in theory and in practice. Particular attention is paid to the combination of the classical state inequation and a simple sign analysis. LCS are the focus of the third part. We present a variant of the state inequation for LCS and two invariants that capture properties of the order in which messages are sent. Two tools, ICover and BML, were developed to solve the coverability problem in Petri nets and LCS respectively.
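The coverability problem can be made concrete with the classical backward algorithm for Petri nets: iterate predecessor computation on the upward-closed target set, keeping a minimal basis. This is a textbook baseline, shown here only to illustrate the problem the thesis accelerates with invariant-based pruning; the data representation is an assumption of this sketch.

```python
def backward_coverable(m0, transitions, target):
    """Backward coverability for a Petri net: can some marking >= target
    be reached from m0? transitions are (pre, post) tuples over places."""
    def pred(m, pre_t, post_t):
        # minimal marking from which firing the transition covers m
        return tuple(max(x - q, 0) + p for x, p, q in zip(m, pre_t, post_t))

    basis = {target}                      # minimal basis of an upward-closed set
    while True:
        new = set()
        for m in basis:
            for pre_t, post_t in transitions:
                b = pred(m, pre_t, post_t)
                if not any(all(x <= y for x, y in zip(e, b)) for e in basis | new):
                    new.add(b)
        if not new:
            break
        basis |= new
        basis = {m for m in basis         # keep only minimal elements
                 if not any(e != m and all(x <= y for x, y in zip(e, m))
                            for e in basis)}
    return any(all(x <= y for x, y in zip(m, m0)) for m in basis)
```

With a single transition that turns one token into two, marking (3,) is coverable from (1,) but not from (0,).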
Djelouah, Redouane. "Vérification et réparation interactive de bases de connaissances : le système ICC de traitement des incohérences et des incomplétudes". Angers, 2004. http://www.theses.fr/2004ANGE0031.
Pełny tekst źródła
To use a knowledge base, we must verify its quality by ensuring it does not contain what we call an anomaly. In this thesis, the knowledge bases considered are rule bases. Two types of anomalies revealing serious errors are studied: incoherences and incompletenesses. An incompleteness shows the need to complete the knowledge in the base so as to cover the whole domain studied. An incoherence shows the need to reduce the knowledge in the base so as to eliminate contradictory deductions. In the first phase of our work, called verification of the base, we propose, on the one hand, a formal characterization of the incoherence and incompleteness of a rule base and, on the other hand, algorithms to detect these anomalies. Here we propose a new formal characterization of a rule base, called C_Cohérence, which improves on characterizations found in other studies. The second phase of our work, called repair of the base, offers a new method of repairing the contents of a rule base that eliminates the incoherences and incompletenesses detected in the first phase. This repair takes place in interaction with an expert: we suggest modifications of the base contents to the expert, who then decides whether to apply them one by one. The two phases of verification and repair were implemented in a system called ICC.
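A much-simplified view of incoherence detection can be given by forward chaining over propositional rules and looking for a literal derived together with its negation. The encoding of negation as a "not " prefix and the rule data are illustrative; ICC's actual characterization (C_Cohérence) is far richer.

```python
def find_incoherences(rules, facts):
    """Saturate the rule base by forward chaining, then report every
    literal derived together with its negation.

    rules: list of (premises, conclusion) pairs; premises is an iterable
    of literals, and a literal 'p' is contradicted by 'not p'."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return sorted(l for l in derived if "not " + l in derived)
```

Two rules deriving both `b` and `not b` from the same fact expose an incoherence; a single rule does not.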
Prat, Sophie. "Intégration de techniques de vérification par simulation dans un processus de conception automatisée de contrôle commande". Thesis, Lorient, 2017. http://www.theses.fr/2017LORIS476/document.
Pełny tekst źródła
Nowadays, engineers have to design open, complex and sociotechnical systems. Process control systems belong to this class of systems, in which system performance relies on the joint optimisation of technical components and human components. To avoid the late discovery of design errors, it is necessary to perform tests throughout the design without adding design costs and delays. The aim of this work is therefore to facilitate the integration of checking by simulation, from the early design stages, for process control systems such as fluid management systems. Regarding the adaptable nature of the system and its evolution in a dynamic environment, a first contribution focuses on the verification approach, by modelling the requirements within their context. Then, to facilitate obtaining the process simulation models required for checking throughout the design, we propose an approach for automatically generating simulation models in the Modelica language (multi-domain modelling) from a P&ID model (modelling the functional architecture of the process) and a library of elements (containing the simulation models of the elements). To provide a proof of concept and a proof of use of our proposals, this approach has been implemented in Anaxagore, an automated design flow for monitoring and control.
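The model-generation step can be pictured as template expansion: walk the components and connections of a P&ID-like description and emit Modelica source. The component types and connector names below are invented for illustration; Anaxagore's actual generator draws them from its element library.

```python
def generate_modelica(model_name, components, connections):
    """Emit a skeletal Modelica model from (name, type) component pairs
    and (port, port) connection pairs."""
    lines = [f"model {model_name}"]
    lines += [f"  {ctype} {cname};" for cname, ctype in components]
    lines.append("equation")
    lines += [f"  connect({a}, {b});" for a, b in connections]
    lines.append(f"end {model_name};")
    return "\n".join(lines)
```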
Baro, Sylvain. "Conception et implémentation d'un système d'aide à la spécification et à la preuve de programmes ML". Phd thesis, Université Paris-Diderot - Paris VII, 2003. http://tel.archives-ouvertes.fr/tel-00008416.
Pełny tekst źródła
Nguyen, Le-Vinh. "Technique de programmation par contraintes pour la vérification bornée des programmes : une stratégie non-séquentielle appliquée à un système embarqué critique". Nice, 2011. http://www.theses.fr/2011NICE4042.
Pełny tekst źródła
This thesis is devoted to program verification using constraint programming techniques. In particular, it focuses on an incremental exploration strategy of the executable paths of a program, for verification and automatic counterexample generation using constraint solvers such as CP, LP, MIP and SMT. The context of this work is Bounded Model Checking (BMC), an incomplete formal verification method that only deals with paths of bounded length in programs. In this thesis, we propose DPVS (Dynamic Postcondition-Variables driven Strategies), a new strategy based on the dynamic generation of a system of constraints during the exploration of the control-flow graph of the program. DPVS uses a backward search technique guided by the variables related to the property to prove. This strategy was evaluated on academic examples and real applications. Experiments on an industrial controller which manages the flashing lights of a car show that our system is more efficient than CPBPV, our previous approach, and than CBMC, a state-of-the-art bounded model checker. We have developed a prototype in COMET that uses the DPVS strategy for program verification and automatic generation of counterexamples. This prototype uses many classical techniques to simplify the control-flow graph, such as computing variable bounds, slicing, and constant propagation. DPVS succeeded in finding a counterexample in a real application, the Flasher Manager, provided by Gensoft, an industrial partner of the research project TESTEC.
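Bounded counterexample generation can be made concrete with a brute-force sketch: enumerate bounded input domains and report the first input that violates the postcondition. A real BMC engine such as DPVS delegates this search to constraint solvers instead of enumerating; all names here are illustrative.

```python
from itertools import product

def find_counterexample(program, postcondition, domains):
    """Return the first input tuple (over the bounded domains) for which
    the program's result violates the postcondition, or None."""
    for inputs in product(*domains):
        result = program(*inputs)
        if not postcondition(*inputs, result):
            return inputs
    return None
```

For a buggy absolute-value function that leaves negative inputs unchanged, the search immediately exposes a negative input, while the correct function yields no counterexample.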