Theses / dissertations on the topic "Interopérabilité des systèmes de preuve"
Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles
See the 50 best works (theses / dissertations) for research on the topic "Interopérabilité des systèmes de preuve".
Next to each source in the list of references there is an "Add to bibliography" button. Click it and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, when it is present in the metadata.
Browse theses / dissertations from a wide range of scientific fields and compile a correct bibliography.
Cauderlier, Raphaël. "Object-Oriented Mechanisms for Interoperability Between Proof Systems". Thesis, Paris, CNAM, 2016. http://www.theses.fr/2016CNAM1065/document.
Dedukti is a Logical Framework resulting from the combination of dependent typing and rewriting. It can be used to encode many logical systems using shallow embeddings preserving their notion of reduction. These translations of logical systems into a common format are a necessary first step for exchanging proofs between systems. This objective of interoperability of proof systems is the main motivation of this thesis. To achieve it, we take inspiration from the world of programming languages, and more specifically from object-oriented languages, because they feature advanced mechanisms for encapsulation, modularity, and default definitions. For this reason we start with a shallow translation of an object calculus to Dedukti. The most interesting point in this translation is the treatment of subtyping. Unfortunately, it seems very hard to incorporate logic in this object calculus. To proceed, object-oriented mechanisms should be restricted to static ones, which seem sufficient for interoperability. Such a combination of static object-oriented mechanisms and logic is already present in the FoCaLiZe environment, so we propose a shallow embedding of FoCaLiZe in Dedukti. The main difficulties arise from the integration of FoCaLiZe's automatic theorem prover Zenon and from the translation of FoCaLiZe's functional implementation language, which features two constructs with no simple counterparts in Dedukti: local pattern matching and recursion. We then demonstrate how this embedding of FoCaLiZe in Dedukti can be used in practice for achieving interoperability of proof systems through FoCaLiZe, Zenon, and Dedukti. In order to avoid strengthening too much the theory in which the final proof is expressed, we use Dedukti as a meta-language for eliminating unnecessary axioms.
Hondet, Gabriel. "Expressing predicate subtyping in computational logical frameworks". Electronic Thesis or Diss., université Paris-Saclay, 2022. http://www.theses.fr/2022UPASG070.
Safe programming, as well as most proof systems, relies on typing. The more expressive a type system is, the more its types can encode invariants that are then verified mechanically through type-checking procedures. Dependent types extend simple types by allowing types to depend on values; for instance, they allow defining the type of lists of a given length. Predicate subtyping is another extension of simple type theory, in which types can be defined by predicates. A predicate subtype, usually noted {x: A | P(x)}, is inhabited by the elements t of type A for which P(t) is true. This extension provides an extremely rich and intuitive type system, which is at the heart of the proof assistant PVS, at the cost of making type checking undecidable. This work is dedicated to the encoding of predicate subtyping in Dedukti, a logical framework with computation rules. We begin with the encoding of explicit predicate subtyping, for which terms of {x: A | P(x)} and terms of A are syntactically different. We show that any derivable judgement of predicate subtyping can be encoded into a derivable judgement of the logical framework. Predicate subtyping is often used implicitly, with no syntactic difference between terms of type A and terms of type {x: A | P(x)}. We enrich our logical framework with a term refiner which can add these syntactic markers; this refiner can be used to turn judgements typed with implicit predicate subtyping into explicit judgements. The proof assistant PVS uses predicate subtyping extensively. We show how its standard library can be exported to Dedukti. Because PVS only stores proof traces rather than complete proof terms, we sketch in the penultimate section a procedure to generate complete proof terms from these traces. The last section presents the architecture of a repository dedicated to the exchange of formal proofs. The goal of such a repository is to categorise and store proofs encoded in Dedukti to promote interoperability.
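As a concrete illustration of the predicate subtypes {x: A | P(x)} discussed in this abstract, here is a minimal sketch in Lean 4, where the analogous construct is the subtype {x : A // P x}. This is an illustration of the concept only, not code from the thesis:

```lean
-- A predicate subtype packages a value of the carrier type together with a
-- proof that the predicate holds, making the subtyping judgement explicit.
def Pos : Type := {n : Nat // n > 0}

-- Building an inhabitant requires the proof component explicitly; PVS-style
-- implicit predicate subtyping relies on a refiner to insert such markers.
def two : Pos := ⟨2, by decide⟩

-- Projecting back to the carrier type Nat:
#eval two.val  -- 2
```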
Felicissimo, Thiago. "Generic bidirectional typing in a logical framework for dependent type theories". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG049.
Dependent type theories are formal systems that can be used both as programming languages and for the formalization of mathematics, and constitute the foundation of popular proof assistants such as Coq and Agda. In order to unify their study, Logical Frameworks (LFs) provide a unified meta-language for defining such theories, in which various universal notions are built in by default and metatheorems can be proven in a theory-independent way. This thesis focuses on LFs designed with implementation in mind, with the goal of providing generic type-checkers. Our main contribution is a new such LF which allows for representing type theories with their usual non-annotated syntaxes. The key to allowing the removal of annotations without jeopardizing decidability of typing is the integration of bidirectional typing, a discipline in which the typing judgment is decomposed into inference and checking modes. While bidirectional typing has been well known in the literature for quite some time, one of the central contributions of our work is that, by formulating it in an LF, we give it a generic treatment for all theories fitting our framework. Our proposal has been implemented in the generic type-checker BiTTs, allowing it to be used in practice with various theories. In addition to our main contribution, we also advance the study of Dedukti, a sibling LF of our proposed framework. First, we revisit the problem of showing that theories are correctly represented in Dedukti by proposing a methodology for encodings which allows for showing their conservativity easily. Furthermore, we demonstrate how Dedukti can be used in practice as a tool for translating proofs, by proposing a transformation for sharing proofs with predicative systems. This transformation has allowed for the translation of proofs from Matita to Agda, yielding the first-ever Agda proofs of Fermat's Little Theorem and Bertrand's Postulate.
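The inference/checking split described in this abstract can be sketched for the simply typed lambda calculus. The following Python fragment is an illustrative toy, not the BiTTs implementation; the term encoding and all names are assumptions:

```python
# Minimal bidirectional type checker for the simply typed lambda calculus.
# Terms are tuples: ("var", name), ("app", fn, arg), ("lam", name, body),
# and ("ann", term, type) for type ascriptions. Types: ("base", name) and
# ("arrow", domain, codomain).

def infer(ctx, term):
    """Inference mode: the term itself determines its type."""
    kind = term[0]
    if kind == "var":
        return ctx[term[1]]
    if kind == "app":
        fn_ty = infer(ctx, term[1])
        assert fn_ty[0] == "arrow", "applying a non-function"
        check(ctx, term[2], fn_ty[1])
        return fn_ty[2]
    if kind == "ann":
        check(ctx, term[1], term[2])
        return term[2]
    raise TypeError(f"cannot infer type of {kind}")

def check(ctx, term, ty):
    """Checking mode: the expected type is an input, so lambdas need no annotation."""
    if term[0] == "lam":
        assert ty[0] == "arrow", "lambda checked against a non-arrow type"
        check({**ctx, term[1]: ty[1]}, term[2], ty[2])
        return
    # Mode switch: fall back to inference and compare with the expected type.
    assert infer(ctx, term) == ty, "type mismatch"

# The identity function needs no annotation when checked against nat -> nat:
nat = ("base", "nat")
check({}, ("lam", "x", ("var", "x")), ("arrow", nat, nat))
```

Note how the unannotated lambda is only typable in checking mode; this is the mechanism that lets annotations be dropped without losing decidability.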
Besançon, Léo. "Interopérabilité des Systèmes Blockchains". Thesis, Lyon, 2021. https://tel.archives-ouvertes.fr/tel-03789639.
Blockchains are a disruptive technology. They enable an ecosystem of decentralized applications with interesting properties: transaction transparency, application auditability, and censorship resistance. The technology has applications in various fields, such as finance, healthcare, and video games; it has evolved considerably since its creation in 2008 and offers numerous perspectives. However, the field faces many challenges, in particular a lack of interoperability at several levels: between projects on the same Blockchain, between different Blockchains, and between Blockchains and other systems. One important aspect of Blockchain systems interoperability is semantic interoperability, which relies on formal definitions of the related concepts. Another challenge is the design of decentralized Blockchain applications: these applications integrate Blockchain technology together with other services that satisfy the constraints for which Blockchain is not suitable. However, choosing the most suitable Blockchain service for a given application is complex. With this PhD work, we propose a framework that can improve interoperability for decentralized Blockchain applications. We achieve this with the design of a methodology and a Blockchain ontology that help formalize the concepts related to an application. This framework is validated in the context of Blockchain video game development, a complex use case, as it needs to share storage-intensive data and satisfy latency constraints.
Kirchner, Florent. "Systèmes de preuve interopérables". PhD thesis, Ecole Polytechnique X, 2007. http://pastel.archives-ouvertes.fr/pastel-00003192.
Seyrat, Claude. "L'interopérabilité dans les systèmes d'indexation multimédia". Paris 6, 2003. http://www.theses.fr/2003PA066307.
Nowak, David. "Spécification et preuve de systèmes réactifs". Rennes 1, 1999. http://www.theses.fr/1999REN10094.
Texto completo da fonteCuggia, Marc. "Interopérabilité des systèmes d'information en santé et aspects sémantiques". Rennes 1, 2008. http://www.theses.fr/2008REN1B116.
The interoperability of health information systems is a major issue, since these systems are heterogeneous in their coverage areas, their structures, and their contents. The objective of this work is to explore some aspects of interoperability, in particular the semantic aspects. The thesis is structured in three parts: (i) an inventory of health information systems and the place of interoperability norms and standards in the health domain; (ii) the modeling and experimental implementation of an information system for health care networks; (iii) specific semantic issues, investigated through several studies, notably the difficulties of connecting information models (here HL7 and OpenEHR) with knowledge models (SNOMED International, SNOMED CT & LOINC).
Malet, Emmanuel. "Interopérabilité des systèmes d'IAO : les services de prototypage rapide". Reims, 2003. http://theses.univ-reims.fr/sciences/2003REIMS020.pdf.
Texto completo da fonteBah, Abdramane. "Interopérabilité et sécurité des systèmes d'information : application aux systèmes de gestion de l'éducation". Thesis, Nantes, 2020. http://www.theses.fr/2020NANT4028.
Access control for shared services is an essential requirement for a federation of services from different domains. In this thesis, we tackle two access control problems: authorizing users and delegating user authentication to their own domains. To address these problems, we propose a method of authorizing users based on the attribute-mapping technique, together with a federated access control mechanism to implement this authorization method. We also propose a service federation method to delegate user authentication to the users' own domains. The main objective is to allow various domains to collaborate despite the heterogeneity of their security models.
Mornet, Marie-Noëlle. "La preuve par vidéosurveillance". Université Robert Schuman (Strasbourg) (1971-2008), 2003. http://www.theses.fr/2003STR30007.
The study discusses evidence-seeking and the way it is implemented in a legal dispute. It aims at determining the regulation applicable to closed-circuit television (CCTV) and its relevance. There are common criteria for a lawful use of the process, i.e. the information of individuals, which is necessary although insufficient, and the proportionality requirement, which adapts the infringement of a person's rights to the necessity of CCTV. The divergence of case law regarding the admissibility of video recordings, and the variable assessment of the weight of such evidence, run against the principles of a fair trial. The system must be improved: a limit must be set to the admissibility of irregular evidence, and the rules for assessing probative value need more precision. Moreover, the judge should take the proportionality criterion into consideration in his ruling. This may have consequences on the use made of CCTV by individual interests.
Melliti, Tarek. "Interopérabilité des services Web Complexes : Application aux systèmes multi-agents". Paris 9, 2004. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2004PA090024.
This work concerns the formalization of operational interoperability between two communicating systems, and particularly between composite services. The formalization captures the possibility of a correct interaction between two communicating systems using different specifications, and materializes in three steps. First, we define observability as a reference level of abstraction to express the semantics of the different specifications. Then, we propose a conformity relation on the states of two systems which captures the concept of correct interaction. Lastly, we define a synthesis algorithm which, given the specification of a composite service and its observable semantics, either generates a correct client or detects ambiguous behaviour. This work led to the realisation of a composite Web service interaction platform and an environment for multi-agent system integration.
Diagne, Fama. "Preuve de propriétés dynamiques en B". PhD thesis, Institut National des Télécommunications, 2013. http://tel.archives-ouvertes.fr/tel-00939071.
Texto completo da fonteBaïna, Salah. "Interopérabilité dirigée par les modèles : une Approche Orientée Produit pour l'interopérabilité des systèmes d'entreprise". Nancy 1, 2006. https://tel.archives-ouvertes.fr/tel-00123271.
Systems integration aims at assembling several systems in order to build a unique system working and reacting as a whole. In this context, systems interoperability can be seen as a means to obtain integration based on a light coupling of the sub-parts of the system. Recently, a new approach to systems development based on modelling techniques has emerged. This approach is called "model-driven engineering"; in it, models play a key role in systems engineering and development. This PhD thesis presents a product-oriented approach to interoperability that ensures data and schema consistency between all representations of a product in production. The aim of product-oriented interoperability is, on the one hand, to define mechanisms that maintain consistency between these representations and the reality of the product, and on the other hand to make sure that all applications share the representation of the same product. Our result is a model-driven approach based on a generic meta-model that defines all the concepts used for representing the product. To enable data exchange between different applications, a formalisation of model-driven mappings for semantic interoperability is also proposed. To achieve our goal we focus on the formalisation of a unified product representation inside enterprise information systems. This unified representation is inspired by the well-known BWW (Bunge-Wand-Weber) ontology, widely used for representing real-world objects in information systems design. In this thesis, a reference meta-model is defined to enable structured data exchange between different applications.
Lick, Anthony. "Logique de requêtes à la XPath : systèmes de preuve et pertinence pratique". Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLN016/document.
Motivated by applications ranging from XML processing to runtime verification of programs, many logics on data trees and data streams have been developed in the literature. These offer different trade-offs between expressiveness and computational complexity; their satisfiability problem often has non-elementary complexity or is even undecidable. Moreover, their study through model-theoretic or automata-theoretic approaches can be computationally impractical or lack modularity. In a first part, we investigate the use of proof systems as a modular way to solve the satisfiability problem of data logics on linear structures. For each logic we consider, we develop a sound and complete hypersequent calculus and describe an optimal proof search strategy yielding an NP decision procedure. In particular, we exhibit an NP-complete fragment of the tense logic over data ordinals (the full logic being undecidable), which is exactly as expressive as the two-variable fragment of first-order logic on data ordinals. In a second part, we run an empirical study of the main decidable XPath-like logics proposed in the literature. We present a benchmark we developed to that end, and examine how these logics could be extended to capture more real-world queries without impacting the complexity of their satisfiability problem. Finally, we discuss the results gathered from our benchmark, and identify which new features should be supported in order to increase the practical coverage of these logics.
Sioud, Esma. "Contribution à l'évaluation de l'interopérabilité sémantique entre systèmes d'information d'entreprises : application aux systèmes d'information de pilotage de la production". Thesis, Nancy 1, 2011. http://www.theses.fr/2011NAN10049/document.
Within the context of collaborative enterprise information systems, this work proposes an approach for assessing interoperability and non-interoperability. Non-interoperability poses a series of challenging problems to the industrial community and leads to significant costs, most of which are attributable to the time and resources spent putting interfaces in place for exchanging information. This mainly affects the global performance of the enterprise by increasing the cost and delay of the expected services. We address enterprise interoperability measurement in order to allow any enterprise to fully evaluate, a priori, its own capacity to interoperate, and therefore to anticipate possible problems before entering a partnership. Our work consists in defining indicators and metrics to quantify and then qualify the interoperability between enterprise systems, and in proposing improvement strategies when the evaluated interoperability level is not sufficient.
Yahia, Esma. "Contribution à l'évaluation de l'interopérabilité sémantique entre systèmes d'information d'entreprise : Application aux systèmes d'information de pilotage de la production". PhD thesis, Université Henri Poincaré - Nancy I, 2011. http://tel.archives-ouvertes.fr/tel-00630118.
Texto completo da fonteFrançois, Axel. "Interopérabilité des modèles géométriques pour les Systèmes d'Information Géographique : applications de la norme ISO 19107". Thesis, Aix-Marseille 2, 2011. http://www.theses.fr/2011AIX22078/document.
In the current economic context, the representation and analysis of 3D data is growing steadily in the field of Geographic Information Systems (GIS). The number of GIS applications is constantly increasing in many industries, such as Defense, Regional Development, and Civil Security, and an important demand for 3D analysis in GIS is emerging. These developments require a standardization of data and exchange structures, carried out by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO). A recent standard (ISO 19107:2003) describes the complex objects to be taken into consideration and the associated treatments that can be applied to them. It aims at interoperable exchange and analysis of geometric and topological data in GIS. Currently, no complete computer implementation of this standard exists at the level of abstraction sought; only a simplified version restricted to 2D objects has been developed (ISO 19137:2007). This thesis proposes the modeling and analysis of complex objects in three-dimensional space, with their associated treatments. The objective is to create a first library whose features conform to ISO 19107. Moreover, this standard is currently under review within the OGC consortium, with a need for corrections and evolutions. The work done so far has enabled us to make a meaningful contribution, with the addition of new geometric primitives and the harmonization of primitives through the use of rational parametric curves and surfaces. The work on this standard also led to the development of an application within the GEOMATYS company, enabling 3D modeling and analysis for traffic simulation based on GIS data.
Bondé, Lossan. "Transformations de modèles et interopérabilité dans la conception de systèmes hétérogènes sur puce à base d'IP". Lille 1, 2006. https://pepite-depot.univ-lille.fr/LIBRE/Th_Num/2006/50376-2006-Bonde.pdf.
Texto completo da fonteNguyen, Quang Huy. "Calcul de réécriture et automatisation du raisonnement dans les assistants de preuve". Nancy 1, 2002. http://www.theses.fr/2002NAN10144.
In this thesis, we are interested in the cooperation of first-order rewriting and constructive type theory to improve the automation of computer-aided proofs. We aim to integrate (simple, conditional, or AC) rewriting into the Coq proof assistant by automatic, safe, and efficient means, using the external programming environment ELAN. Two different approaches are investigated. In the mixed approach, ELAN is used as an external rewriting engine for normalising terms in a reflexive rewriting tactic in Coq; the normalisation trace generated by ELAN is replayed in Coq to compute the corresponding normal form. In order to improve the performance of this reflexive tactic, we show how to use lazy rewriting to produce compact normalisation traces. The second approach, called external, is more flexible, since it also covers several useful extensions of term rewriting such as conditional rewriting and AC rewriting, and it can easily be adapted to other proof assistants. In this approach, term rewriting is performed entirely in ELAN and the proof assistant is only used for checking purposes. We formalise the notion of proof term for ELAN computations in the rewriting calculus with explicit substitutions, investigate the equivalence properties of the proof terms, and translate them into the syntax of the proof assistant in order to check the corresponding computations. By this work, non-deterministic ELAN computations can be independently certified, and we obtain term rewriting in the proof assistant. In order to speed up proof-checking of computations involving AC rewriting, we also propose an efficient method for proving equalities modulo AC, based on the syntacticness of AC theories. These results have been integrated into a new ELAN-based rewriting tactic in Coq.
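The trace-and-replay idea of the mixed approach can be illustrated with a toy rewrite system. This Python sketch (the term encoding and all names are assumptions, not the ELAN/Coq code) normalises Peano addition while recording which rule fired at each step, the kind of compact trace a checker could then replay instead of searching for redexes itself:

```python
# Terms: ("0",), ("S", t), ("add", t, u). Two rewrite rules:
#   add-zero:  add 0 u       --> u
#   add-succ:  add (S t) u   --> S (add t u)

def step(term):
    """Apply one head rewrite step, returning (rule_name, new_term) or None."""
    if term[0] == "add":
        lhs = term[1]
        if lhs == ("0",):
            return ("add-zero", term[2])
        if lhs[0] == "S":
            return ("add-succ", ("S", ("add", lhs[1], term[2])))
    return None

def normalise(term, trace):
    """Innermost normalisation, appending the name of each applied rule to `trace`."""
    if term[0] == "S":
        return ("S", normalise(term[1], trace))
    if term[0] == "add":
        term = ("add", normalise(term[1], trace), normalise(term[2], trace))
        r = step(term)
        if r is not None:
            trace.append(r[0])        # record the step so a checker can replay it
            return normalise(r[1], trace)
    return term

# 2 + 2 normalises to 4, leaving behind the rule trace.
two = ("S", ("S", ("0",)))
trace = []
assert normalise(("add", two, two), trace) == ("S", ("S", two))
assert trace == ["add-succ", "add-succ", "add-zero"]
```

Replaying such a trace is cheap: the checker only verifies that each named rule applies at the recorded position, which mirrors why the external engine can stay untrusted.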
Choquer, Marie-Anne. "Preuve de formules conditionnelles dans des spécifications algébriques conditionnelles". Paris 11, 1986. http://www.theses.fr/1986PA112335.
In this thesis, different proof methods for algebraic specifications are outlined. One of them, the induction-narrowing method, is extended to prove conditional formulas in conditional specifications. This method can only be applied to specifications satisfying the property of sufficient completeness with respect to the constructors; this property is studied for conditional specifications.
Gaaloul, Sana. "Interopérabilité basée sur les standards Modelica et composant logiciel pour la simulation énergétique des systèmes de bâtiment". PhD thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00782540.
Texto completo da fonteBaïna, Salah. "INTEROPERABILITE DIRIGEE PAR LES MODELES :Une Approche Orientée Produit pour l'interopérabilité dessystèmes d'entreprise". Phd thesis, Université Henri Poincaré - Nancy I, 2006. http://tel.archives-ouvertes.fr/tel-00123271.
...product models; we call this approach "product-oriented interoperability". We propose a meta-model whose instances act as communication gateways between different enterprise applications, in order to ensure the interoperability of the parts of the systems that concern the product. We formalise this meta-model by defining the product concept as the aggregation of a physical part, representing the physical elements of the product, and an informational part, gathering the informational elements that describe it. The tooling and prototyping of the holonic product concept during the modelling of the manufacturing process in the enterprise, as well as the handling of the mappings for interoperability, rely on the integration of the holonic meta-model into a particular enterprise modelling environment. The validation phase was carried out in two parts, each represented by an industrial application. The first application took place within a collaboration with the milling department of a large industrial group, for a real-world application of holonic modelling; its objective was to design a traceability system for the various products produced by the group's mills. Our participation in this project consisted in applying the holonic modelling approach to specify, a priori, all the data and information related to the products that must be taken into account in the traceability system, and thus to generate automatically the preliminary data schema of the system. The second application concerns the implementation of the holonic approach in a product-oriented interoperability solution within the AIP Lorrain (AIPL) platform.
Jomaa, Narjes. "Le co-design d’un noyau de système d’exploitation et de sa preuve formelle d’isolation". Thesis, Lille 1, 2018. http://www.theses.fr/2018LIL1I075/document.
In this thesis we propose a new kernel concept adapted to verification, which we have called a protokernel. It is a minimal operating system kernel, whose minimal size is motivated by the reduction of both the cost of proof and the attack surface. This leads us to define a new strategy for co-designing the kernel and its proof, based mainly on the feedback between the various steps of development of the kernel, ranging from the definition of its specification to the formal verification of its properties. In this context we have designed and implemented the Pip protokernel. All of its system calls were carefully identified during the design step to ensure both the feasibility of proof and the usability of the system. The code of Pip is written in Gallina (the specification language of the Coq proof assistant) and then automatically translated into C code. The main property studied in this work is a security property expressed in terms of memory isolation, a property discussed at length in the literature due to its importance. Our work thus consists, more particularly, in guiding the developer to define the fundamental concepts of this minimalistic kernel through the formal verification of its isolation property. The verification strategy was first experimented with on a generic microkernel model that we also wrote in Gallina; with this simplified model we were able to validate our verification approach before applying it to the concrete implementation of the Pip protokernel.
Xu, Yigang. "Une architecture de médiation basée sur des composants génériques pour l'intégration des systèmes d'information de santé". Paris 6, 2002. http://www.theses.fr/2002PA066376.
Texto completo da fonteSioud, Esma. "Contribution à l'évaluation de l'interopérabilité sémantique entre systèmes d'information d'entreprises : application aux systèmes d'information de pilotage de la production". Electronic Thesis or Diss., Nancy 1, 2011. http://www.theses.fr/2011NAN10049.
Texto completo da fonteWithin the context of collaborative enterprise information systems, these works aim to propose an approach for assessing the interoperability and the non-interoperability. In fact, the non-interoperability poses a series of challenging problems to the industrial community. Indeed, the non-interoperability leads to significant costs. The majority of these costs are attributable to the time and resources spent to put in place interfaces for exchanging information. This mainly affects enterprise global performance by increasing the cost and the delay to obtain the expected services. We suggest to address enterprise interoperability measurement in order to allow to any enterprise to fully evaluate, a priori, its own capacity to interoperate, and therefore to anticipate possible problems before a partnership. Our works consist in defining indicators and metrics to quantify and then to qualify the interoperability between the enterprise systems and to propose some improvement strategies when the evaluated interoperability level is not sufficient
Diebold, Morgane. "Systèmes composites organogélateurs/polymères semi-conducteurs : de la preuve conceptuelle aux matériaux nanostructurés pour l'électronique plastique". Thesis, Strasbourg, 2018. http://www.theses.fr/2018STRAE002.
Improving the performance of organic photovoltaic devices requires control of the morphology of the active layers. Highly nanostructured donor-acceptor bulk heterojunctions were prepared by heterogeneous nucleation of poly(3-hexylthiophene) (P3HT, donor) on fibers of naphthalene diimide organogelators (NDI, acceptor). The first part of this work was dedicated to the self-assembly of NDI-core organogelators substituted by amide groups and trialkoxyphenyl dendrons. We evaluated the influence of the flexible chain between the naphthalene core and the amide groups (2 C-C bonds for NDI2 and 4 for NDI4) on the physico-chemical properties of the organogelators. The second part of this work focused on the polymorphism of NDI2, with the identification of four different polymorphs together with their optical, spectroscopic, and structural signatures; a phase diagram of NDI2 in the solid state was determined. The last part of this manuscript concerns the fabrication of donor-acceptor nano-composites of NDI organogelators and P3HT. The formation process of these nano-composites in solution was analyzed by following the crystallization kinetics of P3HT by UV-Vis absorption spectroscopy, and the thin-film morphology (shish-kebab structures) by transmission electron microscopy. The nucleating effect of various organogelators on P3HT was demonstrated. Solar cells were made from P3HT:PCBM:organogelator composites, and their energy conversion yield was shown to increase in the presence of organogelators.
Liao, Yongxin. "Annotations sémantiques pour l'intéropérabilité des systèmes dans un environnement PLM". Thesis, Université de Lorraine, 2013. http://www.theses.fr/2013LORR0135/document.
Texto completo da fonteIn manufacturing enterprises, the Product Lifecycle Management (PLM) approach has been considered an essential solution for improving product competitiveness. It aims at providing a shared platform that brings together the different enterprise systems at each stage of a product's life cycle, within or across enterprises. Although the main software companies are making efforts to create tools offering a complete and integrated set of systems, most of them have not implemented all of the systems, and thus do not provide a coherent integration of the entire information system. This results in a kind of "tower of Babel", where each application is an island in an ocean of information managed by many stakeholders in an enterprise, or even in a network of enterprises. The differing peculiarities of those stakeholders further aggravate the interoperability issue. The objective of this thesis is to deal with the issue of semantic interoperability by proposing a formal semantic annotation method to support the mutual understanding of the semantics of the information shared and exchanged in a PLM environment
Belghaouti, Fethi. "Interopérabilité des systèmes distribués produisant des flux de données sémantiques au profit de l'aide à la prise de décision". Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLL003.
Texto completo da fonteInternet is an infinite source of data coming from sources such as social networks or sensors (home automation, smart city, autonomous vehicle, etc.). These heterogeneous and increasingly large data can be managed through semantic web technologies, which propose to homogenize and link these data and reason above them, and through data stream management systems, which mainly address the problems related to volume, volatility and continuous querying. The alliance of these two disciplines has seen the growth of semantic data stream management systems, also called RSP (RDF Stream Processing) systems. The objective of this thesis is to allow these systems, via new approaches and "low cost" algorithms, to remain operational, and even more efficient, for large input data volumes and/or limited system resources. To reach this goal, our thesis is mainly focused on the issue of "processing semantic data streams in a context of computer systems with limited resources". It directly contributes to answering the following research questions: (i) how to represent semantic data streams? and (ii) how to deal with input semantic data when their rates and/or volumes exceed the capabilities of the target system? As a first contribution, we propose an analysis of the data in semantic data streams in order to consider a succession of star graphs instead of just a succession of independent triples, thus preserving the links between the triples. Using this approach, we significantly improved the quality of the responses of some well-known sampling algorithms for load shedding. The analysis of the continuous query allows this solution to be optimized by selecting the irrelevant data to be load-shedded first. In the second contribution, we propose an algorithm for detecting frequent RDF graph patterns in semantic data streams, called FreGraPaD (Frequent RDF Graph Patterns Detection). It is a one-pass, memory-oriented, "low-cost" algorithm. It uses two main data structures: a bit vector to build and identify the RDF graph pattern, thus optimizing memory space; and a hash table for storing the patterns. The third contribution of our thesis consists of a deterministic load-shedding solution for RSP systems, called POL (Pattern Oriented Load-shedding for RDF Stream Processing systems). It uses very low-cost boolean operators, applied to the binary patterns built from the data and the continuous query, in order to determine which data is irrelevant and should be ejected upstream of the system. It guarantees a recall of 100%, reduces the system load and improves response time. Finally, in the fourth contribution, we propose Patorc (Pattern Oriented Compression for RSP systems), an online compression tool for RDF streams. It is based on the frequent patterns present in RDF data streams, which it factorizes. It is a lossless compression solution in which querying remains possible without any need for decompression. This thesis provides solutions that extend existing RSP systems and make them able to scale in a big data context. These solutions allow RSP systems to deal with one or more semantic data streams arriving at different speeds, without losing response quality, while ensuring their availability even beyond their physical limitations. The conducted experiments, supported by the obtained results, show that extending existing systems with the new solutions improves their performance: they illustrate the considerable decrease in the engine's response time, increasing its processing rate threshold while optimizing the use of system resources
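The abstract above describes load shedding driven by binary patterns and boolean operators. As a purely illustrative sketch (not the authors' POL implementation; the predicate vocabulary and encoding are assumptions), the idea can be shown as encoding the predicates of a star graph and of the continuous query as bit vectors, then keeping a graph only if a cheap bitwise AND shows it covers every predicate the query needs:

```python
# Hypothetical illustration of pattern-oriented load shedding in the spirit
# of POL: star graphs and the continuous query are encoded as bit patterns
# over a fixed predicate vocabulary; a bitwise AND decides relevance.
PREDICATES = ["name", "age", "worksFor", "locatedIn"]  # assumed vocabulary

def pattern_of(predicates):
    """Encode a set of predicates as a bit vector (one bit per known predicate)."""
    bits = 0
    for p in predicates:
        if p in PREDICATES:
            bits |= 1 << PREDICATES.index(p)
    return bits

def is_relevant(data_pattern, query_pattern):
    """Keep a star graph only if it covers every predicate the query needs."""
    return data_pattern & query_pattern == query_pattern

query = pattern_of(["name", "worksFor"])
assert is_relevant(pattern_of(["name", "age", "worksFor"]), query)   # kept
assert not is_relevant(pattern_of(["locatedIn"]), query)             # shed upstream
```

Because only integer bitwise operations are involved, such a test can run ahead of the RSP engine at very low cost, which is consistent with the 100% recall claim: a graph lacking a required predicate can never contribute to an answer.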
Azami, Ikram El. "Ingéniérie des Systèmes d'Information Coopératifs, Application aux Systèmes d'Information Hospitaliers". Thesis, Valenciennes, 2012. http://www.theses.fr/2012VALE0013.
Texto completo da fonteIn this thesis, we deal with hospital information systems (HIS) and analyze their design, interoperability and communication issues, with the aim of contributing to the design of a canonical, cooperative and communicative HIS, and of modelling the exchanges between its components as well as with the other systems involved in the management of patients in a healthcare network. We propose a structure and a conceptual model of a canonical HIS based on three main concepts involved in the production of healthcare data, namely the pathological case, the Production Post of Healthcare Data (PPHD) and the medical activity itself. The latter, modelled as a tree, allows better structuring of the care process. Furthermore, with a view to ensuring continuity of care, we provide an XML-based model for exchanging medical data. This model consists of a set of relevant data organized around five categories: patient data, patient history data, medical activity data, medical prescription data and medical record data (images, reports, etc.). Finally, we describe a solution for integrating hospital information systems. The solution is inspired by the engineering of cooperative information systems and consists of a mediation-based architecture structured into three levels: the information systems level, the mediation level and the user level. The architecture offers a modular organization of hospital information systems and helps ensure data, function and workflow integration
Cerqueus, Thomas. "Contributions au problème d’hétérogénéité sémantique dans les systèmes pair-à-pair : application à la recherche d’information". Nantes, 2012. http://archive.bu.univ-nantes.fr/pollux/show.action?id=c40274d8-21a2-4f4a-9892-1367fd993b9c.
Texto completo da fonteWe consider peer-to-peer (P2P) data sharing systems in which each peer is free to choose the ontology that best fits its needs to represent its data. This is what we call semantic heterogeneity. This situation prevents perfect interoperability because queries issued by peers may be misunderstood by other peers. First, we focus on the notion of semantic heterogeneity, as it seems to us a complex notion. We define several measures allowing the semantic heterogeneity of a P2P system to be precisely characterized according to different facets. Second, we define two protocols. The first, called CorDis, reduces the semantic heterogeneity related to disparities between peers: it disseminates correspondences through the system so that peers learn new correspondences. The second protocol, called GoOD-TA, reduces the semantic heterogeneity related to the topology of the system: the goal is to organize it so that semantically close peers are also close in the system. Thus, two peers become neighbours if they use the same ontology or if numerous correspondences exist between their respective ontologies. Third, we propose an algorithm called DiQuESH for the routing and processing of top-k queries in semantically heterogeneous P2P systems. This algorithm allows a peer to retrieve the k most relevant documents from its neighbourhood. We show experimentally that CorDis and GoOD-TA improve the results obtained by DiQuESH
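The abstract mentions retrieving the k most relevant documents from a peer's neighbourhood. A minimal sketch of the aggregation step such a top-k protocol needs (purely illustrative; DiQuESH's actual routing and scoring are not described here, and the scores and document names are invented):

```python
# Illustrative top-k merge: a peer collects scored results from its
# neighbours and keeps only the k highest-scored documents overall.
import heapq

def merge_top_k(neighbour_results, k):
    """neighbour_results: iterable of (score, doc) lists, one per peer."""
    all_hits = [hit for result in neighbour_results for hit in result]
    return heapq.nlargest(k, all_hits)  # highest scores first

peer_a = [(0.9, "doc1"), (0.4, "doc2")]
peer_b = [(0.7, "doc3"), (0.2, "doc4")]
assert merge_top_k([peer_a, peer_b], 2) == [(0.9, "doc1"), (0.7, "doc3")]
```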
Panetto, Hervé. "Meta-modèles et modèles pour l'intégration et l'interopérabilité des applications d'entreprises de production". Habilitation à diriger des recherches, Université Henri Poincaré - Nancy I, 2006. http://tel.archives-ouvertes.fr/tel-00119423.
Texto completo da fonteMore recently, over the last decade, a new software engineering approach has emerged: model-driven engineering. In this approach, models occupy a prominent place among systems engineering artefacts. Indeed, they must be precise enough to be interpreted or transformed by software tools, while being mapped onto systems engineering processes. The central role of models in developing interoperability capabilities contributes notably to this quality, since the engineering processes for application interoperability are based on flows of models.
This habilitation thesis (Habilitation à Diriger des Recherches) focuses on the study of models for systems interoperability, and more particularly for enterprise systems, centred on the representation of products. To contribute to making enterprise models and their interrelations more reliable for the strong integration of applications, the research project proposed in this thesis aims, in a systems engineering context applied to systems of systems, to define, extend and formalize the modelling processes of enterprise systems while taking into account the generally non-explicit knowledge of the actors of these systems. This proof of concept for formalizing model knowledge is based on the definition of ontologies, together with model mapping, transformation and exchange, enabling the integration of enterprise applications and thus ensuring coherent information flows between all actors. This requires making explicit the tacit knowledge held by modellers (and operators) about the processes (and thus the models) for which they are responsible or of which they are authors.
Labéjof, Jonathan. "R-*, Réflexion au Service de l'Évolution des Systèmes de Systèmes". Phd thesis, Université des Sciences et Technologie de Lille - Lille I, 2012. http://tel.archives-ouvertes.fr/tel-00768568.
Texto completo da fonteTursi, Angela. "Ontology-based approach for product-driven interoperability of enterprise production systems". Thesis, Nancy 1, 2009. http://www.theses.fr/2009NAN10086/document.
Texto completo da fonteIn recent years, the interoperability of enterprise applications has become the leitmotiv of developers and designers in systems engineering. Most approaches to interoperability in the enterprise have the primary objective of adjusting and adapting the types and data structures necessary for implementing collaboration between companies. In the field of manufacturing, the product is a central component. Scientific works propose solutions taking into account information systems derived from product technical data throughout the product life cycle, but this information is often uncorrelated. Product Data Management (PDM) is commonly implemented to manage all information concerning products throughout their life cycle. The modelling of management and manufacturing processes is widely applied to both physical products and services. However, these models are generally independent "islands" that ignore the problem of interoperability between the applications supporting them. The objective of this thesis is to study the problem of interoperability applied to applications used in the manufacturing environment, and to define an ontological model of the knowledge that enterprises hold about the products they manufacture, based on technical data and ensuring the interoperability of enterprise systems. The outcome of this research is the formalization of a methodology for identifying a product-centric information system in the form of an ontology, for the interoperability of applications in manufacturing companies, based on existing standards such as ISO 10303 and IEC 62264
Bergougnoux, Quentin. "Co-design et implémentation d’un noyau minimal orienté par sa preuve, et évolution vers les architectures multi-coeur". Thesis, Lille 1, 2019. http://www.theses.fr/2019LIL1I029/document.
Texto completo da fonteDue to the major growth of the Internet of Things and Cloud Computing worlds, security in those systems has become a major issue. Many exploits and attacks happened in the last few years, highlighting the need for strong security guarantees on those systems. Most of the time, a vulnerability in the kernel or one of its modules is enough to compromise the whole system. Establishing and proving security properties through proof assistants seems to be a huge step towards bringing such guarantees. This relies on using mathematical models in order to reason about system behaviour and prove that it remains correct. Still, due to the huge and complex code base of the software running on those systems, especially the kernel, this is a tedious task. Understanding the internals of those kernels, and writing an associated proof of some security property, becomes more and more difficult as the kernel grows in size. In this thesis, I propose a new approach to kernel design, the proto-kernel. By reducing the features provided by the kernel to their most minimal subset, this model, in addition to lowering the attack surface, reduces the cost of the proof effort. It also allows a wide range of systems to be built on top of it, as the minimality of the features embedded in the kernel means the remaining features are built at the userland level. I also provide in this thesis a concrete implementation of this model, the Pip proto-kernel. By providing only the most minimal and mandatory system calls, both the usability of the kernel and the feasibility of the proof are ensured. In order to reduce the model-to-binary transition effort, most of the kernel is written directly in Gallina, the language of the Coq proof assistant, and is automatically converted to compilable C code during the compilation phase.
Pip only relies on a thin hardware abstraction layer written in low-level languages, which provides the operations the model requires, such as modifying the hardware configuration. Moreover, as Internet of Things and Cloud Computing use cases require, I propose extensions of Pip's model in order to support multicore hardware. Backed by real implementations, those models bring the Pip proto-kernel to multicore architectures, providing strong security guarantees in those modern environments. Finally, I validate my approach and its implementation through benchmarks and a Linux kernel port proof of concept, displaying the flexibility of the Pip proto-kernel in real-world environments
Rouvoy, Romain. "Une démarche à granularité extrêmement fine pour la construction de canevas intergiciels hautement adaptables : applications aux services de transactions". Lille 1, 2006. https://ori-nuxeo.univ-lille1.fr/nuxeo/site/esupversions/7959a934-a6e5-48b2-866f-2bbc36e3159b.
Texto completo da fonteBelaud, Jean-Pierre. "Architecture et technologies des systèmes logiciels ouverts : CAPE-OPEN, un standard pour l'interopérabilité et l'intégration des composants logiciels de l'ingénierie des procédés". Toulouse, INPT, 2002. http://www.theses.fr/2002INPT026G.
Texto completo da fonteRezgui, Abir. "Interopérabilité de modèles dans le cycle de conception des systèmes électromagnétiques via des supports complémentaires : VHDL-AMS et composants logiciels ICAr". Phd thesis, Université de Grenoble, 2012. http://tel.archives-ouvertes.fr/tel-00858315.
Texto completo da fonteViard, Louis. "Méthodes et outils pour la programmation des systèmes cyber-physiques". Electronic Thesis or Diss., Université de Lorraine, 2021. http://www.theses.fr/2021LORR0105.
Texto completo da fonteBuilding cyber-physical systems is an up-and-coming discipline involving many engineering domains. Cyber-physical systems have a controller monitoring their physical behaviour, resulting in intertwined discrete and continuous evolution. Faulty programs or environmental hazards might lead to unwanted control and disastrous consequences. Safe operation of cyber-physical systems requires dedicated attention to their programming. Our work attempts to provide a solution to this challenge. We present a domain-specific language for programming cyber-physical systems, Sophrosyne, as well as a formal method to verify the correctness of the resulting missions. The language is based on monitoring control structures, which provide reactive behaviours to the system. It furthermore includes continuous modelling of the system with differential equations, enabling verification of missions using differential dynamic logic. Various software tools have been built to provide Sophrosyne with mission planning, compilation, analysis and execution. Together they form a complete toolchain, from a graphical user interface supporting the definition of a mission to its execution on the real system. These tools have been used to define aerial inspections of infrastructure with unmanned aircraft. We demonstrate our contribution on such applications
Régis-Gianas, Yann. "Des types aux assertions logiques : preuve automatique ou assistée de propriétés sur les programmes fonctionnels". Paris 7, 2007. http://www.theses.fr/2007PA077155.
Texto completo da fonteThis work studies two approaches to improving the safety of computer programs using static analysis. The first one is typing, which guarantees that the evaluation of a program cannot fail. The functional language ML has a very rich type system and an algorithm that infers types automatically. We focus on its adaptation to generalized algebraic data types (GADTs). In this setting, efficient computation of a most general type is impossible. We propose a stratification of the language that retains the usual characteristics of the ML fragment and makes the use of GADTs explicit. The resulting language, MLGX, entails a burden on the programmer, who must annotate programs heavily. A second stratum, MLGI, offers a mechanism to infer most of the type annotations locally, in a predictable and efficient but incomplete way. The first part ends with an illustration of the expressiveness of GADTs for encoding the invariants of the pushdown automata used in LR parsing. The second approach augments the language with logic assertions that enable arbitrarily complex specifications to be expressed. We check the compliance of the program semantics with respect to these specifications thanks to a method called Hoare logic and to semi-automatic computer-based proofs. The design choices permit handling first-class functions. They are guided by an implementation, which is illustrated by the certification of a module of trees denoting finite sets
Motie, Yassine. "Interopérabilité entre dispositifs hétérogènes en environnement ouvert pour la mise en oeuvre de co-simulation". Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30102.
Texto completo da fonteThe large number of electronic device features we use on a daily basis marks a shift from old multifunction machines to devices widely distributed in the environment. A system is an integrated set of connected and interrelated elements (products, people, processes) satisfying, in a given environment, one or more defined objectives, and it is characterized by the components that constitute it, the relations between these components, its environment, the constraints it undergoes, and its evolution over time. The combination of these characteristics leads us to qualify some systems as complex, due to the heterogeneity of their components, their evolution at various time scales and their geographical distribution, integrating digital systems with physical and/or human operators in the loop. The difficulty of maintaining a good vision of a complex system (mixing real and simulated devices) and the probability of significant design errors lead us to consider the ability to specify the product and verify the design using a virtual prototype; this is what we call simulation. When a complex system requires the use of different components specified by different designers working in different domains, the number of virtual prototypes increases greatly. These components unfortunately tend to remain too independent of each other, preventing the designers from collaborating and their systems from being interconnected in order to fulfil tasks that none of these elements could accomplish alone. A need for communication and cooperation emerges, which must take the different actors into account and coordinate their interactions within this complex system. Yet advances in simulation in each area are considerable, each with its own software.
Interoperability solutions are therefore necessary for the implementation of a co-simulation encouraging dialogue between disciplines and reducing errors, cost and development time. In this thesis, we participate in the design of a co-simulation system that integrates different domain-specific simulation tools based on modelling device behaviour, such as energy simulation and the simulation of wear of building materials, within the same platform. After taking into account the concepts of architecture, communication (between simulators or with users) and visualization to define architecture models, we analyze the architecture that manages interoperability. We propose an interoperability approach based on the reuse and exchange of computational components. We successively address the issues related to the structural and semantic levels of interoperability, co-simulation strategies, and design methods for the task model allowing the construction of black-box components. We then present the concrete implementation of our global design methodology and a tool for verifying certain properties of the architecture, such as coherence and semantics
El, Hami Norelislam. "Contribution aux méthodes hybrides d'optimisation heuristique : Distribution et application à l'interopérabilité des systèmes d'information". Phd thesis, INSA de Rouen, 2012. http://tel.archives-ouvertes.fr/tel-00771360.
Texto completo da fonteGrieu, Jérôme. "Analyse et évaluation de techniques de commutation Ethernet pour l'interconnexion des systèmes avioniques". Phd thesis, Toulouse, INPT, 2004. http://oatao.univ-toulouse.fr/7385/1/grieu.pdf.
Texto completo da fonteKan, Channmeta. "Le rôle du juge dans l'administration de la preuve : étude comparée des grands systèmes dans les différents pays de l'ASEAN". Lyon 3, 2007. https://scd-resnum.univ-lyon3.fr/in/theses/2007_in_kan_c.pdf.
Texto completo da fonteSingh, Neeraj Kumar. "Fiabilité et sûreté des systèmes informatiques critiques". Thesis, Nancy 1, 2011. http://www.theses.fr/2011NAN10129/document.
Texto completo da fonteSoftware systems are pervasive in all walks of life and have become an essential part of our daily lives. Information technology is one major area that provides powerful and adaptable opportunities for innovation, and it seems boundless. However, systems developed using computer-based logic have produced disappointing results: according to stakeholders, they are unreliable, at times dangerous, and fail to provide the desired outcomes. The most significant reason for system failures is poor development practice, due to the complex nature of modern software and the lack of an adequate and proper understanding of it. Software development provides a framework for simplifying a complex system in order to obtain a better understanding and to develop higher-fidelity systems at lower cost. Highly embedded critical systems, in areas such as automation, medical surveillance and avionics, are susceptible to errors which can lead to grave consequences in case of failure. This thesis intends to contribute to furthering the use of formal techniques for the development of computing systems with high integrity. Specifically, it addresses the fact that formal methods are not well integrated into established critical-systems development processes, by defining a new development life cycle and a set of associated techniques and tools to develop highly critical systems using formal techniques, from requirements analysis to automatic source code generation, through several intermediate layers and with a rigorous safety assessment approach. The approach has been realised using the Event-B formalism. This thesis has two main parts: techniques and tools, and case studies. The techniques and tools part consists of a development life-cycle methodology, a framework for a real-time animator, refinement charts, a set of automatic code generation tools, and a formal-logic-based heart model for closed-loop modelling.
The new development methodology and its associated techniques and tools are used for developing critical systems from requirements analysis to code implementation, where verification and validation tasks serve as intermediate layers providing a correct formal model with the desired system behaviour at the concrete level. Introducing new tools helps to verify desired properties which remain hidden at the early stages of system development. We also critically evaluate the proposed development methodology and the developed techniques and tools through case studies in the medical and automotive domains. In addition, this work addresses the formal representation of medical protocols, which is useful for improving existing medical protocols. We have fully formalised a real-world medical protocol (ECG interpretation) to analyse whether the formalisation complies with certain medically relevant protocol properties. The formal verification process discovered a number of anomalies in the existing protocols. We also discovered a hierarchical structure for ECG interpretation that helps to find a set of conditions that can be very helpful for diagnosing particular diseases at an early stage. The main objective of the developed formalism is to test the correctness and consistency of the medical protocol
Radhouani, Amira. "Méthodes formelles pour l'extraction d'attaques internes des Systèmes d'Information". Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAM025/document.
Texto completo da fonteThe early detection of potential threats during the modelling phase of a Secure Information System (IS) is required because it favours the design of a robust access control policy and the prevention of malicious behaviours during system execution. This involves studying the validation of access control rules and performing automated vulnerability checks before the IS operationalization. We are particularly interested in detecting vulnerabilities that can be exploited by trusted internal users to commit attacks, called insider attacks, by taking advantage of their legitimate access to the system. To do so, we use formal B specifications generated by the B4MSecure platform from UML functional models and a SecureUML modelling of role-based access control rules. Since these vulnerabilities are due to the dynamic evolution of the functional state, we propose to study the reachability of some undesirable states starting from a normal state of the system. The proposed techniques are an alternative to model-checking techniques: they implement a symbolic backward search algorithm based on two complementary approaches, proof and constraint solving. This rich technical background allowed the development of the GenISIS tool, which automates our approach and was successfully experimented on several case studies available in the literature. These experiments showed its capability to extract already-published attacks but also new ones
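The backward reachability question described above can be illustrated on a toy, explicit-state model (purely a sketch: GenISIS works symbolically on B specifications, whereas the states and transitions below are invented stand-ins):

```python
# Illustrative backward search: starting from undesirable ("bad") states,
# walk the transition relation backwards; an insider attack exists if the
# explored set ever meets an initial (normal) state.
def backward_reachable(transitions, bad_states, initial_states):
    """transitions: set of (src, dst) pairs; returns True if some bad
    state is reachable from some initial state."""
    frontier, seen = set(bad_states), set(bad_states)
    while frontier:
        # predecessors of the current frontier not yet explored
        frontier = {src for (src, dst) in transitions if dst in frontier} - seen
        seen |= frontier
    return bool(seen & set(initial_states))

# toy access-control evolution: normal -> escalated -> breach
transitions = {("normal", "escalated"), ("escalated", "breach"), ("other", "normal")}
assert backward_reachable(transitions, {"breach"}, {"normal"})      # attack path found
assert not backward_reachable(transitions, {"breach"}, {"isolated"})  # no path
```

Working backwards from the bad states rather than forwards from the initial ones is what makes the proof/constraint-solving combination natural: each backward step asks whether a predecessor state satisfying the rules can exist.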
Liao, Yongxin. "Annotations sémantiques pour l'intéropérabilité des systèmes dans un environnement PLM". Electronic Thesis or Diss., Université de Lorraine, 2013. http://www.theses.fr/2013LORR0135.
Texto completo da fonte
In manufacturing enterprises, the Product Lifecycle Management (PLM) approach is considered an essential means of improving product competitiveness. It aims at providing a shared platform that brings together the different enterprise systems at each stage of a product's life cycle, within or across enterprises. Although the main software companies are making efforts to create tools offering a complete and integrated set of systems, most of them have not implemented all of these systems and therefore do not provide a coherent integration of the entire information system. This results in a kind of "tower of Babel", where each application is an island in an ocean of information managed by many stakeholders within an enterprise, or even within a network of enterprises. The heterogeneity of these stakeholders further aggravates the interoperability problem. The objective of this thesis is to address the issue of semantic interoperability by proposing a formal semantic annotation method to support the mutual understanding of the semantics of the information shared and exchanged in a PLM environment.
Boudjani, Nadira. "Aide à la construction et l'évaluation des preuves mathématiques déductives par les systèmes d'argumentation". Thesis, Montpellier, 2018. http://www.theses.fr/2018MONTS060/document.
Texto completo da fonte
Learning deductive proofs is fundamental to mathematics education. Yet many students have difficulty understanding and writing deductive mathematical proofs, which has severe consequences for problem solving, as highlighted by several studies. According to a recent TIMSS study (Trends in International Mathematics and Science Study), students' level in mathematics is falling: students struggle to understand mathematics and, more precisely, to build and structure mathematical proofs. To tackle this problem, several approaches in mathematics didactics have adopted a social approach in which students engage in classroom debate and use argumentation to build proofs. The term "argumentation" refers here to the use of informal discussion in the classroom, allowing students to publicly express claims and justify them in order to build a proof of a given problem. The underlying hypotheses are that argumentation: (i) enhances critical thinking and meta-cognitive skills such as self-monitoring and self-assessment; (ii) increases students' motivation through social interaction; and (iii) fosters learning among students. From the instructors' point of view, these approaches make assessment difficult: evaluating the outcome, which includes not only the final proof but also all intermediate steps and aborted attempts, introduces a significant work overhead. In this thesis, we propose a system for constructing and evaluating deductive mathematical proofs. The system has a twofold objective: (i) allow students to build deductive mathematical proofs through structured argumentative debate; (ii) help instructors evaluate these proofs and assess all intermediate steps in order to identify misconceptions and provide constructive feedback to students.
The system provides students with a structured framework for debating during the construction of proofs, based on the argumentation frameworks proposed in artificial intelligence. These argumentation frameworks are also used to analyse the debate and to represent its results in different forms, in order to ease evaluation for instructors. The system has been implemented and evaluated experimentally, with students constructing deductive proofs and instructors evaluating them.
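The argumentation frameworks mentioned above can be illustrated with a minimal Dung-style abstract framework and its grounded extension, the kind of formal object such debate-analysis systems build on. This is only a sketch; the argument names and the debate scenario are hypothetical, not taken from the thesis.

```python
# Illustrative sketch of a Dung abstract argumentation framework:
# arguments plus an attack relation, and the grounded extension
# computed as the least fixpoint of the characteristic function.

def grounded_extension(arguments, attacks):
    """Compute the grounded extension of (arguments, attacks).

    `attacks` is a set of (attacker, attacked) pairs.
    """
    extension = set()
    while True:
        # An argument is acceptable w.r.t. `extension` if every one
        # of its attackers is attacked by some argument in `extension`.
        acceptable = {
            a for a in arguments
            if all(any((d, b) in attacks for d in extension)
                   for (b, c) in attacks if c == a)
        }
        if acceptable == extension:
            return extension
        extension = acceptable

# Debate over a proof step: step s1 is attacked by objection o,
# which is in turn refuted by counterargument r.
args = {"s1", "o", "r"}
atts = {("o", "s1"), ("r", "o")}
print(sorted(grounded_extension(args, atts)))  # ['r', 's1']
```

Here the grounded extension accepts the proof step s1 because its only objection o is itself defeated, mirroring how a structured debate can validate or invalidate intermediate proof steps.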