
Dissertations / Theses on the topic 'Formal Modeling'


Consult the top 50 dissertations / theses for your research on the topic 'Formal Modeling.'


1

Shaw, Kevin B. "Curated Reasoning by Formal Modeling of Provenance." ScholarWorks@UNO, 2013. http://scholarworks.uno.edu/td/1782.

Abstract:
The core problem addressed in this research is the current inability to repurpose and curate scientific data among interdisciplinary scientists within a research enterprise environment. Explosive growth in sensor technology, together with the cost of collecting ocean data and airborne measurements, has produced exponential increases in scientific data collection and in the enterprise resources required for it. There is currently no framework for efficiently curating this scientific data for repurposing or intergenerational use. Several factors have kept this problem unsolved to date: competitive requirements for funding and publication; the multiple vocabularies used across scientific disciplines; the number of disciplines and the variation among their workflow processes; the lack of a framework flexible enough to accommodate diverse vocabularies and data while unifying their exploitation; and, largely a past constraint, the lack of affordable computing resources. Sharing scientific data among interdisciplinary scientists is an exceptionally challenging problem: vocabularies must be combined, the provenance of the associated scientific data must be maintained, any additional workload on the originating scientist's project and time must be minimized, publication credit must be protected to reward scientific creativity, and priority must be secured for a long-term goal, intergenerational and interdisciplinary curation, that likely offers the greatest potential for high-impact discoveries in the future. This research therefore focuses on the core technical problem of formally modeling interdisciplinary scientific data provenance as the enabling, and currently missing, component needed to demonstrate the potential of interdisciplinary scientific data repurposing.
This research develops a framework that combines varying vocabularies in a formal manner, allowing the provenance information to be used as a key for reasoning and thus for manageable curation. It pioneers an approach of formally modeling provenance within an interdisciplinary research enterprise, demonstrating that intergenerational curation can be aided at the machine level so that reasoning and repurposing occur with minimal impact on data collectors and maximum benefit to other scientists.
2

Lisowski, Matthew A. "Development of a target recognition system using formal and semi-formal software modeling methods." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2000. http://handle.dtic.mil/100.2/ADA386925.

Abstract:
Thesis (M.S. in Software Engineering), Naval Postgraduate School, December 2000. Thesis advisors: Neil Rowe, Man-Tak Shing. Includes bibliographical references (p. 101-102). Also available in print.
3

Sidorowicz, Piotr Roald. "A formal framework for modeling and testing memories." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0028/NQ51227.pdf.

4

Wathugala, Wathugala Gamage Dulan Manujinda. "Formal Modeling Can Improve Smart Transportation Algorithm Development." Thesis, University of Oregon, 2017. http://hdl.handle.net/1794/22608.

Abstract:
201 pages.
Ensuring algorithms work accurately is crucial, especially when they drive safety-critical systems like self-driving cars. We formally model a published distributed algorithm for autonomous vehicles to collaborate and pass through an intersection. Models are built and validated using the Labelled Transition System Analyser (LTSA). Our models reveal situations leading to deadlocks and crashes in the algorithm. We demonstrate two approaches to gaining insight into a large and complex system without modeling the entire system: modeling a subsystem (if the subsystem has issues, so does the full system) and modeling a fast-forwarded state (which reveals problems that can arise later in a process). Some productivity tools developed for distributed system development are also presented. Manulator, our distributed system simulator, enables quick prototyping and debugging on a single workstation. LTSA-O, an extension to LTSA, listens to the messages exchanged in an execution of a distributed system and validates them against a model.
5

Park, Hoon. "Formal Modeling and Verification of Delay-Insensitive Circuits." PDXScholar, 2015. https://pdxscholar.library.pdx.edu/open_access_etds/2639.

Abstract:
Einstein's relativity theory tells us that the notion of simultaneity can only be approximated for events distributed over space. As a result, the use of asynchronous techniques is unavoidable in systems larger than a certain physical size. Traditional design techniques that use global clocks face this barrier of scale already within the space of a modern microprocessor chip. The most common response by the chip industry for overcoming this barrier is to use Globally Asynchronous Locally Synchronous (GALS) design techniques. The circuits investigated in this thesis can be viewed as examples of GALS design. To make such designs trustworthy it is necessary to model formally the relative signal delays and timing requirements that make these designs work correctly. With trustworthy asynchrony one can build reliable, large, and scalable systems, and exploit the lower power and higher speed features of asynchrony. This research presents ARCtimer, a framework for modeling, generating, verifying, and enforcing timing constraints for individual self-timed handshake components that use bounded-bundled-data handshake protocols. The constraints guarantee that the component's gate-level circuit implementation obeys the component's handshake protocol specification. Because the handshake protocols are delay insensitive, self-timed systems built using ARCtimer-verified components can be made delay insensitive. Any delay sensitivity inside a component is detected and repaired by ARCtimer. In short: by carefully considering time locally, we can ignore time globally. ARCtimer applies early in the design process as part of building a library of verified components for later system use. The library also stores static timing analysis (STA) code to validate and enforce the component's constraints in any self-timed system built using the library. 
The library descriptions of a handshake component's circuit, protocol, timing constraints, and STA code are robust to circuit modifications applied later in the design process by technology mapping or layout tools. New contributions of ARCtimer include: 1. Upfront modeling on a component-by-component basis to reduce the validation effort required to (a) reimplement components in different technologies, (b) assemble components into systems, and (c) guarantee system-level timing closure. 2. Modeling of bounded-bundled-data timing constraints that permit the control signals to lead or lag behind the data signals to optimize system timing.
6

Kühnberger, Kai-Uwe. "Formal frameworks for circular phenomena possibilities of modeling pathological expressions in formal and natural languages /." [S.l. : s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=964198576.

7

Smirnov, Oleg. "Formal evolutionary modeling and the problems of political science /." view abstract or download file of text, 2005. http://wwwlib.umi.com/cr/uoregon/fullcit?p3190550.

Abstract:
Thesis (Ph.D.), University of Oregon, 2005. Typescript. Includes vita and abstract. Includes bibliographical references (leaves 113-131). Also available for download via the World Wide Web; free to University of Oregon users.
8

Jacobs, Petrus Jacobus. "A formal refinement framework for the systems modeling language." Thesis, University of Oxford, 2015. https://ora.ox.ac.uk/objects/uuid:8be42735-8a31-41e2-82e2-05f7d0e6cb1a.

Abstract:
The Systems Modeling Language (SysML), an extension of a subset of the Unified Modeling Language (UML), is a visual modelling language for systems engineering applications. At present, the semi-formal SysML, which is widely utilised for the design of complex heterogeneous systems, lacks integration with other more formal approaches. In this thesis, we describe how Communicating Sequential Processes (CSP) and its associated refinement checker, Failures Divergences Refinement (FDR), may be used to underpin an approach that facilitates the refinement checking of the behavioural consistency of SysML diagrams. We do so by utilising CSP as a semantic domain for reasoning about SysML behavioural aspects: activities, state machines and interactions are given a formal process-algebraic semantics. These behaviours execute within the context of the structural diagrams to which they relate, and this is reflected in the CSP descriptions that depict their characteristic patterns of interaction. The resulting abstraction gives rise to a framework that enables the formal treatment of integrated behaviours via refinement checking. In SysML, requirement diagrams allow for the allocation of behavioural features in order to present a more detailed description of a captured requirement. Moreover, we demonstrate that, by providing a common basis for behaviours and requirements, the approach supports requirements traceability: SysML requirements are amenable to formal verification using FDR. In addition, the proposed framework is able to detect inconsistencies that arise due to the multi-view nature of SysML. We illustrate and validate the contribution by applying our methodology to a safety critical system of moderate size and complexity.
9

Haur, Imane. "AUTOSAR compliant multi-core RTOS formal modeling and verification." Electronic Thesis or Diss., Ecole centrale de Nantes, 2022. http://www.theses.fr/2022ECDN0057.

Abstract:
Formal verification is a solution for increasing the reliability of a system's implementation. In this thesis work, we are interested in using these methods to verify multi-core real-time operating systems (RTOS). We propose a model-checking approach using time Petri nets extended with colored transitions and high-level features. We first define this formalism and show its suitability for modeling real-time concurrent systems. We then use it to model the Trampoline multi-core RTOS, compliant with the OSEK/VDX and AUTOSAR standards, and verify by model checking its conformity with the AUTOSAR standard. From this model, we can verify properties of both the OS and the application, such as the schedulability of a real-time system and the synchronization mechanisms: concurrent access to the data structures of the OS, multi-core scheduling, and inter-core interrupt handling. As an illustration, this method allowed the automatic identification of two possible errors in the Trampoline OS under concurrent execution, showing insufficient data protection and faulty synchronization.
10

Liu, Su. "Formal Modeling and Analysis Techniques for High Level Petri Nets." FIU Digital Commons, 2014. http://digitalcommons.fiu.edu/etd/1522.

Abstract:
Petri nets are a formal, graphical, and executable modeling technique for the specification and analysis of concurrent and distributed systems, and have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows, but not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs result in compact system models that are easier to understand, and they are therefore more useful for modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework supported by a tool. For modeling, the framework integrates two formal languages: a type of HLPN called Predicate Transition Nets (PrT Nets) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main modeling contribution of this dissertation is a software tool that supports the formal modeling capabilities of this framework. For analysis, the framework combines three complementary techniques: simulation, explicit-state model checking, and bounded model checking (BMC). Simulation is a straightforward and speedy method, but it covers only some execution paths of an HLPN model. Explicit-state model checking covers all execution paths but suffers from the state explosion problem.
BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit-state model checking. The main analysis contribution of this dissertation is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the formal analysis capabilities of this framework. The SAMTools suite developed for this framework integrates three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
11

Linck, Ricardo Ramos. "Conceptual modeling of formal and material relations applied to ontologies." Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/108626.

Abstract:
Ontologies represent a shared conceptualization of a knowledge community. They are built from descriptions of the meanings of concepts, expressed through their attributes and their relationships. Concepts refer to the object of conceptualization, the universe of discourse; they are characterized by their attributes and domains of possible values. Relationships describe how the concepts are structured in the world. In ontologies all concepts are hierarchically defined, but there are other relationships that are definitional, giving identity to the concepts and meaning to the world. In addition to the subsumption relationships that build taxonomies of concepts, other formal and material relations assist in structuring the domain and in conceptual definition. Modeling tools, however, are still deficient in differentiating the various types of formal and material relationships so as to enable automated reasoning. In particular, mereological and partonomic relationships lack implementation options that would allow the semantic potential of the modeling to be exploited. This research project takes as its starting point a study of the literature on ontologies and relations, especially formal and material relations, including mereological and partonomic relations, reviewing the principles found in ontologies. Furthermore, we identify the theoretical foundations of the relations and analyze the application of these relational concepts to the main foundational ontologies in use today. Building on the proposals raised, this work then proposes an alternative for the conceptual modeling of these relations in a visual domain ontology.
This alternative has been made available in the ontology-building tool of the Obaitá Project, which is under development by the Intelligent Databases Research Group (BDI) at UFRGS.
12

Khlifi, Oussama [Verfasser]. "Modeling and formal verification of probabilistic reconfigurable systems / Oussama Khlifi." Saarbrücken : Saarländische Universitäts- und Landesbibliothek, 2020. http://d-nb.info/1221129384/34.

13

Moustafa, Iman Saleh. "Formal Specification and Verification of Data-Centric Web Services." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/26294.

Abstract:
In this thesis, we develop and evaluate a formal model and contracting framework for data-centric Web services. The central component of our framework is a formal specification of a common Create-Read-Update-Delete (CRUD) data store. We show how this model can be used in the formal specification and verification of both basic and transactional Web service compositions. We demonstrate through both formal proofs and empirical evaluations that our proposed framework significantly decreases ambiguity about a service, enhances its reuse, and facilitates detection of errors in service-based implementations. Web Services are reusable software components that make use of standardized interfaces to enable loosely-coupled business-to-business and customer-to-business interactions over the Web. In such environments, service consumers depend heavily on the service interface specification to discover, invoke, and synthesize services over the Web. Data-centric Web services are services whose behavior is determined by their interactions with a repository of stored data. A major challenge in this domain is interpreting the data that must be marshaled between consumer and producer systems. While the Web Services Description Language (WSDL) is currently the de facto standard for Web services, it only specifies a service operation in terms of its syntactical inputs and outputs; it does not provide a means for specifying the underlying data model, nor does it specify how a service invocation affects the data. The lack of data specification potentially leads to erroneous use of the service by a consumer. In this work, we propose a formal contract for data-centric Web services. The goal is to formally and unambiguously specify the service behavior in terms of its underlying data model and data interactions. 
We address the specification of a single service, of a flow of services interacting with a single data store, and of distributed transactions involving multiple Web services interacting with different autonomous data stores. We use the proposed formal contract to decrease ambiguity about a service's behavior, to fully verify a composition of services, and to guarantee correctness and data integrity properties within a transactional composition of services.
14

Горбачев, В. А. "Malicious Hardware: characteristics, classification and formal models." Thesis, IEEE, 2014. http://openarchive.nure.ua/handle/document/3435.

Abstract:
Electronic systems (ES) that contain embedded malicious hardware (MH) represent a serious threat, especially for government, aeronautic, financial, and energy system applications. MH can be implemented as hardware modifications to application-specific ICs (ASICs), microprocessors, and digital signal processors, or as IP core modifications for field-programmable gate arrays (FPGAs) [1]. It is able to turn off the CPU, send confidential information, and bypass software user-authentication mechanisms. This type of threat has some important characteristics: standard testing methods, such as common functional verification and Automatic Test Pattern Generation (ATPG), cannot always solve the problem of detecting MH [2], [3]; identifying the threat sources without special tools is practically impossible; and even when an information security violation is detected, it is very difficult to prove that the action was performed by MH. These and other features make MH a very attractive embedded mechanism for planned electronic terrorism. Detection and prevention approaches are therefore at the centre of attention of IT systems security research.
15

Zobair, Md Hasan. "Modeling and formal verification of a telecom system block using MDGs." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ59312.pdf.

16

Pino, Lou. "A formal method for modeling and analysis of requirements for software /." Thesis, McGill University, 1993. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=69740.

Abstract:
Requirements that are well understood by client and provider are a major contributor to developing and supporting reliable, quality software on time and within budget. This thesis has two thrusts to facilitate improved interpretation of requirements: (1) a requirements model and (2) a new formalism called LaP, with automated tools, to express and analyze requirements. The new formalism is based on the integration of an algebraic language, Larch, and an extended finite-state-machine language, Promela. Larch comes with a theorem prover (the Larch Prover), and Promela comes with a tool (SPIN) to aid in the validation of dynamic properties. The objective of LaP is to express the control- and data-intensive aspects of requirements. The two thrusts are demonstrated by building a requirements model for real telecommunications requirements that call for a system to manage the access configurations of users' accounts in telecommunications equipment.
17

Carvalho, Fabiano Costa. "On the design of integrated modular avionics assisted by formal modeling." Instituto Tecnológico de Aeronáutica, 2009. http://www.bd.bibl.ita.br/tde_busca/arquivo.php?codArquivo=1218.

Abstract:
Avionics system manufacturers are currently facing the problem of developing highly integrated systems under economic pressure. In this scenario, the empirical approach, characterized by trial-and-error techniques, is not adequate, since the correction of design flaws is often tied to expensive rework and schedule overruns. The evolution of airborne systems toward Integrated Modular Avionics (IMA) pushes the need for advanced methods that can enforce the correctness of complex designs while minimizing the chances of introducing errors. Considering this problem, this work proposes a systematic conceptual design strategy based on formal methods, aiming to improve development processes for IMA systems. The basic idea is to concentrate effort on the construction, simulation, and formal analysis of a mathematical model of the new system in the early phases of the development lifecycle. The proposed approach was exercised on a case study from a practical avionics project in order to evaluate its drawbacks and advantages. The results suggest that this work can contribute to the aeronautics industry by offering alternative means of coping with complexity in modern avionics projects.
18

Widel, Wojciech. "Formal modeling and quantitative analysis of security using attack- defense trees." Thesis, Rennes, INSA, 2019. http://www.theses.fr/2019ISAR0019.

Abstract:
Risk analysis is a very complex process. It requires rigorous representation and in-depth assessment of threats and countermeasures. This thesis focuses on the formal modelling of security using attack and defence trees, which are used to represent and quantify potential attacks in order to better understand the security issues the analyzed system may face, and thus to guide an expert in the choice of countermeasures to implement. The main contributions of this thesis are as follows: (1) the enrichment of the attack and defence tree model to allow the analysis of real security scenarios; in particular, we developed the theoretical foundations and quantitative evaluation algorithms for a model in which an attacker's action can contribute to several attacks and a countermeasure can prevent several threats; (2) the development of a methodology based on Pareto dominance that allows several quantitative aspects (e.g., cost, time, probability, difficulty) to be taken into account simultaneously during a risk analysis; and (3) the design of a technique, using linear programming methods, for selecting an optimal set of countermeasures given the budget available for protecting the analyzed system; this generic technique can be applied to several optimization problems, for example maximizing attack surface coverage or maximizing the investment required of the attacker. To ensure their practical applicability, the model and the mathematical algorithms developed were implemented in a freely available open-source tool. All the results were also validated in a practical study of an industrial scenario involving the tampering of electricity consumption meters.
APA, Harvard, Vancouver, ISO, and other styles
19

Sakib, Ashiq Adnan. "Formal Modeling and Verification Methodologies for Quasi-Delay Insensitive Asynchronous Circuits." Diss., North Dakota State University, 2019. https://hdl.handle.net/10365/29896.

Full text
Abstract:
Pre-Charge Half Buffers (PCHB) and NULL Convention Logic (NCL) are two major commercially successful Quasi-Delay Insensitive (QDI) asynchronous paradigms, known for their low-power performance and inherent robustness. In industry, QDI circuits are synthesized from their synchronous counterparts using custom synthesis tools. Validation of the synthesized QDI implementation is a critical design prerequisite before fabrication. At present, validation schemes are mostly based on extensive simulation, which is good enough to detect shallow bugs but may fail to detect corner-case bugs. Hence, the development of formal verification methods for QDI circuits has long been desired. The very few formal verification methods that exist in this field have major limiting factors. This dissertation presents different formal verification methodologies applicable to PCHB and NCL circuits, and aims at addressing the limitations of previous verification approaches. The developed methodologies can guarantee both safety (full functional correctness) and liveness (absence of deadlock), and are demonstrated using several increasingly larger sequential and combinational PCHB and NCL circuits, along with various ISCAS benchmarks.<br>National Science Foundation (Grant No. CCF-1717420)
APA, Harvard, Vancouver, ISO, and other styles
20

VanValkenburg, MaryAnn E. "Alloy-Guided Verification of Cooperative Autonomous Driving Behavior." Digital WPI, 2020. https://digitalcommons.wpi.edu/etd-theses/1354.

Full text
Abstract:
Alloy is a lightweight formal modeling tool that generates instances of a software specification to check properties of the design. This work demonstrates the use of Alloy for the rapid development of autonomous vehicle driving protocols. We contribute two driving protocols: a Normal protocol that represents the unpredictable yet safe driving behavior of typical human drivers, and a Connected protocol that employs connected technology for cooperative autonomous driving. Using five properties that define safe and productive driving actions, we analyze the performance of our protocols in mixed traffic. Lightweight formal modeling is a valuable way to reason about driving protocols early in the development process because it can automate the checking of safety and productivity properties and prevent costly design flaws.
APA, Harvard, Vancouver, ISO, and other styles
21

Chrszon, Philipp, Clemens Dubslaff, Christel Baier, Joachim Klein, and Sascha Klüppelholz. "Modeling Role-Based Systems with Exogenous Coordination." Springer, 2016. https://tud.qucosa.de/id/qucosa%3A70791.

Full text
Abstract:
The concept of roles is a promising approach to cope with context dependency and adaptivity of modern software systems. While roles have been investigated in conceptual modeling, programming languages and multi-agent systems, they have been given little consideration within component-based systems. In this paper, we propose a hierarchical role-based approach for modeling relationships and collaborations between components. In particular, we consider the channel-based, exogenous coordination language Reo and discuss possible realizations of roles and related concepts. The static requirements on the binding of roles are modeled by rule sets expressed in many-sorted second-order logic and annotations on the Reo networks for role binding, context and collaborations, while Reo connectors are used to model the coordination of runtime role playing. The ideas presented in this paper may serve as a basis for the formalization and formal analysis of role-based software systems.
APA, Harvard, Vancouver, ISO, and other styles
22

Charfi, Leila. "Formal modeling and test generation automation with Use Case Maps and LOTOS." Thesis, University of Ottawa (Canada), 2001. http://hdl.handle.net/10393/9138.

Full text
Abstract:
This thesis addresses the problem of formal modelling and test generation from system requirements represented in the form of Use Case Maps. In the first part of the thesis, we present an existing development methodology based on Use Case Maps for the design of the requirements and on LOTOS and SDL for the formal modeling of telecommunication systems. We follow this methodology for the formal specification and validation of a telephony system using LOTOS. In the second part of the thesis, we develop a method, called Ucm2LotosTests, for the automatic generation of LOTOS scenarios from Use Case Maps. The obtained scenarios can be used for the verification of the LOTOS specification built from the same Use Case Maps and for conformance testing purposes at the implementation stage. Finally, we propose a development methodology based on Use Case Maps for the design of the requirements and on LOTOS for the formal modeling of the system. In addition, this methodology offers a fast test generation process; it proposes the use of Ucm2LotosTests for the automatic generation of LOTOS scenarios from UCM requirements and of TGV for the automatic generation of TTCN test suites from LOTOS. The methodology is illustrated with a case study: a telephony system providing the basic call feature.
APA, Harvard, Vancouver, ISO, and other styles
23

Čaušević, Aida. "Formal Approaches to Service-oriented Design : From Behavioral Modeling to Service Analysis." Licentiate thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-12166.

Full text
Abstract:
Service-oriented systems (SOS) have recently emerged as context-independent component-based systems. In contrast to components, services can be created, invoked, composed and destroyed at run-time. Services are assumed to be platform independent and available for use within heterogeneous applications. One of the main assets in SOS is service composability. It allows the development of composite services with the main goal of reusing functionality provided by existing services in a low-cost and rapid development process at run-time. However, in such distributed systems it becomes difficult to guarantee the quality of services (QoS), both in isolation and of the newly created service compositions. Means of checking the correctness of service compositions can enable optimization w.r.t. the function and resource usage of composed services, as well as provide a higher degree of QoS assurance of a service composition. To accomplish such goals, we employ model checking for both single and composed services. The verification eventually provides necessary information about QoS already at an early development stage. This thesis presents the research that we have been carrying out on developing methods and tools for the specification, modeling, and formal analysis of services and service compositions in SOS. In this work, we first show how to formally check QoS in terms of performance and reliability for formally specified component-based systems (CBS). Next, we outline the commonalities and differences between SOS and CBS. Third, we develop constructs for the formal description of services using the resource-aware timed behavioral language REMES, including the development of a language to support service compositions. At last, we show how to check service and service composition correctness (functional, timing and resource-wise) by employing the strongest postcondition semantics.
For less complex services and service compositions we choose to prove correctness using Hoare triples and the guarded command language. In the case of complex services described as priced timed automata (PTA), we prove correctness via algorithmic computation of the strongest postcondition of the PTA.<br>Q-ImPreSS
APA, Harvard, Vancouver, ISO, and other styles
24

Pradalier, Sylvain. "A formal approach to the modeling, simulation and analysis of nano-devices." Phd thesis, Ecole Polytechnique X, 2009. http://tel.archives-ouvertes.fr/tel-00780567.

Full text
Abstract:
Nano-devices are molecular machines synthesized from molecular subcomponents whose functions are combined in order to perform the function of the machine. This function frequently results from relative motions of subcomponents triggered by chemical events such as excitement induced by light, acidity or temperature changes; thus it consists in the transformation of a chemical event into a mechanical event. An important and characteristic feature of these devices is their intrinsic compositional nature. Therefore process-algebra formalisms are natural candidates for their modeling. To this aim we introduce a dialect of the κ-calculus, the nanoκ calculus. It is a rule-based language whose basic agents are molecules, with explicit representation of molecular complexations and internal states. Its stochastic semantics is governed by rules which correspond to chemical reactions. The stochastic rate of a rule, possibly infinite, corresponds to the kinetic rate of the reaction. We illustrate its relevance for the modeling and simulation of nano-devices with an example stemming from the collaboration with the chemistry department of Bologna: the [2]RaH rotaxane. We modeled it in nanoκ and simulated its behaviour under various conditions of concentration: first we validate our model by checking its correspondence with the experimental data, and then we investigate extreme conditions not observable in practice. We were able to show that some classical assumptions about kinetic rates are no longer correct in this setting. The calculus has many advantages for the modelling of biochemical systems. It is in particular compact, easily reusable and modifiable, and, maybe more importantly, much more biological-like and thus easier to learn for biochemists. On the other hand the π-calculus, also often used to model biochemical systems, has a much more developed theory and more available tools. We present an encoding from the nanoκ calculus to the stochastic π-calculus.
It satisfies a very strong correctness property: S →λ T if and only if [[S]] →λ [[T]], where S and T are nanoκ terms, λ is the rate of the reaction and [[·]] is the encoding. Thus it permits to use nanoκ as a front-end formalism and still get the benefits of the theory and tools of the π-calculus. We carry on with a study of the chemical master equation. It probabilistically describes the possible behaviours of the system over time as a differential equation on the probability to be in a given state at a given instant. It is a key notion in chemistry. There have been many efforts to solve it, and methods such as Gillespie's algorithm have been developed to simulate its solution. We introduce and motivate a notion of equivalence based on the chemical master equation. It equates states with similar stochastic behavior. Then we prove that this equivalence corresponds exactly to the notion of backward stochastic bisimulation. This bisimulation differs from the usual ones because it considers ingoing transitions instead of outgoing transitions. This result is worthwhile in itself since it establishes a bridge between a chemical semantics and a computer-science semantics, but it is also the first step towards a metrics for biochemistry. Finally we present an unexpected consequence of our study of the nanoκ calculus. We study the relative expressiveness of the synchronous and asynchronous π-calculus. In the classical setting the latter is known to be strictly less expressive than the former. We prove that the separation also holds in the stochastic setting. We then extend the result to the π-calculi with infinite rates. We also show that under a small restriction the asynchronous π-calculus with infinite rates can encode the synchronous π-calculus without infinite rates. Interestingly, the separation results are proved using the encodability of the nanoκ calculus.
We also propose and motivate a stochastic π-calculus with rates of different orders of magnitude, the multi-scale π-calculus, to which we generalize our results. Finally we prove that in the probabilistic setting the synchronous π-calculus can be encoded into the asynchronous one.
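The chemical master equation discussed in this abstract is usually simulated rather than solved in closed form; a minimal sketch of Gillespie's direct method, applied to a toy degradation reaction of my own choosing (not a system from the thesis), looks like this:

```python
import random

def gillespie(state, reactions, t_max, seed=0):
    """Gillespie's direct method.  `reactions` is a list of
    (propensity_fn, state_update_fn) pairs; returns the trajectory
    as a list of (time, state-snapshot) pairs."""
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, dict(state))]
    while t < t_max:
        props = [p(state) for p, _ in reactions]
        total = sum(props)
        if total == 0:          # no reaction can fire: system is absorbed
            break
        t += rng.expovariate(total)   # exponential waiting time
        r = rng.uniform(0, total)     # pick a reaction with prob ∝ propensity
        for (p, update), a in zip(reactions, props):
            if r < a:
                update(state)
                break
            r -= a
        traj.append((t, dict(state)))
    return traj

# Toy mass-action degradation A -> ∅ with propensity k*A (illustrative only):
k = 1.0
state = {"A": 50}
reactions = [(lambda s: k * s["A"],
              lambda s: s.__setitem__("A", s["A"] - 1))]
traj = gillespie(state, reactions, t_max=100.0)
```

Each run is one sample path of the master equation; averaging many runs approximates the probability distribution over states that the equation describes.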
APA, Harvard, Vancouver, ISO, and other styles
25

BARTOCCI, Ezio. "A Formal Framework for Modeling, Simulating and Analyzing Networks of Excitable Cells." Doctoral thesis, Università degli Studi di Camerino, 2009. http://hdl.handle.net/11581/401755.

Full text
Abstract:
The main goal of this thesis was to investigate the use of hybrid automata (HA) as a unifying systems-biology approach to model, simulate and analyze excitable-cell networks in general and those of cardiac myocytes in particular. We propose a new biological framework based on the Lynch et al. theory of Hybrid I/O Automata (HIOA) for modeling and simulating excitable tissue. Within this framework, we view an excitable tissue as the composition of two main kinds of components: a diffusion medium and a collection of cells, both modeled as HIOA. This approach yields a notion of decomposition that allows us to describe a tissue as the parallel composition of several interacting tissues, a property that could be exploited to parallelize, and hence improve, the efficiency of the simulation process. On the basis of HA theory, we have developed CellExcite, an efficient simulation environment for excitable-cell networks. CellExcite allows the user to sketch a tissue of excitable cells, plan the stimuli to be applied during simulation, and customize the diffusion model. CellExcite adopts Hybrid I/O Automata (HIOA) as the computational model in order to efficiently capture both discrete and continuous excitable-cell behavior. We demonstrate the feasibility of our HIOA-based framework to capture and mimic different kinds of wave-propagation behavior in 2D isotropic cardiac tissue, including normal wave propagation along the tissue; the creation of spiral waves; the break-up of spiral waves into more complex patterns such as fibrillation; and the recovery of the tissue to the resting state via electrical defibrillation. We also address the problem of specifying and detecting emergent behavior in networks of cardiac myocytes, spiral electric waves in particular, a precursor to atrial and ventricular fibrillation.
To solve this problem we: (1) Apply discrete mode-abstraction to the cycle-linear hybrid automata (CLHA) we have recently developed for modeling the behavior of myocyte networks; (2) Introduce the new concept of spatial-superposition of CLHA modes; (3) Develop a new spatial logic, based on spatial-superposition, for specifying emergent behavior; (4) Devise a new method for learning the formulae of this logic from the spatial patterns under investigation; and (5) Apply bounded model checking to detect (within milliseconds) the onset of spiral waves. We have implemented our methodology as the Emerald tool-suite, a component of our EHA framework for specification, simulation, analysis and control of excitable hybrid automata. We illustrate the effectiveness of our approach by applying Emerald to the scalar electrical fields produced by our CellExcite simulator. Furthermore, we focus our attention on understanding of the synchronized collective behavior that is essential in the networks of pacemaker cells in the heart and could be useful for developing methods to control the dynamics of systems with desired properties. For this purpose we define a subclass of timed automata, called oscillator timed automata, suitable to model biological coupled oscillators. The semantics of their interactions, parametric w.r.t. a model of synchronization, is introduced. We apply it to the Kuramoto model. Then, we introduce a logic, Kuramoto Synchronization Logic (KSL), and a model checking algorithm in order to verify collective synchronization properties of a population of coupled oscillators.
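The Kuramoto model referenced in this abstract has a compact numerical form; the sketch below (parameters and step sizes are illustrative assumptions, not taken from the thesis) integrates the coupled oscillators with an Euler step and measures synchronization with the standard order parameter r:

```python
import math, random

def kuramoto_step(phases, omegas, K, dt):
    """One Euler step of the Kuramoto model:
    dθ_i/dt = ω_i + (K/N) Σ_j sin(θ_j − θ_i)."""
    n = len(phases)
    return [th + dt * (w + K / n * sum(math.sin(tj - th) for tj in phases))
            for th, w in zip(phases, omegas)]

def order_parameter(phases):
    """r = |(1/N) Σ_j exp(iθ_j)|; r ≈ 1 means the population is synchronized."""
    n = len(phases)
    re = sum(math.cos(t) for t in phases) / n
    im = sum(math.sin(t) for t in phases) / n
    return math.hypot(re, im)

rng = random.Random(1)
phases = [rng.uniform(0, 2 * math.pi) for _ in range(20)]  # random initial phases
omegas = [rng.gauss(0.0, 0.1) for _ in range(20)]          # natural frequencies
r0 = order_parameter(phases)
for _ in range(2000):                                      # integrate to t = 20
    phases = kuramoto_step(phases, omegas, K=2.0, dt=0.01)
r1 = order_parameter(phases)
```

With coupling K well above the frequency spread, r climbs from its small initial value towards 1, which is the kind of collective synchronization property the thesis verifies with its KSL logic and model-checking algorithm.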
APA, Harvard, Vancouver, ISO, and other styles
26

SANAULLAH, MUHAMMAD. "Design Time Methodology for the Formal Modeling and Verification of Smart Environments." Doctoral thesis, Politecnico di Torino, 2014. http://hdl.handle.net/11583/2536725.

Full text
Abstract:
Smart Environments (SmE) are intelligent and complex due to the smart connectivity and interaction of heterogeneous devices, achieved by complicated and sophisticated computing algorithms. Given their domotic and industrial applications, SmE systems may be critical in terms of correctness, reliability, safety, security and other such vital factors. To achieve an error-free and requirement-compliant implementation of these systems, it is advisable to enforce a design process that can guarantee these factors by adopting formal models and formal verification techniques at design time. The e-Lite research group at Politecnico di Torino is developing solutions for SmE based on the integration of commercially available home automation technologies with an intelligent ecosystem based on a central OSGi-based gateway and distributed collaboration of intelligent applications, with the help of semantic web technologies and applications. The main goal of my research is to study new methodologies for the modeling and verification of SmE. This goal includes the development of a formal methodology that ensures the reliable implementation of the requirements on SmE, by modeling and verifying each component (users, devices, control algorithms and environment/context) and the interactions among them, especially at various stages in design time, so that all complexities and ambiguities can be reduced.
APA, Harvard, Vancouver, ISO, and other styles
27

Chen, Wei. "Formal Modeling and Automatic Generation of Test Cases for the Autonomous Vehicle." Electronic Thesis or Diss., université Paris-Saclay, 2020. http://www.theses.fr/2020UPASG002.

Full text
Abstract:
Autonomous vehicles rely mainly on an intelligent system pilot to achieve self-driving. They combine a variety of sensors (cameras, radars, lidars, etc.) to perceive their surroundings. The perception algorithms of Automated Driving Systems (ADSs) provide observations on the environmental elements based on the data delivered by the sensors, while decision algorithms generate the actions to be implemented by the vehicle. ADSs are therefore safety-critical systems whose failures can have catastrophic consequences. To ensure the safety of such systems, it is necessary to specify, validate and secure the dependability of the architecture and the behavioural logic of the ADSs running on the vehicle for all the situations the vehicle will encounter. These situations are described and generated as different test cases. The objective of this thesis is to develop a complete approach allowing the conceptualization and characterization of execution contexts of the autonomous vehicle, and the formal modelling of test cases in the highway context. Finally, this approach has to allow an automatic generation of the test cases that have an impact on the performance and dependability of the vehicle. In this thesis, we propose a three-layer test case generation methodology. The first layer includes all the static and mobile concepts of three ontologies we define in order to conceptualize and characterize the driving environment for the construction of test cases: a highway ontology and a weather ontology to specify the environment in which the autonomous vehicle evolves, and a vehicle ontology consisting of the vehicle lights and the control actions. Each concept of these ontologies is defined in terms of an entity, sub-entities and properties. The second layer includes the interactions between the entities of the defined ontologies; we use first-order logic equations to represent the relationships between these entities. The third and last layer is dedicated to test case generation, which is based on the process algebra PEPA (Performance Evaluation Process Algebra), used to model the situations described by the test cases. Our approach allows us to generate test cases automatically and to identify the critical ones. We can generate test cases from any initial situation and with any number of scenes. Finally, we propose a method to calculate the criticality of each test case; the importance of a test case can then be comprehensively evaluated by its criticality and its probability of occurrence
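The closing idea of this abstract, weighing each test case by its criticality and its probability of occurrence, amounts to ranking by expected risk. A minimal sketch with hypothetical field names (the thesis defines its own criticality metric; nothing below is taken from it):

```python
def rank_test_cases(test_cases):
    """Order test cases by expected risk = criticality × probability of
    occurrence, most important first."""
    return sorted(test_cases,
                  key=lambda tc: tc["criticality"] * tc["probability"],
                  reverse=True)

cases = [
    {"id": "tc1", "criticality": 0.9, "probability": 0.05},  # severe but rare
    {"id": "tc2", "criticality": 0.4, "probability": 0.50},  # mild but common
    {"id": "tc3", "criticality": 0.8, "probability": 0.10},
]
ranked = rank_test_cases(cases)  # tc2 (0.20) before tc3 (0.08) before tc1 (0.045)
```

Such a ranking lets a validation campaign spend its simulation budget on the test cases that matter most in aggregate, not only on the rarest worst cases.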
APA, Harvard, Vancouver, ISO, and other styles
28

Mendling, Jan, Henrik Leopold, and Fabian Pittke. "25 Challenges of Semantic Process Modeling." Gitice, 2014. http://epub.wu.ac.at/5983/1/6%2D11%2D1%2DSM.pdf.

Full text
Abstract:
Process modeling has become an essential part of many organizations for documenting, analyzing and redesigning their business operations and for supporting them with suitable information systems. In order to serve this purpose, it is important for process models to be well grounded in formal and precise semantics. While the behavioural semantics of process models are well understood, there is a considerable research gap concerning the semantic aspects of their text labels and natural language descriptions. The aim of this paper is to make this research gap more transparent. To this end, we clarify the role of textual content in process models and the challenges that are associated with the interpretation, analysis, and improvement of their natural language parts. More specifically, we discuss particular use cases of semantic process modeling to identify 25 challenges. For each challenge, we identify prior research and discuss directions for addressing them.
APA, Harvard, Vancouver, ISO, and other styles
29

Venugopal, Manu. "Formal specification of industry foundation class concepts using engineering ontologies." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42868.

Full text
Abstract:
Architecture, Engineering, Construction (AEC) and Facilities Management (FM) involve domains that require a very diverse set of information and model exchanges to fully realize the potential of Building Information Modeling (BIM). Industry Foundation Classes (IFC) provide a neutral and open schema for interoperability. Model View Definitions (MVD) provide a common subset for specifying the exchanges using IFC, but are expensive to build, test and maintain. A semantic analysis of the IFC data schema illustrates the complexities of embedding semantics in model views. A software engineering methodology based on the formal specification of shared resources, reusable components and standards applicable to the AEC-FM industry is adopted in this research to develop a Semantic Exchange Module (SEM) structure for the IFC schema. This SEM structure is based on engineering ontologies capable of supporting more consistent MVDs. In this regard, an ontology is considered a machine-readable set of definitions that create a taxonomy of classes and subclasses, and relationships between them. Typically, the ontology contains the hierarchical description of important entities that are used in IFC, along with their properties and business rules. This model of an ontological framework, similar to that of the Semantic Web, makes the IFC more formal and consistent, as it is capable of providing precise definitions of terms and vocabulary. The outcome of this research, a formal classification structure for IFC implementations in the domain of the Precast/Prestressed Concrete Industry, provides, when implemented by software developers, the mechanism for applications such as modular MVDs, smart and complex querying of product models, and transaction-based services, based on the idea of testable and reusable SEMs.
It can be extended and also helps in consistent implementation of rule languages across different domains within AEC-FM, making data sharing across applications simpler with limited rework. This research is expected to impact the overall interoperability of applications in the BIM realm.
APA, Harvard, Vancouver, ISO, and other styles
30

Modica, Tony [Verfasser], and Hartmut [Akademischer Betreuer] Ehrig. "Formal Modeling, Simulation, and Validation of Communication Platforms / Tony Modica. Betreuer: Hartmut Ehrig." Berlin : Universitätsbibliothek der Technischen Universität Berlin, 2012. http://d-nb.info/1028072295/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Ivanov, Dinko. "Integrating formal analysis techniques into the Progress-IDE." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-13884.

Full text
Abstract:
In this thesis we contribute to the Progress IDE, an integrated development environment for real-time embedded systems, and more precisely to the REMES toolchain, a set of tools enabling the construction and analysis of embedded-system behavior models. The contribution aims to facilitate the formal analysis of behavioral models, so that certain extra-functional properties can be verified during early stages of development. Previous work in the field proposes the use of the Priced Timed Automata framework for the verification of such properties. The thesis outlines the main points where the current toolchain should be extended in order to allow formal analysis of modeled components. The result of this work is a prototype that minimizes the manual effort of the system designer through model-to-model transformations and provides seamless integration with existing tools for formal analysis.
APA, Harvard, Vancouver, ISO, and other styles
32

Suhaib, Syed Mohammed. "XFM: An Incremental Methodology for Developing Formal Models." Thesis, Virginia Tech, 2004. http://hdl.handle.net/10919/9905.

Full text
Abstract:
We present eXtreme Formal Modeling (XFM), an agile formal method recently developed by us, based on Extreme Programming concepts, for constructing abstract models from a natural-language specification of a complex system. In particular, we focus on Prescriptive Formal Models (PFMs) that capture the specification of the system under design in a mathematically precise manner. Such models can be used as golden reference models for formal verification, test generation, etc. This methodology for incrementally building PFMs works by adding user stories (expressed as LTL formulae), gleaned from the natural-language specifications, one by one into the model. XFM builds the models, retaining correctness with respect to the incrementally added properties, by regressively model checking all the LTL properties captured thus far in the model. We illustrate XFM with a graded set of examples including a traffic light controller, a DLX pipeline and a Smart Building control system. To make the regressive model-checking steps feasible with current model-checking tools, we need to keep the model-size increments under control. We therefore analyze the effects of ordering LTL properties in XFM. We compare three different property-ordering methodologies: arbitrary ordering, property-based ordering and predicate-based ordering. We experiment on models of the ISA bus monitor and the arbitration phase of the Pentium Pro bus. We experimentally show and mathematically reason that predicate-based ordering is the best among these orderings. Finally, we present a GUI-based toolbox for users to build PFMs using XFM.<br>Master of Science
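The regressive model-checking loop at the heart of XFM can be sketched as follows; the `model_check` stand-in below is a deliberately simplistic stub of my own (properties as sets of allowed states), not the LTL machinery the thesis actually uses:

```python
def xfm_increment(props, new_prop, model_check):
    """One XFM step: add a user story (an LTL formula in the real method,
    an abstract property here) to the model, then regressively re-check
    every property captured so far.  `model_check(props, p)` stands in
    for a model-checker run on the model built from `props`."""
    candidate = props + [new_prop]
    regressions = [p for p in candidate if not model_check(candidate, p)]
    if regressions:
        raise ValueError(f"properties no longer hold: {regressions}")
    return candidate

# Toy stand-in for a model checker: the "model" is the set of states
# allowed by all properties so far; a property holds iff that set is
# non-empty (purely illustrative, not XFM's real semantics).
check = lambda props, p: bool(set.intersection(*map(set, props)))

P1, P2, P3 = {1, 2, 3}, {2, 3, 4}, {5}
model = xfm_increment([P1], P2, check)      # ok: states {2, 3} satisfy both
try:
    xfm_increment(model, P3, check)         # P3 conflicts with the model
    conflict_caught = False
except ValueError:
    conflict_caught = True                  # regression detected, as in XFM
```

The point of the discipline is exactly this loop: every newly added user story is checked together with all earlier ones, so a conflicting requirement is caught at the step that introduces it.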
APA, Harvard, Vancouver, ISO, and other styles
33

Lee, Ghang. "A new formal and analytical process to product modeling (PPM) method and its application to the precast concrete industry." Diss., Available online, Georgia Institute of Technology, 2004:, 2004. http://etd.gatech.edu/theses/available/etd-10262004-191554/unrestricted/lee%5Fghang%5F200412%5Fphd.pdf.

Full text
Abstract:
Thesis (Ph. D.)--Architecture, Georgia Institute of Technology, 2005.<br>Eastman, Charles M., Committee Chair ; Augenbroe, Godfried, Committee Co-Chair ; Navathe, Shamkant B., Committee Co-Chair ; Hardwick, Martin, Committee Member ; Sacks, Rafael, Committee Member. Vita. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
34

Pamplin, Jason Andrew. "Formal Object Interaction Language: Modeling and Verification of Sequential and Concurrent Object-Oriented Software." unrestricted, 2007. http://etd.gsu.edu/theses/available/etd-04222007-205349/.

Full text
Abstract:
Thesis (Ph. D.)--Georgia State University, 2007.<br>Title from file title page. Ying Zhu, committee chair; Xiaolin Hu, Geoffrey Hubona, Roy Johnson, Rajshekhar Sunderraman, committee members. Electronic text (216 p. : ill. (some col.)) : digital, PDF file. Description based on contents viewed Nov. 29, 2007. Includes bibliographical references (p. 209-216).
APA, Harvard, Vancouver, ISO, and other styles
35

Čaušević, Aida. "Formal Approaches for Behavioral Modeling and Analysis of Design-time Services and Service Negotiations." Doctoral thesis, Mälardalens högskola, Inbyggda system, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-23271.

Full text
Abstract:
During the past decade service-orientation has become a popular design paradigm, offering an approach in which services are the functional building blocks. Services are self-contained units of composition, built to be invoked, composed, and destroyed on (user) demand. Service-oriented systems (SOS) are collections of services developed according to several design principles, such as: (i) loose coupling between services (e.g., inter-service communication can involve either simple data passing or two or more connected services coordinating some activity), which allows services to be independent, yet highly interoperable when required; (ii) service abstraction, which emphasizes the need to hide as many implementation details as possible, yet still expose the functional and extra-functional capabilities that can be offered to service users; (iii) service reusability, by which existing services support a rapid and flexible development process; (iv) service composability, one of the main assets of SOS, which provides a design platform for services to be composed and decomposed; etc. One of the main concerns in such systems is ensuring the quality of individual services, but also guaranteeing the quality of newly composed services. To accomplish the above, we consider two system perspectives: the developer's view and the user's view, respectively. In the former, one can be assumed to have access to the internal service representation: functionality, enabled actions, resource usage, and interactions with other services. In the latter, one has information primarily on the service interface and exposed capabilities (attributes/features). Means of checking that services and service compositions meet the expected requirements, the so-called correctness issue, enable optimization and make it possible to guarantee a satisfactory level of service-composition quality.
In order to accomplish exhaustive correctness checks of design-time SOS, we employ model checking as the main formal verification technique, which provides the necessary information about quality-of-service (QoS) already at early stages of system development. As opposed to the traditional approach of software system construction, in SOS the same service may be offered at various prices, QoS levels, and other conditions, depending on the user needs. In such a setting, the interaction between the involved parties requires negotiating what is possible at request time, aiming at meeting needs on demand. The service negotiation process often proceeds under timing, price, and resource constraints, with users and providers exchanging information on their respective goals until reaching a consensus. Hence, a mathematically driven technique to analyze a priori the various ways to achieve such goals is beneficial for understanding what particular goals can be achieved, and how. This thesis presents the research that we have been carrying out over the past few years, which resulted in developing methods and tools for the specification, modeling, and formal analysis of services and service compositions in SOS.
The contributions of the thesis consist of: (i) constructs for the formal description of services and service compositions using the resource-aware timed behavioral language called REMES; (ii) deductive and algorithmic approaches for checking correctness of services and service compositions; (iii) a model of service negotiation that includes different negotiation strategies, formally analyzed against timing and resource constraints; (iv) a tool-chain (REMES SOS IDE) that provides an editor and verification support (by integration with the UPPAAL model checker) for REMES-based service-oriented designs; (v) a relevant case study by which we exercise the applicability of our framework. The presented work has also been applied to other, smaller examples presented in the published papers.
APA, Harvard, Vancouver, ISO, and other styles
36

Gonçalves, Monteiro Pedro Tiago. "Towards an integrative approach for the modeling and formal verification of biological regulatory networks." Thesis, Lyon 1, 2010. http://www.theses.fr/2010LYO10239/document.

Full text
Abstract:
The study of large models of biological networks by means of analysis and simulation tools leads to large amounts of predictions. This raises the question of how to identify interesting predictions of novel phenomena that can be confronted with experimental data. Formal verification techniques based on model checking have recently been applied to the analysis of these networks, providing a powerful technology to keep up with this increase in scale and complexity. The application of these techniques is hampered, however, by several key issues.
First, the systems biology domain brought to the fore a few properties of network dynamics, such as multistability and oscillations, that are not easily expressed in classical temporal logics. Second, posing relevant and interesting questions in temporal logic is difficult for non-expert users. Finally, most of the existing modeling and simulation tools are not capable of applying model-checking techniques in a transparent way. The approaches developed in this work lower the obstacles to the use of formal verification in systems biology. They have been validated on the analysis and simulation of two real and complex biological models.
APA, Harvard, Vancouver, ISO, and other styles
37

Pavawalla, Shital Prabodh. "Prospective memory following moderate to severe traumatic brain injury a formal multinomial modeling approach /." Pullman, Wash. : Washington State University, 2009. http://www.dissertations.wsu.edu/Dissertations/Summer2009/s_pavawalla_071909.pdf.

Full text
Abstract:
Thesis (Ph. D. in psychology)--Washington State University, August 2009.<br>Title from PDF title page (viewed on Aug. 19, 2009). "Department of Psychology." Includes bibliographical references (p. 32-36).
APA, Harvard, Vancouver, ISO, and other styles
38

Ghosh, Krishnendu. "Formal Analysis of Automated Model Abstractions under Uncertainty: Applications in Systems Biology." University of Cincinnati / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1330024977.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Kühn, Thomas, Kay Bierzynski, Sebastian Richly, and Uwe Aßmann. "FRaMED: Full-Fledge Role Modeling Editor (Tool Demo)." ACM, 2016. https://tud.qucosa.de/id/qucosa%3A75117.

Full text
Abstract:
Since the year 1977, role modeling has been continuously investigated as a promising paradigm to model complex, dynamic systems. However, this research has had almost no influence on the design of today's increasingly complex and context-sensitive software systems. The reason for that is twofold. First, most modeling languages focused either on the behavioral, relational, or context-dependent nature of roles rather than combining them. Second, there is a lack of tool support for the design, validation, and generation of role-based software systems. In particular, there exists no graphical role modeling editor supporting the three natures as well as the various proposed constraints. To overcome this deficiency, we introduce the Full-fledged Role Modeling Editor (FRaMED), a graphical modeling editor embracing all natures of roles and their modeling constraints, featuring generators for a formal representation and for source code of a role-based programming language. To show its applicability for the development of role-based software systems, an example from the banking domain is employed.
APA, Harvard, Vancouver, ISO, and other styles
40

Pow, Jacky W. C. "A study of formal modeling for sharing the experience of using ICT in university teaching." Thesis, University of Nottingham, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.289436.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

DRAGONE, Luigi. "Modeling and reasoning about semantic e-services in cooperative information systems." Doctoral thesis, La Sapienza, 2008. http://hdl.handle.net/11573/917061.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Hamadi, Rachid Computer Science &amp Engineering Faculty of Engineering UNSW. "Formal Composition and Recovery Policies in Service-Based Business Processes." Awarded by:University of New South Wales. Computer Science and Engineering, 2005. http://handle.unsw.edu.au/1959.4/20666.

Full text
Abstract:
Process-based composition of Web services is emerging as a promising technology for the effective automation of integrated and collaborative applications. As Web services are often autonomous and heterogeneous entities, coordinating their interactions to build complex processes is a difficult, error-prone, and time-consuming task. In addition, since Web services usually operate in dynamic and highly evolving environments, there is a need for supporting flexible and correct execution of integrated processes. In this thesis, we propose a Petri net-based framework for formal composition and recovery policies in service-based business processes. We first propose an algebra for composing Web services. The formal semantics of this algebra is expressed in terms of Petri nets. The use of a formal model allows the effective verification and analysis of properties, both within a service, such as termination and absence of deadlock, and between services, such as behavioral equivalences. We also develop a top-down approach for the correct (e.g., absence of deadlock and termination) composition of complex business processes. The approach defines a set of refinement operators that guarantee correctness of the resulting business process nets at design time. We then introduce Self-Adaptive Recovery Net (SARN), an extended Petri net model for specifying exceptional behavior in business processes. SARN adapts the structure of the underlying Petri net at run time to handle exceptions while keeping the Petri net design simple and easy. The proposed framework caters for the specification of high-level recovery policies that are incorporated either with a single task or a set of tasks, called a recovery region. Finally, we propose a pattern-based approach to dynamically restructure SARN. These patterns capture the ways past exceptions have been dealt with. The objective is to continuously restructure recovery regions within the SARN model to minimize the impact of exception handling.
To illustrate the viability of the proposed composition and exception handling techniques, we have developed HiWorD (HIerarchical WORkflow Designer), a hierarchical Petri net-based business process modeling and simulation tool.
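The Petri-net semantics underlying such composition frameworks rests on a simple token game: a transition is enabled when its input places hold enough tokens, and firing consumes those tokens and produces tokens in the output places. A minimal, self-contained Python sketch of two sequentially composed services follows; the place and transition names are hypothetical, and this is not the thesis's algebra or the HiWorD tool.

```python
# Minimal Petri-net token game: places hold tokens; a transition
# (inputs, outputs) fires by consuming and producing tokens.
# Hypothetical service names; illustrative only.
from collections import Counter

def enabled(marking, transition):
    ins, _outs = transition
    return all(marking[p] >= n for p, n in ins.items())

def fire(marking, transition):
    assert enabled(marking, transition)
    ins, outs = transition
    m = Counter(marking)
    for p, n in ins.items():
        m[p] -= n
    for p, n in outs.items():
        m[p] += n
    return m

# Sequential composition of two services: start -> s1_done -> end.
t1 = ({"start": 1}, {"s1_done": 1})
t2 = ({"s1_done": 1}, {"end": 1})

m = Counter({"start": 1})
m = fire(fire(m, t1), t2)
# The final marking enables no transition: the composed process
# terminates without deadlocking midway.
assert m["end"] == 1 and not enabled(m, t1) and not enabled(m, t2)
```

Properties such as termination and absence of deadlock amount to questions about which markings are reachable in exactly this token game, which is why a formal Petri-net model makes them mechanically checkable.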
APA, Harvard, Vancouver, ISO, and other styles
43

Grover, Russell J. "An Exploration of Formal Methods and Tools Applied to a Small Satellite Software System." DigitalCommons@USU, 2010. https://digitalcommons.usu.edu/etd/743.

Full text
Abstract:
Formal system modeling has been a topic of interest in the research community for many years. Modeling a system helps engineers understand it better and enables them to check different aspects of it to ensure that there is no undesired or unexpected behavior and that it does what it was designed to do. This thesis takes two existing tools that were created to aid in the designing of spacecraft systems and creates a layer to connect them together and allow them to be used jointly. The first tool is a library of formal descriptions used to specify spacecraft behavior in an unambiguous manner. The second tool is a graphical modeling language that allows a designer to create a model using traditional block diagram descriptions. These block diagrams can be translated to the formal descriptions using the layer created as part of this thesis work. The software of a small satellite, and the additions made to it as part of this thesis work, is also described. Approaches to modeling this software formally are discussed, as are the problems that were encountered that led to expansions of the formal description library to allow better system description.
APA, Harvard, Vancouver, ISO, and other styles
44

Chang, Lily. "A Nested Petri Net Framework for Modeling and Analyzing Multi-Agent Systems." FIU Digital Commons, 2011. http://digitalcommons.fiu.edu/etd/339.

Full text
Abstract:
In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although there were many conceptual frameworks for using multi-agent systems, there was no well-established and widely accepted method for modeling multi-agent systems. This dissertation research addressed the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective was to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of design. Given that there was no well-defined formal model directly supporting agent-oriented modeling, this study was centered on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model checking technology. PrT nets were extended to include the notions of dynamic structure, agent communication and coordination to support agent-oriented modeling. An aspect-oriented technique was developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules were defined to systematically translate formal MAS models to concrete models that can be verified through the model checker SPIN (Simple Promela Interpreter). This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets, and a comprehensive methodology to guide the construction and analysis of formal MAS models.
APA, Harvard, Vancouver, ISO, and other styles
45

Robol, Marco. "Consent modeling and verification: privacy regulations compliance from business goals to business processes." Doctoral thesis, Università degli studi di Trento, 2020. http://hdl.handle.net/11572/277802.

Full text
Abstract:
Privacy regulations impose limitations on companies regarding the collection, use, and disclosure of user data. One of the actions most companies undertake in response consists in modifying their systems with processes for consent acquisition and management. Unfortunately, where systems are large and have many dependencies, they often also have little documentation, and knowledge of the system is distributed among different domain experts. These circumstances make the re-engineering of systems a tedious and complex, if not impossible, activity. This PhD thesis proposes a model-based method with a top-down approach for modeling consent requirements and analyzing compliance with regulations, by refinement of models from the organizational structure down to business processes. The method is provided with guidelines in the form of a process and includes modeling languages and reasoning frameworks for the analysis of requirements with respect to a preset of privacy principles on consent. The thesis includes validations with realistic scenarios and with domain practitioners from the healthcare domain.
APA, Harvard, Vancouver, ISO, and other styles
47

Nguyen, Vu. "A Deontic Analysis of Inter-Organizational Control Requirements." FIU Digital Commons, 2008. http://digitalcommons.fiu.edu/etd/69.

Full text
Abstract:
This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures. Underpinning most inter-organizational procedures are deontic relations, which are about the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor the counterparties' rights; thus controls may not be needed. The challenge is in cases where trust may not be assumed. In these cases, the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation. The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method will take a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate control requirements that a documentary procedure needs to satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, namely Deontic Petri Nets, that combines multiple modal logics and Petri nets for modeling deontic processes, a set of control principles that represent an initial formal theory on the relationships between deontic processes and documentary procedures, and a working prototype that uses model-checking techniques to identify fraud potentials in a deontic process and generate control requirements to limit them.
Fourteen scenarios of two well-known international payment procedures -- cash in advance and documentary credit -- have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
APA, Harvard, Vancouver, ISO, and other styles
48

Chrszon, Philipp, Clemens Dubslaff, Sascha Klüppelholz, and Christel Baier. "Family-Based Modeling and Analysis for Probabilistic Systems." Springer, 2016. https://tud.qucosa.de/id/qucosa%3A70790.

Full text
Abstract:
Feature-based formalisms provide an elegant way to specify families of systems that share a base functionality and differ in certain features. They can also facilitate an all-in-one analysis, where all systems of the family are analyzed at once on a single family model instead of one by one. This paper presents the basic concepts of the tool ProFeat, which provides a guarded-command language for modeling families of probabilistic systems and an automatic translation of family models to the input language of the probabilistic model checker PRISM. This translational approach enables a family-based quantitative analysis with PRISM. Besides modeling families of systems that differ in system parameters such as the number of identical processes or channel sizes, ProFeat also provides special support for the modeling and analysis of (probabilistic) product lines with dynamic feature switches, multi-features, and feature attributes. By means of several case studies we show how ProFeat eases family-based modeling and compare the one-by-one and all-in-one analysis approaches.
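The family-based quantitative analysis described above can be illustrated with a toy parameter sweep: each family member is a small Markov chain parameterized by a loss probability, and the reachability probability (the kind of quantity a probabilistic model checker computes) is obtained by fixed-point iteration. This is an illustrative Python sketch under assumed names; ProFeat and PRISM use their own guarded-command languages.

```python
# Toy family of probabilistic models: a sender that retries once,
# parameterized by loss probability q. Reachability probabilities are
# computed by fixed-point iteration over P(reach) equations.
# Hypothetical model; not ProFeat/PRISM syntax.

def reach_prob(chain, start, target, iters=100):
    """P(eventually reach `target`) from `start` in an absorbing chain."""
    probs = {s: (1.0 if s == target else 0.0) for s in chain}
    for _ in range(iters):
        for s, succs in chain.items():
            if s != target and succs:
                probs[s] = sum(p * probs[t] for t, p in succs.items())
    return probs[start]

def member(q):
    """One family member: a first attempt, then a single retry."""
    return {
        "s0": {"done": 1 - q, "s1": q},
        "s1": {"done": 1 - q, "fail": q},
        "done": {},   # absorbing: success
        "fail": {},   # absorbing: failure
    }

# All-in-one-style sweep over the family parameter:
results = {q: reach_prob(member(q), "s0", "done") for q in (0.1, 0.5)}
assert abs(results[0.5] - 0.75) < 1e-9   # (1-q) + q*(1-q) with q = 0.5
```

A tool like ProFeat automates exactly this kind of sweep: one family model is written once, and each member's instance is generated and analyzed, either separately or on a single combined model.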
APA, Harvard, Vancouver, ISO, and other styles
49

Eriksson, Lundström Jenny S. Z. "On the Formal Modeling of Games of Language and Adversarial Argumentation : A Logic-Based Artificial Intelligence Approach." Doctoral thesis, Uppsala universitet, Institutionen för informationsvetenskap, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-9538.

Full text
Abstract:
Argumentation is a highly dynamical and dialectical process drawing on human cognition. Successful argumentation is ubiquitous in human interaction. Comprehensive formal modeling and analysis of argumentation presupposes a dynamical approach to the following phenomena: the deductive logic notion, the dialectical notion, and the cognitive notion of justified belief. For each step of an argumentation, these phenomena form networks of rules which determine the propositions allowed to count as admissible, acceptable, and accepted. We present a formal logic framework for a computational account of the formal modeling and systematic analysis of the dynamical, exhaustive, and dialectical aspects of adversarial argumentation and dispute. Our approach addresses the mechanisms of admissibility, acceptability, and acceptance of arguments in adversarial argumentation by use of metalogic representation and Artificial Intelligence techniques for dynamical problem solving by exhaustive search. We elaborate on a common framework of board games and argumentation games for pursuing the alternatives facing the adversaries in the argumentation process conceived as a game. The analogy to chess is beneficial as it incorporates strategic and tactical operations, just as argumentation does. Drawing on the analogy to board games like chess, the state space representation, well researched in Artificial Intelligence, allows for a treatment of all possible arguments as paths in a directed state space graph. It will render the game leading to the most wins and fewest losses, identifying the most effective game strategy. As an alternate visualization, the traversal of the state space graph unravels and collates knowledge about the given situation/case under dispute. Including the private knowledge of the two parties, the traversal results in an increased knowledge of the case and of the perspectives and arguments of the participants.
As we adopt metalogic as the formal basis, arguments used in the argumentation, expressed in a non-monotonic defeasible logic, are encoded as terms in the logical argumentation analysis system. The advantage of a logical formalization of argumentation is that it provides a symbolic knowledge representation with a formally well-formed semantics, making both the represented knowledge and the reasoning behavior of knowledge representation systems comprehensible. Computational logic as represented in Horn clauses allows for the expression of substantive propositions in a logical structure. The non-monotonic nature of defeasible logic stresses the representational issues, i.e., what it is possible to capture in non-monotonic reasoning, while the (meta)logic program establishes what can soundly be computed and how the semantics of this computation is to be regarded.
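The board-game analogy — exhaustively pursuing the alternatives facing the two adversaries — amounts to solving a two-player game over a directed state graph. A minimal Python sketch follows, in which each player must counter the last move played and a player with no reply loses; the moves and states are hypothetical, and this is not the thesis's metalogic framework.

```python
# Toy dialogue game solved by exhaustive (backward-induction) search:
# each player must counter the previous move; no reply means losing.
# Hypothetical moves; not the thesis's metalogic framework.

MOVES = {  # last move played -> available counter-moves
    "claim": ["attack1", "attack2"],
    "attack1": ["rebut1"],
    "attack2": [],   # no counter exists for attack2
    "rebut1": [],
}

def proponent_wins(state, proponent_to_move):
    """Does the proponent win from this position under optimal play?"""
    replies = MOVES[state]
    if not replies:                    # player to move cannot reply: loses
        return not proponent_to_move
    if proponent_to_move:              # proponent needs one winning reply
        return any(proponent_wins(r, False) for r in replies)
    return all(proponent_wins(r, True) for r in replies)  # opponent picks

# After the proponent states "claim", the opponent moves. Since
# "attack2" has no counter, the opponent wins by playing it:
assert not proponent_wins("claim", proponent_to_move=False)
# Facing only "attack1", the proponent can rebut and win:
assert proponent_wins("attack1", proponent_to_move=True)
```

The `any`/`all` alternation is the game-theoretic core: the proponent needs one defensible line, while the opponent may choose any attack, which is exactly the strategic structure the chess analogy captures.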
APA, Harvard, Vancouver, ISO, and other styles
50

Gladigau, Jens [Verfasser], and Teich [Akademischer Betreuer] Jürgen. "Combining Formal Model-Based System-Level Design with SystemC Transaction Level Modeling / Jens Gladigau. Betreuer: Teich Jürgen." Erlangen : Universitätsbibliothek der Universität Erlangen-Nürnberg, 2012. http://d-nb.info/1028958757/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles