
Dissertations / Theses on the topic 'Formal Modeling'

Consult the top 50 dissertations / theses for your research on the topic 'Formal Modeling.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Shaw, Kevin B. "Curated Reasoning by Formal Modeling of Provenance." ScholarWorks@UNO, 2013. http://scholarworks.uno.edu/td/1782.

Full text
Abstract:
The core problem addressed in this research is the current inability to repurpose and curate scientific data among interdisciplinary scientists within a research enterprise environment. Explosive growth in sensor technology, together with the cost of collecting ocean data and airborne measurements, has led to exponential increases in scientific data collection and in the enterprise resources required for it. There is currently no framework for efficiently curating this scientific data for repurposing or intergenerational use. Several factors explain why this problem has eluded solution to date: competitive requirements for funding and publication; the multiple vocabularies used across scientific disciplines; the number of disciplines and the variation among their workflow processes; the lack of a framework flexible enough to accommodate diverse vocabularies and data while unifying their exploitation; and, mostly in the past now, a lack of affordable computing resources. Addressing this lack of sharing is exceptionally challenging given the need to combine vocabularies, maintain the provenance of the associated scientific data, minimize the additional workload placed on the originating scientist's project and time, protect publication credit so as to reward scientific creativity, and secure priority for a long-term goal, namely curating scientific data for the intergenerational, interdisciplinary problem solving that likely offers the greatest potential for high-impact discoveries in the future. This research focuses on the core technical problem of formally modeling interdisciplinary scientific data provenance as the enabling, and currently missing, component needed to demonstrate the potential of interdisciplinary scientific data repurposing.
This research develops a framework for combining varying vocabularies in a formal manner that allows the provenance information to be used as a key for reasoning, enabling manageable curation. In consequence, this research pioneers an approach to formally modeling provenance within an interdisciplinary research enterprise, demonstrating that intergenerational curation can be aided at the machine level so that reasoning and repurposing occur with minimal impact on data collectors and maximum benefit to other scientists.
APA, Harvard, Vancouver, ISO, and other styles
2

Lisowski, Matthew A. "Development of a target recognition system using formal and semi-formal software modeling methods." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2000. http://handle.dtic.mil/100.2/ADA386925.

Abstract:
Thesis (M.S. in Software Engineering) Naval Postgraduate School, Dec. 2000.
Thesis advisors, Neil Rowe, Man-Tak Shing. "December 2000." Includes bibliographical references (p. 101-102). Also available in print.
3

Sidorowicz, Piotr Roald. "A formal framework for modeling and testing memories." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0028/NQ51227.pdf.

4

Wathugala, Wathugala Gamage Dulan Manujinda. "Formal Modeling Can Improve Smart Transportation Algorithm Development." Thesis, University of Oregon, 2017. http://hdl.handle.net/1794/22608.

Abstract:
201 pages
Ensuring that algorithms work accurately is crucial, especially when they drive safety-critical systems like self-driving cars. We formally model a published distributed algorithm for autonomous vehicles to collaborate and pass through an intersection. Models are built and validated using the "Labelled Transition System Analyser" (LTSA). Our models reveal situations leading to deadlocks and crashes in the algorithm. We demonstrate two approaches to gaining insight about a large and complex system without modeling the entire system: modeling a subsystem (if the subsystem has issues, so does the full system) and modeling a fast-forwarded state (which reveals problems that can arise later in a process). Some productivity tools developed for distributed system development are also presented. Manulator, our distributed system simulator, enables quick prototyping and debugging on a single workstation. LTSA-O, an extension to LTSA, listens to the messages exchanged in an execution of a distributed system and validates them against a model.
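The deadlock analysis this abstract describes can be illustrated with a small sketch (a hypothetical toy of this summary's own making, not the thesis's actual FSP/LTSA models): two processes are composed in parallel, shared actions must synchronize, and a reachability search flags reachable states with no outgoing transitions.

```python
# Hypothetical sketch, not the thesis's FSP/LTSA models: a labelled
# transition system (LTS) is a dict mapping state -> {action: next_state}.
from collections import deque

def compose(p0, p, q0, q, shared):
    """Parallel-compose two LTSs starting at p0 and q0.

    Actions in `shared` must synchronize (both processes move together);
    all other actions interleave. Returns the reachable composed LTS."""
    trans = {}
    frontier, seen = deque([(p0, q0)]), {(p0, q0)}
    while frontier:
        s = frontier.popleft()
        moves = {}
        for a, t in p.get(s[0], {}).items():
            if a in shared:
                if a in q.get(s[1], {}):      # both ready: synchronize
                    moves[a] = (t, q[s[1]][a])
            else:                             # p moves alone
                moves[a] = (t, s[1])
        for a, t in q.get(s[1], {}).items():
            if a not in shared:               # q moves alone
                moves[a] = (s[0], t)
        trans[s] = moves
        for nxt in moves.values():
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return trans

def deadlocks(trans):
    """Reachable states with no outgoing transitions. (A successfully
    terminated state also has no moves; unlike LTSA, this toy check does
    not distinguish END states from genuine deadlocks.)"""
    return [s for s, moves in trans.items() if not moves]
```

For instance, two processes that claim the same two shared actions in opposite order deadlock at once: composing {'s0': {'a': 's1'}, 's1': {'b': 's2'}} with {'t0': {'b': 't1'}, 't1': {'a': 't2'}} over shared = {'a', 'b'} leaves the initial pair as the only reachable state, and deadlocks reports it.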
5

Park, Hoon. "Formal Modeling and Verification of Delay-Insensitive Circuits." PDXScholar, 2015. https://pdxscholar.library.pdx.edu/open_access_etds/2639.

Abstract:
Einstein's relativity theory tells us that the notion of simultaneity can only be approximated for events distributed over space. As a result, the use of asynchronous techniques is unavoidable in systems larger than a certain physical size. Traditional design techniques that use global clocks face this barrier of scale already within the space of a modern microprocessor chip. The most common response by the chip industry for overcoming this barrier is to use Globally Asynchronous Locally Synchronous (GALS) design techniques. The circuits investigated in this thesis can be viewed as examples of GALS design. To make such designs trustworthy it is necessary to model formally the relative signal delays and timing requirements that make these designs work correctly. With trustworthy asynchrony one can build reliable, large, and scalable systems, and exploit the lower power and higher speed features of asynchrony. This research presents ARCtimer, a framework for modeling, generating, verifying, and enforcing timing constraints for individual self-timed handshake components that use bounded-bundled-data handshake protocols. The constraints guarantee that the component's gate-level circuit implementation obeys the component's handshake protocol specification. Because the handshake protocols are delay insensitive, self-timed systems built using ARCtimer-verified components can be made delay insensitive. Any delay sensitivity inside a component is detected and repaired by ARCtimer. In short: by carefully considering time locally, we can ignore time globally. ARCtimer applies early in the design process as part of building a library of verified components for later system use. The library also stores static timing analysis (STA) code to validate and enforce the component's constraints in any self-timed system built using the library. 
The library descriptions of a handshake component's circuit, protocol, timing constraints, and STA code are robust to circuit modifications applied later in the design process by technology mapping or layout tools. New contributions of ARCtimer include: 1. Upfront modeling on a component by component basis to reduce the validation effort required to (a) reimplement components in different technologies, (b) assemble components into systems, and (c) guarantee system-level timing closure. 2. Modeling of bounded-bundled-data timing constraints that permit the control signals to lead or lag behind data signals to optimize system timing.
6

Kühnberger, Kai-Uwe. "Formal frameworks for circular phenomena: possibilities of modeling pathological expressions in formal and natural languages." [S.l. : s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=964198576.

7

Smirnov, Oleg. "Formal evolutionary modeling and the problems of political science /." view abstract or download file of text, 2005. http://wwwlib.umi.com/cr/uoregon/fullcit?p3190550.

Abstract:
Thesis (Ph. D.)--University of Oregon, 2005.
Typescript. Includes vita and abstract. Includes bibliographical references (leaves 113-131). Also available for download via the World Wide Web; free to University of Oregon users.
8

Jacobs, Petrus Jacobus. "A formal refinement framework for the systems modeling language." Thesis, University of Oxford, 2015. https://ora.ox.ac.uk/objects/uuid:8be42735-8a31-41e2-82e2-05f7d0e6cb1a.

Abstract:
The Systems Modeling Language (SysML), an extension of a subset of the Unified Modeling Language (UML), is a visual modelling language for systems engineering applications. At present, the semi-formal SysML, which is widely utilised for the design of complex heterogeneous systems, lacks integration with other more formal approaches. In this thesis, we describe how Communicating Sequential Processes (CSP) and its associated refinement checker, Failures Divergences Refinement (FDR), may be used to underpin an approach that facilitates the refinement checking of the behavioural consistency of SysML diagrams. We do so by utilising CSP as a semantic domain for reasoning about SysML behavioural aspects: activities, state machines and interactions are given a formal process-algebraic semantics. These behaviours execute within the context of the structural diagrams to which they relate, and this is reflected in the CSP descriptions that depict their characteristic patterns of interaction. The resulting abstraction gives rise to a framework that enables the formal treatment of integrated behaviours via refinement checking. In SysML, requirement diagrams allow for the allocation of behavioural features in order to present a more detailed description of a captured requirement. Moreover, we demonstrate that, by providing a common basis for behaviours and requirements, the approach supports requirements traceability: SysML requirements are amenable to formal verification using FDR. In addition, the proposed framework is able to detect inconsistencies that arise due to the multi-view nature of SysML. We illustrate and validate the contribution by applying our methodology to a safety critical system of moderate size and complexity.
9

Liu, Su. "Formal Modeling and Analysis Techniques for High Level Petri Nets." FIU Digital Commons, 2014. http://digitalcommons.fiu.edu/etd/1522.

Abstract:
Petri Nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems, and have been widely applied in computer science and many other engineering disciplines. Low level Petri nets are simple and useful for modeling control flows, but not powerful enough to define data and system functionality. High level Petri nets (HLPNs) have been developed to support data and functionality definitions, such as using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low level Petri nets, HLPNs yield compact system models that are easier to understand, and are therefore more useful for modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, this framework integrates two formal languages: a type of HLPN called Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear time temporal logic (FOLTL) is used to specify the system's properties. The main modeling contribution of this dissertation is a software tool that supports the formal modeling capabilities in this framework. For analysis, the framework combines three complementary techniques: simulation, explicit state model checking and bounded model checking (BMC). Simulation is straightforward and fast, but covers only some execution paths in an HLPN model. Explicit state model checking covers all execution paths but suffers from the state explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit state model checking. The main analysis contribution of this dissertation is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the formal analysis capabilities in this framework. The SAMTools developed for this framework integrate three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
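The HLPN idea of structured tokens and transition formulas can be sketched in a few lines (an illustration only; the function and field names below are invented for this example and are not the PIPE+ or SAMTools API):

```python
# Illustrative sketch of high-level Petri net firing, with structured
# tokens and a guard formula per transition; not the PIPE+ API.

def fire(marking, transition):
    """Fire `transition` on `marking` (a dict: place -> list of tokens).

    transition = (inputs, guard, produce): `inputs` names the input
    places, `guard` tests the tuple of consumed tokens (the binding),
    and `produce` maps the binding to {output_place: new_token}.
    Returns the new marking, or None if the transition is not enabled.
    For simplicity the binding always takes the first token found in
    each input place."""
    inputs, guard, produce = transition
    if not all(marking.get(p) for p in inputs):
        return None                        # some input place is empty
    binding = tuple(marking[p][0] for p in inputs)
    if not guard(binding):
        return None                        # guard formula is false
    new = {p: tokens[:] for p, tokens in marking.items()}
    for p in inputs:
        new[p].pop(0)                      # consume one token per input arc
    for p, token in produce(binding).items():
        new.setdefault(p, []).append(token)
    return new
```

A "withdraw" transition guarded by balance >= amount then fires as fire({'balance': [100], 'request': [30], 'paid': []}, (('balance', 'request'), lambda b: b[0] >= b[1], lambda b: {'balance': b[0] - b[1], 'paid': b[1]})), leaving 70 in 'balance'; with a request of 130 the guard fails and fire returns None.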
10

Linck, Ricardo Ramos. "Conceptual modeling of formal and material relations applied to ontologies." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/108626.

Abstract:
Ontologies represent a shared conceptualization of a knowledge community. They are built from descriptions of the meanings of concepts, expressed through their attributes and their relationships. Concepts refer to the object of conceptualization, the universe of discourse, and are characterized by their attributes and domains of possible values. Relationships describe how the concepts are structured in the world. In ontologies all concepts are hierarchically defined, yet there are other relationships that are definitional, giving identity to the concepts and meaning to the world. In addition to the subsumption relationships that build taxonomies of concepts, other formal and material relations assist in structuring the domain and in conceptual definition. Modeling tools, however, are still deficient at differentiating the various types of formal and material relationships so as to enable automated reasoning. In particular, mereological and partonomic relationships lack implementation options that would allow their semantic potential to be exploited during modeling. This research project takes as its starting point a study of the literature on ontologies and relations, especially formal and material relations, including mereological and partonomic relations, reviewing the principles found in ontologies. Furthermore, we identify the theoretical foundations of these relations and analyze how the relation concepts are applied in the main foundational ontologies in use today. Building on the resulting proposals, this work then offers an alternative for the conceptual modeling of these relations in a visual domain ontology. This alternative has been made available in the ontology-building tool of the Obaitá Project, which is under development by the Intelligent Databases Research Group (BDI) at UFRGS.
11

Khlifi, Oussama [Verfasser]. "Modeling and formal verification of probabilistic reconfigurable systems / Oussama Khlifi." Saarbrücken : Saarländische Universitäts- und Landesbibliothek, 2020. http://d-nb.info/1221129384/34.

12

Moustafa, Iman Saleh. "Formal Specification and Verification of Data-Centric Web Services." Diss., Virginia Tech, 2012. http://hdl.handle.net/10919/26294.

Abstract:
In this thesis, we develop and evaluate a formal model and contracting framework for data-centric Web services. The central component of our framework is a formal specification of a common Create-Read-Update-Delete (CRUD) data store. We show how this model can be used in the formal specification and verification of both basic and transactional Web service compositions. We demonstrate through both formal proofs and empirical evaluations that our proposed framework significantly decreases ambiguity about a service, enhances its reuse, and facilitates detection of errors in service-based implementations. Web Services are reusable software components that make use of standardized interfaces to enable loosely-coupled business-to-business and customer-to-business interactions over the Web. In such environments, service consumers depend heavily on the service interface specification to discover, invoke, and synthesize services over the Web. Data-centric Web services are services whose behavior is determined by their interactions with a repository of stored data. A major challenge in this domain is interpreting the data that must be marshaled between consumer and producer systems. While the Web Services Description Language (WSDL) is currently the de facto standard for Web services, it only specifies a service operation in terms of its syntactical inputs and outputs; it does not provide a means for specifying the underlying data model, nor does it specify how a service invocation affects the data. The lack of data specification potentially leads to erroneous use of the service by a consumer. In this work, we propose a formal contract for data-centric Web services. The goal is to formally and unambiguously specify the service behavior in terms of its underlying data model and data interactions. 
We address the specification of a single service, a flow of services interacting with a single data store, and also the specification of distributed transactions involving multiple Web services interacting with different autonomous data stores. We use the proposed formal contract to decrease ambiguity about a service behavior, to fully verify a composition of services, and to guarantee correctness and data integrity properties within a transactional composition of services.
Ph. D.
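The idea of a formal CRUD contract with pre- and postconditions can be sketched as runtime assertions (a hypothetical illustration; the class and method names are invented here and are not the thesis's specification notation):

```python
# Hypothetical illustration of a CRUD contract as runtime pre/postcondition
# checks; not the thesis's formal specification language.

class CrudStore:
    """A key-value store whose four operations carry explicit contracts."""

    def __init__(self):
        self._data = {}

    def create(self, key, value):
        assert key not in self._data, "pre: key must not already exist"
        self._data[key] = value
        assert self._data[key] == value, "post: created value is readable"

    def read(self, key):
        assert key in self._data, "pre: key must exist"
        return self._data[key]

    def update(self, key, value):
        assert key in self._data, "pre: key must exist"
        before = dict(self._data)
        self._data[key] = value
        # post (frame condition): no key other than `key` was changed
        assert all(before[k] == self._data[k] for k in before if k != key)

    def delete(self, key):
        assert key in self._data, "pre: key must exist"
        del self._data[key]
        assert key not in self._data, "post: key is gone"
```

With the contract in place, misuse fails loudly: reading a deleted key raises an AssertionError instead of silently misbehaving, which is the ambiguity-reduction role the abstract attributes to a data contract.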
13

Zobair, Md Hasan. "Modeling and formal verification of a telecom system block using MDGs." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/MQ59312.pdf.

14

Pino, Lou. "A formal method for modeling and analysis of requirements for software /." Thesis, McGill University, 1993. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=69740.

Abstract:
Requirements that are well understood by a client and a provider are a major contributor to developing and supporting reliable, quality software on time and within budget. This thesis has two thrusts to facilitate improved interpretation of requirements: (1) a requirements model and (2) a new formalism called LaP, with automated tools, to express and analyze requirements. The new formalism is based on the integration of an algebraic language, Larch, and an extended-finite-state-machine-based language, Promela. Larch comes with a theorem prover (the Larch Prover), and Promela comes with a tool (SPIN) that aids in the validation of dynamic properties. The objective of LaP is to express both the control-intensive and the data-intensive aspects of requirements. The two thrusts are demonstrated by building a requirements model for real telecommunications requirements that call for a system to manage the access configurations of users' accounts in telecommunications equipment.
15

Carvalho, Fabiano Costa. "On the design of integrated modular avionics assisted by formal modeling." Instituto Tecnológico de Aeronáutica, 2009. http://www.bd.bibl.ita.br/tde_busca/arquivo.php?codArquivo=1218.

Abstract:
Avionics system manufacturers are currently facing the problem of developing highly integrated systems under economic pressure. In this scenario, the empirical approach, characterized by trial-and-error techniques, is not adequate, since the correction of design flaws is often associated with expensive re-work and schedule overruns. The evolution of airborne systems toward Integrated Modular Avionics (IMA) pushes the need for advanced methods that can enforce the correctness of complex designs while minimizing the chances of introducing errors. Considering this problem, this work proposes a systematic conceptual design strategy based on formal methods, aiming at improving the development processes for IMA systems. The basic idea is to concentrate efforts on the construction, simulation, and formal analysis of a mathematical model of the new system in the early phases of the development lifecycle. The proposed approach was exercised on a case study of a practical avionics project in order to evaluate its drawbacks and advantages. The results suggest that this work could contribute to the aeronautics industry by offering alternative means to cope with complexity in modern avionics projects.
16

Widel, Wojciech. "Formal modeling and quantitative analysis of security using attack- defense trees." Thesis, Rennes, INSA, 2019. http://www.theses.fr/2019ISAR0019.

Abstract:
Risk analysis is a very complex process. It requires a rigorous representation and an in-depth assessment of threats and countermeasures. This thesis focuses on the formal modelling of security using attack and defence trees. These are used to represent and quantify potential attacks in order to better understand the security issues that the analyzed system may face, and they therefore make it possible to guide an expert in the choice of countermeasures to be implemented to secure the system. The main contributions of this thesis are as follows: the enrichment of the attack and defence tree model to allow the analysis of real security scenarios, in particular the theoretical foundations and quantitative evaluation algorithms for the model in which an attacker's action can contribute to several attacks and a countermeasure can prevent several threats; the development of a methodology, based on Pareto dominance, that allows several quantitative aspects (e.g., cost, time, probability, difficulty, etc.) to be taken into account simultaneously during a risk analysis; and the design of a technique, using linear programming methods, for selecting an optimal set of countermeasures while respecting the budget available for protecting the analyzed system. This generic technique can be applied to several optimization problems, for example maximizing the attack surface coverage or maximizing the attacker's required investment. To ensure their practical applicability, the model and the mathematical algorithms developed were implemented in a freely available open-source tool. All the results were also validated in a practical study of an industrial scenario involving the tampering of electricity consumption meters.
17

Sakib, Ashiq Adnan. "Formal Modeling and Verification Methodologies for Quasi-Delay Insensitive Asynchronous Circuits." Diss., North Dakota State University, 2019. https://hdl.handle.net/10365/29896.

Abstract:
Pre-Charge Half Buffers (PCHB) and NULL Convention Logic (NCL) are two major commercially successful Quasi-Delay Insensitive (QDI) asynchronous paradigms, known for their low-power performance and inherent robustness. In industry, QDI circuits are synthesized from their synchronous counterparts using custom synthesis tools. Validation of the synthesized QDI implementation is a critical design prerequisite before fabrication. At present, validation schemes are mostly based on extensive simulation, which is good enough to detect shallow bugs but may fail to detect corner-case bugs. Hence, formal verification methods for QDI circuits have long been desired. The very few formal verification methods that exist in the field have major limiting factors. This dissertation presents several formal verification methodologies applicable to PCHB and NCL circuits, and aims at addressing the limitations of previous verification approaches. The developed methodologies can guarantee both safety (full functional correctness) and liveness (absence of deadlock), and are demonstrated on several increasingly larger sequential and combinational PCHB and NCL circuits, along with various ISCAS benchmarks.
National Science Foundation (Grant No. CCF-1717420)
18

Горбачев, В. А. "Malicious Hardware: characteristics, classification and formal models." Thesis, IEEE, 2014. http://openarchive.nure.ua/handle/document/3435.

Abstract:
Electronic Systems (ES) that contain embedded malicious hardware (MH) represent a serious threat, especially for government, aeronautic, financial and energy system applications. MHs can be implemented as hardware modifications to application-specific ICs (ASICs), microprocessors, digital signal processors, or as IP core modifications for field programmable gate arrays (FPGAs) [1]. They are able to turn off the CPU, send confidential information, and bypass software user-authentication mechanisms. Several characteristics distinguish this type of threat: standard testing methods, such as common functional verification and Automatic Test Pattern Generation (ATPG), cannot always be used to detect MH [2], [3]; identifying the threat sources without special tools is practically impossible; and even when an information security violation is detected, it is very difficult to prove that the action was performed by MH. These and other features make MHs very attractive embedded devices for planning electronic terrorism. Therefore, detection and prevention approaches are at the centre of attention in IT systems security research.
19

VanValkenburg, MaryAnn E. "Alloy-Guided Verification of Cooperative Autonomous Driving Behavior." Digital WPI, 2020. https://digitalcommons.wpi.edu/etd-theses/1354.

Abstract:
Alloy is a lightweight formal modeling tool that generates instances of a software specification to check properties of the design. This work demonstrates the use of Alloy for the rapid development of autonomous vehicle driving protocols. We contribute two driving protocols: a Normal protocol that represents the unpredictable yet safe driving behavior of typical human drivers, and a Connected protocol that employs connected technology for cooperative autonomous driving. Using five properties that define safe and productive driving actions, we analyze the performance of our protocols in mixed traffic. Lightweight formal modeling is a valuable way to reason about driving protocols early in the development process because it can automate the checking of safety and productivity properties and prevent costly design flaws.
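The bounded-checking idea behind Alloy, exhaustively exploring a small model and testing a property on every instance, can be imitated in a few lines of Python (a toy of this summary's own construction; the two rules below only loosely echo the Normal and Connected protocols named in the abstract, and all state names are invented):

```python
# Toy sketch of bounded checking: enumerate every reachable state of a
# tiny two-car intersection model and test a safety property on each.
# The rules only loosely echo the abstract's Normal/Connected protocols.

def step(state, connected):
    """Successor states for two cars, each at 'approach', 'inside' or
    'done'. Under the connected rule a car enters the intersection only
    if the other car is not already inside."""
    succs = []
    for i in (0, 1):
        pos = state[i]
        if pos == 'approach':
            if connected and state[1 - i] == 'inside':
                continue                   # cooperative car waits
            nxt = 'inside'
        elif pos == 'inside':
            nxt = 'done'
        else:
            continue                       # 'done': nothing left to do
        succ = list(state)
        succ[i] = nxt
        succs.append(tuple(succ))
    return succs

def collision_free(connected):
    """Explore all reachable states; the safety property is that the
    two cars are never inside the intersection at the same time."""
    seen, stack = set(), [('approach', 'approach')]
    while stack:
        state = stack.pop()
        if state in seen:
            continue
        seen.add(state)
        if state == ('inside', 'inside'):
            return False                   # counterexample found
        stack.extend(step(state, connected))
    return True
```

Here collision_free(True) holds while collision_free(False) yields a counterexample, mirroring how an exhaustive bounded search separates a cooperative protocol from an unconstrained one.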
20

Charfi, Leila. "Formal modeling and test generation automation with Use Case Maps and LOTOS." Thesis, University of Ottawa (Canada), 2001. http://hdl.handle.net/10393/9138.

Full text
Abstract:
This thesis addresses the problem of formal modelling and test generation from system requirements represented in the form of Use Case Maps. In the first part of the thesis, we present an existing development methodology based on Use Case Maps for the design of the requirements and on LOTOS and SDL for the formal modeling of telecommunication systems. We follow this methodology for the formal specification and validation of a telephony system using LOTOS. In the second part of the thesis, we develop a method, called Ucm2LotosTests, for the automatic generation of LOTOS scenarios from Use Case Maps. The obtained scenarios can be used for the verification of the LOTOS specification built from the same Use Case Maps and for conformance-testing purposes at the implementation stage. Finally, we propose a development methodology based on Use Case Maps for the design of the requirements and on LOTOS for the formal modeling of the system. In addition, this methodology offers a fast test-generation process: it proposes the use of Ucm2LotosTests for the automatic generation of LOTOS scenarios from requirements in UCM, and of TGV for the automatic generation of TTCN test suites from LOTOS. The methodology is illustrated with a case study, a telephony system providing the basic call feature.
APA, Harvard, Vancouver, ISO, and other styles
21

Čaušević, Aida. "Formal Approaches to Service-oriented Design : From Behavioral Modeling to Service Analysis." Licentiate thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-12166.

Full text
Abstract:
Service-oriented systems (SOS) have recently emerged as context-independent component-based systems. In contrast to components, services can be created, invoked, composed and destroyed at run-time. Services are assumed to be platform independent and available for use within heterogeneous applications. One of the main assets in SOS is service composability. It allows the development of composite services that reuse functionality provided by existing services in a low-cost and rapid development process at run-time. However, in such distributed systems it becomes difficult to guarantee the quality of service (QoS), both of services in isolation and of newly created service compositions. Means of checking the correctness of service compositions can enable optimization w.r.t. the function and resource usage of composed services, as well as provide a higher degree of QoS assurance of a service composition. To accomplish such goals, we employ model checking for both single and composed services. The verification eventually provides the necessary information about QoS already at an early development stage. This thesis presents the research that we have been carrying out on developing methods and tools for the specification, modeling, and formal analysis of services and service compositions in SOS. In this work, we first show how to formally check QoS in terms of performance and reliability for formally specified component-based systems (CBS). Next, we outline the commonalities and differences between SOS and CBS. Third, we develop constructs for the formal description of services using the resource-aware timed behavioral language REMES, including language support for service compositions. At last, we show how to check the correctness of services and service compositions (functional, timing- and resource-wise) by employing strongest postcondition semantics.
For less complex services and service compositions we choose to prove correctness using Hoare triples and the guarded command language. In the case of complex services described as priced timed automata (PTA), we prove correctness via algorithmic computation of the strongest postcondition of the PTA.
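The strongest-postcondition reasoning mentioned above can be illustrated extensionally over a finite state space. A minimal Python sketch follows: the variables, their small domain, and the guarded command are invented for illustration, whereas REMES/PTA-based analysis works with symbolic predicates.

```python
from itertools import product

# Predicates are modelled extensionally as sets of states; a state maps
# two variables to small integers (domain and names are illustrative).
STATES = [dict(x=x, y=y) for x, y in product(range(4), repeat=2)]

def sp_assign(pred, var, expr):
    """Strongest postcondition of `var := expr(state)` w.r.t. `pred`:
    the exact set of states reachable by executing the assignment
    from some state satisfying `pred`."""
    out = []
    for s in STATES:
        if pred(s):
            t = dict(s)
            t[var] = expr(s) % 4  # keep the result inside the toy domain
            out.append(t)
    return out

def sp_guarded(pred, guard, var, expr):
    # Guarded command `guard -> var := expr`: strengthen the
    # precondition with the guard, then take the assignment's sp.
    return sp_assign(lambda s: pred(s) and guard(s), var, expr)

# sp(x < y, x < y -> x := x + 1) should imply x <= y
post = sp_guarded(lambda s: s["x"] < s["y"],
                  lambda s: s["x"] < s["y"],
                  "x", lambda s: s["x"] + 1)
print(all(t["x"] <= t["y"] for t in post))  # True
```

The enumeration makes the semantics concrete: the strongest postcondition is not any formula implied by execution, but the exact image of the precondition under the command.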
Q-ImPreSS
APA, Harvard, Vancouver, ISO, and other styles
22

Pradalier, Sylvain. "A formal approach to the modeling, simulation and analysis of nano-devices." Phd thesis, Ecole Polytechnique X, 2009. http://tel.archives-ouvertes.fr/tel-00780567.

Full text
Abstract:
Nano-devices are molecular machines synthesized from molecular subcomponents whose functions are combined in order to perform the function of the machine. It frequently results from relative motions of subcomponents triggered by chemical events such as excitement induced by light, acidity or temperature changes. Thus the function consists in the transformation of a chemical event into a mechanical event. An important and characteristic feature of these devices is their intrinsic compositional nature. Therefore process-algebra formalisms are natural candidates for their modeling. To this aim we introduce a dialect of the κ-calculus, the nanoκ calculus. It is a rule-based language; the basic agents are molecules, with explicit representation of molecular complexations and internal states. Its stochastic semantics is governed by rules which correspond to chemical reactions. The stochastic rate of the rule, possibly infinite, corresponds to the kinetic rate of the reaction. We illustrated its relevance for the modeling and simulation of nano-devices with an example stemming from the collaboration with the chemistry department of Bologna: the [2]RaH rotaxane. We modeled it in nanoκ and simulated its behaviour under various conditions of concentration: first we validated our model by checking its correspondence with the experimental data and then we investigated extreme conditions not observable in practice. We were able to show that some classical assumptions about kinetic rates were no longer correct in this setting. The calculus has many advantages for the modelling of biochemical systems. It is in particular compact, easily reusable and modifiable and, maybe more importantly, much more biological-like and thus easier to learn for biochemists. On the other hand the π-calculus, also often used to model biochemical systems, has a much more developed theory and more available tools. We present an encoding from the nanoκ calculus to the stochastic π-calculus.
It satisfies a very strong correctness property: S →ρ T ⇔ [[S]] →ρ [[T]], where S and T are nanoκ terms, ρ is the rate of the reaction and [[·]] is the encoding. Thus it permits to use nanoκ as a front-end formalism and still get the benefits of the theory and tools of the π-calculus. We carry on with a study of the chemical master equation. It probabilistically describes the possible behaviours of the system over time as a differential equation on the probability of being in a given state at a given instant. It is a key notion in chemistry. There have been many efforts to solve it, and methods such as Gillespie's algorithm have been developed to simulate its solution. We introduce and motivate a notion of equivalence based on the chemical master equation. It equates states with similar stochastic behavior. Then we prove that this equivalence corresponds exactly to the notion of backward stochastic bisimulation. This bisimulation differs from the usual ones because it considers ingoing transitions instead of outgoing transitions. This result is worthwhile in itself since it establishes a bridge between a chemical semantics and a computer-science semantics, but it is also the first step towards a metric for biochemistry. Finally we present an unexpected consequence of our study of the nanoκ calculus. We study the relative expressiveness of the synchronous and asynchronous π-calculus. In the classical setting the latter is known to be strictly less expressive than the former. We prove that the separation also holds in the stochastic setting. We then extend the result to the π-calculi with infinite rates. We also show that under a small restriction the asynchronous π-calculus with infinite rates can encode the synchronous π-calculus without infinite rates. Interestingly, the separation results are proved using the encodability of the nanoκ calculus.
We also propose and motivate a stochastic π-calculus with rates of different orders of magnitude, the multi-scale π-calculus, to which we generalize our results. Finally we prove that in the probabilistic setting the synchronous π-calculus can be encoded into the asynchronous one.
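Gillespie's algorithm, mentioned above as the standard way to simulate solutions of the chemical master equation, can be sketched in a few lines of Python. The reversible binding reaction A + B ⇌ AB and its rates below are invented for illustration.

```python
import random

# Gillespie's stochastic simulation algorithm for a toy reversible
# binding reaction A + B <-> AB. Each iteration samples the waiting
# time to the next reaction, then picks which reaction fires with
# probability proportional to its propensity.
def gillespie(a, b, ab, k_on=1.0, k_off=0.1, t_end=10.0, seed=42):
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        props = [k_on * a * b, k_off * ab]  # propensities in current state
        total = sum(props)
        if total == 0:
            break
        t += rng.expovariate(total)          # exponential waiting time
        if rng.random() * total < props[0]:  # binding fires
            a, b, ab = a - 1, b - 1, ab + 1
        else:                                # unbinding fires
            a, b, ab = a + 1, b + 1, ab - 1
    return a, b, ab

a, b, ab = gillespie(50, 50, 0)
print(a + ab, b + ab)  # mass conservation: both sums stay at 50
```

A trajectory produced this way is one sample from the distribution the chemical master equation describes; averaging many runs approximates its solution.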
APA, Harvard, Vancouver, ISO, and other styles
23

Chen, Wei. "Formal Modeling and Automatic Generation of Test Cases for the Autonomous Vehicle." Electronic Thesis or Diss., université Paris-Saclay, 2020. http://www.theses.fr/2020UPASG002.

Full text
Abstract:
Autonomous vehicles mainly rely on an intelligent system pilot to achieve self-driving. They combine a variety of sensors (cameras, radars, lidars, ...) to perceive their surroundings. The perception algorithms of Automated Driving Systems (ADSs) provide observations of the environmental elements based on the data provided by the sensors, while decision algorithms generate the actions to be implemented by the vehicle. ADSs are therefore safety-critical systems whose failures can have catastrophic consequences. To ensure the safety of such systems, it is necessary to specify, validate and secure the dependability of the architecture and the behavioural logic of the ADSs running on the vehicle for all the situations the vehicle will meet. These situations are described and generated as different test cases. The objective of this thesis is to develop a complete approach allowing the conceptualization and characterization of execution contexts of the autonomous vehicle, and the formal modelling of test cases in the highway context. Finally, this approach has to allow automatic generation of the test cases that have an impact on the performance and dependability of the vehicle. In this thesis, we propose a three-layer test-case generation methodology. The first layer includes all static and mobile concepts of three ontologies we define in order to conceptualize and characterize the driving environment for the construction of test cases: a highway ontology and a weather ontology to specify the environment in which the autonomous vehicle evolves, and a vehicle ontology which consists of the vehicle lights and the control actions. Each concept of these ontologies is defined in terms of an entity, sub-entities and properties. The second layer includes the interactions between the entities of the defined ontologies.
We use first-order logic equations to represent the relationships between these entities. The third and last layer is dedicated to test-case generation, which is based on the process algebra PEPA (Performance Evaluation Process Algebra); PEPA is used to model the situations described by the test cases. Our approach allows us to generate the test cases automatically and to identify the critical ones. We can generate test cases from any initial situation and with any number of scenes. Finally, we propose a method to calculate the criticality of each test case, so the importance of a test case can be comprehensively evaluated by its criticality and its probability of occurrence.
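A minimal sketch of the criticality-and-probability ranking idea described above follows; the test cases, their scores, and the product-based importance measure are invented for illustration (the thesis defines its own criticality calculation).

```python
# Rank hypothetical highway test cases by an importance score that
# combines criticality with probability of occurrence.
test_cases = [
    {"name": "cut-in at high speed", "criticality": 0.9, "probability": 0.05},
    {"name": "slow lead vehicle",    "criticality": 0.3, "probability": 0.60},
    {"name": "fog with lane change", "criticality": 0.7, "probability": 0.10},
]

for tc in test_cases:
    # illustrative measure: importance = criticality * probability
    tc["importance"] = tc["criticality"] * tc["probability"]

ranked = sorted(test_cases, key=lambda tc: tc["importance"], reverse=True)
print([tc["name"] for tc in ranked])
```

A highly critical but rare scenario can thus rank below a moderately critical but frequent one, which is why both dimensions are evaluated together.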
APA, Harvard, Vancouver, ISO, and other styles
24

Chrszon, Philipp, Clemens Dubslaff, Christel Baier, Joachim Klein, and Sascha Klüppelholz. "Modeling Role-Based Systems with Exogenous Coordination." Springer, 2016. https://tud.qucosa.de/id/qucosa%3A70791.

Full text
Abstract:
The concept of roles is a promising approach to cope with context dependency and adaptivity of modern software systems. While roles have been investigated in conceptual modeling, programming languages and multi-agent systems, they have been given little consideration within component-based systems. In this paper, we propose a hierarchical role-based approach for modeling relationships and collaborations between components. In particular, we consider the channel-based, exogenous coordination language Reo and discuss possible realizations of roles and related concepts. The static requirements on the binding of roles are modeled by rule sets expressed in many-sorted second-order logic and annotations on the Reo networks for role binding, context and collaborations, while Reo connectors are used to model the coordination of runtime role playing. The ideas presented in this paper may serve as a basis for the formalization and formal analysis of role-based software systems.
APA, Harvard, Vancouver, ISO, and other styles
25

Mendling, Jan, Henrik Leopold, and Fabian Pittke. "25 Challenges of Semantic Process Modeling." Gitice, 2014. http://epub.wu.ac.at/5983/1/6%2D11%2D1%2DSM.pdf.

Full text
Abstract:
Process modeling has become an essential part of many organizations for documenting, analyzing and redesigning their business operations and for supporting them with suitable information systems. In order to serve this purpose, it is important for process models to be well grounded in formal and precise semantics. While the behavioural semantics of process models are well understood, there is a considerable gap in research into the semantic aspects of their text labels and natural language descriptions. The aim of this paper is to make this research gap more transparent. To this end, we clarify the role of textual content in process models and the challenges that are associated with the interpretation, analysis, and improvement of their natural language parts. More specifically, we discuss particular use cases of semantic process modeling to identify 25 challenges. For each challenge, we identify prior research and discuss directions for addressing it.
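One concrete instance of the label-analysis challenges discussed above is normalizing activity labels into an action and a business object. A hedged Python sketch covering two common labeling styles follows; the conventions and the noun-to-verb table are simplified assumptions, not the paper's method.

```python
# Map a few nominalized action nouns back to their verbs
# (a tiny illustrative table; real approaches use lexical resources).
NOMINALIZATIONS = {"creation": "create", "approval": "approve", "analysis": "analyze"}

def parse_label(label):
    """Split an activity label into (action, business object) for the
    verb-object style ("Approve invoice") and the action-noun style
    ("Invoice approval")."""
    words = label.lower().split()
    first, last = words[0], words[-1]
    if last in NOMINALIZATIONS:                      # "invoice approval"
        return NOMINALIZATIONS[last], " ".join(words[:-1])
    return first, " ".join(words[1:])                # "approve invoice"

print(parse_label("Approve invoice"))   # ('approve', 'invoice')
print(parse_label("Invoice approval"))  # ('approve', 'invoice')
```

Even this toy version shows why the problem is hard: without knowing the labeling style, "Invoice approval" and "Approve invoice" look structurally different although they denote the same activity.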
APA, Harvard, Vancouver, ISO, and other styles
26

Venugopal, Manu. "Formal specification of industry foundation class concepts using engineering ontologies." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42868.

Full text
Abstract:
Architecture, Engineering, Construction (AEC) and Facilities Management (FM) involve domains that require a very diverse set of information and model exchanges to fully realize the potential of Building Information Modeling (BIM). Industry Foundation Classes (IFC) provides a neutral and open schema for interoperability. Model View Definitions (MVDs) provide a common subset for specifying the exchanges using IFC, but are expensive to build, test and maintain. A semantic analysis of the IFC data schema illustrates the complexities of embedding semantics in model views. A software engineering methodology based on formal specification of shared resources, reusable components and standards applicable to the AEC-FM industry is adopted in this research for the development of a Semantic Exchange Module (SEM) structure for the IFC schema. This SEM structure is based on engineering ontologies that are capable of supporting more consistent MVDs. In this regard, an ontology is considered a machine-readable set of definitions that create a taxonomy of classes and subclasses, and relationships between them. Typically, the ontology contains the hierarchical description of important entities that are used in IFC, along with their properties and business rules. This model of an ontological framework, similar to that of the Semantic Web, makes the IFC more formal and consistent, as it is capable of providing precise definitions of terms and vocabulary. The outcome of this research, a formal classification structure for IFC implementations in the domain of the Precast/Prestressed Concrete Industry, when implemented by software developers, provides the mechanism for applications such as modular MVDs, smart and complex querying of product models, and transaction-based services, based on the idea of testable and reusable SEMs.
It can be extended and also helps in the consistent implementation of rule languages across different domains within AEC-FM, making data sharing across applications simpler, with limited rework. This research is expected to impact the overall interoperability of applications in the BIM realm.
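The taxonomy-of-classes idea behind such an ontology can be illustrated with a minimal Python sketch over a simplified fragment of the IFC class hierarchy (only the subclass relation is modelled; properties and business rules are omitted).

```python
# A toy machine-readable taxonomy: each entry maps a class to its
# direct superclass, following a simplified slice of the IFC hierarchy.
SUPERCLASS = {
    "IfcBeam": "IfcBuildingElement",
    "IfcColumn": "IfcBuildingElement",
    "IfcBuildingElement": "IfcElement",
    "IfcElement": "IfcProduct",
}

def is_a(cls, ancestor):
    """True if `cls` equals `ancestor` or is a transitive subclass of it."""
    while cls is not None:
        if cls == ancestor:
            return True
        cls = SUPERCLASS.get(cls)
    return False

print(is_a("IfcBeam", "IfcProduct"))  # True
print(is_a("IfcProduct", "IfcBeam"))  # False
```

Queries like this one are the building blocks of the "smart and complex querying of product models" the abstract mentions: subsumption over the taxonomy decides which entities a model view applies to.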
APA, Harvard, Vancouver, ISO, and other styles
27

Modica, Tony [Verfasser], and Hartmut [Akademischer Betreuer] Ehrig. "Formal Modeling, Simulation, and Validation of Communication Platforms / Tony Modica. Betreuer: Hartmut Ehrig." Berlin : Universitätsbibliothek der Technischen Universität Berlin, 2012. http://d-nb.info/1028072295/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Ivanov, Dinko. "Integrating formal analysis techniques into the Progress-IDE." Thesis, Mälardalens högskola, Akademin för innovation, design och teknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-13884.

Full text
Abstract:
In this thesis we contribute to the Progress IDE, an integrated development environment for real-time embedded systems, and more precisely to the REMES toolchain, a set of tools enabling the construction and analysis of embedded-system behavior models. The contribution aims to facilitate the formal analysis of behavioral models, so that certain extra-functional properties can be verified during early stages of development. Previous work in the field proposes the use of the Priced Timed Automata framework for the verification of such properties. The thesis outlines the main points where the current toolchain should be extended in order to allow formal analysis of modeled components. The result of the work is a prototype which minimizes the manual effort of the system designer through model-to-model transformations and provides seamless integration with existing tools for formal analysis.
APA, Harvard, Vancouver, ISO, and other styles
29

Lee, Ghang. "A new formal and analytical process to product modeling (PPM) method and its application to the precast concrete industry." Diss., Available online, Georgia Institute of Technology, 2004:, 2004. http://etd.gatech.edu/theses/available/etd-10262004-191554/unrestricted/lee%5Fghang%5F200412%5Fphd.pdf.

Full text
Abstract:
Thesis (Ph. D.)--Architecture, Georgia Institute of Technology, 2005.
Eastman, Charles M., Committee Chair ; Augenbroe, Godfried, Committee Co-Chair ; Navathe, Shamkant B., Committee Co-Chair ; Hardwick, Martin, Committee Member ; Sacks, Rafael, Committee Member. Vita. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
30

Pamplin, Jason Andrew. "Formal Object Interaction Language: Modeling and Verification of Sequential and Concurrent Object-Oriented Software." unrestricted, 2007. http://etd.gsu.edu/theses/available/etd-04222007-205349/.

Full text
Abstract:
Thesis (Ph. D.)--Georgia State University, 2007.
Title from file title page. Ying Zhu, committee chair; Xiaolin Hu, Geoffrey Hubona, Roy Johnson, Rajshekhar Sunderraman, committee members. Electronic text (216 p. : ill. (some col.)) : digital, PDF file. Description based on contents viewed Nov. 29, 2007. Includes bibliographical references (p. 209-216).
APA, Harvard, Vancouver, ISO, and other styles
31

Čaušević, Aida. "Formal Approaches for Behavioral Modeling and Analysis of Design-time Services and Service Negotiations." Doctoral thesis, Mälardalens högskola, Inbyggda system, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-23271.

Full text
Abstract:
During the past decade service-orientation has become a popular design paradigm, offering an approach in which services are the functional building blocks. Services are self-contained units of composition, built to be invoked, composed, and destroyed on (user) demand. Service-oriented systems (SOS) are a collection of services that are developed based on several design principles, such as: (i) loose coupling between services (e.g., inter-service communication can involve either simple data passing or two or more connected services coordinating some activity), which allows services to be independent yet highly interoperable when required; (ii) service abstraction, which emphasizes the need to hide as many implementation details as possible, while still exposing functional and extra-functional capabilities that can be offered to service users; (iii) service reusability, provided by the existing services in a rapid and flexible development process; (iv) service composability, as one of the main assets of SOS, providing a design platform on which services can be composed and decomposed; etc. One of the main concerns in such systems is ensuring service quality per se, but also guaranteeing the quality of newly composed services. To accomplish the above, we consider two system perspectives: the developer's and the user's view, respectively. In the former, one can be assumed to have access to the internal service representation: functionality, enabled actions, resource usage, and interactions with other services. In the latter, one has information primarily on the service interface and exposed capabilities (attributes/features). Means of checking that services and service compositions meet the expected requirements, the so-called correctness issue, can enable optimization and the possibility to guarantee a satisfactory level of service composition quality.
In order to accomplish exhaustive correctness checks of design-time SOS, we employ model checking as the main formal verification technique, which eventually provides the necessary information about quality of service (QoS) already at early stages of system development. As opposed to the traditional approach of software system construction, in SOS the same service may be offered at various prices, QoS levels, and other conditions, depending on the user needs. In such a setting, the interaction between the involved parties requires the negotiation of what is possible at request time, aiming at meeting needs on demand. The service negotiation process often proceeds under timing, price, and resource constraints, under which users and providers exchange information on their respective goals, until reaching a consensus. Hence, a mathematically driven technique to analyze a priori the various ways to achieve such goals is beneficial for understanding what particular goals can be achieved and how. This thesis presents the research that we have been carrying out over the past few years, which has resulted in methods and tools for the specification, modeling, and formal analysis of services and service compositions in SOS.
The contributions of the thesis consist of: (i) constructs for the formal description of services and service compositions using the resource-aware timed behavioral language REMES; (ii) deductive and algorithmic approaches for checking the correctness of services and service compositions; (iii) a model of service negotiation that includes different negotiation strategies, formally analyzed against timing and resource constraints; (iv) a tool-chain (REMES SOS IDE) that provides an editor and verification support (by integration with the UPPAAL model checker) for REMES-based service-oriented designs; (v) a relevant case study by which we exercise the applicability of our framework. The presented work has also been applied to other smaller examples presented in the published papers.
Contesse
APA, Harvard, Vancouver, ISO, and other styles
32

Gonçalves, Monteiro Pedro Tiago. "Towards an integrative approach for the modeling and formal verification of biological regulatory networks." Thesis, Lyon 1, 2010. http://www.theses.fr/2010LYO10239/document.

Full text
Abstract:
The study of large models of biological networks by means of analysis and simulation tools leads to large amounts of predictions. This raises the question of how to identify interesting predictions of novel phenomena that can be confronted with experimental data. Formal verification techniques based on model checking have recently been applied to the analysis of these networks, providing a powerful technology to keep up with this increase in scale and complexity. The application of these techniques is hampered, however, by several key issues. First, the systems biology domain has brought to the fore a few properties of network dynamics, such as multistability and oscillations, that are not easily expressed using classical temporal logics. Second, posing relevant and interesting questions in temporal logic is difficult for non-expert users. Finally, most existing modeling and simulation tools are not capable of applying model-checking techniques in a transparent way. The approaches developed in this work lower the obstacles to the use of formal verification in systems biology. They have been validated on the analysis and simulation of two real and complex biological models.
APA, Harvard, Vancouver, ISO, and other styles
33

Pavawalla, Shital Prabodh. "Prospective memory following moderate to severe traumatic brain injury: a formal multinomial modeling approach." Pullman, Wash. : Washington State University, 2009. http://www.dissertations.wsu.edu/Dissertations/Summer2009/s_pavawalla_071909.pdf.

Full text
Abstract:
Thesis (Ph. D. in psychology)--Washington State University, August 2009.
Title from PDF title page (viewed on Aug. 19, 2009). "Department of Psychology." Includes bibliographical references (p. 32-36).
APA, Harvard, Vancouver, ISO, and other styles
34

Suhaib, Syed Mohammed. "XFM: An Incremental Methodology for Developing Formal Models." Thesis, Virginia Tech, 2004. http://hdl.handle.net/10919/9905.

Full text
Abstract:
We present an agile formal method named eXtreme Formal Modeling (XFM), recently developed by us and based on Extreme Programming concepts, for constructing abstract models from a natural language specification of a complex system. In particular, we focus on Prescriptive Formal Models (PFMs) that capture the specification of the system under design in a mathematically precise manner. Such models can be used as golden reference models for formal verification, test generation, etc. The methodology incrementally builds PFMs by adding user stories (expressed as LTL formulae), gleaned from the natural language specifications, one by one into the model. XFM retains correctness with respect to the incrementally added properties by regressively model checking all the LTL properties captured so far in the model. We illustrate XFM with a graded set of examples including a traffic light controller, a DLX pipeline, and a Smart Building control system. To make the regressive model-checking steps feasible with current model-checking tools, we need to keep the model size increments under control. We therefore analyze the effects of ordering LTL properties in XFM, comparing three property-ordering methodologies: 'arbitrary ordering', 'property based ordering', and 'predicate based ordering'. We experiment on models of the ISA bus monitor and the arbitration phase of the Pentium Pro bus, and show experimentally and through mathematical reasoning that predicate based ordering is the best of the three. Finally, we present a GUI-based toolbox for users to build PFMs using XFM.
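The regressive model-checking loop at the heart of XFM can be sketched as follows; the toy "model" and the `check` stub are placeholders for a real state machine and a real model checker, not the XFM implementation.

```python
def extend_model(model, prop):
    """Toy model refinement: the 'model' is just the set of LTL
    properties it satisfies (a stand-in for refining a real model)."""
    return model | {prop}

def regressive_check(model, properties, check):
    """Add user stories one by one; after each addition, regressively
    re-check every property captured so far."""
    accepted = []
    for prop in properties:
        model = extend_model(model, prop)
        # Regression step: all earlier properties must still hold.
        for p in accepted + [prop]:
            if not check(model, p):
                raise ValueError(f"{p!r} no longer holds after adding {prop!r}")
        accepted.append(prop)
    return model, accepted

model, ok = regressive_check(set(),
                             ["G !deadlock", "G (req -> F grant)"],
                             check=lambda m, p: p in m)
print(ok)  # both user stories accepted
```

The cost of the inner loop is what makes property ordering matter: each new story triggers a re-check of everything accepted before it.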
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
35

Ghosh, Krishnendu. "Formal Analysis of Automated Model Abstractions under Uncertainty: Applications in Systems Biology." University of Cincinnati / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1330024977.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Kühn, Thomas, Kay Bierzynski, Sebastian Richly, and Uwe Aßmann. "FRaMED: Full-Fledge Role Modeling Editor (Tool Demo)." ACM, 2016. https://tud.qucosa.de/id/qucosa%3A75117.

Full text
Abstract:
Since 1977, role modeling has been continuously investigated as a promising paradigm for modeling complex, dynamic systems. However, this research has had almost no influence on the design of today's increasingly complex and context-sensitive software systems. The reason for this is twofold. First, most modeling languages focused on either the behavioral, relational, or context-dependent nature of roles rather than combining them. Second, there is a lack of tool support for the design, validation, and generation of role-based software systems. In particular, there exists no graphical role modeling editor supporting the three natures as well as the various proposed constraints. To overcome this deficiency, we introduce the Full-fledged Role Modeling Editor (FRaMED), a graphical modeling editor embracing all natures of roles and the modeling constraints, featuring generators for a formal representation and for source code of a role-based programming language. To show its applicability to the development of role-based software systems, an example from the banking domain is employed.
APA, Harvard, Vancouver, ISO, and other styles
37

Pow, Jacky W. C. "A study of formal modeling for sharing the experience of using ICT in university teaching." Thesis, University of Nottingham, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.289436.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Hamadi, Rachid Computer Science &amp Engineering Faculty of Engineering UNSW. "Formal Composition and Recovery Policies in Service-Based Business Processes." Awarded by:University of New South Wales. Computer Science and Engineering, 2005. http://handle.unsw.edu.au/1959.4/20666.

Full text
Abstract:
Process-based composition of Web services is emerging as a promising technology for the effective automation of integrated and collaborative applications. As Web services are often autonomous and heterogeneous entities, coordinating their interactions to build complex processes is a difficult, error-prone, and time-consuming task. In addition, since Web services usually operate in dynamic and highly evolving environments, there is a need to support flexible and correct execution of integrated processes. In this thesis, we propose a Petri net-based framework for formal composition and recovery policies in service-based business processes. We first propose an algebra for composing Web services, whose formal semantics is expressed in terms of Petri nets. The use of a formal model allows the effective verification and analysis of properties both within a service, such as termination and absence of deadlock, and between services, such as behavioral equivalences. We also develop a top-down approach for the correct (e.g., deadlock-free and terminating) composition of complex business processes, defining a set of refinement operators that guarantee correctness of the resulting business process nets at design time. We then introduce the Self-Adaptive Recovery Net (SARN), an extended Petri net model for specifying exceptional behavior in business processes. SARN adapts the structure of the underlying Petri net at run time to handle exceptions while keeping the Petri net design simple. The proposed framework caters for the specification of high-level recovery policies that are incorporated either with a single task or with a set of tasks, called a recovery region. Finally, we propose a pattern-based approach to dynamically restructure SARN. These patterns capture the ways past exceptions have been dealt with; the objective is to continuously restructure recovery regions within the SARN model to minimize the impact of exception handling.
To illustrate the viability of the proposed composition and exception handling techniques, we have developed HiWorD (HIerarchical WORkflow Designer), a hierarchical Petri net-based business process modeling and simulation tool.
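The token-firing rule on which such Petri net compositions rest can be sketched minimally; the two-transition net below (a sequence, standing in for the algebra's sequential-composition operator) is invented for illustration and is not the thesis's formalism.

```python
class PetriNet:
    """Minimal place/transition net.
    transitions: name -> (set of input places, set of output places)."""
    def __init__(self, transitions, marking):
        self.t = {k: (set(i), set(o)) for k, (i, o) in transitions.items()}
        self.m = dict(marking)  # place -> token count

    def enabled(self, name):
        ins, _ = self.t[name]
        return all(self.m.get(p, 0) > 0 for p in ins)

    def fire(self, name):
        """Consume one token per input place, produce one per output."""
        ins, outs = self.t[name]
        assert self.enabled(name), f"{name} not enabled"
        for p in ins:
            self.m[p] -= 1
        for p in outs:
            self.m[p] = self.m.get(p, 0) + 1

# Sequential composition of two steps: t1 must fire before t2.
net = PetriNet({"t1": ({"p0"}, {"p1"}), "t2": ({"p1"}, {"p2"})},
               {"p0": 1})
net.fire("t1")
net.fire("t2")
print(net.m)  # {'p0': 0, 'p1': 0, 'p2': 1}: the token reached the end
```

Termination and deadlock-freedom checks then amount to exploring which firing sequences are possible from the initial marking.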
APA, Harvard, Vancouver, ISO, and other styles
39

Grover, Russell J. "An Exploration of Formal Methods and Tools Applied to a Small Satellite Software System." DigitalCommons@USU, 2010. https://digitalcommons.usu.edu/etd/743.

Full text
Abstract:
Formal system modeling has been a topic of interest in the research community for many years. Modeling a system helps engineers understand it better and enables them to check different aspects of it to ensure that there is no undesired or unexpected behavior and that it does what it was designed to do. This thesis takes two existing tools that were created to aid in the design of spacecraft systems and creates a layer to connect them together and allow them to be used jointly. The first tool is a library of formal descriptions used to specify spacecraft behavior in an unambiguous manner. The second tool is a graphical modeling language that allows a designer to create a model using traditional block diagram descriptions. These block diagrams can be translated to the formal descriptions using the layer created as part of this thesis work. The software of a small satellite, and the additions made to it as part of this thesis work, are also described. Approaches to modeling this software formally are discussed, as are the problems encountered that led to expanding the formal description library to allow better system description.
APA, Harvard, Vancouver, ISO, and other styles
40

Chang, Lily. "A Nested Petri Net Framework for Modeling and Analyzing Multi-Agent Systems." FIU Digital Commons, 2011. http://digitalcommons.fiu.edu/etd/339.

Full text
Abstract:
In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although there were many conceptual frameworks for using multi-agent systems, there was no well-established and widely accepted method for modeling multi-agent systems. This dissertation research addressed the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective was to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of design. Given that there was no well-defined formal model directly supporting agent-oriented modeling, this study was centered on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model checking technology. PrT nets were extended to include the notions of dynamic structure, agent communication and coordination to support agent-oriented modeling. An aspect-oriented technique was developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules was defined to systematically translate formal MAS models to concrete models that can be verified through the model checker SPIN (Simple Promela Interpreter). This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets, and a comprehensive methodology to guide the construction and analysis of formal MAS models.
APA, Harvard, Vancouver, ISO, and other styles
41

Robol, Marco. "Consent modeling and verification: privacy regulations compliance from business goals to business processes." Doctoral thesis, Università degli studi di Trento, 2020. http://hdl.handle.net/11572/277802.

Full text
Abstract:
Privacy regulations impose on companies limitations on the collection, use, and disclosure of user data. One of the actions most companies undertake in response consists of modifying their systems with processes for consent acquisition and management. Unfortunately, systems that are large and have many dependencies often also have little documentation, and knowledge about the system is distributed among different domain experts. These circumstances make the re-engineering of systems a tedious and complex, if not impossible, activity. This PhD thesis proposes a model-based method with a top-down approach for modeling consent requirements and analyzing compliance with regulations, by refinement of models from the organizational structure down to business processes. The method is provided with guidelines in the form of a process, and includes modeling languages and reasoning frameworks for the analysis of requirements with respect to a preset of privacy principles on consent. The thesis includes validations with realistic scenarios and with practitioners from the healthcare domain.
APA, Harvard, Vancouver, ISO, and other styles
42

Nguyen, Vu. "A Deontic Analysis of Inter-Organizational Control Requirements." FIU Digital Commons, 2008. http://digitalcommons.fiu.edu/etd/69.

Full text
Abstract:
This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures. Underpinning most inter-organizational procedures are deontic relations, which concern the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor the counterparties' rights; thus controls may not be needed. The challenge lies in cases where trust may not be assumed, in which the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation. The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method takes a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate the control requirements that a documentary procedure needs to satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, namely Deontic Petri Nets, which combine multiple modal logics and Petri nets for modeling deontic processes; a set of control principles that represent an initial formal theory on the relationships between deontic processes and documentary procedures; and a working prototype that uses model checking to identify fraud potentials in a deontic process and generate control requirements to limit them.
Fourteen scenarios of two well-known international payment procedures -- cash in advance and documentary credit -- have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
APA, Harvard, Vancouver, ISO, and other styles
43

Chrszon, Philipp, Clemens Dubslaff, Sascha Klüppelholz, and Christel Baier. "Family-Based Modeling and Analysis for Probabilistic Systems." Springer, 2016. https://tud.qucosa.de/id/qucosa%3A70790.

Full text
Abstract:
Feature-based formalisms provide an elegant way to specify families of systems that share a base functionality and differ in certain features. They can also facilitate an all-in-one analysis, where all systems of the family are analyzed at once on a single family model instead of one by one. This paper presents the basic concepts of the tool ProFeat, which provides a guarded-command language for modeling families of probabilistic systems and an automatic translation of family models to the input language of the probabilistic model checker PRISM. This translational approach enables a family-based quantitative analysis with PRISM. Besides modeling families of systems that differ in system parameters such as the number of identical processes or channel sizes, ProFeat also provides special support for the modeling and analysis of (probabilistic) product lines with dynamic feature switches, multi-features and feature attributes. By means of several case studies we show how ProFeat eases family-based modeling and compare the one-by-one and all-in-one analysis approaches.
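The all-in-one idea can be sketched outside PRISM: evaluate a quantitative measure for every feature combination of a family in one sweep instead of building one model per member. The feature names and reliability numbers below are hypothetical, standing in for results a probabilistic model checker would compute.

```python
from itertools import product

features = ["encryption", "retry"]

def reliability(config):
    """Toy quantitative measure for one family member
    (hypothetical numbers, not a real model-checking result)."""
    r = 0.90
    if config["encryption"]:
        r += 0.05
    if config["retry"]:
        r += 0.04
    return r

# All-in-one analysis: sweep every feature combination in one pass,
# keyed by the set of enabled features.
family = {
    tuple(sorted(f for f, on in cfg.items() if on)): reliability(cfg)
    for cfg in (dict(zip(features, bits))
                for bits in product([False, True], repeat=len(features)))
}
print(family)
```

With n independent features the family has 2^n members, which is exactly why analyzing one shared family model beats enumerating members one by one.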
APA, Harvard, Vancouver, ISO, and other styles
44

Eriksson, Lundström Jenny S. Z. "On the Formal Modeling of Games of Language and Adversarial Argumentation : A Logic-Based Artificial Intelligence Approach." Doctoral thesis, Uppsala universitet, Institutionen för informationsvetenskap, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-9538.

Full text
Abstract:
Argumentation is a highly dynamical and dialectical process drawing on human cognition, and successful argumentation is ubiquitous in human interaction. Comprehensive formal modeling and analysis of argumentation presupposes a dynamical approach to three phenomena: the deductive logic notion, the dialectical notion, and the cognitive notion of justified belief. At each step of an argumentation, these phenomena form networks of rules that determine which propositions are allowed to make sense as admissible, acceptable, and accepted. We present a formal logic framework for a computational account of the formal modeling and systematic analysis of the dynamical, exhaustive, and dialectical aspects of adversarial argumentation and dispute. Our approach addresses the mechanisms of admissibility, acceptability, and acceptance of arguments in adversarial argumentation by means of metalogic representation and Artificial Intelligence techniques for dynamical problem solving by exhaustive search. We elaborate a common framework of board games and argumentation games for pursuing the alternatives facing the adversaries in the argumentation process, conceived as a game. The analogy to chess is beneficial, as chess incorporates strategic and tactical operations just as argumentation does. Drawing on this analogy to board games, the state space representation, well researched in Artificial Intelligence, allows all possible arguments to be treated as paths in a directed state space graph. It renders the game leading to the most wins and fewest losses, identifying the most effective game strategy. As an alternate visualization, the traversal of the state space graph unravels and collates knowledge about the given situation or case under dispute. Including the private knowledge of the two parties, the traversal results in increased knowledge of the case and of the perspectives and arguments of the participants.
As we adopt metalogic as the formal basis, arguments used in the argumentation, expressed in a non-monotonic defeasible logic, are encoded as terms in the logical argumentation analysis system. The advantage of a logical formalization of argumentation is that it provides a symbolic knowledge representation with a formally well-formed semantics, making the represented knowledge, as well as the reasoning behavior of knowledge representation systems, comprehensible. Computational logic as represented in Horn clauses allows for the expression of substantive propositions in a logical structure. The non-monotonic nature of defeasible logic stresses the representational issues, i.e., what it is possible to capture in non-monotonic reasoning, while the (meta)logic program establishes what it is sound to compute and how to regard the semantics of this computation.
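The board-game analogy can be made concrete with an exhaustive minimax search over a tiny argumentation tree; the moves and the win condition below are invented for illustration and stand in for the metalogic machinery the thesis describes.

```python
def minimax(state, maximizing, moves, utility):
    """Exhaustive state-space search, as in the chess analogy:
    the proponent maximizes, the opponent minimizes."""
    options = moves(state)
    if not options:
        return utility(state)
    vals = [minimax(s, not maximizing, moves, utility) for s in options]
    return max(vals) if maximizing else min(vals)

# Toy argumentation game: a state is the tuple of moves played so far.
tree = {
    (): ["claim"],
    ("claim",): ["attack1", "attack2"],
    ("claim", "attack1"): ["rebut"],
}
moves = lambda s: [s + (m,) for m in tree.get(s, [])]
# The proponent moves first; the side that makes the last move (leaving
# the adversary with no reply) wins.
utility = lambda s: 1 if len(s) % 2 == 1 else -1
print(minimax((), True, moves, utility))  # -1: attack2 goes unanswered
```

The traversal visits every path in the directed state-space graph, so it also doubles as the "alternate visualization" the abstract mentions: each leaf is one complete way the dispute can unfold.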
APA, Harvard, Vancouver, ISO, and other styles
45

Gladigau, Jens [Verfasser], and Teich [Akademischer Betreuer] Jürgen. "Combining Formal Model-Based System-Level Design with SystemC Transaction Level Modeling / Jens Gladigau. Betreuer: Teich Jürgen." Erlangen : Universitätsbibliothek der Universität Erlangen-Nürnberg, 2012. http://d-nb.info/1028958757/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Eriksson, Lundström Jenny. "On the formal modeling of games of language and adversarial argumentation : a logic-based artificial intelligence approach /." Uppsala : Uppsala universitet, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-9538.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Gonçalves, Marcos André. "Streams, Structures, Spaces, Scenarios, and Societies (5S): A Formal Digital Library Framework and Its Applications." Diss., Virginia Tech, 2004. http://hdl.handle.net/10919/29942.

Full text
Abstract:
Digital libraries (DLs) are complex information systems and therefore demand formal foundations lest development efforts diverge and interoperability suffer. In this dissertation, we propose the fundamental abstractions of Streams, Structures, Spaces, Scenarios, and Societies (5S), which allow us to define digital libraries rigorously and usefully. Streams are sequences of arbitrary items used to describe both static and dynamic (e.g., video) content. Structures can be viewed as labeled directed graphs, which impose organization. Spaces are sets with operations that obey certain constraints. Scenarios consist of sequences of events or actions that modify states of a computation in order to accomplish a functional requirement. Societies are sets of entities and activities, and the relationships among them. Together these abstractions provide a formal foundation to define, relate, and unify concepts, among others, of digital objects, metadata, collections, and services, required to formalize and elucidate "digital libraries". A digital library theory based on 5S is defined by proposing a formal ontology that defines the fundamental concepts, relationships, and axiomatic rules that govern the DL domain. The ontology is an axiomatic, formal treatment of DLs, which distinguishes it from other approaches that informally define a number of architectural invariants. The applicability, versatility, and unifying power of the 5S theory are demonstrated through its use in a number of distinct applications including: 1) building and interpreting a DL taxonomy; 2) informal and formal analysis of case studies of digital libraries (NDLTD and OAI); 3) utilization as a formal basis for a DL description language, digital library visualization and generation tools, and a log format specific to DLs; and 4) defining a quality model for DLs.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
48

Krause, Christian, and Holger Giese. "Quantitative modeling and analysis of service-oriented real-time systems using interval probabilistic timed automata." Universität Potsdam, 2012. http://opus.kobv.de/ubp/volltexte/2012/5784/.

Full text
Abstract:
One of the key challenges in service-oriented systems engineering is the prediction and assurance of non-functional properties, such as the reliability and the availability of composite interorganizational services. Such systems are often characterized by a variety of inherent uncertainties, which must be addressed in the modeling and the analysis approach. The different relevant types of uncertainties can be categorized into (1) epistemic uncertainties due to incomplete knowledge and (2) randomization as explicitly used in protocols or as a result of physical processes. In this report, we study a probabilistic timed model which allows us to quantitatively reason about non-functional properties for a restricted class of service-oriented real-time systems using formal methods. To properly motivate the choice of approach, we devise a requirements catalogue for the modeling and analysis of probabilistic real-time systems with uncertainties and provide evidence that the uncertainties of type (1) and (2) in the targeted systems have a major impact on the used models and require distinguished analysis approaches. The formal model we use in this report is Interval Probabilistic Timed Automata (IPTA). Based on the outlined requirements, we give evidence that this model provides both enough expressiveness for a realistic and modular specification of the targeted class of systems, and suitable formal methods for analyzing properties, such as safety and reliability properties, in a quantitative manner. As technical means for the quantitative analysis, we build on probabilistic model checking, specifically on probabilistic time-bounded reachability analysis and computation of expected reachability rewards and costs. To carry out the quantitative analysis using probabilistic model checking, we developed an extension of the Prism tool for modeling and analyzing IPTA.
Our extension of Prism introduces a means for modeling probabilistic uncertainty in the form of probability intervals, as required for IPTA. For analyzing IPTA, our Prism extension moreover adds support for probabilistic reachability checking and computation of expected rewards and costs. We discuss the performance of our extended version of Prism and compare the interval-based IPTA approach to models with fixed probabilities.
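The interval-probability aspect can be sketched in an untimed form with hypothetical numbers: compute an upper bound on reachability by adversarially resolving each probability interval in favor of the most valuable successors. This is an illustrative reduction of the idea, not the report's Prism extension.

```python
def max_reach(intervals, target, states, iters=50):
    """Upper bound on the probability of reaching `target` when each
    transition probability is only known as an interval [lo, hi].
    intervals[s] = list of (successor, lo, hi)."""
    v = {s: 1.0 if s == target else 0.0 for s in states}
    for _ in range(iters):  # value iteration until (practical) convergence
        nv = dict(v)
        for s in states:
            if s == target or s not in intervals:
                continue
            # Adversarially resolve the intervals: give the spare mass
            # (above the lower bounds) to the most valuable successors.
            succs = sorted(intervals[s], key=lambda e: -v[e[0]])
            budget = 1.0 - sum(lo for _, lo, _ in succs)
            total = 0.0
            for succ, lo, hi in succs:
                extra = min(hi - lo, budget)
                budget -= extra
                total += (lo + extra) * v[succ]
            nv[s] = total
        v = nv
    return v

# One uncertain state: goal and fail each reached with probability
# somewhere in [0.3, 0.7].
v = max_reach({"s0": [("goal", 0.3, 0.7), ("fail", 0.3, 0.7)]},
              "goal", ["s0", "goal", "fail"])
print(round(v["s0"], 3))  # 0.7: the optimistic resolution of the intervals
```

Running the same iteration with the spare mass assigned pessimistically would give the matching lower bound (here 0.3), bracketing the true reachability probability.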
APA, Harvard, Vancouver, ISO, and other styles
49

Friese, Max Jonas [Verfasser]. "Modeling and Analysis of Automotive Cyber-physical Systems - Formal Approaches to Latency Analysis in Practice / Max Jonas Friese." Kiel : Universitätsbibliothek Kiel, 2021. http://d-nb.info/1232812536/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Miralles, José Armando San Pedro. "GHENeSys, uma rede unificada e de alto nível." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/3/3152/tde-07062013-113518/.

Full text
Abstract:
Graph schemas are a strong approach to the representation (in different degrees of formality) of large and complex systems in several areas of knowledge. This fact has driven a continuous growth of methods and new formal schemas, especially in Engineering. Petri nets (PN) are one of these methods; they appeared in 1962 and have since improved the representation of discrete control, discrete systems, logistics, workflow, supply chains, computer networks, and a variety of other systems. As with any other representation, the first attempts to use them in practice were always made in a close relation between the representation and the domain of discourse, opening opportunities for several extensions. The need to apply them to large systems also brought a discussion about the formalism and the need for high-level nets. All this development, besides the broad use in different domains, raised the need for a unified approach. Since 1992 such unification has been addressed by the scientific community, and finally, at the beginning of this century, an ISO/IEC standard was proposed. That proposal also brings two new challenges: i) to show that any proposed net claimed to belong to the Petri net class in fact satisfies the requirements of the standard; ii) to enter the discussion of the semantics of extensions and to provide practical, unified system environments that can really support the design of large and complex systems. The GHENeSys net, conceived and developed in the Design Lab at the University of São Paulo, is a net extended with object-orientation concepts and a hierarchy mechanism, and so far appears to be one of the first attempts to provide a modeling and design environment with the properties of a unified net, capable of covering the different variants of Petri nets and their extensions. In this work, we present a proposal for the development of an integrated modeling environment for the representation of discrete event systems using Petri nets. This environment uses an underlying formalism framed within the rules recently defined by ISO/IEC in standard 15909. The formalism is the GHENeSys net, whose definition is extended using the definition of Coloured Petri Nets (CPN) as a starting point, in order to allow the representation of types within the net tokens.
A testing prototype for this integrated modeling environment, resulting from the integration of several previous works by D-Lab members that were never implemented or integrated into a unique formalism, is presented. This prototype is used in a case study in order to validate in a practical way the new elements added to the definition of GHENeSys, allowing the modeling of systems using the elements of high-level Petri nets.
APA, Harvard, Vancouver, ISO, and other styles