
Dissertations / Theses on the topic 'Algebraic semantics'


Consult the top 50 dissertations / theses for your research on the topic 'Algebraic semantics.'


1

Azevedo, Terceiro Antonio Soares de. "Semantics for an algebraic specification language." Biblioteca Digital de Teses e Dissertações da UFRGS, 2006. http://hdl.handle.net/10183/8126.

Full text
Abstract:
Prosoft is a research project at the Instituto de Informática da UFRGS, developed by the research group of the same name and coordinated by Professor Daltro José Nunes. The project's goal is to develop a full software development environment, the Prosoft Environment, based on the concepts of Models, Lambda Calculus, Abstract Data Types and Object Orientation. One of the components of the Prosoft Environment is its algebraic specification language: Algebraic Prosoft. Although it is the basis and theme of several works in the Prosoft research group, Algebraic Prosoft does not have its semantics properly defined. Work done up to now was based on operational notions and presented different interpretations of Algebraic Prosoft. This thesis presents a denotational semantics specification for Algebraic Prosoft, comprising, among other features, its "inter-data-type" communication primitive, called ICS, and its graphical notation for representing instantiations of abstract data types. The thesis also presents a study of semantic prototyping using the Haskell programming language. The concept of Literate Programming and the proximity between the lambda calculus and Haskell were crucial to the rapid development of a prototype implementation of Algebraic Prosoft based on its specified semantics. The thesis's main contributions include: a precise and unambiguous interpretation of Algebraic Prosoft, through a semantics specification; the definition of a semantics for the ICS, a concept that is unique (to the best of our knowledge) and provides a message-passing mechanism between algebraic data types; a prototype implementation of Algebraic Prosoft, which can actually be used to experiment with and test the Algebraic Prosoft language definition and semantics specification; and results regarding the semantic prototyping of both denotational and operational semantics specifications using Haskell for the rapid development of semantics-based language prototypes.
Since a large portion of Prosoft Environment’s development is done through international cooperation projects and this thesis will strongly influence its future development, the text was written in English in order to facilitate the information exchange between the Prosoft research group and its foreign partners.
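The semantic-prototyping idea the thesis builds on, executing a denotational specification directly, can be sketched in miniature. The Python fragment below is illustrative only (the thesis uses Haskell and the Prosoft constructs, not these): each phrase of a toy expression language is mapped to its denotation, a function from environments to values.

```python
# A miniature denotational semantics: each syntactic phrase is mapped
# to a mathematical object (here, a function from environments to values).
# This mirrors the prototyping idea only; it is not Algebraic Prosoft.

def num(n):          # [[n]] = constant function
    return lambda env: n

def var(x):          # [[x]] = environment lookup
    return lambda env: env[x]

def add(d1, d2):     # [[e1 + e2]] = [[e1]] plus [[e2]], pointwise
    return lambda env: d1(env) + d2(env)

def let(x, d1, d2):  # [[let x = e1 in e2]] = [[e2]] in extended environment
    return lambda env: d2({**env, x: d1(env)})

# Denotation of: let y = x + 2 in y + y
prog = let("y", add(var("x"), num(2)), add(var("y"), var("y")))
print(prog({"x": 3}))  # 10
```

Because the denotation of a phrase is an ordinary function, the specification doubles as a prototype interpreter, which is the property that makes functional languages attractive for semantic prototyping.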
APA, Harvard, Vancouver, ISO, and other styles
2

Ross, Brian James. "An algebraic semantics of Prolog control." Thesis, University of Edinburgh, 1992. http://hdl.handle.net/1842/585.

Full text
Abstract:
The conceptual distinction between logic and control is an important tenet of logic programming. In practice, however, logic programming languages use control strategies which profoundly affect the computational behaviour of programs. For example, sequential Prolog's depth-first, left-first control is an unfair strategy under which nontermination can easily arise if programs are ill-structured. Formal analyses of logic programs therefore require an explicit formalisation of the control scheme. To this end, this research introduces an algebraic process semantics of sequential logic programs written in Milner's Calculus of Communicating Systems (CCS). The main contribution of this semantics is that the control component of a logic programming language is concisely modelled. Goals and clauses of logic programs correspond semantically to sequential AND and OR agents respectively, and these agents are suitably defined to reflect the control strategy used to traverse the AND/OR computation tree for the program. The main difference between this and other process semantics which model concurrency is that the processes used here are sequential. The primary control strategy studied is standard Prolog's left-first, depth-first control. CCS is descriptively robust, however, and a variety of other sequential control schemes are modelled, including breadth-first, predicate freezing, and nondeterministic strategies. The CCS semantics for a particular control scheme is typically defined hierarchically. For example, standard Prolog control is initially defined in basic CCS using two control operators which model goal backtracking and clause sequencing. Using these basic definitions, higher-level bisimilarities are derived, which are more closely mappable to Prolog program constructs.
By using various algebraic properties of the control operators, as well as the stream domain and theory of observational equivalence from CCS, a programming calculus approach to logic program analysis is permitted. Some example applications using the semantics include proving program termination, verifying transformations which use cut, and characterising some control issues of partial evaluation. Since process algebras have already been used to model concurrency, this thesis suggests that they are an ideal means for unifying the operational semantics of the sequential and concurrent paradigms of logic programming.
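The AND/OR-agent reading of Prolog's control can be imitated with two mutually recursive functions over propositional Horn clauses: a goal list is an AND-agent that succeeds left to right, and each predicate's clause list is an OR-agent tried top to bottom. A minimal Python sketch of the sequential search only (the thesis does this compositionally in CCS):

```python
# Depth-first, left-to-right proof search over propositional Horn clauses,
# with goal lists as sequential AND-agents and clause alternatives as
# OR-agents. Illustrative only; the thesis models this control in CCS.

program = {                      # head -> list of alternative bodies
    "a": [["b", "c"], ["d"]],    # a :- b, c.   a :- d.
    "b": [[]],                   # b.
    "c": [[]],                   # c.
    # "d" has no clauses, so the goal d fails
}

def solve_goals(goals):          # AND-agent: prove goals left to right
    if not goals:
        return True
    return solve_atom(goals[0]) and solve_goals(goals[1:])

def solve_atom(atom):            # OR-agent: try clauses top to bottom
    return any(solve_goals(body) for body in program.get(atom, []))

print(solve_goals(["a"]))  # True
print(solve_goals(["d"]))  # False
```

The left-first, depth-first traversal is visible in the evaluation order of `and` and `any`; on an ill-structured (e.g. left-recursive) program this search would recurse forever, which is exactly the unfairness the abstract mentions.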
3

Silva, Thiago Nascimento da. "Algebraic semantics for Nelson's logic S." Programa de Pós-Graduação em Sistemas e Computação, 2018. https://repositorio.ufrn.br/jspui/handle/123456789/24823.

Full text
Abstract:
Besides the better-known Nelson logic (N3) and paraconsistent Nelson logic (N4), in "Negation and separation of concepts in constructive systems" (1959) David Nelson introduced a logic that he called S, with motivations of arithmetic and constructibility. The logic was defined by means of a calculus (crucially lacking the contraction rule) having infinitely many rule schemata, and no semantics was provided for it. In the present dissertation we look at the propositional fragment of S, showing that it is algebraizable (in fact, implicative) in the sense of Blok and Pigozzi with respect to a class of involutive residuated lattices. We thus provide the first known algebraic semantics for S (we call them S-algebras) as well as a finite Hilbert-style calculus equivalent to Nelson's presentation. We provide an algorithm to construct S-algebras from N3-algebras or implicative lattices, and we prove some results about the class of algebras which we have introduced. We also compare S with other logics of the Nelson family, namely N3 and N4.
4

Klingler, Carol Diane. "Syntax-directed semantics-supported editing of algebraic specifications." Master's thesis, This resource online, 1990. http://scholar.lib.vt.edu/theses/available/etd-01202010-020048/.

Full text
5

Stephenson, K. "An algebraic approach to syntax, semantics and compilation." Thesis, Swansea University, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.639106.

Full text
Abstract:
In this thesis, we develop an algebraic strategy and tools for modelling the syntax and semantics of programming languages and for proving the correctness of the process of compiling one language into another. Our first step in algebraically specifying language syntax is to apply a variation of an established technique of transforming a context-free grammar into a closed term algebra. Next, we design equational definitions of additional functions that act as a filter for the context-sensitive features of a language. To reduce the work involved in devising such specifications, we provide parameterised descriptions of commonly occurring language features. We illustrate the practicability of these modular methods by considering the algebraic specification of a range of programming languages and constructions. We then develop a modular algebraic method for defining operational semantics. The key to this is the employment of a notion of time by means of a simple clock, to enumerate the sequences of states produced by executing a program. We determine this behaviour by generating a sequence of atomic programs, such that the execution of each atomic program provides the next state in the execution sequence. We use functions that decompose the syntax one step at a time to determine which atomic program we should execute at each moment in time to simulate the behaviour of the entire program. We illustrate our technique with a wide-ranging set of examples. Finally, we describe how we can structure the compilation process using hierarchies of algebras, and how we can use equational methods to prove compiler correctness. The basis of our proof is in establishing correctness over just one step of time. We illustrate our technique with a case study of translating a high-level while language into instructions for a low-level register machine.
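The clock-driven style of operational semantics described above can be sketched as a one-step transition function iterated once per tick. This Python toy (a miniature three-instruction language of my own, not the thesis's formalism) makes the enumeration of states by time explicit:

```python
# One-step ("clocked") operational semantics: configurations are
# (program, state); step produces the configuration at time t+1 from
# the one at time t. A sketch of the style only, with an invented
# instruction set.

def step(cfg):
    prog, state = cfg
    if not prog:                          # empty program: halt (stutter)
        return cfg
    instr, *tail = prog
    if instr[0] == "assign":              # ("assign", x, n): x := n
        _, x, n = instr
        return (tail, {**state, x: n})
    if instr[0] == "add":                 # ("add", x, y, z): x := y + z
        _, x, y, z = instr
        return (tail, {**state, x: state[y] + state[z]})
    raise ValueError(instr)

def run(prog, state, clock=10):
    # Enumerate the sequence of states, one per clock tick.
    cfg = (prog, state)
    trace = [state]
    for _ in range(clock):
        cfg = step(cfg)
        trace.append(cfg[1])
    return trace

trace = run([("assign", "x", 2), ("assign", "y", 3), ("add", "z", "x", "y")], {})
print(trace[-1]["z"])  # 5
```

Correctness of a compiler in this style then reduces, as the abstract says, to showing that source and target machines agree one clock step at a time.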
6

Clarke, Daoud. "Context-theoretic Semantics for Natural Language: an Algebraic Framework." Thesis, University of Sussex, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.486979.

Full text
7

Fujinami, Tsutomu. "A process algebraic approach to computational linguistics." Thesis, University of Edinburgh, 1996. http://hdl.handle.net/1842/521.

Full text
Abstract:
The thesis presents a way to apply process algebra to computational linguistics. We are interested in how contexts can affect or contribute to language understanding, and we model the phenomena as a system of communicating processes in order to study the interaction between them in detail. For this purpose, we turn to the pi-calculus and investigate how communicating processes may be defined. While investigating the computational grounds of communication and concurrency, we devise a graphical representation for processes to capture the structure of interaction between them. Then, we develop a logic, combinatory intuitionistic linear logic with equality relation, to specify communicating processes logically. The development enables us to study Situation Semantics with process algebra. We construct semantic objects employed in Situation Semantics in the pi-calculus and then represent them in the logic. Through the construction, we also relate Situation Semantics with the research on information flow, Channel Theory, by conceiving of linear logic as a theory of information flow. To show how sentences can be parsed as the result of interactions between processes, we present a concurrent chart parser encoded in the pi-calculus. We also explain how a semantic representation can be generated as a process by the parser. We conclude the thesis by comparing the framework with other approaches.
8

Avery, Thomas Charles. "Structure and semantics." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/29517.

Full text
Abstract:
Algebraic theories describe mathematical structures that are defined in terms of operations and equations, and are extremely important throughout mathematics. Many generalisations of the classical notion of an algebraic theory have sprung up for use in different mathematical contexts; some examples include Lawvere theories, monads, PROPs and operads. The first central notion of this thesis is a common generalisation of these, which we call a proto-theory. The purpose of an algebraic theory is to describe its models, which are structures in which each of the abstract operations of the theory is given a concrete interpretation such that the equations of the theory hold. The process of going from a theory to its models is called semantics, and is encapsulated in a semantics functor. In order to define a model of a theory in a given category, it is necessary to have some structure that relates the arities of the operations in the theory with the objects of the category. This leads to the second central notion of this thesis, that of an interpretation of arities, or aritation for short. We show that any aritation gives rise to a semantics functor from the appropriate category of proto-theories, and that this functor has a left adjoint called the structure functor, giving rise to a structure-semantics adjunction. Furthermore, we show that the usual semantics for many existing notions of algebraic theory arises in this way by choosing an appropriate aritation. Another aim of this thesis is to find a convenient category of monads in the following sense. Every right adjoint into a category gives rise to a monad on that category, and in fact some functors that are not right adjoints do too, namely their codensity monads. This is the structure part of the structure-semantics adjunction for monads.
However, the fact that not every functor has a codensity monad means that the structure functor is not defined on the category of all functors into the base category, but only on a full subcategory of it. This deficiency is solved when passing to general proto-theories with a canonical choice of aritation whose structure-semantics adjunction restricts to the usual one for monads. However, this comes at a cost: the semantics functor for general proto-theories is not full and faithful, unlike the one for monads. The condition that a semantics functor be full and faithful can be thought of as a kind of completeness theorem: it says that no information is lost when passing from a theory to its models. It is therefore desirable to retain this property of the semantics of monads if possible. The goal then, is to find a notion of algebraic theory that generalises monads for which the semantics functor is full and faithful with a left adjoint; equivalently the semantics functor should exhibit the category of theories as a reflective subcategory of the category of all functors into the base category. We achieve this (for well-behaved base categories) with a special kind of proto-theory enriched in topological spaces, which we call a complete topological proto-theory. We also pursue an analogy between the theory of proto-theories and that of groups. Under this analogy, monads correspond to finite groups, and complete topological proto-theories correspond to profinite groups. We give several characterisations of complete topological proto-theories in terms of monads, mirroring characterisations of profinite groups in terms of finite groups.
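One concrete anchor for the theory-and-models discussion: a monad presents an algebraic theory, and the list monad corresponds to the theory of monoids, with unit forming a one-element list and bind substituting and flattening. Below is a small Python check of the three monad laws on sample data; a sanity check on examples, not a proof:

```python
# The list monad: unit wraps a value, bind substitutes a function into
# a list and flattens the result. We check the three monad laws on
# concrete samples (illustration, not a general proof).

def unit(x):
    return [x]

def bind(xs, f):                       # substitute f into xs, then flatten
    return [y for x in xs for y in f(x)]

f = lambda x: [x, x + 1]
g = lambda x: [10 * x]
xs = [1, 2, 3]

assert bind(unit(5), f) == f(5)                                   # left identity
assert bind(xs, unit) == xs                                       # right identity
assert bind(bind(xs, f), g) == bind(xs, lambda x: bind(f(x), g))  # associativity
print("monad laws hold on these samples")
```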
9

Martin, Clare. "Preordered categories and predicate transformers." Thesis, University of Oxford, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.302864.

Full text
10

Barros, Jose Bernado dos Santos Monteiro Vieira de. "Semantics of non-terminating systems through term rewriting." Thesis, University of Oxford, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.260738.

Full text
11

Kammar, Ohad. "Algebraic theory of type-and-effect systems." Thesis, University of Edinburgh, 2014. http://hdl.handle.net/1842/8910.

Full text
Abstract:
We present a general semantic account of Gifford-style type-and-effect systems. These type systems provide lightweight static analyses annotating program phrases with the sets of possible computational effects they may cause, such as memory access and modification, exception raising, and non-deterministic choice. The analyses are used, for example, to justify the program transformations typically used in optimising compilers, such as code reordering and inlining. Despite their existence for over two decades, there is no prior comprehensive theory of type-and-effect systems accounting for their syntax and semantics, and justifying their use in effect-dependent program transformation. We achieve this generality by recourse to the theory of algebraic effects, a development of Moggi's monadic theory of computational effects that emphasises the operations causing the effects at hand and their equational theory. The key observation is that annotation effects can be identified with the effect operations. Our first main contribution is the uniform construction of semantic models for type-and-effect analysis by a process we call conservative restriction. Our construction requires an algebraic model of the unannotated programming language and a relevant notion of predicate. It then generates a model for Gifford-style type-and-effect analysis. This uniform construction subsumes existing ad-hoc models for type-and-effect systems, and is applicable in all cases in which the semantics can be given via enriched Lawvere theories. Our second main contribution is a demonstration that our theory accounts for the various aspects of Gifford-style effect systems. We begin with a version of Levy's Call-by-push-value that includes algebraic effects. We add effect annotations, and design a general type-and-effect system for such call-by-push-value variants. The annotated language can be thought of as an intermediate representation used for program optimisation.
We relate the unannotated semantics to the conservative restriction semantics, and establish the soundness of program transformations based on this effect analysis. We develop and classify a range of validated transformations, generalising many existing ones and adding some new ones. We also give modularly-checkable sufficient conditions for the validity of these optimisations. In the final part of this thesis, we demonstrate our theory by analysing a simple example language involving global state with multiple regions, exceptions, and nondeterminism. We give decision procedures for the applicability of the various effect-dependent transformations, and establish their soundness and completeness.
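The flavour of effect-dependent transformation can be illustrated with a toy effect inference: compute which regions a phrase may read or write, and allow two phrases to be reordered when neither writes a region the other touches. The AST shape and the effect lattice below are invented for illustration and are far cruder than the thesis's framework:

```python
# A toy Gifford-style effect analysis: infer the set of effect
# operations a phrase may perform, and use it to justify reordering.
# Node shapes, region names and the commutation rule are illustrative.

def effects(expr):
    kind = expr[0]
    if kind == "lit":                   # ("lit", n): pure
        return set()
    if kind == "get":                   # ("get", region)
        return {("read", expr[1])}
    if kind == "put":                   # ("put", region, sub)
        return {("write", expr[1])} | effects(expr[2])
    if kind == "add":                   # ("add", e1, e2)
        return effects(expr[1]) | effects(expr[2])
    raise ValueError(kind)

def commutes(e1, e2):
    # Reorderable if neither phrase writes a region the other touches.
    fx1, fx2 = effects(e1), effects(e2)
    writes1 = {r for (op, r) in fx1 if op == "write"}
    writes2 = {r for (op, r) in fx2 if op == "write"}
    touched1 = {r for (_, r) in fx1}
    touched2 = {r for (_, r) in fx2}
    return not (writes1 & touched2) and not (writes2 & touched1)

a = ("put", "r1", ("lit", 1))
b = ("get", "r2")
c = ("get", "r1")
print(commutes(a, b))  # True: disjoint regions
print(commutes(a, c))  # False: c reads the region a writes
```

A compiler would consult exactly this kind of judgement before applying the code-reordering transformation the abstract mentions.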
12

Rajaona, Solofomampionona Fortunat. "An algebraic framework for reasoning about security." Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019/9983.

Full text
Abstract:
Thesis (MSc)--Stellenbosch University, 2013.
ENGLISH ABSTRACT: Stepwise development of a program using refinement ensures that the program correctly implements its requirements. The specification of a system is "refined" incrementally to derive an implementable program. The programming space includes both specifications and implementable code, and is ordered by the refinement relation, which obeys some mathematical laws. Morgan proposed a modification of this "classical" refinement for systems where the confidentiality of some information is critical. Programs distinguish between "hidden" and "visible" variables, and refinement has to bear some security requirement. First, we review refinement for classical programs and present Morgan's approach to ignorance-preserving refinement. We introduce the Shadow Semantics, a programming model that captures essential properties of classical refinement while preserving the ignorance of hidden variables. The model invalidates some classical laws which do not preserve security, while it satisfies new laws. Our approach is algebraic: we propose algebraic laws to describe the properties of ignorance-preserving refinement, completing previously proposed laws. Moreover, we show that the laws are sound in the Shadow Semantics. Finally, following the approach of Hoare and He for classical programs, we give a completeness result for the program algebra of ignorance-preserving refinement.
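The leak that ignorance-preserving refinement rules out can be seen in a small possibilistic model (a toy of my own devising, not Morgan's Shadow Semantics): an observer's ignorance of a hidden variable h is the set of values of h consistent with the visible output, and a classically valid reduction of nondeterminism can shrink that set.

```python
# Toy possibilistic model of ignorance. A program maps the hidden h to
# the set of visible outputs it may produce; the observer's ignorance at
# h is the set of h' that could have produced some identical output.
# Reducing nondeterminism (a classical refinement) can shrink that set,
# which a security-aware refinement order must forbid.

H = range(4)                        # possible values of hidden h

def ignorance(program):
    return {h: {h2 for h2 in H if program(h) & program(h2)} for h in H}

spec    = lambda h: {h % 2, 0}      # nondeterministic: publish parity OR 0
refined = lambda h: {h % 2}         # classically valid refinement of spec

ig_spec, ig_ref = ignorance(spec), ignorance(refined)
print(sorted(ig_spec[1]))           # [0, 1, 2, 3]: nothing is revealed
print(sorted(ig_ref[1]))            # [1, 3]: the parity of h has leaked
```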
13

Siirtola, A. (Antti). "Algorithmic multiparameterised verification of safety properties: process algebraic approach." Doctoral thesis, University of Oulu, 2010. http://urn.fi/urn:isbn:9789514262524.

Full text
Abstract:
Due to the increasing amount of concurrency, systems have become difficult to design and analyse. In this effort, formal verification, which means proving the correctness of a system, has turned out to be useful. Unfortunately, the application domain of the formal verification methods is often indefinite, tools are typically unavailable, and most of the techniques are not especially well suited to the verification of software systems. These are the questions addressed in the thesis. A typical approach to modelling systems and specifications is to consider them parameterised by the restrictions of the execution environment, which results in an (infinite) family of finite-state verification tasks. The thesis introduces a novel approach to the verification of such infinite specification-system families represented as labelled transition systems (LTSs). The key idea is to exploit the algebraic properties of the correctness relation. They allow the correctness of large system instances to be derived from that of smaller ones and, in the best case, an infinite family of finite-state verification tasks to be reduced to a finite one, which can then be solved using existing tools. The main contribution of the thesis is an algorithm that automates the reduction method. A specification and a system are given as parameterised LTSs and the allowed parameter values are encoded using first order logic. Parameters are sets and relations over these sets, which are typically used to denote, respectively, identities of replicated components and relationships between them. Because the number of parameters is not limited and they can be nested as well, one can express multiply parameterised systems with a parameterised substructure, which is an essential property from the viewpoint of modelling software systems. The algorithm terminates on all inputs, so its application domain is explicit in this sense.
Other proposed parameterised verification methods do not have both these features. Moreover, some of the earlier results on the verification of parameterised systems are obtained as a special case of the results presented here. Finally, several natural and significant extensions to the formalism are considered, and it is shown that the problem becomes undecidable in each of the cases. Therefore, the algorithm cannot be significantly extended in any direction without simultaneously restricting some other aspect.
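The reduction strategy, deriving the correctness of an infinite family from finitely many instances, can be mimicked as follows: exhaustively explore the states of small instances of a token-based mutual-exclusion model and check the safety property, taking the cutoff itself as an assumption. (The thesis's contribution is precisely a method that justifies such reductions; this sketch simply assumes one.)

```python
# Check a safety property (mutual exclusion) on all instances of a toy
# token-passing family up to an ASSUMED cutoff. Model and cutoff are
# invented for illustration.

def reachable(n, steps=8):
    # state: (token_free, set of clients in the critical section)
    init = (True, frozenset())
    seen, frontier = {init}, {init}
    for _ in range(steps):
        new = set()
        for token_free, cs in frontier:
            for i in range(n):
                if token_free and i not in cs:
                    new.add((False, cs | {i}))   # i takes token, enters
                if i in cs:
                    new.add((True, cs - {i}))    # i leaves, releases
        frontier = new - seen
        seen |= new
    return seen

def mutex_ok(n):
    return all(len(cs) <= 1 for _, cs in reachable(n))

CUTOFF = 3   # assumed: instances beyond this behave like smaller ones
print(all(mutex_ok(n) for n in range(1, CUTOFF + 1)))  # True
```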
14

Sarkis, Ralph. "Lifting Algebraic Reasoning to Generalized Metric Spaces." Electronic Thesis or Diss., Lyon, École normale supérieure, 2024. http://www.theses.fr/2024ENSL0025.

Full text
Abstract:
Algebraic reasoning is ubiquitous in mathematics and computer science, and it has been generalized to many different settings. In 2016, Mardare, Panangaden, and Plotkin introduced quantitative algebras, that is, metric spaces equipped with operations that are nonexpansive relative to the metric. They proved counterparts to important results in universal algebra, and in particular they provided a sound and complete deduction system generalizing Birkhoff's equational logic by replacing equality with equality up to ε. This allowed them to give algebraic axiomatizations for several important metrics like the Hausdorff and Kantorovich distances. In this thesis, we make two modifications to Mardare et al.'s framework. First, we replace metrics with a more general notion that captures pseudometrics, partial orders, probabilistic metrics, and more. Second, we do not require the operations in a quantitative algebra to be nonexpansive. We provide a sound and complete deduction system, we construct free quantitative algebras, and we demonstrate the value of our generalization by proving that any monad on generalized metric spaces that lifts a monad on sets can be presented with a quantitative algebraic theory. We apply this last result to obtain an axiomatization for the Łukaszyk-Karmowski distance.
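A quick illustration of the nonexpansiveness condition that Mardare et al. impose on quantitative-algebra operations (and that this thesis drops): with the usual metric on the reals, halving is nonexpansive while squaring is not. Sampling a few pairs stands in for an actual proof:

```python
# An operation op is nonexpansive for metric d when d(op x, op y) <= d(x, y).
# We test this on sample pairs for two unary operations on the reals.
# Sampling can only refute nonexpansiveness, not establish it.

d = lambda x, y: abs(x - y)

def nonexpansive_on(op, pairs):
    return all(d(op(x), op(y)) <= d(x, y) for x, y in pairs)

pairs = [(0.0, 1.0), (2.0, 5.0), (-1.0, 4.0)]
halve  = lambda x: x / 2
square = lambda x: x * x

print(nonexpansive_on(halve, pairs))   # True
print(nonexpansive_on(square, pairs))  # False: d(4, 25) > d(2, 5)
```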
15

Bueno-Soler, Juliana 1976. "Semântica algébrica de traduções possíveis." [s.n.], 2004. http://repositorio.unicamp.br/jspui/handle/REPOSIP/279780.

Full text
Abstract:
Orientadores: Marcelo Esteban Coniglio, Carlos Caleiro
Dissertação (mestrado) - Universidade Estadual de Campinas, Instituto de Filosofia e Ciencias Humanas
Made available in DSpace on 2018-08-04T00:28:23Z (GMT). No. of bitstreams: 1 Bueno-Soler_Juliana_M.pdf: 944055 bytes, checksum: 560404307eedeebf3b45f7ca82f30d78 (MD5) Previous issue date: 2004
Master's degree
Philosophy
Master in Philosophy
APA, Harvard, Vancouver, ISO, and other styles
16

Chanti, Houda. "Développement d'un outil d'évaluation performantielle des réglementations incendie en France et dans les pays de l'Union Européenne." Thesis, Mulhouse, 2017. http://www.theses.fr/2017MULH8193/document.

Full text
Abstract:
In order to facilitate the engineers' task of evaluating the fire safety level, and to allow the specialists involved in the field to use their preferred languages and tools, we propose to create a language dedicated to the field of fire safety, which automatically generates a simulation, taking into account the specific languages used by the specialists involved in the field. This DSL requires the definition, the formalization, the composition and the integration of several models, with regard to the specific languages used by the specialists involved in the field. The specific language dedicated to the field of fire safety is designed by composing and integrating several other DSLs described by technical and natural languages (as well as natural languages referring to technical ones). The latter are modeled in such a way that their components are precise and based on mathematical foundations, in order to verify the consistency of the system (people and materials are safe) before its implementation. In this context, we propose to adopt a formal approach, based on algebraic specifications, to formalize the languages used by the specialists involved in the generation system, focusing on both the syntax and the semantics of the dedicated languages. In the algebraic approach, the concepts of the domain are abstracted by data types and the relationships between them. The semantics of the specific languages is described by the relationships, the mappings between the defined data types, and their properties. The simulation language is based on a composition of several specific DSLs previously described and formalized. The DSLs are implemented based on the concepts of functional programming and the Haskell functional language, well adapted to this approach. The result of this work is a software tool dedicated to the automatic generation of a simulation, in order to facilitate the evaluation of the fire safety level for engineers.
This tool is the property of the Scientific and Technical Center for Building (CSTB), an organization whose mission is to guarantee the quality and safety of buildings by combining multidisciplinary skills to develop and share scientific and technical knowledge, in order to provide the different actors with the answers expected in their professional practice
APA, Harvard, Vancouver, ISO, and other styles
17

Kuntz, Georg Wolfgang Matthias. "Symbolic semantics and verification of stochastic process algebras." [S.l.] : [s.n.], 2006. http://deposit.ddb.de/cgi-bin/dokserv?idn=97894139X.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Vu, Thuy Duong. "Semantics and applications of process and program algebra." [S.l. : Amsterdam : s.n.] ; Universiteit van Amsterdam [Host], 2007. http://dare.uva.nl/document/44054.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Guttmann, Walter. "Algebraic foundations of the Unifying Theories of Programming." [S.l. : s.n.], 2007. http://nbn-resolving.de/urn:nbn:de:bsz:289-vts-60992.

Full text
APA, Harvard, Vancouver, ISO, and other styles
20

Kartsaklis, Dimitrios. "Compositional distributional semantics with compact closed categories and Frobenius algebras." Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:1f6647ef-4606-4b85-8f3b-c501818780f2.

Full text
Abstract:
The provision of compositionality in distributional models of meaning, where a word is represented as a vector of co-occurrence counts with every other word in the vocabulary, offers a solution to the fact that no text corpus, regardless of its size, is capable of providing reliable co-occurrence statistics for anything but very short text constituents. The purpose of a compositional distributional model is to provide a function that composes the vectors for the words within a sentence, in order to create a vectorial representation that reflects its meaning. Using the abstract mathematical framework of category theory, Coecke, Sadrzadeh and Clark showed that this function can directly depend on the grammatical structure of the sentence, providing an elegant mathematical counterpart of the formal semantics view. The framework is general and compositional but stays abstract to a large extent. This thesis contributes to ongoing research related to the above categorical model in three ways: Firstly, I propose a concrete instantiation of the abstract framework based on Frobenius algebras (joint work with Sadrzadeh). The theory improves shortcomings of previous proposals, extends the coverage of the language, and is supported by experimental work that improves existing results. The proposed framework describes a new class of compositional models that find intuitive interpretations for a number of linguistic phenomena. Secondly, I propose and evaluate in practice a new compositional methodology which explicitly deals with the different levels of lexical ambiguity (joint work with Pulman). A concrete algorithm is presented, based on the separation of vector disambiguation from composition in an explicit prior step. Extensive experimental work shows that the proposed methodology indeed results in more accurate composite representations for the framework of Coecke et al. in particular and every other class of compositional models in general.
As a last contribution, I formalize the explicit treatment of lexical ambiguity in the context of the categorical framework by resorting to categorical quantum mechanics (joint work with Coecke). In the proposed extension, the concept of a distributional vector is replaced with that of a density matrix, which compactly represents a probability distribution over the potential different meanings of the specific word. Composition takes the form of quantum measurements, leading to interesting analogies between quantum physics and linguistics.
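The tensor-based composition that the abstract describes can be made concrete with a minimal NumPy sketch. The toy noun vectors and the relational verb matrix below are invented for illustration (a Grefenstette-style instantiation, not data from the thesis): nouns are vectors, a transitive verb is a matrix built from typical subject/object pairs, and sentence meaning is obtained by contraction along the grammatical structure.

```python
import numpy as np

# Toy distributional vectors for nouns (dimensions are co-occurrence features).
dogs = np.array([1.0, 0.2, 0.0])
cats = np.array([0.9, 0.3, 0.1])
mice = np.array([0.1, 1.0, 0.4])

# In the categorical model a transitive verb lives in N ⊗ S ⊗ N; a common
# concrete instantiation builds it as a sum of outer products of typical
# subject/object pairs observed with the verb.
chase = np.outer(dogs, cats) + np.outer(cats, mice)

def sentence(subj, verb, obj):
    # Composition guided by the grammar: the subject and object vectors are
    # plugged into the verb by tensor contraction.
    return subj @ verb @ obj

s1 = sentence(dogs, chase, cats)   # "dogs chase cats" (plausible)
s2 = sentence(mice, chase, dogs)   # "mice chase dogs" (implausible)
# In this toy model the plausible sentence gets the higher score.
assert s1 > s2
```

The Frobenius-algebra machinery in the thesis generalizes exactly this kind of contraction, allowing words such as relative pronouns to copy and merge information flowing through the sentence.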
APA, Harvard, Vancouver, ISO, and other styles
21

Khani, Fereshte. "Learning precise partial semantic mappings via linear algebra." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/106099.

Full text
Abstract:
Thesis: S.M. in Computer Science and Engineering, Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2016.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 41-42).
In natural language interfaces, having high precision, i.e., abstaining when the system is unsure, is critical for good user experience. However, most NLP systems are trained to maximize accuracy with precision as an afterthought. In this thesis, we put precision first and ask: Can we learn to map parts of the sentence to logical predicates with absolute certainty? To tackle this question, we model semantic mappings from words to predicates as matrices, which allows us to reason efficiently over the entire space of semantic mappings consistent with the training data. We prove that our method obtains 100% precision. Empirically, we demonstrate the effectiveness of our approach on the GeoQuery dataset.
by Fereshte Khani.
S.M. in Computer Science and Engineering
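A drastically simplified analogue of the idea in this abstract (emit a word-to-predicate mapping only when the training data forces it, and abstain otherwise) can be sketched as follows. The toy sentences and predicates are invented, and the real method reasons over matrices with linear algebra rather than over sets; this sketch only captures the abstain-unless-certain behaviour that yields high precision.

```python
# Hypothetical toy data, loosely in the style of GeoQuery: each example pairs
# the words of a question with the predicates of its logical form.
examples = [
    ({"rivers", "in", "texas"}, {"river", "loc", "texas"}),
    ({"cities", "in", "texas"}, {"city", "loc", "texas"}),
    ({"rivers", "in", "ohio"},  {"river", "loc", "ohio"}),
]

def certain_mappings(examples):
    words = set().union(*[ws for ws, _ in examples])
    preds = set().union(*[ps for _, ps in examples])
    # Which examples does each word / predicate occur in?
    occ_w = {w: frozenset(i for i, (ws, _) in enumerate(examples) if w in ws)
             for w in words}
    occ_p = {p: frozenset(i for i, (_, ps) in enumerate(examples) if p in ps)
             for p in preds}
    mapping = {}
    for w in words:
        # A predicate is a candidate only if it co-occurs with the word in
        # exactly the same examples; emit only when that candidate is unique.
        cands = [p for p in preds if occ_p[p] == occ_w[w]]
        if len(cands) == 1:
            mapping[w] = cands[0]      # forced by the data
        # otherwise: abstain, preserving precision
    return mapping

m = certain_mappings(examples)
assert m["rivers"] == "river"
assert m["in"] == "loc"
```

Abstaining whenever more than one candidate survives is what makes the emitted partial mapping precise by construction.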
APA, Harvard, Vancouver, ISO, and other styles
22

Gorsky, Samir 1981. "A semantica algebrica para as logicas modais e seu interesse filosofico." [s.n.], 2008. http://repositorio.unicamp.br/jspui/handle/REPOSIP/279514.

Full text
Abstract:
Advisor: Walter Alexandre Carnielli
Master's dissertation - Universidade Estadual de Campinas, Instituto de Filosofia e Ciências Humanas
Made available in DSpace on 2018-08-11T08:51:47Z (GMT). No. of bitstreams: 1 Gorsky_Samir_M.pdf: 690814 bytes, checksum: 2435beecaa8f0656155bf7b65f58df5b (MD5) Previous issue date: 2008
Abstract: In the 20th century there was a considerable advance in the formal understanding of the meaning of modalities. The works of Jónsson, McKinsey and Tarski in the forties enabled the construction of algebraic completeness results for the modal systems. In the fifties Kripke proposed an interesting semantics for these systems. Such semantics, today known as possible worlds semantics, or Kripke semantics, caused a great impact in the context of analytical philosophy. The articles written by Lemmon in the sixties present a synthesis of these two semantics, the algebraic semantics and the possible worlds semantics. One interesting result shown in these articles is that semantic completeness can be inferred from algebraic results through a central theorem. One of the most surprising and interesting results in Lemmon's work is the representation theorem for modal algebras. A consequence of this representation theorem is the connection between the algebraic point of view and the point of view of the semantics of possible worlds (or Kripke semantics). The initial objective of the present work was to extend this same algebraic result to the systems of the class "Gmnpq" proposed by Lemmon and Scott in the "Lemmon notes". We argue that algebraic semantics for modal logics can serve as a basis for answers to the various criticisms directed at the development of modal logic. We show, finally, that algebraic semantics, being a semantics that does not use the concept of possible worlds, may be deemed useful by supporters of modal antirealism
Master's degree
Philosophy
Master in Philosophy
APA, Harvard, Vancouver, ISO, and other styles
23

Brooke, Phillip James. "A timed semantics for a hierarchical design notation." Thesis, University of York, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.298382.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Genet, Bryan Howard. "Is Semantic Query Optimization Worthwhile?" The University of Waikato, 2007. http://hdl.handle.net/10289/2531.

Full text
Abstract:
The term "semantic query optimization" (SQO) denotes a methodology whereby queries against databases are optimized using semantic information about the database objects being queried. The result of semantically optimizing a query is another query which is syntactically different to the original, but semantically equivalent and which may be answered more efficiently than the original. SQO is distinctly different from the work performed by the conventional SQL optimizer. The SQL optimizer generates a set of logically equivalent alternative execution paths based ultimately on the rules of relational algebra. However, only a small proportion of the readily available semantic information is utilised by current SQL optimizers. Researchers in SQO agree that SQO can be very effective. However, after some twenty years of research into SQO, there is still no commercial implementation. In this thesis we argue that we need to quantify the conditions for which SQO is worthwhile. We investigate what these conditions are and apply this knowledge to relational database management systems (RDBMS) with static schemas and infrequently updated data. Any semantic query optimizer requires the ability to reason using the semantic information available, in order to draw conclusions which ultimately facilitate the recasting of the original query into a form which can be answered more efficiently. This reasoning engine is currently not part of any commercial RDBMS implementation. We show how a practical semantic query optimizer may be built utilising readily available semantic information, much of it already captured by meta-data typically stored in commercial RDBMS. We develop cost models which predict an upper bound to the amount of optimization one can expect when queries are pre-processed by a semantic optimizer.
We present a series of empirical results to confirm the effectiveness or otherwise of various types of SQO and demonstrate the circumstances under which SQO can be effective.
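The kind of rewrite SQO performs can be illustrated with a small, hypothetical example: an integrity constraint recorded in the schema makes one predicate of a query redundant, so a semantic optimizer may drop it while preserving the answer set. The schema, data, and queries below are invented for illustration; as the abstract notes, commercial optimizers do not currently exploit such constraints.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        shipped INTEGER,
        status TEXT,
        CHECK (shipped = 0 OR status = 'closed')  -- the semantic knowledge
    )""")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 'closed'), (2, 0, 'open'), (3, 1, 'closed')])

original = "SELECT id FROM orders WHERE shipped = 1 AND status = 'closed'"
# From the CHECK constraint, shipped = 1 already implies status = 'closed',
# so a semantic optimizer may drop the redundant predicate:
optimized = "SELECT id FROM orders WHERE shipped = 1"

# Semantically equivalent: both queries return the same answer set.
assert con.execute(original).fetchall() == con.execute(optimized).fetchall()
```

Removing a predicate, or adding an implied one that enables an index, is exactly the class of transformation whose worthwhileness the thesis's cost models try to predict.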
APA, Harvard, Vancouver, ISO, and other styles
25

Alberti, Michele. "On operational properties of quantitative extensions of lambda-calculus." Thesis, Aix-Marseille, 2014. http://www.theses.fr/2014AIXM4076/document.

Full text
Abstract:
In this thesis we deal with the operational behaviours of two quantitative extensions of pure λ-calculus, namely the algebraic λ-calculus and the probabilistic λ-calculus.In the first part, we study the β-reduction theory of the algebraic λ-calculus, a calculus allowing formal finite linear combinations of λ-terms to be expressed. Although the system enjoys the Church-Rosser property, reduction collapses in presence of negative coefficients. We exhibit a solution to the consequent loss of the notion of (unique) normal form, allowing the definition of a partial, but consistent, term equivalence. We then introduce a variant of β-reduction defined on canonical terms only, which we show partially characterises the previously established notion of normal form. In the process, we prove a factorisation theorem.In the second part, we study bisimulation and context equivalence in a λ-calculus endowed with a probabilistic choice. We show a technique for proving congruence of probabilistic applicative bisimilarity. While the technique follows Howe's method, some of the technicalities are quite different, relying on non-trivial "disentangling" properties for sets of real numbers. Finally we show that, while bisimilarity is in general strictly finer than context equivalence, coincidence between the two relations is achieved on pure λ-terms. The resulting equality is that induced by Lévy-Longo trees, generally accepted as the finest extensional equivalence on pure λ-terms under a lazy regime
APA, Harvard, Vancouver, ISO, and other styles
26

Fan, Yang, Hidehiko Masuhara, Tomoyuki Aotani, Flemming Nielson, and Hanne Riis Nielson. "AspectKE*: Security aspects with program analysis for distributed systems." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4136/.

Full text
Abstract:
Enforcing security policies to distributed systems is difficult, in particular, when a system contains untrusted components. We designed AspectKE*, a distributed AOP language based on a tuple space, to tackle this issue. In AspectKE*, aspects can enforce access control policies that depend on future behavior of running processes. One of the key language features is the predicates and functions that extract results of static program analysis, which are useful for defining security aspects that have to know about future behavior of a program. AspectKE* also provides a novel variable binding mechanism for pointcuts, so that pointcuts can uniformly specify join points based on both static and dynamic information about the program. Our implementation strategy performs fundamental static analysis at load-time, so as to keep runtime overheads minimal. We implemented a compiler for AspectKE*, and demonstrate the usefulness of AspectKE* through a security aspect for a distributed chat system.
APA, Harvard, Vancouver, ISO, and other styles
27

Lermusiaux, Pierre. "Analyse statique de transformations pour l’élimination de motifs." Electronic Thesis or Diss., Université de Lorraine, 2022. http://www.theses.fr/2022LORR0372.

Full text
Abstract:
Program transformation is an extremely common practice in computer science. From compilation to test generation, through many approaches of code analysis and formal verification of programs, it is a process that is both ubiquitous and critical to properly functioning programs and information systems. This thesis proposes to study program transformation mechanisms in order to express and verify syntactic guarantees on the behaviour of these transformations and on their results. Giving a characterisation of the shape of terms returned by such a transformation is, indeed, a common approach to the formal verification of programs. In order to express some properties often used by this type of approach, we propose in this thesis a formalism inspired by the model of compilation passes, which describe the general compilation of a program as a sequence of minimal transformations, and based on the notions of pattern matching and term rewriting. This formalism relies on an annotation mechanism for function symbols in order to express a set of specifications describing the behaviours of the associated functions. We then propose a static analysis method to check that a transformation, expressed as a term rewrite system, actually verifies its specifications
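The pattern-elimination guarantee discussed above (a compilation pass whose specification states that a given constructor never survives the transformation) can be sketched with a toy rewrite system. The term language and the pass below are invented for illustration and are not the thesis's formalism.

```python
# Terms as tuples: ("add", t1, t2), ("sub", t1, t2), ("neg", t), ("lit", n).

def eliminate_sub(t):
    """One compilation pass: rewrite sub(x, y) -> add(x, neg(y)) everywhere."""
    op = t[0]
    if op == "lit":
        return t
    if op == "neg":
        return ("neg", eliminate_sub(t[1]))
    if op == "add":
        return ("add", eliminate_sub(t[1]), eliminate_sub(t[2]))
    if op == "sub":
        return ("add", eliminate_sub(t[1]), ("neg", eliminate_sub(t[2])))
    raise ValueError(f"unknown constructor: {op}")

def contains(t, op):
    """Does the pattern `op(...)` occur anywhere in term t?"""
    return t[0] == op or any(contains(s, op) for s in t[1:]
                             if isinstance(s, tuple))

term = ("sub", ("lit", 5), ("sub", ("lit", 2), ("lit", 1)))
out = eliminate_sub(term)
# The specification of this pass: the "sub" pattern never survives.
assert not contains(out, "sub")
```

A static analysis of the kind the thesis proposes would verify this "no `sub` in the output" specification from the rewrite rules alone, without running the pass on any particular term.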
APA, Harvard, Vancouver, ISO, and other styles
28

Morara, Massimo. "Una semantica distribuita per il Multi-CCS utilizzando reti di Petri." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amslaurea.unibo.it/2717/.

Full text
Abstract:
This thesis describes and elaborates on the Multi-CCS process algebra, provides it with a semantics based on unbounded Petri nets (correcting and improving the previous one), and gives a detailed proof of its correctness, that is, of the bisimilarity between the marking obtained from the new semantics for a generic Multi-CCS process and the same process under the default semantics defined on LTSs
APA, Harvard, Vancouver, ISO, and other styles
29

Vasireddy, Jhansi Lakshmi. "Applications of Linear Algebra to Information Retrieval." Digital Archive @ GSU, 2009. http://digitalarchive.gsu.edu/math_theses/71.

Full text
Abstract:
Some of the theory of nonnegative matrices is first presented. The Perron-Frobenius theorem is highlighted. Some of the important linear algebraic methods of information retrieval are surveyed. Latent Semantic Indexing (LSI), which uses the singular value decomposition, is discussed. The Hyperlink-Induced Topic Search (HITS) algorithm is next considered; here the power method for finding dominant eigenvectors is employed. Through the use of a theorem by Sinkhorn and Knopp, a modified HITS method is developed. Lastly, the PageRank algorithm is discussed. Numerical examples and MATLAB programs are also provided.
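The power method the abstract mentions for HITS can be sketched in a few lines of NumPy. The three-page link graph below is invented for illustration.

```python
import numpy as np

# Adjacency matrix of a tiny web graph: A[i, j] = 1 if page i links to page j.
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)

# HITS: hub scores h and authority scores a satisfy a ∝ Aᵀh and h ∝ Aa,
# so the power method converges to the dominant eigenvectors of AᵀA (for a)
# and AAᵀ (for h).
h = np.ones(3)
for _ in range(100):
    a = A.T @ h
    a /= np.linalg.norm(a)
    h = A @ a
    h /= np.linalg.norm(h)

# Page 2 is pointed to by both other pages, so it gets the top authority score.
assert np.argmax(a) == 2
```

The normalization step keeps the iterates bounded; convergence to the dominant eigenvector is guaranteed here because AᵀA is nonnegative, which is where the Perron-Frobenius theory surveyed in the thesis comes in.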
APA, Harvard, Vancouver, ISO, and other styles
30

Gil, Iranzo Rosa María. "Agents negotiating in a semantic web architecture (SWA)." Doctoral thesis, Universitat Pompeu Fabra, 2005. http://hdl.handle.net/10803/7532.

Full text
Abstract:
The semantic architecture designed here has been tested in the domain of intellectual property rights management. It can assimilate new modules and changes to its structure. A dedicated ontology, IPROnto, has been designed specifically; it is fully interoperable with the other standards and proposes a new way of managing intellectual property rights.
A statistical analysis of the Semantic Web, as well as of the elements we were using, IPROnto, was carried out. This study revealed that the Semantic Web behaves as a complex system, possessing properties through which a microscopic behaviour could be characterized.
Some ontologies, such as DOLCE or FrameNet, are discussed as paradigms of common knowledge, together with processes for internalizing them that are close to cognitive science, using image-schemas to connect them to the Semantic Web, and, as a final step, using geometric algebra to connect symbolism (algebra) and semantics (geometry understood as geometric meaning).
The main issue of this work is to discover and face new challenges in negotiation over the WWW, specifically over the Semantic Web (SW), because it provides a new paradigm not only in language expression but also in its manipulation.
As a result, a heterogeneous architecture is provided (with Multi-agent Systems) and IPR knowledge is formalized in an IPR ontology.
Nowadays, agents have to know about other agents and their environments. Ontologies are being used to model agents' knowledge. In order to provide a model of the SW that is as realistic as possible, a deep statistical analysis of it has been made. It reveals that the SW behaves as a complex system and shares some properties with such systems.
The classification of concepts may not be as objective as we expect. An effort is being made by the SW community to establish a shared basis of knowledge for common understanding. The contribution here is a way to connect it to the physical domain.
APA, Harvard, Vancouver, ISO, and other styles
31

Passerino, Liliana Maria. "Um sistema de tipos para uma linguagem de representacao estruturada de conhecimento." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 1992. http://hdl.handle.net/10183/26392.

Full text
Abstract:
A noção de tipo é intrínseca ao raciocínio humano, na medida que os seres humanos tendem a "classificar" os objetos segundo seu use e seu comportamento como parte do processo de resolução de problemas. Tal classificação dos objetos implica numa abstração das características irrelevantes dos mesmos,permitindo dessa maneira uma simplificação importante da complexidade do universo de discurso Por outro lado, certos problemas são altamente complexos e requerem um tratamento diferenciado.Esses problemas exigem, para sua resolução, um grande conhecimento do universo de discurso. O ponto critico nesta situação é que o domínio do problema não é exato como poderia ser um domínio matemático. Pelo contrario, ele inclui geralmente aspectos ambíguos e pouco formais que dificultam seu entendimento. Tal domínio a chamado de senso comum e é objeto de estudo de uma linha da computação, a Inteligência Artificial (IA). Para [KRA 87], entre outros, as soluc6es pare muitos problemas de IA dependern mais da capacidade de adquirir e manipular conhecimento do que de algoritmos sofisticados. Por este motivo, existem na IA muitos tipos de linguagens que tentam, de di verses maneiras,facilitar a representação de conhecirnentos sobre universos de discurso de problemas particulares. São as chamadas Linguagens de Representação de Conhecimento. A noção de tipo e implícita nas linguagens de representação de conhecimento, uma vez que tal noção é natural no raciocínio humano e esta intimamente ligada ao conceito de abstração. Este trabalho visa explicitar a noção de tipo subjacente ao núcleo definido da linguagem RECON-II. Para isto, foi realizado um estudo semântico prévio para identificar os tipos semânticos da linguagem. A partir da noção semântica dos tipos foi possível definir a correspondente sintática e finalmente, descrever um Sistema de Tipos para RECON-II. 
Um Sistema de Tipos consiste numa Linguagem de Tipos (tipos básicos + construtores de tipos) e num Sistema de Dedução que relaciona as expresses da linguagem objeto (linguagem de programação com as expresses da linguagem de tipos. Para a primeira etapa realizada neste trabalho, a determinação da semântica da linguagem, foi utilizado o método algébrico. Nele toda expressão RECON-II é um termo de uma assinatura Z, de modo quo cada assinatura Z determina um conjunto de expressos RECON-UL Mas, por outro lado, uma assinatura também determina um conjunto de álgebras. Dessas álgebras-Z só um subconjunto significativo para as expressões RECON-II. As álgebras-Z significativas são aquelas que satisfazem a assinatura-Z mais um conjunto E de axiomas. A assinatura-Z junto como o conjunto E de axiomas constituem o quo se denomina Tipo Abstrato de Dados, T=CZ, E), e as álgebras-Z significativas são os chamados modelos-Z do tipo T. Assim, uma expressão RECON-II a e um elemento da álgebra de termos quo g uma Álgebra gerada a partir do E. Essa álgebra, 44 4-) conjunto das expressi5es_: RECON-II significativas, e o modelo inicial de tais expressões WOG 781 Dado um tipo abstrato T existe um único modelo para T, ou uma classe de modelos, não isomórficos, denominada MCT>. No segundo caso, asses modelos constituem uma "quasi" ordem parcial com modelo inicial e terminal. A existência e unicidade do modelo inicial para qualquer tipo T foi demonstrada por [GOG 77] Com Σ = (S, F). a (Ws )para 9 S, e o conjunto dos termos de "sort." e. Na RECON-II, são os termos de uma categoria sintática determinada. As categorias sintáticas principais são : Conceitos, Relações, Funções e Redes. Um tipo semântico para s E S é um subconjunto M(T) S M(T) quo satisfaz os axiomas E exigidos de (WΣ) s, constituindo o tipo abstrato T .s.(por exemplo TConceitos, TRedes, etc.) 
Por último, foi definido o Sistema de Tipos, que consiste numa estrutura sintática adequada para os tipos semânticos de cada expressão-RECON e, para cada expressão de tipo, um conjunto de regras de inferência que permite, a partir de uma expressão-RECON, inferir seu tipo mais geral.
The notion of type is intrinsic to human reasoning, since human beings tend to classify objects according to their use and behaviour as part of the problem solving process. By classifying objects, their irrelevant characteristics are abstracted away; in this way, the complexity of the universe of discourse is much reduced. On the other hand, certain problems are highly complex and require a differentiated treatment. In order to solve these problems, a great knowledge of the universe of discourse is needed. The critical point in this situation is that the domain of the problem is not as precise as a mathematical domain. On the contrary, it generally includes ambiguous and not very formal aspects which make its understanding difficult. Such a domain is known as common sense, and it is the object of study of one line of Computer Science, Artificial Intelligence (AI). For [KRA 87], among others, the solutions for many AI problems depend on the ability to acquire and manipulate knowledge rather than on sophisticated algorithms. For this reason, there are in AI many types of languages that attempt, in different ways, to represent the universe of discourse of a particular problem. These languages are known as Knowledge Representation Languages. The notion of type is implicit in Knowledge Representation Languages, since it is natural in human reasoning and closely related to the concept of abstraction. This work intends to make explicit the notion of type intrinsic to the kernel of the RECON-II language. In order to do this, a preliminary semantic study was carried out to identify the semantic types of the language. From the semantic notion of the types it was possible to define the syntactic counterpart and, finally, to describe a Type System for RECON-II. A Type System consists of a type language (basic types + type constructors) and a deduction system that relates expressions of the object language (the programming language) to expressions of the type language.
In the first step of this work, the determination of the semantics of the language, the algebraic method was used. In it, every RECON-II expression is a term of a signature Σ, so that each signature Σ determines a set of RECON-II expressions. On the other hand, a signature also determines a set of algebras. Out of these Σ-algebras only a subset is significant for the RECON-II expressions. The significant Σ-algebras are those that satisfy the Σ-signature plus a set E of axioms. Together, the Σ-signature and the set E of axioms constitute what is called an Abstract Data Type, T = (Σ, E), and the significant Σ-algebras are the so-called Σ-models of the type T. Therefore a RECON-II expression is an element of the term algebra, which is the algebra generated from Σ. This Σ-algebra is the set of significant RECON-II expressions, and is the initial model of such expressions [GOG 78]. Given an abstract type T there is either a single model for T, or a class of non-isomorphic models denominated M(T). In the second case, these models constitute a "quasi" partial order with an initial and a terminal model. The existence and uniqueness of the initial model for any type T was shown by [GOG 77]. With Σ = (S, F), (W_Σ)_s, for s ∈ S, is the set of terms of sort s. In RECON-II, these are the terms of a given syntactic category; the main syntactic categories are: Concepts, Relations, Functions and Nets. A semantic type for s ∈ S is a subset of M(T) that satisfies the axioms E required of (W_Σ)_s, constituting the abstract type T_s (for instance T_Concepts, T_Nets, etc.). Finally, the Type System was defined, consisting of a syntactic structure suitable for the semantic types of each RECON-II expression and, for every type expression, a set of inference rules which allows its most general type to be inferred from a RECON-II expression.
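The construction described in this abstract (a signature determining a term algebra whose terms are classified by sorts) can be sketched in code. The following is a minimal illustration with an invented two-sort signature; it is not the actual RECON-II signature, whose sorts are Concepts, Relations, Functions and Nets.

```python
# Sketch of a many-sorted signature and its term algebra (W_Sigma),
# with terms classified by sort. The signature below is invented for
# illustration; it is not the actual RECON-II signature.

# Each operation maps a tuple of argument sorts to a result sort.
SIGNATURE = {
    "zero": ((), "Nat"),
    "succ": (("Nat",), "Nat"),
    "nil":  ((), "List"),
    "cons": (("Nat", "List"), "List"),
}

def sort_of(term):
    """Return the sort of a well-formed term, or raise if ill-formed."""
    op, *args = term
    arg_sorts, result = SIGNATURE[op]
    if len(args) != len(arg_sorts):
        raise ValueError(f"{op} expects {len(arg_sorts)} arguments")
    for arg, expected in zip(args, arg_sorts):
        actual = sort_of(arg)
        if actual != expected:
            raise ValueError(f"{op}: expected {expected}, got {actual}")
    return result

two = ("succ", ("succ", ("zero",)))
xs = ("cons", two, ("nil",))
print(sort_of(two))  # Nat
print(sort_of(xs))   # List
```

Well-formed terms are exactly the elements of the term algebra, grouped by sort, mirroring the sets (W_Σ)_s above.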
APA, Harvard, Vancouver, ISO, and other styles
32

Brunet, Paul. "Algebras of Relations : from algorithms to formal proofs." Thesis, Lyon, 2016. http://www.theses.fr/2016LYSE1198/document.

Full text
Abstract:
Les algèbres de relations apparaissent naturellement dans de nombreux cadres, en informatique comme en mathématiques. Elles constituent en particulier un formalisme tout à fait adapté à la sémantique des programmes impératifs. Les algèbres de Kleene constituent un point de départ : ces algèbres jouissent de résultats de décidabilités très satisfaisants, et admettent une axiomatisation complète. L'objectif de cette thèse a été d'étendre les résultats connus sur les algèbres de Kleene à des extensions de celles-ci. Nous nous sommes tout d'abord intéressés à une extension connue : les algèbres de Kleene avec converse. La décidabilité de ces algèbres était déjà connue, mais l'algorithme prouvant ce résultat était trop compliqué pour être utilisé en pratique. Nous avons donné un algorithme plus simple, plus efficace, et dont la correction est plus facile à établir. Ceci nous a permis de placer ce problème dans la classe de complexité PSpace-complete. Nous avons ensuite étudié les allégories de Kleene. Sur cette extension, peu de résultats étaient connus. En suivant des résultats sur des algèbres proches, nous avons établi l'équivalence du problème d'égalité dans les allégories de Kleene à l'égalité de certains ensembles de graphes. Nous avons ensuite développé un modèle d'automate original (les automates de Petri), basé sur les réseaux de Petri, et avons établi l'équivalence de notre problème original avec le problème de comparaison de ces automates. Nous avons enfin développé un algorithme pour effectuer cette comparaison dans le cadre restreint des treillis de Kleene sans identité. Cet algorithme utilise un espace exponentiel. Néanmoins, nous avons pu établir que la comparaison d'automates de Petri dans ce cas est ExpSpace-complète. Enfin, nous nous sommes intéressés aux algèbres de Kleene Nominales. Nous avons réalisé que les descriptions existantes de ces algèbres n'étaient pas adaptées à la sémantique relationnelle des programmes.
Nous les avons donc modifiées pour nos besoins, et ce faisant avons trouvé diverses variations naturelles de ce modèle. Nous avons donc étudié en détails et en Coq les ponts que l'on peut établir entre ces variantes, et entre le modèle “classique” et notre nouvelle version
Algebras of relations appear naturally in many contexts, in computer science as well as in mathematics. They constitute a framework well suited to the semantics of imperative programs. Kleene algebras are a starting point: these algebras enjoy very strong decidability properties, and a complete axiomatisation. The goal of this thesis was to export known results from Kleene algebra to some of its extensions. We first considered a known extension: Kleene algebras with converse. Decidability of these algebras was already known, but the algorithm witnessing this result was too complicated to be practical. We proposed a simpler, more efficient algorithm, whose correctness is easier to establish. It allowed us to prove that this problem is PSpace-complete. Then we studied Kleene allegories. Few results were known about this extension. Following results about closely related algebras, we established the equivalence between equality in Kleene allegories and equality of certain sets of graphs. We then developed an original automaton model (so-called Petri automata), based on Petri nets. We proved the equivalence between the original problem and comparing these automata. In the restricted setting of identity-free Kleene lattices, we also provided an algorithm performing this comparison. This algorithm uses exponential space. However, we proved that the problem of comparing Petri automata is ExpSpace-complete. Finally, we studied Nominal Kleene algebras. We realised that existing descriptions of these algebras were not suited to the relational semantics of programming languages. We thus modified them accordingly, and doing so uncovered several natural variations of this model. We then studied formally the bridges one could build between these variations, and between the existing model and our new version of it. This study was conducted using the proof assistant Coq.
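For plain Kleene algebra, the decidability results this thesis starts from amount to deciding language equivalence of regular expressions. The flavour of such a decision can be sketched with Brzozowski derivatives; the following is a toy, bounded check (comparing languages on all words up to a length bound), not the thesis's algorithms for converse, allegories, or Petri automata.

```python
# Toy check of regular-expression equivalence via Brzozowski
# derivatives, comparing the two languages on all words up to a
# length bound. A bounded approximation for illustration only.
from itertools import product

# Regexes as tuples: ("0",) empty, ("1",) epsilon, ("c", x) literal,
# ("+", e, f) union, (".", e, f) concatenation, ("*", e) Kleene star.
def nullable(e):
    tag = e[0]
    if tag in ("1", "*"): return True
    if tag in ("0", "c"): return False
    if tag == "+": return nullable(e[1]) or nullable(e[2])
    return nullable(e[1]) and nullable(e[2])  # concatenation

def deriv(e, a):
    """Brzozowski derivative of e with respect to letter a."""
    tag = e[0]
    if tag in ("0", "1"): return ("0",)
    if tag == "c": return ("1",) if e[1] == a else ("0",)
    if tag == "+": return ("+", deriv(e[1], a), deriv(e[2], a))
    if tag == "*": return (".", deriv(e[1], a), e)
    head = (".", deriv(e[1], a), e[2])
    return ("+", head, deriv(e[2], a)) if nullable(e[1]) else head

def matches(e, word):
    for a in word:
        e = deriv(e, a)
    return nullable(e)

def equiv_upto(e, f, alphabet, n):
    return all(matches(e, w) == matches(f, w)
               for k in range(n + 1)
               for w in product(alphabet, repeat=k))

ab_star = ("*", ("+", ("c", "a"), ("c", "b")))                         # (a+b)*
nested = ("*", (".", ("*", ("c", "a")), ("*", ("c", "b"))))            # (a*b*)*
print(equiv_upto(ab_star, nested, "ab", 6))  # True
```

The full decision procedure explores derivatives up to similarity instead of bounding word length, which is what makes equivalence decidable rather than merely testable.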
33

Zanasi, Fabio. "Interacting Hopf Algebras- the Theory of Linear Systems." Thesis, Lyon, École normale supérieure, 2015. http://www.theses.fr/2015ENSL1020/document.

Full text
Abstract:
Dans cette thèse, on présente la théorie algébrique IH par le biais de générateurs et d'équations. Le modèle libre de IH est la catégorie des sous-espaces linéaires sur un corps k. Les termes de IH sont des diagrammes de cordes, qui, selon le choix de k, peuvent exprimer différents types de réseaux et de formalismes graphiques, que l'on retrouve dans des domaines scientifiques divers, tels que les circuits quantiques, les circuits électriques et les réseaux de Petri. Les équations de IH sont obtenues via des lois distributives entre algèbres de Hopf, d'où le nom "Interacting Hopf algebras" (algèbres de Hopf interagissantes). La caractérisation via les sous-espaces permet de voir IH comme une syntaxe fondée sur les diagrammes de cordes pour l'algèbre linéaire : les applications linéaires, les espaces et leurs transformations ont chacun leur représentation fidèle dans le langage graphique. Cela aboutit à un point de vue alternatif, souvent fructueux, sur le domaine. On illustre cela en particulier en utilisant IH pour axiomatiser la sémantique formelle de circuits de calculs de signaux, pour lesquels on s'intéresse aux questions de la complète adéquation et de la réalisabilité. Notre analyse suggère un certain nombre d'enseignements au sujet du rôle de la causalité dans la sémantique des systèmes de calcul.
We present by generators and equations the algebraic theory IH whose free model is the category of linear subspaces over a field k. Terms of IH are string diagrams which, for different choices of k, express different kinds of networks and graphical formalisms used by scientists in various fields, such as quantum circuits, electrical circuits and Petri nets. The equations of IH arise from distributive laws between Hopf algebras, whence the name interacting Hopf algebras. The characterisation in terms of subspaces allows one to think of IH as a string diagrammatic syntax for linear algebra: linear maps, spaces and their transformations are all faithfully represented in the graphical language, resulting in an alternative, often insightful perspective on the subject matter. As the main application, we use IH to axiomatise a formal semantics of signal processing circuits, for which we study full abstraction and realisability. Our analysis suggests a reflection on the role of causality in the semantics of computing devices.
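The free model mentioned here, linear subspaces viewed as linear relations, can be made concrete in a small sketch: a relation between F2^m and F2^n is a subspace of F2^(m+n), and diagram composition is relational composition. The example below is over the two-element field and uses brute-force enumeration; the thesis works diagrammatically and over an arbitrary field k.

```python
# Sketch: composing linear relations (subspaces of F2^m x F2^k), the
# free model underlying the theory IH, here over the two-element field
# and by brute-force enumeration. Purely illustrative.

def span(generators, length):
    """All vectors of F2^length in the span of the given generators."""
    vecs = {(0,) * length}
    for g in generators:
        vecs |= {tuple((x + y) % 2 for x, y in zip(v, g)) for v in vecs}
    return vecs

def compose(R, S, m, k, n):
    """Compose R in F2^m x F2^k with S in F2^k x F2^n (by generators)."""
    Rv, Sv = span(R, m + k), span(S, k + n)
    return {r[:m] + s[k:] for r in Rv for s in Sv if r[m:] == s[:k]}

copy = [(1, 1, 1)]               # x related to (x, x): F2^1 -> F2^2
add = [(1, 0, 1), (0, 1, 1)]     # (a, b) related to a + b: F2^2 -> F2^1

# Copying a bit and then adding the two copies relates x to 2x = 0 in F2.
print(sorted(compose(copy, add, 1, 2, 1)))  # [(0, 0), (1, 0)]
```

The printed relation relates every x to 0, an instance of the kind of interaction between the copying comultiplication and the adding multiplication that the equations of IH axiomatise.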
34

Donadello, Ivan. "Semantic Image Interpretation - Integration of Numerical Data and Logical Knowledge for Cognitive Vision." Doctoral thesis, Università degli studi di Trento, 2018. https://hdl.handle.net/11572/369055.

Full text
Abstract:
Semantic Image Interpretation (SII) is the process of generating a structured description of the content of an input image. This description is encoded as a labelled directed graph where nodes correspond to objects in the image and edges to semantic relations between objects. Such a detailed structure allows more accurate searching and retrieval of images. In this thesis, we propose two well-founded methods for SII. Both methods exploit background knowledge about the domain of the images, in the form of logical constraints of a knowledge base. The first method formalizes SII as the extraction of a partial model of a knowledge base. Partial models are built with a clustering and reasoning algorithm that considers both low-level and semantic features of images. The second method uses the Logic Tensor Networks framework to build the labelled directed graph of an image. This framework is able to learn from data in the presence of the logical constraints of the knowledge base. Therefore, the graph construction is performed by predicting the labels of the nodes and the relations according to the logical constraints and the features of the objects in the image. These methods improve the state-of-the-art by introducing two well-founded methodologies that integrate low-level and semantic features of images with logical knowledge. Indeed, other methods do not deal with low-level features, or use only statistical knowledge coming from training sets or corpora. Moreover, the second method surpasses the state-of-the-art performance on the standard task of visual relationship detection.
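The role of logical constraints in building the labelled graph can be sketched without the actual Logic Tensor Networks machinery. The toy below uses invented object labels, invented confidence scores, and one invented rule, applied as a post-hoc filter; the thesis's framework instead learns jointly under the constraints.

```python
# Toy sketch of constraint-aware relation prediction for semantic
# image interpretation: candidate relations carry detector scores, and
# a logical rule ("a wheel can only be part of a vehicle") prunes
# inconsistent edges. Labels, scores and the rule are all invented;
# Logic Tensor Networks learn such constraints rather than filter.

objects = {1: "car", 2: "wheel", 3: "person"}

# (subject id, predicate, object id) -> detector confidence
candidates = {
    (2, "part-of", 1): 0.9,
    (2, "part-of", 3): 0.6,   # violates the rule below
    (3, "next-to", 1): 0.8,
}

VEHICLES = {"car", "truck"}

def consistent(subj, pred, obj):
    if pred == "part-of" and objects[subj] == "wheel":
        return objects[obj] in VEHICLES
    return True

graph = [(s, p, o) for (s, p, o), score in candidates.items()
         if score > 0.5 and consistent(s, p, o)]
print(graph)  # [(2, 'part-of', 1), (3, 'next-to', 1)]
```

The surviving triples are the edges of the labelled directed graph; the high-scoring but inconsistent "wheel part-of person" edge is rejected by the background knowledge.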
35

Donadello, Ivan. "Semantic Image Interpretation - Integration of Numerical Data and Logical Knowledge for Cognitive Vision." Doctoral thesis, University of Trento, 2018. http://eprints-phd.biblio.unitn.it/2888/1/PhD-Thesis.pdf.

Full text
Abstract:
Semantic Image Interpretation (SII) is the process of generating a structured description of the content of an input image. This description is encoded as a labelled directed graph where nodes correspond to objects in the image and edges to semantic relations between objects. Such a detailed structure allows more accurate searching and retrieval of images. In this thesis, we propose two well-founded methods for SII. Both methods exploit background knowledge about the domain of the images, in the form of logical constraints of a knowledge base. The first method formalizes SII as the extraction of a partial model of a knowledge base. Partial models are built with a clustering and reasoning algorithm that considers both low-level and semantic features of images. The second method uses the Logic Tensor Networks framework to build the labelled directed graph of an image. This framework is able to learn from data in the presence of the logical constraints of the knowledge base. Therefore, the graph construction is performed by predicting the labels of the nodes and the relations according to the logical constraints and the features of the objects in the image. These methods improve the state-of-the-art by introducing two well-founded methodologies that integrate low-level and semantic features of images with logical knowledge. Indeed, other methods do not deal with low-level features, or use only statistical knowledge coming from training sets or corpora. Moreover, the second method surpasses the state-of-the-art performance on the standard task of visual relationship detection.
36

Mahmood, Imran. "A Verification Framework for Component Based Modeling and Simulation : “Putting the pieces together”." Doctoral thesis, KTH, Programvaruteknik och Datorsystem, SCS, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-116678.

Full text
Abstract:
The discipline of component-based modeling and simulation offers promising gains, including reductions in development cost, time, and system complexity. This paradigm is very profitable as it promotes the use and reuse of modular components and is auspicious for the effective development of complex simulations. It is, however, confronted by a series of research challenges when it comes to actually practising this methodology. One such important issue is composability verification. In modeling and simulation (M&S), composability is the capability to select and assemble components in various combinations to satisfy specific user requirements. Therefore, to ensure the correctness of a composed model, it is verified with respect to its requirements specifications. There are different approaches and existing component modeling frameworks that support composability; however, in our observation most component modeling frameworks possess weak or no built-in support for composability verification. One such framework is the Base Object Model (BOM), which fundamentally poses a satisfactory potential for effective model composability and reuse. However, it falls short of the required semantics, necessary modeling characteristics and built-in evaluation techniques, which are essential for modeling complex system behavior and for reasoning about the validity of the composability at different levels. In this thesis a comprehensive verification framework is proposed to contend with some important issues in composability verification, and a verification process is suggested to verify the composability of different kinds of system models, such as reactive, real-time and probabilistic systems. With the assumption that all these systems are concurrent in nature, with different composed components interacting with each other simultaneously, the requirements for extensive techniques for structural and behavioral analysis become increasingly challenging.
The proposed verification framework provides methods, techniques and tool support for verifying composability at its different levels. These levels are defined as foundations of a consistent model composability. Each level is discussed in detail and an approach is presented to verify composability at that level. In particular we focus on the Dynamic-Semantic Composability level due to its significance in the overall composability correctness and also due to the level of difficulty it poses in the process. In order to verify composability at this level we investigate the application of three different approaches, namely (i) Petri Nets based Algebraic Analysis, (ii) Colored Petri Nets (CPN) based State-space Analysis, and (iii) Communicating Sequential Processes based Model Checking. All three approaches attack the problem of verifying dynamic-semantic composability in different ways, but they all share the same aim, i.e., to confirm the correctness of a composed model with respect to its requirement specifications. Besides the operative integration of these approaches in our framework, we also contributed to the improvement of each approach for effective applicability in composability verification, such as applying algorithms for automating Petri Net algebraic computations, introducing a state-space reduction technique in CPN based state-space analysis, and introducing function libraries to perform verification tasks and help the modeler with ease of use during composability verification. We also provide detailed examples of using each approach with different models to explain the verification process and their functionality. Lastly we provide a comparison of these approaches and suggest guidelines for choosing the right one based on the nature of the model and the available information.
With the right choice of approach and by following the guidelines of our component-based M&S life-cycle, a modeler can easily construct and verify BOM-based composed models with respect to their requirement specifications.
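Of the three approaches, the state-space analysis can be illustrated in miniature: enumerate the reachable markings of a Petri net and check a requirement over all of them. The toy place/transition net below (two processes sharing one lock) is an invented example, unrelated to the BOM tooling of the thesis.

```python
# Toy state-space exploration of a 1-safe place/transition Petri net,
# in the spirit of the CPN-based analysis: enumerate reachable
# markings and verify a requirement (here, mutual exclusion).
from collections import deque

# Transitions: (consumed places, produced places)
TRANSITIONS = {
    "acquire1": ({"idle1", "lock"}, {"crit1"}),
    "release1": ({"crit1"}, {"idle1", "lock"}),
    "acquire2": ({"idle2", "lock"}, {"crit2"}),
    "release2": ({"crit2"}, {"idle2", "lock"}),
}

def reachable(initial):
    """Breadth-first exploration of the marking graph."""
    seen, queue = {initial}, deque([initial])
    while queue:
        marking = queue.popleft()
        for pre, post in TRANSITIONS.values():
            if pre <= marking:  # transition enabled
                nxt = frozenset((marking - pre) | post)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

states = reachable(frozenset({"idle1", "idle2", "lock"}))
# Requirement: the two processes are never both in their critical section.
assert all(not {"crit1", "crit2"} <= s for s in states)
print(len(states))  # 3 reachable markings
```

Real composability verification faces state-space explosion, which is why the thesis introduces a reduction technique; the brute-force enumeration above only works because the net is tiny.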

Overseas Scholarship for PHD in selected Studies Phase II Batch I

Higher Education Commission of Pakistan.

QC 20130224

37

Novakovic, Novak. "Sémantique algébrique des ressources pour la logique classique." Thesis, Vandoeuvre-les-Nancy, INPL, 2011. http://www.theses.fr/2011INPL075N/document.

Full text
Abstract:
Le thème général de cette thèse est l’exploitation de l’interaction entre la sémantique dénotationnelle et la syntaxe. Des sémantiques satisfaisantes ont été découvertes pour les preuves en logique intuitionniste et linéaire, mais dans le cas de la logique classique, la solution du problème est connue pour être particulièrement difficile. Ce travail commence par l’étude d’une interprétation concrète des preuves classiques dans la catégorie des ensembles ordonnés et bimodules, qui mène à l’extraction d’invariants significatifs. Suit une généralisation de cette sémantique concrète, soit l’interprétation des preuves classiques dans une catégorie compacte fermée où chaque objet est doté d’une structure d’algèbre de Frobenius. Ceci nous mène à une définition de réseaux de démonstrations pour la logique classique. Le concept de correction, l’élimination des coupures et le problème de la “full completeness” sont abordés au moyen d’un enrichissement naturel dans les ordres sur la catégorie de Frobenius, produisant une catégorie pour l'élimination des coupures et un concept de ressources pour la logique classique. Revenant sur notre première sémantique concrète, nous montrons que nous avons une représentation fidèle de la catégorie de Frobenius dans la catégorie des ensembles ordonnés et bimodules
The general theme of this thesis is the exploitation of the fruitful interaction between denotational semantics and syntax. Satisfying semantics have been discovered for proofs in intuitionistic and certain linear logics, but for the classical case, solving the problem is notoriously difficult. This work begins with investigations of concrete interpretations of classical proofs in the category of posets and bimodules, resulting in the definition of meaningful invariants of proofs. Then, generalizing this concrete semantics, classical proofs are interpreted in a free symmetric compact closed category where each object is endowed with the structure of a Frobenius algebra. The generalization paves the way for a theory of proof nets for classical proofs. Correctness, cut elimination and the issue of full completeness are addressed through natural order enrichments defined on the Frobenius category, yielding a category with cut elimination and a concept of resources in classical logic. Revisiting our initial concrete semantics, we show that we have a faithful representation of the Frobenius category in the category of posets and bimodules.
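The Frobenius structure on each object can be made concrete: an algebra together with a linear form whose induced pairing ⟨a, b⟩ = ε(ab) is invariant (⟨ab, c⟩ = ⟨a, bc⟩) and nondegenerate. The check below uses the two-dimensional group algebra of Z/2 over the rationals, a standard example chosen here for illustration; the thesis works with such structures categorically, not computationally.

```python
# Sketch: verifying the Frobenius-algebra conditions for the group
# algebra k[Z/2] with basis (e, g), multiplication g*g = e, and counit
# eps(x*e + y*g) = x. A standard example, chosen for illustration.
from fractions import Fraction as F

def mult(a, b):
    """Multiply a = (x1, y1) and b = (x2, y2) in k[Z/2]."""
    return (a[0] * b[0] + a[1] * b[1], a[0] * b[1] + a[1] * b[0])

def eps(a):
    return a[0]

def pairing(a, b):
    return eps(mult(a, b))

basis = [(F(1), F(0)), (F(0), F(1))]

# Invariance: <ab, c> = <a, bc> on all basis triples.
for a in basis:
    for b in basis:
        for c in basis:
            assert pairing(mult(a, b), c) == pairing(a, mult(b, c))

# Nondegeneracy: the Gram matrix of the pairing is invertible.
gram = [[pairing(a, b) for b in basis] for a in basis]
det = gram[0][0] * gram[1][1] - gram[0][1] * gram[1][0]
assert det != 0
print("Frobenius conditions hold; Gram determinant =", det)
```

In the compact closed category of the thesis, this kind of structure is what equips each object with the cups, caps, multiplication and comultiplication used to interpret classical proofs.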
38

LIMA, Lucas Albertins de. "Formalisation of SysML design models and an analysis strategy using refinement." Universidade Federal de Pernambuco, 2016. https://repositorio.ufpe.br/handle/123456789/17636.

Full text
Abstract:
The increasing complexity of systems has led to increasing difficulty in design. The standard approach to development, based on trial and error, with testing used at later stages to identify errors, is costly and leads to unpredictable delivery times. In addition, for critical systems, for which safety is a major concern, early verification and validation (V&V) is recognised as a valuable approach to promote dependability. In this context, we identify three important and desirable features of a V&V technique: (i) a graphical modelling language; (ii) formal and rigorous reasoning; and (iii) automated support for modelling and reasoning. We address these points with a refinement technique for SysML supported by tools. SysML is a UML-based language for systems design; it has itself become a de facto standard in the area. There is wide availability of tool support from vendors like IBM, Atego, and Sparx Systems. Our work is distinctive in two ways: a semantics for refinement and for a representative collection of elements from the UML4SysML profile (blocks, state machines, activities, and interactions) used in combination. We provide a means to analyse design models specified using SysML. This facilitates the discovery of problems earlier in the system development lifecycle, reducing the time and costs of production. In this work we describe our semantics, which is defined using a state-rich process algebra called CML and implemented in a tool for automatic generation of formal models. We also show how the semantics can be used for refinement-based analysis and development. Our case studies are a leadership-election protocol, a critical component of an industrial application, and a dwarf signal, a device used to control rail traffic.
Our contributions are: a set of guidelines that provide meaning to the different modelling elements of SysML used during the design of systems; the individual formal semantics for SysML activities, blocks and interactions; an integrated semantics that combines these semantics with another defined for state machines; and a framework for reasoning using refinement about systems specified by collections of SysML diagrams.
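The flavour of refinement-based analysis can be sketched on labelled transition systems: an implementation trace-refines a specification when every trace of the implementation is also a trace of the specification. The toy, bounded check below uses invented machines and is unrelated to the CML-generated models of the thesis.

```python
# Toy trace-refinement check between labelled transition systems:
# impl refines spec iff traces(impl) is a subset of traces(spec).
# Bounded enumeration; the machines below are invented examples.

def traces(lts, state, depth):
    """All event sequences of length <= depth from the given state."""
    if depth == 0:
        return {()}
    result = {()}
    for event, target in lts.get(state, []):
        result |= {(event,) + t for t in traces(lts, target, depth - 1)}
    return result

# Spec: after a request, answer with either an ack or a nack.
spec = {"s0": [("req", "s1")], "s1": [("ack", "s0"), ("nack", "s0")]}
# Impl: resolves the choice, always acks.
impl = {"i0": [("req", "i1")], "i1": [("ack", "i0")]}

def refines(impl_lts, i0, spec_lts, s0, depth=6):
    return traces(impl_lts, i0, depth) <= traces(spec_lts, s0, depth)

print(refines(impl, "i0", spec, "s0"))  # True: impl narrows the choice
print(refines(spec, "s0", impl, "i0"))  # False: spec may emit nack
```

Refinement in a state-rich process algebra like CML also accounts for state, divergence and failures, which a pure bounded trace check like this one does not capture.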
O aumento da complexidade dos sistemas tem levado a um aumento na dificuldade da atividade de projeto. A abordagem padrão para desenvolvimento, baseada em tentativa e erro, com testes usados em estágios avançados para identificar erros, é custosa e leva a prazos de entrega imprevisíveis. Além disto, para sistemas críticos, para os quais segurança é um conceito chave, Verificação e Validação (V&V) com antecedência é reconhecida como uma abordagem valiosa para promover confiança. Neste contexto, nós identificamos três características importantes e desejáveis de uma técnica de V&V: (i) uma linguagem de modelagem gráfica; (ii) raciocínio formal e rigoroso, e (iii) suporte automático para modelagem e raciocínio. Nós tratamos estes pontos com uma técnica de refinamento para SysML apoiada por ferramentas. SysML é uma linguagem baseada na UML para o projeto de sistemas. Ela tem se tornado um padrão de facto na área. Há uma grande disponibilidade de ferramentas de fornecedores como IBM, Atego, e Sparx Systems. Nosso trabalho se destaca de duas maneiras: ao fornecer uma semântica para refinamento e considerar uma coleção representativa de elementos do perfil UML4SysML (blocos, máquina de estados, atividades, e interações) usados de forma combinada. Nós fornecemos uma estratégia para analisar modelos de projeto especificados em SysML. Isto facilita a descoberta de problemas mais cedo durante o ciclo de vida de desenvolvimento de sistemas, reduzindo tempo e custos de produção. Neste trabalho nós descrevemos nossa semântica a qual é definida usando uma álgebra de processo rica em estado chamada CML e implementada em uma ferramenta para geração automática de modelos formais. Nós também mostramos como esta semântica pode ser usada para análise baseada em refinamento. Nossos estudos de caso são um protocolo de eleição de líder, o qual é um componente crítico de uma aplicação industrial, e um sinal anão, o qual é um dispositivo para controlar tráfego em linhas férreas. 
Nossas contribuições são: um conjunto de orientações que fornecem significado para os diferentes elementos de modelagem de SysML usados durante o projeto de sistemas; as semânticas formais individuais para atividades, blocos e interações de SysML; uma semântica integrada que combina estas semânticas com outra definida para máquina de estados; e um arcabouço que usa refinamento para raciocínio de sistemas especificados por coleções de diagramas SysML.
39

Novakovic, Novak. "Sémantique algébrique des ressources pour la logique classique." Electronic Thesis or Diss., Vandoeuvre-les-Nancy, INPL, 2011. http://www.theses.fr/2011INPL075N.

Full text
Abstract:
Le thème général de cette thèse est l’exploitation de l’interaction entre la sémantique dénotationnelle et la syntaxe. Des sémantiques satisfaisantes ont été découvertes pour les preuves en logique intuitionniste et linéaire, mais dans le cas de la logique classique, la solution du problème est connue pour être particulièrement difficile. Ce travail commence par l’étude d’une interprétation concrète des preuves classiques dans la catégorie des ensembles ordonnés et bimodules, qui mène à l’extraction d’invariants significatifs. Suit une généralisation de cette sémantique concrète, soit l’interprétation des preuves classiques dans une catégorie compacte fermée où chaque objet est doté d’une structure d’algèbre de Frobenius. Ceci nous mène à une définition de réseaux de démonstrations pour la logique classique. Le concept de correction, l’élimination des coupures et le problème de la “full completeness” sont abordés au moyen d’un enrichissement naturel dans les ordres sur la catégorie de Frobenius, produisant une catégorie pour l'élimination des coupures et un concept de ressources pour la logique classique. Revenant sur notre première sémantique concrète, nous montrons que nous avons une représentation fidèle de la catégorie de Frobenius dans la catégorie des ensembles ordonnés et bimodules
The general theme of this thesis is the exploitation of the fruitful interaction between denotational semantics and syntax. Satisfying semantics have been discovered for proofs in intuitionistic and certain linear logics, but for the classical case, solving the problem is notoriously difficult. This work begins with investigations of concrete interpretations of classical proofs in the category of posets and bimodules, resulting in the definition of meaningful invariants of proofs. Then, generalizing this concrete semantics, classical proofs are interpreted in a free symmetric compact closed category where each object is endowed with the structure of a Frobenius algebra. The generalization paves the way for a theory of proof nets for classical proofs. Correctness, cut elimination and the issue of full completeness are addressed through natural order enrichments defined on the Frobenius category, yielding a category with cut elimination and a concept of resources in classical logic. Revisiting our initial concrete semantics, we show that we have a faithful representation of the Frobenius category in the category of posets and bimodules.
40

Nour, Abir. "Etude de systèmes logiques extensions de la logique intuitionniste." Université Joseph Fourier (Grenoble), 1997. http://www.theses.fr/1997GRE10128.

Full text
Abstract:
To model the reasoning of an ordered set T of intelligent agents, H. Rasiowa introduced logical systems called approximation logics. In these systems a set of constants, although difficult to justify in applications, plays a fundamental role. In our work we consider logical systems, called LTF, without this kind of constants, restricting ourselves to the case where T is a finite ordered set. We prove a weak deduction theorem and present a list of properties that are used later on. We also introduce an algebraic semantics using Heyting algebras with operators. To prove the completeness of the system LTF with respect to the algebraic semantics, we use the method of H. Rasiowa and R. Sikorski for first-order logic. In the propositional case, a corollary allows us to state that the question of whether a formula of the propositional calculus is valid is effectively decidable. We also study certain relations between the logics LTF and intuitionistic and classical logic. In addition, the tableaux method is considered, as it is well known in the literature on non-classical logics. Finally, we introduce a Kripke-style semantics: the set of so-called possible worlds is enriched here with a family of functions indexed by the finite set T and satisfying certain conditions. This kind of semantics allows us to derive several results. We construct a particular finite Kripke-style model that characterises the propositional calculus LTF. We introduce a relational semantics following the method of E. Orlowska, which has the great advantage of allowing an interpretation of the propositional logic LTF using only one kind of object: relations. We also address the complexity of the propositional calculus LTF.
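The algebraic semantics via Heyting algebras can be illustrated on the three-element chain 0 < 1/2 < 1, where the relative pseudocomplement a → b is 1 when a ≤ b and b otherwise; the classical law ¬¬p → p already fails there. The chain is a standard textbook example, not the construction of the thesis, whose algebras additionally carry operators indexed by the agent set.

```python
# Sketch: the 3-element chain 0 < 1/2 < 1 as a Heyting algebra,
# separating intuitionistic from classical validity. A standard
# example; the thesis's algebras also carry agent-indexed operators.
from fractions import Fraction as Fr

CHAIN = [Fr(0), Fr(1, 2), Fr(1)]

def imp(a, b):
    """Relative pseudocomplement in a chain: a -> b."""
    return Fr(1) if a <= b else b

def neg(a):
    return imp(a, Fr(0))

# Intuitionistically valid: p -> ~~p evaluates to 1 everywhere.
assert all(imp(p, neg(neg(p))) == Fr(1) for p in CHAIN)

# Classically valid but not intuitionistically: ~~p -> p fails at 1/2.
failures = [p for p in CHAIN if imp(neg(neg(p)), p) != Fr(1)]
print(failures)  # [Fraction(1, 2)]
```

Completeness, as in the thesis, is the converse direction: a formula valid in every algebra of the relevant class is provable, so finite counter-algebras like this chain witness unprovability.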
APA, Harvard, Vancouver, ISO, and other styles
41

Alves, Aretha Fontes. "Álgebra linear como um curso de serviço para a licenciatura em matemática: o estudo dos espaços vetoriais." Universidade Federal de Juiz de Fora, 2013. https://repositorio.ufjf.br/jspui/handle/ufjf/1159.

Full text
Abstract:
This work aims to identify the characteristics of a Service Course in Linear Algebra aimed at pre-service mathematics teachers (Licenciatura em Matemática). To that end, we develop an analysis based on the Model of Semantic Fields and a literature review focused on the three themes that permeate our study: the production of meanings for Linear Algebra; the notion of a Service Course; and the mathematical training of the mathematics teacher. This analysis prepared us to conduct field research consisting of a seminar on Linear Algebra, with the aim of establishing what these characteristics would be.
APA, Harvard, Vancouver, ISO, and other styles
42

Seiller, Thomas. "Logique dans le facteur hyperfini : Géométrie de l' interaction et complexité." Thesis, Aix-Marseille, 2012. http://www.theses.fr/2012AIXM4064.

Full text
Abstract:
This thesis is a study of the geometry of interaction in the hyperfinite factor (GoI5) introduced by Jean-Yves Girard, and of its relations with the earlier constructions. We begin by showing how to obtain purely geometrical adjunctions as an identity between the sets of cycles appearing between graphs. It is then possible, by choosing a function that measures those cycles, to obtain a numerical adjunction. We then show how to construct, on the basis of such a numerical adjunction, a geometry of interaction for multiplicative additive linear logic (MALL) in which proofs are interpreted as graphs. We also explain how this construction yields a denotational semantics for MALL and a notion of truth, and we extend the setting to deal with the exponential connectives and second-order quantification, showing a full soundness result for a variant of elementary linear logic (ELL). Since the constructions on graphs are parametrised by the function that measures cycles, we then study two particular cases. The first turns out to be a combinatorial version of GoI5, yielding a geometrical characterisation of its orthogonality based on the Fuglede-Kadison determinant. The second is a combinatorial version of the earlier geometry-of-interaction constructions, where orthogonality is based on nilpotency. This clarifies the relationship between the different versions of the geometry of interaction and shows that the two adjunctions, which at first sight seem quite different, are both consequences of a single geometrical identity. In the last part, we study the notion of subjective truth.
APA, Harvard, Vancouver, ISO, and other styles
43

Sellami, Akrem. "Interprétation sémantique d'images hyperspectrales basée sur la réduction adaptative de dimensionnalité." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0037/document.

Full text
Abstract:
Hyperspectral imaging acquires rich spectral information about a scene in several hundred or even thousands of narrow, contiguous spectral bands. With this high number of bands, however, the strong inter-band correlation and the redundancy of the spectro-spatial information make the interpretation of such massive hyperspectral data one of the major challenges for the remote-sensing community. In this context, the key problem is to reduce the number of unnecessary spectral bands, that is, to reduce redundancy and strong inter-band correlation while preserving the relevant information. Projection approaches transform the hyperspectral data into a reduced subspace by combining all the original spectral bands, whereas band-selection approaches seek a subset of relevant spectral bands.
In this thesis we first address hyperspectral image classification, integrating spectro-spatial information into dimensionality reduction in order to improve classification performance and to overcome the loss of spatial information inherent in projection approaches. We therefore propose a hybrid model that preserves spectro-spatial information by exploiting tensors in the tensor locality preserving projection approach (TLPP), and that uses unsupervised constraint-based band selection (CBS) to select discriminant spectral bands. To model the uncertainty and imperfection affecting these reduction approaches and the classifiers, we propose an evidential approach based on Dempster-Shafer theory (DST). In a second step, we extend the hybrid model by exploiting semantic knowledge extracted from the features obtained with TLPP to enrich the unsupervised CBS technique. The resulting approach selects relevant spectral bands that are simultaneously informative, discriminant, distinctive, and minimally redundant: the rules extracted by knowledge-extraction techniques are injected into CBS so as to select the optimal subset of relevant bands automatically and adaptively. The performance of our approach is evaluated on several real hyperspectral data sets.
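The two families of reduction methods contrasted in this abstract, projection and band selection, can be sketched generically. The snippet below uses plain PCA as a stand-in for the thesis's TLPP projection and a greedy correlation heuristic as a stand-in for CBS; the data, thresholds, and function names are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for the two reduction strategies: project pixels onto
# principal components (projection), and greedily pick spectral bands
# with low mutual correlation (band selection).  Generic illustration,
# not the thesis's TLPP/CBS methods.
rng = np.random.default_rng(0)
pixels = rng.normal(size=(500, 50))                        # 500 pixels, 50 bands
pixels[:, 1] = pixels[:, 0] + 0.01 * rng.normal(size=500)  # a redundant band

def pca_project(X, k):
    """Project the data onto the k leading principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def select_bands(X, k):
    """Greedy selection: start from the highest-variance band, then
    repeatedly add the band least correlated with those already chosen."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    chosen = [int(np.argmax(X.var(axis=0)))]
    while len(chosen) < k:
        # each candidate's worst-case redundancy w.r.t. the chosen set
        redundancy = corr[:, chosen].max(axis=1)
        redundancy[chosen] = np.inf  # exclude already-chosen bands
        chosen.append(int(np.argmin(redundancy)))
    return sorted(chosen)

print(pca_project(pixels, 5).shape)  # (500, 5): all bands combined
bands = select_bands(pixels, 10)
print(len(bands))                    # 10: a subset of original bands
```

The contrast is visible in the outputs: projection mixes every original band into each new feature, while selection keeps a small subset of physically meaningful bands, which is what makes it attractive for semantic interpretation.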
APA, Harvard, Vancouver, ISO, and other styles
44

Kouchnarenko, Olga. "Sémantique des programmes récursifs-parallèles et méthodes pour leur analyse." Phd thesis, Université Joseph Fourier (Grenoble), 1997. http://tel.archives-ouvertes.fr/tel-00004949.

Full text
Abstract:
This thesis belongs to the line of work devoted to developing semantic models for concurrent programming languages. A distinctive feature of our study is the explicit treatment of recursion in parallel programs. For these programs we propose an original semantics whose properties we study in detail. Although the proposed model is not finite-state, it can be equipped with the structure of well-structured transition systems in the sense of Finkel, which establishes the decidability of many verification problems. This Plotkin-style semantics gives a better understanding of the behaviour of recursive-parallel programs, allows some of their properties to be analysed formally, describes the implementation strategy, and states in what sense it is correct.
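A Plotkin-style small-step semantics for parallel programs of the kind studied here can be sketched on a toy language where a process may spawn children that run alongside it. The syntax, rules, and exploration routine below are illustrative assumptions, not the thesis's calculus.

```python
from collections import deque

# Configurations are tuples of process terms; each rule rewrites one
# process, in the nondeterministic small-step (SOS) style.
# Terms: ("skip",), ("act", name, cont), ("spawn", child, cont).
def step(procs):
    """Return all configurations reachable from `procs` in one step."""
    nexts = []
    for i, t in enumerate(procs):
        rest = procs[:i] + procs[i + 1:]
        if t[0] == "skip":
            nexts.append(rest)                 # process terminates
        elif t[0] == "act":
            nexts.append(rest + (t[2],))       # perform action, continue
        elif t[0] == "spawn":
            nexts.append(rest + (t[1], t[2]))  # child runs in parallel
    return nexts

def reachable(init, limit=1000):
    """Breadth-first exploration, identifying configurations up to the
    order of processes (the `limit` guards against infinite state spaces,
    which recursive spawning produces in general)."""
    seen = {tuple(sorted(init))}
    queue = deque([init])
    while queue and len(seen) < limit:
        for nxt in step(queue.popleft()):
            key = tuple(sorted(nxt))
            if key not in seen:
                seen.add(key)
                queue.append(nxt)
    return seen

skip = ("skip",)
prog = (("spawn", ("act", "a", skip), ("act", "b", skip)),)
confs = reachable(prog)
print(len(confs))  # 9 distinct configurations for this tiny program
```

For genuinely recursive spawning the reachable set is infinite, which is why the well-structured-transition-system machinery (a well-quasi-order on configurations) is needed to recover decidability of verification problems.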
APA, Harvard, Vancouver, ISO, and other styles
45

Sun, Juemin. "On the algebraic denotational specifications of programming language semantics." 1988. http://hdl.handle.net/2097/22461.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

陳彥廷. "Exploring Students’ Semantics Understanding toward Algebraic Literal Symbols through Example-Questioning Model." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/92607798683917148539.

Full text
Abstract:
Doctorate
National Kaohsiung Normal University
Graduate Institute of Science Education
95
The study explored the conceptual-change model by which a group of Taiwanese seventh graders come to understand the semantics of literal symbols in algebraic expressions. After analysing the questionnaire responses of an initial sample of 76 students, 15 students were selected as the final subjects from three score levels: high, medium, and low. Comparing the questionnaire results with a flow-map analysis of the semantic roles of literal symbols, the study first established the students' entry behaviour and then interviewed the 15 students using a guiding Example-Questioning model. Based on a model of the growth of mathematical understanding, a qualitative analysis of the series of example questions revealed the conceptual-change modes of students at the three levels. The study disclosed (1) the different types of questions the instructor asked to promote Progress Understanding and Regress Understanding among the three levels of students, (2) the different problem-solving strategies used by the three levels of students, and (3) the distribution of understanding of literal symbols among the three levels. First, the findings showed that students at different levels hold different perceptions of the semantics of literal symbols. Second, higher-level students exhibited more frequent episodes of both Progress Understanding and Regress Understanding: they concentrated on constructing and clarifying their semantic understanding of algebraic literal symbols, whereas lower-level students focused on building operational skills with algebraic literal symbols. Higher-level students could reach the stages of Formalization, Observation, and Construction, while lower-level students could not reach Construction.
Third, in the phase of promoting Progress Understanding, the examples guided high- and medium-level students to clarify the connotation of each formula and to comprehend the semantic role of literal symbols, whereas the instructor needed to spend more time with lower-level students explaining the meaning of the examples and the skills of operation. In the phase of promoting Regress Understanding, examples calling for metacognitive judgement led students at all three levels to Regress Understanding, while examples about operational skill with literal symbols also led lower-level students there. As strategies for promoting Progress Understanding, the three levels of students made use of Advanced Questions, Pause Questions, and Obstructed Questions.
APA, Harvard, Vancouver, ISO, and other styles
47

"Semantica algebrica de traduções possiveis." Tese, Biblioteca Digital da Unicamp, 2004. http://libdigi.unicamp.br/document/?code=vtls000337884.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Meinke, Julia. "Individuelle Curricula von Lehrkräften in der Algebra." Doctoral thesis, 2016. http://hdl.handle.net/11858/00-1735-0000-0028-87EB-6.

Full text
Abstract:
The theoretical construct of teachers' individual curricula, operationalised by means of the research programme on subjective theories (FST), was used to investigate teachers' individual conceptions in the area of grade 7 and 8 algebra, with the aim of reconstructing and understanding their teaching and planning decisions in algebra instruction and thereby gaining insight into everyday classroom practice. Nine Gymnasium teachers were examined in an interview study. The results of the individual case studies were then contrasted with one another, and three types of individual algebra curricula were developed. The results showed, for example, that a main problem for the teachers is that they tend to separate the syntactic and semantic elements of algebra in their teaching, which raises for them the question of the proportion in which the two should be taught. A comparison of the present study with existing studies in the areas of stochastics, geometry, and analysis supports the conjecture that individual curricula are domain-specific.
APA, Harvard, Vancouver, ISO, and other styles
49

Kuntz, Matthias [Verfasser]. "Symbolic semantics and verification of stochastic process algebras = Symbolische Semantik und Verifikation stochastischer Prozessalgebren / vorgelegt von Georg Wolfgang Matthias Kuntz." 2006. http://d-nb.info/97894139X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Baker-Finch, C. A. "An application of algebra to the semantics of programming languages." Thesis, 1985. https://eprints.utas.edu.au/18885/1/whole_Baker-FinchCA1985_thesis.pdf.

Full text
Abstract:
This dissertation investigates the use of the algebraic style of abstract data type specifications for the definition of programming language semantics. The choice of appropriate mathematical models for such presentations is an important aspect of this work largely because the semantics of programming languages will generally be defined in terms of domains that are more complex than those required for dealing with more elementary data types. The relationship between initial algebra semantics and the proposed style of specification is explored. From this foundation, the intuitive notion of the congruence of a pair of semantic definitions can be inspected and formalised against an algebraic background. Using the formal definition so developed and the simple but powerful notion of initiality, proofs of congruence are possible for semantics that are not amenable to the more traditional techniques of structural and fixed-point induction. Finally the problem of establishing the correctness of a compiler is investigated, reworking the traditional "commuting square" approach for the style of semantic presentation developed in this thesis rather than the usual initial algebra style. This allows a clearer focus on some of the shortcomings of the commuting square notion.
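The "commuting square" notion of compiler correctness discussed in this abstract can be shown in miniature: a direct semantics of a source language and a compile-then-run semantics of a target machine that agree on every program. This is a standard textbook construction, not the thesis's algebraic formalism; all names are illustrative.

```python
# Source semantics (evaluate) and target semantics (run after compile)
# form the two paths of the commuting square; correctness says they
# always yield the same result.
def evaluate(expr):
    """Direct semantics of arithmetic expressions."""
    op, *args = expr
    if op == "lit":
        return args[0]
    if op == "add":
        return evaluate(args[0]) + evaluate(args[1])
    if op == "mul":
        return evaluate(args[0]) * evaluate(args[1])

def compile_expr(expr):
    """Compile an expression to a postfix stack-machine program."""
    op, *args = expr
    if op == "lit":
        return [("push", args[0])]
    return compile_expr(args[0]) + compile_expr(args[1]) + [(op,)]

def run(code):
    """Semantics of the target stack machine."""
    stack = []
    for instr in code:
        if instr[0] == "push":
            stack.append(instr[1])
        elif instr[0] == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif instr[0] == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1]

e = ("add", ("lit", 2), ("mul", ("lit", 3), ("lit", 4)))
# The square commutes: evaluating directly equals compiling then running.
print(evaluate(e), run(compile_expr(e)))  # 14 14
```

In the initial-algebra view taken by such work, `evaluate` and `run . compile_expr` are two homomorphisms out of the initial algebra of expressions, and initiality gives the commuting square without structural induction.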
APA, Harvard, Vancouver, ISO, and other styles