Dissertations / Theses on the topic 'Propositional logic'

To see the other types of publications on this topic, follow the link: Propositional logic.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Propositional logic.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Barbosa, Fábio Daniel Moreira. "Probabilistic propositional logic." Master's thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/22198.

Full text
Abstract:
Master's in Mathematics and Applications
The term Probabilistic Logic generally refers to any logic that incorporates probabilistic concepts into a formal logical system. The main focus of this dissertation is a probabilistic logic, called Exogenous Probabilistic Propositional Logic (EPPL), which is based on Classical Propositional Logic. We present its syntax, semantics and a Hilbert calculus, and prove several classical results of Probability Theory in the context of EPPL. Two important properties of a logical system, soundness and completeness, are also studied: we prove the soundness of EPPL in the standard way, and weak completeness by means of a satisfiability algorithm for EPPL formulas. We also consider, within EPPL, concepts from other probabilistic logics (uncertainty and interval probabilities) and from Probability Theory (conditional probability and independence).
APA, Harvard, Vancouver, ISO, and other styles
2

Vieira, Bruno Lopes. "Extending Propositional Dynamic Logic for Petri Nets." Pontifícia Universidade Católica do Rio de Janeiro, 2014. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=24052@1.

Full text
Abstract:
Propositional Dynamic Logic (PDL) is a multi-modal logic used for specifying and reasoning about sequential programs. Petri Nets are a widely used formalism for specifying and analysing concurrent systems, with a very intuitive graphical representation. In this work we propose extensions of Propositional Dynamic Logic in which programs are replaced by Petri Nets. We define a compositional encoding of Petri Nets from basic nets as terms, use these terms as PDL programs, and provide a compositional semantics for PDL formulas. We then present an axiomatization and prove it sound and complete with respect to our semantics. Three Dynamic Logics are presented: one for reasoning about ordinary Marked Petri Nets and two for Marked Stochastic Petri Nets, which make it possible to model more complex scenarios. Some deductive systems for these logics are presented. The main advantage of our approach is that we can reason about [Stochastic] Petri Nets directly in the Dynamic Logic, without translating them into other formalisms; moreover, the approach is compositional, allowing complex nets to be built from basic ones.
APA, Harvard, Vancouver, ISO, and other styles
3

Lee, Chen-Hsiu. "A tabular propositional logic: and/or Table Translator." CSUSB ScholarWorks, 2003. https://scholarworks.lib.csusb.edu/etd-project/2409.

Full text
Abstract:
The goal of this project is to design a tool that helps users translate any propositional logic statement into Disjunctive Normal Form and present the result as an AND/OR table, which makes the logical relation easier to express using a two-dimensional grid of values or expressions. The tool is implemented as a Java-based web application, so users can access it via the World Wide Web.
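The core transformation described here, converting a propositional statement into Disjunctive Normal Form and displaying it as an AND/OR table, can be illustrated with a truth-table sketch. The following Python snippet is a minimal illustration of the idea, not the thesis's Java tool; the example formula and variable names are assumptions.

```python
# A minimal sketch of the transformation the abstract describes: enumerate a
# truth table for a propositional formula, read off its Disjunctive Normal
# Form, and print it as a small AND/OR table. The formula encoding and the
# variable names below are illustrative assumptions.
from itertools import product

def to_dnf(variables, formula):
    """Return the DNF of `formula` (a Python callable over booleans)
    as a list of satisfying assignments (one conjunction per row)."""
    rows = []
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if formula(**env):
            rows.append(env)
    return rows

def print_and_or_table(variables, rows):
    # Each row is one conjunct of the DNF; columns are the variables.
    print(" | ".join(variables))
    for env in rows:
        print(" | ".join("T" if env[v] else "F" for v in variables))

if __name__ == "__main__":
    vars_ = ["p", "q", "r"]
    # Example formula: (p -> q) and r, i.e. ((not p) or q) and r
    phi = lambda p, q, r: ((not p) or q) and r
    rows = to_dnf(vars_, phi)
    print_and_or_table(vars_, rows)
    # DNF as a formula string, one disjunct per satisfying row
    disjuncts = [" & ".join(v if env[v] else f"~{v}" for v in vars_) for env in rows]
    print(" | ".join(f"({d})" for d in disjuncts))
```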
APA, Harvard, Vancouver, ISO, and other styles
4

Mitrović, Moreno. "Morphosyntactic atoms of propositional logic : (a philo-logical programme)." Thesis, University of Cambridge, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.709276.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Boskovitz, Agnes. "Data Editing and Logic: The covering set method from the perspective of logic." The Australian National University. Research School of Information Sciences and Engineering, 2008. http://thesis.anu.edu.au./public/adt-ANU20080314.163155.

Full text
Abstract:
Errors in collections of data can cause significant problems when those data are used. Therefore the owners of data find themselves spending much time on data cleaning. This thesis is a theoretical work about one part of the broad subject of data cleaning - to be called the covering set method. More specifically, the covering set method deals with data records that have been assessed by the use of edits, which are rules that the data records are supposed to obey. The problem solved by the covering set method is the error localisation problem, which is the problem of determining the erroneous fields within data records that fail the edits. In this thesis I analyse the covering set method from the perspective of propositional logic. I demonstrate that the covering set method has strong parallels with well-known parts of propositional logic. The first aspect of the covering set method that I analyse is the edit generation function, which is the main function used in the covering set method. I demonstrate that the edit generation function can be formalised as a logical deduction function in propositional logic. I also demonstrate that the best-known edit generation function, written here as FH (standing for Fellegi-Holt), is essentially the same as propositional resolution deduction. Since there are many automated implementations of propositional resolution, the equivalence of FH with propositional resolution gives some hope that the covering set method might be implementable with automated logic tools. However, before any implementation, the other main aspect of the covering set method must also be formalised in terms of logic. This other aspect, to be called covering set correctibility, is the property that must be obeyed by the edit generation function if the covering set method is to successfully solve the error localisation problem. In this thesis I demonstrate that covering set correctibility is a strengthening of the well-known logical properties of soundness and refutation completeness. What is more, the proofs of the covering set correctibility of FH and of the soundness / completeness of resolution deduction have strong parallels: while the proof of soundness / completeness depends on the reduction property for counter-examples, the proof of covering set correctibility depends on the related lifting property. In this thesis I also use the lifting property to prove the covering set correctibility of the function defined by the Field Code Forest Algorithm. In so doing, I prove that the Field Code Forest Algorithm, whose correctness has been questioned, is indeed correct. The results about edit generation functions and covering set correctibility apply to both categorical edits (edits about discrete data) and arithmetic edits (edits expressible as linear inequalities). Thus this thesis gives the beginnings of a theoretical logical framework for error localisation, which might give new insights to the problem. In addition, the new insights will help develop new tools using automated logic tools. What is more, the strong parallels between the covering set method and aspects of logic are of aesthetic appeal.
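Since the thesis shows that the Fellegi-Holt edit generation function FH is essentially propositional resolution deduction, a tiny resolution engine helps make the deduction rule concrete. The sketch below is generic textbook resolution, not the covering set method itself; the integer-literal clause encoding and the example clauses are assumptions.

```python
# A minimal sketch of propositional resolution, the deduction rule the thesis
# identifies with the Fellegi-Holt edit generation function. Clauses are
# frozensets of integer literals (negative = negated variable); the example
# clauses are assumptions for illustration.
def resolvents(c1, c2):
    """All non-tautological resolvents of two clauses."""
    out = set()
    for lit in c1:
        if -lit in c2:
            r = (c1 - {lit}) | (c2 - {-lit})
            if not any(-l in r for l in r):   # drop tautologies
                out.add(frozenset(r))
    return out

def saturate(clauses):
    """Close a clause set under resolution (fine for small inputs)."""
    clauses = set(clauses)
    changed = True
    while changed:
        changed = False
        for c1 in list(clauses):
            for c2 in list(clauses):
                for r in resolvents(c1, c2):
                    if r not in clauses:
                        clauses.add(r)
                        changed = True
    return clauses

if __name__ == "__main__":
    # {p or q, not p or r, not q or r} resolves down to the unit clause {r}
    cnf = [frozenset({1, 2}), frozenset({-1, 3}), frozenset({-2, 3})]
    closure = saturate(cnf)
    print(frozenset({3}) in closure)          # True: r is derivable
    print(frozenset() in closure)             # False: the clause set is satisfiable
```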
APA, Harvard, Vancouver, ISO, and other styles
6

Quillen, Keith Raymond. "Propositional Attitudes and Psychological Explanation (Mind, Mental)." Diss., The University of Arizona, 1985. http://hdl.handle.net/10150/188053.

Full text
Abstract:
Propositional attitudes, states like believing, desiring, intending, etc., have played a central role in the articulation of many of our major theories, both in philosophy and the social sciences. Until relatively recently, psychology was a prominent entry on the list of social sciences in which propositional attitudes occupied center stage. In this century, though, behaviorists began to make a self-conscious effort to expunge "mentalistic" notions from their theorizing. Behaviorism has failed. Psychology therefore is again experiencing "formative years," and two themes have caught the interest of philosophers. The first is that psychological theories evidently must exploit a vast array of relations obtaining among internal states. The second is that the use of mentalistic idioms seems to be explicit again in much of current theorizing. These two observations have led philosophers to wonder about the probable as well as the proper role of propositional attitudes in future psychological theories. Some philosophers wonder, in particular, about the role of the contents of propositional attitudes in the forthcoming theories. Their strategy is in part to discern what sorts of theory psychologists now will want to construct, and then discern what role propositional attitude contents might play in theories of those sorts. I consider here two sorts of theory, what I call minimal functional theories and what is known as propositional attitude psychology. I outline these two kinds of theory, and show how each defines a role for contents. Contents are ultimately eliminable in minimal functional theories. Although they play an apparently ineliminable role in propositional attitude psychology, they do so at an apparent cost. Propositional attitude psychology does not seem to accommodate a certain methodological principle, a principle of individualism in psychology, which is endorsed even by some of the philosophers most enamored of the approach. Such philosophers have two options: they can attempt to show that the conflict between the approach and the principle is not genuine, or they can reject the principle. I argue that the conflict is real, and recommend a qualified rejection of the principle.
APA, Harvard, Vancouver, ISO, and other styles
7

Favro, Giordano <1985&gt. "Algebraic structures for the lambda calculus and the propositional logic." Doctoral thesis, Università Ca' Foscari Venezia, 2015. http://hdl.handle.net/10579/8333.

Full text
Abstract:
In the first part of the thesis we define two families of sets, Mn and Gn, where n ranges over the natural numbers, consisting of inductively defined mute terms called restricted regular mute and regular mute terms, respectively. We also prove that the sets Mn are graph easy, i.e. that for every closed term t there is a graph model equating t to all the elements of Mn. In the second part we introduce factor algebras over first-order types and show how they can be used as an algebraic counterpart of first-order structures. We show that this translation extends to formulas and to equations between terms, and that these translations have a semantic meaning; using these results, first-order logic can be studied by algebraic techniques. We then build an algebraic calculus for propositional logic based on the axioms of the variety generated by the factor algebras over the type of propositional logic, and we provide a confluent and terminating rewriting system for the calculus. We also analyse the basic algebraic properties of factor algebras over first-order types.
APA, Harvard, Vancouver, ISO, and other styles
8

Walton, Matthew. "First-order lax logic : a framework for abstraction, constraints and refinement." Thesis, University of Sheffield, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.299599.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Gore, Rajeev. "Cut-free sequent and tableau systems for propositional normal modal logics." Thesis, University of Cambridge, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.239668.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Namasivayam, Gayathri. "On Simple but Hard Random Instances of Propositional Theories and Logic Programs." UKnowledge, 2011. http://uknowledge.uky.edu/gradschool_diss/132.

Full text
Abstract:
In the last decade, Answer Set Programming (ASP) and Satisfiability (SAT) have been used to solve combinatorial search problems and the practical applications in which they arise. In each of these formalisms, a tool called a solver is used to solve problems. A solver takes as input a specification of the problem – a logic program in the case of ASP, and a CNF theory for SAT – and produces as output a solution to the problem. Designing fast solvers is important for the success of this general-purpose approach to solving search problems. Classes of instances that pose challenges to solvers can help in this task. In this dissertation we create challenging yet simple benchmarks for existing solvers in ASP and SAT. We do so by providing models of simple logic programs as well as models of simple CNF theories. We then randomly generate logic programs as well as CNF theories from these models. Our experimental results show that computing answer sets of random logic programs, as well as models of random CNF theories with carefully chosen parameters, is hard for existing solvers. We generate random 2-literal logic programs, and our experiments show that it is hard for ASP solvers to obtain answer sets of purely negative and constraint-free programs, indicating the importance of these programs in the development of ASP solvers. An easy-hard-easy pattern emerges as we compute the average number of choice points generated by ASP solvers on randomly generated 2-literal programs with an increasing number of rules. We provide an explanation for the emergence of this pattern in these programs. We also theoretically study the probability of existence of an answer set for sparse and dense 2-literal programs. We consider simple classes of mixed Horn formulas with purely positive 2-literal clauses and purely negated Horn clauses. First we consider a class of mixed Horn formulas wherein each formula has m 2-literal clauses and k-literal negated Horn clauses. We show that formulas generated from the phase transition region of this class are hard for complete SAT solvers. The second class of mixed Horn formulas we consider is obtained from the completion of a certain class of random logic programs. We show the appearance of an easy-hard-easy pattern as we generate formulas from this class with increasing numbers of clauses, and that the formulas generated in the hard region can be used as benchmarks for testing incomplete SAT solvers.
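The benchmarks described here are random CNF theories mixing purely positive 2-literal clauses with purely negated Horn clauses. The following sketch shows one plausible way to generate such mixed Horn formulas and emit them in DIMACS form; the parameter names and the uniform sampling scheme are assumptions, not the dissertation's exact models.

```python
# A minimal sketch of the kind of random generation the abstract describes:
# mixed Horn formulas built from purely positive 2-literal clauses plus
# purely negated k-literal Horn clauses. The parameter names (n, m2, mk, k)
# and the uniform sampling scheme are assumptions for illustration.
import random

def random_mixed_horn(n, m2, mk, k, seed=0):
    """n variables; m2 positive 2-literal clauses; mk negative k-literal clauses."""
    rng = random.Random(seed)
    clauses = []
    for _ in range(m2):
        a, b = rng.sample(range(1, n + 1), 2)
        clauses.append((a, b))                      # x_a or x_b
    for _ in range(mk):
        vs = rng.sample(range(1, n + 1), k)
        clauses.append(tuple(-v for v in vs))       # not x_1 or ... or not x_k
    return clauses

def to_dimacs(n, clauses):
    lines = [f"p cnf {n} {len(clauses)}"]
    lines += [" ".join(map(str, c)) + " 0" for c in clauses]
    return "\n".join(lines)

if __name__ == "__main__":
    cnf = random_mixed_horn(n=20, m2=30, mk=40, k=3)
    print(to_dimacs(20, cnf))   # feed to any DIMACS-compatible SAT solver
```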
APA, Harvard, Vancouver, ISO, and other styles
11

Behnke, Gregor [Verfasser]. "Hierarchical planning through propositional logic : highly efficient, versatile, and flexible / Gregor Behnke." Ulm : Universität Ulm, 2019. http://d-nb.info/1202076521/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Doyle, Edward J. "Two categories of refutation decision procedures for classical and intuitionistic propositional logic." Connect to this title online, 2008. http://etd.lib.clemson.edu/documents/1239896403/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Pettersson, Karl. "The Logical Structure of the Moral Concepts : An Essay in Propositional Deontic Logic." Doctoral thesis, Uppsala universitet, Avdelningen för praktisk filosofi, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-131581.

Full text
Abstract:
In this thesis, the main focus is on deontic logic as a tool for formal representation of moral reasoning in natural language. The simple standard system of deontic logic (SDL), i.e. the minimal Kripkean modal logic extended with the deontic axiom, stating that necessity (interpreted as obligation) implies possibility (interpreted as permission), has often been considered inadequate for this aim, due to different problems, e.g. the so-called deontic paradoxes. A general survey of deontic logic and the problems with SDL is made in chapter 1. In chapter 2, a system denoted Classical Deontic-Modal logic (CDM1) is defined. In this system, there is a primary obligation operator indexed to sets of possible worlds, and a secondary requirement operator, defined in terms of strictly necessary conditions for fulfilling an obligation. This secondary operator has most of the properties of the necessity operator in SDL. In chapters 3 and 4, it is argued that CDM1 is able to handle the SDL problems presented in chapter 1 in an adequate way, and the treatment of these problems in CDM1 is also compared with their treatment in some other well-known deontic systems. In chapter 5, it is argued that even though the problems related to quantification in modal contexts are relevant to deontic logic, these issues are not specific to deontic logic. In chapter 6, the relations between some controversial features of moral reasoning, such as moral dilemmas and “non-standard” deontic categories like supererogation, and deontic logic are discussed. It is shown how CDM1 can be modified in order to accommodate these features.
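For reference, SDL evaluates the obligation operator as necessity over a Kripke model with a serial accessibility relation, so that the deontic axiom O(p) -> P(p) holds. The sketch below is a hedged illustration of that standard semantics, not of the thesis's CDM1 system; the worlds, accessibility relation and valuation are assumptions.

```python
# A minimal sketch of Standard Deontic Logic (SDL) semantics: O(phi) holds at a
# world iff phi holds at every accessible (ideal) world, P(phi) iff at some.
# The example worlds, accessibility relation and valuation are assumptions.
WORLDS = {"w0", "w1", "w2"}
ACCESS = {"w0": {"w1", "w2"}, "w1": {"w1"}, "w2": {"w1"}}   # serial relation
VAL = {"p": {"w1"}, "q": {"w1", "w2"}}                       # where atoms hold

def holds(world, phi):
    kind = phi[0]
    if kind == "atom":
        return world in VAL[phi[1]]
    if kind == "not":
        return not holds(world, phi[1])
    if kind == "and":
        return holds(world, phi[1]) and holds(world, phi[2])
    if kind == "O":     # obligation: true at all accessible worlds
        return all(holds(v, phi[1]) for v in ACCESS[world])
    if kind == "P":     # permission: true at some accessible world
        return any(holds(v, phi[1]) for v in ACCESS[world])
    raise ValueError(kind)

if __name__ == "__main__":
    p = ("atom", "p")
    # The deontic axiom O(p) -> P(p) holds everywhere when ACCESS is serial.
    print(all((not holds(w, ("O", p))) or holds(w, ("P", p)) for w in WORLDS))
```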
APA, Harvard, Vancouver, ISO, and other styles
14

Bianchi, M. "On Some Axiomatic Extensions of the Monoidal t-norm Based Logic MTL: An Analysis in the Propositional and in the First-Order Case." Doctoral thesis, Università degli Studi di Milano, 2010. http://hdl.handle.net/2434/150078.

Full text
Abstract:
The scientific area this thesis belongs to is many-valued logics: in particular, the logic MTL and some of its extensions, in the propositional and in the first-order case (see [8],[9],[6],[7]). The thesis is divided in two parts: in the first one the necessary background about these logics, with some minor new results, is presented. The second part is devoted to more specific topics: there are five chapters, each one about a different problem. In chapter 6 a temporal semantics for Basic Logic BL is presented. In chapter 7 we move to first-order logics, by studying the supersoundness property: we have improved some previous works about this theme, by expanding the analysis to many extensions of the first-order version of MTL. Chapter 8 is dedicated to four different families of n-contractive axiomatic extensions of BL, analyzed in the propositional and in the first-order case: completeness, computational and arithmetical complexity, amalgamation and interpolation properties are studied. Finally, chapters 9 and 10 are about Nilpotent Minimum logic (NM, see [8]): in chapter 9 the sets of tautologies of some NM-chains (subalgebras of [0,1]_NM) are studied and compared, and the problems of axiomatization and undecidability are tackled. Chapter 10, instead, concerns some logical and algebraic properties of (propositional) Nilpotent Minimum logic. The results (or an extended version of them) of these last chapters have also been presented in papers [1, 4, 5, 2, 3]. ---------------------------------References--------------------------------------------- [1] S. Aguzzoli, M. Bianchi, and V. Marra. A temporal semantics for Basic Logic. Studia Logica, 92(2), 147-162, 2009. doi:10.1007/s11225-009-9192-3. [2] M. Bianchi. First-order Nilpotent Minimum Logics: first steps. Submitted for publication, 2010. [3] M. Bianchi. On some logical and algebraic properties of Nilpotent Minimum logic and its relation with Gödel logic. Submitted for publication, 2010. [4] M. Bianchi and F. Montagna. Supersound many-valued logics and Dedekind-MacNeille completions. Arch. Math. Log., 48(8), 719-736, 2009. doi:10.1007/s00153-009-0145-3. [5] M. Bianchi and F. Montagna. n-contractive BL-logics. Arch. Math. Log., 2010. doi:10.1007/s00153-010-0213-8. [6] P. Cintula, F. Esteva, J. Gispert, L. Godo, F. Montagna, and C. Noguera. Distinguished algebraic semantics for t-norm based fuzzy logics: methods and algebraic equivalencies. Ann. Pure Appl. Log., 160(1), 53-81, 2009. doi:10.1016/j.apal.2009.01.012. [7] P. Cintula and P. Hájek. Triangular norm predicate fuzzy logics. Fuzzy Sets Syst., 161(3), 311-346, 2010. doi:10.1016/j.fss.2009.09.006. [8] F. Esteva and L. Godo. Monoidal t-norm based logic: Towards a logic for left-continuous t-norms. Fuzzy Sets Syst., 124(3), 271-288, 2001. doi:10.1016/S0165-0114(01)00098-7. [9] P. Hájek. Metamathematics of Fuzzy Logic, volume 4 of Trends in Logic. Kluwer Academic Publishers, paperback edition, 1998. ISBN:9781402003707.
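Chapters 9 and 10 concern Nilpotent Minimum logic, whose standard semantics on [0,1] is given by the NM t-norm, its residuum and an involutive negation. The sketch below evaluates these textbook truth functions numerically; the sample truth values are arbitrary and nothing here is taken from the thesis itself.

```python
# A small sketch of the standard [0,1]-valued semantics of Nilpotent Minimum
# logic: the NM t-norm, its residuum, and the involutive negation 1 - x.
# These are the textbook definitions; the sample truth values are arbitrary.
def nm_tnorm(x, y):
    return min(x, y) if x + y > 1 else 0.0

def nm_implies(x, y):
    return 1.0 if x <= y else max(1.0 - x, y)

def nm_neg(x):
    return 1.0 - x

if __name__ == "__main__":
    for x, y in [(0.3, 0.6), (0.7, 0.6), (0.9, 0.2)]:
        print(x, y, nm_tnorm(x, y), nm_implies(x, y), nm_neg(x))
    # Prelinearity: (phi -> psi) v (psi -> phi) always evaluates to 1
    print(max(nm_implies(0.4, 0.8), nm_implies(0.8, 0.4)))  # 1.0
```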
APA, Harvard, Vancouver, ISO, and other styles
15

Al-Saedi, Mohammad Saleh Balasim. "Extensions of tractable classes for propositional satisfiability." Thesis, Artois, 2016. http://www.theses.fr/2016ARTO0405/document.

Full text
Abstract:
Knowledge representation and reasoning is a key issue in computer science and, more particularly, in artificial intelligence. In this respect, propositional logic is a representation formalism that offers a good trade-off between the opposing criteria of computational efficiency and expressiveness. However, unless P = NP, deduction in propositional logic is not polynomial in the worst case. In this thesis we therefore propose new extensions of tractable classes of the propositional satisfiability problem (SAT). Tractable fragments of SAT play a role in the implementation of the most efficient current SAT solvers, and many of these tractable classes use the linear-time unit propagation (UP) inference rule. We attempt to extend two of the currently known polynomial fragments of SAT using UP, in such a way that the fragments can still be recognized and solved in polynomial time. A first result focuses on Quad fragments: we establish some properties of Quad fragments, extend them and exhibit promising variants. The extension is obtained by allowing the fixed total orderings of clauses used by Quad to be accompanied by specific additional separate orderings of maximal sub-clauses. The resulting fragments extend Quad without degrading its worst-case complexity. We also investigate how bounded resolution and redundancy through unit propagation can play a role in this respect. The second contribution on tractable subclasses of SAT concerns extensions of a well-known polynomial fragment due to Tovey, so that they also include instances that can be simplified using UP. We then compare two existing polynomial fragments based on UP, namely Quad and UP-Horn, and answer an open question about the connections between these two classes: we show that UP-Horn and some other UP-based variants are strict subclasses of S Quad, where S Quad is the union of all Quad classes obtained by investigating all possible orderings of clauses.
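Unit propagation (UP), the linear-time rule used here to extend the Quad and Tovey fragments, is easy to state concretely. The sketch below is a generic UP closure over integer-literal clauses, given as a hedged illustration; the example formula is an assumption.

```python
# A minimal sketch of unit propagation (UP), the linear-time inference rule the
# abstract uses to extend tractable SAT classes. Clauses are lists of integer
# literals; the example formula is an assumption for illustration.
def unit_propagate(clauses):
    """Return (assignment, conflict) after closing the clause set under UP."""
    assignment = {}
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                      # clause already satisfied
            unassigned = [l for l in clause if abs(l) not in assignment]
            if not unassigned:
                return assignment, True       # conflict: clause falsified
            if len(unassigned) == 1:          # unit clause forces its literal
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return assignment, False

if __name__ == "__main__":
    # (x1) & (~x1 | x2) & (~x2 | x3): UP derives x1 = x2 = x3 = True
    cnf = [[1], [-1, 2], [-2, 3]]
    print(unit_propagate(cnf))   # ({1: True, 2: True, 3: True}, False)
```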
APA, Harvard, Vancouver, ISO, and other styles
16

Solares Rojas, Alejandro Javier. "Tractable Depth-Bounded Approximations to Some Propositional Logics. Towards More Realistic Models of Logical Agents." Doctoral thesis, Università degli Studi di Milano, 2022. http://hdl.handle.net/2434/931483.

Full text
Abstract:
The depth-bounded approach seeks to provide realistic models of reasoners. Recognizing that most useful logics are idealizations, in that they are either undecidable or likely to be intractable, the approach accounts for how they can be approximated in practice by resource-bounded agents. The approach has been applied to Classical Propositional Logic (CPL), yielding a hierarchy of tractable depth-bounded approximations to that logic, which in turn has been based on a KE/KI system. This thesis shows that the approach can be naturally extended to useful non-classical logics such as First-Degree Entailment (FDE), the Logic of Paradox (LP), Strong Kleene Logic (K3) and Intuitionistic Propositional Logic (IPL). To do this, we introduce a KE/KI-style system for each of those logics such that it is formulated via signed formulae, consists of linear operational rules and branching structural rule(s), can be used both as a direct-proof and as a refutation method, and is interesting independently of the approach in that it has an exponential speed-up over its tableau-system counterpart. The latter holds because we introduce a new class of examples which we prove to be hard for all tableau systems sharing the ∨/∧ rules with the classical one, but easy for the analogous KE-style systems. We then focus on showing that each of our KE/KI-style systems naturally yields a hierarchy of tractable depth-bounded approximations to the respective logic, in terms of the maximum number of allowed nested applications of the branching rule(s). The rule(s) express(es) a generalized rule of bivalence, is (are) essentially cut rule(s) and govern(s) the manipulation of virtual information, which is information that an agent does not hold but temporarily assumes as if she held it. Intuitively, the more virtual information needs to be invoked via the branching rule(s), the harder the inference is for the agent. So the nested application of the branching rule(s) provides a sensible measure of inferential depth. We also show that each hierarchy approximating FDE, LP, and K3 admits of a 5-valued non-deterministic semantics; whereas, paving the way for a semantical characterization of the hierarchy approximating IPL, we provide a 3-valued non-deterministic semantics for the full logic that fixes the meaning of the connectives without appealing to "structural" conditions. Moreover, we show a super-polynomial lower bound for the strongest possible version of clausal tableaux on the well-known class of "truly fat" expressions (which are easy for KE), settling a problem left open in the literature. Further, we investigate a hierarchy of tractable depth-bounded approximations to CPL based only on KE. Finally, we propose a refinement of the p-simulation relation which is adequate for establishing positive results about the superiority of one proof system over another with respect to proof search.
APA, Harvard, Vancouver, ISO, and other styles
17

Döcker, Janosch Otto [Verfasser]. "Placing problems from phylogenetics and (quantified) propositional logic in the polynomial hierarchy / Janosch Otto Döcker." Tübingen : Universitätsbibliothek Tübingen, 2021. http://nbn-resolving.de/urn:nbn:de:bsz:21-dspace-1187027.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Nicoladelli, José Martim. "ASA-CALCPRO: uma ferramenta de cálculo proposional e sua utilização no ensino." Centro Federal de Educação Tecnológica do Paraná, 2005. http://repositorio.utfpr.edu.br/jspui/handle/1/108.

Full text
Abstract:
The appropriate use of computational environments can increase the quality and comfort of the teaching-learning process in some academic disciplines. A review of the literature revealed the lack of existing tools for teaching and learning the propositional calculus that conform to the criteria established for a student support environment (ambiente de suporte ao aluno, ASA). As a consequence, this work introduces the concept of, and the basic requirements for, an ASA, then designs and implements an ASA tool for teaching and learning the propositional calculus, together with an optional teaching plan, and puts both at the disposal of the academic community. The thesis also presents case studies of presentations and uses of the tool at various stages of its development, as well as a description of each module and of the methods and rules made available by the tool. It is hoped that the ASA-CalcPro tool and the suggested teaching plan will make a positive social contribution and stimulate the teaching and learning of the propositional calculus.
APA, Harvard, Vancouver, ISO, and other styles
19

Slater, Andrew. "Investigations into Satisfiability Search." The Australian National University. Research School of Information Sciences and Engineering, 2003. http://thesis.anu.edu.au./public/adt-ANU20040310.103258.

Full text
Abstract:
In this dissertation we investigate theoretical aspects of some practical approaches used in solving and understanding search problems. We concentrate on the Satisfiability problem, which is a strong representative of search problem domains. The work develops general theoretical foundations for investigating some practical aspects of satisfiability search, resulting in a better understanding of the fundamental mechanics of search algorithm construction and behaviour. A theory of choice or branching heuristics is presented, accompanied by results showing a correspondence of both parameterisations and performance when the method is compared to previous empirically motivated branching techniques. The logical foundations of the backtracking mechanism are explored alongside formulations for reasoning in relevant logics, which results in the development of a malleable backtracking mechanism that subsumes other intelligent backtracking proof construction techniques and allows the incorporation of proof rearrangement strategies. Moreover, empirical tests show that relevant backtracking outperforms all other forms of intelligent backtracking search tree construction methods. An investigation into modelling and generating real-world problem instances justifies a modularised problem model proposal, which is used experimentally to highlight the practicability of search algorithms for the proposed model and related domains.
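As background for the branching-heuristic and backtracking themes, the sketch below shows a plain DPLL procedure in which a choice heuristic picks the branching literal and failure triggers backtracking to the opposite branch. It is a generic textbook procedure, not the thesis's relevant-backtracking method; the two-sided occurrence-count heuristic and the example formulas are assumptions.

```python
# A minimal DPLL sketch showing where a branching (choice) heuristic and
# chronological backtracking fit in satisfiability search. The heuristic and
# example formulas below are illustrative assumptions.
from collections import Counter

def simplify(clauses, lit):
    out = []
    for c in clauses:
        if lit in c:
            continue                         # clause satisfied
        reduced = [l for l in c if l != -lit]
        if not reduced:
            return None                      # empty clause: conflict
        out.append(reduced)
    return out

def choose_literal(clauses):
    counts = Counter(l for c in clauses for l in c)
    # branch on the variable with the most occurrences (both polarities)
    var = max({abs(l) for l in counts}, key=lambda v: counts[v] + counts[-v])
    return var if counts[var] >= counts[-var] else -var

def dpll(clauses):
    units = [c[0] for c in clauses if len(c) == 1]
    while units:                             # unit propagation
        clauses = simplify(clauses, units.pop())
        if clauses is None:
            return False
        units = [c[0] for c in clauses if len(c) == 1]
    if not clauses:
        return True
    lit = choose_literal(clauses)
    for branch in (lit, -lit):               # backtrack to the other branch on failure
        reduced = simplify(clauses, branch)
        if reduced is not None and dpll(reduced):
            return True
    return False

if __name__ == "__main__":
    print(dpll([[1, 2], [-1, 2], [1, -2], [-1, -2]]))   # False (unsatisfiable)
    print(dpll([[1, 2], [-1, 3], [-2, 3]]))             # True
```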
APA, Harvard, Vancouver, ISO, and other styles
20

Alves, Carla Maria Carneiro. "Lógica e álgebras booleanas." Master's thesis, Universidade Lusíada, 2002. http://hdl.handle.net/10198/4537.

Full text
Abstract:
The main objective of this work was to investigate and deepen our knowledge of classical Boolean logic and its algebras. To that end, the work is organised into the following chapters: chapter I – Propositional Logic; chapter II – Boolean Algebras; chapter III – conclusions and final considerations. The first chapter is about Propositional Logic, essentially from the semantic point of view, with a brief review of the material considered most relevant to the investigation. The main topics developed in this chapter are syntax, semantics, normal forms, complete sets of connectives, the interpolation lemma, and the compactness theorem together with some of its applications. The second chapter is about Boolean Algebras, with particular emphasis on the concepts of algebra and topology related to Boolean algebras; a preliminary review of some relevant background is also included. Besides some definitions and properties of Boolean algebras, the following subtopics are considered: atoms, homomorphisms of Boolean algebras, ideals and filters, and Stone's space. The main topics developed are: part of the necessary algebra and topology, some definitions of Boolean algebras, atoms in Boolean algebras, homomorphisms, isomorphisms, subalgebras, ideals, filters, and Stone's theorem. The third chapter is a synthesis of the work and a reflection on how the investigation was carried out, together with suggestions and possible implications for future investigations.
APA, Harvard, Vancouver, ISO, and other styles
21

Abdulahhad, Karam. "Information retrieval modeling by logic and lattice : application to conceptual information retrieval." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENM014/document.

Full text
Abstract:
This thesis is situated in the context of logic-based Information Retrieval (IR) models. The work is mainly motivated by the inadequate term-independence assumption, which is well accepted in IR although terms are normally related, and by the inferential nature of the relevance-judgment process. Since formal logics are well adapted for knowledge representation, and hence for representing relations between terms, and since formal logics are also powerful systems for inference, logic-based IR forms a candidate line of work for building effective IR systems. However, a study of current logic-based IR models shows that these models generally have some shortcomings. First, logic-based IR models normally propose complex, hard-to-obtain representations of documents and queries. Second, the retrieval decision d->q, which represents the matching between a document d and a query q, can be difficult to verify or check. Finally, the uncertainty measure U(d->q) is either ad hoc or hard to implement. In this thesis we propose a new logic-based IR model that overcomes most of these limits. We use Propositional Logic (PL) as the underlying logical framework and represent documents and queries as logical sentences written in Disjunctive Normal Form. We also argue that the retrieval decision d->q can be replaced by the validity of material implication. We then exploit the relation between PL and lattice theory to check whether d->q is valid. We first propose an intermediate representation of logical sentences, where they become nodes in a lattice whose partial-order relation is equivalent to the validity of material implication. Accordingly, we transform the checking of the validity of d->q, which is a computationally intensive task, into a series of simple set-inclusion checks. In order to measure the uncertainty of the retrieval decision U(d->q), we use the degree-of-inclusion function Z, which is capable of quantifying partial-order relations defined on lattices. Finally, our model works efficiently on any logical sentence without restrictions and is applicable to large-scale data. The model also yields some theoretical conclusions, including formalizing and showing the adequacy of van Rijsbergen's assumption about estimating the logical uncertainty U(d->q) through the conditional probability P(q|d), redefining the two notions of Exhaustivity and Specificity, and the possibility of reproducing most classical IR models as instances of our model. We build three operational instances of our model: one to study the importance of Exhaustivity and Specificity, and two others to show the inadequacy of the term-independence assumption. Our experimental results show a worthwhile gain in performance when integrating Exhaustivity and Specificity into one concrete IR model. However, the results of using semantic relations between terms were not sufficient to draw clear conclusions; on the contrary, experiments exploiting structural relations between terms were promising. The work presented in this thesis can be extended either with more experiments, especially on the use of relations, or with a more in-depth theoretical study, especially of the properties of the Z function.
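The central computational move here is replacing the retrieval decision d -> q by set-inclusion checks between DNF sentences. The sketch below checks a simple sufficient condition for that implication (every conjunctive clause of the document contains some clause of the query) and scores a naive inclusion degree as a stand-in for the Z function; the representation and the sample document and query are assumptions, not the thesis's model.

```python
# A hedged sketch of reducing the retrieval decision d -> q to set-inclusion
# checks. Documents and queries are DNF sentences: lists of conjunctive
# clauses, each clause a frozenset of terms. The check below is a simple
# *sufficient* condition for the validity of d -> q; the sample document and
# query are illustrative.
def implies(d_dnf, q_dnf):
    """True if every conjunctive clause of d contains some clause of q."""
    return all(any(q_clause <= d_clause for q_clause in q_dnf)
               for d_clause in d_dnf)

def degree_of_inclusion(d_dnf, q_dnf):
    """A naive uncertainty score U(d -> q): fraction of d's clauses that
    syntactically entail q (a stand-in for the thesis's Z function)."""
    if not d_dnf:
        return 0.0
    sat = sum(any(q_clause <= d_clause for q_clause in q_dnf)
              for d_clause in d_dnf)
    return sat / len(d_dnf)

if __name__ == "__main__":
    # document: (logic & retrieval & lattice) or (logic & probability)
    d = [frozenset({"logic", "retrieval", "lattice"}),
         frozenset({"logic", "probability"})]
    q1 = [frozenset({"logic"})]                        # query: logic
    q2 = [frozenset({"lattice"})]                      # query: lattice
    print(implies(d, q1), degree_of_inclusion(d, q1))  # True 1.0
    print(implies(d, q2), degree_of_inclusion(d, q2))  # False 0.5
```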
APA, Harvard, Vancouver, ISO, and other styles
22

Nour, Abir. "Etude de systèmes logiques extensions de la logique intuitionniste." Université Joseph Fourier (Grenoble), 1997. http://www.theses.fr/1997GRE10128.

Full text
Abstract:
To model the reasoning of an ordered set T of intelligent agents, H. Rasiowa introduced logical systems called approximation logics. In these systems a set of constants, although difficult to justify in applications, plays a fundamental role. In our work we consider logical systems, called L#T#F, without this type of constant, but restricted to the case where T is a finite ordered set. We prove a weak deduction theorem and present a list of properties that are used later on. We also introduce an algebraic semantics using Heyting algebras with operators. To prove the completeness of L#T#F with respect to the algebraic semantics, we use the method of H. Rasiowa and R. Sikorski for first-order logic. In the propositional case, a corollary allows us to state that the question of whether a propositional formula is valid is effectively decidable. We also study some relations between the logics L#T#F and intuitionistic and classical logic. In addition, the tableau method is considered, since it is well known in the literature on non-classical logics. Finally, we introduce a Kripke-style semantics in which the set of possible worlds is enriched with a family of functions indexed by the finite set T and satisfying certain conditions. This kind of semantics allows us to derive several results: we construct a particular finite Kripke-style model that characterises the propositional calculus L#T#F, and we introduce a relational semantics following the method of E. Orlowska, which has the great advantage of interpreting the propositional logic L#T#F using only one type of object: relations. We also address the complexity of the propositional calculus L#T#F.
APA, Harvard, Vancouver, ISO, and other styles
23

Curiel, Diaz Arturo Tlacaélel. "Using formal logic to represent sign language phonetics in semi-automatic annotation tasks." Thesis, Toulouse 3, 2015. http://www.theses.fr/2015TOU30308/document.

Full text
Abstract:
This thesis presents a formal framework for the representation of Signed Languages (SLs), the languages of Deaf communities, in semi-automatic recognition tasks. SLs are complex visuo-gestural communication systems; by using corporal gestures, signers achieve the same level of expressivity held by sound-based languages like English or French. However, unlike these, SL morphemes correspond to complex sequences of highly specific body postures, interleaved with postural changes: during signing, signers use several parts of their body simultaneously in order to combinatorially build phonemes. This situation, paired with an extensive use of three-dimensional space, makes them difficult to represent with tools already existing in Natural Language Processing (NLP) of vocal languages. For this reason, the current work presents the development of a formal representation framework intended to transform SL video repositories (corpora) into an intermediate representation layer where automatic recognition algorithms can work under better conditions. The main idea is that corpora can be described with a specialized Labeled Transition System (LTS), which can then be annotated with logic formulae for its study. A multi-modal logic was chosen as the basis of the formal language: Propositional Dynamic Logic (PDL). This logic was originally created to specify and prove properties of computer programs. In particular, PDL uses the modal operators [a] and <a> to denote necessity and possibility, respectively. For SLs, a particular variant based on the original formalism was developed: the PDL for Sign Language (PDLSL). With PDLSL, body articulators (like the hands or head) are interpreted as independent agents; each articulator has its own set of valid actions and propositions, and executes them without influence from the others. The simultaneous execution of different actions by several articulators yields distinct situations, which can be searched over an LTS with formulae, using the semantic rules of the logic. Together, the use of PDLSL and the proposed specialized data structures could help curb some of the current problems in SL study, notably the heterogeneity of corpora and the lack of automatic annotation aids. In the same vein, this may not only increase the size of the available datasets, but even extend previous results to new corpora, since the framework inserts an intermediate representation layer which can serve to model any corpus, regardless of its technical limitations. With this, annotation is possible by defining with formulae the characteristics to annotate; afterwards, a formal verification algorithm can find those features in corpora, as long as they are represented as consistent LTSs. Finally, the development of the formal framework led to the creation of a semi-automatic annotator based on the presented theoretical principles. Broadly, the system receives an untreated corpus video, converts it automatically into a valid LTS (by way of some predefined rules), and then verifies human-created PDLSL formulae over the LTS. The final product is an automatically generated sub-lexical annotation, which can later be corrected by human annotators for use in other areas such as linguistics.
APA, Harvard, Vancouver, ISO, and other styles
24

Falcão, Pedro Alonso Amaral. "Aspectos da teoria de funções modais." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/8/8133/tde-11042013-104549/.

Full text
Abstract:
We present some aspects of the theory of modal functions, which is the modal correlate of the theory of truth-functions. While the formulas of classical propositional logic express truth-functions, the formulas of modal propositional logic (S5) express modal functions. We generalize some theorems of the theory of truth-functions to the modal case; in particular, we show the functional completeness of some sets of modal functions and define a (new) notion of truth-functional reduct of modal functions, as well as the composition of modal functions in terms of such reducts.
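In S5 the accessibility relation is universal, so Box(phi) holds at a world exactly when phi holds at every world of the model; a formula therefore expresses a "modal function" of how the atoms are distributed over the worlds rather than a truth function of their values at a single world. The sketch below tabulates one such modal function; the three-world model and the formula are illustrative assumptions.

```python
# A small sketch of the idea that S5 formulas express "modal functions":
# with universal accessibility, Box(phi) holds at a world iff phi holds at
# every world of the model, and Dia(phi) iff it holds at some world.
# The toy model below (three worlds, one atom) is an illustrative assumption.
from itertools import product

def holds(world, phi, worlds, val):
    kind = phi[0]
    if kind == "atom":
        return val[world][phi[1]]
    if kind == "not":
        return not holds(world, phi[1], worlds, val)
    if kind == "and":
        return holds(world, phi[1], worlds, val) and holds(world, phi[2], worlds, val)
    if kind == "box":
        return all(holds(w, phi[1], worlds, val) for w in worlds)
    if kind == "dia":
        return any(holds(w, phi[1], worlds, val) for w in worlds)
    raise ValueError(kind)

if __name__ == "__main__":
    worlds = [0, 1, 2]
    p = ("atom", "p")
    # "Dia p and not Box p": p is possible but not necessary.
    phi = ("and", ("dia", p), ("not", ("box", p)))
    # Tabulate the modal function expressed by phi: its value at world 0
    # for every distribution of p over the three worlds.
    for bits in product([False, True], repeat=3):
        val = {w: {"p": bits[w]} for w in worlds}
        print(bits, holds(0, phi, worlds, val))
```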
APA, Harvard, Vancouver, ISO, and other styles
25

Arora, Rajat. "Enhancing SAT-based Formal Verification Methods using Global Learning." Thesis, Virginia Tech, 2004. http://hdl.handle.net/10919/32987.

Full text
Abstract:
With the advances in VLSI and System-On-Chip (SOC) technology, the complexity of hardware systems has increased manifold. Today, 70% of the design cost is spent in verifying these intricate systems. The two most widely used formal methods for design verification are Equivalence Checking and Model Checking. Equivalence Checking requires that the implementation circuit be exactly equivalent to the specification circuit (golden model). In other words, for each possible input pattern, the implementation circuit should yield the same outputs as the specification circuit. Model Checking, on the other hand, checks whether the design holds certain properties, which in turn are indispensable for the proper functionality of the design. The complexities of both Equivalence Checking and Model Checking are exponential in the circuit size. In this thesis, we first propose a novel technique to improve SAT-based Combinational Equivalence Checking (CEC) and Bounded Model Checking (BMC). The idea is to perform a low-cost preprocessing that will statically induce global signal relationships into the original CNF formula of the circuit under verification and hence reduce the complexity of the SAT instance. This efficient and effective preprocessing quickly builds up the implication graph for the circuit under verification, yielding a large set of logic implications composed of direct, indirect and extended backward implications. These two-node implications (spanning time-frame boundaries) are converted into two-literal clauses, and added to the original CNF database. The added clauses constrain the search space of the SAT-solver engine, and provide correlation among the different variables, which enhances the Boolean Constraint Propagation (BCP). Experimental results on large and difficult ISCAS'85, ISCAS'89 (full scan) and ITC'99 (full scan) CEC instances and ISCAS'89 BMC instances show that our approach is independent of the state-of-the-art SAT-solver used, and that the added clauses help to achieve more than an order of magnitude speedup over the conventional approach. Also, comparison with Hyper-Resolution [Bacchus 03] suggests that our technique is much more powerful, yielding non-trivial clauses that significantly simplify the SAT instance complexity. Secondly, we propose a novel global learning technique that helps to identify highly non-trivial relationships among signals in the circuit netlist, thereby boosting the power of the existing implication engine. We call this new class of implications 'extended forward implications', and show their effectiveness through the additional untestable faults they help to identify. Thirdly, we propose a suite of lemmas and theorems to formalize global learning. We show through implementation that these theorems help to significantly simplify a generic CNF formula (from Formal Verification, Artificial Intelligence, etc.) by identifying the necessary assignments, equivalent signals, complementary signals and other non-trivial implication relationships among its variables. We further illustrate through experimental results that the CNF formula simplification obtained using our tool outshines the simplification obtained using other preprocessors.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
26

Cyriac, Aiswarya. "Verification of communicating recursive programs via split-width." Thesis, Cachan, Ecole normale supérieure, 2014. http://www.theses.fr/2014DENS0004/document.

Full text
Abstract:
Cette thèse développe des techniques à base d'automates pour la vérification formelle de systèmes physiquement distribués communiquant via des canaux fiables de tailles non bornées. Chaque machine peut exécuter localement plusieurs programmes récursifs (multi-threading). Un programme récursif peut également utiliser pour ses calculs locaux des structures de données non bornées, comme des files ou des piles. Ces systèmes, utilisés en pratique, sont si puissants que tous leurs problèmes de vérification deviennent indécidables. Nous introduisons et étudions un nouveau paramètre, appelé largeur de coupe (split-width), pour l'analyse de ces systèmes. Cette largeur de coupe est définie comme le nombre minimum de scissions nécessaires pour partitioner le graphe d'une exécution en parties sur lesquelles on pourra raisonner de manière indépendante. L'analyse est ainsi réalisée avec une approche diviser pour régner. Lorsqu'on se restreint à la classe des comportements ayant une largeur de coupe bornée par une constante, on obtient des procédures de décision optimales pour divers problèmes de vérification sur ces systèmes tels que l'accessibilité, l'inclusion, etc. ainsi que pour la satisfaisabilité et le model checking par rapport à divers formalismes comme la logique monadique du second ordre, la logique dynamique propositionnelle et des logiques temporelles. On montre aussi que les comportements d'un système ont une largeur de coupe bornée si et seulement si ils ont une largeur de clique bornée. Ainsi, grâce aux résultats de Courcelle sur les graphes de degré uniformément borné, la largeur de coupe est non seulement suffisante, mais aussi nécessaire pour obtenir la décidabilité du problème de satisfaisabilité d'une formule de la logique monadique du second ordre. Nous étudions ensuite l'existence de contrôleurs distribués génériques pour nos systèmes distribués. Nous proposons plusieurs contrôleurs, certains ayant un nombre fini d'états et d'autres étant déterministes, qui assurent que les comportements du système sont des graphes ayant une largeur de coupe bornée. Un système ainsi contrôlé de manière distribuée hérite des procédures de décision optimales pour les différents problèmes de vérification lorsque la largeur de coupe est bornée. Cette classe décidable de système généralise plusieurs sous-classes décidables étudiées précédemment
This thesis investigates automata-theoretic techniques for the verification of physically distributed machines communicating via unbounded reliable channels. Each of these machines may run several recursive programs (multi-threading). A recursive program may also use several unbounded stack and queue data-structures for its local-computation needs. Such real-world systems are so powerful that all verification problems become undecidable. We introduce and study a new parameter called split-width for the under-approximate analysis of such systems. Split-width is the minimum number of splits required in the behaviour graphs to obtain disjoint parts which can be reasoned about independently. Thus it provides a divide-and-conquer approach for their analysis. With the parameter split-width, we obtain optimal decision procedures for various verification problems on these systems like reachability, inclusion, etc. and also for satisfiability and model checking against various logical formalisms such as monadic second-order logic, propositional dynamic logic and temporal logics. It is shown that behaviours of a system have bounded split-width if and only if they have bounded clique-width. Thus, by Courcelle's results on uniformly bounded-degree graphs, split-width is not only sufficient but also necessary to get decidability for MSO satisfiability checking. We then study the feasibility of distributed controllers for our generic distributed systems. We propose several controllers, some finite state and some deterministic, which ensure that the behaviours of the system have bounded split-width. Such a distributedly controlled system yields decidability for the various verification problems by inheriting the optimal decision procedures for split-width. These also extend or complement many known decidable subclasses of systems studied previously
APA, Harvard, Vancouver, ISO, and other styles
27

Johannesson, Eric. "Analyticity, Necessity and Belief : Aspects of two-dimensional semantics." Doctoral thesis, Stockholms universitet, Filosofiska institutionen, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-141565.

Full text
Abstract:
A glass couldn't contain water unless it contained H2O-molecules. Likewise, a man couldn't be a bachelor unless he was unmarried. Now, the latter is what we would call a conceptual or analytical truth. It's also what we would call a priori. But it's hardly a conceptual or analytical truth that if a glass contains water, then it contains H2O-molecules. Neither is it a priori. The fact that water is composed of H2O-molecules was an empirical discovery made in the eighteenth century. The fact that all bachelors are unmarried was not. But neither is a logical truth, so how do we explain the difference? Two-dimensional semantics is a framework that promises to shed light on these issues. The main purpose of this thesis is to understand and evaluate this framework in relation to various alternatives, to see whether some version of it can be defended. I argue that it fares better than the alternatives. However, much criticism of two-dimensionalism has focused on its alleged inability to provide a proper semantics for certain epistemic operators, in particular the belief operator and the a priori operator. In response to this criticism, a two-dimensional semantics for belief ascriptions is developed using structured propositions. In connection with this, a number of other issues in the semantics of belief ascriptions are addressed, concerning indexicals, beliefs de se, beliefs de re, and the problem of logical omniscience.
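For readers unfamiliar with the framework, the following toy Python sketch (an editorial illustration only; the two-world model and the 'watery stuff' assignments are invented assumptions) shows the core two-dimensional move: sentences are evaluated at pairs of worlds, necessity quantifies over the world of evaluation with the actual world fixed, and apriority looks at the diagonal where both coordinates coincide.

WORLDS = ['w1', 'w2']
WATERY_STUFF = {'w1': 'H2O', 'w2': 'XYZ'}   # what fills the "water" role in each world

def water_is_h2o(actual, evaluated):
    # "Water" rigidly refers to whatever fills the role in the world considered as actual.
    return WATERY_STUFF[actual] == 'H2O'

def water_is_the_watery_stuff(actual, evaluated):
    return WATERY_STUFF[actual] == WATERY_STUFF[evaluated]

def necessary_at(sentence, actual):
    """Necessity: true at every world of evaluation, with the actual world held fixed."""
    return all(sentence(actual, w) for w in WORLDS)

def a_priori(sentence):
    """Apriority: true at every point of the diagonal."""
    return all(sentence(w, w) for w in WORLDS)

# "Water is H2O": necessary (at w1) but not a priori on this toy model.
print(necessary_at(water_is_h2o, 'w1'), a_priori(water_is_h2o))
# "Water is the watery stuff": a priori but not necessary (at w1).
print(necessary_at(water_is_the_watery_stuff, 'w1'), a_priori(water_is_the_watery_stuff))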
APA, Harvard, Vancouver, ISO, and other styles
28

Ferreira, Rodrigo Costa. "Semântica proposicional categórica." Universidade Federal da Paraíba, 2010. http://tede.biblioteca.ufpb.br:8080/handle/tede/5678.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
The basic concepts of what later came to be called category theory were introduced in 1945 by Samuel Eilenberg and Saunders Mac Lane. In the 1940s, the main applications were originally in the fields of algebraic topology and abstract algebra. During the 1950s and 1960s, the theory became an important conceptual framework in many other areas of mathematical research, especially homological algebra and algebraic geometry, as shown by the works of Daniel M. Kan (1958) and Alexander Grothendieck (1957). Later, mathematical-logical questions about category theory appeared, in particular with the publication of Functorial Semantics of Algebraic Theories (1963) by Francis William Lawvere. Since then, further work has been done in categorical logic, such as Makkai (1977), Borceux (1994), Goldblatt (2006), and others. As an introduction to the application of category theory to logic, this work presents a study of categorical propositional logic. The first section introduces the concepts needed for a better understanding of the subject: (a) basic components of category theory: categorical constructions, definitions, axiomatics, applications, authors, etc.; (b) certain structures of abstract algebra: monoids, groups, Boolean algebras, etc.; (c) some concepts of mathematical logic: pre-orders, partial orders, equivalence relations, the Lindenbaum algebra, etc. The second section deals with the properties, structures and relations proper to categorical propositional logic. In that section, we interpret the logical connectives of negation, conjunction, disjunction and implication, as well as the Boolean connectives of complement, intersection and union, in categorical language. Finally, we define a categorical Boolean propositional semantics by means of a Boolean category algebra.
Os conceitos básicos do que mais tarde seria chamado de teoria das categorias são introduzidos no artigo General Theory of Natural Equivalences (1945) de Samuel Eilenberg e Saunders Mac Lane. Já em meados da década de 1940, esta teoria é aplicada com sucesso ao campo da topologia. Ao longo das décadas de 1950 e 1960, a teoria das categorias ostenta importantes mudanças ao enfoque tradicional de diversas áreas da matemática, entre as quais, em especial, a álgebra geométrica e a álgebra homológica, como atestam os pioneiros trabalhos de Daniel M. Kan (1958) e Alexander Grothendieck (1957). Mais tarde, questões lógico-matemáticas emergem em meio a essa teoria, em particular, com a publica ção da Functorial Semantics of Algebraic Theories (1963) de Francis Willian Lawvere. Desde então, diversos outros trabalhos vêm sendo realizados em lógica categórica, como os mais recentes Makkai (1977), Borceux (1994), Goldblatt (2006), entre outros. Como inicialização à aplicação da teoria das categorias à lógica, a presente dissertação aduz um estudo introdutório à lógica proposicional categórica. Em linhas gerais, a primeira parte deste trabalho procura familiarizar o leitor com os conceitos básicos à pesquisa do tema: (a) elementos constitutivos da teoria das categorias : axiomática, construções, aplicações, autores, etc.; (b) algumas estruturas da álgebra abstrata: monóides, grupos, álgebra de Boole, etc.; (c) determinados conceitos da lógica matemática: pré-ordem; ordem parcial; equivalência, álgebra de Lindenbaum, etc. A segunda parte, trata da aproximação da teoria das categorias à lógica proposicional, isto é, investiga as propriedades, estruturas e relações próprias à lógica proposicional categórica. Nesta passagem, há uma reinterpreta ção dos conectivos lógicos da negação, conjunção, disjunção e implicação, bem como dos conectivos booleanos de complemento, interseção e união, em termos categóricos. Na seqüência, estas novas concepções permitem enunciar uma álgebra booleana categórica, por meio da qual, ao final, é construída uma semântica proposicional booleana categórica.
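To give a concrete flavour of the Boolean side of such a semantics (an editorial sketch, not the dissertation's categorical construction), the snippet below interprets propositional connectives in the powerset Boolean algebra over the set of valuations, where negation, conjunction and disjunction become complement, intersection and union, and a tautology denotes the top element:

from itertools import product

VARS = ['p', 'q']
VALUATIONS = [dict(zip(VARS, bits)) for bits in product([False, True], repeat=len(VARS))]
TOP = frozenset(range(len(VALUATIONS)))   # the top element of the algebra

def interp(formula):
    """Map a formula (nested tuples) to the set of valuations satisfying it."""
    op = formula[0]
    if op == 'var':
        return frozenset(i for i, v in enumerate(VALUATIONS) if v[formula[1]])
    if op == 'not':
        return TOP - interp(formula[1])                     # complement
    if op == 'and':
        return interp(formula[1]) & interp(formula[2])      # intersection
    if op == 'or':
        return interp(formula[1]) | interp(formula[2])      # union
    if op == 'imp':
        return (TOP - interp(formula[1])) | interp(formula[2])
    raise ValueError(op)

# p -> (q -> p) is a tautology: it denotes the top element of the algebra.
print(interp(('imp', ('var', 'p'), ('imp', ('var', 'q'), ('var', 'p')))) == TOP)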
APA, Harvard, Vancouver, ISO, and other styles
29

Zanuttini, Bruno. "Computational Aspects of Learning, Reasoning, and Deciding." Habilitation à diriger des recherches, Université de Caen, 2011. http://tel.archives-ouvertes.fr/tel-00995250.

Full text
Abstract:
We present results and research projects about the computational aspects of classical problems in Artificial Intelligence. We are interested in the setting of agents able to describe their environment through a possibly huge number of Boolean descriptors, and to act upon this environment. The typical applications of this kind of study are the design of autonomous robots (for exploring unknown zones, for instance) or of software assistants (for scheduling, for instance). The ultimate goal of research in this domain is the design of agents able to learn autonomously by interacting with their environment (including human users), to reason in order to produce new pieces of knowledge and explain observed phenomena, and to decide which action to take at any moment in a rational fashion. Ideally, such agents will be fast, efficient as soon as they start to interact with their environment, will improve their behavior as time goes by, and will be able to communicate naturally with humans. Among the numerous research questions raised by these objectives, we are especially interested in concept and preference learning, in reinforcement learning, in planning, and in some underlying problems in complexity theory. Particular attention is paid to interaction with humans and to huge numbers of descriptors of the environment, as are necessary in real-world applications.
APA, Harvard, Vancouver, ISO, and other styles
30

Rieger, Schmidt Ana. "La primauté de l’étant et les premiers principes chez Gérard Odon." Thesis, Paris 4, 2014. http://www.theses.fr/2014PA040030.

Full text
Abstract:
Il s’agit d’une thèse sur le traité De duobus communissimis principiis scientiarum de Gérard Odon (vers 1320). Dans la première partie, nous faisons l’analyse du texte en nous centrant sur la notion d’« ens tertio adiacens ». Il s’agit de l’étant signifié par la totalité de la proposition et son vérifacteur ; il est univoquement comment à l’ens reale et à l’ens rationis et pour cette raison Odon l’identifie au sujet des principes de non-contradiction et du tiers exclu. L’ens tertio adiacens correspond aussi au premier objet adéquat de l’intellect et au sujet de la logique, entendue comme la science première. Dans la deuxième partie, nous plaçons Odon dans deux débats historiographiques : celui du réalisme propositionnel (à côté de Walter Burley, Grégoire de Rimini et Jean Wyclif) et celui des avancements de la doctrine des surtranscendantaux (à côté de Nicolas Bonet, François de la Marche et d’autres), lequel émerge de la distinction des deux sens de « res » chez Henri de Gand et ensuite chez Duns Scot
This thesis deals with Geraldus Odonis' treatise De duobus communissimis principiis scientiarum (ca. 1320). In the first part, we analyze the text by focusing on the concept of "ens tertio adiacens". It is the being signified by the proposition as a whole and its truthmaker; it is univocally common to ens reale and ens rationis, and for this reason Odonis identifies it with the subject of the principle of non-contradiction and the principle of excluded middle. The ens tertio adiacens also corresponds to the first adequate object of the intellect and to the subject of logic, understood as the first science. In the second part, we place Odonis in two historiographical debates: that of propositional realism (alongside Walter Burley, Gregory of Rimini and John Wyclif) and that of the development of the doctrine of supertranscendentals (alongside Nicolas Bonetus, Francis of Marchia and others), which emerges from the distinction between the two senses of "res" in Henry of Ghent and later in Duns Scotus.
APA, Harvard, Vancouver, ISO, and other styles
31

Boudou, Joseph. "Procédures de décision pour des logiques modales d'actions, de ressources et de concurrence." Thesis, Toulouse 3, 2016. http://www.theses.fr/2016TOU30145/document.

Full text
Abstract:
Les concepts d'action et de ressource sont omniprésents en informatique. La caractéristique principale d'une action est de changer l'état actuel du système modélisé. Une action peut ainsi être l'exécution d'une instruction dans un programme, l'apprentissage d'un fait nouveau, l'acte concret d'un agent autonome, l'énoncé d'un mot ou encore une tâche planifiée. La caractéristique principale d'une ressource est de pouvoir être divisée, par exemple pour être partagée. Il peut s'agir des cases de la mémoire d'un ordinateur, d'un ensemble d'agents, des différent sens d'une expression, d'intervalles de temps ou de droits d'accès. Actions et ressources correspondent souvent aux dimensions temporelles et spatiales du système modélisé. C'est le cas par exemple de l'exécution d'une instruction sur une case de la mémoire ou d'un groupe d'agents qui coopèrent. Dans ces cas, il est possible de modéliser les actions parallèles comme étant des actions opérant sur des parties disjointes des ressources disponibles. Les logiques modales permettent de modéliser les concepts d'action et de ressource. La sémantique relationnelle d'une modalité unaire est une relation binaire permettant d'accéder à un nouvel état depuis l'état courant. Ainsi une modalité unaire correspond à une action. De même, la sémantique d'une modalité binaire est une relation ternaire permettant d'accéder à deux états. En considérant ces deux états comme des sous-états de l'état courant, une modalité binaire modélise la séparation de ressources. Dans cette thèse, nous étudions des logiques modales utilisées pour raisonner sur les actions, les ressources et la concurrence. Précisément, nous analysons la décidabilité et la complexité du problème de satisfaisabilité de ces logiques. Ces problèmes consistent à savoir si une formule donnée peut être vraie. Pour obtenir ces résultats de décidabilité et de complexité, nous proposons des procédures de décision. Ainsi, nous étudions les logiques modales avec des modalités binaires, utilisées notamment pour raisonner sur les ressources. Nous nous intéressons particulièrement à l'associativité. Alors qu'il est généralement souhaitable que la modalité binaire soit associative, puisque la séparation de ressources l'est, cette propriété rend la plupart des logiques indécidables. Nous proposons de contraindre la valuation des variables propositionnelles afin d'obtenir des logiques décidables ayant une modalité binaire associative. Mais la majeure partie de cette thèse est consacrée à des variantes de la logique dynamique propositionnelle (PDL). Cette logiques possède une infinité de modalités unaires structurée par des opérateurs comme la composition séquentielle, l'itération et le choix non déterministe. Nous étudions tout d'abord des variantes de PDL comparables aux logiques temporelle avec branchement. Nous montrons que les problèmes de satisfaisabilité de ces variantes ont la même complexité que ceux des logiques temporelles correspondantes. Nous étudions ensuite en détails des variantes de PDL ayant un opérateur de composition parallèle de programmes inspiré des logiques de ressources. Cet opérateur permet d'exprimer la séparation de ressources et une notion intéressante d'actions parallèle est obtenue par la combinaison des notions d'actions et de séparation. En particulier, il est possible de décrire dans ces logiques des situations de coopération dans lesquelles une action ne peut être exécutée que simultanément avec une autre. 
Enfin, la contribution principale de cette thèse est de montrer que, dans certains cas intéressants en pratique, le problème de satisfaisabilité de ces logiques a la même complexité que PDL
The concepts of action and resource are ubiquitous in computer science. The main characteristic of an action is to change the current state of the modeled system. An action may be the execution of an instruction in a program, the learning of a new fact, a concrete act of an autonomous agent, a spoken word or a planned task. The main characteristic of resources is to be divisible, for instance in order to be shared. Resources may be memory cells in a computer, performing agents, different meanings of a phrase, time intervals or access rights. Together, actions and resources often constitute the temporal and spatial dimensions of a modeled system. Consider for instance the instructions of a computer executed at memory cells, or a set of cooperating agents. We observe that in these cases an interesting modeling of concurrency arises from the combination of actions and resources: concurrent actions are actions performed simultaneously on disjoint parts of the available resources. Modal logics have been successful in modeling both the concept of action and that of resource. The relational semantics of a unary modality is a binary relation which gives access to another state from the current state; hence, unary modalities are convenient for modeling actions. Similarly, the relational semantics of a binary modality is a ternary relation which gives access to two states from the current state. By interpreting these two states as substates of the current state, binary modalities make it possible to divide states, and are hence convenient for modeling resources. In this thesis, we study modal logics used to reason about actions, resources and concurrency. Specifically, we analyze the decidability and complexity of the satisfiability problem of these logics. This problem consists in deciding whether a given formula can be true in some model. We provide decision procedures to prove the decidability and state the complexity of these problems. Namely, we study modal logics with a binary modality used to reason about resources. We are particularly interested in the associativity property of the binary modality. This property is desirable since the separation of resources is usually associative too, but the associativity of a binary modality generally makes the logic undecidable. We propose in this thesis to constrain the valuation of propositional variables to make modal logics with an associative binary modality decidable. The main part of the thesis is devoted to the study of variants of Propositional Dynamic Logic (PDL). These logics feature an infinite set of unary modalities representing actions, structured by operators such as sequential composition, iteration and non-deterministic choice. We first study branching-time variants of PDL and prove that the satisfiability problems of these logics have the same complexity as the corresponding branching-time temporal logics. Then we thoroughly study extensions of PDL with an operator for parallel composition of actions, called separating parallel composition, based on the semantics of binary modalities. This operator makes it possible to reason about resources in addition to actions. Moreover, the combination of actions and resources provides a convenient expression of concurrency. In particular, these logics can express situations of cooperation where some actions can be executed only in parallel with some other actions.
Finally, our main contribution is to prove that the complexity of the satisfiability problem of a practically useful variant of PDL with separating parallel composition is the same as that of plain PDL.
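As a minimal illustration of the relational semantics that underlies PDL-style logics (my own sketch under simplifying assumptions, not the variant with separating parallel composition studied in the thesis), programs can be modelled as binary relations on a finite state space, with sequential composition, iteration and the diamond modality defined as the usual operations on relations:

def compose(r1, r2):
    """Sequential composition of two programs viewed as relations."""
    return {(x, z) for (x, y1) in r1 for (y2, z) in r2 if y1 == y2}

def star(r, states):
    """Reflexive-transitive closure: the semantics of program iteration."""
    result = {(s, s) for s in states}
    while True:
        new = result | compose(result, r)
        if new == result:
            return result
        result = new

def diamond(r, phi_states):
    """<program> phi: states from which some program step lands in phi_states."""
    return {x for (x, y) in r if y in phi_states}

states = {0, 1, 2, 3}
a = {(0, 1), (1, 2)}    # atomic program a
b = {(2, 3)}            # atomic program b (non-deterministic choice would simply be set union)
phi = {3}
# <a*; b> phi: states that reach phi by some number of a-steps followed by one b-step.
print(diamond(compose(star(a, states), b), phi))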
APA, Harvard, Vancouver, ISO, and other styles
32

Caridroit, Thomas. "Changement de croyances et logiques modales." Thesis, Artois, 2016. http://www.theses.fr/2016ARTO0402/document.

Full text
Abstract:
Le changement de croyances vise à trouver des moyens adéquats pour faire évoluer les croyances d'un agent lorsqu'il est confronté à de nouvelles informations. Dans la plupart des travaux sur la révision de croyances, l'ensemble de croyances d'un agent est composé de croyances au sujet de l'environnement (le monde) et est représenté par un ensemble de formules de la logique classique. Dans de nombreuses applications, un agent n'est pas seul dans l'environnement, mais le partage avec d'autres agents, qui ont aussi des croyances. Ainsi les croyances sur les croyances des autres agents constituent un élément d'information important pour l'agent, afin d'être en mesure de prendre les meilleures décisions et d'effectuer les meilleures actions. L'utilisation de croyances sur les croyances des autres agents est par exemple cruciale dans la théorie des jeux. Dans cette thèse, nous étudions dans un premier temps les opérateurs de contraction propositionnelle correspondant aux opérateurs de révision de Katsuno et Mendelzon. Nous étudions ensuite une connexion entre les logiques épistémiques et la théorie du changement de croyances, proche de l'approche AGM. Nous nous sommes intéressés à l'utilisation des opérateurs qui modifient les croyances des agents dans les modèles KD45n standard. Cette tâche est plus compliquée que dans le cadre AGM standard, car, dans un contexte multi-agents, les nouvelles informations peuvent prendre différentes formes. Par exemple, chaque nouvelle information peut être observée/transmise/disponible à tous les agents ou seulement à certains d’entre eux
Belief change is about finding appropriate ways to evolve an agent's beliefs when confronted with new pieces of information. In most works on belief revision, the set of beliefs of an agent is composed of beliefs about the environment (the world) and is represented by a set of formulas of classical logic. In many applications, an agent is not alone in the environment but shares it with other agents, which also have beliefs. Beliefs about the beliefs of other agents are thus an important piece of information for the agent in order to make the best decisions and perform the best actions. The use of beliefs about the beliefs of other agents is, for example, crucial in game theory. In this thesis, we first study the operators of propositional contraction corresponding to the revision operators proposed by Katsuno and Mendelzon. Then, we study a connection between epistemic logics and belief change theory, close to the AGM approach. We are interested in the use of operators that modify agent beliefs in standard KD45n models. This task is more complicated than in the standard AGM framework because, in a multi-agent context, new information can take different forms. For example, each new piece of information can be observed/transmitted/available to all agents or only to some of them.
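For concreteness, here is a classical single-agent example in the Katsuno-Mendelzon setting, Dalal's revision operator based on Hamming distance between models (an editorial illustration; it is not one of the multi-agent operators developed in the thesis, and the variable set and formulas are assumptions):

from itertools import product

VARS = ['p', 'q', 'r']

def models(formula):
    """Models of a formula given as a Python predicate over a valuation dict."""
    return [dict(zip(VARS, bits)) for bits in product([False, True], repeat=len(VARS))
            if formula(dict(zip(VARS, bits)))]

def hamming(v1, v2):
    return sum(v1[x] != v2[x] for x in VARS)

def dalal_revise(old, new):
    """Keep the models of the new information closest to some model of the old beliefs."""
    old_models, new_models = models(old), models(new)
    dist = {i: min(hamming(m, o) for o in old_models) for i, m in enumerate(new_models)}
    best = min(dist.values())
    return [m for i, m in enumerate(new_models) if dist[i] == best]

# Old beliefs: p and q. New information: not p or not q.
old = lambda v: v['p'] and v['q']
new = lambda v: not v['p'] or not v['q']
for m in dalal_revise(old, new):
    print({x: m[x] for x in VARS})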
APA, Harvard, Vancouver, ISO, and other styles
33

Fortin, Marie. "Expressivité de la logique du premier ordre, de la logique dynamique propositionnelle sans étoile et des automates communicants." Electronic Thesis or Diss., université Paris-Saclay, 2020. http://www.theses.fr/2020UPASG046.

Full text
Abstract:
Cette thèse porte sur l’expressivité de la logique du premier ordre et d’autres formalismes sur différentes classes de structures ordonnées, parmi lesquelles les MSC (Message Sequence Charts), un modèle standard pour les exécutions de systèmes concurrents avec échange de messages. Cette étude est motivée par deux questions classiques : celle de l’équivalence, pour certaines classes de structures, entre la logique du premier ordre et son fragment avec k variables, et celle de la comparaison entre automates et logique, dans l’esprit du théorème de Büchi-Elgot-Trakhtenbrot. Notre approche repose sur la logique dynamique propositionnelle sans étoile (PDL sans étoile), une variante de PDL équivalente à la logique du premier ordre avec 3 variables. On étudie d’abord l’expressivité de PDL sans étoile sur des structures linéairement ordonnées avec des prédicats unaires et binaires. On montre que sous certaines conditions de monotonie, PDL sans étoile devient aussi expressive que la logique du premier ordre. Cela implique que toute formule de la logique du premier ordre peut alors être réécrite en une formule équivalente qui utilise au plus 3 variables. Ce résultat s’applique, directement ou indirectement, à un certain nombre de classes naturelles, généralisant des résultats connus et répondant à des questions ouvertes.On se concentre ensuite sur les MSC, auxquels ce premier résultat s’applique également. PDL sans étoile nous permet d’aborder un autre problème important: celui de la synthèse d’automates communicants à partir de spécifications écrites en logique du premier ordre. Les automates communicants sont un modèle de systèmes concurrents dans lequel un nombre fixé d’automates finis échangent des messages via des canaux FIFO. Ils définissent des langages de MSC. Bien que des caractérisations de l’expressivité des automates communicants aient déjà été établies pour certaines restrictions (borne sur la taille des canaux de communications, ou omission de la relation “arrivé-avant” au niveau de la logique), la question suivante restait ouverte dans le cas général : toute formule du premier ordre sur les MSC peut-elle être traduite en un automate communicant équivalent ? On montre que c’est le cas, en utilisant PDL sans étoile comme langage intermédiaire
This thesis is concerned with the expressive power of first-order logic and other formalisms over different classes of ordered structures, among which MSCs (Message Sequence Charts), a standard model for executions of message-passing systems. This study is motivated by two classic problems: the k-variable property, that is, the equivalence of first-order logic and its k-variable fragment over certain classes of structures, and the study of logic-automata connections, in the spirit of the Büchi-Elgot-Trakhtenbrot theorem. Our approach relies on star-free propositional dynamic logic (star-free PDL), a variant of PDL with the same expressive power as the 3-variable fragment of first-order logic. We start by studying the expressive power of star-free PDL over linearly ordered structures with unary and binary predicates. We show that under certain monotonicity conditions, star-free PDL becomes as expressive as first-order logic. This implies that any first-order formula can then be rewritten into an equivalent formula with at most 3 variables. This result applies to various natural classes of structures, generalizing several known results and answering some open questions. We then focus on MSCs, to which this first result also applies. We use star-free PDL to address another important problem: the synthesis of communicating finite-state machines (CFMs) from first-order specifications. CFMs are a model of concurrent systems in which a fixed number of finite-state automata communicate through unbounded FIFO channels. They accept languages of MSCs. While logical characterizations of the expressive power of CFMs have been established under different restrictions (bounding the size of the communication channels, or removing the "happened-before" relation from the logic), the following question had remained open in the general case: can every first-order formula over MSCs be translated into an equivalent CFM? We prove that this is the case, using star-free PDL as an intermediate language.
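As a side illustration of the structures involved (an editorial sketch with invented event names, not the thesis's constructions), an MSC can be viewed as a graph whose events are ordered by per-process order and by message edges, with the "happened-before" relation obtained as their transitive closure:

def happened_before(process_order, messages):
    """process_order and messages are sets of (event, event) pairs; return their transitive closure."""
    edges = set(process_order) | set(messages)
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in edges:
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# Two processes: p executes sends s1 then s2, q executes the matching receives r1 then r2.
process_order = {('s1', 's2'), ('r1', 'r2')}
messages = {('s1', 'r1'), ('s2', 'r2')}
hb = happened_before(process_order, messages)
print(('s1', 'r2') in hb)   # True: s1 -> s2 -> r2 (and also s1 -> r1 -> r2)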
APA, Harvard, Vancouver, ISO, and other styles
34

Cialdea, Marta. "Une methode de deduction automatique en logique modale." Toulouse 3, 1986. http://www.theses.fr/1986TOU30179.

Full text
Abstract:
The aim of this work is to extend to the predicate calculus the resolution method for modal propositional logic defined by Luis Fariñas del Cerro. The modal system considered, called Q, is a subsystem of the better-known systems T, S4 and S5. We consider, however, that the results obtained in this work can easily be extended to these systems. We define a modal Herbrand property for the system Q and give its proof. This property then allows us to define a system of resolution rules in which the unification of terms is subject to certain restrictions, and to prove its soundness and completeness.
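For orientation, the classical propositional resolution rule on which such modal resolution methods build can be sketched as follows (an editorial illustration, not the modal calculus for system Q defined in the thesis):

def resolvents(c1, c2):
    """All resolvents of two clauses given as frozensets of integer literals."""
    out = set()
    for lit in c1:
        if -lit in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {-lit})))
    return out

def saturate(clauses):
    """Saturate a clause set under resolution; deriving the empty clause means unsatisfiability."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                new |= resolvents(c1, c2)
        if new <= clauses:
            return clauses
        clauses |= new

# {p}, {-p, q}, {-q} is unsatisfiable: the empty clause is derived.
cnf = {frozenset({1}), frozenset({-1, 2}), frozenset({-2})}
print(frozenset() in saturate(cnf))   # True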
APA, Harvard, Vancouver, ISO, and other styles
35

Hidouri, Amel. "Contributions aux approches par contraintes pour la fouille de données." Electronic Thesis or Diss., Artois, 2022. http://www.theses.fr/2022ARTO0407.

Full text
Abstract:
Cette thèse s'inscrit dans le contexte de la fouille de données et plus précisément l'extraction de connaissances à partir des données qui consiste à énumérer des motifs intéressants. Ce domaine a été introduit dans les années 1990, et est devenu un élément clé de la fouille de données et de l'apprentissage automatique.La fouille des motifs high utility (HUIM) est un problème très connu dans la fouille des motifs qui étend le problème classique de la fouille des itemsets fréquents. Le mesure d'utilité peut être évaluée en termes de profit, du coût, ou toute autre mesure de préférence définie par l'utilisateur. L'objectif de HUIM est donc de trouver les itemsets ayant une utilité supérieure à un seuil.Récemment, des approches déclaratives ont été proposées pour différentes tâches de la fouille de données incluant la fouille des itemsets fréquents, des règles d'associations, des séquences ou encore des graphes. Ces approches déclaratives présentent l'intérêt d'intégrer facilement de nouvelles contraintes pour la recherche de motifs particuliers.Dans ce contexte, le premier objectif est de proposer un cadre déclaratif pour l'extraction des itemsets high utility et leurs formes condensées à partir de bases de données transactionnelles en utilisant la logique propositionnelle.Pour le passage à l'échelle, nous souhaitons exploiter la décomposition et les approches parallèles. Enfin, l'objectif final de cette thèse est de mettre en evidence les performances des approches proposées à travers une comparaison avec un ensemble de méthodes existantes sur des données réelles et synthétiques
This thesis deals with the field of data mining and, more precisely, the extraction of knowledge from data by enumerating interesting patterns. This research domain was introduced in the 1990s and became a core part of data mining and machine learning. High Utility Itemset Mining (HUIM for short) is a well-known problem in pattern mining that extends the classical problem of mining frequent itemsets. The utility can be evaluated in terms of profit, cost, or any other user preference. The objective of HUIM is to find itemsets with a utility greater than a threshold. Declarative approaches have recently been proposed for various data mining tasks such as mining frequent itemsets, association rules, sequences, or graphs. These declarative approaches have the advantage of easily incorporating new constraints into the search for particular patterns. The thesis's first goal is to propose a declarative framework for mining high utility itemsets from transaction databases using symbolic artificial intelligence; our method is based on the propositional satisfiability problem. Second, in order to improve scalability, we investigate how decomposition and parallelism can address the common difficulty symbolic techniques face when dealing with large databases, while still producing interesting results. The third contribution is a propositional satisfiability-based framework for dealing with various condensed representations of high utility patterns, as a way to reduce the mining algorithm's output. Finally, the last objective of this thesis is to highlight the performance of the proposed approaches through a comparison with a set of approaches from the literature on real and synthetic data.
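To fix intuitions about the quantity being mined (a toy editorial example with invented profits and transactions, not the SAT encodings proposed in the thesis), the utility of an itemset and a brute-force extraction of high utility itemsets can be written as:

from itertools import combinations

PROFIT = {'a': 5, 'b': 2, 'c': 1}
# Each transaction maps an item to the purchased quantity.
DB = [{'a': 1, 'b': 2}, {'a': 2, 'c': 6}, {'b': 4, 'c': 3}]

def utility(itemset, db=DB, profit=PROFIT):
    """Sum of quantity * unit profit over the transactions containing the whole itemset."""
    total = 0
    for t in db:
        if all(i in t for i in itemset):
            total += sum(t[i] * profit[i] for i in itemset)
    return total

def high_utility_itemsets(threshold, items=tuple(PROFIT)):
    """Enumerate all itemsets whose utility reaches the user-given threshold."""
    result = []
    for k in range(1, len(items) + 1):
        for itemset in combinations(items, k):
            if utility(itemset) >= threshold:
                result.append((itemset, utility(itemset)))
    return result

print(high_utility_itemsets(12))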
APA, Harvard, Vancouver, ISO, and other styles
36

Mehta, Moneesha. "Four propositional logics." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape15/PQDD_0014/MQ36512.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Cyriac, Aiswarya. "Verification of communicating recursive programs via split-width." PhD thesis, École normale supérieure de Cachan - ENS Cachan, 2014. http://tel.archives-ouvertes.fr/tel-01015561.

Full text
Abstract:
This thesis investigates automata-theoretic techniques for the verification of physically distributed machines communicating via unbounded reliable channels. Each of these machines may run several recursive programs (multi-threading). A recursive program may also use several unbounded stack and queue data-structures for its local-computation needs. Such real-world systems are so powerful that all verification problems become undecidable. We introduce and study a new parameter called split-width for the under-approximate analysis of such systems. Split-width is the minimum number of splits required in the behaviour graphs to obtain disjoint parts which can be reasoned about independently. Thus it provides a divide-and-conquer approach for their analysis. With the parameter split-width, we obtain optimal decision procedures for various verification problems on these systems like reachability, inclusion, etc. and also for satisfiability and model checking against various logical formalisms such as monadic second-order logic, propositional dynamic logic and temporal logics. It is shown that behaviours of a system have bounded split-width if and only if they have bounded clique-width. Thus, by Courcelle's results on uniformly bounded-degree graphs, split-width is not only sufficient but also necessary to get decidability for MSO satisfiability checking. We then study the feasibility of distributed controllers for our generic distributed systems. We propose several controllers, some finite state and some deterministic, which ensure that the behaviours of the system have bounded split-width. Such a distributedly controlled system yields decidability for the various verification problems by inheriting the optimal decision procedures for split-width. These also extend or complement many known decidable subclasses of systems studied previously.
APA, Harvard, Vancouver, ISO, and other styles
38

Hughes, Cameron A. "Epistemic Structures of Interrogative Domains." Youngstown State University / OhioLINK, 2008. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1227285777.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Metcalfe, George. "Proof theory for propositional fuzzy logics." Thesis, King's College London (University of London), 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.406569.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Horsfall, Benjamin Robert. "The logic of bunched implications : a memoir /." Connect to thesis, 2007. http://eprints.unimelb.edu.au/archive/00002633.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Guilfoy, Kevin Stephen. "Peter Abelard's theory of the proposition /." Thesis, Connect to this title online; UW restricted, 1999. http://hdl.handle.net/1773/5698.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Calardo, Erica. "Inference Rules in some temporal multi-epistemic propositional logics." Thesis, Manchester Metropolitan University, 2008. http://e-space.mmu.ac.uk/323602/.

Full text
Abstract:
Multi-modal logics are among the best tools developed so far to analyse human reasoning and agents' interactions. Recently, multi-modal logics have found several applications in Artificial Intelligence (AI) and Computer Science (CS) in the attempt to formalise reasoning about the behavior of programs. Modal logics deal with sentences that are qualified by modalities. A modality is any word that could be added to a statement p to modify its mode of truth. Temporal logics are obtained by joining tense operators to the classical propositional calculus, giving rise to a language that is very effective for describing the flow of time. Epistemic logics are suitable for formalizing reasoning about agents possessing certain knowledge. Combinations of temporal and epistemic logics are particularly effective in describing the interaction of agents through the flow of time. Although not yet fully investigated, this approach has found many fruitful applications, concerned with the development of systems modelling reasoning about knowledge and space, reasoning under uncertainty, multi-agent reasoning, etc. Despite their power, multi-modal languages cannot by themselves handle a changing environment, yet this is exactly what is required in the case of human reasoning, computation and multi-agent environments. For this purpose, inference rules are a core instrument. So far, research in this field has investigated many modal and superintuitionistic logics; however, for multi-modal logics, not much is known concerning admissible inference rules. In our research we extend the investigation to some multi-modal propositional logics which combine tense and knowledge modalities. To the best of our knowledge, these systems have never been investigated before. In particular, we start by defining our systems semantically; we then prove that these systems enjoy the effective finite model property and are decidable with respect to their admissible inference rules. We then turn our attention to the syntactic side and provide sound and complete axiomatic systems. We conclude the dissertation by introducing the reader to the research we are currently working on. Our original results can be found in [9, 4, 11] (see Appendix A). They have also been presented by the author at several international conferences and schools (see [8, 10, 5, 7, 6] and Appendix B for more details). Our project concerns philosophy, mathematics, AI and CS. Modern applications of logic in CS and AI often require languages able to represent knowledge about dynamic systems. Multi-modal logics serve these applications in a very efficient way, and we aim to absorb and develop some of these techniques to represent logical consequences in artificial intelligence and computation.
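As a rough picture of the models such combined logics are interpreted over (an editorial sketch with an invented three-state model, not the systems axiomatized in the thesis), one can pair a temporal successor relation with an epistemic indistinguishability relation and evaluate a knowledge operator and an "eventually" operator:

def holds_K(agent_rel, state, fact, valuation):
    """K phi: the fact holds in every state the agent cannot distinguish from this one."""
    return all(fact in valuation[t] for t in agent_rel[state])

def holds_F(next_rel, state, fact, valuation, seen=None):
    """F phi (eventually): the fact holds now or in some state reachable via 'next'."""
    seen = seen or set()
    if fact in valuation[state]:
        return True
    return any(holds_F(next_rel, t, fact, valuation, seen | {state})
               for t in next_rel.get(state, ()) if t not in seen)

valuation = {0: set(), 1: {'p'}, 2: {'p'}}
next_rel = {0: [1], 1: [2]}
agent_rel = {0: {0}, 1: {1, 2}, 2: {1, 2}}   # the agent confuses states 1 and 2

print(holds_F(next_rel, 0, 'p', valuation))    # True: p eventually holds from state 0
print(holds_K(agent_rel, 1, 'p', valuation))   # True: p holds in all states the agent considers possible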
APA, Harvard, Vancouver, ISO, and other styles
43

Binko, Petr. "Moderní plánovací algoritmy." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2010. http://www.nusl.cz/ntk/nusl-237146.

Full text
Abstract:
This work describes the Graphplan, SATplan and Real-Time Adaptive A* planning algorithms. These algorithms were implemented in order to test their functionality and expected properties (real-time computation, parallelism). The tests take place in nontrivial domains: the Graphplan and SATplan algorithms were tested in the block-world, tire-world and bulldozer domains, and the results were compared and displayed in graphs. The Real-Time Adaptive A* algorithm was tested in the tire-world domain and its results were compared with those of the classic A* algorithm. The advantages and disadvantages of these algorithms are also described.
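Since Real-Time Adaptive A* builds on A*, a compact A* sketch may help situate the comparison (an editorial illustration on an invented toy grid, not the thesis's implementation):

import heapq

def a_star(start, goal, neighbours, h):
    """Expand the open node with smallest g + h; neighbours(state) yields (next_state, step_cost)."""
    open_heap = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while open_heap:
        f, g, state, path = heapq.heappop(open_heap)
        if state == goal:
            return path, g
        for nxt, cost in neighbours(state):
            ng = g + cost
            if ng < best_g.get(nxt, float('inf')):
                best_g[nxt] = ng
                heapq.heappush(open_heap, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None, float('inf')

# Toy grid: move right or up on a 3x3 grid from (0, 0) to (2, 2), unit costs,
# with Manhattan distance as the admissible heuristic.
def neighbours(pos):
    x, y = pos
    return [((nx, ny), 1) for nx, ny in ((x + 1, y), (x, y + 1)) if nx < 3 and ny < 3]

h = lambda pos: (2 - pos[0]) + (2 - pos[1])
print(a_star((0, 0), (2, 2), neighbours, h))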
APA, Harvard, Vancouver, ISO, and other styles
44

St, John Gavin. "On formally undecidable propositions of Zermelo-Fraenkel set theory." Youngstown State University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1369657108.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Francez, Itamar. "Existential propositions /." May be available electronically:, 2007. http://proquest.umi.com/login?COPT=REJTPTU1MTUmSU5UPTAmVkVSPTI=&clientId=12498.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Wespel, Johannes. "Zur semantischen Feinstruktur in propositionalen Einstellungskontexten." [S.l. : s.n.], 2004. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB11244071.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Bab, Sebastian [Verfasser]. "Epsilon-mu-Logik - Eine Theorie propositionaler Logiken / Sebastian Bab." Aachen : Shaker, 2007. http://d-nb.info/1163609757/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Armant, Vincent. "Diagnostic distribué de systèmes respectant la confidentialité." PhD thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00763370.

Full text
Abstract:
In this thesis, we are interested in diagnosing intrinsically distributed systems (such as peer-to-peer systems) where each peer has access to only a part of the description of a global system. Moreover, because of an overly restrictive access policy, it may be that no single peer is able to explain the behaviour of the global system. In this context, the challenge of distributed diagnosis is the following: to explain the global behaviour of a distributed system by a set of peers, each with a limited view, just as a single diagnoser peer with a global view of the system would have done. From a theoretical point of view, we show that any new system that is logically equivalent to the initially observed peer-to-peer system guarantees that every local diagnosis of a peer can be extended to a global diagnosis (in this case, the new system is said to be correct for distributed diagnosis). We also show that if this new system is structured (i.e., it contains a spanning tree in which all the peers containing the same variable form a connected graph), then it guarantees that every global diagnosis can be recovered through a set of local diagnoses of the peers (in this case, the new system is said to be complete for distributed diagnosis). For the sake of succinct representation, and in order to respect the confidentiality policy on the vocabulary of each peer, we present a new algorithm, Token Elimination (TE), which decomposes the initial peer system into a structured system. We show experimentally that TE produces decompositions of better quality (i.e., with smaller treewidths) than the methods considered in a distributed setting. Starting from the structured system built by TE, we transform each local description into a globally consistent Disjunctive Normal Form (DNF). We show that this last system indeed guarantees a correct and complete distributed diagnosis. In addition, we exhibit an algorithm able to check efficiently that every local diagnosis is part of a minimal global diagnosis, making the structured system of DNFs a compiled system for distributed diagnosis.
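The structural condition mentioned above (every variable's peers forming a connected part of the spanning tree) is easy to state in code; the following sketch is an editorial reading of that property with invented peer names, not the Token Elimination algorithm itself:

def is_structured(tree_edges, peer_vars):
    """tree_edges: list of (peer, peer) pairs; peer_vars: dict peer -> set of variables."""
    adjacency = {}
    for a, b in tree_edges:
        adjacency.setdefault(a, set()).add(b)
        adjacency.setdefault(b, set()).add(a)
    all_vars = set().union(*peer_vars.values())
    for var in all_vars:
        holders = {p for p, vs in peer_vars.items() if var in vs}
        # Check that 'holders' is connected in the tree restricted to holders.
        start = next(iter(holders))
        seen, stack = {start}, [start]
        while stack:
            p = stack.pop()
            for q in adjacency.get(p, ()):
                if q in holders and q not in seen:
                    seen.add(q)
                    stack.append(q)
        if seen != holders:
            return False
    return True

tree = [('P1', 'P2'), ('P2', 'P3')]
good = {'P1': {'x'}, 'P2': {'x', 'y'}, 'P3': {'y'}}
bad = {'P1': {'x'}, 'P2': {'y'}, 'P3': {'x'}}   # the peers holding 'x' are not connected
print(is_structured(tree, good), is_structured(tree, bad))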
APA, Harvard, Vancouver, ISO, and other styles
49

Manthey, Norbert. "Towards Next Generation Sequential and Parallel SAT Solvers." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2015. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-158672.

Full text
Abstract:
This thesis focuses on improving SAT solving technology. The improvements concern two major subjects: sequential SAT solving and parallel SAT solving. To better understand sequential SAT algorithms, the abstract reduction system Generic CDCL is introduced. With Generic CDCL, the soundness of solving techniques can be modeled. Next, the conflict-driven clause learning algorithm is extended with three techniques (local look-ahead, local probing and all-UIP learning) that allow more global reasoning during search. These techniques improve the performance of the sequential SAT solver Riss. Then, the formula simplification techniques bounded variable addition, covered literal elimination and an advanced cardinality constraint extraction are introduced. By using these techniques, the reasoning of the overall SAT solving tool chain becomes stronger than plain resolution. When these three techniques are applied in the formula simplification tool Coprocessor before Riss is used to solve a formula, the performance can be improved further. Due to the increasing number of cores in CPUs, the scalable parallel SAT solving approach iterative partitioning has been implemented in Pcasso for the multi-core architecture. Related work on parallel SAT solving has been studied to extract the main ideas that can improve Pcasso. Besides parallel formula simplification with bounded variable elimination, the major extension is the extended clause sharing level based clause tagging, which builds the basis for conflict-driven node killing. The latter makes it possible to better identify unsatisfiable search space partitions. Another improvement is to combine scattering and look-ahead as a superior search space partitioning function. In combination with Coprocessor, the introduced extensions increase the performance of the parallel solver Pcasso. The implemented system turns out to be scalable for the multi-core architecture; hence iterative partitioning is interesting for future parallel SAT solvers. The implemented solvers participated in international SAT competitions. In 2013 and 2014, Pcasso showed good performance. Riss in combination with Coprocessor won several first, second and third prizes, including two Kurt-Gödel-Medals. Hence, the introduced algorithms improved modern SAT solving technology.
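For readers new to the area, the following minimal DPLL-style sketch (an editorial illustration, far simpler than the CDCL, simplification and partitioning techniques described above) shows the unit propagation and branching that modern solvers refine:

def unit_propagate(clauses, assignment):
    """Repeatedly assign the only remaining literal of unit clauses; return None on conflict."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(l in assignment for l in clause):
                continue                       # clause already satisfied
            unassigned = [l for l in clause if -l not in assignment and l not in assignment]
            if not unassigned:
                return None                    # conflict: clause falsified
            if len(unassigned) == 1:
                assignment = assignment | {unassigned[0]}
                changed = True
    return assignment

def dpll(clauses, assignment=frozenset()):
    """Return a satisfying assignment (set of literals) or None; clauses are lists of int literals."""
    assignment = unit_propagate(clauses, assignment)
    if assignment is None:
        return None
    unassigned = {abs(l) for c in clauses for l in c} - {abs(l) for l in assignment}
    if not unassigned:
        return assignment
    var = min(unassigned)
    for lit in (var, -var):                    # branch on both polarities
        result = dpll(clauses, assignment | {lit})
        if result is not None:
            return result
    return None

print(dpll([[1, 2], [-1, 3], [-2, -3]]))       # a satisfying assignment
print(dpll([[1], [-1]]))                       # None: unsatisfiable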
APA, Harvard, Vancouver, ISO, and other styles
50

Kim, Ji Eun. "The generation of implicit propositions in "alleged" Korean topics." Diss., Restricted to subscribing institutions, 2010. http://proquest.umi.com/pqdweb?did=2035975201&sid=1&Fmt=2&clientId=1564&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles