Doctoral dissertations on the topic "Propositional satisfiability"

Follow this link to see other types of publications on this topic: Propositional satisfiability.

Create a correct reference in APA, MLA, Chicago, Harvard, and many other citation styles


Browse the 25 best doctoral dissertations on the topic "Propositional satisfiability".

An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically create a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a ".pdf" file and read its abstract online whenever the relevant details are available in the record's metadata.

Browse doctoral dissertations from a wide variety of disciplines and compile a bibliography that suits your needs.

1

Al-Saedi, Mohammad Saleh Balasim. "Extensions of tractable classes for propositional satisfiability". Thesis, Artois, 2016. http://www.theses.fr/2016ARTO0405/document.

Full text of the source
Abstract:
Knowledge representation and reasoning is a key issue in computer science and more particularly in artificial intelligence. In this respect, propositional logic is a representation formalism that offers a good trade-off between the competing criteria of computational efficiency and expressiveness. However, unless P = NP, deduction in propositional logic is not polynomial in the worst case. In this thesis we therefore propose new extensions of tractable classes of the propositional satisfiability problem (SAT). Tractable fragments of SAT play a role in the implementation of the most efficient current SAT solvers, and many of these tractable classes use the linear-time unit propagation (UP) inference rule. We extend two currently known polynomial fragments of SAT by means of UP, in such a way that the extended fragments can still be recognized and solved in polynomial time. A first result focuses on the Quad fragments: we establish some properties of Quad fragments, extend them, and exhibit promising variants. The extension is obtained by allowing the fixed total orderings of clauses used by Quad to be accompanied by specific additional separate orderings of maximal sub-clauses. The resulting fragments extend Quad without degrading its worst-case complexity. We also investigate how bounded resolution and redundancy through unit propagation can play a role in this respect. The second contribution on tractable subclasses of SAT concerns extensions of a well-known polynomial fragment due to Tovey, so that they also include instances that can be simplified using UP. We then compare two existing polynomial fragments based on UP, namely Quad and UP-Horn, and answer an open question about the connections between these two classes: we show that UP-Horn and some other UP-based variants are strict subclasses of S Quad, where S Quad is the union of all Quad classes obtained by investigating all possible orderings of clauses.
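Both Quad and UP-Horn are defined in terms of unit propagation. For readers who want a concrete reference point, here is a minimal, generic sketch of unit propagation on a clause set; the representation and function name are illustrative and not taken from the thesis.

```python
def unit_propagate(clauses):
    """Repeatedly assign literals forced by unit clauses.

    `clauses` is a list of sets of integer literals (negative = negated).
    Returns (assignment, simplified_clauses), or (None, None) on a conflict.
    """
    assignment = set()
    clauses = [set(c) for c in clauses]
    while True:
        unit = next((c for c in clauses if len(c) == 1), None)
        if unit is None:
            return assignment, clauses
        lit = next(iter(unit))
        assignment.add(lit)
        new_clauses = []
        for c in clauses:
            if lit in c:                # clause satisfied, drop it
                continue
            reduced = c - {-lit}        # remove the falsified literal
            if not reduced:             # empty clause: conflict
                return None, None
            new_clauses.append(reduced)
        clauses = new_clauses
```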
APA, Harvard, Vancouver, ISO and other citation styles
2

Hansen, Stephen Lee. "Complete Randomized Cutting Plane Algorithms for Propositional Satisfiability". NSUWorks, 2000. http://nsuworks.nova.edu/gscis_etd/565.

Full text of the source
Abstract:
The propositional satisfiability problem (SAT) is a fundamental problem in computer science and combinatorial optimization. A considerable number of prior researchers have investigated SAT, and much is already known concerning the limitations of known algorithms for SAT. In particular, some necessary conditions are known such that any algorithm not meeting those conditions cannot be efficient. This dissertation reports research to develop and test a new algorithm that meets the currently known necessary conditions. In chapter three, we give a new characterization of the convex integer hull of SAT and two new algorithms for finding strong cutting planes. We also show the importance of choosing which vertex to cut, and present heuristics to find a vertex that allows a strong cutting plane. In chapter four, we describe an experiment to implement a SAT-solving algorithm using the new algorithms and heuristics, and to examine their effectiveness on a set of problems. In chapter five, we describe the implementation of the algorithms and present computational results. For an input SAT problem, the output of the implemented program provides either a witness to satisfiability or a complete cutting-plane proof of unsatisfiability. The description, implementation, and testing of these algorithms yield both empirical data to characterize the performance of the new algorithms and additional insight to further advance the theory. We conclude from the computational study that cutting-plane algorithms are efficient for the solution of a large class of SAT problems.
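The cutting-plane view of SAT used here rests on the standard translation of clauses into 0-1 covering inequalities. The following worked example is a generic illustration of that translation, not a formula from the dissertation.

```latex
% A clause is satisfied exactly when its 0-1 covering inequality holds:
(x_1 \lor \lnot x_2 \lor x_3)
\;\longmapsto\;
x_1 + (1 - x_2) + x_3 \ge 1
\;\Longleftrightarrow\;
x_1 - x_2 + x_3 \ge 0,
\qquad x_1, x_2, x_3 \in \{0, 1\}.
```

Cutting planes are then inequalities that are valid for every integer solution of this system but violated by some fractional vertex of its linear relaxation.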
APA, Harvard, Vancouver, ISO and other citation styles
3

Duong, Thach-Thao Nguyen. "Improving Diversification in Local Search for Propositional Satisfiability". Thesis, Griffith University, 2014. http://hdl.handle.net/10072/365717.

Full text of the source
Abstract:
In recent years, Propositional Satisfiability (SAT) has become a standard way of encoding complex real-world constrained problems. SAT has had a significant impact on various research fields in Artificial Intelligence (AI) and Constraint Programming (CP). SAT algorithms have also been successfully used in solving many practical and industrial applications, including electronic design automation, default reasoning, diagnosis, planning, scheduling, image interpretation, circuit design, and hardware and software verification. The most common representation of a SAT formula is Conjunctive Normal Form (CNF). A CNF formula is a conjunction of clauses, where each clause is a disjunction of Boolean literals. A SAT formula is satisfiable if there is a truth assignment to the variables such that all clauses in the formula are satisfied. Solving a SAT problem is to determine a truth assignment that satisfies a CNF formula. SAT is the first problem proved to be NP-complete [20]. There are many algorithmic methodologies for solving SAT. The most obvious one is systematic search; another popular and successful approach is stochastic local search (SLS). Systematic search is usually referred to as complete search or backtrack-style search. In contrast, SLS explores the search space by randomisation and perturbation operations. Although SLS is an incomplete search method, it is able to find solutions effectively using limited time and resources. Moreover, some SLS solvers can solve hard SAT problems in a few minutes, while these problems could be beyond the capacity of systematic search solvers.
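The CNF definitions in this abstract can be made concrete in a few lines of code. The snippet below is a generic brute-force illustration with made-up clause data, not material from the thesis.

```python
from itertools import product

def satisfies(assignment, clauses):
    """True iff every clause contains a literal made true by `assignment`.

    Literals are integers: +v means variable v is true, -v means it is false.
    """
    return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses)

# (x1 or not x2) and (x2 or x3), checked by enumerating all truth assignments.
clauses = [[1, -2], [2, 3]]
variables = sorted({abs(lit) for clause in clauses for lit in clause})
models = [bits for bits in product([False, True], repeat=len(variables))
          if satisfies(dict(zip(variables, bits)), clauses)]
print(f"{len(models)} of {2 ** len(variables)} assignments are satisfying")
```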
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Information and Communication Technology
Science, Environment, Engineering and Technology
Full Text
APA, Harvard, Vancouver, ISO and other citation styles
4

Ferreira, Junior Valnir. "Improvements to Clause Weighting Local Search for Propositional Satisfiability". Griffith University. Institute for Integrated and Intelligent Systems, 2007. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20070823.123257.

Full text of the source
Abstract:
The propositional satisfiability (SAT) problem is of considerable theoretical and practical relevance to the artificial intelligence (AI) community and has been used to model many pervasive AI tasks such as default reasoning, diagnosis, planning, image interpretation, and constraint satisfaction. Computational methods for SAT have historically fallen into two broad categories: complete search and local search. Within the local search category, clause weighting methods are amongst the best alternatives for SAT, becoming particularly attractive on problems where a complete search is impractical or where there is a need to find good candidate solutions within a short time. The thesis is concerned with the study of improvements to clause weighting local search methods for SAT. The main contributions are: A component-based framework for the functional analysis of local search methods. A clause weighting local search heuristic that exploits longer-term memory arising from clause weight manipulations. The approach first learns which clauses are globally hardest to satisfy and then uses this information to treat these clauses differentially during weight manipulation [Ferreira Jr and Thornton, 2004]. A study of heuristic tie breaking in the domain of additive clause weighting local search methods, and the introduction of a competitive method that uses heuristic tie breaking instead of the random tie breaking approach used in most existing methods [Ferreira Jr and Thornton, 2005]. An evaluation of backbone guidance for clause weighting local search, and the introduction of backbone guidance to three state-of-the-art clause weighting local search methods [Ferreira Jr, 2006]. A new clause weighting local search method for SAT that successfully exploits synergies between the longer-term memory and tie breaking heuristics developed in the thesis to significantly improve on the performance of current state-of-the-art local search methods for SAT-encoded instances containing identifiable CSP structure. Portions of this thesis have appeared in the following refereed publications: Longer-term memory in clause weighting local search for SAT. In Proceedings of the 17th Australian Joint Conference on Artificial Intelligence, volume 3339 of Lecture Notes in Artificial Intelligence, pages 730-741, Cairns, Australia, 2004. Tie breaking in clause weighting local search for SAT. In Proceedings of the 18th Australian Joint Conference on Artificial Intelligence, volume 3809 of Lecture Notes in Artificial Intelligence, pages 70–81, Sydney, Australia, 2005. Backbone guided dynamic local search for propositional satisfiability. In Proceedings of the Ninth International Symposium on Artificial Intelligence and Mathematics, AI&M, Fort Lauderdale, Florida, 2006.
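As background for the clause-weighting discussion, the sketch below shows the bare bones of an additive clause-weighting local search: flip the variable that most reduces the weighted count of unsatisfied clauses, and increase the weights of unsatisfied clauses when no improving flip exists. It illustrates the general scheme only, not the specific heuristics contributed by the thesis; all names are ours.

```python
import random

def clause_weighting_search(clauses, n_vars, max_flips=100000, seed=0):
    """Bare-bones additive clause-weighting local search for CNF SAT."""
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
    weights = [1] * len(clauses)

    def sat(clause):
        return any(assign[abs(lit)] == (lit > 0) for lit in clause)

    def unsat_weight():
        return sum(w for c, w in zip(clauses, weights) if not sat(c))

    for _ in range(max_flips):
        current = unsat_weight()
        if current == 0:
            return assign                      # all clauses satisfied
        best_var, best_cost = None, current
        for v in assign:                       # evaluate every single flip
            assign[v] = not assign[v]
            cost = unsat_weight()
            assign[v] = not assign[v]
            if cost < best_cost:
                best_var, best_cost = v, cost
        if best_var is None:                   # local minimum: reweight
            for i, c in enumerate(clauses):
                if not sat(c):
                    weights[i] += 1
        else:
            assign[best_var] = not assign[best_var]
    return None                                # no model found in the flip budget
```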
APA, Harvard, Vancouver, ISO and other citation styles
5

Pham, Duc Nghia. "Modelling and Exploiting Structures in Solving Propositional Satisfiability Problems". Griffith University. Institute for Integrated and Intelligent Systems, 2006. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20070216.143447.

Full text of the source
Abstract:
Recent research has shown that it is often preferable to encode real-world problems as propositional satisfiability (SAT) problems and then solve using a general purpose SAT solver. However, much of the valuable information and structure of these realistic problems is flattened out and hidden inside the corresponding Conjunctive Normal Form (CNF) encodings of the SAT domain. Recently, systematic SAT solvers have been progressively improved and are now able to solve many highly structured practical problems containing millions of clauses. In contrast, state-of-the-art Stochastic Local Search (SLS) solvers still have difficulty in solving structured problems, apparently because they are unable to exploit hidden structure as well as the systematic solvers. In this thesis, we study and evaluate different ways to effectively recognise, model and efficiently exploit useful structures hidden in realistic problems. A summary of the main contributions is as follows: 1. We first investigate an off-line processing phase that applies resolution-based pre-processors to input formulas before running SLS solvers on these problems. We report an extensive empirical examination of the impact of SAT pre-processing on the performance of contemporary SLS techniques. It emerges that while all the solvers examined do indeed benefit from pre-processing, the effects of different pre-processors are far from uniform across solvers and across problems. Our results suggest that SLS solvers need to be equipped with multiple pre-processors if they are ever to match the performance of systematic solvers on highly structured problems. [Part of this study was published at the AAAI-05 conference]. 2. We then look at potential approaches to bridging the gap between SAT and constraint satisfaction problem (CSP) formalisms. One approach has been to develop a many-valued SAT formalism (MV-SAT) as an intermediate paradigm between SAT and CSP, and then to translate existing highly efficient SAT solvers to the MV-SAT domain. In this study, we follow a different route, developing SAT solvers that can automatically recognise CSP structure hidden in SAT encodings. This allows us to look more closely at how constraint weighting can be implemented in the SAT and CSP domains. Our experimental results show that a SAT-based mechanism to handle weights, together with a CSP-based method to instantiate variables, is superior to other combinations of SAT and CSP-based approaches. In addition, SLS solvers based on this many-valued weighting approach outperform other existing approaches to handle many-valued CSP structures. [Part of this study was published at the AAAI-05 conference]. 3. Finally, we propose and evaluate six different schemes to encode temporal reasoning problems, in particular the Interval Algebra (IA) networks, into SAT CNF formulas. We then empirically examine the performance of local search as well as systematic solvers on the new temporal SAT representations, in comparison with solvers that operate on native IA representations. Our empirical results show that zChaff (a state-of-the-art complete SAT solver) together with the best IA-to-SAT encoding scheme, can solve temporal problems significantly faster than existing IA solvers working on the equivalent native IA networks. [Part of this study was published at the CP-05 workshop].
APA, Harvard, Vancouver, ISO and other citation styles
6

Pham, Duc Nghia. "Modelling and Exploiting Structures in Solving Propositional Satisfiability Problems". Thesis, Griffith University, 2006. http://hdl.handle.net/10072/365503.

Full text of the source
Abstract:
Recent research has shown that it is often preferable to encode real-world problems as propositional satisfiability (SAT) problems and then solve using a general purpose SAT solver. However, much of the valuable information and structure of these realistic problems is flattened out and hidden inside the corresponding Conjunctive Normal Form (CNF) encodings of the SAT domain. Recently, systematic SAT solvers have been progressively improved and are now able to solve many highly structured practical problems containing millions of clauses. In contrast, state-of-the-art Stochastic Local Search (SLS) solvers still have difficulty in solving structured problems, apparently because they are unable to exploit hidden structure as well as the systematic solvers. In this thesis, we study and evaluate different ways to effectively recognise, model and efficiently exploit useful structures hidden in realistic problems. A summary of the main contributions is as follows: 1. We first investigate an off-line processing phase that applies resolution-based pre-processors to input formulas before running SLS solvers on these problems. We report an extensive empirical examination of the impact of SAT pre-processing on the performance of contemporary SLS techniques. It emerges that while all the solvers examined do indeed benefit from pre-processing, the effects of different pre-processors are far from uniform across solvers and across problems. Our results suggest that SLS solvers need to be equipped with multiple pre-processors if they are ever to match the performance of systematic solvers on highly structured problems. [Part of this study was published at the AAAI-05 conference]. 2. We then look at potential approaches to bridging the gap between SAT and constraint satisfaction problem (CSP) formalisms. One approach has been to develop a many-valued SAT formalism (MV-SAT) as an intermediate paradigm between SAT and CSP, and then to translate existing highly efficient SAT solvers to the MV-SAT domain. In this study, we follow a different route, developing SAT solvers that can automatically recognise CSP structure hidden in SAT encodings. This allows us to look more closely at how constraint weighting can be implemented in the SAT and CSP domains. Our experimental results show that a SAT-based mechanism to handle weights, together with a CSP-based method to instantiate variables, is superior to other combinations of SAT and CSP-based approaches. In addition, SLS solvers based on this many-valued weighting approach outperform other existing approaches to handle many-valued CSP structures. [Part of this study was published at the AAAI-05 conference]. 3. Finally, we propose and evaluate six different schemes to encode temporal reasoning problems, in particular the Interval Algebra (IA) networks, into SAT CNF formulas. We then empirically examine the performance of local search as well as systematic solvers on the new temporal SAT representations, in comparison with solvers that operate on native IA representations. Our empirical results show that zChaff (a state-of-the-art complete SAT solver) together with the best IA-to-SAT encoding scheme, can solve temporal problems significantly faster than existing IA solvers working on the equivalent native IA networks. [Part of this study was published at the CP-05 workshop].
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Institute for Integrated and Intelligent Systems
Full Text
APA, Harvard, Vancouver, ISO and other citation styles
7

Ferreira, Junior Valnir. "Improvements to Clause Weighting Local Search for Propositional Satisfiability". Thesis, Griffith University, 2007. http://hdl.handle.net/10072/365857.

Full text of the source
Abstract:
The propositional satisfiability (SAT) problem is of considerable theoretical and practical relevance to the artificial intelligence (AI) community and has been used to model many pervasive AI tasks such as default reasoning, diagnosis, planning, image interpretation, and constraint satisfaction. Computational methods for SAT have historically fallen into two broad categories: complete search and local search. Within the local search category, clause weighting methods are amongst the best alternatives for SAT, becoming particularly attractive on problems where a complete search is impractical or where there is a need to find good candidate solutions within a short time. The thesis is concerned with the study of improvements to clause weighting local search methods for SAT. The main contributions are: A component-based framework for the functional analysis of local search methods. A clause weighting local search heuristic that exploits longer-term memory arising from clause weight manipulations. The approach first learns which clauses are globally hardest to satisfy and then uses this information to treat these clauses differentially during weight manipulation [Ferreira Jr and Thornton, 2004]. A study of heuristic tie breaking in the domain of additive clause weighting local search methods, and the introduction of a competitive method that uses heuristic tie breaking instead of the random tie breaking approach used in most existing methods [Ferreira Jr and Thornton, 2005]. An evaluation of backbone guidance for clause weighting local search, and the introduction of backbone guidance to three state-of-the-art clause weighting local search methods [Ferreira Jr, 2006]. A new clause weighting local search method for SAT that successfully exploits synergies between the longer-term memory and tie breaking heuristics developed in the thesis to significantly improve on the performance of current state-of-the-art local search methods for SAT-encoded instances containing identifiable CSP structure. Portions of this thesis have appeared in the following refereed publications: Longer-term memory in clause weighting local search for SAT. In Proceedings of the 17th Australian Joint Conference on Artificial Intelligence, volume 3339 of Lecture Notes in Artificial Intelligence, pages 730-741, Cairns, Australia, 2004. Tie breaking in clause weighting local search for SAT. In Proceedings of the 18th Australian Joint Conference on Artificial Intelligence, volume 3809 of Lecture Notes in Artificial Intelligence, pages 70–81, Sydney, Australia, 2005. Backbone guided dynamic local search for propositional satisfiability. In Proceedings of the Ninth International Symposium on Artificial Intelligence and Mathematics, AI&M, Fort Lauderdale, Florida, 2006.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Institute for Integrated and Intelligent Systems
Full Text
APA, Harvard, Vancouver, ISO and other citation styles
8

Slater, Andrew. "Investigations into Satisfiability Search". The Australian National University. Research School of Information Sciences and Engineering, 2003. http://thesis.anu.edu.au./public/adt-ANU20040310.103258.

Full text of the source
Abstract:
In this dissertation we investigate theoretical aspects of some practical approaches used in solving and understanding search problems. We concentrate on the Satisfiability problem, which is a strong representative of search problem domains. The work develops general theoretical foundations to investigate some practical aspects of satisfiability search. This results in a better understanding of the fundamental mechanics of search algorithm construction and behaviour. A theory of choice or branching heuristics is presented, accompanied by results showing a correspondence of both parameterisations and performance when the method is compared to previous empirically motivated branching techniques. The logical foundations of the backtracking mechanism are explored alongside formulations for reasoning in relevant logics, which results in the development of a malleable backtracking mechanism that subsumes other intelligent backtracking proof construction techniques and allows the incorporation of proof rearrangement strategies. Moreover, empirical tests show that relevant backtracking outperforms all other forms of intelligent backtracking search tree construction methods. An investigation into modelling and generating real-world problem instances justifies a modularised problem model proposal, which is used experimentally to highlight the practicability of search algorithms for the proposed model and related domains.
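For orientation, the branching heuristics and backtracking mechanisms studied here sit inside a DPLL-style search. The following is a generic textbook-style skeleton with a pluggable branching heuristic; it is not the relevant-logic backtracking developed in the thesis, and the helper names are ours.

```python
from collections import Counter

def dpll(clauses, choose_literal):
    """Generic DPLL backtracking search; `choose_literal` is the branching heuristic."""
    clauses = [set(c) for c in clauses]

    # Unit propagation: simplify while unit clauses remain.
    changed = True
    while changed:
        changed = False
        unit = next((c for c in clauses if len(c) == 1), None)
        if unit:
            lit = next(iter(unit))
            new = []
            for c in clauses:
                if lit in c:
                    continue                  # clause satisfied by the unit literal
                c = c - {-lit}
                if not c:
                    return False              # conflict under current choices
                new.append(c)
            clauses, changed = new, True

    if not clauses:
        return True                           # all clauses satisfied
    lit = choose_literal(clauses)             # the branching heuristic decides here
    return (dpll(clauses + [{lit}], choose_literal) or
            dpll(clauses + [{-lit}], choose_literal))

# A simple "most frequent literal" branching heuristic, for illustration only.
def most_frequent(clauses):
    return Counter(l for c in clauses for l in c).most_common(1)[0][0]

print(dpll([[1, 2], [-1, 2], [-2, 3], [-3]], most_frequent))   # False: unsatisfiable
```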
APA, Harvard, Vancouver, ISO and other citation styles
9

Drake, Lyndon Paul. "Combining inference and backtracking search for the propositional satisfiability problem". Thesis, University of York, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.421496.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
10

Ishtaiwi, Abdelraouf. "Towards Effective Parameter-Free Clause Weighting Local Search for SAT". Thesis, Griffith University, 2008. http://hdl.handle.net/10072/366980.

Full text of the source
Abstract:
Recent research has shown that it is often preferable to encode real-world problems as propositional satisfiability (SAT) problems, and then solve them using general-purpose SAT solvers. However, most SAT solvers require the tuning of parameters in order to obtain optimum performance. Tuning these parameters usually takes a considerable amount of time, and even achieving average performance can require many runs with many different parameter settings. In this thesis we investigate various ways to improve the overall performance of local search solvers via new techniques that do not employ parameters and therefore take considerably less time for experimentation...
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Institute for Integrated and Intelligent Systems
Faculty of Engineering and Information Technology
Full Text
APA, Harvard, Vancouver, ISO and other citation styles
11

Polash, Md Masbaul Alam. "Exploiting Structures in Combinatorial Search". Thesis, Griffith University, 2017. http://hdl.handle.net/10072/370979.

Full text of the source
Abstract:
Combinatorial problems are believed to be hard in general; most of them are at least NP-complete. Constraint-based approaches employ some convenient and generic techniques to solve these problems. These approaches use basic definitions to model a problem and well-defined constraints to represent it. Although these approaches produce good results for smaller instances, they do not perform so well on large-sized problems. In this case, problem-specific information can help to increase the scalability of these approaches. Thus, in this research our focus is to find theoretically proven and heuristically promising properties that go beyond the basic definition of a problem. These properties help to model the problem efficiently and to reduce its effective search space. Also, during search these properties can be used as auxiliary or streamlined constraints to boost the efficiency of a search algorithm. These properties, along with efficient data structures and modern constraint-based techniques, can be used to handle challenging combinatorial problems. The effectiveness of these techniques is shown throughout this thesis by solving several combinatorial problems, such as optimal Golomb rulers, all-interval series and propositional satisfiability. Finding optimal Golomb rulers is an extremely challenging combinatorial problem. Different approaches have been used so far to handle this problem. In this thesis, we provide tight upper bounds for Golomb ruler marks and present a symmetry-based domain reduction technique. Using these along with tabu and configuration-checking meta-heuristics, we then develop a constraint-based multi-point local search algorithm to perform a satisfaction search for optimal Golomb rulers of specified length. We then present an algorithm to perform an optimisation search that minimises the length of a Golomb ruler by using the satisfaction search repeatedly. Experimental results demonstrate that our algorithms perform significantly better than the existing state-of-the-art algorithms. All-interval series is a standard benchmark problem for constraint satisfaction search. Different approaches have been used to date to generate all the solutions of this problem, but the search space that must be explored still remains huge. In this thesis, we present a constraint-directed backtracking-based tree search algorithm that performs efficient lazy checking rather than immediate constraint propagation. Moreover, we prove several key properties of all-interval series that help to reduce the search space significantly. The reduced search space essentially results in less backtracking. We also present scalable parallel versions of our algorithm that can exploit the advantages of having multi-core processors and even multiple computer systems. Experimental results show that our new algorithm exhibits better performance than the satisfiability-based state-of-the-art approach for this problem. The propositional satisfiability (SAT) problem is one of the most studied combinatorial problems in computer science. In recent years, local search approaches have become one of the most effective techniques for solving SAT problems. In this research, our focus is to exploit the hidden structures of SAT problems in local search. These structures are generated in the form of logic gates. Due to the detection of gates, both the number of independent variables and the number of external gates decrease, so the search space becomes narrower than before. But in some cases, the number of external gates or the number of independent variables may become too small to guide the search efficiently. In these cases, detecting only a few, but not all, types of gates will actually perform better. Thus, in this research we investigate the effect of detecting only the basic gates as well as all types of gates. A dependency lattice is then created to propagate the values of the independent variables. However, the detection of gates may lead to the problem of cycling in the dependency lattice. A new mechanism is proposed to remove those cycles as well. Moreover, we propose a new stagnation recovery technique to handle the cycling problem of local search. An experimental study on structured benchmarks shows that our new approach significantly outperforms the corresponding CNF-based implementations.
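The abstract assumes familiarity with Golomb rulers (sets of integer marks whose pairwise differences are all distinct). A small generic validity check, included purely for illustration and not drawn from the thesis:

```python
from itertools import combinations

def is_golomb_ruler(marks):
    """True iff all pairwise differences between the marks are distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

# The optimal 4-mark ruler of length 6: its differences 1..6 are all distinct.
print(is_golomb_ruler([0, 1, 4, 6]))   # True
print(is_golomb_ruler([0, 1, 2, 5]))   # False: the difference 1 appears twice
```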
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Institute for Integrated and Intelligent Systems
Science, Environment, Engineering and Technology
Full Text
APA, Harvard, Vancouver, ISO and other citation styles
12

Namasivayam, Gayathri. "ON SIMPLE BUT HARD RANDOM INSTANCES OF PROPOSITIONAL THEORIES AND LOGIC PROGRAMS". UKnowledge, 2011. http://uknowledge.uky.edu/gradschool_diss/132.

Full text of the source
Abstract:
In the last decade, Answer Set Programming (ASP) and Satisfiability (SAT) have been used to solve combinatorial search problems and practical applications in which they arise. In each of these formalisms, a tool called a solver is used to solve problems. A solver takes as input a specification of the problem – a logic program in the case of ASP, and a CNF theory for SAT – and produces as output a solution to the problem. Designing fast solvers is important for the success of this general-purpose approach to solving search problems. Classes of instances that pose challenges to solvers can help in this task. In this dissertation we create challenging yet simple benchmarks for existing solvers in ASP and SAT. We do so by providing models of simple logic programs as well as models of simple CNF theories. We then randomly generate logic programs as well as CNF theories from these models. Our experimental results show that computing answer sets of random logic programs, as well as models of random CNF theories, with carefully chosen parameters is hard for existing solvers. We generate random 2-literal logic programs, and our experiments show that it is hard for ASP solvers to obtain answer sets of purely negative and constraint-free programs, indicating the importance of these programs in the development of ASP solvers. An easy-hard-easy pattern emerges as we compute the average number of choice points generated by ASP solvers on randomly generated 2-literal programs with an increasing number of rules. We provide an explanation for the emergence of this pattern in these programs. We also theoretically study the probability of existence of an answer set for sparse and dense 2-literal programs. We consider simple classes of mixed Horn formulas with purely positive 2-literal clauses and purely negated Horn clauses. First we consider a class of mixed Horn formulas wherein each formula has m 2-literal clauses and k-literal negated Horn clauses. We show that formulas generated from the phase transition region of this class are hard for complete SAT solvers. The second class of mixed Horn formulas we consider is obtained from the completion of a certain class of random logic programs. We show the appearance of an easy-hard-easy pattern as we generate formulas from this class with increasing numbers of clauses, and that the formulas generated in the hard region can be used as benchmarks for testing incomplete SAT solvers.
APA, Harvard, Vancouver, ISO and other citation styles
13

Arora, Rajat. "Enhancing SAT-based Formal Verification Methods using Global Learning". Thesis, Virginia Tech, 2004. http://hdl.handle.net/10919/32987.

Full text of the source
Abstract:
With the advances in VLSI and System-On-Chip (SOC) technology, the complexity of hardware systems has increased manifold. Today, 70% of the design cost is spent in verifying these intricate systems. The two most widely used formal methods for design verification are Equivalence Checking and Model Checking. Equivalence Checking requires that the implementation circuit should be exactly equivalent to the specification circuit (golden model). In other words, for each possible input pattern, the implementation circuit should yield the same outputs as the specification circuit. Model checking, on the other hand, checks to see if the design holds certain properties, which in turn are indispensable for the proper functionality of the design. Complexities in both Equivalence Checking and Model Checking are exponential in the circuit size. In this thesis, we firstly propose a novel technique to improve SAT-based Combinational Equivalence Checking (CEC) and Bounded Model Checking (BMC). The idea is to perform a low-cost preprocessing that will statically induce global signal relationships into the original CNF formula of the circuit under verification and hence reduce the complexity of the SAT instance. This efficient and effective preprocessing quickly builds up the implication graph for the circuit under verification, yielding a large set of logic implications composed of direct, indirect and extended backward implications. These two-node implications (spanning time-frame boundaries) are converted into two-literal clauses, and added to the original CNF database. The added clauses constrain the search space of the SAT-solver engine, and provide correlation among the different variables, which enhances the Boolean Constraint Propagation (BCP). Experimental results on large and difficult ISCAS'85, ISCAS'89 (full scan) and ITC'99 (full scan) CEC instances and ISCAS'89 BMC instances show that our approach is independent of the state-of-the-art SAT-solver used, and that the added clauses help to achieve more than an order of magnitude speedup over the conventional approach. Also, comparison with Hyper-Resolution [Bacchus 03] suggests that our technique is much more powerful, yielding non-trivial clauses that significantly simplify the SAT instance complexity. Secondly, we propose a novel global learning technique that helps to identify highly non-trivial relationships among signals in the circuit netlist, thereby boosting the power of the existing implication engine. We call this new class of implications 'extended forward implications', and show their effectiveness through the additional untestable faults they help to identify. Thirdly, we propose a suite of lemmas and theorems to formalize global learning. We show through implementation that these theorems help to significantly simplify a generic CNF formula (from Formal Verification, Artificial Intelligence etc.) by identifying the necessary assignments, equivalent signals, complementary signals and other non-trivial implication relationships among its variables. We further illustrate through experimental results that the CNF formula simplification obtained using our tool outshines the simplification obtained using other preprocessors.
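The preprocessing described above adds learned two-node implications to the CNF as binary clauses. A generic sketch of that conversion follows; the identifiers are illustrative, not from the thesis.

```python
def implications_to_clauses(implications):
    """Convert learned implications a -> b into binary clauses (-a v b).

    `implications` is an iterable of (a, b) pairs of integer literals meaning
    "if literal a holds then literal b holds". Each pair becomes the
    two-literal clause {-a, b}, which constrains the SAT search space and
    gives Boolean Constraint Propagation (BCP) more to work with.
    """
    return [frozenset({-a, b}) for a, b in implications]

# Example: x1 -> x5 and x5 -> not x7 yield (-x1 v x5) and (-x5 v -x7).
print(implications_to_clauses([(1, 5), (5, -7)]))
```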
Master of Science
APA, Harvard, Vancouver, ISO and other citation styles
14

Lonlac, Konlac Jerry Garvin. "Contributions à la résolution du problème de la Satisfiabilité Propositionnelle". Thesis, Artois, 2014. http://www.theses.fr/2014ARTO0404/document.

Full text of the source
Abstract:
In this thesis, we focus on the propositional satisfiability problem (SAT). This fundamental problem in complexity theory is now used in many application domains such as planning, bioinformatics, and hardware and software verification. Despite the enormous progress observed in recent years in practical SAT solving, there is still a strong demand for efficient algorithms that can help to solve hard problems. Our contributions fit in this context. We focus on improving two of the key components of SAT solvers: clause learning and variable ordering heuristics. First, we propose a resolution method that exploits the hidden Boolean functions generally introduced during the CNF encoding phase to reduce the size of the clauses learned during the search. Then, we propose a resolution approach based on the intensification principle that circumscribes the variables on which the solver should branch in priority at each restart. This principle allows the solver to direct the search towards the most constrained sub-formula and to take advantage of the previous search, avoiding the exploration of the same part of the search space several times. In a third contribution, we propose a new clause learning scheme that derives a particular class of Bi-Asserting clauses, and we show that their exploitation significantly improves the performance of state-of-the-art CDCL SAT solvers. Finally, we consider the main learned-clause database reduction strategies used in the literature. Starting from two simple strategies, random deletion and size-bounded reduction, and motivated by the results obtained with them, we propose several new effective strategies that combine keeping short clauses (of size bounded by k) with randomly deleting clauses of size greater than k. These new strategies allow us to identify the learned clauses that are most relevant to the search process.
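The reduction strategies mentioned last combine keeping all short learned clauses with random deletion of longer ones. A minimal sketch of such a policy follows; the parameter names and the fixed keep ratio are our own illustrative choices, not values from the thesis.

```python
import random

def reduce_learned_clauses(learned, k=2, keep_ratio=0.5, rng=random):
    """Keep every learned clause of size <= k; keep a random fraction of the rest.

    `learned` is a list of clauses (tuples of literals). This mirrors the general
    idea of size-bounded retention combined with random deletion of long clauses.
    """
    short = [c for c in learned if len(c) <= k]
    long_ = [c for c in learned if len(c) > k]
    kept_long = rng.sample(long_, int(len(long_) * keep_ratio)) if long_ else []
    return short + kept_long
```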
APA, Harvard, Vancouver, ISO and other citation styles
15

Manthey, Norbert. "Towards Next Generation Sequential and Parallel SAT Solvers". Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2015. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-158672.

Full text of the source
Abstract:
This thesis focuses on improving SAT solving technology. The improvements focus on two major subjects: sequential SAT solving and parallel SAT solving. To better understand sequential SAT algorithms, the abstract reduction system Generic CDCL is introduced. With Generic CDCL, the soundness of solving techniques can be modeled. Next, the conflict driven clause learning algorithm is extended with three techniques (local look-ahead, local probing and all UIP learning) that allow more global reasoning during search. These techniques improve the performance of the sequential SAT solver Riss. Then, the formula simplification techniques of bounded variable addition, covered literal elimination and advanced cardinality constraint extraction are introduced. By using these techniques, the reasoning of the overall SAT solving tool chain becomes stronger than plain resolution. When these three techniques are used in the formula simplification tool Coprocessor before running Riss on a formula, the performance can be improved further. Due to the increasing number of cores in CPUs, the scalable parallel SAT solving approach iterative partitioning has been implemented in Pcasso for the multi-core architecture. Related work on parallel SAT solving has been studied to extract the main ideas that can improve Pcasso. Besides parallel formula simplification with bounded variable elimination, the major extension is the extended clause sharing level based clause tagging, which builds the basis for conflict driven node killing. The latter allows unsatisfiable search space partitions to be identified more reliably. Another improvement is to combine scattering and look-ahead as a superior search space partitioning function. In combination with Coprocessor, the introduced extensions increase the performance of the parallel solver Pcasso. The implemented system turns out to be scalable for the multi-core architecture. Hence iterative partitioning is interesting for future parallel SAT solvers. The implemented solvers participated in international SAT competitions. In 2013 and 2014 Pcasso showed good performance. Riss in combination with Coprocessor won several first, second and third prizes, including two Kurt Gödel Medals. Hence, the introduced algorithms improved modern SAT solving technology.
APA, Harvard, Vancouver, ISO and other citation styles
16

Belov, Anton. "Syntactic characterization of propositional satisfiability". 2005. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&res_dat=xri:pqdiss&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft_dat=xri:pqdiss:MR11752.

Full text of the source
Abstract:
Thesis (M. Sc.)--York University, 2005. Graduate Programme in Computer Science.
Typescript. Includes bibliographical references (leaves 90-94). Also available on the Internet.
APA, Harvard, Vancouver, ISO and other citation styles
17

Pan, Guoqiang. "Complexity and structural heuristics for propositional and quantified satisfiability". Thesis, 2007. http://hdl.handle.net/1911/20686.

Full text of the source
Abstract:
Decision procedures for various logics are used as general-purpose solvers in computer science. A particularly popular choice is propositional logic, which is simultaneously powerful enough to model problems in many application domains, including formal verification and planning, while at the same time simple enough to be efficiently solved for many practical cases. Similarly, there is also recent interest in using QBF, an extension of propositional logic, as a modeling language to be used in a similar fashion. The hope is that QBF, being a more powerful language, can compactly encode, and in turn, be used to solve, a larger range of applications. Still, propositional logic and QBF are respectively complete for the complexity classes NP and PSPACE; thus both can theoretically be considered intractable. A popular hypothesis is that real-world problems contain underlying structure that can be exploited by the decision procedures. In this dissertation, we study the impact of structural constraints (in the form of bounded width) and heuristics on the performance of propositional and QBF decision procedures. The results presented in this dissertation can be seen as a contrast showing how bounded width impacts propositional and quantified problems differently. Starting with a size bound on BDDs under bounded width, we proceed to compare symbolic decision procedures against the standard DPLL search-based approach for propositional logic, as well as compare different width-based heuristics for the symbolic approaches. In general, symbolic approaches for propositional satisfiability are only competitive for a small range of problems, and the theoretical tractability for the bounded-width case rarely applies in practice. However, the picture is very different for quantified satisfiability. To that end, we start with a series of "intractability in tractability" results which show that although the complexity of QBF with constant width and alternation is tractable, there is an inherent non-elementary blowup in the width and alternation depth such that a width bound that is slightly above constant leads to intractability. To contrast the theoretical intractability, we apply structural heuristics to a symbolic decision procedure for QBF and show that symbolic approaches complement search-based approaches quite well for QBF.
APA, Harvard, Vancouver, ISO and other citation styles
18

Slater, Andrew. "Investigations into Satisfiability Search". Phd thesis, 2004. http://hdl.handle.net/1885/48193.

Full text of the source
Abstract:
In this dissertation we investigate theoretical aspects of some practical approaches used in solving and understanding search problems. We concentrate on the Satisfiability problem, which is a strong representative of search problem domains. The work develops general theoretical foundations to investigate some practical aspects of satisfiability search. This results in a better understanding of the fundamental mechanics of search algorithm construction and behaviour. A theory of choice or branching heuristics is presented, accompanied by results showing a correspondence of both parameterisations and performance when the method is compared to previous empirically motivated branching techniques. The logical foundations of the backtracking mechanism are explored alongside formulations for reasoning in relevant logics, which results in the development of a malleable backtracking mechanism that subsumes other intelligent backtracking proof construction techniques and allows the incorporation of proof rearrangement strategies. Moreover, empirical tests show that relevant backtracking outperforms all other forms of intelligent backtracking search tree construction methods. An investigation into modelling and generating real-world problem instances justifies a modularised problem model proposal, which is used experimentally to highlight the practicability of search algorithms for the proposed model and related domains.
APA, Harvard, Vancouver, ISO and other citation styles
19

"A solution scheme of satisfiability problem by active usage of totally unimodularity property". 2003. http://library.cuhk.edu.hk/record=b5896100.

Full text of the source
Abstract:
by Mei Long.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2003.
Includes bibliographical references (leaves 93-98).
Abstracts in English and Chinese.
Table of Contents
Abstract
Acknowledgements
Chapter 1 Introduction
  1.1 Satisfiability Problem
  1.2 Motivation of the Research
  1.3 Overview of the Thesis
Chapter 2 Satisfiability Problem
  2.1 Satisfiability Problem
    2.1.1 Basic Definition
    2.1.2 Phase Transitions
  2.2 History
  2.3 The Basic Search Algorithm
  2.4 Some Improvements to the Basic Algorithm
    2.4.1 Satz by Chu-Min Li
    2.4.2 Heuristics and Local Search
    2.4.3 Relaxation
  2.5 Benchmarks
    2.5.1 Specific Problems
    2.5.2 Randomly Generated Problems
  2.6 Software and Internet Information for SAT Solving
    2.6.1 Stochastic Local Search Algorithms (incomplete)
    2.6.2 Systematic Search Algorithms (complete)
    2.6.3 Some Useful Links to SAT Related Sites
Chapter 3 Integer Programming Formulation for Logic Problem
  3.1 SAT Problem
  3.2 MAXSAT Problem
  3.3 Logical Inference Problem
  3.4 Weighted Exact Satisfiability Problem
Chapter 4 Integer Programming Formulation for SAT Problem
  4.1 From 3-CNF SAT Clauses to Zero-One IP Constraints
  4.2 Integer Programming Model for 3-SAT
  4.3 The Equivalence of the SAT and the IP
  4.4 Example
Chapter 5 Integer Solvability of Linear Programs
  5.1 Unimodularity
  5.2 Totally Unimodularity
  5.3 Some Results on Recognition of Linear Solvability of IP
Chapter 6 TU Based Matrix Research Results
  6.1 2x2 Matrix's TU Property
  6.2 Extended Integer Programming Model for SAT
  6.3 3x3 Matrix's TU Property
Chapter 7 Totally Unimodularity Based Branching-and-Bound Algorithm
  7.1 Introduction
    7.1.1 Enumeration Trees
    7.1.2 The Concept of Branch and Bound
  7.2 TU Based Branching Rule
    7.2.1 How to sort variables based on 2x2 submatrices
    7.2.2 How to sort the rest variables
  7.3 TU Based Bounding Rule
  7.4 TU Based Branch-and-Bound Algorithm
  7.5 Example
Chapter 8 Numerical Result
  8.1 Experimental Result
  8.2 Statistical Results of ILOG CPLEX
Chapter 9 Conclusions
  9.1 Contributions
  9.2 Future Work
Appendix A The Coefficient Matrix A for Example in Chapter 7
Appendix B The Detailed Numerical Information of Solution Process for Example in Chapter 7
Appendix C Experimental Result
  C.1 # of variables: 20, # of clauses: 91
  C.2 # of variables: 50, # of clauses: 218
  C.3 # of variables: 75, # of clauses: 325
  C.4 # of variables: 100, # of clauses: 430
Appendix D Experimental Result of ILOG CPLEX
  D.1 # of variables: 20, # of clauses: 91
  D.2 # of variables: 50, # of clauses: 218
  D.3 # of variables: 75, # of clauses: 325
  D.4 # of variables: 100, # of clauses: 430
Bibliography
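A matrix is totally unimodular (TU) when every square submatrix has determinant -1, 0 or 1; total unimodularity of the constraint matrix is what makes the linear relaxation of a 0-1 program integral. A brute-force check for small matrices, written only as an illustration and not taken from the thesis:

```python
from itertools import combinations

def det(m):
    """Determinant by Laplace expansion (fine for the tiny matrices used here)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def is_totally_unimodular(a):
    """True iff every square submatrix of `a` has determinant -1, 0 or 1."""
    rows, cols = len(a), len(a[0])
    for k in range(1, min(rows, cols) + 1):
        for ri in combinations(range(rows), k):
            for ci in combinations(range(cols), k):
                sub = [[a[i][j] for j in ci] for i in ri]
                if det(sub) not in (-1, 0, 1):
                    return False
    return True

print(is_totally_unimodular([[1, -1, 0], [0, 1, -1]]))   # True: an interval-like matrix
print(is_totally_unimodular([[1, 1], [1, -1]]))          # False: determinant is -2
```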
APA, Harvard, Vancouver, ISO and other citation styles
20

Katsirelos, George. "Nogood Processing in CSPs". Thesis, 2008. http://hdl.handle.net/1807/16737.

Full text of the source
Abstract:
The constraint satisfaction problem is an NP-complete problem that provides a convenient framework for expressing many computationally hard problems. In addition, domain knowledge can be efficiently integrated into CSPs, providing a potentially exponential speedup in some cases. The CSP is closely related to the satisfiability problem, and many of the techniques developed for one have been transferred to the other. However, the recent dramatic improvements in SAT solvers that result from learning clauses during search have not been transferred successfully to CSP solvers. In this thesis we propose that this failure is due to a fundamental restriction of nogood learning, which is intended to be analogous to clause learning in CSPs. This restriction means that nogood learning can exhibit a superpolynomial slowdown compared to clause learning in some cases. We show that the restriction can be lifted, delivering promising results. Integration of nogood learning in a CSP solver, however, presents an additional challenge, as a large body of domain knowledge is typically encoded in the form of domain-specific propagation algorithms called global constraints. Global constraints often completely eliminate the advantages of nogood learning. We demonstrate generic methods that partially alleviate the problem irrespective of the type of global constraint. We also show that more efficient methods can be integrated into specific global constraints and demonstrate the feasibility of this approach on several widely used global constraints.
APA, Harvard, Vancouver, ISO and other citation styles
21

Davies, Jessica. "Solving MAXSAT by Decoupling Optimization and Satisfaction". Thesis, 2013. http://hdl.handle.net/1807/43539.

Full text of the source
Abstract:
Many problems that arise in the real world are difficult to solve partly because they present computational challenges. Many of these challenging problems are optimization problems. In the real world we are generally interested not just in solutions but in the cost or benefit of these solutions according to different metrics. Hence, finding optimal solutions is often highly desirable and sometimes even necessary. The most effective computational approach for solving such problems is to first model them in a mathematical or logical language, and then solve them by applying a suitable algorithm. This thesis is concerned with developing practical algorithms to solve optimization problems modeled in a particular logical language, MAXSAT. MAXSAT is a generalization of the famous Satisfiability (SAT) problem that associates finite costs with falsifying various desired conditions, where these conditions are expressed as propositional clauses. Optimization problems expressed in MAXSAT typically have two interacting components: the logical relationships between the variables expressed by the clauses, and the optimization component involving minimizing the falsified clauses. The interaction between these components greatly contributes to the difficulty of solving MAXSAT. The main contribution of the thesis is a new hybrid approach, MaxHS, for solving MAXSAT. Our hybrid approach attempts to decouple these two components so that each can be solved with a different technology. In particular, we develop a hybrid solver that exploits two sophisticated technologies with divergent strengths: SAT for solving the logical component, and Integer Programming (IP) solvers for solving the optimization component. MaxHS automatically and incrementally splits the MAXSAT problem into two parts that are given to the SAT and IP solvers, which work together in a complementary way to find a MAXSAT solution. The thesis investigates several improvements to the MaxHS approach and provides empirical analysis of its behaviour in practice. The result is a new solver, MaxHS, that is shown to be the most robust existing solver for MAXSAT.
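Published descriptions of MaxHS present the decoupling as an implicit-hitting-set loop: a SAT solver finds cores over the clauses, and an IP solver computes a minimum-cost hitting set of the cores collected so far. The sketch below is our own highly simplified rendering of that idea; `sat_solve` and `min_cost_hitting_set` are placeholder callbacks standing in for real solver back-ends, not actual library APIs.

```python
def maxhs_like(hard, soft, sat_solve, min_cost_hitting_set):
    """Highly simplified hitting-set-style MAXSAT loop (illustration only).

    sat_solve(clauses) -> (True, model) or (False, core), where `core` is a
    set of soft clauses that cannot all be satisfied together with `hard`.
    min_cost_hitting_set(cores) -> a minimum-cost set of soft clauses hitting
    every collected core (e.g. computed by an IP solver). Both are placeholders.
    """
    cores = []
    while True:
        relaxed = min_cost_hitting_set(cores)            # soft clauses to drop
        ok, result = sat_solve(hard + [c for c in soft if c not in relaxed])
        if ok:
            return result, relaxed                       # model, falsified soft clauses
        cores.append(result)                             # remember the new core
```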
Style APA, Harvard, Vancouver, ISO itp.
22

Silverthorn, Bryan Connor. "A probabilistic architecture for algorithm portfolios". 2012. http://hdl.handle.net/2152/19828.

Pełny tekst źródła
Streszczenie:
Heuristic algorithms for logical reasoning are increasingly successful on computationally difficult problems such as satisfiability, and these solvers enable applications from circuit verification to software synthesis. Whether a problem instance can be solved, however, often depends in practice on whether the correct solver was selected and its parameters appropriately set. Algorithm portfolios leverage past performance data to automatically select solvers likely to perform well on a given instance. Existing portfolio methods typically select only a single solver for each instance. This dissertation develops and evaluates a more general portfolio method, one that computes complete solver execution schedules, including repeated runs of nondeterministic algorithms, by explicitly incorporating probabilistic reasoning into its operation. This modular architecture for probabilistic portfolios (MAPP) includes novel solutions to three issues central to portfolio operation: first, it estimates solver performance distributions from limited data by constructing a generative model; second, it integrates domain-specific information by predicting instances on which solvers exhibit similar performance; and, third, it computes execution schedules using an efficient and effective dynamic programming approximation. In a series of empirical comparisons designed to replicate past solver competitions, MAPP outperforms the most prominent alternative portfolio methods. Its success validates a principled approach to portfolio operation, offers a tool for tackling difficult problems, and opens a path forward in algorithm portfolio design.
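The scheduling component can be illustrated with a toy calculation; the Python sketch below is not MAPP itself, it merely scores a candidate schedule by the probability that at least one of its runs succeeds, assuming independent runs and given (hypothetical) per-run success estimates.

def schedule_success_probability(schedule, success_prob):
    # schedule: list of (solver, budget) actions to run in sequence.
    # success_prob(solver, budget): assumed estimate of the probability that a
    # single run of `solver` within `budget` seconds solves the instance.
    p_all_fail = 1.0
    for solver, budget in schedule:
        p_all_fail *= 1.0 - success_prob(solver, budget)
    return 1.0 - p_all_fail

# Two restarts of a randomized local-search solver plus one run of a CDCL solver.
estimates = {("walksat", 10): 0.4, ("cdcl", 20): 0.5}
schedule = [("walksat", 10), ("walksat", 10), ("cdcl", 20)]
print(schedule_success_probability(schedule, lambda s, b: estimates[(s, b)]))  # about 0.82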
Style APA, Harvard, Vancouver, ISO itp.
23

Lierler, Yuliya. "SAT-based answer set programming". Thesis, 2010. http://hdl.handle.net/2152/ETD-UT-2010-05-888.

Pełny tekst źródła
Streszczenie:
Answer set programming (ASP) is a declarative programming paradigm oriented towards difficult combinatorial search problems. Syntactically, ASP programs look like Prolog programs, but solutions are represented in ASP by sets of atoms, not by substitutions as in Prolog. Answer set systems, such as Smodels, Smodelscc, and DLV, compute answer sets of a given program in the sense of the answer set (stable model) semantics. This is different from the functionality of Prolog systems, which determine when a given query is true relative to a given logic program. ASP has been applied to many areas of science and technology, from the design of a decision support system for the Space Shuttle to graph-theoretic problems arising in zoology and linguistics. The "native" answer set systems mentioned above are based on specialized search procedures. Usually these procedures are described fairly informally with the use of pseudocode. We propose an alternative approach to describing the algorithms of answer set solvers. In this approach we specify what "states of computation" are and which transitions between states are allowed. In this way we define a directed graph such that every execution of a procedure corresponds to a path in this graph. This allows us to model the algorithms of answer set solvers by a mathematically simple and elegant object, a graph, rather than by a collection of pseudocode statements. We use this abstract framework to describe and prove the correctness of the answer set solver Smodels, and also of Smodelscc, which enhances the former with learning and backjumping techniques. Answer sets of a tight program can be found by running a SAT solver on the program's completion, because for such a program the answer sets are in one-to-one correspondence with the models of the completion. SAT is one of the most widely studied problems in computational logic, and many efficient SAT procedures have been developed over the last decade. Using SAT solvers for computing answer sets allows us to take advantage of the advances in the SAT area. For a nontight program it is still the case that each answer set corresponds to a model of the program's completion, but not vice versa. We show how to modify the search method typically used in SAT solvers to allow testing models of the completion, and how to employ learning to use the testing information to guide the search. We develop a new SAT-based answer set solver, called Cmodels, based on this idea. We develop an abstract graph-based framework for describing SAT-based answer set solvers and use it to represent the Cmodels algorithm and to demonstrate its correctness. Such representations allow us to better understand the similarities and differences between native and SAT-based answer set solvers. We formally compare the Smodels algorithm with a variant of the Cmodels algorithm without learning. Abstract frameworks for describing native and SAT-based answer set solvers facilitate the development of new systems. We propose and implement an answer set solver called SUP that can be seen as a combination of the computational ideas behind Cmodels and Smodels. Like Cmodels, SUP operates by computing a sequence of models of the completion of the given program, but it does not form the completion itself. Instead, SUP runs the Atleast algorithm, one of the main building blocks of the Smodels procedure. Both systems developed in this dissertation, Cmodels and SUP, proved to be competitive answer set programming systems.
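The completion construction mentioned in the abstract is simple enough to sketch; the Python snippet below (an added illustration, not Cmodels) builds Clark's completion of a normal logic program whose rules are given as (head, positive body, negative body) triples, producing one propositional equivalence per atom.

def completion(rules, atoms):
    # rules: list of (head, positive_body, negative_body) triples.
    formulas = []
    for a in atoms:
        bodies = []
        for head, pos, neg in rules:
            if head != a:
                continue
            lits = list(pos) + ["not " + b for b in neg]
            bodies.append(" & ".join(lits) if lits else "true")
        rhs = " | ".join("(" + b + ")" for b in bodies) if bodies else "false"
        formulas.append(a + " <-> " + rhs)
    return formulas

# Program: p :- q, not r.   q.   (no rule for r)
rules = [("p", ["q"], ["r"]), ("q", [], [])]
for f in completion(rules, ["p", "q", "r"]):
    print(f)
# p <-> (q & not r)
# q <-> (true)
# r <-> false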
Style APA, Harvard, Vancouver, ISO itp.
24

Χαρατσάρης, Δημήτριος. "Υλοποίηση διαδικτυακού προσομοιωτή για αλγορίθμους επίλυσης προβλημάτων SAT". Thesis, 2012. http://hdl.handle.net/10889/5754.

Pełny tekst źródła
Streszczenie:
This diploma thesis deals with SAT solvers, algorithms for the Boolean satisfiability problem. The work was carried out in the Wire Communications Laboratory of the Department of Electrical and Computer Engineering of the University of Patras. Its aim is the creation of a simulator for these algorithms that can be accessed by anyone over the Internet. The thesis first gives an introduction to Artificial Intelligence, and more specifically to Propositional Logic, together with the background needed to understand the SAT problem and the techniques for solving it. The simulator itself was implemented in Java.
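For context, the kind of procedure such a simulator typically animates is the classic DPLL algorithm with unit propagation; a compact Python sketch is given below (the thesis' own implementation is in Java), with clauses written as lists of non-zero integer literals in the DIMACS style.

def dpll(clauses, assignment=None):
    # clauses: list of clauses, each a list of non-zero integer literals.
    assignment = dict(assignment or {})
    # Unit propagation: repeat until no new forced assignment appears.
    changed = True
    while changed:
        changed = False
        simplified = []
        for clause in clauses:
            remaining = []
            satisfied = False
            for lit in clause:
                value = assignment.get(abs(lit))
                if value is None:
                    remaining.append(lit)
                elif (lit > 0) == value:
                    satisfied = True
                    break
            if satisfied:
                continue
            if not remaining:
                return None                       # empty clause: conflict
            if len(remaining) == 1:
                assignment[abs(remaining[0])] = remaining[0] > 0
                changed = True
            simplified.append(remaining)
        clauses = simplified
    if not clauses:
        return assignment                         # every clause is satisfied
    var = abs(clauses[0][0])                      # branch on an unassigned variable
    for value in (True, False):
        model = dpll(clauses, {**assignment, var: value})
        if model is not None:
            return model
    return None

print(dpll([[1, 2], [-1, 2], [-2, 3]]))           # {1: True, 2: True, 3: True}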
Style APA, Harvard, Vancouver, ISO itp.
25

Manthey, Norbert. "Towards Next Generation Sequential and Parallel SAT Solvers". Doctoral thesis, 2014. https://tud.qucosa.de/id/qucosa%3A28471.

Pełny tekst źródła
Streszczenie:
This thesis focuses on improving SAT solving technology. The improvements concern two major subjects: sequential SAT solving and parallel SAT solving. To better understand sequential SAT algorithms, the abstract reduction system Generic CDCL is introduced. With Generic CDCL, the soundness of solving techniques can be modeled. Next, the conflict-driven clause learning algorithm is extended with three techniques, local look-ahead, local probing and all-UIP learning, which allow more global reasoning during search. These techniques improve the performance of the sequential SAT solver Riss. Then, the formula simplification techniques bounded variable addition, covered literal elimination and advanced cardinality constraint extraction are introduced. By using these techniques, the reasoning of the overall SAT solving tool chain becomes stronger than plain resolution. When these three techniques are applied in the formula simplification tool Coprocessor before Riss is used to solve a formula, the performance can be improved further. Due to the increasing number of cores in CPUs, the scalable parallel SAT solving approach iterative partitioning has been implemented in Pcasso for the multi-core architecture. Related work on parallel SAT solving has been studied to extract the main ideas that can improve Pcasso. Besides parallel formula simplification with bounded variable elimination, the major extension is extended clause sharing with level-based clause tagging, which forms the basis for conflict-driven node killing. The latter allows unsatisfiable search-space partitions to be identified more reliably. Another improvement is to combine scattering and look-ahead into a superior search-space partitioning function. In combination with Coprocessor, the introduced extensions increase the performance of the parallel solver Pcasso. The implemented system turns out to be scalable on the multi-core architecture; hence iterative partitioning is a promising approach for future parallel SAT solvers. The implemented solvers participated in international SAT competitions. In 2013 and 2014 Pcasso showed good performance, and Riss in combination with Coprocessor won several first, second and third prizes, including two Kurt Gödel Medals. Hence, the introduced algorithms improved modern SAT solving technology.
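As a point of reference for the phrase "stronger than plain resolution", the resolution rule itself can be written in a few lines of Python; this is a generic textbook sketch added for illustration, not one of the techniques introduced in the thesis.

def resolve(clause_a, clause_b, var):
    # Return the resolvent of two clauses (sets of non-zero integer literals)
    # on `var`, or None if the resolvent is a tautology.
    assert var in clause_a and -var in clause_b
    resolvent = (clause_a - {var}) | (clause_b - {-var})
    if any(-lit in resolvent for lit in resolvent):
        return None
    return resolvent

print(resolve({1, 2}, {-1, 3}, 1))     # {2, 3}
print(resolve({1, 2}, {-1, -2}, 1))    # None: the resolvent would contain 2 and -2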
Style APA, Harvard, Vancouver, ISO itp.