
Journal articles on the topic "Constrained horn clauses"

Check out the 39 best journal articles on the topic "Constrained horn clauses".

Next to every article in the list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

De Angelis, Emanuele, Fabio Fioravanti, Alberto Pettorossi, and Maurizio Proietti. "Proving correctness of imperative programs by linearizing constrained Horn clauses". Theory and Practice of Logic Programming 15, no. 4-5 (July 2015): 635–50. http://dx.doi.org/10.1017/s1471068415000289.

Abstract:
We present a method for verifying the correctness of imperative programs which is based on the automated transformation of their specifications. Given a program prog, we consider a partial correctness specification of the form {ϕ} prog {ψ}, where the assertions ϕ and ψ are predicates defined by a set Spec of possibly recursive Horn clauses with linear arithmetic (LA) constraints in their premise (also called constrained Horn clauses). The verification method consists in constructing a set PC of constrained Horn clauses whose satisfiability implies that {ϕ} prog {ψ} is valid. We highlight some limitations of state-of-the-art constrained Horn clause solving methods, here called LA-solving methods, which prove the satisfiability of the clauses by looking for linear arithmetic interpretations of the predicates. In particular, we prove that there exist some specifications that cannot be proved valid by any of those LA-solving methods. These specifications require the proof of satisfiability of a set PC of constrained Horn clauses that contain nonlinear clauses (that is, clauses with more than one atom in their premise). Then, we present a transformation, called linearization, that converts PC into a set of linear clauses (that is, clauses with at most one atom in their premise). We show that several specifications that could not be proved valid by LA-solving methods can be proved valid after linearization. We also present a strategy for performing linearization in an automatic way and we report on some experimental results obtained by using a preliminary implementation of our method.
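The linear/nonlinear distinction at the heart of this abstract is easy to make concrete. A minimal Python sketch (our own illustration, not the paper's method): the Fibonacci relation below needs a nonlinear clause (two atoms in the premise), and the least model of the clause set can be computed bottom-up over a bounded domain.

```python
from itertools import product

# CHCs for fib, written informally:
#   fib(0, 0).
#   fib(1, 1).
#   fib(n, a + b) :- n >= 2, fib(n - 1, a), fib(n - 2, b).   # nonlinear
def is_linear(body_atoms):
    """A clause is linear if its premise contains at most one atom."""
    return len(body_atoms) <= 1

assert is_linear([]) and is_linear([("fib", 5, 5)])
assert not is_linear([("fib", 4, "a"), ("fib", 3, "b")])

def fib_least_model(bound=10):
    """Naive bottom-up (least-model) evaluation of the fib clauses,
    bounded to n <= bound so the fixpoint is finite."""
    facts = {("fib", 0, 0), ("fib", 1, 1)}
    changed = True
    while changed:
        changed = False
        # Pair facts as fib(n-1, a) and fib(n-2, b) to derive fib(n, a+b).
        for (_, n1, a), (_, n2, b) in product(facts, repeat=2):
            n = n1 + 1
            if n2 == n - 2 and 2 <= n <= bound:
                fact = ("fib", n, a + b)
                if fact not in facts:
                    facts.add(fact)
                    changed = True
    return facts

model = fib_least_model()
assert ("fib", 10, 55) in model
```

The linearization transformation the abstract describes replaces such multi-atom premises by single-atom ones while preserving satisfiability of the clause set.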
2

Kafle, Bishoksan, John P. Gallagher, and Pierre Ganty. "Tree dimension in verification of constrained Horn clauses". Theory and Practice of Logic Programming 18, no. 2 (March 2018): 224–51. http://dx.doi.org/10.1017/s1471068418000030.

Abstract:
In this paper, we show how the notion of tree dimension can be used in the verification of constrained Horn clauses (CHCs). The dimension of a tree is a numerical measure of its branching complexity and the concept here applies to Horn clause derivation trees. Derivation trees of dimension zero correspond to derivations using linear CHCs, while trees of higher dimension arise from derivations using non-linear CHCs. We show how to instrument CHC predicates with an extra argument for the dimension, allowing a CHC verifier to reason about bounds on the dimension of derivations. Given a set of CHCs P, we define a transformation of P yielding a dimension-bounded set of CHCs P≤k. The set of derivations for P≤k consists of the derivations for P that have dimension at most k. We also show how to construct a set of clauses denoted P>k whose derivations have dimension exceeding k. We then present algorithms using these constructions to decompose a CHC verification problem. One variation of this decomposition considers derivations of successively increasing dimension. The paper includes descriptions of implementations and experimental results.
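The dimension measure used here is the Horton–Strahler number of the derivation tree. A minimal recursive sketch (our own illustration; trees are encoded as nested lists of children):

```python
def dimension(tree):
    """Tree dimension (Horton–Strahler number): a leaf has dimension 0;
    an internal node has the maximum dimension of its children, plus one
    if at least two children attain that maximum."""
    if not tree:               # [] encodes a leaf
        return 0
    dims = [dimension(child) for child in tree]
    m = max(dims)
    return m + 1 if dims.count(m) >= 2 else m

chain = [[[]]]                 # a derivation using only linear clauses: a chain
full = [[[], []], [[], []]]    # a complete binary derivation tree of height 2
assert dimension(chain) == 0
assert dimension(full) == 2
```

This matches the abstract: dimension-zero trees are exactly the derivations of linear CHCs, and balanced branching (from non-linear clauses) drives the dimension up.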
3

De Angelis, Emanuele, Fabio Fioravanti, Alberto Pettorossi, and Maurizio Proietti. "Solving Horn Clauses on Inductive Data Types Without Induction". Theory and Practice of Logic Programming 18, no. 3-4 (July 2018): 452–69. http://dx.doi.org/10.1017/s1471068418000157.

Abstract:
We address the problem of verifying the satisfiability of Constrained Horn Clauses (CHCs) based on theories of inductively defined data structures, such as lists and trees. We propose a transformation technique whose objective is the removal of these data structures from CHCs, hence reducing their satisfiability to a satisfiability problem for CHCs on integers and booleans. We propose a transformation algorithm and identify a class of clauses where it always succeeds. We also consider an extension of that algorithm, which combines clause transformation with reasoning on integer constraints. Via an experimental evaluation we show that our technique greatly improves the effectiveness of applying the Z3 solver to CHCs. We also show that our verification technique, based on CHC transformation followed by CHC solving, is competitive with respect to CHC solvers extended with induction.
4

Cathcart Burn, Toby, C. H. Luke Ong, and Steven J. Ramsay. "Higher-order constrained horn clauses for verification". Proceedings of the ACM on Programming Languages 2, POPL (January 2018): 1–28. http://dx.doi.org/10.1145/3158099.

5

Kafle, Bishoksan, John P. Gallagher, Graeme Gange, Peter Schachte, Harald Søndergaard, and Peter J. Stuckey. "An iterative approach to precondition inference using constrained Horn clauses". Theory and Practice of Logic Programming 18, no. 3-4 (July 2018): 553–70. http://dx.doi.org/10.1017/s1471068418000091.

Abstract:
We present a method for automatic inference of conditions on the initial states of a program that guarantee that the safety assertions in the program are not violated. Constrained Horn clauses (CHCs) are used to model the program and assertions in a uniform way, and we use standard abstract interpretations to derive an over-approximation of the set of unsafe initial states. The precondition then is the constraint corresponding to the complement of that set, under-approximating the set of safe initial states. This idea of complementation is not new, but previous attempts to exploit it have suffered from a loss of precision. Here we develop an iterative specialisation algorithm to give more precise, and in some cases optimal, safety conditions. The algorithm combines existing transformations, namely constraint specialisation, partial evaluation and a trace elimination transformation. The last two of these transformations perform polyvariant specialisation, leading to disjunctive constraints which improve precision. The algorithm is implemented and tested on a benchmark suite of programs from the literature in precondition inference and software verification competitions.
6

Zhou, Qi, David Heath, and William Harris. "Solving Constrained Horn Clauses Using Dependence-Disjoint Expansions". Electronic Proceedings in Theoretical Computer Science 278 (September 12, 2018): 3–18. http://dx.doi.org/10.4204/eptcs.278.3.

7

Satake, Yuki, Hiroshi Unno, and Hinata Yanagi. "Probabilistic Inference for Predicate Constraint Satisfaction". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 02 (April 3, 2020): 1644–51. http://dx.doi.org/10.1609/aaai.v34i02.5526.

Abstract:
In this paper, we present a novel constraint solving method for a class of predicate Constraint Satisfaction Problems (pCSP) where each constraint is represented by an arbitrary clause of first-order predicate logic over predicate variables. The class of pCSP properly subsumes the well-studied class of Constrained Horn Clauses (CHCs) where each constraint is restricted to a Horn clause. The class of CHCs has been widely applied to verification of linear-time safety properties of programs in different paradigms. In this paper, we show that pCSP further widens the applicability to verification of branching-time safety properties of programs that exhibit finitely-branching non-determinism. Solving pCSP (and CHCs), however, is challenging because the search space of solutions is often very large (or unbounded), high-dimensional, and non-smooth. To address these challenges, our method naturally combines techniques studied separately in different literatures: counterexample guided inductive synthesis (CEGIS) and probabilistic inference in graphical models. We have implemented the presented method and obtained promising results on existing benchmarks as well as new ones that are beyond the scope of existing CHC solvers.
8

De Angelis, Emanuele, Fabio Fioravanti, Alberto Pettorossi, and Maurizio Proietti. "Predicate Pairing for program verification". Theory and Practice of Logic Programming 18, no. 2 (December 4, 2017): 126–66. http://dx.doi.org/10.1017/s1471068417000497.

Abstract:
It is well known that the verification of partial correctness properties of imperative programs can be reduced to the satisfiability problem for constrained Horn clauses (CHCs). However, state-of-the-art solvers for constrained Horn clauses (or CHC solvers) based on predicate abstraction are sometimes unable to verify satisfiability because they look for models that are definable in a given class 𝓐 of constraints, called 𝓐-definable models. We introduce a transformation technique, called Predicate Pairing, which is able, in many interesting cases, to transform a set of clauses into an equisatisfiable set whose satisfiability can be proved by finding an 𝓐-definable model, and hence can be effectively verified by a state-of-the-art CHC solver. In particular, we prove that, under very general conditions on 𝓐, the unfold/fold transformation rules preserve the existence of an 𝓐-definable model, that is, if the original clauses have an 𝓐-definable model, then the transformed clauses have an 𝓐-definable model. The converse does not hold in general, and we provide suitable conditions under which the transformed clauses have an 𝓐-definable model if and only if the original ones have an 𝓐-definable model. Then, we present a strategy, called Predicate Pairing, which guides the application of the transformation rules with the objective of deriving a set of clauses whose satisfiability problem can be solved by looking for 𝓐-definable models. The Predicate Pairing (PP) strategy introduces a new predicate defined by the conjunction of two predicates occurring in the original set of clauses, together with a conjunction of constraints. We will show through some examples that an 𝓐-definable model may exist for the new predicate even if it does not exist for its defining atomic conjuncts.
We will also present some case studies showing that Predicate Pairing plays a crucial role in the verification of relational properties of programs, that is, properties relating two programs (such as program equivalence) or two executions of the same program (such as non-interference). Finally, we perform an experimental evaluation of the proposed techniques to assess the effectiveness of Predicate Pairing in increasing the power of CHC solving.
9

K, Hari Govind V., Sharon Shoham, and Arie Gurfinkel. "Solving constrained Horn clauses modulo algebraic data types and recursive functions". Proceedings of the ACM on Programming Languages 6, POPL (January 16, 2022): 1–29. http://dx.doi.org/10.1145/3498722.

Abstract:
This work addresses the problem of verifying imperative programs that manipulate data structures, e.g., Rust programs. Data structures are usually modeled by Algebraic Data Types (ADTs) in verification conditions. Inductive invariants of such programs often require recursively defined functions (RDFs) to represent abstractions of data structures. From the logic perspective, this reduces to solving Constrained Horn Clauses (CHCs) modulo both ADT and RDF. The underlying logic with RDFs is undecidable. Thus, even verifying a candidate inductive invariant is undecidable. Similarly, IC3-based algorithms for solving CHCs lose their progress guarantee: they may not find counterexamples when the program is unsafe. We propose a novel IC3-inspired algorithm Racer for solving CHCs modulo ADT and RDF (i.e., automatically synthesizing inductive invariants, as opposed to only verifying them as is done in deductive verification). Racer ensures progress despite the undecidability of the underlying theory, and is guaranteed to terminate with a counterexample for unsafe programs. It works with a general class of RDFs over ADTs called catamorphisms. The key idea is to represent catamorphisms both as CHCs, via relationification, and as RDFs, using novel abstractions. Encoding catamorphisms as CHCs allows learning inductive properties of catamorphisms, as well as preserving unsatisfiability of the original CHCs despite the use of RDF abstractions, whereas encoding catamorphisms as RDFs allows unfolding the recursive definition, and relying on it in solutions. Abstractions ensure that the underlying theory remains decidable. We implement our approach in Z3 and show that it works well in practice.
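A catamorphism is simply a fold over an algebraic data type. A small Python sketch (names and encoding are ours) of the kind of recursively defined functions such abstractions summarize:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cons:
    """A cons cell of an integer-list ADT; None encodes the empty list."""
    head: int
    tail: Optional["Cons"]

def cata(nil_case, cons_case, xs):
    """List catamorphism: replace Nil by nil_case and fold every Cons
    node bottom-up with cons_case."""
    if xs is None:
        return nil_case
    return cons_case(xs.head, cata(nil_case, cons_case, xs.tail))

lst = Cons(1, Cons(2, Cons(3, None)))
length = cata(0, lambda _h, acc: 1 + acc, lst)   # length as a catamorphism
total = cata(0, lambda h, acc: h + acc, lst)     # sum as a catamorphism
assert (length, total) == (3, 6)
```

Once every list argument is summarized by such catamorphisms (e.g. replacing a list by its length and sum), the remaining reasoning is over integers, which is what keeps the underlying theory decidable for the solver.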
10

De Angelis, Emanuele, Fabio Fioravanti, Alberto Pettorossi, and Maurizio Proietti. "Satisfiability of constrained Horn clauses on algebraic data types: A transformation-based approach". Journal of Logic and Computation 32, no. 2 (February 4, 2022): 402–42. http://dx.doi.org/10.1093/logcom/exab090.

Abstract:
We address the problem of checking the satisfiability of constrained Horn clauses (CHCs) defined on algebraic data types (ADTs), such as lists and trees. We propose a new technique for transforming CHCs defined on ADTs into CHCs where the arguments of the predicates have only basic types, such as integers and booleans. Thus, our technique avoids, during satisfiability checking, the explicit use of proof rules based on induction over the ADTs. The main extension over previous techniques for ADT removal is a new transformation rule, called differential replacement, which allows us to introduce auxiliary predicates, whose definitions correspond to lemmas that are used when making inductive proofs. We present an algorithm that performs the automatic removal of ADTs by applying the new rule, together with the traditional folding/unfolding rules. We prove that, under suitable hypotheses, the set of the transformed clauses is satisfiable if and only if so is the set of the original clauses. By an experimental evaluation, we show that the use of the new rule significantly improves the effectiveness of ADT removal. We also show that our approach is competitive with respect to tools that extend CHC solvers with the use of inductive rules.
11

Jochems, Jerome. "Reducing Higher-order Recursion Scheme Equivalence to Coinductive Higher-order Constrained Horn Clauses". Electronic Proceedings in Theoretical Computer Science 344 (September 13, 2021): 36–64. http://dx.doi.org/10.4204/eptcs.344.4.

12

Farka, František, Ekaterina Komendantskya, and Kevin Hammond. "Proof-relevant Horn Clauses for Dependent Type Inference and Term Synthesis". Theory and Practice of Logic Programming 18, no. 3-4 (July 2018): 484–501. http://dx.doi.org/10.1017/s1471068418000212.

Abstract:
First-order resolution has been used for type inference for many years, including in Hindley-Milner type inference, type-classes, and constrained data types. Dependent types are a new trend in functional languages. In this paper, we show that proof-relevant first-order resolution can play an important role in automating type inference and term synthesis for dependently typed languages. We propose a calculus that translates type inference and term synthesis problems in a dependently typed language to a logic program and a goal in the proof-relevant first-order Horn clause logic. The computed answer substitution and proof term then provide a solution to the given type inference and term synthesis problem. We prove the decidability and soundness of our method.
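The "proof-relevant" idea, computing a proof term alongside the yes/no answer, can be illustrated in the propositional case. A sketch (ours, and much simpler than the paper's calculus for dependent types): forward chaining over propositional Horn clauses that records a proof term for every derived atom.

```python
def prove(clauses, goal):
    """Forward chaining over propositional Horn clauses, recording a
    proof term for each derived atom. A clause is (name, head, body).
    Returns the proof term for goal, or None if it is not derivable."""
    proofs = {}
    changed = True
    while changed:
        changed = False
        for name, head, body in clauses:
            if head not in proofs and all(b in proofs for b in body):
                # Proof term: the clause name applied to the subproofs.
                proofs[head] = (name, [proofs[b] for b in body])
                changed = True
    return proofs.get(goal)

clauses = [
    ("ax_p", "p", []),
    ("ax_q", "q", []),
    ("r_of_pq", "r", ["p", "q"]),
]
assert prove(clauses, "r") == ("r_of_pq", [("ax_p", []), ("ax_q", [])])
```

In the paper's setting the analogous proof term is a program term witnessing the type inference or synthesis problem, not just a derivation record.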
13

Matsushita, Yusuke, Takeshi Tsukada, and Naoki Kobayashi. "RustHorn: CHC-based Verification for Rust Programs". ACM Transactions on Programming Languages and Systems 43, no. 4 (December 31, 2021): 1–54. http://dx.doi.org/10.1145/3462205.

Abstract:
Reduction to satisfiability of constrained Horn clauses (CHCs) is a widely studied approach to automated program verification. Current CHC-based methods, however, do not work very well for pointer-manipulating programs, especially those with dynamic memory allocation. This article presents a novel reduction of pointer-manipulating Rust programs into CHCs, which clears away pointers and memory states by leveraging Rust’s guarantees on permission. We formalize our reduction for a simplified core of Rust and prove its soundness and completeness. We have implemented a prototype verifier for a subset of Rust and confirmed the effectiveness of our method.
14

Mordvinov, Dmitry A. "Property-Directed Inference of Relational Invariants". Modeling and Analysis of Information Systems 26, no. 4 (December 27, 2019): 550–71. http://dx.doi.org/10.18255/1818-1015-2019-4-550-571.

Abstract:
Property Directed Reachability (PDR) is an efficient and scalable approach to solving systems of symbolic constraints also known as Constrained Horn Clauses (CHC). In the case of non-linear CHCs, which may arise, e.g., from relational verification tasks, PDR aims to infer an inductive invariant for each uninterpreted predicate. However, in many practical cases this reasoning is not successful, as invariants should be derived for groups of predicates instead of individual predicates. The article describes a novel algorithm that identifies these groups automatically and complements the existing PDR technique. The key feature of the algorithm is that it does not require a possibly expensive synchronization transformation over the system of CHCs. We have implemented the algorithm on top of an up-to-date CHC solver, Spacer. Our experimental evaluation shows that for some CHC systems, on which existing solvers diverge, our tool is able to discover relational invariants.
15

Faella, Marco, and Gennaro Parlato. "Reachability Games Modulo Theories with a Bounded Safety Player". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 5 (June 26, 2023): 6330–37. http://dx.doi.org/10.1609/aaai.v37i5.25779.

Abstract:
Solving reachability games is a fundamental problem for the analysis, verification, and synthesis of reactive systems. We consider logical reachability games modulo theories (in short, GMTs), i.e., infinite-state games whose rules are defined by logical formulas over a multi-sorted first-order theory. Our games have an asymmetric constraint: the safety player has at most k possible moves from each game configuration, whereas the reachability player has no such limitation. Even though determining the winner of such a GMT is undecidable, it can be reduced to the well-studied problem of checking the satisfiability of a system of constrained Horn clauses (CHCs), for which many off-the-shelf solvers have been developed. Winning strategies for GMTs can also be computed by resorting to suitable CHC queries. We demonstrate that GMTs can model various relevant real-world games, and that our approach can effectively solve several problems from different domains, using Z3 as the backend CHC solver.
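In the finite-state special case, the reachability player's winning region is the classical attractor, computed by a backward fixpoint. A minimal sketch (ours; the paper's contribution is the infinite-state, theory-parametric setting solved via CHC queries):

```python
def attractor(r_states, s_states, edges, target):
    """Backward fixpoint: the set of states from which the reachability
    player can force a visit to `target`, no matter how the safety
    player moves. edges maps each state to its successor list."""
    win = set(target)
    changed = True
    while changed:
        changed = False
        for v in list(r_states) + list(s_states):
            if v in win:
                continue
            succs = edges.get(v, [])
            if v in r_states and any(s in win for s in succs):
                win.add(v)        # reach player picks a winning move
                changed = True
            elif v in s_states and succs and all(s in win for s in succs):
                win.add(v)        # every safety-player move stays winning
                changed = True
    return win

# Tiny game: the reach player owns state 0; the safety player owns 1, 2, 4.
edges = {0: [1, 2], 1: [3], 2: [4], 4: [4]}
win = attractor({0}, {1, 2, 4}, edges, {3})
assert win == {0, 1, 3}   # from 2 the safety player escapes to the sink 4
```

The bounded-safety-player restriction in the paper plays the same role as the `all(...)` quantification above, but with at most k symbolically described moves per configuration.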
16

Katsura, Hiroyuki, Naoki Kobayashi, and Ryosuke Sato. "Higher-Order Property-Directed Reachability". Proceedings of the ACM on Programming Languages 7, ICFP (August 30, 2023): 48–77. http://dx.doi.org/10.1145/3607831.

Abstract:
The property-directed reachability (PDR) has been used as a successful method for automated verification of first-order transition systems. We propose a higher-order extension of PDR, called HoPDR, where higher-order recursive functions may be used to describe transition systems. We formalize HoPDR for the validity checking problem for conjunctive nu-HFL(Z), a higher-order fixpoint logic with integers and greatest fixpoint operators. The validity checking problem can also be viewed as a higher-order extension of the satisfiability problem for Constrained Horn Clauses (CHC), and safety property verification of higher-order programs can naturally be reduced to the validity checking problem. We have implemented a prototype verification tool based on HoPDR and confirmed its effectiveness. We also compare our HoPDR procedure with the PDR procedure for first-order systems and previous methods for fully automated higher-order program verification.
17

Gu, Yu, Takeshi Tsukada, and Hiroshi Unno. "Optimal CHC Solving via Termination Proofs". Proceedings of the ACM on Programming Languages 7, POPL (January 9, 2023): 604–31. http://dx.doi.org/10.1145/3571214.

Abstract:
Motivated by applications to open program reasoning such as maximal specification inference, this paper studies optimal CHC solving, the problem of computing maximal and/or minimal solutions of constrained Horn clauses (CHCs). This problem and its subproblems have been studied in the literature, and a major approach is to iteratively improve a solution of CHCs until it becomes optimal. So a key ingredient of optimization methods is the optimality checking of a given solution. We propose a novel optimality checking method, as well as an optimization method using the proposed optimality checker, based on a computational theoretical analysis of the optimality checking problem. The key observation is that the optimality checking problem is closely related to the termination analysis of programs, and this observation is useful both theoretically and practically. From a theoretical perspective, it clarifies a limitation of an existing method and the incorrectness of another method in the literature. From a practical perspective, it allows us to apply techniques of termination analysis to the optimality checking of a solution of CHCs. We present an optimality checking method based on constraint-based synthesis of termination arguments; we implemented this method, evaluated it on CHCs that encode maximal specification synthesis problems, and obtained promising results.
18

De Angelis, Emanuele, Fabio Fioravanti, Alberto Pettorossi, and Maurizio Proietti. "Contract Strengthening through Constrained Horn Clause Verification". Electronic Proceedings in Theoretical Computer Science 373 (November 22, 2022): 23–34. http://dx.doi.org/10.4204/eptcs.373.3.

19

Jochems, Jerome, Eddie Jones, and Steven Ramsay. "Higher-Order MSL Horn Constraints". Proceedings of the ACM on Programming Languages 7, POPL (January 9, 2023): 2017–47. http://dx.doi.org/10.1145/3571262.

Abstract:
The monadic shallow linear (MSL) class is a decidable fragment of first-order Horn clauses that was discovered and rediscovered around the turn of the century, with applications in static analysis and verification. We propose a new class of higher-order Horn constraints which extend MSL to higher-order logic and develop a resolution-based decision procedure. Higher-order MSL Horn constraints can quite naturally capture the complex patterns of call and return that are possible in higher-order programs, which make them well suited to higher-order program verification. In fact, we show that the higher-order MSL satisfiability problem and the HORS model checking problem are interreducible, so that higher-order MSL can be seen as a constraint-based approach to higher-order model checking. Finally, we describe an implementation of our decision procedure and its application to verified socket programming.
20

Kafle, Bishoksan, and John P. Gallagher. "Constraint specialisation in Horn clause verification". Science of Computer Programming 137 (April 2017): 125–40. http://dx.doi.org/10.1016/j.scico.2017.01.002.

21

Leach, Javier, Susana Nieva, and Mario Rodríguez-Artalejo. "Constraint Logic Programming with Hereditary Harrop formulas". Theory and Practice of Logic Programming 1, no. 4 (June 25, 2001): 409–45. http://dx.doi.org/10.1017/s1471068401001041.

Abstract:
Constraint Logic Programming (CLP) and Hereditary Harrop formulas (HH) are two well known ways to enhance the expressivity of Horn clauses. In this paper, we present a novel combination of these two approaches. We show how to enrich the syntax and proof theory of HH with the help of a given constraint system, in such a way that the key property of HH as a logic programming language (namely, the existence of uniform proofs) is preserved. We also present a procedure for goal solving, showing its soundness and completeness for computing answer constraints. As a consequence of this result, we obtain a new strong completeness theorem for CLP that avoids the need to build disjunctions of computed answers, as well as a more abstract formulation of a known completeness theorem for HH.
22

Diaconescu, Răzvan. "Completeness of category-based equational deduction". Mathematical Structures in Computer Science 5, no. 1 (March 1995): 9–40. http://dx.doi.org/10.1017/s0960129500000621.

Abstract:
Equational deduction is generalised within a category-based abstract model theory framework, and proved complete under a hypothesis of quantifier projectivity, using a semantic treatment that regards quantifiers as models rather than variables, and valuations as model morphisms rather than functions. Applications include many- and order-sorted (conditional) equational logics, Horn clause logic, equational deduction modulo a theory, constraint logics, and more, as well as any possible combination among them. In the cases of equational deduction modulo a theory and of constraint logic the completeness result is new. One important consequence is an abstract version of Herbrand's Theorem, which provides an abstract model theoretic foundation for equational and constraint logic programming.
23

Martins, João, and R. Vilela Mendes. "Neural Networks and Logical Reasoning Systems: A Translation Table". International Journal of Neural Systems 11, no. 02 (April 2001): 179–86. http://dx.doi.org/10.1142/s0129065701000540.

Abstract:
A correspondence is established between the basic elements of logic reasoning systems (knowledge bases, rules, inference and queries) and the structure and dynamical evolution laws of neural networks. The correspondence is pictured as a translation dictionary which might allow one to go back and forth between symbolic and network formulations, a desirable step in learning-oriented systems and multicomputer networks. In the framework of Horn clause logics, it is found that atomic propositions with n arguments correspond to nodes with nth-order synapses, rules to synaptic intensity constraints, forward chaining to synaptic dynamics and queries either to simple node activation or to a query tensor dynamics.
24

Lopez-Garcia, P., L. Darmawan, M. Klemen, U. Liqat, F. Bueno, and M. V. Hermenegildo. "Interval-based resource usage verification by translation into Horn clauses and an application to energy consumption". Theory and Practice of Logic Programming 18, no. 2 (March 2018): 167–223. http://dx.doi.org/10.1017/s1471068418000042.

Abstract:
Many applications require conformance with specifications that constrain the use of resources, such as execution time, energy, bandwidth, etc. We present a configurable framework for static resource usage verification where specifications can include data size-dependent resource usage functions, expressing both lower and upper bounds. Ensuring conformance with respect to such specifications is an undecidable problem. Therefore, to statically check such specifications, our framework infers the same type of resource usage functions, which safely approximate the actual resource usage of the program, and compares them against the specification. We review how this framework supports several languages and compilation output formats by translating them to an intermediate representation based on Horn clauses and using the configurability of the framework to describe the resource semantics of the input language. We provide a detailed formalization and extend the framework so that both resource usage specification and analysis/verification output can include preconditions expressing intervals for the input data sizes for which assertions are intended to hold, proved, or disproved. Most importantly, we also extend the classes of functions that can be checked. We also report on and provide results from an implementation within the Ciao/CiaoPP framework, as well as on a practical tool built by instantiating this framework for the verification of energy consumption specifications for imperative/embedded programs. Finally, we show as an example how embedded software developers can use this tool, in particular, for determining values for program parameters that ensure meeting a given energy budget while minimizing the loss in quality of service.
25

Palmer, T. N. "Bell's conspiracy, Schrödinger's black cat and global invariant sets". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 373, no. 2047 (August 6, 2015): 20140246. http://dx.doi.org/10.1098/rsta.2014.0246.

Abstract:
A locally causal hidden-variable theory of quantum physics need not be constrained by the Bell inequalities if this theory also partially violates the measurement independence condition. However, such violation can appear unphysical, implying implausible conspiratorial correlations between the hidden variables of particles being measured and earlier determinants of instrumental settings. A novel physically plausible explanation for such correlations is proposed, based on the hypothesis that states of physical reality lie precisely on a non-computational measure-zero dynamically invariant set in the state space of the universe: the Cosmological Invariant Set Postulate. To illustrate the relevance of the concept of a global invariant set, a simple analogy is considered where a massive object is propelled into a black hole depending on the decay of a radioactive atom. It is claimed that a locally causal hidden-variable theory constrained by the Cosmological Invariant Set Postulate can violate the Clauser–Horne–Shimony–Holt inequality without being conspiratorial, superdeterministic, fine-tuned or retrocausal, and the theory readily accommodates the classical compatibilist notion of (experimenter) free will.
26

DOMÉNECH, JESÚS J., JOHN P. GALLAGHER and SAMIR GENAIM. "Control-Flow Refinement by Partial Evaluation, and its Application to Termination and Cost Analysis". Theory and Practice of Logic Programming 19, no. 5-6 (September 2019): 990–1005. http://dx.doi.org/10.1017/s1471068419000310.

Abstract:
Control-flow refinement refers to program transformations whose purpose is to make implicit control-flow explicit, and is used in the context of program analysis to increase precision. Several techniques have been suggested for different programming models, typically tailored to improving precision for a particular analysis. In this paper we explore the use of partial evaluation of Horn clauses as a general-purpose technique for control-flow refinement for integer transition systems. These are control-flow graphs where edges are annotated with linear constraints describing transitions between corresponding nodes, and they are used in many program analysis tools. Using partial evaluation for control-flow refinement has the clear advantage over other approaches in that soundness follows from the general properties of partial evaluation; in particular, properties such as termination and complexity are preserved. We use a partial evaluation algorithm incorporating property-based abstraction, and show how the right choice of properties allows us to prove termination and to infer complexity of challenging programs that cannot be handled by state-of-the-art tools. We report on the integration of the technique in a termination analyzer, and its use as a preprocessing step for several cost analyzers.
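The payoff of control-flow refinement can be illustrated on a classic loop: splitting the loop head on the properties {x > 0, x < 0} makes an obvious ranking function visible on each refined branch. The sketch below brute-force checks that ranking function on sample states; it is a hand-written stand-in for intuition, not the paper's partial-evaluation algorithm.

```python
# Loop under analysis: "while x != 0: if x > 0: x -= 1 else: x += 1".
# After splitting the loop head on the sign of x, each refined node has a
# self-loop on which abs(x) strictly decreases, so the loop terminates.

def step(x):
    return x - 1 if x > 0 else x + 1

def rank(x):
    return abs(x)  # candidate ranking function

def check_ranking(samples):
    """Brute-force check (a stand-in for a symbolic proof) that the rank
    strictly decreases on every iteration where the loop guard holds."""
    ok = True
    for x in samples:
        if x != 0:  # loop guard
            ok &= rank(step(x)) < rank(x)
    return ok

print(check_ranking(range(-100, 101)))  # True
```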
27

Molnár, Bálint, and András Benczúr. "The Application of Directed Hyper-Graphs for Analysis of Models of Information Systems". Mathematics 10, no. 5 (February 27, 2022): 759. http://dx.doi.org/10.3390/math10050759.

Abstract:
Hyper-graphs offer the opportunity to formulate logical statements about their components, for example, using Horn clauses. Several models of Information Systems, such as workflows (i.e., business processes), can be represented using hyper-graphs. Many constraints should be maintained throughout the development process when modeling Information Systems. The models of Information Systems are complex objects; for this reason, the analysis of algorithms and graph structures that can support the consistency and integrity of models is an essential issue. A set of interdependencies between models and components of architecture can be formulated by functional dependencies and can be investigated via algorithmic methods. Information Systems can be perceived as overarching documents that include data collections; documents to be processed; and representations of business processes, activities, and services. When an appropriate method of encoding artifacts in Information Systems is selected and worked out, the complex structure can be represented using hyper-graphs. This representation enables the application of various model-checking, verification, and validation tools that are based on formal approaches. This paper describes the proposed representations in different situations using hyper-graphs, as well as the formal, algorithm-based model-checking methods that are coupled with the representations. The model-checking methods are realized by algorithms that are grounded in graph-theoretical approaches and tailored to the specificity of hyper-graphs. Finally, the possible applications in a real-life enterprise environment are outlined.
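The functional-dependency checks mentioned here build on the classic attribute-closure algorithm. A minimal sketch (the artifact names "A"–"D" are hypothetical, not taken from the paper):

```python
def closure(attrs, fds):
    """Compute the closure of a set of attributes under functional
    dependencies, given as (lhs_set, rhs_set) pairs."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            # If the left-hand side is already derivable, add the right-hand side.
            if set(lhs) <= result and not set(rhs) <= result:
                result |= set(rhs)
                changed = True
    return result

# Artifact "A" determines "B"; "B" together with "C" determines "D".
fds = [({"A"}, {"B"}), ({"B", "C"}, {"D"})]
print(sorted(closure({"A", "C"}, fds)))  # ['A', 'B', 'C', 'D']
```

A component set is consistent with a required dependency X → Y exactly when Y is contained in the closure of X.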
28

Niu, Guanglin, Yongfei Zhang, Bo Li, Peng Cui, Si Liu, Jingyang Li and Xiaowei Zhang. "Rule-Guided Compositional Representation Learning on Knowledge Graphs". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 03 (April 3, 2020): 2950–58. http://dx.doi.org/10.1609/aaai.v34i03.5687.

Abstract:
Representation learning on a knowledge graph (KG) is to embed entities and relations of a KG into low-dimensional continuous vector spaces. Early KG embedding methods only pay attention to structured information encoded in triples, which would cause limited performance due to the structural sparseness of KGs. Some recent attempts consider path information to expand the structure of KGs but lack explainability in the process of obtaining the path representations. In this paper, we propose a novel Rule and Path-based Joint Embedding (RPJE) scheme, which takes full advantage of the explainability and accuracy of logic rules, the generalization of KG embedding, as well as the supplementary semantic structure of paths. Specifically, logic rules of different lengths (the number of relations in a rule body) in the form of Horn clauses are first mined from the KG and elaborately encoded for representation learning. Then, the rules of length 2 are applied to compose paths accurately, while the rules of length 1 are explicitly employed to create semantic associations among relations and constrain relation embeddings. Moreover, the confidence level of each rule is also considered in optimization to guarantee the availability of applying the rule to representation learning. Extensive experimental results illustrate that RPJE outperforms other state-of-the-art baselines on the KG completion task, which also demonstrates the superiority of utilizing logic rules as well as paths for improving the accuracy and explainability of representation learning.
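The path-composition idea described here (length-2 Horn rules r1(x,y) ∧ r2(y,z) → r3(x,z) collapsing a relation path into one relation) can be sketched as a rewriting loop. Rule and relation names below are hypothetical illustrations, not RPJE's learned rules, and confidence weighting is omitted.

```python
def compose_path(path, rules):
    """Repeatedly rewrite adjacent relation pairs using length-2 rules
    of the form (r1, r2) -> head, until no rule applies."""
    path = list(path)
    changed = True
    while changed and len(path) > 1:
        changed = False
        for i in range(len(path) - 1):
            head = rules.get((path[i], path[i + 1]))
            if head is not None:
                path[i:i + 2] = [head]  # collapse the pair into the rule head
                changed = True
                break
    return path

rules = {("born_in", "city_of"): "nationality"}
print(compose_path(["born_in", "city_of"], rules))  # ['nationality']
```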
29

Stuckey, William, Michael Silberstein, Timothy McDevitt and Ian Kohler. "Why the Tsirelson Bound? Bub’s Question and Fuchs’ Desideratum". Entropy 21, no. 7 (July 15, 2019): 692. http://dx.doi.org/10.3390/e21070692.

Abstract:
To answer Wheeler’s question “Why the quantum?” via quantum information theory according to Bub, one must explain both why the world is quantum rather than classical and why the world is quantum rather than superquantum, i.e., “Why the Tsirelson bound?” We show that the quantum correlations and quantum states corresponding to the Bell basis states, which uniquely produce the Tsirelson bound for the Clauser–Horne–Shimony–Holt (CHSH) quantity, can be derived from conservation per no preferred reference frame (NPRF). A reference frame in this context is defined by a measurement configuration, just as with the light postulate of special relativity. We therefore argue that the Tsirelson bound is ultimately based on NPRF just as the postulates of special relativity. This constraint-based/principle answer to Bub’s question addresses Fuchs’ desideratum that we “take the structure of quantum theory and change it from this very overt mathematical speak ... into something like [special relativity].” Thus, the answer to Bub’s question per Fuchs’ desideratum is, “the Tsirelson bound obtains due to conservation per NPRF”.
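The Tsirelson bound discussed in this abstract can be reproduced numerically: with the correlation form E(a, b) = cos(2(a − b)) for polarization-entangled photons and the standard optimal settings, the CHSH quantity reaches 2√2. This is a textbook computation, not a result specific to the paper.

```python
import math

def E(a, b):
    # Quantum correlation for a maximally entangled polarization state.
    return math.cos(2 * (a - b))

a, ap = 0.0, math.pi / 4                 # Alice's two measurement settings
b, bp = math.pi / 8, 3 * math.pi / 8     # Bob's two measurement settings

# CHSH combination; classical (local hidden-variable) models are bounded by 2.
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(round(S, 6), round(2 * math.sqrt(2), 6))  # 2.828427 2.828427
```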
30

Hess, Karl. "A Critical Review of Works Pertinent to the Einstein-Bohr Debate and Bell’s Theorem". Symmetry 14, no. 1 (January 14, 2022): 163. http://dx.doi.org/10.3390/sym14010163.

Abstract:
This review is related to the Einstein-Bohr debate and to Einstein–Podolsky–Rosen’s (EPR) and Bohm’s (EPRB) Gedanken-experiments as well as their realization in actual experiments. I examine a significant number of papers from my minority point of view and conclude that the well-known theorems of Bell and Clauser, Horne, Shimony and Holt (CHSH) deal with mathematical abstractions that have only a tenuous relation to quantum theory and the actual EPRB experiments. It is also shown that, therefore, Bell-CHSH cannot be used to assess the nature of quantum entanglement, nor can physical features of entanglement be used to prove Bell-CHSH. Their proofs are, among other factors, based on a statistical sampling argument that is invalid for general physical entities and processes and only applicable for finite “populations”; not for elements of physical reality that are linked, for example, to a time-like continuum. Bell-CHSH have, furthermore, neglected the subtleties of the theorem of Vorob’ev that includes their theorems as special cases. Vorob’ev found that certain combinatorial-topological cyclicities of classical random variables form a necessary and sufficient condition for the constraints that are now known as Bell-CHSH inequalities. These constraints, however, must not be linked to the observables of quantum theory nor to the actual EPRB experiments for a variety of reasons, including the existence of continuum-related variables and appropriate considerations of symmetry.
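The classical Bell-CHSH constraint this review analyzes can be verified by brute force: over every deterministic assignment of ±1 outcomes to the four measurement settings, the CHSH combination never exceeds 2. This is the standard derivation of the bound, not the review's own argument.

```python
from itertools import product

# Enumerate all deterministic local strategies: fixed +/-1 outcomes for
# Alice's settings (a0, a1) and Bob's settings (b0, b1).
best = max(
    a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
    for a0, a1, b0, b1 in product((-1, 1), repeat=4)
)
print(best)  # 2: the classical CHSH bound
```

Mixtures of deterministic strategies cannot exceed this value either, since the CHSH quantity is linear in the outcome probabilities.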
31

SANCHEZ-ORDAZ, MIGUEL A., ISABEL GARCIA-CONTRERAS, VICTOR PEREZ, JOSÉ F. MORALES, PEDRO LOPEZ-GARCIA and MANUEL V. HERMENEGILDO. "VeriFly: On-the-fly Assertion Checking via Incrementality". Theory and Practice of Logic Programming 21, no. 6 (November 2021): 768–84. http://dx.doi.org/10.1017/s1471068421000430.

Abstract:
Assertion checking is an invaluable programmer’s tool for finding many classes of errors or verifying their absence in dynamic languages such as Prolog. For Prolog programmers, this means being able to have relevant properties, such as modes, types, determinacy, nonfailure, sharing, constraints, and cost, checked and errors flagged without having to actually run the program. Such global static analysis tools are arguably most useful the earlier they are used in the software development cycle, and fast response times are essential for interactive use. Triggering a full and precise semantic analysis of a software project every time a change is made can be prohibitively expensive. This is especially the case when complex properties need to be inferred for large, realistic code bases. In our static analysis and verification framework, this challenge is addressed through a combination of modular and incremental (context- and path-sensitive) analysis that is responsive to program edits, at different levels of granularity. In this tool paper, we present how the combination of this framework within an integrated development environment (IDE) takes advantage of such incrementality to achieve a high level of reactivity when reflecting analysis and verification results back as colorings and tooltips directly on the program text – the tool’s VeriFly mode. The concrete implementation that we describe is Emacs-based and reuses in part off-the-shelf “on-the-fly” syntax checking facilities (flycheck). We believe that similar extensions are also reproducible with low effort in other mature development environments. Our initial experience with the tool shows quite promising results, with low latency times that provide early, continuous, and precise assertion checking and other semantic feedback to programmers during the development process. The tool supports Prolog natively, as well as other languages by semantic transformation into Horn clauses.
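The core of the incrementality idea can be sketched in a few lines: after an edit, re-analyze only the changed modules and their (transitive) reverse dependencies, reusing cached results for everything else. The module names and dependency graph below are hypothetical placeholders, not CiaoPP internals.

```python
def modules_to_reanalyze(changed, reverse_deps):
    """Transitively collect the modules affected by the changed set."""
    work, affected = list(changed), set(changed)
    while work:
        m = work.pop()
        for dep in reverse_deps.get(m, ()):
            if dep not in affected:
                affected.add(dep)
                work.append(dep)
    return affected

# 'main' imports 'lists', which imports 'arith'; editing 'arith'
# invalidates all three, while editing 'main' invalidates only itself.
rdeps = {"arith": ["lists"], "lists": ["main"]}
print(sorted(modules_to_reanalyze({"arith"}, rdeps)))  # ['arith', 'lists', 'main']
print(sorted(modules_to_reanalyze({"main"}, rdeps)))   # ['main']
```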
32

VAN ROY, PETER, PER BRAND, DENYS DUCHIER, SEIF HARIDI, CHRISTIAN SCHULTE and MARTIN HENZ. "Logic programming in the context of multiparadigm programming: the Oz experience". Theory and Practice of Logic Programming 3, no. 6 (November 2003): 717–63. http://dx.doi.org/10.1017/s1471068403001741.

Abstract:
Oz is a multiparadigm language that supports logic programming as one of its major paradigms. A multiparadigm language is designed to support different programming paradigms (logic, functional, constraint, object-oriented, sequential, concurrent, etc.) with equal ease. This paper has two goals: to give a tutorial of logic programming in Oz; and to show how logic programming fits naturally into the wider context of multiparadigm programming. Our experience shows that there are two classes of problems, which we call algorithmic and search problems, for which logic programming can help formulate practical solutions. Algorithmic problems have known efficient algorithms. Search problems do not have known efficient algorithms but can be solved with search. The Oz support for logic programming targets these two problem classes specifically, using the concepts needed for each. This is in contrast to the Prolog approach, which targets both classes with one set of concepts, which results in less than optimal support for each class. We give examples that can be run interactively on the Mozart system, which implements Oz. To explain the essential difference between algorithmic and search programs, we define the Oz execution model. This model subsumes both concurrent logic programming (committed-choice-style) and search-based logic programming (Prolog-style). Furthermore, as consequences of its multiparadigm nature, the model supports new abilities such as first-class top levels, deep guards, active objects, and sophisticated control of the search process. Instead of Horn clause syntax, Oz has a simple, fully compositional, higher-order syntax that accommodates the abilities of the language. We give a brief history of Oz that traces the development of its main ideas and we summarize the lessons learned from this work. Finally, we give many entry points into the Oz literature.
33

Blasiak, Pawel, Emmanuel M. Pothos, James M. Yearsley, Christoph Gallus and Ewa Borsuk. "Violations of locality and free choice are equivalent resources in Bell experiments". Proceedings of the National Academy of Sciences 118, no. 17 (April 22, 2021): e2020569118. http://dx.doi.org/10.1073/pnas.2020569118.

Abstract:
Bell inequalities rest on three fundamental assumptions: realism, locality, and free choice, which lead to nontrivial constraints on correlations in very simple experiments. If we retain realism, then violation of the inequalities implies that at least one of the remaining two assumptions must fail, which can have profound consequences for the causal explanation of the experiment. We investigate the extent to which a given assumption needs to be relaxed for the other to hold at all costs, based on the observation that a violation need not occur on every experimental trial, even when describing correlations violating Bell inequalities. How often this needs to be the case determines the degree of, respectively, locality or free choice in the observed experimental behavior. Despite their disparate character, we show that both assumptions are equally costly. Namely, the resources required to explain the experimental statistics (measured by the frequency of causal interventions of either sort) are exactly the same. Furthermore, we compute such defined measures of locality and free choice for any nonsignaling statistics in a Bell experiment with binary settings, showing that it is directly related to the amount of violation of the so-called Clauser–Horne–Shimony–Holt inequalities. This result is theory independent as it refers directly to the experimental statistics. Additionally, we show how the local fraction results for quantum-mechanical frameworks with an infinite number of settings translate into analogous statements for the measure of free choice we introduce. Thus, concerning statistics, causal explanations resorting to either locality or free choice violations are fully interchangeable.
34

HUET, GÉRARD. "Special issue on ‘Logical frameworks and metalanguages’". Journal of Functional Programming 13, no. 2 (March 2003): 257–60. http://dx.doi.org/10.1017/s0956796802004549.

Abstract:
There is both a great unity and a great diversity in presentations of logic. The diversity is staggering indeed – propositional logic, first-order logic, higher-order logic belong to one classification; linear logic, intuitionistic logic, classical logic, modal and temporal logics belong to another one. Logical deduction may be presented as a Hilbert style of combinators, as a natural deduction system, as sequent calculus, as proof nets of one variety or other, etc. Logic, originally a field of philosophy, turned into algebra with Boole, and more generally into meta-mathematics with Frege and Heyting. Professional logicians such as Gödel and later Tarski studied mathematical models, consistency and completeness, computability and complexity issues, set theory and foundations, etc. Logic became a very technical area of mathematical research in the last half century, with fine-grained analysis of expressiveness of subtheories of arithmetic or set theory, detailed analysis of well-foundedness through ordinal notations, logical complexity, etc. Meanwhile, computer modelling developed a need for concrete uses of logic, first for the design of computer circuits, then more widely for increasing the reliability of software through the use of formal specifications and proofs of correctness of computer programs. This gave rise to more exotic logics, such as dynamic logic, Hoare-style logic of axiomatic semantics, logics of partial values (such as Scott's denotational semantics and Plotkin's domain theory) or of partial terms (such as Feferman's free logic), etc.
The first actual attempts at mechanisation of logical reasoning through the resolution principle (automated theorem proving) had been disappointing, but their shortcomings gave rise to a considerable body of research, developing detailed knowledge about equational reasoning through canonical simplification (rewriting theory) and proofs by induction (following Boyer and Moore's successful integration of primitive recursive arithmetic within the LISP programming language). The special case of Horn clauses gave rise to a new paradigm of non-deterministic programming, called Logic Programming, developing later into Constraint Programming, blurring further the scope of logic. In order to study knowledge acquisition, researchers in artificial intelligence and computational linguistics studied exotic versions of modal logics such as Montague intentional logic, epistemic logic, dynamic logic or hybrid logic. Some others tried to capture common sense, and modeled the revision of beliefs with so-called non-monotonic logics. For the careful craftsmen of mathematical logic, this was the final outrage, and Girard gave his anathema to such “montres à moutardes”.
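The Horn-clause fragment mentioned above is exactly what makes logic programming tractable: a propositional Horn program has a least model computable by simple forward chaining. A minimal sketch (the atoms and rules are invented for illustration):

```python
def least_model(facts, rules):
    """Compute the least model of a propositional Horn program.

    rules: list of (body_atoms, head_atom) pairs, i.e. clauses
           body_1 & ... & body_n -> head.
    """
    model = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            # Fire the rule when its whole body is already derived.
            if head not in model and all(b in model for b in body):
                model.add(head)
                changed = True
    return model

rules = [(["p", "q"], "r"), (["r"], "s")]
print(sorted(least_model({"p", "q"}, rules)))  # ['p', 'q', 'r', 's']
```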
35

Proietti, Maurizio. "Transforming Constrained Horn Clauses for Program Verification". Electronic Proceedings in Theoretical Computer Science 219 (July 14, 2016). http://dx.doi.org/10.4204/eptcs.219.0.1.

36

DE ANGELIS, EMANUELE, MAURIZIO PROIETTI, FABIO FIORAVANTI and ALBERTO PETTOROSSI. "Verifying Catamorphism-Based Contracts using Constrained Horn Clauses". Theory and Practice of Logic Programming, July 7, 2022, 1–18. http://dx.doi.org/10.1017/s1471068422000175.

Abstract:
We address the problem of verifying that the functions of a program meet their contracts, specified by pre/postconditions. We follow an approach based on constrained Horn clauses (CHCs) by which the verification problem is reduced to the problem of checking satisfiability of a set of clauses derived from the given program and contracts. We consider programs that manipulate algebraic data types (ADTs) and a class of contracts specified by catamorphisms, that is, functions defined by simple recursion schemata on the given ADTs. We show by several examples that state-of-the-art CHC satisfiability tools are not effective at solving the satisfiability problems obtained by direct translation of the contracts into CHCs. To overcome this difficulty, we propose a transformation technique that removes the ADT terms from CHCs and derives new sets of clauses that work on basic sorts only, such as integers and booleans. Thus, when using the derived CHCs there is no need for induction rules on ADTs. We prove that the transformation is sound, that is, if the derived set of CHCs is satisfiable, then so is the original set. We also prove that the transformation always terminates for the class of contracts specified by catamorphisms. Finally, we present the experimental results obtained by an implementation of our technique when verifying many non-trivial contracts for ADT manipulating programs.
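A catamorphism in the sense used here is a simple fold over an ADT. The sketch below models lists as nested pairs, defines a `sum` catamorphism, and checks a typical contract at runtime; the CHC-based method of the paper would prove such a contract statically instead. The concrete functions are invented for illustration.

```python
NIL = None  # the empty list of the ADT; a non-empty list is (head, tail)

def cata(nil_case, cons_case, xs):
    """Catamorphism (fold) over the list ADT."""
    if xs is NIL:
        return nil_case
    head, tail = xs
    return cons_case(head, cata(nil_case, cons_case, tail))

def list_sum(xs):
    return cata(0, lambda h, acc: h + acc, xs)

def append(xs, ys):
    # Contract: list_sum(append(xs, ys)) == list_sum(xs) + list_sum(ys)
    if xs is NIL:
        return ys
    head, tail = xs
    return (head, append(tail, ys))

xs, ys = (1, (2, NIL)), (3, NIL)
assert list_sum(append(xs, ys)) == list_sum(xs) + list_sum(ys)
print(list_sum(append(xs, ys)))  # 6
```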
37

DE ANGELIS, EMANUELE, FABIO FIORAVANTI, JOHN P. GALLAGHER, MANUEL V. HERMENEGILDO, ALBERTO PETTOROSSI and MAURIZIO PROIETTI. "Analysis and Transformation of Constrained Horn Clauses for Program Verification". Theory and Practice of Logic Programming, November 15, 2021, 1–69. http://dx.doi.org/10.1017/s1471068421000211.

Abstract:
This paper surveys recent work on applying analysis and transformation techniques that originate in the field of constraint logic programming (CLP) to the problem of verifying software systems. We present specialization-based techniques for translating verification problems for different programming languages, and in general software systems, into satisfiability problems for constrained Horn clauses (CHCs), a term that has become popular in the verification field to refer to CLP programs. Then, we describe static analysis techniques for CHCs that may be used for inferring relevant program properties, such as loop invariants. We also give an overview of some transformation techniques based on specialization and fold/unfold rules, which are useful for improving the effectiveness of CHC satisfiability tools. Finally, we discuss future developments in applying these techniques.
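To make the CHC satisfiability view concrete, here is a toy check by least-fixpoint computation over a small bounded integer domain (real CHC solvers such as those surveyed work symbolically; the clause set is invented for illustration): x = 0 → inv(x); inv(x) ∧ x < 5 → inv(x+1); and the safety query inv(x) ∧ x > 5 → false.

```python
DOMAIN = range(0, 11)  # bounded domain standing in for the integers

inv = set()
changed = True
while changed:
    changed = False
    # Clause 1: x = 0 -> inv(x)
    if 0 not in inv:
        inv.add(0)
        changed = True
    # Clause 2: inv(x), x < 5 -> inv(x + 1)
    for x in list(inv):
        if x < 5 and x + 1 in DOMAIN and x + 1 not in inv:
            inv.add(x + 1)
            changed = True

# Query clause: inv(x), x > 5 -> false.  The clause set is satisfiable
# (the program is safe) iff no x in the least model violates it.
safe = all(not (x > 5) for x in inv)
print(sorted(inv), safe)  # [0, 1, 2, 3, 4, 5] True
```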
38

Otoni, Rodrigo, Matteo Marescotti, Leonardo Alt, Patrick Eugster, Antti E. J. Hyvärinen and Natasha Sharygina. "A Solicitous Approach to Smart Contract Verification". ACM Transactions on Privacy and Security, September 28, 2022. http://dx.doi.org/10.1145/3564699.

Abstract:
Smart contracts are tempting targets of attacks, since they often hold and manipulate significant financial assets, are immutable after deployment, and have publicly available source code, with assets estimated in the order of millions of US Dollars being lost in the past due to vulnerabilities. Formal verification is thus a necessity, but smart contracts challenge the existing highly efficient techniques routinely applied in the symbolic verification of software, due to specificities not present in general programming languages. A common feature of existing works in this area is the attempt to reuse off-the-shelf verification tools designed for general programming languages. This reuse can lead to inefficiency and potentially unsound results, since domain translation is required. In this paper we describe a carefully crafted approach that directly models the central aspects of smart contracts natively, going from the contract to its logical representation without intermediary steps. We use the expressive and highly automatable logic of constrained Horn clauses for modeling and we instantiate our approach to the Solidity language. A tool implementing our approach, called Solicitous, was developed and integrated into the SMTChecker module of the Solidity compiler solc. We evaluated our approach on an extensive benchmark set containing 22446 real-world smart contracts deployed on the Ethereum blockchain over a 27-month period. The results show that our approach is able to establish safety of significantly more contracts than comparable, publicly available verification tools, with an order of magnitude increase in the percentage of formally verified contracts.
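The modeling idea can be sketched by treating a (hypothetical) token contract's transfer function as a state transition and checking a conservation invariant over explored states; a CHC-based tool like the one described would prove such an invariant symbolically for all states instead.

```python
def transfer(balances, sender, receiver, amount):
    """One transition of a toy token contract."""
    if balances.get(sender, 0) < amount:       # guard: sufficient funds
        return balances                        # revert: state unchanged
    new = dict(balances)
    new[sender] -= amount
    new[receiver] = new.get(receiver, 0) + amount
    return new

state = {"alice": 100, "bob": 0}
total = sum(state.values())
for s, r, amt in [("alice", "bob", 30), ("bob", "alice", 50), ("alice", "bob", 70)]:
    state = transfer(state, s, r, amt)
    # Conservation invariant: total supply never changes (the 50-unit
    # transfer reverts, since bob only holds 30 at that point).
    assert sum(state.values()) == total

print(state)  # {'alice': 0, 'bob': 100}
```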
39

Lipka, Michał, Mateusz Mazelanik, Adam Leszczyński, Wojciech Wasilewski and Michał Parniak. "Massively-multiplexed generation of Bell-type entanglement using a quantum memory". Communications Physics 4, no. 1 (March 8, 2021). http://dx.doi.org/10.1038/s42005-021-00551-1.

Abstract:
High-rate generation of hybrid photon-matter entanglement remains a fundamental building block of quantum network architectures enabling protocols such as quantum secure communication or quantum distributed computing. While a tremendous effort has been made to overcome technological constraints limiting the efficiency and coherence times of current systems, an important complementary approach is to employ parallel and multiplexed architectures. Here we follow this approach experimentally demonstrating the generation of bipartite polarization-entangled photonic states across more than 500 modes, with a programmable delay for the second photon enabled by qubit storage in a wavevector-multiplexed cold-atomic quantum memory. We demonstrate Clauser, Horne, Shimony, Holt inequality violation by over 3 standard deviations, lasting for at least 45 μs storage time for half of the modes. The ability to shape hybrid entanglement between the polarization and wavevector degrees of freedom provides not only multiplexing capabilities but also brings prospects for novel protocols.