Academic literature on the topic 'Logical encodings'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Logical encodings.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Logical encodings"

1

Kovács, Tibor, Gábor Simon, and Gergely Mezei. "Benchmarking Graph Database Backends—What Works Well with Wikidata?" Acta Cybernetica 24, no. 1 (May 21, 2019): 43–60. http://dx.doi.org/10.14232/actacyb.24.1.2019.5.

Full text
Abstract:
Knowledge bases often utilize graphs as their logical model. RDF-based knowledge bases (KBs) are prime examples, as RDF (Resource Description Framework) uses a graph as its logical model. Graph databases are an emerging breed of NoSQL-type databases, offering a graph as the logical model. Although there are specialized databases, the so-called triple stores, for storing RDF data, graph databases can also be promising candidates for storing knowledge. In this paper, we benchmark different graph database implementations loaded with Wikidata, a real-life, large-scale knowledge base. Graph databases come in all shapes and sizes and offer different APIs and graph models. Hence we used a measurement system that can abstract away the API differences. For the modeling aspect, we made measurements with different graph encodings previously suggested in the literature, in order to observe the impact of the encoding aspect on the overall performance.
APA, Harvard, Vancouver, ISO, and other styles
2

Yoder, Theodore J., and Isaac H. Kim. "The surface code with a twist." Quantum 1 (April 25, 2017): 2. http://dx.doi.org/10.22331/q-2017-04-25-2.

Full text
Abstract:
The surface code is one of the most successful approaches to topological quantum error-correction. It boasts the smallest known syndrome extraction circuits and correspondingly largest thresholds. Defect-based logical encodings of a new variety called twists have made it possible to implement the full Clifford group without state distillation. Here we investigate a patch-based encoding involving a modified twist. In our modified formulation, the resulting codes, called triangle codes for the shape of their planar layout, have only weight-four checks and relatively simple syndrome extraction circuits that maintain a high, near surface-code-level threshold. They also use 25% fewer physical qubits per logical qubit than the surface code. Moreover, benefiting from the twist, we can implement all Clifford gates by lattice surgery without the need for state distillation. By a surgical transformation to the surface code, we also develop a scheme of doing all Clifford gates on surface code patches in an atypical planar layout, though with less qubit efficiency than the triangle code. Finally, we remark that logical qubits encoded in triangle codes are naturally amenable to logical tomography, and the smallest triangle code can demonstrate high-pseudothreshold fault-tolerance to depolarizing noise using just 13 physical qubits.
3

Steiner, Erich. "Ideational grammatical metaphor." Languages in Contrast 4, no. 1 (April 14, 2004): 137–64. http://dx.doi.org/10.1075/lic.4.1.07ste.

Full text
Abstract:
In this paper I want to explore the systemic-functional notion of ‘grammatical metaphor’ from a cross-linguistic perspective. After a brief introduction to the concept of ‘grammatical metaphor’, I shall discuss the distinction between ‘congruent’ and ‘metaphorical’ encodings of meaning, as well as the distinction between rankshift, transcategorization, and grammatical metaphor as semogenic resources (Section 1). In a second section, I shall then focus on ideational grammatical metaphors in English and German and revisit the notion of direct vs. indirect mapping of experiential and logical semantics onto lexicogrammar (Section 2). It will be argued that ‘directness of encoding’ within one language can be defined with the help of the concept of ‘transparency’ or ‘motivation’ of encoding between levels. Across and between languages, however, the notion of ‘directness’ either has to be seen from the perspective of one of the languages involved, or from the perspective of a generalized semantics and grammar. In Section 3, I shall then explore the question of the experiential vs. logical encoding of semantic categories across languages, and of how this relates to metaphoricity. I shall exemplify and discuss the fact that in cross-linguistic analyses, one cannot consider any one of a given set of experiential or logical encodings of some unit of meaning as ‘congruent’ or ‘direct’, as long as one does not have a cross-linguistic semantics to establish ‘motivation’ and ‘transparentness’ on. It will also be argued that some of the differences in texts across languages as to what counts as ‘congruent’ can be predicted from comparisons between the language-specific grammatical systems involved. Other differences, however, seem to rely heavily on registerial influences and cultural factors. 
In Section 4, then, I shall inquire into the question of whether and precisely in what sense we can speak of two different types of grammatical metaphor, dependent on whether they involve a relocation in rank or a mere re-arrangement of mappings of semantic and lexicogrammatical functions. These types of metaphor, it will be argued, have different implications for the metaphoricity of the clause as a whole, as well as for the ‘density’ of the packaging of meaning.
4

Pal, Amit Kumar, Philipp Schindler, Alexander Erhard, Ángel Rivas, Miguel-Angel Martin-Delgado, Rainer Blatt, Thomas Monz, and Markus Müller. "Relaxation times do not capture logical qubit dynamics." Quantum 6 (January 24, 2022): 632. http://dx.doi.org/10.22331/q-2022-01-24-632.

Full text
Abstract:
Quantum error correction procedures have the potential to enable faithful operation of large-scale quantum computers. They protect information from environmental decoherence by storing it in logical qubits, built from ensembles of entangled physical qubits according to suitably tailored quantum error correcting encodings. To date, no generally accepted framework to characterise the behaviour of logical qubits as quantum memories has been developed. In this work, we show that generalisations of well-established figures of merit of physical qubits, such as relaxation times, to logical qubits fail and do not capture dynamics of logical qubits. We experimentally illustrate that, in particular, spatial noise correlations can give rise to rich and counter-intuitive dynamical behavior of logical qubits. We show that a suitable set of observables, formed by code space population and logical operators within the code space, allows one to track and characterize the dynamical behaviour of logical qubits. Awareness of these effects and the efficient characterisation tools used in this work will help to guide and benchmark experimental implementations of logical qubits.
5

Scala, Enrico, Miquel Ramírez, Patrik Haslum, and Sylvie Thiebaux. "Numeric Planning with Disjunctive Global Constraints via SMT." Proceedings of the International Conference on Automated Planning and Scheduling 26 (March 30, 2016): 276–84. http://dx.doi.org/10.1609/icaps.v26i1.13766.

Full text
Abstract:
This paper describes a novel encoding for sequential numeric planning into the problem of determining the satisfiability of a logical theory T. We introduce a novel technique, orthogonal to existing work aiming at producing more succinct encodings, that enables the theory solver to roll up an unbounded yet finite number of instances of an action into a single plan step, greatly reducing the horizon at which T models valid plans. The technique is then extended to deal with problems featuring disjunctive global constraints, in which the state space becomes a non-convex n-dimensional polytope. In order to empirically evaluate the encoding, we build a planner, SPRINGROLL, around a state-of-the-art off-the-shelf SMT solver. Experiments on a diverse set of domains are finally reported, and results show the generality and efficiency of the approach.
6

CAVE, ANDREW, and BRIGITTE PIENTKA. "Mechanizing proofs with logical relations – Kripke-style." Mathematical Structures in Computer Science 28, no. 9 (August 2, 2018): 1606–38. http://dx.doi.org/10.1017/s0960129518000154.

Full text
Abstract:
Proofs with logical relations play a key role to establish rich properties such as normalization or contextual equivalence. They are also challenging to mechanize. In this paper, we describe two case studies using the proof environment Beluga: first, we explain the mechanization of the weak normalization proof for the simply typed lambda-calculus; second, we outline how to mechanize the completeness proof of algorithmic equality for simply typed lambda-terms where we reason about logically equivalent terms. The development of these proofs in Beluga relies on three key ingredients: (1) we encode lambda-terms together with their typing rules, operational semantics, algorithmic and declarative equality using higher-order abstract syntax (HOAS), thereby avoiding the need to manipulate and deal with binders, renaming and substitutions; (2) we take advantage of Beluga's support for representing derivations that depend on assumptions and first-class contexts to directly state inductive properties such as logical relations and inductive proofs; (3) we exploit Beluga's rich equational theory for simultaneous substitutions; as a consequence, users do not need to establish and subsequently use substitution properties, and proofs are not cluttered with references to them. We believe these examples demonstrate that Beluga provides the right level of abstractions and primitives to mechanize challenging proofs using HOAS encodings. It also may serve as a valuable benchmark for other proof environments.
7

Dennis, Louise A., Martin Mose Bentzen, Felix Lindner, and Michael Fisher. "Verifiable Machine Ethics in Changing Contexts." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 13 (May 18, 2021): 11470–78. http://dx.doi.org/10.1609/aaai.v35i13.17366.

Full text
Abstract:
Many systems proposed for the implementation of ethical reasoning involve an encoding of user values as a set of rules or a model. We consider the question of how changes of context affect these encodings. We propose the use of a reasoning cycle, in which information about the ethical reasoner's context is imported in a logical form, and we propose that context-specific aspects of an ethical encoding be prefaced by a guard formula. This guard formula should evaluate to true when the reasoner is in the appropriate context and the relevant parts of the reasoner's rule set or model should be updated accordingly. This architecture allows techniques for the model-checking of agent-based autonomous systems to be used to verify that all contexts respect key stakeholder values. We implement this framework using the hybrid ethical reasoning agents system (HERA) and the model-checking agent programming languages (MCAPL) framework.
8

RABE, FLORIAN. "A logical framework combining model and proof theory." Mathematical Structures in Computer Science 23, no. 5 (March 1, 2013): 945–1001. http://dx.doi.org/10.1017/s0960129512000424.

Full text
Abstract:
Mathematical logic and computer science have driven the design of a growing number of logics and related formalisms such as set theories and type theories. In response to this population explosion, logical frameworks have been developed as formal meta-languages in which to represent, structure, relate and reason about logics. Research on logical frameworks has diverged into separate communities, often with conflicting backgrounds and philosophies. In particular, two of the most important logical frameworks are the framework of institutions, from the area of model theory based on category theory, and the Edinburgh Logical Framework LF, from the area of proof theory based on dependent type theory. Even though their ultimate motivations overlap – for example in applications to software verification – they have fundamentally different perspectives on logic. In the current paper, we design a logical framework that integrates the frameworks of institutions and LF in a way that combines their complementary advantages while retaining the elegance of each of them. In particular, our framework takes a balanced approach between model theory and proof theory, and permits the representation of logics in a way that comprises all major ingredients of a logic: syntax, models, satisfaction, judgments and proofs. This provides a theoretical basis for the systematic study of logics in a comprehensive logical framework. Our framework has been applied to obtain a large library of structured and machine-verified encodings of logics and logic translations.
9

Locher, David F., Lorenzo Cardarelli, and Markus Müller. "Quantum Error Correction with Quantum Autoencoders." Quantum 7 (March 9, 2023): 942. http://dx.doi.org/10.22331/q-2023-03-09-942.

Full text
Abstract:
Active quantum error correction is a central ingredient to achieve robust quantum processors. In this paper we investigate the potential of quantum machine learning for quantum error correction in a quantum memory. Specifically, we demonstrate how quantum neural networks, in the form of quantum autoencoders, can be trained to learn optimal strategies for active detection and correction of errors, including spatially correlated computational errors as well as qubit losses. We highlight that the denoising capabilities of quantum autoencoders are not limited to the protection of specific states but extend to the entire logical codespace. We also show that quantum neural networks can be used to discover new logical encodings that are optimally adapted to the underlying noise. Moreover, we find that, even in the presence of moderate noise in the quantum autoencoders themselves, they may still be successfully used to perform beneficial quantum error correction and thereby extend the lifetime of a logical qubit.
10

Hardie, Andrew. "From legacy encodings to Unicode: the graphical and logical principles in the scripts of South Asia." Language Resources and Evaluation 41, no. 1 (April 4, 2007): 1–25. http://dx.doi.org/10.1007/s10579-006-9003-7.

Full text

Dissertations / Theses on the topic "Logical encodings"

1

Dubois, De Prisque Louise. "Prétraitement compositionnel en Coq." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG040.

Full text
Abstract:
This thesis presents a preprocessing methodology aimed at transforming certain statements from the Coq proof assistant's logic into first-order logic statements, in order to send them to automatic provers, in particular SMT solvers. This methodology involves composing small, independent, and certifying transformations, taking the form of Coq tactics. An implementation of this methodology is provided in a plugin called Sniper, which offers a "push-button" automation tactic. Furthermore, a logical transformation scheduler (called Orchestrator) allows adding one's own logical transformations and determines which transformations apply depending on the proof to be carried out.
2

Sheridan, Daniel. "Temporal logic encodings for SAT-based bounded model checking." Thesis, University of Edinburgh, 2006. http://hdl.handle.net/1842/1467.

Full text
Abstract:
Since its introduction in 1999, bounded model checking (BMC) has quickly become a serious and indispensable tool for the formal verification of hardware designs and, more recently, software. By leveraging propositional satisfiability (SAT) solvers, BMC overcomes some of the shortcomings of more conventional model checking methods. In model checking we automatically verify whether a state transition system (STS) describing a design has some property, commonly expressed in linear temporal logic (LTL). BMC is the restriction to only checking the looping and non-looping runs of the system that have bounded descriptions. The conventional BMC approach is to translate the STS runs and LTL formulae into propositional logic and then conjunctive normal form (CNF). This CNF expression is then checked by a SAT solver. In this thesis we study the effect on the performance of BMC of changing the translation to propositional logic. One novelty is to use a normal form for LTL which originates in resolution theorem provers. We introduce the normal form conversion early on in the encoding process and examine the simplifications that it brings to the generation of propositional logic. We further enhance the encoding by specialising the normal form to take advantage of the types of runs peculiar to BMC. We also improve the conversion from propositional logic to CNF. We investigate the behaviour of the new encodings by a series of detailed experimental comparisons using both hand-crafted and industrial benchmarks from a variety of sources. These reveal that the new normal form based encodings can reduce the solving time by a half in most cases, and up to an order of magnitude in some cases, the size of the improvement corresponding to the complexity of the LTL expression. We also compare our method to the popular automata-based methods for model checking and BMC.
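As a rough illustration of the CNF conversion this thesis studies (a textbook Tseitin-style clause set for a single AND gate, not the thesis's own encoding; all names here are hypothetical), one can generate the clauses and check them by brute force:

```python
from itertools import product

def tseitin_and(out, a, b):
    """Clauses asserting out <-> (a AND b), in DIMACS convention:
    positive integers are variables, negative integers their negations."""
    return [[-out, a], [-out, b], [out, -a, -b]]

def satisfies(clauses, assignment):
    """True if every clause contains at least one literal made true."""
    return all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses)

# Brute-force check: the clauses hold exactly when variable 3 equals (1 AND 2).
clauses = tseitin_and(3, 1, 2)
for a, b, o in product([False, True], repeat=3):
    assert satisfies(clauses, {1: a, 2: b, 3: o}) == (o == (a and b))
```

Chaining such per-gate clause sets is what keeps the propositional translation linear in circuit size rather than exponential.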
3

Malik, Usama (Computer Science & Engineering, Faculty of Engineering, UNSW). "Configuration encoding techniques for fast FPGA reconfiguration." Awarded by: University of New South Wales, School of Computer Science and Engineering, 2006. http://handle.unsw.edu.au/1959.4/26212.

Full text
Abstract:
This thesis examines the problem of reducing reconfiguration time of an island-style FPGA at its configuration memory level. The approach followed is to examine configuration encoding techniques in order to reduce the size of the bitstream that must be loaded onto the device to perform a reconfiguration. A detailed analysis of a set of benchmark circuits on various island-style FPGAs shows that a typical circuit randomly changes a small number of bits in the null or default configuration state of the device. This feature is exploited by developing efficient encoding schemes for configuration data. For a wide set of benchmark circuits on various FPGAs, it is shown that the proposed methods outperform all previous configuration compression methods and, depending upon the relative size of the circuit to the device, compress within 5% of the fundamental information theoretic limit. Moreover, it is shown that the corresponding decoders are simple to implement in hardware and scale well with device size and available configuration bandwidth. It is not unreasonable to expect that with little modification to existing FPGA configuration memory systems and acceptable increase in configuration power a 10-fold improvement in configuration delay could be achieved. The main contribution of this thesis is that it defines the limit of configuration compression for the FPGAs under consideration and develops practical methods of overcoming this reconfiguration bottleneck. The functional density of reconfigurable devices could thereby be enhanced and the range of potential applications reasonably expanded.
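The key observation above, that a typical circuit flips only a few bits of the device's default (null) configuration, suggests sparse encodings of the bitstream. The sketch below illustrates the general idea only (it is not the thesis's actual scheme): store just the positions of the changed bits.

```python
def sparse_encode(bitstream):
    """Record only the positions of 1-bits, i.e. the bits that differ
    from the all-zero null configuration state."""
    return [i for i, bit in enumerate(bitstream) if bit]

def sparse_decode(positions, length):
    """Rebuild the full bitstream from the changed-bit positions."""
    bits = [0] * length
    for i in positions:
        bits[i] = 1
    return bits

# A sparse circuit touches few configuration bits, so the position list
# is far smaller than the raw bitstream.
config = [0] * 1000
for i in (3, 97, 512):
    config[i] = 1
assert sparse_decode(sparse_encode(config), len(config)) == config
```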
4

Hamdaoui, Yann. "Concurrency, references and linear logic." Thesis, Sorbonne Paris Cité, 2018. http://www.theses.fr/2018USPCC190/document.

Full text
Abstract:
The topic of this thesis is the study of the encoding of references and concurrency in Linear Logic. Our perspective is to demonstrate the capability of Linear Logic to encode side-effects, to make it a viable, formalized and well-studied compilation target for functional languages in the future. The key notion we develop is that of routing areas: a family of proof nets which correspond to a fragment of differential linear logic and which implement communication primitives. We develop routing areas as a parametrizable device and study their theory. We then illustrate their expressivity by translating a concurrent λ-calculus featuring concurrency, references and replication to a fragment of differential nets. To this purpose, we introduce a language akin to Amadio's concurrent λ-calculus, but with explicit substitutions for both variables and references. We endow this language with a type and effect system and we prove termination of well-typed terms by a mix of reducibility and a new interactive technique. This intermediate language allows us to prove a simulation and an adequacy theorem for the translation.
5

Karmarkar, Kedar Madhav. "SCALABLE BUS ENCODING FOR ERROR-RESILIENT HIGH-SPEED ON-CHIP COMMUNICATION." OpenSIUC, 2013. https://opensiuc.lib.siu.edu/dissertations/720.

Full text
Abstract:
Shrinking minimum feature size in deep sub-micron has made fabrication of progressively faster devices possible. The performance of interconnects has been a bottleneck in determining the overall performance of a chip. A reliable high-speed communication technique is necessary to improve the performance of on-chip communication. Recent publications have demonstrated that use of multiple threshold voltages improves the performance of a bus significantly. The multi-threshold capture mechanism takes advantage of the predictable temporal behavior of a tightly coupled bus to predict the next state of the bus early. However, use of multiple threshold voltages also reduces the voltage slack and consequently increases the susceptibility to noise. Reduction in supply voltage exacerbates the situation. This work proposes a novel error detection and correction encoding technique that takes advantage of the high performance of the multi-threshold capture mechanism as well as its inbuilt redundancy to achieve reliable high-speed communication while introducing considerably less redundancy than conventional methods. The proposed technique utilizes graph-based algorithms to produce a set of valid code words. The algorithm takes advantage of implicit set operations using binary decision diagrams to improve the scalability of the code word selection process. The code words of many crosstalk avoidance codes, including the proposed error detection and correction technique, exhibit highly structured behavior. The sets of larger valid code words can be recursively formed from the sets of smaller valid code words. This work also presents a generalized framework for scalable on-chip code word generation. The proposed CODEC implementation strategy uses a structured graph to model the recursive nature of an encoding technique, which facilitates scalable CODEC implementation. The non-enumerative nature of the implementation strategy makes it highly scalable. The modular nature of the CODEC also simplifies use of a pipelined architecture, thereby improving the throughput of the bus.
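As a toy illustration of the code-word selection problem described in this entry, the sketch below enumerates the words of a generic forbidden-pattern crosstalk-avoidance code (a standard example from the literature, not the dissertation's multi-threshold scheme): bus words containing the '010' or '101' bit patterns, which can induce worst-case coupling on the middle wire, are excluded.

```python
from itertools import product

def crosstalk_free_words(width):
    """Enumerate bus words of the given width that avoid the '010' and
    '101' bit patterns, i.e. no wire carries the opposite value of both
    of its neighbors."""
    words = []
    for bits in product('01', repeat=width):
        word = ''.join(bits)
        if '010' not in word and '101' not in word:
            words.append(word)
    return words

# A 4-bit bus keeps 10 of the 16 possible words under this constraint.
assert len(crosstalk_free_words(4)) == 10
```

Real schemes replace this brute-force enumeration with the kind of recursive, non-enumerative construction the abstract describes, since the valid sets grow structurally with bus width.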
6

Johnson, Justin Scott, and Martha Cecilia Escobar. "Initially held hypothesis does not affect encoding of event frequencies in contingency based causal judgment." Auburn, Ala., 2009. http://hdl.handle.net/10415/1948.

Full text
7

Yuan, Zeying. "Sequential Equivalence Checking of Circuits with Different State Encodings by Pruning Simulation-based Multi-Node Invariants." Thesis, Virginia Tech, 2015. http://hdl.handle.net/10919/56693.

Full text
Abstract:
Verification is an important step in Integrated Circuit (IC) design. In fact, the literature has reported that up to 70% of the design effort is spent on checking whether the design is functionally correct. One of the core verification tasks is Equivalence Checking (EC), which attempts to check whether two structurally different designs are functionally equivalent for all reachable states. Powerful equivalence checking can also provide opportunities for more aggressive logic optimizations, meeting different goals such as smaller area, better performance, etc. The success of Combinational Equivalence Checking (CEC) has laid a foundation for industry-level combinational logic synthesis and optimization. However, Sequential Equivalence Checking (SEC) still faces many challenges, especially for complex circuits that have different state encodings and few internal signal equivalences. In this thesis, we propose a novel simulation-based multi-node inductive invariant generation and pruning technique to check the equivalence of sequential circuits that have different state encodings and very few equivalent signals between them. After first grouping flip-flops into smaller subsets to make the approach scalable for large designs, we propose a constrained logic synthesis technique to prune potential multi-node invariants without inadvertently losing important constraints. Our pruning technique guarantees the same conclusion for different instances (proving SEC or not) compared to previous approaches, in which merging such potential invariants might lose important relations if the merged relation does not turn out to be a true invariant. Experimental results show that the smaller invariant set can be very effective for sequential equivalence checking of such hard SEC instances. Our approach is up to 20x faster than previous mining-based methods for larger circuits.
Master of Science
8

Mailly, Jean-Guy. "Dynamics of argumentation frameworks." Thesis, Artois, 2015. http://www.theses.fr/2015ARTO0402/document.

Full text
Abstract:
This thesis tackles the problem of integrating a new piece of information in an abstract argumentation framework. Such a framework is a directed graph whose nodes represent the arguments, and whose directed edges represent the attacks between arguments. There are different ways to decide which arguments are accepted by the agent who uses such a framework to represent her beliefs. An agent may be confronted with a piece of information such that "this argument should be accepted", which is in contradiction with her current beliefs, represented by her argumentation framework. In this thesis, we have studied several approaches to incorporate a piece of information in an argumentation framework. Our first contribution is an adaptation of the AGM framework for belief revision, which has been developed for characterizing the incorporation of a new piece of information when the agent's beliefs are represented in a logical setting. We have adapted the rationality postulates from the AGM framework to characterize the revision operators suited to argumentation frameworks, and we have identified several ways to generate the argumentation frameworks resulting from the revision. We have also shown how to use AGM revision as a tool for revising argumentation frameworks. Our approach uses a logical encoding of the argumentation framework to take advantage of the classical revision operators for deriving the expected result. At last, we have studied the problem of enforcing a set of arguments (how to change an argumentation framework so that a given set of arguments becomes an extension). We have developed a new family of operators which guarantee the success of the enforcement process, contrary to the existing approaches, and we have shown that a translation of our approaches into satisfaction and optimization problems makes it possible to develop efficient tools for computing the result of the enforcement.
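To make the abstract-argumentation setting concrete, here is a small brute-force sketch (illustrative only; the thesis encodes such problems into Boolean satisfaction and optimization rather than enumerating subsets) that computes the stable extensions of a framework, i.e. the conflict-free sets attacking every outside argument:

```python
from itertools import combinations

def stable_extensions(args, attacks):
    """Return the stable extensions of the framework (args, attacks):
    sets S with no internal attacks that attack every argument outside S."""
    extensions = []
    for r in range(len(args) + 1):
        for subset in combinations(args, r):
            s = set(subset)
            conflict_free = not any((x, y) in attacks for x in s for y in s)
            attacks_rest = all(any((x, y) in attacks for x in s)
                               for y in set(args) - s)
            if conflict_free and attacks_rest:
                extensions.append(s)
    return extensions

# 'a' and 'b' attack each other, and both attack 'c': two stable extensions.
exts = stable_extensions(['a', 'b', 'c'],
                         {('a', 'b'), ('b', 'a'), ('a', 'c'), ('b', 'c')})
assert exts == [{'a'}, {'b'}]
```

Enforcing, in the thesis's sense, asks how to modify the attack relation so that a desired set appears among these extensions.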
APA, Harvard, Vancouver, ISO, and other styles
9

Abrahamsson, Olle. "A Gröbner basis algorithm for fast encoding of Reed-Müller codes." Thesis, Linköpings universitet, Matematik och tillämpad matematik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-132429.

Full text
Abstract:
In this thesis the relationship between Gröbner bases and algebraic coding theory is investigated, with particular attention to applications to linear codes, using Reed-Müller codes as an illustrative example. We prove that each linear code can be described as a binomial ideal of a polynomial ring, and that a systematic encoding algorithm for such codes is given by the remainder of the information word computed with respect to the reduced Gröbner basis. Finally, we show how to apply the representation of a code by its corresponding polynomial ring ideal to construct a class of codes containing the so-called primitive Reed-Müller codes, and give a few examples of this result.
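For context on the codes discussed above, a first-order Reed-Müller code can be encoded directly from its generator matrix. The sketch below uses plain matrix multiplication over GF(2) for RM(1,3), not the thesis's Gröbner-basis remainder method:

```python
# Generator matrix of the first-order Reed-Muller code RM(1,3): the rows
# evaluate the constant 1 and the coordinate functions x1, x2, x3 on {0,1}^3.
G = [
    [1, 1, 1, 1, 1, 1, 1, 1],  # 1
    [0, 0, 0, 0, 1, 1, 1, 1],  # x1
    [0, 0, 1, 1, 0, 0, 1, 1],  # x2
    [0, 1, 0, 1, 0, 1, 0, 1],  # x3
]

def rm_encode(info):
    """Encode a 4-bit message as an 8-bit RM(1,3) codeword over GF(2)."""
    return tuple(sum(b * g for b, g in zip(info, col)) % 2 for col in zip(*G))

# every RM(1,3) codeword has Hamming weight 0, 4, or 8
print(rm_encode([1, 0, 1, 1]))  # (1, 0, 0, 1, 1, 0, 0, 1)
```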
10

Sastrawan, Dewa Ayu Dwi Damaiyanti. "The Instagram News Logic : The Encoding and Decoding of News Credibility on Instagram in the COVID-19 Infodemic in Indonesia." Thesis, Uppsala universitet, Medier och kommunikation, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-446746.

Abstract:
During the COVID-19 pandemic, a trend of Instagram as a news source emerged in the Reuters Institute Digital News Report 2020 (Reuters, 2020). Instagram's visual character has made accessing news more feasible and convenient through a curated feed. Consequently, news producers are migrating to social media platforms, including Instagram, to serve news consumption needs. However, journalism on social media has been criticized for its lack of journalistic legitimacy, with media trust challenged by the sensationalism of news to drive engagement metrics. Moreover, the COVID-19 infodemic (WHO, 2020) has escalated concerns about news credibility due to the circulation of misinformation regarding the pandemic, which overwhelms citizens. Hence, the objective of this study is to analyze how a news outlet on Instagram maintains journalistic legitimacy and how Instagram users navigate information abundance in finding credible and trustworthy news. The analysis focuses on an Indonesian news outlet on Instagram called Narasi Newsroom, combining an interview with a representative of the news producer, a content analysis of their news content, and interviews with Indonesian Instagram users. The empirical findings illustrate how Narasi Newsroom can revive journalistic legitimacy through an innovative approach without diminishing journalism quality on Instagram. With its principle of educating the audience to understand the news beyond factual statements, Narasi Newsroom's strategy of riding the wave to serve audience needs upholds journalistic values through critical thinking and credible sources. By conducting the study through an updated encoding/decoding model (Hall, 1973; 2009), it was found that the social media logic (Dijck & Poell, 2013) can balance journalism practice on Instagram and encourage critical thinking among news consumers seeking trustworthy news.
In light of the post-truth era, media trust and news credibility may be challenged; however, they are not lost when journalism on social media takes accountability for serving news consumers' needs. Consequently, both news producers and consumers must take a critical stance to preserve trust in the news.

Book chapters on the topic "Logical encodings"

1

Smoryński, Craig. "Arithmetic Encoding." In Logical Number Theory I, 1–139. Berlin, Heidelberg: Springer Berlin Heidelberg, 1991. http://dx.doi.org/10.1007/978-3-642-75462-3_1.

2

Smoryński, Craig. "Diophantine Encoding." In Logical Number Theory I, 140–265. Berlin, Heidelberg: Springer Berlin Heidelberg, 1991. http://dx.doi.org/10.1007/978-3-642-75462-3_2.

3

Eisenhofer, Clemens, Ruba Alassaf, Michael Rawson, and Laura Kovács. "Non-Classical Logics in Satisfiability Modulo Theories." In Lecture Notes in Computer Science, 24–36. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-43513-3_2.

Abstract:
We show that tableau methods for satisfiability in non-classical logics can be supported naturally in SMT solving via the framework of user-propagators. By way of demonstration, we implement the description logic $$\mathcal{ALC}$$ in the Z3 SMT solver and show that working with user-propagators allows us to significantly outperform encodings to first-order logic with relatively little effort. We promote user-propagators for creating solvers for non-classical logics based on tableau calculi.
4

Seiffertt, John. "Encoding Code." In Digital Logic for Computing, 135–48. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-56839-3_10.

5

Sasao, Tsutomu. "Encoding Method." In Memory-Based Logic Synthesis, 41–54. New York, NY: Springer New York, 2011. http://dx.doi.org/10.1007/978-1-4419-8104-2_6.

6

Heuer, Jan, and Christoph Wernhard. "Synthesizing Strongly Equivalent Logic Programs: Beth Definability for Answer Set Programs via Craig Interpolation in First-Order Logic." In Automated Reasoning, 172–93. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-63498-7_11.

Abstract:
We show a projective Beth definability theorem for logic programs under the stable model semantics: for given programs P and Q and a vocabulary V (a set of predicates), the existence of a program R in V such that P ∪ R and P ∪ Q are strongly equivalent can be expressed as a first-order entailment. Moreover, our result is effective: a program R can be constructed from a Craig interpolant for this entailment, using a known first-order encoding for testing strong equivalence, which we apply in reverse to extract programs from formulas. As a further perspective, this allows transforming logic programs via transforming their first-order encodings. In a prototypical implementation, the Craig interpolation is performed by first-order provers based on clausal tableaux or resolution calculi. Our work shows how definability and interpolation, which underlie modern logic-based approaches to advanced tasks in knowledge representation, transfer to answer set programming.
7

Pompe, Uroš. "Efficient proof encoding." In Inductive Logic Programming, 299–314. Berlin, Heidelberg: Springer Berlin Heidelberg, 1997. http://dx.doi.org/10.1007/3-540-63494-0_62.

8

Schürmann, Carsten. "Recursion for Higher-Order Encodings." In Computer Science Logic, 585–99. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44802-0_41.

9

Cortadella, J., M. Kishinevsky, A. Kondratyev, L. Lavagno, and A. Yakovlev. "State Encoding." In Logic Synthesis for Asynchronous Controllers and Interfaces, 87–123. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/978-3-642-55989-1_5.

10

Saeedloei, Neda. "A Logical Encoding of Timed $$\pi$$-Calculus." In Logic-Based Program Synthesis and Transformation, 164–82. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-14125-1_10.


Conference papers on the topic "Logical encodings"

1

Bienvenu, Meghyn, and Camille Bourgaux. "Querying Inconsistent Prioritized Data with ORBITS: Algorithms, Implementation, and Experiments." In 19th International Conference on Principles of Knowledge Representation and Reasoning {KR-2022}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/kr.2022/54.

Abstract:
We investigate practical algorithms for inconsistency-tolerant query answering over prioritized knowledge bases, which consist of a logical theory, a set of facts, and a priority relation between conflicting facts. We consider three well-known semantics (AR, IAR and brave) based upon two notions of optimal repairs (Pareto and completion). Deciding whether a query answer holds under these semantics is (co)NP-complete in data complexity for a large class of logical theories, and SAT-based procedures have been devised for repair-based semantics when there is no priority relation, or the relation has a special structure. The present paper introduces the first SAT encodings for Pareto- and completion-optimal repairs w.r.t. general priority relations and proposes several ways of employing existing and new encodings to compute answers under (optimal) repair-based semantics, by exploiting different reasoning modes of SAT solvers. The comprehensive experimental evaluation of our implementation compares both (i) the impact of adopting semantics based on different kinds of repairs, and (ii) the relative performances of alternative procedures for the same semantics.
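The repair notions above can be seen in miniature by naive enumeration rather than the paper's SAT encodings, ignoring the priority relation (the function names are ours, for illustration):

```python
from itertools import combinations

def repairs(facts, conflicts):
    """Subset-maximal conflict-free subsets of the facts (the classical,
    priority-free repairs; the paper's encodings additionally handle
    Pareto- and completion-optimality w.r.t. a priority relation)."""
    maximal = []
    for r in range(len(facts), -1, -1):          # largest subsets first
        for s in combinations(sorted(facts), r):
            sset = set(s)
            if any(a in sset and b in sset for a, b in conflicts):
                continue                         # not conflict-free
            if not any(sset < m for m in maximal):
                maximal.append(sset)             # not contained in a repair
    return maximal

def iar_answers(facts, conflicts):
    """Facts entailed under IAR semantics: those present in every repair."""
    reps = repairs(facts, conflicts)
    return set.intersection(*reps) if reps else set()

facts = {"p", "q", "r"}
conflicts = [("p", "q")]              # p and q cannot hold together
print(repairs(facts, conflicts))      # the two maximal conflict-free sets
print(iar_answers(facts, conflicts))  # only r survives in every repair
```

A fact holds under brave semantics if it belongs to some repair, and under AR semantics if every repair entails it; the SAT encodings make these checks feasible at scale.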
2

Roux, Johannes D., and F. Wilhelm Leuschner. "Polarization-based optical computing using liquid crystals." In Optical Computing. Washington, D.C.: Optica Publishing Group, 1989. http://dx.doi.org/10.1364/optcomp.1989.tui13.

Abstract:
Many architectures that perform digital optical logic have been proposed and built. Most of them use intensity-encoded logic where, for example, the presence of light would indicate logical true and the absence of light logical false. This way of encoding logic information has several disadvantages, e.g. light being irretrievably lost when switching from light ON to light OFF.
3

Feyzbakhsh Rankooh, Masood, and Tomi Janhunen. "Capturing (Optimal) Relaxed Plans with Stable and Supported Models of Logic Programs (Extended Abstract)." In Thirty-Third International Joint Conference on Artificial Intelligence {IJCAI-24}. California: International Joint Conferences on Artificial Intelligence Organization, 2024. http://dx.doi.org/10.24963/ijcai.2024/930.

Abstract:
We establish a novel relation between delete-free planning, an important task for the AI Planning community also known as relaxed planning, and logic programming. We show that, given a planning problem, all subsets of actions that could be ordered to produce relaxed plans for the problem can be bijectively captured with stable models of a logic program describing the corresponding relaxed planning problem. We also consider the supported model semantics of logic programs, and introduce one causal and one diagnostic encoding of the relaxed planning problem as logic programs, both capturing relaxed plans with their supported models. Our experimental results show that these new encodings can provide major performance gains when computing optimal relaxed plans.
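A delete-free task of the kind described above can be solved greedily in a few lines. The sketch below is our own non-optimal illustration; the paper's contribution is capturing optimal relaxed plans with stable and supported models:

```python
def relaxed_plan(init, goal, actions):
    """Greedy plan extraction for a delete-free (relaxed) planning task.
    actions maps an action name to (preconditions, add_effects), both sets.
    Returns a list of action names reaching the goal, or None."""
    state, plan = set(init), []
    while not goal <= state:
        # applicable = preconditions satisfied and something new to add
        applicable = [name for name, (pre, add) in actions.items()
                      if pre <= state and not add <= state]
        if not applicable:
            return None   # goal unreachable even without delete effects
        chosen = applicable[0]
        plan.append(chosen)
        state |= actions[chosen][1]   # delete effects are ignored
    return plan

actions = {
    "pick": ({"at_table"}, {"holding"}),
    "move": ({"holding"}, {"at_goal"}),
}
print(relaxed_plan({"at_table"}, {"at_goal"}, actions))  # ['pick', 'move']
```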
4

de Haan, Ronald, and Marija Slavkovik. "Answer Set Programming for Judgment Aggregation." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/231.

Abstract:
Judgment aggregation (JA) studies how to aggregate truth valuations on logically related issues. Computing the outcome of aggregation procedures is notoriously computationally hard, which is the likely reason that no implementation of them exists as of yet. However, even hard problems sometimes need to be solved. The worst-case computational complexity of answer set programming (ASP) matches that of most problems in judgment aggregation. We take advantage of this and propose a natural and modular encoding of various judgment aggregation procedures and related problems in JA into ASP. With these encodings, we achieve two results: (1) paving the way towards constructing a wide range of new benchmark instances (from JA) for answer set solving algorithms; and (2) providing an automated tool for researchers in the area of judgment aggregation.
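The kind of difficulty these encodings address shows up already in the classic doctrinal paradox, where issue-wise majority yields a logically inconsistent outcome. This toy illustration is ours, not the paper's ASP encoding:

```python
def majority_outcome(profile):
    """Issue-wise majority: accept an issue iff more than half the
    judges accept it. The result may be logically inconsistent, which
    is why more refined aggregation procedures (and their encodings
    into ASP) are needed."""
    n = len(profile)
    issues = profile[0].keys()
    return {i: sum(judge[i] for judge in profile) * 2 > n for i in issues}

# doctrinal paradox on issues p, q and their conjunction p_and_q:
# each individual judgment set is consistent, the majority outcome is not
profile = [
    {"p": 1, "q": 1, "p_and_q": 1},
    {"p": 1, "q": 0, "p_and_q": 0},
    {"p": 0, "q": 1, "p_and_q": 0},
]
print(majority_outcome(profile))  # {'p': True, 'q': True, 'p_and_q': False}
```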
5

Wagner, Kelvin, Robert T. Weverka, and Demetri Psaltis. "Global Communication, Accuracy and Optical Threshold Device Tolerances in Digital Optical Matrix Multipliers." In Optical Bistability. Washington, D.C.: Optica Publishing Group, 1985. http://dx.doi.org/10.1364/obi.1985.ma5.

Abstract:
Analog optical array processors are limited in dynamic range and accuracy to about 1000:1. To circumvent this limitation, residue, binary, or other radix encoding schemes can be employed. Invariably, nonlinear logical operations must be performed on the input data streams in order to perform operations such as additions and multiplications on the encoded data samples. The standard approach used in electronic systems is to perform pairwise locally connected bit operations sequentially in a tree or array configuration of adder or multiplier primitives, each formed out of Boolean switching devices. This approach is dictated by the local wiring constraints inherent to planar technologies, and the availability of only low-dimensionality switching primitives (i.e., logic circuits which have very few inputs).
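Residue encoding, mentioned above as one way around the limited analog dynamic range, decomposes arithmetic into small independent channels with no carries between them. A numeric sketch (the moduli are chosen arbitrarily for illustration):

```python
from math import prod

MODULI = (5, 7, 8)                  # pairwise coprime; dynamic range 5*7*8 = 280

def to_residues(x):
    return tuple(x % m for m in MODULI)

def rns_mul(a, b):
    # multiplication acts independently on each residue channel: no carry
    # propagates between channels, which suits parallel (optical) hardware
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_residues(r):
    # reconstruct the integer via the Chinese Remainder Theorem
    M = prod(MODULI)
    return sum(ri * (M // mi) * pow(M // mi, -1, mi)
               for ri, mi in zip(r, MODULI)) % M

print(from_residues(rns_mul(to_residues(13), to_residues(17))))  # 221
```

Note that `pow(x, -1, m)` (modular inverse) requires Python 3.8 or later.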
6

Ji, Zhang, Liu Weiwei, Zhong Licheng, and Gou Yili. "Optical Space-variant Logic-gate Using a New Hybrid BSO Spatial Light Modulator." In Optical Computing. Washington, D.C.: Optica Publishing Group, 1989. http://dx.doi.org/10.1364/optcomp.1989.mg3.

Abstract:
A novel method based on a spatial encoding technique has been advanced by T. Yatagai. In this method, multiple-instruction multiple-data (MIMD) operation is simply realizable in parallel by varying the decoding mask, but the method for encoding the input pattern poses a problem in practical application. One solution is to use a hybrid system, in which encoding can be done with an electronic computer. The other is to use a new hybrid BSO SLM, which can encode the input binary pattern optically. The hybrid BSO SLM can be used not only for encoding but also in neural logical processing. In further research, we will use it to enhance pattern edges and to quantize patterns.
7

Wan, Lingxiao, Hui Zhang, Huihui Zhu, Leong Chuan Kwek, and Ai-Qun Liu. "Quantum Computing Chip with Error-Correction Encoding." In CLEO: QELS_Fundamental Science. Washington, D.C.: Optica Publishing Group, 2022. http://dx.doi.org/10.1364/cleo_qels.2022.ff2i.5.

Abstract:
We design and fabricate a quantum photonic circuit to implement a quantum error correction code. A single logical qubit is encoded with 4 physical qubits to demonstrate its capability of detecting and correcting a single-bit error with an average state fidelity of 86%. We further extend the scheme to demonstrate a fault-tolerant teleportation process.
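The redundancy principle behind such codes (correcting an error on the encoded data instead of reading the logical value directly) has a classical forerunner in the repetition code. The sketch below is that classical analogue only; the paper's 4-qubit code is genuinely quantum and additionally supports fault-tolerant teleportation:

```python
def encode(bit):
    return [bit] * 3              # one logical bit -> three physical bits

def correct(word):
    # majority vote detects and corrects any single bit flip
    return max(set(word), key=word.count)

codeword = encode(1)
codeword[0] ^= 1                  # inject a single-bit error
print(correct(codeword))          # 1
```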
8

Thomae, D. A., and D. E. Van den Bout. "Encoding logical constraints into neural network cost functions." In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137943.

9

Claudette, Cayrol, and Lagasquie-Schiex Marie-Christine. "Logical Encoding of Argumentation Frameworks with Higher-Order Attacks." In 2018 IEEE 30th International Conference on Tools with Artificial Intelligence (ICTAI). IEEE, 2018. http://dx.doi.org/10.1109/ictai.2018.00106.

10

Bednarczyk, Bartosz, and Sebastian Rudolph. "Worst-Case Optimal Querying of Very Expressive Description Logics with Path Expressions and Succinct Counting." In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/212.

Abstract:
Among the most expressive knowledge representation formalisms are the description logics of the Z family. For well-behaved fragments of ZOIQ, entailment of positive two-way regular path queries is well known to be 2EXPTIME-complete under the proviso of unary encoding of numbers in cardinality constraints. We show that this assumption can be dropped without an increase in complexity and EXPTIME-completeness can be achieved when bounding the number of query atoms, using a novel reduction from query entailment to knowledge base satisfiability. These findings allow to strengthen other results regarding query entailment and query containment problems in very expressive description logics. Our results also carry over to GC2, the two-variable guarded fragment of first-order logic with counting quantifiers, for which hitherto only conjunctive query entailment has been investigated.

Reports on the topic "Logical encodings"

1

Baader, Franz, and Barbara Morawska. SAT Encoding of Unification in EL. Technische Universität Dresden, 2010. http://dx.doi.org/10.25368/2022.177.

Abstract:
The Description Logic EL is an inexpressive knowledge representation language, which nevertheless has recently drawn considerable attention in the knowledge representation and the ontology community since, on the one hand, important inference problems such as the subsumption problem are polynomial. On the other hand, EL is used to define large biomedical ontologies. Unification in Description Logics has been proposed as a novel inference service that can, for example, be used to detect redundancies in ontologies. In a recent paper, we have shown that unification in EL is NP-complete, and thus of a complexity that is considerably lower than in other Description Logics of comparably restricted expressive power. In this paper, we introduce a new NP-algorithm for solving the unification problem in EL, which is based on a reduction to satisfiability in propositional logic (SAT). The advantage of this new algorithm is, on the one hand, that it allows us to employ highly optimized state-of-the-art SAT solvers when implementing an EL-unification algorithm. On the other hand, this reduction provides us with a proof of the fact that EL-unification is in NP that is much simpler than the one given in our previous paper on EL-unification.
2

Baader, Franz, Stefan Borgwardt, and Barbara Morawska. SAT Encoding of Unification in ELHR+ w.r.t. Cycle-Restricted Ontologies. Technische Universität Dresden, 2012. http://dx.doi.org/10.25368/2022.186.

Abstract:
Unification in Description Logics has been proposed as an inference service that can, for example, be used to detect redundancies in ontologies. For the Description Logic EL, which is used to define several large biomedical ontologies, unification is NP-complete. An NP unification algorithm for EL based on a translation into propositional satisfiability (SAT) has recently been presented. In this report, we extend this SAT encoding in two directions: on the one hand, we add general concept inclusion axioms, and on the other hand, we add role hierarchies (H) and transitive roles (R+). For the translation to be complete, however, the ontology needs to satisfy a certain cycle restriction. The SAT translation depends on a new rewriting-based characterization of subsumption w.r.t. ELHR+-ontologies.
3

Ruff, Grigory, and Tatyana Sidorina. THE DEVELOPMENT MODEL OF ENGINEERING CREATIVITY IN STUDENTS OF MILITARY INSTITUTIONS. Science and Innovation Center Publishing House, December 2020. http://dx.doi.org/10.12731/model_of_engineering_creativity.

Abstract:
The troops of the National Guard of the Russian Federation are equipped with modern weapons, special equipment, and informatization tools, and engineering systems incorporating artificial intelligence are being developed, which raises the requirements for the quality of professional training of future officers. The increasing complexity of military professional activities, the avalanche-like growth of information, and the need to develop the ability to quickly and accurately make and implement both established and original engineering solutions in an unpredictable military environment demonstrate that the most important tasks of modern higher education are not only providing graduates with a system of fundamental and special knowledge and skills, but also developing their professional independence; this has led to the inclusion of engineering and creative potential in the list of professionally important qualities of an officer. To this end, the educational material is made compact and intensely clear through cognitive visualization, condensing educational knowledge by means of encoding, consolidation, and structuring. The principle of cognitive visualization stems from psychological laws according to which the efficiency of assimilation increases when visual aids in training serve not only an illustrative but also a cognitive function; this leads to the active involvement of both the left and right hemispheres of the student in the process of assimilating information, based on the use of logical and semantic modeling, which contributes to the development of engineering and creative potential.
