Dissertations / Theses on the topic 'Rewriting techniques'
Consult the top 20 dissertations / theses for your research on the topic 'Rewriting techniques.'
Sapiña Sanchis, Julia. "Rewriting Logic Techniques for Program Analysis and Optimization." Doctoral thesis, Universitat Politècnica de València, 2018. http://hdl.handle.net/10251/94044.
This thesis proposes a dynamic analysis methodology for improving the diagnosis of erroneous Maude programs. The key idea is to combine runtime assertion checking and dynamic trace slicing for automatically catching errors at runtime while reducing the size and complexity of the erroneous traces to be analyzed (i.e., those leading to states that fail to satisfy the assertions). In the event of an assertion violation, the slicing criterion is automatically inferred, which helps the user rapidly pinpoint the source of the error. First, a technique is formalized that aims at automatically detecting anomalous deviations from the intended program behavior (error symptoms) by using assertions that are checked at runtime. This technique supports two types of user-defined assertions: functional assertions (which constrain deterministic function calls) and system assertions (which specify system state invariants). The proposed dynamic checking is provably sound in the sense that all flagged errors definitely signal a violation of the specifications. Then, upon an assertion violation, accurate trace slices (i.e., simplified yet precise execution traces) are generated automatically, which help identify the cause of the error. Moreover, the technique also suggests a possible repair for the rules involved in the generation of the erroneous states. The proposed methodology is based on (i) a logical notation for specifying assertions that are imposed on execution runs; (ii) a runtime checking technique that dynamically tests the assertions; and (iii) a mechanism based on (equational) least general generalization that automatically derives accurate slicing criteria from falsified assertions. Finally, an implementation of the proposed technique is presented in the assertion-based, dynamic analyzer ABETS, which shows how the forward and backward tracking of asserted program properties leads to a thorough trace analysis algorithm that can be used for program diagnosis and debugging.
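To make the combination of runtime assertion checking and trace slicing concrete, here is a minimal Python sketch of the general idea. It is an illustrative toy, not the ABETS/Maude implementation; the rules, the invariant, and all identifiers are invented for this example. An invariant playing the role of a system assertion is checked after every rewrite step, and on a violation the trace is sliced down to the steps whose rule wrote a field that the assertion reads.

    from dataclasses import dataclass
    from typing import Callable, Dict, List, Set

    State = Dict[str, int]

    @dataclass
    class Rule:
        name: str
        writes: Set[str]                      # fields this rule may modify
        apply: Callable[[State], State]

    @dataclass
    class Step:
        rule: str
        before: State
        after: State

    def run_and_check(init: State, rules: List[Rule],
                      invariant: Callable[[State], bool],
                      reads: Set[str], max_rounds: int = 10) -> List[Step]:
        """Apply the rules round-robin and check the invariant (a 'system
        assertion') after every step; on a violation, return the trace slice
        containing only the steps whose rule wrote a field the assertion reads."""
        writes_of = {r.name: r.writes for r in rules}
        trace: List[Step] = []
        state = dict(init)
        for _ in range(max_rounds):
            for rule in rules:
                new_state = rule.apply(dict(state))
                trace.append(Step(rule.name, dict(state), dict(new_state)))
                state = new_state
                if not invariant(state):
                    return [s for s in trace if writes_of[s.rule] & reads]
        return []  # no violation observed

    if __name__ == "__main__":
        rules = [
            Rule("inc-x", {"x"}, lambda s: {**s, "x": s["x"] + 1}),
            Rule("dec-y", {"y"}, lambda s: {**s, "y": s["y"] - 1}),
        ]
        # hypothetical system assertion: y must stay non-negative (it reads only 'y')
        for step in run_and_check({"x": 0, "y": 2}, rules,
                                  invariant=lambda s: s["y"] >= 0, reads={"y"}):
            print(step.rule, step.before, "->", step.after)

Here the slicing criterion is just the read set of the violated assertion; the thesis derives it more precisely, via least general generalization over the falsified assertion.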
Sapiña Sanchis, J. (2017). Rewriting Logic Techniques for Program Analysis and Optimization [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/94044
Papadopoulos, George Angelos. "Parallel implementation of concurrent logic languages using graph rewriting techniques." Thesis, University of East Anglia, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.329340.
Buth, Karl-Heinz [Verfasser]. "Techniques for Modelling Structured Operational and Denotational Semantics Definitions with Term Rewriting Systems / Karl Heinz Buth." Kiel : Universitätsbibliothek Kiel, 1994. http://d-nb.info/1080332669/34.
Feliú Gabaldón, Marco Antonio. "Logic-based techniques for program analysis and specification synthesis." Doctoral thesis, Universitat Politècnica de València, 2013. http://hdl.handle.net/10251/33747.
Feliú Gabaldón, MA. (2013). Logic-based techniques for program analysis and specification synthesis [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/33747
Rusinowitch, Michaël. "Démonstration automatique par des techniques de réécritures." Nancy 1, 1987. http://www.theses.fr/1987NAN10358.
Full textKamat, Niranjan Ganesh. "Sampling-based Techniques for Interactive Exploration of Large Datasets." The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1523552932728325.
Karanasos, Konstantinos. "View-Based techniques for the efficient management of web data." PhD thesis, Université Paris Sud - Paris XI, 2012. http://tel.archives-ouvertes.fr/tel-00755328.
Beveraggi, Marc. "Problemes combinatoires en codage algebrique." Paris 6, 1987. http://www.theses.fr/1987PA066265.
Full textFerey, Gaspard. "Higher-Order Confluence and Universe Embedding in the Logical Framework." Electronic Thesis or Diss., université Paris-Saclay, 2021. http://www.theses.fr/2021UPASG032.
In the context of the multiplicity of formal systems, there is a growing need to express formal proofs in a common logical framework. This thesis focuses on the use of higher-order term rewriting to embed complex formal systems in the simple and well-studied lambda-Pi calculus modulo. This system, commonly used as a logical framework, features dependent types and is extended with higher-order term rewriting. We study, in a first part, criteria for the confluence of higher-order rewrite systems considered together with the usual beta reduction. In the case of left-linear systems, confluence can be reduced to the study of critical pairs, for which a decreasing diagram must be provided with respect to some rule labeling. We show that in the presence of non-linear rules, it is still possible to achieve confluence if the set of considered terms is layered. We then focus, in a second part, on the encoding of higher-order logics based on complex universe structures. The embedding of cumulativity, a limited form of subtyping, is handled with new rewriting techniques relying on private symbols and allowing some form of proof irrelevance. We then describe how algebraic universe expressions containing level variables can be represented, even in the presence of universe constraints. Finally, we introduce an embedding of universe polymorphism as defined in the core logic of the Coq system and prove the correctness of the defined translation mechanism. These results, along with other more practical techniques, allowed the implementation of a translator to Dedukti which was used to translate several sizeable Coq developments.
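As a very rough illustration of the critical pairs mentioned above, the following Python sketch computes the critical pair of two overlapping string-rewriting rules. It is a first-order toy analogue invented for this summary, not the higher-order machinery of the thesis; the two rules are hypothetical.

    def critical_pairs(rule1, rule2):
        """Overlaps where a suffix of rule1's left-hand side equals a prefix of
        rule2's left-hand side; each overlap yields a word (the peak) that can
        be rewritten in two different ways."""
        (l1, r1), (l2, r2) = rule1, rule2
        pairs = []
        for k in range(1, min(len(l1), len(l2))):
            if l1.endswith(l2[:k]):              # proper overlap of length k
                peak = l1 + l2[k:]               # smallest word containing both redexes
                left = r1 + l2[k:]               # contract the rule1 redex
                right = l1[:-k] + r2             # contract the rule2 redex
                pairs.append((peak, left, right))
        return pairs

    if __name__ == "__main__":
        # hypothetical rules: "aba" -> "c" and "ab" -> "d"
        for peak, left, right in critical_pairs(("aba", "c"), ("ab", "d")):
            print(f"{peak} rewrites to both {left} and {right}")

Confluence arguments then amount to showing that every such pair of results can be joined again; here "cb" and "abd" would need further rules to meet.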
Zighem, Ismail. "Etude d'invariants de graphes planaires." Université Joseph Fourier (Grenoble), 1998. http://www.theses.fr/1998GRE10211.
Full textSAKAI, Masahiko, and Keiichirou KUSAKARI. "Static Dependency Pair Method for Simply-Typed Term Rewriting and Related Technique." Institute of Electronics, Information and Communication Engineers, 2009. http://hdl.handle.net/2237/14975.
Dunn, Jennifer Erin. "Ambiguous and ambivalent signatures : rewriting, revision, and resistance in Emma Tennant's fiction." Thesis, University of Oxford, 2007. http://ora.ox.ac.uk/objects/uuid:6a4e8319-422a-48b9-8e43-cd05d742450f.
Pérution-Kihli, Guillaume. "Data Management in the Existential Rule Framework : Translation of Queries and Constraints." Electronic Thesis or Diss., Université de Montpellier (2022-....), 2023. http://www.theses.fr/2023UMONS030.
The general context of this work is the issue of designing high-quality systems that integrate multiple data sources via a semantic layer encoded in a knowledge representation and reasoning language. We consider knowledge-based data management (KBDM) systems, which are structured in three layers: the data layer, which comprises the data sources, the knowledge (or ontological) layer, and the mappings between the two. Mappings and knowledge are expressed within the existential rule framework. One of the intrinsic difficulties in designing a KBDM system is the need to understand the content of the data sources. Data sources are often provided with typical queries and constraints, from which valuable information about their semantics can be drawn, as long as this information is made intelligible to KBDM designers. This motivates our core question: is it possible to translate data queries and constraints at the knowledge level while preserving their semantics? The main contributions of this thesis are the following. We extend previous work on data-to-ontology query translation with new techniques for the computation of perfect, minimally complete, or maximally sound query translations. Concerning data-to-ontology constraint translation, we define a general framework and apply it to several classes of constraints. Finally, we provide a sound and complete query rewriting operator for disjunctive existential rules and disjunctive mappings, as well as undecidability results, which are of independent interest.
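A minimal Python sketch of one backward query-rewriting step, the basic operation behind rewriting operators of the kind mentioned above. It assumes a single non-disjunctive rule whose variables are renamed apart from the query; the rule, predicates, and all names are invented for illustration and are not taken from the thesis.

    def is_var(t):
        return isinstance(t, str) and t.startswith("?")

    def unify(a, b, subst=None):
        """Unify two atoms of the form (pred, arg1, ..., argN); return a
        substitution dict, or None if the atoms do not unify."""
        if subst is None:
            subst = {}
        if a[0] != b[0] or len(a) != len(b):
            return None
        for x, y in zip(a[1:], b[1:]):
            x, y = subst.get(x, x), subst.get(y, y)
            if x == y:
                continue
            if is_var(y):
                subst[y] = x
            elif is_var(x):
                subst[x] = y
            else:
                return None
        return subst

    def apply_subst(atom, subst):
        return (atom[0],) + tuple(subst.get(t, t) for t in atom[1:])

    def rewrite_step(query, rule_head, rule_body):
        """Replace each query atom that unifies with the rule head by the rule body."""
        rewritings = []
        for i, atom in enumerate(query):
            subst = unify(atom, rule_head)
            if subst is not None:
                rest = [apply_subst(a, subst) for a in query[:i] + query[i + 1:]]
                body = [apply_subst(a, subst) for a in rule_body]
                rewritings.append(rest + body)
        return rewritings

    if __name__ == "__main__":
        # hypothetical rule/mapping:  Researcher(x) <- Author(x, y), Thesis(y)
        head, body = ("Researcher", "?x"), [("Author", "?x", "?y"), ("Thesis", "?y")]
        query = [("Researcher", "?z"), ("WorksAt", "?z", "univ1")]
        for q in rewrite_step(query, head, body):
            print(q)

The actual operator studied in the thesis additionally handles existential variables and disjunction, which this toy step ignores.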
Yang, Bin. "Contribution to a kernel of symbolic asymptotic modeling software." Thesis, Besançon, 2014. http://www.theses.fr/2014BESA2055/document.
This thesis is dedicated to developing the kernel of a symbolic asymptotic modeling software package, MEMSALab, which will be used for the automatic generation of asymptotic models for arrays of micro- and nanosystems. Unlike traditional software packages aimed at numerical simulation using pre-built models, the purpose of MEMSALab is to derive asymptotic models for input equations by taking into account their own features. An approach called "by extension-combination" for asymptotic modeling, which allows an incremental model construction, is first proposed for the derivation of homogenization models. It relies on a combination of the asymptotic method used in the field of partial differential equations with term rewriting techniques coming from computer science. This approach focuses on model derivation for a family of PDEs instead of each of them separately. A homogenization model of the electrothermoelastic equation defined in a multi-layered thin domain has been derived by applying the mathematical method used in this approach. Finally, an optimization tool has been developed by combining the in-house optimization software package SIMBAD with COMSOL-MATLAB simulation, and it has been applied to the optimization of an SThM probe.
Fournial, Céline. "Imitation et création dans le 'théâtre moderne' (1550-1650) : la question des cycles d'inspiration." Thesis, Sorbonne université, 2019. http://www.theses.fr/2019SORUL012.
In the second half of the sixteenth century, modern French drama developed out of humanist reflection upon both past literature and the theory of imitation, considered a universal writing method by the scholars of the period. In the first half of the seventeenth century, especially from the 1620s on, the evolving terms and stakes of this reflection produced numerous debates about drama, a genre then full of experimentation and renovation. Under these circumstances, the choice of inspiration is not insignificant. The dramatists look not only for subjects but also for novel literary forms from foreign writers that could feed practice as much as theory. Studying one century of dramatic creation makes it possible to outline several cycles of inspiration within the history of tragicomedy, tragedy and comedy, and to record how these cycles match the main stages in the evolution of those three genres. Throughout, inventio and dramaturgy maintain close relations with each other. At a time when the central and most debated question is that of models, analyzing these cycles highlights the meaning and consequences of the use of adaptation and rewriting, and of the choice of ancient, Italian, Spanish or French inspirations. The concept of cycles makes it possible to understand sources of inspiration as a periodic phenomenon and to show how drama evolves in a singular way through imitation. In conclusion, studying the cyclic relation between French playwrights and their ancient and modern inspirations leads to an examination of modern French drama's European nature and of the circulation and transference of subjects and literary forms.
Santiago Pinazo, Sonia. "Advanced Features in Protocol Verification: Theory, Properties, and Efficiency in Maude-NPA." Doctoral thesis, Universitat Politècnica de València, 2015. http://hdl.handle.net/10251/48527.
Santiago Pinazo, S. (2015). Advanced Features in Protocol Verification: Theory, Properties, and Efficiency in Maude-NPA [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/48527
Mohammed, Shoeb Ahmed. "Coding Techniques for Error Correction and Rewriting in Flash Memories." 2010. http://hdl.handle.net/1969.1/ETD-TAMU-2010-08-8476.
Shoaran, Maryam. "Automata methods and techniques for graph-structured data." Thesis, 2011. http://hdl.handle.net/1828/3249.
Full textGraduate
Wang, Li-Wei (王立為). "Application of DAG-Aware MIG Rewriting Technique in Logic Synthesis and Verification." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/13593673123619241948.
Full text國立臺灣大學
Graduate Institute of Electronics Engineering
Academic year 104 (2015-2016)
A Majority-Inverter Graph (MIG) is a recently introduced logic representation form which manipulates logic by using only the 3-input majority function (MAJ) and the inversion function (INV). Its algebraic and Boolean properties enable efficient logic optimizations. In particular, MIG algorithms have obtained significantly superior synthesis results compared to state-of-the-art approaches based on AND-inverter graphs and commercial tools. In this thesis, we integrate the DAG-aware rewriting technique, a fast greedy algorithm for circuit compression, into MIGs and apply it not only to logic synthesis but also to verification. Experimental results on logic optimization show that heavily-optimized MIGs can be further reduced by 20.4% in network size while preserving depth. Experimental results on datapath verification also show the effectiveness of our algorithm. With our MIG rewriting applied, datapath analysis quality is improved by a factor of 3.16, and the runtime for equivalence checking is also effectively reduced.
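A hedged Python sketch of the underlying data structure, invented for this summary and not taken from the thesis: a tiny majority-inverter graph with structural hashing, one majority axiom used as a rewrite rule, and a truth-value evaluator. DAG-awareness here simply means that structurally identical nodes are shared, so a local rewrite never duplicates logic.

    class MIG:
        """Tiny majority-inverter graph with structural hashing (node sharing)."""

        def __init__(self):
            self.ids = {}    # structural key -> node id (this gives node sharing)
            self.defs = {}   # node id -> ("var", name) | ("not", a) | ("maj", a, b, c)

        def _mk(self, key):
            if key not in self.ids:
                self.ids[key] = len(self.defs)
                self.defs[self.ids[key]] = key
            return self.ids[key]

        def var(self, name):
            return self._mk(("var", name))

        def inv(self, a):
            return self._mk(("not", a))

        def maj(self, a, b, c):
            x, y, z = sorted((a, b, c))      # commutativity gives a canonical form
            if x == y:                       # rewrite rule M(u, u, v) = u
                return x
            if y == z:
                return y
            return self._mk(("maj", x, y, z))

        def eval(self, nid, env):
            kind, *args = self.defs[nid]
            if kind == "var":
                return env[args[0]]
            if kind == "not":
                return not self.eval(args[0], env)
            a, b, c = (self.eval(n, env) for n in args)
            return (a and b) or (b and c) or (a and c)   # 3-input majority

    if __name__ == "__main__":
        g = MIG()
        a, b, c = g.var("a"), g.var("b"), g.var("c")
        f1 = g.maj(a, b, c)
        f2 = g.maj(c, a, b)                   # same node is reused: f1 == f2
        print(f1 == f2, g.maj(a, a, c) == a)  # True True
        env = {"a": True, "b": False, "c": True}
        print(g.eval(f1, env), g.eval(g.inv(f1), env))   # True False

Real MIG optimization uses the full majority algebra and cost-aware (DAG-aware) node counting rather than this single rule.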
Chen, Hong-Chih (陳宏志). "Iterative Learning Control Technique Using G-code Rewriting Algorithm for Contour Control." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/k2ud6n.
Full text國立中正大學
Graduate Institute of Computer Science and Information Engineering
Academic year 106 (2017-2018)
Traditional iterative learning control (ILC) technology provides better position commands for machining. However, most commercial controllers cannot accept position commands to control the machining path directly, so it is hard to leverage self-developed ILC on these controllers. In this thesis, for the XY plane, we develop a G-code rewriting algorithm that controls the machining path to solve this issue. The proposed algorithm translates position commands into the corresponding G-code commands. In order to preserve the same machining time, the feed rate and the number of segmented G-code commands must be handled properly. We implement the proposed algorithm and integrate it into a customized, ILC-enabled LinuxCNC for evaluation. For the tested G-code files, the experimental results show that ILC with the proposed algorithm for contour control reaches a convergence state.
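To illustrate what such a rewriting can look like, here is a small Python sketch, hypothetical and not the thesis implementation or its LinuxCNC integration: ILC-corrected XY position commands sampled every dt seconds are rewritten into G01 segments whose feed rate is chosen so that each segment takes exactly dt, preserving the original machining time.

    import math

    def positions_to_gcode(points, dt):
        """points: [(x0, y0), (x1, y1), ...] in mm, sampled every dt seconds.
        Returns G-code lines; F is in mm/min so that each segment lasts dt."""
        x0, y0 = points[0]
        lines = ["G90", "G21", f"G00 X{x0:.4f} Y{y0:.4f}"]
        for (xa, ya), (xb, yb) in zip(points, points[1:]):
            seg = math.hypot(xb - xa, yb - ya)        # segment length in mm
            if seg == 0.0:
                lines.append(f"G04 P{dt:.4f}")        # dwell keeps the timing
                continue
            feed = seg / dt * 60.0                    # mm/s -> mm/min
            lines.append(f"G01 X{xb:.4f} Y{yb:.4f} F{feed:.2f}")
        return lines

    if __name__ == "__main__":
        pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (1.0, 1.0)]
        for line in positions_to_gcode(pts, dt=0.1):
            print(line)

As the abstract notes, the segment count and feed rate must be handled carefully in practice; this sketch ignores machine constraints such as maximum feed rate and acceleration.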