
Journal articles on the topic 'Linear logic; Functional programming; Nets'



Consult the top 16 journal articles for your research on the topic 'Linear logic; Functional programming; Nets.'



Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Perrier, G. "Concurrent programming as proof net construction." Mathematical Structures in Computer Science 8, no. 6 (December 1998): 681–710. http://dx.doi.org/10.1017/s0960129598002655.

Abstract:
We propose a concurrent process calculus, called Calcul Parallèle Logique (CPL), based on the paradigm of computation as proof net construction in linear logic. CPL uses a fragment of first-order intuitionistic linear logic where formulas represent processes and proof nets represent successful computations. In these computations, communication is expressed in an asynchronous way by means of axiom links. We define testing equivalences for processes, which are based on a concept of interface, and use the power of proof theory in linear logic.
2

Danos, Vincent, Jean-Baptiste Joinet, and Harold Schellinx. "A new deconstructive logic: linear logic." Journal of Symbolic Logic 62, no. 3 (September 1997): 755–807. http://dx.doi.org/10.2307/2275572.

Abstract:
The main concern of this paper is the design of a noetherian and confluent normalization for LK2 (that is, classical second order predicate logic presented as a sequent calculus). The method we present is powerful: since it allows us to recover as fragments formalisms as seemingly different as Girard's LC and Parigot's λμ, FD ([10, 12, 32, 36]), delineates other viable systems as well, and gives means to extend the Krivine/Leivant paradigm of ‘programming-with-proofs’ ([26, 27]) to classical logic; it is painless: since we reduce strong normalization and confluence to the same properties for linear logic (for non-additive proof nets, to be precise) using appropriate embeddings (so-called decorations); it is unifying: it organizes known solutions in a simple pattern that makes apparent the how and why of their making. A comparison of our method to that of embedding LK into LJ (intuitionistic sequent calculus) brings to the fore the latter's defects for these ‘deconstructive purposes’.
3

Mackie, Ian. "Lilac: a functional programming language based on linear logic." Journal of Functional Programming 4, no. 4 (October 1994): 395–433. http://dx.doi.org/10.1017/s0956796800001131.

Abstract:
We take Abramsky's term assignment for Intuitionistic Linear Logic (the linear term calculus) as the basis of a functional programming language. This is a language where the programmer must embed explicitly the resource and control information of an algorithm. We give a type reconstruction algorithm for our language in the style of Milner's W algorithm, together with a description of the implementation and examples of use.
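
To experiment with this style of resource-conscious typing in a present-day functional language, one can use GHC's LinearTypes extension (GHC 9.0 or later). The sketch below is illustrative only: it is modern linear Haskell, not Lilac or Abramsky's linear term calculus, and the function names are ours.

```haskell
{-# LANGUAGE LinearTypes #-}

-- A linear function: the argument must be consumed exactly once.
-- Duplicating or discarding the pair here would be rejected by the type checker.
swap :: (a, b) %1 -> (b, a)
swap (x, y) = (y, x)

-- An unrestricted function for comparison: duplication is allowed.
dup :: a -> (a, a)
dup x = (x, x)

main :: IO ()
main = print (swap (1 :: Int, "one"))
```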
4

Qian, Zesen, G. A. Kavvos, and Lars Birkedal. "Client-server sessions in linear logic." Proceedings of the ACM on Programming Languages 5, ICFP (August 22, 2021): 1–31. http://dx.doi.org/10.1145/3473567.

Abstract:
We introduce coexponentials, a new set of modalities for Classical Linear Logic. As duals to exponentials, the coexponentials codify a distributed form of the structural rules of weakening and contraction. This makes them a suitable logical device for encapsulating the pattern of a server receiving requests from an arbitrary number of clients on a single channel. Guided by this intuition we formulate a system of session types based on Classical Linear Logic with coexponentials, which is suited to modelling client-server interactions. We also present a session-typed functional programming language for client-server programming, which we translate to our system of coexponentials.
5

Bozzano, Marco, Giorgio Delzanno, and Maurizio Martelli. "An effective fixpoint semantics for linear logic programs." Theory and Practice of Logic Programming 2, no. 1 (December 18, 2001): 85–122. http://dx.doi.org/10.1017/s1471068402001254.

Abstract:
In this paper we investigate the theoretical foundation of a new bottom-up semantics for linear logic programs, and more precisely for the fragment of LinLog (Andreoli, 1992) that consists of the language LO (Andreoli & Pareschi, 1991) enriched with the constant 1. We use constraints to symbolically and finitely represent possibly infinite collections of provable goals. We define a fixpoint semantics based on a new operator in the style of TP working over constraints. An application of the fixpoint operator can be computed algorithmically. As sufficient conditions for termination, we show that the fixpoint computation is guaranteed to converge for propositional LO. To our knowledge, this is the first attempt to define an effective fixpoint semantics for linear logic programs. As an application of our framework, we also present a formal investigation of the relations between LO and Disjunctive Logic Programming (Minker et al., 1991). Using an approach based on abstract interpretation, we show that DLP fixpoint semantics can be viewed as an abstraction of our semantics for LO. We prove that the resulting abstraction is correct and complete (Cousot & Cousot, 1977; Giacobazzi & Ranzato, 1997) for an interesting class of LO programs encoding Petri Nets.
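
The bottom-up, fixpoint-based reading of a logic program can be illustrated with a classical propositional immediate-consequence operator. The sketch below is a plain Horn-clause version under our own naming, not the LO fragment or the constraint-based operator defined in the paper.

```haskell
import qualified Data.Set as Set

-- A propositional Horn clause: head <- body.
data Clause = Clause { hd :: String, body :: [String] }

-- One application of the immediate-consequence operator.
step :: [Clause] -> Set.Set String -> Set.Set String
step prog facts =
  Set.fromList [ hd c | c <- prog, all (`Set.member` facts) (body c) ]
    `Set.union` facts

-- Iterate from the empty set until a fixpoint is reached
-- (guaranteed to terminate for finite propositional programs).
leastFixpoint :: [Clause] -> Set.Set String
leastFixpoint prog = go Set.empty
  where
    go facts =
      let facts' = step prog facts
      in if facts' == facts then facts else go facts'

example :: [Clause]
example =
  [ Clause "p" []          -- fact: p.
  , Clause "q" ["p"]       -- q :- p.
  , Clause "r" ["q", "s"]  -- r :- q, s.  (never derivable: s has no clause)
  ]

main :: IO ()
main = print (Set.toList (leastFixpoint example))  -- ["p","q"]
```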
6

Jeltsch, Wolfgang. "Towards a Common Categorical Semantics for Linear-Time Temporal Logic and Functional Reactive Programming." Electronic Notes in Theoretical Computer Science 286 (September 2012): 229–42. http://dx.doi.org/10.1016/j.entcs.2012.08.015.

7

Huet, Gérard. "Special issue on ‘Logical frameworks and metalanguages’." Journal of Functional Programming 13, no. 2 (March 2003): 257–60. http://dx.doi.org/10.1017/s0956796802004549.

Abstract:
There is both a great unity and a great diversity in presentations of logic. The diversity is staggering indeed – propositional logic, first-order logic, higher-order logic belong to one classification; linear logic, intuitionistic logic, classical logic, modal and temporal logics belong to another one. Logical deduction may be presented as a Hilbert style of combinators, as a natural deduction system, as sequent calculus, as proof nets of one variety or other, etc. Logic, originally a field of philosophy, turned into algebra with Boole, and more generally into meta-mathematics with Frege and Heyting. Professional logicians such as Gödel and later Tarski studied mathematical models, consistency and completeness, computability and complexity issues, set theory and foundations, etc. Logic became a very technical area of mathematical research in the last half century, with fine-grained analysis of expressiveness of subtheories of arithmetic or set theory, detailed analysis of well-foundedness through ordinal notations, logical complexity, etc. Meanwhile, computer modelling developed a need for concrete uses of logic, first for the design of computer circuits, then more widely for increasing the reliability of software through the use of formal specifications and proofs of correctness of computer programs. This gave rise to more exotic logics, such as dynamic logic, Hoare-style logic of axiomatic semantics, logics of partial values (such as Scott's denotational semantics and Plotkin's domain theory) or of partial terms (such as Feferman's free logic), etc. The first actual attempts at mechanisation of logical reasoning through the resolution principle (automated theorem proving) had been disappointing, but their shortcomings gave rise to a considerable body of research, developing detailed knowledge about equational reasoning through canonical simplification (rewriting theory) and proofs by induction (following Boyer and Moore's successful integration of primitive recursive arithmetic within the LISP programming language). The special case of Horn clauses gave rise to a new paradigm of non-deterministic programming, called Logic Programming, developing later into Constraint Programming, blurring further the scope of logic. In order to study knowledge acquisition, researchers in artificial intelligence and computational linguistics studied exotic versions of modal logics such as Montague intensional logic, epistemic logic, dynamic logic or hybrid logic. Some others tried to capture common sense, and modeled the revision of beliefs with so-called non-monotonic logics. For the careful craftsmen of mathematical logic, this was the final outrage, and Girard gave his anathema to such “montres à moutardes”.
8

Rocha, Pedro, and Luís Caires. "Propositions-as-types and shared state." Proceedings of the ACM on Programming Languages 5, ICFP (August 22, 2021): 1–30. http://dx.doi.org/10.1145/3473584.

Abstract:
We develop a principled integration of shared mutable state into a propositions-as-types linear logic interpretation of a session-based concurrent programming language. While the foundation of type systems for the functional core of programming languages often builds on the propositions-as-types correspondence, automatically ensuring strong safety and liveness properties, imperative features have mostly been handled by extra-logical constructions. Our system crucially builds on the integration of nondeterminism and sharing, inspired by logical rules of differential linear logic, and ensures session fidelity, progress, confluence and normalisation, while being able to handle first-class shareable reference cells storing any persistent object. We also show how preservation and, perhaps surprisingly, progress, resiliently survive in a natural extension of our language with first-class locks. We illustrate the expressiveness of our language with examples highlighting detailed features, up to simple shareable concurrent ADTs.
9

Boudou, Joseph, Martín Diéguez, David Fernández-Duque, and Philip Kremer. "Exploring the Jungle of Intuitionistic Temporal Logics." Theory and Practice of Logic Programming 21, no. 4 (April 22, 2021): 459–92. http://dx.doi.org/10.1017/s1471068421000089.

Abstract:
The importance of intuitionistic temporal logics in Computer Science and Artificial Intelligence has become increasingly clear in the last few years. From the proof-theory point of view, intuitionistic temporal logics have made it possible to extend functional programming languages with new features via type theory, while from the semantics perspective, several logics for reasoning about dynamical systems and several semantics for logic programming have their roots in this framework. We consider several axiomatic systems for intuitionistic linear temporal logic and show that each of these systems is sound for a class of structures based either on Kripke frames or on dynamic topological systems. We provide two distinct interpretations of “henceforth”, both of which are natural intuitionistic variants of the classical one. We completely establish the order relation between the semantically defined logics based on both interpretations of “henceforth” and, using our soundness results, show that the axiomatically defined logics enjoy the same order relations.
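
For comparison with the intuitionistic readings studied in the paper, the classical semantics of “henceforth” over a finite trace is easy to state executably. The sketch below implements only that classical finite-trace reading, with names of our own choosing.

```haskell
import Data.List (tails)

-- A tiny temporal language over states of type s.
data Formula s
  = Atom (s -> Bool)
  | Henceforth (Formula s)   -- "always from now on"

-- Classical satisfaction at the start of a finite trace:
-- Henceforth f holds when f holds at every suffix of the trace.
holds :: Formula s -> [s] -> Bool
holds (Atom p)       (x:_) = p x
holds (Atom _)       []    = False
holds (Henceforth f) xs    = all (holds f) (init (tails xs))

main :: IO ()
main = do
  let trace = [1, 2, 3, 4] :: [Int]
  print (holds (Henceforth (Atom (> 0))) trace)  -- True
  print (holds (Henceforth (Atom (< 4))) trace)  -- False
```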
10

Selinger, Peter, and Benoit Valiron. "A lambda calculus for quantum computation with classical control." Mathematical Structures in Computer Science 16, no. 3 (June 2006): 527–52. http://dx.doi.org/10.1017/s0960129506005238.

Abstract:
In this paper we develop a functional programming language for quantum computers by extending the simply-typed lambda calculus with quantum types and operations. The design of this language adheres to the ‘quantum data, classical control’ paradigm, following the first author's work on quantum flow-charts. We define a call-by-value operational semantics, and give a type system using affine intuitionistic linear logic. The main results of this paper are the safety properties of the language and the development of a type inference algorithm.
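
The “quantum data, classical control” slogan can be made concrete with a toy state-vector simulation driven by an ordinary functional program. The sketch below is purely illustrative (it carries none of the paper's affine type system or operational semantics): it applies a Hadamard gate to |0⟩ and reads off the measurement probabilities classically.

```haskell
import Data.Complex

-- Quantum data: amplitudes of |0> and |1> for a single qubit.
type Qubit = (Complex Double, Complex Double)

-- The Hadamard gate.
hadamard :: Qubit -> Qubit
hadamard (a, b) = ((a + b) / sqrt 2, (a - b) / sqrt 2)

-- Classical control: the surrounding program only sees measurement probabilities.
probabilities :: Qubit -> (Double, Double)
probabilities (a, b) = (magnitude a ^ (2 :: Int), magnitude b ^ (2 :: Int))

main :: IO ()
main = print (probabilities (hadamard (1, 0)))  -- (0.5, 0.5) up to rounding
```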
11

Leivant, Daniel, and Bob Constable. "Editorial." Journal of Functional Programming 11, no. 1 (January 2001): 1. http://dx.doi.org/10.1017/s0956796801009030.

Abstract:
This issue of the Journal of Functional Programming is dedicated to work presented at the Workshop on Implicit Computational Complexity in Programming Languages, affiliated with the 1998 meeting of the International Conference on Functional Programming in Baltimore. Several machine-independent approaches to computational complexity have been developed in recent years; they establish a correspondence linking computational complexity to conceptual and structural measures of complexity of declarative programs and of formulas, proofs and models of formal theories. Examples include descriptive complexity of finite models, restrictions on induction in arithmetic and related first order theories, complexity of set-existence principles in higher order logic, and specifications in linear logic. We refer to these approaches collectively as Implicit Computational Complexity. This line of research provides a framework for a streamlined incorporation of computational complexity into areas such as formal methods in software development, programming language theory, and database theory. A fruitful thread in implicit computational complexity is based on exploring the computational complexity consequences of introducing various syntactic control mechanisms in functional programming, including restrictions (akin to static typing) on scoping, data re-use (via linear modalities), and iteration (via ramification of data). These forms of control, separately and in combination, can certify bounds on the time and space resources used by programs. In fact, all results in this area establish that each restriction considered yields precisely a major computational complexity class. The complexity classes thus obtained range from very restricted ones, such as NC and Alternating logarithmic time, through the central classes Poly-Time and Poly-Space, to broad classes such as the Elementary and the Primitive Recursive functions. Considerable effort has been invested in recent years to relax as much as possible the structural restrictions considered, allowing for more flexible programming and proof styles, while still guaranteeing the same resource bounds. Notably, more flexible control forms have been developed for certifying that functional programs execute in Poly-Time. The 1998 workshop covered both the theoretical foundations of the field and steps toward using its results in various implemented systems, for example in controlling the computational complexity of programs extracted from constructive proofs. The five papers included in this issue nicely represent this dual concern of theory and practice. As they are going to print, we should note that the field of Implicit Computational Complexity continues to thrive: successful workshops dedicated to it were affiliated with both the LICS'99 and LICS'00 conferences. Special issues, of Information and Computation dedicated to the former, and of Theoretical Computer Science to the latter, are in preparation.
12

Rahman, Muhammad Muhitur, Md Shafiullah, Syed Masiur Rahman, Abu Nasser Khondaker, Abduljamiu Amao, and Md Hasan Zahir. "Soft Computing Applications in Air Quality Modeling: Past, Present, and Future." Sustainability 12, no. 10 (May 14, 2020): 4045. http://dx.doi.org/10.3390/su12104045.

Abstract:
Air quality models simulate the atmospheric environment systems and provide increased domain knowledge and reliable forecasting. They provide early warnings to the population and reduce the number of measuring stations. Due to the complexity and non-linear behavior associated with air quality data, soft computing models became popular in air quality modeling (AQM). This study critically investigates, analyses, and summarizes the existing soft computing modeling approaches. Among the many soft computing techniques in AQM, this article reviews and discusses artificial neural network (ANN), support vector machine (SVM), evolutionary ANN and SVM, the fuzzy logic model, neuro-fuzzy systems, the deep learning model, ensemble, and other hybrid models. Besides, it sheds light on employed input variables, data processing approaches, and targeted objective functions during modeling. It was observed that many advanced, reliable, and self-organized soft computing models like functional network, genetic programming, type-2 fuzzy logic, genetic fuzzy, genetic neuro-fuzzy, and case-based reasoning are rarely explored in AQM. Therefore, the partially explored and unexplored soft computing techniques can be appropriate choices for research in the field of air quality modeling. The discussion in this paper will help to determine the suitability and appropriateness of a particular model for a specific modeling context.
13

Rubanov, V. G., D. V. Velichko, and D. A. Bushuev. "Application of Adaptive Three-Position Control in the System of Automated Control of a Thermal Object." Proceedings of the Southwest State University 24, no. 4 (February 4, 2021): 230–43. http://dx.doi.org/10.21869/2223-1560-2020-24-4-230-243.

Abstract:
Purpose of research. The control object was considered to be a thermal unit in the form of a modified two-tier tunnel furnace designed for the production of foam glass blocks. The main goal of this work was to improve the quality of products, reduce defects, and ultimately increase productivity by developing an automated system for controlling the thermal field of a technological unit for the production of foam glass blocks using an adaptive three-position control law with adaptation to the load of the average position of the regulator. Methods. At the initial stage, a functional automation scheme for a modified two-tier tunnel furnace was developed. To model dynamic discrete systems, a mathematical apparatus was used in the form of labeled Petri nets, which resulted in algorithmization of the technological process for the production of foam glass blocks. This solution to the problem should be used as a method of algorithmization and programming of the logic controller that is part of the automation system structure. The developed functional automation scheme can be converted into a mnemonic circuit, thereby implementing a SCADA system designed for control and visualization, diagnostics and monitoring of the process at a centralized control point, which is part of the automated workplace of the operator-technologist. The described approach to the development of an automated process control system has a generalized representation. The solution is methodological in nature, demonstrating the usability of the model in the form of a labeled Petri net. Results. In the course of research, a graph of operations of the production process with discrete adaptive three-position control of the average position under load was developed. To check the correctness of the graph of operations, a tree of achievable markings was built, and its analysis was performed for compliance with safety conditions and net liveness. A block diagram of the main algorithm and the algorithm for adapting the controller's control program was developed. Conclusion. The described approach to the development of an automated process control system for the production of foam glass blocks has a generalized character, although it is illustrated by applying it to a specific object, since it allows changing both the number of variables xi, zi, and their functional purpose; that is, instead of sensors, pushers, valves, and parameter values (for example, temperature), other automation elements and other physical variables and their parameters can be used. Thus, the presented solution is methodological in nature, demonstrating the convenience of using the model in the form of a Petri net and a tree of achievable markings for algorithmization and programming of a logic controller that is part of the automation system structure.
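
The Petri-net notions used here (markings, enabled transitions, firing, and a tree of reachable markings) are straightforward to prototype. The sketch below is a generic, untimed place/transition net explorer with invented names, not the furnace model or the labelled net from the paper.

```haskell
import qualified Data.Map as Map
import qualified Data.Set as Set

type Place      = String
type Marking    = Map.Map Place Int
data Transition = Transition { pre :: Marking, post :: Marking }

-- A transition is enabled when every input place holds enough tokens.
enabled :: Marking -> Transition -> Bool
enabled m t = and [ Map.findWithDefault 0 p m >= n | (p, n) <- Map.toList (pre t) ]

-- Firing consumes the input tokens and produces the output tokens;
-- empty places are dropped so that markings compare canonically.
fire :: Marking -> Transition -> Marking
fire m t = Map.filter (> 0) (Map.unionWith (+) (post t) consumed)
  where
    consumed = Map.differenceWith (\have need -> Just (have - need)) m (pre t)

-- The set of markings reachable from an initial marking
-- (terminates for bounded nets such as this one).
reachable :: [Transition] -> Marking -> Set.Set Marking
reachable ts m0 = go Set.empty [m0]
  where
    go seen []       = seen
    go seen (m:rest)
      | m `Set.member` seen = go seen rest
      | otherwise           = go (Set.insert m seen)
                                 ([ fire m t | t <- ts, enabled m t ] ++ rest)

-- A two-place loop: a single token shuttles between "idle" and "busy".
exampleNet :: [Transition]
exampleNet =
  [ Transition (Map.fromList [("idle", 1)]) (Map.fromList [("busy", 1)])
  , Transition (Map.fromList [("busy", 1)]) (Map.fromList [("idle", 1)])
  ]

main :: IO ()
main = mapM_ print (Set.toList (reachable exampleNet (Map.fromList [("idle", 1)])))
```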
14

He, Zhou, Yuying Dong, Gongchang Ren, Chan Gu, and Zhiwu Li. "Path planning for automated guided vehicle systems with time constraints using timed Petri nets." Measurement and Control, October 20, 2020, 002029402096484. http://dx.doi.org/10.1177/0020294020964840.

Abstract:
Automated guided vehicles (AGVs) are extensively used in many applications such as intelligent transportation, logistics, and industrial factories. In this paper, we address the path planning problem for an AGV system (i.e. a team of identical AGVs) with logic and time constraints using Petri nets. We propose a method to model an AGV system and its static environment by timed Petri nets. Combining the structural characteristics of Petri nets and integer linear programming technique, a path planning method is developed to ensure that all task regions are visited by AGVs in time and forbidden regions are always avoided. Finally, simulation studies are presented to show the effectiveness of the proposed path planning methodology.
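
Petri-net-based planning formulations typically connect to integer linear programming through the state equation M' = M0 + C·σ, where C is the incidence matrix and σ is a vector of firing counts. The sketch below merely evaluates that equation for a tiny two-place net; it is a necessary-condition check only, not the timed model or the ILP encoding of the paper, and all names are ours.

```haskell
-- Places and transitions are indexed positionally.
type Marking   = [Int]    -- token count per place
type FireCount = [Int]    -- how many times each transition fires
type Incidence = [[Int]]  -- one row per place, one column per transition

-- Evaluate the Petri-net state equation  M' = M0 + C * sigma.
stateEquation :: Incidence -> Marking -> FireCount -> Marking
stateEquation c m0 sigma = zipWith (+) m0 (map dot c)
  where dot row = sum (zipWith (*) row sigma)

main :: IO ()
main = do
  -- Two places, two transitions: t1 moves a token p1 -> p2, t2 moves it back.
  let c     = [ [-1,  1]
              , [ 1, -1] ]
      m0    = [1, 0]
      sigma = [1, 0]      -- fire t1 once
  print (stateEquation c m0 sigma)  -- [0, 1]
```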
15

Braüner, Torben. "A General Adequacy Result for a Linear Functional Language." BRICS Report Series 1, no. 22 (August 3, 1994). http://dx.doi.org/10.7146/brics.v1i22.21645.

Abstract:
A main concern of the paper will be a Curry-Howard interpretation of Intuitionistic Linear Logic. It will be extended with recursion, and the resulting functional programming language will be given operational as well as categorical semantics. The two semantics will be related by soundness and adequacy results. The main features of the categorical semantics are that convergence/divergence behaviour is modelled by a strong monad, and that recursion is modelled by ‘linear fixpoints’ induced by CPO structure on the hom-sets. The ‘linear fixpoints’ correspond to ordinary fixpoints in the category of free coalgebras w.r.t. the comonad used to interpret the ‘of course’ modality. Concrete categories from (stable) domain theory satisfying the axioms of the categorical model are given, and thus adequacy follows in these instances from the general result.
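
The idea that recursion is interpreted by a fixpoint has a familiar programming counterpart. The sketch below uses Haskell's ordinary fix, a least fixpoint in the domain-theoretic sense; it is an analogy for, not an instance of, the categorical ‘linear fixpoints’ of the paper.

```haskell
import Data.Function (fix)

-- Recursion expressed as the fixpoint of a non-recursive functional.
factorial :: Integer -> Integer
factorial = fix (\rec n -> if n == 0 then 1 else n * rec (n - 1))

main :: IO ()
main = print (factorial 10)  -- 3628800
```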
16

Ojha, Gaurav Kumar, Gyanendra Kumar Yadav, and Pankaj Kumar Yadav. "Optimization Technique For Surface Roughness Prediction in Turning Operation." SAMRIDDHI : A Journal of Physical Sciences, Engineering and Technology 6, no. 2 (June 8, 2015). http://dx.doi.org/10.18090/samriddhi.v6i2.1562.

Abstract:
Surface roughness has a great influence on the functional properties of a product. Finding the rules for how process factors and environmental factors affect surface roughness values will help to set future process parameters and thereby improve production quality and efficiency. Since surface roughness is affected by different machining parameters and by the inherent uncertainties in the machining process, predicting surface roughness is a challenging problem for researchers and engineers. In this paper an attempt is made to review the literature on optimizing machining parameters in turning processes. Conventional techniques employed for machining optimization include geometric programming, geometric plus linear programming, goal programming, the sequential unconstrained minimization technique, dynamic programming, etc. More recent techniques for optimization include fuzzy logic, the scatter search technique, genetic algorithms, the Taguchi technique, and response surface methodology.
