Journal articles on the topic 'Algebraic Decision Diagram'

To see the other types of publications on this topic, follow the link: Algebraic Decision Diagram.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 26 journal articles for your research on the topic 'Algebraic Decision Diagram.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Djidjev, Hristo N., and Andrzej Lingas. "On Computing Voronoi Diagrams for Sorted Point Sets." International Journal of Computational Geometry & Applications 05, no. 03 (September 1995): 327–37. http://dx.doi.org/10.1142/s0218195995000192.

Abstract:
We show that the Voronoi diagram of a finite sequence of points in the plane which gives sorted order of the points with respect to two perpendicular directions can be computed in linear time. In contrast, we observe that the problem of computing the Voronoi diagram of a finite sequence of points in the plane which gives the sorted order of the points with respect to a single direction requires Ω(n log n) operations in the algebraic decision tree model. As a corollary from the first result, we show that the bounded Voronoi diagrams of simple n-vertex polygons which can be efficiently cut into the so called monotone histograms can be computed in o(n log n) time.
2

Rauzy, Antoine, and Yang. "Decision Diagram Algorithms to Extract Minimal Cutsets of Finite Degradation Models." Information 10, no. 12 (November 25, 2019): 368. http://dx.doi.org/10.3390/info10120368.

Abstract:
In this article, we propose decision diagram algorithms to extract minimal cutsets of finite degradation models. Finite degradation models generalize and unify combinatorial models used to support probabilistic risk, reliability and safety analyses (fault trees, attack trees, reliability block diagrams…). They formalize a key idea underlying all risk assessment methods: states of the models represent levels of degradation of the system under study. Although these states cannot be totally ordered, they have a rich algebraic structure that can be exploited to extract minimal cutsets of models, which represent the most relevant scenarios of failure. The notion of minimal cutsets we introduce here generalizes the one defined for fault trees. We show how algorithms used to calculate minimal cutsets can be lifted up to finite degradation models, thanks to a generic decomposition theorem and an extension of the binary decision diagrams technology. We discuss the implementation and performance issues. Finally, we illustrate the interest of the proposed technology by means of a use case stemming from the oil and gas industry.
3

Dudek, Jeffrey, Vu Phan, and Moshe Vardi. "ADDMC: Weighted Model Counting with Algebraic Decision Diagrams." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 02 (April 3, 2020): 1468–76. http://dx.doi.org/10.1609/aaai.v34i02.5505.

Abstract:
We present an algorithm to compute exact literal-weighted model counts of Boolean formulas in Conjunctive Normal Form. Our algorithm employs dynamic programming and uses Algebraic Decision Diagrams as the main data structure. We implement this technique in ADDMC, a new model counter. We empirically evaluate various heuristics that can be used with ADDMC. We then compare ADDMC to four state-of-the-art weighted model counters (Cachet, c2d, d4, and miniC2D) on 1914 standard model counting benchmarks and show that ADDMC significantly improves the virtual best solver.
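To make the role of the decision diagram concrete, here is a minimal Python sketch (not the ADDMC system itself; the clause set and literal weights are invented for illustration) of literal-weighted model counting by Shannon expansion with memoization, which is the recursion that an Algebraic Decision Diagram caches compactly:

    from functools import lru_cache

    # Illustrative CNF over variables 1..3 and literal weights (assumed values).
    CLAUSES = ((1, -2), (2, 3), (-1, 3))
    WEIGHT = {1: 0.6, -1: 0.4, 2: 0.5, -2: 0.5, 3: 0.7, -3: 0.3}
    NVARS = 3

    def condition(cnf, lit):
        """Assign `lit` true: drop satisfied clauses, shrink the rest."""
        out = []
        for clause in cnf:
            if lit in clause:
                continue                      # clause satisfied, drop it
            reduced = tuple(l for l in clause if l != -lit)
            if not reduced:
                return None                   # clause falsified
            out.append(reduced)
        return tuple(out)

    @lru_cache(maxsize=None)
    def wmc(cnf, var=1):
        """Weighted count of the models of `cnf` over variables var..NVARS."""
        if cnf is None:
            return 0.0                        # a clause was falsified on this branch
        if var > NVARS:
            return 1.0                        # every clause satisfied
        return (WEIGHT[var] * wmc(condition(cnf, var), var + 1)
                + WEIGHT[-var] * wmc(condition(cnf, -var), var + 1))

    print(wmc(CLAUSES))                       # 0.56 for the weights above

ADDMC itself performs this computation with ADD operations and early projection over a dynamically chosen variable order; the sketch only shows the weighted Shannon expansion that those diagrams represent.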
4

Bibilo, P. N., and V. I. Romanov. "Experimental Study of Algorithms for Minimization of Binary Decision Diagrams using Algebraic Representations of Cofactors." Programmnaya Ingeneria 13, no. 2 (February 17, 2022): 51–67. http://dx.doi.org/10.17587/prin.13.51-67.

Abstract:
BDD (Binary Decision Diagram) is used for technology-independent optimization, performed as the first stage in the synthesis of logic circuits in the design of ASIC (application-specific integrated circuit). BDD is an acyclic graph defining a Boolean function or a system of Boolean functions. Each vertex of this graph is associated with the complete or reduced Shannon expansion formula. Binary decision diagrams with mutually inverse subfunctions (cofactors) are considered. We have developed algorithms for finding algebraic representations of cofactors of the same BDD level in the form of a disjunction or conjunction of other inverse or non-inverse cofactors of the same BDD level. The algorithms make it possible to reduce the number of literals by replacing the Shannon expansion formulas with simpler logical formulas and to reduce the number of literals in the description of a system of Boolean functions. We propose to use the developed algorithms for an additional logical optimization of the constructed BDD representations of systems of Boolean functions. Experimental results of the application of the corresponding programs in the synthesis of logic circuits in the design library of custom VLSI CMOS circuits are presented.
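For reference, the complete Shannon expansion attached to a BDD vertex labelled by variable x_i, and the kind of algebraic cofactor identity the algorithms search for, can be written as follows (the second line is schematic, with g and h standing for other cofactors of the same BDD level, not notation from the paper):

    f = \overline{x}_i \, f_{x_i = 0} \;\vee\; x_i \, f_{x_i = 1}

    f_{x_i = 0} = g \vee h \qquad \text{or} \qquad f_{x_i = 0} = g \wedge h

When such an identity holds, the Shannon expansion formula for that cofactor can be replaced by the shorter disjunction or conjunction, which is what reduces the literal count of the multilevel description.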
5

Pralet, C., G. Verfaillie, and T. Schiex. "An Algebraic Graphical Model for Decision with Uncertainties, Feasibilities, and Utilities." Journal of Artificial Intelligence Research 29 (August 23, 2007): 421–89. http://dx.doi.org/10.1613/jair.2151.

Abstract:
Numerous formalisms and dedicated algorithms have been designed in the last decades to model and solve decision making problems. Some formalisms, such as constraint networks, can express "simple" decision problems, while others are designed to take into account uncertainties, unfeasible decisions, and utilities. Even in a single formalism, several variants are often proposed to model different types of uncertainty (probability, possibility...) or utility (additive or not). In this article, we introduce an algebraic graphical model that encompasses a large number of such formalisms: (1) we first adapt previous structures from Friedman, Chu and Halpern for representing uncertainty, utility, and expected utility in order to deal with generic forms of sequential decision making; (2) on these structures, we then introduce composite graphical models that express information via variables linked by "local" functions, thanks to conditional independence; (3) on these graphical models, we finally define a simple class of queries which can represent various scenarios in terms of observabilities and controllabilities. A natural decision-tree semantics for such queries is completed by an equivalent operational semantics, which induces generic algorithms. The proposed framework, called the Plausibility-Feasibility-Utility (PFU) framework, not only provides a better understanding of the links between existing formalisms, but it also covers yet unpublished frameworks (such as possibilistic influence diagrams) and unifies formalisms such as quantified boolean formulas and influence diagrams. Our backtrack and variable elimination generic algorithms are a first step towards unified algorithms.
6

Joshi, S., and R. Khardon. "Probabilistic Relational Planning with First Order Decision Diagrams." Journal of Artificial Intelligence Research 41 (June 21, 2011): 231–66. http://dx.doi.org/10.1613/jair.3205.

Abstract:
Dynamic programming algorithms have been successfully applied to propositional stochastic planning problems by using compact representations, in particular algebraic decision diagrams, to capture domain dynamics and value functions. Work on symbolic dynamic programming lifted these ideas to first order logic using several representation schemes. Recent work introduced a first order variant of decision diagrams (FODD) and developed a value iteration algorithm for this representation. This paper develops several improvements to the FODD algorithm that make the approach practical. These include new reduction operators that decrease the size of the representation, several speedup techniques, and techniques for value approximation. Incorporating these, the paper presents a planning system, FODD-Planner, for solving relational stochastic planning problems. The system is evaluated on several domains, including problems from the recent international planning competition, and shows competitive performance with top ranking systems. This is the first demonstration of feasibility of this approach and it shows that abstraction through compact representation is a promising approach to stochastic planning.
7

Falkowski, B. J., and Chip-Hong Chang. "Efficient calculation of Gray code-ordered Walsh spectra through algebraic decision diagrams." Electronics Letters 34, no. 9 (1998): 848. http://dx.doi.org/10.1049/el:19980606.

8

Speck, David, Florian Geißer, and Robert Mattmüller. "Symbolic Planning with Edge-Valued Multi-Valued Decision Diagrams." Proceedings of the International Conference on Automated Planning and Scheduling 28 (June 15, 2018): 250–58. http://dx.doi.org/10.1609/icaps.v28i1.13890.

Abstract:
Symbolic representations have attracted significant attention in optimal planning. Binary Decision Diagrams (BDDs) form the basis for symbolic search algorithms. Closely related are Algebraic Decision Diagrams (ADDs), used to represent heuristic functions. Also, progress was made in dealing with models that take state-dependent action costs into account. Here, costs are represented as Edge-valued Multi-valued Decision Diagrams (EVMDDs), which can be exponentially more compact than the corresponding ADD representation. However, they were not yet considered for symbolic planning. In this work, we study EVMDD-based symbolic search for optimal planning. We define EVMDD-based representations of state sets and transition relations, and show how to compute the necessary operations required for EVMDD-A*. This EVMDD-based version of symbolic A* generalizes its BDD variant, and allows to solve planning tasks with state-dependent action costs. We prove theoretically that our approach is sound, complete and optimal. Additionally, we present an empirical analysis comparing EVMDD-A* to BDD-A* and explicit A* search. Our results underscore the usefulness of symbolic approaches and the feasibility of dealing with models that go beyond unit costs.
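As a reminder of the underlying representation (the standard additive edge-valued semantics, not a formula taken from the paper), an EVMDD attaches a weight to every edge, and the value of the encoded function on an assignment is the constant on the root edge plus the sum of the weights along the corresponding root-to-terminal path:

    f(x_1, \dots, x_n) = \kappa + \sum_{e \,\in\, \mathrm{path}(x_1, \dots, x_n)} w_e

This additivity is what lets state-dependent action costs share substructure and, in favourable cases, be exponentially more compact than the corresponding ADD representation.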
9

Van den Broeck, Guy, Ingo Thon, Martijn Van Otterlo, and Luc De Raedt. "DTProbLog: A Decision-Theoretic Probabilistic Prolog." Proceedings of the AAAI Conference on Artificial Intelligence 24, no. 1 (July 4, 2010): 1217–22. http://dx.doi.org/10.1609/aaai.v24i1.7755.

Abstract:
We introduce DTProbLog, a decision-theoretic extension of Prolog and its probabilistic variant ProbLog. DTProbLog is a simple but expressive probabilistic programming language that allows the modeling of a wide variety of domains, such as viral marketing. In DTProbLog, the utility of a strategy (a particular choice of actions) is defined as the expected reward for its execution in the presence of probabilistic effects. The key contribution of this paper is the introduction of exact, as well as approximate, solvers to compute the optimal strategy for a DTProbLog program and the decision problem it represents, by making use of binary and algebraic decision diagrams. We also report on experimental results that show the effectiveness and the practical usefulness of the approach.
10

Friedl, Katalin, and László Kabódi. "Storing the Quantum Fourier Operator in the QuIDD Data Structure." Acta Cybernetica 23, no. 2 (2017): 503–12. http://dx.doi.org/10.14232/actacyb.23.2.2017.5.

Abstract:
Quantum algorithms can be simulated using classical computers, but the typical time complexity of the simulation is exponential. There are some data structures which can speed up this simulation to make it possible to test these algorithms on classical computers using more than a few qubits. One of them is QuIDD by Viamontes et al., which is an extension of the Algebraic Decision Diagram. In this paper, we examine the matrix of Fourier operator and its QuIDD representation. To utilize the structure of the operator we propose two orderings (reversed column variables and even-odd order), both resulting in smaller data structure than the standard one. After that, we propose a new method of storing the Fourier operator, using a weighted decision diagram that further reduces its size. It should be the topic of subsequent research whether the basic operations can be performed efficiently on this weighted structure.
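For context, the operator in question is the N-dimensional quantum Fourier transform, whose matrix entries follow a single algebraic pattern; it is this regularity that the proposed variable orderings exploit:

    F_{jk} = \frac{1}{\sqrt{N}} \, \omega^{jk}, \qquad \omega = e^{2 \pi i / N}, \qquad j, k = 0, \dots, N - 1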
11

Ibănescu, Radu, and Cătălin Ungureanu. "Lagrange's Equations versus Bond Graph Modeling Methodology by an Example of a Mechanical System." Applied Mechanics and Materials 809-810 (November 2015): 914–19. http://dx.doi.org/10.4028/www.scientific.net/amm.809-810.914.

Abstract:
The bond graph modeling method was discovered by Henry Paynter in 1959 and has quickly become a widespread method all over the modeling engineering world. The method is based on the analysis of power circulation in systems and has some indisputable advantages over other modeling methods, based in principle on mathematical aspects. The paper proposes a comparison between the bond graph method and Lagrange's equations method, by applying both methods to model a mechanical system. The bond graph model is a graphical model. There are three possibilities to exploit the bond-graph diagram. The first one consists in deducing a system of differential or algebraic-differential equations from the diagram. The second one consists in obtaining the block diagram model from the bond graph diagram, without additionally writing any equations, followed by the block diagram implementation in the appropriate software, which permits to perform simulations at once. The third one consists in implementing the bond graph diagram directly in the appropriate software, where simulations can immediately run. The advantages and the disadvantages of the methods are emphasized, but the decision about the most appropriate method is up to the modeler.
12

Cramer, Claes. "Problem-solving a lesson from relativity in physics education." Physics Education 57, no. 4 (April 12, 2022): 045010. http://dx.doi.org/10.1088/1361-6552/ac5641.

Abstract:
The principle of relativity is of fundamental importance in practice when we are solving problems in physics since the axiom states that the result of any physical experiment is the same when performed with identical initial conditions relative to any inertial coordinate system. Hence, conceptual knowledge of coordinate systems is central in any physics problem-solving framework. This opens up the question whether upper secondary school students should be taught to set up coordinate systems and apply basic coordinate transformations systematically when solving problems or if it should be left for physics courses at a college or university level? Given that working with coordinate systems does not require any advanced algebra or analysis knowledge, a case can be made for introducing the concepts of coordinate systems and transformations together with a uniform problem-solving framework. The purpose here, given that coordinate systems are central in problem-solving, is to revisit the method of drawing free body diagrams in engineering mechanics but adapted for upper secondary school physics. This method relies on the existence of a coordinate system and can be described by an activity diagram—a connected sequence of actions and decisions. The activity diagram is a map for problem-solving. The central actions in the activity are to introduce a coordinate system, draw a free body diagram and decide if the system should be transformed besides actually working out any algebra and numerical calculations. Here the notion of ‘transformation’ should be read as a graphical transformation rather than an algebraic group operation, which is suitable neither to teach nor to use at upper secondary level. A basic example is provided to illustrate why the methodology simplifies the problem-solving process and the student’s understanding of the subject.
13

Bibilo, P. N., and V. I. Romanov. "Minimization of Binary Decision Diagrams for Systems of Completely Defined Boolean Functions using Algebraic Representations of Cofactors." INFORMACIONNYE TEHNOLOGII 27, no. 8 (August 11, 2021): 395–408. http://dx.doi.org/10.17587/it.27.395-408.

Abstract:
In design systems for digital VLSI (very large integrated circuits), the BDD is used for VLSI verification, as well as for technologically independent optimization, performed as the first stage in the synthesis of logic circuits in various technological bases. BDD is an acyclic graph defining a Boolean function or a system of Boolean functions. Each vertex of this graph corresponds to the complete or reduced Shannon expansion formula. Having constructed BDD representation for systems of Boolean functions, it is possible to perform additional logical optimization based on the proposed method of searching for algebraic representations of cofactors (subfunctions) of the same BDD level in the form of a disjunction or conjunction of other cofactors of this BDD level. The method allows to reduce the number of literals by replacing the Shannon expansion formulas with simpler formulas that are disjunctions or conjunctions of cofactors, and to reduce the number of literals in specifying a system of Boolean functions. The number of literals in algebraic multilevel representations of systems of fully defined Boolean functions is the main optimization criterion in the synthesis of combinational circuits from library logic gates.
14

Bibilo, P. N. "Minimizing Binary Decision Diagrams for Systems of Incompletely Defined Boolean Functions Using Algebraic Cofactor Expansions." Journal of Computer and Systems Sciences International 61, no. 4 (August 2022): 539–66. http://dx.doi.org/10.1134/s1064230722030029.

15

Bibilo, P. N., and V. I. Romanov. "Minimization of binary decision diagrams for systems of completely defined Boolean functions using Shannon expansions and algebraic representations of cofactors." Informatics 18, no. 2 (July 1, 2021): 7–32. http://dx.doi.org/10.37661/1816-0301-2021-18-2-7-32.

Abstract:
In the systems of digital VLSI design (Very Large Integrated Circuits), the BDD (Binary Decision Diagram) is used for VLSI verification, as well as for technologically independent optimization as the first stage in the synthesis of logic circuits in various technological bases. The BDD is an acyclic graph defining a Boolean function or a system of Boolean functions. Each vertex of this graph corresponds to the complete or reduced Shannon expansion formula. When BDD representation for systems of Boolean functions is constructed, it is possible to perform additional logical optimization based on the proposed method of searching for algebraic representations of cofactors (subfunctions) of the same BDD level in the form of a disjunction, conjunction, or exclusive-or of cofactors of the same level or lower levels of the BDD. A directed BDD graph for a system of functions is constructed on the basis of Shannon expansion of all component functions of the system by the same permutation of variables. The method allows to reduce the number of literals by replacing the Shannon expansion formulas with simpler formulas that are disjunctions or conjunctions of cofactors, and to reduce the number of literals in specifying a system of Boolean functions. The number of literals in algebraic multilevel representations of systems of fully defined Boolean functions is the main optimization criterion in the synthesis of combinational circuits from library logic elements.
16

Kumar, Arvind, and Hiroaki Wagatsuma. "A Kamm’s Circle-Based Potential Risk Estimation Scheme in the Local Dynamic Map Computation Enhanced by Binary Decision Diagrams." Sensors 22, no. 19 (September 24, 2022): 7253. http://dx.doi.org/10.3390/s22197253.

Abstract:
Autonomous vehicles (AV) are a hot topic for safe mobility, which inevitably requires sensors to achieve autonomy, but relying too heavily on sensors will be a risk factor. A high-definition map (HD map) reduces the risk by giving geographical information if it covers dynamic information from moving entities on the road. Cooperative intelligent transport systems (C-ITS) are a prominent approach to solving the issue and local dynamic maps (LDMs) are expected to realize the ideal C-ITS. An actual LDM implementation requires a fine database design to be able to update the information to represent potential risks based on future interactions of vehicles. In the present study, we proposed an advanced method for embedding the geographical future occupancy of vehicles into the database by using a binary decision diagram (BDD). In our method, the geographical future occupancy of vehicles was formulated with Kamm’s circle. In computer experiments, sharing BDD-based occupancy data was successfully demonstrated in the ROS-based simulator with the linked list-based BDD. Algebraic operations in exchanged BDDs effectively managed future interactions such as data insertion and timing of collision avoidance in the LDM. This result opened a new door for the realization of the ideal LDM for safety in AVs.
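For readers unfamiliar with the term, Kamm's circle (the friction circle) bounds the combined longitudinal and lateral acceleration a tyre can transmit, which is what makes it usable as an envelope for a vehicle's geographical future occupancy; in its simplest textbook form (the paper's exact formulation may differ):

    a_{\mathrm{long}}^{2} + a_{\mathrm{lat}}^{2} \le (\mu g)^{2}

where μ is the tyre-road friction coefficient and g is the gravitational acceleration.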
17

Bibilo, P. N., Yu Yu Lankevich, and V. I. Romanov. "Logical Minimization of Multilevel Representations of Boolean Function Systems." Informacionnye Tehnologii 29, no. 2 (February 20, 2023): 59–71. http://dx.doi.org/10.17587/it.29.59-71.

Abstract:
A decisive influence on complexity and speed of a combinational logic circuit of library CMOS elements is exerted by the preliminary stage of technologically independent optimization of the implemented system of Boolean functions. At present, the main methods of such optimization in the logical synthesis of custom CMOS VLSI blocks are methods for minimizing Binary Decision Diagrams (BDD) or their modifications. Graphical representations of BDD are built on the basis of the Shannon expansions of Boolean functions. A BDD graph corresponds to a set of interrelated Shannon expansion formulas that form a multilevel representation of the minimized system of Boolean functions. The efficiency of applying various optimization procedures of minimization for several types of BDD representations of systems of Boolean functions is investigated in the paper. These procedures are used as a technologically independent optimization in the synthesis of multi-output logic circuits of library CMOS elements. In addition to single logical optimization procedures, sequences of such procedures are studied that form various methods of logical optimization of multilevel representations of systems of Boolean functions. The results of experiments on 49 examples of systems of Boolean functions are presented. 25 optimization routes have been studied, and efficient routes have been determined for various types of specifications of function systems. The obtained experimental results are compared with the known ones. It has been established that to estimate the complexity of optimized algebraic representations of systems of functions, it is advisable to use such a criterion as the total number of literals (variables or their inversions) of Boolean variables.
18

Sanner, Scott, and Ehsan Abbasnejad. "Symbolic Variable Elimination for Discrete and Continuous Graphical Models." Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (September 20, 2021): 1954–60. http://dx.doi.org/10.1609/aaai.v26i1.8406.

Abstract:
Probabilistic reasoning in the real world often requires inference in continuous variable graphical models, yet there are few methods for exact, closed-form inference when joint distributions are non-Gaussian. To address this inferential deficit, we introduce SVE -- a symbolic extension of the well-known variable elimination algorithm to perform exact inference in an expressive class of mixed discrete and continuous variable graphical models whose conditional probability functions can be well-approximated as piecewise combinations of polynomials with bounded support. Using this representation, we show that we can compute all of the SVE operations exactly and in closed-form, which crucially includes definite integration w.r.t. multivariate piecewise polynomial functions. To aid in the efficient computation and compact representation of this solution, we use an extended algebraic decision diagram (XADD) data structure that supports all SVE operations. We provide illustrative results for SVE on probabilistic inference queries inspired by robotics localization and tracking applications that mix various continuous distributions; this represents the first time a general closed-form exact solution has been proposed for this expressive class of discrete/continuous graphical models.
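The core step that SVE carries out symbolically is the usual integral-product operation of variable elimination; for an eliminated continuous variable x_e and the factors f_i that mention it (notation ours, not the paper's):

    \psi(\mathbf{y}) = \int \prod_{i} f_i(x_e, \mathbf{y}) \, \mathrm{d}x_e

The point of the paper is that when every f_i is piecewise polynomial with bounded support, this integral has a closed form that the XADD can store and reuse.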
19

Dudek-Dyduch, Ewa, Zbigniew Gomolka, Boguslaw Twarog, and Ewa Zeslawska. "Intelligent ALMM System - implementation assumptions for its Knowledge Base." ITM Web of Conferences 21 (2018): 00002. http://dx.doi.org/10.1051/itmconf/20182100002.

Abstract:
The paper introduces the concept of implementation assumptions about the Knowledge Base (KB) system cooperating with intelligent information system for the discrete optimization of problem solving, named Intelligent ALMM System. This system utilizes a modeling paradigm named Algebraic Logical Meta Model of Multistage Decision Processes (ALMM) and its theory, both developed by Dudek-Dyduch E. The system solves combinatorial and discrete optimization problems including NP-hard problems with possible user assistance. The models of problems are stored in a Problem Model Library. In this paper the idea of KB for the storage of the properties of problems is presented. The concept of the KB on problems presented in previous works has been extended by introducing an additional module pertaining to the properties of a problems library. A discussion was presented in the context of the selection of tools that enable the construction of such a library as well as its architecture. In the adopted strategy of storing the properties of problems, the interface for exchanging information is compatible with the library of problems using polymorphic and component properties of object-oriented programming. Considerations are explained by means of a sample UML diagram and interface prototypes.
20

Gossen, Frederik, and Bernhard Steffen. "Algebraic aggregation of random forests: towards explainability and rapid evaluation." International Journal on Software Tools for Technology Transfer, September 29, 2021. http://dx.doi.org/10.1007/s10009-021-00635-x.

Abstract:
Random Forests are one of the most popular classifiers in machine learning. The larger they are, the more precise the outcome of their predictions. However, this comes at a cost: it is increasingly difficult to understand why a Random Forest made a specific choice, and its running time for classification grows linearly with the size (number of trees). In this paper, we propose a method to aggregate large Random Forests into a single, semantically equivalent decision diagram which has the following two effects: (1) minimal, sufficient explanations for Random Forest-based classifications can be obtained by means of a simple three step reduction, and (2) the running time is radically improved. In fact, our experiments on various popular datasets show speed-ups of several orders of magnitude, while, at the same time, also significantly reducing the size of the required data structure.
21

Mladenov, Martin, Vaishak Belle, and Kristian Kersting. "The Symbolic Interior Point Method." Proceedings of the AAAI Conference on Artificial Intelligence 31, no. 1 (February 12, 2017). http://dx.doi.org/10.1609/aaai.v31i1.10699.

Abstract:
Numerical optimization is arguably the most prominent computational framework in machine learning and AI. It can be seen as an assembly language for hard combinatorial problems ranging from classification and regression in learning, to computing optimal policies and equilibria in decision theory, to entropy minimization in information sciences. Unfortunately, specifying such problems in complex domains involving relations, objects and other logical dependencies is cumbersome at best, requiring considerable expert knowledge, and solvers require models to be painstakingly reduced to standard forms. To overcome this, we introduce a rich modeling framework for optimization problems that allows convenient codification of symbolic structure. Rather than reducing this symbolic structure to a sparse or dense matrix, we represent and exploit it directly using algebraic decision diagrams (ADDs). Combining efficient ADD-based matrix-vector algebra with a matrix-free interior-point method, we develop an engine that can fully leverage the structure of symbolic representations to solve convex linear and quadratic optimization problems. We demonstrate the flexibility of the resulting symbolic-numeric optimizer on decision making and compressed sensing tasks with millions of non-zero entries.
22

Shtrakov, Slavcho. "Minor complexity of discrete functions." Applied Computing and Informatics ahead-of-print, ahead-of-print (August 18, 2020). http://dx.doi.org/10.1016/j.aci.2018.07.003.

Abstract:
In this paper we study a class of complexity measures, induced by a new data structure for representing k-valued functions (operations), called minor decision diagram. When assigning values to some variables in a function the resulting functions are called subfunctions, and when identifying some variables the resulting functions are called minors. The sets of essential variables in subfunctions of f are called separable in f. We examine the maximal separable subsets of variables and their conjugates, introduced in the paper, proving that each such set has at least one conjugate. The essential arity gap gap(f) of the function f is the minimal number of essential variables in f which become fictive when identifying distinct essential variables in f. We also investigate separable sets of variables in functions with non-trivial arity gap. This allows us to solve several important algebraic, computational and combinatorial problems about the finite-valued functions.
23

Vianna, Luis, Leliane De Barros, and Scott Sanner. "Real-Time Symbolic Dynamic Programming." Proceedings of the AAAI Conference on Artificial Intelligence 29, no. 1 (March 4, 2015). http://dx.doi.org/10.1609/aaai.v29i1.9651.

Abstract:
Recent advances in Symbolic Dynamic Programming (SDP) combined with the extended algebraic decision diagram (XADD) have provided exact solutions for expressive subclasses of finite-horizon Hybrid Markov Decision Processes (HMDPs) with mixed continuous and discrete state and action parameters. Unfortunately, SDP suffers from two major drawbacks: (1) it solves for all states and can be intractable for many problems that inherently have large optimal XADD value function representations; and (2) it cannot maintain compact (pruned) XADD representations for domains with nonlinear dynamics and reward due to the need for nonlinear constraint checking. In this work, we simultaneously address both of these problems by introducing real-time SDP (RTSDP). RTSDP addresses (1) by focusing the solution and value representation only on regions reachable from a set of initial states and RTSDP addresses (2) by using visited states as witnesses of reachable regions to assist in pruning irrelevant or unreachable (nonlinear) regions of the value function. To this end, RTSDP enjoys provable convergence over the set of initial states and substantial space and time savings over SDP as we demonstrate in a variety of hybrid domains ranging from inventory to reservoir to traffic control.
24

Klarlund, Nils. "An n log n Algorithm for Online BDD Refinement." BRICS Report Series 2, no. 29 (January 29, 1995). http://dx.doi.org/10.7146/brics.v2i29.19931.

Abstract:
Binary Decision Diagrams are in widespread use in verification systems for the canonical representation of Boolean functions. A BDD representing a function phi : B^nu -> N can easily be reduced to its canonical form in linear time. In this paper, we consider a natural online BDD refinement problem and show that it can be solved in O(n log n) if n bounds the size of the BDD and the total size of update operations. We argue that BDDs in an algebraic framework should be understood as minimal fixed points superimposed on maximal fixed points. We propose a technique of controlled growth of equivalence classes to make the minimal fixed point calculations be carried out efficiently. Our algorithm is based on a new understanding of the interplay between the splitting and growing of classes of nodes. We apply our algorithm to show that automata with exponentially large, but implicitly represented alphabets, can be minimized in time O(n log n), where n is the total number of BDD nodes representing the automaton.
25

Mahmoud, Ahmed, and Jaka Sunarso. "A mixed integer nonlinear programming approach for integrated bio-refinery and petroleum refinery topology optimization." Chemical Product and Process Modeling, July 20, 2020. http://dx.doi.org/10.1515/cppm-2019-0124.

Abstract:
The conversion of biomass into gasoline and diesel in bio-refinery process is an attractive process given its carbon neutral and sustainable nature. The economics of bio-refinery can be improved via integration with petroleum refinery, whereby bio-refinery intermediates can be processed into gasoline and diesel in the well-established petroleum refinery processing units, i.e., hydrocracking (HC) and fluidized catalytic cracking (FCC) units. However, the integration of the new bio-refinery into the existing petroleum refinery may not give the optimum solution given the capacities constraints of the existing petroleum refinery upgrading units such as FCC and HC units. Thus, this work proposed a superstructure comprising new bio-refinery and new petroleum refinery block diagrams. The superstructure was formulated into mixed integer nonlinear programming (MINLP) model. The model was coded into general algebraic modeling system (GAMS) platform and solved using global optimum solver, LINDOGLOBAL. The model application was demonstrated using representative case study. The model results showed that the optimum integrated bio-refinery and petroleum refinery topology favors the upgrading of bio-refinery intermediates using petroleum refinery HC unit under one-through operation mode with a marginal increase in the profit of about 0.39% compared to the second optimum case of upgrading bio-refinery intermediate using petroleum refinery FCC unit under gasoline operation mode. Thus, the decision in selecting the most suitable topology can be made based on the market demand for gasoline and diesel as the topology that uses FCC maximizes gasoline production and the topology that uses HC maximizes diesel production.
26

Kinsner, Witold, Dario Schor, Samantha Olson, and Mount First Ng. "LEARNING HOW TO DESIGN A GRAPHICS PROCESSOR." Proceedings of the Canadian Engineering Education Association (CEEA), June 20, 2012. http://dx.doi.org/10.24908/pceea.v0i0.4716.

Abstract:
This paper describes the design and implementation of a rudimentary graphics processor, called GRAFIX, intended for use in a simple handheld gaming console. The processor is a part of a laboratory in an undergraduate course on Digital Systems Design 2 (DSD2) [1-2]. The GRAFIX processor can perform two different operations: (a) drawing an individual pixel, and (b) drawing a line using the Bresenham line drawing algorithm. The DSD2 course provides foundational material on discrete mathematics and the theory of modern very-large switching circuits. It presents computer engineering students with a firm foundation in the modern theory of optimal logic design. It illustrates some applications through formal characterization of combinational functions and sequential machines, using contemporary techniques for the automatic synthesis and diagnosis of digital systems. It discusses (i) the design of VLSI systems with problems and approaches; (ii) ground-up development of algebraic structures, lattices, Boolean algebras for a generalized switching theory; (iii) exact optimization of two-level switching functions; (iv) heuristic techniques for the optimization of two-level logic circuits; (v) analysis, synthesis and optimization of complemented binary decision diagrams (BDDs) [3]; and (vi) provides design examples throughout the course. This fairly high-level lab design of a graphics processor is possible because the students have acquired the necessary prerequisite knowledge from previous courses. None of the courses, however, has attempted to develop a complete processor of such scale. The machine specifications include: (a) interfacing of the host to the GRAFIX unit may be synchronous or asynchronous (one must be selected and the consequences of the choice must be discussed); (b) the frame buffer can be either a single port memory or a dual port memory (again, one must be selected, and the consequences of the choice must be discussed); (c) interfacing of the GRAFIX unit to the frame buffer may be synchronous or asynchronous. The GRAFIX lab is split into four sessions: (i) familiarization with tools and design of the arithmetic logic unit (ALU) for GRAFIX, (ii) design of its Data Path Unit (DPU), (iii) design of its Computing Control Unit (CCU), and (iv) integration and testing. The objectives of those sessions are to learn how to (a) formulate an architecture of a simple graphics processor, (b) formulate an appropriate ALU, (c) formulate a DPU and CCU, (d) formulate a Command Interpreter, (e) formulate a Controller/Sequencer, (f) formulate the CCU interfacing with the GRAFIX I/O and with the DPU, (g) construct test procedures for each unit and incremental testing, (h) construct a supervisory module to provide the completed GRAFIX processor with test input, (i) integrate the system, (j) test operation of the GRAFIX processor, and (k) describe the system. VERILOG is used as the hardware-description language of choice. The paper provides a detailed description of the architecture of the graphics processor, its design and implementation, as well as experience from running the laboratory many times.
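Since the lab's second drawing operation is the classic Bresenham algorithm, a compact reference version may help readers picture what the GRAFIX datapath has to compute (shown here in Python for brevity; the course itself uses Verilog):

    def bresenham(x0, y0, x1, y1):
        """Yield the integer pixels of the line from (x0, y0) to (x1, y1)."""
        dx, dy = abs(x1 - x0), -abs(y1 - y0)
        sx = 1 if x0 < x1 else -1
        sy = 1 if y0 < y1 else -1
        err = dx + dy                  # combined error term for both axes
        while True:
            yield (x0, y0)
            if x0 == x1 and y0 == y1:
                return
            e2 = 2 * err
            if e2 >= dy:               # error says: step in x
                err += dy
                x0 += sx
            if e2 <= dx:               # error says: step in y
                err += dx
                y0 += sy

    # Example: list(bresenham(0, 0, 6, 3)) yields the 7 pixels of a shallow line.

The algorithm needs only integer additions, subtractions and comparisons per pixel, which is exactly why it maps cleanly onto a small ALU and datapath of the kind the students design.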
