Follow this link to see other types of publications on the topic: Formal and symbolic calculation.

Journal articles on the topic "Formal and symbolic calculation"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic "Formal and symbolic calculation".

Next to each source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Deng, Hui and Jinzhao Wu. "Approximate Bisimulation and Optimization of Software Programs Based on Symbolic-Numeric Computation". Mathematical Problems in Engineering 2013 (2013): 1–19. http://dx.doi.org/10.1155/2013/421926.

Full text
Abstract
To achieve behavior and structure optimization for a type of software program whose data exchange processes are represented by nonlinear polynomial systems, this paper establishes a novel formal description called a nonlinear polynomial transition system to represent the behavior and structure of the software program. Then, the notion of bisimulation for software programs is proposed based on the equivalence relation of corresponding nonlinear polynomial systems in their nonlinear polynomial transition systems. However, the exact equivalence is too strict in application. To enhance the flexibility of the relation among the different software systems, the notion of approximate bisimulation within a controllable error range and the calculation algorithm of approximate bisimulation based on symbolic-numeric computation are given. In this calculation, an approximate relation is represented as a MAX function that is resolved with the full filled method. At the same time, the actual error is calculable. An example on a multithreading program indicates that the approximate bisimulation relation is feasible and effective in behavior and structure optimization.
2

Plaskura, Paweł. "DERIVWWW - WEB-BASED SYMBOLIC DIFFERENTIATION SYSTEM". Information Technologies and Learning Tools 60, no. 4 (September 30, 2017): 254. http://dx.doi.org/10.33407/itlt.v60i4.1578.

Full text
Abstract
The article presents an online system for symbolic differentiation. It shows how derivatives are calculated. The trees are used for internal representation of formulas. Derivatives are generated by tree transformations. Presented algorithms are part of the microsystems simulator Dero. They are required by the calculation algorithms such as Newton-Raphson. They can be used to generate derivatives in model description languages and for automatic derivative calculation. DerivWWW can be used in didactics. The system can serve as a tool for students to count function derivatives at the point. It can also be used in teaching on the technical fields of study. Symbolic derivatives are saved in the Tex format, allowing easy integration with other software. The developed and implemented algorithms are discussed. Examples of use are given.
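The tree-transformation idea summarized above can be illustrated with a minimal sketch. This is not code from DerivWWW or the Dero simulator, only a hedged illustration of the technique: expressions are nested tuples, and the derivative is produced by recursively rewriting the tree with the sum and product rules.

# Expression trees as nested tuples: ('+', a, b), ('*', a, b), the variable ('x',), or a constant.
def d(e):
    if isinstance(e, (int, float)):   # constant rule: dc/dx = 0
        return 0
    if e == ('x',):                   # dx/dx = 1
        return 1
    op, a, b = e
    if op == '+':                     # sum rule
        return ('+', d(a), d(b))
    if op == '*':                     # product rule
        return ('+', ('*', d(a), b), ('*', a, d(b)))
    raise ValueError('unknown operator: ' + str(op))

# d/dx (x*x + 3) -> ('+', ('+', ('*', 1, ('x',)), ('*', ('x',), 1)), 0), i.e. 2x before simplification
print(d(('+', ('*', ('x',), ('x',)), 3)))

A production system of the kind described would add further rules (quotient, chain, elementary functions) plus a simplification pass over the resulting tree, and would render the output in TeX.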
3

Constantine, Gregory M. and Marius G. Buliga. "Determinantal generating functions of colored spanning forests". International Journal of Mathematics and Mathematical Sciences 2004, no. 6 (2004): 273–83. http://dx.doi.org/10.1155/s0161171204302206.

Full text
Abstract
The color type of a spanning forest of a graph with colored edges is defined and, subsequently, it is proved that the generating function of such spanning forests is obtained as the formal expansion of a certain determinant. An analogous determinantal expansion yields the generating function of all spanning forests of a given color type that contain a specific subforest. Algorithms are described for obtaining a list of all colored spanning trees and spanning forests of any graph with colored edges based on symbolic calculation.
4

Yan, Zongshuai, Chenhua Nie, Rongsheng Dong, Xi Gao and Jianming Liu. "A Novel OBDD-Based Reliability Evaluation Algorithm for Wireless Sensor Networks on the Multicast Model". Mathematical Problems in Engineering 2015 (2015): 1–14. http://dx.doi.org/10.1155/2015/269781.

Full text
Abstract
The two-terminal reliability calculation for wireless sensor networks (WSNs) is a #P-hard problem. The reliability calculation of WSNs on the multicast model provides an even worse combinatorial explosion of node states with respect to the calculation of WSNs on the unicast model; many real WSNs require the multicast model to deliver information. This research first provides a formal definition for the WSN on the multicast model. Next, a symbolic OBDD_Multicast algorithm is proposed to evaluate the reliability of WSNs on the multicast model. Furthermore, our research on OBDD_Multicast construction avoids the problem of invalid expansion, which reduces the number of subnetworks by identifying the redundant paths of two adjacent nodes and s-t unconnected paths. Experiments show that the OBDD_Multicast both reduces the complexity of the WSN reliability analysis and has a lower running time than Xing’s OBDD- (ordered binary decision diagram-) based algorithm.
5

CERVESATO, ILIANO. "NEXCEL, a deductive spreadsheet". Knowledge Engineering Review 22, no. 3 (September 2007): 221–36. http://dx.doi.org/10.1017/s0269888907001142.

Full text
Abstract
Usability and usefulness have made the spreadsheet one of the most successful computing applications of all times: millions rely on it every day for anything from typing grocery lists to developing multimillion-dollar budgets. One thing spreadsheets are not very good at is manipulating the symbolic data and helping users make decisions based on them. By tapping into recent research in Logic Programming, Databases and Cognitive Psychology, we propose a deductive extension to the spreadsheet paradigm that precisely addresses this issue. The accompanying tool, which we call NEXCEL, is intended as an automated assistant for the daily reasoning and decision-making needs of computer users, in the same way as a spreadsheet application such as Microsoft Excel assists them every day with simple and complex calculations. Users without formal training in Logic or even Computer Science can interactively define logical rules in the same simple way as they define formulas in Excel. NEXCEL immediately evaluates these rules, thereby returning lists of values that satisfy them, again just like with numerical formulas. The deductive component is seamlessly integrated into the traditional spreadsheet so that a user not only still has access to the usual functionalities but is also able to use them as part of the logical inference and, dually, to embed deductive steps in a numerical calculation.
6

Selot, Florian, Bruno Robisson, Claire Vaglio-Gaudard and Javier Gil-Quijano. "Formal modelling of the electricity markets: the example of the load reduction of electricity mechanism “NEBEF”". IOP Conference Series: Earth and Environmental Science 897, no. 1 (November 1, 2021): 012017. http://dx.doi.org/10.1088/1755-1315/897/1/012017.

Full text
Abstract
Abstract The liberalisation of the electricity market initiated at the beginning of the 21st century has opened it to new parties. To ensure the growth of participants’ number will support the system’s balance, the EU regulation 2019/943 confirms that “all market participants should be financially responsible of the imbalances they cause”. In their respective area, the transmission system operators develops the regulation in compliance with this condition. However, as the regulation takes into account the new realities of the market such as renewables, the interactions between the participants become more complex. One of the risks is that the imbalance of an actor may not be due to its own actions, not complying with the EU regulation then. To analyse this kind of implicit condition, we propose a formal approach to model the exchanges of energy. Using the French regulation as a base, we model the participants and their interactions in the form of symbolic equations using the energy-related terms as variables. In this paper, to illustrate the model we will use to analyse the entire electricity market, we apply it to the NEBEF mechanism only. This mechanism is dedicated to the selling of demand response in France and introduces a third party between the final producer and the final consumer: the demand response operator. We model the mechanism and analyse how the mechanism complies with the balancing responsibility. Our results demonstrate that the mechanism complies with the regulation but there are some limits due to the calculation method of the reference consumption.
7

Røyrvik, Ola. "Teaching Electrical Engineering Using Maple". International Journal of Electrical Engineering & Education 39, no. 4 (October 2002): 297–309. http://dx.doi.org/10.7227/ijeee.39.4.1.

Full text
Abstract
Many electrical engineering (EE) students have difficulty in learning technical subjects because they lack sufficient competence in mathematical modeling and in algebra. Maple is a powerful program for doing symbolic algebra, numerical calculation, and plotting of graphs, so using this program allows students to spend more time on modeling and interpreting results. Maple also has a text editor, which makes it feasible to require students to explain their results in writing. The design of Maple documents suitable for EE teaching is discussed; a standard format, including bibliographical information, is recommended for easier use.
8

Noël, Marie-Pascale and Xavier Seron. "Notational Constraints and Number Processing: A Reappraisal of the Gonzalez and Kolers (1982) Study". Quarterly Journal of Experimental Psychology Section A 45, no. 3 (September 1992): 451–78. http://dx.doi.org/10.1080/02724989208250623.

Full text
Abstract
In a verification task of simple additions composed of Arabic or Roman numerals, Gonzalez and Kolers (1982) reported data that were interpreted as supporting the idea that cognitive operations are not independent of the symbols that instigate them. We propose an alternative interpretation of these results and argue that the effects reported may have been produced by a peculiarity of the Roman code for which the encoding time would not be constant for all numerals. We hypothesize that three different “structures” can be distinguished in the Roman code, and that the time necessary to encode a numeral would vary according to its structure, with the analogical (numerals I, II, and III) and the symbolic (V, X) structures being processed faster than the complex structures (IV, VI, VII, VIII, IX, XI,…). This structure effect is tested in two experiments: a verification of transcoded forms and a parity judgement. Data repeatedly showed support for this hypothesis. Moreover, a verification task for additions showed that the presentation format of the addends played a role in the encoding stage but did not interact with variables relative to the size of the addition problems. These data could thus sustain the hypothesis of a “translation model”, according to which numerals would be translated into a specific code to which the calculation process would be applied.
9

Topilnytskyy, Volodymyr, Yaroslav Kusyi and Dariya Rebot. "RESEARCH OF VIBRATION MACHINES DYNAMICS FOR PRODUCT SURFACES PROCESSING BY MATHEMATICAL MODELING". Vibrations in engineering and technology, no. 1(96) (August 27, 2020): 35–43. http://dx.doi.org/10.37128/2306-8744-2020-1-4.

Full text
Abstract
The article describes the methodology for the study of the dynamics of vibrating machines for surface processing of products by mathematical modeling, which is presented in four main stages. The first stage: analysis of classes of vibrating machines for surface treatment of products, choice of basic for solving the technological problem, project of a unified calculation scheme of the machine. The second stage: development of a nonlinear mathematical model for describing the dynamics of the vibration machine working body and its filling, development of elements of automated calculations of the machine. The third stage: the study of the influence of the parameters of the vibrating machine, product sets and tools (with their various combinations) on the factors of the intensity of products surface processing. The fourth stage: recommendations for choosing vibrating machine parameters and machining bodies that will maximize the processing performance of products with the selected intensity criterion. A mathematical model for describing the motion of a vibrating machine for surface treatment of articles by a set of unrelated bodies of small size is created. It has two unbalance units that generate oscillations of its working body and a spring suspension-mounting of the working chamber (container). The model is parametric and nonlinear, incorporating key dynamic, kinematic and geometric parameters of the vibrating machine in symbolic format. It is constructed by: descriptions of the plane-parallel movement of the mechanical system, the rotational motion of the material point and the body; second-order Lagrange equation; asymptotic (approximate) methods of nonlinear mechanics. With the help of the model it is possible: to describe the oscillatory movement of the working chamber (container) of the vibrating machine; to study the influence of the machine parameters on the efficiency of performance of the set technological task, the conditions of occurrence of non-stationary modes of operation of the vibrating machine and the ways of their regulation.
10

Zhang, Yujian and Daifu Liu. "Toward Vulnerability Detection for Ethereum Smart Contracts Using Graph-Matching Network". Future Internet 14, no. 11 (November 11, 2022): 326. http://dx.doi.org/10.3390/fi14110326.

Full text
Abstract
With the blooming of blockchain-based smart contracts in decentralized applications, the security problem of smart contracts has become a critical issue, as vulnerable contracts have resulted in severe financial losses. Existing research works have explored vulnerability detection methods based on fuzzing, symbolic execution, formal verification, and static analysis. In this paper, we propose two static analysis approaches called ASGVulDetector and BASGVulDetector for detecting vulnerabilities in Ethereum smart contracts from source-code and bytecode perspectives, respectively. First, we design a novel intermediate representation called abstract semantic graph (ASG) to capture both syntactic and semantic features from the program. ASG is based on syntax information but enriched by code structures, such as control flow and data flow. Then, we apply two different training models, i.e., graph neural network (GNN) and graph matching network (GMN), to learn the embedding of ASG and measure the similarity of the contract pairs. In this way, vulnerable smart contracts can be identified by calculating the similarity to labeled ones. We conduct extensive experiments to evaluate the superiority of our approaches to state-of-the-art competitors. Specifically, ASGVulDetector improves the best of three source-code-only static analysis tools (i.e., SmartCheck, Slither, and DR-GCN) regarding the F1 score by 12.6% on average, while BASGVulDetector improves that of the three detection tools supporting bytecode (i.e., ContractFuzzer, Oyente, and Securify) regarding the F1 score by 25.6% on average. We also investigate the effectiveness and advantages of the GMN model for detecting vulnerabilities in smart contracts.
11

Pang, Alex and Chueng-Ryong Ji. "Symbolic Feynman-Diagram Calculation". Computers in Physics 9, no. 6 (1995): 589. http://dx.doi.org/10.1063/1.168553.

Full text
12

Waring, Rylan J. and Marcie Penner-Wilger. "Estimation of importance: Relative contributions of symbolic and non-symbolic number systems to exact and approximate calculation". Journal of Numerical Cognition 2, no. 3 (February 10, 2017): 202–19. http://dx.doi.org/10.5964/jnc.v2i3.9.

Full text
Abstract
The topic of how symbolic and non-symbolic number systems relate to exact calculation skill has received great discussion for a number of years now. However, little research has been done to examine how these systems relate to approximate calculation skill. To address this question, performance on symbolic and non-symbolic numeric ordering tasks was examined as predictors of Woodcock Johnson calculation (exact) and computation estimation (approximate) scores among university adults (N = 85, 61 female, Mean age = 21.3, range = 18-49 years). For Woodcock Johnson calculation scores, only the symbolic task uniquely predicted performance outcomes in a multiple regression. For the computational estimation task, only the non-symbolic task uniquely predicted performance outcomes. Symbolic system performance mediated the relation between non-symbolic system performance and exact calculation skill. Non-symbolic system performance mediated the relation between symbolic system performance and approximate calculation skill. These findings suggest that symbolic and non-symbolic number system acuity uniquely relate to exact and approximate calculation ability respectively.
13

Deng, Xianghua, Jooyong Lee and Robby. "Efficient and formal generalized symbolic execution". Automated Software Engineering 19, no. 3 (June 9, 2011): 233–301. http://dx.doi.org/10.1007/s10515-011-0089-9.

Full text
14

Young, Thomas. "Introductory Symbolic Logic Without Formal Proofs". Teaching Philosophy 15, no. 3 (1992): 296–98. http://dx.doi.org/10.5840/teachphil199215352.

Full text
15

Costa, Umberto, Sérgio Campos, Newton Vieira and David Déharbe. "Explicit-Symbolic Modelling for Formal Verification". Electronic Notes in Theoretical Computer Science 130 (May 2005): 301–21. http://dx.doi.org/10.1016/j.entcs.2005.03.016.

Full text
16

DIVAKOV, D. V. and A. A. TYUTYUNNIK. "SYMBOLIC-NUMERICAL IMPLEMENTATION OF THE GALERKIN METHOD FOR APPROXIMATE SOLUTION OF THE WAVEGUIDE DIFFRACTION PROBLEM". Программирование, no. 2 (March 1, 2023): 46–53. http://dx.doi.org/10.31857/s0132347423020097.

Full text
Abstract
In this paper, we construct a symbolic-numerical implementation of the Galerkin method for approximate solution of the waveguide diffraction problem at the junction of two open planar three-layer waveguides. The Galerkin method is implemented in the Maple computer algebra system by symbolic manipulations; its software implementation is based on the scprod symbolic-numerical procedure, which enables the numerical calculation of scalar products for the Galerkin method based on symbolic expressions. The use of symbolic manipulations makes it possible to speed up the calculation of integrals in the Galerkin method owing to single-run symbolic calculation of integrals typical for the problem, rather than multiple numerical integration.
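The speed-up described above, computing the integrals symbolically once and reusing them numerically, can be sketched with SymPy on a toy problem (this is not the authors' Maple scprod procedure; the model problem −u″ = 1 on [0, 1] with a sine basis is chosen purely for illustration):

import sympy as sp

x = sp.symbols('x')
N = 4
phi = [sp.sin((n + 1) * sp.pi * x) for n in range(N)]   # basis satisfying u(0) = u(1) = 0

# Galerkin scalar products evaluated symbolically, once, in closed form
A = sp.Matrix(N, N, lambda i, j: sp.integrate(phi[i].diff(x) * phi[j].diff(x), (x, 0, 1)))
b = sp.Matrix([sp.integrate(1 * phi[i], (x, 0, 1)) for i in range(N)])

c = A.LUsolve(b)                                         # Galerkin coefficients
u = sum(ci * p for ci, p in zip(c, phi))
print(sp.simplify(u))                                    # approximates x*(1 - x)/2

Because the scalar products are obtained in closed form, later parameter changes reuse the same symbolic expressions instead of repeating numerical quadrature, which is the point the abstract makes.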
17

Nebel, G., U. Kleine and H. J. Pfleiderer. "Symbolic pole/zero calculation using SANTAFE". IEEE Journal of Solid-State Circuits 30, no. 7 (July 1995): 752–61. http://dx.doi.org/10.1109/4.391114.

Full text
18

Barnett, Michael P. and Istvàn Pelczer. "Pulse sequence editing by symbolic calculation". Journal of Magnetic Resonance 204, no. 2 (June 2010): 189–95. http://dx.doi.org/10.1016/j.jmr.2010.01.009.

Full text
19

Gyimothy, S. and I. Sebestyen. "Symbolic description of field calculation problems". IEEE Transactions on Magnetics 34, no. 5 (1998): 3427–30. http://dx.doi.org/10.1109/20.717807.

Full text
20

Pearson, Jane M. and Noel G. Lloyd. "Space Saving Calculation of Symbolic Resultants". Mathematics in Computer Science 1, no. 2 (October 15, 2007): 267–90. http://dx.doi.org/10.1007/s11786-007-0016-4.

Full text
21

Barnett, Michael P., Joseph F. Capitani, Joachim von zur Gathen and Jürgen Gerhard. "Symbolic calculation in chemistry: Selected examples". International Journal of Quantum Chemistry 100, no. 2 (2004): 80–104. http://dx.doi.org/10.1002/qua.20097.

Full text
22

Barnett, Michael P. and Joseph F. Capitani. "Modular chemical geometry and symbolic calculation". International Journal of Quantum Chemistry 106, no. 1 (2005): 215–27. http://dx.doi.org/10.1002/qua.20807.

Full text
23

de Boer, Frank S. and Marcello Bonsangue. "Symbolic execution formally explained". Formal Aspects of Computing 33, no. 4-5 (February 2, 2021): 617–36. http://dx.doi.org/10.1007/s00165-020-00527-y.

Full text
Abstract
In this paper, we provide a formal explanation of symbolic execution in terms of a symbolic transition system and prove its correctness and completeness with respect to an operational semantics which models the execution on concrete values. We first introduce a formal model for a basic programming language with a statically fixed number of programming variables. This model is extended to a programming language with recursive procedures which are called by a call-by-value parameter mechanism. Finally, we present a more general formal framework for proving the soundness and completeness of the symbolic execution of a basic object-oriented language which features dynamically allocated variables.
24

Porncharoenwase, Sorawee, Luke Nelson, Xi Wang and Emina Torlak. "A formal foundation for symbolic evaluation with merging". Proceedings of the ACM on Programming Languages 6, POPL (January 16, 2022): 1–28. http://dx.doi.org/10.1145/3498709.

Full text
Abstract
Reusable symbolic evaluators are a key building block of solver-aided verification and synthesis tools. A reusable evaluator reduces the semantics of all paths in a program to logical constraints, and a client tool uses these constraints to formulate a satisfiability query that is discharged with SAT or SMT solvers. The correctness of the evaluator is critical to the soundness of the tool and the domain properties it aims to guarantee. Yet so far, the trust in these evaluators has been based on an ad-hoc foundation of testing and manual reasoning. This paper presents the first formal framework for reasoning about the behavior of reusable symbolic evaluators. We develop a new symbolic semantics for these evaluators that incorporates state merging. Symbolic evaluators use state merging to avoid path explosion and generate compact encodings. To accommodate a wide range of implementations, our semantics is parameterized by a symbolic factory, which abstracts away the details of merging and creation of symbolic values. The semantics targets a rich language that extends Core Scheme with assumptions and assertions, and thus supports branching, loops, and (first-class) procedures. The semantics is designed to support reusability, by guaranteeing two key properties: legality of the generated symbolic states, and the reducibility of symbolic evaluation to concrete evaluation. Legality makes it simpler for client tools to formulate queries, and reducibility enables testing of client tools on concrete inputs. We use the Lean theorem prover to mechanize our symbolic semantics, prove that it is sound and complete with respect to the concrete semantics, and prove that it guarantees legality and reducibility. To demonstrate the generality of our semantics, we develop Leanette, a reference evaluator written in Lean, and Rosette 4, an optimized evaluator written in Racket. We prove Leanette correct with respect to the semantics, and validate Rosette 4 against Leanette via solver-aided differential testing. To demonstrate the practicality of our approach, we port 16 published verification and synthesis tools from Rosette 3 to Rosette 4. Rosette 3 is an existing reusable evaluator that implements the classic merging semantics, adopted from bounded model checking. Rosette 4 replaces the semantic core of Rosette 3 but keeps its optimized symbolic factory. Our results show that Rosette 4 matches the performance of Rosette 3 across a wide range of benchmarks, while providing a cleaner interface that simplifies the implementation of client tools.
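The state merging that this abstract formalizes can be demonstrated, independently of Rosette or Lean, with a few lines of Z3's Python bindings; this is only an illustration of the idea, not the paper's development. The two branches of a conditional are folded into a single if-then-else term, so one solver query covers both paths.

from z3 import Int, If, Solver, Not

x = Int('x')
# Program: if x > 0: y = x + 1 else: y = -x
# Instead of exploring two separate paths, merge them into one symbolic value.
y = If(x > 0, x + 1, -x)

s = Solver()
s.add(Not(y >= 0))   # is there an input that violates the assertion y >= 0?
print(s.check())     # unsat: the merged encoding proves the assertion for every x

Merging keeps the number of symbolic states from doubling at each branch, which is why reusable evaluators depend on it to avoid path explosion.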
25

Kawahara, Shigeto, Hironori Katsuda and Gakuji Kumagai. "Accounting for the stochastic nature of sound symbolism using Maximum Entropy model". Open Linguistics 5, no. 1 (May 22, 2019): 109–20. http://dx.doi.org/10.1515/opli-2019-0007.

Full text
Abstract
Sound symbolism refers to stochastic and systematic associations between sounds and meanings. Sound symbolism has not received much serious attention in the generative phonology literature, perhaps because most if not all sound symbolic patterns are probabilistic. Building on the recent proposal to analyze sound symbolic patterns within a formal phonological framework (Alderete and Kochetov 2017), this paper shows that MaxEnt grammars allow us to model stochastic sound symbolic patterns in a very natural way. The analyses presented in the paper show that sound symbolic relationships can be modeled in the same way that we model phonological patterns. We suggest that there is nothing fundamental that prohibits formal phonologists from analyzing sound symbolic patterns, and that studying sound symbolism using a formal framework may open up a new, interesting research domain. The current study also reports two hitherto unnoticed cases of sound symbolism, thereby expanding the empirical scope of sound symbolic patterns in natural languages.
26

Bracken, Paul and Rodney J. Bartlett. "Calculation of Gaussian integrals using symbolic manipulation". International Journal of Quantum Chemistry 62, no. 6 (1997): 557–70. http://dx.doi.org/10.1002/(sici)1097-461x(1997)62:6<557::aid-qua1>3.0.co;2-v.

Full text
27

Yu, Feng and Xinyuan Chen. "Dynamic Modeling and Development of Symbolic Calculation Software for N-DOF Flexible-Link Manipulators Incorporating Lumped Mass". Shock and Vibration 2019 (July 1, 2019): 1–16. http://dx.doi.org/10.1155/2019/5627271.

Full text
Abstract
Considering the complex dynamic modeling of multi-DOF planar flexible manipulators, a general-purpose method for the rigid-flexible coupling dynamic modeling of N-DOF flexible manipulators is proposed in this paper, and symbolic calculation software is developed. The modeling method is based on the Lagrange equation and assumed mode method (AMM). First, the N-DOF flexible manipulator is divided into two parts, which are assumed to be rigid and flexible. On this basis, the rigid part and the flexible part are coupled, and the calculation process of the model is further simplified. Then, the simplest general symbolic expression of the dynamic model of the N-DOF flexible manipulator is obtained with the induction method. According to the modeling method, “symbolic expression computation software for dynamic equations of N-DOF flexible manipulators” is developed using the symbolic calculation software Mathematica. Finally, the dynamic modeling method and the symbolic calculation software are verified by a trajectory tracking experiment with a PD control applied to a 2-DOF flexible manipulator. Compared with the traditional modeling method, the calculation time can be reduced by more than 90% using the modeling method proposed in this paper, which significantly reduces the complexity of the modeling process.
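The symbolic derivation that the paper automates in Mathematica can be illustrated on the smallest possible case with SymPy: applying Lagrange's equation to a single rigid pendulum. The sketch is purely illustrative and far simpler than the N-DOF rigid-flexible model described above.

import sympy as sp

t = sp.symbols('t')
m, l, g = sp.symbols('m l g', positive=True)
theta = sp.Function('theta')(t)

T = m * (l * theta.diff(t))**2 / 2    # kinetic energy
V = -m * g * l * sp.cos(theta)        # potential energy
L = T - V                             # Lagrangian

# Lagrange's equation: d/dt( dL/d(theta_dot) ) - dL/d(theta) = 0
eom = sp.simplify(sp.diff(L, theta.diff(t)).diff(t) - sp.diff(L, theta))
print(sp.Eq(eom, 0))                  # m*l**2*theta'' + g*l*m*sin(theta) = 0

For a flexible link the same recipe is applied after the elastic deflection is expanded in assumed modes; the bookkeeping grows quickly but stays mechanical, which is exactly what makes a symbolic engine attractive here.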
28

Stewart, Sepideh and Michael O. J. Thomas. "Embodied, symbolic and formal thinking in linear algebra". International Journal of Mathematical Education in Science and Technology 38, no. 7 (October 15, 2007): 927–37. http://dx.doi.org/10.1080/00207390701573335.

Full text
29

Thomas, Michael O. J. and Sepideh Stewart. "Eigenvalues and eigenvectors: embodied, symbolic and formal thinking". Mathematics Education Research Journal 23, no. 3 (August 9, 2011): 275–96. http://dx.doi.org/10.1007/s13394-011-0016-1.

Full text
30

Smolensky, Paul. "Symbolic functions from neural computation". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 370, no. 1971 (July 28, 2012): 3543–69. http://dx.doi.org/10.1098/rsta.2011.0334.

Full text
Abstract
Is thought computation over ideas? Turing, and many cognitive scientists since, have assumed so, and formulated computational systems in which meaningful concepts are encoded by symbols which are the objects of computation. Cognition has been carved into parts, each a function defined over such symbols. This paper reports on a research program aimed at computing these symbolic functions without computing over the symbols. Symbols are encoded as patterns of numerical activation over multiple abstract neurons, each neuron simultaneously contributing to the encoding of multiple symbols. Computation is carried out over the numerical activation values of such neurons, which individually have no conceptual meaning. This is massively parallel numerical computation operating within a continuous computational medium. The paper presents an axiomatic framework for such a computational account of cognition, including a number of formal results. Within the framework, a class of recursive symbolic functions can be computed. Formal languages defined by symbolic rewrite rules can also be specified, the subsymbolic computations producing symbolic outputs that simultaneously display central properties of both facets of human language: universal symbolic grammatical competence and statistical, imperfect performance.
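One concrete construction associated with this research program is the tensor product representation: each symbol (filler) is bound to its structural position (role) by an outer product, and a whole structure is the sum of its bindings. The sketch below, with hand-picked orthonormal vectors, is only meant to make the idea tangible; it is not the paper's full framework.

import numpy as np

fillers = {'a': np.array([1., 0., 0.]),
           'b': np.array([0., 1., 0.]),
           'c': np.array([0., 0., 1.])}
roles = {0: np.array([1., 0., 0.]),
         1: np.array([0., 1., 0.]),
         2: np.array([0., 0., 1.])}

def encode(s):
    # a string is the superposition of filler-role outer products
    return sum(np.outer(fillers[ch], roles[i]) for i, ch in enumerate(s))

def unbind(T, position):
    # with orthonormal roles, a matrix-vector product recovers the filler vector
    return T @ roles[position]

T = encode('cab')
print(unbind(T, 1))   # [1. 0. 0.] -> the filler 'a'

Every step is numerical (outer products, sums, matrix-vector products), yet the symbolic structure is encoded and decoded exactly, which is the phenomenon the abstract is about.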
31

Pang, Alex C. Y. and Chueng-Ryong Ji. "A Spinor Technique in Symbolic Feynman Diagram Calculation". Journal of Computational Physics 115, no. 2 (December 1994): 267–75. http://dx.doi.org/10.1006/jcph.1994.1194.

Full text
32

Cory, Beth and Ken W. Smith. "Delving into Limits of Sequences". Mathematics Teacher 105, no. 1 (August 2011): 48–55. http://dx.doi.org/10.5951/mathteacher.105.1.0048.

Full text
33

Chipere, Ngoni. "Sentence comprehension: The integration of habits and rules. David J. Townsend and Thomas G. Bever. Cambridge, MA: MIT Press, 2001. Pp. 455." Applied Psycholinguistics 23, no. 3 (September 2002): 471–77. http://dx.doi.org/10.1017/s0142716402213089.

Full text
Abstract
This book attempts to integrate symbolic processing, in the form of minimalism, with connectionism. Minimalism represents sentences as symbolic structures resulting from a formal process of syntactic derivation. Connectionism, on the other hand, represents sentences as patterns of association between linguistic features. These patterns are said to obey statistical regularities of linguistic usage instead of formal linguistic rules. The authors of the book argue that human sentence processing displays both structural and statistical characteristics and therefore requires the integration of the two views.
34

Peng, Ji. "Application and Popularization of Formal Calculation". OALib 09, no. 11 (2022): 1–21. http://dx.doi.org/10.4236/oalib.1109483.

Full text
35

Graham-Squire, Adam. "Calculation of local formal Mellin transforms". Pacific Journal of Mathematics 283, no. 1 (June 14, 2016): 115–37. http://dx.doi.org/10.2140/pjm.2016.283.115.

Full text
36

Graham-Squire, Adam. "Calculation of local formal Fourier transforms". Arkiv för Matematik 51, no. 1 (April 2013): 71–84. http://dx.doi.org/10.1007/s11512-011-0156-2.

Full text
37

Brkić, Dejan, Pavel Praks, Renáta Praksová and Tomáš Kozubek. "Symbolic Regression Approaches for the Direct Calculation of Pipe Diameter". Axioms 12, no. 9 (August 31, 2023): 850. http://dx.doi.org/10.3390/axioms12090850.

Full text
Abstract
This study provides novel and accurate symbolic regression-based solutions for the calculation of pipe diameter when flow rate and pressure drop (head loss) are known, together with the length of the pipe, absolute inner roughness of the pipe, and kinematic viscosity of the fluid. PySR and Eureqa, free and open-source symbolic regression tools, are used for discovering simple and accurate approximate formulas. Three approaches are used: (1) brute force of computing power, which provides results based on raw input data; (2) an improved method where input parameters are transformed through the Lambert W-function; (3) a method where the results are based on inputs and the Colebrook equation transformed through new suitable dimensionless groups. The discovered models were simplified by the WolframAlpha simplify tool and/or the equivalent Matlab Symbolic toolbox. Novel models make iterative calculus redundant; they are simple for computer coding while the relative error remains lower compared with the solution through nomograms. The symbolic-regression solutions discovered by brute force computing power discard the kinematic viscosity of the fluid as an input parameter, implying that it has the least influence.
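For contrast with the explicit formulas the paper discovers, the conventional route to the pipe diameter is iterative: the Darcy–Weisbach equation is solved for D with a guessed friction factor, which is then updated from the Colebrook equation until the two are consistent. The sketch below uses only standard hydraulics relations, not the authors' symbolic-regression models, and the example values are invented.

import math

def pipe_diameter(Q, L, h_f, eps, nu, g=9.81, tol=1e-12):
    # Q flow rate [m^3/s], L length [m], h_f head loss [m], eps roughness [m], nu kinematic viscosity [m^2/s]
    f = 0.02                                             # initial guess for the Darcy friction factor
    for _ in range(100):
        # Darcy-Weisbach solved for D:  h_f = 8 f L Q^2 / (pi^2 g D^5)
        D = (8 * f * L * Q**2 / (math.pi**2 * g * h_f)) ** 0.2
        Re = 4 * Q / (math.pi * D * nu)                  # Reynolds number
        # Colebrook: 1/sqrt(f) = -2 log10( eps/(3.7 D) + 2.51/(Re sqrt(f)) )
        f_new = (-2 * math.log10(eps / (3.7 * D) + 2.51 / (Re * math.sqrt(f)))) ** -2
        if abs(f_new - f) < tol:
            break
        f = f_new
    return (8 * f * L * Q**2 / (math.pi**2 * g * h_f)) ** 0.2

print(pipe_diameter(Q=0.05, L=100.0, h_f=2.0, eps=1e-4, nu=1e-6))

The symbolic-regression formulas reported in the paper aim to make exactly this loop unnecessary by giving D directly from the inputs.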
38

Seger, Carl-Johan H. and Randal E. Bryant. "Formal verification by symbolic evaluation of partially-ordered trajectories". Formal Methods in System Design 6, no. 2 (March 1995): 147–89. http://dx.doi.org/10.1007/bf01383966.

Full text
39

Hojati, Ramin and Robert K. Brayton. "An environment for formal verification based on symbolic computations". Formal Methods in System Design 6, no. 2 (March 1995): 191–216. http://dx.doi.org/10.1007/bf01383967.

Full text
40

Richards, Joan L. "Generations of Reason: A Family's Search for Meaning in Post-Newtonian England". Perspectives on Science and Christian Faith 75, no. 1 (March 2023): 63–65. http://dx.doi.org/10.56315/pscf3-23richards.

Full text
Abstract
GENERATIONS OF REASON: A Family's Search for Meaning in Post-Newtonian England by Joan L. Richards. New Haven, CT: Yale University Press, 2021. 456 pages, with 21 b/w illustrations, 1,218 endnotes, and a 35-page index. Hardcover; $45.00. ISBN: 9780300255492. *The title gives no clue who this book is about. Nor does the publisher's description on its website, the abbreviated blurb inside the book jacket, the four endorsements posted on the jacket's back ("beautifully written," "epic masterpiece," "magnificent study," "compelling and wide-ranging"), or even the chapter titles. The reader first learns whom the book is about and how it came into focus in the author's Acknowledgments. In studying the divergent interests of Augustus De Morgan and his wife, Sophia, the importance of De Morgan's father-in-law William Frend's thinking became apparent. This in turn led Richards to delve into the lives and beliefs of two ancestors from the previous generation, Francis Blackburne and Theophilus Lindsey, who felt compelled by their commitment to "reasoned conclusions about matters of faith" (p. x) to move away from orthodox Anglicanism and establish the first Unitarian church in England. Thus the book eventually evolved into chronicling the lives of three generations over a century and a half during (roughly) the Enlightenment era. *A central motif running through the experiences, beliefs, and work of these families was their steadfast commitment to a form of enlightened rationality that provided coherence and foundational meaning for their lives. Reason informed their ecclesiastical commitment to Unitarianism, their views of science and mathematics, and their public activity favoring social and educational reforms. But also, paradoxically, their search for reason led to the beliefs and practices (of some family members) that today would be considered pseudo-scientific--mesmerism, phrenology, and spiritism, among others. *As Richards notes in the book's opening sentence, for her, Generations of Reason is "the culmination of a life devoted to understanding the place of mathematics in modern European cultural and intellectual history." The mathematics and logic of early- to mid-nineteenth-century Britain has been an ongoing research interest for Richards during her forty-year tenure as a historian of mathematics at Brown University. It is this that largely drew me to the book and which I will focus on here: it climaxes in a substantive treatment of the progressive mathematics of De Morgan, whose work contributed to transforming British algebra and logic. This is in stark contrast with the radical ideas of Frend, who refused to admit negative numbers into mathematics. *A central figure behind the developments under investigation is John Locke, whose Essay Concerning Human Understanding (1689) and The Reasonableness of Christianity, as Delivered in the Scriptures (1695) exercised a tremendous influence over and challenge for eighteenth- and nineteenth-century British thinkers. Locke's ideas defined and emphasized rationality in relation to knowledge generally and to scientific and religious knowledge in particular, providing dissenters with a rationale for combatting traditional theology and conformist science and philosophy. For Locke, however, a literal reading of scripture was still authoritative for religious beliefs. This was true for Frend and De Morgan also, even though they held tolerant attitudes toward a wide latitude of thinkers.
*Locke's view of Reason also affected period reflections on mathematics. Like others in the early modern and Enlightenment eras, Locke had held up mathematics as a model of absolutely certain knowledge because of the clarity of its ideas and the supposed self-evidence of its axiomatic truths. Of course, this characterization applied more to Euclidean geometry than to the burgeoning domains of analytic mathematics, such as calculus, which, as Berkeley charged, still lacked a sound theoretical basis. As for logic, Locke had an acute antipathy toward traditional argument forms and proposed that one should reason with ideas rather than words, assessing their agreement or disagreement in less convoluted ways than in a syllogism. In expressing such relations with language, though, one should use meaningful and unambiguous terms. This was somewhat problematic in algebra and calculus, where symbolic expressions were manipulated to produce useful and important results, even when their meaning was less than clear. *Around the turn of the nineteenth century, Frend campaigned to bring algebra in line with Lockean reasoning: algebra was conceptualized at that time as universal arithmetic, containing such laws as the transposition rule if a + b = c then a = c - b. Thus, no expression should be employed if its meaning was unintelligible. In the above equations, one must assume the condition b < c to rule out negative values, since numbers, which represent quantities of discrete things, cannot be less than 0. Excising negative quantities from mathematics was extreme but necessary in order to adhere to a literalistic view of rationality. *British mathematicians largely resisted following Frend down this path of purity, though they were unsure how to rationally justify their use of negative and imaginary quantities without going outside mathematics and appealing to things like debts. Robert Woodhouse, in an 1803 work, was one of the first Cambridge mathematicians to propose a more formalistic algebraic approach in calculus. This agenda was furthered a decade later by members of Cambridge's Analytical Society, one of whom was George Peacock. His and others' attempts to convert Cambridge analysis from Newtonian to Leibnizian calculus were waged through translating a French textbook and making notational changes in Cambridge's mathematical examinations. *In 1830 Peacock's Treatise on Algebra introduced a more formalistic approach in algebra. Richards argues, drawing upon some fairly recent research, that Peacock's position was grounded in a progressivist view of history: arithmetic developed naturally out of fluency with counting, and algebra out of familiarity with arithmetic. Arithmetic suggests equivalent forms (equations, or symbolic assertions like the above rule) that can also be accepted as equivalent/valid in algebra without being constrained by restrictions appropriate to arithmetic. Such transitions, he thought, constitute genuine historical progress. Algebra thus splits into two parts for Peacock, arithmetical algebra and symbolical algebra, the latter based upon his principle of the permanence of equivalent forms, as found in his 1830 A Treatise on Algebra. *Peacock's approach to algebra set the stage for later British mathematicians such as De Morgan (Peacock's student), Boole, and others. Initially inclined to follow his future father-in-law's restrictive approach in algebra, De Morgan was soon won over to Peacock's point of view, even going beyond it in his own work.
In a series of articles around 1840, De Morgan identified the basic rules governing ordinary calculations, but he also began entertaining the notion of a symbolical algebra less tightly tied to arithmetical algebra. By more completely separating the interpretation of algebra's operations and symbols from its axioms, symbolical algebra gained further independence from arithmetic. This gave algebra more flexibility, making room for subsequent developments such as the quaternion algebra of William Rowan Hamilton (1843) and Boole's algebra of logic (1847). *After exploring the foundations of algebra, De Morgan turned his attention to analyzing forms of reasoning, a topic made popular by the resurgence of syllogistic logic instigated at Oxford around 1825 by Richard Whately. Traditional Aristotelian logic parsed valid arguments into syllogisms containing categorical statements such as every X is Y. De Morgan treated such sentences extensionally, using parentheses to indicate total or partial inclusion between classes X and Y. Thus, every X is Y was symbolized by X)Y since the parenthesis opens toward X; to be more precise, one should indicate whether X and Y are coextensive or X is only a part of Y. By thus quantifying the predicate, as it was called, De Morgan allowed for these two possibilities to be symbolized respectively by X)(Y and X))Y, in compact symbolic form as ')(' and '))'. Combining the two premises of a syllogistic argument using this notation, one could then apply an erasure rule to draw its conclusion. De Morgan enthusiastically elaborated his symbolic logic by adopting an abstract version of algebra that paved the way for operating with formal symbols in logic. De Morgan's symbolism is not as inaccessible as Frege's later two-dimensional concept-writing (though the full version of De Morgan's notation is more complex than indicated here), but it is still rather forbidding and failed to find adherents. *In addition to expanding Aristotelian forms by quantifying the predicate, yielding eight basic categorical forms instead of the standard four, by 1860 De Morgan was generalizing the copula "is" in such sentences to other relations, such as "is a brother of" or "is greater than." He began to systematically investigate the formal properties of such relations and the ways in which relations might be compounded. Though intended as a way to generalize categorical statements and expand syllogistic logic, his treatment of relations was later recognized as an important contribution that could be incorporated into predicate logic. Richards's treatment gives the reader a fair sense of what De Morgan's logic was like, and while a detailed comparison is not developed, the reader can begin to see how De Morgan's system compares to Aristotelian logic, Boole's algebra of logic, and contemporary mathematical logic. *However, as indicated at the outset, exploring De Morgan's algebraic and logical work is only a subplot of Richards's story. Her book is principally a brief for how Reason grounded the work and lives of several significant thinkers in an extended family over three generations. As she ties various threads together, the reader occasionally senses that the presentation may be too tidy, drawing parallels between vastly different developments to make them seem of a piece, all motivated by the same driving force of Reason.
Nevertheless, Richards's account forces the reader to continually keep the bigger picture in mind and to connect various facets of the actors' lives and work to their deeper commitment to Reason. Her book thus offers a commendable case study for how technical trends in mathematics might be tied to broader cultural and philosophical concerns. *Reviewed by Calvin Jongsma, Professor Emeritus of Mathematics, Dordt University, Sioux Center, IA 51250.
41

Gordon, Michael J. C. "Programming Combinations of Deduction and BDD-based Symbolic Calculation". LMS Journal of Computation and Mathematics 5 (2002): 56–76. http://dx.doi.org/10.1112/s1461157000000693.

Full text
Abstract
A generalisation of Milner's ‘LCF approach’ is described. This allows algorithms based on binary decision diagrams (BDDs) to be programmed as derived proof rules in a calculus of representation judgements. The derivation of representation judgements becomes an LCF-style proof by defining an abstract type for judgements analogous to the LCF type of theorems. The primitive inference rules for representation judgements correspond to the operations provided by an efficient BDD package coded in C (BuDDy). Proof can combine traditional inference with steps inferring representation judgements. The resulting system provides a platform to support a tight and principled integration of theorem proving and model checking. The methods are illustrated by using them to solve all instances of a generalised Missionaries and Cannibals problem.
42

Ratkiewicz, Artur and Thanh N. Truong. "Automated mechanism generation: From symbolic calculation to complex chemistry". International Journal of Quantum Chemistry 106, no. 1 (2005): 244–55. http://dx.doi.org/10.1002/qua.20748.

Full text
43

Torresi, Sandra. "Interaction between domain-specific and domain-general abilities in math's competence". Journal of Applied Cognitive Neuroscience 1, no. 1 (December 7, 2020): 43–51. http://dx.doi.org/10.17981/jacn.1.1.2020.08.

Full text
Abstract
This article is an approach to some viewpoints about interactions between domain-specific and general cognitive tools involved in the development of mathematical competence. Many studies report positive correlations between the acuity of the numerical approximation system and formal mathematical performance, while another important group of investigations have found no evidence of a direct connection between non-symbolic and symbolic numerical representations. The challenge for future research will be to focus on correlations and possible causalities between non-symbolic and symbolic arithmetic skills and general domain cognitive skills in order to identify stable precursors of mathematical competence.
44

NAKAMURA, Hiroyuki, Masatake HIGASHI and Mamoru HOSAKA. "Robust Interference Calculation of Polyhedral Solids by Symbolic Calculation Using Face Names (1st Report)". Journal of the Japan Society for Precision Engineering 63, no. 4 (1997): 515–19. http://dx.doi.org/10.2493/jjspe.63.515.

Full text
45

NAKAMURA, Hiroyuki, Masatake HIGASHI and Mamoru HOSAKA. "Robust Interference Calculation of Polyhedral Solids by Symbolic Calculation Using Face Names (2nd Report)". Journal of the Japan Society for Precision Engineering 64, no. 1 (1998): 106–10. http://dx.doi.org/10.2493/jjspe.64.106.

Full text
46

Zhang, Tao, Hui Li, Wenxue Hong, Xiamei Yuan and Xinyu Wei. "Deep First Formal Concept Search". Scientific World Journal 2014 (2014): 1–13. http://dx.doi.org/10.1155/2014/275679.

Full text
Abstract
The calculation of formal concepts is a very important part in the theory of formal concept analysis (FCA); however, within the framework of FCA, computing all formal concepts is the main challenge because of its exponential complexity and difficulty in visualizing the calculating process. With the basic idea of Depth First Search, this paper presents a visualization algorithm by the attribute topology of formal context. Limited by the constraints and calculation rules, all concepts are achieved by the visualization global formal concepts searching, based on the topology degenerated with the fixed start and end points, without repetition and omission. This method makes the calculation of formal concepts precise and easy to operate and reflects the integrity of the algorithm, which enables it to be suitable for visualization analysis.
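Independently of the attribute-topology algorithm proposed here, the object being computed is easy to state: a formal concept is a pair (extent, intent) in which each part determines the other. The brute-force sketch below, over an invented three-object context, makes that target concrete; it works only for tiny contexts, which is precisely the exponential barrier the abstract mentions.

from itertools import combinations

context = {
    'g1': {'a', 'b'},
    'g2': {'b', 'c'},
    'g3': {'a', 'b', 'c'},
}
all_attrs = frozenset(set().union(*context.values()))

def extent(intent):
    # objects possessing every attribute of the intent
    return frozenset(g for g, attrs in context.items() if intent <= attrs)

# Every intent is an intersection of object intents
# (the empty intersection conventionally yields the full attribute set).
rows = [frozenset(v) for v in context.values()]
intents = {all_attrs}
for r in range(1, len(rows) + 1):
    for combo in combinations(rows, r):
        intents.add(frozenset.intersection(*combo))

for i in intents:
    print(sorted(extent(i)), sorted(i))

Dedicated algorithms (NextClosure, Close-by-One, or the depth-first topology search of this paper) exist precisely because this subset enumeration does not scale.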
47

Greenberg, Gabriel. "The Iconic-Symbolic Spectrum". Philosophical Review 132, no. 4 (October 1, 2023): 579–627. http://dx.doi.org/10.1215/00318108-10697558.

Full text
Abstract
It is common to distinguish two great families of representation. Symbolic representations include logical and mathematical symbols, words, and complex linguistic expressions. Iconic representations include dials, diagrams, maps, pictures, 3-dimensional models, and depictive gestures. This essay describes and motivates a new way of distinguishing iconic from symbolic representation. It locates the difference not in the signs themselves, nor in the contents they express, but in the semantic rules by which signs are associated with contents. The two kinds of rule have divergent forms, occupying opposite poles on a spectrum of naturalness. Symbolic rules are composed entirely of primitive juxtapositions of sign types with contents, while iconic rules determine contents entirely by uniform natural relations with sign types. This distinction is marked explicitly in the formal semantics of familiar sign systems, both for atomic first-order representations, like words, pixel colors, and dials, and for complex second-order representations, like sentences, diagrams, and pictures.
48

Zhi, Hui-lai. "On the Calculation of Formal Concept Stability". Journal of Applied Mathematics 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/917639.

Full text
Abstract
The idea of stability has been used in many applications. However, computing stability is still a challenge and the best algorithms known so far have algorithmic complexity quadratic to the size of the lattice. To improve the effectiveness, a critical term is introduced in this paper, that is, minimal generator, which serves as the minimal set that makes a concept stable when deleting some objects from the extent. Moreover, by irreducible elements, minimal generator is derived. Finally, based on inclusion-exclusion principle and minimal generator, formulas for the calculation of concept stability are proposed.
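The quantity whose computation the paper accelerates has a compact brute-force definition: the stability of a concept is the fraction of subsets of its extent whose derived intent is still the whole concept intent. The sketch below uses that standard definition on a made-up three-object context; it is not the paper's minimal-generator algorithm, and it shows why the naive computation is exponential in the size of the extent.

from itertools import chain, combinations

context = {
    'g1': {'a', 'b'},
    'g2': {'b', 'c'},
    'g3': {'a', 'b', 'c'},
}

def intent(objs):
    # attributes shared by every object in objs; the empty set maps to all attributes
    sets = [context[g] for g in objs]
    return set.intersection(*sets) if sets else set().union(*context.values())

def stability(extent, concept_intent):
    ext = list(extent)
    subsets = chain.from_iterable(combinations(ext, r) for r in range(len(ext) + 1))
    hits = sum(1 for s in subsets if intent(s) == concept_intent)
    return hits / 2 ** len(ext)

# stability of the concept ({g1, g3}, {a, b}) in the toy context above
print(stability({'g1', 'g3'}, {'a', 'b'}))   # 0.5

Minimal generators, as introduced in the paper, identify the smallest subsets that already yield the concept intent, so far fewer of the 2^|extent| subsets have to be examined.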
49

DeTar, DeLos F. "Calculation of formal steric enthalpy with MM2". Journal of Organic Chemistry 57, no. 3 (January 1992): 902–10. http://dx.doi.org/10.1021/jo00029a022.

Full text
50

Gonzalez, Cleotilde and Christian Lebiere. "Cognitive architectures combine formal and heuristic approaches". Behavioral and Brain Sciences 36, no. 3 (May 14, 2013): 285–86. http://dx.doi.org/10.1017/s0140525x12002956.

Full text
Abstract
Quantum probability (QP) theory provides an alternative account of empirical phenomena in decision making that classical probability (CP) theory cannot explain. Cognitive architectures combine probabilistic mechanisms with symbolic knowledge-based representations (e.g., heuristics) to address effects that motivate QP. They provide simple and natural explanations of these phenomena based on general cognitive processes such as memory retrieval, similarity-based partial matching, and associative learning.
