
Journal articles on the topic 'Dynamical memory'


Consult the top 50 journal articles for your research on the topic 'Dynamical memory.'


You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Ganguli, S., D. Huh, and H. Sompolinsky. "Memory traces in dynamical systems." Proceedings of the National Academy of Sciences 105, no. 48 (November 19, 2008): 18970–75. http://dx.doi.org/10.1073/pnas.0804451105.

2

Rehn, Martin, and Anders Lansner. "Sequence memory with dynamical synapses." Neurocomputing 58-60 (June 2004): 271–78. http://dx.doi.org/10.1016/j.neucom.2004.01.055.

3

Mitchell, Melanie. "Human Memory: A Dynamical Process." Contemporary Psychology 48, no. 3 (June 2003): 326–27. http://dx.doi.org/10.1037/000805.

4

Boffetta, G., R. Monasson, and R. Zecchina. "MEMORY RETRIEVAL IN OPTIMAL SUBSPACES." International Journal of Neural Systems 03, supp01 (January 1992): 71–77. http://dx.doi.org/10.1142/s0129065792000401.

Abstract:
A simple dynamical scheme for Attractor Neural Networks with non-monotonic three state effective neurons is discussed. For the unsupervised Hebb learning rule, we give some basic numerical results which are interpreted in terms of a combinatorial task realized by the dynamical process (dynamical selection of optimal subspaces). An analytical estimate of optimal performance is given by resorting to two different simplified versions of the model. We show that replica symmetry breaking is required since the replica symmetric solutions are unstable.
5

AICARDI, FRANCESCA, and SERGIO INVERNIZZI. "MEMORY EFFECTS IN DISCRETE DYNAMICAL SYSTEMS." International Journal of Bifurcation and Chaos 02, no. 04 (December 1992): 815–30. http://dx.doi.org/10.1142/s0218127492000458.

Abstract:
Let f_µ(s) = µs(1 − s) be the family of logistic maps with parameter µ, 1 ≤ µ ≤ 4. We present a study of the second-order difference equation x_{n+1} = f_µ((1 − ε)x_n + εx_{n−1}), 0 ≤ ε ≤ 1, which reduces to the well-known logistic equation when ε = 0.
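The second-order recurrence in this abstract is straightforward to explore numerically; below is a minimal sketch (the function and parameter values are illustrative, not taken from the paper):

```python
def logistic_with_memory(mu, eps, x0, x1, n_steps):
    """Iterate x_{n+1} = f_mu((1 - eps) * x_n + eps * x_{n-1}),
    where f_mu(s) = mu * s * (1 - s).

    With eps = 0 this reduces to the plain logistic map."""
    xs = [x0, x1]
    for _ in range(n_steps):
        s = (1 - eps) * xs[-1] + eps * xs[-2]  # convex mix of the last two states
        xs.append(mu * s * (1 - s))
    return xs

# Seed x1 = f_mu(x0) so that eps = 0 reproduces the ordinary logistic orbit.
orbit = logistic_with_memory(mu=3.2, eps=0.0, x0=0.4, x1=3.2 * 0.4 * 0.6, n_steps=100)
```

Sweeping ε between 0 and 1 for a fixed µ then shows how the memory term alters the familiar logistic dynamics.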
6

Klinshov, Vladimir V., and Vladimir I. Nekorkin. "Dynamical model of working memory system." Neuroscience Research 58 (January 2007): S44. http://dx.doi.org/10.1016/j.neures.2007.06.259.

7

Brianzoni, Serena, Cristiana Mammana, Elisabetta Michetti, and Francesco Zirilli. "A Stochastic Cobweb Dynamical Model." Discrete Dynamics in Nature and Society 2008 (2008): 1–18. http://dx.doi.org/10.1155/2008/219653.

Abstract:
We consider the dynamics of a stochastic cobweb model with linear demand and a backward-bending supply curve. In our model, both forward-looking and backward-looking expectations are assumed: the representative agent chooses the backward predictor with some probability and the forward predictor with the complementary probability, so that the expected price at each time is a random variable and, consequently, the price evolution is governed by a stochastic dynamical system. The dynamical system becomes a Markov process when the memory rate vanishes. In particular, we study the Markov chain in the cases of discrete and continuous time. Using a mixture of analytical tools and numerical methods, we show that, when prices take discrete values, the corresponding Markov chain is asymptotically stable. In the case with continuous prices and a not necessarily zero memory rate, numerical evidence of bounded price oscillations is shown. The role of the memory rate is studied through numerical experiments; this study confirms the stabilizing effect of the presence of resistant memory.
8

Oliveira, H. S., A. S. de Paula, and M. A. Savi. "Dynamical Jumps in a Shape Memory Alloy Oscillator." Shock and Vibration 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/656212.

Abstract:
The dynamical response of systems with shape memory alloy (SMA) elements presents a rich behavior due to their intrinsic nonlinear characteristics. SMAs' nonlinear response is associated with both adaptive dissipation related to hysteretic behavior and huge changes in properties caused by phase transformations. These characteristics are attracting much technological interest in several scientific and engineering fields, varying from medical to aerospace applications. An important characteristic associated with the dynamical response of SMA systems is the jump phenomenon. Dynamical jumps result in abrupt changes in system behavior, and their analysis is essential for a proper design of SMA systems. This paper discusses the nonlinear dynamics of a one-degree-of-freedom SMA oscillator presenting pseudoelastic behavior and dynamical jumps. Numerical simulations show different aspects of this kind of behavior, illustrating its importance for a proper understanding of the nonlinear dynamics of SMA systems.
9

Mohapatra, Anushaya, and William Ott. "Memory loss for nonequilibrium open dynamical systems." Discrete & Continuous Dynamical Systems - A 34, no. 9 (2014): 3747–59. http://dx.doi.org/10.3934/dcds.2014.34.3747.

10

Ott, William, Mikko Stenlund, and Lai-Sang Young. "Memory loss for time-dependent dynamical systems." Mathematical Research Letters 16, no. 3 (2009): 463–75. http://dx.doi.org/10.4310/mrl.2009.v16.n3.a7.

11

Giovannetti, Vittorio. "A dynamical model for quantum memory channels." Journal of Physics A: Mathematical and General 38, no. 50 (November 30, 2005): 10989–11005. http://dx.doi.org/10.1088/0305-4470/38/50/008.

12

Rudenko, E. A. "Optimal recurrent logical-dynamical finite memory filter." Journal of Computer and Systems Sciences International 56, no. 4 (July 2017): 607–15. http://dx.doi.org/10.1134/s106423071704013x.

13

Gallas, Jason A. C. "Simulating memory effects with discrete dynamical systems." Physica A: Statistical Mechanics and its Applications 195, no. 3-4 (May 1993): 417–30. http://dx.doi.org/10.1016/0378-4371(93)90167-3.

14

Wang, Xiaofeng, and Xiaohe Chen. "Derivative-Free Kurchatov-Type Accelerating Iterative Method for Solving Nonlinear Systems: Dynamics and Applications." Fractal and Fractional 6, no. 2 (January 24, 2022): 59. http://dx.doi.org/10.3390/fractalfract6020059.

Abstract:
Two novel Kurchatov-type first-order divided difference operators were designed and used to construct the variable parameter of three derivative-free iterative methods. The convergence orders of the new derivative-free methods are 3, (5 + √17)/2 ≈ 4.56, and 5. The new derivative-free iterative methods with memory were applied to solve nonlinear ordinary differential equations (ODEs) and partial differential equations (PDEs) in numerical experiments. The dynamical behavior of our new methods with memory was studied by using dynamical planes, which showed that our methods have good stability.
15

Campos, Beatriz, Alicia Cordero, Juan R. Torregrosa, and Pura Vindel. "Dynamical analysis of an iterative method with memory on a family of third-degree polynomials." AIMS Mathematics 7, no. 4 (2022): 6445–66. http://dx.doi.org/10.3934/math.2022359.

Abstract:
Qualitative analysis of iterative methods with memory has been carried out only in the last few years. Most of the papers published in this context analyze the behaviour of schemes on quadratic polynomials. In this paper, we carry out a complete dynamical study of an iterative method with memory, the Kurchatov scheme, applied to a family of cubic polynomials. To reach this goal we transform the iterative scheme with memory into a discrete dynamical system defined on R². We obtain a complete description of the dynamical planes for every value of the parameter of the family considered. We also analyze the bifurcations that occur related to the number of fixed points. Finally, the dynamical results are summarized in a parameter line. As a conclusion, we find that this scheme is completely stable for cubic polynomials, since the only attractors that appear for any value of the parameter are the roots of the polynomial.
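For reference, the Kurchatov scheme analyzed in this entry replaces the derivative with a symmetric first-order divided difference that reuses the previous iterate; a minimal scalar sketch (function and variable names, and the cubic test equation, are illustrative):

```python
def kurchatov(f, x_prev, x_curr, tol=1e-12, max_iter=50):
    """Kurchatov's derivative-free method with memory:
    x_{k+1} = x_k - f(x_k) / f[2*x_k - x_{k-1}, x_{k-1}],
    with the divided difference f[a, b] = (f(a) - f(b)) / (a - b)."""
    for _ in range(max_iter):
        a, b = 2 * x_curr - x_prev, x_prev
        slope = (f(a) - f(b)) / (a - b)      # secant slope over the memory pair
        x_prev, x_curr = x_curr, x_curr - f(x_curr) / slope
        if abs(x_curr - x_prev) < tol:       # stop once successive iterates agree
            break
    return x_curr

root = kurchatov(lambda x: x**3 - 2, 1.0, 1.5)  # converges to the real cube root of 2
```

Because the divided difference only needs function values at points built from the current and previous iterates, no derivative evaluations are required.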
16

Choubey, Neha, A. Cordero, J. P. Jaiswal, and J. R. Torregrosa. "Dynamical Techniques for Analyzing Iterative Schemes with Memory." Complexity 2018 (June 28, 2018): 1–13. http://dx.doi.org/10.1155/2018/1232341.

Abstract:
We construct a new biparametric three-point method with memory that greatly improves the computational efficiency of its original partner without adding functional evaluations. In this way, through different estimations of self-accelerating parameters, we have modified an existing seventh-order method. The parameters have been defined by a Hermite interpolating polynomial that enables the accelerating effect. In particular, the R-order of the proposed iterative method with memory is increased from seven to ten. A real multidimensional analysis of the stability of this method with memory is made in order to study its dependence on the initial estimations. Taking into account that iterative methods with memory are usually more stable than their derivative-free partners, and in view of the results obtained in this study, the behavior of this scheme proves to be excellent, although only on a small domain. Numerical examples and comparisons are also provided, confirming the theoretical results.
17

Zeng, Shaolong, Xuejin Wan, Yangfan Hu, Shijing Tan, and Biao Wang. "Discovery of Intrinsic Ferromagnetism Induced by Memory Effects in Low-Dimensional System." Fractal and Fractional 8, no. 11 (November 16, 2024): 668. http://dx.doi.org/10.3390/fractalfract8110668.

Abstract:
The impact of dynamic processes on equilibrium properties is a fundamental issue in condensed matter physics. This study investigates the intrinsic ferromagnetism generated by memory effects in the low-dimensional continuous symmetry Landau–Ginzburg model, demonstrating how memory effects can suppress fluctuations and stabilize long-range magnetic order. Our results provide compelling evidence that tuning dynamical processes can significantly alter the behavior of systems in equilibrium. We quantitatively evaluate how the emergence of the ferromagnetic phase depends on memory effects and confirm the presence of ferromagnetism through simulations of hysteresis loops, spontaneous magnetization, and magnetic domain structures in the 1D continuous symmetry Landau–Ginzburg model. This research offers both theoretical and numerical insights for identifying new phases of matter by dynamically modifying equilibrium properties.
18

Beri, A. C., and T. F. George. "Memory effects in dynamical many-body systems: The isomnesic (constant-memory) approximation." Zeitschrift für Physik B Condensed Matter 60, no. 1 (March 1985): 73–78. http://dx.doi.org/10.1007/bf01312645.

19

Aguiar, Ricardo AA, Marcelo A. Savi, and Pedro MCL Pacheco. "Experimental investigation of vibration reduction using shape memory alloys." Journal of Intelligent Material Systems and Structures 24, no. 2 (October 17, 2012): 247–61. http://dx.doi.org/10.1177/1045389x12461696.

Abstract:
Smart materials have a growing technological importance due to their unique thermomechanical characteristics. Shape memory alloys belong to this class of materials, being easy to manufacture, relatively lightweight, and able to produce high forces or displacements with low power consumption. These aspects could be exploited in different applications, including vibration control. Nevertheless, the literature presents only a few references concerning the experimental analysis of shape memory alloy dynamical systems. This contribution deals with the experimental analysis of shape memory alloy dynamical systems, considering an experimental apparatus consisting of low-friction cars free to move on a rail. A shaker provides the harmonic forcing that excites the system. The vibration analysis reveals that shape memory alloy elements introduce complex behaviors to the system, and different thermomechanical loadings are considered to show the main aspects of the shape memory alloy dynamical response. Special attention is dedicated to the analysis of the vibration reduction that can be achieved by different approaches, exploiting either temperature variations promoted by electric current changes or vibration absorber techniques. The results establish that adaptability due to temperature variations is defined by a competition between stiffness changes and hysteretic behavior changes.
20

AMRITKAR, R. E., and P. M. GADE. "CHARACTERIZING LOSS OF MEMORY IN CHAOTIC DYNAMICAL SYSTEMS." International Journal of Modern Physics B 05, no. 14 (August 20, 1991): 2323–45. http://dx.doi.org/10.1142/s0217979291000900.

21

Gade, P. M., and R. E. Amritkar. "Characterizing loss of memory in a dynamical system." Physical Review Letters 65, no. 4 (July 23, 1990): 389–92. http://dx.doi.org/10.1103/physrevlett.65.389.

22

Biercuk, Michael J., Hermann Uys, Aaron P. VanDevender, Nobuyasu Shiga, Wayne M. Itano, and John J. Bollinger. "Optimized dynamical decoupling in a model quantum memory." Nature 458, no. 7241 (April 2009): 996–1000. http://dx.doi.org/10.1038/nature07951.

23

Tanaka, Shu, and Seiji Miyashita. "Dynamical Properties of Temperature Chaos and Memory Effect." Progress of Theoretical Physics Supplement 157 (2005): 34–37. http://dx.doi.org/10.1143/ptps.157.34.

24

Albeverio, Sergio, Andrei Khrennikov, and Peter E. Kloeden. "Memory retrieval as a p-adic dynamical system." Biosystems 49, no. 2 (February 1999): 105–15. http://dx.doi.org/10.1016/s0303-2647(98)00035-5.

25

Federer, Callie, and Joel Zylberberg. "A self-organizing short-term dynamical memory network." Neural Networks 106 (October 2018): 30–41. http://dx.doi.org/10.1016/j.neunet.2018.06.008.

26

Khetarpal, Monika, Deepika Bhandari, and N. S. Saxena. "Memory effects and dynamical correlations in liquid selenium." Physica B: Condensed Matter 254, no. 1-2 (November 1998): 52–56. http://dx.doi.org/10.1016/s0921-4526(98)00413-x.

27

Gade, P. M., and R. E. Amritkar. "Loss of memory in a chaotic dynamical system." Physical Review A 45, no. 2 (January 1, 1992): 725–33. http://dx.doi.org/10.1103/physreva.45.725.

28

Otsuka, Kenju, and Jyh-Long Chern. "Dynamical spatial-pattern memory in globally coupled lasers." Physical Review A 45, no. 11 (June 1, 1992): 8288–91. http://dx.doi.org/10.1103/physreva.45.8288.

29

Rouson, Damian W. I., Karla Morris, and Xiaofeng Xu. "Dynamic Memory De-allocation in Fortran 95/2003 Derived Type Calculus." Scientific Programming 13, no. 3 (2005): 189–203. http://dx.doi.org/10.1155/2005/702048.

Abstract:
Abstract data types developed for computational science and engineering are frequently modeled after physical objects whose state variables must satisfy governing differential equations. Generalizing the associated algebraic and differential operators to operate on the abstract data types facilitates high-level program constructs that mimic standard mathematical notation. For non-trivial expressions, multiple object instantiations must occur to hold intermediate results during the expression's evaluation. When the dimension of each object's state space is not specified at compile-time, the programmer becomes responsible for dynamically allocating and de-allocating memory for each instantiation. With the advent of allocatable components in Fortran 2003 derived types, the potential exists for these intermediate results to occupy a substantial fraction of a program's footprint in memory. This issue becomes particularly acute at the highest levels of abstraction where coarse-grained data structures predominate. This paper proposes a set of rules for de-allocating memory that has been dynamically allocated for intermediate results in derived type calculus, while distinguishing that memory from more persistent objects. The new rules are applied to the design of a polymorphic time integrator for integrating evolution equations governing dynamical systems. Associated issues of efficiency and design robustness are discussed.
30

Horn, D., and M. Usher. "EXCITATORY–INHIBITORY NETWORKS WITH DYNAMICAL THRESHOLDS." International Journal of Neural Systems 01, no. 03 (January 1990): 249–57. http://dx.doi.org/10.1142/s0129065790000151.

Abstract:
We investigate feedback networks containing excitatory and inhibitory neurons. The couplings between the neurons follow a Hebbian rule in which the memory patterns are encoded as cell assemblies of the excitatory neurons. Using disjoint patterns, we study the attractors of this model and point out the importance of mixed states. The latter become dominant at temperatures above 0.25. We use both numerical simulations and an analytic approach for our investigation. The latter is based on differential equations for the activity of the different memory patterns in the network configuration. Allowing the excitatory thresholds to develop dynamic features which correspond to fatigue of individual neurons, we obtain motion in pattern space, the space of all memories. The attractors turn into transients leading to chaotic motion for appropriate values of the dynamical parameters. The motion can be guided by overlaps between patterns, resembling a process of free associative thinking in the absence of any input.
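The Hebbian storage of memory patterns as cell assemblies described here can be illustrated with a generic Hopfield-style recall loop (a deliberately simplified sketch with binary units, not the paper's excitatory-inhibitory model with dynamic thresholds):

```python
def hebbian_weights(patterns):
    """Outer-product (Hebb) rule, self-couplings set to zero."""
    n = len(patterns[0])
    return [[0.0 if i == j else sum(p[i] * p[j] for p in patterns) / n
             for j in range(n)] for i in range(n)]

def recall(W, state, steps=5):
    """Synchronous attractor dynamics: s_i <- sign(sum_j W_ij * s_j)."""
    for _ in range(steps):
        state = [1 if sum(w * s for w, s in zip(row, state)) >= 0 else -1
                 for row in W]
    return state

# Two orthogonal +-1 memory patterns (toy size for illustration)
p0 = [1, 1, 1, 1, -1, -1, -1, -1]
p1 = [1, 1, -1, -1, 1, 1, -1, -1]
W = hebbian_weights([p0, p1])

noisy = [-1] + p0[1:]        # corrupt the first unit of p0
restored = recall(W, noisy)  # relaxes back to the stored pattern p0
```

In this static sketch the stored patterns are fixed points; the cited model additionally lets the excitatory thresholds fatigue, which turns such attractors into transients and produces motion through pattern space.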
31

Eiser, J. Richard. "The dynamical hypothesis in social cognition." Behavioral and Brain Sciences 21, no. 5 (October 1998): 638. http://dx.doi.org/10.1017/s0140525x98331738.

Abstract:
Research in attitudes and social cognition exemplifies van Gelder's distinction between the computational and dynamical approaches. The former emphasizes linear measurement and rational decision-making. The latter considers processes of associative memory and self-organization in attitude formation and social influence. The study of dynamical processes in social cognition has been facilitated by connectionist approaches to computation.
32

Dannenberg, Holger, Andrew S. Alexander, Jennifer C. Robinson, and Michael E. Hasselmo. "The Role of Hierarchical Dynamical Functions in Coding for Episodic Memory and Cognition." Journal of Cognitive Neuroscience 31, no. 9 (September 2019): 1271–89. http://dx.doi.org/10.1162/jocn_a_01439.

Abstract:
Behavioral research in human verbal memory function led to the initial definition of episodic memory and semantic memory. A complete model of the neural mechanisms of episodic memory must include the capacity to encode and mentally reconstruct everything that humans can recall from their experience. This article proposes new model features necessary to address the complexity of episodic memory encoding and recall in the context of broader cognition and the functional properties of neurons that could contribute to this broader scope of memory. Many episodic memory models represent individual snapshots of the world with a sequence of vectors, but a full model must represent complex functions encoding and retrieving the relations between multiple stimulus features across space and time on multiple hierarchical scales. Episodic memory involves not only the space and time of an agent experiencing events within an episode but also features shown in neurophysiological data such as coding of speed, direction, boundaries, and objects. Episodic memory includes not only a spatio-temporal trajectory of a single agent but also segments of spatio-temporal trajectories for other agents and objects encountered in the environment consistent with data on encoding the position and angle of sensory features of objects and boundaries. We will discuss potential interactions of episodic memory circuits in the hippocampus and entorhinal cortex with distributed neocortical circuits that must represent all features of human cognition.
33

TAN, Z., and M. K. ALI. "PATTERN RECOGNITION WITH STOCHASTIC RESONANCE IN A GENERIC NEURAL NETWORK." International Journal of Modern Physics C 11, no. 08 (December 2000): 1585–93. http://dx.doi.org/10.1142/s0129183100001413.

Abstract:
We discuss stochastic resonance in associative memory with a canonical neural network model that describes the generic behavior of a large family of dynamical systems near bifurcation. Our result shows that stochastic resonance helps memory association. The relationship between stochastic resonance, associative memory, storage load, history of memory and initial states are studied. In intelligent systems like neural networks, it is likely that stochastic resonance combined with synaptic information enhances memory recalls.
34

GOLES, ERIC, and MARTÍN MATAMALA. "DYNAMICAL AND COMPLEXITY RESULTS FOR HIGH ORDER NEURAL NETWORKS." International Journal of Neural Systems 05, no. 03 (September 1994): 241–52. http://dx.doi.org/10.1142/s0129065794000256.

Abstract:
We present dynamical results concerning neural networks with high order arguments. More precisely, we study the family of block-sequential iterations of neural networks with polynomial arguments. In this context, we prove that, under a symmetry hypothesis, the sequential iteration is the only one of this family to converge to fixed points. The other iteration modes present a highly complex dynamical behavior: non-bounded cycles and simulation of arbitrary non-symmetric linear neural networks. We also study a high order memory iteration scheme which admits an energy functional and cycles bounded by the size of the memory steps.
35

Otsuka, Kenju, and Jyh-Long Chern. "Erratum: Dynamical spatial-pattern memory in globally coupled lasers." Physical Review A 47, no. 4 (April 1, 1993): 3456. http://dx.doi.org/10.1103/physreva.47.3456.

36

Deco, Gustavo, and Edmund T. Rolls. "Sequential Memory: A Putative Neural and Synaptic Dynamical Mechanism." Journal of Cognitive Neuroscience 17, no. 2 (February 2005): 294–307. http://dx.doi.org/10.1162/0898929053124875.

Abstract:
A key issue in the neurophysiology of cognition is the problem of sequential learning. Sequential learning refers to the ability to encode and represent the temporal order of discrete elements occurring in a sequence. We show that the short-term memory for a sequence of items can be implemented in an autoassociation neural network. Each item is one of the attractor states of the network. The autoassociation network is implemented at the level of integrate-and-fire neurons so that the contributions of different biophysical mechanisms to sequence learning can be investigated. It is shown that if the synapses or neurons that support each attractor state adapt, then every time the network is made quiescent (e.g., by inhibition), the attractor state that emerges next is the next item in the sequence. We show with numerical simulations implementations of the mechanisms using (1) a sodium inactivation-based spike-frequency-adaptation mechanism, (2) a Ca2+-activated K+ current, and (3) short-term synaptic depression, with sequences of up to three items. The network does not need repeated training on a particular sequence and will repeat the items in the order that they were last presented. The time between the items in a sequence is not fixed, allowing the items to be read out as required over a period of up to many seconds. The network thus uses adaptation rather than associative synaptic modification to recall the order of the items in a recently presented sequence.
37

Liu, Xiao-Shu, and Yang Liu. "Dynamical Suppression of Decoherence in Two-Qubit Quantum Memory." Communications in Theoretical Physics 44, no. 3 (September 2005): 459–62. http://dx.doi.org/10.1088/6102/44/3/459.

38

Rivier, L., R. Loft, and L. M. Polvani. "An Efficient Spectral Dynamical Core for Distributed Memory Computers." Monthly Weather Review 130, no. 5 (May 2002): 1384–96. http://dx.doi.org/10.1175/1520-0493(2002)130<1384:aesdcf>2.0.co;2.

39

Leman, Marc. "Dynamical‐hierarchical networks as perceptual memory representations of music." Interface 14, no. 3-4 (January 1985): 125–64. http://dx.doi.org/10.1080/09298218508570465.

40

Tiňo, Peter, and Ali Rodan. "Short term memory in input-driven linear dynamical systems." Neurocomputing 112 (July 2013): 58–63. http://dx.doi.org/10.1016/j.neucom.2012.12.041.

41

Campos, Beatriz, Alicia Cordero, Juan R. Torregrosa, and Pura Vindel. "A multidimensional dynamical approach to iterative methods with memory." Applied Mathematics and Computation 271 (November 2015): 701–15. http://dx.doi.org/10.1016/j.amc.2015.09.056.

42

Traversa, Fabio L., Massimiliano Di Ventra, Federica Cappelluti, and Fabrizio Bonani. "Application of Floquet theory to dynamical systems with memory." Chaos: An Interdisciplinary Journal of Nonlinear Science 30, no. 12 (December 2020): 123102. http://dx.doi.org/10.1063/5.0016058.

43

Enemark, Soren, Marcelo A. Savi, and Ilmar F. Santos. "Experimental analyses of dynamical systems involving shape memory alloys." Smart Structures and Systems 15, no. 6 (June 25, 2015): 1521–42. http://dx.doi.org/10.12989/sss.2015.15.6.1521.

44

Moustafa, Mahmoud, Suad Mawloud Zali, and Sharidan Shafie. "Dynamical Analysis of Eco-Epidemiological Model with Fading Memory." Journal of Applied Nonlinear Dynamics 13, no. 2 (June 2024): 191–202. http://dx.doi.org/10.5890/jand.2024.06.001.

45

Jacquet, Maud, and Robert M. French. "The BIA++: Extending the BIA+ to a dynamical distributed connectionist framework." Bilingualism: Language and Cognition 5, no. 3 (December 2002): 202–5. http://dx.doi.org/10.1017/s1366728902223019.

Abstract:
Dijkstra and van Heuven have made an admirable attempt to develop a new model of bilingual memory, the BIA+. Their article presents a clear and well-reasoned theoretical justification of their model, followed by a description of the model. The BIA+ is, as the name implies, an extension of the Bilingual Interactive Activation (BIA) model (Dijkstra and van Heuven, 1998; Van Heuven, Dijkstra and Grainger, 1998; etc.), which was itself an adaptation to bilingual memory of McClelland and Rumelhart's (1981) Interactive Activation model of monolingual memory.
46

Ahmed, Elsayed, and Ahmed Sadek Hegazi. "On discrete dynamical systems associated to games." Advances in Complex Systems 02, no. 04 (December 1999): 423–30. http://dx.doi.org/10.1142/s0219525999000217.

Abstract:
Dynamical systems corresponding to continuous strategy games are introduced. The steady states of the systems corresponding to the continuous prisoner's dilemma, hawk-dove and rock-scissors-paper games are found and their stability is discussed. A discrete imitation equation is derived. Smale's payoff memory game formulation is generalized.
47

Park, Sungju, Kyungmin Kim, Keunjin Kim, and Choonsung Nam. "Dynamical Pseudo-Random Number Generator Using Reinforcement Learning." Applied Sciences 12, no. 7 (March 26, 2022): 3377. http://dx.doi.org/10.3390/app12073377.

Abstract:
Pseudo-random number generators (PRNGs) are based on algorithms that generate a sequence of numbers arranged randomly. Recently, random numbers have been generated through a reinforcement learning mechanism. This method generates random numbers based on reinforcement learning characteristics that select the optimal behavior, considering every possible state up to the end of the episode, to secure the randomness of the random numbers. The LSTM method is used for the long-term memory of previous patterns and the selection of new patterns in consideration of those previous patterns. In addition, feature vectors extracted from the LSTM are accumulated, and images are generated from them to overcome the limitation of LSTM long-term memory. From these generated images, features are extracted using a CNN. This dynamical pseudo-random number generator secures the randomness of random numbers.
48

Lattanzi, G., G. Nardulli, and S. Stramaglia. "A Neural Network with Permanent and Volatile Memory." Modern Physics Letters B 11, no. 24 (October 20, 1997): 1037–45. http://dx.doi.org/10.1142/s0217984997001250.

Abstract:
We discuss an attractor neural network where both neurons and synapses obey dynamical equations; it models the coexistence of permanent and short-term memories. By signal-noise analysis and a discussion in terms of an energy-like function, we outline the behavior of the model.
49

Taranto, Philip, Thomas J. Elliott, and Simon Milz. "Hidden Quantum Memory: Is Memory There When Somebody Looks?" Quantum 7 (April 27, 2023): 991. http://dx.doi.org/10.22331/q-2023-04-27-991.

Abstract:
In classical physics, memoryless dynamics and Markovian statistics are one and the same. This is not true for quantum dynamics, first and foremost because quantum measurements are invasive. Going beyond measurement invasiveness, here we derive a novel distinction between classical and quantum processes, namely the possibility of hidden quantum memory. While Markovian statistics of classical processes can always be reproduced by a memoryless dynamical model, our main result shows that this is not true in quantum mechanics: We first provide an example of quantum non-Markovianity whose manifestation depends on whether or not a previous measurement is performed – an impossible phenomenon for memoryless dynamics; we then strengthen this result by demonstrating statistics that are Markovian independent of how they are probed, but are nonetheless still incompatible with memoryless quantum dynamics. Thus, we establish the existence of Markovian statistics gathered by probing a quantum process that nevertheless fundamentally require memory for their creation.
50

Zenyuk, Dmitry Alexeevich. "Discrete dynamical systems with random delays." Keldysh Institute Preprints, no. 70 (2024): 1–35. http://dx.doi.org/10.20948/prepr-2024-70.

Abstract:
The paper contains a detailed survey of results on difference equations with random delays. Processes defined by them are history-dependent and can be treated as an effective modeling tool in studies of systems with strong memory, e.g. in epidemiology or population dynamics. All discussed models have a simple linear form, yet many of their characteristics remain unknown. Random walks with similar history-dependent increment probabilities, for which a much more detailed description is readily available, are also briefly reviewed.