A selection of scientific literature on the topic "Dynamical memory"

Format your source according to APA, MLA, Chicago, Harvard, and other citation styles

Choose a source type:

Browse lists of current articles, books, dissertations, conference papers, and other scholarly sources on the topic "Dynamical memory".

Next to each work in the list of references you will find an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a publication as a .pdf file and read its abstract online, provided these are available in the metadata.

Journal articles on the topic "Dynamical memory"

1

Ganguli, S., D. Huh, and H. Sompolinsky. "Memory traces in dynamical systems." Proceedings of the National Academy of Sciences 105, no. 48 (November 19, 2008): 18970–75. http://dx.doi.org/10.1073/pnas.0804451105.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Rehn, Martin, and Anders Lansner. "Sequence memory with dynamical synapses." Neurocomputing 58-60 (June 2004): 271–78. http://dx.doi.org/10.1016/j.neucom.2004.01.055.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Mitchell, Melanie. "Human Memory: A Dynamical Process." Contemporary Psychology 48, no. 3 (June 2003): 326–27. http://dx.doi.org/10.1037/000805.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Boffetta, G., R. Monasson, and R. Zecchina. "MEMORY RETRIEVAL IN OPTIMAL SUBSPACES." International Journal of Neural Systems 03, supp01 (January 1992): 71–77. http://dx.doi.org/10.1142/s0129065792000401.

Full text source
Abstract:
A simple dynamical scheme for Attractor Neural Networks with non-monotonic three state effective neurons is discussed. For the unsupervised Hebb learning rule, we give some basic numerical results which are interpreted in terms of a combinatorial task realized by the dynamical process (dynamical selection of optimal subspaces). An analytical estimate of optimal performance is given by resorting to two different simplified versions of the model. We show that replica symmetry breaking is required since the replica symmetric solutions are unstable.
Styles: APA, Harvard, Vancouver, ISO, etc.
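
The abstract above describes cue-driven retrieval in an attractor neural network trained with the unsupervised Hebb rule. A minimal, generic Hopfield-style sketch of that retrieval mechanism (using standard binary units rather than the paper's non-monotonic three-state neurons; network size, pattern count, and noise level are arbitrary assumptions) is:

    #include <cstdio>
    #include <cstdlib>
    #include <vector>

    int main() {
        const int N = 100, P = 5;              // neurons and stored patterns (illustrative sizes)
        std::srand(1);
        std::vector<std::vector<int>> xi(P, std::vector<int>(N));
        for (auto& pat : xi)
            for (auto& s : pat) s = (std::rand() % 2) ? 1 : -1;

        // Unsupervised Hebb rule: J_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, no self-coupling
        std::vector<std::vector<double>> J(N, std::vector<double>(N, 0.0));
        for (int i = 0; i < N; ++i)
            for (int j = 0; j < N; ++j)
                if (i != j)
                    for (int mu = 0; mu < P; ++mu)
                        J[i][j] += xi[mu][i] * xi[mu][j] / double(N);

        // Cue: stored pattern 0 with roughly 10% of its units flipped
        std::vector<int> s = xi[0];
        for (int k = 0; k < N / 10; ++k) s[std::rand() % N] *= -1;

        // Deterministic sign-threshold updates (a fixed number of sweeps for simplicity)
        for (int sweep = 0; sweep < 20; ++sweep)
            for (int i = 0; i < N; ++i) {
                double h = 0.0;
                for (int j = 0; j < N; ++j) h += J[i][j] * s[j];
                s[i] = (h >= 0.0) ? 1 : -1;
            }

        int overlap = 0;
        for (int i = 0; i < N; ++i) overlap += s[i] * xi[0][i];
        std::printf("overlap with the cued pattern: %.2f\n", overlap / double(N));
        return 0;
    }
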
5

AICARDI, FRANCESCA, and SERGIO INVERNIZZI. "MEMORY EFFECTS IN DISCRETE DYNAMICAL SYSTEMS." International Journal of Bifurcation and Chaos 02, no. 04 (December 1992): 815–30. http://dx.doi.org/10.1142/s0218127492000458.

Full text source
Abstract:
Let f_µ(s) = µs(1 − s) be the family of logistic maps with parameter µ, 1 ≤ µ ≤ 4. We present a study of the second-order difference equation x_{n+1} = f_µ([1 − ε]x_n + εx_{n−1}), 0 ≤ ε ≤ 1, which reduces to the well-known logistic equation as ε = 0.
Styles: APA, Harvard, Vancouver, ISO, etc.
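
The difference equation quoted in this abstract can be iterated directly. The following minimal sketch does so for one arbitrary choice of µ, ε, and initial conditions (these values are assumptions for illustration, not taken from the paper):

    #include <cstdio>

    int main() {
        // x_{n+1} = f_mu((1 - eps)*x_n + eps*x_{n-1}), with f_mu(s) = mu*s*(1 - s).
        // For eps = 0 this reduces to the ordinary logistic map.
        const double mu = 3.7, eps = 0.3;      // assumed values, 1 <= mu <= 4, 0 <= eps <= 1
        double x_prev = 0.2, x_curr = 0.4;     // arbitrary initial pair (x_0, x_1)
        for (int n = 1; n <= 50; ++n) {
            double s = (1.0 - eps) * x_curr + eps * x_prev;   // memory-weighted argument
            double x_next = mu * s * (1.0 - s);               // apply the logistic map
            x_prev = x_curr;
            x_curr = x_next;
            std::printf("n = %2d   x = %.6f\n", n + 1, x_curr);
        }
        return 0;
    }
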
6

Klinshov, Vladimir V., and Vladimir I. Nekorkin. "Dynamical model of working memory system." Neuroscience Research 58 (January 2007): S44. http://dx.doi.org/10.1016/j.neures.2007.06.259.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Brianzoni, Serena, Cristiana Mammana, Elisabetta Michetti, and Francesco Zirilli. "A Stochastic Cobweb Dynamical Model." Discrete Dynamics in Nature and Society 2008 (2008): 1–18. http://dx.doi.org/10.1155/2008/219653.

Full text source
Abstract:
We consider the dynamics of a stochastic cobweb model with linear demand and a backward-bending supply curve. In our model, forward-looking expectations and backward-looking ones are assumed, in fact we assume that the representative agent chooses the backward predictor with probability , and the forward predictor with probability , so that the expected price at time is a random variable and consequently the dynamics describing the price evolution in time is governed by a stochastic dynamical system. The dynamical system becomes a Markov process when the memory rate vanishes. In particular, we study the Markov chain in the cases of discrete and continuous time. Using a mixture of analytical tools and numerical methods, we show that, when prices take discrete values, the corresponding Markov chain is asymptotically stable. In the case with continuous prices and nonnecessarily zero memory rate, numerical evidence of bounded price oscillations is shown. The role of the memory rate is studied through numerical experiments, this study confirms the stabilizing effects of the presence of resistant memory.
Styles: APA, Harvard, Vancouver, ISO, etc.
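
As a loose illustration of the predictor-mixing idea described above, the toy simulation below draws, each period, either a backward (memory-weighted) predictor or a forward (steady-state) predictor for the expected price and then clears a linear market. The demand and supply coefficients, the choice probability q, the memory rate, and the use of a linear (rather than backward-bending) supply curve are all simplifying assumptions, so this is not the paper's model:

    #include <cstdio>
    #include <random>

    int main() {
        // Linear demand a - b*p against supply c + d*pe, where pe is the expected price.
        // Each period the agent uses the backward (memory-weighted) predictor with
        // probability q and a forward predictor (here simply the steady state) otherwise.
        const double a = 10.0, b = 1.0, c = 1.0, d = 0.8;  // assumed coefficients
        const double q = 0.6;                              // prob. of the backward predictor (assumed)
        const double rho = 0.5;                            // memory rate of the backward predictor (assumed)
        const double p_star = (a - c) / (b + d);           // steady-state price of this toy model

        std::mt19937 rng(42);
        std::bernoulli_distribution pick_backward(q);

        double p = 2.0, pe = 2.0;                          // arbitrary initial price and expectation
        for (int t = 0; t < 40; ++t) {
            pe = pick_backward(rng) ? rho * pe + (1.0 - rho) * p   // backward-looking expectation
                                    : p_star;                      // forward-looking expectation
            p = (a - c - d * pe) / b;                              // market-clearing price
            std::printf("t = %2d   price = %.4f\n", t, p);
        }
        return 0;
    }
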
8

Oliveira, H. S., A. S. de Paula, and M. A. Savi. "Dynamical Jumps in a Shape Memory Alloy Oscillator." Shock and Vibration 2014 (2014): 1–10. http://dx.doi.org/10.1155/2014/656212.

Full text source
Abstract:
The dynamical response of systems with shape memory alloy (SMA) elements presents a rich behavior due to their intrinsic nonlinear characteristic. SMA’s nonlinear response is associated with both adaptive dissipation related to hysteretic behavior and huge changes in properties caused by phase transformations. These characteristics are attracting much technological interest in several scientific and engineering fields, varying from medical to aerospace applications. An important characteristic associated with dynamical response of SMA system is the jump phenomenon. Dynamical jumps result in abrupt changes in system behavior and its analysis is essential for a proper design of SMA systems. This paper discusses the nonlinear dynamics of a one degree of freedom SMA oscillator presenting pseudoelastic behavior and dynamical jumps. Numerical simulations show different aspects of this kind of behavior, illustrating its importance for a proper understanding of nonlinear dynamics of SMA systems.
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Mohapatra, Anushaya, and William Ott. "Memory loss for nonequilibrium open dynamical systems." Discrete & Continuous Dynamical Systems - A 34, no. 9 (2014): 3747–59. http://dx.doi.org/10.3934/dcds.2014.34.3747.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Ott, William, Mikko Stenlund, and Lai-Sang Young. "Memory loss for time-dependent dynamical systems." Mathematical Research Letters 16, no. 3 (2009): 463–75. http://dx.doi.org/10.4310/mrl.2009.v16.n3.a7.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.

Dissertations on the topic "Dynamical memory"

1

Liu, Yuxi. "Dynamical Activity Patterns of High-frequency Oscillations and Their Functional Roles in Neural Circuits." Thesis, University of Sydney, 2020. https://hdl.handle.net/2123/23236.

Full text source
Abstract:
Oscillations are widely observed in brain activities, but their spatiotemporal organisation properties and functional roles remain elusive. In this thesis, we first investigate gamma oscillations (30 - 80 Hz) by combined empirical and modelling studies. Array recordings of local field potentials in visual motion-processing cortical area MT of marmoset monkeys reveal that gamma bursts form localised patterns with superdiffusive propagation dynamics. To understand how these gamma burst patterns emerge, we investigate a spatially extended, biophysically realistic circuit model that quantitatively captures the superdiffusive propagation of gamma burst patterns as found in the recordings. We further demonstrate that such gamma burst dynamics arise from the intrinsic non-equilibrium nature of transitions between asynchronous and propagating wave states. These results thus reveal novel a spatiotemporal organisation property of gamma bursts and their underlying mechanisms. We further illustrate that such non-equilibrium transitions in the spatially extended circuit model mechanistically account for a range of dynamical properties of sharp-wave ripples (100-250 Hz) and that sharp-wave ripples with superdiffusive dynamics underlie efficient memory retrievals. Furthermore, by incorporating short-term synaptic plasticity into the spatially extended circuit model, we find that in the dynamical regime near the transition of circuit states, large endogenous fluctuations emerging from the circuit can quantitively reproduce and explain the variability of working memory across items and trials as found experimental studies. In addition, our circuit model reproduces and explains several other key neural and behavioural features of working memory, such as the capacity limit of working memory and gamma bursts that are coupled to theta oscillations. These results establish a dynamical circuit mechanism of working memory and provide novel experimentally testable predictions.
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Kropff, Emilio. "Statistical and dynamical properties of large cortical network models: insights into semantic memory and language." Doctoral thesis, SISSA, 2007. http://hdl.handle.net/20.500.11767/4639.

Full text source
Abstract:
This thesis introduces several variants to the classical autoassociative memory model in order to capture different characteristics of large cortical networks, using semantic memory as a paradigmatic example in which to apply the results. Chapter 2 is devoted to the development of the sparse Potts model network as a simplification of a multi modular memory performing computations both at the local and the global level. If a network storing p global patterns has N local modules, each one active in S possible ways with a global sparseness a, and if each module is connected to c_M other modules, the storage capacity scales like α_c ≡ p_max/c_M ∝ S²/a with logarithmic corrections. Chapter 3 further introduces adaptation and correlations among patterns, as a result of which a latching dynamics appears, consistent in the spontaneous hopping between global attractor states after an initial cue-guided retrieval, somehow similar to a free association process. The complexity of the latching series depends on the equilibrium between self-excitation of the local networks and global inhibition represented by the parameter U. Finally, Chapter 4 develops a consistent way to store and retrieve correlated patterns, which works as long as any statistical dependence between units can be neglected. The popularity of units must be introduced into the learning rule, as a result of which a new property of associative memories appears: the robustness of a memory is inverse to the information it conveys. As in some accounts of semantic memory deficits, random damage results in selective impairments, associated to the entropy measure S_f of each memory, since the minimum connectivity required to sustain its retrieval is, in optimal conditions, c_M ∝ pS_f, and still proportional to pS_f but possibly with a larger coefficient in the general case. Present in the entire thesis, but specially in this last Chapter, the conjecture stating that autoassociative memories are limited in the amount of information stored per synapse results consistent with the results.
Styles: APA, Harvard, Vancouver, ISO, etc.
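
A quick numerical reading of the storage-capacity scaling quoted above, α_c ≡ p_max/c_M ∝ S²/a: the sketch below evaluates p_max ≈ k · c_M · S²/a for a few assumed values, with an unknown order-one prefactor k set to 1 and logarithmic corrections ignored (all numbers are illustrative, not from the thesis):

    #include <cstdio>

    int main() {
        // p_max ~ k * c_M * S^2 / a, with an unknown order-one prefactor k (set to 1 here)
        // and logarithmic corrections ignored. All numbers are illustrative.
        const double k = 1.0;
        const int    S = 7;                    // local states per module (assumed)
        const double a = 0.25;                 // global sparseness (assumed)
        const int    cM_values[] = {50, 100, 200};
        for (int cM : cM_values) {
            double p_max = k * cM * S * S / a;
            std::printf("c_M = %3d   ->   p_max ~ %.0f global patterns\n", cM, p_max);
        }
        return 0;
    }
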
3

Rehn, Martin. "Aspects of memory and representation in cortical computation." Doctoral thesis, KTH, Numerisk Analys och Datalogi, NADA, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4161.

Full text source
Abstract:
In this thesis I take a modular approach to cortical function. I investigate how the cerebral cortex may realise a number of basic computational tasks, within the framework of its generic architecture. I present novel mechanisms for certain assumed computational capabilities of the cerebral cortex, building on the established notions of attractor memory and sparse coding. A sparse binary coding network for generating efficient representations of sensory input is presented. It is demonstrated that this network model well reproduces the simple cell receptive field shapes seen in the primary visual cortex and that its representations are efficient with respect to storage in associative memory. I show how an autoassociative memory, augmented with dynamical synapses, can function as a general sequence learning network. I demonstrate how an abstract attractor memory system may be realised on the microcircuit level -- and how it may be analysed using tools similar to those used experimentally. I outline some predictions from the hypothesis that the macroscopic connectivity of the cortex is optimised for attractor memory function. I also discuss methodological aspects of modelling in computational neuroscience.
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Bhalala, Smita Ashesh 1966. "Modified Newton's method for supervised training of dynamical neural networks for applications in associative memory and nonlinear identification problems." Thesis, The University of Arizona, 1991. http://hdl.handle.net/10150/277969.

Full text source
Abstract:
There have been several innovative approaches towards realizing an intelligent architecture that utilizes artificial neural networks for applications in information processing. The development of supervised training rules for updating the adjustable parameters of neural networks has received extensive attention in the recent past. In this study, specific learning algorithms utilizing modified Newton's method for the optimization of the adjustable parameters of a dynamical neural network are developed. Computer simulation results show that the convergence performance of the proposed learning schemes match very closely that of the LMS learning algorithm for applications in the design of associative memories and nonlinear mapping problems. However, the implementation of the modified Newton's method is complex due to the computation of the slope of the nonlinear sigmoidal function, whereas, the LMS algorithm approximates the slope to be zero.
Styles: APA, Harvard, Vancouver, ISO, etc.
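
To make the LMS-versus-Newton comparison in this abstract concrete, the sketch below fits a single sigmoidal neuron to one scalar training pair with (i) an LMS-style gradient step and (ii) a one-dimensional Newton step that uses the second derivative of the squared error. It is a generic textbook illustration under assumed data and learning-rate values, not the dynamical-network algorithm developed in the thesis:

    #include <cmath>
    #include <cstdio>

    double sigma(double u) { return 1.0 / (1.0 + std::exp(-u)); }

    int main() {
        // Fit a single sigmoidal neuron y ~= sigma(w*x) to one training pair (x, y),
        // comparing an LMS-style gradient step with a 1-D Newton step on the
        // squared error E(w) = 0.5 * (sigma(w*x) - y)^2.
        const double x = 1.5, y = 0.8;        // assumed training pair
        const double eta = 0.5;               // assumed LMS learning rate

        double w_lms = 0.0, w_newton = 0.0;   // both start from the same weight
        for (int it = 1; it <= 10; ++it) {
            // LMS-style step: move against the first derivative of the error
            double s = sigma(w_lms * x);
            double g = (s - y) * s * (1.0 - s) * x;            // dE/dw
            w_lms -= eta * g;

            // Newton step: divide the first derivative by the second derivative
            double sn  = sigma(w_newton * x);
            double sp  = sn * (1.0 - sn);                      // sigma'(u)
            double spp = sp * (1.0 - 2.0 * sn);                // sigma''(u)
            double gn  = (sn - y) * sp * x;                    // dE/dw
            double h   = (sp * sp + (sn - y) * spp) * x * x;   // d2E/dw2
            if (std::fabs(h) > 1e-12) w_newton -= gn / h;

            std::printf("iter %2d   w_lms = %.5f   w_newton = %.5f\n", it, w_lms, w_newton);
        }
        return 0;
    }
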
5

Bauer, Michael. "Dynamical characterization of Markov processes with varying order." Master's thesis, [S.l. : s.n.], 2009. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200900153.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Abbs, Brandon Robert. "The temporal dynamics of auditory memory for static and dynamic sounds." Diss., University of Iowa, 2008. http://ir.uiowa.edu/etd/4.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Williams, Peter. "Dynamic memory for design." Thesis, The University of Sydney, 1995. https://hdl.handle.net/2123/27472.

Full text source
Abstract:
This thesis addresses the problem of providing an efficient case memory suitable for use with case-based reasoning in design. First, the requirements of a case memory for design are identified. Then a formal specification of a dynamic memory which satisfies these requirements is produced based on the idea of the memory organization packet presented in Schank’s model of dynamic memory and the assumption that design cases are decomposed into excerpts. Memory organization packets are used to represent both the excerpts themselves and the generalizations that may be formed from them. The term trace is used to refer to a memory organization packet that represents an excerpt and the term epitome is used for one that represents a generalization. A memory organization packet may be both a trace and an epitome simultaneously. The memory organization packets are the nodes of a structure and are linked together by links from more general nodes to their specializations indexed by their differences. Minimum storage requirements to meet this specification are identified and it is shown that, after a short period of exponential growth, this requirement increases linearly with the number of case excerpts processed and that the rate of growth is exponentially related to the maximum length of an excerpt. Algorithms, based on this specification, which perform efficient retrieval of complete and partial matches to queries are presented. Complete match retrieval takes time proportional to the size of the query and retrieval of partial matches is optimal in the sense that the minimum number of nodes necessary to find all partial matches are visited. A method for retrieving approximate matches by using partial match procedures is also presented. A mechanism for choosing between multiple responses to retrieval requests that takes into account both how frequently excerpts have been seen (or generalizations have been supported) and how recently they have been seen (or supported) is presented. The dynamic memory is then implemented in a series of increasingly sophisticated implementations culminating in one that is optimal in terms of the number of nodes used. The series of implementations are used so that problems associated with formation of generalizations, exploitation of inheritance, multiple indexing and removal of node redundancy can be addressed separately and as simply as possible. In particular problems with exploitation of inheritance and multiple indexing which could lead to selective forgetting and spurious erroneous recollections (respectively) are identified and corrected. An empirical evaluation of the trade-offs between processing time increases due to the more complex algorithms of the final implementation and the savings in storage space that they allow is made. The results of this evaluation showed that the reduction in the number of nodes to be processed more than compensated for the increased complexity of the insertion algorithms. Excerpt insertion times for the final implementation are slightly less than for any of the other three. The average time to process retrievals is slightly longer for the final implementation than the others due to the complications with inheritance when node redundancy is eliminated. Additionally, an empirical evaluation of the effects of increasing the maximum length of excerpts was made. 
This evaluation confirmed the hypothesis that the number of nodes required increases dramatically as the maximum size of excerpts grows and that this is a more important factor in determining feasibility than the number of excerpts. The application of this kind of memory structure to classification problems is briefly discussed. In this area it has the advantage of being able to identify further tests that could be performed to resolve a classification problem when the existing data is insufficient to produce a unique solution. Additionally, the undifferentiated strength can be used to guide the choice between two competing classifications where it is not possible to gather more data.
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Sperens, Martin. "Dynamic Memory Managment in C++." Thesis, Luleå tekniska universitet, Datavetenskap, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-76611.

Full text source
Abstract:
Memory allocation is an important part of program optimization as well as of computer architecture. This thesis examines some of the concepts of memory allocation and tries to implement overrides for the standard new and delete functions in the c++ library using memory pools combined with other techniques. The overrides are tested against the standard new and delete as well as a custom memory pool with perfect size for the allocations. The study finds that the overrides are slightly faster on a single thread but not on multiple. The study also finds that the biggest gain on performance is to create custom memory pools specific to the programs needs. Lastly, the study also lists a number of ways that the library could be improved
Styles: APA, Harvard, Vancouver, ISO, etc.
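
The abstract above refers to overriding the standard new and delete with memory pools. A minimal sketch of that technique, assuming a single fixed block size, a single thread, and a malloc fallback for oversized requests (a deliberately simplified stand-in, not the thesis's implementation), could look like this:

    #include <cstddef>
    #include <cstdio>
    #include <cstdlib>
    #include <new>

    namespace {
        constexpr std::size_t kBlockSize  = 64;     // one fixed block size (assumed)
        constexpr std::size_t kBlockCount = 1024;

        alignas(std::max_align_t) unsigned char pool[kBlockSize * kBlockCount];
        void*       free_list[kBlockCount];         // stack of free blocks
        std::size_t free_top    = 0;
        bool        initialized = false;

        void init_pool() {
            for (std::size_t i = 0; i < kBlockCount; ++i)
                free_list[i] = pool + i * kBlockSize;
            free_top    = kBlockCount;
            initialized = true;
        }

        bool from_pool(void* p) {
            return p >= static_cast<void*>(pool) &&
                   p <  static_cast<void*>(pool + sizeof(pool));
        }
    }

    void* operator new(std::size_t size) {
        if (!initialized) init_pool();
        if (size <= kBlockSize && free_top > 0)
            return free_list[--free_top];           // pop a pooled block
        void* p = std::malloc(size);                // fallback for large requests or an empty pool
        if (!p) throw std::bad_alloc();
        return p;
    }

    void operator delete(void* p) noexcept {
        if (!p) return;
        if (from_pool(p)) free_list[free_top++] = p;   // return the block to the pool
        else              std::free(p);
    }

    int main() {
        int*    a = new int(42);        // served from the pool
        double* b = new double(3.14);   // also served from the pool
        std::printf("%d %.2f\n", *a, *b);
        delete a;
        delete b;
        return 0;
    }
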
9

Bisht, Pawas. "Disaster and the dynamics of memory." Thesis, Loughborough University, 2013. https://dspace.lboro.ac.uk/2134/14184.

Full text source
Abstract:
Calls for examining the interrelations between individual and collective processes of remembering have been repeatedly made within the field of memory studies. With the tendency being to focus on either the individual or the collective level, there have been few studies that have undertaken this task in an empirically informed manner. This thesis seeks to engage in such an examination by undertaking a multi-level study of the remembrance of the Bhopal gas disaster of 1984. The gas leak in Bhopal (India) was one of the world s worst industrial disasters and has seen a long-running political contestation involving state institutions, social movement organisations (SMOs) and individual survivors. Employing an ethnographic methodology, incorporating interviews, participant observation and archival research, the study seeks to examine similarities and divergences in how these institutional, group-level and individual actors have remembered the disaster. It identifies the factors that modulated these remembrances and focuses on examining the nature of their interrelationship. The study conceptualises remembering as memory-work : an active process of meaning-making in relation to the past. The memory-work of state institutions was examined within the judicial and commemorative domains. The analysis demonstrates how state institutions engaged in a limiting of the meaning of the disaster removing from view the transnational causality of the event and the issue of corporate liability. It tracks how the survivors suffering was dehistoricised and contained within the framework of a localized claims bureaucracy. The examination of SMO memory-work focused on the activities of the two most prominent groups working in Bhopal. The analysis reveals how both organisations emphasise the continuing suffering of the survivors to challenge the state s settlement of the event. However, clear differences are outlined between the two groups in the wider frameworks of meaning employed by them to explain the suffering, assign responsibility and define justice. Memory-work at the individual level was accessed in the memory narratives of individual survivors generated through ethnographic interviews. The study examined how individual survivors have made sense of the lived experience of suffering caused by the disaster and its aftermath. The analysis revealed how the frameworks of meaning imposed by the state are deeply incommensurate with the survivors needs to express the multi-dimensionality of their suffering; it tracks how the state imposed identities are resisted but cannot be entirely overcome in individual remembrance. Engagement with the activities of the SMOs is demonstrated as enabling the development of an alternative activist remembrance for a limited group of survivors. Overall, the thesis seeks to provide a complex and empirically grounded account of the relations between the inner, individual level processes of memory linked to lived experience and the wider, historically inflected, collective and institutional registers of remembrance. The examination of the encounters between these diverse individual and collective remembrances in the context of an on-going political contestation allows the study to contribute to ongoing discussions within the field about memory politics in a global age and memory and justice.
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Wu, Jiaming. "A modular dynamic Neuro-Synaptic platform for Spiking Neural Networks." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASP145.

Full text source
Abstract:
Biological and artificial neural networks share a fundamental computational unit: the neuron. These neurons are coupled by synapses, forming complex networks that enable various functions. Similarly, neuromorphic hardware, or more generally neuro-computers, also require two hardware elements: neurons and synapses. In this work, we introduce a bio-inspired spiking Neuro-Synaptic hardware unit, fully implemented with conventional electronic components. Our hardware is based on a textbook theoretical model of the spiking neuron, and its synaptic and membrane currents. The spiking neuron is fully analog and the various models that we introduced are defined by their hardware implementation. The neuron excitability is achieved through a memristive device made from off-the-shelf electronic components. Both synaptic and membrane currents feature tunable intensities and bio-mimetic dynamics, including excitatory and inhibitory currents. All model parameters are adjustable, allowing the system to be tuned to bio-compatible timescales, which is crucial in applications such as brain-machine interfaces. Building on these two modular units, we demonstrate various basic neural network motifs (or neuro-computing primitives) and show how to combine these fundamental motifs to implement more complex network functionalities, such as dynamical memories and central pattern generators. Our hardware design also carries potential extensions for integrating oxide-based memristors (which are widely studied in material science),or porting the design to very large-scale integration (VLSI) to implement large-scale networks. The Neuro-Synaptic unit can be considered as a building block for implementing spiking neural networks of arbitrary geometry. Its compact and modular design, as well as the wide availability of ordinary electronic components, makes our approach an attractive platform for building neural interfaces in medical devices, robotics, and artificial intelligence systems such as reservoir computing
Styles: APA, Harvard, Vancouver, ISO, etc.
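
As a software analogue of the neuro-synaptic unit described above, the sketch below integrates a textbook leaky integrate-and-fire neuron driven by exponentially decaying excitatory and inhibitory synaptic conductances. All constants are generic textbook values and the input schedule is invented for illustration; the thesis itself realises this behaviour in analog hardware with a memristive element:

    #include <cmath>
    #include <cstdio>

    int main() {
        // Leaky integrate-and-fire neuron with exponential excitatory and inhibitory
        // synaptic conductances. Units: ms, mV, arbitrary conductance units.
        const double dt = 0.1, T = 200.0;
        const double tau_m = 20.0, E_L = -70.0, V_th = -54.0, V_reset = -70.0;
        const double tau_e = 5.0, tau_i = 10.0, E_e = 0.0, E_i = -80.0;

        double V = E_L, g_e = 0.0, g_i = 0.0;
        int spikes = 0;
        for (double t = 0.0; t < T; t += dt) {
            // Invented input schedule: excitatory kick every 10 ms, inhibitory every 25 ms
            if (std::fmod(t, 10.0) < dt) g_e += 0.6;
            if (std::fmod(t, 25.0) < dt) g_i += 0.3;

            // Synaptic conductances decay exponentially toward zero
            g_e -= dt * g_e / tau_e;
            g_i -= dt * g_i / tau_i;

            // Membrane equation: tau_m * dV/dt = (E_L - V) + g_e*(E_e - V) + g_i*(E_i - V)
            V += dt * ((E_L - V) + g_e * (E_e - V) + g_i * (E_i - V)) / tau_m;

            if (V >= V_th) {       // threshold crossing: count a spike and reset
                ++spikes;
                V = V_reset;
            }
        }
        std::printf("spikes in %.0f ms: %d\n", T, spikes);
        return 0;
    }
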

Books on the topic "Dynamical memory"

1

Irene, Dorfman, Fokas A. S. 1952-, and Gelʹfand I. M, eds. Algebraic aspects of integrable systems: In memory of Irene Dorfman. Boston: Birkhäuser, 1997.

Find the full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Blokh, Alexander, Leonid Bunimovich, Paul Jung, Lex Oversteegen, and Yakov Sinai, eds. Dynamical Systems, Ergodic Theory, and Probability: in Memory of Kolya Chernov. Providence, Rhode Island: American Mathematical Society, 2017. http://dx.doi.org/10.1090/conm/698.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

V, Anosov D., Stepin A. M, and Bolibruch Andrej Andreevič, eds. Dynamical systems and related problems of geometry: Collected papers dedicated to the memory of academician Andrei Andreevich Bolibrukh. Moscow: Maik Nauka/Interperiodica, 2004.

Find the full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Motorola. Dynamic RAMs & memory modules. 2nd ed. Phoenix, AZ: Motorola, 1996.

Find the full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Korostelina, Karina V. Memory Sites and Conflict Dynamics. London: Routledge, 2024. http://dx.doi.org/10.4324/9781003497332.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Motorola. Dynamic RAMs and memory modules. Phoenix, AZ: Motorola, 1993.

Find the full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Atienza Alonso, David, Stylianos Mamagkakis, Christophe Poucet, Miguel Peón-Quirós, Alexandros Bartzas, Francky Catthoor, and Dimitrios Soudris. Dynamic Memory Management for Embedded Systems. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-10572-7.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Incorporated, Advanced Micro Devices. Dynamic memory design data book/handbook. [Sunnyvale, CA]: Advanced Micro Devices, Inc., 1990.

Find the full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Daconta, Michael C. C++ pointers and dynamic memory management. New York: Wiley, 1995.

Find the full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Farkas, Keith I. Memory-system design considerations for dynamically-scheduled microprocessors. Ottawa: National Library of Canada = Bibliothèque nationale du Canada, 1997.

Find the full text source
Styles: APA, Harvard, Vancouver, ISO, etc.

Book chapters on the topic "Dynamical memory"

1

Pandolfi, Luciano. "Dynamical Algorithms for Identification Problems." In Systems with Persistent Memory, 283–329. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-80281-3_6.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Liu, Jun, and Andrew R. Teel. "Hybrid Dynamical Systems with Finite Memory." In Recent Results on Nonlinear Delay Control Systems, 261–73. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-18072-4_13.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Fung, C. C. Alan, K. Y. Michael Wong, and Si Wu. "Dynamical Synapses Enhance Mobility, Memory and Decoding." In Advances in Cognitive Neurodynamics (III), 131–37. Dordrecht: Springer Netherlands, 2013. http://dx.doi.org/10.1007/978-94-007-4792-0_18.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Cosnard, Michel, and Eric Goles Chacc. "Dynamical Properties of An Automaton with Memory." In Disordered Systems and Biological Organization, 63–66. Berlin, Heidelberg: Springer Berlin Heidelberg, 1986. http://dx.doi.org/10.1007/978-3-642-82657-3_7.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Bragov, A. M., L. A. Igumnov, A. Yu Konstantinov, A. K. Lomunov, and A. I. Razov. "Dynamic Research of Shape Memory Alloys." In Dynamical Processes in Generalized Continua and Structures, 133–46. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-11665-1_7.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Grasselli, Maurizio, and Vittorino Pata. "Uniform Attractors of Nonautonomous Dynamical Systems with Memory." In Evolution Equations, Semigroups and Functional Analysis, 155–78. Basel: Birkhäuser Basel, 2002. http://dx.doi.org/10.1007/978-3-0348-8221-7_9.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Butaud, Pauline, Morvan Ouisse, Kévin Jaboviste, Vincent Placet, and Emmanuel Foltête. "Dynamical Mechanical Thermal Analysis of Shape-Memory Polymers." In Advanced Structured Materials, 129–51. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-8574-2_6.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Soares, O. D. D., A. L. V. S. Lage, A. O. S. Gomes, and J. C. D. M. Santos. "Dynamical Digital Memory for Holography, Moiré and E.S.P.I." In Optical Metrology, 182–98. Dordrecht: Springer Netherlands, 1987. http://dx.doi.org/10.1007/978-94-009-3609-6_16.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Koopmans, Matthijs. "Investigating the Long Memory Process in Daily High School Attendance Data." In Complex Dynamical Systems in Education, 299–321. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-27577-2_14.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Hayashi, Hatsuo, and Motoharu Yoshida. "A Memory Model Based on Dynamical Behavior of the Hippocampus." In Lecture Notes in Computer Science, 967–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-30132-5_130.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.

Conference papers on the topic "Dynamical memory"

1

Shen, Minghao, and Gábor Orosz. "Memory Sketching for Data-driven Prediction of Dynamical Systems." In 2024 American Control Conference (ACC), 5388–93. IEEE, 2024. http://dx.doi.org/10.23919/acc60939.2024.10645035.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Loveridge, Tegan, Kai Shinbrough, and Virginia O. Lorenz. "Optimal Continuous Dynamical Decoupling in N-type Atomic Ensemble Quantum Memories." In CLEO: Fundamental Science, FM3R.4. Washington, D.C.: Optica Publishing Group, 2024. http://dx.doi.org/10.1364/cleo_fs.2024.fm3r.4.

Full text source
Abstract:
We simulate N-type quantum memory in atomic ensembles with a high-lying sensor state in the regime of continuous dynamical decoupling. We find order-of-magnitude memory lifetime enhancements for realistic experimental parameters.
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Otsuka, Kenju, and Jyh-Long Chern. "Factorial Dynamic Pattern Memory in Globally Coupled Lasers." In Nonlinear Dynamics in Optical Systems. Washington, D.C.: Optica Publishing Group, 1992. http://dx.doi.org/10.1364/nldos.1992.thb1.

Full text source
Abstract:
Recently, applicability of complex dynamics to information storage (memory) has been discussed in nonlinear optical systems. Spatial chaos memory was proposed in bistable pixels [1] and in a coupled bistable chain [2]. Dynamic memory in a delayed feedback bistable system was demonstrated experimentally [3]. On the other hand, Otsuka demonstrated that m_a = (N − 1)! (N: number of oscillating modes) coexisting dynamical spatial patterns, i. e., antiphase periodic motions, can be selectively excited by applying seed signals to the modulated multimode lasers whose modes are globally coupled through spatial hole burning [4]. In this paper, we discuss the detailed bifurcation scenario, featuring clustered states and chaotic itinerancy [5] among destabilized clustered states. Also, factorial dynamical pattern memory associated with antiphase and clustered states as well as the effect of spontaneous emission on memory operation are demonstrated by numerical simulations.
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Gordon, Goren, and Gershon Kurizki. "Dynamical control of noisy quantum memory channels." In Microtechnologies for the New Millennium, edited by Ali Serpengüzel, Gonçal Badenes, and Giancarlo C. Righini. SPIE, 2007. http://dx.doi.org/10.1117/12.723952.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Duda, Alexander M., and Stephen E. Levinson. "Nonlinear Dynamical Multi-Scale Model of Associative Memory." In 2010 International Conference on Machine Learning and Applications (ICMLA). IEEE, 2010. http://dx.doi.org/10.1109/icmla.2010.135.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Chung-Ming Ou and C. R. Ou. "Immune memory with associativity: Perspectives on dynamical systems." In 2012 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2012. http://dx.doi.org/10.1109/cec.2012.6256646.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Andrianov, Serge N., and Nikolai S. Edamenko. "Geometric integration of nonlinear dynamical systems." In 2015 International Conference "Stability and Control Processes" in Memory of V.I. Zubov (SCP). IEEE, 2015. http://dx.doi.org/10.1109/scp.2015.7342048.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Vakhnenko, Vyacheslav O. "Dynamical realization of end-point memory in consolidated materials." In INNOVATIONS IN NONLINEAR ACOUSTICS: ISNA17 - 17th International Symposium on Nonlinear Acoustics including the International Sonic Boom Forum. AIP, 2006. http://dx.doi.org/10.1063/1.2210332.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Alonso-Sanz, Ramon. "Cellular automata and other discrete dynamical systems with memory." In 2012 International Conference on High Performance Computing & Simulation (HPCS). IEEE, 2012. http://dx.doi.org/10.1109/hpcsim.2012.6266914.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Davydenko, Alexander A., Natalya V. Raspopova, and Sergei S. Ustimenko. "On mass simulations of dynamical models of galaxy." In 2015 International Conference "Stability and Control Processes" in Memory of V.I. Zubov (SCP). IEEE, 2015. http://dx.doi.org/10.1109/scp.2015.7342053.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.

Reports of organizations on the topic "Dynamical memory"

1

Beri, A. C., and T. F. George. Memory Effects in Dynamical Many-Body Systems: The Isomnesic (Constant-Memory) Approximation. Fort Belvoir, VA: Defense Technical Information Center, April 1985. http://dx.doi.org/10.21236/ada154160.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Perdigão, Rui A. P., and Julia Hall. Spatiotemporal Causality and Predictability Beyond Recurrence Collapse in Complex Coevolutionary Systems. Meteoceanics, November 2020. http://dx.doi.org/10.46337/201111.

Full text source
Abstract:
Causality and Predictability of Complex Systems pose fundamental challenges even under well-defined structural stochastic-dynamic conditions where the laws of motion and system symmetries are known. However, the edifice of complexity can be profoundly transformed by structural-functional coevolution and non-recurrent elusive mechanisms changing the very same invariants of motion that had been taken for granted. This leads to recurrence collapse and memory loss, precluding the ability of traditional stochastic-dynamic and information-theoretic metrics to provide reliable information about the non-recurrent emergence of fundamental new properties absent from the a priori kinematic geometric and statistical features. Unveiling causal mechanisms and eliciting system dynamic predictability under such challenging conditions is not only a fundamental problem in mathematical and statistical physics, but also one of critical importance to dynamic modelling, risk assessment and decision support e.g. regarding non-recurrent critical transitions and extreme events. In order to address these challenges, generalized metrics in non-ergodic information physics are hereby introduced for unveiling elusive dynamics, causality and predictability of complex dynamical systems undergoing far-from-equilibrium structural-functional coevolution. With these methodological developments at hand, hidden dynamic information is hereby brought out and explicitly quantified even beyond post-critical regime collapse, long after statistical information is lost. The added causal insights and operational predictive value are further highlighted by evaluating the new information metrics among statistically independent variables, where traditional techniques therefore find no information links. Notwithstanding the factorability of the distributions associated to the aforementioned independent variables, synergistic and redundant information are found to emerge from microphysical, event-scale codependencies in far-from-equilibrium nonlinear statistical mechanics. The findings are illustrated to shed light onto fundamental causal mechanisms and unveil elusive dynamic predictability of non-recurrent critical transitions and extreme events across multiscale hydro-climatic problems.
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Asea, Patrick K., and Michael J. Dueker. Non-Monotonic Long Memory Dynamics in Black-Market Exchange Rates. Federal Reserve Bank of St. Louis, 1995. http://dx.doi.org/10.20955/wp.1995.003.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Kim, Joohee, and Marios C. Papaefthymiou. Block-Based Multi-Period Refresh for Energy Efficient Dynamic Memory. Fort Belvoir, VA: Defense Technical Information Center, April 2002. http://dx.doi.org/10.21236/ada414244.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Lagoudas, Dimitris C. Dynamic Behavior and Shock Absorption Properties of Porous Shape Memory Alloys. Fort Belvoir, VA: Defense Technical Information Center, July 2002. http://dx.doi.org/10.21236/ada403775.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Saxena, A., A. R. Bishop, S. R. Shenoy, Y. Wu, and T. Lookman. A model of shape memory materials with hierarchical twinning: Statics and dynamics. Office of Scientific and Technical Information (OSTI), July 1995. http://dx.doi.org/10.2172/102295.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Mayas, Magda. Creating with timbre. Norges Musikkhøgskole, August 2018. http://dx.doi.org/10.22501/nmh-ar.686088.

Full text source
Abstract:
Unfolding processes of timbre and memory in improvisational piano performance This exposition is an introduction to my research and practice as a pianist, in which I unfold processes of timbre and memory in improvised music from a performer’s perspective. Timbre is often understood as a purely sonic perceptual phenomenon. However, this is not in accordance with a site-specific improvisational practice with changing spatial circumstances impacting the listening experience, nor does it take into account the agency of the instrument and objects used or the performer’s movements and gestures. In my practice, I have found a concept as part of the creating process in improvised music which has compelling potential: Timbre orchestration. My research takes the many and complex aspects of a performance environment into account and offers an extended understanding of timbre, which embraces spatial, material and bodily aspects of sound in improvised music performance. The investigative projects described in this exposition offer a methodology to explore timbral improvisational processes integrated into my practice, which is further extended through collaborations with sound engineers, an instrument builder and a choreographer: -experiments in amplification and recording, resulting in Memory piece, a series of works for amplified piano and multichannel playback - Piano mapping, a performance approach, with a custom-built device for live spatialization as means to expand and deepen spatio-timbral relationships; - Accretion, a project with choreographer Toby Kassell for three grand pianos and a pianist, where gestural approaches are used to activate and compose timbre in space. Together, the projects explore memory as a structural, reflective and performative tool and the creation of performing and listening modes as integrated parts of timbre orchestration. Orchestration and choreography of timbre turn into an open and hybrid compositional approach, which can be applied to various contexts, engaging with dynamic relationships and re-configuring them.
Styles: APA, Harvard, Vancouver, ISO, etc.
8

D'Azevedo, E. F., and C. H. Romine. A new shared-memory programming paradigm for molecular dynamics simulations on the Intel Paragon. Office of Scientific and Technical Information (OSTI), December 1994. http://dx.doi.org/10.2172/28414.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
9

D'Azevedo, E. F. A New Shared-Memory Programming Paradigm for Molecular Dynamics Simulations on the Intel Paragon. Office of Scientific and Technical Information (OSTI), January 1995. http://dx.doi.org/10.2172/814063.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Vineyard, Craig Michael, and Stephen Joseph Verzi. A Case Study on Neural Inspired Dynamic Memory Management Strategies for High Performance Computing. Office of Scientific and Technical Information (OSTI), September 2017. http://dx.doi.org/10.2172/1396076.

Full text source
Styles: APA, Harvard, Vancouver, ISO, etc.