Dissertations / Theses on the topic 'Potts Attractor Neural Network'

Consult the top 17 dissertations / theses for your research on the topic 'Potts Attractor Neural Network.'

1

Seybold, John. "An attractor neural network model of spoken word recognition." Thesis, University of Oxford, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.335839.

2

Pereira, Patrícia. "Attractor Neural Network modelling of the Lifespan Retrieval Curve." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-280732.

Abstract:
Human capability to recall episodic memories depends on how much time has passed since the memory was encoded. This dependency is described by a memory retrieval curve that reflects an interesting phenomenon known as the reminiscence bump: a tendency for older people to recall more memories formed during their young adulthood than in other periods of life. This phenomenon can be modelled with an attractor neural network, for example the firing-rate Bayesian Confidence Propagation Neural Network (BCPNN) with incremental learning. In this work, the mechanisms underlying the reminiscence bump…
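The attractor-recall idea running through several of these theses can be sketched with a minimal Hopfield-style network. This is a simpler relative of the BCPNN and Potts models named in the titles; the patterns, network size, and sign-update rule below are illustrative assumptions, not taken from any of the listed works:

```python
import numpy as np

# Two orthogonal 8-unit binary patterns to store (purely illustrative).
patterns = np.array([
    [ 1,  1,  1,  1, -1, -1, -1, -1],
    [ 1, -1,  1, -1,  1, -1,  1, -1],
])

# Hebbian weight matrix W = (1/N) * sum_p x_p x_p^T, with zero self-coupling.
N = patterns.shape[1]
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(x, steps=10):
    """Iterate the sign dynamics until (ideally) an attractor is reached."""
    x = x.copy()
    for _ in range(steps):
        x = np.where(W @ x >= 0, 1, -1)
    return x

# Corrupt one bit of the first pattern; the dynamics fall back onto the attractor.
probe = patterns[0].copy()
probe[0] = -probe[0]
print(np.array_equal(recall(probe), patterns[0]))  # → True
```

The Potts and BCPNN models studied in these theses replace the binary units with multi-state (hypercolumn) units and the Hebbian rule with a Bayesian learning rule, but the recall principle — falling from a corrupted state into a stored attractor — is the same.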
3

Ericson, Julia. "Modelling Immediate Serial Recall using a Bayesian Attractor Neural Network." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-291553.

Abstract:
In the last decades, computational models have become useful tools for studying biological neural networks. These models are typically constrained either by behavioural data from neuropsychological studies or by biological data from neuroscience. One model of the latter kind is the Bayesian Confidence Propagation Neural Network (BCPNN), an attractor network with a Bayesian learning rule which has been proposed as a model for various types of memory. In this thesis, I have further studied the potential of the BCPNN in short-term sequential memory. More specifically, I have investigated whether the…
4

Batbayar, Batsukh. "Improving Time Efficiency of Feedforward Neural Network Learning." RMIT University, Electrical and Computer Engineering, 2009. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20090303.114706.

Abstract:
Feedforward neural networks have been widely studied and used in many applications in science and engineering. The training of this type of network is mainly undertaken using the well-known backpropagation-based learning algorithms. One major problem with these algorithms is their slow training convergence speed, which hinders their application. To improve the training convergence speed, many researchers have developed different improvements and enhancements; however, the slow convergence problem has not been fully addressed. This thesis makes several…
5

Villani, Gianluca. "Analysis of an Attractor Neural Network Model for Working Memory: A Control Theory Approach." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-260079.

Abstract:
Working Memory (WM) is a general-purpose cognitive system responsible for temporarily holding information in service of higher-order cognition systems, e.g. decision making. In this thesis we focus on a non-spiking model belonging to a special family of biologically inspired recurrent Artificial Neural Networks that aims to account for human experimental data on free recall. Considering its modular structure, this thesis gives a networked-system representation of WM in order to analyze its stability and synchronization properties. Furthermore, with the tools provided by bifurcation analysis we investigate…
6

Ferland, Guy J. M. G. "A new paradigm for the classification of patterns: The 'race to the attractor' neural network model." Thesis, University of Ottawa (Canada), 2001. http://hdl.handle.net/10393/9298.

Abstract:
The human brain is arguably the best classifier we know of. It can learn complex classification tasks with little apparent effort. It can learn to classify new patterns without forgetting old ones. It can learn a seemingly unlimited number of pattern classes. And it displays amazing resilience through its ability to persevere with reliable classifications despite damage to itself (e.g., dying neurons). These advantages have motivated researchers from many fields in the quest to understand the brain in order to duplicate its abilities in an artificial system. And yet, little is known about the…
7

Rosay, Sophie. "A statistical mechanics approach to the modelling and analysis of place-cell activity." Thesis, Paris, Ecole normale supérieure, 2014. http://www.theses.fr/2014ENSU0010/document.

Abstract:
Hippocampal place cells are neurons with intriguing properties, such as the fact that their activity is correlated with the animal's spatial position. It is generally considered that these properties can largely be explained by the collective behaviour of schematic models of interacting neurons. Statistical physics provides tools for studying these collective behaviours analytically and numerically. Here we address the problem of applying these tools within the framework of the "attractor network" paradigm, a theoretical hypothesis on the…
8

Strandqvist, Jonas. "Attractors of autoencoders : Memorization in neural networks." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-97746.

Abstract:
It is an important question in machine learning to understand how neural networks learn. This thesis sheds further light on this by studying autoencoder neural networks, which can memorize data by storing it as attractors. What this means is that an autoencoder can learn a training set and later reproduce parts or all of this training set even when given inputs not belonging to this set. We seek to illuminate how ReLU networks handle memorization when trained with different setups: with and without bias, for different widths and depths, and using two different types of tr…
9

Martí, Ortega Daniel. "Neural stochastic dynamics of perceptual decision making." Doctoral thesis, Universitat Pompeu Fabra, 2008. http://hdl.handle.net/10803/7552.

Abstract:
Computational models based on large-scale networks of neurobiological inspiration make it possible to describe the neural correlates of decision observed in certain cortical areas as a transition between attractors of the cortical network. Stimulation causes a change in the attractor landscape that favours the transition from the initial neutral attractor to one of the attractors associated with the categorical choices. The noise present in the system introduces indeterminacy into the transitions. In this work we show the existence of two qualitatively different decision mechanisms, each with signature…
10

Posani, Lorenzo. "Inference and modeling of biological networks : a statistical-physics approach to neural attractors and protein fitness landscapes." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLEE043/document.

Abstract:
The recent advent of high-throughput experimental procedures has opened a new era for the quantitative study of biological systems. Nowadays, electrophysiology recordings and calcium imaging allow the simultaneous in vivo recording of hundreds to thousands of neurons. In parallel, thanks to automated sequencing procedures, the libraries of known functional proteins have been expanded from thousands to millions in just a few years. The current abundance of biological data opens a new set of challenges for theoreticians…
11

Battista, Aldo. "Low-dimensional continuous attractors in recurrent neural networks : from statistical physics to computational neuroscience." Thesis, Université Paris sciences et lettres, 2020. http://www.theses.fr/2020UPSLE012.

Abstract:
How sensory information is encoded and processed by neural circuits is a central question in computational neuroscience. In many brain regions, the activity of neurons is found to depend strongly on certain continuous sensory correlates; examples include simple cells in area V1 of the visual cortex coding for the orientation of a bar presented to the retina, and head-direction cells in the subiculum or place cells in the hippocampus, whose activities depend, respectively, on the orientation of…
12

Doria, Felipe França. "Padrões estruturados e campo aleatório em redes complexas." Biblioteca Digital de Teses e Dissertações da UFRGS, 2016. http://hdl.handle.net/10183/144076.

Abstract:
This work focuses on the study of two complex networks. The first is a random-field Ising model, with Gaussian and bimodal field distributions. A finite-connectivity technique was used to solve it, and a Monte Carlo method was applied to verify the results. Our results indicate that for the Gaussian distribution the phase transition is always second order. For the bimodal distributions there is a tricritical point that depends on the value of the connectivity; below a certain minimum connectivity, only a second-order transition exists.
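The Monte Carlo verification step mentioned in the abstract can be sketched with a minimal Metropolis simulation of a random-field Ising model. This uses a 1D periodic chain with illustrative couplings and Gaussian fields as a stand-in; the thesis itself studies finite-connectivity random networks, not a chain:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D random-field Ising chain with Gaussian fields (illustrative parameters).
# Energy: H = -J * sum_i s_i s_{i+1} - sum_i h_i s_i, periodic boundary.
N, J, beta = 100, 1.0, 2.0
h = rng.normal(0.0, 0.5, size=N)   # Gaussian random fields
s = rng.choice([-1, 1], size=N)    # random initial spin configuration

def energy(s):
    return -J * np.sum(s * np.roll(s, -1)) - np.sum(h * s)

def metropolis_sweep(s):
    for i in rng.integers(0, N, size=N):
        # Energy change from flipping spin i (s[i-1] wraps via negative indexing).
        dE = 2 * s[i] * (J * (s[i - 1] + s[(i + 1) % N]) + h[i])
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i] = -s[i]
    return s

E0 = energy(s)
for _ in range(50):
    s = metropolis_sweep(s)
print(energy(s) < E0)  # at this low temperature, the sweeps lower the energy
```

A real study would sweep the temperature and field strength and average observables such as the magnetization over many disorder realizations to locate the phase transition.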
13

Tully, Philip. "Spike-Based Bayesian-Hebbian Learning in Cortical and Subcortical Microcircuits." Doctoral thesis, KTH, Beräkningsvetenskap och beräkningsteknik (CST), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-205568.

Abstract:
Cortical and subcortical microcircuits are continuously modified throughout life. Despite ongoing changes, these networks stubbornly maintain their functions, which persist even though destabilizing synaptic and nonsynaptic mechanisms should ostensibly propel them towards runaway excitation or quiescence. What dynamical phenomena act together to balance such learning with information processing? What types of activity patterns do they underpin, and how do these patterns relate to our perceptual experiences? What enables learning and memory operations to occur despite such massive and cons…
14

Martínez-García, Marina. "Statistical analysis of neural correlates in decision-making." Doctoral thesis, Universitat Pompeu Fabra, 2014. http://hdl.handle.net/10803/283111.

Abstract:
We investigated the neuronal processes which occur during a decision-making task based on a perceptual classification judgment. For this purpose we analysed three different experimental paradigms (somatosensory, visual, and auditory) in two different species (monkey and rat), with the common goal of shedding light on the information carried by neurons. In particular, we focused on how the information content is preserved in the underlying neuronal activity over time. Furthermore, we considered how the decision, the stimuli, and the confidence are encoded in memory and, when the exp…
15

Insabato, Andrea. "Neurodynamical theory of decision confidence." Doctoral thesis, Universitat Pompeu Fabra, 2014. http://hdl.handle.net/10803/129463.

Abstract:
Decision confidence offers a window onto introspection and the evaluation mechanisms associated with decision-making. Nonetheless, we do not yet have a thorough understanding of its neurophysiological and computational substrate. There are mainly two experimental paradigms to measure decision confidence in animals: post-decision wagering and the uncertain option. In this thesis we explore and try to shed light on the computational mechanisms underlying confidence-based decision-making in both experimental paradigms. We propose that a double-layer attractor neural network can account for neur…
16

Larsson, Johan P. "Modelling neuronal mechanisms of the processing of tones and phonemes in the higher auditory system." Doctoral thesis, Universitat Pompeu Fabra, 2012. http://hdl.handle.net/10803/97293.

Abstract:
Both the basic neuronal mechanisms of hearing and the psychological organisation of speech perception have been investigated extensively. However, in both areas there is a relative scarcity of modelling work. Here we describe two modelling studies. One proposes a new frequency-selectivity enhancement mechanism that explains results of neurophysiological experiments investigating manifestations of forward masking and, above all, auditory streaming in the primary auditory cortex (A1). The mechanism operates in a feed-forward network with synaptic depression between the thalamus…
17

Engelken, Rainer. "Chaotic Neural Circuit Dynamics." Doctoral thesis, 2017. http://hdl.handle.net/11858/00-1735-0000-002E-E349-9.
