Selection of scientific literature on the topic "Computational Neuroscience"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Choose a type of source:

Consult the lists of current articles, books, dissertations, reports, and other scholarly sources on the topic "Computational Neuroscience".

Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read its online annotation, provided the relevant parameters are present in the metadata.

Journal articles on the topic "Computational Neuroscience"

1

Herrmann-Pillath, Carsten. "From dual systems to dual function: rethinking methodological foundations of behavioural economics". Economics and Philosophy 35, no. 3 (24 January 2019): 403–22. http://dx.doi.org/10.1017/s0266267118000378.

Abstract:
Building on an overview of dual systems theories in behavioural economics, the paper presents a methodological assessment in terms of the mechanistic explanations framework that has gained prominence in philosophy of the neurosciences. I conclude that they fail to meet the standards of causal explanations and I suggest an alternative 'dual functions' view based on Marr's methodology of computational neuroscience. Recent psychological and neuroscience research undermines the case for a categorization of brain processes in terms of properties such as relative speed. I defend an interpretation of dualities as functional, without assigning them to specific neurophysiological structures.
2

Cao, Jinde, Qingshan Liu, Sabri Arik, Jianlong Qiu, Haijun Jiang, and Ahmed Elaiw. "Computational Neuroscience". Computational and Mathematical Methods in Medicine 2014 (2014): 1–2. http://dx.doi.org/10.1155/2014/120280.

3

Sejnowski, T., C. Koch, and P. Churchland. "Computational neuroscience". Science 241, no. 4871 (9 September 1988): 1299–306. http://dx.doi.org/10.1126/science.3045969.

4

Sejnowski, Terrence J. "Computational neuroscience". Behavioral and Brain Sciences 9, no. 1 (March 1986): 104–5. http://dx.doi.org/10.1017/s0140525x00021713.

5

Moore, John W. "Computational Neuroscience". Contemporary Psychology: A Journal of Reviews 38, no. 2 (February 1993): 137–39. http://dx.doi.org/10.1037/033019.

6

Ringo, J. L. "Computational Neuroscience". Archives of Neurology 48, no. 2 (1 February 1991): 130. http://dx.doi.org/10.1001/archneur.1991.00530140018008.

7

Kriegeskorte, Nikolaus, and Pamela K. Douglas. "Cognitive computational neuroscience". Nature Neuroscience 21, no. 9 (20 August 2018): 1148–60. http://dx.doi.org/10.1038/s41593-018-0210-5.

8

Cecchi, Guillermo A., and James Kozloski. "Preface: Computational neuroscience". IBM Journal of Research and Development 61, no. 2/3 (1 March 2017): 0:1–0:4. http://dx.doi.org/10.1147/jrd.2017.2690118.

9

Popovych, Oleksandr, Peter Tass, and Christian Hauptmann. "Desynchronization (computational neuroscience)". Scholarpedia 6, no. 10 (2011): 1352. http://dx.doi.org/10.4249/scholarpedia.1352.

10

Érdi, Péter. "Teaching computational neuroscience". Cognitive Neurodynamics 9, no. 5 (21 March 2015): 479–85. http://dx.doi.org/10.1007/s11571-015-9340-6.


Dissertations on the topic "Computational Neuroscience"

1

Higgins, Irina. "Computational neuroscience of speech recognition". Thesis, University of Oxford, 2015. https://ora.ox.ac.uk/objects/uuid:daa8d096-6534-4174-b63e-cc4161291c90.

Abstract:
Physical variability of speech combined with its perceptual constancy makes speech recognition a challenging task. The human auditory brain, however, is able to perform speech recognition effortlessly. This thesis aims to understand the precise computational mechanisms that allow the auditory brain to do so. In particular, we look for the minimal subset of sub-cortical auditory brain areas that allow the primary auditory cortex to learn 'good representations' of speech-like auditory objects through spike-timing dependent plasticity (STDP) learning mechanisms as described by Bi & Poo (1998). A 'good representation' is defined as one that is informative of the stimulus class regardless of the variability in the raw input, while being less redundant and more compressed than the representations within the auditory nerve, which provides the firing inputs to the rest of the auditory brain hierarchy (Barlow 1961). Neurophysiological studies have provided insights into the architecture and response properties of different areas within the auditory brain hierarchy. We use these insights to guide the development of an unsupervised spiking neural network grounded in the neurophysiology of the auditory brain and equipped with STDP learning (Bi & Poo 1998). The model was exposed to simple controlled speech-like stimuli (artificially synthesised phonemes and naturally spoken words) to investigate how stable representations that are invariant to within- and between-speaker differences can emerge in the output area of the model, which is roughly equivalent to the primary auditory cortex. The aim of the first part of the thesis was to investigate the minimal taxonomy necessary for such representations to emerge through the interactions of the spiking dynamics of the network neurons, their ability to learn through STDP, and the statistics of the auditory input stimuli. It was found that sub-cortical pre-processing within the ventral cochlear nucleus and inferior colliculus was necessary to remove jitter inherent to the auditory nerve spike rasters, which would otherwise disrupt STDP learning in the primary auditory cortex. The second half of the thesis investigated the nature of the neural encoding used within the primary auditory cortex stage of the model to represent the learnt auditory object categories. It was found that single-cell binary encoding (DeWeese & Zador 2003) was sufficient to represent two synthesised vowel classes; however, more complex population encoding using precisely timed spikes within polychronous chains (Izhikevich 2006) represented more complex naturally spoken words in a speaker-invariant manner.
2

Walters, Daniel Matthew. "The computational neuroscience of head direction cells". Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:d4afe06a-d44f-4a24-99a3-d0e0a2911459.

Abstract:
Head direction cells signal the orientation of the head in the horizontal plane. This thesis shows how some of the known head direction cell response properties might develop through learning. The research methodology employed is the computer simulation of neural network models of head direction cells that self-organize through learning. The preferred firing directions of head direction cells will change in response to the manipulation of distal visual cues, but not in response to the manipulation of proximal visual cues. Simulation results are presented of neural network models that learn to form separate representations of distal and proximal visual cues that are presented simultaneously as visual input to the network. These results demonstrate the computation required for a subpopulation of head direction cells to learn to preferentially respond to distal visual cues. Within a population of head direction cells, the angular distance between the preferred firing directions of any two cells is maintained across different environments. It is shown how a neural network model can learn to maintain the angular distance between the learned preferred firing directions of head direction cells across two different visual training environments. A population of head direction cells can update the population representation of the current head direction, in the absence of visual input, using internal idiothetic (self-generated) motion signals alone. This is called the path integration of head direction. It is important that the head direction cell system updates its internal representation of head direction at the same speed as the animal is rotating its head. Neural network models are simulated that learn to perform the path integration of head direction, using solely idiothetic signals, at the same speed as the head is rotating.
3

Chateau-Laurent, Hugo. "Modélisation Computationnelle des Interactions Entre Mémoire Épisodique et Contrôle Cognitif". Electronic Thesis or Diss., Bordeaux, 2024. http://www.theses.fr/2024BORD0019.

Abstract:
Episodic memory is often illustrated with Proust's madeleine episode as the ability to re-experience a situation from the past following the perception of a stimulus. This simplistic scenario should not lead one to think that memory works in isolation from other cognitive functions. On the contrary, memory operations handle highly processed information and are themselves modulated by executive functions in order to inform decision making. This complex interplay can give rise to higher-level functions such as the ability to imagine potential future sequences of events by combining contextually relevant memories. How the brain implements this construction system is still largely a mystery. The objective of this thesis is to employ cognitive computational modeling methods to better understand the interactions between episodic memory, which is supported by the hippocampus, and cognitive control, which mainly involves the prefrontal cortex. It first provides elements as to how episodic memory can help an agent to act. It is shown that Neural Episodic Control, a fast and powerful method for reinforcement learning, is in fact mathematically close to the traditional Hopfield network, a model of associative memory that has greatly influenced the understanding of the hippocampus. Neural Episodic Control indeed fits within the Universal Hopfield Network framework, and it is demonstrated that it can be used to store and recall information, and that other kinds of Hopfield networks can be used for reinforcement learning. The question of how executive functions can control episodic memory operations is also tackled. A hippocampus-inspired network is constructed with as few assumptions as possible and modulated with contextual information. The evaluation of performance according to the level at which contextual information is sent provides design principles for controlled episodic memory. Finally, a new biologically inspired model of one-shot sequence learning in the hippocampus is proposed. The model performs very well on multiple datasets while reproducing biological observations. It ascribes a new role to the recurrent collaterals of area CA3 and the asymmetric expansion of place fields, namely to disambiguate overlapping sequences by making retrospective splitter cells emerge. Implications for theories of the hippocampus are discussed and novel experimental predictions are derived.
4

Cronin, Beau D. "Quantifying uncertainty in computational neuroscience with Bayesian statistical inference". Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/45336.

Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2008.
Includes bibliographical references (p. 101-106).
Two key fields of computational neuroscience involve, respectively, the analysis of experimental recordings to understand the functional properties of neurons, and modeling how neurons and networks process sensory information in order to represent the environment. In both of these endeavors, it is crucial to understand and quantify uncertainty - when describing how the brain itself draws conclusions about the physical world, and when the experimenter interprets neuronal data. Bayesian modeling and inference methods provide many advantages for doing so. Three projects are presented that illustrate the advantages of the Bayesian approach. In the first, Markov chain Monte Carlo (MCMC) sampling methods were used to answer a range of scientific questions that arise in the analysis of physiological data from tuning curve experiments; in addition, a software toolbox is described that makes these methods widely accessible. In the second project, the model developed in the first project was extended to describe the detailed dynamics of orientation tuning in neurons in cat primary visual cortex. Using more sophisticated sampling-based inference methods, this model was applied to answer specific scientific questions about the tuning properties of a recorded population. The final project uses a Bayesian model to provide a normative explanation of sensory adaptation phenomena. The model was able to explain a range of detailed physiological adaptation phenomena.
5

Lee, Ray A. "Analysis of Spreading Depolarization as a Traveling Wave in a Neuron-Astrocyte Network". The Ohio State University, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=osu1503308416771087.

6

Allen, John Michael. "Effects of Abstraction and Assumptions on Modeling Motoneuron Pool Output". Wright State University / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=wright1495538117787703.

7

Shepardson, Dylan. "Algorithms for inverting Hodgkin-Huxley type neuron models". Diss., Atlanta, Ga.: Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31686.

Abstract:
Thesis (Ph.D)--Algorithms, Combinatorics, and Optimization, Georgia Institute of Technology, 2010.
Committee Chair: Tovey, Craig; Committee Member: Butera, Rob; Committee Member: Nemirovski, Arkadi; Committee Member: Prinz, Astrid; Committee Member: Sokol, Joel. Part of the SMARTech Electronic Thesis and Dissertation Collection.
8

Stevens, Martin. "Animal camouflage, receiver psychology and the computational neuroscience of avian vision". Thesis, University of Bristol, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432958.

9

Tromans, James Matthew. "Computational neuroscience of natural scene processing in the ventral visual pathway". Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:b82e1332-df7b-41db-9612-879c7a7dda39.

Abstract:
Neural responses in the primate ventral visual system become more complex in the later stages of the pathway. For example, not only do neurons in IT cortex respond to complete objects, they also learn to respond invariantly with respect to the viewing angle of an object and also with respect to the location of an object. These types of neural responses have helped guide past research with VisNet, a computational model of the primate ventral visual pathway that self-organises during learning. In particular, previous research has focussed on presenting to the model one object at a time during training, and has placed emphasis on the transform invariant response properties of the output neurons of the model that consequently develop. This doctoral thesis extends previous VisNet research and investigates the performance of the model with a range of more challenging and ecologically valid training paradigms. For example, when multiple objects are presented to the network during training, or when objects partially occlude one another during training. The different mechanisms that help output neurons to develop object selective, transform invariant responses during learning are proposed and explored. Such mechanisms include the statistical decoupling of objects through multiple object pairings, and the separation of object representations by independent motion. Consideration is also given to the heterogeneous response properties of neurons that develop during learning. For example, although IT neurons demonstrate a number of differing invariances, they also convey spatial information and view specific information about the objects presented on the retina. An updated, scaled-up version of the VisNet model, with a significantly larger retina, is introduced in order to explore these heterogeneous neural response properties.
10

Vellmer, Sebastian. "Applications of the Fokker-Planck Equation in Computational and Cognitive Neuroscience". Doctoral thesis, Humboldt-Universität zu Berlin, 2020. http://dx.doi.org/10.18452/21597.

Abstract:
This thesis is concerned with the calculation of statistics, in particular power spectra, of point processes generated by stochastic multidimensional integrate-and-fire (IF) neurons, networks of IF neurons, and decision-making models, from the corresponding Fokker-Planck equations. In the brain, information is encoded by sequences of action potentials. In studies that focus on spike timing, IF neurons that drastically simplify spike generation have become the standard model. One-dimensional IF neurons do not suffice to accurately model neural dynamics; the extension towards multiple dimensions, however, yields realistic behavior at the price of growing complexity. The first part of this work develops a theory of spike-train power spectra for stochastic, multidimensional IF neurons. From the Fokker-Planck equation, a set of partial differential equations is derived that describes the stationary probability density, the firing rate, and the spike-train power spectrum. In the second part, a mean-field theory of large and sparsely connected homogeneous networks of spiking neurons is developed that takes into account the self-consistent temporal correlations of spike trains. Neural input is approximated by colored Gaussian noise generated by a multidimensional Ornstein-Uhlenbeck process whose coefficients are initially unknown, are determined by the self-consistency condition, and define the solution of the theory. To explore heterogeneous networks, an iterative scheme is extended to determine the distribution of spectra. In the third part, the Fokker-Planck equation is applied to calculate the statistics of sequences of binary decisions from diffusion-decision models (DDM). For the analytically tractable DDM, the statistics are calculated from the corresponding Fokker-Planck equation; to determine the statistics for nonlinear models, the threshold-integration method is generalized.

Books on the topic "Computational Neuroscience"

1

Terman, David H. (David Hillel), ed. Mathematical foundations of neuroscience. New York: Springer, 2010.

2

Ascoli, Giorgio A., ed. Computational neuroanatomy: Principles and methods. Totowa, N.J.: Humana Press, 2002.

3

Ribeiro, Paulo Rogério de Almeida, Vinícius Rosa Cota, Dante Augusto Couto Barone, and Alexandre César Muniz de Oliveira, eds. Computational Neuroscience. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08443-0.

4

Cota, Vinícius Rosa, Dante Augusto Couto Barone, Diego Roberto Colombo Dias, and Laila Cristina Moreira Damázio, eds. Computational Neuroscience. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36636-0.

5

Bower, James M., ed. Computational Neuroscience. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4757-9800-5.

6

Chaovalitwongse, Wanpracha, Panos M. Pardalos, and Petros Xanthopoulos, eds. Computational Neuroscience. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-0-387-88630-5.

7

Barone, Dante Augusto Couto, Eduardo Oliveira Teles, and Christian Puhlmann Brackmann, eds. Computational Neuroscience. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-71011-2.

8

Mallot, Hanspeter A. Computational Neuroscience. Heidelberg: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-00861-5.

9

Bower, James M., ed. Computational Neuroscience. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4615-4831-7.

10

Schwartz, Eric L. Computational neuroscience. Cambridge, Mass: MIT Press, 1990.


Book chapters on the topic "Computational Neuroscience"

1

Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus". In Neuroscience in the 21st Century, 3081–95. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4939-3474-4_175.

2

Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus". In Neuroscience in the 21st Century, 1–15. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4614-6434-1_175-1.

3

Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus". In Neuroscience in the 21st Century, 3489–503. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-88832-9_175.

4

Zednik, Carlos. "Computational cognitive neuroscience". In The Routledge Handbook of the Computational Mind, 357–69. Milton Park, Abingdon, Oxon; New York: Routledge, 2018. http://dx.doi.org/10.4324/9781315643670-27.

5

Venugopal, Sharmila, Sharon Crook, Malathi Srivatsan, and Ranu Jung. "Principles of Computational Neuroscience". In Biohybrid Systems, 11–30. Weinheim, Germany: Wiley-VCH Verlag GmbH & Co. KGaA, 2011. http://dx.doi.org/10.1002/9783527639366.ch2.

6

Irvine, Liz. "Simulation in computational neuroscience". In The Routledge Handbook of the Computational Mind, 370–80. Milton Park, Abingdon, Oxon; New York: Routledge, 2018. http://dx.doi.org/10.4324/9781315643670-28.

7

Easttom, Chuck. "Introduction to Computational Neuroscience". In Machine Learning for Neuroscience, 147–71. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003230588-10.

8

Mazzola, Guerino, Maria Mannone, Yan Pang, Margaret O'Brien, and Nathan Torunsky. "Neuroscience and Gestures". In Computational Music Science, 155–61. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-47334-5_18.

9

Druckmann, Shaul, Albert Gidon, and Idan Segev. "Computational Neuroscience: Capturing the Essence". In Neurosciences - From Molecule to Behavior: a university textbook, 671–94. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-10769-6_30.

10

Cheong, Jin Hyun, Eshin Jolly, Sunhae Sul, and Luke J. Chang. "Computational Models in Social Neuroscience". In Computational Models of Brain and Behavior, 229–44. Chichester, UK: John Wiley & Sons, Ltd, 2017. http://dx.doi.org/10.1002/9781119159193.ch17.


Conference papers on the topic "Computational Neuroscience"

1

Kawato, Mitsuo. "Computational Neuroscience and Multiple-Valued Logic". In 2009 39th International Symposium on Multiple-Valued Logic. IEEE, 2009. http://dx.doi.org/10.1109/ismvl.2009.70.

2

Maley, Corey. "Analog Computation in Computational Cognitive Neuroscience". In 2018 Conference on Cognitive Computational Neuroscience. Brentwood, Tennessee, USA: Cognitive Computational Neuroscience, 2018. http://dx.doi.org/10.32470/ccn.2018.1178-0.

3

José Macário Costa, Raimundo, Luís Alfredo Vidal de Carvalho, Emilio Sánchez Miguel, Renata Mousinho, Renato Cerceau, Lizete Pontes Macário Costa, Jorge Zavaleta, Laci Mary Barbosa Manhães, and Sérgio Manuel Serra da Cruz. "Computational Neuroscience - Challenges and Implications for Brazilian Education". In 7th International Conference on Computer Supported Education. SCITEPRESS - Science and Technology Publications, 2015. http://dx.doi.org/10.5220/0005481004360441.

4

Gao, Richard, Dylan Christiano, Tom Donoghue, and Bradley Voytek. "The Structure of Cognition Across Computational Cognitive Neuroscience". In 2019 Conference on Cognitive Computational Neuroscience. Brentwood, Tennessee, USA: Cognitive Computational Neuroscience, 2019. http://dx.doi.org/10.32470/ccn.2019.1426-0.

5

Chateau-Laurent, Hugo, and Frederic Alexandre. "Towards a Computational Cognitive Neuroscience Model of Creativity". In 2021 IEEE 20th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC). IEEE, 2021. http://dx.doi.org/10.1109/iccicc53683.2021.9811309.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
6

Zhang, Wen-Ran. "Six Conjectures in Quantum Physics and Computational Neuroscience". In 2009 Third International Conference on Quantum, Nano and Micro Technologies (ICQNM). IEEE, 2009. http://dx.doi.org/10.1109/icqnm.2009.32.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
7

Tirupattur, Naveen, Christopher C. Lapish, Snehasis Mukhopadhyay, Tuan D. Pham, Xiaobo Zhou, Hiroshi Tanaka, Mayumi Oyama-Higa et al. "Text Mining for Neuroscience". In 2011 INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL MODELS FOR LIFE SCIENCES (CMLS-11). AIP, 2011. http://dx.doi.org/10.1063/1.3596634.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
8

Karimimehr, Saeed. "A novel face recognition system inspired by computational neuroscience". In IEEE EUROCON 2013. IEEE, 2013. http://dx.doi.org/10.1109/eurocon.2013.6731009.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
9

Muzellec, Sabine, Mathieu Chalvidal, Thomas Serre and Rufin VanRullen. "Accurate implementation of computational neuroscience models through neural ODEs". In 2022 Conference on Cognitive Computational Neuroscience. San Francisco, California, USA: Cognitive Computational Neuroscience, 2022. http://dx.doi.org/10.32470/ccn.2022.1165-0.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
10

Karimimehr, Saeed, and Mohammad Reza Yazdchi. "How computational neuroscience could help improving face recognition systems?" In 2014 4th International eConference on Computer and Knowledge Engineering (ICCKE). IEEE, 2014. http://dx.doi.org/10.1109/iccke.2014.6993453.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles

Organization reports on the topic "Computational Neuroscience"

1

Bower, James M., and Christof Koch. Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada231397.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
2

Bower, James M., and Christof Koch. Training in Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, August 1992. http://dx.doi.org/10.21236/ada261806.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
3

Halvorson, Harlyn O. Training in Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, November 1989. http://dx.doi.org/10.21236/ada217018.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
4

Bower, James M., and Christof Koch. Methods in Computational Neuroscience: Marine Biology Laboratory Student Projects. Fort Belvoir, VA: Defense Technical Information Center, November 1988. http://dx.doi.org/10.21236/ada201434.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
5

Schunn, C. D. A Review of Human Spatial Representations: Computational, Neuroscience, Mathematical, Developmental, and Cognitive Psychology Considerations. Fort Belvoir, VA: Defense Technical Information Center, December 2000. http://dx.doi.org/10.21236/ada440864.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
6

Sejnowski, T. Workshop in Computational Neuroscience (8th) held in Woods Hole, Massachusetts on 22-28 August 1992. Fort Belvoir, VA: Defense Technical Information Center, December 1992. http://dx.doi.org/10.21236/ada279786.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
7

Semerikov, Serhiy O., Illia O. Teplytskyi, Yuliia V. Yechkalo and Arnold E. Kiv. Computer Simulation of Neural Networks Using Spreadsheets: The Dawn of the Age of Camelot. [n. p.], November 2018. http://dx.doi.org/10.31812/123456789/2648.

Full text of the source
Annotation:
The article substantiates the need to develop methods for teaching computer simulation of neural networks in the spreadsheet environment. A systematic review of their application to simulating artificial neural networks is performed. The authors distinguish basic approaches to solving the problem of teaching network computer simulation in the spreadsheet environment: joint application of spreadsheets and neural network simulation tools; application of third-party add-ins to spreadsheets; development of macros using the embedded languages of spreadsheets; use of standard spreadsheet add-ins for non-linear optimization; and creation of neural networks in the spreadsheet environment without add-ins or macros. After analyzing a collection of writings from 1890-1950, the research determines the role of the scientific journal "Bulletin of Mathematical Biophysics", its founder Nicolas Rashevsky, and the scientific community around the journal in creating and developing the models and methods of computational neuroscience. The psychophysical bases of creating neural networks, the mathematical foundations of neural computing, and methods of neuroengineering (image recognition in particular) are identified. The role of Walter Pitts in combining the descriptive and quantitative theories of training is discussed. It is shown that to acquire neural simulation competences in the spreadsheet environment, one should master the models based on the historical and genetic approach. Three groups of models are identified as promising for developing the corresponding methods: the continuous two-factor model of Rashevsky, the discrete model of McCulloch and Pitts, and the discrete-continuous models of Householder and Landahl.
APA, Harvard, Vancouver, ISO and other citation styles
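The discrete McCulloch-Pitts model named in the annotation above can be sketched in a few lines: a binary unit fires when the weighted sum of its inputs reaches a threshold. The weights and threshold below are illustrative assumptions, not values taken from the cited article.

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Binary threshold unit: output 1 if the weighted input sum
    meets or exceeds the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0


def and_gate(a, b):
    # Logical AND as a McCulloch-Pitts unit: both inputs must be active
    # for the weighted sum (1*a + 1*b) to reach the threshold of 2.
    return mcculloch_pitts([a, b], [1, 1], 2)


print(and_gate(1, 1))  # 1
print(and_gate(1, 0))  # 0
```

The same unit with threshold 1 would compute logical OR, which is why such threshold units can be composed into the logic networks these historical models describe.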