Ready-made bibliography on the topic "Computational neuroscience"

Create a correct reference in APA, MLA, Chicago, Harvard, and many other citation styles

Choose a source type:

See lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Computational neuroscience".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a ".pdf" file and read its abstract online, if the relevant parameters are available in the metadata.

Journal articles on the topic "Computational neuroscience"

1. Cao, Jinde, Qingshan Liu, Sabri Arik, Jianlong Qiu, Haijun Jiang and Ahmed Elaiw. "Computational Neuroscience". Computational and Mathematical Methods in Medicine 2014 (2014): 1–2. http://dx.doi.org/10.1155/2014/120280.
2. Sejnowski, T., C. Koch and P. Churchland. "Computational neuroscience". Science 241, no. 4871 (September 9, 1988): 1299–306. http://dx.doi.org/10.1126/science.3045969.
3. Sejnowski, Terrence J. "Computational neuroscience". Behavioral and Brain Sciences 9, no. 1 (March 1986): 104–5. http://dx.doi.org/10.1017/s0140525x00021713.
4. Moore, John W. "Computational Neuroscience". Contemporary Psychology: A Journal of Reviews 38, no. 2 (February 1993): 137–39. http://dx.doi.org/10.1037/033019.
5. Ringo, J. L. "Computational Neuroscience". Archives of Neurology 48, no. 2 (February 1, 1991): 130. http://dx.doi.org/10.1001/archneur.1991.00530140018008.
6. Kriegeskorte, Nikolaus, and Pamela K. Douglas. "Cognitive computational neuroscience". Nature Neuroscience 21, no. 9 (August 20, 2018): 1148–60. http://dx.doi.org/10.1038/s41593-018-0210-5.
7. Cecchi, Guillermo A., and James Kozloski. "Preface: Computational neuroscience". IBM Journal of Research and Development 61, no. 2/3 (March 1, 2017): 0:1–0:4. http://dx.doi.org/10.1147/jrd.2017.2690118.
8. Popovych, Oleksandr, Peter Tass and Christian Hauptmann. "Desynchronization (computational neuroscience)". Scholarpedia 6, no. 10 (2011): 1352. http://dx.doi.org/10.4249/scholarpedia.1352.
9. Érdi, Péter. "Teaching computational neuroscience". Cognitive Neurodynamics 9, no. 5 (March 21, 2015): 479–85. http://dx.doi.org/10.1007/s11571-015-9340-6.
10. Becker, Suzanna, and Nathaniel D. Daw. "Computational cognitive neuroscience". Brain Research 1299 (November 2009): 1–2. http://dx.doi.org/10.1016/j.brainres.2009.09.114.

Doctoral dissertations on the topic "Computational neuroscience"

1. Higgins, Irina. "Computational neuroscience of speech recognition". Thesis, University of Oxford, 2015. https://ora.ox.ac.uk/objects/uuid:daa8d096-6534-4174-b63e-cc4161291c90.

Abstract:
Physical variability of speech combined with its perceptual constancy makes speech recognition a challenging task. The human auditory brain, however, is able to perform speech recognition effortlessly. This thesis aims to understand the precise computational mechanisms that allow the auditory brain to do so. In particular, we look for the minimal subset of sub-cortical auditory brain areas that allow the primary auditory cortex to learn 'good representations' of speech-like auditory objects through spike-timing dependent plasticity (STDP) learning mechanisms as described by Bi & Poo (1998). A 'good representation' is defined as that which is informative of the stimulus class regardless of the variability in the raw input, while being less redundant and more compressed than the representations within the auditory nerve, which provides the firing inputs to the rest of the auditory brain hierarchy (Barlow 1961). Neurophysiological studies have provided insights into the architecture and response properties of different areas within the auditory brain hierarchy. We use these insights to guide the development of an unsupervised spiking neural network grounded in the neurophysiology of the auditory brain and equipped with spike-time dependent plasticity (STDP) learning (Bi & Poo 1998). The model was exposed to simple controlled speech-like stimuli (artificially synthesised phonemes and naturally spoken words) to investigate how stable representations that are invariant to the within- and between-speaker differences can emerge in the output area of the model. The output of the model is roughly equivalent to the primary auditory cortex. The aim of the first part of the thesis was to investigate the minimal taxonomy necessary for such representations to emerge through the interactions of the spiking dynamics of the network neurons, their ability to learn through STDP learning and the statistics of the auditory input stimuli.
It was found that sub-cortical pre-processing within the ventral cochlear nucleus and inferior colliculus was necessary to remove jitter inherent to the auditory nerve spike rasters, which would otherwise disrupt STDP learning in the primary auditory cortex. The second half of the thesis investigated the nature of neural encoding used within the primary auditory cortex stage of the model to represent the learnt auditory object categories. It was found that single-cell binary encoding (DeWeese & Zador 2003) was sufficient to represent two synthesised vowel classes, while more complex population encoding using precisely timed spikes within polychronous chains (Izhikevich 2006) represented more complex naturally spoken words in a speaker-invariant manner.
2. Walters, Daniel Matthew. "The computational neuroscience of head direction cells". Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:d4afe06a-d44f-4a24-99a3-d0e0a2911459.

Abstract:
Head direction cells signal the orientation of the head in the horizontal plane. This thesis shows how some of the known head direction cell response properties might develop through learning. The research methodology employed is the computer simulation of neural network models of head direction cells that self-organize through learning. The preferred firing directions of head direction cells will change in response to the manipulation of distal visual cues, but not in response to the manipulation of proximal visual cues. Simulation results are presented of neural network models that learn to form separate representations of distal and proximal visual cues that are presented simultaneously as visual input to the network. These results demonstrate the computation required for a subpopulation of head direction cells to learn to preferentially respond to distal visual cues. Within a population of head direction cells, the angular distance between the preferred firing directions of any two cells is maintained across different environments. It is shown how a neural network model can learn to maintain the angular distance between the learned preferred firing directions of head direction cells across two different visual training environments. A population of head direction cells can update the population representation of the current head direction, in the absence of visual input, using internal idiothetic (self-generated) motion signals alone. This is called the path integration of head direction. It is important that the head direction cell system updates its internal representation of head direction at the same speed as the animal is rotating its head. Neural network models are simulated that learn to perform the path integration of head direction, using solely idiothetic signals, at the same speed as the head is rotating.
3. Cronin, Beau D. "Quantifying uncertainty in computational neuroscience with Bayesian statistical inference". Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/45336.

Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2008.
Includes bibliographical references (p. 101-106).
Two key fields of computational neuroscience involve, respectively, the analysis of experimental recordings to understand the functional properties of neurons, and modeling how neurons and networks process sensory information in order to represent the environment. In both of these endeavors, it is crucial to understand and quantify uncertainty - when describing how the brain itself draws conclusions about the physical world, and when the experimenter interprets neuronal data. Bayesian modeling and inference methods provide many advantages for doing so. Three projects are presented that illustrate the advantages of the Bayesian approach. In the first, Markov chain Monte Carlo (MCMC) sampling methods were used to answer a range of scientific questions that arise in the analysis of physiological data from tuning curve experiments; in addition, a software toolbox is described that makes these methods widely accessible. In the second project, the model developed in the first project was extended to describe the detailed dynamics of orientation tuning in neurons in cat primary visual cortex. Using more sophisticated sampling-based inference methods, this model was applied to answer specific scientific questions about the tuning properties of a recorded population. The final project uses a Bayesian model to provide a normative explanation of sensory adaptation phenomena. The model was able to explain a range of detailed physiological adaptation phenomena.
4. Stevens, Martin. "Animal camouflage, receiver psychology and the computational neuroscience of avian vision". Thesis, University of Bristol, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432958.
5. Tromans, James Matthew. "Computational neuroscience of natural scene processing in the ventral visual pathway". Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:b82e1332-df7b-41db-9612-879c7a7dda39.

Abstract:
Neural responses in the primate ventral visual system become more complex in the later stages of the pathway. For example, not only do neurons in IT cortex respond to complete objects, they also learn to respond invariantly with respect to the viewing angle of an object and also with respect to the location of an object. These types of neural responses have helped guide past research with VisNet, a computational model of the primate ventral visual pathway that self-organises during learning. In particular, previous research has focussed on presenting to the model one object at a time during training, and has placed emphasis on the transform invariant response properties of the output neurons of the model that consequently develop. This doctoral thesis extends previous VisNet research and investigates the performance of the model with a range of more challenging and ecologically valid training paradigms. For example, when multiple objects are presented to the network during training, or when objects partially occlude one another during training. The different mechanisms that help output neurons to develop object selective, transform invariant responses during learning are proposed and explored. Such mechanisms include the statistical decoupling of objects through multiple object pairings, and the separation of object representations by independent motion. Consideration is also given to the heterogeneous response properties of neurons that develop during learning. For example, although IT neurons demonstrate a number of differing invariances, they also convey spatial information and view specific information about the objects presented on the retina. An updated, scaled-up version of the VisNet model, with a significantly larger retina, is introduced in order to explore these heterogeneous neural response properties.
6. Vellmer, Sebastian. "Applications of the Fokker-Planck Equation in Computational and Cognitive Neuroscience". Doctoral thesis, Humboldt-Universität zu Berlin, 2020. http://dx.doi.org/10.18452/21597.

Abstract:
This thesis is concerned with the calculation of statistics, in particular the power spectra, of point processes generated by stochastic multidimensional integrate-and-fire (IF) neurons, networks of IF neurons and decision-making models from the corresponding Fokker-Planck equations. In the brain, information is encoded by sequences of action potentials. In studies that focus on spike timing, IF neurons that drastically simplify the spike generation have become the standard model. One-dimensional IF neurons do not suffice to accurately model neural dynamics, however, the extension towards multiple dimensions yields realistic behavior at the price of growing complexity. The first part of this work develops a theory of spike-train power spectra for stochastic, multidimensional IF neurons. From the Fokker-Planck equation, a set of partial differential equations is derived that describes the stationary probability density, the firing rate and the spike-train power spectrum. In the second part of this work, a mean-field theory of large and sparsely connected homogeneous networks of spiking neurons is developed that takes into account the self-consistent temporal correlations of spike trains. Neural input is approximated by colored Gaussian noise generated by a multidimensional Ornstein-Uhlenbeck process of which the coefficients are initially unknown but determined by the self-consistency condition and define the solution of the theory. To explore heterogeneous networks, an iterative scheme is extended to determine the distribution of spectra. In the third part, the Fokker-Planck equation is applied to calculate the statistics of sequences of binary decisions from diffusion-decision models (DDM). For the analytically tractable DDM, the statistics are calculated from the corresponding Fokker-Planck equation. To determine the statistics for nonlinear models, the threshold-integration method is generalized.
7. Zhu, Mengchen. "Sparse coding models of neural response in the primary visual cortex". Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53868.

Abstract:
Sparse coding is an influential unsupervised learning approach proposed as a theoretical model of the encoding process in the primary visual cortex (V1). While sparse coding has been successful in explaining classical receptive field properties of simple cells, it was unclear whether it can account for more complex response properties in a variety of cell types. In this dissertation, we demonstrate that sparse coding and its variants are consistent with key aspects of neural response in V1, including many contextual and nonlinear effects, a number of inhibitory interneuron properties, as well as the variance and correlation distributions in the population response. The results suggest that important response properties in V1 can be interpreted as emergent effects of a neural population efficiently representing the statistical structures of natural scenes under resource constraints. Based on the models, we make predictions of the circuit structure and response properties in V1 that can be verified by future experiments.
8. Woldman, Wessel. "Emergent phenomena from dynamic network models: mathematical analysis of EEG from people with IGE". Thesis, University of Exeter, 2016. http://hdl.handle.net/10871/23297.

Abstract:
In this thesis mathematical techniques and models are applied to electroencephalographic (EEG) recordings to study mechanisms of idiopathic generalised epilepsy (IGE). First, we compare network structures derived from resting-state EEG from people with IGE, their unaffected relatives, and healthy controls. Next, these static networks are combined with a dynamical model describing the activity of a cortical region as a population of phase-oscillators. We then examine the potential of the differences found in the static networks and the emergent properties of the dynamic network as individual biomarkers of IGE. The emphasis of this approach is on discerning the potential of these markers at the level of an individual subject rather than their ability to identify differences at a group level. Finally, we extend a dynamic model of seizure onset to investigate how epileptiform discharges vary over the course of the day in ambulatory EEG recordings from people with IGE. By perturbing the dynamics describing the excitability of the system, we demonstrate the model can reproduce discharge distributions on an individual level which are shown to express a circadian tone. The emphasis of the model approach is on understanding how changes in excitability within brain regions, modulated by sleep, metabolism, endocrine axes, or anti-epileptic drugs (AEDs), can drive the emergence of epileptiform activity in large-scale brain networks. Our results demonstrate that studying EEG recordings from people with IGE can lead to new mechanistic insight on the idiopathic nature of IGE, and may eventually lead to clinical applications. We show that biomarkers derived from dynamic network models perform significantly better as classifiers than biomarkers based on static network properties.
Hence, our results provide additional evidence that the interplay between the dynamics of specific brain regions, and the network topology governing the interactions between these regions, is crucial in the generation of emergent epileptiform activity. Pathological activity may emerge due to abnormalities in either of those factors, or a combination of both, and hence it is essential to develop new techniques to characterise this interplay theoretically and to validate predictions experimentally.
9. Nguyen, Harrison Tri Tue. "Computational Neuroscience with Deep Learning for Brain Imaging Analysis and Behaviour Classification". Thesis, The University of Sydney, 2021. https://hdl.handle.net/2123/27313.

Abstract:
Recent advances in artificial neural networks and deep learning models have produced significant results in problems related to neuroscience. For example, deep learning models have demonstrated superior performance in non-linear, multivariate pattern classification problems such as Alzheimer’s disease classification, brain lesion segmentation, skull stripping and brain age prediction. Deep learning provides unique advantages for high-dimensional data such as MRI data, since it does not require extensive feature engineering. The thesis investigates three problems related to neuroscience and discusses solutions to those scenarios. MRI has been used to analyse the structure of the brain and its pathology. However, heterogeneity across scanners and MRI protocols, and variation in site thermal and power stability, can introduce scanning differences and artefacts for the same individual undergoing different scans. Therefore combining images from different sites or even different days can introduce biases that obscure the signal of interest or can produce results that could be driven by these differences. An algorithm, the CycleGAN, will be presented and analysed which uses generative adversarial networks to transform a set of images from a given MRI site into images with characteristics of a different MRI site. Secondly, the MRI scans of the brain can come in the form of different modalities such as T1-weighted and FLAIR, which have been used to investigate a wide range of neurological disorders. The acquisition of all of these modalities is expensive, time-consuming and inconvenient, and the required modalities are often not available. As a result, these datasets contain large amounts of unpaired data, where examples in the dataset do not contain all modalities. On the other hand, there is a smaller fraction of examples that contain all modalities (paired data).
This thesis presents a method to address the issue of translating between two neuroimaging modalities with a dataset of unpaired and paired examples, in a semi-supervised learning framework. Lastly, behavioural modelling will be considered, where it is associated with an impressive range of decision-making tasks that are designed to index sub-components of psychological and neural computations that are distinct across groups of people, including people with an underlying disease. The thesis proposes a method that learns prototypical behaviours of each population in the form of readily interpretable subsequences of choices, and classifies subjects by finding signatures of these prototypes in their behaviour.
10. Lundh, Dan. "A computational neuroscientific model for short-term memory". Thesis, University of Exeter, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.324742.

Books on the topic "Computational neuroscience"

1. Ribeiro, Paulo Rogério de Almeida, Vinícius Rosa Cota, Dante Augusto Couto Barone and Alexandre César Muniz de Oliveira, eds. Computational Neuroscience. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08443-0.
2. Cota, Vinícius Rosa, Dante Augusto Couto Barone, Diego Roberto Colombo Dias and Laila Cristina Moreira Damázio, eds. Computational Neuroscience. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36636-0.
3. Bower, James M., ed. Computational Neuroscience. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4757-9800-5.
4. Chaovalitwongse, Wanpracha, Panos M. Pardalos and Petros Xanthopoulos, eds. Computational Neuroscience. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-0-387-88630-5.
5. Barone, Dante Augusto Couto, Eduardo Oliveira Teles and Christian Puhlmann Brackmann, eds. Computational Neuroscience. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-71011-2.
6. Mallot, Hanspeter A. Computational Neuroscience. Heidelberg: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-00861-5.
7. Bower, James M., ed. Computational Neuroscience. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4615-4831-7.
8. Stoyanov, Drozdstoy, Bogdan Draganski, Paolo Brambilla and Claus Lamm, eds. Computational Neuroscience. New York, NY: Springer US, 2023. http://dx.doi.org/10.1007/978-1-0716-3230-7.
9. Riascos Salas, Jaime A., Vinícius Rosa Cota, Hernán Villota and Daniel Betancur Vasquez, eds. Computational Neuroscience. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-63848-0.
10. Pardalos, P. M. Computational neuroscience. New York: Springer, 2010.

Book chapters on the topic "Computational neuroscience"

1. Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus". In Neuroscience in the 21st Century, 3081–95. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4939-3474-4_175.
2. Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus". In Neuroscience in the 21st Century, 1–15. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4614-6434-1_175-1.
3. Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus". In Neuroscience in the 21st Century, 3489–503. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-88832-9_175.
4. Zednik, Carlos. "Computational cognitive neuroscience". In The Routledge Handbook of the Computational Mind, 357–69. Milton Park, Abingdon, Oxon; New York: Routledge, 2018. http://dx.doi.org/10.4324/9781315643670-27.
5. Mazzola, Guerino, Maria Mannone, Yan Pang, Margaret O’Brien and Nathan Torunsky. "Neuroscience and Gestures". In Computational Music Science, 155–61. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-47334-5_18.
6. Venugopal, Sharmila, Sharon Crook, Malathi Srivatsan and Ranu Jung. "Principles of Computational Neuroscience". In Biohybrid Systems, 11–30. Weinheim, Germany: Wiley-VCH Verlag GmbH & Co. KGaA, 2011. http://dx.doi.org/10.1002/9783527639366.ch2.
7. Irvine, Liz. "Simulation in computational neuroscience". In The Routledge Handbook of the Computational Mind, 370–80. Milton Park, Abingdon, Oxon; New York: Routledge, 2018. http://dx.doi.org/10.4324/9781315643670-28.
8. Easttom, Chuck. "Introduction to Computational Neuroscience". In Machine Learning for Neuroscience, 147–71. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003230588-10.
9. DeWan, Andrew, Lana C. Rutherford and Gina G. Turrigiano. "Activity-Dependent Regulation of Inhibition in Visual Cortical Cultures". In Computational Neuroscience, 3–6. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4757-9800-5_1.
10. Chitwood, Raymond A., Brenda J. Claiborne and David B. Jaffe. "Modeling the Passive Properties of Nonpyramidal Neurons in Hippocampal Area CA3". In Computational Neuroscience, 59–64. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4757-9800-5_10.

Conference papers on the topic "Computational neuroscience"

1. Maley, Corey. "Analog Computation in Computational Cognitive Neuroscience". In 2018 Conference on Cognitive Computational Neuroscience. Brentwood, Tennessee, USA: Cognitive Computational Neuroscience, 2018. http://dx.doi.org/10.32470/ccn.2018.1178-0.
2. Piccinini, Gualtiero. "Non-Computational Functionalism: Computation and the Function of Consciousness". In 2018 Conference on Cognitive Computational Neuroscience. Brentwood, Tennessee, USA: Cognitive Computational Neuroscience, 2018. http://dx.doi.org/10.32470/ccn.2018.1022-0.
3. Kawato, Mitsuo. "Computational Neuroscience and Multiple-Valued Logic". In 2009 39th International Symposium on Multiple-Valued Logic. IEEE, 2009. http://dx.doi.org/10.1109/ismvl.2009.70.
4. Tirupattur, Naveen, Christopher C. Lapish, Snehasis Mukhopadhyay, Tuan D. Pham, Xiaobo Zhou, Hiroshi Tanaka, Mayumi Oyama-Higa et al. "Text Mining for Neuroscience". In 2011 INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL MODELS FOR LIFE SCIENCES (CMLS-11). AIP, 2011. http://dx.doi.org/10.1063/1.3596634.
5. Gao, Richard, Dylan Christiano, Tom Donoghue and Bradley Voytek. "The Structure of Cognition Across Computational Cognitive Neuroscience". In 2019 Conference on Cognitive Computational Neuroscience. Brentwood, Tennessee, USA: Cognitive Computational Neuroscience, 2019. http://dx.doi.org/10.32470/ccn.2019.1426-0.
6

José Macário Costa, Raimundo, Luís Alfredo Vidal de Carvalho, Emilio Sánchez Miguel, Renata Mousinho, Renato Cerceau, Lizete Pontes Macário Costa, Jorge Zavaleta, Laci Mary Barbosa Manhães i Sérgio Manuel Serra da Cruz. "Computational Neuroscience - Challenges and Implications for Brazilian Education". W 7th International Conference on Computer Supported Education. SCITEPRESS - Science and and Technology Publications, 2015. http://dx.doi.org/10.5220/0005481004360441.

Pełny tekst źródła
Style APA, Harvard, Vancouver, ISO itp.
7. Chateau-Laurent, Hugo, and Frederic Alexandre. "Towards a Computational Cognitive Neuroscience Model of Creativity". In 2021 IEEE 20th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC). IEEE, 2021. http://dx.doi.org/10.1109/iccicc53683.2021.9811309.
8

Zhang, Wen-Ran. "Six Conjectures in Quantum Physics and Computational Neuroscience". W 2009 Third International Conference on Quantum, Nano and Micro Technologies (ICQNM). IEEE, 2009. http://dx.doi.org/10.1109/icqnm.2009.32.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Anllo, Hernan, Gil Salamander, Stefano Palminteri, Nichola Raihani and Uri Hertz. "Computational drivers of advice-giving". In 2023 Conference on Cognitive Computational Neuroscience. Oxford, United Kingdom: Cognitive Computational Neuroscience, 2023. http://dx.doi.org/10.32470/ccn.2023.1367-0.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.
10

Muzellec, Sabine, Mathieu Chalvidal, Thomas Serre and Rufin VanRullen. "Accurate implementation of computational neuroscience models through neural ODEs". In 2022 Conference on Cognitive Computational Neuroscience. San Francisco, California, USA: Cognitive Computational Neuroscience, 2022. http://dx.doi.org/10.32470/ccn.2022.1165-0.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.

Organizational reports on the topic "Computational neuroscience"

1

Bower, James M., and Christof Koch. Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada231397.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Bower, James M., and Christof Koch. Training in Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, August 1992. http://dx.doi.org/10.21236/ada261806.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Halvorson, Harlyn O. Training in Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, November 1989. http://dx.doi.org/10.21236/ada217018.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Bower, James M., and Christof Koch. Methods in Computational Neuroscience: Marine Biology Laboratory Student Projects. Fort Belvoir, VA: Defense Technical Information Center, November 1988. http://dx.doi.org/10.21236/ada201434.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Schunn, C. D. A Review of Human Spatial Representations: Computational, Neuroscience, Mathematical, Developmental, and Cognitive Psychology Considerations. Fort Belvoir, VA: Defense Technical Information Center, December 2000. http://dx.doi.org/10.21236/ada440864.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Sejnowski, T. Workshop in Computational Neuroscience (8th) held in Woods Hole, Massachusetts on 22-28 August 1992. Fort Belvoir, VA: Defense Technical Information Center, December 1992. http://dx.doi.org/10.21236/ada279786.

Full source text
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Semerikov, Serhiy O., Illia O. Teplytskyi, Yuliia V. Yechkalo and Arnold E. Kiv. Computer Simulation of Neural Networks Using Spreadsheets: The Dawn of the Age of Camelot. [n. p.], November 2018. http://dx.doi.org/10.31812/123456789/2648.

Full source text
Abstract:
The article substantiates the necessity of developing training methods for computer simulation of neural networks in the spreadsheet environment. A systematic review of spreadsheet applications to simulating artificial neural networks is performed. The authors distinguish basic approaches to network computer simulation training in the spreadsheet environment: joint application of spreadsheets and neural network simulation tools; application of third-party spreadsheet add-ins; development of macros using the embedded languages of spreadsheets; use of standard spreadsheet add-ins for non-linear optimization; and creation of neural networks in the spreadsheet environment without add-ins or macros. After analyzing a collection of writings from 1890-1950, the research determines the role of the scientific journal "Bulletin of Mathematical Biophysics", its founder Nicolas Rashevsky and the scientific community around the journal in creating and developing the models and methods of computational neuroscience. The psychophysical basics of creating neural networks, the mathematical foundations of neural computing and the methods of neuroengineering (image recognition, in particular) are identified. The role of Walter Pitts in combining the descriptive and quantitative theories of training is discussed. It is shown that to acquire neural simulation competences in the spreadsheet environment, one should master the models following a historical-genetic approach. Three groups of models are promising in terms of developing corresponding methods: the continuous two-factor model of Rashevsky, the discrete model of McCulloch and Pitts, and the discrete-continuous models of Householder and Landahl.
Styles: APA, Harvard, Vancouver, ISO, etc.
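The discrete McCulloch-Pitts model named in the abstract above can be sketched in a few lines; this is a generic illustration of the threshold-neuron idea, not code taken from the cited report, and all names here are the author's own choices.

```python
# Minimal sketch of a McCulloch-Pitts threshold neuron: the unit fires
# (outputs 1) exactly when the weighted sum of its inputs reaches the
# threshold, mirroring the discrete model discussed in the abstract.

def mcculloch_pitts(inputs, weights, threshold):
    """Return 1 if the weighted input sum reaches the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# A two-input neuron with unit weights and threshold 2 computes logical AND,
# one of the Boolean functions such networks were originally shown to realize.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", mcculloch_pitts((a, b), (1, 1), 2))
```

The same ease of expressing the model as a single weighted-sum-and-compare step is what makes it a natural fit for spreadsheet formulas, as the reviewed training methods exploit.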
