Academic literature on the topic 'Computational neuroscience'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Computational neuroscience.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Computational neuroscience"

1. Cao, Jinde, Qingshan Liu, Sabri Arik, Jianlong Qiu, Haijun Jiang, and Ahmed Elaiw. "Computational Neuroscience." Computational and Mathematical Methods in Medicine 2014 (2014): 1–2. http://dx.doi.org/10.1155/2014/120280.

2. Sejnowski, T., C. Koch, and P. Churchland. "Computational neuroscience." Science 241, no. 4871 (September 9, 1988): 1299–306. http://dx.doi.org/10.1126/science.3045969.

3. Sejnowski, Terrence J. "Computational neuroscience." Behavioral and Brain Sciences 9, no. 1 (March 1986): 104–5. http://dx.doi.org/10.1017/s0140525x00021713.

4. Moore, John W. "Computational Neuroscience." Contemporary Psychology: A Journal of Reviews 38, no. 2 (February 1993): 137–39. http://dx.doi.org/10.1037/033019.

5. Ringo, J. L. "Computational Neuroscience." Archives of Neurology 48, no. 2 (February 1, 1991): 130. http://dx.doi.org/10.1001/archneur.1991.00530140018008.

6. Kriegeskorte, Nikolaus, and Pamela K. Douglas. "Cognitive computational neuroscience." Nature Neuroscience 21, no. 9 (August 20, 2018): 1148–60. http://dx.doi.org/10.1038/s41593-018-0210-5.

7. Cecchi, Guillermo A., and James Kozloski. "Preface: Computational neuroscience." IBM Journal of Research and Development 61, no. 2/3 (March 1, 2017): 0:1–0:4. http://dx.doi.org/10.1147/jrd.2017.2690118.

8. Popovych, Oleksandr, Peter Tass, and Christian Hauptmann. "Desynchronization (computational neuroscience)." Scholarpedia 6, no. 10 (2011): 1352. http://dx.doi.org/10.4249/scholarpedia.1352.

9. Érdi, Péter. "Teaching computational neuroscience." Cognitive Neurodynamics 9, no. 5 (March 21, 2015): 479–85. http://dx.doi.org/10.1007/s11571-015-9340-6.

10. Becker, Suzanna, and Nathaniel D. Daw. "Computational cognitive neuroscience." Brain Research 1299 (November 2009): 1–2. http://dx.doi.org/10.1016/j.brainres.2009.09.114.

Dissertations / Theses on the topic "Computational neuroscience"

1. Higgins, Irina. "Computational neuroscience of speech recognition." Thesis, University of Oxford, 2015. https://ora.ox.ac.uk/objects/uuid:daa8d096-6534-4174-b63e-cc4161291c90.

Abstract: The physical variability of speech combined with its perceptual constancy makes speech recognition a challenging task. The human auditory brain, however, is able to perform speech recognition effortlessly. This thesis aims to understand the precise computational mechanisms that allow the auditory brain to do so. In particular, we look for the minimal subset of sub-cortical auditory brain areas that allow the primary auditory cortex to learn 'good representations' of speech-like auditory objects through spike-timing dependent plasticity (STDP) learning mechanisms as described by Bi & Poo (1998). A 'good representation' is defined as one that is informative of the stimulus class regardless of the variability in the raw input, while being less redundant and more compressed than the representations within the auditory nerve, which provides the firing inputs to the rest of the auditory brain hierarchy (Barlow 1961). Neurophysiological studies have provided insights into the architecture and response properties of different areas within the auditory brain hierarchy. We use these insights to guide the development of an unsupervised spiking neural network grounded in the neurophysiology of the auditory brain and equipped with STDP learning (Bi & Poo 1998). The model was exposed to simple controlled speech-like stimuli (artificially synthesised phonemes and naturally spoken words) to investigate how stable representations that are invariant to within- and between-speaker differences can emerge in the output area of the model. The output of the model is roughly equivalent to the primary auditory cortex. The first part of the thesis investigates the minimal taxonomy necessary for such representations to emerge through the interactions of the spiking dynamics of the network neurons, their capacity for STDP learning, and the statistics of the auditory input stimuli. It was found that sub-cortical pre-processing within the ventral cochlear nucleus and inferior colliculus was necessary to remove jitter inherent to the auditory nerve spike rasters, which would otherwise disrupt STDP learning in the primary auditory cortex. The second half of the thesis investigates the nature of the neural encoding used within the primary auditory cortex stage of the model to represent the learnt auditory object categories. It was found that single-cell binary encoding (DeWeese & Zador 2003) was sufficient to represent two synthesised vowel classes; however, more complex population encoding using precisely timed spikes within polychronous chains (Izhikevich 2006) represented more complex naturally spoken words in a speaker-invariant manner.
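The pair-based STDP mechanism this abstract relies on can be stated in a few lines. Below is a minimal Python sketch, assuming the exponential potentiation/depression windows reported by Bi & Poo (1998); the amplitudes and time constants are illustrative placeholders, not the thesis's parameters.

```python
import numpy as np

def stdp_weight_change(delta_t, a_plus=0.01, a_minus=0.012,
                       tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP: weight change as a function of
    delta_t = t_post - t_pre (ms). Potentiation when the
    presynaptic spike precedes the postsynaptic one."""
    if delta_t > 0:
        return a_plus * np.exp(-delta_t / tau_plus)
    return -a_minus * np.exp(delta_t / tau_minus)

# A pre-before-post pairing at +10 ms strengthens the synapse (LTP),
# while post-before-pre at -10 ms weakens it (LTD).
print(stdp_weight_change(10.0))   # > 0
print(stdp_weight_change(-10.0))  # < 0
```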
2. Walters, Daniel Matthew. "The computational neuroscience of head direction cells." Thesis, University of Oxford, 2011. http://ora.ox.ac.uk/objects/uuid:d4afe06a-d44f-4a24-99a3-d0e0a2911459.

Abstract: Head direction cells signal the orientation of the head in the horizontal plane. This thesis shows how some of the known head direction cell response properties might develop through learning. The research methodology employed is the computer simulation of neural network models of head direction cells that self-organize through learning. The preferred firing directions of head direction cells will change in response to the manipulation of distal visual cues, but not in response to the manipulation of proximal visual cues. Simulation results are presented of neural network models that learn to form separate representations of distal and proximal visual cues that are presented simultaneously as visual input to the network. These results demonstrate the computation required for a subpopulation of head direction cells to learn to preferentially respond to distal visual cues. Within a population of head direction cells, the angular distance between the preferred firing directions of any two cells is maintained across different environments. It is shown how a neural network model can learn to maintain the angular distance between the learned preferred firing directions of head direction cells across two different visual training environments. A population of head direction cells can update the population representation of the current head direction, in the absence of visual input, using internal idiothetic (self-generated) motion signals alone. This is called the path integration of head direction. It is important that the head direction cell system updates its internal representation of head direction at the same speed as the animal is rotating its head. Neural network models are simulated that learn to perform the path integration of head direction, using solely idiothetic signals, at the same speed as the head is rotating.
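The path integration described above can be illustrated with a toy population model: cells with von Mises tuning whose represented heading is updated from an angular velocity signal and read out by population vector decoding. This is a simplified sketch under assumed parameters, not the thesis's self-organising network.

```python
import numpy as np

# Preferred firing directions of a population of head direction cells.
n_cells = 64
preferred = np.linspace(0, 2 * np.pi, n_cells, endpoint=False)

def population_activity(heading, kappa=4.0):
    """Von Mises tuning: each cell fires most when the heading
    matches its preferred direction."""
    return np.exp(kappa * np.cos(preferred - heading))

def decode_heading(activity):
    """Population vector decoding of the represented direction."""
    return np.angle(np.sum(activity * np.exp(1j * preferred)))

# Path integration in the dark: update the represented heading
# from an idiothetic angular velocity signal alone.
heading = 0.0
dt, omega = 0.01, np.pi / 2            # 90 deg/s head rotation
for _ in range(100):                   # integrate for 1 s
    heading += omega * dt
activity = population_activity(heading)
print(np.degrees(decode_heading(activity)))  # ~90 degrees
```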
3. Cronin, Beau D. "Quantifying uncertainty in computational neuroscience with Bayesian statistical inference." Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2008. http://hdl.handle.net/1721.1/45336.

Abstract: Two key fields of computational neuroscience involve, respectively, the analysis of experimental recordings to understand the functional properties of neurons, and modeling how neurons and networks process sensory information in order to represent the environment. In both of these endeavors, it is crucial to understand and quantify uncertainty, both when describing how the brain itself draws conclusions about the physical world and when the experimenter interprets neuronal data. Bayesian modeling and inference methods provide many advantages for doing so. Three projects are presented that illustrate the advantages of the Bayesian approach. In the first, Markov chain Monte Carlo (MCMC) sampling methods were used to answer a range of scientific questions that arise in the analysis of physiological data from tuning curve experiments; in addition, a software toolbox is described that makes these methods widely accessible. In the second project, the model developed in the first project was extended to describe the detailed dynamics of orientation tuning in neurons in cat primary visual cortex. Using more sophisticated sampling-based inference methods, this model was applied to answer specific scientific questions about the tuning properties of a recorded population. The final project uses a Bayesian model to provide a normative explanation of sensory adaptation phenomena. The model was able to explain a range of detailed physiological adaptation phenomena.
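As a flavour of the approach, the sketch below fits a hypothetical von Mises orientation tuning curve to Poisson spike counts by random-walk Metropolis sampling. The data, model, and parameters are invented for illustration; this is not the thesis's toolbox.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical experiment: spike counts at several stimulus orientations.
thetas = np.linspace(0, np.pi, 8)
counts = rng.poisson(5 * np.exp(2 * np.cos(2 * (thetas - 0.8))))

def log_posterior(params):
    """Log posterior (flat priors) of a von Mises tuning curve
    with Poisson observations."""
    gain, kappa, pref = params
    if gain <= 0 or kappa <= 0:
        return -np.inf
    rate = gain * np.exp(kappa * np.cos(2 * (thetas - pref)))
    return np.sum(counts * np.log(rate) - rate)

# Random-walk Metropolis over (gain, kappa, preferred orientation).
chain, current = [], np.array([1.0, 1.0, 0.0])
for _ in range(5000):
    proposal = current + rng.normal(scale=0.05, size=3)
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(current):
        current = proposal
    chain.append(current.copy())
samples = np.array(chain[1000:])   # discard burn-in
print(samples.mean(axis=0))        # posterior means; spread quantifies uncertainty
```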
4. Stevens, Martin. "Animal camouflage, receiver psychology and the computational neuroscience of avian vision." Thesis, University of Bristol, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432958.
5. Tromans, James Matthew. "Computational neuroscience of natural scene processing in the ventral visual pathway." Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:b82e1332-df7b-41db-9612-879c7a7dda39.

Abstract: Neural responses in the primate ventral visual system become more complex in the later stages of the pathway. For example, not only do neurons in IT cortex respond to complete objects, they also learn to respond invariantly with respect to the viewing angle of an object and also with respect to the location of an object. These types of neural responses have helped guide past research with VisNet, a computational model of the primate ventral visual pathway that self-organises during learning. In particular, previous research has focussed on presenting one object at a time to the model during training, and has placed emphasis on the transform invariant response properties of the output neurons of the model that consequently develop. This doctoral thesis extends previous VisNet research and investigates the performance of the model with a range of more challenging and ecologically valid training paradigms, for example, when multiple objects are presented to the network during training, or when objects partially occlude one another during training. The different mechanisms that help output neurons to develop object selective, transform invariant responses during learning are proposed and explored. Such mechanisms include the statistical decoupling of objects through multiple object pairings, and the separation of object representations by independent motion. Consideration is also given to the heterogeneous response properties of neurons that develop during learning. For example, although IT neurons demonstrate a number of differing invariances, they also convey spatial information and view-specific information about the objects presented on the retina. An updated, scaled-up version of the VisNet model, with a significantly larger retina, is introduced in order to explore these heterogeneous neural response properties.
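Models in the VisNet family are commonly trained with a Hebbian trace learning rule, in which a low-pass-filtered trace of postsynaptic activity binds successive transforms of an object to the same output neurons. The abstract does not spell out the exact update, so the following generic sketch, with assumed parameters, is an illustration rather than the thesis's code.

```python
import numpy as np

def trace_learning_step(w, x, y_trace, y_now, eta=0.01, tau=0.8):
    """One update of a Hebbian trace rule: the synaptic change pairs
    the current input x with a running trace of postsynaptic activity,
    binding successive transforms of the same object together."""
    y_trace = tau * y_trace + (1 - tau) * y_now    # temporal trace
    w = w + eta * y_trace[:, None] * x[None, :]    # Hebbian update
    w /= np.linalg.norm(w, axis=1, keepdims=True)  # keep weights bounded
    return w, y_trace

# Toy usage: present successive transforms (views) of one object.
rng = np.random.default_rng(1)
w = rng.random((10, 50))            # 10 output cells, 50 input fibres
y_trace = np.zeros(10)
for view in rng.random((5, 50)):    # 5 transforms of the same object
    y_now = w @ view                # linear activation
    w, y_trace = trace_learning_step(w, view, y_trace, y_now)
```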
6. Vellmer, Sebastian. "Applications of the Fokker-Planck Equation in Computational and Cognitive Neuroscience." Doctoral thesis, Humboldt-Universität zu Berlin, 2020. http://dx.doi.org/10.18452/21597.

Abstract: This thesis is concerned with the calculation of the statistics, in particular the power spectra, of point processes generated by stochastic multidimensional integrate-and-fire (IF) neurons, networks of IF neurons, and decision-making models, from the corresponding Fokker-Planck equations. In the brain, information is encoded by sequences of action potentials. In studies that focus on spike timing, IF neurons, which drastically simplify the spike generation, have become the standard model. One-dimensional IF neurons often do not suffice to accurately model neural dynamics; the extension towards multiple dimensions, however, yields realistic behavior at the price of growing complexity. The first part of this work develops a theory of spike-train power spectra for stochastic, multidimensional IF neurons. From the Fokker-Planck equation, a set of partial differential equations is derived that describes the stationary probability density, the firing rate and the spike-train power spectrum. In the second part, a mean-field theory of large, sparsely connected, homogeneous networks of spiking neurons is developed that takes into account the self-consistent temporal correlations of spike trains. Neural input is approximated by colored Gaussian noise generated by a multidimensional Ornstein-Uhlenbeck process, whose coefficients are initially unknown but are determined by the self-consistency condition and define the solution of the theory. To explore heterogeneous networks, an iterative scheme is extended to determine the distribution of spectra. In the third part, the Fokker-Planck equation is applied to calculate the statistics of sequences of binary decisions from diffusion-decision models (DDM). For the analytically tractable DDM, the statistics are calculated from the corresponding Fokker-Planck equation. To determine the statistics for nonlinear models, the threshold-integration method is generalized.
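The central quantity of this thesis, the spike-train power spectrum of an IF neuron, can also be estimated by direct simulation rather than from the Fokker-Planck equation. A minimal sketch follows, assuming a leaky IF neuron driven by white noise; all parameters are illustrative.

```python
import numpy as np

# Leaky integrate-and-fire neuron driven by white noise (Euler-Maruyama).
dt, T = 0.01, 2000.0                  # time step and duration (ms)
n = int(T / dt)
mu, D = 1.2, 0.5                      # mean drive and noise intensity
v, spikes = 0.0, np.zeros(n)
rng = np.random.default_rng(0)
for i in range(n):
    v += (mu - v) * dt + np.sqrt(2 * D * dt) * rng.normal()
    if v >= 1.0:                      # threshold crossing -> spike
        spikes[i] = 1.0 / dt          # delta spike of unit area
        v = 0.0                       # reset

# Spike-train power spectrum from the periodogram of the spike train.
x = spikes - spikes.mean()
spectrum = np.abs(np.fft.rfft(x)) ** 2 * dt / n
freqs = np.fft.rfftfreq(n, d=dt)      # in kHz, since time is in ms
print(freqs[np.argmax(spectrum[1:]) + 1])  # rough spectral peak (~firing rate)
```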
7. Zhu, Mengchen. "Sparse coding models of neural response in the primary visual cortex." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53868.

Abstract: Sparse coding is an influential unsupervised learning approach proposed as a theoretical model of the encoding process in the primary visual cortex (V1). While sparse coding has been successful in explaining classical receptive field properties of simple cells, it was unclear whether it can account for more complex response properties in a variety of cell types. In this dissertation, we demonstrate that sparse coding and its variants are consistent with key aspects of neural response in V1, including many contextual and nonlinear effects, a number of inhibitory interneuron properties, as well as the variance and correlation distributions in the population response. The results suggest that important response properties in V1 can be interpreted as emergent effects of a neural population efficiently representing the statistical structures of natural scenes under resource constraints. Based on the models, we make predictions of the circuit structure and response properties in V1 that can be verified by future experiments.
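The inference step of sparse coding, finding a small set of active coefficients over a dictionary, can be sketched with ISTA applied to the L1-penalised reconstruction objective. The dictionary below is random rather than learned from natural scenes, and all parameters are illustrative.

```python
import numpy as np

def ista(x, D, lam=0.1, n_iter=200):
    """Infer sparse coefficients a minimising
    0.5*||x - D a||^2 + lam*||a||_1
    via iterative shrinkage-thresholding (ISTA)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(0)
D = rng.normal(size=(64, 256))             # overcomplete dictionary (e.g. 8x8 patches)
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
x = D @ (rng.random(256) * (rng.random(256) < 0.05))  # sparse synthetic signal
a = ista(x, D)
print(np.sum(np.abs(a) > 1e-6))            # only a few coefficients stay active
```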
8. Woldman, Wessel. "Emergent phenomena from dynamic network models: mathematical analysis of EEG from people with IGE." Thesis, University of Exeter, 2016. http://hdl.handle.net/10871/23297.

Abstract: In this thesis mathematical techniques and models are applied to electroencephalographic (EEG) recordings to study mechanisms of idiopathic generalised epilepsy (IGE). First, we compare network structures derived from resting-state EEG from people with IGE, their unaffected relatives, and healthy controls. Next, these static networks are combined with a dynamical model describing the activity of a cortical region as a population of phase-oscillators. We then examine the potential of the differences found in the static networks and the emergent properties of the dynamic network as individual biomarkers of IGE. The emphasis of this approach is on discerning the potential of these markers at the level of an individual subject rather than their ability to identify differences at a group level. Finally, we extend a dynamic model of seizure onset to investigate how epileptiform discharges vary over the course of the day in ambulatory EEG recordings from people with IGE. By perturbing the dynamics describing the excitability of the system, we demonstrate the model can reproduce discharge distributions on an individual level which are shown to express a circadian tone. The emphasis of the model approach is on understanding how changes in excitability within brain regions, modulated by sleep, metabolism, endocrine axes, or anti-epileptic drugs (AEDs), can drive the emergence of epileptiform activity in large-scale brain networks. Our results demonstrate that studying EEG recordings from people with IGE can lead to new mechanistic insight on the idiopathic nature of IGE, and may eventually lead to clinical applications. We show that biomarkers derived from dynamic network models perform significantly better as classifiers than biomarkers based on static network properties. Hence, our results provide additional evidence that the interplay between the dynamics of specific brain regions, and the network topology governing the interactions between these regions, is crucial in the generation of emergent epileptiform activity. Pathological activity may emerge due to abnormalities in either of those factors, or a combination of both, and hence it is essential to develop new techniques to characterise this interplay theoretically and to validate predictions experimentally.
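A minimal example in the spirit of the phase-oscillator network described above is the Kuramoto model on a coupling matrix. The connectivity, node count, and parameters below are synthetic stand-ins, not the thesis's EEG-derived networks.

```python
import numpy as np

def kuramoto_step(theta, omega, K, dt=0.01):
    """One Euler step of the Kuramoto model: each node's phase is
    pulled toward its neighbours according to the coupling matrix K."""
    coupling = (K * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    return theta + dt * (omega + coupling)

rng = np.random.default_rng(0)
n = 19                                     # e.g. one node per EEG electrode
K = 0.4 * rng.random((n, n))               # synthetic connectivity
np.fill_diagonal(K, 0)
theta = rng.uniform(0, 2 * np.pi, n)       # initial phases
omega = rng.normal(10.0, 0.5, n)           # intrinsic frequencies (rad/s)

R = []
for _ in range(5000):
    theta = kuramoto_step(theta, omega, K)
    R.append(np.abs(np.mean(np.exp(1j * theta))))  # order parameter
print(np.mean(R[-1000:]))  # degree of global synchrony in steady state
```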
9. Nguyen, Harrison Tri Tue. "Computational Neuroscience with Deep Learning for Brain Imaging Analysis and Behaviour Classification." Thesis, The University of Sydney, 2021. https://hdl.handle.net/2123/27313.

Abstract: Recent advances in artificial neural networks and deep learning models have produced significant results in problems related to neuroscience. For example, deep learning models have demonstrated superior performance in non-linear, multivariate pattern classification problems such as Alzheimer’s disease classification, brain lesion segmentation, skull stripping and brain age prediction. Deep learning provides unique advantages for high-dimensional data such as MRI data, since it does not require extensive feature engineering. The thesis investigates three problems related to neuroscience and discusses solutions to those scenarios. MRI has been used to analyse the structure of the brain and its pathology. However, due to factors such as scanner heterogeneity, differences in MRI protocol, and variation in site thermal and power stability, scans of the same individual acquired at different sites or on different days can exhibit scanning differences and artefacts. Combining images from different sites, or even different days, can therefore introduce biases that obscure the signal of interest or produce results that are driven by these differences. An algorithm, the CycleGAN, is presented and analysed which uses generative adversarial networks to transform a set of images from a given MRI site into images with the characteristics of a different MRI site. Secondly, MRI scans of the brain come in different modalities, such as T1-weighted and FLAIR, which have been used to investigate a wide range of neurological disorders. The acquisition of all of these modalities is expensive, time-consuming and inconvenient, and the required modalities are often not available. As a result, these datasets contain large amounts of unpaired data, where examples in the dataset do not contain all modalities, alongside a smaller fraction of examples that contain all modalities (paired data). This thesis presents a method for translating between two neuroimaging modalities with a dataset of unpaired and paired examples, in a semi-supervised learning framework. Lastly, behavioural modelling is considered: it is associated with an impressive range of decision-making tasks that are designed to index sub-components of psychological and neural computations that are distinct across groups of people, including people with an underlying disease. The thesis proposes a method that learns prototypical behaviours of each population in the form of readily interpretable subsequences of choices, and classifies subjects by finding signatures of these prototypes in their behaviour.
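The CycleGAN's defining constraint is cycle consistency: translating a scan from site A to site B and back must reconstruct the original. The sketch below shows only that term, with toy convolutional generators standing in for the real architecture; the full model also adds adversarial discriminator losses.

```python
import torch
import torch.nn as nn

# Toy stand-ins for the two generators of a CycleGAN-style model:
# G maps site-A images toward site-B characteristics, F maps B back to A.
G = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(8, 1, 3, padding=1))
F = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(8, 1, 3, padding=1))

def cycle_consistency_loss(x_a, x_b, lam=10.0):
    """Unpaired translation constraint: translating A->B->A (and B->A->B)
    must reconstruct the original image."""
    l1 = nn.L1Loss()
    return lam * (l1(F(G(x_a)), x_a) + l1(G(F(x_b)), x_b))

x_a = torch.randn(4, 1, 64, 64)     # batch of site-A slices (toy data)
x_b = torch.randn(4, 1, 64, 64)     # batch of site-B slices (toy data)
loss = cycle_consistency_loss(x_a, x_b)
loss.backward()                      # gradients flow into both generators
```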
10. Lundh, Dan. "A computational neuroscientific model for short-term memory." Thesis, University of Exeter, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.324742.

Books on the topic "Computational neuroscience"

1. Ribeiro, Paulo Rogério de Almeida, Vinícius Rosa Cota, Dante Augusto Couto Barone, and Alexandre César Muniz de Oliveira, eds. Computational Neuroscience. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-08443-0.

2. Cota, Vinícius Rosa, Dante Augusto Couto Barone, Diego Roberto Colombo Dias, and Laila Cristina Moreira Damázio, eds. Computational Neuroscience. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-36636-0.

3. Bower, James M., ed. Computational Neuroscience. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4757-9800-5.

4. Chaovalitwongse, Wanpracha, Panos M. Pardalos, and Petros Xanthopoulos, eds. Computational Neuroscience. New York, NY: Springer New York, 2010. http://dx.doi.org/10.1007/978-0-387-88630-5.

5. Barone, Dante Augusto Couto, Eduardo Oliveira Teles, and Christian Puhlmann Brackmann, eds. Computational Neuroscience. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-71011-2.

6. Mallot, Hanspeter A. Computational Neuroscience. Heidelberg: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-00861-5.

7. Bower, James M., ed. Computational Neuroscience. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4615-4831-7.

8. Stoyanov, Drozdstoy, Bogdan Draganski, Paolo Brambilla, and Claus Lamm, eds. Computational Neuroscience. New York, NY: Springer US, 2023. http://dx.doi.org/10.1007/978-1-0716-3230-7.

9. Riascos Salas, Jaime A., Vinícius Rosa Cota, Hernán Villota, and Daniel Betancur Vasquez, eds. Computational Neuroscience. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-63848-0.

10. Pardalos, P. M. Computational Neuroscience. New York: Springer, 2010.

Book chapters on the topic "Computational neuroscience"

1. Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus." In Neuroscience in the 21st Century, 3081–95. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4939-3474-4_175.

2. Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus." In Neuroscience in the 21st Century, 1–15. New York, NY: Springer New York, 2016. http://dx.doi.org/10.1007/978-1-4614-6434-1_175-1.

3. Hasselmo, Michael E., and James R. Hinman. "Computational Neuroscience: Hippocampus." In Neuroscience in the 21st Century, 3489–503. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-88832-9_175.

4. Zednik, Carlos. "Computational cognitive neuroscience." In The Routledge Handbook of the Computational Mind, 357–69. Milton Park, Abingdon, Oxon; New York: Routledge, 2018. http://dx.doi.org/10.4324/9781315643670-27.

5. Mazzola, Guerino, Maria Mannone, Yan Pang, Margaret O’Brien, and Nathan Torunsky. "Neuroscience and Gestures." In Computational Music Science, 155–61. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-47334-5_18.

6. Venugopal, Sharmila, Sharon Crook, Malathi Srivatsan, and Ranu Jung. "Principles of Computational Neuroscience." In Biohybrid Systems, 11–30. Weinheim, Germany: Wiley-VCH Verlag GmbH & Co. KGaA, 2011. http://dx.doi.org/10.1002/9783527639366.ch2.

7. Irvine, Liz. "Simulation in computational neuroscience." In The Routledge Handbook of the Computational Mind, 370–80. Milton Park, Abingdon, Oxon; New York: Routledge, 2018. http://dx.doi.org/10.4324/9781315643670-28.

8. Easttom, Chuck. "Introduction to Computational Neuroscience." In Machine Learning for Neuroscience, 147–71. Boca Raton: CRC Press, 2023. http://dx.doi.org/10.1201/9781003230588-10.

9. DeWan, Andrew, Lana C. Rutherford, and Gina G. Turrigiano. "Activity-Dependent Regulation of Inhibition in Visual Cortical Cultures." In Computational Neuroscience, 3–6. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4757-9800-5_1.

10. Chitwood, Raymond A., Brenda J. Claiborne, and David B. Jaffe. "Modeling the Passive Properties of Nonpyramidal Neurons in Hippocampal Area CA3." In Computational Neuroscience, 59–64. Boston, MA: Springer US, 1997. http://dx.doi.org/10.1007/978-1-4757-9800-5_10.

Conference papers on the topic "Computational neuroscience"

1. Maley, Corey. "Analog Computation in Computational Cognitive Neuroscience." In 2018 Conference on Cognitive Computational Neuroscience. Brentwood, Tennessee, USA: Cognitive Computational Neuroscience, 2018. http://dx.doi.org/10.32470/ccn.2018.1178-0.

2. Piccinini, Gualtiero. "Non-Computational Functionalism: Computation and the Function of Consciousness." In 2018 Conference on Cognitive Computational Neuroscience. Brentwood, Tennessee, USA: Cognitive Computational Neuroscience, 2018. http://dx.doi.org/10.32470/ccn.2018.1022-0.

3. Kawato, Mitsuo. "Computational Neuroscience and Multiple-Valued Logic." In 2009 39th International Symposium on Multiple-Valued Logic. IEEE, 2009. http://dx.doi.org/10.1109/ismvl.2009.70.

4. Tirupattur, Naveen, Christopher C. Lapish, Snehasis Mukhopadhyay, Tuan D. Pham, Xiaobo Zhou, Hiroshi Tanaka, Mayumi Oyama-Higa, et al. "Text Mining for Neuroscience." In 2011 International Symposium on Computational Models for Life Sciences (CMLS-11). AIP, 2011. http://dx.doi.org/10.1063/1.3596634.

5. Gao, Richard, Dylan Christiano, Tom Donoghue, and Bradley Voytek. "The Structure of Cognition Across Computational Cognitive Neuroscience." In 2019 Conference on Cognitive Computational Neuroscience. Brentwood, Tennessee, USA: Cognitive Computational Neuroscience, 2019. http://dx.doi.org/10.32470/ccn.2019.1426-0.

6. Costa, Raimundo José Macário, Luís Alfredo Vidal de Carvalho, Emilio Sánchez Miguel, Renata Mousinho, Renato Cerceau, Lizete Pontes Macário Costa, Jorge Zavaleta, Laci Mary Barbosa Manhães, and Sérgio Manuel Serra da Cruz. "Computational Neuroscience - Challenges and Implications for Brazilian Education." In 7th International Conference on Computer Supported Education. SCITEPRESS - Science and Technology Publications, 2015. http://dx.doi.org/10.5220/0005481004360441.

7. Chateau-Laurent, Hugo, and Frederic Alexandre. "Towards a Computational Cognitive Neuroscience Model of Creativity." In 2021 IEEE 20th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC). IEEE, 2021. http://dx.doi.org/10.1109/iccicc53683.2021.9811309.

8. Zhang, Wen-Ran. "Six Conjectures in Quantum Physics and Computational Neuroscience." In 2009 Third International Conference on Quantum, Nano and Micro Technologies (ICQNM). IEEE, 2009. http://dx.doi.org/10.1109/icqnm.2009.32.

9. Anllo, Hernan, Gil Salamander, Stefano Palminteri, Nichola Raihani, and Uri Hertz. "Computational drivers of advice-giving." In 2023 Conference on Cognitive Computational Neuroscience. Oxford, United Kingdom: Cognitive Computational Neuroscience, 2023. http://dx.doi.org/10.32470/ccn.2023.1367-0.

10. Muzellec, Sabine, Mathieu Chalvidal, Thomas Serre, and Rufin VanRullen. "Accurate implementation of computational neuroscience models through neural ODEs." In 2022 Conference on Cognitive Computational Neuroscience. San Francisco, California, USA: Cognitive Computational Neuroscience, 2022. http://dx.doi.org/10.32470/ccn.2022.1165-0.

Reports on the topic "Computational neuroscience"

1. Bower, James M., and Christof Koch. Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, September 1990. http://dx.doi.org/10.21236/ada231397.

2. Bower, James M., and Christof Koch. Training in Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, August 1992. http://dx.doi.org/10.21236/ada261806.

3. Halvorson, Harlyn O. Training in Methods in Computational Neuroscience. Fort Belvoir, VA: Defense Technical Information Center, November 1989. http://dx.doi.org/10.21236/ada217018.

4. Bower, James M., and Christof Koch. Methods in Computational Neuroscience: Marine Biology Laboratory Student Projects. Fort Belvoir, VA: Defense Technical Information Center, November 1988. http://dx.doi.org/10.21236/ada201434.

5. Schunn, C. D. A Review of Human Spatial Representations: Computational, Neuroscience, Mathematical, Developmental, and Cognitive Psychology Considerations. Fort Belvoir, VA: Defense Technical Information Center, December 2000. http://dx.doi.org/10.21236/ada440864.

6. Sejnowski, T. Workshop in Computational Neuroscience (8th) Held in Woods Hole, Massachusetts on 22-28 August 1992. Fort Belvoir, VA: Defense Technical Information Center, December 1992. http://dx.doi.org/10.21236/ada279786.
7. Semerikov, Serhiy O., Illia O. Teplytskyi, Yuliia V. Yechkalo, and Arnold E. Kiv. Computer Simulation of Neural Networks Using Spreadsheets: The Dawn of the Age of Camelot. [n.p.], November 2018. http://dx.doi.org/10.31812/123456789/2648.

Abstract: The article substantiates the necessity of developing training methods for the computer simulation of neural networks in the spreadsheet environment. A systematic review of their application to simulating artificial neural networks is performed. The authors distinguish basic approaches to solving the problem of network computer simulation training in the spreadsheet environment: joint application of spreadsheets and neural network simulation tools, application of third-party add-ins to spreadsheets, development of macros using the embedded languages of spreadsheets, use of standard spreadsheet add-ins for non-linear optimization, and creation of neural networks in the spreadsheet environment without add-ins and macros. After analyzing a collection of writings from 1890-1950, the research determines the role of the scientific journal "Bulletin of Mathematical Biophysics", its founder Nicolas Rashevsky, and the scientific community around the journal in creating and developing the models and methods of computational neuroscience. The work identifies the psychophysical basics of creating neural networks, the mathematical foundations of neural computing, and methods of neuroengineering (image recognition, in particular). The role of Walter Pitts in combining the descriptive and quantitative theories of training is discussed. It is shown that, to acquire neural simulation competences in the spreadsheet environment, one should master the models based on the historical and genetic approach. Three groups of models are identified as promising for developing the corresponding methods: the continuous two-factor model of Rashevsky, the discrete model of McCulloch and Pitts, and the discrete-continuous models of Householder and Landahl.
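The discrete McCulloch-Pitts model named above fits in a few lines of Python: a unit fires when the weighted sum of its binary inputs reaches a threshold. A minimal sketch with an illustrative AND configuration:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """McCulloch-Pitts threshold unit: fires (returns 1) iff the
    weighted sum of binary inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# An AND gate: both excitatory inputs must be active for the unit to fire.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, mcculloch_pitts((a, b), weights=(1, 1), threshold=2))
```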