Books on the topic 'Neural network model of identification'


Consult the top 45 books for your research on the topic 'Neural network model of identification.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse books on a wide variety of disciplines and organise your bibliography correctly.

1

Liu, G. P. Multiobjective criteria for nonlinear model selection and identification with neural networks. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1994.

2

Janczak, Andrzej. Identification of Nonlinear Systems Using Neural Networks and Polynomial Models. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/b98334.

3

Wasserman, Theodore, and Lori Drucker Wasserman. Therapy and the Neural Network Model. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-26921-0.

4

Fortescue, Michael D. A neural network model of lexical organization. London: Continuum, 2011.

5

Wasserman, Theodore, and Lori Wasserman. Motivation, Effort, and the Neural Network Model. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58724-6.

6

A neural network model of lexical organization. London: Continuum Intl Pub Group, 2009.

7

Nelles, Oliver. Nonlinear System Identification: From Classical Approaches to Neural Networks and Fuzzy Models. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001.

8

Liu, G. P. Nonlinear Identification and Control: A Neural Network Approach. London: Springer London, 2001.

9

Schenkel, Markus E. Handwriting recognition using neural networks and hidden Markov models. Konstanz: Hartung-Gorre, 1995.

10

Zhu, Q. M. Fast orthogonal identification of nonlinear stochastic models and radial basis function neural networks. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1994.

11

Zapranis, Achilleas, and Apostolos-Paul N. Refenes. Principles of Neural Model Identification, Selection and Adequacy. London: Springer London, 1999. http://dx.doi.org/10.1007/978-1-4471-0559-6.

12

Identification of nonlinear systems using neural networks and polynomial models: A block-oriented approach. Berlin: Springer, 2004.

13

Ouyang, Xiaohong. Neural network identification and control of electrical power steering systems. Wolverhampton: University of Wolverhampton, 2000.

14

Jorgensen, Charles C. Development of a sensor coordinated kinematic model for neural network controller training. [Moffett Field, Calif.?]: Research Institute for Advanced Computer Science, NASA Ames Research Center, 1990.

15

Tsang, K. M. A self growing binary tree neural network for the identification of multiclass systems. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1992.

16

Zapranis, Achilleas. Principles of Neural Model Identification, Selection and Adequacy: With Applications to Financial Econometrics. London: Springer London, 1999.

17

Duyar, Ahmet. A failure diagnosis system based on a neural network classifier for the space shuttle main engine. [Washington, DC]: National Aeronautics and Space Administration, 1990.

18

D'Autrechy, C. Lynne. Autoplan: A self-processing network model for an extended blocks world planning environment. College Park, Md: University of Maryland, 1990.

20

Zapranis, Achilleas, and Apostolos-Paul N. Refenes. Principles of Neural Model Identification, Selection and Adequacy: With Applications to Financial Econometrics (Perspectives in Neural Computing). Springer, 1999.

21

Hellendoorn, Hans, and Dimiter Driankov, eds. Fuzzy model identification: Selected approaches. Berlin: Springer, 1997.

22

Hellendoorn, Hans, and Dimiter Driankov, eds. Fuzzy Model Identification: Selected Approaches. Springer, 1998.

23

Wasserman, Theodore, and Lori Drucker Wasserman. Therapy and the Neural Network Model. Springer, 2019.

24

Juang, Jer-Nan, David C. Hyland, and Langley Research Center, eds. On neural networks in identification and control of dynamic systems. Hampton, Va.: National Aeronautics and Space Administration, Langley Research Center, 1993.

25

Halperin, Janet Ruth Patricia. A connectionist neural network model of aggression. 1990.

26

Nelles, Oliver. Nonlinear System Identification: From Classical Approaches to Neural Networks and Fuzzy Models. Springer, 2000.

27

Tiumentsev, Yury, and Mikhail Egorchev. Neural Network Modeling and Identification of Dynamical Systems. Elsevier Science & Technology, 2019.

28

Neural Network Modeling and Identification of Dynamical Systems. Elsevier, 2019. http://dx.doi.org/10.1016/c2017-0-02854-9.

29

Lee, Marcus T. H. A Bayesian neural network model of consumer choice. 2003.

30

Ławryńczuk, Maciej. Computationally Efficient Model Predictive Control Algorithms: A Neural Network Approach. Springer, 2016.

31

Nonlinear Identification and Control: A Neural Network Approach (Advances in Industrial Control). Springer, 2001.

32

Kashyap, Nikita, Dharmendra Kumar Singh, Girish Kumar Singh, and Arun Kumar Kashyap, eds. Identification of Diabetic Retinopathy Stages Using Modified DWT and Artificial Neural Network. AkiNik Publications, 2021. http://dx.doi.org/10.22271/ed.book.1314.

33

Downs, Joseph, ed. Application of the fuzzy ARTMAP neural network model to medical pattern classification tasks. Sheffield: University of Sheffield, Dept. of Automatic Control & Systems Engineering, 1995.

34

Ye, Julia X. An application of the feedforward neural network model in currency exchange rate forecasting. 1994.

35

Iris Biometric Model for Secured Network Access. Taylor & Francis Inc, 2013.

36

Predicting Launch Pad Winds at the Kennedy Space Center With a Neural Network Model. Storming Media, 1999.

37

Bialasiewicz, Jan T., and Langley Research Center, eds. Neural network modeling of nonlinear systems based on Volterra series extension of a linear model. Hampton, Va.: National Aeronautics and Space Administration, Langley Research Center, 1992.

38

Al-Haddad, Luan Marie. Neural network techniques for the identification and classification of marine phytoplankton from flow cytometric data. 2001.

39

Reggia, James A., Francis M. McFadden, University of Maryland at College Park, United States National Aeronautics and Space Administration, and National Science Foundation (U.S.), eds. Autoplan: A self-processing network model for an extended blocks world planning environment. College Park, Md.: University of Maryland, 1990.

40

Mundy, Peter. A Neural Networks, Information-Processing Model of Joint Attention and Social-Cognitive Development. Edited by Philip David Zelazo. Oxford University Press, 2013. http://dx.doi.org/10.1093/oxfordhb/9780199958474.013.0010.

Abstract:
A neural networks approach to the development of joint attention can inform the study of the nature of human social cognition, learning, and symbolic thought process. Joint attention development involves increments in the capacity to engage in simultaneous or parallel processing of information about one’s own attention and the attention of other people. Infant practice with joint attention is both a consequence and an organizer of a distributed and integrated brain network involving frontal and parietal cortical systems. In this chapter I discuss two hypotheses that stem from this model. One is that activation of this distributed network during coordinated attention enhances the depth of information processing and encoding beginning in the first year of life. I also propose that with development joint attention becomes internalized as the capacity to socially coordinate mental attention to internal representations. As this occurs the executive joint attention network makes vital contributions to the development of human social cognition and symbolic thinking.
41

Ashby, F. Gregory, and Fabian A. Soto. Multidimensional Signal Detection Theory. Edited by Jerome R. Busemeyer, Zheng Wang, James T. Townsend, and Ami Eidels. Oxford University Press, 2015. http://dx.doi.org/10.1093/oxfordhb/9780199957996.013.2.

Abstract:
Multidimensional signal detection theory is a multivariate extension of signal detection theory that makes two fundamental assumptions, namely that every mental state is noisy and that every action requires a decision. The most widely studied version is known as general recognition theory (GRT). General recognition theory assumes that the percept on each trial can be modeled as a random sample from a multivariate probability distribution defined over the perceptual space. Decision bounds divide this space into regions that are each associated with a response alternative. General recognition theory rigorously defines and tests a number of important perceptual and cognitive conditions, including perceptual and decisional separability and perceptual independence. General recognition theory has been used to analyze data from identification experiments in two ways: (1) fitting and comparing models that make different assumptions about perceptual and decisional processing, and (2) testing assumptions by computing summary statistics and checking whether these satisfy certain conditions. Much has been learned recently about the neural networks that mediate the perceptual and decisional processing modeled by GRT, and this knowledge can be used to improve the design of experiments where a GRT analysis is anticipated.
42

Anderson, James A. Brain Theory. Oxford University Press, 2018. http://dx.doi.org/10.1093/acprof:oso/9780199357789.003.0012.

Abstract:
What form would a brain theory take? Would it be short and punchy, like Maxwell’s Equations? Or with a clear goal but achieved by a community of mechanisms—local theories—to attain that goal, like the US Tax Code. The best developed recent brain-like model is the “neural network.” In the late 1950s Rosenblatt’s Perceptron and many variants proposed a brain-inspired associative network. Problems with the first generation of neural networks—limited capacity, opaque learning, and inaccuracy—have been largely overcome. In 2016, a program from Google, AlphaGo, based on a neural net using deep learning, defeated the world’s best Go player. The climax of this chapter is a fictional example starring Sherlock Holmes demonstrating that complex associative computation in practice has less in common with accurate pattern recognition and more with abstract high-level conceptual inference.
43

Gabora, Liane. The Creative Process of Cultural Evolution. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190455675.003.0002.

Abstract:
This chapter explores how we can better understand culture by understanding the creative processes that fuel it, and better understand creativity by examining it from its cultural context. First, it summarizes attempts to develop a scientific framework for how culture evolves, and it explores what these frameworks imply for the role of creativity in cultural evolution. Next it examines how questions about the relationship between creativity and cultural evolution have been addressed using an agent-based model in which neural network-based agents collectively generate increasingly fit ideas by building on previous ideas and imitating neighbors’ ideas. Finally, it outlines studies of how creative outputs are influenced, in perhaps unexpected ways, by other ideas and individuals, and how individual creative styles “peek through” cultural outputs in different domains.
44

Barwich, Ann-Sophie. Measuring the World. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198779636.003.0017.

Abstract:
How much does stimulus input shape perception? The common-sense view is that our perceptions are representations of objects and their features and that the stimulus structures the perceptual object. The problem for this view concerns perceptual biases as responsible for distortions and the subjectivity of perceptual experience. These biases are increasingly studied as constitutive factors of brain processes in recent neuroscience. In neural network models the brain is said to cope with the plethora of sensory information by predicting stimulus regularities on the basis of previous experiences. Drawing on this development, this chapter analyses perceptions as processes. Looking at olfaction as a model system, it argues for the need to abandon a stimulus-centred perspective, where smells are thought of as stable percepts, computationally linked to external objects such as odorous molecules. Perception here is presented as a measure of changing signal ratios in an environment informed by expectancy effects from top-down processes.
45

Koch, Christof. Biophysics of Computation. Oxford University Press, 1998. http://dx.doi.org/10.1093/oso/9780195104912.001.0001.

Abstract:
Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes. Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation. Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.