Books on the topic "Neural network model of identification"

To see other types of publications on this topic, follow the link: Neural network model of identification.

Format your source in APA, MLA, Chicago, Harvard, and other citation styles.


Consult the top 45 books for your research on the topic "Neural network model of identification".

Next to every source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever these details are available in the source's metadata.

Browse books on a wide variety of disciplines and organise your bibliography correctly.

1. Liu, G. P. Multiobjective criteria for nonlinear model selection and identification with neural networks. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1994.

2. Janczak, Andrzej. Identification of Nonlinear Systems Using Neural Networks and Polynomial Models. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/b98334.

3. Wasserman, Theodore, and Lori Drucker Wasserman. Therapy and the Neural Network Model. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-26921-0.

4. Fortescue, Michael D. A neural network model of lexical organization. London: Continuum, 2011.

5. Wasserman, Theodore, and Lori Wasserman. Motivation, Effort, and the Neural Network Model. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58724-6.

6. A neural network model of lexical organization. London: Continuum Intl Pub Group, 2009.

7. Nelles, Oliver. Nonlinear System Identification: From Classical Approaches to Neural Networks and Fuzzy Models. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001.

8. Liu, G. P. Nonlinear Identification and Control: A Neural Network Approach. London: Springer London, 2001.

9. Schenkel, Markus E. Handwriting recognition using neural networks and hidden Markov models. Konstanz: Hartung-Gorre, 1995.

10. Zhu, Q. M. Fast orthogonal identification of nonlinear stochastic models and radial basis function neural networks. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1994.

11. Zapranis, Achilleas, and Apostolos-Paul N. Refenes. Principles of Neural Model Identification, Selection and Adequacy. London: Springer London, 1999. http://dx.doi.org/10.1007/978-1-4471-0559-6.

12. Identification of nonlinear systems using neural networks and polynomial models: A block-oriented approach. Berlin: Springer, 2004.

13. Ouyang, Xiaohong. Neural network identification and control of electrical power steering systems. Wolverhampton: University of Wolverhampton, 2000.

14. Jorgensen, Charles C. Development of a sensor coordinated kinematic model for neural network controller training. [Moffett Field, Calif.?]: Research Institute for Advanced Computer Science, NASA Ames Research Center, 1990.

15. Tsang, K. M. A self growing binary tree neural network for the identification of multiclass systems. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1992.

16. Zapranis, Achilleas. Principles of Neural Model Identification, Selection and Adequacy: With Applications to Financial Econometrics. London: Springer London, 1999.

17. Duyar, Ahmet. A failure diagnosis system based on a neural network classifier for the space shuttle main engine. [Washington, DC]: National Aeronautics and Space Administration, 1990.

18. D'Autrechy, C. Lynne. Autoplan: A self-processing network model for an extended blocks world planning environment. College Park, Md: University of Maryland, 1990.

19. D'Autrechy, C. Lynne. Autoplan: A self-processing network model for an extended blocks world planning environment. College Park, Md: University of Maryland, 1990.

20. Zapranis, Achilleas, and Apostolos-Paul N. Refenes. Principles of Neural Model Identification, Selection and Adequacy: With Applications to Financial Econometrics (Perspectives in Neural Computing). Springer, 1999.

21. Hellendoorn, Hans, and Dimiter Driankov, eds. Fuzzy model identification: Selected approaches. Berlin: Springer, 1997.

22. Hellendoorn, Hans, and Dimiter Driankov, eds. Fuzzy Model Identification: Selected Approaches. Springer, 1998.

23. Wasserman, Theodore, and Lori Drucker Wasserman. Therapy and the Neural Network Model. Springer, 2019.

24. Juang, Jer-Nan, David C. Hyland, and Langley Research Center, eds. On neural networks in identification and control of dynamic systems. Hampton, Va: National Aeronautics and Space Administration, Langley Research Center, 1993.

25. Halperin, Janet Ruth Patricia. A connectionist neural network model of aggression. 1990.

26. Nelles, Oliver. Nonlinear System Identification: From Classical Approaches to Neural Networks and Fuzzy Models. Springer, 2000.

27. Tiumentsev, Yury, and Mikhail Egorchev. Neural Network Modeling and Identification of Dynamical Systems. Elsevier Science & Technology, 2019.

28. Neural Network Modeling and Identification of Dynamical Systems. Elsevier, 2019. http://dx.doi.org/10.1016/c2017-0-02854-9.

29. Lee, Marcus T. H. A Bayesian neural network model of consumer choice. 2003.

30. Ławryńczuk, Maciej. Computationally Efficient Model Predictive Control Algorithms: A Neural Network Approach. Springer, 2016.

31. Nonlinear Identification and Control: A Neural Network Approach (Advances in Industrial Control). Springer, 2001.

32. Kashyap, Nikita, Dharmendra Kumar Singh, Girish Kumar Singh, and Arun Kumar Kashyap, eds. Identification of Diabetic Retinopathy Stages Using Modified DWT and Artificial Neural Network. AkiNik Publications, 2021. http://dx.doi.org/10.22271/ed.book.1314.

33. Downs, Joseph, ed. Application of the fuzzy ARTMAP neural network model to medical pattern classification tasks. Sheffield: University of Sheffield, Dept. of Automatic Control & Systems Engineering, 1995.

34. Ye, Julia X. An application of the feedforward neural network model in currency exchange rate forecasting. 1994.

35. Iris Biometric Model for Secured Network Access. Taylor & Francis Inc, 2013.

36. Predicting Launch Pad Winds at the Kennedy Space Center With a Neural Network Model. Storming Media, 1999.

37. Bialasiewicz, Jan T., and Langley Research Center, eds. Neural network modeling of nonlinear systems based on Volterra series extension of a linear model. Hampton, Va: National Aeronautics and Space Administration, Langley Research Center, 1992.

38. Al-Haddad, Luan Marie. Neural network techniques for the identification and classification of marine phytoplankton from flow cytometric data. 2001.

39. Reggia, James A., Francis M. McFadden, University of Maryland at College Park, United States National Aeronautics and Space Administration, and National Science Foundation (U.S.), eds. Autoplan: A self-processing network model for an extended blocks world planning environment. College Park, Md: University of Maryland, 1990.

40. Mundy, Peter. A Neural Networks, Information-Processing Model of Joint Attention and Social-Cognitive Development. Edited by Philip David Zelazo. Oxford University Press, 2013. http://dx.doi.org/10.1093/oxfordhb/9780199958474.013.0010.

Abstract:
A neural networks approach to the development of joint attention can inform the study of the nature of human social cognition, learning, and symbolic thought process. Joint attention development involves increments in the capacity to engage in simultaneous or parallel processing of information about one’s own attention and the attention of other people. Infant practice with joint attention is both a consequence and an organizer of a distributed and integrated brain network involving frontal and parietal cortical systems. In this chapter I discuss two hypotheses that stem from this model. One is that activation of this distributed network during coordinated attention enhances the depth of information processing and encoding beginning in the first year of life. I also propose that with development joint attention becomes internalized as the capacity to socially coordinate mental attention to internal representations. As this occurs the executive joint attention network makes vital contributions to the development of human social cognition and symbolic thinking.
41. Ashby, F. Gregory, and Fabian A. Soto. Multidimensional Signal Detection Theory. Edited by Jerome R. Busemeyer, Zheng Wang, James T. Townsend, and Ami Eidels. Oxford University Press, 2015. http://dx.doi.org/10.1093/oxfordhb/9780199957996.013.2.

Abstract:
Multidimensional signal detection theory is a multivariate extension of signal detection theory that makes two fundamental assumptions, namely that every mental state is noisy and that every action requires a decision. The most widely studied version is known as general recognition theory (GRT). General recognition theory assumes that the percept on each trial can be modeled as a random sample from a multivariate probability distribution defined over the perceptual space. Decision bounds divide this space into regions that are each associated with a response alternative. General recognition theory rigorously defines and tests a number of important perceptual and cognitive conditions, including perceptual and decisional separability and perceptual independence. General recognition theory has been used to analyze data from identification experiments in two ways: (1) fitting and comparing models that make different assumptions about perceptual and decisional processing, and (2) testing assumptions by computing summary statistics and checking whether these satisfy certain conditions. Much has been learned recently about the neural networks that mediate the perceptual and decisional processing modeled by GRT, and this knowledge can be used to improve the design of experiments where a GRT analysis is anticipated.
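
As a rough illustration of the GRT setup summarized in this abstract, the short Python sketch below simulates a single stimulus: percepts are drawn from a bivariate normal distribution over a two-dimensional perceptual space, and two linear decision bounds carve that space into four response regions. All parameter values and variable names are hypothetical and chosen only for illustration; they are not taken from Ashby and Soto's chapter.

import numpy as np

rng = np.random.default_rng(0)

# Noisy percepts for one stimulus: samples from a 2-D perceptual distribution.
mean = np.array([0.4, -0.2])            # assumed perceptual means (hypothetical)
cov = np.array([[1.0, 0.3],
                [0.3, 1.0]])            # nonzero covariance: perceptual independence fails

percepts = rng.multivariate_normal(mean, cov, size=1000)

# Two linear decision bounds (x = 0 and y = 0) divide the space into four
# response regions, one per response alternative.
responses = 2 * (percepts[:, 0] > 0) + (percepts[:, 1] > 0)

# Predicted response probabilities for this stimulus under these assumptions.
print(np.bincount(responses, minlength=4) / len(responses))
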
42. Anderson, James A. Brain Theory. Oxford University Press, 2018. http://dx.doi.org/10.1093/acprof:oso/9780199357789.003.0012.

Abstract:
What form would a brain theory take? Would it be short and punchy, like Maxwell’s Equations? Or with a clear goal but achieved by a community of mechanisms—local theories—to attain that goal, like the US Tax Code. The best developed recent brain-like model is the “neural network.” In the late 1950s Rosenblatt’s Perceptron and many variants proposed a brain-inspired associative network. Problems with the first generation of neural networks—limited capacity, opaque learning, and inaccuracy—have been largely overcome. In 2016, a program from Google, AlphaGo, based on a neural net using deep learning, defeated the world’s best Go player. The climax of this chapter is a fictional example starring Sherlock Holmes demonstrating that complex associative computation in practice has less in common with accurate pattern recognition and more with abstract high-level conceptual inference.
43. Gabora, Liane. The Creative Process of Cultural Evolution. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190455675.003.0002.

Abstract:
This chapter explores how we can better understand culture by understanding the creative processes that fuel it, and better understand creativity by examining it from its cultural context. First, it summarizes attempts to develop a scientific framework for how culture evolves, and it explores what these frameworks imply for the role of creativity in cultural evolution. Next it examines how questions about the relationship between creativity and cultural evolution have been addressed using an agent-based model in which neural network-based agents collectively generate increasingly fit ideas by building on previous ideas and imitating neighbors’ ideas. Finally, it outlines studies of how creative outputs are influenced, in perhaps unexpected ways, by other ideas and individuals, and how individual creative styles “peek through” cultural outputs in different domains.
44. Barwich, Ann-Sophie. Measuring the World. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198779636.003.0017.

Abstract:
How much does stimulus input shape perception? The common-sense view is that our perceptions are representations of objects and their features and that the stimulus structures the perceptual object. The problem for this view concerns perceptual biases as responsible for distortions and the subjectivity of perceptual experience. These biases are increasingly studied as constitutive factors of brain processes in recent neuroscience. In neural network models the brain is said to cope with the plethora of sensory information by predicting stimulus regularities on the basis of previous experiences. Drawing on this development, this chapter analyses perceptions as processes. Looking at olfaction as a model system, it argues for the need to abandon a stimulus-centred perspective, where smells are thought of as stable percepts, computationally linked to external objects such as odorous molecules. Perception here is presented as a measure of changing signal ratios in an environment informed by expectancy effects from top-down processes.
45. Koch, Christof. Biophysics of Computation. Oxford University Press, 1998. http://dx.doi.org/10.1093/oso/9780195104912.001.0001.

Abstract:
Neural network research often builds on the fiction that neurons are simple linear threshold units, completely neglecting the highly dynamic and complex nature of synapses, dendrites, and voltage-dependent ionic currents. Biophysics of Computation: Information Processing in Single Neurons challenges this notion, using richly detailed experimental and theoretical findings from cellular biophysics to explain the repertoire of computational functions available to single neurons. The author shows how individual nerve cells can multiply, integrate, or delay synaptic inputs and how information can be encoded in the voltage across the membrane, in the intracellular calcium concentration, or in the timing of individual spikes. Key topics covered include the linear cable equation; cable theory as applied to passive dendritic trees and dendritic spines; chemical and electrical synapses and how to treat them from a computational point of view; nonlinear interactions of synaptic input in passive and active dendritic trees; the Hodgkin-Huxley model of action potential generation and propagation; phase space analysis; linking stochastic ionic channels to membrane-dependent currents; calcium and potassium currents and their role in information processing; the role of diffusion, buffering and binding of calcium, and other messenger systems in information processing and storage; short- and long-term models of synaptic plasticity; simplified models of single cells; stochastic aspects of neuronal firing; the nature of the neuronal code; and unconventional models of sub-cellular computation. Biophysics of Computation: Information Processing in Single Neurons serves as an ideal text for advanced undergraduate and graduate courses in cellular biophysics, computational neuroscience, and neural networks, and will appeal to students and professionals in neuroscience, electrical and computer engineering, and physics.
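
For orientation only: the linear cable equation listed among the book's topics can be written, in its standard passive form (a textbook result, not quoted from the book), as

\lambda^{2}\,\frac{\partial^{2} V}{\partial x^{2}} = \tau_{m}\,\frac{\partial V}{\partial t} + V,

where V(x, t) is the membrane potential relative to rest, \lambda the space (length) constant, and \tau_{m} the membrane time constant.
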