Ready-made bibliography on the topic "Stochastic neural networks"

Create an accurate reference in APA, MLA, Chicago, Harvard, and many other styles

Select a source type:

Consult the lists of current articles, books, theses, abstracts, and other scholarly sources on the topic "Stochastic neural networks".

Next to every entry in the bibliography there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read its abstract online, whenever the relevant details are available in the metadata.

Journal articles on the topic "Stochastic neural networks"

1

Reddy, BhanuTeja, and Usha J.C. "Prediction of Stock Market using Stochastic Neural Networks". International Journal of Innovative Research in Computer Science & Technology 7, no. 5 (September 2019): 128–38. http://dx.doi.org/10.21276/ijircst.2019.7.5.1.

2

Wong, Eugene. "Stochastic neural networks". Algorithmica 6, no. 1-6 (June 1991): 466–78. http://dx.doi.org/10.1007/bf01759054.

3

Zhou, Wuneng, Xueqing Yang, Jun Yang, and Jun Zhou. "Stochastic Synchronization of Neutral-Type Neural Networks with Multidelays Based on M-Matrix". Discrete Dynamics in Nature and Society 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/826810.

Abstract:
The problem of stochastic synchronization of neutral-type neural networks with multiple delays based on the M-matrix is studied. Firstly, we design a control law for stochastic synchronization of the neutral-type, multiple-time-delay neural network. Secondly, by making use of a Lyapunov functional and the M-matrix method, we obtain a criterion under which the drive and response neutral-type multiple-time-delay neural networks with stochastic disturbance and Markovian switching achieve stochastic synchronization. The synchronization condition is expressed as a linear matrix inequality, which can be easily solved with MATLAB. Finally, we introduce a numerical example to illustrate the effectiveness of the method and of the result obtained in this paper.
4

Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks". Journal of Physics A: Mathematical and General 36, no. 9 (February 19, 2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.

5

Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks". IEEE Transactions on Signal Processing 69 (2021): 4428–43. http://dx.doi.org/10.1109/tsp.2021.3092336.

6

Sakthivel, Rathinasamy, R. Samidurai, and S. Marshal Anthoni. "Exponential Stability for Stochastic Neural Networks of Neutral Type with Impulsive Effects". Modern Physics Letters B 24, no. 11 (May 10, 2010): 1099–110. http://dx.doi.org/10.1142/s0217984910023141.

Abstract:
This paper is concerned with the exponential stability of stochastic neural networks of neutral type with impulsive effects. By employing a Lyapunov functional and stochastic analysis, a new stability criterion for the stochastic neural network is derived in terms of a linear matrix inequality. A numerical example is provided to show the effectiveness and applicability of the obtained result.
7

Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays". Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.

Abstract:
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays by means of a transcendental equation. Moreover, we prove theoretically that, for any globally exponentially stable hybrid stochastic neural network, if the additive neutral terms and time-varying delays are smaller than these upper bounds, then the perturbed neural network is guaranteed to remain globally exponentially stable. Finally, a numerical simulation example is given to illustrate the presented criteria.
8

Kanarachos, Andreas E., and Kleanthis T. Geramanis. "Semi-Stochastic Complex Neural Networks". IFAC Proceedings Volumes 31, no. 12 (June 1998): 47–52. http://dx.doi.org/10.1016/s1474-6670(17)36040-8.

9

Peretto, Pierre, and Jean-Jacques Niez. "Stochastic Dynamics of Neural Networks". IEEE Transactions on Systems, Man, and Cybernetics 16, no. 1 (January 1986): 73–83. http://dx.doi.org/10.1109/tsmc.1986.289283.

10

Hurtado, Jorge E. "Neural networks in stochastic mechanics". Archives of Computational Methods in Engineering 8, no. 3 (September 2001): 303–42. http://dx.doi.org/10.1007/bf02736646.


Doctoral dissertations on the topic "Stochastic neural networks"

1

Pensuwon, Wanida. "Stochastic dynamic hierarchical neural networks". Thesis, University of Hertfordshire, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366030.

2

Campos, Luciana Conceicao Dias. "Periodic Stochastic Model Based on Neural Networks". Pontifícia Universidade Católica do Rio de Janeiro, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=17076@1.

Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico
A stochastic process is a branch of probability theory that defines a set of models for studying problems with random components. Many real problems exhibit complex characteristics, such as nonlinearity and chaotic behavior, and require models capable of capturing the true characteristics of the problem in order to treat it appropriately. Existing models, however, are either linear, and thus often inadequate for such problems; or they require a complex formulation whose applicability is limited and problem-specific; or they depend on a priori assumptions about the behavior of the problem before they can be applied. This motivated the development of a new, generic, intrinsically nonlinear stochastic process model that can be applied to a range of problems involving nonlinear phenomena with highly stochastic and even periodic behavior. Since artificial neural networks are nonlinear parametric models that are simple to understand and implement, and able to capture the behavior of many types of problems, they were chosen as the basis of the new model proposed in this thesis, called the Neural Stochastic Process. The nonlinearity provided by the neural networks of this stochastic process allows it to adequately capture the behavior of the historical series of nonlinear phenomena with highly stochastic and even periodic characteristics. The goal is to use this model to generate synthetic time series, equally likely as the historical series, for solving such problems, for example those involving climatological or economic phenomena, among others. As a case study, the proposed model was applied to monthly inflows in the context of operation planning of the Brazilian hydrothermal system. The results showed that the Neural Stochastic Process can generate synthetic series with characteristics similar to the historical monthly inflow series.
3

Sutherland, Connie. "Spatio-temporal feedback in stochastic neural networks". Thesis, University of Ottawa (Canada), 2007. http://hdl.handle.net/10393/27559.

Abstract:
The mechanisms by which groups of neurons interact are an important facet of understanding how the brain functions. Here we study stochastic neural networks with delayed feedback. The first part of our study looks at how feedback and noise affect the mean firing rate of the network. Secondly, we look at how the spatial profile of the feedback affects the behavior of the network. Our numerical and theoretical results show that negative (inhibitory) feedback linearizes the frequency vs. input current (f-I) curve via the divisive gain effect it has on the network. The interaction of the inhibitory feedback and the input bias is what produces the divisive decrease in the slope (known as the gain) of the f-I curve. Our work predicts that an increase in noise is required along with an increase in inhibitory feedback to attain the divisive and subtractive shift of the gain seen in experiments [1]. Our results also show that, although the spatial profile of the feedback does not affect the mean activity of the network, it does influence the overall dynamics of the network. Local feedback generates a network oscillation that is more robust against disruption by noise, uncorrelated input, or network heterogeneity than that for the global feedback (all-to-all coupling) case. For example, uncorrelated input completely disrupts the network oscillation generated by global feedback, but only diminishes the network oscillation due to local feedback. This is characterized by first- and second-order spike train statistics. Further, our theory agrees well with numerical simulations of network dynamics.
4

Ling, Hong. "Implementation of Stochastic Neural Networks for Approximating Random Processes". Master's thesis, Lincoln University. Environment, Society and Design Division, 2007. http://theses.lincoln.ac.nz/public/adt-NZLIU20080108.124352/.

Abstract:
Artificial Neural Networks (ANNs) can be viewed as mathematical models that simulate natural and biological systems by mimicking the information-processing methods of the human brain. Current ANNs focus only on approximating arbitrary deterministic input-output mappings. However, these ANNs do not adequately represent the variability observed in systems' natural settings, nor do they capture the complexity of the whole system's behaviour. This thesis addresses the development of a new class of neural networks called Stochastic Neural Networks (SNNs) in order to simulate the internal stochastic properties of systems. A suitable mathematical model for SNNs is developed on the basis of the canonical representation of stochastic processes or systems by means of the Karhunen-Loève theorem. Some successful real examples, such as the analysis of the full displacement field of wood in compression, confirm the validity of the proposed neural networks. Furthermore, an analysis of the internal workings of SNNs provides an in-depth view of their operation that helps to gain a better understanding of the simulation of stochastic processes by SNNs.
5

Zhao, Jieyu. "Stochastic bit stream neural networks : theory, simulations and applications". Thesis, Royal Holloway, University of London, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338916.

6

Hyland, P. "On the implementation of neural networks using stochastic arithmetic". Thesis, Bangor University, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306224.

7

Todeschi, Tiziano. "Calibration of local-stochastic volatility models with neural networks". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/23052/.

Abstract:
During the last twenty years several models have been proposed to improve the classic Black-Scholes framework for equity derivatives pricing. Recently a new model has been proposed: the Local-Stochastic Volatility (LSV) model. This model considers volatility as the product of a deterministic and a stochastic term. So far, the model choice has been driven not only by the capacity to capture empirically observed market features well, but also by the computational tractability of the calibration process. This is now undergoing a big change, since machine learning technologies offer new perspectives on model calibration. In this thesis we consider the calibration problem to be the search for a model which generates given market prices and where, additionally, technology from generative adversarial networks can be used. This means parametrizing the model pool in a way which is accessible for machine learning techniques and interpreting the inverse problem as a training task of a generative network, whose quality is assessed by an adversary. The calibration algorithm proposed for LSV models uses as generative models so-called neural stochastic differential equations (SDEs), which simply means parameterizing the drift and volatility of an Itô SDE by neural networks.
8

陳穎志, and Wing-chi Chan. "Modelling of nonlinear stochastic systems using neural and neurofuzzy networks". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31241475.

9

Chan, Wing-chi. "Modelling of nonlinear stochastic systems using neural and neurofuzzy networks". Hong Kong: University of Hong Kong, 2001. http://sunzi.lib.hku.hk/hkuto/record.jsp?B22925843.

10

Rising, Barry John Paul. "Hardware architectures for stochastic bit-stream neural networks : design and implementation". Thesis, Royal Holloway, University of London, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.326219.


Books on the topic "Stochastic neural networks"

1

Zhou, Wuneng, Jun Yang, Liuwei Zhou, and Dongbing Tong. Stability and Synchronization Control of Stochastic Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-47833-2.

2

International Conference on Applied Stochastic Models and Data Analysis (12th: 2007: Chania, Greece), ed. Advances in data analysis: Theory and applications to reliability and inference, data mining, bioinformatics, lifetime data, and neural networks. Boston: Birkhäuser, 2010.

3

Zhu, Q. M. Fast orthogonal identification of nonlinear stochastic models and radial basis function neural networks. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1994.

4

Thathachar, Mandayam A. L. Networks of learning automata: Techniques for online stochastic optimization. Boston, MA: Kluwer Academic, 2003.

5

Sastry, P. S., ed. Networks of learning automata: Techniques for online stochastic optimization. Boston: Kluwer Academic, 2004.

6

Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems (2001, Baden-Baden, Germany). Proceedings of the Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems. Windsor, Ont.: International Institute for Advanced Studies in Systems Research and Cybernetics, 2002.

7

Curram, Stephen. Representing intelligent decision making in discrete event simulation: A stochastic neural network approach. [s.l.]: typescript, 1997.

8

Falmagne, Jean-Claude, David Eppstein, Christopher Doble, Dietrich Albert, and Xiangen Hu. Knowledge spaces: Applications in education. Heidelberg: Springer, 2013.

9

Falmagne, Jean-Claude. Knowledge Spaces: Applications in Education. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

10

Hangartner, Ricky Dale. Probabilistic computation in stochastic pulse neuromime networks. 1994.


Book chapters on the topic "Stochastic neural networks"

1

Rojas, Raúl. "Stochastic Networks". In Neural Networks, 371–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61068-4_14.

2

Müller, Berndt, and Joachim Reinhardt. "Stochastic Neurons". In Neural Networks, 37–44. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-97239-3_4.

3

Müller, Berndt, Joachim Reinhardt, and Michael T. Strickland. "Stochastic Neurons". In Neural Networks, 38–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/978-3-642-57760-4_4.

4

Hänggi, Martin, and George S. Moschytz. "Stochastic Optimization". In Cellular Neural Networks, 101–25. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/978-1-4757-3220-7_6.

5

Rennolls, Keith, Alan Soper, Phil Robbins, and Ray Guthrie. "Stochastic Neural Networks". In ICANN ’93, 481. London: Springer London, 1993. http://dx.doi.org/10.1007/978-1-4471-2063-6_122.

6

Zhang, Yumin, Lei Guo, Lingyao Wu, and Chunbo Feng. "On Stochastic Neutral Neural Networks". In Advances in Neural Networks — ISNN 2005, 69–74. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11427391_10.

7

Siegelmann, Hava T. "Stochastic Dynamics". In Neural Networks and Analog Computation, 121–39. Boston, MA: Birkhäuser Boston, 1999. http://dx.doi.org/10.1007/978-1-4612-0707-8_9.

8

Golea, Mostefa, Masahiro Matsuoka, and Yasubumi Sakakibara. "Stochastic simple recurrent neural networks". In Grammatical Inference: Learning Syntax from Sentences, 262–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/bfb0033360.

9

Hidalgo, Jorge, Luís F. Seoane, Jesús M. Cortés, and Miguel A. Muñoz. "Stochastic Amplification in Neural Networks". In Trends in Mathematics, 45–49. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-08138-0_9.

10

Herve, Thierry, Olivier Francois, and Jacques Demongeot. "Markovian spatial properties of a random field describing a stochastic neural network: Sequential or parallel implementation?" In Neural Networks, 81–89. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/3-540-52255-7_29.


Conference papers on the topic "Stochastic neural networks"

1

Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks". In ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2020. http://dx.doi.org/10.1109/icassp40776.2020.9054424.

2

Zhao, J. "Stochastic connection neural networks". In 4th International Conference on Artificial Neural Networks. IEE, 1995. http://dx.doi.org/10.1049/cp:19950525.

3

Chien, Jen-Tzung, and Yu-Min Huang. "Stochastic Convolutional Recurrent Networks". In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206970.

4

Galan-Prado, Fabio, Alejandro Moran, Joan Font, Miquel Roca, and Josep L. Rossello. "Stochastic Radial Basis Neural Networks". In 2019 29th International Symposium on Power and Timing Modeling, Optimization and Simulation (PATMOS). IEEE, 2019. http://dx.doi.org/10.1109/patmos.2019.8862129.

5

Ramakrishnan, Swathika, and Dhireesha Kudithipudi. "On accelerating stochastic neural networks". In NANOCOM '17: ACM The Fourth Annual International Conference on Nanoscale Computing and Communication. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3109453.3123959.

6

Weller, Dennis D., Nathaniel Bleier, Michael Hefenbrock, Jasmin Aghassi-Hagmann, Michael Beigl, Rakesh Kumar, and Mehdi B. Tahoori. "Printed Stochastic Computing Neural Networks". In 2021 Design, Automation & Test in Europe Conference & Exhibition (DATE). IEEE, 2021. http://dx.doi.org/10.23919/date51398.2021.9474254.

7

Gulshad, Sadaf, Dick Sigmund, and Jong-Hwan Kim. "Learning to reproduce stochastic time series using stochastic LSTM". In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965942.

8

Nikolic, Konstantin P., and Ivan B. Scepanovic. "Stochastic search-based neural networks learning algorithms". In 2008 9th Symposium on Neural Network Applications in Electrical Engineering. IEEE, 2008. http://dx.doi.org/10.1109/neurel.2008.4685579.

9

Juuti, Mika, Francesco Corona, and Juha Karhunen. "Stochastic Discriminant Analysis". In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280609.

10

Kosko, B. "Stochastic competitive learning". In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137718.


Reports on the topic "Stochastic neural networks"

1

Burton, Robert M., Jr. Topics in Stochastics, Symbolic Dynamics and Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1996. http://dx.doi.org/10.21236/ada336426.
