Selected scientific literature on the topic "Stochastic neural networks"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles


Browse the list of current articles, books, theses, conference proceedings, and other scholarly sources relevant to the topic "Stochastic neural networks".

Next to each source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Stochastic neural networks"

1. Reddy, BhanuTeja, and Usha J.C. "Prediction of Stock Market using Stochastic Neural Networks." International Journal of Innovative Research in Computer Science & Technology 7, no. 5 (September 2019): 128–38. http://dx.doi.org/10.21276/ijircst.2019.7.5.1.
2. Wong, Eugene. "Stochastic neural networks." Algorithmica 6, no. 1-6 (June 1991): 466–78. http://dx.doi.org/10.1007/bf01759054.
3. Zhou, Wuneng, Xueqing Yang, Jun Yang, and Jun Zhou. "Stochastic Synchronization of Neutral-Type Neural Networks with Multidelays Based on M-Matrix." Discrete Dynamics in Nature and Society 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/826810.
Abstract: The problem of stochastic synchronization of neutral-type neural networks with multiple delays is studied via an M-matrix approach. First, a control law is designed for stochastic synchronization of the neutral-type, multiple-time-delay network. Second, using a Lyapunov functional and the M-matrix method, a criterion is obtained under which the drive and response neutral-type multiple-time-delay neural networks with stochastic disturbance and Markovian switching achieve stochastic synchronization. The synchronization condition is expressed as a linear matrix inequality, which can be solved easily in MATLAB. Finally, a numerical example illustrates the effectiveness of the method.
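
Criteria of this kind reduce to checking feasibility of a Lyapunov-type linear matrix inequality, that is, finding a positive-definite matrix P satisfying a matrix inequality. The sketch below shows that flavor of computation for a plain linear system with a placeholder matrix A; it is not the paper's neutral-type, Markovian-switching criterion.

```python
# Illustrative sketch only: verify a Lyapunov/LMI-type stability condition
# A^T P + P A = -Q with P > 0 for a placeholder matrix A (not the paper's model).
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(0)
n = 4
A = -2.0 * np.eye(n) + 0.3 * rng.standard_normal((n, n))   # made-up "system" matrix
Q = np.eye(n)

# Solve A^T P + P A = -Q; if the solution P is positive definite, the criterion holds.
P = solve_continuous_lyapunov(A.T, -Q)
eigs = np.linalg.eigvalsh((P + P.T) / 2)                    # symmetrize against round-off
print("P positive definite:", bool(np.all(eigs > 0)))
```
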
4. Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks." Journal of Physics A: Mathematical and General 36, no. 9 (February 19, 2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.
5. Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks." IEEE Transactions on Signal Processing 69 (2021): 4428–43. http://dx.doi.org/10.1109/tsp.2021.3092336.
6. Sakthivel, Rathinasamy, R. Samidurai, and S. Marshal Anthoni. "Exponential Stability for Stochastic Neural Networks of Neutral Type with Impulsive Effects." Modern Physics Letters B 24, no. 11 (May 10, 2010): 1099–110. http://dx.doi.org/10.1142/s0217984910023141.
Abstract: This paper is concerned with the exponential stability of stochastic neural networks of neutral type with impulsive effects. By employing a Lyapunov functional and stochastic analysis, a new stability criterion for the stochastic neural network is derived in terms of a linear matrix inequality. A numerical example is provided to show the effectiveness and applicability of the obtained result.
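
One way to see mean-square exponential stability in practice is a Monte Carlo check that log E|x(t)|^2 decays roughly linearly in time. The toy model and parameters below are placeholders for illustration, not the neutral-type impulsive system analyzed in the paper.

```python
# Illustrative only: Monte Carlo check of mean-square exponential stability for a
# toy stochastic Hopfield-type network dx = (-a*x + W*tanh(x)) dt + sigma*x dW.
import numpy as np

rng = np.random.default_rng(1)
n, paths, T, dt = 4, 2000, 5.0, 1e-3
a, sigma = 2.0, 0.3
W = 0.2 * rng.standard_normal((n, n))          # small connection weights

steps = int(T / dt)
x = np.ones((paths, n))                        # common initial condition
norms = np.empty(steps)
for k in range(steps):
    dWt = rng.standard_normal((paths, n)) * np.sqrt(dt)
    x = x + (-a * x + np.tanh(x) @ W.T) * dt + sigma * x * dWt   # Euler-Maruyama step
    norms[k] = np.mean(np.sum(x**2, axis=1))   # estimate of E|x(t)|^2

# A roughly linear decay of log E|x|^2 indicates mean-square exponential stability.
rate = np.polyfit(np.arange(steps) * dt, np.log(norms), 1)[0]
print(f"estimated mean-square decay rate: {rate:.2f} (negative => stable)")
```
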
7. Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays." Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.
Abstract: We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize the upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays by means of a transcendental equation. Moreover, we prove that, for any globally exponentially stable hybrid stochastic neural network, if the additional neutral terms and time-varying delays remain below these upper bounds, the perturbed network is guaranteed to remain globally exponentially stable. Finally, a numerical simulation example illustrates the presented criteria.
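
Bounds of the kind described here are typically obtained as the unique positive root of a transcendental equation. The sketch below shows that computation for a made-up criterion function with the usual "linear term minus growing exponential" shape; the constants and the equation itself are illustrative placeholders, not the paper's formula.

```python
# Hedged illustration: the admissible perturbation size tau* is the unique positive
# root of a placeholder transcendental equation, found by bracketed root-finding.
import numpy as np
from scipy.optimize import brentq

alpha, beta, lam = 1.0, 0.4, 2.0      # placeholder constants (margin, gain, growth rate)

def criterion(tau):
    # remaining stability margin after a delay/neutral-term perturbation of size tau
    return alpha - beta * tau * np.exp(lam * tau)

tau_upper = brentq(criterion, 0.0, 5.0)   # sign change guarantees a root on this bracket
print(f"perturbations below tau* = {tau_upper:.4f} preserve exponential stability (toy model)")
```
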
8. Kanarachos, Andreas E., and Kleanthis T. Geramanis. "Semi-Stochastic Complex Neural Networks." IFAC Proceedings Volumes 31, no. 12 (June 1998): 47–52. http://dx.doi.org/10.1016/s1474-6670(17)36040-8.
9. Peretto, Pierre, and Jean-Jacques Niez. "Stochastic Dynamics of Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics 16, no. 1 (January 1986): 73–83. http://dx.doi.org/10.1109/tsmc.1986.289283.
10. Hurtado, Jorge E. "Neural networks in stochastic mechanics." Archives of Computational Methods in Engineering 8, no. 3 (September 2001): 303–42. http://dx.doi.org/10.1007/bf02736646.

Theses / dissertations on the topic "Stochastic neural networks"

1. Pensuwon, Wanida. "Stochastic dynamic hierarchical neural networks." Thesis, University of Hertfordshire, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366030.
2. Campos, Luciana Conceicao Dias. "Periodic Stochastic Model Based on Neural Networks." Pontifícia Universidade Católica do Rio de Janeiro, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=17076@1.
Abstract: Stochastic processes form a branch of probability theory that defines models for studying problems with random components. Many real problems exhibit complex characteristics, such as nonlinearity and chaotic behavior, and require models capable of capturing the true characteristics of the problem. Existing models, however, are either linear (and thus often inadequate for such problems), require complex and problem-specific formulations, or depend on a priori assumptions about the behavior of the problem. This motivated the development of a new, generic, intrinsically nonlinear stochastic process model that can be applied to a range of problems involving highly stochastic, nonlinear, and even periodic phenomena. Since artificial neural networks are nonlinear parametric models that are simple to understand and implement and capable of capturing the behavior of many kinds of problems, they are used as the basis of the model proposed in this thesis, called the Neural Stochastic Process. The nonlinearity provided by the neural networks allows the model to adequately capture the behavior of historical series from nonlinear, highly stochastic, and even periodic phenomena. The goal is to use this model to generate synthetic time series that are as equally likely as the historical series, for problems involving, for example, climatological or economic phenomena. As a case study, the proposed model is applied to monthly inflows in the context of operation planning for the Brazilian hydrothermal system. The results show that the Neural Stochastic Process generates synthetic series with characteristics similar to the historical monthly inflow series.
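
As a loose illustration of the idea (not the thesis's actual Neural Stochastic Process), the sketch below fits a small neural network to one-step-ahead prediction of a periodic, noisy series and then generates a synthetic series by iterating the prediction and adding resampled residual noise. The data, lag order, and architecture are all placeholder choices.

```python
# Hedged sketch: generate synthetic series from a neural one-step-ahead model plus noise.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(600)
series = 10 + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)  # toy "monthly" data

p = 12                                           # number of lags fed to the network
X = np.stack([series[i:i + p] for i in range(len(series) - p)])
y = series[p:]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0).fit(X, y)
resid_std = np.std(y - model.predict(X))         # noise level taken from the residuals

# Generate one synthetic series of the same length, plausible under the fitted model.
window = list(series[:p])
synthetic = []
for _ in range(len(series) - p):
    nxt = model.predict(np.array(window[-p:]).reshape(1, -1))[0] + rng.normal(0, resid_std)
    synthetic.append(nxt)
    window.append(nxt)
print("synthetic mean/std:", np.mean(synthetic), np.std(synthetic))
```
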
3. Sutherland, Connie. "Spatio-temporal feedback in stochastic neural networks." Thesis, University of Ottawa (Canada), 2007. http://hdl.handle.net/10393/27559.
Abstract: The mechanisms by which groups of neurons interact are an important facet of understanding how the brain functions. Here we study stochastic neural networks with delayed feedback. The first part of the study examines how feedback and noise affect the mean firing rate of the network; the second examines how the spatial profile of the feedback affects network behavior. Our numerical and theoretical results show that negative (inhibitory) feedback linearizes the frequency versus input current (f-I) curve through the divisive gain effect it has on the network. The interaction of the inhibitory feedback with the input bias is what produces the divisive decrease in the slope (the gain) of the f-I curve. Our work predicts that an increase in noise is required, along with an increase in inhibitory feedback, to obtain the divisive and subtractive shifts of the gain seen in experiments [1]. Our results also show that, although the spatial profile of the feedback does not affect the mean activity of the network, it does influence its overall dynamics. Local feedback generates a network oscillation that is more robust against disruption by noise, uncorrelated input, or network heterogeneity than the oscillation produced by global (all-to-all) feedback. For example, uncorrelated input completely disrupts the network oscillation generated by global feedback but only diminishes the oscillation due to local feedback. This is characterized by first- and second-order spike-train statistics. Further, our theory agrees well with numerical simulations of the network dynamics.

4. Ling, Hong. "Implementation of Stochastic Neural Networks for Approximating Random Processes." Master's thesis, Lincoln University, Environment, Society and Design Division, 2007. http://theses.lincoln.ac.nz/public/adt-NZLIU20080108.124352/.
Abstract: Artificial neural networks (ANNs) can be viewed as mathematical models that simulate natural and biological systems by mimicking the information-processing methods of the human brain. Current ANNs focus only on approximating arbitrary deterministic input-output mappings; they neither adequately represent the variability observed in systems' natural settings nor capture the complexity of whole-system behaviour. This thesis develops a new class of neural networks, called Stochastic Neural Networks (SNNs), to simulate the internal stochastic properties of systems. The mathematical model for SNNs is based on the canonical representation of stochastic processes by means of the Karhunen-Loève theorem. Successful real examples, such as the analysis of the full displacement field of wood in compression, confirm the validity of the proposed networks. Furthermore, an analysis of the internal workings of SNNs gives an in-depth view of their operation and a better understanding of how they simulate stochastic processes.
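
The canonical representation mentioned in the abstract is the Karhunen-Loève expansion: a random process written as a series of deterministic basis functions with independent random coefficients. The sketch below uses the classical closed-form KL expansion of Brownian motion on [0, 1] as a stand-in; it illustrates the representation itself, not the thesis's SNN architecture.

```python
# Minimal sketch of a truncated Karhunen-Loeve expansion of Brownian motion,
# W(t) ~= sum_k Z_k * sqrt(2) * sin((k - 1/2) pi t) / ((k - 1/2) pi), Z_k iid N(0, 1).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 501)
K = 50                                    # number of retained KL modes

def kl_brownian_path(rng, t, K):
    """One approximate Brownian path built from K deterministic modes and K Gaussians."""
    k = np.arange(1, K + 1)
    freqs = (k - 0.5) * np.pi
    Z = rng.standard_normal(K)            # independent N(0,1) coefficients
    basis = np.sqrt(2) * np.sin(np.outer(t, freqs)) / freqs
    return basis @ Z

paths = np.array([kl_brownian_path(rng, t, K) for _ in range(2000)])
# Sanity check: Var[W(t)] should be close to t for a Brownian motion.
print("empirical variance at t=1:", paths[:, -1].var(), "(target ~ 1.0)")
```
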
5. Zhao, Jieyu. "Stochastic bit stream neural networks: theory, simulations and applications." Thesis, Royal Holloway, University of London, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.338916.
6. Hyland, P. "On the implementation of neural networks using stochastic arithmetic." Thesis, Bangor University, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306224.
7. Todeschi, Tiziano. "Calibration of local-stochastic volatility models with neural networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/23052/.
Abstract: During the last twenty years several models have been proposed to improve the classic Black-Scholes framework for equity derivatives pricing. Recently a new model has been proposed: the Local-Stochastic Volatility (LSV) model, which treats volatility as the product of a deterministic term and a stochastic term. So far, model choice has been driven not only by the capacity to capture empirically observed market features, but also by the computational tractability of the calibration process. This is now changing, since machine learning technologies offer new perspectives on model calibration. In this thesis the calibration problem is viewed as the search for a model that generates given market prices, so that technology from generative adversarial networks can be used. This means parametrizing the model pool in a way that is accessible to machine learning techniques and interpreting the inverse problem as the training task of a generative network whose quality is assessed by an adversary. The calibration algorithm proposed for LSV models uses as generative models so-called neural stochastic differential equations (SDEs), which simply means parameterizing the drift and volatility of an Itô SDE by neural networks.
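
The last sentence describes a neural SDE: the drift and volatility of an Itô SDE are parameterized by neural networks. Below is a minimal sketch of that structure, with tiny randomly initialized networks and an Euler-Maruyama simulation; in an actual calibration the weights would be trained (for example adversarially) against market prices. All names and parameters here are illustrative.

```python
# Rough sketch of a "neural SDE": dS = mu(t, S) S dt + sigma(t, S) S dW, where mu and
# sigma are small neural networks. Weights are random here, not calibrated.
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(rng, in_dim=2, hidden=16):
    """A tiny tanh MLP with random weights, mapping (t, S) -> scalar, vectorized over S."""
    W1 = rng.standard_normal((in_dim, hidden)) / np.sqrt(in_dim)
    b1 = np.zeros(hidden)
    W2 = rng.standard_normal((hidden, 1)) / np.sqrt(hidden)
    b2 = np.zeros(1)
    def net(t, S):
        x = np.stack([np.full_like(S, t), S], axis=-1)
        return (np.tanh(x @ W1 + b1) @ W2 + b2)[..., 0]
    return net

drift_net = make_mlp(rng)                 # mu_theta(t, S)
vol_net = make_mlp(rng)                   # sigma_theta(t, S), squashed below to stay positive

paths, steps, T, S0 = 10_000, 250, 1.0, 1.0
dt = T / steps
S = np.full(paths, S0)
for k in range(steps):
    t = k * dt
    mu = 0.1 * drift_net(t, S)
    sigma = 0.2 * (1.0 + np.tanh(vol_net(t, S)))      # keep volatility positive
    S = S + mu * S * dt + sigma * S * rng.standard_normal(paths) * np.sqrt(dt)

print("simulated mean / std at T:", S.mean(), S.std())
```
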
8

陳穎志 e Wing-chi Chan. "Modelling of nonlinear stochastic systems using neural and neurofuzzy networks". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2001. http://hub.hku.hk/bib/B31241475.

Texto completo da fonte
Estilos ABNT, Harvard, Vancouver, APA, etc.
9. Chan, Wing-chi. "Modelling of nonlinear stochastic systems using neural and neurofuzzy networks." Hong Kong: University of Hong Kong, 2001. http://sunzi.lib.hku.hk/hkuto/record.jsp?B22925843.
10. Rising, Barry John Paul. "Hardware architectures for stochastic bit-stream neural networks: design and implementation." Thesis, Royal Holloway, University of London, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.326219.

Books on the topic "Stochastic neural networks"

1. Zhou, Wuneng, Jun Yang, Liuwei Zhou, and Dongbing Tong. Stability and Synchronization Control of Stochastic Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-47833-2.
2. International Conference on Applied Stochastic Models and Data Analysis (12th: 2007: Chania, Greece), ed. Advances in data analysis: Theory and applications to reliability and inference, data mining, bioinformatics, lifetime data, and neural networks. Boston: Birkhäuser, 2010.
3. Zhu, Q. M. Fast orthogonal identification of nonlinear stochastic models and radial basis function neural networks. Sheffield: University of Sheffield, Dept. of Automatic Control and Systems Engineering, 1994.
4. Thathachar, Mandayam A. L. Networks of learning automata: Techniques for online stochastic optimization. Boston, MA: Kluwer Academic, 2003.
5

S, Sastry P., ed. Networks of learning automata: Techniques for online stochastic optimization. Boston: Kluwer Academic, 2004.

Encontre o texto completo da fonte
Estilos ABNT, Harvard, Vancouver, APA, etc.
6

Focus, Symposium on Learning and Adaptation in Stochastic and Statistical Systems (2001 Baden-Baden Germany). Proceedings of the Focus Symposium on Learning and Adaptation in Stochastic and Statistical Systems. Windsor, Ont: International Institute for Advanced Studies in Systems Research and Cybernetics, 2002.

Encontre o texto completo da fonte
Estilos ABNT, Harvard, Vancouver, APA, etc.
7

Curram, Stephen. representing intelligent decision making in discrete event simulation: a stochastic neural network approach. [s.l.]: typescript, 1997.

Encontre o texto completo da fonte
Estilos ABNT, Harvard, Vancouver, APA, etc.
8

Falmagne, Jean-Claude, David Eppstein, Christopher Doble, Dietrich Albert e Xiangen Hu. Knowledge spaces: Applications in education. Heidelberg: Springer, 2013.

Encontre o texto completo da fonte
Estilos ABNT, Harvard, Vancouver, APA, etc.
9. Falmagne, Jean-Claude. Knowledge Spaces: Applications in Education. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.
10. Hangartner, Ricky Dale. Probabilistic computation in stochastic pulse neuromime networks. 1994.

Book chapters on the topic "Stochastic neural networks"

1. Rojas, Raúl. "Stochastic Networks." In Neural Networks, 371–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61068-4_14.
2. Müller, Berndt, and Joachim Reinhardt. "Stochastic Neurons." In Neural Networks, 37–44. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-97239-3_4.
3. Müller, Berndt, Joachim Reinhardt, and Michael T. Strickland. "Stochastic Neurons." In Neural Networks, 38–45. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/978-3-642-57760-4_4.
4. Hänggi, Martin, and George S. Moschytz. "Stochastic Optimization." In Cellular Neural Networks, 101–25. Boston, MA: Springer US, 2000. http://dx.doi.org/10.1007/978-1-4757-3220-7_6.
5. Rennolls, Keith, Alan Soper, Phil Robbins, and Ray Guthrie. "Stochastic Neural Networks." In ICANN ’93, 481. London: Springer London, 1993. http://dx.doi.org/10.1007/978-1-4471-2063-6_122.
6. Zhang, Yumin, Lei Guo, Lingyao Wu, and Chunbo Feng. "On Stochastic Neutral Neural Networks." In Advances in Neural Networks — ISNN 2005, 69–74. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11427391_10.
7. Siegelmann, Hava T. "Stochastic Dynamics." In Neural Networks and Analog Computation, 121–39. Boston, MA: Birkhäuser Boston, 1999. http://dx.doi.org/10.1007/978-1-4612-0707-8_9.
8. Golea, Mostefa, Masahiro Matsuoka, and Yasubumi Sakakibara. "Stochastic simple recurrent neural networks." In Grammatical Inference: Learning Syntax from Sentences, 262–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/bfb0033360.
9. Hidalgo, Jorge, Luís F. Seoane, Jesús M. Cortés, and Miguel A. Muñoz. "Stochastic Amplification in Neural Networks." In Trends in Mathematics, 45–49. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-08138-0_9.
10. Herve, Thierry, Olivier Francois, and Jacques Demongeot. "Markovian spatial properties of a random field describing a stochastic neural network: Sequential or parallel implementation?" In Neural Networks, 81–89. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/3-540-52255-7_29.

Conference papers on the topic "Stochastic neural networks"

1. Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks." In ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2020. http://dx.doi.org/10.1109/icassp40776.2020.9054424.
2. Zhao, J. "Stochastic connection neural networks." In 4th International Conference on Artificial Neural Networks. IEE, 1995. http://dx.doi.org/10.1049/cp:19950525.
3. Chien, Jen-Tzung, and Yu-Min Huang. "Stochastic Convolutional Recurrent Networks." In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206970.
4. Galan-Prado, Fabio, Alejandro Moran, Joan Font, Miquel Roca, and Josep L. Rossello. "Stochastic Radial Basis Neural Networks." In 2019 29th International Symposium on Power and Timing Modeling, Optimization and Simulation (PATMOS). IEEE, 2019. http://dx.doi.org/10.1109/patmos.2019.8862129.
5. Ramakrishnan, Swathika, and Dhireesha Kudithipudi. "On accelerating stochastic neural networks." In NANOCOM '17: The Fourth Annual ACM International Conference on Nanoscale Computing and Communication. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3109453.3123959.
6. Weller, Dennis D., Nathaniel Bleier, Michael Hefenbrock, Jasmin Aghassi-Hagmann, Michael Beigl, Rakesh Kumar, and Mehdi B. Tahoori. "Printed Stochastic Computing Neural Networks." In 2021 Design, Automation & Test in Europe Conference & Exhibition (DATE). IEEE, 2021. http://dx.doi.org/10.23919/date51398.2021.9474254.
7. Gulshad, Sadaf, Dick Sigmund, and Jong-Hwan Kim. "Learning to reproduce stochastic time series using stochastic LSTM." In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965942.
8. Nikolic, Konstantin P., and Ivan B. Scepanovic. "Stochastic search-based neural networks learning algorithms." In 2008 9th Symposium on Neural Network Applications in Electrical Engineering. IEEE, 2008. http://dx.doi.org/10.1109/neurel.2008.4685579.
9. Juuti, Mika, Francesco Corona, and Juha Karhunen. "Stochastic Discriminant Analysis." In 2015 International Joint Conference on Neural Networks (IJCNN). IEEE, 2015. http://dx.doi.org/10.1109/ijcnn.2015.7280609.
10. Kosko, B. "Stochastic competitive learning." In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137718.

Organizational reports on the topic "Stochastic neural networks"

1. Burton, Robert M., Jr. Topics in Stochastics, Symbolic Dynamics and Neural Networks. Fort Belvoir, VA: Defense Technical Information Center, December 1996. http://dx.doi.org/10.21236/ada336426.