
Journal articles on the topic "Stochastic neural networks"

Listed below are the 50 best journal articles on the topic "Stochastic neural networks".

1. Reddy, BhanuTeja, and Usha J.C. "Prediction of Stock Market using Stochastic Neural Networks". International Journal of Innovative Research in Computer Science & Technology 7, no. 5 (September 2019): 128–38. http://dx.doi.org/10.21276/ijircst.2019.7.5.1.

2. Wong, Eugene. "Stochastic neural networks". Algorithmica 6, no. 1-6 (June 1991): 466–78. http://dx.doi.org/10.1007/bf01759054.

3. Zhou, Wuneng, Xueqing Yang, Jun Yang, and Jun Zhou. "Stochastic Synchronization of Neutral-Type Neural Networks with Multidelays Based on M-Matrix". Discrete Dynamics in Nature and Society 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/826810.

Abstract:
The problem of stochastic synchronization of neutral-type neural networks with multiple delays, based on the M-matrix approach, is investigated. First, we design a control law for stochastic synchronization of neutral-type neural networks with multiple time delays. Second, using a Lyapunov functional and the M-matrix method, we obtain a criterion under which the drive and response neutral-type multiple-time-delay neural networks with stochastic disturbance and Markovian switching achieve stochastic synchronization. The synchronization condition is expressed as a linear matrix inequality that can be easily solved with MATLAB. Finally, a numerical example illustrates the effectiveness of the method and of the result obtained in this paper.
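
For orientation, systems of this kind are usually written as neutral-type stochastic delayed networks. The display below is a sketch of the standard setup with a single delay, not the paper's exact multi-delay system:

```latex
d\big[x(t) - D\,x(t-\tau(t))\big]
  = \big[-C\,x(t) + A\,f(x(t)) + B\,f(x(t-\tau(t)))\big]\,dt
  + \sigma\big(t,\, x(t),\, x(t-\tau(t))\big)\,dw(t)
```

Here w(t) is a Brownian motion and, under Markovian switching, the coefficient matrices C, A, B, D depend on a continuous-time Markov chain r(t). A response copy driven by a control u(t) is synchronized when the error e(t) = y(t) - x(t) tends to zero in the appropriate stochastic sense.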

4. Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks". Journal of Physics A: Mathematical and General 36, no. 9 (February 19, 2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.

5. Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks". IEEE Transactions on Signal Processing 69 (2021): 4428–43. http://dx.doi.org/10.1109/tsp.2021.3092336.

6. Sakthivel, Rathinasamy, R. Samidurai, and S. Marshal Anthoni. "Exponential Stability for Stochastic Neural Networks of Neutral Type with Impulsive Effects". Modern Physics Letters B 24, no. 11 (May 10, 2010): 1099–110. http://dx.doi.org/10.1142/s0217984910023141.

Abstract:
This paper is concerned with the exponential stability of stochastic neural networks of neutral type with impulsive effects. By employing a Lyapunov functional and stochastic analysis, a new stability criterion for the stochastic neural network is derived in terms of a linear matrix inequality. A numerical example is provided to show the effectiveness and applicability of the obtained result.

7. Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays". Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.

Abstract:
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize the upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays by using a transcendental equation. Moreover, we prove theoretically that, for any globally exponentially stable hybrid stochastic neural network, if the additive neutral terms and time-varying delays remain below the derived upper bounds, then the perturbed network is guaranteed to remain globally exponentially stable. Finally, a numerical simulation example illustrates the presented criteria.

8. Kanarachos, Andreas E., and Kleanthis T. Geramanis. "Semi-Stochastic Complex Neural Networks". IFAC Proceedings Volumes 31, no. 12 (June 1998): 47–52. http://dx.doi.org/10.1016/s1474-6670(17)36040-8.

9. Peretto, Pierre, and Jean-Jacques Niez. "Stochastic Dynamics of Neural Networks". IEEE Transactions on Systems, Man, and Cybernetics 16, no. 1 (January 1986): 73–83. http://dx.doi.org/10.1109/tsmc.1986.289283.

10. Hurtado, Jorge E. "Neural networks in stochastic mechanics". Archives of Computational Methods in Engineering 8, no. 3 (September 2001): 303–42. http://dx.doi.org/10.1007/bf02736646.

11. Park, Ju H., and O. M. Kwon. "Synchronization of Neural Networks of Neutral Type with Stochastic Perturbation". Modern Physics Letters B 23, no. 14 (June 10, 2009): 1743–51. http://dx.doi.org/10.1142/s0217984909019909.

Abstract:
In this letter, the problem of designing a feedback controller to achieve synchronization for neural networks of neutral type with stochastic perturbation is considered. Based on the Lyapunov method and the LMI (linear matrix inequality) framework, the goal of this letter is to derive an existence criterion for a controller that synchronizes the master and response networks.

12. Park, Ju H., and O. M. Kwon. "Analysis on Global Stability of Stochastic Neural Networks of Neutral Type". Modern Physics Letters B 22, no. 32 (December 30, 2008): 3159–70. http://dx.doi.org/10.1142/s0217984908017680.

Abstract:
In this paper, the problem of global asymptotic stability of stochastic neural networks of neutral type is considered. Based on Lyapunov stability theory, a new delay-dependent stability criterion for the network is derived in terms of an LMI (linear matrix inequality). A numerical example is given to show the effectiveness of the proposed method.

13. Hua, Min Gang, Jun Tao Fei, and Wei Li Dai. "Exponential Stability Analysis for Neutral Stochastic Delayed Neural Networks". Advanced Materials Research 354-355 (October 2011): 877–80. http://dx.doi.org/10.4028/www.scientific.net/amr.354-355.877.

Abstract:
In this paper, the generalized Finsler lemma and an augmented Lyapunov functional are introduced to establish improved delay-dependent stability criteria for neutral stochastic delayed neural networks. The new stability criteria improve and generalize existing ones. Two examples are included to show the effectiveness of the results.

14. Verikas, A., and A. Gelzinis. "Training neural networks by stochastic optimisation". Neurocomputing 30, no. 1-4 (January 2000): 153–72. http://dx.doi.org/10.1016/s0925-2312(99)00123-x.

15. Blythe, Steve, Xuerong Mao, and Xiaoxin Liao. "Stability of stochastic delay neural networks". Journal of the Franklin Institute 338, no. 4 (July 2001): 481–95. http://dx.doi.org/10.1016/s0016-0032(01)00016-3.

16. Yang, Yunfeng, and Fengxian Tang. "Network Intrusion Detection Based on Stochastic Neural Networks Method". International Journal of Security and Its Applications 10, no. 8 (August 31, 2016): 435–46. http://dx.doi.org/10.14257/ijsia.2016.10.8.38.

17. Borovkov, K., G. Decrouez, and M. Gilson. "On Stationary Distributions of Stochastic Neural Networks". Journal of Applied Probability 51, no. 3 (September 2014): 837–57. http://dx.doi.org/10.1017/s0021900200011700.

Abstract:
The paper deals with nonlinear Poisson neuron network models with bounded memory dynamics, which can include both Hebbian learning mechanisms and refractory periods. The state of the network is described by the times elapsed since its neurons fired within the post-synaptic transfer kernel memory span, and the current strengths of synaptic connections, the state spaces of our models being hierarchies of finite-dimensional components. We prove the ergodicity of the stochastic processes describing the behaviour of the networks, establish the existence of continuously differentiable stationary distribution densities (with respect to the Lebesgue measures of corresponding dimensionality) on the components of the state space, and find upper bounds for them. For the density components, we derive a system of differential equations that can be solved in only a few of the simplest cases. Approaches to approximate computation of the stationary density are discussed. One approach is to reduce the dimensionality of the problem by modifying the network so that each neuron cannot fire if the number of spikes it emitted within the post-synaptic transfer kernel memory span reaches a given threshold. We show that the stationary distribution of this ‘truncated’ network converges to that of the unrestricted network as the threshold increases, and that the convergence is at a superexponential rate. A complementary approach uses discrete Markov chain approximations to the network process.

18. Borovkov, K., G. Decrouez, and M. Gilson. "On Stationary Distributions of Stochastic Neural Networks". Journal of Applied Probability 51, no. 3 (September 2014): 837–57. http://dx.doi.org/10.1239/jap/1409932677.

19. Shi, Xiang Dong. "Mean Square Asymptotic Stability of Neutral Stochastic Neural Networks with Multiple Time-Varying Delays". Advanced Materials Research 684 (April 2013): 579–82. http://dx.doi.org/10.4028/www.scientific.net/amr.684.579.

Abstract:
The paper considers the problem of almost sure asymptotic stability for neutral stochastic neural networks with multiple time-varying delays. By applying the Lyapunov functional method and differential inequality techniques, new sufficient conditions are established that ensure the existence and almost sure asymptotic stability of neutral stochastic neural networks with multiple time-varying delays. The results generalize some previously published results and are less conservative than existing ones.

20. Akbar, Mutaqin. "Traffic sign recognition using convolutional neural networks". Jurnal Teknologi dan Sistem Komputer 9, no. 2 (March 5, 2021): 120–25. http://dx.doi.org/10.14710/jtsiskom.2021.13959.

Abstract:
Traffic sign recognition (TSR) recognizes traffic signs by means of image processing. This paper presents traffic sign recognition in Indonesia using convolutional neural networks (CNN). The dataset comprises 2050 images of traffic signs covering 10 sign classes. The CNN used in this study consists of one convolution layer, one pooling layer using the max-pool operation, and one fully connected layer. The training algorithm is stochastic gradient descent (SGD). At the training stage, using 1750 training images, 48 filters, and a learning rate of 0.005, the network reaches a loss of 0.005 and an accuracy of 100%. At the testing stage, on 300 test images, the system recognizes the signs with a loss of 0.107 and an accuracy of 97.33%.
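
As a concrete illustration of the architecture the abstract describes (one convolution layer with 48 filters, one max-pooling layer, one fully connected layer, SGD with learning rate 0.005), here is a minimal PyTorch sketch. The 32x32 RGB input size and 3x3 kernel are assumptions not stated in the abstract:

```python
import torch
import torch.nn as nn

class TrafficSignNet(nn.Module):
    def __init__(self, num_classes=10, num_filters=48):
        super().__init__()
        self.conv = nn.Conv2d(3, num_filters, kernel_size=3)  # 32x32 -> 30x30
        self.pool = nn.MaxPool2d(2)                           # 30x30 -> 15x15
        self.fc = nn.Linear(num_filters * 15 * 15, num_classes)

    def forward(self, x):
        x = torch.relu(self.pool(self.conv(x)))
        return self.fc(x.flatten(1))

model = TrafficSignNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.005)  # rate from the abstract
criterion = nn.CrossEntropyLoss()

# One SGD step on a dummy batch; the paper trains on 1750 real sign images.
images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
loss = criterion(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```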

21. Liu, Qingliang, and Jinmei Lai. "Stochastic Loss Function". Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 4884–91. http://dx.doi.org/10.1609/aaai.v34i04.5925.

Abstract:
Training deep neural networks is inherently subject to predefined, fixed loss functions during optimization. To improve learning efficiency, we develop the Stochastic Loss Function (SLF), which dynamically and automatically generates appropriate gradients to train deep networks in the same round of back-propagation while maintaining the completeness and differentiability of the training pipeline. In SLF, a generic loss function is formulated as a joint optimization problem over network weights and loss parameters. To guarantee the requisite efficiency, gradients with respect to the generic differentiable loss are leveraged for selecting the loss function and optimizing network weights. Extensive experiments on a variety of popular datasets strongly demonstrate that SLF obtains appropriate gradients at different stages of training and can significantly improve the performance of various deep models on real-world tasks including classification, clustering, regression, neural machine translation, and object detection.

22. Emmert-Streib, Frank. "A Heterosynaptic Learning Rule for Neural Networks". International Journal of Modern Physics C 17, no. 10 (October 2006): 1501–20. http://dx.doi.org/10.1142/s0129183106009916.

Abstract:
In this article we introduce a novel, neurobiologically motivated stochastic Hebb-like learning rule for neural networks. This learning rule combines features of unsupervised (Hebbian) and supervised (reinforcement) learning and is stochastic with respect to the selection of the time points at which a synapse is modified. Moreover, the learning rule does not only affect the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also affects more remote synapses of the pre- and postsynaptic neuron. This more complex form of synaptic plasticity has recently come under investigation in neurobiology and is called heterosynaptic plasticity. We demonstrate that this learning rule is useful for training neural networks to learn parity functions, including the exclusive-or (XOR) mapping, in a multilayer feed-forward network. We find that our stochastic learning rule works well even in the presence of noise. Importantly, the mean learning time increases only polynomially with the number of patterns to be learned, indicating efficient learning.

23. Rosselló, Josep L., Vincent Canals, Antoni Morro, and Antoni Oliver. "Hardware Implementation of Stochastic Spiking Neural Networks". International Journal of Neural Systems 22, no. 04 (July 25, 2012): 1250014. http://dx.doi.org/10.1142/s0129065712500141.

Abstract:
Spiking neural networks, the latest generation of artificial neural networks, are characterized by their bio-inspired nature and by a higher computational capacity than other neural models. In real biological neurons, stochastic processes represent an important mechanism of neural behavior and are responsible for their special arithmetic capabilities. In this work we present a simple hardware implementation of spiking neurons that takes this probabilistic nature into account. The advantage of the proposed implementation is that it is fully digital and can therefore be massively implemented in Field Programmable Gate Arrays. The high computational capabilities of the proposed model are demonstrated by the study of both feed-forward and recurrent networks that are able to implement high-speed signal filtering and to solve complex systems of linear equations.

24. Wang, Yanzhi, Zheng Zhan, Liang Zhao, Jian Tang, Siyue Wang, Jiayu Li, Bo Yuan, Wujie Wen, and Xue Lin. "Universal Approximation Property and Equivalence of Stochastic Computing-Based Neural Networks and Binary Neural Networks". Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5369–76. http://dx.doi.org/10.1609/aaai.v33i01.33015369.

Abstract:
Large-scale deep neural networks are both memory- and computation-intensive, thereby posing stringent requirements on computing platforms. Hardware acceleration of deep neural networks has been extensively investigated. Specific forms of binary neural networks (BNNs) and stochastic computing-based neural networks (SCNNs) are particularly appealing for hardware implementation since they can be realized almost entirely with binary operations. Despite the obvious advantages in hardware implementation, these approximate computing techniques are questioned by researchers in terms of accuracy and universal applicability. It is also important to understand the relative pros and cons of SCNNs and BNNs in theory and in actual hardware implementations. To address these concerns, in this paper we prove that the “ideal” SCNNs and BNNs satisfy the universal approximation property with probability 1 (due to the stochastic behavior), which is a new angle on the original approximation property. The proof proceeds by first establishing the property for SCNNs via the strong law of large numbers, and then using SCNNs as a “bridge” to prove it for BNNs. Beyond the universal approximation property, we also derive an appropriate bound on the bit length M to provide insight for actual neural network implementations. Based on the universal approximation property, we further prove that SCNNs and BNNs exhibit the same energy complexity; in other words, they have the same asymptotic energy consumption as the network size grows. We also provide a detailed analysis of the pros and cons of SCNNs and BNNs for hardware implementations and conclude that SCNNs are more suitable.
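
The strong-law argument at the heart of the paper is easy to see numerically. In unipolar stochastic computing, a number p in [0, 1] is encoded as a bit-stream whose fraction of 1s is p, and multiplication reduces to a bitwise AND of independent streams; the estimate converges as the bit length M grows. A hedged toy demonstration (not the paper's networks):

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(p, M):
    """Encode p in [0, 1] as a random bit-stream of length M with mean p."""
    return rng.random(M) < p

for M in (10**2, 10**4, 10**6):
    a, b = encode(0.8, M), encode(0.5, M)
    estimate = np.mean(a & b)   # AND of independent streams multiplies the values
    print(f"M={M:>7}: a*b estimated as {estimate:.4f} (exact 0.4)")
```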

25. Zhou, Wuneng, Xueqing Yang, Jun Yang, Anding Dai, and Huashan Liu. "Almost Sure Asymptotical Adaptive Synchronization for Neutral-Type Neural Networks with Stochastic Perturbation and Markovian Switching". Mathematical Problems in Engineering 2014 (2014): 1–13. http://dx.doi.org/10.1155/2014/479084.

Abstract:
The problem of almost sure (a.s.) asymptotic adaptive synchronization for neutral-type neural networks with stochastic perturbation and Markovian switching is investigated. First, we propose a new criterion of a.s. asymptotic stability for a general neutral-type stochastic differential equation, which extends existing results. Second, based on this stability criterion, using the Lyapunov functional method and a designed adaptive controller, we obtain a condition for a.s. asymptotic adaptive synchronization of neutral-type neural networks with stochastic perturbation and Markovian switching. The synchronization condition is expressed as a linear matrix inequality that can be easily solved with MATLAB. Finally, a numerical example illustrates the effectiveness of the method and of the result obtained in this paper.

26. Debiec, Piotr, Lukasz Kornatowski, Krzysztof Slot, and Hyongsuk Kim. "Texture Generation Using Cellular Neural Networks". International Journal of Bifurcation and Chaos 16, no. 12 (December 2006): 3655–68. http://dx.doi.org/10.1142/s021812740601704x.

Abstract:
This paper introduces an application of cellular neural networks to the generation of predetermined stochastic textures. The key element of the task is an appropriate selection of template elements, which should transform an initial, random CNN state into a stable equilibrium featuring the desired perceptual properties. The template derivation procedure comprises two steps: linear CNN design, followed by template refinement through nonlinear optimization. In addition, a procedure that extends CNN texture rendition capabilities into the realm of non-purely-stochastic textures is proposed.

27. Tosh, Colin R., and Graeme D. Ruxton. "The need for stochastic replication of ecological neural networks". Philosophical Transactions of the Royal Society B: Biological Sciences 362, no. 1479 (January 18, 2007): 455–60. http://dx.doi.org/10.1098/rstb.2006.1973.

Abstract:
Artificial neural networks are becoming increasingly popular as predictive statistical tools in ecosystem ecology and as models of signal processing in behavioural and evolutionary ecology. We demonstrate here that a commonly used network in ecology, the three-layer feed-forward network, trained with the backpropagation algorithm, can be extremely sensitive to the stochastic variation in training data that results from random sampling of the same underlying statistical distribution, with networks converging to several distinct predictive states. Using a random walk procedure to sample error–weight space, and Sammon dimensional reduction of weight arrays, we demonstrate that these different predictive states are not artefactual, due to local minima, but lie at the base of major error troughs in the error–weight surface. We further demonstrate that various gross weight compositions can produce the same predictive state, suggesting the analogy of weight space as a ‘patchwork’ of multiple predictive states. Our results argue for increased inclusion of stochastic training replication and analysis into ecological and behavioural applications of artificial neural networks.
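
The practical recommendation is easy to act on: retrain the same architecture on fresh samples of the same underlying distribution and inspect the spread of predictions. A minimal PyTorch sketch with a toy regression task (the dataset, architecture, and training budget are illustrative assumptions, not those of the paper):

```python
import torch
import torch.nn as nn

def train_once(seed):
    torch.manual_seed(seed)
    x = torch.rand(200, 1) * 6 - 3                  # fresh sample each replicate
    y = torch.sin(x) + 0.3 * torch.randn_like(x)    # same underlying curve
    net = nn.Sequential(nn.Linear(1, 8), nn.Tanh(), nn.Linear(8, 1))
    opt = torch.optim.SGD(net.parameters(), lr=0.05)
    for _ in range(2000):
        opt.zero_grad()
        ((net(x) - y) ** 2).mean().backward()
        opt.step()
    return net(torch.tensor([[2.5]])).item()

# Stochastic replicates can settle in noticeably different predictive states.
print([round(train_once(s), 3) for s in range(5)])
```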

28. Wan, Li, and Qinghua Zhou. "Ultimate Boundedness of Stochastic Hopfield Neural Networks". Journal of Convergence Information Technology 6, no. 8 (August 31, 2011): 244–48. http://dx.doi.org/10.4156/jcit.vol6.issue8.28.

29. Verriest, E. I. "Stability of Stochastic Neural Networks with Delays". IFAC Proceedings Volumes 31, no. 20 (July 1998): 441–46. http://dx.doi.org/10.1016/s1474-6670(17)41834-9.

30. Huang, Chuangxia, Yigang He, and Ping Chen. "Dynamic Analysis of Stochastic Recurrent Neural Networks". Neural Processing Letters 27, no. 3 (April 11, 2008): 267–76. http://dx.doi.org/10.1007/s11063-008-9075-z.

31. Rosaci, D. "Stochastic neural networks for transportation systems modeling". Applied Artificial Intelligence 12, no. 6 (September 1998): 491–500. http://dx.doi.org/10.1080/088395198117631.

32. Yu, Wenwu, and Jinde Cao. "Synchronization control of stochastic delayed neural networks". Physica A: Statistical Mechanics and its Applications 373 (January 2007): 252–60. http://dx.doi.org/10.1016/j.physa.2006.04.105.

33. Jay, Patel, Vasu Kalariya, Pushpendra Parmar, Sudeep Tanwar, Neeraj Kumar, and Mamoun Alazab. "Stochastic Neural Networks for Cryptocurrency Price Prediction". IEEE Access 8 (2020): 82804–18. http://dx.doi.org/10.1109/access.2020.2990659.

34. Borkar, V. S., and P. Gupta. "Randomized neural networks for learning stochastic dependences". IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 29, no. 4 (1999): 469–80. http://dx.doi.org/10.1109/3477.775263.

35. Ginzburg, Iris, and Haim Sompolinsky. "Theory of correlations in stochastic neural networks". Physical Review E 50, no. 4 (October 1, 1994): 3171–91. http://dx.doi.org/10.1103/physreve.50.3171.

36. Zhu, Y., P. Garda, E. Belhaire, and Y. Ni. "Autocompensated capacitive circuit for stochastic neural networks". Electronics Letters 30, no. 4 (February 17, 1994): 330–31. http://dx.doi.org/10.1049/el:19940250.

37. Zhao, Jieyu, John Shawe-Taylor, and Max van Daalen. "Learning in Stochastic Bit Stream Neural Networks". Neural Networks 9, no. 6 (August 1996): 991–98. http://dx.doi.org/10.1016/0893-6080(96)00025-1.

38. Fernandes, Fernando, Rodrigo de Losso da Silveira Bueno, Pedro Delano Cavalcanti, and Alemayehu Solomon Admasu. "Generating Stochastic Processes Through Convolutional Neural Networks". Journal of Control, Automation and Electrical Systems 31, no. 2 (January 31, 2020): 294–303. http://dx.doi.org/10.1007/s40313-020-00567-y.

39. Ochoa-Rivera, Juan Camilo. "Prospecting droughts with stochastic artificial neural networks". Journal of Hydrology 352, no. 1-2 (April 2008): 174–80. http://dx.doi.org/10.1016/j.jhydrol.2008.01.006.

40. Wang, Jie, Jun Wang, Wen Fang, and Hongli Niu. "Financial Time Series Prediction Using Elman Recurrent Random Neural Networks". Computational Intelligence and Neuroscience 2016 (2016): 1–14. http://dx.doi.org/10.1155/2016/4742515.

Abstract:
In recent years, forecasting financial market dynamics has been a focus of economic research. To predict the price indices of stock markets, we developed an architecture that combines an Elman recurrent neural network with a stochastic time-effective function. Analyzing the proposed model with linear regression, complexity-invariant distance (CID), and multiscale CID (MCID) methods, and comparing it with the backpropagation neural network (BPNN), the stochastic time effective neural network (STNN), and the Elman recurrent neural network (ERNN), the empirical results show that the proposed network performs best among these models for financial time series forecasting. Further experiments test the predictive performance of the established model on the SSE, TWSE, KOSPI, and Nikkei225 indices, and the corresponding statistical comparisons of these market indices are also presented. The experimental results show that the approach performs well in predicting stock market index values.
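
For reference, an Elman network is a simple recurrent network with a feedback (context) layer; PyTorch's nn.RNN implements this cell. The sketch below omits the paper's stochastic time-effective weighting of training samples and uses a synthetic series, so it only illustrates the ERNN backbone:

```python
import torch
import torch.nn as nn

class ElmanPredictor(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, seq_len, 1)
        h, _ = self.rnn(x)
        return self.out(h[:, -1])         # one-step-ahead prediction

series = torch.sin(torch.linspace(0, 20, 300))    # stand-in for a price index
windows = series.unfold(0, 21, 1)                 # sliding 21-step windows
x, y = windows[:, :20].unsqueeze(-1), windows[:, 20:]

model = ElmanPredictor()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(200):
    opt.zero_grad()
    loss = ((model(x) - y) ** 2).mean()
    loss.backward()
    opt.step()
print(f"final in-sample MSE: {loss.item():.5f}")
```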

41. Fan, Tongke. "Research on Optimal CDMA Multiuser Detection Based on Stochastic Hopfield Neural Network". Recent Patents on Computer Science 12, no. 3 (May 8, 2019): 233–40. http://dx.doi.org/10.2174/2213275912666181210103742.

Abstract:
Background: Most common multiuser detection techniques suffer from heavy computation and slow operation. Hopfield neural networks offer high-speed search and parallel processing but are prone to local convergence. Objective: The stochastic Hopfield neural network avoids local convergence by introducing noise into the state variables, thereby achieving optimal detection. Methods: Starting from the CDMA communication model, this paper formulates and models the multiuser detection problem. A new stochastic Hopfield neural network is then obtained by introducing a stochastic disturbance into the traditional Hopfield neural network. Finally, CDMA multiuser detection is simulated. Conclusion: The results show that introducing a stochastic disturbance into the Hopfield neural network can help the network escape local minima, thus reaching the global minimum and improving the network's performance.
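
The core mechanism (noise injected into the Hopfield update so the energy descent can escape local minima) can be sketched generically. The CDMA-specific weights derived from spreading codes and the received signal are replaced here by an arbitrary symmetric matrix, so this illustrates the idea, not the paper's detector:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16
W = rng.standard_normal((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0)                    # symmetric, zero-diagonal weights
b = rng.standard_normal(n)

def energy(s):
    return -0.5 * s @ W @ s - b @ s

s = rng.choice([-1.0, 1.0], size=n)       # random bipolar start
sigma = 1.0
for _ in range(2000):
    i = rng.integers(n)
    field = W[i] @ s + b[i] + sigma * rng.standard_normal()  # noisy local field
    s[i] = 1.0 if field >= 0 else -1.0
    sigma *= 0.998                         # anneal the disturbance away
print("final energy:", round(energy(s), 3))
```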

42. Zhuo, Li’an, Baochang Zhang, Chen Chen, Qixiang Ye, Jianzhuang Liu, and David Doermann. "Calibrated Stochastic Gradient Descent for Convolutional Neural Networks". Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 9348–55. http://dx.doi.org/10.1609/aaai.v33i01.33019348.

Abstract:
In stochastic gradient descent (SGD) and its variants, the optimized gradient estimators may be as expensive to compute as the true gradient in many scenarios. This paper introduces a calibrated stochastic gradient descent (CSGD) algorithm for deep neural network optimization. A theorem is developed to prove that an unbiased estimator for the network variables can be obtained in a probabilistic way based on the Lipschitz hypothesis. Our work is significantly distinct from existing gradient optimization methods in providing a theoretical framework for unbiased variable estimation in the deep learning paradigm to optimize the model parameter calculation. In particular, we develop a generic gradient calibration layer that can easily be used to build convolutional neural networks (CNNs). Experimental results demonstrate that CNNs with our CSGD optimization scheme improve state-of-the-art performance on natural image classification, digit recognition, ImageNet object classification, and object detection tasks. This work opens new research directions for developing more efficient SGD updates and analyzing the backpropagation algorithm.
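
For context, the baseline being calibrated is the plain SGD update; the paper's calibration layer, which produces the unbiased estimator, is not reproduced here:

```latex
\theta_{t+1} = \theta_t - \eta_t\,\hat{g}_t,
\qquad
\mathbb{E}\big[\hat{g}_t\big] = \nabla L(\theta_t)
```

where the stochastic gradient estimate on a mini-batch is denoted by the hatted term and the learning rate by the eta term.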

43. Tan, Z., and M. K. Ali. "Pattern Recognition with Stochastic Resonance in a Generic Neural Network". International Journal of Modern Physics C 11, no. 08 (December 2000): 1585–93. http://dx.doi.org/10.1142/s0129183100001413.

Abstract:
We discuss stochastic resonance in associative memory with a canonical neural network model that describes the generic behavior of a large family of dynamical systems near bifurcation. Our results show that stochastic resonance helps memory association. The relationships between stochastic resonance, associative memory, storage load, memory history, and initial states are studied. In intelligent systems like neural networks, it is likely that stochastic resonance combined with synaptic information enhances memory recall.
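
Stochastic resonance itself is easy to demonstrate outside the associative-memory setting: a weak periodic signal that cannot overcome the barrier of a double-well potential becomes detectable at an intermediate noise level. A toy simulation of the canonical bistable model, not the paper's network:

```python
import numpy as np

rng = np.random.default_rng(2)
dt, steps = 0.02, 100_000
A, omega = 0.3, 0.1            # weak, subthreshold periodic signal

def output_correlation(D):
    """Correlation between the occupied well and the driving signal."""
    x, corr = 1.0, 0.0
    for k in range(steps):
        drive = A * np.cos(omega * k * dt)
        x += dt * (x - x**3 + drive) + np.sqrt(2 * D * dt) * rng.standard_normal()
        corr += np.sign(x) * np.cos(omega * k * dt)
    return corr / steps

for D in (0.02, 0.15, 1.0):    # too little, intermediate, too much noise
    print(f"D={D}: signal-output correlation = {output_correlation(D):+.3f}")
```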

44. Soula, Hédi, and Carson C. Chow. "Stochastic Dynamics of a Finite-Size Spiking Neural Network". Neural Computation 19, no. 12 (December 2007): 3262–92. http://dx.doi.org/10.1162/neco.2007.19.12.3262.

Abstract:
We present a simple Markov model of spiking neural dynamics that can be analytically solved to characterize the stochastic dynamics of a finite-size spiking neural network. We give closed-form estimates for the equilibrium distribution, mean rate, variance, and autocorrelation function of the network activity. The model is applicable to any network where the probability of firing of a neuron in the network depends on only the number of neurons that fired in a previous temporal epoch. Networks with statistically homogeneous connectivity and membrane and synaptic time constants that are not excessively long could satisfy these conditions. Our model completely accounts for the size of the network and correlations in the firing activity. It also allows us to examine how the network dynamics can deviate from mean field theory. We show that the model and solutions are applicable to spiking neural networks in biophysically plausible parameter regimes.
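
The model class is simple enough to simulate directly: network activity is a Markov chain on {0, ..., N}, where each neuron fires independently with a probability that depends only on the previous epoch's firing count. The sigmoidal gain below is an illustrative assumption, not the paper's choice:

```python
import numpy as np

rng = np.random.default_rng(3)
N, epochs = 100, 50_000

def p_fire(k):
    """Firing probability given that k of N neurons fired last epoch."""
    return 1.0 / (1.0 + np.exp(-(0.08 * k - 3.0)))

k, history = 10, []
for _ in range(epochs):
    k = rng.binomial(N, p_fire(k))       # conditionally independent neurons
    history.append(k)

h = np.array(history[1000:])             # drop the transient
print(f"mean rate      = {h.mean() / N:.3f}")
print(f"variance       = {h.var():.1f}")
print(f"lag-1 autocorr = {np.corrcoef(h[:-1], h[1:])[0, 1]:+.3f}")
```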

45. Xu, Yuhua, Jinmeng Wang, Wuneng Zhou, and Xin Wang. "Synchronization in pth Moment for Stochastic Chaotic Neural Networks with Finite-Time Control". Complexity 2019 (January 3, 2019): 1–8. http://dx.doi.org/10.1155/2019/2916364.

Abstract:
Finite-time synchronization in the pth moment is considered for time-varying stochastic chaotic neural networks. Compared with existing results on finite-time mean-square stability of stochastic neural networks, we obtain useful criteria for finite-time synchronization in the pth moment for chaotic neural networks based on finite-time nonlinear feedback control and finite-time adaptive feedback control, which are efficient and easy to implement in practical applications. Finally, a numerical example is given to illustrate the validity of the derived synchronization conditions.
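
In this setting, finite-time synchronization in the pth moment means the error between the drive state x(t) and the response state y(t) vanishes in pth mean within a finite settling time T (a standard definition, paraphrased here rather than quoted from the paper):

```latex
\lim_{t \to T} \mathbb{E}\,\big\|y(t) - x(t)\big\|^{p} = 0,
\qquad
\mathbb{E}\,\big\|y(t) - x(t)\big\|^{p} = 0 \quad \text{for all } t \ge T,
```

where the settling time T may depend on the initial states.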

46. Shafee, Fariel. "Stochastic Dynamics of Networks with Semiclassical Excitations". Stochastics and Dynamics 07, no. 03 (September 2007): 403–16. http://dx.doi.org/10.1142/s0219493707002086.

Abstract:
As artificial intelligence seems destined to go for devices at the nano-scale level involving quantized interactions, we investigate the changes brought forth by a transition of a dynamical classical neural system to a quantum one. First, we consider cases of interaction between neurons and with an external source, which are analytically tractable, and find relations for their average times of transition between different quantum states. We get simple relations between the average transition time and the effective potential in limits of resonance and small damping. Such potentials, with harmonic and/or damped behavior, may be quite important in nanoelectronic devices. We then present a network set up with interactions between nodes similar to classical biological-like action potentials with finite durations, and compare it with the classical integrate-and-fire neural network. As in the classical systems, we obtain from such networks dynamical behavior differing according to the type of static input. We have examined the role of the novel stochasticity introduced by the quantum potential which fuzzifies the logical links, in contrast to the system with the deterministic classical neural network. The dynamic quantum system also shows oscillations similar to biological systems. The average periodic behavior of the system, as found from simulations, agrees remarkably well with a simple formula with a single parameter to account for the intrinsic randomness of the system. Short-term retentivity of input memory is observed to be subtle but perceptible, and may be useful in designing devices with quantum networks that need a gradual auto-erasing facility, which too, being quasi-biological, may be a desirable feature in systems trying to mimic living neural systems.

47. Wan, Li, and Qing Hua Zhou. "Ultimate Boundedness of Stochastic Neural Networks with Delays". Advanced Materials Research 204-210 (February 2011): 1549–52. http://dx.doi.org/10.4028/www.scientific.net/amr.204-210.1549.

Abstract:
Although the ultimate boundedness of several classes of neural networks with constant delays has been studied by some researchers, the inherent randomness associated with signal transmission was not taken into account in these networks. To date, few results on the ultimate boundedness of stochastic neural networks have been reported. In this paper, by using a Lyapunov functional and linear matrix inequalities, some sufficient conditions ensuring the ultimate boundedness of stochastic neural networks with time-varying delays are established. Our criteria are easily tested with the MATLAB LMI Toolbox. One example is given to demonstrate our criteria.

48. Nikolic, Kostantin. "Training Neural Network Elements Created From Long Shot Term Memory". Oriental Journal of Computer Science and Technology 10, no. 1 (March 1, 2017): 01–10. http://dx.doi.org/10.13005/ojcst/10.01.01.

Abstract:
This paper presents the application of stochastic search algorithms to the training of artificial neural networks. The methodology was created primarily to train complex recurrent neural networks, since training recurrent networks is more difficult than training feedforward ones. Simulating the recurrent network propagates the signal from input to output, and the training process performs a stochastic search in the space of parameters. The performance of this type of algorithm is superior to most training algorithms based on the concept of a gradient. The efficiency of these algorithms is demonstrated by training networks built from units characterized by long-term and long short-term memory. The presented methodology is effective and relatively simple.
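
The training idea is straightforward: propose a random perturbation of all weights and keep it only if the loss improves, so no gradients are required. A hedged sketch; the task, network size, and step schedule below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(-3, 3, (256, 2))
y = np.sin(x[:, :1]) * np.cos(x[:, 1:])            # toy target function

def loss(params):
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - y) ** 2))

shapes = [(2, 16), (16,), (16, 1), (1,)]
params = [0.5 * rng.standard_normal(s) for s in shapes]
best, step = loss(params), 0.1
for _ in range(20_000):
    trial = [p + step * rng.standard_normal(p.shape) for p in params]
    trial_loss = loss(trial)
    if trial_loss < best:                # accept only improvements
        params, best = trial, trial_loss
    step *= 0.9999                       # slowly shrink the search radius
print(f"final MSE: {best:.4f}")
```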

49. Zhou, Liqun, and Guangda Hu. "Almost sure exponential stability of neutral stochastic delayed cellular neural networks". Journal of Control Theory and Applications 6, no. 2 (May 2008): 195–200. http://dx.doi.org/10.1007/s11768-008-7036-8.

50. Amit, Daniel J., and Stefano Fusi. "Learning in Neural Networks with Material Synapses". Neural Computation 6, no. 5 (September 1994): 957–82. http://dx.doi.org/10.1162/neco.1994.6.5.957.

Abstract:
We discuss the long-term maintenance of acquired memory in the synaptic connections of a perpetually learning electronic device. This is effected by ascribing to each synapse a finite number of stable states that it can maintain for indefinitely long periods. Learning uncorrelated stimuli is expressed as a stochastic process produced by the neural activities on the synapses. In several interesting cases the stochastic process can be analyzed in detail, leading to a clarification of the performance of the network, as an associative memory, during the process of uninterrupted learning. The stochastic nature of the process and the existence of an asymptotic distribution for the synaptic values in the network imply generically that the memory is a palimpsest, but capacity is as low as log N for a network of N neurons. The only way we find to avoid this tight constraint is to allow the parameters governing the learning process (the coding level of the stimuli, the transition probabilities for potentiation and depression, and the number of stable synaptic levels) to depend on the number of neurons. It is shown that a network with synapses that have two stable states can dynamically learn with optimal storage efficiency, be a palimpsest, and maintain its (associative) memory for an indefinitely long time, provided the coding level is low and depression is equilibrated against potentiation. We suggest that an option so easily implementable in material devices would not have been overlooked by biology. Finally, we discuss stochastic learning on synapses with a variable number of stable synaptic states.