
Journal articles on the topic 'Stochastic neural networks'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Stochastic neural networks.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Wong, Eugene. "Stochastic neural networks." Algorithmica 6, no. 1-6 (1991): 466–78. http://dx.doi.org/10.1007/bf01759054.

2

Reddy, BhanuTeja, and Usha J.C. "Prediction of Stock Market using Stochastic Neural Networks." International Journal of Innovative Research in Computer Science & Technology 7, no. 5 (2019): 128–38. http://dx.doi.org/10.21276/ijircst.2019.7.5.1.

3

Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays." Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.

Abstract:
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize the upper bounds of the contraction coefficients of neutral terms and time-varying delays by using transcendental equations. Moreover, we prove theoretically that, for any globally exponentially stable hybrid stochastic neural networks, if the additive neutral terms and time-varying delays are smaller than the derived upper bounds, then the perturbed neural […]
4

Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks." IEEE Transactions on Signal Processing 69 (2021): 4428–43. http://dx.doi.org/10.1109/tsp.2021.3092336.

5

Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks." Journal of Physics A: Mathematical and General 36, no. 9 (2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.

6

Zhou, Wuneng, Xueqing Yang, Jun Yang, and Jun Zhou. "Stochastic Synchronization of Neutral-Type Neural Networks with Multidelays Based on M-Matrix." Discrete Dynamics in Nature and Society 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/826810.

Abstract:
The problem of stochastic synchronization of neutral-type neural networks with multiple delays based on the M-matrix is studied. Firstly, we design a control law for stochastic synchronization of the neutral-type, multiple time-delay neural network. Secondly, by making use of a Lyapunov functional and the M-matrix method, we obtain a criterion under which the drive and response neutral-type multiple time-delay neural networks with stochastic disturbance and Markovian switching achieve stochastic synchronization. The synchronization condition is expressed as a linear matrix inequality, which can be easily solved […]
7

Shi, Xiang Dong. "Mean Square Asymptotic Stability of Neutral Stochastic Neural Networks with Multiple Time-Varying Delays." Advanced Materials Research 684 (April 2013): 579–82. http://dx.doi.org/10.4028/www.scientific.net/amr.684.579.

Abstract:
The paper considers the problem of almost sure asymptotic stability for neutral stochastic neural networks with multiple time-varying delays. By applying the Lyapunov functional method and differential inequality techniques, new sufficient conditions ensuring the existence and almost sure asymptotic stability of neutral stochastic neural networks with multiple time-varying delays are established. The results are shown to be generalizations of some previously published results and are less conservative than existing ones.
8

SAKTHIVEL, RATHINASAMY, R. SAMIDURAI, and S. MARSHAL ANTHONI. "EXPONENTIAL STABILITY FOR STOCHASTIC NEURAL NETWORKS OF NEUTRAL TYPE WITH IMPULSIVE EFFECTS." Modern Physics Letters B 24, no. 11 (2010): 1099–110. http://dx.doi.org/10.1142/s0217984910023141.

Abstract:
This paper is concerned with the exponential stability of stochastic neural networks of neutral type with impulsive effects. By employing the Lyapunov functional and stochastic analysis, a new stability criterion for the stochastic neural network is derived in terms of linear matrix inequality. A numerical example is provided to show the effectiveness and applicability of the obtained result.
9

Peretto, Pierre, and Jean-jacques Niez. "Stochastic Dynamics of Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics 16, no. 1 (1986): 73–83. http://dx.doi.org/10.1109/tsmc.1986.289283.

10

Kanarachos, Andreas E., and Kleanthis T. Geramanis. "Semi-Stochastic Complex Neural Networks." IFAC Proceedings Volumes 31, no. 12 (1998): 47–52. http://dx.doi.org/10.1016/s1474-6670(17)36040-8.

11

Hurtado, Jorge E. "Neural networks in stochastic mechanics." Archives of Computational Methods in Engineering 8, no. 3 (2001): 303–42. http://dx.doi.org/10.1007/bf02736646.

12

Zhou, Wuneng, Xueqing Yang, Jun Yang, Anding Dai, and Huashan Liu. "Almost Sure Asymptotical Adaptive Synchronization for Neutral-Type Neural Networks with Stochastic Perturbation and Markovian Switching." Mathematical Problems in Engineering 2014 (2014): 1–13. http://dx.doi.org/10.1155/2014/479084.

Abstract:
The problem of almost sure (a.s.) asymptotic adaptive synchronization for neutral-type neural networks with stochastic perturbation and Markovian switching is studied. Firstly, we propose a new criterion of a.s. asymptotic stability for a general neutral-type stochastic differential equation which extends the existing results. Secondly, based upon this stability criterion, by making use of the Lyapunov functional method and designing an adaptive controller, we obtain a condition for a.s. asymptotic adaptive synchronization of neutral-type neural networks with stochastic perturbation and Markovian switching […]
13

Hua, Min Gang, Jun Tao Fei, and Wei Li Dai. "Exponential Stability Analysis for Neutral Stochastic Delayed Neural Networks." Advanced Materials Research 354-355 (October 2011): 877–80. http://dx.doi.org/10.4028/www.scientific.net/amr.354-355.877.

Abstract:
In this paper, the generalized Finsler lemma and augmented Lyapunov functional are introduced to establish some improved delay-dependent stability criteria of neutral stochastic delayed neural networks. The stability criteria in the new results improve and generalize existing ones. Two examples are included to show the effectiveness of the results.
14

Han, Sufang, Tianwei Zhang, and Guoxin Liu. "Stochastic Dynamics of Discrete-Time Fuzzy Random BAM Neural Networks with Time Delays." Mathematical Problems in Engineering 2019 (August 25, 2019): 1–20. http://dx.doi.org/10.1155/2019/9416234.

Abstract:
By using the semidiscrete method for differential equations, a new discrete analogue of stochastic fuzzy BAM neural networks is formulated, which gives a more accurate characterization of continuous-time stochastic neural networks than the Euler scheme. Firstly, the existence of a 2p-th mean almost periodic sequence solution of the discrete-time stochastic fuzzy BAM neural networks is investigated with the help of the Minkowski inequality, the Hölder inequality, and Krasnoselskii's fixed point theorem. Secondly, the 2p-th moment global exponential stability of the discrete-time stochastic […]
15

Yu, Tianyuan, Yongxin Yang, Da Li, Timothy Hospedales, and Tao Xiang. "Simple and Effective Stochastic Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 4 (2021): 3252–60. http://dx.doi.org/10.1609/aaai.v35i4.16436.

Abstract:
Stochastic neural networks (SNNs) are currently topical, with several paradigms being actively investigated including dropout, Bayesian neural networks, variational information bottleneck (VIB) and noise regularized learning. These neural network variants impact several major considerations, including generalization, network compression, robustness against adversarial attack and label noise, and model calibration. However, many existing networks are complicated and expensive to train, and/or only address one or two of these practical considerations. In this paper we propose a simple and effective […]
16

Blythe, Steve, Xuerong Mao, and Xiaoxin Liao. "Stability of stochastic delay neural networks." Journal of the Franklin Institute 338, no. 4 (2001): 481–95. http://dx.doi.org/10.1016/s0016-0032(01)00016-3.

17

Verikas, A., and A. Gelzinis. "Training neural networks by stochastic optimisation." Neurocomputing 30, no. 1-4 (2000): 153–72. http://dx.doi.org/10.1016/s0925-2312(99)00123-x.

18

Yang, Yunfeng, and Fengxian Tang. "Network Intrusion Detection Based on Stochastic Neural Networks Method." International Journal of Security and Its Applications 10, no. 8 (2016): 435–46. http://dx.doi.org/10.14257/ijsia.2016.10.8.38.

19

Han, Jin Fang. "Mean-Square Exponential Stability of Stochastic Interval Cellular Neural Networks with Time-Varying Delay." Advanced Materials Research 760-762 (September 2013): 1742–47. http://dx.doi.org/10.4028/www.scientific.net/amr.760-762.1742.

Abstract:
This paper is concerned with the mean-square exponential stability analysis problem for a class of stochastic interval cellular neural networks with time-varying delay. By using the stochastic analysis approach, employing a Lyapunov function and norm inequalities, several mean-square exponential stability criteria are established in terms of the formula and the Razumikhin theorem to guarantee that the stochastic interval delayed cellular neural networks are mean-square exponentially stable. Some recent results reported in the literature are generalized. A kind of equivalent description for this stochastic […]
20

Li, Yajun, Xisheng Dai, Wenping Xiao, and Like Jiao. "Stability Analysis for Stochastic Markovian Jumping Neural Networks with Leakage Delay." Open Electrical & Electronic Engineering Journal 11, no. 1 (2017): 1–13. http://dx.doi.org/10.2174/1874129001711010001.

Abstract:
The stability problem for a class of stochastic neural networks with Markovian jump parameters and leakage delay is addressed in this study. A sufficient condition ensuring that the stochastic neural network system is exponentially stable is presented and proven with Lyapunov functional theory, the stochastic stability technique and the linear matrix inequality method. The effect of leakage delay on the stability of the neural network system is discussed and numerical examples are provided to show the correctness and effectiveness of the research results.
21

Li, Yan, and Yi Shen. "Preserving Global Exponential Stability of Hybrid BAM Neural Networks with Reaction Diffusion Terms in the Presence of Stochastic Noise and Connection Weight Matrices Uncertainty." Mathematical Problems in Engineering 2014 (2014): 1–17. http://dx.doi.org/10.1155/2014/486052.

Abstract:
We study the impact of stochastic noise and connection weight matrix uncertainty on the global exponential stability of hybrid BAM neural networks with reaction diffusion terms. Given globally exponentially stable hybrid BAM neural networks with reaction diffusion terms, the question to be addressed here is how much stochastic noise and connection weight matrix uncertainty the neural networks can tolerate while maintaining global exponential stability. The upper threshold of stochastic noise and connection weight matrix uncertainty is defined by using transcendental equations. We find […]
22

Hou, Yuanyuan, and Lihua Dai. "Square-Mean Pseudo Almost Periodic Solutions for Quaternion-Valued Stochastic Neural Networks with Time-Varying Delays." Mathematical Problems in Engineering 2021 (January 21, 2021): 1–17. http://dx.doi.org/10.1155/2021/6679326.

Abstract:
In this paper, we are concerned with a class of quaternion-valued stochastic neural networks with time-varying delays. Firstly, since we cannot explicitly decompose the quaternion-valued stochastic systems into equivalent real-valued stochastic systems, we use the Banach fixed point theorem and stochastic analysis techniques to obtain some sufficient conditions for the existence of square-mean pseudo almost periodic solutions for this class of neural networks. Then, by constructing an appropriate Lyapunov functional and using stochastic analysis techniques, we also obtain sufficient conditions for […]
23

Jia, Zhifu, and Cunlin Li. "Almost Sure Exponential Stability of Uncertain Stochastic Hopfield Neural Networks Based on Subadditive Measures." Mathematics 11, no. 14 (2023): 3110. http://dx.doi.org/10.3390/math11143110.

Abstract:
In this paper, we consider the almost sure exponential stability of uncertain stochastic Hopfield neural networks based on subadditive measures. Firstly, we deduce two corollaries using the Itô–Liu formula. Then, we introduce the concept of almost sure exponential stability for uncertain stochastic Hopfield neural networks. Next, we investigate the almost sure exponential stability of uncertain stochastic Hopfield neural networks using the Lyapunov method, the Liu inequality, the Liu lemma, and the exponential martingale inequality. In addition, we prove two sufficient conditions for almost sure exponential […]
24

DEBIEC, PIOTR, LUKASZ KORNATOWSKI, KRZYSZTOF SLOT, and HYONGSUK KIM. "TEXTURE GENERATION USING CELLULAR NEURAL NETWORKS." International Journal of Bifurcation and Chaos 16, no. 12 (2006): 3655–68. http://dx.doi.org/10.1142/s021812740601704x.

Abstract:
The following paper introduces an application of Cellular Neural Networks for the generation of predetermined stochastic textures. The key element for the task realization is an appropriate selection of template elements, which should provide a transformation of the initial, random CNN state into a stable equilibrium featuring the desired perceptual properties. A template derivation procedure comprises two steps: linear CNN design, followed by a template-refinement procedure that involves nonlinear optimization. In addition, a procedure that extends CNN texture rendition capabilities into a realm of […]
25

Meng, Duo. "Adaptive Neural Networks Synchronization of a Four-Dimensional Energy Resource Stochastic System." Abstract and Applied Analysis 2014 (2014): 1–8. http://dx.doi.org/10.1155/2014/863902.

Abstract:
An adaptive neural network chaos synchronization control method is proposed for a four-dimensional energy resource demand-supply system with input constraints. Assuming the response system contains unknown uncertain nonlinearities and unknown stochastic disturbances, neural networks and robust terms are used to identify the nonlinearities and overcome the stochastic disturbances, respectively. Based on stochastic Lyapunov stability and robust adaptive theories, an adaptive neural network synchronization control method is developed. In the design process, an auxiliary design system is employed […]
26

PARK, JU H., and O. M. KWON. "SYNCHRONIZATION OF NEURAL NETWORKS OF NEUTRAL TYPE WITH STOCHASTIC PERTURBATION." Modern Physics Letters B 23, no. 14 (2009): 1743–51. http://dx.doi.org/10.1142/s0217984909019909.

Abstract:
In this letter, the problem of feedback controller design to achieve synchronization for neural network of neutral type with stochastic perturbation is considered. Based on Lyapunov method and LMI (linear matrix inequality) framework, the goal of this letter is to derive an existence criterion of the controller for the synchronization between master and response networks.
27

Wan, Li, and Qing Hua Zhou. "Ultimate Boundedness of Stochastic Neural Networks with Delays." Advanced Materials Research 204-210 (February 2011): 1549–52. http://dx.doi.org/10.4028/www.scientific.net/amr.204-210.1549.

Abstract:
Although the ultimate boundedness of several classes of neural networks with constant delays has been studied by some researchers, the inherent randomness associated with signal transmission was not taken into account in these networks. At present, few authors study the ultimate boundedness of stochastic neural networks and no related papers have been reported. In this paper, by using a Lyapunov functional and linear matrix inequalities, some sufficient conditions ensuring the ultimate boundedness of stochastic neural networks with time-varying delays are established. Our criteria are easily tested by the Matlab LMI Toolbox […]
28

Akbar, Mutaqin. "Traffic sign recognition using convolutional neural networks." Jurnal Teknologi dan Sistem Komputer 9, no. 2 (2021): 120–25. http://dx.doi.org/10.14710/jtsiskom.2021.13959.

Abstract:
Traffic sign recognition (TSR) can be used to recognize traffic signs by utilizing image processing. This paper presents traffic sign recognition in Indonesia using convolutional neural networks (CNN). The overall image dataset used is 2050 images of traffic signs, consisting of 10 kinds of signs. The CNN layer used in this study consists of one convolution layer, one pooling layer using the maxpool operation, and one fully connected layer. The training algorithm used is stochastic gradient descent (SGD). At the training stage, using 1750 training images, 48 filters, and a learning rate of 0.005, […]
29

ROSSELLÓ, JOSEP L., VINCENT CANALS, ANTONI MORRO, and ANTONI OLIVER. "HARDWARE IMPLEMENTATION OF STOCHASTIC SPIKING NEURAL NETWORKS." International Journal of Neural Systems 22, no. 04 (2012): 1250014. http://dx.doi.org/10.1142/s0129065712500141.

Abstract:
Spiking Neural Networks, the latest generation of Artificial Neural Networks, are characterized by their bio-inspired nature and by a higher computational capacity with respect to other neural models. In real biological neurons, stochastic processes represent an important mechanism of neural behavior and are responsible for their special arithmetic capabilities. In this work we present a simple hardware implementation of spiking neurons that considers this probabilistic nature. The advantage of the proposed implementation is that it is fully digital and therefore can be massively implemented in Field […]
30

TANG, YANG, JIAN-AN FANG, SUOJUN LU, and QINGYING MIAO. "ADAPTIVE SYNCHRONIZATION FOR UNKNOWN STOCHASTIC CHAOTIC NEURAL NETWORKS WITH MIXED TIME-DELAYS BY OUTPUT COUPLING." Modern Physics Letters B 22, no. 24 (2008): 2391–409. http://dx.doi.org/10.1142/s0217984908016832.

Abstract:
This paper is concerned with the synchronization problem for a class of stochastic neural networks with unknown parameters and mixed time-delays via output coupling. The mixed time-delays comprise the time-varying delay and the distributed delay, and the neural networks are subjected to stochastic disturbances described in terms of a Brownian motion. Firstly, we use Lyapunov functions to establish general theoretical conditions for designing the output coupling matrix. Secondly, by using the adaptive feedback technique, a simple, analytical and rigorous approach is proposed to synchronize the stochastic […]
31

Shi, Yanchao, and Peiyong Zhu. "Synchronization of Stochastic Competitive Neural Networks with Different Timescales and Reaction-Diffusion Terms." Neural Computation 26, no. 9 (2014): 2005–24. http://dx.doi.org/10.1162/neco_a_00629.

Abstract:
We propose a feedback controller for the synchronization of stochastic competitive neural networks with different timescales and reaction-diffusion terms. By constructing a proper Lyapunov-Krasovskii functional, as well as employing stochastic analysis theory, the LaSalle-type invariance principle for stochastic differential delay equations, and a linear matrix inequality (LMI) technique, a feedback controller is designed to achieve the asymptotical synchronization of coupled stochastic competitive neural networks. A simulation example is given to show the effectiveness of the theoretical results.
32

Li, Ya Jun, and Jing Zhao Li. "Exponential Stability for Stochastic Neural Networks with Time-Varying Delay and Leakage Delay." Applied Mechanics and Materials 742 (March 2015): 399–403. http://dx.doi.org/10.4028/www.scientific.net/amm.742.399.

Abstract:
This paper investigates the exponential stability problem for a class of stochastic neural networks with leakage delay. By employing a suitable Lyapunov functional and stochastic stability theory, sufficient conditions which make the stochastic neural network system exponentially stable in the mean square are proposed and proved. All results are expressed in terms of linear matrix inequalities (LMIs). An example and a simulation are presented to show the effectiveness of the proposed method.
33

Wang, Xiaobo, Xuefei Wu, Zhe Nie, and Zengxian Yan. "The pth Moment Exponential Synchronization of Drive-Response Memristor Neural Networks Subject to Stochastic Perturbations." Complexity 2023 (July 18, 2023): 1–10. http://dx.doi.org/10.1155/2023/1335184.

Abstract:
In this paper, the pth moment exponential synchronization problems of drive-response stochastic memristor neural networks are studied via a state feedback controller. The dynamics of the memristor neural networks are nonidentical, consisting of both asymmetrically nondelayed and delayed couplings, state-dependent, and subject to exogenous stochastic perturbations. The pth moment exponential synchronization of these drive-response stochastic memristor neural networks is guaranteed under some testable and computable sufficient conditions utilizing differential inclusion theory and Filippov regularization […]
34

Sriraman, R., R. Samidurai, V. C. Amritha, G. Rachakit, and Prasanalakshmi Balaji. "System decomposition-based stability criteria for Takagi-Sugeno fuzzy uncertain stochastic delayed neural networks in quaternion field." AIMS Mathematics 8, no. 5 (2023): 11589–616. http://dx.doi.org/10.3934/math.2023587.

Abstract:
Stochastic disturbances often occur in real-world systems, which can lead to undesirable system dynamics. Therefore, it is necessary to investigate stochastic disturbances in neural network modeling. As such, this paper examines the stability problem for Takagi-Sugeno fuzzy uncertain quaternion-valued stochastic neural networks. By applying Takagi-Sugeno fuzzy models and stochastic analysis, we first consider a general form of Takagi-Sugeno fuzzy uncertain quaternion-valued stochastic neural networks with time-varying delays. Then, by constructing suitable Lyapunov-Krasovskii […]
35

EMMERT-STREIB, FRANK. "A HETEROSYNAPTIC LEARNING RULE FOR NEURAL NETWORKS." International Journal of Modern Physics C 17, no. 10 (2006): 1501–20. http://dx.doi.org/10.1142/s0129183106009916.

Abstract:
In this article we introduce a novel stochastic Hebb-like learning rule for neural networks that is neurobiologically motivated. This learning rule combines features of unsupervised (Hebbian) and supervised (reinforcement) learning and is stochastic with respect to the selection of the time points when a synapse is modified. Moreover, the learning rule not only affects the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also affects further remote synapses of the pre- and postsynaptic neuron. This more complex form of synaptic plasticity has recently […]
36

Wan, Li, and Qinghua Zhou. "Ultimate Boundedness of Stochastic Hopfield Neural Networks." Journal of Convergence Information Technology 6, no. 8 (2011): 244–48. http://dx.doi.org/10.4156/jcit.vol6.issue8.28.

37

Borovkov, K., G. Decrouez, and M. Gilson. "On Stationary Distributions of Stochastic Neural Networks." Journal of Applied Probability 51, no. 3 (2014): 837–57. http://dx.doi.org/10.1239/jap/1409932677.

Abstract:
The paper deals with nonlinear Poisson neuron network models with bounded memory dynamics, which can include both Hebbian learning mechanisms and refractory periods. The state of the network is described by the times elapsed since its neurons fired within the post-synaptic transfer kernel memory span, and the current strengths of synaptic connections, the state spaces of our models being hierarchies of finite-dimensional components. We prove the ergodicity of the stochastic processes describing the behaviour of the networks, establish the existence of continuously differentiable stationary distributions […]
38

Rosaci, D. "Stochastic neural networks for transportation systems modeling." Applied Artificial Intelligence 12, no. 6 (1998): 491–500. http://dx.doi.org/10.1080/088395198117631.

39

Verriest, E. I. "Stability of Stochastic Neural Networks with Delays." IFAC Proceedings Volumes 31, no. 20 (1998): 441–46. http://dx.doi.org/10.1016/s1474-6670(17)41834-9.

40

Borovkov, K., G. Decrouez, and M. Gilson. "On Stationary Distributions of Stochastic Neural Networks." Journal of Applied Probability 51, no. 03 (2014): 837–57. http://dx.doi.org/10.1017/s0021900200011700.

Abstract:
The paper deals with nonlinear Poisson neuron network models with bounded memory dynamics, which can include both Hebbian learning mechanisms and refractory periods. The state of the network is described by the times elapsed since its neurons fired within the post-synaptic transfer kernel memory span, and the current strengths of synaptic connections, the state spaces of our models being hierarchies of finite-dimensional components. We prove the ergodicity of the stochastic processes describing the behaviour of the networks, establish the existence of continuously differentiable stationary distributions […]
41

Yu, Wenwu, and Jinde Cao. "Synchronization control of stochastic delayed neural networks." Physica A: Statistical Mechanics and its Applications 373 (January 2007): 252–60. http://dx.doi.org/10.1016/j.physa.2006.04.105.

42

Zhao, Jieyu, John Shawe-Taylor, and Max van Daalen. "Learning in Stochastic Bit Stream Neural Networks." Neural Networks 9, no. 6 (1996): 991–98. http://dx.doi.org/10.1016/0893-6080(96)00025-1.

43

Ochoa-Rivera, Juan Camilo. "Prospecting droughts with stochastic artificial neural networks." Journal of Hydrology 352, no. 1-2 (2008): 174–80. http://dx.doi.org/10.1016/j.jhydrol.2008.01.006.

44

Jay, Patel, Vasu Kalariya, Pushpendra Parmar, Sudeep Tanwar, Neeraj Kumar, and Mamoun Alazab. "Stochastic Neural Networks for Cryptocurrency Price Prediction." IEEE Access 8 (2020): 82804–18. http://dx.doi.org/10.1109/access.2020.2990659.

45

Borkar, V. S., and P. Gupta. "Randomized neural networks for learning stochastic dependences." IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 29, no. 4 (1999): 469–80. http://dx.doi.org/10.1109/3477.775263.

46

Zhu, Y., P. Garda, E. Belhaire, and Y. Ni. "Autocompensated capacitive circuit for stochastic neural networks." Electronics Letters 30, no. 4 (1994): 330–31. http://dx.doi.org/10.1049/el:19940250.

47

Ginzburg, Iris, and Haim Sompolinsky. "Theory of correlations in stochastic neural networks." Physical Review E 50, no. 4 (1994): 3171–91. http://dx.doi.org/10.1103/physreve.50.3171.

48

Huang, Chuangxia, Yigang He, and Ping Chen. "Dynamic Analysis of Stochastic Recurrent Neural Networks." Neural Processing Letters 27, no. 3 (2008): 267–76. http://dx.doi.org/10.1007/s11063-008-9075-z.

49

Fernandes, Fernando, Rodrigo de Losso da Silveira Bueno, Pedro Delano Cavalcanti, and Alemayehu Solomon Admasu. "Generating Stochastic Processes Through Convolutional Neural Networks." Journal of Control, Automation and Electrical Systems 31, no. 2 (2020): 294–303. http://dx.doi.org/10.1007/s40313-020-00567-y.

50

Huang, Chuangxia, Lehua Huang, and Yigang He. "Mean Square Exponential Stability of Stochastic Cohen-Grossberg Neural Networks with Unbounded Distributed Delays." Discrete Dynamics in Nature and Society 2010 (2010): 1–15. http://dx.doi.org/10.1155/2010/513218.

Abstract:
This paper addresses the issue of mean square exponential stability of stochastic Cohen-Grossberg neural networks (SCGNN), whose state variables are described by stochastic nonlinear integrodifferential equations. With the help of Lyapunov function, stochastic analysis technique, and inequality techniques, some novel sufficient conditions on mean square exponential stability for SCGNN are given. Furthermore, we also establish some sufficient conditions for checking exponential stability for Cohen-Grossberg neural networks with unbounded distributed delays.