Journal articles on the topic "Stochastic neural networks"

To view other types of publications on this topic, follow the link: Stochastic neural networks.

Format your source in APA, MLA, Chicago, Harvard, and other citation styles

Select a source type:

Consult the top 50 journal articles for your research on the topic "Stochastic neural networks".

Next to each work in the reference list you will find an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic citation for the chosen work in your preferred citation style: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the publication as a .pdf file and read its abstract online, whenever these are available in the metadata.

Browse journal articles across a wide range of disciplines and compile your bibliography correctly.

1

Reddy, BhanuTeja, and Usha J.C. "Prediction of Stock Market using Stochastic Neural Networks." International Journal of Innovative Research in Computer Science & Technology 7, no. 5 (September 2019): 128–38. http://dx.doi.org/10.21276/ijircst.2019.7.5.1.

Full text of the source
Citation styles: APA, Harvard, Vancouver, ISO, etc.
2

Wong, Eugene. "Stochastic neural networks." Algorithmica 6, no. 1-6 (June 1991): 466–78. http://dx.doi.org/10.1007/bf01759054.

3

Zhou, Wuneng, Xueqing Yang, Jun Yang, and Jun Zhou. "Stochastic Synchronization of Neutral-Type Neural Networks with Multidelays Based on M-Matrix." Discrete Dynamics in Nature and Society 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/826810.

Abstract:
The problem of stochastic synchronization of neutral-type neural networks with multiple delays based on the M-matrix is studied. First, we design a control law for stochastic synchronization of neutral-type neural networks with multiple time delays. Second, by making use of a Lyapunov functional and the M-matrix method, we obtain a criterion under which the drive and response neutral-type multiple-time-delay neural networks with stochastic disturbance and Markovian switching achieve stochastic synchronization. The synchronization condition is expressed as a linear matrix inequality, which can be easily solved by MATLAB. Finally, a numerical example illustrates the effectiveness of the method and the result obtained in this paper.
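The M-matrix test underlying this kind of criterion is easy to check numerically. A minimal sketch (not the authors' code; the example matrices are illustrative): a square matrix with non-positive off-diagonal entries is a nonsingular M-matrix exactly when all of its leading principal minors are positive.

```python
def det(m):
    # Laplace expansion along the first row (fine for small matrices)
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def is_nonsingular_m_matrix(a):
    """Z-matrix test plus positive leading principal minors."""
    n = len(a)
    if any(a[i][j] > 0 for i in range(n) for j in range(n) if i != j):
        return False  # off-diagonal entries must be non-positive
    return all(det([row[:k] for row in a[:k]]) > 0 for k in range(1, n + 1))

# Diagonally dominant example: a nonsingular M-matrix
assert is_nonsingular_m_matrix([[2.0, -1.0], [-1.0, 2.0]])
# Fails: positive off-diagonal entry
assert not is_nonsingular_m_matrix([[2.0, 1.0], [1.0, 2.0]])
```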
4

Hu, Shigeng, Xiaoxin Liao, and Xuerong Mao. "Stochastic Hopfield neural networks." Journal of Physics A: Mathematical and General 36, no. 9 (February 19, 2003): 2235–49. http://dx.doi.org/10.1088/0305-4470/36/9/303.

5

Gao, Zhan, Elvin Isufi, and Alejandro Ribeiro. "Stochastic Graph Neural Networks." IEEE Transactions on Signal Processing 69 (2021): 4428–43. http://dx.doi.org/10.1109/tsp.2021.3092336.

6

SAKTHIVEL, RATHINASAMY, R. SAMIDURAI, and S. MARSHAL ANTHONI. "EXPONENTIAL STABILITY FOR STOCHASTIC NEURAL NETWORKS OF NEUTRAL TYPE WITH IMPULSIVE EFFECTS." Modern Physics Letters B 24, no. 11 (May 10, 2010): 1099–110. http://dx.doi.org/10.1142/s0217984910023141.

Abstract:
This paper is concerned with the exponential stability of stochastic neural networks of neutral type with impulsive effects. By employing the Lyapunov functional and stochastic analysis, a new stability criterion for the stochastic neural network is derived in terms of linear matrix inequality. A numerical example is provided to show the effectiveness and applicability of the obtained result.
7

Wu, Chunmei, Junhao Hu, and Yan Li. "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays." Discrete Dynamics in Nature and Society 2015 (2015): 1–12. http://dx.doi.org/10.1155/2015/278571.

Abstract:
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given globally exponentially stable hybrid stochastic neural networks, we characterize the upper bounds of the contraction coefficients of the neutral terms and of the time-varying delays by using a transcendental equation. Moreover, we prove theoretically that, for any globally exponentially stable hybrid stochastic neural network, if the additive neutral terms and time-varying delays remain below the derived upper bounds, then the perturbed neural network is guaranteed to remain globally exponentially stable. Finally, a numerical simulation example illustrates the presented criteria.
8

Kanarachos, Andreas E., and Kleanthis T. Geramanis. "Semi-Stochastic Complex Neural Networks." IFAC Proceedings Volumes 31, no. 12 (June 1998): 47–52. http://dx.doi.org/10.1016/s1474-6670(17)36040-8.

9

Peretto, Pierre, and Jean-jacques Niez. "Stochastic Dynamics of Neural Networks." IEEE Transactions on Systems, Man, and Cybernetics 16, no. 1 (January 1986): 73–83. http://dx.doi.org/10.1109/tsmc.1986.289283.

10

Hurtado, Jorge E. "Neural networks in stochastic mechanics." Archives of Computational Methods in Engineering 8, no. 3 (September 2001): 303–42. http://dx.doi.org/10.1007/bf02736646.

11

PARK, JU H., and O. M. KWON. "SYNCHRONIZATION OF NEURAL NETWORKS OF NEUTRAL TYPE WITH STOCHASTIC PERTURBATION." Modern Physics Letters B 23, no. 14 (June 10, 2009): 1743–51. http://dx.doi.org/10.1142/s0217984909019909.

Abstract:
In this letter, the problem of designing a feedback controller to achieve synchronization for neural networks of neutral type with stochastic perturbation is considered. Based on the Lyapunov method and the LMI (linear matrix inequality) framework, the goal of this letter is to derive an existence criterion for the controller for synchronization between the master and response networks.
12

PARK, JU H., and O. M. KWON. "ANALYSIS ON GLOBAL STABILITY OF STOCHASTIC NEURAL NETWORKS OF NEUTRAL TYPE." Modern Physics Letters B 22, no. 32 (December 30, 2008): 3159–70. http://dx.doi.org/10.1142/s0217984908017680.

Abstract:
In this paper, the problem of global asymptotic stability of stochastic neural networks of neutral type is considered. Based on the Lyapunov stability theory, a new delay-dependent stability criterion for the network is derived in terms of LMI (linear matrix inequality). A numerical example is given to show the effectiveness of the proposed method.
13

Hua, Min Gang, Jun Tao Fei, and Wei Li Dai. "Exponential Stability Analysis for Neutral Stochastic Delayed Neural Networks." Advanced Materials Research 354-355 (October 2011): 877–80. http://dx.doi.org/10.4028/www.scientific.net/amr.354-355.877.

Abstract:
In this paper, the generalized Finsler lemma and augmented Lyapunov functional are introduced to establish some improved delay-dependent stability criteria of neutral stochastic delayed neural networks. The stability criteria in the new results improve and generalize existing ones. Two examples are included to show the effectiveness of the results.
14

Verikas, A., and A. Gelzinis. "Training neural networks by stochastic optimisation." Neurocomputing 30, no. 1-4 (January 2000): 153–72. http://dx.doi.org/10.1016/s0925-2312(99)00123-x.

15

Blythe, Steve, Xuerong Mao, and Xiaoxin Liao. "Stability of stochastic delay neural networks." Journal of the Franklin Institute 338, no. 4 (July 2001): 481–95. http://dx.doi.org/10.1016/s0016-0032(01)00016-3.

16

Yang, Yunfeng, and Fengxian Tang. "Network Intrusion Detection Based on Stochastic Neural Networks Method." International Journal of Security and Its Applications 10, no. 8 (August 31, 2016): 435–46. http://dx.doi.org/10.14257/ijsia.2016.10.8.38.

17

Borovkov, K., G. Decrouez, and M. Gilson. "On Stationary Distributions of Stochastic Neural Networks." Journal of Applied Probability 51, no. 3 (September 2014): 837–57. http://dx.doi.org/10.1017/s0021900200011700.

Abstract:
The paper deals with nonlinear Poisson neuron network models with bounded memory dynamics, which can include both Hebbian learning mechanisms and refractory periods. The state of the network is described by the times elapsed since its neurons fired within the post-synaptic transfer kernel memory span, and the current strengths of synaptic connections, the state spaces of our models being hierarchies of finite-dimensional components. We prove the ergodicity of the stochastic processes describing the behaviour of the networks, establish the existence of continuously differentiable stationary distribution densities (with respect to the Lebesgue measures of corresponding dimensionality) on the components of the state space, and find upper bounds for them. For the density components, we derive a system of differential equations that can be solved in a few simplest cases only. Approaches to approximate computation of the stationary density are discussed. One approach is to reduce the dimensionality of the problem by modifying the network so that each neuron cannot fire if the number of spikes it emitted within the post-synaptic transfer kernel memory span reaches a given threshold. We show that the stationary distribution of this ‘truncated’ network converges to that of the unrestricted network as the threshold increases, and that the convergence is at a superexponential rate. A complementary approach uses discrete Markov chain approximations to the network process.
18

Borovkov, K., G. Decrouez, and M. Gilson. "On Stationary Distributions of Stochastic Neural Networks." Journal of Applied Probability 51, no. 3 (September 2014): 837–57. http://dx.doi.org/10.1239/jap/1409932677.

19

Shi, Xiang Dong. "Mean Square Asymptotic Stability of Neutral Stochastic Neural Networks with Multiple Time-Varying Delays." Advanced Materials Research 684 (April 2013): 579–82. http://dx.doi.org/10.4028/www.scientific.net/amr.684.579.

Abstract:
The paper considers the problem of almost sure asymptotic stability for neutral stochastic neural networks with multiple time-varying delays. By applying the Lyapunov functional method and differential inequality techniques, new sufficient conditions are established that ensure the existence and almost sure asymptotic stability of neutral stochastic neural networks with multiple time-varying delays. The results generalize some previously published results and are less conservative than existing ones.
20

Akbar, Mutaqin. "Traffic sign recognition using convolutional neural networks." Jurnal Teknologi dan Sistem Komputer 9, no. 2 (March 5, 2021): 120–25. http://dx.doi.org/10.14710/jtsiskom.2021.13959.

Abstract:
Traffic sign recognition (TSR) identifies traffic signs by means of image processing. This paper presents traffic sign recognition in Indonesia using convolutional neural networks (CNN). The dataset comprises 2050 images of traffic signs, covering 10 kinds of signs. The CNN used in this study consists of one convolution layer, one pooling layer using the max-pool operation, and one fully connected layer. The training algorithm is stochastic gradient descent (SGD). At the training stage, using 1750 training images, 48 filters, and a learning rate of 0.005, the network reaches a loss of 0.005 and 100% accuracy. At the testing stage, using 300 test images, the system recognizes the signs with a loss of 0.107 and 97.33% accuracy.
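The SGD update used to train the network above is the standard one: take one example, compute the gradient, step against it. A minimal self-contained sketch on a toy logistic-regression problem (the data and learning rate are illustrative, not the paper's CNN setup):

```python
import math, random

def sgd_logistic(data, lr=0.5, epochs=200, seed=0):
    """Plain SGD on a 2-feature logistic regression: w <- w - lr * grad."""
    rng = random.Random(seed)
    w = [0.0, 0.0, 0.0]  # two weights plus a bias
    for _ in range(epochs):
        rng.shuffle(data)  # stochastic: one random example at a time
        for x1, x2, y in data:
            z = w[0] * x1 + w[1] * x2 + w[2]
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # d(cross-entropy)/dz for the sigmoid output
            w[0] -= lr * g * x1
            w[1] -= lr * g * x2
            w[2] -= lr * g
    return w

# Linearly separable toy set: label 1 iff both inputs are on (AND)
data = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)]
w = sgd_logistic(data[:])
pred = lambda x1, x2: 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + w[2]))) > 0.5
assert pred(1, 1) and not pred(0, 0)
```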
21

Liu, Qingliang, and Jinmei Lai. "Stochastic Loss Function." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 4884–91. http://dx.doi.org/10.1609/aaai.v34i04.5925.

Abstract:
Training deep neural networks is inherently subject to predefined and fixed loss functions during optimization. To improve learning efficiency, we develop the Stochastic Loss Function (SLF) to dynamically and automatically generate appropriate gradients for training deep networks in the same round of back-propagation, while maintaining the completeness and differentiability of the training pipeline. In SLF, a generic loss function is formulated as a joint optimization problem over network weights and loss parameters. To guarantee the requisite efficiency, gradients with respect to the generic differentiable loss are leveraged for selecting the loss function and optimizing the network weights. Extensive experiments on a variety of popular datasets strongly demonstrate that SLF is capable of obtaining appropriate gradients at different stages of training, and can significantly improve the performance of various deep models on real-world tasks including classification, clustering, regression, neural machine translation, and object detection.
22

EMMERT-STREIB, FRANK. "A HETEROSYNAPTIC LEARNING RULE FOR NEURAL NETWORKS." International Journal of Modern Physics C 17, no. 10 (October 2006): 1501–20. http://dx.doi.org/10.1142/s0129183106009916.

Abstract:
In this article we introduce a novel stochastic Hebb-like learning rule for neural networks that is neurobiologically motivated. This learning rule combines features of unsupervised (Hebbian) and supervised (reinforcement) learning and is stochastic with respect to the selection of the time points at which a synapse is modified. Moreover, the learning rule does not only affect the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also affects more remote synapses of the pre- and postsynaptic neurons. This more complex form of synaptic plasticity has recently come under investigation in neurobiology and is called heterosynaptic plasticity. We demonstrate that this learning rule is useful in training neural networks to learn parity functions, including the exclusive-or (XOR) mapping, in a multilayer feed-forward network. We find that our stochastic learning rule works well even in the presence of noise. Importantly, the mean learning time increases polynomially with the number of patterns to be learned, indicating efficient learning.
23

ROSSELLÓ, JOSEP L., VINCENT CANALS, ANTONI MORRO, and ANTONI OLIVER. "HARDWARE IMPLEMENTATION OF STOCHASTIC SPIKING NEURAL NETWORKS." International Journal of Neural Systems 22, no. 04 (July 25, 2012): 1250014. http://dx.doi.org/10.1142/s0129065712500141.

Abstract:
Spiking neural networks, the latest generation of artificial neural networks, are characterized by their bio-inspired nature and by a higher computational capacity with respect to other neural models. In real biological neurons, stochastic processes represent an important mechanism of neural behavior and are responsible for their special arithmetic capabilities. In this work we present a simple hardware implementation of spiking neurons that takes this probabilistic nature into account. The advantage of the proposed implementation is that it is fully digital and can therefore be massively implemented in field-programmable gate arrays. The high computational capabilities of the proposed model are demonstrated by the study of both feed-forward and recurrent networks that are able to implement high-speed signal filtering and to solve complex systems of linear equations.
24

Wang, Yanzhi, Zheng Zhan, Liang Zhao, Jian Tang, Siyue Wang, Jiayu Li, Bo Yuan, Wujie Wen, and Xue Lin. "Universal Approximation Property and Equivalence of Stochastic Computing-Based Neural Networks and Binary Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 5369–76. http://dx.doi.org/10.1609/aaai.v33i01.33015369.

Abstract:
Large-scale deep neural networks are both memory- and computation-intensive, thereby posing stringent requirements on the computing platforms. Hardware accelerations of deep neural networks have been extensively investigated. Specific forms of binary neural networks (BNNs) and stochastic computing-based neural networks (SCNNs) are particularly appealing for hardware implementations since they can be implemented almost entirely with binary operations. Despite the obvious advantages in hardware implementation, these approximate computing techniques are questioned by researchers in terms of accuracy and universal applicability. It is also important to understand the relative pros and cons of SCNNs and BNNs in theory and in actual hardware implementations. In order to address these concerns, in this paper we prove that the "ideal" SCNNs and BNNs satisfy the universal approximation property with probability 1 (due to the stochastic behavior), which is a new angle on the classical approximation property. The proof is conducted by first proving the property for SCNNs from the strong law of large numbers, and then using SCNNs as a "bridge" to prove it for BNNs. Besides the universal approximation property, we also derive an appropriate bound for the bit length M in order to provide insights for actual neural network implementations. Based on the universal approximation property, we further prove that SCNNs and BNNs exhibit the same energy complexity; in other words, they have the same asymptotic energy consumption with the growth of network size. We also provide a detailed analysis of the pros and cons of SCNNs and BNNs for hardware implementations and conclude that SCNNs are more suitable.
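The stochastic-computing encoding that SCNNs rely on can be illustrated directly: a value in [0, 1] becomes a Bernoulli bit stream with that mean, multiplication reduces to a bitwise AND of two independent streams, and by the strong law of large numbers the estimate converges as the bit length M grows. A sketch with illustrative stream length and values:

```python
import random

def to_stream(p, m, rng):
    """Encode p in [0, 1] as a length-m Bernoulli bit stream with mean p."""
    return [1 if rng.random() < p else 0 for _ in range(m)]

def sc_multiply(a, b, m=100_000, seed=1):
    """Stochastic-computing product: bitwise AND of two independent streams."""
    rng = random.Random(seed)
    sa, sb = to_stream(a, m, rng), to_stream(b, m, rng)
    return sum(x & y for x, y in zip(sa, sb)) / m

est = sc_multiply(0.6, 0.5)
assert abs(est - 0.3) < 0.01  # close to the exact product for large M
```

Shrinking `m` widens the error, which is the accuracy/bit-length trade-off the bound on M in the paper addresses.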
25

Zhou, Wuneng, Xueqing Yang, Jun Yang, Anding Dai, and Huashan Liu. "Almost Sure Asymptotical Adaptive Synchronization for Neutral-Type Neural Networks with Stochastic Perturbation and Markovian Switching." Mathematical Problems in Engineering 2014 (2014): 1–13. http://dx.doi.org/10.1155/2014/479084.

Abstract:
The problem of almost sure (a.s.) asymptotic adaptive synchronization for neutral-type neural networks with stochastic perturbation and Markovian switching is investigated. First, we propose a new criterion of a.s. asymptotic stability for a general neutral-type stochastic differential equation which extends existing results. Second, based on this stability criterion, by making use of the Lyapunov functional method and designing an adaptive controller, we obtain a condition for a.s. asymptotic adaptive synchronization of neutral-type neural networks with stochastic perturbation and Markovian switching. The synchronization condition is expressed as a linear matrix inequality, which can be easily solved by MATLAB. Finally, a numerical example illustrates the effectiveness of the method and the result obtained in this paper.
26

DEBIEC, PIOTOR, LUKASZ KORNATOWSKI, KRZYSZTOF SLOT, and HYONGSUK KIM. "TEXTURE GENERATION USING CELLULAR NEURAL NETWORKS." International Journal of Bifurcation and Chaos 16, no. 12 (December 2006): 3655–68. http://dx.doi.org/10.1142/s021812740601704x.

Abstract:
The following paper introduces an application of Cellular Neural Networks for the generation of predetermined stochastic textures. The key element for the task realization is an appropriate selection of template elements, which should provide a transformation of initial, random CNN state into a stable equilibrium, featuring desired perceptual properties. A template derivation procedure comprises two steps: linear CNN design, followed by a template-refinement procedure that involves nonlinear optimization. In addition, a procedure that extends CNN texture rendition capabilities into a realm of non-pure stochastic textures is proposed.
27

Tosh, Colin R., and Graeme D. Ruxton. "The need for stochastic replication of ecological neural networks." Philosophical Transactions of the Royal Society B: Biological Sciences 362, no. 1479 (January 18, 2007): 455–60. http://dx.doi.org/10.1098/rstb.2006.1973.

Abstract:
Artificial neural networks are becoming increasingly popular as predictive statistical tools in ecosystem ecology and as models of signal processing in behavioural and evolutionary ecology. We demonstrate here that a commonly used network in ecology, the three-layer feed-forward network, trained with the backpropagation algorithm, can be extremely sensitive to the stochastic variation in training data that results from random sampling of the same underlying statistical distribution, with networks converging to several distinct predictive states. Using a random walk procedure to sample error–weight space, and Sammon dimensional reduction of weight arrays, we demonstrate that these different predictive states are not artefactual, due to local minima, but lie at the base of major error troughs in the error–weight surface. We further demonstrate that various gross weight compositions can produce the same predictive state, suggesting the analogy of weight space as a ‘patchwork’ of multiple predictive states. Our results argue for increased inclusion of stochastic training replication and analysis into ecological and behavioural applications of artificial neural networks.
28

Wan, Li, and Qinghua Zhou. "Ultimate Boundedness of Stochastic Hopfield Neural Networks." Journal of Convergence Information Technology 6, no. 8 (August 31, 2011): 244–48. http://dx.doi.org/10.4156/jcit.vol6.issue8.28.

29

Verriest, E. I. "Stability of Stochastic Neural Networks with Delays." IFAC Proceedings Volumes 31, no. 20 (July 1998): 441–46. http://dx.doi.org/10.1016/s1474-6670(17)41834-9.

30

Huang, Chuangxia, Yigang He, and Ping Chen. "Dynamic Analysis of Stochastic Recurrent Neural Networks." Neural Processing Letters 27, no. 3 (April 11, 2008): 267–76. http://dx.doi.org/10.1007/s11063-008-9075-z.

31

Rosaci, D. "Stochastic neural networks for transportation systems modeling." Applied Artificial Intelligence 12, no. 6 (September 1998): 491–500. http://dx.doi.org/10.1080/088395198117631.

32

Yu, Wenwu, and Jinde Cao. "Synchronization control of stochastic delayed neural networks." Physica A: Statistical Mechanics and its Applications 373 (January 2007): 252–60. http://dx.doi.org/10.1016/j.physa.2006.04.105.

33

Jay, Patel, Vasu Kalariya, Pushpendra Parmar, Sudeep Tanwar, Neeraj Kumar, and Mamoun Alazab. "Stochastic Neural Networks for Cryptocurrency Price Prediction." IEEE Access 8 (2020): 82804–18. http://dx.doi.org/10.1109/access.2020.2990659.

34

Borkar, V. S., and P. Gupta. "Randomized neural networks for learning stochastic dependences." IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics) 29, no. 4 (1999): 469–80. http://dx.doi.org/10.1109/3477.775263.

35

Ginzburg, Iris, and Haim Sompolinsky. "Theory of correlations in stochastic neural networks." Physical Review E 50, no. 4 (October 1, 1994): 3171–91. http://dx.doi.org/10.1103/physreve.50.3171.

36

Zhu, Y., P. Garda, E. Belhaire, and Y. Ni. "Autocompensated capacitive circuit for stochastic neural networks." Electronics Letters 30, no. 4 (February 17, 1994): 330–31. http://dx.doi.org/10.1049/el:19940250.

37

Zhao, Jieyu, John Shawe-Taylor, and Max van Daalen. "Learning in Stochastic Bit Stream Neural Networks." Neural Networks 9, no. 6 (August 1996): 991–98. http://dx.doi.org/10.1016/0893-6080(96)00025-1.

38

Fernandes, Fernando, Rodrigo de Losso da Silveira Bueno, Pedro Delano Cavalcanti, and Alemayehu Solomon Admasu. "Generating Stochastic Processes Through Convolutional Neural Networks." Journal of Control, Automation and Electrical Systems 31, no. 2 (January 31, 2020): 294–303. http://dx.doi.org/10.1007/s40313-020-00567-y.

39

Ochoa-Rivera, Juan Camilo. "Prospecting droughts with stochastic artificial neural networks." Journal of Hydrology 352, no. 1-2 (April 2008): 174–80. http://dx.doi.org/10.1016/j.jhydrol.2008.01.006.

40

Wang, Jie, Jun Wang, Wen Fang, and Hongli Niu. "Financial Time Series Prediction Using Elman Recurrent Random Neural Networks." Computational Intelligence and Neuroscience 2016 (2016): 1–14. http://dx.doi.org/10.1155/2016/4742515.

Abstract:
In recent years, forecasting financial market dynamics has been a focus of economic research. To predict the price indices of stock markets, we developed an architecture which combines Elman recurrent neural networks with a stochastic time-effective function. The proposed model is analyzed with linear regression, complexity-invariant distance (CID), and multiscale CID (MCID) methods and compared with different models such as the backpropagation neural network (BPNN), the stochastic time-effective neural network (STNN), and the Elman recurrent neural network (ERNN); the empirical results show that the proposed network displays the best performance among these neural networks in financial time series forecasting. Further, the predictive performance of the established model is tested on the SSE, TWSE, KOSPI, and Nikkei225 indices, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach performs well in predicting stock market index values.
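The Elman architecture referenced above differs from a plain feed-forward network by a context layer that feeds the previous hidden state back into the hidden layer. A minimal forward pass over a scalar sequence (random placeholder weights, not the paper's trained model):

```python
import math, random

def elman_forward(seq, n_hidden=4, seed=0):
    """One pass of an Elman RNN: h_t = tanh(W_x x_t + W_h h_{t-1}), then a
    linear readout of the final hidden state."""
    rng = random.Random(seed)
    w_x = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    w_h = [[rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
           for _ in range(n_hidden)]
    w_o = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    h = [0.0] * n_hidden  # context layer starts at zero
    for x in seq:
        h = [math.tanh(w_x[i] * x + sum(w_h[i][j] * h[j] for j in range(n_hidden)))
             for i in range(n_hidden)]
    return sum(w_o[i] * h[i] for i in range(n_hidden))

y = elman_forward([0.1, 0.2, 0.3])
assert isinstance(y, float) and -4.0 < y < 4.0  # tanh keeps activations bounded
```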
41

Fan, Tongke. "Research on Optimal CDMA Multiuser Detection Based on Stochastic Hopfield Neural Network." Recent Patents on Computer Science 12, no. 3 (May 8, 2019): 233–40. http://dx.doi.org/10.2174/2213275912666181210103742.

Abstract:
Background: Most common multi-user detection techniques suffer from heavy computation and slow operation. Hopfield neural networks offer high-speed search and parallel processing, but suffer from local convergence. Objective: The stochastic Hopfield neural network avoids local convergence by introducing noise into the state variables, and thereby achieves optimal detection. Methods: Based on a study of the CDMA communication model, this paper formulates and models the multi-user detection problem. A new stochastic Hopfield neural network is then obtained by introducing a stochastic disturbance into the traditional Hopfield neural network. Finally, CDMA multi-user detection is simulated. Conclusion: The results show that introducing a stochastic disturbance into the Hopfield neural network helps it jump out of local minima, thus reaching the global minimum and improving performance.
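The escape mechanism described here, noise injected into the state updates, can be sketched on a tiny Hopfield network (the couplings, temperature, and annealing schedule are illustrative, not the paper's CDMA setup):

```python
import math, random

def noisy_hopfield(w, steps=400, temp=1.0, cooling=0.99, seed=2):
    """Glauber-style stochastic update: the flip probability is a sigmoid of
    the local field over temperature, so noise can kick the state out of
    local minima; the noise is annealed away, then noise-free sweeps settle
    the state into a nearby minimum."""
    rng = random.Random(seed)
    n = len(w)
    s = [rng.choice([-1, 1]) for _ in range(n)]
    field = lambda i: sum(w[i][j] * s[j] for j in range(n) if j != i)
    for _ in range(steps):
        i = rng.randrange(n)
        x = -2.0 * field(i) / temp
        p_up = 1.0 / (1.0 + math.exp(max(-50.0, min(50.0, x))))  # clamp exp
        s[i] = 1 if rng.random() < p_up else -1
        temp *= cooling  # shrink the stochastic disturbance
    for _ in range(2):  # deterministic clean-up sweeps
        for i in range(n):
            s[i] = 1 if field(i) > 0 else -1
    return s

# Ferromagnetic couplings: the two fully aligned states are the global minima
w = [[0.0 if i == j else 1.0 for j in range(4)] for i in range(4)]
assert abs(sum(noisy_hopfield(w))) == 4  # settled into an aligned state
```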
42

Zhuo, Li’an, Baochang Zhang, Chen Chen, Qixiang Ye, Jianzhuang Liu, and David Doermann. "Calibrated Stochastic Gradient Descent for Convolutional Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 33 (July 17, 2019): 9348–55. http://dx.doi.org/10.1609/aaai.v33i01.33019348.

Abstract:
In stochastic gradient descent (SGD) and its variants, the optimized gradient estimators may be as expensive to compute as the true gradient in many scenarios. This paper introduces a calibrated stochastic gradient descent (CSGD) algorithm for deep neural network optimization. A theorem is developed to prove that an unbiased estimator for the network variables can be obtained in a probabilistic way based on the Lipschitz hypothesis. Our work is significantly distinct from existing gradient optimization methods in providing a theoretical framework for unbiased variable estimation in the deep learning paradigm to optimize the model parameter calculation. In particular, we develop a generic gradient calibration layer which can be easily used to build convolutional neural networks (CNNs). Experimental results demonstrate that CNNs with our CSGD optimization scheme can improve the state-of-the-art performance for natural image classification, digit recognition, ImageNet object classification, and object detection tasks. This work opens new research directions for developing more efficient SGD updates and analyzing the backpropagation algorithm.
43

TAN, Z., and M. K. ALI. "PATTERN RECOGNITION WITH STOCHASTIC RESONANCE IN A GENERIC NEURAL NETWORK." International Journal of Modern Physics C 11, no. 08 (December 2000): 1585–93. http://dx.doi.org/10.1142/s0129183100001413.

Abstract:
We discuss stochastic resonance in associative memory with a canonical neural network model that describes the generic behavior of a large family of dynamical systems near bifurcation. Our result shows that stochastic resonance helps memory association. The relationship between stochastic resonance, associative memory, storage load, history of memory and initial states are studied. In intelligent systems like neural networks, it is likely that stochastic resonance combined with synaptic information enhances memory recalls.
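Stochastic resonance itself is easy to demonstrate with a threshold detector: a sub-threshold periodic signal is invisible without noise, best recovered at moderate noise, and drowned at high noise. A sketch with illustrative threshold, amplitude, and noise levels (not the canonical network model of the paper):

```python
import math, random

def detection_score(noise_std, threshold=1.0, amp=0.6, n=20_000, seed=3):
    """Correlation between a sub-threshold sine drive and the thresholded
    noisy signal; higher means the periodic input is better recovered."""
    rng = random.Random(seed)
    score = 0.0
    for t in range(n):
        s = math.sin(2 * math.pi * t / 100)  # period-100 drive, below threshold
        out = 1.0 if amp * s + rng.gauss(0, noise_std) > threshold else 0.0
        score += s * out
    return score / n

quiet = detection_score(0.01)   # almost no noise: signal never crosses
medium = detection_score(0.5)   # moderate noise: resonance region
loud = detection_score(5.0)     # heavy noise: signal drowned
assert quiet < medium and loud < medium  # non-monotone in noise level
```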
APA, Harvard, Vancouver, ISO, and other styles
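The paper's canonical near-bifurcation model is not reproduced here. A minimal sketch of the setting it studies, stochastic recall in a Hopfield-style associative memory, where noisy (Glauber) dynamics retrieve a corrupted stored pattern, can be written as follows (network size, load, and noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 3                       # neurons, stored patterns (assumptions)

patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N     # Hebbian weight matrix
np.fill_diagonal(W, 0.0)

def recall(cue, beta, steps):
    """Asynchronous stochastic (Glauber) dynamics; beta = inverse noise."""
    s = cue.copy()
    for _ in range(steps):
        i = rng.integers(N)
        h = W[i] @ s                # local field on neuron i
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
        s[i] = 1 if rng.random() < p_up else -1
    return s

cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1                     # corrupt 10% of the stored pattern
out = recall(cue, beta=4.0, steps=2000)
overlap = out @ patterns[0] / N
print(overlap)                      # near 1 when the memory is retrieved
```

Sweeping `beta` in such a sketch is one crude way to probe how an intermediate noise level affects recall from a weak cue, which is the stochastic-resonance question the paper addresses in its own model.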
44

Soula, Hédi, and Carson C. Chow. "Stochastic Dynamics of a Finite-Size Spiking Neural Network." Neural Computation 19, no. 12 (December 2007): 3262–92. http://dx.doi.org/10.1162/neco.2007.19.12.3262.

Full text source
Abstract:
We present a simple Markov model of spiking neural dynamics that can be analytically solved to characterize the stochastic dynamics of a finite-size spiking neural network. We give closed-form estimates for the equilibrium distribution, mean rate, variance, and autocorrelation function of the network activity. The model is applicable to any network where the probability of firing of a neuron in the network depends on only the number of neurons that fired in a previous temporal epoch. Networks with statistically homogeneous connectivity and membrane and synaptic time constants that are not excessively long could satisfy these conditions. Our model completely accounts for the size of the network and correlations in the firing activity. It also allows us to examine how the network dynamics can deviate from mean field theory. We show that the model and solutions are applicable to spiking neural networks in biophysically plausible parameter regimes.
APA, Harvard, Vancouver, ISO, and other styles
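The abstract's defining assumption, that each neuron's firing probability depends only on how many neurons fired in the previous epoch, makes the active count a one-dimensional Markov chain that can be simulated directly. A toy sketch (the gain function and network size are assumptions, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200                            # network size (an assumption)

def gain(x):
    """Firing probability given the fraction active in the last epoch."""
    return 0.2 + 0.6 * x           # illustrative gain; fixed point at 0.5

# Every neuron fires independently with probability gain(n_prev / N),
# so the active count evolves as n_next ~ Binomial(N, gain(n_prev / N)).
T = 20000
n = np.empty(T, dtype=int)
n[0] = N // 2
for t in range(1, T):
    n[t] = rng.binomial(N, gain(n[t - 1] / N))

burn = 1000
mean_rate = n[burn:].mean() / N    # empirical mean rate
var = n[burn:].var()               # finite-size fluctuations
print(mean_rate, var)
```

The empirical variance exceeds the single-step binomial value `N * p * (1 - p)` because successive counts are correlated through the gain, the kind of finite-size deviation from mean-field theory the paper characterizes in closed form.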
45

Xu, Yuhua, Jinmeng Wang, Wuneng Zhou, and Xin Wang. "Synchronization in pth Moment for Stochastic Chaotic Neural Networks with Finite-Time Control." Complexity 2019 (January 3, 2019): 1–8. http://dx.doi.org/10.1155/2019/2916364.

Full text source
Abstract:
Finite-time synchronization in pth moment is considered for time-varying stochastic chaotic neural networks. Compared with some existing results on the finite-time mean-square stability of stochastic neural networks, we obtain useful criteria of finite-time synchronization in pth moment for chaotic neural networks based on finite-time nonlinear feedback control and finite-time adaptive feedback control, which are efficient and easy to implement in practical applications. Finally, a numerical example is given to illustrate the validity of the derived synchronization conditions.
APA, Harvard, Vancouver, ISO, and other styles
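A hedged toy sketch of the drive-response setup behind such results, simulated with an Euler–Maruyama step: a scalar "neuron" (not the paper's network), a standard linear feedback law rather than the paper's finite-time controller, and error-proportional noise, all chosen for illustration only:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, T, k = 0.001, 20000, 5.0      # Euler step, iterations, feedback gain

def f(x):
    return -x + 2.0 * np.tanh(x)  # toy scalar "neuron" dynamics

x, y = 0.5, -0.8                  # drive and response states
for _ in range(T):
    dW = np.sqrt(dt) * rng.normal()   # Brownian increment
    e = y - x                         # synchronization error
    x += f(x) * dt
    # Response: same dynamics, linear feedback u = -k*e, and
    # multiplicative (error-proportional) stochastic disturbance.
    y += (f(y) - k * e) * dt + 0.1 * e * dW

print(abs(y - x))                 # error contracted essentially to zero
```

Because the noise intensity vanishes with the error, a sufficiently large gain `k` dominates the drift and drives the error to zero; the paper's finite-time laws replace the linear term with a fractional-power feedback to guarantee convergence within a finite settling time.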
46

SHAFEE, FARIEL. "STOCHASTIC DYNAMICS OF NETWORKS WITH SEMICLASSICAL EXCITATIONS." Stochastics and Dynamics 07, no. 03 (September 2007): 403–16. http://dx.doi.org/10.1142/s0219493707002086.

Full text source
Abstract:
As artificial intelligence seems destined to go for devices at the nano-scale level involving quantized interactions, we investigate the changes brought forth by a transition of a dynamical classical neural system to a quantum one. First, we consider cases of interaction between neurons and with an external source, which are analytically tractable, and find relations for their average times of transition between different quantum states. We get simple relations between the average transition time and the effective potential in limits of resonance and small damping. Such potentials, with harmonic and/or damped behavior, may be quite important in nanoelectronic devices. We then present a network set up with interactions between nodes similar to classical biological-like action potentials with finite durations, and compare it with the classical integrate-and-fire neural network. As in the classical systems, we obtain from such networks dynamical behavior differing according to the type of static input. We have examined the role of the novel stochasticity introduced by the quantum potential which fuzzifies the logical links, in contrast to the system with the deterministic classical neural network. The dynamic quantum system also shows oscillations similar to biological systems. The average periodic behavior of the system, as found from simulations, agrees remarkably well with a simple formula with a single parameter to account for the intrinsic randomness of the system. Short-term retentivity of input memory is observed to be subtle but perceptible, and may be useful in designing devices with quantum networks that need a gradual auto-erasing facility, which too, being quasi-biological, may be a desirable feature in systems trying to mimic living neural systems.
APA, Harvard, Vancouver, ISO, and other styles
47

Wan, Li, and Qing Hua Zhou. "Ultimate Boundedness of Stochastic Neural Networks with Delays." Advanced Materials Research 204-210 (February 2011): 1549–52. http://dx.doi.org/10.4028/www.scientific.net/amr.204-210.1549.

Full text source
Abstract:
Although the ultimate boundedness of several classes of neural networks with constant delays has been studied by some researchers, the inherent randomness associated with signal transmission was not taken into account in these networks. At present, few authors study the ultimate boundedness of stochastic neural networks, and no related papers have been reported. In this paper, by using a Lyapunov functional and linear matrix inequalities, some sufficient conditions ensuring the ultimate boundedness of stochastic neural networks with time-varying delays are established. Our criteria are easily tested with the Matlab LMI Toolbox. One example is given to demonstrate our criteria.
APA, Harvard, Vancouver, ISO, and other styles
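Criteria of this kind ultimately reduce to verifying that a matrix assembled from the system data is negative definite. In place of the LMI Toolbox, a toy numerical check (the matrices below are hypothetical, not the paper's) can be sketched as:

```python
import numpy as np

# Hypothetical system data (not from the paper): check a Lyapunov-type
# condition A^T P + P A + Q < 0 for given P > 0 and Q >= 0.
A = np.array([[-2.0, 0.5], [0.3, -1.5]])
P = np.eye(2)
Q = 0.1 * np.eye(2)

M = A.T @ P + P @ A + Q
eigs = np.linalg.eigvalsh((M + M.T) / 2)  # symmetrize before the check
feasible = bool(np.all(eigs < 0))
print(feasible)                           # True: the condition holds
```

A dedicated LMI solver additionally searches for the unknown `P`, whereas this sketch only verifies a candidate; for the delayed stochastic networks of the paper the matrix inequality is larger but is checked in the same spirit.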
48

Nikolic, Kostantin. "Training Neural Network Elements Created From Long Shot Term Memory." Oriental journal of computer science and technology 10, no. 1 (March 1, 2017): 01–10. http://dx.doi.org/10.13005/ojcst/10.01.01.

Full text source
Abstract:
This paper presents the application of stochastic search algorithms to train artificial neural networks. The methodology in this work was created primarily to provide for the training of complex recurrent neural networks. It is known that training recurrent networks is more complex than training feedforward neural networks. Simulation of the recurrent network realizes signal propagation from input to output, and the training process performs a stochastic search in the parameter space. The performance of this type of algorithm is superior to most training algorithms that are based on the concept of a gradient. The efficiency of these algorithms is demonstrated by training networks created from units characterized by long-term and long short-term memory. The presented methodology is effective and relatively simple.
APA, Harvard, Vancouver, ISO, and other styles
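The core idea, training by stochastic search (random perturbation with greedy acceptance) instead of gradient descent, can be sketched on a toy problem. The paper targets recurrent/LSTM units; the tiny feedforward XOR network below, and every size and step width in it, is purely an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy task: XOR with a tiny 2-2-1 network (9 parameters), trained by
# stochastic search -- perturb randomly, keep the candidate only if
# the loss improves -- with no gradients or backpropagation at all.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, X):
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = np.tanh(X @ W1 + b1)          # hidden layer
    return h @ W2 + b2                # linear output

def loss(w):
    return float(np.mean((forward(w, X) - y) ** 2))

best_w, best = None, np.inf
for restart in range(3):              # a few random restarts
    w = rng.normal(size=9)
    l = loss(w)
    for _ in range(10000):
        cand = w + 0.1 * rng.normal(size=9)   # random perturbation
        lc = loss(cand)
        if lc < l:                            # greedy acceptance
            w, l = cand, lc
    if l < best:
        best_w, best = w, l

print(best)   # well below the 0.25 loss of a constant predictor
```

The same accept-if-better loop applies unchanged to a recurrent network: the "forward" call simply becomes a simulation of the network over time, which is exactly why this family of methods sidesteps the difficulties of gradient propagation through recurrence.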
49

Zhou, Liqun, and Guangda Hu. "Almost sure exponential stability of neutral stochastic delayed cellular neural networks." Journal of Control Theory and Applications 6, no. 2 (May 2008): 195–200. http://dx.doi.org/10.1007/s11768-008-7036-8.

Full text source
APA, Harvard, Vancouver, ISO, and other styles
50

Amit, Daniel J., and Stefano Fusi. "Learning in Neural Networks with Material Synapses." Neural Computation 6, no. 5 (September 1994): 957–82. http://dx.doi.org/10.1162/neco.1994.6.5.957.

Full text source
Abstract:
We discuss the long-term maintenance of acquired memory in synaptic connections of a perpetually learning electronic device. This is effected by ascribing to each synapse a finite number of stable states in which it can be maintained for indefinitely long periods. Learning uncorrelated stimuli is expressed as a stochastic process produced by the neural activities on the synapses. In several interesting cases the stochastic process can be analyzed in detail, leading to a clarification of the performance of the network, as an associative memory, during the process of uninterrupted learning. The stochastic nature of the process and the existence of an asymptotic distribution for the synaptic values in the network imply generically that the memory is a palimpsest but capacity is as low as log N for a network of N neurons. The only way we find for avoiding this tight constraint is to allow the parameters governing the learning process (the coding level of the stimuli, the transition probabilities for potentiation and depression, and the number of stable synaptic levels) to depend on the number of neurons. It is shown that a network with synapses that have two stable states can dynamically learn with optimal storage efficiency, be a palimpsest, and maintain its (associative) memory for an indefinitely long time provided the coding level is low and depression is equilibrated against potentiation. We suggest that an option so easily implementable in material devices would not have been overlooked by biology. Finally, we discuss stochastic learning on synapses with a variable number of stable synaptic states.
APA, Harvard, Vancouver, ISO, and other styles
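The palimpsest behavior of two-state synapses is easy to see in simulation. The sketch below is a simplified caricature of the model, each presentation moves a mismatched binary synapse toward the stimulus with a small probability, so an imprinted pattern decays geometrically as later uncorrelated patterns overwrite it; all probabilities and sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 20000                 # synapses (size is an assumption)
q = 0.05                  # potentiation/depression probability

J = rng.integers(0, 2, size=N)    # two-state (binary) synapses

def present(J, xi):
    """One stimulus: each mismatched synapse moves toward xi w.p. q."""
    pot = (xi == 1) & (J == 0) & (rng.random(N) < q)   # potentiation
    dep = (xi == 0) & (J == 1) & (rng.random(N) < q)   # depression
    J = J.copy()
    J[pot] = 1
    J[dep] = 0
    return J

xi0 = rng.integers(0, 2, size=N)
for _ in range(5):                # imprint pattern 0 a few times
    J = present(J, xi0)

signal = []                       # memory trace of pattern 0
for _ in range(200):              # overwrite with uncorrelated stimuli
    J = present(J, rng.integers(0, 2, size=N))
    signal.append(float(np.mean(J == xi0) - 0.5))

print(signal[0], signal[-1])      # the trace decays: a palimpsest
```

The trace shrinks by roughly a factor (1 - q) per uncorrelated presentation, so small transition probabilities and sparse (low coding level) stimuli slow the forgetting, the trade-off the paper analyzes.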