Journal articles on the topic "Backpropagation"

Follow this link to see other types of publications on the topic: Backpropagation.

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic "Backpropagation".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Andrew, Alex M. "Backpropagation". Kybernetes 30, no. 9/10 (December 2001): 1110–17. http://dx.doi.org/10.1108/03684920110405601.
2

Faisal, Faisal. "Penggunaan Metode Backpropagation Pada Sistem Prediksi Kelulusan Mahasiswa STMIK Kaputama Binjai". Data Sciences Indonesia (DSI) 2, no. 1 (August 10, 2022): 13–19. http://dx.doi.org/10.47709/dsi.v2i1.1664.

Abstract
Graduating on time is one benchmark of a college's integrity, including at STMIK Kaputama Binjai. From year to year many STMIK Kaputama Binjai students graduate on time, but quite a few do not. A graduation-prediction system is therefore needed so that lecturers can guide students who are predicted to graduate late. The method used is the Backpropagation artificial neural network. The Backpropagation architecture has three layers, namely an input layer, a hidden layer, and an output layer, and its training process comprises a forward and a backward pass. The data used are semester GPA records (IPS1 through IPS4) for the 2015-2021 graduating classes of the Informatics Engineering study program: data from students who have already graduated serve as training data for the Backpropagation network, while data from students who are still enrolled, provided they have completed semester 4, can be used as test data for graduation prediction. Experiments with different maximum iterations, maximum learning rates, and minimum error targets, as well as different training data, produce different prediction accuracies; the highest test accuracy corresponds to the smallest error. The system was built in the Visual Basic programming language with Visual Studio 2010. The results show that the Backpropagation method performs quite well as a classifier for predicting student graduation.
3

Sexton, Randall S., Robert E. Dorsey, and John D. Johnson. "Beyond Backpropagation". Journal of Organizational and End User Computing 11, no. 3 (July 1999): 3–10. http://dx.doi.org/10.4018/joeuc.1999070101.
4

Adigun, Olaoluwa, and Bart Kosko. "Bidirectional Backpropagation". IEEE Transactions on Systems, Man, and Cybernetics: Systems 50, no. 5 (May 2020): 1982–94. http://dx.doi.org/10.1109/tsmc.2019.2916096.
5

Irawan, Eka, M. Zarlis, and Erna Budhiarti Nababan. "ANALISIS PENAMBAHAN NILAI MOMENTUM PADA PREDIKSI PRODUKTIVITAS KELAPA SAWIT MENGGUNAKAN BACKPROPAGATION". InfoTekJar (Jurnal Nasional Informatika dan Teknologi Jaringan) 1, no. 2 (March 3, 2017): 84–89. http://dx.doi.org/10.30743/infotekjar.v1i2.67.

Abstract
The backpropagation algorithm is a multilayer perceptron widely used to solve a broad range of problems, but it also has a limitation, namely a rather slow convergence rate. In this study the authors add an adaptive learning-rate parameter at each iteration and a momentum coefficient for computing the weight updates. Computer simulations yield a comparison between standard backpropagation and backpropagation with momentum: the momentum-augmented, adaptive-learning algorithm converged in 727 epochs with an MSE of 0.01, whereas standard backpropagation required 4000 epochs with an MSE of 0.001. This shows that adaptive-learning backpropagation reaches convergence faster than the standard backpropagation algorithm.
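The momentum-augmented weight update this abstract describes can be sketched as follows; this is a generic Python illustration, where the learning rate, momentum coefficient, and toy objective are hypothetical choices, not values from the paper:

```python
import numpy as np

# Gradient descent with a momentum term, the update added to standard
# backpropagation:  dw(t) = -lr * grad(t) + mu * dw(t-1)

def train_with_momentum(grad_fn, w0, lr=0.1, mu=0.9, steps=100):
    w = np.asarray(w0, dtype=float)
    velocity = np.zeros_like(w)        # previous weight change dw(t-1)
    for _ in range(steps):
        g = grad_fn(w)                 # dE/dw from the backward pass
        velocity = -lr * g + mu * velocity
        w = w + velocity
    return w

# Toy objective E(w) = ||w||^2 / 2, so grad = w; the minimum is at 0.
w_final = train_with_momentum(lambda w: w, w0=[4.0, -2.0])
```

The momentum term reuses part of the previous step, which damps oscillation across steep directions and speeds convergence along shallow ones.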
6

Nafisah, Zumrotun, Febrian Rachmadi, and Elly Matul Imah. "Face Recognition Using Complex Valued Backpropagation". Jurnal Ilmu Komputer dan Informasi 11, no. 2 (June 29, 2018): 103. http://dx.doi.org/10.21609/jiki.v11i2.617.

Abstract
Face recognition is one of the biometric research areas that is still of interest. This study discusses the Complex-Valued Backpropagation algorithm for face recognition. Complex-Valued Backpropagation is a modification of the Real-Valued Backpropagation algorithm in which the weights and activation functions are complex-valued. The dataset used in this study consists of 250 images classified into 5 classes. The performance of face recognition using Complex-Valued Backpropagation is also compared with the Real-Valued Backpropagation algorithm. Experimental results show that Complex-Valued Backpropagation performs better than Real-Valued Backpropagation.
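As a rough illustration of what complex-valued weights mean in practice, here is a minimal sketch of a single linear complex-valued neuron trained with the classic complex-LMS rule; this is an assumption chosen for brevity, not the paper's algorithm:

```python
import numpy as np

# Complex-LMS update for a single linear neuron:
#   w <- w + lr * e * conj(x),  with error e = target - w @ x.
# Weights, inputs, and errors are all complex numbers.

rng = np.random.default_rng(0)
w_true = np.array([1 + 2j, -0.5j])     # hypothetical target weights
w = np.zeros(2, dtype=complex)

for _ in range(2000):
    x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
    e = w_true @ x - w @ x             # complex prediction error
    w = w + 0.05 * e * np.conj(x)

# w now closely approximates w_true
```

The conjugate on the input in the update is what makes the step a descent direction for the real-valued squared error of a complex signal.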
7

Ojha, Varun, and Giuseppe Nicosia. "Backpropagation Neural Tree". Neural Networks 149 (May 2022): 66–83. http://dx.doi.org/10.1016/j.neunet.2022.02.003.
8

Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation". ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3476576.3476672.
9

Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation". ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3450626.3459804.
10

Georgiou, G. M., and C. Koutsougeras. "Complex domain backpropagation". IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 39, no. 5 (May 1992): 330–34. http://dx.doi.org/10.1109/82.142037.
11

Yang, Liping, and Wanzhen Yu. "Backpropagation with Homotopy". Neural Computation 5, no. 3 (May 1993): 363–66. http://dx.doi.org/10.1162/neco.1993.5.3.363.

Abstract
When training a feedforward neural network with backpropagation (Rumelhart et al. 1986), local minima are always a problem because of the nonlinearity of the system. Several approaches have been used to attack this problem: for example, restarting training from a new initial point, or preprocessing the input data or the network. Here, we propose a computationally efficient method for avoiding some local minima.
12

Irukulapati, Naga V., Henk Wymeersch, Pontus Johannisson, and Erik Agrell. "Stochastic Digital Backpropagation". IEEE Transactions on Communications 62, no. 11 (November 2014): 3956–68. http://dx.doi.org/10.1109/tcomm.2014.2362534.
13

Zhang, Zhiyuan, Pengcheng Yang, Xuancheng Ren, Qi Su, and Xu Sun. "Memorized sparse backpropagation". Neurocomputing 415 (November 2020): 397–407. http://dx.doi.org/10.1016/j.neucom.2020.08.055.
14

Yam, Y. F., and T. W. S. Chow. "Extended backpropagation algorithm". Electronics Letters 29, no. 19 (1993): 1701. http://dx.doi.org/10.1049/el:19931131.
15

Wythoff, Barry J. "Backpropagation neural networks". Chemometrics and Intelligent Laboratory Systems 18, no. 2 (February 1993): 115–55. http://dx.doi.org/10.1016/0169-7439(93)80052-j.
16

Li, Zhiyuan, Wenshuai Zhao, Lijun Wu, and Joni Pajarinen. "Backpropagation Through Agents". Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 12 (March 24, 2024): 13718–26. http://dx.doi.org/10.1609/aaai.v38i12.29277.

Abstract
A fundamental challenge in multi-agent reinforcement learning (MARL) is to learn the joint policy in an extremely large search space, which grows exponentially with the number of agents. Moreover, fully decentralized policy factorization significantly restricts the search space, which may lead to sub-optimal policies. In contrast, the auto-regressive joint policy can represent a much richer class of joint policies by factorizing the joint policy into the product of a series of conditional individual policies. While such factorization introduces the action dependency among agents explicitly in sequential execution, it does not take full advantage of the dependency during learning. In particular, the subsequent agents do not give the preceding agents feedback about their decisions. In this paper, we propose a new framework Back-Propagation Through Agents (BPTA) that directly accounts for both agents' own policy updates and the learning of their dependent counterparts. This is achieved by propagating the feedback through action chains. With the proposed framework, our Bidirectional Proximal Policy Optimisation (BPPO) outperforms the state-of-the-art methods. Extensive experiments on matrix games, StarCraftII v2, Multi-agent MuJoCo, and Google Research Football demonstrate the effectiveness of the proposed method.
17

Hertz, J., A. Krogh, B. Lautrup, and T. Lehmann. "Nonlinear backpropagation: doing backpropagation without derivatives of the activation function". IEEE Transactions on Neural Networks 8, no. 6 (November 1997): 1321–27. http://dx.doi.org/10.1109/72.641455.
18

Golding, Nace L., William L. Kath, and Nelson Spruston. "Dichotomy of Action-Potential Backpropagation in CA1 Pyramidal Neuron Dendrites". Journal of Neurophysiology 86, no. 6 (December 1, 2001): 2998–3010. http://dx.doi.org/10.1152/jn.2001.86.6.2998.

Abstract
In hippocampal CA1 pyramidal neurons, action potentials are typically initiated in the axon and backpropagate into the dendrites, shaping the integration of synaptic activity and influencing the induction of synaptic plasticity. Despite previous reports describing action-potential propagation in the proximal apical dendrites, the extent to which action potentials invade the distal dendrites of CA1 pyramidal neurons remains controversial. Using paired somatic and dendritic whole cell recordings, we find that in the dendrites proximal to 280 μm from the soma, single backpropagating action potentials exhibit <50% attenuation from their amplitude in the soma. However, in dendritic recordings distal to 300 μm from the soma, action potentials in most cells backpropagated either strongly (26–42% attenuation; n = 9/20) or weakly (71–87% attenuation; n = 10/20) with only one cell exhibiting an intermediate value (45% attenuation). In experiments combining dual somatic and dendritic whole cell recordings with calcium imaging, the amount of calcium influx triggered by backpropagating action potentials was correlated with the extent of action-potential invasion of the distal dendrites. Quantitative morphometric analyses revealed that the dichotomy in action-potential backpropagation occurred in the presence of only subtle differences in either the diameter of the primary apical dendrite or branching pattern. In addition, action-potential backpropagation was not dependent on a number of electrophysiological parameters (input resistance, resting potential, voltage sensitivity of dendritic spike amplitude). There was, however, a striking correlation of the shape of the action potential at the soma with its amplitude in the dendrite; larger, faster-rising, and narrower somatic action potentials exhibited more attenuation in the distal dendrites (300–410 μm from the soma). 
Simple compartmental models of CA1 pyramidal neurons revealed that a dichotomy in action-potential backpropagation could be generated in response to subtle manipulations of the distribution of either sodium or potassium channels in the dendrites. Backpropagation efficacy could also be influenced by local alterations in dendritic side branches, but these effects were highly sensitive to model parameters. Based on these findings, we hypothesize that the observed dichotomy in dendritic action-potential amplitude is conferred primarily by differences in the distribution, density, or modulatory state of voltage-gated channels along the somatodendritic axis.
19

Khalid Awang, Mohd, Mohammad Ridwan Ismail, Mokhairi Makhtar, M. Nordin A Rahman, and Abd Rasid Mamat. "Performance Comparison of Neural Network Training Algorithms for Modeling Customer Churn Prediction". International Journal of Engineering & Technology 7, no. 2.15 (April 6, 2018): 35. http://dx.doi.org/10.14419/ijet.v7i2.15.11196.

Abstract
Predicting customer churn has become the priority of every telecommunication service provider as the market becomes more saturated and competitive. This paper presents a comparison of neural network learning algorithms for customer churn prediction. The data set used to train and test the neural network algorithms was provided by one of the leading telecommunication companies in Malaysia. The Multilayer Perceptron (MLP) networks are trained using nine (9) types of learning algorithms: Levenberg-Marquardt backpropagation (trainlm), BFGS Quasi-Newton backpropagation (trainbfg), Conjugate Gradient backpropagation with Fletcher-Reeves Updates (traincgf), Conjugate Gradient backpropagation with Polak-Ribiere Updates (traincgp), Conjugate Gradient backpropagation with Powell-Beale Restarts (traincgb), Scaled Conjugate Gradient backpropagation (trainscg), One Step Secant backpropagation (trainoss), Bayesian Regularization backpropagation (trainbr), and Resilient backpropagation (trainrp). The performance of the neural network is measured by the prediction accuracy of the learning and testing phases. The LM learning algorithm is found to give the optimum model, a neural network consisting of fourteen input units, one hidden node, and one output node. The best result of the experiment indicated that this model is able to produce a prediction accuracy of 94.82%.
20

Mahmood, Suzan A., and Loay E. George. "Speaker Identification Using Backpropagation Neural Network". Journal of Zankoy Sulaimani - Part A 11, no. 1 (September 23, 2007): 61–66. http://dx.doi.org/10.17656/jzs.10181.
21

Lemon, N., and R. W. Turner. "Conditional Spike Backpropagation Generates Burst Discharge in a Sensory Neuron". Journal of Neurophysiology 84, no. 3 (September 1, 2000): 1519–30. http://dx.doi.org/10.1152/jn.2000.84.3.1519.

Abstract
Backpropagating dendritic Na+spikes generate a depolarizing afterpotential (DAP) at the soma of pyramidal cells in the electrosensory lateral line lobe (ELL) of weakly electric fish. Repetitive spike discharge is associated with a progressive depolarizing shift in somatic spike afterpotentials that eventually triggers a high-frequency spike doublet and subsequent burst afterhyperpolarization (bAHP). The rhythmic generation of a spike doublet and bAHP groups spike discharge into an oscillatory burst pattern. This study examined the soma-dendritic mechanisms controlling the depolarizing shift in somatic spike afterpotentials, and the mechanism by which spike doublets terminate spike discharge. Intracellular recordings were obtained from ELL pyramidal somata and apical dendrites in an in vitro slice preparation. The pattern of spike discharge was equivalent in somatic and dendritic regions, reflecting the backpropagation of spikes from soma to dendrites. There was a clear frequency-dependent threshold in the transition from tonic to burst discharge, with bursts initiated when interspike intervals fell between ∼3–7 ms. Removal of all backpropagating spikes by dendritic TTX ejection revealed that the isolated somatic AHPs were entirely stable at the interspike intervals that generated burst discharge. As such, the depolarizing membrane potential shift during repetitive discharge could be attributed to a potentiation of DAP amplitude. Potentiation of the DAP was due to a frequency-dependent broadening and temporal summation of backpropagating dendritic Na+ spikes. Spike doublets were generated with an interspike interval close to, but not within, the somatic spike refractory period. In contrast, the interspike interval of spike doublets always fell within the longer dendritic refractory period, preventing backpropagation of the second spike of the doublet. 
The dendritic depolarization was thus abruptly removed from one spike to the next, allowing the burst to terminate when the bAHP hyperpolarized the membrane. The transition from tonic to burst discharge was dependent on the number and frequency of spikes invoking dendritic spike summation, indicating that burst threshold depends on the immediate history of cell discharge. Spike frequency thus represents an important condition that determines the success of dendritic spike invasion, establishing an intrinsic mechanism by which backpropagating spikes can be used to generate a rhythmic burst output.
22

Falah, Miftahul, Dian Palupi Rini, and Iwan Pahendra. "Kombinasi Algoritma Backpropagation Neural Network dengan Gravitational Search Algorithm Dalam Meningkatkan Akurasi". JURNAL MEDIA INFORMATIKA BUDIDARMA 5, no. 1 (January 22, 2021): 90. http://dx.doi.org/10.30865/mib.v5i1.2597.

Abstract
Disease prediction is usually based on the experience and knowledge of the doctor, and diagnosis in this traditional way is less effective. Machine-learning-based medical diagnosis provides a more accurate diagnosis than the traditional approach. Disease prediction can use artificial neural networks, which comprise various algorithms, one of which is the Backpropagation algorithm. This paper proposes a disease prediction system using the Backpropagation algorithm. Backpropagation is often used in disease prediction, but it has a slight drawback: it tends to take a long time to reach optimum accuracy. A combination of algorithms can overcome this shortcoming by exploiting the Gravitational Search Algorithm (GSA), which addresses the slow convergence and local-minimum problems of Backpropagation. The authors therefore propose combining the Backpropagation algorithm with the Gravitational Search Algorithm (GSA) in the hope of improving accuracy beyond what Backpropagation alone achieves. The combination produced higher accuracy for the same number of iterations than Backpropagation alone: in the first trial on breast cancer data, with parameters hidden layer 5, learning rate 2, and 5000 iterations, the Backpropagation algorithm reached an accuracy of 99.3% with an error of 0.7%, while the combined BP & GSA reached an accuracy of 99.68% with an error of 0.32%.
23

Moonlight, Lady Silk, Fiqqih Faizah, Yuyun Suprapto, and Nyaris Pambudiyatno. "Comparison of Backpropagation and Kohonen Self Organising Map (KSOM) Methods in Face Image Recognition". Journal of Information Systems Engineering and Business Intelligence 7, no. 2 (October 28, 2021): 149. http://dx.doi.org/10.20473/jisebi.7.2.149-161.

Abstract
Background: The human face is a biometric feature. Artificial Intelligence (AI) in the form of an Artificial Neural Network (ANN) can be used to recognise such a biometric feature. In ANNs, learning is divided into supervised and unsupervised learning. A common supervised method is Backpropagation, while a common unsupervised one is the Kohonen Self Organizing Map (KSOM). However, the application of Backpropagation and KSOM needs to be adjusted to improve performance.
Objective: In this study, the Backpropagation and KSOM algorithms are rewritten to suit face image recognition, applied, and compared to determine the effectiveness of each algorithm in solving face image recognition.
Methods: The methods used and compared in the case of face image recognition are the Backpropagation and Kohonen Self Organizing Map (KSOM) Artificial Neural Networks (ANN).
Results: The smallest False Acceptance Rate (FAR) value of Backpropagation is 28%, and of KSOM 36%, out of 50 unregistered face images tested, while the smallest False Rejection Rate (FRR) value of Backpropagation is 22%, and of KSOM 30%, out of 50 registered face images. The fastest training time using the Backpropagation method is 7.14 seconds, and the fastest recognition time is 0.71 seconds; using the KSOM method, the fastest training time is 5.35 seconds, and the fastest recognition time is 0.50 seconds.
Conclusion: The Backpropagation method is better at recognising face images than the KSOM method, but training and recognition with the KSOM method are faster than with the Backpropagation method because of the hidden layers.
Keywords: Artificial Neural Network (ANN), Backpropagation, Kohonen Self Organizing Map (KSOM), Supervised learning, Unsupervised learning
24

AL-Assady, Nidhal, Jamal Majeed, and Shahbaa Khaleel. "Integration Method with Backpropagation". AL-Rafidain Journal of Computer Sciences and Mathematics 2, no. 1 (June 30, 2005): 49–68. http://dx.doi.org/10.33899/csmj.2005.164073.
25

Teo, Tat‐Jin, and John M. Reid. "Range estimation using backpropagation". Journal of the Acoustical Society of America 92, no. 3 (September 1992): 1440–42. http://dx.doi.org/10.1121/1.405265.
26

Park, Cheolsoo, Woojae Seong, Peter Gerstoft, and William S. Hodgkiss. "Geoacoustic Inversion Using Backpropagation". IEEE Journal of Oceanic Engineering 35, no. 4 (October 2010): 722–31. http://dx.doi.org/10.1109/joe.2010.2040659.
27

Card, Howard. "Digital VLSI backpropagation networks". Canadian Journal of Electrical and Computer Engineering 20, no. 1 (January 1995): 15–23. http://dx.doi.org/10.1109/cjece.1995.7102060.
28

Leung, H., and S. Haykin. "The complex backpropagation algorithm". IEEE Transactions on Signal Processing 39, no. 9 (1991): 2101–4. http://dx.doi.org/10.1109/78.134446.
29

Lillicrap, Timothy P., Adam Santoro, Luke Marris, Colin J. Akerman, and Geoffrey Hinton. "Backpropagation and the brain". Nature Reviews Neuroscience 21, no. 6 (April 17, 2020): 335–46. http://dx.doi.org/10.1038/s41583-020-0277-3.
30

Teo, Tat-Jin, and John M. Reid. "Multifrequency Holography Using Backpropagation". Ultrasonic Imaging 8, no. 3 (July 1986): 213–24. http://dx.doi.org/10.1177/016173468600800305.

Abstract
The technique of wavefield backpropagation has been used quite extensively in the literature. We report on an analytical study of the resolution properties of this technique. Backpropagation as a form of holographic reconstruction suffers from poor axial resolution. We derive expressions for both the axial and the lateral resolutions. We also show that the axial resolution can be substantially improved by the use of multiple frequencies. We derive an expression relating the resolution and bandwidth.
31

Tesauro, Gerald, Yu He, and Subutai Ahmad. "Asymptotic Convergence of Backpropagation". Neural Computation 1, no. 3 (September 1989): 382–91. http://dx.doi.org/10.1162/neco.1989.1.3.382.

Abstract
We calculate analytically the rate of convergence at long times in the backpropagation learning algorithm for networks with and without hidden units. For networks without hidden units using the standard quadratic error function and a sigmoidal transfer function, we find that the error decreases as 1/t for large t, and the output states approach their target values as 1/√t. It is possible to obtain a different convergence rate for certain error and transfer functions, but the convergence can never be faster than 1/t. These results are unaffected by a momentum term in the learning algorithm, but convergence can be substantially improved by an adaptive learning rate scheme. For networks with hidden units, we generally expect the same rate of convergence to be obtained as in the single-layer case; however, under certain circumstances one can obtain a polynomial speed-up for non sigmoidal units, or a logarithmic speed-up for sigmoidal units. Our analytic results are confirmed by empirical measurements of the convergence rate in numerical simulations.
32

Taek Mu Kwon, and Hui Cheng. "Contrast enhancement for backpropagation". IEEE Transactions on Neural Networks 7, no. 2 (March 1996): 515–24. http://dx.doi.org/10.1109/72.485685.
33

LiMin Fu, Hui-Huang Hsu, and J. C. Principe. "Incremental backpropagation learning networks". IEEE Transactions on Neural Networks 7, no. 3 (May 1996): 757–61. http://dx.doi.org/10.1109/72.501732.
34

Matson, Charles L., and Hanli Liu. "Backpropagation in turbid media". Journal of the Optical Society of America A 16, no. 6 (June 1, 1999): 1254. http://dx.doi.org/10.1364/josaa.16.001254.
35

Stoeva, Stefka, and Alexander Nikov. "A fuzzy backpropagation algorithm". Fuzzy Sets and Systems 112, no. 1 (May 2000): 27–39. http://dx.doi.org/10.1016/s0165-0114(98)00079-7.
36

Nikov, A., and S. Stoeva. "Quick fuzzy backpropagation algorithm". Neural Networks 14, no. 2 (March 2001): 231–44. http://dx.doi.org/10.1016/s0893-6080(00)00085-x.
37

Lehtokangas, M. "Modelling with constructive backpropagation". Neural Networks 12, no. 4-5 (June 1999): 707–16. http://dx.doi.org/10.1016/s0893-6080(99)00018-0.
38

Witaszek, Jacek. "Backpropagation: Theory, architectures, applications". Neurocomputing 9, no. 3 (December 1995): 358–59. http://dx.doi.org/10.1016/0925-2312(95)90002-0.
39

Joshi, Anupam, and Chia-Hoang Lee. "Backpropagation learns Marr's operator". Biological Cybernetics 70, no. 1 (November 1993): 65–73. http://dx.doi.org/10.1007/bf00202567.
40

Teo, T. "Multifrequency holography using backpropagation". Ultrasonic Imaging 8, no. 3 (July 1986): 213–24. http://dx.doi.org/10.1016/0161-7346(86)90010-6.
41

Ohno, Michihiro, Masato Okada, and Kunihiko Fukushima. "Neocognitron learning by backpropagation". Systems and Computers in Japan 26, no. 5 (1995): 19–28. http://dx.doi.org/10.1002/scj.4690260502.
42

Cho, Sung Bae, and Jin H. Kim. "Rapid backpropagation learning algorithms". Circuits, Systems, and Signal Processing 12, no. 2 (June 1993): 155–75. http://dx.doi.org/10.1007/bf01189872.
43

Budiman, I., A. Mubarak, S. Kapita, S. Do Abdullah, and M. Salmin. "Implementation of Backpropagation Artificial Network Methods for Early Children’s Intelligence Prediction". E3S Web of Conferences 328 (2021): 04033. http://dx.doi.org/10.1051/e3sconf/202132804033.

Abstract
Intelligence is the ability to process certain types of information derived from human biological and psychological factors. This study aims to implement a Backpropagation artificial neural network for predicting early-childhood intelligence and to calculate the system's accuracy on children's intelligence using the Backpropagation method. The Backpropagation neural network is one of the best methods for recognizing complex patterns, and it has the advantage that learning is performed repeatedly, which creates a system that is resistant to damage and consistently works well. The Backpropagation method is able to predict the intelligence of early childhood: from 42 children's intelligence records tested, with 27 training records and 15 test records, a prediction accuracy of 100% was obtained.
44

Wahyudi, Mochamad, Firmansyah, Lise Pujiastuti y Solikhun. "Application of Neural Network Variations for Determining the Best Architecture for Data Prediction". Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 6, n.º 5 (8 de octubre de 2022): 742–48. http://dx.doi.org/10.29207/resti.v6i5.4356.

Texto completo
Resumen
Abstract This study focuses on the application and comparison of the epoch, time, performance/MSE training, and performance/MSE testing of variations of the Backpropagation algorithm. The main problem in this study is that the Backpropagation algorithm tends to be slow to reach convergence in obtaining optimum accuracy, requires extensive training data, and the optimization used is less efficient and has performance/MSE which can still be improved to produce better performance/MSE in this research—data prediction process. Determination of the best model for data prediction is seen from the performance/MSE test. This data prediction uses five variations of the Backpropagation algorithm: standard Backpropagation, Resistant Backpropagation, Conjugate Gradient, Fletcher Reeves, and Powell Beale. The research stage begins with processing the avocado production dataset in Indonesia by province from 2016 to 2021. The dataset is first normalized to a value between 0 to 1. The test in this study was carried out using Matlab 2011a. The dataset is divided into two, namely training data and test data. This research's benefit is producing the best model of the Backpropagation algorithm in predicting data with five methods in the Backpropagation algorithm. The test results show that the Resilient Backpropagation method is the best model with a test performance of 0.00543829, training epochs of 1000, training time of 12 seconds, and training performance of 0.00012667.
APA, Harvard, Vancouver, ISO, and other styles
45

Johansson, E. M., F. U. Dowla, and D. M. Goodman. "BACKPROPAGATION LEARNING FOR MULTILAYER FEED-FORWARD NEURAL NETWORKS USING THE CONJUGATE GRADIENT METHOD". International Journal of Neural Systems 02, no. 04 (January 1991): 291–301. http://dx.doi.org/10.1142/s0129065791000261.

Full text
Abstract
In many applications, the number of interconnects or weights in a neural network is so large that the learning time for the conventional backpropagation algorithm can become excessively long. Numerical optimization theory offers a rich and robust set of techniques which can be applied to neural networks to improve learning rates. In particular, the conjugate gradient method is easily adapted to the backpropagation learning problem. This paper describes the conjugate gradient method, its application to the backpropagation learning problem and presents results of numerical tests which compare conventional backpropagation, steepest descent and the conjugate gradient methods. For the parity problem, we find that the conjugate gradient method is an order of magnitude faster than conventional backpropagation with momentum.
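The conjugate gradient iteration described above can be sketched on a small quadratic; the matrix, right-hand side, and starting point below are illustrative, not from the paper.

```python
def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def conjugate_gradient(A, b, x, iters=10):
    """Minimize f(x) = 0.5*x'Ax - b'x.  Unlike steepest descent, each new
    search direction mixes the fresh residual with the previous direction
    via the Fletcher-Reeves coefficient beta."""
    r = [bi - axi for bi, axi in zip(b, matvec(A, x))]  # residual = -grad f
    d = list(r)
    for _ in range(iters):
        Ad = matvec(A, d)
        alpha = dot(r, r) / dot(d, Ad)                  # exact line search
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r_new = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        if dot(r_new, r_new) < 1e-20:                   # converged
            return x
        beta = dot(r_new, r_new) / dot(r, r)            # Fletcher-Reeves
        d = [rn + beta * di for rn, di in zip(r_new, d)]
        r = r_new
    return x

# For an n-dimensional quadratic, CG reaches the exact minimizer in at
# most n iterations (here n = 2; the minimizer is [0.2, 0.4]).
x_star = conjugate_gradient([[3.0, 1.0], [1.0, 2.0]], [1.0, 1.0], [0.0, 0.0])
```

In the backpropagation setting, the residual r is replaced by the negative gradient of the training error and the exact line search by a numerical one, but the direction update is the same.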
APA, Harvard, Vancouver, ISO, and other styles
46

Fitriah, Zuraidah, Mohamad Handri Tuloli, Syaiful Anam, Noor Hidayat, Indah Yanti, and Dwi Mifta Mahanani. "Backpropagation with BFGS Optimizer for Covid-19 Prediction Cases in Surabaya". Telematika 18, no. 2 (4 October 2021): 157. http://dx.doi.org/10.31315/telematika.v18i2.5454.

Full text
Abstract
Covid-19 is caused by a new type of coronavirus called SARS-CoV-2. One of the cities contributing the most infected Covid-19 cases in Indonesia is Surabaya, East Java, which makes predicting Covid-19 cases important. One prediction method is the Artificial Neural Network (ANN). The backpropagation algorithm is an ANN method that has been used successfully in various fields; however, its performance depends on the architecture and the optimization method. The standard backpropagation algorithm is optimized by gradient descent, whereas the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm converges faster than gradient descent. This paper predicts Covid-19 cases in Surabaya using backpropagation with BFGS. Several backpropagation parameter scenarios were also tested to produce optimal performance. The proposed method gives better results with faster convergence than the standard backpropagation algorithm for predicting Covid-19 cases in Surabaya.
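In one dimension the BFGS update reduces to the secant rule h = s/y for the inverse-Hessian approximation, which gives a compact way to illustrate the quasi-Newton idea the paper applies to backpropagation. The objective and values below are assumptions for illustration, not the authors' network.

```python
def secant_quasi_newton(grad, w, iters=20, h=1.0):
    """1-D quasi-Newton iteration: h approximates the inverse Hessian and
    is refreshed from the secant condition h = s / y (the 1-D special
    case of the BFGS update), so curvature is learned from gradients."""
    g = grad(w)
    for _ in range(iters):
        w_new = w - h * g          # quasi-Newton step
        g_new = grad(w_new)
        s, y = w_new - w, g_new - g
        if abs(y) > 1e-12:
            h = s / y              # secant / 1-D BFGS update
        w, g = w_new, g_new
        if abs(g) < 1e-10:         # gradient vanished: at the minimum
            break
    return w

# Toy use: minimize (w - 3)^2 with gradient 2(w - 3).  After one step the
# curvature estimate h becomes exact, so the next step lands on w = 3.
w_min = secant_quasi_newton(lambda w: 2.0 * (w - 3.0), w=0.0)
```

Curvature information is what lets BFGS-style steps converge in far fewer iterations than fixed-rate gradient descent, matching the speedup the abstract reports.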
APA, Harvard, Vancouver, ISO, and other styles
47

Tambunan, Heru Satria. "PENGENALAN POLA HIV DAN AIDS MENGGUNAKAN ALGORITMA KOHONEN PADA JARINGAN SYARAF TIRUAN BACKPROPAGATION". InfoTekJar (Jurnal Nasional Informatika dan Teknologi Jaringan) 1, no. 1 (9 September 2016): 65–69. http://dx.doi.org/10.30743/infotekjar.v1i1.44.

Full text
Abstract
Technology is currently developing very rapidly, making it much easier to address various problems. In this study, the author uses the Kohonen algorithm with a backpropagation artificial neural network to recognize HIV and AIDS disease patterns. Backpropagation is a learning algorithm that requires supervision during its learning process: it uses paired input and output data together with a hidden layer to process the neural network's data until the desired weights are obtained. To recognize HIV and AIDS patterns, the author trains 15 data variables with the backpropagation algorithm using randomly initialized weights, and a second dataset is likewise trained with backpropagation. Matlab is used for the processing in this study.
APA, Harvard, Vancouver, ISO, and other styles
48

Beaufays, Françoise, and Eric A. Wan. "Relating Real-Time Backpropagation and Backpropagation-Through-Time: An Application of Flow Graph Interreciprocity". Neural Computation 6, no. 2 (March 1994): 296–306. http://dx.doi.org/10.1162/neco.1994.6.2.296.

Full text
Abstract
We show that signal flow graph theory provides a simple way to relate two popular algorithms used for adapting dynamic neural networks, real-time backpropagation and backpropagation-through-time. Starting with the flow graph for real-time backpropagation, we use a simple transposition to produce a second graph. The new graph is shown to be interreciprocal with the original and to correspond to the backpropagation-through-time algorithm. Interreciprocity provides a theoretical argument to verify that both flow graphs implement the same overall weight update.
APA, Harvard, Vancouver, ISO, and other styles
49

Cao, WanLing. "Evaluating the Vocal Music Teaching Using Backpropagation Neural Network". Mobile Information Systems 2022 (24 June 2022): 1–7. http://dx.doi.org/10.1155/2022/3843726.

Full text
Abstract
Vocal music teaching evaluation of performers is affected by multiple factors, and evaluators' scores are strongly influenced by subjective judgment. The backpropagation (BP) neural network provides a technology that can, in theory, approximate any nonlinear continuous function within a given accuracy range. It is an adaptive feedforward learning network widely used in artificial intelligence (AI) and can model the nonlinear mapping formed by multiple factors. Its strength is that it can model a nonlinear process without knowing the underlying causes of the data, which overcomes human subjective arbitrariness and makes the evaluation outcomes more objective. Furthermore, accurate and effective scoring systems can be designed with neural networks. In this paper, we establish a vocal music teaching evaluation research system that objectivizes each evaluation index: the score vector is used as input, and a reasonable, objective output score is obtained through the backpropagation neural network. In addition, according to the characteristics of the backpropagation neural network, the factors of vocal music teaching evaluation are analyzed, and a backpropagation neural network model for vocal music teaching evaluation is constructed. The experimental outcomes demonstrate that the trained backpropagation network can simulate a stable vocal music teaching evaluation system and is well suited to vocal music teaching evaluation research.
APA, Harvard, Vancouver, ISO, and other styles
50

Kim, Jee-Heon, Nam-Chul Seong, and Won-Chang Choi. "Comparative Evaluation of Predicting Energy Consumption of Absorption Heat Pump with Multilayer Shallow Neural Network Training Algorithms". Buildings 12, no. 1 (26 December 2021): 13. http://dx.doi.org/10.3390/buildings12010013.

Full text
Abstract
The performance of various multilayer neural network algorithms to predict the energy consumption of an absorption chiller in an air conditioning system under the same conditions was compared and evaluated in this study. Each prediction model was created using 12 representative multilayer shallow neural network algorithms. As training data, about a month of actual operation data during the heating period was used, and the predictive performance of 12 algorithms according to the training size was evaluated. The prediction results indicate that the error rates using the measured values are 0.09% minimum, 5.76% maximum, and 1.94 standard deviation (SD) for the Levenberg–Marquardt backpropagation model and 0.41% minimum, 5.05% maximum, and 1.68 SD for the Bayesian regularization backpropagation model. The conjugate gradient with Polak–Ribiére updates backpropagation model yielded lower values than the other two models, with 0.31% minimum, 5.73% maximum, and 1.76 SD. Based on the results for the predictive performance evaluation index, CvRMSE, all other models (conjugate gradient with Fletcher–Reeves updates backpropagation, one-step secant backpropagation, gradient descent with momentum and adaptive learning rate backpropagation, gradient descent with momentum backpropagation) except for the gradient descent backpropagation model yielded results that satisfy ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) Guideline 14. The results of this study confirm that the prediction performance may differ for each multilayer neural network training algorithm. Therefore, selecting the appropriate model to fit the characteristics of a specific project is essential.
APA, Harvard, Vancouver, ISO, and other styles