Journal articles on the topic 'Backpropagation'

To see the other types of publications on this topic, follow the link: Backpropagation.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Backpropagation.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Andrew, Alex M. "Backpropagation." Kybernetes 30, no. 9/10 (December 2001): 1110–17. http://dx.doi.org/10.1108/03684920110405601.

2

Faisal, Faisal. "Penggunaan Metode Backpropagation Pada Sistem Prediksi Kelulusan Mahasiswa STMIK Kaputama Binjai." Data Sciences Indonesia (DSI) 2, no. 1 (August 10, 2022): 13–19. http://dx.doi.org/10.47709/dsi.v2i1.1664.

Abstract:
On-time graduation is one benchmark of a college's integrity, including at STMIK Kaputama Binjai. From year to year, many STMIK Kaputama Binjai students graduate on time, but quite a few do not. A graduation-prediction system is therefore needed so that lecturers can guide students who are predicted to graduate late. The method used is the Backpropagation artificial neural network. The Backpropagation architecture has three layers: an input layer, a hidden layer, and an output layer, and its process comprises a forward and a backward pass. The data used are the IPS1 through IPS4 grade data of the 2015-2021 graduating classes of the Informatics Engineering study programme: data from students who have already graduated serve as training data for the network, while data from students still enrolled (provided they have completed semester 4) can be used as test data for graduation prediction. Experiments with different maximum iterations, maximum learning rates, and minimum error thresholds, as well as different training data, yield different prediction accuracies; the highest test accuracy corresponds to the smallest error. The system was built in the Visual Basic programming language using Visual Studio 2010. The results show that the Backpropagation method performs classification well enough to predict student graduation.
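The three-layer architecture and forward/backward process described in this abstract can be sketched in a few lines. This is a generic illustration fitting the logical OR function on made-up data, not the thesis system (which used IPS1-IPS4 grade data and Visual Basic); all variable names are our own:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy inputs and targets (logical OR), standing in for real features.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [1.0]])

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# One hidden layer, as in the three-layer architecture described above.
W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2)
W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(2000):
    # Forward pass: input -> hidden -> output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error derivative layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

mse = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))
```

After training, the mean squared error on the four patterns is small; the same loop structure scales to the grade-based prediction task by swapping in the real feature matrix.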
3

Sexton, Randall S., Robert E. Dorsey, and John D. Johnson. "Beyond Backpropagation." Journal of Organizational and End User Computing 11, no. 3 (July 1999): 3–10. http://dx.doi.org/10.4018/joeuc.1999070101.

4

Adigun, Olaoluwa, and Bart Kosko. "Bidirectional Backpropagation." IEEE Transactions on Systems, Man, and Cybernetics: Systems 50, no. 5 (May 2020): 1982–94. http://dx.doi.org/10.1109/tsmc.2019.2916096.

5

Irawan, Eka, M. Zarlis, and Erna Budhiarti Nababan. "ANALISIS PENAMBAHAN NILAI MOMENTUM PADA PREDIKSI PRODUKTIVITAS KELAPA SAWIT MENGGUNAKAN BACKPROPAGATION." InfoTekJar (Jurnal Nasional Informatika dan Teknologi Jaringan) 1, no. 2 (March 3, 2017): 84–89. http://dx.doi.org/10.30743/infotekjar.v1i2.67.

Abstract:
The backpropagation algorithm is a multilayer perceptron approach widely used to solve a broad range of problems, but it has a limitation: a rather slow rate of convergence. In this study, the authors add a learning-rate parameter that is adapted at every iteration, together with a momentum coefficient for computing the weight changes. Computer simulation yields a comparison between the standard backpropagation algorithm and backpropagation with added momentum: the variant with momentum converged in 727 epochs with an MSE of 0.01, whereas standard backpropagation reached 4000 epochs with an MSE of 0.001. This shows that adaptive-learning backpropagation reaches convergence faster than standard backpropagation.
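The momentum term described above blends the current gradient step with the previous weight change. A minimal sketch, illustrated on a one-dimensional quadratic rather than a real network (the function, learning rate, and momentum value are our own assumptions):

```python
# Gradient descent with momentum: each weight change combines the new
# gradient step with a fraction of the previous change, smoothing the
# trajectory and typically speeding convergence.
def momentum_step(w, grad, prev_delta, lr=0.1, momentum=0.9):
    delta = -lr * grad + momentum * prev_delta
    return w + delta, delta

# Minimise f(w) = w^2, whose gradient is 2w, starting from w = 5.0.
w, prev = 5.0, 0.0
for _ in range(300):
    w, prev = momentum_step(w, 2 * w, prev)
```

Setting `momentum=0.0` recovers plain gradient descent, which is the "standard backpropagation" baseline of the comparison.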
6

Nafisah, Zumrotun, Febrian Rachmadi, and Elly Matul Imah. "Face Recognition Using Complex Valued Backpropagation." Jurnal Ilmu Komputer dan Informasi 11, no. 2 (June 29, 2018): 103. http://dx.doi.org/10.21609/jiki.v11i2.617.

Abstract:
Face recognition is a biometrics research area that remains of interest. This study discusses the Complex-Valued Backpropagation algorithm for face recognition. Complex-Valued Backpropagation is a modification of the Real-Valued Backpropagation algorithm in which the weights and activation functions are complex-valued. The dataset used in this study consists of 250 images classified into 5 classes. The performance of face recognition using Complex-Valued Backpropagation is also compared with the Real-Valued Backpropagation algorithm. Experimental results show that Complex-Valued Backpropagation performs better than Real-Valued Backpropagation.
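As a minimal sketch of the complex-valued idea — signals and weights are complex numbers, and updates multiply the error by the conjugate of the input — here is the linear single-unit special case (complex LMS), not the full multi-layer algorithm of the paper; the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

w_true = 0.7 - 0.3j   # complex weight to recover (made up)
w = 0j                # current estimate

# Complex-valued learning rule: the error is multiplied by the
# CONJUGATE of the complex input, the key difference from the
# real-valued update.
for _ in range(200):
    x = rng.standard_normal() + 1j * rng.standard_normal()
    err = w_true * x - w * x          # target output minus prediction
    w = w + 0.1 * err * np.conj(x)
```

The full algorithm extends this rule through hidden layers with complex activation functions.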
7

Ojha, Varun, and Giuseppe Nicosia. "Backpropagation Neural Tree." Neural Networks 149 (May 2022): 66–83. http://dx.doi.org/10.1016/j.neunet.2022.02.003.

8

Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation." ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3476576.3476672.

9

Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation." ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3450626.3459804.

10

Georgiou, G. M., and C. Koutsougeras. "Complex domain backpropagation." IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 39, no. 5 (May 1992): 330–34. http://dx.doi.org/10.1109/82.142037.

11

Yang, Liping, and Wanzhen Yu. "Backpropagation with Homotopy." Neural Computation 5, no. 3 (May 1993): 363–66. http://dx.doi.org/10.1162/neco.1993.5.3.363.

Abstract:
When training a feedforward neural network with backpropagation (Rumelhart et al. 1986), local minima are always a problem because of the nonlinearity of the system. There have been several ways to attack this problem: for example, restarting the training from a newly selected initial point, or preprocessing the input data or the neural network. Here, we propose a computationally efficient method for avoiding some local minima.
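The restart strategy mentioned above (one of the standard attacks on local minima, distinct from the homotopy method this paper proposes) can be sketched on a toy one-dimensional loss with two minima; the function and all parameters are illustrative assumptions:

```python
import random

def f(x):
    return x**4 - 3 * x**2 + x       # two local minima, one global

def grad(x):
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent converges to the nearest local minimum.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Restart from several random initial points and keep the best result.
random.seed(0)
starts = [random.uniform(-2, 2) for _ in range(10)]
best = min((descend(x0) for x0 in starts), key=f)
```

A single unlucky start can end in the shallow right-hand minimum near x ≈ 1.14; with several restarts, at least one descent reaches the global minimum near x ≈ -1.30.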
12

Irukulapati, Naga V., Henk Wymeersch, Pontus Johannisson, and Erik Agrell. "Stochastic Digital Backpropagation." IEEE Transactions on Communications 62, no. 11 (November 2014): 3956–68. http://dx.doi.org/10.1109/tcomm.2014.2362534.

13

Zhang, Zhiyuan, Pengcheng Yang, Xuancheng Ren, Qi Su, and Xu Sun. "Memorized sparse backpropagation." Neurocomputing 415 (November 2020): 397–407. http://dx.doi.org/10.1016/j.neucom.2020.08.055.

14

Yam, Y. F., and T. W. S. Chow. "Extended backpropagation algorithm." Electronics Letters 29, no. 19 (1993): 1701. http://dx.doi.org/10.1049/el:19931131.

15

Wythoff, Barry J. "Backpropagation neural networks." Chemometrics and Intelligent Laboratory Systems 18, no. 2 (February 1993): 115–55. http://dx.doi.org/10.1016/0169-7439(93)80052-j.

16

Li, Zhiyuan, Wenshuai Zhao, Lijun Wu, and Joni Pajarinen. "Backpropagation Through Agents." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 12 (March 24, 2024): 13718–26. http://dx.doi.org/10.1609/aaai.v38i12.29277.

Abstract:
A fundamental challenge in multi-agent reinforcement learning (MARL) is to learn the joint policy in an extremely large search space, which grows exponentially with the number of agents. Moreover, fully decentralized policy factorization significantly restricts the search space, which may lead to sub-optimal policies. In contrast, the auto-regressive joint policy can represent a much richer class of joint policies by factorizing the joint policy into the product of a series of conditional individual policies. While such factorization introduces the action dependency among agents explicitly in sequential execution, it does not take full advantage of the dependency during learning. In particular, the subsequent agents do not give the preceding agents feedback about their decisions. In this paper, we propose a new framework Back-Propagation Through Agents (BPTA) that directly accounts for both agents' own policy updates and the learning of their dependent counterparts. This is achieved by propagating the feedback through action chains. With the proposed framework, our Bidirectional Proximal Policy Optimisation (BPPO) outperforms the state-of-the-art methods. Extensive experiments on matrix games, StarCraftII v2, Multi-agent MuJoCo, and Google Research Football demonstrate the effectiveness of the proposed method.
17

Hertz, J., A. Krogh, B. Lautrup, and T. Lehmann. "Nonlinear backpropagation: doing backpropagation without derivatives of the activation function." IEEE Transactions on Neural Networks 8, no. 6 (November 1997): 1321–27. http://dx.doi.org/10.1109/72.641455.

18

Golding, Nace L., William L. Kath, and Nelson Spruston. "Dichotomy of Action-Potential Backpropagation in CA1 Pyramidal Neuron Dendrites." Journal of Neurophysiology 86, no. 6 (December 1, 2001): 2998–3010. http://dx.doi.org/10.1152/jn.2001.86.6.2998.

Abstract:
In hippocampal CA1 pyramidal neurons, action potentials are typically initiated in the axon and backpropagate into the dendrites, shaping the integration of synaptic activity and influencing the induction of synaptic plasticity. Despite previous reports describing action-potential propagation in the proximal apical dendrites, the extent to which action potentials invade the distal dendrites of CA1 pyramidal neurons remains controversial. Using paired somatic and dendritic whole cell recordings, we find that in the dendrites proximal to 280 μm from the soma, single backpropagating action potentials exhibit <50% attenuation from their amplitude in the soma. However, in dendritic recordings distal to 300 μm from the soma, action potentials in most cells backpropagated either strongly (26–42% attenuation; n = 9/20) or weakly (71–87% attenuation; n = 10/20) with only one cell exhibiting an intermediate value (45% attenuation). In experiments combining dual somatic and dendritic whole cell recordings with calcium imaging, the amount of calcium influx triggered by backpropagating action potentials was correlated with the extent of action-potential invasion of the distal dendrites. Quantitative morphometric analyses revealed that the dichotomy in action-potential backpropagation occurred in the presence of only subtle differences in either the diameter of the primary apical dendrite or branching pattern. In addition, action-potential backpropagation was not dependent on a number of electrophysiological parameters (input resistance, resting potential, voltage sensitivity of dendritic spike amplitude). There was, however, a striking correlation of the shape of the action potential at the soma with its amplitude in the dendrite; larger, faster-rising, and narrower somatic action potentials exhibited more attenuation in the distal dendrites (300–410 μm from the soma). 
Simple compartmental models of CA1 pyramidal neurons revealed that a dichotomy in action-potential backpropagation could be generated in response to subtle manipulations of the distribution of either sodium or potassium channels in the dendrites. Backpropagation efficacy could also be influenced by local alterations in dendritic side branches, but these effects were highly sensitive to model parameters. Based on these findings, we hypothesize that the observed dichotomy in dendritic action-potential amplitude is conferred primarily by differences in the distribution, density, or modulatory state of voltage-gated channels along the somatodendritic axis.
19

Khalid Awang, Mohd, Mohammad Ridwan Ismail, Mokhairi Makhtar, M. Nordin A Rahman, and Abd Rasid Mamat. "Performance Comparison of Neural Network Training Algorithms for Modeling Customer Churn Prediction." International Journal of Engineering & Technology 7, no. 2.15 (April 6, 2018): 35. http://dx.doi.org/10.14419/ijet.v7i2.15.11196.

Abstract:
Predicting customer churn has become the priority of every telecommunication service provider as the market becomes more saturated and competitive. This paper presents a comparison of neural-network learning algorithms for customer churn prediction. The data set used to train and test the neural networks was provided by one of the leading telecommunication companies in Malaysia. The Multilayer Perceptron (MLP) networks are trained using nine learning algorithms: Levenberg-Marquardt backpropagation (trainlm), BFGS Quasi-Newton backpropagation (trainbfg), Conjugate Gradient backpropagation with Fletcher-Reeves updates (traincgf), Conjugate Gradient backpropagation with Polak-Ribiere updates (traincgp), Conjugate Gradient backpropagation with Powell-Beale restarts (traincgb), Scaled Conjugate Gradient backpropagation (trainscg), One Step Secant backpropagation (trainoss), Bayesian Regularization backpropagation (trainbr), and Resilient backpropagation (trainrp). The performance of the neural network is measured by the prediction accuracy of the learning and testing phases. The LM learning algorithm is found to yield the optimum model: a neural network with fourteen input units, one hidden node, and one output node. The best result of the experiment indicates that this model achieves an accuracy of 94.82%.
20

Mahmood, Suzan A., and Loay E. George. "Speaker Identification Using Backpropagation Neural Network." Journal of Zankoy Sulaimani - Part A 11, no. 1 (September 23, 2007): 61–66. http://dx.doi.org/10.17656/jzs.10181.

21

Lemon, N., and R. W. Turner. "Conditional Spike Backpropagation Generates Burst Discharge in a Sensory Neuron." Journal of Neurophysiology 84, no. 3 (September 1, 2000): 1519–30. http://dx.doi.org/10.1152/jn.2000.84.3.1519.

Abstract:
Backpropagating dendritic Na+spikes generate a depolarizing afterpotential (DAP) at the soma of pyramidal cells in the electrosensory lateral line lobe (ELL) of weakly electric fish. Repetitive spike discharge is associated with a progressive depolarizing shift in somatic spike afterpotentials that eventually triggers a high-frequency spike doublet and subsequent burst afterhyperpolarization (bAHP). The rhythmic generation of a spike doublet and bAHP groups spike discharge into an oscillatory burst pattern. This study examined the soma-dendritic mechanisms controlling the depolarizing shift in somatic spike afterpotentials, and the mechanism by which spike doublets terminate spike discharge. Intracellular recordings were obtained from ELL pyramidal somata and apical dendrites in an in vitro slice preparation. The pattern of spike discharge was equivalent in somatic and dendritic regions, reflecting the backpropagation of spikes from soma to dendrites. There was a clear frequency-dependent threshold in the transition from tonic to burst discharge, with bursts initiated when interspike intervals fell between ∼3–7 ms. Removal of all backpropagating spikes by dendritic TTX ejection revealed that the isolated somatic AHPs were entirely stable at the interspike intervals that generated burst discharge. As such, the depolarizing membrane potential shift during repetitive discharge could be attributed to a potentiation of DAP amplitude. Potentiation of the DAP was due to a frequency-dependent broadening and temporal summation of backpropagating dendritic Na+ spikes. Spike doublets were generated with an interspike interval close to, but not within, the somatic spike refractory period. In contrast, the interspike interval of spike doublets always fell within the longer dendritic refractory period, preventing backpropagation of the second spike of the doublet. 
The dendritic depolarization was thus abruptly removed from one spike to the next, allowing the burst to terminate when the bAHP hyperpolarized the membrane. The transition from tonic to burst discharge was dependent on the number and frequency of spikes invoking dendritic spike summation, indicating that burst threshold depends on the immediate history of cell discharge. Spike frequency thus represents an important condition that determines the success of dendritic spike invasion, establishing an intrinsic mechanism by which backpropagating spikes can be used to generate a rhythmic burst output.
22

Falah, Miftahul, Dian Palupi Rini, and Iwan Pahendra. "Kombinasi Algoritma Backpropagation Neural Network dengan Gravitational Search Algorithm Dalam Meningkatkan Akurasi." JURNAL MEDIA INFORMATIKA BUDIDARMA 5, no. 1 (January 22, 2021): 90. http://dx.doi.org/10.30865/mib.v5i1.2597.

Abstract:
Disease prediction is usually based on a doctor's experience and knowledge, and such traditional diagnosis is less effective. Machine-learning-based medical diagnosis provides more accurate disease prediction than the traditional way. Artificial neural networks can be used for disease prediction; among their various algorithms is the Backpropagation algorithm, and this paper proposes a disease-prediction system using it. Backpropagation is often used in disease prediction, but it has a slight drawback: it tends to take a long time to reach optimum accuracy values. A combination of algorithms can overcome this shortcoming by drawing on the Gravitational Search Algorithm (GSA), which addresses the slow convergence and local-minimum problems found in Backpropagation. The authors therefore propose combining the Backpropagation algorithm with the GSA in the hope of improving accuracy beyond Backpropagation alone. The results show a higher level of accuracy for the same number of iterations than Backpropagation alone: in the first trial on breast cancer data, with parameters of 5 hidden units, a learning rate of 2, and 5000 iterations, the Backpropagation algorithm achieved an accuracy of 99.3% (0.7% error), while the combined BP & GSA achieved 99.68% (0.32% error).
23

Moonlight, Lady Silk, Fiqqih Faizah, Yuyun Suprapto, and Nyaris Pambudiyatno. "Comparison of Backpropagation and Kohonen Self Organising Map (KSOM) Methods in Face Image Recognition." Journal of Information Systems Engineering and Business Intelligence 7, no. 2 (October 28, 2021): 149. http://dx.doi.org/10.20473/jisebi.7.2.149-161.

Abstract:
Background: The human face is a biometric feature. An Artificial Neural Network (ANN), a form of Artificial Intelligence (AI), can be used to recognise such a feature. ANN learning is divided into supervised and unsupervised learning: a common supervised method is Backpropagation, while a common unsupervised one is the Kohonen Self-Organising Map (KSOM). However, the application of Backpropagation and KSOM must be adjusted to improve performance.
Objective: In this study, the Backpropagation and KSOM algorithms are rewritten to suit face-image recognition, then applied and compared to determine the effectiveness of each algorithm in solving face-image recognition.
Methods: The methods used and compared for face-image recognition are the Backpropagation and Kohonen Self-Organising Map (KSOM) artificial neural networks.
Results: The smallest False Acceptance Rate (FAR) is 28% for Backpropagation and 36% for KSOM, out of 50 unregistered face images tested, while the smallest False Rejection Rate (FRR) is 22% for Backpropagation and 30% for KSOM, out of 50 registered face images. The fastest training and recognition times are 7.14 seconds and 0.71 seconds for Backpropagation, versus 5.35 seconds and 0.50 seconds for KSOM.
Conclusion: The Backpropagation method recognises face images better than the KSOM method, but KSOM trains and recognises faster than Backpropagation because of Backpropagation's hidden layers.
Keywords: Artificial Neural Network (ANN), Backpropagation, Kohonen Self Organizing Map (KSOM), Supervised learning, Unsupervised learning
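FAR and FRR, the metrics reported above, are simple proportions over the test attempts. A sketch that reproduces the Backpropagation figures — the accept/reject decisions are fabricated to match the reported counts (14 of 50 impostors accepted, 11 of 50 genuine users rejected):

```python
# 1 = system accepted the face image, 0 = system rejected it.
unregistered = [1] * 14 + [0] * 36   # impostor attempts (should all be 0)
registered = [0] * 11 + [1] * 39     # genuine attempts (should all be 1)

# False Acceptance Rate: fraction of impostors wrongly accepted.
far = 100 * sum(unregistered) / len(unregistered)
# False Rejection Rate: fraction of genuine users wrongly rejected.
frr = 100 * registered.count(0) / len(registered)
```

These evaluate to 28.0 and 22.0, matching the Backpropagation results quoted in the abstract.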
24

AL-Assady, Nidhal, Jamal Majeed, and Shahbaa Khaleel. "Integration Method with Backpropagation." AL-Rafidain Journal of Computer Sciences and Mathematics 2, no. 1 (June 30, 2005): 49–68. http://dx.doi.org/10.33899/csmj.2005.164073.

25

Teo, Tat‐Jin, and John M. Reid. "Range estimation using backpropagation." Journal of the Acoustical Society of America 92, no. 3 (September 1992): 1440–42. http://dx.doi.org/10.1121/1.405265.

26

Park, Cheolsoo, Woojae Seong, Peter Gerstoft, and William S. Hodgkiss. "Geoacoustic Inversion Using Backpropagation." IEEE Journal of Oceanic Engineering 35, no. 4 (October 2010): 722–31. http://dx.doi.org/10.1109/joe.2010.2040659.

27

Card, Howard. "Digital VLSI backpropagation networks." Canadian Journal of Electrical and Computer Engineering 20, no. 1 (January 1995): 15–23. http://dx.doi.org/10.1109/cjece.1995.7102060.

28

Leung, H., and S. Haykin. "The complex backpropagation algorithm." IEEE Transactions on Signal Processing 39, no. 9 (1991): 2101–4. http://dx.doi.org/10.1109/78.134446.

29

Lillicrap, Timothy P., Adam Santoro, Luke Marris, Colin J. Akerman, and Geoffrey Hinton. "Backpropagation and the brain." Nature Reviews Neuroscience 21, no. 6 (April 17, 2020): 335–46. http://dx.doi.org/10.1038/s41583-020-0277-3.

30

Teo, Tat-Jin, and John M. Reid. "Multifrequency Holography Using Backpropagation." Ultrasonic Imaging 8, no. 3 (July 1986): 213–24. http://dx.doi.org/10.1177/016173468600800305.

Abstract:
The technique of wavefield backpropagation has been used quite extensively in the literature. We report on an analytical study of the resolution properties of this technique. Backpropagation as a form of holographic reconstruction suffers from poor axial resolution. We derive expressions for both the axial and the lateral resolutions. We also show that the axial resolution can be substantially improved by the use of multiple frequencies. We derive an expression relating the resolution and bandwidth.
31

Tesauro, Gerald, Yu He, and Subutai Ahmad. "Asymptotic Convergence of Backpropagation." Neural Computation 1, no. 3 (September 1989): 382–91. http://dx.doi.org/10.1162/neco.1989.1.3.382.

Abstract:
We calculate analytically the rate of convergence at long times in the backpropagation learning algorithm for networks with and without hidden units. For networks without hidden units using the standard quadratic error function and a sigmoidal transfer function, we find that the error decreases as 1/t for large t, and the output states approach their target values as 1/√t. It is possible to obtain a different convergence rate for certain error and transfer functions, but the convergence can never be faster than 1/t. These results are unaffected by a momentum term in the learning algorithm, but convergence can be substantially improved by an adaptive learning rate scheme. For networks with hidden units, we generally expect the same rate of convergence to be obtained as in the single-layer case; however, under certain circumstances one can obtain a polynomial speed-up for non sigmoidal units, or a logarithmic speed-up for sigmoidal units. Our analytic results are confirmed by empirical measurements of the convergence rate in numerical simulations.
32

Kwon, Taek Mu, and Hui Cheng. "Contrast enhancement for backpropagation." IEEE Transactions on Neural Networks 7, no. 2 (March 1996): 515–24. http://dx.doi.org/10.1109/72.485685.

33

Fu, LiMin, Hui-Huang Hsu, and J. C. Principe. "Incremental backpropagation learning networks." IEEE Transactions on Neural Networks 7, no. 3 (May 1996): 757–61. http://dx.doi.org/10.1109/72.501732.

34

Matson, Charles L., and Hanli Liu. "Backpropagation in turbid media." Journal of the Optical Society of America A 16, no. 6 (June 1, 1999): 1254. http://dx.doi.org/10.1364/josaa.16.001254.

35

Stoeva, Stefka, and Alexander Nikov. "A fuzzy backpropagation algorithm." Fuzzy Sets and Systems 112, no. 1 (May 2000): 27–39. http://dx.doi.org/10.1016/s0165-0114(98)00079-7.

36

Nikov, A., and S. Stoeva. "Quick fuzzy backpropagation algorithm." Neural Networks 14, no. 2 (March 2001): 231–44. http://dx.doi.org/10.1016/s0893-6080(00)00085-x.

37

Lehtokangas, M. "Modelling with constructive backpropagation." Neural Networks 12, no. 4-5 (June 1999): 707–16. http://dx.doi.org/10.1016/s0893-6080(99)00018-0.

38

Witaszek, Jacek. "Backpropagation: Theory, architectures, applications." Neurocomputing 9, no. 3 (December 1995): 358–59. http://dx.doi.org/10.1016/0925-2312(95)90002-0.

39

Joshi, Anupam, and Chia-Hoang Lee. "Backpropagation learns Marr's operator." Biological Cybernetics 70, no. 1 (November 1993): 65–73. http://dx.doi.org/10.1007/bf00202567.

40

Teo, T. "Multifrequency holography using backpropagation." Ultrasonic Imaging 8, no. 3 (July 1986): 213–24. http://dx.doi.org/10.1016/0161-7346(86)90010-6.

41

Ohno, Michihiro, Masato Okada, and Kunihiko Fukushima. "Neocognitron learning by backpropagation." Systems and Computers in Japan 26, no. 5 (1995): 19–28. http://dx.doi.org/10.1002/scj.4690260502.

42

Cho, Sung Bae, and Jin H. Kim. "Rapid backpropagation learning algorithms." Circuits, Systems, and Signal Processing 12, no. 2 (June 1993): 155–75. http://dx.doi.org/10.1007/bf01189872.

43

Budiman, I., A. Mubarak, S. Kapita, S. Do Abdullah, and M. Salmin. "Implementation of Backpropagation Artificial Network Methods for Early Children’s Intelligence Prediction." E3S Web of Conferences 328 (2021): 04033. http://dx.doi.org/10.1051/e3sconf/202132804033.

Abstract:
Intelligence is the ability to process certain types of information derived from human biological and psychological factors. This study aims to implement a Backpropagation artificial neural network for predicting early-childhood intelligence and to calculate the system's accuracy on children's intelligence using the Backpropagation method. The Backpropagation neural network is one of the best methods for recognising complex patterns: because its learning is performed repeatedly, it yields a system that is resistant to damage and consistently works well. The method was able to predict early-childhood intelligence: of the 42 children's intelligence records tested, with 27 training records and 15 test records, an accuracy of 100% was obtained.
44

Wahyudi, Mochamad, Firmansyah, Lise Pujiastuti, and Solikhun. "Application of Neural Network Variations for Determining the Best Architecture for Data Prediction." Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 6, no. 5 (October 8, 2022): 742–48. http://dx.doi.org/10.29207/resti.v6i5.4356.

Abstract:
This study focuses on the application and comparison of the epoch count, training time, training performance/MSE, and test performance/MSE of variations of the Backpropagation algorithm. The main problem addressed is that Backpropagation tends to converge slowly toward optimum accuracy, requires extensive training data, and uses a less efficient optimisation, leaving performance/MSE that can still be improved for this research's data-prediction task. The best model for data prediction is determined by the test performance/MSE. The prediction uses five variations of the Backpropagation algorithm: standard Backpropagation, Resilient Backpropagation, Conjugate Gradient, Fletcher-Reeves, and Powell-Beale. The research begins by processing the dataset of avocado production in Indonesia by province from 2016 to 2021; the dataset is first normalised to values between 0 and 1. Testing was carried out using Matlab 2011a, with the dataset divided into training data and test data. The benefit of this research is producing the best Backpropagation model for predicting data among the five methods. The test results show that Resilient Backpropagation is the best model, with a test performance of 0.00543829, 1000 training epochs, a training time of 12 seconds, and a training performance of 0.00012667.
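The 0-to-1 normalisation step mentioned above is ordinary min-max scaling. A sketch with made-up production values (the real study used provincial avocado production figures):

```python
def min_max(values):
    # Min-max normalisation maps the smallest value to 0, the largest
    # to 1, and everything else proportionally in between.
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

production = [80.0, 120.0, 300.0, 450.0]   # fabricated example data
scaled = min_max(production)
```

The inverse transform `v = lo + s * (hi - lo)` maps predictions back to the original production units.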
45

Johansson, E. M., F. U. Dowla, and D. M. Goodman. "BACKPROPAGATION LEARNING FOR MULTILAYER FEED-FORWARD NEURAL NETWORKS USING THE CONJUGATE GRADIENT METHOD." International Journal of Neural Systems 02, no. 04 (January 1991): 291–301. http://dx.doi.org/10.1142/s0129065791000261.

Full text
Abstract:
In many applications, the number of interconnects or weights in a neural network is so large that the learning time for the conventional backpropagation algorithm can become excessively long. Numerical optimization theory offers a rich and robust set of techniques which can be applied to neural networks to improve learning rates. In particular, the conjugate gradient method is easily adapted to the backpropagation learning problem. This paper describes the conjugate gradient method, its application to the backpropagation learning problem and presents results of numerical tests which compare conventional backpropagation, steepest descent and the conjugate gradient methods. For the parity problem, we find that the conjugate gradient method is an order of magnitude faster than conventional backpropagation with momentum.
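The Fletcher-Reeves form of the conjugate gradient update that the paper adapts to backpropagation can be illustrated on a toy two-dimensional quadratic, where exact line searches make the method converge in two steps; the quadratic and its minimizer here are illustrative assumptions, not the paper's network.

```python
H = [2.0, 20.0]          # diagonal Hessian of the toy quadratic
target = [1.0, -2.0]     # its known minimizer

def grad(w):
    # gradient of f(w) = sum_i H[i]/2 * (w[i] - target[i])^2
    return [H[i] * (w[i] - target[i]) for i in range(2)]

def fletcher_reeves(w, iters=2):
    g = grad(w)
    d = [-gi for gi in g]                      # start along steepest descent
    for _ in range(iters):
        # exact line search for a quadratic: alpha = -(g.d) / (d.H.d)
        alpha = (-sum(gi * di for gi, di in zip(g, d))
                 / sum(H[i] * d[i] ** 2 for i in range(2)))
        w = [w[i] + alpha * d[i] for i in range(2)]
        g_new = grad(w)
        # Fletcher-Reeves coefficient: beta = |g_new|^2 / |g|^2
        beta = sum(x * x for x in g_new) / sum(x * x for x in g)
        d = [-g_new[i] + beta * d[i] for i in range(2)]
        g = g_new
    return w
```

In the backpropagation setting, `grad` would be the error gradient computed by the usual backward pass; only the direction update and line search change, which is what gives the speedup over plain gradient descent with momentum.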
APA, Harvard, Vancouver, ISO, and other styles
46

Fitriah, Zuraidah, Mohamad Handri Tuloli, Syaiful Anam, Noor Hidayat, Indah Yanti, and Dwi Mifta Mahanani. "Backpropagation with BFGS Optimizer for Covid-19 Prediction Cases in Surabaya." Telematika 18, no. 2 (October 4, 2021): 157. http://dx.doi.org/10.31315/telematika.v18i2.5454.

Full text
Abstract:
Covid-19 is a disease caused by a new type of coronavirus called SARS-CoV-2. One of the cities contributing the most infected Covid-19 cases in Indonesia is Surabaya, East Java, so predicting Covid-19 cases there is important. One prediction method is the Artificial Neural Network (ANN). The backpropagation algorithm is an ANN method that has been used successfully in various fields; however, its performance depends on the architecture and the optimization method. The standard backpropagation algorithm is optimized by the gradient descent method, whereas the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm works faster than gradient descent. This paper predicts Covid-19 cases in Surabaya using backpropagation with BFGS. Several scenarios of backpropagation parameters were also tested to produce optimal performance. The proposed method gives better results, with faster convergence than the standard backpropagation algorithm, for predicting Covid-19 cases in Surabaya.
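The BFGS inverse-Hessian update that the paper pairs with backpropagation can be sketched on a toy convex objective; the objective, starting point, and line-search constants below are illustrative assumptions, not the authors' network or data.

```python
def f(w):
    # toy convex objective standing in for the network's loss surface
    return (w[0] - 3.0) ** 2 + 2.0 * (w[1] + 1.0) ** 2

def grad(w):
    return [2.0 * (w[0] - 3.0), 4.0 * (w[1] + 1.0)]

def bfgs(w, iters=50):
    H = [[1.0, 0.0], [0.0, 1.0]]                    # inverse-Hessian estimate
    for _ in range(iters):
        g = grad(w)
        d = [-(H[i][0] * g[0] + H[i][1] * g[1]) for i in range(2)]
        # backtracking (Armijo) line search
        a, slope = 1.0, g[0] * d[0] + g[1] * d[1]
        while f([w[0] + a * d[0], w[1] + a * d[1]]) > f(w) + 1e-4 * a * slope:
            a *= 0.5
        w_new = [w[i] + a * d[i] for i in range(2)]
        s = [w_new[i] - w[i] for i in range(2)]
        g_next = grad(w_new)
        y = [g_next[i] - g[i] for i in range(2)]
        sy = s[0] * y[0] + s[1] * y[1]
        if sy > 1e-12:                              # curvature condition holds
            rho = 1.0 / sy
            # H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T
            A = [[(1.0 if i == j else 0.0) - rho * s[i] * y[j]
                  for j in range(2)] for i in range(2)]
            B = [[(1.0 if i == j else 0.0) - rho * y[i] * s[j]
                  for j in range(2)] for i in range(2)]
            AH = [[sum(A[i][k] * H[k][j] for k in range(2))
                   for j in range(2)] for i in range(2)]
            H = [[sum(AH[i][k] * B[k][j] for k in range(2)) + rho * s[i] * s[j]
                  for j in range(2)] for i in range(2)]
        w = w_new
    return w
```

Because `H` accumulates curvature information from successive gradients, the search directions approach Newton steps, which is the source of the faster convergence the abstract reports over plain gradient descent.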
APA, Harvard, Vancouver, ISO, and other styles
47

Tambunan, Heru Satria. "PENGENALAN POLA HIV DAN AIDS MENGGUNAKAN ALGORITMA KOHONEN PADA JARINGAN SYARAF TIRUAN BACKPROPAGATION." InfoTekJar (Jurnal Nasional Informatika dan Teknologi Jaringan) 1, no. 1 (September 9, 2016): 65–69. http://dx.doi.org/10.30743/infotekjar.v1i1.44.

Full text
Abstract:
Technology is developing very rapidly today, which makes it much easier to solve a variety of problems. In this study, the author applies the Kohonen algorithm to a Backpropagation artificial neural network for recognizing HIV and AIDS disease patterns. Backpropagation is a supervised learning algorithm: it uses paired input and output data and a hidden layer to process the neural network's data until the desired weights are obtained. For the HIV and AIDS pattern-recognition task, the author trains 15 data variables with the backpropagation algorithm, with the weights initialized randomly; the second dataset is likewise trained with the backpropagation algorithm. Matlab is used for the processing in this study.
APA, Harvard, Vancouver, ISO, and other styles
48

Beaufays, Françoise, and Eric A. Wan. "Relating Real-Time Backpropagation and Backpropagation-Through-Time: An Application of Flow Graph Interreciprocity." Neural Computation 6, no. 2 (March 1994): 296–306. http://dx.doi.org/10.1162/neco.1994.6.2.296.

Full text
Abstract:
We show that signal flow graph theory provides a simple way to relate two popular algorithms used for adapting dynamic neural networks, real-time backpropagation and backpropagation-through-time. Starting with the flow graph for real-time backpropagation, we use a simple transposition to produce a second graph. The new graph is shown to be interreciprocal with the original and to correspond to the backpropagation-through-time algorithm. Interreciprocity provides a theoretical argument to verify that both flow graphs implement the same overall weight update.
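The equivalence the abstract establishes via flow-graph transposition can be checked numerically on a scalar linear recurrence h_t = w*h_{t-1} + x_t with loss L = h_T^2: a forward (real-time) sensitivity recurrence and a backward (through-time) pass yield the same weight gradient. This toy check is illustrative, not the paper's flow-graph derivation.

```python
def rtrl_grad(w, xs):
    """Real-time style: carry the sensitivity p = dh/dw forward in time."""
    h, p = 0.0, 0.0
    for x in xs:
        p = h + w * p            # p_t = h_{t-1} + w * p_{t-1}
        h = w * h + x            # h_t = w * h_{t-1} + x_t
    return 2.0 * h * p           # dL/dw for L = h_T**2

def bptt_grad(w, xs):
    """Through-time style: run forward storing states, then propagate back."""
    hs = [0.0]
    for x in xs:
        hs.append(w * hs[-1] + x)
    grad, delta = 0.0, 2.0 * hs[-1]      # delta = dL/dh_T
    for t in range(len(xs), 0, -1):
        grad += delta * hs[t - 1]        # dh_t/dw (local) = h_{t-1}
        delta *= w                       # dh_t/dh_{t-1} = w
    return grad
```

Both functions implement the same overall weight update, mirroring the interreciprocity argument: only the order in which the flow graph is traversed differs.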
APA, Harvard, Vancouver, ISO, and other styles
49

Cao, WanLing. "Evaluating the Vocal Music Teaching Using Backpropagation Neural Network." Mobile Information Systems 2022 (June 24, 2022): 1–7. http://dx.doi.org/10.1155/2022/3843726.

Full text
Abstract:
The evaluation of performers in vocal music teaching is affected by multiple factors, and evaluators' scores are strongly influenced by subjective factors. The backpropagation (BP) neural network provides a technology that can, in theory, simulate any nonlinear continuous function within a certain accuracy range. It is an adaptive feedforward learning network widely used in artificial intelligence (AI), and it can simulate the nonlinear mapping composed of various factors. Its strength is that it can model a nonlinear process without knowing the cause of the data, which overcomes human subjective arbitrariness and makes the evaluation outcomes more objective; accurate and effective scoring systems can therefore be designed using neural networks. In this paper, we establish a vocal music evaluation research system that objectivizes each vocal music teaching evaluation index: we use the score vector as the input and obtain a reasonable, objective output score through the backpropagation neural network. Furthermore, based on the characteristics of the backpropagation neural network, the factors of vocal music teaching evaluation are analyzed, and a backpropagation neural network model for vocal music teaching evaluation is constructed. The experimental outcomes demonstrate that the trained backpropagation network can simulate a stable vocal music teaching evaluation research system and can be well utilized for vocal music teaching evaluation research.
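A minimal one-hidden-layer network of the kind the paper describes, trained by backpropagation on a hypothetical scoring task; the data, layer sizes, targets, and learning rate are illustrative assumptions, not the paper's system.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, hidden=4, epochs=3000, lr=0.5, seed=0):
    """Train input->hidden->output weights by plain stochastic backprop."""
    rnd = random.Random(seed)
    n_in = len(samples[0][0])
    W1 = [[rnd.uniform(-1.0, 1.0) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rnd.uniform(-1.0, 1.0) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [sigmoid(sum(W1[j][i] * x[i] for i in range(n_in)) + b1[j])
             for j in range(hidden)]
        y = sigmoid(sum(W2[j] * h[j] for j in range(hidden)) + b2)
        return h, y

    for _ in range(epochs):
        for x, target in samples:
            h, y = forward(x)
            dy = (y - target) * y * (1.0 - y)            # output delta
            for j in range(hidden):
                dh = dy * W2[j] * h[j] * (1.0 - h[j])    # hidden delta
                W2[j] -= lr * dy * h[j]
                for i in range(n_in):
                    W1[j][i] -= lr * dh * x[i]
                b1[j] -= lr * dh
            b2 -= lr * dy
    return lambda x: forward(x)[1]

# hypothetical per-criterion judge scores in, overall rating out
samples = [([0.0, 0.0], 0.1), ([1.0, 1.0], 0.9),
           ([1.0, 0.0], 0.5), ([0.0, 1.0], 0.5)]
predict = train(samples)
```

In the paper's setting, the input vector would hold the per-index evaluation scores and the trained output would serve as the objective overall rating.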
APA, Harvard, Vancouver, ISO, and other styles
50

Kim, Jee-Heon, Nam-Chul Seong, and Won-Chang Choi. "Comparative Evaluation of Predicting Energy Consumption of Absorption Heat Pump with Multilayer Shallow Neural Network Training Algorithms." Buildings 12, no. 1 (December 26, 2021): 13. http://dx.doi.org/10.3390/buildings12010013.

Full text
Abstract:
The performance of various multilayer neural network algorithms to predict the energy consumption of an absorption chiller in an air conditioning system under the same conditions was compared and evaluated in this study. Each prediction model was created using 12 representative multilayer shallow neural network algorithms. As training data, about a month of actual operation data during the heating period was used, and the predictive performance of 12 algorithms according to the training size was evaluated. The prediction results indicate that the error rates using the measured values are 0.09% minimum, 5.76% maximum, and 1.94 standard deviation (SD) for the Levenberg–Marquardt backpropagation model and 0.41% minimum, 5.05% maximum, and 1.68 SD for the Bayesian regularization backpropagation model. The conjugate gradient with Polak–Ribiére updates backpropagation model yielded lower values than the other two models, with 0.31% minimum, 5.73% maximum, and 1.76 SD. Based on the results for the predictive performance evaluation index, CvRMSE, all other models (conjugate gradient with Fletcher–Reeves updates backpropagation, one-step secant backpropagation, gradient descent with momentum and adaptive learning rate backpropagation, gradient descent with momentum backpropagation) except for the gradient descent backpropagation model yielded results that satisfy ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) Guideline 14. The results of this study confirm that the prediction performance may differ for each multilayer neural network training algorithm. Therefore, selecting the appropriate model to fit the characteristics of a specific project is essential.
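The CvRMSE index used above to judge the models against ASHRAE Guideline 14 is straightforward to compute; this is a generic implementation, not the authors' code.

```python
import math

def cv_rmse(measured, predicted):
    """Coefficient of variation of the RMSE, in percent: the RMSE of the
    predictions divided by the mean of the measured values."""
    n = len(measured)
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    mean = sum(measured) / n
    return 100.0 * rmse / mean
```

Each trained model's hourly energy-consumption predictions would be scored this way against the measured data, and models whose CvRMSE falls under the guideline's threshold count as acceptable.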
APA, Harvard, Vancouver, ISO, and other styles