Academic literature on the topic 'Backpropagation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Backpropagation.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Backpropagation"

1

Andrew, Alex M. "Backpropagation." Kybernetes 30, no. 9/10 (December 2001): 1110–17. http://dx.doi.org/10.1108/03684920110405601.

2

Faisal, Faisal. "Penggunaan Metode Backpropagation Pada Sistem Prediksi Kelulusan Mahasiswa STMIK Kaputama Binjai." Data Sciences Indonesia (DSI) 2, no. 1 (August 10, 2022): 13–19. http://dx.doi.org/10.47709/dsi.v2i1.1664.

Abstract:
Graduating on time is one benchmark of a college's integrity, including at STMIK Kaputama Binjai. Year after year, many STMIK Kaputama Binjai students graduate on time, but quite a few do not. A graduation-prediction system is therefore needed so that lecturers can guide students who are predicted to graduate late. The method used is a backpropagation artificial neural network. The backpropagation architecture has three layers: an input layer, a hidden layer, and an output layer, and the backpropagation process consists of a forward and a backward pass. The data used are the semester GPA records (IPS1 through IPS4) of Informatics Engineering students who graduated between 2015 and 2021: data from students who have already graduated serve as training data for the network, while data from students still enrolled, provided they have completed at least semester 4, serve as test data for graduation prediction. Experiments with different maximum iterations, maximum learning rates, and minimum-error settings, as well as different training data, produce different prediction accuracies; the highest test accuracy corresponds to the minimum error. The system was built in the Visual Basic programming language with Visual Studio 2010. The results show that the backpropagation method performs reasonably well at classification for predicting student graduation.
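The three-layer architecture and the forward and backward passes described in this abstract can be sketched in a few lines of NumPy. This is a generic illustration with made-up layer sizes and toy data, not the thesis's Visual Basic implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy stand-ins for the data: 4 input features per student
# (e.g. four semester GPAs) and a binary on-time/late label.
X = rng.random((32, 4))
y = (X.mean(axis=1, keepdims=True) > 0.5).astype(float)

W1 = rng.normal(scale=0.5, size=(4, 8))   # input layer  -> hidden layer
W2 = rng.normal(scale=0.5, size=(8, 1))   # hidden layer -> output layer

def train_step(W1, W2, lr=0.5):
    # Forward pass through the three layers.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # Backward pass: propagate the output error back through the layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X)       # in-place weight updates
    W1 -= lr * X.T @ d_h / len(X)
    return float(np.mean((out - y) ** 2))

mse = [train_step(W1, W2) for _ in range(200)]
```

After a few hundred epochs the mean squared error drops, mirroring the training loop the abstract describes.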
3

Sexton, Randall S., Robert E. Dorsey, and John D. Johnson. "Beyond Backpropagation." Journal of Organizational and End User Computing 11, no. 3 (July 1999): 3–10. http://dx.doi.org/10.4018/joeuc.1999070101.

4

Adigun, Olaoluwa, and Bart Kosko. "Bidirectional Backpropagation." IEEE Transactions on Systems, Man, and Cybernetics: Systems 50, no. 5 (May 2020): 1982–94. http://dx.doi.org/10.1109/tsmc.2019.2916096.

5

Irawan, Eka, M. Zarlis, and Erna Budhiarti Nababan. "ANALISIS PENAMBAHAN NILAI MOMENTUM PADA PREDIKSI PRODUKTIVITAS KELAPA SAWIT MENGGUNAKAN BACKPROPAGATION." InfoTekJar (Jurnal Nasional Informatika dan Teknologi Jaringan) 1, no. 2 (March 3, 2017): 84–89. http://dx.doi.org/10.30743/infotekjar.v1i2.67.

Abstract:
The backpropagation algorithm is a multilayer perceptron widely used to solve a broad range of problems, but it has a limitation: its rate of convergence is quite slow. In this study the authors add an adaptively adjusted learning-rate parameter at each iteration and a momentum coefficient for computing the weight updates. Computer simulations yield a comparison between the standard backpropagation algorithm and backpropagation with added momentum. For the standard backpropagation algorithm, convergence took 727 epochs at an MSE of 0.01, whereas reaching an MSE of 0.001 took 4000 epochs. This shows that the adaptive-learning backpropagation algorithm reaches convergence faster than the standard backpropagation algorithm.
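The momentum idea summarized here, carrying a fraction of the previous weight change into the current update to speed up convergence, can be sketched as follows. The learning rate, momentum coefficient, and the toy quadratic objective are illustrative choices, not the paper's experimental settings:

```python
def momentum_update(w, grad, velocity, lr=0.05, mu=0.9):
    """Backpropagation-style weight update with momentum: the previous
    change (velocity) is scaled by mu and added to the new gradient step."""
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

def run(mu, steps=50):
    # Minimize the toy objective f(w) = w^2 starting from w = 5.
    w, v = 5.0, 0.0
    for _ in range(steps):
        grad = 2.0 * w                       # df/dw
        w, v = momentum_update(w, grad, v, mu=mu)
    return abs(w)

plain = run(mu=0.0)          # standard gradient descent (no momentum)
with_momentum = run(mu=0.9)  # same steps, with a momentum term
```

With mu = 0 the update reduces to plain gradient descent, which is why the paper treats standard backpropagation as the baseline.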
6

Nafisah, Zumrotun, Febrian Rachmadi, and Elly Matul Imah. "Face Recognition Using Complex Valued Backpropagation." Jurnal Ilmu Komputer dan Informasi 11, no. 2 (June 29, 2018): 103. http://dx.doi.org/10.21609/jiki.v11i2.617.

Abstract:
Face recognition is an area of biometric research that remains of interest. This study discusses the Complex-Valued Backpropagation algorithm for face recognition. Complex-Valued Backpropagation is a modification of the Real-Valued Backpropagation algorithm in which the weights and activation functions are complex. The dataset used in this study consists of 250 images classified into 5 classes. The performance of face recognition using Complex-Valued Backpropagation is also compared with the Real-Valued Backpropagation algorithm. Experimental results show that Complex-Valued Backpropagation performs better than Real-Valued Backpropagation.
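As a minimal sketch of the complex-valued idea, the example below trains a single linear neuron whose input, weights, target, and error are all complex numbers, using the standard complex least-squares gradient step (with respect to the conjugate weights). The full algorithm in the paper also uses complex activation functions between layers; the numbers here are invented for illustration:

```python
import numpy as np

x = np.array([1 + 1j, 0.5 - 0.2j, -0.3 + 0.7j])   # complex input features
t = 1.0 - 0.5j                                    # complex target output
w = np.zeros(3, dtype=complex)                    # complex weights

losses = []
for _ in range(100):
    y = w @ x                     # complex forward pass
    e = y - t                     # complex error
    losses.append(abs(e) ** 2)    # squared-magnitude loss |y - t|^2
    w -= 0.1 * e * np.conj(x)     # gradient step w.r.t. conj(w)
```

The loss shrinks geometrically, and both the learned weights and the residual error live in the complex plane throughout.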
7

Ojha, Varun, and Giuseppe Nicosia. "Backpropagation Neural Tree." Neural Networks 149 (May 2022): 66–83. http://dx.doi.org/10.1016/j.neunet.2022.02.003.

8

Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation." ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3476576.3476672.

9

Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation." ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3450626.3459804.

10

Georgiou, G. M., and C. Koutsougeras. "Complex domain backpropagation." IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 39, no. 5 (May 1992): 330–34. http://dx.doi.org/10.1109/82.142037.


Dissertations / Theses on the topic "Backpropagation"

1

Civelek, Ferda N. (Ferda Nur). "Temporal Connectionist Expert Systems Using a Temporal Backpropagation Algorithm." Thesis, University of North Texas, 1993. https://digital.library.unt.edu/ark:/67531/metadc278824/.

Abstract:
Representing time has been considered a general problem for artificial intelligence research for many years. More recently, the question of representing time has become increasingly important in representing the human decision-making process through connectionist expert systems. Because most human behaviors unfold over time, any attempt to represent expert performance without considering its temporal nature can often lead to incorrect results. A temporal feedforward neural network model that can be applied to a number of neural network application areas, including connectionist expert systems, has been introduced. The model has a multi-layer structure, i.e., the number of layers is not limited, and it has the flexibility of defining output nodes in any layer, which is especially important for connectionist expert system applications. A temporal backpropagation algorithm which supports the model has been developed. Together, the model and the temporal backpropagation algorithm make it practical to define a wide range of artificial neural network applications. An approach for decreasing the memory space used by the weight matrix has also been introduced. The algorithm was tested using a medical connectionist expert system to show how best to describe not only a disease but also its entire course. The system was first trained using a pattern encoded from the expert system's knowledge-base rules. Then, two series of experiments were carried out using the temporal model and the temporal backpropagation algorithm. The first series was done to determine whether the training process worked as predicted. In the second series, the weight matrix in the trained system was defined as a function of time intervals before presenting the system with the learned patterns. The results of the two experiments indicate that both approaches produce correct results; the only difference was that compressing the weight matrix required more training epochs to produce correct results. To measure the correctness of the results, the squared error was summed over all patterns to obtain a total sum of squares.
2

Yee, Clifford Wing Wei. "Point source compensation – a backpropagation method for underwater acoustic imaging." Thesis, University of New South Wales, School of Physics, 2003. http://handle.unsw.edu.au/1959.4/20590.

Abstract:
The backpropagation method of image reconstruction has been known for some time and offers fast processing through the use of the Fast Fourier Transform, but its applicability to underwater imaging has been limited; at present, shift-and-add is the more widely used method. This is because backpropagation has been derived for plane-wave insonification, with the scattered waves detected in a transmission-mode or synthetic-aperture set-up. One method used in underwater imaging is to insonify the target with a point source and detect the scattered waves in reflection mode with a receiver array. An advantage of this scanning method is that only one transmission of the source is required to capture an image, instead of multiple transmissions, so motion artifacts are kept to a minimum. To exploit the processing speed of the backpropagation method, it must be adapted for point-source insonification. The coverage of this configuration in the literature has been scant: methods for spherical sources have been proposed for transmission mode and for arbitrary surfaces in geophysical applications, but these methods are complex and difficult to use. A novel point source compensation method is proposed in this thesis so that the backpropagation image formation method can be used with the point-source insonification set-up. The new method was derived through theoretical analysis, numerical simulation, and experimental verification. The effect of various compensation factors on image quality was studied in simulation, and practical issues relating to the application of the new method were addressed in the experimental verification, which served as the final proof of concept. The quality of images formed with the point source compensation method has also been compared with that of the shift-and-add method. Experimental and simulation results show that the point source compensated backpropagation algorithm can produce images of comparable quality to those formed with the shift-and-add method for wideband point-source insonification with detection in reflection mode, with the advantage of faster image formation.
3

Bendelac, Shiri. "Enhanced Neural Network Training Using Selective Backpropagation and Forward Propagation." Thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/83714.

Abstract:
Neural networks are making headlines every day as the tool of the future, powering artificial intelligence programs and supporting technologies never seen before. However, training neural networks can take days or even weeks for bigger networks, and achieving state-of-the-art results in academia and industry requires supercomputers and GPUs. This thesis discusses employing selective measures to determine when to backpropagate and forward propagate in order to reduce training time while maintaining classification performance. It tests these new algorithms on the MNIST and CASIA datasets and achieves successful results with both algorithms on both datasets. The selective backpropagation algorithm shows a reduction of up to 93.3% in backpropagations completed, and the selective forward propagation algorithm shows a reduction of up to 72.9% in forward propagations and backpropagations completed, compared to baseline runs that always forward propagate and backpropagate. This work also discusses employing the selective backpropagation algorithm on a modified dataset in which some classes are disproportionately under-represented.
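One way to picture the selective idea is a per-example mask that skips backpropagation for examples the network already classifies confidently and correctly. The criterion and threshold below are a plausible illustration only, not the exact measures defined in the thesis:

```python
import numpy as np

def selective_backprop_mask(probs, labels, threshold=0.95):
    """Return a boolean mask of examples that should still be
    backpropagated: skip those that are both confidently predicted
    and correctly classified. (Illustrative criterion only.)"""
    predicted = probs.argmax(axis=1)
    confidence = probs.max(axis=1)
    confident_correct = (predicted == labels) & (confidence >= threshold)
    return ~confident_correct

probs = np.array([[0.98, 0.02],    # confident and correct  -> skipped
                  [0.60, 0.40],    # correct but unconfident -> kept
                  [0.97, 0.03]])   # confident but wrong     -> kept
labels = np.array([0, 0, 1])
mask = selective_backprop_mask(probs, labels)
```

Skipped examples still cost a forward pass here; the thesis's second algorithm goes further and skips some forward passes as well.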
4

Bonnell, Jeffrey A. "Implementation of a New Sigmoid Function in Backpropagation Neural Networks." Digital Commons @ East Tennessee State University, 2011. https://dc.etsu.edu/etd/1342.

Abstract:
This thesis presents the use of a new sigmoid activation function in backpropagation artificial neural networks (ANNs). ANNs using conventional activation functions may generalize poorly when trained on a set which includes quirky, mislabeled, unbalanced, or otherwise complicated data. The new activation function is an attempt to improve generalization and reduce overtraining on mislabeled or irrelevant data by restricting training when inputs to the hidden neurons are sufficiently small. It includes a flattened, low-training region which grows or shrinks during backpropagation to ensure a desired proportion of inputs falls inside the low-training region. With a desired low-training proportion of 0, the function reduces to a standard sigmoidal curve. A network with the new activation function implemented in the hidden layer is trained on benchmark data sets and compared with the standard activation function in an attempt to improve the area under the receiver operating characteristic curve in biological and other classification tasks.
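One simple way to realize such a flattened, low-training region (an illustrative construction, not necessarily the thesis's exact function) is to clip small inputs to zero before applying the standard sigmoid, so that inputs inside the region map to 0.5 and contribute zero gradient:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def flattened_sigmoid(x, d=1.0):
    # Inputs with |x| <= d fall in the flat, low-training region:
    # the output is pinned at 0.5 and the local gradient is zero,
    # so small (ambiguous) inputs drive little weight change.
    # With d = 0 this reduces to the standard sigmoid.
    shifted = np.sign(x) * np.maximum(np.abs(x) - d, 0.0)
    return sigmoid(shifted)
```

During training, the half-width d would then be grown or shrunk so that the desired proportion of hidden-unit inputs falls inside the flat region.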
5

Hövel, Christoph A. "Finanzmarktprognose mit neuronalen Netzen : Training mit Backpropagation und genetisch-evolutionären Verfahren /." Lohmar ; Köln : Eul, 2003. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=010635637&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.

6

Seifert, Christin, and Jan Parthey. "Simulation Rekursiver Auto-Assoziativer Speicher (RAAM) durch Erweiterung eines klassischen Backpropagation-Simulators." Thesis, Universitätsbibliothek Chemnitz, 2003. http://nbn-resolving.de/urn:nbn:de:swb:ch1-200300536.

Abstract:
Recursive auto-associative memories (RAAM) are special neural networks (NN) that are able to process hierarchical structures. Simulating these networks involves several particularities, such as the dynamic training set, that must be taken into account. This thesis discusses these issues and the adapted learning algorithms that result from them. In addition, a standard backpropagation simulator (Xerion) is extended with the capability to simulate RAAMs.
7

Sam, Iat Tong. "Theory of backpropagation type learning of artificial neural networks and its applications." Thesis, University of Macau, 2001. http://umaclib3.umac.mo/record=b1446702.

8

Potter, Matthew James. "Improving ANN Generalization via Self-Organized Flocking in conjunction with Multitasked Backpropagation." NCSU, 2003. http://www.lib.ncsu.edu/theses/available/etd-03242003-075528/.

Abstract:
The purpose of this research has been to develop methods of improving the generalization capabilities of artificial neural networks. Tools for examining the influence of individual training set patterns on the learning abilities of individual neurons are put forth and utilized in the implementation of new network learning algorithms. The algorithms are based largely on the supervised training algorithm backpropagation, and all experiments use standard backpropagation for comparison of results. The new learning algorithms revolve around the addition of two main components. The first is an unsupervised learning algorithm called flocking, which attempts to provide network hyperplane divisions that are evenly influenced by examples on either side of the hyperplane. The second is a multitasking approach called convergence training, which uses the information provided by a clustering algorithm to create subtasks that represent the divisions between clusters; these subtasks are then trained in unison to promote hyperplane sharing within the problem space. Generalization was improved in most cases, and the solutions produced by the new learning algorithms are demonstrated to be very robust to different random weight initializations. This research is not only a search for ANN learning algorithms that generalize better, but also a search for a better understanding of the complexities involved in ANN generalization.
9

Wellington, Charles H. "Backpropagation neural network for noise cancellation applied to the NUWES test ranges." Thesis, Monterey, California. Naval Postgraduate School, 1991. http://hdl.handle.net/10945/26899.

Abstract:
This thesis investigates the application of backpropagation neural networks as an alternative to adaptive filtering at the NUWES test ranges. To facilitate the investigation, a model of the test range is developed that accounts for acoustic transmission losses, the effects of Doppler shift, multipath, and finite propagation time delay. After describing the model, the backpropagation neural network algorithm and feature selection for the network are explained. Then, two schemes based on the network's output, signal waveform recovery and binary code recovery, are applied to the model. Simulation results of the signal waveform recovery and direct code recovery schemes are presented for several scenarios.
10

Seifert, Christin, and Jan Parthey. "Simulation Rekursiver Auto-Assoziativer Speicher (RAAM) durch Erweiterung eines klassischen Backpropagation-Simulators." [S.l. : s.n.], 2003. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB10607558.


Books on the topic "Backpropagation"

1

Karazanos, Elias. Temporal learning using time-dependent backpropagation and teacher forcing. Manchester: UMIST, 1997.

2

Nicolaides, Lena. Thermal-wave slice diffraction tomography with backpropagation and transmission reconstructions. Ottawa: National Library of Canada, 1996.

3

Dhawan, Atam P., and United States National Aeronautics and Space Administration, eds. LVQ and backpropagation neural networks applied to NASA SSME data. Washington, DC: National Aeronautics and Space Administration, 1993.

4

Gaxiola, Fernando, Patricia Melin, and Fevrier Valdez. New Backpropagation Algorithm with Type-2 Fuzzy Weights for Neural Networks. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-34087-6.

5

Wellington, Charles H. Backpropagation neural network for noise cancellation applied to the NUWES test ranges. Monterey, Calif: Naval Postgraduate School, 1991.

6

Werbos, Paul J. The roots of backpropagation: From ordered derivatives to neural networks and political forecasting. New York: Wiley, 1994.

7

Sundararajan, N., and Shou King Foo, eds. Parallel implementations of backpropagation neural networks on transputers: A study of training set parallelism. Singapore: World Scientific, 1996.

8

Billings, S. A. A comparison of the backpropagation and recursive prediction error algorithms for training neural networks. Sheffield: University of Sheffield, Dept. of Control Engineering, 1990.

9

Menke, Kurt William. Nonlinear adaptive control using backpropagating neural networks. Monterey, Calif: Naval Postgraduate School, 1992.

10

Chauvin, Yves, and David E. Rumelhart, eds. Backpropagation. Psychology Press, 2013. http://dx.doi.org/10.4324/9780203763247.


Book chapters on the topic "Backpropagation"

1

Munro, Paul. "Backpropagation." In Encyclopedia of Machine Learning, 73. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_51.

2

De Wilde, Philippe. "Backpropagation." In Neural Network Models, 33–52. London: Springer London, 1997. http://dx.doi.org/10.1007/978-1-84628-614-8_2.

3

Andrew, Alex M. "Backpropagation." In IFSR International Series on Systems Science and Engineering, 85–104. New York, NY: Springer New York, 2009. http://dx.doi.org/10.1007/978-0-387-75164-1_5.

4

Munro, Paul. "Backpropagation." In Encyclopedia of Machine Learning and Data Mining, 93–97. Boston, MA: Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_51.

5

Bishop, Christopher M., and Hugh Bishop. "Backpropagation." In Deep Learning, 233–52. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-45468-4_8.

6

Braun, Heinrich, Johannes Feulner, and Rainer Malaka. "Backpropagation I." In Springer-Lehrbuch, 81–101. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61000-4_5.

7

Braun, Heinrich, Johannes Feulner, and Rainer Malaka. "Backpropagation II." In Springer-Lehrbuch, 103–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61000-4_6.

8

Gasparini, Sonia, and Michele Migliore. "Action Potential Backpropagation." In Encyclopedia of Computational Neuroscience, 133–37. New York, NY: Springer New York, 2015. http://dx.doi.org/10.1007/978-1-4614-6675-8_123.

9

Antonik, Piotr. "Backpropagation with Photonics." In Springer Theses, 63–89. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91053-6_3.

10

Tuomi, Ilkka. "Vygotsky Meets Backpropagation." In Lecture Notes in Computer Science, 570–83. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-93843-1_42.


Conference papers on the topic "Backpropagation"

1

Brodsky, Stephen A., and Clark C. Guest. "Optical Matrix-Vector Implementation of Binary Valued Backpropagation." In Optical Computing. Washington, D.C.: Optica Publishing Group, 1991. http://dx.doi.org/10.1364/optcomp.1991.me8.

Abstract:
Optical implementations of neural networks can combine the advantages of neural network adaptive parallel processing and optical free-space connectivity. Binary valued Backpropagation [1], a supervised learning algorithm related to standard Backpropagation [2], significantly reduces interconnection storage and computation requirements. This implementation of binary valued Backpropagation used optical matrix-vector multiplication [3] to represent the forward information flow between network layers. Previous analog optical network memory systems have been described [4].
2

Dong, Yuhan, Chenguang Liu, Yui Lo, Yaqian Xu, Ke Wang, and Kai Zhang. "Attention Backpropagation." In CCEAI 2021: 5th International Conference on Control Engineering and Artificial Intelligence. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3448218.3448227.

3

Fausett, D. W. "Strictly local backpropagation." In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137834.

4

Cheng, L. M., H. L. Mak, and L. L. Cheng. "Structured backpropagation network." In 1991 IEEE International Joint Conference on Neural Networks. IEEE, 1991. http://dx.doi.org/10.1109/ijcnn.1991.170640.

5

Fernandes de Moraes, Joyrles, and Jörg Dietrich Wilhelm Schleicher. "Backpropagation-based redatuming." In International Congress of the Brazilian Geophysical Society & Expogef. Brazilian Geophysical Society, 2021. http://dx.doi.org/10.22564/17cisbgf2021.064.

6

Wymeersch, Henk. "Stochastic Digital Backpropagation: Unifying Digital Backpropagation and the MAP Criterion." In Signal Processing in Photonic Communications. Washington, D.C.: OSA, 2014. http://dx.doi.org/10.1364/sppcom.2014.st2d.3.

7

Yaremchuk, Vanessa, and Marcelo M. Wanderley. "Brahms, Bodies and Backpropagation." In the 2014 International Workshop. New York, New York, USA: ACM Press, 2014. http://dx.doi.org/10.1145/2617995.2618011.

8

Goli, Negar, and Tor M. Aamodt. "ReSprop: Reuse Sparsified Backpropagation." In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00162.

9

Leung, Karen, Nikos Arechiga, and Marco Pavone. "Backpropagation for Parametric STL." In 2019 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2019. http://dx.doi.org/10.1109/ivs.2019.8814167.

10

Diegert, C. "Out-of-core backpropagation." In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137701.


Reports on the topic "Backpropagation"

1

Morton, Paul E., and Glenn F. Wilson. Backpropagation and EEG Data. Fort Belvoir, VA: Defense Technical Information Center, October 1988. http://dx.doi.org/10.21236/ada279073.

2

Levy, Bernard C., and Cengiz Esmersoy. Variable Background Born Inversion by Wavefield Backpropagation. Fort Belvoir, VA: Defense Technical Information Center, November 1986. http://dx.doi.org/10.21236/ada459595.

3

Vitela, J. E., and J. Reifman. Premature saturation in backpropagation networks: Mechanism and necessary conditions. Office of Scientific and Technical Information (OSTI), August 1997. http://dx.doi.org/10.2172/510390.

4

Vitela, J. E., and J. Reifman. Premature saturation in backpropagation networks: Mechanism and necessary conditions. Office of Scientific and Technical Information (OSTI), December 1995. http://dx.doi.org/10.2172/211552.

5

Mu, Ruihui, and Xiaoqin Zeng. Improved Webpage Classification Technology Based on Feedforward Backpropagation Neural Network. "Prof. Marin Drinov" Publishing House of Bulgarian Academy of Sciences, September 2018. http://dx.doi.org/10.7546/crabs.2018.09.11.

6

Gage, Harmon J. Using Upper Layer Weights to Efficiently Construct and Train Feedforward Neural Networks Executing Backpropagation. Fort Belvoir, VA: Defense Technical Information Center, March 2011. http://dx.doi.org/10.21236/ada545618.

7

Kerr, John Patrick. The parallel implementation of a backpropagation neural network and its applicability to SPECT image reconstruction. Office of Scientific and Technical Information (OSTI), January 1992. http://dx.doi.org/10.2172/10138858.

8

Wawrzynek, John, Krste Asanovic, Brian Kingsbury, James Beck, and David Johnson. SPERT-II: A Vector Microprocessor System and Its Application to Large Problems in Backpropagation Training. Fort Belvoir, VA: Defense Technical Information Center, January 1993. http://dx.doi.org/10.21236/ada327554.

9

Kerr, J. P. The parallel implementation of a backpropagation neural network and its applicability to SPECT image reconstruction. Office of Scientific and Technical Information (OSTI), January 1992. http://dx.doi.org/10.2172/6879460.

10

Wang, Felix, Nick Alonso, and Corinne Teeter. Combining Spike Time Dependent Plasticity (STDP) and Backpropagation (BP) for Robust and Data Efficient Spiking Neural Networks (SNN). Office of Scientific and Technical Information (OSTI), December 2022. http://dx.doi.org/10.2172/1902866.

