Academic literature on the topic "Backpropagation"


Consult the thematic lists of articles, books, theses, conference papers, and other academic sources on the topic "Backpropagation".

Next to each source in the reference list there is an "Add to bibliography" button. Press it and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Backpropagation"

1. Andrew, Alex M. "Backpropagation". Kybernetes 30, no. 9/10 (December 2001): 1110–17. http://dx.doi.org/10.1108/03684920110405601.

2. Faisal, Faisal. "Penggunaan Metode Backpropagation Pada Sistem Prediksi Kelulusan Mahasiswa STMIK Kaputama Binjai". Data Sciences Indonesia (DSI) 2, no. 1 (August 10, 2022): 13–19. http://dx.doi.org/10.47709/dsi.v2i1.1664.

Abstract: Graduating on time is one measure of the standing of a higher-education institution, including STMIK Kaputama Binjai. Year after year, many of its students graduate on time, but quite a few do not. A graduation prediction system is therefore needed so that lecturers can guide students who are predicted to graduate late. The method used is the Backpropagation artificial neural network, whose architecture has three parts: an input layer, a hidden layer, and an output layer; the Backpropagation process consists of a forward and a backward pass. The data are the IPS1 through IPS4 semester grade records of the 2015–2021 graduating classes of the Informatics Engineering program: records of students who have already graduated serve as training data, while records of students still enrolled (who must have completed semester 4) can serve as test data for prediction. Experiments with different maximum iterations, maximum learning rates, and minimum error thresholds, and with different training data, produce different prediction accuracies; the highest test accuracy corresponds to the minimum error. The system was built in the Visual Basic programming language with Visual Studio 2010. The results show that the Backpropagation method performs reasonably well at classification for predicting student graduation.

3. Sexton, Randall S., Robert E. Dorsey, and John D. Johnson. "Beyond Backpropagation". Journal of Organizational and End User Computing 11, no. 3 (July 1999): 3–10. http://dx.doi.org/10.4018/joeuc.1999070101.

4. Adigun, Olaoluwa, and Bart Kosko. "Bidirectional Backpropagation". IEEE Transactions on Systems, Man, and Cybernetics: Systems 50, no. 5 (May 2020): 1982–94. http://dx.doi.org/10.1109/tsmc.2019.2916096.

5. Irawan, Eka, M. Zarlis, and Erna Budhiarti Nababan. "Analisis Penambahan Nilai Momentum pada Prediksi Produktivitas Kelapa Sawit Menggunakan Backpropagation". InfoTekJar (Jurnal Nasional Informatika dan Teknologi Jaringan) 1, no. 2 (March 3, 2017): 84–89. http://dx.doi.org/10.30743/infotekjar.v1i2.67.

Abstract: The backpropagation algorithm trains multilayer perceptrons and is widely used across a broad range of problems, but it has a limitation: a rather slow convergence rate. In this study the authors add an adaptive learning-rate parameter at each iteration and a momentum coefficient to the weight-update computation. Computer simulations compare standard backpropagation against backpropagation with added momentum: standard backpropagation converged in 727 epochs to an MSE of 0.01 and needed 4000 epochs to reach an MSE of 0.001. This shows that the adaptive-learning backpropagation algorithm reaches convergence faster than the standard algorithm.
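The momentum modification described in the abstract above can be sketched in a few lines: each weight update reuses a fraction of the previous update, which damps oscillation and speeds convergence. This is an illustrative example, not the paper's code; the network size, XOR task, and hyperparameters (lr, momentum) are assumptions chosen for a self-contained demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny 2-4-1 network trained on XOR with squared-error loss.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
# "Velocity" buffers holding the previous update for each parameter.
vW1, vb1 = np.zeros_like(W1), np.zeros_like(b1)
vW2, vb2 = np.zeros_like(W2), np.zeros_like(b2)
lr, momentum = 0.1, 0.9  # illustrative hyperparameters

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out0 = forward(X)
mse_initial = float(np.mean((out0 - y) ** 2))

for _ in range(5000):
    h, out = forward(X)
    # Backward pass: derivatives of squared error through sigmoid units.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Momentum update: v <- momentum*v - lr*grad, then W <- W + v.
    vW2 = momentum * vW2 - lr * (h.T @ d_out)
    vb2 = momentum * vb2 - lr * d_out.sum(axis=0)
    vW1 = momentum * vW1 - lr * (X.T @ d_h)
    vb1 = momentum * vb1 - lr * d_h.sum(axis=0)
    W2, b2, W1, b1 = W2 + vW2, b2 + vb2, W1 + vW1, b1 + vb1

_, out1 = forward(X)
mse_final = float(np.mean((out1 - y) ** 2))
```

Setting momentum to 0 recovers plain gradient descent, which is the comparison the abstract reports.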
6. Nafisah, Zumrotun, Febrian Rachmadi, and Elly Matul Imah. "Face Recognition Using Complex Valued Backpropagation". Jurnal Ilmu Komputer dan Informasi 11, no. 2 (June 29, 2018): 103. http://dx.doi.org/10.21609/jiki.v11i2.617.

Abstract: Face recognition is an area of biometric research that remains of interest. This study discusses the Complex-Valued Backpropagation algorithm for face recognition. Complex-Valued Backpropagation is a modification of the Real-Valued Backpropagation algorithm in which the weights and activation functions are complex. The dataset used in this study consists of 250 images classified into 5 classes. The performance of Complex-Valued Backpropagation on face recognition is also compared with the Real-Valued Backpropagation algorithm. Experimental results show that Complex-Valued Backpropagation performs better than Real-Valued Backpropagation.
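As the abstract notes, the modification makes the weights and activations complex-valued. A minimal sketch of such a forward pass is below, assuming a "split" sigmoid applied separately to the real and imaginary parts, which is one common choice of complex activation; the layer sizes and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def split_sigmoid(z):
    """Sigmoid applied separately to the real and imaginary parts of z."""
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    return sig(z.real) + 1j * sig(z.imag)

rng = np.random.default_rng(1)

# Complex-valued weights for an illustrative 3-5-2 network.
W1 = rng.normal(size=(3, 5)) + 1j * rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, 2)) + 1j * rng.normal(size=(5, 2))

x = rng.normal(size=(4, 3)) + 1j * rng.normal(size=(4, 3))  # 4 complex samples
h = split_sigmoid(x @ W1)    # complex hidden activations
out = split_sigmoid(h @ W2)  # complex outputs, real/imag parts each in (0, 1)
```

The backward pass then follows the real-valued derivation applied to the real and imaginary parts separately, since the split activation is not holomorphic.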
7. Ojha, Varun, and Giuseppe Nicosia. "Backpropagation Neural Tree". Neural Networks 149 (May 2022): 66–83. http://dx.doi.org/10.1016/j.neunet.2022.02.003.

8. Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation". ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3476576.3476672.

9. Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation". ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3450626.3459804.

10. Georgiou, G. M., and C. Koutsougeras. "Complex domain backpropagation". IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 39, no. 5 (May 1992): 330–34. http://dx.doi.org/10.1109/82.142037.

Theses on the topic "Backpropagation"

1. Civelek, Ferda N. (Ferda Nur). "Temporal Connectionist Expert Systems Using a Temporal Backpropagation Algorithm". Thesis, University of North Texas, 1993. https://digital.library.unt.edu/ark:/67531/metadc278824/.

Abstract: Representing time has been a general problem for artificial intelligence research for many years. More recently, the question has become increasingly important in representing the human decision-making process through connectionist expert systems. Because most human behaviors unfold over time, any attempt to represent expert performance without considering its temporal nature can lead to incorrect results. A temporal feedforward neural network model that can be applied to a number of neural network application areas, including connectionist expert systems, is introduced. The model has a multi-layer structure with no limit on the number of layers, and it allows output nodes to be defined in any layer, which is especially important for connectionist expert system applications. A temporal backpropagation algorithm that supports the model is developed; together they make it practical to define a wide range of artificial neural network applications. An approach for decreasing the memory used by the weight matrix is also introduced. The algorithm was tested on a medical connectionist expert system that describes not only a disease but its entire course. The system was first trained on a pattern encoded from the expert system's knowledge-base rules; then a series of experiments was carried out using the temporal model and the temporal backpropagation algorithm. The first series of experiments checked that the training process worked as predicted; in the second, the weight matrix of the trained system was defined as a function of time intervals before presenting the learned patterns. The results indicate that both approaches produce correct results, the only difference being that compressing the weight matrix required more training epochs. As a measure of correctness, the squared error was summed over all patterns to obtain a total sum of squares.

2. Yee, Clifford Wing Wei. "Point source compensation: a backpropagation method for underwater acoustic imaging". Thesis, University of New South Wales, School of Physics, 2003. http://handle.unsw.edu.au/1959.4/20590.

Abstract: The backpropagation method of image reconstruction has been known for some time, with the advantage of fast processing due to the use of the Fast Fourier Transform, but its applicability to underwater imaging has been limited; at present the shift-and-add method is more widely used. This is because backpropagation was derived for plane-wave insonification, with the scattered waves detected in a transmission-mode or synthetic-aperture set-up. One method used in underwater imaging is to insonify the target with a point source and detect the scattered waves in reflection mode with a receiver array; an advantage of this scheme is that only one transmission of the source is required to capture an image, instead of multiple transmissions, so motion artifacts are kept to a minimum. To exploit the processing speed of backpropagation, it must be adapted to point-source insonification. Coverage of this configuration in the literature has been scant: methods for spherical sources have been proposed for transmission mode and for arbitrary surfaces in geophysical applications, but they are complex and difficult to use. This thesis proposes a novel point source compensation method so that backpropagation image formation can be used with the point-source insonification set-up. The method was derived through theoretical analysis, numerical simulation, and experimental verification; the effect of various compensation factors on image quality was studied in simulation, and practical issues in applying the new method were addressed in the experimental verification, which served as the final proof of concept. The quality of images formed with point source compensation was also compared with the shift-and-add method: experimental and simulation results show that the point-source-compensated backpropagation algorithm produces images of comparable quality for wideband point-source insonification with reflection-mode detection, with the advantage of faster image formation.

3. Bendelac, Shiri. "Enhanced Neural Network Training Using Selective Backpropagation and Forward Propagation". Master of Science thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/83714.

Abstract: Neural networks are making headlines every day as the tool of the future, powering artificial intelligence programs and supporting technologies never seen before. However, training can take days or even weeks for larger networks, and requires supercomputers and GPUs in academia and industry to achieve state-of-the-art results. This thesis discusses employing selective measures to determine when to backpropagate and forward propagate in order to reduce training time while maintaining classification performance. The new algorithms are tested on the MNIST and CASIA datasets, with successful results on both. The selective backpropagation algorithm shows a reduction of up to 93.3% in backpropagations completed, and the selective forward propagation algorithm a reduction of up to 72.90% in forward propagations and backpropagations completed, compared with baseline runs that always forward propagate and backpropagate. The work also discusses applying the selective backpropagation algorithm to a modified dataset in which some classes are disproportionately under-represented.

4. Bonnell, Jeffrey A. "Implementation of a New Sigmoid Function in Backpropagation Neural Networks". Digital Commons @ East Tennessee State University, 2011. https://dc.etsu.edu/etd/1342.

Abstract: This thesis presents the use of a new sigmoid activation function in backpropagation artificial neural networks (ANNs). ANNs using conventional activation functions may generalize poorly when trained on a set that includes quirky, mislabeled, unbalanced, or otherwise complicated data. The new activation function is an attempt to improve generalization and reduce overtraining on mislabeled or irrelevant data by restricting training when inputs to the hidden neurons are sufficiently small. It includes a flattened, low-training region that grows or shrinks during backpropagation to ensure that a desired proportion of inputs falls inside it; with a desired low-training proportion of 0, the function reduces to a standard sigmoidal curve. A network with the new activation function in the hidden layer is trained on benchmark data sets and compared with the standard activation function, in an attempt to improve the area under the receiver operating characteristic curve in biological and other classification tasks.

5. Hövel, Christoph A. "Finanzmarktprognose mit neuronalen Netzen: Training mit Backpropagation und genetisch-evolutionären Verfahren". Lohmar; Köln: Eul, 2003. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=010635637&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.

6. Seifert, Christin, and Jan Parthey. "Simulation Rekursiver Auto-Assoziativer Speicher (RAAM) durch Erweiterung eines klassischen Backpropagation-Simulators". Thesis, Universitätsbibliothek Chemnitz, 2003. http://nbn-resolving.de/urn:nbn:de:swb:ch1-200300536.

Abstract: Recursive auto-associative memories (RAAM) are special neural networks (NN) capable of processing hierarchical structures. Simulating these networks involves some particularities, such as the dynamic training set, that must be taken into account. The thesis discusses these particularities and the adapted learning algorithms that follow from them. In addition, a standard backpropagation simulator (Xerion) is extended with the capabilities needed to simulate RAAMs.

7. Sam, Iat Tong. "Theory of backpropagation type learning of artificial neural networks and its applications". Thesis, University of Macau, 2001. http://umaclib3.umac.mo/record=b1446702.

8. Potter, Matthew James. "Improving ANN Generalization via Self-Organized Flocking in conjunction with Multitasked Backpropagation". NCSU, 2003. http://www.lib.ncsu.edu/theses/available/etd-03242003-075528/.

Abstract: The purpose of this research has been to develop methods of improving the generalization capabilities of artificial neural networks. Tools for examining the influence of individual training-set patterns on the learning of individual neurons are put forth and used in new network learning algorithms. The algorithms are based largely on the supervised backpropagation algorithm, and all experiments use standard backpropagation for comparison. The new learning algorithms add two main components. The first is an unsupervised learning algorithm called flocking, which attempts to provide network hyperplane divisions that are evenly influenced by examples on either side of the hyperplane. The second is a multitasking approach called convergence training, which uses the information provided by a clustering algorithm to create subtasks representing the divisions between clusters; these subtasks are then trained in unison to promote hyperplane sharing within the problem space. Generalization improved in most cases, and the solutions produced by the new algorithms are shown to be very robust to different random weight initializations. This research is both a search for better-generalizing ANN learning algorithms and a search for better understanding of the complexities involved in ANN generalization.

9. Wellington, Charles H. "Backpropagation neural network for noise cancellation applied to the NUWES test ranges". Thesis, Monterey, California: Naval Postgraduate School, 1991. http://hdl.handle.net/10945/26899.

Abstract: This thesis investigates the application of backpropagation neural networks as an alternative to adaptive filtering at the NUWES test ranges. To facilitate the investigation, a model of the test range is developed that accounts for acoustic transmission losses, Doppler shift, multipath, and finite propagation time delay. After describing the model, the backpropagation neural network algorithm and the feature selection for the network are explained. Two schemes based on the network's output, signal waveform recovery and binary code recovery, are then applied to the model, and simulation results for both schemes are presented for several scenarios.

10. Seifert, Christin, and Jan Parthey. "Simulation Rekursiver Auto-Assoziativer Speicher (RAAM) durch Erweiterung eines klassischen Backpropagation-Simulators". [S.l.: s.n.], 2003. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB10607558.

Books on the topic "Backpropagation"

1. Karazanos, Elias. Temporal learning using time-dependent backpropagation and teacher forcing. Manchester: UMIST, 1997.

2. Nicolaides, Lena. Thermal-wave slice diffraction tomography with backpropagation and transmission reconstructions. Ottawa: National Library of Canada, 1996.

3. Dhawan, Atam P., and United States National Aeronautics and Space Administration, eds. LVQ and backpropagation neural networks applied to NASA SSME data. Washington, DC: National Aeronautics and Space Administration, 1993.

4. Gaxiola, Fernando, Patricia Melin, and Fevrier Valdez. New Backpropagation Algorithm with Type-2 Fuzzy Weights for Neural Networks. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-34087-6.

5. Wellington, Charles H. Backpropagation neural network for noise cancellation applied to the NUWES test ranges. Monterey, Calif.: Naval Postgraduate School, 1991.

6. Werbos, Paul J. The roots of backpropagation: From ordered derivatives to neural networks and political forecasting. New York: Wiley, 1994.

7. Sundararajan, N., and Shou King Foo, eds. Parallel implementations of backpropagation neural networks on transputers: A study of training set parallelism. Singapore: World Scientific, 1996.

8. Billings, S. A. A comparison of the backpropagation and recursive prediction error algorithms for training neural networks. Sheffield: University of Sheffield, Dept. of Control Engineering, 1990.

9. Menke, Kurt William. Nonlinear adaptive control using backpropagating neural networks. Monterey, Calif.: Naval Postgraduate School, 1992.

10. Chauvin, Yves, and David E. Rumelhart, eds. Backpropagation. Psychology Press, 2013. http://dx.doi.org/10.4324/9780203763247.

Book chapters on the topic "Backpropagation"

1. Munro, Paul. "Backpropagation". In Encyclopedia of Machine Learning, 73. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_51.

2. De Wilde, Philippe. "Backpropagation". In Neural Network Models, 33–52. London: Springer London, 1997. http://dx.doi.org/10.1007/978-1-84628-614-8_2.

3. Andrew, Alex M. "Backpropagation". In IFSR International Series on Systems Science and Engineering, 85–104. New York, NY: Springer New York, 2009. http://dx.doi.org/10.1007/978-0-387-75164-1_5.

4. Munro, Paul. "Backpropagation". In Encyclopedia of Machine Learning and Data Mining, 93–97. Boston, MA: Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_51.

5. Bishop, Christopher M., and Hugh Bishop. "Backpropagation". In Deep Learning, 233–52. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-45468-4_8.

6. Braun, Heinrich, Johannes Feulner, and Rainer Malaka. "Backpropagation I". In Springer-Lehrbuch, 81–101. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61000-4_5.

7. Braun, Heinrich, Johannes Feulner, and Rainer Malaka. "Backpropagation II". In Springer-Lehrbuch, 103–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61000-4_6.

8. Gasparini, Sonia, and Michele Migliore. "Action Potential Backpropagation". In Encyclopedia of Computational Neuroscience, 133–37. New York, NY: Springer New York, 2015. http://dx.doi.org/10.1007/978-1-4614-6675-8_123.

9. Antonik, Piotr. "Backpropagation with Photonics". In Springer Theses, 63–89. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91053-6_3.

10. Tuomi, Ilkka. "Vygotsky Meets Backpropagation". In Lecture Notes in Computer Science, 570–83. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-93843-1_42.

Conference proceedings on the topic "Backpropagation"

1. Brodsky, Stephen A., and Clark C. Guest. "Optical Matrix-Vector Implementation of Binary Valued Backpropagation". In Optical Computing. Washington, D.C.: Optica Publishing Group, 1991. http://dx.doi.org/10.1364/optcomp.1991.me8.

Abstract: Optical implementations of neural networks can combine the advantages of neural network adaptive parallel processing and optical free-space connectivity. Binary valued backpropagation [1], a supervised learning algorithm related to standard backpropagation [2], significantly reduces interconnection storage and computation requirements. This implementation of binary valued backpropagation used optical matrix-vector multiplication [3] to represent the forward information flow between network layers. Previous analog optical network memory systems have been described [4].

2. Dong, Yuhan, Chenguang Liu, Yui Lo, Yaqian Xu, Ke Wang, and Kai Zhang. "Attention Backpropagation". In CCEAI 2021: 5th International Conference on Control Engineering and Artificial Intelligence. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3448218.3448227.

3. Fausett, D. W. "Strictly local backpropagation". In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137834.

4. Cheng, L. M., H. L. Mak, and L. L. Cheng. "Structured backpropagation network". In 1991 IEEE International Joint Conference on Neural Networks. IEEE, 1991. http://dx.doi.org/10.1109/ijcnn.1991.170640.

5. Fernandes de Moraes, Joyrles, and Jörg Dietrich Wilhelm Schleicher. "Backpropagation-based redatuming". In International Congress of the Brazilian Geophysical Society & Expogef. Brazilian Geophysical Society, 2021. http://dx.doi.org/10.22564/17cisbgf2021.064.

6. Wymeersch, Henk. "Stochastic Digital Backpropagation: Unifying Digital Backpropagation and the MAP Criterion". In Signal Processing in Photonic Communications. Washington, D.C.: OSA, 2014. http://dx.doi.org/10.1364/sppcom.2014.st2d.3.

7. Yaremchuk, Vanessa, and Marcelo M. Wanderley. "Brahms, Bodies and Backpropagation". In the 2014 International Workshop. New York, New York, USA: ACM Press, 2014. http://dx.doi.org/10.1145/2617995.2618011.

8. Goli, Negar, and Tor M. Aamodt. "ReSprop: Reuse Sparsified Backpropagation". In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00162.

9. Leung, Karen, Nikos Arechiga, and Marco Pavone. "Backpropagation for Parametric STL". In 2019 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2019. http://dx.doi.org/10.1109/ivs.2019.8814167.

10. Diegert, C. "Out-of-core backpropagation". In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137701.

Reports on the topic "Backpropagation"

1. Morton, Paul E., and Glenn F. Wilson. Backpropagation and EEG Data. Fort Belvoir, VA: Defense Technical Information Center, October 1988. http://dx.doi.org/10.21236/ada279073.

2. Levy, Bernard C., and Cengiz Esmersoy. Variable Background Born Inversion by Wavefield Backpropagation. Fort Belvoir, VA: Defense Technical Information Center, November 1986. http://dx.doi.org/10.21236/ada459595.

3. Vitela, J. E., and J. Reifman. Premature saturation in backpropagation networks: Mechanism and necessary conditions. Office of Scientific and Technical Information (OSTI), August 1997. http://dx.doi.org/10.2172/510390.

4. Vitela, J. E., and J. Reifman. Premature saturation in backpropagation networks: Mechanism and necessary conditions. Office of Scientific and Technical Information (OSTI), December 1995. http://dx.doi.org/10.2172/211552.

5. Mu, Ruihui, and Xiaoqin Zeng. Improved Webpage Classification Technology Based on Feedforward Backpropagation Neural Network. "Prof. Marin Drinov" Publishing House of the Bulgarian Academy of Sciences, September 2018. http://dx.doi.org/10.7546/crabs.2018.09.11.

6. Gage, Harmon J. Using Upper Layer Weights to Efficiently Construct and Train Feedforward Neural Networks Executing Backpropagation. Fort Belvoir, VA: Defense Technical Information Center, March 2011. http://dx.doi.org/10.21236/ada545618.

7. Kerr, John Patrick. The parallel implementation of a backpropagation neural network and its applicability to SPECT image reconstruction. Office of Scientific and Technical Information (OSTI), January 1992. http://dx.doi.org/10.2172/10138858.

8. Wawrzynek, John, Krste Asanovic, Brian Kingsbury, James Beck, and David Johnson. SPERT-II: A Vector Microprocessor System and Its Application to Large Problems in Backpropagation Training. Fort Belvoir, VA: Defense Technical Information Center, January 1993. http://dx.doi.org/10.21236/ada327554.

9. Kerr, J. P. The parallel implementation of a backpropagation neural network and its applicability to SPECT image reconstruction. Office of Scientific and Technical Information (OSTI), January 1992. http://dx.doi.org/10.2172/6879460.

10. Wang, Felix, Nick Alonso, and Corinne Teeter. Combining Spike Time Dependent Plasticity (STDP) and Backpropagation (BP) for Robust and Data Efficient Spiking Neural Networks (SNN). Office of Scientific and Technical Information (OSTI), December 2022. http://dx.doi.org/10.2172/1902866.