Ready-made bibliography on the topic "Backpropagation"

Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles

See the lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Backpropagation".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a scholarly publication in ".pdf" format and read its abstract online, when these are available in the metadata.

Journal articles on the topic "Backpropagation"

1. Andrew, Alex M. "Backpropagation". Kybernetes 30, no. 9/10 (December 2001): 1110–17. http://dx.doi.org/10.1108/03684920110405601.
2. Faisal, Faisal. "Penggunaan Metode Backpropagation Pada Sistem Prediksi Kelulusan Mahasiswa STMIK Kaputama Binjai". Data Sciences Indonesia (DSI) 2, no. 1 (10.08.2022): 13–19. http://dx.doi.org/10.47709/dsi.v2i1.1664.
Abstract:
Graduating on time is one benchmark of a higher-education institution's integrity, including at STMIK Kaputama Binjai. Year after year, many STMIK Kaputama Binjai students graduate on time, but quite a few do not. A graduation-prediction system is therefore needed so that lecturers can guide students who are predicted to graduate late. The method used is the Backpropagation artificial neural network. The Backpropagation architecture has three parts: an input layer, a hidden layer, and an output layer, and the Backpropagation process comprises a forward pass and a backward pass. The data used are the IPS1 through IPS4 grades of the 2015-2021 graduating cohorts of the Informatics Engineering study program: data from students who have already graduated serve as training data for the Backpropagation network, while data from students still enrolled (provided they have completed semester 4) can be used as test data for graduation prediction. Experiments with different maximum iteration counts, learning rates, and minimum-error settings, and with different training data, yield different prediction accuracies; the highest test accuracy corresponds to the minimum error. The system was built in the Visual Basic programming language using Visual Studio 2010. The results show that the Backpropagation method classifies well enough to predict student graduation.
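The input-hidden-output, forward/backward procedure summarized in the abstract above can be sketched in a few lines of generic Python. This is an illustrative sketch of plain three-layer backpropagation, not the cited system (which was written in Visual Basic); the network size, sample input, and target are assumptions:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w_h, w_o, x):
    """Forward pass through a 2-2-1 network (bias folded in as the last weight)."""
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
    return h, y

def loss(w_h, w_o, x, t):
    """Squared error for one training example."""
    _, y = forward(w_h, w_o, x)
    return 0.5 * (y - t) ** 2

def backward(w_h, w_o, x, t):
    """Backward pass: gradient of the loss w.r.t. every weight."""
    h, y = forward(w_h, w_o, x)
    d_y = (y - t) * y * (1 - y)                                  # output delta
    d_h = [d_y * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]   # hidden deltas
    g_o = [d_y * h[0], d_y * h[1], d_y]
    g_h = [[d_h[j] * x[0], d_h[j] * x[1], d_h[j]] for j in range(2)]
    return g_h, g_o

rng = random.Random(0)
w_h = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # hidden weights
w_o = [rng.uniform(-1, 1) for _ in range(3)]                      # output weights
x, t = [1.0, 0.0], 1.0
g_h, g_o = backward(w_h, w_o, x, t)
```

A training loop would repeatedly call `backward` and step each weight a small amount against its gradient.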
3. Sexton, Randall S., Robert E. Dorsey, and John D. Johnson. "Beyond Backpropagation". Journal of Organizational and End User Computing 11, no. 3 (July 1999): 3–10. http://dx.doi.org/10.4018/joeuc.1999070101.
4. Adigun, Olaoluwa, and Bart Kosko. "Bidirectional Backpropagation". IEEE Transactions on Systems, Man, and Cybernetics: Systems 50, no. 5 (May 2020): 1982–94. http://dx.doi.org/10.1109/tsmc.2019.2916096.
5. Irawan, Eka, M. Zarlis, and Erna Budhiarti Nababan. "ANALISIS PENAMBAHAN NILAI MOMENTUM PADA PREDIKSI PRODUKTIVITAS KELAPA SAWIT MENGGUNAKAN BACKPROPAGATION". InfoTekJar (Jurnal Nasional Informatika dan Teknologi Jaringan) 1, no. 2 (3.03.2017): 84–89. http://dx.doi.org/10.30743/infotekjar.v1i2.67.
Abstract:
The backpropagation algorithm is a multi-layer perceptron widely used to solve a broad range of problems, but it has a limitation: a rather slow rate of convergence. In this study the authors add a learning-rate parameter adapted at each iteration and a momentum coefficient in the weight-update computation. Computer simulations provide a comparison between standard backpropagation and backpropagation with added momentum: standard backpropagation converged in 727 epochs to an MSE of 0.01, while reaching an MSE of 0.001 took 4000 epochs. This shows that the adaptive-learning backpropagation algorithm reaches convergence faster than the standard algorithm.
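The momentum modification analyzed in the paper above can be illustrated on a toy objective: the update reuses a fraction of the previous weight change, which typically speeds convergence. A minimal sketch, with a hypothetical objective and hyperparameters (not the paper's network or data):

```python
def descend(lr=0.01, mu=0.9, steps=200, momentum=True):
    """Minimize f(w) = w**2 by gradient descent, optionally with momentum."""
    w, v = 1.0, 0.0
    for _ in range(steps):
        grad = 2.0 * w               # f'(w) for f(w) = w**2
        if momentum:
            v = mu * v - lr * grad   # keep a fraction of the previous update
            w += v
        else:
            w -= lr * grad           # standard update
    return w

plain = descend(momentum=False)
heavy = descend(momentum=True)
```

After the same number of steps, the momentum run ends much closer to the minimum than the plain run, mirroring the faster convergence such studies report.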
6. Nafisah, Zumrotun, Febrian Rachmadi, and Elly Matul Imah. "Face Recognition Using Complex Valued Backpropagation". Jurnal Ilmu Komputer dan Informasi 11, no. 2 (29.06.2018): 103. http://dx.doi.org/10.21609/jiki.v11i2.617.
Abstract:
Face recognition is an area of biometric research that remains of interest. This study discusses the Complex-Valued Backpropagation algorithm for face recognition. Complex-Valued Backpropagation is a modification of the Real-Valued Backpropagation algorithm in which the weights and activation functions are complex-valued. The dataset used in this study consists of 250 images classified into 5 classes. The performance of face recognition using Complex-Valued Backpropagation is also compared with the Real-Valued Backpropagation algorithm. Experimental results show that Complex-Valued Backpropagation performs better than Real-Valued Backpropagation.
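The core idea in the abstract above, carrying complex-valued weights and activations through the forward pass, can be sketched with Python's built-in complex type. The "split" activation used here (a real sigmoid applied separately to the real and imaginary parts) is one common choice in complex-valued networks; it is an assumption for illustration, not necessarily the paper's exact activation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def split_sigmoid(z):
    """Split activation: real sigmoid applied to Re(z) and Im(z) separately."""
    return complex(sigmoid(z.real), sigmoid(z.imag))

def neuron(weights, bias, inputs):
    """One complex-valued neuron: complex dot product, then split activation."""
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return split_sigmoid(net)

# Hypothetical complex weights and inputs.
w = [0.5 + 0.2j, -0.3 + 0.7j]
x = [1.0 + 1.0j, 0.5 - 0.5j]
y = neuron(w, 0.1 + 0.0j, x)
```

The backward pass then differentiates through the real and imaginary components of each weight, just as real-valued backpropagation differentiates through each real weight.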
7. Ojha, Varun, and Giuseppe Nicosia. "Backpropagation Neural Tree". Neural Networks 149 (May 2022): 66–83. http://dx.doi.org/10.1016/j.neunet.2022.02.003.
8. Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation". ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3476576.3476672.
9. Vicini, Delio, Sébastien Speierer, and Wenzel Jakob. "Path replay backpropagation". ACM Transactions on Graphics 40, no. 4 (August 2021): 1–14. http://dx.doi.org/10.1145/3450626.3459804.
10. Georgiou, G. M., and C. Koutsougeras. "Complex domain backpropagation". IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 39, no. 5 (May 1992): 330–34. http://dx.doi.org/10.1109/82.142037.

Doctoral dissertations on the topic "Backpropagation"

1. Civelek, Ferda N. (Ferda Nur). "Temporal Connectionist Expert Systems Using a Temporal Backpropagation Algorithm". Thesis, University of North Texas, 1993. https://digital.library.unt.edu/ark:/67531/metadc278824/.
Abstract:
Representing time has been a general problem for artificial intelligence research for many years. More recently, the question of representing time has become increasingly important in representing human decision-making processes through connectionist expert systems. Because most human behaviors unfold over time, any attempt to represent expert performance without considering its temporal nature can often lead to incorrect results. A temporal feedforward neural network model that can be applied to a number of neural network application areas, including connectionist expert systems, is introduced. The model has a multi-layer structure, i.e. the number of layers is not limited, and it offers the flexibility of defining output nodes in any layer, which is especially important for connectionist expert system applications. A temporal backpropagation algorithm that supports the model is developed; together they make it practical to define a wide range of artificial neural network applications. An approach for decreasing the memory space used by the weight matrix is also introduced. The algorithm was tested using a medical connectionist expert system to show how best to describe not only a disease but also its entire course. The system was first trained using a pattern encoded from the expert system's knowledge-base rules. Then a series of experiments was carried out using the temporal model and the temporal backpropagation algorithm. The first series was done to determine whether the training process worked as predicted; in the second, the weight matrix in the trained system was defined as a function of time intervals before presenting the system with the learned patterns. The results indicate that both approaches produce correct results, the only difference being that compressing the weight matrix required more training epochs. To measure the correctness of the results, the squared error was summed over all patterns to obtain a total sum of squares.
2. Yee, Clifford Wing Wei. "Point source compensation - a backpropagation method for underwater acoustic imaging". Thesis, University of New South Wales, School of Physics, 2003. http://handle.unsw.edu.au/1959.4/20590.
Abstract:
The backpropagation method of image reconstruction has long been known for its fast processing, owing to its use of the Fast Fourier Transform, but its applicability to underwater imaging has been limited; at present the shift-and-add method is more widely used. This is because backpropagation has been derived for plane-wave insonification, with the scattered waves detected in a transmission-mode or synthetic-aperture set-up. One method used in underwater imaging is to insonify the target with a point source and detect the scattered waves in reflection mode with a receiver array. An advantage of this scanning method is that only one transmission of the source is required to capture an image, instead of multiple transmissions, so motion artifacts are kept to a minimum. To exploit the processing speed of the backpropagation method, it must be adapted for point-source insonification. Coverage of this configuration in the literature has been scant: methods for spherical sources have been proposed for transmission mode and for arbitrary surfaces in geophysical applications, but these methods are complex and difficult to use. A novel point source compensation method is proposed in this thesis so that the backpropagation image-formation method can be used with a point-source insonification set-up. The new method was derived through theoretical analysis, numerical simulation, and experimental verification. The effect of various compensation factors on image quality was studied in simulation, and practical issues relating to the application of the new method were addressed in the experimental verification, which provided the final proof of concept. The quality of images formed with point source compensation has also been compared with that of the shift-and-add method. Experimental and simulation results show that the point source compensated backpropagation algorithm can produce images of comparable quality to those formed with the shift-and-add method for wideband point-source insonification with detection in reflection mode, with the advantage of faster image formation.
3. Bendelac, Shiri. "Enhanced Neural Network Training Using Selective Backpropagation and Forward Propagation". Thesis, Virginia Tech, 2018. http://hdl.handle.net/10919/83714.
Abstract:
Neural networks are making headlines every day as the tool of the future, powering artificial intelligence programs and supporting technologies never seen before. However, training a neural network can take days or even weeks for larger networks, and achieving state-of-the-art results in academia and industry requires supercomputers and GPUs. This thesis discusses employing selective measures to determine when to backpropagate and forward propagate in order to reduce training time while maintaining classification performance. The new algorithms are tested on the MNIST and CASIA datasets, with successful results on both. The selective backpropagation algorithm shows a reduction of up to 93.3% in backpropagations completed, and the selective forward propagation algorithm shows a reduction of up to 72.90% in forward propagations and backpropagations completed, compared to baseline runs that always forward propagate and backpropagate. This work also discusses employing the selective backpropagation algorithm on a modified dataset in which some classes are disproportionately under-represented.
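The selective strategy described in the abstract above (deciding per example whether the backward pass is worth running) can be sketched generically. The loss-threshold criterion below is a hypothetical illustration of the idea, not the thesis's exact selection rule:

```python
def selective_training_step(examples, predict, update, threshold=0.01):
    """Backpropagate only for examples whose loss exceeds a threshold.

    `predict` maps (input, target) to (prediction, loss); `update` runs the
    backward pass and weight update for one example.  Returns how many
    backward passes were skipped.
    """
    skipped = 0
    for x, t in examples:
        _, example_loss = predict(x, t)
        if example_loss > threshold:
            update(x, t)      # full backward pass only when needed
        else:
            skipped += 1      # example already handled well enough
    return skipped
```

The savings come from never computing gradients for examples the network already fits, at the cost of one extra forward evaluation per skipped example.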
4. Bonnell, Jeffrey A. "Implementation of a New Sigmoid Function in Backpropagation Neural Networks". Digital Commons @ East Tennessee State University, 2011. https://dc.etsu.edu/etd/1342.
Abstract:
This thesis presents the use of a new sigmoid activation function in backpropagation artificial neural networks (ANNs). ANNs using conventional activation functions may generalize poorly when trained on a set that includes quirky, mislabeled, unbalanced, or otherwise complicated data. The new activation function is an attempt to improve generalization and reduce overtraining on mislabeled or irrelevant data by restricting training when the inputs to the hidden neurons are sufficiently small. It includes a flattened, low-training region that grows or shrinks during backpropagation to ensure a desired proportion of inputs falls inside the low-training region; with a desired low-training proportion of 0, the function reduces to a standard sigmoidal curve. A network with the new activation function in the hidden layer is trained on benchmark data sets and compared with the standard activation function, in an attempt to improve the area under the receiver operating characteristic curve in biological and other classification tasks.
5. Hövel, Christoph A. "Finanzmarktprognose mit neuronalen Netzen: Training mit Backpropagation und genetisch-evolutionären Verfahren". Lohmar; Köln: Eul, 2003. http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&doc_number=010635637&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA.
6. Seifert, Christin, and Jan Parthey. "Simulation Rekursiver Auto-Assoziativer Speicher (RAAM) durch Erweiterung eines klassischen Backpropagation-Simulators". Thesis, Universitätsbibliothek Chemnitz, 2003. http://nbn-resolving.de/urn:nbn:de:swb:ch1-200300536.
Abstract:
Recursive Auto-Associative Memories (RAAM) are special neural networks (NNs) capable of processing hierarchical structures. Simulating these networks involves some peculiarities, such as the dynamic training set, that must be taken into account. This thesis discusses these peculiarities and the adapted learning algorithms that result from them. In addition, a standard backpropagation simulator (Xerion) is extended with the capabilities needed to simulate RAAMs.
7. Sam, Iat Tong. "Theory of backpropagation type learning of artificial neural networks and its applications". Thesis, University of Macau, 2001. http://umaclib3.umac.mo/record=b1446702.
8. Potter, Matthew James. "Improving ANN Generalization via Self-Organized Flocking in conjunction with Multitasked Backpropagation". NCSU, 2003. http://www.lib.ncsu.edu/theses/available/etd-03242003-075528/.
Abstract:
The purpose of this research has been to develop methods of improving the generalization capabilities of artificial neural networks. Tools for examining the influence of individual training set patterns on the learning abilities of individual neurons are put forth and used in the implementation of new network learning algorithms. The algorithms are based largely on the supervised backpropagation training algorithm, and all experiments use standard backpropagation for comparison of results. The new learning algorithms revolve around the addition of two main components. The first is an unsupervised learning algorithm called flocking, which attempts to provide network hyperplane divisions that are evenly influenced by examples on either side of the hyperplane. The second is a multi-tasking approach called convergence training, which uses the information provided by a clustering algorithm to create subtasks representing the divisions between clusters; these subtasks are then trained in unison to promote hyperplane sharing within the problem space. Generalization was improved in most cases, and the solutions produced by the new learning algorithms are shown to be very robust against different random weight initializations. This research is not only a search for better-generalizing ANN learning algorithms, but also a search for a better understanding of the complexities involved in ANN generalization.
9. Wellington, Charles H. "Backpropagation neural network for noise cancellation applied to the NUWES test ranges". Thesis, Monterey, California: Naval Postgraduate School, 1991. http://hdl.handle.net/10945/26899.
Abstract:
This thesis investigates the application of backpropagation neural networks as an alternative to adaptive filtering at the NUWES test ranges. To facilitate the investigation, a model of the test range is developed that accounts for acoustic transmission losses, the effects of Doppler shift, multipath, and finite propagation time delay. After describing the model, the backpropagation neural network algorithm and the feature selection for the network are explained. Two schemes based on the network's output, signal waveform recovery and binary code recovery, are then applied to the model, and simulation results for both schemes are presented for several scenarios.
10. Seifert, Christin, and Jan Parthey. "Simulation Rekursiver Auto-Assoziativer Speicher (RAAM) durch Erweiterung eines klassischen Backpropagation-Simulators". [S.l.: s.n.], 2003. http://www.bsz-bw.de/cgi-bin/xvms.cgi?SWB10607558.

Books on the topic "Backpropagation"

1. Karazanos, Elias. Temporal learning using time-dependent backpropagation and teacher forcing. Manchester: UMIST, 1997.
2. Nicolaides, Lena. Thermal-wave slice diffraction tomography with backpropagation and transmission reconstructions. Ottawa: National Library of Canada, 1996.
3. Dhawan, Atam P., and United States National Aeronautics and Space Administration, eds. LVQ and backpropagation neural networks applied to NASA SSME data. Washington, DC: National Aeronautics and Space Administration, 1993.
4. Gaxiola, Fernando, Patricia Melin, and Fevrier Valdez. New Backpropagation Algorithm with Type-2 Fuzzy Weights for Neural Networks. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-34087-6.
5. Wellington, Charles H. Backpropagation neural network for noise cancellation applied to the NUWES test ranges. Monterey, Calif.: Naval Postgraduate School, 1991.
6. Werbos, Paul J. The roots of backpropagation: From ordered derivatives to neural networks and political forecasting. New York: Wiley, 1994.
7. Sundararajan, N., and Foo Shou King, eds. Parallel implementations of backpropagation neural networks on transputers: A study of training set parallelism. Singapore: World Scientific, 1996.
8. Billings, S. A. A comparison of the backpropagation and recursive prediction error algorithms for training neural networks. Sheffield: University of Sheffield, Dept. of Control Engineering, 1990.
9. Menke, Kurt William. Nonlinear adaptive control using backpropagating neural networks. Monterey, Calif.: Naval Postgraduate School, 1992.
10. Chauvin, Yves, and David E. Rumelhart, eds. Backpropagation. Psychology Press, 2013. http://dx.doi.org/10.4324/9780203763247.

Book chapters on the topic "Backpropagation"

1. Munro, Paul, Hannu Toivonen, Geoffrey I. Webb, Wray Buntine, Peter Orbanz, Yee Whye Teh, Pascal Poupart et al. "Backpropagation". In Encyclopedia of Machine Learning, 73. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_51.
2. De Wilde, Philippe. "Backpropagation". In Neural Network Models, 33–52. London: Springer London, 1997. http://dx.doi.org/10.1007/978-1-84628-614-8_2.
3. Andrew, Alex M. "Backpropagation". In IFSR International Series on Systems Science and Engineering, 85–104. New York, NY: Springer New York, 2009. http://dx.doi.org/10.1007/978-0-387-75164-1_5.
4. Munro, Paul. "Backpropagation". In Encyclopedia of Machine Learning and Data Mining, 93–97. Boston, MA: Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_51.
5. Bishop, Christopher M., and Hugh Bishop. "Backpropagation". In Deep Learning, 233–52. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-45468-4_8.
6. Braun, Heinrich, Johannes Feulner, and Rainer Malaka. "Backpropagation I". In Springer-Lehrbuch, 81–101. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61000-4_5.
7. Braun, Heinrich, Johannes Feulner, and Rainer Malaka. "Backpropagation II". In Springer-Lehrbuch, 103–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 1996. http://dx.doi.org/10.1007/978-3-642-61000-4_6.
8. Gasparini, Sonia, and Michele Migliore. "Action Potential Backpropagation". In Encyclopedia of Computational Neuroscience, 133–37. New York, NY: Springer New York, 2015. http://dx.doi.org/10.1007/978-1-4614-6675-8_123.
9. Antonik, Piotr. "Backpropagation with Photonics". In Springer Theses, 63–89. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-91053-6_3.
10. Tuomi, Ilkka. "Vygotsky Meets Backpropagation". In Lecture Notes in Computer Science, 570–83. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-93843-1_42.

Conference papers on the topic "Backpropagation"

1. Brodsky, Stephen A., and Clark C. Guest. "Optical Matrix-Vector Implementation of Binary Valued Backpropagation". In Optical Computing. Washington, D.C.: Optica Publishing Group, 1991. http://dx.doi.org/10.1364/optcomp.1991.me8.
Abstract:
Optical implementations of neural networks can combine the advantages of neural networks' adaptive parallel processing with optical free-space connectivity. Binary valued Backpropagation [1], a supervised learning algorithm related to standard Backpropagation [2], significantly reduces interconnection storage and computation requirements. This implementation of binary valued Backpropagation used optical matrix-vector multiplication [3] to represent the forward information flow between network layers. Previous analog optical network memory systems have been described elsewhere [4].
2. Dong, Yuhan, Chenguang Liu, Yui Lo, Yaqian Xu, Ke Wang, and Kai Zhang. "Attention Backpropagation". In CCEAI 2021: 5th International Conference on Control Engineering and Artificial Intelligence. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3448218.3448227.
3. Fausett, D. W. "Strictly local backpropagation". In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137834.
4. Cheng, L. M., H. L. Mak, and L. L. Cheng. "Structured backpropagation network". In 1991 IEEE International Joint Conference on Neural Networks. IEEE, 1991. http://dx.doi.org/10.1109/ijcnn.1991.170640.
5. Fernandes de Moraes, Joyrles, and Jörg Dietrich Wilhelm Schleicher. "Backpropagation-based redatuming". In International Congress of the Brazilian Geophysical Society & Expogef. Brazilian Geophysical Society, 2021. http://dx.doi.org/10.22564/17cisbgf2021.064.
6. Wymeersch, Henk. "Stochastic Digital Backpropagation: Unifying Digital Backpropagation and the MAP Criterion". In Signal Processing in Photonic Communications. Washington, D.C.: OSA, 2014. http://dx.doi.org/10.1364/sppcom.2014.st2d.3.
7. Yaremchuk, Vanessa, and Marcelo M. Wanderley. "Brahms, Bodies and Backpropagation". In The 2014 International Workshop. New York, New York, USA: ACM Press, 2014. http://dx.doi.org/10.1145/2617995.2618011.
8. Goli, Negar, and Tor M. Aamodt. "ReSprop: Reuse Sparsified Backpropagation". In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00162.
9. Leung, Karen, Nikos Arechiga, and Marco Pavone. "Backpropagation for Parametric STL". In 2019 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2019. http://dx.doi.org/10.1109/ivs.2019.8814167.
10. Diegert, C. "Out-of-core backpropagation". In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137701.

Reports on the topic "Backpropagation"

1. Morton, Paul E., and Glenn F. Wilson. Backpropagation and EEG Data. Fort Belvoir, VA: Defense Technical Information Center, October 1988. http://dx.doi.org/10.21236/ada279073.
2. Levy, Bernard C., and Cengiz Esmersoy. Variable Background Born Inversion by Wavefield Backpropagation. Fort Belvoir, VA: Defense Technical Information Center, November 1986. http://dx.doi.org/10.21236/ada459595.
3. Vitela, J. E., and J. Reifman. Premature saturation in backpropagation networks: Mechanism and necessary conditions. Office of Scientific and Technical Information (OSTI), August 1997. http://dx.doi.org/10.2172/510390.
4. Vitela, J. E., and J. Reifman. Premature saturation in backpropagation networks: Mechanism and necessary conditions. Office of Scientific and Technical Information (OSTI), December 1995. http://dx.doi.org/10.2172/211552.
5. Mu, Ruihui, and Xiaoqin Zeng. Improved Webpage Classification Technology Based on Feedforward Backpropagation Neural Network. "Prof. Marin Drinov" Publishing House of the Bulgarian Academy of Sciences, September 2018. http://dx.doi.org/10.7546/crabs.2018.09.11.
6. Gage, Harmon J. Using Upper Layer Weights to Efficiently Construct and Train Feedforward Neural Networks Executing Backpropagation. Fort Belvoir, VA: Defense Technical Information Center, March 2011. http://dx.doi.org/10.21236/ada545618.
7. Kerr, John Patrick. The parallel implementation of a backpropagation neural network and its applicability to SPECT image reconstruction. Office of Scientific and Technical Information (OSTI), January 1992. http://dx.doi.org/10.2172/10138858.
8. Wawrzynek, John, Krste Asanovic, Brian Kingsbury, James Beck, and David Johnson. SPERT-II: A Vector Microprocessor System and Its Application to Large Problems in Backpropagation Training. Fort Belvoir, VA: Defense Technical Information Center, January 1993. http://dx.doi.org/10.21236/ada327554.
9. Kerr, J. P. The parallel implementation of a backpropagation neural network and its applicability to SPECT image reconstruction. Office of Scientific and Technical Information (OSTI), January 1992. http://dx.doi.org/10.2172/6879460.
10. Wang, Felix, Nick Alonso, and Corinne Teeter. Combining Spike Time Dependent Plasticity (STDP) and Backpropagation (BP) for Robust and Data Efficient Spiking Neural Networks (SNN). Office of Scientific and Technical Information (OSTI), December 2022. http://dx.doi.org/10.2172/1902866.