Academic literature on the topic "Multi-layer perceptrons"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles

Browse the thematic lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Multi-layer perceptrons".

Next to every source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in whichever citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Multi-layer perceptrons"

1. Racca, Robert. "Can periodic perceptrons replace multi-layer perceptrons?" Pattern Recognition Letters 21, no. 12 (November 2000): 1019–25. http://dx.doi.org/10.1016/s0167-8655(00)00057-x.

2. García-Pedrajas, N., D. Ortiz-Boyer and C. Hervás-Martínez. "Cooperative coevolution of generalized multi-layer perceptrons". Neurocomputing 56 (January 2004): 257–83. http://dx.doi.org/10.1016/j.neucom.2003.09.004.

3. Tseng, Yuen-Hsien and Ja-Ling Wu. "Decoding Reed-Muller codes by multi-layer perceptrons". International Journal of Electronics 75, no. 4 (October 1993): 589–94. http://dx.doi.org/10.1080/00207219308907134.

4. Buchholz, Sven and Gerald Sommer. "On Clifford neurons and Clifford multi-layer perceptrons". Neural Networks 21, no. 7 (September 2008): 925–35. http://dx.doi.org/10.1016/j.neunet.2008.03.004.

5. Mirzai, A. R., A. Higgins and D. Tsaptsinos. "Techniques for the minimisation of multi-layer perceptrons". Engineering Applications of Artificial Intelligence 6, no. 3 (June 1993): 265–77. http://dx.doi.org/10.1016/0952-1976(93)90069-a.

6. Egmont-Petersen, Michael, Jan L. Talmon, Arie Hasman and Anton W. Ambergen. "Assessing the importance of features for multi-layer perceptrons". Neural Networks 11, no. 4 (June 1998): 623–35. http://dx.doi.org/10.1016/s0893-6080(98)00031-8.

7. Roque, Antonio Muñoz San, Carlos Maté, Javier Arroyo and Ángel Sarabia. "iMLP: Applying Multi-Layer Perceptrons to Interval-Valued Data". Neural Processing Letters 25, no. 2 (February 15, 2007): 157–69. http://dx.doi.org/10.1007/s11063-007-9035-z.

8. Vlachos, D. S. "A Local Supervised Learning Algorithm For Multi-Layer Perceptrons". Applied Numerical Analysis & Computational Mathematics 1, no. 2 (December 2004): 535–39. http://dx.doi.org/10.1002/anac.200410016.

9. Moustafa, Essam B. and Ammar Elsheikh. "Predicting Characteristics of Dissimilar Laser Welded Polymeric Joints Using a Multi-Layer Perceptrons Model Coupled with Archimedes Optimizer". Polymers 15, no. 1 (January 2, 2023): 233. http://dx.doi.org/10.3390/polym15010233.

Abstract:
This study investigates the application of a coupled multi-layer perceptrons (MLP) model with Archimedes optimizer (AO) to predict characteristics of dissimilar lap joints made of polymethyl methacrylate (PMMA) and polycarbonate (PC). The joints were welded using the laser transmission welding (LTW) technique equipped with a beam wobbling feature. The inputs of the models were laser power, welding speed, pulse frequency, wobble frequency, and wobble width; whereas, the outputs were seam width and shear strength of the joint. The Archimedes optimizer was employed to obtain the optimal internal parameters of the multi-layer perceptrons. In addition to the Archimedes optimizer, the conventional gradient descent technique, as well as the particle swarm optimizer (PSO), was employed as internal optimizers of the multi-layer perceptrons model. The prediction accuracy of the three models was compared using different error measures. The AO-MLP outperformed the other two models. The computed root mean square errors of the MLP, PSO-MLP, and AO-MLP models are (39.798, 19.909, and 2.283) and (0.153, 0.084, and 0.0321) for shear strength and seam width, respectively.
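As a rough illustration of the workflow summarized in this abstract (five process inputs predicting seam width and shear strength, with models compared by RMSE), the sketch below trains a plain gradient-descent MLP on synthetic stand-in data; the data, network size, and scikit-learn calls are illustrative assumptions and do not reproduce the paper's Archimedes-optimized model.

```python
# Illustrative sketch only: a generic MLP regressor on synthetic 5-input / 2-output
# data, evaluated with the RMSE measure the abstract uses for model comparison.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Hypothetical stand-ins for laser power, welding speed, pulse frequency,
# wobble frequency and wobble width (inputs), seam width and shear strength (outputs).
X = rng.uniform(size=(300, 5))
y = np.column_stack([X @ np.array([0.5, -0.2, 0.1, 0.3, 0.4]),
                     np.sin(X[:, 0]) + 0.5 * X[:, 1]]) + 0.01 * rng.normal(size=(300, 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(X_tr, y_tr)

# Per-output RMSE (analogues of the shear strength and seam width errors).
rmse = np.sqrt(mean_squared_error(y_te, mlp.predict(X_te), multioutput="raw_values"))
print("RMSE per output:", rmse)
```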
10. Xi, Yan Hui and Hui Peng. "Training Multi-Layer Perceptrons with the Unscented Kalman Particle Filter". Advanced Materials Research 542-543 (June 2012): 745–48. http://dx.doi.org/10.4028/www.scientific.net/amr.542-543.745.

Abstract:
Many Bayesian learning approaches to multi-layer perceptrons (MLPs) parameters optimization have been proposed such as the extended Kalman filter (EKF). In this paper, a sequential approach is applied to train the MLPs. Based on the particle filter, the approach named unscented Kalman particle filter (UPF) uses the unscented Kalman filter as proposal distribution to generate the importance sampling density. The UPF are devised to deal with the high dimensional parameter space that is inherent to neural network models. Simulation results show that the new algorithm performs better than traditional optimization methods such as the extended Kalman filter.
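The sequential, filter-based training idea described in this abstract can be sketched in simplified form. The code below implements only a basic particle-filter update over the weights of a tiny MLP (random-walk proposal, Gaussian likelihood) and omits the unscented Kalman proposal that defines the UPF, so every modelling choice in it is an assumption made for illustration.

```python
# Minimal sketch of sequential (particle-filter style) training of a tiny MLP,
# assuming Gaussian transition and observation noise.
import numpy as np

rng = np.random.default_rng(1)
H = 5                                   # hidden units
n_w = H + H + H + 1                     # w1 (1->H), b1, w2 (H->1), b2

def mlp_predict(w, x):
    w1, b1 = w[:H], w[H:2*H]
    w2, b2 = w[2*H:3*H], w[3*H]
    return np.tanh(x * w1 + b1) @ w2 + b2

N = 500                                 # particles = candidate weight vectors
particles = rng.normal(scale=0.5, size=(N, n_w))
q, r = 0.01, 0.1                        # transition and observation noise std

xs = np.linspace(-2, 2, 200)
ys = np.sin(xs) + rng.normal(scale=r, size=xs.shape)   # toy data stream

for x, y in zip(xs, ys):
    particles += rng.normal(scale=q, size=particles.shape)   # predict (random walk)
    preds = np.array([mlp_predict(w, x) for w in particles])
    logw = -0.5 * ((y - preds) / r) ** 2                      # Gaussian likelihood
    w_norm = np.exp(logw - logw.max())
    w_norm /= w_norm.sum()
    idx = rng.choice(N, size=N, p=w_norm)                     # resample
    particles = particles[idx]

w_hat = particles.mean(axis=0)
print("prediction at x=1.0:", mlp_predict(w_hat, 1.0), "target ~", np.sin(1.0))
```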

Theses on the topic "Multi-layer perceptrons"

1. Zhao, Lenny. "Uncertainty prediction with multi-layer perceptrons". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0018/MQ55733.pdf.

2. Cairns, Graham Andrew. "Learning with analogue VLSI multi-layer perceptrons". Thesis, University of Oxford, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.296901.

3. Papadopoulos, Georgios. "Theoretical issues and practical considerations concerning confidence measures for multi-layer perceptrons". Thesis, University of Edinburgh, 2000. http://hdl.handle.net/1842/12753.

Abstract:
The primary aim of this thesis is to study existing CM methods and assess their practicability and performance in harsh real-world environments. The motivation for this work was a real industrial application - the development of a paper curl prediction system. Curl is an important paper quality parameter that can only be measured after production. The available data were sparse and were known to be corrupted by gross errors. Moreover, it was suspected that data noise was not constant over input space. Three approaches were identified as suitable for use in real-world applications: maximum likelihood (ML), the approximate Bayesian approach and the bootstrap technique. These methods were initially compared using a standard CM performance evaluation method, based on estimating the prediction interval coverage probability (PI CP). It was found that the PI CP metric can only gauge CM performance as an average over the input space. However, local CM performance is crucial because a CM must associate low confidence with high data noise/low data density regions and high confidence with low noise/high data density regions. Moreover, evaluating local performance could be used to gauge the input-dependency of the noise in the data. For this reason, a new CM evaluation technique was developed to study local CM performance. The new approach, called classification of local uncertainty estimates (CLUES), was then used for a new comparison study, this time in the light of local performance. Three main conclusions were reached: the noise in the curl data was found to have input-dependent variance, the approximate Bayesian approach outperformed the other two in most cases, and the bootstrap technique was found to be inferior to both ML and Bayesian methods for data sets of input-dependent data noise variance.
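As a hedged illustration of the PICP metric discussed in this abstract, the sketch below computes interval coverage globally and within crude input-space bins (a rough stand-in for local analysis in the spirit of CLUES); the data, intervals, and binning scheme are assumptions, not the thesis's method.

```python
# Prediction interval coverage probability (PICP): the fraction of targets that
# fall inside their predicted intervals, globally and per input-space bin.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 1000)
y = np.sin(x) + rng.normal(scale=0.1 + 0.1 * np.abs(x))   # input-dependent noise

y_pred = np.sin(x)                       # pretend model predictions
sigma_hat = np.full_like(x, 0.25)        # pretend (constant) uncertainty estimate
lo, hi = y_pred - 1.96 * sigma_hat, y_pred + 1.96 * sigma_hat

covered = (y >= lo) & (y <= hi)
print("global PICP:", covered.mean())    # target ~0.95 for 95% intervals

# Local coverage: a constant-variance confidence measure over-covers where the
# noise is low and under-covers where it is high.
for b in range(3):
    mask = (np.abs(x) >= b) & (np.abs(x) < b + 1)
    print(f"|x| in [{b},{b+1}): PICP = {covered[mask].mean():.2f}")
```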
4. Shepherd, Adrian John. "Novel second-order techniques and global optimisation methods for supervised training of multi-layer perceptrons". Thesis, University College London (University of London), 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.321662.

5. Collobert, Ronan. "Algorithmes d'Apprentissage pour grandes bases de données". Paris 6, 2004. http://www.theses.fr/2004PA066063.

6. Shao, Hang. "A Fast MLP-based Learning Method and its Application to Mine Countermeasure Missions". Thesis, Université d'Ottawa / University of Ottawa, 2012. http://hdl.handle.net/10393/23512.

Abstract:
In this research, a novel machine learning method is designed and applied to Mine Countermeasure Missions. Similarly to some kernel methods, the proposed approach seeks to compute a linear model from another higher dimensional feature space. However, no kernel is used and the feature mapping is explicit. Computation can be done directly in the accessible feature space. In the proposed approach, the feature projection is implemented by constructing a large hidden layer, which differs from traditional belief that Multi-Layer Perceptron is usually funnel-shaped and the hidden layer is used as feature extractor. The proposed approach is a general method that can be applied to various problems. It is able to improve the performance of the neural network based methods and the learning speed of support vector machine. The classification speed of the proposed approach is also faster than that of kernel machines on the mine countermeasure mission task.
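The central idea of this abstract, an explicit feature mapping realized by a large fixed hidden layer with only a linear model fitted on top, resembles random-feature/ELM-style constructions. The sketch below is a generic version of that idea under assumed data and layer sizes, not the thesis's actual algorithm.

```python
# Explicit random feature map via a large fixed hidden layer, with a closed-form
# ridge-regression readout: a linear model computed directly in the feature space.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 10))
y = (X[:, 0] * X[:, 1] > 0).astype(float)           # toy binary target

H = 2000                                             # deliberately large hidden layer
W, b = rng.normal(size=(10, H)), rng.normal(size=H)
Phi = np.tanh(X @ W + b)                             # explicit feature map (no kernel)

lam = 1e-2
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(H), Phi.T @ y)   # linear readout

acc = (((Phi @ beta) > 0.5) == y).mean()
print("training accuracy:", acc)
```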
7. Coughlin, Michael J. "Calibration of Two Dimensional Saccadic Electro-Oculograms Using Artificial Neural Networks". Griffith University. School of Applied Psychology, 2003. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20030409.110949.

Abstract:
The electro-oculogram (EOG) is the most widely used technique for recording eye movements in clinical settings. It is inexpensive, practical, and non-invasive. Use of EOG is usually restricted to horizontal recordings as vertical EOG contains eyelid artefact (Oster & Stern, 1980) and blinks. The ability to analyse two dimensional (2D) eye movements may provide additional diagnostic information on pathologies, and further insights into the nature of brain functioning. Simultaneous recording of both horizontal and vertical EOG also introduces other difficulties into calibration of the eye movements, such as different gains in the two signals, and misalignment of electrodes producing crosstalk. These transformations of the signals create problems in relating the two dimensional EOG to actual rotations of the eyes. The application of an artificial neural network (ANN) that could map 2D recordings into 2D eye positions would overcome this problem and improve the utility of EOG. To determine whether ANNs are capable of correctly calibrating the saccadic eye movement data from 2D EOG (i.e. performing the necessary inverse transformation), the ANNs were first tested on data generated from mathematical models of saccadic eye movements. Multi-layer perceptrons (MLPs) with non-linear activation functions and trained with back propagation proved to be capable of calibrating simulated EOG data to a mean accuracy of 0.33° of visual angle (SE = 0.01). Linear perceptrons (LPs) were only nearly half as accurate. For five subjects performing a saccadic eye movement task in the upper right quadrant of the visual field, the mean accuracy provided by the MLPs was 1.07° of visual angle (SE = 0.01) for EOG data, and 0.95° of visual angle (SE = 0.03) for infrared limbus reflection (IRIS®) data. MLPs enabled calibration of 2D saccadic EOG to an accuracy not significantly different to that obtained with the infrared limbus tracking data.
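To make the calibration task in this abstract concrete, the sketch below learns the inverse mapping from simulated two-channel EOG (with unequal gains and crosstalk) back to 2-D eye position using a small MLP; the simulated transform, noise levels, and network size are illustrative assumptions rather than the thesis's recording setup.

```python
# Recover 2-D eye position from horizontal/vertical EOG channels that have
# unequal gains and crosstalk, using an MLP to learn the inverse transformation.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
gaze = rng.uniform(0, 20, size=(500, 2))            # true (azimuth, elevation) in degrees

A = np.array([[1.0, 0.15],                          # gain + crosstalk mixing
              [0.10, 0.7]])
eog = gaze @ A.T + rng.normal(scale=0.3, size=gaze.shape)

mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
mlp.fit(eog, gaze)                                   # learn EOG -> eye position

err = np.linalg.norm(mlp.predict(eog) - gaze, axis=1)
print("mean calibration error (deg):", err.mean())
```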
8. Coughlin, Michael J. "Calibration of Two Dimensional Saccadic Electro-Oculograms Using Artificial Neural Networks". Thesis, Griffith University, 2003. http://hdl.handle.net/10072/365854.

Abstract:
The electro-oculogram (EOG) is the most widely used technique for recording eye movements in clinical settings. It is inexpensive, practical, and non-invasive. Use of EOG is usually restricted to horizontal recordings as vertical EOG contains eyelid artefact (Oster & Stern, 1980) and blinks. The ability to analyse two dimensional (2D) eye movements may provide additional diagnostic information on pathologies, and further insights into the nature of brain functioning. Simultaneous recording of both horizontal and vertical EOG also introduces other difficulties into calibration of the eye movements, such as different gains in the two signals, and misalignment of electrodes producing crosstalk. These transformations of the signals create problems in relating the two dimensional EOG to actual rotations of the eyes. The application of an artificial neural network (ANN) that could map 2D recordings into 2D eye positions would overcome this problem and improve the utility of EOG. To determine whether ANNs are capable of correctly calibrating the saccadic eye movement data from 2D EOG (i.e. performing the necessary inverse transformation), the ANNs were first tested on data generated from mathematical models of saccadic eye movements. Multi-layer perceptrons (MLPs) with non-linear activation functions and trained with back propagation proved to be capable of calibrating simulated EOG data to a mean accuracy of 0.33° of visual angle (SE = 0.01). Linear perceptrons (LPs) were only nearly half as accurate. For five subjects performing a saccadic eye movement task in the upper right quadrant of the visual field, the mean accuracy provided by the MLPs was 1.07° of visual angle (SE = 0.01) for EOG data, and 0.95° of visual angle (SE = 0.03) for infrared limbus reflection (IRIS®) data. MLPs enabled calibration of 2D saccadic EOG to an accuracy not significantly different to that obtained with the infrared limbus tracking data.
Thesis (PhD Doctorate), Doctor of Philosophy (PhD), School of Applied Psychology, Griffith Health.
9. Dunne, R. A. "Multi-layer perceptron models for classification". PhD thesis, Murdoch University, 2003. https://researchrepository.murdoch.edu.au/id/eprint/50257/.

Abstract:
This thesis concerns the Multi-layer Perceptron (MLP) model, one of a variety of neural network models that have come into wide prominence since the mid 1980s for the classification of individuals into pre-defined classes based on a vector of individual measurements. Each discipline in which the MLP model has had influence, including computing, electrical engineering and psychology, has recast the model into its own language and imbued it with its own concerns. This divergence of terminologies has made the literature somewhat impenetrable but has also led to an appreciation of other disciplines' priorities and interests. The major aim of the thesis has been to bring the MLP model within the frame­work of statistics. We have two aims here: one is to make the MLP model more intelligible to statisticians; and the other is to bring the insights of statistics to the MLP model. A statistical modeling approach can make valuable contributions, ranging from small but important clarifications, such as clearing up the confusion in the MLP literature between the model and the methodology for fitting the model, to much larger insights such as determining the robustness of the model in the event of outlying or atypical data. We provide a treatment of the relationship of the MLP classifier to more familiar statistical models and of the various fitting and model selection methodologies currently used for MLP models. A description of the influence curves of the MLP is provided, leading to both an understanding of how the MLP model relates to logistic regression (and to robust versions of logistic regression) and to a proposal for a robust MLP model. Practical problems associated with the fitting of MLP models, from the effects of scaling of the input data to the effects of various penalty terms, are also considered. The MLP model has a variable architecture with the major source of variation being the number of hidden layer processing units. A direct method is given for determining this in multi-class problems where the pairwise decision boundary is linear in the feature space. Finally, in applications such as remote sensing each vector of measurements or pixel contains contextual information about the neighboring pixels. The MLP model is modified to incorporate this contextual information into the classification procedure.
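One of the statistical links mentioned in this abstract, that an MLP with no hidden layer, a sigmoid output unit, and cross-entropy loss reduces to logistic regression, can be shown with a minimal sketch; the toy data and learning rate below are assumptions made only for illustration.

```python
# The degenerate "MLP" with no hidden layer: a sigmoid output trained by gradient
# descent on cross-entropy, which is exactly a logistic regression fit.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

w, b = np.zeros(2), 0.0
for _ in range(2000):                                   # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))              # sigmoid "output unit"
    grad_w, grad_b = X.T @ (p - y) / len(y), (p - y).mean()
    w -= 0.1 * grad_w
    b -= 0.1 * grad_b

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
print("learned weights:", w, "bias:", b)                # ~ logistic regression fit
print("training accuracy:", ((p > 0.5) == y).mean())
```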
10. Power, Phillip David. "Non-linear multi-layer perceptron channel equalisation". Thesis, Queen's University Belfast, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.343086.


Books on the topic "Multi-layer perceptrons"

1. Ma, Zhe. Explanation by general rules extracted from trained multi-layer perceptrons. Sheffield: University of Sheffield, Dept. of Automatic Control & Systems Engineering, 1996.

2. Peeling, S. M. Experiments in isolated digit recognition using the multi-layer perceptron. [London]: HMSO, 1987.

3. Lont, Jerzy B. Analog CMOS implementation of a multi-layer perceptron with nonlinear synapses. Konstanz: Hartung-Gorre, 1994.

4. Shepherd, Adrian J. Second-order methods for neural networks: Fast and reliable training methods for multi-layer perceptrons. London: Springer, 1997.

5. Zheng, Gonghui. Design and evaluation of a multi-output-layer perceptron. [s.l.]: The Author, 1996.

6. Harrison, R. F. The multi-layer perceptron as an aid to the early diagnosis of myocardial infarction. Sheffield: University of Sheffield, Dept. of Control Engineering, 1990.

7. Ma, Zhe. Dynamic query algorithms for human-computer interaction based on information gain and the multi-layer perceptron. Sheffield: University of Sheffield, Dept. of Automatic Control & Systems Engineering, 1996.

8. Shepherd, Adrian J. Second-Order Methods for Neural Networks: Fast and Reliable Training Methods for Multi-Layer Perceptrons. Springer, 2014.

9. Shepherd, Adrian J. Second-Order Methods for Neural Networks: Fast and Reliable Training Methods for Multi-Layer Perceptrons. Springer London, Limited, 2012.

10. Dissertation: Autonomous Construction of Multi Layer Perceptron Neural Networks. Storming Media, 1997.


Book chapters on the topic "Multi-layer perceptrons"

1. Kruse, Rudolf, Christian Borgelt, Frank Klawonn, Christian Moewes, Matthias Steinbrecher and Pascal Held. "Multi-Layer Perceptrons". In Texts in Computer Science, 47–81. London: Springer London, 2013. http://dx.doi.org/10.1007/978-1-4471-5013-8_5.

2. Kruse, Rudolf, Sanaz Mostaghim, Christian Borgelt, Christian Braune and Matthias Steinbrecher. "Multi-layer Perceptrons". In Texts in Computer Science, 53–124. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-42227-1_5.

3. Conan-Guez, Brieuc and Fabrice Rossi. "Phoneme Discrimination with Functional Multi-Layer Perceptrons". In Classification, Clustering, and Data Mining Applications, 157–65. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-642-17103-1_16.

4. Bo, Liefeng, Ling Wang and Licheng Jiao. "Training Multi-layer Perceptrons Using MiniMin Approach". In Computational Intelligence and Security, 909–14. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11596448_135.

5. Goldberg, Yoav. "From Linear Models to Multi-layer Perceptrons". In Neural Network Methods for Natural Language Processing, 37–39. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-031-02165-7_3.

6. Arce, Fernando, Erik Zamora, Gerardo Hernández, Javier M. Antelis and Humberto Sossa. "Recognizing Motor Imagery Tasks Using Deep Multi-Layer Perceptrons". In Machine Learning and Data Mining in Pattern Recognition, 468–82. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-96133-0_35.

7. Ortigosa, E. M., P. M. Ortigosa, A. Cañas, E. Ros, R. Agís and J. Ortega. "FPGA Implementation of Multi-layer Perceptrons for Speech Recognition". In Field Programmable Logic and Application, 1048–52. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-45234-8_117.

8. Tanahashi, Yusuke, Kazumi Saito and Ryohei Nakano. "Model Selection and Weight Sharing of Multi-layer Perceptrons". In Lecture Notes in Computer Science, 716–22. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11554028_100.

9. Köksal, Fatih, Ethem Alpaydyn and Günhan Dündar. "Weight Quantization for Multi-layer Perceptrons Using Soft Weight Sharing". In Artificial Neural Networks — ICANN 2001, 211–16. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44668-0_30.

10. Lappalainen, Harri and Antti Honkela. "Bayesian Non-Linear Independent Component Analysis by Multi-Layer Perceptrons". In Advances in Independent Component Analysis, 93–121. London: Springer London, 2000. http://dx.doi.org/10.1007/978-1-4471-0443-8_6.


Conference proceedings on the topic "Multi-layer perceptrons"

1. Marchesi, M., G. Orlandi, F. Piazza, L. Pollonara and A. Uncini. "Multi-layer perceptrons with discrete weights". In 1990 IJCNN International Joint Conference on Neural Networks. IEEE, 1990. http://dx.doi.org/10.1109/ijcnn.1990.137772.

2. Coe, Brian. "Multi-layer Perceptrons for Subvocal Recognition". In 2017 IEEE 29th International Conference on Tools with Artificial Intelligence (ICTAI). IEEE, 2017. http://dx.doi.org/10.1109/ictai.2017.00054.

3. Hauger, S. and T. Windeatt. "ECOC and boosting with multi-layer perceptrons". In Proceedings of the 17th International Conference on Pattern Recognition, 2004. ICPR 2004. IEEE, 2004. http://dx.doi.org/10.1109/icpr.2004.1334565.

4. Yamany, Waleed, Mohammed Fawzy, Alaa Tharwat and Aboul Ella Hassanien. "Moth-flame optimization for training Multi-Layer Perceptrons". In 2015 11th International Computer Engineering Conference (ICENCO). IEEE, 2015. http://dx.doi.org/10.1109/icenco.2015.7416360.

5. Alboaneen, Dabiah Ahmed, Huaglory Tianfield and Yan Zhang. "Glowworm Swarm Optimisation for Training Multi-Layer Perceptrons". In UCC '17: 10th International Conference on Utility and Cloud Computing. New York, NY, USA: ACM, 2017. http://dx.doi.org/10.1145/3148055.3148075.

6. Ritschel, W., T. Pfeifer and R. Grob. "Rating of pattern classifications in multi-layer perceptrons". In the 1994 ACM symposium. New York, New York, USA: ACM Press, 1994. http://dx.doi.org/10.1145/326619.326684.

7. Bernardo-Torres, Abraham and Pilar Gomez-Gil. "One-step forecasting of seismograms using multi-layer perceptrons". In 2009 6th International Conference on Electrical Engineering, Computing Science and Automatic Control (CCE 2009). IEEE, 2009. http://dx.doi.org/10.1109/iceee.2009.5393349.

8. Kaban, Ata. "Compressive Learning of Multi-layer Perceptrons: An Error Analysis". In 2019 International Joint Conference on Neural Networks (IJCNN). IEEE, 2019. http://dx.doi.org/10.1109/ijcnn.2019.8851743.

9. Buhrke, E. R. and J. L. LoCicero. "Fast learning for multi-layer perceptrons using statistical techniques". In [Proceedings] ICASSP-92: 1992 IEEE International Conference on Acoustics, Speech, and Signal Processing. IEEE, 1992. http://dx.doi.org/10.1109/icassp.1992.225887.

10. Zheng, Lilei, Ying Zhang and Vrizlynn L. L. Thing. "Understanding multi-layer perceptrons on spatial image steganalysis features". In 2017 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC). IEEE, 2017. http://dx.doi.org/10.1109/apsipa.2017.8282181.


Reports on the topic "Multi-layer perceptrons"

1. Chen, B., T. Hickling, M. Krnjajic, W. Hanley, G. Clark, J. Nitao, D. Knapp, L. Hiller and M. Mugge. Multi-Layer Perceptrons and Support Vector Machines for Detection Problems with Low False Alarm Requirements: an Eight-Month Progress Report. Office of Scientific and Technical Information (OSTI), January 2007. http://dx.doi.org/10.2172/922310.
