Academic literature on the topic "Linear perceptrons"
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference papers, and other scholarly sources on the topic "Linear perceptrons".
Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Linear perceptrons"
Bylander, Tom. "Learning Linear Threshold Approximations Using Perceptrons". Neural Computation 7, no. 2 (March 1995): 370–79. http://dx.doi.org/10.1162/neco.1995.7.2.370.
Alpaydin, E., and M. I. Jordan. "Local linear perceptrons for classification". IEEE Transactions on Neural Networks 7, no. 3 (May 1996): 788–94. http://dx.doi.org/10.1109/72.501737.
Barber, D., D. Saad, and P. Sollich. "Test Error Fluctuations in Finite Linear Perceptrons". Neural Computation 7, no. 4 (July 1995): 809–21. http://dx.doi.org/10.1162/neco.1995.7.4.809.
Legenstein, Robert, and Wolfgang Maass. "On the Classification Capability of Sign-Constrained Perceptrons". Neural Computation 20, no. 1 (January 2008): 288–309. http://dx.doi.org/10.1162/neco.2008.20.1.288.
Yu, Xin, Mian Xie, Li Xia Tang, and Chen Yu Li. "Learning Algorithm for Fuzzy Perceptron with Max-Product Composition". Applied Mechanics and Materials 687-691 (November 2014): 1359–62. http://dx.doi.org/10.4028/www.scientific.net/amm.687-691.1359.
Shah, J. V., and Chi-Sang Poon. "Linear independence of internal representations in multilayer perceptrons". IEEE Transactions on Neural Networks 10, no. 1 (1999): 10–18. http://dx.doi.org/10.1109/72.737489.
Zwietering, P. J., E. H. L. Aarts, and J. Wessels. "Exact Classification with Two-Layered Perceptrons". International Journal of Neural Systems 03, no. 02 (January 1992): 143–56. http://dx.doi.org/10.1142/s0129065792000127.
Hara, Kazuyuki, and Masato Okada. "Ensemble Learning of Linear Perceptrons: On-Line Learning Theory". Journal of the Physical Society of Japan 74, no. 11 (November 15, 2005): 2966–72. http://dx.doi.org/10.1143/jpsj.74.2966.
Frean, Marcus. "The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks". Neural Computation 2, no. 2 (June 1990): 198–209. http://dx.doi.org/10.1162/neco.1990.2.2.198.
Hamid, Danish, Syed Sajid Ullah, Jawaid Iqbal, Saddam Hussain, Ch Anwar ul Hassan, and Fazlullah Umar. "A Machine Learning in Binary and Multiclassification Results on Imbalanced Heart Disease Data Stream". Journal of Sensors 2022 (September 20, 2022): 1–13. http://dx.doi.org/10.1155/2022/8400622.
Dissertations / Theses on the topic "Linear perceptrons"
Ferronato, Giuliano. "Intervalos de predição para redes neurais artificiais via regressão não linear". Florianópolis, SC, 2008. http://repositorio.ufsc.br/xmlui/handle/123456789/91675.
This work describes the application of a nonlinear regression technique (least squares) to obtain prediction intervals for artificial neural networks (ANNs). Through a Monte Carlo simulation, it shows a way of choosing a set of parameters (weights) for a neural network according to a selection criterion based on the magnitude of the prediction intervals provided by the network. With this technique it was possible to obtain prediction intervals with the desired amplitude and with known coverage probability, according to a chosen confidence level. The associated results and discussions indicate that obtaining these intervals is possible and feasible, making the network response more informative and consequently increasing its applicability. The computational implementation is available at www.inf.ufsc.br/~dandrade.
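The interval construction summarized in the abstract can be illustrated on the much simpler case of an ordinary linear least-squares fit; the thesis applies analogous nonlinear least-squares machinery to network weights. A rough sketch, with all data and values hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data standing in for a network's input/output pairs:
# y = 1 + 2x + Gaussian noise.
X = np.column_stack([np.ones(30), np.linspace(0.0, 1.0, 30)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0.0, 0.1, 30)

# Ordinary least-squares fit and the classic 95% prediction interval
# at a new input x0:  y_hat(x0) +/- 1.96 * sqrt(s2 * (1 + x0' (X'X)^-1 x0)).
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (len(y) - X.shape[1])   # residual variance estimate
x0 = np.array([1.0, 0.5])                    # query point x = 0.5
half = 1.96 * np.sqrt(s2 * (1.0 + x0 @ np.linalg.inv(X.T @ X) @ x0))
lo, hi = x0 @ beta - half, x0 @ beta + half
print(f"95% prediction interval at x=0.5: [{lo:.3f}, {hi:.3f}]")
```

The `1` inside the square root is what widens a prediction interval (for a new observation) relative to a confidence interval for the mean response.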
Louche, Ugo. "From confusion noise to active learning : playing on label availability in linear classification problems". Thesis, Aix-Marseille, 2016. http://www.theses.fr/2016AIXM4025/document.
The works presented in this thesis fall within the general framework of linear classification, that is, the problem of categorizing data into two or more classes based on a training set of labelled data. In practice, though, acquiring labelled examples may prove challenging and/or costly, as data are inherently easier to obtain than to label. Dealing with label scarcity has been a motivating goal in the machine learning literature, and this work discusses two settings related to this problem: learning in the presence of noise and active learning.
Coughlin, Michael J. "Calibration of Two Dimensional Saccadic Electro-Oculograms Using Artificial Neural Networks". Griffith University, School of Applied Psychology, 2003. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20030409.110949.
Manesco, Luis Fernando. "Modelagem de um processo fermentativo por rede Perceptron multicamadas com atraso de tempo". Universidade de São Paulo, 1996. http://www.teses.usp.br/teses/disponiveis/18/18133/tde-22012018-103016/.
Identification and control of dynamic systems using artificial neural networks has been widely investigated in the last few years, with special attention to applications in nonlinear systems. This work presents a study on the use of a particular type of artificial neural network, a time-delay multilayer perceptron, for state estimation of the fermentative phase of the Reichstein process for vitamin C production. The use of artificial neural networks is justified by problems such as uncertain and unmeasurable state variables and process nonlinearity, and by the fact that a conventional model covering all phases of the fermentative process is very difficult to obtain. The efficiency of the Levenberg-Marquardt algorithm in accelerating the training process is also studied, and a comparison is performed between the studied networks and an extended Kalman filter based on a non-structured model of this fermentative process. The analysis of the networks is carried out using the mean square errors, taking into consideration the activation function and the number of units in the hidden layer. A set of batch experimental runs, interpolated to the desired time interval, is used for training and validating the networks.
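A time-delay perceptron of the kind described feeds a standard MLP a sliding window of current and past measurements. A minimal sketch of assembling such lagged inputs (function name and toy data are illustrative, not from the thesis):

```python
import numpy as np

def time_delay_inputs(series: np.ndarray, delays: int) -> np.ndarray:
    """Stack each sample with its `delays` previous values, so an ordinary
    MLP sees the window [x[t], x[t-1], ..., x[t-delays]] as one input row."""
    rows = [series[t - delays : t + 1][::-1] for t in range(delays, len(series))]
    return np.array(rows)

x = np.arange(6, dtype=float)        # toy measurement sequence 0..5
W = time_delay_inputs(x, delays=2)   # 4 windows, each of length 3
print(W)
# The first row is [2., 1., 0.]: current value first, then its two past values.
```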
Power, Phillip David. "Non-linear multi-layer perceptron channel equalisation". Thesis, Queen's University Belfast, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.343086.
Bueno, Felipe Roberto. "Perceptrons híbridos lineares/morfológicos fuzzy com aplicações em classificação". [s.n.], 2015. http://repositorio.unicamp.br/jspui/handle/REPOSIP/306338.
Master's dissertation, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica.
Abstract: Morphological perceptrons (MPs) belong to the class of morphological neural networks (MNNs). These MNNs represent a class of artificial neural networks that perform operations of mathematical morphology (MM) at every node, possibly followed by the application of an activation function. Recall that mathematical morphology was conceived as a theory for processing and analyzing objects (images or signals) by means of other objects called structuring elements. Although initially developed for binary image processing and later extended to gray-scale image processing, mathematical morphology can be formulated very generally in a complete lattice setting. Originally, morphological neural networks employed only certain operations of gray-scale mathematical morphology, namely gray-scale erosion and dilation according to the umbra approach. These operations can be expressed in terms of (additive maximum and additive minimum) matrix-vector products in minimax algebra. It was not until recently that operations of fuzzy mathematical morphology emerged as aggregation functions of morphological neural networks; in this case, we speak of fuzzy morphological neural networks. Hybrid fuzzy morphological/linear perceptrons were initially designed by generalizing existing morphological/linear perceptrons; in other words, a fuzzy morphological/linear perceptron can be defined by a convex combination of a fuzzy morphological part and a linear part. In this master's thesis, we introduce a feedforward artificial neural network representing a hybrid fuzzy morphological/linear perceptron called the fuzzy dilation/erosion/linear perceptron (F-DELP), which had not yet been considered in the literature. Following Pessoa's and Maragos' ideas, we apply an appropriate smoothing to overcome the non-differentiability of the fuzzy dilation and erosion operators employed in the proposed F-DELP models. Training is then achieved using a traditional backpropagation algorithm. Finally, we apply the F-DELP model to some well-known classification problems and compare the results with those produced by other classifiers.
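The convex combination described in the abstract can be sketched as a single forward pass. This is only an illustration under one common definition of fuzzy dilation and erosion (max-min and min-max compositions), not the authors' exact F-DELP model, and all weight values here are made up:

```python
import numpy as np

def fuzzy_dilation(x, w):
    # Max-min composition: one common choice of fuzzy dilation.
    return np.max(np.minimum(x, w))

def fuzzy_erosion(x, m):
    # Min-max composition: a dual choice of fuzzy erosion.
    return np.min(np.maximum(x, m))

def hybrid_output(x, w, m, a, b, lam):
    """Convex combination of a fuzzy morphological part and a linear part,
    weighted by lam in [0, 1]."""
    morph = 0.5 * (fuzzy_dilation(x, w) + fuzzy_erosion(x, m))
    linear = a @ x + b
    return lam * morph + (1.0 - lam) * linear

x = np.array([0.2, 0.7, 0.4])   # input in [0, 1], as usual in fuzzy MM
w = np.array([0.5, 0.9, 0.1])   # dilation weights (made-up values)
m = np.array([0.3, 0.6, 0.8])   # erosion weights (made-up values)
a = np.array([0.1, 0.2, 0.3])   # linear weights
y = hybrid_output(x, w, m, a, b=0.05, lam=0.5)
print(float(y))
```

Because max and min are not differentiable everywhere, training such a model by backpropagation requires the kind of smoothing the abstract mentions.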
Siu, Sammy. "Non-linear adaptive equalization based on a multi-layer perceptron architecture". Thesis, University of Edinburgh, 1991. http://hdl.handle.net/1842/11916.
Evans, John Thomas. "Investigation of a multi-layer perceptron network to model and control a non-linear system". Thesis, Liverpool John Moores University, 1994. http://researchonline.ljmu.ac.uk/4945/.
Samuel, Nikhil J. "Identification of Uniform Class Regions using Perceptron Training". University of Cincinnati / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1439307102.
Rocha, Fabiano Lopes. "Identificação de sistemas não-lineares multivariáveis usando redes neurais perceptron multicamadas e função de base radial". Advisor: Leandro dos Santos Coelho. Biblioteca Digital de Teses e Dissertações da PUC-PR, 2006. http://www.biblioteca.pucpr.br/tede/tde_busca/arquivo.php?codArquivo=450.
The identification of multivariable nonlinear dynamic systems is important in many fields of engineering. This dissertation presents a study of a methodology based on artificial neural networks for the identification of nonlinear systems.
Books on the topic "Linear perceptrons"
Lont, Jerzy B. Analog CMOS implementation of a multi-layer perceptron with nonlinear synapses. Konstanz: Hartung-Gorre, 1994.
Hart, Peter E., and David G. Stork, eds. Pattern classification. 2nd ed. New York: Wiley, 2001.
Duda, Richard O., David G. Stork, and Peter E. Hart. Pattern Classification. John Wiley & Sons, 2022.
Duda, Richard O. Pattern Classification. John Wiley & Sons, 2013.
Duda, Richard O. Pattern Classification. John Wiley & Sons, 2022.
Duda, Richard O., David G. Stork, and Peter E. Hart. Pattern Classification: Solutions Manual. John Wiley & Sons, 2003.
Duda, Richard O., David G. Stork, and Peter E. Hart. Pattern Classification. John Wiley & Sons, 2009.
Duda, Richard O., David G. Stork, and Peter E. Hart. Pattern Classification. John Wiley & Sons, 2012.
Book chapters on the topic "Linear perceptrons"
Bielecki, Andrzej. "Linear Perceptrons". In Models of Neurons and Perceptrons: Selected Problems and Challenges, 111–19. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-90140-4_9.
Murty, M. N., and Rashmi Raghava. "Linear Discriminant Function". In Support Vector Machines and Perceptrons, 15–25. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41063-0_2.
Murty, M. N., and Rashmi Raghava. "Linear Support Vector Machines". In Support Vector Machines and Perceptrons, 41–56. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-41063-0_4.
Hartono, Pitoyo, and Shuji Hashimoto. "Learning with Ensemble of Linear Perceptrons". In Lecture Notes in Computer Science, 115–20. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11550907_19.
Goldberg, Yoav. "From Linear Models to Multi-layer Perceptrons". In Neural Network Methods for Natural Language Processing, 37–39. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-031-02165-7_3.
Lappalainen, Harri, and Antti Honkela. "Bayesian Non-Linear Independent Component Analysis by Multi-Layer Perceptrons". In Advances in Independent Component Analysis, 93–121. London: Springer London, 2000. http://dx.doi.org/10.1007/978-1-4471-0443-8_6.
Hara, Kazuyuki, Yoichi Nakayama, Seiji Miyoshi, and Masato Okada. "Mutual Learning with Many Linear Perceptrons: On-Line Learning Theory". In Artificial Neural Networks – ICANN 2009, 171–80. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04274-4_18.
Mousset, E., and A. Faraj. "A Formal Link between Multilayer Perceptrons and a Generalization of Linear Discriminant Analysis". In ICANN ’93, 508. London: Springer London, 1993. http://dx.doi.org/10.1007/978-1-4471-2063-6_134.
Kryzhanovskiy, Vladimir, Irina Zhelavskaya, and Anatoliy Fonarev. "Vector Perceptron Learning Algorithm Using Linear Programming". In Artificial Neural Networks and Machine Learning – ICANN 2012, 197–204. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-33266-1_25.
Lafif Tej, Mohamed, and Stefan Holban. "Determining Optimal Multi-layer Perceptron Structure Using Linear Regression". In Business Information Systems, 232–46. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-20485-3_18.
Conference papers on the topic "Linear perceptrons"
Bueno, Felipe Roberto, and Peter Sussner. "FUZZY MORPHOLOGICAL PERCEPTRONS AND HYBRID FUZZY MORPHOLOGICAL/LINEAR PERCEPTRONS". In The 11th International FLINS Conference (FLINS 2014). World Scientific, 2014. http://dx.doi.org/10.1142/9789814619998_0120.
Arcadia, Christopher E., Hokchhay Tann, Amanda Dombroski, Kady Ferguson, Shui Ling Chen, Eunsuk Kim, Christopher Rose, Brenda M. Rubenstein, Sherief Reda, and Jacob K. Rosenstein. "Parallelized Linear Classification with Volumetric Chemical Perceptrons". In 2018 IEEE International Conference on Rebooting Computing (ICRC). IEEE, 2018. http://dx.doi.org/10.1109/icrc.2018.8638627.
Wu, Yunfeng, Jinming Zhang, Cong Wang, and Sin Chun Ng. "Linear decision fusions in multilayer perceptrons for breast cancer diagnosis". In 17th IEEE International Conference on Tools with Artificial Intelligence (ICTAI'05). IEEE, 2005. http://dx.doi.org/10.1109/ictai.2005.82.
Hartono, Pitoyo. "Ensemble of perceptrons with confidence measure for piecewise linear decomposition". In 2011 International Joint Conference on Neural Networks (IJCNN 2011 - San Jose). IEEE, 2011. http://dx.doi.org/10.1109/ijcnn.2011.6033282.
Hassan, T. A. F., A. El-Shafei, Y. Zeyada, and N. Rieger. "Comparison of Neural Network Architectures for Machinery Fault Diagnosis". In ASME Turbo Expo 2003, collocated with the 2003 International Joint Power Generation Conference. ASMEDC, 2003. http://dx.doi.org/10.1115/gt2003-38450.
Sussner, Peter, and Felipe Roberto Bueno. "SOME EXPERIMENTAL RESULTS IN CLASSIFICATION USING HYBRID FUZZY MORPHOLOGICAL/LINEAR PERCEPTRONS". In The 11th International FLINS Conference (FLINS 2014). World Scientific, 2014. http://dx.doi.org/10.1142/9789814619998_0115.
Zhang, D., M. Kamel, and M. I. Elmasry. "A training approach based on linear separability analysis for layered perceptrons". In Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94). IEEE, 1994. http://dx.doi.org/10.1109/icnn.1994.374217.
Araujo, Ricardo de A., Adriano L. I. Oliveira, and Silvio Meira. "A learning process based on covariance matrix adaptation for morphological-linear perceptrons". In 2013 IEEE Congress on Evolutionary Computation (CEC). IEEE, 2013. http://dx.doi.org/10.1109/cec.2013.6557840.
Sussner, Peter, Israel Campiott, and Manuel Alejandro Quispe Torres. "Hybrid Gray-Scale and Fuzzy Morphological/Linear Perceptrons Trained By Extreme Learning Machine". In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206886.
Goyal, Somya, and Pradeep K. Bhatia. "A Non-Linear Technique for Effective Software Effort Estimation using Multi-Layer Perceptrons". In 2019 International Conference on Machine Learning, Big Data, Cloud and Parallel Computing (COMITCon). IEEE, 2019. http://dx.doi.org/10.1109/comitcon.2019.8862256.