Scientific literature on the topic "Least-squares support vector machine"

Create an accurate reference in the APA, MLA, Chicago, Harvard, and various other styles

Choose a source:

Consult the thematic lists of journal articles, books, theses, conference proceedings, and other scholarly sources on the topic "Least-squares support vector machine".

Next to each source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online whenever this information is included in the metadata.

Journal articles on the topic "Least-squares support vector machine"

1

KITAYAMA, Satoshi, Masao ARAKAWA and Koetsu YAMAZAKI. "1403 Least-Squares Support Vector Machine". Proceedings of Design & Systems Conference 2010.20 (2010): _1403-1_–_1403-5_. http://dx.doi.org/10.1299/jsmedsd.2010.20._1403-1_.

2

Adankon, M. M., M. Cheriet and A. Biem. "Semisupervised Least Squares Support Vector Machine". IEEE Transactions on Neural Networks 20, no. 12 (December 2009): 1858–70. http://dx.doi.org/10.1109/tnn.2009.2031143.

3

ZHENG, SHENG, YUQIU SUN, JINWEN TIAN and JIAN LIU. "MAPPED LEAST SQUARES SUPPORT VECTOR MACHINE REGRESSION". International Journal of Pattern Recognition and Artificial Intelligence 19, no. 03 (May 2005): 459–75. http://dx.doi.org/10.1142/s0218001405004058.

Abstract:
This paper describes a novel version of regression SVM (Support Vector Machines) that is based on the least-squares error. We show that the solution of this optimization problem can be obtained easily once the inverse of a certain matrix is computed. This matrix depends only on the input vectors, not on the labels. Thus, if many learning problems with the same set of input vectors but different sets of labels have to be solved, it makes sense to compute the inverse of the matrix just once and then use it for computing all subsequent models. The computational complexity of training a regression SVM is then reduced to O(N²), just a matrix multiplication, and thus probably faster than known SVM training algorithms that require O(N²) work with loops. We describe applications from image processing, where the input points are usually of the form {(x0 + dx, y0 + dy) : |dx| < m, |dy| < n}; every such set of points can be translated to the same set {(dx, dy) : |dx| < m, |dy| < n} by subtracting (x0, y0) from all the vectors. The experimental results demonstrate that the proposed approach is faster than processing each learning problem separately.
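The matrix-inverse reuse described in this abstract can be sketched in a few lines. The following is a minimal illustration under our own assumptions (a standard LS-SVM dual system with an RBF kernel and a ridge parameter C; all function names are ours, not the paper's):

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def precompute_inverse(X, C=10.0, gamma=1.0):
    # Bordered LS-SVM system matrix [[0, 1^T], [1, K + I/C]];
    # it depends only on the inputs X, never on the labels,
    # so its inverse can be computed once and reused.
    n = len(X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, gamma) + np.eye(n) / C
    return np.linalg.inv(A)

def fit_with_inverse(A_inv, y):
    # Solving for a new label vector is just a matrix-vector
    # product, i.e. O(n^2) per learning problem.
    sol = A_inv @ np.concatenate(([0.0], y))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha
```

Many label sets over the same inputs then share a single call to `precompute_inverse`, with each subsequent model costing only a multiplication by the stored inverse.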
4

Hwang, Changha, and Jooyong Shim. "Geographically weighted least squares-support vector machine". Journal of the Korean Data and Information Science Society 28, no. 1 (January 31, 2017): 227–35. http://dx.doi.org/10.7465/jkdi.2017.28.1.227.

5

Choi, Young-Sik. "Least squares one-class support vector machine". Pattern Recognition Letters 30, no. 13 (October 2009): 1236–40. http://dx.doi.org/10.1016/j.patrec.2009.05.007.

6

Huang, Xiaolin, Lei Shi and Johan A. K. Suykens. "Asymmetric least squares support vector machine classifiers". Computational Statistics & Data Analysis 70 (February 2014): 395–405. http://dx.doi.org/10.1016/j.csda.2013.09.015.

7

Liu, Dalian, Yong Shi, Yingjie Tian and Xiankai Huang. "Ramp loss least squares support vector machine". Journal of Computational Science 14 (May 2016): 61–68. http://dx.doi.org/10.1016/j.jocs.2016.02.001.

8

van Gestel, Tony, Johan A. K. Suykens, Bart Baesens, Stijn Viaene, Jan Vanthienen, Guido Dedene, Bart de Moor and Joos Vandewalle. "Benchmarking Least Squares Support Vector Machine Classifiers". Machine Learning 54, no. 1 (January 2004): 5–32. http://dx.doi.org/10.1023/b:mach.0000008082.80494.e0.

9

Zhang, Yong Li, Yan Wei Zhu, Shu Fei Lin, Xiu Juan Sun, Qiu Na Zhang and Xiao Hong Liu. "Algorithm of Sparse Least Squares Support Vector Machine". Advanced Materials Research 143-144 (October 2010): 1229–33. http://dx.doi.org/10.4028/www.scientific.net/amr.143-144.1229.

Abstract:
Support vector machines have been widely studied in recent years. This paper examines the least squares support vector machine algorithm and identifies its main shortcoming: the resulting solution lacks sparseness. A greedy algorithm is therefore introduced into the least squares support vector machine to restore sparseness, yielding a new sparse least squares support vector machine algorithm. The new algorithm was applied to daily monitoring at a sewage treatment plant, and experimental results demonstrate that the improved algorithm is effective.
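A generic greedy forward-selection sketch conveys the idea of restoring sparseness: at each step, add the candidate training point whose inclusion as a kernel basis vector most reduces the regularized training error. This is our own illustrative stand-in, not the paper's exact greedy rule, and every name in it is hypothetical:

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    # cross-kernel matrix between rows of X and rows of Z
    d2 = (np.sum(X ** 2, 1)[:, None] + np.sum(Z ** 2, 1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-gamma * d2)

def greedy_sparse_model(X, y, n_basis=5, lam=1e-3, gamma=1.0):
    # Greedily grow a small set of basis vectors; refit a
    # ridge-regularized least-squares model at each step.
    selected, remaining = [], list(range(len(X)))
    best_w = None
    for _ in range(n_basis):
        best_i, best_err = None, np.inf
        for i in remaining:
            idx = selected + [i]
            Phi = rbf(X, X[idx], gamma)
            w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(idx)),
                                Phi.T @ y)
            err = np.sum((Phi @ w - y) ** 2)
            if err < best_err:
                best_i, best_err, best_w = i, err, w
        selected.append(best_i)
        remaining.remove(best_i)
    return selected, best_w
```

Prediction then uses only the selected points, `rbf(X_new, X[selected]) @ w`, which is exactly the sparseness a full LS-SVM solution lacks.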
10

Dong, Zengshou, Zhaojing Ren and You Dong. "MECHANICAL FAULT RECOGNITION RESEARCH BASED ON LMD-LSSVM". Transactions of the Canadian Society for Mechanical Engineering 40, no. 4 (November 2016): 541–49. http://dx.doi.org/10.1139/tcsme-2016-0042.

Abstract:
Mechanical fault vibration signals are non-stationary, which makes it difficult for traditional methods to extract fault information accurately. This paper proposes a fault identification method combining local mean decomposition (LMD) with a least squares support vector machine. Waveform matching is introduced to handle the signal behavior at the endpoints, linear interpolation yields the local mean and envelope functions, and local mean decomposition then produces the product function (PF) vectors. The energy entropy of the PF vectors serves as the identification input. These vectors are fed to BP neural networks, support vector machines, and least squares support vector machines to identify faults. Experimental results show that the least squares support vector machine achieves the highest classification accuracy.

Theses on the topic "Least-squares support vector machine"

1

Zigic, Ljiljana. "Direct L2 Support Vector Machine". VCU Scholars Compass, 2016. http://scholarscompass.vcu.edu/etd/4274.

Abstract:
This dissertation introduces a novel model for solving the L2 support vector machine, dubbed Direct L2 Support Vector Machine (DL2 SVM). DL2 SVM is a new classification model that transforms the SVM's underlying quadratic programming problem into a system of linear equations with nonnegativity constraints. The resulting system has a symmetric positive definite matrix, and the solution vector must be nonnegative. Furthermore, this dissertation introduces a novel algorithm dubbed Non-Negative Iterative Single Data Algorithm (NN ISDA), which solves DL2 SVM's constrained system of equations. This solver shows significant speedup over several other state-of-the-art algorithms, and the training-time improvement comes at no cost in accuracy. All the experiments that support this claim were conducted on various datasets within a strict double cross-validation scheme. DL2 SVM solved with NN ISDA has faster training time on both medium and large datasets. In addition to the comprehensive DL2 SVM model, we introduce and derive three of its variants. Three different solvers for DL2's system of linear equations with nonnegativity constraints are implemented, presented, and compared in this dissertation.
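A linear system with a symmetric positive definite matrix and a nonnegativity constraint on the solution can be attacked with a simple projected coordinate-descent iteration. The sketch below is a generic nonnegative Gauss-Seidel solver offered purely for illustration; it is a stand-in for, not a reproduction of, the dissertation's NN ISDA:

```python
import numpy as np

def nonneg_gauss_seidel(A, b, n_sweeps=500):
    # For symmetric positive definite A, this minimizes
    # 0.5 * x'Ax - b'x subject to x >= 0 by cyclically
    # minimizing over each coordinate and clipping at zero.
    x = np.zeros(len(b))
    for _ in range(n_sweeps):
        for i in range(len(b)):
            r = b[i] - A[i] @ x + A[i, i] * x[i]  # b_i - sum_{j != i} A_ij x_j
            x[i] = max(0.0, r / A[i, i])
    return x
```

When a nonnegative x* with A x* = b exists, the iteration recovers it; otherwise it converges to the constrained optimum of the associated quadratic objective.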
2

Li, Ke. "Automotive engine tuning using least-squares support vector machines and evolutionary optimization". Thesis, University of Macau, 2012. http://umaclib3.umac.mo/record=b2580667.

3

Khawaja, Taimoor Saleem. "A Bayesian least squares support vector machines based framework for fault diagnosis and failure prognosis". Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/34758.

Abstract:
A high-belief, low-overhead Prognostics and Health Management (PHM) system is desired for online real-time monitoring of complex non-linear systems operating in a complex (possibly non-Gaussian) noise environment. This thesis presents a Bayesian Least Squares Support Vector Machine (LS-SVM) based framework for fault diagnosis and failure prognosis in nonlinear, non-Gaussian systems. The methodology assumes the availability of real-time process measurements, the definition of a set of fault indicators, and the existence of empirical knowledge (or historical data) characterizing both nominal and abnormal operating conditions. An efficient yet powerful LS-SVM algorithm, set within a Bayesian inference framework, not only allows for the development of real-time algorithms for diagnosis and prognosis but also provides a solid theoretical framework for the key concepts involved: classification for diagnosis and regression modeling for prognosis. SVMs are founded on the principle of Structural Risk Minimization (SRM), which seeks a good trade-off between low empirical risk and small capacity. The key features of SVMs are the use of non-linear kernels, the absence of local minima, the sparseness of the solution, and the capacity control obtained by optimizing the margin. The Bayesian inference framework linked with LS-SVMs allows a probabilistic interpretation of the results for diagnosis and prognosis, and additional levels of inference provide the much-coveted adaptability and tunability of the modeling parameters. The two main modules considered in this research are fault diagnosis and failure prognosis. With the goal of designing an efficient and reliable fault diagnosis scheme, a novel anomaly detector based on LS-SVMs is proposed.
The proposed scheme uses only baseline data to construct a one-class LS-SVM which, when presented with online data, is able to distinguish between normal behavior and any abnormal or novel data during real-time operation. The results of the scheme are interpreted as a posterior probability of health (1 − probability of fault). As shown through two case studies in Chapter 3, the scheme is well suited for diagnosing imminent faults in dynamical non-linear systems. Finally, the failure prognosis scheme is based on an incremental weighted Bayesian LS-SVR machine. It is particularly suited for online deployment given the incremental nature of the algorithm and the quick optimization problem solved in the LS-SVR algorithm. By way of kernelization and a Gaussian Mixture Modeling (GMM) scheme, the algorithm can estimate (possibly) non-Gaussian posterior distributions for complex non-linear systems. An efficient regression scheme associated with the more rigorous core algorithm allows for long-term predictions, fault-growth estimation with confidence bounds, and remaining useful life (RUL) estimation after a fault is detected. The leading contributions of this thesis are (a) the development of a novel Bayesian anomaly detector for efficient and reliable Fault Detection and Identification (FDI) based on Least Squares Support Vector Machines, (b) the development of a data-driven real-time architecture for long-term failure prognosis using Least Squares Support Vector Machines, (c) uncertainty representation and management using Bayesian inference for posterior distribution estimation and hyper-parameter tuning, and finally (d) the statistical characterization of the performance of the diagnosis and prognosis algorithms, relating the efficiency and reliability of the proposed schemes.
4

Erdas, Ozlem. "Modelling And Predicting Binding Affinity Of Pcp-like Compounds Using Machine Learning Methods". Master's thesis, METU, 2007. http://etd.lib.metu.edu.tr/upload/3/12608792/index.pdf.

Abstract:
Machine learning methods have been promising tools in science and engineering fields, and their use in chemistry and drug design has advanced since the 1990s. In this study, molecular electrostatic potential (MEP) surfaces of PCP-like compounds are modelled and visualized in order to extract features for predicting binding affinity. In the modelling, Cartesian coordinates of MEP surface points are mapped onto a spherical self-organizing map. The resulting maps are visualized using the values of the electrostatic potential, and these values also provide the features for the prediction system. Support vector machines and the partial least squares method are used for predicting the binding affinity of the compounds, and the results are compared.
5

Pai, Chih-Yun. "Automatic Pain Assessment from Infants’ Crying Sounds". Scholar Commons, 2016. http://scholarcommons.usf.edu/etd/6560.

Abstract:
Crying is the means infants use to express their emotional state; it gives parents and nurses a criterion for understanding an infant's physiological state. Many researchers have analyzed infants' crying sounds to diagnose specific diseases or determine the reasons for crying. This thesis presents an automatic crying-level assessment system that classifies infants' crying sounds, recorded under realistic conditions in the Neonatal Intensive Care Unit (NICU), as whimpering or vigorous crying. To analyze the crying signal, Welch's method and Linear Predictive Coding (LPC) are used to extract spectral features; the average and standard deviation of the frequency signal and the maximum power spectral density are additional spectral features used in classification. Three state-of-the-art classifiers, namely K-nearest Neighbors, Random Forests, and Least Squares Support Vector Machine, are tested in this work. The highest accuracy in classifying whimpering and vigorous crying on the clean dataset is 90%, obtained with segments sampled 10 seconds before and 5 seconds after scoring and with K-nearest Neighbors as the classifier.
6

Yoldas, Mine. "Predicting The Effect Of Hydrophobicity Surface On Binding Affinity Of Pcp-like Compounds Using Machine Learning Methods". Master's thesis, METU, 2011. http://etd.lib.metu.edu.tr/upload/12613215/index.pdf.

Abstract:
This study aims to predict the binding affinity of PCP-like compounds by means of molecular hydrophobicity, an important property which affects the binding affinity of molecules. The values of molecular hydrophobicity are obtained in a three-dimensional coordinate system. Our aim is to reduce the number of points on the hydrophobicity surface of the molecules; this is modeled using self-organizing maps (SOM) and k-means clustering. The feature sets obtained from SOM and k-means clustering are each used to predict the binding affinity of molecules. Support vector regression and partial least squares regression are used for prediction.
7

TREVISO, FELIPE. "Modeling for the Computer-Aided Design of Long Interconnects". Doctoral thesis, Politecnico di Torino, 2022. https://hdl.handle.net/11583/2973429.

8

Melo, Davyd Bandeira de. "Algoritmos de aprendizagem para aproximação da cinemática inversa de robôs manipuladores: um estudo comparativo". Universidade Federal do Ceará, 2015. http://www.teses.ufc.br/tde_busca/arquivo.php?codArquivo=16997.

Abstract:
This dissertation reports the results of a comprehensive comparative study involving seven machine learning algorithms applied to the task of approximating the inverse kinematic model of three robotic arms (planar, PUMA 560 and Motoman HP6). The evaluated algorithms are: Multilayer Perceptron (MLP), Extreme Learning Machine (ELM), Least Squares Support Vector Regression (LS-SVR), Minimal Learning Machine (MLM), Gaussian Processes (GP), Adaptive Network-Based Fuzzy Inference System (ANFIS) and Local Linear Mapping (LLM). Each algorithm is evaluated with respect to its accuracy in estimating the joint angles given the Cartesian coordinates of end-effector trajectories within the robot workspace. A comprehensive evaluation of the performance of the aforementioned algorithms is carried out based on correlation analysis of the residuals, and hypothesis testing procedures are executed to verify whether there are significant differences in performance among the best algorithms.
9

Padilha, Carlos Alberto de Araújo. "Uma abordagem multinível usando algoritmos genéticos em um comitê de LS-SVM". Biblioteca Digital de Teses e Dissertações da UFRGS, 2018. http://hdl.handle.net/10183/174541.

Abstract:
Ensemble systems have been shown over recent decades to be an efficient way to increase the accuracy and stability of learning algorithms, although their construction leaves one question to be elucidated: diversity. Disagreement among the models that compose the ensemble can be generated when they are built under different circumstances, such as the training dataset, the parameter settings, and the selection of learning algorithms. The ensemble may be viewed as a structure with three levels: the input space, the base components, and the block combining the components' responses. This work proposes a multi-level approach using genetic algorithms to build an ensemble of Least Squares Support Vector Machines (LS-SVM): performing feature selection in the input space; handling parameterization and the choice of which models compose the ensemble at the component level; and finding the weight vector that best represents the importance of each classifier in the ensemble's final response. To evaluate the performance of the proposed approach, benchmarks from the UCI Repository were used for comparison with other classification algorithms. The results were also compared with deep learning methods on the MNIST and CIFAR datasets and proved very satisfactory.
10

SEDAGHAT, MOSTAFA. "Modeling and Optimization of the Microwave PCB Interconnects Using Macromodel Techniques". Doctoral thesis, Politecnico di Torino, 2022. https://hdl.handle.net/11583/2973989.


Books on the topic "Least-squares support vector machine"

1

[name missing]. Least squares support vector machines. Singapore: World Scientific, 2002.

2

Least squares support vector machines. River Edge, NJ: World Scientific, 2002.

3

Vandewalle, Joos, Bart De Moor, Tony Van Gestel, Jos De Brabanter and Johan A. K. Suykens. Least Squares Support Vector Machines. World Scientific Publishing Company, 2003.

4

O. Görgülü and A. Akilli. Egg production curve fitting using least square support vector machines and nonlinear regression analysis. Verlag Eugen Ulmer, 2018. http://dx.doi.org/10.1399/eps.2018.235.


Book chapters on the topic "Least-squares support vector machine"

1

Pelckmans, K., I. Goethals, J. D. Brabanter, J. A. K. Suykens and B. D. Moor. "Componentwise Least Squares Support Vector Machines". In Support Vector Machines: Theory and Applications, 77–98. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/10984697_3.

2

Zhang, Xiaoou, and Zexuan Zhu. "Sparse Multi-task Least-Squares Support Vector Machine". In Neural Computing for Advanced Applications, 157–67. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-7670-6_14.

3

Li, Yang, and Wanmei Tang. "A Least Squares Support Vector Machine Sparseness Algorithm". In Lecture Notes in Electrical Engineering, 346–53. London: Springer London, 2012. http://dx.doi.org/10.1007/978-1-4471-2386-6_45.

4

Wu, Fangfang, and Yinliang Zhao. "Least Squares Littlewood-Paley Wavelet Support Vector Machine". In Lecture Notes in Computer Science, 462–72. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11579427_47.

5

Li, Lijuan, Youfeng Li, Hongye Su and Jian Chu. "Least Squares Support Vector Machines Based on Support Vector Degrees". In Lecture Notes in Computer Science, 1275–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11816157_160.

6

Gan, Liang-zhi, Hai-kuan Liu and You-xian Sun. "Sparse Least Squares Support Vector Machine for Function Estimation". In Advances in Neural Networks - ISNN 2006, 1016–21. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11759966_149.

7

López, Jorge, Álvaro Barbero and José R. Dorronsoro. "Momentum Acceleration of Least-Squares Support Vector Machines". In Lecture Notes in Computer Science, 135–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21738-8_18.

8

Gijsberts, Arjan, Giorgio Metta and Léon Rothkrantz. "Evolutionary Optimization of Least-Squares Support Vector Machines". In Annals of Information Systems, 277–97. Boston, MA: Springer US, 2009. http://dx.doi.org/10.1007/978-1-4419-1280-0_12.

9

Li, You-Feng, Li-Juan Li, Hong-Ye Su and Jian Chu. "Least Squares Support Vector Machine Based Partially Linear Model Identification". In Lecture Notes in Computer Science, 775–81. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11816157_94.

10

Zhang, Yongli, Yanwei Zhu, Shufei Lin and Xiaohong Liu. "Application of Least Squares Support Vector Machine in Fault Diagnosis". In Communications in Computer and Information Science, 192–200. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-27452-7_26.


Conference papers on the topic "Least-squares support vector machine"

1

"Least squares support vector machine ensemble". In 2004 IEEE International Joint Conference on Neural Networks. IEEE, 2004. http://dx.doi.org/10.1109/ijcnn.2004.1380924.

2

Yugang Fan, Ping Li and Zhihuan Song. "Dynamic Least Squares Support Vector Machine". In 2006 6th World Congress on Intelligent Control and Automation. IEEE, 2006. http://dx.doi.org/10.1109/wcica.2006.1713313.

3

Kong, Rui, and Bing Zhang. "A Fast Least Squares Support Vector Machine classifier". In 2008 Chinese Control and Decision Conference (CCDC). IEEE, 2008. http://dx.doi.org/10.1109/ccdc.2008.4597413.

4

Jafar, Nurkamila, Sri Astuti Thamrin and Armin Lawi. "Multiclass classification using Least Squares Support Vector Machine". In 2016 International Conference on Computational Intelligence and Cybernetics (CYBERNETICSCOM). IEEE, 2016. http://dx.doi.org/10.1109/cyberneticscom.2016.7892558.

5

Yanhui, Zhang, Liu Binbin and Pan Zhongming. "Biofouling estimation with least squares support vector machine". In 2016 IEEE International Conference on Information and Automation (ICIA). IEEE, 2016. http://dx.doi.org/10.1109/icinfa.2016.7831980.

6

Yongsheng Sang, Haixian Zhang and Lin Zuo. "Least Squares Support Vector Machine classifiers using PCNNs". In 2008 IEEE Conference on Cybernetics and Intelligent Systems (CIS). IEEE, 2008. http://dx.doi.org/10.1109/iccis.2008.4670890.

7

Xia, Xiao-Lei. "A novel sparse least-squares support vector machine". In 2012 5th International Conference on Biomedical Engineering and Informatics (BMEI). IEEE, 2012. http://dx.doi.org/10.1109/bmei.2012.6513100.

8

Gaobo, Chen, and Chen Xiufang. "Combining partial least squares regression and least squares support vector machine for data mining". In 2011 International Conference on E-Business and E-Government (ICEE). IEEE, 2011. http://dx.doi.org/10.1109/icebeg.2011.5881755.

9

Jing, Lv, and Zhang Yanqing. "Colleges Employment Forecasting by Least Squares Support Vector Machine". In 2012 International Conference on Computer Science and Electronics Engineering (ICCSEE). IEEE, 2012. http://dx.doi.org/10.1109/iccsee.2012.455.

10

Richhariya, B., and M. Tanveer. "Universum least squares twin parametric-margin support vector machine". In 2020 International Joint Conference on Neural Networks (IJCNN). IEEE, 2020. http://dx.doi.org/10.1109/ijcnn48605.2020.9206865.


Organization reports on the topic "Least-squares support vector machine"

1

Puttanapong, Nattapong, Arturo M. Martinez Jr, Mildred Addawe, Joseph Bulan, Ron Lester Durante and Marymell Martillan. Predicting Poverty Using Geospatial Data in Thailand. Asian Development Bank, December 2020. http://dx.doi.org/10.22617/wps200434-2.

Abstract:
This study examines an alternative approach to estimating poverty by investigating whether readily available geospatial data can accurately predict the spatial distribution of poverty in Thailand. It also compares the predictive performance of various econometric and machine learning methods such as generalized least squares, neural networks, random forests, and support vector regression. Results suggest that the intensity of night lights and other variables that approximate population density are highly associated with the proportion of the population living in poverty. The random forest technique yielded the highest prediction accuracy among the methods considered, perhaps due to its capability to fit complex association structures even with small and medium-sized datasets.