Journal articles on the topic "Least-squares support vector machine"

To see the other types of publications on this topic, follow the link: Least-squares support vector machine.

Create a correct reference in APA, MLA, Chicago, Harvard, and other styles

Choose a source:

Consult the top 50 journal articles for your research on the topic "Least-squares support vector machine."

Next to every source in the list of references there is an "Add to bibliography" button. Click on it, and we will generate the bibliographic reference for the chosen work automatically, in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online whenever this information is included in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

KITAYAMA, Satoshi, Masao ARAKAWA, and Koetsu YAMAZAKI. "1403 Least-Squares Support Vector Machine." Proceedings of Design & Systems Conference 2010.20 (2010): 1403-1–1403-5. http://dx.doi.org/10.1299/jsmedsd.2010.20._1403-1_.
2

Adankon, M. M., M. Cheriet, and A. Biem. "Semisupervised Least Squares Support Vector Machine." IEEE Transactions on Neural Networks 20, no. 12 (December 2009): 1858–70. http://dx.doi.org/10.1109/tnn.2009.2031143.
3

ZHENG, SHENG, YUQIU SUN, JINWEN TIAN, and JAIN LIU. "MAPPED LEAST SQUARES SUPPORT VECTOR MACHINE REGRESSION." International Journal of Pattern Recognition and Artificial Intelligence 19, no. 03 (May 2005): 459–75. http://dx.doi.org/10.1142/s0218001405004058.

Abstract:
This paper describes a novel version of regression SVM (Support Vector Machines) based on the least-squares error. We show that the solution of this optimization problem can be obtained easily once the inverse of a certain matrix is computed. This matrix depends only on the input vectors, not on the labels. Thus, if many learning problems with the same set of input vectors but different sets of labels have to be solved, it makes sense to compute the inverse of the matrix just once and then use it for computing all subsequent models. The computational complexity of training a regression SVM can then be reduced to O(N^2), a single matrix multiplication, and is thus likely faster than known SVM training algorithms that perform O(N^2) work inside iterative loops. We describe applications from image processing, where the input points are usually of the form {(x0 + dx, y0 + dy) : |dx| < m, |dy| < n}, and every such set of points can be translated to the same set {(dx, dy) : |dx| < m, |dy| < n} by subtracting (x0, y0) from all the vectors. The experimental results demonstrate that the proposed approach is faster than processing each learning problem separately.
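The closed-form training described in the abstract above, reducing LS-SVM fitting to one linear system whose coefficient matrix depends only on the inputs, is the defining trait of least-squares SVM regression. Below is a minimal sketch of that idea, assuming a Gaussian RBF kernel and synthetic data; the variable names and hyper-parameter values are illustrative, not taken from the paper:

```python
import numpy as np

def rbf_kernel(A, B, sigma=0.2):
    # Gaussian RBF Gram matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=1e4, sigma=0.2):
    # LS-SVM regression training reduces to one linear system
    # (the KKT conditions of the regularized least-squares problem):
    #   [ 0   1^T           ] [ b     ]   [ 0 ]
    #   [ 1   K + I / gamma ] [ alpha ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=0.2):
    # f(x) = sum_i alpha_i * k(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Fit a smooth 1-D function and inspect the training fit.
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X).ravel()
b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, b, alpha, X)
```

Note how the kernel matrix (and hence the system matrix) never touches `y`: factorizing it once and reusing it across many label sets is exactly the reuse the abstract exploits.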
4

Hwang, Changha, and Jooyong Shim. "Geographically weighted least squares-support vector machine." Journal of the Korean Data and Information Science Society 28, no. 1 (31 January 2017): 227–35. http://dx.doi.org/10.7465/jkdi.2017.28.1.227.
5

Choi, Young-Sik. "Least squares one-class support vector machine." Pattern Recognition Letters 30, no. 13 (October 2009): 1236–40. http://dx.doi.org/10.1016/j.patrec.2009.05.007.
6

Huang, Xiaolin, Lei Shi, and Johan A. K. Suykens. "Asymmetric least squares support vector machine classifiers." Computational Statistics & Data Analysis 70 (February 2014): 395–405. http://dx.doi.org/10.1016/j.csda.2013.09.015.
7

Liu, Dalian, Yong Shi, Yingjie Tian, and Xiankai Huang. "Ramp loss least squares support vector machine." Journal of Computational Science 14 (May 2016): 61–68. http://dx.doi.org/10.1016/j.jocs.2016.02.001.
8

van Gestel, Tony, Johan A. K. Suykens, Bart Baesens, Stijn Viaene, Jan Vanthienen, Guido Dedene, Bart de Moor, and Joos Vandewalle. "Benchmarking Least Squares Support Vector Machine Classifiers." Machine Learning 54, no. 1 (January 2004): 5–32. http://dx.doi.org/10.1023/b:mach.0000008082.80494.e0.
9

Zhang, Yong Li, Yan Wei Zhu, Shu Fei Lin, Xiu Juan Sun, Qiu Na Zhang, and Xiao Hong Liu. "Algorithm of Sparse Least Squares Support Vector Machine." Advanced Materials Research 143-144 (October 2010): 1229–33. http://dx.doi.org/10.4028/www.scientific.net/amr.143-144.1229.

Abstract:
Support vector machines have been widely studied in recent years. The least squares support vector machine algorithm is examined and its main shortcoming identified: its solution lacks sparseness. In this paper, a greedy algorithm is introduced into the least squares support vector machine so that sparseness is recovered, yielding a new sparse least squares support vector machine algorithm. The new algorithm was applied to the daily monitoring of a sewage treatment plant. Experimental results demonstrate that the improved algorithm is successful.
10

Dong, Zengshou, Zhaojing Ren, and You Dong. "MECHANICAL FAULT RECOGNITION RESEARCH BASED ON LMD-LSSVM." Transactions of the Canadian Society for Mechanical Engineering 40, no. 4 (November 2016): 541–49. http://dx.doi.org/10.1139/tcsme-2016-0042.

Abstract:
Mechanical fault vibration signals are non-stationary, which causes system instability. Because traditional methods have difficulty extracting fault information accurately, this paper proposes a fault identification method based on local mean decomposition (LMD) and a least squares support vector machine. The article introduces waveform matching to handle the endpoint behavior of the original signals and uses linear interpolation to obtain the local mean and envelope functions, from which the product function (PF) vectors are obtained through local mean decomposition. The energy entropy of the PF vectors serves as the identification input. These vectors are fed into BP neural networks, support vector machines, and least squares support vector machines to identify faults. Experimental results show that the least squares support vector machine achieves the highest classification accuracy.
11

Xia, Xiao-Lei, Weidong Jiao, Kang Li, and George Irwin. "A Novel Sparse Least Squares Support Vector Machines." Mathematical Problems in Engineering 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/602341.

Abstract:
The solution of a Least Squares Support Vector Machine (LS-SVM) suffers from the problem of nonsparseness. The Forward Least Squares Approximation (FLSA) is a greedy approximation algorithm with a least-squares loss function. This paper proposes a new Support Vector Machine for which the FLSA is the training algorithm: the Forward Least Squares Approximation SVM (FLSA-SVM). A major novelty of the FLSA-SVM is that the number of support vectors is the regularization parameter tuning the tradeoff between generalization ability and training cost. The FLSA-SVM can also detect linear dependencies among columns of the input Gramian matrix. These attributes together contribute to its extreme sparseness. Experiments on benchmark datasets show that, compared to various SVM algorithms, the FLSA-SVM is extremely compact while maintaining competitive generalization ability.
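The greedy forward least-squares idea recurring in the sparse LS-SVM entries above can be illustrated in a few lines. The sketch below is a generic forward least-squares selection over the columns of a design (e.g. kernel) matrix, not the authors' exact FLSA-SVM; the function name and test data are illustrative. Orthonormal columns are used so the recovery is exact by construction:

```python
import numpy as np

def forward_ls_selection(Phi, y, n_basis):
    # Greedy forward least squares: at each step add the not-yet-selected
    # column of Phi most correlated with the current residual, then refit
    # an ordinary least-squares model on all selected columns.
    selected, residual = [], y.astype(float).copy()
    coef = np.array([])
    for _ in range(n_basis):
        candidates = [j for j in range(Phi.shape[1]) if j not in selected]
        j_best = max(candidates,
                     key=lambda j: abs(Phi[:, j] @ residual)
                     / (np.linalg.norm(Phi[:, j]) + 1e-12))
        selected.append(j_best)
        coef, *_ = np.linalg.lstsq(Phi[:, selected], y, rcond=None)
        residual = y - Phi[:, selected] @ coef
    return selected, coef, residual

# y is built from columns 1 and 4 of an orthonormal basis, so two greedy
# steps recover exactly those columns with (numerically) zero residual.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((50, 10)))
y = 2.0 * Q[:, 1] - 1.5 * Q[:, 4]
selected, coef, residual = forward_ls_selection(Q, y, 2)
```

The number of selection steps `n_basis` plays the role the FLSA-SVM abstract describes: it directly caps the number of "support" columns and thus trades accuracy against sparseness.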
12

Seok, Kyungha, and Daehyun Cho. "A Study on Support Vectors of Least Squares Support Vector Machine." Communications for Statistical Applications and Methods 10, no. 3 (1 December 2003): 873–78. http://dx.doi.org/10.5351/ckss.2003.10.3.873.
13

Li, Xiao, Xin Liu, Clyde Zhengdao Li, Zhumin Hu, Geoffrey Qiping Shen, and Zhenyu Huang. "Foundation pit displacement monitoring and prediction using least squares support vector machines based on multi-point measurement." Structural Health Monitoring 18, no. 3 (23 April 2018): 715–24. http://dx.doi.org/10.1177/1475921718767935.

Abstract:
Foundation pit displacement is a critical safety risk for both building structures and human lives. Accurate displacement monitoring and prediction of a deep foundation pit are essential to prevent potential risks at an early construction stage. Machine learning methods are extensively applied for this purpose, but approaches such as support vector machines have limitations in data processing efficiency and prediction accuracy. As an approach derived from support vector machines, the least squares support vector machine improves data processing efficiency through equality constraints in a least-squares loss function. However, its accuracy relies heavily on a large volume of influencing factors measured at adjacent critical points, which is not normally available during construction. To address this issue, this study proposes an improved least squares support vector machine algorithm based on multi-point measuring techniques, namely the multi-point least squares support vector machine. In a real case study project selected to evaluate the approach, the multi-point least squares support vector machine on average outperformed its single-point counterpart in prediction accuracy during foundation pit monitoring and prediction.
14

Yang Li, Wanmei Tang, and Mingyong Li. "A Least Squares Support Vector Machine Sparseness Algorithm." Journal of Convergence Information Technology 7, no. 13 (31 July 2012): 240–48. http://dx.doi.org/10.4156/jcit.vol7.issue13.28.
15

WU, Zong-liang, and Heng DOU. "New sparse least squares support vector machine algorithm." Journal of Computer Applications 29, no. 6 (4 December 2009): 1559–62. http://dx.doi.org/10.3724/sp.j.1087.2009.01559.
16

Farrokh, Mojtaba. "Hysteresis Simulation Using Least-Squares Support Vector Machine." Journal of Engineering Mechanics 144, no. 9 (September 2018): 04018084. http://dx.doi.org/10.1061/(asce)em.1943-7889.0001509.
17

Xing, Hong-Jie, and Li-Fei Li. "Robust least squares one-class support vector machine." Pattern Recognition Letters 138 (October 2020): 571–78. http://dx.doi.org/10.1016/j.patrec.2020.09.005.
18

Pei, Huimin, Kuaini Wang, and Ping Zhong. "Semi-supervised matrixized least squares support vector machine." Applied Soft Computing 61 (December 2017): 72–87. http://dx.doi.org/10.1016/j.asoc.2017.07.040.
19

Li, Yuangui, Chen Lin, and Weidong Zhang. "Improved sparse least-squares support vector machine classifiers." Neurocomputing 69, no. 13-15 (August 2006): 1655–58. http://dx.doi.org/10.1016/j.neucom.2006.03.001.
20

Zhang, Ruiting, and Zhijian Zhou. "A Fuzzy Least Squares Support Tensor Machines in Machine Learning." International Journal of Emerging Technologies in Learning (iJET) 10, no. 8 (14 December 2015): 4. http://dx.doi.org/10.3991/ijet.v10i8.5203.

Abstract:
In machine learning, high-dimensional data are often encountered in real applications. Most traditional learning algorithms, such as SVM, are based on the vector space model. Tensor representation helps with the overfitting problem of vector-based learning, and tensor-based algorithms require a smaller set of decision variables than vector-based approaches. We also require that meaningful training points be classified correctly, without concern for whether noise-like training points are classified correctly. To utilize the structural information present in the high-dimensional features of an object, a tensor-based learning framework, the Fuzzy Least Squares Support Tensor Machine (FLSSTM), is proposed, in which the classifier is obtained by solving a system of linear equations rather than a quadratic programming problem at each iteration, as the STM algorithm requires. This provides a significant reduction in computation time while retaining comparable classification accuracy. The efficacy of the proposed method is demonstrated on the ORL and Yale databases. The FLSSTM outperforms other tensor-based algorithms, such as LSSTM, especially when the training size is small.
21

Wornyo, Dickson Keddy, and Xiang-Jun Shen. "Coupled Least Squares Support Vector Ensemble Machines." Information 10, no. 6 (3 June 2019): 195. http://dx.doi.org/10.3390/info10060195.

Abstract:
The least squares support vector method is a popular data-driven modeling method that shows good performance and has been successfully applied in a wide range of applications. In this paper, we propose a novel coupled least squares support vector ensemble machine (C-LSSVEM). The proposed coupling ensemble improves robustness and produces better classification performance than a single-model approach. The C-LSSVEM can choose appropriate kernel types and their parameters in a good coupling strategy, with a set of classifiers trained simultaneously, and further minimizes the total loss of the ensemble in kernel space. Thus, we form an ensemble regressor by co-optimizing and weighting base regressors. Experiments conducted on several datasets, including artificial datasets, UCI classification and regression datasets, handwritten digit datasets, and the NWPU-RESISC45 dataset, indicate that C-LSSVEM achieves the minimal regression loss and the best classification accuracy relative to selected state-of-the-art regression and classification techniques.
22

Suykens, J. A. K., and J. Vandewalle. "Recurrent least squares support vector machines." IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications 47, no. 7 (July 2000): 1109–14. http://dx.doi.org/10.1109/81.855471.
23

LI, JIANPING, ZHENYU CHEN, LIWEI WEI, WEIXUAN XU, and GANG KOU. "FEATURE SELECTION VIA LEAST SQUARES SUPPORT FEATURE MACHINE." International Journal of Information Technology & Decision Making 06, no. 04 (December 2007): 671–86. http://dx.doi.org/10.1142/s0219622007002733.

Abstract:
In many applications, such as credit risk management, data are represented as high-dimensional feature vectors, making feature selection necessary to reduce computational complexity and to improve generalization ability and interpretability. In this paper, we present a novel feature selection method, the "Least Squares Support Feature Machine" (LS-SFM). The proposed method has two advantages compared with the conventional Support Vector Machine (SVM) and LS-SVM. First, convex combinations of basic kernels are used as the kernel, with each basic kernel making use of a single feature. This transforms the feature selection problem, which cannot be solved in the context of SVM, into an ordinary multiple-parameter learning problem. Second, all parameters are learned by a two-stage iterative algorithm. A 1-norm regularized cost function enforces sparseness of the feature parameters, and the "support features" are the features with nonzero parameters. An experimental study on several UCI datasets and a commercial credit card dataset demonstrates the effectiveness and efficiency of the proposed approach.
24

Liang, Si Yang, and Jian Hong Lv. "Least Squares Support Vector Machine for Fault Diagnosis Optimization." Applied Mechanics and Materials 347-350 (August 2013): 505–8. http://dx.doi.org/10.4028/www.scientific.net/amm.347-350.505.

Abstract:
To improve diagnostic accuracy for digital circuits, a fault diagnosis method based on support vector machines (SVM) is proposed. The input is the fault characteristics of the digital circuit and the output is the fault type, establishing the connection between characteristics and fault types. The network is trained by a least squares algorithm; the training samples are generated by simulation, and the test samples come from simulations not used in training. The method achieves classification of faulty digital circuits, and the results show that it is both fast and highly accurate.
25

Hwang, Changha, Sang-Il Choi, and Jooyong Shim. "Deep multiple kernel least squares support vector regression machine." Journal of the Korean Data and Information Science Society 29, no. 4 (31 July 2018): 895–902. http://dx.doi.org/10.7465/jkdi.2018.29.4.895.
26

Saranya, N. "SENTIMENTAL ANALYSIS USING LEAST SQUARES TWIN SUPPORT VECTOR MACHINE." International Journal of Advanced Research in Computer Science 8, no. 7 (20 August 2017): 860–66. http://dx.doi.org/10.26483/ijarcs.v8i7.4527.
27

Nasiri, Jalal A., Nasrollah Moghadam Charkari, and Saeed Jalili. "Least squares twin multi-class classification support vector machine." Pattern Recognition 48, no. 3 (March 2015): 984–92. http://dx.doi.org/10.1016/j.patcog.2014.09.020.
28

Liu, Kun, and Bing-Yu Sun. "Least Squares Support Vector Machine Regression with Equality Constraints." Physics Procedia 24 (2012): 2227–30. http://dx.doi.org/10.1016/j.phpro.2012.02.327.
29

Licheng Jiao, Liefeng Bo, and Ling Wang. "Fast Sparse Approximation for Least Squares Support Vector Machine." IEEE Transactions on Neural Networks 18, no. 3 (May 2007): 685–97. http://dx.doi.org/10.1109/tnn.2006.889500.
30

Shim, Joo-Yong, Jong-Sig Bae, and Chang-Ha Hwang. "Multiclass Classification via Least Squares Support Vector Machine Regression." Communications for Statistical Applications and Methods 15, no. 3 (30 May 2008): 441–50. http://dx.doi.org/10.5351/ckss.2008.15.3.441.
31

Ziggah, Yao Yevenyo, Youjina Hu, Yakubu Issaka, and Prosper Basommi Laari. "LEAST SQUARES SUPPORT VECTOR MACHINE MODEL FOR COORDINATE TRANSFORMATION." Geodesy and Cartography 45, no. 5 (17 April 2019): 16–27. http://dx.doi.org/10.3846/gac.2019.6053.

Abstract:
In coordinate transformation, the main purpose is to provide a mathematical relationship between coordinates in different geodetic reference frames, giving geospatial professionals the opportunity to link different datums. A review of previous studies indicates that both empirical and soft computing models have recently been proposed for coordinate transformation. The main aim of this study is to present the applicability and performance of the Least Squares Support Vector Machine (LS-SVM), an extension of the Support Vector Machine (SVM), for coordinate transformation. For comparison, the SVM and the widely used Backpropagation Neural Network (BPNN), Radial Basis Function Neural Network (RBFNN), and 2D conformal and affine methods were also employed. To assess how well the transformation results fit the observed data, the root mean square of the residual horizontal distances and the standard deviation were used. The LS-SVM and RBFNN produced comparable results that were better than the other methods, and the overall statistical findings of the LS-SVM met the accuracy requirements for cadastral surveying applications in Ghana. The proposed LS-SVM thus shows promising predictive capability and could efficiently serve as a supplementary technique for coordinate transformation.
32

Khemchandani, Reshma, and Aman Pal. "Multi-category laplacian least squares twin support vector machine." Applied Intelligence 45, no. 2 (23 March 2016): 458–74. http://dx.doi.org/10.1007/s10489-016-0770-6.
33

Xie, Xijiong, Feixiang Sun, Jiangbo Qian, Lijun Guo, Rong Zhang, Xulun Ye, and Zhijin Wang. "Laplacian Lp norm least squares twin support vector machine." Pattern Recognition 136 (April 2023): 109192. http://dx.doi.org/10.1016/j.patcog.2022.109192.
34

Liu, Yisen, Songbin Zhou, Weixin Liu, Xinhui Yang, and Jun Luo. "Least-squares support vector machine and successive projection algorithm for quantitative analysis of cotton-polyester textile by near infrared spectroscopy." Journal of Near Infrared Spectroscopy 26, no. 1 (February 2018): 34–43. http://dx.doi.org/10.1177/0967033518757069.

Abstract:
The application of near infrared spectroscopy to the quantitative analysis of cotton-polyester textiles was investigated. A total of 214 cotton-polyester fabric samples, covering the range from 0% to 100% cotton, were measured and analyzed. Partial least squares and least-squares support vector machine models using all variables as inputs were established. Furthermore, the successive projection algorithm was used to select effective wavelengths and build successive projection algorithm-least-squares support vector machine models, which were compared with two other effective wavelength selection methods: loading weights analysis and regression coefficient analysis. The calibration and validation results show that the successive projection algorithm-least-squares support vector machine model outperformed not only the full-variable partial least squares and least-squares support vector machine models, but also the least-squares support vector machine models based on the other two wavelength selection methods. The root mean squared errors of calibration and prediction of the best-performing model were 0.77% and 1.17%, respectively. Overall, near infrared spectroscopy combined with the least-squares support vector machine and successive projection algorithm provides a simple, rapid, economical, and non-destructive method for determining the composition of cotton-polyester textiles.
35

Chen, Zhan-bo. "Research on Application of Regression Least Squares Support Vector Machine on Performance Prediction of Hydraulic Excavator." Journal of Control Science and Engineering 2014 (2014): 1–4. http://dx.doi.org/10.1155/2014/686130.

Abstract:
To improve the accuracy of performance prediction for hydraulic excavators, the regression least squares support vector machine is applied. First, the mathematical model of the regression least squares support vector machine is studied and its algorithm is designed. Finally, a performance prediction simulation of a hydraulic excavator is carried out, and the results show that this method correctly predicts how the excavator's performance changes.
36

Lu, Yan, and Zhiping Huang. "A new hybrid model of sparsity empirical wavelet transform and adaptive dynamic least squares support vector machine for fault diagnosis of gear pump." Advances in Mechanical Engineering 12, no. 5 (May 2020): 168781402092204. http://dx.doi.org/10.1177/1687814020922047.

Abstract:
The gear pump is a key component in hydraulic drive systems, so its fault diagnosis is very significant. This article proposes the combination of a sparsity empirical wavelet transform and an adaptive dynamic least squares support vector machine for gear pump fault diagnosis. The sparsity empirical wavelet transform extracts features from the vibration signal of the gear pump, with the sparsity function making the empirical wavelet transform adaptive, and the adaptive dynamic least squares support vector machine recognizes the state of the pump. Experimental results show that the diagnostic accuracy of the proposed combination is better than that of the empirical wavelet transform paired with either an adaptive dynamic least squares support vector machine or a standard least squares support vector machine.
37

Guan, Qiong, Han Qing Tao, and Bin Huang. "The Computer Interlocking Software System Reliability Test Based on the Monte Carlo." Applied Mechanics and Materials 614 (September 2014): 397–400. http://dx.doi.org/10.4028/www.scientific.net/amm.614.397.

Abstract:
Railway switch failure prediction plays an important role in the maintenance of railway signal equipment. The paper puts forward a railway switch failure prediction algorithm based on the least squares support vector machine and chooses five characteristic indexes to compose the input vectors of the failure prediction model. This reduces the dimension of the input vectors and shortens the training time of the least squares support vector machine, while a pruning algorithm accelerates computation and maintains good regression performance. Experiments prove that the algorithm has strong self-learning ability and high prediction accuracy, and that it speeds up switch failure prediction and improves its accuracy and reliability.
38

Liu, Xuanyu, and Kaiju Zhang. "Earth pressure prediction in sealed chamber of shield machine based on parallel least squares support vector machine optimized by cooperative particle swarm optimization." Measurement and Control 52, no. 7-8 (10 May 2019): 758–64. http://dx.doi.org/10.1177/0020294019840720.

Abstract:
Earth pressure in the sealed chamber is affected by multi-system and multi-field coupling during the shield tunneling process, so it is difficult to establish a mechanistic earth pressure control model. Therefore, a data-driven modeling method for chamber earth pressure is proposed, based on a parallel least squares support vector machine optimized by parallel cooperative particle swarm optimization. The vectors are first studied in parallel according to different hierarchies; the initial classifiers are then updated with a cross-feedback method to retrain the vectors, and finally the vectors are merged to obtain the support vectors. The parameters of the least squares support vector machine are optimized by parallel cooperative particle swarm optimization, enabling fast prediction on large-scale data. Simulation experiments based on in situ measured data show that the method achieves high computational efficiency and prediction accuracy and has practical significance for engineering applications.
39

Mahmoud, Tarek. "Adaptive control scheme based on the least squares support vector machine network." International Journal of Applied Mathematics and Computer Science 21, no. 4 (1 December 2011): 685–96. http://dx.doi.org/10.2478/v10006-011-0054-6.

Abstract:
Recently, a new type of neural network called Least Squares Support Vector Machines (LS-SVMs) has been receiving increasing attention in nonlinear system identification and control due to its generalization performance. This paper develops a stable adaptive control scheme using the LS-SVM network. The scheme includes two parts: an identification part that uses a modified LS-SVM structure, the multi-resolution wavelet least squares support vector machine network (MRWLS-SVM), as a predictor model, and a controller part developed to track a reference trajectory. Stability analysis of the tracking errors is performed by means of the Lyapunov stability criterion. Finally, simulation studies demonstrate the capability of the developed approach in controlling a pH process.
40

Ren, Yuan, and Guang Chen Bai. "Colonial Competitive Algorithm Assisted Least Squares Support Vector Machines." Advanced Materials Research 255-260 (May 2011): 2082–86. http://dx.doi.org/10.4028/www.scientific.net/amr.255-260.2082.

Abstract:
The use of the least squares support vector machine (LSSVM), a novel machine learning method, for classification and function approximation has grown over the past few years, largely due to its high generalization performance. However, LSSVM is hampered by the fact that its hyper-parameters, which largely determine the quality of LSSVM models, must be set by the user; this makes LSSVM harder to apply and limits its use on academic and industrial platforms. In this paper we present a novel method for automatically tuning the hyper-parameters of LSSVM based on the colonial competitive algorithm (CCA), a newly developed evolutionary algorithm inspired by the mechanism of imperialistic competition. Tests on several benchmark examples suggest that the proposed paradigm can be a competitive and powerful tool for classification and function approximation.
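The hyper-parameter tuning problem this abstract describes, minimizing model error over a small continuous search space, is often attacked with population-based heuristics of this kind. The following is a toy sketch of an assimilation-style search (candidates repeatedly drift toward the best solution found so far with shrinking exploration noise), not the paper's actual CCA, and a simple quadratic stands in for the LSSVM validation error; all names and settings are illustrative:

```python
import numpy as np

def loss(p):
    # Stand-in objective; in the paper's setting this would be the LSSVM
    # validation error as a function of its hyper-parameters.
    x, y = p
    return (x - 3.0) ** 2 + (y + 1.0) ** 2

def assimilation_search(f, bounds, n_pop=40, n_iter=200, beta=0.7, seed=0):
    # Each candidate moves a random fraction of the way toward the best
    # solution found so far, plus exploration noise that decays over time.
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(n_pop, len(lo)))
    best = min(pop, key=f)
    for t in range(n_iter):
        noise_scale = 0.5 * (hi - lo) * 0.97 ** t
        pop = (pop
               + beta * rng.random((n_pop, 1)) * (best - pop)
               + rng.normal(0.0, noise_scale, size=pop.shape))
        pop = np.clip(pop, lo, hi)
        cand = min(pop, key=f)
        if f(cand) < f(best):
            best = cand  # keep the incumbent (elitism)
    return best

best = assimilation_search(loss, bounds=[(-10.0, 10.0), (-10.0, 10.0)])
```

For real LSSVM tuning the two search dimensions would typically be the regularization constant and the kernel width, and `f` would refit and validate the model at each candidate point.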
41

Cawley, Gavin C., and Nicola L. C. Talbot. "Improved sparse least-squares support vector machines." Neurocomputing 48, no. 1-4 (October 2002): 1025–31. http://dx.doi.org/10.1016/s0925-2312(02)00606-9.
42

Sartakhti, Javad Salimi, Homayun Afrabandpey, and Nasser Ghadiri. "Fuzzy least squares twin support vector machines." Engineering Applications of Artificial Intelligence 85 (October 2019): 402–9. http://dx.doi.org/10.1016/j.engappai.2019.06.018.
43

Xu, Shuo, Xin An, Xiaodong Qiao, and Lijun Zhu. "Multi-task least-squares support vector machines." Multimedia Tools and Applications 71, no. 2 (30 May 2013): 699–715. http://dx.doi.org/10.1007/s11042-013-1526-5.
44

Wang, Liejun, Taiyi Zhang, and Yatong Zhou. "Multi-Resolution Least Squares Support Vector Machines." Journal of Electronics (China) 24, no. 5 (September 2007): 701–4. http://dx.doi.org/10.1007/s11767-006-0270-7.
45

Van Belle, V., K. Pelckmans, J. A. K. Suykens, and S. Van Huffel. "Additive survival least-squares support vector machines." Statistics in Medicine 29, no. 2 (18 December 2009): 296–308. http://dx.doi.org/10.1002/sim.3743.
46

Yang, Chaoyu, Jie Yang, and Jun Ma. "Sparse Least Squares Support Vector Machine With Adaptive Kernel Parameters." International Journal of Computational Intelligence Systems 13, no. 1 (2020): 212. http://dx.doi.org/10.2991/ijcis.d.200205.001.
47

Yu, Kaijun. "A Least Squares Support Vector Machine Classifier for Information Retrieval." Journal of Convergence Information Technology 8, no. 2 (31 January 2013): 177–83. http://dx.doi.org/10.4156/jcit.vol8.issue2.22.
48

WU, Zong-liang, and Heng DOU. "Generalized least squares support-vector-machine algorithm and its application." Journal of Computer Applications 29, no. 3 (6 May 2009): 877–79. http://dx.doi.org/10.3724/sp.j.1087.2009.00877.
49

Xing, Chong, Liming Wan, Jiaxin Wang, and Yanchun Liang. "Prediction of ncRNA Based on Least Squares Support Vector Machine." Journal of Bionanoscience 7, no. 1 (1 February 2013): 121–25. http://dx.doi.org/10.1166/jbns.2013.1093.
50

Jiang, Jingqing, Chuyi Song, and Lanying Bao. "Forward Gene Selection Algorithm Based on Least Squares Support Vector Machine." Journal of Bionanoscience 7, no. 3 (1 June 2013): 307–12. http://dx.doi.org/10.1166/jbns.2013.1136.