Journal articles on the topic "Least-squares support vector machine"

Consult the top 50 journal articles for your research on the topic "Least-squares support vector machine".


1

Kitayama, Satoshi, Masao Arakawa, and Koetsu Yamazaki. "1403 Least-Squares Support Vector Machine". Proceedings of Design & Systems Conference 2010.20 (2010): _1403-1_–_1403-5_. http://dx.doi.org/10.1299/jsmedsd.2010.20._1403-1_.

2

Adankon, M. M., M. Cheriet, and A. Biem. "Semisupervised Least Squares Support Vector Machine". IEEE Transactions on Neural Networks 20, no. 12 (December 2009): 1858–70. http://dx.doi.org/10.1109/tnn.2009.2031143.

3

Zheng, Sheng, Yuqiu Sun, Jinwen Tian, and Jain Liu. "Mapped Least Squares Support Vector Machine Regression". International Journal of Pattern Recognition and Artificial Intelligence 19, no. 03 (May 2005): 459–75. http://dx.doi.org/10.1142/s0218001405004058.

Abstract
This paper describes a novel version of regression SVM (Support Vector Machines) that is based on the least-squares error. We show that the solution of this optimization problem can be obtained easily once the inverse of a certain matrix is computed. This matrix, however, depends only on the input vectors, not on the labels. Thus, if many learning problems with the same set of input vectors but different sets of labels have to be solved, it makes sense to compute the inverse of the matrix just once and then use it for computing all subsequent models. The computational complexity of training a regression SVM can then be reduced to O(N²), a single matrix multiplication, and is thus probably faster than known SVM training algorithms whose O(N²) work is done with loops. We describe applications from image processing, where the input points are usually of the form {(x0 + dx, y0 + dy) : |dx| < m, |dy| < n} and every such set of points can be translated to the same set {(dx, dy) : |dx| < m, |dy| < n} by subtracting (x0, y0) from all the vectors. The experimental results demonstrate that the proposed approach is faster than processing each learning problem separately.
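The label-independence highlighted in this abstract follows from the standard LS-SVM regression dual system, whose coefficient matrix is built only from the kernel matrix and the regularization constant. The sketch below (Python/NumPy) illustrates the idea of computing one inverse for a fixed input set and reusing it for several label vectors; the RBF kernel, function names and parameter values are illustrative assumptions, not code from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix between two sets of row vectors.
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_system_inverse(X, gamma=10.0, sigma=1.0):
    # Dual system matrix [[0, 1^T], [1, K + I/gamma]] depends on the inputs only, not the labels.
    n = X.shape[0]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    return np.linalg.inv(A)          # cubic cost, paid once per input set

def lssvm_fit_labels(A_inv, y):
    # Reuse the precomputed inverse: each new label vector costs one matrix-vector product.
    sol = A_inv @ np.concatenate(([0.0], y))
    return sol[0], sol[1:]           # bias b, dual coefficients alpha

def lssvm_predict(X_train, alpha, b, X_test, sigma=1.0):
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b

# Several regression problems sharing the same inputs but different labels.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
A_inv = lssvm_system_inverse(X)
for y in (np.sin(X[:, 0]), np.cos(X[:, 0])):
    b, alpha = lssvm_fit_labels(A_inv, y + 0.05 * rng.standard_normal(200))
    print(lssvm_predict(X, alpha, b, np.array([[0.0]])))
```

Each additional label vector then costs only a matrix-vector product, which is the O(N²) step the abstract refers to.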
4

Hwang, Changha, and Jooyong Shim. "Geographically weighted least squares-support vector machine". Journal of the Korean Data and Information Science Society 28, no. 1 (January 31, 2017): 227–35. http://dx.doi.org/10.7465/jkdi.2017.28.1.227.

5

Choi, Young-Sik. "Least squares one-class support vector machine". Pattern Recognition Letters 30, no. 13 (October 2009): 1236–40. http://dx.doi.org/10.1016/j.patrec.2009.05.007.

6

Huang, Xiaolin, Lei Shi, and Johan A. K. Suykens. "Asymmetric least squares support vector machine classifiers". Computational Statistics & Data Analysis 70 (February 2014): 395–405. http://dx.doi.org/10.1016/j.csda.2013.09.015.

7

Liu, Dalian, Yong Shi, Yingjie Tian, and Xiankai Huang. "Ramp loss least squares support vector machine". Journal of Computational Science 14 (May 2016): 61–68. http://dx.doi.org/10.1016/j.jocs.2016.02.001.

8

van Gestel, Tony, Johan A. K. Suykens, Bart Baesens, Stijn Viaene, Jan Vanthienen, Guido Dedene, Bart de Moor, and Joos Vandewalle. "Benchmarking Least Squares Support Vector Machine Classifiers". Machine Learning 54, no. 1 (January 2004): 5–32. http://dx.doi.org/10.1023/b:mach.0000008082.80494.e0.

9

Zhang, Yong Li, Yan Wei Zhu, Shu Fei Lin, Xiu Juan Sun, Qiu Na Zhang, and Xiao Hong Liu. "Algorithm of Sparse Least Squares Support Vector Machine". Advanced Materials Research 143-144 (October 2010): 1229–33. http://dx.doi.org/10.4028/www.scientific.net/amr.143-144.1229.

Abstract
Support Vector Machines have been widely studied in recent years. The algorithm of the least squares support vector machine is studied and its shortcoming is noted: the resulting solution lacks sparseness. In this paper a greedy algorithm is introduced into the least squares support vector machine so that sparseness is obtained again, and a new algorithm for the sparse least squares support vector machine is given. The new algorithm was applied to the daily monitoring of a sewage treatment plant. Experimental results demonstrate that the improved support vector machine algorithm was successful.
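The abstract above regains sparseness through a greedy selection scheme whose details are not given here. As a generic illustration of how a sparse LS-SVM can be produced, the sketch below uses a simpler pruning heuristic instead: repeatedly retrain on the remaining points and drop those with the smallest |alpha|. The helper names, the RBF kernel choice and all parameter values are assumptions for illustration, not the authors' greedy algorithm.

```python
import numpy as np

def _rbf(X1, X2, sigma):
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-d2 / (2.0 * sigma**2))

def _fit_lssvm(X, y, gamma, sigma):
    # Solve the dense LS-SVM dual system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = _rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def sparse_lssvm_by_pruning(X, y, target_size=30, drop_frac=0.05, gamma=10.0, sigma=1.0):
    # Retrain, drop the points with the smallest |alpha|, and repeat until few enough remain.
    idx = np.arange(len(y))
    while len(idx) > target_size:
        b, alpha = _fit_lssvm(X[idx], y[idx], gamma, sigma)
        n_drop = max(1, int(drop_frac * len(idx)))
        idx = idx[np.argsort(np.abs(alpha))[n_drop:]]
    b, alpha = _fit_lssvm(X[idx], y[idx], gamma, sigma)
    return idx, b, alpha             # idx are the retained points

# Usage: fit a noisy 1-D regression while retaining only 30 training points.
rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, size=(300, 1))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(300)
idx, b, alpha = sparse_lssvm_by_pruning(X, y)
pred = _rbf(np.array([[0.5]]), X[idx], 1.0) @ alpha + b
print(len(idx), pred.item())
```

The points that survive the loop act as the support vectors of the final, sparser model.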
10

Dong, Zengshou, Zhaojing Ren, and You Dong. "Mechanical Fault Recognition Research Based on LMD-LSSVM". Transactions of the Canadian Society for Mechanical Engineering 40, no. 4 (November 2016): 541–49. http://dx.doi.org/10.1139/tcsme-2016-0042.

Abstract
Mechanical fault vibration signals are non-stationary, which causes system instability, and traditional methods have difficulty extracting fault information accurately. This paper proposes a fault identification method based on local mean decomposition and the least squares support vector machine. The article introduces waveform matching to handle the original features of the signals at the endpoints, uses linear interpolation to obtain the local mean and envelope functions, and then obtains the production function (PF) vectors through local mean decomposition. The energy entropy of the PF vectors is taken as the identification input. These vectors are fed into BP neural networks, support vector machines, and least squares support vector machines to identify faults. Experimental results show that the least squares support vector machine achieves higher classification accuracy.
11

Xia, Xiao-Lei, Weidong Jiao, Kang Li, and George Irwin. "A Novel Sparse Least Squares Support Vector Machines". Mathematical Problems in Engineering 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/602341.

Abstract
The solution of a Least Squares Support Vector Machine (LS-SVM) suffers from the problem of nonsparseness. The Forward Least Squares Approximation (FLSA) is a greedy approximation algorithm with a least-squares loss function. This paper proposes a new Support Vector Machine for which the FLSA is the training algorithm—the Forward Least Squares Approximation SVM (FLSA-SVM). A major novelty of this new FLSA-SVM is that the number of support vectors is the regularization parameter for tuning the tradeoff between the generalization ability and the training cost. The FLSA-SVMs can also detect the linear dependencies in vectors of the input Gramian matrix. These attributes together contribute to its extreme sparseness. Experiments on benchmark datasets are presented which show that, compared to various SVM algorithms, the FLSA-SVM is extremely compact, while maintaining a competitive generalization ability.
12

Seok, Kyungha, and Daehyun Cho. "A Study on Support Vectors of Least Squares Support Vector Machine". Communications for Statistical Applications and Methods 10, no. 3 (December 1, 2003): 873–78. http://dx.doi.org/10.5351/ckss.2003.10.3.873.

13

Li, Xiao, Xin Liu, Clyde Zhengdao Li, Zhumin Hu, Geoffrey Qiping Shen, and Zhenyu Huang. "Foundation pit displacement monitoring and prediction using least squares support vector machines based on multi-point measurement". Structural Health Monitoring 18, no. 3 (April 23, 2018): 715–24. http://dx.doi.org/10.1177/1475921718767935.

Abstract
Foundation pit displacement is a critical safety risk for both building structures and human lives. Accurate displacement monitoring and prediction of a deep foundation pit are essential to prevent potential risks at an early construction stage, and machine learning methods are extensively applied for this purpose. However, these approaches, such as support vector machines, have limitations in terms of data processing efficiency and prediction accuracy. As an emerging approach derived from support vector machines, the least squares support vector machine improves data processing efficiency through better use of equality constraints in the least squares loss function. However, the accuracy of this approach relies heavily on a large volume of influencing factors from the measurement of adjacent critical points, which is not normally available during the construction process. To address this issue, this study proposes an improved least squares support vector machine algorithm based on multi-point measuring techniques, namely, the multi-point least squares support vector machine. To evaluate the effectiveness of the proposed multi-point least squares support vector machine approach, a real case study project was selected, and the results illustrate that the multi-point least squares support vector machine approach on average outperformed the single-point least squares support vector machine in terms of prediction accuracy during the foundation pit monitoring and prediction process.
14

Yang Li, Wanmei Tang, and Mingyong Li. "A Least Squares Support Vector Machine Sparseness Algorithm". Journal of Convergence Information Technology 7, no. 13 (July 31, 2012): 240–48. http://dx.doi.org/10.4156/jcit.vol7.issue13.28.

15

Wu, Zong-liang, and Heng Dou. "New sparse least squares support vector machine algorithm". Journal of Computer Applications 29, no. 6 (December 4, 2009): 1559–62. http://dx.doi.org/10.3724/sp.j.1087.2009.01559.

16

Farrokh, Mojtaba. "Hysteresis Simulation Using Least-Squares Support Vector Machine". Journal of Engineering Mechanics 144, no. 9 (September 2018): 04018084. http://dx.doi.org/10.1061/(asce)em.1943-7889.0001509.

17

Xing, Hong-Jie, and Li-Fei Li. "Robust least squares one-class support vector machine". Pattern Recognition Letters 138 (October 2020): 571–78. http://dx.doi.org/10.1016/j.patrec.2020.09.005.

18

Pei, Huimin, Kuaini Wang, and Ping Zhong. "Semi-supervised matrixized least squares support vector machine". Applied Soft Computing 61 (December 2017): 72–87. http://dx.doi.org/10.1016/j.asoc.2017.07.040.

19

Li, Yuangui, Chen Lin, and Weidong Zhang. "Improved sparse least-squares support vector machine classifiers". Neurocomputing 69, no. 13-15 (August 2006): 1655–58. http://dx.doi.org/10.1016/j.neucom.2006.03.001.

20

Zhang, Ruiting, and Zhijian Zhou. "A Fuzzy Least Squares Support Tensor Machines in Machine Learning". International Journal of Emerging Technologies in Learning (iJET) 10, no. 8 (December 14, 2015): 4. http://dx.doi.org/10.3991/ijet.v10i8.5203.

Abstract
In the machine learning field, high-dimensional data are often encountered in real applications. Most traditional learning algorithms are based on the vector space model, such as SVM. Tensor representation is useful against the overfitting problem in vector-based learning, and tensor-based algorithms require a smaller set of decision variables compared to vector-based approaches. We also require that meaningful training points be classified correctly, while not caring whether noise-like training points are classified correctly. To utilize the structural information present in the high-dimensional features of an object, a tensor-based learning framework, termed the Fuzzy Least Squares Support Tensor Machine (FLSSTM), is proposed, in which the classifier is obtained by solving a system of linear equations rather than a quadratic programming problem at each iteration, in contrast to the STM algorithm. This in turn provides a significant reduction in computation time as well as comparable classification accuracy. The efficacy of the proposed method has been demonstrated on the ORL and Yale databases. The FLSSTM outperforms other tensor-based algorithms, for example LSSTM, especially when the training size is small.
21

Wornyo, Dickson Keddy, and Xiang-Jun Shen. "Coupled Least Squares Support Vector Ensemble Machines". Information 10, no. 6 (June 3, 2019): 195. http://dx.doi.org/10.3390/info10060195.

Abstract
The least squares support vector method is a popular data-driven modeling method which shows good performance and has been successfully applied in a wide range of applications. In this paper, we propose a novel coupled least squares support vector ensemble machine (C-LSSVEM). The proposed coupling ensemble helps improve robustness and produces better classification performance than the single-model approach. The proposed C-LSSVEM can choose appropriate kernel types and their parameters in a good coupling strategy, with a set of classifiers being trained simultaneously. The proposed method can further minimize the total loss of ensembles in kernel space. Thus, we form an ensemble regressor by co-optimizing and weighting base regressors. Experiments conducted on several datasets, such as artificial datasets, UCI classification datasets, UCI regression datasets, handwritten digits datasets and the NWPU-RESISC45 dataset, indicate that C-LSSVEM performs better in achieving the minimal regression loss and the best classification accuracy relative to selected state-of-the-art regression and classification techniques.
22

Suykens, J. A. K., and J. Vandewalle. "Recurrent least squares support vector machines". IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications 47, no. 7 (July 2000): 1109–14. http://dx.doi.org/10.1109/81.855471.

23

Li, Jianping, Zhenyu Chen, Liwei Wei, Weixuan Xu, and Gang Kou. "Feature Selection via Least Squares Support Feature Machine". International Journal of Information Technology & Decision Making 06, no. 04 (December 2007): 671–86. http://dx.doi.org/10.1142/s0219622007002733.

Abstract
In many applications, such as credit risk management, data are represented as high-dimensional feature vectors. This makes feature selection necessary in order to reduce the computational complexity and improve the generalization ability and interpretability. In this paper, we present a novel feature selection method, the "Least Squares Support Feature Machine" (LS-SFM). The proposed method has two advantages compared with the conventional Support Vector Machine (SVM) and LS-SVM. First, convex combinations of basic kernels are used as the kernel, and each basic kernel makes use of a single feature. This transforms the feature selection problem, which cannot be solved in the context of SVM, into an ordinary multiple-parameter learning problem. Second, all parameters are learned by a two-stage iterative algorithm. A 1-norm based regularized cost function is used to enforce sparseness of the feature parameters. The "support features" are the features with nonzero feature parameters. An experimental study on some of the UCI datasets and a commercial credit card dataset demonstrates the effectiveness and efficiency of the proposed approach.
24

Liang, Si Yang, and Jian Hong Lv. "Least Squares Support Vector Machine for Fault Diagnosis Optimization". Applied Mechanics and Materials 347-350 (August 2013): 505–8. http://dx.doi.org/10.4028/www.scientific.net/amm.347-350.505.

Abstract
In order to improve the diagnostic accuracy for digital circuits, a fault diagnosis method based on support vector machines (SVM) is proposed. The input is the fault characteristics of the digital circuit and the output is the fault type, so the connection between fault characteristics and fault type is established. The network is trained with a least squares learning algorithm; the training sample data are generated by simulation, and the test sample data are generated by simulations not used for training. The method achieves the classification of faulty digital circuits, and the results show that it is fast and highly accurate.
25

Hwang, Changha, Sang-Il Choi, and Jooyong Shim. "Deep multiple kernel least squares support vector regression machine". Journal of the Korean Data and Information Science Society 29, no. 4 (July 31, 2018): 895–902. http://dx.doi.org/10.7465/jkdi.2018.29.4.895.

26

Saranya, N. "Sentimental Analysis Using Least Squares Twin Support Vector Machine". International Journal of Advanced Research in Computer Science 8, no. 7 (August 20, 2017): 860–66. http://dx.doi.org/10.26483/ijarcs.v8i7.4527.

27

Nasiri, Jalal A., Nasrollah Moghadam Charkari, and Saeed Jalili. "Least squares twin multi-class classification support vector machine". Pattern Recognition 48, no. 3 (March 2015): 984–92. http://dx.doi.org/10.1016/j.patcog.2014.09.020.

28

Liu, Kun, and Bing-Yu Sun. "Least Squares Support Vector Machine Regression with Equality Constraints". Physics Procedia 24 (2012): 2227–30. http://dx.doi.org/10.1016/j.phpro.2012.02.327.

29

Licheng Jiao, Liefeng Bo, and Ling Wang. "Fast Sparse Approximation for Least Squares Support Vector Machine". IEEE Transactions on Neural Networks 18, no. 3 (May 2007): 685–97. http://dx.doi.org/10.1109/tnn.2006.889500.

30

Shim, Joo-Yong, Jong-Sig Bae, and Chang-Ha Hwang. "Multiclass Classification via Least Squares Support Vector Machine Regression". Communications for Statistical Applications and Methods 15, no. 3 (May 30, 2008): 441–50. http://dx.doi.org/10.5351/ckss.2008.15.3.441.

31

Ziggah, Yao Yevenyo, Youjina Hu, Yakubu Issaka, and Prosper Basommi Laari. "Least Squares Support Vector Machine Model for Coordinate Transformation". Geodesy and Cartography 45, no. 5 (April 17, 2019): 16–27. http://dx.doi.org/10.3846/gac.2019.6053.

Abstract
In coordinate transformation, the main purpose is to provide a mathematical relationship between coordinates related to different geodetic reference frames. This gives geospatial professionals the opportunity to link different datums together. A review of previous studies indicates that empirical and soft computing models have been proposed in recent times for coordinate transformation. The main aim of this study is to present the applicability and performance of the Least Squares Support Vector Machine (LS-SVM), which is an extension of the Support Vector Machine (SVM), for coordinate transformation. For comparison purposes, the SVM and the widely used Backpropagation Neural Network (BPNN), Radial Basis Function Neural Network (RBFNN), 2D conformal and affine methods were also employed. To assess how well the transformation results fit the observed data, the root mean square of the residual horizontal distances and the standard deviation were used. From the results obtained, the LS-SVM and RBFNN had comparable results and were better than the other methods. The overall statistical findings produced by LS-SVM met the accuracy requirements for cadastral surveying applications in Ghana. The proposed LS-SVM therefore possesses promising predictive capabilities and could efficiently be used as a supplementary technique for coordinate transformation.
32

Khemchandani, Reshma, and Aman Pal. "Multi-category Laplacian least squares twin support vector machine". Applied Intelligence 45, no. 2 (March 23, 2016): 458–74. http://dx.doi.org/10.1007/s10489-016-0770-6.

33

Xie, Xijiong, Feixiang Sun, Jiangbo Qian, Lijun Guo, Rong Zhang, Xulun Ye, and Zhijin Wang. "Laplacian Lp norm least squares twin support vector machine". Pattern Recognition 136 (April 2023): 109192. http://dx.doi.org/10.1016/j.patcog.2022.109192.

34

Liu, Yisen, Songbin Zhou, Weixin Liu, Xinhui Yang, and Jun Luo. "Least-squares support vector machine and successive projection algorithm for quantitative analysis of cotton-polyester textile by near infrared spectroscopy". Journal of Near Infrared Spectroscopy 26, no. 1 (February 2018): 34–43. http://dx.doi.org/10.1177/0967033518757069.

Abstract
The application of near infrared spectroscopy for quantitative analysis of cotton-polyester textile was investigated in the present work. A total of 214 cotton-polyester fabric samples, covering the range from 0% to 100% cotton, were measured and analyzed. Partial least squares and least-squares support vector machine models with all variables as input data were established. Furthermore, the successive projection algorithm was used to select effective wavelengths and establish successive projection algorithm-least-squares support vector machine models, in comparison with two other effective wavelength selection methods: loading weights analysis and regression coefficient analysis. The calibration and validation results show that the successive projection algorithm-least-squares support vector machine model outperformed not only the partial least squares and least-squares support vector machine models with all variables as inputs, but also the least-squares support vector machine models with loading weights analysis and regression coefficient analysis wavelength selection. The root mean squared error of calibration and root mean squared error of prediction values of the successive projection algorithm-least-squares support vector machine regression model with the optimal performance were 0.77% and 1.17%, respectively. The overall results demonstrate that near infrared spectroscopy combined with the least-squares support vector machine and the successive projection algorithm could provide a simple, rapid, economical and non-destructive method for determining the composition of cotton-polyester textiles.
35

Chen, Zhan-bo. "Research on Application of Regression Least Squares Support Vector Machine on Performance Prediction of Hydraulic Excavator". Journal of Control Science and Engineering 2014 (2014): 1–4. http://dx.doi.org/10.1155/2014/686130.

Abstract
In order to improve the performance prediction accuracy for hydraulic excavators, the regression least squares support vector machine is applied. First, the mathematical model of the regression least squares support vector machine is studied, and then the algorithm of the regression least squares support vector machine is designed. Finally, a performance prediction simulation of a hydraulic excavator based on the regression least squares support vector machine is carried out, and the simulation results show that this method can correctly predict how the performance of the hydraulic excavator changes.
36

Lu, Yan, and Zhiping Huang. "A new hybrid model of sparsity empirical wavelet transform and adaptive dynamic least squares support vector machine for fault diagnosis of gear pump". Advances in Mechanical Engineering 12, no. 5 (May 2020): 168781402092204. http://dx.doi.org/10.1177/1687814020922047.

Abstract
The gear pump is a key component of the hydraulic drive system, so fault diagnosis for gear pumps is very significant. The combination of the sparsity empirical wavelet transform and the adaptive dynamic least squares support vector machine is proposed for fault diagnosis of the gear pump in this article. The sparsity empirical wavelet transform is used to obtain the features of the vibration signal of the gear pump, the sparsity function has the potential to make the empirical wavelet transform adaptive, and the adaptive dynamic least squares support vector machine is used to recognize the state of the gear pump. The experimental results show that the diagnosis accuracy of the sparsity empirical wavelet transform and adaptive dynamic least squares support vector machine is better than that of the empirical wavelet transform and adaptive dynamic least squares support vector machine method or the empirical wavelet transform and least squares support vector machine method.
37

Guan, Qiong, Han Qing Tao, and Bin Huang. "The Computer Interlocking Software System Reliability Test Based on the Monte Carlo". Applied Mechanics and Materials 614 (September 2014): 397–400. http://dx.doi.org/10.4028/www.scientific.net/amm.614.397.

Abstract
Railway switch failure prediction plays an important role in the maintenance of railway signal equipment. The paper puts forward a railway switch failure prediction algorithm based on the least squares support vector machine and chooses five characteristic indexes to compose the characteristic input vectors of the failure prediction model. This reduces the dimension of the input vectors, shortens the training time of the least squares support vector machine, and uses a pruning algorithm to accelerate computation while maintaining good regression performance. The experiments prove that the railway switch failure prediction algorithm based on the least squares support vector machine has strong self-learning ability and higher prediction accuracy, and that it accelerates switch failure prediction and improves its accuracy and reliability.
38

Liu, Xuanyu, and Kaiju Zhang. "Earth pressure prediction in sealed chamber of shield machine based on parallel least squares support vector machine optimized by cooperative particle swarm optimization". Measurement and Control 52, no. 7-8 (May 10, 2019): 758–64. http://dx.doi.org/10.1177/0020294019840720.

Abstract
Earth pressure in the sealed chamber is affected by multi-system and multi-field coupling during the shield tunneling process, so it is difficult to establish a mechanism-based earth pressure control model. Therefore, a data-driven modeling method for earth pressure in the sealed chamber is proposed, based on a parallel least squares support vector machine optimized by a parallel cooperative particle swarm (parallel cooperative particle swarm optimization-partial least squares support vector machine). The vectors are first studied in parallel according to different hierarchies, then the initial classifiers are updated using a cross-feedback method to retrain the vectors, and finally the vectors are merged to obtain the support vectors. The parameters of the least squares support vector machine are optimized by parallel cooperative particle swarm optimization so as to predict quickly on large-scale data. Finally, a simulation experiment is carried out based on in-situ measured data, and the results show that the method has high computing efficiency and prediction accuracy. The method has guiding significance for engineering applications.
39

Mahmoud, Tarek. "Adaptive control scheme based on the least squares support vector machine network". International Journal of Applied Mathematics and Computer Science 21, no. 4 (December 1, 2011): 685–96. http://dx.doi.org/10.2478/v10006-011-0054-6.

Abstract
Recently, a new type of neural network called Least Squares Support Vector Machines (LS-SVMs) has been receiving increasing attention in nonlinear system identification and control due to its generalization performance. This paper develops a stable adaptive control scheme using the LS-SVM network. The developed control scheme includes two parts: the identification part, which uses a modified structure of LS-SVM neural networks called the multi-resolution wavelet least squares support vector machine network (MRWLS-SVM) as a predictor model, and the controller part, which is developed to track a reference trajectory. By means of the Lyapunov stability criterion, a stability analysis for the tracking errors is performed. Finally, simulation studies are performed to demonstrate the capability of the developed approach in controlling a pH process.
40

Ren, Yuan, and Guang Chen Bai. "Colonial Competitive Algorithm Assisted Least Squares Support Vector Machines". Advanced Materials Research 255-260 (May 2011): 2082–86. http://dx.doi.org/10.4028/www.scientific.net/amr.255-260.2082.

Abstract
The use of the least squares support vector machine (LSSVM), a novel machine learning method, for classification and function approximation has increased over the past few years, especially due to its high generalization performance. However, LSSVM is plagued by the drawback that the hyper-parameters, which largely determine the quality of LSSVM models, have to be defined by the user; this increases the difficulty of applying LSSVM and limits its use on academic and industrial platforms. In this paper we present a novel method of automatically tuning the hyper-parameters of LSSVM based on the colonial competitive algorithm (CCA), a newly developed evolutionary algorithm inspired by the imperialistic competition mechanism. To show the efficacy of the CCA-assisted LSSVM methodology, we have tested it on several benchmark examples. The study suggests that the proposed paradigm can be a competitive and powerful tool for classification and function approximation.
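Hyper-parameter tuning of the kind described above is independent of the particular search heuristic. As a rough stand-in for the colonial competitive algorithm (not reproduced here), the sketch below tunes the two usual RBF hyper-parameters by randomized search with cross-validation, using scikit-learn's KernelRidge as a readily available, bias-free relative of LS-SVM regression; the toy data, parameter ranges and iteration count are illustrative assumptions.

```python
import numpy as np
from scipy.stats import loguniform
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import RandomizedSearchCV

# Toy regression data; any dataset with the same shape conventions would do.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(400, 2))
y = np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.1 * rng.standard_normal(400)

# alpha plays the role of the LS-SVM regularization constant (roughly 1/gamma),
# gamma is the RBF kernel width parameter; both are searched on a log scale.
search = RandomizedSearchCV(
    KernelRidge(kernel="rbf"),
    param_distributions={"alpha": loguniform(1e-4, 1e1),
                         "gamma": loguniform(1e-3, 1e1)},
    n_iter=40,
    cv=5,
    scoring="neg_mean_squared_error",
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

Any population-based optimizer, CCA included, could replace the randomized sampling while keeping the same cross-validated objective.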
41

Cawley, Gavin C., and Nicola L. C. Talbot. "Improved sparse least-squares support vector machines". Neurocomputing 48, no. 1-4 (October 2002): 1025–31. http://dx.doi.org/10.1016/s0925-2312(02)00606-9.

42

Sartakhti, Javad Salimi, Homayun Afrabandpey, and Nasser Ghadiri. "Fuzzy least squares twin support vector machines". Engineering Applications of Artificial Intelligence 85 (October 2019): 402–9. http://dx.doi.org/10.1016/j.engappai.2019.06.018.

43

Xu, Shuo, Xin An, Xiaodong Qiao, and Lijun Zhu. "Multi-task least-squares support vector machines". Multimedia Tools and Applications 71, no. 2 (May 30, 2013): 699–715. http://dx.doi.org/10.1007/s11042-013-1526-5.

44

Wang, Liejun, Taiyi Zhang, and Yatong Zhou. "Multi-Resolution Least Squares Support Vector Machines". Journal of Electronics (China) 24, no. 5 (September 2007): 701–4. http://dx.doi.org/10.1007/s11767-006-0270-7.

45

Van Belle, V., K. Pelckmans, J. A. K. Suykens, and S. Van Huffel. "Additive survival least-squares support vector machines". Statistics in Medicine 29, no. 2 (December 18, 2009): 296–308. http://dx.doi.org/10.1002/sim.3743.

46

Yang, Chaoyu, Jie Yang, and Jun Ma. "Sparse Least Squares Support Vector Machine With Adaptive Kernel Parameters". International Journal of Computational Intelligence Systems 13, no. 1 (2020): 212. http://dx.doi.org/10.2991/ijcis.d.200205.001.

47

Yu Kaijun. "A Least Squares Support Vector Machine Classifier for Information Retrieval". Journal of Convergence Information Technology 8, no. 2 (January 31, 2013): 177–83. http://dx.doi.org/10.4156/jcit.vol8.issue2.22.

48

Wu, Zong-liang, and Heng Dou. "Generalized least squares support-vector-machine algorithm and its application". Journal of Computer Applications 29, no. 3 (May 6, 2009): 877–79. http://dx.doi.org/10.3724/sp.j.1087.2009.00877.

49

Xing, Chong, Liming Wan, Jiaxin Wang, and Yanchun Liang. "Prediction of ncRNA Based on Least Squares Support Vector Machine". Journal of Bionanoscience 7, no. 1 (February 1, 2013): 121–25. http://dx.doi.org/10.1166/jbns.2013.1093.

50

Jiang, Jingqing, Chuyisong, and Lanying Bao. "Forward Gene Selection Algorithm Based on Least Squares Support Vector Machine". Journal of Bionanoscience 7, no. 3 (June 1, 2013): 307–12. http://dx.doi.org/10.1166/jbns.2013.1136.
