Journal articles on the topic 'Least-squares support vector machine'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Least-squares support vector machine.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

KITAYAMA, Satoshi, Masao ARAKAWA, and Koetsu YAMAZAKI. "1403 Least-Squares Support Vector Machine." Proceedings of Design & Systems Conference 2010.20 (2010): 1403-1–1403-5. http://dx.doi.org/10.1299/jsmedsd.2010.20._1403-1_.

2

Adankon, M. M., M. Cheriet, and A. Biem. "Semisupervised Least Squares Support Vector Machine." IEEE Transactions on Neural Networks 20, no. 12 (December 2009): 1858–70. http://dx.doi.org/10.1109/tnn.2009.2031143.

3

ZHENG, SHENG, YUQIU SUN, JINWEN TIAN, and JAIN LIU. "MAPPED LEAST SQUARES SUPPORT VECTOR MACHINE REGRESSION." International Journal of Pattern Recognition and Artificial Intelligence 19, no. 03 (May 2005): 459–75. http://dx.doi.org/10.1142/s0218001405004058.

Abstract:
This paper describes a novel version of regression SVM (Support Vector Machines) that is based on the least-squares error. We show that the solution of this optimization problem can be obtained easily once the inverse of a certain matrix is computed. This matrix depends only on the input vectors, not on the labels. Thus, if many learning problems with the same set of input vectors but different sets of labels have to be solved, it makes sense to compute the inverse of the matrix just once and then use it for computing all subsequent models. The computational complexity of training a regression SVM is then reduced to O(N²), just a matrix multiplication, and thus probably faster than known SVM training algorithms that require O(N²) work with loops. We describe applications from image processing, where the input points are usually of the form {(x0 + dx, y0 + dy) : |dx| < m, |dy| < n}; each such set of points can be translated to the same set {(dx, dy) : |dx| < m, |dy| < n} by subtracting (x0, y0) from all the vectors. The experimental results demonstrate that the proposed approach is faster than processing each learning problem separately.
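The linear-system view of LS-SVM regression described in this abstract can be sketched in a few lines of NumPy. This is a minimal illustration under assumed choices (an RBF kernel, γ = 10, σ = 1), not the authors' code:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_system_inverse(X, gamma=10.0, sigma=1.0):
    # The LS-SVM dual is the linear system
    #   [0   1^T        ] [b    ]   [0]
    #   [1   K + I/gamma] [alpha] = [y]
    # whose matrix depends only on the inputs X, never on the labels y.
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    return np.linalg.inv(A), K

def lssvm_fit(A_inv, y):
    # With the inverse precomputed, each new label set costs one
    # matrix-vector product, i.e. O(N^2), as the abstract argues.
    sol = A_inv @ np.concatenate(([0.0], y))
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
A_inv, K = lssvm_system_inverse(X)  # invert once ...
for y in (np.sin(X[:, 0]), np.cos(X[:, 1])):
    b, alpha = lssvm_fit(A_inv, y)  # ... reuse for every label set
    pred = K @ alpha + b            # in-sample predictions
```

The reuse pattern in the final loop is the abstract's point: inversion is paid once, and each additional label set (e.g. each pixel neighbourhood in the image-processing application) is a single multiplication.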
4

Hwang, Changha, and Jooyong Shim. "Geographically weighted least squares-support vector machine." Journal of the Korean Data and Information Science Society 28, no. 1 (January 31, 2017): 227–35. http://dx.doi.org/10.7465/jkdi.2017.28.1.227.

5

Choi, Young-Sik. "Least squares one-class support vector machine." Pattern Recognition Letters 30, no. 13 (October 2009): 1236–40. http://dx.doi.org/10.1016/j.patrec.2009.05.007.

6

Huang, Xiaolin, Lei Shi, and Johan A. K. Suykens. "Asymmetric least squares support vector machine classifiers." Computational Statistics & Data Analysis 70 (February 2014): 395–405. http://dx.doi.org/10.1016/j.csda.2013.09.015.

7

Liu, Dalian, Yong Shi, Yingjie Tian, and Xiankai Huang. "Ramp loss least squares support vector machine." Journal of Computational Science 14 (May 2016): 61–68. http://dx.doi.org/10.1016/j.jocs.2016.02.001.

8

van Gestel, Tony, Johan A. K. Suykens, Bart Baesens, Stijn Viaene, Jan Vanthienen, Guido Dedene, Bart de Moor, and Joos Vandewalle. "Benchmarking Least Squares Support Vector Machine Classifiers." Machine Learning 54, no. 1 (January 2004): 5–32. http://dx.doi.org/10.1023/b:mach.0000008082.80494.e0.

9

Zhang, Yong Li, Yan Wei Zhu, Shu Fei Lin, Xiu Juan Sun, Qiu Na Zhang, and Xiao Hong Liu. "Algorithm of Sparse Least Squares Support Vector Machine." Advanced Materials Research 143-144 (October 2010): 1229–33. http://dx.doi.org/10.4028/www.scientific.net/amr.143-144.1229.

Abstract:
Support Vector Machines have been widely studied in recent years. This paper studies the algorithm of the least squares support vector machine and its main shortcoming: the resulting solution lacks sparseness. A greedy algorithm is introduced into the least squares support vector machine so that sparseness is regained, yielding a new algorithm for a sparse least squares support vector machine. The new algorithm was applied to daily monitoring at a sewage treatment plant, and experimental results demonstrate that the improved algorithm is effective.
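The sparseness problem this abstract describes is easy to prototype. The abstract does not specify its greedy criterion, so the sketch below uses a common heuristic instead: repeatedly retrain and discard the points with the smallest |α|. All function names, the RBF kernel, and the parameters are assumptions:

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Solve the standard LS-SVM dual linear system for (bias, alpha).
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def prune_lssvm(X, y, keep_frac=0.5, steps=3):
    # Every LS-SVM training point gets a nonzero alpha, so sparseness
    # must be imposed: retrain, keep only the largest-|alpha| points.
    idx = np.arange(len(y))
    for _ in range(steps):
        _, alpha = lssvm_train(X[idx], y[idx])
        k = max(1, int(len(idx) * keep_frac))
        idx = idx[np.sort(np.argsort(np.abs(alpha))[-k:])]
    b, alpha = lssvm_train(X[idx], y[idx])
    return idx, b, alpha

rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(40, 1))
y = np.sin(3.0 * X[:, 0])
idx, b, alpha = prune_lssvm(X, y)   # 40 -> 20 -> 10 -> 5 support vectors
```

Only the retained indices `idx` and their coefficients are needed at prediction time, which is what "sparse" buys over the dense LS-SVM solution.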
10

Dong, Zengshou, Zhaojing Ren, and You Dong. "MECHANICAL FAULT RECOGNITION RESEARCH BASED ON LMD-LSSVM." Transactions of the Canadian Society for Mechanical Engineering 40, no. 4 (November 2016): 541–49. http://dx.doi.org/10.1139/tcsme-2016-0042.

Abstract:
Mechanical fault vibration signals are non-stationary, which causes system instability. Because traditional methods have difficulty extracting fault information accurately, this paper proposes a fault identification method combining local mean decomposition with a least squares support vector machine. The article introduces waveform matching to handle the original signal features at the endpoints, uses linear interpolation to obtain the local mean and envelope functions, and then derives the production function (PF) vector through local mean decomposition. The energy entropy of the PF vector is taken as the identification input. These vectors are fed into BP neural networks, support vector machines, and least squares support vector machines to identify faults. Experimental results show that the least squares support vector machine achieves the highest classification accuracy.
11

Xia, Xiao-Lei, Weidong Jiao, Kang Li, and George Irwin. "A Novel Sparse Least Squares Support Vector Machines." Mathematical Problems in Engineering 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/602341.

Abstract:
The solution of a Least Squares Support Vector Machine (LS-SVM) suffers from the problem of nonsparseness. The Forward Least Squares Approximation (FLSA) is a greedy approximation algorithm with a least-squares loss function. This paper proposes a new Support Vector Machine for which the FLSA is the training algorithm—the Forward Least Squares Approximation SVM (FLSA-SVM). A major novelty of this new FLSA-SVM is that the number of support vectors is the regularization parameter for tuning the tradeoff between the generalization ability and the training cost. The FLSA-SVMs can also detect the linear dependencies in vectors of the input Gramian matrix. These attributes together contribute to its extreme sparseness. Experiments on benchmark datasets are presented which show that, compared to various SVM algorithms, the FLSA-SVM is extremely compact, while maintaining a competitive generalization ability.
12

Seok, Kyungha, and Daehyun Cho. "A Study on Support Vectors of Least Squares Support Vector Machine." Communications for Statistical Applications and Methods 10, no. 3 (December 1, 2003): 873–78. http://dx.doi.org/10.5351/ckss.2003.10.3.873.

13

Li, Xiao, Xin Liu, Clyde Zhengdao Li, Zhumin Hu, Geoffrey Qiping Shen, and Zhenyu Huang. "Foundation pit displacement monitoring and prediction using least squares support vector machines based on multi-point measurement." Structural Health Monitoring 18, no. 3 (April 23, 2018): 715–24. http://dx.doi.org/10.1177/1475921718767935.

Abstract:
Foundation pit displacement is a critical safety risk for both building structures and people's lives. Accurate displacement monitoring and prediction of a deep foundation pit are essential to prevent potential risks at an early construction stage. To achieve accurate prediction, machine learning methods are extensively applied to fulfill this purpose. However, these approaches, such as support vector machines, have limitations in terms of data processing efficiency and prediction accuracy. As an emerging approach derived from support vector machines, the least squares support vector machine improves data processing efficiency through better use of equality constraints in the least squares loss function. However, the accuracy of this approach relies heavily on a large volume of influencing factors from the measurement of adjacent critical points, which are not normally available during the construction process. To address this issue, this study proposes an improved least squares support vector machine algorithm based on multi-point measuring techniques, namely, the multi-point least squares support vector machine. To evaluate the effectiveness of the proposed approach, a real case study project was selected, and the results illustrate that the multi-point least squares support vector machine on average outperformed the single-point least squares support vector machine in terms of prediction accuracy during foundation pit monitoring and prediction.
14

Li, Yang, Wanmei Tang, and Mingyong Li. "A Least Squares Support Vector Machine Sparseness Algorithm." Journal of Convergence Information Technology 7, no. 13 (July 31, 2012): 240–48. http://dx.doi.org/10.4156/jcit.vol7.issue13.28.

15

WU, Zong-liang, and Heng DOU. "New sparse least squares support vector machine algorithm." Journal of Computer Applications 29, no. 6 (December 4, 2009): 1559–62. http://dx.doi.org/10.3724/sp.j.1087.2009.01559.

16

Farrokh, Mojtaba. "Hysteresis Simulation Using Least-Squares Support Vector Machine." Journal of Engineering Mechanics 144, no. 9 (September 2018): 04018084. http://dx.doi.org/10.1061/(asce)em.1943-7889.0001509.

17

Xing, Hong-Jie, and Li-Fei Li. "Robust least squares one-class support vector machine." Pattern Recognition Letters 138 (October 2020): 571–78. http://dx.doi.org/10.1016/j.patrec.2020.09.005.

18

Pei, Huimin, Kuaini Wang, and Ping Zhong. "Semi-supervised matrixized least squares support vector machine." Applied Soft Computing 61 (December 2017): 72–87. http://dx.doi.org/10.1016/j.asoc.2017.07.040.

19

Li, Yuangui, Chen Lin, and Weidong Zhang. "Improved sparse least-squares support vector machine classifiers." Neurocomputing 69, no. 13-15 (August 2006): 1655–58. http://dx.doi.org/10.1016/j.neucom.2006.03.001.

20

Zhang, Ruiting, and Zhijian Zhou. "A Fuzzy Least Squares Support Tensor Machines in Machine Learning." International Journal of Emerging Technologies in Learning (iJET) 10, no. 8 (December 14, 2015): 4. http://dx.doi.org/10.3991/ijet.v10i8.5203.

Abstract:
In the machine learning field, high-dimensional data are often encountered in real applications. Most traditional learning algorithms, such as SVM, are based on the vector space model. Tensor representation helps with the overfitting problem in vector-based learning, and a tensor-based algorithm requires a smaller set of decision variables than vector-based approaches. We also require that meaningful training points be classified correctly, without caring whether noise-like training points are classified correctly. To utilize the structural information present in the high-dimensional features of an object, we propose a tensor-based learning framework, termed the Fuzzy Least Squares Support Tensor Machine (FLSSTM), in which the classifier is obtained by solving a system of linear equations rather than a quadratic programming problem at each iteration, in contrast to the STM algorithm. This in turn provides a significant reduction in computation time, with comparable classification accuracy. The efficacy of the proposed method has been demonstrated on the ORL and Yale databases. The FLSSTM outperforms other tensor-based algorithms, for example LSSTM, especially when the training size is small.
21

Wornyo, Dickson Keddy, and Xiang-Jun Shen. "Coupled Least Squares Support Vector Ensemble Machines." Information 10, no. 6 (June 3, 2019): 195. http://dx.doi.org/10.3390/info10060195.

Abstract:
The least squares support vector method is a popular data-driven modeling method which shows good performance and has been successfully applied in a wide range of applications. In this paper, we propose a novel coupled least squares support vector ensemble machine (C-LSSVEM). The proposed coupling ensemble helps improve robustness and produces better classification performance than a single-model approach. The proposed C-LSSVEM can choose appropriate kernel types and their parameters in a good coupling strategy, with a set of classifiers trained simultaneously. The proposed method further minimizes the total loss of the ensemble in kernel space. Thus, we form an ensemble regressor by co-optimizing and weighting base regressors. Experiments conducted on several datasets, including artificial datasets, UCI classification datasets, UCI regression datasets, handwritten digit datasets and the NWPU-RESISC45 dataset, indicate that C-LSSVEM performs better in achieving minimal regression loss and the best classification accuracy relative to selected state-of-the-art regression and classification techniques.
22

Suykens, J. A. K., and J. Vandewalle. "Recurrent least squares support vector machines." IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications 47, no. 7 (July 2000): 1109–14. http://dx.doi.org/10.1109/81.855471.

23

LI, JIANPING, ZHENYU CHEN, LIWEI WEI, WEIXUAN XU, and GANG KOU. "FEATURE SELECTION VIA LEAST SQUARES SUPPORT FEATURE MACHINE." International Journal of Information Technology & Decision Making 06, no. 04 (December 2007): 671–86. http://dx.doi.org/10.1142/s0219622007002733.

Abstract:
In many applications, such as credit risk management, data are represented as high-dimensional feature vectors. This makes feature selection necessary to reduce computational complexity and to improve generalization ability and interpretability. In this paper, we present a novel feature selection method, the "Least Squares Support Feature Machine" (LS-SFM). The proposed method has two advantages compared with conventional Support Vector Machines (SVM) and LS-SVM. First, convex combinations of basic kernels are used as the kernel, and each basic kernel makes use of a single feature. This transforms the feature selection problem, which cannot be solved in the context of SVM, into an ordinary multiple-parameter learning problem. Second, all parameters are learned by a two-stage iterative algorithm. A 1-norm based regularized cost function is used to enforce sparseness of the feature parameters. The "support features" are the features with nonzero feature parameters. Experimental study on several UCI datasets and a commercial credit card dataset demonstrates the effectiveness and efficiency of the proposed approach.
24

Liang, Si Yang, and Jian Hong Lv. "Least Squares Support Vector Machine for Fault Diagnosis Optimization." Applied Mechanics and Materials 347-350 (August 2013): 505–8. http://dx.doi.org/10.4028/www.scientific.net/amm.347-350.505.

Abstract:
In order to improve the diagnostic accuracy of digital circuits, a fault diagnosis method based on support vector machines (SVM) is proposed. The input is the fault characteristics of the digital circuit; the output is the fault type, and a mapping between fault characteristics and fault types is established. The network learns with a least squares algorithm; the training sample data are generated by simulation, and the test sample data come from untrained simulations. The method achieves classification of faulty digital circuits, and the results show that it is fast and highly accurate.
25

Hwang, Changha, Sang-Il Choi, and Jooyong Shim. "Deep multiple kernel least squares support vector regression machine." Journal of the Korean Data and Information Science Society 29, no. 4 (July 31, 2018): 895–902. http://dx.doi.org/10.7465/jkdi.2018.29.4.895.

26

Saranya, N. "SENTIMENTAL ANALYSIS USING LEAST SQUARES TWIN SUPPORT VECTOR MACHINE." International Journal of Advanced Research in Computer Science 8, no. 7 (August 20, 2017): 860–66. http://dx.doi.org/10.26483/ijarcs.v8i7.4527.

27

Nasiri, Jalal A., Nasrollah Moghadam Charkari, and Saeed Jalili. "Least squares twin multi-class classification support vector machine." Pattern Recognition 48, no. 3 (March 2015): 984–92. http://dx.doi.org/10.1016/j.patcog.2014.09.020.

28

Liu, Kun, and Bing-Yu Sun. "Least Squares Support Vector Machine Regression with Equality Constraints." Physics Procedia 24 (2012): 2227–30. http://dx.doi.org/10.1016/j.phpro.2012.02.327.

29

Jiao, Licheng, Liefeng Bo, and Ling Wang. "Fast Sparse Approximation for Least Squares Support Vector Machine." IEEE Transactions on Neural Networks 18, no. 3 (May 2007): 685–97. http://dx.doi.org/10.1109/tnn.2006.889500.

30

Shim, Joo-Yong, Jong-Sig Bae, and Chang-Ha Hwang. "Multiclass Classification via Least Squares Support Vector Machine Regression." Communications for Statistical Applications and Methods 15, no. 3 (May 30, 2008): 441–50. http://dx.doi.org/10.5351/ckss.2008.15.3.441.

31

Ziggah, Yao Yevenyo, Youjina Hu, Yakubu Issaka, and Prosper Basommi Laari. "LEAST SQUARES SUPPORT VECTOR MACHINE MODEL FOR COORDINATE TRANSFORMATION." Geodesy and cartography 45, no. 5 (April 17, 2019): 16–27. http://dx.doi.org/10.3846/gac.2019.6053.

Abstract:
In coordinate transformation, the main purpose is to provide a mathematical relationship between coordinates related to different geodetic reference frames. This gives the geospatial professionals the opportunity to link different datums together. Review of previous studies indicates that empirical and soft computing models have been proposed in recent times for coordinate transformation. The main aim of this study is to present the applicability and performance of Least Squares Support Vector Machine (LS-SVM) which is an extension of the Support Vector Machine (SVM) for coordinate transformation. For comparison purpose, the SVM and the widely used Backpropagation Neural Network (BPNN), Radial Basis Function Neural Network (RBFNN), 2D conformal and affine methods were also employed. To assess how well the transformation results fit the observed data, the root mean square of the residual horizontal distances and standard deviation were used. From the results obtained, the LS-SVM and RBFNN had comparable results and were better than the other methods. The overall statistical findings produced by LS-SVM met the accuracy requirement for cadastral surveying applications in Ghana. To this end, the proposed LS-SVM is known to possess promising predictive capabilities and could efficiently be used as a supplementary technique for coordinate transformation.
32

Khemchandani, Reshma, and Aman Pal. "Multi-category laplacian least squares twin support vector machine." Applied Intelligence 45, no. 2 (March 23, 2016): 458–74. http://dx.doi.org/10.1007/s10489-016-0770-6.

33

Xie, Xijiong, Feixiang Sun, Jiangbo Qian, Lijun Guo, Rong Zhang, Xulun Ye, and Zhijin Wang. "Laplacian Lp norm least squares twin support vector machine." Pattern Recognition 136 (April 2023): 109192. http://dx.doi.org/10.1016/j.patcog.2022.109192.

34

Liu, Yisen, Songbin Zhou, Weixin Liu, Xinhui Yang, and Jun Luo. "Least-squares support vector machine and successive projection algorithm for quantitative analysis of cotton-polyester textile by near infrared spectroscopy." Journal of Near Infrared Spectroscopy 26, no. 1 (February 2018): 34–43. http://dx.doi.org/10.1177/0967033518757069.

Abstract:
The application of near infrared spectroscopy for quantitative analysis of cotton-polyester textile was investigated in the present work. A total of 214 cotton-polyester fabric samples, covering the range from 0% to 100% cotton were measured and analyzed. Partial least squares and least-squares support vector machine models with all variables as input data were established. Furthermore, successive projection algorithm was used to select effective wavelengths and establish the successive projection algorithm-least-squares support vector machine models, with the comparison of two other effective wavelength selection methods: loading weights analysis and regression coefficient analysis. The calibration and validation results show that the successive projection algorithm-least-squares support vector machine model outperformed not only the partial least squares and least-squares support vector machine models with all variables as inputs, but also the least-squares support vector machine models with loading weights analysis and regression coefficient analysis effective wavelength selection. The root mean squared error of calibration and root mean squared error of prediction values of the successive projection algorithm-least-squares support vector machine regression model with the optimal performance were 0.77% and 1.17%, respectively. The overall results demonstrated that near infrared spectroscopy combined with least-squares support vector machine and successive projection algorithm could provide a simple, rapid, economical and non-destructive method for determining the composition of cotton-polyester textiles.
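The successive projections algorithm mentioned in this abstract is a greedy, purely geometric wavelength selector and fits in a dozen lines. A minimal sketch, assuming a spectra matrix of shape (samples, wavelengths); the function name and starting-band choice are illustrative only:

```python
import numpy as np

def spa_select(X, n_select, start=0):
    # Successive Projections Algorithm: repeatedly project all columns
    # onto the orthogonal complement of the last selected column, then
    # pick the column with the largest remaining norm (least collinear).
    Xp = np.asarray(X, dtype=float).copy()
    selected = [start]
    for _ in range(n_select - 1):
        v = Xp[:, selected[-1]]
        v = v / np.linalg.norm(v)
        Xp = Xp - np.outer(v, v @ Xp)       # remove the span of v
        norms = np.linalg.norm(Xp, axis=0)
        norms[selected] = -1.0              # never re-pick a column
        selected.append(int(np.argmax(norms)))
    return selected

rng = np.random.default_rng(2)
spectra = rng.normal(size=(30, 100))        # 30 samples x 100 wavelengths
bands = spa_select(spectra, n_select=8)     # 8 minimally collinear bands
```

The selected bands would then feed a calibration model such as the LS-SVM regressions discussed above, which is the pairing the abstract evaluates.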
35

Chen, Zhan-bo. "Research on Application of Regression Least Squares Support Vector Machine on Performance Prediction of Hydraulic Excavator." Journal of Control Science and Engineering 2014 (2014): 1–4. http://dx.doi.org/10.1155/2014/686130.

Abstract:
In order to improve the performance prediction accuracy of a hydraulic excavator, the regression least squares support vector machine is applied. First, the mathematical model of the regression least squares support vector machine is studied; then its algorithm is designed. Finally, a performance prediction simulation of a hydraulic excavator based on the regression least squares support vector machine is carried out, and the simulation results show that this method can correctly predict how the performance of a hydraulic excavator changes.
36

Lu, Yan, and Zhiping Huang. "A new hybrid model of sparsity empirical wavelet transform and adaptive dynamic least squares support vector machine for fault diagnosis of gear pump." Advances in Mechanical Engineering 12, no. 5 (May 2020): 168781402092204. http://dx.doi.org/10.1177/1687814020922047.

Abstract:
The gear pump is a key component in hydraulic drive systems, and its fault diagnosis is very important. A combination of sparsity empirical wavelet transform and adaptive dynamic least squares support vector machine is proposed for fault diagnosis of the gear pump in this article. The sparsity empirical wavelet transform is used to obtain features of the vibration signal of the gear pump, where the sparsity function makes the empirical wavelet transform adaptive, and the adaptive dynamic least squares support vector machine is used to recognize the state of the gear pump. The experimental results show that the proposed combination achieves better diagnosis accuracy than either the empirical wavelet transform with adaptive dynamic least squares support vector machine or the empirical wavelet transform with least squares support vector machine.
37

Guan, Qiong, Han Qing Tao, and Bin Huang. "The Computer Interlocking Software System Reliability Test Based on the Monte Carlo." Applied Mechanics and Materials 614 (September 2014): 397–400. http://dx.doi.org/10.4028/www.scientific.net/amm.614.397.

Abstract:
Railway switch failure prediction plays an important role in the maintenance of railway signal equipment. This paper puts forward a railway switch failure prediction algorithm based on the least squares support vector machine and chooses five characteristic indexes to compose the characteristic input vectors of the failure prediction model. This reduces the dimension of the input vectors and shortens the training time of the least squares support vector machine, and a pruning algorithm is used to accelerate computation while maintaining good regression performance. Experiments prove that the algorithm has strong self-learning ability and high prediction accuracy, and that it can accelerate switch failure prediction and improve its accuracy and reliability.
38

Liu, Xuanyu, and Kaiju Zhang. "Earth pressure prediction in sealed chamber of shield machine based on parallel least squares support vector machine optimized by cooperative particle swarm optimization." Measurement and Control 52, no. 7-8 (May 10, 2019): 758–64. http://dx.doi.org/10.1177/0020294019840720.

Abstract:
Earth pressure in the sealed chamber is affected by multi-system and multi-field coupling during the shield tunneling process, so it is difficult to establish a mechanistic earth pressure control model. Therefore, a data-driven modeling method for the earth pressure in the sealed chamber is proposed, based on a parallel least squares support vector machine optimized by parallel cooperative particle swarm optimization. The vectors are first studied in parallel according to different hierarchies; the initial classifiers are then updated using a cross-feedback method to retrain the vectors; finally, the vectors are merged to obtain the support vectors. The parameters of the least squares support vector machine are optimized by the parallel cooperative particle swarm optimization so as to allow fast prediction on large-scale data. Finally, a simulation experiment is carried out based on in-situ measured data, and the results show that the method has high computational efficiency and prediction accuracy. The method has guiding significance for engineering application.
39

Mahmoud, Tarek. "Adaptive control scheme based on the least squares support vector machine network." International Journal of Applied Mathematics and Computer Science 21, no. 4 (December 1, 2011): 685–96. http://dx.doi.org/10.2478/v10006-011-0054-6.

Abstract:
Recently, a new type of neural network called the Least Squares Support Vector Machine (LS-SVM) has been receiving increasing attention in nonlinear system identification and control due to its generalization performance. This paper develops a stable adaptive control scheme using the LS-SVM network. The developed control scheme includes two parts: an identification part that uses a modified structure of LS-SVM neural networks, called the multi-resolution wavelet least squares support vector machine network (MRWLS-SVM), as a predictor model, and a controller part that is developed to track a reference trajectory. By means of the Lyapunov stability criterion, stability analysis for the tracking errors is performed. Finally, simulation studies demonstrate the capability of the developed approach in controlling a pH process.
40

Ren, Yuan, and Guang Chen Bai. "Colonial Competitive Algorithm Assisted Least Squares Support Vector Machines." Advanced Materials Research 255-260 (May 2011): 2082–86. http://dx.doi.org/10.4028/www.scientific.net/amr.255-260.2082.

Abstract:
The use of the least squares support vector machine (LSSVM), a novel machine learning method, for classification and function approximation has increased over the past few years, especially due to its high generalization performance. However, LSSVM is plagued by the drawback that its hyper-parameters, which largely determine the quality of LSSVM models, have to be defined by the user; this increases the difficulty of applying LSSVM and limits its use on academic and industrial platforms. In this paper we present a novel method for automatically tuning the hyper-parameters of LSSVM based on the colonial competitive algorithm (CCA), a newly developed evolutionary algorithm inspired by the imperialistic competition mechanism. To show the efficacy of the CCA-assisted LSSVM methodology, we have tested it on several benchmark examples. The study suggests that the proposed paradigm can be a competitive and powerful tool for classification and function approximation.
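The abstract does not detail CCA itself, but the tuning problem it addresses is easy to state in code. The sketch below deliberately substitutes plain random search for CCA (a simplification, not the paper's method) just to show what "automatically tuning the hyper-parameters (γ, σ)" of an LS-SVM means; all names and ranges are assumptions:

```python
import numpy as np

def rbf(X, Z, sigma):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit_predict(Xtr, ytr, Xte, gamma, sigma):
    # Train an LS-SVM regressor (one linear solve) and predict on Xte.
    n = len(ytr)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf(Xtr, Xtr, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], ytr)))
    return rbf(Xte, Xtr, sigma) @ sol[1:] + sol[0]

def tune_random(Xtr, ytr, Xval, yval, trials=30, seed=0):
    # Stand-in for an evolutionary tuner such as CCA: sample (gamma, sigma)
    # log-uniformly and keep the pair with the lowest validation MSE.
    rng = np.random.default_rng(seed)
    best_mse, best_params = np.inf, None
    for _ in range(trials):
        gamma, sigma = 10 ** rng.uniform(-1, 3), 10 ** rng.uniform(-1, 1)
        pred = lssvm_fit_predict(Xtr, ytr, Xval, gamma, sigma)
        mse = float(np.mean((pred - yval) ** 2))
        if mse < best_mse:
            best_mse, best_params = mse, (gamma, sigma)
    return best_mse, best_params

rng = np.random.default_rng(3)
X = rng.uniform(-1.0, 1.0, size=(80, 1))
y = np.sin(4.0 * X[:, 0]) + 0.05 * rng.normal(size=80)
best_mse, (gamma, sigma) = tune_random(X[:60], y[:60], X[60:], y[60:])
```

An evolutionary tuner like CCA replaces the independent random draws with a guided population search over the same (γ, σ) objective.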
41

Cawley, Gavin C., and Nicola L. C. Talbot. "Improved sparse least-squares support vector machines." Neurocomputing 48, no. 1-4 (October 2002): 1025–31. http://dx.doi.org/10.1016/s0925-2312(02)00606-9.

42

Sartakhti, Javad Salimi, Homayun Afrabandpey, and Nasser Ghadiri. "Fuzzy least squares twin support vector machines." Engineering Applications of Artificial Intelligence 85 (October 2019): 402–9. http://dx.doi.org/10.1016/j.engappai.2019.06.018.

43

Xu, Shuo, Xin An, Xiaodong Qiao, and Lijun Zhu. "Multi-task least-squares support vector machines." Multimedia Tools and Applications 71, no. 2 (May 30, 2013): 699–715. http://dx.doi.org/10.1007/s11042-013-1526-5.

44

Wang, Liejun, Taiyi Zhang, and Yatong Zhou. "Multi-Resolution Least Squares Support Vector Machines." Journal of Electronics (China) 24, no. 5 (September 2007): 701–4. http://dx.doi.org/10.1007/s11767-006-0270-7.

45

Van Belle, V., K. Pelckmans, J. A. K. Suykens, and S. Van Huffel. "Additive survival least-squares support vector machines." Statistics in Medicine 29, no. 2 (December 18, 2009): 296–308. http://dx.doi.org/10.1002/sim.3743.

46

Yang, Chaoyu, Jie Yang, and Jun Ma. "Sparse Least Squares Support Vector Machine With Adaptive Kernel Parameters." International Journal of Computational Intelligence Systems 13, no. 1 (2020): 212. http://dx.doi.org/10.2991/ijcis.d.200205.001.

47

Yu, Kaijun. "A Least Squares Support Vector Machine Classifier for Information Retrieval." Journal of Convergence Information Technology 8, no. 2 (January 31, 2013): 177–83. http://dx.doi.org/10.4156/jcit.vol8.issue2.22.

48

WU, Zong-liang, and Heng DOU. "Generalized least squares support-vector-machine algorithm and its application." Journal of Computer Applications 29, no. 3 (May 6, 2009): 877–79. http://dx.doi.org/10.3724/sp.j.1087.2009.00877.

49

Xing, Chong, Liming Wan, Jiaxin Wang, and Yanchun Liang. "Prediction of ncRNA Based on Least Squares Support Vector Machine." Journal of Bionanoscience 7, no. 1 (February 1, 2013): 121–25. http://dx.doi.org/10.1166/jbns.2013.1093.

50

Jiang, Jingqing, Chuyisong, and Lanying Bao. "Forward Gene Selection Algorithm Based on Least Squares Support Vector Machine." Journal of Bionanoscience 7, no. 3 (June 1, 2013): 307–12. http://dx.doi.org/10.1166/jbns.2013.1136.
