Journal articles on the topic 'Support vector machines'

To see the other types of publications on this topic, follow the link: Support vector machines.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Support vector machines.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Guenther, Nick, and Matthias Schonlau. "Support Vector Machines." Stata Journal: Promoting communications on statistics and Stata 16, no. 4 (December 2016): 917–37. http://dx.doi.org/10.1177/1536867x1601600407.

Abstract:
Support vector machines are statistical- and machine-learning techniques with the primary goal of prediction. They can be applied to continuous, binary, and categorical outcomes analogous to Gaussian, logistic, and multinomial regression. We introduce a new command for this purpose, svmachines. This package is a thin wrapper for the widely deployed libsvm (Chang and Lin, 2011, ACM Transactions on Intelligent Systems and Technology 2(3): Article 27). We illustrate svmachines with two examples.
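As an illustrative cross-language aside (not part of the article): scikit-learn's SVC and SVR are, like the svmachines command, wrappers around libsvm, so the outcome types mentioned in the abstract can be sketched in Python. The data and parameters below are hypothetical.

```python
# Hedged Python analogue of the use cases above; scikit-learn's SVC/SVR
# also wrap libsvm (Chang and Lin, 2011). Data here is synthetic.
from sklearn.datasets import make_classification, make_regression
from sklearn.svm import SVC, SVR

# Binary or categorical outcome: SVC handles two or more classes
# (multiclass via one-vs-one, as in libsvm).
Xc, yc = make_classification(n_samples=200, n_features=6, n_classes=3,
                             n_informative=4, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(Xc, yc)
print("classification accuracy:", clf.score(Xc, yc))

# Continuous outcome: epsilon-SVR, the regression counterpart.
Xr, yr = make_regression(n_samples=200, n_features=6, noise=0.1, random_state=0)
reg = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(Xr, yr)
print("regression R^2:", reg.score(Xr, yr))
```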
2

Hearst, M. A., S. T. Dumais, E. Osuna, J. Platt, and B. Scholkopf. "Support vector machines." IEEE Intelligent Systems and their Applications 13, no. 4 (July 1998): 18–28. http://dx.doi.org/10.1109/5254.708428.

3

Bennett, Kristin P., and Colin Campbell. "Support vector machines." ACM SIGKDD Explorations Newsletter 2, no. 2 (December 2000): 1–13. http://dx.doi.org/10.1145/380995.380999.

4

Mammone, Alessia, Marco Turchi, and Nello Cristianini. "Support vector machines." Wiley Interdisciplinary Reviews: Computational Statistics 1, no. 3 (November 2009): 283–89. http://dx.doi.org/10.1002/wics.49.

5

Valkenborg, Dirk, Axel-Jan Rousseau, Melvin Geubbelmans, and Tomasz Burzykowski. "Support vector machines." American Journal of Orthodontics and Dentofacial Orthopedics 164, no. 5 (November 2023): 754–57. http://dx.doi.org/10.1016/j.ajodo.2023.08.003.

6

Giustolisi, Orazio. "Using a multi-objective genetic algorithm for SVM construction." Journal of Hydroinformatics 8, no. 2 (March 1, 2006): 125–39. http://dx.doi.org/10.2166/hydro.2006.016b.

Abstract:
Support Vector Machines are kernel machines useful for classification and regression problems. In this paper, they are used for non-linear regression of environmental data. From a structural point of view, Support Vector Machines are particular Artificial Neural Networks, and their training paradigm has some positive implications. In fact, the original training approach helps overcome the curse of dimensionality and overly strict assumptions about the statistics of the errors in the data. Support Vector Machines and Radial Basis Function Regularised Networks are presented within a common structural framework for non-linear regression in order to emphasise the training strategy for support vector machines and to better explain the multi-objective approach to support vector machine construction. A support vector machine's performance depends on the kernel parameter, the input selection, and the optimal dimension of the ε-tube. These are used as decision variables for an evolutionary strategy based on a Genetic Algorithm, whose objective functions are the number of support vectors (measuring the capacity of the machine) and the fitness to a validation subset (measuring the model's accuracy in mapping the underlying physical phenomena). The strategy is tested on a case study dealing with groundwater modelling, based on time series of past measured rainfalls and levels, for level predictions at variable time horizons.
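The objective evaluation described above is easy to sketch: treat the RBF kernel width and the ε-tube size as decision variables and score each candidate on both objectives. The snippet below is a random-search stand-in for the paper's multi-objective Genetic Algorithm (which also performs input selection); the data and parameter ranges are hypothetical.

```python
# Minimal sketch: evaluate candidate (gamma, epsilon) pairs on the two
# objectives named in the abstract -- validation fitness (accuracy) and
# number of support vectors (machine capacity). Synthetic data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

for _ in range(15):
    gamma, eps = 10 ** rng.uniform(-2, 1), 10 ** rng.uniform(-3, 0)
    model = SVR(kernel="rbf", gamma=gamma, epsilon=eps, C=1.0).fit(X_tr, y_tr)
    mse = np.mean((model.predict(X_va) - y_va) ** 2)   # objective 1: accuracy
    n_sv = len(model.support_)                          # objective 2: capacity
    print(f"gamma={gamma:.3g} eps={eps:.3g} -> MSE={mse:.4f}, #SV={n_sv}")
```

A genetic algorithm would retain the Pareto-efficient candidates of this trade-off rather than a single winner.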
7

Heo, Gyeongyong. "Context Dependent Fusion with Support Vector Machines." Journal of the Korea Society of Computer and Information 18, no. 7 (July 31, 2013): 37–45. http://dx.doi.org/10.9708/jksci.2013.18.7.037.

8

Fearn, Tom. "Support Vector Machines I: The Support Vector Classifier." NIR news 15, no. 5 (October 2004): 14–15. http://dx.doi.org/10.1255/nirn.788.

9

Goyal, Vishal, and Aasheesh Shukla. "Content based Image Retrieval Using Support Vector Machines." Journal of Advanced Research in Dynamical and Control Systems 11, no. 10-SPECIAL ISSUE (October 31, 2019): 625–32. http://dx.doi.org/10.5373/jardcs/v11sp10/20192851.

10

Ren, Shuang-Qiao, De-Gui Yang, Xiang Li, and Zhao-Wen Zhuang. "Piecewise Support Vector Machines." Chinese Journal of Computers 32, no. 1 (July 29, 2009): 77–85. http://dx.doi.org/10.3724/sp.j.1016.2009.00077.

11

García Díaz, Elkin Eduardo, and Fernando Lozano Martínez. "Boosting Support Vector Machines." Revista de Ingeniería, no. 24 (November 2006): 62–70. http://dx.doi.org/10.16924/revinge.24.8.

12

Schlag, Sebastian, Matthias Schmitt, and Christian Schulz. "Faster Support Vector Machines." ACM Journal of Experimental Algorithmics 26 (December 31, 2021): 1–21. http://dx.doi.org/10.1145/3484730.

Abstract:
The time complexity of support vector machines (SVMs) prohibits training on huge datasets with millions of data points. Recently, multilevel approaches to train SVMs have been developed to allow for time-efficient training on huge datasets. While regular SVMs perform the entire training in one time-consuming optimization step, multilevel SVMs first build a hierarchy of problems decreasing in size that resemble the original problem and then train an SVM model for each hierarchy level, benefiting from the solved models of previous levels. We present a faster multilevel support vector machine that uses a label propagation algorithm to construct the problem hierarchy. Extensive experiments indicate that our approach is up to orders of magnitude faster than the previous fastest algorithm while having comparable classification quality. For example, already one of our sequential solvers is on average a factor of 15 faster than the parallel ThunderSVM algorithm, while having similar classification quality.
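The coarsen-then-refine idea can be caricatured in a few lines. The sketch below uses k-means centroids as the coarse level and retrains near the coarse decision boundary; this is a deliberate simplification, not the paper's label-propagation hierarchy.

```python
# Two-level caricature of multilevel SVM training (illustrative only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

# Coarse level: cluster each class and train on the centroids.
centers, labels = [], []
for cls in np.unique(y):
    km = KMeans(n_clusters=20, n_init=5, random_state=0).fit(X[y == cls])
    centers.append(km.cluster_centers_)
    labels.append(np.full(20, cls))
coarse = SVC(kernel="rbf").fit(np.vstack(centers), np.concatenate(labels))

# Fine level: retrain only on points near the coarse decision boundary,
# benefiting from the coarse solution as the hierarchy intends.
margin = np.abs(coarse.decision_function(X))
near = margin < np.quantile(margin, 0.25)
fine = SVC(kernel="rbf").fit(X[near], y[near])
print("refined accuracy on all data:", fine.score(X, y))
```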
13

Yao, Chih-Chia, and Pao-Ta Yu. "Oblique Support Vector Machines." Informatica 18, no. 1 (January 1, 2007): 137–57. http://dx.doi.org/10.15388/informatica.2007.169.

14

Huang, Kaizhu, Haochuan Jiang, and Xu-Yao Zhang. "Field Support Vector Machines." IEEE Transactions on Emerging Topics in Computational Intelligence 1, no. 6 (December 2017): 454–63. http://dx.doi.org/10.1109/tetci.2017.2751062.

15

Lee, Yoonkyung, Yi Lin, and Grace Wahba. "Multicategory Support Vector Machines." Journal of the American Statistical Association 99, no. 465 (March 2004): 67–81. http://dx.doi.org/10.1198/016214504000000098.

16

Lin, Chun-Fu, and Sheng-De Wang. "Fuzzy support vector machines." IEEE Transactions on Neural Networks 13, no. 2 (March 2002): 464–71. http://dx.doi.org/10.1109/72.991432.

17

Lee, Gyemin, and C. Scott. "Nested Support Vector Machines." IEEE Transactions on Signal Processing 58, no. 3 (March 2010): 1648–60. http://dx.doi.org/10.1109/tsp.2009.2036071.

18

Lee, KiYoung, Dae-Won Kim, Kwang H. Lee, and Doheon Lee. "Possibilistic support vector machines." Pattern Recognition 38, no. 8 (August 2005): 1325–27. http://dx.doi.org/10.1016/j.patcog.2004.11.018.

19

Carrizosa, Emilio, Belen Martin-Barragan, and Dolores Romero Morales. "Binarized Support Vector Machines." INFORMS Journal on Computing 22, no. 1 (February 2010): 154–67. http://dx.doi.org/10.1287/ijoc.1090.0317.

20

Navia-Vazquez, A., D. Gutierrez-Gonzalez, E. Parrado-Hernandez, and J. J. Navarro-Abellan. "Distributed Support Vector Machines." IEEE Transactions on Neural Networks 17, no. 4 (July 2006): 1091–97. http://dx.doi.org/10.1109/tnn.2006.875968.

21

Pernes, Diogo, Kelwin Fernandes, and Jaime Cardoso. "Directional Support Vector Machines." Applied Sciences 9, no. 4 (February 19, 2019): 725. http://dx.doi.org/10.3390/app9040725.

Abstract:
Several phenomena are represented by directional (angular or periodic) data, from time references on the calendar to geographical coordinates. These values are usually represented as real values restricted to a given range (e.g., [0, 2π)), hiding the true nature of this information. In order to handle these variables properly in supervised classification tasks, alternatives to the naive Bayes classifier and logistic regression were proposed in the past. In this work, we propose directional-aware support vector machines. We address several realizations of the proposed models, studying their kernelized counterparts and their expressiveness. Finally, we validate the performance of the proposed Support Vector Machines (SVMs) against the directional naive Bayes and directional logistic regression with real data, obtaining competitive results.
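A quick way to see why naive real-valued encodings of angles hurt: a class that wraps around 0 is not linearly separable on the line, but becomes separable after embedding the angle on the unit circle. The sketch below shows that generic encoding; it is not the directional-aware SVM formulation proposed in the paper.

```python
# Angles near 0 and near 2*pi are neighbours on the circle but far apart
# on the real line; (cos, sin) features restore that neighbourhood.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, size=500)
y = ((theta > 5.5) | (theta < 0.8)).astype(int)   # class wraps around 0

X_naive = theta.reshape(-1, 1)
X_circle = np.column_stack([np.cos(theta), np.sin(theta)])

print("naive encoding :", SVC(kernel="linear").fit(X_naive, y).score(X_naive, y))
print("circle encoding:", SVC(kernel="linear").fit(X_circle, y).score(X_circle, y))
```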
22

Seref, Onur, O. Erhun Kundakcioglu, Oleg A. Prokopyev, and Panos M. Pardalos. "Selective support vector machines." Journal of Combinatorial Optimization 17, no. 1 (October 3, 2008): 3–20. http://dx.doi.org/10.1007/s10878-008-9189-2.

23

Dong, Zengshou, Zhaojing Ren, and You Dong. "MECHANICAL FAULT RECOGNITION RESEARCH BASED ON LMD-LSSVM." Transactions of the Canadian Society for Mechanical Engineering 40, no. 4 (November 2016): 541–49. http://dx.doi.org/10.1139/tcsme-2016-0042.

Abstract:
Mechanical fault vibration signals are non-stationary, which causes system instability. Because traditional methods struggle to extract fault information accurately, this paper proposes a fault identification method based on local mean decomposition (LMD) and the least squares support vector machine. The article introduces waveform matching to handle the end effects of the signals, uses linear interpolation to obtain the local mean and envelope functions, and then derives the product function (PF) vectors through local mean decomposition. The energy entropies of the PF vectors are taken as identification input vectors. These vectors are fed to BP neural networks, support vector machines, and least squares support vector machines to identify faults. Experimental results show that the least squares support vector machine achieves the highest classification accuracy.
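The energy-entropy feature mentioned above is simple to state: each decomposed component contributes a share of the total signal energy, and the Shannon entropy of those shares is the feature. A minimal sketch, with random arrays standing in for the LMD product functions:

```python
# Energy entropy of decomposition components (stand-ins for LMD PFs).
import numpy as np

def energy_entropy(components):
    """components: array of shape (n_components, n_samples)."""
    energies = np.sum(components ** 2, axis=1)
    p = energies / energies.sum()          # energy share of each component
    return -np.sum(p * np.log(p + 1e-12))  # Shannon entropy of the shares

rng = np.random.default_rng(0)
pfs = rng.normal(size=(5, 1024))           # hypothetical product functions
print("energy-entropy feature:", energy_entropy(pfs))
```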
24

Pontil, Massimiliano, and Alessandro Verri. "Properties of Support Vector Machines." Neural Computation 10, no. 4 (May 1, 1998): 955–74. http://dx.doi.org/10.1162/089976698300017575.

Abstract:
Support vector machines (SVMs) perform pattern recognition between two point classes by finding a decision surface determined by certain points of the training set, termed support vectors (SV). This surface, which in some feature space of possibly infinite dimension can be regarded as a hyperplane, is obtained from the solution of a problem of quadratic programming that depends on a regularization parameter. In this article, we study some mathematical properties of support vectors and show that the decision surface can be written as the sum of two orthogonal terms, the first depending on only the margin vectors (which are SVs lying on the margin), the second proportional to the regularization parameter. For almost all values of the parameter, this enables us to predict how the decision surface varies for small parameter changes. In the special but important case of feature space of finite dimension m, we also show that there are at most m + 1 margin vectors and observe that m + 1 SVs are usually sufficient to determine the decision surface fully. For relatively small m, this latter result leads to a consistent reduction of the SV number.
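As standard background for the terminology above (a notation sketch, not the paper's derivation): the kernel SVM decision function and the distinction between support vectors and margin vectors can be written as

```latex
% Soft-margin SVM in dual form; C is the regularization parameter.
\[
  f(\mathbf{x}) = \sum_{i=1}^{N} \alpha_i\, y_i\, K(\mathbf{x}_i, \mathbf{x}) + b,
  \qquad 0 \le \alpha_i \le C .
\]
% Support vectors are the points with \alpha_i > 0; margin vectors are the
% support vectors with 0 < \alpha_i < C, which satisfy y_i f(\mathbf{x}_i) = 1.
```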
25

Yeo, Siang Chuan, Meng Hee Lim, Kar Hoou Hui, and Eng Hoe Cheng. "Bayes' Theorem for Multi-Bearing Faults Diagnosis." International Journal of Automotive and Mechanical Engineering 20, no. 2 (June 30, 2023): 10371–85. http://dx.doi.org/10.15282/ijame.20.2.2023.04.0802.

Abstract:
In fault diagnosis for automated machinery, the support vector machine is a suitable choice for categorizing multiple machine faults. Regardless of the volume of sampled data, support vector machines can handle a large number of input features. However, support vector machines natively perform only binary classification (such as faulty versus healthy), and classification accuracy was found to be lower when they were used to diagnose multi-bearing faults. This is because the multi-class problem is reduced to several binary sub-problems, and the individual support vector machine models can produce contradictory results. To resolve this, Bayes' theorem is applied to every support vector machine model to reconcile the conflicting results and increase classification accuracy. The proposed Support Vector Machines - Bayes' Theorem method increased the accuracy of the fault diagnosis model from 72% to 95%, showing that it continuously eliminates and refines conflicting results from the original support vector machine models. Compared with the existing support vector machine, the proposed method has proven effective in diagnosing the multi-bearing fault classification problem.
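The fusion idea is easy to mock up: train one binary SVM per fault class, read a probability from each, and combine them with Bayes' rule by normalising over classes. This generic sketch (synthetic data, uniform priors assumed) illustrates the combination only; the paper's diagnosis procedure differs in detail.

```python
# One-vs-rest SVMs fused by normalising per-class posteriors.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)

models = {cls: SVC(kernel="rbf", probability=True, random_state=0)
               .fit(X, (y == cls).astype(int))
          for cls in np.unique(y)}

# Each model gives P(class | x); with uniform priors, Bayes' rule reduces
# to normalising the scores across classes.
scores = np.column_stack([m.predict_proba(X)[:, 1] for m in models.values()])
posterior = scores / scores.sum(axis=1, keepdims=True)
pred = np.array(list(models.keys()))[posterior.argmax(axis=1)]
print("fused accuracy:", (pred == y).mean())
```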
26

Lorena, Ana Carolina, and André C. P. L. F. De Carvalho. "Uma Introdução às Support Vector Machines." Revista de Informática Teórica e Aplicada 14, no. 2 (December 20, 2007): 43–67. http://dx.doi.org/10.22456/2175-2745.5690.

Abstract:
This paper presents an introduction to Support Vector Machines (SVMs), a machine learning technique that has received increasing attention in recent years. SVMs have been applied to several pattern recognition tasks, obtaining results superior to those of other learning techniques in various applications.
27

Yao, Huimin. "Research on Parallel Support Vector Machine Based on Spark Big Data Platform." Scientific Programming 2021 (December 17, 2021): 1–9. http://dx.doi.org/10.1155/2021/7998417.

Abstract:
With the development of cloud computing and distributed cluster technology, the concept of big data has been expanded and extended in terms of capacity and value, and machine learning technology has also received unprecedented attention in recent years. Traditional machine learning algorithms cannot solve the problem of effective parallelization, so a parallelized support vector machine based on the Spark big data platform is proposed. Firstly, the big data platform is designed with the Lambda architecture, which is divided into three layers: the Batch Layer, the Serving Layer, and the Speed Layer. Secondly, in order to improve the training efficiency of support vector machines on large-scale data, when merging two support vector machines, the "special points" other than support vectors are considered, that is, the points in one subset that are not support vectors but violate the training results of the other subset, and a cross-validation merging algorithm is proposed. Then, a parallelized support vector machine based on cross-validation is proposed, and the parallelization process of the support vector machine is realized on the Spark platform. Finally, experiments on different datasets verify the effectiveness and stability of the proposed method. Experimental results show that the proposed parallelized support vector machine has outstanding performance in speed-up ratio, training time, and prediction accuracy.
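The "special points" rule described above can be sketched on a single machine: train on two subsets, then merge each model's support vectors with the points of its own subset that the other model gets wrong, and retrain. Synthetic data below; the paper runs this merging in parallel on Spark.

```python
# Cross-validation merging of two sub-SVMs (single-machine sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
half = len(X) // 2
X1, y1, X2, y2 = X[:half], y[:half], X[half:], y[half:]

m1 = SVC(kernel="rbf").fit(X1, y1)
m2 = SVC(kernel="rbf").fit(X2, y2)

# Keep each model's support vectors plus the "special points": rows of
# the subset that the opposite model misclassifies.
keep1 = np.union1d(m1.support_, np.where(m2.predict(X1) != y1)[0])
keep2 = np.union1d(m2.support_, np.where(m1.predict(X2) != y2)[0])

merged = SVC(kernel="rbf").fit(np.vstack([X1[keep1], X2[keep2]]),
                               np.concatenate([y1[keep1], y2[keep2]]))
print("merged accuracy on all data:", merged.score(X, y))
```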
28

Ye, Qiaolin, Chunxia Zhao, Ning Ye, and Yannan Chen. "Multi-weight vector projection support vector machines." Pattern Recognition Letters 31, no. 13 (October 2010): 2006–11. http://dx.doi.org/10.1016/j.patrec.2010.06.005.

29

Tran, H. T., K. Y. Kim, and H. J. Yang. "Weldability Prediction of AHSS Stackups Using Support Vector Machines." International Journal of Computer and Electrical Engineering 6, no. 3 (2014): 207–10. http://dx.doi.org/10.7763/ijcee.2014.v6.823.

30

Kramer, Oliver. "Cascade Support Vector Machines with Dimensionality Reduction." Applied Computational Intelligence and Soft Computing 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/216132.

Abstract:
Cascade support vector machines have been introduced as an extension of classic support vector machines that allows fast training on large data sets. In this work, we combine cascade support vector machines with dimensionality-reduction-based preprocessing. The cascade principle allows fast learning based on the division of the training set into subsets and the union of cascade learning results based on the support vectors in each cascade level. The combination with dimensionality reduction as preprocessing results in a significant speedup, often without loss of classifier accuracy, while considering the high-dimensional pendants of the low-dimensional support vectors in each new cascade level. We analyze and compare various instantiations of dimensionality reduction preprocessing and cascade SVMs with principal component analysis, locally linear embedding, and isometric mapping. The experimental analysis on various artificial and real-world benchmark problems covers various cascade-specific parameters like intermediate training set sizes and dimensionalities.
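One cascade level with PCA preprocessing can be sketched as follows: select support vectors cheaply in a low-dimensional view, then carry their high-dimensional pendants (the original rows) into the next level. Parameters and data are illustrative, not the paper's setup.

```python
# PCA-preprocessed cascade SVM, one level (illustrative sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=50, n_informative=10,
                           random_state=0)
Z = PCA(n_components=5, random_state=0).fit_transform(X)  # low-dim view

sv_idx = []
for part in np.array_split(np.arange(len(X)), 4):   # split into 4 subsets
    svm = SVC(kernel="rbf").fit(Z[part], y[part])   # cheap low-dim training
    sv_idx.extend(part[svm.support_])               # high-dimensional pendants

final = SVC(kernel="rbf").fit(X[sv_idx], y[sv_idx])  # next cascade level
print("final-level accuracy:", final.score(X, y))
```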
31

En, Jongmin, Songwook Lee, and Jungyun Seo. "An Analysis of Speech Acts for Korean Using Support Vector Machines." KIPS Transactions: Part B 12B, no. 3 (June 1, 2005): 365–68. http://dx.doi.org/10.3745/kipstb.2005.12b.3.365.

32

Kudo, Taku, and Yuji Matsumoto. "Chunking with Support Vector Machines." Journal of Natural Language Processing 9, no. 5 (2002): 3–21. http://dx.doi.org/10.5715/jnlp.9.5_3.

33

Moosaei, H., S. Ketabchi, M. Razzaghi, and M. Tanveer. "Generalized Twin Support Vector Machines." Neural Processing Letters 53, no. 2 (March 6, 2021): 1545–64. http://dx.doi.org/10.1007/s11063-021-10464-3.

34

Kong, Rui, Qiong Wang, Gu Yu Hu, and Zhi Song Pan. "Fuzzy Asymmetric Support Vector Machines." Advanced Materials Research 433-440 (January 2012): 7479–86. http://dx.doi.org/10.4028/www.scientific.net/amr.433-440.7479.

Abstract:
Support Vector Machines (SVM) have been extensively studied and have shown remarkable success in many applications. However, the success of SVM is very limited when it is applied to the problem of learning from imbalanced datasets, in which negative instances heavily outnumber the positive instances (e.g., in medical diagnosis and detecting credit card fraud). In this paper, we propose a fuzzy asymmetric algorithm, called FASVM, that augments SVMs to deal with imbalanced training data; it is based on fuzzy memberships combined with the different error costs (DEC) algorithm. We compare the performance of our algorithm against the DEC algorithm and the regular SVM and show that our algorithm outperforms both.
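Both ingredients named in the abstract map onto weights in a standard SVM solver: different error costs per class correspond to class weights, and fuzzy memberships correspond to per-sample weights. The scikit-learn sketch below shows those generic ingredients (with a crude, hypothetical membership function), not the FASVM algorithm itself.

```python
# Class weights (DEC-style) and sample weights (fuzzy-style) in an SVM.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Imbalanced data: roughly 95% negatives, 5% positives.
X, y = make_classification(n_samples=1000, weights=[0.95], random_state=0)

# DEC-style: penalise errors on the rare positive class 10x more.
dec = SVC(kernel="rbf", class_weight={0: 1.0, 1: 10.0}).fit(X, y)

# Fuzzy-style: down-weight points far from their class centroid.
member = np.ones(len(X))
for cls in (0, 1):
    d = np.linalg.norm(X[y == cls] - X[y == cls].mean(axis=0), axis=1)
    member[y == cls] = 1.0 / (1.0 + d / d.mean())
fuzzy = SVC(kernel="rbf").fit(X, y, sample_weight=member)

print("DEC recall on positives:  ", dec.score(X[y == 1], y[y == 1]))
print("fuzzy recall on positives:", fuzzy.score(X[y == 1], y[y == 1]))
```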
35

Zhang, L., W. Zhou, and L. Jiao. "Hidden Space Support Vector Machines." IEEE Transactions on Neural Networks 15, no. 6 (November 2004): 1424–34. http://dx.doi.org/10.1109/tnn.2004.831161.

36

Ertekin, Şeyda, Léon Bottou, and C. Lee Giles. "Nonconvex Online Support Vector Machines." IEEE Transactions on Pattern Analysis and Machine Intelligence 33, no. 2 (February 2011): 368–81. http://dx.doi.org/10.1109/tpami.2010.109.

37

Liu, Yufeng, and Ming Yuan. "Reinforced Multicategory Support Vector Machines." Journal of Computational and Graphical Statistics 20, no. 4 (January 2011): 901–19. http://dx.doi.org/10.1198/jcgs.2010.09206.

38

Tang, Yongqiang, and Hao Helen Zhang. "Multiclass Proximal Support Vector Machines." Journal of Computational and Graphical Statistics 15, no. 2 (June 2006): 339–55. http://dx.doi.org/10.1198/106186006x113647.

39

Ramon, M. M., Nan Xu, and C. G. Christodoulou. "Beamforming using support vector machines." IEEE Antennas and Wireless Propagation Letters 4 (2005): 439–42. http://dx.doi.org/10.1109/lawp.2005.860196.

40

Zhou, Weida, Li Zhang, and Licheng Jiao. "Linear programming support vector machines." Pattern Recognition 35, no. 12 (December 2002): 2927–36. http://dx.doi.org/10.1016/s0031-3203(01)00210-2.

41

Lu, Xiaoling, Fengchi Dong, Xiexin Liu, and Xiangyu Chang. "Varying Coefficient Support Vector Machines." Statistics & Probability Letters 132 (January 2018): 107–15. http://dx.doi.org/10.1016/j.spl.2017.09.006.

42

Razzaghi, Talayeh, and Ilya Safro. "Scalable Multilevel Support Vector Machines." Procedia Computer Science 51 (2015): 2683–87. http://dx.doi.org/10.1016/j.procs.2015.05.381.

43

Hong, Dug Hun, and Changha Hwang. "Support vector fuzzy regression machines." Fuzzy Sets and Systems 138, no. 2 (September 2003): 271–81. http://dx.doi.org/10.1016/s0165-0114(02)00514-6.

44

Iranmehr, Arya, Hamed Masnadi-Shirazi, and Nuno Vasconcelos. "Cost-sensitive support vector machines." Neurocomputing 343 (May 2019): 50–64. http://dx.doi.org/10.1016/j.neucom.2018.11.099.

45

Czarnecki, Wojciech Marian, and Jacek Tabor. "Two ellipsoid Support Vector Machines." Expert Systems with Applications 41, no. 18 (December 2014): 8211–24. http://dx.doi.org/10.1016/j.eswa.2014.07.015.

46

Huang, Yu-Len, and Dar-Ren Chen. "Support vector machines in sonography." Clinical Imaging 29, no. 3 (May 2005): 179–84. http://dx.doi.org/10.1016/j.clinimag.2004.08.002.

47

Moguerza, Javier M., and Alberto Muñoz. "Support Vector Machines with Applications." Statistical Science 21, no. 3 (August 2006): 322–36. http://dx.doi.org/10.1214/088342306000000493.

48

Campbell, Colin, and Yiming Ying. "Learning with Support Vector Machines." Synthesis Lectures on Artificial Intelligence and Machine Learning 5, no. 1 (February 11, 2011): 1–95. http://dx.doi.org/10.2200/s00324ed1v01y201102aim010.

49

Skomorokhov, Alexander, and Alexander Nakhabov. "Support vector machines in A+." ACM SIGAPL APL Quote Quad 34, no. 4 (September 2004): 8–17. http://dx.doi.org/10.1145/1152754.1152756.

50

Huang, Kaizhu, Danian Zheng, Irwin King, and Michael R. Lyu. "Arbitrary Norm Support Vector Machines." Neural Computation 21, no. 2 (February 2009): 560–82. http://dx.doi.org/10.1162/neco.2008.12-07-667.

Abstract:
Support vector machines (SVM) are state-of-the-art classifiers. Typically, the L2-norm or the L1-norm is adopted as a regularization term in SVMs, while other norm-based SVMs, for example, the L0-norm SVM or even the L∞-norm SVM, are rarely seen in the literature. The major reason is that the L0-norm describes a discontinuous and nonconvex term, leading to a combinatorially NP-hard optimization problem. In this letter, motivated by Bayesian learning, we propose a novel framework that can implement arbitrary norm-based SVMs in polynomial time. One significant feature of this framework is that only a sequence of sequential minimal optimization problems needs to be solved, thus making it practical in many real applications. The proposed framework is important in the sense that Bayesian priors can be efficiently plugged into most learning methods without knowing the explicit form. Hence, this builds a connection between Bayesian learning and the kernel machines. We derive the theoretical framework, demonstrate how our approach works on the L0-norm SVM as a typical example, and perform a series of experiments to validate its advantages. Experimental results on nine benchmark data sets are very encouraging. The implemented L0-norm SVM is competitive with or even better than the standard L2-norm SVM in terms of accuracy, but with a reduced number of support vectors (9.46% fewer on average). When compared with another sparse model, the relevance vector machine, our proposed algorithm also demonstrates better sparsity properties, with a training speed over seven times faster.
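The paper's Bayesian framework for arbitrary norms is beyond a short snippet, but the basic effect of the regularization norm on sparsity is easy to demonstrate with an off-the-shelf L1-penalised linear SVM (weight-vector sparsity here, as opposed to the support-vector sparsity reported above).

```python
# Norm choice controls sparsity: L1 zeroes out weights, L2 does not.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=500, n_features=40, n_informative=5,
                           random_state=0)

l2 = LinearSVC(penalty="l2", C=0.1, dual=False).fit(X, y)
l1 = LinearSVC(penalty="l1", C=0.1, dual=False).fit(X, y)

print("nonzero weights with L2 penalty:", int(np.sum(np.abs(l2.coef_) > 1e-6)))
print("nonzero weights with L1 penalty:", int(np.sum(np.abs(l1.coef_) > 1e-6)))
```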