Journal articles on the topic "Kernel-based model"

To see the other types of publications on this topic, follow the link: Kernel-based model.

Create a correct reference in APA, MLA, Chicago, Harvard, and several other styles.

Choose a source:

Consult the 50 best journal articles for your research on the topic "Kernel-based model."

Next to every source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication in PDF format and read its abstract online when this information is included in the metadata.

Browse journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Nishiyama, Yu, Motonobu Kanagawa, Arthur Gretton, and Kenji Fukumizu. "Model-based kernel sum rule: kernel Bayesian inference with probabilistic models." Machine Learning 109, no. 5 (January 2, 2020): 939–72. http://dx.doi.org/10.1007/s10994-019-05852-9.

Abstract:
Kernel Bayesian inference is a principled approach to nonparametric inference in probabilistic graphical models, where probabilistic relationships between variables are learned from data in a nonparametric manner. Various algorithms of kernel Bayesian inference have been developed by combining kernelized basic probabilistic operations such as the kernel sum rule and kernel Bayes' rule. However, the current framework is fully nonparametric, and it does not allow a user to flexibly combine nonparametric and model-based inferences. This is inefficient when there are good probabilistic models (or simulation models) available for some parts of a graphical model; this is in particular true in scientific fields where "models" are the central topic of study. Our contribution in this paper is to introduce a novel approach, termed the model-based kernel sum rule (Mb-KSR), to combine a probabilistic model and kernel Bayesian inference. By combining the Mb-KSR with the existing kernelized probabilistic rules, one can develop various algorithms for hybrid (i.e., nonparametric and model-based) inferences. As an illustrative example, we consider Bayesian filtering in a state space model, where typically there exists an accurate probabilistic model for the state transition process. We propose a novel filtering method that combines model-based inference for the state transition process and data-driven, nonparametric inference for the observation generating process. We empirically validate our approach with synthetic and real-data experiments, the latter being the problem of vision-based mobile robot localization in robotics, which illustrates the effectiveness of the proposed hybrid approach.
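For orientation, the fully nonparametric kernel sum rule that the Mb-KSR generalizes can be sketched as follows. This is a rough illustration with an RBF kernel; the function names, the regularization constant lam, and the weight representation of the prior embedding are our own choices, not the authors' code.

import numpy as np

def rbf_gram(A, B, gamma=1.0):
    # k(a, b) = exp(-gamma * ||a - b||^2) for all pairs of rows of A and B
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def kernel_sum_rule(X, prior_pts, prior_w, lam=1e-3, gamma=1.0):
    """Empirical (fully nonparametric) kernel sum rule.
    X         : (n, d) inputs of the joint training sample {(x_i, y_i)}
    prior_pts : (m, d) points carrying the kernel mean embedding of p(x)
    prior_w   : (m,)   weights of that embedding
    Returns   : (n,)   weights w with mu_Y approximated by sum_i w_i * phi(y_i)
    """
    n = X.shape[0]
    G_xx = rbf_gram(X, X, gamma)          # Gram matrix on the training inputs
    G_xp = rbf_gram(X, prior_pts, gamma)  # cross-Gram with the prior's points
    return np.linalg.solve(G_xx + n * lam * np.eye(n), G_xp @ prior_w)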
2

Zong, Xinlu, Chunzhi Wang, and Hui Xu. "Density-based Adaptive Wavelet Kernel SVM Model for P2P Traffic Classification." International Journal of Future Generation Communication and Networking 6, no. 6 (December 31, 2013): 25–36. http://dx.doi.org/10.14257/ijfgcn.2013.6.6.04.
3

Shim, Jooyong, and Changha Hwang. "Kernel-based orthogonal quantile regression model." Model Assisted Statistics and Applications 12, no. 3 (August 30, 2017): 217–26. http://dx.doi.org/10.3233/mas-170396.
4

Su, Zhi-gang, Pei-hong Wang, and Zhao-long Song. "Kernel based nonlinear fuzzy regression model." Engineering Applications of Artificial Intelligence 26, no. 2 (February 2013): 724–38. http://dx.doi.org/10.1016/j.engappai.2012.05.009.
5

Wang, Zhijie, Mohamed Ben Salah, Hong Zhang, and Nilanjan Ray. "Shape based appearance model for kernel tracking." Image and Vision Computing 30, no. 4-5 (May 2012): 332–44. http://dx.doi.org/10.1016/j.imavis.2012.03.003.
6

Ma, Xin, and Zhi-bin Liu. "The kernel-based nonlinear multivariate grey model." Applied Mathematical Modelling 56 (April 2018): 217–38. http://dx.doi.org/10.1016/j.apm.2017.12.010.
7

Lingyu, Liang, Wenqi Huang, Zhaojie Dong, Jiguang Zhao, Peng Li, Bingfang Lu, and Xinde Zhu. "Short-term power load forecasting based on combined kernel Gaussian process hybrid model." E3S Web of Conferences 256 (2021): 01009. http://dx.doi.org/10.1051/e3sconf/202125601009.

Abstract:
Electricity accounts for a large proportion of the energy supply in our country, one of the largest energy consumers in the world. According to the national policy of energy conservation and emission reduction, it is urgent to realize the intelligent distribution and management of electricity by prediction. Due to the complex nature of electricity load sequences, traditional models predict poorly. As a kernel-based machine learning model, Gaussian Process Mixing (GPM) has high predictive accuracy, can produce multi-modal predictions, and can output confidence intervals. However, the traditional GPM often uses a single kernel function, and the prediction effect is not optimal. Therefore, this paper combines a variety of existing kernels to build a new kernel and uses it for load sequence prediction. In the electricity load prediction experiments, the prediction characteristics of the load sequences are first analyzed, and then the prediction is made based on the optimal hybrid kernel function constructed by GPM and compared with traditional prediction models. The results show that the GPM based on the hybrid kernel is not only superior to the single-kernel GPM but also superior to some traditional prediction models such as ridge regression, kernel regression and GP.
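The paper's GPM model is not reproduced here, but the core idea of summing base kernels into one composite covariance is easy to sketch with scikit-learn; the kernel choices, hyperparameters, and the synthetic hourly series below are illustrative assumptions only.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

# Hypothetical hourly load series: t is the hour index, y the load (illustrative data).
rng = np.random.default_rng(0)
t = np.arange(500, dtype=float).reshape(-1, 1)
y = 100 + 10 * np.sin(2 * np.pi * t.ravel() / 24) + rng.normal(0, 1, 500)

# Composite covariance: smooth trend + daily periodicity + observation noise.
kernel = RBF(length_scale=50.0) \
    + ExpSineSquared(length_scale=5.0, periodicity=24.0) \
    + WhiteKernel(noise_level=1.0)

gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(t[:-24], y[:-24])                           # train on all but the last day
mean, std = gpr.predict(t[-24:], return_std=True)   # point forecast plus confidence band

Kernel objects compose with + and *, so richer hybrids can be built in the same way.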
8

Fan, Yanqin, and Qi Li. "CONSISTENT MODEL SPECIFICATION TESTS." Econometric Theory 16, no. 6 (December 2000): 1016–41. http://dx.doi.org/10.1017/s0266466600166083.

Abstract:
We point out the close relationship between the integrated conditional moment tests in Bierens (1982, Journal of Econometrics 20, 105–134) and Bierens and Ploberger (1997, Econometrica 65, 1129–1152) with the complex-valued exponential weight function and the kernel-based tests in Härdle and Mammen (1993, Annals of Statistics 21, 1926–1947), Li and Wang (1998, Journal of Econometrics 87, 145–165), and Zheng (1996, Journal of Econometrics 75, 263–289). It is well established that the integrated conditional moment tests of Bierens (1982) and Bierens and Ploberger (1997) are more powerful than kernel-based nonparametric tests against Pitman local alternatives. In this paper we analyze the power properties of the kernel-based tests and the integrated conditional moment tests for a sequence of “singular” local alternatives, and show that the kernel-based tests can be more powerful than the integrated conditional moment tests for these “singular” local alternatives. These results suggest that integrated conditional moment tests and kernel-based tests should be viewed as complements to each other. Results from a simulation study are in agreement with the theoretical results.
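As a point of reference, the kernel-based tests cited here (e.g., Zheng, 1996) are built around a statistic of roughly the following form, a density-weighted average of products of residuals from the null model; the Gaussian kernel, the scalar regressor, and the omitted studentization are simplifications of ours, not the papers' exact formulation.

import numpy as np

def zheng_type_statistic(x, resid, h):
    """Kernel-based specification test statistic of the Zheng (1996) type for a
    scalar regressor x, residuals resid from the null model, and bandwidth h."""
    n = len(x)
    u = (x[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)    # Gaussian kernel weights
    np.fill_diagonal(K, 0.0)                         # leave-one-out: sum over i != j
    return (resid[None, :] @ K @ resid[:, None]).item() / (n * (n - 1) * h)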
9

Zhai, Yuejing, Zhouzheng Li, and Haizhong Liu. "Multi-Angle Fast Neural Tangent Kernel Classifier." Applied Sciences 12, no. 21 (October 26, 2022): 10876. http://dx.doi.org/10.3390/app122110876.

Abstract:
Multi-kernel learning methods are essential kernel learning methods. Still, most multi-kernel learning methods only select base kernel functions with shallow structures, which are weak for large-scale uneven data. We propose two types of acceleration models from a multidimensional perspective of the data: a neural tangent kernel (NTK)-based multi-kernel learning method, in which the NTK kernel regressor is shown to be equivalent to an infinitely wide neural network predictor and the NTK with a deep structure is used as the base kernel function to enhance the learning ability of multi-kernel models; and a parallel computing kernel model based on data partitioning techniques. An RBF- and POLY-based multi-kernel model is also proposed. All models use historical memory-based PSO (HMPSO) for efficient search of parameters within the model. Since the NTK has a multi-layer structure and thus significant computational complexity, the use of a Monotone Disjunctive Kernel (MDK) to store and train Boolean features in binary achieves a 15–60% training time compression of NTK models on different datasets while obtaining a 1–25% accuracy improvement.
10

Segera, Davies, Mwangi Mbuthia, and Abraham Nyete. "Particle Swarm Optimized Hybrid Kernel-Based Multiclass Support Vector Machine for Microarray Cancer Data Analysis." BioMed Research International 2019 (December 16, 2019): 1–11. http://dx.doi.org/10.1155/2019/4085725.

Abstract:
Determining an optimal decision model is an important but difficult combinatorial task in imbalanced microarray-based cancer classification. Though the multiclass support vector machine (MCSVM) has already made an important contribution in this field, its performance solely depends on three aspects: the penalty factor C, the type of kernel, and its parameters. To improve the performance of this classifier in microarray-based cancer analysis, this paper proposes a PSO-PCA-LGP-MCSVM model that is based on particle swarm optimization (PSO), principal component analysis (PCA), and the multiclass support vector machine (MCSVM). The MCSVM is based on a hybrid kernel, i.e., linear-Gaussian-polynomial (LGP), that combines the advantages of three standard kernels (linear, Gaussian, and polynomial) in a novel manner, where the linear kernel is linearly combined with the Gaussian kernel embedding the polynomial kernel. Further, this paper proves that the LGP kernel satisfies the requirements of a valid kernel. In order to reveal the effectiveness of our model, several experiments were conducted and the obtained results were compared between our model and three other single kernel-based models, namely, PSO-PCA-L-MCSVM (utilizing a linear kernel), PSO-PCA-G-MCSVM (utilizing a Gaussian kernel), and PSO-PCA-P-MCSVM (utilizing a polynomial kernel). In comparison, two dual and two multiclass imbalanced standard microarray datasets were used. Experimental results in terms of three extended assessment metrics (F-score, G-mean, and Accuracy) reveal the superior global feature extraction, prediction, and learning abilities of this model against the three single kernel-based models.
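The abstract leaves some freedom in how the three pieces of the LGP kernel are nested; the sketch below is one plausible reading (a convex combination of the linear kernel with a Gaussian kernel built on the polynomial-induced distance), with placeholder parameters, not the paper's exact construction.

import numpy as np
from sklearn.svm import SVC

def linear_k(A, B):
    return A @ B.T

def poly_k(A, B, c=1.0, deg=2):
    return (A @ B.T + c) ** deg

def lgp_kernel(A, B, lam=0.5, gamma=0.1, c=1.0, deg=2):
    """Hypothetical LGP hybrid: lam * linear + (1 - lam) * Gaussian of the
    distance induced by the polynomial kernel's feature map."""
    Kp_AB = poly_k(A, B, c, deg)
    dist2 = np.maximum(
        np.diag(poly_k(A, A, c, deg))[:, None]
        + np.diag(poly_k(B, B, c, deg))[None, :] - 2 * Kp_AB, 0.0)
    return lam * linear_k(A, B) + (1 - lam) * np.exp(-gamma * dist2)

# Usage with a precomputed Gram matrix (X_train, y_train, X_test assumed given):
# clf = SVC(kernel="precomputed").fit(lgp_kernel(X_train, X_train), y_train)
# pred = clf.predict(lgp_kernel(X_test, X_train))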
11

Abakar, Khalid AA, and Chongwen Yu. "The Spinning Quality Control Management Based on Decision Making by Data Mining Techniques." International Journal of Emerging Research in Management and Technology 7, no. 1 (June 11, 2018): 72. http://dx.doi.org/10.23956/ijermt.v7i1.25.

Abstract:
This work demonstrated the possibility of using data mining techniques such as artificial neural network (ANN) and support vector machine (SVM) based models to predict the quality of spinning yarn parameters. Three different kernel functions were used as SVM kernels, namely the polynomial kernel, the Radial Basis Function (RBF) kernel, and the Pearson VII function-based Universal Kernel (PUK), and an ANN model was used as a further data mining technique to predict yarn properties. In this paper, it was found that the SVM model based on the Pearson VII kernel function (PUK) has the same performance in predicting spinning yarn quality as the SVM based on the RBF kernel. The comparison with the ANN model showed that the two SVM models give a better prediction performance than the ANN model.
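For readers unfamiliar with it, the PUK kernel is usually written as follows (as introduced by Üstün et al.); sigma and omega below are placeholders rather than the tuned values from this study.

import numpy as np

def puk_kernel(A, B, sigma=1.0, omega=1.0):
    """Pearson VII function-based Universal Kernel (PUK) in its usual form:
    K = 1 / (1 + (2 * ||a - b|| * sqrt(2^(1/omega) - 1) / sigma)^2)^omega."""
    d = np.sqrt(np.maximum(
        np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T, 0.0))
    return 1.0 / (1.0 + (2.0 * d * np.sqrt(2.0 ** (1.0 / omega) - 1.0) / sigma) ** 2) ** omega

# It can be plugged into scikit-learn as a callable kernel, e.g.
# from sklearn.svm import SVR
# svr = SVR(kernel=lambda X, Y: puk_kernel(X, Y, sigma=1.0, omega=1.0))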
12

Tian, Jinkai, Peifeng Yan, and Da Huang. "Kernel Analysis Based on Dirichlet Processes Mixture Models." Entropy 21, no. 9 (September 2, 2019): 857. http://dx.doi.org/10.3390/e21090857.

Abstract:
Kernels play a crucial role in Gaussian process regression. Analyzing kernels from their spectral domain has attracted extensive attention in recent years. Gaussian mixture models (GMM) are used to model the spectrum of kernels. However, the number of components in a GMM is fixed. Thus, this model suffers from overfitting or underfitting. In this paper, we try to combine the spectral domain of kernels with nonparametric Bayesian models. Dirichlet processes mixture models are used to resolve this problem by changing the number of components according to the data size. Multiple experiments have been conducted on this model and it shows competitive performance.
13

Elaissi, Ilyes, Okba Taouali, and Messaoud Hassani. "Online Prediction Model Based on New Kernel Method." International Review of Automatic Control (IREACO) 7, no. 1 (January 31, 2014): 107. http://dx.doi.org/10.15866/ireaco.v7i1.1299.
14

Wu, Xiao-hong, and Jian-jiang Zhou. "Modified possibilistic clustering model based on kernel methods." Journal of Shanghai University (English Edition) 12, no. 2 (April 2008): 136–40. http://dx.doi.org/10.1007/s11741-008-0210-2.
15

Zhang, Zhihua, James T. Kwok, and Dit-Yan Yeung. "Model-based transductive learning of the kernel matrix." Machine Learning 63, no. 1 (March 9, 2006): 69–101. http://dx.doi.org/10.1007/s10994-006-6130-8.
16

Sun, Jian Ping, and Lin Tao Hu. "Application of Status Monitoring of Wind Turbines Based on Relevance Vector Machine Regression." Advanced Materials Research 347-353 (October 2011): 2337–41. http://dx.doi.org/10.4028/www.scientific.net/amr.347-353.2337.

Abstract:
Based on single kernel function relevance vector machine (RVM) models, a multiple load-forecasting model has been established and simulated with several compound kernel functions, including the Gauss kernel, the Laplace kernel, and linear combinations of the Gauss and Laplace kernels and of the Gauss and polynomial kernels. Each model gained comparatively reasonable results in simulation. Moreover, multi linear-compound kernel RVMs performed better than single kernel RVMs in terms of most evaluating indicators, which proves that RVM is an appropriate machine learning method for monitoring the status of components of wind turbines.
17

Li, Jianliang, Xiaohai Li, Robert Lugg, and Lawrence S. Melvin. "Kernel Count Reduction in Model Based Optical Proximity Correction Process Models." Japanese Journal of Applied Physics 48, no. 6 (June 22, 2009): 06FA05. http://dx.doi.org/10.1143/jjap.48.06fa05.
18

Qi, Jinshan, Xun Liang, and Rui Xu. "A Multiple Kernel Learning Model Based on p-Norm." Computational Intelligence and Neuroscience 2018 (2018): 1–7. http://dx.doi.org/10.1155/2018/1018789.

Abstract:
By utilizing kernel functions, support vector machines (SVMs) successfully solve linearly inseparable problems, and their applicable areas have been greatly extended. Using multiple kernels (MKs) to improve the SVM classification accuracy has been a hot topic in the SVM research community for several years. However, most MK learning (MKL) methods employ an L1-norm constraint on the kernel combination weights, which forms a sparse yet nonsmooth solution for the kernel weights. Alternatively, the Lp-norm constraint on the kernel weights keeps all information in the base kernels; nonetheless, the solution of Lp-norm constrained MKL is nonsparse and sensitive to noise. Recently, some scholars presented an efficient sparse generalized MKL (L1- and L2-norms based GMKL) method, in which the combined L1- and L2-norms establish an elastic constraint on the kernel weights. In this paper, we further extend the GMKL to a more generalized MKL method based on the p-norm, by joining L1- and Lp-norms. Consequently, the L1- and L2-norms based GMKL is a special case of our method when p=2. Experiments demonstrated that our L1- and Lp-norms based MKL offers a higher accuracy than the L1- and L2-norms based GMKL in classification, while keeping the properties of the L1- and L2-norms based GMKL.
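A minimal sketch of the composite kernel at the heart of any such MKL method is shown below; the rescaling that enforces the elastic L1/Lp constraint is our simplification, since the paper learns the weights jointly with the SVM objective.

import numpy as np

def combine_kernels(gram_list, d, p=3.0):
    """Composite MKL kernel K = sum_m d_m * K_m with nonnegative weights d
    rescaled so that ||d||_1 + ||d||_p = 1, mimicking the elastic L1/Lp
    constraint discussed in the paper."""
    d = np.maximum(np.asarray(d, dtype=float), 0.0)
    norm = d.sum() + (d ** p).sum() ** (1.0 / p)
    d = d / norm
    K = sum(w * G for w, G in zip(d, gram_list))
    return K, d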
19

Yang, Hong, Lipeng Gao, and Guohui Li. "Underwater Acoustic Signal Prediction Based on MVMD and Optimized Kernel Extreme Learning Machine." Complexity 2020 (April 24, 2020): 1–17. http://dx.doi.org/10.1155/2020/6947059.

Abstract:
Aiming at the chaotic characteristics of underwater acoustic signal, a prediction model of grey wolf-optimized kernel extreme learning machine (OKELM) based on MVMD is proposed in this paper for short-term prediction of underwater acoustic signals. To solve the problem of K value selection in variational mode decomposition, a new K value selection method MVMD is proposed from the perspective of mutual information, which avoids the blindness of variational mode decomposition (VMD) in the preset modal number. Based on the prediction model of kernel extreme learning machine (KELM), this paper uses grey wolf optimization (GWO) algorithm to optimize and select its regularization parameters and kernel parameters and proposes an optimized kernel extreme learning machine OKELM. To further improve the prediction performance of the model, combined with MVMD, an underwater acoustic signal prediction model based on MVMD-OKELM is established. MVMD-OKELM prediction model is applied to Mackey–Glass chaotic time series prediction and underwater acoustic signal prediction and is compared with ARIMA, EMD-OKELM, and other prediction models. The experimental results show that the proposed MVMD-OKELM prediction model has a higher prediction accuracy and can be effectively applied to the prediction of underwater acoustic signal series.
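The KELM building block has a simple closed form, sketched below under the usual formulation (Huang et al.); the GWO parameter search and the MVMD decomposition stage are not reproduced, and the RBF kernel and default hyperparameters are assumptions.

import numpy as np

def rbf(A, B, gamma=0.1):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def kelm_fit_predict(X_train, y_train, X_test, C=100.0, gamma=0.1):
    """Kernel extreme learning machine in its usual closed form:
    beta = (I/C + K)^(-1) y, prediction = K(test, train) @ beta.
    C and gamma are the two hyperparameters the paper tunes with GWO."""
    K = rbf(X_train, X_train, gamma)
    beta = np.linalg.solve(np.eye(len(X_train)) / C + K, y_train)
    return rbf(X_test, X_train, gamma) @ beta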
20

Luo, Dang, and Zhang Huihui. "Grey clustering model based on kernel and information field." Grey Systems: Theory and Application 10, no. 1 (November 1, 2019): 56–67. http://dx.doi.org/10.1108/gs-08-2019-0029.

Abstract:
Purpose The purpose of this paper is to propose a grey clustering model based on kernel and information field to deal with the situation in which both the observation values and the turning points of the whitenization weight function are interval grey numbers. Design/methodology/approach First, the “unreduced axiom of degree of greyness” was expanded to obtain the inference of “information field not-reducing”. Then, based on the theoretical basis of inference, the expression of whitenization weight function with interval grey number was provided. The grey clustering model and fuzzy clustering model were compared to analyse the relationship and difference between the two models. Finally, the paper model and the fuzzy clustering model were applied to the example analysis, and the interval grey number clustering model was established to analyse the influencing factors of regional drought disaster risk in Henan Province. Findings The example analysis results illustrate that although the two clustering methods have different theoretical basis, they are suitable for dealing with complex systems with uncertainty or grey characteristic, solving the problem of incomplete system information, which has certain feasibility and rationality. The clustering results of case study show that five influencing factors of regional drought disaster risk in Henan Province are divided into three classes, consistent with the actual situation, and they show the validity and practicability of the clustering model. Originality/value The paper proposes a new whitenization weight function with interval grey number that can transform interval grey number operations into real number operations. It not only simplifies the calculation steps, but it has a great significance for the “small data sets and poor information” grey system and has a universal applicability.
21

SenGupta, Ishuita, Anil Kumar, and Rakesh Kumar Dwivedi. "Assimilation of Standard Regularizer Contextual Model and Composite Kernel with Fuzzy-based Noise Classifier." Journal of Modeling and Optimization 11, no. 1 (June 15, 2019): 16–24. http://dx.doi.org/10.32732/jmo.2019.11.1.16.

Abstract:
The paper assays the effect of assimilating a smoothness-prior contextual model and a composite kernel function with a fuzzy-based noise classifier using remote sensing data. The composite kernel is obtained by fusing two kernels together to improve the classification accuracy; Gaussian and sigmoid kernel functions were chosen for the kernel composition. As the contextual model, the Markov Random Field (MRF) standard regularization model (smoothness prior) has been studied with the composite kernel-based noise classifier. Comparative analysis of the new classifier with the conventional one shows an increase in overall accuracy.
22

Nadim, Mohammad, Wonjun Lee, and David Akopian. "Characteristic Features of the Kernel-level Rootkit for Learning-based Detection Model Training." Electronic Imaging 2021, no. 3 (June 18, 2021): 34–1. http://dx.doi.org/10.2352/issn.2470-1173.2021.3.mobmu-034.

Abstract:
The core part of the operating system is the kernel, and it plays an important role in managing critical data structure resources for correct operations. The kernel-level rootkits are the most elusive type of malware that can modify the running OS kernel in order to hide its presence and perform many malicious activities such as process hiding, module hiding, network communication hiding, and many more. In the past years, many approaches have been proposed to detect kernel-level rootkit. Still, it is challenging to detect new attacks and properly categorize the kernel-level rootkits. Memory forensic approaches showed efficient results with the limitation against transient attacks. Cross-view-based and integrity monitoring-based approaches have their own weaknesses. A learning-based detection approach is an excellent way to solve these problems. In this paper, we give an insight into the kernel-level rootkit characteristic features and how the features can be represented to train learning-based models in order to detect known and unknown attacks. Our feature set combined the memory forensic, cross-view, and integrity features to train learning-based detection models. We also suggest useful tools that can be used to collect the characteristics features of the kernel-level rootkit.
23

Christmann, Andreas, and Ding-Xuan Zhou. "Learning rates for the risk of kernel-based quantile regression estimators in additive models." Analysis and Applications 14, no. 03 (April 13, 2016): 449–77. http://dx.doi.org/10.1142/s0219530515500050.

Abstract:
Additive models play an important role in semiparametric statistics. This paper gives learning rates for regularized kernel-based methods for additive models. These learning rates compare favorably in particular in high dimensions to recent results on optimal learning rates for purely nonparametric regularized kernel-based quantile regression using the Gaussian radial basis function kernel, provided the assumption of an additive model is valid. Additionally, a concrete example is presented to show that a Gaussian function depending only on one variable lies in a reproducing kernel Hilbert space generated by an additive Gaussian kernel, but does not belong to the reproducing kernel Hilbert space generated by the multivariate Gaussian kernel of the same variance.
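The distinction drawn in the last sentence, an additive Gaussian kernel versus the multivariate one, can be made concrete in a few lines; this is a generic illustration, not code from the paper.

import numpy as np

def additive_gaussian_kernel(A, B, gamma=1.0):
    """Additive Gaussian kernel: a sum of univariate Gaussian kernels, one per
    coordinate, as opposed to the usual multivariate exp(-gamma * ||x - y||^2)."""
    return sum(np.exp(-gamma * (A[:, [j]] - B[:, [j]].T) ** 2) for j in range(A.shape[1]))

def multivariate_gaussian_kernel(A, B, gamma=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)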
24

Shim, Jooyong, Sang Bum Lee, Daiwon Kim, Jung-Suk Yu, and Chanha Hwang. "Kernel-based spatial error model for analyzing spatial panel data." Model Assisted Statistics and Applications 15, no. 3 (October 9, 2020): 239–48. http://dx.doi.org/10.3233/mas-200491.

Abstract:
A spatial panel data model captures spatial interactions across spatial units and over time. A lot of effort has been devoted to developing effective estimation methods for parametric and nonparametric spatial panel data models. The varying coefficient model has received a great deal of attention as an important tool for modeling panel data. In this paper we propose a kernel-based spatial error model for the purpose of analyzing spatial panel data. This model is based on the idea of a fixed effect time-varying coefficient model and the kernel technique of the support vector machine, along with the technique of regularization. A generalized cross validation method is also considered for choosing the hyperparameters which affect the performance of the proposed model. The proposed model is evaluated through numerical studies.
25

Pillonetto, Gianluigi, Tianshi Chen, and Lennart Ljung. "Kernel-based model order selection for linear system identification." IFAC Proceedings Volumes 46, no. 11 (2013): 257–62. http://dx.doi.org/10.3182/20130703-3-fr-4038.00043.
26

Choklati, A., and K. Sabri. "Cyclic Analysis of Extra Heart Sounds: Gauss Kernel based Model." International Journal of Image, Graphics and Signal Processing 10, no. 5 (May 8, 2018): 1–14. http://dx.doi.org/10.5815/ijigsp.2018.05.01.
27

Han, R., Z. Jing, and Y. Li. "Kernel based visual tracking with variant spatial resolution model." Electronics Letters 44, no. 8 (2008): 517. http://dx.doi.org/10.1049/el:20080051.
28

Gao, F. "Detecting vegetation structure using a kernel-based BRDF model." Remote Sensing of Environment 86, no. 2 (July 30, 2003): 198–205. http://dx.doi.org/10.1016/s0034-4257(03)00100-7.
29

Wang, Yong, Xinbin Luo, Lu Ding, Shan Fu, and Shiqiang Hu. "Collaborative model based UAV tracking via local kernel feature." Applied Soft Computing 72 (November 2018): 90–107. http://dx.doi.org/10.1016/j.asoc.2018.07.049.
30

Yanxiang, Fang, Shen Changxiang, Xu Jingdong, and Wu Gongyi. "A separated domain-based kernel model for trusted computing." Wuhan University Journal of Natural Sciences 11, no. 6 (November 2006): 1424–28. http://dx.doi.org/10.1007/bf02831789.
31

Yu, Changyong, Chengtang Yao, Mingtao Pei, and Yunde Jia. "Diffusion-based kernel matrix model for face liveness detection." Image and Vision Computing 89 (September 2019): 88–94. http://dx.doi.org/10.1016/j.imavis.2019.06.009.
32

Ikeda, Sei ichi, and Yoshiharu Sato. "Kernel methods for regression model based on variable selection." International Journal of Knowledge Engineering and Soft Data Paradigms 1, no. 1 (2009): 49. http://dx.doi.org/10.1504/ijkesdp.2009.021984.
33

Suykens, Johan A. K., Carlos Alzate, and Kristiaan Pelckmans. "Primal and dual model representations in kernel-based learning." Statistics Surveys 4 (2010): 148–83. http://dx.doi.org/10.1214/09-ss052.
34

Wu, Xiaohong, and Jianjiang Zhou. "Fuzzy principal component analysis and its Kernel-based model." Journal of Electronics (China) 24, no. 6 (November 2007): 772–75. http://dx.doi.org/10.1007/s11767-006-0039-z.
35

KUDLAI, Vladyslav, Nataliia BONDARENKO, and Viktor BONDARENKO. "CONSTRUCTION AND VERIFICATION OF A DIGITAL EQUALIZER MODEL." Herald of Khmelnytskyi National University. Technical sciences 313, no. 5 (October 27, 2022): 178–84. http://dx.doi.org/10.31891/2307-5732-2022-313-5-178-184.

Abstract:
An approach to the development of an equalizer by building its mathematical model based on a microcontroller is proposed. All operations, including signal processing and equalizer kernel calculation, are performed by a single microcontroller. Thanks to the created mathematical model of the equalizer, the calculation of the kernel is reduced to multiple uses of relatively simple operations, which saves time and program memory. The equalizer provides satisfactory processing quality at a small filter order; the filter is implemented as a digital finite impulse response (FIR) filter because of its linear phase-frequency response, its guaranteed stability even for a complex amplitude-frequency response, and also its associativity and linearity, which allow it to easily reproduce a complex amplitude-frequency response. The schematic implementation is based on parallel bandpass filters and a low-pass filter, followed by adding the filtered and amplified signals. It is the distributive property of the FIR filter that makes it possible to obtain a new kernel that includes all the amplified ranges as the sum of the corresponding kernel elements, instead of adding the amplified ranges. The associativity and linearity of the FIR filter make it possible to easily implement different types of filters on the basis of a low-pass filter, for the calculation of which the cardinal sine function is used together with a window function, which in combination gives high-quality frequency characteristics. The low-pass filter kernel and equalizer model are verified in the GNU Octave environment, which is an open-source analogue of Matlab. The model is checked by setting the frequency response of the test equalizer, and for individual filters the allowable width of the transition band and the maximum ripple in the stopband are set. The low-pass filter kernel is created with an arbitrary cutoff frequency, and the filter consists of 129 elements, which were multiplied by the Kaiser window with a parameter value equal to 4.5. As a result of verification of the mathematical model in the GNU Octave environment, the width of the transition band and the maximum ripple in the stopband meet the specified conditions. The simulated equalizer kernel fully corresponds to the specified frequency response. Verification of the mathematical model confirmed its efficiency and the compliance of the obtained characteristics of the equalizer with the specified requirements.
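The windowed-sinc design described here is standard, so a minimal sketch is easy to give with SciPy; the sample rate, cutoff frequencies, and band gains below are arbitrary illustrations, and only the filter length (129) and the Kaiser parameter (4.5) come from the abstract.

import numpy as np
from scipy import signal

fs = 48_000      # assumed sample rate; the abstract does not fix one
numtaps = 129    # filter length used in the verification
beta = 4.5       # Kaiser window parameter from the abstract

# Windowed-sinc low-pass prototype (cutoff chosen for illustration only)
lp = signal.firwin(numtaps, cutoff=2_000, window=("kaiser", beta), fs=fs)

# A band kernel is the difference of two linear-phase low-pass kernels; summing
# weighted band kernels gives one equalizer kernel, as the distributivity
# argument in the abstract suggests.
lp_hi = signal.firwin(numtaps, cutoff=6_000, window=("kaiser", beta), fs=fs)
band = lp_hi - lp                      # band-pass 2-6 kHz kernel
eq_kernel = 1.0 * lp + 2.0 * band      # example per-band gains

w, h = signal.freqz(eq_kernel, worN=2048, fs=fs)   # inspect the resulting response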
36

Gu, Lch, Zhw Ni, and Zhj Wu. "Study of Predictive Method Based on SVM Optimal Model Selection." Applied Mechanics and Materials 65 (June 2011): 443–46. http://dx.doi.org/10.4028/www.scientific.net/amm.65.443.

Abstract:
Model selection for the traditional SVM is time-consuming and gives poor prediction efficiency. By studying the kernel matrix, an SVM-based prediction method for selecting the optimal model, framework SVR-D1.2, was proposed with the help of the kernel matrix's symmetry and positive definiteness and of kernel alignment. The method was applied to the prediction of wheat scab, and comparison experiments were done with the main existing methods. The result shows that the method has higher efficiency and precision in predicting the occurrence tendency of wheat scab. Meanwhile, it is simple and practicable.
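Kernel alignment, the selection criterion mentioned above, has a simple empirical form (Cristianini et al.); the sketch below shows it for two Gram matrices, with kernel-target alignment obtained by taking the second matrix as the label outer product. This is a generic illustration, not the SVR-D1.2 framework itself.

import numpy as np

def kernel_alignment(K1, K2):
    """Empirical alignment between two Gram matrices:
    A(K1, K2) = <K1, K2>_F / sqrt(<K1, K1>_F * <K2, K2>_F)."""
    return np.sum(K1 * K2) / np.sqrt(np.sum(K1 * K1) * np.sum(K2 * K2))

# Kernel-target alignment ranks candidate kernels against the labels y (+1/-1):
# score = kernel_alignment(K_candidate, np.outer(y, y))
# The candidate with the highest score is kept for the SVM/SVR model.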
37

Feng, Jun, Jian-Zhou Zhang, and Bin Zhou. "Compact Support FDK Kernel Reconstruction Model Base on Approximate Inverse." Mathematical Problems in Engineering 2012 (2012): 1–12. http://dx.doi.org/10.1155/2012/109534.

Abstract:
A novel CT reconstruction model is proposed, and the reconstruction is completed by this kernel-based method. The reconstruction kernel can be obtained by combining the approximate inverse method with the FDK algorithm. The computation of the kernel is moderate, and the reconstruction results can be improved by introducing the compact support version of the kernel. The efficiency and the accuracy are shown in the numerical experiments.
38

Gao, Xiangbing, Bo Jia, Gen Li, and Xiaojing Ma. "Calorific Value Forecasting of Coal Gangue with Hybrid Kernel Function–Support Vector Regression and Genetic Algorithm." Energies 15, no. 18 (September 14, 2022): 6718. http://dx.doi.org/10.3390/en15186718.

Abstract:
The calorific value of coal gangue is a critical index for coal waste recycling and the energy industry. To establish an accurate and efficient calorific value forecasting model, a method based on hybrid kernel function–support vector regression and genetic algorithms is presented in this paper. Firstly, key features of coal gangue gathered from major coal mines are measured and used to build a sample set. Then, the forecasting performance of single kernel function-based models is established, and linear kernel and Gaussian kernel functions are chosen according to forecasting results. Next, a hybrid kernel combined with the two kernel functions mentioned above is used to establish a calorific value forecasting model. In addition, a genetic algorithm is introduced to optimize critical parameters of SVR and the adjustable weight. Finally, the forecasting model based on hybrid kernel function–support vector regression and genetic algorithms is built to predict the calorific value of new coal gangue samples. The experimental results indicate that the hybrid kernel function is more suitable for forecasting the calorific value of coal gangue than that of a single kernel function. Moreover, the forecasting performance of the proposed method is better than other conventional forecasting methods.
39

Kari, Tusongjiang, Wensheng Gao, Ayiguzhali Tuluhong, Yilihamu Yaermaimaiti, and Ziwei Zhang. "Mixed Kernel Function Support Vector Regression with Genetic Algorithm for Forecasting Dissolved Gas Content in Power Transformers." Energies 11, no. 9 (September 14, 2018): 2437. http://dx.doi.org/10.3390/en11092437.

Abstract:
Forecasting dissolved gas content in power transformers plays a significant role in detecting incipient faults and maintaining the safety of the power system. Though various forecasting models have been developed, there is still room to further improve prediction performance. In this paper, a new forecasting model is proposed by combining mixed kernel function-based support vector regression (MKF-SVR) and genetic algorithm (GA). First, forecasting performance of SVR models constructed with a single kernel are compared, and then Gaussian kernel and polynomial kernel are retained due to better learning and prediction ability. Next, a mixed kernel, which integrates a Gaussian kernel with a polynomial kernel, is used to establish a SVR-based forecasting model. Genetic algorithm (GA) and leave-one-out cross validation are employed to determine the free parameters of MKF-SVR, while mean absolute percentage error (MAPE) and squared correlation coefficient (r2) are applied to assess the quality of the parameters. The proposed model is implemented on a practical dissolved gas dataset and promising results are obtained. Finally, the forecasting performance of the proposed model is compared with three other approaches, including RBFNN, GRNN and GM. The experimental and comparison results demonstrate that the proposed model outperforms other popular models in terms of forecasting accuracy and fitting capability.
40

Jue, Wang. "Prediction model of pulmonary tuberculosis based on gray kernel AR-SVM model." Cluster Computing 22, S2 (February 17, 2018): 4383–87. http://dx.doi.org/10.1007/s10586-018-1906-8.
41

Ma, Xin. "Research on a Novel Kernel Based Grey Prediction Model and Its Applications." Mathematical Problems in Engineering 2016 (2016): 1–9. http://dx.doi.org/10.1155/2016/5471748.

Abstract:
Discrete grey prediction models have attracted considerable research interest due to their effectiveness in improving the modelling accuracy of traditional grey prediction models. The autoregressive GM(1,1) model, abbreviated as ARGM(1,1), is a novel discrete grey model which is easy to use and accurate in the prediction of approximately nonhomogeneous exponential time series. However, the ARGM(1,1) is essentially a linear model; thus, its applicability is still limited. In this paper a novel kernel-based ARGM(1,1) model is proposed, abbreviated as KARGM(1,1). The KARGM(1,1) has a nonlinear function which can be expressed by a kernel function using the kernel method, and its modelling procedures are presented in detail. Two case studies of predicting monthly gas well production are carried out with real-world production data. The results of the KARGM(1,1) model are compared to the existing discrete univariate grey prediction models, including ARGM(1,1), NDGM(1,1,k), DGM(1,1), and NGBMOP, and it is shown that the KARGM(1,1) outperforms the other four models.
42

Guo, Changying, Biqin Song, Yingjie Wang, Hong Chen, and Huijuan Xiong. "Robust Variable Selection and Estimation Based on Kernel Modal Regression." Entropy 21, no. 4 (April 16, 2019): 403. http://dx.doi.org/10.3390/e21040403.

Abstract:
Model-free variable selection has attracted increasing interest recently due to its flexibility in algorithmic design and outstanding performance in real-world applications. However, most of the existing statistical methods are formulated under the mean square error (MSE) criterion, and susceptible to non-Gaussian noise and outliers. As the MSE criterion requires the data to satisfy Gaussian noise condition, it potentially hampers the effectiveness of model-free methods in complex circumstances. To circumvent this issue, we present a new model-free variable selection algorithm by integrating kernel modal regression and gradient-based variable identification together. The derived modal regression estimator is related closely to information theoretic learning under the maximum correntropy criterion, and assures algorithmic robustness to complex noise by replacing learning of the conditional mean with the conditional mode. The gradient information of estimator offers a model-free metric to screen the key variables. In theory, we investigate the theoretical foundations of our new model on generalization-bound and variable selection consistency. In applications, the effectiveness of the proposed method is verified by data experiments.
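The correntropy objective underlying modal regression is compact enough to state directly; this is a generic illustration of the criterion, with a placeholder bandwidth sigma, not the authors' estimator.

import numpy as np

def correntropy_loss(residuals, sigma=1.0):
    """Negative sample correntropy of the residuals; minimizing this (i.e.,
    maximizing correntropy) drives the fit toward the conditional mode rather
    than the conditional mean, which is what gives modal regression its
    robustness to non-Gaussian noise and outliers."""
    return -np.mean(np.exp(-residuals ** 2 / (2 * sigma ** 2)))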
43

Nie, Junlan, Ruibo Gao, and Ye Kang. "Urban Noise Inference Model Based on Multiple Views and Kernel Tensor Decomposition." Fluctuation and Noise Letters 20, no. 03 (January 25, 2021): 2150027. http://dx.doi.org/10.1142/s0219477521500279.

Abstract:
Prediction of urban noise is becoming more significant for tackling noise pollution and protecting human mental health. However, existing noise prediction algorithms neglect not only the correlation between noise regions, but also the nonlinearity and sparsity of the data, which results in low accuracy when filling in the missing entries of the data. In this paper, we propose a model based on multiple views and kernel-matrix tensor decomposition to predict the noise situation at different times of day in each region. We first construct a kernel tensor decomposition model by using kernel mapping in order to speed up the decomposition and obtain a stable estimate of the prediction system. Then, we analyze and compute the causes of the noise from multiple views, including computing the similarity of regions and the correlation between noise categories by kernel distance, which improves the credibility of inferring the noise situation and the categories of regions. Finally, we devise a prediction algorithm based on the kernel-matrix tensor factorization model. We evaluate our method on a real dataset and conduct experiments to verify the advantages of our method compared with other existing baselines.
44

Lee, Kuan-Hui, Jenq-Neng Hwang, and Shih-I. Chen. "Model-Based Vehicle Localization Based on 3-D Constrained Multiple-Kernel Tracking." IEEE Transactions on Circuits and Systems for Video Technology 25, no. 1 (January 2015): 38–50. http://dx.doi.org/10.1109/tcsvt.2014.2329355.
45

Ikeda, Seiichi, and Yoshiharu Sato. "Kernel Canonical Discriminant Analysis Based on Variable Selection." Journal of Advanced Computational Intelligence and Intelligent Informatics 13, no. 4 (July 20, 2009): 416–20. http://dx.doi.org/10.20965/jaciii.2009.p0416.

Abstract:
We have shown that support vector regression and classification models are essentially linear models in a reproducing kernel Hilbert space (RKHS). To overcome the overfitting problem, a regularization term is added to the optimization process, but deciding the coefficient of the regularization term involves difficulties. We introduce the variable selection concept to the linear model in the RKHS, where the kernel functions are treated as variable transformations whose values are given by the observations. We show that kernel canonical discriminant functions for multiclass problems can be discussed under variable selection, which enables us to reduce the number of kernel functions in the discriminant function; i.e., the discriminant function is obtained as a linear combination of a sufficiently small number of kernel functions, so we can expect to obtain reasonable predictions. We discuss variable selection performance in canonical discriminant functions compared to support vector machines.
46

Tang, Yidong, Shucai Huang, and Aijun Xue. "Sparse Representation Based Binary Hypothesis Model for Hyperspectral Image Classification." Mathematical Problems in Engineering 2016 (2016): 1–10. http://dx.doi.org/10.1155/2016/3460281.

Abstract:
The sparse representation based classifier (SRC) and its kernel version (KSRC) have been employed for hyperspectral image (HSI) classification. However, the state-of-the-art SRC often aims at extended surface objects with linear mixture in smooth scene and assumes that the number of classes is given. Considering the small target with complex background, a sparse representation based binary hypothesis (SRBBH) model is established in this paper. In this model, a query pixel is represented in two ways, which are, respectively, by background dictionary and by union dictionary. The background dictionary is composed of samples selected from the local dual concentric window centered at the query pixel. Thus, for each pixel the classification issue becomes an adaptive multiclass classification problem, where only the number of desired classes is required. Furthermore, the kernel method is employed to improve the interclass separability. In kernel space, the coding vector is obtained by using kernel-based orthogonal matching pursuit (KOMP) algorithm. Then the query pixel can be labeled by the characteristics of the coding vectors. Instead of directly using the reconstruction residuals, the different impacts the background dictionary and union dictionary have on reconstruction are used for validation and classification. It enhances the discrimination and hence improves the performance.
47

Yang, Xiao Han, Di Suo, and Fuan Wen. "The General Embedded Kernel Simulation System Model." Advanced Materials Research 459 (January 2012): 58–62. http://dx.doi.org/10.4028/www.scientific.net/amr.459.58.

Abstract:
Based on an in-depth study of embedded system architecture and working principles, this paper proposes an embedded kernel simulation system model, the Device Operation Interface (DOI) model, grounded in computer simulation theory. Furthermore, an embedded 8051 simulated kernel is designed and developed based on this model. The 8051 simulation system achieves loading, compiling and linking of C language source files, simulates the process of 8051 command execution and displays the results through interfaces. The DOI model contributes to reuse across different experimental simulation platforms.
48

Guo, Luo, Zhihai Ma, and Lianjun Zhang. "Comparison of bandwidth selection in application of geographically weighted regression: a case study." Canadian Journal of Forest Research 38, no. 9 (September 2008): 2526–34. http://dx.doi.org/10.1139/x08-091.

Abstract:
A forest plot with a clustered spatial pattern of tree locations was used to investigate the impacts of different kernel functions (fixed vs. adaptive) and different sizes of bandwidth on model fitting, model performance, and spatial characteristics of the geographically weighted regression (GWR) coefficient estimates and model residuals. Our results indicated that (i) the GWR models with smaller bandwidths fit the data better, yielded smaller model residuals across tree sizes, significantly reduced spatial autocorrelation and heterogeneity for model residuals, and generated better spatial patterns for model residuals; however, smaller bandwidth sizes produced a high level of coefficient variability; (ii) the GWR models based on the fixed spatial kernel function produced smoother spatial distributions for the model coefficients than those based on the adaptive kernel function; and (iii) the GWR cross-validation or Akaike’s information criterion (AIC) optimization process may not produce an “optimal” bandwidth for model fitting and performance. It was evident that the selection of spatial kernel function and bandwidth has a strong impact on the descriptive and predictive power of GWR models.
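For readers new to GWR, the two weighting schemes compared in the paper can be sketched as follows; the Gaussian fixed kernel and bisquare adaptive kernel are the common textbook forms and may differ in detail from the exact functions used in this study.

import numpy as np

def gwr_weights(d, bandwidth, kernel="fixed_gaussian"):
    """Spatial weights at one regression point. d: distances to all observations.
    bandwidth: a distance for the fixed kernel, or a neighbour count for the
    adaptive bisquare kernel."""
    if kernel == "fixed_gaussian":
        return np.exp(-0.5 * (d / bandwidth) ** 2)
    # adaptive bisquare: bandwidth is the distance to the k-th nearest neighbour
    b = np.sort(d)[int(bandwidth)]
    w = (1 - (d / b) ** 2) ** 2
    return np.where(d < b, w, 0.0)

def gwr_fit_point(X, y, w):
    # weighted least squares at one location: beta = (X' W X)^(-1) X' W y
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)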
49

Pan, Lizheng, Dashuai Zhu, Shigang She, Aiguo Song, Xianchuan Shi, and Suolin Duan. "Gear fault diagnosis method based on wavelet-packet independent component analysis and support vector machine with kernel function fusion." Advances in Mechanical Engineering 10, no. 11 (November 2018): 168781401881103. http://dx.doi.org/10.1177/1687814018811036.

Abstract:
Aiming at the problem of gear fault diagnosis, in order to effectively extract the features and improve the accuracy of gear fault diagnosis, the method based on wavelet-packet independent component analysis and support vector machine with kernel function fusion is proposed in this research. The proposed wavelet-packet independent component analysis feature extraction method can effectively combine the advantages of wavelet packet and independent component analysis methods and acquire more comprehensive feature information. Besides, the proposed kernel-function-fusion support vector machine can well integrate the advantage characteristics of each kernel function. The energy features of wavelet packet coefficients are acquired with four-layer wavelet packet decomposition and then the extracted energy features are further optimized by the independent component analysis method. The kernel-function-fusion support vector machine method is adopted to realize the gear fault diagnosis. Two kernel function models with the best self-classification accuracy are employed to serve the gear fault diagnosis corporately. The test samples are primarily classified by the main kernel function model, and then some samples are selected to be reclassified with the other kernel function model. Finally, the two kernel function models cooperate to determine the type of test samples. The comparison investigations demonstrate that the proposed method based on wavelet-packet independent component analysis and support vector machine with kernel function fusion achieves very high diagnosis accuracy.
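A minimal sketch of the feature-extraction side of this pipeline is given below; the wavelet choice, the number of ICA components, and the pair of cooperating SVMs are assumptions made for illustration, not the authors' exact configuration.

import numpy as np
import pywt
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

def wp_energy_features(sig, wavelet="db4", level=4):
    """Energy of each terminal node of a 4-level wavelet-packet decomposition
    (2**level = 16 features per signal); the wavelet choice is an assumption."""
    wp = pywt.WaveletPacket(data=sig, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(np.square(node.data))
                     for node in wp.get_level(level, order="natural")])

# With vibration segments `signals` (n_samples, n_points) and labels `y`:
# E = np.vstack([wp_energy_features(s) for s in signals])           # energy features
# E_ica = FastICA(n_components=8, random_state=0).fit_transform(E)  # feature optimization
# clf_main = SVC(kernel="rbf", probability=True).fit(E_ica, y)      # primary kernel model
# clf_aux = SVC(kernel="poly", probability=True).fit(E_ica, y)      # model rechecking uncertain samples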
50

CHEN, BADONG, JOSE C. PRINCIPE, JINCHUN HU, and YU ZHU. "STOCHASTIC INFORMATION GRADIENT ALGORITHM WITH GENERALIZED GAUSSIAN DISTRIBUTION MODEL." Journal of Circuits, Systems and Computers 21, no. 01 (February 2012): 1250006. http://dx.doi.org/10.1142/s0218126612500065.

Abstract:
This paper presents a parameterized version of the stochastic information gradient (SIG) algorithm, in which the error distribution is modeled by generalized Gaussian density (GGD), with location, shape, and dispersion parameters. Compared with the kernel-based SIG (SIG-Kernel) algorithm, the GGD-based SIG (SIG-GGD) algorithm does not involve kernel width selection. If the error is zero-mean, the SIG-GGD algorithm will become the least mean p-power (LMP) algorithm with adaptive order and variable step-size. Due to its well matched density estimation and automatic switching capability, the proposed algorithm is favorably in line with existing algorithms.
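For zero-mean errors the abstract notes that SIG-GGD reduces to the least mean p-power (LMP) filter; a single LMP update is sketched below with a placeholder step size and order (the adaptive selection of p and of the step size is what the paper adds).

import numpy as np

def lmp_update(w, x, d, mu=0.01, p=1.5):
    """One step of the least mean p-power (LMP) adaptive filter:
    w <- w + mu * p * |e|^(p-1) * sign(e) * x, with a priori error e = d - w'x."""
    e = d - w @ x
    return w + mu * p * np.abs(e) ** (p - 1) * np.sign(e) * x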