Academic literature on the topic 'Kernel Hilbert Spaces'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Kernel Hilbert Spaces.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Kernel Hilbert Spaces"
Ferrer, Osmin, Diego Carrillo, and Arnaldo De La Barrera. "Reproducing Kernel in Krein Spaces." WSEAS Transactions on Mathematics 21 (January 11, 2022): 23–30. http://dx.doi.org/10.37394/23206.2022.21.4.
Thirulogasanthar, K., and S. Twareque Ali. "General construction of reproducing kernels on a quaternionic Hilbert space." Reviews in Mathematical Physics 29, no. 5 (May 2, 2017): 1750017. http://dx.doi.org/10.1142/s0129055x17500179.
Ferreira, J. C., and V. A. Menegatto. "Reproducing kernel Hilbert spaces associated with kernels on topological spaces." Functional Analysis and Its Applications 46, no. 2 (April 2012): 152–54. http://dx.doi.org/10.1007/s10688-012-0021-5.
Scovel, Clint, Don Hush, Ingo Steinwart, and James Theiler. "Radial kernels and their reproducing kernel Hilbert spaces." Journal of Complexity 26, no. 6 (December 2010): 641–60. http://dx.doi.org/10.1016/j.jco.2010.03.002.
Kumari, Rani, Jaydeb Sarkar, Srijan Sarkar, and Dan Timotin. "Factorizations of Kernels and Reproducing Kernel Hilbert Spaces." Integral Equations and Operator Theory 87, no. 2 (February 2017): 225–44. http://dx.doi.org/10.1007/s00020-017-2348-z.
Carmeli, C., E. De Vito, A. Toigo, and V. Umanità. "Vector valued reproducing kernel Hilbert spaces and universality." Analysis and Applications 8, no. 1 (January 2010): 19–61. http://dx.doi.org/10.1142/s0219530510001503.
Ball, Joseph A., Gregory Marx, and Victor Vinnikov. "Noncommutative reproducing kernel Hilbert spaces." Journal of Functional Analysis 271, no. 7 (October 2016): 1844–920. http://dx.doi.org/10.1016/j.jfa.2016.06.010.
Alpay, Daniel, Palle Jorgensen, and Dan Volok. "Relative reproducing kernel Hilbert spaces." Proceedings of the American Mathematical Society 142, no. 11 (July 17, 2014): 3889–95. http://dx.doi.org/10.1090/s0002-9939-2014-12121-6.
Zhang, Haizhang, and Liang Zhao. "On the inclusion relation of reproducing kernel Hilbert spaces." Analysis and Applications 11, no. 2 (March 2013): 1350014. http://dx.doi.org/10.1142/s0219530513500140.
Agud, L., J. M. Calabuig, and E. A. Sánchez Pérez. "Weighted p-regular kernels for reproducing kernel Hilbert spaces and Mercer Theorem." Analysis and Applications 18, no. 3 (October 31, 2019): 359–83. http://dx.doi.org/10.1142/s0219530519500179.
Full textDissertations / Theses on the topic "Kernel Hilbert Spaces"
Tipton, James Edward. "Reproducing Kernel Hilbert spaces and complex dynamics." Diss., University of Iowa, 2016. https://ir.uiowa.edu/etd/2284.
Bhujwalla, Yusuf. "Nonlinear System Identification with Kernels: Applications of Derivatives in Reproducing Kernel Hilbert Spaces." Thesis, Université de Lorraine, 2017. http://www.theses.fr/2017LORR0315/document.
Full textThis thesis will focus exclusively on the application of kernel-based nonparametric methods to nonlinear identification problems. As for other nonlinear methods, two key questions in kernel-based identification are the questions of how to define a nonlinear model (kernel selection) and how to tune the complexity of the model (regularisation). The following chapter will discuss how these questions are usually dealt with in the literature. The principal contribution of this thesis is the presentation and investigation of two optimisation criteria (one existing in the literature and one novel proposition) for structural approximation and complexity tuning in kernel-based nonlinear system identification. Both methods are based on the idea of incorporating feature-based complexity constraints into the optimisation criterion, by penalising derivatives of functions. Essentially, such methods offer the user flexibility in the definition of a kernel function and the choice of regularisation term, which opens new possibilities with respect to how nonlinear models can be estimated in practice. Both methods bear strong links with other methods from the literature, which will be examined in detail in Chapters 2 and 3 and will form the basis of the subsequent developments of the thesis. Whilst analogy will be made with parallel frameworks, the discussion will be rooted in the framework of Reproducing Kernel Hilbert Spaces (RKHS). Using RKHS methods will allow analysis of the methods presented from both a theoretical and a practical point-of-view. Furthermore, the methods developed will be applied to several identification ‘case studies’, comprising of both simulation and real-data examples, notably: • Structural detection in static nonlinear systems. • Controlling smoothness in LPV models. • Complexity tuning using structural penalties in NARX systems. • Internet traffic modelling using kernel methods
Struble, Dale William. "Wavelets on manifolds and multiscale reproducing kernel Hilbert spaces." Related electronic resource, 2007. http://proquest.umi.com/pqdweb?did=1407687581&sid=1&Fmt=2&clientId=3739&RQT=309&VName=PQD.
Quiggin, Peter Philip. "Generalisations of Pick's theorem to reproducing Kernel Hilbert spaces." Thesis, Lancaster University, 1994. http://eprints.lancs.ac.uk/61962/.
Marx, Gregory. "The Complete Pick Property and Reproducing Kernel Hilbert Spaces." Thesis, Virginia Tech, 2014. http://hdl.handle.net/10919/24783.
Giménez, Febrer Pere Joan. "Matrix completion with prior information in reproducing kernel Hilbert spaces." Doctoral thesis, Universitat Politècnica de Catalunya, 2021. http://hdl.handle.net/10803/671718.
In matrix completion, the goal is to recover a matrix from a subset of observable entries. The most effective methods build on the idea that the unknown matrix is low-rank. Being low-rank, its entries are a function of a few coefficients that can be estimated provided there are enough observations. Thus, in matrix completion the solution is obtained as the minimum-rank matrix that best fits the visible entries. Besides low rank, the unknown matrix may have other structural properties that can be exploited in the recovery process. In a smooth matrix, entries in nearby positions can be expected to have similar values. Likewise, groups of columns or rows may be known to be similar. This relational information is provided through various means such as covariance matrices or graphs, with the drawback that these cannot be derived from the data matrix itself, since it is incomplete. This thesis deals with matrix completion with prior information and presents methodologies that can be applied to a variety of situations. In the first part, the columns of the unknown matrix are identified as signals on a graph that is known beforehand. The adjacency matrix of the graph is then used to compute an initial point for a proximal gradient algorithm in order to reduce the number of iterations needed to reach the solution. Next, under the assumption that the signals are smooth, the graph Laplacian is incorporated into the problem formulation so as to enforce smoothness in the solution. This results in noise reduction in the observed matrix and a smaller error, which is demonstrated through theoretical analysis and numerical simulations. The second part of the thesis introduces tools for exploiting prior information by means of reproducing kernel Hilbert spaces. Since a kernel measures the similarity between two points in a space, it can encode any kind of information, such as feature vectors, dictionaries, or graphs. By associating each column and row of the unknown matrix with an element in a set, and defining a pair of kernels that measure similarity between columns or rows, the unknown entries can be extrapolated by means of the kernel functions. A method based on kernel regression is presented, together with two additional variants that reduce the computational cost. The proposed methods prove competitive with existing techniques, especially when the number of observations is very low. In addition, an analysis of the mean squared error and the generalization error is provided. For the generalization error, the transductive setting is adopted, which measures the ability of an algorithm to transfer information from a set of labelled samples to an unlabelled set. Error bounds are then derived for the proposed and existing algorithms using Rademacher complexity, and numerical experiments are presented that confirm the theoretical results. Finally, the thesis explores the question of how to choose the observable entries of the matrix so as to minimize the recovery error of the complete matrix. A passive sampling strategy is proposed, meaning that no labels need to be known in order to design the sampling distribution; only the kernel functions are required. The method is based on building the best Nyström approximation to the kernel matrix by sampling the columns according to their leverage scores, a metric that arises naturally during the theoretical analysis.
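To make the kernel-based completion idea in the abstract concrete, here is a minimal numpy sketch: unknown entries of a partially observed matrix are extrapolated through a pair of row and column kernels by fitting X_hat = Kr @ W @ Kc on the observed entries with gradient descent. All names and parameters (`rbf`, `gamma`, `lam`, the 30% sampling rate) are illustrative assumptions, and a plain Frobenius penalty on W stands in for the RKHS norm used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 30, 20

def rbf(u, gamma=30.0):
    d = u[:, None] - u[None, :]
    return np.exp(-gamma * d**2)

# Hypothetical side information: rows and columns are indexed by points on a
# line, so nearby rows (columns) are similar. In practice these kernels could
# encode graphs, covariance matrices or feature vectors, as in the abstract.
Kr, Kc = rbf(np.linspace(0, 1, n)), rbf(np.linspace(0, 1, m))

# Ground-truth matrix lying in the product RKHS, observed on ~30% of entries.
X_true = Kr @ rng.standard_normal((n, m)) @ Kc
X_true /= np.abs(X_true).max()
mask = rng.random((n, m)) < 0.3

# Fit X_hat = Kr @ W @ Kc to the observed entries by gradient descent; the
# step size 1/L is safe since L bounds the curvature of the data-fit term.
W, lam = np.zeros((n, m)), 1e-3
step = 1.0 / (np.linalg.norm(Kr, 2) ** 2 * np.linalg.norm(Kc, 2) ** 2)
for _ in range(5000):
    R = mask * (Kr @ W @ Kc - X_true)       # residual on observed entries only
    W -= step * (Kr @ R @ Kc + lam * W)     # Kr, Kc are symmetric

X_hat = Kr @ W @ Kc
err = np.linalg.norm((X_hat - X_true)[~mask]) / np.linalg.norm(X_true[~mask])
print("relative error on unobserved entries:", round(err, 3))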
Dieuleveut, Aymeric. "Stochastic approximation in Hilbert spaces." Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEE059/document.
The goal of supervised machine learning is to infer relationships between a phenomenon one seeks to predict and "explanatory" variables. To that end, multiple occurrences of the phenomenon are observed, from which a prediction rule is constructed. The last two decades have witnessed the emergence of very large data-sets, both in terms of the number of observations (e.g., in image analysis) and in terms of the number of explanatory variables (e.g., in genetics). This has raised two challenges: first, avoiding the pitfall of over-fitting, especially when the number of explanatory variables is much higher than the number of observations; and second, dealing with the computational constraints, such as when the mere resolution of a linear system becomes a difficulty of its own. Algorithms that take their roots in stochastic approximation methods tackle both of these difficulties simultaneously: these stochastic methods dramatically reduce the computational cost, without degrading the quality of the proposed prediction rule, and they can naturally avoid over-fitting. As a consequence, the core of this thesis will be the study of stochastic gradient methods. The popular parametric methods give predictors which are linear functions of a set of explanatory variables. However, they often result in an imprecise approximation of the underlying statistical structure. In the non-parametric setting, which is paramount in this thesis, this restriction is lifted. The class of functions from which the predictor is proposed depends on the observations. In practice, these methods have multiple purposes, and are essential for learning with non-vectorial data, which can be mapped onto a vector in a functional space using a positive definite kernel. This allows the use of algorithms designed for vectorial data, but requires the analysis to be made in the associated non-parametric space: the reproducing kernel Hilbert space. Moreover, the analysis of non-parametric regression also sheds some light on the parametric setting when the number of predictors is much larger than the number of observations. The first contribution of this thesis is to provide a detailed analysis of stochastic approximation in the non-parametric setting, precisely in reproducing kernel Hilbert spaces. This analysis proves optimal convergence rates for the averaged stochastic gradient descent algorithm. As we take special care in using minimal assumptions, it applies to numerous situations, and covers both the settings in which the number of observations is known a priori and situations in which the learning algorithm works in an on-line fashion. The second contribution is an algorithm based on acceleration, which converges at optimal speed, both from the optimization point of view and from the statistical one. In the non-parametric setting, this can improve the convergence rate up to optimality, even in particular regimes for which the first algorithm remains sub-optimal. Finally, the third contribution of the thesis consists in an extension of the framework beyond the least-squares loss. The stochastic gradient descent algorithm is analyzed as a Markov chain. This point of view leads to an intuitive and insightful interpretation that outlines the differences between the quadratic setting and the more general setting. A simple method resulting in provable improvements in the convergence is then proposed.
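The central object of this abstract, averaged stochastic gradient descent for least-squares regression in an RKHS, can be sketched in a few lines. The following is a hedged numpy illustration on streaming 1-D data, not the algorithm as analysed in the thesis: the Gaussian kernel, the constant step size `gamma` and the toy target sin(2x) are all arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def k(a, b, sigma=0.5):
    return np.exp(-(a - b) ** 2 / (2 * sigma**2))

# Streaming 1-D regression data (a toy stand-in for the thesis's setting).
T = 500
xs = rng.uniform(-2, 2, T)
ys = np.sin(2 * xs) + 0.2 * rng.standard_normal(T)

gamma = 0.5                    # constant step size
alpha = np.zeros(T)            # coefficients of the current SGD iterate
alpha_bar = np.zeros(T)        # coefficients of the averaged iterate

for t in range(T):
    # Prediction of the current iterate at the new point x_t.
    pred = alpha[:t] @ k(xs[:t], xs[t])
    # A stochastic gradient step for the squared loss adds one kernel "atom".
    alpha[t] = gamma * (ys[t] - pred)
    # Polyak-Ruppert averaging of the iterates, done on the coefficients.
    alpha_bar = (t * alpha_bar + alpha) / (t + 1)

# Evaluate the averaged predictor on a test grid; errors should be modest.
grid = np.linspace(-2, 2, 9)
f_bar = np.array([alpha_bar @ k(xs, g) for g in grid])
print(np.round(f_bar - np.sin(2 * grid), 2))
```

Note that each observation contributes one coefficient, so the predictor grows with the stream; the averaging of iterates is what the thesis's optimal-rate analysis is about.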
Giulini, Ilaria. "Generalization bounds for random samples in Hilbert spaces." Thesis, Paris, Ecole normale supérieure, 2015. http://www.theses.fr/2015ENSU0026/document.
This thesis focuses on obtaining generalization bounds for random samples in reproducing kernel Hilbert spaces. The approach consists in first obtaining non-asymptotic dimension-free bounds in finite-dimensional spaces using some PAC-Bayesian inequalities related to Gaussian perturbations, and then in generalizing the results to a separable Hilbert space. We first investigate the question of estimating the Gram operator by a robust estimator from an i.i.d. sample, and we present uniform bounds that hold under weak moment assumptions. These results allow us to qualify principal component analysis independently of the dimension of the ambient space and to propose stable versions of it. In the last part of the thesis we present a new algorithm for spectral clustering. It consists in replacing the projection on the eigenvectors associated with the largest eigenvalues of the Laplacian matrix by a power of the normalized Laplacian. This iteration, justified by the analysis of clustering in terms of Markov chains, performs a smooth truncation. We prove non-asymptotic bounds for the convergence of our spectral clustering algorithm applied to a random sample of points in a Hilbert space, deduced from the bounds for the Gram operator in a Hilbert space. Experiments are carried out in the context of image analysis.
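The modified spectral-clustering step described in this abstract, replacing the projection on the leading eigenvectors with a power of the normalized operator, admits a compact illustration. The sketch below is a loose reading of the abstract rather than the thesis's algorithm: it uses the symmetrically normalized adjacency A = D^{-1/2} W D^{-1/2} (that is, the identity minus the normalized Laplacian), so that a large matrix power smoothly damps all but the near-unit eigenvalues; the toy data and the power 20 are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two noisy clusters on the real line (toy data).
x = np.concatenate([rng.normal(-2, 0.3, 40), rng.normal(2, 0.3, 40)])

# Gaussian affinity and symmetrically normalized adjacency A = D^-1/2 W D^-1/2.
W = np.exp(-(x[:, None] - x[None, :]) ** 2)
d = W.sum(axis=1)
A = W / np.sqrt(np.outer(d, d))

# Instead of projecting on the top eigenvectors of A (hard truncation), take a
# large power of A: eigenvalues near 1 survive, the rest are smoothly damped.
B = np.linalg.matrix_power(A, 20)

# Rows of B now concentrate around one direction per cluster; normalize the
# rows and split them with a sign test on the second singular direction.
B /= np.linalg.norm(B, axis=1, keepdims=True)
_, _, Vt = np.linalg.svd(B)
labels = (B @ Vt[1] > 0).astype(int)
print(labels)   # first 40 points should share one label, last 40 the other
```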
Paiva, António R. C. "Reproducing kernel Hilbert spaces for point processes, with applications to neural activity analysis." [Gainesville, Fla.]: University of Florida, 2008. http://purl.fcla.edu/fcla/etd/UFE0022471.
Sabree, Aqeeb A. "Positive definite kernels, harmonic analysis, and boundary spaces: Drury-Arveson theory, and related." Diss., University of Iowa, 2019. https://ir.uiowa.edu/etd/7023.
Books on the topic "Kernel Hilbert Spaces"
Berlinet, Alain, and Christine Thomas-Agnan. Reproducing Kernel Hilbert Spaces in Probability and Statistics. Boston, MA: Springer US, 2004. http://dx.doi.org/10.1007/978-1-4419-9096-9.
Thomas-Agnan, Christine, ed. Reproducing kernel Hilbert spaces in probability and statistics. Boston: Kluwer Academic, 2004.
Dym, H. J Contractive Matrix Functions, Reproducing Kernel Hilbert Spaces and Interpolation. Providence, R.I.: Published for the Conference Board of the Mathematical Sciences by the American Mathematical Society, 1989.
Pereverzyev, Sergei. An Introduction to Artificial Intelligence Based on Reproducing Kernel Hilbert Spaces. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98316-1.
Cui, Minggen, and Yingzhen Lin, eds. Nonlinear numerical analysis in the reproducing Kernel space. Hauppauge, N.Y.: Nova Science Publishers, 2008.
Príncipe, J. C. Kernel adaptive filtering: A comprehensive introduction. Hoboken, N.J.: Wiley, 2010.
Fennell, Robert E., and Roland B. Minton, eds. Structured hereditary systems. New York: Marcel Dekker, 1987.
Christensen, Jens Gerlach. Trends in harmonic analysis and its applications: AMS special session on harmonic analysis and its applications, March 29-30, 2014, University of Maryland, Baltimore County, Baltimore, MD. Providence, Rhode Island: American Mathematical Society, 2015.
Book chapters on the topic "Kernel Hilbert Spaces"
Suzuki, Joe. "Hilbert Spaces." In Kernel Methods for Machine Learning with Math and R, 27–57. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-0398-4_2.
Suzuki, Joe. "Hilbert Spaces." In Kernel Methods for Machine Learning with Math and Python, 29–59. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-0401-1_2.
Christensen, Ronald. "Reproducing Kernel Hilbert Spaces." In Springer Texts in Statistics, 87–123. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-29164-8_3.
Montesinos López, Osval Antonio, Abelardo Montesinos López, and Jose Crossa. "Reproducing Kernel Hilbert Spaces Regression and Classification Methods." In Multivariate Statistical Machine Learning Methods for Genomic Prediction, 251–336. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89010-0_8.
Sawano, Yoshihiro. "Pasting Reproducing Kernel Hilbert Spaces." In Trends in Mathematics, 401–7. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-48812-7_51.
Pillonetto, Gianluigi, Tianshi Chen, Alessandro Chiuso, Giuseppe De Nicolao, and Lennart Ljung. "Regularization in Reproducing Kernel Hilbert Spaces." In Regularized System Identification, 181–246. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95860-2_6.
Gualtierotti, Antonio F. "Reproducing Kernel Hilbert Spaces: The Rudiments." In Detection of Random Signals in Dependent Gaussian Noise, 3–123. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22315-5_1.
Gualtierotti, Antonio F. "Relations Between Reproducing Kernel Hilbert Spaces." In Detection of Random Signals in Dependent Gaussian Noise, 217–305. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22315-5_3.
Gualtierotti, Antonio F. "Reproducing Kernel Hilbert Spaces and Discrimination." In Detection of Random Signals in Dependent Gaussian Noise, 329–430. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22315-5_5.
Ball, Joseph A., and Victor Vinnikov. "Formal Reproducing Kernel Hilbert Spaces: The Commutative and Noncommutative Settings." In Reproducing Kernel Spaces and Applications, 77–134. Basel: Birkhäuser Basel, 2003. http://dx.doi.org/10.1007/978-3-0348-8077-0_3.
Conference papers on the topic "Kernel Hilbert Spaces"
Tuia, Devis, Gustavo Camps-Valls, and Manel Martinez-Ramon. "Explicit recursivity into reproducing kernel Hilbert spaces." In ICASSP 2011 - 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2011. http://dx.doi.org/10.1109/icassp.2011.5947266.
Suquet, Charles. "Reproducing kernel Hilbert spaces and random measures." In Proceedings of the 5th International ISAAC Congress. World Scientific, 2009. http://dx.doi.org/10.1142/9789812835635_0013.
Bobade, Parag, Suprotim Majumdar, Savio Pereira, Andrew J. Kurdila, and John B. Ferris. "Adaptive estimation in reproducing kernel Hilbert spaces." In 2017 American Control Conference (ACC). IEEE, 2017. http://dx.doi.org/10.23919/acc.2017.7963839.
Paiva, Antonio R. C., Il Park, and Jose C. Principe. "Reproducing kernel Hilbert spaces for spike train analysis." In ICASSP 2008 - 2008 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2008. http://dx.doi.org/10.1109/icassp.2008.4518834.
Hasanbelliu, Erion, and Jose C. Principe. "Content addressable memories in reproducing Kernel Hilbert spaces." In 2008 IEEE Workshop on Machine Learning for Signal Processing (MLSP) (formerly known as NNSP). IEEE, 2008. http://dx.doi.org/10.1109/mlsp.2008.4685447.
Tanaka, Akira, Hideyuki Imai, and Koji Takamiya. "Variance analyses for kernel regressors with nested reproducing kernel Hilbert spaces." In ICASSP 2012 - 2012 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2012. http://dx.doi.org/10.1109/icassp.2012.6288300.
Zhang, Xiao, and Shizhong Liao. "Hypothesis Sketching for Online Kernel Selection in Continuous Kernel Space." In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence (IJCAI-PRICAI-20). California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/346.
Bouboulis, Pantelis, Sergios Theodoridis, and Konstantinos Slavakis. "Edge Preserving Image Denoising in Reproducing Kernel Hilbert Spaces." In 2010 20th International Conference on Pattern Recognition (ICPR). IEEE, 2010. http://dx.doi.org/10.1109/icpr.2010.652.
Deng, Zhaoda, J. Gregory, and A. Kurdila. "Learning theory with consensus in reproducing kernel Hilbert spaces." In 2012 American Control Conference - ACC 2012. IEEE, 2012. http://dx.doi.org/10.1109/acc.2012.6315086.
Kurdila, Andrew, and Yu Lei. "Adaptive control via embedding in reproducing kernel Hilbert spaces." In 2013 American Control Conference (ACC). IEEE, 2013. http://dx.doi.org/10.1109/acc.2013.6580354.
Reports on the topic "Kernel Hilbert Spaces"
Fukumizu, Kenji, Francis R. Bach, and Michael I. Jordan. Dimensionality Reduction for Supervised Learning With Reproducing Kernel Hilbert Spaces. Fort Belvoir, VA: Defense Technical Information Center, May 2003. http://dx.doi.org/10.21236/ada446572.