Ready-made bibliography on the topic "Kernel Hilbert Spaces"
Create accurate references in APA, MLA, Chicago, Harvard, and many other citation styles
Consult the lists of relevant articles, books, theses, and other scholarly sources on the topic "Kernel Hilbert Spaces".
Next to every work in the bibliography there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a ".pdf" file and read its abstract online whenever the relevant parameters are available in the metadata.
Journal articles on the topic "Kernel Hilbert Spaces"
Ferrer, Osmin, Diego Carrillo, and Arnaldo De La Barrera. "Reproducing Kernel in Krein Spaces". WSEAS TRANSACTIONS ON MATHEMATICS 21 (January 11, 2022): 23–30. http://dx.doi.org/10.37394/23206.2022.21.4.
Thirulogasanthar, K., and S. Twareque Ali. "General construction of reproducing kernels on a quaternionic Hilbert space". Reviews in Mathematical Physics 29, no. 05 (May 2, 2017): 1750017. http://dx.doi.org/10.1142/s0129055x17500179.
Ferreira, J. C., and V. A. Menegatto. "Reproducing kernel Hilbert spaces associated with kernels on topological spaces". Functional Analysis and Its Applications 46, no. 2 (April 2012): 152–54. http://dx.doi.org/10.1007/s10688-012-0021-5.
Scovel, Clint, Don Hush, Ingo Steinwart, and James Theiler. "Radial kernels and their reproducing kernel Hilbert spaces". Journal of Complexity 26, no. 6 (December 2010): 641–60. http://dx.doi.org/10.1016/j.jco.2010.03.002.
Kumari, Rani, Jaydeb Sarkar, Srijan Sarkar, and Dan Timotin. "Factorizations of Kernels and Reproducing Kernel Hilbert Spaces". Integral Equations and Operator Theory 87, no. 2 (February 2017): 225–44. http://dx.doi.org/10.1007/s00020-017-2348-z.
CARMELI, C., E. DE VITO, A. TOIGO, and V. UMANITÀ. "VECTOR VALUED REPRODUCING KERNEL HILBERT SPACES AND UNIVERSALITY". Analysis and Applications 08, no. 01 (January 2010): 19–61. http://dx.doi.org/10.1142/s0219530510001503.
Ball, Joseph A., Gregory Marx, and Victor Vinnikov. "Noncommutative reproducing kernel Hilbert spaces". Journal of Functional Analysis 271, no. 7 (October 2016): 1844–920. http://dx.doi.org/10.1016/j.jfa.2016.06.010.
Alpay, Daniel, Palle Jorgensen, and Dan Volok. "Relative reproducing kernel Hilbert spaces". Proceedings of the American Mathematical Society 142, no. 11 (July 17, 2014): 3889–95. http://dx.doi.org/10.1090/s0002-9939-2014-12121-6.
ZHANG, HAIZHANG, and LIANG ZHAO. "ON THE INCLUSION RELATION OF REPRODUCING KERNEL HILBERT SPACES". Analysis and Applications 11, no. 02 (March 2013): 1350014. http://dx.doi.org/10.1142/s0219530513500140.
Agud, L., J. M. Calabuig, and E. A. Sánchez Pérez. "Weighted p-regular kernels for reproducing kernel Hilbert spaces and Mercer Theorem". Analysis and Applications 18, no. 03 (October 31, 2019): 359–83. http://dx.doi.org/10.1142/s0219530519500179.
Doctoral dissertations on the topic "Kernel Hilbert Spaces"
Tipton, James Edward. "Reproducing Kernel Hilbert spaces and complex dynamics". Diss., University of Iowa, 2016. https://ir.uiowa.edu/etd/2284.
Bhujwalla, Yusuf. "Nonlinear System Identification with Kernels : Applications of Derivatives in Reproducing Kernel Hilbert Spaces". Thesis, Université de Lorraine, 2017. http://www.theses.fr/2017LORR0315/document.
This thesis will focus exclusively on the application of kernel-based nonparametric methods to nonlinear identification problems. As for other nonlinear methods, two key questions in kernel-based identification are how to define a nonlinear model (kernel selection) and how to tune the complexity of the model (regularisation). The following chapter will discuss how these questions are usually dealt with in the literature. The principal contribution of this thesis is the presentation and investigation of two optimisation criteria (one existing in the literature and one novel proposition) for structural approximation and complexity tuning in kernel-based nonlinear system identification. Both methods are based on the idea of incorporating feature-based complexity constraints into the optimisation criterion by penalising derivatives of functions. Essentially, such methods offer the user flexibility in the definition of a kernel function and the choice of regularisation term, which opens new possibilities with respect to how nonlinear models can be estimated in practice. Both methods bear strong links with other methods from the literature, which will be examined in detail in Chapters 2 and 3 and will form the basis of the subsequent developments of the thesis. Whilst analogy will be made with parallel frameworks, the discussion will be rooted in the framework of Reproducing Kernel Hilbert Spaces (RKHS). Using RKHS methods will allow analysis of the methods presented from both a theoretical and a practical point of view. Furthermore, the methods developed will be applied to several identification case studies, comprising both simulation and real-data examples, notably:
• Structural detection in static nonlinear systems.
• Controlling smoothness in LPV models.
• Complexity tuning using structural penalties in NARX systems.
• Internet traffic modelling using kernel methods.
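The regularised estimation problem this abstract refers to can be illustrated with a standard kernel ridge (RKHS-norm-penalised) baseline, which is the kind of criterion that derivative-based penalties then modify. The following is a minimal sketch on toy data, not the thesis's own algorithm; the Gaussian kernel, its width, and the regularisation constant are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, z, width=0.5):
    """Gaussian (RBF) kernel matrix between 1-D sample vectors x and z."""
    return np.exp(-((x[:, None] - z[None, :]) ** 2) / (2.0 * width ** 2))

def fit_kernel_ridge(x, y, lam=1e-2, width=0.5):
    """Representer-theorem solution: f = sum_i alpha_i k(., x_i)."""
    K = gaussian_kernel(x, x, width)
    return np.linalg.solve(K + lam * np.eye(len(x)), y)

def predict(x_train, alpha, x_new, width=0.5):
    return gaussian_kernel(x_new, x_train, width) @ alpha

# Identify a static nonlinearity from noisy input-output samples.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 50)
y = np.tanh(2.0 * x) + 0.05 * rng.standard_normal(50)
alpha = fit_kernel_ridge(x, y)
fit_error = np.max(np.abs(predict(x, alpha, x) - np.tanh(2.0 * x)))
```

Swapping the RKHS-norm term for a penalty on derivatives of the estimated function changes only the regulariser in `fit_kernel_ridge`, which is the flexibility the thesis exploits.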
Struble, Dale William. "Wavelets on manifolds and multiscale reproducing kernel Hilbert spaces". Related electronic resource:, 2007. http://proquest.umi.com/pqdweb?did=1407687581&sid=1&Fmt=2&clientId=3739&RQT=309&VName=PQD.
Quiggin, Peter Philip. "Generalisations of Pick's theorem to reproducing Kernel Hilbert spaces". Thesis, Lancaster University, 1994. http://eprints.lancs.ac.uk/61962/.
Marx, Gregory. "The Complete Pick Property and Reproducing Kernel Hilbert Spaces". Thesis, Virginia Tech, 2014. http://hdl.handle.net/10919/24783.
Master of Science
Giménez, Febrer Pere Joan. "Matrix completion with prior information in reproducing kernel Hilbert spaces". Doctoral thesis, Universitat Politècnica de Catalunya, 2021. http://hdl.handle.net/10803/671718.
In matrix completion, the goal is to recover a matrix from a subset of observable entries. The most effective methods build on the idea that the unknown matrix has low rank. Being low-rank, its entries are a function of a few coefficients that can be estimated provided there are enough observations. Thus, in matrix completion the solution is obtained as the matrix of minimum rank that best fits the visible entries. Besides low rank, the unknown matrix may have other structural properties that can be exploited in the recovery process. In a smooth matrix, entries in nearby positions can be expected to take similar values. Likewise, groups of columns or rows may be known to be similar. This relational information is provided through various means such as covariance matrices or graphs, with the drawback that these cannot be derived from the data matrix itself, since it is incomplete. This thesis deals with matrix completion with prior information and presents methodologies that can be applied to a variety of situations. In the first part, the columns of the unknown matrix are identified as signals on a graph that is known beforehand. The adjacency matrix of the graph is then used to compute an initial point for a proximal gradient algorithm, in order to reduce the number of iterations needed to reach the solution. Next, assuming the signals are smooth, the graph Laplacian is incorporated into the problem formulation so as to enforce smoothness in the solution. This results in noise reduction in the observed matrix and a lower error, which is demonstrated through theoretical analysis and numerical simulations. The second part of the thesis introduces tools for exploiting prior information by means of reproducing kernel Hilbert spaces.
Since a kernel measures the similarity between two points in a space, it makes it possible to encode any kind of information, such as feature vectors, dictionaries, or graphs. By associating each column and row of the unknown matrix with an element in a set, and defining a pair of kernels that measure similarity between columns or rows, the unknown entries can be extrapolated by means of the kernel functions. A method based on kernel regression is presented, together with two additional variants that reduce the computational cost. The proposed methods prove competitive with existing techniques, especially when the number of observations is very low. In addition, an analysis of the mean squared error and of the generalization error is provided. For the generalization error, the transductive setting is adopted, which measures the ability of an algorithm to transfer information from a set of labelled samples to an unlabelled set. Error bounds are then derived for the proposed and existing algorithms using Rademacher complexity, and numerical experiments confirming the theoretical results are presented. Finally, the thesis explores the question of how to choose the observable entries of the matrix so as to minimize the recovery error of the complete matrix. A passive sampling strategy is proposed, which implies that no labels need to be known in order to design the sampling distribution; only the kernel functions are required. The method is based on constructing the best Nyström approximation of the kernel matrix by sampling the columns according to their leverage scores, a metric that arises naturally during the theoretical analysis.
Dieuleveut, Aymeric. "Stochastic approximation in Hilbert spaces". Thesis, Paris Sciences et Lettres (ComUE), 2017. http://www.theses.fr/2017PSLEE059/document.
The goal of supervised machine learning is to infer relationships between a phenomenon one seeks to predict and "explanatory" variables. To that end, multiple occurrences of the phenomenon are observed, from which a prediction rule is constructed. The last two decades have witnessed the emergence of very large data-sets, both in terms of the number of observations (e.g., in image analysis) and in terms of the number of explanatory variables (e.g., in genetics). This has raised two challenges: first, avoiding the pitfall of over-fitting, especially when the number of explanatory variables is much higher than the number of observations; and second, dealing with the computational constraints, such as when the mere resolution of a linear system becomes a difficulty of its own. Algorithms that take their roots in stochastic approximation methods tackle both of these difficulties simultaneously: these stochastic methods dramatically reduce the computational cost without degrading the quality of the proposed prediction rule, and they can naturally avoid over-fitting. As a consequence, the core of this thesis is the study of stochastic gradient methods. The popular parametric methods give predictors which are linear functions of a set of explanatory variables. However, they often result in an imprecise approximation of the underlying statistical structure. In the non-parametric setting, which is paramount in this thesis, this restriction is lifted. The class of functions from which the predictor is proposed depends on the observations. In practice, these methods have multiple purposes, and are essential for learning with non-vectorial data, which can be mapped onto a vector in a functional space using a positive definite kernel. This makes it possible to use algorithms designed for vectorial data, but requires the analysis to be made in the associated non-parametric space: the reproducing kernel Hilbert space.
Moreover, the analysis of non-parametric regression also sheds some light on the parametric setting when the number of predictors is much larger than the number of observations. The first contribution of this thesis is to provide a detailed analysis of stochastic approximation in the non-parametric setting, precisely in reproducing kernel Hilbert spaces. This analysis proves optimal convergence rates for the averaged stochastic gradient descent algorithm. As we take special care in using minimal assumptions, it applies to numerous situations, and covers both the settings in which the number of observations is known a priori and situations in which the learning algorithm works in an on-line fashion. The second contribution is an algorithm based on acceleration, which converges at optimal speed, both from the optimization point of view and from the statistical one. In the non-parametric setting, this can improve the convergence rate up to optimality, even in particular regimes for which the first algorithm remains sub-optimal. Finally, the third contribution of the thesis consists in an extension of the framework beyond the least-squares loss. The stochastic gradient descent algorithm is analyzed as a Markov chain. This point of view leads to an intuitive and insightful interpretation that outlines the differences between the quadratic setting and the more general setting. A simple method resulting in provable improvements in the convergence is then proposed.
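The averaged stochastic gradient descent analysed in this thesis can be sketched, in its simplest parametric least-squares form, as follows. The step size, data model, and dimensions are illustrative assumptions; the thesis itself studies the non-parametric RKHS version of this recursion.

```python
import numpy as np

def averaged_sgd(X, y, step=0.1):
    """One pass of least-squares SGD with Polyak-Ruppert iterate averaging."""
    n, d = X.shape
    w = np.zeros(d)       # current iterate
    w_bar = np.zeros(d)   # running average of iterates (the returned predictor)
    for t in range(n):
        g = (X[t] @ w - y[t]) * X[t]   # unbiased stochastic gradient
        w = w - step * g
        w_bar += (w - w_bar) / (t + 1)
    return w_bar

# Noisy linear model: each sample is seen once, in an on-line fashion.
rng = np.random.default_rng(2)
X = rng.standard_normal((2000, 3))
w_star = np.array([1.0, -2.0, 0.5])
y = X @ w_star + 0.1 * rng.standard_normal(2000)
w_hat = averaged_sgd(X, y)
```

Averaging the iterates rather than returning the last one is what allows a constant step size while still converging at the optimal statistical rate, which is the regime the abstract refers to.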
Giulini, Ilaria. "Generalization bounds for random samples in Hilbert spaces". Thesis, Paris, Ecole normale supérieure, 2015. http://www.theses.fr/2015ENSU0026/document.
This thesis focuses on obtaining generalization bounds for random samples in reproducing kernel Hilbert spaces. The approach consists in first obtaining non-asymptotic dimension-free bounds in finite-dimensional spaces using some PAC-Bayesian inequalities related to Gaussian perturbations and then in generalizing the results to a separable Hilbert space. We first investigate the question of estimating the Gram operator by a robust estimator from an i.i.d. sample, and we present uniform bounds that hold under weak moment assumptions. These results allow us to qualify principal component analysis independently of the dimension of the ambient space and to propose stable versions of it. In the last part of the thesis we present a new algorithm for spectral clustering. It consists in replacing the projection on the eigenvectors associated with the largest eigenvalues of the Laplacian matrix by a power of the normalized Laplacian. This iteration, justified by the analysis of clustering in terms of Markov chains, performs a smooth truncation. We prove non-asymptotic bounds for the convergence of our spectral clustering algorithm applied to a random sample of points in a Hilbert space, deduced from the bounds for the Gram operator in a Hilbert space. Experiments are done in the context of image analysis.
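The clustering iteration described here, replacing projection on leading eigenvectors by a power of a normalised graph operator, can be sketched on a toy two-cluster example. The random-walk normalisation, the kernel bandwidth, and the simple mean-threshold split below are simplifying assumptions for illustration, not the algorithm exactly as analysed in the thesis.

```python
import numpy as np

def power_cluster(X, gamma=2.0, t=20):
    """Two-way clustering via a power of the random-walk matrix D^{-1} W,
    a smooth alternative to hard projection on the leading eigenvectors."""
    W = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    P = W / W.sum(axis=1, keepdims=True)   # row-normalised similarity graph
    M = np.linalg.matrix_power(P, t)       # damps directions with small eigenvalues
    # After mixing, column 0 of P^t is nearly constant within each cluster and
    # differs across clusters; split at its mean (a simplified two-way test).
    v = M[:, 0]
    return (v > v.mean()).astype(int)

# Two well-separated Gaussian blobs of 15 points each.
rng = np.random.default_rng(3)
A = 0.2 * rng.standard_normal((15, 2)) + np.array([3.0, 0.0])
B = 0.2 * rng.standard_normal((15, 2)) - np.array([3.0, 0.0])
labels = power_cluster(np.vstack([A, B]))
```

Raising the normalised operator to a power shrinks the contribution of small-eigenvalue directions smoothly, which is the "smooth truncation" the abstract contrasts with hard eigenvector projection.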
Paiva, António R. C. "Reproducing kernel Hilbert spaces for point processes, with applications to neural activity analysis". [Gainesville, Fla.] : University of Florida, 2008. http://purl.fcla.edu/fcla/etd/UFE0022471.
Sabree, Aqeeb A. "Positive definite kernels, harmonic analysis, and boundary spaces: Drury-Arveson theory, and related". Diss., University of Iowa, 2019. https://ir.uiowa.edu/etd/7023.
Books on the topic "Kernel Hilbert Spaces"
Berlinet, Alain, and Christine Thomas-Agnan. Reproducing Kernel Hilbert Spaces in Probability and Statistics. Boston, MA: Springer US, 2004. http://dx.doi.org/10.1007/978-1-4419-9096-9.
Christine, Thomas-Agnan, ed. Reproducing kernel Hilbert spaces in probability and statistics. Boston: Kluwer Academic, 2004.
Dym, H. J contractive matrix functions, reproducing kernel Hilbert spaces and interpolation. Providence, R.I.: Published for the Conference Board of the Mathematical Sciences by the American Mathematical Society, 1989.
Pereverzyev, Sergei. An Introduction to Artificial Intelligence Based on Reproducing Kernel Hilbert Spaces. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-98316-1.
Minggen, Cui, and Lin Yingzhen, eds. Nonlinear numerical analysis in the reproducing Kernel space. Hauppauge, N.Y.: Nova Science Publishers, 2008.
Príncipe, J. C. Kernel adaptive filtering: A comprehensive introduction. Hoboken, N.J.: Wiley, 2010.
E, Fennell Robert, and Minton Roland B., 1956-, eds. Structured hereditary systems. New York: Marcel Dekker, 1987.
Christensen, Jens Gerlach. Trends in harmonic analysis and its applications: AMS special session on harmonic analysis and its applications : March 29-30, 2014, University of Maryland, Baltimore County, Baltimore, MD. Providence, Rhode Island: American Mathematical Society, 2015.
Book chapters on the topic "Kernel Hilbert Spaces"
Suzuki, Joe. "Hilbert Spaces". In Kernel Methods for Machine Learning with Math and R, 27–57. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-0398-4_2.
Suzuki, Joe. "Hilbert Spaces". In Kernel Methods for Machine Learning with Math and Python, 29–59. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-0401-1_2.
Christensen, Ronald. "Reproducing Kernel Hilbert Spaces". In Springer Texts in Statistics, 87–123. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-29164-8_3.
Montesinos López, Osval Antonio, Abelardo Montesinos López, and Jose Crossa. "Reproducing Kernel Hilbert Spaces Regression and Classification Methods". In Multivariate Statistical Machine Learning Methods for Genomic Prediction, 251–336. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89010-0_8.
Sawano, Yoshihiro. "Pasting Reproducing Kernel Hilbert Spaces". In Trends in Mathematics, 401–7. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-48812-7_51.
Pillonetto, Gianluigi, Tianshi Chen, Alessandro Chiuso, Giuseppe De Nicolao, and Lennart Ljung. "Regularization in Reproducing Kernel Hilbert Spaces". In Regularized System Identification, 181–246. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95860-2_6.
Gualtierotti, Antonio F. "Reproducing Kernel Hilbert Spaces: The Rudiments". In Detection of Random Signals in Dependent Gaussian Noise, 3–123. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22315-5_1.
Gualtierotti, Antonio F. "Relations Between Reproducing Kernel Hilbert Spaces". In Detection of Random Signals in Dependent Gaussian Noise, 217–305. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22315-5_3.
Gualtierotti, Antonio F. "Reproducing Kernel Hilbert Spaces and Discrimination". In Detection of Random Signals in Dependent Gaussian Noise, 329–430. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22315-5_5.
Ball, Joseph A., and Victor Vinnikov. "Formal Reproducing Kernel Hilbert Spaces: The Commutative and Noncommutative Settings". In Reproducing Kernel Spaces and Applications, 77–134. Basel: Birkhäuser Basel, 2003. http://dx.doi.org/10.1007/978-3-0348-8077-0_3.
Conference papers on the topic "Kernel Hilbert Spaces"
Tuia, Devis, Gustavo Camps-Valls, and Manel Martinez-Ramon. "Explicit recursivity into reproducing kernel Hilbert spaces". In ICASSP 2011 - 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2011. http://dx.doi.org/10.1109/icassp.2011.5947266.
SUQUET, CHARLES. "REPRODUCING KERNEL HILBERT SPACES AND RANDOM MEASURES". In Proceedings of the 5th International ISAAC Congress. WORLD SCIENTIFIC, 2009. http://dx.doi.org/10.1142/9789812835635_0013.
Bobade, Parag, Suprotim Majumdar, Savio Pereira, Andrew J. Kurdila, and John B. Ferris. "Adaptive estimation in reproducing kernel Hilbert spaces". In 2017 American Control Conference (ACC). IEEE, 2017. http://dx.doi.org/10.23919/acc.2017.7963839.
Paiva, Antonio R. C., Il Park, and Jose C. Principe. "Reproducing kernel Hilbert spaces for spike train analysis". In ICASSP 2008 - 2008 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2008. http://dx.doi.org/10.1109/icassp.2008.4518834.
Hasanbelliu, Erion, and Jose C. Principe. "Content addressable memories in reproducing Kernel Hilbert spaces". In 2008 IEEE Workshop on Machine Learning for Signal Processing (MLSP) (Formerly known as NNSP). IEEE, 2008. http://dx.doi.org/10.1109/mlsp.2008.4685447.
Tanaka, Akira, Hideyuki Imai, and Koji Takamiya. "Variance analyses for kernel regressors with nested reproducing kernel Hilbert spaces". In ICASSP 2012 - 2012 IEEE International Conference on Acoustics, Speech and Signal Processing. IEEE, 2012. http://dx.doi.org/10.1109/icassp.2012.6288300.
Zhang, Xiao, and Shizhong Liao. "Hypothesis Sketching for Online Kernel Selection in Continuous Kernel Space". In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/346.
Bouboulis, Pantelis, Sergios Theodoridis, and Konstantinos Slavakis. "Edge Preserving Image Denoising in Reproducing Kernel Hilbert Spaces". In 2010 20th International Conference on Pattern Recognition (ICPR). IEEE, 2010. http://dx.doi.org/10.1109/icpr.2010.652.
Zhaoda Deng, J. Gregory, and A. Kurdila. "Learning theory with consensus in reproducing kernel Hilbert spaces". In 2012 American Control Conference - ACC 2012. IEEE, 2012. http://dx.doi.org/10.1109/acc.2012.6315086.
Kurdila, Andrew, and Yu Lei. "Adaptive control via embedding in reproducing kernel Hilbert spaces". In 2013 American Control Conference (ACC). IEEE, 2013. http://dx.doi.org/10.1109/acc.2013.6580354.
Reports on the topic "Kernel Hilbert Spaces"
Fukumizu, Kenji, Francis R. Bach, and Michael I. Jordan. Dimensionality Reduction for Supervised Learning With Reproducing Kernel Hilbert Spaces. Fort Belvoir, VA: Defense Technical Information Center, May 2003. http://dx.doi.org/10.21236/ada446572.