Academic literature on the topic 'Generalization bounds'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Generalization bounds.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Generalization bounds"
Cohn, David, and Gerald Tesauro. "How Tight Are the Vapnik-Chervonenkis Bounds?" Neural Computation 4, no. 2 (March 1992): 249–69. http://dx.doi.org/10.1162/neco.1992.4.2.249.
Nedovic, M., and Lj Cvetkovic. "Norm bounds for the inverse and error bounds for linear complementarity problems for {P1,P2}-Nekrasov matrices." Filomat 35, no. 1 (2021): 239–50. http://dx.doi.org/10.2298/fil2101239n.
Nedovic, M. "Norm bounds for the inverse for generalized Nekrasov matrices in point-wise and block case." Filomat 35, no. 8 (2021): 2705–14. http://dx.doi.org/10.2298/fil2108705n.
Rubab, Faiza, Hira Nabi, and Asif R. Khan. "Generalization and Refinements of Jensen Inequality." Journal of Mathematical Analysis 12, no. 5 (October 31, 2021): 1–27. http://dx.doi.org/10.54379/jma-2021-5-1.
Liu, Tongliang, Dacheng Tao, and Dong Xu. "Dimensionality-Dependent Generalization Bounds for k-Dimensional Coding Schemes." Neural Computation 28, no. 10 (October 2016): 2213–49. http://dx.doi.org/10.1162/neco_a_00872.
Pereira, Rajesh, and Mohammad Ali Vali. "Generalizations of the Cauchy and Fujiwara Bounds for Products of Zeros of a Polynomial." Electronic Journal of Linear Algebra 31 (February 5, 2016): 565–71. http://dx.doi.org/10.13001/1081-3810.3333.
Parrondo, J. M. R., and C. Van den Broeck. "Vapnik-Chervonenkis bounds for generalization." Journal of Physics A: Mathematical and General 26, no. 9 (May 7, 1993): 2211–23. http://dx.doi.org/10.1088/0305-4470/26/9/016.
Freund, Yoav, Yishay Mansour, and Robert E. Schapire. "Generalization bounds for averaged classifiers." Annals of Statistics 32, no. 4 (August 2004): 1698–722. http://dx.doi.org/10.1214/009053604000000058.
Papadatos, N., and V. Papathanasiou. "A generalization of variance bounds." Statistics & Probability Letters 28, no. 2 (June 1996): 191–94. http://dx.doi.org/10.1016/0167-7152(95)00117-4.
Kochedykov, D. A. "Combinatorial shell bounds for generalization ability." Pattern Recognition and Image Analysis 20, no. 4 (December 2010): 459–73. http://dx.doi.org/10.1134/s1054661810040061.
Full textDissertations / Theses on the topic "Generalization bounds"
McDonald, Daniel J. "Generalization Error Bounds for Time Series." Research Showcase @ CMU, 2012. http://repository.cmu.edu/dissertations/184.
Kroon, Rodney Stephen. "Support vector machines, generalization bounds, and transduction." Thesis, Stellenbosch: University of Stellenbosch, 2003. http://hdl.handle.net/10019.1/16375.
Kelby, Robin J. "Formalized Generalization Bounds for Perceptron-Like Algorithms." Ohio University / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1594805966855804.
Giulini, Ilaria. "Generalization bounds for random samples in Hilbert spaces." Thesis, Paris, Ecole normale supérieure, 2015. http://www.theses.fr/2015ENSU0026/document.
This thesis focuses on obtaining generalization bounds for random samples in reproducing kernel Hilbert spaces. The approach consists in first obtaining non-asymptotic dimension-free bounds in finite-dimensional spaces, using some PAC-Bayesian inequalities related to Gaussian perturbations, and then in generalizing the results to a separable Hilbert space. We first investigate the question of estimating the Gram operator by a robust estimator from an i.i.d. sample, and we present uniform bounds that hold under weak moment assumptions. These results allow us to characterize principal component analysis independently of the dimension of the ambient space and to propose stable versions of it. In the last part of the thesis we present a new algorithm for spectral clustering. It consists in replacing the projection on the eigenvectors associated with the largest eigenvalues of the Laplacian matrix by a power of the normalized Laplacian. This iteration, justified by the analysis of clustering in terms of Markov chains, performs a smooth truncation. We prove non-asymptotic bounds for the convergence of our spectral clustering algorithm applied to a random sample of points in a Hilbert space; these bounds are deduced from the bounds for the Gram operator in a Hilbert space. Experiments are carried out in the context of image analysis.
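A rough sketch of the iteration this abstract describes (illustrative only: the function name power_spectral_clustering and the exponent parameter t are our own, and this is not Giulini's exact algorithm). Powering the normalized adjacency damps the small eigenvalues smoothly instead of truncating them:

```python
import numpy as np
from sklearn.cluster import KMeans

def power_spectral_clustering(W, n_clusters, t=10):
    """Cluster points given a similarity matrix W by powering the
    normalized adjacency M = D^{-1/2} W D^{-1/2}.

    Raising M to the power t shrinks its small eigenvalues toward zero
    while leaving the large ones nearly intact, a smooth substitute for
    the hard projection onto the leading eigenvectors."""
    W = np.asarray(W, dtype=float)
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    M = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # normalized adjacency
    Mt = np.linalg.matrix_power(M, t)                  # smooth spectral truncation
    # Row-normalize before k-means, as in standard spectral clustering.
    norms = np.maximum(np.linalg.norm(Mt, axis=1, keepdims=True), 1e-12)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(Mt / norms)
```

Larger values of t sharpen the implicit truncation of the spectrum, approaching the hard eigenvector projection in the limit.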
Rakhlin, Alexander. "Applications of empirical processes in learning theory : algorithmic stability and generalization bounds." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/34564.
This thesis studies two key properties of learning algorithms: their generalization ability and their stability with respect to perturbations. To analyze these properties, we focus on concentration inequalities and tools from empirical process theory. We obtain theoretical results and demonstrate their applications to machine learning. First, we show how various notions of stability upper- and lower-bound the bias and variance of several estimators of the expected performance for general learning algorithms. A weak stability condition is shown to be equivalent to consistency of empirical risk minimization. The second part of the thesis derives tight performance guarantees for greedy error minimization methods, a family of computationally tractable algorithms. In particular, we derive risk bounds for a greedy mixture density estimation procedure. We prove that, unlike what is suggested in the literature, the number of terms in the mixture is not a bias-variance trade-off for the performance. The third part of this thesis provides a solution to an open problem regarding the stability of Empirical Risk Minimization (ERM), an algorithm of central importance in learning theory. By studying the suprema of the empirical process, we prove that ERM over Donsker classes of functions is stable in the L1 norm. Hence, as the number of samples grows, it becomes less and less likely that a perturbation of o(√n) samples will result in a very different empirical minimizer. Asymptotic rates of this stability are proved under metric entropy assumptions on the function class. Through the use of a ratio limit inequality, we also prove stability of expected errors of empirical minimizers. Next, we investigate applications of the stability result. In particular, we focus on procedures that optimize an objective function, such as k-means and other clustering methods. We demonstrate that stability of clustering, just like stability of ERM, is closely related to the geometry of the class and the underlying measure. Furthermore, our result on stability of ERM delineates a phase transition between stability and instability of clustering methods. In the last chapter, we prove a generalization of the bounded-difference concentration inequality for almost-everywhere smooth functions. This result can be utilized to analyze algorithms which are almost always stable. Next, we prove a phase transition in the concentration of almost-everywhere smooth functions. Finally, a tight concentration of empirical errors of empirical minimizers is shown under an assumption on the underlying space.
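For context, the classical bounded-difference (McDiarmid) inequality that the last chapter generalizes can be stated as follows (standard form, included here for convenience):

```latex
% McDiarmid's bounded-difference inequality: if f: X^n -> R changes by at
% most c_i when only its i-th argument changes, then for independent
% X_1, ..., X_n and every t > 0,
\Pr\bigl( f(X_1,\dots,X_n) - \mathbb{E}\, f(X_1,\dots,X_n) \ge t \bigr)
  \;\le\; \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} c_i^2} \right)
```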
Nordenfors, Oskar. "A Literature Study Concerning Generalization Error Bounds for Neural Networks via Rademacher Complexity." Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-184487.
In this thesis, some basic results from the theory of machine learning and neural networks are presented, with the goal of finally discussing upper bounds on the generalization error of neural networks via Rademacher complexity.
Bellet, Aurélien. "Supervised metric learning with generalization guarantees." PhD thesis, Université Jean Monnet - Saint-Etienne, 2012. http://tel.archives-ouvertes.fr/tel-00770627.
Musayeva, Khadija. "Generalization Performance of Margin Multi-category Classifiers." Thesis, Université de Lorraine, 2019. http://www.theses.fr/2019LORR0096/document.
This thesis deals with the theory of margin multi-category classification and is based on the statistical learning theory founded by Vapnik and Chervonenkis. We are interested in deriving generalization bounds with explicit dependencies on the number C of categories, the sample size m and the margin parameter gamma, when the loss function considered is a Lipschitz continuous margin loss function. Generalization bounds rely on the empirical performance of the classifier as well as its "capacity". In this work, the following scale-sensitive capacity measures are considered: the Rademacher complexity, the covering numbers and the fat-shattering dimension. Our main contributions are obtained under the assumption that the classes of component functions implemented by a classifier have polynomially growing fat-shattering dimensions and that the component functions are independent. In the context of Mendelson's pathway, which relates the Rademacher complexity to the covering numbers and the latter to the fat-shattering dimension, we study the impact that decomposing at the level of one of these capacity measures has on the dependencies on C, m and gamma. In particular, we demonstrate that the dependency on C can be substantially improved over the state of the art if the decomposition is postponed to the level of the metric entropy or the fat-shattering dimension. On the other hand, this negatively impacts the rate of convergence (the dependency on m), an indication that optimizing the dependencies on the three basic parameters amounts to looking for a trade-off.
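To illustrate the family of results in question, a generic scalar-margin Rademacher bound (a standard textbook statement, not the thesis's exact multi-category result) reads:

```latex
% With probability at least 1 - delta over an i.i.d. sample of size m,
% every f in the class F satisfies
R(f) \;\le\; \widehat{R}_{\gamma}(f)
      \;+\; \frac{2}{\gamma}\, \mathfrak{R}_{m}(F)
      \;+\; \sqrt{\frac{\ln(1/\delta)}{2m}}
```

Here R(f) is the risk, the first term on the right is the empirical margin loss at margin gamma, and the middle term involves the Rademacher complexity of F; decomposing this capacity term across categories is what introduces the dependency on C.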
Philips, Petra Camilla. "Data-Dependent Analysis of Learning Algorithms." The Australian National University. Research School of Information Sciences and Engineering, 2005. http://thesis.anu.edu.au./public/adt-ANU20050901.204523.
Katsikarelis, Ioannis. "Structurally Parameterized Tight Bounds and Approximation for Generalizations of Independence and Domination." Thesis, Paris Sciences et Lettres (ComUE), 2019. http://www.theses.fr/2019PSLED048.
In this thesis we focus on the NP-hard problems (k, r)-CENTER and d-SCATTERED SET, which generalize the well-studied concepts of domination and independence over larger distances. In the first part we maintain a parameterized viewpoint and examine the standard parameterization as well as the most widely used graph parameters measuring the input's structure. We offer hardness results showing that there is no algorithm with running time below certain bounds, subject to the (Strong) Exponential Time Hypothesis, produce essentially optimal algorithms whose complexity matches these lower bounds, and further attempt to offer an alternative to exact computation in significantly reduced running time by way of approximation algorithms. In the second part we consider the (super-)polynomial (in-)approximability of the d-SCATTERED SET problem, i.e. we determine the exact relationship between an achievable approximation ratio ρ, the distance parameter d, and the running time of any ρ-approximation algorithm, expressed as a function of these and of the size of the input n. We then consider strictly polynomial running times and improve our understanding of the approximability characteristics of the problem on graphs of bounded maximum degree as well as bipartite graphs.
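For reference, the two problems generalize domination and independence as follows (standard definitions, stated here for convenience; r = 1 recovers DOMINATING SET and d = 2 recovers INDEPENDENT SET):

```latex
% (k, r)-CENTER: is there a set S of at most k vertices such that every
% vertex of the graph lies within distance r of S?
\exists\, S \subseteq V,\ |S| \le k:\quad
  \forall v \in V,\ \operatorname{dist}(v, S) \le r
% d-SCATTERED SET: is there a set S of at least k vertices that are
% pairwise at distance at least d?
\exists\, S \subseteq V,\ |S| \ge k:\quad
  \forall u, v \in S,\ u \ne v \Rightarrow \operatorname{dist}(u, v) \ge d
```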
Books on the topic "Generalization bounds"
Arnolʹd, V. I. Experimental Mathematics. Berkeley, California: MSRI Mathematical Sciences Research Institute, 2015.
Edmunds, D. E., and W. D. Evans. Sesquilinear Forms in Hilbert Spaces. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198812050.003.0004.
Espiritu, Yen Le. Race and U.S. Panethnic Formation. Edited by Ronald H. Bayor. Oxford University Press, 2014. http://dx.doi.org/10.1093/oxfordhb/9780199766031.013.013.
Horing, Norman J. Morgenstern. Superfluidity and Superconductivity. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198791942.003.0013.
Book chapters on the topic "Generalization bounds"
Zhang, Xinhua, Novi Quadrianto, Kristian Kersting, Zhao Xu, Yaakov Engel, Claude Sammut, Mark Reid, et al. "Generalization Bounds." In Encyclopedia of Machine Learning, 447–54. Boston, MA: Springer US, 2011. http://dx.doi.org/10.1007/978-0-387-30164-8_328.
Reid, Mark. "Generalization Bounds." In Encyclopedia of Machine Learning and Data Mining, 556. Boston, MA: Springer US, 2017. http://dx.doi.org/10.1007/978-1-4899-7687-1_328.
Rejchel, W. "Generalization Bounds for Ranking Algorithms." In Ensemble Classification Methods with Applications in R, 135–39. Chichester, UK: John Wiley & Sons, Ltd, 2018. http://dx.doi.org/10.1002/9781119421566.ch7.
Shawe-Taylor, John, and Nello Cristianini. "Margin Distribution Bounds on Generalization." In Lecture Notes in Computer Science, 263–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 1999. http://dx.doi.org/10.1007/3-540-49097-3_21.
Cadoli, Marco, Luigi Palopoli, and Francesco Scarcello. "Propositional Lower Bounds: Generalization and Algorithms." In Logics in Artificial Intelligence, 355–67. Berlin, Heidelberg: Springer Berlin Heidelberg, 1998. http://dx.doi.org/10.1007/3-540-49545-2_24.
Kääriäinen, Matti. "Generalization Error Bounds Using Unlabeled Data." In Learning Theory, 127–42. Berlin, Heidelberg: Springer Berlin Heidelberg, 2005. http://dx.doi.org/10.1007/11503415_9.
Mannor, Shie, and Ron Meir. "Geometric Bounds for Generalization in Boosting." In Lecture Notes in Computer Science, 461–72. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-44581-1_30.
Blanchard, Gilles. "Generalization Error Bounds for Aggregate Classifiers." In Nonlinear Estimation and Classification, 357–67. New York, NY: Springer New York, 2003. http://dx.doi.org/10.1007/978-0-387-21579-2_23.
Agarwal, Shivani. "Generalization Bounds for Some Ordinal Regression Algorithms." In Lecture Notes in Computer Science, 7–21. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008. http://dx.doi.org/10.1007/978-3-540-87987-9_6.
Vorontsov, Konstantin, and Andrey Ivahnenko. "Tight Combinatorial Generalization Bounds for Threshold Conjunction Rules." In Lecture Notes in Computer Science, 66–73. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21786-9_13.
Conference papers on the topic "Generalization bounds"
Banerjee, Pradeep Kr, and Guido Montufar. "Information Complexity and Generalization Bounds." In 2021 IEEE International Symposium on Information Theory (ISIT). IEEE, 2021. http://dx.doi.org/10.1109/isit45174.2021.9517960.
Lopez, Adrian Tovar, and Varun Jog. "Generalization error bounds using Wasserstein distances." In 2018 IEEE Information Theory Workshop (ITW). IEEE, 2018. http://dx.doi.org/10.1109/itw.2018.8613445.
Lei, Yunwen, Shao-Bo Lin, and Ke Tang. "Generalization Bounds for Regularized Pairwise Learning." In Twenty-Seventh International Joint Conference on Artificial Intelligence {IJCAI-18}. California: International Joint Conferences on Artificial Intelligence Organization, 2018. http://dx.doi.org/10.24963/ijcai.2018/329.
Dayal, Abhinav. "Adaptive bounds for quadric based generalization." In 2009 IEEE International Geoscience and Remote Sensing Symposium. IEEE, 2009. http://dx.doi.org/10.1109/igarss.2009.5417731.
Kääriäinen, Matti, and John Langford. "A comparison of tight generalization error bounds." In Proceedings of the 22nd International Conference on Machine Learning. New York, NY: ACM Press, 2005. http://dx.doi.org/10.1145/1102351.1102403.
Pensia, Ankit, Varun Jog, and Po-Ling Loh. "Generalization Error Bounds for Noisy, Iterative Algorithms." In 2018 IEEE International Symposium on Information Theory (ISIT). IEEE, 2018. http://dx.doi.org/10.1109/isit.2018.8437571.
Chan, Wendy. "Improving the Precision of Bounds for Generalization." In 2019 AERA Annual Meeting. Washington DC: AERA, 2019. http://dx.doi.org/10.3102/1426278.
Modak, Eeshan, Himanshu Asnani, and Vinod M. Prabhakaran. "Rényi Divergence Based Bounds on Generalization Error." In 2021 IEEE Information Theory Workshop (ITW). IEEE, 2021. http://dx.doi.org/10.1109/itw48936.2021.9611387.
Bu, Yuheng, Shaofeng Zou, and Venugopal V. Veeravalli. "Tightening Mutual Information Based Bounds on Generalization Error." In 2019 IEEE International Symposium on Information Theory (ISIT). IEEE, 2019. http://dx.doi.org/10.1109/isit.2019.8849590.
Issa, Ibrahim, Amedeo Roberto Esposito, and Michael Gastpar. "Strengthened Information-theoretic Bounds on the Generalization Error." In 2019 IEEE International Symposium on Information Theory (ISIT). IEEE, 2019. http://dx.doi.org/10.1109/isit.2019.8849834.
Reports on the topic "Generalization bounds"
Dhankhar, Ritu, and Prasanna Kumar. A Remark on a Generalization of the Cauchy’s Bound. "Prof. Marin Drinov" Publishing House of Bulgarian Academy of Sciences, October 2020. http://dx.doi.org/10.7546/crabs.2020.10.01.
Zarrieß, Benjamin, and Anni-Yasmin Turhan. Most Specific Generalizations w.r.t. General EL-TBoxes. Technische Universität Dresden, 2013. http://dx.doi.org/10.25368/2022.196.