Academic literature on the topic "Sparse bound"
Create a precise citation in APA, MLA, Chicago, Harvard, and other styles
Consult the thematic lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Sparse bound".
Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Sparse bound"
Lorist, Emiel and Zoe Nieraeth. "Sparse domination implies vector-valued sparse domination". Mathematische Zeitschrift 301, no. 1 (January 12, 2022): 1–35. http://dx.doi.org/10.1007/s00209-021-02943-z.
Shparlinski, Igor E. and José Felipe Voloch. "Value Sets of Sparse Polynomials". Canadian Mathematical Bulletin 63, no. 1 (September 24, 2019): 187–96. http://dx.doi.org/10.4153/s0008439519000316.
Zhang, Rui, Rui Xin, Margo Seltzer and Cynthia Rudin. "Optimal Sparse Regression Trees". Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (June 26, 2023): 11270–79. http://dx.doi.org/10.1609/aaai.v37i9.26334.
Chen, Wengu and Huanmin Ge. "A Sharp Bound on RIC in Generalized Orthogonal Matching Pursuit". Canadian Mathematical Bulletin 61, no. 1 (March 1, 2018): 40–54. http://dx.doi.org/10.4153/cmb-2017-009-6.
Ferber, Asaf, Gweneth McKinley and Wojciech Samotij. "Supersaturated Sparse Graphs and Hypergraphs". International Mathematics Research Notices 2020, no. 2 (March 8, 2018): 378–402. http://dx.doi.org/10.1093/imrn/rny030.
Feldheim, Ohad N. and Michael Krivelevich. "Winning Fast in Sparse Graph Construction Games". Combinatorics, Probability and Computing 17, no. 6 (November 2008): 781–91. http://dx.doi.org/10.1017/s0963548308009401.
Chen, Qiushi, Xin Zhang, Qiang Yang, Lei Ye and Mengxiao Zhao. "Performance Bound for Joint Multiple Parameter Target Estimation in Sparse Stepped-Frequency Radar: A Comparison Analysis". Sensors 19, no. 9 (April 29, 2019): 2002. http://dx.doi.org/10.3390/s19092002.
Sapir, Shachar and Asaf Shapira. "The Induced Removal Lemma in Sparse Graphs". Combinatorics, Probability and Computing 29, no. 1 (September 30, 2019): 153–62. http://dx.doi.org/10.1017/s0963548319000233.
Deng, Hao, Jianghong Chen, Biqin Song and Zhibin Pan. "Error Bound of Mode-Based Additive Models". Entropy 23, no. 6 (May 22, 2021): 651. http://dx.doi.org/10.3390/e23060651.
Milani Fard, Mahdi, Yuri Grinberg, Joelle Pineau and Doina Precup. "Compressed Least-Squares Regression on Sparse Spaces". Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (September 20, 2021): 1054–60. http://dx.doi.org/10.1609/aaai.v26i1.8303.
Theses on the topic "Sparse bound"
Shabara, Yahia. "Establishing Large-Scale MIMO Communication: Coding for Channel Estimation". The Ohio State University, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=osu1618578732285999.
Texto completoPrice, Eric (Eric C. ). "Algorithms and lower bounds for sparse recovery". Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/62668.
Texto completoCataloged from PDF version of thesis.
Includes bibliographical references (p. 69-71).
We consider the following k-sparse recovery problem: design a distribution of m × n matrices A such that for any signal x, given Ax, with high probability we can efficiently recover an x̂ satisfying ‖x - x̂‖₁ ≤ C min_{k-sparse x'} ‖x - x'‖₁. It is known that there exist such distributions with m = O(k log(n/k)) rows; in this thesis, we show that this bound is tight. We also introduce the set query algorithm, a primitive useful for solving special cases of sparse recovery using fewer than Θ(k log(n/k)) rows. The set query algorithm estimates the values of a vector x ∈ ℝⁿ over a support S of size k from a randomized sparse binary linear sketch Ax of size O(k). Given Ax and S, we can recover an x' with ‖x' - x_S‖₂ ≤ θ‖x - x_S‖₂ with probability at least 1 - k^{-Ω(1)}. The recovery takes O(k) time. While interesting in its own right, this primitive also has a number of applications. For example, we can: (1) improve the sparse recovery of Zipfian distributions with O(k log n) measurements from a 1 + ε approximation to a 1 + o(1) approximation, giving the first such approximation when k ≤ O(n^{1-ε}); and (2) recover block-sparse vectors with O(k) space and a 1 + ε approximation, where previous algorithms required either ω(k) space or ω(1) approximation.
by Eric Price.
M.Eng.
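As a rough illustration of the set-query idea in the abstract above (a hypothetical toy construction, not the algorithm analyzed in the thesis), one can hash each coordinate into one of O(k) signed buckets and read estimates back off the known support S:

```python
import random

random.seed(7)
n, m = 1000, 64                      # ambient dimension; sketch size, O(k) buckets

# One sparse binary row per coordinate: a bucket index and a random sign.
bucket = [random.randrange(m) for _ in range(n)]
sign = [random.choice((-1, 1)) for _ in range(n)]

def sketch(x):
    """Apply the implicit m x n signed binary matrix A to a sparse x = {index: value}."""
    y = [0.0] * m
    for i, v in x.items():           # exploits sparsity: O(|support|) work
        y[bucket[i]] += sign[i] * v
    return y

def set_query(y, S):
    """Estimate x over a known support S from the O(k)-size sketch y, in O(|S|) time."""
    return {i: sign[i] * y[bucket[i]] for i in S}

x = {3: 5.0, 17: -2.0, 42: 1.5}      # a 3-sparse signal in R^n
est = set_query(sketch(x), set(x))
```

When the support coordinates happen to land in distinct buckets, which is likely once m is comfortably larger than k, the read-off is exact; hash collisions account for the error term in the guarantee above.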
Do, Ba Khanh. "Algorithms and lower bounds in the streaming and sparse recovery models". Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/75629.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 52-56).
In the data stream computation model, input data is given to us sequentially (the data stream), and our goal is to compute or approximate some function or statistic of that data using an amount of space sublinear in both the length of the stream and the size of the universe of items that can appear in the stream; in particular, we can store neither the entire stream nor a counter for each possible item we might see. In the sparse recovery model (also known as compressed sensing), input data is a large but sparse vector x ∈ ℝⁿ, and our goal is to design an m × n matrix Φ, where m ≪ n, such that for any sufficiently sparse x we can efficiently recover a good approximation of x from Φx. Although at first glance these two models may seem quite different, they are in fact intimately related. In the streaming model, most statistics of interest are order-invariant, meaning they care only about the frequency of each item in the stream and not its position. For these problems, the data in the stream can be viewed as an n-dimensional vector x, where x_i is the number of occurrences of item i. Using this representation, one of the high-level tools that has proven most popular has been the linear sketch, where for some m × n matrix Φ we maintain Φx (the sketch) for the partial vector x as we progress along the stream. The linearity of the mapping Φ allows us to efficiently do incremental updates on our sketch, and as in its use in sparse recovery, the linear sketch turns out to be surprisingly powerful. In this thesis, we try to answer some questions of interest in each model, illustrating both the power and the limitations of the linear sketch. In Chapter 2, we provide an efficient sketch for estimating the (planar) Earth-Mover Distance (EMD) between two multisets of points.
The EMD between point sets A, B ⊂ ℝ² of the same size is defined as the minimum cost of a perfect matching between them, with each edge contributing a cost equal to its (Euclidean) length. As immediate consequences, we give an improved algorithm for estimating EMD between point sets given over a stream, and an improved algorithm for the approximate nearest neighbor problem under EMD. In Chapter 3, we prove tight lower bounds for sparse recovery in the number of rows of the matrix Φ (i.e., the number of measurements) needed to achieve any of the three most studied recovery guarantees. Specifically, consider a matrix Φ and an algorithm R such that for any signal x, R can recover an approximation x̂ from Φx satisfying ... where (1) p = q = 1 and C = O(1), (2) p = q = 2 and C = O(1), or (3) p = 2, q = 1 and C = O(k^{-1/2}). We show that any such Φ must have at least Ω(k log(n/k)) rows. This is known to be optimal in cases (1) and (2), and near-optimal for (3). In Chapter 4, we propose a variant of sparse recovery that incorporates some additional knowledge about the signal and allows the above lower bound to be broken. In particular, we consider the scenario where, after measurements are taken, we are given a set S of size s < n (s is known beforehand) that is supposed to contain most of the "large" coefficients of x. The goal is then to recover an x̂ satisfying ... We refer to this formulation as the sparse recovery with partial support knowledge problem (SRPSK). We focus on the guarantees where p = q = 1 or 2 and C = 1 + ε, for which we provide lower bounds as well as a method of converting algorithms for "standard" sparse recovery into ones for SRPSK. We also make use of one of the reductions to give an optimal algorithm for SRPSK for the guarantee where p = q = 2.
by Khanh Do Ba.
Ph.D.
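The incremental-update property of linear sketches highlighted in the abstract above can be sketched with a minimal Count-Min-style frequency summary (an illustrative toy, not a construction from the thesis): each stream item touches only d counters, and the final table depends only on the frequency vector x, not on the arrival order.

```python
import random

random.seed(1)
d, w = 4, 32                                   # hash rows and counters per row
P = 2147483647                                 # a large prime for the hash family
salts = [(random.randrange(1, P), random.randrange(P)) for _ in range(d)]
table = [[0] * w for _ in range(d)]

def h(r, item):
    a, b = salts[r]
    return ((a * item + b) % P) % w

def update(item, delta=1):
    """One streaming update: O(d) work, independent of stream length."""
    for r in range(d):
        table[r][h(r, item)] += delta

def estimate(item):
    """Minimum over rows; hash collisions can only inflate a counter, never shrink it."""
    return min(table[r][h(r, item)] for r in range(d))

for item in [7, 7, 3, 9, 7, 3]:                # process the stream one item at a time
    update(item)
```

Because each update is a linear operation on the table, the same code handles deletions (negative deltas) and merging of sketches computed on separate substreams.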
Brennan, Matthew (Matthew Stewart). "Reducibility and computational lower bounds for problems with planted sparse structure". Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/118062.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 145-155).
Recently, research in unsupervised learning has gravitated towards exploring statistical-computational gaps induced by sparsity. A line of work initiated by Berthet and Rigollet has aimed to explain these gaps through reductions to conjecturally hard problems from complexity theory. However, the delicate nature of average-case reductions has limited the development of techniques and often led to weaker hardness results that only apply to algorithms that are robust to different noise distributions or that do not need to know the parameters of the problem. We introduce several new techniques to give a web of average-case reductions showing strong computational lower bounds based on the planted clique conjecture for planted independent set, planted dense subgraph, biclustering, sparse rank-1 submatrix, sparse PCA and the subgraph stochastic block model. Our results demonstrate that, despite the delicate nature of average-case reductions, using natural problems as intermediates can often be beneficial, as is the case in worst-case complexity. Our main technical contribution is to introduce a set of techniques for average-case reductions that: (1) maintain the level of signal in an instance of a problem; (2) alter its planted structure; and (3) map two initial high-dimensional distributions simultaneously to two target distributions approximately under total variation. We also give algorithms matching our lower bounds and identify the information-theoretic limits of the models we consider.
by Matthew Brennan.
S.M. in Computer Science and Engineering
Gasiorowski, Pawel. "Individual and group dynamic behaviour patterns in bound spaces". Thesis, London Metropolitan University, 2017. http://repository.londonmet.ac.uk/1447/.
Seeger, Matthias. "Bayesian Gaussian process models : PAC-Bayesian generalisation error bounds and sparse approximations". Thesis, University of Edinburgh, 2003. http://hdl.handle.net/1842/321.
Giulini, Ilaria. "Generalization bounds for random samples in Hilbert spaces". Thesis, Paris, Ecole normale supérieure, 2015. http://www.theses.fr/2015ENSU0026/document.
This thesis focuses on obtaining generalization bounds for random samples in reproducing kernel Hilbert spaces. The approach consists in first obtaining non-asymptotic dimension-free bounds in finite-dimensional spaces using some PAC-Bayesian inequalities related to Gaussian perturbations and then in generalizing the results to a separable Hilbert space. We first investigate the question of estimating the Gram operator by a robust estimator from an i.i.d. sample, and we present uniform bounds that hold under weak moment assumptions. These results allow us to analyze principal component analysis independently of the dimension of the ambient space and to propose stable versions of it. In the last part of the thesis we present a new algorithm for spectral clustering. It consists in replacing the projection on the eigenvectors associated with the largest eigenvalues of the Laplacian matrix by a power of the normalized Laplacian. This iteration, justified by the analysis of clustering in terms of Markov chains, performs a smooth truncation. We prove non-asymptotic bounds for the convergence of our spectral clustering algorithm applied to a random sample of points in a Hilbert space, deduced from the bounds for the Gram operator in a Hilbert space. Experiments are done in the context of image analysis.
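The Markov-chain reading of spectral clustering mentioned above can be illustrated on a toy graph (a simplified sketch, not the thesis's algorithm): repeated application of the random-walk matrix suppresses the non-dominant spectral components, so probability mass spreads uniformly within a cluster and never leaks across disconnected clusters.

```python
# Two disconnected triangles: nodes 0-2 form one clique, nodes 3-5 the other.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4, 5], 4: [3, 5], 5: [3, 4]}
n = 6

def walk_step(v):
    """One step of the random walk P = A / 2 on this 2-regular graph (P is symmetric here)."""
    return [sum(v[j] for j in adj[i]) / 2.0 for i in range(n)]

v = [1.0, 0, 0, 0, 0, 0]             # start all mass on node 0
for _ in range(30):                  # a high power of P: a smooth spectral truncation
    v = walk_step(v)
# v is now (almost) uniform on the cluster {0, 1, 2} and exactly zero elsewhere
```

Within each triangle the walk's eigenvalues are 1 and -1/2, so after t steps the deviation from the uniform-in-cluster distribution shrinks like (1/2)^t, which is the "smooth truncation" effect of taking a power of the operator.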
Cherief-Abdellatif, Badr-Eddine. "Contributions to the theoretical study of variational inference and robustness". Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAG001.
This PhD thesis deals with variational inference and robustness. More precisely, it focuses on the statistical properties of variational approximations and the design of efficient algorithms for computing them in an online fashion, and investigates Maximum Mean Discrepancy based estimators as learning rules that are robust to model misspecification. In recent years, variational inference has been extensively studied from the computational viewpoint, but until very recently little attention was paid in the literature to the theoretical properties of variational approximations. In this thesis, we investigate the consistency of variational approximations in various statistical models and the conditions that ensure it. In particular, we tackle the special cases of mixture models and deep neural networks. We also justify in theory the use of the ELBO maximization strategy, a model selection criterion that is widely used in the Variational Bayes community and is known to work well in practice. Moreover, Bayesian inference provides an attractive online-learning framework to analyze sequential data, and offers generalization guarantees which hold even under model mismatch and with adversaries. Unfortunately, exact Bayesian inference is rarely feasible in practice and approximation methods are usually employed, but do such methods preserve the generalization properties of Bayesian inference? In this thesis, we show that this is indeed the case for some variational inference algorithms. We propose new online, tempered variational algorithms and derive their generalization bounds. Our theoretical result relies on the convexity of the variational objective, but we argue that our result should hold more generally and present empirical evidence in support of this. Our work presents theoretical justifications in favor of online algorithms that rely on approximate Bayesian methods.
Another point addressed in this thesis is the design of a universal estimation procedure. This question is of major interest, in particular because it leads to robust estimators, a very active topic in statistics and machine learning. We tackle the problem of universal estimation using a minimum distance estimator based on the Maximum Mean Discrepancy. We show that the estimator is robust both to dependence and to the presence of outliers in the dataset. We also highlight the connections that may exist with minimum distance estimators using the L2-distance. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, and we support our findings with numerical simulations. We also propose a Bayesian version of our estimator, which we study from both theoretical and computational points of view.
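For finite samples, the Maximum Mean Discrepancy behind such minimum-distance estimators can be computed in closed form (a generic illustration with a Gaussian kernel; the kernel, bandwidth, and data below are arbitrary choices, not the thesis's): MMD² is the squared distance between the kernel mean embeddings of the two samples.

```python
import math

def gaussian_kernel(u, v, bandwidth=1.0):
    """A standard Gaussian (RBF) kernel on scalars."""
    return math.exp(-((u - v) ** 2) / (2.0 * bandwidth ** 2))

def mmd_squared(xs, ys, k=gaussian_kernel):
    """Biased (V-statistic) estimate of MMD^2; non-negative by construction."""
    kxx = sum(k(a, b) for a in xs for b in xs) / len(xs) ** 2
    kyy = sum(k(a, b) for a in ys for b in ys) / len(ys) ** 2
    kxy = sum(k(a, b) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2.0 * kxy

xs = [0.1, -0.3, 0.2, 0.0]           # a sample concentrated near 0
ys = [5.0, 5.2, 4.9, 5.1]            # a clearly shifted sample
```

A minimum-MMD estimator then picks the model parameters whose simulated sample minimizes this quantity against the observed data; outliers move the mean embedding only boundedly, which is the source of the robustness discussed above.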
Lashkaripour, Rahmatollah. "Lower bounds and norms of operators on Lorentz sequence spaces". Thesis, Lancaster University, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.364315.
Rammea, Lisema. "Computations and bounds for surfaces in weighted projective four-spaces". Thesis, University of Bath, 2009. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.507235.
Books on the topic "Sparse bound"
Turtledove, Harry. Homeward Bound. New York: Del Rey/Ballantine Books, 2004.
Turtledove, Harry. Homeward Bound. New York: Random House Publishing Group, 2004.
Turtledove, Harry. Homeward bound. New York: Ballantine Books, 2005.
Nikova, Svetla Iordanova. Bounds for designs in infinite polynomial metric spaces. [Eindhoven]: University Press Facilities, Eindhoven University of Technology, 1998.
Schneck, Arne. Bounds for optimization of the reflection coefficient by constrained optimization in Hardy spaces. Karlsruhe: Univ.-Verl. Karlsruhe, 2009.
Public Art Development Trust (London, England), ed. The Thames archive project: Henry Bond, Angela Bulloch. London: Public Art Development Trust, 2000.
United States. Superintendent of Documents. Explorations in space: A collection of space-related publications to interest earth-bound astronauts of all ages. Washington, DC: Supt. of Docs., 1985.
United States. National Aeronautics and Space Administration, ed. Development of a nondestructive vibration technique for bond assessment of space shuttle tiles: Final report. Orlando, Fla.: Dept. of Mechanical and Aerospace Engineering, University of Central Florida, 1994.
United States. Environmental Protection Agency. Office of Transportation and Air Quality. Bond requirements for nonroad spark-ignition engines: Frequently asked questions. [Washington, D.C.]: U.S. Environmental Protection Agency, Office of Transportation and Air Quality, 2009.
Copyright Paperback Collection (Library of Congress), ed. Honor Bound: Day of Honor Book 6: Star Trek: Deep Space Nine #11. New York: Pocket Books, 1997.
Book chapters on the topic "Sparse bound"
Schmitt, Michael. "An Improved VC Dimension Bound for Sparse Polynomials". In Learning Theory, 393–407. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-27819-1_27.
Liu, Pan and Fenglei Wang. "Sparse Acceleration Algorithm Based on Likelihood Upper Bound". In Advances in Intelligent Systems and Computing, 171–79. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-34387-3_21.
Chen, Xi, Tim Randolph, Rocco A. Servedio and Timothy Sun. "A Lower Bound on Cycle-Finding in Sparse Digraphs". In Proceedings of the Fourteenth Annual ACM-SIAM Symposium on Discrete Algorithms, 2936–52. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2020. http://dx.doi.org/10.1137/1.9781611975994.178.
Maurer, Andreas, Massimiliano Pontil and Luca Baldassarre. "Lower Bounds for Sparse Coding". In Measures of Complexity, 359–70. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-21852-6_24.
de Woot, Philippe. "New Spaces: Globalization of the Market Economy". In Should Prometheus Be Bound?, 25–37. London: Palgrave Macmillan UK, 2005. http://dx.doi.org/10.1057/9780230502062_3.
van de Geer, Sara. "Lower Bounds for Sparse Quadratic Forms". In Lecture Notes in Mathematics, 223–31. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-32774-7_15.
Sottile, Frank. "Lower bounds for sparse polynomial systems". In Real Solutions to Equations from Geometry, 77–90. Providence, Rhode Island: American Mathematical Society, 2011. http://dx.doi.org/10.1090/ulect/057/07.
Waghid, Yusef. "Pedagogy Within Rhizomatic Spaces". In Pedagogy Out of Bounds, 65–70. Rotterdam: SensePublishers, 2014. http://dx.doi.org/10.1007/978-94-6209-616-5_7.
Martín Pendás, Ángel and Julia Contreras-García. "Topological Spaces". In Topological Approaches to the Chemical Bond, 9–28. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-13666-5_2.
Krause, Ben and Michael T. Lacey. "Sparse Bounds for Random Discrete Carleson Theorems". In 50 Years with Hardy Spaces, 317–32. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-59078-3_16.
Conference papers on the topic "Sparse bound"
Guitton, A. "Sparse Radon Transforms with Bound-Constrained Optimization". In 67th EAGE Conference & Exhibition. European Association of Geoscientists & Engineers, 2005. http://dx.doi.org/10.3997/2214-4609-pdb.1.a025.
Pajovic, Milutin. "Misspecified Bayesian Cramér-Rao Bound for Sparse Bayesian". In 2018 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2018. http://dx.doi.org/10.1109/ssp.2018.8450780.
Florescu, Anisia, Emilie Chouzenoux, Jean-Christophe Pesquet and Silviu Ciochina. "Cramer-Rao bound for a sparse complex model". In 2014 10th International Conference on Communications (COMM). IEEE, 2014. http://dx.doi.org/10.1109/iccomm.2014.6866673.
Saidi, Pouria, George Atia and Azadeh Vosoughi. "Improved Block-Sparse Recovery Bound Using Cumulative Block Coherence". In 2020 54th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2020. http://dx.doi.org/10.1109/ieeeconf51394.2020.9443546.
Friedlander, B. "On the Cramer-Rao Bound for Sparse Linear Arrays". In 2020 54th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2020. http://dx.doi.org/10.1109/ieeeconf51394.2020.9443292.
Montanari, Andrea and Elchanan Mossel. "Smooth compression, Gallager bound and nonlinear sparse-graph codes". In 2008 IEEE International Symposium on Information Theory - ISIT. IEEE, 2008. http://dx.doi.org/10.1109/isit.2008.4595436.
Wang, Di and Jinhui Xu. "Lower Bound of Locally Differentially Private Sparse Covariance Matrix Estimation". In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/665.
Hashemi, Abolfazl and Haris Vikalo. "Recovery of sparse signals via Branch and Bound Least-Squares". In 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2017. http://dx.doi.org/10.1109/icassp.2017.7953060.
Khan, Diba and Kristine L. Bell. "Explicit Ziv-Zakai bound for DOA estimation with sparse linear arrays". In 2009 3rd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP). IEEE, 2009. http://dx.doi.org/10.1109/camsap.2009.5413287.
Koochakzadeh, Ali and Piya Pal. "On saturation of the Cramér Rao Bound for Sparse Bayesian Learning". In 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2017. http://dx.doi.org/10.1109/icassp.2017.7952723.
Reports on the topic "Sparse bound"
Clark, Donald L., Stefan M. Kirby and Charles G. Oviatt. Geologic Map of the Rush Valley 30' X 60' Quadrangle, Tooele, Utah, and Salt Lake Counties, Utah. Utah Geological Survey, August 2023. http://dx.doi.org/10.34191/m-294dm.
Mezzacappa, Anthony, Eirik Endeve, Cory D. Hauck and Yulong Xing. Bound-Preserving Discontinuous Galerkin Methods for Conservative Phase Space Advection in Curvilinear Coordinates. Office of Scientific and Technical Information (OSTI), February 2015. http://dx.doi.org/10.2172/1394128.
Jacobs, Timothy and Jacob Hedrick. PR-457-14201-R03 Variable NG Composition Effects in LB 2S Compressor Engines - Prediction Enhancement. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), August 2017. http://dx.doi.org/10.55274/r0011406.
Oliynyk, Kateryna and Matteo Ciantia. Application of a finite deformation multiplicative plasticity model with non-local hardening to the simulation of CPTu tests in a structured soil. University of Dundee, December 2021. http://dx.doi.org/10.20933/100001230.
Bano, Masooda and Zeena Oberoi. Embedding Innovation in State Systems: Lessons from Pratham in India. Research on Improving Systems of Education (RISE), December 2020. http://dx.doi.org/10.35489/bsg-rise-wp_2020/058.
Litaor, Iggy, James Ippolito, Iris Zohar and Michael Massey. Phosphorus capture recycling and utilization for sustainable agriculture using Al/organic composite water treatment residuals. United States Department of Agriculture, January 2015. http://dx.doi.org/10.32747/2015.7600037.bard.