Academic literature on the topic 'Sparse bound'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Sparse bound.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Journal articles on the topic "Sparse bound"
Lorist, Emiel, and Zoe Nieraeth. "Sparse domination implies vector-valued sparse domination." Mathematische Zeitschrift 301, no. 1 (January 12, 2022): 1–35. http://dx.doi.org/10.1007/s00209-021-02943-z.
Shparlinski, Igor E., and José Felipe Voloch. "Value Sets of Sparse Polynomials." Canadian Mathematical Bulletin 63, no. 1 (September 24, 2019): 187–96. http://dx.doi.org/10.4153/s0008439519000316.
Zhang, Rui, Rui Xin, Margo Seltzer, and Cynthia Rudin. "Optimal Sparse Regression Trees." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (June 26, 2023): 11270–79. http://dx.doi.org/10.1609/aaai.v37i9.26334.
Chen, Wengu, and Huanmin Ge. "A Sharp Bound on RIC in Generalized Orthogonal Matching Pursuit." Canadian Mathematical Bulletin 61, no. 1 (March 1, 2018): 40–54. http://dx.doi.org/10.4153/cmb-2017-009-6.
Ferber, Asaf, Gweneth McKinley, and Wojciech Samotij. "Supersaturated Sparse Graphs and Hypergraphs." International Mathematics Research Notices 2020, no. 2 (March 8, 2018): 378–402. http://dx.doi.org/10.1093/imrn/rny030.
Feldheim, Ohad N., and Michael Krivelevich. "Winning Fast in Sparse Graph Construction Games." Combinatorics, Probability and Computing 17, no. 6 (November 2008): 781–91. http://dx.doi.org/10.1017/s0963548308009401.
Chen, Qiushi, Xin Zhang, Qiang Yang, Lei Ye, and Mengxiao Zhao. "Performance Bound for Joint Multiple Parameter Target Estimation in Sparse Stepped-Frequency Radar: A Comparison Analysis." Sensors 19, no. 9 (April 29, 2019): 2002. http://dx.doi.org/10.3390/s19092002.
Sapir, Shachar, and Asaf Shapira. "The Induced Removal Lemma in Sparse Graphs." Combinatorics, Probability and Computing 29, no. 1 (September 30, 2019): 153–62. http://dx.doi.org/10.1017/s0963548319000233.
Deng, Hao, Jianghong Chen, Biqin Song, and Zhibin Pan. "Error Bound of Mode-Based Additive Models." Entropy 23, no. 6 (May 22, 2021): 651. http://dx.doi.org/10.3390/e23060651.
Milani Fard, Mahdi, Yuri Grinberg, Joelle Pineau, and Doina Precup. "Compressed Least-Squares Regression on Sparse Spaces." Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (September 20, 2021): 1054–60. http://dx.doi.org/10.1609/aaai.v26i1.8303.
Dissertations / Theses on the topic "Sparse bound"
Shabara, Yahia. "Establishing Large-Scale MIMO Communication: Coding for Channel Estimation." The Ohio State University, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=osu1618578732285999.
Price, Eric (Eric C.). "Algorithms and lower bounds for sparse recovery." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/62668.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 69-71).
We consider the following k-sparse recovery problem: design a distribution of m × n matrices A such that, for any signal x, given Ax we can with high probability efficiently recover an x̂ satisfying ||x - x̂||_1 <= C · min_{k-sparse x'} ||x - x'||_1. It is known that such distributions exist with m = O(k log(n/k)) rows; in this thesis, we show that this bound is tight. We also introduce the set query algorithm, a primitive useful for solving special cases of sparse recovery using fewer than Θ(k log(n/k)) rows. The set query algorithm estimates the values of a vector x ∈ R^n over a support S of size k from a randomized sparse binary linear sketch Ax of size O(k). Given Ax and S, we can recover x' with ||x' - x_S||_2 <= θ ||x - x_S||_2 with probability at least 1 - k^(-Ω(1)). The recovery takes O(k) time. While interesting in its own right, this primitive also has a number of applications. For example, we can: (1) improve the sparse recovery of Zipfian distributions with O(k log n) measurements from a 1 + ε approximation to a 1 + o(1) approximation, giving the first such approximation when k = O(n^(1-ε)); and (2) recover block-sparse vectors with O(k) space and a 1 + ε approximation, where previous algorithms required either ω(k) space or ω(1) approximation.
by Eric Price.
M.Eng.
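The abstract above centers on the measurement bound m = Θ(k log(n/k)). As a toy illustration only (not code from the thesis), the following numpy sketch recovers a k-sparse signal from m = O(k log(n/k)) Gaussian measurements via orthogonal matching pursuit; the dimensions, random seed, and the choice of OMP as the decoder are all assumptions for the example.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x from y = A x."""
    n = A.shape[1]
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit restricted to the selected support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, k = 256, 4
m = int(6 * k * np.log(n / k))            # m = O(k log(n/k)) measurements
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
x_hat = omp(A, A @ x, k)
print(np.allclose(x, x_hat, atol=1e-8))
```

With random Gaussian measurements and this many rows, exact recovery of the planted support succeeds with high probability, which is the regime the thesis proves is best possible up to constants.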
Do, Ba Khanh. "Algorithms and lower bounds in the streaming and sparse recovery models." Thesis, Massachusetts Institute of Technology, 2012. http://hdl.handle.net/1721.1/75629.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 52-56).
In the data stream computation model, input data is given to us sequentially (the data stream), and our goal is to compute or approximate some function or statistic on that data using a sublinear (in both the length of the stream and the size of the universe of items that can appear in the stream) amount of space; in particular, we can store neither the entire stream nor a counter for each possible item we might see. In the sparse recovery model (also known as compressed sensing), input data is a large but sparse vector x ∈ R^n, and our goal is to design an m × n matrix Φ, where m << n, such that for any sufficiently sparse x we can efficiently recover a good approximation of x from Φx. Although at first glance these two models may seem quite different, they are in fact intimately related. In the streaming model, most statistics of interest are order-invariant, meaning they care only about the frequency of each item in the stream and not their position. For these problems, the data in the stream can be viewed as an n-dimensional vector x, where x_i is the number of occurrences of item i. Using this representation, one of the high-level tools that have proven most popular has been the linear sketch, where for some m × n matrix Φ we maintain Φx (the sketch) for the partial vector x as we progress along the stream. The linearity of the mapping Φ allows us to efficiently do incremental updates on our sketch, and as in its use in sparse recovery, the linear sketch turns out to be surprisingly powerful. In this thesis, we try to answer some questions of interest in each model, illustrating both the power and the limitations of the linear sketch. In Chapter 2, we provide an efficient sketch for estimating the (planar) Earth-Mover Distance (EMD) between two multisets of points.
The EMD between point sets A, B ⊂ R^2 of the same size is defined as the minimum cost of a perfect matching between them, with each edge contributing a cost equal to its (Euclidean) length. As immediate consequences, we give an improved algorithm for estimating EMD between point sets given over a stream, and an improved algorithm for the approximate nearest neighbor problem under EMD. In Chapter 3, we prove tight lower bounds for sparse recovery in terms of the number of rows of the matrix Φ (i.e., the number of measurements) needed to achieve any of the three most studied recovery guarantees. Specifically, consider a matrix Φ and an algorithm R such that for any signal x, R can recover an approximation x̂ from Φx satisfying ..., where (1) p = q = 1 and C = O(1), (2) p = q = 2 and C = O(1), or (3) p = 2, q = 1 and C = O(k^(-1/2)). We show that any such Φ must have at least Ω(k log(n/k)) rows. This is known to be optimal in cases (1) and (2), and near optimal for (3). In Chapter 4, we propose a variant of sparse recovery that incorporates some additional knowledge about the signal and allows the above lower bound to be broken. In particular, we consider the scenario where, after measurements are taken, we are given a set S of size s < n (s is known beforehand) that is supposed to contain most of the "large" coefficients of x. The goal is then to recover x̂ satisfying ... We refer to this formulation as the sparse recovery with partial support knowledge problem (SRPSK). We focus on the guarantees where p = q = 1 or 2 and C = 1 + ε, for which we provide lower bounds as well as a method of converting algorithms for "standard" sparse recovery into ones for SRPSK. We also make use of one of the reductions to give an optimal algorithm for SRPSK for the guarantee where p = q = 2.
by Khanh Do Ba.
Ph.D.
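The abstract above leans on the linearity of the sketch Φx under streaming updates. A minimal numpy sketch of that idea (the ±1 sketch matrix, dimensions, and toy stream are assumptions for illustration, not the thesis's construction):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 1000, 32
Phi = rng.choice([-1.0, 1.0], size=(m, n))   # random sign sketch matrix

# Fold each (item, count-delta) update into the sketch with O(m) work,
# never materializing the full frequency vector x.
sketch = np.zeros(m)
stream = [(7, 3), (42, 1), (7, -1), (999, 5)]
for item, delta in stream:
    sketch += delta * Phi[:, item]

# By linearity, this equals sketching the final frequency vector directly.
x = np.zeros(n)
for item, delta in stream:
    x[item] += delta
print(np.allclose(sketch, Phi @ x))
```

This is exactly why linear sketches handle turnstile streams (including deletions): updating Φx item by item and computing Φx on the final vector give the same result.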
Brennan, Matthew (Matthew Stewart). "Reducibility and computational lower bounds for problems with planted sparse structure." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/118062.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 145-155).
Recently, research in unsupervised learning has gravitated towards exploring statistical-computational gaps induced by sparsity. A line of work initiated by Berthet and Rigollet has aimed to explain these gaps through reductions to conjecturally hard problems from complexity theory. However, the delicate nature of average-case reductions has limited the development of techniques and often led to weaker hardness results that only apply to algorithms that are robust to different noise distributions or that do not need to know the parameters of the problem. We introduce several new techniques to give a web of average-case reductions showing strong computational lower bounds based on the planted clique conjecture for planted independent set, planted dense subgraph, biclustering, sparse rank-1 submatrix, sparse PCA and the subgraph stochastic block model. Our results demonstrate that, despite the delicate nature of average-case reductions, using natural problems as intermediates can often be beneficial, as is the case in worst-case complexity. Our main technical contribution is to introduce a set of techniques for average-case reductions that: (1) maintain the level of signal in an instance of a problem; (2) alter its planted structure; and (3) map two initial high-dimensional distributions simultaneously to two target distributions approximately under total variation. We also give algorithms matching our lower bounds and identify the information-theoretic limits of the models we consider.
by Matthew Brennan.
S.M. in Computer Science and Engineering
Gasiorowski, Pawel. "Individual and group dynamic behaviour patterns in bound spaces." Thesis, London Metropolitan University, 2017. http://repository.londonmet.ac.uk/1447/.
Seeger, Matthias. "Bayesian Gaussian process models : PAC-Bayesian generalisation error bounds and sparse approximations." Thesis, University of Edinburgh, 2003. http://hdl.handle.net/1842/321.
Giulini, Ilaria. "Generalization bounds for random samples in Hilbert spaces." Thesis, Paris, Ecole normale supérieure, 2015. http://www.theses.fr/2015ENSU0026/document.
This thesis focuses on obtaining generalization bounds for random samples in reproducing kernel Hilbert spaces. The approach consists in first obtaining non-asymptotic, dimension-free bounds in finite-dimensional spaces using some PAC-Bayesian inequalities related to Gaussian perturbations, and then in generalizing the results to a separable Hilbert space. We first investigate the question of estimating the Gram operator by a robust estimator from an i.i.d. sample, and we present uniform bounds that hold under weak moment assumptions. These results allow us to study principal component analysis independently of the dimension of the ambient space and to propose stable versions of it. In the last part of the thesis we present a new algorithm for spectral clustering. It consists in replacing the projection on the eigenvectors associated with the largest eigenvalues of the Laplacian matrix by a power of the normalized Laplacian. This iteration, justified by the analysis of clustering in terms of Markov chains, performs a smooth truncation. We prove non-asymptotic bounds for the convergence of our spectral clustering algorithm applied to a random sample of points in a Hilbert space, deduced from the bounds for the Gram operator in a Hilbert space. Experiments are done in the context of image analysis.
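The spectral clustering variant described in this abstract replaces the hard projection onto top eigenvectors with a power of a normalized operator. A toy numpy illustration of that idea on two synthetic clusters (the data, Gaussian kernel bandwidth, and exponent are assumptions for the example, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(2)
# Two well-separated synthetic clusters in the plane.
pts = np.vstack([rng.normal(0.0, 0.2, (20, 2)), rng.normal(5.0, 0.2, (20, 2))])

# Gaussian-kernel affinity matrix and symmetrically normalized adjacency.
sq = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
W = np.exp(-sq / 2.0)
d = W.sum(axis=1)
M = W / np.sqrt(np.outer(d, d))

# Taking a power of the normalized operator damps the small eigenvalues
# smoothly, standing in for the hard projection onto the top eigenvectors.
P = np.linalg.matrix_power(M, 15)

# Rows become near-identical within a cluster; threshold one column to label.
labels = (P[:, 0] > P[:, 0].mean()).astype(int)
```

On this well-separated toy data the powered operator is nearly block-constant, so a simple threshold recovers the two groups; the thesis's contribution is the non-asymptotic analysis of this smoothing in a Hilbert space, not this heuristic.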
Cherief-Abdellatif, Badr-Eddine. "Contributions to the theoretical study of variational inference and robustness." Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAG001.
This PhD thesis deals with variational inference and robustness. More precisely, it focuses on the statistical properties of variational approximations and the design of efficient algorithms for computing them in an online fashion, and it investigates Maximum Mean Discrepancy based estimators as learning rules that are robust to model misspecification. In recent years, variational inference has been extensively studied from the computational viewpoint, but until very recently little attention had been paid in the literature to the theoretical properties of variational approximations. In this thesis, we investigate the consistency of variational approximations in various statistical models and the conditions that ensure it. In particular, we tackle the special cases of mixture models and deep neural networks. We also justify in theory the use of the ELBO maximization strategy, a model selection criterion that is widely used in the Variational Bayes community and is known to work well in practice. Moreover, Bayesian inference provides an attractive online-learning framework for analyzing sequential data, and offers generalization guarantees which hold even under model mismatch and with adversaries. Unfortunately, exact Bayesian inference is rarely feasible in practice and approximation methods are usually employed, but do such methods preserve the generalization properties of Bayesian inference? In this thesis, we show that this is indeed the case for some variational inference algorithms. We propose new online, tempered variational algorithms and derive their generalization bounds. Our theoretical result relies on the convexity of the variational objective, but we argue that it should hold more generally and present empirical evidence in support of this. Our work presents theoretical justifications in favor of online algorithms that rely on approximate Bayesian methods.
Another point addressed in this thesis is the design of a universal estimation procedure. This question is of major interest, in particular because it leads to robust estimators, a very active topic in statistics and machine learning. We tackle the problem of universal estimation using a minimum distance estimator based on the Maximum Mean Discrepancy. We show that the estimator is robust both to dependence and to the presence of outliers in the dataset. We also highlight the connections that may exist with minimum distance estimators using the L2-distance. Finally, we provide a theoretical study of the stochastic gradient descent algorithm used to compute the estimator, and we support our findings with numerical simulations. We also propose a Bayesian version of our estimator, which we study from both theoretical and computational points of view.
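The Maximum Mean Discrepancy at the heart of the estimator described above is easy to estimate from samples. A minimal illustration (the Gaussian kernel, bandwidth, and toy data are assumptions; this is the plain V-statistic estimate, not the thesis's estimator):

```python
import numpy as np

def mmd2(x, y, bw=1.0):
    """Biased (V-statistic) estimate of squared MMD with a Gaussian kernel."""
    k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * bw**2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

rng = np.random.default_rng(3)
clean = rng.normal(0.0, 1.0, 500)
same = rng.normal(0.0, 1.0, 500)
shifted = rng.normal(3.0, 1.0, 500)

# MMD separates distributions: near zero for matched samples, large under a shift.
print(mmd2(clean, same), mmd2(clean, shifted))
```

A minimum-MMD estimator picks the model parameter whose implied sample distribution minimizes this quantity against the data; because the kernel is bounded, a few outliers can move the criterion only slightly, which is the intuition behind the robustness results.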
Lashkaripour, Rahmatollah. "Lower bounds and norms of operators on Lorentz sequence spaces." Thesis, Lancaster University, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.364315.
Rammea, Lisema. "Computations and bounds for surfaces in weighted projective four-spaces." Thesis, University of Bath, 2009. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.507235.
Books on the topic "Sparse bound"
Turtledove, Harry. Homeward Bound. New York: Del Rey/Ballantine Books, 2004.
Turtledove, Harry. Homeward Bound. New York: Random House Publishing Group, 2004.
Turtledove, Harry. Homeward Bound. New York: Ballantine Books, 2005.
Nikova, Svetla Iordanova. Bounds for designs in infinite polynomial metric spaces. [Eindhoven]: University Press Facilities, Eindhoven University of Technology, 1998.
Schneck, Arne. Bounds for optimization of the reflection coefficient by constrained optimization in Hardy spaces. Karlsruhe: Univ.-Verl. Karlsruhe, 2009.
Public Art Development Trust (London, England), ed. The Thames archive project: Henry Bond, Angela Bulloch. London: Public Art Development Trust, 2000.
United States. Superintendent of Documents. Explorations in space: A collection of space-related publications to interest earth-bound astronauts of all ages. Washington, DC: Supt. of Docs., 1985.
United States. National Aeronautics and Space Administration, ed. Development of a nondestructive vibration technique for bond assessment of space shuttle tiles: Final report. Orlando, Fla: Dept. of Mechanical and Aerospace Engineering, University of Central Florida, 1994.
United States. Environmental Protection Agency. Office of Transportation and Air Quality. Bond requirements for nonroad spark-ignition engines: Frequently asked questions. [Washington, D.C.]: U.S. Environmental Protection Agency, Office of Transportation and Air Quality, 2009.
Copyright Paperback Collection (Library of Congress), ed. Honor Bound: Day of Honor Book 6: Star Trek: Deep Space Nine #11. New York: Pocket Books, 1997.
Book chapters on the topic "Sparse bound"
Schmitt, Michael. "An Improved VC Dimension Bound for Sparse Polynomials." In Learning Theory, 393–407. Berlin, Heidelberg: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-27819-1_27.
Liu, Pan, and Fenglei Wang. "Sparse Acceleration Algorithm Based on Likelihood Upper Bound." In Advances in Intelligent Systems and Computing, 171–79. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-34387-3_21.
Chen, Xi, Tim Randolph, Rocco A. Servedio, and Timothy Sun. "A Lower Bound on Cycle-Finding in Sparse Digraphs." In Proceedings of the Fourteenth Annual ACM-SIAM Symposium on Discrete Algorithms, 2936–52. Philadelphia, PA: Society for Industrial and Applied Mathematics, 2020. http://dx.doi.org/10.1137/1.9781611975994.178.
Maurer, Andreas, Massimiliano Pontil, and Luca Baldassarre. "Lower Bounds for Sparse Coding." In Measures of Complexity, 359–70. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-21852-6_24.
de Woot, Philippe. "New Spaces: Globalization of the Market Economy." In Should Prometheus Be Bound?, 25–37. London: Palgrave Macmillan UK, 2005. http://dx.doi.org/10.1057/9780230502062_3.
van de Geer, Sara. "Lower Bounds for Sparse Quadratic Forms." In Lecture Notes in Mathematics, 223–31. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-32774-7_15.
Sottile, Frank. "Lower bounds for sparse polynomial systems." In Real Solutions to Equations from Geometry, 77–90. Providence, Rhode Island: American Mathematical Society, 2011. http://dx.doi.org/10.1090/ulect/057/07.
Waghid, Yusef. "Pedagogy Within Rhizomatic Spaces." In Pedagogy Out of Bounds, 65–70. Rotterdam: SensePublishers, 2014. http://dx.doi.org/10.1007/978-94-6209-616-5_7.
Martín Pendás, Ángel, and Julia Contreras-García. "Topological Spaces." In Topological Approaches to the Chemical Bond, 9–28. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-13666-5_2.
Krause, Ben, and Michael T. Lacey. "Sparse Bounds for Random Discrete Carleson Theorems." In 50 Years with Hardy Spaces, 317–32. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-59078-3_16.
Conference papers on the topic "Sparse bound"
Guitton, A. "Sparse Radon Transforms with Bound-Constrained Optimization." In 67th EAGE Conference & Exhibition. European Association of Geoscientists & Engineers, 2005. http://dx.doi.org/10.3997/2214-4609-pdb.1.a025.
Pajovic, Milutin. "Misspecified Bayesian Cramér-Rao Bound for Sparse Bayesian." In 2018 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2018. http://dx.doi.org/10.1109/ssp.2018.8450780.
Florescu, Anisia, Emilie Chouzenoux, Jean-Christophe Pesquet, and Silviu Ciochina. "Cramer-Rao bound for a sparse complex model." In 2014 10th International Conference on Communications (COMM). IEEE, 2014. http://dx.doi.org/10.1109/iccomm.2014.6866673.
Saidi, Pouria, George Atia, and Azadeh Vosoughi. "Improved Block-Sparse Recovery Bound Using Cumulative Block Coherence." In 2020 54th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2020. http://dx.doi.org/10.1109/ieeeconf51394.2020.9443546.
Friedlander, B. "On the Cramer-Rao Bound for Sparse Linear Arrays." In 2020 54th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2020. http://dx.doi.org/10.1109/ieeeconf51394.2020.9443292.
Montanari, Andrea, and Elchanan Mossel. "Smooth compression, Gallager bound and nonlinear sparse-graph codes." In 2008 IEEE International Symposium on Information Theory - ISIT. IEEE, 2008. http://dx.doi.org/10.1109/isit.2008.4595436.
Wang, Di, and Jinhui Xu. "Lower Bound of Locally Differentially Private Sparse Covariance Matrix Estimation." In Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI-19). California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/665.
Hashemi, Abolfazl, and Haris Vikalo. "Recovery of sparse signals via Branch and Bound Least-Squares." In 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2017. http://dx.doi.org/10.1109/icassp.2017.7953060.
Khan, Diba, and Kristine L. Bell. "Explicit Ziv-Zakai bound for DOA estimation with sparse linear arrays." In 2009 3rd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP). IEEE, 2009. http://dx.doi.org/10.1109/camsap.2009.5413287.
Koochakzadeh, Ali, and Piya Pal. "On saturation of the Cramér Rao Bound for Sparse Bayesian Learning." In 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2017. http://dx.doi.org/10.1109/icassp.2017.7952723.
Reports on the topic "Sparse bound"
Clark, Donald L., Stefan M. Kirby, and Charles G. Oviatt. Geologic Map of the Rush Valley 30' X 60' Quadrangle, Tooele, Utah, and Salt Lake Counties, Utah. Utah Geological Survey, August 2023. http://dx.doi.org/10.34191/m-294dm.
Mezzacappa, Anthony, Eirik Endeve, Cory D. Hauck, and Yulong Xing. Bound-Preserving Discontinuous Galerkin Methods for Conservative Phase Space Advection in Curvilinear Coordinates. Office of Scientific and Technical Information (OSTI), February 2015. http://dx.doi.org/10.2172/1394128.
Jacobs, Timothy, and Jacob Hedrick. PR-457-14201-R03 Variable NG Composition Effects in LB 2S Compressor Engines - Prediction Enhancement. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), August 2017. http://dx.doi.org/10.55274/r0011406.
Oliynyk, Kateryna, and Matteo Ciantia. Application of a finite deformation multiplicative plasticity model with non-local hardening to the simulation of CPTu tests in a structured soil. University of Dundee, December 2021. http://dx.doi.org/10.20933/100001230.
Bano, Masooda, and Zeena Oberoi. Embedding Innovation in State Systems: Lessons from Pratham in India. Research on Improving Systems of Education (RISE), December 2020. http://dx.doi.org/10.35489/bsg-rise-wp_2020/058.
Litaor, Iggy, James Ippolito, Iris Zohar, and Michael Massey. Phosphorus capture recycling and utilization for sustainable agriculture using Al/organic composite water treatment residuals. United States Department of Agriculture, January 2015. http://dx.doi.org/10.32747/2015.7600037.bard.