Academic literature on the topic 'Low-Rank matrix approximation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Low-Rank matrix approximation.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Low-Rank matrix approximation"

1

Liu, Ting, Mingjian Sun, Naizhang Feng, Minghua Wang, Deying Chen, and Yi Shen. "Sparse photoacoustic microscopy based on low-rank matrix approximation." Chinese Optics Letters 14, no. 9 (2016): 091701–91705. http://dx.doi.org/10.3788/col201614.091701.

2

Parekh, Ankit, and Ivan W. Selesnick. "Enhanced Low-Rank Matrix Approximation." IEEE Signal Processing Letters 23, no. 4 (2016): 493–97. http://dx.doi.org/10.1109/lsp.2016.2535227.

3

Fomin, Fedor V., Petr A. Golovach, and Fahad Panolan. "Parameterized low-rank binary matrix approximation." Data Mining and Knowledge Discovery 34, no. 2 (2020): 478–532. http://dx.doi.org/10.1007/s10618-019-00669-5.

4

Fomin, Fedor V., Petr A. Golovach, Daniel Lokshtanov, Fahad Panolan, and Saket Saurabh. "Approximation Schemes for Low-rank Binary Matrix Approximation Problems." ACM Transactions on Algorithms 16, no. 1 (2020): 1–39. http://dx.doi.org/10.1145/3365653.

5

Jia, Yuheng, Hui Liu, Junhui Hou, and Qingfu Zhang. "Clustering Ensemble Meets Low-rank Tensor Approximation." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 9 (2021): 7970–78. http://dx.doi.org/10.1609/aaai.v35i9.16972.

Abstract:
This paper explores the problem of clustering ensemble, which aims to combine multiple base clusterings to produce better performance than that of the individual one. The existing clustering ensemble methods generally construct a co-association matrix, which indicates the pairwise similarity between samples, as the weighted linear combination of the connective matrices from different base clusterings, and the resulting co-association matrix is then adopted as the input of an off-the-shelf clustering algorithm, e.g., spectral clustering. However, the co-association matrix may be dominated by po…
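
The co-association construction described in the abstract above is easy to sketch. The snippet below is a minimal illustration, not the paper's method (which further applies a low-rank tensor approximation): it forms the co-association matrix as a weighted average of the connective (co-membership) matrices of a few toy base clusterings and hands it to an off-the-shelf spectral clustering routine; the labelings, weights, and scikit-learn usage are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def co_association(base_labelings, weights=None):
    """Weighted linear combination of co-membership (connective) matrices."""
    n = len(base_labelings[0])
    w = np.ones(len(base_labelings)) if weights is None else np.asarray(weights, float)
    w = w / w.sum()
    C = np.zeros((n, n))
    for wi, labels in zip(w, base_labelings):
        labels = np.asarray(labels)
        C += wi * (labels[:, None] == labels[None, :])  # 1 if same cluster, else 0
    return C

# Three toy base clusterings of six samples (illustrative only).
base = [[0, 0, 0, 1, 1, 1],
        [0, 0, 1, 1, 2, 2],
        [0, 1, 1, 1, 0, 0]]
C = co_association(base)
print(SpectralClustering(n_clusters=2, affinity="precomputed",
                         random_state=0).fit_predict(C))
```
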
6

Zhang, Zhenyue, and Keke Zhao. "Low-Rank Matrix Approximation with Manifold Regularization." IEEE Transactions on Pattern Analysis and Machine Intelligence 35, no. 7 (2013): 1717–29. http://dx.doi.org/10.1109/tpami.2012.274.

7

Xu, An-Bao, and Dongxiu Xie. "Low-rank approximation pursuit for matrix completion." Mechanical Systems and Signal Processing 95 (October 2017): 77–89. http://dx.doi.org/10.1016/j.ymssp.2017.03.024.

8

Barlow, Jesse L., and Hasan Erbay. "Modifiable low-rank approximation to a matrix." Numerical Linear Algebra with Applications 16, no. 10 (2009): 833–60. http://dx.doi.org/10.1002/nla.651.

9

Zhang, Jiani, Jennifer Erway, Xiaofei Hu, Qiang Zhang, and Robert Plemmons. "Randomized SVD Methods in Hyperspectral Imaging." Journal of Electrical and Computer Engineering 2012 (2012): 1–15. http://dx.doi.org/10.1155/2012/409357.

Abstract:
We present a randomized singular value decomposition (rSVD) method for the purposes of lossless compression, reconstruction, classification, and target detection with hyperspectral (HSI) data. Recent work in low-rank matrix approximations obtained from random projections suggests that these approximations are well suited for randomized dimensionality reduction. Approximation errors for the rSVD are evaluated on HSI, and comparisons are made to deterministic techniques as well as to other randomized low-rank matrix approximation methods involving compressive principal component analysis. Nu…
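
As a concrete illustration of the randomized low-rank approximation idea discussed in this abstract, here is a minimal Halko-style randomized SVD sketch in NumPy (a generic sketch, not the paper's exact pipeline); the rank, oversampling, and power-iteration counts are illustrative choices.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, n_iter=2, rng=None):
    """Basic randomized SVD: random projection, power iterations, small SVD."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + oversample))
    Y = A @ Omega
    for _ in range(n_iter):              # power iterations sharpen the range estimate
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for the sampled range
    B = Q.T @ A                          # small (rank + oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank]

A = np.random.default_rng(0).standard_normal((500, 200))
U, s, Vt = randomized_svd(A, rank=20)
print(np.linalg.norm(A - (U * s) @ Vt))  # Frobenius-norm approximation error
```
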
10

Soto-Quiros, Pablo. "Error analysis of the generalized low-rank matrix approximation." Electronic Journal of Linear Algebra 37 (July 23, 2021): 544–48. http://dx.doi.org/10.13001/ela.2021.5961.

Abstract:
In this paper, we propose an error analysis of the generalized low-rank approximation, which is a generalization of the classical approximation of a matrix $A\in\mathbb{R}^{m\times n}$ by a matrix of a rank at most $r$, where $r\leq\min\{m,n\}$.
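
For context, the classical rank-r approximation that this abstract generalizes is the truncated SVD, whose optimal errors are given in closed form by the Eckart-Young theorem: the spectral-norm error equals the first discarded singular value, and the Frobenius-norm error is the root sum of squares of the discarded ones. A small numerical check on an arbitrary test matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))
r = 3

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_r = (U[:, :r] * s[:r]) @ Vt[:r]          # best rank-r approximation of A

# Eckart-Young: the optimal errors follow directly from the singular values.
print(np.linalg.norm(A - A_r, 2), s[r])                              # spectral norm
print(np.linalg.norm(A - A_r, "fro"), np.sqrt(np.sum(s[r:] ** 2)))   # Frobenius norm
```
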

Dissertations / Theses on the topic "Low-Rank matrix approximation"

1

Robeyns, Matthieu. "Mixed precision algorithms for low-rank matrix and tensor approximations." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASG095.

Abstract:
Data are often handled through mathematical objects such as matrices and tensors, the latter being the generalization of matrices to more than two dimensions. Some application domains require storing too many elements, producing tensors that are too large; this problem is known as the curse of dimensionality. Mathematical methods such as low-rank approximations have been developed to reduce the dimensionality of these objects, albeit at a very high computational cost. Moreover, new computing architectures such as GP…
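
A toy experiment in the spirit of the thesis topic (purely illustrative, not its algorithms): compute a rank-r factorization in double precision, then store the factors in half precision and compare the approximation errors.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((300, 200))
r = 30

U, s, Vt = np.linalg.svd(A, full_matrices=False)
L = U[:, :r] * s[:r]                              # left factor (fp64)
R = Vt[:r]                                        # right factor (fp64)
L16 = L.astype(np.float16).astype(np.float64)     # same factors stored in half precision
R16 = R.astype(np.float16).astype(np.float64)

print(np.linalg.norm(A - L @ R))      # error of the fp64 low-rank approximation
print(np.linalg.norm(A - L16 @ R16))  # typically only slightly larger, at half the storage
```
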
2

Blanchard, Pierre. "Fast hierarchical algorithms for the low-rank approximation of matrices, with applications to materials physics, geostatistics and data analysis." Thesis, Bordeaux, 2017. http://www.theses.fr/2017BORD0016/document.

Abstract:
Advanced techniques for the low-rank approximation of matrices are fundamental dimension-reduction tools in many areas of scientific computing. Hierarchical approaches such as H2-matrices, in particular the fast multipole method (FMM), exploit the block low-rank structure of certain matrices to reduce the computational cost of n-body interaction problems to O(n) operations instead of O(n²). In order to better handle complex interaction kernels of various kinds, so-called "kernel-independent" FMM formulations have rec…
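
The block low-rank property that hierarchical methods such as H2-matrices and the FMM exploit can be seen on a toy example (a generic sketch, unrelated to the thesis's kernel-independent formulations): the interaction block between two well-separated point clusters has rapidly decaying singular values.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 400)                  # first cluster of points
y = np.linspace(10.0, 11.0, 400)                # second, well-separated cluster
K = 1.0 / np.abs(x[:, None] - y[None, :])       # off-diagonal block of a 1/r kernel

U, s, Vt = np.linalg.svd(K, full_matrices=False)
r = int(np.sum(s > 1e-10 * s[0]))               # numerical rank at tolerance 1e-10
K_r = (U[:, :r] * s[:r]) @ Vt[:r]
print(r, np.linalg.norm(K - K_r) / np.linalg.norm(K))   # small rank, tiny relative error
```
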
3

Lee, Joonseok. "Local approaches for collaborative filtering." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/53846.

Abstract:
Recommendation systems are emerging as an important business application as the demand for personalized services in E-commerce increases. Collaborative filtering techniques are widely used for predicting a user's preference or generating a list of items to be recommended. In this thesis, we develop several new approaches for collaborative filtering based on model combination and kernel smoothing. Specifically, we start with an experimental study that compares a wide variety of CF methods under different conditions. Based on this study, we formulate a combination model similar to boosting but w…
4

Kim, Jingu. "Nonnegative matrix and tensor factorizations, least squares problems, and applications." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42909.

Abstract:
Nonnegative matrix factorization (NMF) is a useful dimension reduction method that has been investigated and applied in various areas. NMF is considered for high-dimensional data in which each element has a nonnegative value, and it provides a low-rank approximation formed by factors whose elements are also nonnegative. The nonnegativity constraints imposed on the low-rank factors not only enable natural interpretation but also reveal the hidden structure of data. Extending the benefits of NMF to multidimensional arrays, nonnegative tensor factorization (NTF) has been shown to be successful in…
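
A minimal example of the nonnegative low-rank approximation described above, using the classic Lee-Seung multiplicative updates (one standard NMF algorithm; the thesis itself develops alternating nonnegativity-constrained least squares methods, so this is only a generic sketch).

```python
import numpy as np

def nmf_mu(A, r, n_iter=500, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for min ||A - W H||_F with W, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ A) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (A @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H

A = np.abs(np.random.default_rng(1).standard_normal((60, 40)))
W, H = nmf_mu(A, r=5)
print(np.linalg.norm(A - W @ H) / np.linalg.norm(A))   # relative reconstruction error
```
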
5

Galvin, Timothy Matthew. "Faster streaming algorithms for low-rank matrix approximations." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/91810.

Abstract:
Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2014. Cataloged from PDF version of thesis. Includes bibliographical references (pages 53-55). Low-rank matrix approximations are used in a significant number of applications. We present new algorithms for generating such approximations in a streaming fashion that expand upon recently discovered matrix sketching techniques. We test our approaches on real and synthetic data to explore runtime and accuracy performance. We apply our algorithms to the technique of Latent Sema…
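
One widely used streaming matrix-sketching technique of the kind this thesis builds on is Frequent Directions; the sketch below is a generic illustration of that idea (the variant that zeroes half the sketch rows per shrink step), not the thesis's algorithms.

```python
import numpy as np

def frequent_directions(rows, ell):
    """Maintain an ell-row sketch B of a row stream so that B.T @ B ~ A.T @ A."""
    B = np.zeros((ell, rows.shape[1]))
    for row in rows:
        empty = np.where(~B.any(axis=1))[0]
        if len(empty) == 0:                     # sketch full: shrink to free rows
            U, s, Vt = np.linalg.svd(B, full_matrices=False)
            s = np.sqrt(np.maximum(s**2 - s[ell // 2] ** 2, 0.0))
            B = s[:, None] * Vt                 # rows ell//2 onward become zero
            empty = np.where(~B.any(axis=1))[0]
        B[empty[0]] = row
    return B

A = np.random.default_rng(0).standard_normal((1000, 50))
B = frequent_directions(A, ell=20)
print(np.linalg.norm(A.T @ A - B.T @ B, 2),     # sketching error ...
      2 * np.linalg.norm(A, "fro") ** 2 / 20)   # ... vs. the 2 ||A||_F^2 / ell guarantee
```
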
6

Abbas, Kinan. "Dématriçage et démélange conjoints d'images multispectrales." Electronic Thesis or Diss., Littoral, 2024. http://www.theses.fr/2024DUNK0710.

Abstract:
In this thesis, we consider images captured by a miniaturized "snapshot" multispectral (MS) camera. Unlike conventional RGB cameras, MS imaging observes a scene at dozens of different wavelengths, allowing a much finer analysis of the observed content. While most MS cameras require a scan to produce an image, snapshot MS cameras can deliver images, and even videos, instantly. When the camera is miniaturized, instead of a 3D data cube it provides a 2D image in which each pixel is assoc…
7

Castorena, Juan. "Remote-Sensed LIDAR Using Random Impulsive Scans." International Foundation for Telemetering, 2012. http://hdl.handle.net/10150/581855.

Abstract:
Third generation full-waveform (FW) LIDAR systems image an entire scene by emitting laser pulses in particular directions and measuring the echoes. Each of these echoes provides range measurements about the objects intercepted by the laser pulse along a specified direction. By scanning through a specified region using a series of emitted pulses and observing their echoes, connected 1D profiles of 3D scenes can be readily obtained. This extra information has proven helpful in providing additional insight into the scene structure which can be used to construct effective characterizations and cla…
8

Vinyes, Marina. "Convex matrix sparsity for demixing with an application to graphical model structure estimation." Thesis, Paris Est, 2018. http://www.theses.fr/2018PESC1130/document.

Abstract:
In machine learning, the goal is to learn, from data, a model capable of making predictions on new (previously unseen) data. To obtain a model that generalizes to new data and avoids overfitting, we must restrict the model. These restrictions usually encode prior knowledge of the model's structure. The first approaches considered in the literature are Tikhonov regularization and, later, the Lasso, which induces sparsity in the solution. Sparsity is part of…
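
The matrix counterpart of the Lasso penalty mentioned above is the trace (nuclear) norm, whose proximal operator soft-thresholds singular values; this operation is a basic building block of many convex low-rank demixing methods. A minimal, purely illustrative sketch:

```python
import numpy as np

def svt(A, tau):
    """Singular value thresholding: the prox of tau * (nuclear norm) at A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(0)
low_rank = rng.standard_normal((40, 8)) @ rng.standard_normal((8, 30))  # rank 8
noisy = low_rank + 0.1 * rng.standard_normal((40, 30))
X = svt(noisy, tau=2.0)
print(np.linalg.matrix_rank(X))   # thresholding typically recovers a small rank
```
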
9

Sadek, El Mostafa. "Méthodes itératives pour la résolution d'équations matricielles." Thesis, Littoral, 2015. http://www.theses.fr/2015DUNK0434/document.

Abstract:
In this thesis, we study iterative methods for solving large-scale matrix equations: Lyapunov, Sylvester, Riccati, and nonsymmetric Riccati equations. The goal is to find more efficient and faster iterative methods for solving large matrix equations. We propose projection-type iterative methods onto block Krylov subspaces $K_m(A, V) = \mathrm{Range}\{V, AV, \ldots, A^{m-1}V\}$, or onto extended block Krylov subspaces $K^e_m(A, V) = \mathrm{Range}\{V, A^{-1}V, AV, A^{-2}V, A^{2}V, \ldots, A^{m-1}V, A^{-m+1}V\}$. These methods are gen…
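
A minimal sketch of the block Krylov subspace named in this abstract (illustrative only; practical solvers use block Arnoldi or extended Krylov recurrences rather than an explicitly formed power basis).

```python
import numpy as np

def block_krylov_basis(A, V, m):
    """Orthonormal basis of K_m(A, V) = span{V, AV, ..., A^{m-1} V}."""
    blocks = [V]
    for _ in range(m - 1):
        blocks.append(A @ blocks[-1])
    Q, _ = np.linalg.qr(np.hstack(blocks))   # orthonormalize the stacked blocks
    return Q

rng = np.random.default_rng(0)
n, p, m = 200, 2, 5
A = rng.standard_normal((n, n)) / np.sqrt(n)
V = rng.standard_normal((n, p))
Q = block_krylov_basis(A, V, m)
print(Q.shape, np.linalg.norm(Q.T @ Q - np.eye(m * p)))   # (200, 10) and ~0
```
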
10

Winkler, Anderson M. "Widening the applicability of permutation inference." Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:ce166876-0aa3-449e-8496-f28bf189960c.

Abstract:
This thesis is divided into three main parts. In the first, we discuss that, although permutation tests can provide exact control of false positives under the reasonable assumption of exchangeability, there are common examples in which global exchangeability does not hold, such as in experiments with repeated measurements or tests in which subjects are related to each other. To allow permutation inference in such cases, we propose an extension of the well known concept of exchangeability blocks, allowing these to be nested in a hierarchical, multi-level definition. This definition allows permu…

Book chapters on the topic "Low-Rank matrix approximation"

1

Kannan, Ramakrishnan, Mariya Ishteva, Barry Drake, and Haesun Park. "Bounded Matrix Low Rank Approximation." In Signals and Communication Technology. Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-48331-2_4.

2

Friedland, Shmuel, and Venu Tammali. "Low-Rank Approximation of Tensors." In Numerical Algebra, Matrix Theory, Differential-Algebraic Equations and Control Theory. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-15260-8_14.

3

Dewilde, Patrick, and Alle-Jan van der Veen. "Low-Rank Matrix Approximation and Subspace Tracking." In Time-Varying Systems and Computations. Springer US, 1998. http://dx.doi.org/10.1007/978-1-4757-2817-0_11.

4

Zhang, Huaxiang, Zhichao Wang, and Linlin Cao. "Fast Nyström for Low Rank Matrix Approximation." In Advanced Data Mining and Applications. Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-35527-1_38.

5

Deshpande, Amit, and Santosh Vempala. "Adaptive Sampling and Fast Low-Rank Matrix Approximation." In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques. Springer Berlin Heidelberg, 2006. http://dx.doi.org/10.1007/11830924_28.

6

Evensen, Geir, Femke C. Vossepoel, and Peter Jan van Leeuwen. "Localization and Inflation." In Springer Textbooks in Earth Sciences, Geography and Environment. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-96709-3_10.

Abstract:
Localization and inflation have become essential means of mitigating the effects of the low-rank approximation in ensemble methods. Localization increases the effective rank of the ensemble covariance matrix and allows it to fit a large number of independent observations. Thus, we use localization to reduce sampling errors, in combination with inflation, to reduce the underestimation of the ensemble variance caused by the low-rank approximation. These methods are essential for high-dimensional applications, and this chapter will give a general introduction to various formulations of lo…
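
A small numerical illustration of the localization idea summarized above (a generic sketch with a Gaussian taper rather than the book's exact formulations): the rank-deficient sample covariance of a small ensemble is tapered elementwise via a Schur product, which raises its effective rank.

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_ens = 200, 20
X = rng.standard_normal((n_state, n_ens))
X -= X.mean(axis=1, keepdims=True)                    # ensemble anomalies
P = (X @ X.T) / (n_ens - 1)                           # sample covariance, rank <= n_ens - 1

dist = np.abs(np.subtract.outer(np.arange(n_state), np.arange(n_state)))
taper = np.exp(-(dist / 10.0) ** 2)                   # localization function
P_loc = taper * P                                     # Schur (elementwise) product

print(np.linalg.matrix_rank(P), np.linalg.matrix_rank(P_loc))  # rank rises well above n_ens - 1
```
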
7

Li, Chong-Ya, Wenzheng Bao, Zhipeng Li, Youhua Zhang, Yong-Li Jiang, and Chang-An Yuan. "Local Sensitive Low Rank Matrix Approximation via Nonconvex Optimization." In Intelligent Computing Methodologies. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-63315-2_67.

8

Wacira, Joseph Muthui, Dinna Ranirina, and Bubacarr Bah. "Low Rank Matrix Approximation for Imputing Missing Categorical Data." In Artificial Intelligence Research. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95070-5_16.

9

Wu, Jiangang, and Shizhong Liao. "Accuracy-Preserving and Scalable Column-Based Low-Rank Matrix Approximation." In Knowledge Science, Engineering and Management. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-25159-2_22.

10

Mantzaflaris, Angelos, Bert Jüttler, B. N. Khoromskij, and Ulrich Langer. "Matrix Generation in Isogeometric Analysis by Low Rank Tensor Approximation." In Curves and Surfaces. Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-22804-4_24.


Conference papers on the topic "Low-Rank matrix approximation"

1

Kannan, Ramakrishnan, Mariya Ishteva, and Haesun Park. "Bounded Matrix Low Rank Approximation." In 2012 IEEE 12th International Conference on Data Mining (ICDM). IEEE, 2012. http://dx.doi.org/10.1109/icdm.2012.131.

2

Li, Chong-Ya, Lin Zhu, Wen-Zheng Bao, Yong-Li Jiang, Chang-An Yuan, and De-Shuang Huang. "Convex local sensitive low rank matrix approximation." In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017. http://dx.doi.org/10.1109/ijcnn.2017.7965863.

3

van der Veen, Alle-Jan. "Schur method for low-rank matrix approximation." In SPIE's 1994 International Symposium on Optics, Imaging, and Instrumentation, edited by Franklin T. Luk. SPIE, 1994. http://dx.doi.org/10.1117/12.190848.

4

Nadakuditi, Raj Rao. "Exploiting random matrix theory to improve noisy low-rank matrix approximation." In 2011 45th Asilomar Conference on Signals, Systems and Computers. IEEE, 2011. http://dx.doi.org/10.1109/acssc.2011.6190110.

5

Tatsukawa, Manami, and Mirai Tanaka. "Box Constrained Low-rank Matrix Approximation with Missing Values." In 7th International Conference on Operations Research and Enterprise Systems. SCITEPRESS - Science and Technology Publications, 2018. http://dx.doi.org/10.5220/0006612100780084.

6

Zheng, Yinqiang, Guangcan Liu, S. Sugimoto, Shuicheng Yan, and M. Okutomi. "Practical low-rank matrix approximation under robust L1-norm." In 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2012. http://dx.doi.org/10.1109/cvpr.2012.6247828.

7

Alelyani, Salem, and Huan Liu. "Supervised Low Rank Matrix Approximation for Stable Feature Selection." In 2012 Eleventh International Conference on Machine Learning and Applications (ICMLA). IEEE, 2012. http://dx.doi.org/10.1109/icmla.2012.61.

8

Liu, Yang, Wenji Chen, and Yong Guan. "Monitoring Traffic Activity Graphs with low-rank matrix approximation." In 2012 IEEE 37th Conference on Local Computer Networks (LCN 2012). IEEE, 2012. http://dx.doi.org/10.1109/lcn.2012.6423680.

9

Wang, Hengyou, Ruizhen Zhao, Yigang Cen, and Fengzhen Zhang. "Low-rank matrix recovery based on smooth function approximation." In 2016 IEEE 13th International Conference on Signal Processing (ICSP). IEEE, 2016. http://dx.doi.org/10.1109/icsp.2016.7877928.

10

Kaloorazi, Maboud F., and Jie Chen. "Low-rank Matrix Approximation Based on Intermingled Randomized Decomposition." In ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2019. http://dx.doi.org/10.1109/icassp.2019.8683284.
