Journal articles on the topic 'Low-Rank Tensor'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 journal articles for your research on the topic 'Low-Rank Tensor.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.
Zhong, Guoqiang, and Mohamed Cheriet. "Large Margin Low Rank Tensor Analysis." Neural Computation 26, no. 4 (April 2014): 761–80. http://dx.doi.org/10.1162/neco_a_00570.
Liu, Hongyi, Hanyang Li, Zebin Wu, and Zhihui Wei. "Hyperspectral Image Recovery Using Non-Convex Low-Rank Tensor Approximation." Remote Sensing 12, no. 14 (July 15, 2020): 2264. http://dx.doi.org/10.3390/rs12142264.
Zhou, Pan, Canyi Lu, Zhouchen Lin, and Chao Zhang. "Tensor Factorization for Low-Rank Tensor Completion." IEEE Transactions on Image Processing 27, no. 3 (March 2018): 1152–63. http://dx.doi.org/10.1109/tip.2017.2762595.
He, Yicong, and George K. Atia. "Multi-Mode Tensor Space Clustering Based on Low-Tensor-Rank Representation." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 6 (June 28, 2022): 6893–901. http://dx.doi.org/10.1609/aaai.v36i6.20646.
Liu, Xiaohua, and Guijin Tang. "Color Image Restoration Using Sub-Image Based Low-Rank Tensor Completion." Sensors 23, no. 3 (February 3, 2023): 1706. http://dx.doi.org/10.3390/s23031706.
Jiang, Yuanxiang, Qixiang Zhang, Zhanjiang Yuan, and Chen Wang. "Convex Robust Recovery of Corrupted Tensors via Tensor Singular Value Decomposition and Local Low-Rank Approximation." Journal of Physics: Conference Series 2670, no. 1 (December 1, 2023): 012026. http://dx.doi.org/10.1088/1742-6596/2670/1/012026.
Yu, Shicheng, Jiaqing Miao, Guibing Li, Weidong Jin, Gaoping Li, and Xiaoguang Liu. "Tensor Completion via Smooth Rank Function Low-Rank Approximate Regularization." Remote Sensing 15, no. 15 (August 3, 2023): 3862. http://dx.doi.org/10.3390/rs15153862.
Nie, Jiawang. "Low Rank Symmetric Tensor Approximations." SIAM Journal on Matrix Analysis and Applications 38, no. 4 (January 2017): 1517–40. http://dx.doi.org/10.1137/16m1107528.
Mickelin, Oscar, and Sertac Karaman. "Multiresolution Low-rank Tensor Formats." SIAM Journal on Matrix Analysis and Applications 41, no. 3 (January 2020): 1086–114. http://dx.doi.org/10.1137/19m1284579.
Gong, Xiao, Wei Chen, Jie Chen, and Bo Ai. "Tensor Denoising Using Low-Rank Tensor Train Decomposition." IEEE Signal Processing Letters 27 (2020): 1685–89. http://dx.doi.org/10.1109/lsp.2020.3025038.
Chen, Xi’ai, Zhen Wang, Kaidong Wang, Huidi Jia, Zhi Han, and Yandong Tang. "Multi-Dimensional Low-Rank with Weighted Schatten p-Norm Minimization for Hyperspectral Anomaly Detection." Remote Sensing 16, no. 1 (December 24, 2023): 74. http://dx.doi.org/10.3390/rs16010074.
Sobolev, Konstantin, Dmitry Ermilov, Anh-Huy Phan, and Andrzej Cichocki. "PARS: Proxy-Based Automatic Rank Selection for Neural Network Compression via Low-Rank Weight Approximation." Mathematics 10, no. 20 (October 14, 2022): 3801. http://dx.doi.org/10.3390/math10203801.
Sun, Li, and Bing Song. "Data Recovery Technology Based on Subspace Clustering." Scientific Programming 2022 (July 20, 2022): 1–6. http://dx.doi.org/10.1155/2022/1920933.
Bachmayr, Markus, and Vladimir Kazeev. "Stability of Low-Rank Tensor Representations and Structured Multilevel Preconditioning for Elliptic PDEs." Foundations of Computational Mathematics 20, no. 5 (January 23, 2020): 1175–236. http://dx.doi.org/10.1007/s10208-020-09446-z.
Shcherbakova, Elena M., Sergey A. Matveev, Alexander P. Smirnov, and Eugene E. Tyrtyshnikov. "Study of performance of low-rank nonnegative tensor factorization methods." Russian Journal of Numerical Analysis and Mathematical Modelling 38, no. 4 (August 1, 2023): 231–39. http://dx.doi.org/10.1515/rnam-2023-0018.
Du, Shiqiang, Yuqing Shi, Guangrong Shan, Weilan Wang, and Yide Ma. "Tensor low-rank sparse representation for tensor subspace learning." Neurocomputing 440 (June 2021): 351–64. http://dx.doi.org/10.1016/j.neucom.2021.02.002.
Cai, Bing, and Gui-Fu Lu. "Tensor subspace clustering using consensus tensor low-rank representation." Information Sciences 609 (September 2022): 46–59. http://dx.doi.org/10.1016/j.ins.2022.07.049.
Li, Hongyan [李鸿燕]. "Double Factor Tensor Norm Regularized Low Rank Tensor Completion." Advances in Applied Mathematics 11, no. 10 (2022): 6908–14. http://dx.doi.org/10.12677/aam.2022.1110732.
Jiang, Bo, Shiqian Ma, and Shuzhong Zhang. "Low-M-Rank Tensor Completion and Robust Tensor PCA." IEEE Journal of Selected Topics in Signal Processing 12, no. 6 (December 2018): 1390–404. http://dx.doi.org/10.1109/jstsp.2018.2873144.
Zheng, Yu-Bang, Ting-Zhu Huang, Xi-Le Zhao, Tai-Xiang Jiang, Teng-Yu Ji, and Tian-Hui Ma. "Tensor N-tubal rank and its convex relaxation for low-rank tensor recovery." Information Sciences 532 (September 2020): 170–89. http://dx.doi.org/10.1016/j.ins.2020.05.005.
Ma, Tingting [马婷婷]. "Enhanced Low Rank Tensor Approximation Algorithm." Advances in Applied Mathematics 08, no. 08 (2019): 1336–40. http://dx.doi.org/10.12677/aam.2019.88157.
Huang, Huyan, Yipeng Liu, Zhen Long, and Ce Zhu. "Robust Low-Rank Tensor Ring Completion." IEEE Transactions on Computational Imaging 6 (2020): 1117–26. http://dx.doi.org/10.1109/tci.2020.3006718.
Zhang, Anru. "Cross: Efficient low-rank tensor completion." Annals of Statistics 47, no. 2 (April 2019): 936–64. http://dx.doi.org/10.1214/18-aos1694.
Su, Yaru, Xiaohui Wu, and Genggeng Liu. "Nonconvex Low Tubal Rank Tensor Minimization." IEEE Access 7 (2019): 170831–43. http://dx.doi.org/10.1109/access.2019.2956115.
Wang, Andong, Zhihui Lai, and Zhong Jin. "Noisy low-tubal-rank tensor completion." Neurocomputing 330 (February 2019): 267–79. http://dx.doi.org/10.1016/j.neucom.2018.11.012.
Guo, Kailing, Tong Zhang, Xiangmin Xu, and Xiaofen Xing. "Low-Rank Tensor Thresholding Ridge Regression." IEEE Access 7 (2019): 153761–72. http://dx.doi.org/10.1109/access.2019.2944426.
Tan, Huachun, Jianshuai Feng, Zhengdong Chen, Fan Yang, and Wuhong Wang. "Low Multilinear Rank Approximation of Tensors and Application in Missing Traffic Data." Advances in Mechanical Engineering 6 (January 1, 2014): 157597. http://dx.doi.org/10.1155/2014/157597.
Zhu, Yada, Jingrui He, and Rick Lawrence. "Hierarchical Modeling with Tensor Inputs." Proceedings of the AAAI Conference on Artificial Intelligence 26, no. 1 (September 20, 2021): 1233–39. http://dx.doi.org/10.1609/aaai.v26i1.8283.
He, Jingfei, Xunan Zheng, Peng Gao, and Yatong Zhou. "Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks." Signal Processing 190 (January 2022): 108339. http://dx.doi.org/10.1016/j.sigpro.2021.108339.
Liu, Yipeng, Jiani Liu, and Ce Zhu. "Low-Rank Tensor Train Coefficient Array Estimation for Tensor-on-Tensor Regression." IEEE Transactions on Neural Networks and Learning Systems 31, no. 12 (December 2020): 5402–11. http://dx.doi.org/10.1109/tnnls.2020.2967022.
Jyothula, Sunil Kumar, and Jaya Chandra Prasad Talari. "An Efficient Transform based Low Rank Tensor Completion to Extreme Visual Recovery." Indian Journal of Science and Technology 15, no. 14 (April 11, 2022): 608–18. http://dx.doi.org/10.17485/ijst/v15i14.264.
Dong, Le, and Yuan Yuan. "Sparse Constrained Low Tensor Rank Representation Framework for Hyperspectral Unmixing." Remote Sensing 13, no. 8 (April 11, 2021): 1473. http://dx.doi.org/10.3390/rs13081473.
Ghadermarzy, Navid, Yaniv Plan, and Özgür Yilmaz. "Near-optimal sample complexity for convex tensor completion." Information and Inference: A Journal of the IMA 8, no. 3 (November 23, 2018): 577–619. http://dx.doi.org/10.1093/imaiai/iay019.
Chen, Chuan, Zhe-Bin Wu, Zi-Tai Chen, Zi-Bin Zheng, and Xiong-Jun Zhang. "Auto-weighted robust low-rank tensor completion via tensor-train." Information Sciences 567 (August 2021): 100–115. http://dx.doi.org/10.1016/j.ins.2021.03.025.
Liu, Chunsheng, Hong Shan, and Chunlei Chen. "Tensor p-shrinkage nuclear norm for low-rank tensor completion." Neurocomputing 387 (April 2020): 255–67. http://dx.doi.org/10.1016/j.neucom.2020.01.009.
Yang, Jing-Hua, Xi-Le Zhao, Teng-Yu Ji, Tian-Hui Ma, and Ting-Zhu Huang. "Low-rank tensor train for tensor robust principal component analysis." Applied Mathematics and Computation 367 (February 2020): 124783. http://dx.doi.org/10.1016/j.amc.2019.124783.
Zhang, Zhao, Cheng Ding, Zhisheng Gao, and Chunzhi Xie. "ANLPT: Self-Adaptive and Non-Local Patch-Tensor Model for Infrared Small Target Detection." Remote Sensing 15, no. 4 (February 12, 2023): 1021. http://dx.doi.org/10.3390/rs15041021.
Shi, Qiquan, Jiaming Yin, Jiajun Cai, Andrzej Cichocki, Tatsuya Yokota, Lei Chen, Mingxuan Yuan, and Jia Zeng. "Block Hankel Tensor ARIMA for Multiple Short Time Series Forecasting." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 5758–66. http://dx.doi.org/10.1609/aaai.v34i04.6032.
Mohaoui, S., K. El Qate, A. Hakim, and S. Raghay. "Low-rank tensor completion using nonconvex total variation." Mathematical Modeling and Computing 9, no. 2 (2022): 365–74. http://dx.doi.org/10.23939/mmc2022.02.365.
Jia, Yuheng, Hui Liu, Junhui Hou, and Qingfu Zhang. "Clustering Ensemble Meets Low-rank Tensor Approximation." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 9 (May 18, 2021): 7970–78. http://dx.doi.org/10.1609/aaai.v35i9.16972.
Suzuki, Taiji, and Heishiro Kanagawa. "Bayes method for low rank tensor estimation." Journal of Physics: Conference Series 699 (March 2016): 012020. http://dx.doi.org/10.1088/1742-6596/699/1/012020.
Kadmon, Jonathan, and Surya Ganguli. "Statistical mechanics of low-rank tensor decomposition." Journal of Statistical Mechanics: Theory and Experiment 2019, no. 12 (December 20, 2019): 124016. http://dx.doi.org/10.1088/1742-5468/ab3216.
Kressner, Daniel, Michael Steinlechner, and Bart Vandereycken. "Low-rank tensor completion by Riemannian optimization." BIT Numerical Mathematics 54, no. 2 (November 7, 2013): 447–68. http://dx.doi.org/10.1007/s10543-013-0455-z.
Xie, Ting, Shutao Li, Leyuan Fang, and Licheng Liu. "Tensor Completion via Nonlocal Low-Rank Regularization." IEEE Transactions on Cybernetics 49, no. 6 (June 2019): 2344–54. http://dx.doi.org/10.1109/tcyb.2018.2825598.
Sohrabi Bonab, Zahra, and Mohammad B. Shamsollahi. "Low-rank Tensor Restoration for ERP extraction." Biomedical Signal Processing and Control 87 (January 2024): 105379. http://dx.doi.org/10.1016/j.bspc.2023.105379.
Wang, Xiangyi [王香懿]. "Improved Robust Low-Rank Regularization Tensor Completion." Advances in Applied Mathematics 11, no. 11 (2022): 7647–52. http://dx.doi.org/10.12677/aam.2022.1111809.
Wang, Xiangyi, and Wei Jiang. "Improved Robust Low-Rank Regularization Tensor Completion." OALib 09, no. 11 (2022): 1–25. http://dx.doi.org/10.4236/oalib.1109425.
Hosono, Kaito, Shunsuke Ono, and Takamichi Miyata. "On the Synergy between Nonconvex Extensions of the Tensor Nuclear Norm for Tensor Recovery." Signals 2, no. 1 (February 18, 2021): 108–21. http://dx.doi.org/10.3390/signals2010010.
Duan, Yi-Shi, and Shao-Feng Wu. "Magnetic Branes from Generalized 't Hooft Tensor." Modern Physics Letters A 21, no. 34 (November 10, 2006): 2599–606. http://dx.doi.org/10.1142/s0217732306020500.
Zhou, Junxiu, Yangyang Tao, and Xian Liu. "Tensor Decomposition for Salient Object Detection in Images." Big Data and Cognitive Computing 3, no. 2 (June 19, 2019): 33. http://dx.doi.org/10.3390/bdcc3020033.