Journal articles on the topic 'Least squares algorithm'

Consult the top 50 journal articles for your research on the topic 'Least squares algorithm.'

You can also download the full text of each academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1. Yoo, Chang Kyoo, Su Whan Sung, and In-Beum Lee. "Generalized damped least squares algorithm." Computers & Chemical Engineering 27, no. 3 (March 2003): 423–31. http://dx.doi.org/10.1016/s0098-1354(02)00219-3.

2. Zhang, Xu-Yao, Lingfeng Wang, Shiming Xiang, and Cheng-Lin Liu. "Retargeted Least Squares Regression Algorithm." IEEE Transactions on Neural Networks and Learning Systems 26, no. 9 (September 2015): 2206–13. http://dx.doi.org/10.1109/tnnls.2014.2371492.

3. Peng, Guo, Weiming Li, Yang Huang, Yihai Chen, and Xingyu Gao. "Improved Least Squares Unwrapping Algorithm." Laser & Optoelectronics Progress 57, no. 18 (2020): 181101. http://dx.doi.org/10.3788/lop57.181101.

4. Feng, Da-Zheng, Zheng Bao, and Li-Cheng Jiao. "Total least mean squares algorithm." IEEE Transactions on Signal Processing 46, no. 8 (1998): 2122–30. http://dx.doi.org/10.1109/78.705421.

5. Van Huffel, Sabine. "Partial total least squares algorithm." Journal of Computational and Applied Mathematics 33, no. 1 (December 1990): 113–21. http://dx.doi.org/10.1016/0377-0427(90)90261-w.

6. Javed, Shazia, and Noor Atinah Ahmad. "A Stochastic Total Least Squares Solution of Adaptive Filtering Problem." Scientific World Journal 2014 (2014): 1–6. http://dx.doi.org/10.1155/2014/625280.

Abstract:
An efficient algorithm with linear computational complexity is derived for the total least squares solution of the adaptive filtering problem when both input and output signals are contaminated by noise. The proposed total least mean squares (TLMS) algorithm recursively computes an optimal solution of the adaptive TLS problem by minimizing the instantaneous value of a weighted cost function. A convergence analysis shows the global convergence of the algorithm, provided that the step-size parameter is appropriately chosen. The TLMS algorithm is computationally simpler than other TLS algorithms and performs better than the least mean square (LMS) and normalized least mean square (NLMS) algorithms. It provides minimum mean square deviation and exhibits better convergence in misalignment for unknown system identification under noisy inputs.
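
As background, the batch total least squares (TLS) estimate that a recursive TLMS algorithm tracks can be computed directly from the smallest right singular vector of the augmented data matrix. The following is a minimal NumPy sketch of that batch estimate, not the paper's recursive algorithm; all variable names and test data are illustrative.

```python
# Batch total least squares via SVD. Unlike ordinary least squares,
# TLS allows errors in both the input matrix A and the output b,
# matching the noisy-input setting described in the abstract.
import numpy as np

def tls(A, b):
    """Solve A w ~= b in the total least squares sense."""
    Ab = np.column_stack([A, b])   # augmented data matrix [A | b]
    _, _, Vt = np.linalg.svd(Ab)   # full SVD of the augmented matrix
    v = Vt[-1]                     # right singular vector for the smallest singular value
    return -v[:-1] / v[-1]         # scale so the last component equals -1

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 4))
w_true = np.array([1.0, -2.0, 0.5, 3.0])
b = A @ w_true
A_noisy = A + 0.05 * rng.standard_normal(A.shape)  # noise on the input...
b_noisy = b + 0.05 * rng.standard_normal(b.shape)  # ...and on the output
print(tls(A_noisy, b_noisy))  # close to w_true
```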

7. Heller, René, Michael Hippke, and Kai Rodenbeck. "Transit least-squares survey." Astronomy & Astrophysics 627 (July 2019): A66. http://dx.doi.org/10.1051/0004-6361/201935600.

Abstract:
The extended Kepler mission (K2) has revealed more than 500 transiting planets in roughly 500 000 stellar light curves. All of these were found either with the box least-squares algorithm or by visual inspection. Here we use our new transit least-squares (TLS) algorithm to search for additional planets around all K2 stars that are currently known to host at least one planet. We discover and statistically validate 17 new planets with radii ranging from about 0.7 Earth radii (R⊕) to roughly 2.2 R⊕ and a median radius of 1.18 R⊕. EPIC 201497682.03, with a radius of 0.692 (+0.059/−0.048) R⊕, is the second smallest planet ever discovered with K2. The transit signatures of these 17 planets are typically 200 ppm deep (ranging from 100 ppm to 2000 ppm), and their orbital periods extend from about 0.7 d to 34 d with a median value of about 4 d. Fourteen of these 17 systems only had one known planet before, and they now join the growing number of multi-planet systems. Most stars in our sample have subsolar masses and radii. The small planetary radii in our sample are a direct result of the higher signal detection efficiency that TLS has compared to box-fitting algorithms in the shallow-transit regime. Our findings help in populating the period-radius diagram with small planets. Our discovery rate of about 3.7% within the group of previously known K2 systems suggests that TLS can find over 100 additional Earth-sized planets in the data of the Kepler primary mission.

8. Haaland, David M., and David K. Melgaard. "New Classical Least-Squares/Partial Least-Squares Hybrid Algorithm for Spectral Analyses." Applied Spectroscopy 55, no. 1 (January 2001): 1–8. http://dx.doi.org/10.1366/0003702011951353.

9. Qian, Xiaofan, Fan Rao, Xinghua Li, Chao Lin, and Bin Li. "Accurate Least-Squares Phase Unwrapping Algorithm." Chinese Journal of Lasers 39, no. 2 (2012): 0209001. http://dx.doi.org/10.3788/cjl201239.0209001.

10. Al-Saggaf, Ubaid M., Muhammad Moinuddin, Muhammad Arif, and Azzedine Zerguine. "The q-Least Mean Squares algorithm." Signal Processing 111 (June 2015): 50–60. http://dx.doi.org/10.1016/j.sigpro.2014.11.016.

11. Geng, Pengbo, Jian Wang, and Wengu Chen. "Multipath least squares algorithm and analysis." Signal Processing 174 (September 2020): 107633. http://dx.doi.org/10.1016/j.sigpro.2020.107633.

12. Wang, Gang, Jingci Qiao, Rui Xue, and Bei Peng. "Quaternion kernel recursive least-squares algorithm." Signal Processing 178 (January 2021): 107810. http://dx.doi.org/10.1016/j.sigpro.2020.107810.

13. Rezende Barros, Péricles. "Recursive Incremental Least Squares Estimation Algorithm." IFAC Proceedings Volumes 28, no. 13 (June 1995): 315–20. http://dx.doi.org/10.1016/s1474-6670(17)45369-9.

14. Feng, Da-Zheng, Hai-Qin Zhang, Xian-Da Zhang, and Zheng Bao. "An extended recursive least-squares algorithm." Signal Processing 81, no. 5 (May 2001): 1075–81. http://dx.doi.org/10.1016/s0165-1684(00)00268-1.

15. Stoica, P., M. Agrawal, and P. Ahgren. "On the hierarchical least-squares algorithm." IEEE Communications Letters 6, no. 4 (April 2002): 153–55. http://dx.doi.org/10.1109/4234.996042.

16. Engel, Y., S. Mannor, and R. Meir. "The Kernel Recursive Least-Squares Algorithm." IEEE Transactions on Signal Processing 52, no. 8 (August 2004): 2275–85. http://dx.doi.org/10.1109/tsp.2004.830985.

17. Liu, Weifeng, Il Park, Yiwen Wang, and J. C. Principe. "Extended Kernel Recursive Least Squares Algorithm." IEEE Transactions on Signal Processing 57, no. 10 (October 2009): 3801–14. http://dx.doi.org/10.1109/tsp.2009.2022007.

18. Skretting, Karl, and Kjersti Engan. "Recursive Least Squares Dictionary Learning Algorithm." IEEE Transactions on Signal Processing 58, no. 4 (April 2010): 2121–30. http://dx.doi.org/10.1109/tsp.2010.2040671.

19. Chansarkar, M. M., and U. B. Desai. "A robust recursive least squares algorithm." IEEE Transactions on Signal Processing 45, no. 7 (July 1997): 1726–35. http://dx.doi.org/10.1109/78.599942.

20. Chen, Badong, Songlin Zhao, Pingping Zhu, and Jose C. Principe. "Quantized Kernel Recursive Least Squares Algorithm." IEEE Transactions on Neural Networks and Learning Systems 24, no. 9 (September 2013): 1484–91. http://dx.doi.org/10.1109/tnnls.2013.2258936.

21. Wei, Yongbin, S. B. Gelfand, and J. V. Krogmeier. "Noise-constrained least mean squares algorithm." IEEE Transactions on Signal Processing 49, no. 9 (2001): 1961–70. http://dx.doi.org/10.1109/78.942625.

22. Van Huffel, Sabine, and Joos Vandewalle. "The partial total least squares algorithm." Journal of Computational and Applied Mathematics 21, no. 3 (March 1988): 333–41. http://dx.doi.org/10.1016/0377-0427(88)90317-2.

23. Lybanon, Matthew. "A simple generalized least-squares algorithm." Computers & Geosciences 11, no. 4 (January 1985): 501–8. http://dx.doi.org/10.1016/0098-3004(85)90032-9.

24. Mohamadipanah, Hossein, Mahdi Heydari, and Girish Chowdhary. "Deep kernel recursive least-squares algorithm." Nonlinear Dynamics 104, no. 3 (April 18, 2021): 2515–30. http://dx.doi.org/10.1007/s11071-021-06416-0.

25. Jin, Xi, Xing Zhang, Kaifeng Rao, Liang Tang, and Qiwei Xie. "Semi-supervised partial least squares." International Journal of Wavelets, Multiresolution and Information Processing 18, no. 3 (January 13, 2020): 2050014. http://dx.doi.org/10.1142/s0219691320500149.

Abstract:
Traditional supervised dimensionality reduction methods often establish a good model only when a large number of samples is available. However, in real-world applications where labeled data are scarce, such methods tend to perform poorly because of overfitting. In such cases, unlabeled samples can be useful for improving performance. In this paper, we propose a semi-supervised dimensionality reduction method based on partial least squares (PLS), which we call semi-supervised partial least squares (S2PLS). To combine the labeled and unlabeled samples into an S2PLS model, we first apply the PLS algorithm to unsupervised dimensionality reduction. The final S2PLS model is then established by ensembling the supervised and unsupervised PLS models, using the basic idea of the principal model analysis (PMA) method. Compared with unsupervised or supervised dimensionality reduction algorithms, S2PLS not only improves the prediction accuracy but also enhances the generalization ability of the model, and it obtains good results even when there are only a few or no labeled samples. Experimental results on five UCI data sets confirm these properties of the S2PLS algorithm.
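
As background, the supervised building block of S2PLS is ordinary PLS dimensionality reduction. Below is a minimal sketch using scikit-learn's PLSRegression; the semi-supervised ensembling step described in the abstract is not shown, and the data and parameters are illustrative.

```python
# Supervised PLS dimensionality reduction: project a high-dimensional
# sample set onto a few latent components that covary with the response.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 20))  # many characteristic variables, few samples
y = X[:, :3] @ np.array([1.0, 2.0, -1.0]) + 0.1 * rng.standard_normal(100)

pls = PLSRegression(n_components=3)  # keep 3 latent components
pls.fit(X, y)
X_reduced = pls.transform(X)         # (100, 3) latent scores
print(X_reduced.shape, pls.score(X, y))  # reduced shape and training R^2
```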

26. Shi, Zhenwei, and Zhicheng Ji. "Least Squares Based and Two-Stage Least Squares Based Iterative Estimation Algorithms for H-FIR-MA Systems." Mathematical Problems in Engineering 2015 (2015): 1–8. http://dx.doi.org/10.1155/2015/516374.

Abstract:
This paper studies the identification of Hammerstein finite impulse response moving average (H-FIR-MA for short) systems. A new two-stage least squares iterative algorithm is developed to identify the parameters of H-FIR-MA systems. Simulation cases indicate the effectiveness of the proposed algorithms.

27. Heller, René, Kai Rodenbeck, and Michael Hippke. "Transit least-squares survey." Astronomy & Astrophysics 625 (May 2019): A31. http://dx.doi.org/10.1051/0004-6361/201935276.

Abstract:
We apply for the first time the transit least-squares (TLS) algorithm to search for new transiting exoplanets. TLS has been developed as a successor to the box least-squares (BLS) algorithm, which has served as a standard tool for the detection of periodic transits. In this proof-of-concept paper, we demonstrate that TLS finds small planets that have previously been missed. We show the capabilities of TLS using the K2 EVEREST-detrended light curve of the star K2-32 (EPIC 205071984), which has been known to have three transiting planets. TLS detects these known Neptune-sized planets K2-32 b, d, and c in an iterative search and finds an additional transit signal with a high signal detection efficiency (SDE_TLS) of 26.1 at a period of 4.34882 (+0.00069/−0.00075) d. We show that this additional signal remains detectable (SDE_TLS = 13.2) with TLS in the K2SFF light curve of K2-32, which includes a less optimal detrending of the systematic trends. As in previous searches, however, the signal is below common detection thresholds if searched with BLS in the K2SFF light curve (SDE_BLS = 8.9). Markov chain Monte Carlo sampling with the emcee software shows that the radius of this candidate is 1.01 (+0.10/−0.09) R⊕. We analyzed its phase-folded transit light curve using the vespa software and calculated a false-positive probability FPP = 3.1 × 10^−3. Taking into account the multiplicity boost of the system, we estimate an FPP < 3.1 × 10^−4, which formally validates K2-32 e as a planet. K2-32 now hosts at least four planets that are very close to a 1:2:5:7 mean motion resonance chain. The offset of the orbital periods of K2-32 e and b from a 1:2 mean motion resonance agrees very well with the sample of transiting multiplanet systems from Kepler, lending further credence to the planetary nature of K2-32 e. We expect that TLS can find many more transits of Earth-sized and even smaller planets in the Kepler and K2 data that have so far remained undetected with algorithms that search for box-like signals.

28. El-Khamy, S. E., M. M. Hadhoud, M. I. Dessouky, B. M. Salam, and F. E. Abd El-Samie. "Adaptive Least Squares Acquisition of High Resolution Images." International Journal of Information Acquisition 2, no. 1 (March 2005): 45–53. http://dx.doi.org/10.1142/s0219878905000416.

Abstract:
This paper presents a least squares block-by-block adaptive approach for the acquisition of high resolution (HR) images from available low resolution (LR) images. The suggested algorithm is based on segmenting the image into overlapping blocks and interpolating each block separately; the blocks overlap in order to avoid edge effects. An adaptive 2D least squares approach, which considers the image acquisition model, is used to minimize the estimation error of each block. In this algorithm, a weight matrix of moderate dimensions is estimated in a small number of iterations to interpolate each block, which avoids the large computational complexity of the large matrices that would be required to interpolate the image as a whole. The performance of the proposed algorithm is studied for different LR images with different SNRs and is also compared, in terms of PSNR, to the standard and warped-distance cubic O-MOMS image interpolation algorithms.

29. Xu, X., H. He, and D. Hu. "Efficient Reinforcement Learning Using Recursive Least-Squares Methods." Journal of Artificial Intelligence Research 16 (April 1, 2002): 259–92. http://dx.doi.org/10.1613/jair.946.

Abstract:
The recursive least-squares (RLS) algorithm is one of the most well-known algorithms used in adaptive filtering, system identification and adaptive control. Its popularity is mainly due to its fast convergence speed, which is considered to be optimal in practice. In this paper, RLS methods are used to solve reinforcement learning problems, where two new reinforcement learning algorithms using linear value function approximators are proposed and analyzed. The two algorithms are called RLS-TD(lambda) and Fast-AHC (Fast Adaptive Heuristic Critic), respectively. RLS-TD(lambda) can be viewed as the extension of RLS-TD(0) from lambda=0 to general lambda within interval [0,1], so it is a multi-step temporal-difference (TD) learning algorithm using RLS methods. The convergence with probability one and the limit of convergence of RLS-TD(lambda) are proved for ergodic Markov chains. Compared to the existing LS-TD(lambda) algorithm, RLS-TD(lambda) has advantages in computation and is more suitable for online learning. The effectiveness of RLS-TD(lambda) is analyzed and verified by learning prediction experiments of Markov chains with a wide range of parameter settings. The Fast-AHC algorithm is derived by applying the proposed RLS-TD(lambda) algorithm in the critic network of the adaptive heuristic critic method. Unlike conventional AHC algorithm, Fast-AHC makes use of RLS methods to improve the learning-prediction efficiency in the critic. Learning control experiments of the cart-pole balancing and the acrobot swing-up problems are conducted to compare the data efficiency of Fast-AHC with conventional AHC. From the experimental results, it is shown that the data efficiency of learning control can also be improved by using RLS methods in the learning-prediction process of the critic. The performance of Fast-AHC is also compared with that of the AHC method using LS-TD(lambda). Furthermore, it is demonstrated in the experiments that different initial values of the variance matrix in RLS-TD(lambda) are required to get better performance not only in learning prediction but also in learning control. The experimental results are analyzed based on the existing theoretical work on the transient phase of forgetting factor RLS methods.
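
The classic RLS recursion that RLS-TD(lambda) builds on fits in a few lines: the inverse correlation matrix P is updated with the matrix inversion lemma, so no matrix is ever explicitly inverted. Below is a minimal NumPy sketch with illustrative names; the initialization of P plays the role of the variance matrix whose initial value the abstract discusses.

```python
# One step of recursive least squares with forgetting factor lam.
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    Px = P @ x
    k = Px / (lam + x @ Px)           # gain vector
    e = d - w @ x                     # a priori prediction error
    w = w + k * e                     # parameter update
    P = (P - np.outer(k, Px)) / lam   # inverse correlation matrix update
    return w, P

rng = np.random.default_rng(2)
w_true = np.array([0.5, -1.0, 2.0])
w, P = np.zeros(3), 100.0 * np.eye(3)  # large initial "variance"
for _ in range(500):
    x = rng.standard_normal(3)
    d = w_true @ x + 0.01 * rng.standard_normal()
    w, P = rls_update(w, P, x, d)
print(w)  # converges to w_true
```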

30. Tan, Wen Xue, Mei Sen Pan, and Xiao Rong Xu. "A Novel Elman Algorithm Based on Sectional Least Squares." Advanced Materials Research 301-303 (July 2011): 695–700. http://dx.doi.org/10.4028/www.scientific.net/amr.301-303.695.

Abstract:
In this paper, a novel algorithm named SLS-Elman is put forward; it aims at more effectively training and learning small-scale samples with many characteristic variables, and it takes the Sectional Least Squares principle and structural properties of the Elman neural network into consideration. When reducing characteristic variables in high-dimensional, small-scale samples, the novel algorithm takes the correlation among dependent variables into account. Data obtained by this algorithm are then used to train and simulate a neural network; the resulting network has a simpler structure and yields a more precise model. Case-study statistics demonstrate that the novel algorithm improves convergence rate, forecast precision, and efficiency. To test its efficiency, the novel algorithm is also compared with algorithms such as the Elman neural network based on Principal Component Analysis, and it shows clear advantages.

31. Subrahmanyam, A. V. B., D. C. Saha, and G. P. Rao. "An H∞-Norm Bounded Least-Squares Algorithm." IFAC Proceedings Volumes 28, no. 13 (June 1995): 309–14. http://dx.doi.org/10.1016/s1474-6670(17)45368-7.

32. Bai, Er-Wei. "A random least-trimmed-squares identification algorithm." Automatica 39, no. 9 (September 2003): 1651–59. http://dx.doi.org/10.1016/s0005-1098(03)00193-6.

33. Keerthi, S. S., and S. K. Shevade. "SMO Algorithm for Least-Squares SVM Formulations." Neural Computation 15, no. 2 (February 1, 2003): 487–507. http://dx.doi.org/10.1162/089976603762553013.

Abstract:
This article extends the well-known SMO algorithm of support vector machines (SVMs) to least-squares SVM formulations that include LS-SVM classification, kernel ridge regression, and a particular form of regularized kernel Fisher discriminant. The algorithm is shown to be asymptotically convergent. It is also extremely easy to implement. Computational experiments show that the algorithm is fast and scales efficiently (quadratically) as a function of the number of examples.
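
One of the least-squares formulations the article covers, kernel ridge regression, reduces in the dual to a single linear system; SMO-style decomposition becomes attractive when that system is too large to solve directly. Below is a minimal sketch of the direct solve (not the SMO algorithm itself), with an illustrative Gaussian kernel and data.

```python
# Kernel ridge regression: alpha = (K + lam*I)^{-1} y in the dual.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian kernel matrix between the rows of A and the rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)

lam = 1e-2                           # regularization strength
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # dual coefficients

X_test = np.linspace(-3, 3, 5)[:, None]
print(rbf_kernel(X_test, X) @ alpha)  # predictions approximate sin(x)
```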

34. Vega, L. R., H. Rey, J. Benesty, and S. Tressens. "A Fast Robust Recursive Least-Squares Algorithm." IEEE Transactions on Signal Processing 57, no. 3 (March 2009): 1209–16. http://dx.doi.org/10.1109/tsp.2008.2010643.

35. Wang, Xuan, Jingqiu Zhou, Shaoshuai Mou, and Martin J. Corless. "A Distributed Algorithm for Least Squares Solutions." IEEE Transactions on Automatic Control 64, no. 10 (October 2019): 4217–22. http://dx.doi.org/10.1109/tac.2019.2894588.

36. Vogel, C. R., and Q. Yang. "Multigrid algorithm for least-squares wavefront reconstruction." Applied Optics 45, no. 4 (February 1, 2006): 705. http://dx.doi.org/10.1364/ao.45.000705.

37. Wang, Jin, Gwanggil Jeon, and Jechang Jeong. "De-Interlacing Algorithm Using Weighted Least Squares." IEEE Transactions on Circuits and Systems for Video Technology 24, no. 1 (January 2014): 39–48. http://dx.doi.org/10.1109/tcsvt.2013.2280068.

38. Arablouei, R., and K. Dogancay. "Linearly-Constrained Recursive Total Least-Squares Algorithm." IEEE Signal Processing Letters 19, no. 12 (December 2012): 821–24. http://dx.doi.org/10.1109/lsp.2012.2221705.

39. Goutis, Constantinos. "Partial least squares algorithm yields shrinkage estimators." Annals of Statistics 24, no. 2 (April 1996): 816–24. http://dx.doi.org/10.1214/aos/1032894467.

40. Li, Z. F., M. R. Osborne, and T. Prvan. "Adaptive Algorithm for Constrained Least-Squares Problems." Journal of Optimization Theory and Applications 114, no. 2 (August 2002): 423–41. http://dx.doi.org/10.1023/a:1016043919978.

41. Rhode, Stephan, Konstantin Usevich, Ivan Markovsky, and Frank Gauterin. "A Recursive Restricted Total Least-Squares Algorithm." IEEE Transactions on Signal Processing 62, no. 21 (November 2014): 5652–62. http://dx.doi.org/10.1109/tsp.2014.2350959.

42. Aguirre, L. A. "Algorithm for extended least-squares model reduction." Electronics Letters 31, no. 22 (October 26, 1995): 1957–59. http://dx.doi.org/10.1049/el:19951311.

43. Helland, Kristian, Hans E. Berntsen, Odd S. Borgen, and Harald Martens. "Recursive algorithm for partial least squares regression." Chemometrics and Intelligent Laboratory Systems 14, no. 1-3 (April 1992): 129–37. http://dx.doi.org/10.1016/0169-7439(92)80098-o.

44. Peng, Jing-Jing, and An-Ping Liao. "Algorithm for inequality-constrained least squares problems." Computational and Applied Mathematics 36, no. 1 (April 9, 2015): 249–58. http://dx.doi.org/10.1007/s40314-015-0226-3.

45. Jungblut, Jens, Daniel Fritz Plöger, Philip Zech, and Stephan Rinderknecht. "Order Tracking based Least Mean Squares Algorithm." IFAC-PapersOnLine 52, no. 15 (2019): 465–70. http://dx.doi.org/10.1016/j.ifacol.2019.11.719.

46. Van Huffel, Sabine. "The extended classical total least squares algorithm." Journal of Computational and Applied Mathematics 25, no. 1 (January 1989): 111–19. http://dx.doi.org/10.1016/0377-0427(89)90080-0.

47. Mathai, A. M., and R. S. Katiyar. "A new algorithm for nonlinear least squares." Journal of Mathematical Sciences 81, no. 1 (August 1996): 2454–63. http://dx.doi.org/10.1007/bf02362352.

48. Shakarji, C. M. "Least-squares fitting algorithms of the NIST algorithm testing system." Journal of Research of the National Institute of Standards and Technology 103, no. 6 (November 1998): 633. http://dx.doi.org/10.6028/jres.103.043.

49. Geng, Pengbo, Wengu Chen, and Huanmin Ge. "Perturbation Analysis of Orthogonal Least Squares." Canadian Mathematical Bulletin 62, no. 4 (March 22, 2019): 780–97. http://dx.doi.org/10.4153/s0008439519000134.

Abstract:
The Orthogonal Least Squares (OLS) algorithm is an efficient sparse recovery algorithm that has received much attention in recent years. On one hand, this paper considers how the OLS algorithm recovers the supports of sparse signals in the noisy case. We show that the OLS algorithm exactly recovers the support of a K-sparse signal x from y = Φx + e in K iterations, provided that the sensing matrix Φ satisfies the restricted isometry property (RIP) with restricted isometry constant (RIC) δ_{K+1} < 1/√(K+1), and the minimum magnitude of the nonzero elements of x satisfies some constraint. On the other hand, this paper demonstrates that the OLS algorithm exactly recovers the support of the best K-term approximation of an almost sparse signal x in the general perturbation case, in which both y and Φ are perturbed. We show that the support of the best K-term approximation of x can be recovered under reasonable conditions based on the RIP.
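
For reference, OLS itself is a greedy loop: at each of K iterations it adds the column whose inclusion minimizes the residual of a full least-squares re-fit over the enlarged support. Below is a minimal, unoptimized NumPy sketch of the noiseless case, with illustrative names and test data.

```python
# Orthogonal Least Squares support recovery of a K-sparse x from y = Phi @ x.
import numpy as np

def ols(Phi, y, K):
    m, n = Phi.shape
    support = []
    for _ in range(K):
        best_j, best_res = None, np.inf
        for j in range(n):                # try every unused column
            if j in support:
                continue
            cols = support + [j]
            coef, *_ = np.linalg.lstsq(Phi[:, cols], y, rcond=None)
            res = np.linalg.norm(y - Phi[:, cols] @ coef)
            if res < best_res:            # keep the column that shrinks the residual most
                best_j, best_res = j, res
        support.append(best_j)
    x = np.zeros(n)
    coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    x[support] = coef                     # final LS fit on the recovered support
    return x, sorted(support)

rng = np.random.default_rng(4)
Phi = rng.standard_normal((50, 100)) / np.sqrt(50)  # random sensing matrix
x_true = np.zeros(100)
x_true[[7, 23, 61]] = [1.5, -2.0, 1.0]
_, support = ols(Phi, Phi @ x_true, K=3)
print(support)  # recovers [7, 23, 61] with high probability
```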

50. Panigrahi, T., P. M. Pradhan, G. Panda, and B. Mulgrew. "Block Least Mean Squares Algorithm over Distributed Wireless Sensor Network." Journal of Computer Networks and Communications 2012 (2012): 1–13. http://dx.doi.org/10.1155/2012/601287.

Abstract:
In a distributed parameter estimation problem, during each sampling instant, a typical sensor node communicates its estimate either by the diffusion algorithm or by the incremental algorithm. Both these conventional distributed algorithms involve significant communication overheads and, consequently, defeat the basic purpose of wireless sensor networks. In the present paper, we therefore propose two new distributed algorithms, namely, block diffusion least mean square (BDLMS) and block incremental least mean square (BILMS) by extending the concept of block adaptive filtering techniques to the distributed adaptation scenario. The performance analysis of the proposed BDLMS and BILMS algorithms has been carried out and found to have similar performances to those offered by conventional diffusion LMS and incremental LMS algorithms, respectively. The convergence analyses of the proposed algorithms obtained from the simulation study are also found to be in agreement with the theoretical analysis. The remarkable and interesting aspect of the proposed block-based algorithms is that their communication overheads per node and latencies are less than those of the conventional algorithms by a factor as high as the block size used in the algorithms.
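
The block LMS building block that BDLMS and BILMS distribute over the network holds the weights fixed across a block of samples and applies one averaged-gradient update per block. Below is a minimal single-node sketch in NumPy; names and parameters are illustrative, and the diffusion/incremental communication steps are not shown.

```python
# Block LMS identification of an unknown FIR system.
import numpy as np

def block_lms(x, d, num_taps=4, block=8, mu=0.05):
    w = np.zeros(num_taps)
    # tap-input vectors [x[n], x[n-1], ..., x[n-num_taps+1]] for each time n
    X = np.array([x[i:i + num_taps][::-1] for i in range(len(x) - num_taps + 1)])
    D = d[num_taps - 1:]                      # desired outputs aligned with X
    for start in range(0, len(X) - block + 1, block):
        Xb = X[start:start + block]           # one block of tap vectors
        eb = D[start:start + block] - Xb @ w  # a priori errors over the block
        w = w + mu * Xb.T @ eb / block        # single averaged update per block
    return w

rng = np.random.default_rng(5)
h = np.array([0.8, -0.4, 0.2, 0.1])           # unknown FIR system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
print(block_lms(x, d))  # approaches h
```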