Journal articles on the topic 'Recursive Least Squares'

Consult the top 50 journal articles for your research on the topic 'Recursive Least Squares.'

1. Grant, Ian H. W. M. "Recursive Least Squares." Teaching Statistics 9, no. 1 (January 1987): 15–18. http://dx.doi.org/10.1111/j.1467-9639.1987.tb00614.x.
2. Mehrabi, Hamid, and Behzad Voosoghi. "Recursive moving least squares." Engineering Analysis with Boundary Elements 58 (September 2015): 119–28. http://dx.doi.org/10.1016/j.enganabound.2015.04.001.
3. Gozzo, F. "Recursive least-squares sequence estimation." IBM Journal of Research and Development 38, no. 2 (March 1994): 131–56. http://dx.doi.org/10.1147/rd.382.0131.
4. Baykal, B., and A. G. Constantinides. "Order-recursive underdetermined recursive least-squares adaptive algorithms." Signal Processing 63, no. 3 (December 1997): 241–47. http://dx.doi.org/10.1016/s0165-1684(97)00160-6.
5. Mohamadipanah, Hossein, Mahdi Heydari, and Girish Chowdhary. "Deep kernel recursive least-squares algorithm." Nonlinear Dynamics 104, no. 3 (April 18, 2021): 2515–30. http://dx.doi.org/10.1007/s11071-021-06416-0.
6. Paleologu, Constantin, Jacob Benesty, and Silviu Ciochina. "Data-Reuse Recursive Least-Squares Algorithms." IEEE Signal Processing Letters 29 (2022): 752–56. http://dx.doi.org/10.1109/lsp.2022.3153207.
7. Zhang, Chunyuan, Qi Song, and Zeng Meng. "Minibatch Recursive Least Squares Q-Learning." Computational Intelligence and Neuroscience 2021 (October 8, 2021): 1–9. http://dx.doi.org/10.1155/2021/5370281.

Abstract:
The deep Q-network (DQN) is one of the most successful reinforcement learning algorithms, but it has some drawbacks such as slow convergence and instability. In contrast, traditional reinforcement learning algorithms with linear function approximation usually have faster convergence and better stability, although they easily suffer from the curse of dimensionality. In recent years, many improvements to DQN have been made, but they seldom make use of the advantages of traditional algorithms to improve DQN. In this paper, we propose a novel Q-learning algorithm with linear function approximation, called minibatch recursive least squares Q-learning (MRLS-Q). Unlike the traditional Q-learning algorithm with linear function approximation, the learning mechanism and model structure of MRLS-Q are more similar to those of DQNs, with only one input layer and one linear output layer. It uses experience replay and the minibatch training mode, and it uses the agent's states rather than the agent's state-action pairs as inputs. As a result, it can be used alone for low-dimensional problems and can be seamlessly integrated into DQN as the last layer for high-dimensional problems as well. In addition, MRLS-Q uses our proposed average RLS optimization technique, so that it can achieve better convergence performance whether it is used alone or integrated with DQN. At the end of this paper, we demonstrate the effectiveness of MRLS-Q on the CartPole problem and four Atari games and investigate the influences of its hyperparameters experimentally.
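
The abstract outlines the core computation well enough to sketch it: a linear Q-function over state features, updated by a block (minibatch) recursive least-squares step on replayed transitions. The sketch below is only an illustration under our own assumptions; the class name, the feature interface, and the per-action bookkeeping are ours, and the paper's average-RLS refinement is not reproduced.

```python
import numpy as np

class MinibatchRLSQ:
    """Illustrative linear Q-learning with a block (minibatch) RLS update.

    One weight vector and one inverse-correlation matrix per action;
    states (not state-action pairs) are the inputs, as in the abstract.
    """

    def __init__(self, n_features, n_actions, gamma=0.99, delta=1.0):
        self.w = np.zeros((n_actions, n_features))
        self.P = np.stack([np.eye(n_features) / delta
                           for _ in range(n_actions)])
        self.gamma = gamma

    def q_values(self, phi):
        return self.w @ phi                 # Q(s, a) for every action a

    def update(self, batch):
        """batch: list of (phi, action, reward, phi_next, done) tuples."""
        for a in range(self.w.shape[0]):
            rows = [t for t in batch if t[1] == a]
            if not rows:
                continue
            X = np.array([phi for phi, _, _, _, _ in rows])   # B x n
            # TD targets: r + gamma * max_a' Q(s', a'), no bootstrap at terminals
            y = np.array([r + (0.0 if done else
                               self.gamma * np.max(self.w @ pn))
                          for _, _, r, pn, done in rows])
            P, w = self.P[a], self.w[a]
            # Block RLS step via the matrix inversion lemma:
            #   K = P X^T (I + X P X^T)^{-1};  w += K (y - X w);  P -= K X P
            K = P @ X.T @ np.linalg.inv(np.eye(len(rows)) + X @ P @ X.T)
            self.w[a] = w + K @ (y - X @ w)
            self.P[a] = P - K @ X @ P
```

With the forgetting factor implicitly fixed at 1, each replayed minibatch is absorbed exactly; a forgetting factor could be added as in standard RLS.
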
8. Huarng, K. C., and C. C. Yeh. "Continuous-time recursive least-squares algorithms." IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 39, no. 10 (1992): 741–45. http://dx.doi.org/10.1109/82.199900.
9. Yuan, Jenq-Tay, and J. A. Stuller. "Least squares order-recursive lattice smoothers." IEEE Transactions on Signal Processing 43, no. 5 (May 1995): 1058–67. http://dx.doi.org/10.1109/78.382393.
10. Chansarkar, M. M., and U. B. Desai. "A robust recursive least squares algorithm." IEEE Transactions on Signal Processing 45, no. 7 (July 1997): 1726–35. http://dx.doi.org/10.1109/78.599942.
11. Connolly, M. P., and P. Fitzpatrick. "Fault-tolerant QRD recursive least squares." IEE Proceedings - Computers and Digital Techniques 143, no. 2 (1996): 137. http://dx.doi.org/10.1049/ip-cdt:19960198.
12. Liu, Weifeng, Il Park, Yiwen Wang, and J. C. Principe. "Extended Kernel Recursive Least Squares Algorithm." IEEE Transactions on Signal Processing 57, no. 10 (October 2009): 3801–14. http://dx.doi.org/10.1109/tsp.2009.2022007.
13. Skretting, Karl, and Kjersti Engan. "Recursive Least Squares Dictionary Learning Algorithm." IEEE Transactions on Signal Processing 58, no. 4 (April 2010): 2121–30. http://dx.doi.org/10.1109/tsp.2010.2040671.
14. Feng, Da-Zheng, Hai-Qin Zhang, Xian-Da Zhang, and Zheng Bao. "An extended recursive least-squares algorithm." Signal Processing 81, no. 5 (May 2001): 1075–81. http://dx.doi.org/10.1016/s0165-1684(00)00268-1.
15. Wang, Gang, Jingci Qiao, Rui Xue, and Bei Peng. "Quaternion kernel recursive least-squares algorithm." Signal Processing 178 (January 2021): 107810. http://dx.doi.org/10.1016/j.sigpro.2020.107810.
16. Engel, Y., S. Mannor, and R. Meir. "The Kernel Recursive Least-Squares Algorithm." IEEE Transactions on Signal Processing 52, no. 8 (August 2004): 2275–85. http://dx.doi.org/10.1109/tsp.2004.830985.
17. Liu, Zhaoting, and Chunguang Li. "Recursive Least Squares for Censored Regression." IEEE Transactions on Signal Processing 65, no. 6 (March 15, 2017): 1565–79. http://dx.doi.org/10.1109/tsp.2016.2646660.
18. Chen, Badong, Songlin Zhao, Pingping Zhu, and Jose C. Principe. "Quantized Kernel Recursive Least Squares Algorithm." IEEE Transactions on Neural Networks and Learning Systems 24, no. 9 (September 2013): 1484–91. http://dx.doi.org/10.1109/tnnls.2013.2258936.
19. Li, X. Rong, and Yunmin Zhu. "Recursive Least Squares with Linear Constraints." Communications in Information and Systems 7, no. 3 (2007): 287–312. http://dx.doi.org/10.4310/cis.2007.v7.n3.a5.
20. Kazemi, Nasser, and Mauricio D. Sacchi. "Block row recursive least-squares migration." Geophysics 80, no. 5 (September 2015): A95–A101. http://dx.doi.org/10.1190/geo2015-0070.1.
21. Rezende Barros, Péricles. "Recursive Incremental Least Squares Estimation Algorithm." IFAC Proceedings Volumes 28, no. 13 (June 1995): 315–20. http://dx.doi.org/10.1016/s1474-6670(17)45369-9.
22. Tanc, A. Korhan. "Sparsity regularized recursive total least-squares." Digital Signal Processing 40 (May 2015): 176–80. http://dx.doi.org/10.1016/j.dsp.2015.02.018.
23. Irshad, Azeem, Muhammad Salman, Sajid Bashir, and Muhammad Bilal Malik. "Extended state space recursive least squares." Digital Signal Processing 49 (February 2016): 95–103. http://dx.doi.org/10.1016/j.dsp.2015.10.017.
24. Okano, T., L. D. Albright, J. Q. Pan, and L. S. Marsh. "Greenhouse Parameter Estimation by Recursive Least Squares." Acta Horticulturae, no. 174 (December 1985): 433–42. http://dx.doi.org/10.17660/actahortic.1985.174.58.
25. Zhao, K., Fuyun Ling, H. Lev-Ari, and J. G. Proakis. "Sliding window order-recursive least-squares algorithms." IEEE Transactions on Signal Processing 42, no. 8 (1994): 1961–72. http://dx.doi.org/10.1109/78.301835.
26. Strobach, P. "Pure order recursive least-squares ladder algorithms." IEEE Transactions on Acoustics, Speech, and Signal Processing 34, no. 4 (August 1986): 880–97. http://dx.doi.org/10.1109/tassp.1986.1164881.
27. Arablouei, R., and K. Dogancay. "Linearly-Constrained Recursive Total Least-Squares Algorithm." IEEE Signal Processing Letters 19, no. 12 (December 2012): 821–24. http://dx.doi.org/10.1109/lsp.2012.2221705.
28. Eksioglu, E. M. "Sparsity regularised recursive least squares adaptive filtering." IET Signal Processing 5, no. 5 (2011): 480. http://dx.doi.org/10.1049/iet-spr.2010.0083.
29. Vega, L. R., H. Rey, J. Benesty, and S. Tressens. "A Fast Robust Recursive Least-Squares Algorithm." IEEE Transactions on Signal Processing 57, no. 3 (March 2009): 1209–16. http://dx.doi.org/10.1109/tsp.2008.2010643.
30. Molander, M., P. E. Modén, and K. Holmström. "Model Reduction in Recursive Least Squares Identification." IFAC Proceedings Volumes 25, no. 14 (July 1992): 5–10. http://dx.doi.org/10.1016/s1474-6670(17)50704-1.
31. Malik, Mohammad Bilal. "State-space recursive least squares: Part II." Signal Processing 84, no. 9 (September 2004): 1719–28. http://dx.doi.org/10.1016/j.sigpro.2004.05.021.
32. Malik, Mohammad Bilal. "State-space recursive least-squares: Part I." Signal Processing 84, no. 9 (September 2004): 1709–18. http://dx.doi.org/10.1016/j.sigpro.2004.05.022.
33. Zhao, Yongping, and Jianguo Sun. "Recursive reduced least squares support vector regression." Pattern Recognition 42, no. 5 (May 2009): 837–42. http://dx.doi.org/10.1016/j.patcog.2008.09.028.
34. Moonen, Marc, and Joos Vandewalle. "Recursive least squares with stabilized inverse factorization." Signal Processing 21, no. 1 (September 1990): 1–15. http://dx.doi.org/10.1016/0165-1684(90)90022-q.
35. Engel, Konrad, and Sebastian Engel. "Recursive least squares with linear inequality constraints." Optimization and Engineering 16, no. 1 (December 31, 2014): 1–26. http://dx.doi.org/10.1007/s11081-014-9274-6.
36. Liu, Zhaoting, Ying Liu, and Chunguang Li. "Distributed Sparse Recursive Least-Squares Over Networks." IEEE Transactions on Signal Processing 62, no. 6 (March 2014): 1386–95. http://dx.doi.org/10.1109/tsp.2014.2302731.
37. Rhode, Stephan, Konstantin Usevich, Ivan Markovsky, and Frank Gauterin. "A Recursive Restricted Total Least-Squares Algorithm." IEEE Transactions on Signal Processing 62, no. 21 (November 2014): 5652–62. http://dx.doi.org/10.1109/tsp.2014.2350959.
38. Helland, Kristian, Hans E. Berntsen, Odd S. Borgen, and Harald Martens. "Recursive algorithm for partial least squares regression." Chemometrics and Intelligent Laboratory Systems 14, no. 1-3 (April 1992): 129–37. http://dx.doi.org/10.1016/0169-7439(92)80098-o.
39. Moonen, Marc. "Systolic Algorithms for Recursive Total Least Squares Parameter Estimation and Mixed RLS/RTLS Problems." International Journal of High Speed Electronics and Systems 4, no. 1 (March 1993): 55–68. http://dx.doi.org/10.1142/s0129156493000042.

Abstract:
Total least squares parameter estimation is an alternative to least squares estimation, though it is much less used in practice, partly due to the absence of efficient recursive algorithms or parallel architectures. Here it is shown how previously developed systolic algorithms/architectures for recursive least squares estimation can be used for recursive total least squares problems. Unconstrained as well as linearly constrained and "mixed RLS/RTLS" problems are considered.

40. Hong, Wang Jian, and Daobo Wang. "Recursive least squares identification for piecewise affine Hammerstein models." International Journal of Intelligent Computing and Cybernetics 11, no. 2 (June 11, 2018): 234–53. http://dx.doi.org/10.1108/ijicc-01-2017-0004.

Abstract:
Purpose: The purpose of this paper is to study the recursive identification of piecewise affine Hammerstein models directly from input-output data. To explain the identification process of a parametric piecewise affine nonlinear function, the authors prove that the inverse of the given piecewise affine nonlinear function is also of an equivalent piecewise affine form. Based on this equivalence, for the detailed identification of the piecewise affine function and the linear dynamical system, three recursive least squares methods are proposed to identify the unknown parameters under either a probabilistic description or a boundedness assumption on the noise. Design/methodology/approach: First, the basic recursive least squares method is used to identify the unknown parameters under a probabilistic description of the noise. Second, a multi-innovation recursive least squares method is proposed to improve on the efficiency that the basic recursive least squares method lacks. Third, to relax the strict probabilistic description of the noise, the authors provide a projection algorithm with a dead zone in the presence of bounded noise and analyze two of its properties. Findings: Based on a detailed mathematical derivation, the inverse of a given piecewise affine nonlinear function is also of an equivalent piecewise affine form. As the least squares method is suited only to the case where the noise is a zero-mean random signal, a projection algorithm with a dead zone in the presence of bounded noise can enhance the robustness of the parameter update equation. Originality/value: To the best of the authors' knowledge, this is the first attempt at identifying piecewise affine Hammerstein models, which combine a piecewise affine function with a linear dynamical system. In the presence of bounded noise, the modified recursive least squares methods are efficient at identifying the two kinds of unknown parameters, so that the common set-membership method can be replaced by the proposed methods.
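
The "basic recursive least squares method" that this abstract builds on has a standard form worth seeing next to the citation. The following is a minimal sketch under generic assumptions (regressor vector phi_t, scalar output y_t, exponential forgetting factor lam); it is not the authors' piecewise affine machinery.

```python
import numpy as np

def rls_identify(phis, ys, lam=0.98, delta=100.0):
    """Basic exponentially weighted RLS parameter estimation.

    phis: sequence of regressor vectors phi_t, each of shape (n,)
    ys:   sequence of scalar outputs y_t
    Returns the final parameter estimate theta.
    """
    n = len(phis[0])
    theta = np.zeros(n)               # parameter estimate
    P = np.eye(n) * delta             # (scaled) inverse correlation matrix
    for phi, y in zip(phis, ys):
        phi = np.asarray(phi, dtype=float)
        Pphi = P @ phi
        k = Pphi / (lam + phi @ Pphi)     # gain vector
        e = y - theta @ phi               # a priori prediction error
        theta = theta + k * e
        P = (P - np.outer(k, Pphi)) / lam
    return theta
```

For a Hammerstein-type model one would, for example, build phi_t from lagged outputs and nonlinearly transformed inputs, such as phi_t = [y_{t-1}, y_{t-2}, f(u_t), f(u_{t-1})] for some static nonlinearity f.
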
41. Moonen, Marc, and Joos Vandewalle. "A square root covariance algorithm for constrained recursive least squares estimation." Journal of VLSI Signal Processing Systems for Signal, Image and Video Technology 3, no. 3 (September 1991): 163–72. http://dx.doi.org/10.1007/bf00925827.
42. Ardalan, S. "Floating-point error analysis of recursive least-squares and least-mean-squares adaptive filters." IEEE Transactions on Circuits and Systems 33, no. 12 (December 1986): 1192–208. http://dx.doi.org/10.1109/tcs.1986.1085877.
43. Xu, X., H. He, and D. Hu. "Efficient Reinforcement Learning Using Recursive Least-Squares Methods." Journal of Artificial Intelligence Research 16 (April 1, 2002): 259–92. http://dx.doi.org/10.1613/jair.946.

Abstract:
The recursive least-squares (RLS) algorithm is one of the most well-known algorithms used in adaptive filtering, system identification and adaptive control. Its popularity is mainly due to its fast convergence speed, which is considered to be optimal in practice. In this paper, RLS methods are used to solve reinforcement learning problems, where two new reinforcement learning algorithms using linear value function approximators are proposed and analyzed. The two algorithms are called RLS-TD(λ) and Fast-AHC (Fast Adaptive Heuristic Critic), respectively. RLS-TD(λ) can be viewed as the extension of RLS-TD(0) from λ = 0 to general λ in the interval [0, 1], so it is a multi-step temporal-difference (TD) learning algorithm using RLS methods. The convergence with probability one and the limit of convergence of RLS-TD(λ) are proved for ergodic Markov chains. Compared to the existing LS-TD(λ) algorithm, RLS-TD(λ) has advantages in computation and is more suitable for online learning. The effectiveness of RLS-TD(λ) is analyzed and verified by learning-prediction experiments on Markov chains with a wide range of parameter settings. The Fast-AHC algorithm is derived by applying the proposed RLS-TD(λ) algorithm in the critic network of the adaptive heuristic critic method. Unlike the conventional AHC algorithm, Fast-AHC makes use of RLS methods to improve the learning-prediction efficiency in the critic. Learning-control experiments on the cart-pole balancing and acrobot swing-up problems are conducted to compare the data efficiency of Fast-AHC with that of conventional AHC. The experimental results show that the data efficiency of learning control can also be improved by using RLS methods in the learning-prediction process of the critic. The performance of Fast-AHC is also compared with that of the AHC method using LS-TD(λ). Furthermore, the experiments demonstrate that different initial values of the variance matrix in RLS-TD(λ) are required to obtain better performance not only in learning prediction but also in learning control. The experimental results are analyzed based on the existing theoretical work on the transient phase of forgetting-factor RLS methods.
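
The RLS-TD(λ) recursion described here combines an eligibility trace with an RLS-style gain and covariance update. A minimal policy-evaluation sketch follows; the feature construction, the trace convention, and the initialization constant delta are our illustrative choices, not necessarily those of the paper.

```python
import numpy as np

def rls_td_lambda(trajectory, n_features, gamma=0.95, lam=0.5, delta=100.0):
    """Sketch of RLS-TD(lambda) for linear value-function prediction.

    trajectory: list of (phi, r, phi_next), where phi = features of s_t,
                r = reward, phi_next = features of s_{t+1}.
    Returns weights theta with V(s) ~= theta @ phi(s).
    """
    theta = np.zeros(n_features)
    P = np.eye(n_features) * delta        # initial 'variance matrix'
    z = np.zeros(n_features)              # eligibility trace
    for phi, r, phi_next in trajectory:
        z = gamma * lam * z + phi                     # decay and accumulate trace
        dphi = phi - gamma * phi_next                 # TD feature difference
        Pz = P @ z
        k = Pz / (1.0 + dphi @ Pz)                    # RLS-style gain
        theta = theta + k * (r - dphi @ theta)        # correct the TD residual
        P = P - np.outer(k, dphi @ P)                 # rank-one covariance update
    return theta
```

As the abstract notes, the scale of the initial variance matrix (delta here) materially affects transient behavior, in both learning prediction and learning control.
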
44. Bin, Michelangelo. "Generalized recursive least squares: Stability, robustness, and excitation." Systems & Control Letters 161 (March 2022): 105144. http://dx.doi.org/10.1016/j.sysconle.2022.105144.
45. Qin, Hui, Xiaoli Ding, Minghai Jia, and Jason Chao. "Variance Component Estimation for Recursive Least Squares Adjustments." Survey Review 35, no. 272 (April 1999): 117–25. http://dx.doi.org/10.1179/sre.1999.35.272.117.
46. Huang, Dawei. "Levinson-Type Recursive Algorithms for Least-Squares Autoregression." Journal of Time Series Analysis 11, no. 4 (July 1990): 295–315. http://dx.doi.org/10.1111/j.1467-9892.1990.tb00059.x.
47. Bhotto, Md Zulfiquar Ali, and Andreas Antoniou. "New Improved Recursive Least-Squares Adaptive-Filtering Algorithms." IEEE Transactions on Circuits and Systems I: Regular Papers 60, no. 6 (June 2013): 1548–58. http://dx.doi.org/10.1109/tcsi.2012.2220452.
48. Wu, An-Yeu, and K. J. Ray Liu. "Split Recursive Least-Squares: Algorithms, Architectures, and Applications." IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 43, no. 9 (September 1996): 645–58. http://dx.doi.org/10.1109/82.536761.
49. Moonen, M., and J. Vandewalle. "A systolic array for recursive least squares computations." IEEE Transactions on Signal Processing 41, no. 2 (1993): 906–12. http://dx.doi.org/10.1109/78.193226.
50. Lorenzelli, F., and Kung Yao. "A linear systolic array for recursive least squares." IEEE Transactions on Signal Processing 43, no. 12 (1995): 3014–21. http://dx.doi.org/10.1109/78.476445.