Academic literature on the topic 'Recursive Least Squares'

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Recursive Least Squares.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
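Before browsing the sources below, it may help to recall the core computation nearly all of them build on. The following minimal sketch (illustrative only, not taken from any listed work; all names are ours) shows one step of the exponentially weighted RLS recursion, using the matrix inversion lemma to avoid explicit matrix inversion:

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    """One step of the exponentially weighted RLS recursion.

    w   : current weight vector, shape (n,)
    P   : inverse correlation matrix estimate, shape (n, n)
    x   : new input (regressor) vector, shape (n,)
    d   : desired response (scalar)
    lam : forgetting factor, 0 < lam <= 1
    """
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = d - w @ x                    # a priori error
    w = w + k * e                    # weight update
    P = (P - np.outer(k, Px)) / lam  # matrix-inversion-lemma update of P
    return w, P
```

A typical initialization is `w = 0` and `P = delta * I` with a large `delta`, which acts as a mild regularizer that fades as data accumulate.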

Journal articles on the topic "Recursive Least Squares"

1

Grant, Ian H. W. M. "Recursive Least Squares." Teaching Statistics 9, no. 1 (January 1987): 15–18. http://dx.doi.org/10.1111/j.1467-9639.1987.tb00614.x.

2

Mehrabi, Hamid, and Behzad Voosoghi. "Recursive moving least squares." Engineering Analysis with Boundary Elements 58 (September 2015): 119–28. http://dx.doi.org/10.1016/j.enganabound.2015.04.001.

3

Gozzo, F. "Recursive least-squares sequence estimation." IBM Journal of Research and Development 38, no. 2 (March 1994): 131–56. http://dx.doi.org/10.1147/rd.382.0131.

4

Baykal, B., and A. G. Constantinides. "Order-recursive underdetermined recursive least-squares adaptive algorithms." Signal Processing 63, no. 3 (December 1997): 241–47. http://dx.doi.org/10.1016/s0165-1684(97)00160-6.

5

Mohamadipanah, Hossein, Mahdi Heydari, and Girish Chowdhary. "Deep kernel recursive least-squares algorithm." Nonlinear Dynamics 104, no. 3 (April 18, 2021): 2515–30. http://dx.doi.org/10.1007/s11071-021-06416-0.

6

Paleologu, Constantin, Jacob Benesty, and Silviu Ciochina. "Data-Reuse Recursive Least-Squares Algorithms." IEEE Signal Processing Letters 29 (2022): 752–56. http://dx.doi.org/10.1109/lsp.2022.3153207.

7

Zhang, Chunyuan, Qi Song, and Zeng Meng. "Minibatch Recursive Least Squares Q-Learning." Computational Intelligence and Neuroscience 2021 (October 8, 2021): 1–9. http://dx.doi.org/10.1155/2021/5370281.

Abstract:
The deep Q-network (DQN) is one of the most successful reinforcement learning algorithms, but it has some drawbacks such as slow convergence and instability. In contrast, the traditional reinforcement learning algorithms with linear function approximation usually have faster convergence and better stability, although they easily suffer from the curse of dimensionality. In recent years, many improvements to DQN have been made, but they seldom make use of the advantage of traditional algorithms to improve DQN. In this paper, we propose a novel Q-learning algorithm with linear function approximation, called the minibatch recursive least squares Q-learning (MRLS-Q). Different from the traditional Q-learning algorithm with linear function approximation, the learning mechanism and model structure of MRLS-Q are more similar to those of DQNs with only one input layer and one linear output layer. It uses the experience replay and the minibatch training mode and uses the agent’s states rather than the agent’s state-action pairs as the inputs. As a result, it can be used alone for low-dimensional problems and can be seamlessly integrated into DQN as the last layer for high-dimensional problems as well. In addition, MRLS-Q uses our proposed average RLS optimization technique, so that it can achieve better convergence performance whether it is used alone or integrated with DQN. At the end of this paper, we demonstrate the effectiveness of MRLS-Q on the CartPole problem and four Atari games and investigate the influences of its hyperparameters experimentally.
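As a rough illustration of the idea described in this abstract (not the authors' MRLS-Q algorithm itself, which adds an average RLS optimization technique), a linear output layer can be fit by running the rank-1 RLS recursion over each sample of a replayed minibatch; all names below are illustrative:

```python
import numpy as np

def minibatch_rls(w, P, X, y, lam=1.0):
    """Apply the rank-1 RLS recursion to every sample of a minibatch.

    w : (n,) weights of a linear output layer
    P : (n, n) inverse correlation estimate
    X : (m, n) minibatch of feature vectors (e.g. replayed states)
    y : (m,) regression targets (e.g. TD targets)
    """
    for x, t in zip(X, y):
        Px = P @ x
        k = Px / (lam + x @ Px)
        w = w + k * (t - w @ x)          # update on the a priori error
        P = (P - np.outer(k, Px)) / lam  # shared inverse-correlation update
    return w, P
```

In a Q-learning setting the targets `y` would be bootstrapped TD targets recomputed per minibatch; here the sketch only shows the recursive least-squares fit itself.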
8

Huarng, K. C., and C. C. Yeh. "Continuous-time recursive least-squares algorithms." IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing 39, no. 10 (1992): 741–45. http://dx.doi.org/10.1109/82.199900.

9

Yuan, Jenq-Tay, and J. A. Stuller. "Least squares order-recursive lattice smoothers." IEEE Transactions on Signal Processing 43, no. 5 (May 1995): 1058–67. http://dx.doi.org/10.1109/78.382393.

10

Chansarkar, M. M., and U. B. Desai. "A robust recursive least squares algorithm." IEEE Transactions on Signal Processing 45, no. 7 (July 1997): 1726–35. http://dx.doi.org/10.1109/78.599942.


Dissertations / Theses on the topic "Recursive Least Squares"

1

Baykal, Buyurman. "Underdetermined recursive least-squares adaptive filtering." Thesis, Imperial College London, 1995. http://hdl.handle.net/10044/1/7790.

2

Bian, Xiaomeng. "Completely Recursive Least Squares and Its Applications." ScholarWorks@UNO, 2012. http://scholarworks.uno.edu/td/1518.

Abstract:
The matrix-inversion-lemma based recursive least squares (RLS) approach is of a recursive form and free of matrix inversion, and has excellent performance regarding computation and memory in solving the classic least-squares (LS) problem. It is important to generalize RLS for the generalized LS (GLS) problem. It is also of value to develop an efficient initialization for any RLS algorithm. In Chapter 2, we develop a unified RLS procedure to solve the unconstrained/linear-equality (LE) constrained GLS. We also show that the LE constraint is in essence a set of special error-free observations and further consider the GLS with implicit LE constraint in observations (ILE-constrained GLS). Chapter 3 treats the RLS initialization-related issues, including rank check, a convenient method to compute the involved matrix inverse/pseudoinverse, and resolution of underdetermined systems. Based on auxiliary observations, the RLS recursion can start from the first real observation and possible LE constraints are also imposed recursively. The rank of the system is checked implicitly. If the rank is deficient, a set of refined non-redundant observations is determined alternatively. In Chapter 4, based on [Li07], we show that the linear minimum mean square error (LMMSE) estimator, as well as the optimal Kalman filter (KF) considering various correlations, can be calculated from solving an equivalent GLS using the unified RLS. In Chapters 5 and 6, an approach of joint state-and-parameter estimation (JSPE) in power systems monitored by synchrophasors is adopted, where the original nonlinear parameter problem is reformulated as two loosely coupled linear subproblems: state tracking and parameter tracking. Chapter 5 deals with the state tracking which determines the voltages in JSPE, where the dynamic behavior of voltages under possible abrupt changes is studied.
Chapter 6 focuses on the subproblem of parameter tracking in JSPE, where a new prediction model for parameters with moving means is introduced. Adaptive filters are developed for the above two subproblems, respectively, and both filters are based on the optimal KF accounting for various correlations. Simulations indicate that the proposed approach yields accurate parameter estimates and improves the accuracy of the state estimation, compared with existing methods.
3

Hutchinson, Derek Charles Glenn. "Manipulator inverse kinematics based on recursive least squares estimation." Thesis, University of British Columbia, 1988. http://hdl.handle.net/2429/27890.

Abstract:
The inverse kinematics problem for six degree of freedom robots having a separable structure with the wrist equivalent to a spherical joint is considered and an iterative solution based on estimating the inverse Jacobian by recursive least squares estimation is proposed. This solution is found to have properties similar to Wampler's Damped Least Squares method and provides a stable result when the manipulator is in singular regions. Furthermore, the solution is more computationally efficient than Wampler's method; however, its best performance is obtained when the distances between the current end effector pose and the target pose are small. No knowledge of the manipulator's geometry is required provided that the end effector and joint position data are obtained from sensor information. This permits the algorithm to be readily transferable among manipulators and circumvents detailed analysis of the manipulator's structure.
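A hedged sketch of the approach this thesis describes: the inverse Jacobian is treated as the coefficient matrix of a multi-output linear model, dq ≈ B dx, and estimated recursively from observed joint and end-effector displacements. Function and variable names below are illustrative, not from the thesis:

```python
import numpy as np

def update_inverse_jacobian(B, P, dx, dq, lam=0.95):
    """Multi-output RLS step for the model dq ≈ B @ dx.

    B  : (n_joints, n_task) current inverse-Jacobian estimate
    P  : (n_task, n_task) shared inverse correlation matrix
    dx : (n_task,) observed end-effector displacement
    dq : (n_joints,) joint displacement that produced it
    """
    Px = P @ dx
    k = Px / (lam + dx @ Px)
    B = B + np.outer(dq - B @ dx, k)   # rank-1 correction, one row per joint
    P = (P - np.outer(k, Px)) / lam    # shared across all output rows
    return B, P
```

Because only measured displacements enter the update, no analytical model of the manipulator's geometry is needed, which matches the transferability property claimed in the abstract.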
4

Walke, Richard Lewis. "High sample-rate Givens rotations for recursive least squares." Thesis, University of Warwick, 1997. http://wrap.warwick.ac.uk/36283/.

Abstract:
The design of an application-specific integrated circuit of a parallel array processor is considered for recursive least squares by QR decomposition using Givens rotations, applicable in adaptive filtering and beamforming applications. Emphasis is on high sample-rate operation, which, for this recursive algorithm, means that the time to perform arithmetic operations is critical. The algorithm, architecture and arithmetic are considered in a single integrated design procedure to achieve optimum results. A realisation approach using standard arithmetic operators, add, multiply and divide is adopted. The design of high-throughput operators with low delay is addressed for fixed- and floating-point number formats, and the application of redundant arithmetic considered. New redundant multiplier architectures are presented enabling reductions in area of up to 25%, whilst maintaining low delay. A technique is presented enabling the use of a conventional tree multiplier in recursive applications, allowing savings in area and delay. Two new divider architectures are presented showing benefits compared with the radix-2 modified SRT algorithm. Givens rotation algorithms are examined to determine their suitability for VLSI implementation. A novel algorithm, based on the Squared Givens Rotation (SGR) algorithm, is developed enabling the sample-rate to be increased by a factor of approximately 6 and offering area reductions up to a factor of 2 over previous approaches. An estimated sample-rate of 136 MHz could be achieved using a standard cell approach and 0.35 µm CMOS technology. The enhanced SGR algorithm has been compared with a CORDIC approach and shown to benefit by a factor of 3 in area and over 11 in sample-rate. When compared with a recent implementation on a parallel array of general purpose (GP) DSP chips, it is estimated that a single application specific chip could offer up to 1,500 times the computation obtained from a single GP DSP chip.
5

Tsakiris, Manolis. "On the regularization of the recursive least squares algorithm." Universidade de São Paulo, 2010. http://www.teses.usp.br/teses/disponiveis/3/3142/tde-21102010-101424/.

Abstract:
This thesis is concerned with the issue of the regularization of the Recursive Least-Squares (RLS) algorithm. In the first part of the thesis, a novel regularized exponentially weighted array RLS algorithm is developed, which circumvents the problem of fading regularization that is inherent to the standard regularized exponentially weighted RLS formulation, while allowing the employment of generic time-varying regularization matrices. The standard equations are directly perturbed via a chosen regularization matrix; then the resulting recursions are extended to the array form. The price paid is an increase in computational complexity, which becomes cubic. The superiority of the algorithm with respect to alternative algorithms is demonstrated via simulations in the context of adaptive beamforming, in which low filter orders are employed, so that complexity is not an issue. In the second part of the thesis, an alternative criterion is motivated and proposed for the dynamical regulation of regularization in the context of the standard RLS algorithm. The regularization is implicitly achieved via dithering of the input signal. The proposed criterion is of general applicability and aims at achieving a balance between the accuracy of the numerical solution of a perturbed linear system of equations and its distance from the analytical solution of the original system, for a given computational precision. Simulations show that the proposed criterion can be effectively used for the compensation of large condition numbers, small finite precisions and unnecessarily large values of the regularization.
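The mechanism described in the second part of this abstract, regularization achieved via dithering of the input, can be sketched as follows. The thesis's criterion for choosing the dither level is not reproduced; this only illustrates the mechanism, with illustrative names throughout:

```python
import numpy as np

def dithered_rls_update(w, P, x, d, rng, lam=0.999, dither_std=1e-2):
    """RLS step with input dithering.

    Adding small white noise to the regressor keeps the input
    persistently exciting, which bounds the growth of P (and hence
    the condition number of the implied correlation matrix) when the
    true input is weakly exciting or rank-deficient.
    """
    x = x + dither_std * rng.standard_normal(x.shape)  # dither the input
    Px = P @ x
    k = Px / (lam + x @ Px)
    w = w + k * (d - w @ x)
    P = (P - np.outer(k, Px)) / lam
    return w, P
```

The trade-off the thesis's criterion regulates is visible here: a larger `dither_std` conditions P better but biases the estimate away from the solution of the original, unperturbed system.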
6

Lightbody, Gaye. "High performance VLSI architectures for recursive least squares adaptive filtering." Thesis, Queen's University Belfast, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.313974.

7

Thompson, Kenneth. "Position estimation in a switched reluctance motor using recursive least squares." Thesis, University of Newcastle Upon Tyne, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366575.

8

Lauzon, Anne-Marie. "The time course of bronchoconstriction and its assessment by recursive least-squares." Thesis, McGill University, 1993. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=41672.

Abstract:
A recursive least-squares algorithm was developed to estimate respiratory mechanical parameters with high temporal resolution. This algorithm was used to investigate the time course of bronchoconstriction induced by intravenous histamine injection in the dog. The onset of the response of lung tissue resistance and elastance demonstrated a different time course than airway resistance. This was interpreted in terms of the sequential delivery of the drug first through the pulmonary and then the bronchial circulations. The time course of respiratory mechanical parameters among various alveolar capsules revealed two patterns of inhomogeneity development. The first one was random whereas the second one was progressive with dose. A mathematical derivation elucidated the negative tissue resistance frequently obtained at high levels of constriction. The time courses of respiratory resistance and elastance during bronchoconstriction were transient and scaled with dose. They were reproducible for repeated doses of histamine after indomethacin pre-treatment and were intrinsically modulated by the adrenergic sympathetic system and through the H2 histamine receptors.
9

Huo, Jia Q. "Numerical properties of adaptive recursive least-squares (RLS) algorithms with linear constraints." Thesis, Curtin University, 1999. http://hdl.handle.net/20.500.11937/270.

Abstract:
Adaptive filters have found applications in many signal processing problems. In some situations, linear constraints are imposed on the filter weights such that the filter is forced to exhibit a certain desired response. Several algorithms for linearly constrained least-squares adaptive filtering have been developed in the literature. When implemented with finite precision arithmetic, these algorithms are inevitably subjected to rounding errors. It is essential to understand how these algorithms react to rounding errors. In this thesis, the numerical properties of three linearly constrained least-squares adaptive filtering algorithms, namely, the linearly constrained fast least-squares algorithm, the linear systolic array for MVDR beamforming and the linearly constrained QRD-RLS algorithm, are studied. It is shown that all these algorithms can be separated into a constrained part and an unconstrained part. The numerical properties of unconstrained least-squares algorithms (i.e., the unconstrained part of the linearly constrained algorithms under study) are reviewed from the perspectives of error propagation, error accumulation and numerical persistency. It is shown that persistent excitation and sufficient numerical resolution are needed to ensure the stability of the CRLS algorithm, while the QRD-RLS algorithm is unconditionally stable. The numerical properties of the constrained algorithms are then examined. Based on the technique of how the constraints are applied, these algorithms can be grouped into two categories. The first two algorithms admit a similar structure in that the unconstrained parts precede the constrained parts. Error propagation analysis shows that this structure gives rise to unstable error propagation in the constrained part. In contrast, the constrained part of the third algorithm precedes the unconstrained part.
It is shown that this algorithm gives an exact solution to a linearly constrained least-squares adaptive filtering problem with perturbed constraints and perturbed input data. A minor modification to the constrained part of the linearly constrained QRD-RLS algorithm is proposed to avoid a potential numerical difficulty due to the Gaussian elimination operation employed in the algorithm.

Books on the topic "Recursive Least Squares"

1

Dynamic data processing: Recursive least-squares. Delft: Delft University Press, 2001.

2

Olszanskyj, Serge. Rank-k modification for recursive least squares problems. Ithaca, N.Y: Cornell Theory Center, Cornell University, 1993.

3

Walke, Richard Lewis. High sample-rate Givens rotations for recursive least squares. [s.l.]: typescript, 1997.

4

Price, Lydia J. Recursive least-squares approach to data transferability: Exposition and numerical results. Fontainebleau: INSEAD, 1992.

5

Tobia, John. A time-varying analysis of the exponentially data weighted recursive least squares (EDW-RLS) algorithm. Ottawa: National Library of Canada, 1992.

6

United States. National Aeronautics and Space Administration, ed. On recursive least-squares filtering algorithms and implementations. Los Angeles: University of California, 1990.

7

On recursive least-squares filtering algorithms and implementations. Los Angeles: University of California, 1990.

8

Toplis, Blake Stephen. Tracking, adaptability and stability modifications for fast recursive least squares algorithms. 1987.

9

Center, Ames Research, ed. Round-off error propagation in four generally applicable, recursive, least-squares-estimation schemes. Moffett Field, Calif.: National Aeronautics and Space Administration, Ames Research Center, 1988.

Book chapters on the topic "Recursive Least Squares"

1

Benesty, Jacob, Constantin Paleologu, Tomas Gänsler, and Silviu Ciochină. "Recursive Least-Squares Algorithms." In A Perspective on Stereophonic Acoustic Echo Cancellation, 63–69. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22574-1_6.

2

Young, Peter C. "Recursive Least Squares Estimation." In Recursive Estimation and Time-Series Analysis, 29–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-21981-8_3.

3

Strobach, Peter. "Recursive Least-Squares Transversal Algorithms." In Springer Series in Information Sciences, 102–57. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-75206-3_5.

4

Strobach, Peter. "Fast Recursive Least-Squares Ladder Algorithms." In Springer Series in Information Sciences, 281–311. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-75206-3_9.

5

Zhao, Ji, and Hongbin Zhang. "Projected Kernel Recursive Least Squares Algorithm." In Neural Information Processing, 356–65. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-70087-8_38.

6

Luk, Franklin T. "Fault Tolerant Recursive Least Squares Minimization." In Numerical Linear Algebra, Digital Signal Processing and Parallel Algorithms, 237–50. Berlin, Heidelberg: Springer Berlin Heidelberg, 1991. http://dx.doi.org/10.1007/978-3-642-75536-1_12.

7

Alexander, S. Thomas. "Chapter 8 Recursive Least Squares Signal Processing." In Adaptive Signal Processing, 111–22. New York, NY: Springer New York, 1986. http://dx.doi.org/10.1007/978-1-4612-4978-8_8.

8

Strobach, Peter. "Recursive Least-Squares Using the QR Decomposition." In Springer Series in Information Sciences, 63–101. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. http://dx.doi.org/10.1007/978-3-642-75206-3_4.

9

Helwani, Karim. "Spatio-Temporal Regularized Recursive Least Squares Algorithm." In T-Labs Series in Telecommunication Services, 23–33. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-08954-6_3.

10

Scherrer, Bruno, and Matthieu Geist. "Recursive Least-Squares Learning with Eligibility Traces." In Lecture Notes in Computer Science, 115–27. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-29946-9_14.


Conference papers on the topic "Recursive Least Squares"

1

Zheng, Yibin. "Recursive least squares image reconstruction." In Conference Record. Thirty-Fifth Asilomar Conference on Signals, Systems and Computers. IEEE, 2001. http://dx.doi.org/10.1109/acssc.2001.987775.

2

Malik, Mohammad, Mohammad Hakeem, Imran Ghazi, and Ata-ul-basit Hassan. "Recursive Least Squares Spectrum Estimation." In 2006 IEEE International Symposium on Industrial Electronics. IEEE, 2006. http://dx.doi.org/10.1109/isie.2006.295527.

3

Geist, Matthieu, and Olivier Pietquin. "Statistically linearized recursive least squares." In 2010 IEEE International Workshop on Machine Learning for Signal Processing (MLSP). IEEE, 2010. http://dx.doi.org/10.1109/mlsp.2010.5589236.

4

Chansarkar, M. M., and U. B. Desai. "A robust recursive least squares algorithm." In Proceedings of ICASSP '93. IEEE, 1993. http://dx.doi.org/10.1109/icassp.1993.319527.

5

Van Vaerenbergh, Steven, Ignacio Santamaria, Weifeng Liu, and Jose C. Principe. "Fixed-budget kernel recursive least-squares." In 2010 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2010. IEEE, 2010. http://dx.doi.org/10.1109/icassp.2010.5495350.

6

Dowling, Eric M., and Ronald D. DeGroat. "Recursive total-least-squares adaptive filtering." In San Diego, '91, San Diego, CA, edited by Simon Haykin. SPIE, 1991. http://dx.doi.org/10.1117/12.49762.

7

Yang, Bin. "Recursive least-squares-based subspace tracking." In SPIE's 1994 International Symposium on Optics, Imaging, and Instrumentation, edited by Franklin T. Luk. SPIE, 1994. http://dx.doi.org/10.1117/12.190847.

8

Tjell, Katrine, Ignacio Cascudo, and Rafael Wisniewski. "Privacy Preserving Recursive Least Squares Solutions." In 2019 18th European Control Conference (ECC). IEEE, 2019. http://dx.doi.org/10.23919/ecc.2019.8796169.

9

Bruce, Adam L., Ankit Goel, and Dennis S. Bernstein. "Recursive Least Squares with Matrix Forgetting." In 2020 American Control Conference (ACC). IEEE, 2020. http://dx.doi.org/10.23919/acc45564.2020.9148005.

10

Choi, Jae Won, Jeffrey Ludwig, and Andrew Singer. "Online Segmented Recursive Least Squares (OSRLS)." In 2021 55th Asilomar Conference on Signals, Systems, and Computers. IEEE, 2021. http://dx.doi.org/10.1109/ieeeconf53345.2021.9723217.


Reports on the topic "Recursive Least Squares"

1

Cioffi, J. M., and T. Kailath. An Efficient, RLS (Recursive-Least-Squares) Data-Driven Echo Canceller for Fast Initialization of Full-Duplex Data Transmission. Fort Belvoir, VA: Defense Technical Information Center, June 1985. http://dx.doi.org/10.21236/ada160177.
