
FFT-based exponentially weighted recursive least squares computations

Research output: Contribution to journal › Journal article › peer-review

Abstract

We consider exponentially weighted recursive least squares (RLS) computations with forgetting factor γ (0 < γ < 1). The least squares estimator can be found by solving a matrix system A(t)x(t) = b(t) at each adaptive time step t. Unlike the sliding window RLS computation, the matrix A(t) is not a “near-Toeplitz” matrix (a sum of products of Toeplitz matrices). However, we show that a scaled version of A(t) is a “near-Toeplitz” matrix, and hence the matrix-vector multiplication can be performed efficiently by using fast Fourier transforms (FFTs). We apply the FFT-based preconditioned conjugate gradient method to solve such systems. When the input stochastic process is stationary, we prove that both E[∥A(t) − E(A(t))∥₂] and Var[∥A(t) − E(A(t))∥₂] tend to zero, provided that the number of data samples taken is sufficiently large. Here E(·) and Var(·) denote the expectation and variance operators, respectively. Hence the expected values of the eigenvalues of the preconditioned matrices are close to 1, except for a finite number of outlying eigenvalues. This result is stronger than that proved by Ng, Chan, and Plemmons, namely that the spectra of the preconditioned matrices are clustered around 1 with probability 1.
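The FFT-based matrix-vector multiplication referred to in the abstract relies on the standard trick of embedding an n × n Toeplitz matrix in a 2n × 2n circulant matrix, which FFTs diagonalize. The following is a minimal sketch of that technique in NumPy; it is an illustration of the general method, not the authors' implementation, and the function name `toeplitz_matvec` is hypothetical.

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    """Multiply an n x n Toeplitz matrix T by a vector x in O(n log n).

    T is defined by its first column c and first row r (with r[0] == c[0]).
    T is embedded in a 2n x 2n circulant matrix, whose action is computed
    by pointwise multiplication in the Fourier domain.
    """
    n = len(x)
    # First column of the circulant embedding:
    # [c[0], ..., c[n-1], 0, r[n-1], ..., r[1]]  (length 2n)
    col = np.concatenate([c, [0.0], r[:0:-1]])
    # Zero-pad x to length 2n, multiply in the Fourier domain, transform back.
    y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(np.concatenate([x, np.zeros(n)])))
    # The first n entries of the circulant product give T @ x.
    return y[:n].real
```

A "near-Toeplitz" matrix (a sum of products of Toeplitz matrices) can then be applied to a vector by chaining such multiplications, which is what makes each conjugate gradient iteration cheap.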

Original language: English
Pages (from-to): 167-191
Number of pages: 25
Journal: Linear Algebra and Its Applications
Volume: 263
Issue number: 1-3
Publication status: Published - 15 Sept 1997

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 9 - Industry, Innovation, and Infrastructure

