Recursive Least Squares (RLS)

Recursive Least Squares (RLS) is an adaptive filtering algorithm used to estimate the parameters of a linear model from input-output data. It is particularly useful when the system is non-stationary, i.e., its characteristics change over time. RLS belongs to the family of recursive estimation algorithms, which means it updates its estimates incrementally as new data becomes available.

The RLS algorithm is based on the method of least squares: at each step it minimizes a (typically exponentially weighted) sum of squared errors between the predicted and observed outputs. It can be used to estimate the parameters of a linear model of the form:

y(t) = θ^T(t) * x(t)

where y(t) is the observed output at time t, x(t) is the input vector at time t, and θ(t) is the parameter vector to be estimated at time t.
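
To make the notation concrete, here is a minimal sketch in Python with NumPy; the parameter values and the input vector are made up purely for illustration:

    import numpy as np

    theta = np.array([2.0, -1.0])   # hypothetical parameter vector θ
    x_t = np.array([3.0, 4.0])      # input vector x(t)

    y_t = theta @ x_t               # y(t) = θ^T x(t) = 2*3 + (-1)*4 = 2.0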

The RLS algorithm maintains an estimate of the parameter vector, denoted as θ^(t), which is continuously updated as new data arrives. The key idea behind RLS is to update the estimate recursively, rather than re-estimating the parameters from scratch each time new data is available.

The RLS algorithm consists of the following steps (a runnable sketch combining them follows the list):

Initialization:

  • Initialize the parameter estimate: θ^(0) = 0 (or any other appropriate initial value).
  • Initialize the covariance matrix: P(0) = δ^(-1)I, where δ is a small positive regularization constant (so that P(0) is large, reflecting low confidence in the initial estimate) and I is the identity matrix.

For each time step t:

  • Compute the gain vector (often called the Kalman gain), K(t), given by: K(t) = P(t-1) * x(t) / (λ + x(t)^T * P(t-1) * x(t)), where λ ∈ (0, 1] is a constant called the forgetting factor, typically chosen close to one (e.g., 0.95 to 1).
  • Compute the prediction error, e(t), given by: e(t) = y(t) - x(t)^T * θ^(t-1)
  • Update the parameter estimate, θ^(t), as follows: θ^(t) = θ^(t-1) + K(t) * e(t)
  • Update the covariance matrix, P(t), as follows: P(t) = (1/λ) * [P(t-1) - K(t) * x(t)^T * P(t-1)]
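
Putting these steps together, the following is a minimal sketch of the full recursion in Python with NumPy. The class name RLS, the default values for λ and δ, and the synthetic data in the demo are illustrative assumptions, not a reference implementation:

    import numpy as np

    class RLS:
        def __init__(self, n, lam=0.99, delta=1e-3):
            self.lam = lam              # forgetting factor λ, typically close to 1
            self.theta = np.zeros(n)    # initial parameter estimate θ^(0) = 0
            self.P = np.eye(n) / delta  # initial covariance P(0) = δ^(-1) I

        def update(self, x, y):
            # Gain: K(t) = P(t-1) x(t) / (λ + x(t)^T P(t-1) x(t))
            Px = self.P @ x
            K = Px / (self.lam + x @ Px)
            # Prediction error: e(t) = y(t) - x(t)^T θ^(t-1)
            e = y - x @ self.theta
            # Parameter update: θ^(t) = θ^(t-1) + K(t) e(t)
            self.theta = self.theta + K * e
            # Covariance update: P(t) = (1/λ) [P(t-1) - K(t) x(t)^T P(t-1)]
            self.P = (self.P - np.outer(K, Px)) / self.lam
            return e

    # Demo: track the parameters of a noisy linear system.
    rng = np.random.default_rng(0)
    true_theta = np.array([1.5, -0.7])
    rls = RLS(n=2)
    for _ in range(500):
        x = rng.standard_normal(2)
        y = true_theta @ x + 0.01 * rng.standard_normal()
        rls.update(x, y)
    print(rls.theta)  # should be close to [1.5, -0.7]

Note that the sketch computes x(t)^T * P(t-1) only once, as the transpose of P(t-1) * x(t), which is valid because P(t-1) remains symmetric in exact arithmetic.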

The forgetting factor, λ, controls how quickly past data is discounted. A smaller value of λ discounts old samples faster, giving more weight to recent data and making the estimate more adaptive to changes in the system. A value of λ closer to one places more emphasis on past data, making the estimate more stable but less responsive to changes; λ = 1 corresponds to ordinary least squares, with all samples weighted equally.
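
A useful rule of thumb: since the exponential weights λ^k sum to 1/(1 - λ), the forgetting factor implies an effective data memory of roughly 1/(1 - λ) samples; for example, λ = 0.99 corresponds to a window of about 100 samples. Using the hypothetical RLS class sketched above:

    rls_fast = RLS(n=2, lam=0.95)   # memory ≈ 1/(1-0.95) = 20 samples: adapts quickly, noisier
    rls_slow = RLS(n=2, lam=0.999)  # memory ≈ 1000 samples: smoother, slower to track changes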

The RLS algorithm has several advantages. It provides a computationally efficient way to estimate parameters in a linear model, and it can track time-varying systems effectively. Additionally, RLS allows for online learning, meaning it can adapt and update estimates in real-time as new data becomes available.

However, RLS also has some limitations. It can be sensitive to noise in the data, and in finite-precision arithmetic the covariance matrix P(t) can lose symmetry or positive definiteness, which may lead to numerical instability or divergence; a forgetting factor chosen too small can likewise make the estimate erratic, particularly when the input is not sufficiently exciting. The choice of the forgetting factor therefore depends on the specific application and the characteristics of the system being modeled.

Overall, Recursive Least Squares is a powerful algorithm for adaptive filtering and parameter estimation, making it widely used in various fields, including signal processing, control systems, and machine learning.