Sliding window recursive quadratic optimization with variable regularization

Jesse B. Hoagg, Asad A. Ali, Magnus Mossberg, Dennis S. Bernstein

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

In this paper, we present a sliding-window variable-regularization recursive least squares algorithm. In contrast to standard recursive least squares, the algorithm operates on a finite window of data, where old data are discarded as new data become available. This property can be beneficial for estimating time-varying parameters. Furthermore, standard recursive least squares uses time-invariant regularization: the inverse of the initial covariance matrix can be viewed as a regularization term that weights the difference between the next estimate and the initial estimate, and this regularization is fixed for all steps of the recursion. The algorithm derived in this paper allows for time-varying regularization, both in the regularization weighting and in the quantity being weighted. Specifically, the regularization term can weight the difference between the next estimate and a time-varying vector of parameters rather than the initial estimate.
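
As a rough illustration of the cost function the abstract describes, the sketch below solves each windowed regularized least-squares problem directly rather than via the paper's recursive update. The window length, the regularization weight R_k, and the choice of the previous estimate as the time-varying target alpha_k are illustrative assumptions, not the paper's specific construction.

```python
import numpy as np
from collections import deque

def sliding_window_regularized_ls(window, R_k, alpha_k):
    """Minimize sum over the window of (y - phi @ theta)**2
    plus (theta - alpha_k) @ R_k @ (theta - alpha_k), solved in closed form."""
    A = R_k.copy()
    b = R_k @ alpha_k
    for phi, y in window:
        A += np.outer(phi, phi)
        b += y * phi
    return np.linalg.solve(A, b)

# Illustrative usage: a window of the 5 most recent (regressor, measurement)
# pairs, with the regularization pulling the estimate toward the previous
# estimate instead of the initial one (a hypothetical choice of alpha_k).
rng = np.random.default_rng(0)
theta_true = np.array([1.0, -2.0])
window = deque(maxlen=5)
theta_est = np.zeros(2)
for k in range(50):
    phi = rng.standard_normal(2)
    y = phi @ theta_true + 0.1 * rng.standard_normal()
    window.append((phi, y))
    R_k = 0.5 * np.eye(2)   # regularization weight; could vary with k
    alpha_k = theta_est     # time-varying regularization target (assumed here)
    theta_est = sliding_window_regularized_ls(window, R_k, alpha_k)
print(theta_est)
```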

Original language: English
Title of host publication: Proceedings of the 2011 American Control Conference, ACC 2011
Pages: 3275-3280
Number of pages: 6
DOIs
State: Published - 2011

Publication series

Name: Proceedings of the American Control Conference
ISSN (Print): 0743-1619

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
