Local Kernel Ridge Regression for Scalable, Interpolating, Continuous Regression

Mingxuan Han, Chenglong Ye, Jeff M. Phillips

Research output: Contribution to journal › Article › peer-review
Abstract

We study a localized version of kernel ridge regression that continuously and smoothly interpolates highly non-linear underlying function values through the observed data points. This new method can handle data in which (a) the local density is highly uneven and (b) the function values change dramatically in certain small but unknown regions. By introducing a new rank-based interpolation scheme, which can be interpreted as a variable-bandwidth Nadaraya-Watson kernel regression, the interpolated values produced by our local method provably vary continuously with the query points. Our method is scalable because it avoids the full matrix inverse required by traditional kernel ridge regression.
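
The scalability claim rests on a simple mechanism: fit kernel ridge regression only on a small neighborhood of each query point, so that a k x k linear system is solved instead of inverting the full n x n kernel matrix. The sketch below illustrates that generic local-KRR idea under assumed names and hyperparameters (the function `local_krr_predict` and the parameters `k`, `lam`, `gamma` are hypothetical); it does not reproduce the paper's rank-based interpolation scheme or its continuity guarantee.

```python
import numpy as np

def local_krr_predict(X, y, x_query, k=50, lam=1e-3, gamma=1.0):
    """Predict at x_query using KRR fit only on its k nearest neighbors.

    Hypothetical illustration of local KRR, not the paper's exact method.
    """
    k = min(k, len(X))
    # Select the k training points closest to the query.
    d2 = np.sum((X - x_query) ** 2, axis=1)
    idx = np.argsort(d2)[:k]
    Xk, yk = X[idx], y[idx]

    # Gaussian kernel matrix on the local neighborhood only (k x k).
    D2 = np.sum((Xk[:, None, :] - Xk[None, :, :]) ** 2, axis=2)
    K = np.exp(-gamma * D2)

    # Solve the small regularized local system instead of inverting the
    # full n x n kernel matrix; this is the source of the scalability gain.
    alpha = np.linalg.solve(K + lam * np.eye(k), yk)

    # Kernel evaluations between the query and the local points.
    kq = np.exp(-gamma * np.sum((Xk - x_query) ** 2, axis=1))
    return kq @ alpha

# Toy usage on noisy 1-D data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(500)
print(local_krr_predict(X, y, np.array([0.5]), k=40))
```

Note that a hard k-nearest-neighbor cutoff like this can produce small jumps in the prediction as the neighbor set changes with the query; the rank-based, variable-bandwidth weighting described in the abstract is what restores continuity in the authors' method.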

Original language: English
Journal: Transactions on Machine Learning Research
Volume: 2022-September
State: Published - Oct 1 2022

Bibliographical note

Publisher Copyright:
© 2022, Transactions on Machine Learning Research. All rights reserved.

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
