Steady-state mean square performance of a sparsified kernel least mean square algorithm

Badong Chen*, Zhengda Qin, Lei Sun

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution (peer-reviewed)

2 Citations (Scopus)

Abstract

In this paper, we investigate the convergence performance of a sparsified kernel least mean square (KLMS) algorithm in which an input is added to the dictionary only when the amplitude of the prediction error exceeds a preset threshold. Under certain conditions, we derive an approximate value of the steady-state excess mean square error (EMSE). Simulation results confirm the theoretical predictions and yield some interesting findings: sparsification not only constrains the network size (and hence reduces the computational burden) but can also improve the steady-state performance in some cases.
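
To make the sparsification rule described in the abstract concrete, below is a minimal, hedged sketch of a KLMS filter that adds an input to the dictionary only when the prediction-error magnitude exceeds a threshold. The Gaussian kernel, the step size eta, the threshold delta, and the class name SparsifiedKLMS are illustrative assumptions, not details taken from the paper; the paper's exact update and the EMSE analysis are not reproduced here.

```python
import numpy as np

# Sketch of a KLMS filter with prediction-error-based sparsification.
# Assumptions (not from the paper text): Gaussian kernel, step size eta,
# threshold delta; when |error| <= delta the sample is simply skipped.

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    diff = np.asarray(x) - np.asarray(y)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

class SparsifiedKLMS:
    def __init__(self, eta=0.5, delta=0.1, sigma=1.0):
        self.eta = eta        # step size (learning rate)
        self.delta = delta    # sparsification threshold on |error|
        self.sigma = sigma    # kernel bandwidth
        self.centers = []     # dictionary of stored inputs
        self.alphas = []      # corresponding coefficients

    def predict(self, u):
        """Filter output: weighted sum of kernels over the dictionary."""
        return sum(a * gaussian_kernel(u, c, self.sigma)
                   for a, c in zip(self.alphas, self.centers))

    def update(self, u, d):
        """One online step: grow the dictionary only if |error| > delta."""
        e = d - self.predict(u)
        if abs(e) > self.delta:
            self.centers.append(np.asarray(u))
            self.alphas.append(self.eta * e)
        return e

# Toy usage: learn a static nonlinearity from noisy samples and observe
# that the dictionary stays much smaller than the number of samples.
rng = np.random.default_rng(0)
flt = SparsifiedKLMS(eta=0.5, delta=0.05, sigma=0.5)
for _ in range(2000):
    u = rng.uniform(-1, 1, size=2)
    d = np.sin(3 * u[0]) * u[1] + 0.01 * rng.standard_normal()
    flt.update(u, d)
print("dictionary size:", len(flt.centers))
```

The threshold delta trades dictionary growth against accuracy: a larger delta keeps the network smaller, which is the behaviour whose steady-state effect the paper analyses.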

Original language: English
Title of host publication: 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2701-2705
Number of pages: 5
ISBN (Electronic): 9781509041176
DOIs
Publication status: Published - 16 Jun 2017
Event: 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - New Orleans, United States
Duration: 5 Mar 2017 - 9 Mar 2017

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ISSN (Print): 1520-6149

Conference

Conference: 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017
Country/Territory: United States
City: New Orleans
Period: 5/03/17 - 9/03/17

Keywords

  • KLMS
  • mean square performance
  • sparsification
