Simpler Gradient Methods for Blind Super-Resolution With Lower Iteration Complexity

Jinsheng Li, Wei Cui, Xu Zhang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We study the problem of blind super-resolution, which can be formulated as a low-rank matrix recovery problem via the vectorized Hankel lift (VHL). The previous gradient descent method based on VHL, named PGD-VHL, relies on additional regularization, namely an incoherence projection and a balancing penalty, and exhibits suboptimal iteration complexity. In this paper, we propose a simpler unconstrained optimization problem without these two types of regularization and develop two new, provable gradient methods named VGD-VHL and ScalGD-VHL. We provide a novel and sharp analysis of the theoretical guarantees of our algorithms, which demonstrates that our methods achieve lower iteration complexity than PGD-VHL. Moreover, ScalGD-VHL attains the lowest iteration complexity, independent of the condition number. Our analysis further reveals that the blind super-resolution problem is less demanding of incoherence than previously thought, eliminating the need for incoherence projections to achieve linear convergence. Empirical results illustrate that our methods offer superior computational efficiency while achieving recovery performance comparable to prior art.
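The abstract contrasts vanilla gradient descent with a scaled (preconditioned) variant whose convergence rate is independent of the condition number. The following is a minimal sketch of the scaled-gradient-descent idea on a generic low-rank factorization objective f(L, R) = ½‖LRᵀ − M‖²_F, not the paper's actual VHL objective; the function name, step size, and initialization are illustrative assumptions.

```python
import numpy as np

def scaled_gd(M, r, eta=0.5, iters=200, perturb=1e-2, seed=0):
    """Sketch of scaled gradient descent for low-rank factorization.

    Minimizes f(L, R) = 0.5 * ||L @ R.T - M||_F^2 without any balancing
    regularization; the right-preconditioners (R.T R)^{-1} and (L.T L)^{-1}
    are what make the local convergence rate condition-number independent.
    This is a generic illustration, not the VHL objective from the paper.
    """
    rng = np.random.default_rng(seed)
    # Spectral initialization (truncated SVD), slightly perturbed so the
    # iteration has something to correct.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    L = U[:, :r] * np.sqrt(s[:r]) + perturb * rng.standard_normal((M.shape[0], r))
    R = Vt[:r].T * np.sqrt(s[:r]) + perturb * rng.standard_normal((M.shape[1], r))
    for _ in range(iters):
        G = L @ R.T - M                   # residual
        gL = G @ R                        # gradient w.r.t. L
        gR = G.T @ L                      # gradient w.r.t. R
        # Preconditioned updates (both use the previous iterates).
        L_new = L - eta * gL @ np.linalg.inv(R.T @ R)
        R_new = R - eta * gR @ np.linalg.inv(L.T @ L)
        L, R = L_new, R_new
    return L, R
```

A vanilla-gradient-descent variant (as in VGD-VHL's unconstrained spirit) would simply drop the two inverse factors, at the cost of a rate that degrades with the condition number of the ground-truth matrix.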

Original language: English
Pages (from-to): 5123-5139
Number of pages: 17
Journal: IEEE Transactions on Signal Processing
Volume: 72
DOIs
Publication status: Published - 2024

Keywords

  • Blind super-resolution
  • low-rank matrix factorization
  • scaled gradient descent
  • vanilla gradient descent

