Blind image blur metric based on orientation-aware local patterns

Lixiong Liu, Jiachao Gong, Hua Huang*, Qingbing Sang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

We develop an effective blind image blur assessment model based on a novel orientation-aware local pattern operator. The metric first introduces an orientation-aware local pattern operator that accounts for the anisotropy of the orientation selectivity mechanism and the effect of gradient orientation on visual perception. Our results indicate that the proposed descriptor is sensitive to image distortion and effectively represents orientation information, so we use it to extract image structure information. To enhance the features' ability to represent blurred images, we extract edge information with a Toggle operator and use it to weight the local patterns, optimizing the computed structural statistical features. Finally, a support vector regression method is used to train a predictive model on the optimized features and subjective scores. Experimental results on six public databases show that the proposed model outperforms state-of-the-art image blur assessment models.
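
For readers who want a concrete picture of the pipeline, the sketch below illustrates the general flow described in the abstract using standard Python tooling (NumPy, SciPy, scikit-learn). It is a minimal illustration, not the authors' implementation: the descriptor construction, window sizes, quantization levels, and SVR hyperparameters are assumptions made for the example.

```python
# Hypothetical sketch of the blur-metric pipeline described in the abstract:
# orientation-based local pattern codes, Toggle-operator edge weighting,
# and SVR regression onto subjective scores. All parameter choices below
# are illustrative assumptions.
import numpy as np
from scipy import ndimage
from sklearn.svm import SVR

def orientation_codes(img, bins=8):
    """Quantize local gradient orientations into discrete codes
    (a stand-in for the paper's orientation-aware local pattern operator)."""
    gy, gx = np.gradient(img.astype(np.float64))
    orient = np.arctan2(gy, gx)                              # in [-pi, pi]
    return np.floor((orient + np.pi) / (2 * np.pi) * bins).astype(int) % bins

def toggle_edge_weight(img, size=3):
    """Toggle-operator style edge strength: grayscale dilation minus erosion
    over a local window, used here as a per-pixel weight."""
    dil = ndimage.grey_dilation(img, size=(size, size))
    ero = ndimage.grey_erosion(img, size=(size, size))
    return (dil - ero).astype(np.float64)

def blur_features(img, bins=8):
    """Edge-weighted histogram of orientation pattern codes."""
    codes = orientation_codes(img, bins)
    weights = toggle_edge_weight(img)
    hist = np.bincount(codes.ravel(), weights=weights.ravel(), minlength=bins)
    return hist / (hist.sum() + 1e-12)

def train_metric(images, mos_scores):
    """Fit a support vector regressor mapping features to subjective scores."""
    X = np.vstack([blur_features(im) for im in images])
    model = SVR(kernel='rbf', C=10.0, gamma='scale')
    model.fit(X, mos_scores)
    return model
```

A trained model of this kind predicts a quality score for an unseen image via model.predict(blur_features(img)[None, :]); in the paper the features are richer structural statistics, but the weighting-then-regression structure is the same.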

Original language: English
Article number: 115654
Journal: Signal Processing: Image Communication
Volume: 80
DOIs
Publication status: Published - Feb 2020

Keywords

  • Blind blur metric
  • Orientation selectivity
  • Orientation-aware local pattern
  • Toggle operator
