Weighted Group Sparse Bayesian Learning for Human Activity Classification

Yingxia Fan, Juan Zhao*, Xia Bai

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recently, many researchers have focused on human activity recognition based on micro-Doppler signatures. In this paper, we propose a sparse representation classification approach based on weighted group sparse Bayesian learning (SRC-WGSBL) for human activity classification, which exploits group sparsity to separate the sparse coefficients of different classes. In addition, using a Bayesian model for sparse coding helps achieve robust classification performance in practice. Extensive experiments on a public database compare the proposed approach with a support vector machine (SVM) and sparse representation classification based on orthogonal matching pursuit (SRC-OMP). The experimental results demonstrate that the proposed approach is effective and achieves better performance than both baselines.
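To make the classification rule described in the abstract concrete, the sketch below shows the generic sparse-representation-classification (SRC) decision step: a test sample is coded over a dictionary whose columns are grouped by activity class, and the class whose group reconstructs the sample with the smallest residual is chosen. This is only an illustrative sketch; the per-group coding here is a simple block-wise ridge stand-in, not the weighted group sparse Bayesian learning (WGSBL) solver proposed in the paper, and all function names, data shapes, and the regularization parameter are hypothetical.

```python
# Illustrative SRC decision rule: classify by minimum per-class reconstruction
# residual. The coding step is a block-wise ridge stand-in, NOT the paper's
# weighted group sparse Bayesian learning solver.
import numpy as np


def src_classify(D, groups, y, lam=1e-2):
    """Assign y to the class whose dictionary columns reconstruct it best.

    D      : (m, n) dictionary; columns are training feature vectors.
    groups : (n,) integer class label of each dictionary column.
    y      : (m,) test feature vector.
    lam    : ridge regularizer for per-group coding (hypothetical choice).
    """
    residuals = {}
    for c in np.unique(groups):
        Dc = D[:, groups == c]  # dictionary atoms belonging to class c
        # Regularized least squares within the group:
        #   x_c = argmin ||y - Dc x||^2 + lam ||x||^2
        xc = np.linalg.solve(Dc.T @ Dc + lam * np.eye(Dc.shape[1]), Dc.T @ y)
        residuals[c] = np.linalg.norm(y - Dc @ xc)  # class-c reconstruction error
    return min(residuals, key=residuals.get)  # smallest residual wins


# Toy usage with synthetic features (stand-ins for micro-Doppler features).
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 30))
groups = np.repeat([0, 1, 2], 10)             # three activity classes
y = D[:, 5] + 0.05 * rng.standard_normal(64)  # noisy copy of a class-0 atom
print(src_classify(D, groups, y))             # expected output: 0
```

In the paper's approach, the coding step above would instead be performed jointly with a weighted group-sparse prior under a Bayesian model, which encourages the nonzero coefficients to concentrate in the group belonging to the correct class.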

Original language: English
Title of host publication: 2021 CIE International Conference on Radar, Radar 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1550-1555
Number of pages: 6
ISBN (Electronic): 9781665498142
DOIs
Publication status: Published - 2021
Event: 2021 CIE International Conference on Radar, Radar 2021 - Haikou, Hainan, China
Duration: 15 Dec 2021 - 19 Dec 2021

Publication series

Name: Proceedings of the IEEE Radar Conference
Volume: 2021-December
ISSN (Print): 1097-5764
ISSN (Electronic): 2375-5318

Conference

Conference: 2021 CIE International Conference on Radar, Radar 2021
Country/Territory: China
City: Haikou, Hainan
Period: 15/12/21 - 19/12/21

Keywords

  • group sparsity
  • human activity classification
  • sparse Bayesian learning
  • sparse representation
