Automated Blade Type Discrimination Algorithm for Figure Skating Based on MediaPipe

  • Dan Chen
  • Cong Han
  • Xiangru Zeng
  • Jinhui Pang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Figure skating is a highly technical sport that demands precise control of takeoff timing, blade edge, and force application, with the blade edge playing a particularly critical role in both performance quality and competitive success. We present a novel automated Blade-type Discrimination Algorithm (BDA) that leverages MediaPipe to significantly improve the accuracy of blade-edge identification. By defining four boundary angle thresholds for blade transitions, BDA precisely identifies blade types (inside edge, outside edge, and vague edge) while delivering real-time discrimination. Because accurate identification of the blade type at the moment of takeoff is crucial to scoring fairness, we also provide a method to determine the takeoff moment and extract the relevant frames from the video; BDA then determines the blade type from these frames. To validate the algorithm's effectiveness, we collect 437 competition videos and extract from them 858 high-quality jump segments to construct a robust dataset of figure skating jumps. Evaluation results show that BDA attains an overall accuracy of 79.09% in blade-type discrimination, demonstrating its precision and reliability for real-time use.
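The abstract describes classifying the blade edge by comparing an angle derived from pose keypoints against four boundary thresholds. A minimal sketch of that idea is shown below; the threshold values, the keypoint pair, and the function names are all hypothetical assumptions, not taken from the paper, which does not publish its thresholds in this abstract.

```python
import math

# Hypothetical boundary angle thresholds in degrees. The paper defines
# four such thresholds for blade transitions; these values are illustrative.
INSIDE_MIN, INSIDE_MAX = 5.0, 45.0
OUTSIDE_MIN, OUTSIDE_MAX = -45.0, -5.0

def blade_tilt_angle(heel, foot_index):
    """Signed tilt (degrees) of the blade line from heel to toe,
    computed from the (x, y) image coordinates of two foot keypoints
    such as MediaPipe's HEEL and FOOT_INDEX landmarks."""
    dx = foot_index[0] - heel[0]
    dy = foot_index[1] - heel[1]
    return math.degrees(math.atan2(dy, dx))

def classify_edge(angle):
    """Map a tilt angle to a blade type using the threshold bands;
    angles outside both bands fall into the 'vague edge' category."""
    if INSIDE_MIN <= angle <= INSIDE_MAX:
        return "inside edge"
    if OUTSIDE_MIN <= angle <= OUTSIDE_MAX:
        return "outside edge"
    return "vague edge"

# Example: a blade line tilting downward-right in image coordinates.
angle = blade_tilt_angle((0.0, 0.0), (1.0, 0.3))
print(classify_edge(angle))
```

In practice this classification would run only on the frames extracted at the detected takeoff moment, as the abstract describes.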

Original language: English
Pages (from-to): 591-617
Number of pages: 27
Journal: Data Intelligence
Volume: 7
Issue number: 3
DOIs
Publication status: Published - 1 Sept 2025
Externally published: Yes

Keywords

  • 3D Human keypoints
  • Action recognition
  • Blade discrimination
  • Figure skating
  • MediaPipe
