Event-Assisted Object Tracking on High-Speed Drones in Harsh Illumination Environment

Yuqi Han, Xiaohang Yu, Heng Luan, Jinli Suo*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Drones are used in a variety of scenarios, such as atmospheric monitoring, fire rescue, and agricultural irrigation, in which accurate environmental perception is crucial for both decision making and control. Among drone sensors, the RGB camera is indispensable for capturing the rich visual information needed for vehicle navigation, but it faces a serious challenge in high-dynamic-range scenes, which occur frequently in real applications. Specifically, the recorded frames suffer from underexposure and overexposure simultaneously, which degrades subsequent vision tasks. To address this problem, we take object tracking as an example and leverage the superior response of event cameras over a large intensity range to propose an event-assisted object tracking algorithm that achieves reliable tracking under large intensity variations. Specifically, we pursue feature matching from dense event signals and, on this basis, (i) design a U-Net-based image enhancement algorithm that balances RGB intensity with the help of temporally neighboring frames and (ii) construct a dual-input tracking model that tracks moving objects from the intensity-balanced RGB video and event sequences. The proposed approach is comprehensively validated in both simulated and real experiments.
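The abstract outlines a two-stage pipeline: a U-Net that rebalances frame exposure using neighboring frames and events, followed by a two-branch tracker fed by the enhanced RGB and the event stream. The PyTorch sketch below illustrates that structure only; the event-voxel representation, channel counts, module layouts, and all names (events_to_voxel, EnhanceUNet, DualInputTracker) are illustrative assumptions, not the authors' released implementation.

    # Minimal sketch of the two-stage pipeline described in the abstract.
    # Everything here (shapes, bins, channel widths) is an assumption.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def events_to_voxel(events, bins, height, width):
        """Accumulate (t, x, y, polarity) events into a bins-channel voxel grid.

        `events` is an (N, 4) float tensor with timestamps normalized to [0, 1).
        """
        voxel = torch.zeros(bins, height, width)
        t, x, y, p = events[:, 0], events[:, 1].long(), events[:, 2].long(), events[:, 3]
        b = (t * bins).clamp(max=bins - 1).long()   # temporal bin of each event
        voxel.index_put_((b, y, x), p, accumulate=True)
        return voxel

    class ConvBlock(nn.Module):
        def __init__(self, cin, cout):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True))
        def forward(self, x):
            return self.net(x)

    class EnhanceUNet(nn.Module):
        """U-Net that balances the exposure of the current RGB frame,
        conditioned on two temporally neighboring frames and the event voxel."""
        def __init__(self, event_bins=5):
            super().__init__()
            cin = 3 * 3 + event_bins               # current + 2 neighbors + events
            self.enc1 = ConvBlock(cin, 32)
            self.enc2 = ConvBlock(32, 64)
            self.pool = nn.MaxPool2d(2)
            self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
            self.dec1 = ConvBlock(64 + 32, 32)     # skip connection from enc1
            self.out = nn.Conv2d(32, 3, 1)
        def forward(self, frame, neighbors, voxel):
            x = torch.cat([frame, neighbors, voxel], dim=1)
            e1 = self.enc1(x)
            e2 = self.enc2(self.pool(e1))
            d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
            return torch.sigmoid(self.out(d1))     # intensity-balanced frame

    class DualInputTracker(nn.Module):
        """Two-branch tracker: one encoder per modality, features fused
        and cross-correlated with a target template to localize the object."""
        def __init__(self, event_bins=5, feat=64):
            super().__init__()
            self.rgb_enc = ConvBlock(3, feat)
            self.evt_enc = ConvBlock(event_bins, feat)
            self.fuse = nn.Conv2d(2 * feat, feat, 1)
        def forward(self, search_rgb, search_voxel, template):
            f = self.fuse(torch.cat([self.rgb_enc(search_rgb),
                                     self.evt_enc(search_voxel)], dim=1))
            # Template acts as a correlation kernel; the response peak
            # gives the target location in the search region.
            return F.conv2d(f, template)

    # Example: enhance one 256x256 frame, then score a 7x7 template.
    frame = torch.rand(1, 3, 256, 256)
    neighbors = torch.rand(1, 6, 256, 256)       # two stacked RGB neighbors
    voxel = torch.rand(1, 5, 256, 256)
    balanced = EnhanceUNet()(frame, neighbors, voxel)
    response = DualInputTracker()(balanced, voxel, torch.rand(1, 64, 7, 7))

Correlating fused search features against a template follows the common Siamese-tracking pattern; the paper's actual fusion strategy and localization head may differ from this sketch.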

Original language: English
Article number: 22
Journal: Drones
Volume: 8
Issue number: 1
DOIs
Publication status: Published - Jan 2024
Externally published: Yes

Keywords

  • drones
  • event-assisted object tracking
  • harsh illumination
  • image enhancement
  • multi-sensor fusion
