High-Precision Object Pose Estimation Using Visual-Tactile Information for Dynamic Interactions in Robotic Grasping

Zicai Peng, Te Cui, Guangyan Chen, Haoyang Lu, Yi Yang, Yufeng Yue*
*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In many robotic applications, accurate object pose estimation is essential for high-precision tasks such as factory assembly or everyday insertion. Tactile sensing, which complements visual information, offers rich texture- or force-based data for object pose estimation. However, previous pose estimation methods typically overlook dynamic situations, such as slippage of a grasped object or movement of a contacted object during interaction with the environment, which increases the complexity of pose estimation. To address these challenges, we propose an efficient method that fuses visual and tactile sensing to estimate object poses through particle filtering. We leverage visual information to track the pose of the contacted object in real time and estimate pose changes of the grasped object from displacement data obtained by tactile sensors. Experimental evaluation on 13 objects with diverse geometric shapes demonstrates high-precision pose estimation and shows that the robot copes robustly with dynamic scenes involving forced object motion, proving the framework's adaptability in practical scenarios with uncertainty.
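The abstract's core idea, a particle filter whose prediction step is driven by tactile displacement estimates and whose update step is driven by visual pose measurements, can be sketched minimally as below. This is not the authors' implementation: the 2D pose state (x, y, theta), the Gaussian noise models, and the `particle_filter_step` function are illustrative assumptions, simplified from the general visual-tactile fusion the paper describes.

```python
import numpy as np

def particle_filter_step(particles, weights, displacement, visual_obs,
                         motion_noise=0.01, obs_noise=0.05):
    """One predict-update-resample cycle over 2D poses (x, y, theta).

    particles: (N, 3) pose hypotheses; weights: (N,) normalized weights.
    displacement: (3,) pose change estimated from tactile sensing (assumption:
    the tactile sensor yields a per-step delta, as in slippage tracking).
    visual_obs: (3,) noisy pose measurement from the visual tracker.
    """
    rng = np.random.default_rng(0)
    n = len(particles)
    # Predict: propagate particles by the tactile displacement plus process noise.
    predicted = particles + displacement + rng.normal(0.0, motion_noise, particles.shape)
    # Update: reweight by a Gaussian likelihood of the visual observation.
    err = predicted - visual_obs
    log_lik = -0.5 * np.sum(err**2, axis=1) / obs_noise**2
    w = weights * np.exp(log_lik - log_lik.max())
    w /= w.sum()
    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(w**2) < n / 2:
        positions = (np.arange(n) + rng.random()) / n
        idx = np.minimum(np.searchsorted(np.cumsum(w), positions), n - 1)
        predicted, w = predicted[idx], np.full(n, 1.0 / n)
    # Point estimate: weighted mean of the particle set.
    estimate = np.average(predicted, axis=0, weights=w)
    return predicted, w, estimate
```

In a loop, each grasp or contact event would supply a fresh tactile displacement and visual observation, so the filter keeps tracking even when the object is forced to move mid-interaction.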

Original language: English
Title of host publication: 2025 IEEE International Conference on Robotics and Automation, ICRA 2025
Editors: Christian Ott, Henny Admoni, Sven Behnke, Stjepan Bogdan, Aude Bolopion, Youngjin Choi, Fanny Ficuciello, Nicholas Gans, Clement Gosselin, Kensuke Harada, Erdal Kayacan, H. Jin Kim, Stefan Leutenegger, Zhe Liu, Perla Maiolino, Lino Marques, Takamitsu Matsubara, Anastasia Mavromatti, Mark Minor, Jason O'Kane, Hae Won Park, Hae-Won Park, Ioannis Rekleitis, Federico Renda, Elisa Ricci, Laurel D. Riek, Lorenzo Sabattini, Shaojie Shen, Yu Sun, Pierre-Brice Wieber, Katsu Yamane, Jingjin Yu
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 14799-14805
Number of pages: 7
ISBN (Electronic): 9798331541392
DOIs
Publication status: Published - 2025
Externally published: Yes
Event: 2025 IEEE International Conference on Robotics and Automation, ICRA 2025 - Atlanta, United States
Duration: 19 May 2025 - 23 May 2025

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Conference

Conference: 2025 IEEE International Conference on Robotics and Automation, ICRA 2025
Country/Territory: United States
City: Atlanta
Period: 19/05/25 - 23/05/25
