Object manipulation of humanoid robot based on combined optimization approach

Altaf Hussain Rajpar*, Qiang Huang, Weimin Zhang, Dongyong Jia, Kejie Li

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

In this paper, an object manipulation approach is proposed in which the object location is detected through stereo vision, pre-reaching and alignment of the hand with the object are based on image features, and an inverse kinematics control algorithm is developed. A new method for computing a numerical solution to the inverse kinematics problem of a robot manipulator is presented. The proposed method is based on a combination of two nonlinear programming techniques: a forward recursion formula with backward cycle computation is used to reach the vicinity of the target object, and the FBS method is then used to grasp the object in real time. The proposed method is numerically stable and computationally efficient, and it is not sensitive to singular configurations of the manipulator. The effectiveness of the proposed method is demonstrated through simulation and experimental results.
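
The abstract does not spell out the forward-recursion or FBS computations. As a rough, non-authoritative illustration of the general kind of iterative numerical IK solver described (one that stays stable near singular configurations), the sketch below uses a standard damped least-squares update on a hypothetical planar 3-link arm; the link lengths, gains, and function names are assumptions for illustration, not the paper's method.

```python
# Minimal sketch of numerical inverse kinematics via damped least squares
# for a hypothetical planar 3-link arm. This is NOT the paper's
# forward-recursion/FBS method, whose details the abstract does not give;
# link lengths, damping, and tolerances are illustrative assumptions.
import numpy as np

LINK_LENGTHS = np.array([0.3, 0.25, 0.15])  # hypothetical link lengths (m)

def forward_kinematics(q):
    """End-effector (x, y) position for joint angles q (relative angles)."""
    angles = np.cumsum(q)  # absolute orientation of each link
    x = np.sum(LINK_LENGTHS * np.cos(angles))
    y = np.sum(LINK_LENGTHS * np.sin(angles))
    return np.array([x, y])

def jacobian(q):
    """2x3 position Jacobian of the planar arm."""
    angles = np.cumsum(q)
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        # Joint i rotates every link from i onward.
        J[0, i] = -np.sum(LINK_LENGTHS[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(LINK_LENGTHS[i:] * np.cos(angles[i:]))
    return J

def solve_ik(target, q0, damping=0.05, tol=1e-4, max_iter=200):
    """Iterate q <- q + J^T (J J^T + lambda^2 I)^-1 e until the task-space
    error e is below tol. The damping term lambda keeps the update
    well-conditioned near singular configurations."""
    q = np.array(q0, dtype=float)
    for _ in range(max_iter):
        e = target - forward_kinematics(q)
        if np.linalg.norm(e) < tol:
            break
        J = jacobian(q)
        JJt = J @ J.T + (damping ** 2) * np.eye(2)
        q += J.T @ np.linalg.solve(JJt, e)
    return q

if __name__ == "__main__":
    q = solve_ik(target=np.array([0.4, 0.3]), q0=[0.1, 0.1, 0.1])
    print("joint angles:", q, "reached:", forward_kinematics(q))
```

The damping factor trades convergence speed for robustness: with damping set to zero the update reduces to the plain pseudo-inverse step, which blows up at singular configurations, whereas a small positive value keeps the step bounded at the cost of slightly slower convergence.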

Original language: English
Title of host publication: Proceedings of the 2007 IEEE International Conference on Mechatronics and Automation, ICMA 2007
Pages: 1148-1153
Number of pages: 6
DOIs
Publication status: Published - 2007
Event: 2007 IEEE International Conference on Mechatronics and Automation, ICMA 2007 - Harbin, China
Duration: 5 Aug 2007 - 8 Aug 2007

Publication series

Name: Proceedings of the 2007 IEEE International Conference on Mechatronics and Automation, ICMA 2007

Conference

Conference: 2007 IEEE International Conference on Mechatronics and Automation, ICMA 2007
Country/Territory: China
City: Harbin
Period: 5/08/07 - 8/08/07

Keywords

  • Alignment
  • Grasp and inverse kinematics
  • Pre-reaching
  • Target detection
