TY - GEN
T1 - Deep Detector and Optical Flow-based Tracking Approach of Facial Markers for Animation Capture
AU - Tian, Zeyu
AU - Weng, Dongdong
AU - Fang, Hui
AU - Bao, Yihua
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Marker-based facial motion capture methods are still commonly used in the film and game industries. However, these methods can lose track of target markers under conditions such as occlusion and blur, requiring extensive manual revision. There is therefore a need for a more robust marker tracking method that can accurately and stably track markers over longer periods, thereby simplifying manual operations. In this paper, we present a new facial marker tracking system that focuses on improving the accuracy and stability of performance capture. Our system integrates a robust optical flow tracking method with the proposed Marker-DETR detector to achieve a combined analysis of the marker tracking process. To evaluate the performance of our system, we collected a real-world dataset recording the performances of volunteer actors, with ground-truth labels provided by artists. Our experimental results demonstrate that our approach outperforms state-of-the-art trackers such as SiamDW and ECO in terms of RMSE (root mean square error) and AUC (area under the curve), confirming its improved accuracy and stability.
AB - Marker-based facial motion capture methods are still commonly used in the film and game industries. However, these methods can lose track of target markers under conditions such as occlusion and blur, requiring extensive manual revision. There is therefore a need for a more robust marker tracking method that can accurately and stably track markers over longer periods, thereby simplifying manual operations. In this paper, we present a new facial marker tracking system that focuses on improving the accuracy and stability of performance capture. Our system integrates a robust optical flow tracking method with the proposed Marker-DETR detector to achieve a combined analysis of the marker tracking process. To evaluate the performance of our system, we collected a real-world dataset recording the performances of volunteer actors, with ground-truth labels provided by artists. Our experimental results demonstrate that our approach outperforms state-of-the-art trackers such as SiamDW and ECO in terms of RMSE (root mean square error) and AUC (area under the curve), confirming its improved accuracy and stability.
KW - Computing methodologies - Artificial intelligence - Computer vision - Interest point and salient region detections
KW - Computing methodologies - Computer graphics - Animation - Motion capture
UR - http://www.scopus.com/inward/record.url?scp=85180363720&partnerID=8YFLogxK
U2 - 10.1109/ISMAR-Adjunct60411.2023.00134
DO - 10.1109/ISMAR-Adjunct60411.2023.00134
M3 - Conference contribution
AN - SCOPUS:85180363720
T3 - Proceedings - 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2023
SP - 625
EP - 630
BT - Proceedings - 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2023
A2 - Bruder, Gerd
A2 - Olivier, Anne-Helene
A2 - Cunningham, Andrew
A2 - Peng, Evan Yifan
A2 - Grubert, Jens
A2 - Williams, Ian
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct, ISMAR-Adjunct 2023
Y2 - 16 October 2023 through 20 October 2023
ER -