Generating natural pedestrian crowds by learning real crowd trajectories through a transformer-based GAN

Dapeng Yan, Gangyi Ding, Kexiang Huang, Tianyu Huang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Traditional methods for constructing crowd simulations often fall short in realism, and data-driven methods are an effective way to enhance the visual realism of crowd simulation. However, existing work mainly builds crowd simulations either through deep-learning-based trajectory prediction or by fitting the parameters of traditional methods, which limits the expressiveness of the model. In response to these limitations, this paper introduces a method capable of generating realistic pedestrian crowds. The approach uses a Generative Adversarial Network, complemented by a transformer module, to learn behavioral patterns from real crowd trajectories. A transformer module extracts the trajectory features of the crowd; a dedicated data processing mechanism then converts the spatial relationships between individuals into sequences, from which the transformer module extracts the social features of the crowd, while the movement of each individual is guided by its target direction. During training, we learn simultaneously from real crowd data and from simulation data in which collisions are resolved by traditional methods, which strengthens the collision avoidance behavior of the virtual crowds while preserving the movement patterns of real crowds and yields more general collision avoidance behavior. The crowds generated by the model are not limited to specific scenarios and show generalization capabilities. After training on publicly available large-scale pedestrian datasets, our method outperforms other models. Our code is publicly available at https://github.com/ydp91/NPCGAN.
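To make the described architecture concrete, the following is a minimal sketch (not the authors' NPCGAN implementation) of a transformer-based trajectory generator along the lines the abstract describes: a transformer encoder extracts features from each agent's observed trajectory, the result is conditioned on a target direction and on GAN noise, and future displacements are decoded. All module names, dimensions, and the single-layer decoding head are illustrative assumptions.

```python
import torch
import torch.nn as nn


class TrajectoryGenerator(nn.Module):
    """Hypothetical generator: transformer over observed steps + goal/noise conditioning."""

    def __init__(self, d_model=64, n_heads=4, n_layers=2, pred_len=12, noise_dim=16):
        super().__init__()
        self.embed = nn.Linear(2, d_model)           # (dx, dy) per-step displacement -> feature
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.temporal = nn.TransformerEncoder(enc_layer, n_layers)
        self.goal_embed = nn.Linear(2, d_model)      # target-direction conditioning
        self.noise_embed = nn.Linear(noise_dim, d_model)  # GAN latent noise for diversity
        self.decode = nn.Linear(d_model, pred_len * 2)
        self.pred_len = pred_len

    def forward(self, obs_disp, goal_dir, z):
        # obs_disp: (B, obs_len, 2) observed per-step displacements
        # goal_dir: (B, 2) unit vector toward each agent's goal
        # z:        (B, noise_dim) Gaussian noise
        h = self.temporal(self.embed(obs_disp))           # (B, obs_len, d_model)
        h = h[:, -1] + self.goal_embed(goal_dir) + self.noise_embed(z)
        return self.decode(h).view(-1, self.pred_len, 2)  # (B, pred_len, 2) future displacements


# Example usage with random tensors
gen = TrajectoryGenerator()
fake = gen(torch.randn(32, 8, 2), torch.randn(32, 2), torch.randn(32, 16))
print(fake.shape)  # torch.Size([32, 12, 2])
```

In the full method, a discriminator trained on both real trajectories and collision-resolved simulation data would score these generated futures; the social-feature branch over inter-agent spatial relationships is omitted here for brevity.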

Original language: English
Journal: Visual Computer
DOI
Publication status: Accepted/In press - 2024
