Open Access
Journal Article
Improved Pose-Controlled Animation: A Quantitative and Qualitative Analysis
by Qinghui Xu, YanLin Wu, Yajun Yuan, Zongqi Ge and Khang Wen Goh
Abstract
Character animation, which aims to generate dynamic character videos from static images, has gained significant attention in recent years. Although diffusion models have established themselves as the leading approach to visual generation thanks to their strong generative capabilities, challenges remain in image-to-video synthesis, particularly for character animation: preserving temporal consistency and retaining fine-grained character details across frames continue to pose significant obstacles. In this work, we propose a novel framework specifically designed for character animation that leverages the potential of diffusion models. To maintain the intricate appearance details of the reference image, we introduce ReferenceNet, a network that integrates detailed features through spatial attention. To enhance controllability and ensure smooth motion transitions, we present an efficient pose guider that directs the character's movements, together with an effective temporal modeling strategy that enforces inter-frame consistency. By expanding the training data, our framework can animate arbitrary characters and outperforms existing image-to-video methods on character animation tasks. Experimental evaluations on benchmark image animation datasets demonstrate that our approach achieves state-of-the-art performance, setting a new standard for this domain.
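The core idea of the spatial-attention fusion described above can be sketched as follows. The abstract does not give the exact module, so this is a minimal illustration under one common assumption: reference-image features are concatenated with the denoised frame's features along the spatial (token) axis, so every frame token can attend to reference tokens during self-attention. The function names and shapes here are illustrative, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def spatial_reference_attention(x, ref):
    """ReferenceNet-style fusion (a sketch, not the paper's exact module).

    x:   (n, c) spatial tokens of the frame being generated
    ref: (m, c) spatial tokens extracted from the reference image

    Reference tokens are appended along the token axis, so the
    attention of each frame token ranges over both the frame itself
    and the reference image, letting appearance details flow in.
    """
    kv = np.concatenate([x, ref], axis=0)            # (n + m, c)
    c = x.shape[1]
    attn = softmax(x @ kv.T / np.sqrt(c), axis=-1)   # (n, n + m)
    return attn @ kv                                 # (n, c), fused features
```

In a real diffusion U-Net this fusion would be applied per attention block, with learned query/key/value projections and multiple heads; the single-head, projection-free version above only shows how concatenating reference tokens changes the attention span.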