Stable Diffusion's new trick is on fire! Given a few words, a dynamic image can be generated
Xi Xiaoyao Technology Says | Original
Author | Mini Play
Speaking of how large AI models help unleash the human imagination, methods based on the Stable Diffusion model lead the way. Through text-to-image generation, large models have opened up a vast and dreamlike world for our imagination: with just a few lines of text, it is possible to reproduce the strange and wonderful things trapped in our minds that could not otherwise be released.
And recently, Stable Diffusion has continued to evolve: with AnimateDiff, text-to-image generation can move from static to dynamic in a snap, animating personalized text-generated images in one go and achieving one-click GIF generation! First, let's demonstrate AnimateDiff's animation results. Suppose we want to generate an image of a girl happily wearing her new armor in the living room. We capture keywords such as cyber girl, smiling, armor, and living room, refine them slightly, and enter the following prompt:
long highlighted hair, cybergirl, futuristic silver armor suit, confident stance, high resolution, living room, smiling, heated
You can obtain a natural and realistic dynamic image:
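In a typical workflow, AnimateDiff produces a sequence of frames that are then assembled into a GIF. The sketch below shows only that last assembly step, using Pillow and synthetic solid-color frames as a stand-in for real AnimateDiff output (the `frames_to_gif` helper and the frame count of 16 are illustrative, not part of any official API):

```python
from PIL import Image

def frames_to_gif(frames, path, fps=8):
    """Save a list of PIL frames as a looping GIF (duration is ms per frame)."""
    frames[0].save(
        path,
        save_all=True,
        append_images=frames[1:],
        duration=int(1000 / fps),
        loop=0,  # loop forever
    )

# Stand-in for AnimateDiff output: 16 solid-color 64x64 frames.
frames = [Image.new("RGB", (64, 64), (i * 16, 32, 128)) for i in range(16)]
frames_to_gif(frames, "out.gif")
```

With a real pipeline, `frames` would instead be the list of decoded frames returned by the animation model; the GIF assembly step stays the same.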
Similarly, using the prompt:
1girl, anime, long pink hair, necklace, earrings, masterpiece, highly detailed, high quality, 8k
we can generate a GIF in a more anime style:
Using different styles of models, these generated animated images can be cartoon characters full of anime style:
It can also be a more realistic character design:
It can be a background clip from an anime movie:
It can also be an artistic scroll in ink-painting style:
Even more interestingly, AnimateDiff can be combined with ControlNet. For example, suppose we want the armor girl generated earlier to imitate the pose of the girl in the picture below:
Simply configure and enable ControlNet, using the image above as the control image, to generate the following result. The armor girl generated earlier perfectly imitates the pose of the girl in the reference image, adding even more room for imagination in custom animation generation!
Meanwhile, by using the MotionLoRA method through the prompt, we can also control the "camera" movement. For example, if we want the camera to pan left (that is, the background to move right), we can add `<lora:v2_lora_PanLeft:0.75>` to the prompt, which applies the pan-left LoRA with a weight of 0.75, producing the following effect:
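The `<lora:NAME:WEIGHT>` syntax above follows the common WebUI prompt-tag convention. As a small illustration of how such tags are typically separated from the text prompt before generation, here is a hypothetical helper (the function name and regex are my own, not part of any tool's API):

```python
import re

# WebUI-style LoRA tags in a prompt look like <lora:NAME:WEIGHT>.
LORA_TAG = re.compile(r"<lora:([^:>]+):([0-9.]+)>")

def split_lora_tags(prompt):
    """Return (cleaned_prompt, [(lora_name, weight), ...])."""
    loras = [(m.group(1), float(m.group(2))) for m in LORA_TAG.finditer(prompt)]
    cleaned = LORA_TAG.sub("", prompt).strip()
    return cleaned, loras

prompt = "cybergirl, armor, living room <lora:v2_lora_PanLeft:0.75>"
cleaned, loras = split_lora_tags(prompt)
# loras -> [("v2_lora_PanLeft", 0.75)]
```

The extracted name/weight pairs tell the generation backend which LoRA weights to load and how strongly to blend them; the cleaned prompt is what actually conditions the model.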
From a training perspective, AnimateDiff is not complex either. Starting from a user-personalized or fine-tuned text-to-image (T2I) model, AnimateDiff uses short video clips to train a motion modeling module that works like a plugin. Embedding this motion module into the T2I model enables the generated image to successfully "move from rest".
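The core idea of the motion module is to let information flow across the frame axis while leaving the per-frame spatial layers of the T2I model untouched. The toy NumPy sketch below illustrates that reshaping trick: spatial positions are folded into the batch so that self-attention operates across frames only. It uses identity Q/K/V projections for simplicity (a real module learns these), so it is a shape-level illustration, not AnimateDiff's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def temporal_self_attention(latent):
    """Toy motion-module step: attend across the frame axis only.

    latent: (batch, channels, frames, height, width).
    """
    b, c, f, h, w = latent.shape
    # Fold spatial positions into the batch so attention sees one pixel
    # location across all frames: (b*h*w, frames, channels).
    x = latent.transpose(0, 3, 4, 2, 1).reshape(b * h * w, f, c)
    # Identity projections for the sketch (real modules learn Q, K, V).
    q, k, v = x, x, x
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(c), axis=-1)
    out = attn @ v
    # Restore the original latent layout.
    return out.reshape(b, h, w, f, c).transpose(0, 4, 3, 1, 2)

latent = np.random.randn(1, 4, 16, 8, 8)  # a 16-frame latent "video"
out = temporal_self_attention(latent)
```

Because the module only mixes information along the frame axis, it can be bolted onto any compatible T2I checkpoint without retraining the spatial layers, which is what makes the plugin-style reuse described above possible.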
Undoubtedly, a single static image has an upper limit on its expressive power, while AnimateDiff endows images with the precious ability to "move", greatly expanding what simple input text can express on top of Stable Diffusion. It is quite possible that AnimateDiff will redefine the entire animation industry!