From d363657f60d9822f2f94a0e8ef735ef2e99c82fa Mon Sep 17 00:00:00 2001
From: liuwenran <448073814@qq.com>
Date: Tue, 13 Jun 2023 14:53:58 +0800
Subject: [PATCH] add readme

---
 configs/controlnet_animation/README.md | 26 ++++++++++++++++++++++----
 1 file changed, 22 insertions(+), 4 deletions(-)

diff --git a/configs/controlnet_animation/README.md b/configs/controlnet_animation/README.md
index bfae689ef2..5878ff7818 100644
--- a/configs/controlnet_animation/README.md
+++ b/configs/controlnet_animation/README.md
@@ -13,11 +13,13 @@
 It is difficult to keep consistency and avoid video frame flickering when using stable diffusion to generate video frame by frame.
 Here we reproduce two methods that effectively avoid video flickering:

-1. Controlnet with multi-frame rendering.[ControlNet](https://github.com/lllyasviel/ControlNet) is a neural network structure to control diffusion models by adding extra conditions.
-   [Multi-frame rendering](https://xanthius.itch.io/multi-frame-rendering-for-stablediffusion) is a community method to reduce flickering.
-   We use controlnet with hed condition and stable diffusion img2img for multi-frame rendering.
+**ControlNet with multi-frame rendering**. [ControlNet](https://github.com/lllyasviel/ControlNet) is a neural network structure to control diffusion models by adding extra conditions.
+[Multi-frame rendering](https://xanthius.itch.io/multi-frame-rendering-for-stablediffusion) is a community method to reduce flickering.
+We use ControlNet with the HED condition and Stable Diffusion img2img for multi-frame rendering.

-2. Controlnet with attention injection. Attention injection is widely used to generate the current frame from a reference image. There is an implementation in [sd-webui-controlnet](https://github.com/Mikubill/sd-webui-controlnet#reference-only-control) and we use some of their code to create the animation in this repo.
+**ControlNet with attention injection**. Attention injection is widely used to generate the current frame from a reference image. There is an implementation in [sd-webui-controlnet](https://github.com/Mikubill/sd-webui-controlnet#reference-only-control), and we use some of its code to create the animation in this repo.
+
+You may need 40 GB of GPU memory to run ControlNet with multi-frame rendering and 10 GB for ControlNet with attention injection. If the config file is not changed, ControlNet with attention injection is used by default.

 ## Demos

@@ -84,6 +86,22 @@ editor.infer(video=video, prompt=prompt, negative_prompt=negative_prompt, save_p
 python demo/gradio_controlnet_animation.py
 ```

+### 3. Change the config to use multi-frame rendering or attention injection
+
+Change `inference_method` in the [anythingv3 config](./anythingv3_config.py).
+
+To use multi-frame rendering:
+
+```python
+inference_method = 'multi-frame rendering'
+```
+
+To use attention injection:
+
+```python
+inference_method = 'attention_injection'
+```
+
 ## Play animation with SAM

 We also provide a demo to play controlnet animation with sam, for details, please see [OpenMMLab PlayGround](https://github.com/open-mmlab/playground/blob/main/mmediting_sam/README.md).
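
A quick way to confirm which method is active is to load the config named in the new section 3 and print `inference_method`. The sketch below is illustrative only: the config path comes from the patch, while the assumption that the file is an MMEngine-style Python config loadable with `Config.fromfile` (and that you run this from the repository root) is mine.

```python
# Minimal sketch, assuming MMEngine-style configs and execution from the repo root.
from mmengine import Config

cfg = Config.fromfile('configs/controlnet_animation/anythingv3_config.py')

# Prints 'attention_injection' by default (per the README text above);
# set it to 'multi-frame rendering' in the config file to switch methods.
print(cfg.inference_method)
```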