Introduction
The latest version of this workflow uses alimama-creative/FLUX.1-dev-Controlnet-Inpainting-Beta together with alimama-creative/FLUX.1-Turbo-Alpha to achieve 8-step inpainting and outpainting within the same workflow.
Models
FLUX.1-Turbo-Alpha.safetensors (models/loras/flux): https://huggingface.co/alimama-creative/FLUX.1-Turbo-Alpha
FLUX.1-dev-Controlnet-Inpainting-Beta-fp8.safetensors (models/controlnet): https://huggingface.co/alimama-creative/FLUX.1-dev-Controlnet-Inpainting-Beta
Download diffusion_pytorch_model.safetensors, rename the file, and use Kijai's script (https://huggingface.co/Kijai/flux-fp8/discussions/7#66ae0455a20def3de3c6d476) to convert it to FP8 so that it fits in 16 GB of VRAM (a minimal conversion sketch follows the model list).
flux1-dev-fp8-e4m3fn.safetensors (models/diffusion_models/flux): https://huggingface.co/Kijai/flux-fp8/tree/main
t5xxl_fp8_e4m3fn_scaled.safetensors (models/clip): https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main
ViT-L-14-BEST-smooth-GmP-TE-only-HF-format.safetensors (models/clip): https://huggingface.co/zer0int/CLIP-GmP-ViT-L-14/tree/main
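The FP8 conversion mentioned for the ControlNet above can be done with a short script. What follows is a minimal sketch, not Kijai's original script (see the linked discussion for that); it assumes torch 2.1+ (for torch.float8_e4m3fn) and safetensors are installed, and that the source and output filenames match the ones listed above.
```python
# Minimal FP8 conversion sketch (not Kijai's script; see the linked discussion for the original).
# Assumes torch >= 2.1 (torch.float8_e4m3fn) and a recent safetensors release are installed.
import torch
from safetensors.torch import load_file, save_file

SRC = "diffusion_pytorch_model.safetensors"                    # downloaded ControlNet weights
DST = "FLUX.1-dev-Controlnet-Inpainting-Beta-fp8.safetensors"  # renamed FP8 output

state_dict = load_file(SRC)
fp8_state_dict = {}
for name, tensor in state_dict.items():
    if tensor.dtype in (torch.float32, torch.float16, torch.bfloat16):
        # Cast floating-point weights to float8_e4m3fn to shrink the on-disk size.
        fp8_state_dict[name] = tensor.to(torch.float8_e4m3fn)
    else:
        # Leave non-float tensors (e.g. integer buffers) untouched.
        fp8_state_dict[name] = tensor

save_file(fp8_state_dict, DST)
print(f"Saved FP8 weights to {DST}")
```
Place the resulting file in models/controlnet as listed above.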
Custom Nodes
segment anything (if you have > 16GB VRAM and want to use automatic segmentation)
Various ComfyUI Nodes by Type
KJNodes for ComfyUI
Version Notes
Changed the model to Juggernaut Lightning for more realistic generation.
The workflow now allows for both inpainting and outpainting within the same generation.