FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8

FLUX_Schnell_FLAN_FP8
Felldude
11 months ago

FLUX.1 (Base UNET) + Google FLAN

Full checkpoint with an improved text encoder (TE) baked in; do not load an additional CLIP/TE.

This model takes the 42 GB FP32 Google FLAN T5-XXL, quantizes it, and pairs it with an improved CLIP-L for Flux. To my knowledge no one else has posted or attempted this.

  • Quantized from the FP32 T5-XXL (42 GB, 11B parameters)

  • Base UNET with no baked-in LoRAs or other changes

  • A full FP16 version is also available.

  • The NF4 full checkpoint is ready to use in ComfyUI with the NF4 loader, or natively in Forge (Forge has LoRA support, and ComfyUI is taking roughly 10x longer than Forge per iteration; I prefer ComfyUI, but its NF4 support is garbage)

  • The FP8 version is recommended for ComfyUI; just use the standard checkpoint loader. (NF4 is recommended for Forge, as it loses less in quantization.)

Again, do not load a separate VAE, CLIP, or TE; the versions quantized from FP32 are baked in.
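
For anyone curious what the conversion step looks like in practice, here is a minimal sketch of the two quantization paths described above: loading the FP32 FLAN T5-XXL encoder in NF4 via bitsandbytes, and casting the FP32 weights to FP8 with PyTorch. This is an illustrative sketch, not the actual script behind this release; the model ID and output path are placeholders.

    # Minimal sketch of both quantization paths, assuming transformers,
    # bitsandbytes, and PyTorch >= 2.1 are installed. The model ID and the
    # output path are placeholders, not the exact pipeline used for this release.
    import torch
    from transformers import T5EncoderModel, BitsAndBytesConfig

    # NF4: 4-bit NormalFloat quantization of the text encoder via bitsandbytes
    nf4_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.bfloat16,
    )
    te_nf4 = T5EncoderModel.from_pretrained(
        "google/flan-t5-xxl",        # FP32 FLAN T5-XXL source weights
        quantization_config=nf4_config,
    )

    # FP8: straight cast of the FP32 weights to float8_e4m3fn
    te_fp32 = T5EncoderModel.from_pretrained(
        "google/flan-t5-xxl", torch_dtype=torch.float32
    )
    fp8_state = {
        k: (v.to(torch.float8_e4m3fn) if v.is_floating_point() else v)
        for k, v in te_fp32.state_dict().items()
    }
    torch.save(fp8_state, "flan_t5xxl_fp8.pt")  # placeholder output path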

Per the Apache 2.0 license, FLAN is attributed to Google.


What is FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8?

FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8 is a specialized image-generation AI model of type Safetensors / Checkpoint AI Model created by AI community user Felldude. Built on the Flux.1 base model (Dev/Schnell), it packages the unmodified base UNET with a quantized Google FLAN T5-XXL text encoder and an improved CLIP-L. This makes it well suited to the use cases it is tagged with, such as base model, flux1.s, and nf4.

FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8 can be used to generate high-quality images from text prompts.

Can I download FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8?

Yes! You can download the latest version of FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8 from this page.

How to use FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8?

To use FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8, download the model checkpoint file and set up a UI that can run Flux checkpoints (for example, ComfyUI or Forge, as recommended above). Then provide the model with a detailed text prompt to generate an image, and experiment with different prompts and settings to achieve the desired results. If this sounds a bit complicated, check out our initial guide to Stable Diffusion – it might be of help. And if you really want to dive deep into AI image generation and understand how to set up a UI to use Safetensors / Checkpoint AI models like FLUX Dev/Schnell (Base UNET) + Google FLAN FP16/NF4-FP32/FP8, check out our crash course in AI image generation.
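
If you prefer a scripted route over a GUI, the sketch below shows one possible way to prompt the checkpoint with the diffusers library. It assumes your installed diffusers version supports from_single_file on FluxPipeline (worth verifying), and the checkpoint file name is a placeholder for whichever variant you downloaded.

    # Hedged example: prompting the downloaded checkpoint from Python with
    # diffusers instead of a GUI. Assumes a recent diffusers release whose
    # FluxPipeline supports from_single_file; the file name is a placeholder.
    import torch
    from diffusers import FluxPipeline

    pipe = FluxPipeline.from_single_file(
        "FLUX_Schnell_FLAN_FP8.safetensors",  # the downloaded checkpoint
        torch_dtype=torch.bfloat16,
    )
    pipe.enable_model_cpu_offload()  # offload to fit on consumer GPUs

    image = pipe(
        "a lighthouse on a rocky coast at sunset, detailed, photorealistic",
        num_inference_steps=4,  # Schnell is distilled for very few steps
        guidance_scale=0.0,     # Schnell is typically run without CFG
    ).images[0]
    image.save("flux_schnell_flan_test.png")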

Download (20.2 GB)


Info

Base model: Flux.1 S

Latest version (FLUX_Schnell_FLAN_FP8): 1 File


6 Versions
