How to run FLUX AI model for free (NSFW)
By gerogero
Updated: November 16, 2024
In August 2024, a team of ex-Stability AI developers announced the formation of Black Forest Labs and the release of their first AI model, FLUX.1, a 12-billion-parameter model built on a novel transformer architecture.
Reception to the model was overwhelmingly positive; users were shocked by how good the image quality and prompt recognition were.
What makes the Flux.1 model all the more special is that it is open source, so the community can build on top of it by training custom models and LoRAs with different capabilities.
Flux.1 consists of three model variations:
- Flux.1 [Pro] – Pro is only available for use via API. You can use Black Forest Labs’ own API, or a number of their commercial partners, to generate images with Flux.1 Pro. The weights (model file) can’t be downloaded, and even if they could, the system requirements would be too high for use with consumer hardware.
- Flux.1 [Dev] – Dev is an open-source model for non-commercial applications. Distilled from Flux.1 [Pro], it provides similar image quality and prompt-recognition capabilities while being more efficient; we can run it locally on consumer hardware. Flux.1 [Dev] is released under a Non Commercial License. Download from Hugging Face here.
- Flux.1 [Schnell] – Schnell is the speed-focused model: it produces images in very few steps at the expense of some image quality. Flux.1 [Schnell] is released under the Apache-2.0 License. Download from Hugging Face here.
How good is FLUX.1? What quality can I expect?
Described by many AI enthusiasts as “the model we’ve been waiting for” (especially after the disappointment of SD3), Flux has been embraced with open arms. Image fidelity, prompt adherence, and overall image quality are exceptional, setting a new standard in the text2img landscape.
Just take a look at these examples:
And let’s not forget the NSFW capabilities (more NSFW prompts here):
How do I try Flux online?
You can use FLUX.1 for free (limited use) on Hugging Face. Here are the generators for the Dev and the Schnell models.
You can also run FLUX.1 on Replicate.com: Dev and Schnell (also limited free use).
How do I use Flux locally?
Currently, there are a couple of options for local generation, depending on your hardware!
At the time of writing there is no Automatic1111 support.
Let’s take a look at our options:
- SwarmUI (my personal recommendation)
- Forge
- ComfyUI
Using SwarmUI
Here’s the download link:
https://github.com/mcmonkeyprojects/SwarmUI
Follow the instructions, which are repeated here:
Note: if you’re on Windows 10, you may need to manually install git and DotNET 8 first (on Windows 11 this is automated).
- Download the Install-Windows.bat file, save it in the folder where you want SwarmUI installed (not Program Files), and run it. For me that’s my D: drive, but it’s up to you.
- It should open a command prompt and install itself.
- If it closes without going further, try running it again; it sometimes needs to run twice.
- It will place an icon on your desktop that you can use to re-launch the server at any time.
- When the installer completes, it will automatically launch the StableSwarmUI server and open a browser window to the install page.
- Follow the install instructions on the page.
- After you submit, be patient; some of the install steps take a few minutes (downloading models, etc.).
That should finish the installation, with the SDXL Base model offered as a default.
To start it, double-click the “Launch-Windows.bat” file. It will have also put a shortcut on your desktop, unless you told it not to.
Try creating an image with the XL model. If that works, great!
Download the Flux model from here:
- The Dev model, if you have 12GB+ VRAM:
https://huggingface.co/black-forest-labs/FLUX.1-dev/tree/main
- The Schnell model, if you have less:
https://huggingface.co/black-forest-labs/FLUX.1-schnell/tree/main
Also download the corresponding “ae.safetensors” file for whichever model you choose.
Put your chosen FLUX file in your unet folder:
SwarmUI\Models\unet
Then put the “ae.safetensors” file in your VAE folder:
SwarmUI\Models\VAE
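If you want to double-check that the files landed where Swarm expects, a small path check works. This is just a sketch; `check_flux_layout` is a hypothetical helper, not part of SwarmUI, and the demo runs against a throwaway directory tree rather than a real install:

```python
from pathlib import Path
import tempfile

def check_flux_layout(swarm_root, model_name="flux1-dev.safetensors"):
    """Return the expected FLUX files that are missing under swarm_root."""
    expected = [
        Path(swarm_root) / "Models" / "unet" / model_name,   # the FLUX model itself
        Path(swarm_root) / "Models" / "VAE" / "ae.safetensors",  # the matching VAE
    ]
    return [str(p) for p in expected if not p.exists()]

# Demo against a throwaway directory tree mimicking the layout above:
root = Path(tempfile.mkdtemp())
for sub in ("Models/unet", "Models/VAE"):
    (root / sub).mkdir(parents=True)
(root / "Models" / "unet" / "flux1-dev.safetensors").touch()
(root / "Models" / "VAE" / "ae.safetensors").touch()
print(check_flux_layout(root))  # → [] (nothing missing)
```

Point it at your actual SwarmUI folder to see which file, if any, is in the wrong place.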
Close the app, both the browser and the console.
Restart Swarm with the Launch-Windows.bat file.
You should be able to select Flux as the model; try creating an image.
It will tell you it is in the queue.
You will have to wait, because Swarm is downloading large files. You can check progress in the console.
When download is complete, your first image should start to appear!
ComfyUI
Flux.1 launched with day-1 ComfyUI support, making it one of the quickest and easiest ways to dive into generating with the original Black Forest Labs models. To get started using Flux with ComfyUI, you’ll need the following components:
| Flux.1 Models | HF Link | Civitai Link |
|---|---|---|
| ae.safetensors (VAE, required) | Black Forest Labs HF Repository | |
| flux1-dev.safetensors | Black Forest Labs HF Repository | Civitai Download Link |
| flux1-schnell.safetensors | Black Forest Labs HF Repository | Civitai Download Link |
| clip_l.safetensors | Comfyanonymous HF Repository | |
| t5xxl_fp16.safetensors | Comfyanonymous HF Repository | |
| t5xxl_fp8_e4m3fn.safetensors | Comfyanonymous HF Repository | |
- Note that the Flux-dev and -schnell .safetensors models must be placed into the ComfyUI\models\unet folder.
- Clip Models must be placed into the ComfyUI\models\clip folder. You may already have the required Clip models if you’ve previously used SD3.
- Some system requirement considerations:
- flux1-dev requires more than 12GB VRAM
- flux1-schnell can run on 12GB VRAM
- If you have less than 32GB of System RAM, use the t5xxl_fp8_e4m3fn text encoder instead of the t5xxl_fp16 version.
⚠️ If you’re unable to run the ‘full fat’ official models, creator Kijai has released compressed fp8 versions of both flux1-dev and flux1-schnell. While there might be some reduction in image quality, these versions make it possible for users with less available VRAM to generate with Flux.
| Flux.1 Models | HF Link |
|---|---|
| flux1-dev-fp8.safetensors | Kijai HF Repository |
| flux1-schnell-fp8.safetensors | Kijai HF Repository |
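The VRAM guidance above follows from simple weight-size arithmetic. A back-of-the-envelope sketch (real checkpoint sizes differ slightly, and inference needs headroom beyond the raw weights):

```python
PARAMS = 12e9  # Flux.1's stated parameter count

def weight_gb(bytes_per_param):
    """Approximate checkpoint size in GiB at a given precision."""
    return PARAMS * bytes_per_param / 1024**3

# fp16/bf16 stores 2 bytes per parameter; fp8 stores 1.
print(f"fp16/bf16: {weight_gb(2):.1f} GB")  # ~22.4 GB — why full fp16 won't fit in 12GB VRAM
print(f"fp8:       {weight_gb(1):.1f} GB")  # ~11.2 GB — why fp8 variants reach smaller GPUs
```

This is also why the fp8 text encoder is recommended for systems with less than 32GB RAM: halving the bytes per parameter halves what has to be held in memory.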
You’ll also need a basic text-to-image workflow to get started. The download link below provides a straightforward setup with some solid pre-set options. Additionally, we’ve included a LoRA Loader (bypassed by default), as we’re already seeing the first Flux LoRAs appearing on Civitai!
civitai_flux_t2i_workflow (download)
Forge
One of our favorite interfaces, Forge, has received Flux support in a surprise Major Update! If you’re familiar with Automatic1111’s interface, you’ll be right at home with Forge; the Gradio front-ends are practically identical.
Forge supports the original Flux models and text encoders listed above for Comfy.
To use the full models and text encoders, there are new fields in which they can be loaded:
Forge’s creator, lllyasviel, has released a “compressed” NF4 model which is currently the recommended way to use Flux with Forge: “NF4 is significantly faster than FP8 on 6GB/8GB/12GB devices and slightly faster for >16GB VRAM devices. For GPUs with 6GB/8GB VRAM, the speed-up is about 1.3x to 2.5x (pytorch 2.4, cuda 12.4) or about 1.3x to 4x (pytorch 2.1, cuda 12.1).”
| Model | HF Link |
|---|---|
| flux1-dev-bnb-nf4-v2.safetensors | lllyasviel HF Repository |
| flux1-dev-fp8.safetensors | lllyasviel HF Repository |
Note: if your GPU supports CUDA newer than 11.7, you can use the NF4 model (most RTX 30XX and 40XX GPUs do).
If your GPU is a GTX 10XX/20XX, it may not support NF4; in that case, use the fp8 model.
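The checkpoint choice above can be summed up in a few lines. This is a rough helper mirroring the article’s guidance, not an API from Forge — the function name and the series cutoffs are illustrative only:

```python
def recommended_flux_checkpoint(gpu_series: str) -> str:
    """Pick a Forge Flux checkpoint based on GPU generation (per the note above)."""
    nf4_capable = ("RTX 30", "RTX 40")  # generations known to support the NF4 path
    if gpu_series.startswith(nf4_capable):
        return "flux1-dev-bnb-nf4-v2.safetensors"
    return "flux1-dev-fp8.safetensors"  # GTX 10XX/20XX fallback

print(recommended_flux_checkpoint("RTX 4070"))  # → flux1-dev-bnb-nf4-v2.safetensors
print(recommended_flux_checkpoint("GTX 1080"))  # → flux1-dev-fp8.safetensors
```

When in doubt, the fp8 model is the safe choice: it runs on both older and newer cards, just more slowly than NF4 on hardware that supports it.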