This flux1-schnell checkpoint stores its weights in FP8, which makes it run noticeably faster and use less memory in ComfyUI.
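
A minimal sketch for verifying what you downloaded, assuming the checkpoint is saved locally as `flux1-schnell-fp8.safetensors` (adjust the path to wherever you placed it, e.g. `ComfyUI/models/checkpoints/`). It loads the file with `safetensors` and counts tensors per dtype, so you can confirm the bulk of the weights are stored as FP8 (`torch.float8_e4m3fn`), roughly halving memory versus FP16. This is illustrative only, not part of the ComfyUI loading path.

```python
# Inspect which dtypes the checkpoint actually uses.
# Requires a recent PyTorch (>= 2.1 for float8 dtypes) and safetensors.
from collections import Counter

from safetensors.torch import load_file

# Hypothetical local path; change to your download location.
state_dict = load_file("flux1-schnell-fp8.safetensors")

# Count tensors per dtype; most should report torch.float8_e4m3fn.
dtype_counts = Counter(str(t.dtype) for t in state_dict.values())
for dtype, count in sorted(dtype_counts.items()):
    print(f"{dtype}: {count} tensors")
```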
