
Error in loading flux fp8 model with local transformer_flux.py file #9667

Open
DhavalTaunk08 opened this issue Oct 14, 2024 · 1 comment
Labels
bug Something isn't working

Comments


DhavalTaunk08 commented Oct 14, 2024

Describe the bug

Unable to use the flux fp8 model from Kijai/flux-fp8 while keeping a local copy of the transformer_flux.py file. I have modified the script to remove any import errors. I added some print statements to single_file_model.py to check why it is not loading the model.

Reproduction

The code below works fine.

single_file_model.py (with debug prints added):

# excerpt from diffusers/loaders/single_file_model.py, with debug prints added
import importlib

def _get_single_file_loadable_mapping_class(cls):
    print(cls)
    diffusers_module = importlib.import_module(__name__.split(".")[0])

    for loadable_class_str in SINGLE_FILE_LOADABLE_CLASSES:
        loadable_class = getattr(diffusers_module, loadable_class_str)
        print(cls, loadable_class)
        print(issubclass(cls, loadable_class))
        if issubclass(cls, loadable_class):
            return loadable_class_str

    return None

# calling script
import torch
from diffusers import FluxTransformer2DModel

transformer = FluxTransformer2DModel.from_single_file(
    "https://huggingface.co/Kijai/flux-fp8/blob/main/flux1-schnell-fp8-e4m3fn.safetensors",
    torch_dtype=torch.bfloat16,
)

I get the following output:

<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'>
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_stable_cascade.StableCascadeUNet'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_2d_condition.UNet2DConditionModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.autoencoders.autoencoder_kl.AutoencoderKL'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.controlnet.ControlNetModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.transformers.transformer_sd3.SD3Transformer2DModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_motion_model.MotionAdapter'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.controlnet_sparsectrl.SparseControlNetModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'>
True

But when using the class from my local code:

import torch
from transformer_flux import FluxTransformer2DModel

FluxTransformer2DModel.__module__ = 'diffusers.models.transformers.transformer_flux'
transformer = FluxTransformer2DModel.from_single_file(
    "https://huggingface.co/Kijai/flux-fp8/blob/main/flux1-schnell-fp8-e4m3fn.safetensors",
    torch_dtype=torch.bfloat16,
)

It gives me the following error:

<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'>
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_stable_cascade.StableCascadeUNet'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_2d_condition.UNet2DConditionModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.autoencoders.autoencoder_kl.AutoencoderKL'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.controlnet.ControlNetModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.transformers.transformer_sd3.SD3Transformer2DModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_motion_model.MotionAdapter'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.controlnet_sparsectrl.SparseControlNetModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'>
False

Traceback (most recent call last):
  File "/workspace/GarmentTransferV2/test.py", line 441, in <module>
    main(args)
  File "/workspace/GarmentTransferV2/test.py", line 368, in main
    transformer_garment = FluxTransformerGarment2DModel.from_single_file(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/garment/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/workspace/garment/lib/python3.11/site-packages/diffusers/loaders/single_file_model.py", line 182, in from_single_file
    raise ValueError(
ValueError: FromOriginalModelMixin is currently only compatible with StableCascadeUNet, UNet2DConditionModel, AutoencoderKL, ControlNetModel, SD3Transformer2DModel, MotionAdapter, SparseControlNetModel, FluxTransformer2DModel

Any leads would be appreciated.
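
The final False in the local-import run follows from how issubclass works: it compares class objects, not module paths, so patching __module__ on a separately defined class does not make it a subclass of the library's class. A minimal sketch (the class names here are illustrative stand-ins, not the actual diffusers classes):

```python
# Two classes with identical names and a patched __module__ are still
# distinct class objects, so issubclass() does not relate them.
class A:  # stand-in for the library's FluxTransformer2DModel
    pass

class B:  # stand-in for a local copy of the class in transformer_flux.py
    pass

B.__module__ = A.__module__  # mimics the __module__ patch in the snippet above

print(issubclass(B, A))  # False: B does not inherit from A
print(B is A)            # False: they are different class objects

# Subclassing the library class (rather than copying it) would pass the check:
class C(A):
    pass

print(issubclass(C, A))  # True
```

This suggests that a local variant defined as a subclass of the library's FluxTransformer2DModel would pass the SINGLE_FILE_LOADABLE_CLASSES check, whereas a standalone copy of the class cannot.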

Logs

No response

System Info

  • 🤗 Diffusers version: 0.30.3
  • Platform: Linux-6.8.0-40-generic-x86_64-with-glibc2.35
  • Running on Google Colab?: No
  • Python version: 3.11.9
  • PyTorch version (GPU?): 2.4.1+cu121 (True)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Huggingface_hub version: 0.25.2
  • Transformers version: 4.45.2
  • Accelerate version: 1.0.1
  • PEFT version: not installed
  • Bitsandbytes version: not installed
  • Safetensors version: 0.4.5
  • xFormers version: not installed
  • Accelerator: NVIDIA H100 80GB HBM3, 81559 MiB
  • Using GPU in script?:

Who can help?

@DN6 @sayakpaul

a-r-r-o-w (Member) commented Oct 14, 2024

You cannot pass URLs to from_single_file. Could you try the following instead?

from diffusers import FluxTransformer2DModel
from huggingface_hub import hf_hub_download

safetensors_file = hf_hub_download("Kijai/flux-fp8", filename="flux1-dev-fp8-e4m3fn.safetensors")
print(safetensors_file)

transformer = FluxTransformer2DModel.from_single_file(safetensors_file, subfolder="transformer")
print(transformer.config)

Edit: My bad, you can pass URLs to from_single_file, I got confused with something else. I think you just have to pass subfolder="transformer" when initializing to make your snippet work. This is because we try to fetch the init config from the original Flux-Dev repository (as Kijai/flux-fp8 is based on Flux-Dev).
