Error occurred when executing SamplerCustomAdvanced #159

Open

Jajanoce opened this issue Dec 22, 2024 · 10 comments

Comments

@Jajanoce

I don't understand what the problem is. Can anyone help me?

ErrorAttn

ComfyUI Error Report

Error Details

  • Node ID: 13
  • Node Type: SamplerCustomAdvanced
  • Exception Type: TypeError
  • Exception Message: DoubleStreamBlock.forward() got an unexpected keyword argument 'attn_mask'
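For context, this TypeError is Python's standard error when a method is called with a keyword its signature does not declare. A minimal sketch (hypothetical stand-in classes, not ComfyUI's actual code) of how a custom node that monkey-patches a block's forward() with an older signature produces exactly this message:

```python
class DoubleStreamBlock:
    """Stand-in for the core block; the newer core signature accepts attn_mask."""
    def forward(self, img, txt, vec, pe, attn_mask=None):
        return img, txt

def patched_forward(self, img, txt, vec, pe):
    """Hypothetical replacement installed by a custom node, written before
    attn_mask was added to the core call site; it has no attn_mask parameter."""
    return img, txt

block = DoubleStreamBlock()
block.forward("i", "t", "v", "p", attn_mask=None)  # fine against the core signature

# What a patching custom node effectively does to the class:
DoubleStreamBlock.forward = patched_forward
try:
    block.forward("i", "t", "v", "p", attn_mask=None)
    msg = ""
except TypeError as e:
    msg = str(e)  # contains "got an unexpected keyword argument 'attn_mask'"
```

The core caller passes `attn_mask=...` unconditionally, so the stale replacement fails on the very first sampling step, which matches the traceback below ending inside PyTorch's `_call_impl`.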

Stack Trace

  File "C:\IA\ComfyUI\execution.py", line 328, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "C:\IA\ComfyUI\execution.py", line 203, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "C:\IA\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)

  File "C:\IA\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))

  File "C:\IA\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 897, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)

  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 866, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 850, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)

  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 707, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)

  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)

  File "C:\IA\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 379, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 832, in __call__
    return self.predict_noise(*args, **kwargs)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 835, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 359, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 195, in calc_cond_batch
    return executor.execute(model, conds, x_in, timestep, model_options)

  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)

  File "C:\IA\ComfyUI\comfy\samplers.py", line 308, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)

  File "C:\IA\ComfyUI\comfy\model_base.py", line 130, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(

  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)

  File "C:\IA\ComfyUI\comfy\model_base.py", line 159, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()

  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)

  File "C:\IA\ComfyUI\comfy\ldm\flux\model.py", line 204, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options, attn_mask=kwargs.get("attention_mask", None))

  File "C:\IA\ComfyUI\comfy\ldm\flux\model.py", line 143, in forward_orig
    img, txt = block(img=img,

  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
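Errors of this shape usually trace back to a custom node replacing block forwards with signatures that lag behind ComfyUI core (the log below shows x-flux-comfyui printing "We are patching diffusion model"); updating both the extension and ComfyUI typically resolves it. A forward-compatible replacement would accept unknown keywords via **kwargs. This is a sketch under that assumption, not the extension's real code:

```python
def patched_forward(self, img, txt, vec, pe=None, attn_mask=None, **kwargs):
    # Accept attn_mask (and any future keyword the core adds) instead of
    # letting the call raise TypeError; unused extras are simply ignored here.
    return img, txt

class Block:
    """Hypothetical block class standing in for the patched model block."""
    pass

b = Block()
Block.forward = patched_forward
# Survives both old-style and new-style call sites:
img, txt = b.forward("i", "t", "v", attn_mask=None, some_future_arg=1)
```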

System Information

  • ComfyUI Version: v0.3.9-9-g601ff9e
  • Arguments: main.py
  • OS: nt
  • Python Version: 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
  • Embedded Python: false
  • PyTorch Version: 2.5.1+cu124

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 3070 Laptop GPU : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 8589410304
    • VRAM Free: 56179520
    • Torch VRAM Total: 12817793024
    • Torch VRAM Free: 56179520

Logs

2024-12-22T07:01:38.120018 - [START] Security scan
2024-12-22T07:01:39.174756 - [DONE] Security scan
2024-12-22T07:01:39.328593 - ## ComfyUI-Manager: installing dependencies done.
2024-12-22T07:01:39.328593 - ** ComfyUI startup time: 2024-12-22 07:01:39.328593
2024-12-22T07:01:39.343741 - ** Platform: Windows
2024-12-22T07:01:39.344749 - ** Python version: 3.10.6 (tags/v3.10.6:9c7b4bd, Aug  1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
2024-12-22T07:01:39.344749 - ** Python executable: C:\Users\lospu\AppData\Local\Programs\Python\Python310\python.exe
2024-12-22T07:01:39.344749 - ** ComfyUI Path: C:\IA\ComfyUI
2024-12-22T07:01:39.344749 - ** Log path: C:\IA\ComfyUI\comfyui.log
2024-12-22T07:01:40.185721 - 
Prestartup times for custom nodes:
2024-12-22T07:01:40.185721 -    2.1 seconds: C:\IA\ComfyUI\custom_nodes\ComfyUI-Manager
2024-12-22T07:01:40.185721 - 
2024-12-22T07:01:43.880437 - Total VRAM 8192 MB, total RAM 32620 MB
2024-12-22T07:01:43.880437 - pytorch version: 2.5.1+cu124
2024-12-22T07:01:43.881840 - Set vram state to: NORMAL_VRAM
2024-12-22T07:01:43.881840 - Device: cuda:0 NVIDIA GeForce RTX 3070 Laptop GPU : cudaMallocAsync
2024-12-22T07:01:45.810004 - Using pytorch attention
2024-12-22T07:01:48.413870 - [Prompt Server] web root: C:\IA\ComfyUI\web
2024-12-22T07:01:49.635688 - Traceback (most recent call last):
  File "C:\IA\ComfyUI\nodes.py", line 2075, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 879, in exec_module
  File "<frozen importlib._bootstrap_external>", line 1016, in get_code
  File "<frozen importlib._bootstrap_external>", line 1073, in get_data
FileNotFoundError: [Errno 2] No such file or directory: 'C:\\IA\\ComfyUI\\custom_nodes\\ComfyUI\\__init__.py'

2024-12-22T07:01:49.635688 - Cannot import C:\IA\ComfyUI\custom_nodes\ComfyUI module for custom nodes: [Errno 2] No such file or directory: 'C:\\IA\\ComfyUI\\custom_nodes\\ComfyUI\\__init__.py'
2024-12-22T07:01:49.670820 - ### Loading: ComfyUI-Manager (V2.55.5)
2024-12-22T07:01:49.885903 - ### ComfyUI Version: v0.3.9-9-g601ff9e | Released on '2024-12-21'
2024-12-22T07:01:50.278037 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2024-12-22T07:01:50.306787 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2024-12-22T07:01:50.308287 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2024-12-22T07:01:50.358883 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2024-12-22T07:01:50.436251 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2024-12-22T07:01:50.588897 - 
Import times for custom nodes:
2024-12-22T07:01:50.588897 -    0.0 seconds: C:\IA\ComfyUI\custom_nodes\websocket_image_save.py
2024-12-22T07:01:50.588897 -    0.0 seconds (IMPORT FAILED): C:\IA\ComfyUI\custom_nodes\ComfyUI
2024-12-22T07:01:50.589908 -    0.0 seconds: C:\IA\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale
2024-12-22T07:01:50.589908 -    0.0 seconds: C:\IA\ComfyUI\custom_nodes\ComfyUI-GGUF
2024-12-22T07:01:50.589908 -    0.0 seconds: C:\IA\ComfyUI\custom_nodes\efficiency-nodes-comfyui
2024-12-22T07:01:50.589908 -    0.4 seconds: C:\IA\ComfyUI\custom_nodes\x-flux-comfyui
2024-12-22T07:01:50.589908 -    0.5 seconds: C:\IA\ComfyUI\custom_nodes\ComfyUI-Manager
2024-12-22T07:01:50.590907 - 
2024-12-22T07:01:50.599426 - Starting server

2024-12-22T07:01:50.600428 - To see the GUI go to: http://127.0.0.1:8188
2024-12-22T07:03:17.340202 - FETCH DATA from: C:\IA\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
2024-12-22T07:03:17.418833 - ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:07:03.747930 - got prompt
2024-12-22T07:07:03.749936 - ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:07:03.752451 - Failed to validate prompt for output 9:
2024-12-22T07:07:03.752451 - * FluxLoraLoader 38:
2024-12-22T07:07:03.752451 -   - Value not in list: lora_name: 'realism_lora.safetensors' not in ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:07:03.752451 - Output will be ignored
2024-12-22T07:07:03.752451 - invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}
2024-12-22T07:07:08.689057 - got prompt
2024-12-22T07:07:08.691058 - ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:07:08.692567 - Failed to validate prompt for output 9:
2024-12-22T07:07:08.692567 - * FluxLoraLoader 38:
2024-12-22T07:07:08.692567 -   - Value not in list: lora_name: 'realism_lora.safetensors' not in ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:07:08.692567 - Output will be ignored
2024-12-22T07:07:08.693571 - invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}
2024-12-22T07:08:23.496855 - FETCH DATA from: C:\IA\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
2024-12-22T07:08:23.563095 - ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:08:31.991159 - got prompt
2024-12-22T07:08:31.992161 - ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:08:32.032216 - Using pytorch attention in VAE
2024-12-22T07:08:32.033720 - Using pytorch attention in VAE
2024-12-22T07:08:32.437518 - C:\IA\ComfyUI\custom_nodes\ComfyUI-GGUF\loader.py:64: UserWarning: The given NumPy array is not writable, and PyTorch does not support non-writable tensors. This means writing to this tensor will result in undefined behavior. You may want to copy the array to protect its data or make it writable before converting it to a tensor. This type of warning will be suppressed for the rest of this program. (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\builder\windows\pytorch\torch\csrc\utils\tensor_numpy.cpp:212.)
  torch_tensor = torch.from_numpy(tensor.data) # mmap
2024-12-22T07:08:32.447029 - ggml_sd_loader:
2024-12-22T07:08:32.448029 -  GGMLQuantizationType.F16      476
2024-12-22T07:08:32.448029 -  GGMLQuantizationType.Q8_0     304
2024-12-22T07:08:32.484576 - model weight dtype torch.bfloat16, manual cast: None
2024-12-22T07:08:32.492579 - model_type FLUX
2024-12-22T07:08:32.559502 - ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:08:32.562011 - Is model already patched? False
2024-12-22T07:08:47.537940 - We are patching diffusion model, be patient please
2024-12-22T07:09:37.226566 - ggml_sd_loader:
2024-12-22T07:09:37.226566 -  GGMLQuantizationType.Q8_0     169
2024-12-22T07:09:37.226566 -  GGMLQuantizationType.F32       50
2024-12-22T07:09:37.648993 - Token indices sequence length is longer than the specified maximum sequence length for this model (85 > 77). Running this sequence through the model will result in indexing errors
2024-12-22T07:09:37.651942 - Requested to load FluxClipModel_
2024-12-22T07:09:37.653350 - 0 models unloaded.
2024-12-22T07:09:37.758853 - loaded partially 64.0 63.4521484375 0
2024-12-22T07:09:37.761455 - Attempting to release mmap (239)
2024-12-22T07:09:58.240587 - Requested to load Flux
2024-12-22T07:09:58.330050 - 0 models unloaded.
2024-12-22T07:09:58.366041 - loaded partially 64.0 63.83984375 0
  0%|                                                                                           | 0/20 [00:00<?, ?it/s]
2024-12-22T07:09:58.940456 - !!! Exception during processing !!! DoubleStreamBlock.forward() got an unexpected keyword argument 'attn_mask'
2024-12-22T07:09:58.944230 - Traceback (most recent call last):
  File "C:\IA\ComfyUI\execution.py", line 328, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\IA\ComfyUI\execution.py", line 203, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\IA\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\IA\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "C:\IA\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 897, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 866, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 850, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 707, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 379, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 832, in __call__
    return self.predict_noise(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 835, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 359, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 195, in calc_cond_batch
    return executor.execute(model, conds, x_in, timestep, model_options)
  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 308, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
  File "C:\IA\ComfyUI\comfy\model_base.py", line 130, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\model_base.py", line 159, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\ldm\flux\model.py", line 204, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options, attn_mask=kwargs.get("attention_mask", None))
  File "C:\IA\ComfyUI\comfy\ldm\flux\model.py", line 143, in forward_orig
    img, txt = block(img=img,
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
TypeError: DoubleStreamBlock.forward() got an unexpected keyword argument 'attn_mask'

2024-12-22T07:09:58.946769 - Prompt executed in 86.95 seconds
2024-12-22T07:16:48.861986 - got prompt
2024-12-22T07:16:48.864481 - ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:16:48.873998 - 0 models unloaded.
2024-12-22T07:16:48.898545 - loaded partially 64.0 63.83984375 0
  0%|                                                                                           | 0/20 [00:00<?, ?it/s]
2024-12-22T07:16:49.226558 - !!! Exception during processing !!! DoubleStreamBlock.forward() got an unexpected keyword argument 'attn_mask'

2024-12-22T07:16:49.229167 - Prompt executed in 0.36 seconds
2024-12-22T07:18:47.712354 - got prompt
2024-12-22T07:18:47.714355 - ['anime_lora_comfy_converted.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora_comfy_converted.safetensors']
2024-12-22T07:18:47.723880 - 0 models unloaded.
2024-12-22T07:18:47.747412 - loaded partially 64.0 63.83984375 0
2024-12-22T07:18:47.759000 - 
  0%|                                                                                           | 0/20 [00:00<?, ?it/s]
2024-12-22T07:18:48.018830 - !!! Exception during processing !!! DoubleStreamBlock.forward() got an unexpected keyword argument 'attn_mask'
2024-12-22T07:18:48.019829 - Traceback (most recent call last):
  File "C:\IA\ComfyUI\execution.py", line 328, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\IA\ComfyUI\execution.py", line 203, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "C:\IA\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\IA\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "C:\IA\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 897, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 866, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 850, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 707, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 379, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 832, in __call__
    return self.predict_noise(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 835, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 359, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 195, in calc_cond_batch
    return executor.execute(model, conds, x_in, timestep, model_options)
  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\samplers.py", line 308, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
  File "C:\IA\ComfyUI\comfy\model_base.py", line 130, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
  File "C:\IA\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\model_base.py", line 159, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\IA\ComfyUI\comfy\ldm\flux\model.py", line 204, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options, attn_mask=kwargs.get("attention_mask", None))
  File "C:\IA\ComfyUI\comfy\ldm\flux\model.py", line 143, in forward_orig
    img, txt = block(img=img,
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\lospu\AppData\Local\Programs\Python\Python310\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
TypeError: DoubleStreamBlock.forward() got an unexpected keyword argument 'attn_mask'

2024-12-22T07:18:48.022341 - Prompt executed in 0.30 seconds

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

```json
{"last_node_id":44,"last_link_id":130,"nodes":[{"id":8,"type":"VAEDecode","pos":[866,367],"size":[210,46],"flags":{},"order":18,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":24},{"name":"vae","type":"VAE","link":12}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[9,126],"slot_index":0}],"properties":{"Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":13,"type":"SamplerCustomAdvanced","pos":[864,192],"size":[272.3617858886719,124.53733825683594],"flags":{"collapsed":false},"order":17,"mode":0,"inputs":[{"name":"noise","type":"NOISE","link":37,"slot_index":0},{"name":"guider","type":"GUIDER","link":30,"slot_index":1},{"name":"sampler","type":"SAMPLER","link":19,"slot_index":2},{"name":"sigmas","type":"SIGMAS","link":20,"slot_index":3},{"name":"latent_image","type":"LATENT","link":116,"slot_index":4}],"outputs":[{"name":"output","type":"LATENT","links":[24],"slot_index":0,"shape":3},{"name":"denoised_output","type":"LATENT","links":null,"shape":3}],"properties":{"Node name for S&R":"SamplerCustomAdvanced"},"widgets_values":[]},{"id":16,"type":"KSamplerSelect","pos":[480,912],"size":[315,58],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"SAMPLER","type":"SAMPLER","links":[19],"shape":3}],"properties":{"Node name for S&R":"KSamplerSelect"},"widgets_values":["euler"]},{"id":17,"type":"BasicScheduler","pos":[480,1008],"size":[315,106],"flags":{},"order":15,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":55,"slot_index":0}],"outputs":[{"name":"SIGMAS","type":"SIGMAS","links":[20],"shape":3}],"properties":{"Node name for S&R":"BasicScheduler"},"widgets_values":["simple",20,1]},{"id":22,"type":"BasicGuider","pos":[576,48],"size":[222.3482666015625,46],"flags":{},"order":16,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":54,"slot_index":0},{"name":"conditioning","type":"CONDITIONING","link":42,"slot_index":1}],"outputs":[{"name":"GUIDER","type":"GUIDER","links":[30],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"BasicGuider"},"widgets_values":[]},{"id":25,"type":"RandomNoise","pos":[480,768],"size":[315,82],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"NOISE","type":"NOISE","links":[37],"shape":3}],"properties":{"Node name for S&R":"RandomNoise"},"widgets_values":[246833593956789,"randomize"],"color":"#2a363b","bgcolor":"#3f5159"},{"id":26,"type":"FluxGuidance","pos":[480,144],"size":[317.4000244140625,58],"flags":{},"order":14,"mode":0,"inputs":[{"name":"conditioning","type":"CONDITIONING","link":41}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[42],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"FluxGuidance"},"widgets_values":[3.5],"color":"#233","bgcolor":"#355"},{"id":27,"type":"EmptySD3LatentImage","pos":[480,624],"size":[315,106],"flags":{},"order":12,"mode":0,"inputs":[{"name":"width","type":"INT","link":112,"widget":{"name":"width"}},{"name":"height","type":"INT","link":113,"widget":{"name":"height"}}],"outputs":[{"name":"LATENT","type":"LATENT","links":[116],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"EmptySD3LatentImage"},"widgets_values":[1024,1024,1]},{"id":30,"type":"ModelSamplingFlux","pos":[480,1152],"size":[315,130],"flags":{},"order":13,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":118,"slot_index":0},{"name":"width","type":"INT","link":115,"slot_index":1,"widget":{"name":"width"}},{"name":"height","type":"INT","link":114,"slot_index":2,"widget":{"name":"height"}}],"outputs":[{"name":"MODEL","type":"MODEL","links":[54,55],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"ModelSamplingFlux"},"widgets_values":[1.15,0.5,1024,1024]},{"id":35,"type":"PrimitiveNode","pos":[672,480],"size":[210,82],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"INT","type":"INT","links":[113,114],"slot_index":0,"widget":{"name":"height"}}],"title":"height","properties":{"Run widget replace on values":false},"widgets_values":[1024,"fixed"],"color":"#323","bgcolor":"#535"},{"id":37,"type":"Note","pos":[480,1344],"size":[314.99755859375,117.98363494873047],"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[],"properties":{"text":""},"widgets_values":["The reference sampling implementation auto adjusts the shift value based on the resolution, if you don't want this you can just bypass (CTRL-B) this ModelSamplingFlux node.\n"],"color":"#432","bgcolor":"#653"},{"id":39,"type":"UltimateSDUpscale","pos":[946.881591796875,923.532958984375],"size":[315,614],"flags":{},"order":20,"mode":4,"inputs":[{"name":"image","type":"IMAGE","link":126},{"name":"model","type":"MODEL","link":129},{"name":"positive","type":"CONDITIONING","link":121},{"name":"negative","type":"CONDITIONING","link":122},{"name":"vae","type":"VAE","link":123},{"name":"upscale_model","type":"UPSCALE_MODEL","link":119}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[125],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"UltimateSDUpscale"},"widgets_values":[2,721384036564318,"randomize",20,8,"euler","normal",0.2,"Linear",512,512,8,32,"None",1,64,8,16,true,false]},{"id":40,"type":"UpscaleModelLoader","pos":[950.881591796875,810.532958984375],"size":[315,58],"flags":{},"order":4,"mode":4,"inputs":[],"outputs":[{"name":"UPSCALE_MODEL","type":"UPSCALE_MODEL","links":[119],"shape":3}],"properties":{"Node name for S&R":"UpscaleModelLoader"},"widgets_values":["4xUltrasharp.pth"]},{"id":42,"type":"SaveImage","pos":[1290.881591796875,815.532958984375],"size":[663.0575561523438,706.5049438476562],"flags":{},"order":21,"mode":4,"inputs":[{"name":"images","type":"IMAGE","link":125}],"outputs":[],"properties":{},"widgets_values":["ComfyUI"]},{"id":9,"type":"SaveImage","pos":[1196,152],"size":[558.0821533203125,536.7359619140625],"flags":{},"order":19,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":9}],"outputs":[],"properties":{},"widgets_values":["ComfyUI"]},{"id":41,"type":"CLIPTextEncode","pos":[988.7509155273438,1576.3458251953125],"size":[210,112.88722229003906],"flags":{"collapsed":true},"order":11,"mode":4,"inputs":[{"name":"clip","type":"CLIP","link":130}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[121,122],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":[""]},{"id":6,"type":"CLIPTextEncode","pos":[384,240],"size":[422.84503173828125,164.31304931640625],"flags":{},"order":10,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":127}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[41],"slot_index":0}],"title":"CLIP Text Encode (Positive Prompt)","properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["A stunning cowgirl with sun-kissed skin, wearing a wide-brimmed cowboy hat, a plaid shirt tied at the waist, and denim jeans with boots. She stands confidently in the golden light of a sunset, with a rustic wooden fence and open prairie behind her. Her hair is flowing in the wind, and her expression is warm and inviting. Highly detailed, photorealistic, soft shadows, cinematic lighting.\n"],"color":"#232","bgcolor":"#353"},{"id":43,"type":"UnetLoaderGGUF","pos":[11,159],"size":[315,58],"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[128,129],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"UnetLoaderGGUF"},"widgets_values":["flux1-dev-Q8_0.gguf"]},{"id":44,"type":"DualCLIPLoaderGGUF","pos":[10,263],"size":[315,106],"flags":{},"order":6,"mode":0,"inputs":[],"outputs":[{"name":"CLIP","type":"CLIP","links":[127,130],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"DualCLIPLoaderGGUF"},"widgets_values":["t5-v1_1-xxl-encoder-Q8_0.gguf","clip_l.safetensors","flux"]},{"id":10,"type":"VAELoader","pos":[13,538],"size":[311.81634521484375,60.429901123046875],"flags":{},"order":7,"mode":0,"inputs":[],"outputs":[{"name":"VAE","type":"VAE","links":[12,123],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"VAELoader"},"widgets_values":["ae.safetensors"]},{"id":34,"type":"PrimitiveNode","pos":[432,480],"size":[210,82],"flags":{},"order":8,"mode":0,"inputs":[],"outputs":[{"name":"INT","type":"INT","links":[112,115],"slot_index":0,"widget":{"name":"width"}}],"title":"width","properties":{"Run widget replace on values":false},"widgets_values":[1024,"fixed"],"color":"#323","bgcolor":"#535"},{"id":38,"type":"FluxLoraLoader","pos":[11,413],"size":[315,82],"flags":{},"order":9,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":128}],"outputs":[{"name":"MODEL","type":"MODEL","links":[118],"slot_index":0,"shape":3}],"properties":{"Node name for S&R":"FluxLoraLoader"},"widgets_values":["realism_lora.safetensors",1]}],"links":[[9,8,0,9,0,"IMAGE"],[12,10,0,8,1,"VAE"],[19,16,0,13,2,"SAMPLER"],[20,17,0,13,3,"SIGMAS"],[24,13,0,8,0,"LATENT"],[30,22,0,13,1,"GUIDER"],[37,25,0,13,0,"NOISE"],[41,6,0,26,0,"CONDITIONING"],[42,26,0,22,1,"CONDITIONING"],[54,30,0,22,0,"MODEL"],[55,30,0,17,0,"MODEL"],[112,34,0,27,0,"INT"],[113,35,0,27,1,"INT"],[114,35,0,30,2,"INT"],[115,34,0,30,1,"INT"],[116,27,0,13,4,"LATENT"],[118,38,0,30,0,"MODEL"],[119,40,0,39,5,"UPSCALE_MODEL"],[121,41,0,39,2,"CONDITIONING"],[122,41,0,39,3,"CONDITIONING"],[123,10,0,39,4,"VAE"],[125,39,0,42,0,"IMAGE"],[126,8,0,39,0,"IMAGE"],[127,44,0,6,0,"CLIP"],[128,43,0,38,0,"MODEL"],[129,43,0,39,1,"MODEL"],[130,44,0,41,0,"CLIP"]],"groups":[{"id":1,"title":"Upscale","bounding":[937,737,1027,850],"color":"#3f789e","font_size":24,"flags":{}}],"config":{},"extra":{"ds":{"scale":0.620921323059155,"offset":[396.9256138581951,26.925489114463502]},"groupNodes":{}},"version":0.4}
```

Additional Context

(Please add any additional context or steps to reproduce the error here)
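Additional context on the mechanism (my reading of the traceback, not a confirmed diagnosis): `comfy/ldm/flux/model.py` now forwards `attn_mask=` into every double-stream block (see `forward_orig`, line 143 in the trace), while the `DoubleStreamBlock` that this node pack patches in still has an older `forward()` signature without that parameter. The `TypeError` is just Python's standard complaint for that mismatch. A minimal sketch with hypothetical stand-in classes (not actual ComfyUI code):

```python
# Hypothetical stand-ins for illustration only, not ComfyUI code.

class PatchedBlock:
    # Older signature, as a custom node might have patched in: no attn_mask.
    def forward(self, img, txt, vec, pe):
        return img, txt

class CurrentBlock:
    # Signature newer ComfyUI callers expect: attn_mask is accepted.
    def forward(self, img, txt, vec, pe, attn_mask=None):
        return img, txt

# The newer block handles the extra keyword fine.
CurrentBlock().forward("img", "txt", "vec", "pe", attn_mask=None)

# The older block raises exactly the error seen in this issue.
try:
    PatchedBlock().forward("img", "txt", "vec", "pe", attn_mask=None)
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'attn_mask'
```

So the fix has to come either from the custom node (accepting `attn_mask=None` in its patched block) or from rolling ComfyUI back to a version that does not pass the argument.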

@arso7

arso7 commented Dec 22, 2024

I encountered the same error. Please help.

@CAOTTAA

CAOTTAA commented Dec 22, 2024

Same error here...

@Skallagrimr

same error

1 similar comment
@mkygogo

mkygogo commented Dec 23, 2024

same error

@kylan02

kylan02 commented Dec 23, 2024

same with me, following this workflow: https://www.patreon.com/file?h=115147229&m=373158000

@pendave

pendave commented Dec 23, 2024

Same here.

@pendave

pendave commented Dec 23, 2024

```
!!! Exception during processing !!! DoubleStreamBlock.forward() got an unexpected keyword argument 'attn_mask'
Traceback (most recent call last):
  File "E:\ComfyUI_Portable\ComfyUI\execution.py", line 328, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "E:\ComfyUI_Portable\ComfyUI\execution.py", line 203, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "E:\ComfyUI_Portable\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "E:\ComfyUI_Portable\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "E:\ComfyUI_Portable\ComfyUI\nodes.py", line 1503, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
  File "E:\ComfyUI_Portable\ComfyUI\nodes.py", line 1470, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "E:\ComfyUI_Portable\ComfyUI\comfy\sample.py", line 43, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 1013, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 911, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 897, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 866, in outer_sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 850, in inner_sample
    samples = executor.execute(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 707, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "E:\ComfyUI_Portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 379, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 832, in __call__
    return self.predict_noise(*args, **kwargs)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 835, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 359, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 195, in calc_cond_batch
    return executor.execute(model, conds, x_in, timestep, model_options)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\samplers.py", line 308, in _calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\model_base.py", line 129, in apply_model
    return comfy.patcher_extension.WrapperExecutor.new_class_executor(
  File "E:\ComfyUI_Portable\ComfyUI\comfy\patcher_extension.py", line 110, in execute
    return self.original(*args, **kwargs)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\model_base.py", line 158, in _apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
  File "E:\ComfyUI_Portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "E:\ComfyUI_Portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "E:\ComfyUI_Portable\ComfyUI\comfy\ldm\flux\model.py", line 204, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options, attn_mask=kwargs.get("attention_mask", None))
  File "E:\ComfyUI_Portable\ComfyUI\comfy\ldm\flux\model.py", line 143, in forward_orig
    img, txt = block(img=img,
  File "E:\ComfyUI_Portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "E:\ComfyUI_Portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
TypeError: DoubleStreamBlock.forward() got an unexpected keyword argument 'attn_mask'
```

@jerzysobski

I'm getting the same error.

@dxdpxl

dxdpxl commented Jan 1, 2025

Exactly the same.

@dxdpxl

dxdpxl commented Jan 1, 2025

Resetting ComfyUI to v0.3.7 with `git checkout v0.3.7` inside the inner ComfyUI folder lets us keep using x-flux. Thanks to the people who posted the solution in a ComfyUI-PuLID-Flux-Enhanced issue thread. Hope they fix it soon!
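For anyone trying the rollback, a sketch of the commands (the install path is a placeholder, and this assumes a git-based ComfyUI install rather than a standalone build):

```shell
# Pin ComfyUI to the v0.3.7 tag, the last release reported working here.
cd /path/to/ComfyUI      # adjust to your own install
git fetch --tags         # make sure the release tags are available locally
git checkout v0.3.7      # detaches HEAD at the v0.3.7 release
# Later, to return to the latest code:
#   git checkout master && git pull
```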


9 participants