
PulidFluxInsightFaceLoader Error #11

Open
lunatico67 opened this issue Jan 23, 2025 · 10 comments

Comments

@lunatico67

Hi
I get this error:

ComfyUI Error Report

Error Details

  • Node ID: 24
  • Node Type: PulidFluxInsightFaceLoader
  • Exception Type: AssertionError
  • Exception Message:

Stack Trace

  File "F:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "F:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "F:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)

  File "F:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "F:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_PuLID_Flux_ll\pulidflux.py", line 132, in load_insightface
    model = FaceAnalysis(name="antelopev2", root=INSIGHTFACE_DIR, providers=[provider + 'ExecutionProvider',]) # alternative to buffalo_l
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "F:\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\insightface\app\face_analysis.py", line 43, in __init__
    assert 'detection' in self.models
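The `assert 'detection' in self.models` failure means insightface scanned the antelopev2 model directory and found no detection model, which typically indicates the download is missing or incomplete. As a quick diagnostic, this sketch checks the folder contents; the expected filenames below are assumptions based on the usual antelopev2 release, not something verified in this thread:

```python
import os

# Filenames usually shipped with the antelopev2 pack (assumed, not verified here).
EXPECTED_FILES = {
    "1k3d68.onnx",
    "2d106det.onnx",
    "genderage.onnx",
    "glintr100.onnx",
    "scrfd_10g_bnkps.onnx",  # the detection model the assertion is about
}

def missing_antelopev2_files(model_dir):
    """Return the set of expected ONNX files not present in model_dir."""
    present = set(os.listdir(model_dir)) if os.path.isdir(model_dir) else set()
    return EXPECTED_FILES - present
```

Pointing this at `<INSIGHTFACE_DIR>/models/antelopev2` would show whether the detection model (`scrfd_10g_bnkps.onnx` in this assumed layout) is actually on disk.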
@lldacing
Owner

@lunatico67
Author

Thank you very much for your help, lldacing. I didn't have the EVA-CLIP model; I downloaded it and placed it in the ComfyUI/models/clip folder, but I still get the same error.

@lldacing
Owner

lldacing commented Jan 23, 2025

> Thank you very much for your help, lldacing. I didn't have the EVA-CLIP model; I downloaded it and placed it in the ComfyUI/models/clip folder, but I still get the same error.

@lunatico67 Did you restart ComfyUI? I haven't verified whether this model can be downloaded manually.

@lunatico67
Author

Yes; I have restarted ComfyUI and placed the model EVA02_CLIP_L_336_psz14_s6B.pt in ComfyUI_windows_portable\ComfyUI\models\clip.

@lunatico67
Author

When I start ComfyUI I don't get the typical "import failed" error for ComfyUI PuLID Flux ll:

Image

@lldacing
Owner

> When I start ComfyUI I don't get the typical "import failed" error for ComfyUI PuLID Flux ll:
>
> Image

It happens at runtime.
I think you should keep a good network connection and let the model auto-download.

@lunatico67
Author

So you think the error is that the Load Eva Clip (PuLID Flux) node does not detect the EVA02_CLIP_L_336_psz14_s6B.pt model?
It would be a good idea to have the option to select the model, as in the DualCLIPLoader node.
I keep getting the same error:

Image

@lldacing
Owner

> So you think the error is that the Load Eva Clip (PuLID Flux) node does not detect the EVA02_CLIP_L_336_psz14_s6B.pt model? It would be a good idea to have the option to select the model, as in the DualCLIPLoader node. I keep getting the same error:
>
> Image

Image

Yes, if the error is the same. If you have time, you can try modifying pulidflux.py:

model, _, _ = create_model_and_transforms('EVA02-CLIP-L-14-336', 'eva_clip', force_custom_clip=True, cache_dir=os.path.join(folder_paths.models_dir, "clip"))

I will check the path handling for the EVA-CLIP model in a future update. I don't have a PC for testing right now.
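As a quick sanity check for the suggested `cache_dir` change, this sketch (assuming the portable-install layout from the stack trace, and assuming eva_clip looks for the checkpoint by this exact filename inside `cache_dir`, which is not confirmed in this thread) builds the expected path and verifies the file exists:

```python
import os

def eva_clip_checkpoint_path(models_dir, filename="EVA02_CLIP_L_336_psz14_s6B.pt"):
    """Expected location of the EVA-CLIP checkpoint under ComfyUI's models dir.

    The filename is the one mentioned in this thread; whether the eva_clip
    loader resolves exactly this name inside cache_dir is an assumption.
    """
    return os.path.join(models_dir, "clip", filename)

def checkpoint_present(models_dir):
    """True if the checkpoint file is on disk at the expected path."""
    return os.path.isfile(eva_clip_checkpoint_path(models_dir))
```

Running `checkpoint_present(folder_paths.models_dir)` from a ComfyUI session would confirm whether the file sits where the modified `cache_dir` points.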

@lunatico67
Author

Thank you for your help. Still the same error. Don't worry, I'll wait until it's fixed in the next update.

@lldacing
Owner

> Thank you for your help. Still the same error. Don't worry, I'll wait until it's fixed in the next update.

Sorry, I pointed you at the wrong node.
I had only seen the error message at first; after looking at your screenshot I realized the mistake.
Did you download all of the antelopev2 model files?
