
Need clip input/output for "FluxBlockLoraLoader" #170

Open
tarkansarim opened this issue Jan 10, 2025 · 1 comment

Comments

@tarkansarim

Now that Kohya SS supports CLIP-L and T5-XXL for LoRA training, having a clip input/output on the Flux LoRA loaders has become relevant again. Are there plans to update this node, or is there a recommended workaround? I've tried just loading the clip through a separate LoRA loader, but I'm not sure whether each block also has its own LoRA weight.

@kijai
Owner

kijai commented Jan 11, 2025

Your workaround sounds fine; the transformer blocks aren't tied to the text encoder blocks. The text encoder does have its own set of blocks/layers, though I don't know if there would be any point in trying to adjust those.
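One way to verify this yourself is to list the keys inside the LoRA file and group them by the component they target. The sketch below assumes the common Kohya key prefixes (`lora_unet_` for the transformer, `lora_te1_`/`lora_te2_` for the CLIP-L and T5-XXL text encoders); the example key names are hypothetical, and actual files may use different conventions.

```python
# Sketch: group Kohya-style Flux LoRA keys by the model component they
# modify, to check whether the text encoders carry their own weights.
# Prefixes below are assumptions based on common Kohya conventions.
from collections import Counter

def count_lora_targets(keys):
    """Return a Counter of how many LoRA keys target each component."""
    prefixes = {
        "lora_te1_": "CLIP-L text encoder",
        "lora_te2_": "T5-XXL text encoder",
        "lora_unet_": "transformer blocks",
    }
    counts = Counter()
    for key in keys:
        for prefix, label in prefixes.items():
            if key.startswith(prefix):
                counts[label] += 1
                break
        else:
            counts["other"] += 1
    return counts

# Hypothetical key names for illustration only:
sample = [
    "lora_unet_double_blocks_0_img_attn_qkv.lora_down.weight",
    "lora_unet_double_blocks_0_img_attn_qkv.lora_up.weight",
    "lora_te1_text_model_encoder_layers_0_mlp_fc1.lora_down.weight",
    "lora_te2_encoder_block_0_layer_0_SelfAttention_q.lora_down.weight",
]
print(count_lora_targets(sample))
```

For a real file you could feed in the keys from `safetensors` (e.g. `safe_open(path, framework="pt").keys()`) instead of the sample list. If no `lora_te*` keys show up, the LoRA has no text-encoder weights and the separate clip LoRA loader would have nothing to apply.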
