
cannot load model successfully? #112

Open
geeklt-fighter opened this issue Oct 2, 2024 · 1 comment

Comments

@geeklt-fighter

[screenshot attached] The selected model is undefined. I have already started Ollama.
@if-ai (Owner) commented Oct 2, 2024

Are you accessing it from the same machine? Try localhost or 127.0.0.1. If you are using a VPN, you also need to make sure the connection is allowed in its split-tunneling feature and/or your firewall rules. If you are using WSL but serving Ollama from the Windows side while ComfyUI is on the Linux side (or vice versa), you need to use your IPv4 address as the base IP. You can find the IPv4 address by pressing Win+R, launching cmd, and running ipconfig.
[screenshot: ProtonVPN split-tunneling settings]
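
A quick way to check which base IP actually reaches Ollama is to query its REST API directly before pointing ComfyUI at it. This is a minimal sketch assuming Ollama's default port 11434 and its `/api/tags` endpoint (which lists locally installed models); the `192.168.1.100` entry is a hypothetical placeholder for the IPv4 address that ipconfig reports on your machine.

```python
import json
import urllib.request

# Candidate base IPs: localhost first, then the Windows IPv4 address
# reported by ipconfig (hypothetical example; replace with your own).
BASE_IPS = ["127.0.0.1", "192.168.1.100"]
OLLAMA_PORT = 11434  # Ollama's default serving port

for ip in BASE_IPS:
    url = f"http://{ip}:{OLLAMA_PORT}/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            models = json.load(resp).get("models", [])
            names = [m.get("name") for m in models]
            print(f"{ip}: reachable, models = {names}")
    except Exception as exc:
        print(f"{ip}: not reachable ({exc})")
```

If localhost fails but the IPv4 address succeeds (typical for the WSL/Windows split described above), use that IPv4 address as the base IP in the node settings.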
