
Local run LLM support? #9

Open
kwanLeeFrmVi opened this issue Jan 20, 2025 · 2 comments

Comments

@kwanLeeFrmVi

With the rise of locally runnable LLMs, I appreciate how Windsurf IDE operates, but I also prefer to keep my code local. An alternative would be to use the Continue extension in VS Code, but I enjoy using Windsurf more. If possible, could you guide me on how to forward the Windsurf chat API to my localhost? Thank you!
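One way such forwarding could work, assuming Windsurf's chat backend can be pointed at a custom base URL (the project does not confirm this) and that an OpenAI-compatible local server such as Ollama is listening on `localhost:11434` (an illustrative choice, not a documented Windsurf setting), is a small URL-rewriting proxy. The helper below is a minimal sketch of the rewriting step only:

```python
# Sketch: rewrite a remote chat-API URL so it targets a local LLM server.
# LOCAL_BASE and the port are hypothetical; adjust to your local setup.
from urllib.parse import urlsplit, urlunsplit

LOCAL_BASE = "http://localhost:11434"  # assumed local OpenAI-compatible endpoint

def to_local(url: str) -> str:
    """Swap the scheme and host of `url` for the local server's,
    preserving the original path, query, and fragment."""
    remote = urlsplit(url)
    local = urlsplit(LOCAL_BASE)
    return urlunsplit(
        (local.scheme, local.netloc, remote.path, remote.query, remote.fragment)
    )

# Example: a cloud chat-completions URL becomes a localhost one.
print(to_local("https://api.example.com/v1/chat/completions"))
```

A full solution would wrap this in an HTTP proxy that also forwards request bodies and headers, but whether Windsurf exposes a setting to use it is exactly the open question in this issue.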

@kingparks
Owner

I like that too; I've added it to my agenda list.

@liuqiang1541

"I like that too; I've added it to my agenda list."

I'm not a developer or a programmer, but seeing your repository and your work makes me feel that we Chinese developers should keep striving. Keep it up, and good luck reaching that small goal!
