
What LLMs does Onlook support? #1110

Answered by Kitenite
Greatz08 asked this question in Questions


Hey @Greatz08, sorry I didn't get notified for this.

We're using Claude Sonnet by default, but it's extensible to any supported provider from the AI SDK, including Ollama.

https://github.com/onlook-dev/onlook/blob/main/apps/studio/electron/main/chat/llmProvider.ts#L20-L31
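For context, the linked provider code is essentially a switch over the supported providers that resolves a model configuration. Here's a minimal, self-contained sketch of that pattern; the provider names, model ids, and function names below are illustrative, not Onlook's actual API:

```typescript
// Hypothetical sketch of a provider-selection switch, in the spirit of
// llmProvider.ts. Names and model ids here are assumptions for illustration.
type LLMProvider = "anthropic" | "openai" | "ollama";

interface ModelConfig {
  provider: LLMProvider;
  model: string;
  baseURL?: string; // e.g. a local Ollama endpoint
}

// Map a provider choice to a concrete model configuration. With the AI SDK,
// each branch would instead construct that provider's client.
function resolveModel(provider: LLMProvider): ModelConfig {
  switch (provider) {
    case "anthropic":
      return { provider, model: "claude-3-5-sonnet-latest" };
    case "openai":
      return { provider, model: "gpt-4o" };
    case "ollama":
      // Ollama serves an OpenAI-compatible API locally by default.
      return { provider, model: "llama3", baseURL: "http://localhost:11434" };
  }
}

console.log(resolveModel("ollama"));
```

Swapping providers then comes down to adding a branch (or a config entry) rather than touching the chat logic itself.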

Here's an example usage of our chat: https://github.com/onlook-dev/onlook/blob/main/apps/studio/electron/main/chat/index.ts#L77-L89

You can check out the prompts and tools here: https://github.com/onlook-dev/onlook/blob/main/packages/ai/src

Would love to hear your thoughts on how we could architect this better :)

Answer selected by Greatz08