Our model supports optimizing answers based on different knowledge bases.
Before that, however, we need to determine which knowledge base to use and add it to the prompt.
Searching a knowledge base is a function call.
So the prompt has to be generated through a function; in other words, the prompt is generated dynamically.
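The "dynamically generated prompt" idea above can be sketched as a small function pipeline: retrieve passages from a knowledge base first, then splice them into the prompt. This is only a minimal sketch; `search_knowledge_base` is a hypothetical stand-in for whatever retrieval API the knowledge-base service actually exposes, not a real SDK call.

```python
# Hypothetical sketch: build the prompt dynamically from a knowledge-base search.
# `search_knowledge_base` and its return shape are assumptions for illustration.

def search_knowledge_base(query: str) -> list[str]:
    # Placeholder retrieval: a real implementation would call the
    # knowledge-base service and return matching passages.
    corpus = {
        "sorting": "Use list.sort() for in-place sorting in Python.",
        "http": "The requests library is commonly used for HTTP calls.",
    }
    return [text for key, text in corpus.items() if key in query.lower()]

def build_prompt(user_question: str) -> str:
    # Retrieve relevant passages first, then splice them into the prompt,
    # so the prompt is produced by a function rather than hard-coded.
    passages = search_knowledge_base(user_question)
    context = "\n".join(f"- {p}" for p in passages) or "- (no matches)"
    return (
        "Answer using the knowledge-base excerpts below.\n"
        f"Knowledge base:\n{context}\n\n"
        f"Question: {user_question}"
    )

print(build_prompt("How do I do sorting in Python?"))
```

The key point is that the retrieval step runs before every request, so each prompt reflects whichever knowledge base matched the current question.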
Which version of the app are you using?
3.7.5
Which API Provider are you using?
OpenAI Compatible
Which Model are you using?
An LLM provided by volcengine (Volcano Engine), based on DeepSeek R1
What happened?
I am a novice LLM user. I want to modify the parameters that Roo Code sends to the large model so that we can use the Volcano Engine knowledge base for code inference. How can I achieve this? Thanks!
Steps to reproduce
Relevant API REQUEST output
Additional context
No response
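Since Roo Code is configured with an "OpenAI Compatible" provider, one possible approach is to point its base URL at a small local proxy that rewrites each request body, injecting knowledge-base context, before forwarding it to the volcengine endpoint. The sketch below shows only the body transformation; `retrieve_context` is a hypothetical stand-in for the actual Volcano Engine knowledge-base API, and the model name is illustrative.

```python
# Minimal sketch of the interception idea: rewrite an OpenAI-style chat
# request body to prepend retrieved knowledge-base context.
import json

def retrieve_context(question: str) -> str:
    # Assumption: a real setup would call the Volcano Engine
    # knowledge-base API; here it returns a canned snippet.
    return "Excerpt: prefer pathlib over os.path in new code."

def inject_knowledge(request_body: dict) -> dict:
    # Deep-copy the request so the original body is left untouched.
    body = json.loads(json.dumps(request_body))
    # Find the most recent user message to use as the retrieval query.
    last_user = next(
        (m["content"] for m in reversed(body["messages"]) if m["role"] == "user"),
        "",
    )
    context = retrieve_context(last_user)
    # Prepend a system message carrying the retrieved context.
    body["messages"].insert(0, {"role": "system", "content": context})
    return body

original = {
    "model": "deepseek-r1",  # illustrative model name
    "messages": [{"role": "user", "content": "Which path API should I use?"}],
}
modified = inject_knowledge(original)
print(modified["messages"][0]["role"])  # the injected system message comes first
```

With a proxy like this in front of the model, Roo Code itself needs no changes: it keeps sending standard OpenAI-compatible requests, and the proxy adds the knowledge-base context on every call.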