We are urlencoding the source file contents before we send to LLM #213
Comments
I'm experimenting with a fix for this as below:

```diff
diff --git a/kai/data/templates/main.jinja b/kai/data/templates/main.jinja
@@ -26,7 +26,7 @@ After you have shared your step by step thinking, provide a full output of the u
```
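The diff body is cut off above, so the following is only a minimal sketch, assuming the escaping comes from a filter such as Jinja2's built-in `urlencode` being applied to the source variable in the prompt template; the real main.jinja change is not shown here.

```python
# Illustrative sketch only: assumes the escaping comes from a filter such as
# Jinja2's built-in `urlencode`; the actual main.jinja change is not reproduced above.
from jinja2 import Template

java_src = "public List<ShoppingCartItem> getShoppingCartItemList() {"

escaped = Template("{{ src | urlencode }}").render(src=java_src)
plain = Template("{{ src }}").render(src=java_src)

print(escaped)  # public%20List%3CShoppingCartItem%3E%20getShoppingCartItemList%28%29%20%7B
print(plain)    # public List<ShoppingCartItem> getShoppingCartItemList() {
```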
Hi @jwmatthews,
Hi @SaxenaAnushka102, if you'd like to pick this up, we could use help to test this further, both by running it against an LLM you have access to and by creating unit tests.
Hello @jwmatthews,
I've noticed that sometimes the incident 'message' and 'solution' are urlencoded in the prompt when we do not want this. Example:
I'm experimenting with the change below to address this:
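The actual change being tried isn't captured here; purely as a hypothetical sketch (the field values below are made-up examples), decoding the already-encoded incident fields before they are rendered into the prompt might look like this:

```python
# Hypothetical sketch, not the actual change: decode incident fields that
# arrive already percent-encoded before rendering them into the prompt.
from urllib.parse import unquote

incident = {
    "message": "The%20import%20javax.ejb%20cannot%20be%20resolved",  # made-up example value
    "solution": "Switch%20the%20import%20to%20jakarta.ejb",          # made-up example value
}

decoded = {key: unquote(value) for key, value in incident.items()}
print(decoded["message"])   # The import javax.ejb cannot be resolved
print(decoded["solution"])  # Switch the import to jakarta.ejb
```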
I don't believe that we are seeing this in the agent prompt; I think we should consider closing this issue.
We have swapped out the templating engine, and I don't believe we should see this problem anymore.
We are starting to escape some portions of the source code we send to the LLM.
Example:
```java
public List<ShoppingCartItem> getShoppingCartItemList() {
    return "ShoppingCart [cartItemTotal=" + cartItemTotal
```
Below is an example of the source file contents in a prompt we constructed.
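The prompt excerpt itself isn't reproduced here; as a rough stand-in (assumed, not copied from the actual prompt), percent-escaping the Java snippet above shows the kind of mangling being described, and decoding recovers the original source:

```python
# Assumed reproduction, not the actual prompt contents: percent-escape the Java
# snippet above to show the kind of mangling that ends up in the constructed prompt.
from urllib.parse import quote, unquote

java_src = (
    "public List<ShoppingCartItem> getShoppingCartItemList() {\n"
    '    return "ShoppingCart [cartItemTotal=" + cartItemTotal'
)

escaped = quote(java_src)
print(escaped)  # public%20List%3CShoppingCartItem%3E%20getShoppingCartItemList%28%29%20%7B%0A...
print(unquote(escaped) == java_src)  # True -- decoding recovers the original source
```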