Describe the bug
I have installed the latest version (1.8.1) on two brand-new setups: Ubuntu 24.0.1 and Raspbian OS. On both I installed Ollama as a local service (not a Docker container). Everything works fine when I chat (e.g. I get a proper reply from the model), but no context information is passed to the prompt: the # Context part of the SystemMessage is missing. I have checked the memory page and I am able to find the declarative memories.
To Reproduce
Steps to reproduce the behavior (as an example):
1. Install Ollama as a local service
2. Upload a document, or upload via URL
3. Ask anything related to the uploaded content
4. Check the system message
What embedder do you have configured?
If you have a proper embedder, check the similarity scores shown in the memory tab; the default recall threshold is 0.7.
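To illustrate why the # Context section can be empty even when declarative memories exist, here is a minimal sketch of threshold-based recall. This is not the project's actual code; the function names, vectors, and the 0.7 default are illustrative assumptions: memories whose similarity to the query embedding falls below the threshold are simply never injected into the prompt.

```python
# Hedged sketch (not the actual implementation): threshold-based memory recall.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def recall(query_vec, memories, threshold=0.7):
    """Return stored texts whose embedding scores at or above the threshold,
    best match first. An empty result means an empty '# Context' section."""
    hits = []
    for text, vec in memories:
        score = cosine_similarity(query_vec, vec)
        if score >= threshold:
            hits.append((score, text))
    return [text for _, text in sorted(hits, reverse=True)]

# Toy embeddings standing in for real embedder output.
memories = [
    ("uploaded doc chunk", [0.9, 0.1, 0.0]),
    ("unrelated note",     [0.0, 1.0, 0.0]),
]
query = [1.0, 0.0, 0.0]

print(recall(query, memories))                   # relevant chunk clears 0.7
print(recall(query, memories, threshold=0.999))  # too strict -> empty context
```

So if the scores you see in the memory tab sit below 0.7, lowering the threshold (or checking that the same embedder was used at upload and at query time) would be the first thing to try.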