
[BUG] Issue with Ollama and context. #1026

Open · mc9625 opened this issue Feb 2, 2025 · 2 comments
Labels
bug Something isn't working

Comments

mc9625 commented Feb 2, 2025

Describe the bug
I have installed the latest version (1.8.1) on two brand-new setups: Ubuntu 24.0.1 and Raspbian OS. On both I installed Ollama as a local service (not a Docker container). Everything works fine when I chat (e.g. I get a proper reply from the model), but no context information is passed to the prompt: the # Context part of the SystemMessage is missing. I have checked the memory page and I can find the declarative memories there.

To Reproduce
Steps to reproduce the behavior:

  1. Install Ollama as a local service
  2. Upload a document, or upload one via URL (see the sketch after these steps)
  3. Ask anything related to the uploaded content
  4. Check the system message
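
For anyone reproducing this over the HTTP API instead of the admin UI, here is a minimal sketch of step 2. It assumes a default Cheshire Cat instance on localhost:1865 and the usual Rabbit Hole upload routes; paths and ports may differ between versions, so check your instance's /docs page before relying on them.

```python
# Minimal reproduction sketch over the HTTP API (assumes a default
# Cheshire Cat instance on localhost:1865; endpoint paths may vary
# across versions -- verify against your installation's /docs page).
import requests

BASE = "http://localhost:1865"

# 2a. Upload a local document through the Rabbit Hole.
with open("my_document.pdf", "rb") as f:
    r = requests.post(f"{BASE}/rabbithole/", files={"file": f})
    r.raise_for_status()

# 2b. Or ingest a web page by URL instead.
r = requests.post(f"{BASE}/rabbithole/web",
                  json={"url": "https://example.com/page"})
r.raise_for_status()

# 3-4. Then ask a related question in the chat and inspect the server
# logs for the SystemMessage and its "# Context" section.
```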
mc9625 added the bug label on Feb 2, 2025
AlessandroSpallina (Member) commented:
What embedder do you have configured?
If you have a proper embedder, check the similarity scores you find in the memory tab; the default recall threshold is 0.7.
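
If you want to sanity-check those scores outside the Cat, a rough sketch follows: it embeds a query and a document passage directly against a local Ollama instance and computes their cosine similarity. The model name and texts are placeholders; scores below the 0.7 threshold would explain an empty # Context section.

```python
# Rough sanity check: embed two texts with a local Ollama instance and
# compare their cosine similarity against the 0.7 recall threshold.
# Assumes Ollama on localhost:11434 and an embedding model you have
# pulled ("nomic-embed-text" here is only a placeholder).
import math
import requests

OLLAMA = "http://localhost:11434"
MODEL = "nomic-embed-text"  # placeholder: use your configured embedder

def embed(text: str) -> list[float]:
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": MODEL, "prompt": text})
    r.raise_for_status()
    return r.json()["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

query = embed("the question you asked the Cat")
chunk = embed("a passage from the uploaded document")
score = cosine(query, chunk)
print(f"similarity: {score:.3f} (recall threshold: 0.7)")
```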

AlessandroSpallina (Member) commented:
Any update, @mc9625?
