This repository has been archived by the owner on Oct 14, 2024. It is now read-only.

Feedback for “Integrate Ollama with Jan” #209

Open

rnusser opened this issue May 16, 2024 · 2 comments

Comments


rnusser commented May 16, 2024

Firstly, thanks for a great app. I use it all the time!

While reading the guide "Integrate Ollama with Jan", I presumed it meant you could access the Ollama models already being served by your Ollama server.

The guide mentions downloading the model from the Hub. Surely, if you download the model from the Hub, you wouldn't need to integrate with your Ollama models at all.

Is there a way to "use" the models that have already been downloaded into Ollama?
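
For reference, a minimal sketch of the kind of setup being asked about, assuming Ollama is running locally and exposing its OpenAI-compatible API at the default base URL http://localhost:11434/v1. The model name "llama3" is only an example of a model already pulled with `ollama pull`; Jan's generic OpenAI-compatible engine could presumably be pointed at the same base URL instead of downloading the model again from the Hub.

```python
# Minimal sketch, not Jan's actual integration code: it only demonstrates
# that models already pulled into Ollama are reachable through Ollama's
# OpenAI-compatible endpoint, which any OpenAI-compatible client can use.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # default local Ollama server; /v1 is its OpenAI-compatible API
    api_key="ollama",                      # placeholder; Ollama does not check the key
)

# "llama3" is a hypothetical example; use any model shown by `ollama list`.
response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello from a model served by Ollama."}],
)
print(response.choices[0].message.content)
```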


CeKMTL commented Jun 19, 2024

I second that request; create a new server entry in JAN (instead of the hackish instruction to use open) that is specifically tailored to Ollama, and let us run our Ollama models in the beauty and comfort of JAN.
Thank you.

PS: A TON of people have wanted this functionality for some time now...

@igor-elbert

Upvote from me!
