
ollama.create not working properly #371

Open
erlebach opened this issue Dec 11, 2024 · 10 comments

@erlebach

Could somebody please provide a working Python example using ollama.create? So far I have been unsuccessful. I also noticed that there are no examples of this type in the examples/ folder. Thanks.

@mustangs0786

Same here, I am also facing this issue.

@iNLyze

iNLyze commented Jan 17, 2025

Up until version 0.4.5 I wrapped it like so:

import ollama
from typing import Dict, List

from pydantic import BaseModel


class Model():
    def __init__(
            self,
            name: str,
            model: str,
            system: str,
            options: Dict = None,
    ):
        rm_str = ['\\n', '\t']  # Remove problematic string formatting
        self.name = self._clean(str(name), rm_str, len(rm_str)*[' '])
        self.model = self._clean(str(model), rm_str, len(rm_str)*[' '])  # e.g. 'llama3.1:latest'
        self.system = f"""{self._clean(str(system), rm_str, len(rm_str)*[' '])}"""  # e.g. 'You are a helpful assistant.'
        self.options = options
        self.modelfile = f"""FROM {model}\nSYSTEM {system}\n{self._options_to_modelfile(options)}"""
        self.modelfile = self._clean(self.modelfile, rm_str, len(rm_str)*[' '])
        print(self.modelfile)
        self._create()

    def _clean(self, s: str, old: str | List[str] = '\\n', new: str | List[str] = ' ') -> str:
        # Replace each string in `old` with the corresponding entry in `new`.
        out = s
        if not isinstance(old, list):
            old = [old]
        if not isinstance(new, list):
            new = [new]
        assert len(old) == len(new)
        for o, n in zip(old, new):
            out = out.replace(o, n)
        return out

    def _create(self):
        # Old (<= 0.4.5) client API: pass the whole Modelfile as a single string.
        self.create_status = ollama.create(model=self.name, modelfile=self.modelfile)

    def _options_to_modelfile(self, options: Dict = None):
        # Serialize the options dict as PARAMETER lines for the Modelfile.
        if options is None:
            res = ''
        else:
            res = 'PARAMETER ' + 'PARAMETER '.join([' '.join((k, str(v), '\n')) for k, v in options.items()])
        return res

    def _format_to_modelfile(self, format: BaseModel = None):
        # Convert a pydantic model into a JSON schema (for structured output).
        if format is None:
            res = None
        else:
            res = format.model_json_schema()
        return res

However, with the next version the API was completely changed. Can I ask, by the way, why this was done (e.g. @pdevine)? Now you have to pass all parameters separately, see changes.
I have not found documentation on this yet, @erlebach.

@pdevine
Copy link
Contributor

pdevine commented Jan 18, 2025

Hey guys,

Sorry about the confusion here. We had a breaking change in the API for the create endpoint. This was because the Modelfile isn't a great way to serialize data and we wanted to simplify how it works and make it more RESTful. We should have bumped the version properly since it's a breaking change.

There is new documentation on how the format works in the main repo and the example in the ollama-python repo has been updated (admittedly this is still pretty sparse and could use more examples).

@iNLyze, for your example you can just send each of the args in _create(), rename options to parameters, and you shouldn't need self.modelfile anymore.
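
For reference, a minimal sketch of what that adapted _create() could look like, following the note above; the keyword names from_, system, and parameters are my reading of the updated client API, so treat this as a sketch rather than the official example:

def _create(self):
    # Sketch: pass the pieces of the old Modelfile as separate keyword arguments
    # instead of serializing them into one string.
    self.create_status = ollama.create(
        model=self.name,
        from_=self.model,        # base model, e.g. 'llama3.1:latest'
        system=self.system,      # system prompt
        parameters=self.options  # formerly the PARAMETER lines
    )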

@mustangs0786

Hi @pdevine, I tried the latest way of creating a model:

ollama.create(model='example', from_='llama3.2', system="You are Mario from Super Mario Bros.")

but I am still getting the error below:

ResponseError: path or Modelfile are required

Can you please look into this?

@nodeMevK

Yeah, I'm having the same issue as well.

@ParthSareen
Contributor

@mustangs0786 @nodeMevK Could you please provide your Ollama versions as well? You should not be running into that anymore.

@nodeMevK

@ParthSareen I'm using Ollama 0.4.6.

I get this error when running examples/create.py:

[screenshots of the error]

and I also get the same error when running the example from the README:

[screenshots of the error]

@ParthSareen
Contributor

@nodeMevK
The API changed recently. Can you make sure the Ollama version you're running is v0.5.7? You can check with ollama -v.
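
As an aside, here is a small sketch for checking both the client library and the server from Python; it assumes the requests package is installed and that the server is at the default address, neither of which is stated in this thread:

import importlib.metadata

import requests  # assumption: requests is available in the environment

# Version of the ollama-python client package installed here
print("ollama-python:", importlib.metadata.version("ollama"))

# Version of the Ollama server itself (the same number that ollama -v reports)
resp = requests.get("http://localhost:11434/api/version", timeout=5)
print("ollama server:", resp.json()["version"])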

@nodeMevK

@ParthSareen Ah, thank you. I had updated the ollama-python version but not Ollama itself. Updating it fixed my issue.

@ParthSareen
Contributor

@nodeMevK Glad to hear that! Sorry about the change; we should have bumped the major version to signal the breaking change.
