ollama.create not working properly #371
Comments
Same, I am also facing this issue here...
Up until version 0.4.5 I wrapped it like so:

```python
from typing import Dict, List

from pydantic import BaseModel
import ollama


class Model:
    def __init__(
        self,
        name: str,
        model: str,
        system: str,
        options: Dict = None,
    ):
        rm_str = ['\\n', '\t']  # Remove problematic string formatting
        self.name = self._clean(str(name), rm_str, len(rm_str) * [' '])
        self.model = self._clean(str(model), rm_str, len(rm_str) * [' '])  # e.g. 'llama3.1:latest'
        self.system = f"""{self._clean(str(system), rm_str, len(rm_str) * [' '])}"""  # e.g. 'You are a helpful assistant.'
        self.options = options
        self.modelfile = f"""FROM {model}\nSYSTEM {system}\n{self._options_to_modelfile(options)}"""
        self.modelfile = self._clean(self.modelfile, rm_str, len(rm_str) * [' '])
        print(self.modelfile)
        self._create()

    def _clean(self, s: str, old: str | List[str] = '\\n', new: str | List[str] = ' ') -> str:
        """Replace each string in `old` with the corresponding string in `new`."""
        out = s
        if not isinstance(old, list):
            old = [old]
        if not isinstance(new, list):
            new = [new]
        assert len(old) == len(new)
        for o, n in zip(old, new):
            out = out.replace(o, n)
        return out

    def _create(self):
        self.create_status = ollama.create(model=self.name, modelfile=self.modelfile)

    def _options_to_modelfile(self, options: Dict = None):
        if options is None:
            res = ''
        else:
            res = 'PARAMETER ' + 'PARAMETER '.join([' '.join((k, str(v), '\n')) for k, v in options.items()])
        return res

    def _format_to_modelfile(self, format: BaseModel = None):
        if format is None:
            res = None
        else:
            res = format.model_json_schema()
        return res
```

However, with the next version the API was completely changed. Can I ask, by the way, why this was done (e.g. @pdevine)? Now you have to pass all parameters separately, see changes.
Hey guys, sorry about the confusion here. We had a breaking change in the API for the create endpoint. This was because the Modelfile isn't a great way to serialize data, and we wanted to simplify how it works and make it more RESTful. We should have bumped the version properly since it's a breaking change. There is new documentation on how the format works in the main repo, as well as an example. @iNLyze for your example you can just send each of the args in directly.
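To illustrate the migration described above, here is a minimal sketch of how the old Modelfile fields map onto the new keyword arguments (assuming ollama-python >= 0.4.6, where `from_`, `system`, and `parameters` replace the `modelfile` string). The actual `ollama.create(**kwargs)` call needs a running Ollama server, so only the argument dict is built here; the `create_kwargs` helper is hypothetical:

```python
# Sketch: mapping the old Modelfile fields onto the new ollama.create()
# keyword arguments (assumed ollama-python >= 0.4.6). Only the argument
# dict is built; calling ollama.create(**kwargs) requires a live server.

def create_kwargs(name, base_model, system, options=None):
    """Build keyword arguments for the post-0.4.5 ollama.create() API."""
    kwargs = {
        "model": name,        # name of the model to create
        "from_": base_model,  # replaces `FROM ...` in the old Modelfile
        "system": system,     # replaces `SYSTEM ...`
    }
    if options:
        kwargs["parameters"] = options  # replaces `PARAMETER ...` lines
    return kwargs

kwargs = create_kwargs(
    "my-assistant", "llama3.1:latest",
    "You are a helpful assistant.",
    {"temperature": 0.2, "num_ctx": 4096},
)
# With a running server you would then call: ollama.create(**kwargs)
print(kwargs["from_"])  # llama3.1:latest
```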
Hi @pdevine, I tried the latest way of creating a model but I am still getting the error below: `ResponseError: path or Modelfile are required`. Can you please look into this?
Yeah, I'm having the same issue as well.
@mustangs0786 @nodeMevK could you please provide your Ollama versions as well? You should not be running into that anymore...
@ParthSareen I'm using Ollama 0.4.6. I get this error when running examples/create.py, and the same error when running the example from the README.
@nodeMevK |
@ParthSareen ah, thank you. I had updated the ollama-python version but not Ollama itself; this fixed my issue.
@nodeMevK Glad to hear that! Sorry about the change; we should've bumped the major version to signal the breaking change.
Could somebody please provide a working Python example using ollama.create? So far I have been unsuccessful. I also noticed that there are no examples of this type in the examples/ folder. Thanks.