InternalError when running llava model #2966
Any input on this problem would be very helpful, since I have not managed to solve it myself yet. To begin with, should I interpret this as an out-of-memory error, or as a bug while allocating memory? I have lots of free RAM, but maybe I have missed some internal config limit in mlc-llm? I have now also tried:
I ran into the same problem as you did, but I'm not sure how to take it from here! I don't understand how one image can cause out-of-bounds memory issues or anything like that.
Okay, thanks a lot for your reply; at least I'm not alone.
Yeah, I also find it weird, and the Python process does not seem to allocate insane amounts of memory or anything. There is plenty left. It would be really nice to get any kind of answer from a maintainer, just so I know whether I should move on to another backend or keep trying.
Hey plufz!
336 is from the model config:
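For context, 336 is the input resolution of the llava-1.5 vision encoder (a CLIP ViT-L/14-336 backbone). A minimal sketch of why that number matters, using assumed field names modeled on Hugging Face-style CLIP vision configs rather than a verbatim excerpt of mlc-llm's `mlc-chat-config.json`:

```python
import json

# Hypothetical config excerpt; field names are assumptions based on
# Hugging Face CLIP vision configs, not mlc-llm's exact schema.
config_text = """
{
  "vision_config": {
    "image_size": 336,
    "patch_size": 14
  }
}
"""

vision = json.loads(config_text)["vision_config"]
image_size = vision["image_size"]  # expected input resolution (336 x 336)
patch_size = vision["patch_size"]  # ViT-L/14 patch size

# The encoder splits the image into a fixed grid of patches; an input
# that is not image_size x image_size breaks that assumption, which is
# one plausible path to an out-of-bounds access downstream.
patches_per_side = image_size // patch_size
num_image_tokens = patches_per_side ** 2
print(image_size, num_image_tokens)  # 336 576
```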
Thanks a million, that was it! Is this a bug? From what I can see, there is a function in the llava model that scales images according to the config's image size value. I didn't dig into when that scaling function is called, though.
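For anyone hitting the same crash, one possible workaround sketch is to resize images to the config's resolution before passing them to the model. This uses Pillow, and the hardcoded 336 is an assumption taken from the discussion above, not something mlc-llm documents as required:

```python
from PIL import Image

IMAGE_SIZE = 336  # assumed from the llava-1.5 config discussed above

def preprocess_for_llava(img: Image.Image, size: int = IMAGE_SIZE) -> Image.Image:
    """Resize an image to the square resolution the vision encoder expects."""
    # Note: plain square resize distorts aspect ratio; it is only meant
    # to match the encoder's expected input shape.
    return img.convert("RGB").resize((size, size))

# In-memory demo image so the snippet is self-contained.
demo = Image.new("RGB", (1024, 768), color=(128, 128, 128))
resized = preprocess_for_llava(demo)
print(resized.size)  # (336, 336)
```

If the model's own preprocessing is supposed to do this scaling, applying it yourself is only a stopgap until the underlying bug is fixed.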
Yeah, we should keep this issue open until we get a response from one of the llava maintainers! It seems like a bug that needs to be addressed/fixed.
❓ InternalError when running llava model
I'm new to mlc-llm and I'm not sure whether this is a bug or something I'm doing incorrectly. So far I have not managed to run any model successfully. I have tried a tiny llama model and a llava-1.5 model. I converted the weights and compiled following the docs, and from what I could see in the output, those steps succeeded.
Environment
Python 3.11 (conda)
macOS 13.6.4
M1 Max, 64 GB RAM
Convert and compile
Running the model
Output
Also note that the process does not exit after producing the output; I need to Ctrl-C to stop it.