Model URL fails to import into MLCChat for Android #10
Comments
Adding a model via "https://huggingface.co/llSourcell/doctorGPT_mini/blob/main/" doesn't fail in the way mentioned above, so I'm pretty sure it retrieves and reads the JSON. Unfortunately it force-closes (FCs) immediately after that. Samsung SM-T970 running Tiramisu (13) with a Snapdragon 865 w/ 8 GB RAM
Tried the above URL and it FC'd for me as well; can't tell whether it's an issue with the payload at the URL or strictly a problem with MLCChat. Updated the OP with my phone model.
After digging I found it works with the following URL format: https://huggingface.co/llSourcell/doctorGPT_mini/resolve/main/ The final slash is important; it wouldn't work without it for me!
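For anyone scripting this, the rewrite described above — swapping the web-view `/blob/` path segment for the raw-file `/resolve/` segment and making sure a trailing slash is present — can be sketched as a small helper. This is an illustrative snippet, not part of MLCChat:

```python
def to_resolve_url(url: str) -> str:
    """Rewrite a Hugging Face /blob/ web URL to the raw /resolve/ form,
    ensuring the trailing slash that MLCChat appears to require."""
    # /blob/ serves an HTML page; /resolve/ serves the raw file contents.
    url = url.replace("/blob/", "/resolve/", 1)
    if not url.endswith("/"):
        url += "/"
    return url

print(to_resolve_url("https://huggingface.co/llSourcell/doctorGPT_mini/blob/main/"))
# https://huggingface.co/llSourcell/doctorGPT_mini/resolve/main/
```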
So, i.e.
Using that URL I was able to add it as an entry to MLCChat, proceed to download, and start a chat. However, there was no response when I tried to ask a question, no matter whether I asked a medical question or just said "hi": just a little blank, empty response bubble that never expands. Again, I can't tell whether it's a problem with the URL, the payload at the URL, or some issue with MLCChat, but my phone should have plenty of resources to run it. Was anyone else able to generate responses with that URL?
I can't get the "stock" models to work :/ |
Same issue with Samsung Galaxy S21 Ultra. Any luck? |
@jwwpua no. Like other commenters, I confirmed I was unable to get any models to work in MLCChat, which is odd given that my phone certainly has enough RAM etc. for it. I did a fair amount of digging for any other way to run Hugging Face models on Android, and it seems there's nothing. It's not really this project's fault that there's little to no support for running models on Android, but it does claim to work, so I guess this issue is still useful for others who might be confused about why it isn't working. If anyone has a suggestion of what might be wrong or worth trying, I'm certainly happy to give it a go; otherwise, the next thing to try would be cracking open the source code of MLCChat and compiling my own version, which would take more time than I have to spare for a stab in the dark.
It just says "Add model failed: " followed by the URL. Sorry I can't be more help here, but I'll be happy to confirm that the URL works if/when it gets fixed. It seems all Android users will be unable to use this unless something presumed to be obvious is missing from the instructions. The URL specified by the instructions is: https://huggingface.co/llSourcell/doctorGPT_mini
Asus ROG Phone 6
Tiramisu (13)
Qualcomm SM8475 Snapdragon 8+ Gen 1
Adreno 730
18GB RAM