Thank you for releasing this model as open-source! It's a valuable resource for the community.
I tested the inference_examples_RGB.ipynb code you provided, and it runs smoothly. However, I noticed that in your team's demo the model accepts conversational input and output, but I couldn't find this functionality in the released code. Could you explain how this is implemented? I look forward to your reply. Thank you!
Thanks for your interest! The demo is a wrapper around the core inference code with GPT-4 assistance; the model itself isn't conversational. We are considering adding the demo code to the repo, with the caveat that users would need to spin up their own GPT-4 endpoints. Let me know if that's something of interest to you.
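For readers wondering what such a wrapper might look like: a minimal sketch is below, assuming the architecture described in the reply above (an LLM endpoint handles the dialogue, while the underlying model only answers single-shot inference calls). All names here (`ConversationalWrapper`, `infer_fn`, `llm_fn`) are hypothetical illustrations, not the actual demo code.

```python
# Hypothetical sketch: a conversational layer over a non-conversational model.
# infer_fn stands in for the single-shot inference path (as in
# inference_examples_RGB.ipynb); llm_fn stands in for a GPT-4-style chat
# endpoint the user would supply. Both names are illustrative assumptions.

class ConversationalWrapper:
    def __init__(self, infer_fn, llm_fn):
        self.infer_fn = infer_fn  # single-shot model inference
        self.llm_fn = llm_fn      # chat LLM that turns dialogue into a query
        self.history = []         # (role, text) pairs

    def chat(self, user_message):
        self.history.append(("user", user_message))
        # Ask the LLM to condense the dialogue into a standalone model query.
        query = self.llm_fn(self.history)
        # Run the underlying (non-conversational) model on that query.
        result = self.infer_fn(query)
        reply = f"Model output for '{query}': {result}"
        self.history.append(("assistant", reply))
        return reply
```

In a real deployment, `llm_fn` would call the user's own GPT-4 endpoint and `infer_fn` would invoke the released inference code; the wrapper only maintains history and routes between the two.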