Describe the bug
I'm using this model:
https://huggingface.co/LoneStriker/airoboros-70b-3.3-2.4bpw-h6-exl2
If I load it on a single GPU, it works perfectly with 2k context + 8-bit cache.
But if I use autosplit across 2 GPUs with 8k context, it responds with nonsense.
Is there some option I should tick to make it work? I just updated to the latest version of text-generation-webui, and the Llama 2 version of airoboros works fine with autosplit on.
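For context, the difference between the two load paths in the ExLlamaV2 backend is roughly the following. This is only a sketch based on the exllamav2 example code, not the exact code the webui runs; the model path is a placeholder.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Cache_8bit

config = ExLlamaV2Config()
config.model_dir = "/models/airoboros-70b-3.3-2.4bpw-h6-exl2"  # placeholder path
config.prepare()

# Working case: single GPU, 2k context, 8-bit cache
config.max_seq_len = 2048
model = ExLlamaV2(config)
model.load()                               # no gpu_split given: loads onto a single device
cache = ExLlamaV2Cache_8bit(model)

# Broken case: autosplit across both GPUs, 8k context
config.max_seq_len = 8192
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # cache tensors allocated as layers are placed
model.load_autosplit(cache)                # distributes layers across all visible GPUs
```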
Is there an existing issue for this?
Reproduction
Use the latest version of text-generation-webui.
Load this model with autosplit enabled and 8k context: https://huggingface.co/LoneStriker/airoboros-70b-3.3-2.4bpw-h6-exl2
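To narrow down whether this is a webui issue or something in the backend, the autosplit path can be tested standalone with something like the sketch below, adapted from the exllamav2 inference example. The model path and sampling values are assumptions; adjust them to your setup.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/airoboros-70b-3.3-2.4bpw-h6-exl2"  # placeholder path
config.prepare()
config.max_seq_len = 8192

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split across all visible GPUs (4090 + 3090 here)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7  # assumed sampling values
settings.top_p = 0.9

print(generator.generate_simple("hello", settings, 100))
```

If this script also produces gibberish, the problem is likely in exllamav2's device split rather than in text-generation-webui.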
Screenshot
No response
Logs
Instruct:

1 GPU log:
Input: hello
Output: Hello! I'm here to help you with any questions or tasks you may have. What can I assist you with today?

Autosplit log:
Input: hello
Output: The string to the text-to-text model 1001

System Info
4090 FE + 3090 MSI via eGPU