LoRA Training & Training PRO transformers training cutoff error #5903
You can only train with base models such as llama-2 or llama-3 in transformers.
I was training on Undi95_MXLewd-L2-20B (and other models) with the transformers loader. On a fresh installation, I attempted transformers training both with and without exllama and received the error below, which terminates training immediately. The model was loaded in 4-bit with auto devices, both with and without use_double_quant. The LoRA training uses raw text files.
Note that the identical data and settings were previously used successfully to create a LoRA.
When trying to load without exllama, I get the following error:
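For reference, the load and training settings described above correspond to roughly the following plain transformers/peft configuration. This is a hedged sketch, not text-generation-webui's own code: the model name, 4-bit loading, auto device map, and double-quant toggle come from the post; the quant type, compute dtype, and LoRA rank/alpha/target modules are assumptions standing in for the webui's defaults.

```python
# Sketch of the load + LoRA setup described above.
# Assumption-heavy: only the 4-bit / auto-devices / double-quant
# settings are from the post; everything else is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

MODEL = "Undi95/MXLewd-L2-20B"  # model named in the post

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # "loaded into 4bit"
    bnb_4bit_use_double_quant=True,         # toggled on/off in the post
    bnb_4bit_quant_type="nf4",              # assumption
    bnb_4bit_compute_dtype=torch.bfloat16,  # assumption
)

model = AutoModelForCausalLM.from_pretrained(
    MODEL,
    quantization_config=bnb_config,
    device_map="auto",                      # "auto devices"
)
tokenizer = AutoTokenizer.from_pretrained(MODEL)

lora_config = LoraConfig(
    r=32,                                   # assumption: rank varies by setup
    lora_alpha=64,                          # assumption
    target_modules=["q_proj", "v_proj"],    # assumption
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)  # attach LoRA adapters for training
```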