When fine-tuning llama, approximately how much GPU memory is required for training?

Hi @zty07, sorry for the extremely late response. Could you please clarify which experiment you are interested in running? The memory required depends on the task (specifically the sequence length). Unfortunately, the quantization code is somewhat broken at the moment, but it will be fixed soon, which should help lower the memory requirements.
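As a rough starting point while waiting for an exact answer, a generic back-of-envelope estimate (not specific to this repo) counts weights, gradients, and Adam optimizer states; the function name and byte counts below are illustrative assumptions, and activation memory (which scales with sequence length and batch size, as noted above) is excluded:

```python
def estimate_finetune_memory_gb(n_params: float, param_bytes: int = 2,
                                grad_bytes: int = 2, optimizer_bytes: int = 8) -> float:
    """Rough lower bound on GPU memory for full fine-tuning.

    Counts only weights (fp16), gradients (fp16), and Adam optimizer
    states (two fp32 moments per parameter). Activation memory, which
    grows with sequence length and batch size, is NOT included.
    """
    total_bytes = n_params * (param_bytes + grad_bytes + optimizer_bytes)
    return total_bytes / 1e9

# e.g. a 7B-parameter model in fp16 with Adam:
print(estimate_finetune_memory_gb(7e9))  # → 84.0
```

This is why quantization (and techniques like LoRA or 8-bit optimizers) matter in practice: they shrink one or more of these per-parameter terms.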