System Info
Using transformers built after the commit referenced in https://github.com/huggingface/transformers/issues/34176; see the discussion in that issue for details.
Information
The official example scripts
My own modified scripts
Tasks
One of the scripts in the examples/ folder of Accelerate or an officially supported no_trainer script in the examples folder of the transformers repo (such as run_no_trainer_glue.py)
My own task or dataset (give details below)
Reproduction
We can't correctly use clip_grad_norm_ in Accelerator with XLA FSDP. clip_grad_norm_ currently decides whether to take the FSDP path by checking the "use FSDP" environment flag (src/accelerate/accelerator.py, line 2375 at a84327e). However, Accelerator does not integrate XLA FSDP, so we should not set that flag to true. I want to know how we can modify the code to make it correct.
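As a stopgap, clipping can be done on the XLA FSDP wrapper itself instead of going through the Accelerator. The sketch below is a minimal illustration, assuming torch_xla's XlaFullyShardedDataParallel, which exposes its own sharding-aware clip_grad_norm_ (mirroring PyTorch/fairscale FSDP); the toy model and training step are hypothetical, not taken from the issue.

```python
import torch
import torch.nn as nn
from torch_xla.distributed.fsdp import XlaFullyShardedDataParallel as XlaFSDP

# Hypothetical toy setup; in practice this is the model you would otherwise
# pass through accelerator.prepare(...).
model = XlaFSDP(nn.Linear(128, 128))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

def training_step(batch, targets):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(batch), targets)
    loss.backward()
    # Bypass accelerator.clip_grad_norm_ (which only takes its FSDP path when
    # the FSDP env flag is set) and clip on the sharded wrapper directly:
    model.clip_grad_norm_(max_norm=1.0)
    optimizer.step()
    return loss
```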
Expected behavior
Modify clip_grad_norm_ so that it works correctly with XLA FSDP. Is there any plan to integrate XLA FSDP into Accelerator? I believe that would be a better way to solve the problem, and it would work in any scenario that uses Accelerator.
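For what such an integration might look like: the sketch below is hypothetical and is not Accelerate's actual code (the real method lives at src/accelerate/accelerator.py, line 2375 at a84327e). It only illustrates the shape of the change: detect an XLA FSDP-wrapped model and delegate to its sharding-aware clip_grad_norm_, the same way the existing FSDP branch delegates to the PyTorch FSDP wrapper.

```python
import torch
from accelerate.utils import DistributedType

try:
    from torch_xla.distributed.fsdp import XlaFullyShardedDataParallel as XlaFSDP
except ImportError:  # torch_xla not installed
    XlaFSDP = None

# Hypothetical sketch of an XLA FSDP-aware Accelerator.clip_grad_norm_;
# structure and names are assumptions for illustration only.
def clip_grad_norm_(self, parameters, max_norm, norm_type=2):
    self.unscale_gradients()
    parameters = list(parameters)
    for model in self._models:
        # Lists of identical tensor objects compare equal via Python's
        # identity shortcut, so this matches the model whose parameters
        # the caller passed in.
        if parameters == list(model.parameters()):
            if self.distributed_type == DistributedType.FSDP:
                return model.clip_grad_norm_(max_norm, norm_type)  # existing path
            if XlaFSDP is not None and isinstance(model, XlaFSDP):
                return model.clip_grad_norm_(max_norm, norm_type)  # proposed path
    return torch.nn.utils.clip_grad_norm_(parameters, max_norm, norm_type=norm_type)
```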