System Info

- `transformers` version: 4.46.3
- Platform: Linux
- Accelerate version: 1.3.0
Information

Tasks

- One of the scripts in the examples/ folder of Accelerate or an officially supported `no_trainer` script in the `examples` folder of the `transformers` repo (such as `run_no_trainer_glue.py`)
- My own task or dataset (give details below)

Reproduction
The issue happens when initializing a `Trainer` object:
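(The original snippet did not survive extraction; below is a minimal sketch of the kind of initialization being described, assuming a hypothetical checkpoint name and using documented `accelerator_config` keys as stand-ins for the custom options in question.)

```python
# Minimal sketch (assumed setup, not the reporter's actual code):
# pass accelerator options through TrainingArguments.accelerator_config
# and let Trainer build the Accelerator internally.
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

# Hypothetical checkpoint; any sequence-classification model works here.
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

args = TrainingArguments(
    output_dir="out",
    # Options intended for the Accelerator that Trainer creates.
    accelerator_config={
        "split_batches": True,
        "even_batches": False,
    },
)

# Trainer builds its Accelerator during __init__, which is where the
# conditional discussed below decides what gets forwarded.
trainer = Trainer(model=model, args=args)
```

Expected behavior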
Hey!

Is there a reason why the `accelerator_config` only gets forwarded to the `Accelerator` args when Accelerate is not available (or older than `0.28.0`)?

I would like to provide custom arguments to the accelerator when I initialize the `Trainer` class, but it seems those arguments won't be picked up during accelerator construction due to the conditional here:
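For reference, a paraphrase of the pattern being described; the names and structure are illustrative assumptions rather than the verbatim `trainer.py` source. With a recent Accelerate install, only the dataloader-related settings are repackaged and forwarded, which is why other entries would not reach the `Accelerator` constructor:

```python
# Illustrative paraphrase of the conditional (assumed shape, not verbatim
# transformers source): with Accelerate >= 0.28.0, accelerator_config is
# folded into a DataLoaderConfiguration instead of being passed through.
from accelerate import Accelerator
from accelerate.utils import DataLoaderConfiguration
from transformers.utils import is_accelerate_available

accelerator_config = {"split_batches": True, "even_batches": False}

accelerator_args = {}
if is_accelerate_available("0.28.0"):
    # Newer path: only dataloader-related keys survive, repackaged as a
    # DataLoaderConfiguration object.
    accelerator_args["dataloader_config"] = DataLoaderConfiguration(
        split_batches=accelerator_config.pop("split_batches"),
        even_batches=accelerator_config.pop("even_batches"),
    )
    # Whatever remains in accelerator_config is not forwarded.
else:
    # Older path (< 0.28.0): the config keys are passed straight through
    # as keyword arguments to Accelerator.
    accelerator_args.update(accelerator_config)

accelerator = Accelerator(**accelerator_args)
```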