[Trainer] Wrap model when lora is ON and only do evaluation. (#9803)
wtmlon authored Jan 23, 2025
1 parent 30fa8b9 · commit e247c85
Showing 1 changed file with 3 additions and 0 deletions.
paddlenlp/trainer/trainer.py: 3 additions & 0 deletions
@@ -3123,6 +3123,9 @@ def evaluation_loop(
             if self.model is self.model_wrapped and isinstance(self.model_wrapped, PipelineLayer):
                 # NOTE(gongenlei): when do_train=False, do_eval=True, we need to wrap model for pipeline
                 self.model_wrapped = fleet.distributed_model(self.model_wrapped)
+            if isinstance(self.model_wrapped, LoRAModel) and isinstance(self.model_wrapped.model, PipelineLayer):
+                # NOTE(liuting): when do_train=False, do_eval=True, lora=True, we need to wrap model for pipeline
+                self.model_wrapped = fleet.distributed_model(self.model_wrapped.model)
             model = self.model_wrapped
         else:
             model = self.model
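For context: this hunk runs inside Trainer.evaluation_loop when pipeline parallelism is enabled. In an evaluation-only run (do_train=False, do_eval=True) with LoRA, the pre-existing check misses the model because a LoRAModel is not itself a PipelineLayer; it holds the PipelineLayer in its .model attribute, so the inner model must be unwrapped one level before being handed to fleet. A minimal sketch of the resulting logic as a standalone helper (the function name wrap_for_pipeline_eval is hypothetical; fleet, PipelineLayer, and LoRAModel are the same names trainer.py uses):

from paddle.distributed import fleet
from paddle.distributed.fleet.meta_parallel import PipelineLayer
from paddlenlp.peft import LoRAModel

def wrap_for_pipeline_eval(model, model_wrapped):
    # Plain pipeline model: training never ran, so nothing wrapped it yet.
    if model is model_wrapped and isinstance(model_wrapped, PipelineLayer):
        model_wrapped = fleet.distributed_model(model_wrapped)
    # LoRA case: the LoRAModel wraps the PipelineLayer, so the isinstance
    # check above misses it; wrap the inner pipeline model instead.
    if isinstance(model_wrapped, LoRAModel) and isinstance(model_wrapped.model, PipelineLayer):
        model_wrapped = fleet.distributed_model(model_wrapped.model)
    return model_wrapped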
