Support XPU for auto-parallel LLaMa #9796
6.45% of diff hit (target 80.00%)
Annotations

codecov/patch warnings — added lines not covered by tests:

- paddlenlp/trainer/auto_trainer.py: #L29, #L527-L528
- paddlenlp/transformers/llama/modeling_auto.py: #L218, #L240, #L245, #L319, #L324, #L331, #L958-L970, #L972-L973, #L1205, #L1207, #L1209-L1210, #L1219