
Incorrect import in version 0.0.4 #680

Closed

AAndersn opened this issue Sep 26, 2024 · 1 comment · May be fixed by #681

@AAndersn
System Info

PyTorch 2.4.0, CUDA 12.1, 7x H100

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

The import of MllamaForConditionalGeneration on https://github.com/meta-llama/llama-recipes/blob/main/src/llama_recipes/finetuning.py#L24 should be on line 28 instead, since the class is part of the mllama module, not the top-level transformers library.
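For illustration, a minimal sketch of the two import paths in question; the submodule path transformers.models.mllama.modeling_mllama is an assumption based on the usual transformers layout and is not confirmed in this thread:

```python
# Top-level import used in finetuning.py; fails with an ImportError on
# transformers versions that do not re-export the class at the top level:
# from transformers import MllamaForConditionalGeneration

# Importing from the mllama submodule instead, as the report suggests
# (module path assumed from the standard transformers layout):
from transformers.models.mllama.modeling_mllama import MllamaForConditionalGeneration
```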

Error logs

The import throws an ImportError.

Expected behavior

I have fixed this in https://github.com/AAndersn/llama-recipes/blob/main/src/llama_recipes/finetuning.py#L27 and will make a PR with this change along with the int4 -> 4-bit fix in the README.

@AAndersn (Author)

The top-level import `from transformers import MllamaForConditionalGeneration` is fixed in transformers 4.45.0.
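For reference, a hedged sketch (not the project's actual fix) of how a version guard around the top-level import could look, given that the comment above places the fix in transformers 4.45.0:

```python
import transformers
from packaging import version

# MllamaForConditionalGeneration is only exported at the top level once the
# mllama model ships with transformers (4.45.0 per the comment above), so a
# guard like this makes the failure mode explicit on older installs.
if version.parse(transformers.__version__) < version.parse("4.45.0"):
    raise ImportError(
        "transformers >= 4.45.0 is required for MllamaForConditionalGeneration"
    )

from transformers import MllamaForConditionalGeneration
```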
