
update multigpu readme and MllamaForConditionalGeneration import #681

Open

wants to merge 1 commit into base: main

Conversation

AAndersn

@AAndersn AAndersn commented Sep 26, 2024

Fixes an incorrect parameter in the multigpu documentation, which says int4 instead of 4bit for the quantization type.

Fixes an incorrect import of MllamaForConditionalGeneration from transformers instead of transformers.models.mllama.modeling_mllama.

Fixes #680
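The discussion below turns on which transformers version is installed: older builds needed the deep module path, while 4.45.0 exposes the class at the top level. A minimal stdlib-only sketch of that version gate (the helper names and the 4.45.0 threshold reflect the conversation; the parsing logic itself is an illustration, not part of the PR):

```python
def parse_version(v: str) -> tuple:
    # "4.45.0" -> (4, 45, 0); non-numeric suffixes are ignored for simplicity
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

def top_level_import_available(installed: str, required: str = "4.45.0") -> bool:
    # True if `from transformers import MllamaForConditionalGeneration`
    # should work with the given installed version
    return parse_version(installed) >= parse_version(required)

print(top_level_import_available("4.44.0"))  # False: needs the deep module path
print(top_level_import_available("4.45.0"))  # True: top-level import works
```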

Before submitting

  • [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • [x] Did you read the contributor guideline, Pull Request section?
  • [ ] Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case.
  • [x] Did you make sure to update the documentation with your changes?
  • [ ] Did you write any new necessary tests?

Thanks for contributing 🎉!

@wukaixingxp
Contributor

Hi! Can you check your transformers version? I think the latest transformers can do from transformers import MllamaForConditionalGeneration

@AAndersn
Author

@wukaixingxp You are right.

I think I might have had 4.44.0 when I created that bug ticket and PR last night. Re-running with transformers 4.45.0 this morning, from transformers import MllamaForConditionalGeneration works fine.

@wukaixingxp
Contributor

Thanks for your help! We want to bump the transformers version to 4.45.0, but it has a bug, as stated in my PR. For now, people must pip install transformers from source to avoid this bug. We are waiting for a new release of the transformers pip package.
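For anyone following along, the installed transformers version can be checked programmatically before deciding which import path applies. A small stdlib-only sketch (importlib.metadata has been standard since Python 3.8; the helper name is illustrative):

```python
from importlib.metadata import version, PackageNotFoundError
from typing import Optional

def installed_transformers_version() -> Optional[str]:
    # Returns the installed transformers version string, or None if the
    # package is not installed in the current environment.
    try:
        return version("transformers")
    except PackageNotFoundError:
        return None

print(installed_transformers_version())
```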

@init27
Contributor

init27 commented Oct 4, 2024

@AAndersn thanks again for the PR! Since this is fixed in the latest HF version, can you take another pass please?

@AAndersn
Author

Sorry for the delay. Yes, I will fix my branch to resolve the conflict with the other portion that has already been fixed.

@init27
Contributor

init27 commented Oct 10, 2024

Many thanks!


Successfully merging this pull request may close these issues.

Incorrect import in version 0.0.4