Wrong PositionalEncoding in the Transformer example #19138
Bug description
Hi,

I think there are several mistakes in the implementation of `PositionalEncoding` in src/lightning/pytorch/demos/transformer.py.

Since the transformer is set to `batch_first=True` (src/lightning/pytorch/demos/transformer.py, line 47 in 275822d), the addition in `PositionalEncoding.forward` (line 97 in 275822d) uses

`self.pe`

but it should be

`self.pe[:, :x.size(1)]`

so that the encoding is sliced along the sequence dimension (dim 1) instead of being added whole. See also line 88 in 275822d.

Would you please let me know if I'm wrong?
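For context, here is a minimal sketch of a batch-first positional encoding with the proposed slicing. This is an illustrative stand-in, not the actual code from transformer.py: the constructor signature, the `max_len` default, and the use of `register_buffer` are assumptions, and it requires an even `dim` and inputs of shape `(batch, seq, dim)`.

```python
import math

import torch
from torch import nn


class PositionalEncoding(nn.Module):
    """Illustrative batch-first positional encoding (not the demo's exact code)."""

    def __init__(self, dim: int, max_len: int = 5000) -> None:
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)
        div_term = torch.exp(torch.arange(0, dim, 2) * (-math.log(10000.0) / dim))
        pe = torch.zeros(1, max_len, dim)  # (1, max_len, dim) so it broadcasts over the batch
        pe[0, :, 0::2] = torch.sin(position * div_term)
        pe[0, :, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, seq, dim) because the transformer uses batch_first=True,
        # so the encoding must be sliced along dim 1, the sequence dimension.
        return x + self.pe[:, : x.size(1)]
```

Without the slice, `x + self.pe` tries to add a `(1, max_len, dim)` tensor to a `(batch, seq, dim)` tensor, which raises a shape error whenever `seq != max_len`.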
What version are you seeing the problem on?
master
How to reproduce the bug
No response
Error messages and logs
Environment
More info
No response
cc @Borda
Comments

You are not on the master branch.

@Galaxy-Husky Do you want to make the adjustments in a PR?

I think it would be good to go over the example one more time to fix the correctness issues. The initial version was just ported from the PyTorch examples repo, but the goal then was not to make it train well, only to serve as a dummy example for quick testing.

Sure, I'd love to.

Thank you for your explanation. I got it!

Is help still needed for this? I see this is still open. @awaelchli