
Test different layer norm #270

Draft · wants to merge 27 commits into base: main

Conversation

thomasw21 (Member)

Script to reproduce diverging layer_norm weights
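A minimal sketch of what such a reproduction check might look like: it gathers each layer-norm parameter from every rank and reports the largest deviation from rank 0's copy. The helper name, the parameter-name filter, and the `all_gather`-based comparison are illustrative assumptions, not code from this PR; it assumes `torch.distributed` has already been initialized by the training script.

```python
import torch
import torch.distributed as dist

def check_layer_norm_divergence(model, atol=0.0):
    """Hypothetical diagnostic: compare layer-norm weights across ranks
    and print the largest absolute difference from rank 0's copy."""
    for name, param in model.named_parameters():
        # Heuristic filter for layer-norm parameters by name (assumption).
        if "layernorm" not in name.lower() and "layer_norm" not in name.lower():
            continue
        local = param.detach().clone()
        gathered = [torch.empty_like(local) for _ in range(dist.get_world_size())]
        # Collect this parameter from every rank in the default process group.
        dist.all_gather(gathered, local)
        max_diff = max((g - gathered[0]).abs().max().item() for g in gathered)
        if dist.get_rank() == 0 and max_diff > atol:
            print(f"{name}: max abs difference across ranks = {max_diff:.3e}")
```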

adammoody pushed a commit to adammoody/Megatron-DeepSpeed that referenced this pull request Dec 18, 2023
* Enable universal ckpting

* Update run scripts

* Address PR feedback

* Remove line

* Fix white lines

* Remove redundant changes

* Apply to gpt_model only

* Code cleanup

* Code cleanup

* Update training.py

Co-authored-by: Michael Wyatt <[email protected]>

* Update training.py

Co-authored-by: Michael Wyatt <[email protected]>

* Log loss_scale only valid for fp16

* Add README and bf16 scripts

* Visualization docstrings

* Support older DS

---------

Co-authored-by: Michael Wyatt <[email protected]>