Merge pull request #3 from mlcommons/sarthakpati-patch-1
Minor doc and codeowner update
sarthakpati authored Oct 3, 2024
2 parents 9d5cedd + 46ee939 commit 5e65cf8
Showing 2 changed files with 23 additions and 3 deletions.
2 changes: 1 addition & 1 deletion .github/CODEOWNERS
@@ -6,4 +6,4 @@

/.github/workflows/cla.yml @mlcommons/staff

/LICENSE.md @mlcommons/staff
24 changes: 22 additions & 2 deletions README.md
@@ -1,4 +1,10 @@
General documentation is in preparation, it will be made available soon.
# GaNDLF-Synth

Presenting the **G**ener**a**lly **N**uanced **D**eep **L**earning **F**ramework for **Synth**esis (GaNDLF-Synth), a unified abstraction for training various synthesis algorithms with a zero-/low-code approach.

## Documentation

General documentation is in preparation and will be **made available soon**.

Mixed precision training:
We currently support mixed precision training based on [lightning](https://pytorch-lightning.readthedocs.io/en/latest/advanced/mixed_precision.html). To use mixed precision, set the "precision" field within the "compute" section of the configuration; all available precision options are listed at the link above.
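As an illustration only, such a setting might look like the sketch below. The surrounding layout of the configuration is an assumption based on the description above; the precision values shown are options documented by Lightning.

```
# Hypothetical sketch: only the "precision" field under "compute" is
# referenced in the text above, the rest of the layout is assumed.
compute:
  precision: "bf16-mixed"   # other Lightning options include "16-mixed" and "32-true"
```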
@@ -34,4 +40,18 @@ compute:
```
Details of this config file can be found in the DeepSpeed documentation here: https://www.deepspeed.ai/docs/config-json/
Further details are available in the Lightning guide: https://lightning.ai/docs/pytorch/stable/advanced/model_parallel/deepspeed.html#custom-deepspeed-config.
Note that you will probably need to override the optimizer choice with one of the optimized implementations available in DeepSpeed. The optimizer (and, optionally, the scheduler) specified here takes precedence over the one specified in the base yaml config file.
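For illustration, a minimal custom DeepSpeed config fragment that overrides the optimizer and scheduler might look like the following. The keys follow the DeepSpeed config-json schema linked above; the specific values are placeholders, not recommended settings.

```
{
  "optimizer": {
    "type": "Adam",
    "params": { "lr": 1e-3, "weight_decay": 3e-7 }
  },
  "scheduler": {
    "type": "WarmupLR",
    "params": { "warmup_min_lr": 0, "warmup_max_lr": 1e-3, "warmup_num_steps": 1000 }
  }
}
```
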
## Citation
```
@misc{pati2024gandlfsynthframeworkdemocratizegenerative,
title={GaNDLF-Synth: A Framework to Democratize Generative AI for (Bio)Medical Imaging},
author={Sarthak Pati and Szymon Mazurek and Spyridon Bakas},
year={2024},
eprint={2410.00173},
archivePrefix={arXiv},
primaryClass={cs.LG},
url={https://arxiv.org/abs/2410.00173},
}
```
