How to log hyperparameters in TensorBoard HPARAMS tab? #19876
Unanswered
00sapo asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 2 comments 1 reply
-
@00sapo I have opened a similar discussion here. Do you get "No hparams data was found."?
1 reply
-
Any solutions?
0 replies
-
Hello,
from the documentation, it's not completely clear to me how to log hyperparameters in TensorBoard. The relevant page is https://lightning.ai/docs/pytorch/stable/extensions/logging.html#logging-hyperparameters
It's not clear if and how torch.utils.tensorboard.writer.SummaryWriter.add_hparams() should be called. Moreover, I'm using

self.logger.log_hyperparams(self.hparams, {"hp/metric_1": 0, "hp/metric_2": 0})

as in the example, but all I get is this: [screenshot]. I would expect Start Value to be 0, at least; the HPARAMS tab instead shows the usual empty-hyperparameters warning.
Can somebody show me a simple example of using the HPARAMS tab?
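Not an answer from the thread, but for reference, here is a minimal pure-PyTorch sketch of what the HPARAMS tab expects (the log directory, hyperparameter names, and metric names below are made up for illustration). As I understand it, the tab only shows a run when the hyperparameters and at least one metric value are written together via add_hparams(); Lightning's TensorBoardLogger.log_hyperparams(params, metrics) wraps this same mechanism.

```python
# Minimal sketch, assuming plain PyTorch (no Lightning).
# "runs/hparam_demo", the hparam names, and the hp/ metric names
# are illustrative placeholders, not taken from the original post.
from torch.utils.tensorboard import SummaryWriter

with SummaryWriter(log_dir="runs/hparam_demo") as writer:
    hparams = {"lr": 1e-3, "batch_size": 32}
    # The metric values must be passed in this same call; scalars
    # logged separately with add_scalar() under unrelated tags are
    # not associated with the run in the HPARAMS tab.
    metrics = {"hp/metric_1": 0.0, "hp/metric_2": 0.0}
    writer.add_hparams(hparams, metrics, run_name="demo")
```

Launching `tensorboard --logdir runs` should then show one row in the HPARAMS tab with the lr and batch_size columns and the two hp/ metrics.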