How do I pass a metric to TensorBoard's hparams interface #5784
Replies: 6 comments 1 reply
-
I have also tried to log the hparams at the end of training with a callback, but this seems to fail for unknown reasons.
-
Could you provide a screenshot of the failing case? In my case, it runs fine.
-
I got it to work by logging with the name "hp_metric" instead. Is this how it is meant to be done? The code now works, so the old lines have been removed, which makes a screenshot difficult. What happened was that the log-metrics call just added the metric as a plain scalar, and it was not available in the TensorBoard hparams interface.
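For anyone landing here later, a minimal sketch of that pattern, assuming the default TensorBoardLogger; the model class and layer are illustrative:

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()
        # Writes the init args (here: lr) to TensorBoard's HPARAMS tab,
        # paired with the default "hp_metric" placeholder.
        self.save_hyperparameters()
        self.layer = torch.nn.Linear(32, 1)

    def validation_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        # Logging under the reserved name "hp_metric" updates the value
        # shown in the HPARAMS tab; any other name is just a scalar.
        self.log("hp_metric", loss)
        return loss
```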
-
Yes, if you look at the TensorBoard logger docs, there is a default "hp_metric" which can be disabled. When it's disabled, you can use:
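The snippet that originally followed did not survive here; a hedged reconstruction of the pattern the docs describe, with illustrative hparam names and values:

```python
from pytorch_lightning.loggers import TensorBoardLogger

# Turn off the automatic "hp_metric" placeholder.
logger = TensorBoardLogger("tb_logs", name="my_run", default_hp_metric=False)

# Register the hparams together with the metric keys that the HPARAMS
# tab should display for them; the initial values act as placeholders.
logger.log_hyperparams(
    {"lr": 1e-3, "batch_size": 64},
    metrics={"val/loss": 0.0},
)
```

The logger is then passed to the Trainer as usual via `Trainer(logger=logger)`.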
-
@mm04926412 I had the same problem as you. Have you solved it?
-
Hmmm, I'm not sure how to explain this better... see here for a working example: https://github.com/s-rog/StanfordRibonanza2023/blob/72fe1ebaf866c0c2f77dddc7a02d600c8c4885ba/exp/trainer.py#L85
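In case the link rots: the same technique, sketched from inside a LightningModule (I have not copied the linked file; the hook choice and metric name are illustrative):

```python
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def on_train_start(self):
        # With default_hp_metric=False on the logger, pair the saved
        # hparams with the metric key once at the start of training;
        # subsequent self.log("val/loss", ...) calls update the entry
        # shown in the HPARAMS tab.
        self.logger.log_hyperparams(self.hparams, {"val/loss": 0.0})
```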
-
What is your question?
I have been unable to log metrics to the hparams menu in PyTorch Lightning. I've been googling for a few hours and have found it difficult to get a straight explanation of how to do this. The only available metric is a generic one called hp_metric.
Code
What have you tried?
I have used self.save_hyperparameters() in __init__, passed my config to it, and added a self.logger.log_metrics() call to my validation step. I've tried looking through the docs for TensorBoard, Torch, and PyTorch Lightning, but couldn't figure out what is needed here.
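The Code section above was left empty; a hedged sketch of the attempt as described, where the class name, config contents, and loss helper are hypothetical:

```python
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def __init__(self, config: dict):
        super().__init__()
        # Stores the config so it appears under HPARAMS in TensorBoard.
        self.save_hyperparameters(config)

    def validation_step(self, batch, batch_idx):
        loss = self._compute_loss(batch)  # hypothetical helper
        # This writes a plain scalar; it does not attach the value to
        # the hparams table, which matches the behavior described above.
        self.logger.log_metrics({"val_loss": loss.item()}, step=self.global_step)
        return loss
```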
What's your environment?