Rouben

Reputation: 101

HParams in Tensorboard, Run IDs and naming

I'm using SummaryWriter.add_hparams(params, values) to log hyperparameters during training of my Seq2Seq model. My runs are named with a timestamp like 2020-09-10 14-50-27. In the HParams tab in TensorBoard everything looks fine, but the HParam Trial IDs are different; they have another string of numbers attached, like this: 2020-09-10 14-50-27/1599742915.9712806. These also appear in the Scalars tab as separate runs, which is quite inconvenient. Is there a way to turn off this extra naming, or to stop them from appearing in the Scalars tab? I use PyTorch and its SummaryWriter like this:

from torch.utils.tensorboard import SummaryWriter

# the run name is a timestamp, e.g. '2020-09-10 14-50-27'
tb = SummaryWriter(log_dir='runs/2020-09-10 14-50-27')

params = {
    'max_epochs': max_epochs,
    'learning_rate': learning_rate,
    'batch_size': batch_size,
    'optimizer_name': optimizer_name,
    'dropout_fc': dropout_fc
}
values = {
    'hparam/hp_total_time': t1_stop - t0_start,
    'hparam/score': best_score
}

tb.add_hparams(params, values)

Upvotes: 3

Views: 2953

Answers (1)

Elio

Reputation: 21

As Aniket mentioned in the comments, there is not enough information in your question to be entirely sure what the issue is.

However, since you are using PyTorch, I suspect you are seeing the behaviour also reported in this issue. Each call to the add_hparams method creates a new subfolder inside the run directory, named with the current Unix timestamp (1599742915.9712806 in your case). TensorBoard uses the hierarchical folder structure to organise (group) runs, which is why 2020-09-10 14-50-27/1599742915.9712806 and 2020-09-10 14-50-27 appear as separate runs.
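In other words, the resulting log directory ends up looking roughly like this (event file names are illustrative):

```
runs/
└── 2020-09-10 14-50-27/            <- scalars logged during training
    ├── events.out.tfevents...
    └── 1599742915.9712806/         <- subfolder created by add_hparams
        └── events.out.tfevents...
```

Because TensorBoard treats every directory that contains an event file as its own run, both levels show up in the Scalars tab.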

As per the issue I mentioned above, there does not seem to be an "official" way to change this behaviour, but if you read through the comments you will find a few custom SummaryWriter subclasses that have been proposed as workarounds.

Upvotes: 2
