Read out TensorBoard hparams programmatically


After running a hyperparameter search with PyTorch and visualizing the results in TensorBoard, I want to read out the hyperparameters from TensorBoard programmatically.

My folder structure is

  • all_runs
    • run1
      • Loss_trainingloss
      • 1634168941.9091413 (or some other timestamp)
        • events.out.tfevents.1634168941
      • events.out.tfevents.1634135651
    • run2
    • run3

and I can easily read TensorBoard time series, e.g. the loss curve, with code like this:

from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

def read_eventfile(filepath, tag):
    event_accumulator = EventAccumulator(filepath)
    event_accumulator.Reload()  # load the event file(s) into memory first
    events = event_accumulator.Scalars(tag)
    y = [x.value for x in events]
    return y

train_loss_curve = read_eventfile(path_to_event_folder, "Loss_trainingloss")
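As an aside, when I am not sure of the exact tag name in a run, I list the available tags first. A small sketch (the run directory path is whatever your run folder is):

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

def list_scalar_tags(run_dir):
    """Return the scalar tag names available in one run directory."""
    event_accumulator = EventAccumulator(run_dir)
    event_accumulator.Reload()
    # Tags() returns a dict keyed by category ('scalars', 'images', ...)
    return event_accumulator.Tags()["scalars"]
```
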

However, I struggle to find a way to access the hparams of each run, even though they are displayed correctly in TensorBoard. Does anybody know how this can be done?
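My current best guess (untested sketch, not an official API) is that the hparams are stored as summaries of the "hparams" plugin, so they can be pulled out via `EventAccumulator.PluginTagToContent` and decoded with the plugin's protobuf definitions. With PyTorch's `add_hparams`, the `session_start_info` summary seems to land in the timestamped subfolder, so each run's subfolders may need to be scanned:

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator
from tensorboard.plugins.hparams import plugin_data_pb2

def read_hparams(run_dir):
    """Return {hparam_name: value} for one run directory, or {} if none found."""
    event_accumulator = EventAccumulator(run_dir)
    event_accumulator.Reload()
    hparams = {}
    try:
        # All summaries written by the hparams plugin in this directory, keyed by tag
        tag_to_content = event_accumulator.PluginTagToContent("hparams")
    except KeyError:
        return hparams  # no hparams summaries here
    for content in tag_to_content.values():
        plugin_data = plugin_data_pb2.HParamsPluginData.FromString(content)
        if plugin_data.HasField("session_start_info"):
            for name, value in plugin_data.session_start_info.hparams.items():
                # value is a google.protobuf.Value; read whichever field is set
                kind = value.WhichOneof("kind")
                hparams[name] = getattr(value, kind) if kind else None
    return hparams
```

To collect the hparams of a whole run, one could walk the run folder and merge the results of `read_hparams` for each subdirectory.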