TensorBoard HParams not showing results for one optimizer

New to TensorBoard here, on Ubuntu Linux. I have built a model to test several different hyperparameter values, and it appears to work well, except that TensorBoard only shows me results for one of the two optimizers I've selected. I'm happy to share all the code, but the critical bit is this:

import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

# Hyperparameters to sweep
HP_NUM_UNITS_L1 = hp.HParam('num_units_l1', hp.Discrete([8, 16]))
HP_NUM_UNITS_L2 = hp.HParam('num_units_l2', hp.Discrete([8, 16]))
HP_NUM_UNITS_L3 = hp.HParam('num_units_l3', hp.Discrete([8, 16]))
HP_DROPOUT = hp.HParam('dropout', hp.RealInterval(0.1, 0.2))
HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))
HP_NUM_FILTERS = hp.HParam('filters', hp.Discrete([64, 128, 256]))

# Metric reported for each run
MSE = 'mean_squared_error'

# Register the hparams and metric with the HParams dashboard
with tf.summary.create_file_writer('logs/hparam_tuning').as_default():
  hp.hparams_config(
    hparams=[HP_NUM_UNITS_L1, HP_NUM_UNITS_L2, HP_NUM_UNITS_L3, HP_DROPOUT,
             HP_OPTIMIZER, HP_NUM_FILTERS],
    metrics=[hp.Metric(MSE, display_name='Mean Squared Error')],
  )
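For context, the runs themselves are driven by nested loops over these hparams, roughly like the sketch below (simplified: only a few of the hparams are shown, and train_test_model and the run directory names are illustrative placeholders, not my exact code). Each run logs hp.hparams(...) plus the MSE metric under its own subdirectory of logs/hparam_tuning.

# Minimal sketch of the run loop, not my exact code.
session_num = 0
for num_units_l1 in HP_NUM_UNITS_L1.domain.values:
  for dropout_rate in (HP_DROPOUT.domain.min_value, HP_DROPOUT.domain.max_value):
    for optimizer in HP_OPTIMIZER.domain.values:
      hparams = {
          HP_NUM_UNITS_L1: num_units_l1,
          HP_DROPOUT: dropout_rate,
          HP_OPTIMIZER: optimizer,
      }
      run_dir = 'logs/hparam_tuning/run-%d' % session_num
      with tf.summary.create_file_writer(run_dir).as_default():
        hp.hparams(hparams)                    # record the hparam values used for this run
        mse = train_test_model(hparams)        # placeholder: build, train, and evaluate the model
        tf.summary.scalar(MSE, mse, step=1)    # log the metric declared in hparams_config above
      session_num += 1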

However, in the HParams dashboard I only get results for the sgd optimizer, and no values at all for adam.

What might I be doing incorrectly? Thx. J.