I'm getting an error when calculating the mean of a tensor. Is there any other way I can accomplish this?

for col in range(n_cols):  # Tiles each filter into a big horizontal grid
    for row in range(images_per_row):
        channel_image = layer_activation[0, :, :, col * images_per_row + row]
        channel_image -= channel_image.mean()
        channel_image /= channel_image.std()
        channel_image *= 64
        channel_image += 128
        channel_image = np.clip(channel_image, 0, 255).astype('uint8')
        display_grid[col * size : (col + 1) * size,  # Displays the grid
                     row * size : (row + 1) * size] = channel_image
        scale = 1. / size
        plt.figure(figsize=(scale * display_grid.shape[1],
                            scale * display_grid.shape[0]))
        plt.title(layer_name)
        plt.grid(False)
        plt.imshow(display_grid, aspect='auto', cmap='viridis')

Error message: AttributeError: 'tensorflow.python.framework.ops.EagerTensor' object has no attribute 'mean'

Try tf.math.reduce_mean. Implementation found here.
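
In case it helps, here is a minimal sketch of that normalization step using tf.math.reduce_mean (and tf.math.reduce_std for the standard deviation). The layer_activation tensor below is just a random placeholder standing in for your real layer output:

import numpy as np
import tensorflow as tf

# Placeholder activation standing in for the real layer output (assumed shape).
layer_activation = tf.random.normal((1, 28, 28, 32))

channel_image = layer_activation[0, :, :, 0]

# EagerTensor has no .mean()/.std(); use the tf.math reductions instead.
channel_image -= tf.math.reduce_mean(channel_image)
channel_image /= tf.math.reduce_std(channel_image)
channel_image = channel_image * 64 + 128

# Convert to NumPy before clipping and plotting.
channel_image = np.clip(channel_image.numpy(), 0, 255).astype('uint8')

Alternatively, converting the slice to NumPy first (for example channel_image = layer_activation[0, :, :, col * images_per_row + row].numpy()) should also let your original .mean() and .std() calls work unchanged.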

Thank you … This worked

Glad it worked. Could you mark this as the solution since it has been resolved? Thanks in advance.