How to test a model created with

I am using an EfficientDet-D2 model for object detection. I have built the model like this:

self.model = model_builder.build(model_config=model_config, is_training=True)

And the training seems to work fine (the loss is at least decreasing):

  def train_step(self, image_tensors, groundtruth_boxes_list, groundtruth_classes_list):
    shapes = tf.constant(self.batch_size * [[self.height, self.width, self.channels]], dtype=tf.int32)

    # The ground truth must be provided to the model before the loss can be computed.
    self.model.provide_groundtruth(
        groundtruth_boxes_list=groundtruth_boxes_list,
        groundtruth_classes_list=groundtruth_classes_list)

    with tf.GradientTape() as tape:
      preprocessed_images = self.model.preprocess(image_tensors)[0]
      prediction_dict = self.model.predict(preprocessed_images, shapes)
      losses_dict = self.model.loss(prediction_dict, shapes)
      total_loss = losses_dict['Loss/localization_loss'] + losses_dict['Loss/classification_loss']

    # Compute and apply gradients outside the tape context.
    gradients = tape.gradient(total_loss, self.vars_to_fine_tune)
    self.optimizer.apply_gradients(zip(gradients, self.vars_to_fine_tune))
    return total_loss

But I am struggling to see how I can test the model after each epoch or every N steps to check the accuracy. Since the is_training parameter is set to True, I guess I have to set it to False in some way and run predictions outside the tf.GradientTape scope? Or is there a test function built into the model that can be used?

Thanks for any help!

Here you can find a complete example of a custom train + evaluation loop:
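As a rough illustration of the kind of metric such an evaluation loop computes, here is a self-contained sketch of per-image precision/recall at a fixed IoU threshold. The box format `[ymin, xmin, ymax, xmax]`, the greedy matching strategy, and all names here are assumptions for illustration, not the API used in the linked example:

```python
def box_iou(a, b):
    """Intersection-over-Union of two boxes in [ymin, xmin, ymax, xmax] format."""
    ymin, xmin = max(a[0], b[0]), max(a[1], b[1])
    ymax, xmax = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ymax - ymin) * max(0.0, xmax - xmin)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def precision_recall(gt_boxes, gt_classes, det_boxes, det_classes,
                     det_scores, iou_thresh=0.5):
    """Greedily match detections (highest score first) to unused ground-truth
    boxes of the same class; a match at or above iou_thresh is a true positive."""
    order = sorted(range(len(det_boxes)), key=lambda i: -det_scores[i])
    matched, tp = set(), 0
    for i in order:
        best_j, best_iou = None, iou_thresh
        for j, (gb, gc) in enumerate(zip(gt_boxes, gt_classes)):
            if j in matched or gc != det_classes[i]:
                continue
            iou = box_iou(det_boxes[i], gb)
            if iou >= best_iou:
                best_j, best_iou = j, iou
        if best_j is not None:
            matched.add(best_j)
            tp += 1
    precision = tp / len(det_boxes) if det_boxes else 1.0
    recall = tp / len(gt_boxes) if gt_boxes else 1.0
    return precision, recall
```

Averaging precision over score thresholds and IoU thresholds is essentially what COCO-style mAP does; the linked example relies on the Object Detection API's own evaluation utilities rather than hand-rolled code like this.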

Thank you for your reply! :grinning:

Do you know how to measure this for bounding boxes and classes? If I only had labels it would be enough to just pass them to the function, but now I have to consider both bounding boxes and classes :woozy_face:

You can use the IoU metric, but you will find a fairly complete collection in:
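For a single pair of boxes, IoU is just intersection area over union area. A minimal sketch, assuming `[ymin, xmin, ymax, xmax]` box format (the function name and format here are illustrative, not from a specific library):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes [ymin, xmin, ymax, xmax]."""
    ymin = max(box_a[0], box_b[0])
    xmin = max(box_a[1], box_b[1])
    ymax = min(box_a[2], box_b[2])
    xmax = min(box_a[3], box_b[3])
    # Clamp to zero so non-overlapping boxes contribute no intersection.
    inter = max(0.0, ymax - ymin) * max(0.0, xmax - xmin)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Two identical boxes give 1.0, disjoint boxes give 0.0, and a threshold (commonly 0.5) on this value decides whether a detection counts as matching a ground-truth box.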
