How to define input_signature for custom model

Hi, I have a custom model example:

import tensorflow as tf

class CustomModule(tf.keras.layers.Layer):

  def __init__(self):
    super(CustomModule, self).__init__()
    self.v = tf.Variable(1.)

  def call(self, x):
    print('Tracing with', x)
    return x * self.v

  def mutate(self, new_v):
    self.v.assign(new_v)

I want to save it for serving, which is why I need to provide a function for “serving_default”. I tried to do it like this:

module = CustomModule()
module_with_signature_path = './tmp/1'
call = tf.function(module.mutate, input_signature=[tf.TensorSpec([], tf.float32)])
tf.saved_model.save(module, module_with_signature_path, signatures=call)

And I got an error:


    ValueError: Got a non-Tensor value <tf.Operation 'StatefulPartitionedCall' type=StatefulPartitionedCall> for key 'output_0' in the output of the function __inference_mutate_8 used to generate the SavedModel signature 'serving_default'. Outputs for functions used as signatures must be a single Tensor, a sequence of Tensors, or a dictionary from string to Tensor.

How can I properly define a signature while saving the model? Thank you!

You can’t serialize only a method; you can serialize a whole tf.Module (and a tf.keras.Model is a tf.Module, as is a tf.keras.layers.Layer).

Thus, you’ll be able to export the module by changing this line

call = tf.function(module.mutate, input_signature=[tf.TensorSpec([], tf.float32)])

to this line

call = tf.function(module, input_signature=[tf.TensorSpec([], tf.float32)])
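For completeness, here is a minimal sketch of the corrected save, assuming the same CustomModule and path as above:

module = CustomModule()
module_with_signature_path = './tmp/1'
# Wrapping the module itself traces its call method instead of mutate, so the
# exported function returns a Tensor (x * self.v) that can back a signature.
call = tf.function(module, input_signature=[tf.TensorSpec([], tf.float32)])
tf.saved_model.save(module, module_with_signature_path, signatures=call)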

With saved_model_cli you can inspect the SavedModel created in ./tmp/1 and look at the signatures available inside. The call method is automatically wrapped in a tf.function and thus exported.
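For example, assuming the SavedModel was written to ./tmp/1, a command like this should print the inputs and outputs of the serving signature:

saved_model_cli show --dir ./tmp/1 --tag_set serve --signature_def serving_default

(Use --all instead of the last two flags to dump every tag set and signature at once.)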