Examples of Using the Batch Class for Inference

Hi everyone,

I am looking for examples of how to use the Batch class for inference.

My model takes three TFloat32 inputs and returns a TFloat32 array of probabilities.

```java
public static TFloat32 runPredict(TFloat32 wordTensor, TFloat32 charTensor, TFloat32 featTensor, Session sess) {
    return (TFloat32) sess.runner()
            .feed("serving_default_words_input:0", wordTensor)
            .feed("serving_default_word_features_input:0", featTensor)
            .feed("serving_default_char_input:0", charTensor)
            .fetch("StatefulPartitionedCall:0")
            .run()
            .get(0);
}
```

I want to replace each feed(…) call with a feed of a batched tensor, i.e. feed(…batch…Tensor). Is the Batch class the best way to feed batches of tensors at once? How would I go about doing this?

If your model is constructed so that the leading dimension is unbound, then you can use that as the batch dimension: feed in batched tensors directly and skip the batching operation altogether.
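A minimal sketch of that idea, assuming the inputs have shape `[-1, seqLen]` with the unbound leading dimension as the batch axis. The `stack` helper and the `seqLen`-style shapes are assumptions for illustration, not taken from the model above; the commented lines show how the stacked array would become one batched tensor for the existing feed calls.

```java
import java.util.List;

// Sketch: pack single examples along a new leading batch axis, assuming the
// model's inputs have an unbound leading dimension (e.g. shape [-1, seqLen]).
public class BatchStackSketch {

    // Hypothetical helper: stack per-example rows into a [batch, seqLen] array.
    static float[][] stack(List<float[]> examples) {
        float[][] batched = new float[examples.size()][];
        for (int i = 0; i < examples.size(); i++) {
            batched[i] = examples.get(i).clone();
        }
        return batched;
    }

    public static void main(String[] args) {
        List<float[]> words = List.of(
                new float[] {1f, 2f, 3f},
                new float[] {4f, 5f, 6f});
        float[][] batch = stack(words);
        System.out.println(batch.length + "x" + batch[0].length); // prints 2x3

        // With TensorFlow Java, the batched array becomes one tensor, e.g.:
        //   TFloat32 wordTensor = TFloat32.tensorOf(StdArrays.ndCopyOf(batch));
        // and is fed exactly as in runPredict above; the fetched output then
        // carries the same leading batch dimension.
    }
}
```

The output tensor can then be sliced back into per-example rows after the run.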

Hi Yusufu, maybe knowing why exactly you’d like to pass all tensors as a batch would help guide you in the right direction.

If your intent is simply to pass a single object to your runPredict method, then you could look at creating a SessionFunction that accepts a Map of tensors as input.
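A rough outline of that direction, assuming the model is loaded as a SavedModelBundle. The exact function type returned by `bundle.function(...)` depends on the TensorFlow Java version, and the map keys below are hypothetical logical input names, so treat this as a sketch rather than a verified implementation:

```java
import java.util.Map;

import org.tensorflow.SavedModelBundle;
import org.tensorflow.Tensor;
import org.tensorflow.TensorFunction;
import org.tensorflow.types.TFloat32;

// Sketch only: call the serving signature through a function that takes a
// single Map of named tensors instead of three separate feed(...) calls.
public class MapCallSketch {

    public static TFloat32 runPredict(TensorFunction servingFn,
                                      Map<String, Tensor> inputs) {
        // One object in, one map out; keys are the signature's logical input
        // names (assumed here; inspect your model with saved_model_cli).
        Map<String, Tensor> outputs = servingFn.call(inputs);
        return (TFloat32) outputs.values().iterator().next();
    }

    public static void main(String[] args) {
        try (SavedModelBundle bundle = SavedModelBundle.load("model/path", "serve")) {
            TensorFunction servingFn = bundle.function("serving_default");
            // wordTensor, charTensor, featTensor built elsewhere...
            // TFloat32 probs = runPredict(servingFn, Map.of(
            //         "words_input", wordTensor,
            //         "char_input", charTensor,
            //         "word_features_input", featTensor));
        }
    }
}
```

This keeps runPredict down to a single parameter, and batching then stays orthogonal: each tensor in the map can itself carry the leading batch dimension.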

The Batch op you are referring to is mostly used when building a graph, not when running it.