Hi everyone,
I am looking for examples of how to use the Batch class
for inference.
My model takes three TFloat32 inputs and returns a TFloat32 tensor of probabilities.
```java
public static TFloat32 runPredict(TFloat32 wordTensor, TFloat32 charTensor, TFloat32 featTensor, Session sess) {
    return (TFloat32) sess.runner()
            .feed("serving_default_words_input:0", wordTensor)
            .feed("serving_default_word_features_input:0", featTensor)
            .feed("serving_default_char_input:0", charTensor)
            .fetch("StatefulPartitionedCall:0")
            .run()
            .get(0);
}
```
I would like each feed(…) call to take a batched tensor instead of a single example. Is the Batch class the best way to feed batches of tensors at once? How would I go about doing this?
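For concreteness, here is a sketch of the kind of batched call I have in mind — assuming the per-example inputs can simply be stacked along a leading batch dimension before feeding. The class name, the `runPredictBatch` helper, and the `float[][]` row layout are my own illustration, not something I have confirmed is the intended approach:

```java
import org.tensorflow.Session;
import org.tensorflow.ndarray.StdArrays;
import org.tensorflow.types.TFloat32;

public class BatchPredict {

    // Each float[][] holds one row per example, so the resulting tensors
    // have shape [batchSize, seqLen] instead of [seqLen].
    public static TFloat32 runPredictBatch(float[][] wordRows,
                                           float[][] charRows,
                                           float[][] featRows,
                                           Session sess) {
        // Copy the 2-D Java arrays into rank-2 TFloat32 tensors; the first
        // dimension becomes the batch dimension the model sees.
        try (TFloat32 wordTensor = TFloat32.tensorOf(StdArrays.ndCopyOf(wordRows));
             TFloat32 charTensor = TFloat32.tensorOf(StdArrays.ndCopyOf(charRows));
             TFloat32 featTensor = TFloat32.tensorOf(StdArrays.ndCopyOf(featRows))) {
            // Same feed/fetch names as the single-example version; the output
            // should then contain one probability row per input example.
            return (TFloat32) sess.runner()
                    .feed("serving_default_words_input:0", wordTensor)
                    .feed("serving_default_word_features_input:0", featTensor)
                    .feed("serving_default_char_input:0", charTensor)
                    .fetch("StatefulPartitionedCall:0")
                    .run()
                    .get(0);
        }
    }
}
```

Does this stacking approach make sense, or is the Batch op meant to replace it?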