Can I use `tf.nn.ctc_beam_search_decoder` on Android?

I’m working with TFLite on Android. My model architecture is a MobileNetV2 backbone feeding a GRU RNN, trained with CTC loss, and I want to use CTC beam search decoding at inference time.

I’ve successfully converted the model to a .tflite file and verified that I can load it and get the expected behaviour on my local machine.

However, it doesn’t work in my Android project; I get this error:

Could not build model from the provided pre-loaded flatbuffer: Unsupported custom op: FlexCTCBeamSearchDecoder, version: 1

The error is raised when I try to instantiate this class:

package com.stampfree.validation.tflite;
import android.app.Activity;
import java.io.IOException;
/** This TensorFlow Lite classifier works with the float MobileNet model. */
public class DigicodeMobileNet extends Classifier {
    /**
     * Initializes a {@code DigicodeMobileNet}.
     *
     * @param activity the current Activity, used to load the model from assets
     * @param device a {@link Device} object to configure the hardware accelerator
     * @param numThreads the number of threads during the inference
     * @throws IOException if the model is not loaded correctly
     */
    public DigicodeMobileNet(Activity activity, Device device, int numThreads)
            throws IOException {
        super(activity, device, numThreads);
    }

    @Override
    protected String getModelPath() {
        return "tf_mobilenetv2_128x768_ctc_post_epoch08.tflite";
    }
}
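
For context, the `Classifier` superclass follows the TFLite example-app pattern: the constructor memory-maps the model from assets and builds the `Interpreter`, and the `Interpreter` constructor is the call that throws. A rough sketch of that superclass (not my exact code):

// Rough sketch of the Classifier superclass pattern from the TFLite
// example apps; not the exact code, but the failing line is the same.
package com.stampfree.validation.tflite;

import android.app.Activity;
import android.content.res.AssetFileDescriptor;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import org.tensorflow.lite.Interpreter;

public abstract class Classifier {
    /** Hardware accelerator options, as in the TFLite example apps. */
    public enum Device { CPU, GPU, NNAPI }

    protected Interpreter tflite;

    protected Classifier(Activity activity, Device device, int numThreads) throws IOException {
        // (Delegate selection for `device` is omitted in this sketch.)
        MappedByteBuffer model = loadModelFile(activity);
        Interpreter.Options options = new Interpreter.Options().setNumThreads(numThreads);
        // This is the call that fails with
        // "Unsupported custom op: FlexCTCBeamSearchDecoder".
        tflite = new Interpreter(model, options);
    }

    /** Memory-maps the model file shipped in the app's assets. */
    private MappedByteBuffer loadModelFile(Activity activity) throws IOException {
        AssetFileDescriptor fd = activity.getAssets().openFd(getModelPath());
        try (FileInputStream in = new FileInputStream(fd.getFileDescriptor());
                FileChannel channel = in.getChannel()) {
            return channel.map(FileChannel.MapMode.READ_ONLY,
                    fd.getStartOffset(), fd.getDeclaredLength());
        }
    }

    protected abstract String getModelPath();
}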

I have tried this using the default dependencies:

dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly-SNAPSHOT'
}

and by building the AAR in a Docker container.

Both approaches gave the same result.

Any tips? Happy to share more context on request.

Hi @Alexander_Soare

I have used the FlexCTCGreedyDecoder operator before in this project. Adding the `implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly'` dependency worked fine, and to shave about 70 MB off the final .apk I also built the AAR in a Docker container. It is strange that it does not work for you.

First try the above dependency rather than the SNAPSHOT one. If you explicitly need the SNAPSHOT dependency, check the documentation here again.

Ping me again if neither works.

Thanks

Thanks @George_Soloupis, will give it a try and let you know.

Yes, it is possible to use `tf.nn.ctc_beam_search_decoder` on Android. CTCBeamSearchDecoder is not a built-in TFLite operator, so the model must be converted with Select TF ops enabled, and the app must ship the select-TF-ops runtime so that the Flex delegate can resolve the op when the model is loaded.

To use `tf.nn.ctc_beam_search_decoder` in a TFLite model on Android, you would first need to load the TFLite model into your Android application using the TFLite Java API. Once the model is loaded, you can run inference using the `Interpreter` class and pass input data to the model using a `ByteBuffer`.
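
Here is a minimal sketch of that flow, assuming the model has already been memory-mapped from assets as shown earlier in the thread. The class name `CtcInference` and the tensor shapes are placeholders, and the explicit FlexDelegate is optional: with the select-TF-ops AAR on the classpath, the Flex delegate is normally applied automatically when the Interpreter is created.

// Minimal sketch: run a TFLite model containing Flex (select TF) ops.
// Assumes the tensorflow-lite-select-tf-ops dependency is in build.gradle.
// Input/output shapes below are placeholders; use your model's real shapes.
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.flex.FlexDelegate;

public final class CtcInference {

    /** Runs one inference pass on an already memory-mapped model. */
    public static float[][] run(MappedByteBuffer model) {
        // Attaching the FlexDelegate explicitly; with the select-tf-ops AAR
        // present it is also linked in automatically at Interpreter creation.
        try (FlexDelegate flexDelegate = new FlexDelegate();
                Interpreter interpreter = new Interpreter(model,
                        new Interpreter.Options().addDelegate(flexDelegate))) {
            // Placeholder input: one 128x768 single-channel float32 image (NHWC).
            ByteBuffer input = ByteBuffer.allocateDirect(1 * 128 * 768 * 1 * 4)
                    .order(ByteOrder.nativeOrder());
            // ... fill `input` with normalized pixel values here ...

            // Placeholder output buffer; match your model's decoded output shape.
            float[][] output = new float[1][192];
            interpreter.run(input, output);
            return output;
        }
    }
}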