Conversion of an LSTM model to TFLite

I am trying to convert a simple LSTM model to TFLite, but the conversion requires the Flex delegate.

Following is the architecture:
model_input = tf.keras.Input(shape=(124, 129), name='input')
LSTM_out = tf.keras.layers.LSTM(units=256)(model_input)
dense_1 = tf.keras.layers.Dense(128, activation='tanh')(LSTM_out)
dense_2 = tf.keras.layers.Dense(64, activation='tanh')(dense_1)
dense_3 = tf.keras.layers.Dense(32, activation='tanh')(dense_2)
model_output = tf.keras.layers.Dense(num_labels)(dense_3)
model = tf.keras.Model([model_input], [model_output])

Converter config:
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
converter._experimental_lower_tensor_list_ops = False
converter.conversion_print_before_pass = "all"
converter.inference_input_type = tf.float32
converter.inference_output_type = tf.float32
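
For reference, the post does not show how the converter is created or run. A minimal sketch of the surrounding calls, assuming the converter is built directly from the Keras model above (the output filename is only an example):

import tensorflow as tf

# Assumed: the converter is created straight from the Keras model defined earlier.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
# ... apply the settings shown above ...
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)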

I have another set of models with an almost identical architecture, and those converted perfectly fine without the Flex delegate.

Since the Flex delegate is not available in TFLite Micro, this is a serious problem for me.

Please let me know how I can get around it or kindly point me in the right direction.

Thanks in advance 🙂

Please go through the LSTM Fusion Codelab; it shows how to convert an LSTM model without Select TF ops and may help you. Thank you!
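
In short, the Codelab fixes the input shape with a concrete function, saves the model as a SavedModel, and converts from there so the Keras LSTM can be fused into TFLite's built-in UnidirectionalSequenceLSTM op. A rough sketch adapted to the shapes from the question (the directory name is only an example):

import tensorflow as tf

# Fix the batch size; dynamic shapes are a common reason the converter
# falls back to Flex (Select TF) ops instead of the fused LSTM kernel.
run_model = tf.function(lambda x: model(x))
concrete_func = run_model.get_concrete_function(
    tf.TensorSpec([1, 124, 129], model.inputs[0].dtype))

# Save with the fixed-shape signature and convert from the SavedModel.
MODEL_DIR = "keras_lstm"  # illustrative path
model.save(MODEL_DIR, save_format="tf", signatures=concrete_func)

converter = tf.lite.TFLiteConverter.from_saved_model(MODEL_DIR)
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS]
tflite_model = converter.convert()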

Thanks a lot. That helped.