TFLite to MediaPipe

Hello everyone, I am trying to run an LLM on mobile using gemma-2b-it, but I have a problem when I try to run the LLM with MediaPipe.

E0000 00:00:1719161438.366071   20510 calculator_graph.cc:887] INTERNAL: CalculatorGraph::Run() failed: 
                                                                                                    Calculator::Open() for node "odml.infra.TfLitePrefillDecodeRunnerCalculator" failed: ; RET_CHECK failure (external/odml/odml/infra/genai/inference/calculators/tflite_prefill_decode_runner_calculator.cc:157) (prefill_runner_)!=(nullptr)

The steps I followed were:

  1. Trained the LLM with Keras.
  2. Converted the model from Keras to TFLite.
  3. Converted the TFLite model to a MediaPipe .task bundle (using Colab).
  4. Loaded the .task file in Android with MediaPipe.
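Roughly, step 2 looks like this (a minimal sketch — a tiny stand-in model replaces the fine-tuned gemma-2b-it, and the output path is a placeholder):

```python
import tensorflow as tf

# Tiny stand-in model; in the real pipeline this is the fine-tuned gemma-2b-it.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2)(inputs)
model = tf.keras.Model(inputs, outputs)

# Step 2: convert the Keras model to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

# Placeholder output path.
with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)
```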

Does anybody know anything about this error?


Hello, @Jorge_Martinez-Abarc

I understand the issue you're facing running your LLM built on gemma-2b-it with MediaPipe. The RET_CHECK failure (prefill_runner_) != (nullptr) means the TfLitePrefillDecodeRunnerCalculator never managed to create its prefill runner, which usually points to the .task bundle not containing a model the LLM runtime can load. Let's troubleshoot this:

Check model conversion:
- Ensure that the conversion from Keras to TFLite completed without errors.
- Verify that the resulting TFLite model is actually compatible with MediaPipe's LLM runtime. The LLM Inference API supports only a limited set of model architectures, so an arbitrary converted Keras model may not yield a working prefill/decode runner.
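As a quick sanity check for the first point, you can confirm the flatbuffer loads back into the TFLite interpreter before handing it to MediaPipe (a minimal sketch — a tiny stand-in model takes the place of your converted gemma-2b-it here):

```python
import tensorflow as tf

# Tiny stand-in; in practice pass model_path="your_model.tflite" instead.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2)(inputs)
model = tf.keras.Model(inputs, outputs)
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# If this raises, the flatbuffer itself is broken and MediaPipe will fail later.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    print(detail["name"], detail["shape"], detail["dtype"])
```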
Model loading in Android:
- When loading the .task file in Android with MediaPipe, ensure that the file path is correct.
- Double-check that the model file exists on the device and is accessible.
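One way to double-check the file itself: the .task produced by the conversion Colab is, to my knowledge, a zip-style bundle, so you can inspect it before shipping it to the device (the example path is hypothetical):

```python
import zipfile

def check_task_bundle(path: str) -> list[str]:
    """List the entries in a MediaPipe .task bundle, raising if it is unreadable."""
    with zipfile.ZipFile(path) as bundle:
        names = bundle.namelist()
    if not names:
        raise ValueError(f"{path} is an empty bundle")
    return names

# Example (hypothetical path):
# print(check_task_bundle("gemma_2b_it.task"))
```

A truncated upload or a plain .tflite renamed to .task will fail this check immediately, which is much cheaper to catch than a crash inside the calculator graph.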
MediaPipe configuration:
- Review your MediaPipe configuration and ensure that it matches the expected input and output signatures of your LLM model.
- Check if any additional preprocessing or postprocessing steps are required.
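For reference, the conversion step in the official Colab is driven by a config like the one below (a sketch only: the field names come from the mediapipe.tasks.python.genai converter and may differ between versions, and every path is a placeholder). Note that model_type must name an architecture the converter supports — feeding it a custom-trained model it does not recognize is a common way to end up with a bundle the prefill/decode runner cannot initialize from:

```python
from mediapipe.tasks.python.genai import converter

# Sketch, not a drop-in script: verify the field names against your MediaPipe
# version, and replace every path below with your own.
config = converter.ConversionConfig(
    input_ckpt="/content/checkpoint/",        # weights from your training run
    ckpt_format="safetensors",
    model_type="GEMMA_2B",                    # must match a supported architecture
    backend="gpu",
    output_dir="/content/intermediate/",
    combine_file_only=False,
    vocab_model_file="/content/tokenizer.model",
    output_tflite_file="/content/gemma_2b_it.task",
)
converter.convert_checkpoint(config)
```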
Debugging:
- Use logging or debugging tools to inspect the model's behavior during inference.
- Look for any further error messages or warnings related to TfLitePrefillDecodeRunnerCalculator.

Remember to verify each step carefully.

I hope this info is helpful to you.

Best regards,
Lisa_Morris