Integrating new third-party delegates into TFLite

I’m working on open source ML accelerators, and we’re looking at whether it’s worth implementing a TFLite delegate. I’ve found the guides at [1][2] helpful for technical info. However, I’m curious whether there is any appetite from the TF/TFLite team to integrate a third-party delegate back into the core library, assuming we go ahead with implementing one.

Can anyone help me understand whether that is a possibility and if so what the process would look like?



Just to clarify, I’m not asking the TF team to implement and maintain a delegate for us; I just want to figure out how we can best maintain forward compatibility and submit high-quality PRs that are likely to get merged.

Thanks for the question, Tom, and cool project! Any recommendations, @Karim_Nosseir?


Hi Tom,

Thanks for reaching out.
Regarding hosting the delegate: we don’t recommend hosting external delegates in our code base. This gives our users more flexibility, since they aren’t blocked on getting code checked in, and it reduces the amount of code we have to maintain.
On the other hand, we highly encourage community contributions.
So the suggested approach is to host your delegate in your own repo, which gives you full flexibility and freedom to make changes.
Then add it to Google Dev Library here (see the Submit button).
The TFLite website references that page from our main page for community contributions.

Regarding quality, we highly recommend testing the delegate:

  • On some popular models/tasks. You can leverage our open source tools for this: image_classification, object_detection, and our general inference_diff.
  • With per-kernel CPU vs delegate comparison tests. You can see sets of test cases in the test files for any of our TFLite kernels here, and even examples of such comparisons in our delegate tests, example.
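The per-kernel comparison described above boils down to running identical inputs through the CPU reference path and the delegate path, then checking that the outputs agree within some tolerance. Here is a minimal sketch of that check in Python; the arrays are dummy stand-ins for real interpreter outputs, and the tolerance value is an assumption for illustration, not a TFLite default:

```python
import numpy as np

def outputs_match(cpu_out, delegate_out, atol: float = 1e-5) -> bool:
    """Return True if the delegate output agrees with the CPU reference
    output elementwise within the absolute tolerance `atol`."""
    cpu_out = np.asarray(cpu_out, dtype=np.float64)
    delegate_out = np.asarray(delegate_out, dtype=np.float64)
    # A shape mismatch is an immediate failure, not a numerical issue.
    if cpu_out.shape != delegate_out.shape:
        return False
    return bool(np.max(np.abs(cpu_out - delegate_out)) <= atol)

# Dummy data standing in for real kernel outputs:
cpu = np.array([0.10, 0.75, 0.15])
good_delegate = cpu + 1e-7   # tiny numerical drift: should pass
bad_delegate = cpu + 0.05    # a real kernel bug: should fail

print(outputs_match(cpu, good_delegate))  # True
print(outputs_match(cpu, bad_delegate))   # False
```

In a real test suite you would generate the two outputs by invoking the same model once with the delegate applied and once without, as the linked delegate tests do, rather than constructing them by hand.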



Thanks for the information, Karim. We’ll give this approach a shot.