I am planning to deploy the MoveNet pose estimation model on microcontrollers like the ESP32.
But I am not sure whether the MoveNet model architecture is supported by TensorFlow Lite for Microcontrollers.
Can I get some advice on it? Thanks.
Hi, can anyone offer some advice?
It looks like the standard int8 version of MoveNet on TFHub is about 2.7MB in size:
https://tfhub.dev/google/lite-model/movenet/singlepose/lightning/tflite/int8/4
This is probably too large for most embedded boards to store. I would also guess that the latency would be very high, since it’s designed for mobile phone CPUs. I would look into whether you can use a Raspberry Pi instead, if possible.
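To make the size concern concrete, here is a rough back-of-the-envelope check. The model size is the ~2.7 MB figure above and the flash budget matches the 3 MB app partition that shows up in the Arduino compiler output later in this thread; the runtime/sketch overhead figure is my own assumption, not a measured number:

```python
# Rough flash-budget check: can the int8 MoveNet model fit on an ESP32?
MODEL_BYTES = 2_700_000    # ~2.7 MB int8 MoveNet Lightning (approximate)
FLASH_BYTES = 3_145_728    # 3 MB app partition (default Arduino ESP32 layout)
RUNTIME_BYTES = 500_000    # assumed overhead for the TFLM runtime + sketch code

def fits_in_flash(model, flash, overhead):
    """Return True if the model plus runtime code fits in the app partition."""
    return model + overhead <= flash

needed = MODEL_BYTES + RUNTIME_BYTES
print(f"needed {needed} of {FLASH_BYTES} bytes "
      f"-> fits: {fits_in_flash(MODEL_BYTES, FLASH_BYTES, RUNTIME_BYTES)}")
```

Even before adding any runtime overhead, the model alone is close to 86% of the partition, which is why the sketch fails to link below.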
Thanks for the information.
I think a Raspberry Pi should be able to run MoveNet pose estimation, since a tutorial and example are provided:
examples/lite/examples/pose_estimation/raspberry_pi at master · tensorflow/examples · GitHub
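If you do go the Raspberry Pi route, the inference loop itself is straightforward; the main MoveNet-specific step is decoding the output tensor. The singlepose models return a [1, 1, 17, 3] float array of (y, x, score) per COCO keypoint. A pure-NumPy decoding sketch (the 0.3 score threshold is an arbitrary choice, and `decode_keypoints` is just an illustrative helper name):

```python
import numpy as np

# MoveNet singlepose output: [1, 1, 17, 3] -> (y, x, score) per COCO keypoint
KEYPOINT_NAMES = [
    "nose", "left_eye", "right_eye", "left_ear", "right_ear",
    "left_shoulder", "right_shoulder", "left_elbow", "right_elbow",
    "left_wrist", "right_wrist", "left_hip", "right_hip",
    "left_knee", "right_knee", "left_ankle", "right_ankle",
]

def decode_keypoints(output, score_threshold=0.3):
    """Map the raw output tensor to {name: (y, x)} for confident keypoints.

    Coordinates are normalized to [0, 1] relative to the input image.
    """
    kpts = np.asarray(output).reshape(17, 3)
    return {
        name: (float(y), float(x))
        for name, (y, x, score) in zip(KEYPOINT_NAMES, kpts)
        if score >= score_threshold
    }

# Example with a dummy tensor: only the nose is above the threshold
dummy = np.zeros((1, 1, 17, 3), dtype=np.float32)
dummy[0, 0, 0] = [0.5, 0.5, 0.9]
print(decode_keypoints(dummy))
```

On the Pi you would feed the real output of the TFLite interpreter into this instead of a dummy array.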
Yes, I just tried verifying an int8 version on Arduino 1.8.19 with an ESP32-CAM and got the following, BUT it may be possible to load a model some other way.
text section exceeds available space in board. Sketch uses 3507390 bytes (111%) of program storage space. Maximum is 3145728 bytes.
Global variables use 101168 bytes (30%) of dynamic memory, leaving 226512 bytes for local variables. Maximum is 327680 bytes.
Sketch too big; see https://support.arduino.cc/hc/en-us/articles/360013825179 for tips on reducing it.
Error compiling for board ESP32 Wrover Module.