Hi, I was recently trying to selectively reduce the TF Lite delegates for a model (link here) by following the "Reduce TensorFlow Lite binary size" guide. I set up Docker using the "Build TensorFlow Lite for Android" guide, and I have no problem building fat binaries; the only issue is the delegate reduction. I have tried both the tensorflow:devel and tensorflow:devel-latest containers, and both fail with the same behavior, resembling "Server terminated abruptly (error code: 14, error message: 'Socket closed')" (issue #41480 on tensorflow/tensorflow).
The steps after setting up Docker are as follows:
- copy my models into a folder in the container
- run:

```shell
bash tensorflow/lite/tools/build_aar.sh \
  --input_models=/host_dir/smallbert_L6_H128,/host_dir/smallbert_L12_H128_mean.tflite \
  --target_archs=arm64-v8a,armeabi-v7a
```
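One sanity check I have started running before build_aar.sh (my own addition, not part of the guide) is verifying that every path passed via --input_models actually exists, since a mistyped path only fails deep into the Bazel build. I also noticed my first model path has no .tflite extension, which I still need to double-check:

```shell
# Check that each comma-separated model path exists before kicking off the build.
MODELS="/host_dir/smallbert_L6_H128,/host_dir/smallbert_L12_H128_mean.tflite"
for m in $(echo "$MODELS" | tr ',' ' '); do
  [ -f "$m" ] || echo "missing model: $m"
done
```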
The error occurs during the same exact phase and, if I remember correctly, on almost the same files across the several attempts I made. Also, the compilation stops but the timer keeps running for another 100-200 seconds each time, while the disk is read at about 2 GB/s for the minute or so that the timer continues.
One possibility I found was low memory, but I have plenty of RAM left for the process, so I am not sure that is it.
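Still, since the linked issue #41480 mentions the Bazel server being killed, one thing I plan to try anyway is capping Bazel's resource usage inside the container. This is just a sketch with guessed values, not a confirmed fix:

```
# ~/.bazelrc inside the container (sketch; the values are guesses on my part)
startup --host_jvm_args=-Xmx4g    # cap the Bazel server JVM heap
build --jobs=4                    # fewer parallel compile actions
build --local_ram_resources=4096  # tell Bazel to assume ~4 GB of RAM
```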
One more thing I would like to mention: I have tried compiling custom binaries by editing the BUILD file, since maybe adding a specific target with only the TF delegates I need is possible, but I have no experience with how to approach it either:
```
tflite_jni_binary(
    name = "libtensorflowlite_jni_normal_with_gpu.so",
    linkscript = ":tflite_version_script.lds",
    deps = [
        "//tensorflow/lite/c:c_api",
        "//tensorflow/lite/c:c_api_experimental",
        "//tensorflow/lite/delegates/nnapi/java/src/main/native",
        "//tensorflow/lite/delegates/xnnpack:xnnpack_delegate",
        "//tensorflow/lite/java/src/main/native",
        "//tensorflow/lite/delegates/gpu/java/src/main/native",
    ],
)
```
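My plan was to build that rule with something like the commands below. This is only a sketch: I am assuming the rule would sit in tensorflow/lite/java/BUILD, so the label is a guess on my part, and I do not know whether this is the right way to get per-delegate binaries:

```shell
# Print (rather than run) the bazel invocations I intended to use for the
# custom rule above. Assumption: the rule lives in tensorflow/lite/java/BUILD,
# so the label below is a guess.
TARGET="//tensorflow/lite/java:libtensorflowlite_jni_normal_with_gpu.so"
for config in android_arm64 android_arm; do
  echo "bazel build -c opt --config=$config $TARGET"
done
```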
Any help would be appreciated.