Build TensorFlow static library

I am new to TensorFlow, and not sure whether this question has already been answered. Here is my problem:

I need to convert TensorFlow HLO to the mlir-hlo dialect, and then convert that to my customized MLIR dialect. To do this, I need to link against both the TensorFlow library and LLVM/MLIR. I know that during the TensorFlow build process, a version of LLVM is pulled in and built from source. I have a customized module that wants to use the TensorFlow and LLVM libraries produced by that build. Is it possible?

In other words, how do I build the intermediate libraries used by TensorFlow into static libraries so that other components can use them?

Are you using Bazel for your build? If so, you could just add normal build deps between these; I’ve only ever done this by way of workspaces and external deps. I have not checked what the produced .a libraries look like. I believe some folks have been able to just build and then link as normal. What issues are you running into?
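
For example, a minimal sketch of building one of TF’s library targets directly with Bazel (the target label is one commonly built target, but check TensorFlow’s BUILD files for whichever targets you actually need; your own targets can then depend on TF via @org_tensorflow//... external deps):

# Sketch only: build TF's C++ library with Bazel from a TensorFlow checkout.
bazel build //tensorflow:libtensorflow_cc.so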

Integrating your build with TF can be painful. What we do in IREE is use some import binaries to get from TF models to an MLIR text representation. Then we pass off the MLIR text to our own ecosystem, which builds with both Bazel and CMake (getting TF to interop with CMake was a no-go). So you could do something similar and use tf-opt (or your own standalone tool integrated with the TF Bazel build system) to get to an MHLO representation and then hand it off from there. In IREE, our input to the core compiler is now linalg, but we used to use MHLO. MHLO, specifically, does not need TF to build and has CMake support. There’s also a standalone repository if you want to use that instead.
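
As a rough illustration of that hand-off, the legalization step might look something like this (the pass name --xla-legalize-tf is an assumption based on the older TF-to-MHLO bridge passes; check tf-opt --help for the current spelling, and the file names are placeholders):

# Sketch only: legalize TF dialect ops to MHLO, then hand the text off to your own tools.
tf-opt --xla-legalize-tf model_tf.mlir -o model_mhlo.mlir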


Hi, I am trying to do something very similar; did you ever have much luck?

I am using TensorFlow + Bazel to build a dynamic library and want it to be static instead.

I am using --config=monolithic and everything is built into a static binary.
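
For reference, a sketch of that invocation (the target label is a placeholder; substitute whatever you are building):

# --config=monolithic is defined in TensorFlow's .bazelrc and links everything into a single library.
# Hypothetical target label; replace with your own.
bazel build --config=monolithic //your/pkg:your_target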


Hi! Can anyone guide me through the process of converting a TensorFlow model into MLIR? I’m using:
iree-import-tf saved_model/ -o output.mlir

Where I’ve generated the SavedModel directory using:

import tensorflow as tf
import tensorflow_hub as hub

# Download the pre-trained model from TensorFlow Hub
model_url = "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/classification/4"
model = hub.load(model_url)

# Save the model in the SavedModel format
tf.saved_model.save(model, "saved_model/")

But when I run mlir-opt --print-ir on the output.mlir generated by iree-import-tf, it only prints empty modules:

./mlir-opt --print-ir output.mlir
// -----// IR Dump //----- //
module {
}
module {
}

Can anyone guide me on how to get the MLIR intermediate representation of a model?
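
For what it’s worth, a quick way to check what iree-import-tf actually produced is to let mlir-opt simply parse and re-print the file with no pass flags (it echoes the parsed module by default), or to look at the raw text directly:

# mlir-opt with no pass flags just parses, verifies, and re-prints the module.
./mlir-opt output.mlir
# Or inspect the text the importer emitted.
cat output.mlir

If the re-printed module is still empty, that would suggest the importer itself emitted an empty module rather than mlir-opt dropping anything.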