Need help compiling TF Serving with custom TF kernels

Hey, I am looking for some help debugging a compilation issue with TensorFlow Serving. I have a TensorFlow fork with some custom kernels that compiles and runs without issue. Unfortunately, when I try to compile Serving against this same TensorFlow, I am unable to get past the point shown below. After a quick Google search and multiple iterations, I don't think those warnings are the error that stopped the build. Does anyone have suggestions for how to get Bazel to provide more information about the error that is preventing the build?

root@32863057b38d:/tensorflow/tm/serving# bazel --host_jvm_args="-Xms1024m" --host_jvm_args="-Xmx2048m" build --color=yes --curses=yes --host_javabase="@local_jdk//:jdk" --verbose_failures -c opt tensorflow_serving/model_servers:tensorflow_model_server
DEBUG: Rule 'io_bazel_rules_docker' indicated that a canonical reproducible form can be obtained by modifying arguments shallow_since = "1556410077 -0400"
DEBUG: Repository io_bazel_rules_docker instantiated at:
  no stack (--record_rule_instantiation_callstack not enabled)
Repository rule git_repository defined at:
  /root/.cache/bazel/_bazel_root/4ec26d1a12bea518b8df81fbc61cec69/external/bazel_tools/tools/build_defs/repo/git.bzl:195:33: in <toplevel>
WARNING: /root/.cache/bazel/_bazel_root/4ec26d1a12bea518b8df81fbc61cec69/external/org_tensorflow/tensorflow/compiler/xla/client/lib/BUILD:57:11: in hdrs attribute of cc_library rule @org_tensorflow//tensorflow/compiler/xla/client/lib:comparators: file 'libliteral_util.a' from target '@org_tensorflow//tensorflow/compiler/xla:literal_util' is not allowed in hdrs. Since this rule was created by the macro 'cc_library', the error might have been caused by the macro implementation
WARNING: /root/.cache/bazel/_bazel_root/4ec26d1a12bea518b8df81fbc61cec69/external/org_tensorflow/tensorflow/compiler/xla/client/lib/BUILD:57:11: in hdrs attribute of cc_library rule @org_tensorflow//tensorflow/compiler/xla/client/lib:comparators: file 'libliteral_util.pic.a' from target '@org_tensorflow//tensorflow/compiler/xla:literal_util' is not allowed in hdrs. Since this rule was created by the macro 'cc_library', the error might have been caused by the macro implementation
WARNING: /root/.cache/bazel/_bazel_root/4ec26d1a12bea518b8df81fbc61cec69/external/org_tensorflow/tensorflow/compiler/xla/client/lib/BUILD:57:11: in hdrs attribute of cc_library rule @org_tensorflow//tensorflow/compiler/xla/client/lib:comparators: file 'libliteral_util.so' from target '@org_tensorflow//tensorflow/compiler/xla:literal_util' is not allowed in hdrs. Since this rule was created by the macro 'cc_library', the error might have been caused by the macro implementation
WARNING: errors encountered while analyzing target '//tensorflow_serving/model_servers:tensorflow_model_server': it will not be built
INFO: Analyzed target //tensorflow_serving/model_servers:tensorflow_model_server (9 packages loaded, 5410 targets configured).
INFO: Found 0 targets...
ERROR: command succeeded, but not all targets were analyzed
INFO: Elapsed time: 27.122s, Critical Path: 0.02s
INFO: 0 processes.
FAILED: Build did NOT complete successfully

Are you following this guide:

Thank you, I will try to incorporate that guide and see how it goes.

Ok, I have tried compiling statically, but I still get no more information from the build when it stops with "ERROR: command succeeded, but not all targets were analyzed". For a little more context: we are also adding a custom device, and all our kernels are compiled directly with the TF build process. It would help if there were a way to make Bazel more verbose about the analysis errors.
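
For example, these are the standard Bazel verbosity flags I know of (they are not TF-specific, and most of them only affect the execution phase, so I am not sure any of them help with an analysis-phase failure like this):

  # --verbose_failures: print full command lines for failing actions
  # --subcommands: print every command Bazel executes
  # --explain/--verbose_explanations: log why each action is (re)run
  # --keep_going: keep analyzing/building past the first error
  bazel build -c opt \
      --verbose_failures \
      --subcommands \
      --explain=/tmp/bazel_explain.log \
      --verbose_explanations \
      --keep_going \
      tensorflow_serving/model_servers:tensorflow_model_server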

Since you also have a custom device, I don’t know whether this could be better handled with the pluggable device API. /cc @penporn

Thank you @Bhack for flagging me, and sorry for the late reply! I think the WARNING messages here are indeed blocking the build. "ERROR: command succeeded, but not all targets were analyzed" means Bazel couldn’t analyze all of the dependencies that your build target depends on, hence it failed to build. The WARNING messages explain each failure:

  • //tensorflow/compiler/xla/client/lib/BUILD:57:11:
    In hdrs attribute of cc_library rule @org_tensorflow//tensorflow/compiler/xla/client/lib:comparators: file libliteral_util.a from target @org_tensorflow//tensorflow/compiler/xla:literal_util is not allowed in hdrs.

  • //tensorflow/compiler/xla/client/lib/BUILD:57:11:
    In hdrs attribute of cc_library rule @org_tensorflow//tensorflow/compiler/xla/client/lib:comparators: file libliteral_util.pic.a from target @org_tensorflow//tensorflow/compiler/xla:literal_util is not allowed in hdrs.

  • //tensorflow/compiler/xla/client/lib/BUILD:57:11:
    In hdrs attribute of cc_library rule @org_tensorflow//tensorflow/compiler/xla/client/lib:comparators: file libliteral_util.so from target @org_tensorflow//tensorflow/compiler/xla:literal_util is not allowed in hdrs.

It sounds like libliteral_util.a, libliteral_util.pic.a, and libliteral_util.so were put in an hdrs attribute. The hdrs attribute should only contain header files, or filegroup targets containing header files. Those library files should go in other attributes, such as data or linkopts [1, 2, 3], instead.
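
As a rough sketch (not your actual BUILD file; the file and target names below are placeholders based on the log), the distinction looks like this: only headers go in hdrs, and other libraries are pulled in by depending on their targets rather than by listing their .a/.so output files. If you really have a prebuilt archive or shared object, wrapping it in cc_import is one possible alternative to the attributes mentioned above:

  # Hypothetical BUILD sketch -- placeholder names, not the real XLA rules.
  cc_library(
      name = "comparators",
      srcs = ["comparators.cc"],
      hdrs = ["comparators.h"],  # header files only
      deps = [
          ":literal_util",  # depend on the library target, not its output files
      ],
  )

  # One option for a prebuilt archive/shared object, instead of listing the
  # files in hdrs:
  cc_import(
      name = "literal_util_prebuilt",
      static_library = "libliteral_util.a",
      shared_library = "libliteral_util.so",
  )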