Need help including the TensorFlow dialect


To handle an AI use case, I am trying to include the TensorFlow dialect in my MLIR project. I am having difficulties doing so, and I’m looking for help (e.g., best practices, tutorials, etc.).

One of my main problems here is that the hierarchy of Bazel packages is not really a hierarchy: there are lateral dependencies between packages. Consider, for instance, one of the simpler packages, namely github/tensorflow/tensorflow/compiler/mlir/hlo. Here, everything is well-encapsulated except for the include #include "tensorflow/compiler/xla/test.h" used in lib/utils/. It’s often the test infrastructure that poses these hierarchy problems.

Of course, I could try to slice away just the source files that do not use the test infrastructure. However, there must be a non-manual way of doing this, one that would allow me to keep the link between the GitHub repositories as they evolve.

Any insight into how this should be done or into how you did it would help me a lot.

Thank you in advance,


You mention including the TF dialect, but then also mention different test dependencies, which is confusing: you’d normally include the specific targets you care about rather than everything at the package level. Bazel build files are fine-grained in this regard. Is the question which Bazel build targets (e.g., tensorflow/compiler/mlir/tensorflow) you should use to be able to include the different dialects from the TF repo?
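For illustration, depending on a single fine-grained target could look like the sketch below. All labels here are assumptions: `@org_tensorflow` is the workspace name TensorFlow conventionally uses, and the library name and source file are made up — adjust to your actual setup and the real BUILD graph.

```starlark
# Hypothetical out-of-tree library depending on one specific TF dialect
# target rather than on a whole package. "my_passes" and "my_passes.cc"
# are illustrative placeholders.
cc_library(
    name = "my_passes",
    srcs = ["my_passes.cc"],
    deps = [
        # The TensorFlow MLIR dialect, pulled in as a single target:
        "@org_tensorflow//tensorflow/compiler/mlir/tensorflow",
    ],
)
```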



@mehdi_amini has created most of the bazel rules so he’d be a good person to ask.

I guess you are exactly right. This is my first time using Bazel; the solution may be just including the right BUILD targets, which would allow me to avoid pulling in files I don’t want to depend on.



I don’t love how this is organized for out-of-tree use, and I don’t completely understand what exactly you’re trying to get to. However, I’ll offer these examples as the most stand-alone ways that I found to interoperate with the various layers of the TF stack (for extracting programs):

The corresponding Bazel rules should have what is needed for independent use.

Thanks a lot. What we’re trying to do is to have, in a tool of our conception, the following:

  1. the MLIR dialects of llvm-project and the ability to use lowering passes defined in llvm-project/mlir.
  2. the MLIR dialects and lowering passes of tensorflow.
  3. MLIR dialects and lowering passes of our own
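The three ingredients above could, in principle, meet in one Bazel target. A sketch, with the caveat that every label here is an assumption about workspace and target names and may not match the real BUILD graphs:

```starlark
# Hypothetical binary combining (1) upstream MLIR, (2) TensorFlow dialects,
# and (3) our own dialect library. All labels are illustrative.
cc_binary(
    name = "our-tool",
    srcs = ["main.cc"],
    deps = [
        ":our_dialect",                    # (3) our own dialects and passes
        "@llvm-project//mlir:IR",          # (1) core MLIR
        "@llvm-project//mlir:Transforms",  # (1) upstream lowering passes
        "@org_tensorflow//tensorflow/compiler/mlir/tensorflow",  # (2) TF dialect
    ],
)
```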

We currently have (1) and (3) fully integrated. I’ve been trying for some time now to integrate TensorFlow (and HLO).

I’ve looked at the three files you linked. These are C++ files, and I do know how to handle the C++-level part. What I do not know how to do is make sure that only the parts of TensorFlow that I need are recompiled each time I update the TensorFlow Git checkout.
This is, I guess, the Bazel part.
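For what it’s worth, incremental recompilation is largely what Bazel does by design: if TensorFlow is pinned as an external repository, Bazel’s action cache rebuilds only the targets whose transitive inputs actually changed when the pin moves. A hedged WORKSPACE sketch, where `<COMMIT>` and `<SHA256>` are placeholders rather than real values:

```starlark
# Sketch of a WORKSPACE entry pinning TensorFlow at a fixed commit.
# Updating <COMMIT>/<SHA256> triggers rebuilds only of affected targets.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "org_tensorflow",
    sha256 = "<SHA256>",
    strip_prefix = "tensorflow-<COMMIT>",
    urls = ["https://github.com/tensorflow/tensorflow/archive/<COMMIT>.tar.gz"],
)
```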

Best regards,

Not to be melodramatic, but to a first approximation all of TensorFlow depends on every other part of TensorFlow. It is my belief that the BUILD file next to those files I link has the minimal set of dependencies needed to do things with the various parts that they cover. TensorFlow is a very hard project to integrate and keep working as an out-of-tree consumer of its low-level facilities. We (IREE) do it because it is important to us, and it costs a lot: it is relatively easy to plumb something together, but it has a tendency to fly apart. Through a long history on this, we ultimately went so far as to have a solid division point between anything that touches TensorFlow dialects and everything else (even as a separate binary that is used by way of pipes and non-C++ APIs).

The core problem you are going to run into is that the high-level TensorFlow dialects (not MHLO, which has been preserved as a separate entity with strict deps) depend on the full TensorFlow executor and op/kernel library for correctness (in practice – it is technically correct to do no constant folding, but that is only useful in isolated situations). There simply is no minimal build of that: you’re in for the whole thing. And since you want to co-mingle your own compiler assets with it at the C++ level, you have to track what is there (or build a defensible boundary like we did).


This feedback will be very useful for us (albeit a bit frightening). Thanks. @qaco