MLIR Code Generation for XLA

Hi,

Is there a path through XLA that generates MLIR directly from an HLO graph? Preferably not in the tf dialect, since few other projects understand it. Using the mhlo dialect would probably be viable (we can use mlir-hlo to convert it), but being able to lower it all the way down to linalg (and, in the future, tcp) would be best.
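
For concreteness, this is roughly the kind of entry-point IR we'd like to get out of XLA (hand-written illustration, so the exact dialect spelling may not match what XLA actually emits):

```mlir
// Hand-written example: an element-wise add expressed in the mhlo dialect,
// using the generic op form so it parses regardless of custom assembly
// formats. Something like this is what we'd want to receive from XLA and
// then lower further (e.g. to linalg).
func.func @add(%lhs: tensor<8xf32>, %rhs: tensor<8xf32>) -> tensor<8xf32> {
  %0 = "mhlo.add"(%lhs, %rhs)
       : (tensor<8xf32>, tensor<8xf32>) -> tensor<8xf32>
  func.return %0 : tensor<8xf32>
}
```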

Looking at this document suggests there is interest, at least for GPUs, but I’m not sure how far along that effort is.

If we wanted to build such a code-gen path, what would be the best place to start?

We’re interested initially in CPU targets, but ultimately, such an effort would make sense for all targets.


I don’t know if this is related to the TF bridges or more to the new OpenXLA SIG:

/cc @thea @Mehdi_AMINI This wasn’t covered in the official SIG OpenXLA announcements on this forum, but I assume the XLA and MLIR labels here will be restricted to TF-bridge-related topics. If so, we could probably find better names for these labels, both on the forum and on the GitHub TF repository.

Yes, we should use Discussions · openxla · GitHub for XLA discussions.

All the pieces you need exist, @Renato_Golin: https://github.com/tensorflow/mlir-hlo should contain most or all of it, but it lacks end-to-end tests. We’re using these pieces in XLA and in IREE. In the next few quarters, XLA for CPU and GPU should have end-to-end pipelines that are roughly StableHLO->MHLO->Linalg->…
An alternative right now is to look into IREE, which has end-to-end integration with MHLO as input, or https://github.com/alibaba/BladeDISC.
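
For later readers: a rough, hand-written sketch of what the MHLO->Linalg step produces on the toy add example above, as linalg on tensors. The exact op spellings (e.g. tensor.empty vs. the older linalg.init_tensor) and the pass names in mlir-hlo vary by revision, so treat this as illustrative only; if I remember correctly, the legalization is exposed on mlir-hlo-opt as --hlo-legalize-to-linalg.

```mlir
// Sketch: the mhlo add legalized to a linalg.generic on tensors.
#map = affine_map<(d0) -> (d0)>
func.func @add(%lhs: tensor<8xf32>, %rhs: tensor<8xf32>) -> tensor<8xf32> {
  // Destination tensor for the result (older toolchains spell this
  // linalg.init_tensor [8] instead of tensor.empty()).
  %init = tensor.empty() : tensor<8xf32>
  // Element-wise add expressed as a single parallel loop.
  %0 = linalg.generic
         {indexing_maps = [#map, #map, #map],
          iterator_types = ["parallel"]}
         ins(%lhs, %rhs : tensor<8xf32>, tensor<8xf32>)
         outs(%init : tensor<8xf32>) {
  ^bb0(%a: f32, %b: f32, %out: f32):
    %sum = arith.addf %a, %b : f32
    linalg.yield %sum : f32
  } -> tensor<8xf32>
  func.return %0 : tensor<8xf32>
}
```

From there, the usual upstream Linalg paths (tiling, bufferization, lowering to loops/LLVM) apply, which is what makes Linalg a convenient common target for CPU and GPU alike.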

Is this still a valid place for TF bridge discussions? If so, we could rename the tags.

Yes: TF/XLA bridge topics are still appropriate here, I think.

Also TF/MLIR, right?