Convert TensorFlow/TFLite model to MHLO/Linalg standalone

Hi Folks,

As you may know, the IREE project has a TensorFlow frontend that imports a TF model and converts it to IREE/tensor form. Is there any standalone infrastructure that imports a TF model and exports MHLO, which could then be ported into a custom MLIR project? Neither the TensorFlow framework nor mlir-hlo seems to document this model-to-MLIR conversion case. Please correct me if I am wrong. Thanks ahead.


Maybe this is not the right place to ask such a question about converting a TF model to MLIR. Please correct me if so.

Via a Google search, part of a solution is found here, but it is not everything I want. I hope there is a guide showing how to run a model from TF to MLIR:

```
tf-savedmodel-exported-names=predict /path/to/saved_model -o out.mlir
```
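For context, here is a sketch of what the full invocation might look like. The fragment above appears to be a flag for TensorFlow's `tf-mlir-translate` tool; the tool name and the `--savedmodel-objectgraph-to-mlir` translation name are my assumptions, so please verify them against your TensorFlow build:

```shell
# Hedged sketch: import a TF SavedModel into MLIR, exporting only the
# "predict" function. Tool name and translation flag are assumptions.
tf-mlir-translate \
  --savedmodel-objectgraph-to-mlir \
  --tf-savedmodel-exported-names=predict \
  /path/to/saved_model -o out.mlir
```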

Hey Jackwin,

Indeed the best place is to file an issue on TensorFlow GitHub and I can assign the bug there to the responsible parties. TFLite to MHLO/Linalg is not PoR at the moment, so I’d recommend focusing on TF to MHLO in the request there.

Hi jpienaar,

Many thanks. A feature request has been submitted as issue 52887.

Hi Jacques,

> TFLite to MHLO/Linalg is not PoR at the moment

Is there any plan to add a TFLite to StableHLO lowering in the future?
I see a lowering from tfl.custom to stablehlo ops; however, I couldn't find lowerings from other tfl ops to the tfl.custom op or to stablehlo ops.

Thanks in advance.

Not as far as I know. @burmako may know more.

I know there is StableHLO to TFLite, a TFLite to TOSA, and a (preliminary) StableHLO to TOSA. The latter is written using PDLL and could probably be inverted rather easily for the common cases supported, so that one could connect these. A dedicated path may be better, but this would enable some reuse and could work well enough for your use case.

Even though there is no complete-coverage path, it could work. E.g., IREE was doing TFLite to TOSA to Linalg for codegen here, so depending on what you need, you may be able to reach your backend with a combination of these.
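To illustrate, the TFLite to TOSA to Linalg route could be sketched with IREE's tooling roughly as below. The tool names come from IREE's TFLite importer packages, but the exact flags may differ across releases, so treat this as an assumption-laden sketch rather than a verified recipe:

```shell
# Hedged sketch of the TFLite -> TOSA -> Linalg/codegen route via IREE.
# Verify tool names and flags against the IREE release you have installed.

# 1. Import the .tflite flatbuffer into MLIR (TOSA dialect).
iree-import-tflite model.tflite -o model_tosa.mlir

# 2. Compile the TOSA MLIR down through Linalg to a deployable module.
iree-compile --iree-input-type=tosa \
  --iree-hal-target-backends=llvm-cpu \
  model_tosa.mlir -o model.vmfb
```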
