Using Google Coral (TPU) with MLIR

What I want to do is generate MLIR from a custom language (just like the Toy tutorial does) and compile it to a TPU binary so it can run on a Google Coral board.
I have seen some slides and diagrams stating that MLIR can support TPUs, but I can't find any examples or use cases of running Coral with MLIR, apart from internal usage inside TensorFlow.
Is this possible? If it is, how can it be achieved?

The Coral TPU stack is proprietary, so other than targeting TFLite I don’t think there is much to do here. @jingpu may have more info.

The Coral Edge TPU only supports the TFLite programming interface. I don't think there is much to do here either.
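To make that concrete, the supported route onto the Edge TPU is a fully int8-quantized TFLite model, which is then fed to the (closed-source) edgetpu_compiler. A minimal sketch of that conversion, assuming a stand-in Keras model where your real graph would come from whatever your frontend lowers to:

```python
# Sketch of the Edge TPU-compatible conversion path. The tiny Dense model
# and the random calibration data are placeholders, not part of the thread.
import numpy as np
import tensorflow as tf

# Stand-in model; a real pipeline would lower the custom language to a
# TensorFlow/Keras graph (or SavedModel) first.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
])

def representative_dataset():
    # Calibration samples used to pick the int8 quantization ranges.
    for _ in range(16):
        yield [np.random.rand(1, 8).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# The Edge TPU requires every op to be int8.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)

# On a machine with the Coral tooling installed, the last step is:
#   edgetpu_compiler model_int8.tflite
```

Anything the converter rejects at this stage simply has no path to the Coral board, which is the point being made above.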

I have seen some slides and diagrams stating that MLIR can support TPUs

I don't know of any slides like that. Maybe they are referring to Cloud TPUs?

There is also an important difference between what it *can* support and what interface is supported in a product sense. Even for Cloud TPUs the supported interface is HLO/MHLO only (with a sprinkling of non-HLO ops that carry fewer guarantees).

Targeting TFLite directly* is best. Of course, I'd be interested if you have cases that couldn't be supported that way. E.g., if an affine loop or an ml_program load would enable a use case, that's an interesting signal.
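For illustration, the kind of input being asked about here might look like the following (a hypothetical snippet using the upstream affine dialect; the function name and shapes are made up, and the TFLite converter does not accept this form today):

```mlir
// An affine loop nest like this has no direct TFLite builtin-op
// equivalent, so it can't be fed to the converter as-is.
func.func @scale(%arg0: memref<8xf32>) {
  %c2 = arith.constant 2.0 : f32
  affine.for %i = 0 to 8 {
    %v = affine.load %arg0[%i] : memref<8xf32>
    %r = arith.mulf %v, %c2 : f32
    affine.store %r, %arg0[%i] : memref<8xf32>
  }
  return
}
```

A frontend that produces IR like this would first have to raise or re-express it in terms the TFLite converter understands before any Coral deployment is possible.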

[*] Or, more precisely, whatever the TFLite converter supports: if you look at the code, it allows a bit more than what's "on the can" for the adventurous. But you are off the beaten track then and can run into sharp edges, so don't build a product around undocumented behavior. It could still allow for experiments, and for further discussion to see whether it intersects with planned support.