MLIR News, 49th edition (21st June 2023)

Welcome to the 49th issue of the MLIR Newsletter, covering developments in MLIR and related projects in the ecosystem. Please send any tips or feedback to javed.absar@gmail.com. We welcome your contributions. Click here to see the previous edition.

Highlights

  • All apply functions now have a TransformRewriter & parameter: “This rewriter should be used to modify the IR. It has a TrackingListener attached and updates the internal handle-payload mappings based on rewrites.” A sketch of the new signature appears after this list. Diff here. See Combining Existing Transformations and also RFC - Introduce the concept of IR listeners in MLIR for wider context.

  • Ivan Butygin and Diptorup Deb from Intel presented progress on Numba-MLIR, a Python compiler based on Numba and MLIR. The goal of the project is to create a compiler for numeric Python code. Find the slides here.
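Following up on the transform dialect change above, here is a minimal sketch of what an apply implementation looks like after the change. MyTransformOp and getTarget() are hypothetical placeholders for a concrete transform op; the point is that IR mutations go through the TransformRewriter so its TrackingListener can keep handle-payload mappings up to date.

```cpp
#include "mlir/Dialect/Transform/IR/TransformInterfaces.h"
#include "mlir/IR/Operation.h"

using namespace mlir;

// Hypothetical transform op; only the signature and rewriter usage matter here.
DiagnosedSilenceableFailure
MyTransformOp::apply(transform::TransformRewriter &rewriter,
                     transform::TransformResults &results,
                     transform::TransformState &state) {
  // Walk the payload ops associated with this op's operand handle.
  for (Operation *payload : state.getPayloadOps(getTarget())) {
    // All IR modifications should go through `rewriter`, e.g.
    //   rewriter.eraseOp(payload);
    // so the attached TrackingListener can update the handle-payload mappings.
    (void)payload;
  }
  return DiagnosedSilenceableFailure::success();
}
```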

MLIR RFC Discussions

  • Alex elaborated on memref address calculation and emphasized: “… heavily discourage the use of reinterpret_cast because it is effectively a construct-an-arbitrary-memref-from-another-memref kind of operation that removes most of the benefits of using structured references to start with”. Find the discussion here. A sketch of the underlying address arithmetic appears after this list.

  • On the numba-mlir thread: “… speaking of CFG-to-SCF conversion specifically, we have completely rewritten it recently, based on this paper.” More discussion here and a link to the diff here.

  • On “what passes to call before linalg-to-affine”, folks pointed to the available passes and their contracts, although more documentation on this topic would be useful. Discussion here. One possible pass ordering is sketched after this list.

  • On “changing the type of an Op while cloning”: “… yeah you can change the type of a value (to be done with great care of course!), you can’t change the number of results for an Operation.” Pass infrastructure here. Discussions here. A minimal cloning sketch appears after this list.

  • Matthias highlighted: “-tensor-bufferize, -finalizing-bufferize are deprecated and should not be used anymore. Just run -one-shot-bufferize. That’s the only pass that you [need] for tensor->memref in most cases.” Discussion here. A corresponding One-Shot Bufferize sketch also appears after this list.
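For the memref address-calculation thread: below is a minimal C++ sketch (illustrative names, not MLIR’s actual classes) of a ranked, strided memref descriptor and its element-address arithmetic. Because memref.reinterpret_cast can set the offset, sizes, and strides to arbitrary values, it sidesteps the invariants that structured memref ops otherwise maintain, which is why its use is discouraged.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative mirror of a ranked, strided memref descriptor. Element
// (i0, ..., i_{n-1}) lives at:
//   alignedPtr + offset + i0*stride0 + ... + i_{n-1}*stride_{n-1}
// where offset and strides are counted in elements, not bytes.
struct StridedMemRefDescriptor {
  float *allocatedPtr;          // pointer returned by the allocator
  float *alignedPtr;            // possibly adjusted for alignment
  int64_t offset;               // in elements
  std::vector<int64_t> sizes;   // one extent per dimension
  std::vector<int64_t> strides; // in elements, one per dimension
};

// Computes the address of the element at `indices`. An op that can overwrite
// offset/sizes/strides wholesale effectively builds an arbitrary memref,
// bypassing the guarantees of structured memref ops.
float *elementAddress(const StridedMemRefDescriptor &m,
                      const std::vector<int64_t> &indices) {
  int64_t linear = m.offset;
  for (std::size_t d = 0; d < indices.size(); ++d)
    linear += indices[d] * m.strides[d];
  return m.alignedPtr + linear;
}
```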
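For the “what passes to call before linalg-to-affine” question, here is a hedged sketch of one possible ordering (not the exact pipeline from the thread): bufferize tensors first, then lower linalg ops on memrefs to affine loops, then lower affine constructs further. Pass constructors and nesting follow upstream MLIR around mid-2023 and may differ in other revisions.

```cpp
#include "mlir/Conversion/AffineToStandard/AffineToStandard.h"
#include "mlir/Dialect/Bufferization/Transforms/Passes.h"
#include "mlir/Dialect/Func/IR/FuncOps.h"
#include "mlir/Dialect/Linalg/Passes.h"
#include "mlir/Pass/PassManager.h"

// One possible ordering: tensors are bufferized before the linalg-to-affine
// lowering, which expects linalg ops operating on memrefs.
void buildLinalgToAffinePipeline(mlir::PassManager &pm) {
  pm.addPass(mlir::bufferization::createOneShotBufferizePass());
  pm.addNestedPass<mlir::func::FuncOp>(
      mlir::createConvertLinalgToAffineLoopsPass());
  // Optionally continue lowering affine loops and maps.
  pm.addPass(mlir::createLowerAffinePass());
}
```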
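On changing a result type while cloning: a minimal sketch, assuming the new type is known to be acceptable to all uses of the result. The helper name is illustrative. Cloning preserves the number of results; only the type of an existing Value is changed in place.

```cpp
#include "mlir/IR/Builders.h"
#include "mlir/IR/Operation.h"

using namespace mlir;

// Clone `op` and give its first result a different (compatible) type.
// The number of results cannot change; only the Value's type is updated,
// and every use of that result must be able to accept the new type.
Operation *cloneWithNewResultType(OpBuilder &builder, Operation *op,
                                  Type newType) {
  Operation *cloned = builder.clone(*op);
  cloned->getResult(0).setType(newType);
  return cloned;
}
```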
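And for the bufferization note: a short sketch of adding One-Shot Bufferize from C++ in place of the deprecated -tensor-bufferize / -finalizing-bufferize pair. The bufferizeFunctionBoundaries option shown here also bufferizes function signatures and calls; exact option and constructor names may vary between MLIR revisions.

```cpp
#include "mlir/Dialect/Bufferization/Transforms/OneShotAnalysis.h"
#include "mlir/Dialect/Bufferization/Transforms/Passes.h"
#include "mlir/Pass/PassManager.h"

// One-Shot Bufferize replaces the old multi-pass tensor->memref flow.
void addBufferizationPasses(mlir::PassManager &pm) {
  mlir::bufferization::OneShotBufferizationOptions options;
  // Also bufferize across function boundaries (arguments, results, calls).
  options.bufferizeFunctionBoundaries = true;
  pm.addPass(mlir::bufferization::createOneShotBufferizePass(options));
}
```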

MLIR Commits

Useful Links
