Description of the project: Enzyme performs automatic differentiation (in the calculus sense) of LLVM programs. This enables users to apply Enzyme to algorithms such as back-propagation in ML or scientific simulation on existing code in any language that lowers to LLVM. Enzyme already implements forward and reverse mode automatic differentiation. Enzyme also implements vector forward mode automatic differentiation, which allows Enzyme to batch the derivative computation of several objects in a single call. The goal of this project is to extend this capability in order to perform vector reverse mode. In doing so, multiple sweeps of reverse mode automatic differentiation can be performed at the same time, reducing memory usage and runtime, and generally enabling further optimization.
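To make the batching idea concrete, the following is a minimal hand-written sketch (not Enzyme's API or generated code) of what vector reverse mode computes: for a function R^2 -> R^2, each adjoint seed yields one row of the Jacobian, and a batched (width-W) reverse mode propagates all W seeds while sharing a single forward pass. The function `jacobian_batched` and the batch width `W` are illustrative names, not part of Enzyme.

```cpp
#include <array>
#include <cmath>

// f : R^2 -> R^2,  f(x) = (x0 * x1, sin(x0) + x1)
// Scalar reverse mode: one adjoint seed per sweep -> one Jacobian row.
// Vector reverse mode: propagate W seeds at once, sharing the forward pass.
constexpr int W = 2; // batch width (number of adjoint seeds)

std::array<std::array<double, 2>, W>
jacobian_batched(double x0, double x1) {
    // Forward pass values (computed once, reusable by all W adjoint sweeps).
    double v0 = x0 * x1;           // output 0
    double v1 = std::sin(x0) + x1; // output 1
    (void)v0; (void)v1;

    // W adjoint seeds: the standard basis e0, e1 of the output space.
    double seed[W][2] = {{1.0, 0.0}, {0.0, 1.0}};

    std::array<std::array<double, 2>, W> grad{}; // grad[i] = row i of J
    for (int i = 0; i < W; ++i) {
        // Reverse sweep for seed i; in true vector reverse mode these W
        // sweeps are fused, so intermediates like cos(x0) are shared
        // rather than recomputed per sweep.
        double bar_v0 = seed[i][0], bar_v1 = seed[i][1];
        grad[i][0] = bar_v0 * x1 + bar_v1 * std::cos(x0); // d f_i / d x0
        grad[i][1] = bar_v0 * x0 + bar_v1 * 1.0;          // d f_i / d x1
    }
    return grad;
}
```

In Enzyme the fusion of these sweeps would happen at the LLVM IR level during differentiation, which is what lets the batched version reuse the tape and intermediate values across all seeds instead of running the reverse pass W separate times.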
Expected results: Vectorized version of reverse mode automatic differentiation
Confirmed mentor: William Moses (@wsmoses), Tim Gymnich (@tgymnich)
Desirable skills: Good knowledge of C++ and some experience with LLVM APIs. Experience with Enzyme or automatic differentiation is helpful, but can also be learned during the project.