Title: DuoLingo-AutoDiff: In-Database Automatic Differentiation with MLIR
Authors: Gutjahr, Kevin; Ruck, Clemens; Schüle, Maximilian E. (ORCID: 0000-0003-1546-269X)
Date: 2025-11-26
Year: 2025
ISBN: 979-8-4007-1924-0
Handle: https://fis.uni-bamberg.de/handle/uniba/111824
DOI: 10.1145/3735654.3735943
Type: Conference object
Language: English
Keywords: Automatic Differentiation; In-Database Machine Learning; Query Compilation; MLIR

Abstract: Forward and reverse mode automatic differentiation evaluate the gradient of a model function efficiently by caching the results of partial derivatives. Just-in-time compilation improves the runtime of automatic differentiation by eliminating function calls and storing partial derivatives in virtual registers. This paper presents the first open-source implementation of automatic differentiation with MLIR and LingoDB. The evaluation compares optimizations applied to forward and reverse modes. It shows that sub-expressions that appear frequently within the calculation are reused after MLIR performs its optimizations. Additionally, reverse mode outperforms forward mode because it generates less code.
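The forward-mode differentiation described in the abstract can be illustrated with a minimal dual-number sketch. This is not code from the paper (which compiles to MLIR inside LingoDB); all names here are illustrative, and the example only shows how a single forward pass propagates a partial derivative alongside the value:

```python
class Dual:
    """Dual number carrying a value and its derivative (forward-mode AD)."""

    def __init__(self, val, dot=0.0):
        self.val = val
        self.dot = dot  # derivative with respect to the chosen input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # sum rule: d(u + v) = u' + v'
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: d(u * v) = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__


def partial(f, x, i):
    """Partial derivative of f at point x w.r.t. x[i], via one forward pass."""
    duals = [Dual(v, 1.0 if j == i else 0.0) for j, v in enumerate(x)]
    return f(*duals).dot


# f(x, y) = x*y + x*x, so df/dx = y + 2x and df/dy = x
f = lambda x, y: x * y + x * x
print(partial(f, [3.0, 4.0], 0))  # df/dx at (3, 4) -> 10.0
print(partial(f, [3.0, 4.0], 1))  # df/dy at (3, 4) -> 3.0
```

Note that each forward pass yields one column of the Jacobian, so the full gradient of an n-input function needs n passes; reverse mode obtains all partials in a single backward sweep, which is consistent with the paper's observation that reverse mode generates less code.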