As the author explains, this approach to automatic differentiation (AD) via source-code transformation "supports control flow, higher-order functions and nested derivatives. The differentiated code can be further fed into a traditional compiler such as LLVM, which results in an extremely efficient derivative program. Further, it opens up the opportunity for robust traditional compiler techniques to be extended to machine learning, enabling kernel fusion or compilation for accelerators with no artificial limitations on the kinds of models that researchers can express. This combination has not previously been possible in a high-level, general-purpose programming language."

The author's package, Zygote, makes *all* Julia code differentiable, so *any program* can be optimized as an ML/AI model, learning a set of parameters against some training objective (see the sketch below).[a]

:-)

[a] That said, you will not magically be able to overcome the limits of mathematics, in case you're wondering. See darawk's and b_tterc_p's comments below.
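For a flavor of what "any program" means here, a minimal sketch using Zygote's `gradient` function (the first call is essentially the package README's example; the loop-based one is my own illustration of the control-flow claim):

    using Zygote

    # Differentiate an ordinary closure -- no tracing, no special array types.
    gradient(x -> 3x^2 + 2x + 1, 5)   # (32,): d/dx (3x^2 + 2x + 1) = 6x + 2 at x = 5

    # Plain Julia control flow differentiates too.
    function pow(x, n)
        acc = one(x)
        for _ in 1:n
            acc *= x
        end
        return acc
    end

    gradient(x -> pow(x, 3), 2.0)     # (12.0,): d/dx x^3 = 3x^2 at x = 2

Because the derivative is emitted as ordinary Julia code, it flows through the same compiler pipeline (and LLVM) as the original function, which is where the quoted efficiency claim comes from.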