Almost feels like a paradox of Julia at this point: on the one hand, Julia really needs a stable, high-performance AD engine, but on the other hand it seems to be fairly easy to get a minimal AD package off the ground.

And so the perennial cycle continues: another Julia AD package emerges and ignores most or all previous work in order to claim novelty.

Without claiming completeness: ReverseDiff.jl, ForwardDiff.jl, Zygote.jl, Enzyme.jl, Tangent.jl, Diffractor.jl, and many more whose names have disappeared in the short history of Julia...
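The "easy to get off the ground" point is real: a toy forward-mode AD via dual numbers fits in a couple dozen lines of Julia. A minimal sketch (hypothetical names, not the API of any package listed above):

```julia
# Minimal forward-mode AD with dual numbers -- an illustrative sketch,
# not any of the packages named in this thread.
struct Dual{T<:Real} <: Real
    val::T   # primal value
    der::T   # tangent (derivative)
end

import Base: +, -, *, /, sin, cos

+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
-(a::Dual, b::Dual) = Dual(a.val - b.val, a.der - b.der)
*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)  # product rule
/(a::Dual, b::Dual) = Dual(a.val / b.val, (a.der * b.val - a.val * b.der) / b.val^2)  # quotient rule
sin(a::Dual) = Dual(sin(a.val), cos(a.val) * a.der)   # chain rule
cos(a::Dual) = Dual(cos(a.val), -sin(a.val) * a.der)

# Derivative of f at x: seed the tangent with 1 and read it back out.
derivative(f, x::Real) = f(Dual(x, one(x))).der

f(x) = x * x + sin(x)
derivative(f, 2.0)   # f'(x) = 2x + cos(x), so ≈ 4 + cos(2)
```

The trick is that `Dual <: Real`, so any generic Julia function composed of the overloaded operations differentiates itself for free; the hard part (and where the mature packages earn their keep) is coverage of the whole language, mutation, control flow, and performance.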
Odd that the author excluded ForwardDiff.jl and Zygote.jl, both of which get a lot of mileage in the Julia AD world. Nonetheless, awesome tutorial and great to see more Julia content like this!
Julia is a splendid, high-performance language, and the most overlooked one. It is a huge pity and shame that the entire current AI ecosystem is built on Python/PyTorch. Python is interpreted and, in my view, hardly a real programming language; the loss of performance compared to Julia is enormous.