I came across a reference to the Douglas-Peucker line simplification algorithm in the context of lossy compression for GPS tracks. It's a beautiful thing, but not widely known, perhaps? It made me think: what other gems are there that are not part of the standard CS 101 curriculum? Kalman filtering?<p>https://en.wikipedia.org/wiki/Ramer%E2%80%93Douglas%E2%80%93Peucker_algorithm
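For anyone curious, the whole algorithm fits in a few lines: keep the interior point farthest from the chord between the endpoints if it deviates by more than a tolerance, and recurse on both halves. A minimal Python sketch (recursive and O(n²) worst case; production implementations use an explicit stack and smarter distance handling):

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker: simplify a polyline by recursively keeping
    the point farthest from the endpoint chord, if it exceeds epsilon."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]

    def dist(p):
        # Perpendicular distance from p to the line through the endpoints.
        px, py = p
        num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
        den = math.hypot(y2 - y1, x2 - x1)
        return num / den if den else math.hypot(px - x1, py - y1)

    idx, dmax = max(((i, dist(p)) for i, p in enumerate(points[1:-1], 1)),
                    key=lambda t: t[1])
    if dmax <= epsilon:
        return [points[0], points[-1]]      # flat enough: drop the interior
    left = rdp(points[:idx + 1], epsilon)   # recurse on both halves,
    right = rdp(points[idx:], epsilon)      # sharing the split point
    return left + right[1:]
```

For GPS tracks you'd run this on (lat, lon) pairs with epsilon in the units of your projection; the lossy part is exactly the dropped interior points.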
The Frank-Wolfe algorithm for optimization. Christian Bauckhage has a couple of papers in which he uses it to connect optimization problems to the fixed points of a certain type of recurrent neural network (echo state networks).<p>The Marchenko-Pastur distribution gives, via random matrix theory, a nice theoretical threshold for deciding whether a principal component is more noise than data.<p>Also I am a huge fan of all sorts of embedding/projection/matrix factorization algorithms and I use them quite regularly.
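To make the PCA threshold concrete: for an n x p matrix of i.i.d. noise with variance σ², the Marchenko-Pastur law says the sample-covariance eigenvalues pile up below an upper edge of σ²(1 + √(p/n))², so components whose eigenvalue clearly exceeds that edge are signal candidates. A minimal sketch (function name is my own):

```python
import math

def mp_upper_edge(n_samples, n_features, noise_var=1.0):
    """Upper edge of the Marchenko-Pastur bulk: sample-covariance
    eigenvalues of pure i.i.d. noise concentrate below
    noise_var * (1 + sqrt(p/n))**2 as n and p grow large."""
    gamma = n_features / n_samples
    return noise_var * (1 + math.sqrt(gamma)) ** 2
```

In practice: compute the eigenvalues of your sample covariance, keep the components above this edge, and treat the rest as statistically indistinguishable from noise (under the i.i.d. assumptions of the theorem).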
The k-dimensional Weisfeiler-Leman algorithm
<a href="https://www.iti.zcu.cz/wl2018/wlpaper.html" rel="nofollow">https://www.iti.zcu.cz/wl2018/wlpaper.html</a><p>It connects a huge number of graph isomorphism algorithms with algebra.
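The 1-dimensional case (plain color refinement) is easy to sketch: each round, a vertex's new color is its old color plus the sorted multiset of its neighbors' colors, and differing color histograms prove non-isomorphism. The k-dimensional version refines colors on k-tuples of vertices instead. A toy Python version (names are my own; note 1-WL is only a necessary-condition test):

```python
from collections import Counter

def wl_histogram(adj, rounds=3):
    """1-dimensional Weisfeiler-Leman (color refinement).
    adj maps each vertex to an iterable of its neighbors."""
    color = {v: () for v in adj}
    for _ in range(rounds):
        # New color = (own color, sorted multiset of neighbor colors).
        color = {v: (color[v], tuple(sorted(color[u] for u in adj[v])))
                 for v in adj}
    return Counter(color.values())

def wl_distinguishes(adj1, adj2, rounds=3):
    """Different histograms prove non-isomorphism; equal histograms are
    inconclusive (1-WL cannot tell some non-isomorphic graphs apart)."""
    return wl_histogram(adj1, rounds) != wl_histogram(adj2, rounds)
```

For example, it immediately separates a 3-path from a triangle, while two relabelings of the same graph always get identical histograms.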
The Burrows-Wheeler transform. Especially the bijective variant. I feel like there's some profound magical thing hidden inside it that I sadly don't have nearly enough algebraic chops to find.
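For reference, the classic sentinel-based BWT fits in a few lines of naive O(n² log n) Python; the bijective variant the comment mentions replaces the sentinel with a Lyndon-word factorization and is trickier. A sketch of the plain version:

```python
def bwt(s, sentinel="\x00"):
    """Naive Burrows-Wheeler transform: sort all rotations of s+sentinel
    and take the last column. Real codecs build a suffix array instead."""
    s += sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def ibwt(r, sentinel="\x00"):
    """Invert by repeatedly prepending the transform as a column and
    sorting: after len(r) rounds the rows are the sorted rotations."""
    table = [""] * len(r)
    for _ in range(len(r)):
        table = sorted(c + row for c, row in zip(r, table))
    row = next(row for row in table if row.endswith(sentinel))
    return row[:-1]
```

The "magic" is that the transform is invertible at all, and that it groups同-context characters together (e.g. "banana" becomes "annb·aa"), which is why bzip2-style compressors run it before move-to-front and entropy coding.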
Some of my favorites are:<p>Best First Upper Confidence Bound Tree Search.<p>Monte Carlo Counterfactual Regret Minimization.<p>Temporally Weighted Averaging - e.g. discount the first ten samples.<p>Markov Chain Monte Carlo Sampling<p>Unification
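Of that list, MCMC is probably the easiest to demo in a few lines. A hedged sketch of random-walk Metropolis (the simplest MCMC variant) sampling from a standard normal known only up to its normalizing constant:

```python
import math
import random

def metropolis(log_p, x0, steps, scale=1.0, seed=0):
    """Random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, p(x')/p(x)). The chain's stationary distribution
    is p, which only needs to be known up to a constant."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        if rng.random() < math.exp(min(0.0, log_p(prop) - log_p(x))):
            x = prop
        samples.append(x)
    return samples

# Target: standard normal up to its constant, log p(x) = -x^2 / 2.
draws = metropolis(lambda x: -x * x / 2.0, 0.0, 20000)
```

With enough steps the sample mean and variance approach 0 and 1; tuning `scale` trades acceptance rate against how fast the chain mixes.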