Whether or not one fully understands the old code, rewriting always carries a chance of
breakage. Ceteris paribus, I favor rewriting anyway, if only because it gives the
rewriting programmer easier maintenance going forward. When breakage later
reveals why the old code was written in its peculiar way, the hope is that
the cost of that breakage is still less than the time it would have taken to
understand the peculiarity up front. Obviously, for big stakes, e.g., the Google
search algorithm, this "move fast and break things" ethos is untenable.
This actually hints at a trap. There's something akin to the Dunning-Kruger effect in programming: because code is written in a Turing-complete language, you can't necessarily understand it, yet you can easily believe that you do. Then some weird side effect gets eliminated or introduced in the rewrite, and your system starts failing.
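A minimal sketch of the kind of thing I mean (the function names and data are invented for illustration): the original code mutates its argument as a side effect, some distant caller quietly depends on that, and a "cleaner" rewrite that drops the mutation breaks the caller even though the rewrite looks obviously correct.

```python
def normalize_prices_v1(prices):
    # Original: sorts the caller's list in place as a side effect,
    # then returns rounded copies.
    prices.sort()
    return [round(p, 2) for p in prices]

def normalize_prices_v2(prices):
    # Rewrite: "purer" version that no longer mutates the argument.
    return [round(p, 2) for p in sorted(prices)]

# A distant caller that silently relied on the in-place sort:
raw = [3.109, 1.5, 2.25]
normalize_prices_v1(raw)
assert raw[0] == 1.5        # held under v1: raw was sorted as a side effect

raw = [3.109, 1.5, 2.25]
normalize_prices_v2(raw)
print(raw[0])               # 3.109 -- the caller's hidden assumption now fails
```

Nothing in v2 is wrong in isolation; the failure only shows up wherever the old side effect was being leaned on, which is exactly the part of the system the rewriter believed they understood.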