The author is asking for overly precise numbers *in scientific papers* because the science data publishing/sharing/reproducibility process is broken.

I won't argue against this specific case. But in almost any other context, I think comprehension would be better served with *zero* significant digits, or at most one. This is one reason I propose magnitude notation: https://saul.pw/mag
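To give a concrete sense of what rounding to one significant digit looks like, here is a minimal Python sketch; the round_to_sigfigs helper is my own illustration, not the magnitude-notation spec from saul.pw/mag:

    import math

    def round_to_sigfigs(x, sigfigs=1):
        """Round x to the given number of significant digits (illustrative only)."""
        if x == 0:
            return 0.0
        exponent = math.floor(math.log10(abs(x)))
        factor = 10 ** (exponent - sigfigs + 1)
        return round(x / factor) * factor

    # One significant digit is often enough for comprehension:
    print(round_to_sigfigs(299792458))  # 300000000.0 (speed of light, m/s)
    print(round_to_sigfigs(0.082057))   # 0.08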
Wouldn’t the proper solution be to fix how data is published? Instead of making the actual paper unreadable in order to help people replicate it, why not require a standardized structure for providing the data and code needed to reproduce the calculations?
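As a rough illustration, such a requirement could amount to shipping a small machine-readable manifest alongside each paper; every field name below is hypothetical, sketched as a Python dict rather than a real standard:

    import json

    # Hypothetical replication manifest a journal could require with each paper.
    # All field names are illustrative, not an existing schema.
    replication_manifest = {
        "paper_doi": "10.xxxx/example",  # placeholder DOI
        "data": ["data/measurements_full_precision.csv"],
        "code": ["code/analysis.py"],
        "environment": "environment.yml",  # pinned dependencies
        "expected_outputs": ["results/table1.csv"],
    }

    print(json.dumps(replication_manifest, indent=2))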
This seems to be written by someone who has no idea what they are talking about. Propagation of significant digits over algebraic operations is a solved problem. Carrying more digits than the necessary number of significant digits gains you nothing. This is literally taught in the first class of Physics 101.
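For what it's worth, here is a minimal Python sketch of the standard first-order error-propagation rule for a product, which is what the sig-fig conventions approximate (the function name is my own):

    import math

    # First-order (Gaussian) error propagation for z = x * y:
    # relative uncertainties add in quadrature.
    def propagate_product(x, dx, y, dy):
        z = x * y
        dz = abs(z) * math.sqrt((dx / x) ** 2 + (dy / y) ** 2)
        return z, dz

    z, dz = propagate_product(1.23, 0.01, 4.5, 0.1)
    print(f"z = {z:.2f} +/- {dz:.2f}")  # digits beyond the uncertainty carry no information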