I mean stacking hardware and software ECC: the storage medium has its own ECC layer, the file system adds its own ECC mechanisms, and on top of that the file format is stored with additional ECC data that gets verified after loading into ECC-capable memory. Are there any patterns that would cause a cascade effect, introducing errors so harmful to the integrity and validity of the data that it would have been better and less harmful not to layer on any more levels of ECC?
I don't know about too many layers, but you can definitely do things wrong. The difference between ECC (which corrects bit errors) and erasure coding (which fills in data that is known to be missing) is important, and I've seen people attempt to use one when they should have used the other.

I've also seen people nerd-sniping themselves on erasure coding while neglecting basic features.
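To make the distinction concrete, here's a minimal Python sketch (toy data and block layout, not any real storage stack): a single XOR parity block suffices for erasure recovery because the *position* of the loss is known, while a silent bit flip needs a code that can locate the error.

```python
# Erasure coding: we KNOW which block is missing, so one XOR parity
# block is enough to rebuild it (RAID-5 style).

def xor_blocks(blocks):
    out = bytearray(len(blocks[0]))
    for b in blocks:
        for i, byte in enumerate(b):
            out[i] ^= byte
    return bytes(out)

data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_blocks(data)

# Simulate losing block 1 -- the position of the loss is known, so
# XORing the survivors with the parity reconstructs it exactly.
surviving = [data[0], data[2], parity]
rebuilt = xor_blocks(surviving)
assert rebuilt == data[1]

# ECC is the harder problem: a bit silently flips and nothing tells you
# where. XOR parity can detect that *something* changed, but cannot
# locate or fix it -- that takes a code with per-position redundancy
# (Hamming, BCH, Reed-Solomon in error-correction mode, ...).
corrupted = bytes([data[1][0] ^ 0x01]) + data[1][1:]
assert xor_blocks([data[0], corrupted, data[2]]) != parity  # detected, not corrected
```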
It's fairly easy to analyze the effect of layering ECCs mathematically.

The end result will always be more robust to errors than it would be otherwise, but the code rate shrinks: each layer's rate multiplies in, so the same data takes a constant factor more space to store. A toy sketch of the arithmetic is below.
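For instance, with made-up per-layer rates (the layer names and numbers here are illustrative assumptions, not measurements of any real device or filesystem):

```python
# Toy illustration: layering codes multiplies their code rates,
# so total overhead is the product of per-layer overheads.
layers = {
    "device ECC":  0.93,   # e.g. BCH/LDPC inside an SSD (illustrative)
    "filesystem":  0.875,  # e.g. RAID-style parity (illustrative)
    "file format": 0.90,   # e.g. PAR2-style recovery data (illustrative)
}

combined_rate = 1.0
for name, rate in layers.items():
    combined_rate *= rate

print(f"combined code rate: {combined_rate:.3f}")          # ~0.732
print(f"storage overhead:   {1 / combined_rate:.2f}x raw") # ~1.37x
# Space cost compounds multiplicatively, but so does protection: each
# layer's residual error probability also multiplies down, so the stack
# is strictly more robust -- just at a lower overall code rate.
```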