[LightGBM] does not converge regardless of feature order.<p>From <a href="https://news.ycombinator.com/item?id=41873650">https://news.ycombinator.com/item?id=41873650</a>:<p>> <i>Do algorithmic outputs diverge or converge given variance in sequence order of</i> all orthogonal <i>axes? Does it matter which order the dimensions are stated in; is the output sensitive to feature order, but does it converge regardless?</i><p>Also, current LLMs suggest that statistical independence is not the same as orthogonality: uncorrelated features need not be independent, even though we typically assume orthogonality in high-dimensional problems. And many statistical models do not work with non-independent features.<p>Does this model work with non-independence or nonlinearity?<p>Does the order of the columns in the training-data CSV change the alpha of the model, or does the model output converge regardless of variance in the order of the training data?
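<p>A minimal sketch of one way to check this empirically with the LightGBM Python API (the synthetic dataset, parameters, and seeds below are my own illustrative assumptions, not anything from the thread): train the same booster twice with the feature columns permuted and diff the predictions.<p><pre><code>    import numpy as np
    import lightgbm as lgb
    from sklearn.datasets import make_regression

    # Synthetic regression data; only the column order will differ between runs.
    X, y = make_regression(n_samples=2000, n_features=20, noise=0.1, random_state=0)

    def fit_predict(features, target):
        params = {
            "objective": "regression",
            "deterministic": True,   # remove run-to-run nondeterminism so only
            "force_row_wise": True,  # column order can account for differences
            "num_threads": 1,
            "seed": 0,
            "verbose": -1,
        }
        booster = lgb.train(params, lgb.Dataset(features, label=target),
                            num_boost_round=100)
        return booster.predict(features)

    pred_original = fit_predict(X, y)

    # Refit on the same data with the feature columns shuffled.
    perm = np.random.default_rng(0).permutation(X.shape[1])
    pred_permuted = fit_predict(X[:, perm], y)

    # If training were invariant to column order, this difference would be ~0.
    print("max |delta prediction|:", np.abs(pred_original - pred_permuted).max())</code></pre><p>A nonzero difference here would indicate order sensitivity in tie-breaking during split selection; whether the two models nevertheless converge to comparable accuracy is a separate check (e.g., comparing held-out error rather than raw predictions).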