This does not seem very well informed, starting from the false premise in the question and running through the random concepts listed in the answers.

Calculus actually is important for Computer Science; in fact it's important for everything. It's where you learn how to handle the exponential function and the natural logarithm, how to do approximations and bounds, and how to handle infinite series, and those things then show up all over the place. Unlike most of the things listed, calculus is something you can expect to encounter almost regardless of which domain you end up interested in.

I mean, the guy asks what should replace calculus, and then the first answer includes "Asymptotics" and "basic limits, sequences and series", so: actually calculus. In general I cringe a little every time I hear that Computer Science people should focus on "discrete math", because without tools from analysis you can only solve the most trivial discrete problems; even estimating a harmonic sum, the bread and butter of algorithm analysis, is an exercise in calculus (first sketch at the end of this comment). And yes, calculus by itself is hardly ever directly applicable in CS, yet you still have to learn it; tough luck. What I think is not stressed enough is that applying math is hard, and you need to learn a lot of it before you have enough tools to tackle problems anywhere close to real-world complexity.

The top answer also lists random concepts. I am currently learning probability, for applications in machine learning. "Discrete spaces, Bayes theorem and expected values" you can learn in a day; Markov's, Chebyshev's and Chernoff's inequalities are mostly only useful for further theoretical work, as is the law of large numbers. What will really be useful depends a lot on the application. If you are a theoretical computer scientist, mastery of generating functions and transforms will be useful, and that is one of those instances where discrete problems are solved with tools from calculus/analysis (second sketch below). For machine learning you need to know everything about the normal distribution by heart, which means you have to know everything about the exponential function by heart, so again back to calculus (third sketch below). Notions from information theory are useful, but of course none of the ones he listed. The comment "This is a must for modern programmers." just sounds comical.
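To make the harmonic-sum remark concrete, here is the standard calculus move (my sketch, not anything from the thread): bound the discrete sum by an integral.

    % Harmonic numbers via the integral test (decreasing integrand):
    H_n = \sum_{k=1}^{n} \frac{1}{k}, \qquad
    \int_1^{n+1} \frac{dx}{x} \;\le\; H_n \;\le\; 1 + \int_1^{n} \frac{dx}{x}
    % hence
    \ln(n+1) \;\le\; H_n \;\le\; 1 + \ln n, \qquad H_n = \ln n + O(1)
    % so quicksort's expected comparison count, 2(n+1)H_n + O(n),
    % is \Theta(n \log n).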
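The generating-function point, with the textbook example (again my sketch): the Fibonacci recurrence is as discrete as a problem gets, yet the closed form falls out of power series and partial fractions, i.e. analysis.

    % Ordinary generating function of the Fibonacci numbers
    % (F_0 = 0, F_1 = 1, F_n = F_{n-1} + F_{n-2}):
    F(x) = \sum_{n \ge 0} F_n x^n = \frac{x}{1 - x - x^2}
    % Partial fractions over 1 - x - x^2 = (1 - \varphi x)(1 - \psi x),
    % with \varphi = (1+\sqrt{5})/2 and \psi = (1-\sqrt{5})/2:
    F(x) = \frac{1}{\sqrt{5}} \left( \frac{1}{1 - \varphi x} - \frac{1}{1 - \psi x} \right)
    % Expanding both geometric series gives Binet's closed form:
    F_n = \frac{\varphi^n - \psi^n}{\sqrt{5}}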
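And since Chernoff's inequality came up: it is itself nothing but Markov's inequality applied to the exponential function, followed by an optimization over the parameter, calculus through and through. A one-line sketch for the Gaussian case:

    % Markov's inequality on the nonnegative, monotone exponential (t > 0):
    P(X \ge a) = P(e^{tX} \ge e^{ta}) \le e^{-ta}\, E[e^{tX}]
    % For X \sim N(0, \sigma^2) we have E[e^{tX}] = e^{\sigma^2 t^2 / 2};
    % minimizing over t gives t = a/\sigma^2 and the Gaussian tail bound:
    P(X \ge a) \le \exp\!\left( -\frac{a^2}{2\sigma^2} \right)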