There is nothing "mathematical" in being a developer, unless your job is related to math, finance, crypto algorithms, etc...<p>I've known guys back in my CS classes who got straight A's in the hard CS theory and had C's in the required calc courses, where the grading curves were shared by students from all majors.<p>Let's admit it -- being a developer is like learning a new language -- what's so different from constructing logical sentences with C# than constructing logical statements with a foreign language? Yes there is a DIFFERENCE, but really, the basic building blocks are the same...<p>When I was 11, I learned c++ with a buddy (also 11) and after school we'd work on fun little games together. I know I would never cut it in a pure math phd-type track and he definitely wasn't able to hack it when he got to college and majored in math -- he dropped out and switched majors halfway.
YES, it makes sense. Maths, especially as it is taught in college, is mostly useless for programming, though a bit of classical logic and algebra can be useful. Whilst it is possible to be a self-taught programmer (and many good ones are), for many people it is more effective to take a software engineering course. A good course will teach you more than just programming, and some theory is useful for quickly understanding advances as they come into wider use.<p>As for getting a job, there are many avenues to take. The key is to show that you can solve, and have solved, real problems using your skills in analysis, design, and implementation.<p>Of course, some HN readers will ask, "On what basis have you reached the conclusion that you are a 'pretty good developer'?"
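On the "bit of classical logic" point, De Morgan's law is a good example of the kind that actually comes up when simplifying conditionals. A minimal sketch (variable names invented, assuming nothing beyond standard C#):

    using System;

    // De Morgan's law: !(a && b) is equivalent to !a || !b.
    // Both guards below reject in exactly the same cases.
    bool isValid = false, isAuthorized = true;

    if (!(isValid && isAuthorized))
        Console.WriteLine("rejected (form 1)");

    if (!isValid || !isAuthorized)
        Console.WriteLine("rejected (form 2)");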
I'm really good at differential equations. The only time I've used them for coding was when I wrote my senior thesis for my physics degree.<p>Being good at verbal reasoning, staying organized, and keeping track of complex things is far more important than math. Math skills for computing are way overrated IMHO.<p>I expect the notion that we need math to do computing comes from early computers being used mostly to design nuclear weapons. That made sense then, but not today.
Many non-programmers think that math is 90% of what we do. They don't realize that programming is mostly logic, not hard math.<p>99% of a business software developer's math usage is elementary-school algebra or simpler. If you're doing game development, high-frequency trading (HFT), or certain other specialized fields, there will be a lot more math, but that doesn't apply to the vast majority of programming out there.
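For a sense of scale, a hypothetical invoice calculation (line items and tax rate invented for illustration) is about as mathematical as most business code gets -- multiplication and addition, nothing more:

    using System;
    using System.Linq;

    // Subtotal plus tax: elementary-school arithmetic.
    var lineItems = new[] { (UnitPrice: 9.99m, Quantity: 3), (UnitPrice: 24.50m, Quantity: 1) };
    decimal taxRate = 0.08m; // invented rate, purely illustrative

    decimal subtotal = lineItems.Sum(i => i.UnitPrice * i.Quantity);
    decimal total = subtotal * (1 + taxRate);
    Console.WriteLine($"Subtotal: {subtotal}, Total: {total}");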
You don't need maths for software engineering, only for real computer science work that involves complex algorithms. I focus mostly on building distributed software, and I've never needed maths for that.