First of all, I hate this "Agile" nonsense. I've seen it kill companies. It's truly awful, because it gives legitimacy to the anti-intellectualism that has infected this industry.
It's that anti-intellectualism that, if you let it, will rot your mathematical and technical skills. Before you know it, you've spent five years reacting to Scrum tickets without writing any serious code, and your math has gone to the dogs as well. It's insidious, and dangerous, this culture of business-driven engineering mediocrity.

I hope it'll be the fakes and the brogrammers who get flushed out in the next crash. Who knows, though? I can't predict the future any better than anyone else.

To me, Python doesn't feel like a "scientific" language. Python's a great exploratory tool, and it's got some great libraries for feeling out a concept or exploring a type of model (e.g. off-the-shelf machine learning tools). That said, science values reproducibility and precision, which brings us around to functional programming and static typing... and suddenly we're at Haskell. (I'll put a tiny example of what I mean at the end of this comment. And of course, for a wide variety of purposes, Python is just fine, and may be a better choice because of its library ecosystem.) I do think that, as we use more machine learning, we're going to see high demand for people who can apply rigor to the sorts of engineering that are currently done very quickly (resulting in "magic" algorithms that seem to work but that no one understands). I also agree that "deep learning" and machine learning in general have real substance, even if 90% of what is being called "data science" is watered-down bullshit.

I still don't feel like I know what a "scientific programmer" is, or should be. And I'd love to see the death of business-driven engineering and "Agile" and all the mediocrity of user stories and backlog grooming meetings, but nothing has convinced me that it's imminent. Sadly, I think all of that may be around for a while.
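
To make the static-typing point a bit more concrete, here's a minimal Haskell sketch (the Meters/Seconds newtypes are just an illustration I'm making up, not anything from a real library): the compiler refuses to let you confuse one quantity with another, which is one small, cheap form of the precision I'm talking about.

    -- Newtype wrappers: zero runtime cost, but the compiler now
    -- rejects any call that confuses distance with time.
    newtype Meters  = Meters  Double deriving Show
    newtype Seconds = Seconds Double deriving Show

    speed :: Meters -> Seconds -> Double
    speed (Meters d) (Seconds t) = d / t

    main :: IO ()
    main = print (speed (Meters 100) (Seconds 9.58))
    -- speed (Seconds 9.58) (Meters 100)  -- won't compile: type error

Python will happily run the swapped-arguments version and hand you a number that looks plausible, and that kind of silent wrongness is exactly what matters when "science" is in the job title.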