I am surprised by the notion that "speed doesn't matter". I think it does, even if you have access to powerful workstations to develop on and servers to run it on.<p>I think this is one of the biggest downsides to Python for medium-scale projects: you necessarily need to think a lot about performance and infrastructure just to get your stuff up and running. Do it wrong, and the speed is infeasible from the start - and it doesn't scale and gets you into trouble later on.<p>For data analysis, for example, you can't really just start coding arbitrary Python. You need to know from the outset how you will eventually speed things up - using C, or libraries implemented in C. And I maintain that parallelizing code in Python is neither straightforward nor performant.
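The parallelization pitfall is easy to reproduce. Here is a minimal sketch (the 4-worker pool and the toy workload are my own invented example, using only the standard-library `multiprocessing` module): for work this cheap, pickling arguments and starting workers - which on Windows always means "spawn", since fork is unavailable - can easily cost more than the computation itself.

```python
import time
from multiprocessing import Pool

def tiny_task(x):
    # Cheap per-item work: interprocess overhead (pickling,
    # worker startup) easily dominates work this small.
    return x * x

def run_serial(data):
    return [tiny_task(x) for x in data]

def run_parallel(data):
    # On Windows the only start method is "spawn": each worker
    # re-imports the module and pays full startup cost, unlike
    # fork on Linux.
    with Pool(4) as pool:
        return pool.map(tiny_task, data)

if __name__ == "__main__":
    data = list(range(10_000))
    t0 = time.perf_counter()
    serial = run_serial(data)
    t1 = time.perf_counter()
    parallel = run_parallel(data)
    t2 = time.perf_counter()
    assert serial == parallel
    # For tiny tasks the parallel version is frequently slower.
    print(f"serial:   {t1 - t0:.4f}s")
    print(f"parallel: {t2 - t1:.4f}s")
```

Whether the pool wins depends entirely on how much work each task does relative to the serialization and startup overhead - which is exactly why code that tested fine can fall over in deployment.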
That is, performance optimization is coupled to development and deployment. I can't just "use" the base language to develop a prototype and worry about performance later. If I don't know in advance what I will eventually do, and just code pure Python, the program usually turns out unworkably slow once it meets real data.
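To illustrate the kind of rework this forces: the usual fix is to push the inner loop down into a C-backed library. A minimal sketch (assuming NumPy is available; the mean-of-squares workload is just an invented example) of the same computation written both ways:

```python
import time
import numpy as np

def py_mean_square(xs):
    # Pure-Python loop: interpreter overhead on every element.
    total = 0.0
    for x in xs:
        total += x * x
    return total / len(xs)

def np_mean_square(arr):
    # The same computation, pushed into NumPy's compiled C loops.
    return float(np.mean(arr * arr))

if __name__ == "__main__":
    n = 5_000_000
    xs = list(range(n))
    arr = np.arange(n, dtype=np.float64)
    t0 = time.perf_counter()
    py_mean_square(xs)
    t1 = time.perf_counter()
    np_mean_square(arr)
    t2 = time.perf_counter()
    print(f"pure Python: {t1 - t0:.3f}s, NumPy: {t2 - t1:.3f}s")
```

The catch is the point made above: the fast version requires deciding up front to hold the data as a NumPy array rather than as ordinary Python objects, so the optimization shapes the design rather than being bolted on afterwards.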
And even if you have "huge servers", you still need your code to actually scale. In my experience, efficient small-scale Python code and efficient large-scale Python code are not the same thing!<p>E.g. I had to deploy something on Windows, and without fork available and with the GIL in play, what ran well during testing became unusably slow. Just the choice of where to use multiprocessing, for example, ended up making it slower than running pure Python without any parallel code! And that speed meant that - contrary to what the article said - even on a large server, tasks simply would not finish. Meanwhile, the ingestion pipelines would clog up when data sizes became significant. Furthermore, one package I used to represent data structures (networkx) simply could not scale at all and crashed machines with hundreds of GBs of RAM during certain operations. And that happened without warning at certain sizes - unforeseen. I had to rewrite huge parts of the program to make it work, including the whole database back end.<p>Of course all that is down to me not being a Python expert, just a normal scientist. Next time I will be smarter, but only because I will have to plan and test performance during conceptual coding and know which tools will eventually scale.<p>And that is not the "promise" Python seemingly makes to us applied programmers.<p>I am itching to move to Julia as soon as I can. Not only is the MATLAB-style syntax arguably superior for numerical / data science work, you can also get things up and running at reasonable speed and then use the same tooling and structure to make it scale.