I see this so often with academics. They’re not developers and often have only basic coding ability, so they make all kinds of basic mistakes, like loading a big batch of images, converting them all to float (IR images, for example), and then wondering where their memory went. I once took some spaghetti code that needed a couple of days to process thousands of IR images in batches and rewrote it to run in under a minute, simply because the original didn’t manage memory at all.

Even something as simple as reading CSV files efficiently can have a massive effect.

Lots of universities switched from FORTRAN to Matlab in the last decade, but many of the researchers who learned FORTRAN write Matlab as if it were FORTRAN, with messy nested for loops and no knowledge of vectorisation.
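To put rough numbers on the float mistake, here’s a minimal NumPy sketch; the frame count and the 640x512, 16-bit sensor are made-up figures for illustration:

    import numpy as np

    # Hypothetical batch: 5000 frames from a 640x512, 16-bit IR sensor
    n_frames, h, w = 5000, 512, 640
    bytes_u16 = n_frames * h * w * np.dtype(np.uint16).itemsize
    bytes_f64 = n_frames * h * w * np.dtype(np.float64).itemsize

    print(f"as uint16:  {bytes_u16 / 2**30:.1f} GiB")  # ~3.1 GiB of raw counts
    print(f"as float64: {bytes_f64 / 2**30:.1f} GiB")  # ~12.2 GiB after a casual .astype(float)

Keeping the frames as integers and casting one frame at a time inside the processing loop keeps the working set small.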
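On the CSV point, the usual culprit is parsing every field at interpreter level instead of letting a native-code reader do it. A sketch, assuming a hypothetical all-numeric readings.csv with one header row:

    import pandas as pd

    # Naive: one Python-level split and float() call per field
    rows = []
    with open("readings.csv") as f:
        next(f)  # skip the header row
        for line in f:
            rows.append([float(x) for x in line.split(",")])

    # Library reader: parsing happens in C, and the dtype is fixed up front
    df = pd.read_csv("readings.csv", dtype="float32")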
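And the loop-versus-vectorised difference, shown in NumPy since the idea is the same as in Matlab; the image size and the normalisation are arbitrary:

    import numpy as np

    img = np.random.rand(512, 640)
    m, s = img.mean(), img.std()

    # FORTRAN-style nested loops: one interpreted operation per pixel
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (img[i, j] - m) / s

    # Vectorised: the same normalisation in a couple of native-code passes
    out_fast = (img - m) / s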
It isn't just badly written math loops that can cripple performance. Years ago, some users were complaining that it was taking forever to load their data into their analysis program. It turned out they were reading thousands of structs, *one element at a time*, with the Unix read(2) *system call*! I taught them about buffering, and the read time went down by a factor of ten or more; I forget the exact numbers.
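For anyone who hasn't hit this: the fix is to do one big read and unpack in memory. A minimal Python sketch with an invented 20-byte record layout (os.read is a thin wrapper around the read(2) system call, and the fast version assumes the file holds whole records):

    import os
    import struct

    REC = struct.Struct("=ddi")  # hypothetical record: two doubles and an int

    # Roughly what the users were doing (theirs was worse: one read per *field*)
    def load_slow(path):
        fd = os.open(path, os.O_RDONLY)
        recs = []
        try:
            while True:
                buf = os.read(fd, REC.size)  # one system call per record
                if len(buf) < REC.size:
                    break
                recs.append(REC.unpack(buf))
        finally:
            os.close(fd)
        return recs

    # Buffered: read the whole file in one go, then unpack in memory
    def load_fast(path):
        with open(path, "rb") as f:
            data = f.read()
        return list(REC.iter_unpack(data))

For files too big to slurp whole, reading fixed-size chunks gives most of the same win without holding everything in memory.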