I feel that many times my source of fatigue can be traced back to my tools being laggy pieces of shit for no good reason.

I don't understand what prevents actors like Microsoft from doing a clean, lightweight, native rewrite of tools like Visual Studio for people who are looking ahead to the .NET 5 horizon and don't care about being able to debug VB.NET apps written in 2009. There is no reason there has to be any delay at all in the UI. Graceful degradation of IntelliSense is acceptable depending on project complexity, but there should never be any perceptible hitching or delay when moving code windows around, scrolling, typing, switching tabs, or minimizing/maximizing. If my PC can display the complex 3D scenes of Overwatch at 2560x1440 @ 180 FPS with <5 ms input latency, I cannot comprehend any rational argument for my IDE being unable to achieve even 10% of that performance objective.

I understand that frameworks like Electron make it virtually impossible to achieve my stated objectives, so perhaps we need to dust off some old APIs and re-learn old tricks. Think about the aggregate developer hours that could be saved with one heroic implementation effort. Imagine if you could load a VS solution in less than a second and immediately start clicking through the UI in any direction without any fear that it is about to sandbag your ass with frustratingly arbitrary UI delay soup. That is the kind of UX that inspires confidence and encourages a developer to charge forth, instead of compelling them to fuck off to HN for the 20th time of the day.
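For what it's worth, "perceptible hitching" is easy to make concrete, at least in an Electron/browser context. A minimal TypeScript sketch (the two-frame-budget threshold is my own assumption, not any standard):

    // Frame-hitch detector: log any frame that blows the 60 Hz budget.
    const BUDGET_MS = 16.7; // one frame at 60 Hz
    let last = performance.now();

    function tick(now: DOMHighResTimeStamp): void {
      const delta = now - last;
      if (delta > BUDGET_MS * 2) {
        // More than two frame budgets elapsed: the user likely saw a hitch.
        console.warn(`Hitch: frame took ${delta.toFixed(1)} ms`);
      }
      last = now;
      requestAnimationFrame(tick);
    }

    requestAnimationFrame(tick);

Run something like that while dragging windows or scrolling and you can put an actual number on how badly the UI is sandbagging you.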
Sublime Text is one of those modern tools that I would call snappy. I pray that it stays that way. I tried Atom once, but every key press had a tiny lag; it drove me insane.
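That per-keystroke lag is measurable, too. A rough keypress-to-paint probe you could paste into an Electron app's dev tools console (a sketch only; the nested-requestAnimationFrame trick merely approximates when the frame after the keystroke gets painted):

    // Rough keypress-to-paint latency probe.
    document.addEventListener('keydown', (e: KeyboardEvent) => {
      const pressed = performance.now();
      // The first rAF fires before the next paint; the nested one fires
      // after that frame has (roughly) been presented.
      requestAnimationFrame(() => {
        requestAnimationFrame(() => {
          const latency = performance.now() - pressed;
          if (latency > 16) {
            console.log(`"${e.key}": ~${latency.toFixed(1)} ms to reach the screen`);
          }
        });
      });
    });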
Notepad++ is still one of the best and most reliable pieces of software I've ever used, if not the best. Better than Sublime, IMO. VLC comes in at a close second. These two are probably the last "snappy" pieces of software I use, besides maybe terminals.

I'm sure there are others, but these are the two that come to mind.
That's quite timely. I'm writing a soft real-time piece of code for use in the browser. It's a nightmare: there are so many layers underneath that it is very hard to get any kind of idea of where latency and throughput issues are coming from.
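One place I'd start digging, assuming a Chromium-based browser that supports the Long Tasks API: have a PerformanceObserver flag every block of main-thread time over 50 ms (a sketch, not a full profiling setup):

    // Report every "long task" (>50 ms of blocked main thread).
    const observer = new PerformanceObserver((list) => {
      for (const entry of list.getEntries()) {
        console.warn(
          `Long task: ${entry.duration.toFixed(1)} ms, starting at ${entry.startTime.toFixed(0)} ms`,
        );
      }
    });
    observer.observe({ type: 'longtask', buffered: true });

It won't tell you which layer is at fault, but it at least timestamps the stalls so you can line them up with what your code was doing at the time.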
Fast software is not the same as a small executable. I don't care how many instructions an executable contains; I care about how many need to be executed to do what I want. There are plenty of cases where a larger executable means faster execution (loop unrolling is an obvious example).

There are plenty of reasons why a lot of modern software is slow, but executable size isn't even in the top 10.
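To make the unrolling point concrete, here's a toy TypeScript sketch: the unrolled version is literally more code, but it does a quarter of the loop-bound checks and gives the engine four independent accumulators to work with. (Whether it actually wins on a given JIT is something you'd have to benchmark; compilers often unroll for you.)

    // Manual 4x loop unrolling: bigger code, fewer checks per element.
    function sum(xs: Float64Array): number {
      let s0 = 0, s1 = 0, s2 = 0, s3 = 0;
      const n = xs.length - (xs.length % 4);
      for (let i = 0; i < n; i += 4) {
        s0 += xs[i];
        s1 += xs[i + 1];
        s2 += xs[i + 2];
        s3 += xs[i + 3];
      }
      let s = s0 + s1 + s2 + s3;
      for (let i = n; i < xs.length; i++) s += xs[i]; // leftover tail
      return s;
    }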
Depends: how many Electron instances are you running?

https://mspoweruser.com/xbox-pc-app-gains-huge-performance-gains-after-abandoning-electron-framework/
It's cheaper to buy faster computers nowadays. Code optimization for speed and size is a lost art; the programming world is filled with layers upon layers of libraries and frameworks, and nobody knows what happens under the covers.
cf. Dan Luu's page on computer latency:

https://danluu.com/input-lag/