What is the holy grail of [classical] computing?

For anyone else who might be having a slow day at work: where do you think the evolution of computing practices is headed?

I'm curious what the prevailing ideas of "ideal computing architectures" are, in terms of computing performance, programmability and utility.

Perhaps a platform that frees us from having to tailor algorithms to match the hardware topology?

How would reconfigurable hardware change the nature of the game? Does it turn into an optimisation problem that we can throw ML/DL algorithms at?

Is it useful to work backwards from some ideal computing platform? Are we doing this already?

For instance, if optical switches are the fastest possible computing devices, would it make sense to focus most R&D effort there? Or would this come at the cost of missing other low-hanging fruit? And if so, could there be an optimal risk/reward strategy for advancing computing?

Are there other existing non-von Neumann architectures that aren't getting the attention they deserve?

Are these too many questions for one AskHN post? Do I need to stop and get back to work?
If you're including software, the "holy grail" is without question artificial general intelligence: https://en.wikipedia.org/wiki/Artificial_general_intelligence

However, it sounds more like you're asking what hardware innovations will enable fundamental performance improvements, or what milestones will mark a large change in how things are engineered.