科技回声 (Tech Echo)

The world could run on older hardware if software optimization was a priority

847 points · by turrini · 9 days ago

90 comments

caseyy · 9 days ago
There is an argument to be made that the market buys bug-filled, inefficient software about as well as it buys pristine software. And one of them is the cheapest software you could make.

It's similar to the "Market for Lemons" story. In short, the market sells as if all goods were high-quality but underhandedly reduces the quality to reduce marginal costs. The buyer cannot differentiate between high and low-quality goods before buying, so the demand for high and low-quality goods is artificially even. The cause is asymmetric information.

This is already true and will become increasingly more true for AI. The user cannot differentiate between sophisticated machine learning applications and a washing machine spin cycle calling itself AI. The AI label itself commands a price premium. The user overpays significantly for a washing machine[0].

It's fundamentally the same thing when a buyer overpays for crap software, thinking it's designed and written by technologists and experts. But IC1-3s write 99% of software, and the 1 QA guy in 99% of tech companies is the sole measure to improve quality beyond "meets acceptance criteria". Occasionally, a flock of interns will perform an "LGTM" incantation in hopes of improving the software, but even that is rarely done.

[0] https://www.lg.com/uk/lg-experience/inspiration/lg-ai-wash-efficient-spins/
titzer · 9 days ago
I like to point out that since ~1980, computing power has increased about 1000X.

If dynamic array bounds checking cost 5% (narrator: it is far less than that), and we turned it on everywhere, we could have computers that are just a mere 950X faster.

If you went back in time to 1980 and offered the following choice:

I'll give you a computer that runs 950X faster and doesn't have a huge class of memory safety vulnerabilities, and you can debug your programs orders of magnitude more easily; or you can have a computer that runs 1000X faster, software will be just as buggy or worse, and debugging will be even more of a nightmare.

People would have their minds blown at 950X. You wouldn't even have to offer 1000X. But guess what we chose...

Personally I think the 1000Xers kinda ruined things for the rest of us.
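The arithmetic in that comment can be sanity-checked in a few lines. All the figures are the commenter's hypotheticals (and the comment itself notes the real bounds-checking cost is far lower than 5%):

```python
# Hypothetical figures from the comment above: ~1000x compute growth since
# 1980, and an assumed 5% runtime cost for universal array bounds checking.
speedup_since_1980 = 1000
bounds_check_overhead = 0.05  # assumed upper bound; real cost is far lower

effective_speedup = speedup_since_1980 * (1 - bounds_check_overhead)
print(round(effective_speedup))  # 950
```

The point being made: the difference between the two offers is 5% of throughput, traded against an entire class of memory-safety bugs.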
cletus · 9 days ago
So I've worked for Google (and Facebook) and it really drives the point home of just how cheap hardware is and how not worth it optimizing code is most of the time.

More than a decade ago Google had to start managing their resource usage in data centers. Every project has a budget. CPU cores, hard disk space, flash storage, hard disk spindles, memory, etc. And these are generally convertible to each other so you can see the relative cost.

Fun fact: even though at the time flash storage was ~20x the cost of hard disk storage, it was often cheaper net because of the spindle bottleneck.

Anyway, all of these things can be turned into software engineer hours, often called "mili-SWEs", meaning a thousandth of the effort of 1 SWE for 1 year. So projects could save on hardware and hire more people, or hire fewer people but get more hardware, within their current budgets.

I don't remember the exact number of CPU cores that amounted to a single SWE but IIRC it was in the *thousands*. So if you spend 1 SWE year working on optimization across your project and you're not saving 5000 CPU cores, it's a net loss.

Some projects were incredibly large and used much more than that, so optimization made sense. But so often it didn't, particularly when whatever code you wrote would probably get replaced at some point anyway.

The other side of this is that there is (IMHO) a general usability problem with the Web in that it simply shouldn't take the resources it does. If you know people who had to or still do data entry for their jobs, you'll know that the mouse is pretty inefficient. The old terminals from 30-40+ years ago that were text-based had some incredibly efficient interfaces at a tiny fraction of the resource usage.

I had expected that at some point the Web would be "solved" in the sense that there'd be a generally expected technology stack and we'd move on to other problems, but it simply hasn't happened. There's still a "framework of the week" and we're still doing dumb things like reimplementing scroll bars in user code that don't work right with the mouse wheel.

I don't know how to solve that problem or even if it will ever be "solved".
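The break-even logic described above can be sketched in a few lines. The exchange rate below is a made-up placeholder, not Google's actual figure (the comment only recalls it was "in the thousands"):

```python
# Illustrative break-even: an optimization pays off only if the CPU cores
# it saves exceed the engineer time spent, expressed in the same currency.
# CORES_PER_SWE_YEAR is a hypothetical conversion rate for the example.
CORES_PER_SWE_YEAR = 5000

def worth_optimizing(swe_years_spent: float, cores_saved: float) -> bool:
    """True if the cores saved outweigh the engineering cost."""
    return cores_saved > swe_years_spent * CORES_PER_SWE_YEAR

print(worth_optimizing(1.0, 3000))   # False: a full SWE-year saving only 3000 cores
print(worth_optimizing(0.25, 2000))  # True: a quarter-year saving 2000 cores
```

Under this accounting, only very large fleets clear the bar, which is exactly the comment's point about when optimization made sense.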
SilverSlash · 9 days ago
The title made me think Carmack was criticizing poorly optimized software and advocating for improving performance on old hardware.

When in fact, the tweet is absolutely not about either of the two. He's talking about a thought experiment where hardware stopped advancing and concludes with "Innovative new products would get much rarer without super cheap and scalable compute, of course".
agentultra · 9 days ago
I heartily agree. It would be nice if we could extend the lifetime of hardware 5, 10 years past its "planned obsolescence." This would divert a lot of e-waste, leave a lot of rare earth minerals in the ground, and might even significantly lower GHG emissions.

The market forces for producing software, however... are not paying for such externalities. It's much cheaper to ship it sooner, test, and iterate than it is to plan and design for performance. Some organizations in the games industry have figured out a formula for having good performance and moving units. It's not spread evenly though.

In enterprise and consumer software there's not a lot of motivation to consider performance criteria in requirements: we tend to design for what users will *tolerate* and give ourselves as much wiggle room as possible... because these systems tend to be complex and we want to ship changes/features continually. Every change is a liability that can affect performance and user satisfaction. So we make sure we have enough room in our budget for an error rate.

Much different compared to designing and developing software behind closed doors until it's "ready."
bob1029 · 9 days ago
We've been able to run order matching engines for entire exchanges on a single thread for over a decade by this point.

I think this specific class of computational power - strictly serialized transaction processing - has *not* grown at the same rate as other metrics would suggest. Adding 31 additional cores doesn't make the order matching engine go any faster (it could only go slower).

If your product is handling fewer than several million transactions per second and you are finding yourself reaching for a *cluster* of machines, you need to back up like 15 steps and start over.
AndrewDucker · 9 days ago
Well, yes. It's an economic problem (which is to say, it's a resource allocation problem). Do you have someone spend extra time optimising your software, or do you have them produce more functionality? If the latter generates more cash then that's what you'll get them to do. If the former becomes important to your cashflow then you'll get them to do that.
fdr · 9 days ago
One of the things I think about sometimes, a specific example rather than a rebuttal to Carmack.

The Electron application is somewhere between tolerated and reviled by consumers, often on grounds of performance, but it's probably the single innovation that made using my Linux laptop in the workplace tractable. And it is genuinely useful to, for example, drop into an MS Teams meeting without installing.

So, everyone laments that nothing is as tightly coded as Winamp anymore, without remembering the first three characters.
inetknght · 9 days ago
I was working as a janitor, moonlighting as an IT director, in 2010. Back then I told the business that laptops from the past five years (roughly since Nehalem) had plenty of horsepower to run spreadsheets (which is basically all they do) with two cores, 16 GB of RAM, and a 500 GB SATA SSD. A couple of users in marketing did need something a little (not much) beefier. Saved a bunch of money by not buying the latest-and-greatest laptops.

I don't work there any more. I'm convinced it's still true today: those computers should *still* be great for spreadsheets. Their workflow hasn't seriously changed. It's the software that has. If they've continued with updates (can it even "run" MS Windows 10 or 11 today? No idea, I've since moved on to Linux) then there's a solid chance that the amount of bloat, and especially the move to online-only spreadsheets, would tank their productivity.

Further, the internet at that place was terrible. The only offerings were ~16 Mbit asymmetric DSL (for $300/mo just because it's a "business", when I could get the same speed for $80/mo at home), or Comcast cable at 120 Mbit for $500/mo. 120 Mbit is barely enough to get by with an online-only spreadsheet, and 16 Mbit definitely not. But worse: if the internet goes down, then the business ceases to function.

This is the real theft that another commenter [0] mentioned, and I wholeheartedly agree. There's *no reason whatsoever* that a laptop running spreadsheets in an office environment should require internet to edit and update spreadsheets, or crazy amounts of compute/storage, or even huge amounts of bandwidth.

Computers today have *zero* excuse for terrible performance except to offload costs onto customers, private persons and businesses alike.

[0]: https://news.ycombinator.com/item?id=43971960
alkonaut · 9 days ago
"The world" runs on _features_, not elegant, fast, or bug-free software. To the end user, there is no difference between the lack of a feature and a bug. Nor is there any meaningful difference between software taking 5 minutes to complete something because of poor performance, and the feature not being there so the user has to spend 5 minutes completing the same task manually. It's "slow".

If you keep maximizing value for the end user, then you invariably create slow and buggy software. But also, if you *ask* the user whether they would want faster and less buggy software in exchange for fewer features, they - surprise - say no. And even more importantly: if you ask the *buyer* of software, which in the business world is rarely the end user, then they want features even more, and performance and elegance even less. Given the same feature set, a user/buyer would opt for the fastest/least buggy/most elegant software. But if it lacks any features - it loses. The reason to keep software fast and elegant is because it's the most likely path to being able to _keep_ adding features to it, so as not to be the less feature-rich offering. People will describe the fast and elegant solution with great reviews, praising how good it feels to use. Which might lead people to think that it's an important aspect. But in the end - they wouldn't buy it at all if it didn't do what they wanted. They'd go for the slow, frustrating, buggy mess if it has the critical feature they need.
nottorp · 9 days ago
Unfortunately, bloated software passes the costs to the customer, and it's hard to evaluate the loss.

Except your browser taking 180% of available RAM, maybe.

By the way, the world could also have some bug-free software, if anyone could afford to pay for it.
ManlyBread · 9 days ago
I have been thinking about this a lot ever since I played a game called "Balatro". In this game nothing extraordinary happens in terms of computing - some computations get done, some images are shuffled around on the screen, the effects are sparse. The hardware requirements aren't much by modern standards, but still, this game could be ported 1:1 to a machine with a Pentium II and a 3dfx graphics card. And yet it demands so much more - not a lot by today's standards, but still. I am tempted to try to run it on a 2010 netbook to see if it even boots up.
freddie_mercury · 9 days ago
The world DOES run on older hardware.

How new do you think the CPU in your bank ATM or your car's ECU is?
armchairhacker · 9 days ago
Is there, or could we make, an iPhone-like that runs 100x slower than conventional phones but uses much less energy, so it powers itself on solar? It would be good for the environment and useful in survival situations.

Or could we make a phone that runs 100x slower but is much cheaper? If it also runs on solar it would be useful in third-world countries.

Processors are more than fast enough for most tasks nowadays; more speed is still useful, but I think improving price and power consumption is more important. Also cheaper E-ink displays, which are much better for your eyes, more visible outside, and use less power than LEDs.
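A back-of-envelope power budget suggests the solar idea is at least not absurd. Every number below is an illustrative assumption for the sketch, not a datasheet value:

```python
# Back-of-envelope feasibility of the solar-phone idea above.
# All figures are rough assumptions chosen for illustration.
panel_area_cm2 = 50        # roughly the back of a phone
sun_w_per_cm2 = 0.01       # ~100 mW/cm^2 of peak sunlight
panel_efficiency = 0.20    # assumed small-panel efficiency

harvest_w = panel_area_cm2 * sun_w_per_cm2 * panel_efficiency
print(harvest_w)           # ~0.1 W available in full sun

phone_active_w = 2.0                 # assumed draw of a conventional phone
slow_phone_w = phone_active_w / 100  # the "100x slower" hypothetical
print(harvest_w >= slow_phone_w)     # True under these assumptions
```

Of course the hard parts (display, radio, clouds, night) are exactly what this sketch leaves out; it only shows the compute budget is not the blocker.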
socalgal2 · 8 days ago
Sorry, I don't want to go back to a time where I could only edit ASCII in a single font.

Do I like bloat? No. Do I like more software rather than less? Yes! Unity and Unreal are less efficient than custom engines, but there are 100x more titles because of that tradeoff of CPU efficiency vs. efficiency of creation.

The same is true for web-based apps (both online and off). Software ships 10x faster as a web page than as a native app for windows/mac/linux/android/ios. For most, that's all I need. Even for native-like apps, I use photopea.com over photoshop/gimp/krita/affinity etc. because it's available everywhere, no matter which machine I use or whose machine it is. Is it less efficient running in JS in the browser? Probably. Do I care? No.

VSCode, now the most popular editor in the world (IIRC), is web-tech. This has so many benefits. For one, it's been integrated into 100s of websites, so this editor I use is available in more places. It's using tech more people know, so more extensions that do more things. Also, arguably because of JS's speed issues, it encouraged the creation of the Language Server Protocol. Before this, every editor rolled their own language support. The LSP is arguably way more bloat than doing it directly in the editor. I don't care. It's a great idea, way more flexible. Any language can write one LSP and then all editors get support for that language.
xg15 · 9 days ago
Meanwhile on every programmer's 101 forum: "Space is cheap! Premature optimization is the root of all evil! Dev time > runtime!"
joaohaas · 9 days ago
HN: Yeah! We should go back to writing optimized code that fully uses the hardware capabilities!

Also HN: Check out this new AI tool that consumes 1000x more energy to do the exact same thing we could already do, but worse and with no reproducibility
WillAdams · 8 days ago
Wirth's Law:

> software is getting slower more rapidly than hardware is becoming faster.

https://en.wikipedia.org/wiki/Wirth%27s_law
vermilingua · 9 days ago
Related: https://duskos.org/
shadowgovt · 9 days ago
Really no notes on this. Carmack hit both sides of the coin:

- the way we do industry-scale computing right now tends to leave a lot of opportunity on the table because we decouple, interpret, and de-integrate where things would be faster and take less space if we coupled, compiled, and made monoliths

- we do things that way because it's easier to innovate, tweak, test, and pivot on decoupled systems that isolate the impact of change and give us ample signal about their internal state to debug and understand them
1970-01-01 · 9 days ago
The idea of a hand-me-down computer made of brass and mahogany still sounds ridiculous because it is, but we're nearly there in terms of Moore's law. We have true 2nm within reach, and then the 1nm process is basically the end of the journey. I expect 'audiophile grade' PCs in the 2030s, and then PCs become works of art, furniture, investments, etc. because they have nowhere to go.

https://en.wikipedia.org/wiki/2_nm_process

https://en.wikipedia.org/wiki/International_Roadmap_for_Devices_and_Systems
shireboy · 8 days ago
.NET has made great strides on this front in recent years. Newer versions optimize CPU and RAM usage of lots of fundamentals, and introduced new constructs to reduce allocations and CPU for new code. One might argue they were only able to because they were so bad, but it's worth looking into if you haven't in a while.
QuadrupleA · 9 days ago
This always saddens me. We could have things *instant* and simple, and compute & storage would be 100x more abundant in practical terms than it is today.

It's not even a trade-off a lot of the time: simpler architectures perform better but are also vastly easier and cheaper to maintain.

We just lack expertise, I think, and pass on cargo-cult "best practices" much of the time.
eth0up · 9 days ago
Yeah, having browsers the size and complexity of OSs is just one of many symptoms. I intimate at this concept in a grumbling, helpless manner somewhat chronically.

There's a lot today that wasn't possible yesterday, but it also sucks in ways that weren't possible then.

I foresee hostility for saying the following, but it really seems most people are unwilling to admit that most software (and even hardware) isn't necessarily made for the user or its express purpose anymore. To be perhaps a bit silly, I get the impression of many services as bait for telemetry and background fun.

While not an overly earnest example, looking at Android's Settings/System/Developer Options is pretty quick evidence that the user is involved but clearly not the main component in any respect. Even an objective look at Linux finds manifold layers of hacks and compensation for a world of hostile hardware and soft conflict. It often works exceedingly well, though as impractical as it may be to fantasize, imagine how badass it would be if everything was clean, open and honest. There's immense power, with lots of infirmities.

I've said that today is the golden age of the LLM in all its puerility. It'll get way better, yeah, but it'll get way worse too, in the ways that matter.[1]

Edit: 1. Assuming open source doesn't persevere
fennecfoxy · 3 days ago
Oh look, Carmack says something obvious again and people listen to it like gospel.

Don't get me wrong, I like the guy. But what has he done since getting booted from the id/Doom team? Espousing VR, but with no real breakthroughs in the space with his name on them?

I'm thankful for Doom 2016. Without booting him it would have never been possible. Although looking at the recent vikings-wannabe expansion with a ridiculous, dumb "story", it seems they've lost the plot again: executives get their grubby fingers into franchises. People just wanna rip&tear, they don't need a story for that - if anything, focus on combat/environment/enemy-count improvements and leave it at that.
protoster · 9 days ago
My phone isn't getting slower, but rather the OS running on it becomes less efficient with every update. Shameful.
yonisto · 9 days ago
I'm not much into retro computing. But it amazes me what people are pulling out of dated hardware.

Doom on the Amiga, for example (many consider Doom the main factor in the Amiga's demise). Thirty years of optimization and it finally arrived.
quantadev · 8 days ago
Developers over 50ish (like me) grew up at a time when CPU performance and memory constraints affected every application. So you had to always be smart about doing things efficiently with both CPU and memory.

Younger developers have machines that are so fast they can be lazy with all algorithms and do everything 'brute force'. Like searching through an array every time when a hashmap would've been 10x faster. Or using all kinds of "list.find().filter().any().every()" chaining nonsense, when it's often smarter to do ONE loop, and inside that loop do a bunch of different things.

So younger devs only optimize once they NOTICE the code running slow. That means they're ALWAYS right on the edge of overloading the CPU, just through bad coding. In other words, their inefficiencies will always expand to fit available memory and available clock cycles.
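Both of the comment's examples can be made concrete in a small sketch (the data and names are made up for illustration):

```python
# Linear scan vs. hashmap: a list lookup costs one comparison per element,
# while a set/dict lookup hashes straight to the entry.
users = [f"user{i}" for i in range(10_000)]
user_set = set(users)

def find_linear(name: str) -> int:
    """Return how many comparisons a naive list scan performs."""
    steps = 0
    for u in users:
        steps += 1
        if u == name:
            break
    return steps

print(find_linear("user9999"))  # 10000 comparisons for one unlucky lookup
print("user9999" in user_set)   # True, via a single hash probe on average

# And the "one loop instead of chained passes" point: the chained version
# walks the data twice, the single loop filters and transforms in one pass.
nums = list(range(1000))
chained = sum(n * n for n in (m for m in nums if m % 2 == 0))

total = 0
for n in nums:
    if n % 2 == 0:
        total += n * n
print(total == chained)  # True: same result, one traversal
```

Neither version is wrong; the point is that the costs differ by orders of magnitude and only show up once the data grows.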
seydor · 9 days ago
I wonder if anyone has calculated the additional planet-heating generated by, e.g., crappy JS apps or useless animations.
yencabulator · 9 days ago
Z+6 months: Start porting everything to Collapse OS

https://collapseos.org/
gmerc · 9 days ago
Perfect parallel to the madness that is AI. With even modest sustainability incentives, the industry wouldn't have pulverized a trillion dollars on training models nobody uses to dominate the weekly attention fight and fundraising game.

Evidence: DeepSeek
diego_sandoval · 8 days ago
I generally believe that markets are somewhat efficient.

But somehow, we've ended up with the current state of Windows as the OS that most people use to do their job.

Something went terribly wrong. Maybe the market is just too dumb, maybe it's all the market distortions that have to do with IP, maybe it's the monopolistic practices of Microsoft. I don't know, but in my head, no sane civilization would think that Windows 10/11 is a good OS that everyone should use to optimize our economy.

I'm not talking only about performance, but about the general crappiness of the experience of using it.
branko_d · 8 days ago
Often, this is presented as a tradeoff between the cost of development and the cost of hardware. However, there is a third leg of that stool: the cost of the end-user experience.

When you have a system which is sluggish to use because you skimped on development, it is often the case that you cannot make it much faster no matter how expensive the hardware you throw at it. Either there is a single-threaded critical path, so you hit the limit of what one CPU can do (and adding more does not help), or you hit the laws of physics, such as with network latency, which is ultimately bound by the speed of light.

And even when the situation could be improved by throwing more hardware at it, this is often done only to the extent of making the user experience "acceptable", but not "great".

In either case, the user experience suffers and each individual user is less productive. And since there are (usually) orders of magnitude more users than developers, the total damage done can be much greater than the increased cost of performance-focused development. But the cost of development is "concentrated" while the cost of user experience is "distributed", so it's more difficult to measure or incentivize.

The cost of poor user experience is a real cost, is larger than most people seem to think, and is non-linear. This was observed in experiments done by IBM, Google, Amazon and others decades ago. For example, take a look at:

The Economic Value of Rapid Response Time https://jlelliotton.blogspot.com/p/the-economic-value-of-rapid-response.html

*He and Richard P. Kelisky, Director of Computing Systems for IBM's Research Division, wrote about their observations in 1979: "...each second of system response degradation leads to a similar degradation added to the user's time for the following [command]. This phenomenon seems to be related to an individual's attention span. The traditional model of a person thinking after each system response appears to be inaccurate. Instead, people seem to have a sequence of actions in mind, contained in a short-term mental memory buffer. Increases in SRT [system response time] seem to disrupt the thought processes, and this may result in having to rethink the sequence of actions to be continued."*
threetonesun · 9 days ago
Obviously, the world ran before computers. The more interesting part of this is what we would lose if we knew there were no new computers, and while I'd like to believe the world would put its resources towards critical infrastructure and global logistics, we'd probably see the financial sector trying to buy out whatever it could, followed by every data center / cloud computing company trying to lock all of the best compute power in their own buildings.
nurettin · 3 days ago
If memory doesn't fail me, we kind of went through this phase back in the early 2010s, when a lot of companies rewrote their php/ruby web applications and delayed jobs in Go, blogging about experiencing speedups and lowering server costs.
Devasta · 9 days ago
Imagine software engineering was like real engineering, where the engineers had licensing and faced fines or even prison for negligence. How much of the modern world's software would be tolerated?

Very, very little.

If engineers had handled the Citicorp Center the same way software engineers do, the fix would have been to update the documentation in Confluence to say not to expose the building to winds, and then later shrug when it collapsed.
margorczynski · 9 days ago
The priority should be safety, not speed. I prefer, e.g., a slower browser or OS that isn't riddled with exploits and attack vectors.

Of course that doesn't mean everything should be done in JS and Electron, as there are a lot of drawbacks to that. There exists a reasonable middle ground where you get, e.g., memory safety but don't operate on layers upon layers of heavy abstraction and overhead.
cadamsdotcom · 8 days ago
Consider UX:

Click the link and contemplate while X loads. First, the black background. Next it spends a while and you're ready to celebrate! Nope, it was loading the loading spinner. Then the pieces of the page start to appear. A few more seconds pass while the page is redrawn with the right fonts; only then can you actually scroll the page.

Having had some time to question your sanity for clicking, you're grateful to finally see what you came to see. So you dwell 10x as long, staring at a loaded page and contemplating the tweet. You dwell longer to scroll and look at the replies.

How long were you willing to wait for data you REALLY care about? 10-30 seconds; if it's important enough you'll wait even longer.

Software is as fast as it needs to be to be useful to humans. Computer speed doesn't matter.

If the computer goes too fast it may even be suspected of trickery.
jmward01 · 9 days ago
The goal isn't optimized code, it is utility/value prop. The question then is how we get the best utility/value given the resources we have. This question often leads people to believe optimization is the right path, since it would use fewer resources and therefore the value prop would be higher. I believe they are both right and wrong. For me, almost universally, good optimization ends up simplifying things as it speeds them up. This 'secondary' benefit, to me, is actually the primary benefit. So when considering optimizations, I'd argue that performance gains are a potential proxy for simplicity gains in many cases, so putting a little more effort into them is almost always worth it. Just make sure you actually are simplifying, though.
don_searchcraft · 9 days ago
100% agree with Carmack. There was a craft in writing software that I feel has been lost with access to inexpensive memory and compute. Programmers can be inefficient because they have all that extra headroom to do so, which just contributes to the cycle of needing better hardware.
pmontra · 9 days ago
I work on a laptop from 2014: an i7 4xxx with 32 GB RAM and 3 TB SSD. It's OK for Rails and for Django, Vue, Slack, Firefox and Chrome. Browsers and interpreters got faster. Luckily there was pressure to optimize, especially in browsers.
Earw0rm · 8 days ago
Optimise is never a neutral word.

You always optimise FOR something, at the expense of something else.

And that can, and frequently should, be lean resource consumption, but it can come at a price.

Which might be one or more of: accessibility. Full internationalisation. Integration paradigms (thinking about how modern web apps bring UI and data elements in from third parties). Readability/maintainability. Displays that can actually represent text correctly at any size without relying on font-hinting hacks. All sorts of subtle points around UX. Economic/business-model stuff (megabytes of cookie BS on every web site, looking at you right now). Etc.
VagabundoP · 9 days ago
I've installed OSX Sequoia on 2015 iMacs with 8 gigs of RAM and it runs great. More than great, actually.

Linux on 10-15 year old laptops runs well too; if you beef up the RAM and SSD, really well.

So for everyday stuff we can and do run on older hardware.
westurner · 8 days ago
Code bloat: https://en.wikipedia.org/wiki/Code_bloat

Software bloat > Causes: https://en.wikipedia.org/wiki/Software_bloat#Causes

Program optimization > Automated and manual optimization: https://en.wikipedia.org/wiki/Program_optimization#Automated_and_manual_optimization
vasco 9 days ago
He mentions that the rate of innovation would slow down, which I agree with. But I think even a 5% slower innovation rate, compounded over decades of computer usage, would delay the optimizations we could make, or even our ability to figure out what needs optimizing, and in the end we&#x27;d be less efficient because we&#x27;d be slower at finding efficiencies. Put another way: a low adoption rate of new efficiencies is worse than a high adoption rate of old efficiencies.<p>If Cadence, for example, releases every feature 5 years later because they spend more time optimizing (it&#x27;s software, after all), how much will that delay semiconductor innovation?
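The compounding effect this comment describes is easy to put rough numbers on. A back-of-the-envelope sketch, where the yearly growth rates are illustrative assumptions rather than real industry figures:

```python
# Toy model: two innovation tracks, one 5 percentage points slower per year.
# The 40%/35% growth rates are made-up assumptions for illustration only.
years = 20
fast = 1.40 ** years   # capability after 20 years at 40%/year
slow = 1.35 ** years   # capability after 20 years at 35%/year

gap = fast / slow
print(f"After {years} years the faster track is {gap:.1f}x ahead")
```

Under these assumed rates the gap after two decades is roughly 2x, which is the "slower at finding efficiencies" cost the comment is pointing at.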
Paianni 8 days ago
So Metal and DirectX 12 capable GPUs are already hard requirements for their respective platforms, and I anticipate that Wayland compositors will start depending on Vulkan eventually.<p>That will make pre-Broadwell laptops generally obsolete and there aren&#x27;t many AMD laptops of that era that are worth saving.<p>Servers don&#x27;t need a graphical environment, but the inefficiencies of older hardware are much more acute in this context compared to personal computing, where the full capabilities of the hardware are rarely exploited.
giancarlostoro 8 days ago
If the tooling had kept up. We went from RADs that built you fully native GUIs to abandoning ship and letting Electron take over. Anyone else have 40 web browsers installed and they are each some Chromium hack?
MisterBastahrd 8 days ago
The world will seek out software optimization only after hardware reaches its physical limits.<p>We&#x27;re still in Startup Land, where it&#x27;s more important to be first than it is to be good. From that point onward, you have to make a HUGE leap, and your first-to-market competitor needs to make some horrendous screwups, for you to overtake them.<p>The other problem is that some people still believe the masses will pay more for quality. Sometimes good enough is good enough. Tidal didn&#x27;t replace iTunes or Spotify, and Pono didn&#x27;t exactly crack the market for iPods.
floathub 8 days ago
<p><pre><code> &gt; npx create-expo-app@latest --template blank HelloWorldExpoReact &gt; du -h HelloWorldExpoReact&#x2F; </code></pre> 258M! A quarter of a gigabyte for a HelloWorld example. Sheesh.
actuallyalys 8 days ago
Carmack is right to some extent, although I think it’s also worth mentioning that people replace their computers for reasons other than performance, especially smartphones. Improvements in other components, damage, marketing, and status are other reasons.<p>It’s not that uncommon for people to replace their phone after two years, and as someone who’s typically bought phones that are good but not top-of-the-line, I’m skeptical all of those people’s phones are getting bogged down by slow software.
ricardo81 9 days ago
Minimalism is excellent. As others have mentioned, languages that are more memory-safe (assuming the language is written in such a way) may be worth the additional complexity cost.<p>But surely, with burgeoning AI use, efficiency savings are being gobbled up by the brute-force nature of it.<p>Maybe model training and the likes of Hugging Face can keep different groups from reinventing the same AI wheel, burning more resources than a cursory search would have.
dehrmann 8 days ago
This is Carmack&#x27;s favorite observation over the last decade+. It stems from what made him successful at id. The world&#x27;s changed since then. Home computers are rarely compute-bound, the code we write is orders of magnitude more complex, and compilers have gotten better. Any wins would come at the cost of a <i>massive</i> investment in engineering time or degraded user experience.
therealmarv 9 days ago
Tell me about it. Web development has only become fun again at my place since upgrading from an Intel Mac to an M4 Mac.<p>Just throw in Slack, the vscode editor in Electron, a Next.js stack, 1-2 docker containers and one browser, and you need top-notch hardware to run it all fluidly (Apple Silicon is amazing, though). I&#x27;m doing no fancy stuff.<p>Chat, an editor in a browser and docker don&#x27;t seem the most efficient things put all together.
WJW 9 days ago
Well obviously. And there would be no wars if everybody made peace a priority.<p>It&#x27;s obvious for both cases where the real priorities of humanity lie.
casey2 8 days ago
True for large corporations. But for individuals, the ability to put what was previously an entire stack into a script that doesn&#x27;t call out to the internet will be a big win.<p>How many people are going to write and maintain shell scripts with 10+ curls? If we are being honest, this is the main reason people use Python.
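A minimal sketch of the kind of script this comment describes; the endpoint and field names are hypothetical, and the point is only that the glue logic (parsing, grouping, error handling) is what pushes people from a pile of curls to Python:

```python
import json
import urllib.request

API = "https://api.example.com"  # hypothetical endpoint


def fetch_json(path):
    """One call replaces a `curl ... | jq` pipeline."""
    with urllib.request.urlopen(f"{API}{path}") as resp:
        return json.load(resp)


def build_report(records):
    """The glue logic that gets painful in shell: grouping and counting."""
    report = {}
    for record in records:
        status = record["status"]
        report[status] = report.get(status, 0) + 1
    return report


if __name__ == "__main__":
    # In real use this would be: build_report(fetch_json("/jobs")["items"])
    print(build_report([{"status": "ok"}, {"status": "ok"}, {"status": "err"}]))
```

The same pipeline in shell would need curl, jq, sort and uniq glued together, with error handling bolted on per command.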
teleforce 8 days ago
At Def Con 32, the badge could run full Doom on a puny Pico 2 microcontroller [1].<p>[1] Running Doom on the Raspberry Pi Pico 2: A Def Con 32 Badge Hack:<p><a href="https:&#x2F;&#x2F;shop.sb-components.co.uk&#x2F;blogs&#x2F;posts&#x2F;running-doom-" rel="nofollow">https:&#x2F;&#x2F;shop.sb-components.co.uk&#x2F;blogs&#x2F;posts&#x2F;running-doom-</a>
markus_zhang 9 days ago
I think optimizations only occur when the users need them. That is why there are so many tricks for game engine optimization and compiling speed optimization. And that is why MSFT could optimize the hell out of VSCode.<p>People simply do not care about the rest. So there will be as little money spent on optimization as possible.
vishalontheline 8 days ago
When it&#x27;s free, it doesn&#x27;t need to be performant unless the free competition is performant.
knowitnone 9 days ago
I already run on older hardware, and most people could if they chose to; I haven&#x27;t bought a new computer since 2005. Perhaps the OS could adopt a &quot;serverless&quot; model where computationally heavy tasks are offloaded, as long as there is sufficient bandwidth.
abetaha 9 days ago
Sadly software optimization doesn&#x27;t offer enough cost savings for most companies to address consumer frustration. However, for large AI workloads, even small CPU improvements yield significant financial benefits, making optimization highly worthwhile.
noobermin 9 days ago
I&#x27;m already moving in this direction in my personal life. It&#x27;s partly nostalgia, but it&#x27;s partly practical. It&#x27;s just that work requires working with people who only use what HR and IT foist on them, so I need a separate machine for that.
jasonthorsness 9 days ago
We are squandering bandwidth similarly and that hasn’t increased as much as processing power.
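As a toy illustration of the bandwidth point (the payload shape is made up), a verbose, repetitive JSON response compresses dramatically with stdlib gzip alone, yet plenty of services still ship this kind of thing uncompressed:

```python
import gzip
import json

# Hypothetical API response: 1,000 records with verbose, repeated keys.
payload = json.dumps([
    {"user_identifier": i, "display_name": f"user{i}", "is_active": True}
    for i in range(1000)
]).encode()

compressed = gzip.compress(payload)
ratio = len(payload) / len(compressed)
print(f"{len(payload)} -> {len(compressed)} bytes ({ratio:.0f}x smaller)")
```

Trimming the schema itself (shorter keys, binary formats) would shrink it further still; this is only the savings that transparent compression gives away for free.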
mcflubbins 8 days ago
We have customers with thousands of machines that are still using spinning, mechanical 5400 RPM drives. The machines are unbelievably slow and only get slower with every single update; it&#x27;s nuts.
southernplaces7 8 days ago
Reminded me of this interesting thought experiment<p><a href="https:&#x2F;&#x2F;x.com&#x2F;lauriewired&#x2F;status&#x2F;1922015999118680495" rel="nofollow">https:&#x2F;&#x2F;x.com&#x2F;lauriewired&#x2F;status&#x2F;1922015999118680495</a>
1vuio0pswjnm7 8 days ago
Works without Javascript:<p><a href="https:&#x2F;&#x2F;nitter.poast.org&#x2F;ID_AA_Carmack&#x2F;status&#x2F;1922100771392520710" rel="nofollow">https:&#x2F;&#x2F;nitter.poast.org&#x2F;ID_AA_Carmack&#x2F;status&#x2F;19221007713925...</a>
emsign 8 days ago
8bit&#x2F;16bit demo scene can do it, but that&#x27;s true dedication.
AtNightWeCode 8 days ago
Well, it is a point. But also remember the horrors of the monoliths he made. Like in Quake (1? 2? 3?), where you have hacks like &quot;if the level name contains XYZ, then do this magic&quot;. I think the conclusion might be wrong.
jrowen 8 days ago
This is the story of life in a nutshell. It&#x27;s extremely far from optimized, and that is the natural way of all that it spawns. It almost seems inelegant to attempt to &quot;correct&quot; it.
benced 8 days ago
Feels like half of this thread didn&#x27;t read or ignored his last line: &quot;Innovative new products would get much rarer without super cheap and scalable compute, of course.&quot;
turtlebits 9 days ago
1. Consumers are attracted to pretty UIs and lots of features, which pretty much drives inefficiency.<p>2. The consumers that have the money to buy software&#x2F;pay for subscriptions have the newer hardware.
dardeaup 9 days ago
It could also run on much less current hardware if efficiency was a priority. Then comes the AI bandwagon, and everyone is buying loads of new equipment to keep up with the Joneses.
rsynnott 8 days ago
Well, yes, I mean, the world could run on less of all sorts of things, if efficient use of those things were a priority. It&#x27;s not, though.
more_corn 7 days ago
The world could run on old hardware if you wiped your win10 machines and installed Linux instead of buying new win11 machines.
cogman10 9 days ago
I&#x27;m going to be pretty blunt. Carmack gets worshiped when he shouldn&#x27;t be. He has several bad takes on software. Further, he&#x27;s frankly behind the times when it comes to the current state of the software ecosystem.<p>I get it, he&#x27;s legendary for the work he did at id Software. But this is the guy who, only like 5 years ago, became convinced that static analysis was actually a good thing for code.<p>He seems to have a view of the state of software that&#x27;s frozen in time. Interpreted stuff is slow, networks are slow, databases are slow. Everyone is working with Pentium 1s and 2 MB of RAM.<p>None of these are what he thinks they are. CPUs are wicked fast. Interpreted languages are now within a single-digit multiple of natively compiled languages. RAM is cheap and plentiful. Databases and networks are insanely fast.<p>Good on him for sharing his takes, but really, he shouldn&#x27;t be considered a &quot;thought leader&quot;. I&#x27;ve noticed his takes have been outdated for over a decade.<p>I&#x27;m sure he&#x27;s a nice guy, but I believe he&#x27;s fallen into a trap that many older devs do. He&#x27;s overestimating what the costs of things are because his mental model of computing is dated.
generalizations 8 days ago
The world runs on the maximization of ( - entropy &#x2F; $) and that&#x27;s definitely not the same thing as minimizing compute power or bug count.
Scene_Cast2 9 days ago
Where lack of performance costs money, optimization is quite invested in. See PyTorch (Inductor CUDA graphs), Triton, FlashAttention, Jax, etc.
nashashmi 9 days ago
The world could run on older hardware if rapid development did not also make money.<p>Rapid development is creating a race towards faster hardware.
narag 8 days ago
How much of the extra power has gone to graphics?<p>Most of it?
Mindwipe 9 days ago
Probably, but we&#x27;d be in a pretty terrible security place without modern hardware based cryptographic operations.
oatmeal_croc 8 days ago
Yes, but it is not a priority. GTM is the priority. Make money machine go brrr.
randomcarbloke 8 days ago
isn&#x27;t this Bane&#x27;s rule?:<p><a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=8902739">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=8902739</a>
datax2 9 days ago
This is a double-edged-sword problem, but I think what people are glossing over in the compute-power topic is power efficiency. One thing I struggle with when home-labbing old gaming equipment is the power efficiency of new hardware. It&#x27;s hardly a fair comparison, but I can choose to recycle my Ryzen 1700x with a 2080 Ti as a media server that will probably consume a few hundred watts, or I can get an M1 that sips power. The double-edged-sword part is that the Ryzen system becomes considerably more power efficient running Proxmox or Ubuntu Server vs a Windows client. As a society we choose the niche we want to leverage, and it swings like economics: strapped for cash, we build more efficient code; with no limits, we buy the horsepower to meet the needs.
hnlurker22 9 days ago
My professor back in the day told me that &quot;software is eating hardware&quot;. No matter how advanced the hardware gets, software will find a way to use up that advancement.
redleader55 9 days ago
Carmack is a very smart guy and I agree with the sentiment behind his post, but he&#x27;s a software guy. Unfortunately for all of us, hardware has bugs, sometimes bugs so bad that you need to drop 30-40% of your performance to mitigate them - see Spectre, Meltdown and friends.<p>I don&#x27;t want the crap Intel has been producing for the last 20 years; I want the ARM, RISC-V and AMD CPUs from 5 years in the future. I don&#x27;t want a GPU from Nvidia that comes with buggy drivers and opaque firmware updates; I want the open-source GPU that someone is bound to make in the next decade. I&#x27;m happy 10 Gb switches are becoming a thing in the home; I don&#x27;t want the 100 Mb hubs from the early 2000s.
cannabis_sam 8 days ago
As long as sufficient numbers of wealthy people are able to wield their money as a force to shape society, this will always be the outcome.<p>Unfortunately, in our current society, a rich group of people with a very restricted intellect, abnormal psychology, perverse views on human interaction and a paranoid delusion that keeps normal human love and compassion beyond their grasp has been able to shape society to their dreadful imagination.<p>Hopefully humanity can make it through these times, despite these hateful aberrations doing their best to wield their economic power to destroy humans as a concept.
JohnMakin 8 days ago
don&#x27;t major cloud companies do this and then sell the gains as a commodity?
voidUpdate 9 days ago
I mean, if you put Win 95 on a period-appropriate machine, you can do office work easily. All that is really driving computing power is the web and gaming. If we weren&#x27;t doing either of those things as much, I bet we could all quite happily use machines from the 2000s era.
x1unix 9 days ago
based
wiz21c 9 days ago
I&#x27;d much prefer Carmack to think about optimizing for energy consumption.
busterarm 9 days ago
Let&#x27;s keep the CPU-efficiency golf to Zachtronics games, please.<p>I&#x2F;O is almost always the main bottleneck. I swear to god, 99% of developers out there only know how to measure the CPU cycles of their code, so that&#x27;s the only thing they optimize for. Call me after you&#x27;ve seen your jobs on your k8s clusters get slow because all of them are using local disk inefficiently and wasting cycles waiting in queue for reads&#x2F;writes. Or your DB replication slows down to the point that you have to choose between breaking the mirror and no longer making money.<p>And older hardware consumes more power. That&#x27;s the main driving factor behind server hardware upgrades, because you can fit more compute into your datacenter.<p>I agree with Carmack&#x27;s assessment here, but most people reading are taking the wrong message away with them.
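One way to see the blind spot this comment points at: a CPU profiler measures process time, and time a job spends blocked on I&#x2F;O simply does not show up in it. A small sketch, where the sleep stands in for a disk read or slow query:

```python
import time

def fake_io(seconds):
    """Stand-in for an I/O-bound call (disk read, network, DB query)."""
    time.sleep(seconds)

wall_start = time.perf_counter()
cpu_start = time.process_time()

fake_io(0.2)                  # blocked, burning essentially no CPU
total = sum(range(100_000))   # a little real computation

wall = time.perf_counter() - wall_start
cpu = time.process_time() - cpu_start

# A CPU-cycle view sees only `cpu`; the 0.2 s of waiting is invisible to it.
print(f"wall: {wall:.3f}s  cpu: {cpu:.3f}s")
```

This is why wall-clock tracing (or iostat-style metrics) catches the queueing problems described above when a CPU profile looks perfectly healthy.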