科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global tech news and discussion.


Algorithms for the 21st Century (2006)

14 points | by pncnmnp | 8 months ago

4 comments

sameoldtune | 8 months ago
This paper is very enlightening, but I don't think an algorithms class should address these sorts of problems. When I went to school (15 years ago) we had a separate machine-architecture class series that covered these practical concerns. We had labs where we would time memory accesses, test aliased variables, and manually decompile simple functions.

In my humble opinion as a volunteer educator, algorithms are already very complicated, and students being introduced to them don't need to concern themselves with this material at the same time.
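The memory-timing lab described above can be sketched in a few lines. This is only a rough illustration of the experiment's shape, assuming a zero-filled buffer of doubles; in CPython, interpreter overhead dominates, so cache effects are far weaker than in the C labs the commenter remembers.

```python
import time
from array import array

# A rough sketch of a memory-access-timing lab exercise.
# Caveat: CPython's interpreter overhead swamps cache effects,
# so differences here are much smaller than in compiled code.
N = 1 << 20                       # 1M doubles (~8 MB buffer)
data = array("d", bytes(8 * N))   # zero-initialized

def per_access_ns(stride):
    """Average nanoseconds per element access at a given stride."""
    touched = range(0, N, stride)
    t0 = time.perf_counter()
    s = 0.0
    for i in touched:
        s += data[i]
    elapsed = time.perf_counter() - t0
    return elapsed / len(touched) * 1e9

for stride in (1, 16, 64):
    print(f"stride {stride:3d}: {per_access_ns(stride):6.1f} ns per access")
```

Larger strides touch fewer elements but defeat spatial locality; in compiled code the per-access cost climbs sharply once each access lands in a new cache line.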
jpollock | 8 months ago
Nifty paper. I kind of wish I was in a problem space that had that sort of issue!

I just spent a week shaving 98% off the latency of a single RPC call by going from O(N^2) to O(1). Data locality is not a problem I have. :)
kmoser | 8 months ago
> We can also examine what happens when we write data repeatedly.

I wonder whether an optimizing compiler can detect and optimize the sequential writes of the same values, possibly by replacing them with something like a blitter. It would be interesting to see the difference in speed if the data to be written was random, rather than the same value (1.0) every time.
abirch | 8 months ago
This is a good paper. I would label it "Caveats for Big O," or "the map is not the territory."