
Ubuntu is getting slower

13 points by urlwolf over 16 years ago

5 comments

kxt, over 16 years ago
Seeing benchmarks conducted by Phoronix always fills me with emptiness on the inside. While I really welcome the idea of benchmarking Linux, their methodology always seems lacking to me.

All we get is a bunch of numbers, without any actual investigation of what those numbers represent or what might explain the outcome, and sometimes the measurements simply make no sense.

For example, according to these benchmarks, Ubuntu 7.04 reads memory twice as fast as newer versions. There is no possible way that can be a valid result, at least assuming that the exact same compiled code was used on every installation. Which brings us to another problem: there is no information on the tests. All we get is a software name, a version number and the result numbers. That would be almost fine if they were prepackaged binaries, but with FOSS, different compile-time options and compiler flags can make quite a difference in the results too.

About the nonsensical tests: RAM speed should be the same regardless of the OS, so dedicating a full page to RAM speed tests should be pointless. Except it isn't: it actually serves as a nice control for the tests, and the numbers show that there's a problem somewhere. Either the tests or the measurements are off significantly, or something is flawed in the 7.04 configuration or the others that causes an almost 50% difference in such a test.

Also, measuring compile times: they managed to measure the time it takes to compile three pieces of software written in C using an unspecified compiler with unspecified options.

In the end, no conclusions were drawn, just the results summarized in English instead of plain numbers. The whole thing gives me the feeling that they don't really know what they are testing; they're just running a bunch of programs and reporting the numbers they output.

I'm sorry if it seems like I'm just ranting, but I've tried a couple of times to send emails pointing out the flaws in their methodology, to no avail.
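For illustration only, here is a minimal sketch of the kind of control kxt describes: the same memory-read probe run on each Ubuntu release, on identical hardware, should give roughly identical numbers, so a large gap points at the measurement or the configuration rather than the OS. This is not the Phoronix test, and Python overhead dominates, so the figure is only useful as a relative control across installs, not as an absolute bandwidth number.

```python
# Minimal memory-read throughput probe (illustrative sketch, not the
# Phoronix test). Run the same script on every release under test: on
# identical hardware the results should be close, so a ~50% gap points
# at the measurement or the system setup, not the OS itself.
import array
import time

def read_throughput_mb_s(size_mb=256, passes=5):
    n = size_mb * 1024 * 1024 // 8        # number of 8-byte doubles
    buf = array.array('d', [0.0]) * n     # allocate ~size_mb MB of doubles
    best = 0.0
    for _ in range(passes):
        t0 = time.perf_counter()
        total = sum(buf)                  # forces a full read of the buffer
        t1 = time.perf_counter()
        best = max(best, size_mb / (t1 - t0))
    return best, total

if __name__ == "__main__":
    mb_s, _ = read_throughput_mb_s()
    print(f"best read throughput: {mb_s:.1f} MB/s")
```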
mdasen, over 16 years ago
I've always found that benchmarks are not good at showing OS performance for desktop applications. Why? I don't really need raw speed. What I need is something that will allow me to switch tasks and use other applications while one is locking up/doing a ton.

BeOS was great at that. I'm guessing the new Completely Fair Scheduler that is debuting in Intrepid is better in that way too. My use of Intrepid shows it to be noticeably faster in my daily usage. However, I'm not compressing video. I'm running several apps at a time and need to switch between them a lot.
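One hedged way to put a number on the kind of responsiveness mdasen describes (a sketch, not anything from the article or from BeOS/CFS itself) is to measure how late short sleeps wake up while other processes saturate the CPU; a scheduler that keeps desktops usable keeps that overshoot small even under load.

```python
# Rough interactivity probe (illustrative sketch): measure how late
# 10 ms sleeps wake up while one CPU-hog process per core runs.
import multiprocessing
import time

def cpu_hog(stop):
    while not stop.is_set():
        pass                                  # burn CPU until told to stop

def worst_wakeup_overshoot_ms(samples=200, interval=0.01):
    worst = 0.0
    for _ in range(samples):
        t0 = time.perf_counter()
        time.sleep(interval)                  # ask to wake up in 10 ms
        late = (time.perf_counter() - t0 - interval) * 1000.0
        worst = max(worst, late)
    return worst

if __name__ == "__main__":
    stop = multiprocessing.Event()
    hogs = [multiprocessing.Process(target=cpu_hog, args=(stop,))
            for _ in range(multiprocessing.cpu_count())]
    for h in hogs:
        h.start()
    try:
        print(f"worst wake-up overshoot under load: "
              f"{worst_wakeup_overshoot_ms():.2f} ms")
    finally:
        stop.set()
        for h in hogs:
            h.join()
```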
woodsier, over 16 years ago
Eugh. I quit the site as soon as I saw it was divided into 10 pages. No thanks.
Comment #355645 not loaded
delano, over 16 years ago
There are a lot of superfluous details in this article. After doing a few initial benchmarks, it should have been obvious that the largest performance decline occurs between the 7.04 and 7.10 releases. The article should have answered that question first before throwing 8.x into the mix.

Also, don't use a laptop for benchmarking operating systems!
Comment #355625 not loaded
Comment #355741 not loaded
erik, over 16 years ago
When Slashdot ran this story, the readers commented that the results seem suspicious. Misconfigured power-saving settings are likely the cause of the speed difference.

http://linux.slashdot.org/comments.pl?sid=1008879&cid=25525597
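A quick sanity check along the lines erik's link suggests is to confirm the CPU frequency governor before benchmarking. The sketch below assumes a Linux system that exposes cpufreq through sysfs; it only reports the governor per CPU so that an "ondemand" or "powersave" setting isn't mistaken for an OS-level slowdown.

```python
# Pre-benchmark sanity check (a sketch, assuming cpufreq is exposed via
# sysfs): print each CPU's frequency governor so a power-saving
# misconfiguration doesn't masquerade as an OS regression.
import glob

def cpu_governors():
    governors = {}
    for path in sorted(glob.glob(
            "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_governor")):
        cpu = path.split("/")[5]              # e.g. "cpu0"
        with open(path) as f:
            governors[cpu] = f.read().strip()
    return governors

if __name__ == "__main__":
    govs = cpu_governors()
    if not govs:
        print("no cpufreq information exposed on this system")
    for cpu, gov in govs.items():
        print(f"{cpu}: {gov}")
```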