This comes as no surprise. This is a company that has also been caught claiming their phone took a photo when the EXIF data showed it came from a DSLR, claiming a phone had UFS storage but shipping eMMC instead, and boasting about working with the dev community only to lock them out months later, among other shady practices. They make nice phones at a good price, but it is not a company to be trusted.
> we have decided to delist the affected models and remove them from our performance rankings

I would be inclined to be more aggressive than that. Leave them listed, but add support in the site or other publication system to display "deliberately broke the rules of the test" as a negative, perhaps shown in a negative-looking colour, and in any graphs shown as if they scored, say, 20% worse than the worst other score.

If presenting some arithmetically derived overall rating of benchmarks and other properties, balloon this further. Then, if the rest of the phone is demonstrably brilliant, they still have a hope of not looking terrible, but the cheating hurts the result otherwise.

Unfortunately, in these litigious times such an inclination would probably cause me some financial trouble, so it is probably for the best that I don't run a device comparison site! This may be why simply delisting was the chosen way of handling the situation here.
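For concreteness, the penalty rule above might look something like this (a rough sketch; the names are hypothetical, and I'm assuming higher scores are better):

    // Pin a cheating device 20% below the worst honest score, so it can
    // never rank above a device that played by the rules.
    fun displayedScore(honestScores: List<Double>, rawScore: Double, cheated: Boolean): Double {
        if (!cheated) return rawScore
        val worstHonest = honestScores.minOrNull() ?: return rawScore
        return worstHonest * 0.8
    }

Any composite rating built on top of several such penalized scores would then compound the hit automatically, which is roughly the "ballooning" effect I mean.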
I used to work at an OEM that did this. When the VW emissions scandal was publicized, we removed almost every piece of code that did things like this from all our phones.

Not sure if it was ever eventually reverted, but while benchmark performance might not matter to HN users, it is something that typical consumers do look at when making purchasing decisions, based on our research.
I'm gonna get downvoted, but that's why I buy Apple phones. Comparing processors, RAM, and test scores means nothing compared to the actual experience.
This is just what is expected: the vendors do this because they need to shine in these meaningless benchmarks, because that's simply the name of the game. The tech magazines are in between: they publish results in reviews and phone comparisons, but it seems that not many people really buy phones based on benchmark scores. They might give some overall idea of whether the phone is mid-range or high-end, but differences within one category tend to be small enough not to matter to the general buyer.

To fix this, there should be different runs for these benchmarks: a max-clocks run with the phone in an actively cooled cabinet, to see what the hardware can theoretically do at its very best, and another run at room temperature without active cooling, with the test running for about an hour, to rule out any benefit from temporary boosts or ignored thermal limits and get an idea of continuously sustainable performance.

Even those results wouldn't really tell the end user much. I always suggest that my friends and relatives buy a phone with excessively large memory and storage, because lack of memory and storage is what really turns phones slow after a few rounds of application and system updates.

The performance edge on mobile is thinning out, much as it did with PCs. Ditto for memory and storage. Early on, vendors competed on who had the highest-MHz CPUs, but somewhere between 1 and 2 GHz performance got high enough for nearly any use case; 95% of people could buy just about any PC or laptop and it would be "fast enough". There are gigabytes of memory in even the sloppiest laptop these days, and enough SSD to make things fly. The same will happen with phones, making it impossible to buy a phone that is too slow. At that point, components with lower performance and less capacity will become more expensive due to lack of volume, and no vendor will bother with them any longer.
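Sketching the sustained room-temperature run I describe above (a hypothetical harness, not any benchmark's actual API):

    import kotlin.system.measureNanoTime

    // Run a fixed workload repeatedly for an hour and record each pass's
    // score; temporary boosts and thermal throttling then show up as a
    // declining curve instead of inflating a single headline number.
    fun sustainedRun(workload: () -> Unit, durationMs: Long = 60 * 60 * 1000L): List<Double> {
        val scores = mutableListOf<Double>()
        val end = System.currentTimeMillis() + durationMs
        while (System.currentTimeMillis() < end) {
            val elapsedNs = measureNanoTime { workload() }
            scores += 1e9 / elapsedNs // passes per second for this iteration
        }
        return scores
    }

Reporting the median (or the last ten minutes) of that curve, rather than the peak, would make benchmark-detection tricks mostly pointless.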
My first reaction was: isn't this a dupe? But the earlier one was about Huawei's camera cheat:

https://news.ycombinator.com/item?id=17805027
>however, when an unlabeled version of the benchmark test was run, the phones were unable to recognize it and, as a result, displayed lower performances.

>In other words, the phones aren’t so smart after all.

If I may: the fact that benchmarks are normally "recognizable" through some "label" in the first place doesn't seem very "smart" to me either.
If something is cheap in terms of the price you pay for it, you're going to pay for it some other way, be it poor-quality components, information collection, or unethical manufacturing processes. Yes, of course I am generalizing: there are good cheap(er) products and bad expensive ones.
That's a shame. I got a P20 Pro and I really like it (even the software).

But I don't think I'll get another Huawei next time, because the company itself is really weird.
Huawei caught cheating at a meaningless 'test' nobody should care about.

I spoke a fair bit about performance with a Googler in charge of it. Apparently he was appalled that OEMs have been optimizing their phones to score high on these benchmarks. They don't reflect real use at all and are not what should be optimized against.

In the end, he documented the performance work he did on the various devices he worked on, so OEMs could do what they should have been doing for years.
> For the Huawei case, the rules are actually a little fuzzy. Phones are permitted to adjust performance based on workload, which results in peaks or dips in performance for different apps, but they are not permitted to hard-code peaks in performance specifically for the benchmark app. Huawei reportedly claimed that the peak in performance seen during the run of the benchmark app was an intuitive jump determined by AI; however, when an unlabeled version of the benchmark test was run, the phones were unable to recognize it and, as a result, displayed lower performances.

Source code for their AI engine:

    if (app == benchmark) { increasePerformance(); }
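More seriously, the reporting suggests the detection is roughly a hardcoded package-name whitelist. A minimal sketch of that idea (the power-limit hooks are hypothetical and the package names merely illustrative, not Huawei's actual code):

    // Benchmark apps the firmware special-cases (illustrative entries).
    val benchmarkWhitelist = setOf(
        "com.futuremark.dmandroid.application", // 3DMark
        "com.antutu.ABenchMark"                 // AnTuTu
    )

    fun onForegroundAppChanged(packageName: String) {
        if (packageName in benchmarkWhitelist) {
            raisePowerAndThermalLimits() // hypothetical hook
        } else {
            applyDefaultLimits()         // hypothetical hook
        }
    }

    fun raisePowerAndThermalLimits() { /* hypothetical hook into the power HAL */ }
    fun applyDefaultLimits() { /* hypothetical */ }

Which is exactly why a renamed, "unlabeled" copy of the same benchmark falls straight through to the default limits.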