Linked by Thom Holwerda on Thu 3rd Oct 2013 16:07 UTC
Benchmarks

With the exception of Apple and Motorola, literally every single OEM we've worked with ships (or has shipped) at least one device that runs this silly CPU optimization. It's possible that older Motorola devices might've done the same thing, but none of the newer devices we have on hand exhibited the behavior. It's a systemic problem that seems to have surfaced over the last two years, and one that extends far beyond Samsung.

Pathetic, but this has been going on in the wider industry for as long as I can remember - graphics chip makers come to mind, for instance. Still, this is clearly scumbag behaviour designed to mislead consumers.

On the other hand, if you buy a phone based on silly artificial benchmark scores, you deserve to be cheated.

Thread beginning with comment 573814
RE: not a cheat
by Alfman on Thu 3rd Oct 2013 19:06 UTC in reply to "not a cheat"

viton,

"For accurate results benchmark needs stable frequency across the particular interval of measurement."

It's wrong to conduct the benchmark against a CPU/GPU super-configuration if the intention is to get an idea of the performance of a normal configuration (which it generally is).


Non-deterministic behavior like this should be mitigated by running the benchmark over a longer period and/or repeating it a few times. That way you get a better idea of the min/max/avg/median, and so on.
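
As a rough illustration only (a minimal sketch in Python, not tied to any particular benchmark suite; run_benchmark() is a hypothetical placeholder workload, not something from the article), repeating the measurement and summarizing the spread might look like this:

# Sketch: repeat a benchmark several times and report the spread,
# rather than trusting a single run on a possibly boosted configuration.
import statistics
import time

def run_benchmark():
    # Hypothetical placeholder workload; a real benchmark would
    # exercise the CPU/GPU here instead.
    total = 0
    for i in range(1_000_000):
        total += i * i
    return total

def timed_runs(workload, repetitions=10):
    # Time the workload several times; return per-run durations in seconds.
    durations = []
    for _ in range(repetitions):
        start = time.perf_counter()
        workload()
        durations.append(time.perf_counter() - start)
    return durations

if __name__ == "__main__":
    runs = timed_runs(run_benchmark)
    print(f"min:    {min(runs):.4f}s")
    print(f"max:    {max(runs):.4f}s")
    print(f"mean:   {statistics.mean(runs):.4f}s")
    print(f"median: {statistics.median(runs):.4f}s")

The point of reporting the full spread rather than a single number is that any boost-mode shenanigans show up as outliers instead of silently inflating the headline score.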

Changing the configuration for the benchmark makes the benchmark less accurate with respect to what it's trying to measure (even if the measurements are more consistent, as you suggest).


O/T news: usa.gov (every .gov website, for that matter) now reads: "Due to a lapse in funding, the US government has shut down."

Score: 5