There were others. Many others.
Just a few years after the NIC cheat, for example, Rash encountered a brand of Gigabit Ethernet switches whose results were too good to be believed. Sure enough, when tested with real-world data, the switch didn’t just perform poorly; it “failed spectacularly,” as Rash put it.
A writer and programming buddy of mine, Stephen Satchell, reminded me of yet another case. This time it was a software trick very reminiscent of what VW did. As he recounted the incident in a recent email, a “C compiler (names removed to protect the guilty) would detect that the Sieve of Eratosthenes [a simple algorithm for finding prime numbers often used to test the speed of compilers] was being compiled, and would inject handcrafted intermediate code that had been highly optimized.” This made the compiler look faster, although the trick could sometimes cause programs to break.
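For readers who haven’t run into it, the Sieve of Eratosthenes is small enough to sketch in a few lines, which is exactly why it became a popular compiler benchmark. Here’s an illustrative Python version; it is not the C benchmark code the compiler was detecting, just the same algorithm:

```python
def sieve(limit):
    """Return all primes up to and including limit (Sieve of Eratosthenes)."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            # Cross off every multiple of n, starting at n*n
            # (smaller multiples were already crossed off).
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n in range(limit + 1) if is_prime[n]]

print(sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Because the loop structure of this little routine is so recognizable, a compiler could pattern-match it and substitute hand-tuned output, which is precisely the cheat Satchell described.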
In the ’90s, I also encountered a brand of graphics cards that always won our benchmarks. They were good, but that good? We didn’t think so.
So we looked closer and found that the cards were set to detect our benchmarks. When they did, they either sped up the GPU or served the correct graphics straight from embedded memory, depending on the particular cheat.
Like VW, all these companies took shortcuts to make more money on the cheap. People willing to break rules to look better than the competition in “objective” benchmarks and tests will always be with us.
So whenever you see benchmark results that appear too good to be true, go with your gut reaction. It’s all too likely the vendor has been playing games with numbers just so it can swindle you out of some money.