We seem to see a lot of activity on benchmarks at the moment, and I'm currently concentrating on two.
The first is power benchmarks, which I think will have a lot to do not only with how useful a laptop is but with how we can help buyers, both individuals and companies, save energy and perhaps save the planet. The other is browser benchmarks, which is probably more a long-term fascination with vendors claiming their stuff is faster without first establishing some kind of reliable, repeatable, independent benchmark.
What I'm saying is that while it is nice to have choices, it would be nicer if we had the tools we needed to make better ones.
If I buy almost any appliance, I get to see a nice sticker that states its energy usage and where it falls within the range for its class. I get tons of vendors telling me how wonderfully efficient their various PC components are, but when I look at desktop systems, the OEM has often cut costs on the power supply and other components; the end result is a PC that can spin my power meter like a top when it doesn't really need to.
Laptops have different issues. Here, it would be nice if you could tell just how long one would run if you were, say, just e-mailing and watching movies or recorded TV, or if you were doing something more intensive. I'm not sure I'd actually do that last task while on battery power, but it would be nice to know how long the darned thing would last if I did. By the way, as a rule of thumb, take the battery life a laptop vendor promises, divide it by two, and use the result to set your expectations. You'll be much happier.
Further, it would be nice to know how long the battery would actually last at something close to these benchmarks. It's great to know what I could do on day one, when I first got the laptop, but since laptop batteries seem to have a service life of about 24 months and can lose about 50 percent of their capacity in the first 12, wouldn't it be wonderful if you could figure out which systems would still give you the battery life you paid for after the first year?
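The arithmetic behind those two rules of thumb is simple enough to sketch. The numbers here are this column's rough estimates, not measured data, and the constant monthly fade rate is just an illustrative assumption:

```python
def realistic_runtime(advertised_hours: float) -> float:
    """Rule of thumb: expect about half the vendor's claimed battery life."""
    return advertised_hours / 2


def capacity_after(months: float, monthly_fade: float = 0.056) -> float:
    """Fraction of original battery capacity remaining, assuming a constant
    monthly fade rate. The default rate is picked so that roughly 50 percent
    of capacity is gone after 12 months: (1 - 0.056)**12 is about 0.50."""
    return (1 - monthly_fade) ** months


advertised = 8.0                             # vendor claims 8 hours
day_one = realistic_runtime(advertised)      # about 4 hours in practice
after_a_year = day_one * capacity_after(12)  # about 2 hours a year later
```

Run the two estimates together and an "8-hour" laptop is down to roughly two hours of real work after a year, which is exactly why a battery-life benchmark that looks past day one would be so useful.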
For instance, Lenovo has implemented a unique charging feature in its ThinkPads that supposedly can extend the battery's in-service life by nearly 2X. HP just switched some of its systems to more expensive Boston-Power batteries, which last substantially longer (up to 3X). But without a benchmark, how would you or I know which to choose? You can make your battery last longer manually, but it is kind of a pain. (If you let the battery drop to 40 percent, store it until you need it, and only charge and use it when you travel, it may last for the life of the laptop.)
At the very least, it would be nice to have something that gave you the equivalent of city and highway mileage and told you how power-efficient the PC was so you could pick something that would last as long as you need and would help save the planet.
By the way, Patrick Moorhead over at AMD posted an interesting comparison of cell phone battery benchmarks. Those makers, particularly Apple, do a better job of giving you a range but they aren't particularly accurate.
Browser benchmarks, going back to Netscape Navigator, have been kind of a bad joke, particularly since we add things like anti-phishing software, toolbars and utilities that can significantly slow down the browser we use day to day. I'm reminded of the joke about the drunk who fell into an open grave. He starts screaming, "Help me, I'm cold, help me, I'm cold," and another drunk comes up and says, "Of course you're cold, you done kicked all your dirt off." Half the time, I want to say to a new browser's speed claim: of course you're faster, you aren't yet running all of the security and advertising crap the primary browser is running.
The benchmarks folks do use seem to take one feature, like JavaScript, and test it, but what good is that if we don't actually live in JavaScript? There is a poor guy in Europe who ran a great benchmark test two years ago, and folks are still so hungry for the results that he has started asking them not to reference it anymore. (I'm putting the link to his site here so that, per his request to keep traffic down, you won't click on it unless you need to.)
Recently, Microsoft did a video (you and I would never go to this much trouble) and published a white paper on how it thinks you should benchmark a browser. Now, this is just to start the dialogue, not to dictate terms, but I see two advantages to at least starting the process of creating a cross-vendor benchmark. It could result in something we eventually could use to test browsers, and it could help the vendors focus on the kinds of speed that users find important. Of course, for that last one to work, we users have to have some input. And I just thought of a third reason: it might allow us to focus on the things that are really slowing our browsers down.
Life is about choices. I'm a big fan of making better ones. Let's hope we can get some better benchmarks so we can make better choices.