
The camera of my fantasies, the reviews of the damned


No, the doughty Pentax K100 D Super is not going anywhere, and I'm not giving up on the Canon SD1000 super-compact until I get a phone with a decent camera, but Olympus went and built the interchangeable-lens "rangefinder" I expressed a desire for in 2005.

Behold the Olympus Pen E-P1. That's the short Wired review; the DP Review writeup is, as usual, much longer. The camera is built around the Micro 4/3 standard, and it seems to be the first one to really exploit that system's benefits. On the down side, it sacrifices any built-in flash (it has a hot shoe) and a built-in optical rangefinder (there's a hot-shoe-mounted one, matched to the available 17 mm pancake lens, which is roughly a 35 mm equivalent in OG full-frame focal lengths).

Technically the Panasonic G1 (and its now-with-HD updated sibling, the GH1) trod this same ground last year, but despite their technical similarity to the E-P1, both went in the SLR-look direction. The G1's fake pentaprism redeems itself, though, by housing the flash and an electronic viewfinder.

While I'm here, what's with DP Review's ratings? I love the obsessive depth of their coverage, but after identifying some real issues with the GH1 (mainly a high kit price driven by the expensive-but-so-so included super-zoom lens), they gave it their highest recommendation. In reviewing the E-P1, they identify critical autofocus problems in low-available-light ("social") situations, which is one of the best reasons to want a high-quality compact flashless camera in the first place, and then give it their highest rating "(though probably not for everyone)"!

The sordid tale is shown here, where they have all cameras sorted on their five-step scale (Highly Recommended/Recommended/Above Average/Average/Below Average). Never mind that the two lowest tiers have only 16 cameras between them; the last time they used even "Average" to rate a camera was 2003.

In other words, every camera since 2003 has been above average. Welcome to the Lake Wobegon camera shop!

It just gets more nonsensical. In 2008, here's how they rated cameras:

  • Above Average: 3
  • Recommended: 7
  • Highly Recommended: 20

Now, I repeat: DP Review does a solid job of show and tell, and they seem to genuinely point out actual flaws in the cameras they review. But I think they should drop their rating logos out of a sense of shame, or at least admit that it's a three-step scale, and that anything less than the top rating indicates a serious dog.

They're hardly the only sinners. Cyclingnews does a pretty good job with their bike-gear tech reviews, but they tend to rate products on a scale that runs from 3.5 to 5. That's slightly unfair of me, as their latest review covers a set of pricey road shoes from a major name (Bontrager, Trek's gear brand), and they gave them a lousy 2.5. Of course, to earn that score, the included insoles bled color onto the testers' socks in the rain, and the malformed sole warped cleats enough to affect clip-in and clip-out. I'd consider that last item a 0.5/5 kind of thing, really.

I once made a joke that a product would have to kill one of their testers to get a 2. I'm not quite sure that's still a joke.

Velonews, the across-the-planet competition for bike tech reviews, goes for a form of honesty by not putting a scale rating on their articles, and in at least one case they have admitted it when a product tried to kill one of their testers. Perhaps the more terrifying thing is that at the start of that article, they confess that they "often do not report such issues."

Guys, what? OK, if the structural issue arose because you rode into a curb or engaged in some inept mountain biking, fine. But if a product breaks "in a safe way" or due to a "manufacturing anomaly" (both reasons the article cites for not mentioning such failures!), I'd really like to know about it.