How we rate and score cameras

In 2010 we undertook a major overhaul of the way we rate and score cameras in the conclusions of our in-depth reviews. This page explains how we produce those scores, how to interpret them, and how they compare with the old system (which had run essentially unchanged since the late 1990s).

Camera ratings: at a glance

What the scores mean

The category scores (rating bars) are designed to give you an idea of where a camera's strengths and weaknesses lie - few cameras are bad (or excel) at everything, so there's a tendency for the final scores to 'even out'. Although no replacement for actually reading the review, the results box is designed to give you an 'at a glance' view of the camera based on our findings, and of how it compares to its competitors. Very short bars mean below average for the category; very long bars mean above average. The scores behind these ratings are then combined to produce the single '%' score at the end.

Although, taken on its own, this overall rating figure has little real 'meaning', it does give you an idea of how the camera ranks - when taken as a whole - compared with the other cameras in the same category.

The overall scores sit in the following very rough bands:

  • 0-40% Totally unacceptable, run away
  • 41-50% Poor to below average, avoid
  • 51-60% At best average, treat with caution
  • 61-70% Average to good
  • 71-80% Very good to excellent
  • Over 80% Outstanding
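If it helps to see it concretely, the banding is just a threshold lookup - a minimal sketch (the function name is ours; the labels and boundaries come from the list above):

```python
def score_band(score):
    """Map an overall % score to its rough descriptive band."""
    if score <= 40:
        return "Totally unacceptable"
    elif score <= 50:
        return "Poor to below average"
    elif score <= 60:
        return "At best average"
    elif score <= 70:
        return "Average to good"
    elif score <= 80:
        return "Very good to excellent"
    else:
        return "Outstanding"

print(score_band(73))  # "Very good to excellent"
```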

This is all you really need to know:

  • All scores are relative to the other cameras in the same category
  • For compacts the scores are also broadly comparable across categories
  • The lengths of the bars represent the weighted average of a range of measurements and scores
  • The final score is calculated using a weighted average of the main category scores
  • A camera's score represents 'a moment in time' - the date the review is published
  • Since the scores, weights and ratings contain elements of opinion (these are, after all, reviews), you should take the time to read the entire review if you feel our opinions and priorities may not match yours.
  • The ratings are produced after extensive consultation between all the editors and reviewers and after careful tabulation of all available information.

A word on weighting

You can read more about weighting further down the page if you really want, but the basics are simple. The final score is, as mentioned above, produced by weighting the scores of the various categories and then taking an average. The weighting is the 'dpreview' opinion of what matters most (based, it should be said, on a lot of experience).

Broadly speaking our main priority is image quality, so the categories that contribute to it get the highest weight. In future we will allow you to apply your own weights if you have more specific needs.

Want to know roughly how the final score is calculated? The weights applied break down as sketched below (you can see more detail later on this page if you want):
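As an illustration only - the category names, scores and weights below are invented for the example, not our actual values - the calculation is a straightforward weighted average:

```python
# Hypothetical category weights (illustrative only, not dpreview's real values).
weights = {"Image Quality (JPEG)": 2.0, "Image Quality (RAW)": 2.0,
           "Features": 1.0, "Value": 1.0}

# Example category scores, each out of 10.
scores = {"Image Quality (JPEG)": 7.5, "Image Quality (RAW)": 8.0,
          "Features": 6.5, "Value": 7.0}

# Weighted average of the category scores, expressed as a percentage.
weighted_sum = sum(weights[c] * scores[c] for c in scores)
total_weight = sum(weights.values())
final_percent = 100 * weighted_sum / (10 * total_weight)
print(f"{final_percent:.0f}%")  # 74%
```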

The new awards

We've created two new awards (Gold and Silver) to replace the retiring 'Recommended' and 'Highly Recommended' gongs. There are only two things to know about the awards:

  • They are not awarded to every camera, just those we feel deserve one
  • There is no direct link between the overall score and the awards: they are not given automatically to cameras reaching a certain threshold. Crucially, a camera can get an award even if a camera with a higher overall score didn't.

Whilst we have retrospectively applied our new stricter, more comprehensive scoring system to all cameras reviewed last year, we will not be retrospectively giving out awards (which is why you'll see the original rating on the page too).

How do the new scores relate to those in older (pre-2009) reviews?

The short answer is that they don't: the new scores represent a 'reset' of both scoring and how we give out our awards. The new SILVER and GOLD awards replace the outgoing Recommended and Highly Recommended awards, but they're not directly comparable. The old scores are still valid, but they will almost certainly be numerically higher than those produced by the new system. Long term we intend to take this scoring system back to cover all currently available cameras, but this is a huge undertaking so will take a long time.

Where the scores come from

Once a camera has been reviewed (a process that takes around 4-6 weeks for an SLR and involves thousands of shots inside and outside the lab), all the measured data from our lab tests is gathered together and tabulated to establish how well it performed compared to its competitors (the other cameras in its category).

By the end of the process we have scores for nearly 60 aspects of camera performance and specification, based on measured values, specifications and less easily quantifiable factors such as handling (see below for more details). These are then combined and weighted in various ways to produce the 11 'top level' scoring categories you see, which are in turn weighted before being used to calculate the final percentage rating. This information - the top level scores and the final % score - is what you see in the table at the bottom of the conclusion page.

The situation for compact (fixed lens) cameras is slightly different: there is a single category for most 'compact' cameras, and scoring in this category is based on performance relative to a wide range of models, even those you may not consider to be direct competitors. As you'll see below, however, we have pulled out four specific categories towards the top of the market, and it is these models that we most often review.

Understanding the scores and ratings

Category

Every camera is placed into a broad category based on its market position, feature set and price point. The scores and ratings we assign are based on the camera's performance relative to the other cameras in that category (though there is a natural tendency for average scores to rise slightly as you move up the category range - see below). If a camera is very near the boundary of the next category (in either direction) we also consider the nearest competitors in the next category if it makes sense to do so.

SLR / interchangeable lens camera categories:

  • Entry-level
  • Mid Range
  • High End Enthusiast / Semi-Pro
  • Professional

Compact (fixed lens) cameras are also categorized, though one of the categories (the first) covers a far, far wider range than any other. Scores for compacts are essentially comparable across categories simply because that's the nature of the compact camera market (where features and styling are the main differentiators), but when we're scoring we're only taking into account sensible competitors (including those in other categories where applicable).

Fixed lens camera categories:

  • Compact camera - this covers most point-and-shoot models on the market today
  • Compact 'superzoom' - small cameras with big zooms (over 8x)
  • Superzoom / Bridge - SLR-styled cameras with large zooms and electronic viewfinders
  • Enthusiast compacts - High-end models with extensive photographic control and (usually) raw mode

Scoring categories (SLR / interchangeable lens camera)

There are 11 top level scoring categories on the conclusion page - each represents a (lightly) weighted combination of scores and measurements. Some of the aspects of performance that contribute to each score are shown below:

Category - Constituent scores and measurements

  • Build quality - Construction, materials, finish and sealing
  • Ergonomics & Handling - Physical handling, user interface
  • Features - The features score is based on an extensive comparison of (in order of importance) core features (those we would consider essential), useful but non-standard features (such as in-body IS) and cool new features (such as art filters or in-camera HDR). Note that minor differences in shutter speed, ISO ranges etc will not affect this score.
  • Metering & Focus accuracy - Auto focus and default metering accuracy
  • Image Quality (JPEG) - Resolution, noise (up to ISO 800), noise reduction, auto white balance*, default parameters (optimization)*, pixel-level sharpness and demosaicing, dynamic range, tone curve, color and tonality, flash performance* (*lower weighting)
  • Image Quality (RAW) - Resolution, noise (up to ISO 800), dynamic range, pixel-level sharpness
  • Low light / High ISO performance - Noise (over ISO 800), JPEG noise reduction, low light focus and exposure
  • Viewfinder / Screen - Screen size, resolution, viewfinder size / type / usability
  • Performance (speed) - Focus speed, burst rate, overall performance scores, buffer, live view performance
  • Movie / video mode - Movie format, resolution, quality and functionality
  • Value - Price / performance compared to peers

Note that for compacts the list is slightly different (it includes separate scores for Flash performance and Optics - how good/capable the lens is).

Ratings bars

The scores awarded in each category are shown by a bar representing the range from poor (way below average) to excellent - in comparison to the other cameras in the category at the point the review is published. Note that in order to magnify the small differences between cameras the bars do not actually start at zero. In future we will make the display of our findings considerably more interactive and informative.
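To make that concrete, here's a minimal sketch of such a non-zero-baseline bar (the baseline, scale and bar width are our own illustrative choices, not the values actually used on the site):

```python
def bar_length(score, baseline=4.0, max_score=10.0, bar_width=100):
    """Map a 0-10 score to a bar length, clipping below the baseline.

    Scores at or below the baseline produce an empty bar; the remaining
    range is stretched across the full bar width, which magnifies small
    differences between cameras.
    """
    clipped = max(score, baseline)
    return round(bar_width * (clipped - baseline) / (max_score - baseline))

print(bar_length(7.0))  # 50 - a mid-length bar
print(bar_length(8.5))  # 75 - a visibly longer bar for a modestly higher score
```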

Overall rating

The final rating is calculated using the weighted average of the 11 (or 12) scoring categories. At the moment we're using fairly mild weighting (there are enough image quality categories to ensure IQ comes out on top even without weighting) that represents our view of what's most important to us, and of most importance to people looking to buy a camera. Note that the bars used to represent the scores of the individual categories do not show the weighted values - these are the absolute scores.

The weighting distribution used for compacts and SLRs (including non-reflex interchangeable lens cameras) is shown below. If you don't like our weightings, simply ignore the final score - by then you should've read enough about the camera to make up your own mind!

To ensure that - at this moment in time - the absence of a movie mode in an SLR doesn't disproportionately affect the score, we use a slightly different calculation for these cameras (removing the movie category from the calculation entirely), though the features score will be reduced slightly if a movie mode is considered standard for the category.
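Dropping a category from a weighted average is simpler than it might sound - remove it and the remaining weights renormalize automatically. A minimal sketch (the categories and weights here are invented for the example):

```python
def overall_score(scores, weights, exclude=()):
    """Weighted average (as a %) over the categories not excluded.

    Excluding a category (e.g. the movie score for an SLR without video)
    removes its weight from the denominator, so the remaining categories
    renormalize automatically.
    """
    cats = [c for c in scores if c not in exclude]
    weighted_sum = sum(weights[c] * scores[c] for c in cats)
    total_weight = sum(weights[c] for c in cats)
    return 100 * weighted_sum / (10 * total_weight)

scores = {"Image Quality": 8.0, "Features": 6.0, "Movie": 0.0}
weights = {"Image Quality": 2.0, "Features": 1.0, "Movie": 1.0}

print(f"{overall_score(scores, weights):.1f}%")                       # 55.0%
print(f"{overall_score(scores, weights, exclude=('Movie',)):.1f}%")   # 73.3%
```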

How we score - in detail

The basic scoring process is as follows:

  • Review camera
  • Collate all measurements and sit down with other editors to score 'unmeasurable' aspects
  • Combine all these (with weighting where appropriate) into 27 basic rated attributes
  • Apply weightings and combine these into the 11 (or 12) rating categories published
  • Produce the final % score from the weighted average of the category scores
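In code, the last three steps might look something like this - a sketch with invented attribute names, category mappings and weights (the real system uses 27 attributes and 11 or 12 categories):

```python
# Stage 1: rated attributes (out of 10), already combined from raw measurements.
attributes = {"resolution": 7.5, "noise": 8.0, "tone_curve": 7.0, "af_speed": 6.5}

# Stage 2: weighted combination of attributes into published rating categories.
category_map = {
    "Image Quality (JPEG)": {"resolution": 1.5, "noise": 1.5, "tone_curve": 1.0},
    "Performance (speed)": {"af_speed": 1.0},
}
categories = {
    name: sum(w * attributes[a] for a, w in parts.items()) / sum(parts.values())
    for name, parts in category_map.items()
}

# Stage 3: weighted average of the category scores gives the final %.
category_weights = {"Image Quality (JPEG)": 2.0, "Performance (speed)": 1.0}
final = 100 * sum(category_weights[c] * categories[c] for c in categories) \
        / (10 * sum(category_weights.values()))
print(f"{final:.0f}%")  # 72%
```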

In future we intend to allow users to override the last two stages, applying their own weightings (to produce bespoke, personalized rankings/ratings).

All our cameras are put through a series of standardized tests - most (but not all) of which are reported in the reviews themselves. Virtually all these tests produce some kind of measured value, and these measured values are used to establish the average performance across an entire category. A very similar process is used to score attributes that are essentially part of the spec (viewfinder size, feature comparisons etc).

This allows us to establish a benchmark for producing our scores - cameras with measured results at, or near, the average get a certain score.

The exact score (out of 10) an 'average' result in a category for a particular generation of cameras gets depends on several factors, including the following:

  • The spread of results (how much better the best is than the worst in the entire category)
  • How 'good' the average is (based on our own long-developed criteria of acceptability)
  • How rapid the trend toward improvement is (based on our historical test results)
  • How accurately we believe our controlled tests reflect real-world experience

This means that the 'average' score (that given to cameras near the middle of the pack) for SLR auto white balance, for example, will be slightly higher than the average score for SLR movie mode - we see little change or variance in AWB performance (and it's normally pretty good overall), but we do see a lot of difference - and rapid change - in movie mode functionality, so the average performance is expected to rise, and differences to grow between models.

This system allows us to produce a series of scores that reflect not only how well a particular camera performs against its peers, but also how well the average camera in any particular category does. This aspect of the scoring system means that, although there's no absolute relationship between the scores in different categories, there is a general increase in the average score as you move up the range (presuming, of course, that the cameras do, on average, get better as you move up the categories).
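One simple way to implement this kind of benchmark-relative scoring - purely an illustrative sketch of ours, not the actual formula - is to centre each measurement on the category average and scale by the spread, with the baseline ('average') score set per attribute:

```python
def attribute_score(value, category_values, baseline=6.0, scale=2.0):
    """Score a measurement relative to its category, out of 10.

    A camera measuring exactly at the category average gets the baseline
    score; how far above or below it lands depends on the spread of
    results across the category. Attributes where the field is tightly
    bunched (e.g. SLR auto white balance) would get a higher baseline
    than fast-moving ones (e.g. movie mode).
    """
    avg = sum(category_values) / len(category_values)
    spread = max(category_values) - min(category_values)
    if spread == 0:
        return baseline
    score = baseline + scale * (value - avg) / spread
    return min(max(score, 0.0), 10.0)  # clamp to the 0-10 range

# Example: a resolution measurement compared to five cameras in one category.
category = [1850, 1900, 2000, 2100, 2150]
print(round(attribute_score(2100, category), 1))  # 6.7 - above-average result
```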

Future plans

We hope in the future to offer the ability to re-score SLR cameras based on the entire market (i.e. to get an absolute score), but we've taken the decision that it's more useful when purchasing to see how cameras compare to their peers.

The big plan for the future is that now that we are adding the review measurements and scores to a database, we'll be able to use our findings to offer new and exciting ways to explore the relative performance of a wide range of cameras. This is only the first phase - representing the 'dpreview' score (which, yes, has an element of opinion in it - get over it!). In future we'll offer custom ranking and scoring based on preset criteria or your own preferences, priorities and needs. We're a long way off being able to deploy anything yet, but our plans include:

  • Suitability ranking / scoring based on pre-set usage profiles (sports, kids, etc)
  • Custom scores (personal weighting options)
  • Absolute (weighting-free) ratings/rankings
  • New user interfaces for exploring cameras based on review scores

© 2010, www.dpreview.com