DxOMark's Sensor Overall Score

Discussion in 'Canon EOS' started by peter_j|2, Mar 23, 2012.

  1. There is published controversy regarding DxOMark's Sensor Overall Score. I have owned some of the cameras on the list below and would agree with the scores. What is your opinion about the scores?

    Camera Sensor Ratings by DxOMark

    http://www.dxomark.com/index.php/Cameras/Camera-Sensor-Ratings

    SENSOR OVERALL SCORE

    Canon EOS-1Ds Mark III (80%)

    Canon EOS 5D Mark II (79%)

    Canon EOS-1Ds Mark II (74%)

    Canon EOS-1D Mark IV (74%)

    Canon EOS-1D Mark III (71%)

    Canon EOS 5D (71%)

    Canon EOS-1D Mark II N (66%)

    Canon EOS-1D Mark II (66%)

    Canon EOS 60D (66%)

    Canon EOS 7D (66%)

    Canon EOS 1Ds (63%)

    Canon PowerShot G1X (60%)

    Canon PowerShot G12 (47%)

    Canon PowerShot G11 (47%)

    Canon PowerShot G10 (37%)
     
  2. So much for medium format :) Or are the DxO results a bit flawed/biased?
     
  3. Not sure. But, so much for Hasselblad.
     
  4. You scoundrel, Peter. Isn't that a Nikon listed as "best ever?"
     
  5. What? The D800 better than 35mm and medium format? Oh no! I'll reserve one D800 and one D800E for now just in case the Canon EOS 5D Mark III and Canon EOS-1D X fall short of 95% :)
     
  6. The "problem" with the scores is that they apply arbitrary weightings to multiple metrics to end up with one number; if you don't give the metrics the same weight they do, the scores can move substantially.
    Whilst they are very open about how and what they test, the relevance to any one individual's shooting needs is so rounded up, or down, to achieve that one number that it becomes almost meaningless. For instance, how many people downsize to 8 MP for everything? For electronic display that is too high; for many prints, way too small. But it puts cameras with more than 8 MP at a disadvantage, or advantage, depending on your opinion, unless you agree with the specific weighting they apply to normalise everything to that 8 MP figure.
    Another controversial point: they claim to measure RAW files direct off the sensor, but can make no allowance for on-sensor processing. Similarly, if nobody can actually use unprocessed RAW files, scoring them seems pointless. There is some in-camera processing of most RAW files, and if those processes improve actual image quality, not testing them doesn't really work. It is like saying a 600 hp semi will have a higher top speed than a 450 hp Ferrari because it has more horsepower. I'd take them both for a test drive!
    The weighting of their metrics ends up showing serious anomalies. They used to have a page directly addressing the inability to compare different sensor sizes; the scores don't transfer across size differences. That is why medium format cameras end up scoring so badly: too much importance is put on high-ISO performance.
    Personally, I rarely shoot above ISO 200, and my idea of sports shooting is AF, AF and AF. I don't care if a different sensor can perform 5 points better (or 1/3 stop); I care which will keep my subject in focus. Bearing this in mind, for my uses, DxO marks are worse than useless.
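    The 8 MP normalisation argued about above comes down to simple signal-averaging arithmetic: downsampling averages pixels, which leaves the signal unchanged but shrinks uncorrelated noise by the square root of the downsampling factor. The sketch below is illustrative only (it is the standard textbook result, not DxO's actual code), and the example numbers are invented.

```python
import math

def print_snr_db(per_pixel_snr_db, sensor_mp, reference_mp=8.0):
    """Per-pixel SNR (in dB) adjusted for downsampling to a reference size.

    Averaging N pixels into one leaves the signal unchanged and shrinks
    uncorrelated noise by sqrt(N), so SNR (an amplitude ratio) gains
    20*log10(sqrt(N)) = 10*log10(N) dB, where N = sensor_mp / reference_mp.
    The 8 MP reference mirrors the normalisation discussed in this thread;
    this is a sketch of the principle, not DxO's implementation.
    """
    return per_pixel_snr_db + 10.0 * math.log10(sensor_mp / reference_mp)

# Hypothetical example: a 21 MP and a 12 MP sensor with identical 32 dB
# per-pixel SNR. After normalisation the 21 MP sensor comes out ahead,
# which is the pixel-count "advantage" the post above refers to.
high_mp = print_snr_db(32.0, 21.0)
low_mp = print_snr_db(32.0, 12.0)
```

    This is why whether you consider the normalisation fair depends entirely on whether you actually view or print at something like that reference size.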
     
  7. Scott: As an example, concrete compressive test results in the construction industry in my neighbourhood must be certified accurate by a licensed professional engineer as required by law. The same regulation should apply to sensor testing. LOL!
     
  8. For my three current Canon DSLRs (5DII, 1DIIN and 7D) I think the weightings are fair, but I think they must put a heavy weighting on high ISO. My Leica M8 performs badly (like MF) and it really does have great IQ.
     
  9. Why is the 30D 3 points worse than the 20D? Same sensor, better body? Maybe they think cameras are like wine?
     
  10. how many people downsize to 8 MP for everything?
    That's not the point. The DXO sensor metrics attempt to address the question of signal quality from the sensor per area in the final print. The metrics do not attempt to give an indication of overall image quality, but only how good the signal is, in relation to the whole sensor area. This is something that is difficult for the photographer to test themselves, but it gives an indication of how good the tonal & colour information is in a small print (small relative to the resolving capabilities of the camera). Most images are used in small sizes (web, magazine print, small inkjet prints etc.) and this is why the DXO sensor tests are important (not the overall score, but the individual plots of the metrics).
    It isn't just about small prints, either. When we look at a large print on a wall, typically we look at the whole image from a distance. How smooth are the tones, how rich the colours, how muddy the shadows? These are the questions the DXO sensor metrics can shed light on. A lot of the time we are looking at the big print as a whole image, to take in its message and its overall composition, rather than inspecting details from 10 cm away. The latter is also sometimes done, I guess, but the question of detail is already answered by other sites.
    When it comes to testing the whole imaging system, including the lens, then I think it's simply something that we can test ourselves more easily by simply taking pictures with the setup and evaluating them visually, see what we like and don't like (that's the ultimate test really, using it in the real world). There are also plenty of websites which test lens+camera combinations for resolution etc. but all of them have some limitations in that it's difficult to find quantities which thoroughly characterize the quality of the imaging system. Or there would be so many numbers that readers would be lost as to what is important and what is not.
     
  11. What is your opinion about the scores?​
    They're utterly irrelevant to the Real World experience.
     
  12. The scores are interesting, and prone to cause me gear envy. I've used Canon 20D's, 40D's, 5D's, and 5DII's. Their scores seem to range all over the center of the DXO chart: 5DII #17, 20D #86. But in real life looking at large prints (16"x24"+) experienced photographers seem to have a very hard time picking which of my photos were made with which camera.
     
  13. DxO rates the consumer Nikon D5100 better than some medium format backs... I'll leave it up to everyone to decide if they would want the back or the Nikon D5100 for making prints. DxO is a complete joke.
     
  14. I guess my ancient Nikon bodies (D80, D60, D50 and D40) are a bunch of junk; how much can I get for the group?
     
  15. It is easy to demonstrate that DxO's DR estimates have no connection with reality by simply testing a few bodies with a Stouffer transmission step wedge. I haven't performed tests to specifically debunk their other metrics, but I do find it laughable that they rate several small format DSLRs as superior to medium format digital backs.
    Sorry, but I consider DxO to be a joke, and give no consideration to their claims. The two sites I give the most consideration to for technical evaluations are DPReview and Imaging Resource. I can replicate their tests, and the test results are very much in line with real world experience.
     
  16. There is published controversy regarding DxOMark's Sensor Overall Score
    Published where? Forum ramblings or a reasoned analysis?
     
  17. Forum ramblings and reasoned analysis. The above respected commenters now included.
     
  18. The only article I have seen with any kind of authoritative analysis of the work DxO publish is here. It seems well written, and even though I am a qualified engineer it is outside my realm of expertise; having said that, it is written in such a way that I trust the points Mr van den Hamer makes. I don't believe his article is controversial, but it does put the results DxO publish into perspective. Do we really need performance figures down to 1/15 of a stop? And how much should that influence a purchasing decision?
    Considering the sensor performance of practically every camera made in the last few years, the constant improvements, and the comparatively minor increases per generation being achieved, sensor performance is barely in the top ten list of purchasing priorities when I am entertaining a new body purchase. To give any credence, or importance, to such a finely focused analysis of one aspect of a camera's performance seems a strange thing to do, particularly when there are question marks about the testing methodology and the way results are arrived at.
     
  19. To give any credence, or importance, to such a finely focused analysis of one aspect of a camera's performance seems a strange thing to do, particularly when there are question marks about the testing methodology and the way results are arrived at.
    On the contrary, I think DXO gives extremely valuable information that no other test site does. For example, what if I need to know which camera will give the most headroom for doing single-image "HDR" for landscapes where water or trees move between shots preventing multiple images from being used? I look at the base ISO DR at DXO. What if I want to know which gives the most headroom for exposure and white balance corrections in night club lighting conditions? I look at the DR and SNR for the required ISO settings at DXO. This information is very important to me and I need to have it without having to buy each camera to test it myself.
    DxO rates the consumer Nikon D5100 better than some medium format backs... I'll leave it up to everyone to decide if they would want the back or the Nikon D5100 for making prints. DxO is a complete joke.
    Unfortunately there is a problem with doing the DXO raw data analysis for medium format cameras (pattern noise). They admit it but for some reason publish the scores (with a caution message) anyway. :-( To me it's regrettable as the scores they give for these cameras are useless. But this doesn't prevent me from benefiting from their data on small-sensor cameras, which are excellent and I find them to be quite accurate in the case of cameras that I have owned. If I were to buy a MF system I would have to do my own testing.
    The two sites I give the most consideration to for technical evaluations are DPReview and Imaging Resource.
    Alas, both sites focus their studio scene images all over the place. Their quality control regarding focusing is atrocious. Photozone is much, much better in this respect.
     
  20. "For example, what if I need to know which camera will give the most headroom for doing single-image "HDR" for landscapes where water or trees move between shots preventing multiple images from being used? "​
    DxO results won't show you that; they will show you the information that comes directly off the sensor, not the RAW file you can actually download from your camera and manipulate. Real-world image files can show differences that DxO do not test and which are not reflected in their results.
     
  21. Keith Reeder, Mar 24, 2012; 08:55 a.m.
    What is your opinion about the scores?
    They're utterly irrelevant to the Real World experience.​
    I agree. If you need further proof, consider this: the 'overall results' are not in relation to other cameras that are available, nor are they in relation to what is possible at the time of testing.
    For instance, the G11 and G12 both received a 66% score - barely a passing score, to an American high school student. Fair enough - it's the same sensor. But at the G11's release, there was nothing on the market that had that quality sensor, aside from Sigma's DP1, which had a prime lens. The G11 was an A+ camera, with no real competition on release. If the G11 had the best sensor available for a small camera with a zoom - which it did - then why the D grade? The G12 is competing with several other cameras, using now-older technology, that can do the same thing with more zoom or using a slightly smaller form factor. In relation to other cameras, the G12 is much worse than the G11 was, which makes the D grade more appropriate.
    And if we're grading on a curve, and the numbers just keep going up for new models (as they appear to), what do we do when something hits 100%? Do we completely rework the grading system then? Or will we be faced with a list of cameras that are all 95-100%? The latter seems to be where DxO is going, but it gives a very false impression of quality and capability.
     
  22. what do we do when something hits 100%?
    If I understand the rating system correctly, that would be a sensor that records every photon and adds no noise to the signal. No real-world sensor can ever do that (they never record all photons and they always add some noise), so DXO doesn't have to rework anything.
    DxO results won't show you that; they will show you the information that comes directly off the sensor, not the RAW file you can actually download from your camera and manipulate. Real-world image files can show differences that DxO do not test and which are not reflected in their results.
    They do, to the degree that I need to know. Results of subjective analysis of real-world images change from day to day with a human observer. Without actually buying all the cameras you want to compare and shooting them side by side in various conditions, it's impossible to get reliable estimates of comparative performance. I don't care to carry around multiple cameras for testing purposes. When I'm in the field I shoot real images and do not waste time testing. I get the comparative data from competent sites, and this saves me time and money. If you cannot interpret the data given by dxomark and predict real world results based on it then it is your limitation, but please do not tell others what their limitations in ability are.
     
  23. Zack,
    That just shows a complete lack of understanding of what they are doing and the figures they label a camera with. Besides, the G11 and G12 both received a 47 score. However, a score of 66 is not 66%; it is 10 points (or 2/3 stop) better than a sensor that scored 56. There is no pass or fail, and 100 is not a maximum achievable; as sensors get better you should expect all scores to go up.
    Compare them to the same generation Nikon P6000 with a score of 35 and you should expect the Canon G11 RAW files to be around 2/3 stop better.
    Having said all that, unless your usage fits in with their specific formula of scoring the three categories and you can ignore all other camera and system functionality for a possible 1/3 stop difference then the scores might have some merit.
    Ilkka,
    If I understand the rating system correctly, that would be a sensor that records every photon and adds no noise to the signal. No real-world sensor can ever do that (they never record all photons and they always add some noise), so DXO doesn't have to rework anything.​
    Your understanding is not correct. The scale is open ended as the linked quote below says.
    Sensor Overall Score is open and it is not a percentage. This score has been computed so that the current set of cameras, from low-end DSCs up to professional DSLRs and medium-format cameras, show results within a range from 0 to 100. However, new technologies may well lead to higher performance models.​
    As for this comment.
    If you cannot interpret the data given by dxomark and predict real world results based on it then it is your limitation, but please do not tell others what their limitations in ability are.​
    As my understanding of their ratings seems to be more accurate than yours, I would not presume I am too limited. Anybody who doesn't understand the scores can't possibly be looking at anything other than a number: high number = good, lower number = not so good. Well, it doesn't work like that, for a multitude of complicated reasons; I understand some of those reasons, so I find the DxO scores to be of extremely limited value.
     
  24. On the contrary, I think DXO gives extremely valuable information that no other test site does. For example, what if I need to know which camera will give the most headroom for doing single-image "HDR" for landscapes where water or trees move between shots preventing multiple images from being used?
    You're kidding yourself if you think they are showing you this. Their DR results have no relation to real shots and are literally worthless.
     
  25. The funniest thing about IR and DPReview is that they don't even use the same exposure when comparing different cameras. If, for the same t-stop, same shutter speed and same scene lighting, one camera needs ISO 6400 and another needs ISO 12,800, they just change the exposure so that both cameras use the same ISO value. How can that be a fair comparison? At least DxO knows these things and takes care of them.
    But DxO's single-number "rating" is very misleading; you have to look at the graphs to really understand and compare cameras. And you also have to understand that they do not put any weight on resolution (where medium format cameras specifically shine).
     
  26. Scott, you're correct - I misread the posts about the G11/G12. Frankly, it was just too long a list for me :) So the ratings are 47, and not 66. However, that changes nothing.
    DxO says that the rating is not a percentage. That's not true at all. They rate them 1-100, with 100 being a perfect, flawless score. 100 is the maximum allowable score. DxO can say whatever they like about their ratings system, but I think you'll find that that is the very definition of a percentage. Pretending that the definition, or the math, is different than what it truly is shouldn't fool anyone.
    Despite my misreading of the chart, I understand the concept that a higher rating equals better ISO/IQ etc. But let me ask you this: as sensor technology improves, ISO and IQ performance increases. This is a given. Based on DxO's grading system, all ratings will go up over time. Say three years from now we get a pro model that scores a 99%, since a 100% is impossible. What happens then? Does the Rebel with the same sensor and a worse processor get a grade of 99% as well? Does the grading system mean anything by then?
    The problem with limiting the scores to 100 (and yes, this means it is a percentage) is that once scores climb to a certain extent, they become meaningless. Since we don't know the maximum quality this technology will eventually be capable of, any scale we create without that knowledge is arbitrary. For all we know, sensors may one day be capable of 4 million ISO with no visible noise. If a million ISO gets a score of 99%, how do we rate 4 million ISO? We have to give it the same score of 99%, even though it's two stops better. The only way to give them meaning is to rate them based on what is currently possible.
    The fact is that DxO grades cameras based on a system that, because of our lack of knowledge about what the future will hold, is destined to eventually become worthless. It's just a matter of when - not if. Until then, it is dodgy at best.
     
  27. I am surprised how emotional the reactions to DxO are. I find the scores to be mainly useless, as they weight several test results to create the one score, and their tests and weights do not necessarily reflect my own needs. However, that said, they do produce some data in their reports that some may find useful. From my own experience their ISO sensitivity measurement is consistent with my Leica M8's performance, and their signal-to-noise ratio tests do show where the sensor tends to fall off in performance. Obviously these are not reasons for camera purchase decisions by themselves, but they do provide information which we can use or ignore at our discretion. While I have never read their scoring criteria, I have always assumed that MF backs and Leica do poorly due to a strong emphasis on high-ISO performance. Indeed, their rating system seems a little over-simplistic, as it suggests portrait shooters only need colour depth, landscape shooters dynamic range, and sports shooters high ISO. All of these things are rather over-generalized - I shoot B&W portraits in low light, for example, or landscapes at dawn, where colour depth and higher ISO are useful.
     
  28. Zack,
    Sorry you are wrong again.
    DxO says that the rating is not a percentage. That's not true at all. They rate them 1-100, with 100 being a perfect, flawless score. 100 is the maximum allowable score.​
    100 is not the maximum score, the number is open ended, cameras will score over 100. There is nothing to stop a camera getting a 140 score it just has to perform 3 stops better than the D800 with a 95. The scoring is linear not an ever steepening curve, there will be no 99.8-99.9 mess, it will just go over 100. No change of the rules of maths, no recalculations for everything, just a continuation of exactly what they have done up to now.
    Quite how Nikon have managed to make the D800 score better than the D4 just illustrates that the weighting in the final calculation is skewed. Sure, they are both supremely capable cameras, but seriously: two brand-new sensors from the same company, with the same R&D, and the result is that a $3,000 camera beats the off-sensor signal of a $6,000 camera.
     
  29. Scott, your own quote states that cameras can currently score up to 100. This means that, based on the current grading system, 100 is perfect, as it is the maximum allowable grade within the current stated system. Unless you know something personally going on at DxO that runs counter to their stated grading system, there should be no argument there.
    Now, the entire purpose of a grading system is not to make a list - it is to let people know how well something performs in relation to how well it could perform. In this case, DxO has picked an arbitrary number (100), but it could just as well be 200, or 347. The number given to a camera sensor indicates how well it does on certain things vs. the maximum, which is 'captures every photon perfectly.' I realize I am oversimplifying.
    DxO's claimed purpose is to let users know how well a camera performs by grading. By changing the grading scale, they are no longer letting people know the same information. It is as much a grade as a college professor that decides that since he has too many students getting As, all tests will now be out of 140, rather than 100.
    I'm not saying that their system cannot change. I am saying that what a grade is - an indication of score in relation to maximum possible score - is no longer shown if the scale is changed, unless all models are re-evaluated according to the new scale. Which won't happen. As an arbitrary number, all we know is that newer models score higher; which, frankly, should be obvious to anyone that has bought multiple cameras.
    DxO is producing numbers that show light sensitivity. They are not grading cameras, nor are they doing anything else that the words that they use would imply. They have chosen incorrect terms for what they are doing, and it leads many people to believe that these numbers mean something different than what they mean.
     
  30. Zack,
    What is it about this comment I have already posted direct from DxO themselves (with a link) that you don't understand?
    Sensor Overall Score is open and it is not a percentage. This score has been computed so that the current set of cameras, from low-end DSCs up to professional DSLRs and medium-format cameras, show results within a range from 0 to 100. However, new technologies may well lead to higher performance models.​
    You will notice it does not say up to 100 anywhere, just that current technology falls between 0 and 100, but newer tech will take it higher.
    As for this assumption
    Now, the entire purpose of a grading system is not to make a list - it is to let people know how well something performs in relation to how well it could perform.​
    No it isn't. In this instance the DxO score is only meant to illustrate performance against others, there is no theoretical perfect score.
    The number given to a camera sensor indicates how well it does on certain things vs. the maximum, which is 'captures every photon perfectly.' I realize I am oversimplifying.​
    No, again, that is not what the DxO number is, there is no maximum. If you drop your idea of 100 being a perfect score, which is wrong, and understand the scoring is open, which it is, then you will realize nothing changes when sensors start scoring over 100. The system does not need to change and nothing needs to be re-evaluated, all scores stay where they currently are. A score of 100 does not mean it is perfect, it will mean a sensor is 1 stop "better" than a sensor that scored 85, a sensor that scores 115 will be one stop "better" than one that scored 100 and two stops "better" than one that scored 85, it is an open linear scale.
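    Taking this thread's own figures at face value, the open linear scale reduces to one constant: 15 points per stop (inferred from the 85 / 100 / 115 example above and from "10 points = 2/3 stop" earlier; it is a reading of the posts, not an official DxO constant). The conversion is then trivial arithmetic:

```python
def score_delta_to_stops(score_a, score_b, points_per_stop=15.0):
    """Convert a difference in overall scores to approximate stops.

    points_per_stop=15 is inferred from this thread's examples
    (85 -> 100 -> 115 is one stop per step; 10 points = 2/3 stop).
    It is an assumption drawn from those figures, not a published
    DxO constant. The scale is open-ended: nothing special happens
    at 100, scores simply keep climbing linearly.
    """
    return (score_a - score_b) / points_per_stop

# The thread's example: a 66 score is (66 - 56) / 15 = 2/3 stop
# "better" than a 56 score; 115 is one stop "better" than 100.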
     
  31. I find it interesting that some of the most fervent advocates on this thread of the accuracy and relevance of DxOMark data and information flatly refuse to accept the premise they document here.
    Wonder why? Surely if DxOMark is right about the relevance of the data they pull off a sensor to the end result, they're going to be right about the relationship between pixel count and noise at the image level too?
     
  32. Keith,
    Who is a "most fervent advocate" for "the accuracy and relevance of DxOMark data" on this thread other than Ilkka?
     
    Scott Ferris, Mar 25, 2012; 12:26 p.m.
    Zack,
    What is it about this comment I have already posted direct from DxO themselves (with a link) that you don't understand?
    Sensor Overall Score is open and it is not a percentage. This score has been computed so that the current set of cameras, from low-end DSCs up to professional DSLRs and medium-format cameras, show results within a range from 0 to 100. However, new technologies may well lead to higher performance models.
    You will notice it does not say up to 100 anywhere, just that current technology falls between 0 and 100, but newer tech will take it higher.​
    I understand the concept. I don't understand how you seem to have a disconnect between the words that they use and what you think they mean. In the words of Inigo Montoya, "I do not think that word means what you think it means."
    Before you remind me AGAIN that the scores can change, let me point out that, unless I'm reading it wrong, the current top score is 95%. If DxO were attempting to create a 'living' scoreboard, then they would claim instead that 'all scores are 0-95.' Because they are. Since they state that "all cameras are 0-100" even though NO cameras are 100, this means that someone at DxO perceives 100 to be the top of the chart. The top of the chart that theoretically has no top.
    Clearly the number 100 is relevant to DxO, or else they wouldn't say that all models fall between 0 and 100 when they do not. DxO says that the scale is open-ended, but that current models fall between 0 and 100. Their charts max out at 100. This means that *right now* the score is 0 to 100. It cannot be 0 to 100 AND be open-ended. Those maths just plain don't work - I don't care how you try to explain it. Something has a top end, or it doesn't. This isn't a Schrodinger's Cat thought experiment where we can imagine that it exists in two states at once. The scale is NOT open-ended. It is, by the very definition of the words, a closed scale that they are planning on changing.
    What keeps all this from being a rant about semantics is this: regardless of the claims that DxO makes, the words that they choose to use imply so strongly as to practically state with conviction that their results mean something that they do not. Their results, in reality, are describing a variety of ways in which sensitivity to light improves with sensor design. The 'true' results will show each progressive model improving at the exact rate at which the technology improves, meaning that the actual results are really just an engineer's timeline.
    The words that they have chosen to use, and the fact that they continue to use a 0-100 scale despite obvious confusion and years of no models being anywhere near 100%, lead the reader to believe - wrongly so - that they are producing a relevant 'grade.' In order to have a proper grade, you need a top limit. Regardless of what you claim, they currently have one. It is out of a maximum 100, just like a grade. All the indications on their page point to it being a grade, aside from a short little sentence in the 'about' section that says, 'This is not a grade.'
    The fact is that regardless of how you spin it, the language and the system that DxO uses is misleading and self-contradictory to the point of total worthlessness to the reader. I've pointed out what it needs to be a grade and what it needs to be open-ended - the system currently resides in some nebulous area in the middle, where it says one and does the other.
    I hope this post explains why the 'grading system' is extremely flawed. If not ... well if we couldn't see eye-to-eye by now, another post from me isn't going to help.
     
  34. "I understand the concept."​
    Clearly you don't. I am sorry I have failed to explain it in a way you comprehend: DxO scores are of extremely limited value, but not because of your understanding or misunderstanding of them. Clearly you are madder at me than you are interested in learning; so be it.
    We will just have to wait until the first cameras go over 100 and we find the sky doesn't fall on our heads. By DxO figures it will take a camera that performs 1/3 stop "better" than the D800 to achieve 100, anything over 1/3 stop "better" than a D800 and we go into numbers higher than 100. I wonder if the 5D MkIII or the 1Dx will get there?
     
  35. I have used
    Canon ( 450D, 20D, 40D, 5D, 7D )
    Konica Minolta 7D
    Nikon ( D50, D70s )
    Panasonic FX3
    Sony NEX C3
    From my point of view the Sensor Overall Score given by DxO is very convincing.
    One thing which has puzzled me is the very close score for the Canon EF 24-105mm f/4L IS USM and the score for the Canon EF 28-105mm f/3.5-4.5 USM (whose zoom ranges are similar).
    The first costs about $1,250 while the second costs only $250.
    (I have only the EF 28-105mm f/3.5-4.5 USM and it is really a good lens.)
     
  36. Hi all, having seen the DxO scoring, I can see it is marketing rather than real evaluation, or else they have chosen the points that suit them. It's as if I claimed the T2i is better than the 1D Mk IV because it is lighter and cheaper; those points are true, but that doesn't make it the better camera. The photographer is the judge of DxO's marking, not the other way around. As our friends mention, some entry-level cameras are ranked better than MF in terms of sensor performance, which is a totally awful measure.
     
  37. The DxO Sensor Overall Score does not compare prices or weights; it looks at more important things like dynamic range, signal-to-noise ratio and colour depth. We should not be surprised when a modern camera, even a lower-level one, exceeds the performance of an older, higher-level camera. If that were not the case, it would mean there had been no technical progress and, in effect, no digital photography. If I had the money I would buy a Nikon D800 today, not tomorrow.
     
  38. >>> What is your opinion about the scores?

    Eh... I just don't have an opinion on lists like that.

    With respect to my opinion about creating compelling photographs in general, and how "sensor scores" factor in: putting energy into seeing, seeking nice light, imagining the possibilities, and thinking about composition, with the goal of creating an image with emotional pull, hugely trumps differences in sensor scores.

    And it's free - meaning you're not constantly spending $$$ to own the absolute best and arguing about minutiae...
     
  39. Lies, damn lies, and statistics!
    In short, I subscribe to the point-of-view of those who argue that DxO comparative scores between cameras are based upon arbitrary manipulation of data, introducing substantial artifacts into the comparisons, therefore rendering them of little validity.
    Objective, repeatable measurements form the basis of scientific conclusion. Arbitrarily weighting criteria in such a way that may vary from photographer to photographer forms the basis of a subjective opinion blog.
     
  40. I'm sure that the DxO test is regimental, repeatable, and accurate in what it measures. However, the score doesn't predict how well a camera will work for a particular application.
    I spent two years shooting a D700 (currently ranked 11th) when I purchased my 5D Mark II (currently ranked 16th). From that day forward, I rarely used the D700 and I eventually ended up selling it. The 5D Mark II, inferior on the DxO scale, was clearly and demonstrably SUPERIOR for the requirements of my shooting.
    The D700 is a nice camera, and it does some things better than the 5D2. But the 5D2 does some things better than the D700. The dimensions in which the 5D2 is superior might not be weighted heavily on the DxO scale, but they are more important to me than what DxO measures.
    It's important to gather as much information as possible AND consider how that information will be of benefit to YOU. The DxO Mark ranking is only one piece of information. Despite the fact that it's very meticulously calculated, it does not reflect what's of importance to me in my image making projects.
    I keep waiting for them to come up with a DxO Dan score. I don't know who this Mark fellow is, but he has different priorities than I do. ;-)
     
