Digital cameras ain't what they used to be. For the most part they are an order of magnitude better overall than the first generation or two, which were made in the early part of this century. Modern cameras are much faster, cheaper, more sensitive, more responsive and more functional than their predecessors. It's a very different market now, one that has gone way beyond the simple idea of digital taking over from film.

What was necessary in the past, namely RAW files, may not be necessary (or even desirable) for much longer. Embedded microprocessors keep improving in processing power and energy efficiency, which means very high quality debayering can be performed in-camera. And maybe that debayering won't merely be good enough for JPEGs; it could be good enough to make RAW files redundant.

RAW used to be (and arguably still is) necessary because a really high quality debayer takes more than a fraction of a second to run. It wasn't too long ago that a large RAW file needed a minute to render on an average computer, which says a lot about the mathematics behind the process. But if a camera can do a very high quality debayer in a fraction of a second, and it can output to a file format that preserves fine detail and stores 16 bits per channel, then RAW files become completely unnecessary.

Of course I'm thinking of JPEG 2000 here. It's wavelet based, compresses more efficiently than JPEG, supports higher bit depths, and has other nice features such as better error resilience. RAW will remain available as an option for the foreseeable future; after all, it is very easy to implement. But if I could choose JPEG 2000 or a similar format, I would shoot it exclusively.
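To make the debayering step concrete, here is a minimal sketch of the simplest useful approach, bilinear interpolation over an RGGB Bayer mosaic, written with NumPy and SciPy. The function name and the RGGB layout are assumptions for illustration only; real in-camera pipelines use far more sophisticated, edge-aware algorithms, which is exactly why they historically needed the extra time that made RAW attractive.

```python
# Toy bilinear debayer for an RGGB colour filter array (illustration only).
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(mosaic):
    """mosaic: 2-D uint16 array of raw sensor samples laid out as RGGB."""
    h, w = mosaic.shape
    rows, cols = np.mgrid[0:h, 0:w]

    # Which pixels carry which colour in an RGGB pattern.
    r_mask = (rows % 2 == 0) & (cols % 2 == 0)
    b_mask = (rows % 2 == 1) & (cols % 2 == 1)
    g_mask = ~(r_mask | b_mask)

    mosaic = mosaic.astype(np.float64)
    out = np.zeros((h, w, 3))

    # Green: average the 4 nearest green samples where green is missing.
    g_kernel = np.array([[0, 1, 0],
                         [1, 4, 1],
                         [0, 1, 0]]) / 4.0
    # Red/blue: average the 2 or 4 nearest samples of that colour.
    rb_kernel = np.array([[1, 2, 1],
                          [2, 4, 2],
                          [1, 2, 1]]) / 4.0

    for plane, mask, kernel in ((0, r_mask, rb_kernel),
                                (1, g_mask, g_kernel),
                                (2, b_mask, rb_kernel)):
        out[..., plane] = convolve(mosaic * mask, kernel, mode='mirror')

    return np.clip(out, 0, 65535).astype(np.uint16)
```

The 16-bit RGB result could then go straight to a JPEG 2000 encoder rather than being squeezed into an 8-bit JPEG; for instance, OpenCV's cv2.imwrite can write .jp2 files from 16-bit arrays when it is built with JPEG 2000 support, which is roughly the in-camera path I have in mind.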