<blockquote>
<p>I noticed that JPEG version 9.1 was available and wondered if anyone has tried using it?<br /> Are there any benefits to using version 9.1? (I never use JPEG when taking pictures for myself.)<br /> Just curious.</p>
</blockquote>
<p>Hi Gerald<br>
I have not tried JPEG 9.1. I think it will take software and hardware manufacturers incorporating the new standard for it to become more widely used (e.g., cameras upgraded to JPEG 9.1 and Photoshop/Lightroom supporting it). Hard to know when that would happen, as JPEG 2000 has been around for a while and never saw widespread adoption. If JPEG 9.1 does get adopted, it would probably be driven by today's JPEG-only products (smartphones and tablets) and their associated software, which, with improved screens and fancier photo editing apps, are finding their quality limited by the present JPEG standard (bit depth and lossy compression). The incorporation of JPEG 9.1 will be driven by customer need/benefit, not just because a new standard was created. "If" and when JPEG 9.1 becomes widely adopted is when it really becomes a standard.<br>
It certainly could have some advantages: the lossless mode removing the present JPEG artifacts, the 12-bit depth giving more latitude in photo editing, plus not needing an updated raw converter to get every new camera to work.<br>
If JPEG 9.1 gets adopted in such JPEG-only products, it will probably come along for the ride in DSLRs etc. as a substitute for the present JPEG standard, not as a replacement for raw formats. <br>
An in-camera move to JPEG 9.1 loses some information from demosaicing, potentially gives a smaller color gamut (unless the camera allows ProPhoto RGB and not just Adobe RGB and sRGB - JPEG 9.1 does allow for wider gamuts), and risks loss of shadows and highlights from the camera's auto-processing algorithms.<br>
And believe it or not, though I have not personally tried it out, I don't see how a 12-bit lossless-mode RGB JPEG 9.1 file would be smaller than a 12-bit losslessly compressed raw file. The reason is that the raw data from the sensor is 12 bits per pixel, since in the vast majority of sensors each pixel is either R, G, or B - not all three - so there are only 12 bits per pixel to compress. A lossless compression of a 12-bit RGB file has 36 bits per pixel to compress. I have not studied the standard, so I could be way off base on these file size comments, but I am skeptical about any file size advantage until I see more testing done.<br>
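<br />To put rough numbers on that point, here is a back-of-envelope sketch (the 24-megapixel sensor size is just a hypothetical example, and compression gains are ignored entirely):

```python
# Back-of-envelope comparison of uncompressed data sizes,
# assuming a hypothetical 24-megapixel Bayer sensor.
MEGAPIXELS = 24_000_000

# Bayer-sensor raw: one 12-bit value (R, G, or B) per pixel.
raw_bits = MEGAPIXELS * 12

# Demosaiced RGB: three 12-bit values per pixel.
rgb_bits = MEGAPIXELS * 3 * 12

raw_mb = raw_bits / 8 / 1_000_000   # megabytes before compression
rgb_mb = rgb_bits / 8 / 1_000_000

print(f"raw: {raw_mb} MB, RGB: {rgb_mb} MB, ratio: {rgb_mb / raw_mb:.0f}x")
```

So before any compression the RGB representation carries three times the data, and the lossless compressor on the RGB side would have to do dramatically better just to break even.<br>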
<br />And as for the side discussion of RAW vs. raw, I will make sure I use "raw", and that makes sense. What I found interesting is that even though John Nack's 2005 blog post said Adobe would use "raw", I have seen Adobe use RAW multiple times in the last couple of months, including in blog posts announcing new versions of ACR as "Adobe Camera RAW" and similar usage on Adobe TV. So I'm not sure how strict even Adobe is being about this (unless those just count as typos).<br>
<br />Just my perspective of course.</p>