Discussion in 'Digital Darkroom' started by kelly_fajack, Mar 2, 2005.
My website is too slow, but I don't want to lose any quality. Any help? www.kellyfajack.com
I use Photoshop. Go to Image Size, set the resolution to 72 ppi, and then set the pixel dimensions to either 600 pixels tall or 700 pixels wide. Then use Save for Web; I save at 100 quality. That's how I do it! I too am interested in other ways...
Kelly, I'm assuming you are using Photoshop. Don't bother with 72 dpi; when you "Save for Web" it should do that for you. Just resize the image to whatever you want on your site (Image > Image Size). After you resize, don't forget to use the Unsharp Mask to correct the blurring caused by the resize. Then "Save for Web". Select the "2-Up" tab at the top of the screen. This should display the original on the left and the image that is going to be saved on the right. You can see the file size in the bottom left corner of the image. Make sure you are saving a JPEG rather than a GIF. Now adjust the quality until you get an acceptable level. This can be done with the select box (Low, Medium, High, Very High, Maximum) or the Quality slider. The quality you save at totally depends on the image. You should see the file size change as you change the image quality. Personally I don't like my images to be over 70K. Hope this helps, Patrick
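For anyone who wants to script this workflow instead of clicking through Photoshop, here is a rough equivalent using the Python Pillow library. This is a sketch, not Patrick's exact settings: the unsharp radius/percent and the JPEG quality below are illustrative assumptions you would tune per image.

```python
from io import BytesIO
from PIL import Image, ImageFilter

# Illustrative stand-in for a real photo; open any RGB image instead.
img = Image.new("RGB", (3000, 2000), (90, 120, 60))

# Resize to 700 px wide, keeping the aspect ratio (Image > Image Size).
w = 700
h = round(img.height * w / img.width)
small = img.resize((w, h), Image.LANCZOS)

# Counteract the resize blur, roughly like Photoshop's Unsharp Mask.
sharp = small.filter(ImageFilter.UnsharpMask(radius=1, percent=80, threshold=2))

# "Save for Web": JPEG at a chosen quality, 4:4:4 (no chroma subsampling).
buf = BytesIO()
sharp.save(buf, "JPEG", quality=80, subsampling=0)
print(sharp.size, len(buf.getvalue()), "bytes")
```

Lower `quality` until the byte count printed at the end drops under your budget (e.g. Patrick's 70K).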
I use IrfanView. It's easy to use and a free download. Just do an internet search for it and you'll find a download site.
The following ImageMagick command outperforms Photoshop downsampling, as Bart van der Wolf shows on his website. Furthermore, Photoshop creates bloated JPEGs that will fill up your disk and bandwidth more quickly than better encodings. IrfanView is OK, but my version doesn't let you select 1x1 chroma subsampling, which is mandatory for best quality. All one line, scale selectable:
convert -filter Lanczos -resize 25% -unsharp 1x3+1+.1 -quality 95 -sampling-factor 1x1 in.bmp out.jpg
Here's another idea you might want to check out, and it's freeware: http://www.nuetools.co.uk/stripfile.html I've tried the program and it is easy to use. It cuts a few more bits without affecting viewing quality that I can see. Good luck.
Thanks for the feedback, but none of these work on a Mac. Are there any that do?
Cough. ImageMagick runs on Unix, and Mac OS X is Unix.
I copied this from a previous post: Bob Michaels, Jan 13, 2005; 07:27 p.m.

I never downsample by more than 50% per step and re-sharpen at every step. It seems to work very well. Example: I want a 600-pixel-wide JPG and start with a 4388-pixel-wide PS file:

- Sharpen at 500% (always 500%), radius .3, threshold 0 initially.
- Downsize to 2400 pixels wide (an even multiple of the desired final 600).
- Resharpen at around 300%, .3, 0, then fade the Unsharp Mask to 80% (Edit > Fade Unsharp Mask).
- Downsize to 1200 pixels wide (half of the previous 2400).
- Resharpen at 150%, .3, 0, then fade the Unsharp Mask to 80%.
- Downsize by half to the desired 600 pixels wide.
- Resharpen at 75%, .3, 0, then fade the Unsharp Mask to 80%.
- Finally, save as JPG.

It seems that reducing the pixel count by exactly half makes a difference. The sharpening needs to start very high and then taper down. Fading the Unsharp Mask to 80% seems to help. The interim files look oversharpened, but the final file looks better than any other way. Downsampling and then resharpening in one step just doesn't seem to work as well for me. Credit for this technique goes to Wilfred Van Der Vegte of the Netherlands, who passed it along to me.
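The width sequence in that recipe can be planned mechanically. Here is a small Python helper (my own sketch, not Bob's code): the first intermediate width is the smallest multiple of the final width that is still at least half of the starting width, and every later step halves.

```python
import math

def plan_steps(start_w, final_w):
    """Plan intermediate widths for stepwise downsampling:
    first drop to the smallest multiple of final_w that is still at
    least half of start_w, then halve repeatedly down to final_w."""
    first = math.ceil(start_w / 2 / final_w) * final_w
    steps, w = [], first
    while w > final_w:
        steps.append(w)
        w //= 2
    steps.append(final_w)
    return steps

# Bob's example: a 4388 px file down to 600 px.
print(plan_steps(4388, 600))  # [2400, 1200, 600]
```

For Bob's numbers the first step happens to be a power-of-two multiple of the target (2400 = 4 x 600), so the chain halves exactly; for other inputs the last step may be a milder reduction, which still respects the "never more than 50% per step" rule.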
I use Photoshop or the Apple image-processing tool (sips) under Mac OS X. sips is a command-line tool that supports the AppleScript image-processing stuff. Both have the feature that they correctly hold on to color-space information, which ImageMagick did not the last time I tried it.
Pete, what is the point of preserving colorspace information in a JPEG? The intended audience is (mostly unprofiled) RGB monitors and the goal is to make web images as small-for-the-quality as possible. Colorspaces for archived images are important, of course. I followed Bob Michaels' Photoshop formula and don't like the results. It introduces more artifacts than the method I posted and the image is 50% bigger.
And with ImageMagick convert -filter Lanczos.
There is way too much voodoo in image resampling. Someone needs to write a comprehensive paper to settle the matter. My token survey led me to this link: http://audio.rightmark.org/lukin/graphics/resampling.htm A summary of the page, confirmed by my experience, is that the common methods involve a trade-off between aliasing (jaggy lines), blurring, and ringing. Steven A. Ruzinsky is a resampling enthusiast who frequently adds new algorithms to his (cheap) program, SAR: http://general-cathexis.com/ I just noticed that the latest version also added a Gradient Lens Blur filter, for those of you who do not own Photoshop CS.
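You can see the aliasing-versus-blurring trade-off that page describes with a toy experiment in Python/Pillow (my own example, not from the linked survey). A one-pixel checkerboard is a worst case for aliasing: a naive point-sampled resize can collapse it to a solid tone, while a proper filter averages it toward mid-gray (trading the aliasing for blur/ringing).

```python
from PIL import Image

# A 1-px checkerboard: the hardest pattern to downsample without aliasing.
pattern = Image.new("L", (400, 400))
pattern.putdata([255 * ((x + y) % 2) for y in range(400) for x in range(400)])

for name, f in [("NEAREST", Image.NEAREST),
                ("BILINEAR", Image.BILINEAR),
                ("LANCZOS", Image.LANCZOS)]:
    small = pattern.resize((100, 100), f)
    # getextrema() returns (min, max) pixel values after the resize.
    # NEAREST point-samples and aliases badly; the filtered resamplers
    # average the black/white pairs toward ~128.
    print(name, small.getextrema())
```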