JPEG q-tables, 'quality', and some other image formats

Discuss digital image processing techniques and algorithms. We encourage its application to ImageMagick but you can discuss any software solutions here.
xooyoozoo
Posts: 5
Joined: 2013-02-17T17:38:20-07:00
Authentication code: 6789

JPEG q-tables, 'quality', and some other image formats

Post by xooyoozoo » 2013-02-17T18:41:31-07:00

Edit: Redid these tests, except I fixed WebP parameters and used lossless sources. In hindsight, this subject might not pertain to ImageMagick very much, so I posted it on the Doom9 forums.
Edit #2: I posted the gist below.

After observing Photoshop's "Save for Web" JPEG performance and comparing it to the JPEGs produced by ImageMagick, it's obvious (to me at least) that Photoshop has superior visual quality at lower filesizes. I attribute it entirely to Adobe's well-honed q-tables, as I can't imagine any other way one would psychovisually tweak a JPEG encoder. However, the question is how to present q-table performance in a quantifiable form.

To that end, I whipped up a small comparison using a state-of-the-art perceptual metric, IW-SSIM, on IM and Photoshop and threw in WebP (which should be better tuned by now) and Kakadu JP2K (which does visual weighting by default). IW-SSIM should outperform PSNR/SSIM/MS-SSIM in almost all situations.

Note that this is far from exhaustive, and I don't claim that it's even conclusive; it's just something I did for my own ends and might as well share.

Reference pictures: Run, Old, Fish
Various settings:

Code: Select all

convert 6.8.0-10 $f -define jpeg:dct-method=float -quality $i
kdu_compress 7.1 -full -precise -i $f -rate $i
webpe 0.2.1 -preset default -m 6 -segments 4 -af -q $i -pre 1 -pass 5 -jpeg_like
CS6 ExportOptionsSaveForWeb();  [ ... ]  optimized = true;   [ ... ]   quality = i;      <---- q = 0 to 50 here is about equal to 15 to 75 in IM

JPEGs were also run through ImageOptim (JpegOptim + jpegrescan + jpegtran).
(EDIT: IGNORE THE WEBP RESULTS. The encoding parameters don't represent WebP well at all.)
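As an aside on the quality-scale mismatch noted in the settings above: "quality" is not standardized across encoders. In libjpeg (and therefore IM), the quality number just selects a linear scale factor applied to the base quantization tables. A sketch of that mapping, re-implemented in Python from libjpeg's jpeg_quality_scaling() in jcparam.c (with the force_baseline clamp to 255):

```python
def jpeg_quality_scaling(quality):
    # libjpeg's quality -> percentage scale factor for the base q-tables:
    # quality 50 leaves the tables untouched (scale 100), lower quality
    # grows them hyperbolically, higher quality shrinks them linearly.
    quality = max(1, min(quality, 100))
    if quality < 50:
        return 5000 // quality
    return 200 - quality * 2

def scale_qtable(base, quality):
    # Scale each base-table entry and clamp to 1..255 (baseline JPEG).
    scale = jpeg_quality_scaling(quality)
    return [max(1, min((q * scale + 50) // 100, 255)) for q in base]
```

Photoshop's 0-100 slider goes through a different mapping entirely, which is why its 0-50 lines up with roughly 15-75 in IM.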

[Graphs: IW-SSIM results for each encoder on the Run, Old, and Fish reference pictures]

Each of the plots above represents roughly 30 to 50 sample points per encoder.

The metric numbers are meant to be analyzed in relative terms in standardized test conditions rather than absolute terms. However, based on IW-SSIM's performance on various subjective databases, below 0.95 is almost certainly bad and the threshold into probably bad is around 0.97. Anything above 0.99 is not worth analyzing, as it's entirely "good enough".
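The bands above can be codified into a small helper (the thresholds are taken verbatim from the previous paragraph; the labels are my own shorthand):

```python
def iwssim_verdict(score):
    # Interpretation bands for IW-SSIM scores, per the thresholds above.
    if score < 0.95:
        return "almost certainly bad"
    if score < 0.97:
        return "probably bad"
    if score > 0.99:
        return "good enough"
    return "worth inspecting"
```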

About a dozen sanity checks were done in the lower bpp ranges by comparing the various pictures. I entirely agree with IW-SSIM's rankings.

Edit: Well, I guess I missed the obvious fact that Photoshop automatically uses different quality factors for luma and chroma, while structural similarity focuses entirely on luma. It would be conceivable that Photoshop simply spends more of its bits on luma, but that is evidently not the case: 502048 B for Photoshop vs 49808 B for ImageMagick. It just seems strange that q-tables, by themselves, can do so much.

I do already have PS, so I'm just going to ascribe this to Adobe witchcraft and move on. However, it'd be nice if my open-source image encoding tools could go toe to toe with my overpriced proprietary tools.
Last edited by xooyoozoo on 2013-02-27T17:32:24-07:00, edited 4 times in total.

magick
Site Admin
Posts: 11062
Joined: 2003-05-31T11:32:55-07:00

Re: JPEG q-tables, 'quality', and some other image formats

Post by magick » 2013-02-18T06:27:20-07:00

Take a look at http://www.imagemagick.org/source/quant ... -table.xml. You can specify a custom quantization table to achieve results similar to Photoshop's.
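For illustration, an untested sketch of such a table file: the layout below follows ImageMagick's quantization-table.xml sample as best I recall it (attribute names from that sample), filled with the standard Annex K tables from the JPEG spec as placeholders; the point would be to replace these values with tuned ones.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<quantization-tables>
  <table slot="0" alias="luma">
    <description>Luma Quantization Table</description>
    <levels width="8" height="8" divisor="1">
      16,  11,  10,  16,  24,  40,  51,  61,
      12,  12,  14,  19,  26,  58,  60,  55,
      14,  13,  16,  24,  40,  57,  69,  56,
      14,  17,  22,  29,  51,  87,  80,  62,
      18,  22,  37,  56,  68, 109, 103,  77,
      24,  35,  55,  64,  81, 104, 113,  92,
      49,  64,  78,  87, 103, 121, 120, 101,
      72,  92,  95,  98, 112, 100, 103,  99
    </levels>
  </table>
  <table slot="1" alias="chroma">
    <description>Chroma Quantization Table</description>
    <levels width="8" height="8" divisor="1">
      17,  18,  24,  47,  99,  99,  99,  99,
      18,  21,  26,  66,  99,  99,  99,  99,
      24,  26,  56,  99,  99,  99,  99,  99,
      47,  66,  99,  99,  99,  99,  99,  99,
      99,  99,  99,  99,  99,  99,  99,  99,
      99,  99,  99,  99,  99,  99,  99,  99,
      99,  99,  99,  99,  99,  99,  99,  99,
      99,  99,  99,  99,  99,  99,  99,  99
    </levels>
  </table>
</quantization-tables>
```

It is then passed in with the documented define, e.g. `convert input.png -quality 75 -define jpeg:q-table=my-tables.xml output.jpg`.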

NicolasRobidoux
Posts: 1944
Joined: 2010-08-28T11:16:00-07:00
Authentication code: 8675308
Location: Montreal, Canada

Re: JPEG q-tables, 'quality', and some other image formats

Post by NicolasRobidoux » 2013-02-22T08:37:06-07:00

q-tables do a lot.
-----
Thank you very much for this informative post.

Juce
Posts: 12
Joined: 2011-03-22T07:06:04-07:00
Authentication code: 8675308
Location: Finland

Re: JPEG q-tables, 'quality', and some other image formats

Post by Juce » 2013-02-22T14:13:01-07:00

xooyoozoo wrote:To that end, I whipped up a small comparison using a state-of-the-art perceptual metric, IW-SSIM
What program did you use for this comparison? This interests me, because it could help in the development of better quantization tables.

I previously tried to develop an SSIM-optimized quantization table, and here is the result :wink: :

Code: Select all

  50  50  50  50  50  50  50  50
  50  50  50  50  50  50  50  50
  50  50  50  50  50  50  50  50
  50  50  50  50  50  50  50  50
  50  50  50  50  50  50  50  50
  50  50  50  50  50  50  50  50
  50  50  50  50  50  50  50  50
  50  50  50  50  50  50  50  50

NicolasRobidoux
Posts: 1944
Joined: 2010-08-28T11:16:00-07:00
Authentication code: 8675308
Location: Montreal, Canada

Re: JPEG q-tables, 'quality', and some other image formats

Post by NicolasRobidoux » 2013-02-23T07:04:46-07:00

It's messy, and I kept some of my later conclusions to myself (because I have a client for them), but you may want to have a look at http://www.wizards-toolkit.org/discours ... cd341a646a. Mostly wrong, most likely, but maybe interesting nonetheless.
Comment: Be careful not to trust blindly in existing image comparison metrics. TTBOMK, none is really good.
Last edited by NicolasRobidoux on 2013-02-23T07:23:23-07:00, edited 1 time in total.

NicolasRobidoux
Posts: 1944
Joined: 2010-08-28T11:16:00-07:00
Authentication code: 8675308
Location: Montreal, Canada

Re: JPEG q-tables, 'quality', and some other image formats

Post by NicolasRobidoux » 2013-02-23T07:21:46-07:00

I also suggest you have a look at Dark Shikari's work http://x264dev.multimedia.cx/.

xooyoozoo
Posts: 5
Joined: 2013-02-17T17:38:20-07:00
Authentication code: 6789

Re: JPEG q-tables, 'quality', and some other image formats

Post by xooyoozoo » 2013-02-24T16:20:30-07:00

As there seems to be some interest here, I'll copy-paste a post I just made on the Doom9 forums:

----------

The problem with the graphs in the first post is that the data they present is too discrete. Aggregation makes for much better visualization and generalization.

To that end, we're going to take the 24 768x512 images of the Kodak image set and the 8 1680x1200 images of the JPEG-XR test set, and ask: "how much space would one save if each image in a batch of Photoshop JPEGs were encoded by something else?" The effect is that the gains on each image are weighted by how much space they would actually save. Using Photoshop batches as the reference point also assumes that constant-quality encoding matters more here than arbitrary size limits.
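The aggregation itself is simple; a minimal sketch (the function name and bookkeeping are mine, not from the actual scripts used):

```python
def batch_savings(ps_sizes, other_sizes):
    # Fraction of space saved by re-encoding a Photoshop batch with another
    # encoder at matched quality. Summing bytes before dividing weights each
    # image by its share of the batch, so gains on large files count more.
    return 1.0 - sum(other_sizes) / sum(ps_sizes)
```

For example, halving every file gives 0.5, while halving only the big file in a 100 B + 900 B batch still gives 0.45.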

[Graph: aggregate filesize savings relative to the Photoshop JPEG batches]

TL;DR: JPEGs are gonna stick around for a long while.
Last edited by xooyoozoo on 2013-02-24T16:49:59-07:00, edited 2 times in total.

xooyoozoo
Posts: 5
Joined: 2013-02-17T17:38:20-07:00
Authentication code: 6789

Re: JPEG q-tables, 'quality', and some other image formats

Post by xooyoozoo » 2013-02-24T16:40:43-07:00

Juce wrote:
xooyoozoo wrote:To that end, I whipped up a small comparison using a state-of-the-art perceptual metric, IW-SSIM
What program did you use for this comparison? This interests me, because it could help in the development of better quantization tables.
Just plain old Matlab.

There are numerous C/C++ implementations of SSIM and MS-SSIM, but I don't believe anyone's done the same for IW-SSIM. The large majority of visual-quality research and analysis is done in Matlab, though, so I don't necessarily blame anyone.
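For anyone who wants to experiment without Matlab, the core SSIM statistic is small enough to sketch in plain Python. This is the single-window version only; real implementations evaluate it over local (usually Gaussian-weighted) windows and average, and IW-SSIM additionally weights each window by its information content:

```python
def ssim(x, y, data_range=255.0):
    # Single-window SSIM between two equal-length pixel sequences, using the
    # standard stabilizing constants C1 = (0.01*L)^2 and C2 = (0.03*L)^2.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / (n - 1)
    vy = sum((b - my) ** 2 for b in y) / (n - 1)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))
```

Identical inputs score 1, and even a plain +20 brightness shift on a ramp drops the score below 1.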
Comment: Be careful not to trust blindly in existing image comparison metrics. TTBOMK, none is really good.
Oh I completely agree. However, I'm trying to find objective results that match subjective results instead of the other way around. I think this is a fairly healthy approach, all things considered. :D
I also suggest you have a look at Dark Shikari's work http://x264dev.multimedia.cx/.
I'm fairly familiar with what he's done. He also frequents Doom9, which deals with video manipulation/encoding.

x264's psychovisual code works really well at medium-to-high bitrates. However, it ramps itself up at higher quantizers (presumably to counteract the greater loss of detail) and, in my opinion, becomes excessive in stressful situations. Based on my own tests, at sizes below libjpeg q=30-40, x264's results become analogous to looking into a funhouse mirror.

This effect can also be seen in x264's video encodes, especially when comparing against the next-generation video codecs. I assume that x264's overblown psy-rd wasn't really noticed or commented on because no one bothered to encode at those low bit-rates, as x264 couldn't handle it and x264 is/was the best encoder around. However, times are a-changin'...

xooyoozoo
Posts: 5
Joined: 2013-02-17T17:38:20-07:00
Authentication code: 6789

Re: JPEG q-tables, 'quality', and some other image formats

Post by xooyoozoo » 2013-02-25T17:09:18-07:00

Are these simple settings for IM "enough" for optimizing Jasper JP2K image quality?

Code: Select all

convert $f -define jp2:rate=$i $g.jp2
Because I got some terrible results... here's Kodak 24, JPEG vs Jasper at ~35 KB: http://imgur.com/a/3ayPz#0, and Kodak 21 with the 3 JP2K encoders.

And here's some quantitative analysis of the situation:

[Graph: IW-SSIM results for the JPEG and JP2K encoders listed below]

Code: Select all

convert 6.8.0-10 $f -strip -define jpeg:dct-method=float -quality $i $g.jpg  [...] +JpegOptim+jpegrescan+jpegtran
kdu_compress 7.1 -full -precise -i $f -rate $i -o $g.jp2
opj_compress 2.0.0 -i $f -I -r $i $g.jp2
convert 6.8.0-10 $f -define jp2:rate=$i $g.jp2

NicolasRobidoux
Posts: 1944
Joined: 2010-08-28T11:16:00-07:00
Authentication code: 8675308
Location: Montreal, Canada

Re: JPEG q-tables, 'quality', and some other image formats

Post by NicolasRobidoux » 2013-02-25T17:42:26-07:00

Libjpeg is definitely not optimized for very low bit rates. (It's pretty much officially stated.)

It actually is not difficult to do quite a bit better. Apologies for not revealing how.

Juce
Posts: 12
Joined: 2011-03-22T07:06:04-07:00
Authentication code: 8675308
Location: Finland

Re: JPEG q-tables, 'quality', and some other image formats

Post by Juce » 2013-02-27T19:06:00-07:00

xooyoozoo wrote:Edit: Well, I guess I missed the obvious fact that Photoshop automatically uses different quality factors for luma and chroma, and structural similarity completely focuses on luma.
Have you tried to convert images to monochrome before compression?

Code: Select all

convert original.png -fx luminance grayscale.png

xooyoozoo
Posts: 5
Joined: 2013-02-17T17:38:20-07:00
Authentication code: 6789

Re: JPEG q-tables, 'quality', and some other image formats

Post by xooyoozoo » 2013-02-28T23:25:25-07:00

Juce wrote:
xooyoozoo wrote:Edit: Well, I guess I missed the obvious fact that Photoshop automatically uses different quality factors for luma and chroma, and structural similarity completely focuses on luma.
Have you tried to convert images to monochrome before compression?

Code: Select all

convert original.png -fx luminance grayscale.png
I did, but Photoshop's SaveForWeb won't save the JPEG as anything but RGB, even though there's zero color data. Since libjpeg produces smaller files when they're saved specifically as grayscale JPEG, I can only assume PS's datapoints wouldn't be usable unless I can get it to do likewise.

It's interesting that there really isn't a color-based perceptual metric with predictive ability close or equal to the top-end luma ones. At most, I've found luma-focused metrics that throw in color as an afterthought without offering a convincing reason for doing so. The most 'acceptable' solution I've found is to use S-CIELAB to filter the images for color sensitivity, calculate error using a perceptually accurate color-difference formula (CIEDE2000), and use weights to fit the objective values to subjective opinions. Maybe the result could then be combined with a good luma metric at some meaningful ratio (2 luma to 1 chroma?). At the very least, any addition of chroma measurement would (rightfully) lower libjpeg's and WebP's relative results at lower bpps.

I might try the above one day, even though everything seems like complete hackery. :D

NicolasRobidoux
Posts: 1944
Joined: 2010-08-28T11:16:00-07:00
Authentication code: 8675308
Location: Montreal, Canada

Re: JPEG q-tables, 'quality', and some other image formats

Post by NicolasRobidoux » 2013-03-05T11:47:11-07:00

Suggested untested hackery: use a linear combination of your favorite "greyscale" metric with the result of blurring in linear light and applying a decent colour-difference metric (like CMC 2:1) to the result.
Of course, it's the ratio of the two weights that matters, not their absolute sizes (you could set the weight of, say, IW-SSIM, to 1 without losing anything of significance). So, once you've picked the two metrics and the blur, this is a one-parameter family of metrics.
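To make the shape of that concrete, an untested sketch: the blur and the actual CMC 2:1 formula are left out, and the sign convention (similarity minus weighted colour error) is my own assumption.

```python
def srgb_to_linear(c):
    # sRGB decoding (electro-optical transfer function), c in [0, 1].
    # The blur from the recipe above would be applied to these linear values.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def combined_metric(grey_score, colour_diff, w=1.0):
    # One-parameter family: the greyscale metric's weight is fixed at 1
    # (only the ratio of the weights matters) and w scales the colour
    # difference, which is an error term and so enters with a minus sign.
    return grey_score - w * colour_diff
```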
