Better JPEG quantization tables?

Discuss digital image processing techniques and algorithms. We encourage discussion of their application to ImageMagick, but any software solution may be discussed here.
NicolasRobidoux
Posts: 1943
Joined: 2010-08-28T11:16:00-07:00
Authentication code: 8675308
Location: Copenhagen, Denmark

Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-14T12:33:19-07:00

P.S. 2 Actually not: Best answers are here: viewtopic.php?f=22&t=20333&p=98008#p98008
P.S. You should know that the best answers I found so far are in another thread of the Digital Image Processing Forum. The best "all purpose" luma quantization table I have found so far is discussed in http://imagemagick.org/discourse-server ... 22&t=20427
I'm not quite done with chroma, or with exploiting progressive scan to improve the quality of images that are enlarged. However, the "stupid pet tricks" discussed in http://imagemagick.org/discourse-server ... 22&t=20402 are immensely promising (at least for large images), and even though the ones shown there are not quite "finalized versions", there is much to gain in using them, especially if you expect users to zoom into the images you put out.

The current thread documents my early attempts and presents some heuristics that eventually led to progress.

-----

I've scoured the web/literature for better JPEG quantization tables and eventually settled for trying to improve them myself.

In the process, I discovered why there is not a plethora of better tables "out there". (Understatement.)

For one thing, the original quantization tables (determined through experiments involving human test subjects) are amazing.

For another, the Discrete Cosine Transform (DCT) is quite good at ironing out small differences, and consequently things don't jump at you if you make small changes.
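
For concreteness, quantization is where these tables act: each 8x8 block is DCT-transformed, then every coefficient is divided by its table entry and rounded. Here is a minimal pure-Python sketch of that step (my illustration; the flat 16-everywhere table is made up, not one of the tables below):

```python
import math

N = 8

def dct2(block):
    # Orthonormal 2-D DCT-II of an 8x8 block (the transform JPEG uses).
    def c(k):
        return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = c(u) * c(v) * s
    return out

def quantize(coeffs, qtable):
    # Each coefficient is divided by its table entry and rounded;
    # a larger entry means coarser quantization of that mode.
    return [[round(coeffs[u][v] / qtable[u][v]) for v in range(N)]
            for u in range(N)]
```

A perfectly flat block keeps only its DC coefficient; small perturbations mostly round away in the high-frequency positions, which is one reason modest table changes are hard to see.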

With this disclaimer out of the way, here is my current best. It's likely to change, although probably not by a lot.

I'd love to get feedback (or, even better, a superior set of quantization tables for quality settings in the neighbourhood of 75 using the cjpeg convention).

The tables are tuned for 2x2 subsampling, which is more or less conceptually equivalent to box filtering the two chroma (blue-difference and red-difference) channels prior to encoding and to using 16x16 blocks instead of 8x8 for those channels. This has known unfortunate consequences (basically, chroma bleeds from one luma block into another) but gives a considerable file size reduction. I have another set, not quite as mature, for 1x1.
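
The "box filtering" description of 2x2 chroma subsampling can be made concrete in a few lines (a simplified sketch; cjpeg's actual downsampler differs in details such as edge handling):

```python
def box_downsample_2x2(channel):
    """Average each 2x2 pixel block into one sample, halving both
    dimensions -- roughly what 2x2 subsampling does to the chroma
    channels before the DCT. Assumes even dimensions."""
    h, w = len(channel), len(channel[0])
    return [[(channel[y][x] + channel[y][x + 1]
              + channel[y + 1][x] + channel[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]
```

Each output sample summarizes a 2x2 neighbourhood, which is why a chroma feature effectively spans 16x16 luma pixels after subsampled 8x8 blocks are decoded.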

We are certainly not talking about a huge win, but it appears to me that they represent a small bang/buck improvement.

When I started working on this I thought I could get mileage out of the fact that the human psychovisual system is less sensitive to small blue features than to small red ones, and consequently I experimented with a more "compressing" table for the blue channel. So far I have not managed to make this a big win, so I have (temporarily?) abandoned this direction. Hence there are only two tables, as in the ITU standard.

Code: Select all

# Nicolas Robidoux's better (?) JPEG quantization tables v.2012.02.14
# Remix of ISO-IEC 10918-1 : 1993(E) Annex K
# Recommended for use with cjpeg -sample 2x2
# Luma
16 12 13 16 22  33  51  69
12 13 14 19 25  47  63  76
13 14 17 24 39  57  76  83
16 19 24 30 54  78  91  96
22 25 39 54 70  104 123 129
33 47 57 78 104 124 159 154
51 63 76 91 123 159 193 201
69 76 83 96 129 154 201 248
# Chroma
17  18   24   47   99   128  192  256
18  21   26   66   99   192  256  512
24  26   56   99   128  256  512  512 
47  66   99   128  256  512  1024 1024 
99  99   128  256  512  1024 2048 2048
128 192  256  512  1024 2048 4096 4096
192 256  512  1024 2048 4096 8192 8192
256 512  512  1024 2048 4096 8192 8192
With these tables, the highly oscillatory modes are squashed more than is usual.

For the sake of comparison, here are the standard quantization tables, copied from the jcparam.c file of the JPEG club's jpeg-8d (most recent) source:

Code: Select all

static const unsigned int std_luminance_quant_tbl[DCTSIZE2] = {
  16,  11,  10,  16,  24,  40,  51,  61,
  12,  12,  14,  19,  26,  58,  60,  55,
  14,  13,  16,  24,  40,  57,  69,  56,
  14,  17,  22,  29,  51,  87,  80,  62,
  18,  22,  37,  56,  68, 109, 103,  77,
  24,  35,  55,  64,  81, 104, 113,  92,
  49,  64,  78,  87, 103, 121, 120, 101,
  72,  92,  95,  98, 112, 100, 103,  99
};
static const unsigned int std_chrominance_quant_tbl[DCTSIZE2] = {
  17,  18,  24,  47,  99,  99,  99,  99,
  18,  21,  26,  66,  99,  99,  99,  99,
  24,  26,  56,  99,  99,  99,  99,  99,
  47,  66,  99,  99,  99,  99,  99,  99,
  99,  99,  99,  99,  99,  99,  99,  99,
  99,  99,  99,  99,  99,  99,  99,  99,
  99,  99,  99,  99,  99,  99,  99,  99,
  99,  99,  99,  99,  99,  99,  99,  99
};
I usually use the cjpeg command directly (of course I could call it through ImageMagick). If you put quantization tables in a file called, say, robidoux.txt, you can use them with a command line like

Code: Select all

cjpeg -optimize -baseline -quality 75 -qtables robidoux.txt -qslots 0,1,1 -sample 2x2 -dct float -outfile out.jpg in.ppm
to compress a ppm file (the JPEG club's software only handles ppm and tga input files).
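
For reference, the "cjpeg convention" for quality settings maps -quality to a percentage scale factor applied to the base tables (jpeg_quality_scaling in jcparam.c); at quality 75 every entry is roughly halved. A Python transcription of that mapping:

```python
def quality_scaling(quality):
    # libjpeg's mapping from -quality (1..100) to a scale percentage
    # (jpeg_quality_scaling in jcparam.c). 50 maps to 100%, i.e. the
    # tables are used as given.
    quality = max(1, min(100, quality))
    return 5000 // quality if quality < 50 else 200 - quality * 2

def scale_qtable(base, quality, baseline=True):
    # Scale a base table entry by entry, mirroring libjpeg's integer
    # arithmetic; baseline JPEG clamps entries to 1..255.
    scale = quality_scaling(quality)
    hi = 255 if baseline else 32767
    return [[min(hi, max(1, (q * scale + 50) // 100)) for q in row]
            for row in base]
```

So a table designed "for quality 75" is really designed to look right after being halved by this scaling.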
Last edited by NicolasRobidoux on 2013-05-03T19:35:52-07:00, edited 13 times in total.

fmw42
Posts: 22083
Joined: 2007-07-02T17:14:51-07:00
Location: Sunnyvale, California, USA

Re: Better JPEG quantization tables?

Post by fmw42 » 2012-02-14T16:24:37-07:00

Nicolas,

Do you have any file size comparisons between the two sets, with all other factors the same? How much improvement are we talking about, or is it just a visual difference?

Fred

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-14T18:00:22-07:00

@Fred: The amount of compression depends on the content, but on a high-quality image with lots of detail the files come out roughly 4% smaller without (IMHO) a change in perceived image quality, compared with -quality 75. Put another way: the same size gain as going down two quality levels with the usual -sample 2x2 setup (from 75 to 73), hopefully without the corresponding quality loss. (A bit too early to be sure, but I like to be optimistic.)

I've not tested on faces and the like: only indoor and outdoor shots of buildings and scenery, and I certainly am not claiming earth-shattering improvements. But if you have a large internet bill, 3-5% from the quantization tables alone (and I have other tricks) may be worth a little programming.

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-14T18:54:36-07:00

This post is as much about (hopefully) getting a conversation going with people who have ideas RE: improving the quantization tables as providing an answer.

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-14T19:00:41-07:00

You can get a little bit of additional compression by smoothing the image (cjpeg actually has an option, -smooth) prior to compressing. If the smoothing is tight enough and mild enough, this allows you to sneak by with much less storage allocated to the high modes.

I'm reluctant to do this because what I'm doing is recompressing jpegs. If the quality of the incoming jpegs is low enough, smoothing actually produces larger files, because smoothing makes blocks "talk to each other," and this cross talk creates a lot of new nonzero coefficients.

If the incoming jpeg is high quality, this does not matter, because "all" coefficients are nonzero to start with.
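
The cross-talk effect is easy to demonstrate in one dimension. The sketch below (my illustration; the step size and pixel values are made up) counts how many DCT coefficients of a block survive quantization before and after a mild smoothing filter reaches across the block boundary:

```python
import math

def dct1(signal):
    # Orthonormal 1-D DCT-II: one row/column factor of the 2-D
    # transform JPEG applies to each 8x8 block.
    n = len(signal)
    out = []
    for u in range(n):
        c = math.sqrt(1.0 / n) if u == 0 else math.sqrt(2.0 / n)
        out.append(c * sum(signal[x]
                           * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                           for x in range(n)))
    return out

def nonzero_quantized(signal, step=16.0):
    # Count coefficients that survive quantization with a uniform
    # step (16 is an illustrative value, not a real table entry).
    return sum(1 for coeff in dct1(signal) if round(coeff / step) != 0)

# An 8-sample block that is flat inside, sitting next to a much
# brighter neighbouring block.
block = [100.0] * 8
neighbour = 200.0

# Flat block: only the DC coefficient is nonzero.
before = nonzero_quantized(block)

# A [1, 2, 1]/4 smoothing whose outer tap reaches into the next block
# leaks the inter-block edge into this block's last sample...
smoothed = block[:]
smoothed[-1] = (block[-2] + 2 * block[-1] + neighbour) / 4.0
# ...and several AC coefficients now survive quantization.
after = nonzero_quantized(smoothed)
```

With these numbers, the flat block keeps a single nonzero coefficient (the DC term), while the smoothed block needs several nonzero AC coefficients to represent the leaked edge.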

But ideally I want something which works no matter what.
Last edited by NicolasRobidoux on 2012-02-15T08:21:49-07:00, edited 1 time in total.

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-14T19:08:46-07:00

I'll have a new version tomorrow: I see some weaknesses that I think I can fix.

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-15T10:30:28-07:00

Today's candidates:

Code: Select all

# Nicolas Robidoux's better (?) JPEG quantization tables v.2012.02.15
# Remix of ISO-IEC 10918-1 : 1993(E) Annex K
# Recommended for use with cjpeg -sample 2x2
# Luma
17  12  13  16  22  30  40  52
12  12  14  19  26  35  46  74
13  14  17  23  31  41  69 106
16  19  23  28  37  65 102 146
22  26  31  37  62  99 143 194
30  35  41  65  99 141 192 250
40  46  69 102 143 192 249 314
52  74 106 146 194 250 314 386
# Chroma
17  18  24  47   99   128  192  256
18  21  26  66   99   128  192  256
24  26  56  99   128  192  256  512 
47  66  99  128  192  256  512  1024 
99  99  128 192  256  512  1024 2048
128 128 192 256  512  1024 3072 4096
192 192 256 512  1024 3072 6144 7168
256 256 512 1024 2048 4096 7168 8192

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-16T05:56:34-07:00

This morning's. I think I'm finally getting somewhere.

Code: Select all

# Nicolas Robidoux's better (?) JPEG quantization tables v.2012.02.16
# Remix of ISO-IEC 10918-1 : 1993(E) Annex K
# Recommended for use with cjpeg -sample 2x2
# Luma
17 12 13 16 23 34 57 80
12 13 14 19 26 50 71 86
13 14 17 25 42 63 85 89
16 19 25 31 59 85 99 97
23 26 42 59 77 109 125 118
34 50 63 85 109 125 146 126
57 71 85 99 125 146 156 139
80 86 89 97 118 126 139 141
# Chroma
17  18  24  47   99   128  192  256
18  21  26  66   99   128  192  256
24  26  56  99   128  192  256  512 
47  66  99  128  192  256  512  1024 
99  99  128 192  256  512  1024 2048
128 128 192 256  512  1024 3072 4096
192 192 256 512  1024 3072 6144 7168
256 256 512 1024 2048 4096 7168 8192

rnbc
Posts: 109
Joined: 2010-04-11T18:27:46-07:00
Authentication code: 8675308

Re: Better JPEG quantization tables?

Post by rnbc » 2012-02-16T10:08:45-07:00

How are you measuring quality? Perceptual? Numeric Error (standard deviation)? And if "perceptual", on which output device, under which conditions? Just questions...

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-16T10:58:22-07:00

So, maybe I hit on a good recipe: Now, I can control, more or less, the amount of high mode squashing, without hell breaking loose. This would allow me to account for viewing conditions (distance to the screen relative to pixel size, say). A representative member is this set, which squashes high modes fairly mildly:

Code: Select all

# Nicolas Robidoux's better (?) JPEG quantization tables v.2012.02.16.13
# Remix of ISO-IEC 10918-1 : 1993(E) Annex K
# Recommended for use with cjpeg -sample 2x2
# Luma
16 12 12 15 22 33 55 75
12 12 14 18 25 48 68 81
12 14 16 24 40 60 81 83
15 18 24 30 57 81 93 90
22 25 40 57 74 104 117 109
33 48 60 81 104 118 136 115
55 68 81 93 117 136 144 126
75 81 83 90 109 115 126 126
# Chroma
17  18  24  47   99   128  192  256
18  21  26  66   99   128  192  256
24  26  56  99   128  192  256  512 
47  66  99  128  192  256  512  1024 
99  99  128 192  256  512  1024 2048
128 128 192 256  512  1024 3072 4096
192 192 256 512  1024 3072 6144 7168
256 256 512 1024 2048 4096 7168 8192
Last edited by NicolasRobidoux on 2012-02-16T11:24:32-07:00, edited 2 times in total.

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-16T11:06:01-07:00

rnbc wrote:How are you measuring quality? Perceptual? Numeric Error (standard deviation)? And if "perceptual", on which output device, under which conditions? Just questions...
Perceptual, on an uncalibrated Samsung flat screen monitor and a ThinkPad T60p laptop screen, in a room with tame lighting, or in darkness. (Not ideal, I know.) The Samsung is actually a rather "in your face" monitor (like, I hear, the Galaxy tablet).

RMSE is not a very good measure of perceptual error. SSIM is better (although it does not "do" chroma). But I actually trust my eyes more. (Although they probably are not a good metric because I scrutinize the images and I have a "pet" set of artifacts.)
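
As a side note on metrics: RMSE averages error over the whole image, which is one reason it correlates poorly with perception. The sketch below (mine, not from the thread) shows that a single concentrated artifact and widely diffused noise can score identically:

```python
import math

def rmse(a, b):
    # Root-mean-square error between two equally sized images given as
    # flat sequences of pixel values.
    assert len(a) == len(b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# Diffuse error of 2 everywhere, and the same total energy
# concentrated in one pixel, score identically:
diffuse = rmse([0, 0, 0, 0], [2, 2, 2, 2])       # 2.0
concentrated = rmse([0, 0, 0, 0], [4, 0, 0, 0])  # 2.0
```

A visible ringing artifact around one sharp edge and imperceptible noise spread over the frame can thus be indistinguishable to RMSE.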

I am also testing (most of the time) on a very specific set of test images: outdoor shots of scenery and buildings, and indoor shots of public places, which are themselves JPEGs of widely varying quality, taken by photographers from all over the world, and which are recompressed from sRGB to sRGB.

So, my claimed "mild improvements" could be bunk for multiple reasons.

Again: If someone truly has a better set (not an article that says that they have a better set, a set that actually works better) for my target use, I'll drop my own qtables in a heartbeat.

It just looks like I may have succeeded in improving on the default set, at least if the goal is "good looks" as opposed to "faithfulness". That is: I'd rather have fewer artifacts and lose detail, provided it's not obvious that detail was lost. And ideally I want something reasonably robust w.r.t. JPEG recompression: if the incoming quality is not great, I would like things to degrade gracefully.

And I am under time pressure...
Last edited by NicolasRobidoux on 2012-02-16T11:26:30-07:00, edited 1 time in total.

Re: Better JPEG quantization tables?

Post by rnbc » 2012-02-16T11:26:08-07:00

Without a large set of people for testing and a large set of viewing conditions I think trying to improve on the standard is impossible...

Only with statistics and intensive testing could you claim to have arrived at a better set of quantization tables.

Otherwise, as a pet project, I have nothing against it. In fact I love pet projects myself :)

Consider also that many people seem to be half blind and/or unaware of image quality. My experiments on image resizing with LCD sub-pixel awareness (see the other thread...) have shown me that most people don't notice the difference between a badly scaled-down picture and good output, despite obvious differences...

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-16T11:27:44-07:00

rnbc wrote: Only with statistics and intensive testing could you claim to have arrived at a better set of quantization tables.
... hence the question mark in the thread subject line.

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-16T11:39:25-07:00

... the multiple sets of eyes are coming next.

This thread, in part, is an attempt at getting more eyes for a preview. (And hook someone who actually does have a better set.)

But of course, this raises the following question: If I show similar file size images with the classic tables (obtained with changing the quality level) and with the ones I'm playing with, will my clients see a difference?

If not, I clearly wasted my (and their) time. If they find the ones produced by the classic tables more attractive, this is also quite embarrassing.

My "bet", if you will, is that provided the visible artifacts don't get larger, I can get rid of a lot of fine detail without casual viewers noticing that something is missing.

And I hope that the new tables allow me to do this better than the standard ones. (What I'm doing, actually, is building a crude approximation of Gaussian blur right into the quantization tables. I would think someone else would have tried that but I found no explicit trace of it.)
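
One way to read "building a Gaussian blur into the quantization tables" (a sketch of the idea under my own assumptions, not Nicolas's actual recipe; sigma is a made-up tuning knob): a blur roughly multiplies DCT coefficient (u, v) by a Gaussian decay g(u, v), so dividing the quantization step by g squashes the high modes much as the blur would:

```python
import math

def blur_adjusted_qtable(base, sigma=3.0):
    # Sketch of "baking a blur into the table": a Gaussian low-pass in
    # the DCT domain roughly multiplies coefficient (u, v) by
    # g = exp(-(u^2 + v^2) / (2 * sigma^2)); dividing the quantization
    # step by g achieves a similar suppression of the high modes,
    # entirely within each block (no inter-block cross-talk).
    out = []
    for u, row in enumerate(base):
        out.append([])
        for v, q in enumerate(row):
            g = math.exp(-(u * u + v * v) / (2.0 * sigma * sigma))
            # Clamp to the baseline JPEG range 1..255.
            out[-1].append(min(255, max(1, round(q / g))))
    return out
```

The appeal over pre-blurring is exactly the point made above: the suppression acts per block, so it cannot create new nonzero coefficients across block boundaries.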

You may say that I can get the same effect by blurring the image prior to compressing (with cjpeg's -smooth, say; I still have to figure out exactly what it does). But because I'm recompressing jpegs, I want to avoid, like the plague, operations that make blocks talk to each other: when the quality is not high, blurring across block boundaries actually increases file size.

Re: Better JPEG quantization tables?

Post by NicolasRobidoux » 2012-02-16T12:09:34-07:00

... but I'm definitely not sure I managed something which I'm unambiguously happy with.
