Re: Better JPEG quantization tables?
Posted: 2012-02-22T11:58:02-07:00
by NicolasRobidoux
Once you've decided on the baseline colour quality you want for luma or chroma, you can keep it more or less constant across all quality levels above 50 with this formula:
Suppose that you want the equivalent of baseline value B which you set using -quality Q, and you want to have the same effective baseline using -quality q. Then, in the relevant quantization table, set the baseline value b to b = B ( 100 - Q ) / ( 100 - q ).
This assumes that, using the cjpeg convention, both q and Q are 50 or above. If q < 50 and Q >= 50, the formula becomes b = B q ( 100 - Q ) / 2500.
If both q and Q are <= 50, then it's really simple: b = B q / Q.
For example, if you want to use -quality 65 instead of -quality 60, and your baseline value was 6, replace it by 7 (rounding 6.8571).
(This is, from what I recall, the gist of cjpeg's translation of "qtable + quality -> actually used qtable".)
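The rule above mirrors libjpeg's jpeg_quality_scaling (in jcparam.c); a minimal sketch, using the example of moving a baseline value of 6 from -quality 60 to -quality 65:

```python
def scale_factor(quality):
    # libjpeg's percentage scaling factor for an IJG quality setting
    # (jpeg_quality_scaling in jcparam.c).
    if quality < 50:
        return 5000 // quality
    return 200 - 2 * quality

def rescale_entry(B, Q, q):
    # Baseline entry b at -quality q with the same effective value
    # as entry B had at -quality Q.
    return B * scale_factor(Q) / scale_factor(q)

b = rescale_entry(6, 60, 65)
print(b, "->", round(b))  # 6.857... -> 7
```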
Re: Better JPEG quantization tables?
Posted: 2012-02-22T17:29:50-07:00
by NicolasRobidoux
Just discovered that if you use -sample 1x1, you can chop the chroma big time and still win over -sample 2x2.
Re: Better JPEG quantization tables?
Posted: 2012-02-22T18:05:49-07:00
by NicolasRobidoux
What this means is that my latest chroma table is almost certainly overkill.
Re: Better JPEG quantization tables?
Posted: 2012-02-23T06:00:47-07:00
by NicolasRobidoux
Flavor of the day, for -quality 65 (as mentioned earlier, I think that the very first entries of the quantization tables, at the very least, should be changed depending on the quality level)
and -sample 2x2 (2x2 just gives too much compression compared to 1x1, despite the colour loss and artifacts):
Code:
# Nicolas Robidoux's better (?) JPEG quantization tables v2012.02.23
# Remix of ISO-IEC 10918-1 : 1993(E) Annex K
# Chroma table recommended for use with cjpeg -sample 1x1 (values too big for 2x2)
# Luma
11 11 12 15 20 27 36 47
11 12 15 20 27 36 47 93
12 15 20 27 36 47 93 185
15 20 27 36 47 93 185 369
20 27 36 47 93 185 369 737
27 36 47 93 185 369 737 1473
36 47 93 185 369 737 1473 2945
47 93 185 369 737 1473 2945 5889
# Chroma
12 15 18 26 39 69 139 279
15 18 26 39 69 139 279 559
18 26 39 69 139 279 559 1119
26 39 69 139 279 559 1119 2239
39 69 139 279 559 1119 2239 4479
69 139 279 559 1119 2239 4479 8959
139 279 559 1119 2239 4479 8959 17919
279 559 1119 2239 4479 8959 17919 35839
I've not bothered clamping the high values down to 12725 (that was a cosmetic thing anyway, to indicate that anything above ends up being 255 no matter what).
For high, but not extremely high, quality, I always chop off everything past the 10th diagonal with progressive encoding, so I may as well have the luma (and chroma) tables reflect that. (Yes, I apply a crude low pass filter using both the quantization tables and progressive encoding. It's because I value "good looks" over "accuracy".)
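Both tables in the post above are constant along anti-diagonals (the entry at row i, column j depends only on i + j), so each is fully determined by 15 values; a sketch that rebuilds the luma table from its anti-diagonal values:

```python
def table_from_diagonals(diag):
    # Build an 8x8 quantization table that is constant along
    # anti-diagonals: entry (i, j) takes the value for diagonal i + j.
    assert len(diag) == 15  # diagonals 0..14
    return [[diag[i + j] for j in range(8)] for i in range(8)]

# The 15 anti-diagonal values of the v2012.02.23 luma table.
luma = table_from_diagonals(
    [11, 11, 12, 15, 20, 27, 36, 47, 93, 185, 369, 737, 1473, 2945, 5889])
for row in luma:
    print(" ".join(f"{v:4d}" for v in row))
```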
Re: Better JPEG quantization tables?
Posted: 2012-02-23T11:45:36-07:00
by NicolasRobidoux
Well, well, check this out: -resize with -filter Mitchell interacts better with JPEG compression than -distort with -filter Robidoux (or maybe it is that EWA does not mesh as well with JPEG compression as orthogonal resize?). Of course the comparison is unfair, because EWA Robidoux is considerably sharper than orthogonal Mitchell filtering, but still: That little bit of smoothing sure seems to get rid of a lot of JPEG artifacts. P.S. Maybe it's because my tables have built-in low pass filtering. P.S.2 No time to investigate this further, but it looks like it's a sharpness thing: -resize Lanczos2 has more JPEG artifacts than -distort Robidoux.
Re: Better JPEG quantization tables?
Posted: 2012-02-23T11:52:44-07:00
by NicolasRobidoux
Also, at equivalent file size, my latest tables sure seem to keep colours slightly more vivid than the standard tables.
Re: Better JPEG quantization tables?
Posted: 2012-02-25T15:13:55-07:00
by NicolasRobidoux
Here is an observation that will not surprise anybody: If you are not recompressing an existing JPEG, but are actually compressing a non-JPEG image, you are better off lowering the high frequency content of the image by blurring before pushing it through JPEG compression, as opposed to using the quantization tables to do that. The reason is that "direct" filtering is way more effective and accurate than its crude approximation through quantization tables.
Less obvious: If you are going to chop off high coefficients (by abusing progressive encoding), it is preferable, although not completely necessary, to prefilter (provided you are not recompressing) so that not much gets cut off. This matters to me because I am finding that if you appropriately prefilter, you can chop off almost all chroma diagonals when using -sample 1x1 (-sampling-factor 1x1 in ImageMagick), which allows one to make 1x1 files as small as 2x2, and gets rid of the painfully obvious "16x16 chroma blocks in an 8x8 luma world" -sample 2x2 artifacts. This is less important, at mid quality, with luma (which I chop less anyway).
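For what the "chop off high coefficients" trick amounts to: counting anti-diagonals from 1, keeping the first 10 (as mentioned in an earlier post) means keeping the DCT coefficients (i, j) with i + j <= 9; a quick count:

```python
# Keep the first 10 anti-diagonals of an 8x8 DCT block (indices
# i + j from 0 to 9); with progressive encoding one simply never
# sends scans for the remaining coefficients.
kept = [(i, j) for i in range(8) for j in range(8) if i + j <= 9]
chopped = 64 - len(kept)
print(f"{len(kept)} of 64 coefficients kept, {chopped} chopped")  # 49 of 64 coefficients kept, 15 chopped
```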
Re: Better JPEG quantization tables?
Posted: 2012-02-27T07:52:07-07:00
by NicolasRobidoux
A completely different approach to "modifying quantization tables" is described in
http://imagemagick.org/discourse-server ... 22&t=20402
Re: Better JPEG quantization tables?
Posted: 2012-02-27T10:49:28-07:00
by NicolasRobidoux
@rnbc: Although, as usual, these are not finalized, there are definitely settings worth trying in the stupid pet trick thread.
I can't quite tell whether I'm just basically doing the usual with right shifts instead of integer division, but it sure looks like Huffman just eats this up. Pushing the result through jpegrescan strongly suggests that.
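One reading of the right-shift remark: if the quantization values are powers of two, dividing a DCT coefficient by them is a bit shift. A quick check that the shift matches floor division for non-negative coefficients (libjpeg itself rounds rather than floors, so this is only the gist):

```python
# For non-negative coefficients, a right shift by k is exactly
# floor division by 2**k, which is what makes power-of-two
# quantization values cheap (and regular) to apply.
for coeff in (0, 1, 7, 8, 123, 255, 1023):
    for k in (1, 2, 3, 4):
        assert coeff >> k == coeff // (1 << k)
print("shift == floor division for non-negative coefficients")
```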
Re: Better JPEG quantization tables?
Posted: 2012-04-04T10:04:17-07:00
by rnbc
You might get better results, relative to the standard tables, by restricting your search to a certain set of image types. For example, I've heard about a guy who made an optimized set of tables for face image compression, and managed to get less than half the size for the same perceptible image quality. That was quite a few years ago, when I was still at college, digital cameras were a novelty, and getting a large corpus of sample images was not that easy... but I'll try to find his article.
Re: Better JPEG quantization tables?
Posted: 2012-04-04T10:38:57-07:00
by NicolasRobidoux
@rnbc: Cool! And thank you.
(I can't touch this stuff for at least a month (last Masters student needs shoving out the door), but I still am very interested in making progress.)
Re: Better JPEG quantization tables?
Posted: 2012-04-04T21:12:01-07:00
by anthony
If you can find that table, that would make a good example for IM Examples, Common File Formats, JPEG Write.
http://www.imagemagick.org/Usage/formats/#jpg_write
I am just writing up the new option now. (give it a few hours)
Re: Better JPEG quantization tables?
Posted: 2013-02-27T19:16:18-07:00
by Juce
It seems to me that ITU-T.81 Annex K.1 is often misunderstood. Much software (including libjpeg) and hardware uses the tables, despite the fact that the standard says "These tables are provided as examples only and are not necessarily suitable for any particular application."
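For reference, the Annex K.1 luminance table in question, together with the IJG-style quality scaling that libjpeg applies to it (quality 50 leaves it unchanged):

```python
# The "examples only" luminance table of Annex K.1, as hard-coded
# in libjpeg (jcparam.c).
annex_k_luma = [
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
]

def ijg_scaled(table, quality):
    # libjpeg's scaling of a base table by an IJG quality setting;
    # results are clamped to the baseline-legal 1..255 range.
    s = 5000 // quality if quality < 50 else 200 - 2 * quality
    return [[min(255, max(1, (v * s + 50) // 100)) for v in row]
            for row in table]

print(ijg_scaled(annex_k_luma, 50) == annex_k_luma)  # True
```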
Re: Better JPEG quantization tables?
Posted: 2013-03-11T05:55:53-07:00
by NicolasRobidoux
http://citeseerx.ist.psu.edu/viewdoc/su ... 1.113.7853 contains sample tables, and agrees with my hunch that "asymmetrical" tables (that treat vertical and horizontal frequencies differently) make no sense for general purpose use.
Re: Better JPEG quantization tables?
Posted: 2013-04-20T08:15:41-07:00
by NicolasRobidoux
Here is my current favorite for very good quality JPEG compression. Let me know what you think.
Code:
# April Fool's quantization table
# Dr. Nicolas Robidoux, Senior Research Scientist at Phase One
# (www.phaseone.com)
# Named in honour of the April 20th, 2013, snow storm that befell my
# soon to be former home town Sudbury Ontario, as well as the May 1st
# snow storm that welcomed my wife Cheryl more than a few years ago.
# Happy Anniversary Cheryl!
# ("A JPEG quantization table! Just what I wanted.")
######################################################################
# Copyright © 2013 Nicolas Robidoux <nicolas.robidoux@gmail.com>
# This program is free software. It comes without any warranty, to
# the extent permitted by applicable law. You can redistribute it
# and/or modify it under the terms of the Do What The Fuck You Want
# To Public License, Version 2, as published by Sam Hocevar. See
# http://www.wtfpl.net/ for more details.
######################################################################
# RECOMMENDED USE
# Warning: Without Chroma subsampling, this table allocates too many
# bits to colour preservation.
# Tuned for use with Chroma subsampling ("-sample 2x2" with cjpeg) in
# the IJG quality range 60 to 80 in viewing conditions appropriate for
# 20/20 vision when viewed at native size or enlarged 2x. (Pixel
# peepers: Stay away!)
# One single table is used for all three channels. Using the same
# quantization table for both Luma and Chroma allows one to embed only
# one in the image file (using the "-qslots 0" cjpeg option), which
# yields a nontrivial size reduction with very small
# thumbnails.
# This quantization table is based on the one recommended, for 1
# minute per pixel viewing, in "Relevance of human vision to JPEG-DCT
# compression" by Stanley A. Klein, Amnon D. Silverstein and Thom
# Carney, in Human Vision, Visual Processing and Digital Display III,
# 1992.
# I decided to simply add 5 across the board, the numerological rationale being
# that 5 is the single digit number with the highest level of vibrational energy.
######################################################################
18 17 19 24 31 43 62 91
17 23 26 33 40 46 59 81
19 26 30 37 49 68 97 141
24 33 37 46 59 80 112 162
31 40 49 59 75 100 137 195
43 46 68 80 100 130 175 244
62 59 97 112 137 175 232 317
91 81 141 162 195 244 317 424
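Since the table above is tuned for the IJG quality range 60 to 80, it may help to see what cjpeg effectively uses after quality scaling; at -quality 75, for instance, the scaling factor is 50%, so every entry is roughly halved:

```python
# The April Fool's table as listed above.
april_fools = [
    [18, 17, 19, 24, 31, 43, 62, 91],
    [17, 23, 26, 33, 40, 46, 59, 81],
    [19, 26, 30, 37, 49, 68, 97, 141],
    [24, 33, 37, 46, 59, 80, 112, 162],
    [31, 40, 49, 59, 75, 100, 137, 195],
    [43, 46, 68, 80, 100, 130, 175, 244],
    [62, 59, 97, 112, 137, 175, 232, 317],
    [91, 81, 141, 162, 195, 244, 317, 424],
]

def at_quality(table, quality):
    # IJG scaling for quality >= 50: factor (200 - 2*quality) percent,
    # with libjpeg's rounding and clamping to the legal 1..255 range.
    s = 200 - 2 * quality
    return [[min(255, max(1, (v * s + 50) // 100)) for v in row]
            for row in table]

print(at_quality(april_fools, 75)[0])  # [9, 9, 10, 12, 16, 22, 31, 46]
```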