jpeg decoder: quality regression in chroma upsampling
Posted: 2013-04-03T16:10:10-07:00
Recent versions of ImageMagick produce lower-quality decoding of chroma-subsampled JPEGs than older versions did. I noticed the problem in 6.8.4-6 2013-04-03 Q16 and tracked the regression to svn-r7275. That commit's message ("Change IsMagickTrue() to more sensible name IsStringTrue()") implies it was only supposed to be a refactoring, but it in fact changed the default upsampling algorithm. As far as I can tell, the mechanism is that IsStringTrue() returns MagickFalse for a NULL argument, so when no "jpeg:fancy-upsampling" (or "jpeg:block-smoothing") define is given, the unconditional assignment now forces the flag off instead of leaving libjpeg's default (on) in place; with fancy upsampling disabled, the chroma planes are upsampled by pixel replication instead of interpolation, which is the visible quality loss.
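To see what libjpeg itself defaults to, a stand-alone check along these lines should do. This is just a throwaway test program of mine, not ImageMagick code: link it against libjpeg and pass it any JPEG file. On a stock libjpeg, default_decompress_parms() turns both fancy upsampling and block smoothing on by the time jpeg_read_header() returns, which is exactly what the unconditional IsStringTrue() assignment then overwrites when no -define is given:

Code:
/* Throwaway test: print the decompression defaults libjpeg picks
 * after reading a JPEG header (not ImageMagick code). */
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>

int main(int argc, char **argv)
{
  struct jpeg_decompress_struct cinfo;
  struct jpeg_error_mgr jerr;
  FILE *fp;

  if (argc != 2)
    {
      fprintf(stderr, "usage: %s file.jpg\n", argv[0]);
      return 1;
    }
  fp = fopen(argv[1], "rb");
  if (fp == NULL)
    {
      perror(argv[1]);
      return 1;
    }
  cinfo.err = jpeg_std_error(&jerr);
  jpeg_create_decompress(&cinfo);
  jpeg_stdio_src(&cinfo, fp);
  (void) jpeg_read_header(&cinfo, TRUE);

  /* These are the defaults the patch below preserves when no
   * jpeg:fancy-upsampling / jpeg:block-smoothing define is set. */
  printf("do_fancy_upsampling = %d\n", (int) cinfo.do_fancy_upsampling);
  printf("do_block_smoothing  = %d\n", (int) cinfo.do_block_smoothing);

  jpeg_destroy_decompress(&cinfo);
  fclose(fp);
  return 0;
}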
Patch to revert to the previous behavior, i.e. keeping the libjpeg defaults:
Code:
--- a/coders/jpeg.c
+++ b/coders/jpeg.c
@@ -1126,7 +1126,8 @@ static Image *ReadJPEGImage(const ImageInfo *image_info,
jpeg_info.desired_number_of_colors=(int) StringToUnsignedLong(option);
}
option=GetImageOption(image_info,"jpeg:block-smoothing");
- jpeg_info.do_block_smoothing=IsStringTrue(option);
+ if (option != (const char *) NULL)
+ jpeg_info.do_block_smoothing=IsStringTrue(option);
jpeg_info.dct_method=JDCT_FLOAT;
option=GetImageOption(image_info,"jpeg:dct-method");
if (option != (const char *) NULL)
@@ -1159,7 +1160,8 @@ static Image *ReadJPEGImage(const ImageInfo *image_info,
}
}
option=GetImageOption(image_info,"jpeg:fancy-upsampling");
- jpeg_info.do_fancy_upsampling=IsStringTrue(option);
+ if (option != (const char *) NULL)
+ jpeg_info.do_fancy_upsampling=IsStringTrue(option);
(void) jpeg_start_decompress(&jpeg_info);
image->columns=jpeg_info.output_width;
image->rows=jpeg_info.output_height;
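With the patch, the defines still work when given explicitly: for example, convert input.jpg -define jpeg:fancy-upsampling=false output.png should turn fancy upsampling off, and comparing that result against a default decode of a 4:2:0 subsampled JPEG with compare -metric PSNR a.png b.png null: should show the quality difference. Only the behavior when no define is given goes back to the libjpeg defaults.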