
big tmp file or segfault on convert

Posted: 2011-02-21T07:37:28-07:00
by kriks
hi,

we've had a problem on our production environment (IM 6.6.4-2)

1119.png => http://dl.free.fr/dgU1Zu1oH (1x1px PNG file)

this convert command creates a 5+ TB tmp file and never finishes

Code: Select all

convert 1119.png -resize '260x260!'  -format JPEG  -compress JPEG -colorspace RGB -interlace line -quality 75 -type optimize -strip -flatten 1119.jpeg

here is the end of the debug output for this command

Code: Select all

2011-02-21T14:52:42+01:00 0:00.120 0.020u 6.6.4 Cache convert[13718]: cache.c/OpenPixelCache/4233/Cache
  open 1/1/1/1/9/1119.png[0] (/tmp/magick-XXvD8uAw[3], disk, 982540x676000 5.31358TB)
2011-02-21T14:52:42+01:00 0:00.140 0.040u 6.6.4 Cache convert[13718]: cache.c/WritePixelCachePixels/5549/Cache
  1/1/1/1/9/1119.png[0][982540x1+0+0]
on a smaller environment (with a more recent version: 6.6.5-6), it segfaults, most likely because it can't create such a huge tmp file.


is there any workaround?

Re: big tmp file or segfault on convert

Posted: 2011-02-21T19:47:59-07:00
by glennrp
The underlying problem is that the image has negative offsets in the PNG oFFs
chunk, which is allowed, but ImageMagick stores them in a RectangleInfo
structure in which the offsets are of type size_t. The "-1, -1" offsets become
huge positive numbers and that apparently has a bad effect on your process.

Code: Select all

convert 1119.png -resize '260x260!'  -format JPEG  -compress JPEG \
        -colorspace RGB -interlace line -quality 75 -type optimize \
        -strip -page 0x0+0+0 -flatten 1119.jpeg
does not crash. However, if you are expecting to use the page
data, you'll of course lose that.
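
If you want to double-check what page geometry ImageMagick is actually reading from the oFFs chunk, something along these lines should print it (a quick sketch, assuming a local copy of the file; %g is the page-geometry escape):

Code: Select all

identify -format "%wx%h %g\n" 1119.png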

Re: big tmp file or segfault on convert

Posted: 2011-02-21T20:16:12-07:00
by magick
Something else is causing the problem, perhaps promoting a long to an ssize_t. RectangleInfo uses ssize_t (not size_t) as the x and y offset. ssize_t, unlike size_t, can go negative. If you can show that the PNG coder is properly setting a negative value, the problem may lie elsewhere within ImageMagick.

Re: big tmp file or segfault on convert

Posted: 2011-02-21T20:27:08-07:00
by glennrp
The crash seems to be happening in the "flatten" operation. Using -page 0x0+0+0 to get rid of the offsets stops the crash.

Re: big tmp file or segfault on convert

Posted: 2011-02-21T20:29:33-07:00
by magick
That makes sense if it thinks the offset is 4GB. So you're saying the page offset from PNG is -1,-1. If so, the bug would be in ImageMagick rather than the PNG coder. However,
  • -> identify 1119.png
    1119.png PNG 1x1 3779x2600+4294967295+4294967295 16-bit PseudoClass 65536c 2.4KB 0.000u 0:00.000
suggests the non-negative (wrapped) offset is coming from the PNG coder itself. Can you confirm?
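
(For what it's worth, 4294967295 is exactly 2^32 - 1, i.e. what an offset of -1 becomes when it is reinterpreted as an unsigned 32-bit value, which fits the wraparound described above.)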

Re: big tmp file or segfault on convert

Posted: 2011-02-22T11:29:07-07:00
by magick
The 1x1 image has a virtual canvas of 3779x2600:
  • -> identify 1119.png
    1119.png PNG 1x1 3779x2600-1-1 16-bit PseudoClass
When you resize the image to 260x260, it increases the virtual canvas by the same factor to 982540x676000.
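
(The arithmetic: going from 1x1 to 260x260 is a 260x scale in each dimension, and 3779 x 260 = 982540, 2600 x 260 = 676000.)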

The -flatten option respects virtual canvases, so ImageMagick does what it's designed to do: it creates an image of 982540x676000 pixels. In your case, your computer cannot accommodate an image that huge, so it hangs.
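
As a general defensive measure (not a fix for the offsets themselves), you can also cap the pixel cache resources so that convert fails with a resource error instead of grinding away at a multi-terabyte temp file; a rough sketch, with placeholder limits:

Code: Select all

convert -limit memory 256MB -limit map 512MB -limit disk 1GB \
        1119.png -resize '260x260!' -flatten 1119.jpeg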

The fix is simply to remove the virtual canvas, like this:
  • convert 1119.png -resize '260x260!' -format JPEG -compress JPEG -colorspace RGB -interlace line -quality 75 -type optimize +repage -flatten -strip 1119.jpeg

Re: big tmp file or segfault on convert

Posted: 2011-02-23T02:22:59-07:00
by kriks
thanks for the explanation

it works using +repage (instead of -page 0x0+0+0) before -flatten