Hi everyone,
I'm hoping for some collective wisdom here on a problem I'm having with ImageMagick (6.5.7-0, Q16) and the creation of large multi-page TIFF files.
I want to create a large multi-page TIFF from several single-page TIFFs produced by a scanner. I have a large number (~400) of images. For the sake of argument, let's assume these are all 8.5x11 at 2500x3300 pixels (my "worst case", since each image should be no larger than a standard letter-sized page, although some are smaller). That works out to roughly 25 GB of storage for the pixel cache.
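My rough arithmetic, assuming the Q16 build keeps about 8 bytes per pixel in the cache:
2500 x 3300 pixels/page x 400 pages = 3.3 billion pixels
3.3 billion pixels x 8 bytes/pixel = 26.4 GB, call it ~25 GB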
However, when I run the convert command, the job dies with an out-of-disk-space error, and I find ImageMagick has used more than 33 GB of space in /tmp (there's a 40 GB limit on /tmp, and 7 GB were already in use at the time). The convert command I'm using is "convert -quiet -compress group4 x1.tif x2.tif x3.tif ...".
What's going on here? Am I missing something, or is ImageMagick eating up way more space than it should? Should I be trying to use the -stream method for this instead? (Does stream even support doing this, especially with compression?)
Thanks in advance!
Edit: updated the size of the pixel cache calculation and images to match problem scenario.
snibgo
Re: Large Multi-page TIFF struggles
Don't forget you need the same amount again: a load of memory for the input files, and the same again for the output file.
I know nothing about stream.
fmw42
Re: Large Multi-page TIFF struggles
I really know little about large file processing, but see http://www.imagemagick.org/Usage/files/#massive and http://www.imagemagick.org/script/resou ... nvironment. You might change your tmp directory to somewhere else that has more allowed space.
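For example, something like this (the /data/im_tmp path and out.tif output name are just placeholders):
# point ImageMagick's temporary files / disk cache at a filesystem with more room
export MAGICK_TMPDIR=/data/im_tmp
mkdir -p "$MAGICK_TMPDIR"
# then run the same convert as before
convert -quiet -compress group4 x1.tif x2.tif x3.tif ... out.tif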
alesser
Re: Large Multi-page TIFF struggles
snibgo wrote: Don't forget you need the same amount again: a load of memory for the input files, and the same again for the output file.
So, the output file should only be about ~40-50 MB compressed, based on our experience with other slightly smaller batches, so I'd expect that to be a negligible amount of space.
alesser
Re: Large Multi-page TIFF struggles
fmw42 wrote: I really know little about large file processing, but see http://www.imagemagick.org/Usage/files/#massive and http://www.imagemagick.org/script/resou ... nvironment. You might change your tmp directory to somewhere else that has more allowed space.
I've tried the -limit flag already, but all that does is shift the pixel cache around between memory, memory-mapped files, and disk; it doesn't change its size. The environment variables are set so that ImageMagick gets ~4 GB of memory map and then drops to disk... but we simply run out of disk.
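For reference, the limits are set roughly like this (the values here are illustrative, not our exact configuration):
export MAGICK_MEMORY_LIMIT=4GB
export MAGICK_MAP_LIMIT=4GB
export MAGICK_DISK_LIMIT=40GB
# equivalently, per invocation:
# convert -limit memory 4GB -limit map 4GB -limit disk 40GB ...
# if your build supports it, "identify -list resource" shows the effective limits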
snibgo
Re: Large Multi-page TIFF struggles
alesser wrote: So, the output file should only be about ~40-50 MB compressed ...
I thought your problem was running out of memory. Images being processed in memory aren't compressed. You need space for all of the input pixels, and for the output pixels. Compression only happens when they are written to file.
EDIT: That's the general rule. Perhaps IM optimises memory usage more than this.
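If that rule applies to the numbers above (rough arithmetic only):
~25 GB for the input pixels + ~25 GB again for the output pixels = ~50 GB of pixel cache,
which is already more than the ~33 GB that was free in /tmp.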
alesser
Re: Large Multi-page TIFF struggles
snibgo wrote: I thought your problem was running out of memory. Images being processed in memory aren't compressed. You need space for all of the input pixels, and for the output pixels. Compression only happens when they are written to file.
EDIT: That's the general rule. Perhaps IM optimises memory usage more than this.
I assumed that IM read all the image data into the pixel cache as it concatenated each file into the multi-page TIFF, and then wrote it out compressed, so that the data wouldn't need to double. That's according to the math on http://www.imagemagick.org/script/architecture.php, at least (see the section on "Cache Storage and Resource Requirements"). That page seems to imply the memory requirements for such an operation would be calculated the way I've done it above.
Note: Correction to the original post: the images turn out to be 300 DPI, so the cache should be more like 25 GB, but it's still taking in excess of 33 GB... so I'm figuring something is strange.
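For anyone who wants to watch what the cache is actually doing while this runs, something like this works (out.tif is just a placeholder output name):
# same command as before, with cache debug output saved to a log
convert -quiet -debug Cache -compress group4 x1.tif x2.tif x3.tif ... out.tif 2> cache.log
# in another terminal, watch the cache files grow in the temp directory
watch du -sh /tmp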