Another data point: to create a PDF from 24 JPG files, convert eats 3 GiB of RAM plus 4 GiB of swap, thrashing the machine into swap death until the Linux OOM killer finally terminates it.
This makes sense: viewtopic.php?f=1&t=28438&p=126258 says ImageMagick needs 8 bytes per pixel, and these are 35 MPx images, so 35*10**6 * 24 * 8 / 1024**3 ≈ 6 GiB of memory. And I'd need to convert ~300 such images, not just 24!
Changing the resource limits only makes it die faster; it never completes the job, because it simply cannot get this much memory on my 8 GB RAM machine.
Code: Select all
$ identify -version
Version: ImageMagick 6.8.8-10 Q16 x86_64 2015-03-10 http://www.imagemagick.org
Copyright: Copyright (C) 1999-2014 ImageMagick Studio LLC
Features: DPC Modules OpenMP
Delegates: bzlib cairo djvu fftw fontconfig freetype gslib jng jpeg lcms ltdl lzma openexr pangocairo png ps rsvg tiff webp wmf x xml zlib
$ convert -list resource
File Area Memory Map Disk Thread Throttle Time
--------------------------------------------------------------------------------
38400 15.948GB 7.4266GiB 14.853GiB unlimited 2 0 unlimited
$ time convert 000*jpg output.fromjpeg75.pdf
Ucciso    # Italian locale for "Killed" (OOM killer)
real 7m23.713s
user 0m21.734s
sys 0m13.826s
$ time convert -limit memory 7GiB 000*jpg output.fromjpeg75.pdf
Ucciso    # Italian locale for "Killed" (OOM killer)
real 0m36.269s
user 0m22.991s
sys 0m10.424s
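One workaround I have not benchmarked here: instead of raising the limits, clamp them low so the pixel cache spills to a disk-backed cache rather than RAM. It will be slow, but the OOM killer should leave it alone. The 256MiB/512MiB values below are arbitrary guesses, not tuned for this workload:

```shell
# Cap the in-RAM pixel cache; anything beyond the limits goes to disk.
# 256MiB / 512MiB are placeholder values -- adjust to taste.
convert -limit memory 256MiB -limit map 512MiB 000*jpg output.diskcache.pdf
```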
In contrast, mogrify completes the job in a reasonable time, but then I have to merge the individual PDF files. As magick suggested in another thread about large collections of TIFF files, I tried tiffcp + tiff2pdf instead, and it is indeed way faster: 1 minute instead of 4 for the same PDF.
Code: Select all
$ time mogrify -format jpg -quality 75 000*tif
real 1m46.553s
user 1m6.503s
sys 0m17.276s
$ time mogrify -format pdf 000*jpg
real 4m10.036s
user 3m19.764s
sys 0m23.655s
$ time tiffcp -c lzw 000*tif out.lzw.tiff
real 0m57.404s
user 0m35.480s
sys 0m4.750s
$ time tiff2pdf -o output.jpeg75.pdf -j -q 75 -u m -F out.lzw.tiff
real 0m55.016s
user 0m49.864s
sys 0m4.687s
$ time tiff2pdf -o output.lossless.pdf -z -u m -F out.lzw.tiff
real 4m15.793s
user 4m1.839s
sys 0m9.028s
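For completeness, the single-page PDFs that mogrify leaves behind can be stitched together with Ghostscript (pdfunite from poppler-utils would also work); merged.pdf is just an example output name:

```shell
# Concatenate the per-page PDFs produced by mogrify into one document.
gs -dNOPAUSE -dBATCH -q -sDEVICE=pdfwrite -sOutputFile=merged.pdf 000*.pdf
```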