Resizing multiple dimensions, multiple formats
Posted: 2016-09-25T16:35:15-07:00
Hi guys,
I have a bit of a plea for help improving the performance of a command.
What I am trying to do is this:
Given a source image, resize it to various sizes, and for each size save it in a few different formats.
Initially I did this with nested loops in PHP, running a separate convert command for each size/format combination. This was quite slow.
Currently, my command looks like this (re-formatted here to make it more readable):
Code:
convert 'test.jpg' \
\( +clone -resize 3000x \
\( +clone -quality 85 -write 'test 3000.webp' +delete \) \
\( +clone -quality 85 -write 'test 3000.jpeg' +delete \) \
\( +clone -quality 85 -write 'test 3000.png' +delete \) \
+delete \) \
\( +clone -resize 2000x \
\( +clone -quality 85 -write 'test 2000.webp' +delete \) \
\( +clone -quality 85 -write 'test 2000.jpeg' +delete \) \
\( +clone -quality 85 -write 'test 2000.png' +delete \) \
+delete \) \
\( +clone -resize 1000x \
\( +clone -quality 85 -write 'test 1000.webp' +delete \) \
\( +clone -quality 85 -write 'test 1000.jpeg' +delete \) \
\( +clone -quality 85 -write 'test 1000.png' +delete \) \
+delete \) \
\( +clone -resize 500x \
\( +clone -quality 85 -write 'test 500.webp' +delete \) \
\( +clone -quality 85 -write 'test 500.jpeg' +delete \) \
\( +clone -quality 85 -write 'test 500.png' +delete \) \
+delete \) \
\( +clone -resize 200x \
\( +clone -quality 85 -write 'test 200.webp' +delete \) \
\( +clone -quality 85 -write 'test 200.jpeg' +delete \) \
\( +clone -quality 85 -write 'test 200.png' +delete \) \
+delete \) \
\( +clone -resize 100x \
\( +clone -quality 85 -write 'test 100.webp' +delete \) \
\( +clone -quality 85 -write 'test 100.jpeg' +delete \) \
\( +clone -quality 85 -write 'test 100.png' +delete \) \
+delete \) \
null:
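Incidentally, rather than writing that argument list out by hand, it can be generated with a small loop (a bash sketch; the final echo just prints the assembled command so it can be inspected before running):

```shell
#!/usr/bin/env bash
# Build the same clone/resize/write argument list with loops
# instead of spelling out every size/format by hand.
args=('test.jpg')
for size in 3000 2000 1000 500 200 100; do
    args+=( '(' +clone -resize "${size}x" )
    for fmt in webp jpeg png; do
        args+=( '(' +clone -quality 85 -write "test ${size}.${fmt}" +delete ')' )
    done
    args+=( +delete ')' )
done
args+=( null: )
# Print the generated command; drop the 'echo' to actually run it.
echo convert "${args[@]}"
```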
The general idea is to open the image, clone and resize it, then clone the resized version once per desired format, write it out, and repeat for each size.
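One variation I mean to try (sketched here, untested) is to resize the working image in place from largest to smallest, so that each -resize starts from the previous, already-smaller result instead of the full-size original. I realise repeated resizing may cost a little quality. With only the two largest sizes shown:

```shell
# Sketch (untested): chain the resizes so each one starts from the
# previous, smaller result instead of the full 3000x3000 original.
# Note: resizing an already-resized image can lose a little quality.
convert 'test.jpg' -resize 3000x \
    \( +clone -quality 85 -write 'test 3000.webp' +delete \) \
    \( +clone -quality 85 -write 'test 3000.jpeg' +delete \) \
    \( +clone -quality 85 -write 'test 3000.png' +delete \) \
    -resize 2000x \
    \( +clone -quality 85 -write 'test 2000.webp' +delete \) \
    \( +clone -quality 85 -write 'test 2000.jpeg' +delete \) \
    \( +clone -quality 85 -write 'test 2000.png' +delete \) \
    null:
# ...and so on down through 1000x, 500x, 200x and 100x.
```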
I tried the -monitor option; the output looks something like this (I have been fiddling with the command for a while, so this may be output from an earlier attempt, but the general form is the same):
Code:
Load/Image//test.jpg]: 2591 of 2592, 100% complete
Resize/Image//test.jpg]: 5007 of 5008, 100% complete
Load/Image//tmp[magick-17321HoHBm5dIzJAP]: 2007 of 2008, 100% complete
Resize/Image//test.jpg]: 3338 of 3339, 100% complete
Load/Image//tmp[magick-17321nLYjY8YlK6jY]: 1338 of 1339, 100% complete
Resize/Image//test.jpg]: 1668 of 1669, 100% complete
Load/Image//tmp[magick-173218bgbxFLEHEz7]: 668 of 669, 100% complete
Resize/Image//test.jpg]: 834 of 835, 100% complete
Load/Image//tmp[magick-17321I8FOKQF9yESr]: 334 of 335, 100% complete
Resize/Image//test.jpg]: 333 of 334, 100% complete
Load/Image//tmp[magick-17321uvkNSbuf2bk8]: 133 of 134, 100% complete
Resize/Image//test.jpg]: 166 of 167, 100% complete
Load/Image//tmp[magick-17321v6F4VLRZkvsY]: 66 of 67, 100% complete
It does seem to spend a long time on the lines beginning with "Load/Image//tmp", which surprised me: it looks as though the intermediate images are being loaded back from a temp file, where I would expect them to be held in RAM.
I added "-limit memory 4000MiB" to the command, thinking it might be doing the equivalent of paging the data out, but this had no effect, and I struggle to believe that the uncompressed data takes anywhere near 4 GiB of RAM.
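For what it's worth, my back-of-the-envelope estimate of the pixel cache for one full-size image (assuming my Q16 HDRI build stores 4 channels at 4 bytes each as floats; I am not certain of the exact per-channel size):

```shell
# Rough pixel-cache estimate for one uncompressed 3000x3000 image.
# Assumption: a Q16 HDRI build stores 4 channels as 4-byte floats.
width=3000 height=3000 channels=4 bytes_per_channel=4
bytes=$(( width * height * channels * bytes_per_channel ))
echo "$(( bytes / 1024 / 1024 )) MiB"   # prints "137 MiB"
# The resource limits actually in effect can be listed with:
#   identify -list resource
```

So even with every clone held at once, I would not expect to come close to 4 GiB.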
I have tried messing about with mpr:, but this made no difference either.
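For completeness, the sort of thing I tried with mpr: looked roughly like this (reconstructed from memory, only two outputs shown, so treat it as a sketch):

```shell
# Sketch: stash the decoded source in a named in-memory register
# (mpr:) and read each resize from that register.
convert 'test.jpg' -write mpr:src +delete \
    \( mpr:src -resize 3000x -quality 85 -write 'test 3000.webp' +delete \) \
    \( mpr:src -resize 2000x -quality 85 -write 'test 2000.webp' +delete \) \
    null:
```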
At the moment I am rather at a loss as to what to try next, and I would appreciate any pointers that folk might be able to give me.
Here are some details about my setup. My test.jpg is some random JPEG around 3000x3000 pixels, ~3 MiB, and the command above takes just over 30 seconds.
Code:
$ uname -a
Linux foo-VirtualBox 3.19.0-32-generic #37~14.04.1-Ubuntu SMP Thu Oct 22 09:41:40 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
Code:
$ cat /proc/meminfo
MemTotal: 3112044 kB
MemFree: 1257256 kB
MemAvailable: 2416080 kB
Code:
$ convert -version
Version: ImageMagick 6.9.3-3 Q16 x86_64 2016-02-06 http://www.imagemagick.org
Copyright: Copyright (C) 1999-2016 ImageMagick Studio LLC
License: http://www.imagemagick.org/script/license.php
Features: Cipher DPC HDRI OpenMP
Delegates (built-in): freetype jbig jng jpeg lcms lzma png tiff zlib