Massive averaging

Questions and postings pertaining to the usage of ImageMagick regardless of the interface. This includes the command-line utilities, as well as the C and C++ APIs. Usage questions are like "How do I use ImageMagick to create drop shadows?".
magicko

Massive averaging

Post by magicko »

Hi,

I'm trying to average a huge number of images (22K+) in a serial process where each resulting image n is the average of the previous n-1 images, e.g. img3 = average(img1, img2) and img22000 = average(img1, ..., img21999).

Code: Select all

mv ~/srcDir1/img00700.png ~/srcDir2
convert ~/srcDir2/'*.png' -average ~/tgtDir/00700.png
What I do here is move one file at a time from srcDir1 to srcDir2, then average all the images in srcDir2, saving the averaged result into tgtDir.

The problems:
- Is there a way to automate the process? I'm currently doing it one by one.
- So far, each average (I'm now at image 700-something) takes about 10 minutes. Can I improve the code?

The images are 720x544 RGB 8-bit and I'm running OS X 10.5.7

Any help is appreciated. Thanks!
magick
Site Admin
Posts: 11064
Joined: 2003-05-31T11:32:55-07:00

Re: Massive averaging

Post by magick »

Assuming you have plenty of free disk space, this is the fastest method:
  • convert -limit area 1 ~/srcDir2/'*.png' -average ~/tgtDir/00700.png
Where srcDir2 contains all the images you want to average. If you have programming skills, an even faster method entails calling the PNG API and averaging one scanline at a time.
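
If you want to automate the one-by-one process from your post, a rough shell loop along these lines would do it (directory names are taken from your post, the img#####.png naming is assumed, and each pass still re-averages the whole directory, so it only saves the typing, not the time):

Code: Select all

# move each source image into srcDir2, then re-average everything in srcDir2
for f in ~/srcDir1/img*.png; do
  n=`basename "$f" .png`      # e.g. img00700
  mv "$f" ~/srcDir2/
  convert -limit area 1 ~/srcDir2/'*.png' -average ~/tgtDir/"${n#img}".png
done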
anthony
Posts: 8883
Joined: 2004-05-31T19:27:03-07:00
Authentication code: 8675308
Location: Brisbane, Australia

Re: Massive averaging

Post by anthony »

There is a technique for keeping track of a 'progressive average' so that you only ever have two images in memory at any one time.

However I would suggest a different technique.

First, recompile your IM with HDRI enabled. This version of IM stores all values as floating-point numbers and lets those numbers exceed the normal 'black' and 'white' bounds.
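
As a rough guide only (configure options can differ between IM versions and platforms), a source build with HDRI enabled looks something like this:

Code: Select all

# example configure for an HDRI build; adjust flags for your IM version
./configure --enable-hdri --with-quantum-depth=16
make
sudo make install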

Then, using an API (not the command line), you can loop through the images, reading them in one at a time
and adding them together (use 'plus' composition). When finished, divide all the image values by the number of images, which brings the values back into the normal black-to-white range, and save.

As HDRI (floating-point values) is in use, you should NOT have any problems averaging the images and saving the results in this way.
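
Here is a rough command-line sketch of the same idea, assuming an HDRI build; the quantum:format define is there so the over-range sums are not clipped when the intermediate is written, and the 'Divide' evaluate operator may not exist in older releases (-fx "u/N" would do the same job, only slower):

Code: Select all

# sum every image into a floating-point MIFF (values may exceed white), then divide by the count
convert -size 720x544 xc:black sum.miff   # start from a zero (black) image
count=0
for f in ~/srcDir2/img*.png; do
  convert sum.miff "$f" -compose plus -composite \
          -define quantum:format=floating-point -depth 32 sum.miff
  count=`expr $count + 1`
done
convert sum.miff -evaluate divide $count ~/tgtDir/average.png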


For the command line, I would do it in smaller pieces: say, averaging 10 to 20 images at a time (whatever number can be comfortably retained in memory), then averaging 10 to 20 of those averages, and so on. You may have to use a 'blended average' to handle an averaged image that was built from fewer source images than the others, and that can be tricky, but you should be able to get an average of any number of images this way (see the sketch after the warning below).

WARNING: this last method may suffer from quantum rounding effects. Again, I suggest you use the highest quality-level build of IM possible, and save to an intermediate image file format (such as MIFF) that does not introduce quantum rounding errors due to 'depth' restrictions. Again, it may be better to use an HDRI version of IM.
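
As a sketch of that command-line approach (the batch size and file names are illustrative, and it assumes the image count divides evenly into the batch size, so no blended average is needed):

Code: Select all

# average fixed-size batches into 16-bit MIFF intermediates, then average the intermediates
i=0
batch=""
for f in ~/srcDir2/img*.png; do
  batch="$batch $f"
  i=`expr $i + 1`
  if [ `expr $i % 20` -eq 0 ]; then
    convert -depth 16 $batch -average batch_`expr $i / 20`.miff
    batch=""
  fi
done
convert batch_*.miff -average ~/tgtDir/average.png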

See IM Examples, Basics,
Quality, Depth & HDRI
http://www.imagemagick.org/Usage/basics/#depth
Anthony Thyssen -- Webmaster for ImageMagick Example Pages
https://imagemagick.org/Usage/
fmw42
Posts: 25562
Joined: 2007-07-02T17:14:51-07:00
Authentication code: 1152
Location: Sunnyvale, California, USA

Re: Massive averaging

Post by fmw42 »

Say you have 10 images image_1.png ... image_10.png, all the same size:

Code: Select all

j=2
# tmp.png holds the running average; start with the first image
convert image_1.png tmp.png
while [ $j -le 10 ]; do
# the new image gets weight 1/j; the existing average keeps the rest
new=`convert xc: -format "%[fx:100/$j]" info:`
old=`convert xc: -format "%[fx:100-$new]" info:`
composite -blend ${old}%x${new}% tmp.png image_$j.png tmp.png
j=`expr $j + 1`
done
convert tmp.png runningaverage.png
anthony
Posts: 8883
Joined: 2004-05-31T19:27:03-07:00
Authentication code: 8675308
Location: Brisbane, Australia

Re: Massive averaging

Post by anthony »

fmw42 wrote: Say you have 10 images image_1.png ... image_10.png, all the same size:

Code: Select all

j=2
convert image_1.png tmp.png
while [ $j -le 10 ]; do
new=`convert xc: -format "%[fx:100/$j]" info:`
old=`convert xc: -format "%[fx:100-$new]" info:`
composite -blend ${old}%x${new}% tmp.png image_$j.png tmp.png
j=`expr $j + 1`
done
convert tmp.png runningaverage.png
This is the incremental average I talked about, though it does one image at a time. One of my suggestions was to do the same loop but with more than one image at a time.

However, as I mentioned, you may want to use a higher-quality build of IM (Q64, or HDRI), and a higher depth for the intermediate save format, e.g. MIFF with a larger -depth setting, or floating point under HDRI. Note that setting a higher -depth than the IM compile-time quality is useless.
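
For instance, the loop above could keep its running average in MIFF at a higher depth instead of PNG (a sketch only; it assumes a Q16 or better build):

Code: Select all

# same incremental average, but the intermediate stays in MIFF at 16-bit depth
j=2
convert image_1.png -depth 16 tmp.miff
while [ $j -le 10 ]; do
  new=`convert xc: -format "%[fx:100/$j]" info:`
  old=`convert xc: -format "%[fx:100-$new]" info:`
  composite -blend ${old}%x${new}% -depth 16 tmp.miff image_$j.png tmp.miff
  j=`expr $j + 1`
done
convert tmp.miff runningaverage.png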
Anthony Thyssen -- Webmaster for ImageMagick Example Pages
https://imagemagick.org/Usage/
magicko

Re: Massive averaging

Post by magicko »

Thank you people for the replies!

Well, my programming skills are below zero; the last time I used a command line was to move LOGO's turtle, back in the '80s :D

Anyway, I will try the suggested methods ASAP and come back with (hopefully) solutions or more questions.