
32bit mean pixel calculation

Posted: 2016-04-12T05:21:52-07:00
by mikjsmith
Hi

I used the following command to calculate the mean pixel value from a series of 16-bit input images (ImageMagick 6.9.1-2 Q16):

convert t00.tif t01.tif t02.tif -evaluate-sequence mean t_mean.tif

For the first pixel (for example) this gives values in the original images of 2312, 1276 and 4305, with a computed mean of 2631 ((2312 + 1276 + 4305) / 3 = 2631), which makes sense.

These images were converted to 32-bit, and the pixel values show as 151521544, 83625212 and 282136785. I then re-computed the mean (ImageMagick 6.9.1-2 Q32, HDRI enabled) and got 000.040.
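(Checking the numbers: each 32-bit value is exactly 65537 times its 16-bit counterpart, e.g. 2312 x 65537 = 151521544, and 65537 = (2^32 - 1) / (2^16 - 1), which is the expected quantum scale factor, so the conversion itself looks lossless.)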

The histograms all look the same, so the image values are being scaled. However, all precision is lost in the final mean calculation. Two queries:

1. Why is the 32-bit mean calculation not an integer?
2. Can I increase the number of significant figures for it?

thanks

mike

Re: 32bit mean pixel calculation

Posted: 2016-04-12T05:26:31-07:00
by snibgo
Use "-precision" to control the number of output significant digits. See http://www.imagemagick.org/script/comma ... #precision
mikjsmith wrote: 1. Why is the 32-bit mean calculation not an integer?
Please show us the command(s) you are using. Is your IM Q32? HDRI?

Re: 32bit mean pixel calculation

Posted: 2016-04-12T14:27:52-07:00
by mikjsmith
ImageMagick 6.9.3-7 Q32 x86_64 2016-04-04

Apologies, my fault. I did include this in the message, but it was somewhat buried.

I used

..\convert t00.tif t01.tif t02.tif -evaluate-sequence mean t_mean.tif

and was expecting an integer result, given that the pixel values were integers.

Re: 32bit mean pixel calculation

Posted: 2016-04-12T14:33:59-07:00
by snibgo
No, that command didn't do this:
mikjsmith wrote: These images were converted to 32-bit, and the pixel values show as 151521544, 83625212 and 282136785. I then re-computed the mean (ImageMagick 6.9.1-2 Q32, HDRI enabled) and got 000.040.
What commands showed those results?
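For instance, a command along these lines would print the first pixel of one input (a sketch, not necessarily what you ran; the [1x1+0+0] read geometry crops to the top-left pixel):

convert t00.tif[1x1+0+0] txt:-

"-precision" should apply to txt: output as well when the values come out as floats.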