potential bugs in stats for image in IM 6.4.4-1
Posted: 2008-10-06T00:47:03-07:00
IM 6.4.4-1 Q16 Mac OSX Tiger
I have been trying to understand the differences among several ways to get the min, max, mean, and std stats for an image. I believe that in some cases there are errors, and in other cases I may simply not understand what is being computed.
(I am working on an autogamma and autolevel script and need some reliable stats!)
Here is my test image:
file="zelda2.jpg"
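For context, here is roughly how I intend to use these numbers once I can trust them. This is only a sketch, assuming the per-channel min/max string formats return correct raw quantum values (the output filename is just a placeholder):
rmin=`convert "$file" -depth 16 -channel r -separate -format "%[min]" info:`
rmax=`convert "$file" -depth 16 -channel r -separate -format "%[max]" info:`
# stretch just the red channel between its measured extremes
convert "$file" -channel r -level "${rmin},${rmax}" zelda2_autolevel.png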
I have tried two sets of tests --- one with color r,g,b stats and one with grayscale stats.
Below each test I have noted where I think the results differ from the stats reported by -verbose info:.
Please clarify where there are indeed bugs and where I am simply misunderstanding which channel or colorspace the stats are generated from.
COLOR:
Baseline:
convert "$file" -depth 16 -verbose info:
red:
min: 0 (0)
max: 65535 (1)
mean: 19601.7 (0.299102)
standard deviation: 10829.2 (0.165243)
green:
min: 0 (0)
max: 56797 (0.866667)
mean: 13639.7 (0.208128)
standard deviation: 10666.9 (0.162766)
blue:
min: 0 (0)
max: 54484 (0.831373)
mean: 12035.8 (0.183655)
standard deviation: 9734.71 (0.148542)
FX method 1 (channel qualifier):
stats="RED: min=%[fx:quantumrange*minima.r] max=%[fx:quantumrange*maxima.r] mean=%[fx:quantumrange*mean.r] std=%[fx:quantumrange*standard_deviation.r]"
convert "$file" -depth 16 -format "$stats" info:
stats="GREEN: min=%[fx:quantumrange*minima.g] max=%[fx:quantumrange*maxima.g] mean=%[fx:quantumrange*mean.g] std=%[fx:quantumrange*standard_deviation.g]"
convert "$file" -depth 16 -format "$stats" info:
stats="BLUE: min=%[fx:quantumrange*minima.b] max=%[fx:quantumrange*maxima.b] mean=%[fx:quantumrange*mean.b] std=%[fx:quantumrange*standard_deviation.b]"
convert "$file" -depth 16 -format "$stats" info:
returns:
RED: min=0 max=65535 mean=19601.7 std=19601.7
GREEN: min=0 max=56797 mean=13639.7 std=13639.7
BLUE: min=0 max=54484 mean=12035.8 std=12035.8
std is always the same as the mean and does not match verbose info: --- bug?
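As an independent cross-check (a sketch only; it assumes bash and bc are available, and that %[fx:mean] itself is trustworthy, since the means above do match verbose info:), the std can be recomputed from first principles as sqrt(E[v^2] - E[v]^2) by taking the mean of a squared copy of the channel:
m=`convert "$file" -depth 16 -channel r -separate -format "%[fx:mean]" info:`
m2=`convert "$file" -depth 16 -channel r -separate -fx "u*u" -format "%[fx:mean]" info:`
# normalized std; for the red channel this should print about 0.165243 (i.e. 10829.2/65535)
echo "scale=6; sqrt($m2 - $m*$m)" | bc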
FX method 2 (channel separation):
stats="RED: min=%[fx:quantumrange*minima] max=%[fx:quantumrange*maxima] mean=%[fx:quantumrange*mean] std=%[fx:quantumrange*standard_deviation]"
convert "$file" -depth 16 -channel r -separate -format "$stats" info:
stats="GREEN: min=%[fx:quantumrange*minima] max=%[fx:quantumrange*maxima] mean=%[fx:quantumrange*mean] std=%[fx:quantumrange*standard_deviation]"
convert "$file" -depth 16 -channel g -separate -format "$stats" info:
stats="BLUE: min=%[fx:quantumrange*minima] max=%[fx:quantumrange*maxima] mean=%[fx:quantumrange*mean] std=%[fx:quantumrange*standard_deviation]"
convert "$file" -depth 16 -channel b -separate -format "$stats" info:
returns:
RED: min=0 max=65535 mean=19601.7 std=19601.7
GREEN: min=0 max=56797 mean=13639.7 std=13639.7
BLUE: min=0 max=54484 mean=12035.8 std=12035.8
same as in FX method 1 --- std is always the same as the mean and does not match verbose info: --- bug?
Method 3 (channel separation string formats):
stats="RED: min=%[min] max=%[max] mean=%[mean] std=%[standard_deviation]"
convert "$file" -depth 16 -channel r -separate -format "$stats" info:
stats="GREEN: min=%[min] max=%[max] mean=%[mean] std=%[standard_deviation]"
convert "$file" -depth 16 -channel g -separate -format "$stats" info:
stats="BLUE: min=%[min] max=%[max] mean=%[mean] std=%[standard_deviation]"
convert "$file" -depth 16 -channel b -separate -format "$stats" info:
returns:
RED: min=0 max=65535 mean=19601.7 std=10829.2
GREEN: min=0 max=56797 mean=13639.7 std=10666.9
BLUE: min=0 max=54484 mean=12035.8 std=9734.71
Seems to match verbose info fine
GRAYSCALE:
Baseline:
convert "$file" -depth 16 -colorspace gray -verbose info:
min: 0 (0)
max: 56729 (0.865629)
mean: 15239.5 (0.23254)
standard deviation: 10063.1 (0.153553)
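As a sanity check on this baseline (my own arithmetic, assuming the usual Rec.601 luma weights 0.299, 0.587, 0.114 are used for the gray conversion), the gray mean is just the weighted sum of the channel means from the color baseline above:
# prints approximately 15239.5, agreeing with the gray mean
echo "scale=3; 0.299*19601.7 + 0.587*13639.7 + 0.114*12035.8" | bc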
Method 1 (colorspace gray):
stats="min=%[fx:quantumrange*minima] max=%[fx:quantumrange*maxima] mean=%[fx:quantumrange*mean] std=%[fx:quantumrange*standard_deviation]"
convert "$file" -depth 16 -colorspace gray -format "$stats" info:
min=0 max=56729 mean=15239.5 std=15239.5
std is the same as the mean and does not match verbose info: --- bug?
Method 2 (fx intensity):
stats="min=%[fx:quantumrange*minima] max=%[fx:quantumrange*maxima] mean=%[fx:quantumrange*mean] std=%[fx:quantumrange*standard_deviation]"
convert "$file" -depth 16 -fx "intensity" -format "$stats" info:
min=0 max=56729 mean=15239.5 std=15239.5
std is the same as the mean and does not match verbose info: --- bug?
Method 3 (string formats -- no colorspace gray):
stats="min=%[min] max=%[max] mean=%[mean] std=%[standard_deviation]"
convert "$file" -depth 16 -format "$stats" info:
min=0 max=65535 mean=15092.4 std=10917.9
max, mean, and std do not match verbose info: --- bug? Or, if not a bug, why so different?
Is this using some kind of global stats, and if so, how are they defined?
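Partially answering my own question (my arithmetic only, not anything from the docs): the mean here is exactly the average of the three channel means, and the std is consistent with pooling all R, G, and B samples into one population, i.e. variance = average of the per-channel (std^2 + mean^2) minus the square of the overall mean:
# prints 15092.4 --- the global mean above
echo "scale=1; (19601.7 + 13639.7 + 12035.8)/3" | bc
# prints about 10917.9 --- the global std above
echo "scale=1; sqrt((10829.2^2 + 19601.7^2 + 10666.9^2 + 13639.7^2 + 9734.71^2 + 12035.8^2)/3 - 15092.4^2)" | bc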
Method 4 (type grayscale):
stats="min=%[fx:quantumrange*minima] max=%[fx:quantumrange*maxima] mean=%[fx:quantumrange*mean] std=%[fx:quantumrange*standard_deviation]"
convert "$file" -depth 16 -type grayscale -format "$stats" info:
min=0 max=65535 mean=19601.7 std=19601.7
max, mean, and std match neither verbose info: nor the -colorspace gray results, and std is again the same as the mean --- bug? (It looks more like it is reporting the red channel stats only.)
Method 5 (colorspace gray string format):
file="zelda2.jpg"
stats="min=%[min] max=%[max] mean=%[mean] std=%[standard_deviation]"
convert "$file" -depth 16 -colorspace gray -format "$stats" info:
min=0 max=56729 mean=15239.5 std=10063.1
Seems to match verbose info correctly
Method 6 (type grayscale string format):
file="zelda2.jpg"
stats="min=%[min] max=%[max] mean=%[mean] std=%[standard_deviation]"
convert "$file" -depth 16 -type grayscale -format "$stats" info:
min=0 max=65535 mean=15092.4 std=10917.9
Matches neither the verbose info: for -colorspace gray nor the red channel --- bug? (These are the same numbers as in grayscale Method 3, as if -type grayscale had no effect on the string-format stats.)
Please clarify any misconceptions I may have about how the string formats behave with or without -colorspace gray or -type grayscale. How are the string formats computed --- from which channel or combination of channels?
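In the meantime, the workaround I am leaning toward (again just a sketch, relying on the per-channel string formats after -separate being correct, as in Method 3 above) is to loop over the separated channels:
for ch in r g b; do
    vals=`convert "$file" -depth 16 -channel $ch -separate -format "%[min] %[max] %[mean] %[standard_deviation]" info:`
    echo "$ch: $vals"
done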