I don't think that is the way -colors works. It maps the colors onto a tree and travers…
Well, I don't really care how it works -- what matters most is that it should be intuitive.
For instance, if the image already meets the colour-count constraint, then the command should not change anything. I suspect that this difference has to do with some quantization/rounding error.
BTW, talking about intuition, what do you think will happen if I run:
Code:
convert rose: -type GrayScale rose.tiff
convert rose.tiff -colors 256 rose2.tiff
convert rose2.tiff -colors 256 rose3.tiff
compare -metric mse rose2.tiff rose3.tiff null:
What would be your guess?
Well, they differ.
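(For reference, here is a minimal sketch of the per-pixel mean squared error that `compare -metric mse` reports, up to ImageMagick's normalization; the pixel lists below are made-up toy data, not taken from the rose images:)

```python
# Toy MSE between two equal-length pixel sequences; ImageMagick additionally
# normalizes by the quantum range, which this sketch skips.
def mse(a, b):
    assert len(a) == len(b)
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

print(mse([10, 20, 30], [10, 20, 30]))  # identical images -> 0.0
print(mse([10, 20, 30], [12, 20, 27]))  # nonzero as soon as any pixel moves
```

A nonzero value from `compare -metric mse` therefore means at least one pixel changed between rose2.tiff and rose3.tiff.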
I think this is odd.
If the "complex" algorithm reduced (and, I suppose, optimized) the number of colours,
why does a second application of the same transform change the image?
Isn't the result supposed to be optimized already?
I wonder what the image would look like after the 10,000th iteration.
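To sketch what I would have expected (this is a toy uniform quantizer, not ImageMagick's actual tree-based algorithm): a straightforward quantizer is idempotent, so a second pass over already-quantized values is a no-op.

```python
# Toy uniform quantizer over 16-bit samples: snap each value down to a
# grid of `levels` steps. Once a value sits on the grid, requantizing
# it changes nothing (idempotence).
def quantize(value, levels=256, depth=65536):
    step = depth // levels
    return (value // step) * step

samples = list(range(0, 65536, 997))
once = [quantize(v) for v in samples]
twice = [quantize(v) for v in once]
print(once == twice)  # True: the second application is a no-op
```

The forum results above suggest ImageMagick's -colors pass does not have this property, which is exactly what surprises me.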
The reason I need -colors 256 in the first place is that I need to convert a TrueColor grayscale BMP image to a palette-based BMP. The tutorial suggests using "-colors 256", but the discussion above shows that "-colors 256" also destroys the grayscale values of the original image.
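To make the expectation concrete: a truecolor image whose pixels are all gray (R == G == B) has at most 256 distinct colors, so a 256-entry palette can represent it exactly and no pixel value should have to change. A toy sketch in pure Python (the pixel data is hypothetical; this is not ImageMagick code):

```python
# Build a palette-based representation of a truecolor grayscale image
# and check that the round trip is lossless.
truecolor = [(g, g, g) for g in [0, 17, 128, 255, 255, 42]]  # made-up pixels

palette = sorted(set(truecolor))                 # at most 256 gray entries
index = {color: i for i, color in enumerate(palette)}
indexed = [index[p] for p in truecolor]          # palette-based pixels
restored = [palette[i] for i in indexed]

print(restored == truecolor)  # True: no grayscale value was destroyed
```

So, at least in principle, the palette conversion itself does not force any loss; the loss I see must come from how -colors picks its colors.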
Still, I think this is a bug in the implementation of the algorithm, some rounding error or the like.