Different behaviour of fx '&' operator in Windows and Linux
Posted: 2006-07-13T16:54:04-07:00
Hi,
When trying out the ImageMagick toolkit for some basic image processing I ran into some unexpected behaviour. The exact same command performed in Windows and Linux had different effects on the same image. I wanted to apply a bitwise AND mask to every pixel on every channel.
In Windows I used the syntax:
convert <original> -fx "u&<RGBA mask in decimal>" <destination>
which worked perfectly.
In Gentoo Linux, the exact same command gave an entirely different result. Experimenting in Linux with different masks, I found that a mask of 0 returned the image totally black, as expected. But with a mask of 1, the image came out exactly the same as the original, which is unexpected, since a mask of 1 should filter out every bit except the least significant one, and it does exactly that in Windows.
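To make the behaviour concrete, here is roughly what I ran (the file names are just examples; the mask values are the decimal ones described above):

convert original.png -fx "u&0" result.png    (output entirely black, as expected, on both systems)
convert original.png -fx "u&1" result.png    (output identical to the original on Linux, but masked to the least significant bit on Windows)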
Thanks in advance for any help,
Pedro Martins