I'm trying to automatically detect hot pixels in a sensor. To do so I defined a criterion for an "isolated pixel": a pixel is more isolated the more it differs from all of its neighbors, and if it has even one close neighbor, then it is not isolated.
I'm measuring the difference in terms of luminance, where luminance = the sum of all channels (r + g + b). That's debatable, but anyway... my problem is implementing this.
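Written out, what I want per pixel is roughly:

isolation(x,y) = min over the 8 neighbors (dx,dy) of | L(x,y) - L(x+dx,y+dy) |,  with L = r + g + b

so a large value means the pixel differs strongly from every neighbor, and a small value means at least one neighbor is close.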
In terms of -fx, the definition would be something like:
Code:
convert.exe \
  Lenna.tif \
  -fx 'min(
         min(
           min(abs(r+g+b-p[-1,0].r-p[-1,0].g-p[-1,0].b),
               abs(r+g+b-p[1,0].r-p[1,0].g-p[1,0].b)),
           min(abs(r+g+b-p[0,-1].r-p[0,-1].g-p[0,-1].b),
               abs(r+g+b-p[0,1].r-p[0,1].g-p[0,1].b))),
         min(
           min(abs(r+g+b-p[-1,1].r-p[-1,1].g-p[-1,1].b),
               abs(r+g+b-p[1,1].r-p[1,1].g-p[1,1].b)),
           min(abs(r+g+b-p[1,-1].r-p[1,-1].g-p[1,-1].b),
               abs(r+g+b-p[-1,-1].r-p[-1,-1].g-p[-1,-1].b))))' \
  -evaluate Multiply 0.1 \
  -depth 8 \
  -interlace line \
  -quality 100 \
  -sampling-factor 1x1 \
  Lenna-isolated-pixels.tif
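If -fx turns out to be too slow on full sensor frames, I could probably build the same minimum-of-neighbor-differences image from eight shifted copies instead. This is only a rough sketch, assuming an ImageMagick build that has -evaluate-sequence; note it uses -colorspace Gray (weighted luminance) rather than my plain r+g+b sum, and -roll wraps around at the image borders rather than replicating edge pixels the way -fx does:

Code:
convert Lenna.tif -colorspace Gray -write mpr:lum +delete \
  \( mpr:lum \( mpr:lum -roll +1+0 \) -compose Difference -composite \) \
  \( mpr:lum \( mpr:lum -roll -1+0 \) -compose Difference -composite \) \
  \( mpr:lum \( mpr:lum -roll +0+1 \) -compose Difference -composite \) \
  \( mpr:lum \( mpr:lum -roll +0-1 \) -compose Difference -composite \) \
  \( mpr:lum \( mpr:lum -roll +1+1 \) -compose Difference -composite \) \
  \( mpr:lum \( mpr:lum -roll +1-1 \) -compose Difference -composite \) \
  \( mpr:lum \( mpr:lum -roll -1+1 \) -compose Difference -composite \) \
  \( mpr:lum \( mpr:lum -roll -1-1 \) -compose Difference -composite \) \
  -evaluate-sequence Min \
  Lenna-isolated-pixels.tif

Each parenthesized group computes |lum - shifted lum| for one of the eight neighbor offsets via -compose Difference, and -evaluate-sequence Min then keeps the per-pixel minimum of those eight difference images.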
Any ideas?