Search found 4 matches
- 2013-06-27T09:18:41-07:00
- Forum: Users
- Topic: Grayscale TIFF depth change from 32-bit to 16-bit
- Replies: 6
- Views: 10027
Re: Grayscale TIFF depth change from 32-bit to 16-bit
Awesome, thank you for the clarification!
Thanks,
Ryan
- 2013-06-26T15:03:28-07:00
- Forum: Users
- Topic: Grayscale TIFF depth change from 32-bit to 16-bit
- Replies: 6
- Views: 10027
Re: Grayscale TIFF depth change from 32-bit to 16-bit
Ah, that makes sense. In my investigation, I wrote some quick libtiff code that would map the actual range of the floats contained in the image to the range 0...1 and then multiply that value by USHRT_MAX. I believe the differences I was seeing in the output of my application and ImageMagick are ...
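Roughly, the mapping I described looked like this (just a sketch; the libtiff scanline I/O and tag handling are omitted, and the names here are only illustrative):

    /* Rough sketch of the float -> 16-bit mapping described above:
       find the actual min/max of the float samples, normalize to
       0...1, then scale by USHRT_MAX. The libtiff read/write
       plumbing is omitted. */
    #include <limits.h>
    #include <stddef.h>
    #include <stdint.h>

    void map_floats_to_u16(const float *src, uint16_t *dst, size_t n)
    {
        float lo = src[0], hi = src[0];
        for (size_t i = 1; i < n; i++) {            /* actual range of the data */
            if (src[i] < lo) lo = src[i];
            if (src[i] > hi) hi = src[i];
        }
        float range = (hi > lo) ? (hi - lo) : 1.0f; /* avoid divide-by-zero */
        for (size_t i = 0; i < n; i++) {
            float norm = (src[i] - lo) / range;     /* map to 0...1 */
            dst[i] = (uint16_t)(norm * USHRT_MAX + 0.5f);
        }
    }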
- 2013-06-24T11:25:40-07:00
- Forum: Users
- Topic: Grayscale TIFF depth change from 32-bit to 16-bit
- Replies: 6
- Views: 10027
Re: Grayscale TIFF depth change from 32-bit to 16-bit
Oops, I should've been more clear. Both images are grayscale, i.e., the conversion is from a 32-bit grayscale TIFF to a 16-bit grayscale TIFF.
I also discovered the sample format flag, which answers the first question: http://www.awaresystems.be/imaging/tiff ... ormat.html
Thanks,
Ryan
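For anyone else looking, the tag can be checked with libtiff along these lines (a small sketch, error handling omitted):

    /* Sketch: read the SampleFormat and BitsPerSample tags with libtiff
       to see whether the grayscale samples are float or integer. */
    #include <stdint.h>
    #include <stdio.h>
    #include <tiffio.h>

    int main(void)
    {
        TIFF *tif = TIFFOpen("32bitGrayscaleTiff.tif", "r");
        uint16_t fmt = SAMPLEFORMAT_UINT, bits = 0;
        TIFFGetFieldDefaulted(tif, TIFFTAG_SAMPLEFORMAT, &fmt);
        TIFFGetField(tif, TIFFTAG_BITSPERSAMPLE, &bits);
        printf("%u bits per sample, %s samples\n", (unsigned)bits,
               fmt == SAMPLEFORMAT_IEEEFP ? "floating point" : "integer");
        TIFFClose(tif);
        return 0;
    }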
- 2013-06-24T10:57:18-07:00
- Forum: Users
- Topic: Grayscale TIFF depth change from 32-bit to 16-bit
- Replies: 6
- Views: 10027
Grayscale TIFF depth change from 32-bit to 16-bit
Hello, I had a question about how exactly ImageMagick performs a depth conversion from a grayscale 32-bit floating point TIFF to a 16-bit grayscale TIFF:
convert 32bitGrayscaleTiff.tif -depth 16 16bitGrayscaleTiff.tif
The command works fine, and convert appears to produce the desired output, but I'm ...
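For context, the naive per-sample mapping I would expect (assuming the float data is already normalized to 0...1; whether ImageMagick actually does something like this is essentially what I'm asking) would be:

    /* Hypothetical per-sample conversion, assuming the 32-bit float data
       is already normalized to 0...1: clamp to that range, then scale to
       the full 16-bit range. */
    #include <stdint.h>

    uint16_t float_sample_to_u16(float v)
    {
        if (v < 0.0f) v = 0.0f;                 /* clip below range */
        if (v > 1.0f) v = 1.0f;                 /* clip above range */
        return (uint16_t)(v * 65535.0f + 0.5f); /* scale to 0...65535 */
    }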