
FIFO thumbnail

Posted: 2011-05-06T11:25:53-07:00
by skeltoac
I'm trying to produce thumbnails in real time. The process must read input as it becomes available and flush output as frequently as possible. For example, if I'm reducing by 50%, I want one line of output every time two lines of input are read, with no waiting for the entire image to load. Reduced sampling quality is acceptable.

Is there any way to do this in IM?

Ideally, convert would have an option (+thumbnail geometry filename) so that one command could produce the progress image as well as a final thumbnail:


convert - +thumbnail "200x200>" - -thumbnail "200x200>" $THUMBNAIL
That would read stdin, write an instant thumbnail on stdout while still reading stdin, and finally save a proper thumbnail.
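To make the desired behaviour concrete, here is a minimal, hypothetical sketch (not ImageMagick code) of the 50% case: one averaged output row is emitted as soon as each pair of input rows has arrived, with rows modelled as lists of pixel values.

```python
def stream_halve_rows(rows):
    """Yield one averaged row per two input rows, as soon as both arrive."""
    pending = None
    for row in rows:
        if pending is None:
            pending = row  # buffer the first row of the pair
        else:
            # Emit immediately -- no waiting for the rest of the image.
            yield [(a + b) // 2 for a, b in zip(pending, row)]
            pending = None
    if pending is not None:
        yield pending  # an odd trailing row passes through as-is
```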

Re: FIFO thumbnail

Posted: 2011-05-06T12:11:11-07:00
by fmw42
try

convert - -thumbnail "200x200>" -write $THUMBNAIL miff:-

I believe (not tested) that this will write your $THUMBNAIL to disk and the same thumbnail in MIFF format to stdout. You can use any format for stdout, such as TIFF:- or PNG:-, etc.

Re: FIFO thumbnail

Posted: 2011-05-06T12:35:28-07:00
by skeltoac
fmw42 wrote:try

convert - -thumbnail "200x200>" -write $THUMBNAIL miff:-
Getting convert to write to stdout is no problem. The issue is that it starts output only after the entire image has been read on stdin. (The same is true for stream AFAICT.) I'm looking for a way to output a thumbnail one line at a time without waiting for the entire image. I'm sorry if that wasn't clear. AFAIK nobody ever asked for this before.

Re: FIFO thumbnail

Posted: 2011-05-08T23:12:09-07:00
by anthony
convert is NOT a pipelined image processor. It always reads in the whole image and writes out the whole image.
I have a wrapper program (process_ppm_pipeline) that can break a pipeline of multiple images into individual images and process each image in turn, but only for 'raw' PNM streams at this time. Even then, that is image-by-image processing, not line-by-line.

The IM command "stream" will handle image processing on a line-by-line basis if the image format allows, but it does not have resize capability; it is designed with basic cropping in mind.

That does not mean it is not possible, as "stream" uses the appropriate MagickCore routines to do it. But it is probably not as you envisage, as resize is more complex than you may think. See IM Examples, Resize Filters, if you really want to get into this!
http://www.imagemagick.org/Usage/resize/#filter

Actually resizing images height-wise in a streamed, line-by-line fashion is not straightforward. You would need to buffer at least all the lines within the resize filter's support range so that all the data is available, and the number of image rows that involves is variable. It may be two, it may be a hundred, depending on how much compression of the image (height-wise) is involved and on the actual filter being used.

The only image processing package that even has a chance of doing a row-by-row stream 'resize' is the PbmPlus/NetPbm package, which is stream-oriented. However, its manuals rarely define at what level a command 'streams' the image. The relevant command is "pamscale", and in this case the manpage states that it works purely on an image-by-image basis.

Basically, it is possible (by buffering all rows in the current resize filter support range), but I know of no program doing actual row-by-row stream resize of images.
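The buffering idea described above can be sketched as follows (an illustrative toy, not ImageMagick code): each output row is a weighted average of the input rows inside the filter's support window, and only that window ever needs to be held in memory, whether it is two rows or a hundred. For brevity this sketch uses disjoint windows and an integer shrink factor, whereas real resize filters slide overlapping windows.

```python
def stream_filtered_downscale(rows, weights):
    """Yield one output row per len(weights) input rows.

    len(weights) is the filter's support: the only rows that must be
    buffered at any moment. Each output pixel is the weighted average
    of the corresponding column across the buffered window.
    """
    support = len(weights)
    total = sum(weights)
    window = []  # rows currently inside the support range
    for row in rows:
        window.append(row)
        if len(window) == support:
            # Support window is full: emit one row, then discard it.
            yield [sum(w * v for w, v in zip(weights, col)) // total
                   for col in zip(*window)]
            window = []
```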

If you manage to find out more, no matter how trivial, please let this forum know! There are lots of people interested in stream processing of extremely large images using minimal memory footprints.