convert is NOT a pipelined image processor. It always reads in the whole image and writes out the whole image.
I have a wrapper program, "process_ppm_pipeline", that can break a pipeline of multiple images into individual images and process each image in turn, though at this time only for 'raw' PNM streams. But even then, that is image-by-image processing, not line-by-line.
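The splitting step itself is straightforward if you know the PNM format. As a rough illustration (this is my own minimal Python sketch, not the actual process_ppm_pipeline program), here is how a stream of concatenated raw 'P6' PPM images can be broken apart, since each image's header declares exactly how many pixel bytes follow it:

```python
def read_token(f):
    """Read one whitespace-delimited PNM header token, skipping '#' comments."""
    tok = b""
    while True:
        c = f.read(1)
        if not c:
            return tok                     # end of stream
        if c == b"#":                      # comment runs to end of line
            while c and c != b"\n":
                c = f.read(1)
            continue
        if c.isspace():
            if tok:
                return tok
            continue                       # skip leading whitespace
        tok += c

def split_ppm_stream(stream, handle_image):
    """Split a stream of concatenated raw (P6) PPM images, calling
    handle_image(header_bytes, pixel_bytes) for each image in turn."""
    while True:
        magic = read_token(stream)
        if not magic:
            break                          # clean end of stream
        if magic != b"P6":
            raise ValueError("not a raw PPM image: %r" % magic)
        width = int(read_token(stream))
        height = int(read_token(stream))
        maxval = int(read_token(stream))
        bytes_per_sample = 1 if maxval < 256 else 2
        # The header tells us the exact pixel payload size, so we can
        # read this image whole and leave the stream at the next one.
        pixels = stream.read(width * height * 3 * bytes_per_sample)
        header = b"P6\n%d %d\n%d\n" % (width, height, maxval)
        handle_image(header, pixels)
```

Note that this still reads each image's pixel data in one lump, which is exactly why it is image-by-image and not line-by-line processing.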
The IM command "stream" will handle image processing on a line-by-line basis if the image format allows it. But it does not have resize capability, as it was designed with basic cropping in mind.
That does not mean it is not possible, as stream uses the appropriate MagickCore routines to do its work. But it is probably not as you envisage, as resize is more complex than you may think. See IM Examples, Resize Filters, if you really want to get into this!
http://www.imagemagick.org/Usage/resize/#filter
Actually resizing an image height-wise in a streamed, line-by-line fashion is not straightforward. You would need to read in at least all the lines within the resize filter's support range so that all the needed data is available, and the number of image rows that involves is variable. It may be two, it may be a hundred, depending on how much the image is being compressed height-wise and on the actual filter being used.
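To make the buffering idea concrete, here is a minimal Python sketch (my own illustration, not any existing tool) of a vertical streaming resize using a simple triangle (bilinear) filter. For each output row it reads input rows forward until the filter's support window is fully buffered, and discards rows that have scrolled out of that window, so memory use is bounded by the support range rather than the image height:

```python
import math

def stream_resize_height(rows, in_h, out_h, support=1.0):
    """Vertically resize an image in streaming fashion with a triangle
    filter.  `rows` is an iterator of input rows (lists of numbers);
    yields out_h output rows while buffering only the input rows inside
    the current filter support window."""
    scale = in_h / out_h
    # When shrinking, the filter is widened by the scale factor,
    # which is why the support window can grow to many rows.
    filt_scale = max(scale, 1.0)
    radius = support * filt_scale

    buf = {}                 # input row index -> row data
    it = iter(rows)
    next_in = 0              # index of the next input row to read

    for out_y in range(out_h):
        center = (out_y + 0.5) * scale       # output row's position in input space
        lo = max(0, int(math.floor(center - radius)))
        hi = min(in_h - 1, int(math.ceil(center + radius)))

        # Read forward until every contributing row is buffered.
        while next_in <= hi:
            buf[next_in] = next(it)
            next_in += 1
        # Drop rows that have scrolled out of the support window;
        # centers only move downward, so they are never needed again.
        for y in [y for y in buf if y < lo]:
            del buf[y]

        # Weighted sum of the buffered rows (triangle filter weights).
        acc = [0.0] * len(buf[lo])
        wsum = 0.0
        for y in range(lo, hi + 1):
            w = max(0.0, 1.0 - abs((y + 0.5 - center) / filt_scale))
            if w == 0.0:
                continue
            wsum += w
            for x, v in enumerate(buf[y]):
                acc[x] += w * v
        yield [v / wsum for v in acc]
```

A real resize would of course offer the full set of filters (see the Resize Filters link above), but the buffering structure is the same: only the rows within the current filter's support need to be held in memory.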
The only image processing package that even has a chance of doing a row-by-row stream 'resize' is the PbmPlus/NetPbm package, which is stream oriented. However, its manuals rarely define at what level a command 'streams' the image. Hmmm... the command is "pamscale", and in this case the manpage states that it works purely on an image-by-image basis.
Basically, it is possible (by buffering all the rows in the current resize filter's support range), but I know of no program that does an actual row-by-row stream resize of images.
If you manage to find out more, no matter how trivial, please let this forum know! There are lots of people interested in stream processing of extremely large images using minimal memory footprints.