On the basis of the few downsampling examples I've scrutinized, here are MY (most likely biased) CONCLUSIONS.
BEST OVERALL:
EWA (-distort Resize)
LanczosSharp (the new one, which is not very different from the current one, and also not very different from plain EWA Lanczos; Henry calls this scheme JincJinc3) is the best overall. It is a very slightly deblurred Jinc-windowed Jinc, 3 lobes, used as an Elliptical Weighted Averaging (EWA) filter.
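For concreteness, a minimal command line (the file names and the 50% geometry are placeholders, and this assumes an ImageMagick recent enough to have LanczosSharp as a named filter):

  convert input.png -filter LanczosSharp -distort Resize 50% output.png

-distort Resize applies the filter in its cylindrical (EWA) form, which is the whole point here.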
FOR SHARPER RESULTS:
EWA (-distort Resize)
Jinc3Radius3: EWA Lanczos deblurred more than LanczosSharp (just enough so that the unscaled support has radius 3 instead of a radius equal to the third root of the Jinc function, about 3.2383).
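If there is no built-in filter for this, it can be approximated with EWA Lanczos plus an explicit deblur; the constant below is 3 divided by the 3-lobe support (the third zero of Jinc, roughly 3.2383), so treat the exact digits as an approximation:

  convert input.png -filter Lanczos -define filter:blur=0.9264075766146068 -distort Resize 50% output.png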
I am biased toward schemes that fight off moire, are reasonably respectful of local tone, and don't have halos that extend too far (no 4-lobe method makes my short list, even though some of them give good results). I am also partial to schemes that do well when upsampling, not only when downsampling.
CLOSE BUT NO CIGAR:
The various EWA Keys cubics, starting with Robidoux, are good, and they make it easy to adjust the sharpness <-> aliasing balance with one intuitive parameter that controls the blur (and replaces a final USM). However, they don't do as good a downsampling job as the above two, unless tight halos are desired: 3-lobe methods have double halos, while EWA Keys cubics have single halos, which are mild unless the blur parameter is close to 0 (see the commands below).
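For reference (same placeholder file names and geometry), the first command below uses the built-in Robidoux filter; the second shows one reading of "one intuitive parameter," namely the Keys B value with C tied to it by the Keys constraint C = (1 - B)/2, assuming the filter:b / filter:c expert defines behave as I expect. B near 1 is blurry and halo-free (B-spline-like), B near 0 is sharp with a more visible halo (Catmull-Rom-like); the B = 0.5 below is just an illustration:

  convert input.png -filter Robidoux -distort Resize 50% output.png
  convert input.png -filter Cubic -define filter:b=0.5 -define filter:c=0.25 -distort Resize 50% output.png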
Tensor (-resize)
Lanczos (Sinc-windowed Sinc 3, the classic "high quality downsampler") does really well, and is sharp, but the fly results kill it. IMHO, it does not low-pass nearly as well as the best EWA schemes. (This is suggested by the theory: Jinc is the "ideal" 2D filter, not (tensor) Sinc, which fails to be isotropic in its filtering power.)
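For comparison, the tensor version goes through -resize instead of -distort Resize (placeholders as before); with -resize, Lanczos is the orthogonal (separable) Sinc-windowed Sinc 3:

  convert input.png -filter Lanczos -resize 50% output.png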
-----
This is not set in stone, but I'm starting to have a more solid opinion.
P.S. I did not have the heart to carefully study things like Mitchell-Lanczos blends, which really should be done in HDRI to avoid the effects of clamping before blending (unless this is intentional). And I may have to do a double take comparing, as a group, tensor Keys with EWA Keys: I may have been guilty of selection bias (http://en.wikipedia.org/wiki/Selection_bias) w.r.t. EWA vs. tensor Keys cubics, although I do think that the EWA ones win.
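In case it is useful: a quick, generic way to check whether a given ImageMagick build is HDRI-enabled is to look for "HDRI" in the Features line printed by

  convert -version

If it is missing, the build was configured without --enable-hdri.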