
Dealing with High Load from Huge Images

Posted: 2009-03-25T15:48:17-07:00
by Akujin
I'm a developer for Artician.com, an art site. We allow image uploads of up to 20 MB in JPG, JPEG, GIF, or PNG format.

We use ImageMagick for two distinct functions.

One is to calculate the average main colors in an image's palette.

Code:

$average = new Imagick( $file );
$average->quantizeImage( $numColors, Imagick::COLORSPACE_RGB, 0, false, false ); // Reduce the palette to $numColors colors (10 in our case)
$average->uniqueImageColors(); // Only keep one pixel of each remaining color

// Clone the average and modulate to brighter & darker variants
$bright = clone $average;
$bright->modulateImage( 125, 200, 100 );
$dark = clone $average;
$dark->modulateImage( 80, 100, 100 );

// Build the mini palette images from each variant via our helper
$colors['avg'] = self::getGeneralColors( $average );
$colors['dark'] = self::getGeneralColors( $dark );
$colors['light'] = self::getGeneralColors( $bright );
The other way we use ImageMagick is for creating thumbnails, which is handled through the open-source phpThumb class.

So far we've had 38,000 pieces of art submitted with little to no problems, but today our entire server ground to a halt when someone uploaded a very high-resolution JPEG (9438x12012 pixels). Short-term load averages on the server went above 50.0. The server has a single quad-core CPU.

The thumbnails get created through a function registered to run on shutdown so that the client isn't left waiting on them.
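The deferral looks roughly like this (a minimal sketch; createThumbnails() and $uploadedFile are hypothetical stand-ins for our real phpThumb-based code):

Code:

// Hypothetical helper that builds the large thumbnail; in reality this
// goes through the phpthumb class.
function createThumbnails( $file )
{
    $im = new Imagick( $file );
    $im->thumbnailImage( 1200, 0 ); // max width 1200, keep aspect ratio
    $im->writeImage( dirname( $file ) . '/large_' . basename( $file ) );
    $im->destroy();
}

// Registered to run after the script finishes, so the upload response
// goes out before the heavy work starts.
register_shutdown_function( 'createThumbnails', $uploadedFile );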

The colors are calculated through an AJAX call when a client visits a submission's view page, but only if they haven't already been calculated.

So my question is fairly open ended.

How can I mitigate the load of both thumbnailing and color palettization for such large images?

If possible I'd like to fork all of this to a separate process and then lower the process priority. Is this doable (and is it a solution)?
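To make that concrete, what I have in mind is something along these lines (a rough sketch; worker.php is a hypothetical script that would do the thumbnailing and quantizing):

Code:

// Hand the file off to a separate, low-priority PHP process and return immediately.
$cmd = sprintf(
    'nice -n 19 php %s %s > /dev/null 2>&1 &',
    escapeshellarg( '/path/to/worker.php' ),
    escapeshellarg( $file )
);
exec( $cmd );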

Re: Dealing with High Load from Huge Images

Posted: 2009-03-25T18:41:43-07:00
by magick
Have you read http://www.imagemagick.org/script/architecture.php?

At our image site, http://www.imagemagick.org/MagickStudio ... Studio.cgi, we perform a number of sanity checks to prevent denial-of-service problems. The first thing we do is limit the number of incoming characters of any upload. Next we use Ping() and check whether the width / height of the image exceeds a maximum value (ping is lightweight and does not load the image pixels). Then we set the area, memory, map, and disk limits to ensure that large images are cached to disk, so the processor is not taxed, and very large images return an exception. The script is available at ftp://ftp.imagemagick.org/pub/ImageMagi ... 9.3.tar.gz. It's written in Perl, but the principles can be translated to imagick / PHP. This script has been in use for over 10 years now without a single incident of denial of service.
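A rough translation of those checks into imagick / PHP might look like this (a sketch only; the limit values are illustrative, not recommendations, and the expected units for the resource limits should be checked against your Imagick version):

Code:

$maxBytes  = 20 * 1024 * 1024; // reject oversized uploads up front
$maxWidth  = 5000;             // illustrative dimension limits
$maxHeight = 5000;

if ( filesize( $file ) > $maxBytes )
    throw new Exception( 'Upload too large' );

// Ping reads only the header, so it is cheap even for huge images.
$probe = new Imagick();
$probe->pingImage( $file );
if ( $probe->getImageWidth() > $maxWidth || $probe->getImageHeight() > $maxHeight )
    throw new Exception( 'Image dimensions too large' );
$probe->destroy();

// Cap the pixel cache so big images spill to disk instead of exhausting RAM,
// and truly enormous ones raise an exception instead of swamping the machine.
Imagick::setResourceLimit( Imagick::RESOURCETYPE_AREA,   64 * 1024 * 1024 );
Imagick::setResourceLimit( Imagick::RESOURCETYPE_MEMORY, 64 * 1024 * 1024 );
Imagick::setResourceLimit( Imagick::RESOURCETYPE_MAP,    128 * 1024 * 1024 );
Imagick::setResourceLimit( Imagick::RESOURCETYPE_DISK,   1024 * 1024 * 1024 );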

Re: Dealing with High Load from Huge Images

Posted: 2009-03-28T13:43:55-07:00
by Akujin
After running some benchmarks, we found that the one line causing most of the load was

$average->quantizeImage( $numColors, Imagick::COLORSPACE_RGB, 0, false, false );

It was taking 47 seconds to run.

Instead of running the function on the original image, we decided to run it on our "large view" thumbnail (max width 1200, aspect ratio preserved). On the large view it took only 3 seconds. Thumbnailing itself was more or less a non-issue.

The colors are a bit different but we decided it wasn't worth the hassle.

Here are the colors that come out on the test http://dl.getdropbox.com/u/4163/diff.png

Left side is original size @ 47 seconds and right side is "large" size at 3 seconds.
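For reference, the change amounts to quantizing a downscaled copy rather than the full-resolution upload. A minimal sketch (in practice we simply reuse the already-generated large-view file, so the scaling step here is only illustrative):

Code:

// Quantize a downscaled copy instead of the original upload.
$average = new Imagick( $file );
$average->thumbnailImage( 1200, 0 ); // fit to max width 1200, keep aspect ratio
$average->quantizeImage( $numColors, Imagick::COLORSPACE_RGB, 0, false, false );
$average->uniqueImageColors();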

Re: Dealing with High Load from Huge Images

Posted: 2009-04-10T07:26:52-07:00
by mkoppanen
I think it is quite safe to say that the two palettes are close enough :)