I'm writing software that generates web galleries (http://www.ezwebgallery.org), using Qt4 and Magick++. After updating ImageMagick from 6.7.1-0 to 6.7.6-5, I could no longer read some photos with Magick::Image::read( const std::string &imageSpec_ ).
I found the cause, but I cannot determine whether it comes from a former misuse of this function on my side or from a regression. This occurred on Windows; I haven't tried on Linux.
Code sample:
Magick::Image pic;
// fileName is a QString holding the path of the photo (it contains a non-ASCII character)
std::string filenameLocal8( fileName.toLocal8Bit().constData() );
try {
    pic.read( filenameLocal8 );
}
catch ( Magick::Exception & ) { /* error handling omitted */ }
The QString fileName contains the character 'é', which is not part of the 128-character standard ASCII set. It gets encoded as ISO 8859-1 (via toLocal8Bit()) before being assigned to the std::string filenameLocal8. As I understand it, a std::string carries no information about its encoding.
The data held by filenameLocal8 is subsequently passed to OpenBlob and then to fopen_utf8, where the 'é' character is incorrectly decoded, leading to a call to _wfopen with
"D:\Users\Chris\temp\sꭥlection test\_PiqueNique&BBQ_011 Copie__.jpg" as the path parameter, which ultimately fails.
Of course I understand this happened because the std::string I passed to Image::read was not UTF-8 encoded, and assigning it proper UTF-8 data solved my problem. Still, as stated above, with 6.7.1-0 Image::read could take the ISO 8859-1 string and open the corresponding file with no trouble.
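For reference, the workaround on my side was roughly this: hand Image::read UTF-8 bytes instead of the local 8-bit encoding (same fileName and pic as in the sample above):
// Encode the path as UTF-8 before passing it to Magick++
std::string filenameUtf8( fileName.toUtf8().constData() );
try {
    pic.read( filenameUtf8 );
}
catch ( Magick::Exception & ) { /* error handling omitted */ }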
So, either:
- 1. ImageMagick tries to determine whether the provided file path is encoded in an 8-bit local codepage ("8-bit ASCII") or in UTF-8; in this particular case 6.7.6-5 fails where 6.7.1-0 didn't.
- 2. Or Image::read requires a string containing UTF-8 or plain 7-bit "standard" ASCII; if so, it should be mentioned somewhere in the Image::read documentation (I didn't find it). Of course I understand that UTF-8 is far more robust than the assorted 8-bit codepages.
Thanks!