Converting a large number of tifs to pdf efficiently?

chadcf
Posts: 4
Joined: 2013-04-26T09:09:29-07:00

Converting a large number of tifs to pdf efficiently?

Post by chadcf »

We have a project that requires us to take an uploaded zip file and convert the 100+ tifs and pdfs inside into a single pdf document. I've written a simple shell script to do this and it works fine, except that it completely thrashes the machine and has locked up one of our production servers on multiple occasions. The files are all named something like xxx-01.tif, and I have to sort them by the part after the "-" and convert them in that order to a pdf, so the actual convert command looks something like:

convert file-01.tif file-02.tif file-03.tif ... file-101.tif outputfile.pdf

I'm guessing that ImageMagick is pulling all of these source tifs into RAM and that's why things are going haywire (or just that the CPU is being thrashed). Regardless, is there any way I can accomplish this without completely destroying the server? Maybe convert them one at a time and append, or some other strategy?
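
Something roughly like this is what I have in mind (file names are placeholders, and the sort assumes the only dash is the one in front of the page number):

# convert each tif to its own single-page pdf, in suffix order
mkdir -p pages
for f in $(ls file-*.tif | sort -t- -k2 -n); do
    convert "$f" "pages/${f%.tif}.pdf"
done

# merge the per-page pdfs into the final document; this last step still has
# to read every page, so it may not save much memory by itself
convert $(ls pages/file-*.pdf | sort -t- -k2 -n) outputfile.pdf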
chadcf
Posts: 4
Joined: 2013-04-26T09:09:29-07:00

Re: Converting a large number of tifs to pdf efficiently?

Post by chadcf »

Yeah, I gave that a shot with a limit of 128, and it ended up segfaulting and filling up the disk.
magick
Site Admin
Posts: 11064
Joined: 2003-05-31T11:32:55-07:00

Re: Converting a large number of tifs to pdf efficiently?

Post by magick »

Set the temporary path to a disk that has plenty of free space. If the free space fills up while the pixel cache is memory-mapped, the OS will return a fault. You can set '-limit memory 0 -limit map 0' so that all pixels are cached to disk, dramatically reducing the memory requirements. It also doesn't hurt to use a modern version of ImageMagick; the current release is 6.8.5-2.
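
For example, something along these lines, with the temporary path adjusted to wherever you actually have the space:

# send ImageMagick's temporary files to a partition with plenty of free space
export MAGICK_TMPDIR=/path/to/big/disk

# cache all pixels to disk instead of RAM while building the PDF
convert -limit memory 0 -limit map 0 file-01.tif file-02.tif ... file-101.tif outputfile.pdf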
josal
Posts: 2
Joined: 2013-06-07T09:28:02-07:00

Re: Converting a large number of tifs to pdf efficiently?

Post by josal »

I've tried the '-limit memory 0 -limit map 0' suggestion, but I'm still getting memory errors.

To give a bit more detail: I'm using ImageMagick in a web app hosted on Heroku, which only gives my process 512MB of memory. It always gives me an R14 error (https://devcenter.heroku.com/articles/e ... a_exceeded). I can increase the available memory to 1024MB, but it all gets used up just the same. It seems like the memory limit is not being set at all. The same happens with other limits such as '-limit memory 32MiB -limit map 64MiB'.

How can I effectively set the memory limit? Or how can I tell whether the limit has been set correctly?
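
From what I've read, something like the following should print the limits the binary is actually using (I'm not sure whether '-list resource' exists in a version as old as mine), and the limits can apparently also be set through environment variables instead of -limit, which might be easier to get into a Heroku process:

# print the resource limits ImageMagick thinks it is running with
identify -list resource

# set the limits via the environment instead of on the command line
export MAGICK_MEMORY_LIMIT=32MiB
export MAGICK_MAP_LIMIT=64MiB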

Thanks in advance - the version on my hosting server is 6.5.7-8 2012-08-17 Q16.