Working with large TIFF images in C++ / ImageMagick / libtiff
I am writing C++/OpenCV-based software for parsing a large number of large TIFF images (feature extraction and characterization). My current approach is to split each image into smaller sections, analyze each separately, and combine the results. I am trying to use ImageMagick's convert tool to perform the cropping:
convert input.tif -crop 4000x4000 +repage -scene 0 "output%d.tif"
This, however, runs for hours at 100% RAM and 100% I/O bandwidth usage, without producing any output.
Image details given by identify input.tif:
input.tif TIFF64 63488x49408 63488x49408+0+0 8-bit sRGB 9.4112GB 0.000u 0:00.031
The same command run on a smaller downsampled version of the image (about 20K by 20K pixels) returns output within 5 seconds.
The computer has 8 GB of RAM, Windows 8 x64, and a 2.00 GHz CPU.
Can anyone please advise me on how to establish what is going wrong? Otherwise, can anyone advise me on an alternative way to solve this problem?
EDIT 1: More information
c:\test>identify -list resource
Resource limits:
Width: 214.7MP
Height: 214.7MP
Area: 14.888GP
Memory: 6.9326GiB
Map: 13.865GiB
Disk: unlimited
File: 1536
Thread: 4
Throttle: 0
Time: unlimited
I think my other answer here should help you significantly, but another thing you could check is whether you are allowing ImageMagick to use your 8 GB of RAM. Try this command:
identify -list resource
Resource limits:
Width: 214.7MP
Height: 214.7MP
Area: 4.295GP
Memory: 2GiB <---
Map: 4GiB
Disk: unlimited
File: 192
Thread: 1
Throttle: 0
Time: unlimited
and check the Memory parameter. If it is low, you can increase it on the command line with something like this:
convert -limit memory 6GiB ...
or with an environment variable like this (on Windows, use set instead of export):
export MAGICK_MEMORY_LIMIT=8000000000
identify -list resource
Resource limits:
Width: 214.7MP
Height: 214.7MP
Area: 4.295GP
Memory: 7.4506GiB <---
Map: 4GiB
Disk: unlimited
File: 192
Thread: 1
Throttle: 0
Time: unlimited
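Note that MAGICK_MEMORY_LIMIT takes plain bytes when no unit suffix is given, so the 8000000000 above comes out as about 7.45 GiB, which is exactly the Memory value the listing reports. A quick sanity check of that arithmetic:

```shell
# 8000000000 bytes expressed in GiB (1 GiB = 2^30 bytes)
awk 'BEGIN { printf "%.4f GiB\n", 8000000000 / (2 ^ 30) }'
# → 7.4506 GiB
```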