I've been trying to work with some large image files. A current example is a JPEG that is 22447 x 54334 pixels (approx 4 GB uncompressed, 250 MB on disk). I'm working in x64 with 16 GB of RAM, so there is adequate address space and RAM available.
Initially I wanted to work with files this size, but in practice it's going to be better to reduce them to a mosaic of tiles. I have found external ways of chopping them into tiles, but I really hoped to be able to do this inside my application.
[and before anyone says it - the size of the file I start with is defined by the data acquisition and changing it is not negotiable]
I am able to load this file with TJPEGImage.LoadFromFile(). However, if I try to do anything with it I get an EOutOfResources exception with the message "The parameter is incorrect" - for example when using Canvas.Pixels, or when calling TBitmap.Assign() - presumably in both cases at the point where it attempts to create the DIB.
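A minimal sketch of what fails for me (file name is a placeholder for the real acquisition file; the exception details are as I see them on my machine):

```pascal
uses
  Vcl.Graphics, Vcl.Imaging.jpeg;

procedure LoadHugeJpeg;
var
  Jpg: TJPEGImage;
  Bmp: Vcl.Graphics.TBitmap;
begin
  Jpg := TJPEGImage.Create;
  try
    // This step succeeds: the compressed JPEG data is read without decoding.
    Jpg.LoadFromFile('capture.jpg');  // placeholder file name

    Bmp := Vcl.Graphics.TBitmap.Create;
    try
      // This is where it dies for me: the assign forces the JPEG to be
      // decoded into a GDI DIB, and I get
      //   EOutOfResources: 'The parameter is incorrect'
      Bmp.Assign(Jpg);
    finally
      Bmp.Free;
    end;
  finally
    Jpg.Free;
  end;
end;
```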
Is this a permanent problem with a bitmap this size? (And if so, why is there this limit?) Or does anyone know of a way of working round it?
I'm not too fussed about performance, as I basically just want to chop up the data as it's acquired, and one of these bitmaps takes several hours to acquire.
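For reference, this is the kind of tiling loop I was hoping to write (tile size and file names are placeholders; it fails with the same EOutOfResources, presumably because drawing the TJPEGImage still requires the full decoded DIB):

```pascal
uses
  System.SysUtils, Vcl.Graphics, Vcl.Imaging.jpeg;

procedure ChopIntoTiles(Jpg: TJPEGImage);
const
  TILE_SIZE = 2048;  // placeholder tile size
var
  Tile: Vcl.Graphics.TBitmap;
  X, Y: Integer;
begin
  Tile := Vcl.Graphics.TBitmap.Create;
  try
    Tile.SetSize(TILE_SIZE, TILE_SIZE);
    for Y := 0 to (Jpg.Height - 1) div TILE_SIZE do
      for X := 0 to (Jpg.Width - 1) div TILE_SIZE do
      begin
        // Draw the source offset so that tile (X, Y) lands at (0, 0).
        // This raises the same exception as TBitmap.Assign for me.
        Tile.Canvas.Draw(-X * TILE_SIZE, -Y * TILE_SIZE, Jpg);
        Tile.SaveToFile(Format('tile_%d_%d.bmp', [X, Y]));
      end;
  finally
    Tile.Free;
  end;
end;
```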
I've also approached this in FireMonkey and found some interesting effects.
(1) If you open this large JPG with TBitmap.LoadFromFile, it resizes the image so that the maximum dimension is 8192. I haven't found a way either to query this limit or to change it. The problem is that until you load the file you don't know its size, so you can't even tell it has been resized.
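This is what I mean - a sketch of the silent resize (file name is a placeholder; with my 22447 x 54334 source the loaded bitmap comes back with its larger dimension clamped to 8192, and no error or warning is raised):

```pascal
uses
  FMX.Graphics;

procedure ShowSilentResize;
var
  Bmp: FMX.Graphics.TBitmap;
begin
  Bmp := FMX.Graphics.TBitmap.Create;
  try
    Bmp.LoadFromFile('capture.jpg');  // placeholder file name
    // For my file this does NOT report 22447 x 54334: the image has been
    // scaled down so that the larger dimension is 8192, silently.
    Writeln(Bmp.Width, ' x ', Bmp.Height);
  finally
    Bmp.Free;
  end;
end;
```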
(2) You can avoid the resizing with TBitmapCodecManager.LoadFromFile(), and then I get the correct Width/Height values I am expecting. However, if I save the file with TBitmapCodecManager.SaveToFile(), or access a Pixels value, the data appears to be all zeros.
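Roughly what I'm doing (file name is a placeholder; LoadFromFile here takes a TBitmapSurface rather than a TBitmap, and an AMaxSizeLimit of 0 means no limit as far as I can tell):

```pascal
uses
  FMX.Graphics, FMX.Surfaces;

procedure LoadViaCodecManager;
var
  Surf: TBitmapSurface;
begin
  Surf := TBitmapSurface.Create;
  try
    if TBitmapCodecManager.LoadFromFile('capture.jpg', Surf) then  // placeholder name
    begin
      // Width/Height come back correct: 22447 x 54334 for my file.
      Writeln(Surf.Width, ' x ', Surf.Height);
      // ...but every pixel I sample here reads back as zero,
      // and saving via TBitmapCodecManager.SaveToFile gives the same result.
    end;
  finally
    Surf.Free;
  end;
end;
```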
It's very disconcerting to call code that silently drops or modifies your data and gives no clue that it has done so.
Does anyone have any ideas?