Margherita
thanks, now that you mention it, I remember a special GDAL option "GDAL_MAX_DATASET_POOL_SIZE", which defaults to 100. The operating system limit itself is high enough on our server (65k). When running the processing again with a smaller memory cache (and hence likely fewer concurrently opened files) it worked without errors. I will set this GDAL option and see if I ever run into these errors again.
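For reference, the pool size can be raised either for the whole session via an environment variable or per invocation with GDAL's `--config` switch; the value 450 below is just an illustrative choice, not a recommendation from this thread:

```shell
# Raise the VRT dataset pool limit for all subsequent GDAL commands
export GDAL_MAX_DATASET_POOL_SIZE=450

# Or set it for a single invocation only
gdal_translate --config GDAL_MAX_DATASET_POOL_SIZE 450 \
    input.vrt output.tif
```

The trade-off is memory and file descriptors: each pooled dataset keeps a file handle open, so the value should stay well below the OS open-file limit.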
armin
Sent: Wednesday, 21 December 2016 at 07:29
From: "Margherita Di Leo" <[email protected]>
To: [email protected]
Subject: Re: [gdal-dev] Read error on large VRT files
Hey Armin,
Could it be a limitation on the number of files open at the same time? See https://grasswiki.osgeo.org/wiki/Large_raster_data_processing#Troubleshooting
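The per-process open-file limit Margherita refers to can be checked and, within the hard limit, raised from the shell (a minimal sketch; 4096 is an arbitrary example value):

```shell
# Show the current per-process soft limit on open file descriptors
ulimit -n

# Raise the soft limit for the current shell session
# (the hard limit, shown by `ulimit -Hn`, caps how far this can go)
ulimit -n 4096
```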
On Tue, 20 Dec 2016 at 22:21, Armin Burger <[email protected]> wrote:
Hi all
I sometimes get read errors when trying to convert large input VRT files
into a single BigTIFF file, typically applying an outsize of e.g. 50% or
25%. Errors are like
ERROR 1: TIFFFillTile:Read error at row 11264, col 11264; got 46 bytes, expected 84
ERROR 1: TIFFReadEncodedTile() failed.
ERROR 1: 2352/2592_BUCNFD.tif, band 1: IReadBlock failed at X offset 45,
Y offset 47
The tile that returns the error can be converted alone without problems.
Altogether the VRT can reference up to around 6000 tiles, totalling
100 GB; the tiles are LZW- or Deflate-compressed TIFFs.
Is there a possible source of errors when reading and converting
VRTs of compressed TIFFs of that size?
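For context, a conversion along these lines would match the description above; filenames, the output size, and the creation options are placeholders, not the exact command from this thread:

```shell
# Downsample a large VRT mosaic into a single BigTIFF
gdal_translate -of GTiff -outsize 50% 50% \
    -co BIGTIFF=YES -co COMPRESS=DEFLATE -co TILED=YES \
    mosaic.vrt output_bigtiff.tif
```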
Thanks for any hint
Armin
_______________________________________________
gdal-dev mailing list
[email protected]
http://lists.osgeo.org/mailman/listinfo/gdal-dev
