The new GTI format got by me entirely until Michael Sumner mentioned it on Slack, which I visit extremely rarely (thanks, Michael). I've been doing the same thing in an entirely different way, and this feature request is based on that approach.

The feature request is this:

   gdaltindex -co GDALINFO=mymeta catalog.fgb *.tif
or
   ogrtindex -co OGRINFO=mymeta catalog.fgb *.shp

This says: create an attribute in the .fgb file called 'mymeta' and populate it with the equivalent of 'ogrinfo -json' or 'gdalinfo -json'. Or perhaps:

   gdaltindex -co GDALSTATS=mymeta catalog.fgb *.tif

which would be equivalent to 'gdalinfo -json -stats -hist'.

I've been doing this as a two-step process, but it would be really cool to have it all integrated. I have many raster/vector files of different formats in a single catalog.gpkg and catalog.fgb, and I can easily filter any of them by bbox, type, or stats.
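
For concreteness, here is a rough sketch of that idea collapsed into one Python script with the GDAL bindings. It is an illustration of the workflow, not the proposed gdaltindex option itself; it assumes all footprints can be expressed in EPSG:4326 and reuses the 'mymeta'/'location' field names from above:

   # Build a FlatGeobuf catalog whose features carry the full 'gdalinfo -json'
   # report in a 'mymeta' attribute, next to the usual 'location' field.
   import glob
   import json
   from osgeo import gdal, ogr, osr

   gdal.UseExceptions()
   ogr.UseExceptions()

   drv = ogr.GetDriverByName("FlatGeobuf")
   out = drv.CreateDataSource("catalog.fgb")
   srs = osr.SpatialReference()
   srs.ImportFromEPSG(4326)                    # assumption: WGS84 footprints
   layer = out.CreateLayer("catalog", srs, ogr.wkbPolygon)
   layer.CreateField(ogr.FieldDefn("location", ogr.OFTString))
   layer.CreateField(ogr.FieldDefn("mymeta", ogr.OFTString))

   for path in glob.glob("*.tif"):
       info = gdal.Info(path, format="json")   # equivalent of 'gdalinfo -json'
       # gdalinfo -json reports a wgs84Extent polygon for georeferenced rasters
       footprint = ogr.CreateGeometryFromJson(json.dumps(info["wgs84Extent"]))
       feat = ogr.Feature(layer.GetLayerDefn())
       feat.SetGeometry(footprint)
       feat.SetField("location", path)
       feat.SetField("mymeta", json.dumps(info))
       layer.CreateFeature(feat)

   out = None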

Why?

If you're grabbing your data from an .fgb file on S3 or a web server, without an intermediate server or service, a bbox filter makes it painless to get exactly the data you need. Further, you have access to all the attributes, stats, min, max and stddev for filtering *before* you request the actual data.
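
A hypothetical example of that pre-filtering against a remote .fgb, using nothing but the OGR bindings; the URL, bbox, and attribute filter are made up:

   from osgeo import ogr

   ogr.UseExceptions()
   ds = ogr.Open("/vsicurl/https://example.com/catalog.fgb")
   layer = ds.GetLayer(0)
   layer.SetSpatialFilterRect(-123.0, 37.0, -122.0, 38.0)  # bbox of interest
   layer.SetAttributeFilter("mymeta LIKE '%Int16%'")       # filter on stored metadata
   for feat in layer:
       print(feat.GetField("location"))                    # only now fetch these files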

If you don't like cloud native or .fgb and want to access your data via an API, create your catalog as a .gpkg. Create custom attributes for filtering, indexes on those attributes, JSON metadata, etc., and throw it behind an API.
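
A sketch of what I mean by custom attributes and indexes, done directly through OGR; the table and column names are only illustrative:

   from osgeo import ogr

   ogr.UseExceptions()
   ds = ogr.Open("catalog.gpkg", update=1)
   # Add a filterable column and index it; the GPKG driver hands the SQL to SQLite
   ds.ExecuteSQL("ALTER TABLE catalog ADD COLUMN max_val REAL")
   ds.ExecuteSQL("CREATE INDEX idx_catalog_max_val ON catalog(max_val)")
   ds = None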

With a catalog.gpkg you can keep all your rasters/vectors in the same catalog and use ogr2ogr to export it to an .fgb. Any time you update/upsert the .gpkg, recreate the .fgb.
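
That export is a one-liner through the bindings (or the equivalent ogr2ogr command):

   from osgeo import gdal

   gdal.UseExceptions()
   # Rebuild the cloud-friendly .fgb from the master .gpkg catalog
   gdal.VectorTranslate("catalog.fgb", "catalog.gpkg", format="FlatGeobuf")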

As awesome as STAC is, it's becoming increasingly complex and very slow with large data sets. Compare that to a single -sql query against one catalog.gpkg (particularly with SpatiaLite).
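
By comparison, the whole "search" can be one query; the field names below are illustrative, and the SQLite dialect gives you SpatiaLite functions if GDAL is built with them:

   from osgeo import ogr

   ogr.UseExceptions()
   ds = ogr.Open("catalog.gpkg")
   result = ds.ExecuteSQL(
       "SELECT location FROM catalog WHERE max_val > 100", dialect="SQLITE")
   for feat in result:
       print(feat.GetField("location"))
   ds.ReleaseResultSet(result)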

If you've made it this far, thanks for listening!

Scott

--
www.postholer.com