Averaging rasters is a common task, and I often see developers write many lines of Python for something this simple. The following GDAL command should give you the result you're looking for:

gdal_calc.py -A input1.tif -B input2.tif --outfile=result.tif --calc="(A+B)/2"

The above command is taken directly from the gdal_calc.py documentation page:
https://gdal.org/programs/gdal_calc.html
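
If you ever need the same operation from within a script, here is a minimal sketch of the equivalent using the GDAL Python bindings. It assumes both inputs share the same size, projection, and geotransform, and it omits nodata handling, so treat it as illustrative rather than a drop-in replacement for gdal_calc.py:

    from osgeo import gdal
    import numpy as np

    gdal.UseExceptions()

    # Open both inputs; assumes input1.tif and input2.tif are on the same grid.
    a_ds = gdal.Open("input1.tif")
    b_ds = gdal.Open("input2.tif")
    a = a_ds.GetRasterBand(1).ReadAsArray().astype(np.float64)
    b = b_ds.GetRasterBand(1).ReadAsArray().astype(np.float64)

    # Write the per-pixel average to a new GeoTIFF with the same georeferencing.
    driver = gdal.GetDriverByName("GTiff")
    out_ds = driver.Create("result.tif", a_ds.RasterXSize, a_ds.RasterYSize, 1,
                           gdal.GDT_Float64)
    out_ds.SetGeoTransform(a_ds.GetGeoTransform())
    out_ds.SetProjection(a_ds.GetProjection())
    out_ds.GetRasterBand(1).WriteArray((a + b) / 2.0)
    out_ds = None  # closing the dataset flushes it to disk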


On 9/27/23 10:44, Bill Myers via gdal-dev wrote:
Hi,
I am new to working with GDAL. We are trying to spatially interpolate Numerical Weather Prediction (NWP) model data to an arbitrary location, i.e. a lat/lon point. We have been using gdal_translate but have been getting unexpected results. We have used the resampling method called "average".

In this toy example, we are trying to resample to a grid with twice the density of the input data set. As can be seen in the image linked below, the values at the points that match the input points agree with the input values. That's good. However, the interpolated values do not match our expectations. We would expect the interpolated value at the midpoint (directly between two input points) to be the average of those two input points. As can be seen, this is not the case.

Can someone please help us understand what we might be doing incorrectly? Also, does GDAL have the functionality that we are looking for?
Thanks for your help.
-b
https://pasteboard.co/uumLD4JtNmp3.png

_______________________________________________
gdal-dev mailing list
gdal-dev@lists.osgeo.org
https://lists.osgeo.org/mailman/listinfo/gdal-dev
