I have a large amount of data scattered in memory that I want to write as a 
single large dataset in HDF5. The data is organized as follows:

64000 structs of type Lambert:

typedef struct
{
  double north[22][22];
  double south[22][22];
} Lambert;

(The 64000 is actually flexible, but I have hard-coded it for this example, 
along with the 2D sizes in the Lambert struct.)

I figure I need to create a "chunked" dataset in the "FileSpace" so that all 
the space is allocated in the file, and so that each write puts the "north" 
and "south" arrays at the correct location in the file.

Is there an example that does something like this? Or am I thinking about this 
correctly?

Thanks for any help or pointers.
___________________________________________________________
Mike Jackson                            BlueQuartz Software
Help:                                [email protected]
Web|Download                  http://dream3d.bluequartz.net


_______________________________________________
Hdf-forum is for HDF software users discussion.
[email protected]
http://mail.lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
