Reputation: 1989
I have a 2D array which I need to write into an HDF5 file. The 2D array needs to be split up into a set of 1D arrays, each of the same size. Currently, my C code creates a new dataspace for every one of those 1D arrays. To be more precise, my current approach is:
H5Fcreate
H5Gcreate2
H5Screate_simple
H5Dcreate
H5Dwrite
H5Dclose
H5Sclose
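A minimal sketch of this pattern looks roughly like the following (file name, group name, dataset names and dimensions are just placeholders for illustration):
#include <stdio.h>
#include "hdf5.h"

#define NROWS 100   /* number of 1D arrays (placeholder) */
#define NCOLS 10    /* length of each 1D array (placeholder) */

int main(void)
{
    int data[NROWS][NCOLS] = {{0}};   /* placeholder data */
    char name[64];
    hsize_t dims[1] = {NCOLS};

    hid_t file  = H5Fcreate("test.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t group = H5Gcreate2(file, "/my_group", H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    for (int i = 0; i < NROWS; i++)
    {
        /* a new dataspace is created (and closed) for every single 1D array */
        hid_t space = H5Screate_simple(1, dims, NULL);
        sprintf(name, "row_%d", i);
        hid_t dset = H5Dcreate2(group, name, H5T_NATIVE_INT, space,
                                H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
        H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data[i]);
        H5Dclose(dset);
        H5Sclose(space);
    }

    H5Gclose(group);
    H5Fclose(file);
    return 0;
}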
After having a look at the documentation for H5Dcreate2, I realize that I might not need to close the dataspace after each 1D array and create a new one for the next, as those 1D arrays are all of the same size.
My question: is there any reason why I should create a new/separate dataspace for every single 1D array, if all those 1D arrays have the same size (but different content)?
Upvotes: 0
Views: 802
Reputation: 912
Your conclusion is correct: there is no need to create a new/separate dataspace for every single 1D array you create. In other words, just re-use the same dataspace as many times as needed, as long as all these arrays are of the same size and data type.
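For illustration, with the plain HDF5 C API this amounts to calling H5Screate_simple once before the loop and H5Sclose once after it; the sketch below uses placeholder names and dimensions (not taken from your post):
#include <stdio.h>
#include "hdf5.h"

#define NROWS 100   /* number of 1D arrays (placeholder) */
#define NCOLS 10    /* length of each 1D array (placeholder) */

int main(void)
{
    int data[NROWS][NCOLS] = {{0}};   /* placeholder data */
    char name[64];
    hsize_t dims[1] = {NCOLS};

    hid_t file  = H5Fcreate("test.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t group = H5Gcreate2(file, "/my_group", H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    /* create the dataspace once: all 1D arrays share the same size */
    hid_t space = H5Screate_simple(1, dims, NULL);

    for (int i = 0; i < NROWS; i++)
    {
        sprintf(name, "row_%d", i);
        hid_t dset = H5Dcreate2(group, name, H5T_NATIVE_INT, space,
                                H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
        H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, data[i]);
        H5Dclose(dset);
        /* no H5Sclose here: the same dataspace is reused for the next dataset */
    }

    /* close the dataspace only once, after the last dataset has been written */
    H5Sclose(space);
    H5Gclose(group);
    H5Fclose(file);
    return 0;
}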
This is exactly what happens when using HDFql. Coding your algorithm above with HDFql looks as follows in C:
#include <stdio.h>
#include "HDFql.h"

int main(void)
{
    // declare variable to hold the HDFql scripts
    char script[100];

    // create an HDF5 file named 'test.h5' and use it (i.e. open it)
    hdfql_execute("create and use file test.h5");

    // create an HDF5 group named 'my_group'
    hdfql_execute("create group my_group");

    // create 100 arrays
    for(int i = 0; i < 100; i++)
    {
        // prepare script to create a one dimensional (size 10) dataset of data type int
        sprintf(script, "create dataset my_group/my_dataset_%d as int(10)", i);

        // execute script
        hdfql_execute(script);
    }

    // close file
    hdfql_execute("close file");

    return 0;
}
In the code above, this reuse of the dataspace is done transparently (from the user's point of view), but that is what is happening under the hood.
Upvotes: 1