I find myself rather often trying to convert different sorts of HDF5 files to VTK with `h5tovtk`.
Despite having access to high-performance computing resources, when converting large files (from about 100 GB up) I keep getting the same hopeless response: `arrayh5 error: out of memory`.
A distributed (MPI?) version of the tool (for multi-node systems), or at least a buffered one capable of copying data in chunks, would be very welcome!
Any hope of seeing something like this in the near future?
Or is this already possible and I'm doing something wrong?
Thank you!
It would certainly be possible, but I'm unlikely to work on this myself in the near future. If someone wants to work on a PR to read the data in chunks, that would be welcome.
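For anyone considering such a PR, here is a minimal sketch of the general idea: instead of reading the whole dataset into one array, use HDF5 hyperslab selection to pull in one slab at a time along the first dimension. This is not how `h5tovtk` is currently structured; the dataset name `/data`, the 3-D float layout, and the slab size are all assumptions for illustration.

```c
/* Sketch: read an HDF5 dataset in slabs along its first dimension
 * instead of loading it whole. Assumes a 3-D float dataset named
 * "/data" -- the name, rank, and element type are placeholders.
 * Compile with something like: h5cc chunked_read.c -o chunked_read
 */
#include <stdio.h>
#include <stdlib.h>
#include <hdf5.h>

#define SLAB_ROWS 64  /* rows per pass; tunable for the available memory */

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s file.h5\n", argv[0]); return 1; }

    hid_t file   = H5Fopen(argv[1], H5F_ACC_RDONLY, H5P_DEFAULT);
    hid_t dset   = H5Dopen2(file, "/data", H5P_DEFAULT); /* name is an assumption */
    hid_t fspace = H5Dget_space(dset);

    hsize_t dims[3];
    H5Sget_simple_extent_dims(fspace, dims, NULL);

    /* buffer holds only one slab of SLAB_ROWS x dims[1] x dims[2] floats */
    float *buf = malloc(SLAB_ROWS * dims[1] * dims[2] * sizeof(float));

    for (hsize_t row = 0; row < dims[0]; row += SLAB_ROWS) {
        hsize_t start[3] = { row, 0, 0 };
        hsize_t count[3] = { SLAB_ROWS, dims[1], dims[2] };
        if (row + SLAB_ROWS > dims[0])  /* last, possibly short, slab */
            count[0] = dims[0] - row;

        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
        hid_t mspace = H5Screate_simple(3, count, NULL);
        H5Dread(dset, H5T_NATIVE_FLOAT, mspace, fspace, H5P_DEFAULT, buf);
        H5Sclose(mspace);

        /* ... append this slab to the VTK output here, then reuse buf ... */
    }

    free(buf);
    H5Sclose(fspace);
    H5Dclose(dset);
    H5Fclose(file);
    return 0;
}
```

Peak memory then scales with the slab size rather than the full dataset, which is what would be needed for the 100 GB+ files described above; streaming the slabs out in VTK order is the part that would take real work inside `h5tovtk` itself.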