Hi, I created a dataset with the following Python code:
compression_options = hdf5plugin.Blosc2(
    cname='blosclz',                    # Blosc2 supports 'zstd', 'lz4', 'blosclz', etc.
    clevel=9,                           # Compression level (1-9)
    filters=hdf5plugin.Blosc2.SHUFFLE,  # Better for floating point data
)
if isinstance(da_arr.data, da.Array):
    chunks = da_arr.data.chunksize
    is_dask = True
else:
    chunks = None
    is_dask = False
dset = f.create_dataset(
    dset_name,
    shape=da_arr.shape,                 # Keep original shape
    dtype=da_arr.dtype,                 # Ensure proper data type
    chunks=chunks,                      # Copy chunksize of underlying dask array
    compression=compression_options,
)
The h5web viewer (in VS Code) gives this error:
HDF5-DIAG: Error detected in HDF5 (1.14.2) thread 0:
#000: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5D.c line 1061 in H5Dread(): can't synchronously read data
major: Dataset
minor: Read failed
#001: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5D.c line 1008 in H5D__read_api_common(): can't read data
major: Dataset
minor: Read failed
#002: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5VLcallback.c line 2092 in H5VL_dataset_read_direct(): dataset read failed
major: Virtual Object Layer
minor: Read failed
#003: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5VLcallback.c line 2048 in H5VL__dataset_read(): dataset read failed
major: Virtual Object Layer
minor: Read failed
#004: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5VLnative_dataset.c line 363 in H5VL__native_dataset_read(): can't read data
major: Dataset
minor: Read failed
#005: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5Dio.c line 279 in H5D__read(): can't initialize I/O info
major: Dataset
minor: Unable to initialize object
#006: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5Dchunk.c line 1088 in H5D__chunk_io_init(): unable to create file and memory chunk selections
major: Dataset
minor: Unable to initialize object
#007: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5Dchunk.c line 1231 in H5D__chunk_io_init_selections(): unable to create file chunk selections
major: Dataset
minor: Unable to initialize object
#008: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5Dchunk.c line 1692 in H5D__create_piece_file_map_all(): can't insert chunk into skip list
major: Dataspace
minor: Unable to insert object
#009: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5SL.c line 1036 in H5SL_insert(): can't create new skip list node
major: Skip Lists
minor: Unable to insert object
#010: /__w/libhdf5-wasm/libhdf5-wasm/build/1.14.2/_deps/hdf5-src/src/H5SL.c line 709 in H5SL__insert_common(): can't insert duplicate key
major: Skip Lists
minor: Unable to insert object
This is the metadata displayed with h5web of the dataset:
With gzip compression, for instance, it works just fine. I can also read the dataset back with the h5py library. Should h5web be able to read Blosc and Blosc2 compressed datasets out of the box?
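For context, reading it back with h5py works along these lines (a minimal sketch; the file name data.h5 and dataset name dset_name are placeholders):

import h5py
import hdf5plugin  # importing registers the Blosc2 filter with HDF5

with h5py.File("data.h5", "r") as f:  # placeholder file name
    arr = f["dset_name"][()]          # decompression happens via the registered plugin
    print(arr.shape, arr.dtype)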
So I don't think the problem comes from the compression per se. Perhaps your dataset is somehow malformed and the Blosc2 filter is less lenient to it than the gzip filter?
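One quick way to check is to inspect the chunk layout and filter pipeline from Python (a sketch; adjust the placeholder file and dataset names):

import h5py
import hdf5plugin  # makes the Blosc2 filter (id 32026) available to HDF5

with h5py.File("data.h5", "r") as f:          # placeholder names
    dset = f["dset_name"]
    print("shape :", dset.shape)
    print("chunks:", dset.chunks)             # chunk shape as stored in the file
    dcpl = dset.id.get_create_plist()         # dataset creation property list
    for i in range(dcpl.get_nfilters()):
        print("filter:", dcpl.get_filter(i))  # (filter id, flags, cd_values, name)

If the stored chunk shape looks inconsistent with the dataset shape, that would point to the dataset rather than to h5web's filter support.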