Merged
Bumps [pypa/cibuildwheel](https://github.com/pypa/cibuildwheel) from 3.3 to 3.4. - [Release notes](https://github.com/pypa/cibuildwheel/releases) - [Changelog](https://github.com/pypa/cibuildwheel/blob/main/docs/changelog.md) - [Commits](pypa/cibuildwheel@v3.3...v3.4) --- updated-dependencies: - dependency-name: pypa/cibuildwheel dependency-version: '3.4' dependency-type: direct:production update-type: version-update:semver-minor ... Signed-off-by: dependabot[bot] <support@github.com>
The 'NDArray.save' method does not pass 'cparams' to the parent class 'copy' method;
consequently, the array is recompressed with the default 'cparams'.
To observe the bug run the script below.
=================================================================
import os
import blosc2
import numpy as np
a = np.arange(10_000_000)
a = a * a
cparams = blosc2.CParams(
    codec=blosc2.Codec.ZSTD,
    clevel=9,
    filters=[blosc2.Filter.BITSHUFFLE],
)
ba = blosc2.asarray(a, cparams=cparams)
print(f"Blosc2 memory: size {ba.cbytes}\t cratio: {ba.cratio}")
outdir = "cache/"
prefix = "save"
fname = outdir + prefix + ".b2nd"
ba.save(fname, mode="w")
fsize = os.path.getsize(fname)
print(f"Blosc2 array save:\t saved file size = {fsize}\t cratio: {a.nbytes/fsize}")
=================================================================
You should see
~~~~~
Blosc2 memory: size 4370284 cratio: 18.74477722729232
Blosc2 array save: saved file size = 12370369 cratio: 6.467066584675041
~~~~~
ba.cbytes ---> 4370284
ba.cratio ---> 18.3
That is, the array in memory has 'cbytes=4370284',
but the saved file is 12370369 bytes, which gives a 'cratio' of about 6.5.
Also, if the array is loaded back from the file, it is easy to see
that its 'cparams' differ from the original array's and have been reset
to the defaults.
After the patch, the in-memory 'cbytes' closely matches the saved file size,
and the 'cparams' are preserved:
~~~~~
Blosc2 memory: size 4370284 cratio: 18.74477722729232
Blosc2 array save: saved file size = 4370051 cratio: 18.30642251085856
~~~~~
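The bug pattern can be illustrated with a minimal, self-contained sketch. The class and method names below ('SimpleArray', 'SavedArray', 'save_buggy', 'save_fixed') are hypothetical, not the real blosc2 API; the sketch only shows how a subclass's 'save' that forgets to forward its own compression parameters to the parent 'copy' silently falls back to the defaults.

```python
# Hypothetical, simplified illustration of the bug pattern; these names
# are NOT the real blosc2 API.
DEFAULT_CPARAMS = {"codec": "blosclz", "clevel": 1}


class SimpleArray:
    def __init__(self, data, cparams=None):
        self.data = data
        self.cparams = cparams or dict(DEFAULT_CPARAMS)

    def copy(self, cparams=None, **kwargs):
        # The parent copy() recompresses with whatever cparams it is given;
        # if None, the defaults are used.
        return SimpleArray(list(self.data), cparams=cparams)


class SavedArray(SimpleArray):
    def save_buggy(self, urlpath):
        # BUG: cparams is not forwarded, so copy() falls back to defaults.
        return self.copy(urlpath=urlpath)

    def save_fixed(self, urlpath):
        # FIX: forward the array's own cparams to the parent copy().
        return self.copy(cparams=self.cparams, urlpath=urlpath)


arr = SavedArray([1, 2, 3], cparams={"codec": "zstd", "clevel": 9})
print(arr.save_buggy("a.b2nd").cparams)  # default codec/clevel, not the user's
print(arr.save_fixed("a.b2nd").cparams)  # the user's zstd/clevel=9 settings
```

The one-line nature of the fix matches the report: only the forwarding of the compression parameters changes, not the copy logic itself.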
fixing NDArray save method to use its own compression parameters
…uildwheel-3.4 Bump pypa/cibuildwheel from 3.3 to 3.4
First implementation of a VLArray store
Preserve user vlmeta when BatchStore recreates its empty backing SChunk during initial layout inference, avoid persisting empty batch_lengths metadata that breaks vlmeta.getall() on empty stores, and keep user meta/vlmeta when copy(storage=...) is used. Add BatchStore regression tests covering: - vlmeta preservation during inferred layout initialization - clear()/delete-last on empty stores - metadata preservation on copy(storage=...)
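The vlmeta-preservation fix described above follows a common pattern: snapshot user metadata before recreating the backing storage, then restore it afterwards. A toy sketch, with hypothetical names ('ToyStore', '_recreate_backing') that are not the real blosc2 BatchStore API:

```python
# Toy illustration of the metadata-preservation pattern; the class and
# method names are hypothetical, not the real blosc2 API.
class ToyStore:
    def __init__(self):
        self.vlmeta = {}   # user-supplied variable-length metadata
        self.backing = []  # backing storage

    def _recreate_backing(self):
        # Recreating the backing storage (e.g. during initial layout
        # inference) must not discard metadata the user already attached.
        saved_vlmeta = dict(self.vlmeta)
        self.backing = []           # rebuild the empty backing store
        self.vlmeta = saved_vlmeta  # restore user metadata afterwards


store = ToyStore()
store.vlmeta["author"] = "me"
store._recreate_backing()
print(store.vlmeta)  # {'author': 'me'}, metadata survives recreation
```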
Batch store
# Conflicts:
#	ANNOUNCE.rst
#	RELEASE_NOTES.md
#	pyproject.toml
#	src/blosc2/version.py
Multithreaded matrix multiplication.
Unfortunately, running the benchmark gives:
so the multithreaded version is only faster for the specific chunk/block combinations used in the benchmark.
Also: