BigArray: An ndarray-like on top of BigFile memory mappings
I.e. something like numpy.memmap for numpy.ndarray and OS files. The whole BigArray cannot be used as a drop-in replacement for numpy arrays, but BigArray _slices_ are real ndarrays and can be used everywhere an ndarray can be used, including in C/Fortran code. Slice size is limited by the mapping-size (= address-space size) limit, i.e. to at most ~127TB on Linux/amd64.

Changes to BigArray memory are changes to the underlying bigfile memory mapping, and as such can be discarded or saved back to the bigfile via the mapping's (= BigFileH's) dirty discard/writeout interface. For the same reason, the total amount of changed (dirty) memory is limited by the amount of physical RAM.
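Rough usage sketch (illustrative only; the import paths, the BigArray(shape, dtype, fileh) constructor, fileh_open(), the loadblk/storeblk hooks, the ZeroFile helper and the dirty_discard()/dirty_writeout() names are assumptions drawn from the description above, not a definitive API reference):

    # Minimal sketch of intended BigArray usage (names/signatures are assumptions).
    from numpy import float64
    from bigfile import BigFile
    from bigarray import BigArray

    class ZeroFile(BigFile):
        # hypothetical backend: loads return zeros, stores are ignored
        def loadblk(self, blk, buf):
            pass        # buf starts zero-filled, so "all zeros" needs no work
        def storeblk(self, blk, buf):
            pass

    f     = ZeroFile(2*1024*1024)     # 2MB block size (example value)
    fileh = f.fileh_open()            # BigFileH - the memory-mapping handle

    A = BigArray((10**9,), float64, fileh)  # 8GB worth of elements; nothing loaded yet

    a = A[:10**6]       # a slice is a real numpy.ndarray backed by the mapping
    a[:] = 1.0          # writes land in the mapping's dirty pages (RAM-limited)

    fileh.dirty_discard()                   # throw the changes away
    # ... or push them back to the file, e.g.
    # fileh.dirty_writeout(WRITEOUT_STORE)  # flag name is an assumption

The point is that `a` above is an ordinary ndarray and can be handed to any numpy/C/Fortran routine, while `A` itself is only ever accessed through such slices, never materialized as a whole.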
Files added:
- bigarray/__init__.py (new file, mode 100644)
- bigarray/tests/__init__.py (new symlink, mode 120000)
- bigarray/tests/test_basic.py (new file, mode 100644)