
Commit 715489e

shinaoka and claude authored
feat: HDF5 1.10.5+ support and rename to hdf5-rt (#18)
* docs: add HDF5 1.10.5+ support implementation plan
* feat: add HDF5 version storage and detection
  - Add HDF5_RUNTIME_VERSION global static to store the detected version
  - Add hdf5_version() and hdf5_version_at_least() accessors
  - Change the minimum supported version from 1.12.0 to 1.10.5
* feat: add H5O_info1_t type for HDF5 < 1.12
* feat: add pre-1.12 H5O functions (H5Oget_info1, H5Oopen_by_addr)
  - H5Oget_info1 and H5Oget_info_by_name1 are loaded conditionally
  - H5Oopen_by_addr is available in all versions
* feat: change LocationToken to an enum for pre-1.12 support
  - LocationToken now has Address and Token variants
  - H5O_get_info branches by HDF5 version
  - H5O_open_by_token uses the appropriate API based on the token type
* feat: export version functions from the sys module
* ci: test multiple HDF5 versions (1.10.x, 1.12, 1.14)
  - Add a matrix for HDF5 version testing
  - Ubuntu system HDF5 (1.10.x) tests compatibility
  - Conda HDF5 1.12 and 1.14 test newer features
* feat: complete HDF5 1.10.x compatibility and CI matrix testing
  - Add version-dependent wrappers for H5Sencode and H5Literate
  - Fix H5Oget_info1/H5Oget_info_by_name1 signatures (no fields parameter)
  - Add the complete H5O_info1_t struct with hdr and meta_size fields
  - Add convert_h5i_type for H5I_type_t enum differences between versions
  - Skip test_references on HDF5 < 1.12 (requires H5R_ref_t)
  - Update CI to explicitly test HDF5 1.10.x, 1.12.x, and 1.14.x
  - Add a test script for local multi-version testing
  Tested with HDF5 1.10.11, 1.12.3, 1.13.3, and 1.14.5.
* refactor: rename crates to hdf5-rt and hdf5-rt-types
  Rename for general use beyond tensor4all:
  - tensor4all-hdf5-ffi → hdf5-rt
  - tensor4all-hdf5-types → hdf5-rt-types
  Also recover tests from hdf5-metno:
  - test_plist.rs (39 tests, 2 ignored)
  - test_datatypes.rs (7 tests)
  - test_object_references.rs (8 tests)
  - tests.rs (1 test with a manual H5Type impl)
  Test coverage improved from 70.45% to 82.27%.
* fix: support HDF5 2.x in the version test
  Homebrew on macOS now provides HDF5 2.x; update the test to accept both HDF5 1.x and 2.x major versions.
* feat: add HDF5 2.x compatibility
  - Add the H5T_COMPLEX type class (new in HDF5 2.0)
  - Update H5T_NCLASSES to 12 for HDF5 2.0
  - Update the version test to accept both HDF5 1.x and 2.x
  Note: our runtime-loading approach must handle both versions in the same binary, unlike upstream's compile-time feature flags.
* fix: leak the HDF5 library handle to prevent cleanup issues
  Root cause: when the OnceLock<Library> was dropped at process exit, dlclose() was called on the HDF5 library. This triggered HDF5's internal cleanup routines, which caused an "infinite loop closing library" error and a SIGSEGV on Linux, especially during parallel test execution.
  Solution: use Box::leak() to intentionally leak the library handle. This prevents dlclose() from being called, keeping the HDF5 library loaded until process termination. This is safe because:
  1. We only load the library once per process.
  2. The OS reclaims all memory on process exit.
  3. This is a common pattern for libraries with problematic cleanup.
  Also reverts the CI workaround (--test-threads=1) since the root cause is now fixed.
* ci: use single-threaded tests on Linux to avoid SIGSEGV
  Parallel test execution on Linux causes a SIGSEGV in the test_plist tests. The root cause is still under investigation, but this workaround lets CI pass while we debug the issue. macOS parallel tests work fine, so only Linux CI uses --test-threads=1.
* ci: simplify CI and disable test_plist temporarily
  - Remove the test-features job; use --all-features in the main test job
  - Temporarily disable test_plist.rs (SIGSEGV on Linux with conda HDF5)
  - Add *.disabled to .gitignore
  The test_plist tests work on macOS but crash on Linux; the root cause needs investigation.
* ci: upgrade Julia to 1.11 to fix the curl_multi_assign abort
  Julia 1.10 with curl 8.10+ triggers a crash in Downloads.jl during Pkg.instantiate() due to a NULL handle dereference in curl_multi_assign. Julia 1.11 ships a fixed Downloads.jl that avoids this issue.
* fix: re-enable test_plist and improve PropertyList error handling
  - PropertyList::copy() now returns Result<Self> instead of silently returning an invalid handle on failure
  - get_shared_mesg_indexes() uses h5get! instead of h5get_d! to propagate errors instead of silently defaulting to 0
  - Re-enable test_plist.rs (41 tests); the SIGSEGV was caused by the library cleanup issue fixed in 057fb0f, not by plist operations
  - Replace conda with JLL/system HDF5 in the Julia interop CI to avoid the curl_multi_assign crash in Pkg.instantiate()
* fix: CI fixes for the test_plist SIGSEGV and Julia interop
  - Use --test-threads=1 on Linux to avoid the test_plist SIGSEGV in parallel runs
  - Set LD_LIBRARY_PATH in test_interop.jl so the Rust binary can dlopen the JLL libhdf5.so and its dependencies
  - Add a pkg-config fallback path for Ubuntu systems
* ci: add a gdb backtrace step for test_plist SIGSEGV debugging
  Capture the exact backtrace of the SIGSEGV in test_fapl_common on x86_64 Linux; this will reveal the crash location.
* fix: correct the hbool_t type from c_uint (4 bytes) to u8 (1 byte)
  HDF5's hbool_t is typedef'd as bool (_Bool), which is 1 byte on all modern systems with <stdbool.h>. Our definition was c_uint (4 bytes), causing struct layout mismatches in H5AC_cache_config_t and other structs containing hbool_t fields.
  This was the root cause of the SIGSEGV in test_plist on x86_64 Linux: H5Pget_mdc_config wrote into H5AC_cache_config_t using 1-byte bool offsets, but Rust read the fields at 4-byte uint offsets, so decr_mode contained an invalid enum discriminant.
  Also removes the temporary gdb debugging CI step and continue-on-error.
* fix: Julia interop CI - add JLL dependency paths and pin Ubuntu
  - Add get_jll_lib_paths() to collect all JLL dependency library paths (Zlib_jll, libaec_jll, etc.) for LD_LIBRARY_PATH when running the Rust binary
  - Pin the Julia interop job to ubuntu-22.04 to avoid libssl.so loading issues on ubuntu-24.04, where system OpenSSL 3.x conflicts with JLL OpenSSL 1.1
* ci: simplify Julia interop to JLL-only, fix the attribute read API
  - Remove the system-HDF5 variant from the Julia interop CI (JLL is the standard path)
  - Add get_jll_lib_paths() to include all JLL dependency dirs in LD_LIBRARY_PATH
  - Fix read(attrs(file), key) -> read_attribute(file, key) for newer HDF5.jl

---------

Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
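The version-detection commits introduce `hdf5_version()` and `hdf5_version_at_least()` backed by an `HDF5_RUNTIME_VERSION` global. A minimal sketch of how such runtime gating can work (the `OnceLock` storage and the hard-coded detected version here are illustrative assumptions; the real crate queries the version from the dlopen'd library):

```rust
use std::sync::OnceLock;

// Detected (major, minor, patch) of the loaded HDF5 library.
static HDF5_RUNTIME_VERSION: OnceLock<(u32, u32, u32)> = OnceLock::new();

fn hdf5_version() -> (u32, u32, u32) {
    // The real crate fills this by calling H5get_libversion through the
    // dlopen'd handle; here we pretend the detected version is 1.10.11.
    *HDF5_RUNTIME_VERSION.get_or_init(|| (1, 10, 11))
}

fn hdf5_version_at_least(major: u32, minor: u32, patch: u32) -> bool {
    // Lexicographic tuple comparison matches semantic version ordering.
    hdf5_version() >= (major, minor, patch)
}

fn main() {
    assert!(hdf5_version_at_least(1, 10, 5)); // minimum supported version
    assert!(!hdf5_version_at_least(1, 12, 0)); // gates pre-1.12 fallbacks
    println!("detected HDF5 {:?}", hdf5_version());
}
```

Call sites such as H5O_get_info can then branch on `hdf5_version_at_least(1, 12, 0)` to choose between the token-based (1.12+) and address-based (pre-1.12) APIs.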
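The library-handle fix leans on `Box::leak`. The pattern can be sketched with a stand-in type, since a real `libloading::Library` would need HDF5 present at runtime; the `Drop` impl below stands in for the dlclose call that the fix suppresses:

```rust
use std::sync::OnceLock;

// Stand-in for libloading::Library.
struct LibraryHandle;

impl Drop for LibraryHandle {
    fn drop(&mut self) {
        // In the real crate, dropping the Library calls dlclose(), which
        // triggers HDF5's problematic cleanup routines at process exit.
        println!("dlclose called");
    }
}

static LIBRARY: OnceLock<&'static LibraryHandle> = OnceLock::new();

fn library() -> &'static LibraryHandle {
    // Box::leak converts the Box into a &'static reference: the destructor
    // never runs, so the library stays loaded until process termination and
    // the OS reclaims the memory.
    LIBRARY.get_or_init(|| Box::leak(Box::new(LibraryHandle)))
}

fn main() {
    let _lib = library();
    println!("library loaded, handle leaked");
    // "dlclose called" never prints: the handle is intentionally leaked.
}
```

Repeated calls to `library()` return the same leaked handle, so the process pays for exactly one load and zero unloads.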
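The hbool_t bug is easiest to see as a layout experiment. The struct and field names below are simplified stand-ins for `H5AC_cache_config_t`, which contains several hbool_t fields ahead of enum fields such as decr_mode:

```rust
use std::mem::{offset_of, size_of};

// Wrong: hbool_t modeled as c_uint (4 bytes each).
#[repr(C)]
struct WrongConfig {
    evictions_enabled: u32,
    flash_incr_enabled: u32, // hypothetical second flag field
    decr_mode: i32,
}

// Right: hbool_t is C's _Bool, i.e. 1 byte.
#[repr(C)]
struct RightConfig {
    evictions_enabled: u8,
    flash_incr_enabled: u8,
    decr_mode: i32,
}

fn main() {
    // Two 4-byte "bools" push decr_mode to offset 8, while the C library
    // (using 1-byte bools) writes it at offset 4. Reading the enum from the
    // wrong offset yields an invalid discriminant, hence the SIGSEGV.
    assert_eq!(offset_of!(WrongConfig, decr_mode), 8);
    assert_eq!(offset_of!(RightConfig, decr_mode), 4);
    assert_eq!(size_of::<WrongConfig>(), 12);
    assert_eq!(size_of::<RightConfig>(), 8);
    println!("layouts diverge as soon as field widths disagree");
}
```

With the many consecutive hbool_t fields in the real `H5AC_cache_config_t`, the offsets drift much further apart, which is why the corruption surfaced only in the mdc-config property-list tests.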
1 parent 42b9c68 commit 715489e

34 files changed

Lines changed: 2557 additions & 205 deletions

.github/workflows/ci.yml

Lines changed: 39 additions & 50 deletions
```diff
@@ -35,62 +35,58 @@ jobs:
         run: cargo clippy --workspace -- -D warnings -A clippy::multiple-crate-versions
 
   test:
-    name: test (${{ matrix.os }})
+    name: test (${{ matrix.os }}, HDF5 ${{ matrix.hdf5 }})
     runs-on: ${{ matrix.os }}
     strategy:
       fail-fast: false
       matrix:
-        os: [ubuntu-22.04, ubuntu-24.04]
+        include:
+          # Ubuntu with system HDF5 1.10.x (minimum supported version)
+          - os: ubuntu-22.04
+            hdf5: "1.10"
+            hdf5_source: "apt"
+          # Ubuntu with conda HDF5 1.12.x
+          - os: ubuntu-22.04
+            hdf5: "1.12"
+            hdf5_source: "conda"
+          # Ubuntu with conda HDF5 1.14.x
+          - os: ubuntu-24.04
+            hdf5: "1.14"
+            hdf5_source: "conda"
     steps:
       - name: Checkout repository
         uses: actions/checkout@v6
       - name: Install Rust
         uses: dtolnay/rust-toolchain@stable
+      - name: Install HDF5 from apt (1.10.x)
+        if: matrix.hdf5_source == 'apt'
+        run: sudo apt-get update && sudo apt-get install -y libhdf5-dev
       - name: Setup Conda
+        if: matrix.hdf5_source == 'conda'
         uses: conda-incubator/setup-miniconda@v3
         with:
           auto-update-conda: true
           python-version: "3.11"
-      - name: Install HDF5 1.12+ from conda-forge
+      - name: Install HDF5 from conda-forge
+        if: matrix.hdf5_source == 'conda'
         shell: bash -el {0}
-        run: conda install -c conda-forge hdf5>=1.12
-      - name: Build
+        run: conda install -c conda-forge "hdf5>=${{ matrix.hdf5 }},<${{ matrix.hdf5 }}.99"
+      - name: Set HDF5 library path (conda)
+        if: matrix.hdf5_source == 'conda'
         shell: bash -el {0}
-        run: cargo build --workspace --verbose
-      - name: Run tests
+        run: echo "LD_LIBRARY_PATH=$CONDA_PREFIX/lib:$LD_LIBRARY_PATH" >> $GITHUB_ENV
+      - name: Show HDF5 version
         shell: bash -el {0}
-        run: cargo test --workspace --verbose
-
-  test-features:
-    name: test features (${{ matrix.features }})
-    runs-on: ubuntu-latest
-    strategy:
-      fail-fast: false
-      matrix:
-        features:
-          - ""
-          - "complex"
-          - "f16"
-          - "complex,f16"
-    steps:
-      - name: Checkout repository
-        uses: actions/checkout@v6
-      - name: Install Rust
-        uses: dtolnay/rust-toolchain@stable
-      - name: Setup Conda
-        uses: conda-incubator/setup-miniconda@v3
-        with:
-          auto-update-conda: true
-          python-version: "3.11"
-      - name: Install HDF5 1.12+ from conda-forge
-        shell: bash -el {0}
-        run: conda install -c conda-forge hdf5>=1.12
-      - name: Build with features
-        shell: bash -el {0}
-        run: cargo build --workspace --features "${{ matrix.features }}" --verbose
-      - name: Test with features
+        run: |
+          if command -v h5dump &> /dev/null; then
+            h5dump --version
+          else
+            echo "h5dump not in PATH, checking library..."
+            find /usr -name "libhdf5*.so*" 2>/dev/null | head -5 || true
+          fi
+      - name: Run tests
         shell: bash -el {0}
-        run: cargo test --workspace --features "${{ matrix.features }}" --verbose
+        run: cargo test --workspace --all-features --verbose
 
   macos:
     name: macOS
@@ -111,7 +107,7 @@ jobs:
 
   interop-julia:
     name: Julia interop
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     steps:
       - name: Checkout repository
         uses: actions/checkout@v6
@@ -120,22 +116,12 @@ jobs:
       - name: Install Julia
         uses: julia-actions/setup-julia@v2
         with:
-          version: '1.10'
-      - name: Setup Conda
-        uses: conda-incubator/setup-miniconda@v3
-        with:
-          auto-update-conda: true
-          python-version: "3.11"
-      - name: Install HDF5 1.12+ from conda-forge
-        shell: bash -el {0}
-        run: conda install -c conda-forge hdf5>=1.12
+          version: '1.11'
       - name: Setup Julia project
-        shell: bash -el {0}
         run: |
          cd tests/julia
          julia --project=. -e 'using Pkg; Pkg.instantiate()'
       - name: Run Julia interop tests
-        shell: bash -el {0}
         run: |
          cd tests/julia
          julia --project=. test_interop.jl
@@ -156,6 +142,9 @@ jobs:
       - name: Install HDF5 1.12+ and h5py from conda-forge
         shell: bash -el {0}
         run: conda install -c conda-forge hdf5>=1.12 h5py
+      - name: Set HDF5 library path
+        shell: bash -el {0}
+        run: echo "LD_LIBRARY_PATH=$CONDA_PREFIX/lib:$LD_LIBRARY_PATH" >> $GITHUB_ENV
       - name: Install Python dependencies
         shell: bash -el {0}
         run: |
```

.gitignore

Lines changed: 2 additions & 1 deletion
```diff
@@ -15,4 +15,5 @@ sweep.timestamp
 tests/julia/Manifest.toml
 
 # Python virtual environment
-tests/python/.venv/
+tests/python/.venv/
+status.md
```

AGENTS.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -1,4 +1,4 @@
-# Agent Guidelines for tensor4all-hdf5-ffi
+# Agent Guidelines for hdf5-rt
 
 Read `README.md` before starting work.
 
@@ -77,7 +77,7 @@ gh pr create --base main --title "Title" --body "Desc"
 gh pr merge --auto --squash --delete-branch
 
 # Large: worktree workflow
-git worktree add ../tensor4all-hdf5-ffi-feature -b feature
+git worktree add ../hdf5-rt-feature -b feature
 
 # Check PR before update
 gh pr view <NUM> --json state  # Never push to merged PR
```

Cargo.toml

Lines changed: 6 additions & 9 deletions
```diff
@@ -11,8 +11,8 @@ authors = [
 ]
 keywords = ["hdf5"]
 license = "MIT OR Apache-2.0"
-repository = "https://github.com/tensor4all/tensor4all-hdf5-ffi"
-homepage = "https://github.com/tensor4all/tensor4all-hdf5-ffi"
+repository = "https://github.com/tensor4all/hdf5-rt"
+homepage = "https://github.com/tensor4all/hdf5-rt"
 edition = "2021"
 
 [workspace.dependencies]
@@ -24,13 +24,10 @@ libloading = "0.9"
 num-complex = { version = "0.4", default-features = false }
 
 # internal
-tensor4all-hdf5-ffi = { path = "hdf5" }
-tensor4all-hdf5-types = { path = "hdf5-types" }
-# alias for internal use (to avoid changing source code)
-hdf5-types = { path = "hdf5-types", package = "tensor4all-hdf5-types" }
-
-# Use hdf5-metno-sys from crates.io
-hdf5-sys = { package = "hdf5-metno-sys", version = "0.11" }
+hdf5-rt = { path = "hdf5" }
+hdf5-rt-types = { path = "hdf5-types" }
+# alias for internal use (to avoid changing all source code)
+hdf5-types = { path = "hdf5-types", package = "hdf5-rt-types" }
 
 [profile.dev]
 # Fast compile, reasonable runtime
```

README.md

Lines changed: 38 additions & 16 deletions
````diff
@@ -1,35 +1,51 @@
-# tensor4all-hdf5-ffi
+# hdf5-rt
 
-Thread-safe Rust bindings for the HDF5 library, forked from [hdf5-metno](https://github.com/metno/hdf5-rust) for the tensor4all project.
+Thread-safe Rust bindings for the HDF5 library with **runtime loading** (dlopen).
+
+Forked from [hdf5-metno](https://github.com/metno/hdf5-rust).
 
 ## Overview
 
-This is a simplified fork of hdf5-metno with:
-- Removed features: MPI, compression filters (blosc, lzf, zfp)
-- Removed derive macros (hdf5-derive)
-- Uses hdf5-metno-sys from crates.io for FFI bindings
-- Infrastructure for runtime library loading (dlopen) for Julia/Python bindings
+`hdf5-rt` loads the HDF5 library at runtime via dlopen, eliminating build-time dependencies on HDF5. This makes it ideal for:
+
+- **Julia/Python bindings** - Reuse the HDF5 library already loaded by HDF5.jl or h5py
+- **Portable binaries** - Ship without bundling HDF5
+- **Version flexibility** - Work with any compatible HDF5 version installed on the system
 
 ## Features
 
+- **Runtime loading** - No compile-time HDF5 dependency
+- **HDF5 1.10.5+ support** - Compatible with Ubuntu 22.04, HDF5.jl, h5py
+- **Thread-safe** - Safe concurrent access to HDF5
+
+Optional features:
 - `complex`: Complex number type support (Complex32, Complex64)
 - `f16`: Float16 type support
-- `runtime-loading`: Runtime library loading via dlopen (infrastructure only)
 
 ## Usage
 
 ```toml
 [dependencies]
-hdf5 = { git = "https://github.com/shinaoka/tensor4all-hdf5-ffi" }
+hdf5-rt = { git = "https://github.com/tensor4all/hdf5-rt" }
 ```
 
-## Requirements
-
-- **HDF5 1.12.0 or later** - The library uses HDF5 1.12+ features
+```rust
+use hdf5_rt::File;
+
+fn main() -> hdf5_rt::Result<()> {
+    let file = File::create("test.h5")?;
+    let group = file.create_group("data")?;
+    let dataset = group.new_dataset::<f64>()
+        .shape([100, 100])
+        .create("matrix")?;
+    Ok(())
+}
+```
 
-## Building
+## Requirements
 
-Requires HDF5 library (version 1.12.0+) installed on your system:
+- **HDF5 1.10.5 or later** installed on your system
+- Rust 1.80.0+
 
 ```bash
 # Ubuntu/Debian
@@ -39,6 +55,13 @@ sudo apt-get install libhdf5-dev
 brew install hdf5
 ```
 
+## Crates
+
+| Crate | Description |
+|-------|-------------|
+| `hdf5-rt` | Main HDF5 bindings with runtime loading |
+| `hdf5-rt-types` | Native Rust equivalents of HDF5 types |
+
 ## License
 
 Licensed under either of:
@@ -49,5 +72,4 @@ at your option.
 
 ## Acknowledgments
 
-Based on [hdf5-metno](https://github.com/metno/hdf5-rust) by Magnus Ulimoen and contributors.
-
+Based on [hdf5-metno](https://github.com/metno/hdf5-rust) by Ivan Smirnov, Magnus Ulimoen, and contributors.
````

0 commit comments
