
Commit 57fd168

Release v0.1.0: hyper-models model zoo
- ONNX-based inference (torch-free runtime)
- 4 models: hycoclip-vit-s/b, meru-vit-s/b
- Auto-download from HuggingFace Hub
- PIL image preprocessing
- CI/CD with GitHub Actions
- Trusted Publisher for PyPI
1 parent 7489595 commit 57fd168

19 files changed

Lines changed: 578 additions & 539 deletions


.github/workflows/ci.yml

Lines changed: 64 additions & 0 deletions
```yaml
name: CI

on:
  pull_request:
  push:
    branches:
      - main

jobs:
  lint:
    name: Lint
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v6

      - name: Set up Python
        uses: actions/setup-python@v6
        with:
          python-version: "3.12"

      - name: Install uv
        uses: astral-sh/setup-uv@v7
        with:
          enable-cache: true

      - name: Install dependencies
        run: uv sync --extra dev

      - name: Ruff lint
        run: uv run ruff check .
        continue-on-error: true

      - name: Ruff format
        run: uv run ruff format --check .
        continue-on-error: true

  test:
    name: Tests (Python ${{ matrix.python-version }})
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10", "3.11", "3.12"]

    steps:
      - name: Checkout
        uses: actions/checkout@v6

      - name: Set up Python
        uses: actions/setup-python@v6
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install uv
        uses: astral-sh/setup-uv@v7
        with:
          enable-cache: true

      - name: Install dependencies
        run: uv sync --extra dev

      - name: Run tests
        run: uv run pytest -v
```

.github/workflows/release.yml

Lines changed: 71 additions & 0 deletions
```yaml
name: Release

on:
  push:
    tags:
      # Publish on any tag starting with a `v`, e.g. v0.1.0
      - v*
  workflow_dispatch:

jobs:
  build:
    name: Build distributions
    runs-on: ubuntu-latest
    permissions:
      contents: read
    steps:
      - name: Checkout
        uses: actions/checkout@v6
        with:
          persist-credentials: false

      - name: Set up Python
        uses: actions/setup-python@v6
        with:
          python-version: "3.12"

      - name: Install uv
        uses: astral-sh/setup-uv@v7
        with:
          enable-cache: true

      - name: Build
        run: uv build --no-sources

      - name: Smoke test (wheel)
        run: uv run --isolated --no-project --with dist/*.whl python -c "import hyper_models; print(hyper_models.__version__)"

      - name: Smoke test (source distribution)
        run: uv run --isolated --no-project --with dist/*.tar.gz python -c "import hyper_models; print(hyper_models.__version__)"

      - name: Upload dist artifacts
        uses: actions/upload-artifact@v5
        with:
          name: python-package-distributions
          path: dist/
          if-no-files-found: error

  pypi:
    name: Publish to PyPI
    needs: build
    if: startsWith(github.ref, 'refs/tags/v')
    runs-on: ubuntu-latest
    environment:
      # Create this environment in the GitHub repository under Settings -> Environments
      name: pypi
      url: https://pypi.org/p/hyper-models
    permissions:
      # IMPORTANT: this permission is mandatory for trusted publishing
      id-token: write
      contents: read
    steps:
      - name: Download dist artifacts
        uses: actions/download-artifact@v6
        with:
          name: python-package-distributions
          path: dist/

      - name: Publish
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          packages-dir: dist/
```

.gitignore

Lines changed: 2 additions & 1 deletion
```diff
@@ -59,7 +59,7 @@ hycoclip_repo/
 # Package Management
 # -----------------------------------------------------------------------------
 # uv lockfile - can be committed for reproducibility, but optional
-# uv.lock
+uv.lock
 
 # pip
 pip-log.txt
@@ -126,3 +126,4 @@ site/
 .cache/
 *.hf/
 AGENTS.md
+.specstory/
```

LICENSE

Lines changed: 21 additions & 0 deletions
MIT License

Copyright (c) 2025 Hyper3Labs

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md

Lines changed: 34 additions & 25 deletions
````diff
@@ -7,8 +7,8 @@
 </p>
 
 <p align="center">
-  <a href="https://huggingface.co/collections/hyperview-org/hyper-models-67900e48542fa2ea29a26684">
-    <img src="https://img.shields.io/badge/🤗_Models-Hugging_Face-orange" alt="Hugging Face">
+  <a href="https://huggingface.co/mnm-matin/hyperbolic-clip">
+    <img src="https://img.shields.io/badge/🤗_Models-hyperbolic--clip-orange" alt="Hugging Face">
   </a>
   <a href="LICENSE">
     <img src="https://img.shields.io/badge/License-MIT-blue" alt="License: MIT">
@@ -21,7 +21,7 @@
 
 - **Standardized access** to non-Euclidean embedding models
 - **Torch-free runtime** via ONNX (models published to Hugging Face Hub)
-- **Simple API** – `load()` and `encode()`
+- **Simple API** – `load()` and `encode_images()`
 
 ## Installation
 
@@ -33,48 +33,57 @@ pip install hyper-models
 
 ```python
 import hyper_models
+from PIL import Image
 
 # List available models
 hyper_models.list_models()
-# ['hycoclip-vit-s', 'hycoclip-vit-b', 'meru-vit-s', ...]
+# ['hycoclip-vit-s', 'hycoclip-vit-b', 'meru-vit-s', 'meru-vit-b']
 
 # Load model (auto-downloads from Hugging Face Hub)
 model = hyper_models.load("hycoclip-vit-s")
+model.geometry  # 'hyperboloid'
+model.dim       # 513
 
-# Encode images
-embeddings = model.encode(images)  # (B, D) ndarray
+# Encode PIL images
+images = [Image.open("image.jpg")]
+embeddings = model.encode_images(images)  # (1, 513) ndarray
 
-# Metadata
-model.geometry  # 'hyperboloid'
-model.dim  # 512
+# Get model info
+info = hyper_models.get_model_info("hycoclip-vit-s")
+info.hub_id   # 'mnm-matin/hyperbolic-clip'
+info.license  # 'CC-BY-NC'
+
+# Low-level: preprocess images yourself
+batch = hyper_models.preprocess_images(images)  # (B, 3, 224, 224)
+embeddings = model.encode(batch)
 ```
 
 ## Models
 
 ### Hyperbolic
 
-| Model | Available | Paper | License | Code |
-|-------|:---------:|-------|---------|------|
-| `hycoclip-vit-s` | | [ICLR 2025](https://arxiv.org/abs/2410.06912) | CC-BY-NC | [PalAvik/hycoclip](https://github.com/PalAvik/hycoclip) |
-| `hycoclip-vit-b` | | [ICLR 2025](https://arxiv.org/abs/2410.06912) | CC-BY-NC | [PalAvik/hycoclip](https://github.com/PalAvik/hycoclip) |
-| `meru-vit-s` | | [ICML 2023](https://arxiv.org/abs/2304.09172) | CC-BY-NC | [facebookresearch/meru](https://github.com/facebookresearch/meru) |
-| `meru-vit-b` | | [ICML 2023](https://arxiv.org/abs/2304.09172) | CC-BY-NC | [facebookresearch/meru](https://github.com/facebookresearch/meru) |
-| `hyp-vit` | | [CVPR 2022](https://arxiv.org/abs/2203.10833) | MIT | [htdt/hyp_metric](https://github.com/htdt/hyp_metric) |
-| `hie` | | [CVPR 2020](https://arxiv.org/abs/1904.02239) | MIT | [leymir/hyperbolic-image-embeddings](https://github.com/leymir/hyperbolic-image-embeddings) |
-| `hcnn` | | [ICLR 2024](https://openreview.net/forum?id=ekz1hN5QNh) | MIT | [kschwethelm/HyperbolicCV](https://github.com/kschwethelm/HyperbolicCV) |
+| Model | Available | Paper | Code |
+|-------|:---------:|-------|------|
+| `hycoclip-vit-s` | [![HF](https://img.shields.io/badge/🤗-HuggingFace-yellow)](https://huggingface.co/mnm-matin/hyperbolic-clip/tree/main/hycoclip-vit-s) | [ICLR 2025](https://arxiv.org/abs/2410.06912) | [PalAvik/hycoclip](https://github.com/PalAvik/hycoclip) |
+| `hycoclip-vit-b` | [![HF](https://img.shields.io/badge/🤗-HuggingFace-yellow)](https://huggingface.co/mnm-matin/hyperbolic-clip/tree/main/hycoclip-vit-b) | [ICLR 2025](https://arxiv.org/abs/2410.06912) | [PalAvik/hycoclip](https://github.com/PalAvik/hycoclip) |
+| `meru-vit-s` | [![HF](https://img.shields.io/badge/🤗-HuggingFace-yellow)](https://huggingface.co/mnm-matin/hyperbolic-clip/tree/main/meru-vit-s) | [ICML 2023](https://arxiv.org/abs/2304.09172) | [facebookresearch/meru](https://github.com/facebookresearch/meru) |
+| `meru-vit-b` | [![HF](https://img.shields.io/badge/🤗-HuggingFace-yellow)](https://huggingface.co/mnm-matin/hyperbolic-clip/tree/main/meru-vit-b) | [ICML 2023](https://arxiv.org/abs/2304.09172) | [facebookresearch/meru](https://github.com/facebookresearch/meru) |
+| `hyp-vit` | | [CVPR 2022](https://arxiv.org/abs/2203.10833) | [htdt/hyp_metric](https://github.com/htdt/hyp_metric) |
+| `hie` | | [CVPR 2020](https://arxiv.org/abs/1904.02239) | [leymir/hyperbolic-image-embeddings](https://github.com/leymir/hyperbolic-image-embeddings) |
+| `hcnn` | | [ICLR 2024](https://openreview.net/forum?id=ekz1hN5QNh) | [kschwethelm/HyperbolicCV](https://github.com/kschwethelm/HyperbolicCV) |
 
 ### Spherical
 
-| Model | Available | Paper | License | Code |
-|-------|:---------:|-------|---------|------|
-| `sphereface` | | [CVPR 2017](https://arxiv.org/abs/1704.08063) | MIT | [wy1iu/sphereface](https://github.com/wy1iu/sphereface) |
-| `arcface` | | [CVPR 2019](https://arxiv.org/abs/1801.07698) | MIT | [deepinsight/insightface](https://github.com/deepinsight/insightface) |
+| Model | Available | Paper | Code |
+|-------|:---------:|-------|------|
+| `sphereface` | | [CVPR 2017](https://arxiv.org/abs/1704.08063) | [wy1iu/sphereface](https://github.com/wy1iu/sphereface) |
+| `arcface` | | [CVPR 2019](https://arxiv.org/abs/1801.07698) | [deepinsight/insightface](https://github.com/deepinsight/insightface) |
 
 ### Product Manifolds
 
-| Model | Available | Paper | License | Code |
-|-------|:---------:|-------|---------|------|
-| `hyperbolics` | | [ICLR 2019](https://openreview.net/forum?id=HJxeWnCcF7) | MIT | [HazyResearch/hyperbolics](https://github.com/HazyResearch/hyperbolics) |
+| Model | Available | Paper | Code |
+|-------|:---------:|-------|------|
+| `hyperbolics` | | [ICLR 2019](https://openreview.net/forum?id=HJxeWnCcF7) | [HazyResearch/hyperbolics](https://github.com/HazyResearch/hyperbolics) |
 
 ## Export Tooling
 
````
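The quickstart above reports `model.geometry` as `'hyperboloid'` with 513-dimensional embeddings. For readers who want to compare such embeddings, here is a minimal NumPy sketch of the geodesic distance on the hyperboloid (Lorentz) model. It assumes unit curvature and a time-first coordinate layout; neither convention is confirmed by this commit, so treat it as illustrative:

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: -x0*y0 + <x_rest, y_rest>
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def hyperboloid_distance(x, y, curv=1.0):
    # Geodesic distance between points with <x, x>_L = -1/curv.
    # The max() guards against arccosh arguments dipping below 1 from rounding.
    inner = -curv * lorentz_inner(x, y)
    return np.arccosh(np.maximum(inner, 1.0)) / np.sqrt(curv)

# Sanity check on the unit hyperboloid x0^2 - x1^2 = 1: the point at
# "hyperbolic angle" t lies at distance t from the origin (1, 0).
origin = np.array([1.0, 0.0])
p = np.array([np.cosh(0.5), np.sinh(0.5)])
print(hyperboloid_distance(origin, p))  # ≈ 0.5
```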

export/hycoclip/export_onnx.py

Lines changed: 19 additions & 0 deletions
```diff
@@ -108,6 +108,25 @@ def forward(self, image):
         verbose=False,
     )
 
+    # Normalize external data filename to match the ONNX filename + ".data"
+    # This ensures compatibility when files are renamed (e.g., on HuggingFace Hub)
+    onnx = importlib.import_module("onnx")
+    external_data_helper = importlib.import_module("onnx.external_data_helper")
+
+    model = onnx.load(str(onnx_path), load_external_data=True)
+    external_data_helper.convert_model_to_external_data(
+        model,
+        all_tensors_to_one_file=True,
+        location=onnx_path.name + ".data",
+        size_threshold=1024,
+    )
+    onnx.save_model(model, str(onnx_path))
+
+    # Remove old .data file from torch.onnx.export (if different from normalized name)
+    old_data = onnx_path.with_suffix(".onnx.data")
+    if old_data.exists() and old_data.name != onnx_path.name + ".data":
+        old_data.unlink()
+
     print(f"Wrote ONNX: {onnx_path}")
 
 
```

export/hycoclip/hf/upload_to_hf.py

Lines changed: 39 additions & 13 deletions
```diff
@@ -238,7 +238,7 @@ def main() -> int:
         print("\n[DRY RUN] Would upload the above files.")
         return 0
 
-    # Import huggingface_hub
+    # Import huggingface_hub and onnx
     try:
         from huggingface_hub import HfApi
     except ImportError as exc:
@@ -248,30 +248,56 @@
             "Then login with: hf auth login"
         ) from exc
 
+    try:
+        import onnx
+        from onnx.external_data_helper import convert_model_to_external_data
+    except ImportError as exc:
+        raise SystemExit(
+            "Missing dependency: onnx\n"
+            "Install with: uv add onnx"
+        ) from exc
+
     api = HfApi()
 
     # Create repo if needed
     print("\nCreating repository (if needed)...")
     api.create_repo(repo_id=args.repo_id, private=args.private, exist_ok=True)
 
-    # Upload ONNX file(s) to model subdirectory
-    print(f"Uploading {model_name}/model.onnx...")
-    api.upload_file(
-        repo_id=args.repo_id,
-        path_or_fileobj=str(onnx_path),
-        path_in_repo=f"{model_name}/model.onnx",
-        commit_message=f"Add {model_name} ONNX model",
+    # Rewrite external data location to match the uploaded filename (model.onnx.data)
+    # This is necessary because the original export may have used a different filename.
+    print("Rewriting ONNX external data location to 'model.onnx.data'...")
+    import tempfile
+    onnx_model = onnx.load(str(onnx_path), load_external_data=True)
+    convert_model_to_external_data(
+        onnx_model,
+        all_tensors_to_one_file=True,
+        location="model.onnx.data",
+        size_threshold=1024,
    )
+    with tempfile.TemporaryDirectory() as tmpdir:
+        tmp_onnx = Path(tmpdir) / "model.onnx"
+        onnx.save_model(onnx_model, str(tmp_onnx))
+        tmp_data = Path(tmpdir) / "model.onnx.data"
 
-    if data_path is not None:
-        print(f"Uploading {model_name}/model.onnx.data...")
+        # Upload rewritten ONNX file
+        print(f"Uploading {model_name}/model.onnx...")
         api.upload_file(
             repo_id=args.repo_id,
-            path_or_fileobj=str(data_path),
-            path_in_repo=f"{model_name}/model.onnx.data",
-            commit_message=f"Add {model_name} ONNX weights",
+            path_or_fileobj=str(tmp_onnx),
+            path_in_repo=f"{model_name}/model.onnx",
+            commit_message=f"Add {model_name} ONNX model",
         )
 
+        # Upload the rewritten external data file
+        if tmp_data.exists():
+            print(f"Uploading {model_name}/model.onnx.data...")
+            api.upload_file(
+                repo_id=args.repo_id,
+                path_or_fileobj=str(tmp_data),
+                path_in_repo=f"{model_name}/model.onnx.data",
+                commit_message=f"Add {model_name} ONNX weights",
+            )
+
     # Update repo README with model table
     print("Updating README.md...")
     existing_models = _get_existing_models(api, args.repo_id)
```
