Improve discoverability on HF #4

@NielsRogge

Hi,

Niels here from the open-source team at Hugging Face. Congrats on your work! I came across it via the paper page: https://huggingface.co/papers/2405.18425, which already links to a model repository.

However, I've got some suggestions regarding how to improve the integration with HF.

1. Make download stats work

Currently, download stats aren't tracked for your models. The easiest way to fix that is to leverage the PyTorchModelHubMixin class, which adds push_to_hub and from_pretrained capabilities to any custom nn.Module. It saves a config.json alongside safetensors weights for each model, which enables download counting on the Hub.
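As a minimal sketch of the mixin pattern (the class name, hidden size, and repo id below are hypothetical, and this assumes torch and huggingface_hub are installed):

```python
import torch
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin

# Hypothetical backbone: any nn.Module works; inheriting from the mixin
# adds save_pretrained / from_pretrained / push_to_hub to the class.
class MyBackbone(nn.Module, PyTorchModelHubMixin):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.proj = nn.Linear(hidden_size, hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(x)

model = MyBackbone(hidden_size=64)

# Pushing creates a repo containing config.json (the serialized init
# kwargs) and model.safetensors, so Hub download stats get tracked:
# model.push_to_hub("your-username/your-backbone")  # hypothetical repo id
# reloaded = MyBackbone.from_pretrained("your-username/your-backbone")
```

Users can then load the exact model with a one-liner, without needing your training setup.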

Alternatively, a PR could be opened to the huggingface.js open-source library as explained here.

2. Make the model Transformers compatible

If you'd like your models to be usable through the Transformers library with trust_remote_code=True, I highly recommend following this guide: https://huggingface.co/docs/transformers/custom_models. It lets people load your backbones via the AutoModel and AutoModelForImageClassification APIs.
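A minimal sketch of the pattern from that guide (the config/model class names and the "my-backbone" model_type are hypothetical, and the forward pass is a stand-in for a real backbone):

```python
import torch
import torch.nn as nn
from transformers import AutoConfig, AutoModel, PretrainedConfig, PreTrainedModel

# Hypothetical config: model_type identifies the architecture on the Hub.
class MyBackboneConfig(PretrainedConfig):
    model_type = "my-backbone"

    def __init__(self, hidden_size: int = 64, **kwargs):
        self.hidden_size = hidden_size
        super().__init__(**kwargs)

# Hypothetical model wrapping the backbone in a PreTrainedModel so it
# gains save_pretrained / from_pretrained and Auto-class support.
class MyBackboneModel(PreTrainedModel):
    config_class = MyBackboneConfig

    def __init__(self, config: MyBackboneConfig):
        super().__init__(config)
        self.encoder = nn.Linear(config.hidden_size, config.hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.encoder(x)

# Local registration; when the modeling code lives on the Hub, the
# auto_map entry in config.json plus trust_remote_code=True plays
# this role for downstream users.
AutoConfig.register("my-backbone", MyBackboneConfig)
AutoModel.register(MyBackboneConfig, MyBackboneModel)
```

After pushing the config, weights, and modeling code to a repo, users can do AutoModel.from_pretrained(repo_id, trust_remote_code=True) without installing your package.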

We recently did the same with the MambaVision author as can be seen here: https://huggingface.co/collections/nvidia/mambavision-66943871a6b36c9e78b327d3.

Let me know if you need any help regarding this!

Cheers,

Niels
ML Engineer @ HF 🤗
