Assuming we have a MobileNet-V2 classification model converted to OpenVINO format and quantized to INT4, INT8, and FP16 precisions, prepare the text for Hugging Face model cards for those models.
Base each model card on the existing OpenVINO cards, for example https://huggingface.co/OpenVINO/Phi-3.5-vision-instruct-int8-ov/blob/main/README.md?code=true, with contents similar to the text below:
Model Name
- model creator
- original model
Description
This is the XXX model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to YYYY (INT4/INT8/FP16) by ZZZZ.
Compatibility
The provided OpenVINO™ IR model is compatible with:
- OpenVINO version 2025.3.0 and higher
- Model API 0.4.0 and higher
Quantization Command line/code
(...)
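For the quantization section, a minimal sketch of one possible command, assuming the card author exports with Optimum Intel's OpenVINO exporter; the Hugging Face model id `google/mobilenet_v2_1.0_224` and the output directory name are placeholders, not values taken from this issue:

```shell
# Hypothetical export command; the model id and output directory are
# placeholders. The --weight-format flag selects INT4/INT8/FP16 weights.
optimum-cli export openvino \
  --model google/mobilenet_v2_1.0_224 \
  --weight-format int8 \
  mobilenet-v2-int8-ov
```

The same command with `--weight-format int4` or `--weight-format fp16` would produce the other two precisions, so one template can cover all three cards.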
Dataset used for training/quantization
(...)
Running Model Inference with Model API:
- Install Model API:
  pip install git+https://github.com/open-edge-platform/model_api.git
- Run model inference:
# download the model using huggingface_hub
# run inference using Model API on sample image
# display top_labels from ClassificationResultMetadata
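The three commented steps above can be sketched as follows. This is a hedged sketch, not a finished card snippet: the repo id `OpenVINO/mobilenet-v2-int8-ov`, the IR file names, and the assumed `(id, label, score)` layout of `top_labels` are illustrative assumptions, not values confirmed by this issue.

```python
def format_top_labels(top_labels):
    """Render assumed (id, label, score) triples as readable lines."""
    return [f"{label}: {score:.3f}" for _, label, score in top_labels]


def run_inference(repo_id="OpenVINO/mobilenet-v2-int8-ov",
                  image_path="sample.jpg"):
    # Imports are local so the helper above stays importable without
    # huggingface_hub / model_api / OpenCV installed.
    import cv2
    from huggingface_hub import hf_hub_download
    from model_api.models import ClassificationModel

    # Step 1: download the model (XML + BIN) using huggingface_hub.
    # The repo id and file names here are hypothetical placeholders.
    model_xml = hf_hub_download(repo_id, "openvino_model.xml")
    hf_hub_download(repo_id, "openvino_model.bin")

    # Step 2: run inference using Model API on a sample image.
    model = ClassificationModel.create_model(model_xml)
    result = model(cv2.imread(image_path))

    # Step 3: display the top labels from the classification result.
    print("\n".join(format_top_labels(result.top_labels)))
```

Wrapping the flow in a function keeps the card snippet copy-pasteable while letting readers substitute their own repo id and image path.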