TensorFlow Serving is an open-source software library for serving machine learning models. It deals with the inference aspect of machine learning, taking models after training and managing their lifetimes, providing clients with versioned access via a high-performance, reference-counted lookup table.
git clone --recurse-submodules https://github.com/addappio/serving
cd /into/this/repo
docker build -t tfserving .
docker run -d -p 8080:80 -p 9000:9000 --name serving tfserving

You can add a model in /models/models_config.txt: just add a config entry to its model_config_list (see the snippet below).
Model versions are stored under the /tmp/models base folder. Name each model's subfolder the same as the model name; TensorFlow Serving then picks up numbered version subfolders (e.g. /tmp/models/MODEL_NAME/1/) inside it.
# /models/models_config.txt
model_config_list: {
  config: {
    name: "MODEL_NAME",
    base_path: "/tmp/models/MODEL_NAME",
    model_platform: "tensorflow"
  }
}

To query the server over gRPC from Python, install the gRPC client library:

pip install grpcio
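The sketch below shows how such a client could call the Predict endpoint. It is a minimal, hypothetical example: it assumes the tensorflow and tensorflow-serving-api packages are installed alongside grpcio, that the gRPC endpoint is the one published on port 9000 by the docker run command above, and that MODEL_NAME exposes a float input tensor named "inputs" (replace both names with your model's own).

# Hypothetical gRPC client sketch; the model name, signature name, and input
# tensor name below are placeholders for your own model.
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Connect to the gRPC port published by the docker run command above.
channel = grpc.insecure_channel("localhost:9000")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "MODEL_NAME"
request.model_spec.signature_name = "serving_default"
# Replace "inputs" and the example payload with your model's real input tensor.
request.inputs["inputs"].CopyFrom(
    tf.make_tensor_proto([[1.0, 2.0, 3.0]], dtype=tf.float32))

response = stub.Predict(request, 10.0)  # second argument is the timeout in seconds
print(response)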