TensorFlow Serving

TensorFlow Serving is an open-source software library for serving machine learning models. It deals with the inference aspect of machine learning, taking models after training and managing their lifetimes, providing clients with versioned access via a high-performance, reference-counted lookup table.

Tutorials

For more information

Getting started

Clone the repository together with its submodules

git clone --recurse-submodules https://github.com/addappio/serving

Build the docker images

cd /into/this/repo
docker build -t tfserving .

Run the docker image locally

docker run -d -p 8080:80 -p 9000:9000 --name serving tfserving
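
Once the container is up, the server logs should show the configured models being loaded. You can follow them with:

docker logs -f serving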

Adding additional models

You can add a model by editing /models/models_config.txt: add a config entry to the model_config_list. Model versions are stored under the /tmp/models base folder; name the model subfolder the same as the model name.

config: {
    name: "MODEL_NAME",
    base_path: "/tmp/models/MODEL_NAME",
    model_platform: "tensorflow"
}
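
For reference, the entry above goes inside the model_config_list in /models/models_config.txt. A sketch of a complete file with two models (the model names here are only illustrative):

model_config_list: {
  config: {
    name: "mnist",
    base_path: "/tmp/models/mnist",
    model_platform: "tensorflow"
  }
  config: {
    name: "resnet",
    base_path: "/tmp/models/resnet",
    model_platform: "tensorflow"
  }
}

By default, TensorFlow Serving serves the highest-numbered version subfolder found under each model's base_path.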

gRPC

Install gRPC for Python

pip install grpcio
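
To call the server from Python over gRPC you also need the TensorFlow Serving API stubs (pip install tensorflow-serving-api). A minimal client sketch, assuming a model named MODEL_NAME with an input tensor called "input" and the default serving signature, talking to the gRPC port mapped above:

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Connect to the gRPC port published by the container (-p 9000:9000)
channel = grpc.insecure_channel("localhost:9000")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build a Predict request; the model name, input tensor name and shape
# below are placeholders and must match your deployed model's signature.
request = predict_pb2.PredictRequest()
request.model_spec.name = "MODEL_NAME"
request.model_spec.signature_name = "serving_default"
request.inputs["input"].CopyFrom(
    tf.make_tensor_proto(np.zeros((1, 4), dtype=np.float32)))

response = stub.Predict(request, 10.0)  # 10 second timeout
print(response)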
