Serving a TensorFlow Model
From https://tensorflow.google.cn/serving/serving_basic
Get and Run TensorFlow Serving From Docker
$ docker pull songxitang/tensorflow-serving
$ docker run -it songxitang/tensorflow-serving
$ cd serving/
Train And Export TensorFlow Model
# Clear the export directory if it already exists
$ rm -rf /tmp/mnist_model
# Generate the model
$ python tensorflow_serving/example/mnist_saved_model.py /tmp/mnist_model
Training model...
Extracting /tmp/train-images-idx3-ubyte.gz
Extracting /tmp/train-labels-idx1-ubyte.gz
Extracting /tmp/t10k-images-idx3-ubyte.gz
Extracting /tmp/t10k-labels-idx1-ubyte.gz
training accuracy 0.9092
Done training!
Exporting trained model to /tmp/mnist_model/1
Done exporting!
$ ls /tmp/mnist_model/
1
$ ls /tmp/mnist_model/1
saved_model.pb variables
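The numeric subdirectory name matters: TensorFlow Serving treats each integer-named subdirectory under the model base path as one version of the servable, and by default it loads and serves the highest version number. A minimal sketch of that selection logic, stdlib only (`latest_version` is a hypothetical helper for illustration, not part of TensorFlow Serving):

```python
import os

def latest_version(base_path):
    """Return the highest integer-named subdirectory under base_path,
    mimicking TensorFlow Serving's default latest-version policy."""
    versions = [int(d) for d in os.listdir(base_path) if d.isdigit()]
    if not versions:
        raise ValueError("no version subdirectories under %s" % base_path)
    return max(versions)
```

With /tmp/mnist_model containing only "1", version 1 is served; exporting a new model under a "2" subdirectory would make it the served version without restarting the server.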
Load Exported Model With Standard TensorFlow ModelServer
$ tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/
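The flags above point the server at a single model. tensorflow_model_server also accepts a --model_config_file flag, which is the usual way to serve several models from one process. A sketch of a config equivalent to the flags above (protobuf text format; values mirror the command line):

```
model_config_list {
  config {
    name: 'mnist'
    base_path: '/tmp/mnist_model'
    model_platform: 'tensorflow'
  }
}
```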
Test The Server
$ python tensorflow_serving/example/mnist_client.py --num_tests=1000 --server=localhost:9000
Extracting /tmp/train-images-idx3-ubyte.gz
Extracting /tmp/train-labels-idx1-ubyte.gz
Extracting /tmp/t10k-images-idx3-ubyte.gz
Extracting /tmp/t10k-labels-idx1-ubyte.gz
.........................................
Inference error rate: 10.4%
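The reported error rate is the fraction of test images whose predicted digit disagrees with the ground-truth label; mnist_client.py accumulates this over its gRPC responses. A minimal sketch of that bookkeeping (stdlib only; the function name is illustrative):

```python
def error_rate(predictions, labels):
    """Fraction of predictions that disagree with the true labels."""
    wrong = sum(1 for p, y in zip(predictions, labels) if p != y)
    return wrong / float(len(labels))

# 104 mismatches out of 1000 tests yields the 10.4% shown above.
```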