Toy project implementing the "Hello World" of Machine Learning, namely the MNIST Machine Learning Challenge. The project aims to experiment with the latest APIs (e.g. TensorFlow v2.0).
You can see a demonstration of the UI. As it shows, the model makes mistakes, since it is not a perfectly trained one.

```
$ docker build -t ui_image .
$ docker run --name ui_container \
    --rm \
    -it \
    -p 3000:3000 \
    ui_image
```

Once the JavaScript server is up within `ui_container`, the user can reach the UI at http://localhost:3000.
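If you want to verify from a script that the server inside `ui_container` is already answering before opening the browser, a minimal sketch in Python could look like this (the URL is the one published above; the retry count and delay are arbitrary assumptions):

```python
# Minimal sketch: poll the UI until the JavaScript server inside
# ui_container answers on the published port.
import time
import urllib.request

for _ in range(30):
    try:
        with urllib.request.urlopen("http://localhost:3000") as resp:
            print("UI is up, HTTP status:", resp.status)
            break
    except OSError:
        time.sleep(2)  # server not ready yet; retry shortly
```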
```
$ docker build -t train_image train_context
$ docker run --name train_container \
    --rm \
    -it \
    -p 8888:8888 \
    --mount src=$(pwd)/model,target=/home/docker/model,type=bind \
    train_image
```

`docker run` starts a Jupyter server on port 8888 (hence the `-p 8888:8888` port-publishing argument). The server can be reached at the URL that `docker run` prints to the command line.
The above configuration installs the `plotly` extension into `jupyterlab`. This installation step takes quite long to execute and may even fail if not enough resources are assigned to the Docker Engine. `plotly` is a nice-to-have visualisation tool for drawing pretty plots, so if a quicker-to-build configuration is required, it is recommended to replace `plotly` with `matplotlib`. Within this project `matplotlib` would be more than satisfactory; the default configuration includes `plotly` only for the nicer plots.
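For reference, the kind of plot the project needs is simple enough that `matplotlib` covers it. A minimal sketch of visualising a single MNIST digit with `matplotlib` (variable names are illustrative, not taken from `train_model.ipynb`):

```python
# Minimal sketch: display one MNIST digit with matplotlib
# as a plotly-free alternative.
import matplotlib.pyplot as plt
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()

plt.imshow(x_train[0], cmap="gray")
plt.title(f"label: {y_train[0]}")
plt.axis("off")
plt.show()
```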
After `docker run` gets executed, the home folder of `train_container` will look like this:

```
home
└── docker
    ├── model
    │   ├── group1-shard1of1.bin
    │   ├── model.h5
    │   └── model.json
    └── train_model.ipynb
```
- `model` is the folder of
  - the `.h5` format of the model (the format before conversion),
  - the `.bin` + `.json` format of the model (the format after conversion).
- `train_model.ipynb` is the Jupyter Notebook with which the user can train a new model.
To train a new model, open the Jupyter server in your browser, open `train_model.ipynb` and execute its cells.
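For orientation, a minimal TensorFlow 2.x training sketch that ends up writing a `model/model.h5` file like the one in the tree above might look as follows; the architecture and hyperparameters are illustrative assumptions, not the notebook's actual values:

```python
# Minimal sketch of an MNIST training run with tf.keras (TensorFlow 2.x).
# Layer sizes and epoch count are illustrative, not the notebook's values.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))

# Save in the pre-conversion .h5 format into the bind-mounted folder.
model.save("model/model.h5")
```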
```
$ docker build -t convert_image convert_context
$ docker run --name convert_container \
    --rm \
    -it \
    --mount src=$(pwd)/model,target=/home/docker/model,type=bind \
    convert_image

# within convert_container
$ tensorflowjs_converter --input_format=keras model/model.h5 model
```

The above conversion overrides the default model. If `ui_container` is still running, restarting its JavaScript server will make it pick up the newly converted model.
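The same conversion can also be done from Python instead of the CLI via the `tensorflowjs` package's converter API. A minimal sketch, assuming it runs in an environment where `tensorflowjs` is installed (as it must be for the CLI above to work):

```python
# Minimal sketch: convert the Keras .h5 model to the TensorFlow.js
# .json + .bin format using the tensorflowjs Python API instead of the CLI.
import tensorflow as tf
import tensorflowjs as tfjs

model = tf.keras.models.load_model("model/model.h5")
tfjs.converters.save_keras_model(model, "model")  # writes model.json + *.bin
```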