So this blog post is to get you operational with Docker image and volume management, with a pivot towards scientific computing and TensorFlow. I am building a Jupyter Notebook for the local machine learning meetup to learn the ins and outs of TensorFlow, and then deploying the whole thing up to Azure. Part of getting this to work is not only managing the Docker containers, but also the data on the volumes, so that when we deploy to Azure and somebody opens the notebook, it comes pre-loaded with all the necessary tutorial data.
So I’m not going to walk through the whole nine yards of everything. You start by installing Docker for Windows and grabbing the latest TensorFlow Docker image. Here is the command for the latest CPU-only TensorFlow image. Note that this will also launch the image as a running container.
docker run -it -p 8888:8888 gcr.io/tensorflow/tensorflow
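If you want to confirm the pull worked before going further, a quick sanity check (not part of the original walkthrough):

```shell
# List the images available locally; gcr.io/tensorflow/tensorflow should appear
docker images
# The running container also prints a Jupyter URL with a login token,
# something like http://localhost:8888/?token=...
```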
So, this is good and all; it gets our base image downloaded. Let’s go ahead and “branch” this thing into a new image.
docker ps
docker commit <ContainerName> tf_demos
docker ps lists all of our active containers. We simply want to save our currently running container as a new image, tf_demos. We can now stop the container with the following command.
docker stop <ContainerName>
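Throughout these steps you need the container name. If you’d rather not eyeball the full docker ps table, a small sketch:

```shell
# Print just the names of running containers; these generated names
# (e.g. "upbeat_hopper") are what <ContainerName> stands for in the commands here
docker ps --format "{{.Names}}"
```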
Great, let’s go ahead and create a new volume called “tf_demos_data” and boot up a container with the new volume mounted.
docker volume create --name tf_demos_data
docker run -it -v tf_demos_data:/data -p 8888:8888 tf_demos
Alright, so the first line creates a new volume, “tf_demos_data”; the second launches our tf_demos image with the new volume mounted at /data.
We are now ready to start loading some data into the volume via the container.
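Before copying anything in, you can verify the volume really exists; docker volume inspect shows its metadata, including where Docker stores it on the host:

```shell
# Inspect the named volume; output is JSON with fields like Name, Driver, Mountpoint
docker volume inspect tf_demos_data
```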
docker cp "C:\data\myfile.zip" <ContainerName>:/data/myfile.zip
docker commit <ContainerName> tf_demos
docker stop <ContainerName>
docker rm (docker ps -aq)
So what are we doing here? First we copy “myfile.zip” from the local machine into the container’s file system, onto the mounted volume. Then we commit the container’s changes back to the tf_demos image. We stop the container. Finally, we remove every container by id (the parentheses are PowerShell syntax; in a bash shell you would write $(docker ps -aq)).
We remove the containers just to showcase that you can keep things container-clean: launch a fresh container from the image with the volume attached, and your data will still be there.
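One way to prove the data lives on the volume rather than in any particular container (a sanity check, not part of the original steps) is to start a throwaway container with the volume attached and list its contents:

```shell
# --rm deletes the container as soon as the command finishes,
# so this leaves nothing behind but still reads the volume;
# myfile.zip should show up in the listing
docker run --rm -v tf_demos_data:/data tf_demos ls /data
```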
docker run -it -v tf_demos_data:/data -p 8888:8888 tf_demos
And that’s it: the data file should have persisted, along with any changes we committed to the image.
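If you want to double-check from the running container itself, docker exec runs a command inside it (substitute the real container name):

```shell
# List the mounted volume from inside the already-running container
docker exec <ContainerName> ls /data
```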