
How I 'Dockerized' my deep learning project

Another post starts with you beautiful people!
I hope you enjoyed my last two posts about the YOLO system, and that you are now comfortable using YOLO with the Keras API as well as YOLO with the Darknet framework on your Windows machine. I know Linux is the most suitable environment for production-ready configurations, but like me there are many Windows users who sometimes find it very difficult to set up an environment. In my previous post I shared my Python setup requirements, which you can use to set up a Python computer vision environment on Windows. Continuing my knowledge-sharing journey, today I am going to show you how we can set up and use Docker with a Python project.

What is Docker?
Generally in a Python project we use a requirements.txt file, where we list each Python library name together with its version. This file is used to install the required libraries when we deploy our code to a new environment (for example, from Windows to Linux). We just need to run 'pip install -r requirements.txt' to install all dependencies in the new environment, and our application will run there. But this process creates issues, especially if you have installed Windows-specific versions of some libraries and then try to use the same versions in a Linux environment. We spend a lot of valuable time solving such dependency issues. What if we could remove this problem? Is there a way of sharing and running our application in any environment without having to handle dependencies ourselves?
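To make the idea concrete, here is a small Python sketch (illustrative, not part of my project) showing how each pinned line in a requirements.txt maps a library name to a version:

```python
# Illustrative sketch: how pinned requirements.txt lines map library
# names to versions, the way 'pip freeze' records them.

def parse_requirement(line):
    """Split a pinned requirement such as 'flask==1.1.1' into (name, version)."""
    name, _, version = line.strip().partition("==")
    return name, version

# A tiny example file body; the libraries and versions are made up.
requirements = """\
flask==1.1.1
numpy==1.18.1
"""

pinned = dict(parse_requirement(l) for l in requirements.splitlines() if l.strip())
print(pinned)  # {'flask': '1.1.1', 'numpy': '1.18.1'}
```

Running 'pip install -r requirements.txt' simply installs each of these name/version pairs in the target environment.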

Yes, Docker is the solution to this dependency problem! Docker is a platform built around two concepts: images and containers. You can think of an 'image' as a self-contained package and a 'container' as a running instance of an image on the Docker platform. Docker is currently a must-know skill for building cloud-native, microservices-based applications, and many monolithic applications are being migrated towards microservices. With Docker we can deliver our application, together with all its required dependencies, as a single image. So as data scientists we should know at least the high-level workings of this platform. Let's start today's learning-

How to install Docker?
We will use the community version of Docker since it is free. We can download Docker Desktop from the official site-


Download the setup file. The site will ask you to create an account, so create one and then download the setup-

Before installing, please make sure of the following two things-

  1. Windows 10 64-bit: Pro, Enterprise, or Education (Build 15063 or later).
  2. Hyper-V and Containers Windows features must be enabled.
Following is the way to do the second one-
Open a command prompt with admin rights and run the following command-
DISM /Online /Enable-Feature /All /FeatureName:Microsoft-Hyper-V
The Containers feature can be enabled the same way-
DISM /Online /Enable-Feature /All /FeatureName:Containers
On my machine it looked like the following once I ran the above command-

When you type Y, the system will restart. Now we are ready to install Docker Desktop. Go to the downloaded file, run it as admin, and follow the instructions. The setup process will look like the following screens-


Click on the Ok button; the next screen will unpack the installation and then ask you to log out of the machine-

Once you click on the 'Close and log out' button, log back in. You will see a Docker Desktop icon on your desktop and a small whale icon in the taskbar. If the setup did not ask you to re-login, you should restart your machine. When I clicked on the Docker desktop icon, it showed me the following access error-

To solve this issue, paste the path 'C:\ProgramData\docker' into the File Explorer address bar and press Enter. A pop-up will appear asking for permanent access; grant it and you will be given access. Now go back to the console showing the error and click on the Quit button. Double-click the Docker desktop icon again. This time a pop-up will appear asking you to enable containers-

Click on the Ok button and your system will restart. Once it has restarted, you will see a 'Docker is running' notification in the taskbar. It means you have successfully installed Docker Desktop on your machine. By default, Docker will start automatically whenever your system restarts. This default behaviour uses a lot of RAM, so it is recommended to disable the auto-start feature and start Docker only when needed. We can disable it as follows-
Right-click the Docker whale icon in the taskbar, go to Settings, click the General tab, uncheck the first option 'Start Docker Desktop when you log in', and restart Docker-

Testing the installation
Once we have completed the above steps, we can check whether our installation is correct. To do so, open a command prompt and run the following command- docker --version

There are various registries with prebuilt Docker images available for us to use. A well-known example is the 'Docker Hub' registry. We can use it to test the hello-world image by running the following command- docker run hello-world

You can see in the above console that a hello message is printed. It means the hello-world image was downloaded from the Docker Hub registry and run. Now we can print details of this downloaded image by running the following command- docker image ls

Exploring an application
Next, we will see the real power of Docker by running an Ubuntu OS and a webserver- nginx. Sounds interesting, right! For this we need to switch Docker from Windows containers to Linux containers. To do so, right-click the whale Docker icon in the taskbar and select the 'Switch to Linux containers...' option.

Click on the Switch button. Close the existing command prompt, open a new one, and run the following command- docker run --interactive --tty ubuntu bash

See the root prompt above? We are inside the container. Let's check the hostname of the container by typing 'hostname' and pressing Enter-

'3b9f82bf5dea' is the container ID, which Docker also uses as the container's hostname. Exit the prompt by typing 'exit' and pressing Enter. Now we can list all containers, including stopped ones, by running the following command- docker container ls --all

Now we will start the nginx webserver. Before doing so, we should restart Docker and open a new command prompt window, because if you run the following command directly: docker run --detach --publish 80:80 --name mywebserver nginx, you may get the following error-
docker: Error response from daemon: driver failed programming external connectivity on endpoint mywebserver
This error usually means port 80 is already in use on the host, which is also why I switched to port 8080. After restarting Docker, open a new console and run the following command: docker run --detach --publish 8080:80 --name mywebserver nginx. Here mywebserver is my server name, so you can use any name; you can also use any free port number instead of 8080.

Our server is up and running; we can find its details by running the following command- docker container ls

Open a browser window, type http://localhost:8080, and press Enter. You will see the following screen-
You can stop your server by running the following command: docker container stop <your_server_name>

Running a Python image
Let's run a Python image (for example python:latest) from the Docker Hub registry. Open a command prompt and run the following command: docker run -it --rm python:latest. This command will download the python:latest image from Docker Hub and drop you into a Python prompt.
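Once the container's Python prompt appears, you can confirm which interpreter the image shipped with, for example:

```python
# A quick check you can type at the container's Python prompt to see
# which interpreter version the image contains.
import platform
import sys

version = platform.python_version()   # e.g. "3.8.1" in python:latest at the time of writing
print(version)
print(sys.version_info.major >= 3)    # True for any current python image
```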

As you can see in the console, the command has downloaded the latest version of Python- 3.8.1- into our Linux container. If you want a specific version, then instead of python:latest use python:<version_number>. We now have Python 3.8.1 running in a Linux container.

Preparing a Docker image for my deep learning project
Now I will tell you how I turned my project into a Docker image. For this we use a Dockerfile. A Dockerfile is a simple text file containing instructions and arguments; Docker reads this file and builds the image automatically. According to the official site, a sample Dockerfile looks like the following-
FROM ubuntu:18.04
COPY . /myapp
RUN make /myapp
CMD python /myapp/app.py
FROM creates a layer from the ubuntu:18.04 Docker image.
COPY adds files from your Docker client’s current directory.
RUN builds your application with make.
CMD specifies what command to run within the container.

Following the above syntax, I created a Dockerfile.txt for my project:
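The screenshot of my file is not reproduced here, but a Dockerfile for a Flask-based deep learning project along these lines might look like the following (the base image, file names, and entry point are illustrative, not my exact file):

```dockerfile
# Illustrative Dockerfile for a Flask-based project; names are examples.
FROM python:3.8

# Copy the project into the image and make it the working directory
COPY . /app
WORKDIR /app

# Install the pinned dependencies listed in requirements.txt
RUN pip install -r requirements.txt

# Expose the Flask port and start the API (app.py is a placeholder name)
EXPOSE 5000
CMD ["python", "app.py"]
```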

My project structure looks like below-

Before moving further, please make sure you have created a requirements.txt file in your project structure. The requirements.txt file for my project looks like below-
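For illustration, such a requirements.txt might pin versions like this (the exact libraries and versions here are examples and depend entirely on your project):

```text
flask==1.1.1
numpy==1.18.1
opencv-python==4.1.2.30
```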

Now we will build our image. Go to your project's path, open a command prompt there, and run the following command: docker build -t dsbyprateekg .
Here dsbyprateekg is my image name. Once you run the above command, Docker will read your Dockerfile and follow its instructions. Note that docker build looks for a file named exactly Dockerfile by default; if, like me, you named it Dockerfile.txt, pass it explicitly with -f Dockerfile.txt.

Once the process is completed, you will see a message like the following at the end of the console-

Now you can check whether your newly created image exists by running the following command: docker images

Here, under the 'TAG' column, you see 'latest'. We can change this value to a more specific name. For example, I changed 'latest' to 'prateek_gupta' by running the following command: docker tag <IMAGE_ID> <IMAGE_NAME>:<TAG>

Next, we can test our container locally by running the following command: docker run <IMAGE_NAME>:<TAG>

If all is well, you will see your Python file running successfully. In my case I have deployed my deep learning project as a Flask REST API, so my Flask server is up and running-
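My actual Flask code is not shown here, but the shape of such a service can be sketched with only the Python standard library (a stand-in for the real Flask app; the endpoint and payload are illustrative):

```python
# Minimal stand-in for a prediction API using only the standard library.
# The real project presumably uses Flask, but the request/response shape
# is similar: an HTTP GET returns a JSON payload.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PredictHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Dummy "prediction" payload; a real model call would go here.
        body = json.dumps({"status": "ok", "prediction": "cat"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet

def serve_once():
    """Start the server on a free port, issue one request, return the JSON reply."""
    server = HTTPServer(("127.0.0.1", 0), PredictHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/predict"
    with urllib.request.urlopen(url) as resp:
        data = json.loads(resp.read())
    server.shutdown()
    return data

if __name__ == "__main__":
    print(serve_once())  # {'status': 'ok', 'prediction': 'cat'}
```

Inside the container this server would listen on the port exposed in the Dockerfile, and the --publish flag maps it to a host port exactly as we did for nginx above.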

It means we have tested our Docker container locally, and it can now run anywhere Docker is installed. As a last step, we can push our image to our account on Docker Hub. For this we first need to tag our image in the required format: docker tag <IMAGE_NAME> <YOUR_DOCKERHUB_NAME>/<IMAGE_NAME>
In my case the IMAGE_NAME is dsbyprateekg and YOUR_DOCKERHUB_NAME is my Docker account name. Then we need to log in to Docker with our account by running the following command: docker login --username <Your_Docker_Username>
It will ask for your password; enter it and press Enter. Now you can push your image by running the following command: docker push <Your_Docker_Username>/<IMAGE_NAME>

Once your Docker image is pushed, you can verify it by going to your Docker Hub account. Now you can easily share it with anyone: they just need to install Docker on their system, run docker pull <Your_Docker_Username>/<IMAGE_NAME> to get the image, and then run the container. They don't need to do any code setup on their system.

That's it for today. We have learnt about Docker, installed it on our machine, switched from Windows to Linux containers, successfully made a Docker image of a project, and pushed it to Docker Hub. I have shared my project structure and Dockerfile screenshot so that you can use them as a reference and easily apply the same steps to your own machine learning or deep learning project. So try it yourself in your project and explore more. Till then Go chase your dreams, have an awesome day, make every second count and see you later in my next post.




