QATechnicals

This blog is all about testing and quality analysis – tools, techniques, and best practices.

Docker

If you have a background in the IT industry, particularly in software, and haven’t been living under a rock for the last 5 years, then this word should not surprise you. Everything is Docker now. So what in the world is Docker, why should you use it, and why is it so darn popular?

If you’re a programmer or someone in DevOps, chances are high that you’ve deployed a web app to some environment – whether Dev, QA, or Production. Until this magical unicorn-themed tool came along, developers used hypervisors from software vendors like VMware and Oracle (VirtualBox) to create virtual machines and run guest operating systems on their laptops just to deploy applications. These approaches were clunky, resource-intensive, and cumbersome, to say the least.

Then came Docker. So what is it, anyway?

Definition-wise: Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package.

Well, that is a lot of technical jargon. Let’s break it down into simple, plain English.

Say you’re a web developer or a DevOps person who has to build and deploy a web application – let’s say an e-commerce website or a social networking app like Facebook. For the outside world to be able to access your application, you need to host it and serve it to the public. This is done in one of two ways: using a dedicated computer, which we call a server, or using a third-party hosting platform like GoDaddy.

If you’ve opted for the first, you’ll need to build a server – a dedicated computer whose task is to serve the desired web application, web services, or a range of other services. In the past this meant either a lot of virtualisation software for creating virtual machines, or expensive hardware to host the servers.

But with the advancement of cloud computing came services like Amazon Web Services and Google Cloud, which allowed people to host their applications on servers these providers had built themselves, housed in data centers.

These services gave virtualisation a new meaning. They broke hardware capacity and cost down even further using software, providing customers with resources on an as-needed basis, along with a range of optimisation and customisation options. Developers and businesses could now pay for only the resources they actually needed to host their application, instead of bearing the cost of building an entire server.

But even after AWS and Google Cloud broke down the cost of virtualisation, one problem remained: operating systems. And everyone knows OSes are heavy, running into gigabytes of storage space even for a simple OS like Ubuntu.

Now consider that your application is lightweight – less than even 150 MB in size to start with. Do you see the problem? Why would you pay for, and install, an OS that is many times the size of your application itself?

This is the biggest challenge that comes with virtualisation, and Docker, with its concept of containers, helped fix it – or at least reduce it. How? Let’s see.

Let’s compare these two images – one for a virtual machine and one for a Docker container.

[Image: Virtual Machine architecture]

[Image: Docker Container architecture]

What does a container do in Docker?

Docker sits between your application and the host operating system (OS). Docker shares the host OS across multiple containers rather than requiring each one to have and run its own full operating system. This lets you encapsulate your application into a reusable module that can be plugged in and run on any machine where resources are available. It also allows for more fine-grained resource allocation and can minimise the amount of wasted CPU or memory. This notion of tightly packaged Docker containers is similar to the concept of shipping containers, which is where the term Docker containers comes from.
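
To make this concrete, here is a minimal sketch – assuming Docker is already installed on a Linux host, and using the public alpine and nginx images purely as examples – showing both ideas: containers sharing the host’s kernel, and fine-grained CPU/memory limits per container:

```
# A container reports the host's kernel version, because it shares it:
docker run --rm alpine uname -r
uname -r   # run on the host itself – same kernel version

# Cap a container's share of memory and CPU at run time:
docker run -d --name web1 --memory=256m --cpus=0.5 nginx
docker stats --no-stream web1   # shows usage against the 256 MiB limit
```

Both containers run against the same kernel; only the resource limits differ, so there is no guest OS to pay for.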

So, as you can see, Docker is a very simple solution: a layer between the OS and your applications that optimises resource usage and reduces the need for redundant operating systems.
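
As a rough sketch of that “reusable module” idea, here is how a hypothetical small Python web app might be packaged into an image and run – the app.py, requirements.txt, and my-web-app names are all placeholders, not anything from this post:

```
# Write a minimal Dockerfile describing the app and its dependencies
cat > Dockerfile <<'EOF'
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app.py"]
EOF

# Build the image once, then run it anywhere Docker is available:
docker build -t my-web-app .
docker run -d -p 8000:8000 my-web-app
```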

A few terms often come up in association with Docker –

The Docker daemon is a service that runs on your host operating system. It currently runs only on Linux because it depends on a number of Linux kernel features, but there are a few ways to run Docker on macOS and Windows too. The Docker daemon itself exposes a REST API.
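
For example, on a Linux host you can talk to that REST API directly over the daemon’s Unix socket – assuming your user has permission to access the socket (for instance via the docker group):

```
# Ask the daemon for its version information:
curl --unix-socket /var/run/docker.sock http://localhost/version

# List running containers – the same data "docker ps" shows:
curl --unix-socket /var/run/docker.sock http://localhost/containers/json
```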

A Docker container is a running instance of a packaged application, created from an image. Its main benefit is that an application packaged in a container, along with its dependencies, is portable across any system running the Linux operating system (OS) with a Docker engine.
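
To illustrate that portability, a quick sketch – using the public nginx image purely as an example – of running the same image, unchanged, on any Docker host:

```
# Run a public image and map container port 80 to host port 8080:
docker run -d --rm -p 8080:80 nginx
# The site is now served at http://localhost:8080 on this host,
# and the identical command works on any other Docker host.
```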

Docker Hub is a cloud-based registry service that lets you link to code repositories, build and test your images, store manually pushed images, and link to Docker Cloud so you can deploy images to your hosts.
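
A typical round trip with Docker Hub looks roughly like this – the myuser namespace and the my-web-app image from the earlier sketch are hypothetical:

```
# Pull a public image from Docker Hub:
docker pull alpine

# Tag a locally built image under your Hub namespace, then push it:
docker login
docker tag my-web-app myuser/my-web-app:1.0
docker push myuser/my-web-app:1.0

# Any other machine can now pull and run it:
docker pull myuser/my-web-app:1.0
```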


Hopefully you now have a beginner’s understanding of what Docker is, why we need it, and what problems it solves. In the next tutorials, we’ll learn how to install Docker, and how to run containers and create a new Docker image.

A couple of suggested reads –