Janet John

Everything You Need To Know About Docker

April 12, 2022
DevOps

Nowadays, any mention of containers makes most people think of Docker. However, containers existed for more than a decade before Docker's meteoric rise in popularity, which began in 2013.

Although containers replaced chroot, they brought a new set of development and security challenges - particularly around data storage, since containers aren't designed to hold long-term data. Because a container may be destroyed and regenerated at any time, any data saved inside it is lost.

Fortunately, Docker came as a burst of sunshine. Docker is a containerization platform that automates routine configuration procedures across the development lifecycle to create fast, simple, and portable applications.

Due to the success of Docker technology, adoption by developers and companies took a bullish trend. Docker adoption doubled from 13% to 27% in just one year, according to the 2016 State of the Cloud Survey by RightScale.

What is Docker

You've just written some fantastic code that works wonderfully on your system, and you want to share it with a coworker or friend - but it doesn't work on theirs, and now you have to spend hours debugging to fix the issue.

Honestly, most developers experience this, and it can happen for several reasons: mismatched dependencies, libraries and versions, frameworks, microservices - you name it.

Docker resolves these issues because it is an open-source platform for building, deploying, and managing containerized applications. It allows developers to package programs into containers: standardized executable components that combine application source code with the OS libraries and dependencies needed to run that code in any environment.

Developers can create containers without Docker. However, the Docker platform provides a toolkit that lets developers build, deploy, run, update, and stop containers through a single API, using simple commands and work-saving automation.

Docker utilizes a client-server architecture and a remote API to manage and create Docker containers and images. Docker containers are created from Docker images.

Benefits of Docker

  • Developers can run any application as a lightweight, portable container that can be started practically anywhere.
  • Once you've created the container image, you can share it with anyone, and they'll be able to run your app on their machine from anywhere.
  • Docker works well as part of continuous integration pipelines thanks to tools like Travis and Jenkins. When your source code is updated, these tools can save the new version as a Docker image, tag it with a version number, upload it to Docker Hub, and deploy it to production.

What are Containers

Containers are executable units of software in which application code is packaged along with its libraries and dependencies, in a standard way, so that it can run anywhere - on a desktop, in traditional IT, or in the cloud.

A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries, and settings.

Containers and images are similar to objects and classes in object-oriented programming, with the image describing the container and the container being a running instance of the image.
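The analogy can be sketched in plain JavaScript. The class and property names below are purely illustrative (they are not part of Docker's API); the point is that one image, like one class, can produce many independent running instances:

```javascript
// An image is like a class: a template describing what to run.
class Image {
  constructor(name) { this.name = name; }
  run() { return new Container(this); } // "docker run" creates a container
}

// A container is like an object: a live instance of the image.
class Container {
  constructor(image) { this.image = image; this.running = true; }
}

const nodeImage = new Image('node:10-alpine');
const c1 = nodeImage.run();
const c2 = nodeImage.run(); // many containers from one image

console.log(c1 !== c2, c1.image === c2.image); // → true true
```

Just as with objects and classes, the two containers are independent of each other even though they were created from the same image.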

Benefits of Containers

In the past, virtual machines were employed to achieve many of the same objectives. Docker containers, however, are much smaller and carry significantly less overhead than virtual machines. VMs are also hard to move between hosts because their runtime environments vary so much, whereas Docker containers are lightweight and easy to transport.

Finally, virtual machines were not designed with software developers in mind: there is no concept of versioning, and logging and monitoring are problematic. Docker images, on the other hand, are made up of layers that can be controlled and versioned, and Docker includes logging capabilities that are easy to use.

Getting started with Docker

Docker works with any programming language. This tutorial will show you how to write a simple “hello world” app in Node.js and dockerize it.

Prerequisites

If you are on macOS or Windows, the best way to install Docker is by installing Docker Desktop.

Note: Docker for Windows requires Windows 10 Professional or Enterprise 64-bit.

How to dockerize your application

After installing Docker Desktop, the next step is to create your Node.js application.

Step 1: In your terminal, type mkdir my-node-app. This creates a folder on your PC.

Then type cd my-node-app to move into the directory you just created.

Step 2: Initialize npm. This generates a package.json file that holds the app's dependencies. Follow the prompts, and be sure to replace the author information with your own name and add a description.

npm init

This is roughly what you should have; the exact values depend on your answers to the prompts:

{
  "name": "my-node-app",
  "version": "1.0.0",
  "description": "A simple Node.js app",
  "main": "app.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "Your Name",
  "license": "ISC"
}

To install express, type this:

npm install express --save

You can now open the folder you created with your favorite text editor to create the app.js file or create it on the CLI with nano.

From your editor, add the following. This creates the Express application and Router objects and defines the base directory and port as constants (the route and message here are a minimal example consistent with the rest of this tutorial):

const express = require('express');
const app = express();
const router = express.Router();

const path = __dirname;
const port = 8081;

router.get('/', (req, res) => {
  res.send('hello world');
});

app.use('/', router);

app.listen(port, () => {
  console.log(`App listening on port ${port}`);
});

The require function loads the express module, which is used to build the app. The port constant tells the app to listen on port 8081 and bind to it.

After following these steps, the app is now ready to be launched.

To run the app, type this in your terminal:

node app.js

Then go to http://localhost:8081/ to see what you've done so far - you should see your “hello world” message. Quit the server by typing CTRL+C.

The next step is to create your Dockerfile.

A Dockerfile specifies what will be included in your application and lets you define your container environment. You can either create a Dockerfile in your text editor, e.g. VS Code, or create it on the CLI with nano:

nano Dockerfile

In your Dockerfile, add this:

FROM node:10-alpine
# Create app directory
WORKDIR /app
# Install app dependencies
COPY package*.json ./
RUN npm install
# Copy the rest of the application to the app directory
COPY . /app
USER node
# Expose the port and start the application
EXPOSE 8081
CMD ["npm","start"]

What does this mean:

  • Each Dockerfile must begin with a FROM instruction. FROM node:10-alpine defines the base image we build from; at the time of writing, node:10-alpine was the recommended LTS version of Node.js available on Docker Hub.
  • Next, set the working directory of the application: WORKDIR /app
  • Copy the package.json files so the dependencies can be installed: COPY package*.json ./
  • Install the dependencies: RUN npm install
  • Copy your application code to the container's application directory: COPY . /app
  • Finally, expose port 8081 on the container with EXPOSE 8081 and start the application with CMD ["npm","start"]
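Note that CMD ["npm","start"] assumes your package.json defines a start script, and npm init does not create one by default. Add one to the "scripts" section of package.json so npm start launches the app:

```json
{
  "scripts": {
    "start": "node app.js"
  }
}
```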

Let's add a .dockerignore file before we build the application image.

.dockerignore, like a .gitignore file, prevents unnecessary files from being copied into the container.

Add this inside your .dockerignore file:

node_modules
npm-debug.log
Dockerfile
.dockerignore

Building your Docker Image

Run the following command in the project directory, replacing your_dockerhub_username with your Docker Hub username. The trailing dot tells Docker to use the current directory as the build context:

docker build -t your_dockerhub_username/my-node-app .

The -t flag lets you easily tag your image so that you can find it later.

Once the build is complete, you can check your image by running this command:

docker images

We can now create a container with this image using docker run.

Create a Container

Run the following command to create and start the container:

docker run --name my-node-app -p 80:8081 -d your_dockerhub_username/my-node-app

The -p flag publishes the container's port 8081 on port 80 of the host, and -d runs the container in the background.

Docker prints the ID of the new container.

Once your container is up and running, you can see a list of your running containers by running docker ps. To stop the container, run docker stop my-node-app; to remove it, run docker rm my-node-app.

Push to Docker Hub

Now that you have created an image for your application, you can push it to Docker Hub for future use. Pushing automatically creates a repository for the image on your Docker Hub account.

Log in to your Docker Hub account. In your terminal, run this command:

docker login -u your_dockerhub_username

You will be prompted for your password. Once logged in, you can push the application image to Docker Hub:

docker push your_dockerhub_username/my-node-app


Docker Alternatives

Docker is by far the most well-known and commonly used container platform on the planet. However, alternative container technologies are on the market, each with its own techniques and use cases. So, if you're new to containers, here are some other alternatives to try.

LXC

LXC is a set of low-level container management tools from the LinuxContainers.org open-source project. The technology, a precursor of Docker, was backed by Canonical, the company behind Ubuntu.

LXC aims to create an isolated application environment that closely resembles a full-fledged virtual machine (VM), but without the overhead of running its own kernel. LXC also uses the Unix process model, which eliminates the need for a central daemon: instead of being managed by a single central program, each container behaves as though it is managed by its own program.

Podman

Podman is a free and open-source container engine that works similarly to Docker. Podman stands out because of its isolation and user privilege characteristics, making it intrinsically more secure.

Podman's command-line interface (CLI) commands are nearly identical to those of the Docker CLI; in most cases, you simply replace docker with podman (for example, podman ps instead of docker ps).

Hyper-V and Windows Containers

Microsoft released two new container technologies with Windows Server 2016: Windows Server Containers and Hyper-V containers, lightweight alternatives to full-fledged Windows virtual machines (VMs). Windows Containers use a similar abstraction approach to Docker, while Hyper-V containers are more aligned with the VM virtualization model because each can have its own kernel. Since applications running within Hyper-V containers do not need to be compatible with the host system, they offer greater portability than regular containers, and the increased isolation from the host operating system and other container environments also provides better security.

Additional Resources

https://docker-curriculum.com/
https://docs.docker.com/get-started/overview/
https://docs.microsoft.com/en-us/visualstudio/docker/tutorials/docker-tutorial

Conclusion

In this tutorial, you learned about Docker and its benefits in the development cycle. You also learned how to build a simple Node.js application, build a Docker image and a container, and push the image to Docker Hub from the command line.

About Syntropy

Syntropy powers modular, interoperable data infrastructure across all major chains. At its core lies the Data Layer, a protocol serving as the customizable execution layer between all blockchains, allowing developers to build composable, use-case-specific, interoperable applications that can execute on any data from any chain.

To learn more about Syntropy, visit the Syntropy website, Twitter, Telegram, Discord or blog.

Media Inquiries

Emilis Klybas

Marketing Manager

emilis@syntropynet.com