Why should I care about Docker?

What is Docker?

Docker is a newer approach to virtualization. If you're already familiar with virtualization, skip the next section. If not, you'll need a basic understanding of virtualization before Docker will make sense.

What is virtualization?

Let's start with an analogy: suppose you own a house. You have a friend who needs a place to stay. If you want to help your friend, you have a few options.

Move your friend into your bedroom. This might get a little awkward.

Build a new house for your friend on your property. This is an expensive solution.

Invite your friend to stay in your spare bedroom. Now we're getting somewhere...

The third option is a nice middle ground. You can help your friend without building them a whole new house, while still keeping your lives mostly separate. You'll share some common resources, like the kitchen and living room, but you can go into your bedroom and close the door for some privacy.

Virtualization is like putting your friend up in your spare bedroom. Imagine you want to run a web server on your computer, but you want to keep it separate from your own operating system and applications. To do this, you can run a virtual machine that contains the web server. It runs like a standalone computer, but it shares your computer's processor and RAM. When the virtual machine starts, its entire operating system appears in a window inside your own operating system.

How is Docker different?

Docker is a different kind of virtualization. Where a typical virtual machine packages an entire operating system along with the application you're running, Docker shares as much as possible between the virtualized systems. This makes them use fewer resources at runtime and makes them easier to hand off to other developers or to production environments.


Why should developers use Docker?

Docker gives web developers some pretty cool superpowers.

Easily share a development environment

If you and I want to collaborate on a Node application, we need to make sure we both have Node installed and that we're running the same version so our environments are consistent. We could skip this and hope for the best, but that can cause problems that are hard to narrow down: libraries, and even our own code, sometimes behave differently under different versions of Node.

The solution is to make sure we're all on the same version of Node. But if each of us already has other projects on our systems that need other versions of Node, we'll probably want to install NVM so we can switch Node versions easily. Then we can add a .nvmrc file to the root of the project, specifying the version we want.

We each only have to do this once, so it's not too painful. All in all, here's what we have to do (a command-line sketch follows the list):

Determine the Node version.

Install NVM.

Install the Node version of our choice.

Add .nvmrc to the project directory and set the correct Node version.

Start the application.
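For reference, here's roughly what those steps look like on the command line. This is only a sketch: the Node version (16) and the npm start command are assumptions, not values from this article.

    # Install NVM first (see the nvm project's README for its install script).
    nvm install 16            # install the Node version we agreed on
    echo "16" > .nvmrc        # record that version in the project root
    nvm use                   # with no argument, nvm reads the version from .nvmrc
    npm start                 # start the application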

It works, but it's a lot. Every other developer we want to bring onto the project has to do most of this work again. And even after all these steps, we still can't guarantee that every developer has the same environment: developers running a different operating system, or even a different version of the same operating system, may still hit breakage.

Docker lets us solve all of these problems by giving every developer the same development environment. With Docker, we instead take the following steps:

Install Docker.

Write a Dockerfile.

Run docker build -t <image-name> . (note the trailing dot, which tells Docker to use the current directory as the build context). The image name can be anything you choose.

Run docker run -p 3000:3000 <image-name>. The -p option maps a container port to a local port: hitting port 3000 on your computer gets forwarded to port 3000 in the container. Use the same image name as in step 3.
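Concretely, with a hypothetical image name of my-node-app, those two commands might look like this:

    docker build -t my-node-app .          # build the image from the Dockerfile in this directory
    docker run -p 3000:3000 my-node-app    # run it, exposing the app on localhost:3000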

This might not seem much simpler than the Node/NVM setup (and honestly, it isn't), but it does bring advantages. Regardless of your technology stack, you only have to install Docker once. Sure, you only have to install Node once too (unless you need multiple versions), but once you're ready to work on applications on a different stack, you'd have to install that whole stack as well. With Docker, you just write a different Dockerfile (or Docker Compose file, depending on the complexity of the application).

The Dockerfile is very simple: it's just a text file named "Dockerfile" with no extension. Let's take a look at a Dockerfile you might use for a simple Node application.


This Dockerfile is written for a Node application that listens on port 3000 and starts with the npm start command. Commit it to your project's repository, and getting new developers started becomes very simple and 100% consistent: every developer gets the same environment every time.

Develop in the same environment as production

Once the application is up and running in your Docker development environment, you can ship the entire container straight to production. If you think inconsistencies between two developers' machines are painful, just wait until you write code that runs on your machine but breaks in production. It's incredibly frustrating.

You have many options for deploying Docker containers to production. Here are a few:

AWS ECS (official tutorial)

DigitalOcean (tutorial)

Heroku (official tutorial)

sloppy.io (official tutorial)

I like Heroku's approach because it's the only one that lets you simply push your project with its Dockerfile and have it run. The others require several extra steps, such as pushing the built Docker image to a registry. The extra steps aren't the end of the world, but they aren't necessary either.

What about more complex applications?

Because of Docker's one-process-per-container philosophy, most applications will need multiple containers. For example, a WordPress site would consist of one container for the web server running PHP and one container for the MySQL database. That means you need some way for the containers to talk to each other. This is called container orchestration.

If you can run all of your containers on a single host, Docker Compose will probably meet your orchestration needs. It comes along with a typical Docker installation and is easy to learn. It lets you start multiple containers at the same time and sets up a network between them so they can communicate with each other. It's the fastest and easiest way to orchestrate multiple containers.
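As an illustration, here's a minimal sketch of a Docker Compose file for the WordPress example above. The image tags, service names, and credentials are placeholders, not values from this article:

    # docker-compose.yml
    version: "3.8"

    services:
      wordpress:
        image: wordpress:6          # PHP web server plus WordPress (tag is a placeholder)
        ports:
          - "8080:80"               # visit the site at localhost:8080
        environment:
          WORDPRESS_DB_HOST: db
          WORDPRESS_DB_USER: wordpress
          WORDPRESS_DB_PASSWORD: example-password
          WORDPRESS_DB_NAME: wordpress
        depends_on:
          - db

      db:
        image: mysql:8.0            # MySQL database (tag is a placeholder)
        environment:
          MYSQL_DATABASE: wordpress
          MYSQL_USER: wordpress
          MYSQL_PASSWORD: example-password
          MYSQL_ROOT_PASSWORD: example-root-password

    # Compose puts both services on a shared network, so the wordpress
    # container can reach the database at the hostname "db".
    # Start everything with: docker compose up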

If you need to coordinate containers spread across multiple hosts, Kubernetes is the mainstream solution. Many hosts that support Docker deployments offer Kubernetes for orchestration.

Quick wins from understanding Docker

It may not seem important now, but the first time you hit a problem caused by differences between environments, file this knowledge away: you won't want it to happen again. By learning Docker, you'll be able to guarantee a consistent environment for your application, no matter where it runs or who runs it. That means consistent results that you, your clients, and your employer can rely on.

Origin blog.csdn.net/qq_45401061/article/details/108740936