
The Effect of Containers on the Software Delivery Process - Xebia

Although container technology has been around for more than a decade, it was Docker that made it easy to use, and the impact on the software industry has been tremendous. It improved the speed of software delivery. It did away with manual handovers between dev and ops. It made the deployment process 100% reproducible. And last but not least: once you put your application in a container, you can run it anywhere, on-premise or in the cloud. In this article we give you 8 reasons why container technology is a fundamental building block for a multi-cloud strategy.

1. Instant software installation

In the old days (before 2013), trying out some software took a lot of work. You had to read the installation instructions, download and install the software, and configure your operating system. It took a lot of time, and after toying with a lot of software, your machine became a complete mess.

Today, there is a good chance that a container image is available for any software you want to try out. Just type ‘docker run’ followed by the name of the image, and within seconds you’re good to go. And if you don’t like it, no worries: ‘docker rm’ removes the container and ‘docker rmi’ deletes the image from your computer.
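As a sketch, trying out a database such as PostgreSQL (the image name on Docker Hub) looks like this; note that ‘docker rm’ removes the stopped container, while ‘docker rmi’ deletes the downloaded image:

```shell
# Pull and start PostgreSQL in a single command (image from Docker Hub)
docker run --name try-postgres -e POSTGRES_PASSWORD=secret -d postgres

# Done experimenting? Stop and remove the container...
docker stop try-postgres
docker rm try-postgres

# ...and delete the image to leave no trace on your machine
docker rmi postgres
```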

2. Out-of-the-box security controls

Container technology isolates applications from each other within the operating system. Applications cannot see each other’s processes, data or network. You can also restrict the amount of CPU, memory and disk space each container uses.
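For example, CPU and memory limits can be set per container at start time (flag names as in the Docker CLI; the application image ‘myapp:1.0’ is illustrative):

```shell
# Cap the container at 1.5 CPUs and 512 MB of memory,
# and detach it from the network entirely
docker run --cpus 1.5 --memory 512m --network none -d myapp:1.0
```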

A container registry can scan all application images for known security vulnerabilities. If an image contains critical vulnerabilities, you can block its deployment to production.

3. Run your applications in any cloud

Once you have created a container image for your application, you can run it anywhere. If it runs on your laptop, it will run in exactly the same way everywhere else, whether that is a test environment, an on-premise server or the cloud. It is the exact same application.

Once you put your application in a container, you can run it on-premise or on AWS, Google Cloud or Azure. Container technology is a fundamental building block for a multi-cloud strategy.
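A minimal Dockerfile is all it takes to produce such a portable image (the application name and base image here are illustrative):

```dockerfile
# Same base layers on every host, laptop or cloud
FROM openjdk:8-jre-alpine

COPY target/myapp.jar /app/myapp.jar
EXPOSE 8080
CMD ["java", "-jar", "/app/myapp.jar"]
```

Building this with ‘docker build -t myapp:1.0 .’ yields one image that runs identically on any machine with a container runtime.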

4. Simplified system operations

Once you have put your application in a container, it is sealed. You cannot access or change anything inside the container. You do not have to write installation instructions for the system operators, and you do not have to change anything to run the application in a test or production environment.

Computers can run any of your applications unmodified. Each computer that runs your applications is identical; there are no application-specific differences. Broken computers can be safely destroyed and replaced with new ones.
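Because the image itself is sealed, environment-specific settings are passed in from the outside, for example as environment variables (the variable and host names here are illustrative):

```shell
# The exact same image, configured per environment at start time
docker run -e DB_HOST=db.test.internal -d myapp:1.0   # test
docker run -e DB_HOST=db.prod.internal -d myapp:1.0   # production
```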


5. Reliable application deployments

Once you have put your application in a container, each deployment is identical: the same operating system, software packages, versions and configuration. No matter how often you deploy, and no matter who does the deployment, you get 100% reproducibility! As a result, environment-specific errors are a thing of the past, saving you immense amounts of time.
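That reproducibility comes from pinning everything inside the image. A sketch of what this looks like in a Dockerfile (base image and application details are illustrative):

```dockerfile
# Pin the base image to an exact version, not a moving tag like 'latest'
FROM node:6.11.3-alpine

# Pin application dependencies via the lock file
COPY package.json package-lock.json /app/
RUN cd /app && npm install
```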

6. Rock-solid version control

Once you have put your application in a container, you can version and sign it. When the application is running on a machine, Docker will tell you exactly which image version it is. As a result, you don’t have to maintain a configuration management database: just query the runtime and you get an accurate inventory.
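Querying the runtime for this inventory is a one-liner (output format syntax as in the Docker CLI; ‘myapp:1.0’ is an illustrative image name):

```shell
# List every running container with the exact image it was started from
docker ps --format 'table {{.Names}}\t{{.Image}}'

# Resolve an image to its immutable content digest
docker inspect --format '{{index .RepoDigests 0}}' myapp:1.0
```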

7. Delivering high availability is a breeze

Once you have put your application in a container, providing high availability is easy: just start multiple instances of the application, and if one instance fails, the others keep serving traffic.

A container orchestrator can schedule applications across a cluster of machines. When your application crashes, it is restarted, often so fast that people barely notice. When an entire machine crashes, its applications are restarted on another machine.
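With an orchestrator such as Kubernetes, this is a matter of declaring a replica count. A minimal sketch (the application name and image are illustrative):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 3            # three instances; crashed ones are replaced automatically
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
      - name: myapp
        image: myapp:1.0
```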

8. Disaster recovery built-in

Container images form the basic building block of an immutable infrastructure. All images are stored in a registry; once stored, you can pull them and run them anywhere.

If you also have your infrastructure defined as code, you can easily recover from a disaster. Just recreate a virtual datacentre somewhere, restore your data and start the applications!
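A sketch of that recovery flow, assuming your images live in a registry and your infrastructure definition is in version control (the tool and registry names here are illustrative):

```shell
# 1. Recreate the infrastructure from its code definition
terraform apply

# 2. Pull the exact image versions back from the registry
docker pull registry.example.com/myapp:1.0

# 3. Restore your data from backup, then start the applications
docker run -d registry.example.com/myapp:1.0
```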

Conclusion

Although Docker’s container technology is only four years old, it is incredibly stable. Many customers are running it in production, with both green-field and legacy applications.

We have given you 8 reasons why container technology should be the cornerstone of your multi-cloud strategy. The question is not if you should start using container technology, but when.

This article is part of the Urgent Future report Cloud - It's a Golden Age.
