Which Problems Can Docker Help Me Solve?

Docker can be used for deployment, local development and testing. It’s also a useful tool for experimenting and tinkering. But what practical problems can it help solve?

Here’s a non-exhaustive list of problems which can be made less painful with the help of Docker.

If you’re looking to understand what jobs Docker performs, you can read more here. There are three main ones - packaging, distributing and running applications. This article is a slightly different, more practical way to approach the same topic.

I wish I could try this in an isolated environment / I wish I noted down everything which is needed to run this.

Docker is great for quick and reproducible environments. Isolated from your development machine. Captured as code.

Creating a Dockerfile that your application runs in (even in dev mode) is a huge relief if you want to make sure you haven’t missed any important setup steps.
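As a sketch, a minimal dev-mode Dockerfile for a Node.js app could look like this (the base image tag, file names and commands are assumptions - adapt them to your project):

```dockerfile
# Pin the base image so the environment is reproducible
FROM node:18-alpine

WORKDIR /app

# Install dependencies first, so this layer is cached between builds
COPY package.json package-lock.json ./
RUN npm ci

# Copy the rest of the source and start the dev server
COPY . .
CMD ["npm", "run", "dev"]
```

Everything the app needs to run is now written down in one place, instead of living only in your head.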

It’s surprisingly tricky to share my working app with others. They’ll need to compile and install these dependencies…

Pack it up in a Docker image and share it - either via a public registry, via a private one if only a select few people should be able to access it, or as a tar file if you like.

Folks will be able to start a container from that image and run your application right away, without jumping through hoops or setting up an environment for it manually.

Well, apart from installing Docker.
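A rough sketch of that sharing workflow, with made-up image and registry names for illustration:

```shell
# Build the image and give it a name
docker build -t myapp:1.0 .

# Option 1: push to a registry others can pull from
docker tag myapp:1.0 registry.example.com/myapp:1.0
docker push registry.example.com/myapp:1.0

# Option 2: export to a tar file and share it directly
docker save myapp:1.0 -o myapp.tar
# ...and on the other machine:
docker load -i myapp.tar
docker run --rm myapp:1.0
```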

I know how to configure everything for the app on Ubuntu, but on macOS…

If your app is containerized, anybody can run it on their machine - no matter which operating system they use. As long as Docker is installed.

Vagrant is another solid way to create automated, reproducible development environments, but Docker coupled with Docker Compose works nicely as well.

Something on staging changed, and now the app doesn’t work anymore.

A dockerized app is self-contained. The way to generate the image is saved as code, and is reproducible.

If you keep your Dockerfile versioned and pin dependencies (as you should), you can see what changed over time, making it easier to debug things. You’ll also want a way to version and track your configuration variables across environments. That’s a problem beyond Docker though.
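For instance, pinning versions in your Dockerfile makes the “what changed?” question answerable from version control. A sketch (the specific package versions here are illustrative, not recommendations):

```dockerfile
# Pin the base image to an exact version (or even a digest)
FROM python:3.11.4-slim

# Pin application dependencies via a lock or requirements file,
# so a rebuild next month produces the same environment
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . /app
WORKDIR /app
```

When staging breaks, a `git diff` of this file shows you exactly which building blocks moved.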

A build artifact? What’s that?

A Docker image is a great build artifact. Everything is packaged up and ready to go - in the best case, just waiting for the correct configuration values to be passed to it.

They are also versioned if tags are used well: you can switch between versions of your application without much hassle.
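A sketch of what that can look like in practice (image names, tags and versions are made up):

```shell
# Build and tag the artifact with an explicit version
docker build -t myapp:2.4.1 .

# Deploy that specific version
docker run -d --name myapp myapp:2.4.1

# Rolling back is just running the previous tag
docker stop myapp && docker rm myapp
docker run -d --name myapp myapp:2.4.0
```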

I want to run multiple applications on the same machine, but they keep clashing

Docker containers have got your back. If some of your applications need different OS-level dependencies, you can avoid nasty collisions.

Containers are isolated from each other, which means that it’s easier to run multiple projects next to each other without them getting in the way. That applies to different applications, as well as to multiple instances of the same app (if everything is configured correctly).
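As a sketch, two instances of the same app can run side by side - each container gets its own filesystem and network namespace, so only the host-side ports need to differ (the image name and ports are assumptions):

```shell
# Two instances of the same image, mapped to different host ports
docker run -d --name myapp-a -p 8080:3000 myapp:1.0
docker run -d --name myapp-b -p 8081:3000 myapp:1.0
```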

I don’t know why it doesn’t work for you. It works on my machine!

Dev/prod parity is a useful thing to have. Heck, sometimes even dev/dev parity isn’t a given.

Imagine being in a dev team where a vital piece of your environments is different - a mismatched package, a different version of a backing service, or similar. If that isn’t a recipe for nasty bugs, I don’t know what is.

Docker helps by capturing dependencies and environments as code, and making it easy to start from a clean slate. When your environments are automated and reproducible by design, it’s harder for things to be forgotten or misconfigured.

No more guessing about which parts of your manually-configured development machine are amiss.

Using Docker Compose, you can bring up the same versions of backing services as in your production setup while developing your app locally. It’s simply a docker-compose up for each project you’re working on, without having to keep three different versions of your database installed and switch them on and off.
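A minimal docker-compose.yml along those lines might look like this (service names, the database version and the credentials are assumptions for the sketch):

```yaml
services:
  app:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db

  db:
    # Pin the same version your production database runs
    image: postgres:15.3
    environment:
      POSTGRES_PASSWORD: dev-only-password
    volumes:
      - db-data:/var/lib/postgresql/data

volumes:
  db-data:
```

Each project carries its own compose file, so switching projects switches backing services too.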

Distributing workloads across multiple machines dynamically.

Strictly speaking, we are leaving the realm of Docker and venturing into the domain of orchestrators here.

But many orchestrators are built around handling containers. Dockerizing your app is what makes it possible to use those popular orchestrators (like Kubernetes or Nomad).

A container is much like a shipping container - your tooling only needs to care about what configuration to pass inside, which containers to connect to each other, and how to handle a generic container. Everything else is safely contained.

All you need to do is make sure that your application is dockerized well and that you got the configuration right. The orchestrator will take care of distributing and scaling it across multiple machines, reacting to one of them going missing or additional ones getting added. You won’t need to take care of the fiddly details yourself, which is handy if the load is dynamic.

That one app hogs all available resources!

Docker makes it possible to limit the amount of RAM and CPU a container can use. You won’t have to rely on the good intentions of your dockerized apps - you can limit the amount of resources they get and react to an app misbehaving.

In the case of CPU resources, for example, an app might still try to use all available resources, but it’ll only get its assigned quota.
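A sketch of setting those limits at docker run time (the numbers and names are arbitrary examples):

```shell
# Cap the container at 512 MB of RAM and 1.5 CPU cores
docker run -d --name myapp \
  --memory 512m \
  --cpus 1.5 \
  myapp:1.0
```

You can then use docker stats to watch whether a container is bumping into its limits.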

In Conclusion

Well, that’s been quite a few challenges which can be made less painful with the help of Docker.

Not all of those problems have “use Docker!” as the only answer. You can use different techniques and tools to address them. Docker just happens to be a useful tool that can help you navigate many of those issues.

I hope this has been interesting for you! Sign up below to get notified about future articles and level up your Docker knowledge.

If you’re getting started with Docker, check out these 12 things I wish I knew about Docker when starting out myself.