I don't see how Docker is improving my development setup. Am I doing it wrong?

I have Dockerized some Tomcat applications with the aim of making it faster and easier to start the application and be ready to code within minutes. I have this working just fine, and I am happy with it.

But maybe I am not doing it right, because as I understand it, if you want to actually develop locally against a running container, you need to mount your entire application directory as a volume into the container. This means that as a developer you still have to clone all the repositories locally, still run any tasks needed to fetch dependencies (npm install, for example), and in my case still clone all the other sub-projects my application requires to run, including the WEB-INF folder that Tomcat needs.

Here is a sample of my application structure:

my-app (need to clone this locally)
  |
  - WEB-INF
  - pages/home-page-app (need to also clone this in my-app locally)
  - pages/about-page-app (need to also clone this in my-app locally)
  - etc....

The benefit I see right now is that the base image used in the my-app Dockerfile is a Tomcat image configured to work out of the box with this application. That is really nice, because the user no longer needs to download, install, or configure Tomcat.

But can someone explain what benefit I gain from using Docker during development if, for example, the Dockerfile has instructions to run npm install for a Node app and I STILL have to run npm install locally as well? I see the benefit if I am just running the application, or deploying it, but for local development, why use Docker? Am I doing it wrong?

Thank you.

You could skip installing any tools locally, and simply shell into the docker container: docker exec -it <container_name> bash. Beyond that, running your development environment in a docker container really shines when you take that same image and run it in production.
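One common way to make "no local npm install" work in practice is to share the source from the host but leave the dependency directory to the container. A minimal sketch (service name and paths are assumptions, not from the original post):

```yaml
# docker-compose.yml (sketch)
services:
  web:
    build: .
    volumes:
      - ./:/usr/src/app            # host source code shared into the container
      - /usr/src/app/node_modules  # anonymous volume: keeps the node_modules
                                   # installed during the image build
```

With this layout, the node_modules created by the image's npm install step is not shadowed by the host checkout, so dependencies never need to be installed on the host.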

The only differences between running a docker image in different environments should boil down to the environment variables. For a development setup, I would additionally expect to see code shared into the container with a volume at runtime. The same image in production should be built with the code baked into it, ready to run.
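For contrast, here is a hedged sketch of what that production build might look like for a Tomcat app like yours, with the code baked in at build time (the base image tag and paths are assumptions):

```dockerfile
# Dockerfile (sketch)
FROM tomcat:9-jdk11
# Bake the application into the image so it runs with no host mounts
COPY my-app/ /usr/local/tomcat/webapps/my-app/
# Environment-specific settings come in at runtime, e.g.
#   docker run -e DB_HOST=prod-db my-app:latest
```

The same image can then be run in development with a volume mounted over the webapps directory, so the only per-environment differences are the environment variables and the optional mount.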

Hopefully this helps

/Jeff

In your case I agree, there is not much benefit in using Docker. I see the benefit when running a complicated application that requires more than an application server. For example, when you need a DB, a queuing layer (like Kafka), Elasticsearch, and processing engines, Docker lets a developer create such an environment on a single machine. Then in QA you can use the same environment across several machines, and the same again in production.
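A compose file makes that kind of multi-service environment concrete. A sketch only — the image tags and service names are assumptions, not a tested stack:

```yaml
# docker-compose.yml (sketch)
services:
  app:
    build: .
    depends_on: [db, kafka, elasticsearch]
  db:
    image: postgres:16
  kafka:
    image: apache/kafka:3.7.0
  elasticsearch:
    image: elasticsearch:8.13.0
```

One `docker compose up` then brings the whole environment onto a single developer machine, which is the benefit being described here.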


This overview here would be great to expand upon and turn into a blog post/documentation article. Docker staff?

Sometimes it's helpful to see how others approach the same problem. With that in mind, here is the basic structure we use for our Dockerised Tomcat projects:

It uses Vagrant to spin up a local Docker host and deploy the container. The project contains a git submodule for the code base. You can run it to build the image locally, or pull the entire code base and build it remotely on something like Docker Hub.
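For reference, the Vagrant side of such a setup can be as small as a Vagrantfile using the Docker provider. A sketch, assuming the Dockerfile lives at the project root:

```ruby
# Vagrantfile (sketch)
Vagrant.configure("2") do |config|
  config.vm.provider "docker" do |d|
    d.build_dir = "."  # build the image from the project's own Dockerfile
  end
end
```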

This can easily be extended to more complicated setups, say with multiple containers, like a local MySQL database:

Hope that helps :whale:

Thanks for the response. I do think git submodule could help automate the structure of the project, and it’s something I am thinking about.

But is Vagrant necessary here? My goal is to have the least amount of abstraction and technologies at work, and Vagrant seems like just another complication or layer on top of Docker, where Docker should be all you need to build and run applications.

Granted, I have never used Vagrant before, but I want to keep things simple and lean. Adding Vagrant to the mix is just another thing you have to think about.