
Deployment and volumes?


(Samir Sabri) #1

Hello,

I am new to Docker; mainly I am using it for development, but I haven't used it in production yet.
I have a few questions, please:

  1. When pushing my image, should it contain the source code as well, or just the environment? What about volumes? And how would this affect scalability?

  2. If I only have one container running my app, do I have to manage the load balancing myself? Do I need to know about load balancing to use your service?

  3. When my app is in the cloud and managed by Docker Cloud, can I achieve auto-scaling?

  4. In general, do I need to be a skilled DevOps developer, or should I hire a DevOps professional?

Thank you!


(Gustavostor) #2

Hi,

  1. When deploying to production, you can ADD your source code into your image and do all the necessary setup there, inside the Dockerfile. Everything you need should usually be containerized. As for volumes, they persist across multiple deploys unless you explicitly say that you want to use new, clean volumes.

  2. You can find an answer for that here.

  3. You can achieve auto-scaling programmatically using the Docker Cloud API; see the sketch after this list. However, I think it's currently not possible to achieve that automatically from the Docker Cloud UI. Being able to describe a set of rules to autoscale the app would be a nice feature for the future.

  4. I have only had a little DevOps experience, and I found Docker Cloud relatively simple to use. I think you'll find it easy too :-). Of course, things can get a lot more complicated depending on your use case.
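
To illustrate point 3, here is a minimal sketch of what programmatic scaling could look like, using Python and the requests library. The endpoint paths, the username/API-key basic auth, and the PATCH-then-scale sequence are my reading of the Docker Cloud REST API docs rather than something verified here, and the credentials and service UUID are placeholders.

```python
# Sketch: scale a Docker Cloud service to a given number of containers.
# The endpoints below follow my reading of the Docker Cloud REST API docs;
# username, API key and service UUID are placeholders.
import requests

DOCKER_CLOUD_USER = "your-username"     # placeholder
DOCKER_CLOUD_APIKEY = "your-api-key"    # placeholder
SERVICE_UUID = "your-service-uuid"      # placeholder
BASE_URL = "https://cloud.docker.com/api/app/v1"

def scale_service(target_containers):
    """Set the desired container count, then ask Docker Cloud to apply it."""
    auth = (DOCKER_CLOUD_USER, DOCKER_CLOUD_APIKEY)

    # 1. Update the desired number of containers on the service.
    resp = requests.patch(
        f"{BASE_URL}/service/{SERVICE_UUID}/",
        json={"target_num_containers": target_containers},
        auth=auth,
    )
    resp.raise_for_status()

    # 2. Tell Docker Cloud to scale the service to that target.
    resp = requests.post(f"{BASE_URL}/service/{SERVICE_UUID}/scale/", auth=auth)
    resp.raise_for_status()

if __name__ == "__main__":
    scale_service(3)   # scale the service to 3 containers
```

If I remember correctly there is also an official Python SDK (python-dockercloud) that wraps these calls, which would make this even shorter.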

Hope it helps,
Gustavo.


(Samir Sabri) #3

Thanks Gustavo,

So I can use the Docker Cloud API to detect the nodes' memory usage and then spawn new nodes? What server info can I read via the API? Memory, CPU usage, …?


(Gustavostor) #4

There are many ways to monitor your node's metrics, but as far as I know the Docker Cloud API is not one of them (yet?). You can have a container dedicated to measuring your host's CPU, memory, I/O ops, etc. from time to time, and build some logic around it to auto-scale your app via the Docker Cloud API, along the lines of the sketch below.
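
To make that do-it-yourself option concrete, here is a rough sketch of what such a monitoring container could run. It uses psutil to read CPU and memory usage; the thresholds and check interval are arbitrary choices of mine, and it assumes the scale_service() helper from the earlier sketch has been saved next to it as docker_cloud_scale.py.

```python
# Sketch: a loop for a dedicated "autoscaler" container that watches host
# CPU/memory and scales the service up via the Docker Cloud API.
# Thresholds, interval, and the module name are illustrative assumptions.
import time

import psutil  # pip install psutil; reads CPU/memory stats from /proc

from docker_cloud_scale import scale_service  # helper from the earlier sketch

CPU_THRESHOLD = 80.0       # percent CPU usage that triggers a scale-up
MEMORY_THRESHOLD = 85.0    # percent memory usage that triggers a scale-up
CHECK_INTERVAL = 60        # seconds between checks
MAX_CONTAINERS = 10        # hard cap so a busy host can't scale forever

containers = 1             # simplified: assume we start from one container

while True:
    cpu = psutil.cpu_percent(interval=1)        # CPU usage averaged over 1 second
    memory = psutil.virtual_memory().percent    # current memory usage, in percent

    if (cpu > CPU_THRESHOLD or memory > MEMORY_THRESHOLD) and containers < MAX_CONTAINERS:
        containers += 1
        scale_service(containers)               # calls the Docker Cloud API

    time.sleep(CHECK_INTERVAL)
```
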
However, this would require a lot of extra work, so I’ll give you another idea.
Have you heard of DataDog? It's a service that lets you monitor your infrastructure dynamically. You can set up rules to monitor, which then trigger events. So you can use DataDog to define the rules that trigger an autoscale of your app, just like you would with the AWS Auto Scaling service. Once you have defined the rules that trigger a scale-up or scale-down, you can use webhooks to call the Docker Cloud API. Simple as that.

You can take a look at Docker Cloud's Triggers API to see exactly how they work, but I believe you'll only need a URL that DataDog makes a POST request to, which then triggers what you want.
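
For a sense of how little is involved: firing a trigger is just a plain POST to the trigger's URL, which is also all a DataDog webhook does when an alert rule fires. The URL below is a placeholder; Docker Cloud gives you the real one when you create the trigger on your service.

```python
# Sketch: firing a Docker Cloud trigger manually. A DataDog webhook configured
# with the same URL would perform the equivalent POST when an alert fires.
import requests

# Placeholder: use the URL Docker Cloud shows when you create the trigger.
TRIGGER_URL = "https://cloud.docker.com/api/.../trigger/<trigger-uuid>/call/"

response = requests.post(TRIGGER_URL)
response.raise_for_status()
print("Trigger fired, status:", response.status_code)
```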