I was wondering what the current best practice is for packaging a REST-based API. I am still learning my way around Docker. I have a PHP-based application that provides a REST API and want to dockerize it. I am currently leaning towards bundling PHP and NGINX in the same image.
I know there is a school of thought that each process should run in its own container: a separate NGINX container and a separate PHP container. But at that point the NGINX config for the PHP application still needs to reside in the NGINX container, so there is a tight dependency between the two anyway (?)
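To make that dependency concrete, here is a sketch of what the NGINX server block looks like in the split setup (hostnames and paths here are illustrative, not from any real config): the NGINX container has to know the PHP container's hostname and share the same code path with it.

```nginx
server {
    listen 80;
    root /var/www/html;   # the PHP code must be visible at this path in BOTH containers
    index index.php;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        include fastcgi_params;
        # "php" is the (assumed) hostname of the PHP-FPM container;
        # the NGINX container is coupled to it by name and port
        fastcgi_pass php:9000;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```

So even with two containers, the NGINX config encodes knowledge of the PHP side, which is the coupling the question is getting at.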
But at the same time, when I think about it from a deployment perspective, it feels like it would be better to package NGINX and PHP together, so that the NGINX config resides in the same container and the deployment dependencies are not spread across containers.
I think it’s fine. Having PHP plus whatever serves it in the same container is very common (the official WordPress image does this, for instance); the PHP modules for NGINX and Apache have basically forced this in my experience. They just don’t seem to be designed to be decoupled easily.
So, basically, what I would do is drop your PHP code into the container and use PHP-FPM (or whatever) behind NGINX to serve it.
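As a rough sketch of that all-in-one approach (base image, file names, and paths are my assumptions, not from the original posts): both NGINX and PHP-FPM live in one image, and since a container has a single entrypoint, something small like supervisord starts both processes.

```dockerfile
# All-in-one image: NGINX + PHP-FPM together (illustrative sketch)
FROM php:8.2-fpm-alpine

# Add NGINX and a process supervisor to start both daemons
RUN apk add --no-cache nginx supervisor

# nginx.conf and supervisord.conf are assumed to exist in the build context
COPY nginx.conf /etc/nginx/http.d/default.conf
COPY supervisord.conf /etc/supervisord.conf

# The application code lives in the same image as its web server
COPY src/ /var/www/html/

EXPOSE 80
CMD ["supervisord", "-c", "/etc/supervisord.conf"]
```

The upside is exactly what the question suggests: the NGINX config, the PHP runtime, and the code ship as one deployable unit.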
The “one process per container” zealotry is largely intended to keep people away from doing things that are obviously wrong like running a Python daemon, redis, cron, and PostgreSQL all in one container with supervisord as PID1.
It depends on what you are trying to achieve. If you are building a small app or a basic, simple PHP service, it makes sense to pack it all into the same container. However, if you want a more flexible approach with scaling, tracing, and logging in mind, I would definitely recommend keeping PHP in one container and whatever you use to serve it in a second container. Yes, it may not be as easy to manage as the ‘all in one’ idea, but you gain in other aspects. So in the end it depends on what you are building.
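For completeness, the split approach usually looks something like this with Compose (service names, images, and paths are illustrative): NGINX and PHP-FPM run as separate containers that share the code via a common volume, so each can be scaled, logged, and updated independently.

```yaml
# docker-compose.yml (sketch): PHP-FPM and NGINX in separate containers
services:
  php:
    image: php:8.2-fpm-alpine
    volumes:
      - ./src:/var/www/html

  nginx:
    image: nginx:alpine
    ports:
      - "8080:80"
    volumes:
      - ./src:/var/www/html              # same code path as the php service
      - ./nginx.conf:/etc/nginx/conf.d/default.conf
    depends_on:
      - php
```

The trade-off is the one described above: more moving parts to manage, in exchange for independent scaling and cleaner separation of logs.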