Docker Container Best Practices

I’m trying to figure out how to use Docker in the best possible way to keep my new server environment efficient and organized. However, I struggle a bit with deciding what should be containerized separately. I want to have an instance of
* redmine
* gitlab
and possibly some more services. A separate database container is certainly useful, as are data volumes / data-volume containers.

But what about, e.g., a shared web server for both of these services? Is this possible/recommended? What is the recommended granularity for such a setup?
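To make the question concrete, here is a minimal Compose sketch of the kind of split being asked about: one shared database container, one container per application, and named volumes for persistent data. The image names are the official `redmine`, `postgres`, and `gitlab/gitlab-ce` images; the passwords and volume names are placeholders, and note that the GitLab omnibus image actually bundles its own internal services, so it would not necessarily use the shared database.

```yaml
# docker-compose.yml — illustrative sketch only, not a production config
services:
  db:
    image: postgres:16                 # shared database container
    environment:
      POSTGRES_PASSWORD: example      # placeholder credential
    volumes:
      - db-data:/var/lib/postgresql/data   # named data volume

  redmine:
    image: redmine                     # official Redmine image
    environment:
      REDMINE_DB_POSTGRES: db          # points Redmine at the db service
      REDMINE_DB_PASSWORD: example
    depends_on:
      - db
    volumes:
      - redmine-files:/usr/src/redmine/files

  gitlab:
    image: gitlab/gitlab-ce            # omnibus image; ships its own bundled services
    volumes:
      - gitlab-data:/var/opt/gitlab

volumes:
  db-data:
  redmine-files:
  gitlab-data:
```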

Solutions

    The generally accepted practice is “one process per container”.

    You could run a single web server container for both services if you wanted to. If you started to feel like the responsibilities of each vhost were diverging, then you could split it into two.
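A common way to realize the single-web-server option is a reverse proxy in its own container, with one vhost per backend. The fragment below is an illustrative nginx sketch; it assumes the application containers are reachable on a shared Docker network under the hostnames `redmine` and `gitlab`, and the `server_name` values are placeholders.

```nginx
# nginx.conf fragment — illustrative sketch; hostnames and domains are assumptions
server {
    listen 80;
    server_name redmine.example.com;     # placeholder vhost
    location / {
        proxy_pass http://redmine:3000;  # Redmine's default HTTP port
        proxy_set_header Host $host;
    }
}
server {
    listen 80;
    server_name gitlab.example.com;      # placeholder vhost
    location / {
        proxy_pass http://gitlab:80;
        proxy_set_header Host $host;
    }
}
```

If the two vhosts later diverge (different TLS setups, rate limits, upgrade schedules), splitting the proxy into two containers is a straightforward refactor, since each `server` block is already independent.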

Docker is an open platform for developers and sysadmins to build, ship, and run distributed applications.