Docker Container Best Practices

I’m trying to figure out how to use Docker in the best possible way to keep my new server environment efficient and organized. However, I struggle a bit with deciding what should be containerized separately. I want to have an instance of
* redmine
* gitlab
and possibly some more stuff. A separate database container is certainly useful, as are data volumes / data volume containers.

But what about, for example, a shared web server for both of these services? Is that possible/recommended? What is the recommended granularity for such a setup?
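
(By "data volume containers" I mean roughly the pattern sketched below; the container name, image, and path are just examples:)

    # create a container whose only job is to own the volume
    docker create -v /usr/src/redmine/files --name redmine-data redmine:3 /bin/true

    # mount that volume into the actual app container
    docker run -d --volumes-from redmine-data --name redmine redmine:3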

Answers

    The generally accepted practice is “one process per container”.
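
    A minimal docker-compose sketch of that granularity, assuming the official redmine, gitlab/gitlab-ce, and postgres images (image names, versions, and credentials below are illustrative); Redmine talks to the shared postgres container, while GitLab's omnibus image keeps its own bundled database unless you reconfigure it:

        version: "2"
        services:
          db:
            image: postgres:9.6
            environment:
              POSTGRES_PASSWORD: secret            # illustrative credential, change it
            volumes:
              - db-data:/var/lib/postgresql/data   # persist the database
          redmine:
            image: redmine:3
            environment:
              REDMINE_DB_POSTGRES: db              # hostname of the db service above
              REDMINE_DB_PASSWORD: secret
            depends_on:
              - db
          gitlab:
            image: gitlab/gitlab-ce:latest
            volumes:
              - gitlab-data:/var/opt/gitlab        # persist GitLab state
        volumes:
          db-data:
          gitlab-data: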

    You could run a single web server container for both services if you wanted to. If you started to feel like the responsibilities of each vhost were diverging, then you could split it into two.
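
    As a sketch, a single nginx container could front both services with one server block per vhost, assuming the app containers are reachable as redmine:3000 and gitlab:80 on a shared Docker network (the hostnames are placeholders):

        server {
            listen 80;
            server_name redmine.example.com;     # placeholder vhost
            location / {
                proxy_pass http://redmine:3000;  # Redmine container on the shared network
                proxy_set_header Host $host;
            }
        }

        server {
            listen 80;
            server_name gitlab.example.com;      # placeholder vhost
            location / {
                proxy_pass http://gitlab:80;     # GitLab container
                proxy_set_header Host $host;
            }
        }

    Splitting later then just means moving one server block into its own container.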
