Docker Container Best Practices

I’m trying to figure out how to use Docker in the best possible way to keep my new server environment efficient and organized. However, I’m struggling a bit with deciding what should be containerized separately. I want to have an instance of
* redmine
* gitlab
and possibly some more services. A separate database container is certainly useful, as are data volumes / data volume containers.

But what about, for example, a shared web server for both of these services? Is that possible/recommended? What is the recommended granularity of such a setup?
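Roughly, this is the kind of layout I have in mind. It is only a sketch: the images are the official `postgres`, `redmine`, and `gitlab/gitlab-ce` images, but the environment variables, volume paths, and the idea of sharing the Postgres container are assumptions that would need to be checked against each image’s documentation (the GitLab omnibus image, for instance, bundles its own database by default).

```yaml
# docker-compose.yml sketch -- placeholder values throughout
version: "3"

services:
  db:
    image: postgres
    environment:
      POSTGRES_PASSWORD: changeme            # placeholder credential
    volumes:
      - db-data:/var/lib/postgresql/data     # named data volume for the DB

  redmine:
    image: redmine
    depends_on:
      - db
    environment:
      REDMINE_DB_POSTGRES: db                # hostname of the db service
      REDMINE_DB_PASSWORD: changeme
    volumes:
      - redmine-files:/usr/src/redmine/files # uploaded files kept in a volume

  gitlab:
    image: gitlab/gitlab-ce
    volumes:                                 # GitLab keeps config/data/logs here
      - gitlab-config:/etc/gitlab
      - gitlab-data:/var/opt/gitlab
      - gitlab-logs:/var/log/gitlab

volumes:
  db-data:
  redmine-files:
  gitlab-config:
  gitlab-data:
  gitlab-logs:
```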

2 Answers

    The generally accepted practice is “one process per container”.

    You could run a single web server container for both services if you wanted to. If you started to feel like the responsibilities of each vhost were diverging, then you could split it into two.
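    For example, a common pattern is to publish only a single reverse-proxy container to the host and route to Redmine and GitLab by virtual host. Below is a minimal sketch of such a service, to be added under `services:` in a compose file; the hostnames, the ports (3000 for the official Redmine image, 80 inside the GitLab image), and the mounted `nginx.conf` are assumptions, not a definitive setup.

```yaml
# Hypothetical reverse-proxy service in front of both applications;
# only this container publishes a port on the host.
  proxy:
    image: nginx
    ports:
      - "80:80"
    volumes:
      # nginx.conf would contain two server {} blocks, one per vhost,
      # proxying to http://redmine:3000 and http://gitlab:80
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
    depends_on:
      - redmine
      - gitlab
```

    Splitting later is then just a matter of pointing each vhost at a different backend container.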
