Controlling access to multiple Docker containers on the same host

I’ve been tasked with setting up multiple isolated, client-specific instances of a web application on a single Amazon EC2 instance using Docker. The base application is fundamentally the same for each instance, but has been customized for each client.

The goals are as follows:

1) Each container would be secured and “sandboxed” so that no container could affect the others or the host. It’s my understanding that Docker does this anyway, but I just want to be sure.

2) Each container would be “complete”, with its own users, database, access keys, etc.

3) A user associated with one container should have no way to access any aspect of any other container or the host.

I’ve searched similar questions; some touch on the idea, but none fully answer my question.

I know this likely goes against the Docker philosophy, but that aside: is this feasible, and what would be the best approach? In the past, when there was only one client per host, we used SSH tunnels to reach the relevant ports. Is there a safe way to do this with multiple clients on the same host, or would this setup be better served by a reverse proxy such as Nginx or Apache? Note that we are currently looking at having only one domain to access this host.

The question boils down to this: how do I restrict remote access on a per-container level when running multiple client containers on a single host?

Any help is much appreciated.
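For concreteness, here is a sketch of the kind of per-client access I have in mind; the container names, image tags, ports, and the client account are all hypothetical, not a vetted setup:

```shell
# Publish each client's app only on the host loopback interface,
# so nothing is reachable from outside without a tunnel or proxy.
docker run -d --name client-a -p 127.0.0.1:8081:80 myapp:client-a
docker run -d --name client-b -p 127.0.0.1:8082:80 myapp:client-b

# From a client's own machine, an SSH tunnel exposes only that
# client's port; each client gets a separate, restricted account.
ssh -N -L 8080:127.0.0.1:8081 clienta@ec2-host.example.com
```

With loopback-only publishing, a client who tunnels in can still reach any 127.0.0.1 port on the host, so the per-client restriction would have to come from firewall rules or the SSH account configuration, not from Docker itself.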

One answer:

It is feasible, but the solution is too big to fit in a typical Stack Overflow answer. This is part of the value that PaaS providers like dotCloud, Heroku, and others provide.

    You can try to roll your own multi-tenant solution, maybe by starting with something like Deis, but they warn against it.

    Security is hard, especially when you are trying to make things easy for your customers.

    You may find this series in the dotCloud blog helpful:

    • Episode 1: Kernel Namespaces (docker helps)
    • Episode 2: cgroups (docker helps)
    • Episode 3: AUFS (docker helps)
    • Episode 4: GRSEC (your kernel, up to you)
• Episode 5: Distributed routing (your network, up to you, though Docker Swarm may help eventually)
    • Episode 6: Memory Optimization (up to your users)
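For the parts Docker helps with (cgroups and privilege reduction), a minimal per-tenant sketch on a reasonably recent Docker might look like this; the names, image tag, and limits are assumptions, not a vetted configuration:

```shell
docker run -d --name client-a \
  --memory 512m \
  --cpus 1.0 \
  --cap-drop ALL \
  --cap-add NET_BIND_SERVICE \
  --security-opt no-new-privileges \
  myapp:client-a
# --memory/--cpus use cgroups to cap one tenant's resource use;
# --cap-drop/--cap-add and no-new-privileges narrow what the
# container's processes could do if the app were compromised.
```

Limits like these address containment between tenants, but they do not solve per-client remote access on their own; that still needs the network side (tunnels, proxy, firewall) from the question.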

Notable bias: I used to work for dotCloud and now work at Docker.