Launch Docker containers to handle HTTP requests

I’m building a web application where users manage files in projects. Each user can have multiple projects, and each project can have multiple files. I’ve implemented this using Docker, where each project is a Docker volume. When the user clicks a button in the webapp interface to modify files in their project, the web server configures and launches a worker (another Docker container) to modify the files in the Docker volume. This all works pretty well so far.

However, now I want to serve out these project files over HTTP. The strategy I have in mind is:

1. A web server (like nginx) accepts an incoming HTTP request from the user.
2. The web server inspects the incoming request to determine which project is being requested. For example, if the URL is sparkle-pony.myapp.com, then we know that the sparkle-pony project is being requested. If this project doesn’t exist, nginx responds with a 404 Not Found response.
3. The web server also checks whether the user is logged in, and whether that logged-in user has permission to view the project. If not, the web server responds with a 403 Forbidden HTTP response.
4. The web server configures and launches a new Docker container, probably another nginx process. Part of this configuration includes mounting the correct Docker volume onto the new container. We’ll call this newly launched container the “inner” container, and the existing container the “outer” container.
5. The outer container either hands off this HTTP request to the inner container, or acts as a proxy for the inner container’s response.
6. The inner container, with access to the correct Docker volume for the project and secure in the knowledge that the requesting user has the right permissions, checks the URL path and serves up the correct project file from the Docker volume. After the request has been suitably handled, the inner container shuts down.
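The routing decisions in steps 2–4 above can be sketched as a single function. Everything here is hypothetical scaffolding (the data structures, the mount path, the choice of a stock `nginx` image); it just shows the subdomain lookup, the 404/403 checks, and the `docker run` command the outer server would issue for the inner container:

```python
def route_request(host, user, projects, permissions):
    """Decide how to handle a request, returning (status, docker_cmd).

    `host`        -- the HTTP Host header, e.g. "sparkle-pony.myapp.com"
    `projects`    -- set of known project names (hypothetical lookup)
    `permissions` -- dict mapping user -> set of projects they may view
    """
    project = host.split(".", 1)[0]  # "sparkle-pony.myapp.com" -> "sparkle-pony"

    if project not in projects:
        return 404, None             # unknown project
    if user is None or project not in permissions.get(user, set()):
        return 403, None             # not logged in, or no permission

    # Launch the inner container: a stock nginx serving the project's
    # volume read-only from its default docroot. The outer server would
    # then proxy the request to this container.
    cmd = ["docker", "run", "--rm", "-d",
           "-v", f"{project}:/usr/share/nginx/html:ro",
           "nginx"]
    return 200, cmd

print(route_request("sparkle-pony.myapp.com", "alice",
                    {"sparkle-pony"}, {"alice": {"sparkle-pony"}}))
```

The `--rm -d` combination runs the inner nginx detached and removes it when stopped, matching the idea that the inner container is ephemeral.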

So, with all that being said, I have three questions:

1. Is this a reasonable strategy? It does involve launching a new Docker container for every incoming HTTP request, but I think that’s OK…
2. What is the best way to hand off the HTTP request from one container to another? Or does the outer container have to proxy the response from the inner container?
3. Can someone provide some pointers or examples of how to set up a project like this? There are probably some tools or techniques that I don’t yet know about.

Thank you!
