Should docker image be bundled with code?

We are building a SaaS application. For now, this app does not have high availability demands. It is mostly going to be used in a single time zone and for business purposes only, so a scheduled restart at 3 in the morning wouldn't be a problem at all.

It is an ASP.NET application running on Mono with the FastCGI server. For security reasons, each customer will get their own deployment of the application. This will be done using Docker containers, with an Nginx server in front to distribute requests based on URL. The possible ways to deploy it, as I see them, are:

    1. Create a docker image with the fcgi server only and run the code from a mount point
    2. Create a docker image with the fcgi server and the code
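
As a rough sketch (the base image, paths, and server flags here are assumptions, not a tested setup), the two options could look like this:

```dockerfile
# Option 1: the image contains only the FastCGI server;
# the application code is mounted at run time.
FROM mono:latest                  # hypothetical Mono base image
EXPOSE 9000
# Serve whatever application is mounted at /app (path is an assumption)
CMD ["fastcgi-mono-server4", "/applications=/:/app", "/socket=tcp:0.0.0.0:9000"]

# Option 2: the same server image, but with the code baked in:
#   FROM fcgi-base
#   COPY . /app
```

With option 1 a container would be started roughly as `docker run -v /srv/customers/acme/code:/app fcgi-base`, where the host path differs per customer.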

    pros for option 1:

    • It’s easier to update the code, since the docker containers can keep running
    • Configuration can be bundled with the code
    • I could easily (if I ever wanted to) add minor changes for specific clients

    pros for option 2:

    • Everything is in the image; no need to mess around with additional files, just pull it and run it

    cons for option 1:

    • A lot of code folders for a lot of customers, in addition to the running containers

    cons for option 2:

    • Configuration can’t be in the image (or can it? Should I create a specific image per customer with their configuration baked in?), so there are still additional files for each customer
    • Updating is harder, since the container needs to be restarted; but, as stated at the beginning, that is not a big deal
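
One way to keep configuration out of the image while still using a single shared image is to inject it at run time. This is only a sketch; the container name, host paths, and file names below are hypothetical:

```shell
# One shared image, per-customer configuration injected at run time:
# settings as environment variables (--env-file) and/or a read-only
# config volume (-v ... :ro). Names and paths are assumptions.
docker run -d \
  --name app-acme \
  --env-file /srv/customers/acme/app.env \
  -v /srv/customers/acme/config:/app/config:ro \
  saas-app:latest
```

This still means one small config file or folder per customer on the host, but the image itself stays identical for everyone.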

    For now, in the first year, the number of customers will be low, and while demand is low any solution is good enough. What I am really looking at is: what is going to work with more than 100 customers?

    I also want to set up CI for this project in the future, so we wouldn't need to update all customers' instances manually. Docker images can have automated builds, but I'm not sure that will be enough.
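
The manual-update concern could be handled with a small rollout script run by CI, looping over a customer list. This is only a sketch, under the assumption that containers are named per customer and a `customers.txt` list exists:

```shell
#!/bin/sh
# Hypothetical CI rollout: rebuild the image once,
# then recreate every customer's container from it.
set -e
docker build -t saas-app:latest .
while read -r customer; do
  docker stop "app-$customer" || true
  docker rm   "app-$customer" || true
  docker run -d --name "app-$customer" \
    --env-file "/srv/customers/$customer/app.env" \
    saas-app:latest
done < customers.txt
```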

    My concern is basically: which solution is less messy, and perhaps easier to automate?

    I couldn’t find any best practices with docker which cover a similar scenario.
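
For completeness, the Nginx front end that distributes requests based on URL could be sketched like this (server names, ports, and URL prefixes are assumptions):

```nginx
# Route each customer's URL prefix to that customer's container.
server {
    listen 80;

    location /acme/ {
        fastcgi_pass app-acme:9000;
        include      fastcgi_params;
    }

    location /globex/ {
        fastcgi_pass app-globex:9000;
        include      fastcgi_params;
    }
}
```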

  • One solution for “Should docker image be bundled with code?”

    It’s likely that your application’s dependencies will change along with its code, so you’ll still sometimes have to rebuild the images and restart the containers (whenever you add a new dependency).

    This means you would have two upgrade workflows:

    • One where you update just the code (when there are no dependency changes)
    • One where you update the images too, and restart the containers (when there are dependency changes)

    This is most likely undesirable, because it’s difficult to automate.

    So, I would recommend bundling the code on the image.

    You should definitely make sure that your application’s configuration can be stored somewhere else, though (e.g. on a volume, or accessed through environment variables).

    Ultimately, Docker is a platform to package, deploy and run applications, so packaging the application (i.e. bundling the code on the image) seems to be the better way to use it.
