Docker: Best practice for development and production environment

Suppose I have a simple Node.js app. I can build a container to run the app with a simple Dockerfile like this:

FROM ubuntu:16.04
RUN apt-get update && apt-get install -y nodejs nodejs-legacy npm
COPY . /app
WORKDIR /app
RUN npm install
CMD node index.js

This will copy the source code into the container and I can ship it off to a registry no problem.
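
For reference, a sketch of the build-and-push flow (the image name and registry are illustrative, not from the question):

# build the image from the Dockerfile in the current directory
docker build -t registry.example.com/my-node-app:latest .

# authenticate against the registry and push the image
docker login registry.example.com
docker push registry.example.com/my-node-app:latest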

But for development I don’t want to rebuild the container for every change in my code. So naturally, I use a volume in combination with nodemon. Here are my questions:

  • How do I keep the different configurations? Two Dockerfiles? Use Compose with two different compose files?
  • The node_modules folder on my host is different from the one I need in the container (i.e. some packages are installed globally on the host). Can I exclude it from the volume? If so, I need to run npm install after mounting the volume. How do I do this?

So my question is really: how do I keep dev and deploy environments separate? Two Dockerfiles? Two compose files? Are there any best practices?
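
For reference, Compose can merge several files passed with -f (later files override earlier ones), which is one way to implement the “two compose files” idea; the file names here are illustrative:

# base file plus a dev override, merged in order
docker-compose -f docker-compose.yml -f docker-compose.dev.yml up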

3 Solutions for “Docker: Best practice for development and production environment”

    The way I handle it is that I have two Dockerfiles (Dockerfile and Dockerfile.dev).

    In the Dockerfile.dev I have:

    FROM node:6
    
    # Update the repository
    RUN apt-get update
    
    # useful tools for debugging inside the container or used by other tooling
    RUN apt-get install -y curl net-tools jq
    
    # app location
    ENV ROOT /usr/src/app
    
    COPY package.json /usr/src/app/
    
    # copy over private npm repo access file
    ADD .npmrc /usr/src/app/.npmrc
    
    # set working directory
    WORKDIR ${ROOT}
    
    # install packages
    RUN npm install
    
    # copy all other files over
    COPY . ${ROOT}
    
    # start it up
    CMD [ "npm", "run", "start" ]
    
    # port the app listens on
    EXPOSE 3000
    

    My npm scripts look like this:

    "scripts": {
        ....
        "start": "node_modules/.bin/supervisor -e js,json --watch './src/' --no-restart-on error ./index.js",
        "start-production": "node index.js",
        ....
    },
    

    You will notice it uses supervisor for the start script, so any change to a file under src will restart the server without requiring a restart of the Docker container.
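
    The production Dockerfile isn’t shown here; a minimal sketch, assuming it mirrors Dockerfile.dev but installs only production dependencies and uses the production start script, could look like:

    FROM node:6

    ENV ROOT /usr/src/app

    # install only production dependencies (cached as their own layer)
    COPY package.json /usr/src/app/
    ADD .npmrc /usr/src/app/.npmrc
    WORKDIR ${ROOT}
    RUN npm install --production

    # copy the application source
    COPY . ${ROOT}

    EXPOSE 3000

    # plain node, no file watcher
    CMD [ "npm", "run", "start-production" ]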

    Last is the docker-compose file.

    dev:
      build: .
      dockerfile: Dockerfile.dev
      volumes:
        - "./src:/usr/src/app/src"
        - "./node_modules:/usr/src/node_modules"
      ports:
        - "3000:3000"
    
    prod:
      build: .
      dockerfile: Dockerfile
      ports:
        - "3000:3000"
    

    So you can see that in dev mode it mounts the current directory’s src folder into the container at /usr/src/app/src, and also mounts the node_modules directory at /usr/src/node_modules.

    This means I can make changes locally and save; the volume updates the files in the container, supervisor sees the change, and restarts the server.

    **Note:** since supervisor doesn’t watch the node_modules folder, you have to change a file in the src directory to trigger a restart.
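
    To pick an environment you then start only the matching service; with the docker-compose CLI that would be something along the lines of:

    # development: bind-mounted src plus supervisor reload
    docker-compose up dev

    # production-style container built from the main Dockerfile
    docker-compose build prod
    docker-compose up prod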

    Use environment variables. See the Docker documentation on environment variables (Docker env). This is the recommended way, and it also works in production.
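
    As a sketch of that approach (NODE_ENV and the image name are just illustrative), configuration is passed in at run time or declared per service in the compose file, rather than baked into the image:

    # at run time
    docker run -e NODE_ENV=production my-node-app

    # or per service in docker-compose
    dev:
      environment:
        NODE_ENV: development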

    You can use a single Dockerfile in which you just declare a VOLUME section.

    Remember that the volume won’t get mounted unless you specify it explicitly during docker run with the -v <path>:<path> option. Given that, you can declare multiple VOLUMEs even in your prod environment.
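
    A minimal sketch of that idea (the paths and image name are illustrative):

    # in the Dockerfile
    VOLUME /usr/src/app/src

    # in development, bind-mount your working copy over that path
    docker run -v $(pwd)/src:/usr/src/app/src my-node-app

    # in production, run without -v and the path is populated from the image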
