Docker: Best practice for development and production environment

Suppose I have a simple Node.js app. I can build a container to run the app with a simple Dockerfile like this:

FROM ubuntu:16.04
RUN apt-get update && apt-get install -y nodejs nodejs-legacy npm
COPY . /app
WORKDIR /app
RUN npm install
CMD node index.js

This will copy the source code into the container and I can ship it off to a registry no problem.
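For reference, the "ship it off" step is just the usual build-and-push; something along these lines, where the registry address and tag are placeholders:

# build the image and tag it for a registry (names are placeholders)
docker build -t registry.example.com/my-node-app:1.0 .

# push it to that registry
docker push registry.example.com/my-node-app:1.0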

But for development I don’t want to rebuild the container for every change in my code. So naturally, I use a volume in combination with nodemon. Here are my questions:

  • How do I keep the different configurations? Two Dockerfiles? Use compose with two different compose files?
  • The node_modules folder on my host is different from the one I need in the container (i.e. some packages are installed globally on the host). Can I exclude it from the volume? If so, I need to run npm install after mounting the volume. How do I do this?

So my question really is: how do I keep dev and deploy environments separate? Two Dockerfiles? Two compose files? Are there any best practices?
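For reference, the "two compose files" option is something docker-compose supports directly: a base docker-compose.yml is merged with a docker-compose.override.yml automatically when you run docker-compose up, and extra files can be passed explicitly with -f. A rough sketch (contents are illustrative only, and assume nodemon is listed in package.json):

# docker-compose.yml – settings shared by every environment
web:
  build: .
  ports:
    - "3000:3000"

# docker-compose.override.yml – dev-only additions, merged automatically by `docker-compose up`
web:
  volumes:
    - ".:/app"
  command: node_modules/.bin/nodemon index.js

# production: name the base file explicitly so the override is NOT applied
#   docker-compose -f docker-compose.yml up -d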

3 Answers

    The way I handle it is to have two Dockerfiles (Dockerfile and Dockerfile.dev).

    In the Dockerfile.dev I have:

    FROM node:6
    
    # Update the repository
    RUN apt-get update
    
    # useful tools in case you need to shell into the container, or for use by other tooling
    RUN apt-get install -y curl net-tools jq
    
    # app location
    ENV ROOT /usr/src/app
    
    COPY package.json /usr/src/app/
    
    # copy over private npm repo access file
    ADD .npmrc /usr/src/app/.npmrc
    
    # set working directory
    WORKDIR ${ROOT}
    
    # install packages
    RUN npm install
    
    # copy all other files over
    COPY . ${ROOT}
    
    # start it up
    CMD [ "npm", "run", "start" ]
    
    # port the app listens on
    EXPOSE 3000
    

    My npm scripts look like this:

    "scripts": {
        ....
        "start": "node_modules/.bin/supervisor -e js,json --watch './src/' --no-restart-on error ./index.js",
        "start-production": "node index.js",
        ....
    },
    

    You will notice it uses supervisor for start, so any change to any file under src will cause it to restart the server without requiring a restart of the Docker container.

    Last is the docker-compose file.

    dev:
      build: .
      dockerfile: Dockerfile.dev
      volumes:
        - "./src:/usr/src/app/src"
        - "./node_modules:/usr/src/node_modules"
      ports:
        - "3000:3000"
    
    prod:
      build: .
      dockerfile: Dockerfile
      ports:
        - "3000:3000"
    

    So you can see that in dev mode it mounts the current directory’s src folder into the container at /usr/src/app/src, and the node_modules directory at /usr/src/node_modules.

    This means I can make a change locally and save; the volume updates the file in the container, and supervisor sees that change and restarts the server.

    Note: since it doesn’t watch the node_modules folder, you have to change a file in the src directory to trigger a restart.
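
    Assuming the compose file above, day-to-day use would look roughly like this:

    # development: code is live-mounted, supervisor restarts on change
    docker-compose build dev
    docker-compose up dev

    # production: image built from the plain Dockerfile
    docker-compose build prod
    docker-compose up -d prod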

    Use environment variables. See the Docker documentation on environment variables. This is the recommended way, and it works for production as well.
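
    A minimal sketch of that idea, assuming the app branches on NODE_ENV (the image name is a placeholder):

    # same image, different configuration injected at run time
    docker run -e NODE_ENV=development -p 3000:3000 my-node-app
    docker run -e NODE_ENV=production  -p 3000:3000 my-node-app

    # inside index.js the app can then check the variable, e.g.
    #   if (process.env.NODE_ENV !== 'production') { /* dev-only behaviour */ }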

    You can use a single Dockerfile in which you just declare a VOLUME.

    Remember that a host directory won’t get mounted over it unless you specify that explicitly during docker run with the -v <host path>:<container path> option. Given that, you can declare multiple VOLUMEs even in your prod environment.
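
    A rough illustration of that single-Dockerfile approach, with placeholder paths and image name:

    # Dockerfile (excerpt) – declare the mount point once
    VOLUME /usr/src/app/src

    # production: run the image as-is; the VOLUME stays an anonymous volume
    docker run -p 3000:3000 my-node-app

    # development: bind-mount the live source over the same path
    docker run -p 3000:3000 -v "$(pwd)/src:/usr/src/app/src" my-node-app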
