How to reduce the time required by a Docker image to install dependencies?

I have some Node Docker containers which basically just look like this:

# core nodejs just installs node and git on archlinux
FROM core/nodejs

# clones directory into current working dir
RUN git clone https://github.com/bodokaiser/nearby .

# installs all dependencies
RUN npm install

# lets node execute the source code
CMD ["node", "index.js"]

When I now rebuild the image so that it picks up new updates, it downloads all dependencies from npm again. This always takes about 5 minutes.

I am now wondering how I could avoid reinstalling all dependencies on every rebuild.

One idea I had so far is to use VOLUME and share the code repository with the host, but this would make it hard to use the image on other hosts.

Update:
Another idea I have is to create a volume container which contains the git repo and which is shared with the runtime container. However, the repo container would somehow need to be able to trigger a rebuild of the runtime container.
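
Roughly, I picture something like this (the image and container names here are made up), using a data container and --volumes-from:

# a data container that just holds the cloned repo
docker run -v /repo --name nearby-repo core/nodejs \
    git clone https://github.com/bodokaiser/nearby /repo

# the runtime container mounts /repo from the data container
docker run --volumes-from nearby-repo my-runtime-image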

  One solution:

    It sounds like what you’re after is having a base image that builds your dependencies and a local image that extends it so that you can build / run quickly.

    Something like:

    base/Dockerfile

    # core nodejs just installs node and git on archlinux
    FROM core/nodejs

    # work from a known directory
    WORKDIR /app

    # npm needs the package.json, so copy it in before installing dependencies
    COPY package.json .
    RUN npm install
    

    Then, with your package.json placed alongside that Dockerfile in base/, you could do a:

    cd base
    docker build -t your-image-name-base:your-tag .
    

    local/Dockerfile

    FROM your-image-name-base:your-tag

    # git refuses to clone into a non-empty directory, so clone the
    # source into a subdirectory next to /app/node_modules
    RUN git clone https://github.com/bodokaiser/nearby src
    WORKDIR /app/src

    # lets node execute the source code (modules resolve from /app/node_modules)
    CMD ["node", "index.js"]
    

    Then build your local image:

    cd local
    docker build -t your-image-name-local:your-tag .
    

    And run it like:

    docker run your-image-name-local:your-tag
    

    Now your local image will build really quickly because it extends your base image, which has already done the heavy lifting of installing the dependencies.
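
    When the dependencies themselves change (say you edit package.json), rebuild the base image first and then the local image, reusing the same commands as above:

    cd base
    docker build -t your-image-name-base:your-tag .
    cd ../local
    docker build -t your-image-name-local:your-tag .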

    As an alternative to doing a git clone inside your container, you could mount your code directory into the container, so that when you make changes to the code on your host they are immediately reflected inside the container:

    local/Dockerfile

    FROM your-image-name-base:your-tag

    # run from the directory where the host code will be mounted;
    # keeping it under /app lets node resolve /app/node_modules
    WORKDIR /app/src

    # lets node execute the source code
    CMD ["node", "index.js"]
    

    Then you would run:

    docker run -v /path/to/your/code:/app/src your-image-name-local:your-tag
    

    This will mount the directory inside your container and then execute your CMD.
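
    As a rough workflow sketch (the container name nearby is made up), you could run it in the background and restart it whenever you want index.js re-run with your latest changes:

    docker run -d --name nearby -v /path/to/your/code:/app/src your-image-name-local:your-tag

    # node will not reload files by itself, so restart to re-run index.js
    docker restart nearby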
