How do you share volumes between Docker containers in an Elastic Beanstalk application?

I’m trying to share data between two Docker containers running on the same EC2 instance in a multicontainer Docker Elastic Beanstalk environment.

Normally, I would specify the volume as a command-line flag when running the container, e.g.:

    docker run -p 80:80 -p 443:443 --link Widget:Widget --volumes-from Widget --name Nginx1 -d nginx1

to share a volume from Widget to Nginx1.

However, since Elastic Beanstalk requires you to specify your Docker configuration in a Dockerrun.aws.json file and then handles running your Docker containers internally, I haven’t been able to figure out how to share data volumes between containers.

Note that I’m not trying to share data from the EC2 instance into a Docker container — this part seems to work fine; rather, I would like to share data directly from one Docker container to another. I know that Docker container volumes are shared with the host at "/var/lib/docker/volumes/fac362...80535" etc., but since this location is not static I don’t know how I would reference it in the Dockerrun.aws.json file.
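
For reference, on a plain Docker host you can locate a volume’s current host path with docker inspect (using the Widget container from the example above):

    docker inspect -f '{{ json .Mounts }}' Widget

That path changes whenever the volume is recreated, which is why hard-coding it in Dockerrun.aws.json isn’t an option.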

Has anyone found a solution or a workaround?

More info on Dockerrun.aws.json and the config EB is looking for here: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create_deploy_docker_v2config.html

Thanks!

2 Solutions

To accomplish what you want, you need to use the volumesFrom parameter correctly: make sure the container that shares its internal data exposes the volume with a VOLUME instruction in its Dockerfile.

Here’s an example Dockerfile which I used to bundle some static files for serving via a webserver:

    # tianon/true is a tiny image whose entrypoint exits immediately;
    # the container only needs to exist so its volume can be shared
    FROM tianon/true
    # Copy the prebuilt static assets into the image
    COPY build/ /opt/static
    # Declare /opt/static as a volume so other containers can mount it via volumesFrom
    VOLUME ["/opt/static"]

Now the relevant parts of the Dockerrun.aws.json:

    {
        "name": "staticfiles",
        "image": "mystaticcontainer",
        "essential": false,
        "memory": 16
    },
    {
        "name": "webserver",
        ...
        "volumesFrom": [
            {
                "sourceContainer": "staticfiles"
            }
        ]
    }

Note that you don’t need any volumes entry in the root of the Dockerrun.aws.json file, since the volume is only shared between the two containers and not persisted on the host. You also don’t need a mountPoints key in the container definition holding the volume to be shared, as the container with volumesFrom automatically picks up all the volumes from the referenced container. In this example, all the files in /opt/static in the staticfiles container will also be available to the webserver container at the same location.
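
For context, here is what the complete file might look like. This is a minimal sketch, assuming an off-the-shelf nginx image for the webserver; the memory values and port mapping are illustrative, not from the original setup:

    {
        "AWSEBDockerrunVersion": 2,
        "containerDefinitions": [
            {
                "name": "staticfiles",
                "image": "mystaticcontainer",
                "essential": false,
                "memory": 16
            },
            {
                "name": "webserver",
                "image": "nginx",
                "essential": true,
                "memory": 128,
                "portMappings": [
                    {
                        "hostPort": 80,
                        "containerPort": 80
                    }
                ],
                "volumesFrom": [
                    {
                        "sourceContainer": "staticfiles"
                    }
                ]
            }
        ]
    }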

From the AWS docs I found this:

    You can define one or more volumes on a container, and then use the volumesFrom parameter in a different container definition (within the same task) to mount all of the volumes from the sourceContainer at their originally defined mount points.

    The volumesFrom parameter applies to volumes defined in the task definition, and those that are built into the image with a Dockerfile.
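
To illustrate the first case (a volume defined in the task definition itself rather than baked into an image), a hedged sketch of the relevant Dockerrun.aws.json fragments could look like the following; the volume name and paths here are hypothetical, and note that a top-level volume with a host sourcePath is persisted on the host, unlike the Dockerfile VOLUME approach above:

    "volumes": [
        {
            "name": "static-content",
            "host": {
                "sourcePath": "/var/app/static"
            }
        }
    ],
    "containerDefinitions": [
        {
            "name": "staticfiles",
            "image": "mystaticcontainer",
            "essential": false,
            "memory": 16,
            "mountPoints": [
                {
                    "sourceVolume": "static-content",
                    "containerPath": "/opt/static"
                }
            ]
        }
    ]

In both cases the consuming container just declares volumesFrom on staticfiles and picks the data up at /opt/static.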
