Docker intercontainer communication

I would like to run Hadoop and Flume in Docker containers. I have a standard Hadoop image with all the default values, but I cannot see how these services can communicate with each other when placed in separate containers.

Flume's Dockerfile looks like this:

    FROM ubuntu:14.04.4
    RUN apt-get update && apt-get install -q -y --no-install-recommends wget
    RUN mkdir /opt/java
    RUN wget --no-check-certificate --header "Cookie: oraclelicense=accept-securebackup-cookie" -qO- \
      | tar zxvf - -C /opt/java --strip 1
    RUN mkdir /opt/flume
    RUN wget -qO- \
      | tar zxvf - -C /opt/flume --strip 1
    ADD flume.conf /var/tmp/flume.conf
    # ADD needs a source and a destination; assuming the script is named
    # start-flume in the build context
    ADD start-flume /opt/flume/bin/start-flume
    ENV JAVA_HOME /opt/java
    ENV PATH /opt/flume/bin:/opt/java/bin:$PATH
    EXPOSE 10000
    CMD [ "start-flume" ]
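As a minimal sketch, the image above could be built and started like this (the image tag and container name are assumptions):

```shell
# build the image from the directory containing the Dockerfile
docker build -t flume:latest .

# run it detached, publishing the exposed port on the host
docker run -d --name flume -p 10000:10000 flume:latest
```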

One solution:

    You should link your containers. There are several ways to do this.

    1) Publish ports:

    docker run -p 50070:50070 hadoop

    The -p option binds port 50070 of your Docker container to port 50070 of the host machine.
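With published ports, the Hadoop container is reachable through the host's address. As a sketch, a flume.conf fragment could point its HDFS sink at the host, assuming the Hadoop container also publishes the NameNode RPC port 8020 (the agent and sink names here are hypothetical):

```properties
# hypothetical flume.conf fragment; agent/sink names are assumptions
agent.sinks.hdfs-sink.type = hdfs
agent.sinks.hdfs-sink.hdfs.path = hdfs://<docker-host-ip>:8020/flume/events
```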

    2) Link containers (using Docker Compose)


    version: '2'
    services:
      hadoop:
        image: hadoop:2.6
      flume:
        image: flume:last
        links:
          - hadoop
    The links option here connects your flume container to the hadoop one, so flume can reach it by the service name hadoop.
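With the containers linked, the hadoop service is resolvable by its service name, so the same flume.conf sink path can use the hostname directly instead of a host IP (again, the agent/sink names and the NameNode port 8020 are assumptions):

```properties
# hypothetical flume.conf fragment; "hadoop" resolves via the compose link
agent.sinks.hdfs-sink.type = hdfs
agent.sinks.hdfs-sink.hdfs.path = hdfs://hadoop:8020/flume/events
```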

    More information is available in the Docker networking and Docker Compose documentation.
