Problems accessing multiple docker containers remotely

I’m trying to set up some docker container demo blogs but I’m having problems when I try to access more than one:

docker run --volumes-from my-data -p 80:8080 --name site1 tutum/wordpress
docker run --volumes-from my-data -p 80:8081 --name site2 tutum/wordpress

I can access the first one at myhost:8080, but I can’t access the second one at myhost:8081.
Is there anything obvious I’m missing?

One Solution

    Yes. The -p argument tells Docker how to map external (host) addresses to internal (container) addresses. You are instructing it to map port 80 of all host interfaces to port 8080/8081 of the respective container; both containers are therefore competing for host port 80, and nothing is published on host ports 8080 or 8081 at all. If the container processes really listen on ports 8080/8081, try -p 8080:8080 / -p 8081:8081. If the containers run standard web servers on port 80, use -p 8080:80 / -p 8081:80 instead. With the proper mapping, each container’s service becomes accessible on port 8080/8081 of all host interfaces.
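    A minimal sketch of the corrected commands, assuming tutum/wordpress serves HTTP on container port 80 (the container and volume names are carried over from the question; -d to run detached is an addition):

    ```shell
    # Host port comes first in -p HOST:CONTAINER; give each site a distinct host port
    # and point both at the container's web port (80).
    docker run -d --volumes-from my-data -p 8080:80 --name site1 tutum/wordpress
    docker run -d --volumes-from my-data -p 8081:80 --name site2 tutum/wordpress

    # Verify the published mappings:
    docker port site1
    docker port site2
    ```

    Note that sharing --volumes-from my-data means both containers see the same data volume, which may or may not be what you want for two independent demo blogs.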
