Problems accessing multiple docker containers remotely

I’m trying to set up some demo blogs in Docker containers, but I’m having problems when I try to access more than one:

docker run --volumes-from my-data -p 80:8080 --name site1 tutum/wordpress
docker run --volumes-from my-data -p 80:8081 --name site2 tutum/wordpress

I can access the first one at myhost:8080, but I can’t access the second one at myhost:8081.
Is there anything obvious I’m missing?

Answer:

Yes. The -p argument tells Docker how to map external (host) addresses to internal (container) addresses, in the form -p <host-port>:<container-port>. You are instructing it to map port 80 of all host interfaces to port 8080/8081 of the respective container, so both containers compete for host port 80 and the second run fails to bind it. Assuming the container processes really listen on ports 8080/8081, you might want to try -p 8080:8080 / -p 8081:8081. If the containers run standard web servers on port 80, use -p 8080:80 / -p 8081:80 instead. With the proper mapping, each container’s service becomes accessible on port 8080/8081 of all host interfaces.
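
As a minimal sketch, assuming the tutum/wordpress image serves WordPress on port 80 inside the container (the usual default for a stock WordPress setup), the corrected commands might look like this:

docker run --volumes-from my-data -p 8080:80 --name site1 tutum/wordpress
docker run --volumes-from my-data -p 8081:80 --name site2 tutum/wordpress

Each container listens on its own internal port 80, and the host maps distinct external ports onto them, so myhost:8080 reaches site1 and myhost:8081 reaches site2 without the two containers contending for the same host port.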
