Problems accessing multiple docker containers remotely

I’m trying to set up some demo blogs in Docker containers, but I’m having problems when I try to access more than one:

docker run --volumes-from my-data -p 80:8080 --name site1 tutum/wordpress
docker run --volumes-from my-data -p 80:8081 --name site2 tutum/wordpress

I can access the first one at myhost:8080, but I can’t access the second one at myhost:8081.
Is there anything obvious I’m missing?

One solution for “Problems accessing multiple docker containers remotely”

Yes. The -p argument tells Docker how to map external (host) addresses to internal (container) addresses, in the form -p HOST_PORT:CONTAINER_PORT. You are instructing it to map port 80 of all host interfaces to port 8080 or 8081 of the respective container. In your example both containers also compete for host port 80, which only one of them can bind, so the second cannot be reached on the port you expect.

If the container processes really listen on ports 8080 and 8081, you would want -p 8080:8080 and -p 8081:8081. If the containers run standard web servers on port 80, use -p 8080:80 and -p 8081:80 instead. With the correct mapping, each container’s service becomes reachable on port 8080 or 8081 of all host interfaces.
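Assuming the tutum/wordpress image serves its site on container port 80 (the usual case for that image) and that ports 8080 and 8081 are free on your host, the corrected commands would look like this:

docker run --volumes-from my-data -p 8080:80 --name site1 tutum/wordpress
docker run --volumes-from my-data -p 8081:80 --name site2 tutum/wordpress

You can then check which host ports a running container actually publishes with, for example, docker port site1.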
