Docker with multiple exposed ports

I have a container with, say, three exposed ports: 1000 (Node.js/Express), 1001 (Python/Flask) and 1002 (Angular 2 client). When I use

docker run --name test -d -p 1000:1000 -p 1001:1001 -p 1002:1002 docker_image

only the Express server is reachable from the host machine. However, when I log into the container and curl each port, all three servers respond just fine.

One answer for "Docker with multiple exposed ports":

    Once you have done the following:

    1. EXPOSE the ports in the Dockerfile
    2. pass a -p flag for each port to publish it on the host
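    A minimal sketch of step 1, assuming the three ports from the question (the base image, build steps and start command are placeholders):

    ```dockerfile
    FROM node:latest            # placeholder base image
    # ... install and copy the three apps here ...

    # Step 1: declare the container ports. Note that EXPOSE alone does not
    # publish anything; publishing happens via the -p flags in step 2.
    EXPOSE 1000 1001 1002

    CMD ["./start-all.sh"]      # hypothetical script that launches all three servers
    ```

    Step 2 is exactly the `-p 1000:1000 -p 1001:1001 -p 1002:1002` flags already shown in the question's docker run command.
    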

    you just need to make sure that your services allow external connections.

    E.g. for Python/Flask: by default it listens only on localhost (127.0.0.1), which is unreachable through Docker's port mapping. Make sure it's listening on 0.0.0.0 (all interfaces) so connections from outside the container are accepted.
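    For example, a minimal Flask app bound to all interfaces (the route, message and port here are assumptions for illustration):

    ```python
    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        return "hello from flask"

    if __name__ == "__main__":
        # Flask's default host is 127.0.0.1, which only accepts connections
        # from inside the container; bind to 0.0.0.0 so Docker's published
        # port (-p 1001:1001) can actually reach the server.
        app.run(host="0.0.0.0", port=1001)
    ```

    The same applies to the Express and Angular dev servers: each must listen on 0.0.0.0 (or all interfaces), not just localhost.
    
    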
