Docker with multiple exposed ports

I have a container with, say, three ports exposed: 1000 (nodejs-express), 1001 (python-flask) and 1002 (angular2-client). When I run

docker run --name test -d -p 1000:1000 -p 1001:1001 -p 1002:1002 docker_image

only the Express server is reachable from the host machine. However, when I log into the container and curl each port, all three servers respond just fine.
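
For reference, this is roughly how I am testing (the URLs are illustrative; curl is available inside the image):

    curl http://localhost:1000/    # from the host: works (express)
    curl http://localhost:1001/    # from the host: fails (flask)
    curl http://localhost:1002/    # from the host: fails (angular)

    docker exec -it test curl http://localhost:1001/    # from inside the container: works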

One solution collected from the web for “Docker with multiple exposed ports”

    Once you do the following:

    1. EXPOSE the ports in the Dockerfile
    2. pass a -p flag for each port you want published on the host (see the sketch below)
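
    As a rough sketch, assuming all three services run inside one image and the port numbers from the question, the relevant pieces would look like:

        # Dockerfile: document the three service ports
        EXPOSE 1000 1001 1002

        # at run time, publish each port on the host
        docker run --name test -d -p 1000:1000 -p 1001:1001 -p 1002:1002 docker_image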

    After that, you just need to make sure that each service accepts external connections.

    For example, with Python Flask (see http://dixu.me/2015/10/26/How_to_Allow_Remote_Connections_to_Flask_Web_Service/) the default is to listen on localhost only. Make sure it is listening on 0.0.0.0.
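
    A minimal sketch of what the Flask service inside the container would need (the module name and route are made up for illustration; the port matches the question):

        # app.py (hypothetical)
        from flask import Flask

        app = Flask(__name__)

        @app.route('/')
        def index():
            return 'flask reachable from outside the container'

        if __name__ == '__main__':
            # 0.0.0.0 accepts connections on any interface, so the published
            # container port works from the host; the default 127.0.0.1 only
            # answers requests made inside the container itself
            app.run(host='0.0.0.0', port=1001)

    If port 1002 is served by the Angular CLI dev server, the same idea applies: ng serve binds to localhost by default and needs --host 0.0.0.0 to be reachable through the published port.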
