Docker app and database on different containers

I have a Node app running in one Docker container, a MongoDB database in another, and a Redis database in a third. In development I want to work with these three containers (and not pollute my system with database installations), but in production I want the databases installed locally and only the app in Docker.

The app assumes the databases are running on localhost. I know I can forward ports from containers to the host, but can I forward ports between containers so the app can access the databases? Publishing the same port from different containers to the host creates a collision.
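For reference, I start the development containers roughly like this (the image and container names are just placeholders):

      # databases with their default ports published to the host
      $ docker run -d --name mongo -p 27017:27017 mongo
      $ docker run -d --name redis -p 6379:6379 redis

      # the app container; inside it, "localhost" refers to the container itself
      $ docker run -d --name app my-node-app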

I also know the containers will be on the same bridged network, and using the curl command I found out they're connected and I can reach them at their container IP addresses. However, I was hoping to make this work without changing the "localhost" specification in the code.

Is there a way to forward these ports? Perhaps in my app's Dockerfile using iptables? I want my app's container to be able to reach MongoDB at "localhost:27017", for example, even though they're in separate containers.

I'm using Docker for Mac (v1.13.1). In production we'll use Docker on an Ubuntu server.

I'm somewhat of a noob. Thank you for your help.

One solution for "Docker app and database on different containers"

Docker only lets you map container ports to host ports (not the reverse), but there are a few ways to achieve what you want:

• You can use --net=host, which makes the container use your host's network stack instead of the default bridge. Note that this can raise security issues, because the container can potentially reach any other service running on your host (a minimal sketch follows this list).

• You can run something inside your container that maps a local port to a remote one, e.g. rinetd or an SSH tunnel. This basically creates a mapping localhost:SOME_PORT -> HOST_IP_IN_DOCKER0:SOME_PORT (see the second sketch after this list).

• As stated in the comments, you can create a small script that extracts the host's IP address (e.g. ifconfig docker0 | awk '/inet addr/{print substr($2,6)}') and expose it as an environment variable (a sketch of such a script follows this list).
  Supposing that script is wrapped in a command named getip, you could run it like this:

      $ docker run -e DOCKER_HOST=$(getip) ...
      

  and then, inside the container, use the DOCKER_HOST environment variable to connect to your services.
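A minimal sketch of the first option, assuming the databases publish their default ports on the host and the app image is called my-node-app (a placeholder name):

      # databases publish their default ports on the host
      $ docker run -d -p 27017:27017 mongo
      $ docker run -d -p 6379:6379 redis

      # the app shares the host's network stack, so "localhost:27017"
      # and "localhost:6379" reach the published ports
      $ docker run --net=host my-node-app

Note that on Docker for Mac, host networking applies to Docker's embedded Linux VM rather than to macOS itself, so this option is most useful on the Ubuntu production server.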

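A sketch of the second option using rinetd inside the app container, assuming the databases' ports are published on the host as above; 172.17.0.1 is the default docker0 address, so verify yours:

      # /etc/rinetd.conf inside the app container
      # (format: bindaddress bindport connectaddress connectport)
      127.0.0.1 27017 172.17.0.1 27017
      127.0.0.1 6379  172.17.0.1 6379

      # then start the forwarder, which reads /etc/rinetd.conf
      $ rinetd

With this in place the app can keep connecting to "localhost:27017" and "localhost:6379" unchanged.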
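For the third option, the getip wrapper could be as simple as the following (this assumes the classic ifconfig output format; on newer systems ip addr show docker0 gives the same information):

      #!/bin/sh
      # getip: print the host's docker0 bridge address
      ifconfig docker0 | awk '/inet addr/{print substr($2,6)}'

The app would then read DOCKER_HOST (for example process.env.DOCKER_HOST in Node) instead of the hard-coded "localhost", so this option does require a small change to the connection settings.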