How to connect to localhost postgres database from docker container?

I’ve configured my project for Docker. I have a database that was used before I moved to Docker, and now I want to connect my docker-compose db service to it. But when I run docker-compose up, the existing database is not used – a new one is created instead (I suspect the Docker container simply doesn’t see the database). If I’m doing something nonsensical, please let me know. Maybe I should migrate my server DB into the container.

Here is my docker-compose.yml:

    services:
      db:
        restart: always
        image: postgres:latest
        environment:
          - POSTGRES_DB=mydb
          - POSTGRES_PASSWORD=p@ssw0rd
          - POSTGRES_USER=root
        ports:
          - "5432:5432"
        volumes:
          # We'll mount the 'postgres-data' volume into the location where Postgres stores its data:
          - postgres-data:/var/lib/postgresql/data
      web:
        build: .
        command: bash -c "python manage.py collectstatic --noinput && ./manage.py migrate && ./run_gunicorn.sh"
        volumes:
          - .:/code
          - /static:/static
        ports:
          - 443:443
        depends_on:
          - db
      nginx:
        restart: always
        image: nginx:latest
        ports:
          - 80:80
        volumes:
          - ./misc/nginx.conf:/etc/nginx/conf.d/default.conf
          - /static:/static
        depends_on:
          - web
    

2 Solutions for “How to connect to localhost postgres database from docker container?”

    I think the canonical approach is to have your DB engine running in a container while storing the data on persistent storage (map the volume to your hard disk).
    So I would use Postgres in Docker as the server DB, as you suggested.
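
    For example, a minimal sketch of that layout (the host path /srv/pgdata is an assumption; any directory on the host's disk works, or keep a named volume declared at the top level):

    version: '2'
    services:

      db:
        restart: always
        image: postgres:latest
        environment:
          - POSTGRES_DB=mydb
          - POSTGRES_PASSWORD=p@ssw0rd
          - POSTGRES_USER=root
        volumes:
          # Bind-mount a host directory so the Postgres data outlives the container
          - /srv/pgdata:/var/lib/postgresql/data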

    If you only want your application to connect to the external database, declare it as an external host:

    version: '2'
    services:
    
      web:
        build: .
        command: bash -c "python manage.py collectstatic --noinput && ./manage.py migrate && ./run_gunicorn.sh"
        volumes:
          - .:/code
          - /static:/static
        ports:
          - 443:443
        extra_hosts:
          - "db:192.168.1.2"
    
      nginx:
        restart: always
        image: nginx:latest
        ports:
          - 80:80
        volumes:
          - ./misc/nginx.conf:/etc/nginx/conf.d/default.conf
          - /static:/static
        depends_on:
          - web
    

    Just be sure your application references the database as db, and replace the IP I put there with your host's IP.
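
    For instance, a minimal sketch of handing that hostname to the web service through environment variables (the variable names DATABASE_HOST and DATABASE_PORT are assumptions; your settings code would have to read them when opening the connection):

    version: '2'
    services:

      web:
        build: .
        extra_hosts:
          - "db:192.168.1.2"
        environment:
          # Hypothetical variable names; the application settings must read
          # these and use them when opening the database connection.
          - DATABASE_HOST=db
          - DATABASE_PORT=5432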

    Regards
