Create Docker image for NodeJS + PostgreSQL web application

I’ve been reading Docker’s documentation, but I can’t manage to create an image that works.

I have a NodeJS application that uses PostgreSQL as its database:

    var connectionString = process.env.DATABASE_URL || 'postgres://localhost:5432/db';
    var pg = require('pg');
    var pgp = require('pg-promise')();
    var db = pgp(connectionString);

    I first created a Dockerfile according to Node’s documentation for it:

    FROM node:argon
    # Create app directory
    RUN mkdir -p /app
    WORKDIR /app
    # Install app dependencies
    COPY package.json /app
    RUN npm install
    # Bundle app source
    COPY . /app
    EXPOSE 5000
    CMD [ "npm", "start" ]

    I then followed this post regarding connecting the database to it with docker-compose. The docker-compose.yml file looks like:

    web:
      build: .
      ports:
        - "5000:5000"
      volumes:
        - .:/app
      links:
        - db
      environment:
        DATABASE_URL: postgres://myuser:mypass@db:5432/db
    db:
      image: postgres
      environment:
        POSTGRES_USER: myuser
        POSTGRES_PASSWORD: mypass

    This is (some of) what is returned when I run docker-compose up with these files, after creating the image.

    npm info ok 
     ---> 87dbbae35721
    Removing intermediate container c73f826a0b3d
    Step 6 : COPY . /app
     ---> ec56bfc11d3c
    Removing intermediate container 745ddf82d742
    Step 7 : EXPOSE 5000
     ---> Running in b2be5aecd9d6
     ---> a7d126a7ea5e
    Removing intermediate container b2be5aecd9d6
    Step 8 : CMD npm start
     ---> Running in 0379d512c688
     ---> 266517f47311
    Removing intermediate container 0379d512c688
    Successfully built 266517f47311
    WARNING: Image for service web was built because it did not already exist. To rebuild this image you must use `docker-compose build` or `docker-compose up --build`.
    Starting imagename_db_1
    Creating imagename_web_1
    Attaching to imagename_db_1, imagename_web_1
    web_1  | npm info it worked if it ends with ok
    web_1  | npm info using npm@2.15.1
    web_1  | npm info using node@v4.4.3
    web_1  | npm info prestart SharedServer@5.8.0
    web_1  | npm info start SharedServer@5.8.0
    web_1  | 
    web_1  | > SharedServer@5.8.0 start /app
    web_1  | > node index.js
    web_1  | 
    web_1  | Wed, 27 Apr 2016 00:41:19 GMT body-parser deprecated bodyParser: use individual json/urlencoded middlewares at index.js:13:9
    web_1  | Wed, 27 Apr 2016 00:41:19 GMT body-parser deprecated undefined extended: provide extended option at node_modules/body-parser/index.js:105:29
    web_1  | Node app is running on port 5000
    db_1   | LOG:  database system was interrupted; last known up at 2016-04-25 00:17:59 UTC
    db_1   | LOG:  database system was not properly shut down; automatic recovery in progress
    db_1   | LOG:  invalid record length at 0/17076E8
    db_1   | LOG:  redo is not required
    db_1   | LOG:  MultiXact member wraparound protections are now enabled
    db_1   | LOG:  database system is ready to accept connections
    db_1   | LOG:  autovacuum launcher started

    When I access http://localhost:5000, I see the web application running, but whenever I fire up something that tries to access the database, I get an HTTP 500 error with the following body:

    code: "28P01"
    file: "auth.c"
    length: 98
    line: "285"
    name: "error"
    routine: "auth_failed"
    severity: "FATAL"

    What am I doing wrong? I’m not sure I understand what I’m doing with Docker, and the only documentation I have consists of simple recipes for building specific environments (or at least, that’s what I’ve understood).


One solution:

    Check that your pg_hba.conf in $PGDATA allows connections from node.js.
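
    If you’re using the official postgres image, $PGDATA is typically /var/lib/postgresql/data. As a quick sketch, assuming the db container is named imagename_db_1 (as in the logs above), you can inspect the file from the host with:

    docker exec imagename_db_1 cat /var/lib/postgresql/data/pg_hba.conf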

    By default the pg_hba.conf is like so:

    # TYPE  DATABASE        USER            ADDRESS                 METHOD
    # "local" is for Unix domain socket connections only
    local   all             all                                     trust
    # IPv4 local connections:
    host    all             all             127.0.0.1/32            trust
    # IPv6 local connections:
    host    all             all             ::1/128                 trust

    This is fine for standard psql connectivity as the OS owner, since local connections from the localhost address are trusted. However, if the server has its own IP address, which I’m guessing it does, you need to allow an entry for it, because your node.js configuration refers to a host of “db”. So ping “db” and add that IP address:

    # TYPE  DATABASE        USER            ADDRESS                 METHOD
    # "local" is for Unix domain socket connections only
    local   all             all                                     trust
    # IPv4 local connections:
    host    all             all             127.0.0.1/32            trust
    # IPv6 local connections:
    host    all             all             ::1/128                 trust
    host    db              myuser          <ip-for-db-host>        md5

    Once you’ve changed that file, you will need to perform a pg_ctl reload.
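
    Since PostgreSQL runs inside the db container here, one way to do that reload is through docker exec (container name assumed to be imagename_db_1, matching the logs above). Either of the following is a reasonable sketch:

    # send SIGHUP via pg_ctl, running as the postgres user
    docker exec -u postgres imagename_db_1 pg_ctl reload -D /var/lib/postgresql/data

    # or ask the server itself to re-read its configuration
    docker exec -u postgres imagename_db_1 psql -c "SELECT pg_reload_conf();"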
