Loading PostgreSQL Database Backup Into Docker/Initial Docker Data

I am migrating an application into Docker. One of the issues I am running into is the correct way to load initial data into PostgreSQL running in Docker. My typical method of restoring a database backup file is not working. I have tried the following:

gunzip -c mydbbackup.sql.gz | psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db> -W

    That does not work, because PostgreSQL prompts for a password, and I cannot enter one because psql is reading data from STDIN. I cannot use the $PGPASSWORD environment variable, because any environment variable I set on my host is not set in my container.

    I also tried a command similar to the one above, except using the -f flag to specify the path to a SQL backup file. This does not work because the file is not on my container. I could copy the file into my container with the ADD statement in my Dockerfile, but that does not seem right.

    So, I ask the community: what is the preferred method of loading PostgreSQL database backups into Docker containers?

  • One Solution

    I cannot use the $PGPASSWORD environment variable, because any
    environment variable I set on my host is not set in my container.

    I don’t use Docker, but in the command shown your container looks like a remote host, with psql running locally. So PGPASSWORD never has to be set on the remote host, only locally.

    If the problem boils down to adding a password to this command:

    gunzip -c mydbbackup.sql.gz |
      psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db> -W

    you may supply it using several methods (in all cases, don’t use the -W option to psql):

    • hardcoded in the invocation:

       gunzip -c mydbbackup.sql.gz |
        PGPASSWORD=something psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db>
    • typed on the keyboard

       echo -n "Enter password:"
       read -s PGPASSWORD
       export PGPASSWORD
       gunzip -c mydbbackup.sql.gz |
         psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db>
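
    A third method, not mentioned in the answer but worth sketching: store the password in a ~/.pgpass file, so that neither -W nor PGPASSWORD is needed. The connection details below (docker_host, port 5432, database mydb, user dbuser, password secret) are placeholders — substitute your own.

    ```shell
    # Append an entry in the format host:port:database:username:password.
    # Placeholder values -- replace with your real connection details.
    echo 'docker_host:5432:mydb:dbuser:secret' >> ~/.pgpass
    chmod 0600 ~/.pgpass   # psql silently ignores the file unless it is private to you
    # Then restore with no -W and no PGPASSWORD in the environment:
    #   gunzip -c mydbbackup.sql.gz | psql -h docker_host -p 5432 -U dbuser -d mydb
    ```

    Unlike PGPASSWORD, the entry survives across shells, and wildcards (*) are allowed in any field of the ~/.pgpass line.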

    Note about the -W or --password option to psql.

    The point of this option is to ask for a password to be typed first thing, even if the context makes it unnecessary.

    It’s frequently misunderstood as the equivalent of the -p option of mysql. This is a mistake: while -p is required for password-protected connections in mysql, -W is never required for psql and actually gets in the way when scripting.

           -W, --password
               Force psql to prompt for a password before connecting to a
               database.

               This option is never essential, since psql will automatically
               prompt for a password if the server demands password
               authentication. However, psql will waste a connection attempt
               finding out that the server wants a password. In some cases it is
               worth typing -W to avoid the extra connection attempt.
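
    Separately, for the "initial Docker data" part of the question: if the container is built from the official postgres image, that image's entrypoint automatically executes any *.sql, *.sql.gz, or *.sh file placed in /docker-entrypoint-initdb.d when the data directory is first initialized, so the backup can be loaded without any psql invocation from the host. A minimal Dockerfile sketch (the image tag and backup file name are illustrative):

    ```dockerfile
    # Assumes the official postgres image; the backup is restored automatically
    # on first start, before the server accepts outside connections.
    FROM postgres:13
    COPY mydbbackup.sql.gz /docker-entrypoint-initdb.d/
    ```

    The same effect can be had without building a custom image by bind-mounting the dump, e.g. docker run -v "$PWD/mydbbackup.sql.gz:/docker-entrypoint-initdb.d/mydbbackup.sql.gz" postgres:13.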