Loading PostgreSQL Database Backup Into Docker/Initial Docker Data

I am migrating an application into Docker. One of the issues I am bumping into is how to correctly load the initial data into PostgreSQL running in Docker. My typical method of restoring a database backup file is not working. I have tried the following:

gunzip -c mydbbackup.sql.gz | psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db> -W

That does not work, because PostgreSQL prompts for a password, and I cannot type one because psql is already reading data from STDIN. I cannot use the $PGPASSWORD environment variable, because any environment variable I set on my host is not set in my container.

I also tried a similar command to the one above, except using the -f flag and specifying the path to a SQL backup file. That does not work because the file is not on my container. I could copy the file into my container with the ADD statement in my Dockerfile, but this does not seem right.
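
Just to illustrate what I mean by the ADD approach, here is a rough sketch (the base image and target path are only placeholders):

    FROM postgres
    # bake the compressed backup into the image so that psql, run inside the container, can reach it
    ADD mydbbackup.sql.gz /tmp/mydbbackup.sql.gz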

So, I ask the community: what is the preferred method of loading PostgreSQL database backups into Docker containers?

One Solution collected from the web for “Loading PostgreSQL Database Backup Into Docker/Initial Docker Data”

    I cannot use the $PGPASSWORD environment variable, because any
    environment variable I set on my host is not set in my container.

    I don’t use Docker, but in the command shown your container looks like a remote host, with psql running locally. So PGPASSWORD never has to be set on the remote host, only locally.

    If the problem boils down to adding a password to this command:

    gunzip -c mydbbackup.sql.gz |
      psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db> -W
    

    you can supply it using any of several methods (in all cases, don’t use the -W option to psql):

    • hardcoded in the invocation:

       gunzip -c mydbbackup.sql.gz |
        PGPASSWORD=something psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db>
      
    • typed on the keyboard:

       echo -n "Enter password:"
       read -s PGPASSWORD
       export PGPASSWORD
       gunzip -c mydbbackup.sql.gz |
         psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db>
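
    Another option, not shown above but standard psql/libpq behaviour (a sketch only, reusing the same placeholders and the dummy password from the first method), is a ~/.pgpass file, so the password ends up neither on the command line nor in an environment variable:

       # store the credentials in ~/.pgpass (one line per connection: host:port:database:user:password)
       echo '<docker_host>:<docker_port>:<db>:<dbuser>:something' > ~/.pgpass
       chmod 600 ~/.pgpass   # libpq ignores the file (with a warning) unless it is 0600 or stricter
       gunzip -c mydbbackup.sql.gz |
         psql -h <docker_host> -p <docker_port> -U <dbuser> -d <db>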
      

    Note about the -W or --password option to psql.

    The point of this option is to ask for a password to be typed first thing, even if the context makes it unnecessary.

    It’s frequently misunderstood as the equivalent of the -p option of mysql. This is a mistake: while -p is required for password-protected connections, -W is never required and actually gets in the way when scripting.

           -W, --password
               Force psql to prompt for a password before connecting to a
               database.
    
               This option is never essential, since psql will automatically
               prompt for a password if the server demands password
               authentication. However, psql will waste a connection attempt
               finding out that the server wants a password. In some cases it is
               worth typing -W to avoid the extra connection attempt.
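
    To make the contrast concrete (the host, user and database names below are made up, purely to illustrate the two prompting behaviours):

       # MySQL: -p is the normal way to ask the client to prompt for a password
       mysql -h dbhost -u appuser -p appdb

       # psql: no flag needed; it prompts on its own if the server demands a password
       psql -h dbhost -U appuser -d appdb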
    