How to echo environment variables in Docker

My docker-compose.yml is:

version: '2'

volumes:
  postgres_data: {}
  postgres_backup: {}

services:
  postgres:
    build: ./compose/postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - postgres_backup:/backups
    env_file: .env

  django:
    build:
      context: .
      dockerfile: ./compose/django/Dockerfile
    user: django
    depends_on:
      - postgres
      - redis
    command: /
    env_file: .env

  nginx:
    build: ./compose/nginx
    depends_on:
      - django
    ports:
      - ""

  redis:
    image: redis:latest
    restart: always

And in my .env file, I have:

    # PostgreSQL

How do I test whether the environment variables are actually set?

I've tried running this on the remote machine:

    docker run sorbetcitron_django echo $POSTGRES_USER

where sorbetcitron_django is my django image, but it outputs nothing.
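The empty output can be reproduced without Docker at all: the host shell expands $POSTGRES_USER before docker run ever starts, so if the variable is unset on the host, echo receives an empty argument. A minimal sketch in plain POSIX sh:

```shell
#!/bin/sh
# The host shell performs the expansion before docker sees the command.
# With POSTGRES_USER unset on the host, the argument is already empty.
unset POSTGRES_USER
cmd="echo $POSTGRES_USER"   # expansion happens here, on the "host"
printf 'docker would receive: [%s]\n' "$cmd"
# prints: docker would receive: [echo ]
```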

One solution for "How to echo environment variables in Docker"

I'd use:

    docker-compose run postgres env

If you pass $POSTGRES_USER on the command line, it gets interpreted by the shell on the host. If you escape the $ instead, you'll need a shell inside the container to evaluate the line, so that the $ is expanded there rather than on the host.
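The quoting difference described above can be sketched without Docker; here the inner `sh -c` stands in for the shell inside the container, and the variable value `debug` is purely illustrative:

```shell
#!/bin/sh
# Double quotes: the outer ("host") shell expands $POSTGRES_USER,
# which is unset out here, so the inner shell just echoes nothing.
unset POSTGRES_USER
outer=$(sh -c "echo $POSTGRES_USER")

# Single quotes: the $ survives intact to the inner ("container")
# shell, which expands it from its own environment.
inner=$(POSTGRES_USER=debug sh -c 'echo $POSTGRES_USER')

printf 'outer=[%s] inner=[%s]\n' "$outer" "$inner"
# prints: outer=[] inner=[debug]
```

By the same logic, wrapping the container command in a single-quoted `sh -c '...'` would let the container's shell do the expansion instead of the host's.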
