Logs not being flushed to Elasticsearch container through Fluentd

I have a local setup running two containers:

  • One for Elasticsearch (set up for development as detailed here – https://www.elastic.co/guide/en/elasticsearch/reference/current/docker.html). This I run as directed in the article using – docker run -p 9200:9200 -e "http.host=" -e "transport.host=" docker.elastic.co/elasticsearch/elasticsearch:5.4.1

  • Another as a Fluentd aggregator (using this base image – https://hub.docker.com/r/fluent/fluentd/). My fluent.conf for testing purposes is as follows:

    <source>
        @type forward
        port 24224
    </source>
    <match **>
        @type elasticsearch
        host    # Verified internal IP address of the ES container
        port 9200
        user elastic
        password changeme
        index_name fluentd
        buffer_type memory
        flush_interval 60
        retry_limit 17
        retry_wait 1.0
        include_tag_key true
        tag_key docker.test
        reconnect_on_error true
    </match>

    This I start with the command – docker run -p 24224:24224 -v /data:/fluentd/log vg/fluentd:latest
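    Two of the settings above are worth keeping in mind while debugging: with flush_interval 60, records sit in the memory buffer for up to a minute before being sent, and failed flushes are retried with exponential backoff starting at retry_wait seconds. A small sketch of the resulting retry delays (assuming simple doubling per attempt, ignoring Fluentd's jitter and any max_retry_wait cap):

    ```python
    # Sketch of Fluentd's retry backoff for the settings above:
    # retry_wait 1.0, retry_limit 17. Assumes plain doubling per attempt,
    # ignoring jitter and any max_retry_wait cap.
    def retry_delays(retry_wait=1.0, retry_limit=17):
        return [retry_wait * (2 ** attempt) for attempt in range(retry_limit)]

    delays = retry_delays()
    print(delays[:5])   # [1.0, 2.0, 4.0, 8.0, 16.0]
    print(sum(delays))  # total wait before giving up: 131071.0 seconds
    ```

    So even when the connection succeeds, nothing may show up in Elasticsearch until the first flush interval has elapsed.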

    When I run my processes (that generate logs), and run these 2 containers, I see the following towards the end of stdout for the Fluentd container –

    2017-06-15 12:16:33 +0000 [info]: Connection opened to Elasticsearch cluster => {:host=>"", :port=>9200, :scheme=>"http", :user=>"elastic", :password=>"obfuscated"}

    However, beyond this, I see no logs. When I visit http://localhost:9200 I see only the Elasticsearch welcome message.

    I know the logs are reaching the Fluentd container, because when I change fluent.conf to redirect to a file, I see all the logs as expected. What am I doing wrong in my setup of Elasticsearch? How can I see all the indexes laid out correctly in my browser / through Kibana?
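    For reference, the file-based check mentioned above can be done with a match block like the following (a minimal sketch – the path is an example; /fluentd/log is the volume mounted in the docker run command above):

        <match **>
            @type file
            path /fluentd/log/test
        </match>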

  • One solution

    It seems that you are on the right track. Just check the indexes that were created in Elasticsearch as follows:

    curl 'localhost:9200/_cat/indices?v'
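    One caveat (my addition, not part of the original answer): the elasticsearch:5.4.1 image ships with X-Pack security enabled, so unauthenticated requests are rejected with a 401. If that happens, pass the same credentials used in fluent.conf:

    ```shell
    # Default X-Pack credentials for the 5.x Docker image; change them in production.
    curl -u elastic:changeme 'localhost:9200/_cat/indices?v'
    ```

    The same -u flag applies to the _search request below.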


    There you can see each index name. So pick one and search within it:

    curl 'localhost:9200/INDEXNAME/_search'

    Docs: https://www.elastic.co/guide/en/elasticsearch/reference/current/search-search.html

    However, I recommend using Kibana for a better experience. Just start it – by default it looks for Elasticsearch on localhost. In the interface's configuration, enter the index name that you now know, and start to play with it.
