How to mount Hadoop HDFS

I am trying to mount Hadoop HDFS, which I have set up on Google Compute Engine, in a Docker container running Ubuntu 14.04. Can anyone show me the necessary steps? I have tried following several guides on the internet, but the packages they point to all seem to be broken.

I will add a 500-point bounty to a correct answer.

One solution for “How to mount Hadoop HDFS”

    I don’t have an easy way to test HDFS, but I just mounted s3fs inside a Docker Ubuntu container. s3fs also uses FUSE, so I expect the requirements are similar. I had to run the container with --privileged to get it to work; otherwise the FUSE mount failed with the following error: fuse: failed to open /dev/fuse: Operation not permitted
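
    As a concrete starting point, here is a minimal sketch of how the same approach could look for HDFS. It assumes the stock ubuntu:14.04 image, the hadoop-hdfs-fuse package from Cloudera’s CDH apt repository (which you would need to add inside the container first), and a NameNode at namenode.example.com:8020; the host name and port are placeholders, not values from the question.

        # run the container with full privileges so FUSE can open /dev/fuse
        docker run -it --privileged ubuntu:14.04 /bin/bash

        # inside the container: install the FUSE connector for HDFS
        # (hadoop-hdfs-fuse is packaged in Cloudera's CDH repository)
        apt-get update && apt-get install -y hadoop-hdfs-fuse

        # mount the remote HDFS namespace at /mnt/hdfs
        mkdir -p /mnt/hdfs
        hadoop-fuse-dfs dfs://namenode.example.com:8020 /mnt/hdfs

        # sanity check: list the root of the mounted filesystem
        ls /mnt/hdfs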

    I am not sure whether Google allows you to run containers in privileged mode, though, so I don’t know if this will work for you even if you can mount HDFS.
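
    If privileged mode is not an option, FUSE can sometimes be made to work with a narrower grant of permissions. Here is a sketch using Docker’s --cap-add and --device flags (available since Docker 1.2):

        # grant only the capability and device node that FUSE needs,
        # rather than full --privileged access
        docker run -it \
            --cap-add SYS_ADMIN \
            --device /dev/fuse \
            ubuntu:14.04 /bin/bash

        # note: some hosts also require relaxing AppArmor, e.g.
        #   --security-opt apparmor:unconfined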
