How to mount Hadoop HDFS

I am trying to mount a Hadoop HDFS, which I have set up in Google Compute Engine, inside a Docker container running Ubuntu 14.04. Can anyone show me the necessary steps? I have tried following some guides on the internet, but they were poor and it seems like all the packages are broken.

I will add a 500 bounty to an answer that is correct.

One solution:

    I don’t have an easy way to test HDFS, but I just mounted s3fs inside a Docker Ubuntu container. s3fs also uses FUSE, so I expect it has similar requirements. I had to run the container with --privileged to get it to work; otherwise the FUSE mount would fail with the following error: fuse: failed to open /dev/fuse: Operation not permitted
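    The steps above can be sketched as the following commands. This is a sketch under stated assumptions, not a tested recipe: the image tag, bucket name, mount point, and credentials file are placeholders, and the `--cap-add SYS_ADMIN` / `--device /dev/fuse` pair is the finer-grained alternative to full `--privileged` mode.

    ```shell
    # Start a container with access to the FUSE device.
    # Either run fully privileged:
    docker run --privileged -it ubuntu:14.04 bash
    # ...or grant only what FUSE needs (finer-grained alternative):
    docker run --cap-add SYS_ADMIN --device /dev/fuse -it ubuntu:14.04 bash

    # Inside the container: install s3fs and mount a bucket.
    # "mybucket" and the credentials path are hypothetical placeholders.
    apt-get update && apt-get install -y s3fs
    mkdir -p /mnt/s3
    s3fs mybucket /mnt/s3 -o passwd_file=/etc/passwd-s3fs
    ```

    Without the device and capability grants, the `s3fs` call is the point where you would see the `fuse: failed to open /dev/fuse` error.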

    I am not sure whether Google allows you to run containers in privileged mode, though, so I don’t know if this will work for you even if you can otherwise mount HDFS.
