How to mount Hadoop HDFS

I am trying to mount the Hadoop HDFS that I have set up on Google Compute Engine inside a Docker container running Ubuntu 14.04. Can anyone show me the necessary steps? I have tried following some poor guides on the internet, but it seems like all the packages they point to are broken.

I will add a 500 bounty to a correct answer.

One solution for “How to mount Hadoop HDFS”

    I don’t have an easy way to test HDFS, but I just mounted s3fs inside a Docker Ubuntu container. s3fs also uses FUSE, so I expect the requirements to be similar. I had to run the container with --privileged to get it to work; without it, the FUSE mount would fail with the following error: fuse: failed to open /dev/fuse: Operation not permitted
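
    For reference, this is roughly the sequence I used. Treat it as a sketch rather than a tested recipe: the image tag, bucket name, and credentials are placeholders, and on Ubuntu 14.04 s3fs may need to be built from source rather than installed from the repositories.

        # Run the container privileged so /dev/fuse can be opened;
        # without --privileged the mount fails with the error above.
        docker run -it --privileged ubuntu:14.04 /bin/bash

        # --- inside the container ---
        # Install FUSE (s3fs itself may have to be built from source on 14.04).
        apt-get update && apt-get install -y fuse

        # s3fs credentials file, format "ACCESS_KEY:SECRET_KEY" (placeholders).
        echo 'ACCESS_KEY:SECRET_KEY' > /etc/passwd-s3fs
        chmod 600 /etc/passwd-s3fs

        # Mount the bucket ("my-bucket" is a placeholder name).
        mkdir -p /mnt/s3
        s3fs my-bucket /mnt/s3 -o passwd_file=/etc/passwd-s3fs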

    I am not sure whether Google allows you to run containers in privileged mode, though, so I don’t know if this will work for you even if you could otherwise mount HDFS.
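
    If privileged containers do work for you, the HDFS equivalent of the above would presumably be the hadoop-fuse-dfs helper shipped with Cloudera’s CDH packages. I have not tested this; the package name assumes the CDH apt repository is already configured, and the namenode host and port below are placeholders.

        # Untested sketch: mount HDFS over FUSE using the CDH helper.
        apt-get install -y hadoop-hdfs-fuse   # needs the CDH apt repo configured
        mkdir -p /mnt/hdfs
        hadoop-fuse-dfs dfs://namenode.example.com:8020 /mnt/hdfs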
