Hadoop Config in Docker – Datanode won't connect

I am trying to build a Dockerized Hadoop system, and I am currently stuck on the datanodes refusing to connect to the namenode. Some background: each Docker image runs both its Hadoop role and a FreeIPA client, and all containers use FreeIPA for DNS. All HDFS services run as the hdfs user (uid 6001, gid 6001, group hadoop).
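For reference, each image sets up that user roughly like this (a minimal sketch, not my exact Dockerfile; the comment field is an assumption):

    # create the hadoop group and hdfs user with the fixed ids shared by all containers
    groupadd --gid 6001 hadoop
    useradd --uid 6001 --gid 6001 --comment "HDFS service user" hdfs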

This is the error I am seeing on the namenode:

  • 2014-10-16 15:52:28,066 WARN  [IPC Server handler 4 on 8020] blockmanagement.DatanodeManager (DatanodeManager.java:registerDatanode(738)) - Unresolved datanode registration from 172.31.1.166
    2014-10-16 15:52:28,067 ERROR [IPC Server handler 4 on 8020] security.UserGroupInformation (UserGroupInformation.java:doAs(1494)) - PriviledgedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-300514933-172.31.1.166-50010-1413489147639, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-41426277-e1f8-4154-8189-a0b556231333;nsid=900398376;c=0)
    2014-10-16 15:52:28,068 INFO  [IPC Server handler 4 on 8020] ipc.Server (Server.java:run(2075)) - IPC Server handler 4 on 8020, call org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode from 172.31.1.166:35452 Call#1 Retry#0: error: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-300514933-172.31.1.166-50010-1413489147639, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-41426277-e1f8-4154-8189-a0b556231333;nsid=900398376;c=0)
    org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-300514933-172.31.1.166-50010-1413489147639, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-41426277-e1f8-4154-8189-a0b556231333;nsid=900398376;c=0)
    at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3944)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
    at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2053)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2047)
    

And on the datanode:

    2014-10-16 15:52:28,030 INFO  [DataNode: [file:/data/hdfs/dd]  heartbeating to namenode.example.internal/172.31.1.51:8020] datanode.DataNode (BPServiceActor.java:register(618)) - Block pool BP-763144819-172.31.1.51-1413403838191 (storage id DS-300514933-172.31.1.166-50010-1413489147639) service to namenode.example.internal/172.31.1.51:8020 beginning handshake with NN
    2014-10-16 15:52:28,083 FATAL [DataNode: [file:/data/hdfs/dd]  heartbeating to namenode.example.internal/172.31.1.51:8020] datanode.DataNode (BPServiceActor.java:run(668)) - Initialization failed for block pool Block pool BP-763144819-172.31.1.51-1413403838191 (storage id DS-300514933-172.31.1.166-50010-1413489147639) service to namenode.example.internal/172.31.1.51:8020
    org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException): Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-300514933-172.31.1.166-50010-1413489147639, infoPort=50075, ipcPort=50020, storageInfo=lv=-47;cid=CID-41426277-e1f8-4154-8189-a0b556231333;nsid=900398376;c=0)
    at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:739)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3944)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:948)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
    at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:24079)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2053)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2047)
    
    at org.apache.hadoop.ipc.Client.call(Client.java:1347)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy9.registerDatanode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy9.registerDatanode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
    at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
    at java.lang.Thread.run(Thread.java:744)
    

One solution:

I figured it out!

Hadoop needs both forward and reverse DNS to resolve for every node, and I had never created the reverse (PTR) records in FreeIPA. Make sure to do that!
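For anyone else hitting this, the fix with the FreeIPA CLI looks roughly like the following (a sketch, not my exact commands; the reverse zone name follows from the 172.31.1.x subnet in the logs, but the datanode hostname is hypothetical):

    # create the reverse zone for the 172.31.1.0/24 subnet
    ipa dnszone-add 1.31.172.in-addr.arpa.

    # add a PTR record mapping 172.31.1.166 back to the datanode's hostname
    ipa dnsrecord-add 1.31.172.in-addr.arpa. 166 --ptr-rec=datanode1.example.internal.

    # verify from the namenode that reverse lookup now resolves
    dig -x 172.31.1.166 +short

Once the reverse lookup resolves, the namenode can map the registering datanode's IP back to a hostname, and the "Unresolved datanode registration" error goes away.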
