java - Hadoop program cannot find installed binary


I have Hadoop set up and running on my machine, and I am trying to run the PEGASUS peta-scale graph mining program. When I run make, there are a lot of warnings; the following is part of the output:

13/03/21 12:58:42 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/hadi_edge/catepillar_star.edge could only be replicated to 0 nodes, instead of 1
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
    at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)

    at org.apache.hadoop.ipc.Client.call(Client.java:1066)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at $Proxy1.addBlock(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy1.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)

13/03/21 12:58:42 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
13/03/21 12:58:42 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/hadoop/hadi_edge/catepillar_star.edge" - Aborting...
put: java.io.IOException: File /user/hadoop/hadi_edge/catepillar_star.edge could only be replicated to 0 nodes, instead of 1
13/03/21 12:58:42 ERROR hdfs.DFSClient: Exception closing file /user/hadoop/hadi_edge/catepillar_star.edge : org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/hadi_edge/catepillar_star.edge could only be replicated to 0 nodes, instead of 1
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
    at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)

org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/hadi_edge/catepillar_star.edge could only be replicated to 0 nodes, instead of 1
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1558)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:696)
    at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)

    at org.apache.hadoop.ipc.Client.call(Client.java:1066)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at $Proxy1.addBlock(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy1.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)

I thought it was a permission issue, so I tried sudo make and got the following error:

Hadoop is not installed in the system. Please install Hadoop and make sure the hadoop binary is accessible. make: *** [demo_hadi] Error 127

Hadoop is installed; this is what jps prints:

hadoop@polaris:~/pegasus$ jps
7814 JobTracker
8061 TaskTracker
3799 FsShell
7718 SecondaryNameNode
9155 FsShell
8881 RunJar
7235 NameNode
6339 RunJar
9236 Jps

Thanks for your time; I want to know what is going wrong!

As visible from the jps output, the DataNode is not running on your system (which is also the probable reason for "could only be replicated to 0 nodes, instead of 1").

This happens sometimes.
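Before restarting anything, it can help to peek at the DataNode log to see why it went down (a namespaceID mismatch after an earlier NameNode re-format is a common cause). A minimal sketch, assuming the default log directory and the user/host shown in your shell prompt (adjust the file name to whatever is actually in $HADOOP_HOME/logs):

    # log file name follows the pattern hadoop-<user>-datanode-<host>.log
    tail -n 50 $HADOOP_HOME/logs/hadoop-hadoop-datanode-polaris.log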

Solution: stop the Hadoop daemons and start them again, then check if all 5 daemons are running (NameNode, SecondaryNameNode, DataNode, JobTracker, TaskTracker).
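For example, on a Hadoop 1.x single-node setup (assuming $HADOOP_HOME/bin is on your PATH; adjust paths to your install), the restart and check look roughly like this:

    # stop and restart all HDFS and MapReduce daemons
    stop-all.sh
    start-all.sh

    # confirm that all 5 daemons now show up (plus Jps itself)
    jps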

Try running some sample code to test that the framework is working fine.
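One quick smoke test (a sketch; the examples jar name and location vary with the Hadoop version) is one of the bundled example jobs, e.g. the Pi estimator:

    # run the bundled Pi estimator: 2 maps, 10 samples per map
    hadoop jar $HADOOP_HOME/hadoop-examples-*.jar pi 2 10

If that job finishes, HDFS and MapReduce are healthy and the PEGASUS demo target should at least get past the put step that is failing in your log.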

Restart the machine if the above solution isn't working.

If the issue persists, reformat the NameNode (not advised, though; it should be kept as the last resort).
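For reference, a re-format on Hadoop 1.x is roughly the following, and it erases everything stored in HDFS, so treat it strictly as a last resort:

    # LAST RESORT: this wipes all data stored in HDFS
    stop-all.sh
    hadoop namenode -format   # answer Y when prompted
    start-all.sh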

