Tuesday, 15 February 2011

mapreduce - ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception -




I am getting the following error while starting the DataNode when bringing up a single-node cluster on my machine.

************************************************************/
2013-02-18 20:21:32,300 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = somnath-laptop/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.4
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1393290; compiled by 'hortonfo' on Wed Oct 3 05:13:58 UTC 2012
************************************************************/
2013-02-18 20:21:32,593 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2013-02-18 20:21:32,618 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
2013-02-18 20:21:32,620 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2013-02-18 20:21:32,620 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2013-02-18 20:21:33,052 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi registered.
2013-02-18 20:21:33,056 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already exists!
2013-02-18 20:21:37,890 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception: java.io.IOException: Connection reset by peer
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
    at org.apache.hadoop.ipc.Client.call(Client.java:1075)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at sun.proxy.$Proxy5.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:370)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:429)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:331)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:296)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:356)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:299)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1582)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1521)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1539)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1665)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1682)
Caused by: java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:251)
    at sun.nio.ch.IOUtil.read(IOUtil.java:224)

Any thoughts on how to resolve this error?
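Before touching any configuration, one quick sanity check is whether anything is actually listening on the NameNode RPC endpoint the DataNode is calling (localhost:54310 in the log above). A minimal sketch in Python; the host and port are taken from this log, not a general default:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the NameNode RPC endpoint from the DataNode log.
print(is_port_open("localhost", 54310))
```

If this prints False, the NameNode itself is not up (or is bound elsewhere) and the DataNode error is just a symptom.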

OK, I got the problem solved.

Since I had been using the single-node cluster behind a network proxy, I had added the property below to $HADOOP_HOME/conf/mapred-site.xml to bypass the proxy server while the Hadoop daemons communicate with each other.

However, this time I was running on a direct internet connection, so I had to comment out the property I had added to mapred-site.xml.

Below is the mapred-site.xml property I commented out:

<!--
<property>
  <name>hadoop.rpc.socket.factory.class.default</name>
  <value>org.apache.hadoop.net.StandardSocketFactory</value>
  <final>true</final>
  <description>Prevent proxy settings set up by clients in their job configs
  from affecting our connectivity.</description>
</property>
-->
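A quick way to verify the edit took effect: XML comments are invisible to a parser, so reparsing the file should no longer show the property. The sketch below is illustrative (the embedded sample stands in for the real $HADOOP_HOME/conf/mapred-site.xml, and the helper name is my own, not part of Hadoop):

```python
import xml.etree.ElementTree as ET

# Sample mapred-site.xml content after commenting out the proxy property;
# in practice you would read the real file from $HADOOP_HOME/conf instead.
SAMPLE = """<configuration>
  <!--
  <property>
    <name>hadoop.rpc.socket.factory.class.default</name>
    <value>org.apache.hadoop.net.StandardSocketFactory</value>
  </property>
  -->
</configuration>"""

def active_property_names(xml_text: str) -> set:
    """Return the <name> values of properties the parser still sees;
    commented-out blocks do not appear."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name") for p in root.findall("property")}

print("hadoop.rpc.socket.factory.class.default" in active_property_names(SAMPLE))
```

Remember that the daemons only pick up the change after a restart (stop-all.sh / start-all.sh on this Hadoop 1.0.x setup).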

Labels: hadoop, mapreduce, hadoop-streaming
