Tuesday, 15 May 2012

java - Access to HDFS files from all computers of a cluster

My hadoop program was launched in local mode, and now the goal is to run it in distributed mode. For this it is necessary to provide access, from all computers of the cluster, to the files that are read inside the mapper and reducer functions, which is why I asked a question on http://answers.mapr.com/questions/4444/syntax-of-option-files-in-hadoop-script (also, since it is not known on which computer the mapper function will be executed (in the program logic there is only one mapper and the program is launched with one mapper), it is also necessary to provide access across the cluster to the file arriving at the input of the mapper function). In this regard I have a question: is it possible to use HDFS files directly, i.e. to copy the files beforehand from the Linux file system into the HDFS file system (thereby, I assume, these files become available on all computers of the cluster; if that is not so, please correct me), and then use the HDFS Java API to read these files in the mapper and reducer functions executing on the computers of the cluster?

If the answer to this question is positive, please give an example of copying files from the Linux file system into the HDFS file system, reading these files in a Java program by means of the HDFS Java API, and recording their contents into a Java String.

Copy the input files to the master node (this can be done using scp). Log in to the master node (ssh) and execute the following to copy the files from the local filesystem into HDFS:

hadoop fs -put $localfilelocation $destination

Now, in your hadoop jobs, you may use hdfs:///$destination as the input. There is no need to use any additional API to read from HDFS.
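As a concrete illustration, here is a minimal driver sketch along those lines; the class name and the /user/me/... paths are only placeholders, and no mapper/reducer classes are set so that the sketch stays self-contained with the identity defaults:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class HdfsInputDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "job reading its input from HDFS");
        job.setJarByClass(HdfsInputDriver.class);
        // No mapper/reducer classes are set, so the identity implementations
        // are used; plug in your own classes instead.
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        // $destination from the "hadoop fs -put" step above (placeholder path)
        FileInputFormat.addInputPath(job, new Path("hdfs:///user/me/input"));
        FileOutputFormat.setOutputPath(job, new Path("hdfs:///user/me/output"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}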

If you want to read files from HDFS and use them as additional information besides the input files, refer to this.
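For the last part of the original question (reading an HDFS file into a Java String), a sketch along the following lines, using the standard FileSystem API, should work; the class name and the path are made up for illustration, and the helper can be called, for example, from a mapper's or reducer's setup() method:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsFileReader {
    // Reads the whole HDFS file at the given path and returns it as a String.
    public static String readHdfsFile(Configuration conf, String hdfsPath)
            throws IOException {
        FileSystem fs = FileSystem.get(conf);
        StringBuilder contents = new StringBuilder();
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(fs.open(new Path(hdfsPath))));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                contents.append(line).append('\n');
            }
        } finally {
            reader.close();
        }
        return contents.toString();
    }

    public static void main(String[] args) throws IOException {
        // Example usage; "hdfs:///user/me/extra.txt" is a placeholder path.
        Configuration conf = new Configuration();
        String text = readHdfsFile(conf, "hdfs:///user/me/extra.txt");
        System.out.println(text);
    }
}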

java linux hadoop mapreduce hdfs
