Testing a Hadoop program fails with: java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.31.225:9000/user/root

While testing the PutMerge program, it fails with java.lang.IllegalArgumentException: Wrong FS: hdfs://192.168.31.225:9000/user/root. The source code of the test program is as follows:

package com.hadoop.demo;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PutMerge {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // fs.defaultFS is not set here, so FileSystem.get(conf) falls back to the
        // default (local) file system instead of HDFS
        //conf.set("fs.defaultFS", "hdfs://192.168.31.225:9000");
        FileSystem hdfs = FileSystem.get(conf);
        FileSystem local = FileSystem.getLocal(conf);

        // local directory whose files will be merged
        Path inputDir = new Path("D:/hadooptest");
        // target file on HDFS; its hdfs:// scheme does not match the file system
        // obtained above, which is what triggers the Wrong FS exception
        Path hdfsFile = new Path("hdfs://192.168.31.225:9000/user/root/example.txt");

        try{
            // list every file in the local input directory and open a single
            // output stream on HDFS that they will all be appended to
            FileStatus[] inputFiles = local.listStatus(inputDir);
            FSDataOutputStream out = hdfs.create(hdfsFile);

            for(int i = 0; i < inputFiles.length; i++){
                System.out.println(inputFiles[i].getPath().getName());
                FSDataInputStream in = local.open(inputFiles[i].getPath());
                byte[] buffer = new byte[256];
                int bytesRead = 0;
                while((bytesRead = in.read(buffer))>0){
                    out.write(buffer,0,bytesRead);
                }
                in.close();
            }
            out.close();

        }catch(Exception ex){
            ex.printStackTrace();
        }
        System.out.println("end---------->");
    }
}

Solution:
The root cause is that, with fs.defaultFS unset, FileSystem.get(conf) returns the local file system, which refuses a path with an hdfs:// scheme. Either of the following fixes it:
1. Copy core-site.xml and hdfs-site.xml from the cluster into the current project, under the bin folder of the Eclipse workspace, so that Configuration picks up the cluster settings from the classpath.
2. Add conf.set("fs.defaultFS", "hdfs://192.168.31.225:9000"); to the code (see the sketch below).
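
A minimal sketch of option 2, assuming the NameNode address used in this post; the class name PutMergeFixed and the exists() check are only for illustration:

package com.hadoop.demo;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PutMergeFixed {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // point the client at the NameNode so FileSystem.get(conf) returns an
        // HDFS file system instead of the local one
        conf.set("fs.defaultFS", "hdfs://192.168.31.225:9000");
        FileSystem hdfs = FileSystem.get(conf);

        // with fs.defaultFS set, a scheme-less path resolves against HDFS,
        // so the Wrong FS check no longer fails
        Path hdfsFile = new Path("/user/root/example.txt");
        System.out.println(hdfs.exists(hdfsFile));
    }
}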

Reposted from blog.csdn.net/u012343297/article/details/79990446