Connecting to Hadoop Remotely from IntelliJ IDEA

Windows Setup

1. Hadoop configuration

Download Hadoop and configure it in the local environment:

HADOOP_HOME D:\Tools\hadoop-2.7.7\hadoop-2.7.7
HADOOP_PREFIX   D:\Tools\hadoop-2.7.7\hadoop-2.7.7
PATH += %HADOOP_HOME%\bin

2. Maven pom.xml

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.7.7</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.7.7</version>
</dependency>

3. core-site.xml

Copy the core-site.xml file from the Hadoop cluster into the project's resources directory. Remember to replace any hostnames that are only defined in the cluster's etc/hosts with their actual IP addresses, since the Windows client cannot resolve them.
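As a sketch, a minimal core-site.xml might look like the following. The IP address is the one used later in this guide; the port 9000 is an assumption (the common default for a Hadoop 2.x NameNode), so use whatever your cluster's file actually contains:

```xml
<configuration>
  <!-- fs.defaultFS must point at the NameNode by IP address, not by a
       hostname that only resolves inside the cluster -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://172.18.145.167:9000</value>
  </property>
</configuration>
```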

4. Replace the Windows native binaries

The native Windows binaries are extremely important for development on Windows: these are the Windows executables and DLLs under %HADOOP_HOME%\bin. Download the prebuilt files and overwrite the existing ones in the bin directory.

(hadoop.dll, winutils.exe, winutils.pdb, and so on.) Make sure the versions match your Hadoop release!
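If the IDE does not pick up the HADOOP_HOME environment variable, the same location can be set programmatically before any HDFS call. This is a hedged sketch, not part of the original guide; the path is the one from step 1, and the class name is made up for illustration:

```java
public class HadoopHomeSetup {
    public static void main(String[] args) {
        // Point the Hadoop client at the local install from step 1 so it can
        // find winutils.exe; only needed if HADOOP_HOME is not picked up.
        System.setProperty("hadoop.home.dir", "D:\\Tools\\hadoop-2.7.7\\hadoop-2.7.7");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

This must run before the first `FileSystem.get()` call, because the native library lookup happens when the Hadoop classes are first used.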

5. Write the code

import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsReadDemo {
    public static void main(String[] args) throws Exception {
        String filePath = "hdfs://172.18.145.167/user/zwj/wordCountTest.txt";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(filePath), conf);
        InputStream in = null;
        try {
            // stream the HDFS file to stdout in 4 KB chunks
            in = fs.open(new Path(filePath));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}

Works perfectly.

Troubleshooting

1. org.apache.hadoop.hdfs.BlockMissingException

A DataNode has gone down; check the DataNode logs on the cluster to find the cause.

2.Exception in thread "main" java.lang.UnsatisfiedLinkError:...nativeIO$Wi.....

The Windows native binaries (step 4) do not match your Hadoop version.

3.hadoop Permission denied : USER=ZWJ,access=WRITE......

The Windows user name does not match the user name on the Ubuntu server, so the write is rejected.

hadoop fs -chmod 777 filePath (this makes the files under filePath readable, writable, and executable by the owner, the group, and everyone else)
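As an alternative to opening up permissions with chmod 777, the client can identify itself to HDFS as the cluster-side user. This is a sketch under assumptions: the user name "zwj" is taken from the file path used earlier in this guide, and the class name is invented for illustration; substitute your own cluster account:

```java
public class HdfsUserSetup {
    public static void main(String[] args) {
        // Make the HDFS client act as the cluster-side user instead of the
        // Windows login name; must be set before the first FileSystem.get()
        System.setProperty("HADOOP_USER_NAME", "zwj");
        System.out.println(System.getProperty("HADOOP_USER_NAME"));
    }
}
```

The same effect can be achieved by setting HADOOP_USER_NAME as an environment variable in the IDE's run configuration.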

Reprinted from www.cnblogs.com/tillnight1996/p/10689922.html