1. Make sure the version of the distributed environment matches the version of the Eclipse plugin (0.20.205.0); otherwise Eclipse reports an error when connecting.
2. Repackage the plugin. The following jars need to be bundled into the plugin package:
lib/jackson-core-asl-1.8.8.jar, lib/jackson-mapper-asl-1.8.8.jar, lib/commons-configuration-1.6.jar, lib/commons-lang-2.4.jar, lib/commons-httpclient-3.0.1.jar, lib/commons-cli-1.2.jar
as follows:
Modify MANIFEST.MF:
Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/jackson-core-asl-1.8.8.jar,lib/jackson-mapper-asl-1.8.8.jar,lib/commons-configuration-1.6.jar,lib/commons-lang-2.4.jar,lib/commons-httpclient-3.0.1.jar,lib/commons-cli-1.2.jar
Otherwise Eclipse fails to connect, reporting:
"Map/Reduce location status updater". org/codehaus/jackson/map/JsonMappingException
After some investigation, this turns out to be because the Hadoop Eclipse plugin is missing those jars.
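To confirm the jars actually made it onto the plugin's classpath, you can try loading the class named in the error. A minimal sketch (only the class name comes from the error above; the wrapper class is made up for illustration):

    public class JacksonCheck {
        public static void main(String[] args) {
            try {
                // The class the plugin failed to find at runtime.
                Class.forName("org.codehaus.jackson.map.JsonMappingException");
                System.out.println("jackson-mapper-asl is on the classpath");
            } catch (ClassNotFoundException e) {
                System.out.println("jackson-mapper-asl is missing: " + e);
            }
        }
    }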
3. Prepare the test class:
package com.hadoop.learn.test;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
import org.apache.log4j.Logger;

public class WordCountTest {

    private static final Logger log = Logger.getLogger(WordCountTest.class);

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            log.info("Map key : " + key);
            log.info("Map value : " + value);
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                String wordStr = itr.nextToken();
                word.set(wordStr);
                log.info("Map word : " + wordStr);
                context.write(word, one);
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            log.info("Reduce key : " + key);
            log.info("Reduce value : " + values);
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            log.info("Reduce sum : " + sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        String[] otherArgs = new GenericOptionsParser(config, args).getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: WordCountTest <in> <out>");
            System.exit(2);
        }

        Job job = new Job(config, "word count test");
        job.setJarByClass(WordCountTest.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
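The class expects an input path and an output path. A typical invocation looks like this (the jar name and paths are placeholders, not from the original setup):

    hadoop jar wordcount-test.jar com.hadoop.learn.test.WordCountTest /user/hadoop/test.txt /user/hadoop/output

Note that the output directory must not already exist; the job fails at submission if it does.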
4. Configure the hosts file:
192.168.197.131 hadoop-namenode
Otherwise, referring to the input path by raw IP fails:
java.lang.IllegalArgumentException: Wrong FS: hdfs://192.186.54.1:8020/user/hadoop/test.txt, expected: hdfs://hadoop1
The correct form addresses HDFS by hostname (the name configured in fs.default.name) rather than by raw IP.
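A minimal sketch for sanity-checking a path against the configured filesystem (the hostname, port, and file path below are assumptions based on the hosts entry above, not a verified config):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class FsCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Assumed value; it must match the cluster's fs.default.name.
            conf.set("fs.default.name", "hdfs://hadoop-namenode:8020");
            FileSystem fs = FileSystem.get(conf);
            // Hostname, not raw IP, or Path validation throws "Wrong FS".
            Path p = new Path("hdfs://hadoop-namenode:8020/user/hadoop/test.txt");
            System.out.println(p + " exists: " + fs.exists(p));
        }
    }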
5. Recompile hadoop-core-0.20.205.0.jar
When running, you may get this error:
12/04/24 15:32:44 ERROR security.UserGroupInformation: PriviledgedActionException as:Administrator cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Administrator\mapred\staging\Administrator-519341271\.staging to 0700
at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
This is caused by file-permission handling on Windows; on Linux the job runs normally and the problem does not occur.
The fix is to edit F:\编程开发\hadoop\older\hadoop-0.20.203.0rc1\hadoop-0.20.203.0\src\core\org\apache\hadoop\fs\RawLocalFileSystem.java and comment out the checkReturnValue call (somewhat crude, but on Windows the check can safely be skipped).
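Roughly, the edit turns the permission check into a no-op. This is a sketch of the pattern only, not the verbatim 0.20.x source (in some versions checkReturnValue lives in FileUtil.java instead, as the stack trace above suggests):

    private void checkReturnValue(boolean rv, Path p, FsPermission permission)
            throws IOException {
        // Original body threw IOException("Failed to set permissions of path: ...")
        // when rv was false. The chmod emulation on Windows reports failure even
        // though the job can proceed, so the check is skipped entirely.
    }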
Then recompile. The compilation itself may fail:
(1) ant starts downloading dependencies and compiling. My build hit a compile error here; investigation showed that the package-info.java generated by $hadoop_home/src/saveVersion.sh was malformed, which prevented compilation. Tweak saveVersion.sh accordingly:
(2) /hadoop/mapred/gridmix/Gridmix.java:396: error: type argument ? extends T is not within bounds of type-variable E
This one requires editing /src/contrib/gridmix/src/java/org/apache/hadoop/mapred/gridmix/Gridmix.java:
-    private <T> String getEnumValues(Enum<? extends T>[] e) {
+    private String getEnumValues(Enum<?>[] e) {
         StringBuilder sb = new StringBuilder();
         String sep = "";
-        for (Enum<? extends T> v : e) {
+        for (Enum<?> v : e) {
             sb.append(sep);
             sb.append(v.name());
             sep = "|";
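The error occurs because Enum is declared as Enum<E extends Enum<E>>, and with an unbounded T javac cannot prove that ? extends T satisfies that bound; the plain wildcard Enum<?> does. A standalone sketch of the fixed shape (the Color enum and class name are invented for illustration):

    public class EnumValuesDemo {

        enum Color { RED, GREEN, BLUE }

        // Same shape as the fixed Gridmix method: unbounded wildcard.
        static String getEnumValues(Enum<?>[] e) {
            StringBuilder sb = new StringBuilder();
            String sep = "";
            for (Enum<?> v : e) {
                sb.append(sep).append(v.name());
                sep = "|";
            }
            return sb.toString();
        }

        public static void main(String[] args) {
            System.out.println(getEnumValues(Color.values())); // RED|GREEN|BLUE
        }
    }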
With all of the above in place, the job runs successfully.