Hadoop Programming Pitfalls


When writing programs against any Hadoop component, running them on Windows often fails with:

    java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
        at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
        at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
        at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2214)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2214)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2214)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:322)
        at leftOutJoin.sparkTopN$.main(SparkLefOutJoin.scala:19)
        at leftOutJoin.sparkTopN.main(SparkLefOutJoin.scala)

This usually means the system is missing a helper executable that Hadoop expects on Windows:

    null\bin\winutils.exe
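The literal `null` in that path is the giveaway: Hadoop's `Shell` class builds the path by prepending the Hadoop home directory (from the `hadoop.home.dir` property or the `HADOOP_HOME` environment variable), and when neither is set, Java string concatenation turns the missing value into the text `"null"`. A simplified sketch of that concatenation (an illustration of the idea, not Hadoop's actual source):

```java
import java.io.File;

public class WinutilsPathDemo {
    // Hypothetical simplification of how the winutils path gets assembled:
    // home + separator + "bin" + separator + executable.
    static String qualifiedBinPath(String home, String executable) {
        // When home is null, "" + null yields the string "null",
        // producing null\bin\winutils.exe on Windows.
        return home + File.separator + "bin" + File.separator + executable;
    }

    public static void main(String[] args) {
        String home = System.getProperty("hadoop.home.dir"); // null when unset
        System.out.println(qualifiedBinPath(home, "winutils.exe"));
    }
}
```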

The fix is to insert one line into your code, before any Hadoop or Spark classes are initialized:

    System.setProperty("hadoop.home.dir", "F:\\spack\\hadoop-common-2.2.0-bin-master")
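A minimal sketch of where that line belongs, shown here in Java (the original program is Scala, where the same call works unchanged; the class name and path are only examples, substitute your own winutils directory):

```java
public class HadoopHomeFix {
    public static void main(String[] args) {
        // Must run BEFORE any Hadoop/Spark class is touched: Shell resolves
        // winutils.exe inside a static initializer, so setting the property
        // after a SparkContext exists is too late.
        System.setProperty("hadoop.home.dir", "F:\\spack\\hadoop-common-2.2.0-bin-master");

        // ... now it is safe to create the SparkContext / submit the Hadoop job ...
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

The directory you point at must contain `bin\winutils.exe`; setting the `HADOOP_HOME` environment variable to the same directory achieves the same effect without a code change.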

The path above is my own, of course. If you don't have

    winutils.exe

you can download the copy I uploaded: link


Reprinted from www.cnblogs.com/lzj-/p/11112522.html