Spark reading HBase fails with NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C

A NoSuchMethodError is almost always caused by a jar conflict: the code was compiled against one version of a class, but an older version that lacks the method ends up on the runtime classpath.

java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C

 at org.apache.spark.SSLOptions$$anonfun$8.apply(SSLOptions.scala:188)
 at org.apache.spark.SSLOptions$$anonfun$8.apply(SSLOptions.scala:188)
 at scala.Option.orElse(Option.scala:289)
 at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:188)
 at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:117)
 at org.apache.spark.SparkEnv$.create(SparkEnv.scala:236)
 at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
 at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
 at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
 at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
 at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
 at scala.Option.getOrElse(Option.scala:121)
 at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
 at com.lcc.source.hbase.staticTable.ReadStaticHbase.readHbase(ReadStaticHbase.scala:33)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498)
 at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
 at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
 at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
 at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
 at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
 at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
 at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
 at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
 at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
 at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
 at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
 at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
 at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
 at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:51)
 at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
 at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)

This problem is caused by a jar conflict, but when I checked the dependencies in Maven I found no conflict, which was puzzling.

[Screenshot: the Maven dependency view in the IDE, showing no apparent conflict]
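If the IDE view is inconclusive, Maven's own dependency tree is worth checking too; for example, dependency:tree's standard -Dincludes filter narrows the output to just the Hadoop artifacts:

    mvn dependency:tree -Dincludes=org.apache.hadoop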
Later I clicked into org.apache.hadoop.conf.Configuration.getPassword and saw that the class was coming from a Hadoop 2.5.1 jar.
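A quick way to confirm at runtime which jar a class was actually loaded from is to ask for its code source; a minimal sketch in Scala (the object and its name are just for illustration):

    object WhereIsConfiguration {
      def main(args: Array[String]): Unit = {
        // Prints the jar that Configuration was loaded from,
        // e.g. .../hadoop-common-2.5.1.jar on the broken classpath.
        // getCodeSource can be null for JDK bootstrap classes, hence the guard.
        val src = classOf[org.apache.hadoop.conf.Configuration]
          .getProtectionDomain.getCodeSource
        println(if (src != null) src.getLocation else "bootstrap classpath")
      }
    }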
But I had not pulled in Hadoop anywhere myself; I had only added Spark 2.4, Kudu 1.2, and HBase 1.2. Suspecting a version mismatch, I checked the CDH cluster and found its HDFS version was 2.6.0 (Configuration.getPassword was only added in Hadoop 2.6.0, which is why a 2.5.1 jar lacks it), so I added this to Maven:

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.6.0</version>
    </dependency>

Then I excluded the conflicting jars:

[Screenshot: the dependency exclusions in the IDE's Maven view]

Here I excluded only the 2.5.1 artifacts; anything the IDE flagged in red that was already 2.6.0 I left alone. After that, the job ran successfully.
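For reference, a minimal sketch of what such an exclusion looks like in the pom. It assumes the old hadoop-common came in transitively through the HBase dependency, which is plausible here since HBase 1.2 builds against Hadoop 2.5.1 by default; adjust the coordinates to whatever your dependency tree actually shows:

    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>1.2.0</version>
        <exclusions>
            <!-- Keep the transitive 2.5.1 Hadoop jar out; hadoop-client 2.6.0 above supplies it. -->
            <exclusion>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-common</artifactId>
            </exclusion>
        </exclusions>
    </dependency>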

Reposted from blog.csdn.net/qq_21383435/article/details/93087305