Scala + Spring Boot + Spring Cloud + Spark: resolving the Jackson and SLF4J conflicts

Environment

  • JDK: 1.8
  • Spring Boot: 1.5.2
  • Scala: 2.11.8
  • Spark: 2.2.1
  • build tool: Maven

How the problem appeared

A Spring Boot + Scala project was set up the usual way and tested fine. After adding the Spark dependencies, Spring Boot still starts, but it prints the following warning:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/vabsh/.m2/repository/ch/qos/logback/logback-classic/1.1.11/logback-classic-1.1.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/vabsh/.m2/repository/org/slf4j/slf4j-log4j12/1.7.21/slf4j-log4j12-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]

Running a Spark job then fails with the following error:

Exception in thread "Thread-7" java.lang.ExceptionInInitializerError
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
    at org.apache.spark.rdd.RDD.takeSample(RDD.scala:557)
    at com.mongodb.spark.sql.MongoInferSchema$.apply(MongoInferSchema.scala:70)
    at com.mongodb.spark.MongoSpark.toDF(MongoSpark.scala:581)
    at com.mongodb.spark.MongoSpark$.load(MongoSpark.scala:84)
    at github.clyoudu.test.executor.SystemEventTopNExecutor.execute(SystemEventTopNExecutor.scala:21)
    at github.clyoudu.test.controller.ComputeTriggerController$$anon$1.run(ComputeTriggerController.scala:40)
    at java.lang.Thread.run(Thread.java:748)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.7
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:745)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
    at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
    ... 8 more

This makes it fairly clear that the problem is the Jackson version: either Spark 2.2.1 itself cannot work with Jackson 2.8.7, or some dependency inside Spark cannot.
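
The stack trace points at jackson-module-scala's version check: spark-core 2.2.1 pulls in jackson-module-scala_2.11 2.6.5, and its setupModule refuses to register against a jackson-databind from a different minor version (2.8.7 here). A minimal probe, runnable inside the broken project as it stands, reproduces the error without going through Spark (the object name is made up for illustration):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

object JacksonVersionProbe {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper()
    // With jackson-module-scala 2.6.5 (from spark-core) and jackson-databind 2.8.7
    // (from spring-boot-starter-web) on the same classpath, this call throws
    // JsonMappingException: Incompatible Jackson version: 2.8.7
    mapper.registerModule(DefaultScalaModule)
    println("registered Scala module against jackson-databind " + mapper.version())
  }
}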

Resolution process

First, run mvn dependency:tree -Dincludes=com.fasterxml.jackson.core to see where the Jackson artifacts on the compile classpath come from. The result:

[INFO] github.clyoudu.test:offlineCompute:jar:1.0-SNAPSHOT
[INFO] \- org.springframework.boot:spring-boot-starter-web:jar:1.5.2.RELEASE:compile
[INFO]    \- com.fasterxml.jackson.core:jackson-databind:jar:2.8.7:compile
[INFO]       +- com.fasterxml.jackson.core:jackson-annotations:jar:2.8.0:compile
[INFO]       \- com.fasterxml.jackson.core:jackson-core:jar:2.8.7:compile

They come from spring-boot-starter-web. Checking Spring Boot 1.5.2.RELEASE on mvnrepository confirms it really does manage jackson-databind 2.8.7.

The first idea was to downgrade Spring Boot so that Spring Boot and Spark agree on the Jackson version. After browsing a few Spring Boot releases, 1.3.2.RELEASE turns out to ship Jackson 2.6.5, the version Spark 2.2.1 expects. With Spring Boot changed to that version, the application starts and the Spark job runs without any errors.
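
For reference, that experiment is nothing more than a version change on the Boot starters (shown here for spring-boot-starter-web only; presumably the other Boot starters would be downgraded the same way):

<!-- temporary downgrade so Spring Boot's Jackson matches the version Spark expects -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <version>1.3.2.RELEASE</version>
</dependency>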

This confirms that the Jackson version conflict is what makes the Spark job fail. So switch back to spring-boot 1.5.2.RELEASE and force a single Jackson version for both spring-boot and spark:

<!-- specify jackson for spark-core & spring-boot -->
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.8.7</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.8.7</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-scala_2.11</artifactId>
    <version>2.8.7</version>
</dependency>

Run mvn dependency:tree -Dincludes=com.fasterxml.jackson.core again:

[INFO] github.clyoudu.test:offlineCompute:jar:1.0-SNAPSHOT
[INFO] +- com.fasterxml.jackson.core:jackson-databind:jar:2.8.7:compile
[INFO] |  \- com.fasterxml.jackson.core:jackson-annotations:jar:2.8.0:compile
[INFO] \- com.fasterxml.jackson.core:jackson-core:jar:2.8.7:compile

All Jackson jars now resolve to the explicitly specified version.

But this introduced another problem: logback stopped working and logs were no longer written to the configured directory, which was puzzling; after adding a few exclusions it even threw a ClassNotFoundException caused by conflicting jars.

That recalls the SLF4J conflict seen earlier: spring-boot already ships logback's SLF4J binding, while spark-core brings in slf4j-log4j12. So exclude slf4j-log4j12 from spark-core:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
    <exclusions>
        <exclusion>
            <artifactId>slf4j-log4j12</artifactId>
            <groupId>org.slf4j</groupId>
        </exclusion>
    </exclusions>
</dependency>
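
To confirm that slf4j-log4j12 does not still reach the classpath through some other dependency, the same dependency:tree filter used earlier can be pointed at the binding (the exact output depends on the project):

mvn dependency:tree -Dincludes=org.slf4j:slf4j-log4j12

If nothing matches, logback (from spring-boot) is the only remaining SLF4J binding and the "multiple SLF4J bindings" warning disappears.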

After a restart everything works: spring-boot starts normally, the service registers with the eureka server, the Spark job runs successfully, and logback writes logs as configured.
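
For context, the call chain in the earlier stack trace (MongoSpark.load -> MongoInferSchema -> RDD.withScope) corresponds to a load of roughly this shape. The real SystemEventTopNExecutor is not shown in the article, so this is only a hypothetical minimal sketch with a made-up connection URI:

import com.mongodb.spark.MongoSpark
import org.apache.spark.sql.SparkSession

object MongoLoadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("offlineCompute")
      // hypothetical connection settings, not taken from the article
      .config("spark.mongodb.input.uri", "mongodb://localhost/test.systemEvent")
      .getOrCreate()

    // the schema inference triggered here is what hit RDDOperationScope and the Jackson version check
    val df = MongoSpark.load(spark)
    df.show()
  }
}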

Dependencies

<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <version>1.5.2.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-starter-eureka</artifactId>
        <version>1.3.2.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-actuator</artifactId>
        <version>1.5.2.RELEASE</version>
    </dependency>

    <!-- scala -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
    </dependency>

    <!-- mongo-spark-connector -->
    <dependency>
        <groupId>org.mongodb.spark</groupId>
        <artifactId>mongo-spark-connector_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>

    <!-- spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.1</version>
        <exclusions>
            <exclusion>
                <artifactId>slf4j-log4j12</artifactId>
                <groupId>org.slf4j</groupId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>commons-compiler</artifactId>
        <version>2.7.8</version>
    </dependency>

    <!-- scala mongo client -->
    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>casbah-core_2.11</artifactId>
        <version>3.0.0</version>
    </dependency>

    <!-- fastjson -->
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>fastjson</artifactId>
        <version>1.2.47</version>
    </dependency>

    <!-- datetime -->
    <dependency>
        <groupId>com.github.nscala-time</groupId>
        <artifactId>nscala-time_2.11</artifactId>
        <version>2.18.0</version>
    </dependency>

    <!-- specify jackson for spark-core&spring-boot-->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.8.7</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>2.8.7</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.module</groupId>
        <artifactId>jackson-module-scala_2.11</artifactId>
        <version>2.8.7</version>
    </dependency>
</dependencies>

Summary

  • A ClassNotFoundException means either the dependency really was never added, or there is a jar conflict.
  • Besides exclusions, version conflicts can also be resolved by force-specifying the version in the POM (see the sketch after this list).
  • When an artifact is depended on by many other packages, exclusions may not work well.
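
As a sketch of that "force-specify in the POM" approach, the same result can also be achieved with dependencyManagement, which pins the version of transitive dependencies without adding new direct dependencies (versions shown are the ones used above):

<dependencyManagement>
    <dependencies>
        <!-- pin every Jackson artifact, including transitives, to one version -->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.8.7</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>2.8.7</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-scala_2.11</artifactId>
            <version>2.8.7</version>
        </dependency>
    </dependencies>
</dependencyManagement>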

Reposted from blog.csdn.net/cl_yd/article/details/80324824