Resolving the Jackson and SLF4J conflicts in a Scala + Spring Boot + Spring Cloud + Spark project
Environment

- JDK: 1.8
- Spring Boot: 1.5.2
- Scala: 2.11.8
- Spark: 2.2.1
- Build tool: Maven
How the problem appeared

A standard spring-boot + scala project was set up and tested fine. After adding the spark dependency, spring-boot still started, but logged this error:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/vabsh/.m2/repository/ch/qos/logback/logback-classic/1.1.11/logback-classic-1.1.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/vabsh/.m2/repository/org/slf4j/slf4j-log4j12/1.7.21/slf4j-log4j12-1.7.21.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
Running a spark job then failed with the following error:
Exception in thread "Thread-7" java.lang.ExceptionInInitializerError
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.takeSample(RDD.scala:557)
at com.mongodb.spark.sql.MongoInferSchema$.apply(MongoInferSchema.scala:70)
at com.mongodb.spark.MongoSpark.toDF(MongoSpark.scala:581)
at com.mongodb.spark.MongoSpark$.load(MongoSpark.scala:84)
at github.clyoudu.test.executor.SystemEventTopNExecutor.execute(SystemEventTopNExecutor.scala:21)
at github.clyoudu.test.controller.ComputeTriggerController$$anon$1.run(ComputeTriggerController.scala:40)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.7
at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:745)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
... 8 more
From this it is fairly clear that the jackson version is the problem: either spark 2.2.1 itself cannot work with jackson 2.8.7, or some of spark's transitive dependencies cannot.
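The root cause behind that stack trace is a version guard inside jackson-module-scala: when the module is registered on an ObjectMapper, it compares the running jackson-databind version against the version it was built for and throws the "Incompatible Jackson version" JsonMappingException when the major.minor pair differs. A simplified, self-contained sketch of that check (the class and method here are hypothetical, for illustration only, not the library's actual code):

```java
// Simplified sketch of the major.minor compatibility check that
// jackson-module-scala performs in JacksonModule.setupModule.
// Hypothetical illustration, not the library's real implementation.
public class JacksonVersionCheck {

    // Returns true when the two versions agree on major and minor,
    // e.g. "2.6.5" vs "2.6.7" -> true, "2.6.5" vs "2.8.7" -> false.
    static boolean compatible(String moduleVersion, String databindVersion) {
        String[] m = moduleVersion.split("\\.");
        String[] d = databindVersion.split("\\.");
        return m[0].equals(d[0]) && m[1].equals(d[1]);
    }

    public static void main(String[] args) {
        // Spark 2.2.1 builds against the jackson 2.6 line, while
        // spring-boot-starter-web 1.5.2.RELEASE brings databind 2.8.7.
        System.out.println(compatible("2.6.5", "2.8.7")); // false: mismatch
        System.out.println(compatible("2.8.7", "2.8.7")); // true: aligned
    }
}
```

This is why the failure only surfaces when a spark job first touches an RDD: RDDOperationScope registers DefaultScalaModule in a static initializer, and that registration is where the check fires.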
The fix, step by step

First, run mvn dependency:tree -Dincludes=com.fasterxml.jackson.core to see where the compiled-in jackson artifacts come from:
[INFO] github.clyoudu.test:offlineCompute:jar:1.0-SNAPSHOT
[INFO] \- org.springframework.boot:spring-boot-starter-web:jar:1.5.2.RELEASE:compile
[INFO]    \- com.fasterxml.jackson.core:jackson-databind:jar:2.8.7:compile
[INFO]       +- com.fasterxml.jackson.core:jackson-annotations:jar:2.8.0:compile
[INFO]       \- com.fasterxml.jackson.core:jackson-core:jar:2.8.7:compile
The tree shows they come from spring-boot-starter-web, and checking Spring Boot 1.5.2.RELEASE on mvnrepository confirms it really does bring in jackson-databind 2.8.7.

The first idea was to downgrade spring-boot so that spring-boot and spark would agree on the jackson version. Browsing a few spring-boot releases showed that 1.3.2.RELEASE uses jackson 2.5.6; after switching to that version, the application started and the spark job ran without apparent errors.

That confirmed the jackson version conflict is what breaks the spark job. So, switching back to spring-boot 1.5.2.RELEASE, the jackson version was pinned explicitly for both spring-boot and spark:
<!-- specify jackson for spark-core & spring-boot -->
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.8.7</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.8.7</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-scala_2.11</artifactId>
    <version>2.8.7</version>
</dependency>
Running mvn dependency:tree -Dincludes=com.fasterxml.jackson.core again:
[INFO] github.clyoudu.test:offlineCompute:jar:1.0-SNAPSHOT
[INFO] +- com.fasterxml.jackson.core:jackson-databind:jar:2.8.7:compile
[INFO] | \- com.fasterxml.jackson.core:jackson-annotations:jar:2.8.0:compile
[INFO] \- com.fasterxml.jackson.core:jackson-core:jar:2.8.7:compile
All the jackson jars now come from the pinned version. But then another problem appeared: logback stopped working, and logs were no longer written to the configured directory, which was very strange. After adding some exclusions it even threw a ClassNotFoundException caused by jar conflicts.
That recalled the slf4j conflict seen earlier: spring-boot ships its own logback and slf4j, while spark-core also brings in slf4j-log4j12. So exclude slf4j-log4j12 from spark-core:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
    <exclusions>
        <exclusion>
            <artifactId>slf4j-log4j12</artifactId>
            <groupId>org.slf4j</groupId>
        </exclusion>
    </exclusions>
</dependency>
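Excluding slf4j-log4j12 removes the conflicting SLF4J binding, but spark's transitive dependencies may still log through the log4j API directly. A common complement (an assumption here, not something this setup required) is to also exclude log4j itself and route those calls into SLF4J via the log4j-over-slf4j bridge, so spark's output lands in logback too; the 1.7.21 version below is a hypothetical choice matching the slf4j-log4j12 version seen in the warning above:

```
<!-- Hypothetical refinement: also drop log4j and bridge its API into SLF4J. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>log4j-over-slf4j</artifactId>
    <version>1.7.21</version>
</dependency>
```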
After restarting, everything works: spring-boot starts normally, the service registers with the eureka server, the spark job runs successfully, and logback behaves as expected.
Full dependency list
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.11</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <version>1.5.2.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-starter-eureka</artifactId>
        <version>1.3.2.RELEASE</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-actuator</artifactId>
        <version>1.5.2.RELEASE</version>
    </dependency>
    <!-- scala -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
    </dependency>
    <!-- mongo-spark-connector -->
    <dependency>
        <groupId>org.mongodb.spark</groupId>
        <artifactId>mongo-spark-connector_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <!-- spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.1</version>
        <exclusions>
            <exclusion>
                <artifactId>slf4j-log4j12</artifactId>
                <groupId>org.slf4j</groupId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>commons-compiler</artifactId>
        <version>2.7.8</version>
    </dependency>
    <!-- scala mongo client -->
    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>casbah-core_2.11</artifactId>
        <version>3.0.0</version>
    </dependency>
    <!-- fastjson -->
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>fastjson</artifactId>
        <version>1.2.47</version>
    </dependency>
    <!-- datetime -->
    <dependency>
        <groupId>com.github.nscala-time</groupId>
        <artifactId>nscala-time_2.11</artifactId>
        <version>2.18.0</version>
    </dependency>
    <!-- specify jackson for spark-core & spring-boot -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.8.7</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>2.8.7</version>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.module</groupId>
        <artifactId>jackson-module-scala_2.11</artifactId>
        <version>2.8.7</version>
    </dependency>
</dependencies>
Summary

- A ClassNotFoundException means either the dependency really is missing, or two jars conflict.
- Besides exclusions, version conflicts can be resolved by forcing a version directly in the pom.
- When one jar is pulled in by many different dependencies, exclusions may not work well.
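For the "force the version in the pom" approach, Maven's dependencyManagement section is the idiomatic place to pin a transitive version once, rather than declaring direct dependencies as this post does. A sketch using the versions from this post:

```
<!-- Sketch: pin jackson once; the managed versions also apply to
     transitive dependencies, so every module sees the same jackson. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.8.7</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>2.8.7</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-scala_2.11</artifactId>
            <version>2.8.7</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```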