My setup is as follows (Spark fails with "Incompatible Jackson version: 2.8.8"):
- Spark 2.2.1
- Scala 2.11
package it.scala

// import the Spark packages
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object Wordcount {
  def main(args: Array[String]) {
    val inputs: Array[String] = new Array[String](2)
    inputs(0) = "C:\\Users\\FobiDell\\Desktop\\input"
    inputs(1) = "C:\\Users\\FobiDell\\Desktop\\output"

    // SparkConf object to set the parameters of the application,
    // which are then handed to the chosen cluster manager
    // (Yarn, Mesos or Standalone)
    val conf = new SparkConf()
    conf.setAppName("Smartphone Addiction")
    conf.setMaster("local")

    // SparkContext object to connect to the chosen cluster manager
    val sc = new SparkContext(conf)

    // Read the file and create an RDD
    val rawData = sc.textFile(inputs(0))

    // Convert the lines into words using the flatMap operation
    val words = rawData.flatMap(line => line.split(" "))

    // Count the individual words using the map and reduceByKey operations
    val wordCount = words.map(word => (word, 1)).reduceByKey(_ + _)

    // Save the result
    wordCount.saveAsTextFile(inputs(1))

    // Stop the Spark context
    sc.stop()
  }
}
I want to run this simple Scala program (Esempio.scala). From the spark-shell everything works, but when I select the file (Esempio.scala) in the Eclipse IDE and run it via Run -> Run As -> Scala Application, I get this exception:
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)
at it.scala.Wordcount$.main(Esempio.scala:47)
at it.scala.Wordcount.main(Esempio.scala)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.8
at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:745)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
... 4 more
My pom.xml file is the following:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>it.hgfhgf.xhgfghf</groupId>
  <artifactId>progetto</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>progetto</name>
  <url>http://maven.apache.org</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <!-- Neo4j JDBC DRIVER -->
    <dependency>
      <groupId>org.neo4j</groupId>
      <artifactId>neo4j-jdbc-driver</artifactId>
      <version>3.1.0</version>
    </dependency>
    <!-- Scala -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.11</version>
    </dependency>
    <!-- Spark -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.2.1</version>
    </dependency>
  </dependencies>
</project>
I noticed that the spark-2.2.1-bin-hadoop2.7/jars directory contains these jar files:
- jackson-core-2.6.5.jar
- jackson-databind-2.6.5.jar
- jackson-module-paranamer-2.6.5.jar
- jackson-module-scala_2.11-2.6.5.jar
- jackson-annotations-2.6.5.jar
Can anyone briefly explain what this exception means and how I can fix it?
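One thing I was considering trying, based on general Maven advice about conflicting transitive dependencies (I am assuming here that some dependency, possibly neo4j-jdbc-driver, is pulling in Jackson 2.8.8), is to pin every Jackson artifact to the 2.6.5 line that ships with Spark 2.2.1 via a dependencyManagement section. Would something like this sketch be the right direction?

```xml
<!-- Sketch, not yet verified against my project: force all Jackson
     artifacts to the 2.6.5 versions bundled with Spark 2.2.1, so a
     transitive dependency cannot drag in an incompatible 2.8.x jar. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-annotations</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.module</groupId>
      <artifactId>jackson-module-scala_2.11</artifactId>
      <version>2.6.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```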