
Java compilation error when installing Spark on CentOS

I am trying to install Spark on CentOS. While building Spark with the sbt/sbt assembly command, the build fails with the following errors:

[warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:129: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information. 
[warn]  getOutputCommitter().cleanupJob(getJobContext()) 
[warn]      ^
[warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:592: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information. 
[warn]  jobCommitter.cleanupJob(jobTaskContext) 
[warn]    ^
[warn] two warnings found 
[error] ---------- 
[error] 1. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 22) 
[error]   import io.netty.channel.ChannelFuture; 
[error]    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
[error] The import io.netty.channel.ChannelFuture is never used 
[error] ---------- 
[error] 2. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 23) 
[error]   import io.netty.channel.ChannelFutureListener; 
[error]    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
[error] The import io.netty.channel.ChannelFutureListener is never used 
[error] ---------- 
[error] ---------- 
[error] 3. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileServer.java (at line 23) 
[error]   import io.netty.channel.Channel; 
[error]    ^^^^^^^^^^^^^^^^^^^^^^^^ 
[error] The import io.netty.channel.Channel is never used 
[error] ---------- 
[error] ---------- 
[error] 4. WARNING in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/JavaSparkContextVarargsWorkaround.java (at line 20) 
[error]   import java.util.Arrays; 
[error]    ^^^^^^^^^^^^^^^^ 
[error] The import java.util.Arrays is never used 
[error] ---------- 
[error] ---------- 
[error] 5. ERROR in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/function/DoubleFlatMapFunction.java (at line 36) 
[error]   public final Iterable<Double> apply(T t) { return call(t); } 
[error]          ^^^^^^^^^^ 
[error] The method apply(T) of type DoubleFlatMapFunction<T> must override a superclass method 
[error] ---------- 
[error] 5 problems (1 error, 4 warnings) 
[error] (core/compile:compile) javac returned nonzero exit code 
[error] Total time: 431 s, completed Oct 24, 2013 7:42:21 AM 

The Java version installed on my machine is 1.7.0_45.
I previously tried JDK 1.6.0_35, which gave the same set of errors, and I also tried Java 1.4, which gave errors of a different kind. Which Java version should I use, or is the problem something else entirely?
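For context on the single hard error above: until Java 6, @Override was not allowed on a method that merely implements an interface method, and "must override a superclass method" is the message the Eclipse compiler (whose numbered "----------" diagnostic format the log shows) emits for that case at 1.5 source compliance. A minimal sketch of that failure mode, using hypothetical names (DoubleFlatMap, Repro) rather than Spark's actual sources:

    // Repro.java -- minimal reproduction of the lone ERROR above.
    // Compiles cleanly at source level 1.6 or later, but fails at 1.5 with
    // "must override a superclass method" (ecj) / "method does not override
    // or implement a method from a supertype" (javac), because Java 5 only
    // permitted @Override on methods overriding a *class* method.
    import java.util.Collections;

    interface DoubleFlatMap<T> {          // stand-in for Spark's function interface
        Iterable<Double> apply(T t);
    }

    public class Repro implements DoubleFlatMap<String> {
        @Override                         // rejected under -source 1.5
        public Iterable<Double> apply(String s) {
            return Collections.emptyList();
        }
    }

If this is what is happening, switching JDK releases will not help until the build stops compiling at 1.5 compliance or stops picking up a stale compiler; checking javac -version and JAVA_HOME against the JDK sbt is meant to use is a cheap first step.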


This question was also cross-posted to the Spark users mailing list: https://groups.google.com/d/msg/spark-users/ti5UF15YBq4/du_Wzhr3uCEJ

Answers