2017-09-21

I'm trying to run the logistic regression example (https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/ml/JavaLogisticRegressionWithElasticNetExample.java) as an MLlib project in IntelliJ IDEA. Why does it fail with "AssertionError: assertion failed: unsafe symbol CompatContext"?

Here is the code. I'm also using the sample data file from the examples (https://github.com/apache/spark/blob/master/data/mllib/sample_libsvm_data.txt):

import org.apache.spark.ml.classification.LogisticRegression;
import org.apache.spark.ml.classification.LogisticRegressionModel;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public final class GettingStarted {

    public static void main(final String[] args) throws InterruptedException {
        System.setProperty("hadoop.home.dir", "C:\\winutils");

        SparkSession spark = SparkSession
          .builder()
          .appName("JavaLogisticRegressionWithElasticNetExample")
          .config("spark.master", "local")
          .getOrCreate();

        // $example on$
        // Load training data
        Dataset<Row> training = spark.read().format("libsvm")
          .load("data/mllib/sample_libsvm_data.txt");

        LogisticRegression lr = new LogisticRegression()
          .setMaxIter(10)
          .setRegParam(0.3)
          .setElasticNetParam(0.8);

        // Fit the model
        LogisticRegressionModel lrModel = lr.fit(training);

        // Print the coefficients and intercept for logistic regression
        System.out.println("Coefficients: "
          + lrModel.coefficients() + " Intercept: " + lrModel.intercept());

        // We can also use the multinomial family for binary classification
        LogisticRegression mlr = new LogisticRegression()
          .setMaxIter(10)
          .setRegParam(0.3)
          .setElasticNetParam(0.8)
          .setFamily("multinomial");

        // Fit the model
        LogisticRegressionModel mlrModel = mlr.fit(training);

        // Print the coefficients and intercepts for logistic regression with multinomial family
        System.out.println("Multinomial coefficients: " + mlrModel.coefficientMatrix()
          + "\nMultinomial intercepts: " + mlrModel.interceptVector());
        // $example off$

        spark.stop();
    }
}

But I get this error:

Exception in thread "main" java.lang.AssertionError: assertion failed: unsafe symbol CompatContext (child of package macrocompat) in runtime reflection universe 
at scala.reflect.internal.Symbols$Symbol.<init>(Symbols.scala:184) 
at scala.reflect.internal.Symbols$TypeSymbol.<init>(Symbols.scala:2984) 
at scala.reflect.internal.Symbols$ClassSymbol.<init>(Symbols.scala:3176) 
at scala.reflect.internal.Symbols$StubClassSymbol.<init>(Symbols.scala:3471) 
at scala.reflect.internal.Symbols$Symbol.newStubSymbol(Symbols.scala:498) 
at scala.reflect.internal.pickling.UnPickler$Scan.readExtSymbol$1(UnPickler.scala:258) 
at scala.reflect.internal.pickling.UnPickler$Scan.readSymbol(UnPickler.scala:284) 
at scala.reflect.internal.pickling.UnPickler$Scan.readSymbolRef(UnPickler.scala:649) 
at scala.reflect.internal.pickling.UnPickler$Scan.readType(UnPickler.scala:417) 
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef$$anonfun$6.apply(UnPickler.scala:725) 
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef$$anonfun$6.apply(UnPickler.scala:725) 
at scala.reflect.internal.pickling.UnPickler$Scan.at(UnPickler.scala:179) 
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef.completeInternal(UnPickler.scala:725) 
at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef.complete(UnPickler.scala:749) 
at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1489) 
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:162) 
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127) 
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127) 
at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19) 
at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16) 
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123) 
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:162) 
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127) 
at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$12.info(SynchronizedSymbols.scala:162) 
at scala.reflect.internal.Mirrors$RootsBase.ensureClassSymbol(Mirrors.scala:94) 
at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102) 
at scala.reflect.internal.Mirrors$RootsBase.getClassIfDefined(Mirrors.scala:114) 
at scala.reflect.internal.Mirrors$RootsBase.getClassIfDefined(Mirrors.scala:111) 
at scala.reflect.internal.Definitions$DefinitionsClass.BlackboxContextClass$lzycompute(Definitions.scala:496) 
at scala.reflect.internal.Definitions$DefinitionsClass.BlackboxContextClass(Definitions.scala:496) 
at scala.reflect.runtime.JavaUniverseForce$class.force(JavaUniverseForce.scala:305) 
at scala.reflect.runtime.JavaUniverse.force(JavaUniverse.scala:16) 
at scala.reflect.runtime.JavaUniverse.init(JavaUniverse.scala:147) 
at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:78) 
at scala.reflect.runtime.package$.universe$lzycompute(package.scala:17) 
at scala.reflect.runtime.package$.universe(package.scala:17) 
at org.apache.spark.sql.catalyst.ScalaReflection$.<init>(ScalaReflection.scala:40) 
at org.apache.spark.sql.catalyst.ScalaReflection$.<clinit>(ScalaReflection.scala) 
at org.apache.spark.sql.catalyst.encoders.RowEncoder$.org$apache$spark$sql$catalyst$encoders$RowEncoder$$serializerFor(RowEncoder.scala:74) 
at org.apache.spark.sql.catalyst.encoders.RowEncoder$.apply(RowEncoder.scala:61) 
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:67) 
at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:415) 
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:172) 
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:156) 
at GettingStarted.main(GettingStarted.java:95) 

What am I doing wrong?

Edit: I'm running it in IntelliJ; it's a Maven project, and I added these dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.10</artifactId>
    <version>2.2.0</version>
</dependency>

I'm running it in IntelliJ; this is a Maven project. –

Answers


TL;DR: as soon as Scala starts reporting internal errors mentioning the reflection universe, suspect incompatible Scala versions.

The Scala versions of your libraries do not match (2.10 and 2.11).
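One quick way to spot such mismatches (a suggestion beyond the original answer) is to list the resolved artifacts and check their Scala-version suffixes, e.g. with Maven:

```
mvn dependency:tree | grep -E '_2\.1[01]'
```

Every Spark-related artifact in the output should carry the same `_2.10` or `_2.11` suffix; if both appear, you have a mix.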

You need to align all of them on the same Scala version:

<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-sql_2.11</artifactId> <!-- This is scala v2.11 --> 
    <version>2.2.0</version> 
</dependency> 
<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-mllib_2.10</artifactId> <!-- This is scala v2.10 --> 
    <version>2.2.0</version> 
</dependency>
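A minimal fix (assuming you stay on Scala 2.11, which matches your other artifacts) is to switch the MLlib dependency to the `_2.11` build:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId> <!-- now Scala v2.11, like the rest -->
    <version>2.2.0</version>
</dependency>
```

After changing it, re-import the Maven project in IntelliJ so the stale `_2.10` jar is removed from the classpath.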