Spark and hbase-client version compatibility

I am trying to write a Spark batch job, package it into a jar, and run it with spark-submit. My program works flawlessly in the spark-shell, but when I try to run it with spark-submit, I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less; 
    at HBaseBulkload$.saveAsHFile(ThereInLocationGivenTimeInterval.scala:103) 
    at HBaseBulkload$.toHBaseBulk(ThereInLocationGivenTimeInterval.scala:178) 
    at ThereInLocationGivenTimeInterval$.main(ThereInLocationGivenTimeInterval.scala:241) 
    at ThereInLocationGivenTimeInterval.main(ThereInLocationGivenTimeInterval.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:498) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

According to this answer, the problem stems from a version incompatibility; this particular NoSuchMethodError typically appears when code compiled against one Scala binary version (e.g. 2.11) runs on another (e.g. 2.10). I also found this, but my Spark version here is 1.6.0. Here is the .sbt file for my project:

name := "HbaseBulkLoad" 

version := "1.0" 

scalaVersion := "2.10.5" 

resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/" 

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" 
//libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.9.0" 
//libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.9.0" 
//libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.9.0" 

libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.1.2" 
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.1.2" 
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.1.2" 

Here are my imports and the code segment that produces the error:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

// HBaseBulkLoad imports 
import java.util.UUID 

import org.apache.hadoop.conf.Configuration 
import org.apache.hadoop.fs.permission.FsPermission 
import org.apache.hadoop.fs.{Path, FileSystem} 
import org.apache.hadoop.hbase.{KeyValue, TableName} 
import org.apache.hadoop.hbase.client._ 
import org.apache.hadoop.hbase.io.ImmutableBytesWritable 
import org.apache.hadoop.hbase.mapreduce.{HFileOutputFormat2, LoadIncrementalHFiles} 
import org.apache.hadoop.hbase.util.Bytes 
import org.apache.hadoop.mapreduce.Job 
import org.apache.hadoop.mapreduce.lib.partition.TotalOrderPartitioner 
import org.apache.spark.rdd.RDD 
import org.apache.spark.Partitioner 
import org.apache.spark.storage.StorageLevel 

import scala.collection.JavaConversions._ 
import scala.reflect.ClassTag 

// HBase admin imports
import org.apache.hadoop.hbase.{HBaseConfiguration, HColumnDescriptor, HTableDescriptor}
import org.apache.hadoop.hbase.client.{HBaseAdmin, HTable, Put}
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import java.util.Calendar

val now = Calendar.getInstance.getTimeInMillis
//val filteredRdd = myRdd.filter(...
val resultRdd = filteredRdd.map { row =>
  (row(0).asInstanceOf[String].getBytes(),
   scala.collection.immutable.Map("batchResults" ->
     Array(("batchResult1", ("true", now)))))
}
println(resultRdd.count)
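
The stack trace points into saveAsHFile, whose source is not shown here. For illustration only, here is a minimal sketch of what such a bulk-load helper typically does with this RDD: flatten it into byte-sorted (rowKey, KeyValue) cells and write them as HFiles with HFileOutputFormat2. The staging path is a hypothetical placeholder, and a real load would also call HFileOutputFormat2.configureIncrementalLoad so the output is partitioned by region:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.KeyValue
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2
import org.apache.hadoop.hbase.util.Bytes

// HFileOutputFormat2 requires cells in byte-sorted row-key order.
implicit val keyOrdering: Ordering[ImmutableBytesWritable] =
  new Ordering[ImmutableBytesWritable] {
    def compare(a: ImmutableBytesWritable, b: ImmutableBytesWritable): Int = a.compareTo(b)
  }

// Flatten (rowKey, family -> Array((qualifier, (value, timestamp)))) into KeyValue cells.
val kvRdd = resultRdd.flatMap { case (rowKey, families) =>
  families.toSeq.flatMap { case (family, columns) =>
    columns.map { case (qualifier, (value, ts)) =>
      (new ImmutableBytesWritable(rowKey),
       new KeyValue(rowKey, Bytes.toBytes(family),
                    Bytes.toBytes(qualifier), ts, Bytes.toBytes(value)))
    }
  }
}.sortByKey()

kvRdd.saveAsNewAPIHadoopFile(
  "/tmp/hfiles",                   // hypothetical HFile staging path
  classOf[ImmutableBytesWritable],
  classOf[KeyValue],
  classOf[HFileOutputFormat2],
  HBaseConfiguration.create())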

Answer


The working .sbt file is as follows:

name := "HbaseBulkLoad" 

version := "1.0" 

scalaVersion := "2.10.5" 

resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/" 

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0-cdh5.9.0" 
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.2.0-cdh5.9.0" 
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.2.0-cdh5.9.0" 
libraryDependencies += "org.apache.hbase" % "hbase-server" % "1.2.0-cdh5.9.0" 

If you are using Cloudera, you can find the jars and their corresponding versions in the following directory:

/opt/cloudera/parcels/CDH-5.9.0-1.cdh5.9.0.p0.23/jars
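
If you are not sure which Spark and Scala versions the cluster actually runs, a quick check from the spark-shell (a sketch; both values are available in Spark 1.x) takes the guesswork out of pinning these dependencies:

// Print the runtime versions; the sbt build should be pinned to match both.
println(org.apache.spark.SPARK_VERSION)      // e.g. 1.6.0
println(scala.util.Properties.versionString) // e.g. version 2.10.5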