
Spark Streaming upgrade to 2.1.0: java.lang.VerifyError: Inconsistent stackmap frames at branch target 152

Spark Streaming has been upgraded from 1.6.1 to 2.1.0. I use the spark-rabbitmq library for streaming; it was upgraded from 0.3.0 to 0.5.1. When I spark-submit my Spark Streaming job to my standalone Spark installation (pre-built for Hadoop 2.7), I get the error below:

2017-01-09 12:32:17 ERROR Executor:91 - Exception in task 0.0 in stage 0.0 (TID 0) 
java.lang.VerifyError: Inconsistent stackmap frames at branch target 152 
Exception Details: 
Location: 
akka/dispatch/Mailbox.processAllSystemMessages()V @152: getstatic 
Reason: 
Type top (current frame, locals[9]) is not assignable to 'akka/dispatch/sysmsg/SystemMessage' (stack map, locals[9]) 
Current Frame: 
bci: @131 
flags: { } 
locals: { 'akka/dispatch/Mailbox', 'java/lang/InterruptedException', 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox', 'java/lang/Throwable', 'java/lang/Throwable' } 
stack: { integer } 
Stackmap Frame: 
bci: @152 
flags: { } 
locals: { 'akka/dispatch/Mailbox', 'java/lang/InterruptedException', 'akka/dispatch/sysmsg/SystemMessage', top, 'akka/dispatch/Mailbox', 'java/lang/Throwable', 'java/lang/Throwable', top, top, 'akka/dispatch/sysmsg/SystemMessage' } 
stack: { } 
Bytecode: 
0x0000000: 014c 2ab2 0132 b601 35b6 0139 4db2 013e 
0x0000010: 2cb6 0142 9900 522a b600 c69a 004b 2c4e 
0x0000020: b201 3e2c b601 454d 2db9 0148 0100 2ab6 
0x0000030: 0052 2db6 014b b801 0999 000e bb00 e759 
0x0000040: 1301 4db7 010f 4cb2 013e 2cb6 0150 99ff 
0x0000050: bf2a b600 c69a ffb8 2ab2 0132 b601 35b6 
0x0000060: 0139 4da7 ffaa 2ab6 0052 b600 56b6 0154 
0x0000070: b601 5a3a 04a7 0091 3a05 1905 3a06 1906 
0x0000080: c100 e799 0015 1906 c000 e73a 0719 074c 
0x0000090: b200 f63a 08a7 0071 b201 5f19 06b6 0163 
0x00000a0: 3a0a 190a b601 6899 0006 1905 bf19 0ab6 
0x00000b0: 016c c000 df3a 0b2a b600 52b6 0170 b601 
0x00000c0: 76bb 000f 5919 0b2a b600 52b6 017a b601 
0x00000d0: 80b6 0186 2ab6 018a bb01 8c59 b701 8e13 
0x00000e0: 0190 b601 9419 09b6 0194 1301 96b6 0194 
0x00000f0: 190b b601 99b6 0194 b601 9ab7 019d b601 
0x0000100: a3b2 00f6 3a08 b201 3e2c b601 4299 0026 
0x0000110: 2c3a 09b2 013e 2cb6 0145 4d19 09b9 0148 
0x0000120: 0100 1904 2ab6 0052 b601 7a19 09b6 01a7 
0x0000130: a7ff d62b c600 09b8 0109 572b bfb1 
Exception Handler Table: 
bci [290, 307] => handler: 120 
Stackmap Table: 
append_frame(@13,Object[#231],Object[#177]) 
append_frame(@71,Object[#177]) 
chop_frame(@102,1) 
full_frame(@120,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#177]},{Object[#223]}) 
full_frame(@152,{Object[#2],Object[#231],Object[#177],Top,Object[#2],Object[#223],Object[#223],Top,Top,Object[#177]},{}) 
append_frame(@173,Object[#357]) 
full_frame(@262,{Object[#2],Object[#231],Object[#177],Top,Object[#2]},{}) 
same_frame(@307) 
same_frame(@317) 

at akka.dispatch.Mailboxes.<init>(Mailboxes.scala:33) 
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:628) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:109) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:100) 
at org.apache.spark.streaming.rabbitmq.receiver.RabbitMQReceiver.onStart(RabbitMQInputDStream.scala:57) 
at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:149) 
at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:131) 
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:607) 
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$9.apply(ReceiverTracker.scala:597) 
at org.apache.spark.SparkContext$$anonfun$34.apply(SparkContext.scala:2021) 
at org.apache.spark.SparkContext$$anonfun$34.apply(SparkContext.scala:2021) 
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) 
at org.apache.spark.scheduler.Task.run(Task.scala:99) 
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
at java.lang.Thread.run(Thread.java:745) 

My pom.xml looks like this ...

<properties> 
     <slf4j.version>1.7.7</slf4j.version> 
     <log4j.version>1.2.17</log4j.version> 
     <mapr.hbase.version>5.0.0-mapr</mapr.hbase.version> 
     <guava.version>19.0</guava.version> 
    </properties> 
.... 
<dependency> 
      <groupId>com.google.guava</groupId> 
      <artifactId>guava</artifactId> 
      <version>${guava.version}</version> 
     </dependency> 

<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-core_2.11</artifactId> 
    <version>2.1.0</version> 
    <scope>provided</scope> 
</dependency> 
<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-streaming_2.11</artifactId> 
    <version>2.1.0</version> 
    <scope>provided</scope> 
</dependency> 
<dependency> 
    <groupId>com.stratio.receiver</groupId> 
    <artifactId>spark-rabbitmq</artifactId> 
    <version>0.5.1</version> 
</dependency> 
</dependencies> 

    <build> 
..... 
..... 
<plugin> 
      <groupId>org.apache.maven.plugins</groupId> 
      <artifactId>maven-shade-plugin</artifactId> 
      <version>2.4.3</version> 
      <executions> 
       <execution> 
        <phase>package</phase> 
        <goals> 
         <goal>shade</goal> 
        </goals> 
        <configuration> 
        <relocations> 
         <relocation> 
          <pattern>com.google</pattern> 
          <shadedPattern>shadeio</shadedPattern> 
          <includes> 
           <include>com.google.**</include> 
          </includes> 
         </relocation> 
        </relocations> 
         <transformers> 
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" /> 
         </transformers> 

If I remove the Guava relocation from the shade plugin, the code works; as soon as I add it back, I get the error above. One of my streaming jobs needs the newer Guava 19.0, so I have to shade it.

Any suggestions on how to fix this?

Answers


You need to exclude the dependencies that pull in a conflicting Guava version.

An example of a dependency (Kafka in this case) with such exclusions:

<dependency> 
 <groupId>org.apache.kafka</groupId> 
 <artifactId>kafka-clients</artifactId> 
 <exclusions> 
  <exclusion> 
   <artifactId>google-collections</artifactId> 
   <groupId>google-collections</groupId> 
  </exclusion> 
  <exclusion> 
   <artifactId>guava</artifactId> 
   <groupId>com.google.guava</groupId> 
  </exclusion> 
 </exclusions> 
 <version>${kafka.version}</version> 
</dependency> 

Also, I would recommend using a transformer for the specific Akka configuration file (reference.conf), which can otherwise give you problems when shading:

<transformers> 
 <transformer 
  implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer"> 
  <resource>reference.conf</resource> 
 </transformer> 
</transformers> 
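
Putting the pieces together, here is a minimal sketch of a maven-shade-plugin configuration that combines the relocation from the question with the transformers above. The pattern, shadedPattern, and transformer classes are taken from the snippets in this thread; treat the rest as boilerplate to adapt to your own build, not a drop-in config.

<plugin> 
 <groupId>org.apache.maven.plugins</groupId> 
 <artifactId>maven-shade-plugin</artifactId> 
 <executions> 
  <execution> 
   <phase>package</phase> 
   <goals> 
    <goal>shade</goal> 
   </goals> 
   <configuration> 
    <relocations> 
     <!-- Move the job's com.google.* classes (Guava 19.0) out of the way of the Guava bundled with Spark --> 
     <relocation> 
      <pattern>com.google</pattern> 
      <shadedPattern>shadeio</shadedPattern> 
      <includes> 
       <include>com.google.**</include> 
      </includes> 
     </relocation> 
    </relocations> 
    <transformers> 
     <!-- Concatenate Akka reference.conf files instead of keeping only one of them --> 
     <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer"> 
      <resource>reference.conf</resource> 
     </transformer> 
     <!-- Merge META-INF/services entries from the shaded dependencies --> 
     <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" /> 
    </transformers> 
   </configuration> 
  </execution> 
 </executions> 
</plugin> 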


I recently had exactly the same problem, although with a Flink project. It is the maven-shade-plugin that causes this error; upgrading the maven-shade-plugin from version 2.4.3 to 3.1.0 fixed the problem.
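
A minimal sketch of the corresponding pom change, assuming the plugin declaration from the question stays otherwise the same and only the version is bumped:

<plugin> 
 <groupId>org.apache.maven.plugins</groupId> 
 <artifactId>maven-shade-plugin</artifactId> 
 <!-- upgraded from 2.4.3; per this answer, 3.1.0 no longer produces the invalid stack map frames --> 
 <version>3.1.0</version> 
 <!-- executions and shade configuration unchanged from the question --> 
</plugin> 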