CDAP-12415: Hive on Spark failing on CM 5.12

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 4.3.0
    • Fix Version/s: 4.3.1
    • Component/s: CDAP, Explore
    • Labels:
      None
    • Release Notes:
      Fixed an issue that prevented the use of HiveContext in Spark

      Description

      Explore queries that trigger a Hive on Spark job are failing on CM 5.12 clusters. The NoSuchMethodError in the log below involves the Guava Service.addListener(Service.Listener, Executor) API, which typically points to a Guava version conflict: the Hive-on-Spark driver classpath resolves a Guava older than the one Apache Twill was built against.
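      The same code path is exercised when a Spark (1.x) program queries a stream-backed Explore table through HiveContext, which appears to be the usage the release note refers to. A minimal sketch of such a program follows, assuming a hypothetical stream table named stream_events; none of the names below are taken from this ticket:

      import org.apache.spark.{SparkConf, SparkContext}
      import org.apache.spark.sql.hive.HiveContext

      object HiveContextSketch {
        def main(args: Array[String]): Unit = {
          val sc = new SparkContext(new SparkConf().setAppName("HiveContextSketch"))
          val hiveContext = new HiveContext(sc)
          // Split computation for the stream-backed table goes through
          // HiveStreamInputFormat.getSplits, the CDAP frame visible in the trace below.
          hiveContext.sql("SELECT COUNT(*) FROM stream_events").show()
        }
      }

      Log excerpt from the failing job: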

      17/08/21 21:12:09 WARN conf.Configuration: java.io.ByteArrayInputStream@35e64639:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
      Exception in thread "dag-scheduler-event-loop" java.lang.NoSuchMethodError: org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.addListener(Lcom/google/common/util/concurrent/Service$Listener;Ljava/util/concurrent/Executor;)V
      	at org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.<init>(DefaultZKClientService.java:403)
      	at org.apache.twill.internal.zookeeper.DefaultZKClientService$ServiceDelegate.<init>(DefaultZKClientService.java:392)
      	at org.apache.twill.internal.zookeeper.DefaultZKClientService.<init>(DefaultZKClientService.java:98)
      	at org.apache.twill.zookeeper.ZKClientService$Builder.build(ZKClientService.java:101)
      	at co.cask.cdap.common.guice.ZKClientModule.provideZKClientService(ZKClientModule.java:53)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at com.google.inject.internal.ProviderMethod.get(ProviderMethod.java:104)
      	at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
      	at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
      	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1031)
      	at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
      	at com.google.inject.Scopes$1$1.get(Scopes.java:65)
      	at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
      	at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978)
      	at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1024)
      	at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974)
      	at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1013)
      	at co.cask.cdap.hive.context.ContextManager.createContext(ContextManager.java:177)
      	at co.cask.cdap.hive.context.ContextManager.getContext(ContextManager.java:129)
      	at co.cask.cdap.hive.stream.HiveStreamInputFormat.getSplitFinder(HiveStreamInputFormat.java:92)
      	at co.cask.cdap.hive.stream.HiveStreamInputFormat.getSplits(HiveStreamInputFormat.java:73)
      	at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:307)
      	at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:410)
      	at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getCombineSplits(CombineHiveInputFormat.java:370)
      	at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:541)
      	at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:202)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      	at scala.Option.getOrElse(Option.scala:120)
      	at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      	at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
      	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
      	at scala.Option.getOrElse(Option.scala:120)
      	at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
      	at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:91)
      	at org.apache.spark.rdd.ShuffledRDD.getDependencies(ShuffledRDD.scala:80)
      	at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:226)
      	at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:224)
      	at scala.Option.getOrElse(Option.scala:120)
      	at org.apache.spark.rdd.RDD.dependencies(RDD.scala:224)
      	at org.apache.spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:386)
      	at org.apache.spark.scheduler.DAGScheduler.getParentStages(DAGScheduler.scala:398)
      	at org.apache.spark.scheduler.DAGScheduler.getParentStagesAndId(DAGScheduler.scala:299)
      	at org.apache.spark.scheduler.DAGScheduler.newResultStage(DAGScheduler.scala:334)
      	at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:837)
      	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1635)
      	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1627)
      	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1616)
      	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
      17/08/21 21:17:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 16
      17/08/21 21:17:26 ERROR yarn.ApplicationMaster: RECEIVED SIGNAL 15: SIGTERM
      

    People

    • Assignee: Terence Yim (terence)
    • Reporter: Matt Wuenschel (mattwuenschel)
    • Votes: 0
    • Watchers: 4

    Dates

    • Created:
    • Updated:
    • Resolved: