Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.4.1, 3.4.0
    • Fix Version/s: 3.4.2
    • Component/s: Pipeline Plugins, Pipelines
    • Labels:
      None
    • Release Notes:
      Fixed the Hydrator Hive batch source so that it no longer throws a ClassNotFoundException.

      Description

      The Hive batch source is broken.

      2016-05-18 01:59:59,683 - ERROR [pcontroller-program:default.test3.workflow.DataPipelineWorkflow-22d50303-1c9c-11e6-a980-42010a800002:c.c.c.i.a.r.d.AbstractProgramTwillRunnable@331] - Program runner error out.
      java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl not found
      	at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[com.google.guava.guava-13.0.1.jar:na]
      	at co.cask.cdap.internal.app.runtime.workflow.WorkflowDriver.executeAll(WorkflowDriver.java:540) ~[co.cask.cdap.cdap-app-fabric-3.5.0-SNAPSHOT.jar:na]
      	at co.cask.cdap.internal.app.runtime.workflow.WorkflowDriver.run(WorkflowDriver.java:521) ~[co.cask.cdap.cdap-app-fabric-3.5.0-SNAPSHOT.jar:na]
      	at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:52) ~[com.google.guava.guava-13.0.1.jar:na]
      	at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
      Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl not found
      	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105) ~[hadoop-common-2.6.0-cdh5.5.0.jar:na]
      	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197) ~[hadoop-common-2.6.0-cdh5.5.0.jar:na]
      	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2223) ~[hadoop-common-2.6.0-cdh5.5.0.jar:na]
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.loadFilterHooks(HiveMetaStoreClient.java:238) ~[na:na]
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:190) ~[na:na]
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:179) ~[na:na]
      	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:303) ~[na:na]
      	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:227) ~[na:na]
      	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:224) ~[na:na]
      	at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4792) ~[com.google.guava.guava-13.0.1.jar:na]
      	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599) ~[com.google.guava.guava-13.0.1.jar:na]
      	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379) ~[com.google.guava.guava-13.0.1.jar:na]
      	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342) ~[com.google.guava.guava-13.0.1.jar:na]
      	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257) ~[com.google.guava.guava-13.0.1.jar:na]
      	at com.google.common.cache.LocalCache.get(LocalCache.java:4000) ~[com.google.guava.guava-13.0.1.jar:na]
      	at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4789) ~[com.google.guava.guava-13.0.1.jar:na]
      	at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:224) ~[na:na]
      	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:200) ~[na:na]
      	at org.apache.hive.hcatalog.common.HCatUtil.getHiveClient(HCatUtil.java:558) ~[na:na]
      	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104) ~[na:na]
      	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86) ~[na:na]
      	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95) ~[na:na]
      	at co.cask.hydrator.plugin.batch.source.HiveBatchSource.prepareRun(HiveBatchSource.java:104) ~[na:na]
      	at co.cask.hydrator.plugin.batch.source.HiveBatchSource.prepareRun(HiveBatchSource.java:54) ~[na:na]
      	at co.cask.cdap.etl.batch.LoggedBatchConfigurable$1.call(LoggedBatchConfigurable.java:44) ~[na:na]
      	at co.cask.cdap.etl.batch.LoggedBatchConfigurable$1.call(LoggedBatchConfigurable.java:41) ~[na:na]
      	at co.cask.cdap.etl.log.LogContext.run(LogContext.java:59) ~[na:na]
      	at co.cask.cdap.etl.batch.LoggedBatchConfigurable.prepareRun(LoggedBatchConfigurable.java:41) ~[na:na]
      	at co.cask.cdap.etl.batch.mapreduce.ETLMapReduce.initialize(ETLMapReduce.java:172) ~[na:na]
      	at co.cask.cdap.etl.batch.mapreduce.ETLMapReduce.initialize(ETLMapReduce.java:75) ~[na:na]
      	at co.cask.cdap.internal.app.runtime.batch.MapReduceRuntimeService$2.call(MapReduceRuntimeService.java:483) ~[co.cask.cdap.cdap-app-fabric-3.5.0-SNAPSHOT.jar:na]
      	at co.cask.cdap.internal.app.runtime.batch.MapReduceRuntimeService$2.call(MapReduceRuntimeService.java:476) ~[co.cask.cdap.cdap-app-fabric-3.5.0-SNAPSHOT.jar:na]
      	at co.cask.cdap.data2.transaction.Transactions.execute(Transactions.java:174) ~[co.cask.cdap.cdap-data-fabric-3.5.0-SNAPSHOT.jar:na]
      	at co.cask.cdap.internal.app.runtime.batch.MapReduceRuntimeService.beforeSubmit(MapReduceRuntimeService.java:476) ~[co.cask.cdap.cdap-app-fabric-3.5.0-SNAPSHOT.jar:na]
      	at co.cask.cdap.internal.app.runtime.batch.MapReduceRuntimeService.startUp(MapReduceRuntimeService.java:207) ~[co.cask.cdap.cdap-app-fabric-3.5.0-SNAPSHOT.jar:na]
      	at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:47) ~[com.google.guava.guava-13.0.1.jar:na]
      	at co.cask.cdap.internal.app.runtime.batch.MapReduceRuntimeService$1$1.run(MapReduceRuntimeService.java:394) ~[co.cask.cdap.cdap-app-fabric-3.5.0-SNAPSHOT.jar:na]
      	... 1 common frames omitted
      

      Looking at the code, I think it's because the plugin does non-standard classloading.
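
      This is the classic failure mode for isolated plugin classloaders: Hadoop's Configuration resolves the metastore filter-hook class against a classloader that cannot see the hive-metastore jar. A minimal, self-contained sketch of the mechanism (no Hadoop or Hive dependencies; the empty URLClassLoader below is a stand-in for a plugin classloader that does not expose the Hive classes — an assumption for illustration, not the actual plugin code):

      ```java
      import java.net.URL;
      import java.net.URLClassLoader;

      public class ClassLoadingSketch {
          // Try to load a class through an isolated loader (no URLs, bootstrap
          // parent). This stands in for a plugin classloader that does not
          // expose the hive-metastore jar.
          static boolean canLoad(String className) {
              try (URLClassLoader isolated = new URLClassLoader(new URL[0], null)) {
                  Class.forName(className, false, isolated);
                  return true;
              } catch (ClassNotFoundException e) {
                  return false;
              } catch (java.io.IOException e) {
                  throw new RuntimeException(e);
              }
          }

          public static void main(String[] args) {
              // Core JDK classes still resolve via the bootstrap loader...
              System.out.println(canLoad("java.util.ArrayList"));   // true
              // ...but a class only on the application classpath does not,
              // which is how "Class ... not found" surfaces in the log above.
              System.out.println(canLoad("ClassLoadingSketch"));    // false
          }
      }
      ```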

        Activity

        Albert Shau added a comment (edited):
        https://github.com/caskdata/hydrator-plugins/pull/268
        Rohit Sinha added a comment -

        To clarify the issue: the classloading logic there was essential for the plugin to work with CDAP 3.2 (and, I believe, 3.3 as well), because in 3.2 CDAP did not expose its resources to plugin jars (please see the comment in the source code for details). I think that was fixed in 3.3 or later. This clearly shows the need for integration tests for plugins, since this would have been easily caught by an integration test.

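        The usual remedy for this class of failure (a sketch of the general pattern, not necessarily what the linked PR does) is to ensure the classloader that can see the Hive classes is in effect when HCatInputFormat.setInput runs, e.g. by swapping the thread context classloader around the call, since Hadoop's Configuration typically picks up the context classloader when constructed. withContextClassLoader below is a hypothetical helper, not part of the plugin:

        ```java
        public class ContextClassLoaderSwap {
            // Hypothetical helper: run an action with the given context
            // classloader, restoring the original afterwards. This is the
            // common pattern for making libraries that resolve classes via
            // the thread context classloader see the right jars.
            static <T> T withContextClassLoader(ClassLoader cl,
                    java.util.concurrent.Callable<T> action) throws Exception {
                Thread current = Thread.currentThread();
                ClassLoader previous = current.getContextClassLoader();
                current.setContextClassLoader(cl);
                try {
                    return action.call();
                } finally {
                    // Always restore, even if the action throws.
                    current.setContextClassLoader(previous);
                }
            }

            public static void main(String[] args) throws Exception {
                ClassLoader appLoader = ContextClassLoaderSwap.class.getClassLoader();
                String result = withContextClassLoader(appLoader, () ->
                    Thread.currentThread().getContextClassLoader() == appLoader
                        ? "swapped" : "not swapped");
                System.out.println(result);  // prints "swapped"
            }
        }
        ```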

          People

          • Assignee:
            Albert Shau
          • Reporter:
            Albert Shau
          • Votes:
            0
          • Watchers:
            2

            Dates

            • Created:
            • Updated:
            • Resolved: