CDAP-11585: Scala classes not getting filtered in Workflow initialize

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

      Description

      Seeing the following when running a pipeline on spark2:

      java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
      	at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1710) ~[na:na]
      	at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73) ~[na:na]
      	at org.apache.spark.SparkConf.<init>(SparkConf.scala:68) ~[na:na]
      	at org.apache.spark.SparkConf.<init>(SparkConf.scala:55) ~[na:na]
      	at co.cask.cdap.etl.spark.batch.ETLSpark.initialize(ETLSpark.java:107) ~[na:na]
      	at co.cask.cdap.api.spark.AbstractSpark.initialize(AbstractSpark.java:144) ~[na:na]
      	at co.cask.cdap.api.spark.AbstractSpark.initialize(AbstractSpark.java:34) ~[na:na]
      	at co.cask.cdap.app.runtime.spark.SparkRuntimeService$4.run(SparkRuntimeService.java:375) ~[na:na]
      	at co.cask.cdap.internal.app.runtime.AbstractContext$3.run(AbstractContext.java:469) ~[na:na]
      	at co.cask.cdap.data2.transaction.Transactions$CacheBasedTransactional.finishExecute(Transactions.java:235) ~[na:na]
      	at co.cask.cdap.data2.transaction.Transactions$CacheBasedTransactional.execute(Transactions.java:223) ~[na:na]
      	at co.cask.cdap.internal.app.runtime.AbstractContext.execute(AbstractContext.java:464) ~[na:na]
      	at co.cask.cdap.internal.app.runtime.AbstractContext.execute(AbstractContext.java:452) ~[na:na]
      	at co.cask.cdap.app.runtime.spark.BasicSparkClientContext.execute(BasicSparkClientContext.java:289) ~[na:na]
      	at co.cask.cdap.app.runtime.spark.SparkRuntimeService.initialize(SparkRuntimeService.java:385) ~[na:na]
      	at co.cask.cdap.app.runtime.spark.SparkRuntimeService.startUp(SparkRuntimeService.java:163) ~[na:na]
      	at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:47) ~[com.google.guava.guava-13.0.1.jar:na]
      	at co.cask.cdap.app.runtime.spark.SparkRuntimeService$3$1.run(SparkRuntimeService.java:342) ~[na:na]
      	at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
      

      which looks a lot like what happens when the wrong Scala version is being used. But the example spark2 app works fine, so I'm guessing it's some packaging issue for the pipeline jars.
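      For context, the "filtering" in the issue title refers to the classloader CDAP wraps around user programs: classes such as scala.* are supposed to resolve from the Spark runtime, never from copies packaged inside the pipeline jar. A minimal sketch of that idea (class name and prefix list are illustrative, not CDAP's actual implementation):

      ```java
      import java.util.Arrays;
      import java.util.List;

      // Illustrative sketch only, not CDAP's actual filtering classloader.
      public class ScalaFilteringClassLoader extends ClassLoader {

          // Class prefixes that must always come from the runtime (parent)
          // classloader, so copies bundled in the pipeline jar are never used.
          static final List<String> FILTERED_PREFIXES =
                  Arrays.asList("scala.", "org.apache.spark.");

          ScalaFilteringClassLoader(ClassLoader parent) {
              super(parent);
          }

          static boolean isFiltered(String name) {
              for (String prefix : FILTERED_PREFIXES) {
                  if (name.startsWith(prefix)) {
                      return true;
                  }
              }
              return false;
          }

          @Override
          protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
              if (isFiltered(name)) {
                  // Delegate straight to the parent: the Scala runtime
                  // that Spark itself was compiled against.
                  return getParent().loadClass(name);
              }
              // A real implementation would try the pipeline jar first here;
              // this sketch simply falls back to the default delegation.
              return super.loadClass(name, resolve);
          }

          public static void main(String[] args) {
              // scala.Predef$ is the class whose $conforms() method is missing
              // in the stack trace above.
              System.out.println(isFiltered("scala.Predef$"));
              System.out.println(isFiltered("co.cask.cdap.etl.spark.batch.ETLSpark"));
          }
      }
      ```

      If that filtering is skipped, a Scala 2.10 Predef packaged in the pipeline jar can shadow the 2.11 copy Spark2 expects, which would produce exactly the NoSuchMethodError on scala.Predef$.$conforms() shown above ($conforms was added in Scala 2.11).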


    People

    • Assignee: Terence Yim (terence)
    • Reporter: Albert Shau (ashau)
    • Votes: 0
    • Watchers: 2
